**LM Studio (Local models via OpenAI-compatible API):**
```bash
MODEL=lmstudio/model-name
```
LM Studio runs a local OpenAI-compatible API server on `http://localhost:1234/v1`. Make sure LM Studio is running with a model loaded before using this provider.
### MCP Server Configuration
The `MCP_SERVER_URL` environment variable controls MCP (Model Context Protocol) integration. The tool automatically detects whether to use HTTP or StdIO transport based on the value format.
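The detection rule is not spelled out here, but a plausible sketch (assuming HTTP/HTTPS URLs select HTTP transport, any other non-empty value is treated as a StdIO command, and an empty value disables MCP; the function name is mine, not the tool's):

```typescript
// Hypothetical sketch of how MCP_SERVER_URL might select a transport.
// The tool's actual detection logic may differ; this only illustrates the idea.
type McpTransport = "http" | "stdio" | "disabled";

function detectMcpTransport(value: string | undefined): McpTransport {
  const v = (value ?? "").trim();
  if (v === "") return "disabled";            // unset or empty → MCP off
  if (/^https?:\/\//.test(v)) return "http";  // URL → HTTP transport
  return "stdio";                             // anything else → StdIO command
}
```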
- `ANTHROPIC_API_KEY`: Required when using `anthropic/*` models
- `OPENAI_API_KEY`: Required when using `openai/*` models (get at https://platform.openai.com/api-keys)
- `OPENROUTER_API_KEY`: Required when using `openrouter/*` models (get at https://openrouter.ai/keys)
- No API key required for `lmstudio/*` models (runs locally)
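As a minimal sketch of that key requirement (the helper name is hypothetical; only the prefix-to-variable mapping comes from the list above):

```typescript
// Hypothetical helper: which env var a given MODEL value requires.
// lmstudio/* needs no key because it talks to a local server.
function requiredApiKeyVar(model: string): string | null {
  const provider = model.split("/")[0];
  switch (provider) {
    case "anthropic":  return "ANTHROPIC_API_KEY";
    case "openai":     return "OPENAI_API_KEY";
    case "openrouter": return "OPENROUTER_API_KEY";
    case "lmstudio":   return null; // local server, no key needed
    default: throw new Error(`Unknown provider prefix: ${provider}`);
  }
}
```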
### Provider Routing
The benchmark tool automatically routes to the correct provider based on the `MODEL` prefix:
- `anthropic/*` → Direct Anthropic API
- `openai/*` → Direct OpenAI API
- `openrouter/*` → OpenRouter unified API
- `lmstudio/*` → LM Studio local server (OpenAI-compatible)
This allows switching models and providers without any code changes.
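A minimal sketch of that prefix-based routing (the `Route` shape and function name are assumptions; the LM Studio base URL is the default quoted earlier in this document):

```typescript
// Hypothetical router: split "provider/model-id" and pick a backend.
// Only lmstudio/* carries a base URL here, since it targets a local server.
interface Route {
  provider: string;
  modelId: string;
  baseURL?: string;
}

function routeModel(model: string): Route {
  const slash = model.indexOf("/");
  if (slash < 0) {
    throw new Error(`MODEL must look like provider/model-id, got: ${model}`);
  }
  const provider = model.slice(0, slash);
  const modelId = model.slice(slash + 1);
  switch (provider) {
    case "anthropic":
    case "openai":
    case "openrouter":
      return { provider, modelId };
    case "lmstudio":
      return { provider, modelId, baseURL: "http://localhost:1234/v1" };
    default:
      throw new Error(`Unknown provider prefix: ${provider}`);
  }
}
```

Because the provider is derived entirely from the `MODEL` string, swapping backends is a one-line environment change.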
- **Vercel AI SDK v5**: Agent framework with tool calling
- **@ai-sdk/anthropic**: Anthropic provider for direct API access
- **@ai-sdk/openai**: OpenAI provider for direct API access
- **@ai-sdk/openai-compatible**: OpenAI-compatible provider for LM Studio and other local servers
- **@openrouter/ai-sdk-provider**: OpenRouter provider for unified access to 300+ models