Allow vllm provider to work without API key #384
Loule95450 wants to merge 1 commit into sipeed:main
Conversation
Skip the API key requirement check when using the vllm provider, enabling use of OpenAI-compatible endpoints that don't require authentication (e.g. opencode.ai/zen free models).
Pull request overview
Updates the HTTP provider factory to allow using the vllm provider against OpenAI-compatible endpoints that do not require authentication, by relaxing API key validation for that provider.
Changes:
- Skip the “missing API key” error when `provider=vllm`.
- Preserve existing API key requirements for other providers.
```go
if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
	return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)
}
```
This change introduces new provider-selection behavior (VLLM without an API key) but there doesn't appear to be a unit test covering CreateProvider for provider=vllm (or the auto-detect case) with an empty API key. Please add a factory test alongside the existing CreateProvider tests to ensure VLLM works without a key and that other providers still error when apiKey is empty.
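A minimal sketch of such a factory test. The `CreateProvider(cfg, model)` signature and the `Config` field paths (`cfg.Agents.Defaults.Provider`, `cfg.Providers.VLLM.APIBase`) are assumptions based on the snippets quoted in this review; adjust to the real API in `pkg/providers`:

```go
package providers

import (
	"strings"
	"testing"
)

// Sketch only: assumes CreateProvider(cfg *Config, model string) and nested
// value-type config structs; not copied from the repository.
func TestCreateProviderVLLMWithoutAPIKey(t *testing.T) {
	cfg := &Config{}
	cfg.Agents.Defaults.Provider = "vllm"
	cfg.Providers.VLLM.APIBase = "http://localhost:8000/v1"
	cfg.Providers.VLLM.APIKey = "" // keyless OpenAI-compatible endpoint

	if _, err := CreateProvider(cfg, "kimi-k2.5-free"); err != nil {
		t.Fatalf("expected vllm provider without API key to succeed, got: %v", err)
	}
}

func TestCreateProviderStillRequiresAPIKeyElsewhere(t *testing.T) {
	cfg := &Config{}
	cfg.Agents.Defaults.Provider = "openai" // any non-vllm, non-bedrock provider

	_, err := CreateProvider(cfg, "gpt-4o")
	if err == nil || !strings.Contains(err.Error(), "no API key configured") {
		t.Fatalf("expected missing-API-key error, got: %v", err)
	}
}
```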
```go
if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
	return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)
}
```
This API-key guard only checks the configured providerName. If the provider is auto-detected (i.e., cfg.Agents.Defaults.Provider is empty) but VLLM is selected via the fallback branch (cfg.Providers.VLLM.APIBase != ""), providerName stays empty and this condition will still reject an empty API key. Consider tracking the selected provider (e.g., set providerName = "vllm" when the VLLM fallback branch is chosen, or base the exemption on the chosen apiBase/config rather than the original config field).
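A sketch of that suggestion. The surrounding auto-detect logic is paraphrased from this comment, not copied from the repository, and `selectProvider` is a hypothetical helper name:

```go
package providers

import (
	"fmt"
	"strings"
)

// Sketch: record the provider chosen by the auto-detect fallback so the
// API-key guard can exempt keyless VLLM endpoints. Structure inferred from
// this review comment, not the repository's actual code.
func selectProvider(cfg *Config, model string) (providerName, apiBase, apiKey string, err error) {
	providerName = cfg.Agents.Defaults.Provider
	if providerName == "" && cfg.Providers.VLLM.APIBase != "" {
		// Auto-detect fallback chose VLLM: track that choice explicitly.
		providerName = "vllm"
	}
	if providerName == "vllm" {
		apiBase = cfg.Providers.VLLM.APIBase
		apiKey = cfg.Providers.VLLM.APIKey
	}
	// ... branches for other providers elided ...

	if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
		return "", "", "", fmt.Errorf("no API key configured for provider (model: %s)", model)
	}
	return providerName, apiBase, apiKey, nil
}
```

Basing the exemption on the resolved `apiBase`/config would also work, as the comment notes; tracking `providerName` just keeps the guard readable.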
📝 Description
Allow the `vllm` provider to work without an API key. This enables use of OpenAI-compatible endpoints that don't require authentication, such as opencode.ai/zen free models (kimi-k2.5-free, glm-5-free, etc.).

Single-line change in `pkg/providers/http_provider.go`: added `&& providerName != "vllm"` to the API key validation check at line 441.

🗣️ Type of Change
🤖 AI Code Generation
🔗 Linked Issue
N/A
📚 Technical Context (Skip for Docs)
The `HTTPProvider.Chat()` method already handles empty API keys correctly (it skips the `Authorization` header at line 104), but the `CreateProvider()` validation at line 441 blocks empty keys for all providers except `bedrock`. The `vllm` provider is designed for self-hosted/custom endpoints and should also allow keyless access.

🧪 Test Environment & Hardware
📸 Proof of Work (Optional for Docs)
Click to view Logs/Screenshots
Config used:

```json
{
  "provider": "vllm",
  "model": "kimi-k2.5-free",
  "providers": {
    "vllm": {
      "api_key": "",
      "api_base": "https://opencode.ai/zen/v1"
    }
  }
}
```

☑️ Checklist