
Allow vllm provider to work without API key #384

Open

Loule95450 wants to merge 1 commit into sipeed:main from Loule95450:allow-vllm-no-apikey

Conversation

Loule95450 commented Feb 17, 2026

📝 Description

Allow the vllm provider to work without an API key. This enables use of OpenAI-compatible endpoints that don't require authentication, such as opencode.ai/zen free models (kimi-k2.5-free, glm-5-free, etc.).

A single-line change in pkg/providers/http_provider.go: added && providerName != "vllm" to the API-key validation check at line 441 (shown below).
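For reference, the updated check as it appears in the diff (the same lines Copilot quotes in its review further down):

if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
	return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)
}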

🗣️ Type of Change

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • ✨ New feature (non-breaking change which adds functionality)
  • 📖 Documentation update
  • ⚡ Code refactoring (no functional changes, no api changes)

🤖 AI Code Generation

  • 🤖 Fully AI-generated (100% AI, 0% Human)
  • 🛠️ Mostly AI-generated (AI draft, Human verified/modified)
  • 👨‍💻 Mostly Human-written (Human lead, AI assisted or none)

🔗 Linked Issue

N/A

📚 Technical Context (Skip for Docs)

  • Reference: https://opencode.ai/zen
  • Reasoning: The HTTPProvider.Chat() method already handles empty API keys correctly (skips the Authorization header at line 104), but the CreateProvider() validation at line 441 blocks empty keys for all providers except bedrock. The vllm provider is designed for self-hosted/custom endpoints and should also allow keyless access.
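For illustration, that header logic has roughly this shape (a minimal sketch; beyond the HTTPProvider name mentioned above, the method and field names are assumptions, not the exact picoclaw code):

// Sketch only: attach the Authorization header only when an API key is
// configured, so keyless OpenAI-compatible endpoints already work at the
// transport level.
func (p *HTTPProvider) setAuthHeader(req *http.Request) {
	if p.apiKey != "" {
		req.Header.Set("Authorization", "Bearer "+p.apiKey)
	}
}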

🧪 Test Environment & Hardware

  • Hardware: Raspberry Pi (aarch64)
  • OS: Debian (Linux ARM64)
  • Model/Provider: kimi-k2.5-free, glm-5-free, gpt-5-nano via opencode.ai/zen/v1
  • Channels: Telegram, CLI

📸 Proof of Work (Optional for Docs)

Logs/Screenshots:
$ picoclaw agent -m "Hello! Who are you? Reply in one sentence."
2026/02/17 17:37:37 [INFO] agent: Agent initialized {skills_available=6, tools_count=13, skills_total=6}
2026/02/17 17:37:37 [INFO] agent: Processing message from cli:cron: Hello! Who are you? Reply in one sentence.
2026/02/17 17:37:39 [INFO] agent: LLM response without tool calls (direct answer) {iteration=1, content_chars=123}

🦞 Hello! I'm **picoclaw** 🦞, a lightweight AI assistant written in Go, here to help you with information, tasks, and more.

Config used:

{
  "provider": "vllm",
  "model": "kimi-k2.5-free",
  "providers": {
    "vllm": {
      "api_key": "",
      "api_base": "https://opencode.ai/zen/v1"
    }
  }
}

☑️ Checklist

  • My code/docs follow the style of this project.
  • I have performed a self-review of my own changes.
  • I have updated the documentation accordingly.

Skip the API key requirement check when using the vllm provider,
enabling use of OpenAI-compatible endpoints that don't require
authentication (e.g. opencode.ai/zen free models).
Copilot AI review requested due to automatic review settings February 17, 2026 19:58

Copilot AI left a comment

Pull request overview

Updates the HTTP provider factory to allow using the vllm provider against OpenAI-compatible endpoints that do not require authentication, by relaxing API key validation for that provider.

Changes:

  • Skip the “missing API key” error when provider=vllm.
  • Preserve existing API key requirements for other providers.


Comment on lines +441 to 442
if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)

Copilot AI Feb 17, 2026

This change introduces new provider-selection behavior (VLLM without an API key) but there doesn't appear to be a unit test covering CreateProvider for provider=vllm (or the auto-detect case) with an empty API key. Please add a factory test alongside the existing CreateProvider tests to ensure VLLM works without a key and that other providers still error when apiKey is empty.
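A minimal sketch of such a test, assuming a CreateProvider(cfg) factory and the config fields referenced in this thread (both the signature and the field names are assumptions, not verified against the repo):

// Sketch of a factory test; adjust names to the real config and factory API.
func TestCreateProviderEmptyAPIKey(t *testing.T) {
	cfg := &config.Config{}
	cfg.Providers.VLLM.APIBase = "https://opencode.ai/zen/v1"

	// vllm should now construct a provider without a key.
	cfg.Agents.Defaults.Provider = "vllm"
	if _, err := CreateProvider(cfg); err != nil {
		t.Fatalf("vllm without an API key should succeed, got: %v", err)
	}

	// Other providers must keep rejecting an empty key.
	cfg.Agents.Defaults.Provider = "openai"
	if _, err := CreateProvider(cfg); err == nil {
		t.Fatal("openai with an empty API key should still error")
	}
}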

Comment on lines +441 to 443
if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)
}

Copilot AI Feb 17, 2026

This API-key guard only checks the configured providerName. If the provider is auto-detected (i.e., cfg.Agents.Defaults.Provider is empty) but VLLM is selected via the fallback branch (cfg.Providers.VLLM.APIBase != ""), providerName stays empty and this condition will still reject an empty API key. Consider tracking the selected provider (e.g., set providerName = "vllm" when the VLLM fallback branch is chosen, or base the exemption on the chosen apiBase/config rather than the original config field).
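One possible shape of that fix, following the suggestion above (the fallback branch is paraphrased from this comment, not quoted from the repo):

// Record the auto-detected selection in the fallback branch...
if providerName == "" && cfg.Providers.VLLM.APIBase != "" {
	providerName = "vllm"
}
// ...so the existing guard exempts the fallback case as well:
if apiKey == "" && !strings.HasPrefix(model, "bedrock/") && providerName != "vllm" {
	return nil, fmt.Errorf("no API key configured for provider (model: %s)", model)
}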
