
Add native OpenRouter model support#232

Open
lewtun wants to merge 1 commit into main from codex/native-openrouter-models

Conversation

lewtun (Member) commented May 6, 2026

Summary

  • add native openrouter/... model-id handling before the Hugging Face Router fallback
  • keep openai/... reserved for direct OpenAI usage and intentionally do not wire OPENAI_BASE_URL
  • update CLI model-switching help/docs and ensure OpenRouter ids bypass HF Router catalog lookup
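The routing order the bullets describe can be sketched as a simple prefix check; the function name and route labels below are illustrative stand-ins, not the repo's actual identifiers:

```python
def resolve_route(model_id: str) -> str:
    """Sketch of the resolution order described in the summary (names assumed)."""
    if model_id.startswith("openrouter/"):
        return "openrouter"  # new: handled natively, bypasses HF Router catalog lookup
    if model_id.startswith("openai/"):
        return "openai"      # reserved for direct OpenAI usage; no OPENAI_BASE_URL wiring
    return "hf-router"       # everything else falls back to the Hugging Face Router
```

For example, `resolve_route("openrouter/anthropic/claude-opus-4.7")` returns `"openrouter"` without consulting the HF Router catalog, while an unprefixed id like `"some-org/some-model"` still takes the fallback path.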

Tests

  • uv run --extra dev pytest tests/unit/test_llm_params.py tests/unit/test_cli_rendering.py
  • uv run ruff check .
  • uv run ruff format --check .

Refs #223, #197, #188

Co-authored-by: OpenAI Codex <codex@openai.com>
github-actions bot commented May 6, 2026

Claude finished @lewtun's task in 3m 27s


Review: PR #232 — Add native OpenRouter model support

No blocking issues; three P1s. The routing change is small and well-scoped: openrouter/... short-circuits before the HF Router fallback in both _resolve_llm_params and _print_hf_routing_info, with matching tests. No P0 found in the routing/auth/agent-loop chain. The three items below are quality-of-life misses for the new prefix.


P1 — Auth error message doesn't mention OpenRouter

agent/core/agent_loop.py:518-520 — when LiteLLM raises 401 for a missing/invalid OpenRouter key, the friendly hint enumerates only Anthropic / OpenAI / HF Router:

"  • Anthropic:   export ANTHROPIC_API_KEY=sk-...\n"
"  • OpenAI:      export OPENAI_API_KEY=sk-...\n"
"  • HF Router:   export HF_TOKEN=hf_...\n\n"

A user picking openrouter/anthropic/claude-opus-4.7 with no OPENROUTER_API_KEY set will hit this branch and get a misleading hint. Add a fourth bullet for OpenRouter so the new prefix has discoverable error guidance.
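A hedged sketch of the extended hint: the three existing bullets are quoted from the review, while the OpenRouter line is an assumption (OPENROUTER_API_KEY is the env var LiteLLM conventionally reads for openrouter/ ids, and sk-or-... is OpenRouter's key prefix):

```python
# Existing bullets copied from the review; the OpenRouter bullet is the
# proposed addition and its exact wording/env-var name is an assumption.
AUTH_HINT = (
    "  • Anthropic:   export ANTHROPIC_API_KEY=sk-...\n"
    "  • OpenAI:      export OPENAI_API_KEY=sk-...\n"
    "  • OpenRouter:  export OPENROUTER_API_KEY=sk-or-...\n"
    "  • HF Router:   export HF_TOKEN=hf_...\n\n"
)
```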

P1 — _get_research_model doesn't step down openrouter/anthropic/...

agent/tools/research_tool.py:222-229 has explicit Sonnet step-downs for anthropic/... and bedrock/.../anthropic, but the new openrouter/anthropic/... ids fall through to "use the same model" — so a session on openrouter/anthropic/claude-opus-4.7 runs every research sub-agent on Opus too. The PR brought a new shape into a function that's pattern-matching prefixes; the step-down should be extended:

if main_model.startswith("openrouter/anthropic/"):
    return "openrouter/anthropic/claude-sonnet-4-6"

Not a correctness break (the fall-through works), but for a frontier-model-via-OpenRouter user this materially changes per-session cost.

P1 — test_openai_base_url_is_not_forwarded doesn't test what its name (and the README) claim

tests/unit/test_llm_params.py:33-38 and README.md:64-65 together promise that openai/... is "intentionally not wired" to OPENAI_BASE_URL, so a user who has OPENAI_BASE_URL=https://openrouter.ai/api/v1 set in their env won't have OpenAI calls silently rerouted. The test only verifies that _resolve_llm_params returns {"model": "openai/gpt-5.5"} — but LiteLLM's OpenAI adapter honors OPENAI_BASE_URL from the environment regardless of what's in the params dict (no api_base override is set in llm_params.py:187-197). The invariant the README promises isn't actually enforced.

Either (a) tighten the test to assert end-to-end routing — e.g. monkeypatch litellm.acompletion and verify the resolved request URL when OPENAI_BASE_URL is set — or (b) soften the README claim to "we don't pass api_base; LiteLLM still honors OPENAI_BASE_URL from your env, set it deliberately." Right now a reader could reasonably expect protection that isn't there.
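The gap can be demonstrated with a self-contained sketch. The resolver and adapter below are stand-ins for the real code, with the adapter mirroring LiteLLM's documented behavior of honoring OPENAI_BASE_URL from the environment when no explicit api_base is passed; a params-only assertion passes even while the env var silently reroutes the request:

```python
import os

def _resolve_llm_params(model_id: str) -> dict:
    # Stand-in for the real resolver: openai/... ids get only a model key,
    # no api_base override (per the review's reading of llm_params.py).
    return {"model": model_id}

def fake_openai_adapter(params: dict) -> str:
    # Mirrors the env-var fallback: explicit api_base wins, otherwise
    # OPENAI_BASE_URL from the environment is honored.
    return params.get("api_base") or os.environ.get(
        "OPENAI_BASE_URL", "https://api.openai.com/v1"
    )

os.environ["OPENAI_BASE_URL"] = "https://openrouter.ai/api/v1"
params = _resolve_llm_params("openai/gpt-5.5")
resolved_url = fake_openai_adapter(params)

# The params-level check (what the current unit test asserts) passes...
assert "api_base" not in params
# ...yet the request would still be rerouted, so the README claim is not enforced.
assert resolved_url == "https://openrouter.ai/api/v1"
```

An end-to-end test in the spirit of option (a) would monkeypatch the real completion call and assert on the resolved URL rather than on the params dict alone.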


• Branch: codex/native-openrouter-models
