Problem
ml-intern currently works only with Anthropic and OpenAI directly.
There's no way to point the OpenAI client at a custom base URL,
which means OpenRouter and other OpenAI-compatible providers
(Together, Groq, etc.) aren't supported.
Proposed solution
Respect the OPENAI_BASE_URL environment variable when initializing the
OpenAI client; this is the same pattern used by Claude Code, smolagents'
OpenAIModel, and the broader OpenAI SDK ecosystem.
Optionally, add a --base-url CLI flag for explicit overrides.
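A minimal sketch of what the passthrough could look like, assuming the
openai Python SDK v1+ (the helper name and flag wiring below are
hypothetical, not ml-intern's actual code):

import os

from openai import OpenAI

def make_openai_client(base_url: str | None = None) -> OpenAI:
    # Hypothetical helper: an explicit --base-url value wins; otherwise
    # fall back to the OPENAI_BASE_URL environment variable. Passing None
    # keeps the SDK's default endpoint (https://api.openai.com/v1).
    # The SDK reads OPENAI_API_KEY from the environment as usual.
    resolved = base_url or os.environ.get("OPENAI_BASE_URL")
    return OpenAI(base_url=resolved)

Recent versions of the openai SDK also read OPENAI_BASE_URL themselves
when base_url isn't passed, so the explicit resolution above mainly
serves the CLI override.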
Example usage
export OPENAI_API_KEY="sk-or-v1-..."
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
ml-intern --model openai/deepseek/deepseek-r1 "fine-tune a Whisper model"
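If the optional flag lands, the explicit equivalent would be (flag name
as proposed above, not yet implemented):

ml-intern --base-url https://openrouter.ai/api/v1 --model openai/deepseek/deepseek-r1 "fine-tune a Whisper model"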
This unlocks 300+ models on OpenRouter without any new dependencies.
Question
Would you accept a PR for this? Happy to keep it minimal — just the
base URL passthrough, no refactoring.