
feat: support OpenRouter and custom OpenAI-compatible base URLs #197

@sowhan

Description

Problem

ml-intern currently only works with Anthropic and OpenAI directly.
There's no way to point the OpenAI client at a custom base URL,
which means OpenRouter (and other compatible providers like Together,
Groq, etc.) aren't supported.

Proposed solution

Respect the OPENAI_BASE_URL environment variable when initializing the
OpenAI client, the same pattern used by Claude Code, smolagents'
OpenAIModel, and the broader OpenAI SDK ecosystem.

Optionally, add a --base-url CLI flag for explicit overrides. A sketch of
both is below.
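
For concreteness, a minimal sketch of what the passthrough could look
like, assuming ml-intern uses the official openai Python SDK (v1+). The
make_openai_client helper and the flag-precedence order are illustrative,
not existing ml-intern code:

```python
import os

from openai import OpenAI  # official openai Python SDK (v1+)


def make_openai_client(base_url_flag: str | None = None) -> OpenAI:
    """Build the OpenAI client: an explicit --base-url flag wins,
    then the OPENAI_BASE_URL env var, then the SDK default."""
    base_url = base_url_flag or os.environ.get("OPENAI_BASE_URL")
    return OpenAI(
        api_key=os.environ["OPENAI_API_KEY"],
        base_url=base_url,  # None -> SDK default (https://api.openai.com/v1)
    )
```

Recent versions of the openai SDK already read OPENAI_BASE_URL themselves
when base_url is omitted, so in the simplest case the fix may be as small
as not hard-coding the endpoint.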

Example usage

```bash
export OPENAI_API_KEY="sk-or-v1-..."
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
ml-intern --model openai/deepseek/deepseek-r1 "fine-tune a Whisper model"
```

This unlocks 300+ models on OpenRouter without any new dependencies.

Question

Would you accept a PR for this? Happy to keep it minimal — just the
base URL passthrough, no refactoring.
