feat: Add support for custom provider endpoints and headers #245

Open · wants to merge 1 commit into main

Conversation

tarcisiojr

This commit introduces several enhancements for configuration flexibility and usability.

Users can now specify a custom `baseURL` and `headers` for API providers (e.g., OpenAI, Gemini, Groq). This enables routing requests through proxies, corporate gateways, or connecting to self-hosted, API-compatible services.

Additionally, two new command-line flags have been added:
- `--config`: Allows specifying a path to a custom JSON configuration file, overriding the default search behavior.
- `--prompt-file`: Enables loading a prompt from a markdown file, making it easier to run complex, multi-line prompts in non-interactive mode.

The README has been updated to document these new features.
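To make the shape of the change concrete, a custom-endpoint configuration might look roughly like the following. This is a sketch only: the field and provider names are assumptions based on the description above, not the merged schema.

```json
{
  "providers": {
    "openai": {
      "baseURL": "https://gateway.internal.example.com/v1",
      "headers": {
        "X-Gateway-Token": "replace-me"
      }
    }
  }
}
```

With the new flags, a file like this could then be loaded explicitly, e.g. `--config ./my-config.json --prompt-file ./prompt.md` (paths are illustrative).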
@obukhovaa

I did something similar as part of #243, but mostly focused on Anthropic and Vertex. It turned out to be quite tricky to make it work with a LiteLLM proxy in passthrough mode, but it seems to be working fine now. When I attempted to use Anthropic tools with a `LOCAL_ENDPOINT` override, translated to the OpenAI format on LiteLLM's behalf, it didn't work properly and yielded errors, apparently due to a mistranslation.
It would be smart to consolidate the changes related to `baseURL` and `headers` while keeping the provider-specific implementations in place.
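The consolidation suggested here could take the shape of a shared functional-options type that every provider constructor accepts. The following is a minimal sketch of that idea; all names (`ProviderConfig`, `WithBaseURL`, `WithHeader`) are hypothetical and not taken from this PR's diff.

```go
package main

import (
	"fmt"
	"net/http"
)

// ProviderConfig holds the endpoint overrides that are common to all
// providers. Names are illustrative; the PR's actual types may differ.
type ProviderConfig struct {
	BaseURL string
	Headers http.Header
}

// Option mutates a ProviderConfig, following the functional-options
// pattern hinted at by helpers like WithOpenAIBaseURL in the diff.
type Option func(*ProviderConfig)

// WithBaseURL overrides the provider's default API endpoint.
func WithBaseURL(u string) Option {
	return func(c *ProviderConfig) { c.BaseURL = u }
}

// WithHeader adds a custom header to every request, e.g. for a
// corporate gateway or a LiteLLM proxy in passthrough mode.
func WithHeader(k, v string) Option {
	return func(c *ProviderConfig) {
		if c.Headers == nil {
			c.Headers = http.Header{}
		}
		c.Headers.Set(k, v)
	}
}

// NewProviderConfig applies the options in order over a zero config.
func NewProviderConfig(opts ...Option) ProviderConfig {
	var c ProviderConfig
	for _, o := range opts {
		o(&c)
	}
	return c
}

func main() {
	cfg := NewProviderConfig(
		WithBaseURL("http://localhost:4000"), // e.g. a local LiteLLM proxy
		WithHeader("X-Api-Key", "secret"),
	)
	fmt.Println(cfg.BaseURL, cfg.Headers.Get("X-Api-Key"))
}
```

A shared type like this would let each provider keep its own client construction while the `baseURL`/`headers` plumbing lives in one place.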

```go
			WithOpenAIBaseURL(localEndpoint),
		)
	}
}
```

Too much code duplication.
