Native support for Responses API #35

@kowyo

Description

Summary

Add native support for the OpenAI Responses API format, so that both the Anthropic and OpenAI SDKs are supported as first-class backends.

Motivation

While Anthropic models work well with the current Anthropic-endpoint format, OpenAI models do not perform as expected when forced through the same format. Many providers (Google, open-source models) support the OpenAI format well, but routing through proxies such as LiteLLM and OpenRouter currently has limitations:

  • LiteLLM does not map output_config.effort to reasoning.effort
  • OpenRouter BYOK has mapping issues (e.g. mapping output_config.effort to verbosity) despite its free-usage benefits

Supporting both SDKs natively will provide better performance and reliability.
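For context, the two APIs expose reasoning control through differently shaped parameters, which is what the proxies fail to translate. A minimal sketch of the request payloads (illustrative model names and values; no network calls):

```python
# Sketch of the request shapes each native API expects (illustrative values).

# OpenAI Responses API: reasoning effort is a first-class parameter.
openai_request = {
    "model": "o3",
    "input": "Explain the plan.",
    "reasoning": {"effort": "high"},  # the field proxies fail to map to
}

# Anthropic Messages API: the closest analogue is an extended-thinking budget.
anthropic_request = {
    "model": "claude-sonnet-4",
    "max_tokens": 2048,
    "messages": [{"role": "user", "content": "Explain the plan."}],
    "thinking": {"type": "enabled", "budget_tokens": 1024},
}
```

Because the shapes differ structurally (a nested `reasoning` object vs. a `thinking` budget), a generic pass-through proxy cannot map one onto the other without provider-specific logic.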

Proposed Solution

Refactor the codebase to support both SDKs natively:

  • Add native OpenAI SDK integration alongside existing Anthropic SDK
  • Handle format differences properly for each provider
  • Support OpenAI-specific features like reasoning.effort correctly

Acceptance Criteria

  • Native OpenAI SDK support implemented
  • Anthropic SDK continues to work as before
  • Proper handling of reasoning/effort parameters
  • Documentation for SDK selection and configuration

Additional Context

This will require a major refactor. Tested alternatives:

  • LiteLLM (does not map output_config.effort to reasoning.effort)
  • OpenRouter BYOK (maps output_config.effort to verbosity)

Google and many open-source models support OpenAI format well.

Metadata

Labels: enhancement (New feature or request)