This repository was archived by the owner on Mar 14, 2026. It is now read-only.
feat: Add LiteLLM Proxy Integration for Multi-Provider LLM Support #34
Merged
takasaki404 merged 26 commits into main on Sep 6, 2025
Conversation
… add configuration for models
- Added new endpoint `/completions_proxy` to handle chat completions via LiteLLM Proxy.
- Introduced LLMClient for unified access to multiple LLM providers.
- Implemented streaming and non-streaming responses for chat completions.
- Added endpoints for listing available models and health checks.
- Created standardized data models for LLM requests and responses.
- Updated configuration settings for LiteLLM Proxy.
- Added exception handling for OpenAI-specific errors.
- Implemented tests for the new chat proxy functionality and LLM client.
- Ensured backward compatibility with existing OpenAI models.
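The "standardized data models for LLM requests and responses" could be sketched as plain dataclasses. The exact field names in `app/models/llm.py` are not visible in this PR page, so the following is an illustrative assumption modeled on the OpenAI chat-completions shape:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ChatMessage:
    """A single chat turn; role is typically "system", "user", or "assistant"."""
    role: str
    content: str


@dataclass
class ChatCompletionRequest:
    """Hypothetical request model; field names are illustrative, not from the PR."""
    model: str
    messages: List[ChatMessage]
    stream: bool = False          # non-streaming by default, per the PR's feature list
    temperature: Optional[float] = None


@dataclass
class ChatCompletionResponse:
    """Hypothetical response model mirroring an OpenAI-style reply."""
    model: str
    content: str
    finish_reason: str = "stop"
```

Keeping request and response shapes provider-agnostic like this is what lets one set of models serve multiple providers behind the proxy.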
…vironment variable usage
…ent and formatting
…LLM config for database logging
…gging and specify database URL
…anced functionality
… to model configurations in litellm_config.yaml
…pseek-r1 and glm-4_5
takasaki404 approved these changes on Sep 6, 2025
Title:
feat: Add LiteLLM Proxy Integration for Multi-Provider LLM Support

Description:
Summary
Adds LiteLLM Proxy integration to enable routing to multiple LLM providers through a unified API interface while maintaining full backward compatibility.
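The unified-interface idea above can be sketched as a small client whose only job is to POST an OpenAI-style payload to the proxy. This is not the actual `LLMClient` from the PR; it is a minimal stand-in with an injectable transport so the routing can be exercised without a network (the class name `LLMClient` comes from the PR, everything else, including the transport signature and URL, is assumed):

```python
from typing import Any, Callable, Dict, List


class LLMClient:
    """Minimal sketch of a LiteLLM Proxy client (illustrative, not the PR's code).

    `transport` is a callable (url, payload_dict) -> response_dict, injected so
    the class can be tested without real HTTP; a production version would use
    an async HTTP library instead.
    """

    def __init__(self, base_url: str,
                 transport: Callable[[str, Dict[str, Any]], Dict[str, Any]]):
        self.base_url = base_url.rstrip("/")
        self.transport = transport

    def chat_completion(self, model: str,
                        messages: List[Dict[str, str]]) -> Dict[str, Any]:
        # POST to the proxy's OpenAI-compatible chat-completions route; the
        # proxy, not the client, decides which provider serves `model`.
        payload = {"model": model, "messages": messages}
        return self.transport(f"{self.base_url}/chat/completions", payload)


def fake_transport(url: str, payload: Dict[str, Any]) -> Dict[str, Any]:
    # Stub standing in for the proxy: echoes the routed URL and model.
    return {"url": url, "model": payload["model"], "content": "ok"}
```

Because provider selection lives in the proxy configuration, swapping or adding providers requires no change to calling code, which is what preserves backward compatibility.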
Changes
- LLM client (`app/clients/llm_client.py`): async client for LiteLLM Proxy communication
- New endpoints: `/chat/completions_proxy`, `/chat/models`, `/chat/health`
- Configuration updates (`app/core/config.py`)
- LiteLLM Proxy infrastructure (`infra/litellm/`)
- Data models (`app/models/llm.py`)

Key Features
Backward Compatibility
- `/chat/completions` endpoint preserved
- `app/ai/aimo.py` remains untouched

Setup
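A `litellm_config.yaml` along the lines the commit messages describe (model entries plus database logging) might look like this. The model names `deepseek-r1` and `glm-4_5` appear in the commits; the provider routes, env-var names, and database setting here are placeholders, not values from the PR:

```yaml
# Hypothetical litellm_config.yaml sketch; all values are placeholders.
model_list:
  - model_name: deepseek-r1
    litellm_params:
      model: deepseek/deepseek-reasoner      # assumed provider route
      api_key: os.environ/DEEPSEEK_API_KEY   # assumed env var name
  - model_name: glm-4_5
    litellm_params:
      model: openai/glm-4.5                  # assumed provider route
      api_base: os.environ/GLM_API_BASE
      api_key: os.environ/GLM_API_KEY

general_settings:
  # Commits mention enabling database logging and specifying a database URL.
  database_url: os.environ/DATABASE_URL
```

Callers always address the proxy by `model_name`; the `litellm_params` block maps that alias to a concrete provider and credentials.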