Description
The current implementation for certain AI model providers is rigid and prone to failure due to non-standard parameter naming. This issue proposes standardizing the API request structure and refactoring the "In House" provider into a more flexible "Self-Hosted" configuration that supports custom endpoints and private model instances (e.g., Ollama, vLLM).
Current Behavior
- Incorrect Payload Keys: Both `RCInHouseHostedModels.ts` and `GroqModels.ts` send non-standard keys like `rcModel` or `groqModel` in the JSON body, leading to `400 Bad Request` errors from standard OpenAI-compatible APIs.
- Hardcoded URLs: The "In House" provider forces a URL structure of `http://${rcModel}/v1`, which prevents the use of HTTPS, custom ports, or specific sub-paths.
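The payload mismatch described above can be sketched as follows. The field names besides `model` and `messages` are illustrative, and the prompt content is a placeholder; only `model` is the key standard OpenAI-compatible servers accept.

```typescript
// Non-standard body as currently sent: provider-specific key `rcModel`
// is rejected by OpenAI-compatible servers with 400 Bad Request.
const nonStandardBody = {
  rcModel: "llama3",
  messages: [{ role: "user", content: "Hello" }],
};

// Standard OpenAI-compatible body: uses the `model` key.
const standardBody = {
  model: "llama3",
  messages: [{ role: "user", content: "Hello" }],
};

console.log("model" in standardBody);    // true
console.log("model" in nonStandardBody); // false
```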
Proposed Changes
- Parameter Standardization: Update request bodies to use the standard `model` key across all providers to ensure compatibility.
- Provider Renaming: Refactor the "In House" provider labels across the UI and settings to "Self-Hosted" for better clarity.
- Custom Endpoint and Model Support: Introduce new settings `ID_SELF_HOSTED_URL` and `ID_SELF_HOSTED_MODEL` to allow users to input their own base API path (e.g., `https://ollama.internal.net/v1`) and model (e.g., `openai/gpt-oss-20b`).
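A minimal sketch of how the proposed settings might feed into a request. The setting names `ID_SELF_HOSTED_URL` and `ID_SELF_HOSTED_MODEL` come from this issue; the `buildChatRequest` helper and the settings interface are hypothetical illustrations, not the actual implementation.

```typescript
// Hypothetical shape for the two new settings proposed above.
interface SelfHostedSettings {
  ID_SELF_HOSTED_URL: string;   // e.g. "https://ollama.internal.net/v1"
  ID_SELF_HOSTED_MODEL: string; // e.g. "openai/gpt-oss-20b"
}

// Build a chat-completions request from user-supplied settings instead of
// the hardcoded `http://${rcModel}/v1` pattern, and use the standard
// `model` key in the body.
function buildChatRequest(settings: SelfHostedSettings, userPrompt: string) {
  // Strip a trailing slash so we don't produce ".../v1//chat/completions".
  const base = settings.ID_SELF_HOSTED_URL.replace(/\/$/, "");
  return {
    url: `${base}/chat/completions`,
    body: {
      model: settings.ID_SELF_HOSTED_MODEL,
      messages: [{ role: "user", content: userPrompt }],
    },
  };
}

const req = buildChatRequest(
  {
    ID_SELF_HOSTED_URL: "https://ollama.internal.net/v1/",
    ID_SELF_HOSTED_MODEL: "openai/gpt-oss-20b",
  },
  "Hello"
);
console.log(req.url); // "https://ollama.internal.net/v1/chat/completions"
```

Because the full base path is user-provided, HTTPS, custom ports, and sub-paths all work without further code changes.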
