
Integrate with LiteLLM so that max can use any proprietary or open-weights model without any issues #288

@Greatz08

Description


LiteLLM makes it easy to try any model, whether through different LLM API provider keys or self-hosted local open-weights models, so I suggest integrating it into this project. Many other open-source projects already use it to let users choose any model: OpenRouter, a direct service provider, or a self-hosted open-weights model. LiteLLM already supports 100+ LLMs and is quick to add support for new providers, so I suggest integrating this project with it. The rest is up to you :-)
(By the way, LiteLLM is an open-source project like this one. You can read more about it at https://github.com/BerriAI/litellm or https://docs.litellm.ai/docs/)
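To illustrate the suggestion, here is a minimal sketch of what routing the project's chat calls through LiteLLM's provider-agnostic `completion()` API could look like. The function names (`build_request`, `ask`) are hypothetical, the model strings are examples, and `litellm` is assumed to be installed (`pip install litellm`); this is not the project's actual API.

```python
def build_request(model: str, prompt: str) -> dict:
    """Assemble the provider-agnostic request shape LiteLLM expects.

    The same call works across providers; only the model string changes,
    e.g. "gpt-4o", "ollama/llama3", or "openrouter/anthropic/claude-3.5-sonnet".
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str) -> str:
    """Send a single-turn prompt through LiteLLM and return the reply text."""
    # Imported lazily so the rest of the app still loads if litellm is absent.
    from litellm import completion

    response = completion(**build_request(model, prompt))
    return response.choices[0].message.content
```

Because LiteLLM normalizes responses to the OpenAI format, swapping between a hosted provider and a local open-weights model is just a change of the `model` string (plus the relevant API key or base URL in the environment).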
