This repository was archived by the owner on Oct 22, 2025. It is now read-only.

Conversation

@nick1udwig
Member

We can use llamafile, which serves an OpenAI-compatible API, as a local LLM host. This PR adds a settable base URL for that host, as well as a chat request that gets routed to that base URL.
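Since llamafile exposes an OpenAI-compatible API, routing a chat request only requires pointing the `/chat/completions` call at the configurable base URL. A minimal sketch of that idea, assuming llamafile's default local port 8080; the `build_chat_request` helper, the model name, and the base-URL constant are illustrative, not the PR's actual code:

```python
import json
from urllib import request

# Illustrative default: llamafile serves its OpenAI-compatible API on
# localhost:8080. The point of the PR is that this base URL is settable.
LLAMAFILE_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(base_url: str, messages: list) -> request.Request:
    """Build an OpenAI-style chat completion request against a settable base URL."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": "local", "messages": messages}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    LLAMAFILE_BASE_URL,
    [{"role": "user", "content": "Hello"}],
)
# Actually sending is left out so the sketch stays offline-safe:
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

Because only the base URL changes, the same request shape works against any OpenAI-compatible host, local or remote.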

@nick1udwig nick1udwig requested a review from jaxs-ribs May 18, 2024 02:16
