Closed as duplicate of #7
Labels: enhancement (New feature or request)
Description
Problem Statement
Right now OpenWork only works with Anthropic, OpenAI, and Google. That's great for people with API credits, but what about everyone else?
OpenWork is open source, so it feels a bit weird that it's locked to just the big cloud providers. A lot of us run local models with Ollama or use services like OpenRouter, Mistral AI, and Groq, all of which follow the OpenAI API format. It would be really helpful to have support for these out of the box.
Proposed Solution
Add two new provider options:
- Ollama for running models locally (llama3, mistral, deepseek-coder, qwen, etc.). Just needs a base URL, no API key.
- OpenAI-compatible for anything that speaks the OpenAI API format. You'd set a custom endpoint URL and API key, and it just works.
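To make the idea concrete, here is a minimal sketch of how the two provider options might be configured. All names here (`NewProviderId`, `ProviderConfig`, `buildProviderConfig`) are hypothetical and not taken from the OpenWork codebase; the point is just the validation difference between the two: Ollama needs only a base URL, while OpenAI-compatible endpoints also need a key.

```typescript
// Hypothetical types for the two proposed providers (illustrative only).
type NewProviderId = "ollama" | "openai-compatible";

interface ProviderConfig {
  id: NewProviderId;
  baseUrl: string; // e.g. http://localhost:11434 for a local Ollama
  apiKey?: string; // required for openai-compatible, unused for ollama
}

function buildProviderConfig(
  id: NewProviderId,
  baseUrl: string,
  apiKey?: string,
): ProviderConfig {
  if (!baseUrl) {
    throw new Error("baseUrl is required for both providers");
  }
  if (id === "openai-compatible" && !apiKey) {
    throw new Error("openai-compatible providers need an API key");
  }
  // Ollama needs no key, so drop any key passed by mistake.
  return id === "ollama" ? { id, baseUrl } : { id, baseUrl, apiKey };
}
```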
Use Case
This would help:
- People who want to keep their data local
- Folks who don't want to pay for every API call
- Anyone running their own models on vLLM or LMStudio
- Students and hobbyists just experimenting with agents
In short, it would make OpenWork usable for a lot more people.
Additional Context
Looking at the codebase, the type system already has 'ollama' in
ProviderId, but it's not implemented yet. The changes would mainly touch:
- src/main/agent/runtime.ts: add Ollama to getModelInstance()
- src/main/storage.ts: add base URL storage
- src/main/ipc/models.ts: add Ollama to the providers list
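Since getModelInstance() itself isn't shown here, this only sketches the endpoint-resolution part of the runtime change, under one assumption worth calling out: Ollama serves an OpenAI-compatible API under /v1 on its default port, while OpenAI-compatible services typically have the user supply the /v1 base URL themselves (as the OpenAI SDK's baseURL does). The function name is illustrative, not from the codebase.

```typescript
// Illustrative sketch of endpoint resolution for the two new providers.
function resolveChatEndpoint(
  providerId: "ollama" | "openai-compatible",
  baseUrl: string,
): string {
  const root = baseUrl.replace(/\/+$/, ""); // tolerate trailing slashes
  switch (providerId) {
    case "ollama":
      // Assumes Ollama's OpenAI-compatible API, served under /v1
      // (default base URL: http://localhost:11434).
      return `${root}/v1/chat/completions`;
    case "openai-compatible":
      // OpenRouter, Groq, vLLM, etc.: the user-supplied base URL
      // already includes /v1, so only the route is appended.
      return `${root}/chat/completions`;
  }
}
```

A nice side effect of this shape is that the "openai-compatible" branch needs no per-service special-casing: any service that speaks the OpenAI format works with just a base URL and key.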
This issue is also related to #5 and #7.
I'd be willing to work on this if there's interest.