
[Feature]: Add support for Ollama and OpenAI-compatible LLM providers #32

@MANOJ-80

Description


Problem Statement

Right now OpenWork only works with Anthropic, OpenAI, and Google. That's great for people with API credits, but what about everyone else?

OpenWork is open source, so it feels a bit weird that it's locked to just the big cloud providers. A lot of us run local models with Ollama or use services like OpenRouter, Mistral AI, or Groq, all of which follow the OpenAI API format. It would be really helpful to have support for these out of the box.

Proposed Solution

Add two new provider options:

  • Ollama for running models locally (llama3, mistral, deepseek-coder, qwen, etc.). Just needs a base URL, no API key.
  • OpenAI-compatible for anything that speaks the OpenAI API format. You'd set a custom endpoint URL and API key, and it just works.
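To make the two options concrete, here's a minimal sketch of the settings each provider would need. All of the names here (`ProviderSettings`, `validateSettings`) are illustrative, not from the OpenWork codebase:

```typescript
// Hypothetical sketch: the settings the two proposed providers would need.
// None of these names exist in OpenWork today.

interface ProviderSettings {
  baseUrl?: string;
  apiKey?: string;
}

// Validate settings and fill in defaults.
// Ollama listens on http://localhost:11434 by default and needs no key;
// OpenAI-compatible services need both an endpoint URL and an API key.
function validateSettings(
  providerId: 'ollama' | 'openai-compatible',
  s: ProviderSettings
): Required<ProviderSettings> {
  if (providerId === 'ollama') {
    return { baseUrl: s.baseUrl ?? 'http://localhost:11434', apiKey: '' };
  }
  if (!s.baseUrl) throw new Error('openai-compatible: base URL is required');
  if (!s.apiKey) throw new Error('openai-compatible: API key is required');
  return { baseUrl: s.baseUrl, apiKey: s.apiKey };
}
```

The point of the split is that Ollama can "just work" with zero config, while OpenAI-compatible fails fast with a clear error if either field is missing.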

Use Case

This would help:

  • People who want to keep their data local
  • Folks who don't want to pay for every API call
  • Anyone running their own models on vLLM or LMStudio
  • Students and hobbyists just experimenting with agents
Basically, it makes OpenWork usable for a lot more people.

Additional Context

Looking at the codebase, the type system already has 'ollama' in
ProviderId but it's not implemented yet. The changes would mainly touch:

  • src/main/agent/runtime.ts: add Ollama to getModelInstance()
  • src/main/storage.ts: add base URL storage
  • src/main/ipc/models.ts: add Ollama to the providers list
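For illustration, the runtime.ts change could be a new branch in a getModelInstance()-style switch. This is only a sketch under assumptions: I haven't matched the real signature, and `ModelInstance` is a placeholder type, not OpenWork's actual return type:

```typescript
// Hypothetical sketch of extending a getModelInstance()-style switch.
// ModelInstance and the function shape are placeholders, not OpenWork's real code.

interface ModelInstance {
  provider: string;
  model: string;
  endpoint: string;
}

function getModelInstance(providerId: string, model: string, baseUrl?: string): ModelInstance {
  switch (providerId) {
    case 'ollama':
      // Ollama exposes an OpenAI-compatible API under /v1; no API key needed.
      return {
        provider: 'ollama',
        model,
        endpoint: (baseUrl ?? 'http://localhost:11434') + '/v1',
      };
    case 'openai-compatible':
      if (!baseUrl) throw new Error('A base URL is required for OpenAI-compatible providers');
      return { provider: 'openai-compatible', model, endpoint: baseUrl };
    default:
      throw new Error(`Unsupported provider: ${providerId}`);
  }
}
```

Since Ollama itself speaks the OpenAI format under /v1, the two cases could even share most of their request code, with Ollama just being a preset base URL and an empty key.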

This issue is also related to #5 and #7.

I'd be willing to work on this if there's interest.

Labels: enhancement (New feature or request)