[FEATURE] Add provider abstraction and Ollama support #14

@EQuBitC18

Description

Feature Summary

Add an LLM provider abstraction so Tezrisat can cleanly support both OpenAI and Ollama. Longer term, make Tezrisat fully LLM-provider-agnostic, with an abstraction that covers most well-known LLM providers; the immediate priority is Ollama support.

Problem/Use Case

Generation is currently coupled to the OpenAI API, which makes local-model usage via Ollama hard to adopt and difficult to maintain without invasive changes.

Proposed Solution

Introduce a provider interface and implementations for OpenAI and Ollama, with configuration-driven provider/model selection.
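A minimal sketch of what such an abstraction could look like, in Python. All class and function names here are hypothetical (Tezrisat's actual code and structure are not shown in this issue); the request-body shapes follow the public OpenAI chat-completions and Ollama `/api/generate` formats. Network calls are omitted so the sketch stays self-contained:

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Hypothetical provider interface: one method hides vendor-specific details."""

    @abstractmethod
    def build_request(self, prompt: str) -> dict:
        """Return the vendor-specific request body for a generation call."""


class OpenAIProvider(LLMProvider):
    def __init__(self, model: str):
        self.model = model

    def build_request(self, prompt: str) -> dict:
        # Shape of an OpenAI chat-completions request body
        return {"model": self.model,
                "messages": [{"role": "user", "content": prompt}]}


class OllamaProvider(LLMProvider):
    def __init__(self, model: str, host: str = "http://localhost:11434"):
        self.model = model
        self.host = host  # Ollama's default local endpoint

    def build_request(self, prompt: str) -> dict:
        # Shape of an Ollama /api/generate request body
        return {"model": self.model, "prompt": prompt, "stream": False}


# Registry enables configuration-driven selection without touching call sites
PROVIDERS = {"openai": OpenAIProvider, "ollama": OllamaProvider}


def make_provider(config: dict) -> LLMProvider:
    """Build a provider from config, e.g. {"provider": "ollama", "model": "llama3"}."""
    cls = PROVIDERS[config["provider"]]
    return cls(model=config["model"])
```

With this shape, switching from OpenAI to a local Ollama model is a configuration change only, and adding a new vendor means adding one subclass plus one registry entry.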

Alternative Solutions

No response

Priority

Would be helpful

Feature Scope

AI/ML (Content Generation)

Additional Context

No response

Checklist

  • This feature would benefit other users, not just me
  • I have searched for similar feature requests
  • I am willing to help implement this feature

Metadata

Labels

  • enhancement: New feature or request
  • help wanted: Extra attention is needed

Projects

No projects

Relationships

None yet

Development

No branches or pull requests
