Support additional AI providers for dynamic topic resolution (Claude, GPT, Ollama) #1

@Supull

Description

@Supull

Currently ros2grapher uses Gemini AI for dynamic topic resolution via the --ai flag.

It would be useful to support additional AI providers so users can bring their own preferred model or use a locally hosted one.

Providers to consider:

  • Anthropic Claude (claude.ai API)
  • OpenAI GPT (gpt-4o, gpt-4o-mini)
  • Ollama (local models, no API key needed)
  • Mistral
  • Any OpenAI-compatible endpoint

The implementation would involve abstracting the current Gemini-specific code in ai_resolver.py into a provider interface, then adding implementations for each provider.
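A minimal sketch of what that provider interface could look like. All class and function names here are illustrative, not the project's actual API; only the Ollama backend is stubbed out, and the HTTP call itself is left unimplemented:

```python
# Hypothetical provider abstraction for ai_resolver.py.
# Names (AIProvider, resolve_topic, make_provider) are illustrative only.
from abc import ABC, abstractmethod


class AIProvider(ABC):
    """Common interface each backend (Gemini, Claude, OpenAI, Ollama) implements."""

    @abstractmethod
    def resolve_topic(self, prompt: str) -> str:
        """Send the resolution prompt to the model and return its reply text."""


class OllamaProvider(AIProvider):
    """Local model served by Ollama; no API key needed."""

    def __init__(self, model: str = "llama3", host: str = "http://localhost:11434"):
        self.model = model
        self.host = host

    def resolve_topic(self, prompt: str) -> str:
        # A real implementation would POST the prompt to the Ollama HTTP API here.
        raise NotImplementedError


# Registry mapping --ai-provider values to implementations; the other
# backends (GeminiProvider, ClaudeProvider, OpenAIProvider) would be
# registered the same way once written.
PROVIDERS: dict[str, type[AIProvider]] = {
    "ollama": OllamaProvider,
}


def make_provider(name: str, **kwargs) -> AIProvider:
    """Instantiate the backend selected by --ai-provider."""
    try:
        return PROVIDERS[name](**kwargs)
    except KeyError:
        raise ValueError(f"unknown provider {name!r}; choices: {sorted(PROVIDERS)}")
```

Keeping a plain registry dict makes adding a new backend a two-step change: write the subclass, register it.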

A good starting point would be adding a --ai-provider flag:

ros2grapher ./src --ai --ai-provider gemini   # default
ros2grapher ./src --ai --ai-provider claude
ros2grapher ./src --ai --ai-provider openai
ros2grapher ./src --ai --ai-provider ollama --ai-model llama3
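The flag wiring for the examples above could be sketched with `argparse` roughly like this; the choices list and defaults mirror the examples but are assumptions, not the project's existing CLI code:

```python
# Illustrative argparse wiring for the proposed --ai-provider / --ai-model flags.
import argparse

parser = argparse.ArgumentParser(prog="ros2grapher")
parser.add_argument("path", help="source directory to scan")
parser.add_argument("--ai", action="store_true",
                    help="enable dynamic topic resolution via an AI model")
parser.add_argument("--ai-provider", default="gemini",
                    choices=["gemini", "claude", "openai", "ollama"],
                    help="AI backend to use (default: gemini)")
parser.add_argument("--ai-model", default=None,
                    help="model name for the chosen provider, e.g. llama3")

# Parsing the last example command line from above:
args = parser.parse_args(["./src", "--ai", "--ai-provider", "ollama",
                          "--ai-model", "llama3"])
```

`--ai-model` defaults to `None` so each provider can fall back to its own sensible default model.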

Contributions welcome.

Metadata

Assignees: no one assigned
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)
