
LATS agent architecture not supported for Ollama provider #60

@ocombe


Hello!
When using CODEGRAPH_AGENT_ARCHITECTURE=lats with CODEGRAPH_LLM_PROVIDER=ollama, the agentic tools fail with:

LATS not yet supported for provider Ollama
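For reference, the failing configuration can be reproduced with two environment variables (names taken verbatim from this report):

```shell
# Select the Ollama LLM provider together with the LATS agent architecture.
export CODEGRAPH_LLM_PROVIDER=ollama
export CODEGRAPH_AGENT_ARCHITECTURE=lats
```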

The build_lats() function in crates/codegraph-mcp-rig/src/agent/builder.rs only has cases for OpenAI and Anthropic providers. Ollama falls through to the error case.
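To illustrate the fall-through, here is a minimal, self-contained sketch of that dispatch pattern. The `RigProvider` enum, the return type, and the exact error text are assumptions modeled on this report; the real code lives in `crates/codegraph-mcp-rig/src/agent/builder.rs` and differs in detail:

```rust
// Hypothetical sketch of the provider dispatch in build_lats().
// Only OpenAI and Anthropic have explicit arms; every other provider,
// including Ollama, falls through to the error case.
#[derive(Debug)]
enum RigProvider {
    OpenAI,
    Anthropic,
    Ollama,
}

fn build_lats(provider: RigProvider) -> Result<&'static str, String> {
    match provider {
        RigProvider::OpenAI => Ok("lats-openai"),       // supported
        RigProvider::Anthropic => Ok("lats-anthropic"), // supported
        // No Ollama arm, so it lands here:
        other => Err(format!("LATS not yet supported for provider {:?}", other)),
    }
}

fn main() {
    // The Ollama variant hits the catch-all error arm.
    println!("{:?}", build_lats(RigProvider::Ollama));
}
```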

Proposed fix: add an Ollama case following the same pattern as the other providers:

```rust
#[cfg(feature = "ollama")]
RigProvider::Ollama => {
    let client = RigLLMAdapter::ollama_client();
    let model = client.completion_model(&model_name);
    Ok(Box::new(LatsAgent {
        model,
        factory,
        max_turns: self.max_turns,
        tier: self.tier,
    }))
}
```

This compiles and allows LATS to be invoked with Ollama (confirmed via debug logs showing "framework":"AutoAgents-LATS").

Would you like me to open a PR for this? Or is there a specific reason LATS support is not available for Ollama?
