
fix: built-in provider/model naming (LMStudio/Ollama/ComfyUI + sentinel rename) #49

@Flare576

Description


Problem

Three related naming issues that need coordinated fixes, including a migration for existing data:

1. "Local LLM" is wrong

The auto-detected provider at :1234 is LM Studio, not a generic "Local LLM". Ollama uses port 11434, not 1234. We're conflating two different tools under one wrong name.
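The port-to-name mapping described above can be sketched as a small helper. This is illustrative only; the function name and return type are assumptions, not the actual detection code.

```typescript
// Hypothetical sketch of port-based provider naming. Ports are the
// well-known defaults: 1234 (LM Studio), 11434 (Ollama), 8000 (ComfyUI).
type DetectedProvider = "LMStudio" | "Ollama" | "ComfyUI" | null;

function providerForPort(port: number): DetectedProvider {
  switch (port) {
    case 1234:
      return "LMStudio"; // LM Studio's OpenAI-compatible server default
    case 11434:
      return "Ollama"; // Ollama's default API port
    case 8000:
      return "ComfyUI"; // image provider, existing detection
    default:
      return null; // unknown port: no auto-detected provider
  }
}
```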

2. Auto-detect creates incomplete accounts

Both the TUI and Web onboarding auto-detect flows create a ProviderAccount with no models array. This causes:

  • /provider shows "no models available" for auto-configured accounts
  • The state migration in state-manager.ts silently patches this on next load, but only after a restart
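A fix for the incomplete-account problem could look like the sketch below: create the default model entry up front and point default_model at its GUID. The interfaces here are assumptions about the real ProviderAccount shape, not the actual types.

```typescript
// Illustrative sketch: auto-detected accounts get a proper default model
// entry instead of an empty models array. Field names (models,
// default_model, guid) are assumptions about the real data shape.
interface ModelEntry {
  guid: string;
  name: string;
}

interface ProviderAccount {
  name: string;
  models: ModelEntry[];
  default_model: string;
}

function makeAutoDetectedAccount(
  name: string,
  newGuid: () => string
): ProviderAccount {
  const defaultModel: ModelEntry = { guid: newGuid(), name: "default" };
  return {
    name,
    models: [defaultModel], // never empty for auto-detected accounts
    default_model: defaultModel.guid, // the GUID, not the bare account name
  };
}
```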

3. (default) sentinel is unfriendly in TUI

The sentinel value (default) means "omit the model field and let the server use its current default". It works fine in the Web UI, but in TUI YAML fields, typing extraction_model: Local LLM:(default) is awkward. It should be renamed to something typeable; default (no parentheses) is the leading candidate.

Migration required: existing accounts already have "(default)" in their data (model names, default_model refs). Any rename needs a state migration pass.
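That migration pass could look roughly like the following. This is a hedged sketch: the interfaces are assumptions about the real state shape, and it assumes default_model refs that used the old display name stored the literal string "(default)".

```typescript
// Sketch of the sentinel-rename migration: walk all accounts, rename
// model entries named "(default)" to "default", and fix default_model
// refs that pointed at the old display name. Types are assumed shapes.
interface ModelEntry {
  guid: string;
  name: string;
}

interface ProviderAccount {
  name: string;
  models: ModelEntry[];
  default_model?: string;
}

function migrateSentinel(accounts: ProviderAccount[]): ProviderAccount[] {
  return accounts.map((acct) => ({
    ...acct,
    models: acct.models.map((m) =>
      m.name === "(default)" ? { ...m, name: "default" } : m
    ),
    default_model:
      acct.default_model === "(default)" ? "default" : acct.default_model,
  }));
}
```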

Acceptance Criteria

  • Onboarding detects port and names the provider correctly:
    • :1234 → LMStudio
    • :11434 → Ollama
    • :8000/system_stats → ComfyUI (image provider, existing detection)
  • Auto-detected accounts are created with a proper (default)/default model entry and default_model set to its GUID (not the bare account name)
  • Sentinel renamed from (default) → default everywhere:
    • buildResolvedModel sentinel check in llm-client.ts
    • YAML template in yaml-provider.ts
    • ProviderEditor.tsx (web)
    • State migration: walk all accounts, rename name: "(default)" → name: "default", update any default_model refs that pointed to the old display name
  • TUI and Web onboarding: Ollama check added (:11434/api/tags or :11434/v1/models)
  • /provider shows the default model correctly for freshly auto-detected accounts
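For the Ollama onboarding check, a minimal reachability probe might look like this. It is a sketch under assumptions: it treats any 200 response from either listed endpoint as "Ollama is running", uses the global fetch (Node 18+), and the function name is illustrative.

```typescript
// Sketch of an Ollama auto-detect probe for onboarding. Tries both
// endpoints from the acceptance criteria; a short timeout keeps
// onboarding snappy when nothing is listening on :11434.
async function detectOllama(
  host = "http://localhost:11434"
): Promise<boolean> {
  for (const path of ["/api/tags", "/v1/models"]) {
    try {
      const res = await fetch(host + path, {
        signal: AbortSignal.timeout(1000), // 1s budget per endpoint
      });
      if (res.ok) return true;
    } catch {
      // connection refused or timeout: try the next endpoint
    }
  }
  return false;
}
```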

Out of Scope

The resolveTokenLimit bug (bare account name spec not finding token_limit) is being fixed in a separate branch alongside this issue's creation.

Metadata


Labels: bug (Something isn't working), enhancement (New feature or request)
