Problem
Three related naming issues that need coordinated fixes, including a migration for existing data:
1. "Local LLM" is wrong
The auto-detected provider at `:1234` is LM Studio, not a generic "Local LLM". Ollama uses port `11434`, not `1234`. We're conflating two different tools under one wrong name.
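The port-to-provider mapping described above could be sketched like this (a hypothetical helper, not the project's actual detection code; only the ports and names come from this issue):

```typescript
// Hypothetical sketch: name a detected local endpoint by its port.
// LM Studio's OpenAI-compatible server defaults to 1234, Ollama's API to 11434;
// ComfyUI is probed via /system_stats in the existing detection.
type DetectedProvider = "LM Studio" | "Ollama" | "ComfyUI" | null;

function providerForPort(port: number): DetectedProvider {
  switch (port) {
    case 1234:
      return "LM Studio";
    case 11434:
      return "Ollama";
    case 8000:
      return "ComfyUI";
    default:
      return null; // unknown port: don't guess a name
  }
}
```

Keying the display name off the port (instead of a hardcoded "Local LLM") is what fixes the conflation.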
2. Auto-detect creates incomplete accounts
Both TUI and Web onboarding auto-detect create a `ProviderAccount` with no `models` array. This causes:
- `/provider` shows "no models available" for auto-configured accounts
- The state migration in `state-manager.ts` silently patches this on the next load, but only after a restart
3. (default) sentinel is unfriendly in TUI
The sentinel value `(default)` means "omit the model field and let the server use its current default". It works fine in the Web, but in TUI YAML fields, typing `extraction_model: Local LLM:(default)` is awkward. Rename it to something typeable: `default` (no parens) is the leading candidate.
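The sentinel check could be as simple as the following sketch. `buildResolvedModel` in `llm-client.ts` is named in this issue, but its real signature and body are not shown here; this version and the `account:model` spec format it parses are assumptions:

```typescript
// Hypothetical sentinel handling for a "Account:model" spec string.
// "default" (the proposed rename of "(default)") means: omit the model
// field entirely and let the server pick its current default.
function buildResolvedModel(spec: string): { model?: string } {
  const idx = spec.indexOf(":");
  const modelName = idx === -1 ? "" : spec.slice(idx + 1);
  if (modelName === "" || modelName === "default") {
    return {}; // no model field in the request
  }
  return { model: modelName };
}
```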
Migration required: existing accounts already have `"(default)"` in their data (model names, `default_model` refs). Any rename needs a state migration pass.
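That migration pass might be sketched as below, assuming the rename lands on `default`. The types and function are illustrative; the real migration would live alongside the existing ones in `state-manager.ts`:

```typescript
// Assumed shapes for illustration.
interface ModelEntry {
  guid: string;
  name: string;
}
interface ProviderAccount {
  name: string;
  models: ModelEntry[];
  default_model: string;
}

// Rename "(default)" model entries to "default" and fix up any
// default_model ref that still stores the old display name rather
// than a GUID.
function migrateDefaultSentinel(acct: ProviderAccount): ProviderAccount {
  const models = acct.models.map((m) =>
    m.name === "(default)" ? { ...m, name: "default" } : m
  );
  const default_model =
    acct.default_model === "(default)" ? "default" : acct.default_model;
  return { ...acct, models, default_model };
}
```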
Acceptance Criteria
- Auto-detect names providers by port: `:1234` → LM Studio, `:11434` → Ollama, `:8000` with `/system_stats` → ComfyUI (image provider, existing detection)
- Auto-detected accounts are created complete: a `(default)`/`default` model entry, with `default_model` set to its GUID (not the bare account name)
- Rename `(default)` → `default` everywhere: the `buildResolvedModel` sentinel check in `llm-client.ts`, `yaml-provider.ts`, and `ProviderEditor.tsx` (web)
- State migration: `name: "(default)"` → `name: "default"`, and update any `default_model` refs that pointed to the old display name
- Ollama detection uses its native endpoints (`:11434/api/tags` or `:11434/v1/models`)
- `/provider` shows the default model correctly for freshly auto-detected accounts
Out of Scope
The `resolveTokenLimit` bug (a bare account-name spec not finding `token_limit`) is being fixed in a separate branch alongside this issue's creation.