6 changes: 6 additions & 0 deletions .env.example
@@ -14,6 +14,12 @@ LLM_MODEL_NAME=claude-sonnet-4-20250514
# LLM_BASE_URL=https://api.openai.com/v1
# LLM_MODEL_NAME=gpt-4o-mini

# For local Nexa (OpenAI-compatible):
# LLM_PROVIDER=nexa
# LLM_BASE_URL=http://127.0.0.1:11434/v1
# LLM_MODEL_NAME=NexaAI/Llama3.2-3B-NPU-Turbo
# NEXA_API_KEY=nexa
Comment on lines +17 to +21

Copilot AI Apr 2, 2026

The Nexa example relies on LLM_PROVIDER=nexa for the "no real key required" behavior. If LLM_PROVIDER is left blank for auto-detection, client initialization still errors before provider detection when no API key is set, so this example should note explicitly that LLM_PROVIDER must be set to nexa for keyless local usage.

Copilot uses AI. Check for mistakes.

# For OpenAI-compatible providers (e.g., OpenRouter):
# LLM_API_KEY=sk-or-your-key
# LLM_BASE_URL=https://openrouter.ai/api/v1
2 changes: 2 additions & 0 deletions README.md
@@ -4,6 +4,8 @@ A swarm intelligence prediction engine. Upload documents describing any scenario

**Live:** [synth.scty.org](https://synth.scty.org)

Modified by Viswa.

> Fork of [666ghj/MiroFish](https://github.com/666ghj/MiroFish) — fully translated to English, local graph storage with embedded KuzuDB by default, Claude/Codex CLI support added.

Comment on lines +7 to 10

Copilot AI Apr 2, 2026

The added README line "Modified by Viswa." doesn't describe functionality and can quickly go stale as more contributors modify the repo. Consider replacing it with a short, purpose-focused note (e.g., mentioning Nexa/local provider support) or removing it entirely.

Suggested change:
- Modified by Viswa.
  > Fork of [666ghj/MiroFish](https://github.com/666ghj/MiroFish) — fully translated to English, local graph storage with embedded KuzuDB by default, Claude/Codex CLI support added.

## What it does
10 changes: 7 additions & 3 deletions backend/app/config.py
@@ -63,6 +63,9 @@ def _get_llm_api_key() -> str:
return explicit

provider = (os.environ.get('LLM_PROVIDER', '') or '').strip().lower()
if provider == 'nexa':
# Nexa local server does not require a real key, but OpenAI SDK needs a string
return os.environ.get('NEXA_API_KEY', '') or 'nexa'
if provider == 'anthropic':
return os.environ.get('ANTHROPIC_API_KEY', '')
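The keyless fallback added above can be exercised in isolation. A minimal sketch of the patched lookup chain (the function name here is illustrative, not the repo's exact code, and the visible hunk is simplified):

```python
import os

def resolve_llm_api_key() -> str:
    """Simplified sketch of the patched key lookup: an explicit key wins,
    then the nexa provider falls back to a dummy token."""
    explicit = os.environ.get('LLM_API_KEY', '')
    if explicit:
        return explicit
    provider = (os.environ.get('LLM_PROVIDER', '') or '').strip().lower()
    if provider == 'nexa':
        # Nexa's local server ignores the key, but the OpenAI SDK
        # requires a non-empty string, hence the 'nexa' placeholder.
        return os.environ.get('NEXA_API_KEY', '') or 'nexa'
    if provider == 'anthropic':
        return os.environ.get('ANTHROPIC_API_KEY', '')
    return ''

os.environ.pop('LLM_API_KEY', None)
os.environ.pop('NEXA_API_KEY', None)
os.environ['LLM_PROVIDER'] = 'nexa'
print(resolve_llm_api_key())  # prints "nexa"
```

This is why the review comment above matters: the dummy key only materializes when the provider string resolves to `nexa`.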

@@ -89,7 +92,8 @@ class Config:
LLM_API_KEY = _get_llm_api_key()
LLM_BASE_URL = _get_env_or_default('LLM_BASE_URL', 'https://api.openai.com/v1')
LLM_MODEL_NAME = _get_env_or_default('LLM_MODEL_NAME', 'gpt-4o-mini')
LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '') # 'openai', 'anthropic', 'claude-cli', 'codex-cli'
# Providers: openai | anthropic | claude-cli | codex-cli | nexa (OpenAI-compatible local server)
LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')
Copilot AI Apr 2, 2026

Config.LLM_PROVIDER is taken directly from the environment without .strip().lower(), but later validation compares it to lowercase literals. If a user sets LLM_PROVIDER=NEXA (or includes whitespace), validate() will incorrectly treat it as an unknown provider. Normalize LLM_PROVIDER on assignment (or normalize inside validate()).

Suggested change:
- LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')
+ LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '').strip().lower()

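The failure mode the comment describes is easy to reproduce. A standalone sketch (not the repo's code) of the comparison before and after normalization:

```python
def normalize_provider(raw: str) -> str:
    # " NEXA\n" and "Nexa" both collapse to the canonical "nexa"
    return (raw or "").strip().lower()

KEYLESS = ("claude-cli", "codex-cli", "nexa")

raw = " NEXA\n"                        # what a user might put in .env
print(raw in KEYLESS)                  # False: exact match fails
print(normalize_provider(raw) in KEYLESS)  # True: normalized value matches
```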

# Graph storage config
GRAPH_BACKEND = os.environ.get("GRAPH_BACKEND", "kuzu").lower()
@@ -129,8 +133,8 @@ class Config:
def validate(cls):
"""Validate required configuration."""
errors = []
if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli") and not cls.LLM_API_KEY:
errors.append("LLM_API_KEY not configured (set LLM_PROVIDER=claude-cli or codex-cli to use CLI instead)")
if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:
Copilot AI Apr 2, 2026

validate() now exempts nexa, but it relies on exact string matches against cls.LLM_PROVIDER. Since provider normalization currently happens at some call sites but not here, this can produce false validation errors. Normalize cls.LLM_PROVIDER (e.g., strip().lower()) before checking membership.

Suggested change:
- if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:
+ provider = (cls.LLM_PROVIDER or "").strip().lower()
+ if provider not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:

errors.append("LLM_API_KEY not configured (set LLM_PROVIDER=claude-cli/codex-cli, or use nexa/local provider without a key)")
if cls.GRAPH_BACKEND not in {"kuzu", "json"}:
errors.append("GRAPH_BACKEND must be either 'kuzu' or 'json'")
return errors
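Folding the suggested normalization into validate() would look roughly like this. A sketch under the assumption that only the provider check changes; the error strings are abbreviated and the free function stands in for the classmethod:

```python
KEYLESS_PROVIDERS = ("claude-cli", "codex-cli", "nexa")

def validate_llm(provider: str, api_key: str, graph_backend: str) -> list:
    """Sketch of validate() with the provider normalized before the membership test."""
    errors = []
    normalized = (provider or "").strip().lower()
    if normalized not in KEYLESS_PROVIDERS and not api_key:
        errors.append("LLM_API_KEY not configured")
    if graph_backend not in {"kuzu", "json"}:
        errors.append("GRAPH_BACKEND must be either 'kuzu' or 'json'")
    return errors

print(validate_llm("NEXA ", "", "kuzu"))  # [] once normalization is in place
```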
2 changes: 2 additions & 0 deletions backend/app/utils/llm_client.py
@@ -63,6 +63,8 @@ def _detect_provider(self) -> str:
model_lower = (self.model or "").lower()
base_lower = (self.base_url or "").lower()

if "nexa" in model_lower or "nexa" in base_lower or "11434" in base_lower:
return "nexa"
if any(k in model_lower for k in ["claude", "anthropic"]):
return "anthropic"
if "anthropic" in base_lower:
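The detection heuristic added above can be sketched end to end (simplified; the real method continues with more branches below the shown hunk, so the final fallback here is an assumption). Note that 11434 is also Ollama's default port, so any OpenAI-compatible server on that port will be classified as nexa:

```python
def detect_provider(model: str, base_url: str) -> str:
    """Simplified sketch of _detect_provider with the new nexa branch checked first."""
    model_lower = (model or "").lower()
    base_lower = (base_url or "").lower()
    if "nexa" in model_lower or "nexa" in base_lower or "11434" in base_lower:
        return "nexa"
    if any(k in model_lower for k in ("claude", "anthropic")) or "anthropic" in base_lower:
        return "anthropic"
    return "openai"  # assumption: stand-in for the remaining branches

print(detect_provider("NexaAI/Llama3.2-3B-NPU-Turbo", "http://127.0.0.1:11434/v1"))  # nexa
```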
20 changes: 20 additions & 0 deletions docker-compose.nexa.yml
@@ -0,0 +1,20 @@
services:
mirofish:
Copilot AI Apr 2, 2026

docker-compose.nexa.yml defines services.mirofish without an image: or build: key. Running docker compose -f docker-compose.nexa.yml up will fail unless this file is only intended as an override; add build: . or image:, or document the expected multi-file compose usage in the repo docs.

Suggested change:
  mirofish:
+   build: .

depends_on: {}
networks:
- default
environment:
LLM_PROVIDER: openai
LLM_BASE_URL: http://host.docker.internal:11434/v1
LLM_API_KEY: nexa
Comment on lines +7 to +9

Copilot AI Apr 2, 2026

This Nexa compose config sets LLM_PROVIDER: openai, which bypasses the new nexa provider path and makes the file's name and intent confusing. Consider setting LLM_PROVIDER: nexa (and optionally using NEXA_API_KEY instead of LLM_API_KEY) to align with the new provider support.

Suggested change:
- LLM_PROVIDER: openai
+ LLM_PROVIDER: nexa
  LLM_BASE_URL: http://host.docker.internal:11434/v1
- LLM_API_KEY: nexa
+ NEXA_API_KEY: nexa

Comment on lines +7 to +9

Copilot AI Apr 2, 2026

host.docker.internal is not resolvable on many Linux Docker setups by default. If this compose file is meant to be portable, add an extra_hosts: ["host.docker.internal:host-gateway"] mapping or document that it requires Docker Desktop / host.docker.internal support.

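On Linux engines without Docker Desktop, the mapping this comment suggests would be a small compose addition. A sketch against this file's service name (the `host-gateway` value requires Docker 20.10+):

```yaml
services:
  mirofish:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # resolves the name to the host's gateway IP on Linux
```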
LLM_MODEL_NAME: NexaAI/Llama3.2-3B-NPU-Turbo
ports:
- "5001:5001"
volumes:
- ./backend/uploads:/app/backend/uploads
- ./backend/data:/app/backend/data
labels: {}

networks:
default:
name: mirofish-nexa