6 changes: 6 additions & 0 deletions .env.example
@@ -14,6 +14,12 @@ LLM_MODEL_NAME=claude-sonnet-4-20250514
# LLM_BASE_URL=https://api.openai.com/v1
# LLM_MODEL_NAME=gpt-4o-mini

# For local Nexa (OpenAI-compatible):
# LLM_PROVIDER=nexa
# LLM_BASE_URL=http://127.0.0.1:11434/v1
# LLM_MODEL_NAME=NexaAI/Llama3.2-3B-NPU-Turbo
# NEXA_API_KEY=nexa

# For OpenAI-compatible providers (e.g., OpenRouter):
# LLM_API_KEY=sk-or-your-key
# LLM_BASE_URL=https://openrouter.ai/api/v1
2 changes: 2 additions & 0 deletions README.md
@@ -4,6 +4,8 @@ A swarm intelligence prediction engine. Upload documents describing any scenario

**Live:** [synth.scty.org](https://synth.scty.org)

Modified by Viswa.

> Fork of [666ghj/MiroFish](https://github.com/666ghj/MiroFish) — fully translated to English, local graph storage with embedded KuzuDB by default, Claude/Codex CLI support added.

## What it does
10 changes: 7 additions & 3 deletions backend/app/config.py
@@ -63,6 +63,9 @@ def _get_llm_api_key() -> str:
        return explicit

    provider = (os.environ.get('LLM_PROVIDER', '') or '').strip().lower()
    if provider == 'nexa':
        # Nexa local server does not require a real key, but OpenAI SDK needs a string
        return os.environ.get('NEXA_API_KEY', '') or 'nexa'
    if provider == 'anthropic':
        return os.environ.get('ANTHROPIC_API_KEY', '')

@@ -89,7 +92,8 @@ class Config:
    LLM_API_KEY = _get_llm_api_key()
    LLM_BASE_URL = _get_env_or_default('LLM_BASE_URL', 'https://api.openai.com/v1')
    LLM_MODEL_NAME = _get_env_or_default('LLM_MODEL_NAME', 'gpt-4o-mini')
-    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')  # 'openai', 'anthropic', 'claude-cli', 'codex-cli'
+    # Providers: openai | anthropic | claude-cli | codex-cli | nexa (OpenAI-compatible local server)
+    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')

    # Graph storage config
    GRAPH_BACKEND = os.environ.get("GRAPH_BACKEND", "kuzu").lower()
@@ -129,8 +133,8 @@ class Config:
    def validate(cls):
        """Validate required configuration."""
        errors = []
-        if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli") and not cls.LLM_API_KEY:
-            errors.append("LLM_API_KEY not configured (set LLM_PROVIDER=claude-cli or codex-cli to use CLI instead)")
+        if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:
+            errors.append("LLM_API_KEY not configured (set LLM_PROVIDER=claude-cli/codex-cli, or use nexa/local provider without a key)")
Comment on lines +136 to +137
Copilot AI · Apr 2, 2026

Config.validate() compares cls.LLM_PROVIDER without normalizing case. If a user sets LLM_PROVIDER=NEXA (or other mixed-case), validation won’t treat it as the exempt nexa provider and may produce a misleading “LLM_API_KEY not configured” error. Consider lowercasing LLM_PROVIDER when reading it (or in validate()) to match how LLMClient normalizes providers.

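A minimal sketch of this suggestion: normalize the provider once at read time, mirroring the lowercasing that _get_llm_api_key() already does (a sketch only, not verified against the rest of the codebase):

    # config.py: read LLM_PROVIDER lowercased so validate() and
    # LLMClient agree on provider names regardless of env-var case.
    LLM_PROVIDER = (os.environ.get('LLM_PROVIDER', '') or '').strip().lower()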
        if cls.GRAPH_BACKEND not in {"kuzu", "json"}:
            errors.append("GRAPH_BACKEND must be either 'kuzu' or 'json'")
        return errors
2 changes: 2 additions & 0 deletions backend/app/utils/llm_client.py
@@ -63,6 +63,8 @@ def _detect_provider(self) -> str:
        model_lower = (self.model or "").lower()
        base_lower = (self.base_url or "").lower()

if "nexa" in model_lower or "nexa" in base_lower or "11434" in base_lower:
return "nexa"
Comment on lines +66 to +67
Copilot AI · Apr 2, 2026

This Nexa auto-detection is currently ineffective for the “provider left blank” flow: LLMClient.__init__ raises on missing api_key before calling _detect_provider(). If Nexa is intended to work without an API key and via auto-detection, move provider detection before the API-key check and/or treat nexa as a no-key provider (similar to the CLI providers).

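A hedged sketch of the reordering this comment describes; the __init__ shape below is an assumption, since the excerpt does not include that code:

    # llm_client.py (assumed shape): detect the provider before
    # enforcing the API key, so no-key providers pass without one.
    def __init__(self, api_key, base_url, model, provider=None):
        self.api_key = api_key
        self.base_url = base_url
        self.model = model
        # Detection must run first, or a blank provider pointing at a
        # local Nexa server is rejected for having no key.
        self.provider = (provider or "").strip().lower() or self._detect_provider()
        if self.provider not in ("nexa", "claude-cli", "codex-cli") and not self.api_key:
            raise ValueError("api_key is required for provider: " + self.provider)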
        if any(k in model_lower for k in ["claude", "anthropic"]):
            return "anthropic"
        if "anthropic" in base_lower:
20 changes: 20 additions & 0 deletions docker-compose.nexa.yml
@@ -0,0 +1,20 @@
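# NOTE (assumed usage, implementing the review suggestion below): this
# file defines no image/build for mirofish, so it only works as an
# override applied on top of the base compose file:
#   docker compose -f docker-compose.yml -f docker-compose.nexa.yml up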
services:
  mirofish:
    depends_on: {}
    networks:
      - default
    environment:
Comment on lines +1 to +6
Copilot AI · Apr 2, 2026

mirofish service has no build: or image:. Running docker compose -f docker-compose.nexa.yml up will fail because Compose can’t create the service without an image. If this file is intended to be an override (used with -f docker-compose.yml -f docker-compose.nexa.yml), add an explicit header comment documenting that usage; otherwise add the missing build/image fields here.

      LLM_PROVIDER: openai
      LLM_BASE_URL: http://host.docker.internal:11434/v1
      LLM_API_KEY: nexa
Comment on lines +7 to +9
Copilot AI · Apr 2, 2026

The Nexa compose file sets LLM_PROVIDER: openai. This bypasses the new nexa provider behaviors (e.g., key handling/validation) and is inconsistent with .env.example’s Nexa instructions. Consider setting LLM_PROVIDER: nexa here (and optionally dropping LLM_API_KEY in favor of NEXA_API_KEY if you want the “no real key required” flow).

Suggested change:
-      LLM_PROVIDER: openai
-      LLM_BASE_URL: http://host.docker.internal:11434/v1
-      LLM_API_KEY: nexa
+      LLM_PROVIDER: nexa
+      LLM_BASE_URL: http://host.docker.internal:11434/v1
+      NEXA_API_KEY: nexa

Comment on lines +7 to +9
Copilot AI · Apr 2, 2026

LLM_BASE_URL uses host.docker.internal, which isn’t available by default on many Linux Docker setups. If this compose config is meant to be cross-platform, add an extra_hosts: ["host.docker.internal:host-gateway"] mapping (or document the platform requirement / use a different host reference).

      LLM_MODEL_NAME: NexaAI/Llama3.2-3B-NPU-Turbo
    ports:
      - "5001:5001"
    volumes:
      - ./backend/uploads:/app/backend/uploads
      - ./backend/data:/app/backend/data
    labels: {}
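    # Linux note (suggested addition per the review above, not in this
    # PR): host.docker.internal is not available by default on Linux;
    # uncomment to map it to the host gateway.
    # extra_hosts:
    #   - "host.docker.internal:host-gateway"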

networks:
  default:
    name: mirofish-nexa