Viswa modified #20
base: main
```diff
@@ -4,6 +4,8 @@ A swarm intelligence prediction engine. Upload documents describing any scenario
 **Live:** [synth.scty.org](https://synth.scty.org)
 
+Modified by Viswa.
+
 > Fork of [666ghj/MiroFish](https://github.com/666ghj/MiroFish) — fully translated to English, local graph storage with embedded KuzuDB by default, Claude/Codex CLI support added.
```
|
**Comment on lines +7 to +10** — suggested change (drops the "Modified by Viswa." line):

```diff
-Modified by Viswa.
-
 > Fork of [666ghj/MiroFish](https://github.com/666ghj/MiroFish) — fully translated to English, local graph storage with embedded KuzuDB by default, Claude/Codex CLI support added.
```
```diff
@@ -63,6 +63,9 @@ def _get_llm_api_key() -> str:
         return explicit
 
     provider = (os.environ.get('LLM_PROVIDER', '') or '').strip().lower()
+    if provider == 'nexa':
+        # Nexa local server does not require a real key, but OpenAI SDK needs a string
+        return os.environ.get('NEXA_API_KEY', '') or 'nexa'
     if provider == 'anthropic':
         return os.environ.get('ANTHROPIC_API_KEY', '')
```
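The key-resolution order in this hunk can be exercised in isolation. A minimal sketch, assuming only the behavior visible in the diff (the helper name and the dict-based environment are hypothetical stand-ins for `_get_llm_api_key` and `os.environ`):

```python
def resolve_llm_api_key(env: dict) -> str:
    """Hypothetical stand-in for _get_llm_api_key, driven by a plain dict
    instead of os.environ. Mirrors the diff: an explicit LLM_API_KEY wins,
    then provider-specific fallbacks apply."""
    explicit = (env.get('LLM_API_KEY', '') or '').strip()
    if explicit:
        return explicit
    provider = (env.get('LLM_PROVIDER', '') or '').strip().lower()
    if provider == 'nexa':
        # Nexa's local server needs no real key, but the OpenAI SDK
        # insists on a non-empty string, hence the 'nexa' placeholder.
        return env.get('NEXA_API_KEY', '') or 'nexa'
    if provider == 'anthropic':
        return env.get('ANTHROPIC_API_KEY', '')
    return ''
```

Note that the `nexa` branch falls back to the literal string `'nexa'` when `NEXA_API_KEY` is unset, so keyless local usage works out of the box.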
|
|
```diff
@@ -89,7 +92,8 @@ class Config:
     LLM_API_KEY = _get_llm_api_key()
     LLM_BASE_URL = _get_env_or_default('LLM_BASE_URL', 'https://api.openai.com/v1')
     LLM_MODEL_NAME = _get_env_or_default('LLM_MODEL_NAME', 'gpt-4o-mini')
-    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')  # 'openai', 'anthropic', 'claude-cli', 'codex-cli'
+    # Providers: openai | anthropic | claude-cli | codex-cli | nexa (OpenAI-compatible local server)
+    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')
```

Suggested change:

```diff
-    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '')
+    LLM_PROVIDER = os.environ.get('LLM_PROVIDER', '').strip().lower()
```
**Copilot (AI) · Apr 2, 2026**

> `validate()` now exempts `nexa`, but it relies on exact string matches against `cls.LLM_PROVIDER`. Since provider normalization currently happens in some call sites but not here, this can produce false validation errors. Normalize `cls.LLM_PROVIDER` (e.g., `strip().lower()`) before checking membership.

Suggested change:

```diff
-        if cls.LLM_PROVIDER not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:
+        provider = (cls.LLM_PROVIDER or "").strip().lower()
+        if provider not in ("claude-cli", "codex-cli", "nexa") and not cls.LLM_API_KEY:
```
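The normalization the review asks for is easy to check in isolation. A sketch with a hypothetical free function standing in for the membership check inside `validate()`:

```python
def missing_api_key(llm_provider: str, llm_api_key: str) -> bool:
    """True when validation should fail: a key is required but absent.
    CLI and local providers (claude-cli, codex-cli, nexa) are exempt,
    and the provider string is normalized before the membership check."""
    provider = (llm_provider or "").strip().lower()
    return provider not in ("claude-cli", "codex-cli", "nexa") and not llm_api_key
```

Without the `strip().lower()`, a user who sets `LLM_PROVIDER=Nexa` (or leaves trailing whitespace) would hit a false "missing API key" error even though the provider is exempt.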
New file:

```diff
@@ -0,0 +1,20 @@
+services:
+  mirofish:
```

Suggested change:

```diff
   mirofish:
+    build: .
```
**Copilot (AI) · Apr 2, 2026**

> This Nexa compose config sets `LLM_PROVIDER: openai`, which bypasses the new `nexa` provider path and makes the file name/config intent confusing. Consider setting `LLM_PROVIDER: nexa` (and optionally using `NEXA_API_KEY` instead of `LLM_API_KEY`) to align with the new provider support.

Suggested change:

```diff
-      LLM_PROVIDER: openai
+      LLM_PROVIDER: nexa
       LLM_BASE_URL: http://host.docker.internal:11434/v1
-      LLM_API_KEY: nexa
+      NEXA_API_KEY: nexa
```
**Copilot (AI) · Apr 2, 2026**

> `host.docker.internal` is not resolvable on many Linux Docker setups by default. If this compose file is meant to be portable, add an `extra_hosts: ["host.docker.internal:host-gateway"]` mapping or document that it requires Docker Desktop / `host.docker.internal` support.
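On Linux engines, that mapping would look roughly like this (a sketch; only the `mirofish` service name is taken from the file under review):

```yaml
services:
  mirofish:
    # Lets host.docker.internal resolve to the host gateway on Linux engines,
    # matching the default behavior of Docker Desktop on macOS/Windows.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```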
> The Nexa example relies on `LLM_PROVIDER=nexa` to get the "no real key required" behavior. If `LLM_PROVIDER` is left blank for auto-detection, the current client initialization still errors before provider detection when no API key is set, so this example likely needs an explicit note that `LLM_PROVIDER` must be set to `nexa` for keyless local usage.