Add Nimbalyst stack, chatmock-shim docs, and OpenHands BYOK config #6

devin-ai-integration[bot] wants to merge 1 commit into main
Conversation
- stacks/nimbalyst/ — Dockerfile + compose for web-desktop Codex+Claude-Code
routed to chatmock shim (Cursor-Cloud-Agents replacement at $0/request).
- services/chatmock-shim/ — documentation of the two custom patches applied
to upstream chatmock (model_registry aliases + responses_api param strip).
- services/openhands-byok/ — BYOK + MCP (E2B/Firecrawl/Tavily) configuration
for OpenHands Cloud pointed at the chatmock shim.
All secrets scrubbed and replaced with ${VAR} placeholders.
```dockerfile
    'export OPENAI_BASE_URL=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_BASE=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}' \
    > /etc/profile.d/garza-llm.sh && chmod 0644 /etc/profile.d/garza-llm.sh
```
CRITICAL: The printf arguments are single-quoted, so the literal text export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY} is written into /etc/profile.d/garza-llm.sh at build time. Expansion happens later, whenever a login shell sources the file; at that point CUSTOM_LLM_API_KEY is undefined inside the container (docker-compose.yml injects only OPENAI_API_KEY), so the reference resolves to an empty string and overwrites whatever the container runtime injected via the environment: block, clearing the API key for every terminal session.

Fix: make the script reference a variable that actually exists in the container:

```dockerfile
RUN printf '%s\n' \
    'export OPENAI_BASE_URL=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_BASE=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_KEY="${OPENAI_API_KEY}"' \
    > /etc/profile.d/garza-llm.sh && chmod 0644 /etc/profile.d/garza-llm.sh
```

(Re-exporting OPENAI_API_KEY is a no-op when the runtime has already set it and can never clobber it with an empty value. Alternatively, pass CUSTOM_LLM_API_KEY through as a container env var in docker-compose.yml so the original reference resolves.)
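The quoting behavior can be reproduced outside Docker. This is a minimal sketch in plain POSIX shell, where a temp file stands in for /etc/profile.d/garza-llm.sh:

```shell
#!/bin/sh
# "Build" step: single quotes keep the ${CUSTOM_LLM_API_KEY} reference literal.
profile=$(mktemp)
printf '%s\n' 'export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}' > "$profile"
cat "$profile"                  # prints: export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}

# "Runtime": the container only has OPENAI_API_KEY; CUSTOM_LLM_API_KEY is unset.
export OPENAI_API_KEY='sk-real-key'
unset CUSTOM_LLM_API_KEY
. "$profile"                    # sourcing the profile script...
echo "after: [$OPENAI_API_KEY]" # prints: after: [] -- key clobbered to empty
rm -f "$profile"
```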
```dockerfile
    'echo === A3: ENV ===' \
    'echo OPENAI_BASE_URL=$OPENAI_BASE_URL' \
    'echo OPENAI_API_KEY=${OPENAI_API_KEY:0:15}...' \
    'echo' \
```
CRITICAL: Related to the issue on Dockerfile line 51, though the mechanism is the same single-quoting, not build-time expansion: ${OPENAI_API_KEY:0:15} is written into gtest literally and expands only when gtest runs. The problem is that gtest explicitly sources /etc/profile.d/garza-llm.sh first, and that script clobbers OPENAI_API_KEY with an empty string (previous issue), so gtest will always print OPENAI_API_KEY=... with an empty prefix regardless of what key was injected at runtime.

Fix: repair the profile script as described above, or use a COPY-based script file instead of inline printf to avoid the quoting complexity entirely.
```dockerfile
    'model = "gpt-5.4"' \
    'approval_policy = "never"' \
    'sandbox_mode = "danger-full-access"' \
    '' \
```
WARNING: sandbox_mode = "danger-full-access" is baked into the default config.toml that gets seeded for every new user. Combined with approval_policy = "never" on the same file (line 63), Codex will silently execute any shell command with no sandboxing and no human confirmation prompt.
Anyone who can reach the web-desktop at https://nimbalyst.garzaos.cloud (basic-auth is the only gate) can issue a single Codex task that exfiltrates /config — including auth.json — or destroys the volume. Consider using approval_policy = "unless-allow-listed" and sandbox_mode = "workspace-write" as safer defaults, or at minimum document the risk prominently.
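A safer seed along these lines might look like the following (a sketch; it assumes the same config.toml keys the Dockerfile already writes, with the two suggested values substituted in):

```toml
# Safer defaults for the seeded per-user config.toml: keep sandboxing on and
# require confirmation for commands outside the allow-list.
model = "gpt-5.4"
approval_policy = "unless-allow-listed"
sandbox_mode = "workspace-write"
```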
```yaml
    restart: unless-stopped
    security_opt:
      - seccomp:unconfined
    environment:
```
WARNING: seccomp:unconfined disables all syscall filtering for this container, significantly expanding the kernel attack surface. The AppImage extraction is done at build time (--appimage-extract in the RUN step), so FUSE is not needed at runtime. Consider removing seccomp:unconfined or replacing it with a targeted seccomp profile that only allows the specific syscalls the Electron/KasmVNC stack actually requires.
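If removing the option outright breaks something, Compose can point security_opt at a local profile file instead. A sketch (the service name and the seccomp-nimbalyst.json file are hypothetical; a practical starting point is Docker's default seccomp profile, trimmed down to what the stack needs):

```yaml
services:
  nimbalyst:
    restart: unless-stopped
    security_opt:
      # Explicit allow-list profile instead of seccomp:unconfined.
      - seccomp:./seccomp-nimbalyst.json
```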
```dockerfile
RUN mkdir -p /opt/nimbalyst && cd /opt/nimbalyst \
 && curl -fL -o Nimbalyst.AppImage \
    "https://github.com/Nimbalyst/nimbalyst/releases/latest/download/Nimbalyst-Linux.AppImage" \
 && chmod +x Nimbalyst.AppImage \
```
WARNING: The Nimbalyst AppImage is fetched from a latest download URL with no version pin and no checksum verification. A breaking upstream release or a supply-chain compromise would be silently picked up on every image rebuild.
Consider pinning to a specific release tag (e.g. .../releases/download/v1.2.3/Nimbalyst-Linux.AppImage) and adding a SHA-256 verification step:
```dockerfile
 && echo "<expected-sha256>  Nimbalyst.AppImage" | sha256sum -c -
```

(Note the two spaces between the hash and the filename: GNU sha256sum -c expects the checksum, a two-character separator, then the name.)
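Combining the pin and the checksum, the fetch step might look like this (a sketch; v1.2.3 and the checksum value are placeholders to fill in from the actual release page):

```dockerfile
# Pin the release and verify the artifact before making it executable.
ARG NIMBALYST_VERSION=v1.2.3
ARG NIMBALYST_SHA256=<expected-sha256>
RUN mkdir -p /opt/nimbalyst && cd /opt/nimbalyst \
 && curl -fL -o Nimbalyst.AppImage \
    "https://github.com/Nimbalyst/nimbalyst/releases/download/${NIMBALYST_VERSION}/Nimbalyst-Linux.AppImage" \
 && echo "${NIMBALYST_SHA256}  Nimbalyst.AppImage" | sha256sum -c - \
 && chmod +x Nimbalyst.AppImage
```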
Code Review Summary

Status: 5 issues found | Recommendation: address before merge

Issue details: see the CRITICAL and WARNING inline comments above. Additional observations were made on files outside the diff. Files reviewed: 8.

Reviewed by claude-4.6-sonnet-20260217 · 377,211 tokens
```dockerfile
RUN printf '%s\n' \
    'export OPENAI_BASE_URL=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_BASE=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}' \
```
🔴 Profile script overwrites OPENAI_API_KEY with empty string because CUSTOM_LLM_API_KEY is not a container env var
The baked-in profile script /etc/profile.d/garza-llm.sh (line 50) contains export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}. Because the printf argument is single-quoted, the literal ${CUSTOM_LLM_API_KEY} is written into the file. At runtime, docker-compose.yml:19 correctly sets OPENAI_API_KEY from the host's CUSTOM_LLM_API_KEY, but only OPENAI_API_KEY exists as a container environment variable — CUSTOM_LLM_API_KEY itself is never injected into the container. When any login shell (or non-login shell via stacks/nimbalyst/Dockerfile:54-55 bashrc hook) sources this profile script, ${CUSTOM_LLM_API_KEY} expands to empty, overwriting the correct OPENAI_API_KEY with an empty string. This breaks Codex CLI and every tool that reads OPENAI_API_KEY in interactive shells. The gtest helper (stacks/nimbalyst/Dockerfile:78) also explicitly sources this file, so it would clobber the key too.
Prompt for agents
The profile script at /etc/profile.d/garza-llm.sh references CUSTOM_LLM_API_KEY, but docker-compose.yml only injects OPENAI_API_KEY into the container (mapping the host CUSTOM_LLM_API_KEY to it). When the profile script is sourced in a shell session, CUSTOM_LLM_API_KEY is undefined, so OPENAI_API_KEY gets set to empty.
Two possible fixes:

1. In stacks/nimbalyst/docker-compose.yml, also pass CUSTOM_LLM_API_KEY through as a container env var so the profile script can resolve it:

```yaml
environment:
  CUSTOM_LLM_API_KEY: "${CUSTOM_LLM_API_KEY}"
```

2. In stacks/nimbalyst/Dockerfile line 50, change the profile script to reference the variable that Docker actually sets:

```dockerfile
    'export OPENAI_API_KEY="${OPENAI_API_KEY}"' \
```

This preserves the Docker-injected value when re-exporting in login shells: it is a no-op if the variable is already set, and at least it will not clobber it with empty.

Option 1 is probably safer because it ensures the original secret name is available inside the container for any script that references it.
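Option 2 can be sanity-checked in a plain shell (a sketch; the temp file stands in for the profile script):

```shell
#!/bin/sh
# Option 2's script body: re-export the variable Docker actually sets.
profile=$(mktemp)
printf '%s\n' 'export OPENAI_API_KEY="${OPENAI_API_KEY}"' > "$profile"

export OPENAI_API_KEY='sk-real-key' # simulates the compose-injected value
. "$profile"                        # sourcing is now a harmless re-export
echo "after: [$OPENAI_API_KEY]"     # prints: after: [sk-real-key]
rm -f "$profile"
```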
```dockerfile
RUN printf '%s\n' \
    'export OPENAI_BASE_URL=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_BASE=https://llm.garzaos.cloud/v1' \
    'export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}' \
```
[🟠 High] [🔵 Bug]
/etc/profile.d/garza-llm.sh is written with a literal CUSTOM_LLM_API_KEY placeholder, but the container is only given OPENAI_API_KEY at runtime. When a login shell or gtest sources that script, shell expansion resolves the missing CUSTOM_LLM_API_KEY to empty and clobbers the working key, so the stack's primary codex exec path loses authentication. Use the already-injected OPENAI_API_KEY when generating the profile script, or create the script during container init from the runtime env instead of reassigning from an undefined variable.
```text
# stacks/nimbalyst/Dockerfile
'export OPENAI_API_KEY=${CUSTOM_LLM_API_KEY}' \

# stacks/nimbalyst/docker-compose.yml
OPENAI_API_KEY: "${CUSTOM_LLM_API_KEY}"
```
Summary

Captures three session artifacts as first-class entries in this infra repo so they survive outside chat/VPS-local files:

- `stacks/nimbalyst/` — Dockerfile + docker-compose for the web-desktop Cursor-Cloud-Agents replacement at https://nimbalyst.garzaos.cloud. linuxserver/webtop base + Codex CLI + Claude Code CLI baked in, with `OPENAI_BASE_URL` hoisted to `/etc/profile.d` and Codex `config.toml` seeded via `/custom-cont-init.d`. All 5 end-to-end assertions passed (see `TEST-REPORT.md`).
- `services/chatmock-shim/` — documentation of the two custom patches applied on top of upstream chatmock:
  - `model_registry.py` — aliases `chatgpt-4o` / `gpt-4o` / `chatgpt` onto the upstream `gpt-5.4`, so LiteLLM-based callers can pick a slug that routes to `/v1/chat/completions` (not the reasoning-only `/v1/responses`).
  - `responses_api.py` — strips `temperature` / `top_p` / etc. before forwarding to upstream ChatGPT, which rejects those params on gpt-5* models.
- `services/openhands-byok/` — BYOK + MCP config for OpenHands Cloud pointed at the chatmock shim. Documents the `openai/chatgpt-4o` slug choice (not `openai/gpt-5.4` — that loops on empty output) and the three MCP tools added: E2B (STDIO), Firecrawl (SHTTP), Tavily (SHTTP). All 4 LLM-path assertions passed.

All secrets were scrubbed — API keys and basic-auth passwords were replaced with `${VAR}` placeholders in every file before commit (`grep -rE` clean for `sk-garza-*` / `fc-*` / `tvly-*` / `e2b_*` / raw passwords).

Review & Testing Checklist for Human

- `stacks/nimbalyst/Dockerfile` and `docker-compose.yml` — confirm `${CUSTOM_LLM_API_KEY}` / `${NIMBALYST_BASIC_AUTH_PASSWORD}` are the only remaining placeholders. Do NOT rebuild the image from this committed Dockerfile as-is — inject real values via environment at build/run time on the VPS.
- `services/chatmock-shim/README.md` matches the actual patches currently running on VPS 1589219 at `/opt/chatgpt-proxies/chatmock/` (lines 44-47 of `model_registry.py`, lines 91-95 of `responses_api.py`).
- `services/openhands-byok/README.md` vs. what's live at https://app.all-hands.dev/settings/mcp.

Notes

- `TEST-REPORT.md` — only the LLM path (PONG round-trip) is documented there. A follow-up conversation exercising each of E2B / Firecrawl / Tavily is still pending.
- `services/chatmock-shim/INSTALL-PLAN.md` is the original provisioning walkthrough; it's descriptive (no secrets) but does reference the separate shim VPS IP via `${SHIM_VPS_IP}`.

Link to Devin session: https://app.devin.ai/sessions/68ec8727b8b84f5296095f7bf0155627
Requested by: @itsablabla