Security fixes are provided for the current `main` branch and the latest tagged release.
Please do not open public issues for suspected vulnerabilities.
Use one of these channels:
- Open a private GitHub Security Advisory draft for this repository.
- If that is unavailable, open a minimal issue asking maintainers for a private contact path (without exploit details).
Include:
- affected commit/tag
- impact summary
- reproduction steps
- suggested mitigation (if known)
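For example, a report skeleton covering those points might look like this (all values illustrative):

```text
Affected: <tag or commit SHA>
Impact: <what an attacker can do, and any preconditions>
Reproduction: <minimal ordered steps>
Mitigation: <workaround or patch idea, if known>
```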
Secret handling:
- keep credentials out of `config.toml`; prefer environment variables or a gitignored `.env` file
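A minimal sketch of the environment-variable approach; the variable name is hypothetical, so adapt it to your deployment:

```shell
# Hypothetical variable name -- a gitignored .env file would carry
# the same assignment instead of config.toml holding the secret.
export EMBERLOOM_LLM_API_KEY="example-key"  # never commit real values

# Confirm the secret is present without printing it
echo "key loaded: ${EMBERLOOM_LLM_API_KEY:+yes}"
```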
Emberloom supports an explicit local runtime profile:
- set `[runtime].profile = "local_only"`
- set `[llm].provider = "ollama"` with a loopback URL (`localhost`, `127.0.0.1`, or `::1`)
- disable outbound integrations (`langfuse`, ticket intake sources/webhook)
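Assuming conventional key names (only `profile` and `provider` are confirmed above; the rest are assumptions), a `config.toml` for this profile might look like:

```toml
# Sketch only -- key names beyond profile/provider are assumptions.
[runtime]
profile = "local_only"

[llm]
provider = "ollama"
base_url = "http://127.0.0.1:11434"  # loopback only; assumed key name

[langfuse]
enabled = false  # assumed toggle for the outbound integration
```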
Use `sparks doctor` to verify readiness and detect drift:
- `sparks doctor --skip-llm --ci --fail-on-warn` for config/invariant checks
- `sparks doctor --ci --fail-on-warn` for live local Ollama reachability
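The loopback requirement can be checked mechanically. A minimal sketch (not Emberloom's actual implementation) of the kind of invariant such a config check enforces:

```python
from urllib.parse import urlparse

LOOPBACK_HOSTS = {"localhost", "127.0.0.1", "::1"}

def is_loopback_url(url: str) -> bool:
    """True if the URL's host is textually a loopback address."""
    # urlparse lowercases the host and unwraps bracketed IPv6 ([::1] -> ::1)
    return urlparse(url).hostname in LOOPBACK_HOSTS

print(is_loopback_url("http://127.0.0.1:11434"))   # True
print(is_loopback_url("http://ollama.corp:11434")) # False
```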
Reference guide: `docs/local-only-deployment.md`
- `local_only` constrains Emberloom runtime configuration and integrations.
- Spark execution is container-hardened by default (`network_mode=none`, dropped capabilities, read-only rootfs).
- Host-level outbound controls (firewall/egress policy) remain operator responsibility.
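For reference, the standard docker flags corresponding to those defaults look like the following (image name is illustrative; the command is printed here rather than executed):

```shell
# --network none : no traffic in or out (network_mode=none)
# --cap-drop ALL : drop all Linux capabilities
# --read-only    : read-only root filesystem
hardened="docker run --rm --network none --cap-drop ALL --read-only emberloom/spark:demo"
echo "$hardened"
```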