Enreign edited this page Mar 13, 2026 · 2 revisions

# OpenAI-Compatible API

Sparks exposes an OpenAI-compatible HTTP API that lets any OpenAI-compatible client (IDE plugins, chat UIs, API wrappers) use Sparks as a drop-in backend.


## Endpoints

| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/v1/models` | List available models (ghosts + `"sparks"`) |
| `POST` | `/v1/chat/completions` | Submit a chat completion request |

## Setup

Enable the server in `config.toml`:

```toml
[openai_api]
enabled             = true
bind                = "127.0.0.1:8787"
api_key_env         = "SPARKS_OPENAI_API_KEY"
principal           = "self"
requests_per_minute = 120
burst               = 30
```

Set the API key:

```sh
export SPARKS_OPENAI_API_KEY=your-secret-key
```

Start Sparks; the API server starts alongside the main process.


## Authentication

The API uses bearer token authentication. Pass your configured API key as:

```
Authorization: Bearer <SPARKS_OPENAI_API_KEY>
```
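As a sketch, a client can build this header from the same environment variable the server reads (names follow the `config.toml` above):

```python
import os

# Read the key from the environment variable named by api_key_env in config.toml.
api_key = os.environ.get("SPARKS_OPENAI_API_KEY", "")

# Standard headers for any request against the API.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```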

## Rate Limiting

| Config key | Default | Description |
|------------|---------|-------------|
| `requests_per_minute` | 120 | Sustained request rate limit |
| `burst` | 30 | Short-burst allowance above the sustained rate |

Requests exceeding the limit receive `429 Too Many Requests`.
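The `requests_per_minute` + `burst` pair describes a classic token-bucket limiter: tokens refill at `requests_per_minute / 60` per second, and `burst` caps how many can accumulate. The real implementation lives in `src/openai_api.rs`; the Python below is only an illustrative sketch of those semantics, not the actual code.

```python
import time

class TokenBucket:
    """Sketch of a token-bucket limiter matching the config semantics."""

    def __init__(self, rpm=120, burst=30):
        self.rate = rpm / 60.0        # tokens added per second
        self.capacity = burst         # maximum accumulated tokens
        self.tokens = float(burst)    # start with a full bucket
        self.last = time.monotonic()

    def allow(self, now=None):
        """Return True if a request may proceed, consuming one token."""
        now = time.monotonic() if now is None else now
        # Refill based on elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # caller would respond with 429
```

A client seeing `429` should back off and retry rather than hammer the endpoint.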


## Model Routing

The `model` parameter in your request maps to Sparks internals:

| Model value | Routes to |
|-------------|-----------|
| `sparks` | Default ghost (classifier picks) |
| `sparks/coder` | The `coder` ghost directly |
| `sparks/scout` | The `scout` ghost directly |
| Any other model ID | Default ghost |
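The routing rule can be sketched as follows (hypothetical Python; the actual logic is in `src/openai_api.rs`, and the ghost set comes from your config):

```python
KNOWN_GHOSTS = {"coder", "scout"}  # assumption: the configured ghost IDs

def route_model(model: str) -> str:
    """Map an incoming model ID to a ghost, per the table above."""
    if model.startswith("sparks/"):
        ghost = model.split("/", 1)[1]
        if ghost in KNOWN_GHOSTS:
            return ghost
    # "sparks" and any unrecognized ID fall through to the default ghost.
    return "default"
```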

Advertised models (returned by `/v1/models`) default to `"sparks"`, the currently configured model, and all ghost IDs. Override with:

```toml
[openai_api]
advertised_models = ["sparks", "sparks/coder", "sparks/scout"]
```
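A `/v1/models` response follows the standard OpenAI list envelope. An illustrative shape (actual IDs depend on your ghosts and `advertised_models`; field values here are assumptions):

```json
{
  "object": "list",
  "data": [
    { "id": "sparks", "object": "model", "owned_by": "sparks" },
    { "id": "sparks/coder", "object": "model", "owned_by": "sparks" },
    { "id": "sparks/scout", "object": "model", "owned_by": "sparks" }
  ]
}
```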

## Usage Example

```sh
curl http://127.0.0.1:8787/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-key" \
  -d '{
    "model": "sparks",
    "messages": [{"role": "user", "content": "Fix the failing test in auth.rs"}]
  }'
```
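A successful response uses the standard chat-completion envelope. A sketch of the shape (values are illustrative, not actual Sparks output):

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "sparks",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "..." },
      "finish_reason": "stop"
    }
  ]
}
```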

## IDE Plugin (Cursor / Continue)

In Cursor's settings, set:

- **Base URL:** `http://127.0.0.1:8787/v1`
- **API key:** your configured `SPARKS_OPENAI_API_KEY`
- **Model:** `sparks` (or a specific ghost like `sparks/coder`)
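For Continue, the equivalent goes in its `config.json` via the generic OpenAI provider. A sketch (field names follow Continue's config format; verify against your Continue version):

```json
{
  "models": [
    {
      "title": "Sparks",
      "provider": "openai",
      "model": "sparks",
      "apiBase": "http://127.0.0.1:8787/v1",
      "apiKey": "your-secret-key"
    }
  ]
}
```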

## Unsupported Options

Sparks does not implement the full OpenAI spec. Unsupported options return explicit errors:

| Feature | Behavior |
|---------|----------|
| `stream: true` | Returns `400` with error JSON explaining streaming is not supported |
| Function calling / tools | Returns `400` with error JSON |
| `logprobs`, `top_logprobs` | Ignored silently |

These deviations are documented intentionally. With the exception of the `logprobs` fields, Sparks returns explicit errors rather than silently ignoring unsupported options.
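For example, a request with `stream: true` yields a `400` carrying an OpenAI-style error envelope. A sketch (the message text and exact fields are assumptions; see `docs/openai-compatible-api.md` for the authoritative list):

```json
{
  "error": {
    "message": "streaming is not supported",
    "type": "invalid_request_error",
    "param": "stream",
    "code": null
  }
}
```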


## Security Notes

- Always bind to `127.0.0.1` (loopback) when running locally; do not expose the server on `0.0.0.0` without proper network controls.
- The API key must be set via environment variable; an inline key in the config is blocked by default.
- All requests are logged to the observer event stream.

## Relevant Source Files

- `src/openai_api.rs`: HTTP server, routing, rate limiting
- `src/openai_auth.rs`: bearer token validation
- `docs/openai-compatible-api.md`: design document with full deviation list
