
feat(slack-app phase 1): foundation — schema + encryption + LLM key endpoints #25

Merged
TS00 merged 2 commits into dev from
feat/llm-keys-foundation
Apr 28, 2026
Conversation

Collaborator

@TS00 TS00 commented Apr 28, 2026

Summary

First slice of the Slack app work (full design: docs/eng-plan-slack-app-v1.md, parent memory d959bc61). Backend-only, no Slack-facing surface yet — lays the durable foundation that Phases 2+ build on.

Schema (3 new tables)

  • llm_keys (migration 023) — customer LLM provider keys, encrypted at rest. Partial unique indexes enforce one key per (scope, provider).
  • slack_workspaces (migration 024) — 1-to-1 mapping Slack workspace → Reflect team or solo user. Bot token encrypted; soft-delete via uninstalled_at.
  • slack_conversation_state (migration 025) — per-thread agent context, TTL'd to 24h.
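For concreteness, the one-key-per-(scope, provider) constraint could be expressed like this in the migration. The column names and types below are assumptions for illustration, not the actual contents of the 023 file:

```typescript
// Hypothetical sketch of migration 023. The two partial unique indexes are
// what enforce "one key per (scope, provider)": each index only covers rows
// of its own scope kind, so team-scoped and user-scoped rows never collide.
export const up = `
  CREATE TABLE llm_keys (
    id            INTEGER PRIMARY KEY,
    team_id       TEXT,            -- set for team scope, NULL otherwise
    user_id       TEXT,            -- set for solo-user scope, NULL otherwise
    provider      TEXT NOT NULL,
    encrypted_key BLOB NOT NULL,   -- ciphertext with GCM auth tag appended
    last4         TEXT NOT NULL,
    created_at    TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at    TEXT NOT NULL DEFAULT (datetime('now'))
  );
  CREATE UNIQUE INDEX llm_keys_team_provider
    ON llm_keys (team_id, provider) WHERE team_id IS NOT NULL;
  CREATE UNIQUE INDEX llm_keys_user_provider
    ON llm_keys (user_id, provider) WHERE user_id IS NOT NULL;
`;
```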

Encryption (src/llm-key-crypto.ts)

  • AES-256-GCM, 12-byte random nonce per write, 16-byte auth tag appended to ciphertext.
  • Per-tenant sub-key via HKDF-SHA256(masterKey, salt=scopeId, info='reflect-memory:llm-key-v1'). Means a leaked encrypted blob is useless without knowing the team_id/user_id.
  • Master key from env RM_LLM_KEY_ENCRYPTION_KEY (64 hex chars / 32 bytes). Already rolled to dev + prod .env files, distinct keys per env. Backups taken.
  • Lazy validation — service boots without it, fails cleanly only when an LLM-key feature is exercised.
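A rough sketch of the scheme described above, using Node's built-in crypto. Function names and the exact blob layout (nonce prepended here) are assumptions, not necessarily what src/llm-key-crypto.ts does:

```typescript
// Minimal sketch: per-tenant HKDF sub-key + AES-256-GCM, nonce || ct || tag.
import { hkdfSync, randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

const INFO = "reflect-memory:llm-key-v1";

// Derive a 32-byte per-tenant sub-key: salt = scope id (team_id/user_id).
function deriveKey(masterKey: Buffer, scopeId: string): Buffer {
  return Buffer.from(hkdfSync("sha256", masterKey, scopeId, INFO, 32));
}

// 12-byte random nonce per write; 16-byte GCM tag appended, all in one blob.
function encrypt(masterKey: Buffer, scopeId: string, plaintext: string): Buffer {
  const nonce = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", deriveKey(masterKey, scopeId), nonce);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([nonce, ct, cipher.getAuthTag()]);
}

// Decrypting under the wrong scope id derives a different sub-key, so the
// GCM tag check fails and this throws -- the "leaked blob is useless" property.
function decrypt(masterKey: Buffer, scopeId: string, blob: Buffer): string {
  const nonce = blob.subarray(0, 12);
  const tag = blob.subarray(blob.length - 16);
  const ct = blob.subarray(12, blob.length - 16);
  const decipher = createDecipheriv("aes-256-gcm", deriveKey(masterKey, scopeId), nonce);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```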

Service (src/llm-key-service.ts)

listLlmKeys / getLlmKeySummary / getLlmKeyPlaintext / setLlmKey (upsert = rotate) / deleteLlmKey. Audit events for create / rotate / remove (last4 only — never plaintext).
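The upsert-as-rotate and audit contracts can be modeled roughly as below. This is an in-memory illustration of the behavior only; the real service persists to llm_keys with encryption, and these exact shapes are assumptions:

```typescript
// Illustrative model of the service contract: upsert = rotate, audit with
// last4 only, delete of a missing key signals not-found.
type KeyScope = { type: "team" | "user"; id: string };
type AuditEvent = { action: "create" | "rotate" | "remove"; provider: string; last4: string };

const store = new Map<string, string>();   // scope+provider -> key (encrypted in the real service)
const audit: AuditEvent[] = [];

const slot = (s: KeyScope, provider: string) => `${s.type}:${s.id}:${provider}`;
const last4 = (key: string) => key.slice(-4);

// Setting an existing (scope, provider) pair replaces it and audits a rotate.
function setLlmKey(scope: KeyScope, provider: string, key: string) {
  const k = slot(scope, provider);
  audit.push({ action: store.has(k) ? "rotate" : "create", provider, last4: last4(key) });
  store.set(k, key);
  return { provider, last4: last4(key) };   // never echoes the plaintext
}

function deleteLlmKey(scope: KeyScope, provider: string): boolean {
  const k = slot(scope, provider);
  const key = store.get(k);
  if (key === undefined) return false;      // route layer maps this to 404
  store.delete(k);
  audit.push({ action: "remove", provider, last4: last4(key) });
  return true;
}
```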

HTTP routes (admin-only)

Mirrors the /admin/log-export gating pattern (ownerUserIds.has(request.userId)):

  • GET /admin/llm-keys
  • PUT /admin/llm-keys — body { provider, key }
  • DELETE /admin/llm-keys/:provider

Scope auto-resolved from the caller's user record (team if present, else user-level for solo).
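The gate and the scope resolution together look roughly like this; the helper names are assumptions that mirror the description, not the actual route code:

```typescript
// Sketch of the route-level checks: owner-only gating (same shape as the
// /admin/log-export pattern) plus scope auto-resolution from the user record.
type KeyScope = { type: "team" | "user"; id: string };

const ownerUserIds = new Set(["usr_admin"]);          // hypothetical owner set

// Non-owners (e.g. agent keys) get a 403 before any key logic runs.
function gateAdmin(request: { userId: string }): number {
  return ownerUserIds.has(request.userId) ? 200 : 403;
}

// Team scope if the caller belongs to a team, else user-level scope (solo).
function resolveScope(user: { id: string; teamId?: string }): KeyScope {
  return user.teamId ? { type: "team", id: user.teamId } : { type: "user", id: user.id };
}
```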

Tests

315/315 green (was 295). 20 new tests:

  • HTTP: empty body / unsupported provider / agent-key 403 / set echoes last4 / rotate replaces with updated_at advance / double-delete 404 / audit events recorded with last4 and never plaintext.
  • Unit: encrypt/decrypt round-trip / wrong-scope decrypt fails (HKDF separation proof) / tampered ciphertext fails (GCM) / KeyScope validation / empty plaintext rejected / malformed master key rejected / missing master key rejected.
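The tampered-ciphertext case, for example, can be reproduced standalone with Node's crypto (this is an independent sketch, not the repo's test code): flipping a single ciphertext bit must make GCM tag verification fail.

```typescript
// GCM tamper check: one flipped bit in the ciphertext makes final() throw.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

const key = randomBytes(32);
const nonce = randomBytes(12);

const cipher = createCipheriv("aes-256-gcm", key, nonce);
const ct = Buffer.concat([cipher.update("sk-ant-secret", "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();

ct[0] ^= 0x01;  // tamper with a single bit

let threw = false;
try {
  const d = createDecipheriv("aes-256-gcm", key, nonce);
  d.setAuthTag(tag);
  Buffer.concat([d.update(ct), d.final()]);  // final() verifies the auth tag
} catch {
  threw = true;
}
```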

What's NOT in this PR (deliberate, follows in Phase 2+)

  • Dashboard UI for the LLM key (lands paired with Slack OAuth UI).
  • Any Slack endpoints — /slack/install, /slack/oauth/callback, /slack/events are all Phase 2/3.
  • The agent loop itself.

Test plan

  • CI all green.
  • Dev deploy succeeds, service boots cleanly, tryValidateMasterKey logs "OK" (no warning) since the env var is set.
  • Manual smoke against api-dev.reflectmemory.com:
    • GET /admin/llm-keys returns {supported_providers: ["anthropic"], keys: []} for an admin token.
    • PUT /admin/llm-keys with a fake key returns last4.
    • GET again shows the key with last4.
    • DELETE /admin/llm-keys/anthropic removes it.
    • Same calls with an agent key → 403.

Made with Cursor

TS00 added 2 commits April 28, 2026 17:15
…ndpoints

First slice of the Slack app work (see docs/eng-plan-slack-app-v1.md).
Backend-only, no Slack-facing surface yet. Lays the durable foundation
that Phases 2+ build on.

Schema (3 new tables, 3 new migrations):
- 023_llm_keys: customer-supplied LLM provider keys, encrypted at rest.
  One row per (team_id|user_id, provider). Partial unique indexes
  enforce one-per-scope.
- 024_slack_workspaces: 1-to-1 mapping of Slack workspace → Reflect
  team (or solo user). Bot token encrypted at rest. Soft-deleted via
  uninstalled_at.
- 025_slack_conversation_state: per-thread short-term agent context,
  TTL'd to 24h.

Encryption (src/llm-key-crypto.ts):
- AES-256-GCM with a 12-byte random nonce per write.
- Per-tenant sub-key derived via HKDF-SHA256(masterKey, salt=scopeId,
  info='reflect-memory:llm-key-v1') so a leaked encrypted blob is
  useless without the team_id/user_id.
- Master key from env RM_LLM_KEY_ENCRYPTION_KEY (64 hex chars / 32
  bytes). Lazy validation — service boots without it but LLM-key
  features are unavailable until set.
- Auth tag (16-byte GCM tag) appended to ciphertext in the same BLOB.

Service (src/llm-key-service.ts):
- listLlmKeys / getLlmKeySummary / getLlmKeyPlaintext / setLlmKey
  (upsert = rotate) / deleteLlmKey.
- Audit events for create / rotate / remove (last4 only — never the
  plaintext key).

HTTP routes (admin-only, mirrors /admin/log-export gating):
- GET    /admin/llm-keys
- PUT    /admin/llm-keys              { provider, key }
- DELETE /admin/llm-keys/:provider
Scope auto-resolved from caller's user record (team if present, else
user).

Tests (+20, total 295 → 315 all green):
- HTTP: empty body, unsupported provider, agent-key 403, set echoes
  last4, rotate replaces with updated_at advance, double-delete 404,
  audit events recorded with last4 (and never plaintext).
- Unit: encrypt/decrypt round-trip, wrong-scope decrypt fails (HKDF
  separation), tampered ciphertext fails (GCM), KeyScope validation,
  empty plaintext rejected, malformed master key rejected, missing
  master key rejected.

Refs: parent memory d959bc61 (Eng Plan: Slack App v1).
Made-with: Cursor
@TS00 TS00 merged commit bd88ef4 into dev Apr 28, 2026
4 checks passed
@TS00 TS00 deleted the feat/llm-keys-foundation branch April 28, 2026 21:47
