diff --git a/DESIGN.md b/DESIGN.md
index 3fc50fa..33e77fe 100644
--- a/DESIGN.md
+++ b/DESIGN.md
@@ -1,11 +1,11 @@
# Design System — Spool
## Product Context
-- **What this is:** A local search engine for your thinking — an Electron macOS app that indexes your AI sessions (Claude Code, Codex, ChatGPT), bookmarks (Twitter, GitHub, YouTube), and any URL you capture, then lets you search it all instantly.
+- **What this is:** A local search engine for your thinking — an Electron macOS app that indexes your AI sessions (Claude Code, Codex, Gemini) and lets you search them instantly.
- **Who it's for:** Developers who think with AI daily and have accumulated hundreds of sessions across multiple tools. The persona has re-explained the same context to AI agents dozens of times.
- **Space/industry:** Developer productivity / local-first tooling. Peers: Raycast, Spotlight, Obsidian, Perplexity — but none of them do this.
- **Project type:** macOS Electron app — compact utility window, not a document editor or dashboard.
-- **Core positioning:** "A local Google for your thinking." Search is the entire product. Everything else (sources, capture, AI mode) is in service of the search box.
+- **Core positioning:** "A local Google for your thinking." Search is the entire product. Everything else (session sources, AI mode) is in service of the search box.
## Aesthetic Direction
- **Direction:** Warm Index — library-warm, not terminal-cold. Function-first but with personality.
@@ -124,13 +124,12 @@ Each data source has a fixed color used consistently across badges, chips, and d
### Source Chips (home screen)
- Pill shape, `--surface` background, source dot + name + count.
-- `+ Connect` uses dashed border.
-- Clicking a chip opens the Sources panel filtered to that source.
+- One chip per agent source (Claude / Codex / Gemini).
+- Clicking a chip opens Settings → Sources tab filtered to that source.
-### Sources Panel (accessible from status bar)
-- Slides up from status bar or opens as a separate window. Two tabs: **Sources** and **Import URL**.
-- Sources tab: list of configured connectors with toggle switch, last-sync time, item count.
-- Toggle switches: on = `--accent` background. Off = `--border2` background.
+### Sources Panel (Settings tab)
+- Lists the three built-in agent sources with their session counts.
+- Status: `auto` label + green dot when watcher is healthy.
### AI Answer Card
- Left border: 3px solid `--accent`. Background: `--accent-bg`.
@@ -142,14 +141,13 @@ Each data source has a fixed color used consistently across badges, chips, and d
- Always visible, 30px height, `--surface` background.
- Left: colored dot (green/yellow/red) + synced item count + last sync time.
- Right: `Sources ⊕` button (replace `⊕` with vector icon).
-- Red dot only when a connector has auth error — not for normal background sync.
+- Dot is green when sync is healthy; yellow during active sync; red only on filesystem watcher errors.
## Icons
- **Library:** Lucide React (`lucide-react`) — consistent stroke weight, MIT licensed.
- **Search:** `Search` icon (Lucide)
- **Source indicators:** Replace all emoji placeholder icons with purpose-drawn SVGs or Lucide equivalents. Emoji are placeholders only in mockups.
- **Mode toggle:** Custom SVG — lightning bolt (⚡ Fast) and a minimal "brain" or sparkle (AI mode).
-- **Capture / add:** `PlusCircle` or `Plus` (Lucide)
- **Settings:** `Settings2` (Lucide)
- **Status dots:** No icon — pure colored circle via CSS.
- **Stroke width:** 1.5px at both 16px and 14px. Never bold/filled for UI chrome.
diff --git a/README.md b/README.md
index 227a3f9..68fb007 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@ The missing search engine for your own data.
-Search your Claude Code sessions, Codex CLI history, Gemini CLI chats, GitHub stars, Twitter bookmarks, and YouTube likes — locally, instantly.
+Search your Claude Code sessions, Codex CLI history, and Gemini CLI chats — locally, instantly.
> **Early stage.** Spool is under active development — expect rough edges. Feedback, bug reports, and ideas are very welcome via [Issues](https://github.com/spool-lab/spool/issues) or [Discord](https://discord.gg/aqeDxQUs5E).
@@ -26,14 +26,15 @@ pnpm build
## What it does
-Spool indexes your AI conversations and bookmarks into a single local search box.
+Spool indexes your AI conversations into a single local search box.
- **AI sessions** — watches Claude/Codex/Gemini session dirs in real time, including profile-based paths like `~/.claude-profiles/*/projects`, `~/.codex-profiles/*/sessions`, and Gemini’s project temp dirs under `~/.gemini/tmp/*/chats`
-- **Connectors** — sync bookmarks and stars from platforms like Twitter/X, GitHub, and more via installable connector plugins
- **Agent search** — a `/spool` skill inside Claude Code feeds matching fragments back into your conversation
Everything stays on your machine. Nothing leaves.
+> Looking for connectors (Twitter / GitHub / Reddit / etc.)? They now live in **[Spool Daemon](https://spool.pro/daemon)**, a sibling app focused on syncing platform data.
+
## Architecture
```
@@ -74,10 +75,6 @@ until CI finishes; artifacts appear on the release page when it returns.
To test a local build without cutting a release, use `pnpm --filter @spool/app build:mac`.
-## Acknowledgements
-
-- **[fieldtheory-cli](https://github.com/afar1/fieldtheory-cli)** — Twitter/X bookmark sync implementation adapted from this project
-
## License
MIT
diff --git a/docs/connector-developer-guide.md b/docs/connector-developer-guide.md
deleted file mode 100644
index bc5e1fa..0000000
--- a/docs/connector-developer-guide.md
+++ /dev/null
@@ -1,607 +0,0 @@
-# Connector Developer Guide
-
-> Everything you need to build, test, and publish a Spool connector.
-
----
-
-## What is a Connector?
-
-A connector is a small npm package that teaches Spool how to fetch data from one platform source. You implement two methods — `checkAuth()` and `fetchPage()` — and the framework handles everything else: scheduling, state persistence, error retries, progress UI, and search indexing.
-
-A connector does NOT:
-- Know when it will be called (the scheduler decides)
-- Track pagination state (the sync engine manages cursors)
-- Write to the database (the engine handles upserts)
-- Handle retries or backoff (the scheduler handles this)
-
-Your job is simple: **given a `FetchContext`, return one page of items.** Most connectors only need the `cursor` field — `sinceItemId` and `phase` are optional hints.
-
----
-
-## Anatomy of a Connector
-
-### The Interface
-
-```typescript
-interface Connector {
- readonly id: string // Unique ID: 'twitter-bookmarks', 'github-stars'
- readonly platform: string // Platform grouping: 'twitter', 'github'
- readonly label: string // Display name: 'X Bookmarks', 'GitHub Stars'
- readonly description: string // One-liner for picker UI
- readonly color: string // Hex color for badges: '#1DA1F2'
- readonly ephemeral: boolean // true = cache (full-replace), false = user data (incremental)
-
- checkAuth(opts?: Record<string, unknown>): Promise<AuthStatus>
- fetchPage(ctx: FetchContext): Promise<PageResult>
-}
-```
-
-### The Six Properties
-
-| Property | Purpose | Example |
-|----------|---------|---------|
-| `id` | Globally unique across all connectors. Used as DB key, IPC identifier, and npm package suffix. | `'twitter-bookmarks'` |
-| `platform` | Groups connectors from the same service. One platform can have multiple connectors (e.g. `twitter-bookmarks`, `twitter-following`). | `'twitter'` |
-| `label` | Shown in the connector list and settings UI. Keep it short. | `'X Bookmarks'` |
-| `description` | Shown below the label in the connector picker. One sentence. | `'Your saved tweets on X'` |
-| `color` | Hex color for the platform dot/badge in the UI. Use the platform's brand color. | `'#1DA1F2'` |
-| `ephemeral` | **Critical flag.** Determines sync strategy. See [Ephemeral vs. Persistent](#ephemeral-vs-persistent) below. | `false` |
-
-### The Two Methods
-
-#### `checkAuth(): Promise<AuthStatus>`
-
-Called before each sync cycle and when the user clicks "Connect" in the UI. Returns whether the connector can authenticate with the platform right now.
-
-```typescript
-interface AuthStatus {
- ok: boolean
- error?: SyncErrorCode // Machine-readable error classification
- message?: string // Technical detail (logged, not shown to user)
- hint?: string // User-facing guidance: "Log into X in Chrome, then retry."
-}
-```
-
-**Rules:**
-- Must be fast (< 2 seconds). Don't make network requests here — just check if credentials exist locally.
-- Always provide a `hint` on failure. The hint is shown directly in the UI. Write it as an instruction the user can act on.
-- Never throw. Always return an `AuthStatus` object.
-
-#### `fetchPage(ctx: FetchContext): Promise<PageResult>`
-
-The core data fetching method. Called repeatedly by the sync engine to paginate through the platform's data.
-
-```typescript
-interface FetchContext {
- cursor: string | null // Pagination cursor. null = start from newest.
- sinceItemId: string | null // Platform ID of newest known item. The engine
- // passes this during forward sync so you can
- // optimize if your API supports "since" filtering.
- // null during backfill or first-ever sync.
- phase: 'forward' | 'backfill' // Which sync phase is requesting this page.
-}
-
-interface PageResult {
- items: CapturedItem[] // Items on this page
- nextCursor: string | null // Cursor for next page, null = no more data
-}
-```
-
-**Rules:**
-- When `cursor` is `null`, fetch the **newest** page (most recent items first).
-- Return `nextCursor: null` when there are no more pages.
-- Items should be ordered newest-first within each page (this is how most APIs work naturally).
-- Throw `SyncError` on failures. The engine catches it, updates error state, and the scheduler handles backoff.
-- Keep pages small-ish (10–25 items). The engine adds a delay between pages to avoid rate limiting.
-- You can safely ignore `sinceItemId` and `phase` — just destructure `{ cursor }` and use it. The engine has its own early-exit logic that works regardless.
-
----
-
-## CapturedItem: The Universal Data Unit
-
-Every item from every connector is normalized into this shape before storage:
-
-```typescript
-interface CapturedItem {
- url: string // Original URL on the platform
- title: string // Display title
- contentText: string // Full text content (indexed for search)
- author: string | null // Author handle or name
- platform: string // Must match connector.platform
- platformId: string | null // Platform-unique ID (CRITICAL for dedup)
- contentType: string // 'tweet', 'repo', 'video', 'post', 'page', etc.
- thumbnailUrl: string | null
- metadata: Record<string, unknown> // Platform-specific extras
- capturedAt: string // ISO 8601 timestamp from the platform
- rawJson: string | null // Raw API response for future re-parsing
-}
-```
-
-### Key Fields Explained
-
-**`platform` + `platformId`** — The deduplication key. The sync engine upserts items by this pair. If two items share the same `(platform, platformId)`, the newer one updates the older one. **Always set `platformId`** to the platform's native ID for the item (tweet ID, repo ID, video ID, etc.).
-
-**`contentText`** — This is what gets full-text indexed. Put the main textual content here: tweet text, repo description, article body, etc. This powers Spool's search.
-
-**`capturedAt`** — Use the platform's timestamp, not the sync time. For a tweet, this is when the tweet was posted. For a GitHub star, this is when the repo was starred. This determines sort order in search results.
-
-**`metadata`** — Extensible JSON bag for anything not covered by the base fields. Common uses:
-- Engagement counts: `{ likeCount, repostCount, viewCount }`
-- Media attachments: `{ media: [{ type, url, width, height }] }`
-- Author details: `{ authorSnapshot: { handle, name, bio, followers } }`
-- Platform-specific data: `{ language, conversationId, isVerified }`
-
-The framework automatically adds `metadata.connectorId` — you don't need to set this.
-
-**`rawJson`** — Store the raw API response. This allows re-parsing items when the schema changes, without re-fetching from the platform.
-
----
-
-## Ephemeral vs. Persistent
-
-The `ephemeral` flag fundamentally changes how the sync engine treats your connector:
-
-### `ephemeral: false` — User-Owned Data (Default)
-
-For data the user created, saved, or curated: bookmarks, stars, saved posts, watch history.
-
-**Sync strategy: Dual-frontier incremental sync.**
-
-```
-[oldest] ◄── tail (backfill) ──── stored data ──── head (forward) ──► [newest]
-```
-
-- **Forward sync** runs frequently (every 15 min). Fetches from newest, stops when it hits already-known items (3 consecutive pages with 0 new items; see the sketch below).
-- **Backfill** runs less often (every 60 min). Fills in historical data from where it last stopped, working backwards through time.
-- Items are upserted (dedup by `platform + platformId`), never deleted.
-- State persists across app restarts: cursors, page counts, error history.
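-
-A sketch of that forward stop logic, assuming `sinceItemId` holds the stored head anchor (or `null`) and `upsertAll()` is a hypothetical helper that upserts a page and reports how many of its items were previously unseen:
-
-```typescript
-let cursor: string | null = null // null = start from the newest page
-let stalePages = 0
-while (true) {
-  const { items, nextCursor } = await connector.fetchPage({ cursor, sinceItemId, phase: 'forward' })
-  const fresh = await upsertAll(items) // hypothetical: count of previously unseen items on this page
-  stalePages = fresh === 0 ? stalePages + 1 : 0
-  if (stalePages >= 3 || nextCursor === null) break // caught up, or end of data
-  cursor = nextCursor
-}
-```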
-
-### `ephemeral: true` — Cache Data
-
-For public/trending data not tied to user actions: hot topics, trending repos, rankings.
-
-**Sync strategy: Full-replace.**
-
-- Every sync cycle deletes all existing items for this connector, then fetches fresh.
-- No cursor tracking. Always starts from page 1.
-- Simpler, but items don't persist between syncs.
-
----
-
-## Scheduled Sync: How and When Your Connector Runs
-
-You don't control when your connector runs. The **SyncScheduler** handles this automatically.
-
-### Default Schedule
-
-| Parameter | Default | Meaning |
-|-----------|---------|---------|
-| Forward interval | 15 minutes | How often new items are fetched |
-| Backfill interval | 60 minutes | How often historical backfill runs |
-| Page delay | 1200ms | Sleep between `fetchPage()` calls (rate limiting) |
-| Max minutes per run | 10 minutes | Sync aborts after this (scheduler-initiated only; CLI has no limit) |
-| Concurrency | 1 | Only one connector syncs at a time |
-
-The schedule is **global** — all connectors share the same intervals. Per-connector tuning is not currently exposed (but `configJson` in the DB is reserved for this).
-
-### When Does Sync Happen?
-
-| Event | What Happens | Priority |
-|-------|-------------|----------|
-| App launch | All enabled connectors queue for forward+backfill | 80 |
-| System wake | All enabled connectors queue for forward | 60 |
-| Every 30 seconds | Scheduler checks which connectors are "due" based on interval | 40 (forward) / 20 (backfill) |
-| User clicks "Sync now" | That connector queues immediately | 100 |
-
-Higher priority jobs run first. With concurrency=1, only one connector syncs at a time.
-
-### Error Backoff
-
-When `fetchPage()` throws a `SyncError`, the engine increments `consecutiveErrors` on the connector's state. The scheduler uses this to delay retries:
-
-| Consecutive Errors | Wait Before Retry |
-|-------------------|-------------------|
-| 0 | Normal interval |
-| 1 | 60 seconds |
-| 2 | 5 minutes |
-| 3 | 30 minutes |
-| 4+ | 2 hours (cap) |
-
-On a successful sync, `consecutiveErrors` resets to 0.
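-
-In code, the scheduler's delay lookup reduces to indexing a table by the error count (a sketch using the values above; the names are illustrative):
-
-```typescript
-const RETRY_BACKOFF_MS = [60_000, 300_000, 1_800_000, 7_200_000] // 60s, 5min, 30min, 2h cap
-
-function retryDelayMs(consecutiveErrors: number): number | null {
-  if (consecutiveErrors === 0) return null // null = sync at the normal interval
-  const idx = Math.min(consecutiveErrors - 1, RETRY_BACKOFF_MS.length - 1)
-  return RETRY_BACKOFF_MS[idx]
-}
-```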
-
-**Auth errors are special**: any error code starting with `AUTH_` causes the scheduler to stop retrying entirely. The connector stays disabled until the user manually re-authenticates (clicks "Connect" in the UI, which calls `checkAuth()` again).
-
-### Stop Conditions
-
-The sync engine stops a forward sync when ANY of:
-1. **Reached since-anchor**: A page contains the item matching `sinceItemId` (caught up precisely — most efficient)
-2. **Caught up**: 3 consecutive pages with 0 new items (fallback when no anchor exists)
-3. **End of data**: `nextCursor` is `null`
-4. **Time limit**: Exceeded `maxMinutes` (10 min for scheduler, unlimited for CLI). Forward saves `headCursor` for resume.
-5. **Cancelled**: App is quitting or user aborted. Forward saves `headCursor` for resume.
-6. **Error**: `fetchPage()` threw. Forward saves `headCursor` for resume.
-
-### Progress & Events
-
-The scheduler emits events that flow to the UI in real time:
-
-```typescript
-type SchedulerEvent =
- | { type: 'sync-start'; connectorId: string }
- | { type: 'sync-progress'; progress: SyncProgress }
- | { type: 'sync-complete'; result: ConnectorSyncResult }
- | { type: 'sync-error'; connectorId: string; code: SyncErrorCode; message: string }
-```
-
-The UI shows: which connector is syncing, current page, items found, phase (forward/backfill).
-
----
-
-## Error Handling
-
-Connectors signal errors by throwing `SyncError`:
-
-```typescript
-import { SyncError } from '@spool/core'
-
-throw new SyncError('API_RATE_LIMITED', 'Got 429, retry after 60s')
-throw new SyncError('AUTH_SESSION_EXPIRED', 'Cookie returned 401')
-throw new SyncError('NETWORK_OFFLINE') // message defaults to hint text
-```
-
-### Error Code Reference
-
-| Code | When to Use | Framework Behavior |
-|------|------------|-------------------|
-| `AUTH_CHROME_NOT_FOUND` | Chrome or its cookie DB doesn't exist | Stop scheduling, show "needs setup" |
-| `AUTH_NOT_LOGGED_IN` | Platform cookies missing (user not logged in) | Stop scheduling, show "log in" hint |
-| `AUTH_COOKIE_DECRYPT_FAILED` | OS-level decryption failed | Stop scheduling |
-| `AUTH_KEYCHAIN_DENIED` | macOS Keychain access denied | Stop scheduling |
-| `AUTH_SESSION_EXPIRED` | 401/403 from platform API | Stop scheduling, show "re-authenticate" |
-| `API_RATE_LIMITED` | 429 response | Retry with backoff |
-| `API_SERVER_ERROR` | 5xx response | Retry with backoff |
-| `NETWORK_OFFLINE` | DNS/connection failure | Retry with backoff |
-| `NETWORK_TIMEOUT` | Request timed out | Retry with backoff |
-| `API_PARSE_ERROR` | Response shape doesn't match expected schema | No retry (likely a breaking API change) |
-| `CONNECTOR_ERROR` | Anything else | No retry |
-
-**Rule of thumb**: Use `AUTH_*` codes for anything that requires user action to fix. Use `API_*`/`NETWORK_*` codes for transient issues the framework can retry.
-
----
-
-## Authentication Patterns
-
-The `Connector` interface doesn't prescribe how authentication works — it only requires that `checkAuth()` returns an `AuthStatus`. This gives you flexibility to implement whatever auth pattern your platform needs.
-
-### Pattern 1: Chrome Cookie Extraction (Recommended)
-
-Used by Twitter Bookmarks. Reads encrypted cookies directly from Chrome's SQLite database on macOS. **No user interaction needed** — if the user is logged into the platform in Chrome, it just works.
-
-```typescript
-async checkAuth(): Promise<AuthStatus> {
- try {
- const cookies = extractChromeCookies('.example.com', ['session_id', 'csrf_token'])
- return { ok: true }
- } catch (e) {
- if (e instanceof SyncError) {
- return { ok: false, error: e.code, message: e.message, hint: e.hint }
- }
- return { ok: false, error: 'AUTH_UNKNOWN', hint: 'Check that Chrome is installed and you are logged in.' }
- }
-}
-
-async fetchPage({ cursor }: FetchContext): Promise<PageResult> {
- const cookies = extractChromeCookies('.example.com', ['session_id', 'csrf_token'])
- const response = await fetch('https://api.example.com/bookmarks', {
- headers: { Cookie: cookies.cookieHeader }
- })
- // ... parse response
-}
-```
-
-**Pros**: Zero friction, no OAuth flow, works with any platform the user is logged into.
-**Cons**: macOS only (for now), requires Chrome, cookies can expire mid-sync.
-
-**Shared utility**: The Twitter Bookmarks connector includes a `chrome-cookies.ts` module with macOS Keychain integration, AES-128-CBC decryption, and Chrome DB version handling. Other cookie-based connectors can reuse or adapt this code.
-
-### Pattern 2: CLI Tool Delegation
-
-Used when a well-maintained CLI tool already exists for the platform (e.g., `gh` for GitHub). The connector shells out to the CLI instead of making direct API calls.
-
-```typescript
-async checkAuth(): Promise<AuthStatus> {
- try {
- const { stdout } = await execAsync('gh auth status')
- return { ok: true }
- } catch {
- return { ok: false, hint: 'Run `gh auth login` in your terminal.' }
- }
-}
-
-async fetchPage({ cursor }: FetchContext): Promise<PageResult> {
-  const page = cursor ? parseInt(cursor) : 1
-  const { stdout } = await execAsync(`gh api "/user/starred?per_page=30&page=${page}"`) // quoted so the shell doesn't treat & as a control operator
- const repos = JSON.parse(stdout)
- return {
- items: repos.map(repoToCapturedItem),
- nextCursor: repos.length === 30 ? String(page + 1) : null,
- }
-}
-```
-
-**Pros**: Leverages existing auth flows (OAuth tokens managed by the CLI), well-tested API wrappers.
-**Cons**: Requires the CLI to be installed, subprocess overhead, output parsing can be brittle.
-
-### Pattern 3: API Token / Config File
-
-For platforms that use API keys, tokens, or config files. The token is stored in the connector's `configJson` field in the DB, or read from a well-known config file path.
-
-```typescript
-async checkAuth(): Promise<AuthStatus> {
- const config = this.loadConfig() // from configJson or ~/.config/myplatform/token
- if (!config?.apiToken) {
- return { ok: false, hint: 'Set your API token in Spool connector settings.' }
- }
- return { ok: true }
-}
-```
-
-### Pattern 4: No Auth Required
-
-For public data sources (RSS feeds, public APIs). Just return `{ ok: true }`.
-
-```typescript
-async checkAuth(): Promise<AuthStatus> {
- return { ok: true }
-}
-```
-
-### Auth Design Guidelines
-
-1. **`checkAuth()` must be fast** — no network calls. Check if credentials exist, not if they're valid.
-2. **Always provide a `hint`** — this is shown to the user in the UI. Make it actionable.
-3. **Never store secrets in code** — use Chrome cookies, CLI auth, or per-connector `configJson`.
-4. **Handle expiration gracefully** — if a 401/403 comes during `fetchPage()`, throw `SyncError('AUTH_SESSION_EXPIRED')`. The framework will stop scheduling and surface it in the UI. See the sketch below.
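-
-A sketch of that classification for a connector that calls the platform over plain `fetch()` (endpoint and headers are placeholders):
-
-```typescript
-const res = await fetch('https://api.example.com/bookmarks', { headers })
-if (res.status === 401 || res.status === 403) throw new SyncError('AUTH_SESSION_EXPIRED')
-if (res.status === 429) throw new SyncError('API_RATE_LIMITED')
-if (res.status >= 500) throw new SyncError('API_SERVER_ERROR', `Got ${res.status}`)
-```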
-
----
-
-## Building a Connector: Step by Step
-
-### 1. Create the Package
-
-```bash
-mkdir spool-lab-connector-github-stars
-cd spool-lab-connector-github-stars
-npm init -y
-```
-
-Edit `package.json`:
-
-```json
-{
- "name": "@spool-lab/connector-github-stars",
- "version": "0.1.0",
- "main": "dist/index.js",
- "types": "dist/index.d.ts",
- "spool": {
- "type": "connector",
- "id": "github-stars",
- "platform": "github",
- "label": "GitHub Stars",
- "description": "Repos you've starred on GitHub",
- "color": "#333333",
- "ephemeral": false
- },
- "peerDependencies": {
- "@spool/core": "^0.x"
- }
-}
-```
-
-The `spool` field is the connector manifest. The app reads this to display connector metadata in the UI and on the spool.pro directory page, without loading the module.
-
-### 2. Implement the Connector
-
-```typescript
-// src/index.ts
-import type { Connector, AuthStatus, PageResult, CapturedItem, FetchContext } from '@spool/core'
-import { SyncError } from '@spool/core'
-import { execSync } from 'node:child_process'
-
-export default class GitHubStarsConnector implements Connector {
- readonly id = 'github-stars'
- readonly platform = 'github'
- readonly label = 'GitHub Stars'
- readonly description = 'Repos you\'ve starred on GitHub'
- readonly color = '#333333'
- readonly ephemeral = false
-
- async checkAuth(): Promise<AuthStatus> {
- try {
- execSync('gh auth status', { stdio: 'pipe' })
- return { ok: true }
- } catch {
- return {
- ok: false,
- error: 'AUTH_NOT_LOGGED_IN',
- hint: 'Install GitHub CLI and run `gh auth login` in your terminal.',
- }
- }
- }
-
- async fetchPage({ cursor }: FetchContext): Promise<PageResult> {
- const page = cursor ? parseInt(cursor) : 1
- const perPage = 30
-
- let stdout: string
- try {
- const result = execSync(
-      `gh api "/user/starred?per_page=${perPage}&page=${page}" -H "Accept: application/vnd.github.v3.star+json"`,
- { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'] }
- )
- stdout = result
-    } catch (e: any) {
-      // execSync's e.status is the child's exit code, not an HTTP status;
-      // gh prints the HTTP status to stderr, e.g. "... (HTTP 401)".
-      const stderr = String(e.stderr ?? '')
-      if (stderr.includes('HTTP 401') || stderr.includes('HTTP 403')) throw new SyncError('AUTH_SESSION_EXPIRED')
-      if (stderr.includes('HTTP 429')) throw new SyncError('API_RATE_LIMITED')
-      throw new SyncError('CONNECTOR_ERROR', e.message)
- }
-
- const starred = JSON.parse(stdout)
- const items: CapturedItem[] = starred.map((entry: any) => ({
- url: entry.repo.html_url,
- title: entry.repo.full_name,
- contentText: entry.repo.description ?? '',
- author: entry.repo.owner.login,
- platform: 'github',
- platformId: String(entry.repo.id),
- contentType: 'repo',
- thumbnailUrl: entry.repo.owner.avatar_url,
- metadata: {
- language: entry.repo.language,
- stars: entry.repo.stargazers_count,
- forks: entry.repo.forks_count,
- topics: entry.repo.topics,
- },
- capturedAt: entry.starred_at, // when YOU starred it, not when repo was created
- rawJson: JSON.stringify(entry),
- }))
-
- return {
- items,
- nextCursor: starred.length === perPage ? String(page + 1) : null,
- }
- }
-}
-```
-
-### 3. Declare the Manifest
-
-The `spool` field in `package.json` (shown above) must include:
-
-| Field | Required | Description |
-|-------|----------|-------------|
-| `type` | Yes | Always `"connector"` |
-| `id` | Yes | Must match `connector.id` in code |
-| `platform` | Yes | Must match `connector.platform` in code |
-| `label` | Yes | Display name |
-| `description` | Yes | One-line description |
-| `color` | Yes | Hex color for UI |
-| `ephemeral` | Yes | Sync strategy flag |
-
-### 4. Test Locally
-
-During development, install your connector locally:
-
-```bash
-cd ~/.spool/connectors
-npm install /path/to/your/spool-lab-connector-github-stars
-```
-
-Restart the Spool app. Your connector should appear in the Sources panel. Or test via CLI:
-
-```bash
-spool connector sync github-stars
-```
-
-### 5. Publish
-
-```bash
-npm publish --access public
-```
-
-Users install via:
-```bash
-# From Spool app UI (future), or manually:
-cd ~/.spool/connectors && npm install @spool-lab/connector-github-stars
-```
-
----
-
-## What Happens at Runtime
-
-Here's the full lifecycle of a connector, from installation to search results:
-
-```
-1. DISCOVERY
-   App starts → scans every package under ~/.spool/connectors/node_modules for a `spool` manifest
- → require() each → new ConnectorClass() → registry.register(connector)
-
-2. SCHEDULING
- SyncScheduler.start() → queues all enabled connectors (priority 80)
- → tick() every 30s checks which connectors are "due"
- → dequeues highest priority job → calls SyncEngine.sync()
-
-3. SYNC CYCLE (for persistent connectors)
- SyncEngine.sync(connector)
- → loadState() from connector_sync_state table
- → FORWARD: fetchPage({ cursor: headCursor ?? null, sinceItemId, phase: 'forward' })
- → stop on reached_since / stale / timeout / cancel / error
- → interrupted? headCursor saved for resume next cycle
- → completed but sinceItemId not hit? anchor invalidated, rebuilt next cycle
- → BACKFILL: fetchPage({ cursor: tailCursor, sinceItemId: null, phase: 'backfill' })
- → stop on end-of-history / budget
- → saveState()
-
-4. ITEM PROCESSING (per page)
- For each item in PageResult:
- → tag with metadata.connectorId
- → upsert by (platform, platformId) into captures table
- → FTS trigger auto-indexes title + contentText
-
-5. EVENTS
- sync-start → sync-progress (per page) → sync-complete
- → forwarded via IPC to renderer → UI updates in real time
-
-6. SEARCH
- User searches → FTS5 query on captures_fts → results include connector items
- → shown alongside Claude Code sessions in unified results
-```
-
-### Database Tables Your Data Touches
-
-| Table | What's Stored | Who Writes |
-|-------|--------------|------------|
-| `captures` | Your items (one row per CapturedItem) | SyncEngine |
-| `captures_fts` | Full-text index on title + contentText | SQLite trigger (automatic) |
-| `connector_sync_state` | Cursors, error counts, timestamps, enabled flag | SyncEngine |
-
-You never interact with these tables directly. The framework handles all reads and writes.
-
----
-
-## FAQ
-
-### Can I make network requests in `checkAuth()`?
-
-Avoid it. `checkAuth()` is called from the UI thread and should return in under 2 seconds. Check if credentials exist locally (cookies in Chrome DB, CLI auth status, config file). Don't validate them against the remote API — that's what `fetchPage()` is for.
-
-### What if my platform doesn't use cursor-based pagination?
-
-Use page numbers as cursor strings: return `nextCursor: String(page + 1)` and parse with `parseInt(cursor)`. See the GitHub Stars example above.
-
-### What if my platform returns items oldest-first?
-
-The sync engine expects newest-first for forward sync to work correctly (it stops when it hits known items). If your API returns oldest-first, you may need to reverse the response or use `ephemeral: true`.
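-
-If each page is internally oldest-first but pagination still walks back from the newest data, reversing per page is enough (a sketch; `fetchPlatformPage` is a stand-in for your API call):
-
-```typescript
-const page = await fetchPlatformPage(cursor) // hypothetical call returning items oldest-first
-return { items: page.items.slice().reverse(), nextCursor: page.nextCursor }
-```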
-
-### How do I store per-connector settings (e.g., which Chrome profile to use)?
-
-The `configJson` field in `connector_sync_state` is available for this. Access it via the constructor options pattern used by Twitter Bookmarks:
-
-```typescript
-constructor(private opts?: { chromeProfileDirectory?: string }) {}
-```
-
-Settings UI is not yet standardized — for now, pass options at registration time.
-
-### What's the difference between a native connector and wrapping an external CLI?
-
-| | Native (e.g., Twitter Bookmarks) | CLI Wrapper (e.g., GitHub Stars via `gh`) |
-|--|---|---|
-| **Auth** | Reads Chrome cookies directly | Delegates to CLI's auth (`gh auth login`) |
-| **Data fetching** | Direct HTTP/GraphQL calls | Shells out to CLI, parses stdout |
-| **Dependencies** | None (Node.js built-ins only) | Requires external CLI installed |
-| **Performance** | Fast, no subprocess overhead | Subprocess per page |
-| **Pagination control** | Full control over cursors and page size | Limited to CLI's pagination options |
-| **Error handling** | Precise: can distinguish 429/401/5xx | Limited: parse stderr or exit codes |
-
-Both implement the same `Connector` interface. The framework doesn't care how `fetchPage()` gets its data. Choose native for high-volume connectors or platforms where you need fine-grained control. Choose CLI wrappers when a good CLI already exists and volume is low.
diff --git a/docs/connector-sync-architecture.md b/docs/connector-sync-architecture.md
deleted file mode 100644
index dec7bc3..0000000
--- a/docs/connector-sync-architecture.md
+++ /dev/null
@@ -1,924 +0,0 @@
-# Connector Architecture
-
-> Plugin-based data sync framework for Spool. A connector is an installable npm package that knows how to read items from one source — a remote API, a local database, a set of files — and hand them to Spool's sync engine as `CapturedItem`s.
-
----
-
-## Core Concepts
-
-### What is a Connector?
-
-A connector is a self-contained module that knows how to check whether its data source is available and fetch paginated items from it. It does NOT know about scheduling, sync state, or storage — those are handled by the framework.
-
-Examples:
-- Remote APIs: `@spool-lab/connector-twitter-bookmarks`, `@spool-lab/connector-github-stars`
-- Local databases: a connector that reads a macOS app's SQLite store
-- Local files: a connector that indexes a directory of notes
-
-A connector only has to implement two methods (`checkAuth` and `fetchPage`). Whether the data comes from HTTP, SQLite, or the filesystem is entirely the connector's concern — the framework treats them uniformly.
-
-### Data Ownership Model
-
-| Kind | Sync Strategy | Examples |
-|------|---------------|----------|
-| **User-owned** | Persistent dual-frontier sync (`ephemeral: false`) | Bookmarks, stars, saved posts, favorites, watch history |
-| **Ephemeral** | Full-replace cache (`ephemeral: true`) | Hot topics, trending, rankings, explore feeds |
-
-User-owned data uses incremental sync with two cursors (head + tail). Ephemeral data is deleted and re-fetched each cycle.
-
----
-
-## Connector Interface
-
-```typescript
-interface Connector {
- /** Unique identifier, e.g. 'twitter-bookmarks' */
- readonly id: string
-
- /** Platform name for grouping, e.g. 'twitter' */
- readonly platform: string
-
- /** Human-readable label, e.g. 'X Bookmarks' */
- readonly label: string
-
- /** Short description for the connector picker */
- readonly description: string
-
- /** UI color for badges/dots */
- readonly color: string
-
- /** Whether this is ephemeral (cache) or user-owned (persistent) */
- readonly ephemeral: boolean
-
- /** Check if authentication is available */
- checkAuth(opts?: Record<string, unknown>): Promise<AuthStatus>
-
- /**
- * Fetch one page of data.
- * The sync engine calls this repeatedly with FetchContext to paginate.
- * The connector can use sinceItemId and phase to optimize fetching,
- * or ignore them and just use the cursor (cursor-walking).
- */
- fetchPage(ctx: FetchContext): Promise<PageResult>
-}
-
-interface FetchContext {
- cursor: string | null // Pagination cursor. null = start from newest.
- sinceItemId: string | null // Platform ID of newest known item (head anchor).
- // Forward passes this so the connector can
- // optimize (e.g. stop early). null during backfill
- // or when no anchor exists yet.
- phase: 'forward' | 'backfill' // Which sync phase is requesting this page.
- signal: AbortSignal // fires when the sync engine wants to stop
-}
-```
-
-A connector only needs to implement two methods: `checkAuth()` and `fetchPage()`. Everything else — persistence, scheduling, retries, UI — is handled by the framework. The `sinceItemId` and `phase` fields in `FetchContext` are informational — a connector can safely ignore them and just use `cursor`. The engine has its own early-exit logic that works regardless of whether the connector acts on these hints.
-
-Connectors should pass `signal` through to `caps.fetch(url, { signal })` and to `abortableSleep(ms, signal)` in retry backoff loops to ensure cancel propagates promptly.
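-
-One possible shape for `abortableSleep`, using the `SYNC_CANCELLED` code from the enum below (illustrative; the framework's own implementation may differ):
-
-```typescript
-function abortableSleep(ms: number, signal: AbortSignal): Promise<void> {
-  return new Promise((resolve, reject) => {
-    if (signal.aborted) return reject(new SyncError('SYNC_CANCELLED'))
-    const onAbort = () => { clearTimeout(timer); reject(new SyncError('SYNC_CANCELLED')) }
-    const timer = setTimeout(() => { signal.removeEventListener('abort', onAbort); resolve() }, ms)
-    signal.addEventListener('abort', onAbort, { once: true })
-  })
-}
-```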
-
-### Key Supporting Types
-
-```typescript
-interface AuthStatus {
- ok: boolean
- error?: string
- code?: SyncErrorCode // machine-readable error classification
- hint?: string // user-facing guidance, e.g. "Log into X in Chrome"
-}
-
-interface PageResult {
- items: CapturedItem[]
- nextCursor: string | null // null = no more data in this direction
-}
-```
-
-### CapturedItem — the Universal Data Unit
-
-Every piece of data flowing through the connector system is a `CapturedItem`. This is the canonical shape for all platform data stored in Spool.
-
-```typescript
-interface CapturedItem {
- /** Original URL on the platform */
- url: string
-
- /** Display title (truncated for tweets, repo name for GitHub, etc.) */
- title: string
-
- /** Full text content */
- contentText: string
-
- /** Author handle or name */
- author: string | null
-
- /** Platform identifier: 'twitter', 'github', 'reddit', etc. */
- platform: string
-
- /** Platform-specific unique ID for deduplication */
- platformId: string | null
-
- /** Content type: 'tweet', 'repo', 'video', 'post', 'page', etc. */
- contentType: string
-
- /** Preview image URL */
- thumbnailUrl: string | null
-
- /** Platform-specific structured data (JSON blob) */
- metadata: Record<string, unknown>
-
- /** When the item was created/saved on the platform (ISO 8601) */
- capturedAt: string
-
- /** Raw API response for future re-parsing */
- rawJson: string | null
-}
-```
-
-**Key fields explained:**
-
-| Field | Purpose | Example |
-|-------|---------|---------|
-| `platform` + `platformId` | **Deduplication key**. The sync engine upserts by this pair. | `twitter` + `1234567890` |
-| `contentType` | Determines rendering in search results. | `tweet`, `repo`, `video` |
-| `metadata` | Extensible bag for platform-specific data not covered by the base schema. Connectors store engagement counts, media objects, author snapshots, etc. | `{ likeCount: 42, media: [...] }` |
-| `metadata.connectorId` | **Framework-set** (not connector-set). The sync engine tags every item with the connector ID that produced it, enabling per-connector filtering and cleanup. | `twitter-bookmarks` |
-| `capturedAt` | Used for timeline ordering. Should be the platform's timestamp (when the tweet was posted, when the repo was starred), not the sync time. | `2025-03-15T10:30:00Z` |
-| `rawJson` | Preserved so that schema changes don't require re-fetching. The parser can be re-run on stored raw data. | Full GraphQL response |
-
-### Error Classification
-
-Connectors throw `SyncError` with a typed `code` for the framework to make retry/backoff decisions:
-
-```typescript
-enum SyncErrorCode {
-  // Auth errors — connector should surface these clearly
-  AUTH_CHROME_NOT_FOUND,      // Chrome/cookies DB not found
-  AUTH_NOT_LOGGED_IN,         // Required cookies missing
-  AUTH_COOKIE_DECRYPT_FAILED, // OS keychain decryption failed
-  AUTH_KEYCHAIN_DENIED,       // User denied keychain access prompt
-  AUTH_SESSION_EXPIRED,       // 401/403 from platform API
-
-  // Network errors — framework handles retry
-  RATE_LIMITED,               // 429
-  SERVER_ERROR,               // 5xx
-  NETWORK_OFFLINE,            // fetch failed, no connectivity
-  NETWORK_TIMEOUT,            // request timed out
-  PARSE_ERROR,                // response wasn't valid JSON/expected shape
-  UNEXPECTED_STATUS,          // unexpected HTTP status
-
-  // Engine errors — framework internal
-  MAX_PAGES_REACHED,          // hit page budget
-  SYNC_TIMEOUT,               // hit time budget
-  SYNC_CANCELLED,             // AbortSignal fired
-
-  // Storage errors
-  DB_WRITE_ERROR,             // SQLite write failed
-
-  // Catch-all
-  CONNECTOR_ERROR,            // connector-specific unclassified error
-}
-```
-
-Errors with `needsReauth: true` (all `AUTH_*` codes) cause the scheduler to stop retrying until the user re-authenticates. Errors with `retryable: true` (network/server errors) trigger exponential backoff.
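-
-A sketch of how those two flags can be derived from the code alone (the framework's real mapping may carry more metadata):
-
-```typescript
-const needsReauth = (code: string): boolean => code.startsWith('AUTH_')
-const retryable = (code: string): boolean =>
-  ['RATE_LIMITED', 'SERVER_ERROR', 'NETWORK_OFFLINE', 'NETWORK_TIMEOUT'].includes(code)
-```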
-
----
-
-## Sync Engine: Dual-Frontier Model
-
-The sync engine is platform-agnostic. It takes any `Connector` and manages the full sync lifecycle.
-
-### Concept
-
-```
-[history end] ◄── tail frontier ──── stored data ──── head frontier ──► [newest]
- (backfill ←) (→ forward)
-```
-
-Two independent frontiers:
-
-- **Head (forward):** Fetches new items added since last sync. Runs frequently (every 15 min). Stops when it encounters already-known items or runs out of pages.
-- **Tail (backfill):** Fills in historical data. Runs less frequently (every 60 min). Stops when it reaches the end of available history or exhausts its page budget.
-
-### Sync State (per connector, stored in DB)
-
-```typescript
-interface SyncState {
- connectorId: string
-
- // Head frontier
- headCursor: string | null // Forward resume cursor. Non-null only when
- // forward was interrupted (timeout/cancel/error).
- // Cleared on normal completion.
- headItemId: string | null // Platform ID of newest known item (since anchor).
- // Set from page 0 of a fresh forward (not a resumed one).
- // Used as FetchContext.sinceItemId and as the engine's
- // early-exit target. Cleared automatically if forward
- // completes without hitting it (anchor invalidation).
-
- // Tail frontier
- tailCursor: string | null // cursor to resume backfill
- tailComplete: boolean // true = reached end of history
-
- // Metadata
- lastForwardSyncAt: string | null
- lastBackfillSyncAt: string | null
- totalSynced: number
- consecutiveErrors: number
- enabled: boolean
- configJson: string // per-connector config (e.g. chrome profile)
- lastErrorCode: string | null
- lastErrorMessage: string | null
-}
-```
-
-### Stop Conditions
-
-Forward sync stops when ANY of:
-1. **Reached since-anchor**: A page contains the item matching `sinceItemId` (caught up precisely)
-2. **Stale pages**: 3 consecutive pages with 0 new items (fallback when no anchor exists)
-3. **No cursor**: API returned `nextCursor: null` (end of data)
-4. **Timeout**: Exceeded `maxMinutes` (forward interrupted, `headCursor` preserved for resume)
-5. **Cancelled**: `AbortSignal` fired (`headCursor` preserved)
-
-Conditions 1–3 are "normal completion" — `headCursor` is cleared. Conditions 4–5 are "interruption" — `headCursor` retains the current position so the next forward resumes where it stopped instead of re-fetching from the newest end.
-
-### Ephemeral vs. Persistent
-
-```typescript
-class SyncEngine {
- async sync(connector: Connector, opts?: SyncOptions): Promise<ConnectorSyncResult> {
- if (connector.ephemeral) {
- // Delete all existing items for this connector, fetch fresh
- return this.syncEphemeral(connector, opts)
- }
- // Dual-frontier: forward then backfill
- return this.syncPersistent(connector, opts)
- }
-}
-```
-
-### Checkpoint & Crash Safety
-
-The engine checkpoints state to DB every 25 pages. If the app crashes mid-sync:
-- Forward sync: resumes from last saved `headCursor`. Pages between the crash and the last checkpoint may be re-fetched, but dedup by `(platform, platformId)` prevents duplicates.
-- Backfill: resumes from last saved `tailCursor`.
-- No data loss, at most some redundant API calls.
-
----
-
-## Sync Scheduler
-
-The scheduler is the orchestration layer that decides WHEN to run syncs. It runs in the Electron main process.
-
-### Design Principles
-
-1. **Connectors don't know about scheduling.** A connector is a pure data fetcher.
-2. **Sync engine doesn't know about timing.** It runs one sync cycle when asked.
-3. **Scheduler owns the clock.** It decides what to sync, when, and in what order.
-
-### Schedule Configuration
-
-```typescript
-interface ScheduleConfig {
- forwardIntervalMs: number // Default: 15 min
- backfillIntervalMs: number // Default: 60 min
- concurrency: number // Default: 1
- pageDelayMs: number // Default: 1200ms
- retryBackoffMs: number[] // Default: [60s, 300s, 1800s, 7200s]
- maxMinutesPerRun: number // Default: 10 (scheduler); 0 = unlimited (CLI)
-}
-```
-
-### Priority Queue
-
-| Priority | Trigger | Description |
-|----------|---------|-------------|
-| 100 | Manual | User clicked "Sync now" |
-| 80 | Launch | First sync after app launch |
-| 60 | Wake | Sync after system wake from sleep |
-| 40 | Interval | Scheduled forward sync |
-| 20 | Backfill | Background history backfill |
-
-### Error Handling & Backoff
-
-```
-consecutiveErrors = 0 → next sync at normal interval
-consecutiveErrors = 1 → wait 60s
-consecutiveErrors = 2 → wait 5 min
-consecutiveErrors = 3 → wait 30 min
-consecutiveErrors ≥ 4 → wait 2 hr (cap)
-```
-
-Auth errors (`needsReauth`) stop scheduling entirely until the user re-authenticates.
-
-### Lifecycle Events
-
-| Event | Action |
-|-------|--------|
-| App launch | Queue forward sync for all enabled connectors (priority 80) |
-| System wake | Queue forward sync for all enabled connectors (priority 60) |
-| Interval tick | Check which connectors are due, queue at priority 40/20 |
-| Manual trigger | Queue specific connector at priority 100 |
-| Auth error | Mark `needsReauth`, stop scheduling this connector |
-| App quit | Abort running syncs, save state |
-
-### Event System
-
-The scheduler emits events for UI updates:
-
-```typescript
-type SchedulerEvent =
- | { type: 'sync-start'; connectorId: string }
- | { type: 'sync-progress'; progress: SyncProgress }
- | { type: 'sync-complete'; result: ConnectorSyncResult }
- | { type: 'sync-error'; connectorId: string; code: SyncErrorCode; message: string }
-```
-
----
-
-## Connector Plugin System
-
-Connectors are distributed as npm packages and installed to a local directory. The app discovers and loads them at startup. Packages can be authored by anyone — the Spool team ships first-party connectors under the `@spool-lab/*` npm scope, and community authors can publish under any name they choose.
-
-### Package Convention
-
-A connector package is identified by a `spool` manifest field in its `package.json`, **not** by its npm name. Any npm package can declare itself a connector by adding this field:
-
-```json
-{
- "name": "@spool-lab/connector-twitter-bookmarks",
- "version": "1.0.0",
- "main": "dist/index.js",
- "keywords": ["spool-connector"],
- "spool": {
- "type": "connector",
- "id": "twitter-bookmarks",
- "platform": "twitter",
- "label": "X Bookmarks",
- "description": "Your saved tweets on X",
- "color": "#1DA1F2",
- "ephemeral": false
- }
-}
-```
-
-- `spool.type` must be `"connector"` (reserved for future non-connector Spool plugin types)
-- `id` / `platform` / `label` / `description` / `color` / `ephemeral` must match the corresponding fields on the `Connector` interface implementation exported from the package
-- `@spool-lab/` is the scope reserved for first-party packages; any other scope (or unscoped name) is a community package
-- `keywords` must include `"spool-connector"` — this is how spool.pro discovers the package on npm (see "Discovery on spool.pro" below). The keyword is discovery metadata only; the authoritative identification at runtime is still the `spool.type` manifest field
-
-The manifest lets the app read connector metadata (for the directory page, install UI, etc.) without loading the module — and lets the app decide whether to trust the package before running any of its code.
-
-### Trust model
-
-Because connector code runs with file-system and network access, the app distinguishes two trust tiers:
-
-| Tier | Rule | Default behavior |
-|---|---|---|
-| **First-party** | npm scope is `@spool-lab/*` and the package is also listed in Spool's bundled official-connector allow-list | Loaded automatically on startup |
-| **Community** | Any other package that has `spool.type === 'connector'` | Requires explicit user approval at first load, then cached in `~/.spool/config.json` |
-
-On first load of a community connector, Spool shows a consent dialog listing the capabilities the package has declared (see "Capability model" below) and the npm name + version. The user's answer is persisted — subsequent launches load it without re-prompting. The user can revoke trust at any time from Settings, which removes the consent record and disables the connector.
-
-This model keeps `@spool-lab/*` fast-path while still allowing a real community ecosystem. It is **not** a sandbox — a connector the user has trusted can still read files and make network requests. The consent gate is a warning, not a prison.
-
-> **Spec status:** the trust model is specified at the level of the consent flow and allow-list. Capability enforcement is specified below but not yet implemented. Worker-thread isolation is an optional hardening step reserved for a later phase.
-
-### Capability model
-
-> **Spec status: placeholder.** The detailed capability API is under design and will ship with the plugin loader. This section describes the intended shape so third-party authors can plan accordingly.
-
-A connector does not `import 'node:fs'` or `import 'node:http'` directly. Instead, the SDK exposes a constrained set of capabilities that the framework injects into the connector at construction time:
-
-- `fetch(url, init)` — HTTP fetch routed through Spool's network layer (proxy-aware, respects offline/online state). Equivalent to `globalThis.fetch` in shape
-- `cookies` — scoped Chrome/browser cookie reader for connectors that need cookie-based auth (subject to user consent for the specific browser profile)
-- `log` — structured logger that attributes log lines to the connector
-
-> `storage` is reserved for a future SDK v1.1 extension; v1 connectors manage their own state via the engine's `SyncState`.
-
-Any capability a connector uses must be declared in the `spool.capabilities` array in `package.json`:
-
-```json
-{
- "spool": {
- "type": "connector",
- "capabilities": ["fetch", "cookies:chrome"]
- }
-}
-```
-
-The consent dialog shown to users on first load lists these capabilities in plain language ("This connector will make network requests and read your Chrome cookies"). A connector that tries to use an undeclared capability at runtime is terminated with a `CONNECTOR_ERROR` and surfaced to the user.
-
-The v1 capability set is `"fetch" | "cookies:chrome" | "log"`. These names, signatures, and consent strings are frozen as part of the SDK v1 release. Until then this section is a design target, not a contract.
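-
-A sketch of the intended injection shape (design target only; `ChromeCookieReader` and the constructor contract here are illustrative, not frozen API):
-
-```typescript
-type ChromeCookieReader = (domain: string, names: string[]) => Promise<Record<string, string>> // hypothetical shape
-
-interface Capabilities {
-  fetch: typeof globalThis.fetch // 'fetch'
-  cookies?: ChromeCookieReader   // 'cookies:chrome' (present only when declared)
-  log: (msg: string, extra?: Record<string, unknown>) => void // 'log'
-}
-
-export default class MyConnector {
-  constructor(private caps: Capabilities) {}
-  // connector methods use this.caps.fetch(url, { signal }) instead of global fetch
-}
-```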
-
-### Discovery on spool.pro
-
-spool.pro is **not an independent registry**. It is a discovery and curation front-end over npm — similar in shape to how the VS Code Marketplace front-ends npm packages, or how Homebrew Cask front-ends upstream releases. Every package shown on spool.pro must exist on the public npm registry; every install button ultimately runs `npm install <package>` locally in the user's app.
-
-**Discovery mechanism: npm `spool-connector` keyword.** Connector authors add `"spool-connector"` to the `keywords` array in their `package.json`. spool.pro's backend periodically queries the npm registry:
-
-```
-GET https://registry.npmjs.org/-/v1/search?text=keywords:spool-connector&size=250
-```
-
-For each candidate returned, spool.pro fetches the package's `package.json`, cross-validates that `spool.type === 'connector'`, and indexes the `spool.*` manifest fields (label, description, color, capabilities) plus npm metadata (version, author, download count, last-published date).
-
-The keyword is **discovery metadata only**. It is not load-bearing for runtime identification — the Spool app's loader identifies connectors by the `spool.type` manifest field, not by keyword. A package with the keyword but without `spool.type === 'connector'` is rejected by the loader even if it made it into spool.pro's index. A package with `spool.type === 'connector'` but without the keyword will work fine at runtime if a user manually installs it; it just won't be discoverable through spool.pro.
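-
-A sketch of that backend pass (the search endpoint is the one shown above; the `/latest` packument lookup and `indexConnector` are assumptions):
-
-```typescript
-const search = await fetch('https://registry.npmjs.org/-/v1/search?text=keywords:spool-connector&size=250')
-const { objects } = await search.json()
-
-for (const { package: pkg } of objects) {
-  // The keyword is discovery-only; the manifest is authoritative, so cross-validate it.
-  const manifest = await (await fetch(`https://registry.npmjs.org/${pkg.name}/latest`)).json()
-  if (manifest.spool?.type !== 'connector') continue // keyword present, manifest missing: reject
-  indexConnector(pkg.name, manifest.spool) // hypothetical indexer
-}
-```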
-
-**Trust tiers on spool.pro cards:**
-
-| Badge | Criteria | Install UX |
-|---|---|---|
-| **Official** | Package name is under `@spool-lab/*` scope | Auto-loaded by the Spool app without consent |
-| **Community** | Any other package with the `spool-connector` keyword and valid `spool.type` manifest | Requires user consent on first load |
-
-Every card shows:
-- Package name, version, author (from npm metadata)
-- Label, description, color (from `spool.*` manifest)
-- Declared capabilities, translated to plain language ("Makes network requests · Reads Chrome cookies")
-- Download count, last-published date
-- Trust badge
-- **"Install in Spool" button** → generates a `spool://connector/install/` deep link (see "Deep-link install flow" below)
-
-**What spool.pro's MVP does NOT do:**
-
-- No editorial curation beyond the `@spool-lab/*` scope auto-tag. Any community package with the keyword shows up immediately in the directory, ranked by download count.
-- No submission form. Authors publish to npm the normal way.
-- No automated testing or sandboxing of candidate packages.
-- No takedown mechanism beyond npm's own registry moderation. If a malicious package slips through, spool.pro relies on npm removing it (unpublishing is still a global action on the registry).
-
-**Future curation layer (not Stage E MVP, explicitly deferred):**
-
-- `featured.json` maintained in a public GitHub repo (e.g. `spool-lab/connector-directory`), listing hand-picked community connectors
-- "Featured" badge for packages in that file
-- "Verified" badge for packages that have passed some stability threshold (download count, months since first publish, no unresolved issues) — criteria to be defined
-- Re-ranking logic that pushes Featured and Verified above raw community packages
-
-The MVP ships with only Official vs Community. The curation layer is a second iteration once there are enough community packages to make curation worthwhile.
-
-### Installation & Discovery
-
-**Install location:** `~/.spool/connectors/`
-
-```
-~/.spool/connectors/
-├── node_modules/
-│ ├── @spool-lab/
-│ │ └── connector-twitter-bookmarks/ # shipped with the app, first-run extracted
-│ ├── some-community-scope/
-│ │ └── my-custom-connector/
-│ └── unscoped-connector-package/
-└── package.json # auto-managed, tracks installed connectors
-```
-
-**Install sources — all paths go through the same dynamic loader:**
-
-| Source | How | Backend |
-|---|---|---|
-| **First-run bundle** | The app ships `@spool-lab/connector-*` npm tarballs inside its resource directory. On first launch, if `~/.spool/connectors/` is empty, the app extracts them into place. | File copy |
-| **Deep link from spool.pro** | spool.pro's connector directory buttons open `spool://connector/install/<package>`, which the app handles by running the install flow for that package | `npm install` |
-| **Manual paste** | Settings → Install Connector → user pastes an npm package name | `npm install` |
-| **Local development** | `spool connector install --from ./path/to/local/package` CLI flag for connector authors developing a new plugin | `npm install <path>` |
-
-There is **no separate "built-in" code path**. Every connector the app loads — including first-party ones the Spool team maintains — goes through the same `~/.spool/connectors/` directory and the same dynamic loader. This is a deliberate choice: it means the first-party code is the first and most-tested consumer of the SDK, any capability a first-party connector needs is also available to community authors, and the plugin loading path is exercised from every launch of the app (not just once after the first community install).
-
-**Install flow for user-initiated installs:**
-1. User clicks "Install" on spool.pro directory, or pastes an npm package name into the app's Settings → Install Connector field
-2. App resolves the source (deep link or direct input) and runs `npm install <package>` in `~/.spool/connectors/`
-3. App scans every installed package for a `spool` manifest field
-4. For community packages not yet trusted, the app prompts for consent (see "Trust model" above)
-5. Each trusted package's default export is instantiated and registered with `ConnectorRegistry`
-
-**Discovery at startup:**
-```typescript
-// Pseudocode — real loader lives in packages/core/src/connectors/loader.ts
-async function loadConnectors(registry: ConnectorRegistry, trust: TrustStore) {
- const connectorsDir = path.join(homedir(), '.spool', 'connectors')
-
- // First-run bootstrap: extract bundled first-party connectors if the
- // user's connectors directory is empty.
- await extractBundledConnectorsIfNeeded(connectorsDir)
-
- if (!existsSync(path.join(connectorsDir, 'package.json'))) return
-
- // Walk every installed package — not just those with a known name prefix.
- for (const pkgDir of walkNodeModules(path.join(connectorsDir, 'node_modules'))) {
- const pkgJson = readPackageJson(pkgDir)
- if (pkgJson?.spool?.type !== 'connector') continue
-
- if (!trust.isAllowed(pkgJson.name)) {
- // Community package not yet approved — surface in UI, skip loading.
- trust.recordPending(pkgJson)
- continue
- }
-
- try {
- const mod = await import(pkgDir)
- const ConnectorClass = mod.default ?? mod
- const connector = new ConnectorClass(/* capabilities injected here */)
- registry.register(connector)
- } catch (err) {
- // Crash isolation: a broken connector must not take down the loader.
- log.error(`failed to load ${pkgJson.name}: ${err}`)
- }
- }
-}
-```
-
-The loader treats every package as untrusted by default and only loads those in the trust store. First-party packages shipped with the app are added to the trust store automatically as part of the bundle-extraction step. Load failures are isolated so one bad connector cannot prevent the others from registering.
-
-**Uninstall:** `npm uninstall <package>` in `~/.spool/connectors/`, then remove the connector's sync state and captures from the DB. The next launch will re-extract first-party bundles if the user has removed them, unless they explicitly set a "do not restore" flag.
-
-### Deep-link install flow
-
-spool.pro's connector directory page has an "Install in Spool" button next to each listed package. Clicking it opens a `spool://connector/install/<package>` URL. The Spool app registers itself as the handler for the `spool://` protocol on install.
-
-```
-https://spool.pro/connectors
- │
- │ user clicks "Install" on @spool-lab/connector-github-stars
- ▼
-spool://connector/install/@spool-lab/connector-github-stars
- │
- │ OS hands off to Spool (custom protocol handler)
- ▼
-App receives the deep link, shows a confirmation dialog:
- "Install @spool-lab/connector-github-stars from npm?"
- │
- │ user confirms
- ▼
-App runs `npm install @spool-lab/connector-github-stars` in ~/.spool/connectors/
- │
- ▼
-Loader picks it up, consent prompt if community, registers with ConnectorRegistry
-```
-
-Deep-link handling uses Electron's `app.setAsDefaultProtocolClient('spool')` in main, the `open-url` event on macOS, and command-line argument parsing on Windows/Linux. The `spool://` scheme is reserved for Spool's own use — any query parameters or additional paths are treated as opaque and validated server-side against the expected shape (`install/`, `open/`, etc.).
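-
-A sketch of the macOS half of that wiring (the Electron APIs are real; `confirmAndInstall` stands in for the confirmation dialog plus install flow):
-
-```typescript
-import { app } from 'electron'
-
-app.setAsDefaultProtocolClient('spool')
-
-app.on('open-url', (event, url) => {
-  event.preventDefault()
-  const match = url.match(/^spool:\/\/connector\/install\/(.+)$/)
-  if (match) confirmAndInstall(decodeURIComponent(match[1])) // always prompts; never auto-installs
-})
-```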
-
-**Security note:** deep-link triggers do **not** auto-install. Every install, regardless of source, shows the user a confirmation dialog with the package name and (for community packages) the declared capabilities. A malicious link cannot silently push code onto a user's machine.
-
----
-
-## DB Schema
-
-### `connector_sync_state` — per-connector sync progress
-
-```sql
-CREATE TABLE IF NOT EXISTS connector_sync_state (
- connector_id TEXT PRIMARY KEY,
- head_cursor TEXT,
- head_item_id TEXT,
- tail_cursor TEXT,
- tail_complete INTEGER NOT NULL DEFAULT 0,
- last_forward_sync_at TEXT,
- last_backfill_sync_at TEXT,
- total_synced INTEGER NOT NULL DEFAULT 0,
- consecutive_errors INTEGER NOT NULL DEFAULT 0,
- enabled INTEGER NOT NULL DEFAULT 1,
- config_json TEXT NOT NULL DEFAULT '{}',
- last_error_code TEXT,
- last_error_message TEXT
-);
-```
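-
-How a forward (head) pass might persist its frontier against this table, sketched with better-sqlite3; the cursor and item values are placeholders, not the real `SyncEngine` code:
-
-```typescript
-import Database from 'better-sqlite3'
-import { join } from 'node:path'
-import { homedir } from 'node:os'
-
-const db = new Database(join(homedir(), '.spool', 'spool.db'))
-
-// After a successful forward pass: advance the head frontier, clear the error streak.
-db.prepare(`
-  INSERT INTO connector_sync_state (connector_id, head_cursor, head_item_id, last_forward_sync_at)
-  VALUES (?, ?, ?, datetime('now'))
-  ON CONFLICT(connector_id) DO UPDATE SET
-    head_cursor          = excluded.head_cursor,
-    head_item_id         = excluded.head_item_id,
-    last_forward_sync_at = excluded.last_forward_sync_at,
-    consecutive_errors   = 0
-`).run('twitter-bookmarks', 'cursor-abc123', 'item-789')
-```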
-
-### `captures` — all connector items
-
-```sql
-CREATE TABLE IF NOT EXISTS captures (
- id INTEGER PRIMARY KEY,
- source_id INTEGER NOT NULL REFERENCES sources(id),
- capture_uuid TEXT NOT NULL UNIQUE,
- url TEXT NOT NULL,
- title TEXT NOT NULL DEFAULT '',
- content_text TEXT NOT NULL DEFAULT '',
- author TEXT,
- platform TEXT NOT NULL,
- platform_id TEXT,
- content_type TEXT NOT NULL DEFAULT 'page',
- thumbnail_url TEXT,
- metadata TEXT NOT NULL DEFAULT '{}',
- captured_at TEXT NOT NULL,
- indexed_at TEXT NOT NULL DEFAULT (datetime('now')),
- raw_json TEXT
-);
-
--- Deduplication: platform + platform_id
--- FTS: captures_fts virtual table on (title, content_text)
-```
-
-Note: The legacy `opencli_sources` and `opencli_setup` tables are removed. All connector state lives in `connector_sync_state`. The `captures.opencli_src_id` column is dropped — connector association is via `json_extract(metadata, '$.connectorId')`.
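-
-Both conventions in one sketch, reusing the `db` handle from the sketch above; the upsert assumes the unique index the dedup comment implies:
-
-```typescript
-// Assumed: CREATE UNIQUE INDEX idx_captures_platform ON captures(platform, platform_id)
-const upsert = db.prepare(`
-  INSERT INTO captures (source_id, capture_uuid, url, title, content_text,
-                        platform, platform_id, metadata, captured_at)
-  VALUES (@sourceId, @uuid, @url, @title, @text,
-          @platform, @platformId, @metadata, @capturedAt)
-  ON CONFLICT(platform, platform_id) DO UPDATE SET
-    title        = excluded.title,
-    content_text = excluded.content_text
-`)
-
-// Connector association without a dedicated column:
-const captureCount = db.prepare(`
-  SELECT COUNT(*) AS n FROM captures
-  WHERE json_extract(metadata, '$.connectorId') = ?
-`).get('twitter-bookmarks') as { n: number }
-```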
-
----
-
-## Integration Points
-
-### How a Connector Fits into the Framework
-
-```
-┌──────────────────────────────────────────────────────────┐
-│ spool.pro │
-│ Connector Directory Page │
-│ (curated listing of first-party + community) │
-└───────────────────────┬──────────────────────────────────┘
- │ npm install
- ▼
-┌──────────────────────────────────────────────────────────┐
-│ ~/.spool/connectors/ │
-│ node_modules/**/package.json with `spool.type` │
-└───────────────────────┬──────────────────────────────────┘
- │ trust check → dynamic import()
- ▼
-┌──────────────────────────────────────────────────────────┐
-│ ConnectorRegistry │
-│ register() / list() / get() / has() │
-└───────────────────────┬──────────────────────────────────┘
- │
- ┌─────────┴─────────┐
- ▼ ▼
-┌─────────────────────┐ ┌──────────────────────┐
-│ SyncEngine │ │ SyncScheduler │
-│ (runs sync cycles) │ │ (decides WHEN) │
-│ dual-frontier │ │ priority queue │
-│ upsert to DB │ │ error backoff │
-└─────────┬───────────┘ └──────────┬───────────┘
- │ │
- ▼ ▼
-┌──────────────────────────────────────────────────────────┐
-│ SQLite Database │
-│ captures + captures_fts + connector_sync_state │
-└──────────────────────────────────────────────────────────┘
- │
- ▼
-┌──────────────────────────────────────────────────────────┐
-│ Electron IPC / CLI │
-│ connector:list / connector:sync-now / etc. │
-└───────────────────────┬──────────────────────────────────┘
- │
- ▼
-┌──────────────────────────────────────────────────────────┐
-│ Renderer UI │
-│ SourcesPanel / SettingsPanel / StatusBar │
-└──────────────────────────────────────────────────────────┘
-```
-
-### Electron IPC
-
-| Channel | Input | Output |
-|---------|-------|--------|
-| `connector:list` | — | `ConnectorStatus[]` |
-| `connector:check-auth` | `{ id }` | `AuthStatus` |
-| `connector:sync-now` | `{ id }` | `{ ok: boolean }` |
-| `connector:get-status` | — | `SchedulerStatus` |
-| `connector:set-enabled` | `{ id, enabled }` | `{ ok: boolean }` |
-| `connector:get-capture-count` | `{ connectorId }` | `number` |
-| `connector:install` | `{ packageName }` | `{ ok: boolean }` |
-| `connector:uninstall` | `{ packageName }` | `{ ok: boolean }` |
-
-Event channel: `connector:event` broadcasts `SchedulerEvent` to renderer.
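-
-Main-process wiring for one of these channels, as a sketch: `registry` and `scheduler` are the framework singletons, and `scheduler.syncNow` / `scheduler.onEvent` are assumed method names, not confirmed API:
-
-```typescript
-import { ipcMain, BrowserWindow } from 'electron'
-import type { SchedulerEvent } from '@spool/core'
-
-ipcMain.handle('connector:sync-now', async (_event, { id }: { id: string }) => {
-  if (!registry.has(id)) return { ok: false }
-  await scheduler.syncNow(id)
-  return { ok: true }
-})
-
-// The single event channel fans scheduler events out to every window.
-scheduler.onEvent((event: SchedulerEvent) => {
-  for (const win of BrowserWindow.getAllWindows()) {
-    win.webContents.send('connector:event', event)
-  }
-})
-```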
-
-### Preload API
-
-```typescript
-window.spool.connectors = {
- list(): Promise<ConnectorStatus[]>
- checkAuth(id: string): Promise<AuthStatus>
- syncNow(id: string): Promise<{ ok: boolean }>
- setEnabled(id: string, enabled: boolean): Promise<{ ok: boolean }>
- getStatus(): Promise<SchedulerStatus>
- getCaptureCount(connectorId: string): Promise<number>
- install(packageName: string): Promise<{ ok: boolean }>
- uninstall(packageName: string): Promise<{ ok: boolean }>
- onEvent(callback: (event: SchedulerEvent) => void): () => void
-}
-```
-
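-Renderer-side usage, assuming the bridge above is exposed by the preload script:
-
-```typescript
-// Subscribe before triggering work so no events are missed.
-const unsubscribe = window.spool.connectors.onEvent((event) => {
-  console.log('scheduler event:', event)
-})
-
-await window.spool.connectors.syncNow('twitter-bookmarks')
-const count = await window.spool.connectors.getCaptureCount('twitter-bookmarks')
-console.log(`twitter-bookmarks: ${count} captures`)
-
-unsubscribe() // onEvent returns its own cleanup function
-```
-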
-### CLI
-
-```bash
-spool connector list # list installed connectors + status
-spool connector sync [connector-id] # sync one or all connectors
-spool connector sync --reset [connector-id] # wipe state and re-sync from scratch
-spool connector install <package>           # install a connector from npm
-spool connector uninstall <package>         # remove a connector
-```
-
----
-
-## File Structure
-
-```
-packages/core/src/connectors/ # Framework — NOT any individual connector
-├── types.ts # Connector, AuthStatus, PageResult, SyncState, errors
-├── registry.ts # ConnectorRegistry
-├── sync-engine.ts # SyncEngine (dual-frontier logic)
-├── sync-scheduler.ts # SyncScheduler (timing, orchestration)
-└── loader.ts # Plugin discovery & dynamic loading
-
-packages/connectors/ # First-party plugin workspace container
-├── twitter-bookmarks/ # → @spool-lab/connector-twitter-bookmarks on npm
-│ ├── package.json # with `spool.type: 'connector'` + keywords
-│ ├── src/
-│ │ ├── index.ts # TwitterBookmarksConnector (default export)
-│ │ ├── chrome-cookies.ts # uses injected cookies:chrome capability
-│ │ └── graphql-fetch.ts # uses injected fetch capability
-│ └── dist/ # built output, packed as tarball for first-run
-├── typeless/ # → @spool-lab/connector-typeless on npm
-│ ├── package.json
-│ └── src/
-│ ├── index.ts # TypelessConnector (default export)
-│ └── db-reader.ts # reads ~/Library/.../typeless.db
-└── hackernews/ # → @spool-lab/connector-hackernews on npm
- └── ... # future; follows the same shape
-
-~/.spool/connectors/ # User-visible connector install directory
-├── package.json
-└── node_modules/
- ├── @spool-lab/connector-*/ # First-party (bundled with app, auto-trusted)
- └── <community-package>/ # Community (trusted after user consent)
-```
-
-The framework code lives in `packages/core/src/connectors/`. **No connector implementation lives there** — every first-party connector (Twitter Bookmarks, Typeless, Hacker News, …) has its own workspace package under `packages/connectors/`, is built into an npm tarball, and is loaded through the same dynamic-import path as community connectors. This keeps the SDK honest: if the framework ever needs a feature to support one of these, that feature has to be exposed on the SDK surface, not hidden in the core package.
-
-**Workspace layout.** `pnpm-workspace.yaml` declares both levels so pnpm treats every directory under `packages/connectors/` as an independent workspace package:
-
-```yaml
-packages:
- - 'packages/*'
- - 'packages/connectors/*'
-```
-
-**Naming convention for first-party plugins:**
-- Directory: `packages/connectors/<name>/`
-- npm package: `@spool-lab/connector-<name>`
-- `spool.id`: `<name>` (may include a sub-scope like `twitter-bookmarks`)
-- `spool.platform`: the underlying platform, not the connector (e.g. `twitter` for `twitter-bookmarks`)
-
-Community plugins do **not** live in this monorepo. They live in their own repositories and publish to npm independently using any package name their authors choose. The `packages/connectors/` directory is reserved for first-party plugins that the Spool team maintains and ships with the app as first-run bundles.
-
----
-
-## Writing a Connector
-
-A minimal connector implementation:
-
-```typescript
-import type { Connector, AuthStatus, PageResult, FetchContext } from '@spool/core'
-
-export default class MyConnector implements Connector {
- readonly id = 'my-platform-bookmarks'
- readonly platform = 'my-platform'
- readonly label = 'My Platform Bookmarks'
- readonly description = 'Your saved items on My Platform'
- readonly color = '#FF6600'
- readonly ephemeral = false
-
- async checkAuth(): Promise<AuthStatus> {
- // Check if credentials/cookies are available
- return { ok: true }
- }
-
- async fetchPage({ cursor }: FetchContext): Promise<PageResult> {
- // Fetch one page of data from the platform API.
- // sinceItemId and phase are available in FetchContext if your platform
- // supports server-side "since" filtering — most connectors can ignore them
- // and just use cursor. The engine handles early-exit on its own.
- const response = await fetchFromAPI(cursor)
- return {
- items: response.items.map(item => ({
- url: item.url,
- title: item.title,
- contentText: item.body,
- author: item.author,
- platform: this.platform,
- platformId: item.id,
- contentType: 'post',
- thumbnailUrl: null,
- metadata: { /* platform-specific data */ },
- capturedAt: item.createdAt,
- rawJson: JSON.stringify(item),
- })),
- nextCursor: response.nextPage ?? null,
- }
- }
-}
-```
-
-Package it as `@your-scope/connector-my-platform-bookmarks` (or any npm name) with the `spool` manifest in `package.json`, publish to npm, and users can install it from the app's Settings → Install Connector field or from the spool.pro directory. The minimum `package.json` shape is:
-
-```json
-{
- "name": "@your-scope/connector-my-platform-bookmarks",
- "version": "0.1.0",
- "main": "dist/index.js",
- "keywords": ["spool-connector"],
- "peerDependencies": {
- "@spool-lab/connector-sdk": "^1.0.0"
- },
- "spool": {
- "type": "connector",
- "id": "my-platform-bookmarks",
- "platform": "my-platform",
- "label": "My Platform Bookmarks",
- "description": "Your saved items on My Platform",
- "color": "#FF6600",
- "ephemeral": false,
- "capabilities": ["fetch", "log"]
- }
-}
-```
-
-- `keywords: ["spool-connector"]` is how spool.pro's backend discovers your package on npm
-- `spool.type: "connector"` is how the Spool app's loader identifies your package at runtime
-- `spool.capabilities` declares which SDK-injected capabilities your connector needs — this list is shown to users in the first-load consent dialog
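-
-A sketch of how the directory backend can find candidates through npm's public search API (curation happens on top of this):
-
-```typescript
-// npm's registry search supports keyword queries directly.
-const res = await fetch(
-  'https://registry.npmjs.org/-/v1/search?text=keywords:spool-connector&size=250'
-)
-const { objects } = (await res.json()) as {
-  objects: Array<{ package: { name: string; version: string } }>
-}
-const candidates = objects.map((o) => o.package.name)
-```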
-
-### Useful SDK exports
-
-**`abortableSleep(ms, signal)`** — use this inside any retry/backoff loop in `fetchPage`. Unlike plain `setTimeout`, it rejects with the signal's reason when the engine cancels, so `scheduler.stop()` takes effect within one event-loop tick.
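-
-For example, a backoff loop inside `fetchPage`, sketched assuming the engine's `AbortSignal` is reachable from the connector (e.g. via `FetchContext`):
-
-```typescript
-import { abortableSleep } from '@spool-lab/connector-sdk'
-
-async function fetchWithRetry(url: string, signal: AbortSignal): Promise<Response> {
-  for (let attempt = 0; ; attempt++) {
-    try {
-      return await fetch(url, { signal })
-    } catch (err) {
-      if (signal.aborted || attempt >= 2) throw err
-      // Rejects with the signal's reason the moment the engine cancels.
-      await abortableSleep(1000 * 2 ** attempt, signal)
-    }
-  }
-}
-```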
-
-### Local source connectors
-
-Not every connector fetches data over the network. A connector that reads a local SQLite database, a directory of markdown files, or another app's export file implements exactly the same `Connector` interface — the framework does not distinguish "remote" from "local" sources.
-
-The technique for making a local source look like a paginated stream is to **synthesize a cursor from a natural ordering** in the data. For a table with a `created_at` column:
-
-```typescript
-async fetchPage({ cursor }: FetchContext): Promise<PageResult> {
- // cursor is the created_at of the last row on the previous page, or null
- // for the first page. Query for 25 rows strictly older than it.
- const db = openMyLocalDb()
- try {
- const rows = queryRows(db, { before: cursor, limit: 25 })
- const items = rows.map(rowToCapturedItem)
- const nextCursor = rows.length === 25
- ? rows[rows.length - 1].created_at
- : null
- return { items, nextCursor }
- } finally {
- db.close()
- }
-}
-```
-
-`checkAuth()` for a local source is typically "is the file readable?":
-
-```typescript
-async checkAuth(): Promise<AuthStatus> {
- try {
- const db = openMyLocalDb()
- db.close()
- return { ok: true }
- } catch (err) {
- return {
- ok: false,
- error: SyncErrorCode.CONNECTOR_ERROR,
- message: err instanceof Error ? err.message : String(err),
- hint: 'MyApp not found. Install MyApp, create at least one entry, then retry.',
- }
- }
-}
-```
-
-Notes for local connectors:
-
-- The dual-frontier model (forward + backfill) still applies: forward finds items added since the last sync, backfill walks history. With a stable local ordering, "forward" converges after the first cycle and subsequent syncs just pick up deltas.
-- Page delay (`pageDelayMs`) defaults are tuned for remote API rate limits. A local connector can pass `pageDelayMs: 0` via its constructor config if the default 1200ms is wasteful (see the sketch after this list).
-- Error codes like `API_RATE_LIMITED` or `NETWORK_OFFLINE` don't apply. Use `CONNECTOR_ERROR` with a descriptive `hint` for local-specific failures (file missing, database locked, parse failure).
-- `checkAuth()`'s name is legacy — semantically it means "is the source usable right now?" The framework treats any non-`ok` answer the same way.
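-
-The `pageDelayMs` override, sketched with an assumed config shape; the exact plumbing is whatever the connector's constructor accepts:
-
-```typescript
-import type { Connector, AuthStatus, PageResult, FetchContext } from '@spool/core'
-
-// Assumed shape: the engine reads pageDelayMs from the connector if present.
-export default class MyLocalConnector implements Connector {
-  readonly id = 'my-local-source'
-  readonly platform = 'my-local-source'
-  readonly label = 'My Local Source'
-  readonly description = 'Reads a local database'
-  readonly color = '#888888'
-  readonly ephemeral = false
-  readonly pageDelayMs: number
-
-  constructor(config?: { pageDelayMs?: number }) {
-    this.pageDelayMs = config?.pageDelayMs ?? 0 // local reads have no rate limit to respect
-  }
-
-  async checkAuth(): Promise<AuthStatus> {
-    return { ok: true } // file-readable check as shown earlier
-  }
-
-  async fetchPage(_ctx: FetchContext): Promise<PageResult> {
-    return { items: [], nextCursor: null } // cursor synthesis as shown earlier
-  }
-}
-```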
-
-### Future consideration: source-type taxonomy
-
-The current `Connector` interface is shaped around "paginated pull-based reads from a temporally-ordered source." That model covers:
-
-- Remote cursor-walking APIs (Twitter, GraphQL)
-- Remote `since`-parameterized APIs (GitHub, REST)
-- Local databases with a natural `ORDER BY created_at DESC` ordering
-- Local file directories where mtime serves as the ordering
-
-It does **not** naturally fit:
-
-- Push-based ingestion (filesystem watchers, IPC events from another process)
-- Non-temporal data (configs, static reference material)
-- Sources where the entire state must be re-read each time because no cursor exists (small local files, key-value stores)
-
-Spool currently handles push-based local-file ingestion (Claude Code sessions, Codex history) in a separate subsystem (`packages/core/src/sync/` — the `SpoolWatcher` + `Syncer`), not through the `Connector` framework. This split is intentional: forcing every integration into the paginated model would have produced awkward adapters for sources that don't have a natural pagination story.
-
-If in the future enough local or push-based connectors exist to warrant a unified abstraction, the framework may introduce a **source-type taxonomy** — something like `connector.kind: 'paginated' | 'snapshot' | 'watcher'` — with distinct interface shapes for each kind. This is deliberately **not** done yet because:
-
-1. The current interface has only two local samples (Typeless is a candidate community connector; Claude Code / Codex live outside the framework in `sync/`). Two samples are not enough to generalize a taxonomy correctly.
-2. A premature kind-based split would likely need to be revised once more local samples exist, which would be a breaking public-API change at exactly the wrong time (after community authors have started shipping against v1).
-3. The current interface **already works** for local sources via cursor synthesis — the awkwardness is in naming (`checkAuth` for a file-existence check) and default values (`pageDelayMs` for zero-latency reads), neither of which is a blocker.
-
-The shape of the eventual taxonomy will be decided when there is enough evidence to design it, not before. Until then, local-source authors should use the patterns shown above and accept the HTTP-shaped vocabulary of the current interface.
-
----
-
-## Removed Systems
-
-The following legacy systems have been fully removed in favor of the connector framework:
-
-- **OpenCLI integration** (`packages/core/src/opencli/`): Manager, parser, strategies, onboarding flow. OpenCLI was an external CLI tool that wrapped browser automation for 50+ platforms. Each platform that needs support is now implemented as a standalone connector.
-- **Capture URL** (`CaptureUrlModal.tsx`, Cmd+K): One-off URL fetching via `opencli web read`. Not part of the connector model.
-- **`opencli_sources` table**: Replaced by `connector_sync_state`.
-- **`opencli_setup` table**: No longer needed (no global CLI installation step).
-- **`opencli:*` IPC channels**: Replaced by `connector:*`.
-- **OnboardingFlow**: Each connector handles its own auth; no shared setup wizard.
diff --git a/docs/spool-positioning.md b/docs/spool-positioning.md
index e068ebb..151eab0 100644
--- a/docs/spool-positioning.md
+++ b/docs/spool-positioning.md
@@ -1,8 +1,8 @@
# Spool
-> **The missing search engine for your own data.**
+> **The missing search engine for your own AI sessions.**
-Search your `[Claude Code sessions · Codex history · Gemini chats · ChatGPT history · GitHub stars · Twitter bookmarks · YouTube likes]` — locally.
+Search your `[Claude Code sessions · Codex history · Gemini chats]` — locally.
---
@@ -10,16 +10,16 @@ Search your `[Claude Code sessions · Codex history · Gemini chats · ChatGPT h
### Your coding agent is already the best search engine you have.
-Spool lets Claude Code, Codex, Gemini CLI, and any coding agent search your personal data — past sessions, bookmarks, stars, saves — from a single search box.
-
-### Your bookmarks and stars, synced locally.
-
-Installable connector plugins sync your bookmarks, stars, and saves from Twitter/X, GitHub, and more — no API keys, no tokens. Spool indexes them all locally.
+Spool lets Claude Code, Codex, Gemini CLI, and any coding agent search your past sessions from a single search box.
### Every agent session, indexed automatically.
-Spool watches `~/.claude/`, `~/.codex/`, and Gemini CLI’s `~/.gemini/tmp/*/chats` in real time. Every conversation you have with Claude Code, Codex, or Gemini CLI — searchable the moment it's written.
+Spool watches `~/.claude/`, `~/.codex/`, and Gemini CLI's `~/.gemini/tmp/*/chats` in real time. Every conversation you have with Claude Code, Codex, or Gemini CLI — searchable the moment it's written.
### Context that flows back in.
-A `/spool` skill inside Claude Code. A `spool` CLI in your terminal. Ask your agent to "build on last month's auth discussion" and it actually can — Spool feeds matching fragments from your past sessions and personal data directly into the conversation.
+A `/spool` skill inside Claude Code. A `spool` CLI in your terminal. Ask your agent to "build on last month's auth discussion" and it actually can — Spool feeds matching fragments from your past sessions directly into the conversation.
+
+---
+
+Looking for connectors that sync platform data (Twitter, GitHub, Reddit, etc.)? Those have moved to **[Spool Daemon](https://spool.pro/daemon)**, a sibling app focused on capture sync.
diff --git a/packages/app/src/main/index.ts b/packages/app/src/main/index.ts
index 45c68f7..088f6e9 100644
--- a/packages/app/src/main/index.ts
+++ b/packages/app/src/main/index.ts
@@ -1,8 +1,8 @@
-import { app, BrowserWindow, dialog, ipcMain, Menu, nativeTheme, nativeImage } from 'electron'
+import { app, BrowserWindow, dialog, ipcMain, Menu, nativeTheme, nativeImage, shell } from 'electron'
import { join } from 'node:path'
import { Worker } from 'node:worker_threads'
import {
- getDB, Syncer, SpoolWatcher,
+ getDB, wasNewDb, getInitialUserVersion, Syncer, SpoolWatcher,
searchFragments, searchSessionPreview, listRecentSessions, getSessionWithMessages, getStatus,
starItem, unstarItem, listStarredItems, getStarredUuidsByType,
} from '@spool-lab/core'
@@ -13,7 +13,7 @@ import { setupAutoUpdater, downloadUpdate, quitAndInstall } from './updater.js'
import { openTerminal } from './terminal.js'
import { getSessionResumeCommand } from '../shared/resumeCommand.js'
import { resolveResumeWorkingDirectory } from './sessionResume.js'
-import { loadUIPreferences, saveThemeEditor, saveThemeSource } from './uiPreferences.js'
+import { loadUIPreferences, saveThemeEditor, saveThemeSource, saveSpoolDaemonNoticeShown } from './uiPreferences.js'
import type Database from 'better-sqlite3'
import type { SyncWorkerMessage } from './sync-worker.js'
@@ -417,6 +417,27 @@ ipcMain.handle('spool:ai-cancel', () => {
return { ok: true }
})
+// ── Spool Daemon notice ──────────────────────────────────────────────────
+
+ipcMain.handle('spool:get-daemon-notice-pending', (): boolean => {
+ // Only nudge users who actually upgraded from a pre-M5 schema. Fresh
+ // installs land directly at user_version=5 with no DB beforehand —
+ // nothing to apologize for, no notice needed.
+ if (uiPreferences.spoolDaemonNoticeShown) return false
+ if (wasNewDb()) return false
+ const initialVersion = getInitialUserVersion()
+ return initialVersion !== null && initialVersion < 5
+})
+
+ipcMain.handle('spool:daemon-notice-action', (_e, { action }: { action: 'install' | 'dismiss' }) => {
+ uiPreferences.spoolDaemonNoticeShown = true
+ saveSpoolDaemonNoticeShown()
+ if (action === 'install') {
+ void shell.openExternal('https://spool.pro/daemon')
+ }
+ return { ok: true }
+})
+
// ── Auto-update ──────────────────────────────────────────────────────────
ipcMain.handle('spool:download-update', () => {
diff --git a/packages/app/src/main/uiPreferences.ts b/packages/app/src/main/uiPreferences.ts
index b34bb55..8f91aa7 100644
--- a/packages/app/src/main/uiPreferences.ts
+++ b/packages/app/src/main/uiPreferences.ts
@@ -1,6 +1,6 @@
import { existsSync, mkdirSync, readFileSync, writeFileSync } from 'node:fs'
import { join } from 'node:path'
-import { homedir } from 'node:os'
+import { SPOOL_DIR } from '@spool-lab/core'
import {
normalizeThemeEditorState,
type ThemeEditorStateV1,
@@ -10,13 +10,15 @@ import {
interface UIConfigFile {
themeSource?: unknown
themeEditor?: unknown
+ spoolDaemonNoticeShown?: unknown
}
-const UI_CONFIG_PATH = join(homedir(), '.spool', 'ui.json')
+const UI_CONFIG_PATH = join(SPOOL_DIR, 'ui.json')
export interface UIPreferences {
themeSource: ThemeSource
themeEditor: ThemeEditorStateV1 | null
+ spoolDaemonNoticeShown: boolean
}
function normalizeThemeSource(raw: unknown): ThemeSource {
@@ -33,7 +35,7 @@ function readUIConfig(): UIConfigFile {
}
function writeUIConfig(config: UIConfigFile): void {
- mkdirSync(join(homedir(), '.spool'), { recursive: true })
+ mkdirSync(SPOOL_DIR, { recursive: true })
writeFileSync(UI_CONFIG_PATH, JSON.stringify(config, null, 2), 'utf8')
}
@@ -42,9 +44,15 @@ export function loadUIPreferences(): UIPreferences {
return {
themeSource: normalizeThemeSource(config.themeSource),
themeEditor: normalizeThemeEditorState(config.themeEditor),
+ spoolDaemonNoticeShown: config.spoolDaemonNoticeShown === true,
}
}
+export function saveSpoolDaemonNoticeShown(): void {
+ const config = readUIConfig()
+ writeUIConfig({ ...config, spoolDaemonNoticeShown: true })
+}
+
export function saveThemeSource(themeSource: ThemeSource): void {
const config = readUIConfig()
writeUIConfig({ ...config, themeSource })
diff --git a/packages/app/src/preload/index.ts b/packages/app/src/preload/index.ts
index d141620..e42e08c 100644
--- a/packages/app/src/preload/index.ts
+++ b/packages/app/src/preload/index.ts
@@ -141,6 +141,13 @@ const api = {
setThemeEditorState: (state: ThemeEditorStateV1): Promise<{ ok: boolean }> =>
ipcRenderer.invoke('spool:set-theme-editor-state', { state }),
+ // Spool Daemon notice
+ getDaemonNoticePending: (): Promise<boolean> =>
+ ipcRenderer.invoke('spool:get-daemon-notice-pending'),
+
+ daemonNoticeAction: (action: 'install' | 'dismiss'): Promise<{ ok: boolean }> =>
+ ipcRenderer.invoke('spool:daemon-notice-action', { action }),
+
// Auto-update
onUpdateStatus: (cb: (data: { status: 'available' | 'downloading' | 'ready' | 'error'; version?: string; percent?: number }) => void) => {
const handler = (_: Electron.IpcRendererEvent, data: unknown) => cb(data as { status: 'available' | 'downloading' | 'ready' | 'error'; version?: string; percent?: number })
diff --git a/packages/app/src/renderer/App.tsx b/packages/app/src/renderer/App.tsx
index 10cde7f..e763865 100644
--- a/packages/app/src/renderer/App.tsx
+++ b/packages/app/src/renderer/App.tsx
@@ -9,6 +9,7 @@ import StarredEntryButton from './components/StarredEntryButton.js'
import StatusBar from './components/StatusBar.js'
import AiAnswerCard from './components/AiAnswerCard.js'
import SettingsPanel from './components/SettingsPanel.js'
+import DaemonNoticeModal from './components/DaemonNoticeModal.js'
import { getSessionResumeCommandPrefix } from '../shared/resumeCommand.js'
import { DEFAULT_SEARCH_SORT_ORDER, type SearchSortOrder } from '../shared/searchSort.js'
import { defaultThemeEditorState, type ThemeEditorStateV1 } from './theme/editorTypes.js'
@@ -65,6 +66,7 @@ export default function App() {
// Settings & modals
const [showSettings, setShowSettings] = useState(false)
+ const [showDaemonNotice, setShowDaemonNotice] = useState(false)
const [settingsTab, setSettingsTab] = useState('general')
const [defaultSearchSort, setDefaultSearchSort] = useState(DEFAULT_SEARCH_SORT_ORDER)
const [resumeToastCommand, setResumeToastCommand] = useState<string | null>(null)
@@ -178,6 +180,13 @@ export default function App() {
})
}, [])
+ useEffect(() => {
+ if (!window.spool?.getDaemonNoticePending) return
+ window.spool.getDaemonNoticePending()
+ .then(pending => { if (pending) setShowDaemonNotice(true) })
+ .catch(console.error)
+ }, [])
+
useEffect(() => {
applyEditorTheme(themeEditor)
}, [themeEditor])
@@ -587,6 +596,10 @@ export default function App() {
onThemeEditorChange={setThemeEditor}
/>
)}
+
+ {showDaemonNotice && (
+ <DaemonNoticeModal onClose={() => setShowDaemonNotice(false)} />
+ )}
)
}
diff --git a/packages/app/src/renderer/assets/daemon-icon.png b/packages/app/src/renderer/assets/daemon-icon.png
new file mode 100644
index 0000000..a87f33f
Binary files /dev/null and b/packages/app/src/renderer/assets/daemon-icon.png differ
diff --git a/packages/app/src/renderer/components/DaemonNoticeModal.tsx b/packages/app/src/renderer/components/DaemonNoticeModal.tsx
new file mode 100644
index 0000000..c3a7a84
--- /dev/null
+++ b/packages/app/src/renderer/components/DaemonNoticeModal.tsx
@@ -0,0 +1,81 @@
+import { useState } from 'react'
+import daemonIconUrl from '../assets/daemon-icon.png'
+
+interface Props {
+ onClose: () => void
+}
+
+export default function DaemonNoticeModal({ onClose }: Props) {
+ const [busy, setBusy] = useState<'install' | 'dismiss' | null>(null)
+
+ const handleAction = async (action: 'install' | 'dismiss') => {
+ if (busy) return
+ setBusy(action)
+ try {
+ await window.spool.daemonNoticeAction(action)
+ } finally {
+ onClose()
+ }
+ }
+
+ return (
+   <div className="modal-overlay" role="dialog" aria-modal="true">
+     <div className="daemon-notice-modal">
+       <img src={daemonIconUrl} alt="Spool Daemon" width={56} height={56} />
+
+       <h2>Connectors moved to Spool Daemon</h2>
+
+       <p>
+         Spool now focuses on AI sessions. Twitter, GitHub, Reddit, Hacker News and other
+         platform connectors live in{' '}
+         <a href="https://spool.pro/daemon" target="_blank" rel="noreferrer">
+           Spool Daemon
+         </a>
+         , a sibling app. Synced platform data has been removed from Spool — install Daemon
+         to keep using connectors.
+       </p>
+
+       <div className="daemon-notice-actions">
+         {/* "install" opens spool.pro/daemon via the main process, then records the notice as shown */}
+         <button onClick={() => handleAction('install')} disabled={busy !== null}>
+           Get Spool Daemon
+         </button>
+         <button onClick={() => handleAction('dismiss')} disabled={busy !== null}>
+           Dismiss
+         </button>
+       </div>
+     </div>
+   </div>
+ )
+}
+
diff --git a/packages/core/README.md b/packages/core/README.md
index 95bcee0..1f99dff 100644
--- a/packages/core/README.md
+++ b/packages/core/README.md
@@ -1,8 +1,8 @@
# @spool-lab/core
-The engine behind [Spool](https://spool.pro) — a local search engine for your AI sessions and connected sources.
+The engine behind [Spool](https://spool.pro) — a local search engine for your AI sessions.
-This package provides the core runtime: session parsing, full-text search, the connector sync engine, and the SQLite database layer. It powers both the Spool desktop app and the `@spool-lab/cli`.
+This package provides the core runtime: session parsing, full-text search, and the SQLite database layer. It powers both the Spool desktop app and the `@spool-lab/cli`.
## Usage
@@ -26,9 +26,8 @@ syncer.syncAll()
- **Session parsers** — reads Claude Code, Codex, and Gemini CLI session files
- **Full-text search** — FTS5 with unicode + trigram indexes for CJK support
-- **Sync engine** — paginated connector sync with cursor-based state, backfill, and error recovery
-- **Connector loader** — discovers and loads connector plugins from `~/.spool/connectors/`
-- **Connector registry** — in-memory registry of available connectors
+- **Watcher** — incremental indexing as new session files arrive
+- **Stars** — pin sessions for quick recall
## Native dependency
diff --git a/packages/core/src/db/db.ts b/packages/core/src/db/db.ts
index b12ef45..c9dd87d 100644
--- a/packages/core/src/db/db.ts
+++ b/packages/core/src/db/db.ts
@@ -1,25 +1,39 @@
import Database from 'better-sqlite3'
import { homedir } from 'node:os'
import { join } from 'node:path'
-import { mkdirSync, statSync } from 'node:fs'
+import { existsSync, mkdirSync, statSync } from 'node:fs'
export const SPOOL_DIR = process.env['SPOOL_DATA_DIR'] ?? join(homedir(), '.spool')
export const DB_PATH = join(SPOOL_DIR, 'spool.db')
let _db: Database.Database | null = null
+let _wasNewDb = false
+let _initialUserVersion: number | null = null
export function getDB(_readonly = false): Database.Database {
if (_db) return _db
mkdirSync(SPOOL_DIR, { recursive: true })
+ // Capture pre-open state before better-sqlite3 creates the file. These two
+ // signals together let callers tell apart "fresh install" from "upgrade":
+ // - wasNewDb=true → DB file did not exist; this is a first-time install
+ // - wasNewDb=false → upgrade path, and initialUserVersion tells you from where
+ _wasNewDb = !existsSync(DB_PATH)
const db = new Database(DB_PATH)
db.pragma('journal_mode = WAL')
db.pragma('foreign_keys = ON')
db.pragma('busy_timeout = 5000')
+ _initialUserVersion = (db.pragma('user_version') as Array<{ user_version: number }>)[0]?.user_version ?? 0
runMigrations(db)
_db = db
return db
}
+/** True if the DB file did not exist before this process opened it. */
+export function wasNewDb(): boolean { return _wasNewDb }
+
+/** user_version of the DB before any migrations ran this process. Null if getDB() hasn't been called. */
+export function getInitialUserVersion(): number | null { return _initialUserVersion }
+
export function getDBSize(): number {
try {
return statSync(DB_PATH).size
diff --git a/packages/core/src/db/migration-v5.test.ts b/packages/core/src/db/migration-v5.test.ts
index e599a08..ea098e0 100644
--- a/packages/core/src/db/migration-v5.test.ts
+++ b/packages/core/src/db/migration-v5.test.ts
@@ -144,6 +144,10 @@ describe('migration v5 (connector subsystem removal)', () => {
const dbModule = await import('./db.js')
const db = dbModule.getDB()
+ // Upgrade-path detection: DB pre-existed and was on v4
+ expect(dbModule.wasNewDb()).toBe(false)
+ expect(dbModule.getInitialUserVersion()).toBe(4)
+
// user_version bumped to 5
expect((db.pragma('user_version') as Array<{ user_version: number }>)[0]?.user_version).toBe(5)
@@ -181,6 +185,10 @@ describe('migration v5 (connector subsystem removal)', () => {
const dbModule = await import('./db.js')
const db = dbModule.getDB()
+ // Upgrade-path detection: DB file did not exist before this run
+ expect(dbModule.wasNewDb()).toBe(true)
+ expect(dbModule.getInitialUserVersion()).toBe(0)
+
expect((db.pragma('user_version') as Array<{ user_version: number }>)[0]?.user_version).toBe(5)
// Stars exists with narrow CHECK