From 3ad9c2b569f5665ff2106a6a70772f73775589af Mon Sep 17 00:00:00 2001
From: Chen <99816898+donteatfriedrice@users.noreply.github.com>
Date: Tue, 28 Apr 2026 18:33:05 +0800
Subject: [PATCH] chore(m5): pre-release cleanup of stale references
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Final sweep before v0.4.0. Things found that the per-PR cleanup didn't
catch:

main/index.ts: drop the spool:// custom protocol registration. The URL
handler (parseSpoolUrl + handleSpoolUrl) was removed in PR2, leaving the
registration silently broken — clicking spool://… URLs would do nothing.
Cleaner to not register a scheme we don't handle.

landing/pages/docs/guides/data-sources.md: rewrite to be sessions-only;
replace the "Platform data (via connector plugins)" section with a
forward to Spool Daemon. The published doc currently tells users to run
`spool connectors install …`, which exits 1 on the shipped CLI.

landing/pages/docs/quick-start.md: drop step 3 ("Install connector
plugins"); replace with the actual CLI commands and a forward to Spool
Daemon at the end.

landing/pages/docs/reference/configuration.md: drop the
~/.spool/connectors/ row from the data-directory table; add ui.json in
its place.

landing/pages/docs/guides/agent-integration.md: drop "bookmarks, stars"
from the planned-features list; drop "connector data" from the
local-search-index description.

landing/pages/blog/hello-spool.md: rewrite the connector-plugin
paragraph as a forward to Spool Daemon for platform data.

Verified app still builds clean (electron-vite + tsc). Landing has no
build step in this monorepo (consumed via /daemon redirect + docs static
rendering by the landing's own deploy).
Co-Authored-By: Claude Opus 4.7 (1M context)
---
 packages/app/src/main/index.ts                |  8 -----
 packages/landing/pages/blog/hello-spool.md    |  4 +--
 .../pages/docs/guides/agent-integration.md    |  4 +--
 .../landing/pages/docs/guides/data-sources.md | 35 +++----------
 packages/landing/pages/docs/quick-start.md    | 17 +++++----
 .../pages/docs/reference/configuration.md     |  2 +-
 6 files changed, 20 insertions(+), 50 deletions(-)

diff --git a/packages/app/src/main/index.ts b/packages/app/src/main/index.ts
index 088f6e9..dbe899d 100644
--- a/packages/app/src/main/index.ts
+++ b/packages/app/src/main/index.ts
@@ -25,14 +25,6 @@ if (customUserDataDir) {
 // macOS menu bar shows the first menu's label as the app name
 app.setName(isDevMode ? 'Spool DEV' : 'Spool')
 
-if (process.defaultApp) {
-  if (process.argv.length >= 2) {
-    app.setAsDefaultProtocolClient('spool', process.execPath, [process.argv[1]!])
-  }
-} else {
-  app.setAsDefaultProtocolClient('spool')
-}
-
 const uiPreferences = loadUIPreferences()
 nativeTheme.themeSource = uiPreferences.themeSource
 let focusExistingWindow = () => {}
diff --git a/packages/landing/pages/blog/hello-spool.md b/packages/landing/pages/blog/hello-spool.md
index cdb6831..c3d8aa4 100644
--- a/packages/landing/pages/blog/hello-spool.md
+++ b/packages/landing/pages/blog/hello-spool.md
@@ -18,9 +18,9 @@ You can't grep through JSONL files and get useful results. You can't ask your ag
 
 ## How Spool works
 
-Spool watches your session directories in real time. Every conversation becomes searchable the moment it's written — no manual export, no copy-paste.
+Spool watches your session directories in real time. Every conversation becomes searchable the moment it's written — no manual export, no copy-paste. All local, all on your machine.
 
-It also indexes data from installable connector plugins: your GitHub stars, Twitter bookmarks, Reddit saves, and more. All local, all on your machine.
+For platform data — GitHub stars, Twitter bookmarks, Reddit saves — see [Spool Daemon](/daemon/), our sibling app focused on capture sync.
 
 ## What's next: agent-native search
 
diff --git a/packages/landing/pages/docs/guides/agent-integration.md b/packages/landing/pages/docs/guides/agent-integration.md
index 89f0d43..f35625a 100644
--- a/packages/landing/pages/docs/guides/agent-integration.md
+++ b/packages/landing/pages/docs/guides/agent-integration.md
@@ -7,7 +7,7 @@ description: Use Spool as a search backend for your AI coding agents.
 Agent integration is under active development. The `/spool` skill and standalone CLI are not yet functional — stay tuned.
 :::
 
-Spool is designed to work with AI coding agents. Once the integration is ready, your agent will be able to search your personal data — past sessions, bookmarks, stars — without leaving your workflow. Session indexing already supports Claude Code, Codex CLI, and Gemini CLI.
+Spool is designed to work with AI coding agents. Once the integration is ready, your agent will be able to search your past sessions without leaving your workflow. Session indexing already supports Claude Code, Codex CLI, and Gemini CLI.
 
 ## Planned: Claude Code skill
 
@@ -28,6 +28,6 @@ A standalone `spool` CLI is also under development, which will allow any agent o
 ## How it will work
 
 1. Your agent sends a search query to Spool
-2. Spool searches the local SQLite index (Claude sessions, Codex sessions, Gemini sessions, connector data)
+2. Spool searches the local SQLite index (Claude sessions, Codex sessions, Gemini sessions)
 3. Matching fragments are returned with source metadata
 4. Your agent incorporates the context into its response
diff --git a/packages/landing/pages/docs/guides/data-sources.md b/packages/landing/pages/docs/guides/data-sources.md
index d7f59e5..3b0fbf4 100644
--- a/packages/landing/pages/docs/guides/data-sources.md
+++ b/packages/landing/pages/docs/guides/data-sources.md
@@ -1,13 +1,11 @@
 ---
 title: Data Sources
-description: Platforms and data types that Spool can index.
+description: AI agent session sources that Spool indexes.
 ---
 
-Spool indexes data from two main sources: **agent sessions** (watched automatically) and **platform data** (pulled via connector plugins).
+Spool indexes your AI agent sessions automatically. Each source is watched in real time — new sessions become searchable the moment they're written, no manual export needed.
 
-## Agent sessions (automatic)
-
-Spool watches these directories in real time:
+## Agent sessions
 
 | Agent | Path |
 |-------|------|
@@ -17,29 +15,6 @@ Spool watches these directories in real time:
 | Codex CLI (profiles) | `~/.codex-profiles/*/sessions/` |
 | Gemini CLI | `~/.gemini/tmp/*/chats/` |
 
-New sessions become searchable the moment they're written. No manual export needed.
-
-## Platform data (via connector plugins)
-
-Connector plugins pull your bookmarks, stars, and saves from various platforms to your machine. Spool indexes everything they capture.
-
-### Supported connectors
-
-- **Code**: GitHub Stars, GitLab Stars, Bitbucket
-- **Social**: Twitter/X Bookmarks, Reddit Saved, Hacker News Favorites
-- **Video**: YouTube Likes, Bilibili Favorites
-- **Reading**: Substack, Medium Bookmarks, Pocket, Instapaper
-- **Professional**: LinkedIn Saved, Slack Bookmarks
-- **Notes**: Notion, Obsidian, Apple Notes
-
-### Syncing data
-
-```bash
-# Install a connector
-spool connectors install twitter-bookmarks
-
-# Sync all installed connectors
-spool connectors sync
-```
+## Platform data (Twitter, GitHub, Reddit, etc.)
 
-Spool indexes new data from connectors as it arrives.
+Connector-based platform sync (bookmarks, stars, saves) lives in **[Spool Daemon](/daemon/)**, a sibling app. Daemon's captures show up alongside Spool sessions in the same search box when both apps are installed.
diff --git a/packages/landing/pages/docs/quick-start.md b/packages/landing/pages/docs/quick-start.md
index 7fbcc78..5854311 100644
--- a/packages/landing/pages/docs/quick-start.md
+++ b/packages/landing/pages/docs/quick-start.md
@@ -3,7 +3,7 @@ title: Quick Start
 description: Get up and running with Spool in under 5 minutes.
 ---
 
-After [installing Spool](/docs/installation), you can start searching your data right away.
+After [installing Spool](/docs/installation), you can start searching your sessions right away.
 
 ## 1. Launch Spool
 
@@ -11,15 +11,18 @@ Open Spool from your Applications folder. It starts indexing your Claude Code, C
 
 ## 2. Search from the app
 
-Use the search bar to find anything across your indexed data — past agent sessions, bookmarks, starred repos, and more.
+Use the search bar to find anything across your indexed sessions. Try a keyword from a recent conversation — it's there the moment the session file is written.
 
-## 3. Install connector plugins
+## 3. Search from your terminal
 
-Install connector plugins to pull bookmarks, stars, and saves from your favorite platforms:
+The bundled CLI runs the same search engine:
 
 ```bash
-spool connectors install twitter-bookmarks
-spool connectors sync
+spool search "auth middleware"
+spool list -n 10
+spool show
 ```
 
-Spool indexes everything your connectors capture.
+## What about platform data (Twitter, GitHub, Reddit, …)?
+
+Bookmarks, stars, and saves live in **[Spool Daemon](/daemon/)** — a sibling app focused on capture sync. Once you install Daemon, its captures appear alongside Spool sessions in the same search box.
diff --git a/packages/landing/pages/docs/reference/configuration.md b/packages/landing/pages/docs/reference/configuration.md
index 56c6c05..0eca5de 100644
--- a/packages/landing/pages/docs/reference/configuration.md
+++ b/packages/landing/pages/docs/reference/configuration.md
@@ -11,7 +11,7 @@ Spool stores its data in `~/.spool/`.
 |------|---------|
 | `~/.spool/spool.db` | Local search index (SQLite) |
 | `~/.spool/agents.json` | Agent and SDK configuration |
-| `~/.spool/connectors/` | Connector plugin sync data |
+| `~/.spool/ui.json` | UI preferences (theme, etc.) |
 
 ## Watched directories