diff --git a/CHANGELOG.md b/CHANGELOG.md
index e6fee5f..a4d043e 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,6 +2,44 @@
All notable changes to Claude Matrix are documented here.
+## [2.4.0] - 2026-04-01
+
+> Leaner, sharper. Claude Code grew up, so we dropped what it handles natively and doubled down on what only Matrix can do.
+
+### Removed
+- **Dreamer (Scheduled Tasks)** - Removed the entire cron/scheduler system (`src/dreamer/`, `croner`, `cronstrue` packages). Claude Code now has native scheduling via `CronCreate`/`RemoteTrigger`.
+- **Background Jobs** - Removed async job system (`src/jobs/`). Reindex now runs synchronously. `matrix_job_status`, `matrix_job_cancel`, `matrix_job_list` tools removed.
+- **Prompt Tool** (`matrix_prompt`) - Removed prompt analysis tool. High token cost, marginal value.
+- **PromptAnalysis Hook** - Removed memory injection into prompts. Token overhead outweighed benefit.
+- **SubagentStart Hook** - Removed hint injection into subagents.
+- **PostToolUse:Bash Hook** - Removed bash output logging. Claude sees output directly.
+- **TaskCompleted Hook** - Removed task completion logging.
+- **Clone Repo Skill** (`/matrix:clone-repo`) - `git clone` works fine natively.
+- **RunMD Skill** (`/matrix:runmd`) - Bash tool covers this.
+- **`@anthropic-ai/sdk`** dependency - Was unused (never imported).
+
+### Changed
+- **DB Migrations Flattened** - Replaced 8 incremental migrations plus ~100 lines of version-detection logic with a single idempotent legacy upgrade. `SCHEMA_VERSION` bumped to 9. Existing DBs at any version (1-8) upgrade cleanly.
+- **Dependencies Updated**
+ - `@modelcontextprotocol/sdk` 1.25.0 → 1.29.0
+ - `@sinclair/typebox` 0.34.45 → 0.34.49
+ - `web-tree-sitter` 0.26.3 → 0.26.8
+ - `@types/bun` 1.3.4 → 1.3.11
+ - `oxlint` 1.35.0 → 1.58.0
+
+### Improved
+- **Scanner: .gitignore support** - File scanner now parses `.gitignore` and respects all patterns. No more indexing build output or generated files.
+- **Scanner: symlink cycle detection** - Detects and skips symlink cycles and broken symlinks via `lstat()`.
+- **Incremental indexing: content hash** - Uses SHA-256 content hashing (stored in DB) instead of mtime-only detection. Detects real changes even when timestamps are unreliable, and skips files that were `touch`ed but not edited.
+- **Parser: per-file timeout** - 10s timeout per file prevents hangs on malformed/huge files.
+- **Python parser: decorator extraction** - Captures `@property`, `@staticmethod`, `@classmethod`, `@dataclass` and custom decorators. Properties correctly classified. Dataclass fields auto-extracted.
+- **TypeScript parser: re-export tracking** - `export { X } from './bar'`, `export * from './bar'` now create import edges. Fixes caller detection through barrel exports.
+- **Symbol search: fuzzy ranking** - `searchSymbols` now ranks exact > prefix > substring, with optional kind filter and higher default limit (30).
+- **Import resolution: tsconfig path aliases** - Loads `tsconfig.json`/`jsconfig.json` `compilerOptions.paths` for accurate import graph resolution (e.g., `@/utils/foo` → `src/utils/foo`).
+- **Zero lint warnings** - Fixed all 15 pre-existing unused-import warnings.
+
+---
+
## [2.3.1] - 2026-03-02
### Fixed
diff --git a/README.md b/README.md
index 8df5ef4..ceb1dd2 100644
--- a/README.md
+++ b/README.md
@@ -4,16 +4,18 @@
-[](CHANGELOG.md)
+[](CHANGELOG.md)
[](LICENSE)
-Community plugin for Claude Code • Not affiliated with Anthropic
+Community plugin for Claude Code. Not affiliated with Anthropic.
+
+If Matrix saved you time, be kind and leave a star.
---
-Matrix turns Claude Code into a complete development environment. Memory that persists across sessions. Code indexing across 15 languages. Automated hooks that catch issues before they happen. Scheduled tasks that run while you sleep.
+Matrix gives Claude Code persistent memory and deep code intelligence. Solutions survive across sessions. A tree-sitter index makes symbol navigation instant across 15 languages. Hooks catch problems before they happen.
```
/plugin marketplace add ojowwalker77/Claude-Matrix
@@ -22,7 +24,7 @@ Matrix turns Claude Code into a complete development environment. Memory that pe
Requires [Bun](https://bun.sh) v1.0+ and Claude Code v2.0+. Verify with `/matrix:doctor`.
-> **macOS and Linux only.** Windows is not supported. Windows users must use [WSL](https://learn.microsoft.com/en-us/windows/wsl/install) or fork the repo and adapt paths manually.
+> **macOS and Linux only.** Windows users need [WSL](https://learn.microsoft.com/en-us/windows/wsl/install).
## What You Get
@@ -30,12 +32,11 @@ Requires [Bun](https://bun.sh) v1.0+ and Claude Code v2.0+. Verify with `/matrix
|---------|--------------|
| **Memory** | Solutions persist. Claude recalls what worked before. |
| **Code Index** | Find definitions, callers, exports instantly (15 languages) |
-| **Hooks** | Auto-inject context, catch bad packages, warn on sensitive files |
-| **Warnings** | Track problematic files/packages with CVE checks |
-| **Dreamer** | Schedule tasks — daily reviews, weekly audits, automated commits |
+| **Hooks** | Auto-approve reads, catch bad packages, warn on sensitive files |
+| **Warnings** | Track problematic files and packages with CVE checks |
| **Code Review** | 5-phase analysis with blast radius and impact mapping |
-| **Deep Research** | Multi-source aggregation into polished markdown |
-| **Nuke** | Codebase hygiene analysis — dead code, orphaned files, stale TODOs |
+| **Deep Research** | Multi-source research aggregation |
+| **Nuke** | Dead code, orphaned files, circular dependencies, stale TODOs |
| **Context7** | Always-current library documentation |
---
@@ -45,22 +46,22 @@ Requires [Bun](https://bun.sh) v1.0+ and Claude Code v2.0+. Verify with `/matrix
Solutions persist across sessions with semantic search:
```
-You solve a problem → Matrix stores it
-Similar problem later → Matrix recalls it
-Feedback → Rankings improve
+You solve a problem -> Matrix stores it
+Similar problem later -> Matrix recalls it
+Feedback -> Rankings improve
```
-**Tools:** `matrix_recall` · `matrix_store` · `matrix_reward` · `matrix_failure` · `matrix_status`
+**Tools:** `matrix_recall`, `matrix_store`, `matrix_reward`, `matrix_failure`, `matrix_status`
---
## Code Index
-Fast symbol navigation. Auto-indexed on session start.
+Fast symbol navigation powered by tree-sitter. Auto-indexed on session start. Respects `.gitignore`. Uses content hashing for reliable incremental updates.
**Languages:** TypeScript, JavaScript, Python, Go, Rust, Java, Kotlin, Swift, C#, Ruby, PHP, C, C++, Elixir, Zig
-**Tools:** `matrix_find_definition` · `matrix_find_callers` · `matrix_search_symbols` · `matrix_list_exports` · `matrix_get_imports`
+**Tools:** `matrix_find_definition`, `matrix_find_callers`, `matrix_search_symbols`, `matrix_list_exports`, `matrix_get_imports`, `matrix_find_dead_code`, `matrix_find_circular_deps`
---
@@ -71,34 +72,10 @@ Matrix runs automatically in the background:
| Trigger | Action |
|---------|--------|
| Session starts | Index code, initialize memory |
+| Tool permission requested | Auto-approve read-only tools |
| Before `npm install` | Check CVEs, deprecation, bundle size |
+| Before reading file | Warn if sensitive (.env, keys, secrets) |
| Before editing file | Warn if file has known issues |
-| Before `git commit` | Suggest code review |
-| Task completed | Offer to save notable solutions |
-
----
-
-## Dreamer
-
-Schedule Claude tasks with native OS schedulers (launchd/crontab):
-
-```
-/scheduler:schedule-add
-> Name: daily-review
-> Schedule: every weekday at 9am
-> Command: /matrix:review
-```
-
-**Use cases:** Daily code review · Weekly dependency audit · Nightly test runs · Automated changelog
-
-**Git Worktree Mode:** Run in isolated branches that auto-commit and push.
-
-**Safety:**
-- `skipPermissions` OFF by default — respects your existing permission rules
-- No custom daemon — uses native OS schedulers, no elevated privileges
-- All inputs sanitized via `shellEscape()`
-
-**Skills:** `/scheduler:schedule-add` · `schedule-list` · `schedule-run` · `schedule-remove` · `schedule-status` · `schedule-logs` · `schedule-history`
---
@@ -108,12 +85,11 @@ Schedule Claude tasks with native OS schedulers (launchd/crontab):
|---------|---------|
| `/matrix:review` | 5-phase code review with impact analysis |
| `/matrix:deep-research` | Multi-source research aggregation |
-| `/matrix:doctor` | Diagnostics + auto-fix |
+| `/matrix:nuke` | Codebase hygiene analysis |
+| `/matrix:doctor` | Diagnostics and auto-fix |
| `/matrix:list` | View solutions, stats, warnings |
| `/matrix:warn` | Manage file/package warnings |
| `/matrix:reindex` | Rebuild code index |
-| `/matrix:nuke` | Codebase hygiene analysis |
-| `/matrix:clone-repo` | Clone external repos for exploration |
---
@@ -131,7 +107,7 @@ Config at `~/.claude/matrix/matrix.config`:
}
```
-**Verbosity:** `full` (~500 tokens) · `compact` (~80, recommended)
+**Verbosity:** `full` (~500 tokens) or `compact` (~80, recommended)
**Model Delegation:** Routes simple ops to Haiku for ~40-50% cost savings.
@@ -153,7 +129,7 @@ All data local. No external API calls for memory.
## Links
-[Changelog](CHANGELOG.md) · [Roadmap](ROADMAP.md) · [Contributing](CONTRIBUTING.md) · [LLM Reference](docs/reference-for-llms.md)
+[Changelog](CHANGELOG.md) | [LLM Reference](docs/reference-for-llms.md)
MIT License
diff --git a/bun.lock b/bun.lock
index 306d994..6e4d646 100644
--- a/bun.lock
+++ b/bun.lock
@@ -5,48 +5,63 @@
"": {
"name": "matrix",
"dependencies": {
- "@anthropic-ai/sdk": "^0.71.2",
- "@modelcontextprotocol/sdk": "^1.0.0",
- "@sinclair/typebox": "^0.34.45",
+ "@modelcontextprotocol/sdk": "^1.29.0",
+ "@sinclair/typebox": "^0.34.49",
"@xenova/transformers": "^2.17.2",
- "croner": "^8.1.2",
- "cronstrue": "^2.52.0",
- "web-tree-sitter": "^0.26.3",
+ "web-tree-sitter": "^0.26.8",
},
"devDependencies": {
- "@types/bun": "^1.3.4",
- "@types/node": "^22.10.2",
- "oxlint": "^1.35.0",
+ "@types/bun": "^1.3.11",
+ "@types/node": "^22.19.15",
+ "oxlint": "^1.58.0",
"typescript": "^5.7.2",
},
},
},
"packages": {
- "@anthropic-ai/sdk": ["@anthropic-ai/sdk@0.71.2", "", { "dependencies": { "json-schema-to-ts": "^3.1.1" }, "peerDependencies": { "zod": "^3.25.0 || ^4.0.0" }, "optionalPeers": ["zod"], "bin": { "anthropic-ai-sdk": "bin/cli" } }, "sha512-TGNDEUuEstk/DKu0/TflXAEt+p+p/WhTlFzEnoosvbaDU2LTjm42igSdlL0VijrKpWejtOKxX0b8A7uc+XiSAQ=="],
+ "@hono/node-server": ["@hono/node-server@1.19.12", "", { "peerDependencies": { "hono": "^4" } }, "sha512-txsUW4SQ1iilgE0l9/e9VQWmELXifEFvmdA1j6WFh/aFPj99hIntrSsq/if0UWyGVkmrRPKA1wCeP+UCr1B9Uw=="],
- "@babel/runtime": ["@babel/runtime@7.28.4", "", {}, "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ=="],
+ "@huggingface/jinja": ["@huggingface/jinja@0.2.2", "", {}, "sha512-/KPde26khDUIPkTGU82jdtTW9UAuvUTumCAbFs/7giR0SxsvZC4hru51PBvpijH6BVkHcROcvZM/lpy5h1jRRA=="],
- "@hono/node-server": ["@hono/node-server@1.19.7", "", { "peerDependencies": { "hono": "^4" } }, "sha512-vUcD0uauS7EU2caukW8z5lJKtoGMokxNbJtBiwHgpqxEXokaHCBkQUmCHhjFB1VUTWdqj25QoMkMKzgjq+uhrw=="],
+ "@modelcontextprotocol/sdk": ["@modelcontextprotocol/sdk@1.29.0", "", { "dependencies": { "@hono/node-server": "^1.19.9", "ajv": "^8.17.1", "ajv-formats": "^3.0.1", "content-type": "^1.0.5", "cors": "^2.8.5", "cross-spawn": "^7.0.5", "eventsource": "^3.0.2", "eventsource-parser": "^3.0.0", "express": "^5.2.1", "express-rate-limit": "^8.2.1", "hono": "^4.11.4", "jose": "^6.1.3", "json-schema-typed": "^8.0.2", "pkce-challenge": "^5.0.0", "raw-body": "^3.0.0", "zod": "^3.25 || ^4.0", "zod-to-json-schema": "^3.25.1" }, "peerDependencies": { "@cfworker/json-schema": "^4.1.1" }, "optionalPeers": ["@cfworker/json-schema"] }, "sha512-zo37mZA9hJWpULgkRpowewez1y6ML5GsXJPY8FI0tBBCd77HEvza4jDqRKOXgHNn867PVGCyTdzqpz0izu5ZjQ=="],
- "@huggingface/jinja": ["@huggingface/jinja@0.2.2", "", {}, "sha512-/KPde26khDUIPkTGU82jdtTW9UAuvUTumCAbFs/7giR0SxsvZC4hru51PBvpijH6BVkHcROcvZM/lpy5h1jRRA=="],
+ "@oxlint/binding-android-arm-eabi": ["@oxlint/binding-android-arm-eabi@1.58.0", "", { "os": "android", "cpu": "arm" }, "sha512-1T7UN3SsWWxpWyWGn1cT3ASNJOo+pI3eUkmEl7HgtowapcV8kslYpFQcYn431VuxghXakPNlbjRwhqmR37PFOg=="],
+
+ "@oxlint/binding-android-arm64": ["@oxlint/binding-android-arm64@1.58.0", "", { "os": "android", "cpu": "arm64" }, "sha512-GryzujxuiRv2YFF7bRy8mKcxlbuAN+euVUtGJt9KKbLT8JBUIosamVhcthLh+VEr6KE6cjeVMAQxKAzJcoN7dg=="],
+
+ "@oxlint/binding-darwin-arm64": ["@oxlint/binding-darwin-arm64@1.58.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-7/bRSJIwl4GxeZL9rPZ11anNTyUO9epZrfEJH/ZMla3+/gbQ6xZixh9nOhsZ0QwsTW7/5J2A/fHbD1udC5DQQA=="],
+
+ "@oxlint/binding-darwin-x64": ["@oxlint/binding-darwin-x64@1.58.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-EqdtJSiHweS2vfILNrpyJ6HUwpEq2g7+4Zx1FPi4hu3Hu7tC3znF6ufbXO8Ub2LD4mGgznjI7kSdku9NDD1Mkg=="],
+
+ "@oxlint/binding-freebsd-x64": ["@oxlint/binding-freebsd-x64@1.58.0", "", { "os": "freebsd", "cpu": "x64" }, "sha512-VQt5TH4M42mY20F545G637RKxV/yjwVtKk2vfXuazfReSIiuvWBnv+FVSvIV5fKVTJNjt3GSJibh6JecbhGdBw=="],
+
+ "@oxlint/binding-linux-arm-gnueabihf": ["@oxlint/binding-linux-arm-gnueabihf@1.58.0", "", { "os": "linux", "cpu": "arm" }, "sha512-fBYcj4ucwpAtjJT3oeBdFBYKvNyjRSK+cyuvBOTQjh0jvKp4yeA4S/D0IsCHus/VPaNG5L48qQkh+Vjy3HL2/Q=="],
+
+ "@oxlint/binding-linux-arm-musleabihf": ["@oxlint/binding-linux-arm-musleabihf@1.58.0", "", { "os": "linux", "cpu": "arm" }, "sha512-0BeuFfwlUHlJ1xpEdSD1YO3vByEFGPg36uLjK1JgFaxFb4W6w17F8ET8sz5cheZ4+x5f2xzdnRrrWv83E3Yd8g=="],
+
+ "@oxlint/binding-linux-arm64-gnu": ["@oxlint/binding-linux-arm64-gnu@1.58.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-TXlZgnPTlxrQzxG9ZXU7BNwx1Ilrr17P3GwZY0If2EzrinqRH3zXPc3HrRcBJgcsoZNMuNL5YivtkJYgp467UQ=="],
- "@modelcontextprotocol/sdk": ["@modelcontextprotocol/sdk@1.25.0", "", { "dependencies": { "@hono/node-server": "^1.19.7", "ajv": "^8.17.1", "ajv-formats": "^3.0.1", "content-type": "^1.0.5", "cors": "^2.8.5", "cross-spawn": "^7.0.5", "eventsource": "^3.0.2", "eventsource-parser": "^3.0.0", "express": "^5.0.1", "express-rate-limit": "^7.5.0", "jose": "^6.1.1", "json-schema-typed": "^8.0.2", "pkce-challenge": "^5.0.0", "raw-body": "^3.0.0", "zod": "^3.25 || ^4.0", "zod-to-json-schema": "^3.25.0" }, "peerDependencies": { "@cfworker/json-schema": "^4.1.1" }, "optionalPeers": ["@cfworker/json-schema"] }, "sha512-z0Zhn/LmQ3yz91dEfd5QgS7DpSjA4pk+3z2++zKgn5L6iDFM9QapsVoAQSbKLvlrFsZk9+ru6yHHWNq2lCYJKQ=="],
+ "@oxlint/binding-linux-arm64-musl": ["@oxlint/binding-linux-arm64-musl@1.58.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-zSoYRo5dxHLcUx93Stl2hW3hSNjPt99O70eRVWt5A1zwJ+FPjeCCANCD2a9R4JbHsdcl11TIQOjyigcRVOH2mw=="],
- "@oxlint/darwin-arm64": ["@oxlint/darwin-arm64@1.35.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-ieiYVHkNZPo77Hgrxav595wGS4rRNKuDNrljf+4xhwpJsddrxMpM64IQUf2IvR3MhK4FxdGzhhB6OVmGVHY5/w=="],
+ "@oxlint/binding-linux-ppc64-gnu": ["@oxlint/binding-linux-ppc64-gnu@1.58.0", "", { "os": "linux", "cpu": "ppc64" }, "sha512-NQ0U/lqxH2/VxBYeAIvMNUK1y0a1bJ3ZicqkF2c6wfakbEciP9jvIE4yNzCFpZaqeIeRYaV7AVGqEO1yrfVPjA=="],
- "@oxlint/darwin-x64": ["@oxlint/darwin-x64@1.35.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-1jNHu3j66X5jKySvgtE+jGtjx4ye+xioAucVTi2IuROZO6keK2YG74pnD+9FT+DpWZAtWRZGoW0r0x6aN9sEEg=="],
+ "@oxlint/binding-linux-riscv64-gnu": ["@oxlint/binding-linux-riscv64-gnu@1.58.0", "", { "os": "linux", "cpu": "none" }, "sha512-X9J+kr3gIC9FT8GuZt0ekzpNUtkBVzMVU4KiKDSlocyQuEgi3gBbXYN8UkQiV77FTusLDPsovjo95YedHr+3yg=="],
- "@oxlint/linux-arm64-gnu": ["@oxlint/linux-arm64-gnu@1.35.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-T1lc0UaYbTxZyqVpLfC7eipbauNG8pBpkaZEW4JGz8Y68rxTH7d9s+CF0zxUxNr5RCtcmT669RLVjQT7VrKVLg=="],
+ "@oxlint/binding-linux-riscv64-musl": ["@oxlint/binding-linux-riscv64-musl@1.58.0", "", { "os": "linux", "cpu": "none" }, "sha512-CDze3pi1OO3Wvb/QsXjmLEY4XPKGM6kIo82ssNOgmcl1IdndF9VSGAE38YLhADWmOac7fjqhBw82LozuUVxD0Q=="],
- "@oxlint/linux-arm64-musl": ["@oxlint/linux-arm64-musl@1.35.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-7Wv5Pke9kwWKFycUziSHsmi3EM0389TLzraB0KE/MArrKxx30ycwfJ5PYoMj9ERoW+Ybs0txdaOF/xJy/XyYkg=="],
+ "@oxlint/binding-linux-s390x-gnu": ["@oxlint/binding-linux-s390x-gnu@1.58.0", "", { "os": "linux", "cpu": "s390x" }, "sha512-b/89glbxFaEAcA6Uf1FvCNecBJEgcUTsV1quzrqXM/o4R1M4u+2KCVuyGCayN2UpsRWtGGLb+Ver0tBBpxaPog=="],
- "@oxlint/linux-x64-gnu": ["@oxlint/linux-x64-gnu@1.35.0", "", { "os": "linux", "cpu": "x64" }, "sha512-HDMPOzyVVy+rQl3H7UOq8oGHt7m1yaiWCanlhAu4jciK8dvXeO9OG/OQd74lD/h05IcJh93pCLEJ3wWOG8hTiQ=="],
+ "@oxlint/binding-linux-x64-gnu": ["@oxlint/binding-linux-x64-gnu@1.58.0", "", { "os": "linux", "cpu": "x64" }, "sha512-0/yYpkq9VJFCEcuRlrViGj8pJUFFvNS4EkEREaN7CB1EcLXJIaVSSa5eCihwBGXtOZxhnblWgxks9juRdNQI7w=="],
- "@oxlint/linux-x64-musl": ["@oxlint/linux-x64-musl@1.35.0", "", { "os": "linux", "cpu": "x64" }, "sha512-kAPBBsUOM3HQQ6n3nnZauvFR9EoXqCSoj4O3OSXXarzsRTiItNrHabVUwxeswZEc+xMzQNR0FHEWg/d4QAAWLw=="],
+ "@oxlint/binding-linux-x64-musl": ["@oxlint/binding-linux-x64-musl@1.58.0", "", { "os": "linux", "cpu": "x64" }, "sha512-hr6FNvmcAXiH+JxSvaJ4SJ1HofkdqEElXICW9sm3/Rd5eC3t7kzvmLyRAB3NngKO2wzXRCAm4Z/mGWfrsS4X8w=="],
- "@oxlint/win32-arm64": ["@oxlint/win32-arm64@1.35.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-qrpBkkOASS0WT8ra9xmBRXOEliN6D/MV9JhI/68lFHrtLhfFuRwg4AjzjxrCWrQCnQ0WkvAVpJzu73F4ICLYZw=="],
+ "@oxlint/binding-openharmony-arm64": ["@oxlint/binding-openharmony-arm64@1.58.0", "", { "os": "none", "cpu": "arm64" }, "sha512-R+O368VXgRql1K6Xar+FEo7NEwfo13EibPMoTv3sesYQedRXd6m30Dh/7lZMxnrQVFfeo4EOfYIP4FpcgWQNHg=="],
- "@oxlint/win32-x64": ["@oxlint/win32-x64@1.35.0", "", { "os": "win32", "cpu": "x64" }, "sha512-yPFcj6umrhusnG/kMS5wh96vblsqZ0kArQJS+7kEOSJDrH+DsFWaDCsSRF8U6gmSmZJ26KVMU3C3TMpqDN4M1g=="],
+ "@oxlint/binding-win32-arm64-msvc": ["@oxlint/binding-win32-arm64-msvc@1.58.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-Q0FZiAY/3c4YRj4z3h9K1PgaByrifrfbBoODSeX7gy97UtB7pySPUQfC2B/GbxWU6k7CzQrRy5gME10PltLAFQ=="],
+
+ "@oxlint/binding-win32-ia32-msvc": ["@oxlint/binding-win32-ia32-msvc@1.58.0", "", { "os": "win32", "cpu": "ia32" }, "sha512-Y8FKBABrSPp9H0QkRLHDHOSUgM/309a3IvOVgPcVxYcX70wxJrk608CuTg7w+C6vEd724X5wJoNkBcGYfH7nNQ=="],
+
+ "@oxlint/binding-win32-x64-msvc": ["@oxlint/binding-win32-x64-msvc@1.58.0", "", { "os": "win32", "cpu": "x64" }, "sha512-bCn5rbiz5My+Bj7M09sDcnqW0QJyINRVxdZ65x1/Y2tGrMwherwK/lpk+HRQCKvXa8pcaQdF5KY5j54VGZLwNg=="],
"@protobufjs/aspromise": ["@protobufjs/aspromise@1.1.2", "", {}, "sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ=="],
@@ -68,13 +83,13 @@
"@protobufjs/utf8": ["@protobufjs/utf8@1.1.0", "", {}, "sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw=="],
- "@sinclair/typebox": ["@sinclair/typebox@0.34.45", "", {}, "sha512-qJcFVfCa5jxBFSuv7S5WYbA8XdeCPmhnaVVfX/2Y6L8WYg8sk3XY2+6W0zH+3mq1Cz+YC7Ki66HfqX6IHAwnkg=="],
+ "@sinclair/typebox": ["@sinclair/typebox@0.34.49", "", {}, "sha512-brySQQs7Jtn0joV8Xh9ZV/hZb9Ozb0pmazDIASBkYKCjXrXU3mpcFahmK/z4YDhGkQvP9mWJbVyahdtU5wQA+A=="],
- "@types/bun": ["@types/bun@1.3.4", "", { "dependencies": { "bun-types": "1.3.4" } }, "sha512-EEPTKXHP+zKGPkhRLv+HI0UEX8/o+65hqARxLy8Ov5rIxMBPNTjeZww00CIihrIQGEQBYg+0roO5qOnS/7boGA=="],
+ "@types/bun": ["@types/bun@1.3.11", "", { "dependencies": { "bun-types": "1.3.11" } }, "sha512-5vPne5QvtpjGpsGYXiFyycfpDF2ECyPcTSsFBMa0fraoxiQyMJ3SmuQIGhzPg2WJuWxVBoxWJ2kClYTcw/4fAg=="],
"@types/long": ["@types/long@4.0.2", "", {}, "sha512-MqTGEo5bj5t157U6fA/BiDynNkn0YknVdh48CMPkTSpFTVmvao5UQmm7uEF6xBEo7qIMAlY/JSleYaE6VOdpaA=="],
- "@types/node": ["@types/node@22.19.3", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-1N9SBnWYOJTrNZCdh/yJE+t910Y128BoyY+zBLWhL3r0TYzlTmFdXrPwHL9DyFZmlEXNQQolTZh3KHV31QDhyA=="],
+ "@types/node": ["@types/node@22.19.15", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-F0R/h2+dsy5wJAUe3tAU6oqa2qbWY5TpNfL/RGmo1y38hiyO1w3x2jPtt76wmuaJI4DQnOBu21cNXQ2STIUUWg=="],
"@xenova/transformers": ["@xenova/transformers@2.17.2", "", { "dependencies": { "@huggingface/jinja": "^0.2.2", "onnxruntime-web": "1.14.0", "sharp": "^0.32.0" }, "optionalDependencies": { "onnxruntime-node": "1.14.0" } }, "sha512-lZmHqzrVIkSvZdKZEx7IYY51TK0WDrC8eR0c5IMnBsO8di8are1zzw8BlLhyO2TklZKLN5UffNGs1IJwT6oOqQ=="],
@@ -106,7 +121,7 @@
"buffer": ["buffer@5.7.1", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.1.13" } }, "sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ=="],
- "bun-types": ["bun-types@1.3.4", "", { "dependencies": { "@types/node": "*" } }, "sha512-5ua817+BZPZOlNaRgGBpZJOSAQ9RQ17pkwPD0yR7CfJg+r8DgIILByFifDTa+IPDDxzf5VNhtNlcKqFzDgJvlQ=="],
+ "bun-types": ["bun-types@1.3.11", "", { "dependencies": { "@types/node": "*" } }, "sha512-1KGPpoxQWl9f6wcZh57LvrPIInQMn2TQ7jsgxqpRzg+l0QPOFvJVH7HmvHo/AiPgwXy+/Thf6Ov3EdVn1vOabg=="],
"bytes": ["bytes@3.1.2", "", {}, "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg=="],
@@ -134,10 +149,6 @@
"cors": ["cors@2.8.5", "", { "dependencies": { "object-assign": "^4", "vary": "^1" } }, "sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g=="],
- "croner": ["croner@8.1.2", "", {}, "sha512-ypfPFcAXHuAZRCzo3vJL6ltENzniTjwe/qsLleH1V2/7SRDjgvRQyrLmumFTLmjFax4IuSxfGXEn79fozXcJog=="],
-
- "cronstrue": ["cronstrue@2.61.0", "", { "bin": { "cronstrue": "bin/cli.js" } }, "sha512-ootN5bvXbIQI9rW94+QsXN5eROtXWwew6NkdGxIRpS/UFWRggL0G5Al7a9GTBFEsuvVhJ2K3CntIIVt7L2ILhA=="],
-
"cross-spawn": ["cross-spawn@7.0.6", "", { "dependencies": { "path-key": "^3.1.0", "shebang-command": "^2.0.0", "which": "^2.0.1" } }, "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA=="],
"debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
@@ -178,7 +189,7 @@
"express": ["express@5.2.1", "", { "dependencies": { "accepts": "^2.0.0", "body-parser": "^2.2.1", "content-disposition": "^1.0.0", "content-type": "^1.0.5", "cookie": "^0.7.1", "cookie-signature": "^1.2.1", "debug": "^4.4.0", "depd": "^2.0.0", "encodeurl": "^2.0.0", "escape-html": "^1.0.3", "etag": "^1.8.1", "finalhandler": "^2.1.0", "fresh": "^2.0.0", "http-errors": "^2.0.0", "merge-descriptors": "^2.0.0", "mime-types": "^3.0.0", "on-finished": "^2.4.1", "once": "^1.4.0", "parseurl": "^1.3.3", "proxy-addr": "^2.0.7", "qs": "^6.14.0", "range-parser": "^1.2.1", "router": "^2.2.0", "send": "^1.1.0", "serve-static": "^2.2.0", "statuses": "^2.0.1", "type-is": "^2.0.1", "vary": "^1.1.2" } }, "sha512-hIS4idWWai69NezIdRt2xFVofaF4j+6INOpJlVOLDO8zXGpUVEVzIYk12UUi2JzjEzWL3IOAxcTubgz9Po0yXw=="],
- "express-rate-limit": ["express-rate-limit@7.5.1", "", { "peerDependencies": { "express": ">= 4.11" } }, "sha512-7iN8iPMDzOMHPUYllBEsQdWVB6fPDMPqwjBaFrgr4Jgr/+okjvzAy+UHlYYL/Vs0OsOrMkwS6PJDkFlJwoxUnw=="],
+ "express-rate-limit": ["express-rate-limit@8.3.2", "", { "dependencies": { "ip-address": "10.1.0" }, "peerDependencies": { "express": ">= 4.11" } }, "sha512-77VmFeJkO0/rvimEDuUC5H30oqUC4EyOhyGccfqoLebB0oiEYfM7nwPrsDsBL1gsTpwfzX8SFy2MT3TDyRq+bg=="],
"fast-deep-equal": ["fast-deep-equal@3.1.3", "", {}, "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q=="],
@@ -212,7 +223,7 @@
"hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ=="],
- "hono": ["hono@4.11.1", "", {}, "sha512-KsFcH0xxHes0J4zaQgWbYwmz3UPOOskdqZmItstUG93+Wk1ePBLkLGwbP9zlmh1BFUiL8Qp+Xfu9P7feJWpGNg=="],
+ "hono": ["hono@4.12.9", "", {}, "sha512-wy3T8Zm2bsEvxKZM5w21VdHDDcwVS1yUFFY6i8UobSsKfFceT7TOwhbhfKsDyx7tYQlmRM5FLpIuYvNFyjctiA=="],
"http-errors": ["http-errors@2.0.1", "", { "dependencies": { "depd": "~2.0.0", "inherits": "~2.0.4", "setprototypeof": "~1.2.0", "statuses": "~2.0.2", "toidentifier": "~1.0.1" } }, "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ=="],
@@ -224,6 +235,8 @@
"ini": ["ini@1.3.8", "", {}, "sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew=="],
+ "ip-address": ["ip-address@10.1.0", "", {}, "sha512-XXADHxXmvT9+CRxhXg56LJovE+bmWnEWB78LB83VZTprKTmaC5QfruXocxzTZ2Kl0DNwKuBdlIhjL8LeY8Sf8Q=="],
+
"ipaddr.js": ["ipaddr.js@1.9.1", "", {}, "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g=="],
"is-arrayish": ["is-arrayish@0.3.4", "", {}, "sha512-m6UrgzFVUYawGBh1dUsWR5M2Clqic9RVXC/9f8ceNlv2IcO9j9J/z8UoCLPqtsPBFNzEpfR3xftohbfqDx8EQA=="],
@@ -234,8 +247,6 @@
"jose": ["jose@6.1.3", "", {}, "sha512-0TpaTfihd4QMNwrz/ob2Bp7X04yuxJkjRGi4aKmOqwhov54i6u79oCv7T+C7lo70MKH6BesI3vscD1yb/yzKXQ=="],
- "json-schema-to-ts": ["json-schema-to-ts@3.1.1", "", { "dependencies": { "@babel/runtime": "^7.18.3", "ts-algebra": "^2.0.0" } }, "sha512-+DWg8jCJG2TEnpy7kOm/7/AxaYoaRbjVB4LFZLySZlWn8exGs3A4OLJR966cVvU26N7X9TWxl+Jsw7dzAqKT6g=="],
-
"json-schema-traverse": ["json-schema-traverse@1.0.0", "", {}, "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="],
"json-schema-typed": ["json-schema-typed@8.0.2", "", {}, "sha512-fQhoXdcvc3V28x7C7BMs4P5+kNlgUURe2jmUT1T//oBRMDrqy1QPelJimwZGo7Hg9VPV3EQV5Bnq4hbFy2vetA=="],
@@ -284,7 +295,7 @@
"onnxruntime-web": ["onnxruntime-web@1.14.0", "", { "dependencies": { "flatbuffers": "^1.12.0", "guid-typescript": "^1.0.9", "long": "^4.0.0", "onnx-proto": "^4.0.4", "onnxruntime-common": "~1.14.0", "platform": "^1.3.6" } }, "sha512-Kcqf43UMfW8mCydVGcX9OMXI2VN17c0p6XvR7IPSZzBf/6lteBzXHvcEVWDPmCKuGombl997HgLqj91F11DzXw=="],
- "oxlint": ["oxlint@1.35.0", "", { "optionalDependencies": { "@oxlint/darwin-arm64": "1.35.0", "@oxlint/darwin-x64": "1.35.0", "@oxlint/linux-arm64-gnu": "1.35.0", "@oxlint/linux-arm64-musl": "1.35.0", "@oxlint/linux-x64-gnu": "1.35.0", "@oxlint/linux-x64-musl": "1.35.0", "@oxlint/win32-arm64": "1.35.0", "@oxlint/win32-x64": "1.35.0" }, "peerDependencies": { "oxlint-tsgolint": ">=0.10.0" }, "optionalPeers": ["oxlint-tsgolint"], "bin": { "oxc_language_server": "bin/oxc_language_server", "oxlint": "bin/oxlint" } }, "sha512-QDX1aUgaiqznkGfTM2qHwva2wtKqhVoqPSVXrnPz+yLUhlNadikD3QRuRtppHl7WGuy3wG6nKAuR8lash3aWSg=="],
+ "oxlint": ["oxlint@1.58.0", "", { "optionalDependencies": { "@oxlint/binding-android-arm-eabi": "1.58.0", "@oxlint/binding-android-arm64": "1.58.0", "@oxlint/binding-darwin-arm64": "1.58.0", "@oxlint/binding-darwin-x64": "1.58.0", "@oxlint/binding-freebsd-x64": "1.58.0", "@oxlint/binding-linux-arm-gnueabihf": "1.58.0", "@oxlint/binding-linux-arm-musleabihf": "1.58.0", "@oxlint/binding-linux-arm64-gnu": "1.58.0", "@oxlint/binding-linux-arm64-musl": "1.58.0", "@oxlint/binding-linux-ppc64-gnu": "1.58.0", "@oxlint/binding-linux-riscv64-gnu": "1.58.0", "@oxlint/binding-linux-riscv64-musl": "1.58.0", "@oxlint/binding-linux-s390x-gnu": "1.58.0", "@oxlint/binding-linux-x64-gnu": "1.58.0", "@oxlint/binding-linux-x64-musl": "1.58.0", "@oxlint/binding-openharmony-arm64": "1.58.0", "@oxlint/binding-win32-arm64-msvc": "1.58.0", "@oxlint/binding-win32-ia32-msvc": "1.58.0", "@oxlint/binding-win32-x64-msvc": "1.58.0" }, "peerDependencies": { "oxlint-tsgolint": ">=0.18.0" }, "optionalPeers": ["oxlint-tsgolint"], "bin": { "oxlint": "bin/oxlint" } }, "sha512-t4s9leczDMqlvOSjnbCQe7gtoLkWgBGZ7sBdCJ9EOj5IXFSG/X7OAzK4yuH4iW+4cAYe8kLFbC8tuYMwWZm+Cg=="],
"parseurl": ["parseurl@1.3.3", "", {}, "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ=="],
@@ -366,8 +377,6 @@
"toidentifier": ["toidentifier@1.0.1", "", {}, "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="],
- "ts-algebra": ["ts-algebra@2.0.0", "", {}, "sha512-FPAhNPFMrkwz76P7cdjdmiShwMynZYN6SgOujD1urY4oNm80Ou9oMdmbR45LotcKOXoy7wSmHkRFE6Mxbrhefw=="],
-
"tunnel-agent": ["tunnel-agent@0.6.0", "", { "dependencies": { "safe-buffer": "^5.0.1" } }, "sha512-McnNiV1l8RYeY8tBgEpuodCC1mLUdbSN+CYBL7kJsJNInOP8UjDDEwdk6Mw60vdLLrr5NHKZhMAOSrR2NZuQ+w=="],
"type-is": ["type-is@2.0.1", "", { "dependencies": { "content-type": "^1.0.5", "media-typer": "^1.1.0", "mime-types": "^3.0.0" } }, "sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw=="],
@@ -382,7 +391,7 @@
"vary": ["vary@1.1.2", "", {}, "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg=="],
- "web-tree-sitter": ["web-tree-sitter@0.26.3", "", {}, "sha512-JIVgIKFS1w6lejxSntCtsS/QsE/ecTS00en809cMxMPxaor6MvUnQ+ovG8uTTTvQCFosSh4MeDdI5bSGw5SoBw=="],
+ "web-tree-sitter": ["web-tree-sitter@0.26.8", "", {}, "sha512-4sUwi7ZyOrIk5KLgYLkc2A/F0LFMQnBhfb+2Cdl7ik4ePJ6JD+fk4ofI2sA5eGawBKBaK4Vntt7Ww5KcEsay4A=="],
"which": ["which@2.0.2", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "./bin/node-which" } }, "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA=="],
@@ -390,10 +399,12 @@
"zod": ["zod@4.2.1", "", {}, "sha512-0wZ1IRqGGhMP76gLqz8EyfBXKk0J2qo2+H3fi4mcUP/KtTocoX08nmIAHl1Z2kJIZbZee8KOpBCSNPRgauucjw=="],
- "zod-to-json-schema": ["zod-to-json-schema@3.25.0", "", { "peerDependencies": { "zod": "^3.25 || ^4" } }, "sha512-HvWtU2UG41LALjajJrML6uQejQhNJx+JBO9IflpSja4R03iNWfKXrj6W2h7ljuLyc1nKS+9yDyL/9tD1U/yBnQ=="],
+ "zod-to-json-schema": ["zod-to-json-schema@3.25.2", "", { "peerDependencies": { "zod": "^3.25.28 || ^4" } }, "sha512-O/PgfnpT1xKSDeQYSCfRI5Gy3hPf91mKVDuYLUHZJMiDFptvP41MSnWofm8dnCm0256ZNfZIM7DSzuSMAFnjHA=="],
"prebuild-install/tar-fs": ["tar-fs@2.1.4", "", { "dependencies": { "chownr": "^1.1.1", "mkdirp-classic": "^0.5.2", "pump": "^3.0.0", "tar-stream": "^2.1.4" } }, "sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ=="],
+ "protobufjs/@types/node": ["@types/node@22.19.3", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-1N9SBnWYOJTrNZCdh/yJE+t910Y128BoyY+zBLWhL3r0TYzlTmFdXrPwHL9DyFZmlEXNQQolTZh3KHV31QDhyA=="],
+
"prebuild-install/tar-fs/tar-stream": ["tar-stream@2.2.0", "", { "dependencies": { "bl": "^4.0.3", "end-of-stream": "^1.4.1", "fs-constants": "^1.0.0", "inherits": "^2.0.3", "readable-stream": "^3.1.1" } }, "sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ=="],
}
}
diff --git a/docs/reference-for-llms.md b/docs/reference-for-llms.md
index 91cf6b9..b2dd8aa 100644
--- a/docs/reference-for-llms.md
+++ b/docs/reference-for-llms.md
@@ -8,8 +8,8 @@ This document contains everything needed to understand, customize, fork, and sel
Claude Matrix is a plugin for Claude Code that adds:
- **Persistent memory** with semantic search (solutions, failures)
-- **Code indexing** for 15 languages
-- **Automatic hooks** for permissions, security, context injection
+- **Code indexing** for 15 languages with `.gitignore` support and content-hash diffing
+- **Automatic hooks** for permissions, security, package auditing
- **Codebase hygiene** via Nuke (dead code, orphans, circular deps)
- **Library docs** via Context7
- **Diagnostics** with auto-fix
@@ -27,7 +27,7 @@ Claude-Matrix/
│ ├── db/
│ │ ├── client.ts # SQLite connection (bun:sqlite)
│ │ ├── schema.ts # Full database schema
-│ │ └── migrate.ts # Schema migrations (v4 for v2.0)
+│ │ └── migrate.ts # Flattened schema migrations (v9)
│ ├── embeddings/
│ │ ├── local.ts # transformers.js embeddings
│ │ └── utils.ts # cosine similarity
@@ -39,59 +39,45 @@ Claude-Matrix/
│ │ ├── reward.ts # matrix_reward
│ │ ├── failure.ts # matrix_failure
│ │ ├── status.ts # matrix_status
-│ │ ├── warn.ts # matrix_warn (v2.0: consolidated)
-│ │ ├── prompt.ts # matrix_prompt
-│ │ ├── doctor.ts # matrix_doctor
-│ │ └── index-tools.ts # Code index tools + find_callers + dead code + circular deps
+│ │ ├── warn.ts # matrix_warn (consolidated)
+│ │ ├── doctor/ # matrix_doctor diagnostics
+│ │ └── index-tools.ts # Code index tools + dead code + circular deps
│ ├── server/
│ │ └── handlers.ts # Tool dispatch handler
│ ├── hooks/
-│ │ ├── index.ts # Hook dispatcher + re-exports
-│ │ ├── format-helpers.ts # Verbosity-aware formatters (v2.0)
-│ │ ├── rule-engine.ts # User-configurable rules (v2.0)
+│ │ ├── index.ts # Hook utilities + re-exports
+│ │ ├── unified-entry.ts # Hook dispatcher
+│ │ ├── format-helpers.ts # Verbosity-aware formatters
+│ │ ├── rule-engine.ts # User-configurable rules
│ │ ├── session-start.ts # Initialize DB, index code
│ │ ├── permission-request.ts # Auto-approve read-only tools
│ │ ├── pre-tool-read.ts # Sensitive file detection
│ │ ├── pre-tool-bash.ts # Package auditing
│ │ ├── pre-tool-edit.ts # File warnings (cursed files)
-│ │ ├── pre-tool-web.ts # Intercept docs lookups → Context7
-│ │ ├── post-tool-bash.ts # Log installs
-│ │ ├── prompt-utils.ts # Prompt analysis utilities
-│ │ ├── subagent-start.ts # Inject Matrix guidance for subagents
-│ │ └── task-completed.ts # Offer to save notable solutions
+│ │ └── pre-tool-web.ts # Intercept docs lookups -> Context7
│ ├── indexer/
-│ │ ├── index.ts # Main indexer
-│ │ ├── analysis.ts # Dead code + circular dependency detection
-│ │ ├── parser.ts # tree-sitter parsing
-│ │ ├── store.ts # Index storage + find_callers
+│ │ ├── index.ts # Main indexer with content hashing
+│ │ ├── scanner.ts # File discovery (.gitignore, symlink safe)
+│ │ ├── diff.ts # Content hash diffing
+│ │ ├── analysis.ts # Dead code + circular deps + tsconfig paths
+│ │ ├── parser.ts # tree-sitter parsing (per-file timeout)
+│ │ ├── store.ts # Index storage + fuzzy search + find_callers
│ │ └── languages/ # Language-specific extractors (15 langs)
-│ ├── dreamer/ # Scheduled task automation
-│ │ ├── index.ts # Dreamer exports
-│ │ ├── store.ts # Task persistence
-│ │ ├── types.ts # Task types
-│ │ ├── cron/ # Cron parsing + humanizer
-│ │ ├── scheduler/ # Native OS schedulers (launchd, crontab)
-│ │ └── actions/ # add, list, run, remove, status, logs, history
-│ ├── jobs/ # Background job system
-│ │ ├── manager.ts # Job lifecycle management
-│ │ └── workers.ts # Job execution + timeouts
│ ├── repo/
│ │ ├── fingerprint.ts # Detect project type
│ │ └── store.ts # Repo CRUD
│ └── config/
-│ └── index.ts # User configuration + verbosity + rules
+│ └── index.ts # User configuration
├── hooks/
│ └── hooks.json # Claude Code hook definitions
├── skills/ # Slash commands (skills format)
-│ ├── list/ # /matrix:list (includes export)
+│ ├── list/ # /matrix:list
│ ├── warn/ # /matrix:warn
│ ├── reindex/ # /matrix:reindex
│ ├── doctor/ # /matrix:doctor
│ ├── review/ # /matrix:review
│ ├── deep-research/ # /matrix:deep-research
-│ ├── nuke/ # /matrix:nuke (v2.2.2)
-│ ├── clone-repo/ # /matrix:clone-repo
-│ └── dreamer/ # /matrix:dreamer (scheduled tasks)
+│ └── nuke/ # /matrix:nuke
├── scripts/
│ ├── run-hooks.sh # Hook runner
│ └── run-mcp.sh # MCP server runner
@@ -102,7 +88,7 @@ Claude-Matrix/
---
-## MCP Tools (22 total, v2.2.2)
+## MCP Tools (18 total)
### Memory Tools
@@ -115,7 +101,7 @@ Claude-Matrix/
| `matrix_status` | Get memory statistics | `readOnlyHint`, `delegable` |
| `matrix_get_solution` | Fetch full solution details by ID | `readOnlyHint`, `delegable` |
-### Warning Tool (v2.0 - Consolidated)
+### Warning Tool
| Tool | Purpose | Annotations |
|------|---------|-------------|
@@ -128,34 +114,24 @@ Claude-Matrix/
| Tool | Purpose | Annotations |
|------|---------|-------------|
| `matrix_find_definition` | Find symbol definition | `readOnlyHint`, `delegable` |
-| `matrix_find_callers` | Find all files that use a symbol (v2.0) | `readOnlyHint`, `delegable` |
-| `matrix_search_symbols` | Search symbols by partial name | `readOnlyHint`, `delegable` |
+| `matrix_find_callers` | Find all files that use a symbol | `readOnlyHint`, `delegable` |
+| `matrix_search_symbols` | Fuzzy search symbols by name (filterable by kind) | `readOnlyHint`, `delegable` |
| `matrix_list_exports` | List exports from file/directory | `readOnlyHint`, `delegable` |
| `matrix_get_imports` | Get imports for a file | `readOnlyHint`, `delegable` |
| `matrix_index_status` | Get index status | `readOnlyHint`, `delegable` |
| `matrix_reindex` | Trigger reindexing | `idempotentHint`, `delegable` |
-| `matrix_find_dead_code` | Find exported symbols with zero callers and orphaned files (v2.2.2) | `readOnlyHint`, `delegable` |
-| `matrix_find_circular_deps` | Build import graph and detect circular dependency chains (v2.2.2) | `readOnlyHint`, `delegable` |
-
-### Scheduling & Jobs
-
-| Tool | Purpose | Annotations |
-|------|---------|-------------|
-| `matrix_dreamer` | Schedule and manage automated Claude tasks | — |
-| `matrix_job_status` | Get status of a background job | `readOnlyHint`, `delegable` |
-| `matrix_job_cancel` | Cancel a running background job | — |
-| `matrix_job_list` | List background jobs by status | `readOnlyHint`, `delegable` |
+| `matrix_find_dead_code` | Find exported symbols with zero callers and orphaned files | `readOnlyHint`, `delegable` |
+| `matrix_find_circular_deps` | Detect circular dependency chains | `readOnlyHint`, `delegable` |
### Other Tools
| Tool | Purpose | Annotations |
|------|---------|-------------|
-| `matrix_prompt` | Analyze prompt for ambiguity | `readOnlyHint` |
| `matrix_doctor` | Run diagnostics and auto-fix | `idempotentHint` |
### Delegable Tools (for Haiku sub-agents)
-These 12 tools are marked with `_meta.delegable: true`:
+These tools are marked with `_meta.delegable: true`:
```
matrix_recall, matrix_reward, matrix_status
matrix_warn (all actions)
@@ -168,14 +144,13 @@ matrix_find_dead_code, matrix_find_circular_deps
**Not delegable** (require Opus reasoning):
- `matrix_store` - needs judgment on what to store
- `matrix_failure` - needs root cause analysis
-- `matrix_prompt` - meta-analysis of prompts
- `matrix_doctor` - diagnostics interpretation
---
## Database Schema
-SQLite database at `~/.claude/matrix/matrix.db`
+SQLite database at `~/.claude/matrix/matrix.db`. Schema version 9.
### Core Tables
@@ -221,17 +196,6 @@ CREATE TABLE failures (
created_at TEXT
);
--- Repos (project fingerprints)
-CREATE TABLE repos (
- id TEXT PRIMARY KEY,
- name TEXT NOT NULL,
- path TEXT,
- languages JSON DEFAULT '[]',
- frameworks JSON DEFAULT '[]',
- patterns JSON DEFAULT '[]',
- fingerprint_embedding BLOB
-);
-
-- Warnings (file/package grudges)
CREATE TABLE warnings (
id TEXT PRIMARY KEY,
@@ -248,13 +212,13 @@ CREATE TABLE warnings (
### Code Index Tables
```sql
--- Indexed files
+-- Indexed files (with content hash for reliable change detection)
CREATE TABLE repo_files (
id INTEGER PRIMARY KEY,
repo_id TEXT NOT NULL,
file_path TEXT NOT NULL,
mtime INTEGER NOT NULL,
- hash TEXT,
+ hash TEXT, -- SHA-256 content hash
indexed_at TEXT
);
@@ -274,7 +238,7 @@ CREATE TABLE symbols (
signature TEXT
);
--- Imports
+-- Imports (including re-exports tracked by TS parser)
CREATE TABLE imports (
id INTEGER PRIMARY KEY,
file_id INTEGER NOT NULL,
@@ -296,66 +260,12 @@ Defined in `hooks/hooks.json`:
| Hook | Trigger | What it does |
|------|---------|--------------|
-| `SessionStart` | Session begins | Init DB, auto-create config, index code (15 languages) |
+| `SessionStart` | Session begins | Init DB, auto-create config, index code |
| `PermissionRequest` | Tool permission asked | Auto-approve read-only tools (configurable) |
| `PreToolUse:Read` | Before reading file | Detect sensitive files (.env, keys, secrets) |
| `PreToolUse:Bash` | Before shell command | Audit packages (CVEs, deprecation, size) |
| `PreToolUse:Edit/Write` | Before file edit | Check for file warnings (cursed files) |
-| `PreToolUse:WebFetch/WebSearch` | Before web lookup | Intercept library docs → Context7 |
-| `PostToolUse:Bash` | After shell command | Log package installations |
-| `SubagentStart` | Subagent spawns | Inject Matrix guidance (prefer index tools, Context7) |
-| `TaskCompleted` | Task finishes | Suggest storing notable solutions |
-
-### Hook Configuration
-
-Each hook can be configured in `~/.claude/matrix/matrix.config`:
-
-```json
-{
- "hooks": {
- "permissions": {
- "autoApproveReadOnly": true,
- "autoApprove": {
- "coreRead": true, // Read, Glob, Grep
- "web": true, // WebFetch, WebSearch
- "matrixRead": true, // matrix_recall, status, find_definition, etc.
- "context7": true // resolve-library-id, query-docs
- },
- "neverAutoApprove": ["matrix_store", "matrix_warn_add"],
- "additionalAutoApprove": []
- },
- "sensitiveFiles": {
- "enabled": true,
- "behavior": "ask", // warn, block, ask, disabled
- "patterns": {
- "envFiles": true,
- "keysAndCerts": true,
- "secretDirs": true,
- "configFiles": true,
- "passwordFiles": true,
- "cloudCredentials": true
- },
- "customPatterns": [],
- "allowList": [".env.example", ".env.template"]
- },
- "packageAuditor": {
- "enabled": true,
- "behavior": "ask",
- "checks": {
- "cve": true,
- "deprecated": true,
- "bundleSize": true,
- "localWarnings": true
- },
- "blockOnCriticalCVE": true
- },
- "cursedFiles": {
- "enabled": true,
- "behavior": "ask"
- }
- }
-}
-```
+| `PreToolUse:WebFetch/WebSearch` | Before web lookup | Intercept library docs -> Context7 |
---
@@ -370,34 +280,21 @@ File: `~/.claude/matrix/matrix.config` (auto-created on first run)
"defaultMinScore": 0.3,
"defaultScope": "all"
},
- "merge": {
- "defaultThreshold": 0.8
- },
- "list": {
- "defaultLimit": 20
- },
- "export": {
- "defaultDirectory": "~/Downloads",
- "defaultFormat": "json"
- },
- "display": {
- "colors": true,
- "boxWidth": 55,
- "cardWidth": 70,
- "truncateLength": 40
- },
- "scoring": {
- "highThreshold": 0.7,
- "midThreshold": 0.4
- },
"hooks": {
"enabled": true,
- "complexityThreshold": 5,
- "permissions": { ... },
- "sensitiveFiles": { ... },
- "packageAuditor": { ... },
- "cursedFiles": { ... },
- "promptAnalysis": { ... }
+ "verbosity": "compact",
+ "permissions": {
+ "autoApproveReadOnly": true,
+ "autoApprove": {
+ "coreRead": true,
+ "web": true,
+ "matrixRead": true,
+ "context7": true
+ }
+ },
+ "sensitiveFiles": { "enabled": true, "behavior": "ask" },
+ "packageAuditor": { "enabled": true, "behavior": "ask" },
+ "cursedFiles": { "enabled": true, "behavior": "ask" }
},
"indexing": {
"enabled": true,
@@ -410,7 +307,8 @@ File: `~/.claude/matrix/matrix.config` (auto-created on first run)
"enabled": true,
"preferMatrixIndex": true,
"preferContext7": true
- }
+ },
+ "delegation": { "enabled": true, "model": "haiku" }
}
```
@@ -451,70 +349,62 @@ Tree-sitter grammars downloaded on first use (~1-2MB each):
`function`, `class`, `interface`, `type`, `enum`, `variable`, `const`, `method`, `property`, `namespace`
----
+### Parser Features (v2.4.0)
-## Self-Hosting Guide
+- Per-file parse timeout (10s) prevents hangs
+- Python: decorator extraction (@property, @staticmethod, @classmethod, @dataclass)
+- TypeScript: re-export tracking (export { X } from, export * from)
+- Content hash diffing for reliable incremental indexing
+- .gitignore respected during scanning
+- tsconfig path alias resolution for import graph accuracy
-### Fork & Customize
+---
-1. Fork the repo:
-```bash
-git clone https://github.com/ojowwalker77/Claude-Matrix
-cd Claude-Matrix
-```
+## Skills (Slash Commands)
-2. Install dependencies:
-```bash
-bun install
-```
+| Command | Purpose |
+|---------|---------|
+| `/matrix:review` | 5-phase code review with impact analysis |
+| `/matrix:deep-research` | Multi-source research aggregation |
+| `/matrix:nuke` | Codebase hygiene analysis (dead code, orphans, circular deps) |
+| `/matrix:doctor` | Diagnostics and auto-fix |
+| `/matrix:list` | View solutions, stats, warnings |
+| `/matrix:warn` | Manage file/package warnings |
+| `/matrix:reindex` | Rebuild code index |
-3. Customize:
-- Edit `src/tools/schemas.ts` to add/modify tools
-- Edit `src/tools/validation.ts` for input validation
-- Edit `src/hooks/*.ts` to customize hook behavior
-- Edit `src/config/index.ts` for default settings
-- Edit `hooks/hooks.json` to enable/disable hooks
+---
-4. Test:
-```bash
-bun test
-bun run lint
-```
+## Data Locations
-5. Test locally:
-```bash
-claude --plugin-dir /path/to/your-fork
+```
+~/.claude/matrix/
+├── matrix.db # SQLite database (all data)
+├── matrix.config # Configuration file (auto-created)
+├── models/ # Embedding model cache (~23MB)
+├── grammars/ # Tree-sitter WASM files (~1-2MB each)
+└── .initialized # Version marker
```
-### Create Your Own Plugin
+---
-1. Update `.claude-plugin/plugin.json`:
-```json
-{
- "name": "your-matrix",
- "description": "Your custom Matrix",
- "version": "1.0.0",
- "author": { "name": "Your Name" },
- "repository": "https://github.com/you/your-matrix"
-}
-```
+## Self-Hosting Guide
-2. Update `.claude-plugin/marketplace.json`:
-```json
-{
- "id": "you/your-matrix",
- "name": "your-matrix",
- "displayName": "Your Matrix",
- "publisher": "you"
-}
-```
+### Fork and Customize
-3. Push to GitHub
+1. Clone:
+```bash
+git clone https://github.com/ojowwalker77/Claude-Matrix
+cd Claude-Matrix && bun install
+```
-4. Install your version:
+2. Test:
+```bash
+bun test && bun run lint
```
-/plugin marketplace add you/your-matrix
-/plugin install your-matrix@you-your-matrix
+
+3. Run locally:
+```bash
+claude --plugin-dir /path/to/your-fork
```
### Key Customization Points
@@ -528,55 +418,6 @@ claude --plugin-dir /path/to/your-fork
| Change hook behavior | `src/hooks/*.ts` | Modify automatic actions |
| Disable a hook | `hooks/hooks.json` | Remove hook entry |
| Change defaults | `src/config/index.ts` | Modify DEFAULT_CONFIG |
-| Add new config option | `src/config/index.ts` | Add to interface and defaults |
-
-### Adding a New Hook
-
-1. Create `src/hooks/your-hook.ts`:
-```typescript
-import { getConfig } from '../config/index.js';
-
-export async function yourHook(input: any): Promise<{ continue: boolean; message?: string }> {
- const config = getConfig();
- // Your logic here
- return { continue: true };
-}
-```
-
-2. Register in `src/hooks/index.ts`
-
-3. Add to `hooks/hooks.json`:
-```json
-{
- "hooks": [
- {
- "matcher": "YourTrigger",
- "hooks": [{
- "type": "command",
- "command": "bun run hooks/your-hook.ts"
- }]
- }
- ]
-}
-```
-
-### Adding a New Language
-
-1. Create `src/indexer/languages/yourlang.ts`:
-```typescript
-import { LanguageParser } from './base.js';
-import type { ParseResult } from '../types.js';
-
-export class YourLangParser extends LanguageParser {
- parse(filePath: string, content: string): ParseResult {
- // Extract symbols and imports using tree-sitter
- }
-}
-```
-
-2. Register in `src/indexer/languages/index.ts`
-
-3. Add grammar URL to `src/indexer/parser.ts`
---
@@ -591,76 +432,6 @@ export class YourLangParser extends LanguageParser {
---
-## Performance Optimizations
-
-### Token Savings
-- Compact JSON output (~10-15% reduction)
-- Haiku-delegable tools for simple operations (12 tools)
-
-### MCP Annotations
-All tools have official hints:
-- `readOnlyHint` - No side effects
-- `idempotentHint` - Safe to retry
-- `destructiveHint` - Deletes data (1 tool)
-
-### MCP Server Instructions
-Surfaced to LLM via `_meta.instructions`:
-- Prefer Matrix index tools over grep/bash
-- Prefer Context7 over WebFetch/WebSearch for docs
-- Delegation guidance for subagents
-
----
-
-## Skills (Slash Commands)
-
-| Command | Purpose |
-|---------|---------|
-| `/matrix:review` | 5-phase code review with impact analysis |
-| `/matrix:deep-research` | Multi-source research aggregation |
-| `/matrix:nuke` | Codebase hygiene analysis (dead code, orphans, circular deps) |
-| `/matrix:doctor` | Diagnostics + auto-fix |
-| `/matrix:list` | View solutions, stats, warnings (includes export) |
-| `/matrix:warn` | Manage file/package warnings |
-| `/matrix:reindex` | Rebuild code index |
-| `/matrix:clone-repo` | Clone external repos for exploration |
-| `/matrix:dreamer` | Schedule and manage automated Claude tasks |
-
----
-
-## Data Locations
-
-```
-~/.claude/matrix/
-├── matrix.db # SQLite database (all data)
-├── matrix.config # Configuration file (auto-created)
-├── models/ # Embedding model cache (~23MB)
-├── grammars/ # Tree-sitter WASM files (~1-2MB each)
-└── .initialized # Version marker
-```
-
----
-
-## Diagnostics (matrix_doctor)
-
-Checks performed:
-1. **Directory** - ~/.claude/matrix/ exists and writable
-2. **Database** - matrix.db exists, not corrupted, schema valid
-3. **Config** - matrix.config parseable, valid structure
-4. **Hooks** - hooks.json valid, scripts executable
-5. **Code Index** - Repository detected, index populated
-6. **Repo Detection** - Project files found (package.json, go.mod, etc.)
-
-Auto-fixes (data-safe):
-- Create missing directories
-- Run schema migrations
-- Reset corrupted config (NOT database)
-- Rebuild code index
-
-**Never auto-fixed** (requires user action):
-- Corrupted database (preserves user data)
-
----
-
## License
-MIT - Fork freely, customize as needed.
+MIT
diff --git a/hooks/hooks.json b/hooks/hooks.json
index 1d4ee52..5e6c9b5 100644
--- a/hooks/hooks.json
+++ b/hooks/hooks.json
@@ -84,40 +84,6 @@
}
]
}
- ],
- "PostToolUse": [
- {
- "matcher": "Bash",
- "hooks": [
- {
- "type": "command",
- "command": "${CLAUDE_PLUGIN_ROOT}/scripts/run-hooks.sh post-tool-bash",
- "timeout": 10
- }
- ]
- }
- ],
- "SubagentStart": [
- {
- "hooks": [
- {
- "type": "command",
- "command": "${CLAUDE_PLUGIN_ROOT}/scripts/run-hooks.sh subagent-start",
- "timeout": 10
- }
- ]
- }
- ],
- "TaskCompleted": [
- {
- "hooks": [
- {
- "type": "command",
- "command": "${CLAUDE_PLUGIN_ROOT}/scripts/run-hooks.sh task-completed",
- "timeout": 10
- }
- ]
- }
]
}
}
diff --git a/package.json b/package.json
index deb8f9f..9eea1f7 100644
--- a/package.json
+++ b/package.json
@@ -35,18 +35,15 @@
"lint:fix": "oxlint src/ --fix"
},
"dependencies": {
- "@anthropic-ai/sdk": "^0.71.2",
- "@modelcontextprotocol/sdk": "^1.0.0",
- "@sinclair/typebox": "^0.34.45",
+ "@modelcontextprotocol/sdk": "^1.29.0",
+ "@sinclair/typebox": "^0.34.49",
"@xenova/transformers": "^2.17.2",
- "croner": "^8.1.2",
- "cronstrue": "^2.52.0",
- "web-tree-sitter": "^0.26.3"
+ "web-tree-sitter": "^0.26.8"
},
"devDependencies": {
- "@types/bun": "^1.3.4",
- "@types/node": "^22.10.2",
- "oxlint": "^1.35.0",
+ "@types/bun": "^1.3.11",
+ "@types/node": "^22.19.15",
+ "oxlint": "^1.58.0",
"typescript": "^5.7.2"
},
"engines": {
diff --git a/skills/clone-repo/SKILL.md b/skills/clone-repo/SKILL.md
deleted file mode 100644
index a7d157b..0000000
--- a/skills/clone-repo/SKILL.md
+++ /dev/null
@@ -1,34 +0,0 @@
----
-name: Clone Repo
-description: This skill should be used when the user asks to "clone a repo", "fetch code from github", "get external code", "look at a repo", or needs to explore an external repository for code examples and patterns.
-user-invocable: true
-allowed-tools:
- - Bash
- - Read
- - Glob
- - Grep
----
-
-# Clone Repo
-
-Clone an external repository for exploration and code reference.
-
-## Usage
-
-Parse user arguments: `<repo> [branch]`
-
-- **repo**: GitHub `owner/repo` or full URL
-- **branch** (optional): specific branch to clone
-
-## Process
-
-1. Create `temp/` directory in project root (gitignored)
-2. Shallow clone: `git clone --depth 1 [--branch <branch>] <repo> temp/`
-3. Explore, read, and search the cloned code as needed
-4. When done, clean up: `rm -rf temp/`
-
-## Notes
-
-- Always use `--depth 1` to minimize download size
-- Clean up when finished — don't leave cloned repos around
-- If `temp/` doesn't exist in `.gitignore`, add it
diff --git a/skills/deep-research/references/research-pipeline.md b/skills/deep-research/references/research-pipeline.md
index 8ea00d1..37f7764 100644
--- a/skills/deep-research/references/research-pipeline.md
+++ b/skills/deep-research/references/research-pipeline.md
@@ -26,7 +26,7 @@ Collect information from multiple sources in parallel:
- Include related failures to avoid
### 4. GitHub Repositories
-- For implementation patterns, use `/matrix:clone-repo` to clone and explore authoritative repos
+- For implementation patterns, clone and explore authoritative repos via `git clone --depth 1`
### 5. Additional Sources (for exhaustive depth)
- GitHub Issues/Discussions via `gh` CLI
diff --git a/skills/dreamer/SKILL.md b/skills/dreamer/SKILL.md
deleted file mode 100644
index 17dbae8..0000000
--- a/skills/dreamer/SKILL.md
+++ /dev/null
@@ -1,70 +0,0 @@
----
-name: Matrix Dreamer
-description: This skill should be used when the user asks to "schedule a task", "add scheduled task", "list scheduled tasks", "run task now", "remove scheduled task", "task scheduler", "dreamer", "cron task", "automate with claude", or needs to manage scheduled Claude Code tasks.
-user-invocable: true
-agent: haiku
-allowed-tools:
- - mcp__plugin_matrix_matrix__matrix_dreamer
----
-
-# Matrix Dreamer - Scheduled Task Automation
-
-Schedule and manage automated Claude Code tasks using native OS schedulers (launchd on macOS, crontab on Linux).
-
-Parse user arguments from the skill invocation (text after the trigger phrase).
-
-Use the `matrix_dreamer` tool with the appropriate action parameter.
-
-## CRITICAL: One-Time vs Recurring
-
-**BEFORE using `action: "add"`, you MUST clarify with the user:**
-- "Do you want this to run ONCE or RECURRING (daily/weekly/etc)?"
-
-Natural language like "at 1am" or "tonight at 3pm" is converted to cron expressions, which are **always recurring**. Users often expect one-time execution but get daily tasks instead.
-
-- **One-time execution:** Use `action: "run"` for immediate execution
-- **Recurring tasks:** Use `action: "add"` only after confirming recurring intent
-
-## Actions
-
-**List tasks** (default, or "list"):
-Use `matrix_dreamer` with `action: "list"`.
-- Optional: `limit` to cap results
-- Optional: `tag` to filter by tag
-
-**Add task** ("add", "schedule"):
-Use `matrix_dreamer` with `action: "add"` and:
-- `name`: Task name (required, max 100 chars)
-- `schedule`: Cron expression or natural language (e.g., "every day at 9am")
-- `command`: Claude prompt or /command to execute
-- Optional: `description`, `workingDirectory`, `timeout`, `timezone`, `tags`, `skipPermissions`
-- Optional: `worktree: { enabled: true }` for isolated git worktree execution
-- Optional: `env` for environment variables
-
-**Run immediately** ("run <taskId>"):
-Use `matrix_dreamer` with `action: "run"` and `taskId`.
-
-**Remove task** ("remove <taskId>", "delete <taskId>"):
-Use `matrix_dreamer` with `action: "remove"` and `taskId`.
-
-**System status** ("status"):
-Use `matrix_dreamer` with `action: "status"`.
-
-**View logs** ("logs <taskId>"):
-Use `matrix_dreamer` with `action: "logs"` and `taskId`.
-- Optional: `lines` (default: 50)
-- Optional: `stream`: "stdout", "stderr", or "both"
-
-**Execution history** ("history"):
-Use `matrix_dreamer` with `action: "history"`.
-- Optional: `limit` to cap results
-
-## Examples
-
-- `/matrix:dreamer` - list all scheduled tasks
-- `/matrix:dreamer add "daily-review" every day at 9am "/matrix:review staged"`
-- `/matrix:dreamer run task_abc123`
-- `/matrix:dreamer status`
-- `/matrix:dreamer logs task_abc123`
-- `/matrix:dreamer remove task_abc123`
-- `/matrix:dreamer history`
diff --git a/skills/runmd/SKILL.md b/skills/runmd/SKILL.md
deleted file mode 100644
index 8f9987b..0000000
--- a/skills/runmd/SKILL.md
+++ /dev/null
@@ -1,63 +0,0 @@
----
-name: RunMD
-description: This skill should be used when the user asks to "run markdown", "execute markdown", "run playbook", "runmd", "test markdown", "execute code blocks", "run shell blocks", or needs to execute shell code blocks from markdown files.
-user-invocable: true
-allowed-tools:
- - Bash
- - Read
- - Glob
- - Grep
----
-
-# RunMD
-
-Execute shell code blocks from markdown files using [runmd](https://github.com/ojowwalker77/runmd).
-
-## Usage
-
-Parse user arguments: `<file> [options]`
-
-- **file**: Path to a markdown file containing shell code blocks
-- **--headless**: Run in non-interactive mode (default when invoked from here)
-- **--fail-fast**: Stop on first block failure
-- **--blocks 0,2,5**: Run only specific block indices (0-based)
-
-## Process
-
-1. Confirm `runmd` is installed: `which runmd || bun install -g runmd`
-2. If no file specified, search for markdown files with shell blocks:
- ```
- Find *.md files, scan for ```bash/```sh/```zsh/```shell code fences
- ```
-3. Show the user which shell blocks exist in the file (list them with indices)
-4. Execute: `runmd run <file> [--fail-fast] [--blocks <indices>]`
-5. Report results — pass/fail per block with exit codes
-
-## Shell Block Detection
-
-Executable code fences use these languages:
-- ```bash
-- ```sh
-- ```zsh
-- ```shell
-
-Other code blocks (js, python, etc.) are display-only and not executed.
-
-## Environment Variables
-
-runmd auto-loads `.env` files from the markdown file's directory. Variables referenced as `${VAR_NAME}` in shell blocks are substituted before execution.
-
-## Modes
-
-| Mode | Command | Use Case |
-|------|---------|----------|
-| Headless | `runmd run <file>` | CI/CD, automation, scripted execution |
-| Interactive | `runmd <file>` | Manual exploration, editing, step-by-step |
-
-Default to **headless** mode (`runmd run`) when executing from this skill. Suggest **interactive** mode if the user wants to explore or edit.
-
-## Notes
-
-- Exit code 1 if any block fails in headless mode
-- `--fail-fast` stops at the first failure instead of running all blocks
-- `--blocks` accepts comma-separated 0-based indices to run selectively
diff --git a/src/__tests__/prompt-utils.test.ts b/src/__tests__/prompt-utils.test.ts
deleted file mode 100644
index 24a2948..0000000
--- a/src/__tests__/prompt-utils.test.ts
+++ /dev/null
@@ -1,121 +0,0 @@
-import { describe, test, expect, beforeEach, afterEach } from 'bun:test';
-import { mkdirSync, rmSync, writeFileSync } from 'fs';
-import { join } from 'path';
-import { tmpdir } from 'os';
-import { getGitContextData } from '../hooks/prompt-utils.js';
-
-describe('getGitContextData', () => {
- let testDir: string;
- let gitDir: string;
-
- beforeEach(() => {
- // Create unique test directory
- testDir = join(tmpdir(), `matrix-test-${Date.now()}-${Math.random().toString(36).slice(2)}`);
- mkdirSync(testDir, { recursive: true });
- });
-
- afterEach(() => {
- // Cleanup test directory
- try {
- rmSync(testDir, { recursive: true, force: true });
- } catch {
- // Ignore cleanup errors
- }
- });
-
- test('returns null branch for non-git directory', async () => {
- const result = await getGitContextData(testDir);
-
- expect(result.branch).toBeNull();
- expect(result.commits).toEqual([]);
- expect(result.changedFiles).toEqual([]);
- });
-
- test('returns branch/commits/changedFiles for git repo', async () => {
- // Initialize git repo
- gitDir = testDir;
- const gitInit = Bun.spawnSync(['git', 'init'], { cwd: gitDir });
- expect(gitInit.exitCode).toBe(0);
-
- // Configure git for test
- Bun.spawnSync(['git', 'config', 'user.email', 'test@test.com'], { cwd: gitDir });
- Bun.spawnSync(['git', 'config', 'user.name', 'Test User'], { cwd: gitDir });
-
- // Create and commit a file
- writeFileSync(join(gitDir, 'test.txt'), 'test content');
- Bun.spawnSync(['git', 'add', 'test.txt'], { cwd: gitDir });
- Bun.spawnSync(['git', 'commit', '-m', 'Initial commit'], { cwd: gitDir });
-
- // Create a modified file
- writeFileSync(join(gitDir, 'modified.txt'), 'modified content');
-
- const result = await getGitContextData(gitDir);
-
- // Should have branch (master or main depending on git version)
- expect(result.branch).not.toBeNull();
- expect(['main', 'master']).toContain(result.branch as string);
-
- // Should have at least one commit
- expect(result.commits.length).toBeGreaterThan(0);
- expect(result.commits[0]).toContain('Initial commit');
-
- // Should show untracked file
- expect(result.changedFiles.length).toBeGreaterThan(0);
- });
-
- test('handles git command failures gracefully', async () => {
- // Use a directory that exists but isn't a git repo
- // Calling git commands should fail but not throw
-
- const result = await getGitContextData(testDir);
-
- // Should return empty/null values, not throw
- expect(result).toBeDefined();
- expect(result.branch).toBeNull();
- expect(result.commits).toEqual([]);
- expect(result.changedFiles).toEqual([]);
- });
-
- test('handles non-existent directory gracefully', async () => {
- const nonExistentDir = join(testDir, 'does-not-exist');
-
- const result = await getGitContextData(nonExistentDir);
-
- // Should return empty values, not throw
- expect(result).toBeDefined();
- expect(result.branch).toBeNull();
- });
-
- test('returns empty strings/arrays for empty repo (no commits)', async () => {
- gitDir = testDir;
- Bun.spawnSync(['git', 'init'], { cwd: gitDir });
-
- const result = await getGitContextData(gitDir);
-
- // Branch might be null or empty in a fresh repo with no commits
- // Commits should be empty
- expect(result.commits).toEqual([]);
- });
-
- test('limits changed files to 10', async () => {
- gitDir = testDir;
- Bun.spawnSync(['git', 'init'], { cwd: gitDir });
- Bun.spawnSync(['git', 'config', 'user.email', 'test@test.com'], { cwd: gitDir });
- Bun.spawnSync(['git', 'config', 'user.name', 'Test User'], { cwd: gitDir });
-
- // Create initial commit
- writeFileSync(join(gitDir, 'init.txt'), 'init');
- Bun.spawnSync(['git', 'add', '.'], { cwd: gitDir });
- Bun.spawnSync(['git', 'commit', '-m', 'init'], { cwd: gitDir });
-
- // Create 15 untracked files
- for (let i = 0; i < 15; i++) {
- writeFileSync(join(gitDir, `file${i}.txt`), `content ${i}`);
- }
-
- const result = await getGitContextData(gitDir);
-
- // Should be limited to 10 files
- expect(result.changedFiles.length).toBeLessThanOrEqual(10);
- });
-});
diff --git a/src/__tests__/workers.test.ts b/src/__tests__/workers.test.ts
deleted file mode 100644
index 6566737..0000000
--- a/src/__tests__/workers.test.ts
+++ /dev/null
@@ -1,100 +0,0 @@
-import { describe, test, expect, beforeEach, afterEach } from 'bun:test';
-import { clearJobTimeout, clearAllJobTimeouts, _jobTimeoutsForTesting } from '../jobs/workers.js';
-
-describe('timeout tracking', () => {
- beforeEach(() => {
- // Clear map before each test
- _jobTimeoutsForTesting.clear();
- });
-
- afterEach(() => {
- // Cleanup any remaining timeouts
- for (const [, timeout] of _jobTimeoutsForTesting) {
- clearTimeout(timeout);
- }
- _jobTimeoutsForTesting.clear();
- });
-
- test('clearJobTimeout removes timeout from map and returns true', () => {
- const jobId = 'test_job_001';
- const timeout = setTimeout(() => {}, 10000);
- _jobTimeoutsForTesting.set(jobId, timeout);
-
- expect(_jobTimeoutsForTesting.has(jobId)).toBe(true);
-
- const result = clearJobTimeout(jobId);
-
- expect(result).toBe(true);
- expect(_jobTimeoutsForTesting.has(jobId)).toBe(false);
- });
-
- test('clearJobTimeout returns false for unknown jobId', () => {
- const result = clearJobTimeout('nonexistent_job');
-
- expect(result).toBe(false);
- });
-
- test('clearJobTimeout clears the actual timeout', () => {
- const jobId = 'test_job_002';
- let timeoutFired = false;
-
- const timeout = setTimeout(() => {
- timeoutFired = true;
- }, 50);
- _jobTimeoutsForTesting.set(jobId, timeout);
-
- clearJobTimeout(jobId);
-
- // Wait longer than the timeout would have fired
- return new Promise((resolve) => {
- setTimeout(() => {
- expect(timeoutFired).toBe(false);
- resolve();
- }, 100);
- });
- });
-
- test('clearAllJobTimeouts clears all timeouts and returns count', () => {
- // Add multiple timeouts
- for (let i = 0; i < 5; i++) {
- const timeout = setTimeout(() => {}, 10000);
- _jobTimeoutsForTesting.set(`job_${i}`, timeout);
- }
-
- expect(_jobTimeoutsForTesting.size).toBe(5);
-
- const cleared = clearAllJobTimeouts();
-
- expect(cleared).toBe(5);
- expect(_jobTimeoutsForTesting.size).toBe(0);
- });
-
- test('clearAllJobTimeouts returns 0 when no timeouts exist', () => {
- const cleared = clearAllJobTimeouts();
-
- expect(cleared).toBe(0);
- });
-
- test('clearAllJobTimeouts prevents all callbacks from firing', () => {
- const firedCallbacks: string[] = [];
-
- // Add multiple timeouts
- for (let i = 0; i < 3; i++) {
- const jobId = `job_${i}`;
- const timeout = setTimeout(() => {
- firedCallbacks.push(jobId);
- }, 50);
- _jobTimeoutsForTesting.set(jobId, timeout);
- }
-
- clearAllJobTimeouts();
-
- // Wait longer than any timeout would have fired
- return new Promise<void>((resolve) => {
- setTimeout(() => {
- expect(firedCallbacks.length).toBe(0);
- resolve();
- }, 150);
- });
- });
-});
diff --git a/src/config/index.ts b/src/config/index.ts
index fd4cfbc..58b2f08 100644
--- a/src/config/index.ts
+++ b/src/config/index.ts
@@ -74,21 +74,6 @@ export interface CursedFilesConfig {
behavior: 'warn' | 'block' | 'ask';
}
-// ═══════════════════════════════════════════════════════════════
-// Prompt Analysis Hook Config
-// ═══════════════════════════════════════════════════════════════
-export interface PromptAnalysisConfig {
- enabled: boolean;
- shortcuts: { enabled: boolean };
- codeNavigation: { enabled: boolean };
- memoryInjection: {
- enabled: boolean;
- maxSolutions: number;
- maxFailures: number;
- minScore: number;
- };
-}
-
// ═══════════════════════════════════════════════════════════════
// Code Review Config
// ═══════════════════════════════════════════════════════════════
@@ -101,30 +86,6 @@ export interface GitCommitReviewConfig {
autoRun: boolean;
}
-// ═══════════════════════════════════════════════════════════════
-// Dreamer Config (Scheduled Task Automation)
-// ═══════════════════════════════════════════════════════════════
-export interface DreamerWorktreeConfig {
- /** Default branch prefix for worktree branches */
- defaultBranchPrefix: string;
- /** Default remote for pushing worktree branches */
- defaultRemote: string;
- /** Optional custom base path for worktrees */
- defaultBasePath?: string;
-}
-
-export interface DreamerExecutionConfig {
- /** Default timeout in seconds */
- defaultTimeout: number;
- /** Default value for skip permissions flag */
- defaultSkipPermissions: boolean;
-}
-
-export interface DreamerConfig {
- worktree: DreamerWorktreeConfig;
- execution: DreamerExecutionConfig;
-}
-
// ═══════════════════════════════════════════════════════════════
// User-Configurable Rules (v2.0)
// ═══════════════════════════════════════════════════════════════
@@ -170,7 +131,6 @@ export interface HooksConfig {
stop: StopHookConfig;
packageAuditor: PackageAuditorConfig;
cursedFiles: CursedFilesConfig;
- promptAnalysis: PromptAnalysisConfig;
gitCommitReview: GitCommitReviewConfig;
// v2.0 User Rules
userRules: UserRulesConfig;
@@ -226,17 +186,6 @@ export interface MatrixConfig {
/** Model for delegable tools: 'haiku' (cheaper) or 'sonnet' (more capable) */
model: 'haiku' | 'sonnet';
};
- /** Dreamer scheduled task automation settings */
- dreamer: DreamerConfig;
- /** Local HTTP dashboard settings */
- dashboard: {
- /** Enable the local dashboard HTTP server */
- enabled: boolean;
- /** TCP port to listen on (default: 4444) */
- port: number;
- /** Host to bind — 127.0.0.1 keeps it loopback-only */
- host: string;
- };
}
function getDownloadsDirectory(): string {
@@ -338,19 +287,6 @@ export const DEFAULT_CONFIG: MatrixConfig = {
behavior: 'ask' as const,
},
- // ─── Prompt Analysis (UserPromptSubmit hook) ───
- promptAnalysis: {
- enabled: true,
- shortcuts: { enabled: true },
- codeNavigation: { enabled: true },
- memoryInjection: {
- enabled: true,
- maxSolutions: 3,
- maxFailures: 2,
- minScore: 0.35,
- },
- },
-
// ─── Code Review Config ───
// Controls /matrix:review behavior and commit suggestions
gitCommitReview: {
@@ -408,22 +344,6 @@ export const DEFAULT_CONFIG: MatrixConfig = {
enabled: true,
model: 'haiku' as const, // Use haiku for cheaper read-only operations
},
- dreamer: {
- worktree: {
- defaultBranchPrefix: 'matrix-dreamer/',
- defaultRemote: 'origin',
- defaultBasePath: undefined,
- },
- execution: {
- defaultTimeout: 300, // 5 minutes
- defaultSkipPermissions: false,
- },
- },
- dashboard: {
- enabled: true,
- port: 4444,
- host: '127.0.0.1',
- },
};
let cachedConfig: MatrixConfig | null = null;
diff --git a/src/db/migrate.ts b/src/db/migrate.ts
index 2f730ee..1a6ddc7 100644
--- a/src/db/migrate.ts
+++ b/src/db/migrate.ts
@@ -4,303 +4,246 @@ import { homedir } from 'os';
import { SCHEMA_SQL } from './schema.js';
// Schema version - increment when schema changes
-export const SCHEMA_VERSION = 8;
+export const SCHEMA_VERSION = 9;
-// Migration definitions - each migration upgrades from (version - 1) to version
-const migrations: Record<number, string> = {
- // v1 -> v2: Added hooks integration tables
- 2: `
- -- Warnings for files and packages
+function getDbPath(): string {
+ return process.env['MATRIX_DB'] || join(homedir(), '.claude', 'matrix', 'matrix.db');
+}
+
+/**
+ * Detect current schema version.
+ * - Has schema_version table → read it
+ * - Has solutions table but no schema_version → legacy DB (return 1)
+ * - Empty DB → fresh (return 0)
+ */
+function getCurrentVersion(db: Database): number {
+ try {
+ const hasVersionTable = db.query(`
+ SELECT name FROM sqlite_master WHERE type='table' AND name='schema_version'
+ `).get();
+
+ if (hasVersionTable) {
+ const row = db.query('SELECT MAX(version) as version FROM schema_version').get() as { version: number } | null;
+ return row?.version || 0;
+ }
+
+ // No version table — check if this is an existing DB
+ const hasSolutions = db.query(`
+ SELECT name FROM sqlite_master WHERE type='table' AND name='solutions'
+ `).get();
+
+ if (hasSolutions) {
+ // Legacy DB without version tracking — create the table and mark as v1
+ db.exec(`
+ CREATE TABLE schema_version (
+ version INTEGER PRIMARY KEY,
+ applied_at TEXT DEFAULT (datetime('now'))
+ )
+ `);
+ db.exec('INSERT INTO schema_version (version) VALUES (1)');
+ return 1;
+ }
+
+ // Fresh DB
+ return 0;
+ } catch {
+ return 0;
+ }
+}
+
+/**
+ * Check if a column exists on a table
+ */
+function hasColumn(db: Database, table: string, column: string): boolean {
+ const cols = db.query(`PRAGMA table_info(${table})`).all() as { name: string }[];
+ return cols.some(c => c.name === column);
+}
+
+/**
+ * Single idempotent migration that brings any legacy DB (v1-v8) to v9.
+ * Uses IF NOT EXISTS / column checks so it's safe to run on any version.
+ */
+function runLegacyUpgrade(db: Database): void {
+ // ── Create tables that may be missing ──────────────────────────────
+
+ db.exec(`
CREATE TABLE IF NOT EXISTS warnings (
- id TEXT PRIMARY KEY,
- type TEXT NOT NULL CHECK(type IN ('file', 'package')),
- target TEXT NOT NULL,
- ecosystem TEXT,
- reason TEXT NOT NULL,
- severity TEXT DEFAULT 'warn' CHECK(severity IN ('info', 'warn', 'block')),
- repo_id TEXT REFERENCES repos(id),
- created_at TEXT DEFAULT (datetime('now')),
- UNIQUE(type, target, ecosystem, repo_id)
+ id TEXT PRIMARY KEY,
+ type TEXT NOT NULL CHECK(type IN ('file', 'package')),
+ target TEXT NOT NULL,
+ ecosystem TEXT,
+ reason TEXT NOT NULL,
+ severity TEXT DEFAULT 'warn' CHECK(severity IN ('info', 'warn', 'block')),
+ repo_id TEXT REFERENCES repos(id),
+ created_at TEXT DEFAULT (datetime('now')),
+ UNIQUE(type, target, ecosystem, repo_id)
);
CREATE TABLE IF NOT EXISTS dependency_installs (
- id INTEGER PRIMARY KEY AUTOINCREMENT,
- package_name TEXT NOT NULL,
- package_version TEXT,
- ecosystem TEXT NOT NULL CHECK(ecosystem IN ('npm', 'pip', 'cargo', 'go')),
- repo_id TEXT REFERENCES repos(id),
- command TEXT NOT NULL,
- session_id TEXT,
- cve_cache JSON,
- size_bytes INTEGER,
- deprecated INTEGER DEFAULT 0,
- installed_at TEXT DEFAULT (datetime('now'))
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ package_name TEXT NOT NULL,
+ package_version TEXT,
+ ecosystem TEXT NOT NULL CHECK(ecosystem IN ('npm', 'pip', 'cargo', 'go')),
+ repo_id TEXT REFERENCES repos(id),
+ command TEXT NOT NULL,
+ session_id TEXT,
+ cve_cache JSON,
+ size_bytes INTEGER,
+ deprecated INTEGER DEFAULT 0,
+ installed_at TEXT DEFAULT (datetime('now'))
);
CREATE TABLE IF NOT EXISTS session_summaries (
- id TEXT PRIMARY KEY,
- session_id TEXT NOT NULL,
- repo_id TEXT REFERENCES repos(id),
- summary TEXT,
- complexity INTEGER,
- stored_solution_id TEXT REFERENCES solutions(id),
- user_decision TEXT CHECK(user_decision IN ('stored', 'skipped', 'edited')),
- created_at TEXT DEFAULT (datetime('now'))
+ id TEXT PRIMARY KEY,
+ session_id TEXT NOT NULL,
+ repo_id TEXT REFERENCES repos(id),
+ summary TEXT,
+ complexity INTEGER,
+ stored_solution_id TEXT REFERENCES solutions(id),
+ user_decision TEXT CHECK(user_decision IN ('stored', 'skipped', 'edited')),
+ created_at TEXT DEFAULT (datetime('now'))
);
CREATE TABLE IF NOT EXISTS api_cache (
- cache_key TEXT PRIMARY KEY,
- response JSON NOT NULL,
- created_at TEXT DEFAULT (datetime('now'))
+ cache_key TEXT PRIMARY KEY,
+ response JSON NOT NULL,
+ created_at TEXT DEFAULT (datetime('now'))
);
- CREATE INDEX IF NOT EXISTS idx_warnings_type_target ON warnings(type, target);
- CREATE INDEX IF NOT EXISTS idx_warnings_repo ON warnings(repo_id);
- CREATE INDEX IF NOT EXISTS idx_dep_installs_package ON dependency_installs(package_name);
- CREATE INDEX IF NOT EXISTS idx_dep_installs_session ON dependency_installs(session_id);
- CREATE INDEX IF NOT EXISTS idx_dep_installs_repo ON dependency_installs(repo_id);
- CREATE INDEX IF NOT EXISTS idx_session_summaries_session ON session_summaries(session_id);
- CREATE INDEX IF NOT EXISTS idx_api_cache_created ON api_cache(created_at);
- `,
-
- // v2 -> v3: Enhanced memory with structured metadata
- 3: `
- ALTER TABLE solutions ADD COLUMN category TEXT CHECK(category IN ('bugfix', 'feature', 'refactor', 'config', 'pattern', 'optimization'));
- ALTER TABLE solutions ADD COLUMN complexity INTEGER CHECK(complexity >= 1 AND complexity <= 10);
- ALTER TABLE solutions ADD COLUMN prerequisites JSON DEFAULT '[]';
- ALTER TABLE solutions ADD COLUMN anti_patterns JSON DEFAULT '[]';
- ALTER TABLE solutions ADD COLUMN code_blocks JSON DEFAULT '[]';
- ALTER TABLE solutions ADD COLUMN related_solutions JSON DEFAULT '[]';
- ALTER TABLE solutions ADD COLUMN supersedes TEXT REFERENCES solutions(id);
- CREATE INDEX IF NOT EXISTS idx_solutions_category ON solutions(category);
- CREATE INDEX IF NOT EXISTS idx_solutions_complexity ON solutions(complexity);
- CREATE INDEX IF NOT EXISTS idx_solutions_supersedes ON solutions(supersedes);
- `,
-
- // v3 -> v4: Skill Factory columns
- 4: `
- ALTER TABLE solutions ADD COLUMN promoted_to_skill TEXT;
- ALTER TABLE solutions ADD COLUMN promoted_at TEXT;
- CREATE INDEX IF NOT EXISTS idx_solutions_promoted ON solutions(promoted_to_skill);
- `,
-
- // v4 -> v5: Dreamer (Scheduled Task Automation)
- 5: `
- CREATE TABLE IF NOT EXISTS dreamer_tasks (
- id TEXT PRIMARY KEY,
- name TEXT NOT NULL,
- description TEXT,
- enabled INTEGER DEFAULT 1,
- cron_expression TEXT NOT NULL,
- timezone TEXT DEFAULT 'local',
- command TEXT NOT NULL,
- working_directory TEXT DEFAULT '.',
- timeout INTEGER DEFAULT 300,
- env JSON DEFAULT '{}',
- skip_permissions INTEGER DEFAULT 0,
- worktree_enabled INTEGER DEFAULT 0,
- worktree_base_path TEXT,
- worktree_branch_prefix TEXT DEFAULT 'matrix-dreamer/',
- worktree_remote TEXT DEFAULT 'origin',
- tags JSON DEFAULT '[]',
- repo_id TEXT REFERENCES repos(id),
- created_at TEXT DEFAULT (datetime('now')),
- updated_at TEXT DEFAULT (datetime('now'))
+ CREATE TABLE IF NOT EXISTS hook_executions (
+ hook_name TEXT NOT NULL,
+ session_id TEXT NOT NULL,
+ executed_at TEXT DEFAULT (datetime('now')),
+ PRIMARY KEY (hook_name, session_id)
);
- CREATE TABLE IF NOT EXISTS dreamer_executions (
- id TEXT PRIMARY KEY,
- task_id TEXT NOT NULL REFERENCES dreamer_tasks(id) ON DELETE CASCADE,
- started_at TEXT NOT NULL,
- completed_at TEXT,
- status TEXT NOT NULL CHECK(status IN ('running','success','failure','timeout','skipped')),
- triggered_by TEXT NOT NULL,
- duration INTEGER,
- exit_code INTEGER,
- output_preview TEXT,
- error TEXT,
- task_name TEXT NOT NULL,
- project_path TEXT,
- cron_expression TEXT,
- worktree_path TEXT,
- worktree_branch TEXT,
- worktree_pushed INTEGER
+ CREATE TABLE IF NOT EXISTS plugin_meta (
+ key TEXT PRIMARY KEY,
+ value TEXT NOT NULL,
+ updated_at TEXT DEFAULT (datetime('now'))
);
- CREATE INDEX IF NOT EXISTS idx_dreamer_tasks_repo ON dreamer_tasks(repo_id);
- CREATE INDEX IF NOT EXISTS idx_dreamer_tasks_enabled ON dreamer_tasks(enabled);
- CREATE INDEX IF NOT EXISTS idx_dreamer_executions_task ON dreamer_executions(task_id);
- CREATE INDEX IF NOT EXISTS idx_dreamer_executions_started ON dreamer_executions(started_at DESC);
- CREATE INDEX IF NOT EXISTS idx_dreamer_executions_status ON dreamer_executions(status);
- `,
+ -- Code indexer tables (may be missing on DBs that upgraded before indexer was added)
+ CREATE TABLE IF NOT EXISTS repo_files (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ repo_id TEXT NOT NULL,
+ file_path TEXT NOT NULL,
+ mtime INTEGER NOT NULL,
+ hash TEXT,
+ indexed_at TEXT DEFAULT (datetime('now')),
+ UNIQUE(repo_id, file_path)
+ );
- // v5 -> v6: Plugin metadata table
- 6: `
- CREATE TABLE IF NOT EXISTS plugin_meta (
- key TEXT PRIMARY KEY,
- value TEXT NOT NULL,
- updated_at TEXT DEFAULT (datetime('now'))
+ CREATE TABLE IF NOT EXISTS symbols (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ repo_id TEXT NOT NULL,
+ file_id INTEGER NOT NULL,
+ name TEXT NOT NULL,
+ kind TEXT NOT NULL,
+ line INTEGER NOT NULL,
+ column INTEGER NOT NULL,
+ end_line INTEGER,
+ exported INTEGER DEFAULT 0,
+ is_default INTEGER DEFAULT 0,
+ scope TEXT,
+ signature TEXT,
+ FOREIGN KEY (file_id) REFERENCES repo_files(id) ON DELETE CASCADE
);
- `,
- // v6 -> v7: Missing tables that were in schema but not migrations
- // This fixes databases that upgraded from older versions
- 7: `
- -- Track one-time hook executions per session
- CREATE TABLE IF NOT EXISTS hook_executions (
- hook_name TEXT NOT NULL,
- session_id TEXT NOT NULL,
- executed_at TEXT DEFAULT (datetime('now')),
- PRIMARY KEY (hook_name, session_id)
+ CREATE TABLE IF NOT EXISTS imports (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ file_id INTEGER NOT NULL,
+ imported_name TEXT NOT NULL,
+ local_name TEXT,
+ source_path TEXT NOT NULL,
+ is_default INTEGER DEFAULT 0,
+ is_namespace INTEGER DEFAULT 0,
+ is_type INTEGER DEFAULT 0,
+ line INTEGER NOT NULL,
+ FOREIGN KEY (file_id) REFERENCES repo_files(id) ON DELETE CASCADE
);
- -- Background job tracking for async operations
- CREATE TABLE IF NOT EXISTS background_jobs (
- id TEXT PRIMARY KEY,
- tool_name TEXT NOT NULL,
- status TEXT NOT NULL DEFAULT 'queued' CHECK(status IN ('queued', 'running', 'completed', 'failed', 'cancelled')),
- progress_percent INTEGER DEFAULT 0,
- progress_message TEXT,
- input JSON,
- result JSON,
- error TEXT,
- pid INTEGER,
- created_at TEXT DEFAULT (datetime('now')),
- started_at TEXT,
- completed_at TEXT
+ CREATE TABLE IF NOT EXISTS symbol_refs (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ symbol_id INTEGER NOT NULL,
+ file_id INTEGER NOT NULL,
+ line INTEGER NOT NULL,
+ column INTEGER NOT NULL,
+ FOREIGN KEY (symbol_id) REFERENCES symbols(id) ON DELETE CASCADE,
+ FOREIGN KEY (file_id) REFERENCES repo_files(id) ON DELETE CASCADE
);
+ `);
+
+ // ── Add solution columns that may be missing ───────────────────────
+
+ const solutionColumns: Array<{ name: string; sql: string }> = [
+ { name: 'category', sql: "category TEXT CHECK(category IN ('bugfix', 'feature', 'refactor', 'config', 'pattern', 'optimization'))" },
+ { name: 'complexity', sql: 'complexity INTEGER CHECK(complexity >= 1 AND complexity <= 10)' },
+ { name: 'prerequisites', sql: "prerequisites JSON DEFAULT '[]'" },
+ { name: 'anti_patterns', sql: "anti_patterns JSON DEFAULT '[]'" },
+ { name: 'code_blocks', sql: "code_blocks JSON DEFAULT '[]'" },
+ { name: 'related_solutions', sql: "related_solutions JSON DEFAULT '[]'" },
+ { name: 'supersedes', sql: 'supersedes TEXT REFERENCES solutions(id)' },
+ { name: 'promoted_to_skill', sql: 'promoted_to_skill TEXT' },
+ { name: 'promoted_at', sql: 'promoted_at TEXT' },
+ ];
+
+ for (const col of solutionColumns) {
+ if (!hasColumn(db, 'solutions', col.name)) {
+ db.exec(`ALTER TABLE solutions ADD COLUMN ${col.sql}`);
+ }
+ }
+
+ // ── Create all indexes ─────────────────────────────────────────────
- -- Indexes
+ db.exec(`
+ -- Core
+ CREATE INDEX IF NOT EXISTS idx_solutions_repo ON solutions(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_solutions_scope ON solutions(scope);
+ CREATE INDEX IF NOT EXISTS idx_solutions_score ON solutions(score DESC);
+ CREATE INDEX IF NOT EXISTS idx_solutions_scope_score ON solutions(scope, score DESC);
+ CREATE INDEX IF NOT EXISTS idx_solutions_created ON solutions(created_at DESC);
+ CREATE INDEX IF NOT EXISTS idx_solutions_category ON solutions(category);
+ CREATE INDEX IF NOT EXISTS idx_solutions_complexity ON solutions(complexity);
+ CREATE INDEX IF NOT EXISTS idx_solutions_supersedes ON solutions(supersedes);
+ CREATE INDEX IF NOT EXISTS idx_failures_repo ON failures(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_failures_signature ON failures(error_signature);
+ CREATE INDEX IF NOT EXISTS idx_failures_type ON failures(error_type);
+ CREATE INDEX IF NOT EXISTS idx_usage_solution ON usage_log(solution_id);
+ CREATE INDEX IF NOT EXISTS idx_usage_created ON usage_log(created_at DESC);
+
+ -- Hooks
+ CREATE INDEX IF NOT EXISTS idx_warnings_type_target ON warnings(type, target);
+ CREATE INDEX IF NOT EXISTS idx_warnings_repo ON warnings(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_dep_installs_package ON dependency_installs(package_name);
+ CREATE INDEX IF NOT EXISTS idx_dep_installs_session ON dependency_installs(session_id);
+ CREATE INDEX IF NOT EXISTS idx_dep_installs_repo ON dependency_installs(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_session_summaries_session ON session_summaries(session_id);
+ CREATE INDEX IF NOT EXISTS idx_api_cache_created ON api_cache(created_at);
CREATE INDEX IF NOT EXISTS idx_hook_executions_session ON hook_executions(session_id);
- CREATE INDEX IF NOT EXISTS idx_background_jobs_status ON background_jobs(status);
- CREATE INDEX IF NOT EXISTS idx_background_jobs_tool ON background_jobs(tool_name);
- `,
- // v7 -> v8: Composite indexes for faster symbol lookups
- 8: `
- -- Speeds up searchSymbols: WHERE repo_id = ? AND name LIKE ?
+ -- Code indexer
+ CREATE INDEX IF NOT EXISTS idx_repo_files_repo ON repo_files(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_repo_files_path ON repo_files(repo_id, file_path);
+ CREATE INDEX IF NOT EXISTS idx_symbols_name ON symbols(name);
+ CREATE INDEX IF NOT EXISTS idx_symbols_repo ON symbols(repo_id);
+ CREATE INDEX IF NOT EXISTS idx_symbols_file ON symbols(file_id);
+ CREATE INDEX IF NOT EXISTS idx_symbols_kind ON symbols(kind);
+ CREATE INDEX IF NOT EXISTS idx_symbols_exported ON symbols(exported);
CREATE INDEX IF NOT EXISTS idx_symbols_name_repo ON symbols(repo_id, name);
- -- Speeds up listExports: WHERE repo_id = ? AND exported = 1
CREATE INDEX IF NOT EXISTS idx_symbols_repo_exported ON symbols(repo_id, exported);
- `,
-};
-
-function getDbPath(): string {
- return process.env['MATRIX_DB'] || join(homedir(), '.claude', 'matrix', 'matrix.db');
-}
-
-function getCurrentVersion(db: Database): number {
- try {
- // Check if schema_version table exists
- const tableExists = db.query(`
- SELECT name FROM sqlite_master
- WHERE type='table' AND name='schema_version'
- `).get();
-
- if (!tableExists) {
- // Create schema_version table
- db.exec(`
- CREATE TABLE schema_version (
- version INTEGER PRIMARY KEY,
- applied_at TEXT DEFAULT (datetime('now'))
- )
- `);
-
- // Check if this is a fresh DB or existing one without versioning
- const hasSolutions = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name='solutions'
- `).get();
-
- if (hasSolutions) {
- // Existing DB, check if it has ALL v2 tables
- const v2Tables = ['warnings', 'dependency_installs', 'session_summaries', 'api_cache'];
- const existingTables = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name IN ('warnings', 'dependency_installs', 'session_summaries', 'api_cache')
- `).all() as { name: string }[];
-
- let hasAllV2Tables = v2Tables.every(t => existingTables.some(e => e.name === t));
-
- // Validate table integrity - verify tables are actually accessible
- if (hasAllV2Tables) {
- try {
- // Quick query to verify v2 tables are not corrupted
- db.query('SELECT 1 FROM warnings LIMIT 1').get();
- } catch {
- // Tables exist but may be corrupted, treat as v1
- hasAllV2Tables = false;
- }
- }
-
- // Check for v3, v4, and v5 features
- let hasV3Columns = false;
- let hasV4Columns = false;
- let hasV5Tables = false;
- if (hasAllV2Tables) {
- try {
- const cols = db.query(`PRAGMA table_info(solutions)`).all() as { name: string }[];
- hasV3Columns = cols.some(c => c.name === 'category');
- hasV4Columns = cols.some(c => c.name === 'promoted_to_skill');
- } catch {
- hasV3Columns = false;
- hasV4Columns = false;
- }
-
- // Check for v5 (Dreamer tables)
- if (hasV4Columns) {
- try {
- const dreamerTable = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name='dreamer_tasks'
- `).get();
- hasV5Tables = !!dreamerTable;
- } catch {
- hasV5Tables = false;
- }
- }
- }
-
- // Check for v6 (plugin_meta table)
- let hasV6Tables = false;
- if (hasV5Tables) {
- try {
- const pluginMetaTable = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name='plugin_meta'
- `).get();
- hasV6Tables = !!pluginMetaTable;
- } catch {
- hasV6Tables = false;
- }
- }
-
- // Check for v7 (background_jobs and hook_executions tables)
- let hasV7Tables = false;
- if (hasV6Tables) {
- try {
- const bgJobsTable = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name='background_jobs'
- `).get();
- const hookExecTable = db.query(`
- SELECT name FROM sqlite_master WHERE type='table' AND name='hook_executions'
- `).get();
- hasV7Tables = !!(bgJobsTable && hookExecTable);
- } catch {
- hasV7Tables = false;
- }
- }
-
- const initialVersion = hasV7Tables ? 7 : (hasV6Tables ? 6 : (hasV5Tables ? 5 : (hasV4Columns ? 4 : (hasV3Columns ? 3 : (hasAllV2Tables ? 2 : 1)))));
- db.exec(`INSERT INTO schema_version (version) VALUES (${initialVersion})`);
- return initialVersion;
- } else {
- // Fresh DB, will run full schema
- return 0;
- }
- }
-
- const row = db.query('SELECT MAX(version) as version FROM schema_version').get() as { version: number } | null;
- return row?.version || 0;
- } catch {
- return 0;
- }
+ CREATE INDEX IF NOT EXISTS idx_imports_file ON imports(file_id);
+ CREATE INDEX IF NOT EXISTS idx_imports_source ON imports(source_path);
+ CREATE INDEX IF NOT EXISTS idx_imports_name ON imports(imported_name);
+ CREATE INDEX IF NOT EXISTS idx_symbol_refs_symbol ON symbol_refs(symbol_id);
+ CREATE INDEX IF NOT EXISTS idx_symbol_refs_file ON symbol_refs(file_id);
+ `);
+
+ // ── Drop removed feature tables ────────────────────────────────────
+
+ db.exec('DROP TABLE IF EXISTS dreamer_executions');
+ db.exec('DROP TABLE IF EXISTS dreamer_tasks');
+ db.exec('DROP TABLE IF EXISTS background_jobs');
}
export interface MigrationResult {
@@ -321,28 +264,24 @@ export function runMigrations(): MigrationResult {
let migrationsRun = 0;
try {
- // If fresh DB (version 0), run full schema
if (currentVersion === 0) {
+ // Fresh DB — run full schema
+ db.exec(`
+ CREATE TABLE IF NOT EXISTS schema_version (
+ version INTEGER PRIMARY KEY,
+ applied_at TEXT DEFAULT (datetime('now'))
+ )
+ `);
db.exec(SCHEMA_SQL);
db.exec(`INSERT OR REPLACE INTO schema_version (version) VALUES (${SCHEMA_VERSION})`);
- db.close();
- return {
- fromVersion: 0,
- toVersion: SCHEMA_VERSION,
- migrationsRun: 1,
- success: true,
- };
- }
-
- // Run incremental migrations
- for (let v = currentVersion + 1; v <= SCHEMA_VERSION; v++) {
- const migration = migrations[v];
- if (migration) {
- db.exec(migration);
- db.exec(`INSERT INTO schema_version (version) VALUES (${v})`);
- migrationsRun++;
- }
+ migrationsRun = 1;
+ } else if (currentVersion < SCHEMA_VERSION) {
+ // Legacy DB — run single idempotent upgrade
+ runLegacyUpgrade(db);
+ db.exec(`INSERT OR REPLACE INTO schema_version (version) VALUES (${SCHEMA_VERSION})`);
+ migrationsRun = 1;
}
+ // else: already at current version, nothing to do
db.close();
return {
diff --git a/src/db/schema.test.ts b/src/db/schema.test.ts
index 68420cf..015295e 100644
--- a/src/db/schema.test.ts
+++ b/src/db/schema.test.ts
@@ -93,53 +93,6 @@ afterAll(() => {
}
});
-// Read migration SQL directly from migrate.ts for testing upgrade paths
-// This simulates what happens when upgrading from v4 to v5
-function getMigrationV5SQL(): string {
- return `
- CREATE TABLE IF NOT EXISTS dreamer_tasks (
- id TEXT PRIMARY KEY,
- name TEXT NOT NULL,
- description TEXT,
- enabled INTEGER DEFAULT 1,
- cron_expression TEXT NOT NULL,
- timezone TEXT DEFAULT 'local',
- command TEXT NOT NULL,
- working_directory TEXT DEFAULT '.',
- timeout INTEGER DEFAULT 300,
- env JSON DEFAULT '{}',
- skip_permissions INTEGER DEFAULT 0,
- worktree_enabled INTEGER DEFAULT 0,
- worktree_base_path TEXT,
- worktree_branch_prefix TEXT DEFAULT 'matrix-dreamer/',
- worktree_remote TEXT DEFAULT 'origin',
- tags JSON DEFAULT '[]',
- repo_id TEXT,
- created_at TEXT DEFAULT (datetime('now')),
- updated_at TEXT DEFAULT (datetime('now'))
- );
-
- CREATE TABLE IF NOT EXISTS dreamer_executions (
- id TEXT PRIMARY KEY,
- task_id TEXT NOT NULL,
- started_at TEXT NOT NULL,
- completed_at TEXT,
- status TEXT NOT NULL CHECK(status IN ('running','success','failure','timeout','skipped')),
- triggered_by TEXT NOT NULL,
- duration INTEGER,
- exit_code INTEGER,
- output_preview TEXT,
- error TEXT,
- task_name TEXT NOT NULL,
- project_path TEXT,
- cron_expression TEXT,
- worktree_path TEXT,
- worktree_branch TEXT,
- worktree_pushed INTEGER
- );
- `;
-}
-
describe('Schema and Migration Integrity', () => {
test('fresh and migrated DBs have same tables', () => {
const freshTables = getAllTables(freshDb);
@@ -211,19 +164,6 @@ describe('Schema and Migration Integrity', () => {
expect(freshCols).toEqual(migratedCols);
});
- // Dreamer tables
- test('dreamer_tasks columns match', () => {
- const freshCols = getTableInfo(freshDb, 'dreamer_tasks');
- const migratedCols = getTableInfo(migratedDb, 'dreamer_tasks');
- expect(freshCols).toEqual(migratedCols);
- });
-
- test('dreamer_executions columns match', () => {
- const freshCols = getTableInfo(freshDb, 'dreamer_executions');
- const migratedCols = getTableInfo(migratedDb, 'dreamer_executions');
- expect(freshCols).toEqual(migratedCols);
- });
-
test('plugin_meta columns match', () => {
const freshCols = getTableInfo(freshDb, 'plugin_meta');
const migratedCols = getTableInfo(migratedDb, 'plugin_meta');
@@ -232,17 +172,6 @@ describe('Schema and Migration Integrity', () => {
});
describe('Default Value Matching', () => {
- test('dreamer_tasks defaults match', () => {
- const fresh = getTableInfo(freshDb, 'dreamer_tasks');
- const migrated = getTableInfo(migratedDb, 'dreamer_tasks');
-
- for (const col of fresh) {
- const migratedCol = migrated.find(c => c.name === col.name);
- expect(migratedCol).toBeDefined();
- expect(migratedCol?.dflt_value).toBe(col.dflt_value);
- }
- });
-
test('solutions defaults match', () => {
const fresh = getTableInfo(freshDb, 'solutions');
const migrated = getTableInfo(migratedDb, 'solutions');
@@ -267,31 +196,3 @@ describe('Schema and Migration Integrity', () => {
});
});
-describe('Migration Path Divergence Detection', () => {
- test('v5 migration dreamer_tasks defaults match schema.ts', () => {
- // Create DB using v5 migration SQL (simulates upgrade from v4)
- const upgradedDbPath = join(tmpdir(), `matrix-upgraded-${Date.now()}.db`);
- const upgradedDb = new Database(upgradedDbPath, { create: true });
- upgradedDb.exec(getMigrationV5SQL());
-
- // Get column info from both
- const freshCols = getTableInfo(freshDb, 'dreamer_tasks');
- const upgradedCols = getTableInfo(upgradedDb, 'dreamer_tasks');
-
- // Check each column's default value
- const mismatches: string[] = [];
- for (const freshCol of freshCols) {
- const upgradedCol = upgradedCols.find(c => c.name === freshCol.name);
- if (upgradedCol && upgradedCol.dflt_value !== freshCol.dflt_value) {
- mismatches.push(
- `Column '${freshCol.name}': schema.ts='${freshCol.dflt_value}' vs migrate.ts='${upgradedCol.dflt_value}'`
- );
- }
- }
-
- upgradedDb.close();
- if (existsSync(upgradedDbPath)) unlinkSync(upgradedDbPath);
-
- expect(mismatches).toEqual([]);
- });
-});
diff --git a/src/db/schema.ts b/src/db/schema.ts
index c2f72e4..202838f 100644
--- a/src/db/schema.ts
+++ b/src/db/schema.ts
@@ -157,22 +157,6 @@ CREATE TABLE IF NOT EXISTS plugin_meta (
updated_at TEXT DEFAULT (datetime('now'))
);
--- Background job tracking for async operations
-CREATE TABLE IF NOT EXISTS background_jobs (
- id TEXT PRIMARY KEY,
- tool_name TEXT NOT NULL,
- status TEXT NOT NULL DEFAULT 'queued' CHECK(status IN ('queued', 'running', 'completed', 'failed', 'cancelled')),
- progress_percent INTEGER DEFAULT 0,
- progress_message TEXT,
- input JSON,
- result JSON,
- error TEXT,
- pid INTEGER, -- process ID for orphan cleanup
- created_at TEXT DEFAULT (datetime('now')),
- started_at TEXT,
- completed_at TEXT
-);
-
-- Indexes for hooks tables
CREATE INDEX IF NOT EXISTS idx_warnings_type_target ON warnings(type, target);
CREATE INDEX IF NOT EXISTS idx_warnings_repo ON warnings(repo_id);
@@ -182,8 +166,6 @@ CREATE INDEX IF NOT EXISTS idx_dep_installs_repo ON dependency_installs(repo_id)
CREATE INDEX IF NOT EXISTS idx_session_summaries_session ON session_summaries(session_id);
CREATE INDEX IF NOT EXISTS idx_api_cache_created ON api_cache(created_at);
CREATE INDEX IF NOT EXISTS idx_hook_executions_session ON hook_executions(session_id);
-CREATE INDEX IF NOT EXISTS idx_background_jobs_status ON background_jobs(status);
-CREATE INDEX IF NOT EXISTS idx_background_jobs_tool ON background_jobs(tool_name);
-- ============================================================================
-- Code Indexer Tables (Repository Symbol Index)
@@ -259,57 +241,4 @@ CREATE INDEX IF NOT EXISTS idx_imports_name ON imports(imported_name);
CREATE INDEX IF NOT EXISTS idx_symbol_refs_symbol ON symbol_refs(symbol_id);
CREATE INDEX IF NOT EXISTS idx_symbol_refs_file ON symbol_refs(file_id);
--- ============================================================================
--- Dreamer Tables (Scheduled Task Automation)
--- ============================================================================
-
--- Scheduled task definitions
-CREATE TABLE IF NOT EXISTS dreamer_tasks (
- id TEXT PRIMARY KEY,
- name TEXT NOT NULL,
- description TEXT,
- enabled INTEGER DEFAULT 1,
- cron_expression TEXT NOT NULL,
- timezone TEXT DEFAULT 'local',
- command TEXT NOT NULL,
- working_directory TEXT DEFAULT '.',
- timeout INTEGER DEFAULT 300,
- env JSON DEFAULT '{}',
- skip_permissions INTEGER DEFAULT 0,
- worktree_enabled INTEGER DEFAULT 0,
- worktree_base_path TEXT,
- worktree_branch_prefix TEXT DEFAULT 'matrix-dreamer/',
- worktree_remote TEXT DEFAULT 'origin',
- tags JSON DEFAULT '[]',
- repo_id TEXT REFERENCES repos(id),
- created_at TEXT DEFAULT (datetime('now')),
- updated_at TEXT DEFAULT (datetime('now'))
-);
-
--- Execution history for scheduled tasks
-CREATE TABLE IF NOT EXISTS dreamer_executions (
- id TEXT PRIMARY KEY,
- task_id TEXT NOT NULL REFERENCES dreamer_tasks(id) ON DELETE CASCADE,
- started_at TEXT NOT NULL,
- completed_at TEXT,
- status TEXT NOT NULL CHECK(status IN ('running','success','failure','timeout','skipped')),
- triggered_by TEXT NOT NULL,
- duration INTEGER,
- exit_code INTEGER,
- output_preview TEXT,
- error TEXT,
- task_name TEXT NOT NULL,
- project_path TEXT,
- cron_expression TEXT,
- worktree_path TEXT,
- worktree_branch TEXT,
- worktree_pushed INTEGER
-);
-
--- Indexes for dreamer tables
-CREATE INDEX IF NOT EXISTS idx_dreamer_tasks_repo ON dreamer_tasks(repo_id);
-CREATE INDEX IF NOT EXISTS idx_dreamer_tasks_enabled ON dreamer_tasks(enabled);
-CREATE INDEX IF NOT EXISTS idx_dreamer_executions_task ON dreamer_executions(task_id);
-CREATE INDEX IF NOT EXISTS idx_dreamer_executions_started ON dreamer_executions(started_at DESC);
-CREATE INDEX IF NOT EXISTS idx_dreamer_executions_status ON dreamer_executions(status);
`;
diff --git a/src/dreamer/actions/add.ts b/src/dreamer/actions/add.ts
deleted file mode 100644
index 17ddd65..0000000
--- a/src/dreamer/actions/add.ts
+++ /dev/null
@@ -1,118 +0,0 @@
-/**
- * Add Action Handler
- *
- * Creates a new scheduled task.
- */
-
-import type { DreamerInput } from '../../tools/validation.js';
-import type { DreamerTask } from '../types.js';
-import { createTask, deleteTask } from '../store.js';
-import { getScheduler, isPlatformSupported } from '../scheduler/index.js';
-import { parseSchedule, getNextRuns, cronToHuman } from '../cron/index.js';
-import { getConfig } from '../../config/index.js';
-
-export interface AddResult {
- success: boolean;
- task?: DreamerTask;
- schedule?: {
- expression: string;
- human: string;
- nextRuns: string[];
- };
- error?: string;
-}
-
-export async function handleAdd(input: DreamerInput): Promise<AddResult> {
- // Validate required fields
- if (!input.name) {
- return { success: false, error: 'Missing required field: name' };
- }
- if (!input.schedule) {
- return { success: false, error: 'Missing required field: schedule' };
- }
- if (!input.command) {
- return { success: false, error: 'Missing required field: command' };
- }
-
- // Check platform support
- if (!isPlatformSupported()) {
- return {
- success: false,
- error: `Dreamer is not supported on this platform. Supported: macOS (launchd), Linux (crontab).`,
- };
- }
-
- // Parse schedule (cron or natural language)
- const scheduleResult = parseSchedule(input.schedule);
- if (scheduleResult.error) {
- return { success: false, error: scheduleResult.error };
- }
-
- // Generate task ID
- const taskId = crypto.randomUUID();
-
- // Get config defaults
- const config = getConfig();
- const dreamerConfig = config.dreamer;
-
- // Build worktree config
- const worktreeEnabled = input.worktree?.enabled ?? false;
-
- // Create task object
- const task: Omit<DreamerTask, 'createdAt' | 'updatedAt'> = {
- id: taskId,
- name: input.name,
- description: input.description,
- enabled: true,
- cronExpression: scheduleResult.expression,
- timezone: input.timezone ?? 'local',
- command: input.command,
- workingDirectory: input.workingDirectory ?? process.cwd(),
- timeout: input.timeout ?? dreamerConfig.execution.defaultTimeout,
- env: input.env ?? {},
- skipPermissions: input.skipPermissions ?? dreamerConfig.execution.defaultSkipPermissions,
- worktreeEnabled,
- worktreeBasePath: input.worktree?.basePath ?? dreamerConfig.worktree.defaultBasePath,
- worktreeBranchPrefix: input.worktree?.branchPrefix ?? dreamerConfig.worktree.defaultBranchPrefix,
- worktreeRemote: input.worktree?.remoteName ?? dreamerConfig.worktree.defaultRemote,
- tags: input.tags ?? [],
- repoId: undefined,
- };
-
- // Save to database first
- let savedTask: DreamerTask;
- try {
- savedTask = createTask(task);
- } catch (error) {
- return {
- success: false,
- error: error instanceof Error ? error.message : 'Failed to save task to database',
- };
- }
-
- // Register with native scheduler - rollback DB if this fails
- try {
- const scheduler = getScheduler();
- await scheduler.register(savedTask);
- } catch (error) {
- // Rollback: delete the task from database since scheduler registration failed
- deleteTask(taskId);
- return {
- success: false,
- error: error instanceof Error ? error.message : 'Failed to register with scheduler',
- };
- }
-
- // Get next run times
- const nextRuns = getNextRuns(scheduleResult.expression, 3, task.timezone);
-
- return {
- success: true,
- task: savedTask,
- schedule: {
- expression: scheduleResult.expression,
- human: cronToHuman(scheduleResult.expression),
- nextRuns: nextRuns.map((d) => d.toISOString()),
- },
- };
-}
diff --git a/src/dreamer/actions/history.ts b/src/dreamer/actions/history.ts
deleted file mode 100644
index 6bb329c..0000000
--- a/src/dreamer/actions/history.ts
+++ /dev/null
@@ -1,95 +0,0 @@
-/**
- * History Action Handler
- *
- * Retrieves execution history for all tasks.
- */
-
-import type { DreamerInput } from '../../tools/validation.js';
-import type { ExecutionStatus } from '../types.js';
-import { getAllExecutions } from '../store.js';
-import { formatDuration, formatTimeAgo } from '../cron/index.js';
-
-export interface HistoryEntry {
- id: string;
- taskId: string;
- taskName: string;
- status: ExecutionStatus;
- startedAt: string;
- startedAtHuman: string;
- completedAt?: string;
- duration?: number;
- durationHuman?: string;
- exitCode?: number;
- error?: string;
- triggeredBy: string;
- projectPath?: string;
- worktree?: {
- path?: string;
- branch?: string;
- pushed?: boolean;
- };
-}
-
-export interface HistoryResult {
- success: boolean;
- executions: HistoryEntry[];
- total: number;
- summary: {
- success: number;
- failure: number;
- timeout: number;
- running: number;
- };
-}
-
-export async function handleHistory(input: DreamerInput): Promise<HistoryResult> {
- const limit = input.limit ?? 50;
- const executions = getAllExecutions({ limit });
-
- const entries: HistoryEntry[] = [];
- const summary = {
- success: 0,
- failure: 0,
- timeout: 0,
- running: 0,
- };
-
- for (const exec of executions) {
- // Count by status
- if (exec.status === 'success') summary.success++;
- else if (exec.status === 'failure') summary.failure++;
- else if (exec.status === 'timeout') summary.timeout++;
- else if (exec.status === 'running') summary.running++;
-
- entries.push({
- id: exec.id,
- taskId: exec.taskId,
- taskName: exec.taskName,
- status: exec.status,
- startedAt: exec.startedAt,
- startedAtHuman: formatTimeAgo(new Date(exec.startedAt)),
- completedAt: exec.completedAt,
- duration: exec.duration,
- durationHuman: exec.duration ? formatDuration(exec.duration) : undefined,
- exitCode: exec.exitCode,
- error: exec.error,
- triggeredBy: exec.triggeredBy,
- projectPath: exec.projectPath,
- worktree:
- exec.worktreePath || exec.worktreeBranch
- ? {
- path: exec.worktreePath,
- branch: exec.worktreeBranch,
- pushed: exec.worktreePushed,
- }
- : undefined,
- });
- }
-
- return {
- success: true,
- executions: entries,
- total: entries.length,
- summary,
- };
-}
diff --git a/src/dreamer/actions/index.ts b/src/dreamer/actions/index.ts
deleted file mode 100644
index ca83da3..0000000
--- a/src/dreamer/actions/index.ts
+++ /dev/null
@@ -1,13 +0,0 @@
-/**
- * Dreamer Action Handlers
- *
- * Exports all action handlers for the unified matrix_dreamer tool.
- */
-
-export { handleAdd, type AddResult } from './add.js';
-export { handleList, type ListResult, type TaskSummary } from './list.js';
-export { handleRun, type RunResult } from './run.js';
-export { handleRemove, type RemoveResult } from './remove.js';
-export { handleStatus, type StatusResult } from './status.js';
-export { handleLogs, type LogsResult } from './logs.js';
-export { handleHistory, type HistoryResult, type HistoryEntry } from './history.js';
diff --git a/src/dreamer/actions/list.ts b/src/dreamer/actions/list.ts
deleted file mode 100644
index c1cdc5b..0000000
--- a/src/dreamer/actions/list.ts
+++ /dev/null
@@ -1,88 +0,0 @@
-/**
- * List Action Handler
- *
- * Lists all scheduled tasks.
- */
-
-import type { DreamerInput } from '../../tools/validation.js';
-import { getAllTasks, getLatestExecution, countExecutionsByStatus } from '../store.js';
-import { cronToHuman, getNextRun, formatTimeUntil } from '../cron/index.js';
-
-export interface TaskSummary {
- id: string;
- name: string;
- enabled: boolean;
- schedule: string;
- scheduleHuman: string;
- nextRun?: string;
- nextRunHuman?: string;
- lastRun?: {
- status: string;
- startedAt: string;
- duration?: number;
- };
- stats: {
- success: number;
- failure: number;
- total: number;
- };
- tags: string[];
- workingDirectory: string;
- worktreeEnabled: boolean;
-}
-
-export interface ListResult {
- success: boolean;
- tasks: TaskSummary[];
- total: number;
-}
-
-export async function handleList(input: DreamerInput): Promise<ListResult> {
- const tasks = getAllTasks({
- tag: input.tag,
- limit: input.limit,
- });
-
- const summaries: TaskSummary[] = [];
-
- for (const task of tasks) {
- const lastExecution = getLatestExecution(task.id);
- const executionStats = countExecutionsByStatus(task.id);
- const nextRun = task.enabled ? getNextRun(task.cronExpression, task.timezone) : null;
-
- summaries.push({
- id: task.id,
- name: task.name,
- enabled: task.enabled,
- schedule: task.cronExpression,
- scheduleHuman: cronToHuman(task.cronExpression),
- nextRun: nextRun?.toISOString(),
- nextRunHuman: nextRun ? formatTimeUntil(nextRun) : undefined,
- lastRun: lastExecution
- ? {
- status: lastExecution.status,
- startedAt: lastExecution.startedAt,
- duration: lastExecution.duration,
- }
- : undefined,
- stats: {
- success: executionStats.success,
- failure: executionStats.failure,
- total:
- executionStats.success +
- executionStats.failure +
- executionStats.timeout +
- executionStats.skipped,
- },
- tags: task.tags,
- workingDirectory: task.workingDirectory,
- worktreeEnabled: task.worktreeEnabled,
- });
- }
-
- return {
- success: true,
- tasks: summaries,
- total: summaries.length,
- };
-}
diff --git a/src/dreamer/actions/logs.ts b/src/dreamer/actions/logs.ts
deleted file mode 100644
index 548ee06..0000000
--- a/src/dreamer/actions/logs.ts
+++ /dev/null
@@ -1,104 +0,0 @@
-/**
- * Logs Action Handler
- *
- * Retrieves logs for a scheduled task.
- */
-
-import { existsSync, readFileSync } from 'fs';
-import { join } from 'path';
-import { homedir } from 'os';
-import type { DreamerInput } from '../../tools/validation.js';
-import { getTask } from '../store.js';
-
-/**
- * Get the logs directory for Dreamer
- */
-function getLogsDir(): string {
- return join(homedir(), '.claude', 'matrix', 'dreamer', 'logs');
-}
-
-export interface LogsResult {
- success: boolean;
- taskId?: string;
- taskName?: string;
- logs?: {
- stdout?: string;
- stderr?: string;
- combined?: string;
- };
- paths?: {
- stdout: string;
- stderr: string;
- };
- error?: string;
-}
-
-export async function handleLogs(input: DreamerInput): Promise<LogsResult> {
- if (!input.taskId) {
- return { success: false, error: 'Missing required field: taskId' };
- }
-
- const task = getTask(input.taskId);
- if (!task) {
- return { success: false, error: `Task not found: ${input.taskId}` };
- }
-
- const logDir = getLogsDir();
- const stdoutPath = join(logDir, `${task.id}.out.log`);
- const stderrPath = join(logDir, `${task.id}.err.log`);
- // For Linux crontab, logs go to a single file
- const combinedPath = join(logDir, `${task.id}.log`);
-
- const lines = input.lines ?? 50;
- const stream = input.stream ?? 'both';
-
- const result: LogsResult = {
- success: true,
- taskId: task.id,
- taskName: task.name,
- paths: {
- stdout: stdoutPath,
- stderr: stderrPath,
- },
- logs: {},
- };
-
- // Helper to read last N lines
- const readLastLines = (filePath: string, n: number): string | undefined => {
- if (!existsSync(filePath)) {
- return undefined;
- }
-
- try {
- const content = readFileSync(filePath, 'utf-8');
- const allLines = content.split('\n');
- const lastLines = allLines.slice(-n).join('\n');
- return lastLines || '(empty)';
- } catch {
- return undefined;
- }
- };
-
- // Read requested streams
- if (stream === 'stdout' || stream === 'both') {
- result.logs!.stdout = readLastLines(stdoutPath, lines);
- }
-
- if (stream === 'stderr' || stream === 'both') {
- result.logs!.stderr = readLastLines(stderrPath, lines);
- }
-
- // Also check combined log (for Linux)
- if (stream === 'both') {
- result.logs!.combined = readLastLines(combinedPath, lines);
- }
-
- // Check if any logs were found
- const hasLogs = result.logs!.stdout || result.logs!.stderr || result.logs!.combined;
- if (!hasLogs) {
- result.logs = undefined;
- result.error = 'No logs found. Task may not have executed yet.';
- }
-
- return result;
-}
diff --git a/src/dreamer/actions/remove.ts b/src/dreamer/actions/remove.ts
deleted file mode 100644
index 91bca82..0000000
--- a/src/dreamer/actions/remove.ts
+++ /dev/null
@@ -1,49 +0,0 @@
-/**
- * Remove Action Handler
- *
- * Removes a scheduled task.
- */
-
-import type { DreamerInput } from '../../tools/validation.js';
-import { getTask, deleteTask } from '../store.js';
-import { getScheduler, isPlatformSupported } from '../scheduler/index.js';
-
-export interface RemoveResult {
- success: boolean;
- taskId?: string;
- taskName?: string;
- error?: string;
-}
-
-export async function handleRemove(input: DreamerInput): Promise<RemoveResult> {
- if (!input.taskId) {
- return { success: false, error: 'Missing required field: taskId' };
- }
-
- const task = getTask(input.taskId);
- if (!task) {
- return { success: false, error: `Task not found: ${input.taskId}` };
- }
-
- try {
- // Unregister from native scheduler
- if (isPlatformSupported()) {
- const scheduler = getScheduler();
- await scheduler.unregister(task.id);
- }
-
- // Delete from database
- deleteTask(task.id);
-
- return {
- success: true,
- taskId: task.id,
- taskName: task.name,
- };
- } catch (error) {
- return {
- success: false,
- error: error instanceof Error ? error.message : 'Failed to remove task',
- };
- }
-}
diff --git a/src/dreamer/actions/run.ts b/src/dreamer/actions/run.ts
deleted file mode 100644
index a843b3e..0000000
--- a/src/dreamer/actions/run.ts
+++ /dev/null
@@ -1,195 +0,0 @@
-/**
- * Run Action Handler
- *
- * Manually triggers a task execution.
- */
-
-import { spawn } from 'child_process';
-import type { DreamerInput } from '../../tools/validation.js';
-import { getTask, createExecution, updateExecution } from '../store.js';
-import { shellEscape } from '../scheduler/index.js';
-
-// Cap output buffer to prevent memory exhaustion on verbose tasks
-const MAX_OUTPUT_SIZE = 1024 * 1024; // 1MB limit
-
-export interface RunResult {
- success: boolean;
- executionId?: string;
- status?: 'success' | 'failure' | 'timeout';
- exitCode?: number;
- duration?: number;
- outputPreview?: string;
- error?: string;
-}
-
-export async function handleRun(input: DreamerInput): Promise<RunResult> {
- if (!input.taskId) {
- return { success: false, error: 'Missing required field: taskId' };
- }
-
- const task = getTask(input.taskId);
- if (!task) {
- return { success: false, error: `Task not found: ${input.taskId}` };
- }
-
- // Create execution record
- const startTime = Date.now();
- const execution = createExecution({
- taskId: task.id,
- startedAt: new Date().toISOString(),
- status: 'running',
- triggeredBy: 'manual',
- taskName: task.name,
- projectPath: task.workingDirectory,
- cronExpression: task.cronExpression,
- });
-
- // Build the command
- const escapedCommand = shellEscape(task.command);
- const flags: string[] = [];
- if (task.skipPermissions) {
- flags.push('--dangerously-skip-permissions');
- }
- const flagStr = flags.length > 0 ? ` ${flags.join(' ')}` : '';
- const fullCommand = `cd ${shellEscape(task.workingDirectory)} && claude -p ${escapedCommand}${flagStr}`;
-
- // Execute with timeout using spawn for proper cleanup
- const timeoutMs = (task.timeout || 300) * 1000;
-
- return new Promise((resolve) => {
- const proc = spawn('sh', ['-c', fullCommand], {
- env: { ...process.env, ...task.env },
- });
-
- let stdout = '';
- let stderr = '';
- let killed = false;
-
- // Set up timeout to kill process
- const timeoutId = setTimeout(() => {
- killed = true;
- proc.kill('SIGKILL');
- }, timeoutMs);
-
- // Collect output with size cap to prevent memory exhaustion
- proc.stdout.on('data', (data) => {
- if (stdout.length < MAX_OUTPUT_SIZE) {
- stdout += data.toString().slice(0, MAX_OUTPUT_SIZE - stdout.length);
- }
- });
-
- proc.stderr.on('data', (data) => {
- if (stderr.length < MAX_OUTPUT_SIZE) {
- stderr += data.toString().slice(0, MAX_OUTPUT_SIZE - stderr.length);
- }
- });
-
- // Handle process completion
- proc.on('close', (code) => {
- // Always cleanup
- clearTimeout(timeoutId);
- try {
- proc.stdout.destroy();
- } catch {
- /* already destroyed */
- }
- try {
- proc.stderr.destroy();
- } catch {
- /* already destroyed */
- }
-
- const duration = Date.now() - startTime;
- const output = stdout || stderr;
- const outputPreview = output.length > 500 ? output.slice(0, 500) + '...' : output;
-
- if (killed) {
- // Timeout case
- updateExecution(execution.id, {
- completedAt: new Date().toISOString(),
- status: 'timeout',
- duration,
- error: `Task timed out after ${task.timeout}s`,
- outputPreview: stderr.slice(0, 500) || undefined,
- });
-
- resolve({
- success: false,
- executionId: execution.id,
- status: 'timeout',
- duration,
- error: `Task timed out after ${task.timeout}s`,
- });
- } else if (code === 0) {
- // Success case
- updateExecution(execution.id, {
- completedAt: new Date().toISOString(),
- status: 'success',
- duration,
- exitCode: 0,
- outputPreview,
- });
-
- resolve({
- success: true,
- executionId: execution.id,
- status: 'success',
- exitCode: 0,
- duration,
- outputPreview,
- });
- } else {
- // Failure case
- updateExecution(execution.id, {
- completedAt: new Date().toISOString(),
- status: 'failure',
- duration,
- exitCode: code ?? undefined,
- error: stderr || `Process exited with code ${code}`,
- outputPreview: stderr.slice(0, 500) || undefined,
- });
-
- resolve({
- success: false,
- executionId: execution.id,
- status: 'failure',
- exitCode: code ?? undefined,
- duration,
- error: stderr || `Process exited with code ${code}`,
- });
- }
- });
-
- // Handle spawn errors
- proc.on('error', (err) => {
- clearTimeout(timeoutId);
- try {
- proc.stdout.destroy();
- } catch {
- /* already destroyed */
- }
- try {
- proc.stderr.destroy();
- } catch {
- /* already destroyed */
- }
-
- const duration = Date.now() - startTime;
-
- updateExecution(execution.id, {
- completedAt: new Date().toISOString(),
- status: 'failure',
- duration,
- error: err.message,
- });
-
- resolve({
- success: false,
- executionId: execution.id,
- status: 'failure',
- duration,
- error: err.message,
- });
- });
- });
-}
diff --git a/src/dreamer/actions/status.ts b/src/dreamer/actions/status.ts
deleted file mode 100644
index b719c08..0000000
--- a/src/dreamer/actions/status.ts
+++ /dev/null
@@ -1,102 +0,0 @@
-/**
- * Status Action Handler
- *
- * Gets the status of the Dreamer scheduler system.
- */
-
-import type { DreamerInput } from '../../tools/validation.js';
-import { getAllTasks, getAllExecutions } from '../store.js';
-import {
- getScheduler,
- isPlatformSupported,
- getPlatformName,
- getSchedulerName,
-} from '../scheduler/index.js';
-
-export interface StatusResult {
- success: boolean;
- platform: {
- name: string;
- supported: boolean;
- scheduler?: string;
- };
- scheduler?: {
- healthy: boolean;
- registeredCount: number;
- errors: string[];
- };
- tasks: {
- total: number;
- enabled: number;
- disabled: number;
- };
- executions: {
- total: number;
- running: number;
- recent: {
- success: number;
- failure: number;
- timeout: number;
- };
- };
- error?: string;
-}
-
-export async function handleStatus(_input: DreamerInput): Promise<StatusResult> {
- const supported = isPlatformSupported();
-
- // Get task counts
- const allTasks = getAllTasks();
- const enabledTasks = allTasks.filter((t) => t.enabled);
-
- // Get recent executions (last 100)
- const recentExecutions = getAllExecutions({ limit: 100 });
- const runningExecutions = recentExecutions.filter((e) => e.status === 'running');
- const successExecutions = recentExecutions.filter((e) => e.status === 'success');
- const failureExecutions = recentExecutions.filter((e) => e.status === 'failure');
- const timeoutExecutions = recentExecutions.filter((e) => e.status === 'timeout');
-
- const result: StatusResult = {
- success: true,
- platform: {
- name: getPlatformName(),
- supported,
- scheduler: supported ? getSchedulerName() : undefined,
- },
- tasks: {
- total: allTasks.length,
- enabled: enabledTasks.length,
- disabled: allTasks.length - enabledTasks.length,
- },
- executions: {
- total: recentExecutions.length,
- running: runningExecutions.length,
- recent: {
- success: successExecutions.length,
- failure: failureExecutions.length,
- timeout: timeoutExecutions.length,
- },
- },
- };
-
- // Get native scheduler status if supported
- if (supported) {
- try {
- const scheduler = getScheduler();
- const status = await scheduler.getStatus();
- result.scheduler = {
- healthy: status.healthy,
- registeredCount: status.taskCount,
- errors: status.errors,
- };
- } catch (error) {
- result.scheduler = {
- healthy: false,
- registeredCount: 0,
- errors: [error instanceof Error ? error.message : 'Failed to get scheduler status'],
- };
- }
- }
-
- return result;
-}
diff --git a/src/dreamer/cron/humanizer.ts b/src/dreamer/cron/humanizer.ts
deleted file mode 100644
index 1546fe5..0000000
--- a/src/dreamer/cron/humanizer.ts
+++ /dev/null
@@ -1,163 +0,0 @@
-/**
- * Cron Expression Humanization
- *
- * Converts cron expressions to human-readable descriptions and provides
- * date formatting utilities for execution records.
- */
-
-import cronstrue from 'cronstrue';
-
-/**
- * Convert a cron expression to a human-readable description
- */
-export function cronToHuman(expression: string): string {
- try {
- return cronstrue.toString(expression, {
- use24HourTimeFormat: false,
- verbose: false,
- });
- } catch {
- return expression;
- }
-}
-
-/**
- * Convert a cron expression to a verbose human-readable description
- */
-export function cronToHumanVerbose(expression: string): string {
- try {
- return cronstrue.toString(expression, {
- use24HourTimeFormat: false,
- verbose: true,
- });
- } catch {
- return expression;
- }
-}
-
-/**
- * Format a date for display
- */
-export function formatDate(date: Date): string {
- return date.toLocaleString('en-US', {
- weekday: 'short',
- month: 'short',
- day: 'numeric',
- hour: 'numeric',
- minute: '2-digit',
- hour12: true,
- });
-}
-
-/**
- * Format a date as ISO string for storage
- */
-export function formatISO(date: Date): string {
- return date.toISOString();
-}
-
-/**
- * Format a duration in milliseconds to human-readable string
- */
-export function formatDuration(ms: number): string {
- if (ms < 1000) {
- return `${ms}ms`;
- }
-
- const seconds = Math.floor(ms / 1000);
- if (seconds < 60) {
- return `${seconds}s`;
- }
-
- const minutes = Math.floor(seconds / 60);
- const remainingSeconds = seconds % 60;
-
- if (minutes < 60) {
- return remainingSeconds > 0
- ? `${minutes}m ${remainingSeconds}s`
- : `${minutes}m`;
- }
-
- const hours = Math.floor(minutes / 60);
- const remainingMinutes = minutes % 60;
-
- return remainingMinutes > 0
- ? `${hours}h ${remainingMinutes}m`
- : `${hours}h`;
-}
-
-/**
- * Format time ago (e.g., "2 hours ago", "3 days ago")
- */
-export function formatTimeAgo(date: Date): string {
- const now = new Date();
- const diffMs = now.getTime() - date.getTime();
- const diffMins = Math.round(diffMs / (1000 * 60));
- const diffHours = Math.round(diffMs / (1000 * 60 * 60));
- const diffDays = Math.round(diffMs / (1000 * 60 * 60 * 24));
-
- if (diffMins < 1) {
- return 'just now';
- }
-
- if (diffMins < 60) {
- return `${diffMins} minute${diffMins === 1 ? '' : 's'} ago`;
- }
-
- if (diffHours < 24) {
- return `${diffHours} hour${diffHours === 1 ? '' : 's'} ago`;
- }
-
- if (diffDays === 1) {
- return 'yesterday';
- }
-
- if (diffDays < 7) {
- return `${diffDays} days ago`;
- }
-
- return formatDate(date);
-}
-
-/**
- * Format time until (e.g., "in 2 hours", "in 3 days")
- */
-export function formatTimeUntil(date: Date): string {
- const now = new Date();
- const diffMs = date.getTime() - now.getTime();
-
- if (diffMs < 0) {
- return 'now';
- }
-
- const diffMins = Math.round(diffMs / (1000 * 60));
- const diffHours = Math.round(diffMs / (1000 * 60 * 60));
- const diffDays = Math.round(diffMs / (1000 * 60 * 60 * 24));
-
- if (diffMins < 1) {
- return 'in less than a minute';
- }
-
- if (diffMins < 60) {
- return `in ${diffMins} minute${diffMins === 1 ? '' : 's'}`;
- }
-
- if (diffHours < 24) {
- return `in ${diffHours} hour${diffHours === 1 ? '' : 's'}`;
- }
-
- if (diffDays === 1) {
- return 'tomorrow';
- }
-
- if (diffDays < 7) {
- return `in ${diffDays} days`;
- }
-
- return formatDate(date);
-}
-
-/**
- * Index exports
- */
-export { formatDate as formatDateShort };
diff --git a/src/dreamer/cron/index.ts b/src/dreamer/cron/index.ts
deleted file mode 100644
index c82767b..0000000
--- a/src/dreamer/cron/index.ts
+++ /dev/null
@@ -1,24 +0,0 @@
-/**
- * Cron Utilities
- *
- * Provides cron expression validation, parsing, and humanization.
- */
-
-export {
- validateCron,
- getNextRuns,
- getNextRun,
- naturalLanguageToCron,
- parseSchedule,
- CRON_PRESETS,
-} from './parser.js';
-
-export {
- cronToHuman,
- cronToHumanVerbose,
- formatDate,
- formatISO,
- formatDuration,
- formatTimeAgo,
- formatTimeUntil,
-} from './humanizer.js';
diff --git a/src/dreamer/cron/parser.ts b/src/dreamer/cron/parser.ts
deleted file mode 100644
index cd7dd39..0000000
--- a/src/dreamer/cron/parser.ts
+++ /dev/null
@@ -1,236 +0,0 @@
-/**
- * Cron Expression Parsing and Validation
- *
- * Provides cron validation, natural language parsing, and next run calculations.
- * Uses croner for validation and execution scheduling.
- */
-
-import { Cron } from 'croner';
-
-/**
- * Validate a cron expression
- */
-export function validateCron(expression: string): {
- valid: boolean;
- error?: string;
-} {
- try {
- new Cron(expression);
- return { valid: true };
- } catch (error) {
- return {
- valid: false,
- error: error instanceof Error ? error.message : 'Invalid cron expression',
- };
- }
-}
-
-/**
- * Get the next N run times for a cron expression
- */
-export function getNextRuns(
- expression: string,
- count: number = 5,
- timezone?: string
-): Date[] {
- const cron = new Cron(expression, {
- timezone: timezone === 'local' ? undefined : timezone,
- });
-
- const runs: Date[] = [];
- let next = cron.nextRun();
-
- while (next && runs.length < count) {
- runs.push(next);
- next = cron.nextRun(next);
- }
-
- return runs;
-}
-
-/**
- * Get the next run time for a cron expression
- */
-export function getNextRun(expression: string, timezone?: string): Date | null {
- const runs = getNextRuns(expression, 1, timezone);
- return runs[0] || null;
-}
-
-/**
- * Common cron presets with human-readable names
- */
-export const CRON_PRESETS: Record<string, { expression: string; description: string }> = {
- 'every-minute': {
- expression: '* * * * *',
- description: 'Every minute',
- },
- 'every-5-minutes': {
- expression: '*/5 * * * *',
- description: 'Every 5 minutes',
- },
- 'every-15-minutes': {
- expression: '*/15 * * * *',
- description: 'Every 15 minutes',
- },
- 'every-30-minutes': {
- expression: '*/30 * * * *',
- description: 'Every 30 minutes',
- },
- 'hourly': {
- expression: '0 * * * *',
- description: 'Every hour',
- },
- 'daily-midnight': {
- expression: '0 0 * * *',
- description: 'Daily at midnight',
- },
- 'daily-9am': {
- expression: '0 9 * * *',
- description: 'Daily at 9:00 AM',
- },
- 'daily-6pm': {
- expression: '0 18 * * *',
- description: 'Daily at 6:00 PM',
- },
- 'weekdays-9am': {
- expression: '0 9 * * 1-5',
- description: 'Weekdays at 9:00 AM',
- },
- 'weekly-monday': {
- expression: '0 9 * * 1',
- description: 'Every Monday at 9:00 AM',
- },
- 'weekly-friday': {
- expression: '0 17 * * 5',
- description: 'Every Friday at 5:00 PM',
- },
- 'monthly-first': {
- expression: '0 9 1 * *',
- description: 'First day of month at 9:00 AM',
- },
-};
-
-/**
- * Try to parse natural language into a cron expression
- * Returns undefined if parsing fails
- */
-export function naturalLanguageToCron(input: string): string | undefined {
- const lower = input.toLowerCase().trim();
-
- // Check presets first
- for (const [, preset] of Object.entries(CRON_PRESETS)) {
- if (lower === preset.description.toLowerCase()) {
- return preset.expression;
- }
- }
-
- // Common patterns
- const patterns: [RegExp, string | ((match: RegExpMatchArray) => string)][] = [
- // Every X minutes
- [/^every\s+(\d+)\s+minutes?$/i, (m) => `*/${m[1]} * * * *`],
-
- // Every X hours
- [/^every\s+(\d+)\s+hours?$/i, (m) => `0 */${m[1]} * * *`],
-
- // Every hour
- [/^every\s+hour$/i, '0 * * * *'],
- [/^hourly$/i, '0 * * * *'],
-
- // Daily at X (various formats)
- [/^daily\s+at\s+(\d{1,2})(?::(\d{2}))?\s*(am|pm)?$/i, parseTimePattern],
- [/^every\s+day\s+at\s+(\d{1,2})(?::(\d{2}))?\s*(am|pm)?$/i, parseTimePattern],
-
- // Time without "daily" prefix (e.g., "at 9am", "at 10:30pm")
- [/^at\s+(\d{1,2})(?::?(\d{2}))?\s*(am|pm)?$/i, parseTimePattern],
-
- // Compact time format (e.g., "9am", "1015am", "10:30pm")
- [/^(\d{1,2})(?::?(\d{2}))?\s*(am|pm)$/i, parseTimePattern],
-
- // Weekdays at X
- [/^(?:every\s+)?weekdays?\s+at\s+(\d{1,2})(?::(\d{2}))?\s*(am|pm)?$/i, (m) => {
- const { hour, minute } = parseTime(m);
- return `${minute} ${hour} * * 1-5`;
- }],
-
- // Every Monday/Tuesday/etc at X
- [/^every\s+(monday|tuesday|wednesday|thursday|friday|saturday|sunday)\s+at\s+(\d{1,2})(?::(\d{2}))?\s*(am|pm)?$/i, (m) => {
- const days: Record<string, number> = {
- sunday: 0, monday: 1, tuesday: 2, wednesday: 3,
- thursday: 4, friday: 5, saturday: 6,
- };
- const dayName = m[1] ?? 'monday';
- const day = days[dayName.toLowerCase()] ?? 1;
- const { hour, minute } = parseTime(m, 2);
- return `${minute} ${hour} * * ${day}`;
- }],
-
- // Weekly (defaults to Monday 9am)
- [/^weekly$/i, '0 9 * * 1'],
-
- // Monthly (defaults to 1st at 9am)
- [/^monthly$/i, '0 9 1 * *'],
- ];
-
- for (const [pattern, result] of patterns) {
- const match = lower.match(pattern);
- if (match) {
- if (typeof result === 'function') {
- return result(match);
- }
- return result;
- }
- }
-
- return undefined;
-}
-
-/**
- * Helper to parse time from regex match
- */
-function parseTime(m: RegExpMatchArray, hourIdx = 1): { hour: number; minute: number } {
- const hourStr = m[hourIdx];
- let hour = hourStr ? parseInt(hourStr, 10) : 0;
- const minuteStr = m[hourIdx + 1];
- const minute = minuteStr ? parseInt(minuteStr, 10) : 0;
- const period = m[hourIdx + 2]?.toLowerCase();
-
- if (period === 'pm' && hour < 12) hour += 12;
- if (period === 'am' && hour === 12) hour = 0;
-
- return { hour, minute };
-}
-
-/**
- * Pattern handler for time-based cron expressions
- */
-function parseTimePattern(m: RegExpMatchArray): string {
- const { hour, minute } = parseTime(m);
- return `${minute} ${hour} * * *`;
-}
-
-/**
- * Parse schedule input - tries cron first, then natural language
- */
-export function parseSchedule(input: string): {
- expression: string;
- parsed: boolean;
- error?: string;
-} {
- // First check if it's already a valid cron expression
- const cronResult = validateCron(input);
- if (cronResult.valid) {
- return { expression: input, parsed: false };
- }
-
- // Try natural language parsing
- const nlCron = naturalLanguageToCron(input);
- if (nlCron) {
- return { expression: nlCron, parsed: true };
- }
-
- return {
- expression: '',
- parsed: false,
- error: `Could not parse schedule: "${input}". Use a cron expression (e.g., "0 9 * * *") or natural language (e.g., "every day at 9am").`,
- };
-}
diff --git a/src/dreamer/index.ts b/src/dreamer/index.ts
deleted file mode 100644
index 00ce396..0000000
--- a/src/dreamer/index.ts
+++ /dev/null
@@ -1,103 +0,0 @@
-/**
- * Claude Dreamer - Scheduled Task Automation
- *
- * Unified entry point for the matrix_dreamer MCP tool.
- * Dispatches actions to appropriate handlers.
- */
-
-import type { DreamerInput } from '../tools/validation.js';
-import {
- handleAdd,
- handleList,
- handleRun,
- handleRemove,
- handleStatus,
- handleLogs,
- handleHistory,
- type AddResult,
- type ListResult,
- type RunResult,
- type RemoveResult,
- type StatusResult,
- type LogsResult,
- type HistoryResult,
-} from './actions/index.js';
-
-// Re-export types
-export type {
- DreamerTask,
- DreamerExecution,
- ExecutionStatus,
- SchedulerStatus,
-} from './types.js';
-
-export type {
- AddResult,
- ListResult,
- RunResult,
- RemoveResult,
- StatusResult,
- LogsResult,
- HistoryResult,
-};
-
-// Union of all possible results
-export type DreamerResult =
- | AddResult
- | ListResult
- | RunResult
- | RemoveResult
- | StatusResult
- | LogsResult
- | HistoryResult;
-
-/**
- * Main handler for the matrix_dreamer tool.
- * Dispatches to the appropriate action handler based on input.action.
- */
-export async function matrixDreamer(input: DreamerInput): Promise<DreamerResult> {
- switch (input.action) {
- case 'add':
- return handleAdd(input);
-
- case 'list':
- return handleList(input);
-
- case 'run':
- return handleRun(input);
-
- case 'remove':
- return handleRemove(input);
-
- case 'status':
- return handleStatus(input);
-
- case 'logs':
- return handleLogs(input);
-
- case 'history':
- return handleHistory(input);
-
- default:
- return {
- success: false,
- error: `Unknown action: ${(input as { action: string }).action}`,
- } as RunResult;
- }
-}
-
-// Re-export utilities for external use
-export {
- getScheduler,
- isPlatformSupported,
- getPlatformName,
- getSchedulerName,
-} from './scheduler/index.js';
-
-export {
- validateCron,
- parseSchedule,
- cronToHuman,
- getNextRuns,
- getNextRun,
-} from './cron/index.js';
diff --git a/src/dreamer/scheduler/base.ts b/src/dreamer/scheduler/base.ts
deleted file mode 100644
index 29e3e70..0000000
--- a/src/dreamer/scheduler/base.ts
+++ /dev/null
@@ -1,234 +0,0 @@
-/**
- * Base Scheduler - Abstract Class for Platform Schedulers
- *
- * Provides common functionality for platform-specific scheduler implementations.
- * Handles command generation, worktree script generation, and shared utilities.
- */
-
-import type { DreamerTask, SchedulerStatus } from '../types.js';
-import { shellEscape, sanitizeForComment } from './shell.js';
-import { getConfig } from '../../config/index.js';
-
-/**
- * Abstract base class for platform-specific schedulers
- */
-export abstract class BaseScheduler {
- /**
- * Human-readable name of the scheduler
- */
- abstract readonly name: string;
-
- /**
- * Platform identifier
- */
- abstract readonly platform: 'darwin' | 'linux';
-
- /**
- * Register a task with the native scheduler
- */
- abstract register(task: DreamerTask): Promise<void>;
-
- /**
- * Unregister a task from the native scheduler
- */
- abstract unregister(taskId: string): Promise<void>;
-
- /**
- * Check if a task is registered
- */
- abstract isRegistered(taskId: string): Promise<boolean>;
-
- /**
- * Get current scheduler status
- */
- abstract getStatus(): Promise<SchedulerStatus>;
-
- /**
- * List all registered task IDs
- */
- abstract listRegistered(): Promise<string[]>;
-
- /**
- * Generate the command to execute Claude in non-interactive mode
- *
- * Security: Uses single quotes for the prompt to prevent shell expansion
- * of backticks, $(), and other shell metacharacters.
- */
- protected getExecutionCommand(task: DreamerTask): string {
- const command = task.command;
-
- // Use shellEscape to wrap in single quotes (prevents all shell expansion)
- const escapedCommand = shellEscape(command);
-
- // Build the claude command
- // Note: timeout is handled by the native scheduler, not Claude CLI
- // The -p flag enables non-interactive print mode
- const flags: string[] = [];
- if (task.skipPermissions) {
- flags.push('--dangerously-skip-permissions');
- }
-
- const flagStr = flags.length > 0 ? ` ${flags.join(' ')}` : '';
- return `claude -p ${escapedCommand}${flagStr}`;
- }
-
- /**
- * Get the working directory for a task
- */
- protected getWorkingDirectory(task: DreamerTask): string {
- return task.workingDirectory || '.';
- }
-
- /**
- * Get the task label/identifier for the native scheduler
- */
- protected getTaskLabel(taskId: string): string {
- return `com.claude.dreamer.${taskId}`;
- }
-
- /**
- * Check if task uses worktree execution
- */
- protected usesWorktree(task: DreamerTask): boolean {
- return task.worktreeEnabled === true;
- }
-
- /**
- * Generate a shell script for git worktree-based execution
- * Returns null if worktree is not enabled
- *
- * Security: All user-provided values are escaped using shellEscape()
- * to prevent command injection attacks.
- */
- protected generateWorktreeScript(task: DreamerTask, logDir: string): string | null {
- if (!task.worktreeEnabled) {
- return null;
- }
-
- // Escape all user-provided values for safe shell embedding
- const config = getConfig();
- const defaultPrefix = config.dreamer.worktree.defaultBranchPrefix;
- const defaultRemote = config.dreamer.worktree.defaultRemote;
-
- const mainRepoEsc = shellEscape(this.getWorkingDirectory(task));
- const taskIdEsc = shellEscape(task.id);
- const taskNameEsc = shellEscape(task.name);
- const branchPrefixEsc = shellEscape(task.worktreeBranchPrefix || defaultPrefix);
- const remoteNameEsc = shellEscape(task.worktreeRemote || defaultRemote);
- const logDirEsc = shellEscape(logDir);
-
- // Sanitize task name for comment (strip anything that could break comment)
- const commentSafeName = sanitizeForComment(task.name);
-
- // Base path handling - either escaped user value or computed default
- const basePathLine = task.worktreeBasePath
- ? `WORKTREE_BASE=${shellEscape(task.worktreeBasePath)}`
- : 'WORKTREE_BASE="$(dirname "$MAIN_REPO")/.$(basename "$MAIN_REPO")-worktrees"';
-
- // The command is already formatted for shell execution by getExecutionCommand()
- // It handles escaping of the user prompt. We embed it directly as a command.
- const command = this.getExecutionCommand(task);
-
- return `#!/bin/bash
-set -e
-
-# Task: ${commentSafeName} (${task.id})
-# Generated by Claude Matrix Dreamer
-
-MAIN_REPO=${mainRepoEsc}
-TASK_ID=${taskIdEsc}
-TASK_NAME=${taskNameEsc}
-TIMESTAMP=$(date +%s)
-WORKTREE_NAME="task-${'$'}{TASK_ID:0:8}-$TIMESTAMP"
-BRANCH_PREFIX=${branchPrefixEsc}
-REMOTE=${remoteNameEsc}
-LOG_DIR=${logDirEsc}
-
-# Resolve main repo to absolute path
-if [[ ! "$MAIN_REPO" = /* ]]; then
- MAIN_REPO="$(pwd)/$MAIN_REPO"
-fi
-MAIN_REPO=$(cd "$MAIN_REPO" && pwd)
-
-# Verify git repo
-if [ ! -d "$MAIN_REPO/.git" ]; then
- echo "Error: Not a git repository: $MAIN_REPO" >&2
- exit 1
-fi
-
-# Determine worktree base path
-${basePathLine}
-WORKTREE_PATH="$WORKTREE_BASE/$WORKTREE_NAME"
-BRANCH_NAME="\${BRANCH_PREFIX}\${WORKTREE_NAME}"
-
-echo "Creating worktree at: $WORKTREE_PATH"
-
-# Create worktree base directory
-mkdir -p "$WORKTREE_BASE"
-
-# Create git worktree with new branch
-git -C "$MAIN_REPO" worktree add "$WORKTREE_PATH" -b "$BRANCH_NAME"
-
-# Execute Claude in the worktree
-cd "$WORKTREE_PATH"
-echo "Executing Claude in: $WORKTREE_PATH"
-CLAUDE_EXIT=0
-${command} || CLAUDE_EXIT=$?
-
-echo "Claude exited with code: $CLAUDE_EXIT"
-
-# Commit and push changes
-COMMIT_MSG="Claude task: $TASK_NAME [$TASK_ID]"
-PUSHED=false
-HAS_CHANGES=false
-
-git add -A
-if ! git diff --cached --quiet; then
- HAS_CHANGES=true
- git commit -m "$COMMIT_MSG"
- if git push -u "$REMOTE" "$BRANCH_NAME"; then
- PUSHED=true
- echo "Changes pushed to $REMOTE/$BRANCH_NAME"
- else
- echo "Warning: Failed to push changes" >&2
- fi
-else
- echo "No changes to commit"
-fi
-
-# Cleanup: remove worktree if push succeeded or no changes
-if [ "$PUSHED" = "true" ] || [ "$HAS_CHANGES" = "false" ]; then
- echo "Cleaning up worktree..."
- cd "$MAIN_REPO"
- git worktree remove "$WORKTREE_PATH" --force || true
- echo "Worktree removed"
-else
- echo "Keeping worktree at $WORKTREE_PATH for manual review (push failed)"
-fi
-
-exit $CLAUDE_EXIT
-`;
- }
-}
-
-/**
- * Error thrown when scheduler operations fail
- */
-export class SchedulerError extends Error {
- public readonly platform: string;
- public readonly operation: string;
- public override readonly cause?: Error;
-
- constructor(
- message: string,
- platform: string,
- operation: string,
- cause?: Error
- ) {
- super(message);
- this.name = 'SchedulerError';
- this.platform = platform;
- this.operation = operation;
- this.cause = cause;
- }
-}
diff --git a/src/dreamer/scheduler/darwin.ts b/src/dreamer/scheduler/darwin.ts
deleted file mode 100644
index aef28bb..0000000
--- a/src/dreamer/scheduler/darwin.ts
+++ /dev/null
@@ -1,373 +0,0 @@
-/**
- * Darwin (macOS) Scheduler - launchd Implementation
- *
- * Manages scheduled tasks using macOS launchd plist files.
- */
-
-import { exec } from 'child_process';
-import { promisify } from 'util';
-import { existsSync, mkdirSync, writeFileSync, unlinkSync, readdirSync } from 'fs';
-import { join } from 'path';
-import { homedir } from 'os';
-import { BaseScheduler, SchedulerError } from './base.js';
-import { shellEscape } from './shell.js';
-import type { DreamerTask, SchedulerStatus } from '../types.js';
-
-const execAsync = promisify(exec);
-
-/**
- * Get the logs directory for Dreamer
- */
-function getLogsDir(): string {
- const dir = join(homedir(), '.claude', 'matrix', 'dreamer', 'logs');
- if (!existsSync(dir)) {
- mkdirSync(dir, { recursive: true });
- }
- return dir;
-}
-
-/**
- * macOS launchd scheduler implementation
- */
-export class DarwinScheduler extends BaseScheduler {
- readonly name = 'launchd';
- readonly platform = 'darwin' as const;
-
- private readonly launchAgentsDir: string;
-
- constructor() {
- super();
- this.launchAgentsDir = join(homedir(), 'Library', 'LaunchAgents');
- }
-
- /**
- * Get the plist file path for a task
- */
- private getPlistPath(taskId: string): string {
- return join(this.launchAgentsDir, `${this.getTaskLabel(taskId)}.plist`);
- }
-
- /**
- * Get the path for a worktree script
- */
- private getWorktreeScriptPath(taskId: string): string {
- return join(getLogsDir(), `${taskId}.worktree.sh`);
- }
-
- async register(task: DreamerTask): Promise<void> {
- try {
- // Ensure directories exist
- if (!existsSync(this.launchAgentsDir)) {
- mkdirSync(this.launchAgentsDir, { recursive: true });
- }
- const logDir = getLogsDir();
-
- // Generate and write worktree script if enabled
- if (this.usesWorktree(task)) {
- const script = this.generateWorktreeScript(task, logDir);
- if (script) {
- const scriptPath = this.getWorktreeScriptPath(task.id);
- writeFileSync(scriptPath, script, { mode: 0o755 });
- }
- }
-
- const plistContent = this.generatePlist(task);
- const plistPath = this.getPlistPath(task.id);
-
- // Unload existing if present (ignore errors)
- try {
- await execAsync(`launchctl unload ${shellEscape(plistPath)}`);
- } catch {
- // Ignore - might not be loaded
- }
-
- // Write plist file
- writeFileSync(plistPath, plistContent, 'utf-8');
-
- // Load the agent
- await execAsync(`launchctl load ${shellEscape(plistPath)}`);
- } catch (error) {
- throw new SchedulerError(
- `Failed to register task "${task.name}" with launchd`,
- this.platform,
- 'register',
- error as Error
- );
- }
- }
-
- async unregister(taskId: string): Promise<void> {
- const plistPath = this.getPlistPath(taskId);
-
- try {
- // Unload agent (ignore errors if not loaded)
- try {
- await execAsync(`launchctl unload ${shellEscape(plistPath)}`);
- } catch {
- // Ignore - might not be loaded
- }
-
- // Remove plist file
- if (existsSync(plistPath)) {
- unlinkSync(plistPath);
- }
-
- // Remove worktree script if exists
- const scriptPath = this.getWorktreeScriptPath(taskId);
- if (existsSync(scriptPath)) {
- unlinkSync(scriptPath);
- }
- } catch (error) {
- throw new SchedulerError(
- `Failed to unregister task "${taskId}" from launchd`,
- this.platform,
- 'unregister',
- error as Error
- );
- }
- }
-
- async isRegistered(taskId: string): Promise<boolean> {
- const plistPath = this.getPlistPath(taskId);
- return existsSync(plistPath);
- }
-
- async getStatus(): Promise<SchedulerStatus> {
- const tasks = await this.listRegistered();
- const errors: string[] = [];
-
- for (const taskId of tasks) {
- try {
- const label = this.getTaskLabel(taskId);
- const { stdout } = await execAsync(`launchctl list ${label}`);
- if (stdout.includes('Could not find')) {
- errors.push(`Task ${taskId} plist exists but not loaded`);
- }
- } catch {
- errors.push(`Task ${taskId} check failed`);
- }
- }
-
- return {
- healthy: errors.length === 0,
- taskCount: tasks.length,
- errors,
- platform: this.platform,
- };
- }
-
- async listRegistered(): Promise<string[]> {
- try {
- if (!existsSync(this.launchAgentsDir)) {
- return [];
- }
-
- const files = readdirSync(this.launchAgentsDir);
- const prefix = 'com.claude.dreamer.';
- const suffix = '.plist';
-
- return files
- .filter((f) => f.startsWith(prefix) && f.endsWith(suffix))
- .map((f) => f.slice(prefix.length, -suffix.length));
- } catch {
- return [];
- }
- }
-
- /**
- * Generate launchd plist content for a task
- */
- private generatePlist(task: DreamerTask): string {
- const label = this.getTaskLabel(task.id);
- const calendarInterval = this.cronToCalendarInterval(task.cronExpression);
- const logDir = getLogsDir();
-
- // Determine the program arguments based on whether worktree is enabled
- let programArgs: string;
- if (this.usesWorktree(task)) {
- const scriptPath = this.getWorktreeScriptPath(task.id);
- programArgs = ` <key>ProgramArguments</key>
- <array>
- <string>/bin/bash</string>
- <string>${this.escapeXml(scriptPath)}</string>
- </array>`;
- } else {
- const command = this.getExecutionCommand(task);
- const workDir = this.getWorkingDirectory(task);
- // Use shell escaping for workDir (single quotes prevent injection)
- // Then XML escape the entire bash command for plist embedding
- const shellCmd = `cd ${shellEscape(workDir)} && ${command}`;
- programArgs = ` <key>ProgramArguments</key>
- <array>
- <string>/bin/bash</string>
- <string>-c</string>
- <string>${this.escapeXml(shellCmd)}</string>
- </array>`;
- }
-
- // Build environment variables
- let envSection = ` <key>EnvironmentVariables</key>
- <dict>
- <key>PATH</key>
- <string>/usr/local/bin:/usr/bin:/bin:/opt/homebrew/bin:${process.env.HOME}/.local/bin</string>`;
-
- // Add custom env vars
- if (task.env && Object.keys(task.env).length > 0) {
- for (const [key, value] of Object.entries(task.env)) {
- envSection += `
- <key>${this.escapeXml(key)}</key>
- <string>${this.escapeXml(value)}</string>`;
- }
- }
- envSection += `
- </dict>`;
-
- return `<?xml version="1.0" encoding="UTF-8"?>
-<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
-<plist version="1.0">
-<dict>
- <key>Label</key>
- <string>${this.escapeXml(label)}</string>
-
-${programArgs}
-
- ${calendarInterval}
-
- <key>StandardOutPath</key>
- <string>${logDir}/${task.id}.out.log</string>
-
- <key>StandardErrorPath</key>
- <string>${logDir}/${task.id}.err.log</string>
-
- <key>RunAtLoad</key>
- <false/>
-
-${envSection}
-</dict>
-</plist>`;
- }
-
- /**
- * Convert cron expression to launchd StartCalendarInterval
- */
- private cronToCalendarInterval(expression: string): string {
- const parts = expression.trim().split(/\s+/);
- if (parts.length !== 5) {
- throw new Error(`Invalid cron expression: ${expression}`);
- }
-
- const minute = parts[0]!;
- const hour = parts[1]!;
- const day = parts[2]!;
- const month = parts[3]!;
- const weekday = parts[4]!;
-
- // Parse each field with appropriate bounds
- const mins = minute !== '*' ? this.parseField(minute, 0, 59) : null;
- const hours = hour !== '*' ? this.parseField(hour, 0, 23) : null;
- const days = day !== '*' ? this.parseField(day, 1, 31) : null;
- const months = month !== '*' ? this.parseField(month, 1, 12) : null;
- const weekdays = weekday !== '*' ? this.parseField(weekday, 0, 6) : null;
-
- // Build the dict entries for a single interval
- const buildDict = (
- m: number | null,
- h: number | null,
- d: number | null,
- mo: number | null,
- wd: number | null
- ): string => {
- let dict = ' <dict>\n';
- if (m !== null) dict += ` <key>Minute</key>\n <integer>${m}</integer>\n`;
- if (h !== null) dict += ` <key>Hour</key>\n <integer>${h}</integer>\n`;
- if (d !== null) dict += ` <key>Day</key>\n <integer>${d}</integer>\n`;
- if (mo !== null) dict += ` <key>Month</key>\n <integer>${mo}</integer>\n`;
- if (wd !== null) dict += ` <key>Weekday</key>\n <integer>${wd}</integer>\n`;
- dict += ' </dict>';
- return dict;
- };
-
- // Generate cartesian product of all field values
- const minValues = mins ?? [null];
- const hourValues = hours ?? [null];
- const dayValues = days ?? [null];
- const monthValues = months ?? [null];
- const weekdayValues = weekdays ?? [null];
-
- const dicts: string[] = [];
- for (const m of minValues) {
- for (const h of hourValues) {
- for (const d of dayValues) {
- for (const mo of monthValues) {
- for (const wd of weekdayValues) {
- dicts.push(buildDict(m, h, d, mo, wd));
- }
- }
- }
- }
- }
-
- // If only one interval, use dict format; otherwise use array format
- if (dicts.length === 1) {
- return `<key>StartCalendarInterval</key>\n${dicts[0]}`;
- }
-
- return `<key>StartCalendarInterval</key>\n <array>\n${dicts.join('\n')}\n </array>`;
- }
-
- /**
- * Parse a cron field to extract numeric values
- */
- private parseField(field: string, min: number, max: number): number[] {
- const values: number[] = [];
-
- for (const part of field.split(',')) {
- if (part.includes('/')) {
- // Handle step values like */15 or 0-30/5
- const splitParts = part.split('/');
- const range = splitParts[0] ?? '*';
- const step = Number(splitParts[1] ?? 1);
- let start = min;
- let end = max;
-
- if (range !== '*') {
- if (range.includes('-')) {
- const rangeParts = range.split('-').map(Number);
- start = rangeParts[0] ?? min;
- end = rangeParts[1] ?? max;
- } else {
- start = Number(range);
- }
- }
-
- for (let i = start; i <= end; i += step) {
- values.push(i);
- }
- } else if (part.includes('-')) {
- const rangeParts = part.split('-').map(Number);
- const start = rangeParts[0] ?? min;
- const end = rangeParts[1] ?? max;
- for (let i = start; i <= end; i++) {
- values.push(i);
- }
- } else {
- values.push(Number(part));
- }
- }
-
- // Filter values to be within valid bounds
- return values.filter((v) => v >= min && v <= max);
- }
-
- /**
- * Escape XML special characters
- */
- private escapeXml(str: string): string {
- return str
- .replace(/&/g, '&amp;')
- .replace(/</g, '&lt;')
- .replace(/>/g, '&gt;')
- .replace(/"/g, '&quot;')
- .replace(/'/g, '&#39;');
- }
-}
diff --git a/src/dreamer/scheduler/factory.ts b/src/dreamer/scheduler/factory.ts
deleted file mode 100644
index 140eb58..0000000
--- a/src/dreamer/scheduler/factory.ts
+++ /dev/null
@@ -1,89 +0,0 @@
-/**
- * Scheduler Factory
- *
- * Detects the current platform and returns the appropriate scheduler.
- * Supports macOS (launchd) and Linux (crontab).
- */
-
-import { platform } from 'os';
-import { BaseScheduler, SchedulerError } from './base.js';
-import { DarwinScheduler } from './darwin.js';
-import { LinuxScheduler } from './linux.js';
-
-/**
- * Cached scheduler instance
- */
-let cachedScheduler: BaseScheduler | null = null;
-
-/**
- * Get the appropriate scheduler for the current platform
- */
-export function getScheduler(): BaseScheduler {
- if (cachedScheduler) {
- return cachedScheduler;
- }
-
- const currentPlatform = platform();
-
- switch (currentPlatform) {
- case 'darwin':
- cachedScheduler = new DarwinScheduler();
- break;
- case 'linux':
- cachedScheduler = new LinuxScheduler();
- break;
- default:
- throw new SchedulerError(
- `Unsupported platform: ${currentPlatform}. Dreamer supports macOS (launchd) and Linux (crontab).`,
- currentPlatform,
- 'init'
- );
- }
-
- return cachedScheduler;
-}
-
-/**
- * Get the platform name
- */
-export function getPlatformName(): string {
- const currentPlatform = platform();
- switch (currentPlatform) {
- case 'darwin':
- return 'macOS';
- case 'linux':
- return 'Linux';
- default:
- return currentPlatform;
- }
-}
-
-/**
- * Get the native scheduler name for the current platform
- */
-export function getSchedulerName(): string {
- const currentPlatform = platform();
- switch (currentPlatform) {
- case 'darwin':
- return 'launchd';
- case 'linux':
- return 'crontab';
- default:
- return 'unknown';
- }
-}
-
-/**
- * Check if the current platform is supported
- */
-export function isPlatformSupported(): boolean {
- const currentPlatform = platform();
- return ['darwin', 'linux'].includes(currentPlatform);
-}
-
-/**
- * Reset the cached scheduler (useful for testing)
- */
-export function resetSchedulerCache(): void {
- cachedScheduler = null;
-}
diff --git a/src/dreamer/scheduler/index.ts b/src/dreamer/scheduler/index.ts
deleted file mode 100644
index ff094b8..0000000
--- a/src/dreamer/scheduler/index.ts
+++ /dev/null
@@ -1,24 +0,0 @@
-/**
- * Scheduler Module Exports
- *
- * Platform-agnostic scheduler interface for managing scheduled tasks.
- */
-
-export { BaseScheduler, SchedulerError } from './base.js';
-export { DarwinScheduler } from './darwin.js';
-export { LinuxScheduler } from './linux.js';
-export {
- getScheduler,
- getPlatformName,
- getSchedulerName,
- isPlatformSupported,
- resetSchedulerCache,
-} from './factory.js';
-export {
- shellEscape,
- sanitizeForComment,
- isSafeIdentifier,
- GIT_REF_PATTERN,
- GIT_REMOTE_PATTERN,
- SAFE_PATH_PATTERN,
-} from './shell.js';
diff --git a/src/dreamer/scheduler/linux.ts b/src/dreamer/scheduler/linux.ts
deleted file mode 100644
index 1e464a9..0000000
--- a/src/dreamer/scheduler/linux.ts
+++ /dev/null
@@ -1,196 +0,0 @@
-/**
- * Linux Scheduler - crontab Implementation
- *
- * Manages scheduled tasks using the system crontab.
- */
-
-import { exec } from 'child_process';
-import { promisify } from 'util';
-import { existsSync, mkdirSync, writeFileSync, unlinkSync } from 'fs';
-import { join } from 'path';
-import { homedir } from 'os';
-import { BaseScheduler, SchedulerError } from './base.js';
-import { shellEscape } from './shell.js';
-import type { DreamerTask, SchedulerStatus } from '../types.js';
-
-const execAsync = promisify(exec);
-
-/**
- * Get the logs directory for Dreamer
- */
-function getLogsDir(): string {
- const dir = join(homedir(), '.claude', 'matrix', 'dreamer', 'logs');
- if (!existsSync(dir)) {
- mkdirSync(dir, { recursive: true });
- }
- return dir;
-}
-
-/**
- * Linux crontab scheduler implementation
- */
-export class LinuxScheduler extends BaseScheduler {
- readonly name = 'crontab';
- readonly platform = 'linux' as const;
-
- /**
- * Marker comment to identify our cron entries
- */
- private readonly MARKER_PREFIX = '# claude-dreamer:';
-
- /**
- * Get the path for a worktree script
- */
- private getWorktreeScriptPath(taskId: string): string {
- return join(getLogsDir(), `${taskId}.worktree.sh`);
- }
-
- async register(task: DreamerTask): Promise<void> {
- try {
- const logDir = getLogsDir();
- const logPath = `${logDir}/${task.id}.log`;
-
- // Ensure logs directory exists
- if (!existsSync(logDir)) {
- mkdirSync(logDir, { recursive: true });
- }
-
- // Determine the command to run
- let cronCommand: string;
-
- if (this.usesWorktree(task)) {
- // Generate and write worktree script
- const script = this.generateWorktreeScript(task, logDir);
- if (script) {
- const scriptPath = this.getWorktreeScriptPath(task.id);
- writeFileSync(scriptPath, script, { mode: 0o755 });
- // Shell escape the script path for safe execution
- cronCommand = `bash ${shellEscape(scriptPath)}`;
- } else {
- // Fallback to direct execution with shell-escaped workDir
- const command = this.getExecutionCommand(task);
- const workDir = this.getWorkingDirectory(task);
- cronCommand = `cd ${shellEscape(workDir)} && ${command}`;
- }
- } else {
- // Direct execution with shell-escaped workDir
- const command = this.getExecutionCommand(task);
- const workDir = this.getWorkingDirectory(task);
- cronCommand = `cd ${shellEscape(workDir)} && ${command}`;
- }
-
- // Get current crontab
- const currentCrontab = await this.getCurrentCrontab();
-
- // Remove existing entry for this task
- const lines = currentCrontab
- .split('\n')
- .filter((line) => !line.includes(`${this.MARKER_PREFIX}${task.id}`));
-
- // Add new entry - use shellEscape for log path to prevent injection
- const logPathEsc = shellEscape(logPath);
- const cronLine = `${task.cronExpression} ${cronCommand} >> ${logPathEsc} 2>&1 ${this.MARKER_PREFIX}${task.id}`;
- lines.push(cronLine);
-
- // Update crontab
- const newCrontab = lines.filter((l) => l.trim()).join('\n') + '\n';
- await this.setCrontab(newCrontab);
- } catch (error) {
- throw new SchedulerError(
- `Failed to register task "${task.name}" with crontab`,
- this.platform,
- 'register',
- error as Error
- );
- }
- }
-
- async unregister(taskId: string): Promise<void> {
- try {
- const currentCrontab = await this.getCurrentCrontab();
-
- // Remove entry for this task
- const lines = currentCrontab
- .split('\n')
- .filter((line) => !line.includes(`${this.MARKER_PREFIX}${taskId}`));
-
- // Update crontab
- const newCrontab = lines.filter((l) => l.trim()).join('\n') + '\n';
- await this.setCrontab(newCrontab);
-
- // Remove worktree script if exists
- const scriptPath = this.getWorktreeScriptPath(taskId);
- if (existsSync(scriptPath)) {
- unlinkSync(scriptPath);
- }
- } catch (error) {
- throw new SchedulerError(
- `Failed to unregister task "${taskId}" from crontab`,
- this.platform,
- 'unregister',
- error as Error
- );
- }
- }
-
- async isRegistered(taskId: string): Promise<boolean> {
- const currentCrontab = await this.getCurrentCrontab();
- return currentCrontab.includes(`${this.MARKER_PREFIX}${taskId}`);
- }
-
- async getStatus(): Promise<SchedulerStatus> {
- const tasks = await this.listRegistered();
-
- return {
- healthy: true, // crontab is always healthy if we can read it
- taskCount: tasks.length,
- errors: [],
- platform: this.platform,
- };
- }
-
- async listRegistered(): Promise<string[]> {
- const currentCrontab = await this.getCurrentCrontab();
- const taskIds: string[] = [];
-
- for (const line of currentCrontab.split('\n')) {
- const match = line.match(new RegExp(`${this.MARKER_PREFIX}(.+)$`));
- if (match && match[1]) {
- taskIds.push(match[1].trim());
- }
- }
-
- return taskIds;
- }
-
- /**
- * Get the current user's crontab content
- */
- private async getCurrentCrontab(): Promise<string> {
- try {
- const { stdout } = await execAsync('crontab -l');
- return stdout;
- } catch (error: unknown) {
- // "no crontab for user" is not an error for us
- const err = error as { stderr?: string };
- if (err.stderr?.includes('no crontab')) {
- return '';
- }
- throw error;
- }
- }
-
- /**
- * Set the user's crontab content
- */
- private async setCrontab(content: string): Promise<void> {
- // Write to temp file to avoid shell escaping issues entirely
- const tmpFile = join(homedir(), '.claude', 'matrix', 'dreamer', '.crontab.tmp');
- writeFileSync(tmpFile, content, 'utf-8');
- try {
- await execAsync(`crontab ${shellEscape(tmpFile)}`);
- } finally {
- unlinkSync(tmpFile);
- }
- }
-}
diff --git a/src/dreamer/scheduler/shell.ts b/src/dreamer/scheduler/shell.ts
deleted file mode 100644
index 9c97d2b..0000000
--- a/src/dreamer/scheduler/shell.ts
+++ /dev/null
@@ -1,53 +0,0 @@
-/**
- * Shell Escaping Utilities
- *
- * Provides safe command generation for shell execution.
- */
-
-/**
- * Escape a string for safe embedding in a single-quoted bash string.
- * Single quotes in bash prevent all expansion, making this the safest approach.
- *
- * Examples:
- * shellEscape('foo') -> "'foo'"
- * shellEscape("it's") -> "'it'\\''s'"
- * shellEscape('$HOME') -> "'$HOME'" (no expansion)
- * shellEscape('"; rm -rf /') -> "'\"; rm -rf /'"
- *
- * The technique: wrap in single quotes, escape any internal single quotes
- * by ending the string, adding an escaped quote, and starting a new string.
- * 'it'\''s' = 'it' + \' + 's' = it's
- */
-export function shellEscape(str: string): string {
- return "'" + str.replace(/'/g, "'\\''") + "'";
-}
-
-/**
- * Sanitize a string to contain only safe characters for display in comments.
- * Removes shell metacharacters and control characters while preserving
- * common punctuation used in task names.
- */
-export function sanitizeForComment(str: string): string {
- // First replace newlines with spaces
- let result = str.replace(/\n/g, ' ');
-
- // Remove shell metacharacters: $ ` # \ | & < > and control chars
- // Keep: alphanumeric, spaces, common punctuation like : ; ( ) [ ] { } @ ' " - _ . , ! ?
- result = result.replace(/[$`#\\|&<>]/g, '');
-
- return result;
-}
-
-/**
- * Validate that a string matches a safe identifier pattern
- */
-export function isSafeIdentifier(str: string, pattern: RegExp): boolean {
- return pattern.test(str);
-}
-
-// Validation patterns for git operations
-export const GIT_REF_PATTERN = /^[a-zA-Z0-9/_.-]+$/;
-export const GIT_REMOTE_PATTERN = /^[a-zA-Z0-9_.-]+$/;
-
-// Validation pattern for filesystem paths (conservative)
-export const SAFE_PATH_PATTERN = /^[a-zA-Z0-9/_. ~-]+$/;
diff --git a/src/dreamer/store.ts b/src/dreamer/store.ts
deleted file mode 100644
index 623cce2..0000000
--- a/src/dreamer/store.ts
+++ /dev/null
@@ -1,403 +0,0 @@
-/**
- * Dreamer Database Store
- *
- * Handles all database operations for scheduled tasks and execution records.
- */
-
-import { getDb } from '../db/index.js';
-import type {
- DreamerTask,
- DreamerTaskRow,
- DreamerExecution,
- DreamerExecutionRow,
- ExecutionStatus,
-} from './types.js';
-import { rowToTask, rowToExecution } from './types.js';
-
-// ============================================================================
-// Task Operations
-// ============================================================================
-
-/**
- * Create a new scheduled task
- */
-export function createTask(task: Omit<DreamerTask, 'createdAt' | 'updatedAt'>): DreamerTask {
- const db = getDb();
- const now = new Date().toISOString();
-
- db.run(
- `INSERT INTO dreamer_tasks (
- id, name, description, enabled, cron_expression, timezone,
- command, working_directory, timeout, env, skip_permissions,
- worktree_enabled, worktree_base_path, worktree_branch_prefix, worktree_remote,
- tags, repo_id, created_at, updated_at
- ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
- [
- task.id,
- task.name,
- task.description ?? null,
- task.enabled ? 1 : 0,
- task.cronExpression,
- task.timezone,
- task.command,
- task.workingDirectory,
- task.timeout,
- JSON.stringify(task.env || {}),
- task.skipPermissions ? 1 : 0,
- task.worktreeEnabled ? 1 : 0,
- task.worktreeBasePath ?? null,
- task.worktreeBranchPrefix,
- task.worktreeRemote,
- JSON.stringify(task.tags || []),
- task.repoId ?? null,
- now,
- now,
- ]
- );
-
- return {
- ...task,
- createdAt: now,
- updatedAt: now,
- };
-}
-
-/**
- * Get a task by ID
- */
-export function getTask(taskId: string): DreamerTask | null {
- const db = getDb();
- const row = db.query<DreamerTaskRow, [string]>(
- 'SELECT * FROM dreamer_tasks WHERE id = ?'
- ).get(taskId);
-
- return row ? rowToTask(row) : null;
-}
-
-/**
- * Get all tasks
- */
-type SQLQueryBindings = string | number | boolean | null | Uint8Array;
-
-export function getAllTasks(options?: {
- enabledOnly?: boolean;
- repoId?: string;
- tag?: string;
- limit?: number;
-}): DreamerTask[] {
- const db = getDb();
- const conditions: string[] = [];
- const params: SQLQueryBindings[] = [];
-
- if (options?.enabledOnly) {
- conditions.push('enabled = 1');
- }
-
- if (options?.repoId) {
- conditions.push('repo_id = ?');
- params.push(options.repoId);
- }
-
- if (options?.tag) {
- // JSON array contains tag - escape LIKE wildcards to prevent injection
- const escapedTag = options.tag.replace(/%/g, '\\%').replace(/_/g, '\\_');
- conditions.push("tags LIKE ? ESCAPE '\\'");
- params.push(`%"${escapedTag}"%`);
- }
-
- let sql = 'SELECT * FROM dreamer_tasks';
- if (conditions.length > 0) {
- sql += ' WHERE ' + conditions.join(' AND ');
- }
- sql += ' ORDER BY created_at DESC';
-
- if (options?.limit) {
- sql += ' LIMIT ?';
- params.push(options.limit);
- }
-
- const rows = db.query<DreamerTaskRow, SQLQueryBindings[]>(sql).all(...params);
- return rows.map(rowToTask);
-}
-
-/**
- * Update a task
- */
-export function updateTask(
- taskId: string,
- updates: Partial<Omit<DreamerTask, 'id' | 'createdAt' | 'updatedAt'>>
-): DreamerTask | null {
- const db = getDb();
- const task = getTask(taskId);
- if (!task) return null;
-
- const now = new Date().toISOString();
- const fields: string[] = [];
- const params: SQLQueryBindings[] = [];
-
- if (updates.name !== undefined) {
- fields.push('name = ?');
- params.push(updates.name);
- }
- if (updates.description !== undefined) {
- fields.push('description = ?');
- params.push(updates.description);
- }
- if (updates.enabled !== undefined) {
- fields.push('enabled = ?');
- params.push(updates.enabled ? 1 : 0);
- }
- if (updates.cronExpression !== undefined) {
- fields.push('cron_expression = ?');
- params.push(updates.cronExpression);
- }
- if (updates.timezone !== undefined) {
- fields.push('timezone = ?');
- params.push(updates.timezone);
- }
- if (updates.command !== undefined) {
- fields.push('command = ?');
- params.push(updates.command);
- }
- if (updates.workingDirectory !== undefined) {
- fields.push('working_directory = ?');
- params.push(updates.workingDirectory);
- }
- if (updates.timeout !== undefined) {
- fields.push('timeout = ?');
- params.push(updates.timeout);
- }
- if (updates.env !== undefined) {
- fields.push('env = ?');
- params.push(JSON.stringify(updates.env));
- }
- if (updates.skipPermissions !== undefined) {
- fields.push('skip_permissions = ?');
- params.push(updates.skipPermissions ? 1 : 0);
- }
- if (updates.worktreeEnabled !== undefined) {
- fields.push('worktree_enabled = ?');
- params.push(updates.worktreeEnabled ? 1 : 0);
- }
- if (updates.worktreeBasePath !== undefined) {
- fields.push('worktree_base_path = ?');
- params.push(updates.worktreeBasePath);
- }
- if (updates.worktreeBranchPrefix !== undefined) {
- fields.push('worktree_branch_prefix = ?');
- params.push(updates.worktreeBranchPrefix);
- }
- if (updates.worktreeRemote !== undefined) {
- fields.push('worktree_remote = ?');
- params.push(updates.worktreeRemote);
- }
- if (updates.tags !== undefined) {
- fields.push('tags = ?');
- params.push(JSON.stringify(updates.tags));
- }
- if (updates.repoId !== undefined) {
- fields.push('repo_id = ?');
- params.push(updates.repoId);
- }
-
- if (fields.length === 0) {
- return task;
- }
-
- fields.push('updated_at = ?');
- params.push(now);
- params.push(taskId);
-
- db.run(
- `UPDATE dreamer_tasks SET ${fields.join(', ')} WHERE id = ?`,
- params
- );
-
- return getTask(taskId);
-}
-
-/**
- * Delete a task
- */
-export function deleteTask(taskId: string): boolean {
- const db = getDb();
- const result = db.run('DELETE FROM dreamer_tasks WHERE id = ?', [taskId]);
- return result.changes > 0;
-}
-
-// ============================================================================
-// Execution Operations
-// ============================================================================
-
-/**
- * Create an execution record
- */
-export function createExecution(execution: Omit<DreamerExecution, 'id'>): DreamerExecution {
- const db = getDb();
- const id = crypto.randomUUID();
-
- db.run(
- `INSERT INTO dreamer_executions (
- id, task_id, started_at, completed_at, status, triggered_by,
- duration, exit_code, output_preview, error, task_name,
- project_path, cron_expression, worktree_path, worktree_branch, worktree_pushed
- ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
- [
- id,
- execution.taskId,
- execution.startedAt,
- execution.completedAt ?? null,
- execution.status,
- execution.triggeredBy,
- execution.duration ?? null,
- execution.exitCode ?? null,
- execution.outputPreview ?? null,
- execution.error ?? null,
- execution.taskName,
- execution.projectPath ?? null,
- execution.cronExpression ?? null,
- execution.worktreePath ?? null,
- execution.worktreeBranch ?? null,
- execution.worktreePushed !== undefined ? (execution.worktreePushed ? 1 : 0) : null,
- ]
- );
-
- return { id, ...execution };
-}
-
-/**
- * Update an execution record
- */
-export function updateExecution(
- executionId: string,
-  updates: Partial<Omit<DreamerExecution, 'id' | 'taskId'>>
-): void {
- const db = getDb();
- const fields: string[] = [];
- const params: SQLQueryBindings[] = [];
-
- if (updates.completedAt !== undefined) {
- fields.push('completed_at = ?');
- params.push(updates.completedAt);
- }
- if (updates.status !== undefined) {
- fields.push('status = ?');
- params.push(updates.status);
- }
- if (updates.duration !== undefined) {
- fields.push('duration = ?');
- params.push(updates.duration);
- }
- if (updates.exitCode !== undefined) {
- fields.push('exit_code = ?');
- params.push(updates.exitCode);
- }
- if (updates.outputPreview !== undefined) {
- fields.push('output_preview = ?');
- params.push(updates.outputPreview);
- }
- if (updates.error !== undefined) {
- fields.push('error = ?');
- params.push(updates.error);
- }
- if (updates.worktreePath !== undefined) {
- fields.push('worktree_path = ?');
- params.push(updates.worktreePath);
- }
- if (updates.worktreeBranch !== undefined) {
- fields.push('worktree_branch = ?');
- params.push(updates.worktreeBranch);
- }
- if (updates.worktreePushed !== undefined) {
- fields.push('worktree_pushed = ?');
- params.push(updates.worktreePushed ? 1 : 0);
- }
-
- if (fields.length > 0) {
- params.push(executionId);
- db.run(
- `UPDATE dreamer_executions SET ${fields.join(', ')} WHERE id = ?`,
- params
- );
- }
-}
-
-/**
- * Get executions for a task
- */
-export function getExecutions(taskId: string, limit?: number): DreamerExecution[] {
- const db = getDb();
- let sql = 'SELECT * FROM dreamer_executions WHERE task_id = ? ORDER BY started_at DESC';
- const params: SQLQueryBindings[] = [taskId];
-
- if (limit) {
- sql += ' LIMIT ?';
- params.push(limit);
- }
-
-  const rows = db.query<DreamerExecutionRow, SQLQueryBindings[]>(sql).all(...params);
- return rows.map(rowToExecution);
-}
-
-/**
- * Get all executions (history)
- */
-export function getAllExecutions(options?: {
- status?: ExecutionStatus;
- limit?: number;
-}): DreamerExecution[] {
- const db = getDb();
- const conditions: string[] = [];
- const params: SQLQueryBindings[] = [];
-
- if (options?.status) {
- conditions.push('status = ?');
- params.push(options.status);
- }
-
- let sql = 'SELECT * FROM dreamer_executions';
- if (conditions.length > 0) {
- sql += ' WHERE ' + conditions.join(' AND ');
- }
- sql += ' ORDER BY started_at DESC';
-
- if (options?.limit) {
- sql += ' LIMIT ?';
- params.push(options.limit);
- }
-
-  const rows = db.query<DreamerExecutionRow, SQLQueryBindings[]>(sql).all(...params);
- return rows.map(rowToExecution);
-}
-
-/**
- * Get latest execution for a task
- */
-export function getLatestExecution(taskId: string): DreamerExecution | null {
- const executions = getExecutions(taskId, 1);
- return executions[0] ?? null;
-}
-
-/**
- * Count executions by status for a task
- */
-export function countExecutionsByStatus(taskId: string): Record<ExecutionStatus, number> {
- const db = getDb();
- const rows = db.query<{ status: string; count: number }, [string]>(
- `SELECT status, COUNT(*) as count FROM dreamer_executions WHERE task_id = ? GROUP BY status`
- ).all(taskId);
-
-  const counts: Record<ExecutionStatus, number> = {
- running: 0,
- success: 0,
- failure: 0,
- timeout: 0,
- skipped: 0,
- };
-
- for (const row of rows) {
- counts[row.status as ExecutionStatus] = row.count;
- }
-
- return counts;
-}
diff --git a/src/dreamer/types.ts b/src/dreamer/types.ts
deleted file mode 100644
index 07c1e4e..0000000
--- a/src/dreamer/types.ts
+++ /dev/null
@@ -1,204 +0,0 @@
-/**
- * Dreamer - Scheduled Task Automation Types
- *
- * Internal types for the scheduler system.
- * Public API types are defined in tools/validation.ts
- */
-
-// ============================================================================
-// Task Configuration
-// ============================================================================
-
-/**
- * Worktree configuration for isolated execution
- */
-export interface WorktreeConfig {
- enabled: boolean;
- basePath?: string;
- branchPrefix: string;
- remoteName: string;
-}
-
-/**
- * Scheduled task from database
- */
-export interface DreamerTask {
- id: string;
- name: string;
- description?: string;
- enabled: boolean;
- cronExpression: string;
- timezone: string;
- command: string;
- workingDirectory: string;
- timeout: number;
-  env: Record<string, string>;
- skipPermissions: boolean;
- worktreeEnabled: boolean;
- worktreeBasePath?: string;
- worktreeBranchPrefix: string;
- worktreeRemote: string;
- tags: string[];
- repoId?: string;
- createdAt: string;
- updatedAt: string;
-}
-
-/**
- * Task as stored in SQLite (raw row format)
- */
-export interface DreamerTaskRow {
- id: string;
- name: string;
- description: string | null;
- enabled: number; // SQLite boolean
- cron_expression: string;
- timezone: string;
- command: string;
- working_directory: string;
- timeout: number;
- env: string; // JSON
- skip_permissions: number; // SQLite boolean
- worktree_enabled: number; // SQLite boolean
- worktree_base_path: string | null;
- worktree_branch_prefix: string;
- worktree_remote: string;
- tags: string; // JSON
- repo_id: string | null;
- created_at: string;
- updated_at: string;
-}
-
-// ============================================================================
-// Execution Records
-// ============================================================================
-
-/**
- * Execution status values
- */
-export type ExecutionStatus = 'running' | 'success' | 'failure' | 'timeout' | 'skipped';
-
-/**
- * Execution record from database
- */
-export interface DreamerExecution {
- id: string;
- taskId: string;
- startedAt: string;
- completedAt?: string;
- status: ExecutionStatus;
- triggeredBy: string;
- duration?: number;
- exitCode?: number;
- outputPreview?: string;
- error?: string;
- taskName: string;
- projectPath?: string;
- cronExpression?: string;
- worktreePath?: string;
- worktreeBranch?: string;
- worktreePushed?: boolean;
-}
-
-/**
- * Execution as stored in SQLite (raw row format)
- */
-export interface DreamerExecutionRow {
- id: string;
- task_id: string;
- started_at: string;
- completed_at: string | null;
- status: string;
- triggered_by: string;
- duration: number | null;
- exit_code: number | null;
- output_preview: string | null;
- error: string | null;
- task_name: string;
- project_path: string | null;
- cron_expression: string | null;
- worktree_path: string | null;
- worktree_branch: string | null;
- worktree_pushed: number | null; // SQLite boolean
-}
-
-// ============================================================================
-// Scheduler Status
-// ============================================================================
-
-/**
- * Status of the native OS scheduler
- */
-export interface SchedulerStatus {
- healthy: boolean;
- taskCount: number;
- errors: string[];
- platform: 'darwin' | 'linux';
-}
-
-// ============================================================================
-// Conversion Helpers
-// ============================================================================
-
-function safeJsonParse<T>(json: string | null | undefined, fallback: T): T {
- try { return json ? JSON.parse(json) : fallback; } catch { return fallback; }
-}
-
-/**
- * Convert database row to DreamerTask
- */
-export function rowToTask(row: DreamerTaskRow): DreamerTask {
- return {
- id: row.id,
- name: row.name,
- description: row.description ?? undefined,
- enabled: row.enabled === 1,
- cronExpression: row.cron_expression,
- timezone: row.timezone,
- command: row.command,
- workingDirectory: row.working_directory,
- timeout: row.timeout,
- env: safeJsonParse(row.env, {}),
- skipPermissions: row.skip_permissions === 1,
- worktreeEnabled: row.worktree_enabled === 1,
- worktreeBasePath: row.worktree_base_path ?? undefined,
- worktreeBranchPrefix: row.worktree_branch_prefix,
- worktreeRemote: row.worktree_remote,
- tags: safeJsonParse(row.tags, []),
- repoId: row.repo_id ?? undefined,
- createdAt: row.created_at,
- updatedAt: row.updated_at,
- };
-}
-
-// Valid execution status values for runtime validation
-const VALID_STATUSES: ExecutionStatus[] = ['running', 'success', 'failure', 'timeout', 'skipped'];
-
-/**
- * Convert database row to DreamerExecution
- */
-export function rowToExecution(row: DreamerExecutionRow): DreamerExecution {
- // Validate status at runtime - fallback to 'failure' if invalid
- const status: ExecutionStatus = VALID_STATUSES.includes(row.status as ExecutionStatus)
- ? (row.status as ExecutionStatus)
- : 'failure';
-
- return {
- id: row.id,
- taskId: row.task_id,
- startedAt: row.started_at,
- completedAt: row.completed_at ?? undefined,
- status,
- triggeredBy: row.triggered_by,
- duration: row.duration ?? undefined,
- exitCode: row.exit_code ?? undefined,
- outputPreview: row.output_preview ?? undefined,
- error: row.error ?? undefined,
- taskName: row.task_name,
- projectPath: row.project_path ?? undefined,
- cronExpression: row.cron_expression ?? undefined,
- worktreePath: row.worktree_path ?? undefined,
- worktreeBranch: row.worktree_branch ?? undefined,
- worktreePushed: row.worktree_pushed === 1 ? true : row.worktree_pushed === 0 ? false : undefined,
- };
-}
diff --git a/src/hooks/index.ts b/src/hooks/index.ts
index cbfeed9..7beb60d 100644
--- a/src/hooks/index.ts
+++ b/src/hooks/index.ts
@@ -25,17 +25,6 @@ export interface PreToolUseInput extends HookInput {
}
-/**
- * SubagentStart hook input (Claude Code 2.0.43+)
- * Fires when a subagent (Explore, Plan, etc.) starts
- */
-export interface SubagentStartInput extends HookInput {
- agent_id: string;
- agent_type: string;
- hook_event_name: 'SubagentStart';
-}
-
-
/**
* PermissionRequest decision structure
*/
diff --git a/src/hooks/permission-request.ts b/src/hooks/permission-request.ts
index cdf025d..db86eb3 100644
--- a/src/hooks/permission-request.ts
+++ b/src/hooks/permission-request.ts
@@ -56,7 +56,6 @@ const MATRIX_READ_TOOLS = new Set([
'mcp__plugin_matrix_matrix__matrix_index_status',
'mcp__plugin_matrix_matrix__matrix_reindex',
// Utility (read-only)
- 'mcp__plugin_matrix_matrix__matrix_prompt',
'mcp__plugin_matrix_matrix__matrix_doctor',
]);
diff --git a/src/hooks/post-tool-bash.ts b/src/hooks/post-tool-bash.ts
deleted file mode 100644
index 72e1f3e..0000000
--- a/src/hooks/post-tool-bash.ts
+++ /dev/null
@@ -1,101 +0,0 @@
-#!/usr/bin/env bun
-/**
- * PostToolUse:Bash Hook (Dependency Logger)
- *
- * Runs after Bash tool completes.
- * Logs successful package installations to the database for audit.
- *
- * Exit codes:
- * 0 = Success
- * 1 = Non-blocking error
- */
-
-import {
- readStdin,
- hooksEnabled,
- parsePackageCommand,
- type HookInput,
-} from './index.js';
-import { getDb } from '../db/client.js';
-import { fingerprintRepo, getOrCreateRepo } from '../repo/index.js';
-
-interface PostToolBashInput extends HookInput {
- tool_name: string;
-  tool_input: Record<string, unknown>;
- tool_response: {
- stdout?: string;
- stderr?: string;
- exitCode?: number;
- };
-}
-
-export async function run() {
- try {
- // Check if hooks are enabled
- if (!hooksEnabled()) {
- process.exit(0);
- }
-
- // Read input from stdin
-  const input = await readStdin<PostToolBashInput>();
-
- // Only log successful executions
- const exitCode = input.tool_response?.exitCode;
- if (exitCode !== 0) {
- process.exit(0);
- }
-
- // Get command from tool input
- const command = input.tool_input.command as string | undefined;
- if (!command) {
- process.exit(0);
- }
-
- // Parse package command
- const parsed = parsePackageCommand(command);
- if (!parsed || parsed.packages.length === 0) {
- // Not a package install command
- process.exit(0);
- }
-
- // Get repo context
- const detected = fingerprintRepo(input.cwd);
- const repoId = await getOrCreateRepo(detected);
-
- // Log each package to the database
- const db = getDb();
-
- for (const packageName of parsed.packages) {
- // Extract version if present (e.g., "lodash@4.17.21" -> "4.17.21")
- let name = packageName;
- let version: string | null = null;
-
- const versionMatch = packageName.match(/^(.+)@([\d.]+.*)$/);
- if (versionMatch) {
- name = versionMatch[1]!;
- version = versionMatch[2]!;
- }
-
- db.query(`
- INSERT INTO dependency_installs
- (package_name, package_version, ecosystem, repo_id, command, session_id)
- VALUES (?, ?, ?, ?, ?, ?)
- `).run(
- name,
- version,
- parsed.ecosystem,
- repoId,
- command,
- input.session_id
- );
- }
-
- process.exit(0);
- } catch (err) {
- // Log error but don't block
- console.error(`[Matrix] Logger hook error: ${err instanceof Error ? err.message : err}`);
- process.exit(1);
- }
-}
-
-if (import.meta.main) run();
diff --git a/src/hooks/prompt-utils.ts b/src/hooks/prompt-utils.ts
deleted file mode 100644
index 062480d..0000000
--- a/src/hooks/prompt-utils.ts
+++ /dev/null
@@ -1,439 +0,0 @@
-/**
- * Prompt Analysis Utilities
- *
- * Reusable functions for prompt analysis, extracted from the Prompt Agent.
- * Used by both UserPromptSubmit hook (silent mode) and matrixPrompt tool (interactive).
- */
-
-import { existsSync, readFileSync } from 'fs';
-import { join } from 'path';
-import {
- formatGitContext,
- formatPromptContext,
- assembleContext,
- getVerbosity,
- type GitContextData,
-} from './format-helpers.js';
-
-// Shortcut patterns for quick responses
-export const SHORTCUTS: Record<string, { action: Shortcut['action']; label: string }> = {
- 'ship it': { action: 'execute', label: 'Execute with best interpretation' },
- 'just do it': { action: 'execute', label: 'Execute with best interpretation' },
- 'yolo': { action: 'execute', label: 'Execute with best interpretation' },
- 'go ahead': { action: 'execute', label: 'Execute with best interpretation' },
- 'proceed': { action: 'execute', label: 'Execute with best interpretation' },
- 'nah': { action: 'abort', label: 'Abort, user will rephrase' },
- 'nope': { action: 'abort', label: 'Abort, user will rephrase' },
- 'abort': { action: 'abort', label: 'Abort, user will rephrase' },
- 'cancel': { action: 'abort', label: 'Abort, user will rephrase' },
- 'expand': { action: 'expand', label: 'Show more granular options' },
- 'more options': { action: 'expand', label: 'Show more granular options' },
- 'hierarchize': { action: 'hierarchize', label: 'Create subtask plan' },
- 'break it down': { action: 'hierarchize', label: 'Create subtask plan' },
- 'skip': { action: 'skip', label: 'Skip question, use judgment' },
-};
-
-// Ambiguity patterns
-export const AMBIGUITY_PATTERNS = {
- scope: {
- patterns: [
- /\b(this|that|it|the)\s+(code|function|file|module|component)\b/i,
- /\b(fix|update|change|refactor|improve)\s+(it|this|that)\b/i,
- /\bthe\s+(bug|issue|problem|error)\b/i,
- ],
- question: 'Which specific file/component are you referring to?',
- type: 'scope' as const,
- },
- target: {
- patterns: [
- /\b(something|somewhere|somehow)\b/i,
- /\b(some|few|couple)\s+(of\s+)?(files?|functions?|components?)\b/i,
- /\b(here|there)\b/i,
- ],
- question: 'Can you specify the exact location or target?',
- type: 'target' as const,
- },
- approach: {
- patterns: [
- /\b(better|improve|optimize|enhance|make\s+it\s+(good|nice|clean))\b/i,
- /\b(properly|correctly|right\s+way)\b/i,
- /\b(best\s+practice|standard|convention)\b/i,
- ],
- question: 'What specific improvement are you looking for?',
- type: 'approach' as const,
- },
- action: {
- patterns: [
- /^(fix|handle|deal\s+with)\s+(the\s+)?(auth|error|bug|issue)$/i,
- /^(add|implement|create)\s+(a\s+)?(feature|functionality)$/i,
- ],
- question: 'What exactly should be fixed/implemented?',
- type: 'action' as const,
- },
-};
-
-export interface Shortcut {
- trigger: string;
- action: 'execute' | 'abort' | 'expand' | 'hierarchize' | 'skip';
-}
-
-export interface AmbiguityResult {
- type: 'scope' | 'target' | 'approach' | 'action';
- question: string;
-}
-
-export interface Assumption {
- category: string;
- assumption: string;
- confidence: number;
-}
-
-export interface PromptContext {
- claudeMd: string[];
- git: string[];
- memory: string[];
-}
-
-export interface SilentAnalysisResult {
- shortcut: Shortcut | null;
- ambiguity: AmbiguityResult | null;
- confidence: number;
- assumptions: Assumption[];
- contextInjected: string[];
-}
-
-/**
- * Detect shortcut in prompt
- */
-export function detectShortcut(prompt: string): Shortcut | null {
- const normalized = prompt.toLowerCase().trim();
-
- for (const [trigger, info] of Object.entries(SHORTCUTS)) {
- if (normalized === trigger || normalized.startsWith(`${trigger} `) || normalized.endsWith(` ${trigger}`)) {
- return { trigger, action: info.action };
- }
- }
-
- return null;
-}
-
-/**
- * Analyze prompt for ambiguity
- */
-export function analyzeAmbiguity(prompt: string): AmbiguityResult | null {
- const ambiguities: { type: AmbiguityResult['type']; question: string; priority: number }[] = [];
-
- for (const [_key, config] of Object.entries(AMBIGUITY_PATTERNS)) {
- for (const pattern of config.patterns) {
- if (pattern.test(prompt)) {
- ambiguities.push({
- type: config.type,
- question: config.question,
- priority: config.patterns.length,
- });
- break;
- }
- }
- }
-
- if (ambiguities.length === 0) {
- return null;
- }
-
- // Return highest priority ambiguity
- ambiguities.sort((a, b) => b.priority - a.priority);
- const top = ambiguities[0];
- if (!top) {
- return null;
- }
-
- return {
- question: top.question,
- type: top.type,
- };
-}
-
-/**
- * Extract relevant sections from CLAUDE.md
- */
-function extractRelevantSections(content: string): string | null {
- const relevantKeywords = [
- 'preference',
- 'style',
- 'convention',
- 'pattern',
- 'avoid',
- 'always',
- 'never',
- 'use',
- 'don\'t',
- ];
-
- const lines = content.split('\n');
- const relevant: string[] = [];
-
- for (const line of lines) {
- const lower = line.toLowerCase();
- if (relevantKeywords.some(kw => lower.includes(kw))) {
- relevant.push(line.trim());
- }
- }
-
- return relevant.length > 0 ? relevant.slice(0, 10).join('; ') : null;
-}
-
-/**
- * Load CLAUDE.md files (project + global)
- */
-export function loadClaudeMdContext(cwd?: string): string[] {
- const contexts: string[] = [];
- const workingDir = cwd || process.cwd();
-
- // Project-level CLAUDE.md
- const projectClaudeMd = join(workingDir, 'CLAUDE.md');
- if (existsSync(projectClaudeMd)) {
- try {
- const content = readFileSync(projectClaudeMd, 'utf-8');
- const relevantSections = extractRelevantSections(content);
- if (relevantSections) {
- contexts.push(`[Project CLAUDE.md] ${relevantSections}`);
- }
- } catch {
- // Ignore read errors
- }
- }
-
- // Global CLAUDE.md
- const home = process.env.HOME || process.env.USERPROFILE || '';
- const globalClaudeMd = join(home, '.claude', 'CLAUDE.md');
- if (existsSync(globalClaudeMd)) {
- try {
- const content = readFileSync(globalClaudeMd, 'utf-8');
- const relevantSections = extractRelevantSections(content);
- if (relevantSections) {
- contexts.push(`[Global CLAUDE.md] ${relevantSections}`);
- }
- } catch {
- // Ignore read errors
- }
- }
-
- return contexts;
-}
-
-/**
- * Run a git command with proper stream cleanup
- * Ensures no resource leaks even on error/timeout
- */
-async function runGitCommand(args: string[], cwd: string): Promise<string> {
- const proc = Bun.spawn(['git', ...args], {
- cwd,
- stdout: 'pipe',
- stderr: 'pipe',
- });
-
- try {
- const output = await new Response(proc.stdout).text();
- await proc.exited; // Wait for process to fully exit
- return output.trim();
- } finally {
- // Ensure streams are closed even on error
- try {
- proc.stdout.cancel();
- } catch {
- /* already closed */
- }
- try {
- proc.stderr.cancel();
- } catch {
- /* already closed */
- }
- try {
- proc.kill();
- } catch {
- /* already dead */
- }
- }
-}
-
-/**
- * Get git context as structured data (v2.0)
- * Returns raw data for verbosity-aware formatting
- */
-export async function getGitContextData(cwd?: string): Promise<GitContextData> {
- const workingDir = cwd || process.cwd();
- const result: GitContextData = {
- branch: null,
- commits: [],
- changedFiles: [],
- };
-
- try {
- // Get current branch
- const branchOutput = await runGitCommand(['branch', '--show-current'], workingDir);
- result.branch = branchOutput || null;
-
- // Get recent commit messages (last 3)
- const logOutput = await runGitCommand(['log', '--oneline', '-3'], workingDir);
- if (logOutput) {
- result.commits = logOutput.split('\n');
- }
-
- // Get changed files (staged + unstaged)
- const statusOutput = await runGitCommand(['status', '--short'], workingDir);
- if (statusOutput) {
- result.changedFiles = statusOutput.split('\n').slice(0, 10);
- }
- } catch {
- // Not a git repo or git not available
- }
-
- return result;
-}
-
-/**
- * Get git context (backward compatible wrapper)
- * @deprecated Use getGitContextData() + formatGitContext() instead
- */
-export async function getGitContext(cwd?: string): Promise<string[]> {
- const data = await getGitContextData(cwd);
- const formatted = formatGitContext(data, 'full');
- return formatted ? formatted.split('\n') : [];
-}
-
-/**
- * Generate assumptions based on context
- */
-export function generateAssumptions(
- prompt: string,
- claudeMdContext: string[],
- gitContext: string[]
-): Assumption[] {
- const assumptions: Assumption[] = [];
-
- // Infer from git branch
- const branchContext = gitContext.find(c => c.includes('[Git Branch]'));
- if (branchContext) {
- const branch = branchContext.replace('[Git Branch] ', '');
- if (branch.includes('feature/')) {
- assumptions.push({
- category: 'scope',
- assumption: `Working on feature: ${branch.replace('feature/', '')}`,
- confidence: 0.8,
- });
- } else if (branch.includes('fix/') || branch.includes('bugfix/')) {
- assumptions.push({
- category: 'task',
- assumption: 'This is a bug fix task',
- confidence: 0.9,
- });
- }
- }
-
- // Infer from changed files
- const changedContext = gitContext.find(c => c.includes('[Changed Files]'));
- if (changedContext && prompt.toLowerCase().includes('continue')) {
- assumptions.push({
- category: 'scope',
- assumption: `Continue work on recently changed files`,
- confidence: 0.7,
- });
- }
-
- // Infer from CLAUDE.md preferences
- if (claudeMdContext.length > 0) {
- assumptions.push({
- category: 'style',
- assumption: 'Following project/personal conventions from CLAUDE.md',
- confidence: 0.95,
- });
- }
-
- return assumptions;
-}
-
-/**
- * Calculate confidence score
- */
-export function calculateConfidence(
- prompt: string,
- ambiguity: AmbiguityResult | null,
- assumptions: Assumption[],
- memoryContext: string[]
-): number {
- let confidence = 70; // Base confidence
-
- // Reduce for ambiguity
- if (ambiguity) {
- confidence -= 20;
- }
-
- // Reduce for very short prompts
- const wordCount = prompt.split(/\s+/).length;
- if (wordCount < 5) {
- confidence -= 15;
- } else if (wordCount > 20) {
- confidence += 10;
- }
-
- // Boost for having relevant memory
- if (memoryContext.length > 0) {
- confidence += 10;
- }
-
- // Boost for high-confidence assumptions
- const highConfidenceAssumptions = assumptions.filter(a => a.confidence > 0.8);
- confidence += highConfidenceAssumptions.length * 5;
-
- // Reduce for vague action words
- if (/\b(somehow|something|maybe|possibly|perhaps)\b/i.test(prompt)) {
- confidence -= 10;
- }
-
- // Boost for specific file references
- if (/\.(ts|js|py|go|rs|java|tsx|jsx|json|yaml|yml|md)\b/.test(prompt)) {
- confidence += 10;
- }
-
- return Math.max(10, Math.min(100, confidence));
-}
-
-/**
- * Run silent prompt analysis (non-interactive, for hooks)
- *
- * Returns analysis without blocking or asking questions.
- * Used by UserPromptSubmit hook for shortcut/ambiguity detection.
- *
- * Note: Git context and CLAUDE.md parsing removed — Claude Code already
- * provides both natively. This eliminates 3 subprocess spawns + file I/O
- * per prompt.
- */
-export async function analyzePromptSilent(
- prompt: string,
- _cwd?: string
-): Promise<SilentAnalysisResult> {
- // Check for shortcuts first
- const shortcut = detectShortcut(prompt);
-
- // If abort shortcut, return early
- if (shortcut?.action === 'abort') {
- return {
- shortcut,
- ambiguity: null,
- confidence: 0,
- assumptions: [],
- contextInjected: [],
- };
- }
-
- // Analyze ambiguity (for warning, not blocking)
- const ambiguity = analyzeAmbiguity(prompt);
-
- return {
- shortcut,
- ambiguity,
- confidence: 70, // Base confidence — detailed scoring moved to matrix_prompt tool
- assumptions: [],
- contextInjected: [],
- };
-}
diff --git a/src/hooks/session-start.ts b/src/hooks/session-start.ts
index f37a2cd..acafd07 100644
--- a/src/hooks/session-start.ts
+++ b/src/hooks/session-start.ts
@@ -25,7 +25,6 @@ import { runMigrations } from '../db/migrate.js';
const CURRENT_VERSION = '1.0.4';
-const CLAUDE_DIR = join(homedir(), '.claude');
// File suggestion installation moved to utils/file-suggestion.ts
@@ -40,7 +39,6 @@ function ensureConfigComplete(): void {
// Check for v2.0+ required sections
const missingSections: string[] = [];
- if (!config.hooks?.promptAnalysis?.memoryInjection) missingSections.push('memoryInjection');
if (!config.hooks?.permissions) missingSections.push('permissions');
if (!config.hooks?.userRules) missingSections.push('userRules');
if (!config.hooks?.gitCommitReview) missingSections.push('gitCommitReview');
diff --git a/src/hooks/subagent-start.ts b/src/hooks/subagent-start.ts
deleted file mode 100644
index 9569031..0000000
--- a/src/hooks/subagent-start.ts
+++ /dev/null
@@ -1,138 +0,0 @@
-#!/usr/bin/env bun
-/**
- * SubagentStart Hook
- *
- * Runs when a subagent (Explore, Plan, etc.) starts.
- * Injects Matrix-specific guidance:
- * - Prefer Matrix index tools over Grep for code search
- * - Prefer Context7 over WebSearch for library docs
- *
- * Exit codes:
- * 0 = Success (output used by Claude Code)
- * 1 = Non-blocking error (stderr shown to user)
- */
-
-import {
- readStdin,
- outputJson,
- hooksEnabled,
- type SubagentStartInput,
- type HookOutput,
-} from './index.js';
-import { getConfig } from '../config/index.js';
-import { getVerbosity } from './format-helpers.js';
-
-/**
- * Build guidance context for subagents based on config
- */
-function buildSubagentGuidance(agentType: string): string[] {
- const config = getConfig();
- const toolSearch = config.toolSearch;
- const verbosity = getVerbosity();
-
- const guidance: string[] = [];
-
- // Matrix Index preference
- if (toolSearch.preferMatrixIndex) {
- if (verbosity === 'compact') {
- guidance.push(
- '[Matrix] Prefer matrix_find_definition, matrix_search_symbols over Grep for code search'
- );
- } else {
- guidance.push(
- '[Matrix Guidance] For code navigation and symbol search, prefer these Matrix tools:',
- ' - matrix_find_definition: Find where a symbol is defined',
- ' - matrix_search_symbols: Search for symbols by name pattern',
- ' - matrix_list_exports: List exported symbols from a file',
- ' - matrix_get_imports: Get imports for a file',
- 'These are faster and more accurate than Grep for code structure queries.'
- );
- }
- }
-
- // Context7 preference
- if (toolSearch.preferContext7) {
- if (verbosity === 'compact') {
- guidance.push(
- '[Matrix] Prefer Context7 (resolve-library-id + query-docs) over WebSearch for library docs'
- );
- } else {
- guidance.push(
- '[Matrix Guidance] For library documentation lookup, prefer Context7 tools:',
- ' 1. resolve-library-id: Get the Context7 library ID for a package',
- ' 2. query-docs: Query documentation with the resolved library ID',
- 'Context7 provides accurate, up-to-date documentation without web search noise.'
- );
- }
- }
-
- // Agent-specific guidance for explore/plan agents
- const agentTypeLower = (agentType || '').toLowerCase();
- if (agentTypeLower === 'explore' || agentTypeLower === 'plan') {
- if (verbosity === 'full') {
- guidance.push(
- `[Matrix Guidance for ${agentType} agent]`,
- 'Use matrix_recall to check for existing solutions before implementing new ones.',
- 'Use matrix_index_status to check if code index is available for this repository.'
- );
- } else {
- guidance.push('[Matrix] Check matrix_recall for existing solutions');
- }
- }
-
- return guidance;
-}
-
-export async function run() {
- try {
- // Check if hooks are enabled
- if (!hooksEnabled()) {
- process.exit(0);
- }
-
- // Read input from stdin
-    const input = await readStdin<SubagentStartInput>();
-
- // Get config
- const config = getConfig();
- const toolSearch = config.toolSearch;
-
- // Skip if tool search preferences are all disabled
- if (!toolSearch.preferMatrixIndex && !toolSearch.preferContext7) {
- process.exit(0);
- }
-
- // Build guidance
- const guidance = buildSubagentGuidance(input.agent_type);
-
- if (guidance.length === 0) {
- process.exit(0);
- }
-
- // Format guidance as context
- const additionalContext = guidance.join('\n');
-
- // Build output
- const output: HookOutput = {
- hookSpecificOutput: {
- additionalContext,
- },
- };
-
- // Optional: show terminal message in verbose mode
- if (toolSearch.verbose) {
- output.systemMessage = `[Matrix] Injected guidance for ${input.agent_type || 'unknown'} subagent`;
- }
-
- outputJson(output);
- process.exit(0);
- } catch (err) {
- // Log error but don't block subagent
- console.error(
- `[Matrix] SubagentStart hook error: ${err instanceof Error ? err.message : err}`
- );
- process.exit(1);
- }
-}
-
-if (import.meta.main) run();
diff --git a/src/hooks/task-completed.ts b/src/hooks/task-completed.ts
deleted file mode 100644
index bbfb064..0000000
--- a/src/hooks/task-completed.ts
+++ /dev/null
@@ -1,78 +0,0 @@
-#!/usr/bin/env bun
-/**
- * TaskCompleted Hook
- *
- * Runs when a Claude Code task completes (Agent Teams, v2.1.32+).
- * Suggests storing the solution in Matrix memory if the task was
- * significant enough (high token/tool usage).
- *
- * Exit codes:
- * 0 = Success
- * 1 = Non-blocking error
- */
-
-import {
- readStdin,
- outputJson,
- hooksEnabled,
- log,
- type HookInput,
- type HookOutput,
-} from './index.js';
-import { getConfig } from '../config/index.js';
-
-export interface TaskCompletedInput extends HookInput {
- task_id?: string;
- task_subject?: string;
- task_description?: string;
- total_tokens?: number;
- tool_uses?: number;
- duration_ms?: number;
-}
-
-/**
- * Check if a completed task is significant enough to suggest storing
- */
-function isSignificantTask(input: TaskCompletedInput): boolean {
- // At least 5 tool uses or 20k tokens suggests meaningful work
- if ((input.tool_uses ?? 0) >= 5) return true;
- if ((input.total_tokens ?? 0) >= 20000) return true;
- return false;
-}
-
-export async function run() {
- try {
- if (!hooksEnabled()) {
- process.exit(0);
- }
-
- const input = await readStdin();
- const config = getConfig();
-
- // Skip if store suggestions are disabled
- if (!config.hooks.stop?.suggestStore?.enabled) {
- process.exit(0);
- }
-
- if (!isSignificantTask(input)) {
- process.exit(0);
- }
-
- // Suggest storing the solution
- const subject = input.task_subject || 'Unknown task';
- const tokens = input.total_tokens ? `${Math.round(input.total_tokens / 1000)}k tokens` : '';
- const tools = input.tool_uses ? `${input.tool_uses} tool uses` : '';
- const stats = [tokens, tools].filter(Boolean).join(', ');
-
- log(`[Matrix] Significant task completed: "${subject}" (${stats}). Consider using matrix_store to save this solution.`);
-
- const output: HookOutput = {};
- outputJson(output);
- process.exit(0);
- } catch (err) {
- console.error(`[Matrix] TaskCompleted hook error: ${err instanceof Error ? err.message : err}`);
- process.exit(1);
- }
-}
-
-if (import.meta.main) run();
diff --git a/src/hooks/unified-entry.ts b/src/hooks/unified-entry.ts
index 50a9fb2..5fdde92 100644
--- a/src/hooks/unified-entry.ts
+++ b/src/hooks/unified-entry.ts
@@ -3,22 +3,16 @@ import { run as sessionStart } from './session-start.js';
import { run as permissionRequest } from './permission-request.js';
import { run as preToolBash } from './pre-tool-bash.js';
import { run as preToolRead } from './pre-tool-read.js';
-import { run as postToolBash } from './post-tool-bash.js';
import { run as preToolEdit } from './pre-tool-edit.js';
import { run as preToolWeb } from './pre-tool-web.js';
-import { run as subagentStart } from './subagent-start.js';
-import { run as taskCompleted } from './task-completed.js';
const hooks: Record<string, () => Promise<void>> = {
'session-start': sessionStart,
'permission-request': permissionRequest,
'pre-tool-bash': preToolBash,
'pre-tool-read': preToolRead,
- 'post-tool-bash': postToolBash,
'pre-tool-edit': preToolEdit,
'pre-tool-web': preToolWeb,
- 'subagent-start': subagentStart,
- 'task-completed': taskCompleted,
};
const hookType = process.argv[2];
diff --git a/src/http/dashboard.ts b/src/http/dashboard.ts
deleted file mode 100644
index 6173e9c..0000000
--- a/src/http/dashboard.ts
+++ /dev/null
@@ -1,738 +0,0 @@
-/**
- * Matrix Dashboard HTML — v3
- * Design system mirrors claude-website: #000 bg, #22c55e accent, Inter + JetBrains Mono
- */
-
-export function getDashboardHtml(): string {
-  return `<!-- Matrix Dashboard (markup elided) -->
-<!-- Panels: Solutions (Problem | Category | Scope | Score | Uses | Created),
-     Failures (Type | Message | Root Cause | Count | Created),
-     Warnings (Type | Target | Reason | Severity | Ecosystem | Created),
-     Repos (Name | Path | Languages | Files | Symbols),
-     Jobs (ID | Tool | Status | Progress | Created | Completed),
-     Scheduled Tasks (Name | Schedule | Command | Enabled | Created),
-     Recent Executions (Task | Status | Duration | Started | Error) -->
-`;
-}
diff --git a/src/http/index.ts b/src/http/index.ts
deleted file mode 100644
index b6f6a5a..0000000
--- a/src/http/index.ts
+++ /dev/null
@@ -1 +0,0 @@
-export { startDashboard } from './server.js';
diff --git a/src/http/server.ts b/src/http/server.ts
deleted file mode 100644
index 3ca6d76..0000000
--- a/src/http/server.ts
+++ /dev/null
@@ -1,288 +0,0 @@
-/**
- * Matrix Local Dashboard HTTP Server
- *
- * Runs alongside the MCP stdio transport on 127.0.0.1:4444 (configurable).
- * Serves the single-file dashboard HTML and a minimal REST API backed
- * directly by the shared SQLite database.
- */
-
-import { getDb } from '../db/index.js';
-import { getConfig, saveConfig } from '../config/index.js';
-import { cancelJob } from '../jobs/manager.js';
-import { matrixReindex } from '../tools/index-tools.js';
-import { getDashboardHtml } from './dashboard.js';
-
-const CORS = {
- 'Access-Control-Allow-Origin': '*',
- 'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
- 'Access-Control-Allow-Headers': 'Content-Type',
-} as const;
-
-/**
- * Start the local HTTP dashboard server.
- * Returns null if dashboard.enabled is false in config.
- */
-export function startDashboard(): ReturnType<typeof Bun.serve> | null {
- const config = getConfig();
- if (!config.dashboard?.enabled) return null;
-
- const host = config.dashboard.host;
- const preferredPort = config.dashboard.port;
-
- // Try preferred port first, then fall back to OS-assigned port
- for (const tryPort of [preferredPort, 0]) {
- try {
- const server = Bun.serve({
- port: tryPort,
- hostname: host,
- fetch: handleRequest,
- error(err: Error) {
- console.error('[Dashboard] HTTP error:', err.message);
- return new Response('Internal Server Error', { status: 500 });
- },
- });
-
- console.error(`[Matrix] Dashboard → http://${host}:${server.port}`);
- return server;
- } catch {
- if (tryPort !== 0) {
- console.error(`[Matrix] Port ${tryPort} in use, trying random port…`);
- }
- }
- }
-
- console.error('[Matrix] Dashboard failed to start — no available port');
- return null;
-}
-
-// ── Request dispatcher ──────────────────────────────────────────────────
-
-async function handleRequest(req: Request): Promise<Response> {
- const url = new URL(req.url);
-
- if (req.method === 'OPTIONS') {
- return new Response(null, { status: 204, headers: CORS });
- }
-
- try {
- const res = await route(req.method, url.pathname, url.searchParams, req);
- // Attach CORS to every response
- for (const [k, v] of Object.entries(CORS)) res.headers.set(k, v);
- return res;
- } catch (err) {
- const msg = err instanceof Error ? err.message : String(err);
- console.error('[Dashboard] Route error:', msg);
- return json({ error: msg }, 500);
- }
-}
-
-async function route(
- method: string,
- path: string,
- params: URLSearchParams,
- req: Request,
-): Promise<Response> {
- // ── Static dashboard ─────────────────────────────────────────────────
- if (method === 'GET' && path === '/') {
- return new Response(getDashboardHtml(), {
- headers: { 'Content-Type': 'text/html; charset=utf-8' },
- });
- }
-
- // ── Stats ─────────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/stats') return json(getStats());
-
- // ── Solutions ─────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/solutions') return json(getSolutions(params));
- if (method === 'DELETE' && path.startsWith('/api/solutions/')) {
- return json(deleteRow('solutions', seg(path, 3)));
- }
-
- // ── Failures ──────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/failures') return json(getFailures());
- if (method === 'DELETE' && path.startsWith('/api/failures/')) {
- return json(deleteRow('failures', seg(path, 3)));
- }
-
- // ── Warnings ──────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/warnings') return json(getWarnings());
- if (method === 'DELETE' && path.startsWith('/api/warnings/')) {
- return json(deleteRow('warnings', seg(path, 3)));
- }
-
- // ── Repos / Index ─────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/repos') return json(getRepos());
- if (method === 'POST' && path.startsWith('/api/repos/') && path.endsWith('/reindex')) {
- const repoPath = decodeURIComponent(seg(path, 3));
- return json(await triggerReindex(repoPath));
- }
-
- // ── Background Jobs ───────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/jobs') return json(getJobs());
- if (method === 'DELETE' && path.startsWith('/api/jobs/')) {
- const jobId = seg(path, 3);
- const cancelled = cancelJob(jobId);
- return json({ cancelled, jobId });
- }
-
- // ── Dreamer ───────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/dreamer/tasks') return json(getDreamerTasks());
- if (method === 'DELETE' && path.startsWith('/api/dreamer/tasks/')) {
- return json(deleteRow('dreamer_tasks', seg(path, 4)));
- }
- if (method === 'GET' && path === '/api/dreamer/executions') return json(getDreamerExecutions());
-
- // ── Config ────────────────────────────────────────────────────────────
- if (method === 'GET' && path === '/api/config') return json(getConfig());
- if (method === 'PUT' && path === '/api/config') {
-    const body = await req.json() as Parameters<typeof saveConfig>[0];
- saveConfig(body);
- return json({ ok: true });
- }
-
- return new Response('Not Found', { status: 404 });
-}
-
-// ── Response helpers ────────────────────────────────────────────────────
-
-function json(data: unknown, status = 200): Response {
- return new Response(JSON.stringify(data), {
- status,
- headers: { 'Content-Type': 'application/json' },
- });
-}
-
-/** Extract path segment at index (0-based after splitting by '/') */
-function seg(path: string, idx: number): string {
- return path.split('/')[idx] ?? '';
-}
-
-// ── Data access functions ───────────────────────────────────────────────
-
-function getStats() {
- const db = getDb();
- const totals = db.query(`
- SELECT
- (SELECT COUNT(*) FROM solutions) AS total_solutions,
- (SELECT COUNT(*) FROM failures) AS total_failures,
- (SELECT COUNT(*) FROM warnings) AS total_warnings,
- (SELECT COUNT(*) FROM repos) AS total_repos
-  `).get() as Record<string, number>;
-
- const byCategory = db.query(`
- SELECT category, COUNT(*) AS count FROM solutions
- WHERE category IS NOT NULL GROUP BY category ORDER BY count DESC
- `).all();
-
- const byScope = db.query(`
- SELECT scope, COUNT(*) AS count FROM solutions GROUP BY scope ORDER BY count DESC
- `).all();
-
- // Map numeric buckets to human labels
- const rawBuckets = db.query(`
- SELECT CAST(FLOOR(score / 0.2) AS INTEGER) AS bucket, COUNT(*) AS count
- FROM solutions GROUP BY bucket ORDER BY bucket
- `).all() as Array<{ bucket: number; count: number }>;
- const bucketLabels = ['0–0.2', '0.2–0.4', '0.4–0.6', '0.6–0.8', '0.8–1.0'];
- const scoreDist = [0, 1, 2, 3, 4].map(b => ({
- label: bucketLabels[b],
- count: (rawBuckets.find(r => r.bucket === b)?.count ?? 0),
- }));
-
- const topTags = db.query(`
- SELECT json_each.value AS tag, COUNT(*) AS count
- FROM solutions, json_each(solutions.tags)
- GROUP BY tag ORDER BY count DESC LIMIT 12
- `).all();
-
- return { totals, byCategory, byScope, scoreDist, topTags };
-}
-
-function getSolutions(params: URLSearchParams) {
- const db = getDb();
- const limit = Math.min(Number(params.get('limit') ?? 300), 1000);
- const category = params.get('category');
- const scope = params.get('scope');
-
- const where: string[] = [];
- const args: (string | number)[] = [];
- if (category) { where.push('category = ?'); args.push(category); }
- if (scope) { where.push('scope = ?'); args.push(scope); }
-
- const sql = `SELECT id, problem, solution, scope, category, complexity, score,
- uses, successes, failures, tags, created_at, updated_at FROM solutions`
- + (where.length ? ' WHERE ' + where.join(' AND ') : '')
- + ' ORDER BY score DESC LIMIT ?';
- args.push(limit);
- return db.query(sql).all(...args);
-}
-
-function getFailures() {
- const db = getDb();
- return db.query(`
- SELECT id, error_type, error_message, root_cause, fix_applied, occurrences, created_at
- FROM failures ORDER BY created_at DESC LIMIT 200
- `).all();
-}
-
-function getWarnings() {
- const db = getDb();
- return db.query(`
- SELECT id, type, target, ecosystem, reason, severity, repo_id, created_at
- FROM warnings ORDER BY severity DESC, created_at DESC
- `).all();
-}
-
-function getRepos() {
- const db = getDb();
- return db.query(`
- SELECT r.id, r.name, r.path, r.languages, r.frameworks, r.created_at,
- COUNT(DISTINCT rf.id) AS indexed_files,
- COUNT(DISTINCT s.id) AS symbol_count
- FROM repos r
- LEFT JOIN repo_files rf ON rf.repo_id = r.id
- LEFT JOIN symbols s ON s.repo_id = r.id
- GROUP BY r.id ORDER BY r.created_at DESC
- `).all();
-}
-
-async function triggerReindex(repoPath: string) {
- const result = await matrixReindex({ repoPath, full: false });
- return result;
-}
-
-function getJobs() {
- const db = getDb();
- return db.query(`
- SELECT id, tool_name, status, progress_percent, progress_message,
- created_at, started_at, completed_at, error
- FROM background_jobs ORDER BY created_at DESC LIMIT 50
- `).all();
-}
-
-function getDreamerTasks() {
- const db = getDb();
- return db.query(`
- SELECT id, name, description, enabled, cron_expression,
- timezone, command, working_directory, timeout, tags, created_at
- FROM dreamer_tasks ORDER BY created_at DESC
- `).all();
-}
-
-function getDreamerExecutions() {
- const db = getDb();
- return db.query(`
- SELECT id, task_id, started_at, completed_at, status,
- duration, exit_code, error, task_name
- FROM dreamer_executions ORDER BY started_at DESC LIMIT 100
- `).all();
-}
-
-/** Generic single-row delete — table name is validated by the caller via routing. */
-function deleteRow(table: 'solutions' | 'failures' | 'warnings' | 'dreamer_tasks', id: string) {
- if (!id) return { deleted: false, error: 'Missing id' };
- const db = getDb();
- // Table names come from hard-coded string literals above — safe from injection
- const result = db.query(`DELETE FROM ${table} WHERE id = ?`).run(id);
- return { deleted: result.changes > 0, id };
-}
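The removed `startDashboard` used a pattern worth noting: try the configured port first, then fall back to `port: 0` so the OS assigns a free one. A minimal standalone sketch of that fallback, with a hypothetical `bind` callback standing in for the real `Bun.serve` call:

```typescript
// Try each candidate port in order; 0 asks the OS for any free port.
// `bind` stands in for a real listen call that throws on EADDRINUSE.
function startOnAvailablePort(
  preferred: number,
  bind: (port: number) => number, // returns the port actually bound
): number | null {
  for (const tryPort of [preferred, 0]) {
    try {
      return bind(tryPort);
    } catch {
      // Port taken; fall through to the next candidate.
    }
  }
  return null;
}

// Simulate the preferred port being occupied.
const taken = new Set([4444]);
const fakeBind = (p: number): number => {
  if (taken.has(p)) throw new Error('EADDRINUSE');
  return p === 0 ? 49152 : p; // pretend the OS hands back 49152
};
console.log(startOnAvailablePort(4444, fakeBind)); // 49152
```

Returning `null` when both attempts fail (rather than throwing) lets the MCP server keep running without the dashboard, which is how the removed code behaved.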
diff --git a/src/index.ts b/src/index.ts
index c3d0592..e3946f7 100644
--- a/src/index.ts
+++ b/src/index.ts
@@ -10,9 +10,6 @@ import { closeDb } from './db/index.js';
import { handleToolCall } from './server/index.js';
import { getConfig } from './config/index.js';
import { toolRegistry } from './tools/registry.js';
-import { cleanupOrphanedProcesses } from './jobs/manager.js';
-import { clearAllJobTimeouts } from './jobs/workers.js';
-import { startDashboard } from './http/index.js';
import pkg from '../package.json';
const VERSION: string = pkg.version;
@@ -78,29 +75,6 @@ Delegate read-only tools to ${model === 'haiku' ? 'Haiku' : 'Sonnet'} sub-agents
IMPORTANT: When using these tools, spawn a Task agent with model="${model}" to reduce costs.`);
}
- // Dreamer (Scheduled Tasks) - Clarification Required
- sections.push(`## Dreamer (Scheduled Tasks) - IMPORTANT
-When using matrix_dreamer to add a scheduled task, you MUST clarify the user's intent:
-
-**BEFORE scheduling, ASK the user:**
-- "Do you want this to run ONCE or RECURRING (daily/weekly/etc)?"
-
-**Why this matters:**
-- "at 1am" or "tonight at 3pm" → Could be one-time OR daily
-- Natural language is converted to cron (which is always recurring)
-- Users often expect one-time execution but get daily recurring tasks
-
-**One-time tasks:**
-- Use \`action: "run"\` for immediate execution
-- Or schedule and immediately remove after first run
-- Warn user that true delayed one-time is not natively supported
-
-**Recurring tasks:**
-- Use \`action: "add"\` with explicit recurring keywords confirmed
-- Examples: "every day at 9am", "daily", "weekly on Monday"
-
-NEVER assume recurring. ALWAYS confirm first.`);
-
// When to Use Matrix Tools (always included)
sections.push(`## When to Use Matrix Tools
@@ -122,12 +96,6 @@ DON'T USE for:
}
async function main(): Promise<void> {
- // Cleanup orphaned jobs from previous runs (prevents zombie processes)
- const orphansCleaned = cleanupOrphanedProcesses();
- if (orphansCleaned > 0) {
- console.error(`[Matrix] Cleaned up ${orphansCleaned} orphaned job(s) from previous session`);
- }
-
const instructions = buildInstructions();
const server = new Server(
@@ -162,13 +130,7 @@ async function main(): Promise<void> {
const transport = new StdioServerTransport();
await server.connect(transport);
- // Start local HTTP dashboard alongside MCP (non-blocking, separate server)
- const dashboardServer = startDashboard();
-
const shutdown = () => {
- // Clear all pending job timeouts to prevent orphan timers
- clearAllJobTimeouts();
- dashboardServer?.stop(true);
closeDb();
process.exit(0);
};
diff --git a/src/indexer/analysis.ts b/src/indexer/analysis.ts
index 5a73a58..bde5f0f 100644
--- a/src/indexer/analysis.ts
+++ b/src/indexer/analysis.ts
@@ -5,9 +5,70 @@
* Builds on the existing index store (symbols, imports, repo_files tables).
*/
+import { existsSync, readFileSync } from 'fs';
+import { join } from 'path';
import { getDb } from '../db/client.js';
import { findExports, findCallers } from './store.js';
-import type { ExportResult, SymbolKind } from './types.js';
+import type { SymbolKind } from './types.js';
+
+// ============================================================================
+// tsconfig Path Alias Resolution
+// ============================================================================
+
+interface TsconfigPaths {
+ baseUrl?: string;
+ paths?: Record;
+  paths?: Record<string, string[]>;
+
+/**
+ * Load tsconfig.json path aliases from a repo root.
+ * Returns a map of alias prefix -> directory prefix for import resolution.
+ */
+export function loadTsconfigPaths(repoRoot: string): Map<string, string> {
+  const aliasMap = new Map<string, string>();
+
+ for (const configName of ['tsconfig.json', 'jsconfig.json']) {
+ const configPath = join(repoRoot, configName);
+ if (!existsSync(configPath)) continue;
+
+ try {
+ const raw = readFileSync(configPath, 'utf-8');
+ // Strip single-line comments (tsconfig allows them)
+ const stripped = raw.replace(/\/\/.*$/gm, '');
+ const config = JSON.parse(stripped) as { compilerOptions?: TsconfigPaths };
+
+ const baseUrl = config.compilerOptions?.baseUrl ?? '.';
+ const paths = config.compilerOptions?.paths;
+ if (!paths) continue;
+
+ for (const [alias, targets] of Object.entries(paths)) {
+ const target = targets[0];
+ if (!target) continue;
+
+ // Convert "src/*" -> "src/" and "@/*" -> "@/"
+ const aliasPrefix = alias.replace(/\/?\*$/, '');
+ const targetPrefix = join(baseUrl, target.replace(/\/?\*$/, ''));
+
+ aliasMap.set(aliasPrefix, targetPrefix);
+ }
+ } catch {
+ // Invalid config, skip
+ }
+ }
+
+ return aliasMap;
+}
+
+/** Cached tsconfig paths per repo root */
+const tsconfigPathsCache = new Map<string, Map<string, string>>();
+
+export function getTsconfigPaths(repoRoot: string): Map<string, string> {
+ const cached = tsconfigPathsCache.get(repoRoot);
+ if (cached) return cached;
+ const paths = loadTsconfigPaths(repoRoot);
+ tsconfigPathsCache.set(repoRoot, paths);
+ return paths;
+}
// ============================================================================
// Types
@@ -70,17 +131,34 @@ interface ImportGraphData {
/**
* Resolve an import source path to an actual indexed file path.
*
- * Handles relative paths (./foo, ../bar), extension resolution (.ts/.tsx/.js/.jsx/.mjs),
- * and index file resolution (foo/index.ts).
+ * Handles:
+ * - Relative paths (./foo, ../bar)
+ * - Extension resolution (.ts/.tsx/.js/.jsx/.mjs)
+ * - Index file resolution (foo/index.ts)
+ * - tsconfig path aliases (@/utils, src/foo)
*
- * Returns null for external packages (no ./ or ../ prefix).
+ * Returns null for unresolvable external packages.
*/
function resolveImportPath(
fromFile: string,
sourcePath: string,
  knownFiles: Set<string>,
+  pathAliases?: Map<string, string>,
): string | null {
- // Skip external packages
+ const extensions = ['.ts', '.tsx', '.js', '.jsx', '.mjs'];
+
+ // Try tsconfig path aliases first (e.g., @/utils/foo -> src/utils/foo)
+ if (pathAliases && !sourcePath.startsWith('.')) {
+ for (const [alias, target] of pathAliases) {
+ if (sourcePath === alias || sourcePath.startsWith(alias + '/')) {
+ const resolved = sourcePath.replace(alias, target);
+ const result = tryResolve(resolved, knownFiles, extensions);
+ if (result) return result;
+ }
+ }
+ }
+
+ // Skip remaining external packages (no ./ or ../ and no alias match)
if (!sourcePath.startsWith('.')) {
return null;
}
@@ -101,18 +179,18 @@ function resolveImportPath(
}
const base = segments.join('/');
- const extensions = ['.ts', '.tsx', '.js', '.jsx', '.mjs'];
+ return tryResolve(base, knownFiles, extensions);
+}
- // Try exact match first (already has extension)
+/** Try resolving a base path with extensions and /index fallbacks */
+function tryResolve(base: string, knownFiles: Set<string>, extensions: string[]): string | null {
if (knownFiles.has(base)) return base;
- // Try with extensions
for (const ext of extensions) {
const candidate = base + ext;
if (knownFiles.has(candidate)) return candidate;
}
- // Try as directory with /index
for (const ext of extensions) {
const candidate = base + '/index' + ext;
if (knownFiles.has(candidate)) return candidate;
@@ -124,8 +202,9 @@ function resolveImportPath(
/**
* Build the import graph from the index database.
* Returns both outgoing (what each file imports) and incoming (who imports each file) edges.
+ * Optionally resolves tsconfig path aliases for better import matching.
*/
-function buildImportGraph(repoId: string, pathPrefix?: string): ImportGraphData {
+function buildImportGraph(repoId: string, pathPrefix?: string, pathAliases?: Map<string, string>): ImportGraphData {
const db = getDb();
// Load ALL files for import resolution (unfiltered) to prevent
@@ -173,7 +252,7 @@ function buildImportGraph(repoId: string, pathPrefix?: string): ImportGraphData
if (!fromFile) continue;
// Resolve against ALL files (not just scope) for accurate cross-boundary resolution
- const resolved = resolveImportPath(fromFile, imp.source_path, allFiles);
+ const resolved = resolveImportPath(fromFile, imp.source_path, allFiles, pathAliases);
if (!resolved) continue;
outgoing.get(fromFile)?.add(resolved);
@@ -239,9 +318,10 @@ export function findOrphanedFiles(
pathPrefix?: string,
entryPointPatterns?: string[],
limit: number = 100,
+  pathAliases?: Map<string, string>,
): OrphanedFile[] {
const db = getDb();
- const graph = buildImportGraph(repoId, pathPrefix);
+ const graph = buildImportGraph(repoId, pathPrefix, pathAliases);
const isEntryPoint = buildEntryPointMatcher(entryPointPatterns || []);
const orphaned: OrphanedFile[] = [];
@@ -287,8 +367,9 @@ export function findCircularDeps(
repoId: string,
pathPrefix?: string,
maxDepth: number = 10,
+  pathAliases?: Map<string, string>,
): CircularDepsResult {
- const graph = buildImportGraph(repoId, pathPrefix);
+ const graph = buildImportGraph(repoId, pathPrefix, pathAliases);
const rawCycles = detectCycles(graph.outgoing, maxDepth);
const cycles = deduplicateCycles(rawCycles);
@@ -465,9 +546,13 @@ export function analyzeDeadCode(
pathPrefix?: string,
entryPoints?: string[],
limit: number = 100,
+ repoRoot?: string,
): DeadCodeResult {
const db = getDb();
+ // Load tsconfig path aliases for better import resolution
+ const pathAliases = repoRoot ? getTsconfigPaths(repoRoot) : undefined;
+
// Get total counts for summary
const fileCount = db.query(
`SELECT COUNT(*) as count FROM repo_files WHERE repo_id = ?${pathPrefix ? ' AND file_path LIKE ?' : ''}`,
@@ -485,7 +570,7 @@ export function analyzeDeadCode(
}
if (category === 'orphaned_files' || category === 'all') {
- orphanedFiles = findOrphanedFiles(repoId, pathPrefix, entryPoints, limit);
+ orphanedFiles = findOrphanedFiles(repoId, pathPrefix, entryPoints, limit, pathAliases);
}
return {
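The tsconfig alias handling added above reduces to prefix replacement: an entry like `"@/*": ["src/*"]` becomes the pair `@ -> src`, and any import specifier that equals the alias or starts with `alias + '/'` is rewritten before file resolution. A self-contained sketch of that lookup (an illustrative helper, not the actual `resolveImportPath`):

```typescript
// Rewrite an import specifier through a tsconfig-style alias map.
// Entries are prefix pairs, e.g. "@" -> "src" (derived from "@/*": ["src/*"]).
function applyAlias(sourcePath: string, aliases: Map<string, string>): string | null {
  for (const [alias, target] of aliases) {
    if (sourcePath === alias || sourcePath.startsWith(alias + '/')) {
      return target + sourcePath.slice(alias.length);
    }
  }
  return null; // no alias matched: treat as an external package
}

const aliases = new Map([
  ['@', 'src'],
  ['~lib', 'packages/lib'],
]);
console.log(applyAlias('@/utils/foo', aliases)); // "src/utils/foo"
console.log(applyAlias('~lib/math', aliases));   // "packages/lib/math"
console.log(applyAlias('react', aliases));       // null
```

Replacing only the matched prefix (rather than a global string replace) avoids corrupting specifiers that happen to contain the alias text elsewhere in the path.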
diff --git a/src/indexer/diff.ts b/src/indexer/diff.ts
index 21b033f..f01261c 100644
--- a/src/indexer/diff.ts
+++ b/src/indexer/diff.ts
@@ -3,17 +3,29 @@
*
* Compares current repository state against indexed state
* to determine which files need re-indexing.
+ * Uses content hashing for reliable change detection beyond mtime.
*/
import type { ScannedFile, FileDiff, RepoFileRow } from './types.js';
/**
- * Compare scanned files against indexed files to find changes
+ * Compute a fast content hash for a file
*/
-export function computeDiff(
+export async function computeFileHash(absolutePath: string): Promise<string> {
+ const content = await Bun.file(absolutePath).arrayBuffer();
+ const hasher = new Bun.CryptoHasher('sha256');
+ hasher.update(new Uint8Array(content));
+ return hasher.digest('hex');
+}
+
+/**
+ * Compare scanned files against indexed files to find changes.
+ * Uses mtime as a fast first check, then content hash for accuracy.
+ */
+export async function computeDiff(
scannedFiles: ScannedFile[],
  indexedFiles: Map<string, RepoFileRow>
-): FileDiff {
+): Promise<FileDiff> {
const added: ScannedFile[] = [];
const modified: ScannedFile[] = [];
const deleted: string[] = [];
@@ -28,8 +40,17 @@ export function computeDiff(
// New file, not in index
added.push(file);
} else if (file.mtime > indexed.mtime) {
- // File has been modified since last index
- modified.push(file);
+ // mtime changed — verify with content hash if we have one stored
+ if (indexed.hash) {
+ const currentHash = await computeFileHash(file.absolutePath);
+ if (currentHash !== indexed.hash) {
+ modified.push(file);
+ }
+ // Same hash despite mtime change — skip (e.g., touch without edit)
+ } else {
+ // No stored hash — assume modified
+ modified.push(file);
+ }
}
// else: file unchanged, skip
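The two-stage check in `computeDiff` (cheap mtime compare first, content hash only when mtime moved) can be sketched in isolation. This sketch uses Node's `crypto` instead of `Bun.CryptoHasher`, and an assumed minimal record shape rather than the real `RepoFileRow`:

```typescript
import { createHash } from 'crypto';

interface IndexedEntry { mtime: number; hash?: string }

// Cheap mtime check first; hash the content only to rule out
// touch-without-edit false positives when mtime has moved.
function needsReindex(
  current: { mtime: number; content: string },
  indexed?: IndexedEntry,
): boolean {
  if (!indexed) return true;                        // new file
  if (current.mtime <= indexed.mtime) return false; // unchanged per mtime
  if (!indexed.hash) return true;                   // no stored hash: assume modified
  const hash = createHash('sha256').update(current.content).digest('hex');
  return hash !== indexed.hash;
}

const stored = createHash('sha256').update('export const x = 1;\n').digest('hex');
// mtime bumped but content identical -> no reindex needed
console.log(needsReindex({ mtime: 2, content: 'export const x = 1;\n' }, { mtime: 1, hash: stored })); // false
```

The ordering matters for cost: mtime is a free stat-level comparison, so the hash (a full file read) only runs for the small subset of files whose mtime actually changed.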
diff --git a/src/indexer/index.ts b/src/indexer/index.ts
index 741c41a..03d4d87 100644
--- a/src/indexer/index.ts
+++ b/src/indexer/index.ts
@@ -9,7 +9,7 @@
import { readFile } from 'fs/promises';
import { scanRepository } from './scanner.js';
import { parseFile } from './parser.js';
-import { computeDiff, hasChanges, getDiffSummary } from './diff.js';
+import { computeDiff, computeFileHash, hasChanges, getDiffSummary } from './diff.js';
import {
getIndexedFiles,
upsertFile,
@@ -71,7 +71,7 @@ export async function indexRepository(options: IndexerOptions): Promise<IndexResult> {
+      if (elapsed > LanguageParser.PARSE_TIMEOUT_MS) {
+ errors.push(`Parse timeout after ${LanguageParser.PARSE_TIMEOUT_MS}ms (symbols extracted, imports skipped)`);
+ return { symbols, imports, errors };
+ }
+
this.extractImports(tree.rootNode, imports);
} catch (err) {
errors.push(`Parse error: ${err instanceof Error ? err.message : String(err)}`);
diff --git a/src/indexer/languages/c.ts b/src/indexer/languages/c.ts
index 1b8eabf..8a3f78e 100644
--- a/src/indexer/languages/c.ts
+++ b/src/indexer/languages/c.ts
@@ -6,7 +6,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport } from '../types.js';
export class CParser extends LanguageParser {
protected extractSymbols(rootNode: SyntaxNode, symbols: ExtractedSymbol[]): void {
diff --git a/src/indexer/languages/go.ts b/src/indexer/languages/go.ts
index 47caafe..1205141 100644
--- a/src/indexer/languages/go.ts
+++ b/src/indexer/languages/go.ts
@@ -6,7 +6,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
export class GoParser extends LanguageParser {
protected extractSymbols(
diff --git a/src/indexer/languages/java.ts b/src/indexer/languages/java.ts
index 6916469..fb6c5f8 100644
--- a/src/indexer/languages/java.ts
+++ b/src/indexer/languages/java.ts
@@ -6,7 +6,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport } from '../types.js';
export class JavaParser extends LanguageParser {
protected extractSymbols(rootNode: SyntaxNode, symbols: ExtractedSymbol[]): void {
diff --git a/src/indexer/languages/python.ts b/src/indexer/languages/python.ts
index e7cf7d2..64c201e 100644
--- a/src/indexer/languages/python.ts
+++ b/src/indexer/languages/python.ts
@@ -2,11 +2,12 @@
* Python Language Parser
*
* Extracts symbols and imports from Python files using tree-sitter.
+ * Supports decorators (@property, @staticmethod, @classmethod, @dataclass, etc.)
*/
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
export class PythonParser extends LanguageParser {
protected extractSymbols(
@@ -17,11 +18,11 @@ export class PythonParser extends LanguageParser {
this.walkTree(rootNode, (node) => {
switch (node.type) {
case 'function_definition':
- this.handleFunctionDefinition(node, symbols, scope);
+ this.handleFunctionDefinition(node, symbols, scope, []);
return false; // Don't recurse into function body
case 'class_definition':
- this.handleClassDefinition(node, symbols, scope);
+ this.handleClassDefinition(node, symbols, scope, []);
// Handle class body with class scope
const className = this.getChildByField(node, 'name');
const classBody = this.getChildByField(node, 'body');
@@ -30,7 +31,9 @@ export class PythonParser extends LanguageParser {
for (const child of classBody.namedChildren) {
if (!child) continue;
if (child.type === 'function_definition') {
- this.handleFunctionDefinition(child, symbols, classScope);
+ this.handleFunctionDefinition(child, symbols, classScope, []);
+ } else if (child.type === 'decorated_definition') {
+ this.handleDecoratedDefinition(child, symbols, classScope);
}
}
}
@@ -45,27 +48,57 @@ export class PythonParser extends LanguageParser {
break;
case 'decorated_definition':
- // Handle decorated functions/classes
- const decorated = node.namedChildren.find(
- (c) => c !== null && (c.type === 'function_definition' || c.type === 'class_definition')
- );
- if (decorated) {
- if (decorated.type === 'function_definition') {
- this.handleFunctionDefinition(decorated, symbols, scope);
- } else if (decorated.type === 'class_definition') {
- this.handleClassDefinition(decorated, symbols, scope);
- }
- }
+ this.handleDecoratedDefinition(node, symbols, scope);
return false;
}
return true;
});
}
- private handleFunctionDefinition(
+ /**
+ * Handle a decorated definition — extract decorators and delegate
+ */
+ private handleDecoratedDefinition(
node: SyntaxNode,
symbols: ExtractedSymbol[],
scope?: string
+ ): void {
+ // Collect decorator names
+ const decorators: string[] = [];
+ for (const child of node.namedChildren) {
+ if (!child) continue;
+ if (child.type === 'decorator') {
+ // Get the decorator expression (name or call)
+ const exprNode = child.namedChildren.find(c => c !== null);
+ if (exprNode) {
+ // Handle both @foo and @foo.bar and @foo(args)
+ const text = this.getNodeText(exprNode);
+ // Extract just the decorator name (before any parens)
+ const name = text.split('(')[0] ?? text;
+ decorators.push(name);
+ }
+ }
+ }
+
+ // Find the actual definition
+ const decorated = node.namedChildren.find(
+ (c) => c !== null && (c.type === 'function_definition' || c.type === 'class_definition')
+ );
+
+ if (decorated) {
+ if (decorated.type === 'function_definition') {
+ this.handleFunctionDefinition(decorated, symbols, scope, decorators);
+ } else if (decorated.type === 'class_definition') {
+ this.handleClassDefinition(decorated, symbols, scope, decorators);
+ }
+ }
+ }
+
+ private handleFunctionDefinition(
+ node: SyntaxNode,
+ symbols: ExtractedSymbol[],
+ scope?: string,
+ decorators: string[] = []
): void {
const nameNode = this.getChildByField(node, 'name');
if (!nameNode) return;
@@ -76,17 +109,30 @@ export class PythonParser extends LanguageParser {
const isSpecialMethod = name.startsWith('__') && name.endsWith('__');
const isPrivate = name.startsWith('_') && !isSpecialMethod;
- // Determine if it's a method (inside a class)
+ // Determine kind based on decorators
const isMethod = !!scope;
- const kind: SymbolKind = isMethod ? 'method' : 'function';
+ const isProperty = decorators.some(d => d === 'property' || d.endsWith('.setter') || d.endsWith('.getter') || d.endsWith('.deleter'));
+ let kind: SymbolKind;
+ if (isProperty) {
+ kind = 'property';
+ } else if (isMethod) {
+ kind = 'method';
+ } else {
+ kind = 'function';
+ }
const signature = this.getFunctionSignature(node);
+ // Build decorator annotation for signature
+ const decoratorSuffix = decorators.length > 0
+ ? ` [@${decorators.join(', @')}]`
+ : '';
+
symbols.push(
this.createSymbol(name, kind, node, {
exported: !isPrivate, // Python doesn't have exports, but we use convention
scope,
- signature,
+ signature: signature + decoratorSuffix,
})
);
}
@@ -94,7 +140,8 @@ export class PythonParser extends LanguageParser {
private handleClassDefinition(
node: SyntaxNode,
symbols: ExtractedSymbol[],
- scope?: string
+ scope?: string,
+ decorators: string[] = []
): void {
const nameNode = this.getChildByField(node, 'name');
if (!nameNode) return;
@@ -102,12 +149,50 @@ export class PythonParser extends LanguageParser {
const name = this.getNodeText(nameNode);
const isPrivate = name.startsWith('_');
+ // Check for dataclass decorator
+ const isDataclass = decorators.some(d => d === 'dataclass' || d.endsWith('.dataclass'));
+
+ // Build signature with base classes and decorators
+ const superclassNode = this.getChildByField(node, 'superclasses');
+ let signature = superclassNode ? this.getNodeText(superclassNode) : undefined;
+
+ if (decorators.length > 0) {
+ const decoratorStr = `[@${decorators.join(', @')}]`;
+ signature = signature ? `${signature} ${decoratorStr}` : decoratorStr;
+ }
+
symbols.push(
this.createSymbol(name, 'class', node, {
exported: !isPrivate,
scope,
+ signature,
})
);
+
+ // For dataclasses, extract fields from the class body as properties
+ if (isDataclass) {
+ const classBody = this.getChildByField(node, 'body');
+ if (classBody) {
+ for (const child of classBody.namedChildren) {
+ if (!child) continue;
+ if (child.type === 'expression_statement') {
+ const expr = child.namedChildren.find(c => c !== null);
+ if (expr && expr.type === 'assignment') {
+ // Annotated assignment: field: Type = default
+ const typeNode = this.getChildByField(expr, 'type');
+ const fieldName = this.getChildByField(expr, 'left');
+ if (fieldName && typeNode) {
+ symbols.push(
+ this.createSymbol(this.getNodeText(fieldName), 'property', fieldName, {
+ exported: true,
+ scope: name,
+ })
+ );
+ }
+ }
+ }
+ }
+ }
+ }
}
private handleAssignment(node: SyntaxNode, symbols: ExtractedSymbol[]): void {
@@ -116,7 +201,7 @@ export class PythonParser extends LanguageParser {
const name = this.getNodeText(leftNode);
- // Check if it's a constant (UPPER_CASE convention)
+ // Check if it's a constant (UPPER_CASE / SCREAMING_SNAKE_CASE convention)
const isConstant = /^[A-Z][A-Z0-9_]*$/.test(name);
const isPrivate = name.startsWith('_');
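For reference, the decorator-driven kind selection in the Python parser above can be sketched in isolation. This is a hedged sketch, not part of the patch; `classifyPythonFunction` is a hypothetical name, and the logic mirrors the `isProperty`/`isMethod` branching and the `split('(')` decorator-name step:

```typescript
// Sketch of the kind-selection rule used by handleFunctionDefinition.
// classifyPythonFunction is a hypothetical helper for illustration only.
function classifyPythonFunction(
  decorators: string[],
  insideClass: boolean
): 'property' | 'method' | 'function' {
  const isProperty = decorators.some(
    (d) =>
      d === 'property' ||
      d.endsWith('.setter') ||
      d.endsWith('.getter') ||
      d.endsWith('.deleter')
  );
  if (isProperty) return 'property';
  return insideClass ? 'method' : 'function';
}

// Decorator names are taken before any call parens, so "@lru_cache(maxsize=8)"
// yields "lru_cache" — the same split('(') step as in the parser.
const decoratorName = 'lru_cache(maxsize=8)'.split('(')[0];
console.log(decoratorName);                                  // lru_cache
console.log(classifyPythonFunction(['value.setter'], true)); // property
console.log(classifyPythonFunction([], true));               // method
```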
diff --git a/src/indexer/languages/rust.ts b/src/indexer/languages/rust.ts
index 2bbff3b..b7f2908 100644
--- a/src/indexer/languages/rust.ts
+++ b/src/indexer/languages/rust.ts
@@ -6,7 +6,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport } from '../types.js';
export class RustParser extends LanguageParser {
protected extractSymbols(
diff --git a/src/indexer/languages/swift.ts b/src/indexer/languages/swift.ts
index f21ebac..c4a894a 100644
--- a/src/indexer/languages/swift.ts
+++ b/src/indexer/languages/swift.ts
@@ -6,7 +6,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
export class SwiftParser extends LanguageParser {
protected extractSymbols(rootNode: SyntaxNode, symbols: ExtractedSymbol[]): void {
diff --git a/src/indexer/languages/typescript.ts b/src/indexer/languages/typescript.ts
index 0d19db7..a75c601 100644
--- a/src/indexer/languages/typescript.ts
+++ b/src/indexer/languages/typescript.ts
@@ -7,7 +7,7 @@
import type { Node as SyntaxNode } from 'web-tree-sitter';
import { LanguageParser } from './base.js';
-import type { ParseResult, ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
+import type { ExtractedSymbol, ExtractedImport, SymbolKind } from '../types.js';
export class TypeScriptParser extends LanguageParser {
protected extractSymbols(
@@ -259,12 +259,72 @@ export class TypeScriptParser extends LanguageParser {
this.walkTree(rootNode, (node) => {
if (node.type === 'import_statement') {
this.handleImportStatement(node, imports);
- return false; // Don't recurse into import
+ return false;
+ }
+ // Track re-exports: export { X } from './bar' and export * from './bar'
+ if (node.type === 'export_statement') {
+ this.handleReExport(node, imports);
+ return true; // Still recurse for nested declarations
}
return true;
});
}
+ /**
+ * Handle re-export statements:
+ * - export { Foo, Bar } from './module'
+ * - export * from './module'
+ * - export * as Ns from './module'
+ * These create import edges in the dependency graph.
+ */
+ private handleReExport(node: SyntaxNode, imports: ExtractedImport[]): void {
+ const sourceNode = this.getChildByField(node, 'source');
+ if (!sourceNode) return; // Not a re-export (no "from" clause)
+
+ const sourcePath = this.getNodeText(sourceNode).replace(/^['"]|['"]$/g, '');
+ const line = node.startPosition.row + 1;
+
+ // Check for export * from './module'
+ const hasStar = node.children.some(c => c !== null && c.type === '*');
+ if (hasStar) {
+ // Check for export * as Ns from './module'
+ const nsImport = node.namedChildren.find(c => c !== null && c.type === 'namespace_export');
+ if (nsImport) {
+ const nameNode = nsImport.namedChildren.find(c => c !== null && c.type === 'identifier');
+ imports.push(
+ this.createImport(
+ nameNode ? this.getNodeText(nameNode) : '*',
+ sourcePath,
+ line,
+ { isNamespace: true }
+ )
+ );
+ } else {
+ imports.push(
+ this.createImport('*', sourcePath, line, { isNamespace: true })
+ );
+ }
+ return;
+ }
+
+ // Check for export { Foo, Bar } from './module'
+ const exportClause = node.namedChildren.find(c => c !== null && c.type === 'export_clause');
+ if (exportClause) {
+ for (const specifier of exportClause.namedChildren) {
+ if (!specifier || specifier.type !== 'export_specifier') continue;
+ const nameNode = this.getChildByField(specifier, 'name');
+ const aliasNode = this.getChildByField(specifier, 'alias');
+ if (nameNode) {
+ imports.push(
+ this.createImport(this.getNodeText(nameNode), sourcePath, line, {
+ localName: aliasNode ? this.getNodeText(aliasNode) : undefined,
+ })
+ );
+ }
+ }
+ }
+ }
+
private handleImportStatement(node: SyntaxNode, imports: ExtractedImport[]): void {
const sourceNode = this.getChildByField(node, 'source');
if (!sourceNode) return;
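The three re-export shapes handled above map to import edges roughly as follows. This is an illustrative sketch: `ImportEdge` is a simplified stand-in for the real `ExtractedImport` shape, and only the quote-stripping step is taken verbatim from the patch:

```typescript
// Simplified illustration of the edges handleReExport produces.
type ImportEdge = {
  name: string;
  source: string;
  isNamespace?: boolean;
  localName?: string;
};

// Mirrors the `replace(/^['"]|['"]$/g, '')` step used on the source literal.
const stripQuotes = (s: string): string => s.replace(/^['"]|['"]$/g, '');

const edges: ImportEdge[] = [
  // export * from './mod'
  { name: '*', source: stripQuotes(`'./mod'`), isNamespace: true },
  // export * as Ns from './mod'
  { name: 'Ns', source: stripQuotes(`'./mod'`), isNamespace: true },
  // export { Foo as Bar } from './mod'
  { name: 'Foo', source: stripQuotes(`'./mod'`), localName: 'Bar' },
];

console.log(edges[0].source); // ./mod
```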
diff --git a/src/indexer/scanner.ts b/src/indexer/scanner.ts
index 5a17d3a..f002ae1 100644
--- a/src/indexer/scanner.ts
+++ b/src/indexer/scanner.ts
@@ -3,10 +3,12 @@
*
* Discovers source code files in a repository across multiple languages.
* Uses Bun.Glob for fast file discovery with pattern exclusions.
+ * Respects .gitignore rules and detects symlink cycles.
*/
import { join } from 'path';
-import { stat } from 'fs/promises';
+import { stat, lstat } from 'fs/promises';
+import { existsSync, readFileSync } from 'fs';
import type { ScannedFile } from './types.js';
import { getSupportedExtensions } from './languages/index.js';
@@ -51,6 +53,60 @@ export interface ScanOptions {
includeTests?: boolean; // include test files, default: false
}
+/**
+ * Parse a .gitignore file into regex patterns
+ */
+function parseGitignore(repoRoot: string): RegExp[] {
+ const gitignorePath = join(repoRoot, '.gitignore');
+ if (!existsSync(gitignorePath)) return [];
+
+ try {
+ const content = readFileSync(gitignorePath, 'utf-8');
+ const patterns: RegExp[] = [];
+
+ for (const rawLine of content.split('\n')) {
+ const line = rawLine.trim();
+ // Skip empty lines and comments
+ if (!line || line.startsWith('#')) continue;
+ // Skip negation patterns (too complex for simple matching)
+ if (line.startsWith('!')) continue;
+
+ // Convert gitignore glob to regex
+ let pattern = line;
+
+ // Remove trailing slash (directory marker — we match both)
+ const dirOnly = pattern.endsWith('/');
+ if (dirOnly) pattern = pattern.slice(0, -1);
+
+ // Escape regex special chars (except * and ?)
+ pattern = pattern.replace(/[.+^${}()|[\]\\]/g, '\\$&');
+
+ // Convert gitignore wildcards to regex
+ pattern = pattern
+ .replace(/\*\*/g, '___GLOBSTAR___')
+ .replace(/\*/g, '[^/]*')
+ .replace(/\?/g, '[^/]')
+ .replace(/___GLOBSTAR___/g, '.*');
+
+ // If pattern doesn't contain /, it matches any path segment
+ if (!line.includes('/')) {
+ patterns.push(new RegExp(`(^|/)${pattern}(/|$)`));
+ } else {
+ // If starts with /, anchor to repo root
+ if (line.startsWith('/')) {
+ patterns.push(new RegExp(`^${pattern.slice(1)}(/|$)`));
+ } else {
+ patterns.push(new RegExp(`(^|/)${pattern}(/|$)`));
+ }
+ }
+ }
+
+ return patterns;
+ } catch {
+ return [];
+ }
+}
+
/**
* Scan a repository for source code files across all supported languages
*/
@@ -64,6 +120,9 @@ export async function scanRepository(options: ScanOptions): Promise<ScannedFile[]> {
+ const gitignorePatterns = parseGitignore(repoRoot);
+ const seenRealPaths = new Set<string>();
+
const glob = new Bun.Glob(pattern);
for await (const filePath of glob.scan({
@@ -131,10 +193,27 @@ export async function scanRepository(options: ScanOptions): Promise<ScannedFile[]> {
+ // Skip files matched by .gitignore patterns
+ if (gitignorePatterns.some((rx) => rx.test(filePath))) {
+ continue;
+ }
+
const absolutePath = join(repoRoot, filePath);
try {
- const stats = await stat(absolutePath);
+ // Detect symlinks and check for cycles
+ const lstats = await lstat(absolutePath);
+ if (lstats.isSymbolicLink()) {
+ // realpath() throws for broken symlinks; a repeated target means a cycle
+ let realPath: string | null = null;
+ try {
+ realPath = await (await import('fs/promises')).realpath(absolutePath);
+ } catch {
+ realPath = null;
+ }
+ if (!realPath || seenRealPaths.has(realPath)) {
+ continue; // Broken symlink or cycle
+ }
+ seenRealPaths.add(realPath);
+ }
+
+ const stats = lstats.isSymbolicLink() ? await stat(absolutePath) : lstats;
// Skip files larger than maxFileSize
if (stats.size > maxFileSize) {
@@ -145,6 +224,7 @@ export async function scanRepository(options: ScanOptions): Promise<ScannedFile[]> {
return rows.map(row => ({
- name: row.name, // Include actual symbol name for search results
+ name: row.name,
file: row.file_path,
line: row.line,
column: row.column,
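The glob-to-regex conversion inside `parseGitignore` can be exercised standalone. The sketch below reproduces the wildcard steps from the patch but simplifies the anchoring rules (`globToRegexSource` is a hypothetical name; the real function also handles leading `/` and trailing-slash directory markers):

```typescript
// Mirrors parseGitignore's wildcard handling: escape regex specials (except * and ?),
// protect ** before rewriting single *, then expand ** to match across slashes.
function globToRegexSource(line: string): string {
  let pattern = line.replace(/[.+^${}()|[\]\\]/g, '\\$&');
  pattern = pattern
    .replace(/\*\*/g, '___GLOBSTAR___')
    .replace(/\*/g, '[^/]*')
    .replace(/\?/g, '[^/]')
    .replace(/___GLOBSTAR___/g, '.*');
  // Simplified anchoring: match the pattern as a whole path segment anywhere.
  return `(^|/)${pattern}(/|$)`;
}

const logs = new RegExp(globToRegexSource('*.log'));
console.log(logs.test('build/app.log')); // true
console.log(logs.test('src/logger.ts')); // false

const dist = new RegExp(globToRegexSource('dist'));
console.log(dist.test('packages/app/dist/index.js')); // true
```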
diff --git a/src/indexer/types.ts b/src/indexer/types.ts
index 6df8cfd..4ddd5b4 100644
--- a/src/indexer/types.ts
+++ b/src/indexer/types.ts
@@ -53,6 +53,7 @@ export interface ScannedFile {
path: string; // relative to repo root
absolutePath: string; // full path
mtime: number; // modification time (unix timestamp)
+ size?: number; // file size in bytes
}
// Diff result for incremental updates
diff --git a/src/jobs/index.ts b/src/jobs/index.ts
deleted file mode 100644
index d9d3392..0000000
--- a/src/jobs/index.ts
+++ /dev/null
@@ -1,21 +0,0 @@
-/**
- * Background Jobs Module
- *
- * Exports job management functions for long-running operations.
- */
-
-export {
- createJob,
- getJob,
- updateJob,
- cancelJob,
- listJobs,
- cleanupOldJobs,
- updateJobPid,
- cleanupOrphanedProcesses,
- type Job,
- type JobStatus,
- type JobUpdate,
-} from './manager.js';
-
-export { runReindexJob, spawnBackgroundJob } from './workers.js';
diff --git a/src/jobs/manager.ts b/src/jobs/manager.ts
deleted file mode 100644
index 008440f..0000000
--- a/src/jobs/manager.ts
+++ /dev/null
@@ -1,323 +0,0 @@
-/**
- * Background Job Manager
- *
- * Provides job tracking for long-running MCP operations.
- * Jobs are stored in SQLite and can be polled for status.
- *
- * Usage:
- * const jobId = createJob('matrix_reindex');
- * updateJob(jobId, { status: 'running', progress: 50 });
- * const job = getJob(jobId);
- */
-
-import { getDb } from '../db/index.js';
-import { clearJobTimeout } from './workers.js';
-
-function safeJsonParse<T>(json: string | null | undefined, fallback: T): T {
- try { return json ? JSON.parse(json) : fallback; } catch { return fallback; }
-}
-
-export type JobStatus = 'queued' | 'running' | 'completed' | 'failed' | 'cancelled';
-
-export interface Job {
- id: string;
- toolName: string;
- status: JobStatus;
- progressPercent: number;
- progressMessage: string | null;
- input: unknown | null;
- result: unknown | null;
- error: string | null;
- pid: number | null;
- createdAt: string;
- startedAt: string | null;
- completedAt: string | null;
-}
-
-export interface JobUpdate {
- status?: JobStatus;
- progressPercent?: number;
- progressMessage?: string;
- result?: unknown;
- error?: string;
-}
-
-/**
- * Generate a unique job ID
- */
-function generateJobId(): string {
- const timestamp = Date.now().toString(36);
- const random = Math.random().toString(36).substring(2, 8);
- return `job_${timestamp}_${random}`;
-}
-
-/**
- * Create a new background job
- *
- * @param toolName - Name of the MCP tool (e.g., 'matrix_reindex')
- * @param input - Optional input parameters to store
- * @returns Job ID
- */
-export function createJob(toolName: string, input?: unknown): string {
- const db = getDb();
- const id = generateJobId();
-
- db.query(`
- INSERT INTO background_jobs (id, tool_name, status, input)
- VALUES (?, ?, 'queued', ?)
- `).run(id, toolName, input ? JSON.stringify(input) : null);
-
- return id;
-}
-
-/**
- * Get a job by ID
- *
- * @param jobId - Job ID to look up
- * @returns Job object or null if not found
- */
-export function getJob(jobId: string): Job | null {
- const db = getDb();
-
- const row = db.query(`
- SELECT id, tool_name, status, progress_percent, progress_message,
- input, result, error, pid, created_at, started_at, completed_at
- FROM background_jobs
- WHERE id = ?
- `).get(jobId) as {
- id: string;
- tool_name: string;
- status: JobStatus;
- progress_percent: number;
- progress_message: string | null;
- input: string | null;
- result: string | null;
- error: string | null;
- pid: number | null;
- created_at: string;
- started_at: string | null;
- completed_at: string | null;
- } | null;
-
- if (!row) return null;
-
- return {
- id: row.id,
- toolName: row.tool_name,
- status: row.status,
- progressPercent: row.progress_percent,
- progressMessage: row.progress_message,
- input: safeJsonParse(row.input, null),
- result: safeJsonParse(row.result, null),
- error: row.error,
- pid: row.pid,
- createdAt: row.created_at,
- startedAt: row.started_at,
- completedAt: row.completed_at,
- };
-}
-
-/**
- * Update a job's status and progress
- *
- * @param jobId - Job ID to update
- * @param updates - Fields to update
- */
-export function updateJob(jobId: string, updates: JobUpdate): void {
- const db = getDb();
- const sets: string[] = [];
- const values: (string | number | null)[] = [];
-
- if (updates.status !== undefined) {
- sets.push('status = ?');
- values.push(updates.status);
-
- // Auto-set timestamps based on status
- if (updates.status === 'running') {
- sets.push("started_at = datetime('now')");
- } else if (['completed', 'failed', 'cancelled'].includes(updates.status)) {
- sets.push("completed_at = datetime('now')");
- // Clear timeout on terminal states (prevents orphan timers)
- clearJobTimeout(jobId);
- }
- }
-
- if (updates.progressPercent !== undefined) {
- sets.push('progress_percent = ?');
- values.push(updates.progressPercent);
- }
-
- if (updates.progressMessage !== undefined) {
- sets.push('progress_message = ?');
- values.push(updates.progressMessage);
- }
-
- if (updates.result !== undefined) {
- sets.push('result = ?');
- values.push(JSON.stringify(updates.result));
- }
-
- if (updates.error !== undefined) {
- sets.push('error = ?');
- values.push(updates.error);
- }
-
- if (sets.length === 0) return;
-
- values.push(jobId);
- db.query(`UPDATE background_jobs SET ${sets.join(', ')} WHERE id = ?`).run(...values);
-}
-
-/**
- * Cancel a job if it's still running or queued
- *
- * @param jobId - Job ID to cancel
- * @returns true if job was cancelled, false if already completed/failed
- */
-export function cancelJob(jobId: string): boolean {
- const db = getDb();
-
- const result = db.query(`
- UPDATE background_jobs
- SET status = 'cancelled', completed_at = datetime('now')
- WHERE id = ? AND status IN ('queued', 'running')
- `).run(jobId);
-
- // Clear timeout on cancel (uses direct SQL, so updateJob won't be called)
- if (result.changes > 0) {
- clearJobTimeout(jobId);
- }
-
- return result.changes > 0;
-}
-
-/**
- * List jobs by status
- *
- * @param status - Filter by status (optional)
- * @param limit - Max number of jobs to return
- * @returns Array of jobs
- */
-export function listJobs(status?: JobStatus, limit = 50): Job[] {
- const db = getDb();
-
- let query = `
- SELECT id, tool_name, status, progress_percent, progress_message,
- input, result, error, pid, created_at, started_at, completed_at
- FROM background_jobs
- `;
-
- const params: (string | number)[] = [];
-
- if (status) {
- query += ' WHERE status = ?';
- params.push(status);
- }
-
- query += ' ORDER BY created_at DESC LIMIT ?';
- params.push(limit);
-
- const rows = db.query(query).all(...params) as Array<{
- id: string;
- tool_name: string;
- status: JobStatus;
- progress_percent: number;
- progress_message: string | null;
- input: string | null;
- result: string | null;
- error: string | null;
- pid: number | null;
- created_at: string;
- started_at: string | null;
- completed_at: string | null;
- }>;
-
- return rows.map(row => ({
- id: row.id,
- toolName: row.tool_name,
- status: row.status,
- progressPercent: row.progress_percent,
- progressMessage: row.progress_message,
- input: safeJsonParse(row.input, null),
- result: safeJsonParse(row.result, null),
- error: row.error,
- pid: row.pid,
- createdAt: row.created_at,
- startedAt: row.started_at,
- completedAt: row.completed_at,
- }));
-}
-
-/**
- * Clean up old completed jobs (older than 24 hours)
- */
-export function cleanupOldJobs(): number {
- const db = getDb();
-
- const result = db.query(`
- DELETE FROM background_jobs
- WHERE status IN ('completed', 'failed', 'cancelled')
- AND completed_at < datetime('now', '-24 hours')
- `).run();
-
- return result.changes;
-}
-
-/**
- * Update a job's PID after spawning
- *
- * @param jobId - Job ID to update
- * @param pid - Process ID of the spawned worker
- */
-export function updateJobPid(jobId: string, pid: number): void {
- const db = getDb();
- db.query('UPDATE background_jobs SET pid = ? WHERE id = ?').run(pid, jobId);
-}
-
-/**
- * Clean up orphaned processes from previous sessions
- *
- * Checks if running/queued jobs have live processes. If the process
- * doesn't exist, marks the job as failed. This should be called on startup.
- *
- * @returns Number of orphaned jobs cleaned up
- */
-export function cleanupOrphanedProcesses(): number {
- const db = getDb();
-
- // Check if background_jobs table exists (defensive - migrations may not have run)
- const tableExists = db
- .query(`SELECT name FROM sqlite_master WHERE type='table' AND name='background_jobs'`)
- .get();
-
- if (!tableExists) {
- // Table doesn't exist yet - migrations will create it
- return 0;
- }
-
- const running = db
- .query(
- `
- SELECT id, pid FROM background_jobs
- WHERE status IN ('queued', 'running') AND pid IS NOT NULL
- `
- )
- .all() as { id: string; pid: number }[];
-
- let cleaned = 0;
- for (const job of running) {
- try {
- // Signal 0 checks if process exists without killing it
- process.kill(job.pid, 0);
- } catch {
- // Process doesn't exist - mark as failed
- updateJob(job.id, {
- status: 'failed',
- error: 'Process terminated unexpectedly (orphaned)',
- });
- cleaned++;
- }
- }
-
- return cleaned;
-}
diff --git a/src/jobs/worker-entry.ts b/src/jobs/worker-entry.ts
deleted file mode 100644
index fe131cc..0000000
--- a/src/jobs/worker-entry.ts
+++ /dev/null
@@ -1,57 +0,0 @@
-#!/usr/bin/env bun
-/**
- * Background Worker Entry Point
- *
- * Spawned as a subprocess to execute long-running jobs.
- *
- * Usage: bun run worker-entry.ts
- */
-
-import { runReindexJob, type ReindexInput } from './workers.js';
-
-async function main() {
- const args = process.argv.slice(2);
-
- if (args.length < 2) {
- console.error('Usage: worker-entry.ts <workerName> <jobId> [inputJson]');
- process.exit(1);
- }
-
- const workerName = args[0]!;
- const jobId = args[1]!;
- const inputJson = args[2];
- let input: Record<string, unknown> = {};
- if (inputJson) {
- try { input = JSON.parse(inputJson); } catch (e) {
- console.error(`Worker: invalid input JSON: ${e instanceof Error ? e.message : e}`);
- process.exit(1);
- }
- }
-
- try {
- switch (workerName) {
- case 'reindex':
- await runReindexJob(jobId, input as ReindexInput);
- break;
-
- default:
- console.error(`Unknown worker: ${workerName}`);
- process.exit(1);
- }
-
- } catch (err) {
- console.error(`Worker error: ${err instanceof Error ? err.message : err}`);
- // Update job status to failed if it hasn't been updated yet
- const { getJob, updateJob } = await import('./manager.js');
- const job = getJob(jobId);
- if (job && (job.status === 'queued' || job.status === 'running')) {
- updateJob(jobId, {
- status: 'failed',
- error: `Worker script error: ${err instanceof Error ? err.message : String(err)}`,
- });
- }
- process.exit(1);
- }
-}
-
-main();
diff --git a/src/jobs/workers.ts b/src/jobs/workers.ts
deleted file mode 100644
index 18c68c8..0000000
--- a/src/jobs/workers.ts
+++ /dev/null
@@ -1,182 +0,0 @@
-/**
- * Background Job Workers
- *
- * Worker functions for long-running operations.
- * These run asynchronously and update job status via the manager.
- */
-
-import { updateJob, updateJobPid, getJob } from './manager.js';
-import { indexRepository } from '../indexer/index.js';
-import { getRepoInfo } from '../tools/index-tools.js';
-import { toolRegistry } from '../tools/registry.js';
-
-// Track setTimeout handles for orphan cleanup
-const jobTimeouts = new Map<string, ReturnType<typeof setTimeout>>();
-
-/**
- * Clear timeout for a specific job (call on terminal states)
- */
-export function clearJobTimeout(jobId: string): boolean {
- const timeout = jobTimeouts.get(jobId);
- if (timeout) {
- clearTimeout(timeout);
- jobTimeouts.delete(jobId);
- return true;
- }
- return false;
-}
-
-/**
- * Clear all job timeouts (call on shutdown)
- */
-export function clearAllJobTimeouts(): number {
- let cleared = 0;
- for (const [jobId, timeout] of jobTimeouts) {
- clearTimeout(timeout);
- jobTimeouts.delete(jobId);
- cleared++;
- }
- return cleared;
-}
-
-// Expose map for testing only
-export const _jobTimeoutsForTesting = jobTimeouts;
-
-export interface ReindexInput {
- full?: boolean;
- repoPath?: string;
-}
-
-/**
- * Run the reindex operation as a background job
- *
- * @param jobId - Job ID to update progress
- * @param input - Reindex options
- */
-export async function runReindexJob(jobId: string, input: ReindexInput = {}): Promise {
- try {
- // Mark as running
- updateJob(jobId, {
- status: 'running',
- progressPercent: 0,
- progressMessage: 'Starting reindex...',
- });
-
- // Detect repository
- const repo = getRepoInfo(input.repoPath);
- if (!repo.success) {
- updateJob(jobId, {
- status: 'failed',
- error: repo.message,
- });
- return;
- }
-
- updateJob(jobId, {
- progressPercent: 10,
- progressMessage: 'Repository detected, scanning files...',
- });
-
- // Run the actual indexing with progress callback
- const result = await indexRepository({
- repoRoot: repo.root,
- repoId: repo.id,
- incremental: !input.full,
- onProgress: (message, percent) => {
- // Scale progress from 10-90%
- const scaledPercent = 10 + Math.round(percent * 0.8);
- updateJob(jobId, {
- progressPercent: scaledPercent,
- progressMessage: message,
- });
- },
- });
-
- // Update tool registry
- toolRegistry.setIndexReady(true);
-
- // Mark as completed
- updateJob(jobId, {
- status: 'completed',
- progressPercent: 100,
- progressMessage: 'Indexing complete',
- result: {
- success: true,
- filesScanned: result.filesScanned,
- filesIndexed: result.filesIndexed,
- filesSkipped: result.filesSkipped,
- symbolsFound: result.symbolsFound,
- importsFound: result.importsFound,
- duration: result.duration,
- message:
- result.filesIndexed > 0
- ? `Indexed ${result.filesIndexed} files, found ${result.symbolsFound} symbols`
- : `Index up to date (${result.filesSkipped} files unchanged)`,
- },
- });
- } catch (err) {
- // Mark as failed
- toolRegistry.setIndexReady(false);
- updateJob(jobId, {
- status: 'failed',
- error: err instanceof Error ? err.message : String(err),
- });
- }
-}
-
-// Maximum time a background job can run before being killed (30 minutes)
-const MAX_JOB_TIMEOUT_MS = 30 * 60 * 1000;
-
-/**
- * Spawn a background job in a subprocess
- * This allows the MCP call to return immediately
- *
- * @param workerName - Name of the worker function
- * @param jobId - Job ID
- * @param input - Worker input
- */
-export function spawnBackgroundJob(
- workerName: 'reindex',
- jobId: string,
- input: unknown
-): void {
- // Use Bun.spawn to run the worker in background
- // The worker script handles the actual execution
- const workerScript = new URL('./worker-entry.ts', import.meta.url).pathname;
-
- const proc = Bun.spawn(['bun', 'run', workerScript, workerName, jobId, JSON.stringify(input)], {
- stdio: ['ignore', 'ignore', 'ignore'],
- // Detach from parent process
- detached: true,
- });
-
- // Store PID for orphan cleanup tracking
- // Extract pid to primitive to avoid holding proc object in closure for 30 min
- const pid = proc.pid;
- if (pid) {
- updateJobPid(jobId, pid);
-
- // Add timeout monitoring to prevent zombie processes
- // This runs in the parent process and will kill the worker if it exceeds max time
- const timeoutId = setTimeout(() => {
- // Cleanup from map first (timeout fired = no longer needed)
- jobTimeouts.delete(jobId);
-
- const job = getJob(jobId);
- if (job && (job.status === 'running' || job.status === 'queued')) {
- try {
- process.kill(pid, 'SIGKILL');
- } catch {
- // Process already dead
- }
- updateJob(jobId, {
- status: 'failed',
- error: `Job timed out after ${MAX_JOB_TIMEOUT_MS / 60000} minutes`,
- });
- }
- }, MAX_JOB_TIMEOUT_MS);
-
- // Store timeout handle for orphan cleanup
- jobTimeouts.set(jobId, timeoutId);
- }
-}
diff --git a/src/server/handlers.ts b/src/server/handlers.ts
index a09f8fe..681000d 100644
--- a/src/server/handlers.ts
+++ b/src/server/handlers.ts
@@ -19,9 +19,6 @@ import {
} from '../tools/index.js';
import { matrixGetSolution } from '../tools/recall.js';
import { matrixWarn } from '../tools/warn.js';
-import { matrixPrompt } from '../tools/prompt.js';
-import { getJob, cancelJob, listJobs, createJob, spawnBackgroundJob } from '../jobs/index.js';
-import { matrixDreamer } from '../dreamer/index.js';
import {
validators,
validate,
@@ -32,7 +29,6 @@ import {
type GetSolutionInput,
type FailureInput,
type WarnInput,
- type PromptInput,
type FindDefinitionInput,
type FindCallersInput,
type ListExportsInput,
@@ -41,10 +37,6 @@ import {
type IndexStatusInput,
type ReindexInput,
type DoctorInput,
- type JobStatusInput,
- type JobCancelInput,
- type JobListInput,
- type DreamerInput,
type FindDeadCodeInput,
type FindCircularDepsInput,
} from '../tools/validation.js';
@@ -243,13 +235,6 @@ export async function handleToolCall(name: string, args: Record
return JSON.stringify(result);
}
- // Prompt Agent
- case 'matrix_prompt': {
- const input = validate(validators.prompt, args);
- const result = await matrixPrompt(input);
- return JSON.stringify(result);
- }
-
// Code Index Tools
case 'matrix_find_definition': {
const input = validate(validators.findDefinition, args);
@@ -301,19 +286,6 @@ export async function handleToolCall(name: string, args: Record
case 'matrix_reindex': {
const input = validate(validators.reindex, args);
-
- // Async mode: return job ID immediately for polling
- if (input.async) {
- const jobId = createJob('matrix_reindex', input);
- spawnBackgroundJob('reindex', jobId, input);
- return JSON.stringify({
- jobId,
- status: 'queued',
- message: 'Reindex started in background. Use matrix_job_status to poll for progress.',
- });
- }
-
- // Sync mode: wait for completion (existing behavior)
const result = await matrixReindex(input);
return JSON.stringify(result);
}
@@ -325,65 +297,6 @@ export async function handleToolCall(name: string, args: Record
return JSON.stringify(result);
}
- // ═══════════════════════════════════════════════════════════════
- // Background Job Tools
- // ═══════════════════════════════════════════════════════════════
-
- case 'matrix_job_status': {
- const input = validate(validators.jobStatus, args);
- const job = getJob(input.jobId);
- if (!job) {
- return JSON.stringify({ error: `Job not found: ${input.jobId}` });
- }
- return JSON.stringify({
- id: job.id,
- toolName: job.toolName,
- status: job.status,
- progress: job.progressPercent,
- message: job.progressMessage,
- result: job.result,
- error: job.error,
- createdAt: job.createdAt,
- startedAt: job.startedAt,
- completedAt: job.completedAt,
- });
- }
-
- case 'matrix_job_cancel': {
- const input = validate(validators.jobCancel, args);
- const cancelled = cancelJob(input.jobId);
- return JSON.stringify({
- jobId: input.jobId,
- cancelled,
- message: cancelled
- ? 'Job cancelled successfully'
- : 'Job could not be cancelled (already completed or not found)',
- });
- }
-
- case 'matrix_job_list': {
- const input = validate(validators.jobList, args);
- const jobs = listJobs(input.status, input.limit ?? 50);
- return JSON.stringify({
- jobs: jobs.map(job => ({
- id: job.id,
- toolName: job.toolName,
- status: job.status,
- progress: job.progressPercent,
- message: job.progressMessage,
- createdAt: job.createdAt,
- })),
- count: jobs.length,
- });
- }
-
- // Dreamer - Scheduled Task Automation
- case 'matrix_dreamer': {
- const input = validate(validators.dreamer, args);
- const result = await matrixDreamer(input);
- return JSON.stringify(result);
- }
-
default:
throw new Error(`Unknown tool: ${name}`);
}
diff --git a/src/tools/doctor/checks/core.ts b/src/tools/doctor/checks/core.ts
index a383f15..58dca19 100644
--- a/src/tools/doctor/checks/core.ts
+++ b/src/tools/doctor/checks/core.ts
@@ -115,8 +115,6 @@ function findMissingConfigSections(config: ReturnType): string
const missing: string[] = [];
if (!config.hooks) missing.push('hooks');
- if (!config.hooks?.promptAnalysis) missing.push('hooks.promptAnalysis');
- if (!config.hooks?.promptAnalysis?.memoryInjection) missing.push('hooks.promptAnalysis.memoryInjection');
if (!config.hooks?.permissions) missing.push('hooks.permissions');
if (!config.hooks?.sensitiveFiles) missing.push('hooks.sensitiveFiles');
if (!config.hooks?.stop) missing.push('hooks.stop');
@@ -126,10 +124,6 @@ function findMissingConfigSections(config: ReturnType): string
if (!config.indexing) missing.push('indexing');
if (!config.toolSearch) missing.push('toolSearch');
if (!config.delegation) missing.push('delegation');
- if (!config.dreamer) missing.push('dreamer');
- if (!config.dreamer?.worktree) missing.push('dreamer.worktree');
- if (!config.dreamer?.execution) missing.push('dreamer.execution');
-
return missing;
}
diff --git a/src/tools/doctor/checks/tables.ts b/src/tools/doctor/checks/tables.ts
index c254c79..e189b44 100644
--- a/src/tools/doctor/checks/tables.ts
+++ b/src/tools/doctor/checks/tables.ts
@@ -1,15 +1,9 @@
/**
* Database Table Checks
*
- * - Background Jobs
* - Hook Executions
- * - Dreamer Scheduler
*/
-import { existsSync, readdirSync } from 'fs';
-import { join } from 'path';
-import { homedir } from 'os';
-import { spawnSync } from 'child_process';
import { runMigrations } from '../../../db/migrate.js';
import { getDb } from '../../../db/index.js';
import type { DiagnosticCheck } from '../types.js';
@@ -27,69 +21,20 @@ function tableExists(tableName: string): boolean {
}
/**
- * Create a diagnostic check result for a missing table
+ * Check hook_executions table exists
*/
-function missingTableCheck(name: string, tableName: string): DiagnosticCheck {
- return {
- name,
- status: 'warn',
- message: tableName + ' table missing (run migrations)',
- autoFixable: true,
- fixAction: 'Run database migrations',
- };
-}
-
-/**
- * Check background_jobs table exists (v2.0+ feature)
- */
-export function checkBackgroundJobs(): DiagnosticCheck {
+export function checkHookExecutions(): DiagnosticCheck {
try {
- if (!tableExists('background_jobs')) {
- return missingTableCheck('Background Jobs', 'background_jobs');
- }
-
- const db = getDb();
- const orphaned = db.query(`
- SELECT COUNT(*) as count FROM background_jobs
- WHERE status = 'running' AND pid IS NOT NULL
- `).get() as { count: number };
-
- if (orphaned.count > 0) {
+ if (!tableExists('hook_executions')) {
return {
- name: 'Background Jobs',
+ name: 'Hook Executions',
status: 'warn',
- message: orphaned.count + ' orphaned running jobs found',
+ message: 'hook_executions table missing (run migrations)',
autoFixable: true,
- fixAction: 'Clean up orphaned jobs',
+ fixAction: 'Run database migrations',
};
}
- return {
- name: 'Background Jobs',
- status: 'pass',
- message: 'Table exists, no orphaned jobs',
- autoFixable: false,
- };
- } catch (err) {
- return {
- name: 'Background Jobs',
- status: 'warn',
- message: 'Check failed: ' + (err instanceof Error ? err.message : 'Unknown'),
- autoFixable: true,
- fixAction: 'Run database migrations',
- };
- }
-}
-
-/**
- * Check hook_executions table exists (v2.0+ feature)
- */
-export function checkHookExecutions(): DiagnosticCheck {
- try {
- if (!tableExists('hook_executions')) {
- return missingTableCheck('Hook Executions', 'hook_executions');
- }
-
return {
name: 'Hook Executions',
status: 'pass',
@@ -107,92 +52,16 @@ export function checkHookExecutions(): DiagnosticCheck {
}
}
-/**
- * Check Dreamer scheduler tables and registration
- */
-export function checkDreamer(): DiagnosticCheck {
- try {
- if (!tableExists('dreamer_tasks')) {
- return missingTableCheck('Dreamer Scheduler', 'dreamer_tasks');
- }
-
- if (!tableExists('dreamer_executions')) {
- return missingTableCheck('Dreamer Scheduler', 'dreamer_executions');
- }
-
- const db = getDb();
- const taskCount = db.query(`
- SELECT COUNT(*) as count FROM dreamer_tasks WHERE enabled = 1
- `).get() as { count: number };
-
- const platform = process.platform;
- let schedulerStatus = 'unknown';
-
- if (platform === 'darwin') {
- const launchAgentsDir = join(homedir(), 'Library', 'LaunchAgents');
- if (existsSync(launchAgentsDir)) {
- const files = readdirSync(launchAgentsDir);
- const dreamerPlists = files.filter(f => f.startsWith('com.claude.dreamer.'));
- schedulerStatus = dreamerPlists.length + ' launchd agents';
- } else {
- schedulerStatus = 'LaunchAgents dir missing';
- }
- } else if (platform === 'linux') {
- const result = spawnSync('crontab', ['-l'], { encoding: 'utf-8' });
- if (result.status === 0) {
- const lines = result.stdout.split('\n').filter(l => l.includes('claude-dreamer'));
- schedulerStatus = lines.length + ' cron entries';
- } else {
- schedulerStatus = 'crontab not accessible';
- }
- } else {
- schedulerStatus = 'unsupported platform';
- }
-
- return {
- name: 'Dreamer Scheduler',
- status: 'pass',
- message: taskCount.count + ' tasks enabled, ' + schedulerStatus,
- autoFixable: false,
- };
- } catch (err) {
- return {
- name: 'Dreamer Scheduler',
- status: 'warn',
- message: 'Check failed: ' + (err instanceof Error ? err.message : 'Unknown'),
- autoFixable: true,
- fixAction: 'Run database migrations',
- };
- }
-}
-
/**
* Auto-fix table checks
*/
export async function fixTableCheck(check: DiagnosticCheck): Promise<DiagnosticCheck> {
try {
switch (check.name) {
- case 'Background Jobs':
- if (check.message.includes('orphaned')) {
- const db = getDb();
- db.query(`
- UPDATE background_jobs
- SET status = 'failed', error = 'Orphaned job cleaned up by doctor', completed_at = datetime('now')
- WHERE status = 'running' AND pid IS NOT NULL
- `).run();
- return { ...check, status: 'pass', fixed: true, message: 'Orphaned jobs cleaned up' };
- }
- runMigrations();
- return { ...check, status: 'pass', fixed: true, message: 'Table created via migrations' };
-
case 'Hook Executions':
runMigrations();
return { ...check, status: 'pass', fixed: true, message: 'Table created via migrations' };
- case 'Dreamer Scheduler':
- runMigrations();
- return { ...check, status: 'pass', fixed: true, message: 'Tables created via migrations' };
-
default:
return check;
}
diff --git a/src/tools/doctor/index.ts b/src/tools/doctor/index.ts
index 3eac928..edad53d 100644
--- a/src/tools/doctor/index.ts
+++ b/src/tools/doctor/index.ts
@@ -6,13 +6,12 @@
*/
import { join } from 'path';
-import { homedir } from 'os';
import { getConfigPath } from '../../config/index.js';
import type { DiagnosticCheck, DoctorResult, DoctorInput } from './types.js';
// Import checks
import { checkMatrixDir, checkDatabase, checkConfig, checkConfigMigration, fixCoreCheck, MATRIX_DIR } from './checks/core.js';
-import { checkBackgroundJobs, checkHookExecutions, checkDreamer, fixTableCheck } from './checks/tables.js';
+import { checkHookExecutions, fixTableCheck } from './checks/tables.js';
import { checkHooks, checkSubagentHooks, checkDelegation, fixHooksCheck } from './checks/hooks.js';
import { checkIndex, checkRepoDetection, checkSkillsDirectory, checkFileSuggestion, fixFeatureCheck } from './checks/features.js';
import { generateIssueTemplate, GITHUB_REPO } from './issue-template.js';
@@ -30,7 +29,7 @@ async function attemptFix(check: DiagnosticCheck): Promise<DiagnosticCheck> {
// Route to appropriate fixer based on check category
const coreChecks = ['Matrix Directory', 'Database', 'Configuration', 'Config Migration'];
- const tableChecks = ['Background Jobs', 'Hook Executions', 'Dreamer Scheduler'];
+ const tableChecks = ['Hook Executions'];
const hookChecks = ['Subagent Hooks', 'Model Delegation'];
const featureChecks = ['Code Index', 'Skills Directory', 'File Suggestion'];
@@ -64,8 +63,7 @@ export async function matrixDoctor(input: DoctorInput = {}): Promise<DoctorResult> {
- const contexts: string[] = [];
-
- try {
- // Search for relevant solutions
- const recallResult = await matrixRecall({
- query: prompt.slice(0, 500),
- limit: 2,
- minScore: 0.4,
- });
-
- if (recallResult.solutions.length > 0) {
- for (const sol of recallResult.solutions) {
- const preview = sol.solution.slice(0, 150);
- contexts.push(`[Matrix Solution ${sol.id}] ${sol.problem.slice(0, 100)} → ${preview}`);
- }
- }
-
- // Search for related failures
- const failures = await searchFailures(prompt.slice(0, 300), 1);
- if (failures.length > 0) {
- const fail = failures[0];
- if (fail) {
- contexts.push(`[Matrix Warning] Avoid: ${fail.errorMessage.slice(0, 100)}`);
- }
- }
- } catch {
- // Matrix not available
- }
-
- return contexts;
-}
-
-/**
- * Optimize the prompt based on context
- */
-function optimizePrompt(
- rawPrompt: string,
- assumptions: Assumption[],
- memoryContext: string[]
-): string {
- let optimized = rawPrompt.trim();
-
- // Add context hints for Claude
- const hints: string[] = [];
-
- // Add relevant assumptions
- const highConfidenceAssumptions = assumptions.filter(a => a.confidence > 0.7);
- if (highConfidenceAssumptions.length > 0) {
- for (const a of highConfidenceAssumptions) {
- hints.push(`[Assumed: ${a.assumption}]`);
- }
- }
-
- // Reference relevant memory
- if (memoryContext.length > 0) {
- hints.push('[Matrix memory available - use matrix_recall for details]');
- }
-
- if (hints.length > 0) {
- optimized = `${optimized}\n\n---\nContext: ${hints.join(' ')}`;
- }
-
- return optimized;
-}
-
-/**
- * Main Prompt Agent function
- */
-export async function matrixPrompt(input: PromptInput): Promise {
- const { rawPrompt, mode = 'interactive', skipClarification = false } = input;
-
- // Check for shortcuts
- const shortcut = detectShortcut(rawPrompt);
- if (shortcut) {
- if (shortcut.action === 'abort') {
- return {
- optimizedPrompt: '',
- confidence: 0,
- assumptions: [],
- requiresApproval: false,
- shortcutDetected: shortcut,
- contextInjected: { fromClaudeMd: [], fromGit: [], fromMemory: [] },
- complexity: { score: 0, reasoning: 'Aborted by user' },
- };
- }
- if (shortcut.action === 'execute') {
- // Strip the shortcut and use remaining prompt, preserving original casing
- const cleanPrompt = rawPrompt.replace(new RegExp(`\\b${shortcut.trigger}\\b`, 'i'), '').trim();
- if (!cleanPrompt) {
- return {
- optimizedPrompt: '[Execute previous with best judgment]',
- confidence: 85,
- assumptions: [{ category: 'action', assumption: 'User wants to proceed with inferred plan', confidence: 0.9 }],
- requiresApproval: false,
- shortcutDetected: shortcut,
- contextInjected: { fromClaudeMd: [], fromGit: [], fromMemory: [] },
- complexity: { score: 5, reasoning: 'Quick execution' },
- };
- }
- }
- }
-
- // Load context in parallel
- const [claudeMdContext, gitContext, memoryContext, complexity] = await Promise.all([
- Promise.resolve(loadClaudeMdContext()),
- getGitContext(),
- getMemoryContext(rawPrompt),
- estimateComplexity(rawPrompt),
- ]);
-
- // Analyze ambiguity
- const ambiguity = skipClarification ? null : analyzeAmbiguity(rawPrompt);
-
- // Generate assumptions
- const assumptions = generateAssumptions(rawPrompt, claudeMdContext, gitContext);
-
- // Calculate confidence
- const confidence = calculateConfidence(rawPrompt, ambiguity, assumptions, memoryContext);
-
- // Determine if approval is needed based on mode and confidence
- const thresholds = {
- interactive: 80,
- auto: 90,
- spawn: 85,
- };
- const threshold = thresholds[mode];
- const requiresApproval = confidence < threshold;
-
- // Optimize the prompt
- const optimizedPrompt = optimizePrompt(rawPrompt, assumptions, memoryContext);
-
- // Decide on clarification
- let clarificationNeeded: ClarificationQuestion | undefined;
- if (ambiguity && confidence < 70 && mode !== 'auto') {
- clarificationNeeded = ambiguity;
- }
-
- return {
- optimizedPrompt,
- confidence,
- assumptions,
- requiresApproval,
- clarificationNeeded,
- shortcutDetected: shortcut || undefined,
- contextInjected: {
- fromClaudeMd: claudeMdContext,
- fromGit: gitContext,
- fromMemory: memoryContext,
- },
- complexity: {
- score: complexity.score,
- reasoning: complexity.reasoning,
- },
- };
-}
diff --git a/src/tools/schemas.ts b/src/tools/schemas.ts
index 77e18f6..b0522b8 100644
--- a/src/tools/schemas.ts
+++ b/src/tools/schemas.ts
@@ -15,7 +15,6 @@ import {
FailureInputSchema,
StatusInputSchema,
WarnInputSchema,
- PromptInputSchema,
FindDefinitionInputSchema,
FindCallersInputSchema,
ListExportsInputSchema,
@@ -24,10 +23,6 @@ import {
IndexStatusInputSchema,
ReindexInputSchema,
DoctorInputSchema,
- JobStatusInputSchema,
- JobCancelInputSchema,
- JobListInputSchema,
- DreamerInputSchema,
FindDeadCodeInputSchema,
FindCircularDepsInputSchema,
} from './validation.js';
@@ -121,13 +116,6 @@ export const TOOLS: Tool[] = [
// ═══════════════════════════════════════════════════════════════
// Utility Tools - Always visible
// ═══════════════════════════════════════════════════════════════
- {
- name: 'matrix_prompt',
- description: 'Analyze and optimize a prompt. Detects ambiguity, infers context, returns optimized prompt or asks clarification questions.',
- annotations: { readOnlyHint: true },
- inputSchema: toInputSchema(PromptInputSchema),
- _meta: { category: 'utility' as ToolCategory, visibility: 'always' as VisibilityRule },
- },
{
name: 'matrix_doctor',
description: 'Run diagnostics and auto-fix Matrix plugin issues. Checks database, config, hooks, and index health. If issues cannot be auto-fixed, provides GitHub issue template.',
@@ -207,39 +195,4 @@ export const TOOLS: Tool[] = [
_meta: { delegable: true, category: 'index' as ToolCategory, visibility: 'always' as VisibilityRule },
},
- // ═══════════════════════════════════════════════════════════════
- // Background Job Tools - Always visible
- // ═══════════════════════════════════════════════════════════════
- {
- name: 'matrix_job_status',
- description: 'Get the status and progress of a background job. Use to poll for completion after starting an async operation.',
- annotations: { readOnlyHint: true },
- inputSchema: toInputSchema(JobStatusInputSchema),
- _meta: { delegable: true, category: 'utility' as ToolCategory, visibility: 'always' as VisibilityRule },
- },
- {
- name: 'matrix_job_cancel',
- description: 'Cancel a running or queued background job.',
- annotations: { destructiveHint: true },
- inputSchema: toInputSchema(JobCancelInputSchema),
- _meta: { category: 'utility' as ToolCategory, visibility: 'always' as VisibilityRule },
- },
- {
- name: 'matrix_job_list',
- description: 'List background jobs. Filter by status to see queued, running, or completed jobs.',
- annotations: { readOnlyHint: true },
- inputSchema: toInputSchema(JobListInputSchema),
- _meta: { delegable: true, category: 'utility' as ToolCategory, visibility: 'always' as VisibilityRule },
- },
-
- // ═══════════════════════════════════════════════════════════════
- // Dreamer - Scheduled Task Automation (v2.1)
- // ═══════════════════════════════════════════════════════════════
- {
- name: 'matrix_dreamer',
- description: 'Schedule and manage automated Claude tasks. Actions: "add" (create task - RECURRING by default, confirm with user first!), "list" (show all tasks), "run" (execute immediately), "remove" (delete task), "status" (system health), "logs" (view output), "history" (execution records). IMPORTANT: Before using "add", ASK user if they want ONE-TIME or RECURRING - natural language like "at 1am" becomes daily cron by default. Uses native OS schedulers (launchd on macOS, crontab on Linux).',
- annotations: { idempotentHint: false },
- inputSchema: toInputSchema(DreamerInputSchema),
- _meta: { category: 'utility' as ToolCategory, visibility: 'always' as VisibilityRule },
- },
];
diff --git a/src/tools/validation.ts b/src/tools/validation.ts
index f4bf855..12d2cad 100644
--- a/src/tools/validation.ts
+++ b/src/tools/validation.ts
@@ -80,12 +80,6 @@ const SymbolKindEnum = Type.Union([
Type.Literal('property'),
]);
-const PromptModeEnum = Type.Union([
- Type.Literal('interactive'),
- Type.Literal('auto'),
- Type.Literal('spawn'),
-]);
-
// ============================================================================
// Analysis Enums
// ============================================================================
@@ -176,12 +170,6 @@ export const WarnInputSchema = Type.Object({
repoSpecific: Type.Optional(Type.Boolean({ description: 'Repo-specific warning (for add)' })),
});
-export const PromptInputSchema = Type.Object({
- rawPrompt: Type.String({ description: 'User prompt to analyze' }),
- mode: Type.Optional(PromptModeEnum),
- skipClarification: Type.Optional(Type.Boolean({ description: 'Skip clarification questions' })),
-});
-
// Shared description for repoPath - remove redundancy
const repoPathDesc = 'Repository path';
@@ -221,7 +209,6 @@ export const IndexStatusInputSchema = Type.Object({
export const ReindexInputSchema = Type.Object({
full: Type.Optional(Type.Boolean({ description: 'Force full reindex' })),
repoPath: Type.Optional(Type.String({ description: repoPathDesc })),
- async: Type.Optional(Type.Boolean({ description: 'Run in background, returns jobId for polling' })),
});
export const DoctorInputSchema = Type.Object({
@@ -242,80 +229,6 @@ export const FindCircularDepsInputSchema = Type.Object({
repoPath: Type.Optional(Type.String({ description: repoPathDesc })),
});
-// ============================================================================
-// Background Job Schemas
-// ============================================================================
-
-export const JobStatusInputSchema = Type.Object({
- jobId: Type.String({ description: 'Job ID to check' }),
-});
-
-export const JobCancelInputSchema = Type.Object({
- jobId: Type.String({ description: 'Job ID to cancel' }),
-});
-
-export const JobListInputSchema = Type.Object({
- status: Type.Optional(
- Type.Union([
- Type.Literal('queued'),
- Type.Literal('running'),
- Type.Literal('completed'),
- Type.Literal('failed'),
- Type.Literal('cancelled'),
- ], { description: 'Filter by status' })
- ),
- limit: Type.Optional(Type.Number({ minimum: 1, maximum: 100, description: 'Max jobs to return' })),
-});
-
-// ============================================================================
-// Dreamer (Scheduled Task Automation) Schemas
-// ============================================================================
-
-const DreamerActionEnum = Type.Union([
- Type.Literal('add'),
- Type.Literal('list'),
- Type.Literal('run'),
- Type.Literal('remove'),
- Type.Literal('status'),
- Type.Literal('logs'),
- Type.Literal('history'),
-]);
-
-const WorktreeConfigSchema = Type.Object({
- enabled: Type.Optional(Type.Boolean({ description: 'Run in isolated git worktree' })),
- basePath: Type.Optional(Type.String({ description: 'Worktree base directory' })),
- branchPrefix: Type.Optional(Type.String({ description: 'Branch name prefix (default: claude-task/)' })),
- remoteName: Type.Optional(Type.String({ description: 'Remote for pushing (default: origin)' })),
-});
-
-export const DreamerInputSchema = Type.Object({
- action: DreamerActionEnum,
-
- // For 'add' action
- name: Type.Optional(Type.String({ description: 'Task name', maxLength: 100 })),
- schedule: Type.Optional(Type.String({ description: 'Cron expression or natural language (e.g., "every day at 9am")' })),
- command: Type.Optional(Type.String({ description: 'Claude prompt or /command to execute' })),
- description: Type.Optional(Type.String({ description: 'Task description', maxLength: 500 })),
- workingDirectory: Type.Optional(Type.String({ description: 'Working directory' })),
- timeout: Type.Optional(Type.Number({ description: 'Timeout in seconds (default: 300)' })),
- timezone: Type.Optional(Type.String({ description: 'IANA timezone or "local"' })),
- tags: Type.Optional(Type.Array(Type.String(), { description: 'Tags for organization' })),
- skipPermissions: Type.Optional(Type.Boolean({ description: 'Run with --dangerously-skip-permissions' })),
- worktree: Type.Optional(WorktreeConfigSchema),
- env: Type.Optional(Type.Record(Type.String(), Type.String(), { description: 'Environment variables' })),
-
- // For 'run', 'remove', 'logs' actions
- taskId: Type.Optional(Type.String({ description: 'Task ID' })),
-
- // For 'list' and 'history' actions
- limit: Type.Optional(Type.Number({ description: 'Max results to return' })),
- tag: Type.Optional(Type.String({ description: 'Filter by tag' })),
-
- // For 'logs' action
- lines: Type.Optional(Type.Number({ description: 'Number of log lines (default: 50)' })),
- stream: Type.Optional(Type.Union([Type.Literal('stdout'), Type.Literal('stderr'), Type.Literal('both')], { description: 'Log stream to read' })),
-});
-
// ============================================================================
// Type Exports (inferred from schemas)
// ============================================================================
@@ -327,7 +240,6 @@ export type GetSolutionInput = Static<typeof GetSolutionInputSchema>;
export type FailureInput = Static<typeof FailureInputSchema>;
export type StatusInput = Static<typeof StatusInputSchema>;
export type WarnInput = Static<typeof WarnInputSchema>;
-export type PromptInput = Static<typeof PromptInputSchema>;
export type FindDefinitionInput = Static<typeof FindDefinitionInputSchema>;
export type FindCallersInput = Static<typeof FindCallersInputSchema>;
export type ListExportsInput = Static<typeof ListExportsInputSchema>;
@@ -336,10 +248,6 @@ export type GetImportsInput = Static<typeof GetImportsInputSchema>;
export type IndexStatusInput = Static<typeof IndexStatusInputSchema>;
export type ReindexInput = Static<typeof ReindexInputSchema>;
export type DoctorInput = Static<typeof DoctorInputSchema>;
-export type JobStatusInput = Static<typeof JobStatusInputSchema>;
-export type JobCancelInput = Static<typeof JobCancelInputSchema>;
-export type JobListInput = Static<typeof JobListInputSchema>;
-export type DreamerInput = Static<typeof DreamerInputSchema>;
export type FindDeadCodeInput = Static<typeof FindDeadCodeInputSchema>;
export type FindCircularDepsInput = Static<typeof FindCircularDepsInputSchema>;
@@ -355,7 +263,6 @@ export const validators = {
failure: TypeCompiler.Compile(FailureInputSchema),
status: TypeCompiler.Compile(StatusInputSchema),
warn: TypeCompiler.Compile(WarnInputSchema),
- prompt: TypeCompiler.Compile(PromptInputSchema),
findDefinition: TypeCompiler.Compile(FindDefinitionInputSchema),
findCallers: TypeCompiler.Compile(FindCallersInputSchema),
listExports: TypeCompiler.Compile(ListExportsInputSchema),
@@ -364,10 +271,6 @@ export const validators = {
indexStatus: TypeCompiler.Compile(IndexStatusInputSchema),
reindex: TypeCompiler.Compile(ReindexInputSchema),
doctor: TypeCompiler.Compile(DoctorInputSchema),
- jobStatus: TypeCompiler.Compile(JobStatusInputSchema),
- jobCancel: TypeCompiler.Compile(JobCancelInputSchema),
- jobList: TypeCompiler.Compile(JobListInputSchema),
- dreamer: TypeCompiler.Compile(DreamerInputSchema),
findDeadCode: TypeCompiler.Compile(FindDeadCodeInputSchema),
findCircularDeps: TypeCompiler.Compile(FindCircularDepsInputSchema),
} as const;
diff --git a/src/types/tools.ts b/src/types/tools.ts
index 4612cfe..833b98b 100644
--- a/src/types/tools.ts
+++ b/src/types/tools.ts
@@ -12,7 +12,6 @@ export type {
RewardInput,
FailureInput,
WarnInput, // v2.0: Consolidated warn tool input
- PromptInput,
FindDefinitionInput,
FindCallersInput, // v2.0: New callers tool
ListExportsInput,