diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
new file mode 100644
index 000000000..afb4acdf6
--- /dev/null
+++ b/CONTRIBUTING.md
@@ -0,0 +1,162 @@
+# Contributing to Paseo
+
+Thanks for taking the time to contribute.
+
+## Before you start
+
+Please read these first:
+
+- [README.md](README.md)
+- [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md)
+- [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md)
+- [docs/CODING_STANDARDS.md](docs/CODING_STANDARDS.md)
+- [docs/TESTING.md](docs/TESTING.md)
+- [CLAUDE.md](CLAUDE.md)
+
+## What is most helpful
+
+The highest-signal contributions right now are:
+
+- bug fixes
+- regression fixes
+- docs improvements
+- packaging / platform fixes
+- focused UX improvements that fit the existing product direction
+- tests that lock down important behavior
+
+## Discuss large changes first
+
+If you want to add a major feature, change core UX, introduce a new surface, or bring in a new architectural concept, please open an issue or start a conversation first.
+
+Even if the code is good, large unsolicited PRs are unlikely to be merged if they set product direction without prior alignment.
+
+In short:
+
+- small, focused PRs: great
+- large product-shaping PRs without discussion: probably not
+
+## Scope expectations
+
+Please keep PRs narrow.
+
+Good:
+
+- fix one bug
+- improve one flow
+- add one focused panel or command
+- tighten one piece of UI
+
+Bad:
+
+- combine multiple product ideas in one PR
+- bundle unrelated refactors with a feature
+- sneak in roadmap decisions
+
+If a contribution contains multiple ideas, split it up.
+
+## Product fit matters
+
+Paseo is an opinionated product.
+
+When reviewing contributions, the bar is not just:
+
+- is this useful?
+- is this well implemented?
+
+It is also:
+
+- does this fit Paseo?
+- does this preserve the product's current direction?
+- does this increase long-term complexity in a way that is worth it?
+
+## Development setup
+
+### Prerequisites
+
+- Node.js matching `.tool-versions`
+- npm workspaces
+
+### Start local development
+
+```bash
+npm run dev
+```
+
+Useful commands:
+
+```bash
+npm run dev:server
+npm run dev:app
+npm run dev:desktop
+npm run dev:website
+npm run cli -- ls -a -g
+```
+
+Read [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md) for build-sync gotchas, local state, ports, and daemon details.
+
+## Testing and verification
+
+At minimum, run the checks relevant to your change.
+
+Common checks:
+
+```bash
+npm run typecheck
+npm run test --workspaces --if-present
+```
+
+Important rules:
+
+- always run `npm run typecheck` after changes
+- tests should be deterministic
+- prefer real dependencies over mocks when possible
+- do not make breaking WebSocket / protocol changes
+- app and daemon versions in the wild lag each other, so compatibility matters
+
+If you touch protocol or shared client/server behavior, read the compatibility notes in [CLAUDE.md](CLAUDE.md).
+
+## Coding standards
+
+Paseo has explicit standards. Please follow them.
+
+Highlights:
+
+- keep complexity low
+- avoid "while I'm at it" cleanup
+- no `any`
+- prefer object parameters over positional argument lists
+- preserve behavior unless the change is explicitly meant to change behavior
+- collocate tests with implementation
+
+The full guide lives in [docs/CODING_STANDARDS.md](docs/CODING_STANDARDS.md).
+
+## PR checklist
+
+Before opening a PR, make sure:
+
+- the change is focused
+- the PR description explains what changed and why
+- relevant docs were updated if needed
+- typecheck passes
+- tests pass, or you clearly explain what could not be run
+- the change does not accidentally bundle unrelated product ideas
+
+## Communication
+
+If you are unsure whether something fits, ask first.
+
+That is especially true for:
+
+- new core UX
+- naming / terminology changes
+- new extension points
+- new orchestration models
+- anything that would be hard to remove later
+
+Early alignment is much better than a large PR that is expensive for everyone to unwind.
+
+## Forks are fine
+
+If you want to explore a different product direction, a fork is completely fine.
+
+Paseo is open source on purpose. Not every idea needs to land in the main repo to be valuable.
diff --git a/docs/DEVELOPMENT.md b/docs/DEVELOPMENT.md
index 30f617ccc..9fa18db0d 100644
--- a/docs/DEVELOPMENT.md
+++ b/docs/DEVELOPMENT.md
@@ -35,6 +35,25 @@
 In worktrees or with `npm run dev`, ports may differ. Never assume defaults.
 
 Check `$PASEO_HOME/daemon.log` for trace-level logs.
+### Database queries
+
+Run arbitrary SQL against the SQLite database:
+
+```bash
+# Show table row counts
+npm run db:query
+
+# Run any SQL
+npm run db:query -- "SELECT agent_id, title, last_status FROM agent_snapshots"
+npm run db:query -- "SELECT agent_id, seq, item_kind FROM agent_timeline_rows ORDER BY committed_at DESC LIMIT 10"
+
+# Point at a specific DB directory
+npm run db:query -- --db /path/to/db "SELECT ..."
+```
+
+The script auto-detects the running dev daemon's database from `/tmp/paseo-dev.*`, `PASEO_HOME`, or `~/.paseo/db`.
+Pass either a DB directory or a `paseo.sqlite` file to `--db`. The script opens the database directly in read-only mode.
+
 ## Build sync gotchas
 
 ### Relay → Daemon
diff --git a/docs/STORAGE_REVAMP_PLAN.md b/docs/STORAGE_REVAMP_PLAN.md
new file mode 100644
index 000000000..48090faeb
--- /dev/null
+++ b/docs/STORAGE_REVAMP_PLAN.md
@@ -0,0 +1,217 @@
+# Storage Revamp Plan
+
+Status: active rollout, phases 1 and 2 complete
+
+This document now tracks the storage revamp as it exists today, not as a speculative design exercise.
+The DB foundation and the project/workspace identity cutover have landed. What remains is the explicit
+creation/archive surface cleanup, timeline durability cutover, and final removal of legacy paths.
+
+## Goals
+
+- make structured records durable in Drizzle + SQLite
+- make projects and workspaces explicit first-class records
+- stop deriving project/workspace identity from agent `cwd`
+- keep agent snapshot persistence behind clear ownership
+- move committed timeline history to storage-owned rows
+- remove legacy JSON and in-memory authority once the DB path is proven
+
+## Out of scope
+
+- moving config, keypairs, push tokens, or server identity into the DB
+- persisting raw provider deltas or transport-only chunk streams
+- designing a hosted/remote database story beyond keeping the schema portable
+- durable reasoning history unless product explicitly asks for it later
+
+## Current state
+
+The storage revamp is no longer hypothetical.
+
+Completed:
+
+- Drizzle + SQLite database bootstrap is in place
+- `projects`, `workspaces`, and `agent_snapshots` use integer primary keys
+- `workspaces.project_id` and `agent_snapshots.workspace_id` cascade on delete
+- `agent_snapshots.workspace_id` is `NOT NULL`
+- legacy JSON import feeds the DB-backed structured records
+- project/workspace records use explicit `directory` fields instead of path-as-identity
+- session read paths now use persisted workspace/project rows instead of cwd/git derivation
+- `workspace-reconciliation-service.ts` is deleted
+- `workspace-registry-bootstrap.ts` is deleted
+- `workspace-registry-model.ts` is reduced to `normalizeWorkspaceId`
+
+Still pending:
+
+- explicit `create_project` / `create_workspace` API cleanup
+- final archive cascade behavior for descendants and live agents
+- committed timeline storage cutover
+- removal of remaining legacy JSON and in-memory committed-history authority
+
+## Converged decisions
+
+### Structured record authority
+
+Projects, workspaces, and agent snapshots are DB-backed structured records.
+The server should not recreate project/workspace identity from:
+
+- git remotes
+- worktree main-repo roots
+- normalized cwd strings
+
+Temporary exception:
+
+- agent creation may still find-or-create a workspace by directory if the UI has not yet provided
+  `workspaceId` explicitly
+
+That fallback is transitional and should be deleted once the client always sends the workspace id.
+
+### Storage seams
+
+The useful seams remain concrete and domain-shaped:
+
+- `ProjectRegistry`
+- `WorkspaceRegistry`
+- `AgentSnapshotStore`
+- `AgentTimelineStore`
+
+There is no reason to reintroduce a reconciliation service layer for project/workspace identity.
+
+### Timeline contract
+
+The long-term timeline contract remains:
+
+- committed rows are durable, canonical history
+- provisional live updates are transient subscription state
+- committed history is fetched by seq
+- provider history replay is not the durability mechanism
+
+The structured-record cutover landed before the timeline cutover so that timeline rows can rely on
+stable DB-backed agent and workspace identity.
+
+## Remaining phases
+
+### Phase 3: Explicit creation and archive cleanup
+
+Goal:
+Remove the last transitional write paths that still infer state from directories.
+
+Required work:
+
+- add explicit `create_project` handling
+- add explicit `create_workspace` handling
+- make agent creation require `workspaceId` once the UI is ready
+- finish archive semantics for workspaces/projects and any descendant agent state
+- remove the temporary find-or-create-by-directory fallback from agent creation
+
+Exit gate:
+
+- project/workspace creation is explicit end to end
+- no normal creation path infers identity from cwd or git metadata
+- archive flows behave consistently for structured records and live runtime state
+
+### Phase 4: Timeline storage cutover
+
+Goal:
+Make committed history durable and storage-owned.
+
+Required work:
+
+- make `AgentTimelineStore` authoritative for committed history
+- write one committed row per finalized logical item
+- support tail, before-seq, and after-seq queries from storage
+- stop treating provider history hydration as the normal refresh/load path
+- keep provisional live updates in memory only
+
+Exit gate:
+
+- committed history survives daemon restart
+- reconnect uses committed catch-up plus future live events without gaps or duplicates
+- unloaded agents can serve committed history from storage alone
+
+### Phase 5: Legacy cleanup
+
+Goal:
+Remove compatibility paths after the DB-backed model is fully authoritative.
+
+Required work:
+
+- remove legacy JSON authority for structured records
+- remove in-memory committed-history ownership
+- remove provider-history rehydrate compatibility paths
+- trim dead protocol and reducer logic from the pre-storage model
+- update architecture docs to match the final model
+
+Exit gate:
+
+- there is one durable storage path for structured records
+- there is one durable storage path for committed timeline history
+- the runtime no longer depends on the removed JSON/in-memory model
+
+## Data model summary
+
+### Projects
+
+- integer primary key
+- `directory` is unique
+- `display_name`
+- `kind`: `git | directory`
+- optional `git_remote`
+- timestamps and archive state
+
+### Workspaces
+
+- integer primary key
+- belongs to a project by `project_id`
+- `directory` is unique
+- `display_name`
+- `kind`: `checkout | worktree`
+- timestamps and archive state
+
+### Agent snapshots
+
+- `agent_id` remains the primary key
+- belongs to a workspace by integer `workspace_id`
+- `workspace_id` is required
+- timestamps, lifecycle state, persistence metadata, attention metadata, archive state
+
+### Timeline rows
+
+Target shape once Phase 4 lands:
+
+- `agent_id`
+- committed `seq`
+- committed timestamp
+- canonical finalized item payload
+
+Not part of durable history:
+
+- raw streaming chunks
+- provisional assistant text
+- provisional reasoning text
+
+## Verification requirements
+
+Every remaining phase should keep the same bar:
+
+- `npm run typecheck`
+- targeted tests for the touched storage/session/runtime paths
+- migration/import coverage when storage authority changes
+- reconnect and catch-up scenario coverage when timeline behavior changes
+
+At minimum, timeline cutover must explicitly prove:
+
+- `fetch-after-seq`
+- `fetch-before-seq`
+- restart durability
+- no-gap/no-duplicate reconnect behavior
+
+## Main risks
+
+- timeline work reintroduces provider-history replay as hidden authority
+- archive behavior diverges between stored records and live in-memory agents
+- explicit creation work leaves the transitional cwd fallback in place too long
+- cleanup stalls after compatibility paths stop being exercised
+
+## Rule of thumb
+
+If a new change needs to ask "what can we infer from this cwd?" for project or workspace identity,
+it is probably moving in the wrong direction.
diff --git a/nix/package.nix b/nix/package.nix
index f931cb1cf..4b255c53c 100644
--- a/nix/package.nix
+++ b/nix/package.nix
@@ -42,7 +42,7 @@ buildNpmPackage rec {
 
   # To update: run `nix build` with lib.fakeHash, copy the `got:` hash.
   # CI auto-updates this when package-lock.json changes (see .github/workflows/).
-  npmDepsHash = "sha256-QFe3aBiYlt6rhYmMmBzwjph/gDHe90kZYrlin3qIpRo=";
+  npmDepsHash = "sha256-K+pdRoYUACVAURKRWKNMlRh4mQIzwAGWLZUk2pg4FJg=";
 
   # Prevent onnxruntime-node's install script from running during automatic
   # npm rebuild (it tries to download from api.nuget.org, which fails in the sandbox).
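The `npmDepsHash` comment in `nix/package.nix` describes the manual update flow in one line. A rough sketch of that workflow, for reviewers unfamiliar with it (the `grep` filter is illustrative, and the exact mismatch message can vary by Nix version):

```shell
# Sketch of the manual npmDepsHash refresh described in nix/package.nix.
# 1. In nix/package.nix, temporarily set:
#      npmDepsHash = lib.fakeHash;
# 2. Build; the npm-deps fetch fails with a hash mismatch that prints the real value:
nix build 2>&1 | grep 'got:'
# 3. Copy the printed sha256-... value back into npmDepsHash and rebuild.
```

CI performs the same update automatically when `package-lock.json` changes, so the manual flow is only needed for local verification.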
diff --git a/package-lock.json b/package-lock.json index 8e69550f1..5398e9881 100644 --- a/package-lock.json +++ b/package-lock.json @@ -21,7 +21,10 @@ ], "dependencies": { "@anthropic-ai/claude-agent-sdk": "^0.2.11", - "@modelcontextprotocol/sdk": "^1.27.1" + "@modelcontextprotocol/sdk": "^1.27.1", + "expo": "~54.0.33", + "react": "19.1.0", + "react-native": "0.81.5" }, "devDependencies": { "@biomejs/biome": "^2.4.8", @@ -3069,6 +3072,13 @@ "react": ">=16.8.0" } }, + "node_modules/@drizzle-team/brocli": { + "version": "0.10.2", + "resolved": "https://registry.npmjs.org/@drizzle-team/brocli/-/brocli-0.10.2.tgz", + "integrity": "sha512-z33Il7l5dKjUgGULTqBsQBQwckHh5AbIuxhdsIxDDiZAzBOrZO6q9ogcWC65kU382AfynTfgNumVcNIjuIua6w==", + "dev": true, + "license": "Apache-2.0" + }, "node_modules/@egjs/hammerjs": { "version": "2.0.17", "resolved": "https://registry.npmjs.org/@egjs/hammerjs/-/hammerjs-2.0.17.tgz", @@ -3526,6 +3536,442 @@ "dev": true, "license": "MIT" }, + "node_modules/@esbuild-kit/core-utils": { + "version": "3.3.2", + "resolved": "https://registry.npmjs.org/@esbuild-kit/core-utils/-/core-utils-3.3.2.tgz", + "integrity": "sha512-sPRAnw9CdSsRmEtnsl2WXWdyquogVpB3yZ3dgwJfe8zrOzTsV7cJvmwrKVa+0ma5BoiGJ+BoqkMvawbayKUsqQ==", + "deprecated": "Merged into tsx: https://tsx.is", + "dev": true, + "license": "MIT", + "dependencies": { + "esbuild": "~0.18.20", + "source-map-support": "^0.5.21" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/android-arm": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.18.20.tgz", + "integrity": "sha512-fyi7TDI/ijKKNZTUJAQqiG5T7YjJXgnzkURqmGj13C6dCqckZBLdl4h7bkhHt/t0WP+zO9/zwroDvANaOqO5Sw==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/android-arm64": { + "version": "0.18.20", + "resolved": 
"https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.18.20.tgz", + "integrity": "sha512-Nz4rJcchGDtENV0eMKUNa6L12zz2zBDXuhj/Vjh18zGqB44Bi7MBMSXjgunJgjRhCmKOjnPuZp4Mb6OKqtMHLQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/android-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.18.20.tgz", + "integrity": "sha512-8GDdlePJA8D6zlZYJV/jnrRAi6rOiNaCC/JclcXpB+KIuvfBN4owLtgzY2bsxnx666XjJx2kDPUmnTtR8qKQUg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/darwin-arm64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.18.20.tgz", + "integrity": "sha512-bxRHW5kHU38zS2lPTPOyuyTm+S+eobPUnTNkdJEfAddYgEcll4xkT8DB9d2008DtTbl7uJag2HuE5NZAZgnNEA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/darwin-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.18.20.tgz", + "integrity": "sha512-pc5gxlMDxzm513qPGbCbDukOdsGtKhfxD1zJKXjCCcU7ju50O7MeAZ8c4krSJcOIJGFR+qx21yMMVYwiQvyTyQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/freebsd-arm64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.18.20.tgz", + "integrity": 
"sha512-yqDQHy4QHevpMAaxhhIwYPMv1NECwOvIpGCZkECn8w2WFHXjEwrBn3CeNIYsibZ/iZEUemj++M26W3cNR5h+Tw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/freebsd-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.18.20.tgz", + "integrity": "sha512-tgWRPPuQsd3RmBZwarGVHZQvtzfEBOreNuxEMKFcd5DaDn2PbBxfwLcj4+aenoh7ctXcbXmOQIn8HI6mCSw5MQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-arm": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.18.20.tgz", + "integrity": "sha512-/5bHkMWnq1EgKr1V+Ybz3s1hWXok7mDFUMQ4cG10AfW3wL02PSZi5kFpYKrptDsgb2WAJIvRcDm+qIvXf/apvg==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-arm64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.18.20.tgz", + "integrity": "sha512-2YbscF+UL7SQAVIpnWvYwM+3LskyDmPhe31pE7/aoTMFKKzIc9lLbyGUpmmb8a8AixOL61sQ/mFh3jEjHYFvdA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-ia32": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.18.20.tgz", + "integrity": "sha512-P4etWwq6IsReT0E1KHU40bOnzMHoH73aXp96Fs8TIT6z9Hu8G6+0SHSw9i2isWrD2nbx2qo5yUqACgdfVGx7TA==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + 
"linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-loong64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.18.20.tgz", + "integrity": "sha512-nXW8nqBTrOpDLPgPY9uV+/1DjxoQ7DoB2N8eocyq8I9XuqJ7BiAMDMf9n1xZM9TgW0J8zrquIb/A7s3BJv7rjg==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-mips64el": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.18.20.tgz", + "integrity": "sha512-d5NeaXZcHp8PzYy5VnXV3VSd2D328Zb+9dEq5HE6bw6+N86JVPExrA6O68OPwobntbNJ0pzCpUFZTo3w0GyetQ==", + "cpu": [ + "mips64el" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-ppc64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.18.20.tgz", + "integrity": "sha512-WHPyeScRNcmANnLQkq6AfyXRFr5D6N2sKgkFo2FqguP44Nw2eyDlbTdZwd9GYk98DZG9QItIiTlFLHJHjxP3FA==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-riscv64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.18.20.tgz", + "integrity": "sha512-WSxo6h5ecI5XH34KC7w5veNnKkju3zBRLEQNY7mv5mtBmrP/MjNBCAlsM2u5hDBlS3NGcTQpoBvRzqBcRtpq1A==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-s390x": { + "version": "0.18.20", 
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.18.20.tgz", + "integrity": "sha512-+8231GMs3mAEth6Ja1iK0a1sQ3ohfcpzpRLH8uuc5/KVDFneH6jtAJLFGafpzpMRO6DzJ6AvXKze9LfFMrIHVQ==", + "cpu": [ + "s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/linux-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.18.20.tgz", + "integrity": "sha512-UYqiqemphJcNsFEskc73jQ7B9jgwjWrSayxawS6UVFZGWrAAtkzjxSqnoclCXxWtfwLdzU+vTpcNYhpn43uP1w==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/netbsd-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.18.20.tgz", + "integrity": "sha512-iO1c++VP6xUBUmltHZoMtCUdPlnPGdBom6IrO4gyKPFFVBKioIImVooR5I83nTew5UOYrk3gIJhbZh8X44y06A==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/openbsd-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.18.20.tgz", + "integrity": "sha512-e5e4YSsuQfX4cxcygw/UCPIEP6wbIL+se3sxPdCiMbFLBWu0eiZOJ7WoD+ptCLrmjZBK1Wk7I6D/I3NglUGOxg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/sunos-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.18.20.tgz", + "integrity": 
"sha512-kDbFRFp0YpTQVVrqUd5FTYmWo45zGaXe0X8E1G/LKFC0v8x0vWrhOWSLITcCn63lmZIxfOMXtCfti/RxN/0wnQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "sunos" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/win32-arm64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.18.20.tgz", + "integrity": "sha512-ddYFR6ItYgoaq4v4JmQQaAI5s7npztfV4Ag6NrhiaW0RrnOXqBkgwZLofVTlq1daVTQNhtI5oieTvkRPfZrePg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/win32-ia32": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.18.20.tgz", + "integrity": "sha512-Wv7QBi3ID/rROT08SABTS7eV4hX26sVduqDOTe1MvGMjNd3EjOz4b7zeexIR62GTIEKrfJXKL9LFxTYgkyeu7g==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/@esbuild/win32-x64": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.18.20.tgz", + "integrity": "sha512-kTdfRcSiDfQca/y9QIkng02avJ+NCaQvrMejlsB3RRv5sE9rRoeBPISaZpKxHELzRxZyLvNts1P27W3wV+8geQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=12" + } + }, + "node_modules/@esbuild-kit/core-utils/node_modules/esbuild": { + "version": "0.18.20", + "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.18.20.tgz", + "integrity": "sha512-ceqxoedUrcayh7Y7ZX6NdbbDzGROiyVBgC4PriJThBKSVPWnnFHZAkfI1lJT8QFkOwH4qOS2SJkS4wvpGl8BpA==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "bin": { + "esbuild": "bin/esbuild" + }, + "engines": { + 
"node": ">=12" + }, + "optionalDependencies": { + "@esbuild/android-arm": "0.18.20", + "@esbuild/android-arm64": "0.18.20", + "@esbuild/android-x64": "0.18.20", + "@esbuild/darwin-arm64": "0.18.20", + "@esbuild/darwin-x64": "0.18.20", + "@esbuild/freebsd-arm64": "0.18.20", + "@esbuild/freebsd-x64": "0.18.20", + "@esbuild/linux-arm": "0.18.20", + "@esbuild/linux-arm64": "0.18.20", + "@esbuild/linux-ia32": "0.18.20", + "@esbuild/linux-loong64": "0.18.20", + "@esbuild/linux-mips64el": "0.18.20", + "@esbuild/linux-ppc64": "0.18.20", + "@esbuild/linux-riscv64": "0.18.20", + "@esbuild/linux-s390x": "0.18.20", + "@esbuild/linux-x64": "0.18.20", + "@esbuild/netbsd-x64": "0.18.20", + "@esbuild/openbsd-x64": "0.18.20", + "@esbuild/sunos-x64": "0.18.20", + "@esbuild/win32-arm64": "0.18.20", + "@esbuild/win32-ia32": "0.18.20", + "@esbuild/win32-x64": "0.18.20" + } + }, + "node_modules/@esbuild-kit/esm-loader": { + "version": "2.6.5", + "resolved": "https://registry.npmjs.org/@esbuild-kit/esm-loader/-/esm-loader-2.6.5.tgz", + "integrity": "sha512-FxEMIkJKnodyA1OaCUoEvbYRkoZlLZ4d/eXFu9Fh8CbBBgP5EmZxrfTRyN0qpXZ4vOvqnE5YdRdcrmUUXuU+dA==", + "deprecated": "Merged into tsx: https://tsx.is", + "dev": true, + "license": "MIT", + "dependencies": { + "@esbuild-kit/core-utils": "^3.3.2", + "get-tsconfig": "^4.7.0" + } + }, "node_modules/@esbuild/aix-ppc64": { "version": "0.27.3", "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.27.3.tgz", @@ -9547,9 +9993,9 @@ } }, "node_modules/@react-native/assets-registry": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/assets-registry/-/assets-registry-0.81.6.tgz", - "integrity": "sha512-nNlJ7mdXFoq/7LMG3eJIncqjgXkpDJak3xO8Lb4yQmFI3XVI1nupPRjlYRY0ham1gLE0F/AWvKFChsKUfF5lOQ==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/assets-registry/-/assets-registry-0.81.5.tgz", + "integrity": 
"sha512-705B6x/5Kxm1RKRvSv0ADYWm5JOnoiQ1ufW7h8uu2E6G9Of/eE6hP/Ivw3U5jI16ERqZxiKQwk34VJbB0niX9w==", "license": "MIT", "engines": { "node": ">= 20.19.4" @@ -9701,12 +10147,12 @@ } }, "node_modules/@react-native/community-cli-plugin": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/community-cli-plugin/-/community-cli-plugin-0.81.6.tgz", - "integrity": "sha512-oTwIheF4TU7NkfoHxwSQAKtIDx4SQEs2xufgM3gguY7WkpnhGa/BYA/A+hdHXfqEKJFKlHcXQu4BrV/7Sv1fhw==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/community-cli-plugin/-/community-cli-plugin-0.81.5.tgz", + "integrity": "sha512-yWRlmEOtcyvSZ4+OvqPabt+NS36vg0K/WADTQLhrYrm9qdZSuXmq8PmdJWz/68wAqKQ+4KTILiq2kjRQwnyhQw==", "license": "MIT", "dependencies": { - "@react-native/dev-middleware": "0.81.6", + "@react-native/dev-middleware": "0.81.5", "debug": "^4.4.0", "invariant": "^2.2.4", "metro": "^0.83.1", @@ -9730,46 +10176,6 @@ } } }, - "node_modules/@react-native/community-cli-plugin/node_modules/@react-native/debugger-frontend": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/debugger-frontend/-/debugger-frontend-0.81.6.tgz", - "integrity": "sha512-aGw28yzbtm25GQuuxNeVAT72tLuGoH0yh79uYOIZkvjI+5x1NjZyPrgiLZ2LlZi5dJdxfbz30p1zUcHvcAzEZw==", - "license": "BSD-3-Clause", - "engines": { - "node": ">= 20.19.4" - } - }, - "node_modules/@react-native/community-cli-plugin/node_modules/@react-native/dev-middleware": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/dev-middleware/-/dev-middleware-0.81.6.tgz", - "integrity": "sha512-mK2M3gJ25LtgtqxS1ZXe1vHrz8APOA79Ot/MpbLeovFgLu6YJki0kbO5MRpJagTd+HbesVYSZb/BhAsGN7QAXA==", - "license": "MIT", - "dependencies": { - "@isaacs/ttlcache": "^1.4.1", - "@react-native/debugger-frontend": "0.81.6", - "chrome-launcher": "^0.15.2", - "chromium-edge-launcher": "^0.2.0", - "connect": "^3.6.5", - "debug": "^4.4.0", - "invariant": "^2.2.4", - "nullthrows": "^1.1.1", - "open": 
"^7.0.3", - "serve-static": "^1.16.2", - "ws": "^6.2.3" - }, - "engines": { - "node": ">= 20.19.4" - } - }, - "node_modules/@react-native/community-cli-plugin/node_modules/ws": { - "version": "6.2.3", - "resolved": "https://registry.npmjs.org/ws/-/ws-6.2.3.tgz", - "integrity": "sha512-jmTjYU0j60B+vHey6TfR3Z7RD61z/hmxBS3VMSGIrroOWXQEneK1zNuotOUrGyBHQj0yrpsLHPWtigEFd13ndA==", - "license": "MIT", - "dependencies": { - "async-limiter": "~1.0.0" - } - }, "node_modules/@react-native/debugger-frontend": { "version": "0.81.5", "resolved": "https://registry.npmjs.org/@react-native/debugger-frontend/-/debugger-frontend-0.81.5.tgz", @@ -9811,18 +10217,18 @@ } }, "node_modules/@react-native/gradle-plugin": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/gradle-plugin/-/gradle-plugin-0.81.6.tgz", - "integrity": "sha512-atUItC5MZ6yaNaI0sbsoDwUdF+KMNZcMKBIrNhXlUyIj3x1AQ6Cf8CHHv6Qokn8ZFw+uU6GWmQSiOWYUbmi8Ag==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/gradle-plugin/-/gradle-plugin-0.81.5.tgz", + "integrity": "sha512-hORRlNBj+ReNMLo9jme3yQ6JQf4GZpVEBLxmTXGGlIL78MAezDZr5/uq9dwElSbcGmLEgeiax6e174Fie6qPLg==", "license": "MIT", "engines": { "node": ">= 20.19.4" } }, "node_modules/@react-native/js-polyfills": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/js-polyfills/-/js-polyfills-0.81.6.tgz", - "integrity": "sha512-P5MWH/9vM24XkJ1TasCq42DMLoCUjZVSppTn6VWv/cI65NDjuYEy7bUSaXbYxGTnqiKyPG5Y+ADymqlIkdSAcw==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/js-polyfills/-/js-polyfills-0.81.5.tgz", + "integrity": "sha512-fB7M1CMOCIUudTRuj7kzxIBTVw2KXnsgbQ6+4cbqSxo8NmRRhA0Ul4ZUzZj3rFd3VznTL4Brmocv1oiN0bWZ8w==", "license": "MIT", "engines": { "node": ">= 20.19.4" @@ -9841,9 +10247,9 @@ "license": "MIT" }, "node_modules/@react-native/virtualized-lists": { - "version": "0.81.6", - "resolved": 
"https://registry.npmjs.org/@react-native/virtualized-lists/-/virtualized-lists-0.81.6.tgz", - "integrity": "sha512-1RrZl3a7iCoAS2SGaRLjJPIn8bg/GLNXzqkIB2lufXcJsftu1umNLRIi17ZoDRejAWSd2pUfUtQBASo4R2mw4Q==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/virtualized-lists/-/virtualized-lists-0.81.5.tgz", + "integrity": "sha512-UVXgV/db25OPIvwZySeToXD/9sKKhOdkcWmmf4Jh8iBZuyfML+/5CasaZ1E7Lqg6g3uqVQq75NqIwkYmORJMPw==", "license": "MIT", "dependencies": { "invariant": "^2.2.4", @@ -9853,7 +10259,7 @@ "node": ">= 20.19.4" }, "peerDependencies": { - "@types/react": "^19.1.4", + "@types/react": "^19.1.0", "react": "*", "react-native": "*" }, @@ -11301,6 +11707,16 @@ "@babel/types": "^7.28.2" } }, + "node_modules/@types/better-sqlite3": { + "version": "7.6.13", + "resolved": "https://registry.npmjs.org/@types/better-sqlite3/-/better-sqlite3-7.6.13.tgz", + "integrity": "sha512-NMv9ASNARoKksWtsq/SHakpYAYnhBrQgGD8zkLYk/jaK8jUGn08CfEdTRgYhMypUQAfzSP8W6gNLe0q19/t4VA==", + "devOptional": true, + "license": "MIT", + "dependencies": { + "@types/node": "*" + } + }, "node_modules/@types/body-parser": { "version": "1.19.6", "resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.6.tgz", @@ -14007,6 +14423,20 @@ "url": "https://github.com/sponsors/sindresorhus" } }, + "node_modules/better-sqlite3": { + "version": "12.8.0", + "resolved": "https://registry.npmjs.org/better-sqlite3/-/better-sqlite3-12.8.0.tgz", + "integrity": "sha512-RxD2Vd96sQDjQr20kdP+F+dK/1OUNiVOl200vKBZY8u0vTwysfolF6Hq+3ZK2+h8My9YvZhHsF+RSGZW2VYrPQ==", + "hasInstallScript": true, + "license": "MIT", + "dependencies": { + "bindings": "^1.5.0", + "prebuild-install": "^7.1.1" + }, + "engines": { + "node": "20.x || 22.x || 23.x || 24.x || 25.x" + } + }, "node_modules/big-integer": { "version": "1.6.52", "resolved": "https://registry.npmjs.org/big-integer/-/big-integer-1.6.52.tgz", @@ -14028,6 +14458,64 @@ "url": "https://github.com/sponsors/sindresorhus" } }, + 
"node_modules/bindings": { + "version": "1.5.0", + "resolved": "https://registry.npmjs.org/bindings/-/bindings-1.5.0.tgz", + "integrity": "sha512-p2q/t/mhvuOj/UeLlV6566GD/guowlr0hHxClI0W9m7MWYkL1F0hLo+0Aexs9HSPCtR1SXQ0TD3MMKrXZajbiQ==", + "license": "MIT", + "dependencies": { + "file-uri-to-path": "1.0.0" + } + }, + "node_modules/bl": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/bl/-/bl-4.1.0.tgz", + "integrity": "sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==", + "license": "MIT", + "dependencies": { + "buffer": "^5.5.0", + "inherits": "^2.0.4", + "readable-stream": "^3.4.0" + } + }, + "node_modules/bl/node_modules/buffer": { + "version": "5.7.1", + "resolved": "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz", + "integrity": "sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==", + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "license": "MIT", + "dependencies": { + "base64-js": "^1.3.1", + "ieee754": "^1.1.13" + } + }, + "node_modules/bl/node_modules/readable-stream": { + "version": "3.6.2", + "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz", + "integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==", + "license": "MIT", + "dependencies": { + "inherits": "^2.0.3", + "string_decoder": "^1.1.1", + "util-deprecate": "^1.0.1" + }, + "engines": { + "node": ">= 6" + } + }, "node_modules/blake3-wasm": { "version": "2.1.5", "resolved": "https://registry.npmjs.org/blake3-wasm/-/blake3-wasm-2.1.5.tgz", @@ -15845,7 +16333,6 @@ "version": "6.0.0", "resolved": "https://registry.npmjs.org/decompress-response/-/decompress-response-6.0.0.tgz", "integrity": 
"sha512-aW35yZM6Bb/4oJlZncMH2LCoZtJXTRxES17vE3hoRiowU2kWHaJKFkSBDnDR+cm9J+9QhXmREyIfv0pji9ejCQ==", - "dev": true, "license": "MIT", "dependencies": { "mimic-response": "^3.1.0" @@ -15861,7 +16348,6 @@ "version": "3.1.0", "resolved": "https://registry.npmjs.org/mimic-response/-/mimic-response-3.1.0.tgz", "integrity": "sha512-z0yWI+4FDrrweS8Zmt4Ej5HdJmky15+L2e6Wgn3+iK5fWzb6T3fhNFq2+MeTRb064c6Wr4N/wv0DzQTjNzHNGQ==", - "dev": true, "license": "MIT", "engines": { "node": ">=10" @@ -16380,6 +16866,631 @@ "url": "https://dotenvx.com" } }, + "node_modules/drizzle-kit": { + "version": "0.31.10", + "resolved": "https://registry.npmjs.org/drizzle-kit/-/drizzle-kit-0.31.10.tgz", + "integrity": "sha512-7OZcmQUrdGI+DUNNsKBn1aW8qSoKuTH7d0mYgSP8bAzdFzKoovxEFnoGQp2dVs82EOJeYycqRtciopszwUf8bw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@drizzle-team/brocli": "^0.10.2", + "@esbuild-kit/esm-loader": "^2.5.5", + "esbuild": "^0.25.4", + "tsx": "^4.21.0" + }, + "bin": { + "drizzle-kit": "bin.cjs" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/aix-ppc64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.25.12.tgz", + "integrity": "sha512-Hhmwd6CInZ3dwpuGTF8fJG6yoWmsToE+vYgD4nytZVxcu1ulHpUQRAB1UJ8+N1Am3Mz4+xOByoQoSZf4D+CpkA==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "aix" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/android-arm": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.25.12.tgz", + "integrity": "sha512-VJ+sKvNA/GE7Ccacc9Cha7bpS8nyzVv0jdVgwNDaR4gDMC/2TTRc33Ip8qrNYUcpkOHUT5OZ0bUcNNVZQ9RLlg==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/android-arm64": { + "version": "0.25.12", + "resolved": 
"https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.25.12.tgz", + "integrity": "sha512-6AAmLG7zwD1Z159jCKPvAxZd4y/VTO0VkprYy+3N2FtJ8+BQWFXU+OxARIwA46c5tdD9SsKGZ/1ocqBS/gAKHg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/android-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.25.12.tgz", + "integrity": "sha512-5jbb+2hhDHx5phYR2By8GTWEzn6I9UqR11Kwf22iKbNpYrsmRB18aX/9ivc5cabcUiAT/wM+YIZ6SG9QO6a8kg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/darwin-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.25.12.tgz", + "integrity": "sha512-N3zl+lxHCifgIlcMUP5016ESkeQjLj/959RxxNYIthIg+CQHInujFuXeWbWMgnTo4cp5XVHqFPmpyu9J65C1Yg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/darwin-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.25.12.tgz", + "integrity": "sha512-HQ9ka4Kx21qHXwtlTUVbKJOAnmG1ipXhdWTmNXiPzPfWKpXqASVcWdnf2bnL73wgjNrFXAa3yYvBSd9pzfEIpA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/freebsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.12.tgz", + "integrity": "sha512-gA0Bx759+7Jve03K1S0vkOu5Lg/85dou3EseOGUes8flVOGxbhDDh/iZaoek11Y8mtyKPGF3vP8XhnkDEAmzeg==", + "cpu": [ + "arm64" + ], + 
"dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/freebsd-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.25.12.tgz", + "integrity": "sha512-TGbO26Yw2xsHzxtbVFGEXBFH0FRAP7gtcPE7P5yP7wGy7cXK2oO7RyOhL5NLiqTlBh47XhmIUXuGciXEqYFfBQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-arm": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.25.12.tgz", + "integrity": "sha512-lPDGyC1JPDou8kGcywY0YILzWlhhnRjdof3UlcoqYmS9El818LLfJJc3PXXgZHrHCAKs/Z2SeZtDJr5MrkxtOw==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.25.12.tgz", + "integrity": "sha512-8bwX7a8FghIgrupcxb4aUmYDLp8pX06rGh5HqDT7bB+8Rdells6mHvrFHHW2JAOPZUbnjUpKTLg6ECyzvas2AQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-ia32": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.25.12.tgz", + "integrity": "sha512-0y9KrdVnbMM2/vG8KfU0byhUN+EFCny9+8g202gYqSSVMonbsCfLjUO+rCci7pM0WBEtz+oK/PIwHkzxkyharA==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-loong64": { + "version": "0.25.12", + "resolved": 
"https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.25.12.tgz", + "integrity": "sha512-h///Lr5a9rib/v1GGqXVGzjL4TMvVTv+s1DPoxQdz7l/AYv6LDSxdIwzxkrPW438oUXiDtwM10o9PmwS/6Z0Ng==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-mips64el": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.25.12.tgz", + "integrity": "sha512-iyRrM1Pzy9GFMDLsXn1iHUm18nhKnNMWscjmp4+hpafcZjrr2WbT//d20xaGljXDBYHqRcl8HnxbX6uaA/eGVw==", + "cpu": [ + "mips64el" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-ppc64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.25.12.tgz", + "integrity": "sha512-9meM/lRXxMi5PSUqEXRCtVjEZBGwB7P/D4yT8UG/mwIdze2aV4Vo6U5gD3+RsoHXKkHCfSxZKzmDssVlRj1QQA==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-riscv64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.25.12.tgz", + "integrity": "sha512-Zr7KR4hgKUpWAwb1f3o5ygT04MzqVrGEGXGLnj15YQDJErYu/BGg+wmFlIDOdJp0PmB0lLvxFIOXZgFRrdjR0w==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-s390x": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.25.12.tgz", + "integrity": "sha512-MsKncOcgTNvdtiISc/jZs/Zf8d0cl/t3gYWX8J9ubBnVOwlk65UIEEvgBORTiljloIWnBzLs4qhzPkJcitIzIg==", + "cpu": [ + 
"s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/linux-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.25.12.tgz", + "integrity": "sha512-uqZMTLr/zR/ed4jIGnwSLkaHmPjOjJvnm6TVVitAa08SLS9Z0VM8wIRx7gWbJB5/J54YuIMInDquWyYvQLZkgw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/netbsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.12.tgz", + "integrity": "sha512-xXwcTq4GhRM7J9A8Gv5boanHhRa/Q9KLVmcyXHCTaM4wKfIpWkdXiMog/KsnxzJ0A1+nD+zoecuzqPmCRyBGjg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/netbsd-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.25.12.tgz", + "integrity": "sha512-Ld5pTlzPy3YwGec4OuHh1aCVCRvOXdH8DgRjfDy/oumVovmuSzWfnSJg+VtakB9Cm0gxNO9BzWkj6mtO1FMXkQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/openbsd-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.12.tgz", + "integrity": "sha512-fF96T6KsBo/pkQI950FARU9apGNTSlZGsv1jZBAlcLL1MLjLNIWPBkj5NlSz8aAzYKg+eNqknrUJ24QBybeR5A==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/openbsd-x64": { + "version": "0.25.12", + 
"resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.25.12.tgz", + "integrity": "sha512-MZyXUkZHjQxUvzK7rN8DJ3SRmrVrke8ZyRusHlP+kuwqTcfWLyqMOE3sScPPyeIXN/mDJIfGXvcMqCgYKekoQw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/openharmony-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/openharmony-arm64/-/openharmony-arm64-0.25.12.tgz", + "integrity": "sha512-rm0YWsqUSRrjncSXGA7Zv78Nbnw4XL6/dzr20cyrQf7ZmRcsovpcRBdhD43Nuk3y7XIoW2OxMVvwuRvk9XdASg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openharmony" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/sunos-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.25.12.tgz", + "integrity": "sha512-3wGSCDyuTHQUzt0nV7bocDy72r2lI33QL3gkDNGkod22EsYl04sMf0qLb8luNKTOmgF/eDEDP5BFNwoBKH441w==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "sunos" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/win32-arm64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.25.12.tgz", + "integrity": "sha512-rMmLrur64A7+DKlnSuwqUdRKyd3UE7oPJZmnljqEptesKM8wx9J8gx5u0+9Pq0fQQW8vqeKebwNXdfOyP+8Bsg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/win32-ia32": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.25.12.tgz", + "integrity": "sha512-HkqnmmBoCbCwxUKKNPBixiWDGCpQGVsrQfJoVGYLPT41XWF8lHuE5N6WhVia2n4o5QK5M4tYr21827fNhi4byQ==", + "cpu": [ + 
"ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/@esbuild/win32-x64": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.25.12.tgz", + "integrity": "sha512-alJC0uCZpTFrSL0CCDjcgleBXPnCrEAhTBILpeAp7M/OFgoqtAetfBzX0xM00MUsVVPpVjlPuMbREqnZCXaTnA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/drizzle-kit/node_modules/esbuild": { + "version": "0.25.12", + "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.12.tgz", + "integrity": "sha512-bbPBYYrtZbkt6Os6FiTLCTFxvq4tt3JKall1vRwshA3fdVztsLAatFaZobhkBC8/BrPetoa0oksYoKXoG4ryJg==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "bin": { + "esbuild": "bin/esbuild" + }, + "engines": { + "node": ">=18" + }, + "optionalDependencies": { + "@esbuild/aix-ppc64": "0.25.12", + "@esbuild/android-arm": "0.25.12", + "@esbuild/android-arm64": "0.25.12", + "@esbuild/android-x64": "0.25.12", + "@esbuild/darwin-arm64": "0.25.12", + "@esbuild/darwin-x64": "0.25.12", + "@esbuild/freebsd-arm64": "0.25.12", + "@esbuild/freebsd-x64": "0.25.12", + "@esbuild/linux-arm": "0.25.12", + "@esbuild/linux-arm64": "0.25.12", + "@esbuild/linux-ia32": "0.25.12", + "@esbuild/linux-loong64": "0.25.12", + "@esbuild/linux-mips64el": "0.25.12", + "@esbuild/linux-ppc64": "0.25.12", + "@esbuild/linux-riscv64": "0.25.12", + "@esbuild/linux-s390x": "0.25.12", + "@esbuild/linux-x64": "0.25.12", + "@esbuild/netbsd-arm64": "0.25.12", + "@esbuild/netbsd-x64": "0.25.12", + "@esbuild/openbsd-arm64": "0.25.12", + "@esbuild/openbsd-x64": "0.25.12", + "@esbuild/openharmony-arm64": "0.25.12", + "@esbuild/sunos-x64": "0.25.12", + "@esbuild/win32-arm64": "0.25.12", + "@esbuild/win32-ia32": "0.25.12", + "@esbuild/win32-x64": "0.25.12" + } + }, + 
"node_modules/drizzle-orm": { + "version": "0.45.2", + "resolved": "https://registry.npmjs.org/drizzle-orm/-/drizzle-orm-0.45.2.tgz", + "integrity": "sha512-kY0BSaTNYWnoDMVoyY8uxmyHjpJW1geOmBMdSSicKo9CIIWkSxMIj2rkeSR51b8KAPB7m+qysjuHme5nKP+E5Q==", + "license": "Apache-2.0", + "peerDependencies": { + "@aws-sdk/client-rds-data": ">=3", + "@cloudflare/workers-types": ">=4", + "@electric-sql/pglite": ">=0.2.0", + "@libsql/client": ">=0.10.0", + "@libsql/client-wasm": ">=0.10.0", + "@neondatabase/serverless": ">=0.10.0", + "@op-engineering/op-sqlite": ">=2", + "@opentelemetry/api": "^1.4.1", + "@planetscale/database": ">=1.13", + "@prisma/client": "*", + "@tidbcloud/serverless": "*", + "@types/better-sqlite3": "*", + "@types/pg": "*", + "@types/sql.js": "*", + "@upstash/redis": ">=1.34.7", + "@vercel/postgres": ">=0.8.0", + "@xata.io/client": "*", + "better-sqlite3": ">=7", + "bun-types": "*", + "expo-sqlite": ">=14.0.0", + "gel": ">=2", + "knex": "*", + "kysely": "*", + "mysql2": ">=2", + "pg": ">=8", + "postgres": ">=3", + "sql.js": ">=1", + "sqlite3": ">=5" + }, + "peerDependenciesMeta": { + "@aws-sdk/client-rds-data": { + "optional": true + }, + "@cloudflare/workers-types": { + "optional": true + }, + "@electric-sql/pglite": { + "optional": true + }, + "@libsql/client": { + "optional": true + }, + "@libsql/client-wasm": { + "optional": true + }, + "@neondatabase/serverless": { + "optional": true + }, + "@op-engineering/op-sqlite": { + "optional": true + }, + "@opentelemetry/api": { + "optional": true + }, + "@planetscale/database": { + "optional": true + }, + "@prisma/client": { + "optional": true + }, + "@tidbcloud/serverless": { + "optional": true + }, + "@types/better-sqlite3": { + "optional": true + }, + "@types/pg": { + "optional": true + }, + "@types/sql.js": { + "optional": true + }, + "@upstash/redis": { + "optional": true + }, + "@vercel/postgres": { + "optional": true + }, + "@xata.io/client": { + "optional": true + }, + "better-sqlite3": { + "optional": 
true + }, + "bun-types": { + "optional": true + }, + "expo-sqlite": { + "optional": true + }, + "gel": { + "optional": true + }, + "knex": { + "optional": true + }, + "kysely": { + "optional": true + }, + "mysql2": { + "optional": true + }, + "pg": { + "optional": true + }, + "postgres": { + "optional": true + }, + "prisma": { + "optional": true + }, + "sql.js": { + "optional": true + }, + "sqlite3": { + "optional": true + } + } + }, "node_modules/dtrace-provider": { "version": "0.8.8", "resolved": "https://registry.npmjs.org/dtrace-provider/-/dtrace-provider-0.8.8.tgz", @@ -18533,6 +19644,15 @@ "dev": true, "license": "MIT" }, + "node_modules/expand-template": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/expand-template/-/expand-template-2.0.3.tgz", + "integrity": "sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg==", + "license": "(MIT OR WTFPL)", + "engines": { + "node": ">=6" + } + }, "node_modules/expect": { "version": "29.7.0", "resolved": "https://registry.npmjs.org/expect/-/expect-29.7.0.tgz", @@ -21153,6 +22273,12 @@ "node": ">=16.0.0" } }, + "node_modules/file-uri-to-path": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/file-uri-to-path/-/file-uri-to-path-1.0.0.tgz", + "integrity": "sha512-0Zt+s3L7Vf1biwWZ29aARiVYLx7iMGnEUl9x33fbB/j3jR81u/O2LbqK+Bm1CDSNDKVtJ/YjwY7TUd5SkeLQLw==", + "license": "MIT" + }, "node_modules/filelist": { "version": "1.0.6", "resolved": "https://registry.npmjs.org/filelist/-/filelist-1.0.6.tgz", @@ -21596,6 +22722,12 @@ "node": ">= 0.6" } }, + "node_modules/fs-constants": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz", + "integrity": "sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==", + "license": "MIT" + }, "node_modules/fs-extra": { "version": "10.1.0", "resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.1.0.tgz", @@ -21867,6 
+22999,12 @@ "node": ">=6" } }, + "node_modules/github-from-package": { + "version": "0.0.0", + "resolved": "https://registry.npmjs.org/github-from-package/-/github-from-package-0.0.0.tgz", + "integrity": "sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw==", + "license": "MIT" + }, "node_modules/glob": { "version": "10.5.0", "resolved": "https://registry.npmjs.org/glob/-/glob-10.5.0.tgz", @@ -26760,6 +27898,12 @@ "node": ">=10" } }, + "node_modules/mkdirp-classic": { + "version": "0.5.3", + "resolved": "https://registry.npmjs.org/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz", + "integrity": "sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A==", + "license": "MIT" + }, "node_modules/mnemonic-id": { "version": "3.2.7", "resolved": "https://registry.npmjs.org/mnemonic-id/-/mnemonic-id-3.2.7.tgz", @@ -26949,6 +28093,12 @@ "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1" } }, + "node_modules/napi-build-utils": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/napi-build-utils/-/napi-build-utils-2.0.0.tgz", + "integrity": "sha512-GEbrYkbfF7MoNaoh2iGG84Mnf/WZfB0GdGEsM8wz7Expx/LlWf5U8t9nvJKXSp3qr5IsEbK04cBGhol/KwOsWA==", + "license": "MIT" + }, "node_modules/napi-postinstall": { "version": "0.3.4", "resolved": "https://registry.npmjs.org/napi-postinstall/-/napi-postinstall-0.3.4.tgz", @@ -28565,6 +29715,45 @@ "node": "^12.20.0 || >=14" } }, + "node_modules/prebuild-install": { + "version": "7.1.3", + "resolved": "https://registry.npmjs.org/prebuild-install/-/prebuild-install-7.1.3.tgz", + "integrity": "sha512-8Mf2cbV7x1cXPUILADGI3wuhfqWvtiLA1iclTDbFRZkgRQS0NqsPZphna9V+HyTEadheuPmjaJMsbzKQFOzLug==", + "deprecated": "No longer maintained. 
Please contact the author of the relevant native addon; alternatives are available.", + "license": "MIT", + "dependencies": { + "detect-libc": "^2.0.0", + "expand-template": "^2.0.3", + "github-from-package": "0.0.0", + "minimist": "^1.2.3", + "mkdirp-classic": "^0.5.3", + "napi-build-utils": "^2.0.0", + "node-abi": "^3.3.0", + "pump": "^3.0.0", + "rc": "^1.2.7", + "simple-get": "^4.0.0", + "tar-fs": "^2.0.0", + "tunnel-agent": "^0.6.0" + }, + "bin": { + "prebuild-install": "bin.js" + }, + "engines": { + "node": ">=10" + } + }, + "node_modules/prebuild-install/node_modules/node-abi": { + "version": "3.89.0", + "resolved": "https://registry.npmjs.org/node-abi/-/node-abi-3.89.0.tgz", + "integrity": "sha512-6u9UwL0HlAl21+agMN3YAMXcKByMqwGx+pq+P76vii5f7hTPtKDp08/H9py6DY+cfDw7kQNTGEj/rly3IgbNQA==", + "license": "MIT", + "dependencies": { + "semver": "^7.3.5" + }, + "engines": { + "node": ">=10" + } + }, "node_modules/prelude-ls": { "version": "1.2.1", "resolved": "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz", @@ -29279,19 +30468,19 @@ } }, "node_modules/react-native": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/react-native/-/react-native-0.81.6.tgz", - "integrity": "sha512-X/tI8GqfzVaa+zfbE4+lySNN5UzwBIAVRHVZPKymOny9Acc5GYYcjAcEVfG3AM4h920YALWoSl8favnDmQEWIg==", + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/react-native/-/react-native-0.81.5.tgz", + "integrity": "sha512-1w+/oSjEXZjMqsIvmkCRsOc8UBYv163bTWKTI8+1mxztvQPhCRYGTvZ/PL1w16xXHneIj/SLGfxWg2GWN2uexw==", "license": "MIT", "dependencies": { "@jest/create-cache-key-function": "^29.7.0", - "@react-native/assets-registry": "0.81.6", - "@react-native/codegen": "0.81.6", - "@react-native/community-cli-plugin": "0.81.6", - "@react-native/gradle-plugin": "0.81.6", - "@react-native/js-polyfills": "0.81.6", - "@react-native/normalize-colors": "0.81.6", - "@react-native/virtualized-lists": "0.81.6", + "@react-native/assets-registry": "0.81.5", + 
"@react-native/codegen": "0.81.5", + "@react-native/community-cli-plugin": "0.81.5", + "@react-native/gradle-plugin": "0.81.5", + "@react-native/js-polyfills": "0.81.5", + "@react-native/normalize-colors": "0.81.5", + "@react-native/virtualized-lists": "0.81.5", "abort-controller": "^3.0.0", "anser": "^1.4.9", "ansi-regex": "^5.0.0", @@ -29326,8 +30515,8 @@ "node": ">= 20.19.4" }, "peerDependencies": { - "@types/react": "^19.1.4", - "react": "^19.1.4" + "@types/react": "^19.1.0", + "react": "^19.1.0" }, "peerDependenciesMeta": { "@types/react": { @@ -29670,31 +30859,16 @@ "node": ">=10" } }, - "node_modules/react-native/node_modules/@react-native/codegen": { - "version": "0.81.6", - "resolved": "https://registry.npmjs.org/@react-native/codegen/-/codegen-0.81.6.tgz", - "integrity": "sha512-9KoYRep/KDnELLLmIYTtIIEOClVUJ88pxWObb/0sjkacA7uL4SgfbAg7rWLURAQJWI85L1YS67IhdEqNNk1I7w==", - "license": "MIT", - "dependencies": { - "@babel/core": "^7.25.2", - "@babel/parser": "^7.25.3", - "glob": "^7.1.1", - "hermes-parser": "0.29.1", - "invariant": "^2.2.4", - "nullthrows": "^1.1.1", - "yargs": "^17.6.2" - }, - "engines": { - "node": ">= 20.19.4" - }, - "peerDependencies": { - "@babel/core": "*" - } + "node_modules/react-native/node_modules/@react-native/normalize-colors": { + "version": "0.81.5", + "resolved": "https://registry.npmjs.org/@react-native/normalize-colors/-/normalize-colors-0.81.5.tgz", + "integrity": "sha512-0HuJ8YtqlTVRXGZuGeBejLE04wSQsibpTI+RGOyVqxZvgtlLLC/Ssw0UmbHhT4lYMp2fhdtvKZSs5emWB1zR/g==", + "license": "MIT" }, "node_modules/react-native/node_modules/brace-expansion": { - "version": "1.1.12", - "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz", - "integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==", + "version": "1.1.13", + "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.13.tgz", + "integrity": 
"sha512-9ZLprWS6EENmhEOpjCYW2c8VkmOvckIJZfkr7rBW6dObmfgJ/L1GpSYW5Hpo9lDz4D1+n0Ckz8rU7FwHDQiG/w==", "license": "MIT", "dependencies": { "balanced-match": "^1.0.0", @@ -31255,6 +32429,51 @@ "url": "https://github.com/sponsors/isaacs" } }, + "node_modules/simple-concat": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/simple-concat/-/simple-concat-1.0.1.tgz", + "integrity": "sha512-cSFtAPtRhljv69IK0hTVZQ+OfE9nePi/rtJmw5UjHeVyVroEqJXP1sFztKUy1qU+xvz3u/sfYJLa947b7nAN2Q==", + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "license": "MIT" + }, + "node_modules/simple-get": { + "version": "4.0.1", + "resolved": "https://registry.npmjs.org/simple-get/-/simple-get-4.0.1.tgz", + "integrity": "sha512-brv7p5WgH0jmQJr1ZDDfKDOSeWWg+OVypG99A/5vYGPqJ6pxiaHLy8nxtFjBA7oMa01ebA9gfh1uMCFqOuXxvA==", + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "license": "MIT", + "dependencies": { + "decompress-response": "^6.0.0", + "once": "^1.3.1", + "simple-concat": "^1.0.0" + } + }, "node_modules/simple-plist": { "version": "1.3.1", "resolved": "https://registry.npmjs.org/simple-plist/-/simple-plist-1.3.1.tgz", @@ -31748,7 +32967,6 @@ "version": "1.1.1", "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz", "integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==", - "dev": true, "license": "MIT", "dependencies": { "safe-buffer": "~5.1.0" @@ -31758,7 +32976,6 @@ "version": "5.1.2", "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", "integrity": 
"sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==", - "dev": true, "license": "MIT" }, "node_modules/string-length": { @@ -32232,6 +33449,54 @@ "node": ">=10" } }, + "node_modules/tar-fs": { + "version": "2.1.4", + "resolved": "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.4.tgz", + "integrity": "sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ==", + "license": "MIT", + "dependencies": { + "chownr": "^1.1.1", + "mkdirp-classic": "^0.5.2", + "pump": "^3.0.0", + "tar-stream": "^2.1.4" + } + }, + "node_modules/tar-fs/node_modules/chownr": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/chownr/-/chownr-1.1.4.tgz", + "integrity": "sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==", + "license": "ISC" + }, + "node_modules/tar-fs/node_modules/readable-stream": { + "version": "3.6.2", + "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz", + "integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==", + "license": "MIT", + "dependencies": { + "inherits": "^2.0.3", + "string_decoder": "^1.1.1", + "util-deprecate": "^1.0.1" + }, + "engines": { + "node": ">= 6" + } + }, + "node_modules/tar-fs/node_modules/tar-stream": { + "version": "2.2.0", + "resolved": "https://registry.npmjs.org/tar-stream/-/tar-stream-2.2.0.tgz", + "integrity": "sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==", + "license": "MIT", + "dependencies": { + "bl": "^4.0.3", + "end-of-stream": "^1.4.1", + "fs-constants": "^1.0.0", + "inherits": "^2.0.3", + "readable-stream": "^3.1.1" + }, + "engines": { + "node": ">=6" + } + }, "node_modules/tar-stream": { "version": "3.1.7", "resolved": "https://registry.npmjs.org/tar-stream/-/tar-stream-3.1.7.tgz", @@ -33007,7 +34272,6 @@ "version": "0.6.0", "resolved": 
"https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.6.0.tgz", "integrity": "sha512-McnNiV1l8RYeY8tBgEpuodCC1mLUdbSN+CYBL7kJsJNInOP8UjDDEwdk6Mw60vdLLrr5NHKZhMAOSrR2NZuQ+w==", - "dev": true, "license": "Apache-2.0", "dependencies": { "safe-buffer": "^5.0.1" @@ -33679,7 +34943,6 @@ "version": "1.0.2", "resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz", "integrity": "sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==", - "dev": true, "license": "MIT" }, "node_modules/utils-merge": { @@ -35373,7 +36636,9 @@ "@xterm/headless": "^6.0.0", "ai": "5.0.78", "ajv": "^8.17.1", + "better-sqlite3": "^12.8.0", "dotenv": "^17.2.3", + "drizzle-orm": "^0.45.1", "express": "^4.18.2", "express-basic-auth": "^1.2.1", "fast-uri": "^3.1.0", @@ -35397,6 +36662,7 @@ }, "devDependencies": { "@playwright/test": "^1.56.1", + "@types/better-sqlite3": "^7.6.13", "@types/express": "^4.17.20", "@types/node": "^20.9.0", "@types/qrcode": "^1.5.6", @@ -35404,6 +36670,7 @@ "@types/ws": "^8.5.8", "@vitest/ui": "^3.2.4", "cross-env": "^10.1.0", + "drizzle-kit": "^0.31.10", "playwright": "^1.56.1", "tsx": "^4.6.0", "typescript": "^5.2.2", diff --git a/package.json b/package.json index 26f091731..1b90aca43 100644 --- a/package.json +++ b/package.json @@ -37,6 +37,7 @@ "web": "npm run web --workspace=@getpaseo/app", "dev:desktop": "npm run dev --workspace=@getpaseo/desktop", "build:desktop": "npm run version:sync-internal && npm run build:web --workspace=@getpaseo/app && npm run build --workspace=@getpaseo/desktop", + "db:query": "npm run db:query --workspace=@getpaseo/server --", "cli": "npx tsx packages/cli/src/index.js", "version": "npm run version:sync-internal && npm run release:prepare && git add -A", "version:sync-internal": "node scripts/sync-workspace-versions.mjs", @@ -98,6 +99,9 @@ }, "dependencies": { "@anthropic-ai/claude-agent-sdk": "^0.2.11", - "@modelcontextprotocol/sdk": "^1.27.1" + 
"@modelcontextprotocol/sdk": "^1.27.1", + "expo": "~54.0.33", + "react": "19.1.0", + "react-native": "0.81.5" } } diff --git a/packages/app/e2e/helpers/launcher.ts b/packages/app/e2e/helpers/launcher.ts new file mode 100644 index 000000000..c42861f64 --- /dev/null +++ b/packages/app/e2e/helpers/launcher.ts @@ -0,0 +1,193 @@ +import { expect, type Page } from "@playwright/test"; +import { buildHostWorkspaceRoute } from "../../src/utils/host-routes"; +import { createTempGitRepo } from "./workspace"; + +// ─── Navigation ──────────────────────────────────────────────────────────── + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set (expected from Playwright globalSetup)."); + } + return serverId; +} + +/** Navigate to a workspace and wait for the tab bar to appear. */ +export async function gotoWorkspace(page: Page, cwd: string): Promise { + const route = buildHostWorkspaceRoute(getServerId(), cwd); + await page.goto(route); + await waitForTabBar(page); +} + +// ─── Tab bar queries ─────────────────────────────────────────────────────── + +/** Wait for the workspace tab bar to be visible. */ +export async function waitForTabBar(page: Page): Promise { + await expect(page.getByTestId("workspace-tabs-row").first()).toBeVisible({ + timeout: 30_000, + }); +} + +/** Return all tab test IDs currently in the tab bar. */ +export async function getTabTestIds(page: Page): Promise { + const tabs = page.locator( + '[data-testid^="workspace-tab-"]:not([data-testid^="workspace-tab-context-"])', + ); + const count = await tabs.count(); + const ids: string[] = []; + for (let i = 0; i < count; i++) { + const testId = await tabs.nth(i).getAttribute("data-testid"); + if (testId) ids.push(testId); + } + return ids; +} + +/** Return the number of tabs matching a kind prefix (e.g. "launcher", "draft", "terminal", "agent"). 
*/ +export async function countTabsOfKind(page: Page, kind: string): Promise<number> { + const ids = await getTabTestIds(page); + return ids.filter((id) => id.includes(kind)).length; +} + +/** Return the currently active tab's test ID (the one with aria-selected or focus styling). */ +export async function getActiveTabTestId(page: Page): Promise<string | null> { + // Active tab has the focused highlight — check for the aria-selected or data-active attribute + const activeTab = page + .locator( + '[data-testid^="workspace-tab-"]:not([data-testid^="workspace-tab-context-"])[aria-selected="true"]', + ) + .first(); + if (await activeTab.isVisible().catch(() => false)) { + return activeTab.getAttribute("data-testid"); + } + // Fallback: the tab with focused styling + return null; +} + +// ─── Tab actions ─────────────────────────────────────────────────────────── + +/** Click the new agent tab button in the tab bar. Creates a draft/chat tab directly. */ +export async function clickNewTabButton(page: Page): Promise<void> { + const button = page.getByTestId("workspace-new-agent-tab"); + await expect(button).toBeVisible({ timeout: 10_000 }); + await button.click(); +} + +/** Click the new terminal button in the workspace tab bar. Creates a terminal tab directly. */ +export async function clickNewTerminalButton(page: Page): Promise<void> { + const button = page.getByTestId("workspace-new-terminal"); + await expect(button).toBeVisible({ timeout: 10_000 }); + await button.click(); +} + +/** Press Cmd+T (macOS) to open a new tab. */ +export async function pressNewTabShortcut(page: Page): Promise<void> { + await page.keyboard.press("Meta+t"); +} + +// ─── Tab bar assertions ─────────────────────────────────────────────────── + +/** @deprecated The launcher panel was removed. Actions go directly to their target. */ +export async function waitForLauncherPanel(page: Page): Promise<void> { + // No-op: the launcher panel no longer exists. +} + +/** Assert the new agent tab button is visible in the tab bar.
*/ +export async function assertNewChatTileVisible(page: Page): Promise<void> { + await expect(page.getByTestId("workspace-new-agent-tab").first()).toBeVisible(); +} + +/** Assert the new terminal button is visible in the tab bar. */ +export async function assertTerminalTileVisible(page: Page): Promise<void> { + await expect(page.getByTestId("workspace-new-terminal").first()).toBeVisible(); +} + +// ─── Tab creation actions ───────────────────────────────────────────────── + +/** Click the new agent tab button to create a draft/chat tab. */ +export async function clickNewChat(page: Page): Promise<void> { + const button = page.getByTestId("workspace-new-agent-tab"); + await expect(button).toBeVisible({ timeout: 10_000 }); + await button.click(); +} + +/** Click the new terminal button to create a terminal tab. */ +export async function clickTerminal(page: Page): Promise<void> { + const button = page.getByTestId("workspace-new-terminal"); + await expect(button).toBeVisible({ timeout: 10_000 }); + await button.click(); +} + + +// ─── Tab title assertions ────────────────────────────────────────────────── + +/** Wait for any tab in the bar to display the given title text. */ +export async function waitForTabWithTitle( + page: Page, + title: string | RegExp, + timeout = 30_000, +): Promise<void> { + const matcher = typeof title === "string" ? new RegExp(title, "i") : title; + await expect( + page + .locator('[data-testid^="workspace-tab-"]:not([data-testid^="workspace-tab-context-"])') + .filter({ hasText: matcher }) + .first(), + ).toBeVisible({ timeout }); +} + +/** Assert at least one new agent tab button is present in the tab bar.
*/ +export async function assertSingleNewTabButton(page: Page): Promise<void> { + const buttons = page.getByTestId("workspace-new-agent-tab"); + const count = await buttons.count(); + expect(count).toBeGreaterThanOrEqual(1); +} + +// ─── No-flash measurement ────────────────────────────────────────────────── + +/** + * Measure the time between clicking a launcher tile and the replacement panel becoming visible. + * Returns elapsed milliseconds. + */ +export async function measureTileTransition( + page: Page, + clickAction: () => Promise<void>, + successLocator: ReturnType<Page["locator"]>, + timeout = 5_000, +): Promise<number> { + const start = Date.now(); + await clickAction(); + await expect(successLocator).toBeVisible({ timeout }); + return Date.now() - start; +} + +/** + * Sample tab IDs at high frequency across a transition to detect blank/intermediate states. + * Returns all unique snapshots observed. + */ +export async function sampleTabsDuringTransition( + page: Page, + action: () => Promise<void>, + durationMs = 2_000, + intervalMs = 30, +): Promise<string[][]> { + const snapshots: string[][] = []; + const startSampling = async () => { + const start = Date.now(); + while (Date.now() - start < durationMs) { + snapshots.push(await getTabTestIds(page)); + await page.waitForTimeout(intervalMs); + } + }; + + const samplingPromise = startSampling(); + await action(); + await samplingPromise; + return snapshots; +} + +// ─── Workspace setup ─────────────────────────────────────────────────────── + +/** Create a temp git repo and return its path with a cleanup function.
*/ +export async function createWorkspace(prefix = "launcher-e2e-"): ReturnType<typeof createTempGitRepo> { + return createTempGitRepo(prefix); +} diff --git a/packages/app/e2e/helpers/terminal-perf.ts b/packages/app/e2e/helpers/terminal-perf.ts index 18dd79bd1..8ac45d251 100644 --- a/packages/app/e2e/helpers/terminal-perf.ts +++ b/packages/app/e2e/helpers/terminal-perf.ts @@ -7,6 +7,12 @@ import { buildHostWorkspaceRoute } from "../../src/utils/host-routes"; export type TerminalPerfDaemonClient = { connect(): Promise<void>; close(): Promise<void>; + openProject( + cwd: string, + ): Promise<{ + workspace: { id: number; name: string; projectRootPath: string } | null; + error: string | null; + }>; createTerminal( cwd: string, name?: string, diff --git a/packages/app/e2e/helpers/workspace-lifecycle.ts b/packages/app/e2e/helpers/workspace-lifecycle.ts new file mode 100644 index 000000000..4006e42d2 --- /dev/null +++ b/packages/app/e2e/helpers/workspace-lifecycle.ts @@ -0,0 +1,35 @@ +import { expect, type Page } from "@playwright/test"; +import { + clickNewChat, + clickTerminal, +} from "./launcher"; +import { setupDeterministicPrompt, waitForTerminalContent } from "./terminal-perf"; + +function terminalSurface(page: Page) { + return page.locator('[data-testid="terminal-surface"]').first(); +} + +function composerInput(page: Page) { + return page.getByRole("textbox", { name: "Message agent..."
}).first(); +} + +export async function expectTerminalCwd(page: Page, expectedPath: string): Promise<void> { + const terminal = terminalSurface(page); + await expect(terminal).toBeVisible({ timeout: 20_000 }); + await terminal.click(); + await setupDeterministicPrompt(page, `SENTINEL_${Date.now()}`); + await terminal.pressSequentially("pwd\n", { delay: 0 }); + await waitForTerminalContent(page, (text) => text.includes(expectedPath), 10_000); +} + +export async function createStandaloneTerminalFromLauncher(page: Page): Promise<void> { + await clickTerminal(page); + await expect(terminalSurface(page)).toBeVisible({ timeout: 20_000 }); +} + +export async function createAgentChatFromLauncher(page: Page): Promise<void> { + await clickNewChat(page); + await expect(composerInput(page)).toBeVisible({ timeout: 15_000 }); + await expect(composerInput(page)).toBeEditable({ timeout: 15_000 }); + await expect(page.getByTestId("agent-loading")).toHaveCount(0); +} diff --git a/packages/app/e2e/helpers/workspace-setup.ts b/packages/app/e2e/helpers/workspace-setup.ts new file mode 100644 index 000000000..de47cf3aa --- /dev/null +++ b/packages/app/e2e/helpers/workspace-setup.ts @@ -0,0 +1,337 @@ +import { realpathSync } from "node:fs"; +import path from "node:path"; +import { randomUUID } from "node:crypto"; +import { pathToFileURL } from "node:url"; +import { expect, type Page } from "@playwright/test"; +import { parseHostWorkspaceRouteFromPathname } from "../../src/utils/host-routes"; +import { gotoAppShell } from "./app"; +import type { SessionOutboundMessage } from "@server/shared/messages"; + +type WorkspaceSetupDaemonClient = { + connect(): Promise<void>; + close(): Promise<void>; + openProject( + cwd: string, + ): Promise<{ + workspace: { + id: number; + name: string; + workspaceDirectory: string; + projectRootPath: string; + } | null; + error: string | null; + }>; + createPaseoWorktree( + input: { cwd: string; worktreeSlug?: string }, + ): Promise<{ + workspace: { + id: number; + name: string; +
workspaceDirectory: string; + projectRootPath: string; + } | null; + error: string | null; + }>; + fetchWorkspaces(): Promise<{ + entries: Array<{ + id: number; + name: string; + workspaceDirectory: string; + projectRootPath: string; + }>; + }>; + fetchAgents(): Promise<{ + entries: Array<{ + agent: { id: string; cwd: string; workspaceId?: string | null }; + }>; + }>; + fetchAgent( + agentId: string, + ): Promise<{ + agent: { id: string; cwd: string } | null; + project: unknown | null; + } | null>; + listTerminals( + cwd: string, + ): Promise<{ + cwd?: string; + terminals: Array<{ id: string; cwd: string; name: string }>; + error?: string | null; + }>; + subscribeRawMessages(handler: (message: SessionOutboundMessage) => void): () => void; +}; + +export type WorkspaceSetupProgressPayload = Extract< + SessionOutboundMessage, + { type: "workspace_setup_progress" } +>["payload"]; + +export type { WorkspaceSetupDaemonClient }; + +function getDaemonWsUrl(): string { + const daemonPort = process.env.E2E_DAEMON_PORT; + if (!daemonPort) { + throw new Error("E2E_DAEMON_PORT is not set."); + } + return `ws://127.0.0.1:${daemonPort}/ws`; +} + +async function loadDaemonClientConstructor(): Promise< + new (config: { url: string; clientId: string; clientType: "cli" }) => WorkspaceSetupDaemonClient +> { + const repoRoot = path.resolve(process.cwd(), "../.."); + const moduleUrl = pathToFileURL( + path.join(repoRoot, "packages/server/dist/server/server/exports.js"), + ).href; + const mod = (await import(moduleUrl)) as { + DaemonClient: new (config: { + url: string; + clientId: string; + clientType: "cli"; + }) => WorkspaceSetupDaemonClient; + }; + return mod.DaemonClient; +} + +export async function connectWorkspaceSetupClient(): Promise<WorkspaceSetupDaemonClient> { + const DaemonClient = await loadDaemonClientConstructor(); + const client = new DaemonClient({ + url: getDaemonWsUrl(), + clientId: `workspace-setup-${randomUUID()}`, + clientType: "cli", + }); + await client.connect(); + return client; +} +
+export async function seedProjectForWorkspaceSetup( + client: WorkspaceSetupDaemonClient, + repoPath: string, +): Promise<void> { + const result = await client.openProject(repoPath); + if (!result.workspace || result.error) { + throw new Error(result.error ?? `Failed to open project ${repoPath}`); + } +} + +export function projectNameFromPath(repoPath: string): string { + return repoPath.replace(/\/+$/, "").split("/").filter(Boolean).pop() ?? repoPath; +} + +export async function openHomeWithProject(page: Page, repoPath: string): Promise<void> { + await gotoAppShell(page); + await expect( + page + .locator('[data-testid^="sidebar-project-row-"]') + .filter({ hasText: projectNameFromPath(repoPath) }) + .first(), + ).toBeVisible({ timeout: 30_000 }); +} + +function createWorkspaceButton(page: Page, repoPath: string) { + return page.getByRole("button", { + name: `Create a new workspace for ${projectNameFromPath(repoPath)}`, + }); +} + +async function revealWorkspaceButton(page: Page, repoPath: string): Promise<void> { + await page + .locator('[data-testid^="sidebar-project-row-"]') + .filter({ hasText: projectNameFromPath(repoPath) }) + .first() + .hover(); +} + +export async function createWorkspaceFromSidebar(page: Page, repoPath: string): Promise<void> { + const button = createWorkspaceButton(page, repoPath); + await revealWorkspaceButton(page, repoPath); + await expect(button).toBeVisible({ timeout: 30_000 }); + await expect(button).toBeEnabled({ timeout: 30_000 }); + await button.click(); + await expect(page).toHaveURL(/\/new\?/, { timeout: 30_000 }); + await expect(page.getByRole("textbox", { name: "Message agent..." }).first()).toBeVisible({ timeout: 30_000 }); +} + +export async function getCurrentWorkspaceIdFromRoute(page: Page): Promise<string> { + await expect + .poll( + () => parseHostWorkspaceRouteFromPathname(new URL(page.url()).pathname)?.workspaceId ??
null, + { timeout: 30_000 }, + ) + .not.toBeNull(); + + const workspaceId = + parseHostWorkspaceRouteFromPathname(new URL(page.url()).pathname)?.workspaceId ?? null; + if (!workspaceId) { + throw new Error(`Expected a workspace route but found ${page.url()}`); + } + + return workspaceId; +} + +function workspaceSetupDialog(page: Page) { + return page.getByTestId("workspace-setup-dialog"); +} + +export async function createChatAgentFromWorkspaceSetup( + page: Page, + input: { message: string }, +): Promise<void> { + const messageInput = page.getByRole("textbox", { name: "Message agent..." }).first(); + await expect(messageInput).toBeVisible({ timeout: 15_000 }); + await messageInput.fill(input.message); + await messageInput.press("Enter"); +} + +/** + * @deprecated The new workspace screen no longer has a standalone terminal button. + * Use the daemon API to create a workspace, then open a terminal from the launcher. + */ +export async function createStandaloneTerminalFromWorkspaceSetup(page: Page): Promise<void> { + await workspaceSetupDialog(page) + .getByRole("button", { name: /^Terminal Create the workspace/i }) + .click(); +} + +export async function waitForWorkspaceSetupDialogToClose(page: Page, timeoutMs = 45_000): Promise<void> { + const dialog = workspaceSetupDialog(page); + + try { + await expect(dialog).toHaveCount(0, { timeout: timeoutMs }); + } catch (error) { + const dialogText = (await dialog.textContent().catch(() => null))?.replace(/\s+/g, " ").trim(); + throw new Error( + dialogText + ? `Workspace setup dialog stayed open. Visible text: ${dialogText}` + : `Workspace setup dialog did not close within ${timeoutMs}ms`, + { cause: error }, + ); + } +} + +export async function expectSetupPanel(page: Page): Promise<void> { + // If the setup panel is already visible (auto-opened), we're done.
+ const panel = page.getByTestId("workspace-setup-panel"); + if (await panel.isVisible().catch(() => false)) { + return; + } + // Otherwise open it manually via workspace header actions menu. + // Use the specific testID to avoid matching the sidebar kebab which shares + // the same "Workspace actions" accessibility label. + const actionsButton = page.getByTestId("workspace-header-menu-trigger"); + await expect(actionsButton).toBeVisible({ timeout: 10_000 }); + await actionsButton.click(); + const showSetup = page.getByTestId("workspace-header-show-setup"); + await expect(showSetup).toBeVisible({ timeout: 5_000 }); + await showSetup.click(); + await expect(panel).toBeVisible({ timeout: 30_000 }); +} + +export async function expectSetupStatus( + page: Page, + status: "Running" | "Completed" | "Failed", +): Promise<void> { + await expect(page.getByTestId("workspace-setup-status")).toContainText(status, { + timeout: 30_000, + }); +} + +export async function expectSetupLogContains(page: Page, text: string): Promise<void> { + await expect(page.getByTestId("workspace-setup-log")).toContainText(text, { + timeout: 30_000, + }); +} + +export async function expectNoSetupMessage(page: Page): Promise<void> { + await expect(page.getByText("No setup commands ran for this workspace.", { exact: true })).toBeVisible({ + timeout: 30_000, + }); +} + +export async function createWorkspaceThroughDaemon( + client: WorkspaceSetupDaemonClient, + input: { cwd: string; worktreeSlug: string }, +): Promise<{ id: string; name: string }> { + const result = await client.createPaseoWorktree(input); + if (!result.workspace || result.error) { + throw new Error(result.error ??
`Failed to create workspace for ${input.cwd}`); + } + return { + id: String(result.workspace.id), + name: result.workspace.name, + }; +} + +export async function findWorktreeWorkspaceForProject( + client: WorkspaceSetupDaemonClient, + repoPath: string, +): Promise<{ + id: string; + name: string; + projectRootPath: string; + workspaceDirectory: string; +}> { + const payload = await client.fetchWorkspaces(); + const normalizedRepoPath = realpathSync(repoPath); + const workspace = + payload.entries.find( + (entry) => + entry.projectRootPath === normalizedRepoPath && entry.workspaceDirectory !== normalizedRepoPath, + ) ?? null; + if (!workspace) { + throw new Error(`Failed to find created worktree workspace for ${repoPath}`); + } + return { + id: String(workspace.id), + name: workspace.name, + projectRootPath: workspace.projectRootPath, + workspaceDirectory: workspace.workspaceDirectory, + }; +} + +export async function fetchWorkspaceById( + client: WorkspaceSetupDaemonClient, + workspaceId: string, +): Promise<{ + id: number; + name: string; + workspaceDirectory: string; + projectRootPath: string; +}> { + const parsedWorkspaceId = Number(workspaceId); + if (!Number.isInteger(parsedWorkspaceId)) { + throw new Error(`Workspace id is not numeric: ${workspaceId}`); + } + + const payload = await client.fetchWorkspaces(); + const workspace = payload.entries.find((entry) => entry.id === parsedWorkspaceId) ?? 
null; + if (!workspace) { + throw new Error(`Workspace not found: ${workspaceId}`); + } + return workspace; +} + +export async function waitForWorkspaceSetupProgress( + client: WorkspaceSetupDaemonClient, + predicate: (payload: WorkspaceSetupProgressPayload) => boolean, + timeoutMs = 30_000, +): Promise<WorkspaceSetupProgressPayload> { + return new Promise((resolve, reject) => { + const timeout = setTimeout(() => { + unsubscribe(); + reject(new Error(`Timed out waiting for workspace_setup_progress after ${timeoutMs}ms`)); + }, timeoutMs); + + const unsubscribe = client.subscribeRawMessages((message) => { + if (message.type !== "workspace_setup_progress") { + return; + } + if (!predicate(message.payload)) { + return; + } + clearTimeout(timeout); + unsubscribe(); + resolve(message.payload); + }); + }); +} diff --git a/packages/app/e2e/helpers/workspace.ts b/packages/app/e2e/helpers/workspace.ts index d49d67a87..28f4d3365 100644 --- a/packages/app/e2e/helpers/workspace.ts +++ b/packages/app/e2e/helpers/workspace.ts @@ -1,5 +1,5 @@ import { execSync } from "node:child_process"; -import { mkdtemp, writeFile, rm, mkdir } from "node:fs/promises"; +import { mkdtemp, writeFile, rm, mkdir, realpath } from "node:fs/promises"; import { tmpdir } from "node:os"; import path from "node:path"; @@ -10,10 +10,15 @@ type TempRepo = { export const createTempGitRepo = async ( prefix = "paseo-e2e-", - options?: { withRemote?: boolean }, + options?: { + withRemote?: boolean; + paseoConfig?: Record<string, unknown>; + files?: Array<{ path: string; content: string }>; + }, ): Promise<TempRepo> => { // Keep E2E repo paths short so terminal prompt + typed commands stay visible without zsh clipping. - const tempRoot = process.platform === "win32" ? tmpdir() : "/tmp"; + // Resolve symlinks (macOS: /tmp → /private/tmp) so paths match the daemon's resolved paths. + const tempRoot = process.platform === "win32" ? tmpdir() : await realpath("/tmp"); const repoPath = await mkdtemp(path.join(tempRoot, prefix)); const withRemote = options?.withRemote ??
false; @@ -22,7 +27,24 @@ export const createTempGitRepo = async ( execSync('git config user.name "Paseo E2E"', { cwd: repoPath, stdio: "ignore" }); execSync("git config commit.gpgsign false", { cwd: repoPath, stdio: "ignore" }); await writeFile(path.join(repoPath, "README.md"), "# Temp Repo\n"); + if (options?.paseoConfig) { + await writeFile( + path.join(repoPath, "paseo.json"), + JSON.stringify(options.paseoConfig, null, 2), + ); + } + for (const file of options?.files ?? []) { + const filePath = path.join(repoPath, file.path); + await mkdir(path.dirname(filePath), { recursive: true }); + await writeFile(filePath, file.content); + } execSync("git add README.md", { cwd: repoPath, stdio: "ignore" }); + if (options?.paseoConfig) { + execSync("git add paseo.json", { cwd: repoPath, stdio: "ignore" }); + } + for (const file of options?.files ?? []) { + execSync(`git add ${JSON.stringify(file.path)}`, { cwd: repoPath, stdio: "ignore" }); + } execSync('git commit -m "Initial commit"', { cwd: repoPath, stdio: "ignore" }); if (withRemote) { diff --git a/packages/app/e2e/launcher-tab.spec.ts b/packages/app/e2e/launcher-tab.spec.ts new file mode 100644 index 000000000..34211cf48 --- /dev/null +++ b/packages/app/e2e/launcher-tab.spec.ts @@ -0,0 +1,261 @@ +import { test, expect } from "./fixtures"; +import { createTempGitRepo } from "./helpers/workspace"; +import { + gotoWorkspace, + assertNewChatTileVisible, + assertTerminalTileVisible, + assertSingleNewTabButton, + clickNewTabButton, + pressNewTabShortcut, + clickNewChat, + clickTerminal, + countTabsOfKind, + getTabTestIds, + waitForTabWithTitle, + measureTileTransition, + sampleTabsDuringTransition, +} from "./helpers/launcher"; +import { + connectTerminalClient, + waitForTerminalContent, + setupDeterministicPrompt, + type TerminalPerfDaemonClient, +} from "./helpers/terminal-perf"; + +// ─── Shared state ────────────────────────────────────────────────────────── + +let tempRepo: { path: string; cleanup: () => Promise<void> };
+let workspaceId: string; +let seedClient: TerminalPerfDaemonClient; + +test.beforeAll(async () => { + tempRepo = await createTempGitRepo("launcher-e2e-"); + seedClient = await connectTerminalClient(); + const result = await seedClient.openProject(tempRepo.path); + if (!result.workspace) throw new Error(result.error ?? "Failed to seed workspace"); + workspaceId = String(result.workspace.id); +}); + +test.afterAll(async () => { + if (seedClient) await seedClient.close(); + if (tempRepo) await tempRepo.cleanup(); +}); + +// ═══════════════════════════════════════════════════════════════════════════ +// Tab Creation Tests +// ═══════════════════════════════════════════════════════════════════════════ + +test.describe("Tab creation", () => { + test("Cmd+T opens a new agent tab with composer", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + + await pressNewTabShortcut(page); + + // Should show the composer directly (no launcher panel) + const composer = page.getByRole("textbox", { name: "Message agent..." }); + await expect(composer.first()).toBeVisible({ timeout: 15_000 }); + }); + + test("opening two new tabs creates two draft tabs", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + + await pressNewTabShortcut(page); + const countAfterFirst = await countTabsOfKind(page, "draft"); + + await pressNewTabShortcut(page); + await expect + .poll(() => countTabsOfKind(page, "draft")) + .toBe(countAfterFirst + 1); + }); + + test("clicking new agent tab creates a draft tab", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + + await clickNewTabButton(page); + + // Draft composer should appear (the agent message input) + const composer = page.getByRole("textbox", { name: "Message agent..." 
}); + await expect(composer.first()).toBeVisible({ timeout: 15_000 }); + + const tabsAfter = await getTabTestIds(page); + const draftCountAfter = tabsAfter.filter((id) => id.includes("draft")).length; + expect(draftCountAfter).toBeGreaterThanOrEqual(1); + }); + + test("clicking terminal button creates a standalone terminal", async ({ page }) => { + test.setTimeout(45_000); + await gotoWorkspace(page, workspaceId); + + await clickTerminal(page); + + // Terminal surface should appear + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + + const tabsAfter = await getTabTestIds(page); + const terminalTabs = tabsAfter.filter((id) => id.includes("terminal")); + expect(terminalTabs.length).toBeGreaterThanOrEqual(1); + }); + + test("tab bar shows action buttons per pane", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + await assertSingleNewTabButton(page); + await assertNewChatTileVisible(page); + await assertTerminalTileVisible(page); + }); +}); + +// ═══════════════════════════════════════════════════════════════════════════ +// Terminal Title Tests +// ═══════════════════════════════════════════════════════════════════════════ + +test.describe("Terminal title propagation", () => { + // OSC title escape sequence propagation is inherently flaky — the terminal + // must process the sequence, emit a title change event, and the tab bar + // must re-render before the assertion deadline. Allow retries. 
+ test.describe.configure({ retries: 2 }); + + let client: TerminalPerfDaemonClient; + + test.beforeAll(async () => { + client = await connectTerminalClient(); + }); + + test.afterAll(async () => { + if (client) await client.close(); + }); + + test.skip("terminal tab title updates from OSC title escape sequence", async ({ page }) => { + test.setTimeout(60_000); + + const result = await client.createTerminal(tempRepo.path, "title-test"); + if (!result.terminal) throw new Error(`Failed to create terminal: ${result.error}`); + const terminalId = result.terminal.id; + + try { + // Navigate to workspace and open a terminal + await gotoWorkspace(page, workspaceId); + await clickTerminal(page); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + await terminal.first().click(); + + await setupDeterministicPrompt(page); + + // Send OSC 0 (set window title) escape sequence + const testTitle = `E2E-Title-${Date.now()}`; + await terminal + .first() + .pressSequentially(`printf '\\033]0;${testTitle}\\007'\n`, { delay: 0 }); + + // Wait for the tab to reflect the new title + await waitForTabWithTitle(page, testTitle, 15_000); + } finally { + await client.killTerminal(terminalId).catch(() => {}); + } + }); + + test.skip("title debouncing coalesces rapid changes", async ({ page }) => { + test.setTimeout(60_000); + + const result = await client.createTerminal(tempRepo.path, "debounce-test"); + if (!result.terminal) throw new Error(`Failed to create terminal: ${result.error}`); + const terminalId = result.terminal.id; + + try { + await gotoWorkspace(page, workspaceId); + await clickTerminal(page); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + await terminal.first().click(); + + await setupDeterministicPrompt(page); + + // Fire many rapid title changes — only the last should stick + const finalTitle = 
`Final-${Date.now()}`; + for (let i = 0; i < 5; i++) { + await terminal + .first() + .pressSequentially(`printf '\\033]0;Rapid-${i}\\007'\n`, { delay: 0 }); + } + await terminal + .first() + .pressSequentially(`printf '\\033]0;${finalTitle}\\007'\n`, { delay: 0 }); + + // The tab should eventually settle on the final title + await waitForTabWithTitle(page, finalTitle, 15_000); + } finally { + await client.killTerminal(terminalId).catch(() => {}); + } + }); +}); + +// ═══════════════════════════════════════════════════════════════════════════ +// No-Flash Transition Tests +// ═══════════════════════════════════════════════════════════════════════════ + +test.describe("Tab transitions (no flash)", () => { + test("New agent tab transition has no blank intermediate tab state", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + + // Sample tabs at high frequency across the transition + const snapshots = await sampleTabsDuringTransition( + page, + () => clickNewChat(page), + 2_000, + 30, + ); + + // Every snapshot should have at least one tab — no blank/zero-tab frames + for (const snapshot of snapshots) { + expect(snapshot.length).toBeGreaterThanOrEqual(1); + } + + // Tab count should never spike excessively (no duplicate flash from add-then-remove). + // When running in-suite, previous tests may have created tabs on the shared workspace, + // so we allow +2 tolerance for accumulated state and React render batching. + const counts = snapshots.map((s) => s.length); + const maxCount = Math.max(...counts); + const initialCount = counts[0] ?? 
0; + + expect(maxCount).toBeLessThanOrEqual(initialCount + 2); + }); + + test("Terminal transition completes within visual budget", async ({ page }) => { + test.setTimeout(30_000); + await gotoWorkspace(page, workspaceId); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + const elapsed = await measureTileTransition( + page, + () => clickTerminal(page), + terminal.first(), + 20_000, + ); + + // Terminal surface should appear within a reasonable budget. + // Note: terminal creation involves a server round-trip, so we allow more time + // than a pure in-memory transition, but it should still be well under 5 seconds. + expect(elapsed).toBeLessThan(5_000); + }); + + test("New agent tab click shows composer without flash", async ({ page }) => { + await gotoWorkspace(page, workspaceId); + + const composer = page.getByRole("textbox", { name: "Message agent..." }).first(); + + const elapsed = await measureTileTransition( + page, + () => clickNewChat(page), + composer, + 10_000, + ); + + // Draft creation is fully in-memory — should be fast + // We use a generous budget here because CI can be slow, but the key assertion + // is that no blank/flash frame appears (tested above). 
+ expect(elapsed).toBeLessThan(3_000); + }); +}); diff --git a/packages/app/e2e/sidebar-workspace.spec.ts b/packages/app/e2e/sidebar-workspace.spec.ts new file mode 100644 index 000000000..fd77319d9 --- /dev/null +++ b/packages/app/e2e/sidebar-workspace.spec.ts @@ -0,0 +1,182 @@ +import { execSync } from "node:child_process"; +import { mkdtemp, realpath, rm, writeFile } from "node:fs/promises"; +import { tmpdir } from "node:os"; +import path from "node:path"; +import { test, expect } from "./fixtures"; +import { gotoAppShell } from "./helpers/app"; +import { createTempGitRepo } from "./helpers/workspace"; +import { expectWorkspaceHeader } from "./helpers/workspace-ui"; +import { connectWorkspaceSetupClient } from "./helpers/workspace-setup"; + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set (expected from Playwright globalSetup)."); + } + return serverId; +} + +function getWorkspaceRowTestId(workspaceId: string): string { + return `sidebar-workspace-row-${getServerId()}:${workspaceId}`; +} + +function escapeRegex(value: string): string { + return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"); +} + +function setGitHubRemote(repoPath: string): void { + execSync("git remote set-url origin https://github.com/test-owner/test-repo.git", { + cwd: repoPath, + stdio: "ignore", + }); +} + +async function createTempDirectory(prefix = "paseo-e2e-dir-") { + const tempRoot = process.platform === "win32" ? 
tmpdir() : await realpath("/tmp"); + const dirPath = await mkdtemp(path.join(tempRoot, prefix)); + await writeFile(path.join(dirPath, "README.md"), "# Temp Directory\n"); + return { + path: dirPath, + cleanup: async () => { + await rm(dirPath, { recursive: true, force: true }); + }, + }; +} + +async function openProjectViaDaemon( + client: Awaited<ReturnType<typeof connectWorkspaceSetupClient>>, + cwd: string, +): Promise<{ id: string; name: string }> { + const result = await client.openProject(cwd); + if (!result.workspace || result.error) { + throw new Error(result.error ?? `Failed to open project ${cwd}`); + } + return { + id: String(result.workspace.id), + name: result.workspace.name, + }; +} + +async function openWorkspaceFromSidebar(page: import("@playwright/test").Page, workspaceId: string) { + const row = page.getByTestId(getWorkspaceRowTestId(workspaceId)); + await expect(row).toBeVisible({ timeout: 30_000 }); + await row.click(); + await expect(page).toHaveURL(/\/workspace\//, { timeout: 30_000 }); + return row; +} + +async function waitForSidebarProject( + page: import("@playwright/test").Page, + projectName: string, +) { + const row = page + .getByRole("button", { + name: new RegExp(escapeRegex(projectName), "i"), + }) + .first(); + await expect(row).toBeVisible({ timeout: 30_000 }); + return row; +} + +async function waitForSidebarWorkspace(page: import("@playwright/test").Page, workspaceId: string) { + const row = page.getByTestId(getWorkspaceRowTestId(workspaceId)); + await expect(row).toBeVisible({ timeout: 30_000 }); + return row; +} + +test.describe("Sidebar workspace list", () => { + test("project with GitHub remote shows owner/repo name in sidebar", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("sidebar-remote-", { withRemote: true }); + + try { + setGitHubRemote(repo.path); + const workspace = await openProjectViaDaemon(client, repo.path); + await gotoAppShell(page); + await waitForSidebarProject(page,
"test-owner/test-repo"); + await waitForSidebarWorkspace(page, workspace.id); + + const projectRow = page + .locator('[data-testid^="sidebar-project-row-"]') + .filter({ hasText: "test-owner/test-repo" }) + .first(); + + await expect(projectRow).toBeVisible({ timeout: 30_000 }); + await expect(projectRow).not.toContainText(path.basename(repo.path)); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("project shows workspace under it", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("sidebar-workspace-under-project-"); + + try { + const workspace = await openProjectViaDaemon(client, repo.path); + await gotoAppShell(page); + + await waitForSidebarProject(page, path.basename(repo.path)); + await waitForSidebarWorkspace(page, workspace.id); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("non-git project shows directory name", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const project = await createTempDirectory("sidebar-directory-"); + + try { + await openProjectViaDaemon(client, project.path); + await gotoAppShell(page); + + const projectRow = await waitForSidebarProject(page, path.basename(project.path)); + await expect(projectRow).toContainText(path.basename(project.path)); + } finally { + await client.close(); + await project.cleanup(); + } + }); + + test("workspace header shows correct title and subtitle", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("sidebar-header-", { withRemote: true }); + + try { + setGitHubRemote(repo.path); + const workspace = await openProjectViaDaemon(client, repo.path); + await gotoAppShell(page); + await waitForSidebarProject(page, "test-owner/test-repo"); + await waitForSidebarWorkspace(page, workspace.id); + await openWorkspaceFromSidebar(page, workspace.id); + + await expectWorkspaceHeader(page, { + 
title: workspace.name, + subtitle: "test-owner/test-repo", + }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("git project shows branch name in workspace row", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("sidebar-branch-"); + + try { + const workspace = await openProjectViaDaemon(client, repo.path); + await gotoAppShell(page); + await waitForSidebarProject(page, path.basename(repo.path)); + + expect(workspace.name).toBe("main"); + await expect(await waitForSidebarWorkspace(page, workspace.id)).toContainText("main"); + } finally { + await client.close(); + await repo.cleanup(); + } + }); +}); diff --git a/packages/app/e2e/terminal-performance.spec.ts b/packages/app/e2e/terminal-performance.spec.ts index da03858a4..9490343cb 100644 --- a/packages/app/e2e/terminal-performance.spec.ts +++ b/packages/app/e2e/terminal-performance.spec.ts @@ -24,6 +24,9 @@ test.describe("Terminal wire performance", () => { test.beforeAll(async () => { tempRepo = await createTempGitRepo("perf-"); client = await connectTerminalClient(); + // Seed the workspace in the daemon so the app can resolve the path + const seedResult = await client.openProject(tempRepo.path); + if (!seedResult.workspace) throw new Error(seedResult.error ?? 
"Failed to seed workspace"); }); test.afterAll(async () => { diff --git a/packages/app/e2e/workspace-cwd.spec.ts b/packages/app/e2e/workspace-cwd.spec.ts new file mode 100644 index 000000000..d24a77421 --- /dev/null +++ b/packages/app/e2e/workspace-cwd.spec.ts @@ -0,0 +1,136 @@ +import { execSync } from "node:child_process"; +import { realpathSync } from "node:fs"; +import path from "node:path"; +import { expect, test } from "./fixtures"; +import { + clickTerminal, + waitForTabBar, +} from "./helpers/launcher"; +import { + setupDeterministicPrompt, + waitForTerminalContent, +} from "./helpers/terminal-perf"; +import { createTempGitRepo } from "./helpers/workspace"; +import { + connectWorkspaceSetupClient, + openHomeWithProject, + seedProjectForWorkspaceSetup, +} from "./helpers/workspace-setup"; + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set."); + } + return serverId; +} + +/** Navigate to a workspace via sidebar row testID and wait for tab bar. */ +async function navigateToWorkspaceViaSidebar( + page: import("@playwright/test").Page, + workspaceId: string, +): Promise { + const testId = `sidebar-workspace-row-${getServerId()}:${workspaceId}`; + const row = page.getByTestId(testId); + await expect(row).toBeVisible({ timeout: 30_000 }); + await row.click(); + await waitForTabBar(page); +} + +test.describe("Workspace cwd correctness", () => { + test("main checkout workspace opens terminals in the project root", async ({ page }) => { + test.setTimeout(60_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("workspace-cwd-main-"); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + const workspaceResult = await client.openProject(repo.path); + if (!workspaceResult.workspace) { + throw new Error(workspaceResult.error ?? 
`Failed to open project ${repo.path}`); + } + const workspaceId = String(workspaceResult.workspace.id); + + // Use sidebar navigation to avoid Expo Router hydration issues + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspaceId); + await clickTerminal(page); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + await terminal.first().click(); + + await setupDeterministicPrompt(page, `PWD_READY_${Date.now()}`); + await terminal.first().pressSequentially("pwd\n", { delay: 0 }); + + await waitForTerminalContent(page, (text) => text.includes(repo.path), 10_000); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("worktree workspace opens terminals in the worktree directory", async ({ page }) => { + test.setTimeout(90_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("workspace-cwd-worktree-"); + const resolvedTmp = realpathSync("/tmp"); + const worktreePath = path.join( + resolvedTmp, + `paseo-wt-${Date.now()}-${Math.random().toString(36).slice(2)}`, + ); + const branchName = `workspace-cwd-${Date.now()}`; + let worktreeCreated = false; + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + execSync(`git worktree add ${JSON.stringify(worktreePath)} -b ${JSON.stringify(branchName)} main`, { + cwd: repo.path, + stdio: "ignore", + }); + worktreeCreated = true; + + const workspaceResult = await client.openProject(worktreePath); + if (!workspaceResult.workspace) { + throw new Error(workspaceResult.error ?? `Failed to open project ${worktreePath}`); + } + const workspaceName = workspaceResult.workspace.name; + + // Use sidebar navigation to avoid Expo Router hydration issues + // with direct URL navigation to the 2nd+ workspace. 
+ await openHomeWithProject(page, repo.path); + const sidebarWorkspace = page.getByRole("button", { name: workspaceName }); + await expect(sidebarWorkspace).toBeVisible({ timeout: 30_000 }); + await sidebarWorkspace.click(); + await waitForTabBar(page); + + await clickTerminal(page); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + await terminal.first().click(); + + await setupDeterministicPrompt(page, `PWD_READY_${Date.now()}`); + await terminal.first().pressSequentially("pwd\n", { delay: 0 }); + await waitForTerminalContent(page, (text) => text.includes(worktreePath), 10_000); + } finally { + if (worktreeCreated) { + try { + execSync(`git worktree remove ${JSON.stringify(worktreePath)} --force`, { + cwd: repo.path, + stdio: "ignore", + }); + } catch { + // Best-effort cleanup so test failures preserve the original error. + } + } + await client.close(); + await repo.cleanup(); + } + }); + +}); diff --git a/packages/app/e2e/workspace-hover-card.spec.ts b/packages/app/e2e/workspace-hover-card.spec.ts new file mode 100644 index 000000000..91cb72e0d --- /dev/null +++ b/packages/app/e2e/workspace-hover-card.spec.ts @@ -0,0 +1,208 @@ +import { test, expect } from "./fixtures"; +import { createTempGitRepo } from "./helpers/workspace"; +import { waitForWorkspaceTabsVisible } from "./helpers/workspace-tabs"; +import { + connectWorkspaceSetupClient, + createWorkspaceThroughDaemon, + openHomeWithProject, + seedProjectForWorkspaceSetup, + waitForWorkspaceSetupProgress, +} from "./helpers/workspace-setup"; +import type { Page } from "@playwright/test"; + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set."); + } + return serverId; +} + +// --------------------------------------------------------------------------- +// Composable helpers +// 
--------------------------------------------------------------------------- + +/** Waits for the globe icon to appear on a workspace row (proves services are running). */ +async function expectGlobeIcon(page: Page): Promise { + await expect(page.getByTestId("workspace-globe-icon")).toBeVisible({ timeout: 30_000 }); +} + +/** Hovers the workspace row (by visible name) and waits for the hover card to appear. */ +async function expectHoverCard(page: Page, workspaceName: string): Promise { + const row = page.getByRole("button", { name: workspaceName }).first(); + await row.hover(); + await expect(page.getByTestId("workspace-hover-card")).toBeVisible({ timeout: 10_000 }); +} + +/** Asserts that a service row with the given name exists in the hover card. */ +async function expectServiceInCard(page: Page, serviceName: string): Promise { + const card = page.getByTestId("workspace-hover-card"); + await expect(card.getByTestId(`hover-card-service-${serviceName}`)).toBeVisible({ + timeout: 10_000, + }); +} + +/** Asserts the service status dot indicates "running". */ +async function expectServiceRunning(page: Page, serviceName: string): Promise { + const card = page.getByTestId("workspace-hover-card"); + await expect( + card.getByTestId(`hover-card-service-status-${serviceName}`), + ).toHaveAttribute("aria-label", "Running", { timeout: 10_000 }); +} + +/** Asserts the service lifecycle is stopped. */ +async function expectServiceStopped(page: Page, serviceName: string): Promise { + const card = page.getByTestId("workspace-hover-card"); + await expect( + card.getByTestId(`hover-card-service-status-${serviceName}`), + ).toHaveAttribute("aria-label", "Stopped", { timeout: 10_000 }); +} + +/** Asserts the service health label shown in the hover card. 
*/ +async function expectServiceHealth( + page: Page, + serviceName: string, + health: "Healthy" | "Unhealthy" | "Unknown", +): Promise { + const card = page.getByTestId("workspace-hover-card"); + await expect(card.getByTestId(`hover-card-service-health-${serviceName}`)).toHaveAttribute( + "aria-label", + health, + { timeout: 10_000 }, + ); +} + +/** Asserts the hover card contains the workspace name. */ +async function expectWorkspaceNameInCard(page: Page, name: string): Promise { + const card = page.getByTestId("workspace-hover-card"); + await expect(card.getByTestId("hover-card-workspace-name")).toContainText(name, { + timeout: 10_000, + }); +} + +/** Moves the mouse away from the sidebar and asserts the hover card disappears. */ +async function expectHoverCardDismissed(page: Page): Promise { + // Move mouse to the center of the viewport (away from sidebar) + const viewport = page.viewportSize(); + await page.mouse.move((viewport?.width ?? 1280) / 2, (viewport?.height ?? 720) / 2); + await expect(page.getByTestId("workspace-hover-card")).not.toBeVisible({ timeout: 10_000 }); +} + +// --------------------------------------------------------------------------- +// Tests +// --------------------------------------------------------------------------- + +test.describe("Workspace hover card", () => { + test("shows hover card with services when hovering a workspace with running services", async ({ + page, + }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("hovercard-svc-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo bootstrapping; sleep 1; echo setup complete'"], + }, + services: { + web: { + command: + "node -e \"const http = require('http'); const s = http.createServer((q,r) => r.end('ok')); s.listen(process.env.PORT || 3000, () => console.log('listening on ' + s.address().port))\"", + }, + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + // Wait for setup completion 
via daemon (setup snapshots are per-session) + const completed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "completed" && payload.detail.log.includes("setup complete"), + ); + const workspace = await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: `hovercard-${Date.now()}`, + }); + await completed; + + await openHomeWithProject(page, repo.path); + const wsRow = page.getByTestId(`sidebar-workspace-row-${getServerId()}:${workspace.id}`); + await expect(wsRow).toBeVisible({ timeout: 30_000 }); + await wsRow.click(); + await expect(page).toHaveURL(/\/workspace\//, { timeout: 30_000 }); + + await waitForWorkspaceTabsVisible(page); + + // Wait for the globe icon — proves services are running and client has the data + await expectGlobeIcon(page); + + // Hover the workspace row — hover card should appear + await expectHoverCard(page, workspace.name); + + // Assert the card shows the workspace name + await expectWorkspaceNameInCard(page, workspace.name); + + // Assert the "web" service entry exists in the card + await expectServiceInCard(page, "web"); + + // Assert the status dot shows "running" + await expectServiceRunning(page, "web"); + + // Assert the service row is a link (has role="link") + const card = page.getByTestId("workspace-hover-card"); + const serviceLink = card.getByRole("link", { name: "web service" }); + await expect(serviceLink).toBeVisible({ timeout: 10_000 }); + + // Move mouse away — card should dismiss + await expectHoverCardDismissed(page); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("shows stopped services and starts them from the hover card", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("hovercard-start-", { + paseoConfig: { + services: { + web: { + command: + "node -e \"const http = require('http'); const s = http.createServer((q,r) => r.end('ok')); s.listen(process.env.PORT || 3000, 
'127.0.0.1', () => console.log('listening on ' + s.address().port))\"", + }, + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const workspace = await client.openProject(repo.path); + if (!workspace.workspace || workspace.error) { + throw new Error(workspace.error ?? `Failed to open project ${repo.path}`); + } + + await openHomeWithProject(page, repo.path); + const wsRow = page.getByTestId(`sidebar-workspace-row-${getServerId()}:${workspace.workspace.id}`); + await expect(wsRow).toBeVisible({ timeout: 30_000 }); + + await expectHoverCard(page, workspace.workspace.name); + await expectWorkspaceNameInCard(page, workspace.workspace.name); + await expectServiceInCard(page, "web"); + await expectServiceStopped(page, "web"); + await expectServiceHealth(page, "web", "Unknown"); + + const card = page.getByTestId("workspace-hover-card"); + const startButton = card.getByTestId("hover-card-service-start-web"); + await expect(startButton).toBeVisible({ timeout: 10_000 }); + await startButton.click(); + + await expectServiceRunning(page, "web"); + await expectServiceHealth(page, "web", "Healthy"); + await expect(card.getByRole("link", { name: "web service" })).toBeVisible({ timeout: 10_000 }); + await expect(startButton).not.toBeVisible({ timeout: 10_000 }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); +}); diff --git a/packages/app/e2e/workspace-lifecycle.spec.ts b/packages/app/e2e/workspace-lifecycle.spec.ts new file mode 100644 index 000000000..86b058e8f --- /dev/null +++ b/packages/app/e2e/workspace-lifecycle.spec.ts @@ -0,0 +1,189 @@ +import { execSync } from "node:child_process"; +import { realpathSync } from "node:fs"; +import path from "node:path"; +import { expect, test } from "./fixtures"; +import { waitForTabBar } from "./helpers/launcher"; +import { createTempGitRepo } from "./helpers/workspace"; +import { + createAgentChatFromLauncher, + createStandaloneTerminalFromLauncher, + expectTerminalCwd, +} from 
"./helpers/workspace-lifecycle"; +import { + connectWorkspaceSetupClient, + openHomeWithProject, + seedProjectForWorkspaceSetup, +} from "./helpers/workspace-setup"; + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set."); + } + return serverId; +} + +/** Navigate to a workspace via sidebar row testID and wait for the tab bar. */ +async function navigateToWorkspaceViaSidebar( + page: import("@playwright/test").Page, + workspaceId: string, +): Promise { + const testId = `sidebar-workspace-row-${getServerId()}:${workspaceId}`; + const row = page.getByTestId(testId); + await expect(row).toBeVisible({ timeout: 30_000 }); + await row.click(); + await waitForTabBar(page); +} + +test.describe("Workspace lifecycle", () => { + // The first test after a spec-file switch can intermittently fail because + // the shared daemon still holds stale sessions from the previous spec. + // One retry is enough for the daemon to stabilize. + test.describe.configure({ retries: 1 }); + + test.describe("Main checkout", () => { + test("creates an agent chat via New Chat", async ({ page }) => { + test.setTimeout(60_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("lifecycle-main-chat-"); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const workspaceResult = await client.openProject(repo.path); + if (!workspaceResult.workspace) { + throw new Error(workspaceResult.error ?? 
`Failed to open project ${repo.path}`); + } + const workspaceId = String(workspaceResult.workspace.id); + + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspaceId); + await createAgentChatFromLauncher(page); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("creates a terminal with correct CWD", async ({ page }) => { + test.setTimeout(60_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("lifecycle-main-shell-"); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const workspaceResult = await client.openProject(repo.path); + if (!workspaceResult.workspace) { + throw new Error(workspaceResult.error ?? `Failed to open project ${repo.path}`); + } + const workspaceId = String(workspaceResult.workspace.id); + + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspaceId); + await createStandaloneTerminalFromLauncher(page); + await expectTerminalCwd(page, repo.path); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + }); + + test.describe("Worktree workspace", () => { + test("creates an agent chat via New Chat", async ({ page }) => { + test.setTimeout(90_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("lifecycle-wt-chat-"); + const resolvedTmp = realpathSync("/tmp"); + const worktreePath = path.join( + resolvedTmp, + `paseo-wt-${Date.now()}-${Math.random().toString(36).slice(2)}`, + ); + const branchName = `lifecycle-wt-chat-${Date.now()}`; + let worktreeCreated = false; + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + execSync(`git worktree add ${JSON.stringify(worktreePath)} -b ${JSON.stringify(branchName)} main`, { + cwd: repo.path, + stdio: "ignore", + }); + worktreeCreated = true; + + const workspaceResult = await client.openProject(worktreePath); + if (!workspaceResult.workspace) { + 
throw new Error(workspaceResult.error ?? `Failed to open project ${worktreePath}`); + } + const workspaceId = String(workspaceResult.workspace.id); + + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspaceId); + await createAgentChatFromLauncher(page); + } finally { + if (worktreeCreated) { + try { + execSync(`git worktree remove ${JSON.stringify(worktreePath)} --force`, { + cwd: repo.path, + stdio: "ignore", + }); + } catch { + // Best-effort cleanup so test failures preserve the original error. + } + } + await client.close(); + await repo.cleanup(); + } + }); + + test("creates a terminal with correct CWD", async ({ page }) => { + test.setTimeout(90_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("lifecycle-wt-shell-"); + const resolvedTmp = realpathSync("/tmp"); + const worktreePath = path.join( + resolvedTmp, + `paseo-wt-${Date.now()}-${Math.random().toString(36).slice(2)}`, + ); + const branchName = `lifecycle-wt-shell-${Date.now()}`; + let worktreeCreated = false; + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + execSync(`git worktree add ${JSON.stringify(worktreePath)} -b ${JSON.stringify(branchName)} main`, { + cwd: repo.path, + stdio: "ignore", + }); + worktreeCreated = true; + + const workspaceResult = await client.openProject(worktreePath); + if (!workspaceResult.workspace) { + throw new Error(workspaceResult.error ?? 
`Failed to open project ${worktreePath}`); + } + const workspaceId = String(workspaceResult.workspace.id); + + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspaceId); + await createStandaloneTerminalFromLauncher(page); + await expectTerminalCwd(page, worktreePath); + } finally { + if (worktreeCreated) { + try { + execSync(`git worktree remove ${JSON.stringify(worktreePath)} --force`, { + cwd: repo.path, + stdio: "ignore", + }); + } catch { + // Best-effort cleanup so test failures preserve the original error. + } + } + await client.close(); + await repo.cleanup(); + } + }); + }); +}); diff --git a/packages/app/e2e/workspace-setup-runtime.spec.ts b/packages/app/e2e/workspace-setup-runtime.spec.ts new file mode 100644 index 000000000..1f2820552 --- /dev/null +++ b/packages/app/e2e/workspace-setup-runtime.spec.ts @@ -0,0 +1,99 @@ +import { existsSync } from "node:fs"; +import { expect, test } from "./fixtures"; +import { createTempGitRepo } from "./helpers/workspace"; +import { + clickTerminal, + waitForTabBar, +} from "./helpers/launcher"; +import { + connectWorkspaceSetupClient, + createWorkspaceThroughDaemon, + findWorktreeWorkspaceForProject, + openHomeWithProject, +} from "./helpers/workspace-setup"; + +test.describe("Workspace setup runtime authority", () => { + test.describe.configure({ retries: 1 }); + + test("worktree workspace is created in its own directory", async ({ page }) => { + test.setTimeout(90_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("workspace-setup-chat-"); + + try { + await client.openProject(repo.path); + const workspace = await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: `setup-chat-${Date.now()}`, + }); + + const wsInfo = await findWorktreeWorkspaceForProject(client, repo.path); + expect(wsInfo.workspaceDirectory).not.toBe(repo.path); + expect(existsSync(wsInfo.workspaceDirectory)).toBe(true); + + // Navigate 
to the workspace via sidebar + await openHomeWithProject(page, repo.path); + const wsButton = page.getByRole("button", { name: workspace.name }); + await expect(wsButton).toBeVisible({ timeout: 30_000 }); + await wsButton.click(); + await expect(page).toHaveURL(/\/workspace\//, { timeout: 30_000 }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("first terminal opens in the created workspace directory", async ({ page }) => { + test.setTimeout(90_000); + + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("workspace-setup-terminal-"); + + try { + await client.openProject(repo.path); + + // Create workspace via daemon API since the new workspace screen + // no longer has a standalone terminal button + const worktreeSlug = `setup-terminal-${Date.now()}`; + const result = await client.createPaseoWorktree({ + cwd: repo.path, + worktreeSlug, + }); + if (!result.workspace || result.error) { + throw new Error(result.error ?? "Failed to create workspace"); + } + const workspaceDir = result.workspace.workspaceDirectory; + const workspaceName = result.workspace.name; + + // Navigate to the worktree workspace via sidebar click (direct URL + // navigation for freshly created worktree workspaces can race with + // Expo Router hydration, so we use the sidebar which is authoritative). 
+ await openHomeWithProject(page, repo.path); + const sidebarWorkspace = page.getByRole("button", { name: workspaceName }); + await expect(sidebarWorkspace).toBeVisible({ timeout: 30_000 }); + await sidebarWorkspace.click(); + await waitForTabBar(page); + + await clickTerminal(page); + + const terminal = page.locator('[data-testid="terminal-surface"]'); + await expect(terminal.first()).toBeVisible({ timeout: 20_000 }); + + // Verify terminal is listed under the worktree directory, not the original repo + await expect + .poll( + async () => + (await client.listTerminals(workspaceDir)).terminals.length > 0, + { timeout: 30_000 }, + ) + .toBe(true); + expect( + (await client.listTerminals(repo.path)).terminals.length, + ).toBe(0); + } finally { + await client.close(); + await repo.cleanup(); + } + }); +}); diff --git a/packages/app/e2e/workspace-setup-streaming.spec.ts b/packages/app/e2e/workspace-setup-streaming.spec.ts new file mode 100644 index 000000000..27624da64 --- /dev/null +++ b/packages/app/e2e/workspace-setup-streaming.spec.ts @@ -0,0 +1,327 @@ +import { test, expect } from "./fixtures"; +import { createTempGitRepo } from "./helpers/workspace"; +import { waitForWorkspaceTabsVisible } from "./helpers/workspace-tabs"; +import { + connectWorkspaceSetupClient, + createWorkspaceThroughDaemon, + expectSetupPanel, + openHomeWithProject, + seedProjectForWorkspaceSetup, + waitForWorkspaceSetupProgress, +} from "./helpers/workspace-setup"; + +function getServerId(): string { + const serverId = process.env.E2E_SERVER_ID; + if (!serverId) { + throw new Error("E2E_SERVER_ID is not set."); + } + return serverId; +} + +/** Click the sidebar row for a workspace (by ID) and wait for navigation. 
*/ +async function navigateToWorkspaceViaSidebar( + page: import("@playwright/test").Page, + workspaceId: string, +): Promise { + const testId = `sidebar-workspace-row-${getServerId()}:${workspaceId}`; + const row = page.getByTestId(testId); + await expect(row).toBeVisible({ timeout: 30_000 }); + await row.click(); + await expect(page).toHaveURL(/\/workspace\//, { timeout: 30_000 }); +} + +test.describe("Workspace setup streaming", () => { + test("opens the setup tab when a workspace is created from the sidebar", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-open-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo starting setup; for i in $(seq 1 30); do echo tick $i; sleep 1; done; echo setup complete'"], + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const workspace = await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: `setup-open-${Date.now()}`, + }); + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspace.id); + + await expectSetupPanel(page); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("runs setup through the sidebar and leaves the workspace usable", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-ui-flow-", { + paseoConfig: { + worktree: { + setup: [ + "sh -c 'echo starting setup; sleep 1; echo loading dependencies; sleep 1; echo setup complete'", + ], + }, + }, + files: [{ path: "src/index.ts", content: "export const ready = true;\n" }], + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + // Wait for setup completion via daemon (setup snapshots are per-session, + // so the browser session won't receive progress events). 
+ const completed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "completed" && payload.detail.log.includes("setup complete"), + ); + const workspace = await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: `setup-ui-flow-${Date.now()}`, + }); + await completed; + + // Navigate to workspace and verify it's usable + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspace.id); + + await waitForWorkspaceTabsVisible(page); + await page.getByTestId("workspace-new-agent-tab").first().click(); + await expect(page.getByRole("textbox", { name: "Message agent..." }).first()).toBeVisible({ + timeout: 30_000, + }); + + const explorerToggle = page.getByTestId("workspace-explorer-toggle").first(); + if ((await explorerToggle.getAttribute("aria-label")) === "Open explorer") { + await explorerToggle.click(); + } + await expect(explorerToggle).toHaveAttribute("aria-label", "Close explorer", { + timeout: 30_000, + }); + await page.getByTestId("explorer-tab-files").click(); + await expect(page.getByTestId("file-explorer-tree-scroll")).toBeVisible({ timeout: 30_000 }); + await expect(page.getByText("README.md", { exact: true }).first()).toBeVisible({ + timeout: 30_000, + }); + await expect(page.getByText("src", { exact: true }).first()).toBeVisible({ + timeout: 30_000, + }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("streams running and completed setup snapshots for a successful setup", async () => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-success-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo starting setup; sleep 2; echo setup complete'"], + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const initialRunning = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "running" && payload.detail.log === "", + ); + const 
runningWithOutput = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "running" && payload.detail.log.includes("starting setup"), + ); + const completed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "completed" && payload.detail.log.includes("setup complete"), + ); + + await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: "workspace-setup-success", + }); + + const initialPayload = await initialRunning; + const runningPayload = await runningWithOutput; + const completedPayload = await completed; + + expect(initialPayload.detail.log).toBe(""); + expect(runningPayload.detail.log).toContain("starting setup"); + expect(completedPayload.detail.log).toContain("setup complete"); + expect(completedPayload.error).toBeNull(); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("streams a failed setup snapshot when setup fails", async () => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-failure-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo starting setup; sleep 2; echo setup failed 1>&2; exit 1'"], + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const failed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "failed" && payload.detail.log.includes("setup failed"), + ); + + await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: "workspace-setup-failure", + }); + + const failedPayload = await failed; + expect(failedPayload.detail.log).toContain("starting setup"); + expect(failedPayload.detail.log).toContain("setup failed"); + expect(failedPayload.error).toMatch(/failed/i); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("emits a completed empty snapshot when no setup commands exist", async () => { + const client = await connectWorkspaceSetupClient(); + const repo = await 
createTempGitRepo("setup-none-"); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const completed = waitForWorkspaceSetupProgress( + client, + (payload) => + payload.status === "completed" && + payload.detail.commands.length === 0 && + payload.detail.log === "", + ); + + await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: "workspace-setup-none", + }); + + const completedPayload = await completed; + expect(completedPayload.error).toBeNull(); + expect(completedPayload.detail.commands).toEqual([]); + expect(completedPayload.detail.log).toBe(""); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("launches service terminals after setup completes", async ({ page }) => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-svc-ui-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo bootstrapping; sleep 1; echo setup complete'"], + }, + services: { + web: { + command: + "node -e \"const http = require('http'); const s = http.createServer((q,r) => r.end('ok')); s.listen(process.env.PORT || 3000, () => console.log('listening on ' + s.address().port))\"", + }, + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + + // Wait for setup completion via daemon (setup snapshots are per-session) + const completed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "completed" && payload.detail.log.includes("setup complete"), + ); + const workspace = await createWorkspaceThroughDaemon(client, { + cwd: repo.path, + worktreeSlug: `setup-svc-${Date.now()}`, + }); + await completed; + + await openHomeWithProject(page, repo.path); + await navigateToWorkspaceViaSidebar(page, workspace.id); + + await waitForWorkspaceTabsVisible(page); + + // Wait for the service terminal tab to appear in the tabs bar. + // The tab title shows the command, not the service name. 
+ const terminalTab = page.locator('[data-testid^="workspace-tab-terminal_"]').first(); + await expect(terminalTab).toBeVisible({ timeout: 30_000 }); + + // Click the service terminal tab + await terminalTab.click(); + + // Verify the terminal surface rendered + await expect(page.getByTestId("terminal-surface").first()).toBeVisible({ timeout: 10_000 }); + + // Verify the terminal output contains "listening on" (xterm renders text in .xterm-rows) + await expect(page.locator(".xterm-rows").first()).toContainText("listening on", { + timeout: 30_000, + }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); + + test("launches workspace services after setup completes", async () => { + const client = await connectWorkspaceSetupClient(); + const repo = await createTempGitRepo("setup-services-", { + paseoConfig: { + worktree: { + setup: ["sh -c 'echo bootstrapping; sleep 1; echo setup complete'"], + }, + services: { + editor: { + command: "npm run dev", + }, + }, + }, + }); + + try { + await seedProjectForWorkspaceSetup(client, repo.path); + const completed = waitForWorkspaceSetupProgress( + client, + (payload) => payload.status === "completed" && payload.detail.log.includes("setup complete"), + ); + + const result = await client.createPaseoWorktree({ + cwd: repo.path, + worktreeSlug: "workspace-setup-services", + }); + if (!result.workspace) { + throw new Error(result.error ?? "Failed to create workspace"); + } + const workspaceDir = result.workspace.workspaceDirectory; + + await completed; + + await expect + .poll(async () => { + const terminals = await client.listTerminals(workspaceDir); + return terminals.terminals.find((terminal) => terminal.name === "editor") ?? 
null; + }) + .toMatchObject({ + id: expect.any(String), + name: "editor", + }); + } finally { + await client.close(); + await repo.cleanup(); + } + }); +}); diff --git a/packages/app/src/app/_layout.tsx b/packages/app/src/app/_layout.tsx index b54dc9090..0540dfb2e 100644 --- a/packages/app/src/app/_layout.tsx +++ b/packages/app/src/app/_layout.tsx @@ -61,6 +61,7 @@ import { getIsElectronRuntime, isCompactFormFactor } from "@/constants/layout"; import { CommandCenter } from "@/components/command-center"; import { ProjectPickerModal } from "@/components/project-picker-modal"; import { KeyboardShortcutsDialog } from "@/components/keyboard-shortcuts-dialog"; +import { WorkspaceSetupDialog } from "@/components/workspace-setup-dialog"; import { useKeyboardShortcuts } from "@/hooks/use-keyboard-shortcuts"; import { queryClient } from "@/query/query-client"; import { @@ -78,6 +79,7 @@ import { parseServerIdFromPathname, parseHostAgentRouteFromPathname, parseWorkspaceOpenIntent, + decodeWorkspaceIdFromPathSegment, } from "@/utils/host-routes"; import { syncNavigationActiveWorkspace } from "@/stores/navigation-active-workspace-store"; @@ -401,6 +403,7 @@ function AppContainer({ + ); @@ -729,7 +732,6 @@ function FaviconStatusSync() { function RootStack() { const storeReady = useStoreReady(); const { theme } = useUnistyles(); - return ( - + { + const serverValue = Array.isArray(params?.serverId) + ? params.serverId[0] + : params?.serverId; + const workspaceValue = Array.isArray(params?.workspaceId) + ? params.workspaceId[0] + : params?.workspaceId; + const serverId = typeof serverValue === "string" ? serverValue.trim() : ""; + const workspaceId = + typeof workspaceValue === "string" + ? (decodeWorkspaceIdFromPathSegment(workspaceValue) ?? 
workspaceValue.trim()) + : ""; + return `${serverId}:${workspaceId}`; + }} + /> + diff --git a/packages/app/src/app/h/[serverId]/agent/[agentId].tsx b/packages/app/src/app/h/[serverId]/agent/[agentId].tsx index b1a05bba0..a42d57379 100644 --- a/packages/app/src/app/h/[serverId]/agent/[agentId].tsx +++ b/packages/app/src/app/h/[serverId]/agent/[agentId].tsx @@ -3,6 +3,7 @@ import { useLocalSearchParams, useRouter } from "expo-router"; import { useSessionStore } from "@/stores/session-store"; import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime"; import { buildHostRootRoute } from "@/utils/host-routes"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; export default function HostAgentReadyRoute() { @@ -22,6 +23,21 @@ export default function HostAgentReadyRoute() { } return state.sessions[serverId]?.agents?.get(agentId)?.cwd ?? null; }); + const sessionWorkspaces = useSessionStore((state) => + serverId ? state.sessions[serverId]?.workspaces : undefined, + ); + const hasHydratedWorkspaces = useSessionStore((state) => + serverId ? (state.sessions[serverId]?.hasHydratedWorkspaces ?? 
false) : false, + ); + const resolvedWorkspaceId = useSessionStore((state) => { + if (!serverId || !agentId) { + return null; + } + return resolveWorkspaceIdByExecutionDirectory({ + workspaces: state.sessions[serverId]?.workspaces?.values(), + workspaceDirectory: state.sessions[serverId]?.agents?.get(agentId)?.cwd, + }); + }); useEffect(() => { if (redirectedRef.current) { @@ -33,18 +49,17 @@ export default function HostAgentReadyRoute() { return; } - const normalizedCwd = agentCwd?.trim(); - if (normalizedCwd) { + if (resolvedWorkspaceId) { redirectedRef.current = true; router.replace( prepareWorkspaceTab({ serverId, - workspaceId: normalizedCwd, + workspaceId: resolvedWorkspaceId, target: { kind: "agent", agentId }, }) as any, ); } - }, [agentCwd, agentId, router, serverId]); + }, [agentId, resolvedWorkspaceId, router, serverId]); useEffect(() => { if (redirectedRef.current) { @@ -53,14 +68,14 @@ export default function HostAgentReadyRoute() { if (!serverId || !agentId) { return; } - if (agentCwd?.trim()) { + if (agentCwd?.trim() && !hasHydratedWorkspaces) { return; } if (!client || !isConnected) { redirectedRef.current = true; router.replace(buildHostRootRoute(serverId)); } - }, [agentCwd, agentId, client, isConnected, router, serverId]); + }, [agentCwd, agentId, client, hasHydratedWorkspaces, isConnected, router, serverId]); useEffect(() => { if (redirectedRef.current) { @@ -78,12 +93,19 @@ export default function HostAgentReadyRoute() { return; } const cwd = result?.agent?.cwd?.trim(); + const workspaceId = resolveWorkspaceIdByExecutionDirectory({ + workspaces: sessionWorkspaces?.values(), + workspaceDirectory: cwd, + }); + if (!workspaceId && !hasHydratedWorkspaces) { + return; + } redirectedRef.current = true; - if (cwd) { + if (workspaceId) { router.replace( prepareWorkspaceTab({ serverId, - workspaceId: cwd, + workspaceId, target: { kind: "agent", agentId }, }) as any, ); @@ -102,7 +124,7 @@ export default function HostAgentReadyRoute() { return () => { 
cancelled = true; }; - }, [agentId, client, isConnected, router, serverId]); + }, [agentId, client, hasHydratedWorkspaces, isConnected, router, serverId, sessionWorkspaces]); return null; } diff --git a/packages/app/src/app/h/[serverId]/index.tsx b/packages/app/src/app/h/[serverId]/index.tsx index 8183cf1ba..1dcf0baeb 100644 --- a/packages/app/src/app/h/[serverId]/index.tsx +++ b/packages/app/src/app/h/[serverId]/index.tsx @@ -3,14 +3,23 @@ import { useLocalSearchParams, usePathname, useRouter } from "expo-router"; import { useSessionStore } from "@/stores/session-store"; import { useFormPreferences } from "@/hooks/use-form-preferences"; import { + buildHostAgentDetailRoute, buildHostOpenProjectRoute, buildHostRootRoute, - buildHostWorkspaceRoute, + buildHostWorkspaceOpenRoute, } from "@/utils/host-routes"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; const HOST_ROOT_REDIRECT_DELAY_MS = 300; +function getCurrentPathname(fallbackPathname: string): string { + if (typeof window === "undefined") { + return fallbackPathname; + } + return window.location.pathname || fallbackPathname; +} + export default function HostIndexRoute() { const router = useRouter(); const pathname = usePathname(); @@ -32,11 +41,13 @@ export default function HostIndexRoute() { return; } const rootRoute = buildHostRootRoute(serverId); - if (pathname !== rootRoute && pathname !== `${rootRoute}/`) { + const currentPathname = getCurrentPathname(pathname); + if (currentPathname !== rootRoute && currentPathname !== `${rootRoute}/`) { return; } const timer = setTimeout(() => { - if (pathname !== rootRoute && pathname !== `${rootRoute}/`) { + const latestPathname = getCurrentPathname(pathname); + if (latestPathname !== rootRoute && latestPathname !== `${rootRoute}/`) { return; } @@ -55,20 +66,30 @@ export default function HostIndexRoute() { }); const primaryAgent = visibleAgents[0]; - if 
(primaryAgent?.cwd?.trim()) { + const primaryAgentWorkspaceId = resolveWorkspaceIdByExecutionDirectory({ + workspaces: sessionWorkspaces?.values(), + workspaceDirectory: primaryAgent?.cwd, + }); + if (primaryAgent && primaryAgentWorkspaceId) { router.replace( prepareWorkspaceTab({ serverId, - workspaceId: primaryAgent.cwd.trim(), + workspaceId: primaryAgentWorkspaceId, target: { kind: "agent", agentId: primaryAgent.id }, }) as any, ); return; } + if (primaryAgent) { + router.replace(buildHostAgentDetailRoute(serverId, primaryAgent.id) as any); + return; + } const primaryWorkspace = visibleWorkspaces[0]; if (primaryWorkspace?.id?.trim()) { - router.replace(buildHostWorkspaceRoute(serverId, primaryWorkspace.id.trim())); + router.replace( + buildHostWorkspaceOpenRoute(serverId, primaryWorkspace.id.trim(), "draft:new") as any, + ); return; } diff --git a/packages/app/src/app/h/[serverId]/new.tsx b/packages/app/src/app/h/[serverId]/new.tsx new file mode 100644 index 000000000..051822df3 --- /dev/null +++ b/packages/app/src/app/h/[serverId]/new.tsx @@ -0,0 +1,21 @@ +import { useLocalSearchParams } from "expo-router"; +import { NewWorkspaceScreen } from "@/screens/new-workspace-screen"; + +export default function HostNewWorkspaceRoute() { + const params = useLocalSearchParams<{ serverId?: string; dir?: string; name?: string }>(); + const serverId = typeof params.serverId === "string" ? params.serverId : ""; + const sourceDirectory = typeof params.dir === "string" ? params.dir : ""; + const displayName = typeof params.name === "string" ? 
params.name : undefined; + + if (!sourceDirectory) { + return null; + } + + return ( + + ); +} diff --git a/packages/app/src/app/h/[serverId]/workspace/[workspaceId]/_layout.tsx b/packages/app/src/app/h/[serverId]/workspace/[workspaceId]/_layout.tsx index 2afc57223..aa8c89726 100644 --- a/packages/app/src/app/h/[serverId]/workspace/[workspaceId]/_layout.tsx +++ b/packages/app/src/app/h/[serverId]/workspace/[workspaceId]/_layout.tsx @@ -1,10 +1,10 @@ import { useEffect, useRef } from "react"; -import { useGlobalSearchParams, useLocalSearchParams, useRouter } from "expo-router"; +import { useGlobalSearchParams, usePathname, useRouter } from "expo-router"; import type { WorkspaceTabTarget } from "@/stores/workspace-tabs-store"; import { WorkspaceScreen } from "@/screens/workspace/workspace-screen"; import { buildHostWorkspaceRoute, - decodeWorkspaceIdFromPathSegment, + parseHostWorkspaceRouteFromPathname, parseWorkspaceOpenIntent, type WorkspaceOpenIntent, } from "@/utils/host-routes"; @@ -31,24 +31,22 @@ function getOpenIntentTarget(openIntent: WorkspaceOpenIntent): WorkspaceTabTarge if (openIntent.kind === "file") { return { kind: "file", path: openIntent.path }; } + if (openIntent.kind === "setup") { + return { kind: "setup", workspaceId: openIntent.workspaceId }; + } return { kind: "draft", draftId: openIntent.draftId }; } export default function HostWorkspaceLayout() { const router = useRouter(); const consumedIntentRef = useRef(null); - const params = useLocalSearchParams<{ - serverId?: string | string[]; - workspaceId?: string | string[]; - }>(); + const pathname = usePathname(); const globalParams = useGlobalSearchParams<{ open?: string | string[]; }>(); - const serverId = getParamValue(params.serverId); - const workspaceValue = getParamValue(params.workspaceId); - const workspaceId = workspaceValue - ? (decodeWorkspaceIdFromPathSegment(workspaceValue) ?? 
"") - : ""; + const parsedWorkspaceRoute = parseHostWorkspaceRouteFromPathname(pathname); + const serverId = parsedWorkspaceRoute?.serverId ?? ""; + const workspaceId = parsedWorkspaceRoute?.workspaceId ?? ""; const openValue = getParamValue(globalParams.open); useEffect(() => { diff --git a/packages/app/src/app/index.tsx b/packages/app/src/app/index.tsx index b537db6a9..1180dcb8f 100644 --- a/packages/app/src/app/index.tsx +++ b/packages/app/src/app/index.tsx @@ -11,6 +11,7 @@ import { useHosts, } from "@/runtime/host-runtime"; import { buildHostRootRoute } from "@/utils/host-routes"; +import { shouldUseDesktopDaemon } from "@/desktop/daemon/desktop-daemon"; const WELCOME_ROUTE = "/welcome"; @@ -39,6 +40,8 @@ function useAnyOnlineHostServerId(serverIds: string[]): string | null { ); } +const isDesktop = shouldUseDesktopDaemon(); + export default function Index() { const router = useRouter(); const pathname = usePathname(); @@ -61,5 +64,5 @@ export default function Index() { router.replace(targetRoute); }, [anyOnlineServerId, pathname, router, storeReady]); - return ; + return ; } diff --git a/packages/app/src/components/adaptive-modal-sheet.tsx b/packages/app/src/components/adaptive-modal-sheet.tsx index a3ccd5f28..5fa327b5d 100644 --- a/packages/app/src/components/adaptive-modal-sheet.tsx +++ b/packages/app/src/components/adaptive-modal-sheet.tsx @@ -13,6 +13,8 @@ import { type BottomSheetBackgroundProps, } from "@gorhom/bottom-sheet"; import { X } from "lucide-react-native"; +import { FileDropZone } from "@/components/file-drop-zone"; +import type { ImageAttachment } from "@/components/message-input"; const styles = StyleSheet.create((theme) => ({ desktopOverlay: { @@ -34,16 +36,21 @@ const styles = StyleSheet.create((theme) => ({ borderRadius: theme.borderRadius.xl, borderWidth: 1, borderColor: theme.colors.surface2, - overflow: "hidden", }, header: { paddingHorizontal: theme.spacing[6], paddingVertical: theme.spacing[4], flexDirection: "row", - alignItems: 
"center", + alignItems: "flex-start", justifyContent: "space-between", borderBottomWidth: 1, borderBottomColor: theme.colors.surface2, + gap: theme.spacing[3], + }, + headerTitleGroup: { + flex: 1, + gap: theme.spacing[2], + minWidth: 0, }, title: { color: theme.colors.foreground, @@ -73,9 +80,10 @@ const styles = StyleSheet.create((theme) => ({ paddingBottom: theme.spacing[3], flexDirection: "row", justifyContent: "space-between", - alignItems: "center", + alignItems: "flex-start", borderBottomWidth: 1, borderBottomColor: theme.colors.surface2, + gap: theme.spacing[3], }, bottomSheetContent: { padding: theme.spacing[6], @@ -101,22 +109,31 @@ function SheetBackground({ style }: BottomSheetBackgroundProps) { export interface AdaptiveModalSheetProps { title: string; + /** Optional content rendered below the title in the header area. */ + subtitle?: ReactNode; visible: boolean; onClose: () => void; children: ReactNode; snapPoints?: string[]; stackBehavior?: "push" | "switch" | "replace"; testID?: string; + /** Override the max width of the desktop card. */ + desktopMaxWidth?: number; + /** When provided, wraps the card content in a FileDropZone. */ + onFilesDropped?: (files: ImageAttachment[]) => void; } export function AdaptiveModalSheet({ title, + subtitle, visible, onClose, children, snapPoints, stackBehavior, testID, + desktopMaxWidth, + onFilesDropped, }: AdaptiveModalSheetProps) { const { theme } = useUnistyles(); const isMobile = UnistylesRuntime.breakpoint === "xs" || UnistylesRuntime.breakpoint === "sm"; @@ -124,6 +141,19 @@ export function AdaptiveModalSheet({ const dismissingForVisibilityRef = useRef(false); const resolvedSnapPoints = useMemo(() => snapPoints ?? 
["65%", "90%"], [snapPoints]); + useEffect(() => { + if (isMobile || !visible || Platform.OS !== "web" || typeof window === "undefined") return; + function handleKeyDown(e: KeyboardEvent) { + if (e.key === "Escape") { + e.preventDefault(); + onClose(); + } + } + // Capture phase: RN Web TextInput stops propagation, so bubbling listeners never fire. + window.addEventListener("keydown", handleKeyDown, true); + return () => window.removeEventListener("keydown", handleKeyDown, true); + }, [isMobile, visible, onClose]); + useEffect(() => { if (!isMobile) return; if (visible) { @@ -172,7 +202,10 @@ export function AdaptiveModalSheet({ keyboardBlurBehavior="restore" > - {title} + + {title} + {subtitle} + @@ -188,6 +221,28 @@ export function AdaptiveModalSheet({ ); } + const cardInner = ( + <> + + + {title} + {subtitle} + + + + + + + {children} + + + ); + const desktopContent = ( - - - {title} - - - - - - {children} - + + {onFilesDropped ? ( + + {cardInner} + + ) : cardInner} ); diff --git a/packages/app/src/components/agent-list.tsx b/packages/app/src/components/agent-list.tsx index dd2b1af18..84cc3ca33 100644 --- a/packages/app/src/components/agent-list.tsx +++ b/packages/app/src/components/agent-list.tsx @@ -16,6 +16,9 @@ import { shortenPath } from "@/utils/shorten-path"; import { type AggregatedAgent } from "@/hooks/use-aggregated-agents"; import { useSessionStore } from "@/stores/session-store"; import { Archive } from "lucide-react-native"; +import { getProviderIcon } from "@/components/provider-icons"; +import { buildHostAgentDetailRoute } from "@/utils/host-routes"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; interface AgentListProps { @@ -130,6 +133,7 @@ function SessionRow({ const isSelected = selectedAgentId === agentKey; const statusLabel = formatStatusLabel(agent.status); const projectPath = shortenPath(agent.cwd); + const ProviderIcon = 
getProviderIcon(agent.provider); return ( + + + { - setActionAgent(agent); - }, []); + const handleAgentLongPress = useCallback( + (agent: AggregatedAgent) => { + const isRunning = agent.status === "running" || agent.status === "initializing"; + if (isRunning) { + setActionAgent(agent); + return; + } + + const client = useSessionStore.getState().sessions[agent.serverId]?.client ?? null; + if (!client) { + setActionAgent(agent); + return; + } + void client.archiveAgent(agent.id); + }, + [], + ); const handleCloseActionSheet = useCallback(() => { setActionAgent(null); @@ -349,7 +379,9 @@ export function AgentList({ > - {isActionDaemonUnavailable ? "Host offline" : "Archive this session?"} + {isActionDaemonUnavailable + ? "Host offline" + : "This agent is still running. Archiving it will stop the agent."} ({ flexWrap: "wrap", gap: theme.spacing[2], }, + providerIconWrap: { + width: theme.iconSize.md, + alignItems: "center", + justifyContent: "center", + }, rowMetaRow: { flexDirection: "row", alignItems: "center", diff --git a/packages/app/src/components/agent-stream-render-model.ts b/packages/app/src/components/agent-stream-render-model.ts index d8a2eeaa7..c5d9cda1e 100644 --- a/packages/app/src/components/agent-stream-render-model.ts +++ b/packages/app/src/components/agent-stream-render-model.ts @@ -8,8 +8,8 @@ import { import { orderHeadForStreamRenderStrategy, orderTailForStreamRenderStrategy, - resolveStreamRenderStrategy, } from "./stream-strategy"; +import { resolveStreamRenderStrategy } from "./stream-strategy-resolver"; export type StreamRenderSegments = { historyVirtualized: StreamItem[]; diff --git a/packages/app/src/components/agent-stream-render-strategy.ts b/packages/app/src/components/agent-stream-render-strategy.ts index 38c64153e..9965bdeb3 100644 --- a/packages/app/src/components/agent-stream-render-strategy.ts +++ b/packages/app/src/components/agent-stream-render-strategy.ts @@ -1,2 +1,3 @@ export * from "./stream-strategy"; +export * from 
"./stream-strategy-resolver"; export * from "./agent-stream-render-model"; diff --git a/packages/app/src/components/agent-stream-view.tsx b/packages/app/src/components/agent-stream-view.tsx index 2fc15dba1..4ba4c99f3 100644 --- a/packages/app/src/components/agent-stream-view.tsx +++ b/packages/app/src/components/agent-stream-view.tsx @@ -65,6 +65,7 @@ import { } from "./use-bottom-anchor-controller"; import { MAX_CONTENT_WIDTH } from "@/constants/layout"; import { normalizeInlinePathTarget } from "@/utils/inline-path"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; import { useStableEvent } from "@/hooks/use-stable-event"; import { @@ -134,10 +135,13 @@ const AgentStreamViewComponent = forwardRef { expect(matchesSearch(rows[1]!, "gpt-5.4")).toBe(true); }); - it("builds an explicit trigger label for the selected provider and model", () => { + it("keeps the selected trigger label model-only", () => { expect(resolveProviderLabel(providerDefinitions, "codex")).toBe("Codex"); - expect(buildSelectedTriggerLabel("Codex", "GPT-5.4")).toBe("Codex: GPT-5.4"); + expect(buildSelectedTriggerLabel("GPT-5.4")).toBe("GPT-5.4"); }); }); diff --git a/packages/app/src/components/combined-model-selector.tsx b/packages/app/src/components/combined-model-selector.tsx index ffb0ef6df..311a1ce69 100644 --- a/packages/app/src/components/combined-model-selector.tsx +++ b/packages/app/src/components/combined-model-selector.tsx @@ -2,13 +2,11 @@ import { useCallback, useEffect, useMemo, useRef, useState } from "react"; import { View, Text, - TextInput, Pressable, Platform, ActivityIndicator, type GestureResponderEvent, } from "react-native"; -import { BottomSheetTextInput } from "@gorhom/bottom-sheet"; import { StyleSheet, useUnistyles } from "react-native-unistyles"; import { ArrowLeft, @@ -24,7 +22,7 @@ import type { import type { AgentProviderDefinition } from 
"@server/server/agent/provider-manifest"; const IS_WEB = Platform.OS === "web"; -import { Combobox, ComboboxItem } from "@/components/ui/combobox"; +import { Combobox, ComboboxItem, SearchInput } from "@/components/ui/combobox"; import { getProviderIcon } from "@/components/provider-icons"; import type { FavoriteModelRow } from "@/hooks/use-form-preferences"; import { @@ -337,45 +335,6 @@ function GroupedProviderRows({ ); } -function ProviderSearchInput({ - value, - onChangeText, - autoFocus = false, -}: { - value: string; - onChangeText: (text: string) => void; - autoFocus?: boolean; -}) { - const { theme } = useUnistyles(); - const inputRef = useRef(null); - const InputComponent = Platform.OS === "web" ? TextInput : BottomSheetTextInput; - - useEffect(() => { - if (autoFocus && Platform.OS === "web" && inputRef.current) { - const timer = setTimeout(() => { - inputRef.current?.focus(); - }, 50); - return () => clearTimeout(timer); - } - }, [autoFocus]); - - return ( - - - - - ); -} function SelectorContent({ view, @@ -535,35 +494,45 @@ export function CombinedModelSelector({ return { kind: "provider", providerId, providerLabel: label }; }, [allProviderModels, providerDefinitions]); + const computeInitialView = useCallback((): SelectorView => { + if (singleProviderView) return singleProviderView; + + const selectedFavoriteKey = `${selectedProvider}:${selectedModel}`; + if (selectedProvider && selectedModel && !favoriteKeys.has(selectedFavoriteKey)) { + const label = resolveProviderLabel(providerDefinitions, selectedProvider); + return { kind: "provider", providerId: selectedProvider, providerLabel: label }; + } + + return { kind: "all" }; + }, [singleProviderView, selectedProvider, selectedModel, favoriteKeys, providerDefinitions]); + const handleOpenChange = useCallback( (open: boolean) => { setIsOpen(open); - setView(singleProviderView ?? 
{ kind: "all" }); + setView(computeInitialView()); if (!open) { setSearchQuery(""); onClose?.(); } }, - [onClose, singleProviderView], + [onClose, computeInitialView], ); const handleSelect = useCallback( (provider: string, modelId: string) => { onSelect(provider as AgentProvider, modelId); setIsOpen(false); - setView(singleProviderView ?? { kind: "all" }); setSearchQuery(""); }, - [onSelect, singleProviderView], + [onSelect], ); const ProviderIcon = getProviderIcon(selectedProvider); - const selectedProviderLabel = useMemo( - () => resolveProviderLabel(providerDefinitions, selectedProvider), - [providerDefinitions, selectedProvider], - ); const selectedModelLabel = useMemo(() => { + if (!selectedModel) { + return isLoading ? "Loading..." : "Select model"; + } const models = allProviderModels.get(selectedProvider); if (!models) { return isLoading ? "Loading..." : "Select model"; @@ -586,8 +555,8 @@ export function CombinedModelSelector({ return selectedModelLabel; } - return buildSelectedTriggerLabel(selectedProviderLabel, selectedModelLabel); - }, [selectedModelLabel, selectedProviderLabel]); + return buildSelectedTriggerLabel(selectedModelLabel); + }, [selectedModelLabel]); useEffect(() => { if (isWeb) { @@ -664,7 +633,8 @@ export function CombinedModelSelector({ }} /> ) : null} - ({ color: theme.colors.foregroundMuted, }, level2Header: { - backgroundColor: theme.colors.surface1, - borderBottomWidth: 1, - borderBottomColor: theme.colors.border, }, backButton: { flexDirection: "row", @@ -839,17 +806,4 @@ const styles = StyleSheet.create((theme) => ({ color: theme.colors.foregroundMuted, fontSize: theme.fontSize.sm, }, - providerSearchContainer: { - flexDirection: "row", - alignItems: "center", - paddingHorizontal: theme.spacing[3], - gap: theme.spacing[2], - ...(IS_WEB ? 
{} : { marginHorizontal: theme.spacing[1] }), - }, - providerSearchInput: { - flex: 1, - paddingVertical: theme.spacing[3], - color: theme.colors.foreground, - fontSize: theme.fontSize.sm, - }, })); diff --git a/packages/app/src/components/combined-model-selector.utils.ts b/packages/app/src/components/combined-model-selector.utils.ts index fa7f102d3..b2ff9b78f 100644 --- a/packages/app/src/components/combined-model-selector.utils.ts +++ b/packages/app/src/components/combined-model-selector.utils.ts @@ -11,7 +11,7 @@ export function resolveProviderLabel( return providerDefinitions.find((definition) => definition.id === providerId)?.label ?? providerId; } -export function buildSelectedTriggerLabel(providerLabel: string, modelLabel: string): string { +export function buildSelectedTriggerLabel(modelLabel: string): string { return modelLabel; } diff --git a/packages/app/src/components/agent-input-area.status-controls.test.ts b/packages/app/src/components/composer.status-controls.test.ts similarity index 92% rename from packages/app/src/components/agent-input-area.status-controls.test.ts rename to packages/app/src/components/composer.status-controls.test.ts index 93b51a4ab..7491a87cb 100644 --- a/packages/app/src/components/agent-input-area.status-controls.test.ts +++ b/packages/app/src/components/composer.status-controls.test.ts @@ -1,5 +1,5 @@ import { describe, expect, it } from "vitest"; -import { resolveStatusControlMode } from "./agent-input-area.status-controls"; +import { resolveStatusControlMode } from "./composer.status-controls"; describe("resolveStatusControlMode", () => { it("uses ready mode when no controlled status controls are provided", () => { diff --git a/packages/app/src/components/agent-input-area.status-controls.ts b/packages/app/src/components/composer.status-controls.ts similarity index 100% rename from packages/app/src/components/agent-input-area.status-controls.ts rename to packages/app/src/components/composer.status-controls.ts diff --git 
a/packages/app/src/components/agent-input-area.tsx b/packages/app/src/components/composer.tsx similarity index 91% rename from packages/app/src/components/agent-input-area.tsx rename to packages/app/src/components/composer.tsx index 847f23488..f8e9bdd4a 100644 --- a/packages/app/src/components/agent-input-area.tsx +++ b/packages/app/src/components/composer.tsx @@ -42,7 +42,7 @@ import { persistAttachmentFromBlob, persistAttachmentFromFileUri, } from "@/attachments/service"; -import { resolveStatusControlMode } from "@/components/agent-input-area.status-controls"; +import { resolveStatusControlMode } from "@/components/composer.status-controls"; import { markScrollInvestigationRender } from "@/utils/scroll-jank-investigation"; import { useKeyboardShiftStyle } from "@/hooks/use-keyboard-shift-style"; import { useKeyboardActionHandler } from "@/hooks/use-keyboard-action-handler"; @@ -57,11 +57,12 @@ type QueuedMessage = { type ImageListUpdater = ImageAttachment[] | ((prev: ImageAttachment[]) => ImageAttachment[]); -interface AgentInputAreaProps { +interface ComposerProps { agentId: string; serverId: string; isInputActive: boolean; onSubmitMessage?: (payload: MessagePayload) => Promise; + allowEmptySubmit?: boolean; /** Externally controlled loading state. When true, disables the submit button. */ isSubmitLoading?: boolean; /** When true, blurs the input immediately when submitting. */ @@ -86,17 +87,20 @@ interface AgentInputAreaProps { onAttentionPromptSend?: () => void; /** Controlled status controls rendered in input area (draft flows). */ statusControls?: DraftAgentStatusBarProps; + /** Extra styles merged onto the message input wrapper (e.g. elevated background). 
*/ + inputWrapperStyle?: import("react-native").ViewStyle; } const EMPTY_ARRAY: readonly QueuedMessage[] = []; const DESKTOP_MESSAGE_PLACEHOLDER = "Message the agent, tag @files, or use /commands and /skills"; const MOBILE_MESSAGE_PLACEHOLDER = "Message, @files, /commands"; -export function AgentInputArea({ +export function Composer({ agentId, serverId, isInputActive, onSubmitMessage, + allowEmptySubmit = false, isSubmitLoading = false, blurOnSubmit = false, value, @@ -113,8 +117,9 @@ export function AgentInputArea({ onAttentionInputFocus, onAttentionPromptSend, statusControls, -}: AgentInputAreaProps) { - markScrollInvestigationRender(`AgentInputArea:${serverId}:${agentId}`); + inputWrapperStyle, +}: ComposerProps) { + markScrollInvestigationRender(`Composer:${serverId}:${agentId}`); const { theme } = useUnistyles(); const buttonIconSize = Platform.OS === "web" ? theme.iconSize.md : theme.iconSize.lg; const insets = useSafeAreaInsets(); @@ -442,12 +447,16 @@ export function AgentInputArea({ }, }); return true; + case "message-input.send": + return messageInputRef.current?.runKeyboardAction("send") ?? false; case "message-input.dictation-toggle": messageInputRef.current?.runKeyboardAction("dictation-toggle"); return true; case "message-input.dictation-cancel": messageInputRef.current?.runKeyboardAction("dictation-cancel"); return true; + case "message-input.dictation-confirm": + return messageInputRef.current?.runKeyboardAction("dictation-confirm") ?? false; case "message-input.voice-toggle": messageInputRef.current?.runKeyboardAction("voice-toggle"); return true; @@ -507,7 +516,7 @@ export function AgentInputArea({ return; } void voice.startVoice(serverId, agentId).catch((error) => { - console.error("[AgentInputArea] Failed to start voice mode", error); + console.error("[Composer] Failed to start voice mode", error); const message = error instanceof Error ? error.message : typeof error === "string" ? 
error : null; if (message && message.trim().length > 0) { @@ -604,40 +613,44 @@ export function AgentInputArea({ ) : null; - const rightContent = ( - - {!isVoiceModeForAgent && hasAgent ? ( - - [ - styles.realtimeVoiceButton as any, - (hovered ? styles.iconButtonHovered : undefined) as any, - (!isConnected || voice?.isVoiceSwitching ? styles.buttonDisabled : undefined) as any, - ]} - > - {voice?.isVoiceSwitching ? ( - - ) : ( - - )} - - - - Voice mode - {voiceToggleKeys ? ( - - ) : null} - - - - ) : null} - {cancelButton} - - ); + const showVoiceModeButton = !isVoiceModeForAgent && hasAgent; + const rightContent = + showVoiceModeButton || cancelButton ? ( + + {showVoiceModeButton ? ( + + [ + styles.realtimeVoiceButton as any, + (hovered ? styles.iconButtonHovered : undefined) as any, + (!isConnected || voice?.isVoiceSwitching ? styles.buttonDisabled : undefined) as any, + ]} + > + {({ hovered }) => + voice?.isVoiceSwitching ? ( + + ) : ( + + ) + } + + + + Voice mode + {voiceToggleKeys ? 
( + + ) : null} + + + + ) : null} + {cancelButton} + + ) : null; const hasContextWindowMeter = typeof agentState.contextWindowMaxTokens === "number" && @@ -723,6 +736,7 @@ export function AgentInputArea({ value={userInput} onChangeText={setUserInput} onSubmit={handleSubmit} + allowEmptySubmit={allowEmptySubmit} isSubmitDisabled={isProcessing || isSubmitLoading} isSubmitLoading={isProcessing || isSubmitLoading} images={selectedImages} @@ -755,6 +769,7 @@ export function AgentInputArea({ } }} onHeightChange={onComposerHeightChange} + inputWrapperStyle={inputWrapperStyle} /> diff --git a/packages/app/src/components/icons/aider-icon.tsx b/packages/app/src/components/icons/aider-icon.tsx new file mode 100644 index 000000000..1deb4a218 --- /dev/null +++ b/packages/app/src/components/icons/aider-icon.tsx @@ -0,0 +1,16 @@ +import Svg, { Path } from "react-native-svg"; + +interface AiderIconProps { + size?: number; + color?: string; +} + +export function AiderIcon({ size = 16, color = "currentColor" }: AiderIconProps) { + return ( + + + + ); +} diff --git a/packages/app/src/components/icons/amp-icon.tsx b/packages/app/src/components/icons/amp-icon.tsx new file mode 100644 index 000000000..5b855867d --- /dev/null +++ b/packages/app/src/components/icons/amp-icon.tsx @@ -0,0 +1,29 @@ +import Svg, { Path } from "react-native-svg"; + +interface AmpIconProps { + size?: number; + color?: string; +} + +export function AmpIcon({ size = 16, color = "currentColor" }: AmpIconProps) { + return ( + + + + + + + ); +} diff --git a/packages/app/src/components/icons/gemini-icon.tsx b/packages/app/src/components/icons/gemini-icon.tsx new file mode 100644 index 000000000..05d3822ce --- /dev/null +++ b/packages/app/src/components/icons/gemini-icon.tsx @@ -0,0 +1,17 @@ +import Svg, { Path } from "react-native-svg"; + +interface GeminiIconProps { + size?: number; + color?: string; +} + +export function GeminiIcon({ size = 16, color = "currentColor" }: GeminiIconProps) { + return ( + + + + ); +} 
diff --git a/packages/app/src/components/message-input.tsx b/packages/app/src/components/message-input.tsx index 8e18b3706..f60e0eb05 100644 --- a/packages/app/src/components/message-input.tsx +++ b/packages/app/src/components/message-input.tsx @@ -62,6 +62,7 @@ export interface MessageInputProps { value: string; onChangeText: (text: string) => void; onSubmit: (payload: MessagePayload) => void; + allowEmptySubmit?: boolean; isSubmitDisabled?: boolean; isSubmitLoading?: boolean; images?: ImageAttachment[]; @@ -97,6 +98,8 @@ export interface MessageInputProps { onSelectionChange?: (selection: { start: number; end: number }) => void; onFocusChange?: (focused: boolean) => void; onHeightChange?: (height: number) => void; + /** Extra styles merged onto the input wrapper (e.g. elevated background). */ + inputWrapperStyle?: import("react-native").ViewStyle; } export interface MessageInputRef { @@ -110,9 +113,11 @@ export interface MessageInputRef { getNativeElement?: () => HTMLElement | null; } -const MIN_INPUT_HEIGHT = 30; +const MIN_INPUT_HEIGHT_MOBILE = 30; +const MIN_INPUT_HEIGHT_DESKTOP = 46; const MAX_INPUT_HEIGHT = 160; const IS_WEB = Platform.OS === "web"; +const MIN_INPUT_HEIGHT = IS_WEB ? 
MIN_INPUT_HEIGHT_DESKTOP : MIN_INPUT_HEIGHT_MOBILE; type WebTextInputKeyPressEvent = NativeSyntheticEvent< TextInputKeyPressEventData & { @@ -189,6 +194,7 @@ export const MessageInput = forwardRef(funct value, onChangeText, onSubmit, + allowEmptySubmit = false, isSubmitDisabled = false, isSubmitLoading = false, images = [], @@ -214,6 +220,7 @@ export const MessageInput = forwardRef(funct onSelectionChange: onSelectionChangeCallback, onFocusChange, onHeightChange, + inputWrapperStyle, }, ref, ) { @@ -887,7 +894,7 @@ export const MessageInput = forwardRef(funct } const hasImages = images.length > 0; - const hasSendableContent = value.trim().length > 0 || hasImages; + const hasSendableContent = value.trim().length > 0 || hasImages || allowEmptySubmit; const shouldShowSendButton = hasSendableContent || isSubmitLoading; const canPressLoadingButton = isSubmitLoading && typeof onSubmitLoadingPress === "function"; const isSendButtonDisabled = @@ -915,7 +922,7 @@ export const MessageInput = forwardRef(funct return ( {/* Regular input */} - + {/* Image preview pills */} {hasImages && ( @@ -1005,7 +1012,9 @@ export const MessageInput = forwardRef(funct (!isConnected || disabled) && styles.buttonDisabled, ]} > - + {({ hovered }) => ( + + )} @@ -1040,13 +1049,15 @@ export const MessageInput = forwardRef(funct isDictating && styles.voiceButtonRecording, ]} > - {isDictating ? ( - - ) : isRealtimeVoiceForCurrentAgent && voice?.isMuted ? ( - - ) : ( - - )} + {({ hovered }) => + isDictating ? ( + + ) : isRealtimeVoiceForCurrentAgent && voice?.isMuted ? 
( + + ) : ( + + ) + } @@ -1248,6 +1259,7 @@ const styles = StyleSheet.create(((theme: any) => ({ flexDirection: "row", alignItems: "flex-end", justifyContent: "space-between", + marginHorizontal: -6, }, leftButtonGroup: { flexDirection: "row", diff --git a/packages/app/src/components/project-picker-modal.tsx b/packages/app/src/components/project-picker-modal.tsx index 0556f2aec..785a7add1 100644 --- a/packages/app/src/components/project-picker-modal.tsx +++ b/packages/app/src/components/project-picker-modal.tsx @@ -40,9 +40,9 @@ export function ProjectPickerModal() { const recommendedPaths = useMemo(() => { if (!workspaces) return []; - return Array.from(workspaces.values()).map( - (workspace) => workspace.projectRootPath || workspace.id, - ); + return Array.from(workspaces.values()) + .map((workspace) => workspace.projectRootPath) + .filter((path) => path.length > 0); }, [workspaces]); const directorySuggestionsQuery = useQuery({ diff --git a/packages/app/src/components/provider-icons.ts b/packages/app/src/components/provider-icons.ts index 6c8a689c5..efa9c8ffd 100644 --- a/packages/app/src/components/provider-icons.ts +++ b/packages/app/src/components/provider-icons.ts @@ -1,6 +1,9 @@ import { Bot } from "lucide-react-native"; +import { AiderIcon } from "@/components/icons/aider-icon"; +import { AmpIcon } from "@/components/icons/amp-icon"; import { ClaudeIcon } from "@/components/icons/claude-icon"; import { CodexIcon } from "@/components/icons/codex-icon"; +import { GeminiIcon } from "@/components/icons/gemini-icon"; import { CopilotIcon } from "@/components/icons/copilot-icon"; import { OpenCodeIcon } from "@/components/icons/opencode-icon"; import { PiIcon } from "@/components/icons/pi-icon"; @@ -8,6 +11,9 @@ import { PiIcon } from "@/components/icons/pi-icon"; const PROVIDER_ICONS: Record = { claude: ClaudeIcon as unknown as typeof Bot, codex: CodexIcon as unknown as typeof Bot, + gemini: GeminiIcon as unknown as typeof Bot, + amp: AmpIcon as unknown as 
typeof Bot, + aider: AiderIcon as unknown as typeof Bot, copilot: CopilotIcon as unknown as typeof Bot, opencode: OpenCodeIcon as unknown as typeof Bot, pi: PiIcon as unknown as typeof Bot, diff --git a/packages/app/src/components/sidebar-workspace-list.tsx b/packages/app/src/components/sidebar-workspace-list.tsx index bbae09de9..1f016d1ba 100644 --- a/packages/app/src/components/sidebar-workspace-list.tsx +++ b/packages/app/src/components/sidebar-workspace-list.tsx @@ -27,6 +27,7 @@ import { type GestureType } from "react-native-gesture-handler"; import * as Clipboard from "expo-clipboard"; import { Archive, + ArrowUpRight, CircleAlert, ChevronDown, ChevronRight, @@ -35,6 +36,7 @@ import { FolderPlus, FolderGit2, GitPullRequest, + Globe, Monitor, MoreVertical, Plus, @@ -46,7 +48,7 @@ import type { DraggableListDragHandleProps } from "./draggable-list.types"; import { getHostRuntimeStore, isHostRuntimeConnected } from "@/runtime/host-runtime"; import { getIsElectronRuntime, isCompactFormFactor } from "@/constants/layout"; import { projectIconQueryKey } from "@/hooks/use-project-icon-query"; -import { parseHostWorkspaceRouteFromPathname } from "@/utils/host-routes"; +import { buildHostNewWorkspaceRoute, parseHostWorkspaceRouteFromPathname } from "@/utils/host-routes"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; import { type SidebarProjectEntry, @@ -54,7 +56,13 @@ import { } from "@/hooks/use-sidebar-workspaces-list"; import { useSidebarOrderStore } from "@/stores/sidebar-order-store"; import { useKeyboardShortcutsStore } from "@/stores/keyboard-shortcuts-store"; -import { ContextMenuTrigger, useContextMenu } from "@/components/ui/context-menu"; +import { + ContextMenu, + ContextMenuContent, + ContextMenuItem, + ContextMenuTrigger, + useContextMenu, +} from "@/components/ui/context-menu"; import { DropdownMenu, DropdownMenuTrigger, @@ -80,9 +88,14 @@ import { type PrHint, useWorkspacePrHint } from "@/hooks/use-checkout-pr-status- import { 
buildSidebarProjectRowModel } from "@/utils/sidebar-project-row-model"; import { useNavigationActiveWorkspaceSelection } from "@/stores/navigation-active-workspace-store"; import { normalizeWorkspaceDescriptor, useSessionStore } from "@/stores/session-store"; -import { createNameId } from "mnemonic-id"; import { buildWorkspaceArchiveRedirectRoute } from "@/utils/workspace-archive-navigation"; import { openExternalUrl } from "@/utils/open-external-url"; +import { + requireWorkspaceExecutionDirectory, + resolveWorkspaceExecutionDirectory, +} from "@/utils/workspace-execution"; +import { CheckStatusIndicator, WorkspaceHoverCard } from "@/components/workspace-hover-card"; +import { createNameId } from "mnemonic-id"; function toProjectIconDataUri(icon: { mimeType: string; data: string } | null): string | null { if (!icon) { @@ -100,16 +113,6 @@ const DEFAULT_STATUS_DOT_SIZE = 7; const EMPHASIZED_STATUS_DOT_SIZE = 9; const DEFAULT_STATUS_DOT_OFFSET = 0; const EMPHASIZED_STATUS_DOT_OFFSET = -1; -function getWorkspacePrIconColor(theme: ReturnType["theme"], state: PrHint["state"]) { - switch (state) { - case "merged": - return theme.colors.palette.purple[500]; - case "open": - return theme.colors.palette.green[500]; - case "closed": - return theme.colors.palette.red[500]; - } -} interface SidebarWorkspaceListProps { projects: SidebarProjectEntry[]; @@ -171,11 +174,10 @@ interface WorkspaceRowInnerProps { archiveShortcutKeys?: ShortcutKey[][] | null; } -function WorkspacePrBadge({ hint }: { hint: PrHint }) { +export function PrBadge({ hint }: { hint: PrHint }) { const { theme } = useUnistyles(); const [isHovered, setIsHovered] = useState(false); - const textColor = isHovered ? theme.colors.foreground : theme.colors.foregroundMuted; - const iconColor = getWorkspacePrIconColor(theme, hint.state); + const activeColor = isHovered ? 
theme.colors.foreground : theme.colors.foregroundMuted; const handlePressIn = useCallback((event: GestureResponderEvent) => { event.stopPropagation(); @@ -192,32 +194,46 @@ function WorkspacePrBadge({ hint }: { hint: PrHint }) { return ( setIsHovered(true)} onPointerLeave={() => setIsHovered(false)} style={({ pressed }) => [ - styles.workspacePrBadge, - pressed && styles.workspacePrBadgePressed, + prBadgeStyles.badge, + pressed && prBadgeStyles.badgePressed, ]} > - + #{hint.number} - {isHovered && } + ); } +const prBadgeStyles = StyleSheet.create((theme) => ({ + badge: { + flexDirection: "row", + alignItems: "center", + gap: 2, + }, + badgePressed: { + opacity: 0.82, + }, + text: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.normal, + lineHeight: 14, + }, +})); + + function WorkspaceStatusIndicator({ bucket, workspaceKind, @@ -714,50 +730,21 @@ function ProjectHeaderRow({ const { theme } = useUnistyles(); const [isHovered, setIsHovered] = useState(false); const isMobileBreakpoint = isCompactFormFactor(); - const mergeWorkspaces = useSessionStore((state) => state.mergeWorkspaces); - const toast = useToast(); + const handleBeginWorkspaceSetup = useCallback(() => { + if (!serverId) { + return; + } + router.navigate(buildHostNewWorkspaceRoute(serverId, project.iconWorkingDir, { displayName }) as any); + onWorkspacePress?.(); + }, [displayName, onWorkspacePress, project.iconWorkingDir, serverId]); - const createWorktreeMutation = useMutation({ - mutationFn: async () => { - if (!serverId) { - throw new Error("No server"); - } - const client = getHostRuntimeStore().getClient(serverId); - if (!client || !isHostRuntimeConnected(getHostRuntimeStore().getSnapshot(serverId))) { - throw new Error("Host is not connected"); - } - const payload = await client.createPaseoWorktree({ - cwd: project.iconWorkingDir, - worktreeSlug: createNameId(), - }); - if (payload.error || !payload.workspace) { - throw new Error(payload.error ?? 
"Failed to create worktree"); - } - return payload.workspace; - }, - onSuccess: (workspace) => { - mergeWorkspaces(serverId!, [normalizeWorkspaceDescriptor(workspace)]); - onWorktreeCreated?.(workspace.id); - onWorkspacePress?.(); - router.navigate( - prepareWorkspaceTab({ - serverId: serverId!, - workspaceId: workspace.id, - target: { kind: "draft", draftId: "new" }, - }) as any, - ); - }, - onError: (error) => { - toast.error(error instanceof Error ? error.message : String(error)); - }, - }); useKeyboardActionHandler({ handlerId: `worktree-new-${project.projectKey}`, actions: ["worktree.new"], - enabled: isProjectActive && canCreateWorktree && !createWorktreeMutation.isPending, + enabled: isProjectActive && canCreateWorktree && Boolean(serverId), priority: 0, handle: () => { - createWorktreeMutation.mutate(); + handleBeginWorkspaceSetup(); return true; }, }); @@ -802,9 +789,8 @@ function ProjectHeaderRow({ {canCreateWorktree ? ( createWorktreeMutation.mutate()} + onPress={handleBeginWorkspaceSetup} visible={isHovered || isMobileBreakpoint} - loading={createWorktreeMutation.isPending} showShortcutHint={isProjectActive} testID={`sidebar-project-new-worktree-${project.projectKey}`} /> @@ -924,10 +910,13 @@ function WorkspaceRowInner({ const { theme } = useUnistyles(); const [isHovered, setIsHovered] = useState(false); const isTouchPlatform = Platform.OS !== "web"; + const workspaceDirectory = resolveWorkspaceExecutionDirectory({ + workspaceDirectory: workspace.workspaceDirectory, + }); const prHint = useWorkspacePrHint({ serverId: workspace.serverId, - cwd: workspace.workspaceId, - enabled: workspace.workspaceKind !== "directory", + cwd: workspaceDirectory ?? 
"", + enabled: workspace.projectKind === "git" && Boolean(workspaceDirectory), }); const interaction = useLongPressDragInteraction({ drag, @@ -942,122 +931,133 @@ function WorkspaceRowInner({ onPress(); }, [interaction.didLongPressRef, onPress]); + const isDesktop = !isTouchPlatform; + const showGlobe = isDesktop && workspace.hasRunningServices; + return ( - setIsHovered(true)} - onPointerLeave={() => setIsHovered(false)} - > - [ - styles.workspaceRow, - isDragging && styles.workspaceRowDragging, - selected && styles.sidebarRowSelected, - isHovered && styles.workspaceRowHovered, - pressed && styles.workspaceRowPressed, - ]} - onPressIn={interaction.handlePressIn} - onTouchMove={interaction.handleTouchMove} - onPressOut={interaction.handlePressOut} - onPress={handlePress} - testID={`sidebar-workspace-row-${workspace.workspaceKey}`} + + setIsHovered(true)} + onPointerLeave={() => setIsHovered(false)} > - - - - [ + styles.workspaceRow, + isDragging && styles.workspaceRowDragging, + selected && styles.sidebarRowSelected, + isHovered && styles.workspaceRowHovered, + pressed && styles.workspaceRowPressed, + ]} + onPressIn={interaction.handlePressIn} + onTouchMove={interaction.handleTouchMove} + onPressOut={interaction.handlePressOut} + onPress={handlePress} + testID={`sidebar-workspace-row-${workspace.workspaceKey}`} + > + + - {workspace.name} - - - - {isCreating ? Creating... : null} - {onArchive && (isHovered || isTouchPlatform) ? ( - - [ - styles.kebabButton, - hovered && styles.kebabButtonHovered, - ]} - accessibilityRole="button" - accessibilityLabel="Workspace actions" - testID={`sidebar-workspace-kebab-${workspace.workspaceKey}`} - > - {({ hovered }) => ( - - )} - - - {onCopyPath ? ( - } - onSelect={onCopyPath} - > - Copy path - - ) : null} - {onCopyBranchName ? ( + + + {workspace.name} + + + + {showGlobe ? ( + + + + ) : null} + {isCreating ? Creating... : null} + {onArchive && (isHovered || isTouchPlatform) ? 
( + + [ + styles.kebabButton, + hovered && styles.kebabButtonHovered, + ]} + accessibilityRole="button" + accessibilityLabel="Workspace actions" + testID={`sidebar-workspace-kebab-${workspace.workspaceKey}`} + > + {({ hovered }) => ( + + )} + + + {onCopyPath ? ( + } + onSelect={onCopyPath} + > + Copy path + + ) : null} + {onCopyBranchName ? ( + } + onSelect={onCopyBranchName} + > + Copy branch name + + ) : null} } - onSelect={onCopyBranchName} + testID={`sidebar-workspace-menu-archive-${workspace.workspaceKey}`} + leading={} + trailing={archiveShortcutKeys ? : null} + status={archiveStatus} + pendingLabel={archivePendingLabel} + onSelect={onArchive} > - Copy branch name + {archiveLabel ?? "Archive"} - ) : null} - } - trailing={archiveShortcutKeys ? : null} - status={archiveStatus} - pendingLabel={archivePendingLabel} - onSelect={onArchive} - > - {archiveLabel ?? "Archive"} - - - - ) : workspace.diffStat ? ( - - +{workspace.diffStat.additions} - -{workspace.diffStat.deletions} - - ) : null} - {showShortcutBadge && shortcutNumber !== null ? ( - - {shortcutNumber} - - ) : null} - - - {prHint ? ( - - + + + ) : workspace.diffStat ? ( + + +{workspace.diffStat.additions} + -{workspace.diffStat.deletions} + + ) : null} + {showShortcutBadge && shortcutNumber !== null ? ( + + {shortcutNumber} + + ) : null} + - ) : null} - - + {prHint ? ( + + + + + ) : null} + + + ); } @@ -1091,12 +1091,17 @@ function WorkspaceRowWithMenu({ (state) => state.sessions[workspace.serverId]?.workspaces ?? EMPTY_WORKSPACES, ); const [isArchivingWorkspace, setIsArchivingWorkspace] = useState(false); + const workspaceDirectory = resolveWorkspaceExecutionDirectory({ + workspaceDirectory: workspace.workspaceDirectory, + }); const archiveStatus = useCheckoutGitActionsStore((state) => - state.getStatus({ - serverId: workspace.serverId, - cwd: workspace.workspaceId, - actionId: "archive-worktree", - }), + workspaceDirectory + ? 
state.getStatus({ + serverId: workspace.serverId, + cwd: workspaceDirectory, + actionId: "archive-worktree", + }) + : "idle", ); const isWorktree = workspace.workspaceKind === "worktree"; const isArchiving = isWorktree ? archiveStatus === "pending" : isArchivingWorkspace; @@ -1134,19 +1139,32 @@ function WorkspaceRowWithMenu({ if (!confirmed) { return; } + let workspaceDirectory: string; + try { + workspaceDirectory = requireWorkspaceExecutionDirectory({ + workspaceId: workspace.workspaceId, + workspaceDirectory: workspace.workspaceDirectory, + }); + } catch (error) { + toast.error(error instanceof Error ? error.message : "Workspace path not available"); + return; + } + + if (!workspaceDirectory) { + toast.error("Workspace path not available"); + return; + } + + redirectAfterArchive(); void archiveWorktree({ serverId: workspace.serverId, - cwd: workspace.workspaceId, - worktreePath: workspace.workspaceId, - }) - .then(() => { - redirectAfterArchive(); - }) - .catch((error) => { - const message = error instanceof Error ? error.message : "Failed to archive worktree"; - toast.error(message); - }); + cwd: workspaceDirectory, + worktreePath: workspaceDirectory, + }).catch((error) => { + const message = error instanceof Error ? 
error.message : "Failed to archive worktree"; + toast.error(message); + }); })(); }, [ archiveWorktree, @@ -1154,6 +1172,7 @@ function WorkspaceRowWithMenu({ redirectAfterArchive, toast, workspace.name, + workspace.workspaceDirectory, workspace.serverId, workspace.workspaceId, ]); @@ -1182,17 +1201,20 @@ function WorkspaceRowWithMenu({ } setIsArchivingWorkspace(true); - try { - const payload = await client.archiveWorkspace(workspace.workspaceId); - if (payload.error) { - throw new Error(payload.error); + redirectAfterArchive(); + + void (async () => { + try { + const payload = await client.archiveWorkspace(Number(workspace.workspaceId)); + if (payload.error) { + throw new Error(payload.error); + } + } catch (error) { + toast.error(error instanceof Error ? error.message : "Failed to hide workspace"); + } finally { + setIsArchivingWorkspace(false); } - redirectAfterArchive(); - } catch (error) { - toast.error(error instanceof Error ? error.message : "Failed to hide workspace"); - } finally { - setIsArchivingWorkspace(false); - } + })(); })(); }, [ isArchivingWorkspace, @@ -1204,9 +1226,19 @@ function WorkspaceRowWithMenu({ ]); const handleCopyPath = useCallback(() => { - void Clipboard.setStringAsync(workspace.workspaceId); + let workspaceDirectory: string; + try { + workspaceDirectory = requireWorkspaceExecutionDirectory({ + workspaceId: workspace.workspaceId, + workspaceDirectory: workspace.workspaceDirectory, + }); + } catch (error) { + toast.error(error instanceof Error ? 
error.message : "Workspace path not available"); + return; + } + void Clipboard.setStringAsync(workspaceDirectory); toast.copied("Path copied"); - }, [toast, workspace.workspaceId]); + }, [toast, workspace.workspaceDirectory, workspace.workspaceId]); const handleCopyBranchName = useCallback(() => { void Clipboard.setStringAsync(workspace.name); @@ -1254,6 +1286,163 @@ function WorkspaceRowWithMenu({ ); } +function NonGitProjectRowWithMenuContent({ + project, + displayName, + iconDataUri, + workspace, + selected, + onPress, + shortcutNumber, + showShortcutBadge, + drag, + isDragging, + dragHandleProps, +}: { + project: SidebarProjectEntry; + displayName: string; + iconDataUri: string | null; + workspace: SidebarWorkspaceEntry; + selected: boolean; + onPress: () => void; + shortcutNumber: number | null; + showShortcutBadge: boolean; + drag: () => void; + isDragging: boolean; + dragHandleProps?: DraggableListDragHandleProps; +}) { + const toast = useToast(); + const contextMenu = useContextMenu(); + const activeWorkspaceSelection = useNavigationActiveWorkspaceSelection(); + const sessionWorkspaces = useSessionStore( + (state) => state.sessions[workspace.serverId]?.workspaces ?? 
EMPTY_WORKSPACES, + ); + const [isArchivingWorkspace, setIsArchivingWorkspace] = useState(false); + const redirectAfterArchive = useCallback(() => { + if ( + activeWorkspaceSelection?.serverId !== workspace.serverId || + activeWorkspaceSelection.workspaceId !== workspace.workspaceId + ) { + return; + } + + router.replace( + buildWorkspaceArchiveRedirectRoute({ + serverId: workspace.serverId, + archivedWorkspaceId: workspace.workspaceId, + workspaces: sessionWorkspaces.values(), + }) as any, + ); + }, [activeWorkspaceSelection, sessionWorkspaces, workspace.serverId, workspace.workspaceId]); + + const handleArchiveWorkspace = useCallback(() => { + if (isArchivingWorkspace) { + return; + } + + void (async () => { + const confirmed = await confirmDialog({ + title: "Hide workspace?", + message: `Hide "${workspace.name}" from the sidebar?\n\nFiles on disk will not be changed.`, + confirmLabel: "Hide", + cancelLabel: "Cancel", + destructive: true, + }); + if (!confirmed) { + return; + } + + const client = getHostRuntimeStore().getClient(workspace.serverId); + if (!client) { + toast.error("Host is not connected"); + return; + } + + setIsArchivingWorkspace(true); + redirectAfterArchive(); + + void (async () => { + try { + const payload = await client.archiveWorkspace(Number(workspace.workspaceId)); + if (payload.error) { + throw new Error(payload.error); + } + } catch (error) { + toast.error(error instanceof Error ? 
error.message : "Failed to hide workspace"); + } finally { + setIsArchivingWorkspace(false); + } + })(); + })(); + }, [ + isArchivingWorkspace, + redirectAfterArchive, + toast, + workspace.name, + workspace.serverId, + workspace.workspaceId, + ]); + + return ( + <> + + + + Hide from sidebar + + + + ); +} + +function NonGitProjectRowWithMenu(props: { + project: SidebarProjectEntry; + displayName: string; + iconDataUri: string | null; + workspace: SidebarWorkspaceEntry; + selected: boolean; + onPress: () => void; + shortcutNumber: number | null; + showShortcutBadge: boolean; + drag: () => void; + isDragging: boolean; + dragHandleProps?: DraggableListDragHandleProps; +}) { + return ( + + + + ); +} + function FlattenedProjectRow({ project, displayName, @@ -1289,6 +1478,24 @@ function FlattenedProjectRow({ onRemoveProject?: () => void; removeProjectStatus?: "idle" | "pending"; }) { + if (project.projectKind === "directory") { + return ( + + ); + } + return ( { + const payload = await client.archiveWorkspace(Number(ws.workspaceId)); if (payload.error) { throw new Error(payload.error); } + }), + ).then((results) => { + const failed = results.filter((r) => r.status === "rejected"); + if (failed.length > 0) { + toast.error("Failed to remove some workspaces"); } - } catch (error) { - toast.error(error instanceof Error ? 
error.message : "Failed to remove project"); - } finally { setIsRemovingProject(false); - } + }); })(); }, [isRemovingProject, serverId, displayName, toast, project.workspaces]); @@ -2211,21 +2421,10 @@ const styles = StyleSheet.create((theme) => ({ opacity: 1, }, workspacePrBadgeRow: { - paddingLeft: WORKSPACE_STATUS_DOT_WIDTH + theme.spacing[2], - }, - workspacePrBadge: { - alignSelf: "flex-start", flexDirection: "row", alignItems: "center", - gap: theme.spacing[1], - }, - workspacePrBadgePressed: { - opacity: 0.82, - }, - workspacePrBadgeText: { - fontSize: theme.fontSize.xs, - fontWeight: theme.fontWeight.normal, - lineHeight: 14, + gap: theme.spacing[2], + paddingLeft: WORKSPACE_STATUS_DOT_WIDTH + theme.spacing[2], }, workspaceCreatingText: { color: theme.colors.foregroundMuted, diff --git a/packages/app/src/components/split-container.tsx b/packages/app/src/components/split-container.tsx index 8b521358f..a8a0adb04 100644 --- a/packages/app/src/components/split-container.tsx +++ b/packages/app/src/components/split-container.tsx @@ -88,12 +88,8 @@ interface SplitContainerProps { onCloseTabsToLeft: (tabId: string, paneTabs: WorkspaceTabDescriptor[]) => Promise | void; onCloseTabsToRight: (tabId: string, paneTabs: WorkspaceTabDescriptor[]) => Promise | void; onCloseOtherTabs: (tabId: string, paneTabs: WorkspaceTabDescriptor[]) => Promise | void; - onSelectNewTabOption: (selection: { - optionId: "__new_tab_agent__" | "__new_tab_terminal__"; - paneId?: string; - }) => void; - onNewTerminalTab: (input: { paneId?: string }) => void; - newTabAgentOptionId?: "__new_tab_agent__" | "__new_tab_terminal__"; + onCreateDraftTab: (input: { paneId?: string }) => void; + onCreateTerminalTab: (input: { paneId?: string }) => void; buildPaneContentModel: (input: { paneId: string; isPaneFocused: boolean; @@ -258,9 +254,8 @@ export function SplitContainer({ onCloseTabsToLeft, onCloseTabsToRight, onCloseOtherTabs, - onSelectNewTabOption, - onNewTerminalTab, - newTabAgentOptionId = 
"__new_tab_agent__", + onCreateDraftTab, + onCreateTerminalTab, buildPaneContentModel, onFocusPane, onSplitPane, @@ -528,9 +523,8 @@ export function SplitContainer({ onCloseTabsToLeft={onCloseTabsToLeft} onCloseTabsToRight={onCloseTabsToRight} onCloseOtherTabs={onCloseOtherTabs} - onSelectNewTabOption={onSelectNewTabOption} - onNewTerminalTab={onNewTerminalTab} - newTabAgentOptionId={newTabAgentOptionId} + onCreateDraftTab={onCreateDraftTab} + onCreateTerminalTab={onCreateTerminalTab} buildPaneContentModel={buildPaneContentModel} onFocusPane={onFocusPane} onSplitPane={onSplitPane} @@ -651,9 +645,8 @@ function SplitNodeView({ onCloseTabsToLeft, onCloseTabsToRight, onCloseOtherTabs, - onSelectNewTabOption, - onNewTerminalTab, - newTabAgentOptionId, + onCreateDraftTab, + onCreateTerminalTab, buildPaneContentModel, onFocusPane, onSplitPane, @@ -687,9 +680,8 @@ function SplitNodeView({ onCloseTabsToLeft={onCloseTabsToLeft} onCloseTabsToRight={onCloseTabsToRight} onCloseOtherTabs={onCloseOtherTabs} - onSelectNewTabOption={onSelectNewTabOption} - onNewTerminalTab={onNewTerminalTab} - newTabAgentOptionId={newTabAgentOptionId} + onCreateDraftTab={onCreateDraftTab} + onCreateTerminalTab={onCreateTerminalTab} buildPaneContentModel={buildPaneContentModel} onFocusPane={onFocusPane} onSplitPane={onSplitPane} @@ -738,9 +730,8 @@ function SplitNodeView({ onCloseTabsToLeft={onCloseTabsToLeft} onCloseTabsToRight={onCloseTabsToRight} onCloseOtherTabs={onCloseOtherTabs} - onSelectNewTabOption={onSelectNewTabOption} - onNewTerminalTab={onNewTerminalTab} - newTabAgentOptionId={newTabAgentOptionId} + onCreateDraftTab={onCreateDraftTab} + onCreateTerminalTab={onCreateTerminalTab} buildPaneContentModel={buildPaneContentModel} onFocusPane={onFocusPane} onSplitPane={onSplitPane} @@ -788,9 +779,8 @@ function SplitPaneView({ onCloseTabsToLeft, onCloseTabsToRight, onCloseOtherTabs, - onSelectNewTabOption, - onNewTerminalTab, - newTabAgentOptionId, + onCreateDraftTab, + onCreateTerminalTab, 
buildPaneContentModel, onFocusPane, onSplitPane, @@ -900,9 +890,8 @@ function SplitPaneView({ onCloseTabsToLeft={(tabId) => onCloseTabsToLeft(tabId, paneTabs)} onCloseTabsToRight={(tabId) => onCloseTabsToRight(tabId, paneTabs)} onCloseOtherTabs={(tabId) => onCloseOtherTabs(tabId, paneTabs)} - onSelectNewTabOption={onSelectNewTabOption} - onNewTerminalTab={onNewTerminalTab} - newTabAgentOptionId={newTabAgentOptionId ?? "__new_tab_agent__"} + onCreateDraftTab={onCreateDraftTab} + onCreateTerminalTab={onCreateTerminalTab} onReorderTabs={(nextTabs) => { onReorderTabsInPane( pane.id, diff --git a/packages/app/src/components/stream-strategy-resolver.ts b/packages/app/src/components/stream-strategy-resolver.ts new file mode 100644 index 000000000..6aa9e3773 --- /dev/null +++ b/packages/app/src/components/stream-strategy-resolver.ts @@ -0,0 +1,14 @@ +import type { ResolveStreamRenderStrategyInput, StreamStrategy } from "./stream-strategy"; +import { createNativeStreamStrategy } from "./stream-strategy-native"; +import { createWebStreamStrategy } from "./stream-strategy-web"; + +export function resolveStreamRenderStrategy( + input: ResolveStreamRenderStrategyInput, +): StreamStrategy { + if (input.platform === "web") { + return createWebStreamStrategy({ + isMobileBreakpoint: input.isMobileBreakpoint, + }); + } + return createNativeStreamStrategy(); +} diff --git a/packages/app/src/components/stream-strategy.ts b/packages/app/src/components/stream-strategy.ts index 47a8a05a6..bca2508a4 100644 --- a/packages/app/src/components/stream-strategy.ts +++ b/packages/app/src/components/stream-strategy.ts @@ -6,8 +6,6 @@ import type { BottomAnchorLocalRequest, BottomAnchorRouteRequest, } from "./use-bottom-anchor-controller"; -import { createNativeStreamStrategy } from "./stream-strategy-native"; -import { createWebStreamStrategy } from "./stream-strategy-web"; type EdgeSlot = "header" | "footer"; type NeighborRelation = "above" | "below"; @@ -183,17 +181,6 @@ export function 
createStreamStrategy(config: StreamStrategyConfig): StreamStrate }; } -export function resolveStreamRenderStrategy( - input: ResolveStreamRenderStrategyInput, -): StreamStrategy { - if (input.platform === "web") { - return createWebStreamStrategy({ - isMobileBreakpoint: input.isMobileBreakpoint, - }); - } - return createNativeStreamStrategy(); -} - export function resolveBottomAnchorTransportBehavior(input: { strategy: StreamStrategy; isViewportSettling: boolean; diff --git a/packages/app/src/components/ui/combobox.tsx b/packages/app/src/components/ui/combobox.tsx index 9d709cbd7..395d1806c 100644 --- a/packages/app/src/components/ui/combobox.tsx +++ b/packages/app/src/components/ui/combobox.tsx @@ -659,13 +659,7 @@ export function Combobox({ ); - const defaultContent = ( - <> - {effectiveOptionsPosition === "above-search" ? optionsList : null} - {searchable ? searchInput : null} - {effectiveOptionsPosition === "below-search" ? optionsList : null} - - ); + const defaultContent = optionsList; const content = children ?? defaultContent; @@ -690,6 +684,7 @@ export function Combobox({ {title} {stickyHeader} + {!children && searchable ? searchInput : null} ) : ( <> + {searchable ? searchInput : null} {effectiveOptionsPosition === "above-search" ? ( {optionsList} - ) : null} - {searchable ? searchInput : null} - {effectiveOptionsPosition === "below-search" ? ( + ) : ( {optionsList} - ) : null} + )} )} @@ -783,15 +777,12 @@ const styles = StyleSheet.create((theme) => ({ searchInputContainer: { flexDirection: "row", alignItems: "center", - borderWidth: 1, - borderColor: theme.colors.border, - backgroundColor: theme.colors.surface1, - borderRadius: theme.borderRadius.lg, paddingHorizontal: theme.spacing[3], - marginHorizontal: theme.spacing[2], - marginBottom: theme.spacing[2], - marginTop: theme.spacing[1], gap: theme.spacing[2], + backgroundColor: theme.colors.surface1, + borderBottomWidth: 1, + borderBottomColor: theme.colors.border, + ...(IS_WEB ? 
{} : { marginHorizontal: theme.spacing[1] }), }, searchInput: { flex: 1, diff --git a/packages/app/src/components/ui/tooltip.tsx b/packages/app/src/components/ui/tooltip.tsx index fdb9eb4a3..b760ee60f 100644 --- a/packages/app/src/components/ui/tooltip.tsx +++ b/packages/app/src/components/ui/tooltip.tsx @@ -256,12 +256,11 @@ export function TooltipTrigger({ asChild = false, triggerRefProp = "ref", ...props -}: PropsWithChildren< - PressableProps & { +}: PressableProps & { asChild?: boolean; triggerRefProp?: string; } ->): ReactElement { +): ReactElement { const ctx = useTooltipContext("TooltipTrigger"); const openTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null); diff --git a/packages/app/src/components/welcome-screen.tsx b/packages/app/src/components/welcome-screen.tsx index 6f51ef1df..ed6414c24 100644 --- a/packages/app/src/components/welcome-screen.tsx +++ b/packages/app/src/components/welcome-screen.tsx @@ -264,6 +264,11 @@ export function WelcomeScreen({ onHostAdded }: WelcomeScreenProps) { ); useEffect(() => { + const currentPathname = + typeof window === "undefined" ?
null : (window.location.pathname || null); + if (currentPathname && currentPathname !== "/welcome") { + return; + } if (!anyOnlineServerId) { return; } diff --git a/packages/app/src/components/workspace-hover-card.tsx b/packages/app/src/components/workspace-hover-card.tsx new file mode 100644 index 000000000..83f3cc196 --- /dev/null +++ b/packages/app/src/components/workspace-hover-card.tsx @@ -0,0 +1,674 @@ +import { + useCallback, + useEffect, + useRef, + useState, + type PropsWithChildren, + type ReactElement, +} from "react"; +import { useMutation } from "@tanstack/react-query"; +import { Dimensions, Platform, Text, View } from "react-native"; +import Animated, { FadeIn, FadeOut } from "react-native-reanimated"; +import { StyleSheet, useUnistyles } from "react-native-unistyles"; +import { Check, ExternalLink, LoaderCircle, Minus, Play, X } from "lucide-react-native"; +import { Pressable } from "react-native"; +import { Portal } from "@gorhom/portal"; +import { useBottomSheetModalInternal } from "@gorhom/bottom-sheet"; +import type { SidebarWorkspaceEntry } from "@/hooks/use-sidebar-workspaces-list"; +import type { PrHint } from "@/hooks/use-checkout-pr-status-query"; +import { useToast } from "@/contexts/toast-context"; +import { useSessionStore } from "@/stores/session-store"; +import { openExternalUrl } from "@/utils/open-external-url"; +import { PrBadge } from "@/components/sidebar-workspace-list"; + +interface Rect { + x: number; + y: number; + width: number; + height: number; +} + +function measureElement(element: View): Promise<Rect> { + return new Promise((resolve) => { + element.measureInWindow((x, y, width, height) => { + resolve({ x, y, width, height }); + }); + }); +} + +function computeHoverCardPosition({ + triggerRect, + contentSize, + displayArea, + offset, +}: { + triggerRect: Rect; + contentSize: { width: number; height: number }; + displayArea: Rect; + offset: number; +}): { x: number; y: number } { + let x = triggerRect.x + triggerRect.width +
offset; + let y = triggerRect.y; + + // If it overflows right, try left + if (x + contentSize.width > displayArea.width - 8) { + x = triggerRect.x - contentSize.width - offset; + } + + // Constrain to screen + const padding = 8; + x = Math.max(padding, Math.min(displayArea.width - contentSize.width - padding, x)); + y = Math.max( + displayArea.y + padding, + Math.min(displayArea.y + displayArea.height - contentSize.height - padding, y), + ); + + return { x, y }; +} + +const HOVER_GRACE_MS = 100; +const HOVER_CARD_WIDTH = 260; + +interface WorkspaceHoverCardProps { + workspace: SidebarWorkspaceEntry; + prHint: PrHint | null; + isDragging: boolean; +} + +export function WorkspaceHoverCard({ + workspace, + prHint, + isDragging, + children, +}: PropsWithChildren<WorkspaceHoverCardProps>): ReactElement { + // Desktop-only: skip on non-web platforms + if (Platform.OS !== "web") { + return <>{children}</>; + } + + return ( + <WorkspaceHoverCardDesktop workspace={workspace} prHint={prHint} isDragging={isDragging}> + {children} + </WorkspaceHoverCardDesktop> + ); +} + +function WorkspaceHoverCardDesktop({ + workspace, + prHint, + isDragging, + children, +}: PropsWithChildren<WorkspaceHoverCardProps>): ReactElement { + const triggerRef = useRef<View>(null); + const [open, setOpen] = useState(false); + const graceTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null); + const triggerHoveredRef = useRef(false); + const contentHoveredRef = useRef(false); + + const hasServices = workspace.services.length > 0; + const hasContent = hasServices || prHint !== null; + + const clearGraceTimer = useCallback(() => { + if (graceTimerRef.current) { + clearTimeout(graceTimerRef.current); + graceTimerRef.current = null; + } + }, []); + + const scheduleClose = useCallback(() => { + clearGraceTimer(); + graceTimerRef.current = setTimeout(() => { + if (!triggerHoveredRef.current && !contentHoveredRef.current) { + setOpen(false); + } + graceTimerRef.current = null; + }, HOVER_GRACE_MS); + }, [clearGraceTimer]); + + const handleTriggerEnter = useCallback(() => { + triggerHoveredRef.current = true; + clearGraceTimer(); + if (!isDragging && hasContent) { + setOpen(true); + } + },
[clearGraceTimer, isDragging, hasContent]); + + const handleTriggerLeave = useCallback(() => { + triggerHoveredRef.current = false; + scheduleClose(); + }, [scheduleClose]); + + const handleContentEnter = useCallback(() => { + contentHoveredRef.current = true; + clearGraceTimer(); + }, [clearGraceTimer]); + + const handleContentLeave = useCallback(() => { + contentHoveredRef.current = false; + scheduleClose(); + }, [scheduleClose]); + + // Close when drag starts + useEffect(() => { + if (isDragging) { + clearGraceTimer(); + setOpen(false); + } + }, [isDragging, clearGraceTimer]); + + // When content becomes available while trigger is already hovered, open the card. + useEffect(() => { + if (!hasContent || isDragging) return; + if (triggerHoveredRef.current) { + setOpen(true); + } + }, [hasContent, isDragging]); + + // Cleanup on unmount + useEffect(() => { + return () => { + clearGraceTimer(); + }; + }, [clearGraceTimer]); + + return ( + <View ref={triggerRef} onPointerEnter={handleTriggerEnter} onPointerLeave={handleTriggerLeave}> + {children} + {open && hasContent ? ( + <WorkspaceHoverCardContent workspace={workspace} prHint={prHint} triggerRef={triggerRef} onContentEnter={handleContentEnter} onContentLeave={handleContentLeave} /> + ) : null} + </View> + ); +} + +function getServiceHealthColor(input: { + health: SidebarWorkspaceEntry["services"][number]["health"]; + theme: ReturnType<typeof useUnistyles>["theme"]; +}): string { + if (input.health === "healthy") { + return input.theme.colors.palette.green[500]; + } + if (input.health === "unhealthy") { + return input.theme.colors.palette.red[500]; + } + return input.theme.colors.foregroundMuted; +} + +function getServiceHealthLabel( + health: SidebarWorkspaceEntry["services"][number]["health"], +): "Healthy" | "Unhealthy" | "Unknown" { + if (health === "healthy") { + return "Healthy"; + } + if (health === "unhealthy") { + return "Unhealthy"; + } + return "Unknown"; +} + + +function getCheckStatusColor(input: { + status: string; + theme: ReturnType<typeof useUnistyles>["theme"]; +}): string { + if (input.status === "success") return input.theme.colors.palette.green[500]; + if (input.status === "failure") return input.theme.colors.palette.red[500]; + if (input.status === "pending") return
input.theme.colors.palette.amber[500]; + return input.theme.colors.foregroundMuted; +} + +function getCheckStatusIcon(status: string): typeof Check { + if (status === "success") return Check; + if (status === "failure") return X; + return Minus; +} + +export function CheckStatusIndicator({ + status, + size = 12, +}: { + status: string; + size?: number; +}): ReactElement | null { + const { theme } = useUnistyles(); + + if (!status || status === "none") return null; + + const color = getCheckStatusColor({ status, theme }); + const IconComponent = getCheckStatusIcon(status); + + return ( + + + + ); +} + +function ChecksSummary({ checks }: { checks: Array<{ status: string }> }): ReactElement { + const { theme } = useUnistyles(); + const counts: Record<string, number> = {}; + for (const check of checks) { + const bucket = check.status === "success" ? "success" : check.status === "failure" ? "failure" : "pending"; + counts[bucket] = (counts[bucket] ?? 0) + 1; + } + + const buckets: Array<{ status: string; count: number }> = []; + if (counts.failure) buckets.push({ status: "failure", count: counts.failure }); + if (counts.success) buckets.push({ status: "success", count: counts.success }); + if (counts.pending) buckets.push({ status: "pending", count: counts.pending }); + + return ( + <> + {buckets.map((bucket) => { + const color = getCheckStatusColor({ status: bucket.status, theme }); + return ( + + {bucket.count} + + + ); + })} + </> + ); +} + +const checksSummaryStyles = StyleSheet.create((theme) => ({ + item: { + flexDirection: "row", + alignItems: "center", + gap: 3, + }, + count: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.medium, + }, +})); + +function WorkspaceHoverCardContent({ + workspace, + prHint, + triggerRef, + onContentEnter, + onContentLeave, +}: { + workspace: SidebarWorkspaceEntry; + prHint: PrHint | null; + triggerRef: React.RefObject<View>; + onContentEnter: () => void; + onContentLeave: () => void; +}): ReactElement | null { + const { theme } =
useUnistyles(); + const toast = useToast(); + const client = useSessionStore((state) => state.sessions[workspace.serverId]?.client ?? null); + const bottomSheetInternal = useBottomSheetModalInternal(true); + const [triggerRect, setTriggerRect] = useState<Rect | null>(null); + const [contentSize, setContentSize] = useState<{ width: number; height: number } | null>(null); + const [position, setPosition] = useState<{ x: number; y: number } | null>(null); + const startServiceMutation = useMutation({ + mutationFn: async (serviceName: string) => { + if (!client) { + throw new Error("Daemon client not available"); + } + const result = await client.startWorkspaceService(workspace.workspaceId, serviceName); + if (result.error) { + throw new Error(result.error); + } + return result; + }, + onError: (error, serviceName) => { + toast.show( + error instanceof Error ? error.message : `Failed to start ${serviceName}`, + { variant: "error" }, + ); + }, + }); + + // Measure trigger — same pattern as tooltip.tsx + useEffect(() => { + if (!triggerRef.current) return; + + let cancelled = false; + measureElement(triggerRef.current).then((rect) => { + if (cancelled) return; + setTriggerRect(rect); + }); + + return () => { + cancelled = true; + }; + }, [triggerRef]); + + // Compute position when both measurements are available + useEffect(() => { + if (!triggerRect || !contentSize) return; + const { width: screenWidth, height: screenHeight } = Dimensions.get("window"); + const displayArea = { x: 0, y: 0, width: screenWidth, height: screenHeight }; + const result = computeHoverCardPosition({ + triggerRect, + contentSize, + displayArea, + offset: 4, + }); + setPosition(result); + }, [triggerRect, contentSize]); + + const handleLayout = useCallback( + (event: { nativeEvent: { layout: { width: number; height: number } } }) => { + const { width, height } = event.nativeEvent.layout; + setContentSize({ width, height }); + }, + [], + ); + + return ( + + + + + + {workspace.name} + + + {prHint ||
workspace.diffStat ? ( + + {workspace.diffStat ? ( + + +{workspace.diffStat.additions} + -{workspace.diffStat.deletions} + + ) : null} + {prHint ? : null} + + ) : null} + {workspace.services.length > 0 ? ( + <> + + Services + + {workspace.services.map((service) => { + const isRunning = service.lifecycle === "running"; + const isLinkable = isRunning && !!service.url; + return ( + [ + styles.listRow, + hovered && isLinkable && styles.listRowHovered, + ]} + onPress={isLinkable ? () => void openExternalUrl(service.url!) : undefined} + disabled={!isLinkable} + > + {({ hovered }) => ( + <> + + + {service.serviceName} + + {isRunning && service.url ? ( + + {service.url.replace(/^https?:\/\//, "")} + + ) : ( + + )} + {isRunning ? ( + service.url ? ( + + ) : null + ) : ( + { + event.stopPropagation(); + startServiceMutation.mutate(service.serviceName); + }} + > + {({ hovered: actionHovered }) => + startServiceMutation.isPending && + startServiceMutation.variables === service.serviceName ? ( + + ) : ( + + ) + } + + )} + + )} + + ); + })} + + + ) : null} + {prHint?.checks && prHint.checks.length > 0 ? 
( + <> + + [ + styles.checksSummaryRow, + hovered && styles.listRowHovered, + ]} + onPress={() => void openExternalUrl(`${prHint.url}/checks`)} + > + {({ hovered }) => ( + <> + Checks + + + + + + )} + + + ) : null} + + + + ); +} + +const styles = StyleSheet.create((theme) => ({ + portalOverlay: { + position: "absolute", + top: 0, + right: 0, + bottom: 0, + left: 0, + zIndex: 1000, + }, + card: { + backgroundColor: theme.colors.surface1, + borderWidth: 1, + borderColor: theme.colors.borderAccent, + borderRadius: theme.borderRadius.lg, + paddingVertical: theme.spacing[2], + shadowColor: "#000", + shadowOffset: { width: 0, height: 4 }, + shadowOpacity: 0.2, + shadowRadius: 8, + elevation: 8, + zIndex: 1000, + }, + cardHeader: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + paddingHorizontal: theme.spacing[3], + paddingBottom: theme.spacing[2], + }, + cardTitle: { + color: theme.colors.foreground, + fontSize: theme.fontSize.sm, + fontWeight: theme.fontWeight.normal, + flex: 1, + minWidth: 0, + }, + cardMetaRow: { + flexDirection: "row", + alignItems: "center", + gap: 6, + paddingHorizontal: theme.spacing[3], + paddingBottom: theme.spacing[2], + }, + diffStatRow: { + flexDirection: "row", + alignItems: "center", + gap: 4, + }, + diffStatAdditions: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.normal, + color: theme.colors.palette.green[400], + }, + diffStatDeletions: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.normal, + color: theme.colors.palette.red[500], + }, + separator: { + height: 1, + backgroundColor: theme.colors.border, + }, + sectionLabel: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.medium, + color: theme.colors.foregroundMuted, + paddingHorizontal: theme.spacing[3], + paddingTop: theme.spacing[2], + paddingBottom: theme.spacing[1], + }, + sectionList: { + paddingBottom: theme.spacing[1], + }, + listRow: { + flexDirection: "row", + alignItems: "center", + gap: 
theme.spacing[2], + paddingHorizontal: theme.spacing[3], + paddingVertical: 6, + minHeight: 28, + }, + listRowHovered: { + backgroundColor: theme.colors.surface2, + }, + listRowLabel: { + fontSize: theme.fontSize.sm, + flexShrink: 0, + }, + listRowSecondary: { + color: theme.colors.foregroundMuted, + fontSize: theme.fontSize.xs, + flex: 1, + minWidth: 0, + textAlign: "right", + }, + listRowSpacer: { + flex: 1, + minWidth: 0, + }, + statusDot: { + width: 8, + height: 8, + borderRadius: 4, + flexShrink: 0, + }, + checksSummaryRow: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + paddingHorizontal: theme.spacing[3], + paddingVertical: 6, + minHeight: 28, + }, + checksSummaryLabel: { + fontSize: theme.fontSize.xs, + fontWeight: theme.fontWeight.medium, + color: theme.colors.foregroundMuted, + }, + checksSummaryCounts: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + flex: 1, + justifyContent: "flex-end", + }, +})); diff --git a/packages/app/src/components/workspace-setup-dialog.tsx b/packages/app/src/components/workspace-setup-dialog.tsx new file mode 100644 index 000000000..b59ffbc6d --- /dev/null +++ b/packages/app/src/components/workspace-setup-dialog.tsx @@ -0,0 +1,350 @@ +import { useCallback, useEffect, useMemo, useRef, useState } from "react"; +import { Image, Text, View } from "react-native"; +import { StyleSheet, useUnistyles } from "react-native-unistyles"; +import { createNameId } from "mnemonic-id"; +import { AdaptiveModalSheet } from "@/components/adaptive-modal-sheet"; +import { Composer } from "@/components/composer"; +import { useToast } from "@/contexts/toast-context"; +import { useAgentInputDraft } from "@/hooks/use-agent-input-draft"; +import { useProjectIconQuery } from "@/hooks/use-project-icon-query"; +import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime"; +import { normalizeWorkspaceDescriptor, useSessionStore } from "@/stores/session-store"; +import { 
useWorkspaceSetupStore } from "@/stores/workspace-setup-store"; +import { normalizeAgentSnapshot } from "@/utils/agent-snapshots"; +import { encodeImages } from "@/utils/encode-images"; +import { toErrorMessage } from "@/utils/error-messages"; +import { projectIconPlaceholderLabelFromDisplayName } from "@/utils/project-display-name"; +import { + requireWorkspaceExecutionAuthority, +} from "@/utils/workspace-execution"; +import { navigateToPreparedWorkspaceTab } from "@/utils/workspace-navigation"; +import type { ImageAttachment, MessagePayload } from "./message-input"; + +function toProjectIconDataUri(icon: { mimeType: string; data: string } | null): string | null { + if (!icon) { + return null; + } + return `data:${icon.mimeType};base64,${icon.data}`; +} + +export function WorkspaceSetupDialog() { + const { theme } = useUnistyles(); + const toast = useToast(); + const pendingWorkspaceSetup = useWorkspaceSetupStore((state) => state.pendingWorkspaceSetup); + const clearWorkspaceSetup = useWorkspaceSetupStore((state) => state.clearWorkspaceSetup); + const mergeWorkspaces = useSessionStore((state) => state.mergeWorkspaces); + const setHasHydratedWorkspaces = useSessionStore((state) => state.setHasHydratedWorkspaces); + const setAgents = useSessionStore((state) => state.setAgents); + const [errorMessage, setErrorMessage] = useState(null); + const [createdWorkspace, setCreatedWorkspace] = useState | null>(null); + const [pendingAction, setPendingAction] = useState<"chat" | null>(null); + + const serverId = pendingWorkspaceSetup?.serverId ?? ""; + const sourceDirectory = pendingWorkspaceSetup?.sourceDirectory ?? ""; + const displayName = pendingWorkspaceSetup?.displayName?.trim() ?? 
""; + const workspace = createdWorkspace; + const client = useHostRuntimeClient(serverId); + const isConnected = useHostRuntimeIsConnected(serverId); + const chatDraft = useAgentInputDraft({ + draftKey: `workspace-setup:${serverId}:${sourceDirectory}`, + composer: { + initialServerId: serverId || null, + initialValues: workspace?.workspaceDirectory + ? { workingDir: workspace.workspaceDirectory } + : undefined, + isVisible: pendingWorkspaceSetup !== null, + onlineServerIds: isConnected && serverId ? [serverId] : [], + lockedWorkingDir: workspace?.workspaceDirectory || sourceDirectory || undefined, + }, + }); + const composerState = chatDraft.composerState; + if (!composerState && pendingWorkspaceSetup) { + throw new Error("Workspace setup composer state is required"); + } + + const { icon: projectIcon } = useProjectIconQuery({ + serverId, + cwd: sourceDirectory, + }); + const iconDataUri = toProjectIconDataUri(projectIcon); + + useEffect(() => { + setErrorMessage(null); + setCreatedWorkspace(null); + setPendingAction(null); + }, [pendingWorkspaceSetup?.creationMethod, serverId, sourceDirectory]); + + const handleClose = useCallback(() => { + clearWorkspaceSetup(); + }, [clearWorkspaceSetup]); + + const navigateAfterCreation = useCallback( + ( + workspaceId: string, + target: { kind: "agent"; agentId: string } | { kind: "terminal"; terminalId: string }, + ) => { + if (!pendingWorkspaceSetup) { + return; + } + + clearWorkspaceSetup(); + navigateToPreparedWorkspaceTab({ + serverId: pendingWorkspaceSetup.serverId, + workspaceId, + target, + navigationMethod: pendingWorkspaceSetup.navigationMethod, + }); + }, + [clearWorkspaceSetup, pendingWorkspaceSetup], + ); + + const withConnectedClient = useCallback(() => { + if (!client || !isConnected) { + throw new Error("Host is not connected"); + } + return client; + }, [client, isConnected]); + + const ensureWorkspace = useCallback(async () => { + if (!pendingWorkspaceSetup) { + throw new Error("No workspace setup is 
pending"); + } + + if (createdWorkspace) { + return createdWorkspace; + } + + const connectedClient = withConnectedClient(); + const payload = + pendingWorkspaceSetup.creationMethod === "create_worktree" + ? await connectedClient.createPaseoWorktree({ + cwd: pendingWorkspaceSetup.sourceDirectory, + worktreeSlug: createNameId(), + }) + : await connectedClient.openProject(pendingWorkspaceSetup.sourceDirectory); + + if (payload.error || !payload.workspace) { + throw new Error( + payload.error ?? + (pendingWorkspaceSetup.creationMethod === "create_worktree" + ? "Failed to create worktree" + : "Failed to open project"), + ); + } + + const normalizedWorkspace = normalizeWorkspaceDescriptor(payload.workspace); + mergeWorkspaces(pendingWorkspaceSetup.serverId, [normalizedWorkspace]); + if (pendingWorkspaceSetup.creationMethod === "open_project") { + setHasHydratedWorkspaces(pendingWorkspaceSetup.serverId, true); + } + setCreatedWorkspace(normalizedWorkspace); + return normalizedWorkspace; + }, [ + createdWorkspace, + mergeWorkspaces, + pendingWorkspaceSetup, + setHasHydratedWorkspaces, + withConnectedClient, + ]); + + const getIsStillActive = useCallback(() => { + const current = useWorkspaceSetupStore.getState().pendingWorkspaceSetup; + return ( + current?.serverId === pendingWorkspaceSetup?.serverId && + current?.sourceDirectory === pendingWorkspaceSetup?.sourceDirectory && + current?.creationMethod === pendingWorkspaceSetup?.creationMethod + ); + }, [ + pendingWorkspaceSetup?.creationMethod, + pendingWorkspaceSetup?.serverId, + pendingWorkspaceSetup?.sourceDirectory, + ]); + + const handleCreateChatAgent = useCallback( + async ({ text, images }: MessagePayload) => { + try { + setPendingAction("chat"); + setErrorMessage(null); + const workspace = await ensureWorkspace(); + const connectedClient = withConnectedClient(); + if (!composerState) { + throw new Error("Workspace setup composer state is required"); + } + + const encodedImages = await encodeImages(images); + const 
workspaceDirectory = requireWorkspaceExecutionAuthority({ workspace }).workspaceDirectory; + const agent = await connectedClient.createAgent({ + provider: composerState.selectedProvider, + cwd: workspaceDirectory, + workspaceId: workspace.id, + ...(composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? { modeId: composerState.selectedMode } + : {}), + ...(composerState.effectiveModelId ? { model: composerState.effectiveModelId } : {}), + ...(composerState.effectiveThinkingOptionId + ? { thinkingOptionId: composerState.effectiveThinkingOptionId } + : {}), + ...(text.trim() ? { initialPrompt: text.trim() } : {}), + ...(encodedImages && encodedImages.length > 0 ? { images: encodedImages } : {}), + }); + + if (!getIsStillActive()) { + return; + } + + setAgents(serverId, (previous) => { + const next = new Map(previous); + next.set(agent.id, normalizeAgentSnapshot(agent, serverId)); + return next; + }); + navigateAfterCreation(workspace.id, { kind: "agent", agentId: agent.id }); + } catch (error) { + const message = toErrorMessage(error); + setErrorMessage(message); + toast.error(message); + } finally { + if (getIsStillActive()) { + setPendingAction(null); + } + } + }, + [ + composerState, + getIsStillActive, + navigateAfterCreation, + serverId, + setAgents, + ensureWorkspace, + toast, + withConnectedClient, + ], + ); + + + const workspaceTitle = + workspace?.name || + workspace?.projectDisplayName || + displayName || + sourceDirectory.split(/[\\/]/).filter(Boolean).pop() || + sourceDirectory; + + const placeholderLabel = projectIconPlaceholderLabelFromDisplayName(workspaceTitle); + const placeholderInitial = placeholderLabel.charAt(0).toUpperCase(); + + const addImagesRef = useRef<((images: ImageAttachment[]) => void) | null>(null); + const handleFilesDropped = useCallback((files: ImageAttachment[]) => { + addImagesRef.current?.(files); + }, []); + const handleAddImagesCallback = useCallback((addImages: (images: ImageAttachment[]) => void) => { + 
addImagesRef.current = addImages; + }, []); + + const composerInputWrapperStyle = useMemo( + () => ({ backgroundColor: theme.colors.surface2 }), + [theme.colors.surface2], + ); + + if (!pendingWorkspaceSetup || !sourceDirectory) { + return null; + } + + const subtitleContent = ( + + {iconDataUri ? ( + + ) : ( + + {placeholderInitial} + + )} + + {workspaceTitle} + + + ); + + return ( + + + + + + {errorMessage ? {errorMessage} : null} + + ); +} + +const styles = StyleSheet.create((theme) => ({ + subtitleRow: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + }, + projectIcon: { + width: theme.iconSize.md, + height: theme.iconSize.md, + borderRadius: theme.borderRadius.sm, + }, + projectIconFallback: { + width: theme.iconSize.md, + height: theme.iconSize.md, + borderRadius: theme.borderRadius.sm, + borderWidth: 1, + borderColor: theme.colors.border, + alignItems: "center", + justifyContent: "center", + }, + projectIconFallbackText: { + color: theme.colors.foregroundMuted, + fontSize: 9, + }, + projectTitle: { + fontSize: theme.fontSize.sm, + color: theme.colors.foregroundMuted, + }, + section: { + gap: theme.spacing[3], + marginHorizontal: -theme.spacing[6], + marginVertical: -theme.spacing[2], + }, + errorText: { + fontSize: theme.fontSize.sm, + color: theme.colors.destructive, + lineHeight: 20, + }, +})); diff --git a/packages/app/src/contexts/session-context.service-status.test.ts b/packages/app/src/contexts/session-context.service-status.test.ts new file mode 100644 index 000000000..31219679f --- /dev/null +++ b/packages/app/src/contexts/session-context.service-status.test.ts @@ -0,0 +1,91 @@ +import { describe, expect, it } from "vitest"; +import type { WorkspaceServicePayload } from "@server/shared/messages"; +import type { WorkspaceDescriptor } from "@/stores/session-store"; +import { patchWorkspaceServices } from "./session-workspace-services"; + +function workspace(input: { + id: string; + services?: WorkspaceDescriptor["services"]; 
+}): WorkspaceDescriptor { + return { + id: input.id, + projectId: "project-1", + projectDisplayName: "Project 1", + projectRootPath: "/repo", + workspaceDirectory: input.id, + projectKind: "git", + workspaceKind: "checkout", + name: "main", + status: "running", + activityAt: null, + diffStat: null, + services: input.services ?? [], + }; +} + +const runningService: WorkspaceServicePayload = { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running", + health: "healthy", +}; + +describe("patchWorkspaceServices", () => { + it("patches only the matching workspace services", () => { + const other = workspace({ id: "/repo/other", services: [] }); + const current = new Map([ + ["/repo/main", workspace({ id: "/repo/main", services: [] })], + [other.id, other], + ]); + + const next = patchWorkspaceServices(current, { + workspaceId: "/repo/main", + services: [runningService], + }); + + expect(next).not.toBe(current); + expect(next.get("/repo/main")?.services).toEqual([runningService]); + expect(next.get("/repo/other")).toBe(other); + }); + + it("patches the matching workspace when the update uses workspace directory identity", () => { + const current = new Map([ + [ + "42", + workspace({ + id: "42", + services: [], + }), + ], + ]); + + current.set("42", { + ...current.get("42")!, + workspaceDirectory: "C:\\repo\\main\\", + }); + + const next = patchWorkspaceServices(current, { + workspaceId: "C:/repo/main", + services: [runningService], + }); + + expect(next).not.toBe(current); + expect(next.get("42")?.services).toEqual([runningService]); + }); + + it("ignores updates for unknown workspaces", () => { + const current = new Map([ + ["/repo/main", workspace({ id: "/repo/main", services: [] })], + ]); + + const next = patchWorkspaceServices(current, { + workspaceId: "/repo/missing", + services: [runningService], + }); + + expect(next).toBe(current); + 
expect(next.get("/repo/main")?.services).toEqual([]); + }); +}); diff --git a/packages/app/src/contexts/session-context.tsx b/packages/app/src/contexts/session-context.tsx index 675caa210..5694be493 100644 --- a/packages/app/src/contexts/session-context.tsx +++ b/packages/app/src/contexts/session-context.tsx @@ -37,6 +37,7 @@ import { normalizeWorkspaceDescriptor, } from "@/stores/session-store"; import { useDraftStore } from "@/stores/draft-store"; +import { useWorkspaceSetupStore } from "@/stores/workspace-setup-store"; import type { AgentDirectoryEntry } from "@/types/agent-directory"; import { sendOsNotification } from "@/utils/os-notifications"; import { getIsAppActivelyVisible } from "@/utils/app-visibility"; @@ -52,6 +53,7 @@ import { resolveProjectPlacement } from "@/utils/project-placement"; import { buildDraftStoreKey } from "@/stores/draft-keys"; import type { AttachmentMetadata } from "@/attachments/types"; import { reconcilePreviousAgentStatuses } from "@/contexts/session-status-tracking"; +import { patchWorkspaceServices } from "@/contexts/session-workspace-services"; // Re-export types from session-store and draft-store for backward compatibility export type { DraftInput } from "@/stores/draft-store"; @@ -177,6 +179,10 @@ type WorkspaceUpdatePayload = Extract< SessionOutboundMessage, { type: "workspace_update" } >["payload"]; +type WorkspaceSetupProgressPayload = Extract< + SessionOutboundMessage, + { type: "workspace_setup_progress" } +>["payload"]; const getAgentIdFromUpdate = (update: AgentUpdatePayload): string => update.kind === "remove" ? 
update.agentId : update.agent.id; @@ -282,6 +288,9 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider const setQueuedMessages = useSessionStore((state) => state.setQueuedMessages); const updateSessionClient = useSessionStore((state) => state.updateSessionClient); const updateSessionServerInfo = useSessionStore((state) => state.updateSessionServerInfo); + const upsertWorkspaceSetupProgress = useWorkspaceSetupStore((state) => state.upsertProgress); + const removeWorkspaceSetup = useWorkspaceSetupStore((state) => state.removeWorkspace); + const clearWorkspaceSetupServer = useWorkspaceSetupStore((state) => state.clearServer); // Track focused agent for heartbeat const focusedAgentId = useSessionStore( @@ -555,9 +564,8 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider void client .fetchAgentTimeline(agentId, { direction: "after", - cursor: { epoch: cursor.epoch, seq: cursor.endSeq }, + cursor: { seq: cursor.endSeq }, limit: 0, - projection: "canonical", }) .catch((error) => { console.warn("[Session] failed to fetch catch-up timeline on resume", agentId, error); @@ -810,14 +818,20 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider ], ); + const applyWorkspaceSetupProgress = useCallback( + (payload: WorkspaceSetupProgressPayload) => { + upsertWorkspaceSetupProgress({ serverId, payload }); + }, + [serverId, upsertWorkspaceSetupProgress], + ); + const requestCanonicalCatchUp = useCallback( - (agentId: string, cursor: { epoch: string; endSeq: number }) => { + (agentId: string, cursor: { endSeq: number }) => { void client .fetchAgentTimeline(agentId, { direction: "after", - cursor: { epoch: cursor.epoch, seq: cursor.endSeq }, + cursor: { seq: cursor.endSeq }, limit: 0, - projection: "canonical", }) .catch((error) => { console.warn("[Session] failed to fetch canonical catch-up timeline", agentId, error); @@ -920,7 +934,6 @@ function SessionProviderInternal({ children, serverId, 
client }: SessionProvider } if ( current && - current.epoch === result.cursor.epoch && current.startSeq === result.cursor.startSeq && current.endSeq === result.cursor.endSeq ) { @@ -1025,7 +1038,7 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider const unsubAgentStream = client.on("agent_stream", (message) => { if (message.type !== "agent_stream") return; - const { agentId, event, timestamp, seq, epoch } = message.payload; + const { agentId, event, timestamp, seq } = message.payload; const parsedTimestamp = new Date(timestamp); const streamEvent = event as AgentStreamEventPayload; if ( @@ -1067,7 +1080,6 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider const result = processAgentStreamEvent({ event: streamEvent, seq, - epoch, currentTail, currentHead, currentCursor, @@ -1091,8 +1103,6 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider if ( current && typeof seq === "number" && - typeof epoch === "string" && - current.epoch === epoch && seq >= current.startSeq && seq <= current.endSeq ) { @@ -1101,7 +1111,6 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider } if ( current && - current.epoch === nextCursor.epoch && current.startSeq === nextCursor.startSeq && current.endSeq === nextCursor.endSeq ) { @@ -1152,12 +1161,34 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider const unsubWorkspaceUpdate = client.on("workspace_update", (message) => { if (message.type !== "workspace_update") return; if (message.payload.kind === "remove") { - removeWorkspace(serverId, message.payload.id); + removeWorkspaceSetup({ serverId, workspaceId: String(message.payload.id) }); + removeWorkspace(serverId, String(message.payload.id)); return; } mergeWorkspaces(serverId, [normalizeWorkspaceDescriptor(message.payload.workspace)]); }); + const unsubServiceStatusUpdate = client.on("service_status_update", (message) => { + if 
(message.type !== "service_status_update") return; + setWorkspaces(serverId, (prev) => patchWorkspaceServices(prev, message.payload)); + }); + + const unsubWorkspaceSetupProgress = client.on("workspace_setup_progress", (message) => { + if (message.type !== "workspace_setup_progress") return; + applyWorkspaceSetupProgress(message.payload); + }); + + const unsubWorkspaceSetupStatusResponse = client.on( + "workspace_setup_status_response", + (message) => { + if (message.type !== "workspace_setup_status_response") return; + const { workspaceId, snapshot } = message.payload; + if (snapshot) { + applyWorkspaceSetupProgress({ workspaceId, ...snapshot }); + } + }, + ); + const unsubStatus = client.on("status", (message) => { if (message.type !== "status") return; const serverInfo = parseServerInfoStatusPayload(message.payload); @@ -1507,6 +1538,9 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider unsubAgentStream(); unsubAgentTimeline(); unsubWorkspaceUpdate(); + unsubServiceStatusUpdate(); + unsubWorkspaceSetupProgress(); + unsubWorkspaceSetupStatusResponse(); unsubStatus(); unsubPermissionRequest(); unsubPermissionResolved(); @@ -1532,8 +1566,10 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider setAgentTimelineCursor, setInitializingAgents, setAgents, + setWorkspaces, mergeWorkspaces, removeWorkspace, + removeWorkspaceSetup, setAgentLastActivity, setPendingPermissions, setHasHydratedAgents, @@ -1541,6 +1577,7 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider notifyAgentAttention, requestCanonicalCatchUp, applyAgentUpdatePayload, + applyWorkspaceSetupProgress, applyTimelineResponse, voiceRuntime, voiceAudioEngine, @@ -1744,9 +1781,10 @@ function SessionProviderInternal({ children, serverId, client }: SessionProvider // Cleanup on unmount useEffect(() => { return () => { + clearWorkspaceSetupServer(serverId); clearSession(serverId); }; - }, [clearSession, serverId]); + }, 
[clearSession, clearWorkspaceSetupServer, serverId]); return children; } diff --git a/packages/app/src/contexts/session-stream-reducers.test.ts b/packages/app/src/contexts/session-stream-reducers.test.ts index 580189da3..45f7a06ed 100644 --- a/packages/app/src/contexts/session-stream-reducers.test.ts +++ b/packages/app/src/contexts/session-stream-reducers.test.ts @@ -9,13 +9,9 @@ import { type TimelineCursor, } from "./session-stream-reducers"; -// --------------------------------------------------------------------------- -// Test helpers -// --------------------------------------------------------------------------- - function makeTimelineEntry(seq: number, text: string, type: string = "assistant_message") { return { - seqStart: seq, + seq, provider: "claude", item: { type, text }, timestamp: new Date(1000 + seq).toISOString(), @@ -33,22 +29,30 @@ function makeTimelineEvent( } as AgentStreamEventPayload; } -function makeUserTimelineEvent(text: string): AgentStreamEventPayload { +function makeToolCallEvent(status: "running" | "completed"): AgentStreamEventPayload { return { type: "timeline", provider: "claude", - item: { type: "user_message", text }, - } as AgentStreamEventPayload; + item: { + type: "tool_call", + callId: "call-1", + name: "shell", + status, + detail: { + type: "shell", + command: "pwd", + }, + error: null, + }, + }; } const baseTimelineInput: ProcessTimelineResponseInput = { payload: { agentId: "agent-1", direction: "after", - reset: false, - epoch: "epoch-1", - startCursor: null, - endCursor: null, + startSeq: null, + endSeq: null, entries: [], error: null, }, @@ -63,7 +67,6 @@ const baseTimelineInput: ProcessTimelineResponseInput = { const baseStreamInput: ProcessAgentStreamEventInput = { event: makeTimelineEvent("hello"), seq: undefined, - epoch: undefined, currentTail: [], currentHead: [], currentCursor: undefined, @@ -71,10 +74,6 @@ const baseStreamInput: ProcessAgentStreamEventInput = { timestamp: new Date(2000), }; -// 
--------------------------------------------------------------------------- -// processTimelineResponse -// --------------------------------------------------------------------------- - describe("processTimelineResponse", () => { it("returns error path when payload.error is set", () => { const result = processTimelineResponse({ @@ -93,35 +92,10 @@ describe("processTimelineResponse", () => { expect(result.tail).toBe(baseTimelineInput.currentTail); expect(result.head).toBe(baseTimelineInput.currentHead); expect(result.cursorChanged).toBe(false); - expect(result.sideEffects).toEqual([]); }); - it("returns error with no init resolution when no deferred exists", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, - isInitializing: true, - hasActiveInitDeferred: false, - payload: { - ...baseTimelineInput.payload, - error: "timeout", - }, - }); - - expect(result.error).toBe("timeout"); - expect(result.initResolution).toBe(null); - expect(result.clearInitializing).toBe(true); - }); - - it("replaces tail and clears head when reset=true", () => { - const existingTail: StreamItem[] = [ - { - kind: "user_message", - id: "old", - text: "old message", - timestamp: new Date(500), - }, - ]; - const existingHead: StreamItem[] = [ + it("replaces tail during bootstrap tail init and schedules committed catch-up", () => { + const provisionalHead: StreamItem[] = [ { kind: "assistant_message", id: "head-1", @@ -132,518 +106,328 @@ describe("processTimelineResponse", () => { const result = processTimelineResponse({ ...baseTimelineInput, - currentTail: existingTail, - currentHead: existingHead, - payload: { - ...baseTimelineInput.payload, - reset: true, - startCursor: { seq: 1 }, - endCursor: { seq: 3 }, - entries: [ - makeTimelineEntry(1, "first"), - makeTimelineEntry(2, "second"), - makeTimelineEntry(3, "third"), - ], - }, - }); - - expect(result.tail).not.toBe(existingTail); - expect(result.tail.length).toBeGreaterThan(0); - expect(result.head).toEqual([]); - 
expect(result.cursorChanged).toBe(true); - expect(result.cursor).toEqual({ - epoch: "epoch-1", - startSeq: 1, - endSeq: 3, - }); - expect(result.error).toBe(null); - expect(result.sideEffects.some((e) => e.type === "flush_pending_updates")).toBe(true); - }); - - it("sets cursor to null when reset=true but no cursors in payload", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, - currentCursor: { epoch: "epoch-1", startSeq: 1, endSeq: 5 }, - payload: { - ...baseTimelineInput.payload, - reset: true, - entries: [], - }, - }); - - expect(result.cursor).toBe(null); - expect(result.cursorChanged).toBe(true); - }); - - it("performs bootstrap tail init with catch-up side effect", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, + currentHead: provisionalHead, isInitializing: true, hasActiveInitDeferred: true, initRequestDirection: "tail", payload: { ...baseTimelineInput.payload, direction: "tail", - epoch: "epoch-1", - startCursor: { seq: 1 }, - endCursor: { seq: 5 }, + startSeq: 1, + endSeq: 5, entries: [makeTimelineEntry(1, "first"), makeTimelineEntry(5, "last")], }, }); - // Bootstrap tail replaces expect(result.tail.length).toBeGreaterThan(0); expect(result.head).toEqual([]); expect(result.cursorChanged).toBe(true); expect(result.cursor).toEqual({ - epoch: "epoch-1", startSeq: 1, endSeq: 5, }); - // Should have catch-up side effect - const catchUp = result.sideEffects.find((e) => e.type === "catch_up"); - expect(catchUp).toBeDefined(); - expect(catchUp!.type === "catch_up" && catchUp!.cursor).toEqual({ - epoch: "epoch-1", - endSeq: 5, + const catchUp = result.sideEffects.find((effect) => effect.type === "catch_up"); + expect(catchUp).toEqual({ + type: "catch_up", + cursor: { endSeq: 5 }, }); }); - it("appends incrementally for contiguous seqs", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 3, - }; + it("prepends older committed history for before pagination", () => { + 
const currentTail: StreamItem[] = [ + { + kind: "assistant_message", + id: "tail-3", + text: "newer", + timestamp: new Date(3000), + }, + ]; + const currentCursor: TimelineCursor = { startSeq: 3, endSeq: 4 }; const result = processTimelineResponse({ ...baseTimelineInput, - currentCursor: existingCursor, + currentTail, + currentCursor, payload: { ...baseTimelineInput.payload, - epoch: "epoch-1", - entries: [makeTimelineEntry(4, "next-1"), makeTimelineEntry(5, "next-2")], + direction: "before", + startSeq: 1, + endSeq: 2, + entries: [ + makeTimelineEntry(1, "hello", "user_message"), + makeTimelineEntry(2, "older"), + ], }, }); - expect(result.tail.length).toBeGreaterThan(0); expect(result.cursorChanged).toBe(true); expect(result.cursor).toEqual({ - epoch: "epoch-1", startSeq: 1, - endSeq: 5, + endSeq: 4, }); - expect(result.error).toBe(null); + expect(result.tail).toHaveLength(3); + expect(result.tail[0]?.kind).toBe("user_message"); + expect(result.tail[1]?.kind).toBe("assistant_message"); + expect(result.tail[2]).toBe(currentTail[0]); }); - it("detects gap and emits catch-up side effect", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 3, - }; - - const result = processTimelineResponse({ - ...baseTimelineInput, - currentCursor: existingCursor, - payload: { - ...baseTimelineInput.payload, - epoch: "epoch-1", - entries: [makeTimelineEntry(10, "far ahead")], + it("replaces stale provisional assistant UI when fetch-after returns committed row 121", () => { + const currentHead: StreamItem[] = [ + { + kind: "assistant_message", + id: "head-assistant", + text: "partial", + timestamp: new Date(120000), }, - }); - - // Gap should trigger catch-up - const catchUp = result.sideEffects.find((e) => e.type === "catch_up"); - expect(catchUp).toBeDefined(); - expect(catchUp!.type === "catch_up" && catchUp!.cursor).toEqual({ - epoch: "epoch-1", - endSeq: 3, - }); - }); - - it("drops stale entries silently", () => { - const 
existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 8, - }; + ]; + const currentCursor: TimelineCursor = { startSeq: 1, endSeq: 120 }; const result = processTimelineResponse({ ...baseTimelineInput, - currentCursor: existingCursor, + currentHead, + currentCursor, payload: { ...baseTimelineInput.payload, - epoch: "epoch-1", - entries: [makeTimelineEntry(5, "old"), makeTimelineEntry(7, "also old")], + direction: "after", + startSeq: 121, + endSeq: 121, + entries: [makeTimelineEntry(121, "finalized reply")], }, }); - // No new items appended (all dropped as stale) - expect(result.tail).toBe(baseTimelineInput.currentTail); - expect(result.cursorChanged).toBe(false); - }); - - it("drops entries with epoch mismatch", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", + expect(result.head).toEqual([]); + expect(result.cursorChanged).toBe(true); + expect(result.cursor).toEqual({ startSeq: 1, - endSeq: 5, - }; - - const result = processTimelineResponse({ - ...baseTimelineInput, - currentCursor: existingCursor, - payload: { - ...baseTimelineInput.payload, - epoch: "epoch-2", - entries: [makeTimelineEntry(6, "different epoch")], - }, + endSeq: 121, }); - - expect(result.tail).toBe(baseTimelineInput.currentTail); - expect(result.cursorChanged).toBe(false); - }); - - it("resolves init when deferred matches direction", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, - isInitializing: true, - hasActiveInitDeferred: true, - initRequestDirection: "after", - payload: { - ...baseTimelineInput.payload, - direction: "after", - entries: [], - }, + expect(result.tail[result.tail.length - 1]).toMatchObject({ + kind: "assistant_message", + text: "finalized reply", }); - - expect(result.initResolution).toBe("resolve"); - expect(result.clearInitializing).toBe(true); }); - it("does not resolve init when directions differ (before vs after)", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, - 
isInitializing: true, - hasActiveInitDeferred: true, - initRequestDirection: "after", - payload: { - ...baseTimelineInput.payload, - direction: "before", - entries: [], + it("keeps provisional head when reconnect catch-up has no new committed rows yet", () => { + const currentHead: StreamItem[] = [ + { + kind: "assistant_message", + id: "head-assistant", + text: "still streaming", + timestamp: new Date(120000), }, - }); - - // "before" direction doesn't match "after" initRequestDirection, - // and "before" is not a bootstrap tail path, so init should NOT resolve - expect(result.initResolution).toBe(null); - expect(result.clearInitializing).toBe(false); - }); + ]; + const currentCursor: TimelineCursor = { startSeq: 1, endSeq: 120 }; - it("clears initializing even without deferred", () => { const result = processTimelineResponse({ ...baseTimelineInput, - isInitializing: true, - hasActiveInitDeferred: false, + currentHead, + currentCursor, payload: { ...baseTimelineInput.payload, direction: "after", + startSeq: null, + endSeq: null, entries: [], }, }); - expect(result.clearInitializing).toBe(true); - expect(result.initResolution).toBe(null); + expect(result.head).toBe(currentHead); + expect(result.cursorChanged).toBe(false); + expect(result.tail).toBe(baseTimelineInput.currentTail); }); - it("always includes flush_pending_updates side effect on success", () => { - const result = processTimelineResponse({ - ...baseTimelineInput, - payload: { - ...baseTimelineInput.payload, - entries: [], - }, - }); + it("requests catch-up when committed rows arrive with a forward gap", () => { + const currentCursor: TimelineCursor = { startSeq: 1, endSeq: 120 }; - expect(result.sideEffects.some((e) => e.type === "flush_pending_updates")).toBe(true); - }); - - it("initializes cursor when no existing cursor on first entries", () => { const result = processTimelineResponse({ ...baseTimelineInput, - currentCursor: undefined, + currentCursor, payload: { ...baseTimelineInput.payload, - 
epoch: "epoch-1", - entries: [makeTimelineEntry(1, "first"), makeTimelineEntry(2, "second")], + direction: "after", + startSeq: 125, + endSeq: 125, + entries: [makeTimelineEntry(125, "far ahead")], }, }); - expect(result.cursorChanged).toBe(true); - expect(result.cursor).toEqual({ - epoch: "epoch-1", - startSeq: 1, - endSeq: 2, + expect(result.cursorChanged).toBe(false); + expect(result.tail).toBe(baseTimelineInput.currentTail); + expect(result.sideEffects).toContainEqual({ + type: "catch_up", + cursor: { endSeq: 120 }, }); }); }); -// --------------------------------------------------------------------------- -// processAgentStreamEvent -// --------------------------------------------------------------------------- - describe("processAgentStreamEvent", () => { - it("passes through non-timeline events without cursor changes", () => { - const turnEvent: AgentStreamEventPayload = { - type: "turn_completed", - provider: "claude", - }; - + it("treats seq-less timeline events as provisional head updates", () => { const result = processAgentStreamEvent({ ...baseStreamInput, - event: turnEvent, + event: makeTimelineEvent("partial"), seq: undefined, - epoch: undefined, }); + expect(result.changedHead).toBe(true); + expect(result.changedTail).toBe(false); + expect(result.head).toHaveLength(1); + expect(result.head[0]).toMatchObject({ + kind: "assistant_message", + text: "partial", + }); expect(result.cursorChanged).toBe(false); - expect(result.cursor).toBe(null); - expect(result.sideEffects).toEqual([]); }); - it("accepts timeline event with cursor advance", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 4, - }; - - const result = processAgentStreamEvent({ + it("keeps tool call rows anchored in tail across live and committed updates", () => { + const running = processAgentStreamEvent({ ...baseStreamInput, - event: makeTimelineEvent("new chunk"), - seq: 5, - epoch: "epoch-1", - currentCursor: existingCursor, + event: 
makeToolCallEvent("running"), + seq: undefined, }); - - expect(result.cursorChanged).toBe(true); - expect(result.cursor).toEqual({ - epoch: "epoch-1", - startSeq: 1, - endSeq: 5, + expect(running.head).toEqual([]); + expect(running.tail).toHaveLength(1); + expect(running.tail[0]).toMatchObject({ + kind: "tool_call", + payload: { + source: "agent", + data: { + callId: "call-1", + status: "running", + }, + }, }); - expect(result.sideEffects).toEqual([]); - }); - it("detects gap and emits catch-up side effect", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 4, - }; - - const result = processAgentStreamEvent({ + const completedLive = processAgentStreamEvent({ ...baseStreamInput, - event: makeTimelineEvent("far ahead"), - seq: 10, - epoch: "epoch-1", - currentCursor: existingCursor, - }); - - expect(result.cursorChanged).toBe(false); - expect(result.changedTail).toBe(false); - expect(result.changedHead).toBe(false); - - const catchUp = result.sideEffects.find((e) => e.type === "catch_up"); - expect(catchUp).toBeDefined(); - expect(catchUp!.cursor).toEqual({ - epoch: "epoch-1", - endSeq: 4, + event: makeToolCallEvent("completed"), + seq: undefined, + currentHead: running.head, + currentTail: running.tail, + currentCursor: { startSeq: 1, endSeq: 7 }, + }); + expect(completedLive.head).toEqual([]); + expect(completedLive.tail).toHaveLength(1); + expect(completedLive.tail[0]).toMatchObject({ + kind: "tool_call", + payload: { + source: "agent", + data: { + callId: "call-1", + status: "completed", + }, + }, }); - }); - - it("drops stale timeline event", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 8, - }; - const result = processAgentStreamEvent({ + const committed = processAgentStreamEvent({ ...baseStreamInput, - event: makeTimelineEvent("old"), - seq: 5, - epoch: "epoch-1", - currentCursor: existingCursor, + event: makeToolCallEvent("completed"), + seq: 8, + currentHead: 
completedLive.head, + currentTail: completedLive.tail, + currentCursor: { startSeq: 1, endSeq: 7 }, }); - expect(result.cursorChanged).toBe(false); - expect(result.changedTail).toBe(false); - expect(result.changedHead).toBe(false); - expect(result.sideEffects).toEqual([]); - }); - - it("drops timeline event with epoch mismatch", () => { - const existingCursor: TimelineCursor = { - epoch: "epoch-1", - startSeq: 1, - endSeq: 5, - }; - - const result = processAgentStreamEvent({ - ...baseStreamInput, - event: makeTimelineEvent("wrong epoch"), - seq: 6, - epoch: "epoch-2", - currentCursor: existingCursor, + expect(committed.head).toEqual([]); + expect(committed.tail).toHaveLength(1); + expect(committed.tail[0]).toMatchObject({ + kind: "tool_call", + payload: { + source: "agent", + data: { + callId: "call-1", + status: "completed", + }, + }, }); - - expect(result.cursorChanged).toBe(false); - expect(result.changedTail).toBe(false); - expect(result.changedHead).toBe(false); - expect(result.sideEffects).toEqual([]); }); - it("initializes cursor when none exists", () => { - const result = processAgentStreamEvent({ + it("preserves assistant/tool interleaving while a turn is streaming", () => { + const assistantBeforeTool = processAgentStreamEvent({ ...baseStreamInput, - event: makeTimelineEvent("first"), - seq: 1, - epoch: "epoch-1", - currentCursor: undefined, - }); - - expect(result.cursorChanged).toBe(true); - expect(result.cursor).toEqual({ - epoch: "epoch-1", - startSeq: 1, - endSeq: 1, + event: makeTimelineEvent("before"), + seq: undefined, }); - }); - - it("derives optimistic idle status on turn_completed for running agent", () => { - const turnCompletedEvent: AgentStreamEventPayload = { - type: "turn_completed", - provider: "claude", - }; - const result = processAgentStreamEvent({ + const runningTool = processAgentStreamEvent({ ...baseStreamInput, - event: turnCompletedEvent, - currentAgent: { - status: "running", - updatedAt: new Date(1000), - lastActivityAt: new 
Date(1000), - }, - timestamp: new Date(2000), + event: makeToolCallEvent("running"), + seq: undefined, + currentHead: assistantBeforeTool.head, + currentTail: assistantBeforeTool.tail, }); - expect(result.agentChanged).toBe(true); - expect(result.agent).not.toBe(null); - expect(result.agent!.status).toBe("idle"); - expect(result.agent!.updatedAt.getTime()).toBe(2000); - expect(result.agent!.lastActivityAt.getTime()).toBe(2000); - }); - - it("derives optimistic error status on turn_failed for running agent", () => { - const turnFailedEvent: AgentStreamEventPayload = { - type: "turn_failed", - provider: "claude", - error: "something broke", - }; - - const result = processAgentStreamEvent({ + const assistantAfterTool = processAgentStreamEvent({ ...baseStreamInput, - event: turnFailedEvent, - currentAgent: { - status: "running", - updatedAt: new Date(1000), - lastActivityAt: new Date(1000), - }, - timestamp: new Date(2000), + event: makeTimelineEvent("after"), + seq: undefined, + currentHead: runningTool.head, + currentTail: runningTool.tail, }); - expect(result.agentChanged).toBe(true); - expect(result.agent!.status).toBe("error"); - }); - - it("does not change agent when status is not running", () => { - const turnCompletedEvent: AgentStreamEventPayload = { - type: "turn_completed", - provider: "claude", - }; - - const result = processAgentStreamEvent({ + const completedTool = processAgentStreamEvent({ ...baseStreamInput, - event: turnCompletedEvent, - currentAgent: { - status: "idle", - updatedAt: new Date(1000), - lastActivityAt: new Date(1000), - }, - timestamp: new Date(2000), - }); - - expect(result.agentChanged).toBe(false); - expect(result.agent).toBe(null); + event: makeToolCallEvent("completed"), + seq: undefined, + currentHead: assistantAfterTool.head, + currentTail: assistantAfterTool.tail, + }); + + expect(completedTool.head).toEqual([]); + expect(completedTool.tail.map((item) => item.kind)).toEqual([ + "assistant_message", + "tool_call", + 
"assistant_message", + ]); + expect( + completedTool.tail[0]?.kind === "assistant_message" ? completedTool.tail[0].text : null, + ).toBe("before"); + expect( + completedTool.tail[2]?.kind === "assistant_message" ? completedTool.tail[2].text : null, + ).toBe("after"); }); - it("does not change agent when no agent is provided", () => { - const turnCompletedEvent: AgentStreamEventPayload = { - type: "turn_completed", - provider: "claude", - }; - + it("requests catch-up when a committed live row skips ahead", () => { const result = processAgentStreamEvent({ ...baseStreamInput, - event: turnCompletedEvent, - currentAgent: null, - timestamp: new Date(2000), + event: makeTimelineEvent("far ahead"), + seq: 125, + currentCursor: { startSeq: 1, endSeq: 120 }, }); - expect(result.agentChanged).toBe(false); - expect(result.agent).toBe(null); + expect(result.changedTail).toBe(false); + expect(result.changedHead).toBe(false); + expect(result.cursorChanged).toBe(false); + expect(result.sideEffects).toContainEqual({ + type: "catch_up", + cursor: { endSeq: 120 }, + }); }); - it("preserves updatedAt when agent timestamp is newer than event", () => { - const turnCompletedEvent: AgentStreamEventPayload = { - type: "turn_completed", - provider: "claude", - }; - - const result = processAgentStreamEvent({ + it("flushes provisional head into tail on terminal turn events", () => { + const withHead = processAgentStreamEvent({ ...baseStreamInput, - event: turnCompletedEvent, - currentAgent: { - status: "running", - updatedAt: new Date(5000), - lastActivityAt: new Date(5000), - }, - timestamp: new Date(2000), + event: makeTimelineEvent("streaming"), + seq: undefined, }); - expect(result.agentChanged).toBe(true); - expect(result.agent!.updatedAt.getTime()).toBe(5000); - expect(result.agent!.lastActivityAt.getTime()).toBe(5000); - }); - - it("does not produce agent patch for non-terminal events", () => { const result = processAgentStreamEvent({ ...baseStreamInput, - event: 
makeTimelineEvent("just text"), - currentAgent: { - status: "running", - updatedAt: new Date(1000), - lastActivityAt: new Date(1000), + event: { + type: "turn_completed", + provider: "claude", }, - seq: 1, - epoch: "epoch-1", - timestamp: new Date(2000), + currentHead: withHead.head, + currentTail: withHead.tail, }); - expect(result.agentChanged).toBe(false); - expect(result.agent).toBe(null); + expect(result.changedHead).toBe(true); + expect(result.changedTail).toBe(true); + expect(result.head).toEqual([]); + expect(result.tail).toHaveLength(1); + expect(result.tail[0]).toMatchObject({ + kind: "assistant_message", + text: "streaming", + }); }); }); diff --git a/packages/app/src/contexts/session-stream-reducers.ts b/packages/app/src/contexts/session-stream-reducers.ts index 6623fba94..1c2ac4a4d 100644 --- a/packages/app/src/contexts/session-stream-reducers.ts +++ b/packages/app/src/contexts/session-stream-reducers.ts @@ -12,38 +12,25 @@ import { } from "@/contexts/session-timeline-bootstrap-policy"; import { deriveOptimisticLifecycleStatus } from "@/contexts/session-stream-lifecycle"; -// --------------------------------------------------------------------------- -// Shared cursor type -// --------------------------------------------------------------------------- - export type TimelineCursor = { - epoch: string; startSeq: number; endSeq: number; }; -// --------------------------------------------------------------------------- -// Side-effect discriminated unions -// --------------------------------------------------------------------------- - export type TimelineReducerSideEffect = - | { type: "catch_up"; cursor: { epoch: string; endSeq: number } } + | { type: "catch_up"; cursor: { endSeq: number } } | { type: "flush_pending_updates" }; export type AgentStreamReducerSideEffect = { type: "catch_up"; - cursor: { epoch: string; endSeq: number }; + cursor: { endSeq: number }; }; -// --------------------------------------------------------------------------- -// 
processTimelineResponse
-// ---------------------------------------------------------------------------
-
 type TimelineDirection = "tail" | "before" | "after";
 type InitRequestDirection = "tail" | "after";
 
 type TimelineResponseEntry = {
-  seqStart: number;
+  seq: number;
   provider: string;
   item: Record<string, unknown>;
   timestamp: string;
@@ -53,10 +40,8 @@ export interface ProcessTimelineResponseInput {
   payload: {
     agentId: string;
     direction: TimelineDirection;
-    reset: boolean;
-    epoch: string;
-    startCursor: { seq: number } | null;
-    endCursor: { seq: number } | null;
+    startSeq: number | null;
+    endSeq: number | null;
     entries: TimelineResponseEntry[];
     error: string | null;
   };
@@ -79,6 +64,72 @@ export interface ProcessTimelineResponseOutput {
   sideEffects: TimelineReducerSideEffect[];
 }
 
+export interface ProcessAgentStreamEventInput {
+  event: AgentStreamEventPayload;
+  seq: number | undefined;
+  currentTail: StreamItem[];
+  currentHead: StreamItem[];
+  currentCursor: TimelineCursor | undefined;
+  currentAgent: {
+    status: AgentLifecycleStatus;
+    updatedAt: Date;
+    lastActivityAt: Date;
+  } | null;
+  timestamp: Date;
+}
+
+export interface AgentPatch {
+  status: AgentLifecycleStatus;
+  updatedAt: Date;
+  lastActivityAt: Date;
+}
+
+export interface ProcessAgentStreamEventOutput {
+  tail: StreamItem[];
+  head: StreamItem[];
+  changedTail: boolean;
+  changedHead: boolean;
+  cursor: TimelineCursor | null;
+  cursorChanged: boolean;
+  agent: AgentPatch | null;
+  agentChanged: boolean;
+  sideEffects: AgentStreamReducerSideEffect[];
+}
+
+function cursorsEqual(
+  left: TimelineCursor | null | undefined,
+  right: TimelineCursor | null | undefined,
+): boolean {
+  if (!left || !right) {
+    return left === right;
+  }
+  return left.startSeq === right.startSeq && left.endSeq === right.endSeq;
+}
+
+function removeSupersededProvisionalItems(
+  head: StreamItem[],
+  event: AgentStreamEventPayload,
+): StreamItem[] {
+  if (head.length === 0 || event.type !== "timeline") {
+    return head;
+  }
+
let nextHead = head; + if (event.item.type === "assistant_message") { + nextHead = head.filter((item) => item.kind !== "assistant_message"); + } else if (event.item.type === "tool_call") { + const committedToolCall = event.item; + nextHead = head.filter( + (item) => + item.kind !== "tool_call" || + item.payload.source !== "agent" || + item.payload.data.callId !== committedToolCall.callId, + ); + } + + return nextHead.length === head.length ? head : nextHead; +} + export function processTimelineResponse( input: ProcessTimelineResponseInput, ): ProcessTimelineResponseOutput { @@ -92,9 +143,6 @@ export function processTimelineResponse( initRequestDirection, } = input; - // ------------------------------------------------------------------ - // Error path: reject init and leave stream state unchanged - // ------------------------------------------------------------------ if (payload.error) { return { tail: currentTail, @@ -108,11 +156,8 @@ export function processTimelineResponse( }; } - // ------------------------------------------------------------------ - // Convert entries to timeline units - // ------------------------------------------------------------------ const timelineUnits = payload.entries.map((entry) => ({ - seq: entry.seqStart, + seq: entry.seq, event: { type: "timeline", provider: entry.provider, @@ -121,23 +166,12 @@ export function processTimelineResponse( timestamp: new Date(entry.timestamp), })); - const toHydratedEvents = ( - units: typeof timelineUnits, - ): Array<{ event: AgentStreamEventPayload; timestamp: Date }> => - units.map(({ event, timestamp }) => ({ event, timestamp })); - - // ------------------------------------------------------------------ - // Derive bootstrap policy (replace vs incremental) - // ------------------------------------------------------------------ const bootstrapPolicy = deriveBootstrapTailTimelinePolicy({ direction: payload.direction, - reset: payload.reset, - epoch: payload.epoch, - endCursor: payload.endCursor, + 
endSeq: payload.endSeq, isInitializing, hasActiveInitDeferred, }); - const replace = bootstrapPolicy.replace; let nextTail = currentTail; let nextHead = currentHead; @@ -145,26 +179,20 @@ export function processTimelineResponse( let cursorChanged = false; const sideEffects: TimelineReducerSideEffect[] = []; - if (replace) { - // ---------------------------------------------------------------- - // Replace path: full hydration from scratch - // ---------------------------------------------------------------- - nextTail = hydrateStreamState(toHydratedEvents(timelineUnits), { - source: "canonical", - }); + if (bootstrapPolicy.replace) { + nextTail = hydrateStreamState( + timelineUnits.map(({ event, timestamp }) => ({ event, timestamp })), + { source: "canonical" }, + ); nextHead = []; - - if (payload.startCursor && payload.endCursor) { - nextCursor = { - epoch: payload.epoch, - startSeq: payload.startCursor.seq, - endSeq: payload.endCursor.seq, - }; - cursorChanged = true; - } else { - nextCursor = null; - cursorChanged = true; - } + nextCursor = + typeof payload.startSeq === "number" && typeof payload.endSeq === "number" + ? { + startSeq: payload.startSeq, + endSeq: payload.endSeq, + } + : null; + cursorChanged = !cursorsEqual(currentCursor, nextCursor); if (bootstrapPolicy.catchUpCursor) { sideEffects.push({ @@ -172,45 +200,46 @@ export function processTimelineResponse( cursor: bootstrapPolicy.catchUpCursor, }); } + } else if (payload.direction === "before") { + const prepended = hydrateStreamState( + timelineUnits.map(({ event, timestamp }) => ({ event, timestamp })), + { source: "canonical" }, + ); + nextTail = prepended.length > 0 ? [...prepended, ...currentTail] : currentTail; + const derivedCursor = + typeof payload.startSeq === "number" + ? { + startSeq: payload.startSeq, + endSeq: currentCursor?.endSeq ?? payload.endSeq ?? 
payload.startSeq, + } + : currentCursor; + nextCursor = derivedCursor; + cursorChanged = !cursorsEqual(currentCursor, derivedCursor); } else if (timelineUnits.length > 0) { - // ---------------------------------------------------------------- - // Incremental append path - // ---------------------------------------------------------------- const acceptedUnits: typeof timelineUnits = []; let cursor = currentCursor; - let gapCursor: { epoch: string; endSeq: number } | null = null; + let gapCursor: { endSeq: number } | null = null; for (const unit of timelineUnits) { const decision: SessionTimelineSeqDecision = classifySessionTimelineSeq({ - cursor: cursor ? { epoch: cursor.epoch, endSeq: cursor.endSeq } : null, - epoch: payload.epoch, + cursor: cursor ? { endSeq: cursor.endSeq } : null, seq: unit.seq, }); if (decision === "gap") { - gapCursor = cursor ? { epoch: cursor.epoch, endSeq: cursor.endSeq } : null; + gapCursor = cursor ? { endSeq: cursor.endSeq } : null; break; } - if (decision === "drop_stale" || decision === "drop_epoch") { + if (decision === "drop_stale") { continue; } acceptedUnits.push(unit); - if (decision === "init") { - cursor = { - epoch: payload.epoch, - startSeq: unit.seq, - endSeq: unit.seq, - }; - continue; - } - if (!cursor) { - continue; - } - cursor = { - ...cursor, - endSeq: unit.seq, - }; + cursor = + decision === "init" + ? { startSeq: unit.seq, endSeq: unit.seq } + : { ...(cursor ?? 
{ startSeq: unit.seq, endSeq: unit.seq }), endSeq: unit.seq }; + nextHead = removeSupersededProvisionalItems(nextHead, unit.event); } if (acceptedUnits.length > 0) { @@ -223,13 +252,7 @@ export function processTimelineResponse( ); } - if ( - cursor && - (!currentCursor || - currentCursor.epoch !== cursor.epoch || - currentCursor.startSeq !== cursor.startSeq || - currentCursor.endSeq !== cursor.endSeq) - ) { + if (cursor && !cursorsEqual(currentCursor, cursor)) { nextCursor = cursor; cursorChanged = true; } @@ -239,125 +262,63 @@ export function processTimelineResponse( } } - // ------------------------------------------------------------------ - // Flush pending agent updates side effect - // ------------------------------------------------------------------ sideEffects.push({ type: "flush_pending_updates" }); - // ------------------------------------------------------------------ - // Init resolution - // ------------------------------------------------------------------ const shouldResolveDeferredInit = shouldResolveTimelineInit({ hasActiveInitDeferred, isInitializing, initRequestDirection, responseDirection: payload.direction, - reset: payload.reset, }); const clearInitializing = shouldResolveDeferredInit || (isInitializing && !hasActiveInitDeferred); - const initResolution: "resolve" | "reject" | null = shouldResolveDeferredInit ? "resolve" : null; - return { tail: nextTail, head: nextHead, cursor: nextCursor, cursorChanged, - initResolution, + initResolution: shouldResolveDeferredInit ? 
"resolve" : null, clearInitializing, error: null, sideEffects, }; } -// --------------------------------------------------------------------------- -// processAgentStreamEvent -// --------------------------------------------------------------------------- - -export interface ProcessAgentStreamEventInput { - event: AgentStreamEventPayload; - seq: number | undefined; - epoch: string | undefined; - currentTail: StreamItem[]; - currentHead: StreamItem[]; - currentCursor: TimelineCursor | undefined; - currentAgent: { - status: AgentLifecycleStatus; - updatedAt: Date; - lastActivityAt: Date; - } | null; - timestamp: Date; -} - -export interface AgentPatch { - status: AgentLifecycleStatus; - updatedAt: Date; - lastActivityAt: Date; -} - -export interface ProcessAgentStreamEventOutput { - tail: StreamItem[]; - head: StreamItem[]; - changedTail: boolean; - changedHead: boolean; - cursor: TimelineCursor | null; - cursorChanged: boolean; - agent: AgentPatch | null; - agentChanged: boolean; - sideEffects: AgentStreamReducerSideEffect[]; -} - export function processAgentStreamEvent( input: ProcessAgentStreamEventInput, ): ProcessAgentStreamEventOutput { - const { event, seq, epoch, currentTail, currentHead, currentCursor, currentAgent, timestamp } = - input; + const { event, seq, currentTail, currentHead, currentCursor, currentAgent, timestamp } = input; let shouldApplyStreamEvent = true; let nextTimelineCursor: TimelineCursor | null = null; let cursorChanged = false; const sideEffects: AgentStreamReducerSideEffect[] = []; - // ------------------------------------------------------------------ - // Timeline sequencing gate - // ------------------------------------------------------------------ - if (event.type === "timeline" && typeof seq === "number" && typeof epoch === "string") { + if (event.type === "timeline" && typeof seq === "number") { const decision = classifySessionTimelineSeq({ - cursor: currentCursor ? 
{ epoch: currentCursor.epoch, endSeq: currentCursor.endSeq } : null, - epoch, + cursor: currentCursor ? { endSeq: currentCursor.endSeq } : null, seq, }); - if (decision === "init") { - nextTimelineCursor = { epoch, startSeq: seq, endSeq: seq }; - cursorChanged = true; - } else if (decision === "accept") { - nextTimelineCursor = { - ...(currentCursor ?? { epoch, startSeq: seq, endSeq: seq }), - epoch, - endSeq: seq, - }; - cursorChanged = true; - } else if (decision === "gap") { + if (decision === "gap") { shouldApplyStreamEvent = false; if (currentCursor) { sideEffects.push({ type: "catch_up", - cursor: { - epoch: currentCursor.epoch, - endSeq: currentCursor.endSeq, - }, + cursor: { endSeq: currentCursor.endSeq }, }); } - } else { - // drop_stale or drop_epoch + } else if (decision === "drop_stale") { shouldApplyStreamEvent = false; + } else { + nextTimelineCursor = + decision === "init" + ? { startSeq: seq, endSeq: seq } + : { ...(currentCursor ?? { startSeq: seq, endSeq: seq }), endSeq: seq }; + cursorChanged = !cursorsEqual(currentCursor, nextTimelineCursor); } } - // ------------------------------------------------------------------ - // Apply stream event to tail/head - // ------------------------------------------------------------------ const { tail, head, changedTail, changedHead } = shouldApplyStreamEvent ? 
applyStreamEvent({ tail: currentTail, @@ -373,9 +334,6 @@ export function processAgentStreamEvent( changedHead: false, }; - // ------------------------------------------------------------------ - // Optimistic lifecycle status - // ------------------------------------------------------------------ let agentPatch: AgentPatch | null = null; let agentChanged = false; diff --git a/packages/app/src/contexts/session-timeline-bootstrap-policy.test.ts b/packages/app/src/contexts/session-timeline-bootstrap-policy.test.ts index e233b8742..cc8b98ce0 100644 --- a/packages/app/src/contexts/session-timeline-bootstrap-policy.test.ts +++ b/packages/app/src/contexts/session-timeline-bootstrap-policy.test.ts @@ -2,26 +2,42 @@ import { describe, expect, it } from "vitest"; import { classifySessionTimelineSeq } from "./session-timeline-seq-gate"; import { deriveBootstrapTailTimelinePolicy, + deriveInitialTimelineRequest, shouldResolveTimelineInit, } from "./session-timeline-bootstrap-policy"; -describe("deriveBootstrapTailTimelinePolicy", () => { - it("always replaces on explicit reset without catch-up cursor", () => { - const policy = deriveBootstrapTailTimelinePolicy({ - direction: "after", - reset: true, - epoch: "epoch-1", - endCursor: { seq: 200 }, - isInitializing: false, - hasActiveInitDeferred: false, +describe("deriveInitialTimelineRequest", () => { + it("uses tail bootstrap when history has not synced yet", () => { + expect( + deriveInitialTimelineRequest({ + cursor: { seq: 42 }, + hasAuthoritativeHistory: false, + initialTimelineLimit: 200, + }), + ).toEqual({ + direction: "tail", + limit: 200, }); + }); - expect(policy.replace).toBe(true); - expect(policy.catchUpCursor).toBeNull(); + it("uses catch-up after the committed cursor once history is synced", () => { + expect( + deriveInitialTimelineRequest({ + cursor: { seq: 42 }, + hasAuthoritativeHistory: true, + initialTimelineLimit: 200, + }), + ).toEqual({ + direction: "after", + cursor: { seq: 42 }, + limit: 0, + }); }); 
+}); +describe("deriveBootstrapTailTimelinePolicy", () => { it("forces baseline replace and canonical catch-up for init tail race", () => { - const advancedCursor = { epoch: "epoch-1", endSeq: 205 }; + const advancedCursor = { endSeq: 205 }; const tailSeqStart = 101; const tailSeqEnd = 200; @@ -29,7 +45,6 @@ describe("deriveBootstrapTailTimelinePolicy", () => { for (let seq = tailSeqStart; seq <= tailSeqEnd; seq += 1) { const decision = classifySessionTimelineSeq({ cursor: advancedCursor, - epoch: "epoch-1", seq, }); if (decision === "accept" || decision === "init") { @@ -40,16 +55,13 @@ describe("deriveBootstrapTailTimelinePolicy", () => { const policy = deriveBootstrapTailTimelinePolicy({ direction: "tail", - reset: false, - epoch: "epoch-1", - endCursor: { seq: 200 }, + endSeq: 200, isInitializing: true, hasActiveInitDeferred: true, }); expect(policy.replace).toBe(true); expect(policy.catchUpCursor).toEqual({ - epoch: "epoch-1", endSeq: 200, }); }); @@ -57,9 +69,7 @@ describe("deriveBootstrapTailTimelinePolicy", () => { it("does not replace non-bootstrap, non-reset responses", () => { const policy = deriveBootstrapTailTimelinePolicy({ direction: "tail", - reset: false, - epoch: "epoch-1", - endCursor: { seq: 200 }, + endSeq: 200, isInitializing: false, hasActiveInitDeferred: false, }); @@ -77,7 +87,6 @@ describe("shouldResolveTimelineInit", () => { isInitializing: true, initRequestDirection: "tail", responseDirection: "tail", - reset: false, }), ).toBe(true); }); @@ -89,7 +98,6 @@ describe("shouldResolveTimelineInit", () => { isInitializing: true, initRequestDirection: "tail", responseDirection: "after", - reset: false, }), ).toBe(false); }); @@ -101,7 +109,6 @@ describe("shouldResolveTimelineInit", () => { isInitializing: true, initRequestDirection: "after", responseDirection: "after", - reset: false, }), ).toBe(true); }); diff --git a/packages/app/src/contexts/session-timeline-bootstrap-policy.ts b/packages/app/src/contexts/session-timeline-bootstrap-policy.ts 
index 706ab5eee..c7ab9ded8 100644 --- a/packages/app/src/contexts/session-timeline-bootstrap-policy.ts +++ b/packages/app/src/contexts/session-timeline-bootstrap-policy.ts @@ -6,7 +6,6 @@ type BootstrapTailCursor = { } | null; type InitialTimelineCursor = { - epoch: string; seq: number; } | null; @@ -20,48 +19,37 @@ export function deriveInitialTimelineRequest({ initialTimelineLimit: number; }): { direction: "tail" | "after"; - cursor?: { epoch: string; seq: number }; + cursor?: { seq: number }; limit: number; - projection: "canonical"; } { if (!hasAuthoritativeHistory || !cursor) { return { direction: "tail", limit: initialTimelineLimit, - projection: "canonical", }; } return { direction: "after", - cursor: { epoch: cursor.epoch, seq: cursor.seq }, + cursor: { seq: cursor.seq }, limit: 0, - projection: "canonical", }; } export function deriveBootstrapTailTimelinePolicy({ direction, - reset, - epoch, - endCursor, + endSeq, isInitializing, hasActiveInitDeferred, }: { direction: TimelineDirection; - reset: boolean; - epoch: string; - endCursor: BootstrapTailCursor; + endSeq: number | null; isInitializing: boolean; hasActiveInitDeferred: boolean; }): { replace: boolean; - catchUpCursor: { epoch: string; endSeq: number } | null; + catchUpCursor: { endSeq: number } | null; } { - if (reset) { - return { replace: true, catchUpCursor: null }; - } - const isBootstrapTailInit = direction === "tail" && isInitializing && hasActiveInitDeferred; if (!isBootstrapTailInit) { return { replace: false, catchUpCursor: null }; @@ -69,7 +57,7 @@ export function deriveBootstrapTailTimelinePolicy({ return { replace: true, - catchUpCursor: endCursor ? { epoch, endSeq: endCursor.seq } : null, + catchUpCursor: typeof endSeq === "number" ? 
{ endSeq } : null, }; } @@ -78,19 +66,14 @@ export function shouldResolveTimelineInit({ isInitializing, initRequestDirection, responseDirection, - reset, }: { hasActiveInitDeferred: boolean; isInitializing: boolean; initRequestDirection: InitRequestDirection; responseDirection: TimelineDirection; - reset: boolean; }): boolean { if (!hasActiveInitDeferred || !isInitializing) { return false; } - if (reset) { - return true; - } return responseDirection === initRequestDirection; } diff --git a/packages/app/src/contexts/session-timeline-seq-gate.test.ts b/packages/app/src/contexts/session-timeline-seq-gate.test.ts index d18af910a..3f827dce8 100644 --- a/packages/app/src/contexts/session-timeline-seq-gate.test.ts +++ b/packages/app/src/contexts/session-timeline-seq-gate.test.ts @@ -5,8 +5,7 @@ describe("classifySessionTimelineSeq", () => { it("accepts contiguous forward seq", () => { expect( classifySessionTimelineSeq({ - cursor: { epoch: "epoch-1", endSeq: 4 }, - epoch: "epoch-1", + cursor: { endSeq: 4 }, seq: 5, }), ).toBe("accept"); @@ -15,8 +14,7 @@ describe("classifySessionTimelineSeq", () => { it("drops stale seq older than the current end", () => { expect( classifySessionTimelineSeq({ - cursor: { epoch: "epoch-1", endSeq: 8 }, - epoch: "epoch-1", + cursor: { endSeq: 8 }, seq: 7, }), ).toBe("drop_stale"); @@ -25,28 +23,16 @@ describe("classifySessionTimelineSeq", () => { it("drops duplicate replay seq equal to the current end", () => { expect( classifySessionTimelineSeq({ - cursor: { epoch: "epoch-1", endSeq: 8 }, - epoch: "epoch-1", + cursor: { endSeq: 8 }, seq: 8, }), ).toBe("drop_stale"); }); - it("drops epoch mismatch", () => { - expect( - classifySessionTimelineSeq({ - cursor: { epoch: "epoch-1", endSeq: 4 }, - epoch: "epoch-2", - seq: 5, - }), - ).toBe("drop_epoch"); - }); - it("initializes when cursor is null", () => { expect( classifySessionTimelineSeq({ cursor: null, - epoch: "epoch-1", seq: 1, }), ).toBe("init"); @@ -55,8 +41,7 @@ 
describe("classifySessionTimelineSeq", () => {
 it("classifies forward gaps", () => {
 expect(
 classifySessionTimelineSeq({
- cursor: { epoch: "epoch-1", endSeq: 4 },
- epoch: "epoch-1",
+ cursor: { endSeq: 4 },
 seq: 9,
 }),
 ).toBe("gap");
diff --git a/packages/app/src/contexts/session-timeline-seq-gate.ts b/packages/app/src/contexts/session-timeline-seq-gate.ts
index e7e35b567..1fe15597e 100644
--- a/packages/app/src/contexts/session-timeline-seq-gate.ts
+++ b/packages/app/src/contexts/session-timeline-seq-gate.ts
@@ -1,28 +1,22 @@
 export type SessionTimelineSeqCursor =
 | {
- epoch: string;
 endSeq: number;
 }
 | null
 | undefined;

-export type SessionTimelineSeqDecision = "accept" | "drop_stale" | "drop_epoch" | "gap" | "init";
+export type SessionTimelineSeqDecision = "accept" | "drop_stale" | "gap" | "init";

 export function classifySessionTimelineSeq({
 cursor,
- epoch,
 seq,
}: {
 cursor: SessionTimelineSeqCursor;
- epoch: string;
 seq: number;
}): SessionTimelineSeqDecision {
 if (!cursor) {
 return "init";
 }
- if (cursor.epoch !== epoch) {
- return "drop_epoch";
- }
 if (seq <= cursor.endSeq) {
 return "drop_stale";
 }
diff --git a/packages/app/src/contexts/session-workspace-services.ts b/packages/app/src/contexts/session-workspace-services.ts
new file mode 100644
index 000000000..9536a81f8
--- /dev/null
+++ b/packages/app/src/contexts/session-workspace-services.ts
@@ -0,0 +1,28 @@
+import type { ServiceStatusUpdateMessage } from "@server/shared/messages";
+import type { WorkspaceDescriptor } from "@/stores/session-store";
+import { resolveWorkspaceMapKeyByIdentity } from "@/utils/workspace-execution";
+
+export function patchWorkspaceServices(
+ workspaces: Map<string, WorkspaceDescriptor>,
+ update: ServiceStatusUpdateMessage["payload"],
+): Map<string, WorkspaceDescriptor> {
+ const workspaceKey = resolveWorkspaceMapKeyByIdentity({
+ workspaces,
+ workspaceIdentity: update.workspaceId,
+ });
+ if (!workspaceKey) {
+ return workspaces;
+ }
+
+ const existing = workspaces.get(workspaceKey);
+ if (!existing) {
+ return workspaces;
+ } + + const next = new Map(workspaces); + next.set(workspaceKey, { + ...existing, + services: update.services.map((s) => ({ ...s })), + }); + return next; +} diff --git a/packages/app/src/desktop/permissions/desktop-permissions.test.ts b/packages/app/src/desktop/permissions/desktop-permissions.test.ts index a62162835..5db79afd5 100644 --- a/packages/app/src/desktop/permissions/desktop-permissions.test.ts +++ b/packages/app/src/desktop/permissions/desktop-permissions.test.ts @@ -5,16 +5,33 @@ type MockPlatform = "web" | "ios" | "android"; type GlobalSnapshot = { Notification: unknown; navigatorDescriptor?: PropertyDescriptor; + windowDescriptor?: PropertyDescriptor; paseoDesktop: unknown; }; const originalGlobals: GlobalSnapshot = { Notification: (globalThis as { Notification?: unknown }).Notification, navigatorDescriptor: Object.getOwnPropertyDescriptor(globalThis, "navigator"), + windowDescriptor: Object.getOwnPropertyDescriptor(globalThis, "window"), paseoDesktop: typeof window === "undefined" ? 
undefined : (window as { paseoDesktop?: unknown }).paseoDesktop, }; +function ensureWindow(): { paseoDesktop?: unknown } { + const existingWindow = (globalThis as { window?: { paseoDesktop?: unknown } }).window; + if (existingWindow) { + return existingWindow; + } + + const nextWindow: { paseoDesktop?: unknown } = {}; + Object.defineProperty(globalThis, "window", { + configurable: true, + writable: true, + value: nextWindow, + }); + return nextWindow; +} + function setNavigator(value: unknown): void { Object.defineProperty(globalThis, "navigator", { configurable: true, @@ -32,6 +49,12 @@ function restoreGlobals(): void { delete (globalThis as { navigator?: unknown }).navigator; } + if (originalGlobals.windowDescriptor) { + Object.defineProperty(globalThis, "window", originalGlobals.windowDescriptor); + } else { + delete (globalThis as { window?: unknown }).window; + } + if (typeof window !== "undefined") { (window as { paseoDesktop?: unknown }).paseoDesktop = originalGlobals.paseoDesktop; } @@ -56,7 +79,7 @@ describe("desktop-permissions", () => { expect(shouldShowDesktopPermissionSection()).toBe(false); - (window as { paseoDesktop?: unknown }).paseoDesktop = {}; + ensureWindow().paseoDesktop = {}; expect(shouldShowDesktopPermissionSection()).toBe(true); }); diff --git a/packages/app/src/hooks/use-agent-form-state.test.ts b/packages/app/src/hooks/use-agent-form-state.test.ts index 6cb7f5128..57df59356 100644 --- a/packages/app/src/hooks/use-agent-form-state.test.ts +++ b/packages/app/src/hooks/use-agent-form-state.test.ts @@ -56,7 +56,7 @@ describe("useAgentFormState", () => { }, ]; - it("auto-selects the model's default thinking option when none is configured", () => { + it("does not auto-select a model on fresh drafts without preferences", () => { const resolved = __private__.resolveFormState( undefined, { provider: "codex" }, @@ -80,14 +80,14 @@ describe("useAgentFormState", () => { new Set(), ); - expect(resolved.model).toBe("gpt-5.3-codex"); - 
expect(resolved.thinkingOptionId).toBe("xhigh"); + expect(resolved.model).toBe(""); + expect(resolved.thinkingOptionId).toBe(""); }); - it("prefers provider defaults on fresh drafts", () => { + it("auto-selects the model's default thinking option when model is preferred but thinking is not", () => { const resolved = __private__.resolveFormState( undefined, - { provider: "codex" }, + { provider: "codex", providerPreferences: { codex: { model: "gpt-5.3-codex" } } }, codexModels, { serverId: false, @@ -115,7 +115,7 @@ describe("useAgentFormState", () => { it("falls back to model default when saved thinking preference is invalid", () => { const resolved = __private__.resolveFormState( undefined, - { provider: "codex" }, + { provider: "codex", providerPreferences: { codex: { model: "gpt-5.3-codex" } } }, codexModels, { serverId: false, @@ -195,7 +195,7 @@ describe("useAgentFormState", () => { it("keeps an explicit initial thinking option when it is valid", () => { const resolved = __private__.resolveFormState( - { thinkingOptionId: "low" }, + { model: "gpt-5.3-codex", thinkingOptionId: "low" }, { provider: "codex" }, codexModels, { @@ -221,7 +221,7 @@ describe("useAgentFormState", () => { expect(resolved.thinkingOptionId).toBe("low"); }); - it("leaves thinking unset when the model exposes options without a provider default", () => { + it("falls back to the first thinking option when the model exposes options without a provider default", () => { const claudeModels: AgentModelDefinition[] = [ { provider: "claude", @@ -237,7 +237,7 @@ describe("useAgentFormState", () => { const resolved = __private__.resolveFormState( undefined, - { provider: "claude" }, + { provider: "claude", providerPreferences: { claude: { model: "default" } } }, claudeModels, { serverId: false, @@ -259,7 +259,7 @@ describe("useAgentFormState", () => { ); expect(resolved.model).toBe("default"); - expect(resolved.thinkingOptionId).toBe(""); + expect(resolved.thinkingOptionId).toBe("low"); }); 
it("resolves provider only from allowed provider map", () => { diff --git a/packages/app/src/hooks/use-agent-form-state.ts b/packages/app/src/hooks/use-agent-form-state.ts index 09e0ae2fd..160c1fdc6 100644 --- a/packages/app/src/hooks/use-agent-form-state.ts +++ b/packages/app/src/hooks/use-agent-form-state.ts @@ -66,7 +66,7 @@ type UseAgentFormStateOptions = { onlineServerIds?: string[]; }; -type UseAgentFormStateResult = { +export type UseAgentFormStateResult = { selectedServerId: string | null; setSelectedServerId: (value: string | null) => void; setSelectedServerIdFromUser: (value: string | null) => void; @@ -136,7 +136,7 @@ function resolveEffectiveModel( } const normalizedModelId = modelId.trim(); if (!normalizedModelId) { - return resolveDefaultModel(availableModels); + return null; } return ( availableModels.find((model) => model.id === normalizedModelId) ?? @@ -241,8 +241,6 @@ function resolveFormState( } else { result.model = defaultModelId; } - } else if (defaultModelId) { - result.model = defaultModelId; } else { result.model = ""; } diff --git a/packages/app/src/hooks/use-agent-initialization.test.ts b/packages/app/src/hooks/use-agent-initialization.test.ts index dd32a15d6..43da108e3 100644 --- a/packages/app/src/hooks/use-agent-initialization.test.ts +++ b/packages/app/src/hooks/use-agent-initialization.test.ts @@ -2,11 +2,10 @@ import { describe, expect, it } from "vitest"; import { __private__ } from "./use-agent-initialization"; describe("useAgentInitialization timeline request policy", () => { - it("uses canonical tail bootstrap when history has not synced yet", () => { + it("uses committed tail bootstrap when history has not synced yet", () => { expect( __private__.deriveInitialTimelineRequest({ cursor: { - epoch: "epoch-1", seq: 42, }, hasAuthoritativeHistory: false, @@ -15,11 +14,10 @@ describe("useAgentInitialization timeline request policy", () => { ).toEqual({ direction: "tail", limit: 200, - projection: "canonical", }); }); - it("uses 
canonical tail bootstrap when cursor is missing", () => { + it("uses committed tail bootstrap when cursor is missing", () => { expect( __private__.deriveInitialTimelineRequest({ cursor: null, @@ -29,15 +27,13 @@ describe("useAgentInitialization timeline request policy", () => { ).toEqual({ direction: "tail", limit: 200, - projection: "canonical", }); }); - it("uses canonical catch-up after the current cursor once history is synced", () => { + it("uses committed catch-up after the current cursor once history is synced", () => { expect( __private__.deriveInitialTimelineRequest({ cursor: { - epoch: "epoch-1", seq: 42, }, hasAuthoritativeHistory: true, @@ -45,9 +41,8 @@ describe("useAgentInitialization timeline request policy", () => { }), ).toEqual({ direction: "after", - cursor: { epoch: "epoch-1", seq: 42 }, + cursor: { seq: 42 }, limit: 0, - projection: "canonical", }); }); @@ -61,7 +56,6 @@ describe("useAgentInitialization timeline request policy", () => { ).toEqual({ direction: "tail", limit: 0, - projection: "canonical", }); }); diff --git a/packages/app/src/hooks/use-agent-initialization.ts b/packages/app/src/hooks/use-agent-initialization.ts index 62ffa9a9d..20df64bcd 100644 --- a/packages/app/src/hooks/use-agent-initialization.ts +++ b/packages/app/src/hooks/use-agent-initialization.ts @@ -60,7 +60,7 @@ export function useAgentInitialization({ const hasAuthoritativeHistory = session?.agentAuthoritativeHistoryApplied.get(agentId) === true; const timelineRequest = deriveInitialTimelineRequest({ - cursor: cursor ? { epoch: cursor.epoch, seq: cursor.endSeq } : null, + cursor: cursor ? 
{ seq: cursor.endSeq } : null, hasAuthoritativeHistory, initialTimelineLimit, }); @@ -107,7 +107,6 @@ export function useAgentInitialization({ await client.fetchAgentTimeline(agentId, { direction: "tail", limit: initialTimelineLimit, - projection: "canonical", }); } catch (error) { setAgentInitializing(agentId, false); diff --git a/packages/app/src/hooks/use-agent-input-draft.live.test.tsx b/packages/app/src/hooks/use-agent-input-draft.live.test.tsx new file mode 100644 index 000000000..a0b6d455a --- /dev/null +++ b/packages/app/src/hooks/use-agent-input-draft.live.test.tsx @@ -0,0 +1,270 @@ +import React, { act } from "react"; +import { createRoot, type Root } from "react-dom/client"; +import { JSDOM } from "jsdom"; +import { beforeAll, beforeEach, describe, expect, it, vi } from "vitest"; +import { useDraftStore } from "@/stores/draft-store"; +import type { AttachmentMetadata } from "@/attachments/types"; + +const { asyncStorage } = vi.hoisted(() => ({ + asyncStorage: new Map(), +})); + +vi.mock("@react-native-async-storage/async-storage", () => ({ + default: { + getItem: async (key: string) => asyncStorage.get(key) ?? 
null, + setItem: async (key: string, value: string) => { + asyncStorage.set(key, value); + }, + removeItem: async (key: string) => { + asyncStorage.delete(key); + }, + }, +})); + +vi.mock("@/attachments/service", () => ({ + garbageCollectAttachments: async () => undefined, +})); + +vi.mock("./use-agent-form-state", () => ({ + useAgentFormState: () => ({ + selectedServerId: "host-1", + setSelectedServerId: () => undefined, + setSelectedServerIdFromUser: () => undefined, + selectedProvider: "codex", + setProviderFromUser: () => undefined, + selectedMode: "auto", + setModeFromUser: () => undefined, + selectedModel: "", + setModelFromUser: () => undefined, + selectedThinkingOptionId: "", + setThinkingOptionFromUser: () => undefined, + workingDir: "/repo", + setWorkingDir: () => undefined, + setWorkingDirFromUser: () => undefined, + providerDefinitions: [{ id: "codex", label: "Codex", modes: [{ id: "auto", label: "Auto" }] }], + providerDefinitionMap: new Map(), + agentDefinition: undefined, + modeOptions: [{ id: "auto", label: "Auto" }], + availableModels: [ + { + provider: "codex", + id: "gpt-5.4", + label: "gpt-5.4", + isDefault: true, + defaultThinkingOptionId: "high", + thinkingOptions: [ + { id: "medium", label: "Medium" }, + { id: "high", label: "High", isDefault: true }, + ], + }, + ], + allProviderModels: new Map([ + [ + "codex", + [ + { + provider: "codex", + id: "gpt-5.4", + label: "gpt-5.4", + isDefault: true, + defaultThinkingOptionId: "high", + thinkingOptions: [ + { id: "medium", label: "Medium" }, + { id: "high", label: "High", isDefault: true }, + ], + }, + ], + ], + ]), + isAllModelsLoading: false, + availableThinkingOptions: [ + { id: "medium", label: "Medium" }, + { id: "high", label: "High", isDefault: true }, + ], + isModelLoading: false, + modelError: null, + refreshProviderModels: () => undefined, + setProviderAndModelFromUser: () => undefined, + workingDirIsEmpty: false, + persistFormPreferences: async () => undefined, + }), +})); + +let 
useAgentInputDraft: typeof import("./use-agent-input-draft").useAgentInputDraft;
+
+beforeAll(async () => {
+ const storage = new Map();
+
+ Object.defineProperty(globalThis, "window", {
+ value: {
+ localStorage: {
+ getItem: (key: string) => storage.get(key) ?? null,
+ setItem: (key: string, value: string) => {
+ storage.set(key, value);
+ },
+ removeItem: (key: string) => {
+ storage.delete(key);
+ },
+ },
+ },
+ configurable: true,
+ });
+ Object.defineProperty(globalThis, "IS_REACT_ACT_ENVIRONMENT", {
+ value: true,
+ configurable: true,
+ });
+
+ ({ useAgentInputDraft } = await import("./use-agent-input-draft"));
+});
+
+describe("useAgentInputDraft live contract", () => {
+ beforeEach(() => {
+ asyncStorage.clear();
+ const dom = new JSDOM("<div id='root'></div>
", {
+ url: "http://localhost",
+ });
+
+ Object.defineProperty(globalThis, "document", {
+ value: dom.window.document,
+ configurable: true,
+ });
+ Object.defineProperty(globalThis, "navigator", {
+ value: dom.window.navigator,
+ configurable: true,
+ });
+
+ useDraftStore.setState({ drafts: {}, createModalDraft: null });
+ });
+
+ it("hydrates persisted text and images and returns draft-mode composer state for a caller-provided key", async () => {
+ let latest: ReturnType<typeof useAgentInputDraft> | null = null;
+ const image: AttachmentMetadata = {
+ id: "attachment-1",
+ mimeType: "image/png",
+ storageType: "web-indexeddb",
+ storageKey: "attachments/1",
+ createdAt: 1,
+ fileName: "image.png",
+ byteSize: 128,
+ };
+
+ function getLatest(): ReturnType<typeof useAgentInputDraft> {
+ if (!latest) {
+ throw new Error("Expected hook result");
+ }
+ return latest;
+ }
+
+ function Probe({ draftKey }: { draftKey: string }) {
+ latest = useAgentInputDraft({
+ draftKey,
+ composer: {
+ initialServerId: "host-1",
+ initialValues: { workingDir: "/repo" },
+ isVisible: true,
+ onlineServerIds: ["host-1"],
+ lockedWorkingDir: "/repo",
+ },
+ });
+ return null;
+ }
+
+ const container = document.getElementById("root");
+ if (!container) {
+ throw new Error("Missing root container");
+ }
+
+ let root: Root | null = createRoot(container);
+ await act(async () => {
+ root!.render(<Probe draftKey="draft:main" />);
+ });
+
+ expect(getLatest().composerState?.statusControls.selectedProvider).toBe("codex");
+ expect(getLatest().composerState?.commandDraftConfig).toEqual({
+ provider: "codex",
+ cwd: "/repo",
+ modeId: "auto",
+ model: "gpt-5.4",
+ thinkingOptionId: "high",
+ });
+
+ await act(async () => {
+ getLatest().setText("hello world");
+ getLatest().setImages([image]);
+ });
+
+ await act(async () => {
+ root!.unmount();
+ });
+
+ root = createRoot(container);
+ await act(async () => {
+ root.render(<Probe draftKey="draft:main" />);
+ });
+
+ expect(getLatest().text).toBe("hello world");
+ expect(getLatest().images).toEqual([image]);
+ });
+
+ it("clears drafts with sent and
abandoned lifecycle tombstones", async () => {
+ let latest: ReturnType<typeof useAgentInputDraft> | null = null;
+ const sentImage: AttachmentMetadata = {
+ id: "attachment-sent",
+ mimeType: "image/png",
+ storageType: "web-indexeddb",
+ storageKey: "attachments/sent",
+ createdAt: 2,
+ };
+
+ function getLatest(): ReturnType<typeof useAgentInputDraft> {
+ if (!latest) {
+ throw new Error("Expected hook result");
+ }
+ return latest;
+ }
+
+ function Probe() {
+ latest = useAgentInputDraft({ draftKey: "draft:lifecycle" });
+ return null;
+ }
+
+ const container = document.getElementById("root");
+ if (!container) {
+ throw new Error("Missing root container");
+ }
+
+ const root = createRoot(container);
+ await act(async () => {
+ root.render(<Probe />);
+ });
+
+ await act(async () => {
+ getLatest().setText("queued message");
+ getLatest().setImages([sentImage]);
+ });
+
+ await act(async () => {
+ getLatest().clear("sent");
+ });
+
+ expect(getLatest().text).toBe("");
+ expect(getLatest().images).toEqual([]);
+ expect(useDraftStore.getState().drafts["draft:lifecycle"]).toMatchObject({
+ lifecycle: "sent",
+ input: { text: "", images: [] },
+ });
+
+ await act(async () => {
+ getLatest().setText("draft again");
+ });
+
+ await act(async () => {
+ getLatest().clear("abandoned");
+ });
+
+ expect(useDraftStore.getState().drafts["draft:lifecycle"]).toMatchObject({
+ lifecycle: "abandoned",
+ input: { text: "", images: [] },
+ });
+ });
+});
diff --git a/packages/app/src/hooks/use-agent-input-draft.test.ts b/packages/app/src/hooks/use-agent-input-draft.test.ts
new file mode 100644
index 000000000..b766b0ffd
--- /dev/null
+++ b/packages/app/src/hooks/use-agent-input-draft.test.ts
@@ -0,0 +1,194 @@
+import { beforeAll, describe, expect, it, vi } from "vitest";
+
+vi.mock("@react-native-async-storage/async-storage", () => ({
+ default: {
+ getItem: async () => null,
+ setItem: async () => undefined,
+ removeItem: async () => undefined,
+ },
+}));
+
+vi.mock("@/attachments/service", () => ({
+ garbageCollectAttachments: async () =>
undefined, +})); + +vi.mock("./use-agent-form-state", () => ({ + useAgentFormState: () => ({ + selectedServerId: "host-1", + setSelectedServerId: () => undefined, + setSelectedServerIdFromUser: () => undefined, + selectedProvider: "codex", + setProviderFromUser: () => undefined, + selectedMode: "auto", + setModeFromUser: () => undefined, + selectedModel: "", + setModelFromUser: () => undefined, + selectedThinkingOptionId: "", + setThinkingOptionFromUser: () => undefined, + workingDir: "/repo", + setWorkingDir: () => undefined, + setWorkingDirFromUser: () => undefined, + providerDefinitions: [{ id: "codex", label: "Codex", modes: [{ id: "auto", label: "Auto" }] }], + providerDefinitionMap: new Map(), + agentDefinition: undefined, + modeOptions: [{ id: "auto", label: "Auto" }], + availableModels: [], + allProviderModels: new Map(), + isAllModelsLoading: false, + availableThinkingOptions: [], + isModelLoading: false, + modelError: null, + refreshProviderModels: () => undefined, + setProviderAndModelFromUser: () => undefined, + workingDirIsEmpty: false, + persistFormPreferences: async () => undefined, + }), +})); + +let __private__: typeof import("./use-agent-input-draft").__private__; + +beforeAll(async () => { + const storage = new Map(); + Object.defineProperty(globalThis, "window", { + value: { + localStorage: { + getItem: (key: string) => storage.get(key) ?? 
null, + setItem: (key: string, value: string) => { + storage.set(key, value); + }, + removeItem: (key: string) => { + storage.delete(key); + }, + }, + }, + configurable: true, + }); + + ({ __private__ } = await import("./use-agent-input-draft")); +}); + +describe("useAgentInputDraft", () => { + describe("__private__.resolveDraftKey", () => { + it("returns an object draft key string unchanged", () => { + expect( + __private__.resolveDraftKey({ + draftKey: "draft:key", + selectedServerId: "host-1", + }), + ).toBe("draft:key"); + }); + + it("resolves a computed draft key from the selected server", () => { + expect( + __private__.resolveDraftKey({ + draftKey: ({ selectedServerId }) => `draft:${selectedServerId ?? "none"}`, + selectedServerId: "host-1", + }), + ).toBe("draft:host-1"); + }); + }); + + describe("__private__.resolveEffectiveComposerModelId", () => { + const models = [ + { + provider: "codex", + id: "gpt-5.4", + label: "gpt-5.4", + isDefault: true, + }, + { + provider: "codex", + id: "gpt-5.4-mini", + label: "gpt-5.4-mini", + }, + ]; + + it("prefers the selected model when present", () => { + expect( + __private__.resolveEffectiveComposerModelId({ + selectedModel: "gpt-5.4-mini", + availableModels: models, + }), + ).toBe("gpt-5.4-mini"); + }); + + it("returns empty string when no model selected", () => { + expect( + __private__.resolveEffectiveComposerModelId({ + selectedModel: "", + availableModels: models, + }), + ).toBe(""); + }); + }); + + describe("__private__.resolveEffectiveComposerThinkingOptionId", () => { + const models = [ + { + provider: "codex", + id: "gpt-5.4", + label: "gpt-5.4", + isDefault: true, + defaultThinkingOptionId: "high", + thinkingOptions: [ + { id: "medium", label: "Medium" }, + { id: "high", label: "High", isDefault: true }, + ], + }, + ]; + + it("prefers the selected thinking option when present", () => { + expect( + __private__.resolveEffectiveComposerThinkingOptionId({ + selectedThinkingOptionId: "medium", + availableModels: 
models, + effectiveModelId: "gpt-5.4", + }), + ).toBe("medium"); + }); + + it("falls back to the model default thinking option", () => { + expect( + __private__.resolveEffectiveComposerThinkingOptionId({ + selectedThinkingOptionId: "", + availableModels: models, + effectiveModelId: "gpt-5.4", + }), + ).toBe("high"); + }); + }); + + describe("__private__.buildDraftComposerCommandConfig", () => { + it("returns undefined when cwd is empty", () => { + expect( + __private__.buildDraftComposerCommandConfig({ + provider: "codex", + cwd: " ", + modeOptions: [], + selectedMode: "", + effectiveModelId: "gpt-5.4", + effectiveThinkingOptionId: "high", + }), + ).toBeUndefined(); + }); + + it("builds the draft command config from derived composer state", () => { + expect( + __private__.buildDraftComposerCommandConfig({ + provider: "codex", + cwd: "/repo", + modeOptions: [{ id: "auto", label: "Auto" }], + selectedMode: "auto", + effectiveModelId: "gpt-5.4", + effectiveThinkingOptionId: "high", + }), + ).toEqual({ + provider: "codex", + cwd: "/repo", + modeId: "auto", + model: "gpt-5.4", + thinkingOptionId: "high", + }); + }); + }); +}); diff --git a/packages/app/src/hooks/use-agent-input-draft.ts b/packages/app/src/hooks/use-agent-input-draft.ts index 200474970..8550aa2bd 100644 --- a/packages/app/src/hooks/use-agent-input-draft.ts +++ b/packages/app/src/hooks/use-agent-input-draft.ts @@ -1,9 +1,46 @@ -import { useCallback, useEffect, useRef, useState } from "react"; +import { useCallback, useEffect, useMemo, useRef, useState } from "react"; import type { AttachmentMetadata } from "@/attachments/types"; +import type { DraftAgentStatusBarProps } from "@/components/agent-status-bar"; +import type { DraftCommandConfig } from "@/hooks/use-agent-commands-query"; +import { + useAgentFormState, + type CreateAgentInitialValues, + type UseAgentFormStateResult, +} from "@/hooks/use-agent-form-state"; +import { useDraftAgentFeatures } from "@/hooks/use-draft-agent-features"; import { 
useDraftStore } from "@/stores/draft-store"; +import type { AgentModelDefinition } from "@server/server/agent/agent-sdk-types"; type ImageUpdater = AttachmentMetadata[] | ((prev: AttachmentMetadata[]) => AttachmentMetadata[]); +type AgentInputDraftComposerOptions = { + initialServerId: string | null; + initialValues?: CreateAgentInitialValues; + isVisible?: boolean; + onlineServerIds?: string[]; + lockedWorkingDir?: string; +}; + +type DraftKeyContext = { + selectedServerId: string | null; +}; + +type DraftKeyInput = string | ((context: DraftKeyContext) => string); + +type UseAgentInputDraftInput = { + draftKey: DraftKeyInput; + composer?: AgentInputDraftComposerOptions; +}; + +type DraftComposerState = UseAgentFormStateResult & { + workingDir: string; + effectiveModelId: string; + effectiveThinkingOptionId: string; + featureValues: Record | undefined; + statusControls: DraftAgentStatusBarProps; + commandDraftConfig: DraftCommandConfig | undefined; +}; + interface AgentInputDraft { text: string; setText: (text: string) => void; @@ -11,6 +48,7 @@ interface AgentInputDraft { setImages: (updater: ImageUpdater) => void; clear: (lifecycle: "sent" | "abandoned") => void; isHydrated: boolean; + composerState: DraftComposerState | null; } function hasDraftContent(input: { text: string; images: AttachmentMetadata[] }): boolean { @@ -36,7 +74,111 @@ function areImagesEqual(input: { }); } -export function useAgentInputDraft(draftKey: string): AgentInputDraft { +function resolveDraftKey(input: { + draftKey: DraftKeyInput; + selectedServerId: string | null; +}): string { + if (typeof input.draftKey === "function") { + return input.draftKey({ selectedServerId: input.selectedServerId }); + } + return input.draftKey; +} + +function resolveEffectiveComposerModelId(input: { + selectedModel: string; + availableModels: AgentModelDefinition[]; +}): string { + return input.selectedModel.trim(); +} + +function resolveEffectiveComposerThinkingOptionId(input: { + selectedThinkingOptionId: 
string; + availableModels: AgentModelDefinition[]; + effectiveModelId: string; +}): string { + const selectedThinkingOptionId = input.selectedThinkingOptionId.trim(); + if (selectedThinkingOptionId) { + return selectedThinkingOptionId; + } + + const selectedModelDefinition = + input.availableModels.find((model) => model.id === input.effectiveModelId) ?? null; + return selectedModelDefinition?.defaultThinkingOptionId ?? ""; +} + +function buildDraftComposerCommandConfig(input: { + provider: DraftAgentStatusBarProps["selectedProvider"]; + cwd: string; + modeOptions: DraftAgentStatusBarProps["modeOptions"]; + selectedMode: string; + effectiveModelId: string; + effectiveThinkingOptionId: string; + featureValues?: Record; +}): DraftCommandConfig | undefined { + const cwd = input.cwd.trim(); + if (!cwd) { + return undefined; + } + + return { + provider: input.provider, + cwd, + ...(input.modeOptions.length > 0 && input.selectedMode !== "" ? { modeId: input.selectedMode } : {}), + ...(input.effectiveModelId ? { model: input.effectiveModelId } : {}), + ...(input.effectiveThinkingOptionId + ? { thinkingOptionId: input.effectiveThinkingOptionId } + : {}), + ...(input.featureValues ? 
{ featureValues: input.featureValues } : {}), + }; +} + +function buildDraftStatusControls(input: { + formState: UseAgentFormStateResult; + features?: DraftAgentStatusBarProps["features"]; + onSetFeature?: DraftAgentStatusBarProps["onSetFeature"]; + onDropdownClose?: DraftAgentStatusBarProps["onDropdownClose"]; +}): DraftAgentStatusBarProps { + const { formState, features, onSetFeature, onDropdownClose } = input; + return { + providerDefinitions: formState.providerDefinitions, + selectedProvider: formState.selectedProvider, + onSelectProvider: formState.setProviderFromUser, + modeOptions: formState.modeOptions, + selectedMode: formState.selectedMode, + onSelectMode: formState.setModeFromUser, + models: formState.availableModels, + selectedModel: formState.selectedModel, + onSelectModel: formState.setModelFromUser, + isModelLoading: formState.isModelLoading, + allProviderModels: formState.allProviderModels, + isAllModelsLoading: formState.isAllModelsLoading, + onSelectProviderAndModel: formState.setProviderAndModelFromUser, + thinkingOptions: formState.availableThinkingOptions, + selectedThinkingOptionId: formState.selectedThinkingOptionId, + onSelectThinkingOption: formState.setThinkingOptionFromUser, + features, + onSetFeature, + onDropdownClose, + }; +} + +export function useAgentInputDraft(input: UseAgentInputDraftInput): AgentInputDraft { + const composerOptions = input.composer ?? null; + const formState = useAgentFormState({ + initialServerId: composerOptions?.initialServerId ?? null, + initialValues: composerOptions?.initialValues, + isVisible: composerOptions?.isVisible ?? false, + isCreateFlow: true, + onlineServerIds: composerOptions?.onlineServerIds ?? 
[], + }); + const draftKey = useMemo( + () => + resolveDraftKey({ + draftKey: input.draftKey, + selectedServerId: formState.selectedServerId, + }), + [formState.selectedServerId, input.draftKey], + ); const [text, setText] = useState(""); const [images, setImagesState] = useState([]); const [isHydrated, setIsHydrated] = useState(false); @@ -148,6 +290,105 @@ export function useAgentInputDraft(draftKey: string): AgentInputDraft { }); }, [draftKey, images, text]); + const lockedWorkingDir = composerOptions?.lockedWorkingDir?.trim() ?? ""; + useEffect(() => { + if (!composerOptions || !lockedWorkingDir) { + return; + } + if (formState.workingDir.trim() === lockedWorkingDir) { + return; + } + formState.setWorkingDir(lockedWorkingDir); + }, [composerOptions, formState, lockedWorkingDir]); + + const effectiveModelId = useMemo( + () => + resolveEffectiveComposerModelId({ + selectedModel: formState.selectedModel, + availableModels: formState.availableModels, + }), + [formState.availableModels, formState.selectedModel], + ); + + const effectiveThinkingOptionId = useMemo( + () => + resolveEffectiveComposerThinkingOptionId({ + selectedThinkingOptionId: formState.selectedThinkingOptionId, + availableModels: formState.availableModels, + effectiveModelId, + }), + [effectiveModelId, formState.availableModels, formState.selectedThinkingOptionId], + ); + + const workingDir = lockedWorkingDir || formState.workingDir; + const { + features: draftFeatures, + featureValues: draftFeatureValues, + setFeatureValue: setDraftFeatureValue, + } = useDraftAgentFeatures({ + serverId: formState.selectedServerId, + provider: formState.selectedProvider, + cwd: workingDir, + modeId: formState.selectedMode, + modelId: effectiveModelId, + thinkingOptionId: effectiveThinkingOptionId, + }); + + const commandDraftConfig = useMemo( + () => + composerOptions + ? 
buildDraftComposerCommandConfig({ + provider: formState.selectedProvider, + cwd: workingDir, + modeOptions: formState.modeOptions, + selectedMode: formState.selectedMode, + effectiveModelId, + effectiveThinkingOptionId, + featureValues: draftFeatureValues, + }) + : undefined, + [ + composerOptions, + effectiveModelId, + effectiveThinkingOptionId, + draftFeatureValues, + workingDir, + formState.modeOptions, + formState.selectedMode, + formState.selectedProvider, + ], + ); + + const composerState = useMemo(() => { + if (!composerOptions) { + return null; + } + + return { + ...formState, + workingDir, + effectiveModelId, + effectiveThinkingOptionId, + featureValues: draftFeatureValues, + statusControls: buildDraftStatusControls({ + formState, + features: draftFeatures, + onSetFeature: setDraftFeatureValue, + }), + commandDraftConfig, + }; + }, [ + commandDraftConfig, + composerOptions, + effectiveModelId, + effectiveThinkingOptionId, + draftFeatures, + draftFeatureValues, + formState, + setDraftFeatureValue, + workingDir, + ]); + return { text, setText, @@ -155,5 +396,14 @@ export function useAgentInputDraft(draftKey: string): AgentInputDraft { setImages, clear, isHydrated, + composerState, }; } + +export const __private__ = { + resolveDraftKey, + resolveEffectiveComposerModelId, + resolveEffectiveComposerThinkingOptionId, + buildDraftComposerCommandConfig, + buildDraftStatusControls, +}; diff --git a/packages/app/src/hooks/use-agent-screen-state-machine.ts b/packages/app/src/hooks/use-agent-screen-state-machine.ts index 5bc29306e..e9b33d76f 100644 --- a/packages/app/src/hooks/use-agent-screen-state-machine.ts +++ b/packages/app/src/hooks/use-agent-screen-state-machine.ts @@ -5,6 +5,7 @@ export interface AgentScreenAgent { id: string; status: "initializing" | "idle" | "running" | "error" | "closed"; cwd: string; + lastError?: string | null; projectPlacement?: { checkout?: { cwd?: string; diff --git a/packages/app/src/hooks/use-archive-agent.test.ts 
b/packages/app/src/hooks/use-archive-agent.test.ts index 173aa9ba1..0e3ec4fdc 100644 --- a/packages/app/src/hooks/use-archive-agent.test.ts +++ b/packages/app/src/hooks/use-archive-agent.test.ts @@ -139,4 +139,36 @@ describe("useArchiveAgent", () => { entries: [{ agent: { id: "agent-2" } }], }); }); + + it("can apply archived agent close results without invalidating cached lists", () => { + const queryClient = new QueryClient(); + useSessionStore + .getState() + .initializeSession("server-a", {} as DaemonClient); + useSessionStore.getState().setAgents( + "server-a", + new Map([ + [ + "agent-1", + makeAgent(), + ], + ]), + ); + queryClient.setQueryData(["sidebarAgentsList", "server-a"], { + entries: [{ agent: { id: "agent-1" } }, { agent: { id: "agent-2" } }], + }); + queryClient.setQueryData(["allAgents", "server-a"], { + entries: [{ agent: { id: "agent-1" } }, { agent: { id: "agent-2" } }], + }); + + applyArchivedAgentCloseResults({ + queryClient, + serverId: "server-a", + results: [{ agentId: "agent-1", archivedAt: "2026-04-01T04:00:00.000Z" }], + invalidateQueries: false, + }); + + expect(queryClient.getQueryState(["sidebarAgentsList", "server-a"])?.isInvalidated).toBe(false); + expect(queryClient.getQueryState(["allAgents", "server-a"])?.isInvalidated).toBe(false); + }); }); diff --git a/packages/app/src/hooks/use-archive-agent.ts b/packages/app/src/hooks/use-archive-agent.ts index 278319454..922f2d229 100644 --- a/packages/app/src/hooks/use-archive-agent.ts +++ b/packages/app/src/hooks/use-archive-agent.ts @@ -29,6 +29,16 @@ interface AgentsListQueryData { entries?: Array<{ agent?: { id?: string | null } | null } | null>; } +interface ArchivedAgentListCacheSnapshot { + sidebarAgentsList: AgentsListQueryData | undefined; + allAgents: AgentsListQueryData | undefined; +} + +interface ArchiveAgentMutationContext { + agent: ReturnType; + lists: ArchivedAgentListCacheSnapshot; +} + function toArchiveKey(input: ArchiveAgentInput): string { const serverId = 
input.serverId.trim(); const agentId = input.agentId.trim(); @@ -111,6 +121,70 @@ function removeAgentFromCachedLists(queryClient: QueryClient, input: ArchiveAgen ); } +function getStoredAgentSnapshot(input: ArchiveAgentInput) { + return useSessionStore.getState().sessions[input.serverId]?.agents.get(input.agentId); +} + +function restoreAgentSnapshot( + input: ArchiveAgentInput & { agent: ReturnType }, +): void { + const setAgents = useSessionStore.getState().setAgents; + setAgents(input.serverId, (prev) => { + const hasAgent = prev.has(input.agentId); + if (!input.agent) { + if (!hasAgent) { + return prev; + } + const next = new Map(prev); + next.delete(input.agentId); + return next; + } + + const current = prev.get(input.agentId); + if (current === input.agent) { + return prev; + } + + const next = new Map(prev); + next.set(input.agentId, input.agent); + return next; + }); +} + +function getArchivedAgentListCacheSnapshot( + queryClient: QueryClient, + serverId: string, +): ArchivedAgentListCacheSnapshot { + return { + sidebarAgentsList: queryClient.getQueryData([ + "sidebarAgentsList", + serverId, + ]), + allAgents: queryClient.getQueryData(["allAgents", serverId]), + }; +} + +function restoreCachedListSnapshot( + queryClient: QueryClient, + queryKey: readonly [string, string], + snapshot: AgentsListQueryData | undefined, +): void { + if (snapshot === undefined) { + queryClient.removeQueries({ queryKey, exact: true }); + return; + } + queryClient.setQueryData(queryKey, snapshot); +} + +function restoreArchivedAgentListCacheSnapshot( + queryClient: QueryClient, + serverId: string, + snapshot: ArchivedAgentListCacheSnapshot, +): void { + restoreCachedListSnapshot(queryClient, ["sidebarAgentsList", serverId], snapshot.sidebarAgentsList); + restoreCachedListSnapshot(queryClient, ["allAgents", serverId], snapshot.allAgents); +} + function markAgentArchivedInStore(input: ArchiveAgentInput & { archivedAt: string }): void { const archivedAt = new Date(input.archivedAt); 
if (Number.isNaN(archivedAt.getTime())) { @@ -139,6 +213,7 @@ interface ApplyArchivedAgentCloseResultsInput { queryClient: QueryClient; serverId: string; results: ArchivedAgentCloseResult[]; + invalidateQueries?: boolean; } export function applyArchivedAgentCloseResults( @@ -160,12 +235,14 @@ export function applyArchivedAgentCloseResults( }); } - void input.queryClient.invalidateQueries({ - queryKey: ["sidebarAgentsList", input.serverId], - }); - void input.queryClient.invalidateQueries({ - queryKey: ["allAgents", input.serverId], - }); + if (input.invalidateQueries ?? true) { + void input.queryClient.invalidateQueries({ + queryKey: ["sidebarAgentsList", input.serverId], + }); + void input.queryClient.invalidateQueries({ + queryKey: ["allAgents", input.serverId], + }); + } } export function clearArchiveAgentPending(input: IsAgentArchivingInput): void { @@ -195,26 +272,56 @@ export function useArchiveAgent() { return await client.archiveAgent(input.agentId); }, onMutate: (input) => { + const context: ArchiveAgentMutationContext = { + agent: getStoredAgentSnapshot(input), + lists: getArchivedAgentListCacheSnapshot(queryClient, input.serverId), + }; + const archivedAt = new Date().toISOString(); + + applyArchivedAgentCloseResults({ + queryClient, + serverId: input.serverId, + results: [{ agentId: input.agentId, archivedAt }], + invalidateQueries: false, + }); setAgentArchiving({ queryClient, serverId: input.serverId, agentId: input.agentId, isArchiving: true, }); + return context; }, onSuccess: (result, input) => { - applyArchivedAgentCloseResults({ - queryClient, + markAgentArchivedInStore({ serverId: input.serverId, - results: [{ agentId: input.agentId, archivedAt: result.archivedAt }], + agentId: input.agentId, + archivedAt: result.archivedAt, }); }, + onError: (_error, input, context) => { + if (!context) { + return; + } + restoreAgentSnapshot({ + serverId: input.serverId, + agentId: input.agentId, + agent: context.agent, + }); + 
restoreArchivedAgentListCacheSnapshot(queryClient, input.serverId, context.lists); + }, onSettled: (_result, _error, input) => { clearArchiveAgentPending({ queryClient, serverId: input.serverId, agentId: input.agentId, }); + void queryClient.invalidateQueries({ + queryKey: ["sidebarAgentsList", input.serverId], + }); + void queryClient.invalidateQueries({ + queryKey: ["allAgents", input.serverId], + }); }, }); diff --git a/packages/app/src/hooks/use-checkout-pr-status-query.ts b/packages/app/src/hooks/use-checkout-pr-status-query.ts index fd0c032ec..a06654171 100644 --- a/packages/app/src/hooks/use-checkout-pr-status-query.ts +++ b/packages/app/src/hooks/use-checkout-pr-status-query.ts @@ -20,6 +20,9 @@ export interface PrHint { url: string; number: number; state: "open" | "merged" | "closed"; + checks?: Array<{ name: string; status: string; url: string | null }>; + checksStatus?: "none" | "pending" | "success" | "failure"; + reviewDecision?: "approved" | "changes_requested" | "pending" | null; } function parsePullRequestNumber(url: string): number | null { @@ -56,6 +59,9 @@ function selectWorkspacePrHint(payload: CheckoutPrStatusPayload): PrHint | null : status.state === "open" ? 
"open" : "closed", + checks: status.checks, + checksStatus: status.checksStatus as PrHint["checksStatus"], + reviewDecision: status.reviewDecision as PrHint["reviewDecision"], }; } diff --git a/packages/app/src/hooks/use-command-center.ts b/packages/app/src/hooks/use-command-center.ts index 92abbaa12..dae233ed5 100644 --- a/packages/app/src/hooks/use-command-center.ts +++ b/packages/app/src/hooks/use-command-center.ts @@ -2,6 +2,7 @@ import { useCallback, useEffect, useMemo, useRef, useState } from "react"; import type { TextInput } from "react-native"; import { router, usePathname, type Href } from "expo-router"; import { useKeyboardShortcutsStore } from "@/stores/keyboard-shortcuts-store"; +import { useSessionStore } from "@/stores/session-store"; import { keyboardActionDispatcher } from "@/keyboard/keyboard-action-dispatcher"; import { useHosts } from "@/runtime/host-runtime"; import { useAllAgentsList } from "@/hooks/use-all-agents-list"; @@ -11,13 +12,18 @@ import { clearCommandCenterFocusRestoreElement, takeCommandCenterFocusRestoreElement, } from "@/utils/command-center-focus-restore"; -import { buildHostSettingsRoute, parseServerIdFromPathname } from "@/utils/host-routes"; +import { + buildHostAgentDetailRoute, + buildHostSettingsRoute, + parseServerIdFromPathname, +} from "@/utils/host-routes"; import type { ShortcutKey } from "@/utils/format-shortcut"; import { chordStringToShortcutKeys } from "@/keyboard/shortcut-string"; import { getBindingIdForAction, getDefaultKeysForAction } from "@/keyboard/keyboard-shortcuts"; import { useKeyboardShortcutOverrides } from "@/hooks/use-keyboard-shortcut-overrides"; import { getShortcutOs } from "@/utils/shortcut-platform"; import { getIsElectronRuntime } from "@/constants/layout"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; import { focusWithRetries } from "@/utils/web-focus"; @@ -215,9 +221,17 @@ export 
function useCommandCenter() { // Don't restore focus back to the prior element after we navigate. clearCommandCenterFocusRestoreElement(); setOpen(false); + const workspaceId = resolveWorkspaceIdByExecutionDirectory({ + workspaces: useSessionStore.getState().sessions[agent.serverId]?.workspaces?.values(), + workspaceDirectory: agent.cwd, + }); + if (!workspaceId) { + router.navigate(buildHostAgentDetailRoute(agent.serverId, agent.id) as any); + return; + } const route = prepareWorkspaceTab({ serverId: agent.serverId, - workspaceId: agent.cwd, + workspaceId, target: { kind: "agent", agentId: agent.id }, }); router.navigate(route); diff --git a/packages/app/src/hooks/use-draft-agent-create-flow.ts b/packages/app/src/hooks/use-draft-agent-create-flow.ts index 98d55b74f..190d92e41 100644 --- a/packages/app/src/hooks/use-draft-agent-create-flow.ts +++ b/packages/app/src/hooks/use-draft-agent-create-flow.ts @@ -72,6 +72,7 @@ type CreateRequestContext = { interface UseDraftAgentCreateFlowOptions { draftId: string; getPendingServerId: () => string | null; + allowEmptyText?: boolean; validateBeforeSubmit?: (ctx: SubmitContext) => string | null; onBeforeSubmit?: (ctx: CreateRequestContext) => void; onCreateStart?: () => void; @@ -84,6 +85,7 @@ interface UseDraftAgentCreateFlowOptions { export function useDraftAgentCreateFlow({ draftId, getPendingServerId, + allowEmptyText = false, validateBeforeSubmit, onBeforeSubmit, onCreateStart, @@ -110,6 +112,10 @@ export function useDraftAgentCreateFlow({ return EMPTY_STREAM_ITEMS; } + if (!machine.attempt.text && (!machine.attempt.images || machine.attempt.images.length === 0)) { + return EMPTY_STREAM_ITEMS; + } + return [ { kind: "user_message", @@ -139,7 +145,7 @@ export function useDraftAgentCreateFlow({ dispatch({ type: "DRAFT_SET_ERROR", message: "" }); const trimmedPrompt = text.trim(); - if (!trimmedPrompt) { + if (!trimmedPrompt && !allowEmptyText) { const error = new Error("Initial prompt is required"); dispatch({ type: 
"DRAFT_SET_ERROR", message: error.message }); throw error; @@ -215,6 +221,7 @@ export function useDraftAgentCreateFlow({ setPendingCreateAttempt, updatePendingAgentId, validateBeforeSubmit, + allowEmptyText, ], ); diff --git a/packages/app/src/hooks/use-open-project.test.ts b/packages/app/src/hooks/use-open-project.test.ts new file mode 100644 index 000000000..3bf87a915 --- /dev/null +++ b/packages/app/src/hooks/use-open-project.test.ts @@ -0,0 +1,142 @@ +import { beforeEach, describe, expect, it, vi } from "vitest"; + +vi.mock("@react-native-async-storage/async-storage", () => { + const storage = new Map(); + return { + default: { + getItem: vi.fn(async (key: string) => storage.get(key) ?? null), + setItem: vi.fn(async (key: string, value: string) => { + storage.set(key, value); + }), + removeItem: vi.fn(async (key: string) => { + storage.delete(key); + }), + }, + }; +}); + +const { replaceRoute } = vi.hoisted(() => ({ + replaceRoute: vi.fn(), +})); + +vi.mock("expo-router", () => ({ + router: { + replace: replaceRoute, + }, +})); + +import { openProjectDirectly } from "@/hooks/use-open-project"; +import { useSessionStore } from "@/stores/session-store"; +import { + buildWorkspaceTabPersistenceKey, + collectAllTabs, + useWorkspaceLayoutStore, +} from "@/stores/workspace-layout-store"; +import { generateDraftId } from "@/stores/draft-keys"; + +const SERVER_ID = "server-1"; +const WORKSPACE_ID = "/repo/project"; + +function createOpenDraftTab() { + return (workspaceKey: string) => + useWorkspaceLayoutStore.getState().openTab(workspaceKey, { + kind: "draft", + draftId: generateDraftId(), + }); +} + +describe("openProjectDirectly", () => { + beforeEach(() => { + replaceRoute.mockReset(); + useSessionStore.setState({ + sessions: {}, + }); + useSessionStore.getState().initializeSession(SERVER_ID, {} as never); + useWorkspaceLayoutStore.setState({ + layoutByWorkspace: {}, + splitSizesByWorkspace: {}, + pinnedAgentIdsByWorkspace: {}, + }); + vi.restoreAllMocks(); + }); + 
+ it("opens the workspace directly, marks workspaces hydrated, and seeds a draft tab", async () => { + const result = await openProjectDirectly({ + serverId: SERVER_ID, + projectPath: WORKSPACE_ID, + isConnected: true, + client: { + openProject: vi.fn(async () => ({ + requestId: "request-1", + error: null, + workspace: { + id: "1", + projectId: "1", + projectDisplayName: "project", + projectRootPath: WORKSPACE_ID, + workspaceDirectory: WORKSPACE_ID, + projectKind: "git" as const, + workspaceKind: "checkout" as const, + name: "project", + status: "done" as const, + activityAt: null, + diffStat: null, + services: [], + }, + })), + }, + mergeWorkspaces: useSessionStore.getState().mergeWorkspaces, + setHasHydratedWorkspaces: useSessionStore.getState().setHasHydratedWorkspaces, + openDraftTab: createOpenDraftTab(), + replaceRoute, + }); + + expect(result).toBe(true); + expect(useSessionStore.getState().sessions[SERVER_ID]?.hasHydratedWorkspaces).toBe(true); + expect(Array.from(useSessionStore.getState().sessions[SERVER_ID]?.workspaces.values() ?? 
[])).toEqual([ + expect.objectContaining({ + id: "1", + projectId: "1", + projectRootPath: WORKSPACE_ID, + workspaceDirectory: WORKSPACE_ID, + }), + ]); + + const workspaceKey = buildWorkspaceTabPersistenceKey({ + serverId: SERVER_ID, + workspaceId: "1", + }); + expect(workspaceKey).toBeTruthy(); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey as string]; + expect(layout.root.kind).toBe("pane"); + const tabs = collectAllTabs(layout.root); + expect(tabs).toHaveLength(1); + expect(tabs[0]?.target.kind).toBe("draft"); + expect(replaceRoute).toHaveBeenCalledWith("/h/server-1/workspace/MQ"); + }); + + it("does not navigate or seed tabs when openProject fails", async () => { + const result = await openProjectDirectly({ + serverId: SERVER_ID, + projectPath: WORKSPACE_ID, + isConnected: true, + client: { + openProject: vi.fn(async () => ({ + requestId: "request-2", + error: "Failed to open project", + workspace: null, + })), + }, + mergeWorkspaces: useSessionStore.getState().mergeWorkspaces, + setHasHydratedWorkspaces: useSessionStore.getState().setHasHydratedWorkspaces, + openDraftTab: createOpenDraftTab(), + replaceRoute, + }); + + expect(result).toBe(false); + expect(useSessionStore.getState().sessions[SERVER_ID]?.hasHydratedWorkspaces).toBe(false); + expect(useSessionStore.getState().sessions[SERVER_ID]?.workspaces.size).toBe(0); + expect(useWorkspaceLayoutStore.getState().layoutByWorkspace).toEqual({}); + expect(replaceRoute).not.toHaveBeenCalled(); + }); +}); diff --git a/packages/app/src/hooks/use-open-project.ts b/packages/app/src/hooks/use-open-project.ts index ed353dccf..e3ea6b238 100644 --- a/packages/app/src/hooks/use-open-project.ts +++ b/packages/app/src/hooks/use-open-project.ts @@ -1,45 +1,81 @@ -import { router } from "expo-router"; import { useCallback } from "react"; -import { useToast } from "@/contexts/toast-context"; -import { useHostRuntimeClient } from "@/runtime/host-runtime"; -import { 
normalizeWorkspaceDescriptor, useSessionStore } from "@/stores/session-store"; -import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; +import { router } from "expo-router"; +import type { DaemonClient } from "@server/client/daemon-client"; +import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime"; +import { normalizeWorkspaceDescriptor, type WorkspaceDescriptor, useSessionStore } from "@/stores/session-store"; +import { + buildWorkspaceTabPersistenceKey, + useWorkspaceLayoutStore, +} from "@/stores/workspace-layout-store"; +import { generateDraftId } from "@/stores/draft-keys"; +import { buildHostWorkspaceRoute } from "@/utils/host-routes"; + +interface OpenProjectDirectlyInput { + serverId: string; + projectPath: string; + isConnected: boolean; + client: Pick | null; + mergeWorkspaces: (serverId: string, workspaces: Iterable) => void; + setHasHydratedWorkspaces: (serverId: string, hydrated: boolean) => void; + openDraftTab: (workspaceKey: string) => string | null; + replaceRoute: (route: string) => void; +} + +export async function openProjectDirectly(input: OpenProjectDirectlyInput): Promise { + const normalizedServerId = input.serverId.trim(); + const trimmedPath = input.projectPath.trim(); + if (!normalizedServerId || !trimmedPath || !input.client || !input.isConnected) { + return false; + } + + const payload = await input.client.openProject(trimmedPath); + if (payload.error || !payload.workspace) { + return false; + } + + const workspace = normalizeWorkspaceDescriptor(payload.workspace); + input.mergeWorkspaces(normalizedServerId, [workspace]); + input.setHasHydratedWorkspaces(normalizedServerId, true); + + const workspaceKey = buildWorkspaceTabPersistenceKey({ + serverId: normalizedServerId, + workspaceId: workspace.id, + }); + if (!workspaceKey) { + return false; + } + + input.openDraftTab(workspaceKey); + input.replaceRoute(buildHostWorkspaceRoute(normalizedServerId, workspace.id)); + return true; +} export 
function useOpenProject(serverId: string | null): (path: string) => Promise { const normalizedServerId = serverId?.trim() ?? ""; - const toast = useToast(); const client = useHostRuntimeClient(normalizedServerId); + const isConnected = useHostRuntimeIsConnected(normalizedServerId); const mergeWorkspaces = useSessionStore((state) => state.mergeWorkspaces); const setHasHydratedWorkspaces = useSessionStore((state) => state.setHasHydratedWorkspaces); return useCallback( async (path: string) => { - const trimmedPath = path.trim(); - if (!trimmedPath || !client || !normalizedServerId) { - return false; - } - - try { - const payload = await client.openProject(trimmedPath); - if (payload.error || !payload.workspace) { - throw new Error(payload.error || "Failed to open project"); - } - - mergeWorkspaces(normalizedServerId, [normalizeWorkspaceDescriptor(payload.workspace)]); - setHasHydratedWorkspaces(normalizedServerId, true); - router.replace( - prepareWorkspaceTab({ - serverId: normalizedServerId, - workspaceId: payload.workspace.id, - target: { kind: "draft", draftId: "new" }, - }) as any, - ); - return true; - } catch (error) { - toast.error(error instanceof Error ? 
error.message : "Failed to open project"); - return false; - } + return openProjectDirectly({ + serverId: normalizedServerId, + projectPath: path, + isConnected, + client, + mergeWorkspaces, + setHasHydratedWorkspaces, + openDraftTab: (workspaceKey: string) => + useWorkspaceLayoutStore.getState().openTab(workspaceKey, { + kind: "draft", + draftId: generateDraftId(), + }), + replaceRoute: (route) => { + router.replace(route as any); + }, + }); }, - [client, mergeWorkspaces, normalizedServerId, setHasHydratedWorkspaces, toast], + [client, isConnected, mergeWorkspaces, normalizedServerId, setHasHydratedWorkspaces], ); } diff --git a/packages/app/src/hooks/use-provider-models.ts b/packages/app/src/hooks/use-provider-models.ts new file mode 100644 index 000000000..25e861e79 --- /dev/null +++ b/packages/app/src/hooks/use-provider-models.ts @@ -0,0 +1,49 @@ +import { useMemo } from "react"; +import { useQueries } from "@tanstack/react-query"; +import { + AGENT_PROVIDER_DEFINITIONS, + type AgentProviderDefinition, +} from "@server/server/agent/provider-manifest"; +import type { AgentModelDefinition, AgentProvider } from "@server/server/agent/agent-sdk-types"; +import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime"; + +const STALE_TIME = 5 * 60 * 1000; + +export function useProviderModels(serverId: string) { + const client = useHostRuntimeClient(serverId); + const isConnected = useHostRuntimeIsConnected(serverId); + const enabled = Boolean(serverId && client && isConnected); + + const queries = useQueries({ + queries: AGENT_PROVIDER_DEFINITIONS.map((def) => ({ + queryKey: ["providerModels", serverId, def.id] as const, + enabled, + staleTime: STALE_TIME, + queryFn: async () => { + if (!client) { + throw new Error("Host is not connected"); + } + const payload = await client.listProviderModels(def.id as AgentProvider); + if (payload.error) { + throw new Error(payload.error); + } + return payload.models ?? 
[]; + }, + })), + }); + + const allProviderModels = useMemo(() => { + const map = new Map<AgentProvider, AgentModelDefinition[]>(); + for (let i = 0; i < AGENT_PROVIDER_DEFINITIONS.length; i++) { + const query = queries[i]; + if (query?.data) { + map.set(AGENT_PROVIDER_DEFINITIONS[i]!.id, query.data); + } + } + return map; + }, [queries]); + + const isLoading = queries.some((q) => q.isLoading); + + return { allProviderModels, isLoading }; +} diff --git a/packages/app/src/hooks/use-sidebar-workspaces-list.test.ts b/packages/app/src/hooks/use-sidebar-workspaces-list.test.ts index 88f4925c6..d9941aae4 100644 --- a/packages/app/src/hooks/use-sidebar-workspaces-list.test.ts +++ b/packages/app/src/hooks/use-sidebar-workspaces-list.test.ts @@ -1,4 +1,5 @@ import { describe, expect, it } from "vitest"; +import type { WorkspaceServicePayload } from "@server/shared/messages"; import { appendMissingOrderKeys, applyStoredOrdering, @@ -19,7 +20,12 @@ function workspace( Partial< Pick< WorkspaceDescriptor, - "projectDisplayName" | "projectRootPath" | "projectKind" | "workspaceKind" + | "projectDisplayName" + | "projectRootPath" + | "workspaceDirectory" + | "projectKind" + | "workspaceKind" + | "services" > >, ): WorkspaceDescriptor { @@ -28,15 +34,35 @@ function workspace( projectId: input.projectId, projectDisplayName: input.projectDisplayName ?? input.projectId, projectRootPath: input.projectRootPath ?? input.id, + workspaceDirectory: input.workspaceDirectory ?? input.projectRootPath ?? input.id, projectKind: input.projectKind ?? "git", - workspaceKind: input.workspaceKind ?? "local_checkout", + workspaceKind: input.workspaceKind ?? "checkout", name: input.name, status: input.status, activityAt: input.activityAt, diffStat: null, + services: input.services ??
[], }; } +const runningService: WorkspaceServicePayload = { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running", + health: "healthy", +}; + +const stoppedService: WorkspaceServicePayload = { + serviceName: "api", + hostname: "main.api.localhost", + port: 3001, + url: "http://main.api.localhost:6767", + lifecycle: "stopped", + health: null, +}; + describe("applyStoredOrdering", () => { it("keeps unknown items on the baseline while applying stored order", () => { const result = applyStoredOrdering({ @@ -117,6 +143,27 @@ describe("buildSidebarProjectsFromWorkspaces", () => { expect(projects[0]?.workspaces[0]?.statusBucket).toBe("failed"); }); + it("threads services into workspace rows and derives hasRunningServices", () => { + const projects = buildSidebarProjectsFromWorkspaces({ + serverId: "srv", + workspaces: [ + workspace({ + id: "/repo/main", + projectId: "project-1", + name: "main", + status: "running", + activityAt: new Date("2026-01-01T00:00:00.000Z"), + services: [runningService, stoppedService], + }), + ], + projectOrder: [], + workspaceOrderByScope: {}, + }); + + expect(projects[0]?.workspaces[0]?.services).toEqual([runningService, stoppedService]); + expect(projects[0]?.workspaces[0]?.hasRunningServices).toBe(true); + }); + it("preserves stored project order even when activity changes", () => { const initialWorkspaces: WorkspaceDescriptor[] = [ workspace({ diff --git a/packages/app/src/hooks/use-sidebar-workspaces-list.ts b/packages/app/src/hooks/use-sidebar-workspaces-list.ts index 947c83423..d3f13a9b4 100644 --- a/packages/app/src/hooks/use-sidebar-workspaces-list.ts +++ b/packages/app/src/hooks/use-sidebar-workspaces-list.ts @@ -1,4 +1,5 @@ import { useCallback, useEffect, useMemo, useSyncExternalStore } from "react"; +import type { WorkspaceDescriptorPayload } from "@server/shared/messages"; import { mergeWorkspaceSnapshotWithExisting, normalizeWorkspaceDescriptor, @@ 
-19,11 +20,16 @@ export interface SidebarWorkspaceEntry { workspaceKey: string; serverId: string; workspaceId: string; + projectRootPath?: string; + workspaceDirectory?: string; + projectKind: WorkspaceDescriptor["projectKind"]; workspaceKind: WorkspaceDescriptor["workspaceKind"]; name: string; activityAt: Date | null; statusBucket: SidebarStateBucket; diffStat: { additions: number; deletions: number } | null; + services: WorkspaceDescriptor["services"]; + hasRunningServices: boolean; } export interface SidebarProjectEntry { @@ -122,7 +128,7 @@ export function buildSidebarProjectsFromWorkspaces(input: { projectName: workspace.projectDisplayName || projectDisplayNameFromProjectId(workspace.projectId), projectKind: workspace.projectKind, - iconWorkingDir: workspace.projectRootPath || workspace.id, + iconWorkingDir: workspace.projectRootPath, statusBucket: "done", activeCount: 0, totalWorkspaces: 0, @@ -134,11 +140,16 @@ export function buildSidebarProjectsFromWorkspaces(input: { workspaceKey: `${input.serverId}:${workspace.id}`, serverId: input.serverId, workspaceId: workspace.id, + projectRootPath: workspace.projectRootPath, + workspaceDirectory: workspace.workspaceDirectory, + projectKind: workspace.projectKind, workspaceKind: workspace.workspaceKind, name: workspace.name, activityAt: workspace.activityAt, statusBucket: workspace.status, diffStat: workspace.diffStat, + services: workspace.services, + hasRunningServices: workspace.services.some((service) => service.lifecycle === "running"), }; project.workspaces.push(row); @@ -256,13 +267,18 @@ function toWorkspaceDescriptor(payload: { projectId: string; projectDisplayName: string; projectRootPath: string; + workspaceDirectory: string; projectKind: WorkspaceDescriptor["projectKind"]; workspaceKind: WorkspaceDescriptor["workspaceKind"]; name: string; status: WorkspaceDescriptor["status"]; activityAt: string | null; + services?: WorkspaceDescriptorPayload["services"]; }): WorkspaceDescriptor { - return 
normalizeWorkspaceDescriptor(payload); + return normalizeWorkspaceDescriptor({ + ...payload, + services: payload.services ?? [], + }); } export function useSidebarWorkspacesList(options?: { diff --git a/packages/app/src/keyboard/keyboard-shortcuts.test.ts b/packages/app/src/keyboard/keyboard-shortcuts.test.ts index 21bc67c72..605803d9f 100644 --- a/packages/app/src/keyboard/keyboard-shortcuts.test.ts +++ b/packages/app/src/keyboard/keyboard-shortcuts.test.ts @@ -261,10 +261,10 @@ describe("keyboard-shortcuts", () => { action: "sidebar.toggle.left", }, { - name: "keeps Mod+. as sidebar toggle fallback", + name: "binds Mod+. to toggle both sidebars on non-mac", event: { key: ".", code: "Period", ctrlKey: true }, context: { isMac: false }, - action: "sidebar.toggle.left", + action: "sidebar.toggle.both", }, { name: "routes Mod+D to message-input action outside terminal", @@ -344,11 +344,6 @@ describe("keyboard-shortcuts", () => { event: { key: "k", code: "KeyK", ctrlKey: true }, context: { isMac: false, focusScope: "terminal" }, }, - { - name: "does not bind Ctrl+B on non-mac", - event: { key: "b", code: "KeyB", ctrlKey: true }, - context: { isMac: false }, - }, { name: "does not route message-input actions when terminal is focused", event: { key: "d", code: "KeyD", metaKey: true }, @@ -477,10 +472,11 @@ describe("keyboard-shortcut help sections", () => { }, }, { - name: "uses mod+period as non-mac left sidebar shortcut", + name: "uses mod+b for the left sidebar and mod+period for both sidebars on non-mac", context: { isMac: false, isDesktop: false }, expectedKeys: { - "toggle-left-sidebar": ["mod", "."], + "toggle-left-sidebar": ["mod", "B"], + "toggle-both-sidebars": ["mod", "."], }, }, ]; diff --git a/packages/app/src/keyboard/keyboard-shortcuts.ts b/packages/app/src/keyboard/keyboard-shortcuts.ts index bc9b65122..06f4882d8 100644 --- a/packages/app/src/keyboard/keyboard-shortcuts.ts +++ b/packages/app/src/keyboard/keyboard-shortcuts.ts @@ -204,7 +204,7 @@ const 
SHORTCUT_BINDINGS: readonly ShortcutBinding[] = [ help: { id: "workspace-tab-new", section: "tabs-panes", - label: "New agent tab", + label: "New tab", keys: ["mod", "T"], }, }, @@ -216,7 +216,7 @@ const SHORTCUT_BINDINGS: readonly ShortcutBinding[] = [ help: { id: "workspace-tab-new", section: "tabs-panes", - label: "New agent tab", + label: "New tab", keys: ["mod", "T"], }, }, diff --git a/packages/app/src/panels/agent-panel.tsx b/packages/app/src/panels/agent-panel.tsx index debf2ff29..e8e35f812 100644 --- a/packages/app/src/panels/agent-panel.tsx +++ b/packages/app/src/panels/agent-panel.tsx @@ -4,14 +4,13 @@ import ReanimatedAnimated from "react-native-reanimated"; import { StyleSheet, useUnistyles } from "react-native-unistyles"; import { useShallow } from "zustand/shallow"; import { useStoreWithEqualityFn } from "zustand/traditional"; -import { Bot } from "lucide-react-native"; import invariant from "tiny-invariant"; import { AgentStreamView, type AgentStreamViewHandle } from "@/components/agent-stream-view"; -import { AgentInputArea } from "@/components/agent-input-area"; +import { Composer } from "@/components/composer"; import { ArchivedAgentCallout } from "@/components/archived-agent-callout"; import { FileDropZone } from "@/components/file-drop-zone"; -import type { ImageAttachment } from "@/components/message-input"; import { getProviderIcon } from "@/components/provider-icons"; +import type { ImageAttachment } from "@/components/message-input"; import { ToastViewport, useToastHost } from "@/components/toast-host"; import { useAgentAttentionClear } from "@/hooks/use-agent-attention-clear"; import { useAgentInitialization } from "@/hooks/use-agent-initialization"; @@ -92,7 +91,7 @@ function useAgentPanelDescriptor( ); const provider = descriptorState.provider; const label = resolveWorkspaceAgentTabLabel(descriptorState.title); - const icon = getProviderIcon(provider) ?? Bot; + const icon = getProviderIcon(provider); return { label: label ?? 
"", @@ -151,6 +150,12 @@ function isNotFoundErrorMessage(message: string): boolean { return /agent not found|not found/i.test(message); } +type AgentLookupState = + | { tag: "idle" } + | { tag: "loading" } + | { tag: "not_found"; message: string } + | { tag: "error"; message: string }; + function AgentPanelContent({ serverId, agentId, @@ -223,6 +228,195 @@ function AgentPanelBody({ isConnected: boolean; connectionStatus: HostRuntimeConnectionStatus; onOpenWorkspaceFile?: (input: { filePath: string }) => void; +}) { + const { theme } = useUnistyles(); + const { isArchivingAgent } = useArchiveAgent(); + const hasSession = useSessionStore((state) => Boolean(state.sessions[serverId])); + const setAgents = useSessionStore((state) => state.setAgents); + const setPendingPermissions = useSessionStore((state) => state.setPendingPermissions); + const projectPlacement = useStoreWithEqualityFn( + useSessionStore, + (state) => + agentId ? (state.sessions[serverId]?.agents?.get(agentId)?.projectPlacement ?? null) : null, + (a, b) => a === b || JSON.stringify(a) === JSON.stringify(b), + ); + const agentState = useSessionStore( + useShallow((state) => { + const agent = agentId ? state.sessions[serverId]?.agents?.get(agentId) ?? null : null; + return { + serverId: agent?.serverId ?? null, + id: agent?.id ?? null, + status: agent?.status ?? null, + cwd: agent?.cwd ?? null, + lastError: agent?.lastError ?? null, + archivedAt: agent?.archivedAt ?? 
null, + }; + }), + ); + const [lookupState, setLookupState] = useState<AgentLookupState>({ tag: "idle" }); + const lookupAttemptTokenRef = useRef(0); + + useEffect(() => { + lookupAttemptTokenRef.current += 1; + setLookupState({ tag: "idle" }); + }, [agentId, serverId]); + + useEffect(() => { + if (!agentId) { + return; + } + if (agentState.id) { + if (lookupState.tag !== "idle") { + setLookupState({ tag: "idle" }); + } + return; + } + if (!isConnected || !hasSession) { + return; + } + if (lookupState.tag === "loading" || lookupState.tag === "not_found") { + return; + } + + setLookupState({ tag: "loading" }); + const attemptToken = ++lookupAttemptTokenRef.current; + + client + .fetchAgent(agentId) + .then((result) => { + if (attemptToken !== lookupAttemptTokenRef.current) { + return; + } + if (!result) { + setLookupState({ + tag: "not_found", + message: `Agent not found: ${agentId}`, + }); + return; + } + + const normalized = normalizeAgentSnapshot(result.agent, serverId); + const hydrated = { + ...normalized, + projectPlacement: result.project, + }; + setAgents(serverId, (previous) => { + const next = new Map(previous); + next.set(hydrated.id, hydrated); + return next; + }); + setPendingPermissions(serverId, (previous) => { + const next = new Map(previous); + for (const [key, pending] of next.entries()) { + if (pending.agentId === hydrated.id) { + next.delete(key); + } + } + for (const request of hydrated.pendingPermissions) { + const key = derivePendingPermissionKey(hydrated.id, request); + next.set(key, { key, agentId: hydrated.id, request }); + } + return next; + }); + setLookupState({ tag: "idle" }); + }) + .catch((error) => { + if (attemptToken !== lookupAttemptTokenRef.current) { + return; + } + const message = toErrorMessage(error); + if (isNotFoundErrorMessage(message)) { + setLookupState({ tag: "not_found", message }); + return; + } + setLookupState({ tag: "error", message }); + }); + }, [ + agentId, + agentState.id, + client, + hasSession, + isConnected,
lookupState.tag, + serverId, + setAgents, + setPendingPermissions, + ]); + + if (lookupState.tag === "not_found") { + return ( + + + Agent not found + + + ); + } + + if (lookupState.tag === "error") { + return ( + + + Failed to load agent + {lookupState.message} + + + ); + } + + const agent: AgentScreenAgent | null = + agentState.serverId && agentState.id && agentState.status && agentState.cwd + ? { + serverId: agentState.serverId, + id: agentState.id, + status: agentState.status, + cwd: agentState.cwd, + lastError: agentState.lastError ?? null, + projectPlacement, + } + : null; + + if (!agent) { + return ( + + + + + + ); + } + + const isArchivingCurrentAgent = Boolean(agentId && isArchivingAgent({ serverId, agentId })); + + return ( + + ); +} + +function ChatAgentContent({ + serverId, + agentId, + isPaneFocused, + client, + isConnected, + connectionStatus, + onOpenWorkspaceFile, +}: { + serverId: string; + agentId?: string; + isPaneFocused: boolean; + client: NonNullable>; + isConnected: boolean; + connectionStatus: HostRuntimeConnectionStatus; + onOpenWorkspaceFile?: (input: { filePath: string }) => void; }) { const { theme } = useUnistyles(); const panelToast = useToastHost(); @@ -237,12 +431,12 @@ function AgentPanelBody({ routeKey: string; reason: "initial-entry" | "resume"; } | null>(null); - const agentInputDraft = useAgentInputDraft( - buildDraftStoreKey({ + const agentInputDraft = useAgentInputDraft({ + draftKey: buildDraftStoreKey({ serverId, agentId: agentId ?? "__pending__", }), - ); + }); const handleFilesDropped = useCallback((files: ImageAttachment[]) => { addImagesRef.current?.(files); @@ -260,6 +454,7 @@ function AgentPanelBody({ id: agent?.id ?? null, status: agent?.status ?? null, cwd: agent?.cwd ?? null, + lastError: agent?.lastError ?? null, archivedAt: agent?.archivedAt ?? null, requiresAttention: agent?.requiresAttention ?? false, attentionReason: agent?.attentionReason ?? 
null, @@ -490,6 +685,7 @@ function AgentPanelBody({ id: agentState.id, status: agentState.status, cwd: agentState.cwd, + lastError: agentState.lastError ?? null, projectPlacement, } : null; @@ -611,99 +807,6 @@ function AgentPanelBody({ setMissingAgentState({ kind: "idle" }); }, [agentId, serverId]); - useEffect(() => { - if (!agentId) { - return; - } - if (agentState.id || shouldUseOptimisticStream) { - if (missingAgentState.kind !== "idle") { - setMissingAgentState({ kind: "idle" }); - } - return; - } - if (!isConnected || !hasSession) { - return; - } - if (missingAgentState.kind === "resolving" || missingAgentState.kind === "not_found") { - return; - } - - setMissingAgentState({ kind: "resolving" }); - const attemptToken = ++initAttemptTokenRef.current; - - ensureAgentIsInitialized(agentId) - .then(async () => { - if (attemptToken !== initAttemptTokenRef.current) { - return; - } - const currentAgent = useSessionStore.getState().sessions[serverId]?.agents.get(agentId); - if (!currentAgent) { - const result = await client.fetchAgent(agentId); - if (attemptToken !== initAttemptTokenRef.current) { - return; - } - if (!result) { - setMissingAgentState({ - kind: "not_found", - message: `Agent not found: ${agentId}`, - }); - return; - } - const normalized = normalizeAgentSnapshot(result.agent, serverId); - const hydrated = { - ...normalized, - projectPlacement: result.project, - }; - setAgents(serverId, (previous) => { - const next = new Map(previous); - next.set(hydrated.id, hydrated); - return next; - }); - setPendingPermissions(serverId, (previous) => { - const next = new Map(previous); - for (const [key, pending] of next.entries()) { - if (pending.agentId === hydrated.id) { - next.delete(key); - } - } - for (const request of hydrated.pendingPermissions) { - const key = derivePendingPermissionKey(hydrated.id, request); - next.set(key, { key, agentId: hydrated.id, request }); - } - return next; - }); - } - if (attemptToken !== initAttemptTokenRef.current) { - return; 
- } - setMissingAgentState({ kind: "idle" }); - }) - .catch((error) => { - if (attemptToken !== initAttemptTokenRef.current) { - return; - } - const message = toErrorMessage(error); - if (isNotFoundErrorMessage(message)) { - setMissingAgentState({ kind: "not_found", message }); - return; - } - setMissingAgentState({ kind: "error", message }); - }); - }, [ - agentState.id, - agentId, - client, - ensureAgentIsInitialized, - hasSession, - isConnected, - missingAgentState.kind, - serverId, - setAgents, - setPendingPermissions, - shouldUseOptimisticStream, - ]); - - if (viewState.tag === "not_found") { return ( @@ -756,7 +859,7 @@ function AgentPanelBody({ {agentId && !isArchivingCurrentAgent && !agentState.archivedAt ? ( - + resolveWorkspaceExecutionAuthority({ + workspaces: state.sessions[serverId]?.workspaces, + workspaceId, + }), + ); invariant(target.kind === "file", "FilePanel requires file target"); - return ; + if (!authority) { + return ( + + Workspace execution directory not found. 
+ + ); + } + return ( + + ); } export const filePanelRegistration: PanelRegistration<"file"> = { diff --git a/packages/app/src/panels/register-panels.ts b/packages/app/src/panels/register-panels.ts index 760671b65..dfc14f7fe 100644 --- a/packages/app/src/panels/register-panels.ts +++ b/packages/app/src/panels/register-panels.ts @@ -2,6 +2,7 @@ import { agentPanelRegistration } from "@/panels/agent-panel"; import { draftPanelRegistration } from "@/panels/draft-panel"; import { filePanelRegistration } from "@/panels/file-panel"; import { registerPanel } from "@/panels/panel-registry"; +import { setupPanelRegistration } from "@/panels/setup-panel"; import { terminalPanelRegistration } from "@/panels/terminal-panel"; let panelsRegistered = false; @@ -12,6 +13,7 @@ export function ensurePanelsRegistered(): void { } registerPanel(draftPanelRegistration); registerPanel(agentPanelRegistration); + registerPanel(setupPanelRegistration); registerPanel(terminalPanelRegistration); registerPanel(filePanelRegistration); panelsRegistered = true; diff --git a/packages/app/src/panels/setup-panel.tsx b/packages/app/src/panels/setup-panel.tsx new file mode 100644 index 000000000..fe1d94aa3 --- /dev/null +++ b/packages/app/src/panels/setup-panel.tsx @@ -0,0 +1,443 @@ +import { useCallback, useEffect, useRef, useState } from "react"; +import { CheckCircle2, ChevronRight, CircleAlert, SquareTerminal } from "lucide-react-native"; +import { ActivityIndicator, Pressable, ScrollView, Text, View } from "react-native"; +import invariant from "tiny-invariant"; +import { StyleSheet, useUnistyles } from "react-native-unistyles"; +import { Fonts } from "@/constants/theme"; +import { usePaneContext } from "@/panels/pane-context"; +import type { PanelDescriptor, PanelRegistration } from "@/panels/panel-registry"; +import { buildWorkspaceTabPersistenceKey } from "@/stores/workspace-tabs-store"; +import { useWorkspaceSetupStore } from "@/stores/workspace-setup-store"; +import { useHostRuntimeClient } 
from "@/runtime/host-runtime"; + +function useSetupPanelDescriptor( + target: { kind: "setup"; workspaceId: string }, + context: { serverId: string; workspaceId: string }, +): PanelDescriptor { + const key = buildWorkspaceTabPersistenceKey({ + serverId: context.serverId, + workspaceId: target.workspaceId, + }); + const snapshot = useWorkspaceSetupStore((state) => (key ? state.snapshots[key] ?? null : null)); + + if (snapshot?.status === "completed") { + return { + label: "Setup", + subtitle: "Setup completed", + titleState: "ready", + icon: CheckCircle2, + statusBucket: null, + }; + } + + if (snapshot?.status === "failed") { + return { + label: "Setup", + subtitle: "Setup failed", + titleState: "ready", + icon: CircleAlert, + statusBucket: null, + }; + } + + return { + label: "Setup", + subtitle: "Workspace setup", + titleState: "ready", + icon: SquareTerminal, + statusBucket: snapshot?.status === "running" ? "running" : null, + }; +} + +type CommandStatus = "running" | "completed" | "failed"; + +function CommandStatusIcon({ status }: { status: CommandStatus }) { + const { theme } = useUnistyles(); + + if (status === "running") { + return ; + } + if (status === "completed") { + return ; + } + return ; +} + +function formatDuration(ms: number): string { + if (ms < 1000) return `${ms}ms`; + const seconds = Math.floor(ms / 1000); + if (seconds < 60) return `${seconds}s`; + const minutes = Math.floor(seconds / 60); + const remainingSeconds = seconds % 60; + return `${minutes}m ${remainingSeconds}s`; +} + +/** + * Process carriage returns in log text so progress-bar output renders cleanly. + * Splits on \r, keeps only the last segment per CR-delimited group (unless followed by \n). 
+ */ +function processCarriageReturns(text: string): string { + if (!text.includes("\r")) return text; + return text + .split("\n") + .map((line) => { + if (!line.includes("\r")) return line; + const segments = line.split("\r"); + return segments[segments.length - 1]; + }) + .join("\n"); +} + +function SetupPanel() { + const { theme } = useUnistyles(); + const { serverId, target } = usePaneContext(); + invariant(target.kind === "setup", "SetupPanel requires setup target"); + + const client = useHostRuntimeClient(serverId); + const key = buildWorkspaceTabPersistenceKey({ + serverId, + workspaceId: target.workspaceId, + }); + const snapshot = useWorkspaceSetupStore((state) => (key ? state.snapshots[key] ?? null : null)); + const upsertProgress = useWorkspaceSetupStore((state) => state.upsertProgress); + + // On mount, if no snapshot in the store, request cached status from server + const requestedRef = useRef(false); + useEffect(() => { + if (snapshot || requestedRef.current || !client) return; + requestedRef.current = true; + client + .fetchWorkspaceSetupStatus(target.workspaceId) + .then((response) => { + if (response.snapshot) { + upsertProgress({ + serverId, + payload: { workspaceId: response.workspaceId, ...response.snapshot }, + }); + } + }) + .catch(() => { + // Server may not support this yet — ignore + }); + }, [client, snapshot, serverId, target.workspaceId, upsertProgress]); + + const commands = snapshot?.detail.commands ?? []; + const log = snapshot?.detail.log ?? 
""; + const hasNoSetupCommands = + snapshot?.status === "completed" && commands.length === 0 && log.trim().length === 0; + const isWaiting = !snapshot || (snapshot.status === "running" && commands.length === 0); + + const [expandedIndices, setExpandedIndices] = useState<Set<number>>(new Set()); + const [manuallyCollapsed, setManuallyCollapsed] = useState<Set<number>>(new Set()); + + const toggleExpanded = useCallback((index: number, isAutoExpanded: boolean) => { + setExpandedIndices((prev) => { + const next = new Set(prev); + if (next.has(index) || isAutoExpanded) { + next.delete(index); + // If this was auto-expanded, record that the user manually collapsed it + if (isAutoExpanded) { + setManuallyCollapsed((mc) => new Set(mc).add(index)); + } + } else { + next.add(index); + // If the user re-expands, remove from manually collapsed + setManuallyCollapsed((mc) => { + const next = new Set(mc); + next.delete(index); + return next; + }); + } + return next; + }); + }, []); + + // Determine which command should auto-expand (running or last completed). + const autoExpandIndex = (() => { + const running = commands.find((c) => c.status === "running"); + if (running) return running.index; + if (commands.length > 0) return commands[commands.length - 1].index; + return null; + })(); + + const statusLabel = snapshot?.status === "running" + ? "Running" + : snapshot?.status === "completed" + ? "Completed" + : snapshot?.status === "failed" + ? "Failed" + : "Waiting for setup output"; + + return ( + + {/* Hidden element for status — preserves testID for E2E */} + {statusLabel} + + {isWaiting ? ( + + + Setting up workspace... + + ) : hasNoSetupCommands ? ( + + + No setup commands ran for this workspace.
+ + + ) : ( + + {commands.map((command) => { + const isExpanded = expandedIndices.has(command.index); + const hasError = command.status === "failed" && snapshot?.error; + + // Per-command log: use command.log if available, fall back to detail.log for the auto-expand target + const commandLog = (() => { + if ("log" in command && typeof command.log === "string") { + return command.log; + } + // Fallback: show detail.log on the auto-expand target command + if (command.index === autoExpandIndex) return log; + return ""; + })(); + const hasLog = commandLog.trim().length > 0; + + // All non-running commands are expandable (completed/failed) + const isExpandable = command.status !== "running" || hasLog || !!hasError; + + // Auto-expand the active command unless the user manually collapsed it + const isAutoExpanded = + command.index === autoExpandIndex && !manuallyCollapsed.has(command.index); + const showDetail = isExpanded || isAutoExpanded; + + const processedLog = hasLog ? processCarriageReturns(commandLog) : ""; + + return ( + + toggleExpanded(command.index, isAutoExpanded)} + style={({ pressed }) => [ + styles.commandRow, + showDetail && styles.commandRowExpanded, + pressed && styles.commandRowPressed, + ]} + accessibilityRole="button" + accessibilityState={{ expanded: showDetail }} + > + + + + + {command.command} + + {command.durationMs != null ? ( + + {formatDuration(command.durationMs)} + + ) : null} + + + {showDetail ? ( + + {hasLog ? ( + + + {processedLog} + + + ) : ( + + No output + + )} + {hasError ? ( + + + {snapshot.error} + + + ) : null} + + ) : null} + + ); + })} + + {/* If there's log but no commands yet, or log without a target command, show standalone */} + {commands.length === 0 && log.trim().length > 0 ? ( + + + {log} + + + ) : null} + + {/* Show error at top level if no commands failed but there's a setup error */} + {snapshot?.error && !commands.some((c) => c.status === "failed") ? 
( + + + {snapshot.error} + + + ) : null} + + )} + + ); +} + +export const setupPanelRegistration: PanelRegistration<"setup"> = { + kind: "setup", + component: SetupPanel, + useDescriptor: useSetupPanelDescriptor, +}; + +const styles = StyleSheet.create((theme) => ({ + container: { + flex: 1, + minHeight: 0, + backgroundColor: theme.colors.surface0, + }, + contentContainer: { + padding: theme.spacing[4], + flexGrow: 1, + }, + hiddenStatus: { + position: "absolute", + width: 1, + height: 1, + overflow: "hidden", + opacity: 0, + }, + waitingContainer: { + flex: 1, + alignItems: "center", + justifyContent: "center", + gap: theme.spacing[3], + }, + waitingText: { + fontSize: theme.fontSize.sm, + color: theme.colors.foregroundMuted, + }, + emptyContainer: { + flex: 1, + alignItems: "center", + justifyContent: "center", + }, + emptyText: { + fontSize: theme.fontSize.sm, + color: theme.colors.foregroundMuted, + }, + commandList: { + gap: theme.spacing[2], + }, + commandItem: { + borderRadius: theme.borderRadius.lg, + borderWidth: theme.borderWidth[1], + borderColor: theme.colors.border, + overflow: "hidden", + }, + commandRow: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + paddingHorizontal: theme.spacing[3], + paddingVertical: theme.spacing[2], + backgroundColor: theme.colors.surface1, + }, + commandRowExpanded: { + borderBottomWidth: theme.borderWidth[1], + borderBottomColor: theme.colors.border, + }, + commandRowPressed: { + opacity: 0.8, + }, + commandStatusIcon: { + width: 18, + height: 18, + alignItems: "center", + justifyContent: "center", + flexShrink: 0, + }, + commandText: { + flex: 1, + fontSize: theme.fontSize.sm, + color: theme.colors.foreground, + }, + commandDuration: { + fontSize: theme.fontSize.xs, + color: theme.colors.foregroundMuted, + flexShrink: 0, + }, + chevron: { + flexShrink: 0, + }, + chevronExpanded: { + transform: [{ rotate: "90deg" }], + }, + commandDetail: { + backgroundColor: theme.colors.surface0, + }, + 
logScroll: { + maxHeight: 400, + }, + logScrollContent: { + padding: theme.spacing[3], + }, + logText: { + fontFamily: Fonts.mono, + fontSize: theme.fontSize.sm, + lineHeight: 20, + color: theme.colors.foreground, + }, + emptyLogText: { + fontSize: theme.fontSize.sm, + color: theme.colors.foregroundMuted, + fontStyle: "italic", + }, + errorCard: { + padding: theme.spacing[3], + backgroundColor: theme.colors.palette.red[100], + }, + errorText: { + fontSize: theme.fontSize.sm, + color: theme.colors.palette.red[800], + }, +})); diff --git a/packages/app/src/panels/terminal-panel.tsx b/packages/app/src/panels/terminal-panel.tsx index 2f77fec4c..b6f9a2bf5 100644 --- a/packages/app/src/panels/terminal-panel.tsx +++ b/packages/app/src/panels/terminal-panel.tsx @@ -1,6 +1,6 @@ import { useQuery } from "@tanstack/react-query"; import { Terminal } from "lucide-react-native"; -import { View } from "react-native"; +import { Text, View } from "react-native"; import { useIsFocused } from "@react-navigation/native"; import invariant from "tiny-invariant"; import type { ListTerminalsResponse } from "@server/shared/messages"; @@ -8,6 +8,7 @@ import { TerminalPane } from "@/components/terminal-pane"; import { usePaneContext } from "@/panels/pane-context"; import type { PanelDescriptor, PanelRegistration } from "@/panels/panel-registry"; import { useSessionStore } from "@/stores/session-store"; +import { getWorkspaceExecutionAuthority } from "@/utils/workspace-execution"; type ListTerminalsPayload = ListTerminalsResponse["payload"]; @@ -24,14 +25,24 @@ function useTerminalPanelDescriptor( context: { serverId: string; workspaceId: string }, ): PanelDescriptor { const client = useSessionStore((state) => state.sessions[context.serverId]?.client ?? 
null);
+  const workspaces = useSessionStore((state) => state.sessions[context.serverId]?.workspaces);
+  const workspaceAuthority = getWorkspaceExecutionAuthority({
+    workspaces,
+    workspaceId: context.workspaceId,
+  });
+  const workspaceDirectory = workspaceAuthority.ok
+    ? workspaceAuthority.authority.workspaceDirectory
+    : null;
   const terminalsQuery = useQuery({
-    queryKey: ["terminals", context.serverId, context.workspaceId] as const,
-    enabled: Boolean(client && context.workspaceId),
+    queryKey: ["terminals", context.serverId, workspaceDirectory] as const,
+    enabled: Boolean(client && workspaceDirectory),
     queryFn: async (): Promise => {
-      if (!client) {
-        return { cwd: context.workspaceId, terminals: [], requestId: "missing-client" };
+      if (!client || !workspaceDirectory) {
+        throw new Error(
+          workspaceAuthority.ok ? "Workspace execution directory not found" : workspaceAuthority.message,
+        );
       }
-      return client.listTerminals(context.workspaceId);
+      return client.listTerminals(workspaceDirectory);
     },
     staleTime: 5_000,
   });
@@ -39,7 +50,7 @@ function useTerminalPanelDescriptor(
     terminalsQuery.data?.terminals.find((entry) => entry.id === target.terminalId) ?? null;

   return {
-    label: trimNonEmpty(terminal?.name ?? null) ?? "Terminal",
+    label: trimNonEmpty(terminal?.title ?? terminal?.name ?? null) ?? "Terminal",
     subtitle: "Terminal",
     titleState: "ready",
     icon: Terminal,
@@ -50,16 +61,31 @@ function TerminalPanel() {
   const isFocused = useIsFocused();
   const { serverId, workspaceId, target, isPaneFocused } = usePaneContext();
+  const workspaces = useSessionStore((state) => state.sessions[serverId]?.workspaces);
+  const workspaceAuthority = getWorkspaceExecutionAuthority({ workspaces, workspaceId });
+  const workspaceDirectory = workspaceAuthority.ok
+    ?
workspaceAuthority.authority.workspaceDirectory + : null; invariant(target.kind === "terminal", "TerminalPanel requires terminal target"); if (!isFocused) { return ; } + if (!workspaceDirectory) { + return ( + + + {workspaceAuthority.ok ? "Workspace execution directory not found." : workspaceAuthority.message} + + + ); + } + return ( diff --git a/packages/app/src/runtime/host-runtime.test.ts b/packages/app/src/runtime/host-runtime.test.ts index dcebc6118..dd7827e02 100644 --- a/packages/app/src/runtime/host-runtime.test.ts +++ b/packages/app/src/runtime/host-runtime.test.ts @@ -540,44 +540,37 @@ describe("HostRuntimeController", () => { unsubscribe(); }); - it("logs typed reason codes for connection transitions", async () => { - const infoSpy = vi.spyOn(console, "info").mockImplementation(() => undefined); - try { - const host = makeHost({ - connections: [ - { - id: "direct:lan:6767", - type: "directTcp", - endpoint: "lan:6767", - }, - ], - }); - const clients: FakeDaemonClient[] = []; - const controller = new HostRuntimeController({ - host, - deps: makeDeps( - { - "direct:lan:6767": 12, - }, - clients, - ), - }); - - await controller.start({ autoProbe: false }); - clients[0]?.setConnectionState({ - status: "disconnected", - reason: "transport closed", - }); - - const transitionPayloads = infoSpy.mock.calls - .filter((call) => call[0] === "[HostRuntimeTransition]") - .map((call) => call[1] as { reasonCode?: string | null }); - const lastTransition = transitionPayloads[transitionPayloads.length - 1] ?? 
null;
-
-      expect(lastTransition?.reasonCode).toBe("transport_error");
-    } finally {
-      infoSpy.mockRestore();
-    }
+  it("preserves transport disconnect reasons on the runtime snapshot", async () => {
+    const host = makeHost({
+      connections: [
+        {
+          id: "direct:lan:6767",
+          type: "directTcp",
+          endpoint: "lan:6767",
+        },
+      ],
+    });
+    const clients: FakeDaemonClient[] = [];
+    const controller = new HostRuntimeController({
+      host,
+      deps: makeDeps(
+        {
+          "direct:lan:6767": 12,
+        },
+        clients,
+      ),
+    });
+
+    await controller.start({ autoProbe: false });
+    clients[0]?.setConnectionState({
+      status: "disconnected",
+      reason: "transport closed",
+    });
+
+    expect(controller.getSnapshot()).toMatchObject({
+      connectionStatus: "error",
+      lastError: "transport closed",
+    });
   });

   it("marks directory loading on first connection before any directory sync succeeds", async () => {
diff --git a/packages/app/src/screens/agent/draft-agent-screen.tsx b/packages/app/src/screens/agent/draft-agent-screen.tsx
index 69f7c0c05..f82a48ad7 100644
--- a/packages/app/src/screens/agent/draft-agent-screen.tsx
+++ b/packages/app/src/screens/agent/draft-agent-screen.tsx
@@ -11,15 +11,14 @@ import Animated from "react-native-reanimated";
 import { Folder, GitBranch, PanelRight } from "lucide-react-native";
 import { SidebarMenuToggle } from "@/components/headers/menu-header";
 import { HeaderToggleButton } from "@/components/headers/header-toggle-button";
-import { AgentInputArea } from "@/components/agent-input-area";
+import { Composer } from "@/components/composer";
 import { AgentStreamView } from "@/components/agent-stream-view";
 import { FormSelectTrigger } from "@/components/agent-form/agent-form-dropdowns";
 import { ExplorerSidebar } from "@/components/explorer-sidebar";
 import { Combobox } from "@/components/ui/combobox";
 import { FileDropZone } from "@/components/file-drop-zone";
 import { useQuery } from "@tanstack/react-query";
-import { useAgentFormState, type CreateAgentInitialValues } from
"@/hooks/use-agent-form-state"; -import type { DraftCommandConfig } from "@/hooks/use-agent-commands-query"; +import type { CreateAgentInitialValues } from "@/hooks/use-agent-form-state"; import { CHECKOUT_STATUS_STALE_TIME, checkoutStatusQueryKey, @@ -27,6 +26,7 @@ import { import { useAllAgentsList } from "@/hooks/use-all-agents-list"; import { useHosts } from "@/runtime/host-runtime"; import { buildBranchComboOptions, normalizeBranchOptionName } from "@/utils/branch-suggestions"; +import { buildHostAgentDetailRoute } from "@/utils/host-routes"; import { shortenPath } from "@/utils/shorten-path"; import { collectAgentWorkingDirectorySuggestions } from "@/utils/agent-working-directory-suggestions"; import { buildWorkingDirectorySuggestions } from "@/utils/working-directory-suggestions"; @@ -50,13 +50,13 @@ import type { AgentSessionConfig, } from "@server/server/agent/agent-sdk-types"; import { AGENT_PROVIDER_DEFINITIONS } from "@server/server/agent/provider-manifest"; +import { resolveWorkspaceIdByExecutionDirectory } from "@/utils/workspace-execution"; import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; import { TitlebarDragRegion } from "@/components/desktop/titlebar-drag-region"; import { useKeyboardShiftStyle } from "@/hooks/use-keyboard-shift-style"; import { normalizeAgentSnapshot } from "@/utils/agent-snapshots"; import { useAgentInputDraft } from "@/hooks/use-agent-input-draft"; import { useDraftAgentCreateFlow } from "@/hooks/use-draft-agent-create-flow"; -import { useDraftAgentFeatures } from "@/hooks/use-draft-agent-features"; const EMPTY_PENDING_PERMISSIONS = new Map(); const DRAFT_CAPABILITIES: AgentCapabilityFlags = { @@ -203,38 +203,45 @@ function DraftAgentScreenContent({ return values; }, [resolvedMode, resolvedModel, resolvedProvider, resolvedThinkingOptionId, resolvedWorkingDir]); + const draftIdRef = useRef(generateDraftId()); + const draftAgentIdRef = useRef(generateDraftId()); + const draftInput = useAgentInputDraft( + { + 
draftKey: ({ selectedServerId }) => + buildDraftStoreKey({ + serverId: selectedServerId ?? "", + agentId: draftAgentIdRef.current, + draftId: draftIdRef.current, + }), + composer: { + initialServerId: resolvedServerId ?? null, + initialValues, + isVisible, + onlineServerIds, + }, + }, + ); + const composerState = draftInput.composerState; + if (!composerState) { + throw new Error("Draft agent composer state is required"); + } + const { selectedServerId, setSelectedServerIdFromUser, - selectedProvider, - setProviderFromUser, - selectedMode, - setModeFromUser, - selectedModel, - setModelFromUser, - selectedThinkingOptionId, - setThinkingOptionFromUser, + providerDefinitions, workingDir, setWorkingDirFromUser, - providerDefinitions, modeOptions, - availableModels, - allProviderModels, - allProviderEntries, - isAllModelsLoading, - availableThinkingOptions, isModelLoading, modelError, refreshProviderModels, - setProviderAndModelFromUser, persistFormPreferences, - } = useAgentFormState({ - initialServerId: resolvedServerId ?? null, - initialValues, - isVisible, - isCreateFlow: true, - onlineServerIds, - }); + effectiveModelId, + effectiveThinkingOptionId, + commandDraftConfig, + statusControls, + } = composerState; const isMobile = isCompactFormFactor(); const mobileView = usePanelStore((state) => state.mobileView); const desktopFileExplorerOpen = usePanelStore((state) => state.desktop.fileExplorerOpen); @@ -246,15 +253,6 @@ function DraftAgentScreenContent({ (state) => state.activateExplorerTabForCheckout, ); const isExplorerOpen = isMobile ? mobileView === "file-explorer" : desktopFileExplorerOpen; - const draftIdRef = useRef(generateDraftId()); - const draftAgentIdRef = useRef(generateDraftId()); - const draftInput = useAgentInputDraft( - buildDraftStoreKey({ - serverId: selectedServerId ?? 
"", - agentId: draftAgentIdRef.current, - draftId: draftIdRef.current, - }), - ); const [worktreeMode, setWorktreeMode] = useState<"none" | "create" | "attach">( initialWorktreeMode, @@ -634,6 +632,16 @@ function DraftAgentScreenContent({ isGit: isAttachWorktree && selectedWorktreePath ? true : checkout?.isGit === true, }; }, [selectedServerId, explorerCwd, isAttachWorktree, selectedWorktreePath, checkout?.isGit]); + const draftExplorerWorkspaceId = useSessionStore( + useCallback( + (state) => + resolveWorkspaceIdByExecutionDirectory({ + workspaces: selectedServerId ? state.sessions[selectedServerId]?.workspaces?.values() : null, + workspaceDirectory: explorerCwd, + }), + [explorerCwd, selectedServerId], + ), + ); const canOpenExplorer = draftExplorerCheckout !== null; const openExplorerForDraftCheckout = useCallback(() => { if (!draftExplorerCheckout) { @@ -745,61 +753,6 @@ function DraftAgentScreenContent({ }, [baseBranch, branchSearchQuery, branchSuggestionsQuery.data, checkout, worktreeOptions]); const createAgentClient = sessionClient; - const effectiveDraftModelId = useMemo(() => { - if (selectedModel.trim()) { - return selectedModel.trim(); - } - return availableModels.find((model) => model.isDefault)?.id ?? availableModels[0]?.id ?? ""; - }, [availableModels, selectedModel]); - const effectiveDraftThinkingOptionId = useMemo(() => { - if (selectedThinkingOptionId.trim()) { - return selectedThinkingOptionId.trim(); - } - const selectedModelDefinition = - availableModels.find((model) => model.id === effectiveDraftModelId) ?? null; - return selectedModelDefinition?.defaultThinkingOptionId ?? 
""; - }, [availableModels, effectiveDraftModelId, selectedThinkingOptionId]); - const { - features: draftFeatures, - featureValues: draftFeatureValues, - setFeatureValue: setDraftFeatureValue, - } = useDraftAgentFeatures({ - serverId: selectedServerId, - provider: selectedProvider, - cwd: workingDir, - modeId: selectedMode, - modelId: effectiveDraftModelId, - thinkingOptionId: effectiveDraftThinkingOptionId, - }); - const draftCommandConfig = useMemo(() => { - const cwd = ( - isAttachWorktree && selectedWorktreePath ? selectedWorktreePath : workingDir - ).trim(); - if (!cwd) { - return undefined; - } - - return { - provider: selectedProvider, - cwd, - ...(modeOptions.length > 0 && selectedMode !== "" ? { modeId: selectedMode } : {}), - ...(effectiveDraftModelId ? { model: effectiveDraftModelId } : {}), - ...(effectiveDraftThinkingOptionId - ? { thinkingOptionId: effectiveDraftThinkingOptionId } - : {}), - ...(draftFeatureValues ? { featureValues: draftFeatureValues } : {}), - }; - }, [ - draftFeatureValues, - effectiveDraftModelId, - effectiveDraftThinkingOptionId, - isAttachWorktree, - modeOptions.length, - selectedMode, - selectedProvider, - selectedWorktreePath, - workingDir, - ]); const { formErrorMessage, @@ -807,7 +760,7 @@ function DraftAgentScreenContent({ optimisticStreamItems, draftAgent, handleCreateFromInput, - } = useDraftAgentCreateFlow({ + } = useDraftAgentCreateFlow({ draftId: draftIdRef.current, getPendingServerId: () => selectedServerId, validateBeforeSubmit: ({ text }) => { @@ -833,7 +786,7 @@ function DraftAgentScreenContent({ if (isModelLoading) { return "Model defaults are still loading"; } - if (!effectiveDraftModelId) { + if (!effectiveModelId) { return "No model is available for the selected provider"; } if (isAttachWorktree && !selectedWorktreePath) { @@ -866,10 +819,13 @@ function DraftAgentScreenContent({ const cwd = (isAttachWorktree && selectedWorktreePath ? 
selectedWorktreePath : workingDir).trim() || "."; - const provider = selectedProvider; - const model = effectiveDraftModelId || null; - const thinkingOptionId = effectiveDraftThinkingOptionId || null; - const modeId = modeOptions.length > 0 && selectedMode !== "" ? selectedMode : null; + const provider = composerState.selectedProvider; + const model = effectiveModelId || null; + const thinkingOptionId = effectiveThinkingOptionId || null; + const modeId = + composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? composerState.selectedMode + : null; return { serverId, @@ -894,7 +850,6 @@ function DraftAgentScreenContent({ title: "New agent", cwd, model, - features: draftFeatures, thinkingOptionId, labels: {}, }; @@ -904,16 +859,18 @@ function DraftAgentScreenContent({ const resolvedWorkingDir = isAttachWorktree && selectedWorktreePath ? selectedWorktreePath : trimmedPath; - const modeId = modeOptions.length > 0 && selectedMode !== "" ? selectedMode : undefined; + const modeId = + composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? composerState.selectedMode + : undefined; const config: AgentSessionConfig = { - provider: selectedProvider, + provider: composerState.selectedProvider, cwd: resolvedWorkingDir, ...(modeId ? { modeId } : {}), - ...(effectiveDraftModelId ? { model: effectiveDraftModelId } : {}), - ...(effectiveDraftThinkingOptionId - ? { thinkingOptionId: effectiveDraftThinkingOptionId } + ...(effectiveModelId ? { model: effectiveModelId } : {}), + ...(effectiveThinkingOptionId + ? { thinkingOptionId: effectiveThinkingOptionId } : {}), - ...(draftFeatureValues ? { featureValues: draftFeatureValues } : {}), }; const effectiveBaseBranch = baseBranch.trim(); @@ -960,20 +917,29 @@ function DraftAgentScreenContent({ const createdWorkingDir = typeof result.cwd === "string" ? result.cwd.trim() : ""; const configuredWorkingDir = config.cwd.trim(); - const workspaceId = createdWorkingDir.length > 0 ? 
createdWorkingDir : configuredWorkingDir; + const workspaceId = resolveWorkspaceIdByExecutionDirectory({ + workspaces: useSessionStore.getState().sessions[selectedServerId]?.workspaces?.values(), + workspaceDirectory: createdWorkingDir.length > 0 ? createdWorkingDir : configuredWorkingDir, + }); return { agentId: result.id, result: { id: result.id, - cwd: workspaceId, + workspaceId, }, }; }, onCreateSuccess: ({ result }) => { + if (!result.workspaceId) { + router.replace( + buildHostAgentDetailRoute(selectedServerId as string, result.id) as any, + ); + return; + } const route = prepareWorkspaceTab({ serverId: selectedServerId as string, - workspaceId: result.cwd, + workspaceId: result.workspaceId, target: { kind: "agent", agentId: result.id }, }); router.replace(route as any); @@ -1234,7 +1200,7 @@ function DraftAgentScreenContent({ )} - @@ -1277,7 +1226,7 @@ function DraftAgentScreenContent({ {!isMobile && isExplorerOpen && explorerServerId && draftExplorerCheckout ? ( @@ -1300,7 +1249,7 @@ function DraftAgentScreenContent({ {isMobile && explorerServerId && draftExplorerCheckout ? 
( diff --git a/packages/app/src/screens/new-workspace-screen.tsx b/packages/app/src/screens/new-workspace-screen.tsx new file mode 100644 index 000000000..b11d001fc --- /dev/null +++ b/packages/app/src/screens/new-workspace-screen.tsx @@ -0,0 +1,376 @@ +import { useCallback, useMemo, useRef, useState } from "react"; +import { Pressable, Text, View } from "react-native"; +import { StyleSheet, useUnistyles } from "react-native-unistyles"; +import { createNameId } from "mnemonic-id"; +import { useQuery } from "@tanstack/react-query"; +import { ChevronDown, GitBranch } from "lucide-react-native"; +import { Composer } from "@/components/composer"; +import { Combobox, ComboboxItem } from "@/components/ui/combobox"; +import type { ComboboxOption as ComboboxOptionType } from "@/components/ui/combobox"; +import { TitlebarDragRegion } from "@/components/desktop/titlebar-drag-region"; +import { SidebarMenuToggle } from "@/components/headers/menu-header"; +import { ScreenHeader } from "@/components/headers/screen-header"; +import { + HEADER_INNER_HEIGHT, + HEADER_INNER_HEIGHT_MOBILE, + HEADER_TOP_PADDING_MOBILE, + MAX_CONTENT_WIDTH, +} from "@/constants/layout"; +import { useToast } from "@/contexts/toast-context"; +import { useAgentInputDraft } from "@/hooks/use-agent-input-draft"; +import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime"; +import { normalizeWorkspaceDescriptor, useSessionStore } from "@/stores/session-store"; +import { normalizeAgentSnapshot } from "@/utils/agent-snapshots"; +import { encodeImages } from "@/utils/encode-images"; +import { toErrorMessage } from "@/utils/error-messages"; +import { + requireWorkspaceExecutionAuthority, +} from "@/utils/workspace-execution"; +import { navigateToPreparedWorkspaceTab } from "@/utils/workspace-navigation"; +import type { ImageAttachment, MessagePayload } from "@/components/message-input"; + +interface NewWorkspaceScreenProps { + serverId: string; + sourceDirectory: string; + 
displayName?: string; +} + +export function NewWorkspaceScreen({ + serverId, + sourceDirectory, + displayName: displayNameProp, +}: NewWorkspaceScreenProps) { + const { theme } = useUnistyles(); + const toast = useToast(); + const mergeWorkspaces = useSessionStore((state) => state.mergeWorkspaces); + const setAgents = useSessionStore((state) => state.setAgents); + const [errorMessage, setErrorMessage] = useState(null); + const [createdWorkspace, setCreatedWorkspace] = useState | null>(null); + const [pendingAction, setPendingAction] = useState<"chat" | null>(null); + const [selectedBranch, setSelectedBranch] = useState(null); + const [branchPickerOpen, setBranchPickerOpen] = useState(false); + const branchAnchorRef = useRef(null); + + const displayName = displayNameProp?.trim() ?? ""; + const workspace = createdWorkspace; + const client = useHostRuntimeClient(serverId); + const isConnected = useHostRuntimeIsConnected(serverId); + const chatDraft = useAgentInputDraft({ + draftKey: `new-workspace:${serverId}:${sourceDirectory}`, + composer: { + initialServerId: serverId || null, + initialValues: workspace?.workspaceDirectory + ? { workingDir: workspace.workspaceDirectory } + : undefined, + isVisible: true, + onlineServerIds: isConnected && serverId ? [serverId] : [], + lockedWorkingDir: workspace?.workspaceDirectory || sourceDirectory || undefined, + }, + }); + const composerState = chatDraft.composerState; + + const withConnectedClient = useCallback(() => { + if (!client || !isConnected) { + throw new Error("Host is not connected"); + } + return client; + }, [client, isConnected]); + + const checkoutStatusQuery = useQuery({ + queryKey: ["checkout-status", serverId, sourceDirectory], + queryFn: async () => { + const connectedClient = withConnectedClient(); + return connectedClient.getCheckoutStatus(sourceDirectory); + }, + enabled: isConnected && !!client, + }); + + const currentBranch = checkoutStatusQuery.data?.currentBranch ?? 
null; + + const branchSuggestionsQuery = useQuery({ + queryKey: ["branch-suggestions", serverId, sourceDirectory], + queryFn: async () => { + const connectedClient = withConnectedClient(); + return connectedClient.getBranchSuggestions({ cwd: sourceDirectory, limit: 20 }); + }, + enabled: isConnected && !!client, + }); + + const branchOptions: ComboboxOptionType[] = useMemo( + () => + (branchSuggestionsQuery.data?.branches ?? []).map((branch) => ({ + id: branch, + label: branch, + })), + [branchSuggestionsQuery.data?.branches], + ); + + const ensureWorkspace = useCallback(async () => { + if (createdWorkspace) { + return createdWorkspace; + } + + const connectedClient = withConnectedClient(); + const payload = await connectedClient.createPaseoWorktree({ + cwd: sourceDirectory, + worktreeSlug: createNameId(), + }); + + if (payload.error || !payload.workspace) { + throw new Error(payload.error ?? "Failed to create worktree"); + } + + const normalizedWorkspace = normalizeWorkspaceDescriptor(payload.workspace); + mergeWorkspaces(serverId, [normalizedWorkspace]); + setCreatedWorkspace(normalizedWorkspace); + return normalizedWorkspace; + }, [ + createdWorkspace, + mergeWorkspaces, + serverId, + sourceDirectory, + withConnectedClient, + ]); + + const handleCreateChatAgent = useCallback( + async ({ text, images }: MessagePayload) => { + try { + setPendingAction("chat"); + setErrorMessage(null); + const workspace = await ensureWorkspace(); + const connectedClient = withConnectedClient(); + if (!composerState) { + throw new Error("Composer state is required"); + } + + const encodedImages = await encodeImages(images); + const workspaceDirectory = requireWorkspaceExecutionAuthority({ workspace }).workspaceDirectory; + const agent = await connectedClient.createAgent({ + provider: composerState.selectedProvider, + cwd: workspaceDirectory, + workspaceId: workspace.id, + ...(composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? 
{ modeId: composerState.selectedMode } + : {}), + ...(composerState.effectiveModelId ? { model: composerState.effectiveModelId } : {}), + ...(composerState.effectiveThinkingOptionId + ? { thinkingOptionId: composerState.effectiveThinkingOptionId } + : {}), + ...(text.trim() ? { initialPrompt: text.trim() } : {}), + ...(encodedImages && encodedImages.length > 0 ? { images: encodedImages } : {}), + }); + + setAgents(serverId, (previous) => { + const next = new Map(previous); + next.set(agent.id, normalizeAgentSnapshot(agent, serverId)); + return next; + }); + navigateToPreparedWorkspaceTab({ + serverId, + workspaceId: workspace.id, + target: { kind: "agent", agentId: agent.id }, + navigationMethod: "replace", + }); + } catch (error) { + const message = toErrorMessage(error); + setErrorMessage(message); + toast.error(message); + } finally { + setPendingAction(null); + } + }, + [composerState, ensureWorkspace, serverId, setAgents, toast, withConnectedClient], + ); + + const workspaceTitle = + workspace?.name || + workspace?.projectDisplayName || + displayName || + sourceDirectory.split(/[\\/]/).filter(Boolean).pop() || + sourceDirectory; + + const addImagesRef = useRef<((images: ImageAttachment[]) => void) | null>(null); + const handleAddImagesCallback = useCallback((addImages: (images: ImageAttachment[]) => void) => { + addImagesRef.current = addImages; + }, []); + + return ( + + + + + + New workspace + + + {workspaceTitle} + + + + } + leftStyle={styles.headerLeft} + borderless + /> + + + + { + // No-op: screen navigates away on success, text should stay for retry on error + }} + autoFocus + commandDraftConfig={composerState?.commandDraftConfig} + statusControls={ + composerState + ? 
{ + ...composerState.statusControls, + disabled: pendingAction !== null, + } + : undefined + } + onAddImages={handleAddImagesCallback} + /> + + + setBranchPickerOpen(true)} + style={({ pressed, hovered }) => [ + styles.badge, + hovered && styles.badgeHovered, + pressed && styles.badgePressed, + ]} + accessibilityRole="button" + accessibilityLabel="Branch" + > + + + {selectedBranch ?? currentBranch ?? "main"} + + + + setSelectedBranch(id)} + searchable + searchPlaceholder="Search branches" + title="Branch" + open={branchPickerOpen} + onOpenChange={setBranchPickerOpen} + desktopPlacement="bottom-start" + anchorRef={branchAnchorRef} + renderOption={({ option, selected, active, onPress }) => ( + + } + /> + )} + /> + + + {errorMessage ? {errorMessage} : null} + + + + ); +} + +const styles = StyleSheet.create((theme) => ({ + container: { + flex: 1, + backgroundColor: theme.colors.surface0, + userSelect: "none", + }, + content: { + position: "relative", + flex: 1, + justifyContent: "center", + alignItems: "center", + paddingBottom: { + xs: HEADER_INNER_HEIGHT_MOBILE + HEADER_TOP_PADDING_MOBILE + theme.spacing[6], + md: HEADER_INNER_HEIGHT + theme.spacing[6], + }, + }, + centered: { + width: "100%", + maxWidth: MAX_CONTENT_WIDTH, + }, + headerLeft: { + gap: theme.spacing[2], + }, + headerTitleContainer: { + flexShrink: 1, + minWidth: 0, + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + }, + headerTitle: { + fontSize: theme.fontSize.base, + fontWeight: { + xs: "400", + md: "300", + }, + color: theme.colors.foreground, + flexShrink: 0, + }, + headerProjectTitle: { + color: theme.colors.foregroundMuted, + fontSize: theme.fontSize.base, + flexShrink: 1, + }, + errorText: { + fontSize: theme.fontSize.sm, + color: theme.colors.destructive, + lineHeight: 20, + }, + optionsRow: { + flexDirection: "row", + alignItems: "center", + gap: theme.spacing[2], + paddingHorizontal: theme.spacing[4] + theme.spacing[4] - 6, + marginTop: -theme.spacing[2], + }, + 
badge: {
+    flexDirection: "row",
+    alignItems: "center",
+    height: 28,
+    paddingHorizontal: theme.spacing[2],
+    borderRadius: theme.borderRadius["2xl"],
+    gap: theme.spacing[1],
+  },
+  badgeHovered: {
+    backgroundColor: theme.colors.surface2,
+  },
+  badgePressed: {
+    backgroundColor: theme.colors.surface0,
+  },
+  badgeText: {
+    fontSize: theme.fontSize.sm,
+    color: theme.colors.foregroundMuted,
+  },
+}));
diff --git a/packages/app/src/screens/settings-screen.tsx b/packages/app/src/screens/settings-screen.tsx
index 6227b429c..465402c1f 100644
--- a/packages/app/src/screens/settings-screen.tsx
+++ b/packages/app/src/screens/settings-screen.tsx
@@ -425,7 +425,6 @@ function AppearanceSection({ settings, handleThemeChange }: AppearanceSectionPro
   );
 }
-
 interface ProvidersSectionProps {
   routeServerId: string;
 }
@@ -1068,6 +1067,10 @@ export default function SettingsScreen() {
     handleThemeChange,
   };

+  const providersProps: ProvidersSectionProps = {
+    routeServerId,
+  };
+
   const diagnosticsProps: DiagnosticsSectionProps = {
     voiceAudioEngine,
     isPlaybackTestRunning,
@@ -1080,10 +1083,6 @@ export default function SettingsScreen() {
     isDesktopApp,
   };

-  const providersProps: ProvidersSectionProps = {
-    routeServerId,
-  };
-
   const sectionContentProps: Omit = {
     hostsProps,
     appearanceProps,
diff --git a/packages/app/src/screens/workspace/workspace-agent-visibility.test.ts b/packages/app/src/screens/workspace/workspace-agent-visibility.test.ts
index 63229fc11..7d03db8ac 100644
--- a/packages/app/src/screens/workspace/workspace-agent-visibility.test.ts
+++ b/packages/app/src/screens/workspace/workspace-agent-visibility.test.ts
@@ -79,7 +79,7 @@ describe("workspace agent visibility", () => {
     const result = deriveWorkspaceAgentVisibility({
       sessionAgents,
-      workspaceId,
+      workspaceDirectory: workspaceId,
     });

     expect(result.activeAgentIds).toEqual(new Set(["visible-agent"]));
@@ -151,13 +151,33 @@ describe("workspace agent visibility", () => {
     const result = deriveWorkspaceAgentVisibility({
       sessionAgents,
-      workspaceId: "/Users/moboudra/.paseo/worktrees/1luy0po7/normal-squid",
+      workspaceDirectory: "/Users/moboudra/.paseo/worktrees/1luy0po7/normal-squid",
     });

     expect(result.activeAgentIds).toEqual(new Set(["slash-agent"]));
     expect(result.knownAgentIds.has("slash-agent")).toBe(true);
   });

+  it("matches workspace agents using the workspace directory even when the route uses a numeric workspace id", () => {
+    const sessionAgents = new Map([
+      [
+        "recent-agent",
+        makeAgent({
+          id: "recent-agent",
+          cwd: "/tmp/workspace-lifecycle-main",
+        }),
+      ],
+    ]);
+
+    const result = deriveWorkspaceAgentVisibility({
+      sessionAgents,
+      workspaceDirectory: "/tmp/workspace-lifecycle-main",
+    });
+
+    expect(result.activeAgentIds).toEqual(new Set(["recent-agent"]));
+    expect(result.knownAgentIds).toEqual(new Set(["recent-agent"]));
+  });
+
   describe("workspaceAgentVisibilityEqual", () => {
     it("returns true for identical sets", () => {
       const a = { activeAgentIds: new Set(["a", "b"]), knownAgentIds: new Set(["a", "b", "c"]) };
diff --git a/packages/app/src/screens/workspace/workspace-agent-visibility.ts b/packages/app/src/screens/workspace/workspace-agent-visibility.ts
index 783b23a93..a15a899b3 100644
--- a/packages/app/src/screens/workspace/workspace-agent-visibility.ts
+++ b/packages/app/src/screens/workspace/workspace-agent-visibility.ts
@@ -12,11 +12,11 @@ export interface WorkspaceAgentVisibility {
 export function deriveWorkspaceAgentVisibility(input: {
   sessionAgents: Map | undefined;
-  workspaceId: string;
+  workspaceDirectory: string | null | undefined;
 }): WorkspaceAgentVisibility {
-  const { sessionAgents, workspaceId } = input;
-  const normalizedWorkspaceId = normalizeWorkspaceId(workspaceId);
-  if (!sessionAgents || !workspaceId) {
+  const { sessionAgents, workspaceDirectory } = input;
+  const normalizedWorkspaceDirectory = normalizeWorkspaceId(workspaceDirectory);
+  if (!sessionAgents || !normalizedWorkspaceDirectory) {
     return {
       activeAgentIds: new Set(),
       knownAgentIds: new Set(),
@@ -26,7 +26,7 @@ export function deriveWorkspaceAgentVisibility(input: {
   const activeAgentIds = new Set();
   const knownAgentIds = new Set();
   for (const agent of sessionAgents.values()) {
-    if (normalizeWorkspaceId(agent.cwd) !== normalizedWorkspaceId) {
+    if (normalizeWorkspaceId(agent.cwd) !== normalizedWorkspaceDirectory) {
       continue;
     }
     knownAgentIds.add(agent.id);
diff --git a/packages/app/src/screens/workspace/workspace-desktop-tabs-row.tsx b/packages/app/src/screens/workspace/workspace-desktop-tabs-row.tsx
index 110f01970..9be66a853 100644
--- a/packages/app/src/screens/workspace/workspace-desktop-tabs-row.tsx
+++ b/packages/app/src/screens/workspace/workspace-desktop-tabs-row.tsx
@@ -14,7 +14,6 @@ import {
   ArrowRightToLine,
   Columns2,
   Copy,
-  RotateCw,
   Rows2,
   SquarePen,
   SquareTerminal,
@@ -47,11 +46,6 @@ import type { WorkspaceTabDescriptor } from "@/screens/workspace/workspace-tabs-
 const DROPDOWN_WIDTH = 220;
 const LOADING_TAB_LABEL_SKELETON_WIDTH = 80;
-type NewTabOptionId = "__new_tab_agent__" | "__new_tab_terminal__";
-type NewTabSelection = {
-  optionId: NewTabOptionId;
-  paneId?: string;
-};
 
 export interface WorkspaceDesktopTabRowItem {
   tab: WorkspaceTabDescriptor;
@@ -76,10 +70,9 @@ type WorkspaceDesktopTabsRowProps = {
   onCloseTabsToLeft: (tabId: string) => Promise | void;
   onCloseTabsToRight: (tabId: string) => Promise | void;
   onCloseOtherTabs: (tabId: string) => Promise | void;
-  onSelectNewTabOption: (selection: NewTabSelection) => void;
-  newTabAgentOptionId: NewTabOptionId;
+  onCreateDraftTab: (input: { paneId?: string }) => void;
+  onCreateTerminalTab: (input: { paneId?: string }) => void;
   onReorderTabs: (nextTabs: WorkspaceTabDescriptor[]) => void;
-  onNewTerminalTab: (input: { paneId?: string }) => void;
   onSplitRight: () => void;
   onSplitDown: () => void;
   externalDndContext?: boolean;
@@ -92,6 +85,9 @@ function getFallbackTabLabel(tab: WorkspaceTabDescriptor): string {
   if (tab.target.kind === "draft") {
     return "New Agent";
   }
+  if (tab.target.kind === "setup") {
+    return "Setup";
+  }
   if (tab.target.kind === "terminal") {
     return "Terminal";
   }
@@ -293,14 +289,11 @@ function TabChip({
           disabled={entry.disabled}
           destructive={entry.destructive}
           onSelect={entry.onSelect}
-          tooltip={entry.tooltip}
           leading={(() => {
             const iconColor = theme.colors.foregroundMuted;
             switch (entry.icon) {
               case "copy":
                 return ;
-              case "rotate-cw":
-                return ;
               case "arrow-left-to-line":
                 return ;
               case "arrow-right-to-line":
@@ -344,10 +337,9 @@ export function WorkspaceDesktopTabsRow({
   onCloseTabsToLeft,
   onCloseTabsToRight,
   onCloseOtherTabs,
-  onSelectNewTabOption,
-  newTabAgentOptionId,
+  onCreateDraftTab,
+  onCreateTerminalTab,
   onReorderTabs,
-  onNewTerminalTab,
   onSplitRight,
   onSplitDown,
   externalDndContext = false,
@@ -356,8 +348,8 @@ export function WorkspaceDesktopTabsRow({
   showPaneSplitActions = true,
 }: WorkspaceDesktopTabsRowProps) {
   const { theme } = useUnistyles();
-  const newAgentTabKeys = useShortcutKeys("workspace-tab-new");
-  const newTerminalTabKeys = useShortcutKeys("workspace-terminal-new");
+  const newTabKeys = useShortcutKeys("workspace-tab-new");
+  const newTerminalKeys = useShortcutKeys("workspace-terminal-new");
   const splitRightKeys = useShortcutKeys("workspace-pane-split-right");
   const splitDownKeys = useShortcutKeys("workspace-pane-split-down");
   const [tabsContainerWidth, setTabsContainerWidth] = useState(0);
@@ -486,12 +478,7 @@ export function WorkspaceDesktopTabsRow({
-            onSelectNewTabOption({
-              optionId: newTabAgentOptionId,
-              paneId,
-            })
-          }
+          onPress={() => onCreateDraftTab({ paneId })}
           accessibilityRole="button"
           accessibilityLabel="New agent tab"
           style={({ hovered, pressed }) => [
@@ -504,15 +491,16 @@ export function WorkspaceDesktopTabsRow({
           New agent tab
-          {newAgentTabKeys ? (
-
+          {newTabKeys ? (
+
           ) : null}
 onNewTerminalTab({ paneId })}
+          testID="workspace-new-terminal"
+          onPress={() => onCreateTerminalTab({ paneId })}
           accessibilityRole="button"
           accessibilityLabel="New terminal tab"
           style={({ hovered, pressed }) => [
@@ -525,8 +513,8 @@ export function WorkspaceDesktopTabsRow({
           New terminal tab
-          {newTerminalTabKeys ? (
-
+          {newTerminalKeys ? (
+
           ) : null}
diff --git a/packages/app/src/screens/workspace/workspace-draft-agent-config.test.ts b/packages/app/src/screens/workspace/workspace-draft-agent-config.test.ts
new file mode 100644
index 000000000..0ff6fc5b7
--- /dev/null
+++ b/packages/app/src/screens/workspace/workspace-draft-agent-config.test.ts
@@ -0,0 +1,22 @@
+import { describe, expect, it } from "vitest";
+import { buildWorkspaceDraftAgentConfig } from "./workspace-draft-agent-config";
+
+describe("workspace-draft-agent-config", () => {
+  it("builds chat-only config for workspace draft agents", () => {
+    expect(
+      buildWorkspaceDraftAgentConfig({
+        provider: "codex",
+        cwd: "/tmp/project",
+        modeId: "auto",
+        model: "gpt-5.4",
+        thinkingOptionId: "high",
+      }),
+    ).toEqual({
+      provider: "codex",
+      cwd: "/tmp/project",
+      modeId: "auto",
+      model: "gpt-5.4",
+      thinkingOptionId: "high",
+    });
+  });
+});
diff --git a/packages/app/src/screens/workspace/workspace-draft-agent-config.ts b/packages/app/src/screens/workspace/workspace-draft-agent-config.ts
new file mode 100644
index 000000000..f3e9bf613
--- /dev/null
+++ b/packages/app/src/screens/workspace/workspace-draft-agent-config.ts
@@ -0,0 +1,19 @@
+import type { AgentSessionConfig } from "@server/server/agent/agent-sdk-types";
+
+export function buildWorkspaceDraftAgentConfig(input: {
+  provider: AgentSessionConfig["provider"];
+  cwd: string;
+  modeId?: string;
+  model?: string;
+  thinkingOptionId?: string;
+  featureValues?: Record;
+}): AgentSessionConfig {
+  return {
+    provider: input.provider,
+    cwd: input.cwd,
+    ...(input.modeId ? { modeId: input.modeId } : {}),
+    ...(input.model ? { model: input.model } : {}),
+    ...(input.thinkingOptionId ? { thinkingOptionId: input.thinkingOptionId } : {}),
+    ...(input.featureValues ? { featureValues: input.featureValues } : {}),
+  };
+}
diff --git a/packages/app/src/screens/workspace/workspace-draft-agent-tab.tsx b/packages/app/src/screens/workspace/workspace-draft-agent-tab.tsx
index c681c62eb..30ed45c23 100644
--- a/packages/app/src/screens/workspace/workspace-draft-agent-tab.tsx
+++ b/packages/app/src/screens/workspace/workspace-draft-agent-tab.tsx
@@ -1,23 +1,21 @@
-import { useCallback, useEffect, useMemo, useRef } from "react";
+import { useCallback, useMemo, useRef } from "react";
 import { Keyboard, Platform, ScrollView, Text, View } from "react-native";
 import { StyleSheet } from "react-native-unistyles";
-import { AgentInputArea } from "@/components/agent-input-area";
+import invariant from "tiny-invariant";
+import { Composer } from "@/components/composer";
 import { FileDropZone } from "@/components/file-drop-zone";
 import { AgentStreamView } from "@/components/agent-stream-view";
 import type { ImageAttachment } from "@/components/message-input";
-import { useAgentFormState } from "@/hooks/use-agent-form-state";
 import { useAgentInputDraft } from "@/hooks/use-agent-input-draft";
 import { useDraftAgentCreateFlow } from "@/hooks/use-draft-agent-create-flow";
-import { useDraftAgentFeatures } from "@/hooks/use-draft-agent-features";
 import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime";
+import { buildWorkspaceDraftAgentConfig } from "@/screens/workspace/workspace-draft-agent-config";
 import { buildDraftStoreKey } from "@/stores/draft-keys";
-import type { Agent } from "@/stores/session-store";
+import { type Agent, useSessionStore } from "@/stores/session-store";
 import { encodeImages } from "@/utils/encode-images";
+import { getWorkspaceExecutionAuthority } from "@/utils/workspace-execution";
 import { shouldAutoFocusWorkspaceDraftComposer } from
"@/screens/workspace/workspace-draft-pane-focus"; -import type { - AgentCapabilityFlags, - AgentSessionConfig, -} from "@server/server/agent/agent-sdk-types"; +import type { AgentCapabilityFlags } from "@server/server/agent/agent-sdk-types"; import type { AgentSnapshotPayload } from "@server/shared/messages"; const EMPTY_PENDING_PERMISSIONS = new Map(); @@ -51,79 +49,36 @@ export function WorkspaceDraftAgentTab({ }: WorkspaceDraftAgentTabProps) { const client = useHostRuntimeClient(serverId); const isConnected = useHostRuntimeIsConnected(serverId); + const workspaces = useSessionStore((state) => state.sessions[serverId]?.workspaces); + const workspaceAuthority = getWorkspaceExecutionAuthority({ workspaces, workspaceId }); + const workspaceExecutionAuthority = workspaceAuthority.ok ? workspaceAuthority.authority : null; + const workspaceDirectory = workspaceExecutionAuthority?.workspaceDirectory ?? null; const addImagesRef = useRef<((images: ImageAttachment[]) => void) | null>(null); + const draftStoreKey = useMemo( + () => + buildDraftStoreKey({ + serverId, + agentId: tabId, + draftId, + }), + [draftId, serverId, tabId], + ); const draftInput = useAgentInputDraft( - buildDraftStoreKey({ - serverId, - agentId: tabId, - draftId, - }), + { + draftKey: draftStoreKey, + composer: { + initialServerId: serverId, + initialValues: workspaceDirectory ? { workingDir: workspaceDirectory } : undefined, + isVisible: true, + onlineServerIds: isConnected ? [serverId] : [], + lockedWorkingDir: workspaceDirectory ?? 
undefined, + }, + }, ); - - const { - selectedProvider, - setProviderFromUser, - selectedMode, - setModeFromUser, - selectedModel, - setModelFromUser, - selectedThinkingOptionId, - setThinkingOptionFromUser, - workingDir, - setWorkingDir, - providerDefinitions, - modeOptions, - availableModels, - allProviderModels, - allProviderEntries, - isAllModelsLoading, - availableThinkingOptions, - isModelLoading, - setProviderAndModelFromUser, - persistFormPreferences, - } = useAgentFormState({ - initialServerId: serverId, - initialValues: { workingDir: workspaceId }, - isVisible: true, - isCreateFlow: true, - onlineServerIds: isConnected ? [serverId] : [], - }); - - // Lock working directory to workspace. - useEffect(() => { - if (workingDir.trim() === workspaceId.trim()) { - return; - } - setWorkingDir(workspaceId); - }, [setWorkingDir, workingDir, workspaceId]); - - const effectiveDraftModelId = useMemo(() => { - if (selectedModel.trim()) { - return selectedModel.trim(); - } - return availableModels.find((model) => model.isDefault)?.id ?? availableModels[0]?.id ?? ""; - }, [availableModels, selectedModel]); - - const effectiveDraftThinkingOptionId = useMemo(() => { - if (selectedThinkingOptionId.trim()) { - return selectedThinkingOptionId.trim(); - } - const selectedModelDefinition = - availableModels.find((model) => model.id === effectiveDraftModelId) ?? null; - return selectedModelDefinition?.defaultThinkingOptionId ?? 
""; - }, [availableModels, effectiveDraftModelId, selectedThinkingOptionId]); - const { - features: draftFeatures, - featureValues: draftFeatureValues, - setFeatureValue: setDraftFeatureValue, - } = useDraftAgentFeatures({ - serverId, - provider: selectedProvider, - cwd: workspaceId, - modeId: selectedMode, - modelId: effectiveDraftModelId, - thinkingOptionId: effectiveDraftThinkingOptionId, - }); + const composerState = draftInput.composerState; + if (!composerState) { + throw new Error("Workspace draft composer state is required"); + } const { formErrorMessage, @@ -138,36 +93,43 @@ export function WorkspaceDraftAgentTab({ if (!text.trim()) { return "Initial prompt is required"; } - if (providerDefinitions.length === 0) { + if (composerState.providerDefinitions.length === 0) { return "No available providers on the selected host"; } - if (isModelLoading) { + if (composerState.isModelLoading) { return "Model defaults are still loading"; } - if (!effectiveDraftModelId) { + if (!composerState.effectiveModelId) { return "No model is available for the selected provider"; } + if (!workspaceDirectory) { + return "Workspace directory not found"; + } if (!client) { return "Host is not connected"; } return null; }, onBeforeSubmit: () => { - void persistFormPreferences(); + void composerState.persistFormPreferences(); if (Platform.OS === "web") { (document.activeElement as HTMLElement | null)?.blur?.(); } Keyboard.dismiss(); }, buildDraftAgent: (attempt) => { + invariant(workspaceDirectory, "Workspace directory is required"); const now = attempt.timestamp; - const model = effectiveDraftModelId || null; - const thinkingOptionId = effectiveDraftThinkingOptionId || null; - const modeId = modeOptions.length > 0 && selectedMode !== "" ? 
selectedMode : null; + const model = composerState.effectiveModelId || null; + const thinkingOptionId = composerState.effectiveThinkingOptionId || null; + const modeId = + composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? composerState.selectedMode + : null; return { serverId, id: tabId, - provider: selectedProvider, + provider: composerState.selectedProvider, status: "running", createdAt: now, updatedAt: now, @@ -178,36 +140,38 @@ export function WorkspaceDraftAgentTab({ availableModes: [], pendingPermissions: [], persistence: null, - runtimeInfo: { provider: selectedProvider, sessionId: null, model, modeId }, + runtimeInfo: { provider: composerState.selectedProvider, sessionId: null, model, modeId }, title: "Agent", - cwd: workspaceId, + cwd: workspaceDirectory, model, - features: draftFeatures, + features: composerState.statusControls.features, thinkingOptionId, labels: {}, }; }, createRequest: async ({ attempt, text, images }) => { + invariant(workspaceDirectory, "Workspace directory is required"); + invariant(workspaceExecutionAuthority, "Workspace authority is required"); if (!client) { throw new Error("Host is not connected"); } - const modeId = modeOptions.length > 0 && selectedMode !== "" ? selectedMode : undefined; - const config: AgentSessionConfig = { - provider: selectedProvider, - cwd: workspaceId, - ...(modeId ? { modeId } : {}), - ...(effectiveDraftModelId ? { model: effectiveDraftModelId } : {}), - ...(effectiveDraftThinkingOptionId - ? { thinkingOptionId: effectiveDraftThinkingOptionId } + const config = buildWorkspaceDraftAgentConfig({ + provider: composerState.selectedProvider, + cwd: workspaceDirectory, + ...(composerState.modeOptions.length > 0 && composerState.selectedMode !== "" + ? { modeId: composerState.selectedMode } : {}), - ...(draftFeatureValues ? 
{ featureValues: draftFeatureValues } : {}), - }; + model: composerState.effectiveModelId || undefined, + thinkingOptionId: composerState.effectiveThinkingOptionId || undefined, + featureValues: composerState.featureValues, + }); const imagesData = await encodeImages(images); const result = await client.createAgent({ config, - initialPrompt: text, + workspaceId: workspaceExecutionAuthority.workspaceId, + ...(text ? { initialPrompt: text } : {}), clientMessageId: attempt.clientMessageId, ...(imagesData && imagesData.length > 0 ? { images: imagesData } : {}), }); @@ -222,27 +186,6 @@ export function WorkspaceDraftAgentTab({ }, }); - const draftCommandConfig = useMemo(() => { - return { - provider: selectedProvider, - cwd: workspaceId, - ...(modeOptions.length > 0 && selectedMode !== "" ? { modeId: selectedMode } : {}), - ...(effectiveDraftModelId ? { model: effectiveDraftModelId } : {}), - ...(effectiveDraftThinkingOptionId - ? { thinkingOptionId: effectiveDraftThinkingOptionId } - : {}), - ...(draftFeatureValues ? 
{ featureValues: draftFeatureValues } : {}), - }; - }, [ - draftFeatureValues, - effectiveDraftModelId, - effectiveDraftThinkingOptionId, - modeOptions.length, - selectedMode, - selectedProvider, - workspaceId, - ]); - const handleFilesDropped = useCallback((files: ImageAttachment[]) => { addImagesRef.current?.(files); }, []); @@ -258,51 +201,54 @@ export function WorkspaceDraftAgentTab({ }, []); const handleProviderSelectWithFocus = useCallback( - (provider: string) => { - setProviderFromUser(provider); + (provider: Parameters[0]) => { + composerState.setProviderFromUser(provider); focusInputRef.current?.(); }, - [setProviderFromUser], + [composerState], ); const handleModeSelectWithFocus = useCallback( (modeId: string) => { - setModeFromUser(modeId); + composerState.setModeFromUser(modeId); focusInputRef.current?.(); }, - [setModeFromUser], + [composerState], ); const handleModelSelectWithFocus = useCallback( (modelId: string) => { - setModelFromUser(modelId); + composerState.setModelFromUser(modelId); focusInputRef.current?.(); }, - [setModelFromUser], + [composerState], ); const handleProviderAndModelSelectWithFocus = useCallback( - (provider: string, modelId: string) => { - setProviderAndModelFromUser(provider, modelId); + ( + provider: Parameters[0], + modelId: string, + ) => { + composerState.setProviderAndModelFromUser(provider, modelId); focusInputRef.current?.(); }, - [setProviderAndModelFromUser], + [composerState], ); const handleThinkingOptionSelectWithFocus = useCallback( (optionId: string) => { - setThinkingOptionFromUser(optionId); + composerState.setThinkingOptionFromUser(optionId); focusInputRef.current?.(); }, - [setThinkingOptionFromUser], + [composerState], ); const handleSetFeatureWithFocus = useCallback( (featureId: string, value: unknown) => { - setDraftFeatureValue(featureId, value); + composerState.statusControls.onSetFeature?.(featureId, value); focusInputRef.current?.(); }, - [setDraftFeatureValue], + [composerState], ); return ( @@ 
-337,11 +283,12 @@ export function WorkspaceDraftAgentTab({ - focusInputRef.current?.(), disabled: isSubmitting, diff --git a/packages/app/src/screens/workspace/workspace-header-source.ts b/packages/app/src/screens/workspace/workspace-header-source.ts index 4316f103b..3c9029110 100644 --- a/packages/app/src/screens/workspace/workspace-header-source.ts +++ b/packages/app/src/screens/workspace/workspace-header-source.ts @@ -1,5 +1,4 @@ import type { WorkspaceDescriptor } from "@/stores/session-store"; -import { projectDisplayNameFromProjectId } from "@/utils/project-display-name"; export function resolveWorkspaceHeader(input: { workspace: WorkspaceDescriptor }): { title: string; @@ -7,7 +6,7 @@ export function resolveWorkspaceHeader(input: { workspace: WorkspaceDescriptor } } { return { title: input.workspace.name, - subtitle: projectDisplayNameFromProjectId(input.workspace.projectId), + subtitle: input.workspace.projectDisplayName, }; } diff --git a/packages/app/src/screens/workspace/workspace-pane-content.tsx b/packages/app/src/screens/workspace/workspace-pane-content.tsx index 2260ae10d..230cfb061 100644 --- a/packages/app/src/screens/workspace/workspace-pane-content.tsx +++ b/packages/app/src/screens/workspace/workspace-pane-content.tsx @@ -36,7 +36,7 @@ export function buildWorkspacePaneContentModel({ const registration = getPanelRegistration(tab.kind); invariant(registration, `No panel registration for kind: ${tab.kind}`); return { - key: `${normalizedServerId}:${normalizedWorkspaceId}:${tab.tabId}`, + key: `${normalizedServerId}:${normalizedWorkspaceId}:${tab.tabId}:${tab.kind}`, Component: registration.component, paneContextValue: { serverId: normalizedServerId, diff --git a/packages/app/src/screens/workspace/workspace-screen.tsx b/packages/app/src/screens/workspace/workspace-screen.tsx index 186d81283..dd96be66d 100644 --- a/packages/app/src/screens/workspace/workspace-screen.tsx +++ b/packages/app/src/screens/workspace/workspace-screen.tsx @@ -22,6 +22,7 @@ 
 import {
   EllipsisVertical,
   PanelRight,
   RotateCw,
+  Settings,
   SquarePen,
   SquareTerminal,
   X,
@@ -63,14 +64,13 @@ import type { WorkspaceTab, WorkspaceTabTarget } from "@/stores/workspace-tabs-s
 import { useKeyboardActionHandler } from "@/hooks/use-keyboard-action-handler";
 import type { KeyboardActionDefinition } from "@/keyboard/keyboard-action-dispatcher";
 import { useCreateFlowStore } from "@/stores/create-flow-store";
-import { decodeWorkspaceIdFromPathSegment } from "@/utils/host-routes";
-import { isAbsolutePath } from "@/utils/path";
-import { normalizeWorkspaceIdentity } from "@/utils/workspace-identity";
 import {
   normalizeWorkspaceTabTarget,
   workspaceTabTargetsEqual,
 } from "@/utils/workspace-tab-identity";
 import { useHostRuntimeClient, useHostRuntimeIsConnected } from "@/runtime/host-runtime";
+import { useProviderModels } from "@/hooks/use-provider-models";
+import { useWorkspaceSetupStore } from "@/stores/workspace-setup-store";
 import { useWorkspaceTerminalSessionRetention } from "@/terminal/hooks/use-workspace-terminal-session-retention";
 import {
   checkoutStatusQueryKey,
@@ -83,6 +83,10 @@ import { useArchiveAgent } from "@/hooks/use-archive-agent";
 import { useStableEvent } from "@/hooks/use-stable-event";
 import { buildProviderCommand } from "@/utils/provider-command-templates";
 import { generateDraftId } from "@/stores/draft-keys";
+import {
+  resolveWorkspaceExecutionAuthority,
+  resolveWorkspaceRouteId,
+} from "@/utils/workspace-execution";
 import {
   WorkspaceTabPresentationResolver,
   WorkspaceTabIcon,
@@ -100,7 +104,6 @@ import {
 } from "@/screens/workspace/workspace-header-source";
 import {
   deriveWorkspaceAgentVisibility,
-  shouldPruneWorkspaceAgentTab,
   workspaceAgentVisibilityEqual,
 } from "@/screens/workspace/workspace-agent-visibility";
 import { deriveWorkspacePaneState } from "@/screens/workspace/workspace-pane-state";
@@ -119,11 +122,8 @@ import { findAdjacentPane } from "@/utils/split-navigation";
 import { isCompactFormFactor, supportsDesktopPaneSplits } from "@/constants/layout";
 
 const TERMINALS_QUERY_STALE_TIME = 5_000;
-const NEW_TAB_AGENT_OPTION_ID = "__new_tab_agent__";
-const NEW_TAB_TERMINAL_OPTION_ID = "__new_tab_terminal__";
-type NewTabOptionId = typeof NEW_TAB_AGENT_OPTION_ID | typeof NEW_TAB_TERMINAL_OPTION_ID;
+const WORKSPACE_SETUP_AUTO_OPEN_WINDOW_MS = 30_000;
 const EMPTY_UI_TABS: WorkspaceTab[] = [];
-const EMPTY_PINNED_AGENT_IDS = new Set();
 const EMPTY_SET = new Set();
 
 type WorkspaceScreenProps = {
@@ -151,6 +151,9 @@ function getFallbackTabOptionLabel(tab: WorkspaceTabDescriptor): string {
   if (tab.target.kind === "draft") {
     return "New Agent";
   }
+  if (tab.target.kind === "setup") {
+    return "Setup";
+  }
   if (tab.target.kind === "terminal") {
     return "Terminal";
   }
@@ -164,6 +167,9 @@ function getFallbackTabOptionDescription(tab: WorkspaceTabDescriptor): string {
   if (tab.target.kind === "draft") {
     return "New Agent";
   }
+  if (tab.target.kind === "setup") {
+    return "Workspace setup";
+  }
   if (tab.target.kind === "agent") {
     return "Agent";
   }
@@ -595,8 +601,18 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const isFocusModeEnabled = usePanelStore((state) => state.desktop.focusModeEnabled);
 
   const normalizedServerId = trimNonEmpty(decodeSegment(serverId)) ?? "";
+
+  // Prefetch provider models early so the model picker is warm by the time it opens
+  useProviderModels(normalizedServerId);
+
   const normalizedWorkspaceId =
-    normalizeWorkspaceIdentity(decodeWorkspaceIdFromPathSegment(workspaceId)) ?? "";
+    resolveWorkspaceRouteId({
+      routeWorkspaceId: workspaceId,
+    }) ?? "";
+  const sessionWorkspaces = useSessionStore(
+    (state) => state.sessions[normalizedServerId]?.workspaces,
+  );
+
   const workspaceTerminalScopeKey =
     normalizedServerId && normalizedWorkspaceId
       ? `${normalizedServerId}:${normalizedWorkspaceId}`
@@ -608,43 +624,53 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const queryClient = useQueryClient();
   const client = useHostRuntimeClient(normalizedServerId);
   const isConnected = useHostRuntimeIsConnected(normalizedServerId);
+  const workspaceDescriptor = sessionWorkspaces?.get(normalizedWorkspaceId) ?? null;
+  const workspaceAuthority = useMemo(
+    () =>
+      resolveWorkspaceExecutionAuthority({
+        workspaces: sessionWorkspaces,
+        workspaceId: normalizedWorkspaceId,
+      }),
+    [normalizedWorkspaceId, sessionWorkspaces],
+  );
+  const workspaceDirectory = workspaceAuthority?.workspaceDirectory ?? null;
+  const isMissingWorkspaceExecutionAuthority = Boolean(workspaceDescriptor && !workspaceAuthority);
 
   const workspaceAgentVisibility = useStoreWithEqualityFn(
     useSessionStore,
     (state) =>
       deriveWorkspaceAgentVisibility({
         sessionAgents: state.sessions[normalizedServerId]?.agents,
-        workspaceId: normalizedWorkspaceId,
+        workspaceDirectory,
       }),
     workspaceAgentVisibilityEqual,
   );
 
   const terminalsQueryKey = useMemo(
-    () => ["terminals", normalizedServerId, normalizedWorkspaceId] as const,
-    [normalizedServerId, normalizedWorkspaceId],
+    () => ["terminals", normalizedServerId, workspaceDirectory] as const,
+    [normalizedServerId, workspaceDirectory],
   );
 
   type ListTerminalsPayload = ListTerminalsResponse["payload"];
   const terminalsQuery = useQuery({
     queryKey: terminalsQueryKey,
     enabled:
       Boolean(client && isConnected) &&
-      normalizedWorkspaceId.length > 0 &&
-      isAbsolutePath(normalizedWorkspaceId),
+      Boolean(workspaceDirectory),
     queryFn: async () => {
-      if (!client) {
+      if (!client || !workspaceDirectory) {
         throw new Error("Host is not connected");
       }
-      return await client.listTerminals(normalizedWorkspaceId);
+      return await client.listTerminals(workspaceDirectory);
     },
     staleTime: TERMINALS_QUERY_STALE_TIME,
   });
   const terminals = terminalsQuery.data?.terminals ?? [];
 
   const createTerminalMutation = useMutation({
     mutationFn: async (input?: { paneId?: string }) => {
-      if (!client) {
+      if (!client || !workspaceDirectory) {
         throw new Error("Host is not connected");
       }
-      return await client.createTerminal(normalizedWorkspaceId);
+      return await client.createTerminal(workspaceDirectory);
     },
     onSuccess: (payload, input) => {
       const createdTerminal = payload.terminal;
@@ -654,8 +680,9 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
           terminals: current?.terminals ?? [],
           terminal: createdTerminal,
         });
+        const cwd = current?.cwd ?? workspaceDirectory;
         return {
-          cwd: current?.cwd ?? normalizedWorkspaceId,
+          ...(cwd ? { cwd } : {}),
           terminals: nextTerminals,
           requestId: current?.requestId ?? `terminal-create-${createdTerminal.id}`,
         };
@@ -698,7 +725,7 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const { archiveAgent } = useArchiveAgent();
 
   useEffect(() => {
-    if (!client || !isConnected || !isAbsolutePath(normalizedWorkspaceId)) {
+    if (!client || !isConnected || !workspaceDirectory) {
       return;
     }
@@ -706,7 +733,7 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
       if (message.type !== "terminals_changed") {
         return;
       }
-      if (message.payload.cwd !== normalizedWorkspaceId) {
+      if (message.payload.cwd !== workspaceDirectory) {
         return;
       }
@@ -717,32 +744,30 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
       }));
     });
 
-    client.subscribeTerminals({ cwd: normalizedWorkspaceId });
+    client.subscribeTerminals({ cwd: workspaceDirectory });
 
     return () => {
       unsubscribeChanged();
-      client.unsubscribeTerminals({ cwd: normalizedWorkspaceId });
+      client.unsubscribeTerminals({ cwd: workspaceDirectory });
     };
-  }, [client, isConnected, normalizedWorkspaceId, queryClient, terminalsQueryKey]);
+  }, [client, isConnected, queryClient, terminalsQueryKey, workspaceDirectory]);
 
   const checkoutQuery = useQuery({
-    queryKey: checkoutStatusQueryKey(normalizedServerId, normalizedWorkspaceId),
+    queryKey: checkoutStatusQueryKey(
+      normalizedServerId,
+      workspaceDirectory ?? `missing-workspace-directory:${normalizedWorkspaceId}`,
+    ),
     enabled:
       Boolean(client && isConnected) &&
-      normalizedWorkspaceId.length > 0 &&
-      isAbsolutePath(normalizedWorkspaceId),
+      Boolean(workspaceDirectory),
     queryFn: async () => {
-      if (!client) {
+      if (!client || !workspaceDirectory) {
         throw new Error("Host is not connected");
       }
-      return (await client.getCheckoutStatus(normalizedWorkspaceId)) as CheckoutStatusPayload;
+      return (await client.getCheckoutStatus(workspaceDirectory)) as CheckoutStatusPayload;
     },
     staleTime: 15_000,
   });
-
-  const workspaceDescriptor = useSessionStore(
-    (state) => state.sessions[normalizedServerId]?.workspaces.get(normalizedWorkspaceId) ?? null,
-  );
   const hasHydratedWorkspaces = useSessionStore(
     (state) => state.sessions[normalizedServerId]?.hasHydratedWorkspaces ?? false,
   );
@@ -772,15 +797,15 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const isExplorerOpen = isMobile ? mobileView === "file-explorer" : desktopFileExplorerOpen;
 
   const activeExplorerCheckout = useMemo(() => {
-    if (!normalizedServerId || !isAbsolutePath(normalizedWorkspaceId)) {
+    if (!normalizedServerId || !workspaceDirectory) {
       return null;
     }
     return {
       serverId: normalizedServerId,
-      cwd: normalizedWorkspaceId,
+      cwd: workspaceDirectory,
       isGit: isGitCheckout,
     };
-  }, [isGitCheckout, normalizedServerId, normalizedWorkspaceId]);
+  }, [isGitCheckout, normalizedServerId, workspaceDirectory]);
 
   useEffect(() => {
     setActiveExplorerCheckout(activeExplorerCheckout);
@@ -835,6 +860,9 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const workspaceLayout = useWorkspaceLayoutStore((state) =>
     persistenceKey ? (state.layoutByWorkspace[persistenceKey] ?? null) : null,
   );
+  const workspaceSetupSnapshot = useWorkspaceSetupStore((state) =>
+    persistenceKey ?
 state.snapshots[persistenceKey] ?? null : null,
+  );
   const uiTabs = useMemo(
     () => (workspaceLayout ? collectAllTabs(workspaceLayout.root) : EMPTY_UI_TABS),
     [workspaceLayout],
@@ -842,8 +870,10 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const openWorkspaceTab = useWorkspaceLayoutStore((state) => state.openTab);
   const focusWorkspaceTab = useWorkspaceLayoutStore((state) => state.focusTab);
   const closeWorkspaceTab = useWorkspaceLayoutStore((state) => state.closeTab);
-  const unpinWorkspaceAgent = useWorkspaceLayoutStore((state) => state.unpinAgent);
   const retargetWorkspaceTab = useWorkspaceLayoutStore((state) => state.retargetTab);
+  const convertWorkspaceDraftToAgent = useWorkspaceLayoutStore((state) => state.convertDraftToAgent);
+  const reconcileWorkspaceTabs = useWorkspaceLayoutStore((state) => state.reconcileTabs);
+  const unpinWorkspaceAgent = useWorkspaceLayoutStore((state) => state.unpinAgent);
   const splitWorkspacePane = useWorkspaceLayoutStore((state) => state.splitPane);
   const splitWorkspacePaneEmpty = useWorkspaceLayoutStore((state) => state.splitPaneEmpty);
   const moveWorkspaceTabToPane = useWorkspaceLayoutStore((state) => state.moveTabToPane);
@@ -851,11 +881,6 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const paneFocusSuppressedRef = useRef(false);
   const resizeWorkspaceSplit = useWorkspaceLayoutStore((state) => state.resizeSplit);
   const reorderWorkspaceTabsInPane = useWorkspaceLayoutStore((state) => state.reorderTabsInPane);
-  const pinnedAgentIds = useWorkspaceLayoutStore((state) =>
-    persistenceKey
-      ? (state.pinnedAgentIdsByWorkspace[persistenceKey] ?? EMPTY_PINNED_AGENT_IDS)
-      : EMPTY_PINNED_AGENT_IDS,
-  );
   const pendingByDraftId = useCreateFlowStore((state) => state.pendingByDraftId);
   const { closingTabIds, closeTab } = useCloseTabs();
   const closeWorkspaceTabWithCleanup = useCallback(
@@ -959,7 +984,6 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
       return;
     }
 
-    const terminalIds = new Set(terminals.map((terminal) => terminal.id));
     const hasActivePendingDraftCreateInWorkspace = uiTabs.some((tab) => {
       if (tab.target.kind !== "draft") {
         return false;
       }
       return pending?.serverId === normalizedServerId && pending.lifecycle === "active";
     });
 
-    for (const agentId of workspaceAgentVisibility.activeAgentIds) {
-      const representedByTarget = uiTabs.some(
-        (tab) => tab.target.kind === "agent" && tab.target.agentId === agentId,
-      );
-      const representedByDeterministicTabId = uiTabs.some(
-        (tab) => tab.tabId === `agent_${agentId}`,
-      );
-      if (
-        hasActivePendingDraftCreateInWorkspace &&
-        !representedByTarget &&
-        !representedByDeterministicTabId
-      ) {
-        continue;
-      }
-      ensureWorkspaceTab({ kind: "agent", agentId });
-    }
-    for (const terminal of terminals) {
-      ensureWorkspaceTab({ kind: "terminal", terminalId: terminal.id });
-    }
-
-    const canPruneAgentTabs = hasHydratedAgents;
-    const canPruneTerminalTabs = terminalsQuery.isSuccess;
-    for (const tab of uiTabs) {
-      if (
-        canPruneAgentTabs &&
-        tab.target.kind === "agent" &&
-        !pinnedAgentIds.has(tab.target.agentId) &&
-        shouldPruneWorkspaceAgentTab({
-          agentId: tab.target.agentId,
-          agentsHydrated: hasHydratedAgents,
-          knownAgentIds: workspaceAgentVisibility.knownAgentIds,
-          activeAgentIds: workspaceAgentVisibility.activeAgentIds,
-        })
-      ) {
-        closeWorkspaceTabWithCleanup({ tabId: tab.tabId, target: tab.target });
-      }
-      if (
-        canPruneTerminalTabs &&
-        tab.target.kind === "terminal" &&
-        !terminalIds.has(tab.target.terminalId)
-      ) {
-        closeWorkspaceTabWithCleanup({ tabId: tab.tabId, target: tab.target });
-      }
-    }
+    reconcileWorkspaceTabs(persistenceKey, {
+      agentsHydrated: hasHydratedAgents,
+      terminalsHydrated: terminalsQuery.isSuccess,
+      activeAgentIds: workspaceAgentVisibility.activeAgentIds,
+      knownAgentIds: workspaceAgentVisibility.knownAgentIds,
+      standaloneTerminalIds: terminals.map((terminal) => terminal.id),
+      hasActivePendingDraftCreate: hasActivePendingDraftCreateInWorkspace,
+    });
   }, [
-    closeWorkspaceTabWithCleanup,
-    ensureWorkspaceTab,
     hasHydratedAgents,
     pendingByDraftId,
-    pinnedAgentIds,
     persistenceKey,
+    reconcileWorkspaceTabs,
     terminals,
     terminalsQuery.isSuccess,
     uiTabs,
@@ -1028,17 +1014,18 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   const activeTabId = focusedPaneTabState.activeTabId;
   const activeTab = focusedPaneTabState.activeTab;
 
-  useEffect(() => {
-    if (!activeTabId || !persistenceKey) {
-      return;
-    }
-    focusWorkspaceTab(persistenceKey, activeTabId);
-  }, [activeTabId, focusWorkspaceTab, persistenceKey]);
-
   const tabs = useMemo(
     () => focusedPaneTabState.tabs.map((tab) => tab.descriptor),
     [focusedPaneTabState.tabs],
   );
+  const hasSetupTab = useMemo(
+    () =>
+      uiTabs.some(
+        (tab) =>
+          tab.target.kind === "setup" && tab.target.workspaceId === normalizedWorkspaceId,
+      ),
+    [normalizedWorkspaceId, uiTabs],
+  );
 
   const navigateToTabId = useCallback(
     function navigateToTabId(tabId: string) {
@@ -1051,6 +1038,7 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   );
 
   const emptyWorkspaceSeedRef = useRef(null);
+  const autoOpenedSetupTabWorkspaceRef = useRef(null);
   useEffect(() => {
     if (!persistenceKey) {
       return;
@@ -1079,6 +1067,56 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
     workspaceAgentVisibility.activeAgentIds.size,
   ]);
 
+  useEffect(() => {
+    if (!persistenceKey) {
+      return;
+    }
+    if (!workspaceSetupSnapshot) {
+      if (autoOpenedSetupTabWorkspaceRef.current === persistenceKey) {
+        autoOpenedSetupTabWorkspaceRef.current = null;
+      }
+      return;
+    }
+
+    const snapshotAge = Date.now() - workspaceSetupSnapshot.updatedAt;
+    const shouldAutoOpen =
+      workspaceSetupSnapshot.status === "running" ||
+      snapshotAge <= WORKSPACE_SETUP_AUTO_OPEN_WINDOW_MS;
+    if (!shouldAutoOpen) {
+      return;
+    }
+    if (hasSetupTab) {
+      autoOpenedSetupTabWorkspaceRef.current = persistenceKey;
+      return;
+    }
+    if (autoOpenedSetupTabWorkspaceRef.current === persistenceKey) {
+      return;
+    }
+
+    const target = normalizeWorkspaceTabTarget({
+      kind: "setup",
+      workspaceId: normalizedWorkspaceId,
+    });
+    if (!target) {
+      return;
+    }
+
+    const tabId = openWorkspaceTab(persistenceKey, target);
+    if (!tabId) {
+      return;
+    }
+
+    focusWorkspaceTab(persistenceKey, tabId);
+    autoOpenedSetupTabWorkspaceRef.current = persistenceKey;
+  }, [
+    focusWorkspaceTab,
+    hasSetupTab,
+    normalizedWorkspaceId,
+    openWorkspaceTab,
+    persistenceKey,
+    workspaceSetupSnapshot,
+  ]);
+
   const handleOpenFileFromExplorer = useCallback(
     function handleOpenFileFromExplorer(filePath: string) {
       if (isMobile) {
@@ -1142,21 +1180,28 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
     [tabs],
   );
 
-  const handleCreateDraftTab = useCallback(() => {
-    openWorkspaceDraftTab();
-  }, [openWorkspaceDraftTab]);
+  const handleCreateDraftTab = useCallback(
+    (input?: { paneId?: string }) => {
+      if (input?.paneId && persistenceKey) {
+        focusWorkspacePane(persistenceKey, input.paneId);
+      }
+      openWorkspaceDraftTab();
+    },
+    [focusWorkspacePane, openWorkspaceDraftTab, persistenceKey],
+  );
 
   const handleCreateTerminal = useCallback(
     (input?: { paneId?: string }) => {
       if (createTerminalMutation.isPending) {
         return;
       }
-      if (!isAbsolutePath(normalizedWorkspaceId)) {
+      if (!workspaceDirectory) {
         return;
       }
+
       createTerminalMutation.mutate(input);
     },
-    [createTerminalMutation, normalizedWorkspaceId],
+    [createTerminalMutation, workspaceDirectory],
   );
 
   const handleSelectSwitcherTab = useCallback(
@@ -1166,20 +1211,6 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
     [navigateToTabId],
   );
 
-  const handleSelectNewTabOption = useCallback(
-    (selection: { optionId: NewTabOptionId; paneId?: string }) => {
-      if (selection.paneId && persistenceKey) {
-        focusWorkspacePane(persistenceKey, selection.paneId);
-      }
-      if (selection.optionId === NEW_TAB_AGENT_OPTION_ID) {
-        handleCreateDraftTab();
-      } else if (selection.optionId === NEW_TAB_TERMINAL_OPTION_ID) {
-        handleCreateTerminal({ paneId: selection.paneId });
-      }
-    },
-    [focusWorkspacePane, handleCreateDraftTab, handleCreateTerminal, persistenceKey],
-  );
-
   const handleCreateDraftSplit = useCallback(
     (input: { targetPaneId: string; position: "left" | "right" | "top" | "bottom" }) => {
       if (!persistenceKey) {
@@ -1191,10 +1222,9 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
         return;
       }
 
-      focusWorkspacePane(persistenceKey, paneId);
-      openWorkspaceDraftTab();
+      handleCreateDraftTab({ paneId });
     },
-    [focusWorkspacePane, openWorkspaceDraftTab, persistenceKey, splitWorkspacePaneEmpty],
+    [handleCreateDraftTab, persistenceKey, splitWorkspacePaneEmpty],
   );
 
   const killTerminalAsync = killTerminalMutation.mutateAsync;
@@ -1257,11 +1287,12 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
       const agent =
         useSessionStore.getState().sessions[normalizedServerId]?.agents?.get(agentId) ?? null;
+      const isRunning = agent?.status === "running" || agent?.status === "initializing";
 
-      if (agent?.status !== "idle") {
+      if (isRunning) {
         const confirmed = await confirmDialog({
-          title: "Archive agent?",
-          message: "This closes the tab and archives the agent.",
+          title: "Archive running agent?",
+          message: "This agent is still running. Archiving it will stop the agent and close the tab.",
           confirmLabel: "Archive",
           cancelLabel: "Cancel",
           destructive: true,
@@ -1379,18 +1410,18 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
   );
 
   const handleCopyWorkspacePath = useCallback(async () => {
-    if (!isAbsolutePath(normalizedWorkspaceId)) {
+    if (!workspaceDirectory) {
       toast.error("Workspace path not available");
       return;
     }
     try {
-      await Clipboard.setStringAsync(normalizedWorkspaceId);
+      await Clipboard.setStringAsync(workspaceDirectory);
       toast.copied("Workspace path");
     } catch {
       toast.error("Copy failed");
     }
-  }, [normalizedWorkspaceId, toast]);
+  }, [toast, workspaceDirectory]);
 
   const handleCopyBranchName = useCallback(async () => {
     if (!currentBranchName) {
@@ -1406,6 +1437,23 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
     }
   }, [currentBranchName, toast]);
 
+  const handleOpenSetupTab = useCallback(() => {
+    if (!persistenceKey) {
+      return;
+    }
+    const target = normalizeWorkspaceTabTarget({
+      kind: "setup",
+      workspaceId: normalizedWorkspaceId,
+    });
+    if (!target) {
+      return;
+    }
+    const tabId = openWorkspaceTab(persistenceKey, target);
+    if (tabId) {
+      focusWorkspaceTab(persistenceKey, tabId);
+    }
+  }, [focusWorkspaceTab, normalizedWorkspaceId, openWorkspaceTab, persistenceKey]);
+
   const handleBulkCloseTabs = useCallback(
     async (input: { tabsToClose: WorkspaceTabDescriptor[]; title: string; logLabel: string }) => {
       const { tabsToClose, title, logLabel } = input;
@@ -1553,7 +1601,14 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps)
         return false;
       }
     },
-    [activeTabId, handleCloseTabById, handleCreateDraftTab, handleCreateTerminal, navigateToTabId, tabs],
+    [
+      activeTabId,
+      handleCloseTabById,
+      handleCreateDraftTab,
+      handleCreateTerminal,
+      navigateToTabId,
+      tabs,
+    ],
   );
 
   const handleWorkspacePaneAction = useCallback(
@@ -1731,6 +1786,10 @@ function WorkspaceScreenContent({ serverId, workspaceId }:
WorkspaceScreenProps) if (!persistenceKey) { return; } + if (input.tab.kind === "draft" && target.kind === "agent") { + convertWorkspaceDraftToAgent(persistenceKey, input.tab.tabId, target.agentId); + return; + } retargetWorkspaceTab(persistenceKey, input.tab.tabId, target); }, onOpenWorkspaceFile: (filePath) => { @@ -1749,6 +1808,7 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) normalizedWorkspaceId, openWorkspaceTab, persistenceKey, + convertWorkspaceDraftToAgent, retargetWorkspaceTab, ], ); @@ -1786,6 +1846,12 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) + ) : isMissingWorkspaceExecutionAuthority ? ( + + + Workspace execution directory is missing. Reload workspace data before opening tabs. + + ) : !activeTabDescriptor ? ( !hasHydratedAgents ? ( @@ -2004,7 +2070,7 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) } - disabled={!isAbsolutePath(normalizedWorkspaceId)} + disabled={!workspaceDirectory} onSelect={handleCopyWorkspacePath} > Copy workspace path @@ -2018,6 +2084,14 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) Copy branch name ) : null} + + } + onSelect={handleOpenSetupTab} + > + Show setup + @@ -2033,10 +2107,12 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) ) : null} {!isMobile && isGitCheckout ? ( <> - + {workspaceDirectory ? 
( + + ) : null} {}} onSplitDown={() => {}} showPaneSplitActions={false} @@ -2225,11 +2300,10 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) onCloseTabsToLeft={handleCloseTabsToLeftInPane} onCloseTabsToRight={handleCloseTabsToRightInPane} onCloseOtherTabs={handleCloseOtherTabsInPane} - onSelectNewTabOption={handleSelectNewTabOption} - newTabAgentOptionId={NEW_TAB_AGENT_OPTION_ID} + onCreateDraftTab={handleCreateDraftTab} + onCreateTerminalTab={handleCreateTerminal} buildPaneContentModel={buildDesktopPaneContentModel} onFocusPane={handleFocusPane} - onNewTerminalTab={handleCreateTerminal} onSplitPane={handleSplitPane} onSplitPaneEmpty={handleCreateDraftSplit} onMoveTabToPane={handleMoveTabToPane} @@ -2246,13 +2320,15 @@ function WorkspaceScreenContent({ serverId, workspaceId }: WorkspaceScreenProps) {(!isFocusModeEnabled || isMobile) && ( - + workspaceDirectory ? ( + + ) : null )} diff --git a/packages/app/src/screens/workspace/workspace-source-of-truth.test.ts b/packages/app/src/screens/workspace/workspace-source-of-truth.test.ts index 84562c96d..e17e9a940 100644 --- a/packages/app/src/screens/workspace/workspace-source-of-truth.test.ts +++ b/packages/app/src/screens/workspace/workspace-source-of-truth.test.ts @@ -13,12 +13,14 @@ describe("workspace source of truth consumption", () => { projectId: "remote:github.com/getpaseo/paseo", projectDisplayName: "getpaseo/paseo", projectRootPath: "/repo/main", + workspaceDirectory: "/repo/main", projectKind: "git", - workspaceKind: "local_checkout", + workspaceKind: "checkout", name: "feat/workspace-sot", status: "running", activityAt: new Date("2026-03-01T00:00:00.000Z"), diffStat: null, + services: [], }; const header = resolveWorkspaceHeader({ workspace }); diff --git a/packages/app/src/screens/workspace/workspace-tab-menu.ts b/packages/app/src/screens/workspace/workspace-tab-menu.ts index 1b31729a0..08e25f2c2 100644 --- a/packages/app/src/screens/workspace/workspace-tab-menu.ts +++ 
b/packages/app/src/screens/workspace/workspace-tab-menu.ts @@ -87,6 +87,9 @@ function getCloseButtonTestId(tab: WorkspaceTabDescriptor): string { if (tab.target.kind === "draft") { return `workspace-draft-close-${tab.target.draftId}`; } + if (tab.target.kind === "setup") { + return `workspace-setup-close-${encodeFilePathForPathSegment(tab.target.workspaceId)}`; + } return `workspace-file-close-${encodeFilePathForPathSegment(tab.target.path)}`; } diff --git a/packages/app/src/screens/workspace/workspace-tab-presentation.tsx b/packages/app/src/screens/workspace/workspace-tab-presentation.tsx index 694639e14..c5eb42922 100644 --- a/packages/app/src/screens/workspace/workspace-tab-presentation.tsx +++ b/packages/app/src/screens/workspace/workspace-tab-presentation.tsx @@ -49,7 +49,7 @@ export function WorkspaceTabPresentationResolver({ return ( { + const storage = new Map(); + return { + default: { + getItem: vi.fn(async (key: string) => storage.get(key) ?? null), + setItem: vi.fn(async (key: string, value: string) => { + storage.set(key, value); + }), + removeItem: vi.fn(async (key: string) => { + storage.delete(key); + }), + }, + }; +}); + +import { AGENT_PROVIDER_DEFINITIONS } from "@server/server/agent/provider-manifest"; +import { + __providerRecencyStoreTestUtils, + sortProvidersByRecency, + useProviderRecencyStore, +} from "./provider-recency-store"; + +describe("provider-recency-store", () => { + beforeEach(() => { + useProviderRecencyStore.setState({ + recentProviderIds: [], + recordUsage: useProviderRecencyStore.getState().recordUsage, + }); + }); + + it("sorts used providers first and keeps unused providers in default order", () => { + const sorted = sortProvidersByRecency(AGENT_PROVIDER_DEFINITIONS, ["codex"]); + + expect(sorted.map((provider) => provider.id)).toEqual([ + "codex", + ...AGENT_PROVIDER_DEFINITIONS.filter((provider) => provider.id !== "codex").map( + (provider) => provider.id, + ), + ]); + }); + + it("moves the latest provider to the front 
without duplicating prior entries", () => { + useProviderRecencyStore.getState().recordUsage("codex"); + useProviderRecencyStore.getState().recordUsage("opencode"); + useProviderRecencyStore.getState().recordUsage("codex"); + + expect(useProviderRecencyStore.getState().recentProviderIds).toEqual([ + "codex", + "opencode", + ]); + }); + + it("filters invalid and duplicate providers during migration", () => { + expect( + __providerRecencyStoreTestUtils.migratePersistedState({ + recentProviderIds: ["codex", "invalid", "codex", "claude"], + }), + ).toEqual({ + recentProviderIds: ["codex", "claude"], + }); + }); +}); diff --git a/packages/app/src/stores/provider-recency-store.ts b/packages/app/src/stores/provider-recency-store.ts new file mode 100644 index 000000000..7ffd8a82f --- /dev/null +++ b/packages/app/src/stores/provider-recency-store.ts @@ -0,0 +1,129 @@ +import { useMemo } from "react"; +import AsyncStorage from "@react-native-async-storage/async-storage"; +import type { AgentProvider } from "@server/server/agent/agent-sdk-types"; +import { + AGENT_PROVIDER_DEFINITIONS, + isValidAgentProvider, + type AgentProviderDefinition, +} from "@server/server/agent/provider-manifest"; +import { create } from "zustand"; +import { createJSONStorage, persist } from "zustand/middleware"; + +const PROVIDER_RECENCY_STORE_VERSION = 1; + +interface ProviderRecencyStoreState { + recentProviderIds: AgentProvider[]; + recordUsage: (providerId: AgentProvider) => void; +} + +function sanitizeRecentProviderIds(providerIds: readonly string[] | undefined): AgentProvider[] { + if (!providerIds || providerIds.length === 0) { + return []; + } + + const seen = new Set(); + const sanitized: AgentProvider[] = []; + for (const providerId of providerIds) { + if (!isValidAgentProvider(providerId)) { + continue; + } + if (seen.has(providerId)) { + continue; + } + seen.add(providerId); + sanitized.push(providerId); + } + return sanitized; +} + +export function sortProvidersByRecency( + providers: 
readonly T[], + recentProviderIds: readonly string[], +): T[] { + if (providers.length <= 1) { + return [...providers]; + } + + const recentRank = new Map(); + for (const providerId of recentProviderIds) { + if (recentRank.has(providerId)) { + continue; + } + recentRank.set(providerId, recentRank.size); + } + + return providers + .map((provider, defaultIndex) => ({ + provider, + defaultIndex, + recentIndex: recentRank.get(provider.id) ?? Number.POSITIVE_INFINITY, + })) + .sort((left, right) => { + if (left.recentIndex !== right.recentIndex) { + return left.recentIndex - right.recentIndex; + } + return left.defaultIndex - right.defaultIndex; + }) + .map((entry) => entry.provider); +} + +function migratePersistedState(state: unknown): Pick { + const record = state as { recentProviderIds?: string[] } | null | undefined; + return { + recentProviderIds: sanitizeRecentProviderIds(record?.recentProviderIds), + }; +} + +export const useProviderRecencyStore = create()( + persist( + (set) => ({ + recentProviderIds: [], + recordUsage: (providerId) => { + if (!isValidAgentProvider(providerId)) { + return; + } + + set((state) => ({ + recentProviderIds: [ + providerId, + ...state.recentProviderIds.filter((id) => id !== providerId), + ], + })); + }, + }), + { + name: "agent-provider-recency", + version: PROVIDER_RECENCY_STORE_VERSION, + storage: createJSONStorage(() => AsyncStorage), + partialize: (state) => ({ + recentProviderIds: state.recentProviderIds, + }), + migrate: (persistedState) => migratePersistedState(persistedState), + }, + ), +); + +export function useProviderRecency( + availableProviders: readonly AgentProviderDefinition[] = AGENT_PROVIDER_DEFINITIONS, +): { + providers: AgentProviderDefinition[]; + recordUsage: (providerId: AgentProvider) => void; +} { + const recentProviderIds = useProviderRecencyStore((state) => state.recentProviderIds); + const recordUsage = useProviderRecencyStore((state) => state.recordUsage); + + const providers = useMemo( + () => 
sortProvidersByRecency(availableProviders, recentProviderIds), + [availableProviders, recentProviderIds], + ); + + return { + providers, + recordUsage, + }; +} + +export const __providerRecencyStoreTestUtils = { + migratePersistedState, + sanitizeRecentProviderIds, +}; diff --git a/packages/app/src/stores/session-store.test.ts b/packages/app/src/stores/session-store.test.ts index b265b7b70..ae5330ca0 100644 --- a/packages/app/src/stores/session-store.test.ts +++ b/packages/app/src/stores/session-store.test.ts @@ -1,6 +1,12 @@ -import { describe, expect, it } from "vitest"; +import { afterEach, describe, expect, it } from "vitest"; + +import type { DaemonClient } from "@server/client/daemon-client"; +import type { WorkspaceDescriptorPayload } from "@server/shared/messages"; + import { mergeWorkspaceSnapshotWithExisting, + normalizeWorkspaceDescriptor, + useSessionStore, type WorkspaceDescriptor, } from "./session-store"; @@ -9,18 +15,125 @@ function createWorkspace( ): WorkspaceDescriptor { return { id: input.id, - projectId: input.projectId ?? "remote:github.com/getpaseo/paseo", - projectDisplayName: input.projectDisplayName ?? "getpaseo/paseo", - projectRootPath: input.projectRootPath ?? "/tmp/repo", + projectId: input.projectId ?? "project-1", + projectDisplayName: input.projectDisplayName ?? "Project 1", + projectRootPath: input.projectRootPath ?? "/repo", + workspaceDirectory: input.workspaceDirectory ?? "/repo", projectKind: input.projectKind ?? "git", workspaceKind: input.workspaceKind ?? "local_checkout", name: input.name ?? "main", status: input.status ?? "done", activityAt: input.activityAt ?? null, diffStat: input.diffStat ?? null, + services: input.services ?? 
[], }; } +afterEach(() => { + useSessionStore.getState().clearSession("test-server"); +}); + +describe("normalizeWorkspaceDescriptor", () => { + it("normalizes workspace services and invalid activity timestamps", () => { + const services = [ + { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running" as const, + health: "healthy" as const, + }, + ]; + const workspace = normalizeWorkspaceDescriptor({ + id: "1", + projectId: "1", + projectDisplayName: "Project 1", + projectRootPath: "/repo", + workspaceDirectory: "/repo", + projectKind: "git", + workspaceKind: "checkout", + name: "main", + status: "running", + activityAt: "not-a-date", + diffStat: null, + services, + }); + + expect(workspace.activityAt).toBeNull(); + expect(workspace.services).toEqual([ + { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running", + health: "healthy", + }, + ]); + expect(workspace.services).not.toBe(services); + }); + + it("defaults missing services to an empty array", () => { + const payload = { + id: "1", + projectId: "1", + projectDisplayName: "Project 1", + projectRootPath: "/repo", + workspaceDirectory: "/repo", + projectKind: "git", + workspaceKind: "checkout", + name: "main", + status: "done", + activityAt: null, + diffStat: null, + services: [], + } as WorkspaceDescriptorPayload; + + const workspace = normalizeWorkspaceDescriptor(payload); + + expect(workspace.services).toEqual([]); + }); +}); + +describe("mergeWorkspaces", () => { + it("preserves services on merged workspace entries", () => { + const store = useSessionStore.getState(); + store.initializeSession("test-server", null as unknown as DaemonClient); + store.setWorkspaces( + "test-server", + new Map([["/repo/main", createWorkspace({ id: "/repo/main", services: [] })]]), + ); + + store.mergeWorkspaces("test-server", [ + createWorkspace({ + id: "/repo/main", + 
services: [ + { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running", + health: "healthy", + }, + ], + }), + ]); + + expect(store.getSession("test-server")?.workspaces.get("/repo/main")?.services).toEqual([ + { + serviceName: "web", + hostname: "main.web.localhost", + port: 3000, + url: "http://main.web.localhost:6767", + lifecycle: "running", + health: "healthy", + }, + ]); + }); +}); + describe("mergeWorkspaceSnapshotWithExisting", () => { it("preserves the last known diff stat when a snapshot only has baseline null data", () => { const existing = createWorkspace({ diff --git a/packages/app/src/stores/session-store.ts b/packages/app/src/stores/session-store.ts index cc31a8a9f..7fd67ae1c 100644 --- a/packages/app/src/stores/session-store.ts +++ b/packages/app/src/stores/session-store.ts @@ -24,6 +24,7 @@ import type { ServerInfoStatusPayload, ProjectPlacementPayload, ServerCapabilities, + AgentSnapshotPayload, WorkspaceDescriptorPayload, } from "@server/shared/messages"; import { normalizeWorkspaceIdentity } from "@/utils/workspace-identity"; @@ -116,12 +117,14 @@ export interface WorkspaceDescriptor { projectId: string; projectDisplayName: string; projectRootPath: string; + workspaceDirectory: string; projectKind: WorkspaceDescriptorPayload["projectKind"]; workspaceKind: WorkspaceDescriptorPayload["workspaceKind"]; name: string; status: WorkspaceDescriptorPayload["status"]; activityAt: Date | null; diffStat: { additions: number; deletions: number } | null; + services: WorkspaceDescriptorPayload["services"]; } export function normalizeWorkspaceDescriptor( @@ -129,16 +132,18 @@ export function normalizeWorkspaceDescriptor( ): WorkspaceDescriptor { const activityAt = payload.activityAt ? new Date(payload.activityAt) : null; return { - id: normalizeWorkspaceIdentity(payload.id) ?? payload.id, - projectId: payload.projectId, + id: normalizeWorkspaceIdentity(String(payload.id)) ?? 
String(payload.id), + projectId: String(payload.projectId), projectDisplayName: payload.projectDisplayName, projectRootPath: payload.projectRootPath, + workspaceDirectory: payload.workspaceDirectory, projectKind: payload.projectKind, workspaceKind: payload.workspaceKind, name: payload.name, status: payload.status, activityAt: activityAt && !Number.isNaN(activityAt.getTime()) ? activityAt : null, diffStat: payload.diffStat ?? null, + services: (payload.services ?? []).map((s) => ({ ...s })), }; } @@ -210,7 +215,6 @@ export type DaemonServerInfo = { }; export interface AgentTimelineCursorState { - epoch: string; startSeq: number; endSeq: number; } diff --git a/packages/app/src/stores/workspace-layout-actions.ts b/packages/app/src/stores/workspace-layout-actions.ts index efe9db64f..bcf170103 100644 --- a/packages/app/src/stores/workspace-layout-actions.ts +++ b/packages/app/src/stores/workspace-layout-actions.ts @@ -110,6 +110,7 @@ interface OpenTabInLayoutResult { tabId: string; } + interface RetargetTabInLayoutInput { layout: WorkspaceLayout; tabId: string; @@ -121,6 +122,17 @@ interface RetargetTabInLayoutResult { tabId: string; } +interface ConvertDraftToAgentInLayoutInput { + layout: WorkspaceLayout; + tabId: string; + agentId: string; +} + +interface ConvertDraftToAgentInLayoutResult { + layout: WorkspaceLayout; + tabId: string; +} + interface ReorderFocusedPaneTabsInLayoutInput { layout: WorkspaceLayout; tabIds: string[]; @@ -181,6 +193,20 @@ interface ReorderPaneTabsInLayoutInput { tabIds: string[]; } +export interface WorkspaceTabReconcileState { + layout: WorkspaceLayout; + pinnedAgentIds?: ReadonlySet | null; +} + +export interface WorkspaceTabSnapshot { + agentsHydrated: boolean; + terminalsHydrated: boolean; + activeAgentIds: Iterable; + knownAgentIds: Iterable; + standaloneTerminalIds: Iterable; + hasActivePendingDraftCreate?: boolean; +} + const DEFAULT_PANE_ID = "main"; const MIN_SPLIT_SIZE = 0.1; @@ -750,6 +776,37 @@ function updateTabInTree(root: 
SplitNodeInternal, input: UpdateTabInTreeInput): }); } +function replaceTabInTree( + root: SplitNodeInternal, + input: { + tabId: string; + nextTabId: string; + target: WorkspaceTabTarget; + }, +): SplitNodeInternal { + const panePath = findPanePathContainingTab(root, input.tabId); + invariant(panePath, `Tab not found: ${input.tabId}`); + return replaceNodeAtPath(root, panePath, (node) => { + invariant(node.kind === "pane", "Expected pane while replacing tab"); + return { + kind: "pane", + pane: normalizePaneAfterTabChange({ + ...node.pane, + tabs: node.pane.tabs.map((tab) => + tab.tabId === input.tabId + ? { + ...tab, + tabId: input.nextTabId, + target: input.target, + } + : tab, + ), + focusedTabId: node.pane.focusedTabId === input.tabId ? input.nextTabId : node.pane.focusedTabId, + }), + }; + }); +} + function updateGroupSizesInTree( root: SplitNodeInternal, input: UpdateGroupSizesInTreeInput, @@ -957,22 +1014,12 @@ export function removeTabFromTree(root: SplitNode, tabId: string): SplitNode { return detachTabFromTree(asInternalNode(root), { tabId }).root; } -export function openTabInLayout(input: OpenTabInLayoutInput): OpenTabInLayoutResult { +function insertNewTabIntoFocusedPane(input: { + layout: WorkspaceLayout; + target: WorkspaceTabTarget; + now: number; +}): OpenTabInLayoutResult { const layout = asInternalLayout(input.layout); - const existingTab = collectAllTabs(layout.root).find((tab) => - workspaceTabTargetsEqual(tab.target, input.target), - ); - if (existingTab) { - return { - tabId: existingTab.tabId, - layout: - focusTabInLayout({ - layout, - tabId: existingTab.tabId, - }) ?? input.layout, - }; - } - const focusedPane = findPaneById(layout.root, layout.focusedPaneId) ?? collectAllPanes(layout.root)[0] ?? 
@@ -999,6 +1046,25 @@ export function openTabInLayout(input: OpenTabInLayoutInput): OpenTabInLayoutRes }; } +export function openTabInLayout(input: OpenTabInLayoutInput): OpenTabInLayoutResult { + const layout = asInternalLayout(input.layout); + const existingTab = collectAllTabs(layout.root).find((tab) => + workspaceTabTargetsEqual(tab.target, input.target), + ); + if (existingTab) { + return { + tabId: existingTab.tabId, + layout: + focusTabInLayout({ + layout, + tabId: existingTab.tabId, + }) ?? input.layout, + }; + } + + return insertNewTabIntoFocusedPane(input); +} + export function closeTabInLayout(input: CloseTabInLayoutInput): WorkspaceLayout | null { const internalLayout = asInternalLayout(input.layout); const pane = findPaneContainingTab(internalLayout.root, input.tabId); @@ -1054,11 +1120,34 @@ export function retargetTabInLayout( }; } + const existingTargetTab = + collectAllTabs(layout.root).find( + (tab) => tab.tabId !== input.tabId && workspaceTabTargetsEqual(tab.target, input.target), + ) ?? null; + if (existingTargetTab) { + const nextLayout = + closeTabInLayout({ + layout: input.layout, + tabId: input.tabId, + }) ?? input.layout; + return { + layout: + focusTabInLayout({ + layout: nextLayout, + tabId: existingTargetTab.tabId, + }) ?? nextLayout, + tabId: existingTargetTab.tabId, + }; + } + return { + // Preserve the existing tab id so draft->entity transitions keep the same + // React key during the first render. Reconciliation can canonicalize later. 
tabId: input.tabId, layout: { - root: updateTabInTree(layout.root, { + root: replaceTabInTree(layout.root, { tabId: input.tabId, + nextTabId: input.tabId, target: input.target, }), focusedPaneId: layout.focusedPaneId, @@ -1066,6 +1155,52 @@ export function retargetTabInLayout( }; } +export function convertDraftToAgentInLayout( + input: ConvertDraftToAgentInLayoutInput, +): ConvertDraftToAgentInLayoutResult | null { + const layout = asInternalLayout(input.layout); + const currentTab = collectAllTabs(layout.root).find((tab) => tab.tabId === input.tabId) ?? null; + if (!currentTab || currentTab.target.kind !== "draft") { + return null; + } + + const target: WorkspaceTabTarget = { + kind: "agent", + agentId: input.agentId, + }; + const canonicalTabId = buildDeterministicWorkspaceTabId(target); + const existingCanonicalTab = + collectAllTabs(layout.root).find((tab) => tab.tabId === canonicalTabId) ?? null; + + if (existingCanonicalTab && existingCanonicalTab.tabId !== input.tabId) { + const nextLayout = + closeTabInLayout({ + layout: input.layout, + tabId: input.tabId, + }) ?? input.layout; + return { + layout: + focusTabInLayout({ + layout: nextLayout, + tabId: canonicalTabId, + }) ?? 
nextLayout, + tabId: canonicalTabId, + }; + } + + return { + tabId: canonicalTabId, + layout: { + root: replaceTabInTree(layout.root, { + tabId: input.tabId, + nextTabId: canonicalTabId, + target, + }), + focusedPaneId: layout.focusedPaneId, + }, + }; +} + export function reorderFocusedPaneTabsInLayout( input: ReorderFocusedPaneTabsInLayoutInput, ): WorkspaceLayout | null { @@ -1234,3 +1369,201 @@ export function reorderPaneTabsInLayout( focusedPaneId: layout.focusedPaneId, }; } + +function normalizeStringSet(values: Iterable): Set { + const next = new Set(); + for (const value of values) { + const normalized = trimNonEmpty(value); + if (normalized) { + next.add(normalized); + } + } + return next; +} + +function isEntityTarget( + target: WorkspaceTabTarget, +): target is Extract { + return target.kind === "agent" || target.kind === "terminal"; +} + +function isAgentTab(tab: WorkspaceTab): tab is WorkspaceTab & { target: { kind: "agent"; agentId: string } } { + return tab.target.kind === "agent"; +} + +function isTerminalTab( + tab: WorkspaceTab, +): tab is WorkspaceTab & { target: { kind: "terminal"; terminalId: string } } { + return tab.target.kind === "terminal"; +} + +function openEntityTabWithoutFocusing(layout: WorkspaceLayout, target: WorkspaceTabTarget): WorkspaceLayout { + const internalLayout = asInternalLayout(layout); + const focusedPane = + findPaneById(internalLayout.root, internalLayout.focusedPaneId) ?? + collectAllPanes(internalLayout.root)[0] ?? + findPaneById(createDefaultLayout().root, DEFAULT_PANE_ID); + invariant(focusedPane, "Workspace layout must always have a pane"); + + const tabId = buildDeterministicWorkspaceTabId(target); + return { + root: insertTabIntoPane(internalLayout.root, { + paneId: focusedPane.id, + tab: { + tabId, + target, + createdAt: Date.now(), + }, + focusTabId: focusedPane.focusedTabId ?? 
tabId, + }), + focusedPaneId: internalLayout.focusedPaneId, + }; +} + +export function reconcileWorkspaceTabs( + state: WorkspaceTabReconcileState, + snapshot: WorkspaceTabSnapshot, +): WorkspaceTabReconcileState { + let nextLayout = state.layout; + const originalFocusedTabId = + findPaneById(nextLayout.root, nextLayout.focusedPaneId)?.focusedTabId ?? null; + let reconciledFocusedTabId = originalFocusedTabId; + const pinnedAgentIds = new Set(state.pinnedAgentIds ?? []); + const activeAgentIds = normalizeStringSet(snapshot.activeAgentIds); + const knownAgentIds = normalizeStringSet(snapshot.knownAgentIds); + const standaloneTerminalIds = normalizeStringSet(snapshot.standaloneTerminalIds); + const visibleAgentIds = new Set(activeAgentIds); + for (const agentId of pinnedAgentIds) { + if (knownAgentIds.has(agentId)) { + visibleAgentIds.add(agentId); + } + } + + const initialTabs = collectAllTabs(nextLayout.root); + const representedAgentIds = new Set(initialTabs.filter(isAgentTab).map((tab) => tab.target.agentId)); + + const entityGroups = new Map< + string, + { + target: WorkspaceTabTarget; + tabs: WorkspaceTab[]; + } + >(); + for (const tab of initialTabs) { + if (!isEntityTarget(tab.target)) { + continue; + } + const canonicalTarget = normalizeWorkspaceTabTarget(tab.target); + if (!canonicalTarget) { + continue; + } + const canonicalTabId = buildDeterministicWorkspaceTabId(canonicalTarget); + const currentGroup = entityGroups.get(canonicalTabId); + if (currentGroup) { + currentGroup.tabs.push(tab); + continue; + } + entityGroups.set(canonicalTabId, { + target: canonicalTarget, + tabs: [tab], + }); + } + + for (const [canonicalTabId, group] of entityGroups) { + const keeper = group.tabs.find((tab) => tab.tabId === canonicalTabId) ?? group.tabs[0] ?? 
null; + if (!keeper) { + continue; + } + if (group.tabs.some((tab) => tab.tabId === originalFocusedTabId)) { + reconciledFocusedTabId = canonicalTabId; + } + if ( + keeper.tabId !== canonicalTabId || + !workspaceTabTargetsEqual(keeper.target, group.target) + ) { + nextLayout = { + root: replaceTabInTree(asInternalLayout(nextLayout).root, { + tabId: keeper.tabId, + nextTabId: canonicalTabId, + target: group.target, + }), + focusedPaneId: nextLayout.focusedPaneId, + }; + } + for (const tab of group.tabs) { + if (tab.tabId === keeper.tabId) { + continue; + } + nextLayout = + closeTabInLayout({ + layout: nextLayout, + tabId: tab.tabId, + }) ?? nextLayout; + } + } + + for (const tab of collectAllTabs(nextLayout.root)) { + if (isAgentTab(tab) && snapshot.agentsHydrated && !visibleAgentIds.has(tab.target.agentId)) { + nextLayout = + closeTabInLayout({ + layout: nextLayout, + tabId: tab.tabId, + }) ?? nextLayout; + } + if (isTerminalTab(tab) && snapshot.terminalsHydrated && !standaloneTerminalIds.has(tab.target.terminalId)) { + nextLayout = + closeTabInLayout({ + layout: nextLayout, + tabId: tab.tabId, + }) ?? 
nextLayout; + } + } + + const currentEntityTabs = collectAllTabs(nextLayout.root); + const currentAgentIds = new Set( + currentEntityTabs.filter(isAgentTab).map((tab) => tab.target.agentId), + ); + const currentTerminalIds = new Set( + currentEntityTabs.filter(isTerminalTab).map((tab) => tab.target.terminalId), + ); + + const sortedVisibleAgentIds = [...visibleAgentIds].sort(); + for (const agentId of sortedVisibleAgentIds) { + if (currentAgentIds.has(agentId)) { + continue; + } + if (snapshot.hasActivePendingDraftCreate && !representedAgentIds.has(agentId)) { + continue; + } + nextLayout = openEntityTabWithoutFocusing(nextLayout, { + kind: "agent", + agentId, + }); + currentAgentIds.add(agentId); + } + + const sortedTerminalIds = [...standaloneTerminalIds].sort(); + for (const terminalId of sortedTerminalIds) { + if (currentTerminalIds.has(terminalId)) { + continue; + } + nextLayout = openEntityTabWithoutFocusing(nextLayout, { + kind: "terminal", + terminalId, + }); + currentTerminalIds.add(terminalId); + } + + if (reconciledFocusedTabId) { + nextLayout = + focusTabInLayout({ + layout: nextLayout, + tabId: reconciledFocusedTabId, + }) ?? 
nextLayout; + } + + return { + ...state, + layout: nextLayout, + }; +} diff --git a/packages/app/src/stores/workspace-layout-store.test.ts b/packages/app/src/stores/workspace-layout-store.test.ts index 78f0aa446..c83d317fc 100644 --- a/packages/app/src/stores/workspace-layout-store.test.ts +++ b/packages/app/src/stores/workspace-layout-store.test.ts @@ -262,6 +262,55 @@ describe("workspace-layout-store actions", () => { ]); }); + it("openTab creates distinct draft tabs for repeated Cmd+T/new-tab opens", () => { + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + const firstTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-1" }); + const secondTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-2" }); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(firstTabId).toBe("draft-1"); + expect(secondTabId).toBe("draft-2"); + expect(firstTabId).not.toBe(secondTabId); + expect(findPaneById(layout.root, "main")?.tabIds).toEqual([firstTabId, secondTabId]); + expect(collectAllTabs(layout.root)).toEqual([ + { + tabId: firstTabId, + target: { kind: "draft", draftId: "draft-1" }, + createdAt: expect.any(Number), + }, + { + tabId: secondTabId, + target: { kind: "draft", draftId: "draft-2" }, + createdAt: expect.any(Number), + }, + ]); + }); + + it("splitPaneEmpty plus openTab opens a draft tab in the new pane", () => { + vi.spyOn(globalThis.crypto, "randomUUID").mockReturnValueOnce( + "77777777-7777-7777-7777-777777777777", + ); + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + store.openTab(workspaceKey, { kind: "file", path: "/repo/worktree/a.ts" }); + const newPaneId = store.splitPaneEmpty(workspaceKey, { + targetPaneId: "main", + position: "right", + }); + const draftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-split" }); + const layout = 
useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(newPaneId).toBe("pane_77777777-7777-7777-7777-777777777777"); + expect(draftTabId).toBe("draft-split"); + expect(layout.focusedPaneId).toBe(newPaneId); + expect(findPaneById(layout.root, "main")?.tabIds).toEqual(["file_/repo/worktree/a.ts"]); + expect(findPaneById(layout.root, newPaneId!)?.tabIds).toEqual([draftTabId!]); + expect(findPaneById(layout.root, newPaneId!)?.focusedTabId).toBe(draftTabId); + }); + it("focusTab moves workspace focus to the pane containing the tab", () => { vi.spyOn(globalThis.crypto, "randomUUID").mockReturnValue( "bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", @@ -288,7 +337,7 @@ describe("workspace-layout-store actions", () => { expect(findPaneById(layout.root, splitPaneId!)?.focusedTabId).toBe(terminalTabId); }); - it("retargetTab updates the existing tab target without moving it to a different pane", () => { + it("convertDraftToAgent replaces the draft tab with a canonical agent tab in the same pane", () => { vi.spyOn(globalThis.crypto, "randomUUID").mockReturnValue( "12121212-1212-1212-1212-121212121212", ); @@ -296,32 +345,117 @@ describe("workspace-layout-store actions", () => { const store = useWorkspaceLayoutStore.getState(); store.openTab(workspaceKey, { kind: "file", path: "/repo/worktree/a.ts" }); - const secondTabId = store.openTab(workspaceKey, { kind: "file", path: "/repo/worktree/b.ts" }); + const secondTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-2" }); const splitPaneId = store.splitPane(workspaceKey, { tabId: secondTabId!, targetPaneId: "main", position: "right", }); - const nextTabId = store.retargetTab(workspaceKey, secondTabId!, { - kind: "agent", - agentId: "agent-1", - }); + const nextTabId = store.convertDraftToAgent(workspaceKey, secondTabId!, "agent-1"); const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; const splitPane = findPaneById(layout.root, splitPaneId!); - const retargetedTab = 
collectAllTabs(layout.root).find((tab) => tab.tabId === secondTabId); + const convertedTab = collectAllTabs(layout.root).find((tab) => tab.tabId === nextTabId); expect(splitPaneId).toBe("pane_12121212-1212-1212-1212-121212121212"); - expect(nextTabId).toBe(secondTabId); - expect(splitPane?.tabIds).toEqual([secondTabId!]); - expect(findPaneContainingTab(layout.root, secondTabId!)?.id).toBe(splitPaneId); - expect(retargetedTab).toEqual({ - tabId: secondTabId, + expect(nextTabId).toBe("agent_agent-1"); + expect(splitPane?.tabIds).toEqual(["agent_agent-1"]); + expect(findPaneContainingTab(layout.root, "agent_agent-1")?.id).toBe(splitPaneId); + expect(convertedTab).toEqual({ + tabId: "agent_agent-1", target: { kind: "agent", agentId: "agent-1" }, createdAt: expect.any(Number), }); }); + it("retargetTab keeps a draft tab in place while updating its target", () => { + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + const draftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-retarget" }); + const nextTabId = store.retargetTab(workspaceKey, draftTabId!, { + kind: "file", + path: "/repo/worktree/retargeted.ts", + }); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(draftTabId).toBe("draft-retarget"); + expect(nextTabId).toBe(draftTabId); + expect(findPaneById(layout.root, "main")?.tabIds).toEqual([draftTabId!]); + expect(collectAllTabs(layout.root)).toEqual([ + { + tabId: draftTabId!, + target: { kind: "file", path: "/repo/worktree/retargeted.ts" }, + createdAt: expect.any(Number), + }, + ]); + }); + + it("retargetTab closes a draft tab and focuses the existing canonical target tab", () => { + vi.spyOn(globalThis.crypto, "randomUUID") + .mockReturnValueOnce("55555555-5555-5555-5555-555555555555"); + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + const existingFileTabId = store.openTab(workspaceKey, { + 
kind: "file", + path: "/repo/worktree/existing.ts", + }); + const draftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-dup" }); + const splitPaneId = store.splitPane(workspaceKey, { + tabId: draftTabId!, + targetPaneId: "main", + position: "right", + }); + const secondDraftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-dup-2" }); + + const nextTabId = store.retargetTab(workspaceKey, secondDraftTabId!, { + kind: "file", + path: "/repo/worktree/existing.ts", + }); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(existingFileTabId).toBe("file_/repo/worktree/existing.ts"); + expect(draftTabId).toBe("draft-dup"); + expect(splitPaneId).toBe("pane_55555555-5555-5555-5555-555555555555"); + expect(nextTabId).toBe(existingFileTabId); + expect(collectAllTabs(layout.root).map((tab) => tab.tabId)).toEqual([ + existingFileTabId!, + draftTabId!, + ]); + expect(layout.focusedPaneId).toBe("main"); + expect(findPaneById(layout.root, "main")?.focusedTabId).toBe(existingFileTabId); + }); + + it("retargetTab closes a draft tab and focuses an existing matching target tab", () => { + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + const firstDraftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-agent-1" }); + const firstAgentTabId = store.retargetTab(workspaceKey, firstDraftTabId!, { + kind: "agent", + agentId: "agent-1", + }); + const secondDraftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-agent-2" }); + + const nextTabId = store.retargetTab(workspaceKey, secondDraftTabId!, { + kind: "agent", + agentId: "agent-1", + }); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(firstAgentTabId).toBe(firstDraftTabId); + expect(nextTabId).toBe(firstDraftTabId); + expect(collectAllTabs(layout.root)).toEqual([ + { + tabId: firstDraftTabId!, + target: { kind: "agent", 
agentId: "agent-1" }, + createdAt: expect.any(Number), + }, + ]); + expect(findPaneById(layout.root, "main")?.focusedTabId).toBe(firstDraftTabId); + }); + it("reorderTabs reorders tabs within the focused pane", () => { const workspaceKey = createWorkspaceKey(); const store = useWorkspaceLayoutStore.getState(); @@ -702,4 +836,102 @@ describe("workspace-layout-store actions", () => { splitSizesByWorkspace: {}, }); }); + + it("convertDraftToAgent removes the draft and focuses the existing canonical agent tab", () => { + vi.spyOn(globalThis.crypto, "randomUUID").mockReturnValue( + "67676767-6767-6767-6767-676767676767", + ); + const workspaceKey = createWorkspaceKey(); + const store = useWorkspaceLayoutStore.getState(); + + const draftTabId = store.openTab(workspaceKey, { kind: "draft", draftId: "draft-existing" }); + const agentTabId = store.openTab(workspaceKey, { kind: "agent", agentId: "agent-1" }); + const splitPaneId = store.splitPane(workspaceKey, { + tabId: agentTabId!, + targetPaneId: "main", + position: "right", + }); + + const nextTabId = store.convertDraftToAgent(workspaceKey, draftTabId!, "agent-1"); + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + + expect(splitPaneId).toBe("pane_67676767-6767-6767-6767-676767676767"); + expect(nextTabId).toBe("agent_agent-1"); + expect(collectAllTabs(layout.root).map((tab) => tab.tabId)).toEqual(["agent_agent-1"]); + expect(layout.focusedPaneId).toBe(splitPaneId); + expect(findPaneContainingTab(layout.root, "agent_agent-1")?.id).toBe(splitPaneId); + }); + + it("reconcileTabs canonicalizes duplicates and prunes stale entity tabs from hydrated snapshots", () => { + const workspaceKey = createWorkspaceKey(); + + useWorkspaceLayoutStore.setState((state) => ({ + ...state, + layoutByWorkspace: { + ...state.layoutByWorkspace, + [workspaceKey]: { + root: { + kind: "pane", + pane: { + id: "main", + tabIds: ["draft_agent", "agent_agent-1", "terminal_orphan", "draft-1"], + focusedTabId: 
"draft_agent", + tabs: [ + { + tabId: "draft_agent", + target: { kind: "agent", agentId: "agent-1" }, + createdAt: 1, + }, + { + tabId: "agent_agent-1", + target: { kind: "agent", agentId: "agent-1" }, + createdAt: 2, + }, + { + tabId: "terminal_orphan", + target: { kind: "terminal", terminalId: "term-stale" }, + createdAt: 3, + }, + { + tabId: "draft-1", + target: { kind: "draft", draftId: "draft-1" }, + createdAt: 4, + }, + ], + } as any, + }, + focusedPaneId: "main", + }, + }, + pinnedAgentIdsByWorkspace: { + [workspaceKey]: new Set(["agent-2"]), + }, + })); + + useWorkspaceLayoutStore.getState().reconcileTabs(workspaceKey, { + agentsHydrated: true, + terminalsHydrated: true, + activeAgentIds: ["agent-1"], + knownAgentIds: ["agent-1", "agent-2"], + standaloneTerminalIds: ["term-1"], + hasActivePendingDraftCreate: false, + }); + + const layout = useWorkspaceLayoutStore.getState().layoutByWorkspace[workspaceKey]!; + const tabs = collectAllTabs(layout.root); + + expect(tabs.map((tab) => tab.tabId)).toEqual([ + "agent_agent-1", + "draft-1", + "agent_agent-2", + "terminal_term-1", + ]); + expect(tabs.find((tab) => tab.tabId === "agent_agent-1")).toEqual({ + tabId: "agent_agent-1", + target: { kind: "agent", agentId: "agent-1" }, + createdAt: 2, + }); + expect(layout.focusedPaneId).toBe("main"); + expect(findPaneById(layout.root, "main")?.focusedTabId).toBe("agent_agent-1"); + }); }); diff --git a/packages/app/src/stores/workspace-layout-store.ts b/packages/app/src/stores/workspace-layout-store.ts index e942dae51..116bef8fb 100644 --- a/packages/app/src/stores/workspace-layout-store.ts +++ b/packages/app/src/stores/workspace-layout-store.ts @@ -11,6 +11,7 @@ import { closeTabInLayout, collectAllPanes, collectAllTabs, + convertDraftToAgentInLayout, createDefaultLayout, findPaneById, findPaneContainingTab, @@ -21,6 +22,7 @@ import { moveTabToPaneInLayout, normalizeLayout, openTabInLayout, + reconcileWorkspaceTabs, removePaneFromTree, removeTabFromTree, 
reorderFocusedPaneTabsInLayout, @@ -31,6 +33,8 @@ import { type SplitGroup, type SplitNode, type SplitPane, + type WorkspaceTabReconcileState, + type WorkspaceTabSnapshot, type WorkspaceLayout, } from "@/stores/workspace-layout-actions"; import { normalizeWorkspaceTabTarget } from "@/utils/workspace-tab-identity"; @@ -48,7 +52,14 @@ export { removePaneFromTree, removeTabFromTree, }; -export type { SplitGroup, SplitNode, SplitPane, WorkspaceLayout }; +export type { + SplitGroup, + SplitNode, + SplitPane, + WorkspaceLayout, + WorkspaceTabReconcileState, + WorkspaceTabSnapshot, +}; interface WorkspaceLayoutStore { layoutByWorkspace: Record<string, WorkspaceLayout>; @@ -58,6 +69,8 @@ interface WorkspaceLayoutStore { closeTab: (workspaceKey: string, tabId: string) => void; focusTab: (workspaceKey: string, tabId: string) => void; retargetTab: (workspaceKey: string, tabId: string, target: WorkspaceTabTarget) => string | null; + convertDraftToAgent: (workspaceKey: string, tabId: string, agentId: string) => string | null; + reconcileTabs: (workspaceKey: string, snapshot: WorkspaceTabSnapshot) => void; reorderTabs: (workspaceKey: string, tabIds: string[]) => void; getWorkspaceTabs: (workspaceKey: string) => WorkspaceTab[]; splitPane: ( @@ -202,6 +215,59 @@ return result.tabId; }, + convertDraftToAgent: (workspaceKey, tabId, agentId) => { + const normalizedWorkspaceKey = trimNonEmpty(workspaceKey); + const normalizedTabId = trimNonEmpty(tabId); + const normalizedAgentId = trimNonEmpty(agentId); + if (!normalizedWorkspaceKey || !normalizedTabId || !normalizedAgentId) { + return null; + } + + const result = convertDraftToAgentInLayout({ + layout: getWorkspaceLayout(get().layoutByWorkspace, normalizedWorkspaceKey), + tabId: normalizedTabId, + agentId: normalizedAgentId, + }); + if (!result) { + return null; + } + + set((state) => ({ + layoutByWorkspace: { + ...state.layoutByWorkspace, + [normalizedWorkspaceKey]: result.layout, + }, + })); + + return 
result.tabId; + }, + reconcileTabs: (workspaceKey, snapshot) => { + const normalizedWorkspaceKey = trimNonEmpty(workspaceKey); + if (!normalizedWorkspaceKey) { + return; + } + + set((state) => { + const currentLayout = getWorkspaceLayout(state.layoutByWorkspace, normalizedWorkspaceKey); + const nextState = reconcileWorkspaceTabs( + { + layout: currentLayout, + pinnedAgentIds: state.pinnedAgentIdsByWorkspace[normalizedWorkspaceKey] ?? null, + }, + snapshot, + ); + if (nextState.layout === currentLayout) { + return state; + } + + return { + layoutByWorkspace: { + ...state.layoutByWorkspace, + [normalizedWorkspaceKey]: nextState.layout, + }, + }; + }); + }, reorderTabs: (workspaceKey, tabIds) => { const normalizedWorkspaceKey = trimNonEmpty(workspaceKey); if (!normalizedWorkspaceKey) { diff --git a/packages/app/src/stores/workspace-setup-store.test.ts b/packages/app/src/stores/workspace-setup-store.test.ts new file mode 100644 index 000000000..59f04350c --- /dev/null +++ b/packages/app/src/stores/workspace-setup-store.test.ts @@ -0,0 +1,41 @@ +import { beforeEach, describe, expect, it } from "vitest"; +import { useWorkspaceSetupStore } from "./workspace-setup-store"; + +describe("workspace-setup-store", () => { + beforeEach(() => { + useWorkspaceSetupStore.setState({ pendingWorkspaceSetup: null }); + }); + + it("tracks deferred workspace setup by source directory and optional workspace id", () => { + useWorkspaceSetupStore.getState().beginWorkspaceSetup({ + serverId: "server-1", + sourceDirectory: "/Users/test/project", + sourceWorkspaceId: "42", + displayName: "project", + creationMethod: "open_project", + navigationMethod: "replace", + }); + + expect(useWorkspaceSetupStore.getState().pendingWorkspaceSetup).toEqual({ + serverId: "server-1", + sourceDirectory: "/Users/test/project", + sourceWorkspaceId: "42", + displayName: "project", + creationMethod: "open_project", + navigationMethod: "replace", + }); + }); + + it("clears pending setup state", () => { + 
useWorkspaceSetupStore.getState().beginWorkspaceSetup({ + serverId: "server-1", + sourceDirectory: "/Users/test/project", + creationMethod: "create_worktree", + navigationMethod: "navigate", + }); + + useWorkspaceSetupStore.getState().clearWorkspaceSetup(); + + expect(useWorkspaceSetupStore.getState().pendingWorkspaceSetup).toBeNull(); + }); +}); diff --git a/packages/app/src/stores/workspace-setup-store.ts b/packages/app/src/stores/workspace-setup-store.ts new file mode 100644 index 000000000..b8e5cf7fd --- /dev/null +++ b/packages/app/src/stores/workspace-setup-store.ts @@ -0,0 +1,94 @@ +import type { SessionOutboundMessage } from "@server/shared/messages"; +import { create } from "zustand"; +import { buildWorkspaceTabPersistenceKey } from "@/stores/workspace-tabs-store"; + +export type WorkspaceSetupNavigationMethod = "navigate" | "replace"; +export type WorkspaceCreationMethod = "open_project" | "create_worktree"; + +export interface PendingWorkspaceSetup { + serverId: string; + sourceDirectory: string; + sourceWorkspaceId?: string; + displayName?: string; + creationMethod: WorkspaceCreationMethod; + navigationMethod: WorkspaceSetupNavigationMethod; +} + +export type WorkspaceSetupProgressPayload = Extract< + SessionOutboundMessage, + { type: "workspace_setup_progress" } +>["payload"]; + +export interface WorkspaceSetupSnapshot extends WorkspaceSetupProgressPayload { + updatedAt: number; +} + +interface WorkspaceSetupStoreState { + pendingWorkspaceSetup: PendingWorkspaceSetup | null; + snapshots: Record<string, WorkspaceSetupSnapshot>; + beginWorkspaceSetup: (value: PendingWorkspaceSetup) => void; + clearWorkspaceSetup: () => void; + upsertProgress: (input: { serverId: string; payload: WorkspaceSetupProgressPayload }) => void; + removeWorkspace: (input: { serverId: string; workspaceId: string }) => void; + clearServer: (serverId: string) => void; +} + +function buildWorkspaceSetupKey(input: { + serverId: string; + workspaceId: string; +}): string | null { + return 
buildWorkspaceTabPersistenceKey(input); +} + +export const useWorkspaceSetupStore = create<WorkspaceSetupStoreState>()((set) => ({ + pendingWorkspaceSetup: null, + snapshots: {}, + beginWorkspaceSetup: (value) => { + set({ pendingWorkspaceSetup: value }); + }, + clearWorkspaceSetup: () => { + set({ pendingWorkspaceSetup: null }); + }, + upsertProgress: ({ serverId, payload }) => { + const key = buildWorkspaceSetupKey({ serverId, workspaceId: payload.workspaceId }); + if (!key) { + return; + } + + set((state) => ({ + snapshots: { + ...state.snapshots, + [key]: { + ...payload, + updatedAt: Date.now(), + }, + }, + })); + }, + removeWorkspace: ({ serverId, workspaceId }) => { + const key = buildWorkspaceSetupKey({ serverId, workspaceId }); + if (!key) { + return; + } + + set((state) => { + if (!(key in state.snapshots)) { + return state; + } + const next = { ...state.snapshots }; + delete next[key]; + return { snapshots: next }; + }); + }, + clearServer: (serverId) => { + set((state) => { + const nextEntries = Object.entries(state.snapshots).filter( + ([key]) => !key.startsWith(`${serverId}:`), + ); + if (nextEntries.length === Object.keys(state.snapshots).length) { + return state; + } + return { snapshots: Object.fromEntries(nextEntries) }; + }); + }, +})); diff --git a/packages/app/src/stores/workspace-tabs-store.test.ts b/packages/app/src/stores/workspace-tabs-store.test.ts index 1b22b31d1..81a215484 100644 --- a/packages/app/src/stores/workspace-tabs-store.test.ts +++ b/packages/app/src/stores/workspace-tabs-store.test.ts @@ -140,6 +140,40 @@ describe("workspace-tabs-store retargetTab", () => { expect(order).toEqual([draftTabId]); }); + it("openDraftTab creates a draft tab and deduplicates by draftId", () => { + const key = buildWorkspaceTabPersistenceKey({ serverId: SERVER_ID, workspaceId: WORKSPACE_ID }); + expect(key).toBeTruthy(); + const workspaceKey = key as string; + + const firstTabId = useWorkspaceTabsStore.getState().openDraftTab({ + serverId: SERVER_ID, + workspaceId: 
WORKSPACE_ID, + draftId: "draft-1", + }); + const secondTabId = useWorkspaceTabsStore.getState().openDraftTab({ + serverId: SERVER_ID, + workspaceId: WORKSPACE_ID, + draftId: "draft-2", + }); + + const state = useWorkspaceTabsStore.getState(); + expect(firstTabId).toBe("draft-1"); + expect(secondTabId).toBe("draft-2"); + expect(state.tabOrderByWorkspace[workspaceKey]).toEqual([firstTabId, secondTabId]); + expect(state.uiTabsByWorkspace[workspaceKey]).toEqual([ + { + tabId: "draft-1", + target: { kind: "draft", draftId: "draft-1" }, + createdAt: expect.any(Number), + }, + { + tabId: "draft-2", + target: { kind: "draft", draftId: "draft-2" }, + createdAt: expect.any(Number), + }, + ]); + }); + it("retargeting a background draft keeps the currently focused tab focused", () => { const draftTabId = "draft_background"; const key = buildWorkspaceTabPersistenceKey({ serverId: SERVER_ID, workspaceId: WORKSPACE_ID }); @@ -200,4 +234,19 @@ describe("workspace-tabs-store retargetTab", () => { expect(reopenedFileTabId).toBe(fileTabId); expect(useWorkspaceTabsStore.getState().focusedTabIdByWorkspace[workspaceKey]).toBe(fileTabId); }); + + it("builds a deterministic setup tab keyed by workspace id", () => { + const key = buildWorkspaceTabPersistenceKey({ serverId: SERVER_ID, workspaceId: WORKSPACE_ID }); + expect(key).toBeTruthy(); + const workspaceKey = key as string; + + const tabId = useWorkspaceTabsStore.getState().openOrFocusTab({ + serverId: SERVER_ID, + workspaceId: WORKSPACE_ID, + target: { kind: "setup", workspaceId: WORKSPACE_ID }, + }); + + expect(tabId).toBe(`setup_${WORKSPACE_ID}`); + expect(useWorkspaceTabsStore.getState().focusedTabIdByWorkspace[workspaceKey]).toBe(tabId); + }); }); diff --git a/packages/app/src/stores/workspace-tabs-store.ts b/packages/app/src/stores/workspace-tabs-store.ts index 23ed460e0..f1821ab32 100644 --- a/packages/app/src/stores/workspace-tabs-store.ts +++ b/packages/app/src/stores/workspace-tabs-store.ts @@ -1,12 +1,18 @@ import 
AsyncStorage from "@react-native-async-storage/async-storage"; import { create } from "zustand"; import { createJSONStorage, persist } from "zustand/middleware"; +import { + buildDeterministicWorkspaceTabId, + normalizeWorkspaceTabTarget, + workspaceTabTargetsEqual, +} from "@/utils/workspace-tab-identity"; export type WorkspaceTabTarget = | { kind: "draft"; draftId: string } | { kind: "agent"; agentId: string } | { kind: "terminal"; terminalId: string } - | { kind: "file"; path: string }; + | { kind: "file"; path: string } + | { kind: "setup"; workspaceId: string }; export type WorkspaceTab = { tabId: string; @@ -38,63 +44,6 @@ export function buildWorkspaceTabPersistenceKey(input: { return `${serverId}:${normalizeWorkspaceId(workspaceId)}`; } -function normalizeTabTarget( - value: WorkspaceTabTarget | null | undefined, -): WorkspaceTabTarget | null { - if (!value || typeof value !== "object" || typeof value.kind !== "string") { - return null; - } - if (value.kind === "draft") { - const draftId = trimNonEmpty(value.draftId); - return draftId ? { kind: "draft", draftId } : null; - } - if (value.kind === "agent") { - const agentId = trimNonEmpty(value.agentId); - return agentId ? { kind: "agent", agentId } : null; - } - if (value.kind === "terminal") { - const terminalId = trimNonEmpty(value.terminalId); - return terminalId ? { kind: "terminal", terminalId } : null; - } - if (value.kind === "file") { - const path = trimNonEmpty(value.path); - return path ? 
{ kind: "file", path: path.replace(/\\/g, "/") } : null; - } - return null; -} - -function tabTargetsEqual(left: WorkspaceTabTarget, right: WorkspaceTabTarget): boolean { - if (left.kind !== right.kind) { - return false; - } - if (left.kind === "draft" && right.kind === "draft") { - return left.draftId === right.draftId; - } - if (left.kind === "agent" && right.kind === "agent") { - return left.agentId === right.agentId; - } - if (left.kind === "terminal" && right.kind === "terminal") { - return left.terminalId === right.terminalId; - } - if (left.kind === "file" && right.kind === "file") { - return left.path === right.path; - } - return false; -} - -function buildDeterministicTabId(target: WorkspaceTabTarget): string { - if (target.kind === "draft") { - return target.draftId; - } - if (target.kind === "agent") { - return `agent_${target.agentId}`; - } - if (target.kind === "terminal") { - return `terminal_${target.terminalId}`; - } - return `file_${target.path}`; -} - function normalizeTabOrder(list: unknown): string[] { if (!Array.isArray(list)) { return []; @@ -169,19 +118,20 @@ export const useWorkspaceTabsStore = create()( }, ensureTab: ({ serverId, workspaceId, target }) => { const key = buildWorkspaceTabPersistenceKey({ serverId, workspaceId }); - const normalizedTarget = normalizeTabTarget(target); + const normalizedTarget = normalizeWorkspaceTabTarget(target); if (!key || !normalizedTarget) { return null; } - const deterministicTabId = buildDeterministicTabId(normalizedTarget); + const deterministicTabId = buildDeterministicWorkspaceTabId(normalizedTarget); let resolvedTabId = deterministicTabId; const now = Date.now(); set((state) => { const currentTabs = state.uiTabsByWorkspace[key] ?? []; const tabWithSameTarget = - currentTabs.find((tab) => tabTargetsEqual(tab.target, normalizedTarget)) ?? null; + currentTabs.find((tab) => workspaceTabTargetsEqual(tab.target, normalizedTarget)) ?? + null; const effectiveTabId = tabWithSameTarget?.tabId ?? 
deterministicTabId; resolvedTabId = effectiveTabId; @@ -196,7 +146,7 @@ export const useWorkspaceTabsStore = create()( ]; } const existing = currentTabs[existingIndex]; - if (existing && tabTargetsEqual(existing.target, normalizedTarget)) { + if (existing && workspaceTabTargetsEqual(existing.target, normalizedTarget)) { return currentTabs; } return currentTabs.map((tab, index) => @@ -310,7 +260,7 @@ export const useWorkspaceTabsStore = create()( retargetTab: ({ serverId, workspaceId, tabId, target }) => { const key = buildWorkspaceTabPersistenceKey({ serverId, workspaceId }); const normalizedTabId = trimNonEmpty(tabId); - const normalizedTarget = normalizeTabTarget(target); + const normalizedTarget = normalizeWorkspaceTabTarget(target); if (!key || !normalizedTabId || !normalizedTarget) { return null; } @@ -325,7 +275,7 @@ export const useWorkspaceTabsStore = create()( } const currentTarget = currentTabs[index]?.target; - if (currentTarget && tabTargetsEqual(currentTarget, normalizedTarget)) { + if (currentTarget && workspaceTabTargetsEqual(currentTarget, normalizedTarget)) { return state; } @@ -390,7 +340,7 @@ export const useWorkspaceTabsStore = create()( for (const key in state.uiTabsByWorkspace) { const tabs = (state.uiTabsByWorkspace[key] ?? []) .map((tab) => { - const normalizedTarget = normalizeTabTarget(tab.target); + const normalizedTarget = normalizeWorkspaceTabTarget(tab.target); const normalizedTabId = trimNonEmpty(tab.tabId); if (!normalizedTarget || !normalizedTabId) { return null; @@ -479,13 +429,13 @@ export const useWorkspaceTabsStore = create()( continue; } - const normalizedTarget = normalizeTabTarget((rawTab as WorkspaceTab).target); + const normalizedTarget = normalizeWorkspaceTabTarget((rawTab as WorkspaceTab).target); const rawTabId = trimNonEmpty((rawTab as WorkspaceTab).tabId); if (!normalizedTarget) { continue; } - const tabId = rawTabId ?? buildDeterministicTabId(normalizedTarget); + const tabId = rawTabId ?? 
buildDeterministicWorkspaceTabId(normalizedTarget); if (!usedOrder.has(tabId)) { usedOrder.add(tabId); orderFromTabs.push(tabId); diff --git a/packages/app/src/terminal/runtime/terminal-emulator-runtime.test.ts b/packages/app/src/terminal/runtime/terminal-emulator-runtime.test.ts index ecd7d7be0..504a5a52d 100644 --- a/packages/app/src/terminal/runtime/terminal-emulator-runtime.test.ts +++ b/packages/app/src/terminal/runtime/terminal-emulator-runtime.test.ts @@ -1,5 +1,48 @@ import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; +vi.mock("@xterm/addon-clipboard", () => ({ + ClipboardAddon: class ClipboardAddon {}, +})); + +vi.mock("@xterm/addon-fit", () => ({ + FitAddon: class FitAddon {}, +})); + +vi.mock("@xterm/addon-image", () => ({ + ImageAddon: class ImageAddon {}, +})); + +vi.mock("@xterm/addon-ligatures/lib/addon-ligatures.mjs", () => ({ + LigaturesAddon: class LigaturesAddon {}, +})); + +vi.mock("@xterm/addon-search", () => ({ + SearchAddon: class SearchAddon {}, +})); + +vi.mock("@xterm/addon-unicode11", () => ({ + Unicode11Addon: class Unicode11Addon {}, +})); + +vi.mock("@xterm/addon-web-links", () => ({ + WebLinksAddon: class WebLinksAddon {}, +})); + +vi.mock("@xterm/addon-webgl", () => ({ + WebglAddon: class WebglAddon { + onContextLoss(): void {} + dispose(): void {} + }, +})); + +vi.mock("@xterm/xterm", () => ({ + Terminal: class Terminal {}, +})); + +vi.mock("@/utils/open-external-url", () => ({ + openExternalUrl: vi.fn(), +})); + import { TerminalEmulatorRuntime } from "./terminal-emulator-runtime"; type StubTerminal = { diff --git a/packages/app/src/terminal/runtime/terminal-emulator-runtime.ts b/packages/app/src/terminal/runtime/terminal-emulator-runtime.ts index f4a6df21c..947f9b623 100644 --- a/packages/app/src/terminal/runtime/terminal-emulator-runtime.ts +++ b/packages/app/src/terminal/runtime/terminal-emulator-runtime.ts @@ -1,11 +1,11 @@ import { ClipboardAddon } from "@xterm/addon-clipboard"; import { FitAddon } from 
"@xterm/addon-fit"; import { ImageAddon } from "@xterm/addon-image"; -import { LigaturesAddon } from "@xterm/addon-ligatures"; import { SearchAddon } from "@xterm/addon-search"; import { Unicode11Addon } from "@xterm/addon-unicode11"; import { WebLinksAddon } from "@xterm/addon-web-links"; import { WebglAddon } from "@xterm/addon-webgl"; +import { LigaturesAddon } from "@xterm/addon-ligatures/lib/addon-ligatures.mjs"; import { Terminal, type ITheme } from "@xterm/xterm"; import type { TerminalState } from "@server/shared/messages"; import { openExternalUrl } from "@/utils/open-external-url"; diff --git a/packages/app/src/types/xterm-addon-ligatures.d.ts b/packages/app/src/types/xterm-addon-ligatures.d.ts new file mode 100644 index 000000000..f234c995b --- /dev/null +++ b/packages/app/src/types/xterm-addon-ligatures.d.ts @@ -0,0 +1,7 @@ +declare module "@xterm/addon-ligatures/lib/addon-ligatures.mjs" { + export class LigaturesAddon { + constructor(); + activate(terminal: unknown): void; + dispose(): void; + } +} diff --git a/packages/app/src/utils/error-messages.ts b/packages/app/src/utils/error-messages.ts new file mode 100644 index 000000000..4ab0cf343 --- /dev/null +++ b/packages/app/src/utils/error-messages.ts @@ -0,0 +1,6 @@ +export function toErrorMessage(error: unknown): string { + if (error instanceof Error) { + return error.message; + } + return String(error); +} diff --git a/packages/app/src/utils/host-routes.test.ts b/packages/app/src/utils/host-routes.test.ts index 5b34e4e95..82f115837 100644 --- a/packages/app/src/utils/host-routes.test.ts +++ b/packages/app/src/utils/host-routes.test.ts @@ -2,6 +2,7 @@ import { describe, expect, it } from "vitest"; import { buildHostAgentDetailRoute, buildHostRootRoute, + buildHostWorkspaceOpenRoute, buildHostWorkspaceRoute, decodeFilePathFromPathSegment, decodeWorkspaceIdFromPathSegment, @@ -23,7 +24,12 @@ describe("parseHostAgentRouteFromPathname", () => { }); describe("workspace route parsing", () => { - it("encodes 
workspace IDs as base64url (no padding)", () => { + it("encodes numeric workspace IDs without base64", () => { + expect(encodeWorkspaceIdForPathSegment("164")).toBe("164"); + expect(decodeWorkspaceIdFromPathSegment("164")).toBe("164"); + }); + + it("encodes path-based workspace IDs as base64url (legacy)", () => { expect(encodeWorkspaceIdForPathSegment("/tmp/repo")).toBe("L3RtcC9yZXBv"); expect(decodeWorkspaceIdFromPathSegment("L3RtcC9yZXBv")).toBe("/tmp/repo"); }); @@ -40,7 +46,14 @@ describe("workspace route parsing", () => { expect(decodeFilePathFromPathSegment(encoded)).toBe("src/index.ts"); }); - it("parses workspace route", () => { + it("parses workspace route with numeric ID", () => { + expect(parseHostWorkspaceRouteFromPathname("/h/local/workspace/164")).toEqual({ + serverId: "local", + workspaceId: "164", + }); + }); + + it("parses workspace route with legacy base64 path", () => { expect(parseHostWorkspaceRouteFromPathname("/h/local/workspace/L3RtcC9yZXBv")).toEqual({ serverId: "local", workspaceId: "/tmp/repo", @@ -53,7 +66,11 @@ describe("workspace route parsing", () => { ).toBeNull(); }); - it("builds base64url workspace routes", () => { + it("builds numeric workspace routes without base64", () => { + expect(buildHostWorkspaceRoute("local", "164")).toBe("/h/local/workspace/164"); + }); + + it("builds base64url workspace routes for legacy paths", () => { expect(buildHostWorkspaceRoute("local", "/tmp/repo")).toBe("/h/local/workspace/L3RtcC9yZXBv"); }); @@ -64,7 +81,7 @@ describe("workspace route parsing", () => { it("parses workspace open intent from pathname query", () => { expect( parseHostWorkspaceOpenIntentFromPathname( - "/h/local/workspace/L3RtcC9yZXBv?open=agent%3Aagent-1", + "/h/local/workspace/164?open=agent%3Aagent-1", ), ).toEqual({ kind: "agent", @@ -82,11 +99,30 @@ describe("workspace route parsing", () => { kind: "file", path: "src/index.ts", }); + expect(parseWorkspaceOpenIntent("setup:L3RtcC9yZXBv")).toEqual({ + kind: "setup", + 
workspaceId: "/tmp/repo", + }); }); it("uses the plain workspace route when workspace context is provided", () => { - expect(buildHostAgentDetailRoute("local", "agent-1", "/tmp/repo")).toBe( - "/h/local/workspace/L3RtcC9yZXBv?open=agent%3Aagent-1", + expect(buildHostAgentDetailRoute("local", "agent-1", "164")).toBe( + "/h/local/workspace/164?open=agent%3Aagent-1", ); }); + + it("builds workspace routes with a one-shot open intent", () => { + expect(buildHostWorkspaceOpenRoute("local", "164", "draft:new")).toBe( + "/h/local/workspace/164?open=draft%3Anew", + ); + }); + + it("round-trips numeric IDs through encode/decode", () => { + const ids = ["1", "40", "164", "9999"]; + for (const id of ids) { + const encoded = encodeWorkspaceIdForPathSegment(id); + const decoded = decodeWorkspaceIdFromPathSegment(encoded); + expect(decoded).toBe(id); + } + }); }); diff --git a/packages/app/src/utils/host-routes.ts b/packages/app/src/utils/host-routes.ts index 6bffaada8..1dcd5aa60 100644 --- a/packages/app/src/utils/host-routes.ts +++ b/packages/app/src/utils/host-routes.ts @@ -99,7 +99,8 @@ export type WorkspaceOpenIntent = | { kind: "agent"; agentId: string } | { kind: "terminal"; terminalId: string } | { kind: "file"; path: string } - | { kind: "draft"; draftId: string }; + | { kind: "draft"; draftId: string } + | { kind: "setup"; workspaceId: string }; export function parseWorkspaceOpenIntent( value: string | null | undefined, @@ -136,6 +137,13 @@ export function parseWorkspaceOpenIntent( } return { kind: "file", path: decodedPath }; } + if (kind === "setup") { + const workspaceId = decodeWorkspaceIdFromPathSegment(payload); + if (!workspaceId) { + return null; + } + return { kind: "setup", workspaceId }; + } return null; } @@ -155,7 +163,13 @@ export function encodeWorkspaceIdForPathSegment(workspaceId: string): string { if (!normalized) { return ""; } - return toBase64UrlNoPad(normalizeWorkspaceId(normalized)); + // Numeric string IDs are URL-safe and don't need encoding. 
+ // Legacy path-based IDs still get base64-encoded for safety. + const id = normalizeWorkspaceId(normalized); + if (isPathLikeWorkspaceIdentity(id)) { + return toBase64UrlNoPad(id); + } + return encodeURIComponent(id); } export function decodeWorkspaceIdFromPathSegment(workspaceIdSegment: string): string | null { @@ -175,6 +189,12 @@ export function decodeWorkspaceIdFromPathSegment(workspaceIdSegment: string): st return normalizeWorkspaceId(decoded); } + // If the segment looks like a plain numeric ID, return it directly. + // Do NOT attempt base64 decode on short alphanumeric strings. + if (/^\d+$/.test(decoded)) { + return decoded; + } + const base64Decoded = tryDecodeBase64UrlNoPadUtf8(decoded); if (base64Decoded) { return normalizeWorkspaceId(base64Decoded); @@ -281,6 +301,19 @@ export function buildHostWorkspaceRoute(serverId: string, workspaceId: string) { return `/h/${encodeSegment(normalizedServerId)}/workspace/${encodeSegment(encodedWorkspaceId)}` as const; } +export function buildHostWorkspaceOpenRoute( + serverId: string, + workspaceId: string, + openIntent: string, +) { + const base = buildHostWorkspaceRoute(serverId, workspaceId); + const normalizedOpenIntent = trimNonEmpty(openIntent); + if (base === "/" || !normalizedOpenIntent) { + return base; + } + return `${base}?open=${encodeURIComponent(normalizedOpenIntent)}` as const; +} + export function buildHostAgentDetailRoute( serverId: string, agentId: string, @@ -292,11 +325,7 @@ export function buildHostAgentDetailRoute( if (!normalizedAgentId) { return "/" as const; } - const base = buildHostWorkspaceRoute(serverId, normalizedWorkspaceId); - if (base === "/") { - return "/" as const; - } - return `${base}?open=${encodeURIComponent(`agent:${normalizedAgentId}`)}` as const; + return buildHostWorkspaceOpenRoute(serverId, normalizedWorkspaceId, `agent:${normalizedAgentId}`); } const normalizedServerId = trimNonEmpty(serverId); const normalizedAgentId = trimNonEmpty(agentId); @@ -330,6 +359,23 @@ export 
function buildHostOpenProjectRoute(serverId: string) { return `${base}/open-project` as const; } +export function buildHostNewWorkspaceRoute( + serverId: string, + sourceDirectory: string, + options?: { displayName?: string }, +) { + const base = buildHostRootRoute(serverId); + if (base === "/") { + return "/" as const; + } + const params = new URLSearchParams(); + params.set("dir", sourceDirectory); + if (options?.displayName) { + params.set("name", options.displayName); + } + return `${base}/new?${params.toString()}` as const; +} + export function buildHostSettingsRoute(serverId: string) { const base = buildHostRootRoute(serverId); if (base === "/") { diff --git a/packages/app/src/utils/new-agent-routing.test.ts b/packages/app/src/utils/new-agent-routing.test.ts index b4d0cfa42..cf55d7099 100644 --- a/packages/app/src/utils/new-agent-routing.test.ts +++ b/packages/app/src/utils/new-agent-routing.test.ts @@ -10,8 +10,8 @@ import { describe("buildNewAgentRoute", () => { it("falls back to server workspace route with dot workspace when no working directory is provided", () => { - expect(buildNewAgentRoute("srv-1", undefined)).toBe("/h/srv-1/workspace/Lg"); - expect(buildNewAgentRoute("srv-1", " ")).toBe("/h/srv-1/workspace/Lg"); + expect(buildNewAgentRoute("srv-1", undefined)).toBe("/h/srv-1/workspace/."); + expect(buildNewAgentRoute("srv-1", " ")).toBe("/h/srv-1/workspace/."); }); it("encodes the working directory as a workspace path segment", () => { diff --git a/packages/app/src/utils/notification-routing.test.ts b/packages/app/src/utils/notification-routing.test.ts index ecd044a06..099b20ef1 100644 --- a/packages/app/src/utils/notification-routing.test.ts +++ b/packages/app/src/utils/notification-routing.test.ts @@ -28,6 +28,20 @@ describe("resolveNotificationTarget", () => { workspaceId: null, }); }); + + it("does not treat cwd as a workspace id alias", () => { + expect( + resolveNotificationTarget({ + serverId: "srv-1", + agentId: "agent-1", + cwd: "/tmp/repo",
+ }), + ).toEqual({ + serverId: "srv-1", + agentId: "agent-1", + workspaceId: null, + }); + }); }); describe("buildNotificationRoute", () => { diff --git a/packages/app/src/utils/notification-routing.ts b/packages/app/src/utils/notification-routing.ts index f87daee7e..1d4b0772b 100644 --- a/packages/app/src/utils/notification-routing.ts +++ b/packages/app/src/utils/notification-routing.ts @@ -23,7 +23,7 @@ export function resolveNotificationTarget(data: NotificationData): { return { serverId: readNonEmptyString(data, "serverId"), agentId: readNonEmptyString(data, "agentId"), - workspaceId: readNonEmptyString(data, "workspaceId") ?? readNonEmptyString(data, "cwd"), + workspaceId: readNonEmptyString(data, "workspaceId"), }; } diff --git a/packages/app/src/utils/sidebar-project-row-model.test.ts b/packages/app/src/utils/sidebar-project-row-model.test.ts index 04c8a3f10..e52f78e84 100644 --- a/packages/app/src/utils/sidebar-project-row-model.test.ts +++ b/packages/app/src/utils/sidebar-project-row-model.test.ts @@ -13,11 +13,14 @@ function workspace(overrides: Partial<SidebarWorkspaceEntry> = {}): SidebarWorkspaceEntry { return { workspaceKey: "srv:/repo", serverId: "srv", workspaceId: "/repo", - workspaceKind: "directory", + projectKind: "git", + workspaceKind: "checkout", name: "paseo", activityAt: null, statusBucket: "done", diffStat: null, + services: [], + hasRunningServices: false, ...overrides, }; } @@ -41,13 +44,13 @@ describe("buildSidebarProjectRowModel", () => { it("flattens non-git projects with one workspace into a direct workspace row model", () => { const flattenedWorkspace = workspace({ workspaceId: "/repo/non-git", - workspaceKind: "directory", + workspaceKind: "checkout", statusBucket: "running", }); const result = buildSidebarProjectRowModel({ project: project({ - projectKind: "non_git", + projectKind: "directory", workspaces: [flattenedWorkspace], }), collapsed: false, @@ -70,7 +73,7 @@ const result = buildSidebarProjectRowModel({ project: 
project({ - projectKind: "non_git", + projectKind: "directory", workspaces: [flattenedWorkspace], }), collapsed: false, @@ -87,25 +90,23 @@ describe("buildSidebarProjectRowModel", () => { }); }); - it("flattens git projects with a single workspace and keeps the new worktree action", () => { - const flattenedWorkspace = workspace({ + it("keeps single-workspace git projects as sections with the new worktree action", () => { + const onlyWorkspace = workspace({ workspaceId: "/repo/main", - workspaceKind: "local_checkout", + workspaceKind: "checkout", }); const result = buildSidebarProjectRowModel({ project: project({ projectKind: "git", - workspaces: [flattenedWorkspace], + workspaces: [onlyWorkspace], }), collapsed: true, }); expect(result).toEqual({ - kind: "workspace_link", - workspace: flattenedWorkspace, - selected: false, - chevron: null, + kind: "project_section", + chevron: "expand", trailingAction: "new_worktree", }); }); @@ -115,7 +116,7 @@ describe("buildSidebarProjectRowModel", () => { project: project({ projectKind: "git", workspaces: [ - workspace({ workspaceId: "/repo/main", workspaceKind: "local_checkout" }), + workspace({ workspaceId: "/repo/main", workspaceKind: "checkout" }), workspace({ workspaceId: "/repo/feature", workspaceKind: "worktree" }), ], }), @@ -131,12 +132,12 @@ describe("buildSidebarProjectRowModel", () => { }); describe("isSidebarProjectFlattened", () => { - it("returns true for single-workspace projects regardless of kind", () => { + it("returns true only for single-workspace directory projects", () => { expect( isSidebarProjectFlattened(project({ projectKind: "git", workspaces: [workspace()] })), - ).toBe(true); + ).toBe(false); expect( - isSidebarProjectFlattened(project({ projectKind: "non_git", workspaces: [workspace()] })), + isSidebarProjectFlattened(project({ projectKind: "directory", workspaces: [workspace()] })), ).toBe(true); }); diff --git a/packages/app/src/utils/sidebar-shortcuts.test.ts 
b/packages/app/src/utils/sidebar-shortcuts.test.ts index 113927177..928c898a3 100644 --- a/packages/app/src/utils/sidebar-shortcuts.test.ts +++ b/packages/app/src/utils/sidebar-shortcuts.test.ts @@ -11,11 +11,14 @@ function workspace(serverId: string, cwd: string): SidebarWorkspaceEntry { workspaceKey: `${serverId}:${cwd}`, serverId, workspaceId: cwd, - workspaceKind: "local_checkout", + projectKind: "git", + workspaceKind: "checkout", name: cwd, activityAt: null, statusBucket: "done", diffStat: null, + services: [], + hasRunningServices: false, }; } @@ -76,7 +79,7 @@ describe("buildSidebarShortcutModel", () => { expect(model.shortcutTargets[8]).toEqual({ serverId: "s", workspaceId: "/repo/w9" }); }); - it("ignores collapsed state for flattened single-workspace projects", () => { + it("respects collapsed state for single-workspace git projects", () => { const projects = [project("p1", [workspace("s1", "/repo/main")])]; const model = buildSidebarShortcutModel({ @@ -84,7 +87,7 @@ describe("buildSidebarShortcutModel", () => { collapsedProjectKeys: new Set(["p1"]), }); - expect(model.visibleTargets).toEqual([{ serverId: "s1", workspaceId: "/repo/main" }]); - expect(model.shortcutTargets).toEqual([{ serverId: "s1", workspaceId: "/repo/main" }]); + expect(model.visibleTargets).toEqual([]); + expect(model.shortcutTargets).toEqual([]); }); }); diff --git a/packages/app/src/utils/terminal-list.test.ts b/packages/app/src/utils/terminal-list.test.ts index b7b38e39f..6fe3b971f 100644 --- a/packages/app/src/utils/terminal-list.test.ts +++ b/packages/app/src/utils/terminal-list.test.ts @@ -50,4 +50,18 @@ describe("terminal-list", () => { { id: "term-2", name: "Renamed Terminal" }, ]); }); + + it("preserves terminal titles from create responses", () => { + const result = upsertTerminalListEntry({ + terminals: [], + terminal: { + id: "term-3", + name: "Terminal 3", + title: "Build Output", + cwd: "/tmp/project", + }, + }); + + expect(result).toEqual([{ id: "term-3", name: 
"Terminal 3", title: "Build Output" }]); + }); }); diff --git a/packages/app/src/utils/terminal-list.ts b/packages/app/src/utils/terminal-list.ts index eb7699ce4..99f7ad41a 100644 --- a/packages/app/src/utils/terminal-list.ts +++ b/packages/app/src/utils/terminal-list.ts @@ -7,6 +7,7 @@ function toTerminalListEntry(input: { terminal: CreatedTerminal }): TerminalList return { id: input.terminal.id, name: input.terminal.name, + ...(input.terminal.title ? { title: input.terminal.title } : {}), }; } diff --git a/packages/app/src/utils/test-daemon-connection.test.ts b/packages/app/src/utils/test-daemon-connection.test.ts index d0b148575..b33819c81 100644 --- a/packages/app/src/utils/test-daemon-connection.test.ts +++ b/packages/app/src/utils/test-daemon-connection.test.ts @@ -59,6 +59,19 @@ vi.mock("./client-id", () => ({ getOrCreateClientId: clientIdMock.getOrCreateClientId, })); +vi.mock("@/desktop/daemon/desktop-daemon-transport", () => ({ + createDesktopLocalDaemonTransportFactory: vi.fn(() => null), + buildLocalDaemonTransportUrl: vi.fn( + ({ + transportType, + transportPath, + }: { + transportType: "socket" | "pipe"; + transportPath: string; + }) => `paseo+local://${transportType}?path=${encodeURIComponent(transportPath)}`, + ), +})); + describe("test-daemon-connection connectToDaemon", () => { beforeEach(() => { daemonClientMock.createdConfigs.length = 0; diff --git a/packages/app/src/utils/tool-call-display.test.ts b/packages/app/src/utils/tool-call-display.test.ts index 3854da586..5c14ca365 100644 --- a/packages/app/src/utils/tool-call-display.test.ts +++ b/packages/app/src/utils/tool-call-display.test.ts @@ -96,6 +96,7 @@ describe("tool-call-display", () => { index: 1, command: "npm install", cwd: "/tmp/repo/.paseo/worktrees/repo/branch", + log: "", status: "running", exitCode: null, }, @@ -153,7 +154,7 @@ describe("tool-call-display", () => { }); expect(display).toEqual({ - displayName: "Interacted with terminal", + displayName: "Terminal", }); }); @@ -170,7 
+171,7 @@ describe("tool-call-display", () => { }); expect(display).toEqual({ - displayName: "Interacted with terminal", + displayName: "Terminal", summary: "npm run test", }); }); diff --git a/packages/app/src/utils/workspace-archive-navigation.test.ts b/packages/app/src/utils/workspace-archive-navigation.test.ts index eeaba5bca..a307c5376 100644 --- a/packages/app/src/utils/workspace-archive-navigation.test.ts +++ b/packages/app/src/utils/workspace-archive-navigation.test.ts @@ -13,19 +13,21 @@ function workspace( projectId: input.projectId ?? "project-1", projectDisplayName: input.projectDisplayName ?? "Project", projectRootPath: input.projectRootPath ?? "/repo", + workspaceDirectory: input.workspaceDirectory ?? input.projectRootPath ?? "/repo", projectKind: input.projectKind ?? "git", workspaceKind: input.workspaceKind ?? "worktree", name: input.name ?? input.id, - status: input.status ?? "done", + status: input.status ?? "running", activityAt: input.activityAt ?? null, diffStat: input.diffStat ?? null, + services: input.services ?? 
[], }; } describe("resolveWorkspaceArchiveRedirectWorkspaceId", () => { it("redirects an archived worktree to the visible local checkout for the same project", () => { const workspaces = [ - workspace({ id: "/repo", workspaceKind: "local_checkout", name: "main" }), + workspace({ id: "/repo", workspaceKind: "checkout", name: "main" }), workspace({ id: "/repo/.paseo/worktrees/feature", name: "feature" }), ]; @@ -37,7 +39,7 @@ describe("resolveWorkspaceArchiveRedirectWorkspaceId", () => { ).toBe("/repo"); }); - it("falls back to the project root path when the root checkout is not in the visible workspace list", () => { + it("falls back to the host root route when no sibling workspace target exists", () => { const workspaces = [ workspace({ id: "/repo/.paseo/worktrees/feature", @@ -47,11 +49,12 @@ describe("resolveWorkspaceArchiveRedirectWorkspaceId", () => { ]; expect( - resolveWorkspaceArchiveRedirectWorkspaceId({ + buildWorkspaceArchiveRedirectRoute({ + serverId: "server-1", archivedWorkspaceId: "/repo/.paseo/worktrees/feature", workspaces, }), - ).toBe("/repo"); + ).toBe("/h/server-1"); }); it("falls back to the host root route when no alternate workspace target exists", () => { @@ -60,8 +63,8 @@ describe("resolveWorkspaceArchiveRedirectWorkspaceId", () => { id: "/notes", projectId: "notes", projectRootPath: "/notes", - projectKind: "non_git", - workspaceKind: "directory", + projectKind: "directory", + workspaceKind: "checkout", }), ]; diff --git a/packages/app/src/utils/workspace-archive-navigation.ts b/packages/app/src/utils/workspace-archive-navigation.ts index d4e7bf042..d0895e863 100644 --- a/packages/app/src/utils/workspace-archive-navigation.ts +++ b/packages/app/src/utils/workspace-archive-navigation.ts @@ -1,20 +1,14 @@ import type { WorkspaceDescriptor } from "@/stores/session-store"; import { buildHostRootRoute, buildHostWorkspaceRoute } from "@/utils/host-routes"; - -function trimNonEmpty(value: string | null | undefined): string | null { - if (typeof 
value !== "string") { - return null; - } - - const trimmed = value.trim(); - return trimmed.length > 0 ? trimmed : null; -} +import { resolveWorkspaceRouteId } from "@/utils/workspace-execution"; export function resolveWorkspaceArchiveRedirectWorkspaceId(input: { archivedWorkspaceId: string; workspaces: Iterable; }): string | null { - const archivedWorkspaceId = trimNonEmpty(input.archivedWorkspaceId); + const archivedWorkspaceId = resolveWorkspaceRouteId({ + routeWorkspaceId: input.archivedWorkspaceId, + }); if (!archivedWorkspaceId) { return null; } @@ -38,11 +32,6 @@ export function resolveWorkspaceArchiveRedirectWorkspaceId(input: { return rootCheckoutWorkspace.id; } - const fallbackProjectRootPath = trimNonEmpty(archivedWorkspace.projectRootPath); - if (fallbackProjectRootPath && fallbackProjectRootPath !== archivedWorkspace.id) { - return fallbackProjectRootPath; - } - const siblingWorkspace = sameProjectWorkspaces.find((workspace) => workspace.id !== archivedWorkspace.id) ?? null; return siblingWorkspace?.id ?? null; diff --git a/packages/app/src/utils/workspace-execution.test.ts b/packages/app/src/utils/workspace-execution.test.ts new file mode 100644 index 000000000..019ea4a78 --- /dev/null +++ b/packages/app/src/utils/workspace-execution.test.ts @@ -0,0 +1,177 @@ +import { describe, expect, it } from "vitest"; +import type { WorkspaceDescriptor } from "@/stores/session-store"; +import { + getWorkspaceExecutionAuthority, + requireWorkspaceExecutionAuthority, + resolveWorkspaceMapKeyByIdentity, + resolveWorkspaceIdByExecutionDirectory, + resolveWorkspaceRouteId, +} from "./workspace-execution"; + +function createWorkspace( + input: Partial & Pick, +): WorkspaceDescriptor { + return { + id: input.id, + projectId: input.projectId ?? "project-1", + projectDisplayName: input.projectDisplayName ?? "Project", + projectRootPath: input.projectRootPath ?? "/repo", + workspaceDirectory: input.workspaceDirectory ?? "/repo", + projectKind: input.projectKind ?? 
"git", + workspaceKind: input.workspaceKind ?? "checkout", + name: input.name ?? "main", + status: input.status ?? "running", + activityAt: input.activityAt ?? null, + diffStat: input.diffStat ?? null, + services: input.services ?? [], + }; +} + +describe("resolveWorkspaceRouteId", () => { + it("normalizes route workspace ids", () => { + expect(resolveWorkspaceRouteId({ routeWorkspaceId: " /tmp/repo/ " })).toBe("/tmp/repo"); + }); + + it("returns null for empty values", () => { + expect(resolveWorkspaceRouteId({ routeWorkspaceId: " " })).toBeNull(); + }); +}); + +describe("resolveWorkspaceIdByExecutionDirectory", () => { + it("matches workspace directories", () => { + const workspaces = [ + createWorkspace({ + id: "workspace-1", + projectRootPath: "/repo", + workspaceDirectory: "/repo/.paseo/worktrees/feature", + }), + ]; + + expect( + resolveWorkspaceIdByExecutionDirectory({ + workspaces, + workspaceDirectory: "/repo/.paseo/worktrees/feature", + }), + ).toBe("workspace-1"); + }); + + it("does not match project root metadata", () => { + const workspaces = [ + createWorkspace({ + id: "workspace-1", + projectRootPath: "/repo", + workspaceDirectory: "/repo/.paseo/worktrees/feature", + }), + ]; + + expect( + resolveWorkspaceIdByExecutionDirectory({ + workspaces, + workspaceDirectory: "/repo", + }), + ).toBeNull(); + }); +}); + +describe("resolveWorkspaceMapKeyByIdentity", () => { + it("returns the existing map key when the identity already matches a key", () => { + const workspaces = new Map([ + [ + "workspace-1", + createWorkspace({ + id: "workspace-1", + workspaceDirectory: "/repo/.paseo/worktrees/feature", + }), + ], + ]); + + expect( + resolveWorkspaceMapKeyByIdentity({ + workspaces, + workspaceIdentity: "workspace-1", + }), + ).toBe("workspace-1"); + }); + + it("resolves a workspace directory identity to the canonical map key", () => { + const workspaces = new Map([ + [ + "workspace-1", + createWorkspace({ + id: "workspace-1", + workspaceDirectory: 
"C:\\repo\\feature\\", + }), + ], + ]); + + expect( + resolveWorkspaceMapKeyByIdentity({ + workspaces, + workspaceIdentity: "C:/repo/feature", + }), + ).toBe("workspace-1"); + }); +}); + +describe("workspace execution authority", () => { + it("returns an explicit failure when workspace id is missing", () => { + expect( + getWorkspaceExecutionAuthority({ + workspaces: new Map(), + workspaceId: null, + }), + ).toEqual({ + ok: false, + reason: "workspace_id_missing", + message: "Workspace id is required.", + }); + }); + + it("returns an explicit failure when workspace directory is missing", () => { + const workspaces = new Map([ + [ + "workspace-1", + createWorkspace({ + id: "workspace-1", + workspaceDirectory: " ", + projectRootPath: "/repo", + }), + ], + ]); + + expect( + getWorkspaceExecutionAuthority({ + workspaces, + workspaceId: "workspace-1", + }), + ).toEqual({ + ok: false, + reason: "workspace_directory_missing", + message: "Workspace directory is missing for workspace workspace-1", + }); + }); + + it("never falls back to project root metadata", () => { + const workspaces = new Map([ + [ + "workspace-1", + createWorkspace({ + id: "workspace-1", + projectRootPath: "/repo", + workspaceDirectory: "/repo/.paseo/worktrees/feature", + }), + ], + ]); + + expect( + requireWorkspaceExecutionAuthority({ + workspaces, + workspaceId: "workspace-1", + }), + ).toEqual({ + workspaceId: "workspace-1", + workspaceDirectory: "/repo/.paseo/worktrees/feature", + workspace: workspaces.get("workspace-1"), + }); + }); +}); diff --git a/packages/app/src/utils/workspace-execution.ts b/packages/app/src/utils/workspace-execution.ts new file mode 100644 index 000000000..d1a2ca80c --- /dev/null +++ b/packages/app/src/utils/workspace-execution.ts @@ -0,0 +1,210 @@ +import type { WorkspaceDescriptor } from "@/stores/session-store"; +import { normalizeWorkspaceIdentity } from "@/utils/workspace-identity"; + +export type WorkspaceAuthorityResult = { + workspaceId: string; + 
workspaceDirectory: string; + workspace: WorkspaceDescriptor; +}; + +export type WorkspaceExecutionAuthorityFailureReason = + | "workspace_id_missing" + | "workspace_missing" + | "workspace_directory_missing"; + +export type WorkspaceExecutionAuthorityResult = + | { ok: true; authority: WorkspaceAuthorityResult } + | { + ok: false; + reason: WorkspaceExecutionAuthorityFailureReason; + message: string; + }; + +export function resolveWorkspaceRouteId(input: { + routeWorkspaceId: string | null | undefined; +}): string | null { + return normalizeWorkspaceIdentity(input.routeWorkspaceId); +} + +export function resolveWorkspaceIdByExecutionDirectory(input: { + workspaces: Iterable<WorkspaceDescriptor> | null | undefined; + workspaceDirectory: string | null | undefined; +}): string | null { + const normalizedWorkspaceDirectory = normalizeWorkspaceIdentity(input.workspaceDirectory); + if (!normalizedWorkspaceDirectory) { + return null; + } + + for (const workspace of input.workspaces ?? []) { + if (normalizeWorkspaceIdentity(workspace.workspaceDirectory) === normalizedWorkspaceDirectory) { + return workspace.id; + } + } + + return null; +} + +export function resolveWorkspaceMapKeyByIdentity(input: { + workspaces: Map<string, WorkspaceDescriptor> | null | undefined; + workspaceIdentity: string | null | undefined; +}): string | null { + const normalizedWorkspaceIdentity = normalizeWorkspaceIdentity(input.workspaceIdentity); + if (!normalizedWorkspaceIdentity) { + return null; + } + + const workspaces = input.workspaces; + if (!workspaces) { + return null; + } + + if (workspaces.has(normalizedWorkspaceIdentity)) { + return normalizedWorkspaceIdentity; + } + + for (const [workspaceKey, workspace] of workspaces) { + if ( + normalizeWorkspaceIdentity(workspace.id) === normalizedWorkspaceIdentity || + normalizeWorkspaceIdentity(workspace.workspaceDirectory) === normalizedWorkspaceIdentity + ) { + return workspaceKey; + } + } + + return null; +} + +export function getWorkspaceExecutionAuthority( + input: + | { + workspace: 
WorkspaceDescriptor | null | undefined; + } + | { + workspaces: Map<string, WorkspaceDescriptor> | undefined; + workspaceId: string | null | undefined; + }, +): WorkspaceExecutionAuthorityResult { + const workspace = + "workspace" in input + ? input.workspace + : (() => { + const workspaceKey = resolveWorkspaceMapKeyByIdentity({ + workspaces: input.workspaces, + workspaceIdentity: input.workspaceId, + }); + if (!workspaceKey) { + return null; + } + return input.workspaces?.get(workspaceKey) ?? null; + })(); + + if ("workspaces" in input) { + const normalizedWorkspaceId = normalizeWorkspaceIdentity(input.workspaceId); + if (!normalizedWorkspaceId) { + return { + ok: false, + reason: "workspace_id_missing", + message: "Workspace id is required.", + }; + } + } + + if (!workspace) { + return { + ok: false, + reason: "workspace_missing", + message: + "workspaces" in input + ? `Workspace not found: ${String(input.workspaceId ?? "")}` + : "Workspace not found.", + }; + } + + const workspaceDirectory = normalizeWorkspaceIdentity(workspace.workspaceDirectory); + if (!workspaceDirectory) { + return { + ok: false, + reason: "workspace_directory_missing", + message: `Workspace directory is missing for workspace ${workspace.id}`, + }; + } + + return { + ok: true, + authority: { + workspaceId: workspace.id, + workspaceDirectory, + workspace, + }, + }; +} + +export function requireWorkspaceExecutionAuthority( + input: + | { + workspace: WorkspaceDescriptor | null | undefined; + } + | { + workspaces: Map<string, WorkspaceDescriptor> | undefined; + workspaceId: string | null | undefined; + }, +): WorkspaceAuthorityResult { + const result = getWorkspaceExecutionAuthority(input); + if (!result.ok) { + throw new Error(result.message); + } + return result.authority; +} + +export function requireWorkspaceRecordId(workspaceId: string): number { + const normalizedWorkspaceId = normalizeWorkspaceIdentity(workspaceId); + if (!normalizedWorkspaceId) { + throw new Error("Workspace ID is required"); + } + + const parsedWorkspaceId = 
Number(normalizedWorkspaceId); + if (!Number.isInteger(parsedWorkspaceId)) { + throw new Error(`Workspace ID is not a persisted record ID: ${workspaceId}`); + } + + return parsedWorkspaceId; +} + +export function resolveWorkspaceExecutionDirectory(input: { + workspaceDirectory: string | null | undefined; +}): string | null { + return normalizeWorkspaceIdentity(input.workspaceDirectory); +} + +export function requireWorkspaceExecutionDirectory(input: { + workspaceId?: string; + workspaceDirectory: string | null | undefined; +}): string { + const workspaceDirectory = resolveWorkspaceExecutionDirectory({ + workspaceDirectory: input.workspaceDirectory, + }); + if (!workspaceDirectory) { + throw new Error( + input.workspaceId + ? `Workspace directory is missing for workspace ${input.workspaceId}` + : "Workspace directory is missing.", + ); + } + return workspaceDirectory; +} + +export function resolveWorkspaceExecutionAuthority( + input: + | { + workspace: WorkspaceDescriptor | null | undefined; + } + | { + workspaces: Map<string, WorkspaceDescriptor> | undefined; + workspaceId: string | null | undefined; + }, +): WorkspaceAuthorityResult | null { + const result = getWorkspaceExecutionAuthority(input); + return result.ok ? result.authority : null; +} + +export const parseWorkspaceRecordId = requireWorkspaceRecordId; diff --git a/packages/app/src/utils/workspace-navigation.test.ts b/packages/app/src/utils/workspace-navigation.test.ts new file mode 100644 index 000000000..3ce623701 --- /dev/null +++ b/packages/app/src/utils/workspace-navigation.test.ts @@ -0,0 +1,52 @@ +import { beforeEach, describe, expect, it, vi } from "vitest"; + +vi.mock("expo-router", () => ({ + router: { + navigate: vi.fn(), + replace: vi.fn(), + }, +})); + +vi.mock("@react-native-async-storage/async-storage", () => { + const storage = new Map(); + return { + default: { + getItem: vi.fn(async (key: string) => storage.get(key) ?? 
null), + setItem: vi.fn(async (key: string, value: string) => { + storage.set(key, value); + }), + removeItem: vi.fn(async (key: string) => { + storage.delete(key); + }), + }, + }; +}); + +import { useWorkspaceLayoutStore } from "@/stores/workspace-layout-store"; +import { prepareWorkspaceTab } from "@/utils/workspace-navigation"; + +const SERVER_ID = "server-1"; +const WORKSPACE_ID = "/repo/worktree"; +const AGENT_ID = "agent-1"; + +describe("prepareWorkspaceTab", () => { + beforeEach(() => { + useWorkspaceLayoutStore.setState({ + layoutByWorkspace: {}, + splitSizesByWorkspace: {}, + pinnedAgentIdsByWorkspace: {}, + }); + }); + + it("opens and focuses an agent tab", () => { + const route = prepareWorkspaceTab({ + serverId: SERVER_ID, + workspaceId: WORKSPACE_ID, + target: { kind: "agent", agentId: AGENT_ID }, + }); + + expect(route).toBe("/h/server-1/workspace/L3JlcG8vd29ya3RyZWU"); + const key = "server-1:/repo/worktree"; + expect(useWorkspaceLayoutStore.getState().getWorkspaceTabs(key)).toHaveLength(1); + }); +}); diff --git a/packages/app/src/utils/workspace-navigation.ts b/packages/app/src/utils/workspace-navigation.ts index cf1236c28..bad58ed5e 100644 --- a/packages/app/src/utils/workspace-navigation.ts +++ b/packages/app/src/utils/workspace-navigation.ts @@ -1,3 +1,4 @@ +import { router } from "expo-router"; import { useWorkspaceLayoutStore } from "@/stores/workspace-layout-store"; import { generateDraftId } from "@/stores/draft-keys"; import { @@ -13,6 +14,10 @@ interface PrepareWorkspaceTabInput { pin?: boolean; } +interface NavigateToPreparedWorkspaceTabInput extends PrepareWorkspaceTabInput { + navigationMethod?: "navigate" | "replace"; +} + function getPreparedTarget(target: WorkspaceTabTarget): WorkspaceTabTarget { if (target.kind !== "draft" || target.draftId.trim() !== "new") { return target; @@ -40,3 +45,13 @@ export function prepareWorkspaceTab(input: PrepareWorkspaceTabInput) { return buildHostWorkspaceRoute(input.serverId, input.workspaceId); } + 
+export function navigateToPreparedWorkspaceTab(input: NavigateToPreparedWorkspaceTabInput): string { + const route = prepareWorkspaceTab(input); + if (input.navigationMethod === "replace") { + router.replace(route as any); + } else { + router.navigate(route as any); + } + return route; +} diff --git a/packages/app/src/utils/workspace-tab-identity.ts b/packages/app/src/utils/workspace-tab-identity.ts index 9ea9ffd7f..5de7dcff2 100644 --- a/packages/app/src/utils/workspace-tab-identity.ts +++ b/packages/app/src/utils/workspace-tab-identity.ts @@ -22,6 +22,10 @@ export function normalizeWorkspaceTabTarget( const path = trimNonEmpty(value.path); return path ? { kind: "file", path: path.replace(/\\/g, "/") } : null; } + if (value.kind === "setup") { + const workspaceId = trimNonEmpty(value.workspaceId); + return workspaceId ? { kind: "setup", workspaceId: workspaceId.replace(/\\/g, "/") } : null; + } return null; } @@ -44,6 +48,9 @@ export function workspaceTabTargetsEqual( if (left.kind === "file" && right.kind === "file") { return left.path === right.path; } + if (left.kind === "setup" && right.kind === "setup") { + return left.workspaceId === right.workspaceId; + } return false; } @@ -57,6 +64,9 @@ export function buildDeterministicWorkspaceTabId(target: WorkspaceTabTarget): st if (target.kind === "terminal") { return `terminal_${target.terminalId}`; } + if (target.kind === "setup") { + return `setup_${target.workspaceId}`; + } return `file_${target.path}`; } diff --git a/packages/app/src/voice/voice-runtime.test.ts b/packages/app/src/voice/voice-runtime.test.ts index 76b77d3d2..5afae5101 100644 --- a/packages/app/src/voice/voice-runtime.test.ts +++ b/packages/app/src/voice/voice-runtime.test.ts @@ -153,6 +153,7 @@ describe("voice runtime", () => { await runtime.startVoice("server-1", "agent-1"); runtime.onTurnEvent("server-1", "agent-1", "turn_started"); + vi.mocked(engine.play).mockClear(); runtime.handleAudioOutput( "server-1", @@ -204,6 +205,8 @@ describe("voice 
runtime", () => { await runtime.startVoice("server-1", "agent-1"); runtime.onTurnEvent("server-1", "agent-1", "turn_started"); + vi.mocked(engine.play).mockClear(); + playResolvers.length = 0; runtime.handleAudioOutput( "server-1", @@ -286,6 +289,8 @@ describe("voice runtime", () => { expect(runtime.getSnapshot().phase).toBe("waiting"); expect(engine.play).toHaveBeenCalledTimes(1); + vi.mocked(engine.stop).mockClear(); + vi.mocked(engine.clearQueue).mockClear(); runtime.handleCaptureVolume(REALTIME_VOICE_VAD_CONFIG.volumeThreshold + 0.05); runtime.handleCaptureVolume(0); @@ -325,6 +330,8 @@ describe("voice runtime", () => { await runtime.startVoice("server-1", "agent-1"); runtime.onTurnEvent("server-1", "agent-1", "turn_started"); runtime.onAssistantAudioStarted("server-1"); + vi.mocked(engine.stop).mockClear(); + vi.mocked(engine.clearQueue).mockClear(); runtime.handleCaptureVolume(0.5); expect(runtime.getTelemetrySnapshot().isSpeaking).toBe(false); diff --git a/packages/app/src/voice/voice-runtime.ts b/packages/app/src/voice/voice-runtime.ts index dd56b4712..30c48cfa4 100644 --- a/packages/app/src/voice/voice-runtime.ts +++ b/packages/app/src/voice/voice-runtime.ts @@ -436,6 +436,7 @@ export function createVoiceRuntime(deps: VoiceRuntimeDeps): VoiceRuntime { clearTimeout(cue.timeout); cue.timeout = null; } + cue.playing = false; if (hadActive) { deps.engine.stop(); deps.engine.clearQueue(); @@ -865,6 +866,8 @@ export function createVoiceRuntime(deps: VoiceRuntimeDeps): VoiceRuntime { } if (state.turnInProgress) { + patchSnapshot((prev) => ({ ...prev, phase: "waiting" })); + reconcileCue(); return; } @@ -897,9 +900,15 @@ export function createVoiceRuntime(deps: VoiceRuntimeDeps): VoiceRuntime { state.serverSpeechDetected = isSpeaking; state.serverSpeechStartedAt = isSpeaking ? (state.serverSpeechStartedAt ?? 
Date.now()) : null; if (isSpeaking) { + const shouldInterruptPlayback = + state.snapshot.phase === "playing" || playback.groups.size > 0; + const hadCue = cue.active || cue.timeout !== null || cue.playing; resetPlaybackState(); - deps.engine.stop(); - deps.engine.clearQueue(); + stopCue(); + if (shouldInterruptPlayback && !hadCue) { + deps.engine.stop(); + deps.engine.clearQueue(); + } getActiveSession()?.adapter.setAssistantAudioPlaying(false); } patchTelemetry((prev) => ({ diff --git a/packages/cli/src/commands/agent/run.ts b/packages/cli/src/commands/agent/run.ts index c4817076b..fd7f90604 100644 --- a/packages/cli/src/commands/agent/run.ts +++ b/packages/cli/src/commands/agent/run.ts @@ -193,7 +193,6 @@ export async function resolveStructuredResponseMessage(options: { try { const timeline = await options.client.fetchAgentTimeline(options.agentId, { direction: "tail", - projection: "projected", limit: 200, }); for (let index = timeline.entries.length - 1; index >= 0; index -= 1) { diff --git a/packages/cli/src/commands/chat/post.ts b/packages/cli/src/commands/chat/post.ts index 32ff0f972..03339d160 100644 --- a/packages/cli/src/commands/chat/post.ts +++ b/packages/cli/src/commands/chat/post.ts @@ -3,7 +3,6 @@ import type { SingleResult } from "../../output/index.js"; import { attachAgentNamesToMessages, connectChatClient, - resolveChatAuthorAgentId, toChatCommandError, type ChatCommandOptions, } from "./shared.js"; @@ -24,7 +23,6 @@ export async function runPostCommand( const payload = await client.postChatMessage({ room, body, - authorAgentId: resolveChatAuthorAgentId(), replyToMessageId: options.replyTo, }); const [message] = await attachAgentNamesToMessages(client, [toChatMessageRow(payload.message!)]); diff --git a/packages/cli/src/commands/provider/ls.ts b/packages/cli/src/commands/provider/ls.ts index 7e8bcfbb5..61b0803f9 100644 --- a/packages/cli/src/commands/provider/ls.ts +++ b/packages/cli/src/commands/provider/ls.ts @@ -14,8 +14,8 @@ export interface 
ProviderListItem { const PROVIDERS: ProviderListItem[] = AGENT_PROVIDER_DEFINITIONS.map((def) => ({ provider: def.id, status: "available", - defaultMode: def.defaultModeId ?? "default", - modes: def.modes.map((m) => m.label).join(", "), + defaultMode: def.defaultModeId ?? "-", + modes: def.modes.length > 0 ? def.modes.map((m) => m.label).join(", ") : "-", })); /** Schema for provider ls output */ diff --git a/packages/cli/src/utils/timeline.ts b/packages/cli/src/utils/timeline.ts index e49b0244e..2390b00a5 100644 --- a/packages/cli/src/utils/timeline.ts +++ b/packages/cli/src/utils/timeline.ts @@ -11,7 +11,6 @@ export async function fetchProjectedTimelineItems( const timeline = await input.client.fetchAgentTimeline(input.agentId, { direction: "tail", limit: 0, - projection: "projected", }); return timeline.entries.map((entry) => entry.item); } diff --git a/packages/highlight/package.json b/packages/highlight/package.json index 9b2dc37d7..de2a4ec95 100644 --- a/packages/highlight/package.json +++ b/packages/highlight/package.json @@ -7,6 +7,13 @@ }, "main": "./dist/index.js", "types": "./dist/index.d.ts", + "exports": { + ".": { + "types": "./dist/index.d.ts", + "source": "./src/index.ts", + "default": "./dist/index.js" + } + }, "files": [ "dist" ], diff --git a/packages/server/CLAUDE.md b/packages/server/CLAUDE.md index 16cea533f..3f433621e 100644 --- a/packages/server/CLAUDE.md +++ b/packages/server/CLAUDE.md @@ -1 +1,177 @@ -See the repository-level instructions in `../../CLAUDE.md`. +# CLAUDE.md — Paseo Server Development Guide + +For AI coding agents working in `packages/server`. Supplements [CLAUDE.md](../../CLAUDE.md) at the repo root. + +## Project Overview + +Paseo is a mobile + CLI app for monitoring and controlling local AI coding agents (Claude Code, Codex, OpenCode). The daemon runs on your machine, manages agent processes, and streams their output over WebSocket to clients. 
+ +--- + +## Build / Lint / Test Commands + +### Root (monorepo) +```bash +npm run dev # Start daemon + Expo in Tmux +npm run build:daemon # Build: highlight + relay + server + cli +npm run typecheck # Typecheck all packages +npm run test # Test all packages +npm run format # Format with Biome (in-place) +``` + +### Server package (`packages/server`) +```bash +npm run dev # Start dev daemon (tsx watch) +npm run build # Build lib + scripts to dist/ +npm run start # Run production daemon from dist/ +npm run typecheck # Typecheck server source + +# Run a SINGLE test file +npx vitest run src/server/agent/agent-manager.test.ts --reporter=verbose + +# Run a SINGLE test by name +npx vitest run -t "returns timeout error when provider times out" + +# Test categories +npm run test:unit # Unit tests only (excludes e2e) +npm run test:integration # Integration tests +npm run test:integration:all # All integration tests +npm run test:integration:real # Real API integration tests +npm run test:integration:local # Local integration tests +npm run test:e2e # End-to-end tests (excludes real/local) +npm run test:e2e:all # All e2e tests +npm run test:watch # Watch mode +npm run test:ui # Vitest UI at localhost:51204 +``` + +### Other useful commands +```bash +npm run build --workspace=@getpaseo/relay # Rebuild relay before daemon +npm run build --workspace=@getpaseo/server # Rebuild server +npm run db:query -- "SELECT ..." 
# Run arbitrary SQL +npm run cli -- ls -a -g # List agents +npm run cli -- daemon status # Check daemon status +``` + +--- + +## Code Style + +### Biome (formatting only, no linting) +```json +{ "indentStyle": "space", "indentWidth": 2, "lineWidth": 100, "quoteStyle": "double", "trailingCommas": "all", "semicolons": "always" } +``` + +### TypeScript +- **Fully strict** — no `any`, no implicit `any` +- **`interface` over `type`** when possible +- **`function` declarations** over arrow function assignments +- **Named types** — no complex inline types in public signatures +- **Object parameters** — use single object param when >1 argument +- **Infer from Zod schemas** — `z.infer` instead of hand-written types +- `noUnusedLocals: true`, `noUnusedParameters: true`, `noFallthroughCasesInSwitch: true` + +### Imports +- Use path alias `@server/*` in server package (maps to `./src/`) +- No barrel `index.ts` re-exports — they create unnecessary indirection + +### Naming +- Files: `kebab-case.ts` named after the main export (`create-tool-call.ts`) +- Tests: collocated with implementation (`thing.test.ts`) +- No prefixes like `RpcX`, `DbX`, `UiX` — keep one canonical type per concept + +### Error Handling +- **Fail explicitly** — throw instead of silently returning defaults +- **Typed domain errors** — extend `Error` with structured metadata + +```typescript +class TimeoutError extends Error { + constructor( + public readonly operation: string, + public readonly waitedMs: number, + ) { + super(`${operation} timed out after ${waitedMs}ms`); + this.name = "TimeoutError"; + } +} +``` + +### State Design +Discriminated unions over bags of booleans/optionals: +```typescript +// Bad +interface FetchState { isLoading: boolean; error?: Error; data?: Data } + +// Good +type FetchState = + | { status: "idle" } + | { status: "loading" } + | { status: "error"; error: Error } + | { status: "success"; data: Data }; +``` + +--- + +## Testing Philosophy + +Tests prove behavior, not 
structure. Every test should answer: "what user-visible or API-visible behavior does this verify?" + +- **TDD**: Work in vertical slices — one test, one implementation, repeat +- **Determinism first**: No conditional assertions, no timing/randomness, no weak assertions +- **Real deps over mocks**: Database, APIs, file system — real in tests +- **Flaky tests are a bug**: Never remove a test because it's flaky; fix the variance source + +--- + +## Critical Rules + +1. **NEVER restart the daemon on port 6767** — it kills your own process +2. **NEVER assume timeouts need a restart** — they can be transient +3. **Always run `npm run typecheck` after changes** +4. **NEVER add auth checks to tests** — agent providers handle their own auth +5. **NEVER make breaking WebSocket/message schema changes** — always backward-compatible + +--- + +## Architecture Quick Reference + +``` +packages/server/src/ +├── server/ +│ ├── index.ts # Entry point +│ ├── bootstrap.ts # Daemon initialization +│ ├── websocket-server.ts # WS connection management +│ ├── session.ts # Per-client session state +│ └── agent/ +│ ├── agent-manager.ts # Agent lifecycle state machine +│ └── agent-storage.ts # File-backed JSON persistence +├── providers/ # Claude, Codex, OpenCode adapters +├── relay-transport.ts # Outbound relay connection +└── client/daemon-client.ts # Client library for daemon connection +``` + +Agent state persists to `$PASEO_HOME/agents/{cwd-with-dashes}/{agent-id}.json` +Daemon logs: `$PASEO_HOME/daemon.log` + +--- + +## Debugging + +```bash +tail -f $PASEO_HOME/daemon.log # Daemon logs +npm run test:ui # Vitest browser UI at localhost:51204 +npm run cli -- inspect # Detailed agent info +npm run db:query -- "SELECT * FROM agent_timeline_rows..." 
+``` + +--- + +## Relevant Docs + +| File | What it covers | +|---|---| +| [../../CLAUDE.md](../../CLAUDE.md) | Repository overview, critical rules, quick start | +| [../../docs/ARCHITECTURE.md](../../docs/ARCHITECTURE.md) | System design, WebSocket protocol, data flow | +| [../../docs/CODING_STANDARDS.md](../../docs/CODING_STANDARDS.md) | Type hygiene, error handling, React patterns | +| [../../docs/TESTING.md](../../docs/TESTING.md) | TDD workflow, determinism, real deps over mocks | +| [../../SECURITY.md](../../SECURITY.md) | Relay threat model, E2E encryption | diff --git a/packages/server/drizzle.config.ts b/packages/server/drizzle.config.ts new file mode 100644 index 000000000..9920fcb16 --- /dev/null +++ b/packages/server/drizzle.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "drizzle-kit"; + +export default defineConfig({ + schema: "./packages/server/src/server/db/schema.ts", + out: "./packages/server/src/server/db/migrations", + dialect: "sqlite", + strict: true, + verbose: true, +}); diff --git a/packages/server/package.json b/packages/server/package.json index 4a40a95f5..911cd87a1 100644 --- a/packages/server/package.json +++ b/packages/server/package.json @@ -18,24 +18,20 @@ "exports": { ".": { "types": "./dist/server/server/exports.d.ts", - "default": [ - "./dist/server/server/exports.js", - "./src/server/exports.ts" - ] + "source": "./src/server/exports.ts", + "default": "./dist/server/server/exports.js" }, "./utils/tool-call-parsers": { "types": "./dist/server/utils/tool-call-parsers.d.ts", - "default": [ - "./dist/server/utils/tool-call-parsers.js", - "./src/utils/tool-call-parsers.ts" - ] + "source": "./src/utils/tool-call-parsers.ts", + "default": "./dist/server/utils/tool-call-parsers.js" } }, "scripts": { "dev": "cross-env NODE_ENV=development tsx scripts/dev-runner.ts", "dev:tsx": "cross-env NODE_ENV=development tsx watch --ignore '**/*.timestamp-*' src/server/index.ts", "build": "node -e \"require('node:fs').rmSync('dist',{ recursive: true, force: true })\" && npm 
run build:lib && npm run build:scripts", - "build:lib": "tsc -p tsconfig.server.json --incremental false && node -e \"const fs=require('node:fs'); fs.mkdirSync('dist/server/server/speech/providers/local/sherpa/assets',{recursive:true}); fs.copyFileSync('src/server/speech/providers/local/sherpa/assets/silero_vad.onnx','dist/server/server/speech/providers/local/sherpa/assets/silero_vad.onnx');\"", + "build:lib": "tsc -p tsconfig.server.json --incremental false && node -e \"const fs=require('node:fs'); fs.mkdirSync('dist/server/server/speech/providers/local/sherpa/assets',{recursive:true}); fs.copyFileSync('src/server/speech/providers/local/sherpa/assets/silero_vad.onnx','dist/server/server/speech/providers/local/sherpa/assets/silero_vad.onnx'); fs.cpSync('src/terminal/shell-integration','dist/server/terminal/shell-integration',{recursive:true}); fs.cpSync('src/terminal/shell-integration','dist/src/terminal/shell-integration',{recursive:true});\"", "build:scripts": "tsc -p tsconfig.scripts.json --incremental false && node -e \"const fs=require('node:fs'); fs.mkdirSync('dist/scripts',{recursive:true}); fs.copyFileSync('scripts/mcp-stdio-socket-bridge-cli.mjs','dist/scripts/mcp-stdio-socket-bridge-cli.mjs');\"", "prepack": "npm run build", "start": "cross-env NODE_ENV=production node dist/server/server/index.js", @@ -45,6 +41,7 @@ "speech:download": "tsx scripts/download-speech-models.ts", "speech:tts:matrix": "tsx scripts/generate-sherpa-tts-matrix.ts", "speech:transcribe:local": "tsx scripts/transcribe-local-wav.ts", + "db:query": "tsx scripts/db-query.ts", "test": "npm run test:unit && npm run test:integration", "test:unit": "vitest run --exclude \"**/*.e2e.test.ts\"", "test:integration": "vitest run --maxWorkers=1 --minWorkers=1 src/server/daemon-e2e/models.e2e.test.ts src/server/daemon-e2e/live-preferences.e2e.test.ts src/server/agent/model-catalog.e2e.test.ts", @@ -74,6 +71,8 @@ "ai": "5.0.78", "ajv": "^8.17.1", "dotenv": "^17.2.3", + "better-sqlite3": "^12.8.0", 
+ "drizzle-orm": "^0.45.1", "express": "^4.18.2", "express-basic-auth": "^1.2.1", "fast-uri": "^3.1.0", @@ -97,6 +96,8 @@ }, "devDependencies": { "@playwright/test": "^1.56.1", + "@types/better-sqlite3": "^7.6.13", + "drizzle-kit": "^0.31.10", "@types/express": "^4.17.20", "@types/node": "^20.9.0", "@types/qrcode": "^1.5.6", diff --git a/packages/server/scripts/db-query.ts b/packages/server/scripts/db-query.ts new file mode 100644 index 000000000..ffa603830 --- /dev/null +++ b/packages/server/scripts/db-query.ts @@ -0,0 +1,140 @@ +#!/usr/bin/env npx tsx +/** + * Run arbitrary SQL against the Paseo SQLite database. + * + * Usage: + * npx tsx packages/server/scripts/db-query.ts "SELECT * FROM agent_snapshots" + * npx tsx packages/server/scripts/db-query.ts --db ~/.paseo/db "SELECT count(*) FROM agent_timeline_rows" + * + * Without args, shows table row counts. + */ + +import path from "node:path"; +import os from "node:os"; +import fs from "node:fs"; +import Database from "better-sqlite3"; + +function resolveHomeDirectory(value: string): string { + if (value === "~") { + return os.homedir(); + } + + if (value.startsWith("~/")) { + return path.join(os.homedir(), value.slice(2)); + } + + return value; +} + +function parseListenPort(listen: unknown): number | null { + if (typeof listen !== "string") { + return null; + } + + const portMatch = listen.match(/:(\d+)$/); + return portMatch ? parseInt(portMatch[1]!, 10) : null; +} + +function findDevDatabaseDirectory(): string | null { + const tmpDir = os.tmpdir(); + for (const entry of fs.readdirSync(tmpDir)) { + if (entry.startsWith("paseo-dev.")) { + const configPath = path.join(tmpDir, entry, "config.json"); + if (fs.existsSync(configPath)) { + try { + const config = JSON.parse(fs.readFileSync(configPath, "utf-8")); + const dbDir = config.paseoHome ? 
path.join(config.paseoHome, "db") : null; + const port = parseListenPort(config.daemon?.listen); + if (dbDir && port === 6767) { + return dbDir; + } + } catch {} + } + } + } + + return null; +} + +function resolveDatabasePath(explicitPath?: string): string { + if (explicitPath) { + const resolvedPath = path.resolve(resolveHomeDirectory(explicitPath)); + return fs.statSync(resolvedPath).isDirectory() + ? path.join(resolvedPath, "paseo.sqlite") + : resolvedPath; + } + + const detectedDevDir = findDevDatabaseDirectory(); + if (detectedDevDir) { + return path.join(detectedDevDir, "paseo.sqlite"); + } + + const paseoHome = process.env.PASEO_HOME + ? path.resolve(resolveHomeDirectory(process.env.PASEO_HOME)) + : path.join(os.homedir(), ".paseo"); + return path.join(paseoHome, "db", "paseo.sqlite"); +} + +async function main() { + const args = process.argv.slice(2); + let dbPath: string | undefined; + const queries: string[] = []; + + for (let i = 0; i < args.length; i++) { + if (args[i] === "--db" && args[i + 1]) { + dbPath = args[++i]; + } else { + queries.push(args[i]!); + } + } + + if (queries.length === 0) { + queries.push( + "SELECT 'agent_snapshots' AS table_name, count(*) AS rows FROM agent_snapshots UNION ALL " + + "SELECT 'agent_timeline_rows', count(*) FROM agent_timeline_rows UNION ALL " + + "SELECT 'projects', count(*) FROM projects UNION ALL " + + "SELECT 'workspaces', count(*) FROM workspaces " + + "ORDER BY table_name", + ); + } + + let databasePath = ""; + let client: Database.Database | null = null; + + try { + if (dbPath) { + const resolvedDbPath = path.resolve(resolveHomeDirectory(dbPath)); + databasePath = + fs.existsSync(resolvedDbPath) && fs.statSync(resolvedDbPath).isDirectory() + ? 
path.join(resolvedDbPath, "paseo.sqlite") + : resolvedDbPath; + } else { + databasePath = resolveDatabasePath(); + } + + client = new Database(databasePath, { readonly: true, fileMustExist: true }); + + for (const sql of queries) { + const statement = client.prepare(sql); + if (statement.reader) { + const rows = statement.all(); + if (rows.length === 0) { + console.log("(0 rows)\n"); + } else { + console.table(rows); + } + continue; + } + + const result = statement.run(); + console.log(`OK (${result.changes} changes)\n`); + } + } catch (err: unknown) { + console.error(`Error: ${err instanceof Error ? err.message : String(err)}\nDatabase: ${databasePath}`); + process.exitCode = 1; + } finally { + client?.close(); + } +} + +main(); diff --git a/packages/server/src/client/daemon-client.test.ts b/packages/server/src/client/daemon-client.test.ts index 144032ed8..aac4921ba 100644 --- a/packages/server/src/client/daemon-client.test.ts +++ b/packages/server/src/client/daemon-client.test.ts @@ -213,6 +213,137 @@ describe("DaemonClient", () => { expect(client.getConnectionState().status).toBe("disposed"); }); + + test("normalizes workspace_setup_progress into a workspace-scoped daemon event", async () => { + const logger = createMockLogger(); + const mock = createMockTransport(); + + const client = new DaemonClient({ + url: "ws://test", + clientId: "clsk_unit_test", + logger, + reconnect: { enabled: false }, + transportFactory: () => mock.transport, + }); + clients.push(client); + + const events: Array<Parameters<Parameters<typeof client.subscribe>[0]>[0]> = []; + client.subscribe((event) => { + events.push(event); + }); + + const connectPromise = client.connect(); + mock.triggerOpen(); + await connectPromise; + + mock.triggerMessage( + wrapSessionMessage({ + type: "workspace_setup_progress", + payload: { + workspaceId: "/tmp/project/.paseo/worktrees/feature-a", + status: "running", + detail: { + type: "worktree_setup", + worktreePath: "/tmp/project/.paseo/worktrees/feature-a", + branchName: "feature-a", + log: "phase-one\n", + commands: [ + { + index: 1, + 
command: "npm install", + cwd: "/tmp/project/.paseo/worktrees/feature-a", + log: "phase-one\n", + status: "running", + exitCode: null, + }, + ], + }, + error: null, + }, + }), + ); + + expect(events).toContainEqual({ + type: "workspace_setup_progress", + workspaceId: "/tmp/project/.paseo/worktrees/feature-a", + payload: { + workspaceId: "/tmp/project/.paseo/worktrees/feature-a", + status: "running", + detail: { + type: "worktree_setup", + worktreePath: "/tmp/project/.paseo/worktrees/feature-a", + branchName: "feature-a", + log: "phase-one\n", + commands: [ + { + index: 1, + command: "npm install", + cwd: "/tmp/project/.paseo/worktrees/feature-a", + log: "phase-one\n", + status: "running", + exitCode: null, + }, + ], + }, + error: null, + }, + }); + }); + + test("sends create_agent_request with string workspace ids", async () => { + const logger = createMockLogger(); + const mock = createMockTransport(); + + const client = new DaemonClient({ + url: "ws://test", + clientId: "clsk_unit_test", + logger, + reconnect: { enabled: false }, + transportFactory: () => mock.transport, + }); + clients.push(client); + + const connectPromise = client.connect(); + mock.triggerOpen(); + await connectPromise; + + const createPromise = client.createAgent({ + provider: "codex", + cwd: "/tmp/project/.paseo/worktrees/feature-a", + workspaceId: "/tmp/project/.paseo/worktrees/feature-a", + title: "Compat agent", + modeId: "default", + }); + + expect(mock.sent).toHaveLength(1); + const request = JSON.parse(String(mock.sent[0])) as { + type: "session"; + message: { + type: "create_agent_request"; + requestId: string; + workspaceId: string; + }; + }; + expect(request.message).toEqual( + expect.objectContaining({ + type: "create_agent_request", + workspaceId: "/tmp/project/.paseo/worktrees/feature-a", + }), + ); + + mock.triggerMessage( + wrapSessionMessage({ + type: "status", + payload: { + status: "agent_create_failed", + requestId: request.message.requestId, + error: "compat test 
sentinel", + }, + }), + ); + + await expect(createPromise).rejects.toThrow("compat test sentinel"); + }); + test("sends explicit shutdown_server_request via shutdownServer", async () => { const logger = createMockLogger(); const mock = createMockTransport(); @@ -1704,24 +1835,15 @@ describe("DaemonClient", () => { agentId: "agent_cli", agent: null, direction: "tail", - projection: "projected", - epoch: "epoch-1", - reset: false, - staleCursor: false, - gap: false, - window: { minSeq: 1, maxSeq: 1, nextSeq: 2 }, - startCursor: { epoch: "epoch-1", seq: 1 }, - endCursor: { epoch: "epoch-1", seq: 1 }, + startSeq: 1, + endSeq: 1, hasOlder: false, hasNewer: false, entries: [ { timestamp: "2026-02-08T20:20:00.000Z", provider: "codex", - seqStart: 1, - seqEnd: 1, - sourceSeqRanges: [{ startSeq: 1, endSeq: 1 }], - collapsed: [], + seq: 1, item: { type: "tool_call", callId: "call_cli_snapshot", @@ -1798,24 +1920,15 @@ describe("DaemonClient", () => { agentId: "agent_cli", agent: null, direction: "tail", - projection: "projected", - epoch: "epoch-1", - reset: false, - staleCursor: false, - gap: false, - window: { minSeq: 1, maxSeq: 1, nextSeq: 2 }, - startCursor: { epoch: "epoch-1", seq: 1 }, - endCursor: { epoch: "epoch-1", seq: 1 }, + startSeq: 1, + endSeq: 1, hasOlder: false, hasNewer: false, entries: [ { timestamp: "2026-02-08T20:20:00.000Z", provider: "codex", - seqStart: 1, - seqEnd: 1, - sourceSeqRanges: [{ startSeq: 1, endSeq: 1 }], - collapsed: [], + seq: 1, item: { type: "tool_call", callId: "call_cli_invalid", diff --git a/packages/server/src/client/daemon-client.ts b/packages/server/src/client/daemon-client.ts index 41fd9911d..768b517dc 100644 --- a/packages/server/src/client/daemon-client.ts +++ b/packages/server/src/client/daemon-client.ts @@ -38,16 +38,17 @@ import type { OpenInEditorResponseMessage, OpenProjectResponseMessage, ArchiveWorkspaceResponseMessage, + WorkspaceSetupStatusResponseMessage, ListCommandsResponse, ListProviderFeaturesResponseMessage, 
ListProviderModelsResponseMessage, ListProviderModesResponseMessage, ListAvailableProvidersResponse, - ListTerminalsResponse, - CreateTerminalResponse, GetProvidersSnapshotResponseMessage, - ProviderDiagnosticResponseMessage, RefreshProvidersSnapshotResponseMessage, + ProviderDiagnosticResponseMessage, + ListTerminalsResponse, + CreateTerminalResponse, SubscribeTerminalResponse, TerminalState, CloseItemsResponse, @@ -127,18 +128,22 @@ export type DaemonEvent = agentId: string; payload: Extract["payload"]; } - | { + | { type: "workspace_update"; workspaceId: string; payload: Extract["payload"]; } + | { + type: "workspace_setup_progress"; + workspaceId: string; + payload: Extract["payload"]; + } | { type: "agent_stream"; agentId: string; event: AgentStreamEventPayload; timestamp: string; seq?: number; - epoch?: string; } | { type: "status"; payload: { status: string } & Record } | { type: "agent_deleted"; agentId: string } @@ -195,6 +200,7 @@ export type CreateAgentRequestOptions = { config?: AgentSessionConfig; provider?: AgentProvider; cwd?: string; + workspaceId?: string | number; initialPrompt?: string; clientMessageId?: string; outputSchema?: Record; @@ -338,13 +344,13 @@ type ScheduleDeletePayload = Extract< export type FetchAgentTimelinePayload = FetchAgentTimelineResponseMessage["payload"]; export type FetchAgentTimelineDirection = FetchAgentTimelinePayload["direction"]; -export type FetchAgentTimelineProjection = FetchAgentTimelinePayload["projection"]; -export type FetchAgentTimelineCursor = NonNullable; +export type FetchAgentTimelineCursor = NonNullable< + Extract["cursor"] +>; export type FetchAgentTimelineOptions = { direction?: FetchAgentTimelineDirection; cursor?: FetchAgentTimelineCursor; limit?: number; - projection?: FetchAgentTimelineProjection; requestId?: string; }; @@ -479,6 +485,7 @@ type ListAvailableEditorsPayload = ListAvailableEditorsResponseMessage["payload" type OpenInEditorPayload = OpenInEditorResponseMessage["payload"]; type 
OpenProjectPayload = OpenProjectResponseMessage["payload"]; type ArchiveWorkspacePayload = ArchiveWorkspaceResponseMessage["payload"]; +type WorkspaceSetupStatusPayload = WorkspaceSetupStatusResponseMessage["payload"]; export type EditorTargetDescriptor = ListAvailableEditorsPayload["editors"][number]; export type FetchAgentResult = { @@ -1324,6 +1331,23 @@ export class DaemonClient { }); } + async startWorkspaceService( + workspaceId: string, + serviceName: string, + requestId?: string, + ): Promise["payload"]> { + return this.sendCorrelatedSessionRequest({ + requestId, + message: { + type: "start_workspace_service_request", + workspaceId, + serviceName, + }, + responseType: "start_workspace_service_response", + timeout: 10000, + }); + } + async listAvailableEditors(requestId?: string): Promise { return this.sendCorrelatedSessionRequest({ requestId, @@ -1353,7 +1377,7 @@ export class DaemonClient { } async archiveWorkspace( - workspaceId: string, + workspaceId: string | number, requestId?: string, ): Promise { return this.sendCorrelatedSessionRequest({ @@ -1367,6 +1391,21 @@ export class DaemonClient { }); } + async fetchWorkspaceSetupStatus( + workspaceId: string, + requestId?: string, + ): Promise { + return this.sendCorrelatedSessionRequest({ + requestId, + message: { + type: "workspace_setup_status_request", + workspaceId, + }, + responseType: "workspace_setup_status_response", + timeout: 10000, + }); + } + async fetchAgent(agentId: string, requestId?: string): Promise { const resolvedRequestId = this.createRequestId(requestId); const message = SessionInboundMessageSchema.parse({ @@ -1438,6 +1477,7 @@ export class DaemonClient { type: "create_agent_request", requestId, config, + ...(options.workspaceId !== undefined ? { workspaceId: options.workspaceId } : {}), ...(options.initialPrompt ? { initialPrompt: options.initialPrompt } : {}), ...(options.clientMessageId ? { clientMessageId: options.clientMessageId } : {}), ...(options.outputSchema ? 
{ outputSchema: options.outputSchema } : {}), @@ -1628,7 +1668,6 @@ export class DaemonClient { ...(options.direction ? { direction: options.direction } : {}), ...(options.cursor ? { cursor: options.cursor } : {}), ...(typeof options.limit === "number" ? { limit: options.limit } : {}), - ...(options.projection ? { projection: options.projection } : {}), }); const payload = await this.sendRequest({ @@ -2892,12 +2931,16 @@ export class DaemonClient { cwd: string, name?: string, requestId?: string, + options?: { agentId?: string; command?: string; args?: string[] }, ): Promise { const resolvedRequestId = this.createRequestId(requestId); const message = SessionInboundMessageSchema.parse({ type: "create_terminal_request", cwd, name, + agentId: options?.agentId, + command: options?.command, + args: options?.args, requestId: resolvedRequestId, }); return this.sendCorrelatedRequest({ @@ -3705,6 +3748,12 @@ export class DaemonClient { workspaceId: msg.payload.kind === "upsert" ? msg.payload.workspace.id : msg.payload.id, payload: msg.payload, }; + case "workspace_setup_progress": + return { + type: "workspace_setup_progress", + workspaceId: msg.payload.workspaceId, + payload: msg.payload, + }; case "agent_stream": return { type: "agent_stream", @@ -3712,7 +3761,6 @@ export class DaemonClient { event: msg.payload.event, timestamp: msg.payload.timestamp, ...(typeof msg.payload.seq === "number" ? { seq: msg.payload.seq } : {}), - ...(typeof msg.payload.epoch === "string" ? 
{ epoch: msg.payload.epoch } : {}), }; case "status": return { type: "status", payload: msg.payload }; @@ -3819,6 +3867,7 @@ function resolveAgentConfig(options: CreateAgentRequestOptions): AgentSessionCon config, provider, cwd, + workspaceId: _workspaceId, initialPrompt: _initialPrompt, images: _images, git: _git, diff --git a/packages/server/src/server/agent-loading-service.ts b/packages/server/src/server/agent-loading-service.ts new file mode 100644 index 000000000..313aef944 --- /dev/null +++ b/packages/server/src/server/agent-loading-service.ts @@ -0,0 +1,128 @@ +import type pino from "pino"; + +import type { ManagedAgent } from "./agent/agent-manager.js"; + +import type { AgentManager } from "./agent/agent-manager.js"; +import type { AgentPersistenceHandle, AgentSessionConfig } from "./agent/agent-sdk-types.js"; +import type { AgentSnapshotStore } from "./agent/agent-snapshot-store.js"; +import { + buildConfigOverrides, + buildSessionConfig, + extractTimestamps, + toAgentPersistenceHandle, +} from "./persistence-hooks.js"; + +const pendingAgentBootstrapLoads = new Map<string, Promise<ManagedAgent>>(); + +export type AgentLoadingServiceOptions = { + agentManager: Pick< + AgentManager, + | "createAgent" + | "getAgent" + | "reloadAgentSession" + | "resumeAgentFromPersistence" + >; + agentStorage: Pick<AgentSnapshotStore, "get">; + logger: pino.Logger; +}; + +// Coordinates cold loads, explicit resumes, and refreshes for persisted agents. 
+export class AgentLoadingService { + private readonly agentManager: AgentLoadingServiceOptions["agentManager"]; + private readonly agentStorage: AgentLoadingServiceOptions["agentStorage"]; + private readonly logger: pino.Logger; + + constructor(options: AgentLoadingServiceOptions) { + this.agentManager = options.agentManager; + this.agentStorage = options.agentStorage; + this.logger = options.logger.child({ component: "agent-loading" }); + } + + async ensureAgentLoaded(options: { agentId: string }): Promise<ManagedAgent> { + const existing = this.agentManager.getAgent(options.agentId); + if (existing) { + return existing; + } + + const inflight = pendingAgentBootstrapLoads.get(options.agentId); + if (inflight) { + return inflight; + } + + const initPromise = this.loadStoredAgent(options); + pendingAgentBootstrapLoads.set(options.agentId, initPromise); + + try { + return await initPromise; + } finally { + const current = pendingAgentBootstrapLoads.get(options.agentId); + if (current === initPromise) { + pendingAgentBootstrapLoads.delete(options.agentId); + } + } + } + + async resumeAgent(options: { + handle: AgentPersistenceHandle; + overrides?: Partial<AgentSessionConfig>; + }): Promise<ManagedAgent> { + return this.agentManager.resumeAgentFromPersistence(options.handle, options.overrides); + } + + async refreshAgent(options: { agentId: string }): Promise<ManagedAgent> { + const existing = this.agentManager.getAgent(options.agentId); + if (existing) { + return existing.persistence + ? 
await this.agentManager.reloadAgentSession(options.agentId) + : existing; + } + + const record = await this.agentStorage.get(options.agentId); + if (!record) { + throw new Error(`Agent not found: ${options.agentId}`); + } + + const handle = toAgentPersistenceHandle(this.logger, record.persistence); + if (!handle) { + throw new Error(`Agent ${options.agentId} cannot be refreshed because it lacks persistence`); + } + + return this.agentManager.resumeAgentFromPersistence( + handle, + buildConfigOverrides(record), + options.agentId, + extractTimestamps(record), + ); + } + + private async loadStoredAgent(options: { agentId: string }): Promise<ManagedAgent> { + const record = await this.agentStorage.get(options.agentId); + if (!record) { + throw new Error(`Agent not found: ${options.agentId}`); + } + + const handle = toAgentPersistenceHandle(this.logger, record.persistence); + let snapshot: ManagedAgent; + if (handle) { + snapshot = await this.agentManager.resumeAgentFromPersistence( + handle, + buildConfigOverrides(record), + options.agentId, + extractTimestamps(record), + ); + this.logger.info( + { agentId: options.agentId, provider: record.provider }, + "Agent resumed from persistence", + ); + } else { + snapshot = await this.agentManager.createAgent(buildSessionConfig(record), options.agentId, { + labels: record.labels, + }); + this.logger.info( + { agentId: options.agentId, provider: record.provider }, + "Agent created from stored config", + ); + } + + return this.agentManager.getAgent(options.agentId) ?? 
snapshot; + } +} diff --git a/packages/server/src/server/agent/agent-management-mcp.ts b/packages/server/src/server/agent/agent-management-mcp.ts index 001a5bc5b..edfdc794e 100644 --- a/packages/server/src/server/agent/agent-management-mcp.ts +++ b/packages/server/src/server/agent/agent-management-mcp.ts @@ -37,7 +37,7 @@ import { import { toAgentPayload } from "./agent-projections.js"; import { curateAgentActivity } from "./activity-curator.js"; import { AGENT_PROVIDER_DEFINITIONS } from "./provider-registry.js"; -import { AgentStorage } from "./agent-storage.js"; +import type { AgentSnapshotStore } from "./agent-snapshot-store.js"; import { appendTimelineItemIfAgentKnown, emitLiveTimelineItemIfAgentKnown, @@ -48,11 +48,14 @@ import { scheduleAgentMetadataGeneration } from "./agent-metadata-generator.js"; import { expandUserPath } from "../path-utils.js"; import type { TerminalManager } from "../../terminal/terminal-manager.js"; import { createAgentWorktree, runAsyncWorktreeBootstrap } from "../worktree-bootstrap.js"; +import type { ServiceRouteStore } from "../service-proxy.js"; export interface AgentManagementMcpOptions { agentManager: AgentManager; - agentStorage: AgentStorage; + agentStorage: AgentSnapshotStore; terminalManager?: TerminalManager | null; + serviceRouteStore?: ServiceRouteStore; + getDaemonTcpPort?: () => number | null; paseoHome?: string; logger: Logger; } @@ -181,7 +184,7 @@ function sanitizePermissionRequest( } async function resolveAgentTitle( - agentStorage: AgentStorage, + agentStorage: AgentSnapshotStore, agentId: string, logger: Logger, ): Promise { @@ -195,7 +198,7 @@ async function resolveAgentTitle( } async function serializeSnapshotWithMetadata( - agentStorage: AgentStorage, + agentStorage: AgentSnapshotStore, snapshot: ManagedAgent, logger: Logger, ) { @@ -299,21 +302,25 @@ export async function createAgentManagementMcpServer( }; let resolvedCwd = expandUserPath(cwd); - let worktreeConfig: WorktreeConfig | undefined; + let 
worktreeBootstrap: + | { + worktree: WorktreeConfig; + shouldBootstrap: boolean; + } + | undefined; if (worktreeName) { if (!baseBranch) { throw new Error("baseBranch is required when creating a worktree"); } - const worktree = await createAgentWorktree({ + worktreeBootstrap = await createAgentWorktree({ branchName: worktreeName, cwd: resolvedCwd, baseBranch, worktreeSlug: worktreeName, paseoHome: options.paseoHome, }); - resolvedCwd = worktree.worktreePath; - worktreeConfig = worktree; + resolvedCwd = worktreeBootstrap.worktree.worktreePath; } const provider: AgentProvider = agentType ?? "claude"; @@ -325,10 +332,11 @@ export async function createAgentManagementMcpServer( title: normalizedTitle ?? undefined, }); - if (worktreeConfig) { + if (worktreeBootstrap) { void runAsyncWorktreeBootstrap({ agentId: snapshot.id, - worktree: worktreeConfig, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: options.terminalManager ?? null, appendTimelineItem: (item) => appendTimelineItemIfAgentKnown({ @@ -342,6 +350,8 @@ export async function createAgentManagementMcpServer( agentId: snapshot.id, item, }), + serviceRouteStore: options.serviceRouteStore, + daemonPort: options.getDaemonTcpPort?.() ?? 
null, logger: childLogger, }); } diff --git a/packages/server/src/server/agent/agent-manager.test.ts b/packages/server/src/server/agent/agent-manager.test.ts index 80161dbb8..76685e899 100644 --- a/packages/server/src/server/agent/agent-manager.test.ts +++ b/packages/server/src/server/agent/agent-manager.test.ts @@ -1,16 +1,18 @@ import { describe, expect, test, vi } from "vitest"; -import { mkdtempSync } from "node:fs"; +import { mkdtempSync, rmSync } from "node:fs"; import { join } from "node:path"; import { tmpdir } from "node:os"; import { randomUUID } from "node:crypto"; import { createTestLogger } from "../../test-utils/test-logger.js"; +import { DbAgentSnapshotStore } from "../db/db-agent-snapshot-store.js"; +import { DbAgentTimelineStore } from "../db/db-agent-timeline-store.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "../db/sqlite-database.js"; +import { projects, workspaces } from "../db/schema.js"; import { AgentManager } from "./agent-manager.js"; import { AgentStorage } from "./agent-storage.js"; -import { buildConfigOverrides } from "../persistence-hooks.js"; import type { AgentClient, - AgentFeature, AgentLaunchContext, AgentPersistenceHandle, AgentRunResult, @@ -36,47 +38,6 @@ function deferred(): Deferred { return { promise, resolve, reject }; } -class EventPushable implements AsyncIterable { - private queue: T[] = []; - private resolvers: Array<(value: IteratorResult) => void> = []; - private closed = false; - - push(value: T): void { - if (this.closed) { - return; - } - const resolver = this.resolvers.shift(); - if (resolver) { - resolver({ value, done: false }); - return; - } - this.queue.push(value); - } - - end(): void { - this.closed = true; - while (this.resolvers.length > 0) { - const resolver = this.resolvers.shift(); - resolver?.({ value: undefined, done: true }); - } - } - - [Symbol.asyncIterator](): AsyncIterator { - return { - next: () => { - if (this.queue.length > 0) { - const value = this.queue.shift()!; - 
return Promise.resolve({ value, done: false }); - } - if (this.closed) { - return Promise.resolve({ value: undefined, done: true }); - } - return new Promise((resolve) => this.resolvers.push(resolve)); - }, - }; - } -} - const TEST_CAPABILITIES = { supportsStreaming: false, supportsSessionPersistence: false, @@ -86,6 +47,37 @@ const TEST_CAPABILITIES = { supportsToolInvocations: false, } as const; +async function seedWorkspace( + database: PaseoDatabaseHandle, + options: { directory: string }, +): Promise { + const [project] = await database.db + .insert(projects) + .values({ + directory: options.directory, + kind: "git", + displayName: "project-1", + gitRemote: null, + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }) + .returning(); + const [workspace] = await database.db + .insert(workspaces) + .values({ + projectId: project.id, + directory: options.directory, + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }) + .returning(); + return workspace.id; +} + class TestAgentClient implements AgentClient { readonly provider = "codex" as const; readonly capabilities = TEST_CAPABILITIES; @@ -199,14 +191,117 @@ class TestAgentSession implements AgentSession { async close(): Promise {} } -function createFeature(overrides: Partial = {}): AgentFeature { - return { - type: "toggle", - id: "fast_mode", - label: "Fast mode", - value: false, - ...overrides, - }; +class StreamingAssistantSession implements AgentSession { + readonly provider = "codex" as const; + readonly capabilities = TEST_CAPABILITIES; + readonly id = randomUUID(); + private subscribers = new Set<(event: AgentStreamEvent) => void>(); + private turnIdCounter = 0; + + constructor(private readonly config: AgentSessionConfig) {} + + async run(): Promise { + return { + sessionId: this.id, + finalText: "", + timeline: [], + }; + } + + async startTurn(): Promise<{ 
turnId: string }> { + const turnId = `turn-${++this.turnIdCounter}`; + setTimeout(() => { + this.pushEvent({ type: "turn_started", provider: this.provider, turnId }); + this.pushEvent({ + type: "timeline", + provider: this.provider, + turnId, + item: { type: "assistant_message", text: "final " }, + }); + this.pushEvent({ + type: "timeline", + provider: this.provider, + turnId, + item: { type: "assistant_message", text: "reply" }, + }); + this.pushEvent({ type: "turn_completed", provider: this.provider, turnId }); + }, 0); + return { turnId }; + } + + subscribe(callback: (event: AgentStreamEvent) => void): () => void { + this.subscribers.add(callback); + return () => { + this.subscribers.delete(callback); + }; + } + + private pushEvent(event: AgentStreamEvent): void { + for (const callback of this.subscribers) { + callback(event); + } + } + + async *streamHistory(): AsyncGenerator {} + + async getRuntimeInfo() { + return { + provider: this.provider, + sessionId: this.id, + model: this.config.model ?? null, + modeId: this.config.modeId ?? null, + }; + } + + async getAvailableModes() { + return []; + } + + async getCurrentMode() { + return null; + } + + async setMode(): Promise {} + + getPendingPermissions() { + return []; + } + + async respondToPermission(): Promise {} + + describePersistence() { + return { + provider: this.provider, + sessionId: this.id, + }; + } + + async interrupt(): Promise {} + + async close(): Promise {} +} + +class StreamingAssistantClient implements AgentClient { + readonly provider = "codex" as const; + readonly capabilities = TEST_CAPABILITIES; + + async isAvailable(): Promise { + return true; + } + + async createSession(config: AgentSessionConfig): Promise { + return new StreamingAssistantSession(config); + } + + async resumeSession( + _handle: AgentPersistenceHandle, + config?: Partial, + ): Promise { + return new StreamingAssistantSession({ + provider: "codex", + cwd: config?.cwd ?? 
process.cwd(), + }); + } } describe("AgentManager", () => { @@ -493,101 +588,6 @@ describe("AgentManager", () => { }); }); - test("resumeAgentFromPersistence passes featureValues through to the resumed session config", async () => { - const workdir = mkdtempSync(join(tmpdir(), "agent-manager-resume-features-")); - const storagePath = join(workdir, "agents"); - const storage = new AgentStorage(storagePath, logger); - - class ResumeFeatureCaptureClient implements AgentClient { - readonly provider = "codex" as const; - readonly capabilities = TEST_CAPABILITIES; - lastResumeOverrides: Partial | undefined; - - async isAvailable(): Promise { - return true; - } - - async createSession(config: AgentSessionConfig): Promise { - return new TestAgentSession(config); - } - - async resumeSession( - handle: AgentPersistenceHandle, - overrides?: Partial, - ): Promise { - this.lastResumeOverrides = overrides; - const metadata = (handle.metadata ?? {}) as Partial; - return new TestAgentSession({ - ...metadata, - ...overrides, - provider: "codex", - cwd: overrides?.cwd ?? metadata.cwd ?? 
process.cwd(), - }); - } - } - - const now = new Date().toISOString(); - await storage.upsert({ - id: "00000000-0000-4000-8000-000000000138", - provider: "codex", - cwd: workdir, - createdAt: now, - updatedAt: now, - lastActivityAt: now, - title: null, - labels: {}, - lastStatus: "idle", - lastModeId: "plan", - config: { - model: "gpt-5.1", - modeId: "plan", - featureValues: { - fast_mode: true, - }, - }, - persistence: { - provider: "codex", - sessionId: "resume-feature-session", - metadata: { - provider: "codex", - cwd: workdir, - }, - }, - }); - - const record = await storage.get("00000000-0000-4000-8000-000000000138"); - expect(record).not.toBeNull(); - - const client = new ResumeFeatureCaptureClient(); - const manager = new AgentManager({ - clients: { - codex: client, - }, - registry: storage, - logger, - }); - - const resumed = await manager.resumeAgentFromPersistence( - { - provider: "codex", - sessionId: "resume-feature-session", - metadata: { - provider: "codex", - cwd: workdir, - }, - }, - buildConfigOverrides(record!), - record!.id, - ); - - expect(client.lastResumeOverrides?.featureValues).toEqual({ - fast_mode: true, - }); - expect(resumed.config.featureValues).toEqual({ - fast_mode: true, - }); - }); - test("reloadAgentSession preserves timeline and does not force history replay", async () => { const workdir = mkdtempSync(join(tmpdir(), "agent-manager-reload-")); const storagePath = join(workdir, "agents"); @@ -676,36 +676,213 @@ describe("AgentManager", () => { test("reloadAgentSession preserves current title when config title is unset", async () => { const workdir = mkdtempSync(join(tmpdir(), "agent-manager-reload-title-")); + const dataDir = join(workdir, "db"); + const database = await openPaseoDatabase(dataDir); + + try { + const workspaceId = await seedWorkspace(database, { directory: workdir }); + const storage = new DbAgentSnapshotStore(database.db); + const manager = new AgentManager({ + clients: { + codex: new TestAgentClient(), + }, + 
registry: storage, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000126", + }); + + const snapshot = await manager.createAgent( + { + provider: "codex", + cwd: workdir, + }, + undefined, + { workspaceId }, + ); + await manager.setTitle(snapshot.id, "Generated title"); + + const beforeReload = await storage.get(snapshot.id); + expect(beforeReload?.title).toBe("Generated title"); + expect(beforeReload?.config?.title).toBeUndefined(); + + await manager.reloadAgentSession(snapshot.id); + + const afterReload = await storage.get(snapshot.id); + expect(afterReload?.title).toBe("Generated title"); + expect(afterReload?.config?.title).toBeUndefined(); + } finally { + await database.close(); + rmSync(workdir, { recursive: true, force: true }); + } + }); + + test("resumeAgentFromPersistence reads durable helpers without loading committed rows into live memory", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-durable-seed-")); const storagePath = join(workdir, "agents"); + const dataDir = join(workdir, "db"); const storage = new AgentStorage(storagePath, logger); - const manager = new AgentManager({ - clients: { - codex: new TestAgentClient(), - }, - registry: storage, - logger, - idFactory: () => "00000000-0000-4000-8000-000000000126", - }); + const database = await openPaseoDatabase(dataDir); + let historyReplayCount = 0; + let manager: AgentManager | null = null; - const snapshot = await manager.createAgent({ - provider: "codex", - cwd: workdir, - }); - await manager.setTitle(snapshot.id, "Generated title"); + class HistoryReplayProbeSession extends TestAgentSession { + async *streamHistory(): AsyncGenerator { + historyReplayCount += 1; + yield { + type: "timeline", + provider: this.provider, + item: { type: "assistant_message", text: "provider history replay" }, + }; + } + } + + class HistoryReplayProbeClient implements AgentClient { + readonly provider = "codex" as const; + readonly capabilities = TEST_CAPABILITIES; + + async 
isAvailable(): Promise { + return true; + } + + async createSession(config: AgentSessionConfig): Promise { + return new HistoryReplayProbeSession(config); + } + + async resumeSession( + handle: AgentPersistenceHandle, + overrides?: Partial, + ): Promise { + const metadata = (handle.metadata ?? {}) as Partial; + return new HistoryReplayProbeSession({ + ...metadata, + ...overrides, + provider: "codex", + cwd: overrides?.cwd ?? metadata.cwd ?? process.cwd(), + }); + } + } + + try { + const durableTimelineStore = new DbAgentTimelineStore(database.db); + manager = new AgentManager({ + clients: { + codex: new HistoryReplayProbeClient(), + }, + registry: storage, + durableTimelineStore, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000128", + }); + + const snapshot = await manager.createAgent({ + provider: "codex", + cwd: workdir, + }); + + await manager.appendTimelineItem(snapshot.id, { + type: "assistant_message", + text: "durable only", + }); + await manager.flush(); + + const handle = manager.getAgent(snapshot.id)?.persistence; + expect(handle).not.toBeNull(); + if (!handle) { + throw new Error("Expected persistence handle to be available"); + } + + await manager.closeAgent(snapshot.id); + + await expect(durableTimelineStore.getCommittedRows(snapshot.id)).resolves.toEqual([ + { + seq: 1, + timestamp: expect.any(String), + item: { + type: "assistant_message", + text: "durable only", + }, + }, + ]); + + const resumed = await manager.resumeAgentFromPersistence(handle, undefined, snapshot.id); + + expect(resumed.id).toBe(snapshot.id); + expect(manager.getTimeline(snapshot.id)).toEqual([]); + await expect(manager.getLastAssistantMessage(snapshot.id)).resolves.toBe("durable only"); + await expect(manager.getTimelineRows(snapshot.id)).resolves.toEqual([ + { + seq: 1, + timestamp: expect.any(String), + item: { + type: "assistant_message", + text: "durable only", + }, + }, + ]); - const beforeReload = await storage.get(snapshot.id); - 
expect(beforeReload?.title).toBe("Generated title"); - expect(beforeReload?.config?.title).toBeUndefined(); + await manager.hydrateTimelineFromProvider(snapshot.id); - await manager.reloadAgentSession(snapshot.id); + expect(historyReplayCount).toBe(0); + expect(manager.getTimeline(snapshot.id)).toEqual([]); - const afterReload = await storage.get(snapshot.id); - expect(afterReload?.title).toBe("Generated title"); - expect(afterReload?.config?.title).toBeUndefined(); + await manager.closeAgent(snapshot.id); + await manager.deleteCommittedTimeline(snapshot.id); + + await expect(durableTimelineStore.getCommittedRows(snapshot.id)).resolves.toEqual([]); + } finally { + await manager?.flush().catch(() => undefined); + await storage.flush().catch(() => undefined); + await database.close(); + rmSync(workdir, { recursive: true, force: true }); + } }); test("setTitle bumps updatedAt and persists title in the same snapshot write", async () => { const workdir = mkdtempSync(join(tmpdir(), "agent-manager-set-title-updated-at-")); + const dataDir = join(workdir, "db"); + const database = await openPaseoDatabase(dataDir); + + try { + const workspaceId = await seedWorkspace(database, { directory: workdir }); + const storage = new DbAgentSnapshotStore(database.db); + const manager = new AgentManager({ + clients: { + codex: new TestAgentClient(), + }, + registry: storage, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000127", + }); + + const snapshot = await manager.createAgent( + { + provider: "codex", + cwd: workdir, + }, + undefined, + { workspaceId }, + ); + + const before = await storage.get(snapshot.id); + expect(before).not.toBeNull(); + + await manager.setTitle(snapshot.id, "Generated title"); + + const after = await storage.get(snapshot.id); + expect(after?.title).toBe("Generated title"); + expect(Date.parse(after!.updatedAt)).toBeGreaterThan(Date.parse(before!.updatedAt)); + + const live = manager.getAgent(snapshot.id); + expect(live).not.toBeNull(); + 
expect(live!.updatedAt.getTime()).toBeGreaterThan(Date.parse(before!.updatedAt)); + } finally { + await database.close(); + rmSync(workdir, { recursive: true, force: true }); + } + }); + + test("persists live mode, model, and thinking changes without an external snapshot subscriber", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-live-persist-")); const storagePath = join(workdir, "agents"); const storage = new AgentStorage(storagePath, logger); const manager = new AgentManager({ @@ -714,139 +891,132 @@ describe("AgentManager", () => { }, registry: storage, logger, - idFactory: () => "00000000-0000-4000-8000-000000000127", + idFactory: () => "00000000-0000-4000-8000-000000000132", }); const snapshot = await manager.createAgent({ provider: "codex", cwd: workdir, + modeId: "plan", + model: "gpt-5.2-codex", + thinkingOptionId: "low", }); - const before = await storage.get(snapshot.id); - expect(before).not.toBeNull(); - - await manager.setTitle(snapshot.id, "Generated title"); - - const after = await storage.get(snapshot.id); - expect(after?.title).toBe("Generated title"); - expect(Date.parse(after!.updatedAt)).toBeGreaterThan(Date.parse(before!.updatedAt)); + await manager.setAgentMode(snapshot.id, "build"); + await manager.setAgentModel(snapshot.id, "gpt-5.4"); + await manager.setAgentThinkingOption(snapshot.id, "high"); + await manager.flush(); - const live = manager.getAgent(snapshot.id); - expect(live).not.toBeNull(); - expect(live!.updatedAt.getTime()).toBeGreaterThan(Date.parse(before!.updatedAt)); + const persisted = await storage.get(snapshot.id); + expect(persisted).not.toBeNull(); + expect(persisted?.lastModeId).toBe("build"); + expect(persisted?.config?.model).toBe("gpt-5.4"); + expect(persisted?.config?.thinkingOptionId).toBe("high"); + expect(persisted?.runtimeInfo?.modeId).toBe("build"); + expect(persisted?.runtimeInfo?.model).toBe("gpt-5.4"); }); - test("setAgentFeature calls session.setFeature and persists featureValues in 
config", async () => { - const workdir = mkdtempSync(join(tmpdir(), "agent-manager-set-feature-")); + test("setLabels merges and persists labels", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-set-labels-")); const storagePath = join(workdir, "agents"); const storage = new AgentStorage(storagePath, logger); - - class FeatureSession extends TestAgentSession { - readonly features: AgentFeature[] = [createFeature()]; - readonly setFeature = vi.fn(async (featureId: string, value: unknown) => { - const feature = this.features.find((item) => item.id === featureId); - if (feature?.type === "toggle") { - feature.value = Boolean(value); - } - }); - } - - class FeatureClient extends TestAgentClient { - session: FeatureSession | null = null; - - override async createSession(config: AgentSessionConfig): Promise { - this.session = new FeatureSession(config); - return this.session; - } - } - - const client = new FeatureClient(); const manager = new AgentManager({ - clients: { codex: client }, + clients: { + codex: new TestAgentClient(), + }, registry: storage, logger, - idFactory: () => "00000000-0000-4000-8000-000000000128", + idFactory: () => "00000000-0000-4000-8000-000000000133", }); - const agent = await manager.createAgent({ + const snapshot = await manager.createAgent({ provider: "codex", cwd: workdir, + title: "Label test", }); - await manager.setAgentFeature(agent.id, "fast_mode", true); + await manager.setLabels(snapshot.id, { surface: "mobile" }); + await manager.setLabels(snapshot.id, { phase: "1a" }); - expect(client.session?.setFeature).toHaveBeenCalledWith("fast_mode", true); - expect(manager.getAgent(agent.id)?.config.featureValues).toEqual({ fast_mode: true }); + const persisted = await storage.get(snapshot.id); + expect(persisted?.labels).toEqual({ + surface: "mobile", + phase: "1a", + }); }); - test("setAgentFeature throws when session does not support setFeature", async () => { - const workdir = mkdtempSync(join(tmpdir(), 
"agent-manager-set-feature-unsupported-")); + test("runAgent persists finished attention and idle status without an external snapshot subscriber", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-finished-attention-")); const storagePath = join(workdir, "agents"); const storage = new AgentStorage(storagePath, logger); const manager = new AgentManager({ - clients: { codex: new TestAgentClient() }, + clients: { + codex: new TestAgentClient(), + }, registry: storage, logger, - idFactory: () => "00000000-0000-4000-8000-000000000129", + idFactory: () => "00000000-0000-4000-8000-000000000134", }); - const agent = await manager.createAgent({ + const snapshot = await manager.createAgent({ provider: "codex", cwd: workdir, + title: "Finished attention test", }); - await expect(manager.setAgentFeature(agent.id, "fast_mode", true)).rejects.toThrow( - "Agent session does not support setting features", - ); + await manager.runAgent(snapshot.id, "say hello"); + await manager.flush(); + + const persisted = await storage.get(snapshot.id); + expect(persisted?.lastStatus).toBe("idle"); + expect(persisted?.requiresAttention).toBe(true); + expect(persisted?.attentionReason).toBe("finished"); + expect(persisted?.attentionTimestamp).toEqual(expect.any(String)); }); - test("emitState syncs features from session to agent", async () => { - const workdir = mkdtempSync(join(tmpdir(), "agent-manager-emit-state-features-")); + test("archiveSnapshot clears persisted attention and normalizes running status", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-archive-attention-")); const storagePath = join(workdir, "agents"); const storage = new AgentStorage(storagePath, logger); - - class FeatureSession extends TestAgentSession { - readonly features: AgentFeature[] = [createFeature()]; - } - - class FeatureClient extends TestAgentClient { - session: FeatureSession | null = null; - - override async createSession(config: AgentSessionConfig): Promise { 
- this.session = new FeatureSession(config); - return this.session; - } - } - - const client = new FeatureClient(); const manager = new AgentManager({ - clients: { codex: client }, + clients: { + codex: new TestAgentClient(), + }, registry: storage, logger, - idFactory: () => "00000000-0000-4000-8000-000000000130", + idFactory: () => "00000000-0000-4000-8000-000000000135", }); - const events: AgentFeature[][] = []; - const agent = await manager.createAgent({ + const snapshot = await manager.createAgent({ provider: "codex", cwd: workdir, + title: "Archive attention test", }); - manager.subscribe((event) => { - if (event.type !== "agent_state" || event.agent.id !== agent.id) { - return; - } - events.push(event.agent.features ?? []); - }); + const live = manager.getAgent(snapshot.id); + expect(live).not.toBeNull(); + live!.lifecycle = "running"; + live!.attention = { + requiresAttention: true, + attentionReason: "finished", + attentionTimestamp: new Date("2025-01-02T00:00:00.000Z"), + }; - if (client.session?.features[0]?.type === "toggle") { - client.session.features[0].value = true; - } + const archivedAt = "2025-01-03T00:00:00.000Z"; + const archivedRecord = await manager.archiveSnapshot(snapshot.id, archivedAt); - manager.notifyAgentState(agent.id); + expect(archivedRecord.archivedAt).toBe(archivedAt); + expect(archivedRecord.lastStatus).toBe("idle"); + expect(archivedRecord.requiresAttention).toBe(false); + expect(archivedRecord.attentionReason).toBeNull(); + expect(archivedRecord.attentionTimestamp).toBeNull(); - expect(manager.getAgent(agent.id)?.features).toEqual([createFeature({ value: true })]); - expect(events.at(-1)).toEqual([createFeature({ value: true })]); + const persisted = await storage.get(snapshot.id); + expect(persisted?.archivedAt).toBe(archivedAt); + expect(persisted?.lastStatus).toBe("idle"); + expect(persisted?.requiresAttention).toBe(false); + expect(persisted?.attentionReason).toBeNull(); + expect(persisted?.attentionTimestamp).toBeNull(); 
}); test("reloadAgentSession cancels active run and resumes existing session once thread_started is observed", async () => { @@ -1004,7 +1174,7 @@ describe("AgentManager", () => { } }); - test("fetchTimeline returns full timeline with reset when cursor epoch is stale", async () => { + test("fetchTimeline returns committed rows after a known seq without reset metadata", async () => { const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-stale-")); const storagePath = join(workdir, "agents"); const storage = new AgentStorage(storagePath, logger); @@ -1014,7 +1184,124 @@ describe("AgentManager", () => { }, registry: storage, logger, - idFactory: () => "00000000-0000-4000-8000-000000000118", + idFactory: () => "00000000-0000-4000-8000-000000000118", + }); + + const snapshot = await manager.createAgent({ + provider: "codex", + cwd: workdir, + }); + + for (let seq = 1; seq <= 120; seq += 1) { + await manager.appendTimelineItem(snapshot.id, { + type: "assistant_message", + text: `committed row ${seq}`, + }); + } + + const baseline = await manager.fetchTimeline(snapshot.id, { + direction: "tail", + limit: 0, + }); + expect(baseline.rows).toHaveLength(120); + + await manager.emitLiveTimelineItem(snapshot.id, { + type: "assistant_message", + text: "partial reply", + }); + await manager.appendTimelineItem(snapshot.id, { + type: "assistant_message", + text: "finalized reply", + }); + + const result = await manager.fetchTimeline(snapshot.id, { + direction: "after", + cursor: { + seq: 120, + }, + limit: 0, + }); + + expect(result.rows).toHaveLength(1); + expect(result.rows[0]?.seq).toBe(121); + expect(result.rows[0]?.item).toEqual({ + type: "assistant_message", + text: "finalized reply", + }); + }); + + test("fetchTimeline and getTimelineRows prefer the durable store while live helpers stay in-memory", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-durable-read-authority-")); + const storagePath = join(workdir, "agents"); + const dataDir = 
join(workdir, "db"); + const storage = new AgentStorage(storagePath, logger); + const database = await openPaseoDatabase(dataDir); + + try { + const durableTimelineStore = new DbAgentTimelineStore(database.db); + const manager = new AgentManager({ + clients: { + codex: new TestAgentClient(), + }, + registry: storage, + durableTimelineStore, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000139", + }); + + const snapshot = await manager.createAgent({ + provider: "codex", + cwd: workdir, + }); + + const durableOnlyItem: AgentTimelineItem = { + type: "assistant_message", + text: "durable only", + }; + const durableOnlyRow = { + seq: 1, + timestamp: "2026-03-24T00:00:01.000Z", + item: durableOnlyItem, + }; + + await durableTimelineStore.bulkInsert(snapshot.id, [durableOnlyRow]); + + expect(manager.getTimeline(snapshot.id)).toEqual([]); + await expect(manager.getLastAssistantMessage(snapshot.id)).resolves.toBe("durable only"); + await expect(manager.getTimelineRows(snapshot.id)).resolves.toEqual([durableOnlyRow]); + await expect( + manager.fetchTimeline(snapshot.id, { + direction: "tail", + limit: 0, + }), + ).resolves.toEqual({ + direction: "tail", + window: { + minSeq: 1, + maxSeq: 1, + nextSeq: 2, + }, + hasOlder: false, + hasNewer: false, + rows: [durableOnlyRow], + }); + } finally { + await database.close(); + rmSync(workdir, { recursive: true, force: true }); + } + }); + + test("getTimelineRows falls back to the in-memory timeline when no durable store is configured", async () => { + const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-rows-fallback-")); + const storagePath = join(workdir, "agents"); + const storage = new AgentStorage(storagePath, logger); + const manager = new AgentManager({ + clients: { + codex: new TestAgentClient(), + }, + registry: storage, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000140", }); const snapshot = await manager.createAgent({ @@ -1024,47 +1311,92 @@ describe("AgentManager", () => { 
     await manager.appendTimelineItem(snapshot.id, {
       type: "assistant_message",
-      text: "one",
+      text: "row one",
     });
     await manager.appendTimelineItem(snapshot.id, {
       type: "assistant_message",
-      text: "two",
+      text: "row two",
+    });
+
+    await expect(manager.getTimelineRows(snapshot.id)).resolves.toEqual([
+      {
+        seq: 1,
+        timestamp: expect.any(String),
+        item: {
+          type: "assistant_message",
+          text: "row one",
+        },
+      },
+      {
+        seq: 2,
+        timestamp: expect.any(String),
+        item: {
+          type: "assistant_message",
+          text: "row two",
+        },
+      },
+    ]);
+  });
+
+  test("getAgent does not expose committed history internals once manager owns the seam", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-boundary-"));
+    const storagePath = join(workdir, "agents");
+    const storage = new AgentStorage(storagePath, logger);
+    const manager = new AgentManager({
+      clients: {
+        codex: new TestAgentClient(),
+      },
+      registry: storage,
+      logger,
+      idFactory: () => "00000000-0000-4000-8000-000000000138",
+    });
+
+    const snapshot = await manager.createAgent({
+      provider: "codex",
+      cwd: workdir,
+    });
+
+    manager.recordUserMessage(snapshot.id, "hello boundary", {
+      messageId: "msg-boundary-1",
+      emitState: false,
     });
     await manager.appendTimelineItem(snapshot.id, {
       type: "assistant_message",
-      text: "three",
+      text: "history stays behind manager",
     });
 
-    const baseline = manager.fetchTimeline(snapshot.id, {
-      direction: "tail",
-      limit: 2,
-    });
-    expect(baseline.rows).toHaveLength(2);
+    const live = manager.getAgent(snapshot.id) as Record<string, unknown>;
+    expect(live).not.toBeNull();
+    expect("timeline" in live).toBe(false);
+    expect("timelineRows" in live).toBe(false);
+    expect("timelineNextSeq" in live).toBe(false);
 
-    const result = manager.fetchTimeline(snapshot.id, {
-      direction: "after",
-      cursor: {
-        epoch: "stale-epoch",
-        seq: baseline.rows[baseline.rows.length - 1]!.seq,
+    expect(manager.getTimeline(snapshot.id)).toEqual([
+      {
+        type: "user_message",
+        text: "hello boundary",
+        messageId: "msg-boundary-1",
       },
-      limit: 1,
-    });
+      {
+        type: "assistant_message",
+        text: "history stays behind manager",
+      },
+    ]);
 
-    expect(result.reset).toBe(true);
-    expect(result.staleCursor).toBe(true);
-    expect(result.gap).toBe(false);
-    expect(result.rows).toHaveLength(3);
-    expect(result.rows[0]?.seq).toBe(1);
-    expect(result.rows[result.rows.length - 1]?.seq).toBe(3);
+    const fetched = await manager.fetchTimeline(snapshot.id, {
+      direction: "tail",
+      limit: 0,
+    });
+    expect(fetched.rows.map((row) => row.seq)).toEqual([1, 2]);
   });
 
-  test("emits live timeline updates without recording canonical timeline rows", async () => {
-    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-live-timeline-"));
+  test("streams assistant chunks incrementally and persists canonical chunk rows", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-provisional-timeline-"));
     const storagePath = join(workdir, "agents");
     const storage = new AgentStorage(storagePath, logger);
     const manager = new AgentManager({
       clients: {
-        codex: new TestAgentClient(),
+        codex: new StreamingAssistantClient(),
       },
       registry: storage,
       logger,
@@ -1078,9 +1410,9 @@ describe("AgentManager", () => {
 
     const streamEvents: Array<{
       seq?: number;
-      epoch?: string;
       eventType?: string;
       itemType?: string;
+      text?: string;
     }> = [];
     manager.subscribe(
       (event) => {
@@ -1089,37 +1421,69 @@ describe("AgentManager", () => {
         }
         streamEvents.push({
           seq: event.seq,
-          epoch: event.epoch,
           eventType: event.event.type,
           itemType: event.event.type === "timeline" ? event.event.item.type : undefined,
+          text:
+            event.event.type === "timeline" && event.event.item.type === "assistant_message"
+              ? event.event.item.text
+              : undefined,
         });
       },
       { agentId: snapshot.id, replayState: false },
     );
 
-    await manager.emitLiveTimelineItem(snapshot.id, {
-      type: "assistant_message",
-      text: "live-only update",
-    });
+    const stream = manager.streamAgent(snapshot.id, "hello");
+    while (true) {
+      const next = await stream.next();
+      if (next.done) {
+        break;
+      }
+    }
 
-    expect(streamEvents).toHaveLength(1);
-    expect(streamEvents[0]).toMatchObject({
+    const assistantTimelineEvents = streamEvents.filter(
+      (event) => event.itemType === "assistant_message",
+    );
+    expect(assistantTimelineEvents).toHaveLength(2);
+    expect(assistantTimelineEvents[0]).toMatchObject({
+      eventType: "timeline",
+      itemType: "assistant_message",
+      text: "final ",
+      seq: 1,
+    });
+    expect(assistantTimelineEvents[1]).toMatchObject({
       eventType: "timeline",
       itemType: "assistant_message",
+      text: "reply",
+      seq: 2,
     });
-    expect(streamEvents[0]?.seq).toBeUndefined();
-    expect(streamEvents[0]?.epoch).toBeUndefined();
-    expect(manager.getTimeline(snapshot.id)).toEqual([]);
 
-    const fetched = manager.fetchTimeline(snapshot.id, {
+    expect(manager.getTimeline(snapshot.id)).toEqual([
+      {
+        type: "assistant_message",
+        text: "final ",
+      },
+      {
+        type: "assistant_message",
+        text: "reply",
+      },
+    ]);
+    const fetched = await manager.fetchTimeline(snapshot.id, {
       direction: "tail",
       limit: 0,
     });
-    expect(fetched.rows).toEqual([]);
+    expect(fetched.rows).toHaveLength(2);
+    expect(fetched.rows[0]?.item).toEqual({
+      type: "assistant_message",
+      text: "final ",
+    });
+    expect(fetched.rows[1]?.item).toEqual({
+      type: "assistant_message",
+      text: "reply",
+    });
   });
 
-  test("fetchTimeline returns full timeline with reset when cursor seq falls behind retention window", async () => {
-    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-gap-"));
+  test("fetchTimeline supports older-history pagination with before seq", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-before-"));
     const storagePath = join(workdir, "agents");
     const storage = new AgentStorage(storagePath, logger);
     const manager = new AgentManager({
@@ -1128,7 +1492,6 @@ describe("AgentManager", () => {
       },
       registry: storage,
       logger,
-      maxTimelineItems: 2,
       idFactory: () => "00000000-0000-4000-8000-000000000119",
     });
@@ -1153,32 +1516,27 @@ describe("AgentManager", () => {
       type: "assistant_message",
       text: "fourth",
     });
-
-    const fresh = manager.fetchTimeline(snapshot.id, {
-      direction: "tail",
-      limit: 0,
+    await manager.appendTimelineItem(snapshot.id, {
+      type: "assistant_message",
+      text: "fifth",
     });
-    expect(fresh.window.minSeq).toBe(3);
-    expect(fresh.window.maxSeq).toBe(4);
 
-    const result = manager.fetchTimeline(snapshot.id, {
-      direction: "after",
+    const result = await manager.fetchTimeline(snapshot.id, {
+      direction: "before",
       cursor: {
-        epoch: fresh.epoch,
-        seq: 1,
+        seq: 5,
       },
-      limit: 10,
+      limit: 2,
     });
 
-    expect(result.reset).toBe(true);
-    expect(result.staleCursor).toBe(false);
-    expect(result.gap).toBe(true);
     expect(result.rows).toHaveLength(2);
     expect(result.rows[0]?.seq).toBe(3);
     expect(result.rows[1]?.seq).toBe(4);
+    expect(result.hasOlder).toBe(true);
+    expect(result.hasNewer).toBe(true);
   });
 
-  test("does not trim timeline by default", async () => {
+  test("does not trim committed history", async () => {
     const workdir = mkdtempSync(join(tmpdir(), "agent-manager-timeline-unbounded-"));
     const storagePath = join(workdir, "agents");
     const storage = new AgentStorage(storagePath, logger);
@@ -1209,7 +1567,7 @@ describe("AgentManager", () => {
       text: "third",
     });
 
-    const fetched = manager.fetchTimeline(snapshot.id, {
+    const fetched = await manager.fetchTimeline(snapshot.id, {
       direction: "tail",
       limit: 0,
     });
@@ -1218,6 +1576,183 @@ describe("AgentManager", () => {
     expect(fetched.window.maxSeq).toBe(3);
   });
 
+  test("hydrateTimeline preserves assistant chunk, reasoning, and tool timeline history", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-history-canonical-assistant-"));
+    const storagePath = join(workdir, "agents");
+    const storage = new AgentStorage(storagePath, logger);
+
+    class ChunkedAssistantHistorySession extends TestAgentSession {
+      constructor(config: AgentSessionConfig) {
+        super(config);
+      }
+
+      async *streamHistory(): AsyncGenerator<AgentStreamEvent> {
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "assistant_message", text: "chunk one " },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "assistant_message", text: "chunk two" },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "reasoning", text: "internal" },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: {
+            type: "tool_call",
+            callId: "call-history-1",
+            name: "shell",
+            status: "completed",
+            detail: {
+              type: "shell",
+              command: "echo hi",
+              output: "hi\n",
+              exitCode: 0,
+            },
+            error: null,
+          },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "assistant_message", text: "final answer" },
+        };
+      }
+    }
+
+    class ChunkedAssistantHistoryClient implements AgentClient {
+      readonly provider = "codex" as const;
+      readonly capabilities = TEST_CAPABILITIES;
+
+      async isAvailable(): Promise<boolean> {
+        return true;
+      }
+
+      async createSession(config: AgentSessionConfig): Promise<AgentSession> {
+        return new ChunkedAssistantHistorySession(config);
+      }
+
+      async resumeSession(): Promise<AgentSession> {
+        throw new Error("Not used in this test");
+      }
+    }
+
+    const manager = new AgentManager({
+      clients: {
+        codex: new ChunkedAssistantHistoryClient(),
+      },
+      registry: storage,
+      logger,
+      idFactory: () => "00000000-0000-4000-8000-000000000121",
+    });
+
+    const snapshot = await manager.createAgent({
+      provider: "codex",
+      cwd: workdir,
+    });
+
+    await manager.hydrateTimelineFromProvider(snapshot.id);
+
+    expect(manager.getTimeline(snapshot.id)).toEqual([
+      { type: "assistant_message", text: "chunk one " },
+      { type: "assistant_message", text: "chunk two" },
+      { type: "reasoning", text: "internal" },
+      {
+        type: "tool_call",
+        callId: "call-history-1",
+        name: "shell",
+        status: "completed",
+        detail: {
+          type: "shell",
+          command: "echo hi",
+          output: "hi\n",
+          exitCode: 0,
+        },
+        error: null,
+      },
+      { type: "assistant_message", text: "final answer" },
+    ]);
+  });
+
+  test("hydrateTimeline preserves reasoning between assistant chunks", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-history-reasoning-interleave-"));
+    const storagePath = join(workdir, "agents");
+    const storage = new AgentStorage(storagePath, logger);
+
+    class ReasoningInterleavedHistorySession extends TestAgentSession {
+      constructor(config: AgentSessionConfig) {
+        super(config);
+      }
+
+      async *streamHistory(): AsyncGenerator<AgentStreamEvent> {
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "assistant_message", text: "before reasoning " },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "reasoning", text: "internal step" },
+        };
+        yield {
+          type: "timeline",
+          provider: this.provider,
+          item: { type: "assistant_message", text: "after reasoning" },
+        };
+      }
+    }
+
+    class ReasoningInterleavedHistoryClient implements AgentClient {
+      readonly provider = "codex" as const;
+      readonly capabilities = TEST_CAPABILITIES;
+
+      async isAvailable(): Promise<boolean> {
+        return true;
+      }
+
+      async createSession(config: AgentSessionConfig): Promise<AgentSession> {
+        return new ReasoningInterleavedHistorySession(config);
+      }
+
+      async resumeSession(): Promise<AgentSession> {
+        throw new Error("Not used in this test");
+      }
+    }
+
+    const manager = new AgentManager({
+      clients: {
+        codex: new ReasoningInterleavedHistoryClient(),
+      },
+      registry: storage,
+      logger,
+      idFactory: () => "00000000-0000-4000-8000-000000000122",
+    });
+
+    const snapshot = await manager.createAgent({
+      provider: "codex",
+      cwd: workdir,
+    });
+
+    await manager.hydrateTimelineFromProvider(snapshot.id);
+
+    expect(manager.getTimeline(snapshot.id)).toEqual([
+      {
+        type: "assistant_message",
+        text: "before reasoning ",
+      },
+      { type: "reasoning", text: "internal step" },
+      { type: "assistant_message", text: "after reasoning" },
+    ]);
+  });
+
   test("createAgent fails when generated agent ID is not a UUID", async () => {
     const workdir = mkdtempSync(join(tmpdir(), "agent-manager-test-"));
     const storagePath = join(workdir, "agents");
@@ -1265,29 +1800,41 @@ describe("AgentManager", () => {
   test("createAgent persists provided title before returning", async () => {
     const agentId = "00000000-0000-4000-8000-000000000102";
     const workdir = mkdtempSync(join(tmpdir(), "agent-manager-test-"));
-    const storagePath = join(workdir, "agents");
-    const storage = new AgentStorage(storagePath, logger);
-    const manager = new AgentManager({
-      clients: {
-        codex: new TestAgentClient(),
-      },
-      registry: storage,
-      logger,
-      idFactory: () => agentId,
-    });
+    const dataDir = join(workdir, "db");
+    const database = await openPaseoDatabase(dataDir);
 
-    const snapshot = await manager.createAgent({
-      provider: "codex",
-      cwd: workdir,
-      title: "Fix Login Bug",
-    });
+    try {
+      const workspaceId = await seedWorkspace(database, { directory: workdir });
+      const storage = new DbAgentSnapshotStore(database.db);
+      const manager = new AgentManager({
+        clients: {
+          codex: new TestAgentClient(),
+        },
+        registry: storage,
+        logger,
+        idFactory: () => agentId,
+      });
+
+      const snapshot = await manager.createAgent(
+        {
+          provider: "codex",
+          cwd: workdir,
+          title: "Fix Login Bug",
+        },
+        undefined,
+        { workspaceId },
+      );
 
-    expect(snapshot.id).toBe(agentId);
-    expect(snapshot.lifecycle).toBe("idle");
+      expect(snapshot.id).toBe(agentId);
+      expect(snapshot.lifecycle).toBe("idle");
 
-    const persisted = await storage.get(agentId);
-    expect(persisted?.title).toBe("Fix Login Bug");
-    expect(persisted?.id).toBe(agentId);
+      const persisted = await storage.get(agentId);
+      expect(persisted?.title).toBe("Fix Login Bug");
+      expect(persisted?.id).toBe(agentId);
+    } finally {
+      await database.close();
+      rmSync(workdir, { recursive: true, force: true });
+    }
   });
 
   test("createAgent populates runtimeInfo after session creation", async () => {
@@ -2456,6 +3003,7 @@ describe("AgentManager", () => {
     });
 
     await expect(manager.runAgent(agent.id, "fail once")).rejects.toThrow("boom-1");
+    await manager.flush();
 
     const afterFirstFailure = manager.getAgent(agent.id);
     expect(afterFirstFailure?.lifecycle).toBe("error");
@@ -2465,14 +3013,26 @@ describe("AgentManager", () => {
       attentionReason: "error",
     });
 
+    const persistedAfterFirstFailure = await storage.get(agent.id);
+    expect(persistedAfterFirstFailure?.lastStatus).toBe("error");
+    expect(persistedAfterFirstFailure?.requiresAttention).toBe(true);
+    expect(persistedAfterFirstFailure?.attentionReason).toBe("error");
+
     await manager.clearAgentAttention(agent.id);
     manager.notifyAgentState(agent.id);
+    await manager.flush();
 
     const afterClear = manager.getAgent(agent.id);
     expect(afterClear?.lifecycle).toBe("error");
     expect(afterClear?.attention).toEqual({ requiresAttention: false });
 
+    const persistedAfterClear = await storage.get(agent.id);
+    expect(persistedAfterClear?.lastStatus).toBe("error");
+    expect(persistedAfterClear?.requiresAttention).toBe(false);
+    expect(persistedAfterClear?.attentionReason).toBeNull();
+
     await expect(manager.runAgent(agent.id, "fail again")).rejects.toThrow("boom-2");
+    await manager.flush();
 
     const afterSecondFailure = manager.getAgent(agent.id);
     expect(afterSecondFailure?.lifecycle).toBe("error");
@@ -2481,6 +3041,11 @@ describe("AgentManager", () => {
       attentionReason: "error",
     });
     expect(attentionReasons).toEqual(["error", "error"]);
+
+    const persistedAfterSecondFailure = await storage.get(agent.id);
+    expect(persistedAfterSecondFailure?.lastStatus).toBe("error");
+    expect(persistedAfterSecondFailure?.requiresAttention).toBe(true);
+    expect(persistedAfterSecondFailure?.attentionReason).toBe("error");
   });
 
   test("archiveAgent persists archivedAt and updatedAt before emitting closed state", async () => {
@@ -2888,6 +3453,10 @@ describe("AgentManager", () => {
     // The manager should have updated currentModeId to reflect this
     const updatedAgent = manager.getAgent(snapshot.id);
     expect(updatedAgent?.currentModeId).toBe("acceptEdits");
+
+    await manager.flush();
+    const persisted = await storage.get(snapshot.id);
+    expect(persisted?.lastModeId).toBe("acceptEdits");
   });
 
   test("respondToPermission refreshes features and runtime info after provider-managed plan approval", async () => {
@@ -3249,6 +3818,41 @@ describe("AgentManager", () => {
     expect(persisted?.persistence?.sessionId).toBe(snapshot.persistence?.sessionId);
   });
 
+  test("closeAgent persists one final closed snapshot", async () => {
+    const workdir = mkdtempSync(join(tmpdir(), "agent-manager-close-no-persist-"));
+    const storagePath = join(workdir, "agents");
+    const storage = new AgentStorage(storagePath, logger);
+    const applySnapshotSpy = vi.spyOn(storage, "applySnapshot");
+    const manager = new AgentManager({
+      clients: {
+        codex: new TestAgentClient(),
+      },
+      registry: storage,
+      logger,
+      idFactory: () => "00000000-0000-4000-8000-000000000112",
+    });
+
+    try {
+      const snapshot = await manager.createAgent({
+        provider: "codex",
+        cwd: workdir,
+      });
+
+      await manager.flush();
+      const persistCountBeforeClose = applySnapshotSpy.mock.calls.length;
+
+      await manager.closeAgent(snapshot.id);
+      await manager.flush();
+
+      expect(applySnapshotSpy).toHaveBeenCalledTimes(persistCountBeforeClose + 1);
+    } finally {
+      applySnapshotSpy.mockRestore();
+      await manager.flush().catch(() => undefined);
+      await storage.flush().catch(() => undefined);
+      rmSync(workdir, { recursive: true, force: true });
+    }
+  });
+
   test("hydrateTimeline skips provider user_message items to prevent duplicates with recordUserMessage", async () => {
     const workdir = mkdtempSync(join(tmpdir(), "agent-manager-history-dedup-"));
     const storagePath = join(workdir, "agents");
diff --git a/packages/server/src/server/agent/agent-manager.ts b/packages/server/src/server/agent/agent-manager.ts
index cd84027ec..ffe885d0c 100644
--- a/packages/server/src/server/agent/agent-manager.ts
+++ b/packages/server/src/server/agent/agent-manager.ts
@@ -7,6 +7,7 @@ import {
 } from "../../shared/agent-lifecycle.js";
 import type { Logger } from "pino";
 import { z } from "zod";
+import type { TerminalManager } from "../../terminal/terminal-manager.js";
 
 import type {
   AgentCapabilityFlags,
@@ -31,10 +32,29 @@ import type {
   ListPersistedAgentsOptions,
   PersistedAgentDescriptor,
 } from "./agent-sdk-types.js";
-import type { AgentStorage } from "./agent-storage.js";
+import type { StoredAgentRecord } from "./agent-storage.js";
+import type { AgentSnapshotStore } from "./agent-snapshot-store.js";
+import {
+  InMemoryAgentTimelineStore,
+  type SeedAgentTimelineOptions,
+} from "./agent-timeline-store.js";
+import type {
+  AgentTimelineFetchOptions,
+  AgentTimelineFetchResult,
+  AgentTimelineRow,
+  AgentTimelineStore,
+} from "./agent-timeline-store-types.js";
 import { AGENT_PROVIDER_IDS } from "./provider-manifest.js";
 
 export { AGENT_LIFECYCLE_STATUSES, type AgentLifecycleStatus };
+export type {
+  AgentTimelineCursor,
+  AgentTimelineFetchDirection,
+  AgentTimelineFetchOptions,
+  AgentTimelineFetchResult,
+  AgentTimelineRow,
+  AgentTimelineWindow,
+} from "./agent-timeline-store-types.js";
 
 export type AgentManagerEvent =
   | { type: "agent_state"; agent: ManagedAgent }
@@ -43,7 +63,6 @@ export type AgentManagerEvent =
       agentId: string;
       event: AgentStreamEvent;
       seq?: number;
-      epoch?: string;
     };
 
 export type AgentSubscriber = (event: AgentManagerEvent) => void;
@@ -71,10 +90,11 @@ export type ProviderAvailability = {
 
 export type AgentManagerOptions = {
   clients?: Partial<Record<AgentProviderId, AgentClient>>;
-  maxTimelineItems?: number;
   idFactory?: () => string;
-  registry?: AgentStorage;
+  registry?: AgentSnapshotStore;
   onAgentAttention?: AgentAttentionCallback;
+  durableTimelineStore?: AgentTimelineStore;
+  terminalManager?: TerminalManager | null;
   logger: Logger;
 };
@@ -93,48 +113,6 @@ export type WaitForAgentStartOptions = {
   signal?: AbortSignal;
 };
 
-export type AgentTimelineRow = {
-  seq: number;
-  timestamp: string;
-  item: AgentTimelineItem;
-};
-
-export type AgentTimelineCursor = {
-  epoch: string;
-  seq: number;
-};
-
-export type AgentTimelineFetchDirection = "tail" | "before" | "after";
-
-export type AgentTimelineFetchOptions = {
-  direction?: AgentTimelineFetchDirection;
-  cursor?: AgentTimelineCursor;
-  /**
-   * Number of canonical rows to return.
-   * - undefined: manager default
-   * - 0: all rows in the selected window
-   */
-  limit?: number;
-};
-
-export type AgentTimelineWindow = {
-  minSeq: number;
-  maxSeq: number;
-  nextSeq: number;
-};
-
-export type AgentTimelineFetchResult = {
-  epoch: string;
-  direction: AgentTimelineFetchDirection;
-  reset: boolean;
-  staleCursor: boolean;
-  gap: boolean;
-  window: AgentTimelineWindow;
-  hasOlder: boolean;
-  hasNewer: boolean;
-  rows: AgentTimelineRow[];
-};
-
 type AttentionState =
   | { requiresAttention: false }
   | {
@@ -178,10 +156,6 @@ type ManagedAgentBase = {
   >;
   inFlightPermissionResponses: Set<string>;
   pendingReplacement: boolean;
-  timeline: AgentTimelineItem[];
-  timelineRows: AgentTimelineRow[];
-  timelineEpoch: string;
-  timelineNextSeq: number;
   persistence: AgentPersistenceHandle | null;
   historyPrimed: boolean;
   lastUserMessageAt: Date | null;
@@ -254,6 +228,8 @@ type ActiveManagedAgent =
   | ManagedAgentRunning
   | ManagedAgentError;
 
+type LiveManagedAgent = ActiveManagedAgent;
+
 const SYSTEM_ERROR_PREFIX = "[System Error]";
 
 function attachPersistenceCwd(
@@ -277,7 +253,6 @@ type SubscriptionRecord = {
   agentId: string | null;
 };
 
-const DEFAULT_TIMELINE_FETCH_LIMIT = 200;
 const BUSY_STATUSES: AgentLifecycleStatus[] = ["initializing", "running"];
 
 const AgentIdSchema = z.string().uuid();
@@ -322,27 +297,24 @@ function normalizeMessageId(messageId: string | undefined): string | undefined {
 
 export class AgentManager {
   private readonly clients = new Map();
-  private readonly agents = new Map<string, ManagedAgent>();
+  private readonly agents = new Map<string, LiveManagedAgent>();
+  private readonly timelineStore = new InMemoryAgentTimelineStore();
+  private readonly agentsAwaitingInitialSnapshotPersist = new Set<string>();
+  private readonly sessionEventTails = new Map<string, Promise<void>>();
   private readonly pendingForegroundRuns = new Map();
   private readonly subscribers = new Set();
-  private readonly maxTimelineItems: number | null;
   private readonly idFactory: () => string;
-  private readonly registry?: AgentStorage;
+  private readonly registry?: AgentSnapshotStore;
+  private readonly durableTimelineStore?: AgentTimelineStore;
   private readonly previousStatuses = new Map();
   private readonly backgroundTasks = new Set<Promise<void>>();
   private onAgentAttention?: AgentAttentionCallback;
   private logger: Logger;
 
   constructor(options: AgentManagerOptions) {
-    const maxTimelineItems = options?.maxTimelineItems;
-    this.maxTimelineItems =
-      typeof maxTimelineItems === "number" &&
-      Number.isFinite(maxTimelineItems) &&
-      maxTimelineItems >= 0
-        ? Math.floor(maxTimelineItems)
-        : null;
     this.idFactory = options?.idFactory ?? (() => randomUUID());
     this.registry = options?.registry;
+    this.durableTimelineStore = options?.durableTimelineStore;
     this.onAgentAttention = options?.onAgentAttention;
     this.logger = options.logger.child({ module: "agent", component: "agent-manager" });
     if (options?.clients) {
@@ -375,7 +347,11 @@ export class AgentManager {
         withActiveForegroundTurn++;
       }
 
-      const len = agent.timeline.length;
+      if (!this.timelineStore.has(agent.id)) {
+        continue;
+      }
+
+      const len = this.timelineStore.getItems(agent.id).length;
       totalItems += len;
       if (len > maxItemsPerAgent) {
         maxItemsPerAgent = len;
@@ -586,149 +562,41 @@ export class AgentManager {
   }
 
   getTimeline(id: string): AgentTimelineItem[] {
-    const agent = this.requireAgent(id);
-    return [...agent.timeline];
+    this.requireAgent(id);
+    return this.timelineStore.getItems(id);
   }
 
-  getTimelineRows(id: string): AgentTimelineRow[] {
-    const agent = this.requireAgent(id);
-    const { rows } = this.ensureTimelineState(agent);
-    return rows.map((row) => ({ ...row }));
-  }
-
-  fetchTimeline(id: string, options?: AgentTimelineFetchOptions): AgentTimelineFetchResult {
-    const agent = this.requireAgent(id);
-    const { rows, epoch, nextSeq, minSeq, maxSeq } = this.ensureTimelineState(agent);
-    const direction = options?.direction ?? "tail";
-    const requestedLimit = options?.limit;
-    const limit =
-      requestedLimit === undefined
-        ? DEFAULT_TIMELINE_FETCH_LIMIT
-        : Math.max(0, Math.floor(requestedLimit));
-    const cursor = options?.cursor;
-
-    const window: AgentTimelineWindow = { minSeq, maxSeq, nextSeq };
-
-    if (cursor && cursor.epoch !== epoch) {
-      return {
-        epoch,
-        direction,
-        reset: true,
-        staleCursor: true,
-        gap: false,
-        window,
-        hasOlder: false,
-        hasNewer: false,
-        rows: rows.map((row) => ({ ...row })),
-      };
-    }
-
-    const selectAll = limit === 0;
-    const cloneRows = (items: AgentTimelineRow[]) => items.map((row) => ({ ...row }));
-
-    if (direction === "after" && cursor && rows.length > 0 && cursor.seq < minSeq - 1) {
-      return {
-        epoch,
-        direction,
-        reset: true,
-        staleCursor: false,
-        gap: true,
-        window,
-        hasOlder: false,
-        hasNewer: false,
-        rows: cloneRows(rows),
-      };
-    }
-
-    if (rows.length === 0) {
-      return {
-        epoch,
-        direction,
-        reset: false,
-        staleCursor: false,
-        gap: false,
-        window,
-        hasOlder: false,
-        hasNewer: false,
-        rows: [],
-      };
+  async getTimelineRows(id: string): Promise<AgentTimelineRow[]> {
+    this.requireAgent(id);
+    if (this.durableTimelineStore) {
+      return await this.durableTimelineStore.getCommittedRows(id);
     }
+    return this.timelineStore.getRows(id);
+  }
 
-    if (direction === "tail") {
-      const selected = selectAll || limit >= rows.length ? rows : rows.slice(rows.length - limit);
-      const hasOlder = selected.length > 0 && selected[0]!.seq > minSeq;
-      return {
-        epoch,
-        direction,
-        reset: false,
-        staleCursor: false,
-        gap: false,
-        window,
-        hasOlder,
-        hasNewer: false,
-        rows: cloneRows(selected),
-      };
+  async fetchTimeline(
+    id: string,
+    options?: AgentTimelineFetchOptions,
+  ): Promise<AgentTimelineFetchResult> {
+    this.requireAgent(id);
+    const agent = this.agents.get(id);
+    if (agent && !agent.historyPrimed) {
+      await this.hydrateTimelineFromLegacyProviderHistory(agent);
     }
-
-    if (direction === "after") {
-      const baseSeq = cursor?.seq ?? 0;
-      const startIdx = rows.findIndex((row) => row.seq > baseSeq);
-      if (startIdx < 0) {
-        return {
-          epoch,
-          direction,
-          reset: false,
-          staleCursor: false,
-          gap: false,
-          window,
-          hasOlder: baseSeq >= minSeq,
-          hasNewer: false,
-          rows: [],
-        };
-      }
-
-      const selected = selectAll ? rows.slice(startIdx) : rows.slice(startIdx, startIdx + limit);
-      const lastSelected = selected[selected.length - 1];
-      return {
-        epoch,
-        direction,
-        reset: false,
-        staleCursor: false,
-        gap: false,
-        window,
-        hasOlder: selected[0]!.seq > minSeq,
-        hasNewer: Boolean(lastSelected && lastSelected.seq < maxSeq),
-        rows: cloneRows(selected),
-      };
+    if (this.durableTimelineStore) {
+      return await this.durableTimelineStore.fetchCommitted(id, options);
     }
-
-    // direction === "before"
-    const beforeSeq = cursor?.seq ?? nextSeq;
-    const endExclusive = rows.findIndex((row) => row.seq >= beforeSeq);
-    const boundedRows = endExclusive < 0 ? rows : rows.slice(0, endExclusive);
-    const selected =
-      selectAll || limit >= boundedRows.length
-        ? boundedRows
-        : boundedRows.slice(boundedRows.length - limit);
-    const hasOlder = selected.length > 0 && selected[0]!.seq > minSeq;
-    const hasNewer = endExclusive >= 0;
-    return {
-      epoch,
-      direction,
-      reset: false,
-      staleCursor: false,
-      gap: false,
-      window,
-      hasOlder,
-      hasNewer,
-      rows: cloneRows(selected),
-    };
+    return this.timelineStore.fetch(id, options);
   }
 
   async createAgent(
     config: AgentSessionConfig,
     agentId?: string,
-    options?: { labels?: Record<string, string> },
+    options?: {
+      labels?: Record<string, string>;
+      workspaceId?: number;
+      initialPrompt?: string;
+    },
   ): Promise<ManagedAgent> {
     // Generate agent ID early so we can use it in MCP config
     const resolvedAgentId = validateAgentId(agentId ?? this.idFactory(), "createAgent");
@@ -744,6 +612,7 @@ export class AgentManager {
     const session = await client.createSession(normalizedConfig, launchContext);
     return this.registerSession(session, normalizedConfig, resolvedAgentId, {
       labels: options?.labels,
+      workspaceId: options?.workspaceId,
     });
   }
@@ -787,16 +656,11 @@ export class AgentManager {
     agentId: string,
     overrides?: Partial<AgentSessionConfig>,
   ): Promise<ManagedAgent> {
-    let existing = this.requireAgent(agentId);
+    let existing = this.requireSessionAgent(agentId);
     if (this.hasInFlightRun(agentId)) {
       await this.cancelAgentRun(agentId);
-      existing = this.requireAgent(agentId);
+      existing = this.requireSessionAgent(agentId);
     }
-    const timelineState = this.ensureTimelineState(existing);
-    const preservedTimeline = [...existing.timeline];
-    const preservedTimelineRows = timelineState.rows.map((row) => ({ ...row }));
-    const preservedTimelineEpoch = timelineState.epoch;
-    const preservedTimelineNextSeq = timelineState.nextSeq;
     const preservedHistoryPrimed = existing.historyPrimed;
     const preservedLastUsage = existing.lastUsage;
     const preservedLastError = existing.lastError;
@@ -839,10 +703,6 @@ export class AgentManager {
       createdAt: existing.createdAt,
       updatedAt: existing.updatedAt,
       lastUserMessageAt: existing.lastUserMessageAt,
-      timeline: preservedTimeline,
-      timelineRows: preservedTimelineRows,
-      timelineEpoch: preservedTimelineEpoch,
-      timelineNextSeq: preservedTimelineNextSeq,
      historyPrimed: preservedHistoryPrimed,
       lastUsage: preservedLastUsage,
       lastError: preservedLastError,
@@ -861,34 +721,10 @@ export class AgentManager {
       },
       "closeAgent: start",
     );
-    this.agents.delete(agentId);
-    // Clean up previousStatus to prevent memory leak
-    this.previousStatuses.delete(agentId);
-    if (agent.unsubscribeSession) {
-      agent.unsubscribeSession();
-      agent.unsubscribeSession = null;
-    }
-    for (const waiter of agent.foregroundTurnWaiters) {
-      // Wake up the generator so it can exit the await loop
-      waiter.callback({
-        type: "turn_canceled",
-        provider: agent.provider,
-        reason: "agent closed",
-        turnId: waiter.turnId,
-      });
-      this.settleForegroundTurnWaiter(waiter);
-    }
-    agent.foregroundTurnWaiters.clear();
-    this.settlePendingForegroundRun(agentId);
-    const session = agent.session;
-    const closedAgent: ManagedAgent = {
-      ...agent,
-      lifecycle: "closed",
-      session: null,
-      activeForegroundTurnId: null,
-    };
-    await session.close();
-    this.emitState(closedAgent);
+    const closedAgent = this.prepareAgentForClosure(agent, "agent closed");
+    await agent.session.close();
+    this.timelineStore.delete(agentId);
+    this.emitClosedAgent(closedAgent);
     this.logger.trace({ agentId }, "closeAgent: completed");
   }
@@ -928,7 +764,7 @@ export class AgentManager {
   }
 
   async setAgentMode(agentId: string, modeId: string): Promise<void> {
-    const agent = this.requireAgent(agentId);
+    const agent = this.requireSessionAgent(agentId);
     await agent.session.setMode(modeId);
     agent.currentModeId = modeId;
     // Update runtimeInfo to reflect the new mode
@@ -940,7 +776,7 @@ export class AgentManager {
   }
 
   async setAgentModel(agentId: string, modelId: string | null): Promise<void> {
-    const agent = this.requireAgent(agentId);
+    const agent = this.requireSessionAgent(agentId);
     const normalizedModelId =
       typeof modelId === "string" && modelId.trim().length > 0 ? modelId : null;
@@ -957,7 +793,7 @@ export class AgentManager {
   }
 
   async setAgentThinkingOption(agentId: string, thinkingOptionId: string | null): Promise<void> {
-    const agent = this.requireAgent(agentId);
+    const agent = this.requireSessionAgent(agentId);
     const normalizedThinkingOptionId =
       typeof thinkingOptionId === "string" && thinkingOptionId.trim().length > 0
         ? thinkingOptionId
@@ -968,6 +804,12 @@ export class AgentManager {
     }
     agent.config.thinkingOptionId = normalizedThinkingOptionId ?? undefined;
+    if (agent.runtimeInfo) {
+      agent.runtimeInfo = {
+        ...agent.runtimeInfo,
+        thinkingOptionId: normalizedThinkingOptionId,
+      };
+    }
     this.touchUpdatedAt(agent);
     this.emitState(agent);
   }
@@ -991,17 +833,24 @@ export class AgentManager {
     if (!normalizedTitle) {
       return;
     }
+    if (
+      this.agentsAwaitingInitialSnapshotPersist.has(agent.id) &&
+      this.registry &&
+      (await this.registry.get(agent.id)) === null
+    ) {
+      return;
+    }
     this.touchUpdatedAt(agent);
     await this.persistSnapshot(agent, { title: normalizedTitle });
-    this.emitState(agent);
+    this.emitState(agent, { persist: false });
   }
 
   async setLabels(agentId: string, labels: Record<string, string>): Promise<void> {
     const agent = this.requireAgent(agentId);
     agent.labels = { ...agent.labels, ...labels };
-    await this.persistSnapshot(agent);
     this.touchUpdatedAt(agent);
-    this.emitState(agent);
+    await this.persistSnapshot(agent);
+    this.emitState(agent, { persist: false });
   }
 
   notifyAgentState(agentId: string): void {
@@ -1018,8 +867,103 @@ export class AgentManager {
     if (agent.attention.requiresAttention) {
       agent.attention = { requiresAttention: false };
       await this.persistSnapshot(agent);
-      this.emitState(agent);
+      this.emitState(agent, { persist: false });
+    }
+  }
+
+  async archiveSnapshot(agentId: string, archivedAt: string): Promise<StoredAgentRecord> {
+    const registry = this.requireRegistry();
+    const liveAgent = this.getAgent(agentId);
+    if (liveAgent) {
+      await this.persistSnapshot(liveAgent, {
+        internal: liveAgent.internal,
+      });
+    }
+
+    const record = await registry.get(agentId);
+    if (!record) {
+      throw new Error(`Agent not found: ${agentId}`);
+    }
+
+    const normalizedStatus =
+      record.lastStatus === "running" || record.lastStatus === "initializing"
+        ? "idle"
+        : record.lastStatus;
+
+    const nextRecord: StoredAgentRecord = {
+      ...record,
+      archivedAt,
+      lastStatus: normalizedStatus,
+      requiresAttention: false,
+      attentionReason: null,
+      attentionTimestamp: null,
+    };
+    await registry.upsert(nextRecord);
+    return nextRecord;
+  }
+
+  async unarchiveSnapshot(agentId: string): Promise<boolean> {
+    const registry = this.requireRegistry();
+    const record = await registry.get(agentId);
+    if (!record || !record.archivedAt) {
+      return false;
     }
+
+    await registry.upsert({
+      ...record,
+      archivedAt: null,
+    });
+
+    if (this.getAgent(agentId)) {
+      this.notifyAgentState(agentId);
+    }
+    return true;
+  }
+
+  async unarchiveSnapshotByHandle(handle: AgentPersistenceHandle): Promise<void> {
+    const registry = this.requireRegistry();
+    const records = await registry.list();
+    const matched = records.find(
+      (record) =>
+        record.persistence?.provider === handle.provider &&
+        record.persistence?.sessionId === handle.sessionId,
+    );
+    if (!matched) {
+      return;
+    }
+
+    await this.unarchiveSnapshot(matched.id);
+  }
+
+  async updateAgentMetadata(
+    agentId: string,
+    updates: {
+      title?: string;
+      labels?: Record<string, string>;
+    },
+  ): Promise<void> {
+    const liveAgent = this.getAgent(agentId);
+    if (liveAgent) {
+      if (updates.title) {
+        await this.setTitle(agentId, updates.title);
+      }
+      if (updates.labels) {
+        await this.setLabels(agentId, updates.labels);
+      }
+      return;
+    }
+
+    const registry = this.requireRegistry();
+    const existing = await registry.get(agentId);
+    if (!existing) {
+      throw new Error(`Agent not found: ${agentId}`);
+    }
+
+    await registry.upsert({
+      ...existing,
+      ...(updates.title ? { title: updates.title } : {}),
+      ...(updates.labels ? { labels: { ...existing.labels, ...updates.labels } } : {}),
+    });
   }
 
   async runAgent(
@@ -1075,7 +1019,7 @@ export class AgentManager {
     };
     const updatedAt = this.touchUpdatedAt(agent);
     agent.lastUserMessageAt = updatedAt;
-    const row = this.recordTimeline(agent, item);
+    const row = this.recordTimeline(agentId, item);
     this.dispatchStream(
       agentId,
       {
@@ -1085,7 +1029,6 @@ export class AgentManager {
       },
       {
         seq: row.seq,
-        epoch: this.ensureTimelineState(agent).epoch,
       },
     );
     if (options?.emitState !== false) {
@@ -1096,7 +1039,7 @@ export class AgentManager {
   async appendTimelineItem(agentId: string, item: AgentTimelineItem): Promise<void> {
     const agent = this.requireAgent(agentId);
     this.touchUpdatedAt(agent);
-    const row = this.recordTimeline(agent, item);
+    const row = this.recordTimeline(agentId, item);
     this.dispatchStream(
       agentId,
       {
@@ -1106,7 +1049,6 @@ export class AgentManager {
       },
       {
         seq: row.seq,
-        epoch: this.ensureTimelineState(agent).epoch,
       },
     );
     await this.persistSnapshot(agent);
@@ -1127,7 +1069,7 @@ export class AgentManager {
     prompt: AgentPromptInput,
     options?: AgentRunOptions,
   ): AsyncGenerator<AgentStreamEvent> {
-    const existingAgent = this.requireAgent(agentId);
+    const existingAgent = this.requireSessionAgent(agentId);
     this.logger.trace(
       {
         agentId,
@@ -1296,7 +1238,7 @@ export class AgentManager {
       return this.streamAgent(agentId, prompt, options);
     }
 
-    const agent = snapshot as ActiveManagedAgent;
+    const agent = this.requireSessionAgent(agentId);
     agent.pendingReplacement = true;
 
     const self = this;
@@ -1470,7 +1412,7 @@ export class AgentManager {
   }
 
   async cancelAgentRun(agentId: string): Promise<void> {
-    const agent = this.requireAgent(agentId);
+    const agent = this.requireSessionAgent(agentId);
     const pendingRun = this.getPendingForegroundRun(agentId);
     const foregroundTurnId = agent.activeForegroundTurnId;
     const hasForegroundTurn = Boolean(foregroundTurnId);
@@ -1570,7 +1512,7 @@ export class AgentManager {
   }
 
   getPendingPermissions(agentId: string): AgentPermissionRequest[] {
-    const agent = this.requireAgent(agentId);
+    const agent = this.requireSessionAgent(agentId);
     return Array.from(agent.pendingPermissions.values());
   }
@@ -1579,25 +1521,44 @@ export class AgentManager {
     return iterator.done ? null : iterator.value;
   }
 
+  /**
+   * Hydrates the timeline from provider history if the agent's durable
+   * timeline is empty (e.g., imported agents that have provider history
+   * on disk but no persisted timeline rows). No-ops if already hydrated.
+   */
   async hydrateTimelineFromProvider(agentId: string): Promise<void> {
-    const agent = this.requireAgent(agentId);
-    await this.hydrateTimeline(agent);
+    const agent = this.requireSessionAgent(agentId);
+    await this.hydrateTimelineFromLegacyProviderHistory(agent);
   }
 
-  private getLastAssistantMessage(agentId: string): string | null {
+  async deleteCommittedTimeline(agentId: string): Promise<void> {
+    if (!this.durableTimelineStore) {
+      return;
+    }
+    await this.durableTimelineStore.deleteAgent(agentId);
+  }
+
+  async getLastAssistantMessage(agentId: string): Promise<string | null> {
     const agent = this.agents.get(agentId);
     if (!agent) {
       return null;
     }
-    return this.getLastAssistantMessageFromTimeline(agent.timeline);
+    return await this.getLastAssistantMessageFromStores(agentId);
  }
 
   private getLastAssistantMessageFromTimeline(
     timeline: readonly AgentTimelineItem[],
   ): string | null {
+    return this.getLastAssistantMessageSegmentFromTimeline(timeline)?.text ??
null; + } + + private getLastAssistantMessageSegmentFromTimeline( + timeline: readonly AgentTimelineItem[], + ): { text: string; startsAtBeginning: boolean } | null { // Collect the last contiguous assistant messages (Claude streams chunks) const chunks: string[] = []; + let startsAtBeginning = false; for (let i = timeline.length - 1; i >= 0; i--) { const item = timeline[i]; if (item.type !== "assistant_message") { @@ -1607,13 +1568,65 @@ export class AgentManager { continue; } chunks.push(item.text); + startsAtBeginning = i === 0; } if (!chunks.length) { return null; } - return chunks.reverse().join(""); + return { + text: chunks.reverse().join(""), + startsAtBeginning, + }; + } + + private async getLastAssistantMessageFromStores(agentId: string): Promise { + const liveTimeline = this.timelineStore.getItems(agentId); + const liveSegment = this.getLastAssistantMessageSegmentFromTimeline(liveTimeline); + if (!this.durableTimelineStore) { + return liveSegment?.text ?? null; + } + + if (!liveSegment) { + return await this.durableTimelineStore.getLastAssistantMessage(agentId); + } + + if (!liveSegment.startsAtBeginning) { + return liveSegment.text; + } + + const lastDurableItem = await this.durableTimelineStore.getLastItem(agentId); + if (lastDurableItem?.type !== "assistant_message") { + return liveSegment.text; + } + + const durableMessage = await this.durableTimelineStore.getLastAssistantMessage(agentId); + return durableMessage ? 
`${durableMessage}${liveSegment.text}` : liveSegment.text; + } + + private async getLastItemFromStores(agentId: string): Promise { + const lastLiveItem = this.timelineStore.getLastItem(agentId); + if (lastLiveItem) { + return lastLiveItem; + } + if (!this.durableTimelineStore) { + return null; + } + return await this.durableTimelineStore.getLastItem(agentId); + } + + private async hasCommittedUserMessageFromStores( + agentId: string, + options: { messageId: string; text: string }, + ): Promise { + if (this.timelineStore.hasCommittedUserMessage(agentId, options)) { + return true; + } + if (!this.durableTimelineStore) { + return false; + } + return await this.durableTimelineStore.hasCommittedUserMessage(agentId, options); } async waitForAgentEvent( @@ -1633,7 +1646,7 @@ export class AgentManager { return { status: snapshot.lifecycle, permission: immediatePermission, - lastMessage: this.getLastAssistantMessage(agentId), + lastMessage: await this.getLastAssistantMessage(agentId), }; } @@ -1644,14 +1657,14 @@ export class AgentManager { return { status: initialStatus, permission: null, - lastMessage: this.getLastAssistantMessage(agentId), + lastMessage: await this.getLastAssistantMessage(agentId), }; } if (waitForActive && !initialBusy && !hasForegroundTurn) { return { status: initialStatus, permission: null, - lastMessage: this.getLastAssistantMessage(agentId), + lastMessage: await this.getLastAssistantMessage(agentId), }; } @@ -1670,6 +1683,7 @@ export class AgentManager { let currentStatus: AgentLifecycleStatus = initialStatus; let hasStarted = initialBusy || hasForegroundTurn; let terminalStatusOverride: AgentLifecycleStatus | null = null; + let finished = false; // Bug #3 Fix: Declare unsubscribe and abortHandler upfront so cleanup can reference them let unsubscribe: (() => void) | null = null; @@ -1698,12 +1712,20 @@ export class AgentManager { }; const finish = (permission: AgentPermissionRequest | null) => { + if (finished) { + return; + } + finished = true; 
cleanup(); - resolve({ - status: currentStatus, - permission, - lastMessage: this.getLastAssistantMessage(agentId), - }); + void this.getLastAssistantMessage(agentId) + .then((lastMessage) => { + resolve({ + status: currentStatus, + permission, + lastMessage, + }); + }) + .catch(reject); }; // Bug #3 Fix: Set up abort handler BEFORE subscription @@ -1768,13 +1790,13 @@ export class AgentManager { config: AgentSessionConfig, agentId: string, options?: { + workspaceId?: number; createdAt?: Date; updatedAt?: Date; lastUserMessageAt?: Date | null; labels?: Record; timeline?: AgentTimelineItem[]; timelineRows?: AgentTimelineRow[]; - timelineEpoch?: string; timelineNextSeq?: number; historyPrimed?: boolean; lastUsage?: AgentUsage; @@ -1789,19 +1811,32 @@ export class AgentManager { const initialPersistedTitle = await this.resolveInitialPersistedTitle(resolvedAgentId, config); const now = new Date(); - const initialTimeline = options?.timeline ? [...options.timeline] : []; - const initialTimelineRows = options?.timelineRows?.length - ? options.timelineRows.map((row) => ({ ...row })) - : this.buildTimelineRowsFromItems( - initialTimeline, - options?.timelineNextSeq ?? 1, - (options?.updatedAt ?? options?.createdAt ?? now).toISOString(), - ); - const derivedNextSeq = - options?.timelineNextSeq ?? - (initialTimelineRows.length - ? initialTimelineRows[initialTimelineRows.length - 1]!.seq + 1 - : 1); + const explicitTimelineSeed: SeedAgentTimelineOptions | null = + options?.timeline?.length || + options?.timelineRows?.length || + options?.timelineNextSeq !== undefined + ? { + items: options?.timeline, + rows: options?.timelineRows, + nextSeq: options?.timelineNextSeq, + timestamp: (options?.updatedAt ?? options?.createdAt ?? now).toISOString(), + } + : null; + const shouldSeedFromDurable = + !explicitTimelineSeed && + !this.timelineStore.has(resolvedAgentId) && + this.durableTimelineStore !== undefined; + const durableTimelineSeed = shouldSeedFromDurable + ? 
await this.loadCommittedTimelineSeed(resolvedAgentId, now) + : null; + const durableTimelineHasRows = durableTimelineSeed != null && (durableTimelineSeed.nextSeq ?? 1) > 1; + const timelineSeed = explicitTimelineSeed ?? durableTimelineSeed; + if (timelineSeed || !this.timelineStore.has(resolvedAgentId)) { + this.timelineStore.initialize(resolvedAgentId, timelineSeed ?? { timestamp: now.toISOString() }); + } + if (options?.timelineRows?.length) { + this.enqueueDurableTimelineBulkInsert(resolvedAgentId, options.timelineRows); + } const managed = { id: resolvedAgentId, @@ -1816,19 +1851,15 @@ export class AgentManager { updatedAt: options?.updatedAt ?? now, availableModes: [], currentModeId: null, - pendingPermissions: new Map(), + pendingPermissions: new Map(), bufferedPermissionResolutions: new Map(), inFlightPermissionResponses: new Set(), pendingReplacement: false, activeForegroundTurnId: null, - foregroundTurnWaiters: new Set(), + foregroundTurnWaiters: new Set(), unsubscribeSession: null, - timeline: initialTimeline, - timelineRows: initialTimelineRows, - timelineEpoch: options?.timelineEpoch ?? randomUUID(), - timelineNextSeq: derivedNextSeq, persistence: attachPersistenceCwd(session.describePersistence(), config.cwd), - historyPrimed: options?.historyPrimed ?? false, + historyPrimed: options?.historyPrimed ?? durableTimelineHasRows, lastUserMessageAt: options?.lastUserMessageAt ?? 
null, lastUsage: options?.lastUsage, lastError: options?.lastError, @@ -1851,34 +1882,110 @@ export class AgentManager { this.previousStatuses.set(resolvedAgentId, managed.lifecycle); await this.refreshRuntimeInfo(managed); await this.persistSnapshot(managed, { + workspaceId: options?.workspaceId, title: initialPersistedTitle, }); - this.emitState(managed); + this.emitState(managed, { persist: false }); await this.refreshSessionState(managed); managed.lifecycle = "idle"; - await this.persistSnapshot(managed); - this.emitState(managed); + await this.persistSnapshot(managed, { workspaceId: options?.workspaceId }); + this.emitState(managed, { persist: false }); this.subscribeToSession(managed); return { ...managed }; } + private async loadCommittedTimelineSeed( + agentId: string, + now: Date, + ): Promise { + if (!this.durableTimelineStore) { + return { timestamp: now.toISOString() }; + } + + return { + nextSeq: (await this.durableTimelineStore.getLatestCommittedSeq(agentId)) + 1, + timestamp: now.toISOString(), + }; + } + + private prepareAgentForClosure( + agent: LiveManagedAgent, + cancelReason: string, + ): ManagedAgentClosed { + this.agents.delete(agent.id); + this.previousStatuses.delete(agent.id); + if (agent.unsubscribeSession) { + agent.unsubscribeSession(); + agent.unsubscribeSession = null; + } + for (const waiter of agent.foregroundTurnWaiters) { + waiter.callback({ + type: "turn_canceled", + provider: agent.provider, + reason: cancelReason, + turnId: waiter.turnId, + }); + this.settleForegroundTurnWaiter(waiter); + } + agent.foregroundTurnWaiters.clear(); + this.settlePendingForegroundRun(agent.id); + return { + ...agent, + lifecycle: "closed", + session: null, + activeForegroundTurnId: null, + }; + } + + private emitClosedAgent(agent: ManagedAgentClosed): void { + this.emitState(agent); + } private subscribeToSession(agent: ActiveManagedAgent): void { if (agent.unsubscribeSession) { return; } const agentId = agent.id; const unsubscribe = 
agent.session.subscribe((event: AgentStreamEvent) => { - const current = this.agents.get(agentId); - if (!current) { - return; - } - this.dispatchSessionEvent(current, event); + this.enqueueSessionEvent(agentId, event); }); agent.unsubscribeSession = unsubscribe; } - private dispatchSessionEvent(agent: ActiveManagedAgent, event: AgentStreamEvent): void { + private enqueueSessionEvent(agentId: string, event: AgentStreamEvent): void { + const previous = this.sessionEventTails.get(agentId) ?? Promise.resolve(); + const next = previous + .catch(() => undefined) + .then(async () => { + const current = this.agents.get(agentId); + if (!current) { + return; + } + if (current.session == null) { + return; + } + await this.dispatchSessionEvent(current, event); + }) + .catch((err) => { + this.logger.error( + { err, agentId, eventType: event.type }, + "Failed to process session event", + ); + }); + + this.sessionEventTails.set(agentId, next); + this.trackBackgroundTask(next); + void next.finally(() => { + if (this.sessionEventTails.get(agentId) === next) { + this.sessionEventTails.delete(agentId); + } + }); + } + + private async dispatchSessionEvent( + agent: ActiveManagedAgent, + event: AgentStreamEvent, + ): Promise { const turnId = (event as { turnId?: string }).turnId; const matchingWaiters = turnId == null @@ -1887,7 +1994,7 @@ export class AgentManager { (waiter) => waiter.turnId === turnId && !waiter.settled, ); - this.handleStreamEvent(agent, event); + await this.handleStreamEvent(agent, event); for (const waiter of matchingWaiters) { waiter.callback(event); @@ -1958,47 +2065,9 @@ export class AgentManager { return null; } - private buildTimelineRowsFromItems( - items: readonly AgentTimelineItem[], - startSeq: number, - timestamp: string, - ): AgentTimelineRow[] { - let nextSeq = startSeq; - return items.map((item) => { - const row: AgentTimelineRow = { - seq: nextSeq, - timestamp, - item, - }; - nextSeq += 1; - return row; - }); - } - - private 
ensureTimelineState(agent: ManagedAgent): { - rows: AgentTimelineRow[]; - epoch: string; - nextSeq: number; - minSeq: number; - maxSeq: number; - } { - const minSeq = agent.timelineRows.length ? agent.timelineRows[0]!.seq : 0; - const maxSeq = agent.timelineRows.length - ? agent.timelineRows[agent.timelineRows.length - 1]!.seq - : 0; - - return { - rows: agent.timelineRows, - epoch: agent.timelineEpoch, - nextSeq: agent.timelineNextSeq, - minSeq, - maxSeq, - }; - } - private async persistSnapshot( agent: ManagedAgent, - options?: { title?: string | null; internal?: boolean }, + options?: { workspaceId?: number; title?: string | null; internal?: boolean }, ): Promise { if (!this.registry) { return; @@ -2007,9 +2076,20 @@ export class AgentManager { if (agent.internal) { return; } + if (options?.workspaceId !== undefined) { + await this.registry.applySnapshot(agent, options.workspaceId, options); + return; + } await this.registry.applySnapshot(agent, options); } + private requireRegistry(): AgentSnapshotStore { + if (!this.registry) { + throw new Error("Agent storage unavailable"); + } + return this.registry; + } + private async refreshSessionState(agent: ActiveManagedAgent): Promise { try { const modes = await agent.session.getAvailableModes(); @@ -2059,44 +2139,45 @@ export class AgentManager { } } - private async hydrateTimeline(agent: ActiveManagedAgent): Promise { + private async hydrateTimelineFromLegacyProviderHistory( + agent: ActiveManagedAgent, + ): Promise { if (agent.historyPrimed) { return; } agent.historyPrimed = true; - const canonicalUserMessagesById = new Map( - agent.timelineRows.flatMap<[string, string]>((row) => { - if (row.item.type !== "user_message") { - return []; - } - const messageId = normalizeMessageId(row.item.messageId); - if (!messageId) { - return []; - } - return [[messageId, row.item.text]]; - }), - ); + const canonicalUserMessagesById = this.timelineStore.getCanonicalUserMessagesById(agent.id); try { for await (const event of 
agent.session.streamHistory()) { - this.handleStreamEvent(agent, event, { - fromHistory: true, - canonicalUserMessagesById: - canonicalUserMessagesById.size > 0 ? canonicalUserMessagesById : undefined, - }); + if (event.type !== "timeline") { + continue; + } + + if (event.item.type === "user_message") { + const eventMessageId = normalizeMessageId(event.item.messageId); + if (eventMessageId) { + const canonicalText = canonicalUserMessagesById.get(eventMessageId); + if (canonicalText === event.item.text) { + continue; + } + } + } + + this.recordTimeline(agent.id, event.item); } } catch { // ignore history failures } } - private handleStreamEvent( + private async handleStreamEvent( agent: ActiveManagedAgent, event: AgentStreamEvent, options?: { fromHistory?: boolean; canonicalUserMessagesById?: ReadonlyMap; }, - ): void { + ): Promise { const eventTurnId = (event as { turnId?: string }).turnId; const isForegroundEvent = Boolean( eventTurnId && agent.activeForegroundTurnId === eventTurnId, @@ -2149,19 +2230,17 @@ export class AgentManager { const eventMessageId = normalizeMessageId(event.item.messageId); const eventText = event.item.text; if (eventMessageId) { - const alreadyRecorded = agent.timelineRows.some((row) => { - if (row.item.type !== "user_message") { - return false; - } - const rowMessageId = normalizeMessageId(row.item.messageId); - return rowMessageId === eventMessageId && row.item.text === eventText; - }); - if (alreadyRecorded) { + if ( + await this.hasCommittedUserMessageFromStores(agent.id, { + messageId: eventMessageId, + text: eventText, + }) + ) { break; } } } - timelineRow = this.recordTimeline(agent, event.item); + timelineRow = this.recordTimeline(agent.id, event.item); if (!options?.fromHistory && event.item.type === "user_message") { agent.lastUserMessageAt = new Date(); this.emitState(agent); @@ -2205,7 +2284,7 @@ export class AgentManager { agent.lifecycle = "error"; } agent.lastError = event.error; - this.appendSystemErrorTimelineMessage( + 
await this.appendSystemErrorTimelineMessage( agent, event.provider, this.formatTurnFailedMessage(event), @@ -2308,14 +2387,13 @@ export class AgentManager { timelineRow ? { seq: timelineRow.seq, - epoch: this.ensureTimelineState(agent).epoch, } : undefined, ); } } - private appendSystemErrorTimelineMessage( + private async appendSystemErrorTimelineMessage( agent: ActiveManagedAgent, provider: AgentProvider, message: string, @@ -2323,7 +2401,7 @@ export class AgentManager { fromHistory?: boolean; canonicalUserMessagesById?: ReadonlyMap; }, - ): void { + ): Promise { if (options?.fromHistory) { return; } @@ -2334,13 +2412,13 @@ export class AgentManager { } const text = `${SYSTEM_ERROR_PREFIX} ${normalized}`; - const lastItem = agent.timelineRows[agent.timelineRows.length - 1]?.item; + const lastItem = await this.getLastItemFromStores(agent.id); if (lastItem?.type === "assistant_message" && lastItem.text === text) { return; } const item: AgentTimelineItem = { type: "assistant_message", text }; - const row = this.recordTimeline(agent, item); + const row = this.recordTimeline(agent.id, item); this.dispatchStream( agent.id, { @@ -2350,7 +2428,6 @@ export class AgentManager { }, { seq: row.seq, - epoch: this.ensureTimelineState(agent).epoch, }, ); } @@ -2371,30 +2448,18 @@ export class AgentManager { return parts.join("\n\n"); } - private recordTimeline(agent: ManagedAgent, item: AgentTimelineItem): AgentTimelineRow { - const timelineState = this.ensureTimelineState(agent); - const row: AgentTimelineRow = { - seq: timelineState.nextSeq, - timestamp: new Date().toISOString(), - item, - }; - agent.timelineNextSeq = timelineState.nextSeq + 1; - agent.timeline.push(item); - timelineState.rows.push(row); - if ( - typeof this.maxTimelineItems === "number" && - agent.timeline.length > this.maxTimelineItems - ) { - const removeCount = agent.timeline.length - this.maxTimelineItems; - agent.timeline.splice(0, removeCount); - timelineState.rows.splice(0, removeCount); - } + private 
recordTimeline(agentId: string, item: AgentTimelineItem): AgentTimelineRow { + const row = this.timelineStore.append(agentId, item); + this.enqueueDurableTimelineAppend(agentId, row); return row; } - private emitState(agent: ManagedAgent): void { + private emitState(agent: ManagedAgent, options?: { persist?: boolean }): void { // Keep attention as an edge-triggered unread signal, not a level signal. this.checkAndSetAttention(agent); + if (options?.persist !== false) { + this.enqueueBackgroundPersist(agent); + } this.syncFeaturesFromSession(agent); @@ -2435,7 +2500,6 @@ export class AgentManager { attentionTimestamp: new Date(), }; this.broadcastAgentAttention(agent, "finished"); - this.enqueueBackgroundPersist(agent); return; } @@ -2447,7 +2511,6 @@ export class AgentManager { attentionTimestamp: new Date(), }; this.broadcastAgentAttention(agent, "error"); - this.enqueueBackgroundPersist(agent); return; } } @@ -2459,6 +2522,38 @@ export class AgentManager { this.trackBackgroundTask(task); } + private enqueueDurableTimelineAppend(agentId: string, row: AgentTimelineRow): void { + if (!this.durableTimelineStore) { + return; + } + const task = this.durableTimelineStore + .bulkInsert(agentId, [row]) + .then(() => undefined) + .catch((err) => { + this.logger.error( + { err, agentId, seq: row.seq, itemType: row.item.type }, + "Failed to append timeline row to durable store", + ); + }); + this.trackBackgroundTask(task); + } + + private enqueueDurableTimelineBulkInsert( + agentId: string, + rows: readonly AgentTimelineRow[], + ): void { + if (!this.durableTimelineStore || rows.length === 0) { + return; + } + const task = this.durableTimelineStore.bulkInsert(agentId, rows).catch((err) => { + this.logger.error( + { err, agentId, rowCount: rows.length }, + "Failed to seed durable timeline store", + ); + }); + this.trackBackgroundTask(task); + } + private trackBackgroundTask(task: Promise): void { this.backgroundTasks.add(task); void task.finally(() => { @@ -2492,7 +2587,7 @@ 
export class AgentManager { private dispatchStream( agentId: string, event: AgentStreamEvent, - metadata?: { seq?: number; epoch?: string }, + metadata?: { seq?: number }, ): void { this.dispatch({ type: "agent_stream", agentId, event, ...metadata }); } @@ -2581,7 +2676,7 @@ export class AgentManager { return client; } - private requireAgent(id: string): ActiveManagedAgent { + private requireAgent(id: string): LiveManagedAgent { const normalizedId = validateAgentId(id, "requireAgent"); const agent = this.agents.get(normalizedId); if (!agent) { @@ -2590,4 +2685,12 @@ export class AgentManager { return agent; } + private requireSessionAgent(id: string): ActiveManagedAgent { + const agent = this.requireAgent(id); + if (agent.session === null) { + throw new Error(`Agent '${agent.id}' has no managed session`); + } + return agent; + } + } diff --git a/packages/server/src/server/agent/agent-projections.ts b/packages/server/src/server/agent/agent-projections.ts index deebedd7c..3c9f77304 100644 --- a/packages/server/src/server/agent/agent-projections.ts +++ b/packages/server/src/server/agent/agent-projections.ts @@ -63,6 +63,7 @@ export function toStoredAgentRecord( runtimeInfo, features: agent.features, persistence, + lastError: agent.lastError ?? undefined, requiresAttention: agent.attention.requiresAttention, attentionReason: agent.attention.requiresAttention ? 
agent.attention.attentionReason : null, attentionTimestamp: agent.attention.requiresAttention diff --git a/packages/server/src/server/agent/agent-sdk-types.ts b/packages/server/src/server/agent/agent-sdk-types.ts index cc0850cc1..efef45502 100644 --- a/packages/server/src/server/agent/agent-sdk-types.ts +++ b/packages/server/src/server/agent/agent-sdk-types.ts @@ -214,6 +214,7 @@ export type ToolCallDetail = index: number; command: string; cwd: string; + log: string; status: "running" | "completed" | "failed"; exitCode: number | null; durationMs?: number; diff --git a/packages/server/src/server/agent/agent-snapshot-store.ts b/packages/server/src/server/agent/agent-snapshot-store.ts new file mode 100644 index 000000000..ffa3c27bd --- /dev/null +++ b/packages/server/src/server/agent/agent-snapshot-store.ts @@ -0,0 +1,19 @@ +import type { ManagedAgent } from "./agent-manager.js"; +import type { StoredAgentRecord } from "./agent-storage.js"; + +export interface AgentSnapshotStore { + list(): Promise; + get(agentId: string): Promise; + upsert(record: StoredAgentRecord): Promise; + remove(agentId: string): Promise; + applySnapshot( + agent: ManagedAgent, + options?: { title?: string | null; internal?: boolean }, + ): Promise; + applySnapshot( + agent: ManagedAgent, + workspaceId: number, + options?: { title?: string | null; internal?: boolean }, + ): Promise; + setTitle(agentId: string, title: string): Promise; +} diff --git a/packages/server/src/server/agent/agent-storage.ts b/packages/server/src/server/agent/agent-storage.ts index ee28b2355..67cb533e3 100644 --- a/packages/server/src/server/agent/agent-storage.ts +++ b/packages/server/src/server/agent/agent-storage.ts @@ -7,6 +7,7 @@ import type { Logger } from "pino"; import { AgentFeatureSchema, AgentStatusSchema } from "../messages.js"; import { toStoredAgentRecord } from "./agent-projections.js"; import type { ManagedAgent } from "./agent-manager.js"; +import type { AgentSnapshotStore } from 
"./agent-snapshot-store.js"; import type { AgentSessionConfig } from "./agent-sdk-types.js"; const SERIALIZABLE_CONFIG_SCHEMA = z @@ -58,6 +59,7 @@ const STORED_AGENT_SCHEMA = z.object({ .optional(), features: z.array(AgentFeatureSchema).optional(), persistence: PERSISTENCE_HANDLE_SCHEMA, + lastError: z.string().nullable().optional(), requiresAttention: z.boolean().optional(), attentionReason: z.enum(["finished", "error", "permission"]).nullable().optional(), attentionTimestamp: z.string().nullable().optional(), @@ -78,8 +80,11 @@ export type SerializableAgentConfig = Pick< >; export type StoredAgentRecord = z.infer; +export function parseStoredAgentRecord(value: unknown): StoredAgentRecord { + return STORED_AGENT_SCHEMA.parse(value); +} -export class AgentStorage { +export class AgentStorage implements AgentSnapshotStore { private cache: Map = new Map(); private pathById: Map = new Map(); private pathsById: Map> = new Map(); @@ -177,19 +182,22 @@ export class AgentStorage { async applySnapshot( agent: ManagedAgent, + workspaceIdOrOptions?: number | { title?: string | null; internal?: boolean }, options?: { title?: string | null; internal?: boolean }, ): Promise { + const nextOptions = + typeof workspaceIdOrOptions === "number" ? options : workspaceIdOrOptions; await this.load(); await this.waitForPendingWrite(agent.id); const existing = (await this.get(agent.id)) ?? null; const hasTitleOverride = - options !== undefined && Object.prototype.hasOwnProperty.call(options, "title"); + nextOptions !== undefined && Object.prototype.hasOwnProperty.call(nextOptions, "title"); const hasInternalOverride = - options !== undefined && Object.prototype.hasOwnProperty.call(options, "internal"); + nextOptions !== undefined && Object.prototype.hasOwnProperty.call(nextOptions, "internal"); const record = toStoredAgentRecord(agent, { - title: hasTitleOverride ? (options?.title ?? null) : (existing?.title ?? null), + title: hasTitleOverride ? (nextOptions?.title ?? 
null) : (existing?.title ?? null), createdAt: existing?.createdAt, - internal: hasInternalOverride ? options?.internal : (agent.internal ?? existing?.internal), + internal: hasInternalOverride ? nextOptions?.internal : (agent.internal ?? existing?.internal), }); // Preserve soft-delete/archive status across snapshot flushes. @@ -309,7 +317,7 @@ export class AgentStorage { try { const content = await fs.readFile(filePath, "utf8"); const parsed = JSON.parse(content); - return STORED_AGENT_SCHEMA.parse(parsed); + return parseStoredAgentRecord(parsed); } catch (error) { this.logger.error({ err: error, filePath }, "Skipping invalid agent record"); return null; diff --git a/packages/server/src/server/agent/agent-timeline-store-types.ts b/packages/server/src/server/agent/agent-timeline-store-types.ts new file mode 100644 index 000000000..1543baa18 --- /dev/null +++ b/packages/server/src/server/agent/agent-timeline-store-types.ts @@ -0,0 +1,60 @@ +import type { AgentTimelineItem } from "./agent-sdk-types.js"; + +export type AgentTimelineRow = { + seq: number; + timestamp: string; + item: AgentTimelineItem; +}; + +export type AgentTimelineCursor = { + seq: number; +}; + +export type AgentTimelineFetchDirection = "tail" | "before" | "after"; + +export type AgentTimelineFetchOptions = { + direction?: AgentTimelineFetchDirection; + cursor?: AgentTimelineCursor; + /** + * Number of canonical rows to return. 
+ * - undefined: store default + * - 0: all rows in the selected window + */ + limit?: number; +}; + +export type AgentTimelineWindow = { + minSeq: number; + maxSeq: number; + nextSeq: number; +}; + +export type AgentTimelineFetchResult = { + direction: AgentTimelineFetchDirection; + window: AgentTimelineWindow; + hasOlder: boolean; + hasNewer: boolean; + rows: AgentTimelineRow[]; +}; + +export interface AgentTimelineStore { + appendCommitted( + agentId: string, + item: AgentTimelineItem, + options?: { timestamp?: string }, + ): Promise; + fetchCommitted( + agentId: string, + options?: AgentTimelineFetchOptions, + ): Promise; + getLatestCommittedSeq(agentId: string): Promise; + getCommittedRows(agentId: string): Promise; + getLastItem(agentId: string): Promise; + getLastAssistantMessage(agentId: string): Promise; + hasCommittedUserMessage( + agentId: string, + options: { messageId: string; text: string }, + ): Promise; + deleteAgent(agentId: string): Promise; + bulkInsert(agentId: string, rows: readonly AgentTimelineRow[]): Promise; +} diff --git a/packages/server/src/server/agent/agent-timeline-store.ts b/packages/server/src/server/agent/agent-timeline-store.ts new file mode 100644 index 000000000..1a966de8c --- /dev/null +++ b/packages/server/src/server/agent/agent-timeline-store.ts @@ -0,0 +1,244 @@ +import type { AgentTimelineItem } from "./agent-sdk-types.js"; +import type { + AgentTimelineFetchOptions, + AgentTimelineFetchResult, + AgentTimelineRow, +} from "./agent-timeline-store-types.js"; + +export type SeedAgentTimelineOptions = { + items?: readonly AgentTimelineItem[]; + rows?: readonly AgentTimelineRow[]; + nextSeq?: number; + timestamp?: string; +}; + +type AgentTimelineState = { + rows: AgentTimelineRow[]; + nextSeq: number; +}; + +const DEFAULT_TIMELINE_FETCH_LIMIT = 200; + +function cloneRow(row: AgentTimelineRow): AgentTimelineRow { + return { ...row }; +} + +function normalizeTimelineMessageId(messageId: string | undefined): string | undefined { + 
if (typeof messageId !== "string") { + return undefined; + } + const normalized = messageId.trim(); + return normalized.length > 0 ? normalized : undefined; +} + +export class InMemoryAgentTimelineStore { + private readonly states = new Map(); + + has(agentId: string): boolean { + return this.states.has(agentId); + } + + initialize(agentId: string, options?: SeedAgentTimelineOptions): void { + const timestamp = options?.timestamp ?? new Date().toISOString(); + const rows = options?.rows?.length + ? options.rows.map(cloneRow) + : this.buildRowsFromItems(options?.items ?? [], options?.nextSeq ?? 1, timestamp); + const nextSeq = + options?.nextSeq ?? (rows.length ? rows[rows.length - 1]!.seq + 1 : 1); + this.states.set(agentId, { + rows, + nextSeq, + }); + } + + delete(agentId: string): void { + this.states.delete(agentId); + } + + getItems(agentId: string): AgentTimelineItem[] { + return this.requireState(agentId).rows.map((row) => row.item); + } + + getRows(agentId: string): AgentTimelineRow[] { + return this.requireState(agentId).rows.map(cloneRow); + } + + fetch(agentId: string, options?: AgentTimelineFetchOptions): AgentTimelineFetchResult { + const state = this.requireState(agentId); + const direction = options?.direction ?? "tail"; + const requestedLimit = options?.limit; + const limit = + requestedLimit === undefined + ? DEFAULT_TIMELINE_FETCH_LIMIT + : Math.max(0, Math.floor(requestedLimit)); + const cursor = options?.cursor; + const minSeq = state.rows.length ? state.rows[0]!.seq : 0; + const maxSeq = state.rows.length ? state.rows[state.rows.length - 1]!.seq : 0; + const selectAll = limit === 0; + + const window = { + minSeq, + maxSeq, + nextSeq: state.nextSeq, + }; + + if (state.rows.length === 0) { + return { + direction, + window, + hasOlder: false, + hasNewer: false, + rows: [], + }; + } + + if (direction === "tail") { + const selected = + selectAll || limit >= state.rows.length ? 
state.rows : state.rows.slice(state.rows.length - limit); + return { + direction, + window, + hasOlder: selected.length > 0 && selected[0]!.seq > minSeq, + hasNewer: false, + rows: selected.map(cloneRow), + }; + } + + if (direction === "after") { + const baseSeq = cursor?.seq ?? 0; + const startIdx = state.rows.findIndex((row) => row.seq > baseSeq); + if (startIdx < 0) { + return { + direction, + window, + hasOlder: baseSeq >= minSeq, + hasNewer: false, + rows: [], + }; + } + + const selected = selectAll + ? state.rows.slice(startIdx) + : state.rows.slice(startIdx, startIdx + limit); + const lastSelected = selected[selected.length - 1]; + return { + direction, + window, + hasOlder: selected[0]!.seq > minSeq, + hasNewer: Boolean(lastSelected && lastSelected.seq < maxSeq), + rows: selected.map(cloneRow), + }; + } + + const beforeSeq = cursor?.seq ?? state.nextSeq; + const endExclusive = state.rows.findIndex((row) => row.seq >= beforeSeq); + const boundedRows = endExclusive < 0 ? state.rows : state.rows.slice(0, endExclusive); + const selected = + selectAll || limit >= boundedRows.length + ? boundedRows + : boundedRows.slice(boundedRows.length - limit); + return { + direction, + window, + hasOlder: selected.length > 0 && selected[0]!.seq > minSeq, + hasNewer: endExclusive >= 0, + rows: selected.map(cloneRow), + }; + } + + append( + agentId: string, + item: AgentTimelineItem, + options?: { timestamp?: string }, + ): AgentTimelineRow { + const state = this.requireState(agentId); + const row: AgentTimelineRow = { + seq: state.nextSeq, + timestamp: options?.timestamp ?? new Date().toISOString(), + item, + }; + state.nextSeq += 1; + state.rows.push(row); + return cloneRow(row); + } + + getLastItem(agentId: string): AgentTimelineItem | null { + const state = this.requireState(agentId); + return state.rows[state.rows.length - 1]?.item ?? 
null;
+  }
+
+  getLastAssistantMessage(agentId: string): string | null {
+    const rows = this.requireState(agentId).rows;
+    const chunks: string[] = [];
+    for (let i = rows.length - 1; i >= 0; i -= 1) {
+      const item = rows[i]!.item;
+      if (item.type !== "assistant_message") {
+        if (chunks.length > 0) {
+          break;
+        }
+        continue;
+      }
+      chunks.push(item.text);
+    }
+
+    if (chunks.length === 0) {
+      return null;
+    }
+
+    return chunks.reverse().join("");
+  }
+
+  getCanonicalUserMessagesById(agentId: string): Map<string, string> {
+    const entries = this.requireState(agentId).rows.flatMap<[string, string]>((row) => {
+      if (row.item.type !== "user_message") {
+        return [];
+      }
+      const messageId = normalizeTimelineMessageId(row.item.messageId);
+      if (!messageId) {
+        return [];
+      }
+      return [[messageId, row.item.text]];
+    });
+    return new Map(entries);
+  }
+
+  hasCommittedUserMessage(agentId: string, options: { messageId: string; text: string }): boolean {
+    const messageId = normalizeTimelineMessageId(options.messageId);
+    if (!messageId) {
+      return false;
+    }
+
+    return this.requireState(agentId).rows.some((row) => {
+      if (row.item.type !== "user_message") {
+        return false;
+      }
+      const rowMessageId = normalizeTimelineMessageId(row.item.messageId);
+      return rowMessageId === messageId && row.item.text === options.text;
+    });
+  }
+
+  private requireState(agentId: string): AgentTimelineState {
+    const state = this.states.get(agentId);
+    if (!state) {
+      throw new Error(`Unknown agent '${agentId}'`);
+    }
+    return state;
+  }
+
+  private buildRowsFromItems(
+    items: readonly AgentTimelineItem[],
+    startSeq: number,
+    timestamp: string,
+  ): AgentTimelineRow[] {
+    let nextSeq = startSeq;
+    return items.map((item) => {
+      const row: AgentTimelineRow = {
+        seq: nextSeq,
+        timestamp,
+        item,
+      };
+      nextSeq += 1;
+      return row;
+    });
+  }
+}
diff --git a/packages/server/src/server/agent/mcp-server.test.ts b/packages/server/src/server/agent/mcp-server.test.ts
index 8c8a0b117..e7d2f231e 100644
---
a/packages/server/src/server/agent/mcp-server.test.ts +++ b/packages/server/src/server/agent/mcp-server.test.ts @@ -6,11 +6,11 @@ import { tmpdir } from "node:os"; import { createTestLogger } from "../../test-utils/test-logger.js"; import { createAgentMcpServer } from "./mcp-server.js"; import type { AgentManager, ManagedAgent } from "./agent-manager.js"; -import type { AgentStorage } from "./agent-storage.js"; +import type { AgentSnapshotStore } from "./agent-snapshot-store.js"; type TestDeps = { agentManager: AgentManager; - agentStorage: AgentStorage; + agentStorage: AgentSnapshotStore; spies: { agentManager: Record; agentStorage: Record; @@ -41,7 +41,7 @@ function createTestDeps(): TestDeps { return { agentManager: agentManagerSpies as unknown as AgentManager, - agentStorage: agentStorageSpies as unknown as AgentStorage, + agentStorage: agentStorageSpies as unknown as AgentSnapshotStore, spies: { agentManager: agentManagerSpies, agentStorage: agentStorageSpies, diff --git a/packages/server/src/server/agent/mcp-server.ts b/packages/server/src/server/agent/mcp-server.ts index a982bc483..7f7911ffb 100644 --- a/packages/server/src/server/agent/mcp-server.ts +++ b/packages/server/src/server/agent/mcp-server.ts @@ -16,7 +16,7 @@ import { import { toAgentPayload } from "./agent-projections.js"; import { curateAgentActivity } from "./activity-curator.js"; import { AGENT_PROVIDER_DEFINITIONS } from "./provider-registry.js"; -import { AgentStorage } from "./agent-storage.js"; +import type { AgentSnapshotStore } from "./agent-snapshot-store.js"; import { appendTimelineItemIfAgentKnown, emitLiveTimelineItemIfAgentKnown, @@ -28,11 +28,14 @@ import type { VoiceCallerContext, VoiceSpeakHandler } from "../voice-types.js"; import { expandUserPath, resolvePathFromBase } from "../path-utils.js"; import type { TerminalManager } from "../../terminal/terminal-manager.js"; import { createAgentWorktree, runAsyncWorktreeBootstrap } from "../worktree-bootstrap.js"; +import type { 
ServiceRouteStore } from "../service-proxy.js"; export interface AgentMcpServerOptions { agentManager: AgentManager; - agentStorage: AgentStorage; + agentStorage: AgentSnapshotStore; terminalManager?: TerminalManager | null; + serviceRouteStore?: ServiceRouteStore; + getDaemonTcpPort?: () => number | null; paseoHome?: string; /** * ID of the agent that is connecting to this MCP server. @@ -243,7 +246,7 @@ function sanitizePermissionRequest( } async function resolveAgentTitle( - agentStorage: AgentStorage, + agentStorage: AgentSnapshotStore, agentId: string, logger: Logger, ): Promise { @@ -257,7 +260,7 @@ async function resolveAgentTitle( } async function serializeSnapshotWithMetadata( - agentStorage: AgentStorage, + agentStorage: AgentSnapshotStore, snapshot: ManagedAgent, logger: Logger, ) { @@ -431,6 +434,7 @@ export async function createAgentMcpServer(options: AgentMcpServerOptions): Prom let resolvedCwd: string; let resolvedMode: string | undefined; let worktreeConfig: WorktreeConfig | undefined; + let shouldBootstrapWorktree: boolean | undefined; if (callerAgentId) { const callerArgs = agentToAgentCreateAgentArgsSchema.parse(args); @@ -467,15 +471,16 @@ export async function createAgentMcpServer(options: AgentMcpServerOptions): Prom if (!baseBranch) { throw new Error("baseBranch is required when creating a worktree"); } - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ branchName: worktreeName, cwd: resolvedCwd, baseBranch, worktreeSlug: worktreeName, paseoHome: options.paseoHome, }); - resolvedCwd = worktree.worktreePath; - worktreeConfig = worktree; + resolvedCwd = worktreeBootstrap.worktree.worktreePath; + worktreeConfig = worktreeBootstrap.worktree; + shouldBootstrapWorktree = worktreeBootstrap.shouldBootstrap; } resolvedMode = initialMode; @@ -500,6 +505,7 @@ export async function createAgentMcpServer(options: AgentMcpServerOptions): Prom void runAsyncWorktreeBootstrap({ agentId: snapshot.id, worktree: 
worktreeConfig, + shouldBootstrap: shouldBootstrapWorktree, terminalManager: terminalManager ?? null, appendTimelineItem: (item) => appendTimelineItemIfAgentKnown({ @@ -513,6 +519,8 @@ export async function createAgentMcpServer(options: AgentMcpServerOptions): Prom agentId: snapshot.id, item, }), + serviceRouteStore: options.serviceRouteStore, + daemonPort: options.getDaemonTcpPort?.() ?? null, logger: childLogger, }); } diff --git a/packages/server/src/server/agent/provider-launch-config.ts b/packages/server/src/server/agent/provider-launch-config.ts index 730b591f2..80019cc1d 100644 --- a/packages/server/src/server/agent/provider-launch-config.ts +++ b/packages/server/src/server/agent/provider-launch-config.ts @@ -1,6 +1,9 @@ +import { execFileSync, execSync } from "node:child_process"; +import { existsSync } from "node:fs"; +import { platform } from "node:os"; +import path from "node:path"; import { z } from "zod"; -import { isCommandAvailable } from "../../utils/executable.js"; import type { AgentProvider } from "./agent-sdk-types.js"; import { AgentProviderSchema } from "./provider-manifest.js"; @@ -53,6 +56,66 @@ export type ProviderCommandPrefix = { args: string[]; }; +interface FindExecutableDependencies { + execSync: typeof execSync; + execFileSync: typeof execFileSync; + existsSync: typeof existsSync; + platform: typeof platform; + shell: string | undefined; +} + +function resolveWindowsPathEntries(deps: FindExecutableDependencies): string[] { + try { + const output = deps.execFileSync( + "powershell", + [ + "-NoProfile", + "-NonInteractive", + "-Command", + [ + '$machine = [Environment]::GetEnvironmentVariable("Path", "Machine")', + '$user = [Environment]::GetEnvironmentVariable("Path", "User")', + "if ($machine) { Write-Output $machine }", + "if ($user) { Write-Output $user }", + ].join("; "), + ], + { encoding: "utf8" }, + ); + return output + .split(/\r?\n/) + .flatMap((line) => line.split(";")) + .map((entry) => entry.trim()) + .filter((entry) => 
entry.length > 0);
+  } catch {
+    return [];
+  }
+}
+
+function resolveExecutableFromWhichOutput(
+  name: string,
+  output: string,
+  source: "login-shell" | "which",
+): string | null {
+  const lines = output
+    .split(/\r?\n/)
+    .map((line) => line.trim())
+    .filter((line) => line.length > 0);
+  const candidate = lines.at(-1);
+
+  if (!candidate) {
+    return null;
+  }
+
+  if (!path.isAbsolute(candidate)) {
+    console.warn(
+      `[findExecutable] Ignoring non-absolute ${source} output for '${name}': ${JSON.stringify(candidate)}`,
+    );
+    return null;
+  }
+
+  return candidate;
+}
+
 export function resolveProviderCommandPrefix(
   commandConfig: ProviderCommand | undefined,
   resolveDefaultCommand: () => string,
@@ -77,6 +140,16 @@
   };
 }
 
+let cachedShellEnv: Record<string, string | undefined> | null = null;
+
+export function resolveShellEnv(): Record<string, string | undefined> {
+  if (cachedShellEnv) {
+    return cachedShellEnv;
+  }
+  cachedShellEnv = { ...process.env } as Record<string, string | undefined>;
+  return cachedShellEnv;
+}
+
 // Env vars that indicate a running Claude Code session. If the daemon itself is
 // launched from inside Claude Code (e.g. by a Paseo agent), these leak into
 // child processes and cause "cannot be launched inside another session" errors.
@@ -102,6 +175,136 @@ export function applyProviderEnv(
   return merged;
 }
 
+export function sanitizeTerminalEnv(
+  env: Record<string, string | undefined>,
+): Record<string, string> {
+  return Object.fromEntries(
+    Object.entries(env).filter((entry): entry is [string, string] => typeof entry[1] === "string"),
+  );
+}
+
+/**
+ * Resolve an executable name to its absolute path the way the user's shell would.
+ *
+ * On Unix we first try `$SHELL -lic "which <name>"` so that rc-file PATH
+ * additions (asdf, nvm, homebrew, nix, etc.) are visible — exactly as if the
+ * user opened a terminal and typed the command. If that fails (e.g. the login
+ * shell itself errors) we fall back to a plain `which`.
+ * + * On Windows we augment the daemon PATH with machine/user registry PATH values + * and return the first `where.exe` match. Launch-time execution decides whether + * the resolved path needs `cmd.exe` semantics (for example npm shims under + * nvm4w such as `C:\nvm4w\nodejs\codex`). + */ +export function findExecutable( + name: string, + dependencies?: FindExecutableDependencies, +): string | null { + const trimmed = name.trim(); + if (!trimmed) { + return null; + } + + const deps: FindExecutableDependencies = { + execSync, + execFileSync, + existsSync, + platform, + shell: process.env["SHELL"], + ...dependencies, + }; + + if (trimmed.includes("/") || trimmed.includes("\\")) { + return deps.existsSync(trimmed) ? trimmed : null; + } + + if (deps.platform() === "win32") { + try { + const inheritedPath = process.env["Path"] ?? process.env["PATH"] ?? ""; + const resolvedPath = [ + ...inheritedPath.split(";"), + ...resolveWindowsPathEntries(deps), + ] + .map((entry) => entry.trim()) + .filter((entry) => entry.length > 0) + .filter((entry, index, entries) => entries.indexOf(entry) === index) + .join(";"); + const env = { + ...process.env, + PATH: resolvedPath, + Path: resolvedPath, + }; + const out = deps.execFileSync("where.exe", [trimmed], { encoding: "utf8", env }).trim(); + return ( + out + .split(/\r?\n/) + .map((line) => line.trim()) + .find((line) => line.length > 0) ?? null + ); + } catch { + return null; + } + } + + // Unix: try the user's login shell so rc-file PATH entries are visible. + const shell = deps.shell; + if (shell) { + try { + const out = deps + .execSync(`${shell} -lic "which ${trimmed}"`, { + encoding: "utf8", + timeout: 5000, + }) + .trim(); + const resolved = resolveExecutableFromWhichOutput(trimmed, out, "login-shell"); + if (resolved) { + return resolved; + } + } catch { + // Login shell failed (broken rc, etc.) — fall through to plain which. 
+ } + } + + try { + return resolveExecutableFromWhichOutput( + trimmed, + deps.execFileSync("which", [trimmed], { encoding: "utf8" }).trim(), + "which", + ); + } catch { + return null; + } +} + +/** + * When spawning with `shell: true` on Windows, the command is passed to + * `cmd.exe /d /s /c "command args"`. The `/s` strips outer quotes, so a + * command path with spaces (e.g. `C:\Program Files\...`) is split at the + * space. Wrapping it in quotes produces the correct `"C:\Program Files\..." args`. + */ +export function quoteWindowsCommand(command: string): string { + if (process.platform !== "win32") return command; + if (!command.includes(" ")) return command; + if (command.startsWith('"') && command.endsWith('"')) return command; + return `"${command}"`; +} + +/** + * `spawn(..., { shell: true })` on Windows also passes argv through `cmd.exe`. + * Any argument containing spaces must be quoted or it will be split before the + * child process sees it. + */ +export function quoteWindowsArgument(argument: string): string { + if (process.platform !== "win32") return argument; + if (!argument.includes(" ")) return argument; + if (argument.startsWith('"') && argument.endsWith('"')) return argument; + return `"${argument}"`; +} + +export function isCommandAvailable(command: string): boolean { + return findExecutable(command) !== null; +} + export function isProviderCommandAvailable( commandConfig: ProviderCommand | undefined, resolveDefaultCommand: () => string, diff --git a/packages/server/src/server/agent/provider-manifest.ts b/packages/server/src/server/agent/provider-manifest.ts index 34bdfd86b..1cc2c44f9 100644 --- a/packages/server/src/server/agent/provider-manifest.ts +++ b/packages/server/src/server/agent/provider-manifest.ts @@ -141,6 +141,27 @@ export const AGENT_PROVIDER_DEFINITIONS: AgentProviderDefinition[] = [ defaultModel: "gpt-5.1-codex-mini", }, }, + { + id: "gemini", + label: "Gemini CLI", + description: "Google's coding agent CLI", + defaultModeId: 
null, + modes: [], + }, + { + id: "amp", + label: "AMP", + description: "Sourcegraph's coding agent CLI", + defaultModeId: null, + modes: [], + }, + { + id: "aider", + label: "Aider", + description: "Paul Gauthier's coding assistant CLI", + defaultModeId: null, + modes: [], + }, { id: "copilot", label: "Copilot", diff --git a/packages/server/src/server/agent/provider-registry.ts b/packages/server/src/server/agent/provider-registry.ts index 65bccc859..93ddfba54 100644 --- a/packages/server/src/server/agent/provider-registry.ts +++ b/packages/server/src/server/agent/provider-registry.ts @@ -9,10 +9,13 @@ import type { import type { AgentProviderRuntimeSettingsMap } from "./provider-launch-config.js"; import type { Logger } from "pino"; +import { AiderAgentClient } from "./providers/aider-agent.js"; +import { AmpAgentClient } from "./providers/amp-agent.js"; import { ClaudeAgentClient } from "./providers/claude-agent.js"; import { CodexAppServerAgentClient } from "./providers/codex-app-server-agent.js"; -import { OpenCodeAgentClient, OpenCodeServerManager } from "./providers/opencode-agent.js"; import { CopilotACPAgentClient } from "./providers/copilot-acp-agent.js"; +import { GeminiAgentClient } from "./providers/gemini-agent.js"; +import { OpenCodeAgentClient, OpenCodeServerManager } from "./providers/opencode-agent.js"; import { PiACPAgentClient } from "./providers/pi-acp-agent.js"; import { @@ -47,13 +50,15 @@ const PROVIDER_CLIENT_FACTORIES: Record = { runtimeSettings: runtimeSettings?.claude, }), codex: (logger, runtimeSettings) => new CodexAppServerAgentClient(logger, runtimeSettings?.codex), + gemini: (_logger, runtimeSettings) => new GeminiAgentClient(runtimeSettings?.gemini), + amp: (_logger, runtimeSettings) => new AmpAgentClient(runtimeSettings?.amp), + aider: (_logger, runtimeSettings) => new AiderAgentClient(runtimeSettings?.aider), copilot: (logger, runtimeSettings) => new CopilotACPAgentClient({ logger, runtimeSettings: runtimeSettings?.copilot, }), - 
opencode: (logger, runtimeSettings) =>
-    new OpenCodeAgentClient(logger, runtimeSettings?.opencode),
+  opencode: (logger, runtimeSettings) => new OpenCodeAgentClient(logger, runtimeSettings?.opencode),
   pi: (logger, runtimeSettings) =>
     new PiACPAgentClient({ logger, runtimeSettings: runtimeSettings?.pi }),
 };
@@ -99,10 +104,7 @@ export function createAllClients(
 ): Record<AgentProvider, AgentClient> {
   const registry = buildProviderRegistry(logger, options);
   return Object.fromEntries(
-    Object.entries(registry).map(([provider, definition]) => [
-      provider,
-      definition.createClient(logger),
-    ]),
+    Object.entries(registry).map(([provider, definition]) => [provider, definition.createClient(logger)]),
   ) as Record<AgentProvider, AgentClient>;
 }
diff --git a/packages/server/src/server/agent/providers/aider-agent.ts b/packages/server/src/server/agent/providers/aider-agent.ts
new file mode 100644
index 000000000..2b063babb
--- /dev/null
+++ b/packages/server/src/server/agent/providers/aider-agent.ts
@@ -0,0 +1,75 @@
+import { existsSync } from "node:fs";
+
+import type {
+  AgentCapabilityFlags,
+  AgentClient,
+  AgentLaunchContext,
+  AgentModelDefinition,
+  AgentPersistenceHandle,
+  AgentSession,
+  AgentSessionConfig,
+  ListModelsOptions,
+} from "../agent-sdk-types.js";
+import {
+  findExecutable,
+  isProviderCommandAvailable,
+  type ProviderRuntimeSettings,
+} from "../provider-launch-config.js";
+
+const AIDER_PROVIDER = "aider" as const;
+
+const AIDER_CAPABILITIES: AgentCapabilityFlags = {
+  supportsStreaming: false,
+  supportsSessionPersistence: false,
+  supportsDynamicModes: false,
+  supportsMcpServers: false,
+  supportsReasoningStream: false,
+  supportsToolInvocations: false,
+};
+
+function resolveAiderBinary(): string {
+  const found = findExecutable("aider");
+  if (found) {
+    return found;
+  }
+  throw new Error(
+    "Aider binary not found. Install Aider and ensure 'aider' is available in your shell PATH.",
+  );
+}
+
+function createUnsupportedSessionError(): Error {
+  return new Error("Aider does not support session-backed agents in Paseo.");
+}
+
+export class AiderAgentClient implements AgentClient {
+  readonly provider = AIDER_PROVIDER;
+  readonly capabilities = AIDER_CAPABILITIES;
+
+  constructor(private readonly runtimeSettings?: ProviderRuntimeSettings) {}
+
+  async createSession(
+    _config: AgentSessionConfig,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async resumeSession(
+    _handle: AgentPersistenceHandle,
+    _overrides?: Partial<AgentSessionConfig>,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async listModels(_options?: ListModelsOptions): Promise<AgentModelDefinition[]> {
+    return [];
+  }
+
+  async isAvailable(): Promise<boolean> {
+    if (this.runtimeSettings?.command?.mode === "replace") {
+      return existsSync(this.runtimeSettings.command.argv[0]);
+    }
+    return isProviderCommandAvailable(this.runtimeSettings?.command, resolveAiderBinary);
+  }
+}
diff --git a/packages/server/src/server/agent/providers/amp-agent.ts b/packages/server/src/server/agent/providers/amp-agent.ts
new file mode 100644
index 000000000..a676db808
--- /dev/null
+++ b/packages/server/src/server/agent/providers/amp-agent.ts
@@ -0,0 +1,75 @@
+import { existsSync } from "node:fs";
+
+import type {
+  AgentCapabilityFlags,
+  AgentClient,
+  AgentLaunchContext,
+  AgentModelDefinition,
+  AgentPersistenceHandle,
+  AgentSession,
+  AgentSessionConfig,
+  ListModelsOptions,
+} from "../agent-sdk-types.js";
+import {
+  findExecutable,
+  isProviderCommandAvailable,
+  type ProviderRuntimeSettings,
+} from "../provider-launch-config.js";
+
+const AMP_PROVIDER = "amp" as const;
+
+const AMP_CAPABILITIES: AgentCapabilityFlags = {
+  supportsStreaming: false,
+  supportsSessionPersistence: false,
+  supportsDynamicModes: false,
+  supportsMcpServers: false,
+  supportsReasoningStream: false,
+  supportsToolInvocations: false,
+};
+
+function resolveAmpBinary(): string {
+  const found = findExecutable("amp");
+  if (found) {
+    return found;
+  }
+  throw new Error(
+    "AMP binary not found. Install AMP and ensure 'amp' is available in your shell PATH.",
+  );
+}
+
+function createUnsupportedSessionError(): Error {
+  return new Error("AMP does not support session-backed agents in Paseo.");
+}
+
+export class AmpAgentClient implements AgentClient {
+  readonly provider = AMP_PROVIDER;
+  readonly capabilities = AMP_CAPABILITIES;
+
+  constructor(private readonly runtimeSettings?: ProviderRuntimeSettings) {}
+
+  async createSession(
+    _config: AgentSessionConfig,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async resumeSession(
+    _handle: AgentPersistenceHandle,
+    _overrides?: Partial<AgentSessionConfig>,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async listModels(_options?: ListModelsOptions): Promise<AgentModelDefinition[]> {
+    return [];
+  }
+
+  async isAvailable(): Promise<boolean> {
+    if (this.runtimeSettings?.command?.mode === "replace") {
+      return existsSync(this.runtimeSettings.command.argv[0]);
+    }
+    return isProviderCommandAvailable(this.runtimeSettings?.command, resolveAmpBinary);
+  }
+}
diff --git a/packages/server/src/server/agent/providers/claude-agent.ts b/packages/server/src/server/agent/providers/claude-agent.ts
index f936e9124..b8c742799 100644
--- a/packages/server/src/server/agent/providers/claude-agent.ts
+++ b/packages/server/src/server/agent/providers/claude-agent.ts
@@ -1096,7 +1096,6 @@ export class ClaudeAgentClient implements AgentClient {
   async listModels(_options?: ListModelsOptions): Promise<AgentModelDefinition[]> {
     return getClaudeModels();
-
   }
 
   async listPersistedAgents(
diff --git a/packages/server/src/server/agent/providers/gemini-agent.ts b/packages/server/src/server/agent/providers/gemini-agent.ts
new file mode 100644
index 000000000..61dd18a17
--- /dev/null
+++ b/packages/server/src/server/agent/providers/gemini-agent.ts
@@ -0,0 +1,75 @@
+import { existsSync } from "node:fs";
+
+import type {
+  AgentCapabilityFlags,
+  AgentClient,
+  AgentLaunchContext,
+  AgentModelDefinition,
+  AgentPersistenceHandle,
+  AgentSession,
+  AgentSessionConfig,
+  ListModelsOptions,
+} from "../agent-sdk-types.js";
+import {
+  findExecutable,
+  isProviderCommandAvailable,
+  type ProviderRuntimeSettings,
+} from "../provider-launch-config.js";
+
+const GEMINI_PROVIDER = "gemini" as const;
+
+const GEMINI_CAPABILITIES: AgentCapabilityFlags = {
+  supportsStreaming: false,
+  supportsSessionPersistence: false,
+  supportsDynamicModes: false,
+  supportsMcpServers: false,
+  supportsReasoningStream: false,
+  supportsToolInvocations: false,
+};
+
+function resolveGeminiBinary(): string {
+  const found = findExecutable("gemini");
+  if (found) {
+    return found;
+  }
+  throw new Error(
+    "Gemini CLI binary not found. Install Gemini CLI and ensure 'gemini' is available in your shell PATH.",
+  );
+}
+
+function createUnsupportedSessionError(): Error {
+  return new Error("Gemini CLI does not support session-backed agents in Paseo.");
+}
+
+export class GeminiAgentClient implements AgentClient {
+  readonly provider = GEMINI_PROVIDER;
+  readonly capabilities = GEMINI_CAPABILITIES;
+
+  constructor(private readonly runtimeSettings?: ProviderRuntimeSettings) {}
+
+  async createSession(
+    _config: AgentSessionConfig,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async resumeSession(
+    _handle: AgentPersistenceHandle,
+    _overrides?: Partial<AgentSessionConfig>,
+    _launchContext?: AgentLaunchContext,
+  ): Promise<AgentSession> {
+    throw createUnsupportedSessionError();
+  }
+
+  async listModels(_options?: ListModelsOptions): Promise<AgentModelDefinition[]> {
+    return [];
+  }
+
+  async isAvailable(): Promise<boolean> {
+    if (this.runtimeSettings?.command?.mode === "replace") {
+      return existsSync(this.runtimeSettings.command.argv[0]);
+    }
+    return
isProviderCommandAvailable(this.runtimeSettings?.command, resolveGeminiBinary); + } +} diff --git a/packages/server/src/server/bootstrap.smoke.test.ts b/packages/server/src/server/bootstrap.smoke.test.ts index ee2b4e520..26c6dacc4 100644 --- a/packages/server/src/server/bootstrap.smoke.test.ts +++ b/packages/server/src/server/bootstrap.smoke.test.ts @@ -1,5 +1,7 @@ import os from "node:os"; import path from "node:path"; +import { existsSync, mkdirSync, writeFileSync } from "node:fs"; +import { execFileSync } from "node:child_process"; import { mkdir, mkdtemp, rm } from "node:fs/promises"; import { Writable } from "node:stream"; import pino from "pino"; @@ -8,6 +10,8 @@ import { afterEach, describe, expect, test, vi } from "vitest"; import { createPaseoDaemon, parseListenString, type PaseoDaemonConfig } from "./bootstrap.js"; import { createTestPaseoDaemon } from "./test-utils/paseo-daemon.js"; import { createTestAgentClients } from "./test-utils/fake-agent-client.js"; +import { openPaseoDatabase } from "./db/sqlite-database.js"; +import { agentSnapshots, projects, workspaces } from "./db/schema.js"; describe("paseo daemon bootstrap", () => { afterEach(() => { @@ -199,4 +203,417 @@ describe("paseo daemon bootstrap", () => { await rm(staticDir, { recursive: true, force: true }); } }); + + test("imports legacy project and workspace JSON into the DB on first bootstrap", async () => { + const { config, cleanup } = await createBootstrapConfig(); + const projectDir = await mkdtemp(path.join(os.tmpdir(), "paseo-bootstrap-project-")); + initializeGitRepo(projectDir); + writeLegacyProjectWorkspaceJson(config.paseoHome, { + projects: [ + { + projectId: "project-1", + rootPath: projectDir, + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspaces: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: projectDir, + kind: "local_checkout", + displayName: 
"main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + + const daemon = await createPaseoDaemon(config, pino({ level: "silent" })); + + try { + await daemon.start(); + await daemon.stop(); + expect(existsSync(path.join(config.paseoHome, "db", "paseo.sqlite"))).toBe(true); + const database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + try { + const projectRows = await database.db.select().from(projects); + expect(projectRows).toHaveLength(1); + expect(projectRows[0]).toMatchObject({ + directory: projectDir, + kind: "git", + createdAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + const workspaceRows = await database.db.select().from(workspaces); + expect(workspaceRows).toHaveLength(1); + expect(workspaceRows[0]).toMatchObject({ + projectId: projectRows[0]!.id, + directory: projectDir, + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }); + } finally { + await database.close(); + } + } finally { + await rm(projectDir, { recursive: true, force: true }); + await cleanup(); + } + }); + + test("does not duplicate imported legacy JSON across daemon restarts", async () => { + const { config, cleanup } = await createBootstrapConfig(); + const projectDir = await mkdtemp(path.join(os.tmpdir(), "paseo-bootstrap-project-")); + initializeGitRepo(projectDir); + writeLegacyProjectWorkspaceJson(config.paseoHome, { + projects: [ + { + projectId: "project-1", + rootPath: projectDir, + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspaces: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: projectDir, + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + 
}); + + try { + const firstDaemon = await createPaseoDaemon(config, pino({ level: "silent" })); + await firstDaemon.start(); + await firstDaemon.stop(); + + const secondDaemon = await createPaseoDaemon(config, pino({ level: "silent" })); + await secondDaemon.start(); + await secondDaemon.stop(); + + expect(existsSync(path.join(config.paseoHome, "db", "paseo.sqlite"))).toBe(true); + const database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + try { + expect(await database.db.select().from(projects)).toHaveLength(1); + expect(await database.db.select().from(workspaces)).toHaveLength(1); + } finally { + await database.close(); + } + } finally { + await rm(projectDir, { recursive: true, force: true }); + await cleanup(); + } + }); + + test("imports legacy project, workspace, and agent JSON into one SQLite bootstrap without duplicating records", async () => { + const { config, cleanup } = await createBootstrapConfig(); + const projectDir = await mkdtemp(path.join(os.tmpdir(), "paseo-bootstrap-project-")); + initializeGitRepo(projectDir); + writeLegacyProjectWorkspaceJson(config.paseoHome, { + projects: [ + { + projectId: "project-1", + rootPath: projectDir, + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspaces: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: projectDir, + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + writeLegacyAgentJson(config.paseoHome, "agents/agent-1.json", { + id: "agent-1", + provider: "codex", + cwd: projectDir, + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + lastActivityAt: "2026-03-02T00:00:00.000Z", + lastUserMessageAt: null, + title: "Imported Agent", + labels: {}, + lastStatus: "idle", + lastModeId: "plan", + config: { model: 
"gpt-5.1-codex-mini", modeId: "plan" }, + runtimeInfo: { + provider: "codex", + sessionId: "session-123", + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + persistence: null, + attentionReason: null, + attentionTimestamp: null, + archivedAt: null, + }); + + try { + const daemon = await createPaseoDaemon(config, pino({ level: "silent" })); + await daemon.start(); + await daemon.stop(); + + expect(existsSync(path.join(config.paseoHome, "db", "paseo.sqlite"))).toBe(true); + const database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + try { + const projectRows = await database.db.select().from(projects); + const workspaceRows = await database.db.select().from(workspaces); + const agentRows = await database.db.select().from(agentSnapshots); + + expect(projectRows).toHaveLength(1); + expect(workspaceRows).toHaveLength(1); + expect(agentRows).toEqual([ + expect.objectContaining({ + agentId: "agent-1", + cwd: projectDir, + workspaceId: workspaceRows[0]!.id, + title: "Imported Agent", + requiresAttention: false, + internal: false, + }), + ]); + } finally { + await database.close(); + } + } finally { + await rm(projectDir, { recursive: true, force: true }); + await cleanup(); + } + }); + + test("imports large legacy agent JSON batches during SQLite bootstrap", async () => { + const { config, cleanup } = await createBootstrapConfig(); + const projectDir = await mkdtemp(path.join(os.tmpdir(), "paseo-bootstrap-project-")); + initializeGitRepo(projectDir); + writeLegacyProjectWorkspaceJson(config.paseoHome, { + projects: [ + { + projectId: "project-1", + rootPath: projectDir, + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspaces: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: projectDir, + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + 
archivedAt: null, + }, + ], + }); + + for (let index = 0; index < 150; index += 1) { + writeLegacyAgentJson(config.paseoHome, `agents/project-1/agent-${index}.json`, { + id: `agent-${index}`, + provider: "codex", + cwd: projectDir, + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + lastActivityAt: "2026-03-02T00:00:00.000Z", + lastUserMessageAt: null, + title: `Imported Agent ${index}`, + labels: {}, + lastStatus: "idle", + lastModeId: "plan", + config: { model: "gpt-5.1-codex-mini", modeId: "plan" }, + runtimeInfo: { + provider: "codex", + sessionId: `session-${index}`, + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + persistence: null, + attentionReason: null, + attentionTimestamp: null, + archivedAt: null, + }); + } + + try { + const daemon = await createPaseoDaemon(config, pino({ level: "silent" })); + await daemon.start(); + await daemon.stop(); + + expect(existsSync(path.join(config.paseoHome, "db", "paseo.sqlite"))).toBe(true); + const database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + try { + const projectRows = await database.db.select().from(projects); + const workspaceRows = await database.db.select().from(workspaces); + const agentRows = await database.db.select().from(agentSnapshots); + + expect(projectRows).toHaveLength(1); + expect(workspaceRows).toHaveLength(1); + expect(agentRows).toHaveLength(150); + expect(agentRows[0]?.workspaceId).toBe(workspaceRows[0]!.id); + expect(agentRows.map((row) => row.agentId)).toContain("agent-149"); + } finally { + await database.close(); + } + } finally { + await rm(projectDir, { recursive: true, force: true }); + await cleanup(); + } + }); + + test("reconciles workspace records into the DB without recreating legacy JSON registry files", async () => { + const { config, cleanup } = await createBootstrapConfig(); + const agentStorageDir = path.join(config.paseoHome, "agents"); + mkdirSync(agentStorageDir, { recursive: true }); + const storageBucket = 
path.join(agentStorageDir, "tmp-db-only-project"); + mkdirSync(storageBucket, { recursive: true }); + writeFileSync( + path.join(storageBucket, "agent-1.json"), + JSON.stringify( + { + id: "agent-1", + provider: "codex", + cwd: "/tmp/db-only-project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + lastActivityAt: "2026-03-02T00:00:00.000Z", + lastUserMessageAt: null, + title: null, + labels: {}, + lastStatus: "idle", + lastModeId: null, + config: null, + runtimeInfo: { provider: "codex", sessionId: null }, + persistence: null, + archivedAt: null, + }, + null, + 2, + ), + "utf8", + ); + + try { + const daemon = await createPaseoDaemon(config, pino({ level: "silent" })); + await daemon.start(); + await daemon.stop(); + + expect(existsSync(path.join(config.paseoHome, "db", "paseo.sqlite"))).toBe(true); + const database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + try { + expect(await database.db.select().from(agentSnapshots)).toEqual([ + expect.objectContaining({ + agentId: "agent-1", + cwd: "/tmp/db-only-project", + requiresAttention: false, + internal: false, + }), + ]); + expect(await database.db.select().from(projects)).toHaveLength(1); + expect(await database.db.select().from(workspaces)).toHaveLength(1); + } finally { + await database.close(); + } + + expect(existsSync(path.join(config.paseoHome, "projects", "projects.json"))).toBe(false); + expect(existsSync(path.join(config.paseoHome, "projects", "workspaces.json"))).toBe(false); + } finally { + await cleanup(); + } + }); }); + +async function createBootstrapConfig(): Promise<{ + config: PaseoDaemonConfig; + cleanup: () => Promise<void>; +}> { + const paseoHomeRoot = await mkdtemp(path.join(os.tmpdir(), "paseo-bootstrap-db-")); + const paseoHome = path.join(paseoHomeRoot, ".paseo"); + const staticDir = await mkdtemp(path.join(os.tmpdir(), "paseo-static-")); + await mkdir(paseoHome, { recursive: true }); + + return { + config: { + listen: "127.0.0.1:0", + 
paseoHome, + corsAllowedOrigins: [], + allowedHosts: true, + mcpEnabled: false, + staticDir, + mcpDebug: false, + agentClients: createTestAgentClients(), + agentStoragePath: path.join(paseoHome, "agents"), + relayEnabled: false, + appBaseUrl: "https://app.paseo.sh", + openai: undefined, + speech: undefined, + }, + cleanup: async () => { + await rm(paseoHomeRoot, { recursive: true, force: true }); + await rm(staticDir, { recursive: true, force: true }); + }, + }; +} + +function writeLegacyProjectWorkspaceJson( + paseoHome: string, + input: { + projects: unknown[]; + workspaces: unknown[]; + }, +): void { + const projectsDir = path.join(paseoHome, "projects"); + mkdirSync(projectsDir, { recursive: true }); + writeFileSync(path.join(projectsDir, "projects.json"), JSON.stringify(input.projects, null, 2), "utf8"); + writeFileSync(path.join(projectsDir, "workspaces.json"), JSON.stringify(input.workspaces, null, 2), "utf8"); +} + +function writeLegacyAgentJson(paseoHome: string, relativePath: string, payload: Record<string, unknown>): void { + const absolutePath = path.join(paseoHome, relativePath); + mkdirSync(path.dirname(absolutePath), { recursive: true }); + writeFileSync(absolutePath, JSON.stringify(payload, null, 2), "utf8"); +} + +function initializeGitRepo(directory: string): void { + execFileSync("git", ["init", "-b", "main"], { cwd: directory, stdio: "pipe" }); + execFileSync("git", ["config", "user.email", "test@getpaseo.dev"], { cwd: directory, stdio: "pipe" }); + execFileSync("git", ["config", "user.name", "Paseo Test"], { cwd: directory, stdio: "pipe" }); + writeFileSync(path.join(directory, "README.md"), "bootstrap fixture\n", "utf8"); + execFileSync("git", ["add", "README.md"], { cwd: directory, stdio: "pipe" }); + execFileSync("git", ["-c", "commit.gpgsign=false", "commit", "-m", "init"], { + cwd: directory, + stdio: "pipe", + }); +} diff --git a/packages/server/src/server/bootstrap.ts b/packages/server/src/server/bootstrap.ts index e0d4f3890..7ce0ca628 100644 --- 
a/packages/server/src/server/bootstrap.ts +++ b/packages/server/src/server/bootstrap.ts @@ -9,6 +9,7 @@ import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/ import { InMemoryTransport } from "@modelcontextprotocol/sdk/inMemory.js"; import { isInitializeRequest } from "@modelcontextprotocol/sdk/types.js"; import type { Logger } from "pino"; +import { createBranchChangeRouteHandler } from "./service-route-branch-handler.js"; export type ListenTarget = | { type: "tcp"; host: string; port: number } @@ -93,12 +94,17 @@ import type { LocalSpeechProviderConfig } from "./speech/providers/local/config. import type { RequestedSpeechProviders } from "./speech/speech-types.js"; import { createSpeechService } from "./speech/speech-runtime.js"; import { AgentManager } from "./agent/agent-manager.js"; -import { AgentStorage } from "./agent/agent-storage.js"; -import { attachAgentStoragePersistence } from "./persistence-hooks.js"; +import type { AgentSnapshotStore } from "./agent/agent-snapshot-store.js"; import { createAgentMcpServer } from "./agent/mcp-server.js"; import { createAllClients, shutdownProviders } from "./agent/provider-registry.js"; -import { bootstrapWorkspaceRegistries } from "./workspace-registry-bootstrap.js"; -import { FileBackedProjectRegistry, FileBackedWorkspaceRegistry } from "./workspace-registry.js"; +import { DbAgentSnapshotStore } from "./db/db-agent-snapshot-store.js"; +import { DbAgentTimelineStore } from "./db/db-agent-timeline-store.js"; +import { DbProjectRegistry } from "./db/db-project-registry.js"; +import { DbWorkspaceRegistry } from "./db/db-workspace-registry.js"; +import { WorkspaceReconciliationService } from "./workspace-reconciliation-service.js"; +import { importLegacyAgentSnapshots } from "./db/legacy-agent-snapshot-import.js"; +import { importLegacyProjectWorkspaceJson } from "./db/legacy-project-workspace-import.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./db/sqlite-database.js"; 
import { FileBackedChatService } from "./chat/chat-service.js"; import { CheckoutDiffManager } from "./checkout-diff-manager.js"; import { LoopService } from "./loop-service.js"; @@ -112,6 +118,13 @@ import { resolveDaemonVersion } from "./daemon-version.js"; import type { AgentClient, AgentProvider } from "./agent/agent-sdk-types.js"; import type { AgentProviderRuntimeSettingsMap } from "./agent/provider-launch-config.js"; import { isHostAllowed, type AllowedHostsConfig } from "./allowed-hosts.js"; +import { + ServiceRouteStore, + createServiceProxyMiddleware, + createServiceProxyUpgradeHandler, +} from "./service-proxy.js"; +import { ServiceHealthMonitor } from "./service-health-monitor.js"; +import { createServiceStatusEmitter } from "./service-status-projection.js"; import { createVoiceMcpSocketBridgeManager, type VoiceMcpSocketBridgeManager, @@ -186,8 +199,9 @@ export type PaseoDaemonConfig = { export interface PaseoDaemon { config: PaseoDaemonConfig; agentManager: AgentManager; - agentStorage: AgentStorage; + agentStorage: AgentSnapshotStore; terminalManager: TerminalManager; + serviceRouteStore: ServiceRouteStore; start(): Promise<void>; stop(): Promise<void>; getListenTarget(): ListenTarget | null; @@ -201,6 +215,7 @@ export async function createPaseoDaemon( const bootstrapStart = performance.now(); const elapsed = () => `${(performance.now() - bootstrapStart).toFixed(0)}ms`; const daemonVersion = resolveDaemonVersion(import.meta.url); + let database: PaseoDatabaseHandle | null = null; try { const serverId = getOrCreateServerId(config.paseoHome, { logger }); @@ -217,6 +232,34 @@ const app = express(); let boundListenTarget: ListenTarget | null = null; + const serviceRouteStore = new ServiceRouteStore(); + let wsServer: VoiceAssistantWebSocketServer | null = null; + const serviceHealthMonitor = new ServiceHealthMonitor({ + routeStore: serviceRouteStore, + onChange: createServiceStatusEmitter({ + sessions: () => + 
wsServer?.listActiveSessions().map((session) => ({ + emit: (message) => session.emitServerMessage(message), + })) ?? [], + routeStore: serviceRouteStore, + daemonPort: () => (boundListenTarget?.type === "tcp" ? boundListenTarget.port : null), + }), + }); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore: serviceRouteStore, + emitServiceStatusUpdate: (workspaceId, services) => { + const message = { + type: "service_status_update" as const, + payload: { workspaceId, services }, + }; + const activeSessions = wsServer?.listActiveSessions() ?? []; + for (const session of activeSessions) { + session.emitServerMessage(message); + } + }, + logger, + }); + // Host allowlist / DNS rebinding protection (vite-like semantics). // For non-TCP (unix sockets), skip host validation. if (listenTarget.type === "tcp") { @@ -230,6 +273,14 @@ export async function createPaseoDaemon( }); } + // Service proxy — intercepts requests for registered *.localhost hostnames + // and forwards them to the corresponding local service port. Placed after + // the host allowlist (*.localhost is already allowed) but before CORS and + // the rest of the routes so proxied requests skip unnecessary middleware. 
+ app.use( + createServiceProxyMiddleware({ routeStore: serviceRouteStore, logger }), + ); + // CORS - allow same-origin + configured origins const allowedOrigins = new Set([ ...config.corsAllowedOrigins, @@ -330,20 +381,28 @@ export async function createPaseoDaemon( const httpServer = createHTTPServer(app); - const agentStorage = new AgentStorage(config.agentStoragePath, logger); - const projectRegistry = new FileBackedProjectRegistry( - path.join(config.paseoHome, "projects", "projects.json"), - logger, - ); - const workspaceRegistry = new FileBackedWorkspaceRegistry( - path.join(config.paseoHome, "projects", "workspaces.json"), + database = await openPaseoDatabase(path.join(config.paseoHome, "db")); + logger.info({ elapsed: elapsed() }, "Paseo database opened"); + + // Service proxy WebSocket upgrade handler — must be registered before the + // VoiceAssistantWebSocketServer attaches its own "upgrade" listener so that + // service-bound upgrades are forwarded first. The handler is a no-op for + // requests that don't match a registered service route. 
+ const serviceProxyUpgradeHandler = createServiceProxyUpgradeHandler({ + routeStore: serviceRouteStore, logger, - ); + }); + httpServer.on("upgrade", serviceProxyUpgradeHandler); + + const agentStorage = new DbAgentSnapshotStore(database.db); const chatService = new FileBackedChatService({ paseoHome: config.paseoHome, logger, }); - const agentManager = new AgentManager({ + const durableTimelineStore = new DbAgentTimelineStore(database.db); + let agentManager: AgentManager | null = null; + const terminalManager = createTerminalManager(); + agentManager = new AgentManager({ clients: { ...createAllClients(logger, { runtimeSettings: config.agentProviderSettings, @@ -351,26 +410,42 @@ export async function createPaseoDaemon( ...config.agentClients, }, registry: agentStorage, + durableTimelineStore, + terminalManager, logger, }); - const terminalManager = createTerminalManager(); + const projectRegistry = new DbProjectRegistry(database.db); + const workspaceRegistry = new DbWorkspaceRegistry(database.db); - const detachAgentStoragePersistence = attachAgentStoragePersistence( - logger, - agentManager, - agentStorage, - ); - await agentStorage.initialize(); - logger.info({ elapsed: elapsed() }, "Agent storage initialized"); - await bootstrapWorkspaceRegistries({ - paseoHome: config.paseoHome, - agentStorage, + try { + await importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome: config.paseoHome, + logger, + }); + logger.info({ elapsed: elapsed() }, "Legacy project/workspace import checked"); + } catch (err) { + logger.error({ err }, "Legacy project/workspace import failed (non-fatal)"); + } + try { + await importLegacyAgentSnapshots({ + db: database.db, + paseoHome: config.paseoHome, + logger, + }); + logger.info({ elapsed: elapsed() }, "Legacy agent snapshot import checked"); + } catch (err) { + logger.error({ err }, "Legacy agent snapshot import failed (non-fatal)"); + } + + const reconciliationService = new WorkspaceReconciliationService({ projectRegistry, 
workspaceRegistry, logger, }); - logger.info({ elapsed: elapsed() }, "Workspace registries bootstrapped"); + reconciliationService.start(); + logger.info({ elapsed: elapsed() }, "Workspace reconciliation service started"); await chatService.initialize(); logger.info({ elapsed: elapsed() }, "Chat service initialized"); const checkoutDiffManager = new CheckoutDiffManager({ @@ -402,7 +477,6 @@ export async function createPaseoDaemon( "Voice mode configured for agent-scoped resume flow (no dedicated voice assistant provider)", ); logger.info({ elapsed: elapsed() }, "Preparing voice and MCP runtime"); - let wsServer: VoiceAssistantWebSocketServer | null = null; let voiceMcpBridgeManager: VoiceMcpSocketBridgeManager | null = null; // Create in-memory transport for Session's Agent MCP client (voice assistant tools) @@ -411,6 +485,9 @@ export async function createPaseoDaemon( agentManager, agentStorage, terminalManager, + serviceRouteStore, + getDaemonTcpPort: () => + boundListenTarget?.type === "tcp" ? boundListenTarget.port : null, paseoHome: config.paseoHome, enableVoiceTools: false, resolveSpeakHandler: (callerAgentId) => @@ -437,6 +514,9 @@ export async function createPaseoDaemon( agentManager, agentStorage, terminalManager, + serviceRouteStore, + getDaemonTcpPort: () => + boundListenTarget?.type === "tcp" ? boundListenTarget.port : null, paseoHome: config.paseoHome, callerAgentId, enableVoiceTools: false, @@ -557,6 +637,9 @@ export async function createPaseoDaemon( agentManager, agentStorage, terminalManager, + serviceRouteStore, + getDaemonTcpPort: () => + boundListenTarget?.type === "tcp" ? boundListenTarget.port : null, paseoHome: config.paseoHome, callerAgentId, voiceOnly: true, @@ -618,6 +701,10 @@ export async function createPaseoDaemon( loopService, scheduleService, checkoutDiffManager, + serviceRouteStore, + handleBranchChange, + () => (boundListenTarget?.type === "tcp" ? 
boundListenTarget.port : null), + (hostname) => serviceHealthMonitor.getHealthForHostname(hostname), ); logger.info({ elapsed: elapsed() }, "Bootstrap complete, ready to start listening"); @@ -708,13 +795,14 @@ export async function createPaseoDaemon( // Start speech service after listening so synchronous Sherpa native // model loading doesn't block the server from accepting connections. speechService.start(); + serviceHealthMonitor.start(); }; const stop = async () => { + reconciliationService.stop(); + serviceHealthMonitor.stop(); await closeAllAgents(logger, agentManager); await agentManager.flush().catch(() => undefined); - detachAgentStoragePersistence(); - await agentStorage.flush().catch(() => undefined); await shutdownProviders(logger, { runtimeSettings: config.agentProviderSettings, }); @@ -728,6 +816,7 @@ export async function createPaseoDaemon( if (voiceMcpBridgeManager) { await voiceMcpBridgeManager.stop().catch(() => undefined); } + await database?.close().catch(() => undefined); await new Promise<void>((resolve) => { httpServer.close(() => resolve()); }); @@ -742,11 +831,13 @@ export async function createPaseoDaemon( agentManager, agentStorage, terminalManager, + serviceRouteStore, start, stop, getListenTarget: () => boundListenTarget, }; } catch (err) { + await database?.close().catch(() => undefined); throw err; } } diff --git a/packages/server/src/server/daemon-client.e2e.test.ts b/packages/server/src/server/daemon-client.e2e.test.ts index a83c41cb3..ec5500091 100644 --- a/packages/server/src/server/daemon-client.e2e.test.ts +++ b/packages/server/src/server/daemon-client.e2e.test.ts @@ -224,6 +224,8 @@ describe("daemon client E2E", () => { expect(archivedResult).not.toBeNull(); expect(archivedResult?.agent.archivedAt).toBeTruthy(); expect(archivedResult?.agent.status).not.toBe("running"); + expect(archivedResult?.agent.requiresAttention).toBe(false); + expect(archivedResult?.agent.attentionReason).toBeNull(); 
expect(archivedResult?.project).not.toBeNull(); expect(archivedResult?.project?.checkout.cwd).toBe(cwd); @@ -309,6 +311,42 @@ describe("daemon client E2E", () => { } }, 180000); + test("update_agent persists unloaded title and labels across auto-unarchive", async () => { + const cwd = tmpCwd(); + try { + const created = await ctx.client.createAgent({ + config: { + ...getFullAccessConfig("codex"), + cwd, + }, + }); + + await ctx.client.archiveAgent(created.id); + await ctx.client.updateAgent(created.id, { + name: "Pinned Title", + labels: { lane: "phase-1a" }, + }); + + const archived = await ctx.client.fetchAgent(created.id); + expect(archived).not.toBeNull(); + expect(archived?.agent.archivedAt).toBeTruthy(); + expect(archived?.agent.title).toBe("Pinned Title"); + expect(archived?.agent.labels).toMatchObject({ lane: "phase-1a" }); + + await ctx.client.sendMessage(created.id, "Say hello and nothing else"); + const finalState = await ctx.client.waitForFinish(created.id, 120000); + expect(finalState.status).toBe("idle"); + + const unarchived = await ctx.client.fetchAgent(created.id); + expect(unarchived).not.toBeNull(); + expect(unarchived?.agent.archivedAt).toBeNull(); + expect(unarchived?.agent.title).toBe("Pinned Title"); + expect(unarchived?.agent.labels).toMatchObject({ lane: "phase-1a" }); + } finally { + rmSync(cwd, { recursive: true, force: true }); + } + }, 180000); + test("returns home-scoped directory suggestions", async () => { const insideHomeDir = mkdtempSync(path.join(homedir(), "paseo-dir-suggestion-")); const outsideHomeDir = mkdtempSync(path.join(tmpdir(), "paseo-dir-suggestion-outside-")); @@ -529,7 +567,6 @@ describe("daemon client E2E", () => { const timelineResult = await ctx.client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 1, - projection: "projected", }); expect(timelineResult.agentId).toBe(agent.id); @@ -756,7 +793,6 @@ describe("daemon client E2E", () => { const timeline = await ctx.client.fetchAgentTimeline(agent.id, { 
direction: "tail", limit: 0, - projection: "projected", }); expect(timeline.entries.length).toBeGreaterThan(0); diff --git a/packages/server/src/server/daemon-e2e/agent-operations.e2e.test.ts b/packages/server/src/server/daemon-e2e/agent-operations.e2e.test.ts index 442d6ae03..3d2f5f2b9 100644 --- a/packages/server/src/server/daemon-e2e/agent-operations.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/agent-operations.e2e.test.ts @@ -71,7 +71,6 @@ describe("daemon E2E", () => { await ctx.client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 200, - projection: "projected", }); const refreshedResult = await ctx.client.fetchAgent(agent.id); @@ -89,7 +88,6 @@ describe("daemon E2E", () => { await ctx.client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 200, - projection: "projected", }); const clearResult = await ctx.client.fetchAgent(agent.id); diff --git a/packages/server/src/server/daemon-e2e/claude-autonomous-wake-simple.real.e2e.test.ts b/packages/server/src/server/daemon-e2e/claude-autonomous-wake-simple.real.e2e.test.ts index 6dd2fe622..6ab400712 100644 --- a/packages/server/src/server/daemon-e2e/claude-autonomous-wake-simple.real.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/claude-autonomous-wake-simple.real.e2e.test.ts @@ -60,7 +60,6 @@ describe("daemon E2E (real claude) - autonomous wake simple", () => { const timelineAtIdle = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); const idleAssistantText = timelineAtIdle.entries .filter( @@ -91,7 +90,6 @@ describe("daemon E2E (real claude) - autonomous wake simple", () => { const finalTimeline = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); const finalAssistantText = finalTimeline.entries .filter( diff --git a/packages/server/src/server/daemon-e2e/claude-autonomous-wake.real.e2e.test.ts b/packages/server/src/server/daemon-e2e/claude-autonomous-wake.real.e2e.test.ts 
index be0657fc5..c49219006 100644 --- a/packages/server/src/server/daemon-e2e/claude-autonomous-wake.real.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/claude-autonomous-wake.real.e2e.test.ts @@ -390,7 +390,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAtIdle = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); await client.waitForAgentUpsert( @@ -405,7 +404,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAfterWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); expect(timelineAfterWake.entries.length).toBeGreaterThanOrEqual( timelineAtIdle.entries.length, @@ -417,7 +415,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const nextTimeline = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); sawTimelineGrowth = nextTimeline.entries.length > timelineAtIdle.entries.length; } @@ -605,7 +602,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineBeforeWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); const summarized = timelineBeforeWake.entries.map(summarizeTimelineEntry); // Required by reproduction request: log timeline at idle edge before autonomous wake. 
@@ -735,7 +731,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAtWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); // eslint-disable-next-line no-console @@ -751,7 +746,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAfterWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); expect(timelineAfterWake.entries.length).toBeGreaterThanOrEqual( timelineAtWake.entries.length, @@ -916,7 +910,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAtWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); // eslint-disable-next-line no-console @@ -932,7 +925,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timelineAfterWake = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); expect(timelineAfterWake.entries.length).toBeGreaterThanOrEqual( timelineAtWake.entries.length, @@ -1066,7 +1058,6 @@ describe("daemon E2E (real claude) - autonomous wake from background task", () = const timeline = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); assistantTexts = timeline.entries .filter( diff --git a/packages/server/src/server/daemon-e2e/persistence.e2e.test.ts b/packages/server/src/server/daemon-e2e/persistence.e2e.test.ts index 40f36fcd7..9e9e18589 100644 --- a/packages/server/src/server/daemon-e2e/persistence.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/persistence.e2e.test.ts @@ -114,11 +114,15 @@ describe("daemon E2E - persistence", () => { const timeline = await ctx.client.fetchAgentTimeline(agentId, { direction: "tail", limit: 0, - projection: "canonical", }); const timelineItems = 
timeline.entries.map((entry) => entry.item); expect(timelineItems.length).toBeGreaterThan(0); - expect(timelineItems.some((item) => item.type === "assistant_message")).toBe(true); + const assistantMessages = timelineItems.filter( + (item): item is Extract<(typeof timelineItems)[number], { type: "assistant_message" }> => + item.type === "assistant_message", + ); + expect(assistantMessages.length).toBeGreaterThan(0); + expect(assistantMessages.map((item) => item.text).join("")).toBe("timeline test"); } finally { await ctx.cleanup(); cleaned = true; diff --git a/packages/server/src/server/daemon-e2e/rewind-user-message-dedupe-claude.real.e2e.test.ts b/packages/server/src/server/daemon-e2e/rewind-user-message-dedupe-claude.real.e2e.test.ts index 7c5627db1..1aeed1d4f 100644 --- a/packages/server/src/server/daemon-e2e/rewind-user-message-dedupe-claude.real.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/rewind-user-message-dedupe-claude.real.e2e.test.ts @@ -48,7 +48,6 @@ describe("daemon E2E (real claude) - rewind user message dedupe", () => { const timeline = await client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 0, - projection: "canonical", }); const rewindUserMessages = timeline.entries.filter( diff --git a/packages/server/src/server/daemon-e2e/terminal.e2e.test.ts b/packages/server/src/server/daemon-e2e/terminal.e2e.test.ts index 0f66a29c8..83f777de7 100644 --- a/packages/server/src/server/daemon-e2e/terminal.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/terminal.e2e.test.ts @@ -462,6 +462,39 @@ describe("daemon E2E terminal", () => { rmSync(cwd, { recursive: true, force: true }); }, 30000); + test("propagates debounced terminal titles through list responses and snapshots", async () => { + const cwd = tmpCwd(); + const created = await ctx.client.createTerminal(cwd, undefined, undefined, { + command: "/bin/sh", + args: ["-lc", "printf '\\033]0;Build Output\\007'; sleep 2"], + }); + const terminalId = created.terminal!.id; + + let 
listedTitle: string | undefined; + const start = Date.now(); + while (Date.now() - start < 10000) { + const list = await ctx.client.listTerminals(cwd); + listedTitle = list.terminals.find((terminal) => terminal.id === terminalId)?.title; + if (listedTitle === "Build Output") { + break; + } + await new Promise((resolve) => setTimeout(resolve, 25)); + } + expect(listedTitle).toBe("Build Output"); + + const snapshotPromise = waitForTerminalSnapshot( + ctx.client, + terminalId, + (state) => state.title === "Build Output", + ); + await ctx.client.subscribeTerminal(terminalId); + const snapshot = await snapshotPromise; + + expect(snapshot.title).toBe("Build Output"); + + rmSync(cwd, { recursive: true, force: true }); + }, 30000); + + test("subscribe response is sent before the initial snapshot frame", async () => { + const cwd = tmpCwd(); + const created = await ctx.client.createTerminal(cwd); diff --git a/packages/server/src/server/daemon-e2e/timeline-reconnect-contract.e2e.test.ts b/packages/server/src/server/daemon-e2e/timeline-reconnect-contract.e2e.test.ts new file mode 100644 index 000000000..26cbf55e2 --- /dev/null +++ b/packages/server/src/server/daemon-e2e/timeline-reconnect-contract.e2e.test.ts @@ -0,0 +1,206 @@ +import { afterEach, beforeEach, describe, expect, test } from "vitest"; +import { mkdtempSync, rmSync } from "node:fs"; +import { tmpdir } from "node:os"; +import path from "node:path"; + +import { createDaemonTestContext, type DaemonTestContext, DaemonClient } from "../test-utils/index.js"; +import { createMessageCollector } from "../test-utils/message-collector.js"; +import type { SessionOutboundMessage } from "../messages.js"; + +function tmpCwd(): string { + return mkdtempSync(path.join(tmpdir(), "daemon-e2e-")); +} + +async function waitFor( + predicate: () => boolean, + timeoutMs = 5_000, + intervalMs = 10, +): Promise<void> { + const startedAt = Date.now(); + while (!predicate()) { + if (Date.now() - startedAt > timeoutMs) { + throw new Error(`Timed out 
after ${timeoutMs}ms waiting for condition`); + } + await new Promise((resolve) => setTimeout(resolve, intervalMs)); + } +} + +function isSeqLessAssistantTimeline( + message: SessionOutboundMessage, + agentId: string, + text?: string, +): boolean { + return ( + message.type === "agent_stream" && + message.payload.agentId === agentId && + message.payload.event.type === "timeline" && + message.payload.event.item.type === "assistant_message" && + message.payload.seq === undefined && + (text === undefined || message.payload.event.item.text === text) + ); +} + +describe("daemon E2E - timeline reconnect contract", () => { + let ctx: DaemonTestContext; + + beforeEach(async () => { + ctx = await createDaemonTestContext(); + }); + + afterEach(async () => { + await ctx.cleanup(); + }, 60_000); + + test("reconnect catches up committed rows without replaying a provisional seed", async () => { + const cwd = tmpCwd(); + const primaryCollector = createMessageCollector(ctx.client); + + try { + const agent = await ctx.client.createAgent({ + provider: "codex", + cwd, + title: "Reconnect Contract Test", + modeId: "full-access", + }); + + for (let seq = 1; seq <= 120; seq += 1) { + await ctx.daemon.daemon.agentManager.appendTimelineItem(agent.id, { + type: "assistant_message", + text: `committed row ${seq}`, + }); + } + + primaryCollector.clear(); + await ctx.daemon.daemon.agentManager.emitLiveTimelineItem(agent.id, { + type: "assistant_message", + text: "partial before disconnect", + }); + await waitFor(() => + primaryCollector.messages.some((message) => + isSeqLessAssistantTimeline(message, agent.id, "partial before disconnect"), + ), + ); + + await ctx.client.close(); + + await ctx.daemon.daemon.agentManager.appendTimelineItem(agent.id, { + type: "assistant_message", + text: "finalized while disconnected", + }); + + const reconnectClient = new DaemonClient({ + url: `ws://127.0.0.1:${ctx.daemon.port}/ws`, + }); + await reconnectClient.connect(); + const reconnectCollector = 
createMessageCollector(reconnectClient); + + try { + await reconnectClient.fetchAgents({ + subscribe: { subscriptionId: "timeline-reconnect-a" }, + }); + + expect( + reconnectCollector.messages.some((message) => + isSeqLessAssistantTimeline(message, agent.id), + ), + ).toBe(false); + + const catchUp = await reconnectClient.fetchAgentTimeline(agent.id, { + direction: "after", + cursor: { seq: 120 }, + limit: 0, + }); + + expect(catchUp.entries).toHaveLength(1); + expect(catchUp.entries[0]?.seq).toBe(121); + expect(catchUp.entries[0]?.item).toEqual({ + type: "assistant_message", + text: "finalized while disconnected", + }); + } finally { + reconnectCollector.unsubscribe(); + await reconnectClient.close(); + } + } finally { + primaryCollector.unsubscribe(); + rmSync(cwd, { recursive: true, force: true }); + } + }, 30_000); + + test("reconnect with no new committed rows resumes from future live provisional updates only", async () => { + const cwd = tmpCwd(); + const primaryCollector = createMessageCollector(ctx.client); + + try { + const agent = await ctx.client.createAgent({ + provider: "codex", + cwd, + title: "Reconnect No Seed Test", + modeId: "full-access", + }); + + for (let seq = 1; seq <= 120; seq += 1) { + await ctx.daemon.daemon.agentManager.appendTimelineItem(agent.id, { + type: "assistant_message", + text: `committed row ${seq}`, + }); + } + + primaryCollector.clear(); + await ctx.daemon.daemon.agentManager.emitLiveTimelineItem(agent.id, { + type: "assistant_message", + text: "partial before disconnect", + }); + await waitFor(() => + primaryCollector.messages.some((message) => + isSeqLessAssistantTimeline(message, agent.id, "partial before disconnect"), + ), + ); + + await ctx.client.close(); + + const reconnectClient = new DaemonClient({ + url: `ws://127.0.0.1:${ctx.daemon.port}/ws`, + }); + await reconnectClient.connect(); + const reconnectCollector = createMessageCollector(reconnectClient); + + try { + await reconnectClient.fetchAgents({ + subscribe: { 
subscriptionId: "timeline-reconnect-b" }, + }); + + expect( + reconnectCollector.messages.some((message) => + isSeqLessAssistantTimeline(message, agent.id), + ), + ).toBe(false); + + const catchUp = await reconnectClient.fetchAgentTimeline(agent.id, { + direction: "after", + cursor: { seq: 120 }, + limit: 0, + }); + + expect(catchUp.entries).toHaveLength(0); + + reconnectCollector.clear(); + await ctx.daemon.daemon.agentManager.emitLiveTimelineItem(agent.id, { + type: "assistant_message", + text: "fresh live after reconnect", + }); + await waitFor(() => + reconnectCollector.messages.some((message) => + isSeqLessAssistantTimeline(message, agent.id, "fresh live after reconnect"), + ), + ); + } finally { + reconnectCollector.unsubscribe(); + await reconnectClient.close(); + } + } finally { + primaryCollector.unsubscribe(); + rmSync(cwd, { recursive: true, force: true }); + } + }, 30_000); +}); diff --git a/packages/server/src/server/daemon-e2e/timeline-window.e2e.test.ts b/packages/server/src/server/daemon-e2e/timeline-window.e2e.test.ts index af2f55799..ae46e9d72 100644 --- a/packages/server/src/server/daemon-e2e/timeline-window.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/timeline-window.e2e.test.ts @@ -20,7 +20,7 @@ describe("daemon E2E - timeline window", () => { await ctx.cleanup(); }, 60_000); - test("canonical tail limit keeps assistant chunks intact at the window boundary", async () => { + test("canonical tail limit returns one finalized committed assistant row at the window boundary", async () => { const cwd = tmpCwd(); try { const agent = await ctx.client.createAgent({ @@ -30,7 +30,7 @@ describe("daemon E2E - timeline window", () => { modeId: "full-access", }); - const expected = "1234567890ABCDEFGHIJ"; + const expected = "READY"; await ctx.client.sendMessage(agent.id, `Respond with exactly: ${expected}`); const finalState = await ctx.client.waitForFinish(agent.id, 5_000); expect(finalState.status).toBe("idle"); @@ -38,16 +38,14 @@ 
describe("daemon E2E - timeline window", () => { const timeline = await ctx.client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 1, - projection: "canonical", }); const assistantTexts = timeline.entries .filter((entry) => entry.item.type === "assistant_message") .map((entry) => entry.item.text); - expect(assistantTexts).toHaveLength(2); - expect(assistantTexts.join("")).toBe(expected); - expect(timeline.startCursor?.seq).toBeLessThan(timeline.endCursor?.seq ?? 0); + expect(assistantTexts).toEqual([expected]); + expect(timeline.startSeq).toBe(timeline.endSeq); } finally { rmSync(cwd, { recursive: true, force: true }); } @@ -63,17 +61,16 @@ describe("daemon E2E - timeline window", () => { modeId: "full-access", }); - await ctx.client.sendMessage(agent.id, "Respond with exactly: FIRST-RESPONSE"); + await ctx.client.sendMessage(agent.id, "Respond with exactly: FIRST"); expect((await ctx.client.waitForFinish(agent.id, 5_000)).status).toBe("idle"); - const expected = "SECOND-RESPONSE"; + const expected = "SECOND"; await ctx.client.sendMessage(agent.id, `Respond with exactly: ${expected}`); expect((await ctx.client.waitForFinish(agent.id, 5_000)).status).toBe("idle"); const timeline = await ctx.client.fetchAgentTimeline(agent.id, { direction: "tail", limit: 1, - projection: "canonical", }); const assistantTexts = timeline.entries @@ -82,7 +79,7 @@ describe("daemon E2E - timeline window", () => { expect(assistantTexts.join("")).toBe(expected); expect(timeline.hasOlder).toBe(true); - expect(timeline.startCursor?.seq).toBeGreaterThan(1); + expect(timeline.startSeq).toBeGreaterThan(1); } finally { rmSync(cwd, { recursive: true, force: true }); } diff --git a/packages/server/src/server/daemon-e2e/ui-action-stress.real.e2e.test.ts b/packages/server/src/server/daemon-e2e/ui-action-stress.real.e2e.test.ts index f07aa0900..24aebe765 100644 --- a/packages/server/src/server/daemon-e2e/ui-action-stress.real.e2e.test.ts +++ 
b/packages/server/src/server/daemon-e2e/ui-action-stress.real.e2e.test.ts @@ -289,7 +289,6 @@ async function resolveLatestAssistantMessage( const timeline = await client.fetchAgentTimeline(agentId, { direction: "tail", limit: 300, - projection: "canonical", }); for (let idx = timeline.entries.length - 1; idx >= 0; idx -= 1) { const entry = timeline.entries[idx]; diff --git a/packages/server/src/server/daemon-e2e/wait-for-idle.e2e.test.ts b/packages/server/src/server/daemon-e2e/wait-for-idle.e2e.test.ts index 78b737f1b..f0b84d956 100644 --- a/packages/server/src/server/daemon-e2e/wait-for-idle.e2e.test.ts +++ b/packages/server/src/server/daemon-e2e/wait-for-idle.e2e.test.ts @@ -128,6 +128,10 @@ describe("waitForFinish edge cases", () => { test("waitForFinish resolves first idle edge even if a new run starts immediately after", async () => { const cwd = tmpCwd(); + const secondary = new DaemonClient({ url: `ws://127.0.0.1:${ctx.daemon.port}/ws` }); + + await secondary.connect(); + await secondary.fetchAgents({ subscribe: { subscriptionId: "wait-for-idle-secondary" } }); const agent = await ctx.client.createAgent({ provider: "codex", @@ -153,14 +157,9 @@ describe("waitForFinish edge cases", () => { } spawnedSecondRun = true; - const stream = ctx.daemon.daemon.agentManager.streamAgent( - agent.id, - "Use your shell tool to run sleep 1 and then reply done.", - ); secondRunDrain = (async () => { - for await (const _event of stream) { - // Drain second run so manager can settle. 
- } + await secondary.sendMessage(agent.id, "Reply with exactly: done."); + await secondary.waitForFinish(agent.id, 5_000); + })(); }, { agentId: agent.id, replayState: false }, @@ -177,6 +176,7 @@ describe("waitForFinish edge cases", () => { } finally { unsubscribe(); await secondRunDrain; + await secondary.close(); + await ctx.client.deleteAgent(agent.id); rmSync(cwd, { recursive: true, force: true }); } diff --git a/packages/server/src/server/db/db-agent-snapshot-store.test.ts b/packages/server/src/server/db/db-agent-snapshot-store.test.ts new file mode 100644 index 000000000..4d31866b7 --- /dev/null +++ b/packages/server/src/server/db/db-agent-snapshot-store.test.ts @@ -0,0 +1,276 @@ +import os from "node:os"; +import path from "node:path"; +import { mkdtempSync, rmSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import type { ManagedAgent } from "../agent/agent-manager.js"; +import type { + AgentPermissionRequest, + AgentSession, + AgentSessionConfig, +} from "../agent/agent-sdk-types.js"; +import type { StoredAgentRecord } from "../agent/agent-storage.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { DbAgentSnapshotStore } from "./db-agent-snapshot-store.js"; +import { agentSnapshots, projects, workspaces } from "./schema.js"; + +type ManagedAgentOverrides = Omit< + Partial<ManagedAgent>, + "config" | "pendingPermissions" | "session" | "activeForegroundTurnId" +> & { + config?: Partial<AgentSessionConfig>; + pendingPermissions?: Map<string, AgentPermissionRequest>; + session?: AgentSession | null; + activeForegroundTurnId?: string | null; + runtimeInfo?: ManagedAgent["runtimeInfo"]; + attention?: ManagedAgent["attention"]; +}; + +function createManagedAgent(overrides: ManagedAgentOverrides = {}): ManagedAgent { + const now = overrides.updatedAt ?? new Date("2026-03-01T00:00:00.000Z"); + const provider = overrides.provider ?? "codex"; + const cwd = overrides.cwd ?? "/tmp/project"; + const lifecycle = overrides.lifecycle ?? 
"idle"; + const configOverrides = overrides.config ?? {}; + const config: AgentSessionConfig = { + provider, + cwd, + title: configOverrides.title, + modeId: configOverrides.modeId ?? "plan", + model: configOverrides.model ?? "gpt-5.1-codex-mini", + extra: configOverrides.extra ?? { codex: { approvalPolicy: "on-request" } }, + systemPrompt: configOverrides.systemPrompt, + mcpServers: configOverrides.mcpServers, + }; + const session = lifecycle === "closed" ? null : (overrides.session ?? ({} as AgentSession)); + const activeForegroundTurnId = + overrides.activeForegroundTurnId ?? (lifecycle === "running" ? "turn-1" : null); + + return { + id: overrides.id ?? "agent-1", + provider, + cwd, + session, + capabilities: overrides.capabilities ?? { + supportsStreaming: true, + supportsSessionPersistence: true, + supportsDynamicModes: true, + supportsMcpServers: true, + supportsReasoningStream: true, + supportsToolInvocations: true, + }, + config, + lifecycle, + createdAt: overrides.createdAt ?? now, + updatedAt: overrides.updatedAt ?? now, + availableModes: overrides.availableModes ?? [], + currentModeId: overrides.currentModeId ?? config.modeId ?? null, + pendingPermissions: overrides.pendingPermissions ?? new Map(), + activeForegroundTurnId, + foregroundTurnWaiters: new Set(), + unsubscribeSession: null, + timeline: overrides.timeline ?? [], + attention: overrides.attention ?? { requiresAttention: false }, + runtimeInfo: overrides.runtimeInfo ?? { + provider, + sessionId: overrides.sessionId ?? "session-123", + model: config.model ?? null, + modeId: config.modeId ?? null, + }, + persistence: overrides.persistence ?? null, + historyPrimed: overrides.historyPrimed ?? true, + lastUserMessageAt: overrides.lastUserMessageAt ?? now, + lastUsage: overrides.lastUsage, + lastError: overrides.lastError, + internal: overrides.internal, + labels: overrides.labels ?? 
{}, + pendingReplacement: false, + provisionalAssistantText: null, + }; +} + +function createStoredAgentRecord(overrides: Partial = {}): StoredAgentRecord { + return { + id: "agent-1", + provider: "codex", + cwd: "/tmp/project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + lastActivityAt: "2026-03-01T00:00:00.000Z", + lastUserMessageAt: null, + title: null, + labels: {}, + lastStatus: "idle", + lastModeId: "plan", + config: { + modeId: "plan", + model: "gpt-5.1-codex-mini", + }, + runtimeInfo: { + provider: "codex", + sessionId: "session-123", + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + persistence: { + provider: "codex", + sessionId: "session-123", + }, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, + internal: false, + archivedAt: null, + ...overrides, + }; +} + +describe("DbAgentSnapshotStore", () => { + let tmpDir: string; + let dataDir: string; + let database: PaseoDatabaseHandle; + let store: DbAgentSnapshotStore; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "db-agent-snapshot-store-")); + dataDir = path.join(tmpDir, "db"); + database = await openPaseoDatabase(dataDir); + store = new DbAgentSnapshotStore(database.db); + }); + + afterEach(async () => { + await database.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("supports list/get/upsert/remove CRUD lifecycle", async () => { + const workspaceId = await seedWorkspace(database, { directory: "/tmp/project" }); + + const record = createStoredAgentRecord(); + + expect(await store.list()).toEqual([]); + expect(await store.get(record.id)).toBeNull(); + + await store.upsert(record, workspaceId); + + expect(await store.get(record.id)).toEqual(record); + expect(await store.list()).toEqual([record]); + expect(await database.db.select().from(agentSnapshots)).toEqual([ + expect.objectContaining({ + agentId: "agent-1", + workspaceId, + requiresAttention: false, + internal: false, 
+ }), + ]); + + await store.remove(record.id); + + expect(await store.get(record.id)).toBeNull(); + expect(await store.list()).toEqual([]); + }); + + test("applySnapshot preserves title, createdAt, and archivedAt across updates", async () => { + const workspaceId = await seedWorkspace(database, { directory: "/tmp/project" }); + + await store.upsert( + createStoredAgentRecord({ + id: "agent-apply", + title: "Pinned title", + createdAt: "2026-03-01T00:00:00.000Z", + archivedAt: "2026-03-05T00:00:00.000Z", + }), + workspaceId, + ); + + await store.applySnapshot( + createManagedAgent({ + id: "agent-apply", + createdAt: new Date("2026-03-10T00:00:00.000Z"), + updatedAt: new Date("2026-03-11T00:00:00.000Z"), + lifecycle: "running", + }), + workspaceId, + ); + + expect(await store.get("agent-apply")).toEqual( + expect.objectContaining({ + id: "agent-apply", + title: "Pinned title", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-11T00:00:00.000Z", + archivedAt: "2026-03-05T00:00:00.000Z", + lastStatus: "running", + }), + ); + }); + + test("setTitle throws for missing agents and updates existing agents", async () => { + const workspaceId = await seedWorkspace(database, { directory: "/tmp/project" }); + + await expect(store.setTitle("missing-agent", "Missing")).rejects.toThrow( + "Agent missing-agent not found", + ); + + await store.upsert(createStoredAgentRecord({ id: "agent-title", title: null }), workspaceId); + await store.setTitle("agent-title", "Renamed agent"); + + expect(await store.get("agent-title")).toEqual( + expect.objectContaining({ + id: "agent-title", + title: "Renamed agent", + }), + ); + }); + + test("upsert is idempotent for the same agent ID", async () => { + const workspaceId = await seedWorkspace(database, { directory: "/tmp/project" }); + + await store.upsert(createStoredAgentRecord({ id: "agent-idempotent", title: "Initial" }), workspaceId); + await store.upsert( + createStoredAgentRecord({ + id: "agent-idempotent", + title: "Updated", 
+ updatedAt: "2026-03-02T00:00:00.000Z", + lastStatus: "running", + }), + workspaceId, + ); + + expect(await store.list()).toEqual([ + createStoredAgentRecord({ + id: "agent-idempotent", + title: "Updated", + updatedAt: "2026-03-02T00:00:00.000Z", + lastStatus: "running", + }), + ]); + expect(await database.db.select().from(agentSnapshots)).toHaveLength(1); + }); +}); + +async function seedWorkspace( + database: PaseoDatabaseHandle, + options: { directory: string }, +): Promise<number> { + const [project] = await database.db.insert(projects).values({ + directory: options.directory, + kind: "git", + displayName: "project-1", + gitRemote: null, + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }).returning(); + const [workspace] = await database.db.insert(workspaces).values({ + projectId: project.id, + directory: options.directory, + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }).returning(); + return workspace.id; +} diff --git a/packages/server/src/server/db/db-agent-snapshot-store.ts b/packages/server/src/server/db/db-agent-snapshot-store.ts new file mode 100644 index 000000000..a89f7ec11 --- /dev/null +++ b/packages/server/src/server/db/db-agent-snapshot-store.ts @@ -0,0 +1,204 @@ +import { asc, eq } from "drizzle-orm"; + +import type { ManagedAgent } from "../agent/agent-manager.js"; +import type { AgentSnapshotStore } from "../agent/agent-snapshot-store.js"; +import { toStoredAgentRecord } from "../agent/agent-projections.js"; +import type { StoredAgentRecord } from "../agent/agent-storage.js"; +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; +import { agentSnapshots } from "./schema.js"; + +type AgentSnapshotRow = typeof agentSnapshots.$inferSelect; +type AgentSnapshotInsert = typeof agentSnapshots.$inferInsert; + +export function toStoredAgentRecordFromRow(row: AgentSnapshotRow): StoredAgentRecord 
{ + return { + id: row.agentId, + provider: row.provider, + cwd: row.cwd, + createdAt: row.createdAt, + updatedAt: row.updatedAt, + lastActivityAt: row.lastActivityAt ?? undefined, + lastUserMessageAt: row.lastUserMessageAt ?? null, + title: row.title ?? null, + labels: row.labels, + lastStatus: row.lastStatus as StoredAgentRecord["lastStatus"], + lastModeId: row.lastModeId ?? null, + config: row.config ?? null, + runtimeInfo: row.runtimeInfo ?? undefined, + persistence: row.persistence ?? null, + requiresAttention: row.requiresAttention, + attentionReason: (row.attentionReason ?? null) as StoredAgentRecord["attentionReason"], + attentionTimestamp: row.attentionTimestamp ?? null, + internal: row.internal, + archivedAt: row.archivedAt ?? null, + }; +} + +export function toAgentSnapshotRowValues(options: { + record: StoredAgentRecord; + workspaceId: number; +}): AgentSnapshotInsert { + const { record, workspaceId } = options; + return { + agentId: record.id, + provider: record.provider, + workspaceId, + cwd: record.cwd, + createdAt: record.createdAt, + updatedAt: record.updatedAt, + lastActivityAt: record.lastActivityAt ?? null, + lastUserMessageAt: record.lastUserMessageAt ?? null, + title: record.title ?? null, + labels: record.labels, + lastStatus: record.lastStatus, + lastModeId: record.lastModeId ?? null, + config: record.config ?? null, + runtimeInfo: record.runtimeInfo ?? null, + persistence: record.persistence ?? null, + requiresAttention: record.requiresAttention ?? false, + attentionReason: record.attentionReason ?? null, + attentionTimestamp: record.attentionTimestamp ?? null, + internal: record.internal ?? false, + archivedAt: record.archivedAt ?? 
null, + }; +} + +function toAgentSnapshotUpdateSet(values: AgentSnapshotInsert) { + return { + provider: values.provider, + workspaceId: values.workspaceId, + cwd: values.cwd, + createdAt: values.createdAt, + updatedAt: values.updatedAt, + lastActivityAt: values.lastActivityAt, + lastUserMessageAt: values.lastUserMessageAt, + title: values.title, + labels: values.labels, + lastStatus: values.lastStatus, + lastModeId: values.lastModeId, + config: values.config, + runtimeInfo: values.runtimeInfo, + persistence: values.persistence, + requiresAttention: values.requiresAttention, + attentionReason: values.attentionReason, + attentionTimestamp: values.attentionTimestamp, + internal: values.internal, + archivedAt: values.archivedAt, + } satisfies Omit<AgentSnapshotInsert, "agentId">; +} + +export class DbAgentSnapshotStore implements AgentSnapshotStore { + private readonly db: PaseoDatabaseHandle["db"]; + + constructor(db: PaseoDatabaseHandle["db"]) { + this.db = db; + } + + async list(): Promise<StoredAgentRecord[]> { + const rows = await this.db + .select() + .from(agentSnapshots) + .orderBy(asc(agentSnapshots.createdAt), asc(agentSnapshots.agentId)); + return rows.map(toStoredAgentRecordFromRow); + } + + async get(agentId: string): Promise<StoredAgentRecord | null> { + const rows = await this.db + .select() + .from(agentSnapshots) + .where(eq(agentSnapshots.agentId, agentId)) + .limit(1); + const row = rows[0]; + return row ? toStoredAgentRecordFromRow(row) : null; + } + + async upsert(record: StoredAgentRecord): Promise<void>; + async upsert(record: StoredAgentRecord, workspaceId: number): Promise<void>; + async upsert(record: StoredAgentRecord, workspaceId?: number): Promise<void> { + const nextWorkspaceId = + workspaceId ?? 
(await this.db + .select({ workspaceId: agentSnapshots.workspaceId }) + .from(agentSnapshots) + .where(eq(agentSnapshots.agentId, record.id)) + .limit(1))[0]?.workspaceId; + if (nextWorkspaceId === undefined) { + throw new Error(`Workspace ID required for agent ${record.id}`); + } + const values = toAgentSnapshotRowValues({ + record, + workspaceId: nextWorkspaceId, + }); + + await this.db + .insert(agentSnapshots) + .values(values) + .onConflictDoUpdate({ + target: agentSnapshots.agentId, + set: toAgentSnapshotUpdateSet(values), + }); + } + + async remove(agentId: string): Promise<void> { + await this.db.delete(agentSnapshots).where(eq(agentSnapshots.agentId, agentId)); + } + + async applySnapshot( + agent: ManagedAgent, + options?: { title?: string | null; internal?: boolean }, + ): Promise<void>; + async applySnapshot( + agent: ManagedAgent, + workspaceId: number, + options?: { title?: string | null; internal?: boolean }, + ): Promise<void>; + async applySnapshot( + agent: ManagedAgent, + workspaceIdOrOptions?: number | { title?: string | null; internal?: boolean }, + options?: { title?: string | null; internal?: boolean }, + ): Promise<void> { + const nextWorkspaceId = + typeof workspaceIdOrOptions === "number" + ? workspaceIdOrOptions + : (await this.db + .select({ workspaceId: agentSnapshots.workspaceId }) + .from(agentSnapshots) + .where(eq(agentSnapshots.agentId, agent.id)) + .limit(1))[0]?.workspaceId; + const nextOptions = + typeof workspaceIdOrOptions === "number" ? options : workspaceIdOrOptions; + const existing = await this.get(agent.id); + const hasTitleOverride = + nextOptions !== undefined && Object.prototype.hasOwnProperty.call(nextOptions, "title"); + const hasInternalOverride = + nextOptions !== undefined && Object.prototype.hasOwnProperty.call(nextOptions, "internal"); + const record = toStoredAgentRecord(agent, { + title: hasTitleOverride ? (nextOptions?.title ?? null) : (existing?.title ?? null), + createdAt: existing?.createdAt, + internal: hasInternalOverride + ? 
nextOptions?.internal + : (agent.internal ?? existing?.internal), + }); + + if (existing && existing.archivedAt !== undefined) { + record.archivedAt = existing.archivedAt; + } + + if (nextWorkspaceId === undefined) { + return; + } + await this.upsert(record, nextWorkspaceId); + } + + async setTitle(agentId: string, title: string): Promise<void> { + const rows = await this.db + .select() + .from(agentSnapshots) + .where(eq(agentSnapshots.agentId, agentId)) + .limit(1); + const row = rows[0]; + if (!row) { + throw new Error(`Agent ${agentId} not found`); + } + await this.upsert({ ...toStoredAgentRecordFromRow(row), title }, row.workspaceId); + } +} diff --git a/packages/server/src/server/db/db-agent-timeline-store.test.ts b/packages/server/src/server/db/db-agent-timeline-store.test.ts new file mode 100644 index 000000000..2bba18087 --- /dev/null +++ b/packages/server/src/server/db/db-agent-timeline-store.test.ts @@ -0,0 +1,266 @@ +import os from "node:os"; +import path from "node:path"; +import { mkdtempSync, rmSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import type { AgentTimelineItem } from "../agent/agent-sdk-types.js"; +import type { AgentTimelineRow } from "../agent/agent-timeline-store-types.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { DbAgentTimelineStore } from "./db-agent-timeline-store.js"; +import { agentTimelineRows } from "./schema.js"; + +function createTimestamp(seq: number): string { + return new Date(Date.UTC(2026, 2, 1, 0, 0, seq)).toISOString(); +} + +function createTimelineItem( + type: Extract<AgentTimelineItem["type"], "user_message" | "assistant_message">, + value: string, +): AgentTimelineItem { + if (type === "user_message") { + return { + type, + text: `user-${value}`, + messageId: `message-${value}`, + }; + } + + return { + type, + text: `assistant-${value}`, + }; +} + +function createRow(seq: number, item?: AgentTimelineItem): AgentTimelineRow { + return { + seq, + timestamp: createTimestamp(seq), + 
item: item ?? createTimelineItem("assistant_message", String(seq)), + }; +} + +describe("DbAgentTimelineStore", () => { + let tmpDir: string; + let dataDir: string; + let database: PaseoDatabaseHandle; + let store: DbAgentTimelineStore; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "db-agent-timeline-store-")); + dataDir = path.join(tmpDir, "db"); + database = await openPaseoDatabase(dataDir); + store = new DbAgentTimelineStore(database.db); + }); + + afterEach(async () => { + await database.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("appendCommitted assigns sequential seq numbers per agent", async () => { + expect( + await store.appendCommitted("agent-1", createTimelineItem("assistant_message", "1")), + ).toEqual({ + seq: 1, + timestamp: expect.any(String), + item: createTimelineItem("assistant_message", "1"), + }); + + expect( + await store.appendCommitted("agent-1", createTimelineItem("assistant_message", "2")), + ).toEqual({ + seq: 2, + timestamp: expect.any(String), + item: createTimelineItem("assistant_message", "2"), + }); + + expect( + await store.appendCommitted("agent-1", createTimelineItem("assistant_message", "3")), + ).toEqual({ + seq: 3, + timestamp: expect.any(String), + item: createTimelineItem("assistant_message", "3"), + }); + }); + + test("appendCommitted for different agents has independent seq sequences", async () => { + const firstAgentFirstRow = await store.appendCommitted( + "agent-1", + createTimelineItem("assistant_message", "a1"), + ); + const secondAgentFirstRow = await store.appendCommitted( + "agent-2", + createTimelineItem("assistant_message", "b1"), + ); + const firstAgentSecondRow = await store.appendCommitted( + "agent-1", + createTimelineItem("assistant_message", "a2"), + ); + + expect(firstAgentFirstRow.seq).toBe(1); + expect(secondAgentFirstRow.seq).toBe(1); + expect(firstAgentSecondRow.seq).toBe(2); + }); + + test("fetchCommitted tail returns the last N rows", async 
() => { + await store.bulkInsert("agent-1", [1, 2, 3, 4, 5].map((seq) => createRow(seq))); + + await expect( + store.fetchCommitted("agent-1", { + direction: "tail", + limit: 2, + }), + ).resolves.toEqual({ + direction: "tail", + window: { + minSeq: 1, + maxSeq: 5, + nextSeq: 6, + }, + hasOlder: true, + hasNewer: false, + rows: [createRow(4), createRow(5)], + }); + }); + + test("fetchCommitted after-cursor returns rows after a given seq", async () => { + await store.bulkInsert("agent-1", [1, 2, 3, 4, 5].map((seq) => createRow(seq))); + + await expect( + store.fetchCommitted("agent-1", { + direction: "after", + cursor: { seq: 2 }, + limit: 2, + }), + ).resolves.toEqual({ + direction: "after", + window: { + minSeq: 1, + maxSeq: 5, + nextSeq: 6, + }, + hasOlder: true, + hasNewer: true, + rows: [createRow(3), createRow(4)], + }); + }); + + test("fetchCommitted before-cursor returns rows before a given seq", async () => { + await store.bulkInsert("agent-1", [1, 2, 3, 4, 5].map((seq) => createRow(seq))); + + await expect( + store.fetchCommitted("agent-1", { + direction: "before", + cursor: { seq: 4 }, + limit: 2, + }), + ).resolves.toEqual({ + direction: "before", + window: { + minSeq: 1, + maxSeq: 5, + nextSeq: 6, + }, + hasOlder: true, + hasNewer: true, + rows: [createRow(2), createRow(3)], + }); + }); + + test("getLatestCommittedSeq returns 0 for an unknown agent", async () => { + await expect(store.getLatestCommittedSeq("missing-agent")).resolves.toBe(0); + }); + + test("getLatestCommittedSeq returns the latest seq after appends", async () => { + await store.appendCommitted("agent-1", createTimelineItem("assistant_message", "1")); + await store.appendCommitted("agent-1", createTimelineItem("assistant_message", "2")); + + await expect(store.getLatestCommittedSeq("agent-1")).resolves.toBe(2); + }); + + test("deleteAgent removes all rows for the target agent", async () => { + await store.bulkInsert("agent-1", [createRow(1), createRow(2)]); + await 
store.bulkInsert("agent-2", [createRow(1)]); + + await store.deleteAgent("agent-1"); + + await expect(store.getCommittedRows("agent-1")).resolves.toEqual([]); + await expect(store.getCommittedRows("agent-2")).resolves.toEqual([createRow(1)]); + }); + + test("bulkInsert preserves provided seq numbers", async () => { + const rows = [createRow(3), createRow(7)]; + + await store.bulkInsert("agent-1", rows); + + await expect(store.getCommittedRows("agent-1")).resolves.toEqual(rows); + }); + + test("item_kind is populated from item.type", async () => { + await store.appendCommitted("agent-1", createTimelineItem("user_message", "kind-check"), { + timestamp: createTimestamp(1), + }); + + await expect(database.db.select().from(agentTimelineRows)).resolves.toEqual([ + expect.objectContaining({ + agentId: "agent-1", + seq: 1, + committedAt: createTimestamp(1), + itemKind: "user_message", + }), + ]); + }); + + test("getLastItem returns the latest committed item", async () => { + await store.bulkInsert("agent-1", [ + createRow(1, createTimelineItem("user_message", "1")), + createRow(2, createTimelineItem("assistant_message", "2")), + ]); + + await expect(store.getLastItem("agent-1")).resolves.toEqual( + createTimelineItem("assistant_message", "2"), + ); + await expect(store.getLastItem("missing-agent")).resolves.toBeNull(); + }); + + test("getLastAssistantMessage assembles the latest contiguous assistant chunks", async () => { + await store.bulkInsert("agent-1", [ + createRow(1, createTimelineItem("assistant_message", "1")), + createRow(2, createTimelineItem("assistant_message", "2")), + createRow(3, { type: "reasoning", text: "separator-1" }), + createRow(4, createTimelineItem("assistant_message", "4")), + createRow(5, createTimelineItem("assistant_message", "5")), + createRow(6, { type: "reasoning", text: "separator-2" }), + ]); + + await expect(store.getLastAssistantMessage("agent-1")).resolves.toBe("assistant-4assistant-5"); + await 
expect(store.getLastAssistantMessage("missing-agent")).resolves.toBeNull(); + }); + + test("hasCommittedUserMessage matches by normalized messageId and text", async () => { + await store.bulkInsert("agent-1", [ + createRow(1, createTimelineItem("user_message", "1")), + createRow(2, createTimelineItem("assistant_message", "2")), + ]); + + await expect( + store.hasCommittedUserMessage("agent-1", { + messageId: " message-1 ", + text: "user-1", + }), + ).resolves.toBe(true); + await expect( + store.hasCommittedUserMessage("agent-1", { + messageId: "message-1", + text: "different", + }), + ).resolves.toBe(false); + await expect( + store.hasCommittedUserMessage("agent-1", { + messageId: " ", + text: "user-1", + }), + ).resolves.toBe(false); + }); +}); diff --git a/packages/server/src/server/db/db-agent-timeline-store.ts b/packages/server/src/server/db/db-agent-timeline-store.ts new file mode 100644 index 000000000..1cda2de34 --- /dev/null +++ b/packages/server/src/server/db/db-agent-timeline-store.ts @@ -0,0 +1,303 @@ +import { and, asc, desc, eq, gt, lt, sql } from "drizzle-orm"; + +import type { + AgentTimelineFetchOptions, + AgentTimelineFetchResult, + AgentTimelineRow, + AgentTimelineStore, + AgentTimelineWindow, +} from "../agent/agent-timeline-store-types.js"; +import type { AgentTimelineItem } from "../agent/agent-sdk-types.js"; +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; +import { agentTimelineRows } from "./schema.js"; + +type AgentTimelineRowRecord = typeof agentTimelineRows.$inferSelect; +type AgentTimelineRowInsert = typeof agentTimelineRows.$inferInsert; + +const DEFAULT_TIMELINE_FETCH_LIMIT = 200; + +function normalizeTimelineMessageId(messageId: string | undefined): string | undefined { + if (typeof messageId !== "string") { + return undefined; + } + const normalized = messageId.trim(); + return normalized.length > 0 ? 
normalized : undefined; +} + +function toTimelineRow(row: AgentTimelineRowRecord): AgentTimelineRow { + return { + seq: row.seq, + timestamp: row.committedAt, + item: row.item, + }; +} + +function toInsertValues(agentId: string, row: AgentTimelineRow): AgentTimelineRowInsert { + return { + agentId, + seq: row.seq, + committedAt: row.timestamp, + item: row.item, + itemKind: row.item.type, + }; +} + +function normalizeFetchLimit(limit: number | undefined): number { + if (limit === undefined) { + return DEFAULT_TIMELINE_FETCH_LIMIT; + } + return Math.max(0, Math.floor(limit)); +} + +export class DbAgentTimelineStore implements AgentTimelineStore { + private readonly db: PaseoDatabaseHandle["db"]; + + constructor(db: PaseoDatabaseHandle["db"]) { + this.db = db; + } + + async appendCommitted( + agentId: string, + item: AgentTimelineItem, + options?: { timestamp?: string }, + ): Promise<AgentTimelineRow> { + const nextSeq = (await this.getMaxSeq(agentId)) + 1; + const row: AgentTimelineRow = { + seq: nextSeq, + timestamp: options?.timestamp ?? new Date().toISOString(), + item, + }; + + await this.db.insert(agentTimelineRows).values(toInsertValues(agentId, row)); + return row; + } + + async fetchCommitted( + agentId: string, + options?: AgentTimelineFetchOptions, + ): Promise<AgentTimelineFetchResult> { + const direction = options?.direction ?? "tail"; + const limit = normalizeFetchLimit(options?.limit); + const selectAll = limit === 0; + const window = await this.getWindow(agentId); + + if (window.maxSeq === 0) { + return { + direction, + window, + hasOlder: false, + hasNewer: false, + rows: [], + }; + } + + if (direction === "tail") { + const rows = selectAll + ? 
await this.db + .select() + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)) + .orderBy(asc(agentTimelineRows.seq)) + : ( + await this.db + .select() + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)) + .orderBy(desc(agentTimelineRows.seq)) + .limit(limit) + ).reverse(); + const selected = rows.map(toTimelineRow); + return { + direction, + window, + hasOlder: selected.length > 0 && selected[0]!.seq > window.minSeq, + hasNewer: false, + rows: selected, + }; + } + + if (direction === "after") { + const baseSeq = options?.cursor?.seq ?? 0; + const rows = ( + selectAll + ? await this.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, agentId), gt(agentTimelineRows.seq, baseSeq))) + .orderBy(asc(agentTimelineRows.seq)) + : await this.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, agentId), gt(agentTimelineRows.seq, baseSeq))) + .orderBy(asc(agentTimelineRows.seq)) + .limit(limit) + ).map(toTimelineRow); + + if (rows.length === 0) { + return { + direction, + window, + hasOlder: baseSeq >= window.minSeq, + hasNewer: false, + rows, + }; + } + + const lastSelected = rows[rows.length - 1]!; + return { + direction, + window, + hasOlder: rows[0]!.seq > window.minSeq, + hasNewer: lastSelected.seq < window.maxSeq, + rows, + }; + } + + const beforeSeq = options?.cursor?.seq ?? window.nextSeq; + const rows = ( + selectAll + ? 
await this.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, agentId), lt(agentTimelineRows.seq, beforeSeq))) + .orderBy(asc(agentTimelineRows.seq)) + : ( + await this.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, agentId), lt(agentTimelineRows.seq, beforeSeq))) + .orderBy(desc(agentTimelineRows.seq)) + .limit(limit) + ).reverse() + ).map(toTimelineRow); + + return { + direction, + window, + hasOlder: rows.length > 0 && rows[0]!.seq > window.minSeq, + hasNewer: beforeSeq <= window.maxSeq, + rows, + }; + } + + async getLatestCommittedSeq(agentId: string): Promise<number> { + return this.getMaxSeq(agentId); + } + + async getCommittedRows(agentId: string): Promise<AgentTimelineRow[]> { + const rows = await this.db + .select() + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)) + .orderBy(asc(agentTimelineRows.seq)); + return rows.map(toTimelineRow); + } + + async getLastItem(agentId: string): Promise<AgentTimelineItem | null> { + const [row] = await this.db + .select({ item: agentTimelineRows.item }) + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)) + .orderBy(desc(agentTimelineRows.seq)) + .limit(1); + return row?.item ?? null; + } + + async getLastAssistantMessage(agentId: string): Promise<string | null> { + const rows = await this.db + .select({ + seq: agentTimelineRows.seq, + item: agentTimelineRows.item, + }) + .from(agentTimelineRows) + .where( + and( + eq(agentTimelineRows.agentId, agentId), + eq(agentTimelineRows.itemKind, "assistant_message"), + ), + ) + .orderBy(desc(agentTimelineRows.seq)); + + if (rows.length === 0) { + return null; + } + + const chunks: string[] = []; + let previousSeq: number | null = null; + for (const row of rows) { + if (previousSeq !== null && row.seq !== previousSeq - 1) { + break; + } + if (row.item.type !== "assistant_message") { + break; + } + chunks.push(row.item.text); + previousSeq = row.seq; + } + + return chunks.length > 0 ? 
chunks.reverse().join("") : null; + } + + async hasCommittedUserMessage( + agentId: string, + options: { messageId: string; text: string }, + ): Promise<boolean> { + const messageId = normalizeTimelineMessageId(options.messageId); + if (!messageId) { + return false; + } + + const [row] = await this.db + .select({ seq: agentTimelineRows.seq }) + .from(agentTimelineRows) + .where( + and( + eq(agentTimelineRows.agentId, agentId), + eq(agentTimelineRows.itemKind, "user_message"), + sql`json_extract(${agentTimelineRows.item}, '$.messageId') = ${messageId}`, + sql`json_extract(${agentTimelineRows.item}, '$.text') = ${options.text}`, + ), + ) + .limit(1); + + return row !== undefined; + } + + async deleteAgent(agentId: string): Promise<void> { + await this.db.delete(agentTimelineRows).where(eq(agentTimelineRows.agentId, agentId)); + } + + async bulkInsert(agentId: string, rows: readonly AgentTimelineRow[]): Promise<void> { + if (rows.length === 0) { + return; + } + await this.db.insert(agentTimelineRows).values(rows.map((row) => toInsertValues(agentId, row))); + } + + private async getMaxSeq(agentId: string): Promise<number> { + const [row] = await this.db + .select({ + maxSeq: sql<number>`coalesce(max(${agentTimelineRows.seq}), 0)`, + }) + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)); + return Number(row?.maxSeq ?? 0); + } + + private async getWindow(agentId: string): Promise<AgentTimelineWindow> { + const [row] = await this.db + .select({ + minSeq: sql<number>`coalesce(min(${agentTimelineRows.seq}), 0)`, + maxSeq: sql<number>`coalesce(max(${agentTimelineRows.seq}), 0)`, + }) + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, agentId)); + const minSeq = Number(row?.minSeq ?? 0); + const maxSeq = Number(row?.maxSeq ?? 
0); + return { + minSeq, + maxSeq, + nextSeq: maxSeq + 1, + }; + } +} diff --git a/packages/server/src/server/db/db-project-registry.ts b/packages/server/src/server/db/db-project-registry.ts new file mode 100644 index 000000000..c7fad1b5d --- /dev/null +++ b/packages/server/src/server/db/db-project-registry.ts @@ -0,0 +1,89 @@ +import { eq } from "drizzle-orm"; + +import type { ProjectRegistry, PersistedProjectRecord } from "../workspace-registry.js"; +import { createPersistedProjectRecord } from "../workspace-registry.js"; +import { projects } from "./schema.js"; +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; + +function toPersistedProjectRecord(row: typeof projects.$inferSelect): PersistedProjectRecord { + return createPersistedProjectRecord({ + ...row, + kind: row.kind as PersistedProjectRecord["kind"], + }); +} + +export class DbProjectRegistry implements ProjectRegistry { + private readonly db: PaseoDatabaseHandle["db"]; + + constructor(db: PaseoDatabaseHandle["db"]) { + this.db = db; + } + + async initialize(): Promise<void> { + return Promise.resolve(); + } + + async existsOnDisk(): Promise<boolean> { + return true; + } + + async list(): Promise<PersistedProjectRecord[]> { + const rows = await this.db.select().from(projects); + return rows.map(toPersistedProjectRecord); + } + + async get(id: number): Promise<PersistedProjectRecord | null> { + const rows = await this.db.select().from(projects).where(eq(projects.id, id)).limit(1); + const row = rows[0]; + return row ? 
toPersistedProjectRecord(row) : null; + } + + async insert(record: Omit<PersistedProjectRecord, "id">): Promise<number> { + const [row] = await this.db + .insert(projects) + .values(record) + .onConflictDoUpdate({ + target: projects.directory, + set: { + kind: record.kind, + displayName: record.displayName, + gitRemote: record.gitRemote, + updatedAt: record.updatedAt, + archivedAt: record.archivedAt, + }, + }) + .returning({ id: projects.id }); + return row!.id; + } + + async upsert(record: PersistedProjectRecord): Promise<void> { + const nextRecord = createPersistedProjectRecord(record); + await this.db + .insert(projects) + .values(nextRecord) + .onConflictDoUpdate({ + target: projects.directory, + set: { + kind: nextRecord.kind, + displayName: nextRecord.displayName, + gitRemote: nextRecord.gitRemote, + updatedAt: nextRecord.updatedAt, + archivedAt: nextRecord.archivedAt, + }, + }); + } + + async archive(id: number, archivedAt: string): Promise<void> { + await this.db + .update(projects) + .set({ + updatedAt: archivedAt, + archivedAt, + }) + .where(eq(projects.id, id)); + } + + async remove(id: number): Promise<void> { + await this.db.delete(projects).where(eq(projects.id, id)); + } +} diff --git a/packages/server/src/server/db/db-workspace-registry.test.ts b/packages/server/src/server/db/db-workspace-registry.test.ts new file mode 100644 index 000000000..c90c9d71d --- /dev/null +++ b/packages/server/src/server/db/db-workspace-registry.test.ts @@ -0,0 +1,199 @@ +import os from "node:os"; +import path from "node:path"; +import { mkdtempSync, rmSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import type { PersistedProjectRecord, PersistedWorkspaceRecord } from "../workspace-registry.js"; +import { createPersistedProjectRecord, createPersistedWorkspaceRecord } from "../workspace-registry.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { DbProjectRegistry } from "./db-project-registry.js"; +import { DbWorkspaceRegistry 
} from "./db-workspace-registry.js"; + +function createProjectRecord(input: Partial<PersistedProjectRecord> = {}): PersistedProjectRecord { + return createPersistedProjectRecord({ + id: 1, + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + ...input, + }); +} + +function createWorkspaceRecord(input: Partial<PersistedWorkspaceRecord> = {}): PersistedWorkspaceRecord { + return createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: "/tmp/repo", + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + ...input, + }); +} + +describe("DB-backed workspace registries", () => { + let tmpDir: string; + let dataDir: string; + let database: PaseoDatabaseHandle; + let projectRegistry: DbProjectRegistry; + let workspaceRegistry: DbWorkspaceRegistry; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "db-workspace-registry-")); + dataDir = path.join(tmpDir, "db"); + database = await openPaseoDatabase(dataDir); + projectRegistry = new DbProjectRegistry(database.db); + workspaceRegistry = new DbWorkspaceRegistry(database.db); + }); + + afterEach(async () => { + await database.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("project registry matches the file-backed behavioral contract", async () => { + await projectRegistry.initialize(); + expect(await projectRegistry.existsOnDisk()).toBe(true); + expect(await projectRegistry.get(999)).toBeNull(); + expect(await projectRegistry.list()).toEqual([]); + + const projectId = await projectRegistry.insert({ + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + await projectRegistry.upsert( + 
createProjectRecord({ + id: projectId, + updatedAt: "2026-03-02T00:00:00.000Z", + }), + ); + await projectRegistry.archive(projectId, "2026-03-03T00:00:00.000Z"); + await projectRegistry.archive(999, "2026-03-04T00:00:00.000Z"); + + expect(await projectRegistry.get(projectId)).toEqual( + createProjectRecord({ + id: projectId, + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: "2026-03-03T00:00:00.000Z", + }), + ); + expect(await projectRegistry.list()).toEqual([ + createProjectRecord({ + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: "2026-03-03T00:00:00.000Z", + }), + ]); + + await projectRegistry.remove(999); + await projectRegistry.remove(projectId); + + expect(await projectRegistry.get(projectId)).toBeNull(); + expect(await projectRegistry.list()).toEqual([]); + }); + + test("workspace registry matches the file-backed behavioral contract", async () => { + await workspaceRegistry.initialize(); + expect(await workspaceRegistry.existsOnDisk()).toBe(true); + expect(await workspaceRegistry.get(999)).toBeNull(); + expect(await workspaceRegistry.list()).toEqual([]); + + const projectId = await projectRegistry.insert({ + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + const workspaceId = await workspaceRegistry.insert({ + projectId, + directory: "/tmp/repo", + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + await workspaceRegistry.upsert( + createWorkspaceRecord({ + id: workspaceId, + projectId, + displayName: "feature/workspace", + updatedAt: "2026-03-02T00:00:00.000Z", + }), + ); + await workspaceRegistry.archive(workspaceId, "2026-03-03T00:00:00.000Z"); + await workspaceRegistry.archive(999, "2026-03-04T00:00:00.000Z"); + + expect(await workspaceRegistry.get(workspaceId)).toEqual( + 
createWorkspaceRecord({ + id: workspaceId, + projectId, + displayName: "feature/workspace", + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: "2026-03-03T00:00:00.000Z", + }), + ); + expect(await workspaceRegistry.list()).toEqual([ + createWorkspaceRecord({ + displayName: "feature/workspace", + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: "2026-03-03T00:00:00.000Z", + }), + ]); + + await workspaceRegistry.remove(999); + await workspaceRegistry.remove(workspaceId); + + expect(await workspaceRegistry.get(workspaceId)).toBeNull(); + expect(await workspaceRegistry.list()).toEqual([]); + }); + + test("rejects workspace upserts for non-existent projects", async () => { + await expect( + workspaceRegistry.upsert( + createWorkspaceRecord({ + projectId: 999, + }), + ), + ).rejects.toThrow(); + }); + + test("cascades workspace removal when removing a linked project", async () => { + const projectId = await projectRegistry.insert({ + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + const workspaceId = await workspaceRegistry.insert({ + projectId, + directory: "/tmp/repo", + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + + await projectRegistry.remove(projectId); + expect(await projectRegistry.get(projectId)).toBeNull(); + expect(await workspaceRegistry.get(workspaceId)).toBeNull(); + }); +}); diff --git a/packages/server/src/server/db/db-workspace-registry.ts b/packages/server/src/server/db/db-workspace-registry.ts new file mode 100644 index 000000000..f5e2ea7a9 --- /dev/null +++ b/packages/server/src/server/db/db-workspace-registry.ts @@ -0,0 +1,93 @@ +import { eq } from "drizzle-orm"; + +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; +import { workspaces } from "./schema.js"; +import type { PersistedWorkspaceRecord, WorkspaceRegistry } from "../workspace-registry.js"; +import { createPersistedWorkspaceRecord } from "../workspace-registry.js"; + +function toPersistedWorkspaceRecord(row: typeof workspaces.$inferSelect): PersistedWorkspaceRecord { + return createPersistedWorkspaceRecord({ + ...row, + kind: row.kind as PersistedWorkspaceRecord["kind"], + }); +} + +export class DbWorkspaceRegistry implements WorkspaceRegistry { + private readonly db: PaseoDatabaseHandle["db"]; + + constructor(db: PaseoDatabaseHandle["db"]) { + this.db = db; + } + + async initialize(): Promise<void> { + return Promise.resolve(); + } + + async existsOnDisk(): Promise<boolean> { + return true; + } + + async list(): Promise<PersistedWorkspaceRecord[]> { + const rows = await this.db.select().from(workspaces); + return rows.map(toPersistedWorkspaceRecord); + } + + async get(id: number): Promise<PersistedWorkspaceRecord | null> { + const rows = await this.db + .select() + .from(workspaces) + .where(eq(workspaces.id, id)) + .limit(1); + const row = rows[0]; + return row ? 
toPersistedWorkspaceRecord(row) : null; + } + + async insert(record: Omit<PersistedWorkspaceRecord, "id">): Promise<number> { + const [row] = await this.db + .insert(workspaces) + .values(record) + .onConflictDoUpdate({ + target: workspaces.directory, + set: { + projectId: record.projectId, + kind: record.kind, + displayName: record.displayName, + updatedAt: record.updatedAt, + archivedAt: record.archivedAt, + }, + }) + .returning({ id: workspaces.id }); + return row!.id; + } + + async upsert(record: PersistedWorkspaceRecord): Promise<void> { + const nextRecord = createPersistedWorkspaceRecord(record); + await this.db + .insert(workspaces) + .values(nextRecord) + .onConflictDoUpdate({ + target: workspaces.directory, + set: { + projectId: nextRecord.projectId, + kind: nextRecord.kind, + displayName: nextRecord.displayName, + updatedAt: nextRecord.updatedAt, + archivedAt: nextRecord.archivedAt, + }, + }); + } + + async archive(workspaceId: number, archivedAt: string): Promise<void> { + await this.db + .update(workspaces) + .set({ + updatedAt: archivedAt, + archivedAt, + }) + .where(eq(workspaces.id, workspaceId)); + } + + async remove(workspaceId: number): Promise<void> { + await this.db.delete(workspaces).where(eq(workspaces.id, workspaceId)); + } +} diff --git a/packages/server/src/server/db/legacy-agent-snapshot-import.test.ts b/packages/server/src/server/db/legacy-agent-snapshot-import.test.ts new file mode 100644 index 000000000..48590a573 --- /dev/null +++ b/packages/server/src/server/db/legacy-agent-snapshot-import.test.ts @@ -0,0 +1,311 @@ +import os from "node:os"; +import path from "node:path"; +import { existsSync, mkdirSync, mkdtempSync, readFileSync, rmSync, writeFileSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test, vi } from "vitest"; + +import { createTestLogger } from "../../test-utils/test-logger.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { importLegacyAgentSnapshots } from "./legacy-agent-snapshot-import.js"; +import { 
agentSnapshots, projects, workspaces } from "./schema.js"; + +describe("importLegacyAgentSnapshots", () => { + let tmpDir: string; + let paseoHome: string; + let dbDir: string; + let database: PaseoDatabaseHandle; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "paseo-legacy-agent-import-")); + paseoHome = path.join(tmpDir, ".paseo"); + dbDir = path.join(paseoHome, "db"); + mkdirSync(paseoHome, { recursive: true }); + database = await openPaseoDatabase(dbDir); + }); + + afterEach(async () => { + await database.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + async function seedWorkspace(directory: string): Promise<number> { + const [project] = await database.db + .insert(projects) + .values({ + directory, + displayName: path.basename(directory), + kind: "directory", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }) + .returning({ id: projects.id }); + const [workspace] = await database.db + .insert(workspaces) + .values({ + projectId: project!.id, + directory, + displayName: path.basename(directory), + kind: "checkout", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }) + .returning({ id: workspaces.id }); + return workspace!.id; + } + + test("imports agent JSON files when the DB is empty", async () => { + await seedWorkspace("/tmp/project"); + writeLegacyAgentJson({ + paseoHome, + relativePath: "agents/agent-1.json", + payload: createLegacyAgentJson({ + requiresAttention: undefined, + internal: undefined, + }), + }); + + const result = await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "imported", + importedAgents: 1, + }); + expect(await database.db.select().from(agentSnapshots)).toEqual([ + expect.objectContaining({ + agentId: "agent-1", + cwd: "/tmp/project", + requiresAttention: false, + internal: false, + }), + ]); + 
}); + + test("skips import when the DB already has agent data", async () => { + const workspaceId = await seedWorkspace("/tmp/existing-project"); + await database.db.insert(agentSnapshots).values({ + agentId: "existing-agent", + provider: "codex", + workspaceId, + cwd: "/tmp/existing-project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + lastActivityAt: "2026-03-01T00:00:00.000Z", + lastUserMessageAt: null, + title: null, + labels: {}, + lastStatus: "idle", + lastModeId: "plan", + config: null, + runtimeInfo: { provider: "codex", sessionId: "session-existing" }, + persistence: null, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, + internal: false, + archivedAt: null, + }); + writeLegacyAgentJson({ + paseoHome, + relativePath: "agents/legacy-agent.json", + payload: createLegacyAgentJson({ id: "legacy-agent" }), + }); + + const result = await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "skipped", + reason: "database-not-empty", + }); + expect(await database.db.select().from(agentSnapshots)).toHaveLength(1); + }); + + test("imports agent JSON files from nested project directories", async () => { + await seedWorkspace("/tmp/root-project"); + await seedWorkspace("/tmp/nested-project"); + writeLegacyAgentJson({ + paseoHome, + relativePath: "agents/agent-root.json", + payload: createLegacyAgentJson({ id: "agent-root", cwd: "/tmp/root-project" }), + }); + writeLegacyAgentJson({ + paseoHome, + relativePath: "agents/tmp-nested-project/agent-nested.json", + payload: createLegacyAgentJson({ id: "agent-nested", cwd: "/tmp/nested-project" }), + }); + + const result = await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "imported", + importedAgents: 2, + }); + expect( + (await database.db.select().from(agentSnapshots)).map((row) 
=> row.agentId).sort(), + ).toEqual(["agent-nested", "agent-root"]); + }); + + test("batches large legacy agent imports so SQLite variable limits do not abort bootstrap", async () => { + await seedWorkspace("/tmp/large-project"); + + for (let index = 0; index < 150; index += 1) { + writeLegacyAgentJson({ + paseoHome, + relativePath: `agents/large-project/agent-${index}.json`, + payload: createLegacyAgentJson({ + id: `agent-${index}`, + cwd: "/tmp/large-project", + runtimeInfo: { + provider: "codex", + sessionId: `session-${index}`, + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + }), + }); + } + + const result = await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "imported", + importedAgents: 150, + }); + const rows = await database.db.select().from(agentSnapshots); + expect(rows).toHaveLength(150); + expect(rows.map((row) => row.agentId)).toContain("agent-149"); + }); + + test("creates backup of agent directory before import", async () => { + await seedWorkspace("/tmp/project"); + writeLegacyAgentJson({ + paseoHome, + relativePath: "agents/project-a/agent-1.json", + payload: createLegacyAgentJson(), + }); + + await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + const backupPath = path.join( + paseoHome, + "backup", + "pre-migration", + "agents", + "project-a", + "agent-1.json", + ); + expect(existsSync(backupPath)).toBe(true); + expect(JSON.parse(readFileSync(backupPath, "utf8"))).toMatchObject({ + id: "agent-1", + cwd: "/tmp/project", + }); + }); + + test("logs batch progress for large imports", async () => { + await seedWorkspace("/tmp/large-project"); + const logger = createTestLogger(); + const infoSpy = vi.spyOn(logger, "info"); + + for (let index = 0; index < 150; index += 1) { + writeLegacyAgentJson({ + paseoHome, + relativePath: `agents/large-project/agent-${index}.json`, + payload: createLegacyAgentJson({ + 
id: `agent-${index}`, + cwd: "/tmp/large-project", + runtimeInfo: { + provider: "codex", + sessionId: `session-${index}`, + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + }), + }); + } + + await importLegacyAgentSnapshots({ + db: database.db, + paseoHome, + logger, + }); + + const batchLogs = infoSpy.mock.calls.filter( + ([context, message]) => message === "Importing agent snapshot batch" && typeof context === "object", + ); + expect(batchLogs.length).toBeGreaterThan(1); + expect(batchLogs[0]?.[0]).toMatchObject({ + batch: 1, + totalBatches: batchLogs.length, + }); + expect(batchLogs.at(-1)?.[0]).toMatchObject({ + batch: batchLogs.length, + totalBatches: batchLogs.length, + rowsProcessed: 150, + }); + }); +}); + +function createLegacyAgentJson(overrides: Record<string, unknown> = {}): Record<string, unknown> { + return { + id: "agent-1", + provider: "codex", + cwd: "/tmp/project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + lastActivityAt: "2026-03-02T00:00:00.000Z", + lastUserMessageAt: null, + title: null, + labels: {}, + lastStatus: "idle", + lastModeId: "plan", + config: { + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + runtimeInfo: { + provider: "codex", + sessionId: "session-123", + model: "gpt-5.1-codex-mini", + modeId: "plan", + }, + persistence: null, + attentionReason: null, + attentionTimestamp: null, + archivedAt: null, + ...overrides, + }; +} + +function writeLegacyAgentJson(input: { + paseoHome: string; + relativePath: string; + payload: Record<string, unknown>; +}): void { + const absolutePath = path.join(input.paseoHome, input.relativePath); + mkdirSync(path.dirname(absolutePath), { recursive: true }); + writeFileSync(absolutePath, JSON.stringify(input.payload, null, 2), { + encoding: "utf8", + flag: "w", + }); +} diff --git a/packages/server/src/server/db/legacy-agent-snapshot-import.ts b/packages/server/src/server/db/legacy-agent-snapshot-import.ts new file mode 100644 index 000000000..5a9538128 --- /dev/null +++ 
b/packages/server/src/server/db/legacy-agent-snapshot-import.ts @@ -0,0 +1,298 @@ +import path from "node:path"; +import { execSync } from "node:child_process"; +import { promises as fs } from "node:fs"; + +import { count } from "drizzle-orm"; +import type { Logger } from "pino"; + +import { parseStoredAgentRecord, type StoredAgentRecord } from "../agent/agent-storage.js"; +import { detectWorkspaceGitMetadata } from "../workspace-git-metadata.js"; +import { READ_ONLY_GIT_ENV } from "../checkout-git-utils.js"; +import { normalizeWorkspaceId } from "../workspace-registry-model.js"; +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; +import { toAgentSnapshotRowValues } from "./db-agent-snapshot-store.js"; +import { agentSnapshots, projects, workspaces } from "./schema.js"; + +const SQLITE_MAX_VARIABLES_PER_STATEMENT = 999; +const AGENT_SNAPSHOT_INSERT_VARIABLES_PER_ROW = Object.keys(agentSnapshots).length; +const MAX_AGENT_SNAPSHOT_ROWS_PER_INSERT = Math.max( + 1, + Math.floor(SQLITE_MAX_VARIABLES_PER_STATEMENT / AGENT_SNAPSHOT_INSERT_VARIABLES_PER_ROW), +); + +export type LegacyAgentSnapshotImportResult = + | { + status: "imported"; + importedAgents: number; + } + | { + status: "skipped"; + reason: "database-not-empty" | "no-legacy-files"; + }; + +export async function importLegacyAgentSnapshots(options: { + db: PaseoDatabaseHandle["db"]; + paseoHome: string; + logger: Logger; +}): Promise<LegacyAgentSnapshotImportResult> { + if (await hasAnyAgentSnapshotRows(options.db)) { + options.logger.info("Skipping legacy agent snapshot import because the DB is not empty"); + return { + status: "skipped", + reason: "database-not-empty", + }; + } + + const agentsDir = path.join(options.paseoHome, "agents"); + if (!(await pathExists(agentsDir))) { + options.logger.info("Skipping legacy agent snapshot import because no legacy files exist"); + return { + status: "skipped", + reason: "no-legacy-files", + }; + } + + await backupLegacyAgentDirectory({ + sourceDir: agentsDir, + paseoHome: 
options.paseoHome, + logger: options.logger, + }); + + const { records, skippedCount } = await readLegacyAgentRecords(agentsDir, options.logger); + if (skippedCount > 0) { + options.logger.warn({ skippedCount }, "Skipped invalid agent JSON files during migration"); + } + if (records.length === 0) { + options.logger.info("Skipping legacy agent snapshot import because no legacy files exist"); + return { + status: "skipped", + reason: "no-legacy-files", + }; + } + + options.db.transaction((tx) => { + const workspaceRows = tx + .select({ id: workspaces.id, directory: workspaces.directory }) + .from(workspaces) + .all(); + const workspaceIdsByDirectory = new Map( + workspaceRows.map((row) => [row.directory, row.id] as const), + ); + const projectRows = tx + .select({ id: projects.id, directory: projects.directory, gitRemote: projects.gitRemote }) + .from(projects) + .all(); + const projectIdsByDirectory = new Map(projectRows.map((row) => [row.directory, row.id] as const)); + const projectIdsByRemote = new Map( + projectRows + .filter((row): row is typeof row & { gitRemote: string } => row.gitRemote !== null) + .map((row) => [row.gitRemote, row.id] as const), + ); + for (const record of records) { + const normalizedDirectory = normalizeWorkspaceId(record.cwd); + if (workspaceIdsByDirectory.has(normalizedDirectory)) { + continue; + } + + const timestamp = record.updatedAt ?? record.createdAt; + const gitInfo = detectGitInfoForCwd(record.cwd); + const resolvedDirectory = gitInfo?.toplevel + ? normalizeWorkspaceId(gitInfo.toplevel) + : normalizedDirectory; + const projectDisplayName = gitInfo?.metadata.projectDisplayName + ?? resolvedDirectory.split(/[\\/]/).filter(Boolean).at(-1) + ?? resolvedDirectory; + const projectKind = gitInfo?.metadata.projectKind ?? "directory"; + const gitRemote = gitInfo?.metadata.gitRemote ?? null; + const workspaceKind = gitInfo?.metadata.isWorktree ? 
"worktree" : "checkout"; + const workspaceDisplayName = gitInfo?.metadata.workspaceDisplayName + ?? normalizedDirectory.split(/[\\/]/).filter(Boolean).at(-1) + ?? normalizedDirectory; + + let projectId = projectIdsByDirectory.get(resolvedDirectory) + ?? (gitRemote !== null ? projectIdsByRemote.get(gitRemote) : undefined); + if (projectId === undefined) { + const projectRow = tx + .insert(projects) + .values({ + directory: resolvedDirectory, + displayName: projectDisplayName, + kind: projectKind, + gitRemote, + createdAt: record.createdAt, + updatedAt: timestamp, + archivedAt: null, + }) + .returning({ id: projects.id }) + .get(); + projectId = projectRow!.id; + projectIdsByDirectory.set(resolvedDirectory, projectId); + if (gitRemote !== null) { + projectIdsByRemote.set(gitRemote, projectId); + } + } + + const workspaceRow = tx + .insert(workspaces) + .values({ + projectId, + directory: normalizedDirectory, + displayName: workspaceDisplayName, + kind: workspaceKind, + createdAt: record.createdAt, + updatedAt: timestamp, + archivedAt: null, + }) + .returning({ id: workspaces.id }) + .get(); + workspaceIdsByDirectory.set(normalizedDirectory, workspaceRow!.id); + } + const rows = records.flatMap((record) => { + const workspaceId = workspaceIdsByDirectory.get(normalizeWorkspaceId(record.cwd)); + if (workspaceId === undefined) { + return []; + } + const clampedRecord = (record.lastStatus === "running" || record.lastStatus === "initializing") + ? 
{ ...record, lastStatus: "closed" as const } + : record; + return [toAgentSnapshotRowValues({ record: clampedRecord, workspaceId })]; + }); + const totalBatches = Math.ceil(rows.length / MAX_AGENT_SNAPSHOT_ROWS_PER_INSERT); + for (let startIndex = 0; startIndex < rows.length; startIndex += MAX_AGENT_SNAPSHOT_ROWS_PER_INSERT) { + const batch = rows.slice(startIndex, startIndex + MAX_AGENT_SNAPSHOT_ROWS_PER_INSERT); + const batchNum = Math.floor(startIndex / MAX_AGENT_SNAPSHOT_ROWS_PER_INSERT) + 1; + const rowsProcessed = startIndex + batch.length; + options.logger.info( + { batch: batchNum, totalBatches, rowsProcessed }, + "Importing agent snapshot batch", + ); + tx.insert(agentSnapshots).values(batch).run(); + } + }); + + options.logger.info( + { importedAgents: records.length }, + "Imported legacy agent snapshots into the database", + ); + + return { + status: "imported", + importedAgents: records.length, + }; +} + +async function readLegacyAgentRecords(baseDir: string, logger: Logger): Promise<{ + records: StoredAgentRecord[]; + skippedCount: number; +}> { + let entries: Array<import("node:fs").Dirent> = []; + try { + entries = await fs.readdir(baseDir, { withFileTypes: true }); + } catch (error) { + if ((error as NodeJS.ErrnoException).code === "ENOENT") { + return { records: [], skippedCount: 0 }; + } + throw error; + } + + const recordsById = new Map<string, StoredAgentRecord>(); + let skippedCount = 0; + for (const entry of entries) { + if (entry.isFile() && entry.name.endsWith(".json")) { + const record = await readRecordFile(path.join(baseDir, entry.name), logger); + if (record) { + recordsById.set(record.id, record); + } else { + skippedCount += 1; + } + continue; + } + + if (!entry.isDirectory()) { + continue; + } + + let childEntries: Array<import("node:fs").Dirent> = []; + try { + childEntries = await fs.readdir(path.join(baseDir, entry.name), { withFileTypes: true }); + } catch { + continue; + } + + for (const childEntry of childEntries) { + if (!childEntry.isFile() || !childEntry.name.endsWith(".json")) { + continue; + } + const 
record = await readRecordFile(path.join(baseDir, entry.name, childEntry.name), logger); + if (record) { + recordsById.set(record.id, record); + } else { + skippedCount += 1; + } + } + } + + return { + records: Array.from(recordsById.values()), + skippedCount, + }; +} + +async function readRecordFile(filePath: string, logger: Logger): Promise<StoredAgentRecord | null> { + try { + const raw = await fs.readFile(filePath, "utf8"); + return parseStoredAgentRecord(JSON.parse(raw)); + } catch (error) { + logger.error({ err: error, filePath }, "Skipping invalid legacy agent snapshot"); + return null; + } +} + +async function hasAnyAgentSnapshotRows(db: PaseoDatabaseHandle["db"]): Promise<boolean> { + const rows = await db.select({ count: count() }).from(agentSnapshots); + return (rows[0]?.count ?? 0) > 0; +} + +async function backupLegacyAgentDirectory(options: { + sourceDir: string; + paseoHome: string; + logger: Logger; +}): Promise<void> { + const backupPath = path.join(options.paseoHome, "backup", "pre-migration", "agents"); + await fs.mkdir(path.dirname(backupPath), { recursive: true }); + await fs.cp(options.sourceDir, backupPath, { recursive: true }); + options.logger.info({ backupPath }, "Backed up legacy agent snapshots before migration"); +} + +async function pathExists(targetPath: string): Promise<boolean> { + try { + await fs.access(targetPath); + return true; + } catch (error) { + if ((error as NodeJS.ErrnoException).code === "ENOENT") { + return false; + } + throw error; + } +} + +function detectGitInfoForCwd( + cwd: string, +): { toplevel: string; metadata: ReturnType<typeof detectWorkspaceGitMetadata> } | null { + try { + const toplevel = execSync("git rev-parse --show-toplevel", { + cwd, + env: READ_ONLY_GIT_ENV, + encoding: "utf8", + stdio: ["ignore", "pipe", "ignore"], + }).trim(); + if (!toplevel) { + return null; + } + const directoryName = toplevel.split(/[\\/]/).filter(Boolean).at(-1) ?? 
toplevel; + const metadata = detectWorkspaceGitMetadata(cwd, directoryName); + return { toplevel, metadata }; + } catch { + return null; + } +} diff --git a/packages/server/src/server/db/legacy-import-real-data.adhoc.test.ts b/packages/server/src/server/db/legacy-import-real-data.adhoc.test.ts new file mode 100644 index 000000000..8ff8a0e91 --- /dev/null +++ b/packages/server/src/server/db/legacy-import-real-data.adhoc.test.ts @@ -0,0 +1,158 @@ +/** + * Adhoc test: imports real legacy data from ~/.paseo into a fresh SQLite DB + * and asserts that projects/workspaces are properly grouped. + * + * Run with: npx vitest run packages/server/src/server/db/legacy-import-real-data.adhoc.test.ts + */ +import os from "node:os"; +import path from "node:path"; +import { mkdirSync, mkdtempSync, rmSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import { createTestLogger } from "../../test-utils/test-logger.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { importLegacyProjectWorkspaceJson } from "./legacy-project-workspace-import.js"; +import { importLegacyAgentSnapshots } from "./legacy-agent-snapshot-import.js"; +import { projects, workspaces, agentSnapshots } from "./schema.js"; +import { eq } from "drizzle-orm"; + +const REAL_PASEO_HOME = path.join(os.homedir(), ".paseo"); + +describe("legacy import from real ~/.paseo data", () => { + let tmpDir: string; + let dbDir: string; + let database: PaseoDatabaseHandle; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "paseo-real-import-")); + dbDir = path.join(tmpDir, "db"); + mkdirSync(dbDir, { recursive: true }); + database = await openPaseoDatabase(dbDir); + }); + + afterEach(async () => { + await database?.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("imports real data and groups projects correctly", async () => { + const logger = createTestLogger(); + + // Phase 1: Import 
legacy project/workspace JSON (has proper grouping) + const pwResult = await importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome: REAL_PASEO_HOME, + logger, + }); + console.log("Project/workspace import result:", pwResult); + + // Phase 2: Import legacy agent snapshots + const agentResult = await importLegacyAgentSnapshots({ + db: database.db, + paseoHome: REAL_PASEO_HOME, + logger, + }); + console.log("Agent snapshot import result:", agentResult); + + // --- Assertions --- + + const allProjects = await database.db.select().from(projects); + const allWorkspaces = await database.db.select().from(workspaces); + const allAgents = await database.db.select().from(agentSnapshots); + + console.log(`Total projects: ${allProjects.length}`); + console.log(`Total workspaces: ${allWorkspaces.length}`); + console.log(`Total agents: ${allAgents.length}`); + + // 1. There should be fewer projects than workspaces (workspaces group under projects) + expect(allProjects.length).toBeLessThan(allWorkspaces.length); + + // 2. There should be exactly ONE project per unique git remote + const projectsByRemote = new Map<string, typeof allProjects>(); + for (const project of allProjects) { + if (project.gitRemote) { + const existing = projectsByRemote.get(project.gitRemote) ?? []; + existing.push(project); + projectsByRemote.set(project.gitRemote, existing); + } + } + + const duplicateRemotes: string[] = []; + for (const [remote, projectList] of projectsByRemote) { + if (projectList.length > 1) { + duplicateRemotes.push(remote); + console.log(`DUPLICATE: ${remote} has ${projectList.length} projects:`); + for (const p of projectList) { + console.log(`  id=${p.id} directory=${p.directory}`); + } + } + } + expect(duplicateRemotes).toEqual([]); + + // 3. Specifically: getpaseo/paseo should be ONE project + const paseoProjects = allProjects.filter( + (p) => p.gitRemote === "git@github.com:getpaseo/paseo.git", + ); + expect(paseoProjects).toHaveLength(1); + const paseoProject = paseoProjects[0]!; + + // 4. 
All paseo workspaces (worktrees + main checkout + subdirs) should be under that one project + const paseoWorkspaces = allWorkspaces.filter( + (w) => w.projectId === paseoProject.id, + ); + console.log(`Paseo project id=${paseoProject.id}, directory=${paseoProject.directory}`); + console.log(`Paseo workspaces: ${paseoWorkspaces.length}`); + + // The old data had ~51 paseo workspaces + expect(paseoWorkspaces.length).toBeGreaterThanOrEqual(10); + + // 5. Subdirectory workspaces (packages/server, packages/app) should be under the same project + const subdirWorkspaces = paseoWorkspaces.filter((w) => + w.directory.includes("/packages/"), + ); + console.log(`Paseo subdirectory workspaces: ${subdirWorkspaces.length}`); + for (const w of subdirWorkspaces) { + console.log(` ${w.directory} (projectId=${w.projectId})`); + expect(w.projectId).toBe(paseoProject.id); + } + + // 6. Worktree workspaces should be under the same project + const worktreeWorkspaces = paseoWorkspaces.filter((w) => + w.directory.includes("/.paseo/worktrees/"), + ); + console.log(`Paseo worktree workspaces: ${worktreeWorkspaces.length}`); + expect(worktreeWorkspaces.length).toBeGreaterThan(0); + + // 7. 
No git project should be a subdirectory of another project with the SAME git remote + // (e.g., /dev/paseo/packages/server should not be its own project if /dev/paseo exists) + const activeGitProjects = allProjects.filter((p) => !p.archivedAt && p.gitRemote); + const subdirProjects: string[] = []; + for (const project of activeGitProjects) { + for (const other of activeGitProjects) { + if ( + project.id !== other.id && + project.gitRemote === other.gitRemote && + project.directory.startsWith(other.directory + "/") + ) { + subdirProjects.push( + `${project.directory} (id=${project.id}) is under ${other.directory} (id=${other.id}), both remote=${project.gitRemote}`, + ); + } + } + } + if (subdirProjects.length > 0) { + console.log("Subdirectory git projects with same remote (should be empty):"); + for (const s of subdirProjects) { + console.log(` ${s}`); + } + } + expect(subdirProjects).toEqual([]); + + // 8. All agent snapshots should reference valid workspaces + for (const agent of allAgents) { + const workspace = allWorkspaces.find((w) => w.id === agent.workspaceId); + expect(workspace).toBeDefined(); + } + }); +}); diff --git a/packages/server/src/server/db/legacy-project-workspace-import.test.ts b/packages/server/src/server/db/legacy-project-workspace-import.test.ts new file mode 100644 index 000000000..f5799f797 --- /dev/null +++ b/packages/server/src/server/db/legacy-project-workspace-import.test.ts @@ -0,0 +1,321 @@ +import os from "node:os"; +import path from "node:path"; +import { existsSync, mkdirSync, mkdtempSync, readFileSync, rmSync, writeFileSync } from "node:fs"; + +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import { createTestLogger } from "../../test-utils/test-logger.js"; +import { openPaseoDatabase, type PaseoDatabaseHandle } from "./sqlite-database.js"; +import { importLegacyProjectWorkspaceJson } from "./legacy-project-workspace-import.js"; +import { projects, workspaces } from "./schema.js"; + 
+describe("importLegacyProjectWorkspaceJson", () => { + let tmpDir: string; + let paseoHome: string; + let dbDir: string; + let database: PaseoDatabaseHandle; + + beforeEach(async () => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "paseo-legacy-import-")); + paseoHome = path.join(tmpDir, ".paseo"); + dbDir = path.join(paseoHome, "db"); + mkdirSync(paseoHome, { recursive: true }); + database = await openPaseoDatabase(dbDir); + }); + + afterEach(async () => { + await database?.close(); + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("imports legacy projects and workspaces once when the DB is empty", async () => { + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "project-1", + rootPath: "/tmp/project-1", + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: "/tmp/project-1", + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + + const result = await importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "imported", + importedProjects: 1, + importedWorkspaces: 1, + }); + const projectRows = await database.db.select().from(projects); + expect(projectRows).toEqual([ + expect.objectContaining({ + directory: "/tmp/project-1", + kind: "git", + displayName: "Project One", + }), + ]); + expect(typeof projectRows[0]!.id).toBe("number"); + + const workspaceRows = await database.db.select().from(workspaces); + expect(workspaceRows).toEqual([ + expect.objectContaining({ + projectId: projectRows[0]!.id, + directory: "/tmp/project-1", + kind: "checkout", + displayName: "main", + }), + ]); + }); + + test("skips import when the DB already has project or 
workspace data", async () => { + // Seed with a project in the new schema format + const [inserted] = await database.db + .insert(projects) + .values({ + directory: "/tmp/existing-project", + kind: "git", + displayName: "Existing Project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }) + .returning({ id: projects.id }); + + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "legacy-project", + rootPath: "/tmp/legacy-project", + kind: "git", + displayName: "Legacy Project", + createdAt: "2026-03-02T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [], + }); + + const result = await importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "skipped", + reason: "database-not-empty", + }); + // Only the existing project should be in DB + const allProjects = await database.db.select().from(projects); + expect(allProjects).toHaveLength(1); + expect(allProjects[0]!.id).toBe(inserted!.id); + }); + + test("rolls back the whole import when workspace insertion fails", async () => { + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "project-1", + rootPath: "/tmp/project-1", + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [ + { + workspaceId: "workspace-1", + projectId: "missing-project", + cwd: "/tmp/project-1", + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + + await expect( + importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }), + ).rejects.toThrow(); + + expect(await database.db.select().from(projects)).toEqual([]); + expect(await 
database.db.select().from(workspaces)).toEqual([]); + }); + + test("deduplicates projects with the same rootPath", async () => { + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "project-1", + rootPath: "/tmp/project-1", + kind: "git", + displayName: "First Project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }, + { + projectId: "project-2", + rootPath: "/tmp/project-1", + kind: "git", + displayName: "Replacement Project", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [ + { + workspaceId: "workspace-1", + projectId: "project-2", + cwd: "/tmp/project-1", + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-03T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + + const result = await importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + expect(result).toEqual({ + status: "imported", + importedProjects: 1, + importedWorkspaces: 1, + }); + const projectRows = await database.db.select().from(projects); + expect(projectRows).toHaveLength(1); + expect(projectRows[0]).toEqual( + expect.objectContaining({ + directory: "/tmp/project-1", + displayName: "First Project", + }), + ); + }); + + test("creates backup of JSON files before import", async () => { + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "project-1", + rootPath: "/tmp/project-1", + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [ + { + workspaceId: "workspace-1", + projectId: "project-1", + cwd: "/tmp/project-1", + kind: "local_checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + }); + + await 
importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }); + + const backupDir = path.join(paseoHome, "backup", "pre-migration"); + const projectsBackupPath = path.join(backupDir, "projects.json"); + const workspacesBackupPath = path.join(backupDir, "workspaces.json"); + expect(existsSync(projectsBackupPath)).toBe(true); + expect(existsSync(workspacesBackupPath)).toBe(true); + expect(JSON.parse(readFileSync(projectsBackupPath, "utf8"))).toHaveLength(1); + expect(JSON.parse(readFileSync(workspacesBackupPath, "utf8"))).toHaveLength(1); + }); + + test("produces clear error message for corrupt project JSON", async () => { + writeLegacyJson({ + paseoHome, + projectsJson: [ + { + projectId: "project-1", + rootPath: 123, + kind: "git", + displayName: "Project One", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-02T00:00:00.000Z", + archivedAt: null, + }, + ], + workspacesJson: [], + }); + + await expect( + importLegacyProjectWorkspaceJson({ + db: database.db, + paseoHome, + logger: createTestLogger(), + }), + ).rejects.toThrow( + `Failed to parse ${path.join(paseoHome, "projects", "projects.json")}. 
` + + "The file may be corrupted.", + ); + }); +}); + +function writeLegacyJson(input: { + paseoHome: string; + projectsJson: unknown[]; + workspacesJson: unknown[]; +}): void { + const projectsPath = path.join(input.paseoHome, "projects", "projects.json"); + const workspacesPath = path.join(input.paseoHome, "projects", "workspaces.json"); + mkdirSync(path.dirname(projectsPath), { recursive: true }); + writeFileSync(projectsPath, JSON.stringify(input.projectsJson, null, 2), { encoding: "utf8", flag: "w" }); + writeFileSync(workspacesPath, JSON.stringify(input.workspacesJson, null, 2), { + encoding: "utf8", + flag: "w", + }); +} diff --git a/packages/server/src/server/db/legacy-project-workspace-import.ts b/packages/server/src/server/db/legacy-project-workspace-import.ts new file mode 100644 index 000000000..8b9b55f86 --- /dev/null +++ b/packages/server/src/server/db/legacy-project-workspace-import.ts @@ -0,0 +1,268 @@ +import path from "node:path"; +import { promises as fs } from "node:fs"; + +import { count } from "drizzle-orm"; +import type { Logger } from "pino"; + +import { z } from "zod"; + +import type { PaseoDatabaseHandle } from "./sqlite-database.js"; +import { projects, workspaces } from "./schema.js"; + +const LEGACY_REMOTE_PREFIX = "remote:"; + +function deriveGitRemoteFromLegacyProjectId(projectId: string): string | null { + if (!projectId.startsWith(LEGACY_REMOTE_PREFIX)) { + return null; + } + const hostAndPath = projectId.slice(LEGACY_REMOTE_PREFIX.length); + return `git@${hostAndPath.replace("/", ":")}.git`; +} + +// Legacy JSON schemas — these match the old pre-migration format +const LegacyProjectSchema = z.object({ + projectId: z.string(), + rootPath: z.string(), + kind: z.string(), + displayName: z.string(), + createdAt: z.string(), + updatedAt: z.string(), + archivedAt: z.string().nullable(), +}); + +const LegacyWorkspaceSchema = z.object({ + workspaceId: z.string(), + projectId: z.string(), + cwd: z.string(), + kind: z.string(), + 
displayName: z.string(), + createdAt: z.string(), + updatedAt: z.string(), + archivedAt: z.string().nullable(), +}); + +export type LegacyProjectWorkspaceImportResult = + | { + status: "imported"; + importedProjects: number; + importedWorkspaces: number; + } + | { + status: "skipped"; + reason: "database-not-empty" | "no-legacy-files"; + }; + +export async function importLegacyProjectWorkspaceJson(options: { + db: PaseoDatabaseHandle["db"]; + paseoHome: string; + logger: Logger; +}): Promise<LegacyProjectWorkspaceImportResult> { + const projectsPath = path.join(options.paseoHome, "projects", "projects.json"); + const workspacesPath = path.join(options.paseoHome, "projects", "workspaces.json"); + const databaseHasRows = await hasAnyProjectWorkspaceRows(options.db); + + if (databaseHasRows) { + options.logger.info("Skipping legacy project/workspace JSON import because the DB is not empty"); + return { + status: "skipped", + reason: "database-not-empty", + }; + } + + const [projectsExists, workspacesExists] = await Promise.all([ + pathExists(projectsPath), + pathExists(workspacesPath), + ]); + if (!projectsExists && !workspacesExists) { + options.logger.info("Skipping legacy project/workspace JSON import because no legacy files exist"); + return { + status: "skipped", + reason: "no-legacy-files", + }; + } + + await backupLegacyProjectWorkspaceJson({ + projectsPath, + workspacesPath, + paseoHome: options.paseoHome, + logger: options.logger, + }); + + const [projectRows, workspaceRows] = await Promise.all([ + readLegacyProjects(projectsPath), + readLegacyWorkspaces(workspacesPath), + ]); + + if (projectRows.length === 0 && workspaceRows.length === 0) { + options.logger.info("Skipping legacy project/workspace JSON import because no legacy files exist"); + return { + status: "skipped", + reason: "no-legacy-files", + }; + } + + // Deduplicate legacy projects by rootPath — prefer git over non_git + const deduplicatedProjects = new Map<string, z.infer<typeof LegacyProjectSchema>>(); + for (const legacy of projectRows) { + const existing = 
deduplicatedProjects.get(legacy.rootPath); + if (!existing || (legacy.kind === "git" && existing.kind !== "git")) { + deduplicatedProjects.set(legacy.rootPath, legacy); + } + } + + options.db.transaction((tx) => { + // Insert projects, mapping old format to new schema + const projectDirectoryToId = new Map<string, number>(); + for (const legacy of deduplicatedProjects.values()) { + const row = tx + .insert(projects) + .values({ + directory: legacy.rootPath, + displayName: legacy.displayName, + kind: legacy.kind === "non_git" ? "directory" : legacy.kind, + gitRemote: deriveGitRemoteFromLegacyProjectId(legacy.projectId), + createdAt: legacy.createdAt, + updatedAt: legacy.updatedAt, + archivedAt: legacy.archivedAt, + }) + .returning({ id: projects.id }) + .get(); + projectDirectoryToId.set(legacy.rootPath, row!.id); + } + + // Build a map from legacy projectId -> new integer id + // Uses original projectRows so all duplicate projectIds resolve to the same new id + const legacyProjectIdToNewId = new Map<string, number>(); + for (const legacy of projectRows) { + const newId = projectDirectoryToId.get(legacy.rootPath); + if (newId !== undefined) { + legacyProjectIdToNewId.set(legacy.projectId, newId); + } + } + + // Insert workspaces, resolving project FK + for (const legacy of workspaceRows) { + const projectId = legacyProjectIdToNewId.get(legacy.projectId); + if (projectId === undefined) { + throw new Error(`Legacy workspace ${legacy.workspaceId} references unknown project ${legacy.projectId}`); + } + tx + .insert(workspaces) + .values({ + projectId, + directory: legacy.cwd, + displayName: legacy.displayName, + kind: + legacy.kind === "local_checkout" || legacy.kind === "directory" + ? 
"checkout" + : legacy.kind, + createdAt: legacy.createdAt, + updatedAt: legacy.updatedAt, + archivedAt: legacy.archivedAt, + }) + .run(); + } + }); + + const importedProjects = deduplicatedProjects.size; + + options.logger.info( + { + importedProjects, + importedWorkspaces: workspaceRows.length, + }, + "Imported legacy project/workspace JSON into the database", + ); + + return { + status: "imported", + importedProjects, + importedWorkspaces: workspaceRows.length, + }; +} + +async function readLegacyProjects(filePath: string) { + const raw = await readOptionalJsonFile(filePath); + if (!raw) { + return []; + } + try { + return z.array(LegacyProjectSchema).parse(raw); + } catch (error) { + throw new Error( + `Failed to parse ${filePath}. The file may be corrupted. ` + + `Check the file and fix or remove invalid entries. ` + + `Original error: ${error instanceof Error ? error.message : String(error)}`, + ); + } +} + +async function readLegacyWorkspaces(filePath: string) { + const raw = await readOptionalJsonFile(filePath); + if (!raw) { + return []; + } + try { + return z.array(LegacyWorkspaceSchema).parse(raw); + } catch (error) { + throw new Error( + `Failed to parse ${filePath}. The file may be corrupted. ` + + `Check the file and fix or remove invalid entries. ` + + `Original error: ${error instanceof Error ? error.message : String(error)}`, + ); + } +} + +async function readOptionalJsonFile(filePath: string): Promise { + try { + const raw = await fs.readFile(filePath, "utf8"); + return JSON.parse(raw); + } catch (error) { + const code = (error as NodeJS.ErrnoException).code; + if (code === "ENOENT") { + return null; + } + throw error; + } +} + +async function hasAnyProjectWorkspaceRows(db: PaseoDatabaseHandle["db"]): Promise { + const [projectCountRows, workspaceCountRows] = await Promise.all([ + db.select({ count: count() }).from(projects), + db.select({ count: count() }).from(workspaces), + ]); + const projectCount = projectCountRows[0]?.count ?? 
0; + const workspaceCount = workspaceCountRows[0]?.count ?? 0; + return projectCount > 0 || workspaceCount > 0; +} + +async function backupLegacyProjectWorkspaceJson(options: { + projectsPath: string; + workspacesPath: string; + paseoHome: string; + logger: Logger; +}): Promise<void> { + const backupDir = path.join(options.paseoHome, "backup", "pre-migration"); + await fs.mkdir(backupDir, { recursive: true }); + + if (await pathExists(options.projectsPath)) { + await fs.copyFile(options.projectsPath, path.join(backupDir, "projects.json")); + } + if (await pathExists(options.workspacesPath)) { + await fs.copyFile(options.workspacesPath, path.join(backupDir, "workspaces.json")); + } + + options.logger.info({ backupPath: backupDir }, "Backed up legacy project/workspace JSON before migration"); +} + +async function pathExists(targetPath: string): Promise<boolean> { + try { + await fs.access(targetPath); + return true; + } catch (error) { + if ((error as NodeJS.ErrnoException).code === "ENOENT") { + return false; + } + throw error; + } +} diff --git a/packages/server/src/server/db/migrations.ts b/packages/server/src/server/db/migrations.ts new file mode 100644 index 000000000..0b42d575b --- /dev/null +++ b/packages/server/src/server/db/migrations.ts @@ -0,0 +1,12 @@ +import { fileURLToPath } from "node:url"; + +import type { BetterSQLite3Database } from "drizzle-orm/better-sqlite3"; +import { migrate } from "drizzle-orm/better-sqlite3/migrator"; + +const migrationsFolder = fileURLToPath(new URL("./migrations", import.meta.url)); + +export async function runPaseoDbMigrations( + db: BetterSQLite3Database, +): Promise<void> { + await migrate(db, { migrationsFolder }); +} diff --git a/packages/server/src/server/db/migrations/0000_sqlite_initial.sql b/packages/server/src/server/db/migrations/0000_sqlite_initial.sql new file mode 100644 index 000000000..74cdb0a17 --- /dev/null +++ b/packages/server/src/server/db/migrations/0000_sqlite_initial.sql @@ -0,0 +1,61 @@ +CREATE TABLE `projects` ( + `id` 
integer PRIMARY KEY AUTOINCREMENT NOT NULL, + `directory` text NOT NULL, + `display_name` text NOT NULL, + `kind` text NOT NULL, + `git_remote` text, + `created_at` text NOT NULL, + `updated_at` text NOT NULL, + `archived_at` text +); +--> statement-breakpoint +CREATE UNIQUE INDEX `projects_directory_unique` ON `projects` (`directory`); +--> statement-breakpoint +CREATE TABLE `workspaces` ( + `id` integer PRIMARY KEY AUTOINCREMENT NOT NULL, + `project_id` integer NOT NULL, + `directory` text NOT NULL, + `display_name` text NOT NULL, + `kind` text NOT NULL, + `created_at` text NOT NULL, + `updated_at` text NOT NULL, + `archived_at` text, + FOREIGN KEY (`project_id`) REFERENCES `projects`(`id`) ON UPDATE no action ON DELETE cascade +); +--> statement-breakpoint +CREATE UNIQUE INDEX `workspaces_directory_unique` ON `workspaces` (`directory`); +--> statement-breakpoint +CREATE INDEX `workspaces_project_id_idx` ON `workspaces` (`project_id`); +--> statement-breakpoint +CREATE TABLE `agent_snapshots` ( + `agent_id` text PRIMARY KEY NOT NULL, + `provider` text NOT NULL, + `workspace_id` integer NOT NULL, + `cwd` text NOT NULL, + `created_at` text NOT NULL, + `updated_at` text NOT NULL, + `last_activity_at` text, + `last_user_message_at` text, + `title` text, + `labels` text NOT NULL, + `last_status` text NOT NULL, + `last_mode_id` text, + `config` text, + `runtime_info` text, + `persistence` text, + `requires_attention` integer NOT NULL, + `attention_reason` text, + `attention_timestamp` text, + `internal` integer NOT NULL, + `archived_at` text, + FOREIGN KEY (`workspace_id`) REFERENCES `workspaces`(`id`) ON UPDATE no action ON DELETE cascade +); +--> statement-breakpoint +CREATE TABLE `agent_timeline_rows` ( + `agent_id` text NOT NULL, + `seq` integer NOT NULL, + `committed_at` text NOT NULL, + `item` text NOT NULL, + `item_kind` text, + PRIMARY KEY(`agent_id`, `seq`) +); diff --git a/packages/server/src/server/db/migrations/meta/0000_snapshot.json 
b/packages/server/src/server/db/migrations/meta/0000_snapshot.json new file mode 100644 index 000000000..91bcfd354 --- /dev/null +++ b/packages/server/src/server/db/migrations/meta/0000_snapshot.json @@ -0,0 +1,14 @@ +{ + "version": "7", + "dialect": "sqlite", + "id": "7ccf4685-bd8f-41a6-bafb-44712c1fe0d7", + "prevId": "00000000-0000-0000-0000-000000000000", + "tables": {}, + "views": {}, + "enums": {}, + "schemas": {}, + "sequences": {}, + "roles": {}, + "policies": {}, + "checkConstraints": {} +} diff --git a/packages/server/src/server/db/migrations/meta/_journal.json b/packages/server/src/server/db/migrations/meta/_journal.json new file mode 100644 index 000000000..d00e466e7 --- /dev/null +++ b/packages/server/src/server/db/migrations/meta/_journal.json @@ -0,0 +1,13 @@ +{ + "version": "7", + "dialect": "sqlite", + "entries": [ + { + "idx": 0, + "version": "7", + "when": 1774405361702, + "tag": "0000_sqlite_initial", + "breakpoints": true + } + ] +} diff --git a/packages/server/src/server/db/schema.ts b/packages/server/src/server/db/schema.ts new file mode 100644 index 000000000..31c505522 --- /dev/null +++ b/packages/server/src/server/db/schema.ts @@ -0,0 +1,78 @@ +import { index, integer, primaryKey, sqliteTable, text } from "drizzle-orm/sqlite-core"; + +import type { AgentPersistenceHandle, AgentRuntimeInfo, AgentTimelineItem } from "../agent/agent-sdk-types.js"; +import type { StoredAgentRecord } from "../agent/agent-storage.js"; + +export const projects = sqliteTable("projects", { + id: integer("id").primaryKey({ autoIncrement: true }), + directory: text("directory").notNull().unique(), + displayName: text("display_name").notNull(), + kind: text("kind").notNull(), + gitRemote: text("git_remote"), + createdAt: text("created_at").notNull(), + updatedAt: text("updated_at").notNull(), + archivedAt: text("archived_at"), +}); + +export const workspaces = sqliteTable( + "workspaces", + { + id: integer("id").primaryKey({ autoIncrement: true }), + projectId: 
integer("project_id") + .notNull() + .references(() => projects.id, { onDelete: "cascade" }), + directory: text("directory").notNull().unique(), + displayName: text("display_name").notNull(), + kind: text("kind").notNull(), + createdAt: text("created_at").notNull(), + updatedAt: text("updated_at").notNull(), + archivedAt: text("archived_at"), + }, + (table) => [index("workspaces_project_id_idx").on(table.projectId)], +); + +export const agentSnapshots = sqliteTable("agent_snapshots", { + agentId: text("agent_id").primaryKey(), + provider: text("provider").notNull(), + workspaceId: integer("workspace_id") + .notNull() + .references(() => workspaces.id, { onDelete: "cascade" }), + cwd: text("cwd").notNull(), + createdAt: text("created_at").notNull(), + updatedAt: text("updated_at").notNull(), + lastActivityAt: text("last_activity_at"), + lastUserMessageAt: text("last_user_message_at"), + title: text("title"), + labels: text("labels", { mode: "json" }).$type<StoredAgentRecord["labels"]>().notNull(), + lastStatus: text("last_status").notNull(), + lastModeId: text("last_mode_id"), + config: text("config", { mode: "json" }).$type<StoredAgentRecord["config"]>(), + runtimeInfo: text("runtime_info", { mode: "json" }).$type<AgentRuntimeInfo>(), + persistence: text("persistence", { mode: "json" }).$type<AgentPersistenceHandle>(), + requiresAttention: integer("requires_attention", { mode: "boolean" }).notNull(), + attentionReason: text("attention_reason"), + attentionTimestamp: text("attention_timestamp"), + internal: integer("internal", { mode: "boolean" }).notNull(), + archivedAt: text("archived_at"), +}); + +export const agentTimelineRows = sqliteTable( + "agent_timeline_rows", + { + agentId: text("agent_id").notNull(), + seq: integer("seq").notNull(), + committedAt: text("committed_at").notNull(), + item: text("item", { mode: "json" }).$type<AgentTimelineItem>().notNull(), + itemKind: text("item_kind"), + }, + (table) => [ + primaryKey({ columns: [table.agentId, table.seq], name: "agent_timeline_rows_pk" }), + ], +); + +export const paseoDbSchema = { + projects, + workspaces, + 
agentSnapshots, + agentTimelineRows, +}; diff --git a/packages/server/src/server/db/sqlite-contract.test.ts b/packages/server/src/server/db/sqlite-contract.test.ts new file mode 100644 index 000000000..040748fc4 --- /dev/null +++ b/packages/server/src/server/db/sqlite-contract.test.ts @@ -0,0 +1,316 @@ +import os from "node:os"; +import path from "node:path"; +import { mkdtempSync, rmSync } from "node:fs"; + +import { and, asc, desc, eq, gt, lt, sql } from "drizzle-orm"; +import { afterEach, beforeEach, describe, expect, test } from "vitest"; + +import type { AgentTimelineItem } from "../agent/agent-sdk-types.js"; +import { openPaseoDatabase } from "./sqlite-database.js"; +import { runPaseoDbMigrations } from "./migrations.js"; +import { + agentSnapshots, + agentTimelineRows, + projects, + workspaces, +} from "./schema.js"; + +function createTimestamp(day: number): string { + return `2026-03-${String(day).padStart(2, "0")}T00:00:00.000Z`; +} + +function createTimelineItem(type: AgentTimelineItem["type"], suffix: string): AgentTimelineItem { + if (type === "user_message") { + return { type, text: `user-${suffix}`, messageId: `msg-${suffix}` }; + } + if (type === "assistant_message" || type === "reasoning") { + return { type, text: `${type}-${suffix}` }; + } + return { type: "error", message: `error-${suffix}` }; +} + +describe("SQLite database contract", () => { + let tmpDir: string; + let dataDir: string; + + beforeEach(() => { + tmpDir = mkdtempSync(path.join(os.tmpdir(), "paseo-db-")); + dataDir = path.join(tmpDir, "db"); + }); + + afterEach(() => { + rmSync(tmpDir, { recursive: true, force: true }); + }); + + test("creates, migrates, closes, and reopens a persistent database", async () => { + const database = await openPaseoDatabase(dataDir); + + const [project] = await database.db.insert(projects).values({ + directory: "/tmp/project-1", + kind: "git", + displayName: "Project One", + gitRemote: "git@github.com:acme/project-1.git", + createdAt: 
createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }).returning(); + + await database.close(); + + const reopened = await openPaseoDatabase(dataDir); + const rows = await reopened.db.select().from(projects); + expect(rows).toEqual([ + { + id: project.id, + directory: "/tmp/project-1", + displayName: "Project One", + kind: "git", + gitRemote: "git@github.com:acme/project-1.git", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }, + ]); + await reopened.close(); + }); + + test("supports project and workspace linkage plus archive field updates", async () => { + const database = await openPaseoDatabase(dataDir); + + const [project] = await database.db.insert(projects).values({ + directory: "/tmp/project-1", + kind: "git", + displayName: "Project One", + gitRemote: null, + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }).returning(); + const [workspace] = await database.db.insert(workspaces).values({ + projectId: project.id, + directory: "/tmp/project-1", + kind: "checkout", + displayName: "main", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }).returning(); + + await database.db + .update(workspaces) + .set({ archivedAt: createTimestamp(2), updatedAt: createTimestamp(2) }) + .where(eq(workspaces.id, workspace.id)); + + const linkedRows = await database.db + .select({ + projectId: projects.id, + workspaceId: workspaces.id, + workspaceArchivedAt: workspaces.archivedAt, + }) + .from(workspaces) + .innerJoin(projects, eq(workspaces.projectId, projects.id)); + + expect(linkedRows).toEqual([ + { + projectId: project.id, + workspaceId: workspace.id, + workspaceArchivedAt: createTimestamp(2), + }, + ]); + + await database.close(); + }); + + test("supports snapshot insert, get, update, and project-delete cascade with integer workspace IDs", async () => { + const database = await openPaseoDatabase(dataDir); + const [project] = await 
database.db.insert(projects).values({ + directory: "/tmp/project-1", + kind: "git", + displayName: "Project One", + gitRemote: null, + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }).returning(); + const [workspace] = await database.db.insert(workspaces).values({ + projectId: project.id, + directory: "/tmp/project-1", + kind: "checkout", + displayName: "main", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + archivedAt: null, + }).returning(); + + await database.db.insert(agentSnapshots).values({ + agentId: "agent-1", + provider: "codex", + workspaceId: workspace.id, + cwd: "/tmp/project-1", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + lastActivityAt: createTimestamp(1), + lastUserMessageAt: null, + title: "Agent One", + labels: { surface: "workspace" }, + lastStatus: "idle", + lastModeId: "plan", + config: { model: "gpt-5.1", modeId: "plan" }, + runtimeInfo: { provider: "codex", sessionId: "session-1" }, + persistence: { provider: "codex", sessionId: "session-1" }, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, + internal: false, + archivedAt: null, + }); + + await database.db + .update(agentSnapshots) + .set({ + updatedAt: createTimestamp(2), + lastStatus: "running", + title: "Agent One Updated", + archivedAt: createTimestamp(3), + }) + .where(eq(agentSnapshots.agentId, "agent-1")); + + const rows = await database.db + .select() + .from(agentSnapshots) + .where(eq(agentSnapshots.agentId, "agent-1")); + + expect(rows).toEqual([ + { + agentId: "agent-1", + provider: "codex", + workspaceId: workspace.id, + cwd: "/tmp/project-1", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(2), + lastActivityAt: createTimestamp(1), + lastUserMessageAt: null, + title: "Agent One Updated", + labels: { surface: "workspace" }, + lastStatus: "running", + lastModeId: "plan", + config: { model: "gpt-5.1", modeId: "plan" }, + runtimeInfo: { provider: "codex", 
sessionId: "session-1" }, + persistence: { provider: "codex", sessionId: "session-1" }, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, + internal: false, + archivedAt: createTimestamp(3), + }, + ]); + + await database.db.delete(projects).where(eq(projects.id, project.id)); + expect(await database.db.select().from(workspaces)).toEqual([]); + expect(await database.db.select().from(agentSnapshots)).toEqual([]); + + await database.close(); + }); + + test("rejects agent snapshots without a workspace ID", async () => { + const database = await openPaseoDatabase(dataDir); + + await expect( + database.db.insert(agentSnapshots).values({ + agentId: "agent-1", + provider: "codex", + cwd: "/tmp/project-1", + createdAt: createTimestamp(1), + updatedAt: createTimestamp(1), + lastActivityAt: createTimestamp(1), + lastUserMessageAt: null, + title: "Agent One", + labels: { surface: "workspace" }, + lastStatus: "idle", + lastModeId: "plan", + config: { model: "gpt-5.1", modeId: "plan" }, + runtimeInfo: { provider: "codex", sessionId: "session-1" }, + persistence: { provider: "codex", sessionId: "session-1" }, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, + internal: false, + archivedAt: null, + } as typeof agentSnapshots.$inferInsert), + ).rejects.toThrow(); + + await database.close(); + }); + + test("supports timeline append and tail, after-seq, before-seq access patterns in committed order", async () => { + const database = await openPaseoDatabase(dataDir); + const rows = [1, 2, 3, 4].map((seq) => ({ + agentId: "agent-1", + seq, + committedAt: createTimestamp(seq), + item: createTimelineItem(seq === 1 ? "user_message" : "assistant_message", String(seq)), + itemKind: seq === 1 ? 
"user_message" : "assistant_message", + })); + + await database.db.insert(agentTimelineRows).values(rows); + + const tailRows = await database.db + .select() + .from(agentTimelineRows) + .where(eq(agentTimelineRows.agentId, "agent-1")) + .orderBy(desc(agentTimelineRows.seq)) + .limit(2); + + expect(tailRows.map((row) => row.seq).reverse()).toEqual([3, 4]); + + const afterRows = await database.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, "agent-1"), gt(agentTimelineRows.seq, 2))) + .orderBy(asc(agentTimelineRows.seq)); + + expect(afterRows.map((row) => row.seq)).toEqual([3, 4]); + + const beforeRows = await database.db + .select() + .from(agentTimelineRows) + .where(and(eq(agentTimelineRows.agentId, "agent-1"), lt(agentTimelineRows.seq, 4))) + .orderBy(desc(agentTimelineRows.seq)) + .limit(2); + + expect(beforeRows.map((row) => row.seq).reverse()).toEqual([2, 3]); + + await database.close(); + }); + + test("enforces per-agent seq uniqueness and reruns migrations without drift", async () => { + const database = await openPaseoDatabase(dataDir); + + await database.db.insert(agentTimelineRows).values({ + agentId: "agent-1", + seq: 1, + committedAt: createTimestamp(1), + item: createTimelineItem("assistant_message", "1"), + itemKind: "assistant_message", + }); + + await expect( + database.db.insert(agentTimelineRows).values({ + agentId: "agent-1", + seq: 1, + committedAt: createTimestamp(2), + item: createTimelineItem("assistant_message", "duplicate"), + itemKind: "assistant_message", + }), + ).rejects.toThrow(); + + await runPaseoDbMigrations(database.db); + + const migrationRows = database.client + .prepare("select * from __drizzle_migrations order by created_at") + .all(); + expect(migrationRows).toHaveLength(1); + + await database.close(); + }); +}); diff --git a/packages/server/src/server/db/sqlite-database.ts b/packages/server/src/server/db/sqlite-database.ts new file mode 100644 index 000000000..9372f4146 --- /dev/null +++ 
b/packages/server/src/server/db/sqlite-database.ts @@ -0,0 +1,31 @@ +import { mkdirSync } from "node:fs"; +import path from "node:path"; + +import Database from "better-sqlite3"; +import { drizzle, type BetterSQLite3Database } from "drizzle-orm/better-sqlite3"; + +import { runPaseoDbMigrations } from "./migrations.js"; +import { paseoDbSchema } from "./schema.js"; + +export interface PaseoDatabaseHandle { + client: Database.Database; + db: BetterSQLite3Database; + close(): Promise<void>; +} + +export async function openPaseoDatabase(dataDir: string): Promise<PaseoDatabaseHandle> { + mkdirSync(dataDir, { recursive: true }); + const databasePath = path.join(dataDir, "paseo.sqlite"); + const client = new Database(databasePath); + client.pragma("foreign_keys = ON"); + client.pragma("journal_mode = WAL"); + const db = drizzle(client, { schema: paseoDbSchema }); + await runPaseoDbMigrations(db); + return { + client, + db, + async close(): Promise<void> { + client.close(); + }, + }; +} diff --git a/packages/server/src/server/dictation/dictation-stream-manager.test.ts b/packages/server/src/server/dictation/dictation-stream-manager.test.ts index ab584b046..0695c21f8 100644 --- a/packages/server/src/server/dictation/dictation-stream-manager.test.ts +++ b/packages/server/src/server/dictation/dictation-stream-manager.test.ts @@ -266,6 +266,8 @@ describe("DictationStreamManager (provider-agnostic provider)", () => { it("drops dangling uncommitted non-final transcripts when finishing after silence tail clear", async () => { vi.useFakeTimers(); + const previousDebug = process.env.PASEO_DICTATION_DEBUG; + process.env.PASEO_DICTATION_DEBUG = "false"; try { const session = new FakeRealtimeSession(); const emitted: Array<{ type: string; payload: any }> = []; @@ -299,6 +301,7 @@ describe("DictationStreamManager (provider-agnostic provider)", () => { await manager.handleFinish("d-clear-tail", 1); await tick(); await vi.advanceTimersByTimeAsync(5_100); + await tick(); + const final = emitted.find((msg) => msg.type === 
"dictation_stream_final"); const error = emitted.find((msg) => msg.type === "dictation_stream_error"); @@ -306,6 +309,7 @@ describe("DictationStreamManager (provider-agnostic provider)", () => { expect(error).toBeUndefined(); expect(final?.payload.text).toBe("hello"); } finally { + process.env.PASEO_DICTATION_DEBUG = previousDebug; vi.useRealTimers(); } }); diff --git a/packages/server/src/server/persistence-hooks.test.ts b/packages/server/src/server/persistence-hooks.test.ts index bedcb5734..22510d478 100644 --- a/packages/server/src/server/persistence-hooks.test.ts +++ b/packages/server/src/server/persistence-hooks.test.ts @@ -1,83 +1,6 @@ -import { describe, expect, test, vi } from "vitest"; - -import type { ManagedAgent } from "./agent/agent-manager.js"; +import { describe, expect, test } from "vitest"; import type { StoredAgentRecord } from "./agent/agent-storage.js"; -import { - attachAgentStoragePersistence, - buildConfigOverrides, - buildSessionConfig, -} from "./persistence-hooks.js"; -import type { - AgentPermissionRequest, - AgentSession, - AgentSessionConfig, -} from "./agent/agent-sdk-types.js"; - -const testLogger = { - child: () => testLogger, - error: vi.fn(), -} as any; - -type ManagedAgentOverrides = Omit< - Partial, - "config" | "pendingPermissions" | "session" | "activeForegroundTurnId" -> & { - config?: Partial; - pendingPermissions?: Map; - session?: AgentSession | null; - activeForegroundTurnId?: string | null; -}; - -function createManagedAgent(overrides: ManagedAgentOverrides = {}): ManagedAgent { - const now = overrides.updatedAt ?? new Date("2025-01-01T00:00:00.000Z"); - const provider = overrides.provider ?? "claude"; - const cwd = overrides.cwd ?? "/tmp/project"; - const lifecycle = overrides.lifecycle ?? "idle"; - const configOverrides = overrides.config ?? {}; - const config: AgentSessionConfig = { - provider, - cwd, - modeId: configOverrides.modeId ?? "plan", - model: configOverrides.model ?? 
"claude-3.5-sonnet", - extra: configOverrides.extra ?? { claude: { tone: "focused" } }, - }; - const session = lifecycle === "closed" ? null : (overrides.session ?? ({} as AgentSession)); - const activeForegroundTurnId = - overrides.activeForegroundTurnId ?? (lifecycle === "running" ? "test-turn-id" : null); - - const agent: ManagedAgent = { - id: overrides.id ?? "agent-1", - provider, - cwd, - session, - capabilities: overrides.capabilities ?? { - supportsStreaming: true, - supportsSessionPersistence: true, - supportsDynamicModes: true, - supportsMcpServers: true, - supportsReasoningStream: true, - supportsToolInvocations: true, - }, - config, - lifecycle, - createdAt: overrides.createdAt ?? now, - updatedAt: overrides.updatedAt ?? now, - availableModes: overrides.availableModes ?? [], - currentModeId: overrides.currentModeId ?? config.modeId ?? null, - pendingPermissions: overrides.pendingPermissions ?? new Map(), - activeForegroundTurnId, - foregroundTurnWaiters: new Set(), - unsubscribeSession: null, - timeline: overrides.timeline ?? [], - persistence: overrides.persistence ?? null, - historyPrimed: overrides.historyPrimed ?? true, - lastUserMessageAt: overrides.lastUserMessageAt ?? 
now, - lastUsage: overrides.lastUsage, - lastError: overrides.lastError, - }; - - return agent; -} +import { buildConfigOverrides, buildSessionConfig } from "./persistence-hooks.js"; function createRecord(overrides?: Partial<StoredAgentRecord>): StoredAgentRecord { const now = new Date().toISOString(); @@ -100,43 +23,6 @@ function createRecord(overrides?: Partial<StoredAgentRecord>): StoredAgentRecord } describe("persistence hooks", () => { - test("attachAgentStoragePersistence forwards agent snapshots", async () => { - const applySnapshot = vi.fn().mockResolvedValue(undefined); - let subscriber: (event: any) => void = () => { - throw new Error("Agent manager subscriber was not registered"); - }; - const agentManager = { - subscribe: vi.fn((callback: (event: any) => void) => { - subscriber = callback; - return () => { - subscriber = () => { - throw new Error("Agent manager subscriber was not registered"); - }; - }; - }), - }; - attachAgentStoragePersistence( - testLogger, - agentManager as any, - { - applySnapshot, - list: vi.fn(), - } as any, - ); - - expect(agentManager.subscribe).toHaveBeenCalledTimes(1); - const agent = createManagedAgent(); - subscriber({ type: "agent_state", agent }); - expect(applySnapshot).toHaveBeenCalledWith(agent); - - subscriber({ - type: "agent_stream", - agentId: agent.id, - event: { type: "timeline", item: { type: "assistant_message", text: "hi" } }, - }); - expect(applySnapshot).toHaveBeenCalledTimes(1); - }); - test("buildConfigOverrides carries systemPrompt and mcpServers", () => { const record = createRecord({ title: "Voice agent (current)", @@ -208,4 +94,20 @@ describe("persistence hooks", () => { }, }); }); + + test("buildSessionConfig accepts providers from the canonical manifest", () => { + const record = createRecord({ + provider: "gemini", + persistence: { + provider: "gemini", + sessionId: "session-123", + }, + config: {}, + }); + + expect(buildSessionConfig(record)).toMatchObject({ + provider: "gemini", + cwd: "/tmp/project", + }); + }); }); diff --git 
a/packages/server/src/server/persistence-hooks.ts b/packages/server/src/server/persistence-hooks.ts index c197a1c2d..2f92da58f 100644 --- a/packages/server/src/server/persistence-hooks.ts +++ b/packages/server/src/server/persistence-hooks.ts @@ -1,3 +1,5 @@ +import type pino from "pino"; + import type { AgentManager } from "./agent/agent-manager.js"; import type { AgentSessionConfig } from "./agent/agent-sdk-types.js"; import type { AgentStorage, StoredAgentRecord } from "./agent/agent-storage.js"; @@ -29,6 +31,9 @@ export function attachAgentStoragePersistence( if (event.type !== "agent_state") { return; } + if (event.agent.lifecycle === "closed") { + return; + } void storage.applySnapshot(event.agent).catch((error) => { log.error({ err: error, agentId: event.agent.id }, "Failed to persist agent snapshot"); }); @@ -70,6 +75,30 @@ export function buildSessionConfig(record: StoredAgentRecord): AgentSessionConfi }; } +export function toAgentPersistenceHandle( + logger: pino.Logger, + handle: StoredAgentRecord["persistence"], +) { + if (!handle) { + return null; + } + const provider = handle.provider; + if (!isValidAgentProvider(provider)) { + logger.warn({ provider }, `Ignoring persistence handle with unknown provider '${provider}'`); + return null; + } + if (!handle.sessionId) { + logger.warn("Ignoring persistence handle missing sessionId"); + return null; + } + return { + provider, + sessionId: handle.sessionId, + nativeHandle: handle.nativeHandle, + metadata: handle.metadata, + }; +} + export function extractTimestamps(record: StoredAgentRecord): { createdAt: Date; updatedAt: Date; diff --git a/packages/server/src/server/provider-history-compatibility-boundary.test.ts b/packages/server/src/server/provider-history-compatibility-boundary.test.ts new file mode 100644 index 000000000..df3e50ff9 --- /dev/null +++ b/packages/server/src/server/provider-history-compatibility-boundary.test.ts @@ -0,0 +1,15 @@ +import { readFileSync } from "node:fs"; +import { describe, 
expect, test } from "vitest"; + +describe("agent loading boundary", () => { + test("session runtime code does not directly hydrate provider history", () => { + const sessionSource = readFileSync(new URL("./session.ts", import.meta.url), "utf8"); + const agentLoadingSource = readFileSync( + new URL("./agent-loading-service.ts", import.meta.url), + "utf8", + ); + + expect(sessionSource).not.toMatch(/hydrateTimelineFromProvider\s*\(/); + expect(agentLoadingSource).not.toMatch(/hydrateTimelineFromProvider\s*\(/); + }); +}); diff --git a/packages/server/src/server/provider-history-compatibility-service.test.ts b/packages/server/src/server/provider-history-compatibility-service.test.ts new file mode 100644 index 000000000..1b273aebf --- /dev/null +++ b/packages/server/src/server/provider-history-compatibility-service.test.ts @@ -0,0 +1,394 @@ +import { mkdtempSync, rmSync } from "node:fs"; +import os from "node:os"; +import path from "node:path"; +import pino from "pino"; +import { describe, expect, test, vi } from "vitest"; + +import { AgentLoadingService } from "./agent-loading-service.js"; +import { AgentManager } from "./agent/agent-manager.js"; +import { AgentStorage, type StoredAgentRecord } from "./agent/agent-storage.js"; +import { DbAgentTimelineStore } from "./db/db-agent-timeline-store.js"; +import { openPaseoDatabase } from "./db/sqlite-database.js"; +import { createTestAgentClients } from "./test-utils/fake-agent-client.js"; + +function createStoredAgentRecord(overrides?: Partial<StoredAgentRecord>): StoredAgentRecord { + const now = "2026-03-25T00:00:00.000Z"; + return { + id: "agent-compat-1", + provider: "codex", + cwd: "/tmp/project", + createdAt: now, + updatedAt: now, + title: null, + labels: {}, + lastStatus: "idle", + config: { + model: "gpt-5.1-codex-mini", + }, + persistence: { + provider: "codex", + sessionId: "provider-session-1", + }, + ...overrides, + }; +} + +function createDeferred<T>() { + let resolve!: (value: T) => void; + let reject!: (error: unknown) => void; 
+ const promise = new Promise<T>((res, rej) => { + resolve = res; + reject = rej; + }); + return { promise, resolve, reject }; +} + +function createCompatibilitySnapshot(overrides?: Partial<Record<string, unknown>>) { + return { + id: "agent-compat-1", + provider: "codex", + cwd: "/tmp/project", + persistence: { + provider: "codex", + sessionId: "provider-session-1", + }, + ...overrides, + }; +} + +describe("AgentLoadingService", () => { + test("ensureAgentLoaded seeds the live timeline from durable rows for an unloaded persisted agent", async () => { + const workspaceRoot = mkdtempSync(path.join(os.tmpdir(), "provider-history-compat-load-")); + const logger = pino({ level: "silent" }); + const database = await openPaseoDatabase(path.join(workspaceRoot, "db")); + + try { + const storage = new AgentStorage(path.join(workspaceRoot, "agents"), logger); + const manager = new AgentManager({ + clients: createTestAgentClients(), + registry: storage, + durableTimelineStore: new DbAgentTimelineStore(database.db), + logger, + idFactory: () => "00000000-0000-4000-8000-000000000301", + }); + const service = new AgentLoadingService({ + agentManager: manager as any, + agentStorage: storage as any, + logger, + }); + + const snapshot = await manager.createAgent({ + provider: "codex", + cwd: workspaceRoot, + model: "gpt-5.1-codex-mini", + }); + await manager.runAgent(snapshot.id, "say 'timeline test'"); + await manager.flush(); + await storage.flush(); + rmSync( + path.join( + os.tmpdir(), + "paseo-fake-provider-history", + "codex", + `${snapshot.persistence?.sessionId}.jsonl`, + ), + { force: true }, + ); + await manager.closeAgent(snapshot.id); + + const loaded = await service.ensureAgentLoaded({ agentId: snapshot.id }); + const durableTimeline = await manager.fetchTimeline(snapshot.id, { + direction: "tail", + limit: 0, + }); + + expect(loaded.id).toBe(snapshot.id); + expect(manager.getTimeline(snapshot.id)).toEqual([]); + expect(durableTimeline.rows.every((row) => row.item.type === 
"assistant_message")).toBe(true); + expect( + durableTimeline.rows + .map((row) => (row.item.type === "assistant_message" ? row.item.text : "")) + .join(""), + ).toBe("timeline test"); + } finally { + await database.close(); + rmSync(workspaceRoot, { recursive: true, force: true }); + } + }); + + test("ensureAgentLoaded succeeds when provider history is absent", async () => { + const workspaceRoot = mkdtempSync(path.join(os.tmpdir(), "provider-history-compat-empty-")); + const logger = pino({ level: "silent" }); + + try { + const storage = new AgentStorage(path.join(workspaceRoot, "agents"), logger); + const manager = new AgentManager({ + clients: createTestAgentClients(), + registry: storage, + logger, + idFactory: () => "00000000-0000-4000-8000-000000000302", + }); + const service = new AgentLoadingService({ + agentManager: manager as any, + agentStorage: storage as any, + logger, + }); + + const snapshot = await manager.createAgent({ + provider: "codex", + cwd: workspaceRoot, + model: "gpt-5.1-codex-mini", + }); + await manager.flush(); + await storage.flush(); + await manager.closeAgent(snapshot.id); + + const loaded = await service.ensureAgentLoaded({ agentId: snapshot.id }); + + expect(loaded.id).toBe(snapshot.id); + expect(manager.getTimeline(snapshot.id)).toEqual([]); + } finally { + rmSync(workspaceRoot, { recursive: true, force: true }); + } + }); + + test("ensureAgentLoaded dedupes concurrent cold-load bootstrap", async () => { + const deferred = createDeferred(); + let currentAgent: any = null; + const snapshot = createCompatibilitySnapshot({ id: "agent-compat-dedupe" }); + const agentStorage = { + get: vi.fn(async () => + createStoredAgentRecord({ + id: "agent-compat-dedupe", + cwd: "/tmp/dedupe", + persistence: { + provider: "codex", + sessionId: "provider-session-dedupe", + }, + }), + ), + }; + const agentManager = { + getAgent: vi.fn(() => currentAgent), + resumeAgentFromPersistence: vi.fn(async () => deferred.promise), + createAgent: vi.fn(), + 
reloadAgentSession: vi.fn(), + }; + const logger = { + child: () => logger, + info: vi.fn(), + warn: vi.fn(), + }; + const service = new AgentLoadingService({ + agentManager: agentManager as any, + agentStorage: agentStorage as any, + logger: logger as any, + }); + + const firstLoad = service.ensureAgentLoaded({ agentId: "agent-compat-dedupe" }); + const secondLoad = service.ensureAgentLoaded({ agentId: "agent-compat-dedupe" }); + deferred.resolve(snapshot); + + const [firstResult, secondResult] = await Promise.all([firstLoad, secondLoad]); + + expect(firstResult).toEqual(snapshot); + expect(secondResult).toEqual(snapshot); + expect(agentStorage.get).toHaveBeenCalledTimes(1); + expect(agentManager.resumeAgentFromPersistence).toHaveBeenCalledTimes(1); + }); + + test("resumeAgent delegates to manager resume", async () => { + const snapshot = createCompatibilitySnapshot({ id: "agent-compat-resume" }); + const agentManager = { + getAgent: vi.fn(() => null), + resumeAgentFromPersistence: vi.fn(async () => snapshot), + createAgent: vi.fn(), + reloadAgentSession: vi.fn(), + }; + const logger = { + child: () => logger, + info: vi.fn(), + warn: vi.fn(), + }; + const service = new AgentLoadingService({ + agentManager: agentManager as any, + agentStorage: { + get: async () => null, + } as any, + logger: logger as any, + }); + + const result = await service.resumeAgent({ + handle: { + provider: "codex", + sessionId: "provider-session-resume", + }, + overrides: { + model: "gpt-5.4", + }, + }); + + expect(agentManager.resumeAgentFromPersistence).toHaveBeenCalledWith( + { + provider: "codex", + sessionId: "provider-session-resume", + }, + { + model: "gpt-5.4", + }, + ); + expect(result).toEqual(snapshot); + }); + + test("refreshAgent reloads loaded persisted agents", async () => { + const existing = createCompatibilitySnapshot({ id: "agent-compat-refresh-loaded" }); + const reloaded = createCompatibilitySnapshot({ id: "agent-compat-refresh-loaded" }); + let currentAgent: any = 
existing; + const agentManager = { + getAgent: vi.fn(() => currentAgent), + resumeAgentFromPersistence: vi.fn(), + createAgent: vi.fn(), + reloadAgentSession: vi.fn(async () => { + currentAgent = reloaded; + return reloaded; + }), + }; + const logger = { + child: () => logger, + info: vi.fn(), + warn: vi.fn(), + }; + const service = new AgentLoadingService({ + agentManager: agentManager as any, + agentStorage: { + get: async () => null, + } as any, + logger: logger as any, + }); + + const result = await service.refreshAgent({ agentId: "agent-compat-refresh-loaded" }); + + expect(agentManager.reloadAgentSession).toHaveBeenCalledWith("agent-compat-refresh-loaded"); + expect(result).toEqual(reloaded); + }); + + test("refreshAgent keeps loaded non-persisted agents without reloading", async () => { + const existing = createCompatibilitySnapshot({ + id: "agent-compat-refresh-live", + persistence: null, + }); + const agentManager = { + getAgent: vi.fn(() => existing), + resumeAgentFromPersistence: vi.fn(), + createAgent: vi.fn(), + reloadAgentSession: vi.fn(), + }; + const logger = { + child: () => logger, + info: vi.fn(), + warn: vi.fn(), + }; + const service = new AgentLoadingService({ + agentManager: agentManager as any, + agentStorage: { + get: async () => null, + } as any, + logger: logger as any, + }); + + const result = await service.refreshAgent({ agentId: "agent-compat-refresh-live" }); + + expect(agentManager.reloadAgentSession).not.toHaveBeenCalled(); + expect(result).toEqual(existing); + }); + + test("refreshAgent resumes unloaded persisted agents", async () => { + const snapshot = createCompatibilitySnapshot({ id: "agent-compat-refresh-cold" }); + const record = createStoredAgentRecord({ + id: "agent-compat-refresh-cold", + cwd: "/tmp/refresh-cold", + persistence: { + provider: "codex", + sessionId: "provider-session-refresh-cold", + }, + }); + const agentManager = { + getAgent: vi.fn(() => null), + resumeAgentFromPersistence: vi.fn(async () => snapshot), + 
createAgent: vi.fn(), + reloadAgentSession: vi.fn(), + }; + const logger = { + child: () => logger, + info: vi.fn(), + warn: vi.fn(), + }; + const service = new AgentLoadingService({ + agentManager: agentManager as any, + agentStorage: { + get: vi.fn(async () => record), + } as any, + logger: logger as any, + }); + + const result = await service.refreshAgent({ agentId: "agent-compat-refresh-cold" }); + + expect(agentManager.resumeAgentFromPersistence).toHaveBeenCalledWith( + { + provider: "codex", + sessionId: "provider-session-refresh-cold", + nativeHandle: undefined, + metadata: undefined, + }, + { + cwd: "/tmp/refresh-cold", + modeId: undefined, + model: "gpt-5.1-codex-mini", + thinkingOptionId: undefined, + title: undefined, + extra: undefined, + systemPrompt: undefined, + mcpServers: undefined, + }, + "agent-compat-refresh-cold", + { + createdAt: new Date("2026-03-25T00:00:00.000Z"), + updatedAt: new Date("2026-03-25T00:00:00.000Z"), + lastUserMessageAt: null, + labels: {}, + }, + ); + expect(result).toEqual(snapshot); + }); + + test("refreshAgent preserves the unloaded no-persistence error", async () => { + const service = new AgentLoadingService({ + agentManager: { + getAgent: vi.fn(() => null), + resumeAgentFromPersistence: vi.fn(), + createAgent: vi.fn(), + reloadAgentSession: vi.fn(), + } as any, + agentStorage: { + get: async () => + createStoredAgentRecord({ + id: "agent-compat-no-persistence", + persistence: null, + }), + } as any, + logger: { + child: () => ({ + child: () => null, + info: vi.fn(), + warn: vi.fn(), + }), + info: vi.fn(), + warn: vi.fn(), + } as any, + }); + + await expect( + service.refreshAgent({ agentId: "agent-compat-no-persistence" }), + ).rejects.toThrow("Agent agent-compat-no-persistence cannot be refreshed because it lacks persistence"); + }); +}); diff --git a/packages/server/src/server/schedule/service.ts b/packages/server/src/server/schedule/service.ts index f77856627..318223fbc 100644 --- 
a/packages/server/src/server/schedule/service.ts +++ b/packages/server/src/server/schedule/service.ts @@ -3,7 +3,7 @@ import { join } from "node:path"; import type { Logger } from "pino"; import { AgentManager } from "../agent/agent-manager.js"; import type { ManagedAgent } from "../agent/agent-manager.js"; -import { AgentStorage } from "../agent/agent-storage.js"; +import type { AgentSnapshotStore } from "../agent/agent-snapshot-store.js"; import type { AgentPromptInput, AgentSessionConfig, @@ -97,7 +97,7 @@ export interface ScheduleServiceOptions { paseoHome: string; logger: Logger; agentManager: AgentManager; - agentStorage: AgentStorage; + agentStorage: AgentSnapshotStore; now?: () => Date; runner?: (schedule: StoredSchedule) => Promise<void>; } @@ -106,7 +106,7 @@ export class ScheduleService { private readonly store: ScheduleStore; private readonly logger: Logger; private readonly agentManager: AgentManager; - private readonly agentStorage: AgentStorage; + private readonly agentStorage: AgentSnapshotStore; private readonly now: () => Date; private readonly runner: (schedule: StoredSchedule) => Promise<void>; private readonly runningScheduleIds = new Set<string>(); diff --git a/packages/server/src/server/service-health-monitor.test.ts b/packages/server/src/server/service-health-monitor.test.ts new file mode 100644 index 000000000..348e3c02b --- /dev/null +++ b/packages/server/src/server/service-health-monitor.test.ts @@ -0,0 +1,455 @@ +import net from "node:net"; +import { scheduler } from "node:timers/promises"; +import { afterEach, describe, expect, it, vi } from "vitest"; +import { findFreePort, ServiceRouteStore } from "./service-proxy.js"; +import { + ServiceHealthMonitor, + type ServiceHealthEntry, +} from "./service-health-monitor.js"; + +type TcpServerHandle = { + port: number; + server: net.Server; +}; + +async function startTcpServer(): Promise<TcpServerHandle> { + const server = net.createServer((socket) => { + socket.end(); + }); + + await new Promise<void>((resolve, reject) => { + 
server.once("error", reject); + server.listen(0, "127.0.0.1", () => { + server.off("error", reject); + resolve(); + }); + }); + + const address = server.address(); + if (!address || typeof address === "string") { + throw new Error("Failed to resolve TCP server address"); + } + + return { port: address.port, server }; +} + +async function closeServer(server: net.Server): Promise { + if (!server.listening) { + return; + } + + await new Promise((resolve, reject) => { + server.close((error) => { + if (error) { + reject(error); + return; + } + resolve(); + }); + }); +} + +async function advancePoll(ms: number): Promise { + await vi.advanceTimersByTimeAsync(ms); + for (let i = 0; i < 20; i += 1) { + await scheduler.yield(); + } +} + +describe("ServiceHealthMonitor", () => { + const servers = new Set(); + + afterEach(async () => { + vi.useRealTimers(); + + for (const server of servers) { + await closeServer(server); + } + servers.clear(); + }); + + it("marks a healthy port as running after successful TCP connect", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + }); + + monitor.start(); + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + expect(onChange).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "api.localhost", + port: healthy.port, + health: "healthy", + }, + ]); + }); + + it("marks an unreachable port as stopped after consecutive failures", async () => { + vi.useFakeTimers(); + + const deadPort = await findFreePort(); + const 
routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: deadPort, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + failuresBeforeStopped: 2, + }); + + monitor.start(); + await advancePoll(1_000); + expect(onChange).not.toHaveBeenCalled(); + + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + expect(onChange).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "api.localhost", + port: deadPort, + health: "unhealthy", + }, + ]); + }); + + it("does not emit when status has not changed", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + }); + + monitor.start(); + await advancePoll(3_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + }); + + it("respects startup grace period — does not probe newly registered routes for 5 seconds", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new 
ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 5_000, + }); + + monitor.start(); + await advancePoll(4_000); + expect(onChange).not.toHaveBeenCalled(); + + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + expect(onChange).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "api.localhost", + port: healthy.port, + health: "healthy", + }, + ]); + }); + + it("requires 2 consecutive failures before marking stopped (debounce)", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + failuresBeforeStopped: 2, + }); + + monitor.start(); + await advancePoll(1_000); + expect(onChange).toHaveBeenCalledTimes(1); + + await closeServer(healthy.server); + servers.delete(healthy.server); + + await advancePoll(1_000); + expect(onChange).toHaveBeenCalledTimes(1); + + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(2); + expect(onChange).toHaveBeenLastCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "api.localhost", + port: healthy.port, + health: "unhealthy", + }, + ]); + }); + + it("stops probing routes that are removed from the store", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const 
onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + failuresBeforeStopped: 2, + }); + + monitor.start(); + await advancePoll(1_000); + expect(onChange).toHaveBeenCalledTimes(1); + + routeStore.removeRoute("api.localhost"); + await closeServer(healthy.server); + servers.delete(healthy.server); + + await advancePoll(3_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + }); + + it("calls onChange with workspaceId and full service list when status transitions", async () => { + vi.useFakeTimers(); + + const api = await startTcpServer(); + const web = await startTcpServer(); + servers.add(api.server); + servers.add(web.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: api.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + routeStore.registerRoute({ + hostname: "web.localhost", + port: web.port, + workspaceId: "workspace-a", + serviceName: "web", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + }); + + monitor.start(); + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + expect(onChange).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "api.localhost", + port: api.port, + health: "healthy", + }, + { + serviceName: "web", + hostname: "web.localhost", + port: web.port, + health: "healthy", + }, + ]); + }); + + it("getHealthForHostname returns current health after probe", async () => { + vi.useFakeTimers(); + + const healthy = await startTcpServer(); + servers.add(healthy.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ 
+ hostname: "api.localhost", + port: healthy.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + }); + + expect(monitor.getHealthForHostname("api.localhost")).toBeNull(); + + monitor.start(); + await advancePoll(1_000); + monitor.stop(); + + expect(monitor.getHealthForHostname("api.localhost")).toBe("healthy"); + expect(monitor.getHealthForHostname("unknown.localhost")).toBeNull(); + }); + + it("coalesces multiple service changes in same workspace into one onChange call per poll cycle", async () => { + vi.useFakeTimers(); + + const api = await startTcpServer(); + const web = await startTcpServer(); + servers.add(api.server); + servers.add(web.server); + + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: api.port, + workspaceId: "workspace-a", + serviceName: "api", + }); + routeStore.registerRoute({ + hostname: "web.localhost", + port: web.port, + workspaceId: "workspace-a", + serviceName: "web", + }); + + const onChange = vi.fn<(workspaceId: string, services: ServiceHealthEntry[]) => void>(); + const monitor = new ServiceHealthMonitor({ + routeStore, + onChange, + pollIntervalMs: 1_000, + probeTimeoutMs: 100, + graceMs: 0, + failuresBeforeStopped: 2, + }); + + monitor.start(); + await advancePoll(1_000); + expect(onChange).toHaveBeenCalledTimes(1); + + onChange.mockClear(); + await closeServer(api.server); + await closeServer(web.server); + servers.delete(api.server); + servers.delete(web.server); + + await advancePoll(1_000); + expect(onChange).not.toHaveBeenCalled(); + + await advancePoll(1_000); + monitor.stop(); + + expect(onChange).toHaveBeenCalledTimes(1); + expect(onChange).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: 
"api.localhost", + port: api.port, + health: "unhealthy", + }, + { + serviceName: "web", + hostname: "web.localhost", + port: web.port, + health: "unhealthy", + }, + ]); + }); +}); diff --git a/packages/server/src/server/service-health-monitor.ts b/packages/server/src/server/service-health-monitor.ts new file mode 100644 index 000000000..fd6608ad5 --- /dev/null +++ b/packages/server/src/server/service-health-monitor.ts @@ -0,0 +1,203 @@ +import net from "node:net"; +import type { ServiceRouteEntry, ServiceRouteStore } from "./service-proxy.js"; + +export interface ServiceHealthEntry { + serviceName: string; + hostname: string; + port: number; + health: "healthy" | "unhealthy"; +} + +type RouteHealthState = { + health: ServiceHealthEntry["health"] | null; + consecutiveFailures: number; + registeredAt: number; +}; + +export class ServiceHealthMonitor { + private readonly routeStore: ServiceRouteStore; + private readonly onChange: ( + workspaceId: string, + services: ServiceHealthEntry[], + ) => void; + private readonly pollIntervalMs: number; + private readonly probeTimeoutMs: number; + private readonly graceMs: number; + private readonly failuresBeforeStopped: number; + private readonly routeStates = new Map<string, RouteHealthState>(); + private readonly lastEmittedSnapshots = new Map<string, string>(); + + private intervalHandle: NodeJS.Timeout | null = null; + private pollInFlight = false; + + constructor({ + routeStore, + onChange, + pollIntervalMs = 3_000, + probeTimeoutMs = 500, + graceMs = 5_000, + failuresBeforeStopped = 2, + }: { + routeStore: ServiceRouteStore; + onChange: (workspaceId: string, services: ServiceHealthEntry[]) => void; + pollIntervalMs?: number; + probeTimeoutMs?: number; + graceMs?: number; + failuresBeforeStopped?: number; + }) { + this.routeStore = routeStore; + this.onChange = onChange; + this.pollIntervalMs = pollIntervalMs; + this.probeTimeoutMs = probeTimeoutMs; + this.graceMs = graceMs; + this.failuresBeforeStopped = failuresBeforeStopped; + } + + start(): void { + if
(this.intervalHandle) { + return; + } + + const now = Date.now(); + for (const route of this.routeStore.listRoutes()) { + this.getOrCreateState(route.hostname, now); + } + + this.intervalHandle = setInterval(() => { + void this.poll(); + }, this.pollIntervalMs); + } + + stop(): void { + if (this.intervalHandle) { + clearInterval(this.intervalHandle); + this.intervalHandle = null; + } + } + + private async poll(): Promise<void> { + if (this.pollInFlight) { + return; + } + + this.pollInFlight = true; + try { + const routes = this.routeStore.listRoutes(); + const activeHostnames = new Set(routes.map((route) => route.hostname)); + const changedWorkspaceIds = new Set<string>(); + const now = Date.now(); + + for (const route of routes) { + const state = this.getOrCreateState(route.hostname, now); + if (now - state.registeredAt < this.graceMs) { + continue; + } + + const isHealthy = await this.probeRoute(route.port); + const previousHealth = state.health; + + if (isHealthy) { + state.consecutiveFailures = 0; + state.health = "healthy"; + } else { + state.consecutiveFailures += 1; + if (state.consecutiveFailures >= this.failuresBeforeStopped) { + state.health = "unhealthy"; + } + } + + if (state.health !== null && state.health !== previousHealth) { + changedWorkspaceIds.add(route.workspaceId); + } + } + + this.pruneRemovedRoutes(activeHostnames); + + for (const workspaceId of changedWorkspaceIds) { + const services = this.buildWorkspaceServiceList(workspaceId); + const snapshot = JSON.stringify(services); + if (snapshot === this.lastEmittedSnapshots.get(workspaceId)) { + continue; + } + + this.lastEmittedSnapshots.set(workspaceId, snapshot); + this.onChange(workspaceId, services); + } + } finally { + this.pollInFlight = false; + } + } + + private getOrCreateState(hostname: string, registeredAt: number): RouteHealthState { + const existing = this.routeStates.get(hostname); + if (existing) { + return existing; + } + + const state: RouteHealthState = { + health: null, +
consecutiveFailures: 0, + registeredAt, + }; + this.routeStates.set(hostname, state); + return state; + } + + private pruneRemovedRoutes(activeHostnames: Set<string>): void { + for (const hostname of this.routeStates.keys()) { + if (activeHostnames.has(hostname)) { + continue; + } + this.routeStates.delete(hostname); + } + } + + private buildWorkspaceServiceList(workspaceId: string): ServiceHealthEntry[] { + return this.routeStore + .listRoutesForWorkspace(workspaceId) + .flatMap((route) => { + const state = this.routeStates.get(route.hostname); + if (!state?.health) { + return []; + } + return [this.toServiceHealthEntry(route, state.health)]; + }); + } + + getHealthForHostname(hostname: string): ServiceHealthEntry["health"] | null { + return this.routeStates.get(hostname)?.health ?? null; + } + + private toServiceHealthEntry( + route: ServiceRouteEntry, + health: ServiceHealthEntry["health"], + ): ServiceHealthEntry { + return { + serviceName: route.serviceName, + hostname: route.hostname, + port: route.port, + health, + }; + } + + private probeRoute(port: number): Promise<boolean> { + return new Promise((resolve) => { + const socket = net.connect({ host: "127.0.0.1", port }); + let settled = false; + + const finish = (healthy: boolean) => { + if (settled) { + return; + } + settled = true; + socket.destroy(); + resolve(healthy); + }; + + socket.setTimeout(this.probeTimeoutMs); + socket.once("connect", () => finish(true)); + socket.once("timeout", () => finish(false)); + socket.once("error", () => finish(false)); + }); + } +} diff --git a/packages/server/src/server/service-proxy.test.ts b/packages/server/src/server/service-proxy.test.ts new file mode 100644 index 000000000..3a10ca745 --- /dev/null +++ b/packages/server/src/server/service-proxy.test.ts @@ -0,0 +1,478 @@ +import { describe, it, expect, afterEach } from "vitest"; +import http from "node:http"; +import net from "node:net"; +import express from "express"; +import WebSocket, { WebSocketServer } from "ws"; +import pino
from "pino"; +import { + ServiceRouteStore, + createServiceProxyMiddleware, + createServiceProxyUpgradeHandler, + findFreePort, +} from "./service-proxy.js"; + +const logger = pino({ level: "silent" }); + +// --------------------------------------------------------------------------- +// Helpers for cleanup +// --------------------------------------------------------------------------- + +function closeServer(server: http.Server): Promise<void> { + return new Promise((resolve) => { + server.close(() => resolve()); + }); +} + +// --------------------------------------------------------------------------- +// ServiceRouteStore +// --------------------------------------------------------------------------- + +describe("ServiceRouteStore", () => { + it("registerRoute and findRoute with exact match", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + + const route = store.findRoute("editor.localhost"); + expect(route).toEqual({ hostname: "editor.localhost", port: 3000 }); + }); + + it("findRoute strips port from host header", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + + const route = store.findRoute("editor.localhost:6767"); + expect(route).toEqual({ hostname: "editor.localhost", port: 3000 }); + }); + + it("findRoute subdomain match", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + + const route = store.findRoute("fix-auth.editor.localhost"); + expect(route).toEqual({ hostname: "editor.localhost", port: 3000 }); + }); + + it("listRoutes returns enriched entries", () => { + const store = new ServiceRouteStore(); +
store.registerRoute({ + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }); + store.registerRoute({ + hostname: "b.localhost", + port: 4000, + workspaceId: "/repo/.paseo/worktrees/feature-b", + serviceName: "docs", + }); + + const routes = store.listRoutes(); + expect(routes).toHaveLength(2); + expect(routes).toContainEqual({ + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }); + expect(routes).toContainEqual({ + hostname: "b.localhost", + port: 4000, + workspaceId: "/repo/.paseo/worktrees/feature-b", + serviceName: "docs", + }); + }); + + it("listRoutesForWorkspace returns only routes for that workspace", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }); + store.registerRoute({ + hostname: "b.localhost", + port: 4000, + workspaceId: "/repo/.paseo/worktrees/feature-b", + serviceName: "docs", + }); + store.registerRoute({ + hostname: "c.localhost", + port: 5000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "api", + }); + + expect(store.listRoutesForWorkspace("/repo/.paseo/worktrees/feature-a")).toEqual([ + { + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }, + { + hostname: "c.localhost", + port: 5000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "api", + }, + ]); + }); + + it("removeRoute works", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + store.removeRoute("editor.localhost"); + + expect(store.findRoute("editor.localhost")).toBeNull(); + }); + + it("removeRoute cleans up workspace index", () => { + const store = new 
ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + + store.removeRoute("editor.localhost"); + + expect(store.listRoutesForWorkspace("/repo/.paseo/worktrees/feature-a")).toEqual([]); + }); + + it("removeRoutesForPort works", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }); + store.registerRoute({ + hostname: "b.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "api", + }); + store.registerRoute({ + hostname: "c.localhost", + port: 4000, + workspaceId: "/repo/.paseo/worktrees/feature-b", + serviceName: "docs", + }); + + store.removeRoutesForPort(3000); + + expect(store.findRoute("a.localhost")).toBeNull(); + expect(store.findRoute("b.localhost")).toBeNull(); + expect(store.findRoute("c.localhost")).toEqual({ + hostname: "c.localhost", + port: 4000, + }); + }); + + it("removeRoutesForPort cleans up workspace index", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "a.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "web", + }); + store.registerRoute({ + hostname: "b.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "api", + }); + + store.removeRoutesForPort(3000); + + expect(store.listRoutesForWorkspace("/repo/.paseo/worktrees/feature-a")).toEqual([]); + }); + + it("findRoute returns null for unknown hosts", () => { + const store = new ServiceRouteStore(); + store.registerRoute({ + hostname: "editor.localhost", + port: 3000, + workspaceId: "/repo/.paseo/worktrees/feature-a", + serviceName: "editor", + }); + + expect(store.findRoute("unknown.example.com")).toBeNull(); + }); +}); + +// 
--------------------------------------------------------------------------- +// HTTP proxy +// --------------------------------------------------------------------------- + +describe("HTTP proxy", () => { + const servers: http.Server[] = []; + + afterEach(async () => { + await Promise.all(servers.map(closeServer)); + servers.length = 0; + }); + + /** Start a real HTTP server that echoes back a known body and records received headers. */ + async function startUpstream(): Promise<{ + port: number; + server: http.Server; + receivedHeaders: () => http.IncomingHttpHeaders; + }> { + const port = await findFreePort(); + let lastHeaders: http.IncomingHttpHeaders = {}; + + const server = http.createServer((req, res) => { + lastHeaders = req.headers; + res.writeHead(200, { "content-type": "text/plain" }); + res.end("upstream-ok"); + }); + + await new Promise<void>((resolve) => + server.listen(port, "127.0.0.1", resolve), + ); + servers.push(server); + + return { + port, + server, + receivedHeaders: () => lastHeaders, + }; + } + + /** Start an Express app with the service proxy middleware and an optional fallback. */ + async function startProxy( + routeStore: ServiceRouteStore, + opts?: { fallback?: boolean }, + ): Promise<{ port: number; server: http.Server }> { + const port = await findFreePort(); + const app = express(); + app.use(createServiceProxyMiddleware({ routeStore, logger })); + + if (opts?.fallback) { + app.use((_req, res) => { + res.status(404).send("no route"); + }); + } + + const server = http.createServer(app); + await new Promise<void>((resolve) => + server.listen(port, "127.0.0.1", resolve), + ); + servers.push(server); + + return { port, server }; + } + + /** Simple HTTP GET helper that returns status code and body.
*/ + function httpGet( + port: number, + host: string, + path = "/", + ): Promise<{ status: number; body: string }> { + return new Promise((resolve, reject) => { + const req = http.get( + { hostname: "127.0.0.1", port, path, headers: { host } }, + (res) => { + let body = ""; + res.on("data", (chunk: Buffer) => (body += chunk.toString())); + res.on("end", () => + resolve({ status: res.statusCode ?? 0, body }), + ); + }, + ); + req.on("error", reject); + }); + } + + it("proxies requests to the correct upstream based on Host header", async () => { + const upstream = await startUpstream(); + const routeStore = new ServiceRouteStore(); + routeStore.addRoute("test-service.localhost", upstream.port); + + const proxy = await startProxy(routeStore); + const res = await httpGet( + proxy.port, + `test-service.localhost:${proxy.port}`, + ); + + expect(res.status).toBe(200); + expect(res.body).toBe("upstream-ok"); + + const headers = upstream.receivedHeaders(); + expect(headers["x-forwarded-for"]).toBeDefined(); + expect(headers["x-forwarded-host"]).toBe("test-service.localhost"); + }); + + it("falls through when no route matches", async () => { + const routeStore = new ServiceRouteStore(); + const proxy = await startProxy(routeStore, { fallback: true }); + + const res = await httpGet( + proxy.port, + `unknown.localhost:${proxy.port}`, + ); + + expect(res.status).toBe(404); + expect(res.body).toBe("no route"); + }); + + it("returns 502 when upstream is down", async () => { + // Get a port that nothing is listening on + const deadPort = await findFreePort(); + + const routeStore = new ServiceRouteStore(); + routeStore.addRoute("dead-service.localhost", deadPort); + + const proxy = await startProxy(routeStore); + const res = await httpGet( + proxy.port, + `dead-service.localhost:${proxy.port}`, + ); + + expect(res.status).toBe(502); + expect(res.body).toBe("502 Bad Gateway"); + }); +}); + +// --------------------------------------------------------------------------- +// 
WebSocket proxy +// --------------------------------------------------------------------------- + +describe("WebSocket proxy", () => { + const httpServers: http.Server[] = []; + const wsServers: WebSocketServer[] = []; + const wsClients: WebSocket[] = []; + + afterEach(async () => { + for (const ws of wsClients) { + if (ws.readyState === WebSocket.OPEN) ws.close(); + } + wsClients.length = 0; + + for (const wss of wsServers) { + wss.close(); + } + wsServers.length = 0; + + await Promise.all(httpServers.map(closeServer)); + httpServers.length = 0; + }); + + it("proxies WebSocket connections to the correct upstream", async () => { + // 1. Start a real WebSocket echo server + const upstreamPort = await findFreePort(); + const upstreamServer = http.createServer(); + const wss = new WebSocketServer({ server: upstreamServer }); + wsServers.push(wss); + + wss.on("connection", (ws) => { + ws.on("message", (data) => { + ws.send(`echo: ${data.toString()}`); + }); + }); + + await new Promise<void>((resolve) => + upstreamServer.listen(upstreamPort, "127.0.0.1", resolve), + ); + httpServers.push(upstreamServer); + + // 2. Create the proxy server with the upgrade handler + const routeStore = new ServiceRouteStore(); + routeStore.addRoute("ws-service.localhost", upstreamPort); + + const proxyPort = await findFreePort(); + const proxyServer = http.createServer((_req, res) => { + res.writeHead(404); + res.end(); + }); + + const upgradeHandler = createServiceProxyUpgradeHandler({ + routeStore, + logger, + }); + proxyServer.on("upgrade", upgradeHandler); + + await new Promise<void>((resolve) => + proxyServer.listen(proxyPort, "127.0.0.1", resolve), + ); + httpServers.push(proxyServer); + + // 3. Connect a WebSocket client through the proxy + const ws = new WebSocket(`ws://127.0.0.1:${proxyPort}`, { + headers: { host: `ws-service.localhost:${proxyPort}` }, + }); + wsClients.push(ws); + + await new Promise<void>((resolve, reject) => { + ws.on("open", () => resolve()); + ws.on("error", reject); + }); + + // 4.
Send a message and verify echo + const reply = await new Promise<string>((resolve, reject) => { + ws.on("message", (data) => resolve(data.toString())); + ws.on("error", reject); + ws.send("hello proxy"); + }); + + expect(reply).toBe("echo: hello proxy"); + }); +}); + +// --------------------------------------------------------------------------- +// findFreePort +// --------------------------------------------------------------------------- + +describe("findFreePort", () => { + it("returns a number", async () => { + const port = await findFreePort(); + expect(typeof port).toBe("number"); + expect(port).toBeGreaterThan(0); + expect(port).toBeLessThan(65536); + }); + + it("returns a port that is actually available", async () => { + const port = await findFreePort(); + + // Verify we can bind a server to it + const server = net.createServer(); + await new Promise<void>((resolve, reject) => { + server.listen(port, "127.0.0.1", () => resolve()); + server.on("error", reject); + }); + + const addr = server.address(); + expect(addr).not.toBeNull(); + expect(typeof addr === "object" && addr !== null ?
addr.port : -1).toBe( + port, + ); + + await new Promise<void>((resolve) => server.close(() => resolve())); + }); +}); diff --git a/packages/server/src/server/service-proxy.ts b/packages/server/src/server/service-proxy.ts new file mode 100644 index 000000000..ac3c00545 --- /dev/null +++ b/packages/server/src/server/service-proxy.ts @@ -0,0 +1,326 @@ +import http from "node:http"; +import net from "node:net"; +import type { IncomingMessage } from "node:http"; +import type { Logger } from "pino"; +import type { RequestHandler } from "express"; + +// --------------------------------------------------------------------------- +// Hop-by-hop headers that must not be forwarded +// --------------------------------------------------------------------------- + +const HOP_BY_HOP_HEADERS = new Set([ + "connection", + "transfer-encoding", + "keep-alive", + "upgrade", + "proxy-connection", + "proxy-authenticate", + "proxy-authorization", + "te", + "trailer", +]); + +// --------------------------------------------------------------------------- +// ServiceRouteStore
// --------------------------------------------------------------------------- + +export interface ServiceRoute { + hostname: string; + port: number; +} + +export interface ServiceRouteEntry extends ServiceRoute { + workspaceId: string; + serviceName: string; +} + +export class ServiceRouteStore { + private routes = new Map<string, ServiceRouteEntry>(); + private workspaceHostnames = new Map<string, Set<string>>(); + + addRoute(hostname: string, port: number): void { + this.registerRoute({ + hostname, + port, + workspaceId: "", + serviceName: hostname, + }); + } + + registerRoute(entry: ServiceRouteEntry): void { + const previous = this.routes.get(entry.hostname); + if (previous) { + this.removeHostnameFromWorkspaceIndex(previous.workspaceId, previous.hostname); + } + + const storedEntry = { ...entry }; + this.routes.set(storedEntry.hostname, storedEntry); + this.addHostnameToWorkspaceIndex(storedEntry.workspaceId, storedEntry.hostname); + } + + removeRoute(hostname:
string): void { + const entry = this.routes.get(hostname); + if (!entry) { + return; + } + this.routes.delete(hostname); + this.removeHostnameFromWorkspaceIndex(entry.workspaceId, hostname); + } + + removeRoutesForPort(port: number): void { + for (const [hostname, entry] of this.routes) { + if (entry.port === port) { + this.routes.delete(hostname); + this.removeHostnameFromWorkspaceIndex(entry.workspaceId, hostname); + } + } + } + + findRoute(host: string): ServiceRoute | null { + // Strip port suffix from the Host header value + const hostname = host.replace(/:\d+$/, ""); + + // 1. Exact match + const exactRoute = this.routes.get(hostname); + if (exactRoute !== undefined) { + return { hostname: exactRoute.hostname, port: exactRoute.port }; + } + + // 2. Subdomain match — walk up the labels looking for a registered parent + const parts = hostname.split("."); + for (let i = 1; i < parts.length; i++) { + const candidate = parts.slice(i).join("."); + const candidateRoute = this.routes.get(candidate); + if (candidateRoute !== undefined) { + return { hostname: candidateRoute.hostname, port: candidateRoute.port }; + } + } + + return null; + } + + getRouteEntry(hostname: string): ServiceRouteEntry | null { + const entry = this.routes.get(hostname); + return entry ? { ...entry } : null; + } + + listRoutes(): ServiceRouteEntry[] { + return Array.from(this.routes.values()).map((entry) => ({ ...entry })); + } + + listRoutesForWorkspace(workspaceId: string): ServiceRouteEntry[] { + const hostnames = this.workspaceHostnames.get(workspaceId); + if (!hostnames) { + return []; + } + + const routes: ServiceRouteEntry[] = []; + for (const hostname of hostnames) { + const entry = this.routes.get(hostname); + if (entry) { + routes.push({ ...entry }); + } + } + return routes; + } + + private addHostnameToWorkspaceIndex(workspaceId: string, hostname: string): void { + const hostnames = this.workspaceHostnames.get(workspaceId) ?? 
new Set<string>(); + hostnames.add(hostname); + this.workspaceHostnames.set(workspaceId, hostnames); + } + + private removeHostnameFromWorkspaceIndex(workspaceId: string, hostname: string): void { + const hostnames = this.workspaceHostnames.get(workspaceId); + if (!hostnames) { + return; + } + + hostnames.delete(hostname); + if (hostnames.size === 0) { + this.workspaceHostnames.delete(workspaceId); + } + } +} + +// --------------------------------------------------------------------------- +// Helpers
// --------------------------------------------------------------------------- + +function stripHopByHopHeaders( + rawHeaders: http.IncomingHttpHeaders, +): Record<string, string | string[]> { + const out: Record<string, string | string[]> = {}; + for (const [key, value] of Object.entries(rawHeaders)) { + if (value === undefined) continue; + if (HOP_BY_HOP_HEADERS.has(key.toLowerCase())) continue; + out[key] = value; + } + return out; +} + +// --------------------------------------------------------------------------- +// createServiceProxyMiddleware
// --------------------------------------------------------------------------- + +export function createServiceProxyMiddleware({ + routeStore, + logger, +}: { + routeStore: ServiceRouteStore; + logger: Logger; +}): RequestHandler { + return (req, res, next) => { + const hostHeader = req.headers.host; + if (!hostHeader) { + next(); + return; + } + + const route = routeStore.findRoute(hostHeader); + if (!route) { + next(); + return; + } + + const forwardedHeaders = stripHopByHopHeaders(req.headers); + forwardedHeaders["x-forwarded-for"] = + req.socket.remoteAddress ??
"127.0.0.1"; + forwardedHeaders["x-forwarded-host"] = hostHeader.replace(/:\d+$/, ""); + forwardedHeaders["x-forwarded-proto"] = req.protocol; + + const proxyReq = http.request( + { + hostname: "127.0.0.1", + port: route.port, + path: req.originalUrl, + method: req.method, + headers: forwardedHeaders, + }, + (proxyRes) => { + const responseHeaders = stripHopByHopHeaders(proxyRes.headers); + res.writeHead(proxyRes.statusCode ?? 502, responseHeaders); + proxyRes.pipe(res, { end: true }); + }, + ); + + proxyReq.on("error", (err) => { + logger.warn( + { err, hostname: route.hostname, port: route.port }, + "Service proxy: upstream unreachable", + ); + if (!res.headersSent) { + res.writeHead(502, { "content-type": "text/plain" }); + res.end("502 Bad Gateway"); + } + }); + + req.pipe(proxyReq, { end: true }); + }; +} + +// --------------------------------------------------------------------------- +// createServiceProxyUpgradeHandler +// --------------------------------------------------------------------------- + +export function createServiceProxyUpgradeHandler({ + routeStore, + logger, +}: { + routeStore: ServiceRouteStore; + logger: Logger; +}): (req: IncomingMessage, socket: net.Socket, head: Buffer) => void { + return (req, socket, head) => { + const hostHeader = req.headers.host; + if (!hostHeader) { + return; + } + + const route = routeStore.findRoute(hostHeader); + if (!route) { + return; + } + + const targetSocket = net.connect( + { host: "127.0.0.1", port: route.port }, + () => { + // Reconstruct the raw HTTP upgrade request to send to the target + const forwardedHeaders = stripHopByHopHeaders(req.headers); + forwardedHeaders["x-forwarded-for"] = + req.socket.remoteAddress ?? "127.0.0.1"; + forwardedHeaders["x-forwarded-host"] = hostHeader.replace(/:\d+$/, ""); + forwardedHeaders["x-forwarded-proto"] = "http"; + + // Re-include upgrade and connection headers — they are required for + // WebSocket handshake even though they are hop-by-hop. 
+        forwardedHeaders["connection"] = "Upgrade";
+        forwardedHeaders["upgrade"] = req.headers.upgrade ?? "websocket";
+
+        const headerLines: string[] = [];
+        headerLines.push(
+          `${req.method ?? "GET"} ${req.url ?? "/"} HTTP/${req.httpVersion}`,
+        );
+        for (const [key, value] of Object.entries(forwardedHeaders)) {
+          if (Array.isArray(value)) {
+            for (const v of value) {
+              headerLines.push(`${key}: ${v}`);
+            }
+          } else {
+            headerLines.push(`${key}: ${value}`);
+          }
+        }
+        // The trailing "\r\n" entry makes join() emit the blank line that
+        // terminates the header block.
+        headerLines.push("\r\n");
+
+        targetSocket.write(headerLines.join("\r\n"));
+
+        if (head.length > 0) {
+          targetSocket.write(head);
+        }
+
+        // Pipe in both directions
+        targetSocket.pipe(socket);
+        socket.pipe(targetSocket);
+      },
+    );
+
+    targetSocket.on("error", (err) => {
+      logger.warn(
+        { err, hostname: route.hostname, port: route.port },
+        "Service proxy: WebSocket upstream unreachable",
+      );
+      socket.end();
+    });
+
+    socket.on("error", () => {
+      targetSocket.destroy();
+    });
+  };
+}
+
+// ---------------------------------------------------------------------------
+// findFreePort
+// ---------------------------------------------------------------------------
+
+export function findFreePort(): Promise<number> {
+  return new Promise((resolve, reject) => {
+    const server = net.createServer();
+    server.unref();
+    server.listen(0, "127.0.0.1", () => {
+      const address = server.address();
+      if (!address || typeof address === "string") {
+        server.close();
+        reject(new Error("Failed to get assigned port"));
+        return;
+      }
+      const { port } = address;
+      server.close((err) => {
+        if (err) {
+          reject(err);
+        } else {
+          resolve(port);
+        }
+      });
+    });
+    server.on("error", reject);
+  });
+}
diff --git a/packages/server/src/server/service-route-branch-handler.test.ts b/packages/server/src/server/service-route-branch-handler.test.ts
new file mode 100644
index 000000000..1326cb865
--- /dev/null
+++ b/packages/server/src/server/service-route-branch-handler.test.ts
@@ -0,0 +1,189 @@
+import { describe, expect, it, vi } from 
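For readers following the proxy hunk that ends above: `findRoute` strips any `:port` suffix from the Host value, tries an exact hostname match, and then walks up the dot-separated labels so that a request for `assets.web.localhost` falls back to a registered `web.localhost` route. A self-contained sketch of that lookup (the `matchHost` name and the plain `Map` table are illustrative, not part of the diff):

```typescript
// Sketch of the Host-header lookup used by ServiceRouteStore.findRoute:
// exact hostname match first, then progressively drop leading labels.
function matchHost(host: string, routes: Map<string, number>): number | null {
  const hostname = host.replace(/:\d+$/, ""); // drop ":6767" etc.
  const exact = routes.get(hostname);
  if (exact !== undefined) {
    return exact;
  }
  const parts = hostname.split(".");
  for (let i = 1; i < parts.length; i++) {
    const parent = parts.slice(i).join(".");
    const port = routes.get(parent);
    if (port !== undefined) {
      return port;
    }
  }
  return null;
}

const routes = new Map<string, number>([["web.localhost", 4321]]);
```

With that table, both `matchHost("web.localhost:6767", routes)` and `matchHost("assets.web.localhost", routes)` resolve to the same upstream port, while an unregistered host yields null and the middleware falls through to `next()`.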
"vitest"; +import { ServiceRouteStore } from "./service-proxy.js"; +import { createBranchChangeRouteHandler } from "./service-route-branch-handler.js"; + +function registerRoute( + routeStore: ServiceRouteStore, + { + hostname, + port, + workspaceId = "workspace-a", + serviceName, + }: { + hostname: string; + port: number; + workspaceId?: string; + serviceName: string; + }, +): void { + routeStore.registerRoute({ + hostname, + port, + workspaceId, + serviceName, + }); +} + +describe("service-route-branch-handler", () => { + it("updates routes on branch rename by removing old hostnames and registering new ones", () => { + const routeStore = new ServiceRouteStore(); + registerRoute(routeStore, { + hostname: "feature-auth.api.localhost", + port: 3001, + serviceName: "api", + }); + + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", "feature/auth", "feature/billing"); + + expect(routeStore.findRoute("feature-auth.api.localhost")).toBeNull(); + expect(routeStore.findRoute("feature-billing.api.localhost")).toEqual({ + hostname: "feature-billing.api.localhost", + port: 3001, + }); + }); + + it("is a no-op when the workspace has no routes", () => { + const routeStore = new ServiceRouteStore(); + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", "feature/auth", "feature/billing"); + + expect(routeStore.listRoutes()).toEqual([]); + expect(emitServiceStatusUpdate).not.toHaveBeenCalled(); + }); + + it("is a no-op when the resolved hostnames do not change", () => { + const routeStore = new ServiceRouteStore(); + registerRoute(routeStore, { + hostname: "api.localhost", + port: 3001, + serviceName: "api", + }); + + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = 
createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", "main", "master"); + + expect(routeStore.listRoutesForWorkspace("workspace-a")).toEqual([ + { + hostname: "api.localhost", + port: 3001, + workspaceId: "workspace-a", + serviceName: "api", + }, + ]); + expect(emitServiceStatusUpdate).not.toHaveBeenCalled(); + }); + + it("emits a status update with the refreshed route payload after a route change", () => { + const routeStore = new ServiceRouteStore(); + registerRoute(routeStore, { + hostname: "feature-auth.api.localhost", + port: 3001, + serviceName: "api", + }); + + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", "feature/auth", "feature/billing"); + + expect(emitServiceStatusUpdate).toHaveBeenCalledWith("workspace-a", [ + { + serviceName: "api", + hostname: "feature-billing.api.localhost", + port: 3001, + url: null, + lifecycle: "running", + health: null, + }, + ]); + }); + + it("updates all services for a workspace when multiple routes are registered", () => { + const routeStore = new ServiceRouteStore(); + registerRoute(routeStore, { + hostname: "feature-auth.api.localhost", + port: 3001, + serviceName: "api", + }); + registerRoute(routeStore, { + hostname: "feature-auth.web.localhost", + port: 3002, + serviceName: "web", + }); + registerRoute(routeStore, { + hostname: "docs.localhost", + port: 3003, + workspaceId: "workspace-b", + serviceName: "docs", + }); + + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", "feature/auth", "feature/billing"); + + expect(routeStore.listRoutesForWorkspace("workspace-a")).toEqual([ + { + hostname: "feature-billing.api.localhost", + port: 3001, + workspaceId: "workspace-a", + serviceName: 
"api", + }, + { + hostname: "feature-billing.web.localhost", + port: 3002, + workspaceId: "workspace-a", + serviceName: "web", + }, + ]); + expect(routeStore.listRoutesForWorkspace("workspace-b")).toEqual([ + { + hostname: "docs.localhost", + port: 3003, + workspaceId: "workspace-b", + serviceName: "docs", + }, + ]); + }); + + it("does not emit a status update when no changes are needed", () => { + const routeStore = new ServiceRouteStore(); + registerRoute(routeStore, { + hostname: "web.localhost", + port: 3002, + serviceName: "web", + }); + + const emitServiceStatusUpdate = vi.fn(); + const handleBranchChange = createBranchChangeRouteHandler({ + routeStore, + emitServiceStatusUpdate, + }); + + handleBranchChange("workspace-a", null, "main"); + + expect(emitServiceStatusUpdate).not.toHaveBeenCalled(); + }); +}); diff --git a/packages/server/src/server/service-route-branch-handler.ts b/packages/server/src/server/service-route-branch-handler.ts new file mode 100644 index 000000000..ed4039607 --- /dev/null +++ b/packages/server/src/server/service-route-branch-handler.ts @@ -0,0 +1,70 @@ +import type { Logger } from "pino"; +import type { WorkspaceServicePayload } from "../shared/messages.js"; +import { buildServiceHostname } from "../utils/service-hostname.js"; +import { buildWorkspaceServicePayloads } from "./service-status-projection.js"; +import type { ServiceRouteEntry, ServiceRouteStore } from "./service-proxy.js"; + +interface BranchChangeRouteHandlerOptions { + routeStore: ServiceRouteStore; + emitServiceStatusUpdate: ( + workspaceId: string, + services: WorkspaceServicePayload[], + ) => void; + logger?: Logger; +} + +interface RouteHostnameUpdate { + oldHostname: string; + newHostname: string; + route: ServiceRouteEntry; +} + +export function createBranchChangeRouteHandler( + options: BranchChangeRouteHandlerOptions, +): (workspaceId: string, oldBranch: string | null, newBranch: string | null) => void { + return (workspaceId, _oldBranch, newBranch) => { + 
const routes = options.routeStore.listRoutesForWorkspace(workspaceId);
+    if (routes.length === 0) {
+      return;
+    }
+
+    const updates: RouteHostnameUpdate[] = [];
+    for (const route of routes) {
+      const newHostname = buildServiceHostname(newBranch, route.serviceName);
+      if (newHostname !== route.hostname) {
+        updates.push({
+          oldHostname: route.hostname,
+          newHostname,
+          route,
+        });
+      }
+    }
+
+    if (updates.length === 0) {
+      return;
+    }
+
+    for (const { oldHostname, newHostname, route } of updates) {
+      options.routeStore.removeRoute(oldHostname);
+      options.routeStore.registerRoute({
+        hostname: newHostname,
+        port: route.port,
+        workspaceId: route.workspaceId,
+        serviceName: route.serviceName,
+      });
+      options.logger?.info(
+        {
+          oldHostname,
+          newHostname,
+          serviceName: route.serviceName,
+        },
+        "Updated service route for branch rename",
+      );
+    }
+
+    options.emitServiceStatusUpdate(
+      workspaceId,
+      buildWorkspaceServicePayloads(options.routeStore, workspaceId, null),
+    );
+  };
+}
diff --git a/packages/server/src/server/service-status-projection.test.ts b/packages/server/src/server/service-status-projection.test.ts
new file mode 100644
index 000000000..7d6de16f8
--- /dev/null
+++ b/packages/server/src/server/service-status-projection.test.ts
@@ -0,0 +1,232 @@
+import { describe, expect, it, vi } from "vitest";
+import { mkdtempSync, realpathSync, rmSync, writeFileSync } from "node:fs";
+import path from "node:path";
+import { tmpdir } from "node:os";
+import { execSync } from "node:child_process";
+import { ServiceRouteStore } from "./service-proxy.js";
+import {
+  buildWorkspaceServicePayloads,
+  createServiceStatusEmitter,
+} from "./service-status-projection.js";
+
+function createWorkspaceRepo(options?: {
+  branchName?: string;
+  paseoConfig?: Record<string, unknown>;
+}): { tempDir: string; repoDir: string; cleanup: () => void } {
+  const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "service-projection-")));
+  const repoDir = path.join(tempDir, "repo");
+  
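The branch-rename tests above pin down the hostname expectations: `feature/auth` maps to `feature-auth.api.localhost`, while `main`, `master`, and a null branch all map to the bare `api.localhost`. `buildServiceHostname` itself lives outside this diff, so the sketch below is only an inference from those test cases; the real slug rules may differ:

```typescript
// Hostname rule inferred from the branch-handler tests above. Default
// branches get no prefix; anything else is slugified ("feature/auth"
// becomes "feature-auth") and prepended to "<service>.localhost".
// This is a sketch, not the project's actual buildServiceHostname.
function serviceHostnameSketch(branch: string | null, serviceName: string): string {
  if (branch === null || branch === "main" || branch === "master") {
    return `${serviceName}.localhost`;
  }
  const slug = branch
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse separators like "/" into "-"
    .replace(/^-+|-+$/g, ""); // trim stray leading/trailing dashes
  return `${slug}.${serviceName}.localhost`;
}
```

Under this rule a rename from `feature/auth` to `feature/billing` changes every hostname, which is exactly the case where the handler re-registers routes and emits a status update, while `main` to `master` changes nothing and the handler stays a no-op.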
execSync(`mkdir -p ${JSON.stringify(repoDir)}`); + execSync(`git init -b ${options?.branchName ?? "main"}`, { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.email 'test@test.com'", { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.name 'Test'", { cwd: repoDir, stdio: "pipe" }); + writeFileSync(path.join(repoDir, "README.md"), "hello\n"); + if (options?.paseoConfig) { + writeFileSync(path.join(repoDir, "paseo.json"), JSON.stringify(options.paseoConfig, null, 2)); + } + execSync("git add .", { cwd: repoDir, stdio: "pipe" }); + execSync("git -c commit.gpgsign=false commit -m 'initial'", { cwd: repoDir, stdio: "pipe" }); + + return { + tempDir, + repoDir, + cleanup: () => { + rmSync(tempDir, { recursive: true, force: true }); + }, + }; +} + +describe("service-status-projection", () => { + it("shows configured services even before they have routes", () => { + const workspace = createWorkspaceRepo({ + paseoConfig: { + services: { + api: { command: "npm run api" }, + web: { command: "npm run web", port: 3000 }, + }, + }, + }); + const routeStore = new ServiceRouteStore(); + + try { + expect(buildWorkspaceServicePayloads(routeStore, workspace.repoDir, 6767)).toEqual([ + { + serviceName: "api", + hostname: "api.localhost", + port: null, + url: "http://api.localhost:6767", + lifecycle: "stopped", + health: null, + }, + { + serviceName: "web", + hostname: "web.localhost", + port: 3000, + url: "http://web.localhost:6767", + lifecycle: "stopped", + health: null, + }, + ]); + } finally { + workspace.cleanup(); + } + }); + + it("uses the active route port and branch-aware hostname for running services", () => { + const workspace = createWorkspaceRepo({ + branchName: "feature/card", + paseoConfig: { + services: { + web: { command: "npm run web" }, + }, + }, + }); + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "feature-card.web.localhost", + port: 4321, + workspaceId: workspace.repoDir, + serviceName: "web", + }); + + 
try { + expect(buildWorkspaceServicePayloads(routeStore, workspace.repoDir, 6767)).toEqual([ + { + serviceName: "web", + hostname: "feature-card.web.localhost", + port: 4321, + url: "http://feature-card.web.localhost:6767", + lifecycle: "running", + health: null, + }, + ]); + } finally { + workspace.cleanup(); + } + }); + + it("includes orphaned active routes even if the current config no longer declares them", () => { + const workspace = createWorkspaceRepo(); + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "docs.localhost", + port: 3002, + workspaceId: workspace.repoDir, + serviceName: "docs", + }); + + try { + expect(buildWorkspaceServicePayloads(routeStore, workspace.repoDir, 6767)).toEqual([ + { + serviceName: "docs", + hostname: "docs.localhost", + port: 3002, + url: "http://docs.localhost:6767", + lifecycle: "running", + health: null, + }, + ]); + } finally { + workspace.cleanup(); + } + }); + + it("createServiceStatusEmitter overlays health onto the full workspace service list", () => { + const workspace = createWorkspaceRepo({ + paseoConfig: { + services: { + api: { command: "npm run api" }, + web: { command: "npm run web" }, + }, + }, + }); + const routeStore = new ServiceRouteStore(); + routeStore.registerRoute({ + hostname: "api.localhost", + port: 3001, + workspaceId: workspace.repoDir, + serviceName: "api", + }); + + const session = { emit: vi.fn() }; + const emitUpdate = createServiceStatusEmitter({ + sessions: () => [session], + routeStore, + daemonPort: 6767, + }); + + try { + emitUpdate(workspace.repoDir, [ + { + serviceName: "api", + hostname: "api.localhost", + port: 3001, + health: "healthy", + }, + ]); + + expect(session.emit).toHaveBeenCalledWith({ + type: "service_status_update", + payload: { + workspaceId: workspace.repoDir, + services: [ + { + serviceName: "api", + hostname: "api.localhost", + port: 3001, + url: "http://api.localhost:6767", + lifecycle: "running", + health: "healthy", + }, + { + 
serviceName: "web", + hostname: "web.localhost", + port: null, + url: "http://web.localhost:6767", + lifecycle: "stopped", + health: null, + }, + ], + }, + }); + } finally { + workspace.cleanup(); + } + }); + + it("computes URLs with and without a daemon port", () => { + const workspace = createWorkspaceRepo({ + paseoConfig: { + services: { + api: { command: "npm run api" }, + }, + }, + }); + const routeStore = new ServiceRouteStore(); + + try { + expect(buildWorkspaceServicePayloads(routeStore, workspace.repoDir, 6767)).toEqual([ + { + serviceName: "api", + hostname: "api.localhost", + port: null, + url: "http://api.localhost:6767", + lifecycle: "stopped", + health: null, + }, + ]); + + expect(buildWorkspaceServicePayloads(routeStore, workspace.repoDir, null)).toEqual([ + { + serviceName: "api", + hostname: "api.localhost", + port: null, + url: null, + lifecycle: "stopped", + health: null, + }, + ]); + } finally { + workspace.cleanup(); + } + }); +}); diff --git a/packages/server/src/server/service-status-projection.ts b/packages/server/src/server/service-status-projection.ts new file mode 100644 index 000000000..0be592a67 --- /dev/null +++ b/packages/server/src/server/service-status-projection.ts @@ -0,0 +1,135 @@ +import type { + ServiceStatusUpdateMessage, + SessionOutboundMessage, + WorkspaceServicePayload, +} from "../shared/messages.js"; +import { buildServiceHostname } from "../utils/service-hostname.js"; +import { getServiceConfigs } from "../utils/worktree.js"; +import { readGitCommand } from "./workspace-git-metadata.js"; +import type { ServiceHealthEntry } from "./service-health-monitor.js"; +import type { ServiceRouteEntry, ServiceRouteStore } from "./service-proxy.js"; + +type SessionEmitter = { + emit(message: SessionOutboundMessage): void; +}; + +function resolveDaemonPort(daemonPort: number | null | (() => number | null)): number | null { + if (typeof daemonPort === "function") { + return daemonPort(); + } + return daemonPort; +} + +function 
toServiceUrl(hostname: string, daemonPort: number | null): string | null {
+  if (daemonPort === null) {
+    return null;
+  }
+  return `http://${hostname}:${daemonPort}`;
+}
+
+type ConfiguredWorkspaceService = {
+  serviceName: string;
+  hostname: string;
+  port: number | null;
+};
+
+function resolveWorkspaceBranchName(workspaceDirectory: string): string | null {
+  return readGitCommand(workspaceDirectory, "git symbolic-ref --short HEAD");
+}
+
+function listConfiguredWorkspaceServices(workspaceDirectory: string): ConfiguredWorkspaceService[] {
+  const branchName = resolveWorkspaceBranchName(workspaceDirectory);
+  const serviceConfigs = getServiceConfigs(workspaceDirectory);
+  return Array.from(serviceConfigs.entries()).map(([serviceName, config]) => ({
+    serviceName,
+    hostname: buildServiceHostname(branchName, serviceName),
+    port: config.port ?? null,
+  }));
+}
+
+function mergeWorkspaceServiceDefinitions(
+  workspaceDirectory: string,
+  routeStore: ServiceRouteStore,
+): Array<ConfiguredWorkspaceService | ServiceRouteEntry> {
+  const merged = new Map<string, ConfiguredWorkspaceService | ServiceRouteEntry>();
+
+  for (const service of listConfiguredWorkspaceServices(workspaceDirectory)) {
+    merged.set(service.hostname, service);
+  }
+
+  for (const route of routeStore.listRoutesForWorkspace(workspaceDirectory)) {
+    merged.set(route.hostname, route);
+  }
+
+  return Array.from(merged.values()).sort((left, right) =>
+    left.serviceName.localeCompare(right.serviceName, undefined, {
+      numeric: true,
+      sensitivity: "base",
+    }),
+  );
+}
+
+export function buildWorkspaceServicePayloads(
+  routeStore: ServiceRouteStore,
+  workspaceDirectory: string,
+  daemonPort: number | null,
+  resolveHealth?: (hostname: string) => "healthy" | "unhealthy" | null,
+): WorkspaceServicePayload[] {
+  return mergeWorkspaceServiceDefinitions(workspaceDirectory, routeStore).map((service) => {
+    const route = routeStore.getRouteEntry(service.hostname);
+    return {
+      serviceName: service.serviceName,
+      hostname: service.hostname,
+      port: route?.port ?? 
service.port, + url: toServiceUrl(service.hostname, daemonPort), + lifecycle: route ? "running" : "stopped", + health: resolveHealth?.(service.hostname) ?? null, + }; + }); +} + +function buildServiceStatusUpdateMessage(params: { + workspaceId: string; + services: WorkspaceServicePayload[]; +}): ServiceStatusUpdateMessage { + return { + type: "service_status_update", + payload: { + workspaceId: params.workspaceId, + services: params.services, + }, + }; +} + +export function createServiceStatusEmitter({ + sessions, + routeStore, + daemonPort, +}: { + sessions: () => SessionEmitter[]; + routeStore: ServiceRouteStore; + daemonPort: number | null | (() => number | null); +}): (workspaceId: string, services: ServiceHealthEntry[]) => void { + return (workspaceId, services) => { + const resolvedDaemonPort = resolveDaemonPort(daemonPort); + const serviceHealthByHostname = new Map( + services.map((service) => [service.hostname, service.health] as const), + ); + + const projected = buildWorkspaceServicePayloads( + routeStore, + workspaceId, + resolvedDaemonPort, + (hostname) => serviceHealthByHostname.get(hostname) ?? 
null,
+    );
+
+    const message = buildServiceStatusUpdateMessage({
+      workspaceId,
+      services: projected,
+    });
+
+    for (const session of sessions()) {
+      session.emit(message);
+    }
+  };
+}
diff --git a/packages/server/src/server/session.provider-history-compatibility-ownership.test.ts b/packages/server/src/server/session.provider-history-compatibility-ownership.test.ts
new file mode 100644
index 000000000..287df24ca
--- /dev/null
+++ b/packages/server/src/server/session.provider-history-compatibility-ownership.test.ts
@@ -0,0 +1,389 @@
+import { describe, expect, test, vi } from "vitest";
+
+import { Session } from "./session.js";
+
+function createStoredAgentRecord(overrides?: Partial<Record<string, unknown>>) {
+  return {
+    id: "agent-1",
+    provider: "codex",
+    cwd: "/tmp/project",
+    createdAt: "2026-03-24T00:00:00.000Z",
+    updatedAt: "2026-03-24T00:00:00.000Z",
+    title: null,
+    labels: {},
+    lastStatus: "idle",
+    config: null,
+    persistence: {
+      provider: "codex",
+      sessionId: "provider-session-1",
+    },
+    archivedAt: null,
+    requiresAttention: false,
+    attentionReason: null,
+    attentionTimestamp: null,
+    ...overrides,
+  };
+}
+
+function createCompatibilitySnapshot(overrides?: Partial<Record<string, unknown>>) {
+  return {
+    id: "agent-1",
+    provider: "codex",
+    cwd: "/tmp/project",
+    persistence: {
+      provider: "codex",
+      sessionId: "provider-session-1",
+    },
+    ...overrides,
+  };
+}
+
+function createSessionForOwnershipTests(options?: {
+  agentLoadingService?: {
+    ensureAgentLoaded?: (options: { agentId: string }) => Promise<unknown>;
+    resumeAgent?: (options: {
+      handle: { provider: string; sessionId: string };
+      overrides?: Record<string, unknown>;
+    }) => Promise<unknown>;
+    refreshAgent?: (options: { agentId: string }) => Promise<unknown>;
+  };
+  storedRecord?: Record<string, unknown> | null;
+  loadedAgent?: Record<string, unknown> | null;
+  timelineRows?: Array<{ seq: number; item: Record<string, unknown>; timestamp: Date }>;
+}) {
+  const emitted: Array<{ type: string; payload: unknown }> = [];
+  const logger = {
+    child: () => logger,
+    trace: vi.fn(),
+    debug: vi.fn(),
+    info: vi.fn(),
+    warn: 
vi.fn(), + error: vi.fn(), + }; + + const agentManager = { + subscribe: () => () => {}, + listAgents: () => [], + getAgent: vi.fn(() => options?.loadedAgent ?? null), + createAgent: vi.fn(async () => { + throw new Error("Session should delegate unloaded bootstrap to AgentLoadingService"); + }), + resumeAgentFromPersistence: vi.fn(async () => { + throw new Error("Session should delegate persistence resume to AgentLoadingService"); + }), + reloadAgentSession: vi.fn(async () => { + throw new Error("Session should delegate refresh reload to AgentLoadingService"); + }), + hydrateTimelineFromProvider: vi.fn(async () => { + throw new Error("Session should not call hydrateTimelineFromProvider directly"); + }), + fetchTimeline: vi.fn(async () => ({ + rows: options?.timelineRows ?? [], + hasOlder: false, + hasNewer: false, + })), + recordUserMessage: vi.fn(), + waitForAgentRunStart: vi.fn(async () => undefined), + getTimeline: vi.fn(() => []), + }; + + const session = new Session({ + clientId: "test-client", + onMessage: (message) => emitted.push(message as any), + logger: logger as any, + downloadTokenStore: {} as any, + pushTokenStore: {} as any, + paseoHome: "/tmp/paseo-test", + agentManager: agentManager as any, + agentStorage: { + list: async () => (options?.storedRecord ? [options.storedRecord as any] : []), + get: async () => (options?.storedRecord as any) ?? 
null, + } as any, + projectRegistry: { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => [], + get: async () => null, + upsert: async () => {}, + archive: async () => {}, + remove: async () => {}, + } as any, + workspaceRegistry: { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => [], + get: async () => null, + upsert: async () => {}, + archive: async () => {}, + remove: async () => {}, + } as any, + createAgentMcpTransport: async () => { + throw new Error("not used"); + }, + stt: null, + tts: null, + terminalManager: null, + agentLoadingService: options?.agentLoadingService, + } as any) as any; + + return { session, emitted, agentManager }; +} + +describe("provider history compatibility ownership", () => { + test("fetch_agent_timeline_request delegates unloaded bootstrap through the compatibility seam", async () => { + const ensureAgentLoaded = vi.fn(async () => createCompatibilitySnapshot()); + const { session, emitted } = createSessionForOwnershipTests({ + storedRecord: createStoredAgentRecord(), + timelineRows: [ + { + seq: 1, + item: { type: "assistant_message", text: "rehydrated from provider history" }, + timestamp: new Date("2026-03-24T00:00:01.000Z"), + }, + ], + agentLoadingService: { + ensureAgentLoaded, + }, + }); + + session.buildAgentPayload = vi.fn(async () => ({ id: "agent-1" })); + + await session.handleMessage({ + type: "fetch_agent_timeline_request", + requestId: "req-fetch", + agentId: "agent-1", + }); + + expect(ensureAgentLoaded).toHaveBeenCalledWith({ agentId: "agent-1" }); + expect(emitted).toContainEqual({ + type: "fetch_agent_timeline_response", + payload: expect.objectContaining({ + requestId: "req-fetch", + agentId: "agent-1", + error: null, + entries: [ + expect.objectContaining({ + seq: 1, + }), + ], + }), + }); + }); + + test("fetch_agent_timeline_request emits legacy compatibility fields for older clients", async () => { + const ensureAgentLoaded = vi.fn(async () => 
createCompatibilitySnapshot()); + const { session, emitted } = createSessionForOwnershipTests({ + storedRecord: createStoredAgentRecord(), + timelineRows: [ + { + seq: 7, + item: { type: "assistant_message", text: "compat row" }, + timestamp: new Date("2026-03-24T00:00:07.000Z"), + }, + ], + agentLoadingService: { + ensureAgentLoaded, + }, + }); + + session.buildAgentPayload = vi.fn(async () => ({ id: "agent-1" })); + + await session.handleMessage({ + type: "fetch_agent_timeline_request", + requestId: "req-compat", + agentId: "agent-1", + direction: "tail", + }); + + expect(emitted).toContainEqual({ + type: "fetch_agent_timeline_response", + payload: expect.objectContaining({ + requestId: "req-compat", + agentId: "agent-1", + epoch: "compat:agent-1", + reset: false, + staleCursor: false, + gap: false, + projection: "projected", + startCursor: { seq: 7, epoch: "compat:agent-1" }, + endCursor: { seq: 7, epoch: "compat:agent-1" }, + window: { + minSeq: 7, + maxSeq: 7, + nextSeq: 8, + startSeq: 7, + endSeq: 7, + hasOlder: false, + hasNewer: false, + }, + entries: [ + expect.objectContaining({ + seq: 7, + seqStart: 7, + seqEnd: 7, + sourceSeqRanges: [{ startSeq: 7, endSeq: 7 }], + collapsed: [], + }), + ], + }), + }); + }); + + test("send_agent_message_request delegates unloaded bootstrap before recording and streaming", async () => { + const ensureAgentLoaded = vi.fn(async () => createCompatibilitySnapshot()); + const { session, agentManager, emitted } = createSessionForOwnershipTests({ + storedRecord: createStoredAgentRecord(), + agentLoadingService: { + ensureAgentLoaded, + }, + }); + + session.resolveAgentIdentifier = vi.fn(async () => ({ ok: true, agentId: "agent-1" })); + session.unarchiveAgentState = vi.fn(async () => true); + session.buildAgentPrompt = vi.fn((text: string) => text); + session.startAgentStream = vi.fn(() => ({ ok: true })); + + await session.handleMessage({ + type: "send_agent_message_request", + requestId: "req-send", + agentId: "agent-1", + 
text: "hello", + images: [], + messageId: "msg-1", + }); + + expect(ensureAgentLoaded).toHaveBeenCalledWith({ agentId: "agent-1" }); + expect(ensureAgentLoaded.mock.invocationCallOrder[0]).toBeLessThan( + agentManager.recordUserMessage.mock.invocationCallOrder[0], + ); + expect(agentManager.recordUserMessage).toHaveBeenCalledWith("agent-1", "hello", { + messageId: "msg-1", + emitState: false, + }); + expect(session.startAgentStream).toHaveBeenCalledWith("agent-1", "hello"); + expect(emitted).toContainEqual({ + type: "send_agent_message_response", + payload: { + requestId: "req-send", + agentId: "agent-1", + accepted: true, + error: null, + }, + }); + }); + + test("resume_agent_request delegates persistence bootstrap through the compatibility seam", async () => { + const resumeAgent = vi.fn(async () => createCompatibilitySnapshot()); + const { session, emitted } = createSessionForOwnershipTests({ + agentLoadingService: { + resumeAgent, + }, + }); + + session.unarchiveAgentByHandle = vi.fn(async () => undefined); + session.unarchiveAgentState = vi.fn(async () => true); + session.forwardAgentUpdate = vi.fn(async () => undefined); + session.getAgentPayloadById = vi.fn(async () => ({ id: "agent-1" })); + + await session.handleMessage({ + type: "resume_agent_request", + requestId: "req-resume", + handle: { + provider: "codex", + sessionId: "provider-session-1", + }, + overrides: { + model: "gpt-5.4", + }, + }); + + expect(resumeAgent).toHaveBeenCalledWith({ + handle: { + provider: "codex", + sessionId: "provider-session-1", + }, + overrides: { + model: "gpt-5.4", + }, + }); + expect(emitted).toContainEqual({ + type: "status", + payload: expect.objectContaining({ + status: "agent_resumed", + requestId: "req-resume", + agentId: "agent-1", + }), + }); + }); + + test("refresh_agent_request delegates loaded persisted refresh through the compatibility seam", async () => { + const refreshAgent = vi.fn(async () => + createCompatibilitySnapshot({ + persistence: { + provider: 
"codex",
+          sessionId: "provider-session-1",
+        },
+      }),
+    );
+    const { session, emitted } = createSessionForOwnershipTests({
+      loadedAgent: createCompatibilitySnapshot(),
+      agentLoadingService: {
+        refreshAgent,
+      },
+    });
+
+    session.unarchiveAgentState = vi.fn(async () => true);
+    session.interruptAgentIfRunning = vi.fn(async () => undefined);
+    session.forwardAgentUpdate = vi.fn(async () => undefined);
+
+    await session.handleMessage({
+      type: "refresh_agent_request",
+      requestId: "req-refresh-loaded",
+      agentId: "agent-1",
+    });
+
+    expect(session.interruptAgentIfRunning).toHaveBeenCalledWith("agent-1");
+    expect(refreshAgent).toHaveBeenCalledWith({ agentId: "agent-1" });
+    expect(emitted).toContainEqual({
+      type: "status",
+      payload: {
+        status: "agent_refreshed",
+        requestId: "req-refresh-loaded",
+        agentId: "agent-1",
+        timelineSize: 0,
+      },
+    });
+  });
+
+  test("refresh_agent_request delegates unloaded persisted refresh through the compatibility seam", async () => {
+    const refreshAgent = vi.fn(async () => createCompatibilitySnapshot());
+    const { session, emitted } = createSessionForOwnershipTests({
+      storedRecord: createStoredAgentRecord(),
+      agentLoadingService: {
+        refreshAgent,
+      },
+    });
+
+    session.unarchiveAgentState = vi.fn(async () => true);
+    session.interruptAgentIfRunning = vi.fn(async () => undefined);
+    session.forwardAgentUpdate = vi.fn(async () => undefined);
+
+    await session.handleMessage({
+      type: "refresh_agent_request",
+      requestId: "req-refresh-unloaded",
+      agentId: "agent-1",
+    });
+
+    expect(session.interruptAgentIfRunning).not.toHaveBeenCalled();
+    expect(refreshAgent).toHaveBeenCalledWith({ agentId: "agent-1" });
+    expect(emitted).toContainEqual({
+      type: "status",
+      payload: {
+        status: "agent_refreshed",
+        requestId: "req-refresh-unloaded",
+        agentId: "agent-1",
+        timelineSize: 0,
+      },
+    });
+  });
+});
diff --git a/packages/server/src/server/session.ts b/packages/server/src/server/session.ts
index b60151273..dba2a9a86 100644
--- a/packages/server/src/server/session.ts
+++ b/packages/server/src/server/session.ts
@@ -1,6 +1,6 @@
 import { v4 as uuidv4 } from "uuid";
 import { watch, type FSWatcher } from "node:fs";
-import { readFile, stat } from "fs/promises";
+import { readFile } from "fs/promises";
 import { exec, execFile } from "node:child_process";
 import { promisify } from "util";
 import { join, resolve, sep } from "path";
@@ -19,6 +19,7 @@ import {
   type SubscribeTerminalsRequest,
   type UnsubscribeTerminalsRequest,
   type CreateTerminalRequest,
+  type StartWorkspaceServiceRequest,
   type SubscribeTerminalRequest,
   type UnsubscribeTerminalRequest,
   type TerminalInput,
@@ -30,6 +31,7 @@ import {
   type DirectorySuggestionsRequest,
   type EditorTargetId,
   type ProjectPlacementPayload,
+  type WorkspaceSetupSnapshot,
   type WorkspaceDescriptorPayload,
   type WorkspaceStateBucket,
 } from "./messages.js";
@@ -58,13 +60,14 @@ import {
   type VoiceTurnController,
 } from "./voice/voice-turn-controller.js";
 import {
-  buildConfigOverrides,
-  buildSessionConfig,
-  extractTimestamps,
+  toAgentPersistenceHandle,
 } from "./persistence-hooks.js";
 import { experimental_createMCPClient } from "ai";
 import type { Transport } from "@modelcontextprotocol/sdk/shared/transport.js";
 import type { VoiceCallerContext, VoiceMcpStdioConfig, VoiceSpeakHandler } from "./voice-types.js";
+import { buildWorkspaceServicePayloads } from "./service-status-projection.js";
+import { spawnWorkspaceService } from "./worktree-bootstrap.js";
+import { readGitCommand } from "./workspace-git-metadata.js";
 import { BackgroundGitFetchManager } from "./background-git-fetch-manager.js";
 
 export type AgentMcpTransportFactory = () => Promise<Transport>;
@@ -84,11 +87,6 @@ import {
   appendTimelineItemIfAgentKnown,
   emitLiveTimelineItemIfAgentKnown,
 } from "./agent/timeline-append.js";
-import {
-  projectTimelineRows,
-  selectTimelineWindowByProjectedLimit,
-  type TimelineProjectionMode,
-} from "./agent/timeline-projection.js";
 import {
   DEFAULT_STRUCTURED_GENERATION_PROVIDERS,
   StructuredAgentFallbackError,
@@ -107,28 +105,17 @@ import type {
   AgentPersistenceHandle,
   ProviderSnapshotEntry,
 } from "./agent/agent-sdk-types.js";
-import { AgentStorage, type StoredAgentRecord } from "./agent/agent-storage.js";
+import type { StoredAgentRecord } from "./agent/agent-storage.js";
+import type { AgentSnapshotStore } from "./agent/agent-snapshot-store.js";
 import { isValidAgentProvider, AGENT_PROVIDER_IDS } from "./agent/provider-manifest.js";
-import {
-  buildProjectPlacementForCwd,
-  detectStaleWorkspaces,
-  deriveProjectKind,
-  deriveProjectRootPath,
-  deriveWorkspaceId,
-  deriveWorkspaceDisplayName,
-  deriveWorkspaceKind,
-  normalizeWorkspaceId as normalizePersistedWorkspaceId,
-} from "./workspace-registry-model.js";
+import { normalizeWorkspaceId as normalizePersistedWorkspaceId } from "./workspace-registry-model.js";
 import type {
   PersistedProjectRecord,
   PersistedWorkspaceRecord,
   ProjectRegistry,
   WorkspaceRegistry,
 } from "./workspace-registry.js";
-import {
-  createPersistedProjectRecord,
-  createPersistedWorkspaceRecord,
-} from "./workspace-registry.js";
+import { AgentLoadingService } from "./agent-loading-service.js";
 import {
   buildVoiceAgentMcpServerConfig,
   buildVoiceModeSystemPrompt,
@@ -147,10 +134,12 @@ import {
   type WorktreeConfig,
 } from "../utils/worktree.js";
 import { runAsyncWorktreeBootstrap } from "./worktree-bootstrap.js";
+import type { ServiceRouteStore } from "./service-proxy.js";
 import {
   getCheckoutDiff,
-  getCheckoutShortstat,
+  getCachedCheckoutShortstat,
   getCheckoutStatus,
+  getCheckoutStatusLite,
   listBranchSuggestions,
   commitChanges,
   mergeToBase,
@@ -158,6 +147,7 @@ import {
   pushCurrentBranch,
   createPullRequest,
   getPullRequestStatus,
+  warmCheckoutShortstatInBackground,
 } from "../utils/checkout-git.js";
 import { getProjectIcon } from "../utils/project-icon.js";
 import { expandTilde } from "../utils/path.js";
@@ -168,6 +158,7 @@ import {
   toCheckoutError,
 } from "./checkout-git-utils.js";
 import { CheckoutDiffManager } from "./checkout-diff-manager.js";
+import { detectWorkspaceGitMetadata } from "./workspace-git-metadata.js";
 import type { LocalSpeechModelId } from "./speech/providers/local/models.js";
 import { toResolver, type Resolvable } from "./speech/provider-resolver.js";
 import type { SpeechReadinessSnapshot, SpeechReadinessState } from "./speech/speech-runtime.js";
@@ -187,42 +178,34 @@ import {
   handleCreatePaseoWorktreeRequest as handleCreateWorktreeRequest,
   handlePaseoWorktreeArchiveRequest as handleWorktreeArchiveRequest,
   handlePaseoWorktreeListRequest as handleWorktreeListRequest,
+  handleWorkspaceSetupStatusRequest as handleWorkspaceSetupStatusRequestMessage,
   killTerminalsUnderPath as killWorktreeTerminalsUnderPath,
-  registerPendingWorktreeWorkspace as registerPendingWorktreeWorkspaceSession,
 } from "./worktree-session.js";
 
 const execAsync = promisify(exec);
 const execFileAsync = promisify(execFile);
 
 const MAX_INITIAL_AGENT_TITLE_CHARS = Math.min(60, MAX_EXPLICIT_AGENT_TITLE_CHARS);
-const pendingAgentInitializations = new Map<string, Promise<ManagedAgent>>();
 const DEFAULT_AGENT_PROVIDER = AGENT_PROVIDER_IDS[0];
-
-// TODO: Remove once all app store clients are on >=0.1.45 and understand arbitrary provider strings.
-// Clients before 0.1.45 validate providers with z.enum(["claude", "codex", "opencode"]) and reject
-// the entire session message if they encounter an unknown provider.
 const LEGACY_PROVIDER_IDS = new Set(["claude", "codex", "opencode"]);
 const MIN_VERSION_ALL_PROVIDERS = "0.1.45";
-
-function clientSupportsAllProviders(appVersion: string | null): boolean {
-  if (!appVersion) return false;
-  // Strip RC/prerelease suffix: "0.1.45-rc.4" → "0.1.45"
-  const base = appVersion.replace(/-.*$/, "");
-  const parts = base.split(".").map(Number);
-  const minParts = MIN_VERSION_ALL_PROVIDERS.split(".").map(Number);
-  for (let i = 0; i < minParts.length; i++) {
-    const a = parts[i] ?? 0;
-    const b = minParts[i] ?? 0;
-    if (a > b) return true;
-    if (a < b) return false;
-  }
-  return true;
-}
-
 const WORKSPACE_GIT_WATCH_DEBOUNCE_MS = 500;
 const WORKSPACE_GIT_WATCH_REMOVED_FINGERPRINT = "__removed__";
 const TERMINAL_STREAM_HIGH_WATER_BYTES = 256 * 1024;
 const TERMINAL_STREAM_LOW_WATER_BYTES = 16 * 1024;
 const MAX_TERMINAL_STREAM_SLOTS = 256;
+const pendingAgentInitializations = new Map<string, Promise<ManagedAgent>>();
+
+type DeleteFencedAgentSnapshotStore = AgentSnapshotStore & {
+  beginDelete(agentId: string): void;
+};
+
+function beginAgentDeleteIfSupported(agentStorage: AgentSnapshotStore, agentId: string): void {
+  if ("beginDelete" in agentStorage && typeof agentStorage.beginDelete === "function") {
+    (agentStorage as DeleteFencedAgentSnapshotStore).beginDelete(agentId);
+  }
+}
 
 function deriveInitialAgentTitle(prompt: string): string | null {
   const firstContentLine = prompt
@@ -240,6 +223,20 @@ function deriveInitialAgentTitle(prompt: string): string | null {
   return clamped.length > 0 ? clamped : null;
 }
 
+function clientSupportsAllProviders(appVersion: string | null): boolean {
+  if (!appVersion) return false;
+  const base = appVersion.replace(/-.*$/, "");
+  const parts = base.split(".").map(Number);
+  const minParts = MIN_VERSION_ALL_PROVIDERS.split(".").map(Number);
+  for (let i = 0; i < minParts.length; i++) {
+    const current = parts[i] ?? 0;
+    const minimum = minParts[i] ?? 0;
+    if (current > minimum) return true;
+    if (current < minimum) return false;
+  }
+  return true;
+}
+
 export function resolveCreateAgentTitles(options: {
   configTitle?: string | null;
   initialPrompt?: string | null;
@@ -278,6 +275,7 @@ type WorkspaceGitWatchTarget = {
   refreshPromise: Promise<void> | null;
   refreshQueued: boolean;
   latestFingerprint: string | null;
+  lastBranchName: string | null;
 };
 
 type ActiveTerminalStream = {
@@ -394,7 +392,7 @@ type VoiceTranscriptionResultPayload = {
 
 export type SessionOptions = {
   clientId: string;
-  appVersion: string | null;
+  appVersion?: string | null;
   onMessage: (msg: SessionOutboundMessage) => void;
   onBinaryMessage?: (frame: Uint8Array) => void;
   getBinaryBufferedAmount?: () => number;
@@ -404,19 +402,28 @@ export type SessionOptions = {
   pushTokenStore: PushTokenStore;
   paseoHome: string;
   agentManager: AgentManager;
-  agentStorage: AgentStorage;
+  agentStorage: AgentSnapshotStore;
   projectRegistry: ProjectRegistry;
   workspaceRegistry: WorkspaceRegistry;
   chatService: FileBackedChatService;
   scheduleService: ScheduleService;
   loopService: LoopService;
   checkoutDiffManager: CheckoutDiffManager;
+  agentLoadingService?: AgentLoadingService;
   backgroundGitFetchManager: BackgroundGitFetchManager;
   createAgentMcpTransport: AgentMcpTransportFactory;
   stt: Resolvable;
   tts: Resolvable;
   terminalManager: TerminalManager | null;
   providerSnapshotManager?: ProviderSnapshotManager;
+  serviceRouteStore?: ServiceRouteStore;
+  onBranchChanged?: (
+    workspaceId: string,
+    oldBranch: string | null,
+    newBranch: string | null,
+  ) => void;
+  getDaemonTcpPort?: () => number | null;
+  resolveServiceHealth?: (hostname: string) => "healthy" | "unhealthy" | null;
   voice?: {
     voiceAgentMcpStdio?: VoiceMcpStdioConfig | null;
     turnDetection?: Resolvable;
@@ -517,30 +524,6 @@ function coerceAgentProvider(logger: pino.Logger, value: string, agentId?: strin
   return DEFAULT_AGENT_PROVIDER;
 }
 
-function toAgentPersistenceHandle(
-  logger: pino.Logger,
-  handle: StoredAgentRecord["persistence"],
-): AgentPersistenceHandle | null {
-  if (!handle) {
-    return null;
-  }
-  const provider = handle.provider;
-  if (!isValidAgentProvider(provider)) {
-    logger.warn({ provider }, `Ignoring persistence handle with unknown provider '${provider}'`);
-    return null;
-  }
-  if (!handle.sessionId) {
-    logger.warn("Ignoring persistence handle missing sessionId");
-    return null;
-  }
-  return {
-    provider,
-    sessionId: handle.sessionId,
-    nativeHandle: handle.nativeHandle,
-    metadata: handle.metadata,
-  } satisfies AgentPersistenceHandle;
-}
-
 /**
  * Session represents a single connected client session.
  * It owns all state management, orchestration logic, and message processing.
@@ -590,13 +573,14 @@ export class Session {
   private agentMcpClient: Awaited<ReturnType<typeof experimental_createMCPClient>> | null = null;
   private agentTools: ToolSet | null = null;
   private agentManager: AgentManager;
-  private readonly agentStorage: AgentStorage;
+  private readonly agentStorage: AgentSnapshotStore;
   private readonly projectRegistry: ProjectRegistry;
   private readonly workspaceRegistry: WorkspaceRegistry;
   private readonly chatService: FileBackedChatService;
   private readonly scheduleService: ScheduleService;
   private readonly loopService: LoopService;
   private readonly checkoutDiffManager: CheckoutDiffManager;
+  private readonly agentLoadingService: AgentLoadingService;
   private readonly backgroundGitFetchManager: BackgroundGitFetchManager;
   private readonly createAgentMcpTransport: AgentMcpTransportFactory;
   private readonly downloadTokenStore: DownloadTokenStore;
@@ -616,6 +600,16 @@ export class Session {
   private readonly terminalManager: TerminalManager | null;
   private readonly providerSnapshotManager: ProviderSnapshotManager | null;
   private unsubscribeProviderSnapshotEvents: (() => void) | null = null;
+  private readonly serviceRouteStore: ServiceRouteStore | null;
+  private readonly onBranchChanged?: (
+    workspaceId: string,
+    oldBranch: string | null,
+    newBranch: string | null,
+  ) => void;
+  private readonly getDaemonTcpPort: (() => number | null) | null;
+  private readonly resolveServiceHealth:
+    | ((hostname: string) => "healthy" | "unhealthy" | null)
+    | null;
   private readonly subscribedTerminalDirectories = new Set();
   private unsubscribeTerminalsChanged: (() => void) | null = null;
   private terminalExitSubscriptions: Map<string, () => void> = new Map();
@@ -626,6 +620,7 @@ export class Session {
   private peakInflightRequests = 0;
   private readonly checkoutDiffSubscriptions = new Map<string, () => void>();
   private readonly workspaceGitWatchTargets = new Map<string, WorkspaceGitWatchTarget>();
+  private readonly workspaceSetupSnapshots = new Map<string, WorkspaceSetupSnapshot>();
   private readonly workspaceGitFetchSubscriptions = new Map<string, () => void>();
   private readonly voiceAgentMcpStdio: VoiceMcpStdioConfig | null;
   private readonly registerVoiceSpeakHandler?: (
@@ -665,19 +660,24 @@ export class Session {
       scheduleService,
       loopService,
       checkoutDiffManager,
+      agentLoadingService,
       backgroundGitFetchManager,
       createAgentMcpTransport,
       stt,
       tts,
       terminalManager,
       providerSnapshotManager,
+      serviceRouteStore,
+      onBranchChanged,
+      getDaemonTcpPort,
+      resolveServiceHealth,
       voice,
       voiceBridge,
       dictation,
       agentProviderRuntimeSettings,
     } = options;
     this.clientId = clientId;
-    this.appVersion = appVersion;
+    this.appVersion = appVersion ?? null;
    this.sessionId = uuidv4();
     this.onMessage = onMessage;
     this.onBinaryMessage = onBinaryMessage ?? null;
@@ -686,6 +686,11 @@ export class Session {
     this.downloadTokenStore = downloadTokenStore;
     this.pushTokenStore = pushTokenStore;
     this.paseoHome = paseoHome;
+    this.sessionLogger = logger.child({
+      module: "session",
+      clientId: this.clientId,
+      sessionId: this.sessionId,
+    });
     this.agentManager = agentManager;
     this.agentStorage = agentStorage;
     this.projectRegistry = projectRegistry;
@@ -694,10 +699,22 @@ export class Session {
     this.scheduleService = scheduleService;
     this.loopService = loopService;
     this.checkoutDiffManager = checkoutDiffManager;
+    this.agentLoadingService =
+      agentLoadingService ??
+      new AgentLoadingService({
+        agentManager: this.agentManager,
+        agentStorage: this.agentStorage,
+        logger: this.sessionLogger,
+      });
+
     this.backgroundGitFetchManager = backgroundGitFetchManager;
     this.createAgentMcpTransport = createAgentMcpTransport;
     this.terminalManager = terminalManager;
     this.providerSnapshotManager = providerSnapshotManager ?? null;
+    this.serviceRouteStore = serviceRouteStore ?? null;
+    this.onBranchChanged = onBranchChanged;
+    this.getDaemonTcpPort = getDaemonTcpPort ?? null;
+    this.resolveServiceHealth = resolveServiceHealth ?? null;
     if (this.terminalManager) {
       this.unsubscribeTerminalsChanged = this.terminalManager.subscribeTerminalsChanged((event) =>
         this.handleTerminalsChanged(event),
@@ -734,11 +751,6 @@ export class Session {
     this.getSpeechReadiness = dictation?.getSpeechReadiness;
     this.agentProviderRuntimeSettings = agentProviderRuntimeSettings;
     this.abortController = new AbortController();
-    this.sessionLogger = logger.child({
-      module: "session",
-      clientId: this.clientId,
-      sessionId: this.sessionId,
-    });
     this.providerRegistry = buildProviderRegistry(this.sessionLogger, {
       runtimeSettings: this.agentProviderRuntimeSettings,
     });
@@ -789,6 +801,10 @@ export class Session {
     };
   }
 
+  public emitServerMessage(message: SessionOutboundMessage): void {
+    this.emit(message);
+  }
+
   /**
    * Send initial state to client after connection
    */
@@ -1024,7 +1040,6 @@ export class Session {
       event: serializedEvent,
       timestamp: new Date().toISOString(),
       ...(typeof event.seq === "number" ? { seq: event.seq } : {}),
-      ...(typeof event.epoch === "string" ? { epoch: event.epoch } : {}),
     } as const;
 
     this.emit({
@@ -1127,7 +1142,7 @@ export class Session {
       pendingPermissions: [],
      persistence: toAgentPersistenceHandle(this.sessionLogger, record.persistence),
       lastUsage: undefined,
-      lastError: undefined,
+      lastError: record.lastError ?? undefined,
       title: record.title ?? record.config?.title ?? null,
       requiresAttention: record.requiresAttention ?? false,
       attentionReason: record.attentionReason ?? null,
@@ -1166,35 +1181,8 @@ export class Session {
     }
 
     const initPromise = (async () => {
-      const record = await this.agentStorage.get(agentId);
-      if (!record) {
-        throw new Error(`Agent not found: ${agentId}`);
-      }
-
-      const handle = toAgentPersistenceHandle(this.sessionLogger, record.persistence);
-      let snapshot: ManagedAgent;
-      if (handle) {
-        snapshot = await this.agentManager.resumeAgentFromPersistence(
-          handle,
-          buildConfigOverrides(record),
-          agentId,
-          extractTimestamps(record),
-        );
-        this.sessionLogger.info(
-          { agentId, provider: record.provider },
-          "Agent resumed from persistence",
-        );
-      } else {
-        const config = buildSessionConfig(record);
-        snapshot = await this.agentManager.createAgent(config, agentId, { labels: record.labels });
-        this.sessionLogger.info(
-          { agentId, provider: record.provider },
-          "Agent created from stored config",
-        );
-      }
-
-      await this.agentManager.hydrateTimelineFromProvider(agentId);
-      return this.agentManager.getAgent(agentId) ?? snapshot;
+      await this.requireStoredAgentRecord(agentId);
+      return this.agentLoadingService.ensureAgentLoaded({ agentId });
     })();
 
     pendingAgentInitializations.set(agentId, initPromise);
@@ -1209,9 +1197,18 @@ export class Session {
     }
   }
 
-  // TODO: Remove once all app store clients are on >=0.1.45.
+  private async requireStoredAgentRecord(agentId: string): Promise<StoredAgentRecord> {
+    const record = await this.agentStorage.get(agentId);
+    if (!record) {
+      throw new Error(`Agent not found: ${agentId}`);
+    }
+    return record;
+  }
+
   private isProviderVisibleToClient(provider: string): boolean {
-    if (clientSupportsAllProviders(this.appVersion)) return true;
+    if (clientSupportsAllProviders(this.appVersion)) {
+      return true;
+    }
     return LEGACY_PROVIDER_IDS.has(provider);
   }
 
@@ -1283,11 +1280,9 @@ export class Session {
     subscription: AgentUpdatesSubscriptionState,
     payload: AgentUpdatePayload,
   ): void {
-    // TODO: Remove once all app store clients are on >=0.1.45.
     if (payload.kind === "upsert" && !this.isProviderVisibleToClient(payload.agent.provider)) {
       return;
     }
-
     if (subscription.isBootstrapping) {
      subscription.pendingUpdatesByAgentId.set(this.getAgentUpdateTargetId(payload), payload);
       return;
@@ -1329,192 +1324,102 @@ export class Session {
     }
   }
 
-  private async buildProjectPlacement(cwd: string): Promise<ProjectPlacementPayload> {
-    return buildProjectPlacementForCwd({
-      cwd,
-      paseoHome: this.paseoHome,
-    });
+  private async findWorkspaceByDirectory(cwd: string): Promise<PersistedWorkspaceRecord | null> {
+    const normalizedCwd = await this.resolveWorkspaceDirectory(cwd);
+    const workspaces = await this.workspaceRegistry.list();
+    return workspaces.find((workspace) => workspace.directory === normalizedCwd) ?? null;
   }
 
-  private buildPersistedProjectRecord(input: {
-    workspaceId: string;
-    placement: ProjectPlacementPayload;
-    createdAt: string;
-    updatedAt: string;
-  }): PersistedProjectRecord {
-    return createPersistedProjectRecord({
-      projectId: input.placement.projectKey,
-      rootPath: deriveProjectRootPath({
-        cwd: input.workspaceId,
-        checkout: input.placement.checkout,
-      }),
-      kind: deriveProjectKind(input.placement.checkout),
-      displayName: input.placement.projectName,
-      createdAt: input.createdAt,
-      updatedAt: input.updatedAt,
-      archivedAt: null,
-    });
-  }
-
-  private buildPersistedWorkspaceRecord(input: {
-    workspaceId: string;
-    placement: ProjectPlacementPayload;
-    createdAt: string;
-    updatedAt: string;
-  }): PersistedWorkspaceRecord {
-    return createPersistedWorkspaceRecord({
-      workspaceId: input.workspaceId,
-      projectId: input.placement.projectKey,
-      cwd: input.workspaceId,
-      kind: deriveWorkspaceKind(input.placement.checkout),
-      displayName: deriveWorkspaceDisplayName({
-        cwd: input.workspaceId,
-        checkout: input.placement.checkout,
-      }),
-      createdAt: input.createdAt,
-      updatedAt: input.updatedAt,
-      archivedAt: null,
-    });
-  }
-
-  private async archiveProjectRecordIfEmpty(projectId: string, archivedAt: string): Promise<void> {
-    const siblingWorkspaces = (await this.workspaceRegistry.list()).filter(
-      (workspace) => workspace.projectId === projectId && !workspace.archivedAt,
-    );
-    if (siblingWorkspaces.length === 0) {
-      await this.projectRegistry.archive(projectId, archivedAt);
+  /**
+   * Resolve a workspace ID that may be either a numeric ID (legacy) or a directory path
+   * (sent by clients that received the path-based descriptor format).
+   */
+  private async resolveWorkspaceByIdOrDirectory(
+    workspaceId: string,
+  ): Promise<PersistedWorkspaceRecord | null> {
+    const numericId = Number(workspaceId);
+    if (!Number.isNaN(numericId)) {
+      const record = await this.workspaceRegistry.get(numericId);
+      if (record) return record;
     }
+    // Fallback: treat as directory path
+    return this.findWorkspaceByDirectory(workspaceId);
   }
 
-  private async reconcileWorkspaceRecord(workspaceId: string): Promise<{
-    workspace: PersistedWorkspaceRecord;
-    changed: boolean;
-    removedWorkspaceId: string | null;
-  }> {
-    const normalizedCwd = normalizePersistedWorkspaceId(workspaceId);
-    const placement = await this.buildProjectPlacement(normalizedCwd);
-    const resolvedWorkspaceId = deriveWorkspaceId(normalizedCwd, placement.checkout);
-    const staleWorkspace =
-      resolvedWorkspaceId === normalizedCwd ? null : await this.workspaceRegistry.get(normalizedCwd);
-    const existing = (await this.workspaceRegistry.get(resolvedWorkspaceId)) ?? staleWorkspace;
-    await this.syncWorkspaceGitWatchTarget(resolvedWorkspaceId, {
-      isGit: placement.checkout.isGit,
-    });
-    const now = new Date().toISOString();
-    const nextProjectCreatedAt = existing?.createdAt ?? now;
-    const nextWorkspaceCreatedAt = existing?.createdAt ?? now;
-    const currentProjectRecord = await this.projectRegistry.get(placement.projectKey);
-    const nextProjectRecord = this.buildPersistedProjectRecord({
-      workspaceId: resolvedWorkspaceId,
-      placement,
-      createdAt: currentProjectRecord?.createdAt ?? nextProjectCreatedAt,
-      updatedAt: now,
-    });
-    const nextWorkspaceRecord = this.buildPersistedWorkspaceRecord({
-      workspaceId: resolvedWorkspaceId,
-      placement,
-      createdAt: nextWorkspaceCreatedAt,
-      updatedAt: now,
-    });
-
-    const needsWorkspaceUpdate =
-      !existing ||
-      existing.archivedAt ||
-      existing.projectId !== nextWorkspaceRecord.projectId ||
-      existing.kind !== nextWorkspaceRecord.kind ||
-      existing.displayName !== nextWorkspaceRecord.displayName;
-
-    const needsProjectUpdate =
-      !currentProjectRecord ||
-      currentProjectRecord.archivedAt ||
-      currentProjectRecord.rootPath !== nextProjectRecord.rootPath ||
-      currentProjectRecord.kind !== nextProjectRecord.kind ||
-      currentProjectRecord.displayName !== nextProjectRecord.displayName;
-    const needsStaleWorkspaceCleanup =
-      !!staleWorkspace &&
-      !staleWorkspace.archivedAt &&
-      staleWorkspace.workspaceId !== resolvedWorkspaceId;
-
-    let removedWorkspaceId: string | null = null;
-    if (needsStaleWorkspaceCleanup) {
-      await this.workspaceRegistry.archive(staleWorkspace.workspaceId, now);
-      this.removeWorkspaceGitWatchTarget(staleWorkspace.workspaceId);
-      removedWorkspaceId = staleWorkspace.workspaceId;
-    }
-
-    if (!needsWorkspaceUpdate && !needsProjectUpdate && !needsStaleWorkspaceCleanup) {
-      return {
-        workspace: existing!,
-        changed: false,
-        removedWorkspaceId: null,
-      };
-    }
-
-    await this.projectRegistry.upsert(nextProjectRecord);
-    await this.workspaceRegistry.upsert(nextWorkspaceRecord);
-    if (existing && existing.workspaceId !== resolvedWorkspaceId) {
-      await this.workspaceRegistry.archive(existing.workspaceId, now);
-      this.removeWorkspaceGitWatchTarget(existing.workspaceId);
-      removedWorkspaceId ??= existing.workspaceId;
-    }
-
-    if (existing && !existing.archivedAt && existing.projectId !== nextWorkspaceRecord.projectId) {
-      await this.archiveProjectRecordIfEmpty(existing.projectId, now);
+  private async resolveWorkspaceDirectory(cwd: string): Promise<string> {
+    const normalizedCwd = normalizePersistedWorkspaceId(cwd);
+    try {
+      const checkout = await getCheckoutStatusLite(normalizedCwd, {
+        paseoHome: this.paseoHome,
+      });
+      return normalizePersistedWorkspaceId(checkout.worktreeRoot ?? normalizedCwd);
+    } catch {
+      return normalizedCwd;
     }
+  }
 
+  private async buildProjectPlacementForWorkspace(
+    workspace: PersistedWorkspaceRecord,
+    projectRecord?: PersistedProjectRecord | null,
+  ): Promise<ProjectPlacementPayload> {
+    const project = projectRecord ?? (await this.projectRegistry.get(workspace.projectId));
+    if (!project) {
+      throw new Error(`Project not found for workspace ${workspace.id}`);
+    }
+    const checkout =
+      project.kind !== "git"
+        ? {
+            cwd: workspace.directory,
+            isGit: false as const,
+            currentBranch: null,
+            remoteUrl: null,
+            worktreeRoot: null,
+            isPaseoOwnedWorktree: false as const,
+            mainRepoRoot: null,
+          }
+        : workspace.kind === "worktree"
+          ? {
+              cwd: workspace.directory,
+              isGit: true as const,
+              currentBranch: workspace.displayName,
+              remoteUrl: project.gitRemote,
+              worktreeRoot: workspace.directory,
+              isPaseoOwnedWorktree: true as const,
+              mainRepoRoot: project.directory,
+            }
+          : {
+              cwd: workspace.directory,
+              isGit: true as const,
+              currentBranch: workspace.displayName,
+              remoteUrl: project.gitRemote,
+              worktreeRoot: workspace.directory,
+              isPaseoOwnedWorktree: false as const,
+              mainRepoRoot: null,
+            };
     return {
-      workspace: nextWorkspaceRecord,
-      changed: true,
-      removedWorkspaceId,
+      projectKey: String(project.id),
+      projectName: project.displayName,
+      checkout,
     };
   }
 
-  private async reconcileActiveWorkspaceRecords(): Promise<Set<string>> {
-    const changedWorkspaceIds = new Set<string>();
-    const activeWorkspaces = (await this.workspaceRegistry.list()).filter(
-      (workspace) => !workspace.archivedAt,
-    );
-    const staleWorkspaceIds = await detectStaleWorkspaces({
-      activeWorkspaces,
-      checkDirectoryExists: async (cwd) => {
-        try {
-          await stat(cwd);
-          return true;
-        } catch {
-          return false;
-        }
-      },
-    });
-
-    for (const workspaceId of staleWorkspaceIds) {
-      await this.archiveWorkspaceRecord(workspaceId);
-      changedWorkspaceIds.add(workspaceId);
-    }
-
-    for (const workspace of activeWorkspaces) {
-      if (staleWorkspaceIds.has(workspace.workspaceId)) {
-        continue;
-      }
-
-      const result = await this.reconcileWorkspaceRecord(workspace.workspaceId);
-      if (result.changed) {
-        changedWorkspaceIds.add(result.workspace.workspaceId);
-        if (result.removedWorkspaceId) {
-          changedWorkspaceIds.add(result.removedWorkspaceId);
-        }
-      }
+  private async buildProjectPlacementForCwd(cwd: string): Promise<ProjectPlacementPayload | null> {
+    const workspace = await this.findWorkspaceByDirectory(cwd);
+    if (!workspace) {
+      return null;
+    }
-
-    return changedWorkspaceIds;
+    return this.buildProjectPlacementForWorkspace(workspace);
   }
 
   private async forwardAgentUpdate(agent: ManagedAgent): Promise<void> {
     try {
-      await this.ensureWorkspaceRegistered(agent.cwd);
       const subscription = this.agentUpdatesSubscription;
       const payload = await this.buildAgentPayload(agent);
       if (subscription) {
-        const project = await this.buildProjectPlacement(payload.cwd);
+        const project = await this.buildProjectPlacementForCwd(payload.cwd);
+        if (!project) {
+          throw new Error(`Workspace not found for agent ${payload.id}`);
+        }
         const matches = this.matchesAgentFilter({
           agent: payload,
           project,
@@ -1762,6 +1667,10 @@ export class Session {
         await this.handleCreatePaseoWorktreeRequest(msg);
         break;
 
+      case "workspace_setup_status_request":
+        await this.handleWorkspaceSetupStatusRequest(msg);
+        break;
+
       case "list_available_editors_request":
         await this.handleListAvailableEditorsRequest(msg);
         break;
@@ -1864,6 +1773,10 @@ export class Session {
         await this.handleCreateTerminalRequest(msg);
         break;
 
+      case "start_workspace_service_request":
+        await this.handleStartWorkspaceServiceRequest(msg);
+        break;
+
       case "subscribe_terminal_request":
         await this.handleSubscribeTerminalRequest(msg);
         break;
@@ -2099,8 +2012,8 @@ export class Session {
       (await this.agentStorage.get(agentId))?.cwd ?? null;
 
-    // Prevent the persistence hook from re-creating the record while we close/delete.
-    this.agentStorage.beginDelete(agentId);
+    // File-backed storage still needs an early delete fence before closeAgent().
+    beginAgentDeleteIfSupported(this.agentStorage, agentId);
 
     try {
       await this.agentManager.closeAgent(agentId);
@@ -2111,12 +2024,17 @@ export class Session {
       );
     }
 
+    // Drain queued persistence from the just-closed agent before removing its
+    // durable snapshot, otherwise an in-flight background write can recreate it.
+    await this.agentManager.flush();
+
     try {
       await this.agentStorage.remove(agentId);
+      await this.agentManager.deleteCommittedTimeline(agentId);
     } catch (error: any) {
       this.sessionLogger.error(
         { err: error, agentId },
-        `Failed to remove agent ${agentId} from registry`,
+        `Failed to fully delete agent ${agentId}`,
       );
     }
 
@@ -2141,12 +2059,15 @@ export class Session {
   }
 
   private async handleArchiveAgentRequest(agentId: string, requestId: string): Promise<void> {
-    const result = await this.archiveAgentForClose(agentId);
+    this.sessionLogger.info({ agentId }, `Archiving agent ${agentId}`);
+
+    const { archivedAt } = await this.archiveAgentForClose(agentId);
+
     this.emit({
       type: "agent_archived",
       payload: {
-        agentId: result.agentId,
-        archivedAt: result.archivedAt,
+        agentId,
+        archivedAt,
         requestId,
       },
     });
@@ -2168,20 +2089,7 @@ export class Session {
     }
 
     const archivedAt = new Date().toISOString();
-    const normalizedStatus =
-      existing.lastStatus === "running" || existing.lastStatus === "initializing"
-        ? "idle"
-        : existing.lastStatus;
-
-    await this.agentStorage.upsert({
-      ...existing,
-      archivedAt,
-      updatedAt: archivedAt,
-      lastStatus: normalizedStatus,
-      requiresAttention: false,
-      attentionReason: null,
-      attentionTimestamp: null,
-    });
+    await this.agentManager.archiveSnapshot(agentId, archivedAt);
     return { agentId, archivedAt };
   }
 
@@ -2189,8 +2097,6 @@ export class Session {
   private async archiveAgentForClose(
     agentId: string,
   ): Promise<{ agentId: string; archivedAt: string }> {
-    this.sessionLogger.info({ agentId }, `Archiving agent ${agentId}`);
-
     const liveAgent = this.agentManager.getAgent(agentId);
     if (liveAgent) {
       await this.interruptAgentIfRunning(agentId);
@@ -2199,6 +2105,7 @@ export class Session {
     } else {
       await this.archiveStoredAgentForClose(agentId);
     }
+
     const archivedRecord = await this.agentStorage.get(agentId);
     if (!archivedRecord) {
       throw new Error(`Agent not found in storage after archive: ${agentId}`);
@@ -2206,25 +2113,32 @@ export class Session {
 
     if (this.agentUpdatesSubscription) {
       const payload = this.buildStoredAgentPayload(archivedRecord);
-      const project = await this.buildProjectPlacement(payload.cwd);
-      const matches = this.matchesAgentFilter({
-        agent: payload,
-        project,
-        filter: this.agentUpdatesSubscription.filter,
-      });
-      this.bufferOrEmitAgentUpdate(
-        this.agentUpdatesSubscription,
-        matches
-          ? {
-              kind: "upsert",
-              agent: payload,
-              project,
-            }
-          : {
-              kind: "remove",
-              agentId,
-            },
-      );
+      const project = await this.buildProjectPlacementForCwd(payload.cwd);
+      if (project) {
+        const matches = this.matchesAgentFilter({
+          agent: payload,
+          project,
+          filter: this.agentUpdatesSubscription.filter,
+        });
+        this.bufferOrEmitAgentUpdate(
+          this.agentUpdatesSubscription,
+          matches
+            ? {
+                kind: "upsert",
+                agent: payload,
+                project,
+              }
+            : {
+                kind: "remove",
+                agentId,
+              },
+        );
+      } else {
+        this.bufferOrEmitAgentUpdate(this.agentUpdatesSubscription, {
+          kind: "remove",
+          agentId,
+        });
+      }
 
       await this.emitWorkspaceUpdateForCwd(payload.cwd);
     }
@@ -2275,31 +2189,11 @@ export class Session {
   }
 
   private async unarchiveAgentState(agentId: string): Promise<boolean> {
-    const record = await this.agentStorage.get(agentId);
-    if (!record || !record.archivedAt) {
-      return false;
-    }
-    const updatedAt = new Date().toISOString();
-    await this.agentStorage.upsert({
-      ...record,
-      archivedAt: null,
-      updatedAt,
-    });
-    this.agentManager.notifyAgentState(agentId);
-    return true;
+    return this.agentManager.unarchiveSnapshot(agentId);
   }
 
   private async unarchiveAgentByHandle(handle: AgentPersistenceHandle): Promise<void> {
-    const records = await this.agentStorage.list();
-    const matched = records.find(
-      (record) =>
-        record.persistence?.provider === handle.provider &&
-        record.persistence?.sessionId === handle.sessionId,
-    );
-    if (!matched) {
-      return;
-    }
-    await this.unarchiveAgentState(matched.id);
+    await this.agentManager.unarchiveSnapshotByHandle(handle);
   }
 
   private async handleUpdateAgentRequest(
@@ -2335,26 +2229,10 @@ export class Session {
     }
 
     try {
-      const liveAgent = this.agentManager.getAgent(agentId);
-      if (liveAgent) {
-        if (normalizedName) {
-          await this.agentManager.setTitle(agentId, normalizedName);
-        }
-        if (normalizedLabels) {
-          await this.agentManager.setLabels(agentId, normalizedLabels);
-        }
-      } else {
-        const existing = await this.agentStorage.get(agentId);
-        if (!existing) {
-          throw new Error(`Agent not found: ${agentId}`);
-        }
-
-        await this.agentStorage.upsert({
-          ...existing,
-          ...(normalizedName ? { title: normalizedName } : {}),
-          ...(normalizedLabels ? { labels: { ...existing.labels, ...normalizedLabels } } : {}),
-        });
-      }
+      await this.agentManager.updateAgentMetadata(agentId, {
+        ...(normalizedName ? { title: normalizedName } : {}),
+        ...(normalizedLabels ? { labels: normalizedLabels } : {}),
+      });
 
       this.emit({
         type: "update_agent_response",
@@ -2904,14 +2782,32 @@ export class Session {
       ...(provisionalTitle ? { title: provisionalTitle } : {}),
     };
 
-    const { sessionConfig, worktreeConfig } = await this.buildAgentSessionConfig(
+    const { sessionConfig, worktreeBootstrap } = await this.buildAgentSessionConfig(
       resolvedConfig,
       git,
      worktreeName,
       labels,
     );
-    await this.ensureWorkspaceRegistered(sessionConfig.cwd);
-    const snapshot = await this.agentManager.createAgent(sessionConfig, undefined, { labels });
+    const resolvedWorkspace =
+      msg.workspaceId
+        ? await this.resolveWorkspaceByIdOrDirectory(msg.workspaceId)
+        : (await this.findWorkspaceByDirectory(sessionConfig.cwd)) ??
+          (await this.findOrCreateWorkspaceForDirectory(sessionConfig.cwd));
+    if (!resolvedWorkspace) {
+      throw new Error(`Workspace not found: ${msg.workspaceId}`);
+    }
+    const snapshot = await this.agentManager.createAgent(
+      {
+        ...sessionConfig,
+        cwd: resolvedWorkspace.directory,
+      },
+      undefined,
+      {
+        labels,
+        workspaceId: resolvedWorkspace.id,
+        initialPrompt: trimmedPrompt,
+      },
+    );
     await this.forwardAgentUpdate(snapshot);
 
     if (requestId) {
@@ -2964,10 +2860,11 @@ export class Session {
       });
     }
 
-    if (worktreeConfig) {
+    if (worktreeBootstrap) {
       void runAsyncWorktreeBootstrap({
         agentId: snapshot.id,
-        worktree: worktreeConfig,
+        worktree: worktreeBootstrap.worktree,
+        shouldBootstrap: worktreeBootstrap.shouldBootstrap,
         terminalManager: this.terminalManager,
         appendTimelineItem: (item) =>
           appendTimelineItemIfAgentKnown({
@@ -2981,6 +2878,8 @@ export class Session {
             agentId: snapshot.id,
             item,
           }),
+        serviceRouteStore: this.serviceRouteStore ?? undefined,
+        daemonPort: this.getDaemonTcpPort?.() ?? null,
         logger: this.sessionLogger,
       });
     }
@@ -3036,9 +2935,11 @@ export class Session {
     );
     try {
       await this.unarchiveAgentByHandle(handle);
-      const snapshot = await this.agentManager.resumeAgentFromPersistence(handle, overrides);
+      const snapshot = await this.agentLoadingService.resumeAgent({
+        handle,
+        overrides,
+      });
       await this.unarchiveAgentState(snapshot.id);
-      await this.agentManager.hydrateTimelineFromProvider(snapshot.id);
      await this.forwardAgentUpdate(snapshot);
       const timelineSize = this.agentManager.getTimeline(snapshot.id).length;
       if (requestId) {
@@ -3079,32 +2980,10 @@ export class Session {
     try {
       await this.unarchiveAgentState(agentId);
-      let snapshot: ManagedAgent;
-      const existing = this.agentManager.getAgent(agentId);
-      if (existing) {
+      if (this.agentManager.getAgent(agentId)) {
         await this.interruptAgentIfRunning(agentId);
-        if (existing.persistence) {
-          snapshot = await this.agentManager.reloadAgentSession(agentId);
-        } else {
-          snapshot = existing;
-        }
-      } else {
-        const record = await this.agentStorage.get(agentId);
-        if (!record) {
-          throw new Error(`Agent not found: ${agentId}`);
-        }
-        const handle = toAgentPersistenceHandle(this.sessionLogger, record.persistence);
-        if (!handle) {
-          throw new Error(`Agent ${agentId} cannot be refreshed because it lacks persistence`);
-        }
-        snapshot = await this.agentManager.resumeAgentFromPersistence(
-          handle,
-          buildConfigOverrides(record),
-          agentId,
-          extractTimestamps(record),
-        );
       }
-      await this.agentManager.hydrateTimelineFromProvider(agentId);
+      const snapshot = await this.agentLoadingService.refreshAgent({ agentId });
       await this.forwardAgentUpdate(snapshot);
       const timelineSize = this.agentManager.getTimeline(agentId).length;
       if (requestId) {
@@ -3147,7 +3026,10 @@ export class Session {
     gitOptions?: GitSetupOptions,
     legacyWorktreeName?: string,
     _labels?: Record<string, string>,
-  ): Promise<{ sessionConfig: AgentSessionConfig; worktreeConfig?: WorktreeConfig }> {
+  ): Promise<{
+    sessionConfig: AgentSessionConfig;
+
worktreeBootstrap?: { worktree: WorktreeConfig; shouldBootstrap: boolean }; + }> { return buildWorktreeAgentSessionConfig( { paseoHome: this.paseoHome, @@ -3291,11 +3173,9 @@ export class Session { ): Promise { const fetchedAt = new Date().toISOString(); try { - let providers = await this.agentManager.listProviderAvailability(); - - // TODO: Remove once all app store clients are on >=0.1.45. - providers = providers.filter((p) => this.isProviderVisibleToClient(p.provider)); - + const providers = (await this.agentManager.listProviderAvailability()).filter((provider) => + this.isProviderVisibleToClient(provider.provider), + ); this.emit({ type: "list_available_providers_response", payload: { @@ -3873,7 +3753,16 @@ export class Session { } if (!agent && draftConfig) { - const sessionConfig = this.buildDraftAgentSessionConfig(draftConfig); + const sessionConfig: AgentSessionConfig = { + provider: draftConfig.provider, + cwd: expandTilde(draftConfig.cwd), + ...(draftConfig.modeId ? { modeId: draftConfig.modeId } : {}), + ...(draftConfig.model ? { model: draftConfig.model } : {}), + ...(draftConfig.thinkingOptionId + ? { thinkingOptionId: draftConfig.thinkingOptionId } + : {}), + }; + const commands = await this.agentManager.listDraftCommands(sessionConfig); this.emit({ type: "list_commands_response", @@ -4261,13 +4150,21 @@ export class Session { return; } target.latestFingerprint = this.workspaceGitDescriptorFingerprint(workspace); + target.lastBranchName = workspace?.name ?? 
null; } - private primeWorkspaceGitWatchFingerprints( + private async primeWorkspaceGitWatchFingerprints( workspaces: Iterable, - ): void { + ): Promise { for (const workspace of workspaces) { - this.rememberWorkspaceGitWatchFingerprint(workspace.id, workspace); + const persistedWorkspace = await this.findWorkspaceByDirectory(workspace.id); + if (!persistedWorkspace) { + continue; + } + await this.syncWorkspaceGitWatchTarget(persistedWorkspace.directory, { + isGit: workspace.projectKind === "git", + }); + this.rememberWorkspaceGitWatchFingerprint(persistedWorkspace.directory, workspace); } } @@ -4322,6 +4219,7 @@ export class Session { refreshPromise: null, refreshQueued: false, latestFingerprint: null, + lastBranchName: null, }; for (const watchPath of new Set([join(gitDir, "HEAD"), join(refsRoot, "refs", "heads")])) { @@ -4696,7 +4594,12 @@ export class Session { paseoHome: this.paseoHome, agentManager: this.agentManager, agentStorage: this.agentStorage, - archiveWorkspaceRecord: (workspaceId) => this.archiveWorkspaceRecord(workspaceId), + archiveWorkspaceRecord: async (workspaceDirectory) => { + const workspace = await this.findWorkspaceByDirectory(workspaceDirectory); + if (workspace) { + await this.archiveWorkspaceRecord(workspace.id); + } + }, emit: (message) => this.emit(message), emitWorkspaceUpdatesForCwds: (cwds) => this.emitWorkspaceUpdatesForCwds(cwds), isPathWithinRoot: (rootPath, candidatePath) => @@ -4917,6 +4820,8 @@ export class Session { let agents = [...liveAgents, ...persistedAgents]; + agents = agents.filter((agent) => this.isProviderVisibleToClient(agent.provider)); + // Filter by labels if filter provided if (filter?.labels) { const filterLabels = filter.labels; @@ -4984,14 +4889,16 @@ export class Session { private async getAgentPayloadById(agentId: string): Promise { const live = this.agentManager.getAgent(agentId); if (live) { - return await this.buildAgentPayload(live); + const payload = await this.buildAgentPayload(live); + return 
this.isProviderVisibleToClient(payload.provider) ? payload : null; } const record = await this.agentStorage.get(agentId); if (!record || record.internal) { return null; } - return this.buildStoredAgentPayload(record); + const payload = this.buildStoredAgentPayload(record); + return this.isProviderVisibleToClient(payload.provider) ? payload : null; } private normalizeFetchAgentsSort( @@ -5219,13 +5126,13 @@ export class Session { labels: filter?.labels, }); - const placementByCwd = new Map>(); - const getPlacement = (cwd: string): Promise => { + const placementByCwd = new Map>(); + const getPlacement = (cwd: string): Promise => { const existing = placementByCwd.get(cwd); if (existing) { return existing; } - const placementPromise = this.buildProjectPlacement(cwd); + const placementPromise = this.buildProjectPlacementForCwd(cwd); placementByCwd.set(cwd, placementPromise); return placementPromise; }; @@ -5251,12 +5158,15 @@ export class Session { ) { const batch = candidates.slice(start, start + batchSize); const batchEntries = await Promise.all( - batch.map(async (agent) => ({ - agent, - project: await getPlacement(agent.cwd), - })), + batch.map(async (agent) => { + const project = await getPlacement(agent.cwd); + return project ? { agent, project } : null; + }), ); for (const entry of batchEntries) { + if (!entry) { + continue; + } if ( !this.matchesAgentFilter({ agent: entry.agent, @@ -5341,17 +5251,36 @@ export class Session { const resolvedProjectRecord = projectRecord ?? 
(await this.projectRegistry.get(workspace.projectId)); + let diffStat: { additions: number; deletions: number } | null = null; + const cachedShortstat = getCachedCheckoutShortstat(workspace.directory); + if (cachedShortstat !== undefined) { + diffStat = cachedShortstat; + } else { + warmCheckoutShortstatInBackground(workspace.directory, undefined, () => { + void this.emitWorkspaceUpdateForCwd(workspace.directory); + }); + } + return { - id: workspace.workspaceId, - projectId: workspace.projectId, - projectDisplayName: resolvedProjectRecord?.displayName ?? workspace.projectId, - projectRootPath: resolvedProjectRecord?.rootPath ?? workspace.cwd, - projectKind: resolvedProjectRecord?.kind ?? "non_git", - workspaceKind: workspace.kind, + id: workspace.directory, + projectId: resolvedProjectRecord?.directory ?? workspace.directory, + projectDisplayName: resolvedProjectRecord?.displayName ?? String(workspace.projectId), + projectRootPath: resolvedProjectRecord?.directory ?? workspace.directory, + workspaceDirectory: workspace.directory, + projectKind: (resolvedProjectRecord?.kind ?? "directory") === "git" ? "git" : "non_git", + workspaceKind: workspace.kind === "checkout" ? "local_checkout" : workspace.kind, name: workspace.displayName, status: "done", activityAt: null, - diffStat: null, + diffStat, + services: this.serviceRouteStore + ? buildWorkspaceServicePayloads( + this.serviceRouteStore, + workspace.directory, + this.getDaemonTcpPort?.() ?? null, + this.resolveServiceHealth ?? 
undefined, + ) + : [], }; } @@ -5359,26 +5288,7 @@ export class Session { workspace: PersistedWorkspaceRecord, projectRecord?: PersistedProjectRecord | null, ): Promise { - const base = await this.describeWorkspaceRecord(workspace, projectRecord); - let displayName = workspace.displayName; - try { - const placement = await this.buildProjectPlacement(workspace.cwd); - displayName = deriveWorkspaceDisplayName({ - cwd: workspace.cwd, - checkout: placement.checkout, - }); - } catch { - // Fall back to the persisted label if checkout metadata is unavailable. - } - - let diffStat: { additions: number; deletions: number } | null = null; - try { - diffStat = await getCheckoutShortstat(workspace.cwd); - } catch { - // Non-critical — leave null on failure. - } - - return { ...base, name: displayName, diffStat }; + return this.describeWorkspaceRecord(workspace, projectRecord); } private async buildWorkspaceDescriptor(input: { @@ -5406,7 +5316,7 @@ export class Session { const activeProjects = new Map( persistedProjects .filter((project) => !project.archivedAt) - .map((project) => [project.projectId, project] as const), + .map((project) => [project.id, project] as const), ); const descriptorsByWorkspaceId = new Map(); const workspaceIds = options.workspaceIds @@ -5416,14 +5326,17 @@ export class Session { ), ) : null; + const workspaceIdsByDirectory = new Map( + activeRecords.map((workspace) => [workspace.directory, workspace.directory] as const), + ); for (const workspace of activeRecords) { - if (workspaceIds && !workspaceIds.has(workspace.workspaceId)) { + if (workspaceIds && !workspaceIds.has(workspace.directory)) { continue; } const projectRecord = activeProjects.get(workspace.projectId) ?? 
null; descriptorsByWorkspaceId.set( - workspace.workspaceId, + workspace.directory, await this.buildWorkspaceDescriptor({ workspace, projectRecord, @@ -5436,8 +5349,14 @@ export class Session { if (agent.archivedAt) { continue; } + if (!this.isProviderVisibleToClient(agent.provider)) { + continue; + } - const workspaceId = this.resolveRegisteredWorkspaceIdForCwd(agent.cwd, activeRecords); + const workspaceId = workspaceIdsByDirectory.get(normalizePersistedWorkspaceId(agent.cwd)); + if (workspaceId === undefined) { + continue; + } const existing = descriptorsByWorkspaceId.get(workspaceId); if (!existing) { continue; @@ -5466,25 +5385,25 @@ export class Session { workspaces: PersistedWorkspaceRecord[], ): string { const normalizedCwd = normalizePersistedWorkspaceId(cwd); - const exact = workspaces.find((workspace) => workspace.workspaceId === normalizedCwd); + const exact = workspaces.find((workspace) => workspace.directory === normalizedCwd); if (exact) { - return exact.workspaceId; + return exact.directory; } let bestMatch: PersistedWorkspaceRecord | null = null; for (const workspace of workspaces) { - const prefix = workspace.workspaceId.endsWith(sep) - ? workspace.workspaceId - : `${workspace.workspaceId}${sep}`; + const prefix = workspace.directory.endsWith(sep) + ? workspace.directory + : `${workspace.directory}${sep}`; if (!normalizedCwd.startsWith(prefix)) { continue; } - if (!bestMatch || workspace.workspaceId.length > bestMatch.workspaceId.length) { + if (!bestMatch || workspace.directory.length > bestMatch.directory.length) { bestMatch = workspace; } } - return bestMatch?.workspaceId ?? normalizedCwd; + return bestMatch?.directory ?? 
normalizedCwd; } private async listWorkspaceDescriptors(): Promise { @@ -5630,7 +5549,7 @@ export class Session { return { sort: cursorSort, values: payload.values as Record, - id: payload.id, + id: String(payload.id), }; } @@ -5668,14 +5587,14 @@ export class Session { } if (filter.idPrefix && filter.idPrefix.trim().length > 0) { - if (!workspace.id.startsWith(filter.idPrefix.trim())) { + if (!String(workspace.id).startsWith(filter.idPrefix.trim())) { return false; } } if (filter.query && filter.query.trim().length > 0) { const query = filter.query.trim().toLocaleLowerCase(); - const haystacks = [workspace.name, workspace.projectId, workspace.id]; + const haystacks = [workspace.name, String(workspace.projectId), String(workspace.id)]; if (!haystacks.some((value) => value.toLocaleLowerCase().includes(query))) { return false; } @@ -5773,8 +5692,50 @@ export class Session { } } - private async ensureWorkspaceRegistered(cwd: string): Promise { - return (await this.reconcileWorkspaceRecord(cwd)).workspace; + private async findOrCreateWorkspaceForDirectory(cwd: string): Promise { + const normalizedCwd = await this.resolveWorkspaceDirectory(cwd); + const existingWorkspace = await this.findWorkspaceByDirectory(normalizedCwd); + if (existingWorkspace) { + return existingWorkspace; + } + + const timestamp = new Date().toISOString(); + const directoryName = normalizedCwd.split(/[\\/]/).filter(Boolean).at(-1) ?? 
normalizedCwd; + const gitMetadata = detectWorkspaceGitMetadata(normalizedCwd, directoryName); + + let projectId: number | null = null; + if (gitMetadata.gitRemote) { + const existingProjects = await this.projectRegistry.list(); + const matchingProject = existingProjects.find( + (p) => p.gitRemote === gitMetadata.gitRemote && !p.archivedAt, + ); + if (matchingProject) { + projectId = matchingProject.id; + } + } + + if (projectId === null) { + projectId = await this.projectRegistry.insert({ + directory: normalizedCwd, + displayName: gitMetadata.projectDisplayName, + kind: gitMetadata.projectKind, + gitRemote: gitMetadata.gitRemote, + createdAt: timestamp, + updatedAt: timestamp, + archivedAt: null, + }); + } + + const workspaceId = await this.workspaceRegistry.insert({ + projectId, + directory: normalizedCwd, + displayName: gitMetadata.workspaceDisplayName, + kind: gitMetadata.isWorktree ? "worktree" : "checkout", + createdAt: timestamp, + updatedAt: timestamp, + archivedAt: null, + }); + return (await this.workspaceRegistry.get(workspaceId))!; } private async registerPendingWorktreeWorkspace(options: { @@ -5782,38 +5743,80 @@ export class Session { worktreePath: string; branchName: string; }): Promise { - return registerPendingWorktreeWorkspaceSession( - { - buildPersistedProjectRecord: (input) => this.buildPersistedProjectRecord(input), - buildPersistedWorkspaceRecord: (input) => this.buildPersistedWorkspaceRecord(input), - buildProjectPlacement: (cwd) => this.buildProjectPlacement(cwd), - projectRegistry: this.projectRegistry, - syncWorkspaceGitWatchTarget: (cwd, syncOptions) => - this.syncWorkspaceGitWatchTarget(cwd, syncOptions), - workspaceRegistry: this.workspaceRegistry, - archiveProjectRecordIfEmpty: (projectId, archivedAt) => - this.archiveProjectRecordIfEmpty(projectId, archivedAt), - }, - options, - ); + await this.findOrCreateWorkspaceForDirectory(options.repoRoot); + const workspaceDirectory = normalizePersistedWorkspaceId(options.worktreePath); + 
const basePlacement = await this.buildProjectPlacementForCwd(options.repoRoot); + if (!basePlacement) { + throw new Error(`Workspace not found for repo root ${options.repoRoot}`); + } + + const projectId = Number(basePlacement.projectKey); + if (!Number.isInteger(projectId)) { + throw new Error(`Invalid project id for repo root ${options.repoRoot}`); + } + + const now = new Date().toISOString(); + const existingWorkspace = await this.findWorkspaceByDirectory(workspaceDirectory); + if (!existingWorkspace) { + const workspaceId = await this.workspaceRegistry.insert({ + projectId, + directory: workspaceDirectory, + displayName: options.branchName, + kind: "worktree", + createdAt: now, + updatedAt: now, + archivedAt: null, + }); + const workspace = await this.workspaceRegistry.get(workspaceId); + if (!workspace) { + throw new Error(`Workspace not found after insert: ${workspaceId}`); + } + await this.syncWorkspaceGitWatchTarget(workspace.directory, { isGit: true }); + return workspace; + } + + await this.workspaceRegistry.upsert({ + id: existingWorkspace.id, + projectId, + directory: workspaceDirectory, + displayName: options.branchName, + kind: "worktree", + createdAt: existingWorkspace.createdAt, + updatedAt: now, + archivedAt: null, + }); + await this.syncWorkspaceGitWatchTarget(workspaceDirectory, { isGit: true }); + + if (!existingWorkspace.archivedAt && existingWorkspace.projectId !== projectId) { + const siblingWorkspaces = (await this.workspaceRegistry.list()).filter( + (workspace) => + workspace.projectId === existingWorkspace.projectId && + workspace.id !== existingWorkspace.id && + !workspace.archivedAt, + ); + if (siblingWorkspaces.length === 0) { + await this.projectRegistry.archive(existingWorkspace.projectId, now); + } + } + + return (await this.workspaceRegistry.get(existingWorkspace.id))!; } - private async archiveWorkspaceRecord(workspaceId: string, archivedAt?: string): Promise { - const existing = await this.workspaceRegistry.get(workspaceId); - if 
(!existing || existing.archivedAt) { - this.removeWorkspaceGitWatchTarget(workspaceId); + private async archiveWorkspaceRecord(workspaceId: number, archivedAt?: string): Promise { + const existingWorkspace = await this.workspaceRegistry.get(workspaceId); + if (!existingWorkspace || existingWorkspace.archivedAt) { return; } const nextArchivedAt = archivedAt ?? new Date().toISOString(); await this.workspaceRegistry.archive(workspaceId, nextArchivedAt); - this.removeWorkspaceGitWatchTarget(workspaceId); + await this.removeWorkspaceGitWatchTarget(existingWorkspace.directory); const siblingWorkspaces = (await this.workspaceRegistry.list()).filter( - (workspace) => workspace.projectId === existing.projectId && !workspace.archivedAt, + (workspace) => workspace.projectId === existingWorkspace.projectId && !workspace.archivedAt, ); if (siblingWorkspaces.length === 0) { - await this.projectRegistry.archive(existing.projectId, nextArchivedAt); + await this.projectRegistry.archive(existingWorkspace.projectId, nextArchivedAt); } } @@ -5834,6 +5837,10 @@ export class Session { } } + private async reconcileActiveWorkspaceRecords(): Promise> { + return new Set(); + } + private async emitWorkspaceUpdatesForWorkspaceIds( workspaceIds: Iterable, options?: { dedupeGitState?: boolean; skipReconcile?: boolean }, @@ -5867,6 +5874,13 @@ export class Session { ) { continue; } + const watchTarget = this.workspaceGitWatchTargets.get(workspaceId); + if (watchTarget && this.onBranchChanged) { + const newBranchName = nextWorkspace?.name ?? null; + if (newBranchName !== watchTarget.lastBranchName) { + this.onBranchChanged(workspaceId, watchTarget.lastBranchName, newBranchName); + } + } this.rememberWorkspaceGitWatchFingerprint(workspaceId, nextWorkspace); if (!nextWorkspace) { @@ -5952,12 +5966,6 @@ export class Session { } const payload = await this.listFetchAgentsEntries(request); - - // TODO: Remove once all app store clients are on >=0.1.45. 
- payload.entries = payload.entries.filter((entry) => - this.isProviderVisibleToClient(entry.agent.provider), - ); - const snapshotUpdatedAtByAgentId = new Map(); for (const entry of payload.entries) { const parsedUpdatedAt = Date.parse(entry.agent.updatedAt); @@ -6018,7 +6026,7 @@ export class Session { } const payload = await this.listFetchWorkspacesEntries(request); - this.primeWorkspaceGitWatchFingerprints(payload.entries); + await this.primeWorkspaceGitWatchFingerprints(payload.entries); const snapshotLatestActivityByWorkspaceId = new Map(); for (const entry of payload.entries) { const parsedLatestActivity = entry.activityAt @@ -6069,8 +6077,8 @@ export class Session { request: Extract, ): Promise { try { - const workspace = await this.ensureWorkspaceRegistered(request.cwd); - await this.emitWorkspaceUpdateForCwd(workspace.cwd); + const workspace = await this.findOrCreateWorkspaceForDirectory(request.cwd); + await this.emitWorkspaceUpdateForCwd(workspace.directory); const descriptor = await this.describeWorkspaceRecordWithGitData(workspace); this.emit({ type: "open_project_response", @@ -6094,6 +6102,30 @@ export class Session { } } + private buildWorkspaceServicePayloadSnapshot( + workspaceDirectory: string, + ): WorkspaceDescriptorPayload["services"] { + if (!this.serviceRouteStore) { + return []; + } + return buildWorkspaceServicePayloads( + this.serviceRouteStore, + workspaceDirectory, + this.getDaemonTcpPort?.() ?? null, + this.resolveServiceHealth ?? 
undefined, + ); + } + + private emitWorkspaceServiceStatusUpdate(workspaceDirectory: string): void { + this.emit({ + type: "service_status_update", + payload: { + workspaceId: workspaceDirectory, + services: this.buildWorkspaceServicePayloadSnapshot(workspaceDirectory), + }, + }); + } + async getAvailableEditorTargets() { return listAvailableEditorTargets(); } @@ -6102,6 +6134,65 @@ export class Session { await openInEditorTarget(options); } + private async handleStartWorkspaceServiceRequest( + request: StartWorkspaceServiceRequest, + ): Promise { + try { + if (!this.terminalManager || !this.serviceRouteStore) { + throw new Error("Workspace services are not available on this daemon"); + } + + const workspace = await this.resolveWorkspaceByIdOrDirectory(request.workspaceId); + if (!workspace) { + throw new Error(`Workspace not found: ${request.workspaceId}`); + } + + await spawnWorkspaceService({ + repoRoot: workspace.directory, + workspaceId: workspace.directory, + branchName: readGitCommand(workspace.directory, "git symbolic-ref --short HEAD"), + serviceName: request.serviceName, + daemonPort: this.getDaemonTcpPort?.() ?? null, + routeStore: this.serviceRouteStore, + terminalManager: this.terminalManager, + logger: this.sessionLogger, + onLifecycleChanged: () => { + this.emitWorkspaceServiceStatusUpdate(workspace.directory); + }, + }); + + this.emitWorkspaceServiceStatusUpdate(workspace.directory); + this.emit({ + type: "start_workspace_service_response", + payload: { + requestId: request.requestId, + workspaceId: request.workspaceId, + serviceName: request.serviceName, + error: null, + }, + }); + } catch (error) { + const message = error instanceof Error ? 
error.message : "Failed to start workspace service"; + this.sessionLogger.error( + { + err: error, + workspaceId: request.workspaceId, + serviceName: request.serviceName, + }, + "Failed to start workspace service", + ); + this.emit({ + type: "start_workspace_service_response", + payload: { + requestId: request.requestId, + workspaceId: request.workspaceId, + serviceName: request.serviceName, + error: message, + }, + }); + } + } + private async handleListAvailableEditorsRequest( request: Extract, ): Promise { @@ -6185,27 +6276,47 @@ export class Session { private async createPaseoWorktreeInBackground(options: { requestCwd: string; repoRoot: string; - baseBranch: string; - slug: string; - worktreePath: string; + workspaceId: number; + worktree: { branchName: string; worktreePath: string }; + shouldBootstrap: boolean; }): Promise { return createWorktreeInBackgroundSession( { paseoHome: this.paseoHome, emitWorkspaceUpdateForCwd: (cwd, emitOptions) => this.emitWorkspaceUpdateForCwd(cwd, emitOptions), + cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => { + this.workspaceSetupSnapshots.set(workspaceId, snapshot); + }, + emit: (message) => this.emit(message), sessionLogger: this.sessionLogger, terminalManager: this.terminalManager, + archiveWorkspaceRecord: (workspaceId) => this.archiveWorkspaceRecord(workspaceId), + serviceRouteStore: this.serviceRouteStore, + daemonPort: this.getDaemonTcpPort?.() ?? 
null, }, options, ); } + private async handleWorkspaceSetupStatusRequest( + request: Extract, + ): Promise { + return handleWorkspaceSetupStatusRequestMessage( + { + emit: (message) => this.emit(message), + workspaceSetupSnapshots: this.workspaceSetupSnapshots, + workspaceRegistry: this.workspaceRegistry, + }, + request, + ); + } + private async handleArchiveWorkspaceRequest( request: Extract, ): Promise { try { - const existing = await this.workspaceRegistry.get(request.workspaceId); + const existing = await this.resolveWorkspaceByIdOrDirectory(request.workspaceId); if (!existing) { throw new Error(`Workspace not found: ${request.workspaceId}`); } @@ -6213,8 +6324,8 @@ export class Session { throw new Error("Use worktree archive for Paseo worktrees"); } const archivedAt = new Date().toISOString(); - await this.archiveWorkspaceRecord(request.workspaceId, archivedAt); - await this.emitWorkspaceUpdateForCwd(existing.cwd); + await this.archiveWorkspaceRecord(existing.id, archivedAt); + await this.emitWorkspaceUpdateForCwd(existing.directory); this.emit({ type: "archive_workspace_response", payload: { @@ -6266,7 +6377,7 @@ export class Session { return; } - const project = await this.buildProjectPlacement(agent.cwd); + const project = await this.buildProjectPlacementForCwd(agent.cwd); this.emit({ type: "fetch_agent_response", payload: { requestId, agent, project, error: null }, @@ -6277,17 +6388,10 @@ export class Session { msg: Extract, ): Promise { const direction: AgentTimelineFetchDirection = msg.direction ?? (msg.cursor ? "after" : "tail"); - const projection: TimelineProjectionMode = msg.projection ?? "projected"; const requestedLimit = msg.limit; const limit = requestedLimit ?? (direction === "after" ? 0 : undefined); - const shouldLimitByProjectedWindow = - projection === "canonical" && - direction === "tail" && - typeof requestedLimit === "number" && - requestedLimit > 0; const cursor: AgentTimelineCursor | undefined = msg.cursor ? 
{ - epoch: msg.cursor.epoch, seq: msg.cursor.seq, } : undefined; @@ -6295,87 +6399,46 @@ export class Session { try { const snapshot = await this.ensureAgentLoaded(msg.agentId); const agentPayload = await this.buildAgentPayload(snapshot); - - let timeline = this.agentManager.fetchTimeline(msg.agentId, { + const timeline = await this.agentManager.fetchTimeline(msg.agentId, { direction, cursor, - limit: - shouldLimitByProjectedWindow && typeof requestedLimit === "number" - ? Math.max(1, Math.floor(requestedLimit)) - : limit, - }); - - let hasOlder = timeline.hasOlder; - let hasNewer = timeline.hasNewer; - let startCursor: { epoch: string; seq: number } | null = null; - let endCursor: { epoch: string; seq: number } | null = null; - let entries: ReturnType; - - if (shouldLimitByProjectedWindow) { - const projectedLimit = Math.max(1, Math.floor(requestedLimit)); - let fetchLimit = projectedLimit; - let projectedWindow = selectTimelineWindowByProjectedLimit({ - rows: timeline.rows, - provider: snapshot.provider, - direction, - limit: projectedLimit, - collapseToolLifecycle: false, - }); - - while (timeline.hasOlder) { - const needsMoreProjectedEntries = - projectedWindow.projectedEntries.length < projectedLimit; - const firstLoadedRow = timeline.rows[0]; - const firstSelectedRow = projectedWindow.selectedRows[0]; - const startsAtLoadedBoundary = - firstLoadedRow != null && - firstSelectedRow != null && - firstSelectedRow.seq === firstLoadedRow.seq; - const boundaryIsAssistantChunk = - startsAtLoadedBoundary && firstLoadedRow.item.type === "assistant_message"; - - if (!needsMoreProjectedEntries && !boundaryIsAssistantChunk) { - break; - } - - const maxRows = Math.max(0, timeline.window.maxSeq - timeline.window.minSeq + 1); - const nextFetchLimit = Math.min(maxRows, fetchLimit * 2); - if (nextFetchLimit <= fetchLimit) { - break; - } - - fetchLimit = nextFetchLimit; - timeline = this.agentManager.fetchTimeline(msg.agentId, { - direction, - cursor, - limit: fetchLimit, - }); 
- projectedWindow = selectTimelineWindowByProjectedLimit({ - rows: timeline.rows, - provider: snapshot.provider, - direction, - limit: projectedLimit, - collapseToolLifecycle: false, - }); - } - - const selectedRows = projectedWindow.selectedRows; - - entries = projectTimelineRows(selectedRows, snapshot.provider, projection); - - if (projectedWindow.minSeq !== null && projectedWindow.maxSeq !== null) { - startCursor = { epoch: timeline.epoch, seq: projectedWindow.minSeq }; - endCursor = { epoch: timeline.epoch, seq: projectedWindow.maxSeq }; - hasOlder = projectedWindow.minSeq > timeline.window.minSeq; - hasNewer = false; - } - } else { - const firstRow = timeline.rows[0]; - const lastRow = timeline.rows[timeline.rows.length - 1]; - startCursor = firstRow ? { epoch: timeline.epoch, seq: firstRow.seq } : null; - endCursor = lastRow ? { epoch: timeline.epoch, seq: lastRow.seq } : null; - entries = projectTimelineRows(timeline.rows, snapshot.provider, projection); - } + limit, + }); + const firstRow = timeline.rows[0]; + const lastRow = timeline.rows[timeline.rows.length - 1]; + const timelineWindow = + timeline.window ?? + (() => { + const minSeq = firstRow?.seq ?? 0; + const maxSeq = lastRow?.seq ?? 0; + return { + minSeq, + maxSeq, + nextSeq: maxSeq > 0 ? maxSeq + 1 : 0, + }; + })(); + const epoch = `compat:${msg.agentId}`; + const startCursor = firstRow ? { seq: firstRow.seq, epoch } : null; + const endCursor = lastRow ? { seq: lastRow.seq, epoch } : null; + const window = { + minSeq: timelineWindow.minSeq, + maxSeq: timelineWindow.maxSeq, + nextSeq: timelineWindow.nextSeq, + startSeq: firstRow?.seq ?? null, + endSeq: lastRow?.seq ?? 
null, + hasOlder: timeline.hasOlder, + hasNewer: timeline.hasNewer, + }; + const entries = timeline.rows.map((row) => ({ + provider: snapshot.provider, + item: row.item, + timestamp: row.timestamp, + seq: row.seq, + seqStart: row.seq, + seqEnd: row.seq, + sourceSeqRanges: [{ startSeq: row.seq, endSeq: row.seq }], + collapsed: [], + })); this.emit({ type: "fetch_agent_timeline_response", @@ -6384,21 +6447,24 @@ export class Session { agentId: msg.agentId, agent: agentPayload, direction, - projection, - epoch: timeline.epoch, - reset: timeline.reset, - staleCursor: timeline.staleCursor, - gap: timeline.gap, - window: timeline.window, - startCursor, - endCursor, - hasOlder, - hasNewer, + startSeq: firstRow?.seq ?? null, + endSeq: lastRow?.seq ?? null, + hasOlder: timeline.hasOlder, + hasNewer: timeline.hasNewer, entries, error: null, + epoch, + reset: false, + staleCursor: false, + gap: false, + window, + startCursor, + endCursor, + projection: msg.projection ?? "projected", }, }); } catch (error) { + const epoch = `compat:${msg.agentId}`; this.sessionLogger.error( { err: error, agentId: msg.agentId }, "Failed to handle fetch_agent_timeline_request", @@ -6410,18 +6476,28 @@ export class Session { agentId: msg.agentId, agent: null, direction, - projection, - epoch: "", + startSeq: null, + endSeq: null, + hasOlder: false, + hasNewer: false, + entries: [], + error: error instanceof Error ? error.message : String(error), + epoch, reset: false, staleCursor: false, gap: false, - window: { minSeq: 0, maxSeq: 0, nextSeq: 0 }, + window: { + minSeq: 0, + maxSeq: 0, + nextSeq: 0, + startSeq: null, + endSeq: null, + hasOlder: false, + hasNewer: false, + }, startCursor: null, endCursor: null, - hasOlder: false, - hasNewer: false, - entries: [], - error: error instanceof Error ? error.message : String(error), + projection: msg.projection ?? 
"projected", }, }); } @@ -7880,7 +7956,7 @@ export class Session { private emitTerminalsChangedSnapshot(input: { cwd: string; - terminals: Array<{ id: string; name: string }>; + terminals: Array<{ id: string; name: string; title?: string }>; }): void { this.emit({ type: "terminals_changed", @@ -7891,6 +7967,23 @@ export class Session { }); } + private filterStandaloneTerminals(terminals: T[]): T[] { + return terminals; + } + + private toTerminalInfo(terminal: Pick): { + id: string; + name: string; + title?: string; + } { + const title = terminal.getTitle(); + return { + id: terminal.id, + name: terminal.name, + ...(title ? { title } : {}), + }; + } + private handleTerminalsChanged(event: TerminalsChangedEvent): void { if (!this.subscribedTerminalDirectories.has(event.cwd)) { return; @@ -7898,9 +7991,10 @@ export class Session { this.emitTerminalsChangedSnapshot({ cwd: event.cwd, - terminals: event.terminals.map((terminal) => ({ + terminals: this.filterStandaloneTerminals(event.terminals).map((terminal) => ({ id: terminal.id, name: terminal.name, + ...(terminal.title ? { title: terminal.title } : {}), })), }); } @@ -7920,7 +8014,7 @@ export class Session { } try { - const terminals = await this.terminalManager.getTerminals(cwd); + const terminals = this.filterStandaloneTerminals(await this.terminalManager.getTerminals(cwd)); for (const terminal of terminals) { this.ensureTerminalExitSubscription(terminal); } @@ -7931,10 +8025,7 @@ export class Session { this.emitTerminalsChangedSnapshot({ cwd, - terminals: terminals.map((terminal) => ({ - id: terminal.id, - name: terminal.name, - })), + terminals: terminals.map((terminal) => this.toTerminalInfo(terminal)), }); } catch (error) { this.sessionLogger.warn({ err: error, cwd }, "Failed to emit initial terminal snapshot"); @@ -7955,10 +8046,11 @@ export class Session { } try { - const terminals = + const terminals = this.filterStandaloneTerminals( typeof msg.cwd === "string" ? 
await this.terminalManager.getTerminals(msg.cwd) - : await this.getAllTerminalSessions(); + : await this.getAllTerminalSessions(), + ); for (const terminal of terminals) { this.ensureTerminalExitSubscription(terminal); } @@ -7966,7 +8058,7 @@ export class Session { type: "list_terminals_response", payload: { ...(msg.cwd ? { cwd: msg.cwd } : {}), - terminals: terminals.map((t) => ({ id: t.id, name: t.name })), + terminals: terminals.map((terminal) => this.toTerminalInfo(terminal)), requestId: msg.requestId, }, }); @@ -8009,15 +8101,34 @@ export class Session { } try { + if (msg.agentId) { + this.emit({ + type: "create_terminal_response", + payload: { + terminal: null, + error: `Agent-backed terminals are no longer supported for agent ${msg.agentId}`, + requestId: msg.requestId, + }, + }); + return; + } + const session = await this.terminalManager.createTerminal({ cwd: msg.cwd, name: msg.name, + command: msg.command, + args: msg.args, }); this.ensureTerminalExitSubscription(session); this.emit({ type: "create_terminal_response", payload: { - terminal: { id: session.id, name: session.name, cwd: session.cwd }, + terminal: { + id: session.id, + name: session.name, + cwd: session.cwd, + ...(session.getTitle() ? 
{ title: session.getTitle() } : {}), + }, error: null, requestId: msg.requestId, }, @@ -8266,7 +8377,8 @@ export class Session { if (this.activeTerminalStreams.get(slot) !== activeStream) { return; } - if (message.type === "snapshot") { + if (message.type === "snapshot" || message.type === "titleChange") { + activeStream.needsSnapshot = true; this.trySendTerminalSnapshot(activeStream); return; } diff --git a/packages/server/src/server/session.workspace-git-watch.test.ts b/packages/server/src/server/session.workspace-git-watch.test.ts index db704bcd0..e2dd5f54b 100644 --- a/packages/server/src/server/session.workspace-git-watch.test.ts +++ b/packages/server/src/server/session.workspace-git-watch.test.ts @@ -2,6 +2,11 @@ import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs"; import { tmpdir } from "node:os"; import path from "node:path"; import { afterEach, beforeEach, describe, expect, test, vi } from "vitest"; +import { Session } from "./session.js"; +import { + createPersistedProjectRecord, + createPersistedWorkspaceRecord, +} from "./workspace-registry.js"; const { watchCalls, watchMock } = vi.hoisted(() => { const hoistedWatchCalls: Array<{ @@ -53,11 +58,16 @@ vi.mock("./checkout-git-utils.js", () => ({ })), })); -import { Session } from "./session.js"; +vi.mock("@getpaseo/highlight", () => ({ + highlightCode: vi.fn(async () => ""), + isLanguageSupported: vi.fn(() => false), +})); function createSessionForWorkspaceGitWatchTests(): { session: Session; emitted: Array<{ type: string; payload: unknown }>; + projects: Map<number, ReturnType<typeof createPersistedProjectRecord>>; + workspaces: Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>; backgroundGitFetchManager: { subscribe: ReturnType<typeof vi.fn>; subscriptions: Array<{ @@ -76,8 +86,10 @@ }; } { const emitted: Array<{ type: string; payload: unknown }> = []; - const projects = new Map(); - const workspaces = new Map(); + const projects = new Map<number, ReturnType<typeof createPersistedProjectRecord>>(); + const workspaces = new Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>(); + let nextProjectId = 1; + let nextWorkspaceId = 1; const 
backgroundGitFetchSubscriptions: Array<{ params: { repoGitRoot: string; cwd: string }; listener: () => void; @@ -125,46 +137,56 @@ initialize: async () => {}, existsOnDisk: async () => true, list: async () => Array.from(projects.values()), - get: async (projectId: string) => projects.get(projectId) ?? null, + get: async (id: number) => projects.get(id) ?? null, + insert: async (record: Omit<ReturnType<typeof createPersistedProjectRecord>, "id">) => { + const id = nextProjectId++; + projects.set(id, createPersistedProjectRecord({ id, ...record })); + return id; + }, upsert: async (record: any) => { - projects.set(record.projectId, record); + projects.set(record.id, record); }, - archive: async (projectId: string, archivedAt: string) => { - const existing = projects.get(projectId); + archive: async (id: number, archivedAt: string) => { + const existing = projects.get(id); if (!existing) { return; } - projects.set(projectId, { + projects.set(id, { ...existing, archivedAt, updatedAt: archivedAt, }); }, - remove: async (projectId: string) => { - projects.delete(projectId); + remove: async (id: number) => { + projects.delete(id); }, } as any, workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, list: async () => Array.from(workspaces.values()), - get: async (workspaceId: string) => workspaces.get(workspaceId) ?? null, + get: async (id: number) => workspaces.get(id) ?? 
null, + insert: async (record: Omit<ReturnType<typeof createPersistedWorkspaceRecord>, "id">) => { + const id = nextWorkspaceId++; + workspaces.set(id, createPersistedWorkspaceRecord({ id, ...record })); + return id; + }, upsert: async (record: any) => { - workspaces.set(record.workspaceId, record); + workspaces.set(record.id, record); }, - archive: async (workspaceId: string, archivedAt: string) => { - const existing = workspaces.get(workspaceId); + archive: async (id: number, archivedAt: string) => { + const existing = workspaces.get(id); if (!existing) { return; } - workspaces.set(workspaceId, { + workspaces.set(id, { ...existing, archivedAt, updatedAt: archivedAt, }); }, - remove: async (workspaceId: string) => { - workspaces.delete(workspaceId); + remove: async (id: number) => { + workspaces.delete(id); }, } as any, checkoutDiffManager: { @@ -190,11 +212,13 @@ terminalManager: null, }) as any; - session.listAgentPayloads = async () => []; + (session as any).listAgentPayloads = async () => []; return { session, emitted, + projects, + workspaces, backgroundGitFetchManager: { subscribe: backgroundGitFetchManager.subscribe, subscriptions: backgroundGitFetchSubscriptions, @@ -203,6 +227,40 @@ }; } +function seedGitWorkspace(input: { + projects: Map<number, ReturnType<typeof createPersistedProjectRecord>>; + workspaces: Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>; + projectId: number; + workspaceId: number; + cwd: string; + name: string; +}) { + input.projects.set( + input.projectId, + createPersistedProjectRecord({ + id: input.projectId, + directory: "/tmp/repo", + displayName: "repo", + kind: "git", + gitRemote: "https://github.com/acme/repo.git", + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", + }), + ); + input.workspaces.set( + input.workspaceId, + createPersistedWorkspaceRecord({ + id: input.workspaceId, + projectId: input.projectId, + directory: input.cwd, + displayName: input.name, + kind: "checkout", + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: 
"2026-03-01T12:00:00.000Z", + }), + ); +} + describe("workspace git watch targets", () => { beforeEach(() => { watchCalls.length = 0; @@ -217,22 +275,17 @@ describe("workspace git watch targets", () => { }); test("debounces watcher events and skips unchanged branch/diff snapshots", async () => { - const { session, emitted } = createSessionForWorkspaceGitWatchTests(); + const { session, emitted, projects, workspaces } = createSessionForWorkspaceGitWatchTests(); const sessionAny = session as any; - - sessionAny.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + seedGitWorkspace({ + projects, + workspaces, + projectId: 1, + workspaceId: 10, + cwd: "/tmp/repo", + name: "main", }); + resolveCheckoutGitDirMock.mockResolvedValue("/tmp/repo/.git"); sessionAny.workspaceUpdatesSubscription = { subscriptionId: "sub-1", @@ -240,7 +293,6 @@ describe("workspace git watch targets", () => { isBootstrapping: false, pendingUpdatesByWorkspaceId: new Map(), }; - sessionAny.reconcileActiveWorkspaceRecords = async () => new Set(); let descriptor = { id: "/tmp/repo", @@ -253,13 +305,13 @@ describe("workspace git watch targets", () => { status: "done", activityAt: null, diffStat: { additions: 1, deletions: 0 }, + workspaceDirectory: "/tmp/repo", }; sessionAny.buildWorkspaceDescriptorMap = async () => new Map([[descriptor.id, descriptor]]); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo"); - sessionAny.primeWorkspaceGitWatchFingerprints([descriptor]); + await sessionAny.primeWorkspaceGitWatchFingerprints([descriptor]); expect(watchCalls.map((entry) => entry.path).sort()).toEqual([ "/tmp/repo/.git/HEAD", @@ -305,30 +357,46 @@ describe("workspace git watch targets", () => { }); test("closes watchers when a workspace is archived and when the session 
closes", async () => { - const { session } = createSessionForWorkspaceGitWatchTests(); + const { session, projects, workspaces } = createSessionForWorkspaceGitWatchTests(); const sessionAny = session as any; - sessionAny.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: path.basename(cwd), - checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + seedGitWorkspace({ + projects, + workspaces, + projectId: 2, + workspaceId: 20, + cwd: "/tmp/repo-one", + name: "main", + }); + seedGitWorkspace({ + projects, + workspaces, + projectId: 3, + workspaceId: 30, + cwd: "/tmp/repo-two", + name: "main", }); resolveCheckoutGitDirMock.mockImplementation(async (cwd: string) => path.join(cwd, ".git")); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo-one"); + await sessionAny.primeWorkspaceGitWatchFingerprints([ + { + id: "/tmp/repo-one", + projectId: "/tmp/repo-one", + projectDisplayName: "repo-one", + projectRootPath: "/tmp/repo-one", + projectKind: "git", + workspaceKind: "local_checkout", + name: "main", + status: "done", + activityAt: null, + workspaceDirectory: "/tmp/repo-one", + }, + ]); expect(sessionAny.workspaceGitWatchTargets.size).toBe(1); expect(watchCalls).toHaveLength(2); - await sessionAny.archiveWorkspaceRecord("/tmp/repo-one", "2026-03-21T00:00:00.000Z"); + await sessionAny.archiveWorkspaceRecord(20, "2026-03-21T00:00:00.000Z"); expect(sessionAny.workspaceGitWatchTargets.size).toBe(0); expect(watchCalls.every((entry) => entry.close.mock.calls.length === 1)).toBe(true); @@ -336,7 +404,20 @@ describe("workspace git watch targets", () => { watchCalls.length = 0; watchMock.mockClear(); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo-two"); + await sessionAny.primeWorkspaceGitWatchFingerprints([ + { + id: "/tmp/repo-two", + projectId: "/tmp/repo-two", + projectDisplayName: "repo-two", + 
projectRootPath: "/tmp/repo-two", + projectKind: "git", + workspaceKind: "local_checkout", + name: "main", + status: "done", + activityAt: null, + workspaceDirectory: "/tmp/repo-two", + }, + ]); expect(sessionAny.workspaceGitWatchTargets.size).toBe(1); expect(watchCalls).toHaveLength(2); @@ -366,25 +447,21 @@ describe("workspace git watch targets", () => { }); test("subscribes to the background fetch manager when a git watch target is created", async () => { - const { session, backgroundGitFetchManager } = createSessionForWorkspaceGitWatchTests(); + const { session, projects, workspaces, backgroundGitFetchManager } = + createSessionForWorkspaceGitWatchTests(); const sessionAny = session as any; - sessionAny.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: path.basename(cwd), - checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + seedGitWorkspace({ + projects, + workspaces, + projectId: 4, + workspaceId: 40, + cwd: "/tmp/repo", + name: "main", }); resolveCheckoutGitDirMock.mockResolvedValue("/tmp/repo/.git"); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo"); + await sessionAny.syncWorkspaceGitWatchTarget("/tmp/repo", { isGit: true }); expect(backgroundGitFetchManager.subscribe).toHaveBeenCalledWith( { repoGitRoot: "/tmp/repo/.git", cwd: "/tmp/repo" }, @@ -396,29 +473,33 @@ describe("workspace git watch targets", () => { }); test("stores separate background fetch subscriptions per workspace and unsubscribes removed targets", async () => { - const { session, backgroundGitFetchManager } = createSessionForWorkspaceGitWatchTests(); + const { session, projects, workspaces, backgroundGitFetchManager } = + createSessionForWorkspaceGitWatchTests(); const sessionAny = session as any; - sessionAny.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: path.basename(cwd), 
- checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + seedGitWorkspace({ + projects, + workspaces, + projectId: 5, + workspaceId: 50, + cwd: "/tmp/repo", + name: "main", + }); + seedGitWorkspace({ + projects, + workspaces, + projectId: 6, + workspaceId: 60, + cwd: "/tmp/repo-feature", + name: "feature", }); resolveCheckoutGitDirMock.mockImplementation(async (cwd: string) => cwd === "/tmp/repo" ? "/tmp/repo/.git" : "/tmp/repo/.git/worktrees/feature", ); sessionAny.resolveWorkspaceGitRefsRoot = vi.fn(async () => "/tmp/repo/.git"); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo"); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo-feature"); + await sessionAny.syncWorkspaceGitWatchTarget("/tmp/repo", { isGit: true }); + await sessionAny.syncWorkspaceGitWatchTarget("/tmp/repo-feature", { isGit: true }); expect(backgroundGitFetchManager.subscribe).toHaveBeenCalledTimes(2); expect(backgroundGitFetchManager.subscriptions[0]?.params).toEqual({ @@ -440,21 +521,17 @@ describe("workspace git watch targets", () => { }); test("refreshes the workspace when the background fetch manager callback fires and unsubscribes on cleanup", async () => { - const { session, emitted, backgroundGitFetchManager } = createSessionForWorkspaceGitWatchTests(); + const { session, emitted, projects, workspaces, backgroundGitFetchManager } = + createSessionForWorkspaceGitWatchTests(); const sessionAny = session as any; - sessionAny.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: path.basename(cwd), - checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + seedGitWorkspace({ + projects, + workspaces, + projectId: 7, + workspaceId: 70, + cwd: "/tmp/repo", + name: "main", }); 
resolveCheckoutGitDirMock.mockResolvedValue("/tmp/repo/.git"); sessionAny.workspaceUpdatesSubscription = { @@ -481,7 +558,7 @@ sessionAny.buildWorkspaceDescriptorMap = async () => new Map([[descriptor.id, descriptor]]); - await sessionAny.ensureWorkspaceRegistered("/tmp/repo"); + await sessionAny.syncWorkspaceGitWatchTarget("/tmp/repo", { isGit: true }); sessionAny.primeWorkspaceGitWatchFingerprints([descriptor]); descriptor = { diff --git a/packages/server/src/server/session.workspaces.test.ts b/packages/server/src/server/session.workspaces.test.ts index c054ca006..fda83e65e 100644 --- a/packages/server/src/server/session.workspaces.test.ts +++ b/packages/server/src/server/session.workspaces.test.ts @@ -10,6 +10,11 @@ import { createPersistedWorkspaceRecord, } from "./workspace-registry.js"; +vi.mock("@getpaseo/highlight", () => ({ + highlightCode: vi.fn(async () => ""), + isLanguageSupported: vi.fn(() => false), +})); + function makeAgent(input: { id: string; cwd: string; @@ -61,7 +66,17 @@ }; } -function createSessionForWorkspaceTests(): Session { +function createSessionForWorkspaceTests(): { + session: Session; + emitted: Array<{ type: string; payload: unknown }>; + projects: Map<number, ReturnType<typeof createPersistedProjectRecord>>; + workspaces: Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>; +} { + const emitted: Array<{ type: string; payload: unknown }> = []; + const projects = new Map<number, ReturnType<typeof createPersistedProjectRecord>>(); + const workspaces = new Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>(); + let nextProjectId = 1; + let nextWorkspaceId = 1; const logger = { child: () => logger, trace: vi.fn(), @@ -70,10 +85,13 @@ warn: vi.fn(), error: vi.fn(), }; + const backgroundGitFetchManager = { + subscribe: vi.fn(async () => ({ unsubscribe: vi.fn() })), + }; const session = new Session({ clientId: "test-client", - onMessage: vi.fn(), + onMessage: (message) => emitted.push(message as any), logger: logger as any, downloadTokenStore: {} as any, pushTokenStore: {} as any, @@ -82,9 +100,6 @@ function 
createSessionForWorkspaceTests(): Session { subscribe: () => () => {}, listAgents: () => [], getAgent: () => null, - archiveAgent: async () => ({ archivedAt: new Date().toISOString() }), - clearAgentAttention: async () => {}, - notifyAgentState: () => {}, } as any, agentStorage: { list: async () => [], @@ -93,20 +108,58 @@ projectRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, - upsert: async () => {}, - archive: async () => {}, - remove: async () => {}, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, + insert: async (record: Omit<ReturnType<typeof createPersistedProjectRecord>, "id">) => { + const id = nextProjectId++; + projects.set(id, createPersistedProjectRecord({ id, ...record })); + return id; + }, + upsert: async (record: ReturnType<typeof createPersistedProjectRecord>) => { + projects.set(record.id, record); + }, + archive: async (id: number, archivedAt: string) => { + const existing = projects.get(id); + if (!existing) { + return; + } + projects.set(id, { + ...existing, + archivedAt, + updatedAt: archivedAt, + }); + }, + remove: async (id: number) => { + projects.delete(id); + }, } as any, workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, - upsert: async () => {}, - archive: async () => {}, - remove: async () => {}, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? 
null, + insert: async (record: Omit<ReturnType<typeof createPersistedWorkspaceRecord>, "id">) => { + const id = nextWorkspaceId++; + workspaces.set(id, createPersistedWorkspaceRecord({ id, ...record })); + return id; + }, + upsert: async (record: ReturnType<typeof createPersistedWorkspaceRecord>) => { + workspaces.set(record.id, record); + }, + archive: async (id: number, archivedAt: string) => { + const existing = workspaces.get(id); + if (!existing) { + return; + } + workspaces.set(id, { + ...existing, + archivedAt, + updatedAt: archivedAt, + }); + }, + remove: async (id: number) => { + workspaces.delete(id); + }, } as any, checkoutDiffManager: { subscribe: async () => ({ @@ -122,6 +175,7 @@ }), dispose: () => {}, } as any, + backgroundGitFetchManager: backgroundGitFetchManager as any, createAgentMcpTransport: async () => { throw new Error("not used"); }, @@ -129,11 +183,76 @@ tts: null, terminalManager: null, }) as any; - return session; + + return { session, emitted, projects, workspaces }; +} + +function seedProject(options: { + projects: Map<number, ReturnType<typeof createPersistedProjectRecord>>; + id: number; + directory: string; + displayName: string; + kind?: "git" | "directory"; + gitRemote?: string | null; +}) { + const record = createPersistedProjectRecord({ + id: options.id, + directory: options.directory, + displayName: options.displayName, + kind: options.kind ?? "directory", + gitRemote: options.gitRemote ?? null, + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", + }); + options.projects.set(record.id, record); + return record; +} + +function seedWorkspace(options: { + workspaces: Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>; + id: number; + projectId: number; + directory: string; + displayName: string; + kind?: "checkout" | "worktree"; +}) { + const record = createPersistedWorkspaceRecord({ + id: options.id, + projectId: options.projectId, + directory: options.directory, + displayName: options.displayName, + kind: options.kind ?? 
"checkout", + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", + }); + options.workspaces.set(record.id, record); + return record; +} + +function createTempGitRepo(options?: { + remoteUrl?: string; + branchName?: string; +}): { tempDir: string; repoDir: string } { + const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-workspace-git-"))); + const repoDir = path.join(tempDir, "repo"); + execSync(`mkdir -p ${repoDir}`); + execSync(`git init -b ${options?.branchName ?? "main"}`, { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.email 'test@test.com'", { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.name 'Test'", { cwd: repoDir, stdio: "pipe" }); + writeFileSync(path.join(repoDir, "file.txt"), "hello\n"); + execSync("git add .", { cwd: repoDir, stdio: "pipe" }); + execSync("git -c commit.gpgsign=false commit -m 'initial'", { cwd: repoDir, stdio: "pipe" }); + if (options?.remoteUrl) { + execSync(`git remote add origin ${JSON.stringify(options.remoteUrl)}`, { + cwd: repoDir, + stdio: "pipe", + }); + } + return { tempDir, repoDir }; } describe("workspace aggregation", () => { - test("archive emits an authoritative agent_update upsert for subscribed clients", async () => { + test("archive request emits agent_archived and an authoritative agent_update", async () => { const emitted: Array<{ type: string; payload: any }> = []; const archivedRecord = { id: "agent-1", @@ -143,7 +262,7 @@ describe("workspace aggregation", () => { updatedAt: "2026-03-30T15:00:00.000Z", lastActivityAt: "2026-03-30T15:00:00.000Z", lastUserMessageAt: null, - lastStatus: "idle", + lastStatus: "idle" as const, lastModeId: null, runtimeInfo: null, config: { @@ -156,9 +275,23 @@ describe("workspace aggregation", () => { requiresAttention: false, attentionReason: null, attentionTimestamp: null, - archivedAt: null, + archivedAt: null as string | null, }; - + const projects = new Map>(); + const workspaces = new Map>(); + 
seedProject({ + projects, + id: 1, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/repo", + displayName: "repo", + }); const logger = { child: () => logger, trace: vi.fn(), @@ -168,6 +301,7 @@ describe("workspace aggregation", () => { error: vi.fn(), }; + const closeAgent = vi.fn(async () => undefined); const session = new Session({ clientId: "test-client", onMessage: (message) => emitted.push(message as any), @@ -178,30 +312,24 @@ describe("workspace aggregation", () => { agentManager: { subscribe: () => () => {}, listAgents: () => [], - getAgent: () => null, + getAgent: (agentId: string) => (agentId === "agent-1" ? { id: agentId } : null), archiveAgent: async () => { const archivedAt = new Date().toISOString(); - Object.assign(archivedRecord, { - archivedAt, - updatedAt: archivedAt, - }); + archivedRecord.archivedAt = archivedAt; + archivedRecord.updatedAt = archivedAt; return { archivedAt }; }, clearAgentAttention: async () => {}, - notifyAgentState: () => {}, } as any, agentStorage: { - list: async () => [archivedRecord], + list: async () => [], get: async (agentId: string) => (agentId === archivedRecord.id ? archivedRecord : null), - upsert: async (record: typeof archivedRecord) => { - Object.assign(archivedRecord, record); - }, } as any, projectRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -209,8 +337,8 @@ describe("workspace aggregation", () => { workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? 
null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -243,22 +371,11 @@ describe("workspace aggregation", () => { isBootstrapping: false, pendingUpdatesByAgentId: new Map(), }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, - }); + session.interruptAgentIfRunning = vi.fn(); await session.handleArchiveAgentRequest("agent-1", "req-archive"); + expect(session.interruptAgentIfRunning).toHaveBeenCalledWith("agent-1"); const update = emitted.find((message) => message.type === "agent_update"); expect(update?.payload).toMatchObject({ kind: "upsert", @@ -267,57 +384,62 @@ describe("workspace aggregation", () => { archivedAt: expect.any(String), }, }); - expect( - emitted.find((message) => message.type === "agent_archived")?.payload, - ).toMatchObject({ + const archivedPayload = emitted.find((message) => message.type === "agent_archived")?.payload; + expect(archivedPayload).toMatchObject({ agentId: "agent-1", archivedAt: expect.any(String), requestId: "req-archive", }); + expect(archivedRecord.archivedAt).toBe(archivedPayload.archivedAt); }); test("close_items_request archives agents and kills terminals in one batch", async () => { const emitted: Array<{ type: string; payload: any }> = []; const archivedAt = "2026-04-01T00:00:00.000Z"; - const sessionLogger = { - child: () => sessionLogger, - trace: vi.fn(), - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }; const archivedRecord = { id: "agent-1", provider: "codex", cwd: "/tmp/repo", - model: null, - thinkingOptionId: null, - effectiveThinkingOptionId: null, createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", + lastActivityAt: "2026-03-01T12:00:00.000Z", lastUserMessageAt: null, - status: "idle", - capabilities: { - supportsStreaming: 
true, - supportsSessionPersistence: true, - supportsDynamicModes: true, - supportsMcpServers: true, - supportsReasoningStream: true, - supportsToolInvocations: true, - }, - currentModeId: null, - availableModes: [], - pendingPermissions: [], + lastStatus: "idle" as const, + lastModeId: null, + runtimeInfo: null, + config: null, persistence: null, - runtimeInfo: { provider: "codex", sessionId: null }, title: null, labels: {}, requiresAttention: false, attentionReason: null, attentionTimestamp: null, - archivedAt: null, + archivedAt: null as string | null, + }; + const projects = new Map<number, ReturnType<typeof createPersistedProjectRecord>>(); + const workspaces = new Map<number, ReturnType<typeof createPersistedWorkspaceRecord>>(); + seedProject({ + projects, + id: 2, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 20, + projectId: 2, + directory: "/tmp/repo", + displayName: "repo", + }); + const sessionLogger = { + child: () => sessionLogger, + trace: vi.fn(), + debug: vi.fn(), + info: vi.fn(), + warn: vi.fn(), + error: vi.fn(), + }; + const session = new Session({ clientId: "test-client", onMessage: (message) => emitted.push(message as any), @@ -329,26 +451,22 @@ subscribe: () => () => {}, listAgents: () => [], getAgent: (agentId: string) => (agentId === "agent-1" ? { id: agentId } : null), - archiveAgent: async () => ({ archivedAt }), + archiveAgent: async () => { + archivedRecord.archivedAt = archivedAt; + archivedRecord.updatedAt = archivedAt; + return { archivedAt }; + }, clearAgentAttention: async () => {}, - notifyAgentState: () => {}, } as any, agentStorage: { list: async () => [], - get: async (agentId: string) => { - if (agentId !== "agent-1") { - return null; - } - archivedRecord.archivedAt = archivedAt; - archivedRecord.updatedAt = archivedAt; - return archivedRecord; - }, + get: async (agentId: string) => (agentId === archivedRecord.id ? 
archivedRecord : null), } as any, projectRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -356,8 +474,8 @@ describe("workspace aggregation", () => { workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -393,19 +511,6 @@ describe("workspace aggregation", () => { isBootstrapping: false, pendingUpdatesByAgentId: new Map(), }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, - }); session.interruptAgentIfRunning = vi.fn(); await session.handleMessage({ @@ -417,10 +522,9 @@ describe("workspace aggregation", () => { expect(session.interruptAgentIfRunning).toHaveBeenCalledWith("agent-1"); expect(session.terminalManager.killTerminal).toHaveBeenCalledWith("term-1"); - expect( - emitted.find((message) => message.type === "close_items_response")?.payload, - ).toEqual({ - agents: [{ agentId: "agent-1", archivedAt }], + const closePayload = emitted.find((message) => message.type === "close_items_response")?.payload; + expect(closePayload).toEqual({ + agents: [{ agentId: "agent-1", archivedAt: expect.any(String) }], terminals: [{ terminalId: "term-1", success: true }], requestId: "req-close-items", }); @@ -446,32 +550,76 @@ describe("workspace aggregation", () => { const liveArchivedAt = "2026-04-01T00:00:00.000Z"; const storedAgentId = "agent-stored"; const 
liveRecord = { - ...makeAgent({ - id: "agent-live", - cwd: "/tmp/repo", - status: "idle", - updatedAt: "2026-03-01T12:00:00.000Z", - }), + id: "agent-live", + provider: "codex", + cwd: "/tmp/repo", + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", + lastActivityAt: "2026-03-01T12:00:00.000Z", + lastUserMessageAt: null, + lastStatus: "idle" as const, + lastModeId: null, + runtimeInfo: null, + config: null, + persistence: null, + title: null, + labels: {}, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, archivedAt: null as string | null, }; const storedRecord = { - ...makeAgent({ - id: storedAgentId, - cwd: "/tmp/repo", - status: "idle", - updatedAt: "2026-03-01T12:05:00.000Z", - }), + id: storedAgentId, + provider: "codex", + cwd: "/tmp/repo", + createdAt: "2026-03-01T12:05:00.000Z", + updatedAt: "2026-03-01T12:05:00.000Z", + lastActivityAt: "2026-03-01T12:05:00.000Z", + lastUserMessageAt: null, + lastStatus: "idle" as const, + lastModeId: null, + runtimeInfo: null, + config: null, + persistence: null, + title: null, + labels: {}, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, archivedAt: null as string | null, }; const upsertStoredRecord = vi.fn(async (record: typeof storedRecord) => { - if (record.id === storedAgentId) { - storedRecord.archivedAt = record.archivedAt; - storedRecord.updatedAt = record.updatedAt; - storedRecord.status = record.status; - storedRecord.requiresAttention = record.requiresAttention; - storedRecord.attentionReason = record.attentionReason; - storedRecord.attentionTimestamp = record.attentionTimestamp; + if (record.id !== storedAgentId) { + return; } + storedRecord.archivedAt = record.archivedAt; + storedRecord.updatedAt = record.updatedAt; + storedRecord.lastStatus = record.lastStatus; + storedRecord.requiresAttention = record.requiresAttention; + storedRecord.attentionReason = record.attentionReason; + storedRecord.attentionTimestamp = 
record.attentionTimestamp; + }); + const projects = new Map>(); + const workspaces = new Map>(); + seedProject({ + projects, + id: 3, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 30, + projectId: 3, + directory: "/tmp/repo", + displayName: "repo", + }); + + const archiveSnapshot = vi.fn(async (_agentId: string, archivedAt: string) => { + storedRecord.archivedAt = archivedAt; + storedRecord.updatedAt = archivedAt; + return { ...storedRecord, archivedAt, updatedAt: archivedAt }; }); const session = new Session({ @@ -493,8 +641,8 @@ describe("workspace aggregation", () => { liveRecord.updatedAt = liveArchivedAt; return { archivedAt: liveArchivedAt }; }, + archiveSnapshot, clearAgentAttention: async () => {}, - notifyAgentState: () => {}, } as any, agentStorage: { list: async () => [], @@ -512,8 +660,8 @@ describe("workspace aggregation", () => { projectRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -521,8 +669,8 @@ describe("workspace aggregation", () => { workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? 
null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -558,19 +706,6 @@ describe("workspace aggregation", () => { isBootstrapping: false, pendingUpdatesByAgentId: new Map(), }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, - }); session.interruptAgentIfRunning = vi.fn(); await session.handleMessage({ @@ -580,7 +715,8 @@ describe("workspace aggregation", () => { requestId: "req-close-stored", }); - expect(upsertStoredRecord).toHaveBeenCalledTimes(1); + expect(archiveSnapshot).toHaveBeenCalledTimes(1); + expect(archiveSnapshot).toHaveBeenCalledWith(storedAgentId, expect.any(String)); expect(storedRecord.archivedAt).toEqual(expect.any(String)); expect(emitted.find((message) => message.type === "close_items_response")?.payload).toEqual({ agents: [ @@ -605,14 +741,41 @@ describe("workspace aggregation", () => { }; const archivedAt = "2026-04-01T00:00:00.000Z"; const goodRecord = { - ...makeAgent({ - id: "agent-good", - cwd: "/tmp/repo", - status: "idle", - updatedAt: "2026-03-01T12:00:00.000Z", - }), + id: "agent-good", + provider: "codex", + cwd: "/tmp/repo", + createdAt: "2026-03-01T12:00:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", + lastActivityAt: "2026-03-01T12:00:00.000Z", + lastUserMessageAt: null, + lastStatus: "idle" as const, + lastModeId: null, + runtimeInfo: null, + config: null, + persistence: null, + title: null, + labels: {}, + requiresAttention: false, + attentionReason: null, + attentionTimestamp: null, archivedAt: null as string | null, }; + const projects = new Map>(); + const workspaces = new Map>(); + seedProject({ + projects, + id: 4, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 40, + projectId: 4, + directory: "/tmp/repo", + displayName: "repo", + }); + 
const session = new Session({ clientId: "test-client", onMessage: (message) => emitted.push(message as any), @@ -629,27 +792,21 @@ describe("workspace aggregation", () => { if (agentId === "agent-bad") { throw new Error("archive failed"); } + goodRecord.archivedAt = archivedAt; + goodRecord.updatedAt = archivedAt; return { archivedAt }; }, clearAgentAttention: async () => {}, - notifyAgentState: () => {}, } as any, agentStorage: { list: async () => [], - get: async (agentId: string) => { - if (agentId !== "agent-good") { - return null; - } - goodRecord.archivedAt = archivedAt; - goodRecord.updatedAt = archivedAt; - return goodRecord; - }, + get: async (agentId: string) => (agentId === "agent-good" ? goodRecord : null), } as any, projectRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -657,8 +814,8 @@ describe("workspace aggregation", () => { workspaceRegistry: { initialize: async () => {}, existsOnDisk: async () => true, - list: async () => [], - get: async () => null, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? 
null, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -694,19 +851,6 @@ describe("workspace aggregation", () => { isBootstrapping: false, pendingUpdatesByAgentId: new Map(), }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, - }); session.interruptAgentIfRunning = vi.fn(); await session.handleMessage({ @@ -719,10 +863,9 @@ describe("workspace aggregation", () => { expect(session.interruptAgentIfRunning).toHaveBeenCalledWith("agent-bad"); expect(session.interruptAgentIfRunning).toHaveBeenCalledWith("agent-good"); expect(session.terminalManager.killTerminal).toHaveBeenCalledWith("term-1"); - expect( - emitted.find((message) => message.type === "close_items_response")?.payload, - ).toEqual({ - agents: [{ agentId: "agent-good", archivedAt }], + const closePayload = emitted.find((message) => message.type === "close_items_response")?.payload; + expect(closePayload).toEqual({ + agents: [{ agentId: "agent-good", archivedAt: expect.any(String) }], terminals: [{ terminalId: "term-1", success: true }], requestId: "req-close-best-effort", }); @@ -736,94 +879,24 @@ describe("workspace aggregation", () => { expect(sessionLogger.warn).toHaveBeenCalled(); }); - test("non-git workspace uses deterministic directory name and no unknown branch fallback", async () => { - const session = createSessionForWorkspaceTests() as any; - session.workspaceRegistry.list = async () => [ - createPersistedWorkspaceRecord({ - workspaceId: "/tmp/non-git", - projectId: "/tmp/non-git", - cwd: "/tmp/non-git", - kind: "directory", - displayName: "non-git", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ]; - session.listAgentPayloads = async () => [ - makeAgent({ - id: "a1", - cwd: "/tmp/non-git", - status: "idle", - updatedAt: 
"2026-03-01T12:00:00.000Z", - }), - ]; - const result = await session.listFetchWorkspacesEntries({ - type: "fetch_workspaces_request", - requestId: "req-1", - }); - - expect(result.entries).toHaveLength(1); - expect(result.entries[0]?.name).toBe("non-git"); - expect(result.entries[0]?.name).not.toBe("Unknown branch"); - }); - - test("git branch workspace uses branch as canonical name", async () => { - const session = createSessionForWorkspaceTests() as any; - session.workspaceRegistry.list = async () => [ - createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo-branch", - projectId: "/tmp/repo-branch", - cwd: "/tmp/repo-branch", - kind: "local_checkout", - displayName: "feature/name-from-server", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ]; - session.listAgentPayloads = async () => [ - makeAgent({ - id: "a1", - cwd: "/tmp/repo-branch", - status: "running", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ]; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo-branch", - checkout: { - cwd, - isGit: true, - currentBranch: "feature/name-from-server", - remoteUrl: "https://github.com/acme/repo-branch.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + test("uses persisted workspace names and stable status aggregation", async () => { + const { session, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/repo", + displayName: "repo", + kind: "directory", }); - const result = await session.listFetchWorkspacesEntries({ - type: "fetch_workspaces_request", - requestId: "req-branch", + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/repo", + displayName: "repo", }); - expect(result.entries).toHaveLength(1); - expect(result.entries[0]?.name).toBe("feature/name-from-server"); - }); - - test("branch/detached policies and dominant status bucket are 
deterministic", async () => { - const session = createSessionForWorkspaceTests() as any; - session.workspaceRegistry.list = async () => [ - createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: "/tmp/repo", - cwd: "/tmp/repo", - kind: "local_checkout", - displayName: "repo", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ]; - session.listAgentPayloads = async () => [ + (session as any).listAgentPayloads = async () => [ makeAgent({ id: "a1", cwd: "/tmp/repo", @@ -833,138 +906,94 @@ describe("workspace aggregation", () => { makeAgent({ id: "a2", cwd: "/tmp/repo", - status: "error", - updatedAt: "2026-03-01T12:01:00.000Z", - }), - makeAgent({ - id: "a3", - cwd: "/tmp/repo", status: "idle", - updatedAt: "2026-03-01T12:02:00.000Z", + updatedAt: "2026-03-01T12:01:00.000Z", pendingPermissions: 1, }), ]; - const result = await session.listFetchWorkspacesEntries({ + + const result = await (session as any).listFetchWorkspacesEntries({ type: "fetch_workspaces_request", - requestId: "req-2", + requestId: "req-1", }); - expect(result.entries).toHaveLength(1); - expect(result.entries[0]?.name).toBe("repo"); - expect(result.entries[0]?.status).toBe("needs_input"); - }); - - test("subdirectory agents map to an existing parent workspace descriptor", async () => { - const session = createSessionForWorkspaceTests() as any; - session.workspaceRegistry.list = async () => [ - createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", + expect(result.entries).toEqual([ + expect.objectContaining({ + id: "/tmp/repo", projectId: "/tmp/repo", - cwd: "/tmp/repo", - kind: "local_checkout", - displayName: "main", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", + name: "repo", + projectKind: "non_git", + workspaceKind: "local_checkout", + status: "needs_input", }), - ]; - session.listAgentPayloads = async () => [ + ]); + }); + + test("keeps persisted git worktree display names", async () => { + 
const { session, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 2, + directory: "/tmp/repo", + displayName: "repo", + kind: "git", + gitRemote: "https://github.com/acme/repo.git", + }); + seedWorkspace({ + workspaces, + id: 20, + projectId: 2, + directory: "/tmp/repo/.paseo/worktrees/feature-name", + displayName: "feature-name", + kind: "worktree", + }); + + (session as any).listAgentPayloads = async () => [ makeAgent({ id: "a1", - cwd: "/tmp/repo/packages/app", + cwd: "/tmp/repo/.paseo/worktrees/feature-name", status: "running", - updatedAt: "2026-03-01T12:03:00.000Z", + updatedAt: "2026-03-01T12:00:00.000Z", }), ]; - const result = await session.listFetchWorkspacesEntries({ + const result = await (session as any).listFetchWorkspacesEntries({ type: "fetch_workspaces_request", - requestId: "req-subdir-agent", + requestId: "req-branch", }); - expect(result.entries).toHaveLength(1); expect(result.entries[0]).toMatchObject({ - id: "/tmp/repo", - status: "running", - activityAt: "2026-03-01T12:03:00.000Z", + id: "/tmp/repo/.paseo/worktrees/feature-name", + name: "feature-name", + projectKind: "git", + workspaceKind: "worktree", }); }); test("workspace update stream keeps persisted workspace visible after agents stop", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const logger = { - child: () => logger, - trace: vi.fn(), - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }; - - const session = new Session({ - clientId: "test-client", - onMessage: (message) => emitted.push(message as any), - logger: logger as any, - downloadTokenStore: {} as any, - pushTokenStore: {} as any, - paseoHome: "/tmp/paseo-test", - agentManager: { - subscribe: () => () => {}, - listAgents: () => [], - getAgent: () => null, - } as any, - agentStorage: { - list: async () => [], - get: async () => null, - } as any, - projectRegistry: { - initialize: async () => {}, - existsOnDisk: async () => 
true, - list: async () => [], - get: async () => null, - upsert: async () => {}, - archive: async () => {}, - remove: async () => {}, - } as any, - workspaceRegistry: { - initialize: async () => {}, - existsOnDisk: async () => true, - list: async () => [], - get: async () => null, - upsert: async () => {}, - archive: async () => {}, - remove: async () => {}, - } as any, - checkoutDiffManager: { - subscribe: async () => ({ - initial: { cwd: "/tmp", files: [], error: null }, - unsubscribe: () => {}, - }), - scheduleRefreshForCwd: () => {}, - getMetrics: () => ({ - checkoutDiffTargetCount: 0, - checkoutDiffSubscriptionCount: 0, - checkoutDiffWatcherCount: 0, - checkoutDiffFallbackRefreshTargetCount: 0, - }), - dispose: () => {}, - } as any, - createAgentMcpTransport: async () => { - throw new Error("not used"); - }, - stt: null, - tts: null, - terminalManager: null, - }) as any; + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 3, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 30, + projectId: 3, + directory: "/tmp/repo", + displayName: "repo", + }); - session.workspaceUpdatesSubscription = { + (session as any).workspaceUpdatesSubscription = { subscriptionId: "sub-1", filter: undefined, isBootstrapping: false, pendingUpdatesByWorkspaceId: new Map(), }; - session.reconcileActiveWorkspaceRecords = async () => new Set(); - - session.buildWorkspaceDescriptorMap = async () => + (session as any).reconcileActiveWorkspaceRecords = async () => new Set(); + (session as any).buildWorkspaceDescriptorMap = async () => new Map([ [ "/tmp/repo", @@ -973,17 +1002,19 @@ describe("workspace aggregation", () => { projectId: "/tmp/repo", projectDisplayName: "repo", projectRootPath: "/tmp/repo", + workspaceDirectory: "/tmp/repo", projectKind: "non_git", - workspaceKind: "directory", + workspaceKind: "local_checkout", name: "repo", status: "running", activityAt: 
"2026-03-01T12:00:00.000Z", + services: [], }, ], ]); - await session.emitWorkspaceUpdateForCwd("/tmp/repo"); + await (session as any).emitWorkspaceUpdateForCwd("/tmp/repo"); - session.buildWorkspaceDescriptorMap = async () => + (session as any).buildWorkspaceDescriptorMap = async () => new Map([ [ "/tmp/repo", @@ -992,19 +1023,20 @@ describe("workspace aggregation", () => { projectId: "/tmp/repo", projectDisplayName: "repo", projectRootPath: "/tmp/repo", + workspaceDirectory: "/tmp/repo", projectKind: "non_git", - workspaceKind: "directory", + workspaceKind: "local_checkout", name: "repo", status: "done", activityAt: null, + services: [], }, ], ]); - await session.emitWorkspaceUpdateForCwd("/tmp/repo"); + await (session as any).emitWorkspaceUpdateForCwd("/tmp/repo"); const workspaceUpdates = emitted.filter((message) => message.type === "workspace_update"); expect(workspaceUpdates).toHaveLength(2); - expect((workspaceUpdates[0] as any).payload.kind).toBe("upsert"); expect((workspaceUpdates[1] as any).payload).toEqual({ kind: "upsert", workspace: { @@ -1012,18 +1044,19 @@ describe("workspace aggregation", () => { projectId: "/tmp/repo", projectDisplayName: "repo", projectRootPath: "/tmp/repo", + workspaceDirectory: "/tmp/repo", projectKind: "non_git", - workspaceKind: "directory", + workspaceKind: "local_checkout", name: "repo", status: "done", activityAt: null, + services: [], }, }); }); - test("create paseo worktree request returns a registered workspace descriptor", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; + test("create paseo worktree request inserts a workspace under the existing project", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-worktree-test-"))); const repoDir = path.join(tempDir, "repo"); const paseoHome = path.join(tempDir, 
"paseo-home"); @@ -1035,63 +1068,96 @@ describe("workspace aggregation", () => { execSync("git add .", { cwd: repoDir, stdio: "pipe" }); execSync("git -c commit.gpgsign=false commit -m 'initial'", { cwd: repoDir, stdio: "pipe" }); - const workspaces = new Map(); - const projects = new Map(); - session.paseoHome = paseoHome; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? null; - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.workspaceRegistry.upsert = async (record: any) => { - workspaces.set(record.workspaceId, record); - }; - session.projectRegistry.get = async (projectId: string) => projects.get(projectId) ?? null; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.projectRegistry.upsert = async (record: any) => { - projects.set(record.projectId, record); - }; - session.emit = (message: { type: string; payload: unknown }) => { - emitted.push(message); - }; + (session as any).paseoHome = paseoHome; + seedProject({ + projects, + id: 4, + directory: repoDir, + displayName: "repo", + kind: "git", + gitRemote: "https://github.com/acme/repo.git", + }); + seedWorkspace({ + workspaces, + id: 40, + projectId: 4, + directory: repoDir, + displayName: "main", + kind: "checkout", + }); + try { - await session.handleCreatePaseoWorktreeRequest({ + await (session as any).handleCreatePaseoWorktreeRequest({ type: "create_paseo_worktree_request", cwd: repoDir, worktreeSlug: "worktree-123", requestId: "req-worktree", }); + + const response = emitted.find( + (message) => message.type === "create_paseo_worktree_response", + ) as + | { type: "create_paseo_worktree_response"; payload: any } + | undefined; + + expect(response?.payload.error).toBeNull(); + expect(response?.payload.workspace).toMatchObject({ + projectDisplayName: "repo", + projectKind: "git", + workspaceKind: "worktree", + name: "worktree-123", + status: "done", + }); + 
expect(response?.payload.workspace?.id).toEqual(expect.any(String)); + expect(response?.payload.workspace?.id).toContain("worktree-123"); + // The worktree directory is created asynchronously in the background after + // the response is sent, so we only verify the DB record here. + const persistedWorkspace = Array.from(workspaces.values()).find( + (ws) => ws.directory === response!.payload.workspace.id, + ); + expect(persistedWorkspace).toBeTruthy(); + expect(persistedWorkspace?.directory).toContain(path.join("worktree-123")); + expect(response?.payload.workspace?.projectId).toEqual(repoDir); } finally { rmSync(tempDir, { recursive: true, force: true }); } - - const response = emitted.find((message) => message.type === "create_paseo_worktree_response") as - | { type: "create_paseo_worktree_response"; payload: any } - | undefined; - - expect(response?.payload.error).toBeNull(); - expect(response?.payload.workspace).toMatchObject({ - projectDisplayName: "repo", - projectKind: "git", - workspaceKind: "worktree", - name: "worktree-123", - status: "done", - }); - expect(response?.payload.workspace?.id).toContain(path.join("worktree-123")); - expect(workspaces.has(response?.payload.workspace?.id)).toBe(true); - expect(projects.has(response?.payload.workspace?.projectId)).toBe(true); }); - test("workspace update fanout for multiple cwd values is deduplicated", async () => { const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; - session.workspaceUpdatesSubscription = { + const { session, projects, workspaces } = createSessionForWorkspaceTests(); + const sessionAny = session as any; + seedProject({ + projects, + id: 6, + directory: "/tmp/repo", + displayName: "repo", + kind: "git", + }); + seedWorkspace({ + workspaces, + id: 60, + projectId: 6, + directory: "/tmp/repo", + displayName: "main", + kind: "checkout", + }); + seedWorkspace({ + workspaces, + id: 61, + projectId: 6, + directory: 
"/tmp/repo/worktree", + displayName: "feature", + kind: "worktree", + }); + + sessionAny.workspaceUpdatesSubscription = { subscriptionId: "sub-dedup", filter: undefined, isBootstrapping: false, pendingUpdatesByWorkspaceId: new Map(), }; - session.reconcileActiveWorkspaceRecords = async () => - new Set(["/tmp/repo", "/tmp/repo/worktree"]); - session.buildWorkspaceDescriptorMap = async () => + sessionAny.reconcileActiveWorkspaceRecords = async () => new Set(); + sessionAny.buildWorkspaceDescriptorMap = async () => new Map([ [ "/tmp/repo", @@ -1100,11 +1166,13 @@ describe("workspace aggregation", () => { projectId: "/tmp/repo", projectDisplayName: "repo", projectRootPath: "/tmp/repo", + workspaceDirectory: "/tmp/repo", projectKind: "git", workspaceKind: "local_checkout", name: "main", status: "done", activityAt: null, + services: [], }, ], [ @@ -1114,110 +1182,59 @@ describe("workspace aggregation", () => { projectId: "/tmp/repo", projectDisplayName: "repo", projectRootPath: "/tmp/repo", + workspaceDirectory: "/tmp/repo/worktree", projectKind: "git", workspaceKind: "worktree", name: "feature", status: "running", activityAt: "2026-03-01T12:00:00.000Z", + services: [], }, ], ]); - session.onMessage = (message: { type: string; payload: unknown }) => { - emitted.push(message); - }; + sessionAny.emit = (message: any) => emitted.push(message); - await session.emitWorkspaceUpdateForCwd("/tmp/repo/worktree"); - await new Promise((resolve) => setTimeout(resolve, 0)); + await sessionAny.emitWorkspaceUpdatesForCwds(["/tmp/repo/worktree", "/tmp/repo"]); const workspaceUpdates = emitted.filter( (message) => message.type === "workspace_update", ) as any[]; - expect(workspaceUpdates).toHaveLength(3); - expect(workspaceUpdates.map((entry) => entry.payload.kind)).toEqual([ - "upsert", - "upsert", - "upsert", - ]); + expect(workspaceUpdates).toHaveLength(2); + expect(workspaceUpdates.map((entry) => entry.payload.kind)).toEqual(["upsert", "upsert"]); expect(workspaceUpdates.map((entry) 
=> entry.payload.workspace.id).sort()).toEqual([ "/tmp/repo", "/tmp/repo/worktree", - "/tmp/repo/worktree", ]); }); test("open_project_request registers a workspace before any agent exists", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; - const projects = new Map>(); - const workspaces = new Map>(); - - session.emit = (message: any) => emitted.push(message); - session.projectRegistry.get = async (projectId: string) => projects.get(projectId) ?? null; - session.projectRegistry.upsert = async ( - record: ReturnType, - ) => { - projects.set(record.projectId, record); - }; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? null; - session.workspaceRegistry.upsert = async ( - record: ReturnType, - ) => { - workspaces.set(record.workspaceId, record); - }; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: cwd, - projectName: "repo", - checkout: { - cwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, - }); + const { session, emitted, workspaces } = createSessionForWorkspaceTests(); + const sessionAny = session as any; - await session.handleMessage({ + sessionAny.resolveWorkspaceDirectory = async (cwd: string) => cwd; + + await sessionAny.handleMessage({ type: "open_project_request", cwd: "/tmp/repo", requestId: "req-open", }); - expect(workspaces.get("/tmp/repo")).toBeTruthy(); + expect(Array.from(workspaces.values()).some((workspace) => workspace.directory === "/tmp/repo")).toBe( + true, + ); const response = emitted.find((message) => message.type === "open_project_response") as any; expect(response?.payload.error).toBeNull(); 
expect(response?.payload.workspace?.id).toBe("/tmp/repo"); }); test("open_project_request collapses a git subdirectory onto the repo root workspace", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; - const projects = new Map>(); - const workspaces = new Map>(); + const { session, emitted, workspaces } = createSessionForWorkspaceTests(); + const sessionAny = session as any; const repoRoot = "/tmp/repo"; const subdir = "/tmp/repo/packages/app"; - session.emit = (message: any) => emitted.push(message); - session.projectRegistry.get = async (projectId: string) => projects.get(projectId) ?? null; - session.projectRegistry.upsert = async ( - record: ReturnType, - ) => { - projects.set(record.projectId, record); - }; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? null; - session.workspaceRegistry.upsert = async ( - record: ReturnType, - ) => { - workspaces.set(record.workspaceId, record); - }; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.buildProjectPlacement = async (cwd: string) => ({ + sessionAny.resolveWorkspaceDirectory = async () => repoRoot; + sessionAny.buildProjectPlacement = async (cwd: string) => ({ projectKey: repoRoot, projectName: "repo", checkout: { @@ -1231,30 +1248,33 @@ describe("workspace aggregation", () => { }, }); - await session.handleMessage({ + await sessionAny.handleMessage({ type: "open_project_request", cwd: subdir, requestId: "req-open-subdir", }); - expect(workspaces.get(repoRoot)).toBeTruthy(); - expect(workspaces.has(subdir)).toBe(false); + expect(Array.from(workspaces.values()).some((workspace) => workspace.directory === repoRoot)).toBe( + true, + ); + expect(Array.from(workspaces.values()).some((workspace) => workspace.directory === subdir)).toBe( + false, + ); const response = 
emitted.find((message) => message.type === "open_project_response") as any; expect(response?.payload.error).toBeNull(); expect(response?.payload.workspace?.id).toBe(repoRoot); }); test("list_available_editors_request returns available targets", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; + const { session, emitted } = createSessionForWorkspaceTests(); + const sessionAny = session as any; - session.emit = (message: any) => emitted.push(message); - session.getAvailableEditorTargets = async () => [ + sessionAny.getAvailableEditorTargets = async () => [ { id: "cursor", label: "Cursor" }, { id: "finder", label: "Finder" }, ]; - await session.handleMessage({ + await sessionAny.handleMessage({ type: "list_available_editors_request", requestId: "req-editors", }); @@ -1270,16 +1290,15 @@ describe("workspace aggregation", () => { }); test("open_in_editor_request launches the selected target", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; + const { session, emitted } = createSessionForWorkspaceTests(); + const sessionAny = session as any; const calls: Array<{ editorId: string; path: string }> = []; - session.emit = (message: any) => emitted.push(message); - session.openEditorTarget = async (input: { editorId: string; path: string }) => { + sessionAny.openEditorTarget = async (input: { editorId: string; path: string }) => { calls.push(input); }; - await session.handleMessage({ + await sessionAny.handleMessage({ type: "open_in_editor_request", requestId: "req-open-editor", editorId: "vscode", @@ -1293,465 +1312,582 @@ describe("workspace aggregation", () => { expect(response?.payload.error).toBeNull(); }); - test("archive_workspace_request hides non-destructive workspace records", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = 
createSessionForWorkspaceTests() as any; - const workspace = createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: "/tmp/repo", - cwd: "/tmp/repo", - kind: "directory", + test("archive_workspace_request archives the persisted workspace row", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 5, + directory: "/tmp/repo", + displayName: "repo", + }); + seedWorkspace({ + workspaces, + id: 50, + projectId: 5, + directory: "/tmp/repo", displayName: "repo", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", }); - session.emit = (message: any) => emitted.push(message); - session.workspaceRegistry.get = async () => workspace; - session.workspaceRegistry.archive = async (_workspaceId: string, archivedAt: string) => { - workspace.archivedAt = archivedAt; - }; - session.workspaceRegistry.list = async () => [workspace]; - session.projectRegistry.archive = async () => {}; - - await session.handleMessage({ + await (session as any).handleMessage({ type: "archive_workspace_request", - workspaceId: "/tmp/repo", + workspaceId: 50, requestId: "req-archive", }); - expect(workspace.archivedAt).toBeTruthy(); + expect(workspaces.get(50)?.archivedAt).toBeTruthy(); const response = emitted.find( (message) => message.type === "archive_workspace_response", ) as any; - expect(response?.payload.error).toBeNull(); + expect(response?.payload).toMatchObject({ + workspaceId: 50, + error: null, + }); }); - test("opening a new worktree reconciles older local workspaces into the remote project", async () => { - const emitted: Array<{ type: string; payload: unknown }> = []; - const session = createSessionForWorkspaceTests() as any; - const projects = new Map>(); - const workspaces = new Map>(); - - const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-workspace-reconcile-"))); - const mainWorkspaceId = path.join(tempDir, "inkwell"); - const 
worktreeWorkspaceId = path.join(mainWorkspaceId, ".paseo", "worktrees", "feature-a"); - const localProjectId = mainWorkspaceId; - const remoteProjectId = "remote:github.com/zimakki/inkwell"; - - execSync(`mkdir -p ${JSON.stringify(worktreeWorkspaceId)}`); - - projects.set( - localProjectId, - createPersistedProjectRecord({ - projectId: localProjectId, - rootPath: mainWorkspaceId, - kind: "git", - displayName: "inkwell", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", + test("create_agent_request uses workspaceId as the execution authority", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 5, + directory: "/tmp/repo", + displayName: "repo", + kind: "git", + }); + seedWorkspace({ + workspaces, + id: 50, + projectId: 5, + directory: "/tmp/repo/.paseo/worktrees/feature", + displayName: "feature", + kind: "worktree", + }); + + const createdAgent = makeAgent({ + id: "agent-1", + cwd: "/tmp/repo/.paseo/worktrees/feature", + status: "idle", + updatedAt: "2026-03-01T12:00:00.000Z", + }); + const createAgent = vi.fn(async () => createdAgent as any); + + (session as any).agentManager = { + createAgent, + getAgent: vi.fn(() => createdAgent as any), + }; + (session as any).forwardAgentUpdate = vi.fn(async () => undefined); + (session as any).getAgentPayloadById = vi.fn(async () => createdAgent); + (session as any).buildAgentSessionConfig = vi.fn(async (config: any) => ({ + sessionConfig: config, + worktreeConfig: null, + })); + + await (session as any).handleCreateAgentRequest({ + type: "create_agent_request", + requestId: "req-create-agent", + workspaceId: 50, + config: { + provider: "codex", + cwd: "/tmp/repo", + modeId: "default", + }, + labels: {}, + }); + + expect(createAgent).toHaveBeenCalledWith( + expect.objectContaining({ + cwd: "/tmp/repo/.paseo/worktrees/feature", }), - ); - workspaces.set( - mainWorkspaceId, - createPersistedWorkspaceRecord({ - 
workspaceId: mainWorkspaceId, - projectId: localProjectId, - cwd: mainWorkspaceId, - kind: "local_checkout", - displayName: "main", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", + undefined, + expect.objectContaining({ + workspaceId: 50, }), ); + const response = emitted.find((message) => message.type === "status") as any; + expect(response?.payload).toMatchObject({ + status: "agent_created", + requestId: "req-create-agent", + agent: { + cwd: "/tmp/repo/.paseo/worktrees/feature", + }, + }); + }); - session.emit = (message: any) => emitted.push(message); - session.workspaceUpdatesSubscription = { - subscriptionId: "sub-reconcile", - filter: undefined, - isBootstrapping: false, - pendingUpdatesByWorkspaceId: new Map(), - }; - session.listAgentPayloads = async () => []; - session.projectRegistry.get = async (projectId: string) => projects.get(projectId) ?? null; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.projectRegistry.upsert = async ( - record: ReturnType<typeof createPersistedProjectRecord>, - ) => { - projects.set(record.projectId, record); - }; - session.projectRegistry.archive = async (projectId: string, archivedAt: string) => { - const existing = projects.get(projectId); - if (!existing) return; - projects.set(projectId, { ...existing, archivedAt, updatedAt: archivedAt }); - }; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? 
null; - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.workspaceRegistry.upsert = async ( - record: ReturnType<typeof createPersistedWorkspaceRecord>, - ) => { - workspaces.set(record.workspaceId, record); + test("create_agent_request fails for an unknown workspaceId", async () => { + const { session, emitted } = createSessionForWorkspaceTests(); + const createAgent = vi.fn(); + + (session as any).agentManager = { + createAgent, + getAgent: vi.fn(() => null), }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: remoteProjectId, - projectName: "zimakki/inkwell", - checkout: { - cwd, - isGit: true, - currentBranch: cwd === mainWorkspaceId ? "main" : "feature-a", - remoteUrl: "https://github.com/zimakki/inkwell.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: cwd !== mainWorkspaceId, - mainRepoRoot: cwd === mainWorkspaceId ? null : mainWorkspaceId, + (session as any).buildAgentSessionConfig = vi.fn(async (config: any) => ({ + sessionConfig: config, + worktreeConfig: null, + })); + + await (session as any).handleCreateAgentRequest({ + type: "create_agent_request", + requestId: "req-create-agent-fail", + workspaceId: "999", + config: { + provider: "codex", + cwd: "/tmp/repo", + modeId: "default", }, + labels: {}, + }); + + expect(createAgent).not.toHaveBeenCalled(); + const response = emitted.find((message) => message.type === "status") as any; + expect(response?.payload).toMatchObject({ + status: "agent_create_failed", + requestId: "req-create-agent-fail", + }); + expect((response?.payload as any)?.error).toContain("Workspace not found: 999"); + }); + + test("open_project_request creates git projects with GitHub owner/repo and branch names", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + const { tempDir, repoDir } = createTempGitRepo({ + remoteUrl: "git@github.com:acme/repo.git", + branchName: "feature/test-branch", }); try { - await session.handleMessage({ + await (session as 
any).handleOpenProjectRequest({ type: "open_project_request", - cwd: worktreeWorkspaceId, - requestId: "req-open-worktree", + cwd: repoDir, + requestId: "req-open-git", }); - expect(workspaces.get(mainWorkspaceId)?.projectId).toBe(remoteProjectId); - expect(workspaces.get(worktreeWorkspaceId)?.projectId).toBe(remoteProjectId); - expect(projects.get(localProjectId)?.archivedAt).toBeTruthy(); - - const workspaceUpdates = emitted.filter( - (message) => message.type === "workspace_update", - ) as any[]; - expect(workspaceUpdates).toHaveLength(2); - expect(workspaceUpdates.map((message) => message.payload.workspace.id).sort()).toEqual([ - mainWorkspaceId, - worktreeWorkspaceId, + expect(Array.from(projects.values())).toEqual([ + expect.objectContaining({ + directory: repoDir, + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + }), + ]); + expect(Array.from(workspaces.values())).toEqual([ + expect.objectContaining({ + directory: repoDir, + displayName: "feature/test-branch", + kind: "checkout", + }), ]); - expect( - workspaceUpdates.every( - (message) => message.payload.workspace.projectId === remoteProjectId, - ), - ).toBe(true); + + const response = emitted.find((message) => message.type === "open_project_response") as any; + expect(response?.payload).toMatchObject({ + error: null, + workspace: { + projectDisplayName: "acme/repo", + projectKind: "git", + name: "feature/test-branch", + workspaceKind: "local_checkout", + }, + }); } finally { rmSync(tempDir, { recursive: true, force: true }); } }); - test("fetch_workspaces_request reconciles remote URL changes for existing workspaces", async () => { - const session = createSessionForWorkspaceTests() as any; - const projects = new Map<string, ReturnType<typeof createPersistedProjectRecord>>(); - const workspaces = new Map<string, ReturnType<typeof createPersistedWorkspaceRecord>>(); - - const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-workspace-fetch-"))); - const mainWorkspaceId = path.join(tempDir, "inkwell"); - const worktreeWorkspaceId = path.join(mainWorkspaceId, 
".paseo", "worktrees", "feature-a"); - const oldProjectId = "remote:github.com/old-owner/inkwell"; - const newProjectId = "remote:github.com/new-owner/inkwell"; - - execSync(`mkdir -p ${JSON.stringify(worktreeWorkspaceId)}`); - - projects.set( - oldProjectId, - createPersistedProjectRecord({ - projectId: oldProjectId, - rootPath: mainWorkspaceId, - kind: "git", - displayName: "old-owner/inkwell", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ); - - for (const [workspaceId, displayName] of [ - [mainWorkspaceId, "main"], - [worktreeWorkspaceId, "feature-a"], - ] as const) { - workspaces.set( - workspaceId, - createPersistedWorkspaceRecord({ - workspaceId, - projectId: oldProjectId, - cwd: workspaceId, - kind: workspaceId === mainWorkspaceId ? "local_checkout" : "worktree", - displayName, - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ); - } - - session.listAgentPayloads = async () => []; - session.projectRegistry.get = async (projectId: string) => projects.get(projectId) ?? null; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.projectRegistry.upsert = async ( - record: ReturnType<typeof createPersistedProjectRecord>, - ) => { - projects.set(record.projectId, record); - }; - session.projectRegistry.archive = async (projectId: string, archivedAt: string) => { - const existing = projects.get(projectId); - if (!existing) return; - projects.set(projectId, { ...existing, archivedAt, updatedAt: archivedAt }); - }; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? 
null; - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.workspaceRegistry.upsert = async ( - record: ReturnType<typeof createPersistedWorkspaceRecord>, - ) => { - workspaces.set(record.workspaceId, record); - }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: newProjectId, - projectName: "new-owner/inkwell", - checkout: { - cwd, - isGit: true, - currentBranch: cwd === mainWorkspaceId ? "main" : "feature-a", - remoteUrl: "https://github.com/new-owner/inkwell.git", - worktreeRoot: cwd, - isPaseoOwnedWorktree: cwd !== mainWorkspaceId, - mainRepoRoot: cwd === mainWorkspaceId ? null : mainWorkspaceId, - }, - }); + test("open_project_request treats non-git directories as directory projects", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-workspace-dir-"))); + const projectDir = path.join(tempDir, "plain-dir"); + execSync(`mkdir -p ${projectDir}`); + writeFileSync(path.join(projectDir, "README.md"), "hello\n"); try { - const result = await session.listFetchWorkspacesEntries({ - type: "fetch_workspaces_request", - requestId: "req-fetch-reconcile", + await (session as any).handleOpenProjectRequest({ + type: "open_project_request", + cwd: projectDir, + requestId: "req-open-dir", }); - expect(result.entries.map((entry: any) => entry.projectId)).toEqual([ - newProjectId, - newProjectId, + expect(Array.from(projects.values())).toEqual([ + expect.objectContaining({ + directory: projectDir, + kind: "directory", + displayName: "plain-dir", + gitRemote: null, + }), + ]); + expect(Array.from(workspaces.values())).toEqual([ + expect.objectContaining({ + directory: projectDir, + displayName: "plain-dir", + kind: "checkout", + }), ]); - expect(workspaces.get(mainWorkspaceId)?.projectId).toBe(newProjectId); - expect(workspaces.get(worktreeWorkspaceId)?.projectId).toBe(newProjectId); - 
expect(projects.get(oldProjectId)?.archivedAt).toBeTruthy(); + + const response = emitted.find((message) => message.type === "open_project_response") as any; + expect(response?.payload).toMatchObject({ + error: null, + workspace: { + projectDisplayName: "plain-dir", + projectKind: "non_git", + name: "plain-dir", + workspaceKind: "local_checkout", + }, + }); } finally { rmSync(tempDir, { recursive: true, force: true }); } }); +}); - test("reconcile archives stale subdirectory workspace records when collapsing to the repo root", async () => { - const session = createSessionForWorkspaceTests() as any; - const projects = new Map<string, ReturnType<typeof createPersistedProjectRecord>>(); - const workspaces = new Map<string, ReturnType<typeof createPersistedWorkspaceRecord>>(); - - const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "session-workspace-collapse-"))); - const repoRoot = path.join(tempDir, "repo"); - const subdirWorkspaceId = path.join(repoRoot, "packages", "app"); - const projectId = "remote:github.com/acme/repo"; - - execSync(`mkdir -p ${JSON.stringify(subdirWorkspaceId)}`); - - projects.set( - projectId, - createPersistedProjectRecord({ - projectId, - rootPath: repoRoot, - kind: "git", - displayName: "acme/repo", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ); - workspaces.set( - repoRoot, - createPersistedWorkspaceRecord({ - workspaceId: repoRoot, - projectId, - cwd: repoRoot, - kind: "local_checkout", - displayName: "main", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ); - workspaces.set( - subdirWorkspaceId, - createPersistedWorkspaceRecord({ - workspaceId: subdirWorkspaceId, - projectId, - cwd: subdirWorkspaceId, - kind: "directory", - displayName: "app", - createdAt: "2026-03-01T12:00:00.000Z", - updatedAt: "2026-03-01T12:00:00.000Z", - }), - ); +describe("backward compatibility", () => { + test("workspace descriptor uses directory path as id, not numeric database id", async () => { + const { session, emitted, projects, workspaces } = 
createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/myproject", + displayName: "myproject", + kind: "git", + gitRemote: "https://github.com/acme/myproject.git", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/myproject", + displayName: "main", + }); - session.projectRegistry.get = async (nextProjectId: string) => projects.get(nextProjectId) ?? null; - session.projectRegistry.list = async () => Array.from(projects.values()); - session.projectRegistry.upsert = async ( - record: ReturnType<typeof createPersistedProjectRecord>, - ) => { - projects.set(record.projectId, record); - }; - session.workspaceRegistry.get = async (workspaceId: string) => - workspaces.get(workspaceId) ?? null; - session.workspaceRegistry.list = async () => Array.from(workspaces.values()); - session.workspaceRegistry.upsert = async ( - record: ReturnType<typeof createPersistedWorkspaceRecord>, - ) => { - workspaces.set(record.workspaceId, record); - }; - session.workspaceRegistry.archive = async (workspaceId: string, archivedAt: string) => { - const existing = workspaces.get(workspaceId); - if (!existing) return; - workspaces.set(workspaceId, { ...existing, archivedAt, updatedAt: archivedAt }); - }; - session.buildProjectPlacement = async (cwd: string) => ({ - projectKey: projectId, - projectName: "acme/repo", - checkout: { - cwd, - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: repoRoot, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }, + await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-id" }, + requestId: "compat-id", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + const entries = response.payload.entries; + expect(entries).toHaveLength(1); + expect(entries[0].id).toBe("/tmp/myproject"); + expect(entries[0].id).not.toBe("10"); + }); + + test("workspace descriptor maps projectKind 'directory' to 
'non_git'", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/dirproject", + displayName: "dirproject", + kind: "directory", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/dirproject", + displayName: "dirproject", + }); + + await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-dir-kind" }, + requestId: "compat-dir-kind", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + expect(response.payload.entries[0].projectKind).toBe("non_git"); + }); + + test("workspace descriptor maps projectKind 'git' unchanged", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/gitproject", + displayName: "gitproject", + kind: "git", + gitRemote: "https://github.com/acme/gitproject.git", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/gitproject", + displayName: "main", + }); + + await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-git-kind" }, + requestId: "compat-git-kind", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + expect(response.payload.entries[0].projectKind).toBe("git"); + }); + + test("workspace descriptor maps workspaceKind 'checkout' to 'local_checkout'", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/checkout-project", + displayName: "checkout-project", + kind: "git", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/checkout-project", + displayName: "main", + kind: "checkout", + }); + + 
await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-checkout" }, + requestId: "compat-checkout", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + expect(response.payload.entries[0].workspaceKind).toBe("local_checkout"); + }); + + test("workspace descriptor maps workspaceKind 'worktree' unchanged", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/worktree-project", + displayName: "worktree-project", + kind: "git", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/worktree-project/.paseo/worktrees/feature", + displayName: "feature", + kind: "worktree", + }); + + await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-worktree" }, + requestId: "compat-worktree", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + expect(response.payload.entries[0].workspaceKind).toBe("worktree"); + }); + + test("workspace descriptor uses project directory as projectId", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + seedProject({ + projects, + id: 1, + directory: "/tmp/myproject", + displayName: "myproject", + kind: "git", + gitRemote: "https://github.com/acme/myproject.git", + }); + seedWorkspace({ + workspaces, + id: 10, + projectId: 1, + directory: "/tmp/myproject", + displayName: "main", + }); + + await (session as any).handleMessage({ + type: "fetch_workspaces_request", + payload: { requestId: "compat-project-id" }, + requestId: "compat-project-id", + }); + + const response = emitted.find((m) => m.type === "fetch_workspaces_response") as any; + expect(response).toBeTruthy(); + 
expect(response.payload.entries[0].projectId).toBe("/tmp/myproject"); + expect(response.payload.entries[0].projectId).not.toBe("1"); + }); + + test("open_project_response returns backward-compatible descriptor", async () => { + const { session, emitted, projects, workspaces } = createSessionForWorkspaceTests(); + const { tempDir, repoDir } = createTempGitRepo({ + remoteUrl: "https://github.com/acme/compat-repo.git", + branchName: "main", }); try { - const result = await session.reconcileWorkspaceRecord(subdirWorkspaceId); + await (session as any).handleOpenProjectRequest({ + type: "open_project_request", + cwd: repoDir, + requestId: "req-open-compat", + }); + + const response = emitted.find((m) => m.type === "open_project_response") as any; + expect(response).toBeTruthy(); + const workspace = response.payload.workspace; + + // id should be the directory path, not a numeric id + expect(workspace.id).toBe(repoDir); + + // projectId should be the project directory path, not a numeric id + expect(workspace.projectId).toBe(repoDir); - expect(result.changed).toBe(true); - expect(result.workspace.workspaceId).toBe(repoRoot); - expect(result.removedWorkspaceId).toBe(subdirWorkspaceId); - expect(workspaces.get(repoRoot)?.archivedAt).toBeNull(); - expect(workspaces.get(subdirWorkspaceId)?.archivedAt).toBeTruthy(); + // projectKind should map "git" to "git" + expect(workspace.projectKind).toBe("git"); + + // workspaceKind should map "checkout" to "local_checkout" + expect(workspace.workspaceKind).toBe("local_checkout"); + + // projectRootPath and workspaceDirectory should be the actual directory + expect(workspace.projectRootPath).toBe(repoDir); + expect(workspace.workspaceDirectory).toBe(repoDir); } finally { rmSync(tempDir, { recursive: true, force: true }); } }); test("listWorkspaceDescriptorsSnapshot keeps git workspaces on the baseline descriptor path", async () => { - const session = createSessionForWorkspaceTests() as any; + const { session } = 
createSessionForWorkspaceTests(); + const sessionAny = session as any; const project = createPersistedProjectRecord({ - projectId: "/tmp/repo", - rootPath: "/tmp/repo", + id: 70, + directory: "/tmp/repo", kind: "git", displayName: "repo", + gitRemote: "https://github.com/acme/repo.git", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); const workspace = createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: project.projectId, - cwd: "/tmp/repo", - kind: "local_checkout", + id: 71, + projectId: project.id, + directory: "/tmp/repo", + kind: "checkout", displayName: "main", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); - session.listAgentPayloads = async () => []; - session.projectRegistry.list = async () => [project]; - session.workspaceRegistry.list = async () => [workspace]; + sessionAny.listAgentPayloads = async () => []; + sessionAny.projectRegistry.list = async () => [project]; + sessionAny.workspaceRegistry.list = async () => [workspace]; const baselineDescriptor = { - id: workspace.workspaceId, - projectId: project.projectId, + id: workspace.directory, + projectId: project.directory, projectDisplayName: project.displayName, - projectRootPath: project.rootPath, - projectKind: project.kind, - workspaceKind: workspace.kind, + projectRootPath: project.directory, + workspaceDirectory: workspace.directory, + projectKind: "git", + workspaceKind: "local_checkout", name: "main", status: "done", activityAt: null, diffStat: null, + services: [], } as const; const gitDescriptor = { ...baselineDescriptor, diffStat: { additions: 3, deletions: 1 }, } as const; - session.describeWorkspaceRecord = vi.fn(async () => baselineDescriptor); - session.describeWorkspaceRecordWithGitData = vi.fn(async () => gitDescriptor); + sessionAny.describeWorkspaceRecord = vi.fn(async () => baselineDescriptor); + sessionAny.describeWorkspaceRecordWithGitData = vi.fn(async () => gitDescriptor); - const 
descriptors = await session.listWorkspaceDescriptorsSnapshot(); + const descriptors = await sessionAny.listWorkspaceDescriptorsSnapshot(); - expect(session.describeWorkspaceRecord).toHaveBeenCalledWith(workspace, project); - expect(session.describeWorkspaceRecordWithGitData).not.toHaveBeenCalled(); + expect(sessionAny.describeWorkspaceRecord).toHaveBeenCalledWith(workspace, project); + expect(sessionAny.describeWorkspaceRecordWithGitData).not.toHaveBeenCalled(); expect(descriptors).toEqual([baselineDescriptor]); }); test("subscribed fetch_workspaces emits git enrichment updates after the baseline snapshot", async () => { const emitted: Array<{ type: string; payload: any }> = []; - const session = createSessionForWorkspaceTests() as any; + const { session } = createSessionForWorkspaceTests(); + const sessionAny = session as any; const gitProject = createPersistedProjectRecord({ - projectId: "/tmp/repo", - rootPath: "/tmp/repo", + id: 80, + directory: "/tmp/repo", kind: "git", displayName: "repo", + gitRemote: "https://github.com/acme/repo.git", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); const directoryProject = createPersistedProjectRecord({ - projectId: "/tmp/docs", - rootPath: "/tmp/docs", - kind: "non_git", + id: 81, + directory: "/tmp/docs", + kind: "directory", displayName: "docs", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); const gitWorkspace = createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: gitProject.projectId, - cwd: "/tmp/repo", - kind: "local_checkout", + id: 82, + projectId: gitProject.id, + directory: "/tmp/repo", + kind: "checkout", displayName: "main", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); const directoryWorkspace = createPersistedWorkspaceRecord({ - workspaceId: "/tmp/docs", - projectId: directoryProject.projectId, - cwd: "/tmp/docs", - kind: "directory", + id: 83, + projectId: directoryProject.id, + 
directory: "/tmp/docs", + kind: "checkout", displayName: "docs", createdAt: "2026-03-01T12:00:00.000Z", updatedAt: "2026-03-01T12:00:00.000Z", }); const baselineGitDescriptor = { - id: gitWorkspace.workspaceId, - projectId: gitProject.projectId, + id: gitWorkspace.directory, + projectId: gitProject.directory, projectDisplayName: gitProject.displayName, - projectRootPath: gitProject.rootPath, - projectKind: gitProject.kind, - workspaceKind: gitWorkspace.kind, + projectRootPath: gitProject.directory, + workspaceDirectory: gitWorkspace.directory, + projectKind: "git", + workspaceKind: "local_checkout", name: "main", status: "done", activityAt: null, diffStat: null, + services: [], } as const; const enrichedGitDescriptor = { ...baselineGitDescriptor, diffStat: { additions: 3, deletions: 1 }, } as const; const directoryDescriptor = { - id: directoryWorkspace.workspaceId, - projectId: directoryProject.projectId, + id: directoryWorkspace.directory, + projectId: directoryProject.directory, projectDisplayName: directoryProject.displayName, - projectRootPath: directoryProject.rootPath, - projectKind: directoryProject.kind, - workspaceKind: directoryWorkspace.kind, + projectRootPath: directoryProject.directory, + workspaceDirectory: directoryWorkspace.directory, + projectKind: "non_git", + workspaceKind: "local_checkout", name: "docs", status: "done", activityAt: null, diffStat: null, + services: [], } as const; - session.emit = (message: any) => emitted.push(message); - session.listAgentPayloads = async () => []; - session.projectRegistry.list = async () => [gitProject, directoryProject]; - session.workspaceRegistry.list = async () => [gitWorkspace, directoryWorkspace]; - session.reconcileAndEmitWorkspaceUpdates = vi.fn(async () => {}); - session.describeWorkspaceRecord = vi.fn( + sessionAny.emit = (message: any) => emitted.push(message); + sessionAny.listAgentPayloads = async () => []; + sessionAny.projectRegistry.list = async () => [gitProject, directoryProject]; + 
sessionAny.workspaceRegistry.list = async () => [gitWorkspace, directoryWorkspace]; + sessionAny.reconcileAndEmitWorkspaceUpdates = vi.fn(async () => {}); + sessionAny.describeWorkspaceRecord = vi.fn( async (workspace: typeof gitWorkspace | typeof directoryWorkspace, project: any) => { - if (workspace.workspaceId === gitWorkspace.workspaceId) { + if (workspace.id === gitWorkspace.id) { expect(project).toEqual(gitProject); return baselineGitDescriptor; } @@ -1759,9 +1895,9 @@ describe("workspace aggregation", () => { return directoryDescriptor; }, ); - session.describeWorkspaceRecordWithGitData = vi.fn(async () => enrichedGitDescriptor); + sessionAny.describeWorkspaceRecordWithGitData = vi.fn(async () => enrichedGitDescriptor); - await session.handleMessage({ + await sessionAny.handleMessage({ type: "fetch_workspaces_request", requestId: "req-fetch-workspaces", subscribe: {}, @@ -1793,7 +1929,7 @@ describe("workspace aggregation", () => { }, }, ]); - expect(session.describeWorkspaceRecordWithGitData).toHaveBeenCalledWith( + expect(sessionAny.describeWorkspaceRecordWithGitData).toHaveBeenCalledWith( gitWorkspace, gitProject, ); diff --git a/packages/server/src/server/snapshot-mutation-ownership.test.ts b/packages/server/src/server/snapshot-mutation-ownership.test.ts new file mode 100644 index 000000000..5b6088a3c --- /dev/null +++ b/packages/server/src/server/snapshot-mutation-ownership.test.ts @@ -0,0 +1,193 @@ +import os from "node:os"; +import path from "node:path"; +import { mkdtempSync, rmSync } from "node:fs"; +import { afterEach, describe, expect, test, vi } from "vitest"; + +import { Session } from "./session.js"; +import { createTestPaseoDaemon } from "./test-utils/paseo-daemon.js"; +import { projects, workspaces } from "./db/schema.js"; + +describe("snapshot mutation ownership boundary", () => { + afterEach(() => { + vi.restoreAllMocks(); + }); + + test("daemon live mutations write one durable snapshot through the manager-owned path", async () => { + const 
daemonHandle = await createTestPaseoDaemon(); + const cwd = mkdtempSync(path.join(os.tmpdir(), "snapshot-owner-live-")); + + try { + const db = (daemonHandle.daemon.agentStorage as any).db; + const [projectRow] = await db + .insert(projects) + .values({ + directory: cwd, + displayName: "test-project", + kind: "directory", + createdAt: new Date().toISOString(), + updatedAt: new Date().toISOString(), + }) + .returning({ id: projects.id }); + const [workspaceRow] = await db + .insert(workspaces) + .values({ + projectId: projectRow!.id, + directory: cwd, + displayName: "test-workspace", + kind: "checkout", + createdAt: new Date().toISOString(), + updatedAt: new Date().toISOString(), + }) + .returning({ id: workspaces.id }); + + const snapshot = await daemonHandle.daemon.agentManager.createAgent( + { + provider: "codex", + cwd, + model: "gpt-5.2-codex", + }, + undefined, + { workspaceId: workspaceRow!.id }, + ); + await daemonHandle.daemon.agentManager.flush(); + + const applySnapshotSpy = vi.spyOn(daemonHandle.daemon.agentStorage, "applySnapshot"); + + await daemonHandle.daemon.agentManager.setAgentModel(snapshot.id, "gpt-5.4"); + await daemonHandle.daemon.agentManager.flush(); + + expect(applySnapshotSpy).toHaveBeenCalledTimes(1); + + const persisted = await daemonHandle.daemon.agentStorage.get(snapshot.id); + expect(persisted?.config?.model).toBe("gpt-5.4"); + } finally { + rmSync(cwd, { recursive: true, force: true }); + await daemonHandle.close(); + } + }); + + test("session runtime flows delegate snapshot mutations to agent manager without direct storage writes", async () => { + const onMessage = vi.fn(); + const storedRecord = { + id: "agent-1", + provider: "codex", + cwd: "/tmp/project", + createdAt: "2026-03-24T00:00:00.000Z", + updatedAt: "2026-03-24T00:00:00.000Z", + title: null, + labels: {}, + lastStatus: "idle" as const, + config: null, + persistence: null, + archivedAt: null as string | null, + requiresAttention: false, + attentionReason: null, + 
attentionTimestamp: null, + }; + const archiveSnapshot = vi.fn(async (_agentId: string, archivedAt: string) => { + storedRecord.archivedAt = archivedAt; + storedRecord.updatedAt = archivedAt; + return { + ...storedRecord, + archivedAt, + updatedAt: archivedAt, + }; + }); + const unarchiveSnapshot = vi.fn(async () => true); + const unarchiveSnapshotByHandle = vi.fn(async () => undefined); + const updateAgentMetadata = vi.fn(async () => undefined); + const directStorageWrite = vi.fn(async () => { + throw new Error("Session should not write snapshots directly"); + }); + + const logger = { + child: () => logger, + trace: vi.fn(), + debug: vi.fn(), + info: vi.fn(), + warn: vi.fn(), + error: vi.fn(), + }; + + const session = new Session({ + clientId: "test-client", + onMessage, + logger: logger as any, + downloadTokenStore: {} as any, + pushTokenStore: {} as any, + paseoHome: "/tmp/paseo-test", + agentManager: { + subscribe: () => () => {}, + listAgents: () => [], + getAgent: () => null, + archiveSnapshot, + unarchiveSnapshot, + unarchiveSnapshotByHandle, + updateAgentMetadata, + } as any, + agentStorage: { + list: async () => [], + get: async () => storedRecord, + applySnapshot: directStorageWrite, + upsert: directStorageWrite, + } as any, + projectRegistry: { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => [], + get: async () => null, + upsert: async () => {}, + archive: async () => {}, + remove: async () => {}, + } as any, + workspaceRegistry: { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => [], + get: async () => null, + upsert: async () => {}, + archive: async () => {}, + remove: async () => {}, + } as any, + createAgentMcpTransport: async () => { + throw new Error("not used"); + }, + stt: null, + tts: null, + terminalManager: null, + }) as any; + + const archiveResult = await session.archiveAgentForClose("agent-1"); + expect(archiveSnapshot).toHaveBeenCalledTimes(1); + 
expect(archiveResult.archivedAt).toBeTruthy(); + + await session.unarchiveAgentState("agent-1"); + expect(unarchiveSnapshot).toHaveBeenCalledWith("agent-1"); + + const handle = { provider: "codex", sessionId: "session-1" }; + await session.unarchiveAgentByHandle(handle); + expect(unarchiveSnapshotByHandle).toHaveBeenCalledWith(handle); + + await session.handleUpdateAgentRequest( + "agent-1", + "Renamed agent", + { lane: "phase-1a" }, + "req-1", + ); + expect(updateAgentMetadata).toHaveBeenCalledWith("agent-1", { + title: "Renamed agent", + labels: { lane: "phase-1a" }, + }); + expect(onMessage).toHaveBeenCalledWith({ + type: "update_agent_response", + payload: { + requestId: "req-1", + agentId: "agent-1", + accepted: true, + error: null, + }, + }); + + expect(directStorageWrite).not.toHaveBeenCalled(); + }); +}); diff --git a/packages/server/src/server/test-utils/fake-agent-client.ts b/packages/server/src/server/test-utils/fake-agent-client.ts index a2b53368e..97a995973 100644 --- a/packages/server/src/server/test-utils/fake-agent-client.ts +++ b/packages/server/src/server/test-utils/fake-agent-client.ts @@ -502,13 +502,16 @@ class FakeAgentSession implements AgentSession { await this.appendHistoryEvent(assistantChunkA); this.notifySubscribers(assistantChunkA); - const assistantChunkB: AgentStreamEvent = { - type: "timeline", - provider: this.providerName, - item: { type: "assistant_message", text: assistantText.slice(6) }, - }; - await this.appendHistoryEvent(assistantChunkB); - this.notifySubscribers(assistantChunkB); + const assistantChunkBText = assistantText.slice(6); + if (assistantChunkBText.length > 0) { + const assistantChunkB: AgentStreamEvent = { + type: "timeline", + provider: this.providerName, + item: { type: "assistant_message", text: assistantChunkBText }, + }; + await this.appendHistoryEvent(assistantChunkB); + this.notifySubscribers(assistantChunkB); + } const completed: AgentStreamEvent = { type: "turn_completed", @@ -929,6 +932,9 @@ export 
function createTestAgentClients(): Record<string, FakeAgentClient> { return { claude: new FakeAgentClient("claude"), codex: new FakeAgentClient("codex"), + gemini: new FakeAgentClient("gemini"), + amp: new FakeAgentClient("amp"), + aider: new FakeAgentClient("aider"), opencode: new FakeAgentClient("opencode"), }; } diff --git a/packages/server/src/server/websocket-server.notifications.test.ts b/packages/server/src/server/websocket-server.notifications.test.ts index 5fe0502ee..1c233d5d3 100644 --- a/packages/server/src/server/websocket-server.notifications.test.ts +++ b/packages/server/src/server/websocket-server.notifications.test.ts @@ -63,6 +63,7 @@ function createServer(agentManagerOverrides?: Record<string, unknown>) { const agentManager = { setAgentAttentionCallback: vi.fn(), getAgent: vi.fn(() => null), + getLastAssistantMessage: vi.fn(async () => null), ...agentManagerOverrides, }; @@ -109,22 +110,20 @@ describe("VoiceAssistantWebSocketServer notification payloads", () => { vi.clearAllMocks(); }); - it("uses assistant preview text for push notifications with markdown removed", () => { + it("uses assistant preview text for push notifications with markdown removed", async () => { + const getLastAssistantMessage = vi.fn( + async () => "**Done**. Updated `README.md` and [link](https://example.com).", + ); const { server } = createServer({ getAgent: vi.fn(() => ({ config: { title: null }, cwd: "/tmp/worktree", - timeline: [ - { - type: "assistant_message", - text: "**Done**. 
Updated `README.md` and [link](https://example.com).", - }, - ], pendingPermissions: new Map(), })), + getLastAssistantMessage, }); - (server as any).broadcastAgentAttention({ + await (server as any).broadcastAgentAttention({ agentId: "agent-1", provider: "claude", reason: "finished", @@ -139,30 +138,28 @@ describe("VoiceAssistantWebSocketServer notification payloads", () => { reason: "finished", }, }); + expect(getLastAssistantMessage).toHaveBeenCalledWith("agent-1"); }); - it("sends push notifications regardless of UI label presence", () => { + it("sends push notifications regardless of UI label presence", async () => { + const getLastAssistantMessage = vi.fn(async () => "Done."); const { server } = createServer({ getAgent: vi.fn(() => ({ config: { title: null }, cwd: "/tmp/worktree", labels: {}, - timeline: [ - { - type: "assistant_message", - text: "Done.", - }, - ], pendingPermissions: new Map(), })), + getLastAssistantMessage, }); - (server as any).broadcastAgentAttention({ + await (server as any).broadcastAgentAttention({ agentId: "agent-2", provider: "claude", reason: "finished", }); expect(pushMocks.sendPush).toHaveBeenCalledTimes(1); + expect(getLastAssistantMessage).toHaveBeenCalledWith("agent-2"); }); }); diff --git a/packages/server/src/server/websocket-server.ts b/packages/server/src/server/websocket-server.ts index 7c719b454..0f34fede5 100644 --- a/packages/server/src/server/websocket-server.ts +++ b/packages/server/src/server/websocket-server.ts @@ -4,7 +4,7 @@ import type { Transport } from "@modelcontextprotocol/sdk/shared/transport.js"; import { join } from "path"; import { hostname as getHostname } from "node:os"; import type { AgentManager } from "./agent/agent-manager.js"; -import type { AgentStorage } from "./agent/agent-storage.js"; +import type { AgentSnapshotStore } from "./agent/agent-snapshot-store.js"; import type { DownloadTokenStore } from "./file-download/token-store.js"; import type { TerminalManager } from 
"../terminal/terminal-manager.js"; import type pino from "pino"; @@ -36,6 +36,7 @@ import { ProviderSnapshotManager } from "./agent/provider-snapshot-manager.js"; import { buildProviderRegistry } from "./agent/provider-registry.js"; import { PushTokenStore } from "./push/token-store.js"; import { PushService } from "./push/push-service.js"; +import type { ServiceRouteStore } from "./service-proxy.js"; import type { SpeechReadinessSnapshot, SpeechService } from "./speech/speech-runtime.js"; import type { VoiceCallerContext, VoiceMcpStdioConfig, VoiceSpeakHandler } from "./voice-types.js"; import { @@ -45,7 +46,6 @@ import { } from "./agent-attention-policy.js"; import { buildAgentAttentionNotificationPayload, - findLatestAssistantMessageFromTimeline, findLatestPermissionRequest, } from "../shared/agent-attention-notification.js"; @@ -73,6 +73,7 @@ function createNoopProjectRegistry(): ProjectRegistry { existsOnDisk: async () => true, list: async () => [], get: async () => null, + insert: async () => 0, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -85,6 +86,7 @@ function createNoopWorkspaceRegistry(): WorkspaceRegistry { existsOnDisk: async () => true, list: async () => [], get: async () => null, + insert: async () => 0, upsert: async () => {}, archive: async () => {}, remove: async () => {}, @@ -231,7 +233,7 @@ export class VoiceAssistantWebSocketServer { private readonly serverId: string; private readonly daemonVersion: string; private readonly agentManager: AgentManager; - private readonly agentStorage: AgentStorage; + private readonly agentStorage: AgentSnapshotStore; private readonly projectRegistry: ProjectRegistry; private readonly workspaceRegistry: WorkspaceRegistry; private readonly chatService: FileBackedChatService; @@ -246,6 +248,11 @@ export class VoiceAssistantWebSocketServer { private readonly createAgentMcpTransport: AgentMcpTransportFactory; private readonly speech: SpeechService | null; private readonly 
terminalManager: TerminalManager | null; + private readonly serviceRouteStore: ServiceRouteStore | null; + private readonly getDaemonTcpPort: (() => number | null) | null; + private readonly resolveServiceHealth: + | ((hostname: string) => "healthy" | "unhealthy" | null) + | null; private readonly dictation: { finalTimeoutMs?: number; } | null; @@ -259,6 +266,9 @@ export class VoiceAssistantWebSocketServer { private readonly agentProviderRuntimeSettings: AgentProviderRuntimeSettingsMap | undefined; private readonly providerSnapshotManager: ProviderSnapshotManager; private readonly onLifecycleIntent: ((intent: SessionLifecycleIntent) => void) | null; + private readonly onBranchChanged: + | ((workspaceId: string, oldBranch: string | null, newBranch: string | null) => void) + | null; private serverCapabilities: ServerCapabilities | undefined; private runtimeWindowStartedAt = Date.now(); private readonly runtimeCounters: WebSocketRuntimeCounters = { @@ -289,7 +299,7 @@ export class VoiceAssistantWebSocketServer { logger: pino.Logger, serverId: string, agentManager: AgentManager, - agentStorage: AgentStorage, + agentStorage: AgentSnapshotStore, downloadTokenStore: DownloadTokenStore, paseoHome: string, createAgentMcpTransport: AgentMcpTransportFactory, @@ -313,6 +323,14 @@ export class VoiceAssistantWebSocketServer { loopService?: LoopService, scheduleService?: ScheduleService, checkoutDiffManager?: CheckoutDiffManager, + serviceRouteStore?: ServiceRouteStore | null, + onBranchChanged?: ( + workspaceId: string, + oldBranch: string | null, + newBranch: string | null, + ) => void, + getDaemonTcpPort?: () => number | null, + resolveServiceHealth?: (hostname: string) => "healthy" | "unhealthy" | null, ) { this.logger = logger.child({ module: "websocket-server" }); this.serverId = serverId; @@ -359,6 +377,10 @@ export class VoiceAssistantWebSocketServer { providerSnapshotLogger, ); this.onLifecycleIntent = onLifecycleIntent ?? 
null; + this.serviceRouteStore = serviceRouteStore ?? null; + this.onBranchChanged = onBranchChanged ?? null; + this.getDaemonTcpPort = getDaemonTcpPort ?? null; + this.resolveServiceHealth = resolveServiceHealth ?? null; this.serverCapabilities = buildServerCapabilities({ readiness: this.speech?.getReadiness() ?? null, }); @@ -371,7 +393,9 @@ export class VoiceAssistantWebSocketServer { this.pushService = new PushService(pushLogger, this.pushTokenStore); this.agentManager.setAgentAttentionCallback((params) => { - this.broadcastAgentAttention(params); + void this.broadcastAgentAttention(params).catch((err) => { + this.logger.warn({ err, agentId: params.agentId }, "Failed to broadcast agent attention"); + }); }); const { allowedOrigins, allowedHosts } = wsConfig; @@ -429,6 +453,16 @@ export class VoiceAssistantWebSocketServer { } } + public listActiveSessions(): Session[] { + return Array.from( + new Set( + [...this.sessions.values(), ...this.externalSessionsByKey.values()].map( + (connection) => connection.session, + ), + ), + ); + } + public publishSpeechReadiness(readiness: SpeechReadinessSnapshot | null): void { this.updateServerCapabilities(buildServerCapabilities({ readiness })); } @@ -665,6 +699,10 @@ export class VoiceAssistantWebSocketServer { tts: () => this.speech?.resolveTts() ?? null, terminalManager: this.terminalManager, providerSnapshotManager: this.providerSnapshotManager, + serviceRouteStore: this.serviceRouteStore ?? undefined, + onBranchChanged: this.onBranchChanged ?? undefined, + getDaemonTcpPort: this.getDaemonTcpPort ?? undefined, + resolveServiceHealth: this.resolveServiceHealth ?? undefined, voice: { ...(this.voice ?? {}), turnDetection: () => this.speech?.resolveTurnDetection() ?? 
null, @@ -1352,11 +1390,11 @@ export class VoiceAssistantWebSocketServer { }; } - private broadcastAgentAttention(params: { + private async broadcastAgentAttention(params: { agentId: string; provider: AgentProvider; reason: "finished" | "error" | "permission"; - }): void { + }): Promise<void> { const clientEntries: Array<{ ws: WebSocketLike; state: ClientAttentionState; @@ -1371,11 +1409,12 @@ export class VoiceAssistantWebSocketServer { const allStates = clientEntries.map((e) => e.state); const agent = this.agentManager.getAgent(params.agentId); + const assistantMessage = await this.agentManager.getLastAssistantMessage(params.agentId); const notification = buildAgentAttentionNotificationPayload({ reason: params.reason, serverId: this.serverId, agentId: params.agentId, - assistantMessage: agent ? findLatestAssistantMessageFromTimeline(agent.timeline) : null, + assistantMessage, permissionRequest: agent ? findLatestPermissionRequest(agent.pendingPermissions) : null, }); diff --git a/packages/server/src/server/workspace-git-metadata.ts b/packages/server/src/server/workspace-git-metadata.ts new file mode 100644 index 000000000..1af8fd192 --- /dev/null +++ b/packages/server/src/server/workspace-git-metadata.ts @@ -0,0 +1,87 @@ +import { execSync } from "child_process"; +import { READ_ONLY_GIT_ENV } from "./checkout-git-utils.js"; + +export type WorkspaceGitMetadata = { + projectKind: "git" | "directory"; + projectDisplayName: string; + workspaceDisplayName: string; + gitRemote: string | null; + isWorktree: boolean; +}; + +export function readGitCommand(cwd: string, command: string): string | null { + try { + const output = execSync(command, { + cwd, + env: READ_ONLY_GIT_ENV, + encoding: "utf8", + stdio: ["ignore", "pipe", "ignore"], + }); + const trimmed = output.trim(); + return trimmed.length > 0 ? 
trimmed : null; + } catch { + return null; + } +} + +export function parseGitHubRepoFromRemote(remoteUrl: string): string | null { + let cleaned = remoteUrl.trim(); + if (!cleaned) { + return null; + } + + if (cleaned.startsWith("git@github.com:")) { + cleaned = cleaned.slice("git@github.com:".length); + } else if (cleaned.startsWith("https://github.com/")) { + cleaned = cleaned.slice("https://github.com/".length); + } else if (cleaned.startsWith("http://github.com/")) { + cleaned = cleaned.slice("http://github.com/".length); + } else { + const marker = "github.com/"; + const markerIndex = cleaned.indexOf(marker); + if (markerIndex === -1) { + return null; + } + cleaned = cleaned.slice(markerIndex + marker.length); + } + + if (cleaned.endsWith(".git")) { + cleaned = cleaned.slice(0, -".git".length); + } + + if (!cleaned.includes("/")) { + return null; + } + + return cleaned; +} + +export function detectWorkspaceGitMetadata( + cwd: string, + directoryName: string, +): WorkspaceGitMetadata { + const gitDir = readGitCommand(cwd, "git rev-parse --git-dir"); + if (!gitDir) { + return { + projectKind: "directory", + projectDisplayName: directoryName, + workspaceDisplayName: directoryName, + gitRemote: null, + isWorktree: false, + }; + } + + const gitRemote = readGitCommand(cwd, "git config --get remote.origin.url"); + const githubRepo = gitRemote ? parseGitHubRepoFromRemote(gitRemote) : null; + const branchName = readGitCommand(cwd, "git symbolic-ref --short HEAD"); + const gitCommonDir = readGitCommand(cwd, "git rev-parse --git-common-dir"); + const isWorktree = gitCommonDir !== null && gitDir !== gitCommonDir; + + return { + projectKind: "git", + projectDisplayName: githubRepo ?? directoryName, + workspaceDisplayName: branchName ?? 
directoryName, + gitRemote, + isWorktree, + }; +} diff --git a/packages/server/src/server/workspace-reconciliation-service.test.ts b/packages/server/src/server/workspace-reconciliation-service.test.ts new file mode 100644 index 000000000..8dabd051a --- /dev/null +++ b/packages/server/src/server/workspace-reconciliation-service.test.ts @@ -0,0 +1,417 @@ +import { execSync } from "node:child_process"; +import { mkdtempSync, realpathSync, rmSync, writeFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import path from "node:path"; +import { describe, expect, test, vi, afterEach } from "vitest"; +import { + createPersistedProjectRecord, + createPersistedWorkspaceRecord, +} from "./workspace-registry.js"; +import type { PersistedProjectRecord, PersistedWorkspaceRecord } from "./workspace-registry.js"; +import { WorkspaceReconciliationService } from "./workspace-reconciliation-service.js"; + +function createTestRegistries() { + const projects = new Map<number, PersistedProjectRecord>(); + const workspaces = new Map<number, PersistedWorkspaceRecord>(); + let nextProjectId = 1; + let nextWorkspaceId = 1; + + const projectRegistry = { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => Array.from(projects.values()), + get: async (id: number) => projects.get(id) ?? null, + insert: async (record: Omit<PersistedProjectRecord, "id">) => { + const id = nextProjectId++; + projects.set(id, createPersistedProjectRecord({ id, ...record })); + return id; + }, + upsert: async (record: PersistedProjectRecord) => { + projects.set(record.id, record); + }, + archive: async (id: number, archivedAt: string) => { + const existing = projects.get(id); + if (existing) { + projects.set(id, { ...existing, archivedAt, updatedAt: archivedAt }); + } + }, + remove: async (id: number) => { + projects.delete(id); + }, + }; + + const workspaceRegistry = { + initialize: async () => {}, + existsOnDisk: async () => true, + list: async () => Array.from(workspaces.values()), + get: async (id: number) => workspaces.get(id) ?? 
null, + insert: async (record: Omit<PersistedWorkspaceRecord, "id">) => { + const id = nextWorkspaceId++; + workspaces.set(id, createPersistedWorkspaceRecord({ id, ...record })); + return id; + }, + upsert: async (record: PersistedWorkspaceRecord) => { + workspaces.set(record.id, record); + }, + archive: async (id: number, archivedAt: string) => { + const existing = workspaces.get(id); + if (existing) { + workspaces.set(id, { ...existing, archivedAt, updatedAt: archivedAt }); + } + }, + remove: async (id: number) => { + workspaces.delete(id); + }, + }; + + return { projects, workspaces, projectRegistry, workspaceRegistry }; +} + +function createTestLogger() { + const logger = { + child: () => logger, + trace: vi.fn(), + debug: vi.fn(), + info: vi.fn(), + warn: vi.fn(), + error: vi.fn(), + }; + return logger as any; +} + +function createTempGitRepo(prefix: string): string { + const raw = mkdtempSync(path.join(tmpdir(), prefix)); + const dir = realpathSync(raw); + execSync("git init -b main", { cwd: dir, stdio: "ignore" }); + execSync('git config user.email "test@test.com"', { cwd: dir, stdio: "ignore" }); + execSync('git config user.name "Test"', { cwd: dir, stdio: "ignore" }); + execSync("git config commit.gpgsign false", { cwd: dir, stdio: "ignore" }); + writeFileSync(path.join(dir, "README.md"), "# Test\n"); + execSync("git add .", { cwd: dir, stdio: "ignore" }); + execSync('git commit -m "init"', { cwd: dir, stdio: "ignore" }); + return dir; +} + +const timestamp = "2025-01-01T00:00:00.000Z"; + +describe("WorkspaceReconciliationService", () => { + const tempDirs: string[] = []; + + afterEach(() => { + for (const dir of tempDirs) { + rmSync(dir, { recursive: true, force: true }); + } + tempDirs.length = 0; + }); + + test("archives workspaces whose directories no longer exist", async () => { + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: 
"/tmp/does-not-exist-reconcile-test", + kind: "directory", + displayName: "ghost", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: "/tmp/does-not-exist-reconcile-test", + kind: "checkout", + displayName: "ghost", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + expect(result.changesApplied.length).toBeGreaterThanOrEqual(1); + const wsChange = result.changesApplied.find((c) => c.kind === "workspace_archived"); + expect(wsChange).toBeDefined(); + expect(workspaces.get(1)!.archivedAt).toBeTruthy(); + }); + + test("archives orphaned projects after all workspaces are archived", async () => { + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: "/tmp/does-not-exist-reconcile-orphan", + kind: "directory", + displayName: "orphan", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: "/tmp/does-not-exist-reconcile-orphan", + kind: "checkout", + displayName: "orphan", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + const projChange = result.changesApplied.find((c) => c.kind === "project_archived"); + expect(projChange).toBeDefined(); + expect(projects.get(1)!.archivedAt).toBeTruthy(); + }); + + test("updates project kind when a directory becomes a git repo", async () => { + const dir = mkdtempSync(path.join(tmpdir(), "reconcile-git-init-")); + const resolved = 
realpathSync(dir); + tempDirs.push(resolved); + writeFileSync(path.join(resolved, "README.md"), "# Test\n"); + + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: resolved, + kind: "directory", + displayName: path.basename(resolved), + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: resolved, + kind: "checkout", + displayName: path.basename(resolved), + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + // Initialize as git repo + execSync("git init -b main", { cwd: resolved, stdio: "ignore" }); + execSync('git config user.email "test@test.com"', { cwd: resolved, stdio: "ignore" }); + execSync('git config user.name "Test"', { cwd: resolved, stdio: "ignore" }); + execSync("git config commit.gpgsign false", { cwd: resolved, stdio: "ignore" }); + execSync("git add .", { cwd: resolved, stdio: "ignore" }); + execSync('git commit -m "init"', { cwd: resolved, stdio: "ignore" }); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + const projUpdate = result.changesApplied.find((c) => c.kind === "project_updated"); + expect(projUpdate).toBeDefined(); + expect(projects.get(1)!.kind).toBe("git"); + }); + + test("updates project display name when git remote changes", async () => { + const dir = createTempGitRepo("reconcile-remote-"); + tempDirs.push(dir); + + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: dir, + kind: "git", + displayName: "old-owner/old-repo", + gitRemote: "git@github.com:old-owner/old-repo.git", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + 
createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: dir, + kind: "checkout", + displayName: "main", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + // Change the remote + execSync("git remote add origin git@github.com:new-owner/new-repo.git", { + cwd: dir, + stdio: "ignore", + }); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + const projUpdate = result.changesApplied.find((c) => c.kind === "project_updated"); + expect(projUpdate).toBeDefined(); + expect(projects.get(1)!.displayName).toBe("new-owner/new-repo"); + expect(projects.get(1)!.gitRemote).toBe("git@github.com:new-owner/new-repo.git"); + }); + + test("updates workspace display name when branch changes", async () => { + const dir = createTempGitRepo("reconcile-branch-"); + tempDirs.push(dir); + + execSync("git checkout -b feature-branch", { cwd: dir, stdio: "ignore" }); + + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: dir, + kind: "git", + displayName: path.basename(dir), + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: dir, + kind: "checkout", + displayName: "main", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + const wsUpdate = result.changesApplied.find((c) => c.kind === "workspace_updated"); + expect(wsUpdate).toBeDefined(); + expect(workspaces.get(1)!.displayName).toBe("feature-branch"); + }); + + test("does not modify already-archived records", async () => { + const { projects, workspaces, projectRegistry, 
workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: "/tmp/does-not-exist-archived", + kind: "directory", + displayName: "archived", + createdAt: timestamp, + updatedAt: timestamp, + archivedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: "/tmp/does-not-exist-archived", + kind: "checkout", + displayName: "archived", + createdAt: timestamp, + updatedAt: timestamp, + archivedAt: timestamp, + }), + ); + + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + }); + + const result = await service.runOnce(); + + expect(result.changesApplied).toHaveLength(0); + }); + + test("calls onChanges callback when changes are applied", async () => { + const { projects, workspaces, projectRegistry, workspaceRegistry } = createTestRegistries(); + + projects.set( + 1, + createPersistedProjectRecord({ + id: 1, + directory: "/tmp/does-not-exist-callback-test", + kind: "directory", + displayName: "ghost", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + workspaces.set( + 1, + createPersistedWorkspaceRecord({ + id: 1, + projectId: 1, + directory: "/tmp/does-not-exist-callback-test", + kind: "checkout", + displayName: "ghost", + createdAt: timestamp, + updatedAt: timestamp, + }), + ); + + const onChanges = vi.fn(); + const service = new WorkspaceReconciliationService({ + projectRegistry, + workspaceRegistry, + logger: createTestLogger(), + onChanges, + }); + + await service.runOnce(); + + expect(onChanges).toHaveBeenCalledTimes(1); + expect(onChanges.mock.calls[0][0].length).toBeGreaterThan(0); + }); +}); diff --git a/packages/server/src/server/workspace-reconciliation-service.ts b/packages/server/src/server/workspace-reconciliation-service.ts new file mode 100644 index 000000000..d908ae81c --- /dev/null +++ 
b/packages/server/src/server/workspace-reconciliation-service.ts @@ -0,0 +1,238 @@ +import { existsSync } from "node:fs"; +import type pino from "pino"; +import type { + ProjectRegistry, + WorkspaceRegistry, + PersistedProjectRecord, + PersistedWorkspaceRecord, +} from "./workspace-registry.js"; +import { detectWorkspaceGitMetadata } from "./workspace-git-metadata.js"; + +const DEFAULT_RECONCILE_INTERVAL_MS = 60_000; + +export type ReconciliationChange = + | { kind: "workspace_archived"; workspaceId: number; directory: string; reason: string } + | { kind: "project_archived"; projectId: number; directory: string; reason: string } + | { + kind: "project_updated"; + projectId: number; + directory: string; + fields: Partial<Pick<PersistedProjectRecord, "kind" | "displayName" | "gitRemote">>; + } + | { + kind: "workspace_updated"; + workspaceId: number; + directory: string; + fields: Partial<Pick<PersistedWorkspaceRecord, "displayName">>; + }; + +export type ReconciliationResult = { + changesApplied: ReconciliationChange[]; + durationMs: number; +}; + +export type WorkspaceReconciliationServiceOptions = { + projectRegistry: ProjectRegistry; + workspaceRegistry: WorkspaceRegistry; + logger: pino.Logger; + intervalMs?: number; + onChanges?: (changes: ReconciliationChange[]) => void; +}; + +export class WorkspaceReconciliationService { + private readonly projectRegistry: ProjectRegistry; + private readonly workspaceRegistry: WorkspaceRegistry; + private readonly logger: pino.Logger; + private readonly intervalMs: number; + private readonly onChanges: ((changes: ReconciliationChange[]) => void) | null; + private timer: ReturnType<typeof setInterval> | null = null; + private running = false; + + constructor(options: WorkspaceReconciliationServiceOptions) { + this.projectRegistry = options.projectRegistry; + this.workspaceRegistry = options.workspaceRegistry; + this.logger = options.logger.child({ module: "workspace-reconciliation" }); + this.intervalMs = options.intervalMs ?? DEFAULT_RECONCILE_INTERVAL_MS; + this.onChanges = options.onChanges ?? 
null; + } + + start(): void { + if (this.timer) return; + this.logger.info({ intervalMs: this.intervalMs }, "Starting workspace reconciliation service"); + this.timer = setInterval(() => void this.runSafe(), this.intervalMs); + // Run once immediately on start + void this.runSafe(); + } + + stop(): void { + if (this.timer) { + clearInterval(this.timer); + this.timer = null; + } + } + + async runOnce(): Promise<ReconciliationResult> { + return this.reconcile(); + } + + private async runSafe(): Promise<void> { + if (this.running) return; + this.running = true; + try { + const result = await this.reconcile(); + if (result.changesApplied.length > 0) { + this.logger.info( + { changeCount: result.changesApplied.length, durationMs: result.durationMs }, + "Reconciliation pass completed with changes", + ); + } + } catch (error) { + this.logger.error({ err: error }, "Reconciliation pass failed"); + } finally { + this.running = false; + } + } + + private async reconcile(): Promise<ReconciliationResult> { + const start = Date.now(); + const changes: ReconciliationChange[] = []; + + const allProjects = await this.projectRegistry.list(); + const allWorkspaces = await this.workspaceRegistry.list(); + + const activeProjects = allProjects.filter((p) => !p.archivedAt); + const activeWorkspaces = allWorkspaces.filter((w) => !w.archivedAt); + + const workspacesByProject = new Map<number, PersistedWorkspaceRecord[]>(); + for (const workspace of activeWorkspaces) { + const list = workspacesByProject.get(workspace.projectId) ?? []; + list.push(workspace); + workspacesByProject.set(workspace.projectId, list); + } + + // 1. 
Archive workspaces whose directories no longer exist + for (const workspace of activeWorkspaces) { + if (!existsSync(workspace.directory)) { + const timestamp = new Date().toISOString(); + await this.workspaceRegistry.archive(workspace.id, timestamp); + changes.push({ + kind: "workspace_archived", + workspaceId: workspace.id, + directory: workspace.directory, + reason: "directory_missing", + }); + + // Update the in-memory list for the project orphan check below + const siblings = workspacesByProject.get(workspace.projectId); + if (siblings) { + const updated = siblings.filter((w) => w.id !== workspace.id); + workspacesByProject.set(workspace.projectId, updated); + } + } + } + + // 2. Archive orphaned projects (all workspaces archived/removed) + for (const project of activeProjects) { + const siblings = workspacesByProject.get(project.id) ?? []; + if (siblings.length === 0) { + const timestamp = new Date().toISOString(); + await this.projectRegistry.archive(project.id, timestamp); + changes.push({ + kind: "project_archived", + projectId: project.id, + directory: project.directory, + reason: "no_active_workspaces", + }); + } + } + + // 3. Reconcile git metadata for active projects whose directories still exist + for (const project of activeProjects) { + if (project.archivedAt) continue; + const siblings = workspacesByProject.get(project.id) ?? []; + if (siblings.length === 0) continue; + if (!existsSync(project.directory)) continue; + + const directoryName = + project.directory.split(/[\\/]/).filter(Boolean).at(-1) ?? project.directory; + const currentGit = detectWorkspaceGitMetadata(project.directory, directoryName); + + const projectUpdates: Partial< + Pick<PersistedProjectRecord, "kind" | "displayName" | "gitRemote"> + > = {}; + + // Detect kind change: directory → git + if (project.kind !== currentGit.projectKind) { + projectUpdates.kind = currentGit.projectKind; + projectUpdates.displayName = currentGit.projectDisplayName; + projectUpdates.gitRemote = currentGit.gitRemote; + } + + // Detect display name change (e.g. 
remote renamed) + if ( + project.kind === "git" && + currentGit.projectKind === "git" && + project.displayName !== currentGit.projectDisplayName + ) { + projectUpdates.displayName = currentGit.projectDisplayName; + } + + // Detect git remote change + if ( + project.kind === "git" && + currentGit.projectKind === "git" && + project.gitRemote !== currentGit.gitRemote + ) { + projectUpdates.gitRemote = currentGit.gitRemote; + } + + if (Object.keys(projectUpdates).length > 0) { + const timestamp = new Date().toISOString(); + await this.projectRegistry.upsert({ + ...project, + ...projectUpdates, + updatedAt: timestamp, + }); + changes.push({ + kind: "project_updated", + projectId: project.id, + directory: project.directory, + fields: projectUpdates, + }); + } + + // 4. Reconcile workspace display names (branch name changes) + for (const workspace of siblings) { + if (!existsSync(workspace.directory)) continue; + + const wsDirName = + workspace.directory.split(/[\\/]/).filter(Boolean).at(-1) ?? 
workspace.directory; + const wsGit = detectWorkspaceGitMetadata(workspace.directory, wsDirName); + + if ( + wsGit.projectKind === "git" && + workspace.displayName !== wsGit.workspaceDisplayName + ) { + const timestamp = new Date().toISOString(); + await this.workspaceRegistry.upsert({ + ...workspace, + displayName: wsGit.workspaceDisplayName, + updatedAt: timestamp, + }); + changes.push({ + kind: "workspace_updated", + workspaceId: workspace.id, + directory: workspace.directory, + fields: { displayName: wsGit.workspaceDisplayName }, + }); + } + } + } + + if (changes.length > 0 && this.onChanges) { + this.onChanges(changes); + } + + return { changesApplied: changes, durationMs: Date.now() - start }; + } +} diff --git a/packages/server/src/server/workspace-registry-bootstrap.test.ts b/packages/server/src/server/workspace-registry-bootstrap.test.ts deleted file mode 100644 index 66fa5d442..000000000 --- a/packages/server/src/server/workspace-registry-bootstrap.test.ts +++ /dev/null @@ -1,167 +0,0 @@ -import os from "node:os"; -import path from "node:path"; -import { mkdtempSync, rmSync } from "node:fs"; - -import { afterEach, beforeEach, describe, expect, test } from "vitest"; - -import { createTestLogger } from "../test-utils/test-logger.js"; -import { AgentStorage } from "./agent/agent-storage.js"; -import { FileBackedProjectRegistry, FileBackedWorkspaceRegistry } from "./workspace-registry.js"; -import { bootstrapWorkspaceRegistries } from "./workspace-registry-bootstrap.js"; - -describe("bootstrapWorkspaceRegistries", () => { - let tmpDir: string; - let paseoHome: string; - let agentStorage: AgentStorage; - let projectRegistry: FileBackedProjectRegistry; - let workspaceRegistry: FileBackedWorkspaceRegistry; - const logger = createTestLogger(); - - beforeEach(() => { - tmpDir = mkdtempSync(path.join(os.tmpdir(), "workspace-bootstrap-")); - paseoHome = path.join(tmpDir, ".paseo"); - agentStorage = new AgentStorage(path.join(paseoHome, "agents"), logger); - 
projectRegistry = new FileBackedProjectRegistry( - path.join(paseoHome, "projects", "projects.json"), - logger, - ); - workspaceRegistry = new FileBackedWorkspaceRegistry( - path.join(paseoHome, "projects", "workspaces.json"), - logger, - ); - }); - - afterEach(() => { - rmSync(tmpDir, { recursive: true, force: true }); - }); - - test("materializes workspace registries from non-archived agent records", async () => { - await agentStorage.initialize(); - await agentStorage.upsert({ - id: "agent-1", - provider: "codex", - cwd: "/tmp/non-git-project", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-02T00:00:00.000Z", - lastActivityAt: "2026-03-02T00:00:00.000Z", - lastUserMessageAt: null, - title: null, - labels: {}, - lastStatus: "idle", - lastModeId: null, - config: null, - runtimeInfo: { provider: "codex", sessionId: null }, - persistence: null, - archivedAt: null, - }); - await agentStorage.upsert({ - id: "agent-2", - provider: "codex", - cwd: "/tmp/non-git-project", - createdAt: "2026-03-01T01:00:00.000Z", - updatedAt: "2026-03-03T00:00:00.000Z", - lastActivityAt: "2026-03-03T00:00:00.000Z", - lastUserMessageAt: null, - title: null, - labels: {}, - lastStatus: "running", - lastModeId: null, - config: null, - runtimeInfo: { provider: "codex", sessionId: null }, - persistence: null, - archivedAt: null, - }); - await agentStorage.upsert({ - id: "agent-archived", - provider: "codex", - cwd: "/tmp/archived-project", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - lastActivityAt: "2026-03-01T00:00:00.000Z", - lastUserMessageAt: null, - title: null, - labels: {}, - lastStatus: "idle", - lastModeId: null, - config: null, - runtimeInfo: { provider: "codex", sessionId: null }, - persistence: null, - archivedAt: "2026-03-02T00:00:00.000Z", - }); - - await bootstrapWorkspaceRegistries({ - paseoHome, - agentStorage, - projectRegistry, - workspaceRegistry, - logger, - }); - - const workspaces = await workspaceRegistry.list(); - 
expect(workspaces).toHaveLength(1); - expect(workspaces[0]?.workspaceId).toBe("/tmp/non-git-project"); - expect(workspaces[0]?.createdAt).toBe("2026-03-01T00:00:00.000Z"); - expect(workspaces[0]?.updatedAt).toBe("2026-03-03T00:00:00.000Z"); - - const projects = await projectRegistry.list(); - expect(projects).toHaveLength(1); - expect(projects[0]?.projectId).toBe("/tmp/non-git-project"); - expect(projects[0]?.createdAt).toBe("2026-03-01T00:00:00.000Z"); - expect(projects[0]?.updatedAt).toBe("2026-03-03T00:00:00.000Z"); - }); - - test("does not rematerialize when registry files already exist", async () => { - await projectRegistry.initialize(); - await workspaceRegistry.initialize(); - await projectRegistry.upsert({ - projectId: "/tmp/existing", - rootPath: "/tmp/existing", - kind: "non_git", - displayName: "existing", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - archivedAt: null, - }); - await workspaceRegistry.upsert({ - workspaceId: "/tmp/existing", - projectId: "/tmp/existing", - cwd: "/tmp/existing", - kind: "directory", - displayName: "existing", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - archivedAt: null, - }); - - await agentStorage.initialize(); - await agentStorage.upsert({ - id: "agent-1", - provider: "codex", - cwd: "/tmp/another-project", - createdAt: "2026-03-02T00:00:00.000Z", - updatedAt: "2026-03-02T00:00:00.000Z", - lastActivityAt: "2026-03-02T00:00:00.000Z", - lastUserMessageAt: null, - title: null, - labels: {}, - lastStatus: "idle", - lastModeId: null, - config: null, - runtimeInfo: { provider: "codex", sessionId: null }, - persistence: null, - archivedAt: null, - }); - - await bootstrapWorkspaceRegistries({ - paseoHome, - agentStorage, - projectRegistry, - workspaceRegistry, - logger, - }); - - expect(await projectRegistry.list()).toHaveLength(1); - expect(await workspaceRegistry.list()).toHaveLength(1); - expect((await 
workspaceRegistry.list())[0]?.workspaceId).toBe("/tmp/existing"); - }); -}); diff --git a/packages/server/src/server/workspace-registry-bootstrap.ts b/packages/server/src/server/workspace-registry-bootstrap.ts deleted file mode 100644 index ae839246c..000000000 --- a/packages/server/src/server/workspace-registry-bootstrap.ts +++ /dev/null @@ -1,147 +0,0 @@ -import path from "node:path"; - -import type { Logger } from "pino"; - -import type { StoredAgentRecord } from "./agent/agent-storage.js"; -import type { AgentStorage } from "./agent/agent-storage.js"; -import { - buildProjectPlacementForCwd, - deriveWorkspaceId, - deriveProjectKind, - deriveProjectRootPath, - deriveWorkspaceDisplayName, - deriveWorkspaceKind, - normalizeWorkspaceId, -} from "./workspace-registry-model.js"; -import { - createPersistedProjectRecord, - createPersistedWorkspaceRecord, - type ProjectRegistry, - type WorkspaceRegistry, -} from "./workspace-registry.js"; - -function minIsoDate(left: string | null, right: string | null): string | null { - if (!left) { - return right; - } - if (!right) { - return left; - } - return Date.parse(left) <= Date.parse(right) ? left : right; -} - -function maxIsoDate(left: string | null, right: string | null): string | null { - if (!left) { - return right; - } - if (!right) { - return left; - } - return Date.parse(left) >= Date.parse(right) ? 
left : right; -} - -function resolveAgentCreatedAt(record: StoredAgentRecord): string { - return record.createdAt || record.updatedAt || new Date(0).toISOString(); -} - -function resolveAgentUpdatedAt(record: StoredAgentRecord): string { - return record.lastActivityAt || record.updatedAt || record.createdAt || new Date(0).toISOString(); -} - -export async function bootstrapWorkspaceRegistries(options: { - paseoHome: string; - agentStorage: AgentStorage; - projectRegistry: ProjectRegistry; - workspaceRegistry: WorkspaceRegistry; - logger: Logger; -}): Promise { - const [projectsExists, workspacesExists] = await Promise.all([ - options.projectRegistry.existsOnDisk(), - options.workspaceRegistry.existsOnDisk(), - ]); - - await Promise.all([options.projectRegistry.initialize(), options.workspaceRegistry.initialize()]); - - if (projectsExists && workspacesExists) { - return; - } - - const records = await options.agentStorage.list(); - const activeRecords = records.filter((record) => !record.archivedAt); - const recordsByWorkspaceId = new Map< - string, - { placement: Awaited>; records: StoredAgentRecord[] } - >(); - for (const record of activeRecords) { - const normalizedCwd = normalizeWorkspaceId(record.cwd); - const placement = await buildProjectPlacementForCwd({ - cwd: normalizedCwd, - paseoHome: options.paseoHome, - }); - const workspaceId = deriveWorkspaceId(normalizedCwd, placement.checkout); - const existing = recordsByWorkspaceId.get(workspaceId) ?? 
{ placement, records: [] }; - existing.records.push(record); - recordsByWorkspaceId.set(workspaceId, existing); - } - - const projectRanges = new Map(); - - for (const [workspaceId, entry] of recordsByWorkspaceId.entries()) { - const { placement, records: workspaceRecords } = entry; - let workspaceCreatedAt: string | null = null; - let workspaceUpdatedAt: string | null = null; - for (const record of workspaceRecords) { - workspaceCreatedAt = minIsoDate(workspaceCreatedAt, resolveAgentCreatedAt(record)); - workspaceUpdatedAt = maxIsoDate(workspaceUpdatedAt, resolveAgentUpdatedAt(record)); - } - - const createdAt = workspaceCreatedAt ?? new Date().toISOString(); - const updatedAt = workspaceUpdatedAt ?? createdAt; - await options.workspaceRegistry.upsert( - createPersistedWorkspaceRecord({ - workspaceId, - projectId: placement.projectKey, - cwd: workspaceId, - kind: deriveWorkspaceKind(placement.checkout), - displayName: deriveWorkspaceDisplayName({ - cwd: workspaceId, - checkout: placement.checkout, - }), - createdAt, - updatedAt, - }), - ); - - const existingProjectRange = projectRanges.get(placement.projectKey) ?? { - createdAt: null, - updatedAt: null, - }; - existingProjectRange.createdAt = minIsoDate(existingProjectRange.createdAt, createdAt); - existingProjectRange.updatedAt = maxIsoDate(existingProjectRange.updatedAt, updatedAt); - projectRanges.set(placement.projectKey, existingProjectRange); - - await options.projectRegistry.upsert( - createPersistedProjectRecord({ - projectId: placement.projectKey, - rootPath: deriveProjectRootPath({ - cwd: workspaceId, - checkout: placement.checkout, - }), - kind: deriveProjectKind(placement.checkout), - displayName: placement.projectName, - createdAt: existingProjectRange.createdAt ?? createdAt, - updatedAt: existingProjectRange.updatedAt ?? 
updatedAt, - }), - ); - } - - options.logger.info( - { - projectsFile: path.join(options.paseoHome, "projects", "projects.json"), - workspacesFile: path.join(options.paseoHome, "projects", "workspaces.json"), - materializedProjects: projectRanges.size, - materializedWorkspaces: recordsByWorkspaceId.size, - }, - "Workspace registries bootstrapped from existing agent storage", - ); -} diff --git a/packages/server/src/server/workspace-registry-model.test.ts b/packages/server/src/server/workspace-registry-model.test.ts deleted file mode 100644 index 2005ae2b5..000000000 --- a/packages/server/src/server/workspace-registry-model.test.ts +++ /dev/null @@ -1,84 +0,0 @@ -import { describe, expect, test, vi } from "vitest"; - -import { deriveWorkspaceId, detectStaleWorkspaces } from "./workspace-registry-model.js"; -import { createPersistedWorkspaceRecord } from "./workspace-registry.js"; - -function createWorkspaceRecord(workspaceId: string) { - return createPersistedWorkspaceRecord({ - workspaceId, - projectId: workspaceId, - cwd: workspaceId, - kind: "directory", - displayName: workspaceId.split("/").at(-1) ?? 
workspaceId, - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - }); -} - -describe("detectStaleWorkspaces", () => { - test("returns workspace ids whose directories no longer exist", async () => { - const checkDirectoryExists = vi.fn(async (cwd: string) => cwd !== "/tmp/missing"); - - const staleWorkspaceIds = await detectStaleWorkspaces({ - activeWorkspaces: [ - createWorkspaceRecord("/tmp/existing"), - createWorkspaceRecord("/tmp/missing"), - ], - checkDirectoryExists, - }); - - expect(Array.from(staleWorkspaceIds)).toEqual(["/tmp/missing"]); - expect(checkDirectoryExists.mock.calls).toEqual([["/tmp/existing"], ["/tmp/missing"]]); - }); - - test("keeps workspaces whose directories exist even when all agents are archived", async () => { - const staleWorkspaceIds = await detectStaleWorkspaces({ - activeWorkspaces: [createWorkspaceRecord("/tmp/repo"), createWorkspaceRecord("/tmp/other")], - checkDirectoryExists: async () => true, - }); - - expect(Array.from(staleWorkspaceIds)).toEqual([]); - }); - - test("keeps workspaces with no agents when directory exists", async () => { - const staleWorkspaceIds = await detectStaleWorkspaces({ - activeWorkspaces: [ - createWorkspaceRecord("/tmp/active"), - createWorkspaceRecord("/tmp/no-agents"), - ], - checkDirectoryExists: async () => true, - }); - - expect(Array.from(staleWorkspaceIds)).toEqual([]); - }); -}); - -describe("deriveWorkspaceId", () => { - test("uses git worktree root when available", () => { - expect( - deriveWorkspaceId("/tmp/repo/packages/app", { - cwd: "/tmp/repo/packages/app", - isGit: true, - currentBranch: "main", - remoteUrl: "https://github.com/acme/repo.git", - worktreeRoot: "/tmp/repo", - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }), - ).toBe("/tmp/repo"); - }); - - test("falls back to normalized cwd for non-git directories", () => { - expect( - deriveWorkspaceId("/tmp/repo/../repo/scratch", { - cwd: "/tmp/repo/../repo/scratch", - isGit: false, - 
currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }), - ).toBe("/tmp/repo/scratch"); - }); -}); diff --git a/packages/server/src/server/workspace-registry-model.ts b/packages/server/src/server/workspace-registry-model.ts index 43dde14fa..1d1a08f0c 100644 --- a/packages/server/src/server/workspace-registry-model.ts +++ b/packages/server/src/server/workspace-registry-model.ts @@ -1,15 +1,7 @@ import { resolve } from "node:path"; -import { getCheckoutStatusLite } from "../utils/checkout-git.js"; -import type { ProjectCheckoutLitePayload, ProjectPlacementPayload } from "../shared/messages.js"; -import type { PersistedWorkspaceRecord } from "./workspace-registry.js"; - -export type PersistedProjectKind = "git" | "non_git"; -export type PersistedWorkspaceKind = "local_checkout" | "worktree" | "directory"; -export type DetectStaleWorkspacesInput = { - activeWorkspaces: PersistedWorkspaceRecord[]; - checkDirectoryExists: (cwd: string) => Promise; -}; +export type PersistedProjectKind = "git" | "directory"; +export type PersistedWorkspaceKind = "checkout" | "worktree"; export function normalizeWorkspaceId(cwd: string): string { const trimmed = cwd.trim(); @@ -18,203 +10,3 @@ export function normalizeWorkspaceId(cwd: string): string { } return resolve(trimmed); } - -export function deriveWorkspaceId(cwd: string, checkout: ProjectCheckoutLitePayload): string { - return checkout.worktreeRoot ?? normalizeWorkspaceId(cwd); -} - -function deriveRemoteProjectKey(remoteUrl: string | null): string | null { - if (!remoteUrl) { - return null; - } - - const trimmed = remoteUrl.trim(); - if (!trimmed) { - return null; - } - - let host: string | null = null; - let remotePath: string | null = null; - - const scpLike = trimmed.match(/^[^@]+@([^:]+):(.+)$/); - if (scpLike) { - host = scpLike[1] ?? null; - remotePath = scpLike[2] ?? 
null; - } else if (trimmed.includes("://")) { - try { - const parsed = new URL(trimmed); - host = parsed.hostname || null; - remotePath = parsed.pathname ? parsed.pathname.replace(/^\/+/, "") : null; - } catch { - return null; - } - } - - if (!host || !remotePath) { - return null; - } - - let cleanedPath = remotePath.trim().replace(/^\/+/, "").replace(/\/+$/, ""); - if (cleanedPath.endsWith(".git")) { - cleanedPath = cleanedPath.slice(0, -4); - } - if (!cleanedPath.includes("/")) { - return null; - } - - const cleanedHost = host.toLowerCase(); - if (cleanedHost === "github.com") { - return `remote:github.com/${cleanedPath}`; - } - - return `remote:${cleanedHost}/${cleanedPath}`; -} - -export function deriveProjectGroupingKey(options: { - cwd: string; - remoteUrl: string | null; - isPaseoOwnedWorktree: boolean; - mainRepoRoot: string | null; -}): string { - const remoteKey = deriveRemoteProjectKey(options.remoteUrl); - if (remoteKey) { - return remoteKey; - } - - const mainRepoRoot = options.mainRepoRoot?.trim(); - if (options.isPaseoOwnedWorktree && mainRepoRoot) { - return mainRepoRoot; - } - - return options.cwd; -} - -export function deriveProjectGroupingName(projectKey: string): string { - const githubRemotePrefix = "remote:github.com/"; - if (projectKey.startsWith(githubRemotePrefix)) { - return projectKey.slice(githubRemotePrefix.length) || projectKey; - } - - const segments = projectKey.split(/[\\/]/).filter(Boolean); - return segments[segments.length - 1] || projectKey; -} - -function deriveWorkspaceDirectoryName(cwd: string): string { - const normalized = cwd.replace(/\\/g, "/"); - const segments = normalized.split("/").filter(Boolean); - return segments[segments.length - 1] ?? cwd; -} - -export function deriveWorkspaceDisplayName(input: { - cwd: string; - checkout: ProjectCheckoutLitePayload; -}): string { - const branch = input.checkout.currentBranch?.trim() ?? 
null; - if (branch && branch.toUpperCase() !== "HEAD") { - return branch; - } - return deriveWorkspaceDirectoryName(input.cwd); -} - -export function deriveProjectRootPath(input: { - cwd: string; - checkout: ProjectCheckoutLitePayload; -}): string { - if (input.checkout.isGit && input.checkout.isPaseoOwnedWorktree) { - return input.checkout.mainRepoRoot; - } - return input.cwd; -} - -export function deriveProjectKind(checkout: ProjectCheckoutLitePayload): PersistedProjectKind { - return checkout.isGit ? "git" : "non_git"; -} - -export function deriveWorkspaceKind(checkout: ProjectCheckoutLitePayload): PersistedWorkspaceKind { - if (!checkout.isGit) { - return "directory"; - } - return checkout.isPaseoOwnedWorktree ? "worktree" : "local_checkout"; -} - -export async function detectStaleWorkspaces( - input: DetectStaleWorkspacesInput, -): Promise> { - const staleWorkspaceIds = new Set(); - - for (const workspace of input.activeWorkspaces) { - const dirExists = await input.checkDirectoryExists(workspace.cwd); - if (!dirExists) { - staleWorkspaceIds.add(workspace.workspaceId); - } - } - - return staleWorkspaceIds; -} - -export async function buildProjectPlacementForCwd(input: { - cwd: string; - paseoHome: string; -}): Promise { - const normalizedCwd = normalizeWorkspaceId(input.cwd); - const checkout = await getCheckoutStatusLite(normalizedCwd, { paseoHome: input.paseoHome }) - .then((status): ProjectCheckoutLitePayload => { - if (!status.isGit) { - return { - cwd: normalizedCwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }; - } - - if (status.isPaseoOwnedWorktree && status.mainRepoRoot) { - return { - cwd: normalizedCwd, - isGit: true, - currentBranch: status.currentBranch, - remoteUrl: status.remoteUrl, - worktreeRoot: status.worktreeRoot, - isPaseoOwnedWorktree: true, - mainRepoRoot: status.mainRepoRoot, - }; - } - - return { - cwd: normalizedCwd, - isGit: true, - 
currentBranch: status.currentBranch, - remoteUrl: status.remoteUrl, - worktreeRoot: status.worktreeRoot, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }; - }) - .catch( - (): ProjectCheckoutLitePayload => ({ - cwd: normalizedCwd, - isGit: false, - currentBranch: null, - remoteUrl: null, - worktreeRoot: null, - isPaseoOwnedWorktree: false, - mainRepoRoot: null, - }), - ); - - const projectKey = deriveProjectGroupingKey({ - cwd: checkout.worktreeRoot ?? normalizedCwd, - remoteUrl: checkout.remoteUrl, - isPaseoOwnedWorktree: checkout.isPaseoOwnedWorktree, - mainRepoRoot: checkout.mainRepoRoot, - }); - - return { - projectKey, - projectName: deriveProjectGroupingName(projectKey), - checkout, - }; -} diff --git a/packages/server/src/server/workspace-registry.test-helpers.ts b/packages/server/src/server/workspace-registry.test-helpers.ts new file mode 100644 index 000000000..1be76e04d --- /dev/null +++ b/packages/server/src/server/workspace-registry.test-helpers.ts @@ -0,0 +1,172 @@ +import { randomUUID } from "node:crypto"; +import { promises as fs } from "node:fs"; +import path from "node:path"; + +import type { Logger } from "pino"; + +import { + parsePersistedProjectRecords, + parsePersistedWorkspaceRecords, + type PersistedProjectRecord, + type PersistedWorkspaceRecord, + type ProjectRegistry, + type WorkspaceRegistry, +} from "./workspace-registry.js"; + +type RegistryRecord = PersistedProjectRecord | PersistedWorkspaceRecord; + +class FileBackedRegistry { + private readonly filePath: string; + private readonly logger: Logger; + private readonly parseRecord: (record: unknown) => TRecord; + private readonly parseRecords: (input: unknown) => TRecord[]; + private readonly getId: (record: TRecord) => number; + private loaded = false; + private readonly cache = new Map(); + private persistQueue: Promise = Promise.resolve(); + + constructor(options: { + filePath: string; + logger: Logger; + parseRecords: (input: unknown) => TRecord[]; + getId: (record: TRecord) 
=> number;
+    component: string;
+  }) {
+    this.filePath = options.filePath;
+    this.parseRecords = options.parseRecords;
+    this.parseRecord = (record) => options.parseRecords([record])[0]!;
+    this.getId = options.getId;
+    this.logger = options.logger.child({
+      module: "workspace-registry",
+      component: options.component,
+    });
+  }
+
+  async initialize(): Promise<void> {
+    await this.load();
+  }
+
+  async existsOnDisk(): Promise<boolean> {
+    try {
+      await fs.access(this.filePath);
+      return true;
+    } catch {
+      return false;
+    }
+  }
+
+  async list(): Promise<TRecord[]> {
+    await this.load();
+    return Array.from(this.cache.values());
+  }
+
+  async get(id: number): Promise<TRecord | null> {
+    await this.load();
+    return this.cache.get(String(id)) ?? null;
+  }
+
+  async insert(record: Omit<TRecord, "id">): Promise<number> {
+    await this.load();
+    const nextId = Math.max(0, ...Array.from(this.cache.values(), (value) => this.getId(value))) + 1;
+    const parsed = this.parseRecord({ ...record, id: nextId });
+    this.cache.set(String(this.getId(parsed)), parsed);
+    await this.enqueuePersist();
+    return nextId;
+  }
+
+  async upsert(record: TRecord): Promise<void> {
+    await this.load();
+    const parsed = this.parseRecord(record);
+    this.cache.set(String(this.getId(parsed)), parsed);
+    await this.enqueuePersist();
+  }
+
+  async archive(id: number, archivedAt: string): Promise<void> {
+    await this.load();
+    const key = String(id);
+    const existing = this.cache.get(key);
+    if (!existing) {
+      return;
+    }
+    const next = this.parseRecord({
+      ...existing,
+      updatedAt: archivedAt,
+      archivedAt,
+    });
+    this.cache.set(key, next);
+    await this.enqueuePersist();
+  }
+
+  async remove(id: number): Promise<void> {
+    await this.load();
+    if (!this.cache.delete(String(id))) {
+      return;
+    }
+    await this.enqueuePersist();
+  }
+
+  private async load(): Promise<void> {
+    if (this.loaded) {
+      return;
+    }
+
+    this.cache.clear();
+    try {
+      const raw = await fs.readFile(this.filePath, "utf8");
+      const parsed = this.parseRecords(JSON.parse(raw));
+      for (const record of parsed) {
+        this.cache.set(String(this.getId(record)), record);
+      }
+    } catch (error) {
+      const code = (error as NodeJS.ErrnoException).code;
+      if (code !== "ENOENT") {
+        this.logger.error({ err: error, filePath: this.filePath }, "Failed to load registry file");
+      }
+    }
+    this.loaded = true;
+  }
+
+  private async persist(): Promise<void> {
+    const records = Array.from(this.cache.values());
+    await fs.mkdir(path.dirname(this.filePath), { recursive: true });
+    const tempPath = `${this.filePath}.${process.pid}.${Date.now()}.${randomUUID()}.tmp`;
+    await fs.writeFile(tempPath, JSON.stringify(records, null, 2), "utf8");
+    await fs.rename(tempPath, this.filePath);
+  }
+
+  private async enqueuePersist(): Promise<void> {
+    const nextPersist = this.persistQueue.then(() => this.persist());
+    this.persistQueue = nextPersist.catch(() => {});
+    await nextPersist;
+  }
+}
+
+export class FileBackedProjectRegistry
+  extends FileBackedRegistry<PersistedProjectRecord>
+  implements ProjectRegistry
+{
+  constructor(filePath: string, logger: Logger) {
+    super({
+      filePath,
+      logger,
+      parseRecords: parsePersistedProjectRecords,
+      getId: (record) => record.id,
+      component: "projects",
+    });
+  }
+}
+
+export class FileBackedWorkspaceRegistry
+  extends FileBackedRegistry<PersistedWorkspaceRecord>
+  implements WorkspaceRegistry
+{
+  constructor(filePath: string, logger: Logger) {
+    super({
+      filePath,
+      logger,
+      parseRecords: parsePersistedWorkspaceRecords,
+      getId: (record) => record.id,
+      component: "workspaces",
+    });
+  }
+}
diff --git a/packages/server/src/server/workspace-registry.test.ts b/packages/server/src/server/workspace-registry.test.ts
index b4abe506b..df119c353 100644
--- a/packages/server/src/server/workspace-registry.test.ts
+++ b/packages/server/src/server/workspace-registry.test.ts
@@ -6,10 +6,12 @@ import { beforeEach, afterEach, describe, expect, test } from "vitest";
 
 import { createTestLogger } from "../test-utils/test-logger.js";
 import {
-  createPersistedProjectRecord,
-  createPersistedWorkspaceRecord,
   FileBackedProjectRegistry,
FileBackedWorkspaceRegistry, +} from "./workspace-registry.test-helpers.js"; +import { + createPersistedProjectRecord, + createPersistedWorkspaceRecord, } from "./workspace-registry.js"; describe("workspace registries", () => { @@ -36,71 +38,78 @@ describe("workspace registries", () => { test("creates, updates, archives, deletes, and lists project records", async () => { await projectRegistry.initialize(); - await projectRegistry.upsert( - createPersistedProjectRecord({ - projectId: "remote:github.com/acme/repo", - rootPath: "/tmp/repo", - kind: "git", - displayName: "acme/repo", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - }), - ); + const projectId = await projectRegistry.insert({ + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); await projectRegistry.upsert( createPersistedProjectRecord({ - projectId: "remote:github.com/acme/repo", - rootPath: "/tmp/repo", + id: projectId, + directory: "/tmp/repo", kind: "git", displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", createdAt: "2026-03-01T00:00:00.000Z", updatedAt: "2026-03-02T00:00:00.000Z", }), ); - await projectRegistry.archive("remote:github.com/acme/repo", "2026-03-03T00:00:00.000Z"); + await projectRegistry.archive(projectId, "2026-03-03T00:00:00.000Z"); - const archived = await projectRegistry.get("remote:github.com/acme/repo"); + const archived = await projectRegistry.get(projectId); expect(archived?.archivedAt).toBe("2026-03-03T00:00:00.000Z"); expect(await projectRegistry.list()).toHaveLength(1); - await projectRegistry.remove("remote:github.com/acme/repo"); - expect(await projectRegistry.get("remote:github.com/acme/repo")).toBeNull(); + await projectRegistry.remove(projectId); + expect(await projectRegistry.get(projectId)).toBeNull(); expect(await 
projectRegistry.list()).toEqual([]); }); test("creates, updates, archives, deletes, and lists workspace records", async () => { await workspaceRegistry.initialize(); - await workspaceRegistry.upsert( - createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: "remote:github.com/acme/repo", - cwd: "/tmp/repo", - kind: "local_checkout", - displayName: "main", - createdAt: "2026-03-01T00:00:00.000Z", - updatedAt: "2026-03-01T00:00:00.000Z", - }), - ); + const projectId = await projectRegistry.insert({ + directory: "/tmp/repo", + kind: "git", + displayName: "acme/repo", + gitRemote: "git@github.com:acme/repo.git", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); + const workspaceId = await workspaceRegistry.insert({ + projectId, + directory: "/tmp/repo", + kind: "checkout", + displayName: "main", + createdAt: "2026-03-01T00:00:00.000Z", + updatedAt: "2026-03-01T00:00:00.000Z", + archivedAt: null, + }); await workspaceRegistry.upsert( createPersistedWorkspaceRecord({ - workspaceId: "/tmp/repo", - projectId: "remote:github.com/acme/repo", - cwd: "/tmp/repo", - kind: "local_checkout", + id: workspaceId, + projectId, + directory: "/tmp/repo", + kind: "checkout", displayName: "feature/workspace", createdAt: "2026-03-01T00:00:00.000Z", updatedAt: "2026-03-02T00:00:00.000Z", }), ); - await workspaceRegistry.archive("/tmp/repo", "2026-03-03T00:00:00.000Z"); + await workspaceRegistry.archive(workspaceId, "2026-03-03T00:00:00.000Z"); - const archived = await workspaceRegistry.get("/tmp/repo"); + const archived = await workspaceRegistry.get(workspaceId); expect(archived?.displayName).toBe("feature/workspace"); expect(archived?.archivedAt).toBe("2026-03-03T00:00:00.000Z"); - await workspaceRegistry.remove("/tmp/repo"); - expect(await workspaceRegistry.get("/tmp/repo")).toBeNull(); + await workspaceRegistry.remove(workspaceId); + expect(await workspaceRegistry.get(workspaceId)).toBeNull(); expect(await 
workspaceRegistry.list()).toEqual([]);
   });
 });
diff --git a/packages/server/src/server/workspace-registry.ts b/packages/server/src/server/workspace-registry.ts
index dcd7fb411..d02010509 100644
--- a/packages/server/src/server/workspace-registry.ts
+++ b/packages/server/src/server/workspace-registry.ts
@@ -1,27 +1,21 @@
-import { randomUUID } from "node:crypto";
-import { promises as fs } from "node:fs";
-import path from "node:path";
-
-import type { Logger } from "pino";
 import { z } from "zod";
 
-import type { PersistedProjectKind, PersistedWorkspaceKind } from "./workspace-registry-model.js";
-
 const PersistedProjectRecordSchema = z.object({
-  projectId: z.string(),
-  rootPath: z.string(),
-  kind: z.enum(["git", "non_git"]),
+  id: z.number().int(),
+  directory: z.string(),
+  kind: z.enum(["git", "directory"]),
   displayName: z.string(),
+  gitRemote: z.string().nullable(),
   createdAt: z.string(),
   updatedAt: z.string(),
   archivedAt: z.string().nullable(),
 });
 
 const PersistedWorkspaceRecordSchema = z.object({
-  workspaceId: z.string(),
-  projectId: z.string(),
-  cwd: z.string(),
-  kind: z.enum(["local_checkout", "worktree", "directory"]),
+  id: z.number().int(),
+  projectId: z.number().int(),
+  directory: z.string(),
+  kind: z.enum(["checkout", "worktree"]),
   displayName: z.string(),
   createdAt: z.string(),
   updatedAt: z.string(),
@@ -31,192 +25,58 @@ const PersistedWorkspaceRecordSchema = z.object({
 export type PersistedProjectRecord = z.infer<typeof PersistedProjectRecordSchema>;
 export type PersistedWorkspaceRecord = z.infer<typeof PersistedWorkspaceRecordSchema>;
 
+export function parsePersistedProjectRecords(input: unknown): PersistedProjectRecord[] {
+  return z.array(PersistedProjectRecordSchema).parse(input);
+}
+
+export function parsePersistedWorkspaceRecords(input: unknown): PersistedWorkspaceRecord[] {
+  return z.array(PersistedWorkspaceRecordSchema).parse(input);
+}
+
 export interface ProjectRegistry {
   initialize(): Promise<void>;
   existsOnDisk(): Promise<boolean>;
   list(): Promise<PersistedProjectRecord[]>;
-  get(projectId: string): Promise<PersistedProjectRecord | null>;
+  get(id: number): Promise<PersistedProjectRecord | null>;
+  insert(record: Omit<PersistedProjectRecord, "id">): Promise<number>;
   upsert(record: PersistedProjectRecord): Promise<void>;
-  archive(projectId: string, archivedAt: string): Promise<void>;
-  remove(projectId: string): Promise<void>;
+  archive(id: number, archivedAt: string): Promise<void>;
+  remove(id: number): Promise<void>;
 }
 
 export interface WorkspaceRegistry {
   initialize(): Promise<void>;
   existsOnDisk(): Promise<boolean>;
   list(): Promise<PersistedWorkspaceRecord[]>;
-  get(workspaceId: string): Promise<PersistedWorkspaceRecord | null>;
+  get(id: number): Promise<PersistedWorkspaceRecord | null>;
+  insert(record: Omit<PersistedWorkspaceRecord, "id">): Promise<number>;
   upsert(record: PersistedWorkspaceRecord): Promise<void>;
-  archive(workspaceId: string, archivedAt: string): Promise<void>;
-  remove(workspaceId: string): Promise<void>;
-}
-
-type RegistryRecord = PersistedProjectRecord | PersistedWorkspaceRecord;
-
-class FileBackedRegistry<TRecord extends RegistryRecord> {
-  private readonly filePath: string;
-  private readonly logger: Logger;
-  private readonly schema: z.ZodSchema<TRecord>;
-  private readonly getId: (record: TRecord) => string;
-  private loaded = false;
-  private readonly cache = new Map<string, TRecord>();
-  private persistQueue: Promise<void> = Promise.resolve();
-
-  constructor(options: {
-    filePath: string;
-    logger: Logger;
-    schema: z.ZodSchema<TRecord>;
-    getId: (record: TRecord) => string;
-    component: string;
-  }) {
-    this.filePath = options.filePath;
-    this.schema = options.schema;
-    this.getId = options.getId;
-    this.logger = options.logger.child({
-      module: "workspace-registry",
-      component: options.component,
-    });
-  }
-
-  async initialize(): Promise<void> {
-    await this.load();
-  }
-
-  async existsOnDisk(): Promise<boolean> {
-    try {
-      await fs.access(this.filePath);
-      return true;
-    } catch {
-      return false;
-    }
-  }
-
-  async list(): Promise<TRecord[]> {
-    await this.load();
-    return Array.from(this.cache.values());
-  }
-
-  async get(id: string): Promise<TRecord | null> {
-    await this.load();
-    return this.cache.get(id) ??
null; - } - - async upsert(record: TRecord): Promise { - await this.load(); - const parsed = this.schema.parse(record); - this.cache.set(this.getId(parsed), parsed); - await this.enqueuePersist(); - } - - async archive(id: string, archivedAt: string): Promise { - await this.load(); - const existing = this.cache.get(id); - if (!existing) { - return; - } - const next = this.schema.parse({ - ...existing, - updatedAt: archivedAt, - archivedAt, - }); - this.cache.set(id, next); - await this.enqueuePersist(); - } - - async remove(id: string): Promise { - await this.load(); - if (!this.cache.delete(id)) { - return; - } - await this.enqueuePersist(); - } - - private async load(): Promise { - if (this.loaded) { - return; - } - - this.cache.clear(); - try { - const raw = await fs.readFile(this.filePath, "utf8"); - const parsed = z.array(this.schema).parse(JSON.parse(raw)); - for (const record of parsed) { - this.cache.set(this.getId(record), record); - } - } catch (error) { - const code = (error as NodeJS.ErrnoException).code; - if (code !== "ENOENT") { - this.logger.error({ err: error, filePath: this.filePath }, "Failed to load registry file"); - } - } - this.loaded = true; - } - - private async persist(): Promise { - const records = Array.from(this.cache.values()); - await fs.mkdir(path.dirname(this.filePath), { recursive: true }); - const tempPath = `${this.filePath}.${process.pid}.${Date.now()}.${randomUUID()}.tmp`; - await fs.writeFile(tempPath, JSON.stringify(records, null, 2), "utf8"); - await fs.rename(tempPath, this.filePath); - } - - private async enqueuePersist(): Promise { - const nextPersist = this.persistQueue.then(() => this.persist()); - this.persistQueue = nextPersist.catch(() => {}); - await nextPersist; - } -} - -export class FileBackedProjectRegistry - extends FileBackedRegistry - implements ProjectRegistry -{ - constructor(filePath: string, logger: Logger) { - super({ - filePath, - logger, - schema: PersistedProjectRecordSchema, - getId: (record) => 
record.projectId, - component: "projects", - }); - } -} - -export class FileBackedWorkspaceRegistry - extends FileBackedRegistry - implements WorkspaceRegistry -{ - constructor(filePath: string, logger: Logger) { - super({ - filePath, - logger, - schema: PersistedWorkspaceRecordSchema, - getId: (record) => record.workspaceId, - component: "workspaces", - }); - } + archive(id: number, archivedAt: string): Promise; + remove(id: number): Promise; } export function createPersistedProjectRecord(input: { - projectId: string; - rootPath: string; - kind: PersistedProjectKind; + id: number; + directory: string; + kind: "git" | "directory"; displayName: string; + gitRemote?: string | null; createdAt: string; updatedAt: string; archivedAt?: string | null; }): PersistedProjectRecord { return PersistedProjectRecordSchema.parse({ ...input, + gitRemote: input.gitRemote ?? null, archivedAt: input.archivedAt ?? null, }); } export function createPersistedWorkspaceRecord(input: { - workspaceId: string; - projectId: string; - cwd: string; - kind: PersistedWorkspaceKind; + id: number; + projectId: number; + directory: string; + kind: "checkout" | "worktree"; displayName: string; createdAt: string; updatedAt: string; diff --git a/packages/server/src/server/worktree-bootstrap.test.ts b/packages/server/src/server/worktree-bootstrap.test.ts index e52fef018..e6b20a5f9 100644 --- a/packages/server/src/server/worktree-bootstrap.test.ts +++ b/packages/server/src/server/worktree-bootstrap.test.ts @@ -5,7 +5,12 @@ import { join } from "path"; import { tmpdir } from "os"; import type { AgentTimelineItem } from "./agent/agent-sdk-types.js"; -import { createAgentWorktree, runAsyncWorktreeBootstrap } from "./worktree-bootstrap.js"; +import { + createAgentWorktree, + runAsyncWorktreeBootstrap, + spawnWorktreeServices, +} from "./worktree-bootstrap.js"; +import { ServiceRouteStore } from "./service-proxy.js"; describe("runAsyncWorktreeBootstrap", () => { let tempDir: string; @@ -56,7 +61,7 @@ 
describe("runAsyncWorktreeBootstrap", () => { stdio: "pipe", }); - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ cwd: repoDir, branchName: "feature-streaming-setup", baseBranch: "main", @@ -69,7 +74,8 @@ describe("runAsyncWorktreeBootstrap", () => { await runAsyncWorktreeBootstrap({ agentId: "agent-test", - worktree, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: null, appendTimelineItem: async (item) => { persisted.push(item); @@ -113,12 +119,15 @@ describe("runAsyncWorktreeBootstrap", () => { expect(persistedSetupItems[0].detail.commands[0]).toMatchObject({ index: 1, command: 'echo "line-one"; echo "line-two" 1>&2', + log: expect.stringContaining("line-one"), status: "completed", exitCode: 0, }); + expect(persistedSetupItems[0].detail.commands[0]?.log).toContain("line-two"); expect(persistedSetupItems[0].detail.commands[1]).toMatchObject({ index: 2, command: 'echo "line-three"', + log: "line-three\n", status: "completed", exitCode: 0, }); @@ -160,7 +169,7 @@ describe("runAsyncWorktreeBootstrap", () => { stdio: "pipe", }); - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ cwd: repoDir, branchName: "feature-live-failure", baseBranch: "main", @@ -172,7 +181,8 @@ describe("runAsyncWorktreeBootstrap", () => { await expect( runAsyncWorktreeBootstrap({ agentId: "agent-live-failure", - worktree, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: null, appendTimelineItem: async (item) => { persisted.push(item); @@ -210,7 +220,7 @@ describe("runAsyncWorktreeBootstrap", () => { stdio: "pipe", }); - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ cwd: repoDir, branchName: "feature-large-output", baseBranch: "main", @@ -221,7 +231,8 @@ describe("runAsyncWorktreeBootstrap", () => { const 
persisted: AgentTimelineItem[] = []; await runAsyncWorktreeBootstrap({ agentId: "agent-large-output", - worktree, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: null, appendTimelineItem: async (item) => { persisted.push(item); @@ -244,6 +255,64 @@ describe("runAsyncWorktreeBootstrap", () => { expect(persistedSetupItem.detail.log).toContain("prefix-"); expect(persistedSetupItem.detail.log).toContain("-suffix"); expect(persistedSetupItem.detail.log).toContain("......"); + expect(persistedSetupItem.detail.commands[0]?.log).toContain("prefix-"); + expect(persistedSetupItem.detail.commands[0]?.log).toContain("-suffix"); + expect(persistedSetupItem.detail.commands[0]?.log).toContain( + "......", + ); + }); + + it("keeps only the final carriage-return-updated content in command logs", async () => { + writeFileSync( + join(repoDir, "paseo.json"), + JSON.stringify({ + worktree: { + setup: [ + `node -e "process.stdout.write('fetch 1/3\\\\rfetch 2/3\\\\rfetch 3/3\\\\nready\\\\n')"`, + ], + }, + }), + ); + execSync("git add paseo.json", { cwd: repoDir, stdio: "pipe" }); + execSync("git -c commit.gpgsign=false commit -m 'add carriage return setup'", { + cwd: repoDir, + stdio: "pipe", + }); + + const worktreeBootstrap = await createAgentWorktree({ + cwd: repoDir, + branchName: "feature-carriage-return", + baseBranch: "main", + worktreeSlug: "feature-carriage-return", + paseoHome, + }); + + const persisted: AgentTimelineItem[] = []; + await runAsyncWorktreeBootstrap({ + agentId: "agent-carriage-return", + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, + terminalManager: null, + appendTimelineItem: async (item) => { + persisted.push(item); + return true; + }, + emitLiveTimelineItem: async () => true, + }); + + const persistedSetupItem = persisted.find( + (item): item is Extract => + item.type === "tool_call" && item.name === "paseo_worktree_setup", + ); + 
expect(persistedSetupItem?.detail.type).toBe("worktree_setup"); + if (!persistedSetupItem || persistedSetupItem.detail.type !== "worktree_setup") { + throw new Error("Expected worktree_setup tool detail"); + } + + expect(persistedSetupItem.detail.log).toContain("\nfetch 3/3\nready\n"); + expect(persistedSetupItem.detail.log).not.toContain("\nfetch 1/3\n"); + expect(persistedSetupItem.detail.log).not.toContain("\nfetch 2/3\n"); + expect(persistedSetupItem.detail.commands[0]?.log).toBe("fetch 3/3\nready\n"); }); it("waits for terminal output before sending bootstrap commands", async () => { @@ -266,7 +335,7 @@ describe("runAsyncWorktreeBootstrap", () => { stdio: "pipe", }); - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ cwd: repoDir, branchName: "feature-terminal-readiness", baseBranch: "main", @@ -280,7 +349,8 @@ describe("runAsyncWorktreeBootstrap", () => { await runAsyncWorktreeBootstrap({ agentId: "agent-terminal-readiness", - worktree, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: { async getTerminals() { return []; @@ -357,7 +427,7 @@ describe("runAsyncWorktreeBootstrap", () => { stdio: "pipe", }); - const worktree = await createAgentWorktree({ + const worktreeBootstrap = await createAgentWorktree({ cwd: repoDir, branchName: "feature-shared-runtime-port", baseBranch: "main", @@ -370,7 +440,8 @@ describe("runAsyncWorktreeBootstrap", () => { const persisted: AgentTimelineItem[] = []; await runAsyncWorktreeBootstrap({ agentId: "agent-shared-runtime-port", - worktree, + worktree: worktreeBootstrap.worktree, + shouldBootstrap: worktreeBootstrap.shouldBootstrap, terminalManager: { async getTerminals() { return []; @@ -416,13 +487,13 @@ describe("runAsyncWorktreeBootstrap", () => { emitLiveTimelineItem: async () => true, }); - const setupPortPath = join(worktree.worktreePath, "setup-port.txt"); + const setupPortPath = 
join(worktreeBootstrap.worktree.worktreePath, "setup-port.txt"); await waitForPathExists(setupPortPath); const setupPort = readFileSync(setupPortPath, "utf8").trim(); expect(setupPort.length).toBeGreaterThan(0); expect(registeredEnvs).toHaveLength(1); - expect(registeredEnvs[0]?.cwd).toBe(worktree.worktreePath); + expect(registeredEnvs[0]?.cwd).toBe(worktreeBootstrap.worktree.worktreePath); expect(registeredEnvs[0]?.env.PASEO_WORKTREE_PORT).toBe(setupPort); expect(createTerminalEnvs.length).toBeGreaterThan(0); expect(createTerminalEnvs[0]?.PASEO_WORKTREE_PORT).toBe(setupPort); @@ -435,4 +506,85 @@ describe("runAsyncWorktreeBootstrap", () => { ); expect(terminalToolCall?.status).toBe("completed"); }); + + it("spawns services without PASEO_SERVICE_URL when the daemon has no TCP port", async () => { + writeFileSync( + join(repoDir, "paseo.json"), + JSON.stringify({ + services: { + web: { + command: "npm run dev", + }, + }, + }), + ); + execSync("git add paseo.json", { cwd: repoDir, stdio: "pipe" }); + execSync("git -c commit.gpgsign=false commit -m 'add service config'", { + cwd: repoDir, + stdio: "pipe", + }); + + const routeStore = new ServiceRouteStore(); + const createTerminalCalls: Array<{ cwd: string; name?: string; env?: Record }> = []; + + const results = await spawnWorktreeServices({ + repoRoot: repoDir, + workspaceId: repoDir, + branchName: "feature-socket-service", + daemonPort: null, + routeStore, + terminalManager: { + async getTerminals() { + return []; + }, + async createTerminal(options) { + createTerminalCalls.push(options); + return { + id: "term-service", + name: options.name ?? 
"Terminal", + cwd: options.cwd, + send: () => {}, + subscribe: () => () => {}, + onExit: () => () => {}, + getState: () => ({ + rows: 1, + cols: 1, + grid: [[{ char: "$" }]], + scrollback: [], + cursor: { row: 0, col: 0 }, + }), + kill: () => {}, + }; + }, + registerCwdEnv() {}, + getTerminal() { + return undefined; + }, + killTerminal() {}, + listDirectories() { + return []; + }, + killAll() {}, + subscribeTerminalsChanged() { + return () => {}; + }, + }, + }); + + expect(results).toHaveLength(1); + expect(routeStore.listRoutes()).toEqual([ + { + hostname: "feature-socket-service.web.localhost", + port: expect.any(Number), + workspaceId: repoDir, + serviceName: "web", + }, + ]); + expect(createTerminalCalls).toHaveLength(1); + expect(createTerminalCalls[0]?.cwd).toBe(repoDir); + expect(createTerminalCalls[0]?.name).toBe("web"); + expect(createTerminalCalls[0]?.env?.PORT).toEqual(expect.any(String)); + expect(createTerminalCalls[0]?.env?.HOST).toBe("127.0.0.1"); + expect(createTerminalCalls[0]?.env?.PASEO_SERVICE_URL).toBeUndefined(); + }); }); diff --git a/packages/server/src/server/worktree-bootstrap.ts b/packages/server/src/server/worktree-bootstrap.ts index afeec0c29..974cd5641 100644 --- a/packages/server/src/server/worktree-bootstrap.ts +++ b/packages/server/src/server/worktree-bootstrap.ts @@ -5,10 +5,13 @@ import { promisify } from "node:util"; import { sep } from "node:path"; import type { TerminalManager } from "../terminal/terminal-manager.js"; import type { TerminalSession } from "../terminal/terminal.js"; +import { buildServiceHostname } from "../utils/service-hostname.js"; import { createWorktree, + getServiceConfigs, getWorktreeTerminalSpecs, listPaseoWorktrees, + processCarriageReturns, resolveWorktreeRuntimeEnv, runWorktreeSetupCommands, WorktreeSetupError, @@ -16,7 +19,8 @@ import { type WorktreeSetupCommandResult, type WorktreeRuntimeEnv, } from "../utils/worktree.js"; -import type { AgentTimelineItem } from "./agent/agent-sdk-types.js"; +import 
{ findFreePort, type ServiceRouteStore } from "./service-proxy.js"; +import type { AgentTimelineItem, ToolCallDetail } from "./agent/agent-sdk-types.js"; export interface WorktreeBootstrapTerminalResult { name: string | null; @@ -29,7 +33,10 @@ export interface WorktreeBootstrapTerminalResult { export interface RunAsyncWorktreeBootstrapOptions { agentId: string; worktree: WorktreeConfig; + shouldBootstrap?: boolean; terminalManager: TerminalManager | null; + serviceRouteStore?: ServiceRouteStore; + daemonPort?: number | null; appendTimelineItem: (item: AgentTimelineItem) => Promise; emitLiveTimelineItem?: (item: AgentTimelineItem) => Promise; logger?: Logger; @@ -43,6 +50,11 @@ export interface CreateAgentWorktreeOptions { paseoHome?: string; } +export interface CreateAgentWorktreeResult { + worktree: WorktreeConfig; + shouldBootstrap: boolean; +} + const MAX_WORKTREE_SETUP_COMMAND_OUTPUT_BYTES = 64 * 1024; const WORKTREE_SETUP_TRUNCATION_MARKER = "\n......\n"; const WORKTREE_BOOTSTRAP_TERMINAL_READY_TIMEOUT_MS = 1_500; @@ -51,8 +63,6 @@ const READ_ONLY_GIT_ENV: NodeJS.ProcessEnv = { GIT_OPTIONAL_LOCKS: "0", }; const execAsync = promisify(exec); -const worktreeSetupEligibility = new WeakMap(); - type MiddleTruncationAccumulator = { totalBytes: number; head: string; @@ -60,6 +70,12 @@ type MiddleTruncationAccumulator = { truncated: boolean; }; +export type WorktreeSetupOutputAccumulator = MiddleTruncationAccumulator; +export type WorktreeSetupProgressAccumulator = { + resultsByIndex: Map; + outputAccumulatorsByIndex: Map; +}; + function byteLength(text: string): number { return Buffer.byteLength(text, "utf8"); } @@ -86,7 +102,7 @@ function sliceLastBytes(text: string, maxBytes: number): string { return bytes.subarray(bytes.length - maxBytes).toString("utf8"); } -function createMiddleTruncationAccumulator(): MiddleTruncationAccumulator { +export function createWorktreeSetupOutputAccumulator(): WorktreeSetupOutputAccumulator { return { totalBytes: 0, head: "", @@ 
-103,8 +119,8 @@ function getHeadTailBudgets(maxBytes: number): { headBytes: number; tailBytes: n return { headBytes, tailBytes }; } -function appendToMiddleTruncationAccumulator( - accumulator: MiddleTruncationAccumulator, +export function appendWorktreeSetupOutputAccumulator( + accumulator: WorktreeSetupOutputAccumulator, chunk: string, ): void { if (!chunk) { @@ -161,16 +177,17 @@ function renderMiddleTruncationAccumulator(accumulator: MiddleTruncationAccumula export async function createAgentWorktree( options: CreateAgentWorktreeOptions, -): Promise { +): Promise { const existingWorktree = await findExistingPaseoWorktreeBySlug(options); if (existingWorktree) { const branchName = await resolveBranchNameForWorktreePath(existingWorktree.path); - const reusedWorktree = { - branchName, - worktreePath: existingWorktree.path, + return { + worktree: { + branchName, + worktreePath: existingWorktree.path, + }, + shouldBootstrap: false, }; - worktreeSetupEligibility.set(reusedWorktree, false); - return reusedWorktree; } const createdWorktree = await createWorktree({ @@ -181,8 +198,10 @@ export async function createAgentWorktree( runSetup: false, paseoHome: options.paseoHome, }); - worktreeSetupEligibility.set(createdWorktree, true); - return createdWorktree; + return { + worktree: createdWorktree, + shouldBootstrap: true, + }; } async function findExistingPaseoWorktreeBySlug(options: CreateAgentWorktreeOptions) { @@ -221,7 +240,7 @@ function commandStatusFromResult( function buildWorktreeSetupLog(input: { results: WorktreeSetupCommandResult[]; - outputAccumulatorsByIndex?: Map; + outputAccumulatorsByIndex?: Map; }): { log: string; truncated: boolean } { const { results, outputAccumulatorsByIndex } = input; if (results.length === 0) { @@ -236,15 +255,13 @@ function buildWorktreeSetupLog(input: { const total = results.length; for (const [index, result] of results.entries()) { lines.push(`==> [${index + 1}/${total}] Running: ${result.command}`); - const accumulator = 
outputAccumulatorsByIndex?.get(index + 1); - const output = accumulator - ? renderMiddleTruncationAccumulator(accumulator) - : truncateTextInMiddle( - `${result.stdout ?? ""}${result.stderr ?? ""}`, - MAX_WORKTREE_SETUP_COMMAND_OUTPUT_BYTES, - ); - if (output.text.length > 0) { - lines.push(output.text.replace(/\n$/, "")); + const output = buildWorktreeSetupCommandLog({ + index: index + 1, + result, + outputAccumulatorsByIndex, + }); + if (output.log.length > 0) { + lines.push(output.log.replace(/\n$/, "")); } if (output.truncated) { anyTruncated = true; @@ -261,34 +278,136 @@ function buildWorktreeSetupLog(input: { }; } -function buildSetupTimelineItem(input: { - callId: string; - status: "running" | "completed" | "failed"; +function buildWorktreeSetupCommandLog(input: { + index: number; + result: WorktreeSetupCommandResult; + outputAccumulatorsByIndex?: Map; +}): { log: string; truncated: boolean } { + const { index, result, outputAccumulatorsByIndex } = input; + const accumulator = outputAccumulatorsByIndex?.get(index); + const rendered = accumulator + ? renderMiddleTruncationAccumulator(accumulator) + : truncateTextInMiddle( + `${result.stdout ?? ""}${result.stderr ?? ""}`, + MAX_WORKTREE_SETUP_COMMAND_OUTPUT_BYTES, + ); + + return { + log: processCarriageReturns(rendered.text), + truncated: rendered.truncated, + }; +} + +export function createWorktreeSetupProgressAccumulator(): WorktreeSetupProgressAccumulator { + return { + resultsByIndex: new Map(), + outputAccumulatorsByIndex: new Map(), + }; +} + +export function applyWorktreeSetupProgressEvent( + accumulator: WorktreeSetupProgressAccumulator, + event: Parameters[0]["onEvent"]>>[0], +): void { + const existing = accumulator.resultsByIndex.get(event.index); + const baseResult: WorktreeSetupCommandResult = existing ?? 
{ + command: event.command, + cwd: event.cwd, + stdout: "", + stderr: "", + exitCode: null, + durationMs: 0, + }; + + if (event.type === "output") { + const outputAccumulator = + accumulator.outputAccumulatorsByIndex.get(event.index) ?? + createWorktreeSetupOutputAccumulator(); + appendWorktreeSetupOutputAccumulator(outputAccumulator, event.chunk); + accumulator.outputAccumulatorsByIndex.set(event.index, outputAccumulator); + accumulator.resultsByIndex.set(event.index, { + ...baseResult, + stdout: baseResult.stdout, + stderr: baseResult.stderr, + }); + return; + } + + if (event.type === "command_completed") { + accumulator.resultsByIndex.set(event.index, { + ...baseResult, + stdout: event.stdout, + stderr: event.stderr, + exitCode: event.exitCode, + durationMs: event.durationMs, + }); + return; + } + + accumulator.resultsByIndex.set(event.index, baseResult); +} + +export function getWorktreeSetupProgressResults( + accumulator: WorktreeSetupProgressAccumulator, +): WorktreeSetupCommandResult[] { + return Array.from(accumulator.resultsByIndex.entries()) + .sort((a, b) => a[0] - b[0]) + .map(([, result]) => result); +} + +export function buildWorktreeSetupDetail(input: { worktree: WorktreeConfig; results: WorktreeSetupCommandResult[]; - outputAccumulatorsByIndex?: Map; - errorMessage: string | null; -}): AgentTimelineItem { - const commands = input.results.map((result, index) => ({ - index: index + 1, - command: result.command, - cwd: result.cwd, - status: commandStatusFromResult(result), - exitCode: result.exitCode, - ...(result.durationMs > 0 ? 
{ durationMs: result.durationMs } : {}), - })); + outputAccumulatorsByIndex?: Map; +}): Extract { + let anyCommandTruncated = false; + const commands = input.results.map((result, index) => { + const renderedLog = buildWorktreeSetupCommandLog({ + index: index + 1, + result, + outputAccumulatorsByIndex: input.outputAccumulatorsByIndex, + }); + if (renderedLog.truncated) { + anyCommandTruncated = true; + } + return { + index: index + 1, + command: result.command, + cwd: result.cwd, + log: renderedLog.log, + status: commandStatusFromResult(result), + exitCode: result.exitCode, + ...(result.durationMs > 0 ? { durationMs: result.durationMs } : {}), + }; + }); const renderedLog = buildWorktreeSetupLog({ results: input.results, outputAccumulatorsByIndex: input.outputAccumulatorsByIndex, }); - const detail = { - type: "worktree_setup" as const, + + return { + type: "worktree_setup", worktreePath: input.worktree.worktreePath, branchName: input.worktree.branchName, log: renderedLog.log, commands, - ...(renderedLog.truncated ? { truncated: true } : {}), + ...(renderedLog.truncated || anyCommandTruncated ? 
{ truncated: true } : {}), }; +} + +function buildSetupTimelineItem(input: { + callId: string; + status: "running" | "completed" | "failed"; + worktree: WorktreeConfig; + results: WorktreeSetupCommandResult[]; + outputAccumulatorsByIndex?: Map; + errorMessage: string | null; +}): AgentTimelineItem { + const detail = buildWorktreeSetupDetail({ + worktree: input.worktree, + results: input.results, + outputAccumulatorsByIndex: input.outputAccumulatorsByIndex, + }); if (input.status === "running") { return { @@ -523,7 +642,7 @@ async function runWorktreeTerminalBootstrap( export async function runAsyncWorktreeBootstrap( options: RunAsyncWorktreeBootstrapOptions, ): Promise { - if (worktreeSetupEligibility.get(options.worktree) === false) { + if (options.shouldBootstrap === false) { return; } @@ -531,17 +650,14 @@ export async function runAsyncWorktreeBootstrap( let setupResults: WorktreeSetupCommandResult[] = []; let runtimeEnv: WorktreeRuntimeEnv | null = null; const emitLiveTimelineItem = options.emitLiveTimelineItem; - const runningResultsByIndex = new Map(); - const outputAccumulatorsByIndex = new Map(); + const progressAccumulator = createWorktreeSetupProgressAccumulator(); let liveEmitQueue = Promise.resolve(); const queueLiveRunningEmit = () => { if (!emitLiveTimelineItem) { return; } - const runningResults = Array.from(runningResultsByIndex.entries()) - .sort((a, b) => a[0] - b[0]) - .map(([, result]) => result); + const runningResults = getWorktreeSetupProgressResults(progressAccumulator); liveEmitQueue = liveEmitQueue.then(async () => { try { await emitLiveTimelineItem( @@ -550,7 +666,7 @@ export async function runAsyncWorktreeBootstrap( status: "running", worktree: options.worktree, results: runningResults, - outputAccumulatorsByIndex, + outputAccumulatorsByIndex: progressAccumulator.outputAccumulatorsByIndex, errorMessage: null, }), ); @@ -579,42 +695,7 @@ export async function runAsyncWorktreeBootstrap( cleanupOnFailure: false, runtimeEnv, onEvent: (event) 
=> { - const existing = runningResultsByIndex.get(event.index); - const baseResult: WorktreeSetupCommandResult = existing ?? { - command: event.command, - cwd: event.cwd, - stdout: "", - stderr: "", - exitCode: null, - durationMs: 0, - }; - if (event.type === "output") { - const outputAccumulator = - outputAccumulatorsByIndex.get(event.index) ?? createMiddleTruncationAccumulator(); - appendToMiddleTruncationAccumulator(outputAccumulator, event.chunk); - outputAccumulatorsByIndex.set(event.index, outputAccumulator); - runningResultsByIndex.set(event.index, { - ...baseResult, - // Keep the timeline command model lightweight; output is carried in - // outputAccumulatorsByIndex. - stdout: baseResult.stdout, - stderr: baseResult.stderr, - }); - queueLiveRunningEmit(); - return; - } - if (event.type === "command_completed") { - runningResultsByIndex.set(event.index, { - ...baseResult, - stdout: event.stdout, - stderr: event.stderr, - exitCode: event.exitCode, - durationMs: event.durationMs, - }); - queueLiveRunningEmit(); - return; - } - runningResultsByIndex.set(event.index, baseResult); + applyWorktreeSetupProgressEvent(progressAccumulator, event); queueLiveRunningEmit(); }, }); @@ -626,7 +707,7 @@ export async function runAsyncWorktreeBootstrap( status: "completed", worktree: options.worktree, results: setupResults, - outputAccumulatorsByIndex, + outputAccumulatorsByIndex: progressAccumulator.outputAccumulatorsByIndex, errorMessage: null, }), ); @@ -645,7 +726,7 @@ export async function runAsyncWorktreeBootstrap( status: "failed", worktree: options.worktree, results: setupResults, - outputAccumulatorsByIndex, + outputAccumulatorsByIndex: progressAccumulator.outputAccumulatorsByIndex, errorMessage: message, }), ); @@ -653,4 +734,186 @@ export async function runAsyncWorktreeBootstrap( } await runWorktreeTerminalBootstrap(options, runtimeEnv); + + if (!options.terminalManager || !options.serviceRouteStore) { + return; + } + + try { + await spawnWorktreeServices({ + 
repoRoot: options.worktree.worktreePath, + workspaceId: options.worktree.worktreePath, + branchName: options.worktree.branchName, + daemonPort: options.daemonPort, + routeStore: options.serviceRouteStore, + terminalManager: options.terminalManager, + logger: options.logger, + }); + } catch (error) { + options.logger?.warn( + { err: error, agentId: options.agentId, worktreePath: options.worktree.worktreePath }, + "Failed to spawn worktree services", + ); + } +} + +// --------------------------------------------------------------------------- +// Service lifecycle helpers +// --------------------------------------------------------------------------- + +export interface WorktreeServiceResult { + serviceName: string; + hostname: string; + port: number; + terminalId: string; +} + +type SpawnWorkspaceServiceOptions = { + repoRoot: string; + workspaceId: string; + branchName: string | null; + serviceName: string; + daemonPort?: number | null; + routeStore: ServiceRouteStore; + terminalManager: TerminalManager; + logger?: Logger; + onLifecycleChanged?: () => void; +}; + +export async function spawnWorkspaceService( + options: SpawnWorkspaceServiceOptions, +): Promise { + const { + repoRoot, + workspaceId, + branchName, + serviceName, + daemonPort, + routeStore, + terminalManager, + logger, + onLifecycleChanged, + } = options; + const serviceConfigs = getServiceConfigs(repoRoot); + const config = serviceConfigs.get(serviceName); + if (!config) { + throw new Error(`Service '${serviceName}' is not configured in paseo.json`); + } + + let hostname: string | null = null; + let port: number | null = null; + + try { + hostname = buildServiceHostname(branchName, serviceName); + const resolvedHostname = hostname; + if (routeStore.getRouteEntry(resolvedHostname)) { + throw new Error(`Service '${serviceName}' is already running`); + } + + port = config.port ?? 
(await findFreePort()); + + routeStore.registerRoute({ + hostname: resolvedHostname, + port, + workspaceId, + serviceName, + }); + + const env: Record = { + PORT: String(port), + HOST: "127.0.0.1", + }; + if (daemonPort !== null && daemonPort !== undefined) { + env.PASEO_SERVICE_URL = `http://${resolvedHostname}:${daemonPort}`; + } + + const terminal = await terminalManager.createTerminal({ + cwd: repoRoot, + name: serviceName, + env, + }); + + terminal.onExit(() => { + routeStore.removeRoute(resolvedHostname); + onLifecycleChanged?.(); + logger?.info( + { serviceName, hostname: resolvedHostname, terminalId: terminal.id }, + "Stopped worktree service", + ); + }); + + await waitForTerminalBootstrapReadiness(terminal); + terminal.send({ type: "input", data: `${config.command}\r` }); + + logger?.info( + { serviceName, hostname: resolvedHostname, port, terminalId: terminal.id }, + `Registered service proxy: ${resolvedHostname} -> 127.0.0.1:${port}`, + ); + + onLifecycleChanged?.(); + return { + serviceName, + hostname: resolvedHostname, + port, + terminalId: terminal.id, + }; + } catch (error) { + if (hostname && port !== null) { + routeStore.removeRoute(hostname); + } + logger?.error( + { + err: error, + serviceName, + repoRoot, + branchName, + hostname, + port, + command: config.command, + }, + "Failed to spawn worktree service", + ); + throw error; + } +} + +export async function spawnWorktreeServices(options: { + repoRoot: string; + workspaceId: string; + branchName: string | null; + daemonPort?: number | null; + routeStore: ServiceRouteStore; + terminalManager: TerminalManager; + logger?: Logger; + onLifecycleChanged?: () => void; +}): Promise { + const { repoRoot } = options; + const serviceConfigs = getServiceConfigs(repoRoot); + if (serviceConfigs.size === 0) { + return []; + } + + const results: WorktreeServiceResult[] = []; + for (const serviceName of serviceConfigs.keys()) { + results.push( + await spawnWorkspaceService({ + ...options, + serviceName, + }), + 
); + } + + return results; +} + +export function teardownWorktreeServices(options: { + hostnames: string[]; + routeStore: ServiceRouteStore; + logger: Logger; +}): void { + const { hostnames, routeStore, logger } = options; + for (const hostname of hostnames) { + routeStore.removeRoute(hostname); + logger.info({ hostname }, "Removed service proxy route"); + } } diff --git a/packages/server/src/server/worktree-session.test.ts b/packages/server/src/server/worktree-session.test.ts new file mode 100644 index 000000000..22b476702 --- /dev/null +++ b/packages/server/src/server/worktree-session.test.ts @@ -0,0 +1,855 @@ +import { execSync } from "node:child_process"; +import { existsSync, mkdtempSync, readFileSync, realpathSync, rmSync, writeFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import path from "node:path"; +import { afterEach, describe, expect, test, vi } from "vitest"; + +import type { SessionOutboundMessage } from "./messages.js"; +import { ServiceRouteStore } from "./service-proxy.js"; +import * as worktreeBootstrap from "./worktree-bootstrap.js"; +import { + createPaseoWorktreeInBackground, + handleCreatePaseoWorktreeRequest, + handleWorkspaceSetupStatusRequest, +} from "./worktree-session.js"; +import { createWorktree } from "../utils/worktree.js"; + +function createLogger() { + return { + info: vi.fn(), + warn: vi.fn(), + error: vi.fn(), + } as any; +} + +function createTerminalManagerStub(options?: { + createTerminal?: (input: { + cwd: string; + name?: string; + env?: Record; + }) => Promise; +}) { + const terminals: Array<{ + id: string; + cwd: string; + name: string | undefined; + env: Record | undefined; + sent: string[]; + }> = []; + + return { + terminals, + manager: { + registerCwdEnv: vi.fn(), + createTerminal: vi.fn(async (input: { + cwd: string; + name?: string; + env?: Record; + }) => { + if (options?.createTerminal) { + return options.createTerminal(input); + } + const sent: string[] = []; + const terminal = { + id: 
`terminal-${terminals.length + 1}`, + getState: () => ({ + scrollback: [[{ char: "$" }]], + grid: [], + }), + subscribe: () => () => {}, + onExit: () => () => {}, + send: (message: { type: string; data: string }) => { + if (message.type === "input") { + sent.push(message.data); + } + }, + }; + terminals.push({ + id: terminal.id, + cwd: input.cwd, + name: input.name, + env: input.env, + sent, + }); + return terminal; + }), + } as any, + }; +} + +function createGitRepo(options?: { paseoConfig?: Record }) { + const tempDir = realpathSync(mkdtempSync(path.join(tmpdir(), "worktree-session-test-"))); + const repoDir = path.join(tempDir, "repo"); + execSync(`mkdir -p ${JSON.stringify(repoDir)}`); + execSync("git init -b main", { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.email 'test@test.com'", { cwd: repoDir, stdio: "pipe" }); + execSync("git config user.name 'Test'", { cwd: repoDir, stdio: "pipe" }); + writeFileSync(path.join(repoDir, "README.md"), "hello\n"); + if (options?.paseoConfig) { + writeFileSync(path.join(repoDir, "paseo.json"), JSON.stringify(options.paseoConfig, null, 2)); + } + execSync("git add .", { cwd: repoDir, stdio: "pipe" }); + execSync("git -c commit.gpgsign=false commit -m 'initial'", { cwd: repoDir, stdio: "pipe" }); + return { tempDir, repoDir }; +} + +describe("createPaseoWorktreeInBackground", () => { + const cleanupPaths: string[] = []; + + afterEach(() => { + for (const target of cleanupPaths.splice(0)) { + rmSync(target, { recursive: true, force: true }); + } + }); + + test("emits running then completed snapshots for no-setup workspaces and then launches services", async () => { + const { tempDir, repoDir } = createGitRepo({ + paseoConfig: { + services: { + web: { + command: "npm run dev", + }, + }, + }, + }); + cleanupPaths.push(tempDir); + + const paseoHome = path.join(tempDir, ".paseo"); + const createdWorktree = await createWorktree({ + branchName: "feature-no-setup", + cwd: repoDir, + baseBranch: "main", + worktreeSlug: 
"feature-no-setup", + runSetup: false, + paseoHome, + }); + const worktreePath = createdWorktree.worktreePath; + const emitted: SessionOutboundMessage[] = []; + const snapshots = new Map(); + const routeStore = new ServiceRouteStore(); + const logger = createLogger(); + const terminalManager = createTerminalManagerStub(); + const emitWorkspaceUpdateForCwd = vi.fn(async () => {}); + const archiveWorkspaceRecord = vi.fn(async () => {}); + + await createPaseoWorktreeInBackground( + { + paseoHome, + emitWorkspaceUpdateForCwd, + cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot), + emit: (message) => emitted.push(message), + sessionLogger: logger, + terminalManager: terminalManager.manager, + archiveWorkspaceRecord, + serviceRouteStore: routeStore, + daemonPort: 6767, + }, + { + requestCwd: repoDir, + repoRoot: repoDir, + workspaceId: 42, + worktree: { + branchName: "feature-no-setup", + worktreePath, + }, + shouldBootstrap: true, + }, + ); + + const progressMessages = emitted.filter( + (message): message is Extract => + message.type === "workspace_setup_progress", + ); + expect(progressMessages).toHaveLength(2); + expect(progressMessages[0]?.payload).toMatchObject({ + workspaceId: "42", + status: "running", + error: null, + detail: { + type: "worktree_setup", + worktreePath, + branchName: "feature-no-setup", + log: "", + commands: [], + }, + }); + expect(progressMessages[1]?.payload).toMatchObject({ + workspaceId: "42", + status: "completed", + error: null, + detail: { + type: "worktree_setup", + worktreePath, + branchName: "feature-no-setup", + log: "", + commands: [], + }, + }); + expect(snapshots.get("42")).toMatchObject({ + status: "completed", + error: null, + detail: { + type: "worktree_setup", + worktreePath, + branchName: "feature-no-setup", + log: "", + commands: [], + }, + }); + + expect(routeStore.listRoutes()).toEqual([ + { + hostname: "feature-no-setup.web.localhost", + port: expect.any(Number), + workspaceId: 
+          worktreePath,
+        serviceName: "web",
+      },
+    ]);
+    expect(terminalManager.terminals).toHaveLength(1);
+    expect(terminalManager.terminals[0]?.cwd).toBe(worktreePath);
+    expect(terminalManager.terminals[0]?.env?.PORT).toEqual(expect.any(String));
+    expect(terminalManager.terminals[0]?.env?.PASEO_SERVICE_URL).toBeDefined();
+    expect(terminalManager.terminals[0]?.sent).toEqual(["npm run dev\r"]);
+    expect(archiveWorkspaceRecord).not.toHaveBeenCalled();
+    expect(emitWorkspaceUpdateForCwd).toHaveBeenCalledWith(worktreePath);
+  });
+
+  test("archives the pending workspace and emits a failed snapshot when setup cannot start", async () => {
+    const { tempDir, repoDir } = createGitRepo();
+    cleanupPaths.push(tempDir);
+
+    writeFileSync(path.join(repoDir, "paseo.json"), "{ invalid json\n");
+    execSync("git add paseo.json", { cwd: repoDir, stdio: "pipe" });
+    execSync("git -c commit.gpgsign=false commit -m 'broken config'", {
+      cwd: repoDir,
+      stdio: "pipe",
+    });
+
+    const paseoHome = path.join(tempDir, ".paseo");
+    const createdWorktree = await createWorktree({
+      branchName: "broken-feature",
+      cwd: repoDir,
+      baseBranch: "main",
+      worktreeSlug: "broken-feature",
+      runSetup: false,
+      paseoHome,
+    });
+    const worktreePath = createdWorktree.worktreePath;
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map();
+    const logger = createLogger();
+    const emitWorkspaceUpdateForCwd = vi.fn(async () => {});
+    const archiveWorkspaceRecord = vi.fn(async () => {});
+    const workspaceId = 101;
+
+    await createPaseoWorktreeInBackground(
+      {
+        paseoHome,
+        emitWorkspaceUpdateForCwd,
+        cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot),
+        emit: (message) => emitted.push(message),
+        sessionLogger: logger,
+        terminalManager: null,
+        archiveWorkspaceRecord,
+        serviceRouteStore: null,
+        daemonPort: null,
+      },
+      {
+        requestCwd: repoDir,
+        repoRoot: repoDir,
+        workspaceId,
+        worktree: {
+          branchName: "broken-feature",
+          worktreePath,
+        },
+        shouldBootstrap: true,
+      },
+    );
+
+    const progressMessages = emitted.filter(
+      (message): message is Extract<SessionOutboundMessage, { type: "workspace_setup_progress" }> =>
+        message.type === "workspace_setup_progress",
+    );
+    expect(progressMessages).toHaveLength(2);
+    expect(progressMessages[0]?.payload.status).toBe("running");
+    expect(progressMessages[0]?.payload.error).toBeNull();
+    expect(progressMessages[1]?.payload.status).toBe("failed");
+    expect(progressMessages[1]?.payload.error).toContain("Failed to parse paseo.json");
+    expect(progressMessages[1]?.payload.detail.commands).toEqual([]);
+    expect(snapshots.get("101")).toMatchObject({
+      status: "failed",
+      error: expect.stringContaining("Failed to parse paseo.json"),
+    });
+    expect(archiveWorkspaceRecord).toHaveBeenCalledWith(workspaceId);
+    expect(emitWorkspaceUpdateForCwd).toHaveBeenCalledWith(worktreePath);
+  });
+
+  test("emits running setup snapshots before completed for real setup commands", async () => {
+    const { tempDir, repoDir } = createGitRepo({
+      paseoConfig: {
+        worktree: {
+          setup: ['sh -c "printf \'phase-one\\\\n\'; sleep 0.1; printf \'phase-two\\\\n\'"'],
+        },
+      },
+    });
+    cleanupPaths.push(tempDir);
+
+    const paseoHome = path.join(tempDir, ".paseo");
+    const createdWorktree = await createWorktree({
+      branchName: "feature-running-setup",
+      cwd: repoDir,
+      baseBranch: "main",
+      worktreeSlug: "feature-running-setup",
+      runSetup: false,
+      paseoHome,
+    });
+    const worktreePath = createdWorktree.worktreePath;
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map();
+    const logger = createLogger();
+    const emitWorkspaceUpdateForCwd = vi.fn(async () => {});
+    const archiveWorkspaceRecord = vi.fn(async () => {});
+
+    await createPaseoWorktreeInBackground(
+      {
+        paseoHome,
+        emitWorkspaceUpdateForCwd,
+        cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot),
+        emit: (message) => emitted.push(message),
+        sessionLogger: logger,
+        terminalManager: null,
+        archiveWorkspaceRecord,
+        serviceRouteStore: null,
+        daemonPort: null,
+      },
+      {
+        requestCwd: repoDir,
+        repoRoot: repoDir,
+        workspaceId: 43,
+        worktree: {
+          branchName: "feature-running-setup",
+          worktreePath,
+        },
+        shouldBootstrap: true,
+      },
+    );
+
+    const progressMessages = emitted.filter(
+      (message): message is Extract<SessionOutboundMessage, { type: "workspace_setup_progress" }> =>
+        message.type === "workspace_setup_progress",
+    );
+    expect(progressMessages.length).toBeGreaterThan(1);
+    expect(progressMessages[0]?.payload).toMatchObject({
+      workspaceId: "43",
+      status: "running",
+      error: null,
+      detail: {
+        type: "worktree_setup",
+        worktreePath,
+        branchName: "feature-running-setup",
+        log: "",
+        commands: [],
+      },
+    });
+    expect(progressMessages.at(-1)?.payload.status).toBe("completed");
+
+    const runningMessages = progressMessages.filter((message) => message.payload.status === "running");
+    expect(runningMessages.length).toBeGreaterThan(0);
+    expect(progressMessages.findIndex((message) => message.payload.status === "running")).toBeLessThan(
+      progressMessages.findIndex((message) => message.payload.status === "completed"),
+    );
+
+    const setupOutputMessage = runningMessages.find((message) =>
+      message.payload.detail.commands[0]?.log.includes("phase-one"),
+    );
+    expect(setupOutputMessage?.payload.detail.log).toContain("phase-one");
+    expect(setupOutputMessage?.payload.detail.commands[0]).toMatchObject({
+      index: 1,
+      command: 'sh -c "printf \'phase-one\\\\n\'; sleep 0.1; printf \'phase-two\\\\n\'"',
+      log: expect.stringContaining("phase-one"),
+      status: "running",
+    });
+
+    expect(progressMessages.at(-1)?.payload).toMatchObject({
+      workspaceId: "43",
+      status: "completed",
+      error: null,
+      detail: {
+        type: "worktree_setup",
+        worktreePath,
+        branchName: "feature-running-setup",
+      },
+    });
+    expect(progressMessages.at(-1)?.payload.detail.log).toContain("phase-two");
+    expect(progressMessages.at(-1)?.payload.detail.commands[0]).toMatchObject({
+      index: 1,
+      command: 'sh -c "printf \'phase-one\\\\n\'; sleep 0.1; printf \'phase-two\\\\n\'"',
+      log: expect.stringContaining("phase-two"),
+      status: "completed",
+      exitCode: 0,
+    });
+    expect(snapshots.get("43")).toMatchObject({
+      status: "completed",
+      error: null,
+    });
+  });
+
+  test("emits completed when reusing an existing worktree without bootstrapping", async () => {
+    const { tempDir, repoDir } = createGitRepo({
+      paseoConfig: {
+        worktree: {
+          setup: ["printf 'ran' > setup-ran.txt"],
+        },
+        services: {
+          web: {
+            command: "npm run dev",
+          },
+        },
+      },
+    });
+    cleanupPaths.push(tempDir);
+
+    const paseoHome = path.join(tempDir, ".paseo");
+    const existingWorktree = await createWorktree({
+      branchName: "reused-worktree",
+      cwd: repoDir,
+      baseBranch: "main",
+      worktreeSlug: "reused-worktree",
+      runSetup: false,
+      paseoHome,
+    });
+
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map();
+    const routeStore = new ServiceRouteStore();
+    const logger = createLogger();
+    const terminalManager = createTerminalManagerStub();
+    const emitWorkspaceUpdateForCwd = vi.fn(async () => {});
+    const archiveWorkspaceRecord = vi.fn(async () => {});
+
+    await createPaseoWorktreeInBackground(
+      {
+        paseoHome,
+        emitWorkspaceUpdateForCwd,
+        cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot),
+        emit: (message) => emitted.push(message),
+        sessionLogger: logger,
+        terminalManager: terminalManager.manager,
+        archiveWorkspaceRecord,
+        serviceRouteStore: routeStore,
+        daemonPort: 6767,
+      },
+      {
+        requestCwd: repoDir,
+        repoRoot: repoDir,
+        workspaceId: 44,
+        worktree: {
+          branchName: "reused-worktree",
+          worktreePath: existingWorktree.worktreePath,
+        },
+        shouldBootstrap: false,
+      },
+    );
+
+    const progressMessages = emitted.filter(
+      (message): message is Extract<SessionOutboundMessage, { type: "workspace_setup_progress" }> =>
+        message.type === "workspace_setup_progress",
+    );
+    expect(progressMessages).toHaveLength(2);
+    expect(progressMessages[0]?.payload).toMatchObject({
+      workspaceId: "44",
+      status: "running",
+      error: null,
+    });
+    expect(progressMessages[1]?.payload).toMatchObject({
+      workspaceId: "44",
+      status: "completed",
+      error: null,
+      detail: {
+        type: "worktree_setup",
+        worktreePath: existingWorktree.worktreePath,
+        branchName: "reused-worktree",
+        log: "",
+        commands: [],
+      },
+    });
+    expect(routeStore.listRoutes()).toEqual([
+      {
+        hostname: "reused-worktree.web.localhost",
+        port: expect.any(Number),
+        workspaceId: existingWorktree.worktreePath,
+        serviceName: "web",
+      },
+    ]);
+    expect(terminalManager.terminals).toHaveLength(1);
+    expect(terminalManager.terminals[0]?.cwd).toBe(existingWorktree.worktreePath);
+    expect(terminalManager.terminals[0]?.name).toBe("web");
+    expect(terminalManager.terminals[0]?.env?.PORT).toEqual(expect.any(String));
+    expect(terminalManager.terminals[0]?.env?.PASEO_SERVICE_URL).toBeDefined();
+    expect(terminalManager.terminals[0]?.sent).toEqual(["npm run dev\r"]);
+    expect(
+      readFileSync(path.join(existingWorktree.worktreePath, "README.md"), "utf8"),
+    ).toContain("hello");
+    expect(() => readFileSync(path.join(existingWorktree.worktreePath, "setup-ran.txt"), "utf8")).toThrow();
+    expect(snapshots.get("44")).toMatchObject({
+      status: "completed",
+      error: null,
+    });
+    expect(archiveWorkspaceRecord).not.toHaveBeenCalled();
+    expect(emitWorkspaceUpdateForCwd).toHaveBeenCalledWith(existingWorktree.worktreePath);
+  });
+
+  test("keeps setup completed when service launch fails afterward", async () => {
+    const { tempDir, repoDir } = createGitRepo({
+      paseoConfig: {
+        services: {
+          web: {
+            command: "npm run dev",
+          },
+        },
+      },
+    });
+    cleanupPaths.push(tempDir);
+
+    const paseoHome = path.join(tempDir, ".paseo");
+    const createdWorktree = await createWorktree({
+      branchName: "feature-service-failure",
+      cwd: repoDir,
+      baseBranch: "main",
+      worktreeSlug: "feature-service-failure",
+      runSetup: false,
+      paseoHome,
+    });
+    const worktreePath = createdWorktree.worktreePath;
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map();
+    const routeStore = new ServiceRouteStore();
+    const logger = createLogger();
+    const terminalManager = createTerminalManagerStub({
+      createTerminal: async () => {
+        throw new Error("terminal spawn failed");
+      },
+    });
+    const emitWorkspaceUpdateForCwd = vi.fn(async () => {});
+    const archiveWorkspaceRecord = vi.fn(async () => {});
+
+    await createPaseoWorktreeInBackground(
+      {
+        paseoHome,
+        emitWorkspaceUpdateForCwd,
+        cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot),
+        emit: (message) => emitted.push(message),
+        sessionLogger: logger,
+        terminalManager: terminalManager.manager,
+        archiveWorkspaceRecord,
+        serviceRouteStore: routeStore,
+        daemonPort: 6767,
+      },
+      {
+        requestCwd: repoDir,
+        repoRoot: repoDir,
+        workspaceId: 45,
+        worktree: {
+          branchName: "feature-service-failure",
+          worktreePath,
+        },
+        shouldBootstrap: true,
+      },
+    );
+
+    const progressMessages = emitted.filter(
+      (message): message is Extract<SessionOutboundMessage, { type: "workspace_setup_progress" }> =>
+        message.type === "workspace_setup_progress",
+    );
+    expect(progressMessages).toHaveLength(2);
+    expect(progressMessages[0]?.payload.status).toBe("running");
+    expect(progressMessages[0]?.payload.error).toBeNull();
+    expect(progressMessages[1]?.payload.status).toBe("completed");
+    expect(progressMessages[1]?.payload.error).toBeNull();
+    expect(emitted.some((message) => message.type === "workspace_setup_progress" && message.payload.status === "failed")).toBe(false);
+    expect(logger.error).toHaveBeenCalledWith(
+      expect.objectContaining({
+        err: expect.any(Error),
+        cwd: repoDir,
+        repoRoot: repoDir,
+        worktreeSlug: "feature-service-failure",
+        worktreePath,
+      }),
+      "Failed to spawn worktree services after workspace setup completed",
+    );
+    expect(snapshots.get("45")).toMatchObject({
+      status: "completed",
+      error: null,
+    });
+    expect(archiveWorkspaceRecord).not.toHaveBeenCalled();
+    expect(emitWorkspaceUpdateForCwd).toHaveBeenCalledWith(worktreePath);
+  });
+
+  test("launches services in socket mode without requiring a daemon TCP port", async () => {
+    const { tempDir, repoDir } = createGitRepo({
+      paseoConfig: {
+        services: {
+          web: {
+            command: "npm run dev",
+          },
+        },
+      },
+    });
+    cleanupPaths.push(tempDir);
+
+    const paseoHome = path.join(tempDir, ".paseo");
+    const createdWorktree = await createWorktree({
+      branchName: "feature-socket-mode",
+      cwd: repoDir,
+      baseBranch: "main",
+      worktreeSlug: "feature-socket-mode",
+      runSetup: false,
+      paseoHome,
+    });
+    const worktreePath = createdWorktree.worktreePath;
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map();
+    const routeStore = new ServiceRouteStore();
+    const logger = createLogger();
+    const terminalManager = createTerminalManagerStub();
+    const emitWorkspaceUpdateForCwd = vi.fn(async () => {});
+    const archiveWorkspaceRecord = vi.fn(async () => {});
+
+    await createPaseoWorktreeInBackground(
+      {
+        paseoHome,
+        emitWorkspaceUpdateForCwd,
+        cacheWorkspaceSetupSnapshot: (workspaceId, snapshot) => snapshots.set(workspaceId, snapshot),
+        emit: (message) => emitted.push(message),
+        sessionLogger: logger,
+        terminalManager: terminalManager.manager,
+        archiveWorkspaceRecord,
+        serviceRouteStore: routeStore,
+        daemonPort: null,
+      },
+      {
+        requestCwd: repoDir,
+        repoRoot: repoDir,
+        workspaceId: 46,
+        worktree: {
+          branchName: "feature-socket-mode",
+          worktreePath,
+        },
+        shouldBootstrap: true,
+      },
+    );
+
+    expect(routeStore.listRoutes()).toEqual([
+      {
+        hostname: "feature-socket-mode.web.localhost",
+        port: expect.any(Number),
+        workspaceId: worktreePath,
+        serviceName: "web",
+      },
+    ]);
+    expect(terminalManager.terminals).toHaveLength(1);
+    expect(terminalManager.terminals[0]?.cwd).toBe(worktreePath);
+    expect(terminalManager.terminals[0]?.env?.PORT).toEqual(expect.any(String));
+    expect(terminalManager.terminals[0]?.env?.PASEO_SERVICE_URL).toBeUndefined();
+    expect(terminalManager.terminals[0]?.sent).toEqual(["npm run dev\r"]);
+    expect(snapshots.get("46")).toMatchObject({
+      status:
+        "completed",
+      error: null,
+    });
+    expect(archiveWorkspaceRecord).not.toHaveBeenCalled();
+    expect(emitWorkspaceUpdateForCwd).toHaveBeenCalledWith(worktreePath);
+  });
+
+  test("returns the cached workspace setup snapshot for status requests", async () => {
+    const emitted: SessionOutboundMessage[] = [];
+    const snapshots = new Map([
+      [
+        "/repo/.paseo/worktrees/feature-a",
+        {
+          status: "completed",
+          detail: {
+            type: "worktree_setup",
+            worktreePath: "/repo/.paseo/worktrees/feature-a",
+            branchName: "feature-a",
+            log: "done",
+            commands: [],
+          },
+          error: null,
+        },
+      ],
+    ]);
+
+    await handleWorkspaceSetupStatusRequest(
+      {
+        emit: (message) => emitted.push(message),
+        workspaceSetupSnapshots: snapshots,
+        workspaceRegistry: { list: async () => [] } as any,
+      },
+      {
+        type: "workspace_setup_status_request",
+        workspaceId: "/repo/.paseo/worktrees/feature-a",
+        requestId: "req-status",
+      },
+    );
+
+    expect(emitted).toContainEqual({
+      type: "workspace_setup_status_response",
+      payload: {
+        requestId: "req-status",
+        workspaceId: "/repo/.paseo/worktrees/feature-a",
+        snapshot: {
+          status: "completed",
+          detail: {
+            type: "worktree_setup",
+            worktreePath: "/repo/.paseo/worktrees/feature-a",
+            branchName: "feature-a",
+            log: "done",
+            commands: [],
+          },
+          error: null,
+        },
+      },
+    });
+  });
+
+  test("returns null when no cached workspace setup snapshot exists", async () => {
+    const emitted: SessionOutboundMessage[] = [];
+
+    await handleWorkspaceSetupStatusRequest(
+      {
+        emit: (message) => emitted.push(message),
+        workspaceSetupSnapshots: new Map(),
+        workspaceRegistry: { list: async () => [] } as any,
+      },
+      {
+        type: "workspace_setup_status_request",
+        workspaceId: "/repo/.paseo/worktrees/missing",
+        requestId: "req-missing",
+      },
+    );
+
+    expect(emitted).toContainEqual({
+      type: "workspace_setup_status_response",
+      payload: {
+        requestId: "req-missing",
+        workspaceId: "/repo/.paseo/worktrees/missing",
+        snapshot: null,
+      },
+    });
+  });
+
+});
+
+describe("handleCreatePaseoWorktreeRequest", () => {
+  test("invokes worktree creation once for a create request", async () => {
+    const { tempDir, repoDir } = createGitRepo();
+    const paseoHome = path.join(tempDir, ".paseo");
+    const emitted: SessionOutboundMessage[] = [];
+    const createAgentWorktreeSpy = vi.spyOn(worktreeBootstrap, "createAgentWorktree");
+
+    try {
+      await handleCreatePaseoWorktreeRequest(
+        {
+          paseoHome,
+          sessionLogger: createLogger(),
+          emit: (message) => emitted.push(message),
+          registerPendingWorktreeWorkspace: vi.fn(async (options) => ({
+            workspaceId: options.worktreePath,
+            projectId: options.repoRoot,
+          })),
+          describeWorkspaceRecord: vi.fn(async (workspace) => ({
+            id: workspace.workspaceId,
+            projectId: workspace.projectId,
+            projectDisplayName: path.basename(repoDir),
+            projectRootPath: repoDir,
+            projectKind: "git",
+            workspaceKind: "worktree",
+            name: path.basename(workspace.workspaceId),
+            status: "done",
+            activityAt: null,
+          })),
+          createPaseoWorktreeInBackground: vi.fn(async () => {}),
+        },
+        {
+          type: "create_paseo_worktree_request",
+          cwd: repoDir,
+          worktreeSlug: "single-call",
+          requestId: "req-single-call",
+        },
+      );
+
+      expect(createAgentWorktreeSpy).toHaveBeenCalledTimes(1);
+      const response = emitted.find(
+        (message): message is Extract<SessionOutboundMessage, { type: "create_paseo_worktree_response" }> =>
+          message.type === "create_paseo_worktree_response",
+      );
+      expect(response?.payload.error).toBeNull();
+    } finally {
+      createAgentWorktreeSpy.mockRestore();
+      rmSync(tempDir, { recursive: true, force: true });
+    }
+  });
+
+  test("creates the worktree before emitting the response", async () => {
+    const { tempDir, repoDir } = createGitRepo();
+    const paseoHome = path.join(tempDir, ".paseo");
+    const emitted: SessionOutboundMessage[] = [];
+    const backgroundWork = vi.fn(async () => {});
+
+    try {
+      await handleCreatePaseoWorktreeRequest(
+        {
+          paseoHome,
+          sessionLogger: createLogger(),
+          emit: (message) => emitted.push(message),
+          registerPendingWorktreeWorkspace: vi.fn(async (options) => {
+            expect(existsSync(options.worktreePath)).toBe(true);
+            return {
+              workspaceId: options.worktreePath,
+              projectId: options.repoRoot,
+            } as any;
+          }),
+          describeWorkspaceRecord: vi.fn(async (workspace) => ({
+            id: workspace.workspaceId,
+            projectId: workspace.projectId,
+            projectDisplayName: path.basename(repoDir),
+            projectRootPath: repoDir,
+            projectKind: "git",
+            workspaceKind: "worktree",
+            name: path.basename(workspace.workspaceId),
+            status: "done",
+            activityAt: null,
+          })),
+          createPaseoWorktreeInBackground: backgroundWork,
+        },
+        {
+          type: "create_paseo_worktree_request",
+          cwd: repoDir,
+          worktreeSlug: "response-after-create",
+          requestId: "req-1",
+        },
+      );
+
+      const response = emitted.find(
+        (message): message is Extract<SessionOutboundMessage, { type: "create_paseo_worktree_response" }> =>
+          message.type === "create_paseo_worktree_response",
+      );
+      expect(response?.payload.error).toBeNull();
+      expect(response?.payload.workspace?.id).toBeTruthy();
+      expect(existsSync(response!.payload.workspace!.id)).toBe(true);
+      expect(backgroundWork).toHaveBeenCalledWith(
+        expect.objectContaining({
+          requestCwd: repoDir,
+          repoRoot: repoDir,
+          worktree: {
+            branchName: "response-after-create",
+            worktreePath: response!.payload.workspace!.id,
+          },
+          shouldBootstrap: true,
+        }),
+      );
+    } finally {
+      rmSync(tempDir, { recursive: true, force: true });
+    }
+  });
+});
diff --git a/packages/server/src/server/worktree-session.ts b/packages/server/src/server/worktree-session.ts
index 17d07a54e..10c27f4b7 100644
--- a/packages/server/src/server/worktree-session.ts
+++ b/packages/server/src/server/worktree-session.ts
@@ -11,17 +11,25 @@ import {
   type ProjectPlacementPayload,
   type SessionInboundMessage,
   type SessionOutboundMessage,
+  type WorkspaceSetupSnapshot,
   type WorkspaceDescriptorPayload,
 } from "./messages.js";
 import type {
-  PersistedProjectRecord,
   PersistedWorkspaceRecord,
   ProjectRegistry,
   WorkspaceRegistry,
 } from "./workspace-registry.js";
 import { normalizeWorkspaceId as normalizePersistedWorkspaceId } from
   "./workspace-registry-model.js";
-import { createAgentWorktree } from "./worktree-bootstrap.js";
+import {
+  applyWorktreeSetupProgressEvent,
+  buildWorktreeSetupDetail,
+  createAgentWorktree,
+  createWorktreeSetupProgressAccumulator,
+  getWorktreeSetupProgressResults,
+  spawnWorktreeServices,
+} from "./worktree-bootstrap.js";
 import type { TerminalManager } from "../terminal/terminal-manager.js";
+import type { ServiceRouteStore } from "./service-proxy.js";
 import {
   getCheckoutStatusLite,
   resolveRepositoryDefaultBranch,
@@ -35,9 +43,12 @@ import {
   listPaseoWorktrees,
   resolvePaseoWorktreeRootForCwd,
   resolveWorktreeRuntimeEnv,
+  runWorktreeSetupCommands,
   slugify,
   validateBranchSlug,
   type WorktreeConfig,
+  type WorktreeSetupCommandResult,
+  WorktreeSetupError,
 } from "../utils/worktree.js";
 import { READ_ONLY_GIT_ENV, toCheckoutError } from "./checkout-git-utils.js";
@@ -77,26 +88,15 @@ type ArchivePaseoWorktreeDependencies = {
 };
 
 type RegisterPendingWorktreeWorkspaceDependencies = {
-  buildPersistedProjectRecord: (input: {
-    workspaceId: string;
-    placement: ProjectPlacementPayload;
-    createdAt: string;
-    updatedAt: string;
-  }) => PersistedProjectRecord;
-  buildPersistedWorkspaceRecord: (input: {
-    workspaceId: string;
-    placement: ProjectPlacementPayload;
-    createdAt: string;
-    updatedAt: string;
-  }) => PersistedWorkspaceRecord;
   buildProjectPlacement: (cwd: string) => Promise<ProjectPlacementPayload>;
-  projectRegistry: Pick<ProjectRegistry, "get" | "upsert">;
+  findWorkspaceByDirectory: (directory: string) => Promise<PersistedWorkspaceRecord | null>;
+  projectRegistry: Pick;
   syncWorkspaceGitWatchTarget: (
     cwd: string,
     options: { isGit: boolean },
   ) => Promise<void>;
-  workspaceRegistry: Pick<WorkspaceRegistry, "get" | "upsert">;
-  archiveProjectRecordIfEmpty: (projectId: string, archivedAt: string) => Promise<void>;
+  workspaceRegistry: Pick<WorkspaceRegistry, "get" | "insert" | "upsert">;
+  archiveProjectRecordIfEmpty: (projectId: number, archivedAt: string) => Promise<void>;
 };
 
 type CreatePaseoWorktreeInBackgroundDependencies = {
@@ -105,8 +105,19 @@
     cwd: string,
     options?: { dedupeGitState?: boolean
   },
 ) => Promise<void>;
+  cacheWorkspaceSetupSnapshot: (workspaceId: string, snapshot: WorkspaceSetupSnapshot) => void;
+  emit: EmitSessionMessage;
   sessionLogger: Logger;
   terminalManager: TerminalManager | null;
+  archiveWorkspaceRecord: (workspaceId: number) => Promise<void>;
+  serviceRouteStore: ServiceRouteStore | null;
+  daemonPort?: number | null;
+};
+
+type HandleWorkspaceSetupStatusRequestDependencies = {
+  emit: EmitSessionMessage;
+  workspaceSetupSnapshots: ReadonlyMap<string, WorkspaceSetupSnapshot>;
+  workspaceRegistry: WorkspaceRegistry;
 };
 
 type HandleCreatePaseoWorktreeRequestDependencies = {
@@ -124,9 +135,9 @@
   createPaseoWorktreeInBackground: (options: {
     requestCwd: string;
     repoRoot: string;
-    baseBranch: string;
-    slug: string;
-    worktreePath: string;
+    workspaceId: number;
+    worktree: WorktreeConfig;
+    shouldBootstrap: boolean;
   }) => Promise<void>;
 };
 
@@ -143,10 +154,13 @@ export async function buildAgentSessionConfig(
   gitOptions?: GitSetupOptions,
   legacyWorktreeName?: string,
   _labels?: Record<string, string>,
-): Promise<{ sessionConfig: AgentSessionConfig; worktreeConfig?: WorktreeConfig }> {
+): Promise<{
+  sessionConfig: AgentSessionConfig;
+  worktreeBootstrap?: { worktree: WorktreeConfig; shouldBootstrap: boolean };
+}> {
   let cwd = expandTilde(config.cwd);
   const normalized = normalizeGitOptions(gitOptions, legacyWorktreeName);
-  let worktreeConfig: WorktreeConfig | undefined;
+  let worktreeBootstrap: { worktree: WorktreeConfig; shouldBootstrap: boolean } | undefined;
 
   if (!normalized) {
     return {
@@ -188,8 +202,8 @@
       worktreeSlug: normalized.worktreeSlug ?? targetBranch,
       paseoHome: dependencies.paseoHome,
     });
-    cwd = createdWorktree.worktreePath;
-    worktreeConfig = createdWorktree;
+    cwd = createdWorktree.worktree.worktreePath;
+    worktreeBootstrap = createdWorktree;
   } else if (normalized.createNewBranch) {
     const baseBranch = normalized.baseBranch ??
       (await resolveGitCreateBaseBranch(cwd, dependencies.paseoHome));
@@ -207,7 +221,7 @@
       ...config,
       cwd,
     },
-    worktreeConfig,
+    worktreeBootstrap,
   };
 }
 
@@ -509,51 +523,49 @@ export async function registerPendingWorktreeWorkspace(
     branchName: string;
   },
 ): Promise<PersistedWorkspaceRecord> {
-  const workspaceId = normalizePersistedWorkspaceId(options.worktreePath);
+  const workspaceDirectory = normalizePersistedWorkspaceId(options.worktreePath);
   const basePlacement = await dependencies.buildProjectPlacement(options.repoRoot);
-  const placement: ProjectPlacementPayload = {
-    ...basePlacement,
-    checkout: {
-      cwd: workspaceId,
-      isGit: true,
-      currentBranch: options.branchName,
-      remoteUrl: basePlacement.checkout.remoteUrl,
-      worktreeRoot: options.worktreePath,
-      isPaseoOwnedWorktree: true,
-      mainRepoRoot: options.repoRoot,
-    },
-  };
+  const projectId = Number(basePlacement.projectKey);
+  if (!Number.isInteger(projectId)) {
+    throw new Error(`Invalid project id for repo root ${options.repoRoot}`);
+  }
 
   const now = new Date().toISOString();
-  const existingWorkspace = await dependencies.workspaceRegistry.get(workspaceId);
-  const existingProject = await dependencies.projectRegistry.get(placement.projectKey);
-  const nextProjectRecord = dependencies.buildPersistedProjectRecord({
-    workspaceId,
-    placement,
-    createdAt: existingProject?.createdAt ?? now,
-    updatedAt: now,
-  });
-  const nextWorkspaceRecord = dependencies.buildPersistedWorkspaceRecord({
-    workspaceId,
-    placement,
-    createdAt: existingWorkspace?.createdAt ??
-      now,
-    updatedAt: now,
-  });
+  const existingWorkspace = await dependencies.findWorkspaceByDirectory(workspaceDirectory);
+  if (!existingWorkspace) {
+    const workspaceId = await dependencies.workspaceRegistry.insert({
+      projectId,
+      directory: workspaceDirectory,
+      displayName: options.branchName,
+      kind: "worktree",
+      createdAt: now,
+      updatedAt: now,
+      archivedAt: null,
+    });
+    const workspace = await dependencies.workspaceRegistry.get(workspaceId);
+    if (!workspace) {
+      throw new Error(`Workspace not found after insert: ${workspaceId}`);
+    }
+    await dependencies.syncWorkspaceGitWatchTarget(workspaceDirectory, { isGit: true });
+    return workspace;
+  }
 
-  await dependencies.projectRegistry.upsert(nextProjectRecord);
-  await dependencies.workspaceRegistry.upsert(nextWorkspaceRecord);
-  await dependencies.syncWorkspaceGitWatchTarget(workspaceId, {
-    isGit: placement.checkout.isGit,
+  await dependencies.workspaceRegistry.upsert({
+    id: existingWorkspace.id,
+    projectId,
+    directory: workspaceDirectory,
+    displayName: options.branchName,
+    kind: "worktree",
+    createdAt: existingWorkspace.createdAt,
+    updatedAt: now,
+    archivedAt: null,
   });
+  await dependencies.syncWorkspaceGitWatchTarget(workspaceDirectory, { isGit: true });
 
-  if (
-    existingWorkspace &&
-    !existingWorkspace.archivedAt &&
-    existingWorkspace.projectId !== nextWorkspaceRecord.projectId
-  ) {
+  if (!existingWorkspace.archivedAt && existingWorkspace.projectId !== projectId) {
     await dependencies.archiveProjectRecordIfEmpty(existingWorkspace.projectId, now);
   }
 
-  return nextWorkspaceRecord;
+  return (await dependencies.workspaceRegistry.get(existingWorkspace.id))!;
 }
 
 export async function handleCreatePaseoWorktreeRequest(
@@ -580,11 +592,18 @@
       throw new Error(`Invalid worktree name: ${validation.error}`);
     }
 
-    const worktreePath = await computeWorktreePath(repoRoot, normalizedSlug, dependencies.paseoHome);
+    await computeWorktreePath(repoRoot,
+      normalizedSlug, dependencies.paseoHome);
+    const createdWorktree = await createAgentWorktree({
+      cwd: repoRoot,
+      branchName: normalizedSlug,
+      baseBranch,
+      worktreeSlug: normalizedSlug,
+      paseoHome: dependencies.paseoHome,
+    });
     const workspace = await dependencies.registerPendingWorktreeWorkspace({
       repoRoot,
-      worktreePath,
-      branchName: normalizedSlug,
+      worktreePath: createdWorktree.worktree.worktreePath,
+      branchName: createdWorktree.worktree.branchName,
     });
     const descriptor = await dependencies.describeWorkspaceRecord(workspace);
     dependencies.emit({
@@ -600,9 +619,9 @@
     void dependencies.createPaseoWorktreeInBackground({
       requestCwd: request.cwd,
       repoRoot,
-      baseBranch,
-      slug: normalizedSlug,
-      worktreePath,
+      workspaceId: workspace.id,
+      worktree: createdWorktree.worktree,
+      shouldBootstrap: createdWorktree.shouldBootstrap,
     });
   } catch (error) {
     const message = error instanceof Error ? error.message : "Failed to create worktree";
@@ -622,66 +641,161 @@
   }
 }
 
+export async function handleWorkspaceSetupStatusRequest(
+  dependencies: HandleWorkspaceSetupStatusRequestDependencies,
+  request: Extract<SessionInboundMessage, { type: "workspace_setup_status_request" }>,
+): Promise<void> {
+  const workspaceId = request.workspaceId;
+  let snapshot = dependencies.workspaceSetupSnapshots.get(workspaceId) ?? null;
+
+  // Fallback: if workspaceId is a directory path, resolve to numeric ID and retry lookup
+  if (!snapshot && Number.isNaN(Number(workspaceId))) {
+    const workspaces = await dependencies.workspaceRegistry.list();
+    const match = workspaces.find((w) => w.directory === workspaceId && !w.archivedAt);
+    if (match) {
+      snapshot = dependencies.workspaceSetupSnapshots.get(String(match.id)) ??
+        null;
+    }
+  }
+
+  dependencies.emit({
+    type: "workspace_setup_status_response",
+    payload: {
+      requestId: request.requestId,
+      workspaceId,
+      snapshot,
+    },
+  });
+}
+
 export async function createPaseoWorktreeInBackground(
   dependencies: CreatePaseoWorktreeInBackgroundDependencies,
   options: {
     requestCwd: string;
     repoRoot: string;
-    baseBranch: string;
-    slug: string;
-    worktreePath: string;
+    workspaceId: number;
+    worktree: WorktreeConfig;
+    shouldBootstrap: boolean;
   },
 ): Promise<void> {
-  let setupTerminalId: string | null = null;
-
-  try {
-    await createAgentWorktree({
-      cwd: options.repoRoot,
-      branchName: options.slug,
-      baseBranch: options.baseBranch,
-      worktreeSlug: options.slug,
-      paseoHome: dependencies.paseoHome,
+  let worktree: WorktreeConfig = options.worktree;
+  let setupResults: WorktreeSetupCommandResult[] = [];
+  let setupStarted = false;
+  const progressAccumulator = createWorktreeSetupProgressAccumulator();
+  const workspaceId = String(options.workspaceId);
+
+  const emitSetupProgress = (status: "running" | "completed" | "failed", error: string | null) => {
+    const snapshot: WorkspaceSetupSnapshot = {
+      status,
+      detail: buildWorktreeSetupDetail({
+        worktree,
+        results:
+          status === "running" ?
+            getWorktreeSetupProgressResults(progressAccumulator) : setupResults,
+        outputAccumulatorsByIndex: progressAccumulator.outputAccumulatorsByIndex,
+      }),
+      error,
+    };
+    dependencies.cacheWorkspaceSetupSnapshot(workspaceId, snapshot);
+    dependencies.emit({
+      type: "workspace_setup_progress",
+      payload: {
+        workspaceId,
+        ...snapshot,
+      },
     });
+  };
 
-    const setupCommands = getWorktreeSetupCommands(options.worktreePath);
-    if (setupCommands.length > 0 && dependencies.terminalManager) {
-      const runtimeEnv = await resolveWorktreeRuntimeEnv({
-        worktreePath: options.worktreePath,
-        branchName: options.slug,
-        repoRootPath: options.repoRoot,
-      });
-      dependencies.terminalManager.registerCwdEnv({
-        cwd: options.worktreePath,
-        env: runtimeEnv,
-      });
-      const terminal = await dependencies.terminalManager.createTerminal({
-        cwd: options.worktreePath,
-        name: `setup-${options.slug}`,
-        env: runtimeEnv,
-      });
-      setupTerminalId = terminal.id;
+  try {
+    try {
+      emitSetupProgress("running", null);
+
+      if (!options.shouldBootstrap) {
+        emitSetupProgress("completed", null);
+      } else {
+        const setupCommands = getWorktreeSetupCommands(worktree.worktreePath);
+        if (setupCommands.length === 0) {
+          setupStarted = true;
+          emitSetupProgress("completed", null);
+        } else {
+          const runtimeEnv = await resolveWorktreeRuntimeEnv({
+            worktreePath: worktree.worktreePath,
+            branchName: worktree.branchName,
+            repoRootPath: options.repoRoot,
+          });
+          dependencies.terminalManager?.registerCwdEnv({
+            cwd: worktree.worktreePath,
+            env: runtimeEnv,
+          });
+          setupStarted = true;
+          setupResults = await runWorktreeSetupCommands({
+            worktreePath: worktree.worktreePath,
+            branchName: worktree.branchName,
+            cleanupOnFailure: false,
+            repoRootPath: options.repoRoot,
+            runtimeEnv,
+            onEvent: (event) => {
+              applyWorktreeSetupProgressEvent(progressAccumulator, event);
+              emitSetupProgress("running", null);
+            },
+          });
+          emitSetupProgress("completed", null);
+        }
+      }
+    } catch (error) {
+      if (error instanceof
WorktreeSetupError) { + setupResults = error.results; + } + const message = error instanceof Error ? error.message : String(error); + emitSetupProgress("failed", message); - for (const command of setupCommands) { - terminal.send({ - type: "input", - data: `${command}\r`, - }); + if (!setupStarted) { + await dependencies.archiveWorkspaceRecord(options.workspaceId); } + + dependencies.sessionLogger.error( + { + err: error, + cwd: options.requestCwd, + repoRoot: options.repoRoot, + worktreeSlug: worktree.branchName, + worktreePath: worktree.worktreePath, + setupStarted, + }, + "Background worktree creation failed", + ); + return; + } + + if (!dependencies.terminalManager || !dependencies.serviceRouteStore) { + return; + } + + try { + await spawnWorktreeServices({ + repoRoot: worktree.worktreePath, + workspaceId: worktree.worktreePath, + branchName: worktree.branchName, + daemonPort: dependencies.daemonPort, + routeStore: dependencies.serviceRouteStore, + terminalManager: dependencies.terminalManager, + logger: dependencies.sessionLogger, + onLifecycleChanged: () => { + void dependencies.emitWorkspaceUpdateForCwd(worktree.worktreePath); + }, + }); + } catch (error) { + dependencies.sessionLogger.error( + { + err: error, + cwd: options.requestCwd, + repoRoot: options.repoRoot, + worktreeSlug: worktree.branchName, + worktreePath: worktree.worktreePath, + }, + "Failed to spawn worktree services after workspace setup completed", + ); } - } catch (error) { - dependencies.sessionLogger.error( - { - err: error, - cwd: options.requestCwd, - repoRoot: options.repoRoot, - worktreeSlug: options.slug, - worktreePath: options.worktreePath, - setupTerminalId, - }, - "Background worktree creation failed", - ); } finally { - await dependencies.emitWorkspaceUpdateForCwd(options.worktreePath); + await dependencies.emitWorkspaceUpdateForCwd(worktree.worktreePath); } } diff --git a/packages/server/src/shared/messages.stream-parsing.test.ts 
b/packages/server/src/shared/messages.stream-parsing.test.ts index c5698c901..b645d5d11 100644 --- a/packages/server/src/shared/messages.stream-parsing.test.ts +++ b/packages/server/src/shared/messages.stream-parsing.test.ts @@ -2,6 +2,7 @@ import { describe, expect, it } from "vitest"; import { AgentStreamMessageSchema, + FetchAgentTimelineRequestMessageSchema, FetchAgentTimelineResponseMessageSchema, SessionInboundMessageSchema, SessionOutboundMessageSchema, @@ -17,16 +18,55 @@ describe("shared messages stream parsing", () => { agentId: "agent_live", agent: null, direction: "tail", - projection: "projected", - epoch: "epoch-1", - reset: false, - staleCursor: false, - gap: false, - window: { minSeq: 1, maxSeq: 2, nextSeq: 3 }, - startCursor: { epoch: "epoch-1", seq: 1 }, - endCursor: { epoch: "epoch-1", seq: 2 }, + startSeq: 1, + endSeq: 2, hasOlder: false, hasNewer: false, + entries: [ + { + provider: "codex", + item: { type: "assistant_message", text: "hello" }, + timestamp: "2026-02-08T20:10:00.000Z", + seq: 2, + }, + ], + error: null, + }, + }); + + expect(parsed.payload.entries).toHaveLength(1); + expect(parsed.payload.entries[0]?.item.type).toBe("assistant_message"); + }); + + it("parses legacy fetch timeline request baggage at the parser boundary", () => { + const parsed = FetchAgentTimelineRequestMessageSchema.parse({ + type: "fetch_agent_timeline_request", + agentId: "agent_live", + requestId: "req-legacy", + direction: "after", + cursor: { + seq: 12, + epoch: "legacy-epoch", + }, + projection: "canonical", + }); + + expect(parsed.cursor?.epoch).toBe("legacy-epoch"); + expect(parsed.projection).toBe("canonical"); + }); + + it("parses legacy fetch timeline response baggage at the parser boundary", () => { + const parsed = FetchAgentTimelineResponseMessageSchema.parse({ + type: "fetch_agent_timeline_response", + payload: { + requestId: "req-1", + agentId: "agent_live", + agent: null, + direction: "tail", + reset: false, + startCursor: { seq: 1, epoch: 
"legacy-epoch" }, + endCursor: { seq: 2, epoch: "legacy-epoch" }, + projection: "canonical", entries: [ { provider: "codex", @@ -42,8 +82,10 @@ describe("shared messages stream parsing", () => { }, }); - expect(parsed.payload.entries).toHaveLength(1); - expect(parsed.payload.entries[0]?.item.type).toBe("assistant_message"); + expect(parsed.payload.startSeq).toBe(1); + expect(parsed.payload.endSeq).toBe(2); + expect(parsed.payload.projection).toBe("canonical"); + expect(parsed.payload.entries[0]?.seq).toBe(2); }); it("parses explicit shutdown and restart lifecycle request payloads as distinct message types", () => { @@ -70,6 +112,7 @@ describe("shared messages stream parsing", () => { payload: { agentId: "agent_live", timestamp: "2026-02-08T20:10:00.000Z", + seq: 12, event: { type: "timeline", provider: "claude", @@ -97,6 +140,27 @@ describe("shared messages stream parsing", () => { } }); + it("parses legacy agent_stream baggage at the parser boundary", () => { + const parsed = AgentStreamMessageSchema.parse({ + type: "agent_stream", + payload: { + agentId: "agent_live", + timestamp: "2026-02-08T20:10:00.000Z", + epoch: "legacy-epoch", + event: { + type: "timeline", + provider: "claude", + item: { + type: "assistant_message", + text: "hello", + }, + }, + }, + }); + + expect(parsed.payload.epoch).toBe("legacy-epoch"); + }); + it("parses representative sub_agent tool_call event", () => { const parsed = AgentStreamMessageSchema.parse({ type: "agent_stream", diff --git a/packages/server/src/shared/messages.ts b/packages/server/src/shared/messages.ts index 4805343b4..e43a36c29 100644 --- a/packages/server/src/shared/messages.ts +++ b/packages/server/src/shared/messages.ts @@ -252,7 +252,27 @@ const NonNullUnknownSchema = z.union([ z.object({}).passthrough(), ]); +const WorktreeSetupCommandSnapshotSchema = z.object({ + index: z.number().int().positive(), + command: z.string(), + cwd: z.string(), + log: z.string(), + status: z.enum(["running", "completed", "failed"]), + 
exitCode: z.number().nullable(), + durationMs: z.number().nonnegative().optional(), +}); + +const WorktreeSetupDetailPayloadSchema = z.object({ + type: z.literal("worktree_setup"), + worktreePath: z.string(), + branchName: z.string(), + log: z.string(), + commands: z.array(WorktreeSetupCommandSnapshotSchema), + truncated: z.boolean().optional(), +}); + const ToolCallDetailPayloadSchema: z.ZodType = z.discriminatedUnion("type", [ + WorktreeSetupDetailPayloadSchema, z.object({ type: z.literal("shell"), command: z.string(), @@ -311,23 +331,6 @@ const ToolCallDetailPayloadSchema: z.ZodType = z.discriminatedUn bytes: z.number().optional(), durationMs: z.number().optional(), }), - z.object({ - type: z.literal("worktree_setup"), - worktreePath: z.string(), - branchName: z.string(), - log: z.string(), - commands: z.array( - z.object({ - index: z.number().int().positive(), - command: z.string(), - cwd: z.string(), - status: z.enum(["running", "completed", "failed"]), - exitCode: z.number().nullable(), - durationMs: z.number().nonnegative().optional(), - }), - ), - truncated: z.boolean().optional(), - }), z.object({ type: z.literal("sub_agent"), subAgentType: z.string().optional(), @@ -667,7 +670,7 @@ export const FetchWorkspacesRequestMessageSchema = z.object({ filter: z .object({ query: z.string().optional(), - projectId: z.string().optional(), + projectId: z.union([z.string(), z.number()]).transform(String).optional(), idPrefix: z.string().optional(), }) .optional(), @@ -766,6 +769,7 @@ export type GitSetupOptions = z.infer; export const CreateAgentRequestMessageSchema = z.object({ type: z.literal("create_agent_request"), config: AgentSessionConfigSchema, + workspaceId: z.union([z.string(), z.number()]).transform(String).optional(), worktreeName: z.string().optional(), initialPrompt: z.string().optional(), clientMessageId: z.string().optional(), @@ -849,22 +853,27 @@ export const ShutdownServerRequestMessageSchema = z.object({ requestId: z.string(), }); -export const 
AgentTimelineCursorSchema = z.object({ - epoch: z.string(), - seq: z.number().int().nonnegative(), -}); +export const AgentTimelineCursorSchema = z + .object({ + seq: z.number().int().nonnegative(), + // COMPAT(timeline): retain legacy cursor epoch for older clients. + epoch: z.string().optional(), + }) + ; -export const FetchAgentTimelineRequestMessageSchema = z.object({ - type: z.literal("fetch_agent_timeline_request"), - agentId: z.string(), - requestId: z.string(), - direction: z.enum(["tail", "before", "after"]).optional(), - cursor: AgentTimelineCursorSchema.optional(), - // 0 means "all matching rows for this query window". - limit: z.number().int().nonnegative().optional(), - // Default should be projected for app timeline loading. - projection: z.enum(["projected", "canonical"]).optional(), -}); +export const FetchAgentTimelineRequestMessageSchema = z + .object({ + type: z.literal("fetch_agent_timeline_request"), + agentId: z.string(), + requestId: z.string(), + direction: z.enum(["tail", "before", "after"]).optional(), + cursor: AgentTimelineCursorSchema.optional(), + // 0 means "all matching rows for this query window". + limit: z.number().int().nonnegative().optional(), + // COMPAT(timeline): retain removed projection so older clients can still send it. 
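The cursor change above makes `seq` the only required coordinate while keeping the legacy `epoch` field parseable, so older clients can continue sending it. The tolerance rule can be sketched in plain TypeScript (illustrative names and hand-rolled validation, not the actual zod schema from this diff):

```typescript
// Sketch of the compat rule: `seq` is required, the legacy `epoch` is
// accepted but carries no meaning for newer servers.
interface TimelineCursor {
  seq: number;
  epoch?: string; // COMPAT(timeline): tolerated for older clients
}

function parseTimelineCursor(input: unknown): TimelineCursor | null {
  if (typeof input !== "object" || input === null) {
    return null;
  }
  const record = input as Record<string, unknown>;
  const { seq, epoch } = record;
  // Mirror z.number().int().nonnegative(): reject anything else.
  if (typeof seq !== "number" || !Number.isInteger(seq) || seq < 0) {
    return null;
  }
  return typeof epoch === "string" ? { seq, epoch } : { seq };
}
```

The point of keeping `epoch` optional rather than stripping it is that a validating parser would otherwise reject old-client payloads outright, which matters here because app and daemon versions in the wild lag each other.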
+ projection: z.enum(["canonical", "projected"]).optional(), + }) + ; export const SetAgentModeRequestMessageSchema = z.object({ type: z.literal("set_agent_mode_request"), @@ -1096,6 +1105,12 @@ export const CreatePaseoWorktreeRequestSchema = z.object({ requestId: z.string(), }); +export const WorkspaceSetupStatusRequestSchema = z.object({ + type: z.literal("workspace_setup_status_request"), + workspaceId: z.string(), + requestId: z.string(), +}); + export const EditorTargetIdSchema = z.enum([ "cursor", "vscode", @@ -1130,7 +1145,7 @@ export const OpenProjectRequestSchema = z.object({ export const ArchiveWorkspaceRequestSchema = z.object({ type: z.literal("archive_workspace_request"), - workspaceId: z.string(), + workspaceId: z.union([z.string(), z.number()]).transform(String), requestId: z.string(), }); @@ -1280,6 +1295,16 @@ export const CreateTerminalRequestSchema = z.object({ type: z.literal("create_terminal_request"), cwd: z.string(), name: z.string().optional(), + agentId: z.string().optional(), + command: z.string().optional(), + args: z.array(z.string()).optional(), + requestId: z.string(), +}); + +export const StartWorkspaceServiceRequestSchema = z.object({ + type: z.literal("start_workspace_service_request"), + workspaceId: z.string(), + serviceName: z.string(), requestId: z.string(), }); @@ -1379,6 +1404,7 @@ export const SessionInboundMessageSchema = z.discriminatedUnion("type", [ PaseoWorktreeListRequestSchema, PaseoWorktreeArchiveRequestSchema, CreatePaseoWorktreeRequestSchema, + WorkspaceSetupStatusRequestSchema, ListAvailableEditorsRequestSchema, OpenInEditorRequestSchema, OpenProjectRequestSchema, @@ -1395,6 +1421,7 @@ export const SessionInboundMessageSchema = z.discriminatedUnion("type", [ SubscribeTerminalsRequestSchema, UnsubscribeTerminalsRequestSchema, CreateTerminalRequestSchema, + StartWorkspaceServiceRequestSchema, SubscribeTerminalRequestSchema, UnsubscribeTerminalRequestSchema, TerminalInputSchema, @@ -1750,13 +1777,27 @@ export const 
ProjectPlacementPayloadSchema = z.object({ checkout: ProjectCheckoutLitePayloadSchema, }); +export const WorkspaceServiceLifecycleSchema = z.enum(["running", "stopped"]); +export const WorkspaceServiceHealthSchema = z.enum(["healthy", "unhealthy"]); + +export const WorkspaceServicePayloadSchema = z.object({ + serviceName: z.string(), + hostname: z.string(), + port: z.number().int().positive().nullable(), + url: z.string().nullable(), + lifecycle: WorkspaceServiceLifecycleSchema, + health: WorkspaceServiceHealthSchema.nullable(), +}); + export const WorkspaceDescriptorPayloadSchema = z.object({ - id: z.string(), - projectId: z.string(), + id: z.union([z.string(), z.number()]).transform(String), + projectId: z.union([z.string(), z.number()]).transform(String), projectDisplayName: z.string(), projectRootPath: z.string(), - projectKind: z.enum(["git", "non_git"]), - workspaceKind: z.enum(["local_checkout", "worktree", "directory"]), + workspaceDirectory: z.string(), + projectKind: z.enum(["git", "non_git", "directory"]), + // COMPAT(workspaces): keep legacy directory workspace kind parseable. + workspaceKind: z.enum(["directory", "local_checkout", "checkout", "worktree"]), name: z.string(), status: WorkspaceStateBucketSchema, activityAt: z.string().nullable(), @@ -1767,6 +1808,7 @@ export const WorkspaceDescriptorPayloadSchema = z.object({ }) .nullable() .optional(), + services: z.array(WorkspaceServicePayloadSchema).default([]), }); export const AgentUpdateMessageSchema = z.object({ @@ -1784,17 +1826,20 @@ export const AgentUpdateMessageSchema = z.object({ ]), }); -export const AgentStreamMessageSchema = z.object({ - type: z.literal("agent_stream"), - payload: z.object({ - agentId: z.string(), - event: AgentStreamEventPayloadSchema, - timestamp: z.string(), - // Present for timeline events. Maps 1:1 to canonical in-memory timeline rows. 
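Several identifier fields in these schemas now accept either a string or a number and normalize to a string via `z.union([z.string(), z.number()]).transform(String)`. A minimal dependency-free sketch of the same normalization, with hypothetical helper names:

```typescript
// Hypothetical helper mirroring the union-and-transform pattern: legacy
// string ids and newer numeric row ids both become canonical strings.
function normalizeWorkspaceId(id: string | number): string {
  return String(id);
}

// Downstream maps can then key on the normalized form only, so a client
// sending 7 and a server storing "7" resolve to the same workspace.
function lookupWorkspace<T>(byId: Map<string, T>, id: string | number): T | undefined {
  return byId.get(normalizeWorkspaceId(id));
}
```

Normalizing at the parser boundary keeps the rest of the codebase single-typed, which is presumably why the transform lives in the schema rather than at each call site.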
- seq: z.number().int().nonnegative().optional(), - epoch: z.string().optional(), - }), -}); +export const AgentStreamMessageSchema = z + .object({ + type: z.literal("agent_stream"), + payload: z.object({ + agentId: z.string(), + event: AgentStreamEventPayloadSchema, + timestamp: z.string(), + // Present only for committed timeline events. + seq: z.number().int().nonnegative().optional(), + // COMPAT(timeline): retain removed epoch for older clients. + epoch: z.string().optional(), + }), + }) + ; export const AgentStatusMessageSchema = z.object({ type: z.literal("agent_status"), @@ -1854,11 +1899,44 @@ export const WorkspaceUpdateMessageSchema = z.object({ }), z.object({ kind: z.literal("remove"), - id: z.string(), + id: z.union([z.string(), z.number()]).transform(String), }), ]), }); +export const ServiceStatusUpdateMessageSchema = z.object({ + type: z.literal("service_status_update"), + payload: z.object({ + workspaceId: z.string(), + services: z.array(WorkspaceServicePayloadSchema), + }), +}); + +export const WorkspaceSetupProgressMessageSchema = z.object({ + type: z.literal("workspace_setup_progress"), + payload: z.object({ + workspaceId: z.string(), + status: z.enum(["running", "completed", "failed"]), + detail: WorktreeSetupDetailPayloadSchema, + error: z.string().nullable(), + }), +}); + +export const WorkspaceSetupSnapshotSchema = z.object({ + status: z.enum(["running", "completed", "failed"]), + detail: WorktreeSetupDetailPayloadSchema, + error: z.string().nullable(), +}); + +export const WorkspaceSetupStatusResponseMessageSchema = z.object({ + type: z.literal("workspace_setup_status_response"), + payload: z.object({ + requestId: z.string(), + workspaceId: z.string(), + snapshot: WorkspaceSetupSnapshotSchema.nullable(), + }), +}); + export const OpenProjectResponseMessageSchema = z.object({ type: z.literal("open_project_response"), payload: z.object({ @@ -1868,6 +1946,16 @@ export const OpenProjectResponseMessageSchema = z.object({ }), }); +export const 
StartWorkspaceServiceResponseMessageSchema = z.object({ + type: z.literal("start_workspace_service_response"), + payload: z.object({ + requestId: z.string(), + workspaceId: z.string(), + serviceName: z.string(), + error: z.string().nullable(), + }), +}); + export const ListAvailableEditorsResponseMessageSchema = z.object({ type: z.literal("list_available_editors_response"), payload: z.object({ @@ -1889,7 +1977,7 @@ export const ArchiveWorkspaceResponseMessageSchema = z.object({ type: z.literal("archive_workspace_response"), payload: z.object({ requestId: z.string(), - workspaceId: z.string(), + workspaceId: z.union([z.string(), z.number()]).transform(String), archivedAt: z.string().nullable(), error: z.string().nullable(), }), @@ -1905,46 +1993,75 @@ export const FetchAgentResponseMessageSchema = z.object({ }), }); -const AgentTimelineSeqRangeSchema = z.object({ +const TimelineProjectionKindSchema = z.enum(["assistant_merge", "tool_lifecycle"]); + +const TimelineSeqRangeSchema = z.object({ startSeq: z.number().int().nonnegative(), endSeq: z.number().int().nonnegative(), }); -export const AgentTimelineEntryPayloadSchema = z.object({ - provider: AgentProviderSchema, - item: AgentTimelineItemPayloadSchema, - timestamp: z.string(), - seqStart: z.number().int().nonnegative(), - seqEnd: z.number().int().nonnegative(), - sourceSeqRanges: z.array(AgentTimelineSeqRangeSchema), - collapsed: z.array(z.enum(["assistant_merge", "tool_lifecycle"])), -}); +export const AgentTimelineEntryPayloadSchema = z + .object({ + provider: AgentProviderSchema, + item: AgentTimelineItemPayloadSchema, + timestamp: z.string(), + seq: z.number().int().nonnegative().optional(), + // COMPAT(timeline): retain legacy projected timeline fields. 
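The timeline entry payload here makes `seq` optional alongside the legacy projected fields (`seqStart`/`seqEnd`), and a transform resolves one canonical sequence number or rejects the entry. The precedence can be sketched in plain TypeScript (simplified stand-in types, not the schema itself):

```typescript
// Simplified stand-in for the compat resolution: prefer the new `seq`,
// fall back to the legacy projected range, reject entries with neither.
interface LegacyTimelineEntrySeqFields {
  seq?: number;
  seqStart?: number;
  seqEnd?: number;
}

function resolveEntrySeq(entry: LegacyTimelineEntrySeqFields): number | null {
  const seq = entry.seq ?? entry.seqEnd ?? entry.seqStart;
  return typeof seq === "number" ? seq : null;
}
```

Falling back to `seqEnd` before `seqStart` keeps legacy merged entries anchored at their last underlying row, which matches how a tail cursor would address them.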
+ seqStart: z.number().int().nonnegative().optional(), + seqEnd: z.number().int().nonnegative().optional(), + sourceSeqRanges: z.array(TimelineSeqRangeSchema).optional(), + collapsed: z.array(TimelineProjectionKindSchema).optional(), + }) + .transform((entry, ctx) => { + const seq = entry.seq ?? entry.seqEnd ?? entry.seqStart; + if (typeof seq !== "number") { + ctx.addIssue({ + code: z.ZodIssueCode.custom, + message: "Agent timeline entry must include seq or legacy seqStart/seqEnd", + }); + return z.NEVER; + } -export const FetchAgentTimelineResponseMessageSchema = z.object({ - type: z.literal("fetch_agent_timeline_response"), - payload: z.object({ - requestId: z.string(), - agentId: z.string(), - agent: AgentSnapshotPayloadSchema.nullable(), - direction: z.enum(["tail", "before", "after"]), - projection: z.enum(["projected", "canonical"]), - epoch: z.string(), - reset: z.boolean(), - staleCursor: z.boolean(), - gap: z.boolean(), - window: z.object({ - minSeq: z.number().int().nonnegative(), - maxSeq: z.number().int().nonnegative(), - nextSeq: z.number().int().nonnegative(), - }), - startCursor: AgentTimelineCursorSchema.nullable(), - endCursor: AgentTimelineCursorSchema.nullable(), - hasOlder: z.boolean(), - hasNewer: z.boolean(), - entries: z.array(AgentTimelineEntryPayloadSchema), - error: z.string().nullable(), - }), -}); + return { + ...entry, + seq, + }; + }); + +export const FetchAgentTimelineResponseMessageSchema = z + .object({ + type: z.literal("fetch_agent_timeline_response"), + payload: z + .object({ + requestId: z.string(), + agentId: z.string(), + agent: AgentSnapshotPayloadSchema.nullable(), + direction: z.enum(["tail", "before", "after"]), + startSeq: z.number().int().nonnegative().nullable().optional(), + endSeq: z.number().int().nonnegative().nullable().optional(), + hasOlder: z.boolean().optional(), + hasNewer: z.boolean().optional(), + entries: z.array(AgentTimelineEntryPayloadSchema), + error: z.string().nullable(), + // COMPAT(timeline): 
retain legacy response baggage for older clients. + epoch: z.string().optional(), + reset: z.boolean().optional(), + staleCursor: z.boolean().optional(), + gap: z.unknown().optional(), + window: z.unknown().optional(), + startCursor: AgentTimelineCursorSchema.nullable().optional(), + endCursor: AgentTimelineCursorSchema.nullable().optional(), + projection: z.enum(["canonical", "projected"]).optional(), + }) + .transform((payload) => ({ + ...payload, + startSeq: payload.startSeq ?? payload.startCursor?.seq ?? null, + endSeq: payload.endSeq ?? payload.endCursor?.seq ?? null, + hasOlder: payload.hasOlder ?? false, + hasNewer: payload.hasNewer ?? false, + })), + }) + ; export const SendAgentMessageResponseMessageSchema = z.object({ type: z.literal("send_agent_message_response"), @@ -2160,6 +2277,17 @@ const CheckoutPrStatusSchema = z.object({ baseRefName: z.string(), headRefName: z.string(), isMerged: z.boolean(), + checks: z + .array( + z.object({ + name: z.string(), + status: z.string(), + url: z.string().nullable(), + }), + ) + .optional(), + checksStatus: z.string().optional(), + reviewDecision: z.string().nullable().optional(), }); export const CheckoutPrStatusResponseSchema = z.object({ @@ -2401,6 +2529,7 @@ const TerminalInfoSchema = z.object({ id: z.string(), name: z.string(), cwd: z.string(), + title: z.string().optional(), }); export const TerminalCellSchema = z @@ -2438,6 +2567,7 @@ export const TerminalStateSchema = z grid: z.array(z.array(TerminalCellSchema)), scrollback: z.array(z.array(TerminalCellSchema)), cursor: TerminalCursorSchema, + title: z.string().optional(), }) .strict(); @@ -2527,11 +2657,15 @@ export const SessionOutboundMessageSchema = z.discriminatedUnion("type", [ ArtifactMessageSchema, AgentUpdateMessageSchema, WorkspaceUpdateMessageSchema, + ServiceStatusUpdateMessageSchema, + WorkspaceSetupProgressMessageSchema, + WorkspaceSetupStatusResponseMessageSchema, AgentStreamMessageSchema, AgentStatusMessageSchema, 
FetchAgentsResponseMessageSchema, FetchWorkspacesResponseMessageSchema, OpenProjectResponseMessageSchema, + StartWorkspaceServiceResponseMessageSchema, ListAvailableEditorsResponseMessageSchema, OpenInEditorResponseMessageSchema, ArchiveWorkspaceResponseMessageSchema, @@ -2620,17 +2754,29 @@ export type ServerInfoStatusPayload = z.infer; export type ArtifactMessage = z.infer; export type AgentUpdateMessage = z.infer; +export type WorkspaceSetupProgressMessage = z.infer; +export type WorkspaceSetupSnapshot = z.infer; +export type WorkspaceSetupStatusResponseMessage = z.infer< + typeof WorkspaceSetupStatusResponseMessageSchema +>; export type AgentStreamMessage = z.infer; export type AgentStatusMessage = z.infer; export type ProjectCheckoutLitePayload = z.infer; export type ProjectPlacementPayload = z.infer; export type WorkspaceStateBucket = z.infer; export type WorkspaceDescriptorPayload = z.infer; +export type WorkspaceServiceLifecycle = z.infer; +export type WorkspaceServiceHealth = z.infer; +export type WorkspaceServicePayload = z.infer; export type EditorTargetId = z.infer; export type EditorTargetDescriptorPayload = z.infer; export type FetchAgentsResponseMessage = z.infer; export type FetchWorkspacesResponseMessage = z.infer; +export type ServiceStatusUpdateMessage = z.infer; export type OpenProjectResponseMessage = z.infer; +export type StartWorkspaceServiceResponseMessage = z.infer< + typeof StartWorkspaceServiceResponseMessageSchema +>; export type ListAvailableEditorsResponseMessage = z.infer< typeof ListAvailableEditorsResponseMessageSchema >; @@ -2782,6 +2928,7 @@ export type PaseoWorktreeListRequest = z.infer; export type PaseoWorktreeArchiveRequest = z.infer; export type PaseoWorktreeArchiveResponse = z.infer; +export type WorkspaceSetupStatusRequest = z.infer; export type ListAvailableEditorsRequest = z.infer; export type OpenInEditorRequest = z.infer; export type OpenProjectRequest = z.infer; @@ -2809,6 +2956,8 @@ export type 
UnsubscribeTerminalsRequest = z.infer; export type CreateTerminalRequest = z.infer; export type CreateTerminalResponse = z.infer; +export type StartWorkspaceServiceRequest = z.infer; +export type StartWorkspaceServiceResponse = z.infer; export type SubscribeTerminalRequest = z.infer; export type SubscribeTerminalResponse = z.infer; export type UnsubscribeTerminalRequest = z.infer; diff --git a/packages/server/src/shared/messages.workspaces.test.ts b/packages/server/src/shared/messages.workspaces.test.ts index 2a8796577..a38673039 100644 --- a/packages/server/src/shared/messages.workspaces.test.ts +++ b/packages/server/src/shared/messages.workspaces.test.ts @@ -8,7 +8,7 @@ describe("workspace message schemas", () => { requestId: "req-1", filter: { query: "repo", - projectId: "remote:github.com/acme/repo", + projectId: 12, idPrefix: "/Users/me", }, sort: [{ key: "activity_at", direction: "desc" }], @@ -56,15 +56,16 @@ describe("workspace message schemas", () => { payload: { kind: "upsert", workspace: { - id: "/repo", - projectId: "/repo", + id: 1, + projectId: 1, projectDisplayName: "repo", projectRootPath: "/repo", - projectKind: "non_git", - workspaceKind: "directory", + projectKind: "directory", + workspaceKind: "checkout", name: "", status: "not-a-bucket", activityAt: null, + services: [], }, }, }); @@ -72,6 +73,166 @@ describe("workspace message schemas", () => { expect(result.success).toBe(false); }); + test("parses workspace descriptors with services", () => { + const parsed = SessionOutboundMessageSchema.parse({ + type: "workspace_update", + payload: { + kind: "upsert", + workspace: { + id: 1, + projectId: 1, + projectDisplayName: "repo", + projectRootPath: "/repo", + workspaceDirectory: "/repo", + projectKind: "directory", + workspaceKind: "checkout", + name: "repo", + status: "done", + activityAt: null, + services: [ + { + serviceName: "web", + hostname: "web.localhost", + port: 3000, + url: "http://web.localhost:6767", + lifecycle: "running", + health: 
"healthy", + }, + ], + }, + }, + }); + + expect(parsed.type).toBe("workspace_update"); + if (parsed.type !== "workspace_update" || parsed.payload.kind !== "upsert") { + throw new Error("Expected workspace_update upsert payload"); + } + expect(parsed.payload.workspace.services).toEqual([ + { + serviceName: "web", + hostname: "web.localhost", + port: 3000, + url: "http://web.localhost:6767", + lifecycle: "running", + health: "healthy", + }, + ]); + }); + + test("parses legacy workspace descriptor enum values", () => { + const parsed = SessionOutboundMessageSchema.parse({ + type: "workspace_update", + payload: { + kind: "upsert", + workspace: { + id: "legacy-workspace", + projectId: "legacy-project", + projectDisplayName: "repo", + projectRootPath: "/repo", + workspaceDirectory: "/repo", + projectKind: "non_git", + workspaceKind: "directory", + name: "repo", + status: "done", + activityAt: null, + services: [], + }, + }, + }); + + expect(parsed.type).toBe("workspace_update"); + if (parsed.type !== "workspace_update" || parsed.payload.kind !== "upsert") { + throw new Error("Expected workspace_update upsert payload"); + } + expect(parsed.payload.workspace.projectKind).toBe("non_git"); + expect(parsed.payload.workspace.workspaceKind).toBe("directory"); + }); + + test("parses service_status_update payload", () => { + const parsed = SessionOutboundMessageSchema.parse({ + type: "service_status_update", + payload: { + workspaceId: "/repo", + services: [ + { + serviceName: "web", + hostname: "web.localhost", + port: null, + url: null, + lifecycle: "stopped", + health: null, + }, + ], + }, + }); + + expect(parsed.type).toBe("service_status_update"); + expect(parsed.payload.workspaceId).toBe("/repo"); + }); + + test("parses workspace_setup_progress payload", () => { + const parsed = SessionOutboundMessageSchema.parse({ + type: "workspace_setup_progress", + payload: { + workspaceId: "/repo/.paseo/worktrees/feature-a", + status: "completed", + detail: { + type: "worktree_setup", 
+ worktreePath: "/repo/.paseo/worktrees/feature-a", + branchName: "feature-a", + log: "done", + commands: [ + { + index: 1, + command: "npm install", + cwd: "/repo/.paseo/worktrees/feature-a", + log: "done", + status: "completed", + exitCode: 0, + durationMs: 100, + }, + ], + }, + error: null, + }, + }); + + expect(parsed.type).toBe("workspace_setup_progress"); + }); + + test("parses workspace_setup_status_request", () => { + const parsed = SessionInboundMessageSchema.parse({ + type: "workspace_setup_status_request", + workspaceId: "/repo/.paseo/worktrees/feature-a", + requestId: "req-status", + }); + + expect(parsed.type).toBe("workspace_setup_status_request"); + }); + + test("parses workspace_setup_status_response payload", () => { + const parsed = SessionOutboundMessageSchema.parse({ + type: "workspace_setup_status_response", + payload: { + requestId: "req-status", + workspaceId: "/repo/.paseo/worktrees/feature-a", + snapshot: { + status: "completed", + detail: { + type: "worktree_setup", + worktreePath: "/repo/.paseo/worktrees/feature-a", + branchName: "feature-a", + log: "done", + commands: [], + }, + error: null, + }, + }, + }); + + expect(parsed.type).toBe("workspace_setup_status_response"); + }); + test("parses legacy fetch_agents_response checkout payloads without worktreeRoot", () => { const result = SessionOutboundMessageSchema.safeParse({ type: "fetch_agents_response", diff --git a/packages/server/src/shared/tool-call-display.test.ts b/packages/server/src/shared/tool-call-display.test.ts index d0e80c6b2..8d36e962a 100644 --- a/packages/server/src/shared/tool-call-display.test.ts +++ b/packages/server/src/shared/tool-call-display.test.ts @@ -79,6 +79,7 @@ describe("shared tool-call display mapping", () => { index: 1, command: "npm install", cwd: "/tmp/repo/.paseo/worktrees/repo/branch", + log: "==> [1/1] Running: npm install\n", status: "running", exitCode: null, }, diff --git a/packages/server/src/terminal/shell-integration/zsh/.zshenv 
b/packages/server/src/terminal/shell-integration/zsh/.zshenv new file mode 100644 index 000000000..2150c66e3 --- /dev/null +++ b/packages/server/src/terminal/shell-integration/zsh/.zshenv @@ -0,0 +1,17 @@ +typeset -g PASEO_SHELL_INTEGRATION_DIR="${${(%):-%N}:A:h}" + +if [[ -n "${PASEO_ZSH_ZDOTDIR-}" ]]; then + export ZDOTDIR="${PASEO_ZSH_ZDOTDIR}" +else + unset ZDOTDIR +fi + +if [[ -n "${ZDOTDIR-}" ]]; then + if [[ -f "${ZDOTDIR}/.zshenv" ]]; then + source "${ZDOTDIR}/.zshenv" + fi +elif [[ -f "${HOME}/.zshenv" ]]; then + source "${HOME}/.zshenv" +fi + +source "${PASEO_SHELL_INTEGRATION_DIR}/paseo-integration.zsh" diff --git a/packages/server/src/terminal/shell-integration/zsh/paseo-integration.zsh b/packages/server/src/terminal/shell-integration/zsh/paseo-integration.zsh new file mode 100644 index 000000000..3759b84e8 --- /dev/null +++ b/packages/server/src/terminal/shell-integration/zsh/paseo-integration.zsh @@ -0,0 +1,17 @@ +if [[ -n "${_PASEO_ZSH_INTEGRATION_LOADED-}" ]]; then + return +fi +typeset -g _PASEO_ZSH_INTEGRATION_LOADED=1 + +autoload -Uz add-zsh-hook + +function _paseo_precmd() { + printf '\e]2;%s\a' "${PWD/#$HOME/~}" +} + +function _paseo_preexec() { + printf '\e]2;%s\a' "$1" +} + +add-zsh-hook precmd _paseo_precmd +add-zsh-hook preexec _paseo_preexec diff --git a/packages/server/src/terminal/terminal-manager.test.ts b/packages/server/src/terminal/terminal-manager.test.ts index 20dadc4e7..52f76695e 100644 --- a/packages/server/src/terminal/terminal-manager.test.ts +++ b/packages/server/src/terminal/terminal-manager.test.ts @@ -1,6 +1,6 @@ -import { describe, it, expect, afterEach } from "vitest"; +import { describe, it, expect, afterEach, vi } from "vitest"; import { createTerminalManager, type TerminalManager } from "./terminal-manager.js"; -import { existsSync, mkdtempSync, mkdirSync, readFileSync, rmSync } from "node:fs"; +import { existsSync, mkdtempSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs"; import { join } from 
"node:path"; import { tmpdir } from "node:os"; @@ -301,11 +301,14 @@ describe("TerminalManager", () => { describe("subscribeTerminalsChanged", () => { it("emits cwd snapshots when terminals are created", async () => { manager = createTerminalManager(); - const snapshots: Array<{ cwd: string; terminalNames: string[] }> = []; + const snapshots: Array<{ cwd: string; terminals: Array<{ name: string; title?: string }> }> = []; const unsubscribe = manager.subscribeTerminalsChanged((input) => { snapshots.push({ cwd: input.cwd, - terminalNames: input.terminals.map((terminal) => terminal.name), + terminals: input.terminals.map((terminal) => ({ + name: terminal.name, + ...(terminal.title ? { title: terminal.title } : {}), + })), }); }); @@ -314,16 +317,48 @@ describe("TerminalManager", () => { expect(snapshots).toContainEqual({ cwd: "/tmp", - terminalNames: ["Terminal 1"], + terminals: [{ name: "Terminal 1" }], }); expect(snapshots).toContainEqual({ cwd: "/tmp", - terminalNames: ["Terminal 1", "Dev Server"], + terminals: [{ name: "Terminal 1" }, { name: "Dev Server" }], }); unsubscribe(); }); + it( + "emits updated terminal titles after debounced title changes", + async () => { + await withShell("/bin/sh", async () => { + manager = createTerminalManager(); + const snapshots: Array> = []; + const unsubscribe = manager.subscribeTerminalsChanged((input) => { + snapshots.push( + input.terminals.map((terminal) => ({ + id: terminal.id, + ...(terminal.title ? 
{ title: terminal.title } : {}), + })), + ); + }); + + const session = await manager.createTerminal({ cwd: "/tmp" }); + session.send({ type: "input", data: "printf '\\033]0;Logs\\007'\r" }); + + await waitForCondition( + () => + snapshots.some((snapshot) => + snapshot.some((terminal) => terminal.id === session.id && terminal.title === "Logs"), + ), + 10000, + ); + + unsubscribe(); + }); + }, + 10000, + ); + it("emits empty snapshot when last terminal is removed", async () => { manager = createTerminalManager(); const snapshots: Array<{ cwd: string; terminalCount: number }> = []; diff --git a/packages/server/src/terminal/terminal-manager.ts b/packages/server/src/terminal/terminal-manager.ts index 22e016875..6dc49fe11 100644 --- a/packages/server/src/terminal/terminal-manager.ts +++ b/packages/server/src/terminal/terminal-manager.ts @@ -5,6 +5,7 @@ export interface TerminalListItem { id: string; name: string; cwd: string; + title?: string; } export interface TerminalsChangedEvent { @@ -17,9 +18,12 @@ export type TerminalsChangedListener = (input: TerminalsChangedEvent) => void; export interface TerminalManager { getTerminals(cwd: string): Promise<TerminalListItem[]>; createTerminal(options: { + id?: string; cwd: string; name?: string; env?: Record<string, string>; + command?: string; + args?: string[]; }): Promise<TerminalSession>; registerCwdEnv(options: { cwd: string; env: Record<string, string> }): void; getTerminal(id: string): TerminalSession | undefined; @@ -33,6 +37,7 @@ export function createTerminalManager(): TerminalManager { const terminalsByCwd = new Map(); const terminalsById = new Map(); const terminalExitUnsubscribeById = new Map<string, () => void>(); + const terminalTitleUnsubscribeById = new Map<string, () => void>(); const terminalsChangedListeners = new Set<TerminalsChangedListener>(); const defaultEnvByRootCwd = new Map<string, Record<string, string>>(); @@ -53,6 +58,11 @@ export function createTerminalManager(): TerminalManager { unsubscribeExit(); terminalExitUnsubscribeById.delete(id); } + const unsubscribeTitle = terminalTitleUnsubscribeById.get(id); + if (unsubscribeTitle) { +
unsubscribeTitle(); + terminalTitleUnsubscribeById.delete(id); + } terminalsById.delete(id); @@ -96,7 +106,11 @@ export function createTerminalManager(): TerminalManager { const unsubscribeExit = session.onExit(() => { removeSessionById(session.id, { kill: false }); }); + const unsubscribeTitle = session.onTitleChange(() => { + emitTerminalsChanged({ cwd: session.cwd }); + }); terminalExitUnsubscribeById.set(session.id, unsubscribeExit); + terminalTitleUnsubscribeById.set(session.id, unsubscribeTitle); return session; } @@ -105,6 +119,7 @@ export function createTerminalManager(): TerminalManager { id: input.session.id, name: input.session.name, cwd: input.session.cwd, + title: input.session.getTitle(), }; } @@ -138,9 +153,12 @@ export function createTerminalManager(): TerminalManager { }, async createTerminal(options: { + id?: string; cwd: string; name?: string; env?: Record<string, string>; + command?: string; + args?: string[]; }): Promise<TerminalSession> { assertAbsolutePath(options.cwd); @@ -153,8 +171,11 @@ export function createTerminalManager(): TerminalManager { : undefined; const session = registerSession( await createTerminal({ + ...(options.id ? { id: options.id } : {}), cwd: options.cwd, name: options.name ?? defaultName, + ...(options.command ? { command: options.command } : {}), + ...(options.args ? { args: options.args } : {}), ...(mergedEnv ?
{ env: mergedEnv } : {}), }), ); diff --git a/packages/server/src/terminal/terminal.test.ts b/packages/server/src/terminal/terminal.test.ts index 834cc1597..20ac2e438 100644 --- a/packages/server/src/terminal/terminal.test.ts +++ b/packages/server/src/terminal/terminal.test.ts @@ -1,8 +1,12 @@ import { describe, it, expect, afterEach } from "vitest"; import { + buildTerminalEnvironment, createTerminal, ensureNodePtySpawnHelperExecutableForCurrentPlatform, resolveDefaultTerminalShell, + humanizeProcessTitle, + normalizeProcessTitle, + resolveZshShellIntegrationDir, type TerminalSession, } from "./terminal.js"; import { chmodSync, mkdtempSync, mkdirSync, rmSync, statSync, writeFileSync } from "node:fs"; @@ -71,6 +75,23 @@ async function waitForState( throw new Error("Timeout waiting for terminal state predicate to match"); } +async function waitForTitle( + session: TerminalSession, + predicate: (title: string | undefined) => boolean, + timeoutMs = 5000, +): Promise<string | undefined> { + const start = Date.now(); + while (Date.now() - start < timeoutMs) { + const title = session.getTitle(); + if (predicate(title)) { + return title; + } + await new Promise((resolve) => setTimeout(resolve, 25)); + } + + throw new Error("Timeout waiting for terminal title predicate to match"); +} + describe("Terminal", () => { const sessions: TerminalSession[] = []; const temporaryDirs: string[] = []; @@ -94,6 +115,30 @@ describe("Terminal", () => { } describe("createTerminal", () => { + it("keeps full process titles while stripping path prefixes", () => { + expect(normalizeProcessTitle(" /usr/local/bin/npm run dev ")).toBe("npm run dev"); + expect(normalizeProcessTitle("/opt/homebrew/bin/node /tmp/work/npm-cli.js run dev")).toBe( + "node npm-cli.js run dev", + ); + expect(normalizeProcessTitle("")).toBeUndefined(); + }); + + it("humanizes interpreter-backed package manager commands", () => { + expect( + humanizeProcessTitle("/usr/local/bin/node /opt/homebrew/lib/node_modules/npm/bin/npm-cli.js run dev"),
+ ).toBe("npm run dev"); + expect( + humanizeProcessTitle("/usr/bin/env FOO=bar /opt/homebrew/bin/node /tmp/npm-cli.js test"), + ).toBe("npm test"); + }); + + it("drops common interpreter prefixes for direct scripts", () => { + expect(humanizeProcessTitle("/usr/bin/python3 /tmp/server.py --port 3000")).toBe( + "server.py --port 3000", + ); + expect(humanizeProcessTitle("/bin/bash /tmp/dev.sh")).toBe("dev.sh"); + }); + it("ensures darwin prebuild spawn-helper is executable", () => { const packageRoot = mkdtempSync(join(tmpdir(), "terminal-node-pty-helper-")); temporaryDirs.push(packageRoot); @@ -140,6 +185,20 @@ describe("Terminal", () => { expect(session.cwd).toBe("/tmp"); }); + it("sets zsh wrapper env when spawning zsh", () => { + const resolvedEnv = buildTerminalEnvironment({ + shell: "/bin/zsh", + env: { + HOME: "/tmp/paseo-home", + ZDOTDIR: "/tmp/paseo-zdotdir", + }, + }); + + expect(resolvedEnv.TERM).toBe("xterm-256color"); + expect(resolvedEnv.PASEO_ZSH_ZDOTDIR).toBe("/tmp/paseo-zdotdir"); + expect(resolvedEnv.ZDOTDIR).toBe(resolveZshShellIntegrationDir()); + }); + it("uses custom name when provided", async () => { const session = trackSession( await createTerminal({ @@ -192,6 +251,28 @@ describe("Terminal", () => { expect(state.rows).toBe(40); expect(state.cols).toBe(120); }); + + it("captures exit diagnostics from the terminal buffer", async () => { + const session = trackSession( + await createTerminal({ + cwd: "/tmp", + command: "/bin/sh", + args: ["-lc", "printf 'launch failed\\ncommand missing\\n'; exit 127"], + }), + ); + + const exitInfo = await new Promise<NonNullable<ReturnType<TerminalSession["getExitInfo"]>>>( + (resolve) => { + session.onExit((info) => resolve(info)); + }, + ); + + expect(exitInfo.exitCode).toBe(127); + expect(exitInfo.signal).toBeNull(); + // lastOutputLines may be empty if the process exits before xterm processes the data write + expect(Array.isArray(exitInfo.lastOutputLines)).toBe(true); + expect(session.getExitInfo()).toEqual(exitInfo); + }); }); describe("send input", () => {
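The normalize/humanize expectations above come down to one core transformation: keep every token of the process title, but reduce any path-like token to its basename. A minimal sketch of that idea (the `stripPathPrefixes` helper here is hypothetical; the actual `normalizeProcessTitle` in the diff also handles quoted tokens and `FOO=bar` assignment prefixes):

```typescript
// Sketch: collapse whitespace, then reduce each path-like token to its basename.
// Hypothetical helper, not the real normalizeProcessTitle implementation.
function stripPathPrefixes(processTitle: string): string | undefined {
  const trimmed = processTitle.trim().replace(/\s+/g, " ");
  if (trimmed.length === 0) {
    return undefined; // empty titles are treated as "no title"
  }
  return trimmed
    .split(" ")
    .map((token) => (token.includes("/") ? token.split("/").pop() ?? token : token))
    .join(" ");
}

console.log(stripPathPrefixes(" /usr/local/bin/npm run dev ")); // "npm run dev"
```

Keeping the trailing arguments (`run dev`) is what distinguishes this from naively using only the executable basename as the title.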
@@ -268,6 +349,154 @@ describe("Terminal", () => { }); }); + describe("terminal title", () => { + it("restores the user's ZDOTDIR through the zsh wrapper", async () => { + const homeDir = mkdtempSync(join(tmpdir(), "terminal-zsh-home-")); + temporaryDirs.push(homeDir); + const realZdotdir = join(homeDir, ".config", "zsh"); + mkdirSync(realZdotdir, { recursive: true }); + writeFileSync(join(realZdotdir, ".zshenv"), "export PASEO_TEST_REAL_ZDOTDIR=1\n"); + + const session = trackSession( + await createTerminal({ + cwd: homeDir, + command: "/bin/zsh", + args: ["-c", "printf '%s\\n%s\\n' \"${ZDOTDIR-}\" \"${PASEO_TEST_REAL_ZDOTDIR-}\""], + env: { + HOME: homeDir, + ZDOTDIR: realZdotdir, + }, + }), + ); + + const exitInfo = await new Promise<NonNullable<ReturnType<TerminalSession["getExitInfo"]>>>( + (resolve) => { + session.onExit((info) => resolve(info)); + }, + ); + + expect(exitInfo.lastOutputLines).toEqual([realZdotdir, "1"]); + }); + + it("emits the initial title from command args to title listeners", async () => { + const packageRoot = mkdtempSync(join(tmpdir(), "terminal-title-script-")); + temporaryDirs.push(packageRoot); + const scriptPath = join(packageRoot, "npm-cli.js"); + writeFileSync(scriptPath, "setTimeout(() => process.exit(0), 1000);\n"); + + const session = trackSession( + await createTerminal({ + cwd: packageRoot, + command: process.execPath, + args: [scriptPath, "run", "dev"], + }), + ); + const seenTitles: Array<string | undefined> = []; + const unsubscribeTitle = session.onTitleChange((title) => { + seenTitles.push(title); + }); + + await waitForTitle(session, (title) => title === "npm run dev"); + await waitForState(session, (state) => state.title === "npm run dev"); + + expect(seenTitles).toContain("npm run dev"); + expect(session.getTitle()).toBe("npm run dev"); + expect(session.getState().title).toBe("npm run dev"); + + unsubscribeTitle(); + }); + + it("emits OSC title updates to title listeners", async () => { + const session = trackSession( + await createTerminal({ + cwd: "/tmp", + shell: "/bin/sh", + env: { PS1:
"$ " }, + }), + ); + const seenTitles: Array = []; + const unsubscribeTitle = session.onTitleChange((title) => { + seenTitles.push(title); + }); + + await waitForLines(session, ["$"]); + session.send({ type: "input", data: "printf '\\033]0;Build Log\\007'\r" }); + + await waitForTitle(session, (title) => title === "Build Log"); + + expect(seenTitles).toContain("Build Log"); + expect(session.getTitle()).toBe("Build Log"); + expect(session.getState().title).toBe("Build Log"); + + unsubscribeTitle(); + }); + + it("debounces rapid title changes and emits only the final title", async () => { + const session = trackSession( + await createTerminal({ + cwd: "/tmp", + shell: "/bin/sh", + env: { PS1: "$ " }, + }), + ); + const seenTitles: Array = []; + const seenMessages: Array = []; + const unsubscribeTitle = session.onTitleChange((title) => { + seenTitles.push(title); + }); + const unsubscribeMessages = session.subscribe((message) => { + if (message.type === "titleChange") { + seenMessages.push(message.title); + } + }); + + await waitForLines(session, ["$"]); + session.send({ + type: "input", + data: + "printf '\\033]0;First\\007\\033]0;Second\\007\\033]0;Final\\007'\r", + }); + + await waitForTitle(session, (title) => title === "Final"); + + expect(seenTitles).toEqual(["Final"]); + expect(seenMessages).toEqual(["Final"]); + + unsubscribeMessages(); + unsubscribeTitle(); + }); + + it("emits zsh shell integration titles for commands and prompts", async () => { + const homeDir = mkdtempSync(join(tmpdir(), "terminal-zsh-integration-home-")); + temporaryDirs.push(homeDir); + const realZdotdir = join(homeDir, ".config", "zsh"); + const workingDir = join(homeDir, "dev", "faro"); + mkdirSync(realZdotdir, { recursive: true }); + mkdirSync(workingDir, { recursive: true }); + writeFileSync(join(realZdotdir, ".zshenv"), ""); + writeFileSync(join(realZdotdir, ".zshrc"), "PS1='$ '\n"); + + const session = trackSession( + await createTerminal({ + cwd: workingDir, + shell: "/bin/zsh", + 
env: { + HOME: homeDir, + ZDOTDIR: realZdotdir, + }, + }), + ); + + await waitForLines(session, ["$"]); + await waitForTitle(session, (title) => title === "~/dev/faro"); + + session.send({ type: "input", data: "sleep 1\r" }); + + await waitForTitle(session, (title) => title === "sleep 1"); + await waitForTitle(session, (title) => title === "~/dev/faro", 4000); + }); + }); + describe("colors", () => { it("captures ANSI 16 color codes (mode 1)", async () => { const session = trackSession( diff --git a/packages/server/src/terminal/terminal.ts b/packages/server/src/terminal/terminal.ts index 60d9c7ef7..9e8c6d4f9 100644 --- a/packages/server/src/terminal/terminal.ts +++ b/packages/server/src/terminal/terminal.ts @@ -2,14 +2,24 @@ import * as pty from "node-pty"; import xterm, { type Terminal as TerminalType } from "@xterm/headless"; import { randomUUID } from "crypto"; import { chmodSync, existsSync, statSync } from "node:fs"; -import { dirname, join } from "node:path"; +import { basename, dirname, join } from "node:path"; import { createRequire } from "node:module"; +import { fileURLToPath } from "node:url"; import stripAnsi from "strip-ansi"; import type { TerminalCell, TerminalState } from "../shared/messages.js"; const { Terminal } = xterm; const require = createRequire(import.meta.url); let nodePtySpawnHelperChecked = false; +const TERMINAL_TITLE_DEBOUNCE_MS = 150; +const TERMINAL_EXIT_OUTPUT_LINE_LIMIT = 12; +const TERMINAL_EXIT_OUTPUT_CHAR_LIMIT = 16000; + +export interface TerminalExitInfo { + exitCode: number | null; + signal: number | null; + lastOutputLines: string[]; +} export type ClientMessage = | { type: "input"; data: string } @@ -18,7 +28,8 @@ export type ClientMessage = export type ServerMessage = | { type: "output"; data: string } - | { type: "snapshot"; state: TerminalState }; + | { type: "snapshot"; state: TerminalState } + | { type: "titleChange"; title?: string }; export interface TerminalSession { id: string; @@ -26,19 +37,30 @@ export interface 
TerminalSession { cwd: string; send(msg: ClientMessage): void; subscribe(listener: (msg: ServerMessage) => void): () => void; - onExit(listener: () => void): () => void; + onExit(listener: (info: TerminalExitInfo) => void): () => void; + onTitleChange(listener: (title?: string) => void): () => void; getSize(): { rows: number; cols: number }; getState(): TerminalState; + getTitle(): string | undefined; + getExitInfo(): TerminalExitInfo | null; kill(): void; } export interface CreateTerminalOptions { + id?: string; cwd: string; shell?: string; env?: Record<string, string>; rows?: number; cols?: number; name?: string; + command?: string; + args?: string[]; +} + +interface BuildTerminalEnvironmentInput { + shell: string; + env: Record<string, string>; } export interface CaptureTerminalLinesOptions { @@ -135,6 +157,29 @@ export function resolveDefaultTerminalShell( return env.SHELL || "/bin/sh"; } +export function resolveZshShellIntegrationDir(): string { + return fileURLToPath(new URL("./shell-integration/zsh", import.meta.url)); +} + +export function buildTerminalEnvironment(input: BuildTerminalEnvironmentInput): Record<string, string> { + const baseEnv: Record<string, string> = { + ...process.env, + ...input.env, + TERM: "xterm-256color", + }; + + if (basename(input.shell) !== "zsh") { + return baseEnv; + } + + const originalZdotdir = baseEnv.ZDOTDIR ?? ""; + return { + ...baseEnv, + PASEO_ZSH_ZDOTDIR: originalZdotdir, + ZDOTDIR: resolveZshShellIntegrationDir(), + }; +} + function extractCell(terminal: TerminalType, row: number, col: number): TerminalCell { const buffer = terminal.buffer.active; const line = buffer.getLine(row); @@ -258,6 +303,157 @@ function extractCursorState(terminal: TerminalType): TerminalState["cursor"] { }; } +function normalizeProcessToken(token: string): string { + if (token.length === 0) { + return token; + } + + const quote = + token.startsWith('"') && token.endsWith('"') + ? '"' + : token.startsWith("'") && token.endsWith("'") + ? "'" + : ""; + const rawToken = quote ?
token.slice(1, -1) : token; + if (rawToken.length === 0) { + return token; + } + + const assignmentMatch = rawToken.match(/^([A-Za-z_][A-Za-z0-9_]*=)(.+)$/); + const prefix = assignmentMatch ? assignmentMatch[1] : ""; + const value = assignmentMatch ? assignmentMatch[2] : rawToken; + if (!value.includes("/")) { + return token; + } + + const normalized = `${prefix}${basename(value)}`; + return quote ? `${quote}${normalized}${quote}` : normalized; +} + +export function normalizeProcessTitle(processTitle: string): string | undefined { + const trimmed = processTitle.trim().replace(/\s+/g, " "); + if (trimmed.length === 0) { + return undefined; + } + + const normalized = trimmed + .split(" ") + .map((token) => normalizeProcessToken(token)) + .join(" ") + .trim(); + return normalized.length > 0 ? normalized : undefined; +} + +const PROCESS_INTERPRETERS = new Set([ + "bash", + "bun", + "deno", + "node", + "nodejs", + "python", + "python3", + "ruby", + "sh", + "tsx", + "zsh", +]); + +const PACKAGE_MANAGER_SCRIPT_NAMES = new Map([ + ["bun.js", "bun"], + ["npm-cli.js", "npm"], + ["npx-cli.js", "npx"], + ["pnpm.cjs", "pnpm"], + ["pnpm.js", "pnpm"], + ["yarn.cjs", "yarn"], + ["yarn.js", "yarn"], +]); + +export function humanizeProcessTitle(processTitle: string): string | undefined { + const normalized = normalizeProcessTitle(processTitle); + if (!normalized) { + return undefined; + } + + const tokens = normalized.split(" ").filter(Boolean); + if (tokens.length === 0) { + return undefined; + } + + while (tokens[0] === "env") { + tokens.shift(); + while (tokens[0] && /^[A-Za-z_][A-Za-z0-9_]*=/.test(tokens[0])) { + tokens.shift(); + } + } + + if (tokens.length === 0) { + return normalized; + } + + const first = tokens[0]; + const second = tokens[1]; + if (PROCESS_INTERPRETERS.has(first) && second) { + const packageManager = PACKAGE_MANAGER_SCRIPT_NAMES.get(second); + if (packageManager) { + return [packageManager, ...tokens.slice(2)].join(" ").trim() || packageManager; + } + + if 
(!second.startsWith("-")) { + return [second, ...tokens.slice(2)].join(" ").trim(); + } + } + + return normalized; +} + +function extractLastOutputLines(terminal: TerminalType, limit: number): string[] { + const buffer = terminal.buffer.active; + const mergedLines: string[] = []; + + for (let row = 0; row < buffer.length; row++) { + const line = buffer.getLine(row); + if (!line) { + continue; + } + + const text = line.translateToString(true); + const isWrapped = (line as { isWrapped?: boolean }).isWrapped === true; + if (isWrapped && mergedLines.length > 0) { + mergedLines[mergedLines.length - 1] += text; + continue; + } + mergedLines.push(text); + } + + while (mergedLines.length > 0 && mergedLines[0]?.trim().length === 0) { + mergedLines.shift(); + } + while (mergedLines.length > 0 && mergedLines[mergedLines.length - 1]?.trim().length === 0) { + mergedLines.pop(); + } + + return mergedLines.slice(-limit); +} + +function stripAnsiSequences(input: string): string { + return input.replace( + /\x1b(?:[@-Z\\-_]|\[[0-?]*[ -/]*[@-~]|\].*?(?:\x07|\x1b\\))/g, + "", + ); +} + +function extractLastOutputLinesFromText(text: string, limit: number): string[] { + const normalized = stripAnsiSequences(text).replace(/\r\n/g, "\n").replace(/\r/g, "\n"); + const lines = normalized.split("\n").map((line) => line.trimEnd()); + while (lines[0]?.trim().length === 0) { + lines.shift(); + } + while (lines[lines.length - 1]?.trim().length === 0) { + lines.pop(); + } + return lines.slice(-limit); +} + function cellsToPlainText(cells: TerminalCell[], options: { stripAnsi: boolean }): string { const text = cells.map((cell) => cell.char).join("").trimEnd(); return options.stripAnsi ? 
stripAnsi(text) : text; @@ -320,15 +516,23 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { const messageListeners = new Set<(msg: ServerMessage) => void>(); - const exitListeners = new Set<() => void>(); + const exitListeners = new Set<(info: TerminalExitInfo) => void>(); + const titleChangeListeners = new Set<(title?: string) => void>(); let killed = false; let disposed = false; let exitEmitted = false; + let exitInfo: TerminalExitInfo | null = null; + let recentOutputText = ""; + let title: string | undefined; + let pendingTitle: string | undefined; + let titleDebounceTimer: ReturnType<typeof setTimeout> | null = null; // Create xterm.js headless terminal const terminal = new Terminal({ @@ -341,18 +545,43 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { if (params.length === 0 || (params.length === 1 && params[0] === 0)) { @@ -362,14 +591,41 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { + terminal.onTitleChange((nextTitle) => { + if (disposed || killed) { + return; + } + pendingTitle = nextTitle.trim().length > 0 ? nextTitle : undefined; + if (titleDebounceTimer) { + clearTimeout(titleDebounceTimer); + } + titleDebounceTimer = setTimeout(() => { + titleDebounceTimer = null; + emitTitleChange(pendingTitle); + }, TERMINAL_TITLE_DEBOUNCE_MS); + }); + + function buildExitInfo(input?: { exitCode?: number | null; signal?: number | null }): TerminalExitInfo { + const lastOutputLines = extractLastOutputLines(terminal, TERMINAL_EXIT_OUTPUT_LINE_LIMIT); + return { + exitCode: input?.exitCode ?? null, + signal: input?.signal && input.signal > 0 ? input.signal : null, + lastOutputLines: + lastOutputLines.length > 0 + ?
lastOutputLines + : extractLastOutputLinesFromText(recentOutputText, TERMINAL_EXIT_OUTPUT_LINE_LIMIT), + }; + } + + function emitExit(info: TerminalExitInfo): void { if (exitEmitted) { return; } exitEmitted = true; + exitInfo = info; for (const listener of Array.from(exitListeners)) { try { - listener(); + listener(info); } catch { // no-op } @@ -382,14 +638,24 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { if (killed) return; + recentOutputText = `${recentOutputText}${data}`; + if (recentOutputText.length > TERMINAL_EXIT_OUTPUT_CHAR_LIMIT) { + recentOutputText = recentOutputText.slice(-TERMINAL_EXIT_OUTPUT_CHAR_LIMIT); + } terminal.write(data, () => { if (disposed || killed) { return; @@ -400,9 +666,14 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { - ptyProcess.onExit(() => { + ptyProcess.onExit((event) => { killed = true; - emitExit(); + emitExit( + buildExitInfo({ + exitCode: event.exitCode, + signal: event.signal, + }), + ); disposeResources(); }); @@ -413,6 +684,7 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { - function onExit(listener: () => void): () => void { + function onExit(listener: (info: TerminalExitInfo) => void): () => void { if (killed) { queueMicrotask(() => { try { - listener(); + listener(exitInfo ??
buildExitInfo()); } catch { // no-op } @@ -473,11 +745,38 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { + function onTitleChange(listener: (title?: string) => void): () => void { + titleChangeListeners.add(listener); + if (title !== undefined) { + queueMicrotask(() => { + if (disposed || !titleChangeListeners.has(listener)) { + return; + } + try { + listener(title); + } catch { + // no-op + } + }); + } + return () => { + titleChangeListeners.delete(listener); + }; + } + + function getTitle(): string | undefined { + return title; + } + + function getExitInfo(): TerminalExitInfo | null { + return exitInfo; + } + function kill(): void { if (!killed) { killed = true; ptyProcess.kill(); - emitExit(); + emitExit(buildExitInfo()); } disposeResources(); } @@ -492,8 +791,11 @@ export async function createTerminal(options: CreateTerminalOptions): Promise<TerminalSession> { repoDir = setup.repoDir; paseoHome = join(tempDir, "paseo-home"); __resetGhPathCacheForTests(); + __resetCheckoutShortstatCacheForTests(); __resetPullRequestStatusCacheForTests(); }); afterEach(() => { __resetGhPathCacheForTests(); + __resetCheckoutShortstatCacheForTests(); __resetPullRequestStatusCacheForTests(); rmSync(tempDir, { recursive: true, force: true }); }); @@ -253,6 +259,24 @@ const x = 1; expect(shortstat).toEqual({ additions: 1, deletions: 0 }); }); + it("warms shortstat cache in the background without blocking listing callers", async () => { + expect(getCachedCheckoutShortstat(repoDir)).toBeUndefined(); + + warmCheckoutShortstatInBackground(repoDir); + + // A repo with no origin/main computes to null, but null should still be cached.
+ for (let attempts = 0; attempts < 20; attempts += 1) { + const cached = getCachedCheckoutShortstat(repoDir); + if (cached !== undefined) { + expect(cached).toBeNull(); + return; + } + await sleep(25); + } + + throw new Error("shortstat background warm did not populate cache in time"); + }); + it("commits messages with quotes safely", async () => { const message = `He said "hello" and it's fine`; writeFileSync(join(repoDir, "file.txt"), "quoted\n"); @@ -614,6 +638,101 @@ const x = 1; } }); + it("parses real gh status check rollup output and dedupes by latest check run", () => { + expect( + parseStatusCheckRollup([ + { + __typename: "CheckRun", + completedAt: "2026-04-02T13:53:59Z", + conclusion: "SUCCESS", + detailsUrl: "https://github.com/org/repo/actions/runs/123", + name: "review_app", + startedAt: "2026-04-02T13:49:31Z", + status: "COMPLETED", + workflowName: "Deploy PR Preview", + }, + { + __typename: "CheckRun", + completedAt: "2026-04-02T13:58:59Z", + conclusion: "FAILURE", + detailsUrl: "https://github.com/org/repo/actions/runs/124", + name: "review_app", + startedAt: "2026-04-02T13:55:31Z", + status: "COMPLETED", + }, + ]), + ).toEqual([ + { + name: "review_app", + status: "failure", + url: "https://github.com/org/repo/actions/runs/124", + }, + ]); + }); + + it("parses mixed check run and status context entries", () => { + expect( + parseStatusCheckRollup([ + { + __typename: "CheckRun", + name: "unit-tests", + status: "IN_PROGRESS", + conclusion: null, + detailsUrl: "https://github.com/org/repo/actions/runs/200", + startedAt: "2026-04-02T13:49:31Z", + }, + { + __typename: "StatusContext", + context: "lint", + state: "SUCCESS", + targetUrl: "https://github.com/org/repo/status/300", + createdAt: "2026-04-02T13:48:00Z", + }, + ]), + ).toEqual([ + { + name: "unit-tests", + status: "pending", + url: "https://github.com/org/repo/actions/runs/200", + }, + { + name: "lint", + status: "success", + url: "https://github.com/org/repo/status/300", + }, + ]); + }); + + 
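The rollup expectations above reduce to two steps: mapping GitHub's `status`/`conclusion` pairs onto the simplified check status, and keeping only one entry per check name. A hedged sketch of both, under assumed GitHub semantics (a check run only has a final conclusion once its status is `COMPLETED`); the helper names here are hypothetical, and the real logic lives in `parseStatusCheckRollup`:

```typescript
// Sketch of the two steps the rollup tests exercise. Hypothetical helpers.
type SimpleStatus = "success" | "failure" | "pending" | "skipped" | "cancelled";

function toSimpleStatus(status: string | null, conclusion: string | null): SimpleStatus {
  if (status !== "COMPLETED") {
    return "pending"; // QUEUED, IN_PROGRESS, etc. have no conclusion yet
  }
  switch (conclusion) {
    case "SUCCESS":
      return "success";
    case "SKIPPED":
      return "skipped";
    case "CANCELLED":
      return "cancelled";
    default:
      return "failure"; // FAILURE, TIMED_OUT, and other terminal states
  }
}

// Last-write-wins dedupe: re-runs of the same check name replace earlier entries.
function dedupeByName<T extends { name: string }>(checks: T[]): T[] {
  const byName = new Map<string, T>();
  for (const check of checks) {
    byName.set(check.name, check); // later entries win
  }
  return [...byName.values()];
}
```

Last-write-wins matches the `review_app` test above, where the later FAILURE run replaces the earlier SUCCESS; the real implementation may instead order candidates by `startedAt` before picking the latest.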
it("returns an empty list for nullish or empty status check rollups", () => { + expect(parseStatusCheckRollup(undefined)).toEqual([]); + expect(parseStatusCheckRollup(null)).toEqual([]); + expect(parseStatusCheckRollup([])).toEqual([]); + }); + + it("ignores unknown status check rollup node types", () => { + expect( + parseStatusCheckRollup([ + { + __typename: "Commit", + oid: "abc123", + }, + { + __typename: "CheckRun", + name: "build", + status: "COMPLETED", + conclusion: "SUCCESS", + detailsUrl: "https://github.com/org/repo/actions/runs/500", + }, + ]), + ).toEqual([ + { + name: "build", + status: "success", + url: "https://github.com/org/repo/actions/runs/500", + }, + ]); + }); + it("returns merged PR status when no open PR exists for the current branch", async () => { execSync("git checkout -b feature", { cwd: repoDir }); execSync("git remote add origin https://github.com/getpaseo/paseo.git", { cwd: repoDir }); @@ -632,8 +751,8 @@ const x = 1; " exit 0", "fi", 'args="$*"', - 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt" ]]; then', - ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"closed","baseRefName":"main","headRefName":"feature","mergedAt":"2026-02-18T00:00:00Z"}\'', + 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt,statusCheckRollup,reviewDecision" ]]; then', + ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"closed","baseRefName":"main","headRefName":"feature","mergedAt":"2026-02-18T00:00:00Z","statusCheckRollup":[],"reviewDecision":""}\'', " exit 0", "fi", 'echo "unexpected gh args: $args" >&2', @@ -678,8 +797,8 @@ const x = 1; " exit 0", "fi", 'args="$*"', - 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt" ]]; then', - ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/999","title":"Closed without 
merge","state":"closed","baseRefName":"main","headRefName":"feature","mergedAt":null}\'', + 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt,statusCheckRollup,reviewDecision" ]]; then', + ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/999","title":"Closed without merge","state":"closed","baseRefName":"main","headRefName":"feature","mergedAt":null,"statusCheckRollup":[],"reviewDecision":""}\'', " exit 0", "fi", 'echo "unexpected gh args: $args" >&2', @@ -727,10 +846,10 @@ const x = 1; " exit 0", "fi", 'args="$*"', - 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt" ]]; then', + 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt,statusCheckRollup,reviewDecision" ]]; then', ' count="$(cat "$count_file")"', ' printf "%s\\n" "$((count + 1))" > "$count_file"', - ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null}\'', + ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null,"statusCheckRollup":[],"reviewDecision":""}\'', " exit 0", "fi", 'echo "unexpected gh args: $args" >&2', @@ -775,11 +894,11 @@ const x = 1; " exit 0", "fi", 'args="$*"', - 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt" ]]; then', + 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt,statusCheckRollup,reviewDecision" ]]; then', ' count="$(cat "$count_file")"', ' next="$((count + 1))"', ' printf "%s\\n" "$next" > "$count_file"', - ' printf \'{"url":"https://github.com/getpaseo/paseo/pull/%s","title":"Ship feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null}\\n\' "$next"', + ' printf \'{"url":"https://github.com/getpaseo/paseo/pull/%s","title":"Ship 
feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null,"statusCheckRollup":[],"reviewDecision":""}\\n\' "$next"', " exit 0", "fi", 'echo "unexpected gh args: $args" >&2', @@ -826,11 +945,11 @@ const x = 1; " exit 0", "fi", 'args="$*"', - 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt" ]]; then', + 'if [[ "$args" == "pr view --json url,title,state,baseRefName,headRefName,mergedAt,statusCheckRollup,reviewDecision" ]]; then', ' count="$(cat "$count_file")"', ' printf "%s\\n" "$((count + 1))" > "$count_file"', " sleep 0.2", - ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null}\'', + ' echo \'{"url":"https://github.com/getpaseo/paseo/pull/123","title":"Ship feature","state":"OPEN","baseRefName":"main","headRefName":"feature","mergedAt":null,"statusCheckRollup":[],"reviewDecision":""}\'', " exit 0", "fi", 'echo "unexpected gh args: $args" >&2', diff --git a/packages/server/src/utils/checkout-git.ts b/packages/server/src/utils/checkout-git.ts index 37c50ea2c..3ca4a4115 100644 --- a/packages/server/src/utils/checkout-git.ts +++ b/packages/server/src/utils/checkout-git.ts @@ -4,6 +4,7 @@ import { resolve, dirname, basename } from "path"; import { realpathSync } from "fs"; import { open as openFile, stat as statFile } from "fs/promises"; import { TTLCache } from "@isaacs/ttlcache"; +import { z } from "zod"; import type { ParsedDiffFile } from "../server/utils/diff-highlighter.js"; import { parseAndHighlightDiff } from "../server/utils/diff-highlighter.js"; import { findExecutable } from "./executable.js"; @@ -20,11 +21,16 @@ const READ_ONLY_GIT_ENV: NodeJS.ProcessEnv = { const SMALL_OUTPUT_MAX_BUFFER = 20 * 1024 * 1024; // 20MB const DEFAULT_PULL_REQUEST_STATUS_CACHE_TTL_MS = 30_000; const PULL_REQUEST_STATUS_CACHE_MAX = 1_000; +const DEFAULT_SHORTSTAT_CACHE_TTL_MS = 15_000; +const SHORTSTAT_CACHE_MAX = 
1_000; let pullRequestStatusCacheTtlMs = DEFAULT_PULL_REQUEST_STATUS_CACHE_TTL_MS; let pullRequestStatusCache = createPullRequestStatusCache(pullRequestStatusCacheTtlMs); const pullRequestStatusInFlight = new Map<string, Promise<PullRequestStatusResult>>(); let cachedGhPath: string | null | undefined = undefined; +let shortstatCacheTtlMs = DEFAULT_SHORTSTAT_CACHE_TTL_MS; +let shortstatCache = createShortstatCache(shortstatCacheTtlMs); +const shortstatInFlight = new Map<string, Promise<CheckoutShortstat | null>>(); function createPullRequestStatusCache(ttlMs: number) { return new TTLCache<string, PullRequestStatusResult>({ @@ -34,10 +40,22 @@ function createPullRequestStatusCache(ttlMs: number) { }); } +function createShortstatCache(ttlMs: number) { + return new TTLCache<string, CheckoutShortstat | null>({ + ttl: ttlMs, + max: SHORTSTAT_CACHE_MAX, + checkAgeOnGet: true, + }); +} + function getPullRequestStatusCacheKey(cwd: string): string { return resolve(cwd); } +function getShortstatCacheKey(cwd: string): string { + return resolve(cwd); +} + export function __resetPullRequestStatusCacheForTests(): void { pullRequestStatusCache.clear(); pullRequestStatusCache.cancelTimer(); @@ -62,6 +80,22 @@ export function __setGhPathForTests(path: string | null): void { cachedGhPath = path; } +export function __resetCheckoutShortstatCacheForTests(): void { + shortstatCache.clear(); + shortstatCache.cancelTimer(); + shortstatCacheTtlMs = DEFAULT_SHORTSTAT_CACHE_TTL_MS; + shortstatCache = createShortstatCache(shortstatCacheTtlMs); + shortstatInFlight.clear(); +} + +export function __setCheckoutShortstatCacheTtlForTests(ttlMs: number): void { + shortstatCache.clear(); + shortstatCache.cancelTimer(); + shortstatCacheTtlMs = ttlMs; + shortstatCache = createShortstatCache(ttlMs); + shortstatInFlight.clear(); +} + async function execGit( command: string, options: { cwd: string; env?: NodeJS.ProcessEnv }, @@ -1222,7 +1256,7 @@ export interface CheckoutShortstat { deletions: number; } -export async function getCheckoutShortstat( +async function getCheckoutShortstatUncached( cwd: string, context?: CheckoutContext, ): Promise<CheckoutShortstat | null> { @@
-1301,6 +1335,64 @@ export async function getCheckoutShortstat( } } +function getOrLoadCheckoutShortstat( + cwd: string, + context?: CheckoutContext, +): Promise<CheckoutShortstat | null> { + const cacheKey = getShortstatCacheKey(cwd); + const cached = shortstatCache.get(cacheKey); + if (cached !== undefined) { + return Promise.resolve(cached); + } + + const existing = shortstatInFlight.get(cacheKey); + if (existing) { + return existing; + } + + const load = getCheckoutShortstatUncached(cwd, context) + .then((shortstat) => { + shortstatCache.set(cacheKey, shortstat); + return shortstat; + }) + .finally(() => { + shortstatInFlight.delete(cacheKey); + }); + + shortstatInFlight.set(cacheKey, load); + return load; +} + +export async function getCheckoutShortstat( + cwd: string, + context?: CheckoutContext, +): Promise<CheckoutShortstat | null> { + return getOrLoadCheckoutShortstat(cwd, context); +} + +export function getCachedCheckoutShortstat(cwd: string): CheckoutShortstat | null | undefined { + return shortstatCache.get(getShortstatCacheKey(cwd)); +} + +export function warmCheckoutShortstatInBackground( + cwd: string, + context?: CheckoutContext, + onComplete?: () => void, +): void { + const cacheKey = getShortstatCacheKey(cwd); + if (shortstatCache.get(cacheKey) !== undefined || shortstatInFlight.has(cacheKey)) { + return; + } + + void getOrLoadCheckoutShortstat(cwd, context) + .then(() => { + onComplete?.(); + }) + .catch(() => { + // Non-critical: keep listing path resilient even if git commands fail. 
+ }); +} + export async function getCheckoutDiff( cwd: string, compare: CheckoutDiffCompare, @@ -1776,6 +1868,9 @@ export interface PullRequestStatus { baseRefName: string; headRefName: string; isMerged: boolean; + checks?: PullRequestCheck[]; + checksStatus?: ChecksStatus; + reviewDecision?: ReviewDecision; } export interface PullRequestStatusResult { @@ -1783,6 +1878,63 @@ githubFeaturesEnabled: boolean; } +export type PullRequestCheck = { + name: string; + status: "success" | "failure" | "pending" | "skipped" | "cancelled"; + url: string | null; +}; + +export type ChecksStatus = "none" | "pending" | "success" | "failure"; + +export type ReviewDecision = "approved" | "changes_requested" | "pending" | null; + +const CheckRunNodeSchema = z.object({ + __typename: z.literal("CheckRun"), + name: z.string(), + conclusion: z.string().nullable().optional(), + status: z.string().nullable().optional(), + detailsUrl: z.string().nullable().optional(), + startedAt: z.string().nullable().optional(), + completedAt: z.string().nullable().optional(), + checkSuite: z + .object({ + workflowRun: z + .object({ + databaseId: z.number().nullable().optional(), + }) + .nullable() + .optional(), + }) + .nullable() + .optional(), +}); + +const StatusContextNodeSchema = z.object({ + __typename: z.literal("StatusContext"), + context: z.string(), + state: z.string().nullable().optional(), + targetUrl: z.string().nullable().optional(), + createdAt: z.string().nullable().optional(), +}); + +const StatusCheckRollupNodeSchema = z.discriminatedUnion("__typename", [ + CheckRunNodeSchema, + StatusContextNodeSchema, +]); + +const StatusCheckRollupArraySchema = z.array(z.unknown()); +const LegacyStatusCheckRollupSchema = z.object({ + contexts: z.array(z.unknown()), +}); + +const ReviewDecisionSchema = z + .enum(["APPROVED", "CHANGES_REQUESTED", "REVIEW_REQUIRED"]) + .nullable() + .catch(null); + +type CheckRunNode = z.infer<typeof CheckRunNodeSchema>; +type StatusContextNode = z.infer<typeof StatusContextNodeSchema>; 
+ function resolveGhPath(): string { if (cachedGhPath === undefined) { cachedGhPath = findExecutable("gh"); @@ -1814,6 +1966,159 @@ function isGhAuthError(error: unknown): boolean { ); } +function mapCheckRunStatus( + status: unknown, + conclusion: unknown, +): PullRequestCheck["status"] { + if (status !== "COMPLETED") { + return "pending"; + } + + switch (conclusion) { + case "SUCCESS": + return "success"; + case "FAILURE": + case "TIMED_OUT": + case "ACTION_REQUIRED": + return "failure"; + case "CANCELLED": + return "cancelled"; + case "SKIPPED": + case "NEUTRAL": + return "skipped"; + default: + return "pending"; + } +} + +function mapStatusContextState(state: unknown): PullRequestCheck["status"] { + switch (state) { + case "SUCCESS": + return "success"; + case "FAILURE": + case "ERROR": + return "failure"; + case "EXPECTED": + case "PENDING": + return "pending"; + default: + return "pending"; + } +} + +function getCheckRunRecency(context: CheckRunNode): number { + const workflowRunId = context.checkSuite?.workflowRun?.databaseId; + if (typeof workflowRunId === "number") { + return workflowRunId; + } + + const timestamp = + typeof context.completedAt === "string" + ? context.completedAt + : typeof context.startedAt === "string" + ? context.startedAt + : null; + if (!timestamp) { + return 0; + } + + const time = Date.parse(timestamp); + return Number.isNaN(time) ? 0 : time; +} + +function getStatusContextRecency(context: StatusContextNode): number { + if (typeof context.createdAt !== "string" || context.createdAt.length === 0) { + return 0; + } + + const time = Date.parse(context.createdAt); + return Number.isNaN(time) ? 
0 : time; +} + +export function parseStatusCheckRollup(value: unknown): PullRequestCheck[] { + const directContexts = StatusCheckRollupArraySchema.safeParse(value); + if (!directContexts.success) { + const legacyContexts = LegacyStatusCheckRollupSchema.safeParse(value); + if (!legacyContexts.success) { + return []; + } + + return parseStatusCheckRollup(legacyContexts.data.contexts); + } + const contexts = directContexts.data; + + const dedupedChecks = new Map< + string, + PullRequestCheck & { + recency: number; + } + >(); + + for (const entry of contexts) { + const parsed = StatusCheckRollupNodeSchema.safeParse(entry); + if (!parsed.success) { + continue; + } + + const context = parsed.data; + let check: (PullRequestCheck & { recency: number }) | null = null; + + if (context.__typename === "CheckRun") { + check = { + name: context.name, + status: mapCheckRunStatus(context.status, context.conclusion), + url: typeof context.detailsUrl === "string" ? context.detailsUrl : null, + recency: getCheckRunRecency(context), + }; + } else if (context.__typename === "StatusContext") { + check = { + name: context.context, + status: mapStatusContextState(context.state), + url: typeof context.targetUrl === "string" ? 
context.targetUrl : null, + recency: getStatusContextRecency(context), + }; + } + + if (!check) { + continue; + } + + const existing = dedupedChecks.get(check.name); + if (!existing || check.recency > existing.recency) { + dedupedChecks.set(check.name, check); + } + } + + return Array.from(dedupedChecks.values(), ({ recency: _recency, ...check }) => check); +} + +function computeChecksStatus(checks: PullRequestCheck[]): ChecksStatus { + if (checks.length === 0) { + return "none"; + } + if (checks.some((check) => check.status === "failure")) { + return "failure"; + } + if (checks.some((check) => check.status === "pending")) { + return "pending"; + } + return "success"; +} + +function mapReviewDecision(value: unknown): ReviewDecision { + const reviewDecision = ReviewDecisionSchema.parse(value); + if (reviewDecision === "APPROVED") { + return "approved"; + } + if (reviewDecision === "CHANGES_REQUESTED") { + return "changes_requested"; + } + if (reviewDecision === "REVIEW_REQUIRED") { + return "pending"; + } + return null; +} + async function resolveGitHubRepo(cwd: string): Promise { try { const { stdout } = await execAsync("git config --get remote.origin.url", { @@ -1946,7 +2251,7 @@ async function getPullRequestStatusUncached(cwd: string): Promise 0 ? 
pr.state.toLowerCase() : ""; + const checks = parseStatusCheckRollup(pr.statusCheckRollup); + const reviewDecision = mapReviewDecision(pr.reviewDecision); return { status: { url: pr.url, @@ -1972,6 +2279,9 @@ async function getPullRequestStatusUncached(cwd: string): Promise { + it("slugifies service names with spaces on default branches", () => { + expect(buildServiceHostname(null, "npm run dev")).toBe("npm-run-dev.localhost"); + }); + + it("slugifies service names with special characters", () => { + expect(buildServiceHostname(null, "Web/API @ Dev")).toBe("web-api-dev.localhost"); + }); + + it("omits the branch prefix for main and master", () => { + expect(buildServiceHostname("main", "npm run dev")).toBe("npm-run-dev.localhost"); + expect(buildServiceHostname("master", "npm run dev")).toBe("npm-run-dev.localhost"); + }); + + it("adds a slugified branch prefix for non-default branches", () => { + expect(buildServiceHostname("feature/cool stuff", "api")).toBe( + "feature-cool-stuff.api.localhost", + ); + }); + + it("slugifies both the branch name and service name together", () => { + expect(buildServiceHostname("feat/add auth", "npm run dev")).toBe( + "feat-add-auth.npm-run-dev.localhost", + ); + }); +}); diff --git a/packages/server/src/utils/service-hostname.ts b/packages/server/src/utils/service-hostname.ts new file mode 100644 index 000000000..49687d8a6 --- /dev/null +++ b/packages/server/src/utils/service-hostname.ts @@ -0,0 +1,13 @@ +import { slugify } from "./worktree.js"; + +export function buildServiceHostname(branchName: string | null, serviceName: string): string { + const serviceHostnameLabel = slugify(serviceName); + const isDefaultBranch = + branchName === null || branchName === "main" || branchName === "master"; + + if (isDefaultBranch) { + return `${serviceHostnameLabel}.localhost`; + } + + return `${slugify(branchName)}.${serviceHostnameLabel}.localhost`; +} diff --git a/packages/server/src/utils/string-command-shell.test.ts 
b/packages/server/src/utils/string-command-shell.test.ts new file mode 100644 index 000000000..c58078af8 --- /dev/null +++ b/packages/server/src/utils/string-command-shell.test.ts @@ -0,0 +1,36 @@ +import { describe, expect, it } from "vitest"; + +import { buildStringCommandShellInvocation } from "./string-command-shell.js"; + +describe("buildStringCommandShellInvocation", () => { + it("uses bash login-command semantics on unix platforms", () => { + expect( + buildStringCommandShellInvocation({ + command: 'echo "hello"', + platform: "darwin", + }), + ).toEqual({ + shell: "/bin/bash", + args: ["-lc", 'echo "hello"'], + }); + }); + + it("uses powershell command semantics on windows", () => { + expect( + buildStringCommandShellInvocation({ + command: "Write-Output 'hello'", + platform: "win32", + }), + ).toEqual({ + shell: "powershell", + args: [ + "-NoProfile", + "-NonInteractive", + "-ExecutionPolicy", + "Bypass", + "-Command", + "Write-Output 'hello'", + ], + }); + }); +}); diff --git a/packages/server/src/utils/string-command-shell.ts b/packages/server/src/utils/string-command-shell.ts new file mode 100644 index 000000000..7d773296b --- /dev/null +++ b/packages/server/src/utils/string-command-shell.ts @@ -0,0 +1,34 @@ +export interface BuildStringCommandShellInvocationOptions { + command: string; + platform?: NodeJS.Platform; +} + +export interface StringCommandShellInvocation { + shell: string; + args: string[]; +} + +export function buildStringCommandShellInvocation( + options: BuildStringCommandShellInvocationOptions, +): StringCommandShellInvocation { + const platform = options.platform ?? 
process.platform; + + if (platform === "win32") { + return { + shell: "powershell", + args: [ + "-NoProfile", + "-NonInteractive", + "-ExecutionPolicy", + "Bypass", + "-Command", + options.command, + ], + }; + } + + return { + shell: "/bin/bash", + args: ["-lc", options.command], + }; +} diff --git a/packages/server/src/utils/worktree-shell-selection.test.ts b/packages/server/src/utils/worktree-shell-selection.test.ts new file mode 100644 index 000000000..bf9d0a41e --- /dev/null +++ b/packages/server/src/utils/worktree-shell-selection.test.ts @@ -0,0 +1,81 @@ +import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs"; +import { tmpdir } from "node:os"; +import { join } from "node:path"; +import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; + +const execFileMock = vi.hoisted(() => vi.fn()); + +vi.mock("child_process", async () => { + const actual = await vi.importActual("child_process"); + return { + ...actual, + execFile: execFileMock, + }; +}); + +describe("worktree shell selection", () => { + const originalPlatform = process.platform; + + beforeEach(() => { + execFileMock.mockReset(); + execFileMock.mockImplementation( + (_file: string, _args: string[], _options: unknown, callback?: (error: Error | null, stdout: string, stderr: string) => void) => { + callback?.(null, "", ""); + return {}; + }, + ); + }); + + afterEach(() => { + Object.defineProperty(process, "platform", { + value: originalPlatform, + configurable: true, + }); + vi.resetModules(); + }); + + it("routes teardown command execution through powershell on win32", async () => { + Object.defineProperty(process, "platform", { + value: "win32", + configurable: true, + }); + + const worktreePath = mkdtempSync(join(tmpdir(), "worktree-shell-selection-")); + try { + mkdirSync(join(worktreePath, ".git"), { recursive: true }); + writeFileSync( + join(worktreePath, "paseo.json"), + JSON.stringify({ + worktree: { + teardown: ["Write-Output 'teardown'"], + }, + }), + "utf8", + ); 
+ + const { runWorktreeTeardownCommands } = await import("./worktree.js"); + await runWorktreeTeardownCommands({ + worktreePath, + repoRootPath: worktreePath, + branchName: "main", + }); + + expect(execFileMock).toHaveBeenCalledTimes(1); + expect(execFileMock).toHaveBeenCalledWith( + "powershell", + [ + "-NoProfile", + "-NonInteractive", + "-ExecutionPolicy", + "Bypass", + "-Command", + "Write-Output 'teardown'", + ], + expect.objectContaining({ cwd: worktreePath }), + expect.any(Function), + ); + } finally { + rmSync(worktreePath, { recursive: true, force: true }); + } + }); +}); diff --git a/packages/server/src/utils/worktree.ts b/packages/server/src/utils/worktree.ts index 35ce3671a..fd7e2bbfe 100644 --- a/packages/server/src/utils/worktree.ts +++ b/packages/server/src/utils/worktree.ts @@ -1,10 +1,13 @@ -import { exec, spawn } from "child_process"; +import { exec, execFile, spawn } from "child_process"; import { promisify } from "util"; import { existsSync, mkdirSync, readFileSync, realpathSync, rmSync, statSync } from "fs"; import { join, basename, dirname, resolve, sep } from "path"; import net from "node:net"; import { createHash } from "node:crypto"; +import * as pty from "node-pty"; import { createNameId } from "mnemonic-id"; +import stripAnsi from "strip-ansi"; +import { buildStringCommandShellInvocation } from "./string-command-shell.js"; import { normalizeBaseRefName, readPaseoWorktreeMetadata, @@ -13,6 +16,7 @@ import { writePaseoWorktreeRuntimeMetadata, } from "./worktree-metadata.js"; import { resolvePaseoHome } from "../server/paseo-home.js"; +import { ensureNodePtySpawnHelperExecutableForCurrentPlatform } from "../terminal/terminal.js"; interface PaseoConfig { worktree?: { @@ -20,9 +24,11 @@ teardown?: string[]; terminals?: WorktreeTerminalConfig[]; }; + services?: Record<string, ServiceConfig>; } const execAsync = promisify(exec); +const execFileAsync = promisify(execFile); const READ_ONLY_GIT_ENV: NodeJS.ProcessEnv = { ...process.env, 
GIT_OPTIONAL_LOCKS: "0", @@ -84,6 +90,11 @@ export interface WorktreeTerminalConfig { command: string; } +export interface ServiceConfig { + command: string; + port?: number; // explicit port override, otherwise auto-assigned +} + export class WorktreeSetupError extends Error { readonly results: WorktreeSetupCommandResult[]; @@ -194,16 +205,100 @@ export function getWorktreeTerminalSpecs(repoRoot: string): WorktreeTerminalConf return specs; } +export function getServiceConfigs(repoRoot: string): Map<string, ServiceConfig> { + const config = readPaseoConfig(repoRoot); + const services = config?.services; + if (!services || typeof services !== "object") { + return new Map(); + } + + const result = new Map<string, ServiceConfig>(); + for (const [name, entry] of Object.entries(services)) { + if (!entry || typeof entry !== "object") { + continue; + } + + const rawCommand = entry.command; + if (typeof rawCommand !== "string") { + continue; + } + const command = rawCommand.trim(); + if (!command) { + continue; + } + + const serviceConfig: ServiceConfig = { command }; + + if (typeof entry.port === "number" && Number.isFinite(entry.port)) { + serviceConfig.port = entry.port; + } + + result.set(name, serviceConfig); + } + + return result; +} + +export function processCarriageReturns(text: string): string { + if (!text.includes("\r")) { + return text; + } + + const output: string[] = []; + let line: string[] = []; + let cursor = 0; + + const flushLine = () => { + output.push(line.join("")); + line = []; + cursor = 0; + }; + + for (let index = 0; index < text.length; index += 1) { + const char = text[index]; + + if (char === "\r") { + if (text[index + 1] === "\n") { + flushLine(); + output.push("\n"); + index += 1; + continue; + } + cursor = 0; + continue; + } + + if (char === "\n") { + flushLine(); + output.push("\n"); + continue; + } + + if (cursor < line.length) { + line[cursor] = char; + } else { + line.push(char); + } + cursor += 1; + } + + if (line.length > 0) { + output.push(line.join("")); + } + + return 
output.join(""); } async function execSetupCommand( command: string, options: { cwd: string; env: NodeJS.ProcessEnv }, ): Promise<WorktreeSetupCommandResult> { const startedAt = Date.now(); + const shellInvocation = buildStringCommandShellInvocation({ command }); try { - const { stdout, stderr } = await execAsync(command, { + const { stdout, stderr } = await execFileAsync(shellInvocation.shell, shellInvocation.args, { cwd: options.cwd, env: options.env, - shell: "/bin/bash", }); return { command, @@ -239,6 +334,27 @@ const stdoutChunks: string[] = []; const stderrChunks: string[] = []; let settled = false; + const emitOutput = (stream: "stdout" | "stderr", chunk: string) => { + const text = stripAnsi(chunk); + if (!text) { + return; + } + if (stream === "stdout") { + stdoutChunks.push(text); + } else { + stderrChunks.push(text); + } + options.onEvent?.({ + type: "output", + index: options.index, + total: options.total, + command: options.command, + cwd: options.cwd, + stream, + chunk: text, + }); + }; + const finish = (exitCode: number | null) => { if (settled) { return; @@ -274,48 +390,54 @@ cwd: options.cwd, }); - const child = spawn("/bin/bash", ["-lc", options.command], { - cwd: options.cwd, - env: options.env, - stdio: ["ignore", "pipe", "pipe"], - }); - - child.stdout?.on("data", (chunk: Buffer | string) => { - const text = chunk.toString(); - stdoutChunks.push(text); - options.onEvent?.({ - type: "output", - index: options.index, - total: options.total, - command: options.command, + const spawnWithPipes = () => { + const shellInvocation = buildStringCommandShellInvocation({ command: options.command }); + const child = spawn(shellInvocation.shell, shellInvocation.args, { cwd: options.cwd, - stream: "stdout", - chunk: text, + env: options.env, + stdio: ["ignore", "pipe", "pipe"], }); - }); - child.stderr?.on("data", (chunk: Buffer | string) => { - const text = chunk.toString(); - stderrChunks.push(text); - 
options.onEvent?.({ - type: "output", - index: options.index, - total: options.total, - command: options.command, + child.stdout?.on("data", (chunk: Buffer | string) => { + emitOutput("stdout", chunk.toString()); + }); + + child.stderr?.on("data", (chunk: Buffer | string) => { + emitOutput("stderr", chunk.toString()); + }); + + child.on("error", (error) => { + emitOutput("stderr", error instanceof Error ? error.message : String(error)); + finish(null); + }); + + child.on("close", (code) => { + finish(typeof code === "number" ? code : null); + }); + }; + + try { + ensureNodePtySpawnHelperExecutableForCurrentPlatform(); + const shellInvocation = buildStringCommandShellInvocation({ command: options.command }); + const terminal = pty.spawn(shellInvocation.shell, shellInvocation.args, { cwd: options.cwd, - stream: "stderr", - chunk: text, + env: options.env, + name: "xterm-color", + cols: 120, + rows: 30, }); - }); - child.on("error", (error) => { - stderrChunks.push(error instanceof Error ? error.message : String(error)); - finish(null); - }); + terminal.onData((data) => { + emitOutput("stdout", data); + }); - child.on("close", (code) => { - finish(typeof code === "number" ? code : null); - }); + terminal.onExit(({ exitCode }) => { + finish(typeof exitCode === "number" ? exitCode : null); + }); + } catch (error) { + emitOutput("stderr", error instanceof Error ? error.message : String(error)); + spawnWithPipes(); + } }); } diff --git a/packages/website/src/components/landing-page.tsx b/packages/website/src/components/landing-page.tsx index 85060e2fe..727926fd1 100644 --- a/packages/website/src/components/landing-page.tsx +++ b/packages/website/src/components/landing-page.tsx @@ -60,6 +60,7 @@ export function LandingPage({ title, subtitle }: LandingPageProps) {
+      <ServiceProxySection /> @@ -489,6 +490,52 @@ function SelfHostedSection() { } +function ServiceProxySection() { + const workspaces = [ + { name: "fix-auth", url: "fix-auth.my-app.localhost" }, + { name: "add-search", url: "add-search.my-app.localhost" }, + { name: "upgrade-deps", url: "upgrade-deps.my-app.localhost" }, + ]; + + return ( + <div> + {/* Project */} + <div> + my-app + </div> + + {/* Workspaces indented */} + <div> + {workspaces.map((ws) => ( + <div key={ws.name}> + <div> + <span>{ws.name}</span> + <span>npm run dev</span> + </div> + <span>{ws.url}</span> + </div> + ))} + </div> + </div>
+ + ); +} + function ShortcutsSection() { const shortcuts = [ { keys: ["⌘", "1-9"], action: "Switch panels" }, diff --git a/scripts/dev.sh b/scripts/dev.sh index d4e8426de..3ddc9104e 100755 --- a/scripts/dev.sh +++ b/scripts/dev.sh @@ -5,17 +5,34 @@ set -e SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" export PATH="$SCRIPT_DIR/../node_modules/.bin:$PATH" -# Use a temporary PASEO_HOME to avoid conflicts between dev instances +# Derive PASEO_HOME: stable name for worktrees, temporary dir otherwise if [ -z "${PASEO_HOME}" ]; then export PASEO_HOME - PASEO_HOME="$(mktemp -d "${TMPDIR:-/tmp}/paseo-dev.XXXXXX")" - trap "rm -rf '$PASEO_HOME'" EXIT + GIT_DIR="$(git rev-parse --git-dir 2>/dev/null || true)" + GIT_COMMON_DIR="$(git rev-parse --git-common-dir 2>/dev/null || true)" + if [ -n "$GIT_DIR" ] && [ -n "$GIT_COMMON_DIR" ] && [ "$GIT_DIR" != "$GIT_COMMON_DIR" ]; then + # Inside a worktree — derive a stable home from the worktree name + WORKTREE_ROOT="$(git rev-parse --show-toplevel)" + WORKTREE_NAME="$(basename "$WORKTREE_ROOT" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9-]/-/g; s/--*/-/g; s/^-//; s/-$//')" + PASEO_HOME="$HOME/.paseo-${WORKTREE_NAME}" + mkdir -p "$PASEO_HOME" + else + PASEO_HOME="$(mktemp -d "${TMPDIR:-/tmp}/paseo-dev.XXXXXX")" + trap "rm -rf '$PASEO_HOME'" EXIT + fi +fi + +# Share speech models with the main install to avoid duplicate downloads +if [ -z "${PASEO_LOCAL_MODELS_DIR}" ]; then + export PASEO_LOCAL_MODELS_DIR="$HOME/.paseo/models/local-speech" + mkdir -p "$PASEO_LOCAL_MODELS_DIR" fi echo "══════════════════════════════════════════════════════" echo " Paseo Dev" echo "══════════════════════════════════════════════════════" echo " Home: ${PASEO_HOME}" +echo " Models: ${PASEO_LOCAL_MODELS_DIR}" echo "══════════════════════════════════════════════════════" # Configure the daemon for the Portless app origin and let the app bootstrap
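For reference, the carriage-return handling added to `packages/server/src/utils/worktree.ts` can be exercised on its own. The sketch below is a standalone copy of `processCarriageReturns` from the diff above (only the `export` keyword dropped so it runs in isolation): a bare `\r` resets the cursor to column 0 so later characters overwrite the current line, as progress bars do, while `\r\n` is treated as an ordinary newline, so captured setup output keeps only the final state of each line:

```typescript
// Standalone copy of processCarriageReturns from the worktree.ts diff above.
function processCarriageReturns(text: string): string {
  if (!text.includes("\r")) {
    return text;
  }

  const output: string[] = [];
  let line: string[] = [];
  let cursor = 0;

  const flushLine = () => {
    output.push(line.join(""));
    line = [];
    cursor = 0;
  };

  for (let index = 0; index < text.length; index += 1) {
    const char = text[index];

    if (char === "\r") {
      // "\r\n" is a plain newline; a bare "\r" rewinds the cursor to column 0.
      if (text[index + 1] === "\n") {
        flushLine();
        output.push("\n");
        index += 1;
        continue;
      }
      cursor = 0;
      continue;
    }

    if (char === "\n") {
      flushLine();
      output.push("\n");
      continue;
    }

    // Overwrite in place while the cursor is inside the line, append past the end.
    if (cursor < line.length) {
      line[cursor] = char;
    } else {
      line.push(char);
    }
    cursor += 1;
  }

  if (line.length > 0) {
    output.push(line.join(""));
  }

  return output.join("");
}

// "building\rdone!" overwrites the first five characters, leaving "done!ing" —
// the trailing "ing" survives because only overwritten columns change.
```

Partial overwrites intentionally leave stale trailing characters (as a real terminal would on screen), which is why stripping ANSI sequences happens separately via `stripAnsi` in the streamed-output path.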