From c8f484831cd5725e4d88e57f294b1619d0f450d0 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 15:35:46 +0200 Subject: [PATCH 01/48] Unblock the next Guardex npm publish after 7.0.15 was already taken (#239) npm rejected republishing 7.0.15, so this bump moves the package to 7.0.16, adds the matching README release note for the shipped Guardex changes, and updates the release integration test to follow the current package version instead of a stale literal. Constraint: npm will not publish over the already-taken 7.0.15 version Rejected: Leave README and release test expectations on 7.0.15 | publish metadata and release verification would drift Confidence: high Scope-risk: narrow Reversibility: clean Directive: When bumping a published version, keep README release notes and gx release expectations on the same current semver Tested: npm test (150/150); node --check bin/multiagent-safety.js; openspec validate agent-codex-release-guardex-7-0-16-2026-04-21-15-22 --type change --strict; openspec validate --specs; env npm_config_cache=/tmp/guardex-npm-cache npm pack --dry-run Not-tested: Live gh release creation against GitHub Co-authored-by: NagyVikt --- README.md | 6 ++++++ .../.openspec.yaml | 2 ++ .../proposal.md | 15 +++++++++++++ .../specs/release-version-bump/spec.md | 10 +++++++++ .../tasks.md | 21 +++++++++++++++++++ package-lock.json | 4 ++-- package.json | 2 +- test/install.test.js | 4 ++-- 8 files changed, 59 insertions(+), 5 deletions(-) create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/.openspec.yaml create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/proposal.md create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/specs/release-version-bump/spec.md create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/tasks.md diff --git a/README.md b/README.md index 
d335e89..f5c0e1c 100644 --- a/README.md +++ b/README.md @@ -529,6 +529,12 @@ npm pack --dry-run
v7.x +### v7.0.16 +- `gx doctor` now keeps nested repo repair runs visibly progressing, and overlapping integration work stays off the protected base branch instead of trying to merge back on `main`. +- Cleanup and finish flows are less brittle: `codex-agent` no longer waits on PRs that can never exist, and prune cleanup now walks both managed worktree roots so stale sandboxes get removed consistently. +- Mirror-sync diagnostics are quieter: when the mirror PAT is unset, Guardex now skips the sync path instead of marking the run red, and shared `ralplan` lanes stay easier to identify during handoff/debugging. +- Bumped `@imdeadpool/guardex` from `7.0.15` → `7.0.16` after npm rejected a republish over the already-published `7.0.15`. + ### v7.0.15 - `gx doctor` no longer blocks recursive nested protected-repo repairs on child PR merge waits; nested sandboxes now force `--no-wait-for-merge` so the parent repair loop can continue. - `gx setup` can now refresh managed files from protected `main` through a temporary sandbox branch/worktree, sync the managed outputs back to the visible base checkout, and prune the sandbox afterward. 
diff --git a/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/.openspec.yaml b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/proposal.md b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/proposal.md new file mode 100644 index 0000000..c670293 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/proposal.md @@ -0,0 +1,15 @@ +## Why + +- `npm publish` rejected `@imdeadpool/guardex@7.0.15` because that version is already published, so the package needs the next publishable patch version before release can proceed. +- The shipped behavior since `7.0.15` is not captured in the README release notes yet, and `package-lock.json` is also still lagging behind the manifest version. + +## What Changes + +- Bump the package release metadata from `7.0.15` to `7.0.16` in `package.json` and `package-lock.json`. +- Add a `README.md` release-notes entry for `v7.0.16` that summarizes the post-`7.0.15` Guardex behavior now shipping in the package. +- Update the `gx release` integration expectation in `test/install.test.js` so the release workflow tracks the current package version. + +## Impact + +- Unblocks the next npm publish without changing runtime behavior beyond what is already merged on `main`. +- Keeps the packaged version, lockfile metadata, and documented release notes aligned so release state is easier to trust. 
diff --git a/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/specs/release-version-bump/spec.md b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/specs/release-version-bump/spec.md new file mode 100644 index 0000000..168eb9f --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/specs/release-version-bump/spec.md @@ -0,0 +1,10 @@ +## ADDED Requirements + +### Requirement: Release recovery version alignment +The release metadata SHALL move to the next publishable package version when npm rejects the current version as already published. + +#### Scenario: Recover from an already-published npm version +- **GIVEN** `npm publish` rejects the current Guardex version as already published +- **WHEN** maintainers prepare the recovery release +- **THEN** `package.json` and `package-lock.json` SHALL be bumped to the next publishable semver +- **AND** `README.md` SHALL record the new release version with the newly shipped behavior that the package now contains. diff --git a/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/tasks.md b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/tasks.md new file mode 100644 index 0000000..6dfa771 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-16-2026-04-21-15-22/tasks.md @@ -0,0 +1,21 @@ +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-release-guardex-7-0-16-2026-04-21-15-22`. +- [x] 1.2 Define normative requirements in `specs/release-version-bump/spec.md`. + +## 2. Implementation + +- [x] 2.1 Bump `package.json`, `package-lock.json`, and `README.md` to the next publishable Guardex release version. +- [x] 2.2 Update the `gx release` integration expectation in `test/install.test.js` so the release workflow follows the current package version. + +## 3. 
Verification + +- [x] 3.1 Run `npm test`, `node --check bin/multiagent-safety.js`, and `npm pack --dry-run` for the release-only change. `npm test` passed `150/150`; `node --check bin/multiagent-safety.js` passed; `npm pack --dry-run` produced `imdeadpool-guardex-7.0.16.tgz`. +- [x] 3.2 Run `openspec validate agent-codex-release-guardex-7-0-16-2026-04-21-15-22 --type change --strict`. Result: `Change 'agent-codex-release-guardex-7-0-16-2026-04-21-15-22' is valid`. +- [x] 3.3 Run `openspec validate --specs`. Result: `No items found to validate.` + +## 4. Completion + +- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. diff --git a/package-lock.json b/package-lock.json index f6431c3..b38cba3 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.15", + "version": "7.0.16", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "@imdeadpool/guardex", - "version": "7.0.15", + "version": "7.0.16", "license": "MIT", "bin": { "gitguardex": "bin/multiagent-safety.js", diff --git a/package.json b/package.json index 20c1daf..7422cd6 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.15", + "version": "7.0.16", "description": "GitGuardex: hardened multi-agent git guardrails for parallel agent work.", "license": "MIT", "preferGlobal": true, diff --git a/test/install.test.js b/test/install.test.js index cab67ec..9717db2 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -5015,9 +5015,9 @@ exit 1 assert.match(args, new RegExp(`^create$`, 'm')); assert.match(args, new 
RegExp(`^v${escapeRegexLiteral(cliVersion)}$`, 'm')); assert.match(args, /^--repo$\nrecodeee\/gitguardex$/m); - assert.match(args, /^--title$\nv7\.0\.15$/m); + assert.match(args, new RegExp(`^--title$\\nv${escapeRegexLiteral(cliVersion)}$`, 'm')); assert.match(args, /Changes since v7\.0\.12\./); - assert.match(args, /### v7\.0\.15/); + assert.match(args, new RegExp(`### v${escapeRegexLiteral(cliVersion)}`)); assert.match(args, /### v7\.0\.14/); assert.match(args, /### v7\.0\.13/); }); From 9f5a6b2a8795cf6e54f49350ed6c91b15f1293fd Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 21 Apr 2026 15:42:09 +0200 Subject: [PATCH 02/48] Bump next from 15.2.6 to 15.5.15 in /frontend (#229) Bumps [next](https://github.com/vercel/next.js) from 15.2.6 to 15.5.15. - [Release notes](https://github.com/vercel/next.js/releases) - [Changelog](https://github.com/vercel/next.js/blob/canary/release.js) - [Commits](https://github.com/vercel/next.js/compare/v15.2.6...v15.5.15) --- updated-dependencies: - dependency-name: next dependency-version: 15.5.15 dependency-type: direct:production ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- frontend/package-lock.json | 913 +++++++++++++++++-------------------- frontend/package.json | 2 +- 2 files changed, 419 insertions(+), 496 deletions(-) diff --git a/frontend/package-lock.json b/frontend/package-lock.json index 647885d..f7a9f1a 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -9,7 +9,7 @@ "version": "0.1.0", "dependencies": { "express": "4.21.2", - "next": "15.2.6", + "next": "15.5.15", "react": "19.0.3", "react-dom": "19.0.3" }, @@ -265,14 +265,22 @@ "url": "https://github.com/sponsors/nzakas" } }, + "node_modules/@img/colour": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@img/colour/-/colour-1.1.0.tgz", + "integrity": "sha512-Td76q7j57o/tLVdgS746cYARfSyxk8iEfRxewL9h4OMzYhbW4TAcppl0mT4eyqXddh6L/jwoM75mo7ixa/pCeQ==", + "optional": true, + "engines": { + "node": ">=18" + } + }, "node_modules/@img/sharp-darwin-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.33.5.tgz", - "integrity": "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.34.5.tgz", + "integrity": "sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==", "cpu": [ "arm64" ], - "license": "Apache-2.0", "optional": true, "os": [ "darwin" @@ -284,17 +292,16 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-darwin-arm64": "1.0.4" + "@img/sharp-libvips-darwin-arm64": "1.2.4" } }, "node_modules/@img/sharp-darwin-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.33.5.tgz", - "integrity": 
"sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.34.5.tgz", + "integrity": "sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==", "cpu": [ "x64" ], - "license": "Apache-2.0", "optional": true, "os": [ "darwin" @@ -306,17 +313,16 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-darwin-x64": "1.0.4" + "@img/sharp-libvips-darwin-x64": "1.2.4" } }, "node_modules/@img/sharp-libvips-darwin-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.0.4.tgz", - "integrity": "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.2.4.tgz", + "integrity": "sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==", "cpu": [ "arm64" ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "darwin" @@ -326,13 +332,12 @@ } }, "node_modules/@img/sharp-libvips-darwin-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.0.4.tgz", - "integrity": "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.2.4.tgz", + "integrity": "sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==", "cpu": [ "x64" ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "darwin" @@ -342,16 +347,12 @@ } }, "node_modules/@img/sharp-libvips-linux-arm": { - "version": "1.0.5", - 
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.0.5.tgz", - "integrity": "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.2.4.tgz", + "integrity": "sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==", "cpu": [ "arm" ], - "libc": [ - "glibc" - ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -361,16 +362,42 @@ } }, "node_modules/@img/sharp-libvips-linux-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.0.4.tgz", - "integrity": "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.2.4.tgz", + "integrity": "sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==", "cpu": [ "arm64" ], - "libc": [ - "glibc" + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-ppc64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-ppc64/-/sharp-libvips-linux-ppc64-1.2.4.tgz", + "integrity": "sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==", + "cpu": [ + "ppc64" + ], + "optional": true, + "os": [ + "linux" + ], + "funding": { + "url": "https://opencollective.com/libvips" + } + }, + "node_modules/@img/sharp-libvips-linux-riscv64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-riscv64/-/sharp-libvips-linux-riscv64-1.2.4.tgz", + "integrity": 
"sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==", + "cpu": [ + "riscv64" ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -380,16 +407,12 @@ } }, "node_modules/@img/sharp-libvips-linux-s390x": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.0.4.tgz", - "integrity": "sha512-u7Wz6ntiSSgGSGcjZ55im6uvTrOxSIS8/dgoVMoiGE9I6JAfU50yH5BoDlYA1tcuGS7g/QNtetJnxA6QEsCVTA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.2.4.tgz", + "integrity": "sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==", "cpu": [ "s390x" ], - "libc": [ - "glibc" - ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -399,16 +422,12 @@ } }, "node_modules/@img/sharp-libvips-linux-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.0.4.tgz", - "integrity": "sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.2.4.tgz", + "integrity": "sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==", "cpu": [ "x64" ], - "libc": [ - "glibc" - ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -418,16 +437,12 @@ } }, "node_modules/@img/sharp-libvips-linuxmusl-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.0.4.tgz", - "integrity": "sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA==", + "version": "1.2.4", + "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.2.4.tgz", + "integrity": "sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==", "cpu": [ "arm64" ], - "libc": [ - "musl" - ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -437,16 +452,12 @@ } }, "node_modules/@img/sharp-libvips-linuxmusl-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.0.4.tgz", - "integrity": "sha512-viYN1KX9m+/hGkJtvYYp+CCLgnJXwiQB39damAO7WMdKWlIhmYTfHjwSbQeUK/20vY154mwezd9HflVFM1wVSw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.2.4.tgz", + "integrity": "sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==", "cpu": [ "x64" ], - "libc": [ - "musl" - ], - "license": "LGPL-3.0-or-later", "optional": true, "os": [ "linux" @@ -456,16 +467,12 @@ } }, "node_modules/@img/sharp-linux-arm": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.33.5.tgz", - "integrity": "sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.34.5.tgz", + "integrity": "sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==", "cpu": [ "arm" ], - "libc": [ - "glibc" - ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -477,20 +484,37 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-arm": "1.0.5" + "@img/sharp-libvips-linux-arm": "1.2.4" } }, "node_modules/@img/sharp-linux-arm64": { - "version": "0.33.5", - "resolved": 
"https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.33.5.tgz", - "integrity": "sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.34.5.tgz", + "integrity": "sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==", "cpu": [ "arm64" ], - "libc": [ - "glibc" + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-arm64": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-ppc64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-ppc64/-/sharp-linux-ppc64-0.34.5.tgz", + "integrity": "sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==", + "cpu": [ + "ppc64" ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -502,20 +526,37 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-arm64": "1.0.4" + "@img/sharp-libvips-linux-ppc64": "1.2.4" + } + }, + "node_modules/@img/sharp-linux-riscv64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-riscv64/-/sharp-linux-riscv64-0.34.5.tgz", + "integrity": "sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==", + "cpu": [ + "riscv64" + ], + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + }, + "optionalDependencies": { + "@img/sharp-libvips-linux-riscv64": "1.2.4" } }, "node_modules/@img/sharp-linux-s390x": { - "version": "0.33.5", - "resolved": 
"https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.33.5.tgz", - "integrity": "sha512-y/5PCd+mP4CA/sPDKl2961b+C9d+vPAveS33s6Z3zfASk2j5upL6fXVPZi7ztePZ5CuH+1kW8JtvxgbuXHRa4Q==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.34.5.tgz", + "integrity": "sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==", "cpu": [ "s390x" ], - "libc": [ - "glibc" - ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -527,20 +568,16 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-s390x": "1.0.4" + "@img/sharp-libvips-linux-s390x": "1.2.4" } }, "node_modules/@img/sharp-linux-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.33.5.tgz", - "integrity": "sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.34.5.tgz", + "integrity": "sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==", "cpu": [ "x64" ], - "libc": [ - "glibc" - ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -552,20 +589,16 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-x64": "1.0.4" + "@img/sharp-libvips-linux-x64": "1.2.4" } }, "node_modules/@img/sharp-linuxmusl-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.33.5.tgz", - "integrity": "sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.34.5.tgz", + "integrity": 
"sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==", "cpu": [ "arm64" ], - "libc": [ - "musl" - ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -577,20 +610,16 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-arm64": "1.0.4" + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" } }, "node_modules/@img/sharp-linuxmusl-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.33.5.tgz", - "integrity": "sha512-WT+d/cgqKkkKySYmqoZ8y3pxx7lx9vVejxW/W4DOFMYVSkErR+w7mf2u8m/y4+xHe7yY9DAXQMWQhpnMuFfScw==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.34.5.tgz", + "integrity": "sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==", "cpu": [ "x64" ], - "libc": [ - "musl" - ], - "license": "Apache-2.0", "optional": true, "os": [ "linux" @@ -602,20 +631,19 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-x64": "1.0.4" + "@img/sharp-libvips-linuxmusl-x64": "1.2.4" } }, "node_modules/@img/sharp-wasm32": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.33.5.tgz", - "integrity": "sha512-ykUW4LVGaMcU9lu9thv85CbRMAwfeadCJHRsg2GmeRa/cJxsVY9Rbd57JcMxBkKHag5U/x7TSBpScF4U8ElVzg==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.34.5.tgz", + "integrity": "sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==", "cpu": [ "wasm32" ], - "license": "Apache-2.0 AND LGPL-3.0-or-later AND MIT", "optional": true, "dependencies": { - "@emnapi/runtime": "^1.2.0" + "@emnapi/runtime": "^1.7.0" }, "engines": { "node": "^18.17.0 || ^20.3.0 || >=21.0.0" @@ -624,14 +652,31 @@ "url": 
"https://opencollective.com/libvips" } }, + "node_modules/@img/sharp-win32-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-arm64/-/sharp-win32-arm64-0.34.5.tgz", + "integrity": "sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==", + "cpu": [ + "arm64" + ], + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" + } + }, "node_modules/@img/sharp-win32-ia32": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.33.5.tgz", - "integrity": "sha512-T36PblLaTwuVJ/zw/LaH0PdZkRz5rd3SmMHX8GSmR7vtNSP5Z6bQkExdSK7xGWyxLw4sUknBuugTelgw2faBbQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.34.5.tgz", + "integrity": "sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==", "cpu": [ "ia32" ], - "license": "Apache-2.0 AND LGPL-3.0-or-later", "optional": true, "os": [ "win32" @@ -644,13 +689,12 @@ } }, "node_modules/@img/sharp-win32-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.33.5.tgz", - "integrity": "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.34.5.tgz", + "integrity": "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==", "cpu": [ "x64" ], - "license": "Apache-2.0 AND LGPL-3.0-or-later", "optional": true, "os": [ "win32" @@ -676,10 +720,9 @@ } }, "node_modules/@next/env": { - "version": "15.2.6", - "resolved": "https://registry.npmjs.org/@next/env/-/env-15.2.6.tgz", - "integrity": 
"sha512-kp1Mpm4K1IzSSJ5ZALfek0JBD2jBw9VGMXR/aT7ykcA2q/ieDARyBzg+e8J1TkeIb5AFj/YjtZdoajdy5uNy6w==", - "license": "MIT" + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/env/-/env-15.5.15.tgz", + "integrity": "sha512-vcmyu5/MyFzN7CdqRHO3uHO44p/QPCZkuTUXroeUmhNP8bL5PHFEhik22JUazt+CDDoD6EpBYRCaS2pISL+/hg==" }, "node_modules/@next/eslint-plugin-next": { "version": "15.2.6", @@ -692,13 +735,12 @@ } }, "node_modules/@next/swc-darwin-arm64": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-15.2.5.tgz", - "integrity": "sha512-4OimvVlFTbgzPdA0kh8A1ih6FN9pQkL4nPXGqemEYgk+e7eQhsst/p35siNNqA49eQA6bvKZ1ASsDtu9gtXuog==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-15.5.15.tgz", + "integrity": "sha512-6PvFO2Tzt10GFK2Ro9tAVEtacMqRmTarYMFKAnV2vYMdwWc73xzmDQyAV7SwEdMhzmiRoo7+m88DuiXlJlGeaw==", "cpu": [ "arm64" ], - "license": "MIT", "optional": true, "os": [ "darwin" @@ -708,13 +750,12 @@ } }, "node_modules/@next/swc-darwin-x64": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-15.2.5.tgz", - "integrity": "sha512-ohzRaE9YbGt1ctE0um+UGYIDkkOxHV44kEcHzLqQigoRLaiMtZzGrA11AJh2Lu0lv51XeiY1ZkUvkThjkVNBMA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-15.5.15.tgz", + "integrity": "sha512-G+YNV+z6FDZTp/+IdGyIMFqalBTaQSnvAA+X/hrt+eaTRFSznRMz9K7rTmzvM6tDmKegNtyzgufZW0HwVzEqaQ==", "cpu": [ "x64" ], - "license": "MIT", "optional": true, "os": [ "darwin" @@ -724,16 +765,12 @@ } }, "node_modules/@next/swc-linux-arm64-gnu": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-15.2.5.tgz", - "integrity": "sha512-FMSdxSUt5bVXqqOoZCc/Seg4LQep9w/fXTazr/EkpXW2Eu4IFI9FD7zBDlID8TJIybmvKk7mhd9s+2XWxz4flA==", + "version": "15.5.15", + "resolved": 
"https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-15.5.15.tgz", + "integrity": "sha512-eVkrMcVIBqGfXB+QUC7jjZ94Z6uX/dNStbQFabewAnk13Uy18Igd1YZ/GtPRzdhtm7QwC0e6o7zOQecul4iC1w==", "cpu": [ "arm64" ], - "libc": [ - "glibc" - ], - "license": "MIT", "optional": true, "os": [ "linux" @@ -743,16 +780,12 @@ } }, "node_modules/@next/swc-linux-arm64-musl": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-15.2.5.tgz", - "integrity": "sha512-4ZNKmuEiW5hRKkGp2HWwZ+JrvK4DQLgf8YDaqtZyn7NYdl0cHfatvlnLFSWUayx9yFAUagIgRGRk8pFxS8Qniw==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-15.5.15.tgz", + "integrity": "sha512-RwSHKMQ7InLy5GfkY2/n5PcFycKA08qI1VST78n09nN36nUPqCvGSMiLXlfUmzmpQpF6XeBYP2KRWHi0UW3uNg==", "cpu": [ "arm64" ], - "libc": [ - "musl" - ], - "license": "MIT", "optional": true, "os": [ "linux" @@ -762,16 +795,12 @@ } }, "node_modules/@next/swc-linux-x64-gnu": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-15.2.5.tgz", - "integrity": "sha512-bE6lHQ9GXIf3gCDE53u2pTl99RPZW5V1GLHSRMJ5l/oB/MT+cohu9uwnCK7QUph2xIOu2a6+27kL0REa/kqwZw==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-15.5.15.tgz", + "integrity": "sha512-nplqvY86LakS+eeiuWsNWvfmK8pFcOEW7ZtVRt4QH70lL+0x6LG/m1OpJ/tvrbwjmR8HH9/fH2jzW1GlL03TIg==", "cpu": [ "x64" ], - "libc": [ - "glibc" - ], - "license": "MIT", "optional": true, "os": [ "linux" @@ -781,16 +810,12 @@ } }, "node_modules/@next/swc-linux-x64-musl": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-15.2.5.tgz", - "integrity": "sha512-y7EeQuSkQbTAkCEQnJXm1asRUuGSWAchGJ3c+Qtxh8LVjXleZast8Mn/rL7tZOm7o35QeIpIcid6ufG7EVTTcA==", + "version": "15.5.15", + "resolved": 
"https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-15.5.15.tgz", + "integrity": "sha512-eAgl9NKQ84/sww0v81DQINl/vL2IBxD7sMybd0cWRw6wqgouVI53brVRBrggqBRP/NWeIAE1dm5cbKYoiMlqDQ==", "cpu": [ "x64" ], - "libc": [ - "musl" - ], - "license": "MIT", "optional": true, "os": [ "linux" @@ -800,13 +825,12 @@ } }, "node_modules/@next/swc-win32-arm64-msvc": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-15.2.5.tgz", - "integrity": "sha512-gQMz0yA8/dskZM2Xyiq2FRShxSrsJNha40Ob/M2n2+JGRrZ0JwTVjLdvtN6vCxuq4ByhOd4a9qEf60hApNR2gQ==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-15.5.15.tgz", + "integrity": "sha512-GJVZC86lzSquh0MtvZT+L7G8+jMnJcldloOjA8Kf3wXvBrvb6OGe2MzPuALxFshSm/IpwUtD2mIoof39ymf52A==", "cpu": [ "arm64" ], - "license": "MIT", "optional": true, "os": [ "win32" @@ -816,13 +840,12 @@ } }, "node_modules/@next/swc-win32-x64-msvc": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-15.2.5.tgz", - "integrity": "sha512-tBDNVUcI7U03+3oMvJ11zrtVin5p0NctiuKmTGyaTIEAVj9Q77xukLXGXRnWxKRIIdFG4OTA2rUVGZDYOwgmAA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-15.5.15.tgz", + "integrity": "sha512-nFucjVdwlFqxh/JG3hWSJ4p8+YJV7Ii8aPDuBQULB6DzUF4UNZETXLfEUk+oI2zEznWWULPt7MeuTE6xtK1HSA==", "cpu": [ "x64" ], - "license": "MIT", "optional": true, "os": [ "win32" @@ -893,12 +916,6 @@ "dev": true, "license": "MIT" }, - "node_modules/@swc/counter": { - "version": "0.1.3", - "resolved": "https://registry.npmjs.org/@swc/counter/-/counter-0.1.3.tgz", - "integrity": "sha512-e2BR4lsJkkRlKZ/qCHPw9ZaSxc0MVUd7gtbtaB7aMvHeJVYe8sOB8DBZkP2DtISHGSku9sCK6T6cnY0CtXrOCQ==", - "license": "Apache-2.0" - }, "node_modules/@swc/helpers": { "version": "0.5.15", "resolved": 
"https://registry.npmjs.org/@swc/helpers/-/helpers-0.5.15.tgz", @@ -1358,9 +1375,6 @@ "arm64" ], "dev": true, - "libc": [ - "glibc" - ], "license": "MIT", "optional": true, "os": [ @@ -1375,9 +1389,6 @@ "arm64" ], "dev": true, - "libc": [ - "musl" - ], "license": "MIT", "optional": true, "os": [ @@ -1392,9 +1403,6 @@ "ppc64" ], "dev": true, - "libc": [ - "glibc" - ], "license": "MIT", "optional": true, "os": [ @@ -1409,9 +1417,6 @@ "riscv64" ], "dev": true, - "libc": [ - "glibc" - ], "license": "MIT", "optional": true, "os": [ @@ -1426,9 +1431,6 @@ "riscv64" ], "dev": true, - "libc": [ - "musl" - ], "license": "MIT", "optional": true, "os": [ @@ -1443,9 +1445,6 @@ "s390x" ], "dev": true, - "libc": [ - "glibc" - ], "license": "MIT", "optional": true, "os": [ @@ -1460,9 +1459,6 @@ "x64" ], "dev": true, - "libc": [ - "glibc" - ], "license": "MIT", "optional": true, "os": [ @@ -1477,9 +1473,6 @@ "x64" ], "dev": true, - "libc": [ - "musl" - ], "license": "MIT", "optional": true, "os": [ @@ -1920,17 +1913,6 @@ "node": ">=8" } }, - "node_modules/busboy": { - "version": "1.6.0", - "resolved": "https://registry.npmjs.org/busboy/-/busboy-1.6.0.tgz", - "integrity": "sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==", - "dependencies": { - "streamsearch": "^1.1.0" - }, - "engines": { - "node": ">=10.16.0" - } - }, "node_modules/bytes": { "version": "3.1.2", "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz", @@ -2041,25 +2023,11 @@ "integrity": "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==", "license": "MIT" }, - "node_modules/color": { - "version": "4.2.3", - "resolved": "https://registry.npmjs.org/color/-/color-4.2.3.tgz", - "integrity": "sha512-1rXeuUUiGGrykh+CeBdu5Ie7OJwinCgQY0bc7GCRxy5xVHy+moaqkpL/jqQq0MtQOeYcrqEz4abc5f0KtU7W4A==", - "license": "MIT", - "optional": true, - "dependencies": { - "color-convert": "^2.0.1", - "color-string": "^1.9.0" - }, - "engines": 
{ - "node": ">=12.5.0" - } - }, "node_modules/color-convert": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", - "devOptional": true, + "dev": true, "license": "MIT", "dependencies": { "color-name": "~1.1.4" @@ -2072,20 +2040,9 @@ "version": "1.1.4", "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==", - "devOptional": true, + "dev": true, "license": "MIT" }, - "node_modules/color-string": { - "version": "1.9.1", - "resolved": "https://registry.npmjs.org/color-string/-/color-string-1.9.1.tgz", - "integrity": "sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg==", - "license": "MIT", - "optional": true, - "dependencies": { - "color-name": "^1.0.0", - "simple-swizzle": "^0.2.2" - } - }, "node_modules/concat-map": { "version": "0.0.1", "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz", @@ -2296,7 +2253,6 @@ "version": "2.1.2", "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz", "integrity": "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==", - "license": "Apache-2.0", "optional": true, "engines": { "node": ">=8" @@ -3591,13 +3547,6 @@ "url": "https://github.com/sponsors/ljharb" } }, - "node_modules/is-arrayish": { - "version": "0.3.4", - "resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.3.4.tgz", - "integrity": "sha512-m6UrgzFVUYawGBh1dUsWR5M2Clqic9RVXC/9f8ceNlv2IcO9j9J/z8UoCLPqtsPBFNzEpfR3xftohbfqDx8EQA==", - "license": "MIT", - "optional": true - }, "node_modules/is-async-function": { "version": "2.1.1", "resolved": "https://registry.npmjs.org/is-async-function/-/is-async-function-2.1.1.tgz", @@ -4337,16 +4286,12 
@@ } }, "node_modules/next": { - "version": "15.2.6", - "resolved": "https://registry.npmjs.org/next/-/next-15.2.6.tgz", - "integrity": "sha512-DIKFctUpZoCq5ok2ztVU+PqhWsbiqM9xNP7rHL2cAp29NQcmDp7Y6JnBBhHRbFt4bCsCZigj6uh+/Gwh2158Wg==", - "deprecated": "This version has a security vulnerability. Please upgrade to a patched version. See https://nextjs.org/blog/security-update-2025-12-11 for more details.", - "license": "MIT", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/next/-/next-15.5.15.tgz", + "integrity": "sha512-VSqCrJwtLVGwAVE0Sb/yikrQfkwkZW9p+lL/J4+xe+G3ZA+QnWPqgcfH1tDUEuk9y+pthzzVFp4L/U8JerMfMQ==", "dependencies": { - "@next/env": "15.2.6", - "@swc/counter": "0.1.3", + "@next/env": "15.5.15", "@swc/helpers": "0.5.15", - "busboy": "1.6.0", "caniuse-lite": "^1.0.30001579", "postcss": "8.4.31", "styled-jsx": "5.1.6" @@ -4358,19 +4303,19 @@ "node": "^18.18.0 || ^19.8.0 || >= 20.0.0" }, "optionalDependencies": { - "@next/swc-darwin-arm64": "15.2.5", - "@next/swc-darwin-x64": "15.2.5", - "@next/swc-linux-arm64-gnu": "15.2.5", - "@next/swc-linux-arm64-musl": "15.2.5", - "@next/swc-linux-x64-gnu": "15.2.5", - "@next/swc-linux-x64-musl": "15.2.5", - "@next/swc-win32-arm64-msvc": "15.2.5", - "@next/swc-win32-x64-msvc": "15.2.5", - "sharp": "^0.33.5" + "@next/swc-darwin-arm64": "15.5.15", + "@next/swc-darwin-x64": "15.5.15", + "@next/swc-linux-arm64-gnu": "15.5.15", + "@next/swc-linux-arm64-musl": "15.5.15", + "@next/swc-linux-x64-gnu": "15.5.15", + "@next/swc-linux-x64-musl": "15.5.15", + "@next/swc-win32-arm64-msvc": "15.5.15", + "@next/swc-win32-x64-msvc": "15.5.15", + "sharp": "^0.34.3" }, "peerDependencies": { "@opentelemetry/api": "^1.1.0", - "@playwright/test": "^1.41.2", + "@playwright/test": "^1.51.1", "babel-plugin-react-compiler": "*", "react": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", "react-dom": "^18.2.0 || 19.0.0-rc-de68d2f4-20241204 || ^19.0.0", @@ -5209,16 +5154,15 @@ "license": "ISC" }, "node_modules/sharp": { - 
"version": "0.33.5", - "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.33.5.tgz", - "integrity": "sha512-haPVm1EkS9pgvHrQ/F3Xy+hgcuMV0Wm9vfIBSiwZ05k+xgb0PkBQpGsAA/oWdDobNaZTH5ppvHtzCFbnSEwHVw==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.34.5.tgz", + "integrity": "sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==", "hasInstallScript": true, - "license": "Apache-2.0", "optional": true, "dependencies": { - "color": "^4.2.3", - "detect-libc": "^2.0.3", - "semver": "^7.6.3" + "@img/colour": "^1.0.0", + "detect-libc": "^2.1.2", + "semver": "^7.7.3" }, "engines": { "node": "^18.17.0 || ^20.3.0 || >=21.0.0" @@ -5227,25 +5171,30 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-darwin-arm64": "0.33.5", - "@img/sharp-darwin-x64": "0.33.5", - "@img/sharp-libvips-darwin-arm64": "1.0.4", - "@img/sharp-libvips-darwin-x64": "1.0.4", - "@img/sharp-libvips-linux-arm": "1.0.5", - "@img/sharp-libvips-linux-arm64": "1.0.4", - "@img/sharp-libvips-linux-s390x": "1.0.4", - "@img/sharp-libvips-linux-x64": "1.0.4", - "@img/sharp-libvips-linuxmusl-arm64": "1.0.4", - "@img/sharp-libvips-linuxmusl-x64": "1.0.4", - "@img/sharp-linux-arm": "0.33.5", - "@img/sharp-linux-arm64": "0.33.5", - "@img/sharp-linux-s390x": "0.33.5", - "@img/sharp-linux-x64": "0.33.5", - "@img/sharp-linuxmusl-arm64": "0.33.5", - "@img/sharp-linuxmusl-x64": "0.33.5", - "@img/sharp-wasm32": "0.33.5", - "@img/sharp-win32-ia32": "0.33.5", - "@img/sharp-win32-x64": "0.33.5" + "@img/sharp-darwin-arm64": "0.34.5", + "@img/sharp-darwin-x64": "0.34.5", + "@img/sharp-libvips-darwin-arm64": "1.2.4", + "@img/sharp-libvips-darwin-x64": "1.2.4", + "@img/sharp-libvips-linux-arm": "1.2.4", + "@img/sharp-libvips-linux-arm64": "1.2.4", + "@img/sharp-libvips-linux-ppc64": "1.2.4", + "@img/sharp-libvips-linux-riscv64": "1.2.4", + "@img/sharp-libvips-linux-s390x": "1.2.4", + "@img/sharp-libvips-linux-x64": 
"1.2.4", + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4", + "@img/sharp-libvips-linuxmusl-x64": "1.2.4", + "@img/sharp-linux-arm": "0.34.5", + "@img/sharp-linux-arm64": "0.34.5", + "@img/sharp-linux-ppc64": "0.34.5", + "@img/sharp-linux-riscv64": "0.34.5", + "@img/sharp-linux-s390x": "0.34.5", + "@img/sharp-linux-x64": "0.34.5", + "@img/sharp-linuxmusl-arm64": "0.34.5", + "@img/sharp-linuxmusl-x64": "0.34.5", + "@img/sharp-wasm32": "0.34.5", + "@img/sharp-win32-arm64": "0.34.5", + "@img/sharp-win32-ia32": "0.34.5", + "@img/sharp-win32-x64": "0.34.5" } }, "node_modules/shebang-command": { @@ -5343,16 +5292,6 @@ "url": "https://github.com/sponsors/ljharb" } }, - "node_modules/simple-swizzle": { - "version": "0.2.4", - "resolved": "https://registry.npmjs.org/simple-swizzle/-/simple-swizzle-0.2.4.tgz", - "integrity": "sha512-nAu1WFPQSMNr2Zn9PGSZK9AGn4t/y97lEm+MXTtUDwfP0ksAIX4nO+6ruD9Jwut4C49SB1Ws+fbXsm/yScWOHw==", - "license": "MIT", - "optional": true, - "dependencies": { - "is-arrayish": "^0.3.1" - } - }, "node_modules/source-map-js": { "version": "1.2.1", "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", @@ -5392,14 +5331,6 @@ "node": ">= 0.4" } }, - "node_modules/streamsearch": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/streamsearch/-/streamsearch-1.1.0.tgz", - "integrity": "sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==", - "engines": { - "node": ">=10.0.0" - } - }, "node_modules/string.prototype.includes": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/string.prototype.includes/-/string.prototype.includes-2.0.1.tgz", @@ -6194,145 +6125,187 @@ "integrity": "sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==", "dev": true }, + "@img/colour": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@img/colour/-/colour-1.1.0.tgz", + "integrity": 
"sha512-Td76q7j57o/tLVdgS746cYARfSyxk8iEfRxewL9h4OMzYhbW4TAcppl0mT4eyqXddh6L/jwoM75mo7ixa/pCeQ==", + "optional": true + }, "@img/sharp-darwin-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.33.5.tgz", - "integrity": "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.34.5.tgz", + "integrity": "sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==", "optional": true, "requires": { - "@img/sharp-libvips-darwin-arm64": "1.0.4" + "@img/sharp-libvips-darwin-arm64": "1.2.4" } }, "@img/sharp-darwin-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.33.5.tgz", - "integrity": "sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.34.5.tgz", + "integrity": "sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==", "optional": true, "requires": { - "@img/sharp-libvips-darwin-x64": "1.0.4" + "@img/sharp-libvips-darwin-x64": "1.2.4" } }, "@img/sharp-libvips-darwin-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.0.4.tgz", - "integrity": "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.2.4.tgz", + "integrity": "sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==", "optional": true }, "@img/sharp-libvips-darwin-x64": { - "version": "1.0.4", - "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.0.4.tgz", - "integrity": "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.2.4.tgz", + "integrity": "sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==", "optional": true }, "@img/sharp-libvips-linux-arm": { - "version": "1.0.5", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.0.5.tgz", - "integrity": "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.2.4.tgz", + "integrity": "sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==", "optional": true }, "@img/sharp-libvips-linux-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.0.4.tgz", - "integrity": "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.2.4.tgz", + "integrity": "sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==", + "optional": true + }, + "@img/sharp-libvips-linux-ppc64": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-ppc64/-/sharp-libvips-linux-ppc64-1.2.4.tgz", + "integrity": "sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==", + "optional": true + }, + "@img/sharp-libvips-linux-riscv64": { + "version": "1.2.4", + "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linux-riscv64/-/sharp-libvips-linux-riscv64-1.2.4.tgz", + "integrity": "sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==", "optional": true }, "@img/sharp-libvips-linux-s390x": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.0.4.tgz", - "integrity": "sha512-u7Wz6ntiSSgGSGcjZ55im6uvTrOxSIS8/dgoVMoiGE9I6JAfU50yH5BoDlYA1tcuGS7g/QNtetJnxA6QEsCVTA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.2.4.tgz", + "integrity": "sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==", "optional": true }, "@img/sharp-libvips-linux-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.0.4.tgz", - "integrity": "sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.2.4.tgz", + "integrity": "sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==", "optional": true }, "@img/sharp-libvips-linuxmusl-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.0.4.tgz", - "integrity": "sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.2.4.tgz", + "integrity": "sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==", "optional": true }, "@img/sharp-libvips-linuxmusl-x64": { - "version": "1.0.4", - "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.0.4.tgz", - "integrity": "sha512-viYN1KX9m+/hGkJtvYYp+CCLgnJXwiQB39damAO7WMdKWlIhmYTfHjwSbQeUK/20vY154mwezd9HflVFM1wVSw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.2.4.tgz", + "integrity": "sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==", "optional": true }, "@img/sharp-linux-arm": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.33.5.tgz", - "integrity": "sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.34.5.tgz", + "integrity": "sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==", "optional": true, "requires": { - "@img/sharp-libvips-linux-arm": "1.0.5" + "@img/sharp-libvips-linux-arm": "1.2.4" } }, "@img/sharp-linux-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.33.5.tgz", - "integrity": "sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.34.5.tgz", + "integrity": "sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==", "optional": true, "requires": { - "@img/sharp-libvips-linux-arm64": "1.0.4" + "@img/sharp-libvips-linux-arm64": "1.2.4" + } + }, + "@img/sharp-linux-ppc64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-ppc64/-/sharp-linux-ppc64-0.34.5.tgz", + "integrity": "sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==", + 
"optional": true, + "requires": { + "@img/sharp-libvips-linux-ppc64": "1.2.4" + } + }, + "@img/sharp-linux-riscv64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-riscv64/-/sharp-linux-riscv64-0.34.5.tgz", + "integrity": "sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==", + "optional": true, + "requires": { + "@img/sharp-libvips-linux-riscv64": "1.2.4" } }, "@img/sharp-linux-s390x": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.33.5.tgz", - "integrity": "sha512-y/5PCd+mP4CA/sPDKl2961b+C9d+vPAveS33s6Z3zfASk2j5upL6fXVPZi7ztePZ5CuH+1kW8JtvxgbuXHRa4Q==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.34.5.tgz", + "integrity": "sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==", "optional": true, "requires": { - "@img/sharp-libvips-linux-s390x": "1.0.4" + "@img/sharp-libvips-linux-s390x": "1.2.4" } }, "@img/sharp-linux-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.33.5.tgz", - "integrity": "sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.34.5.tgz", + "integrity": "sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==", "optional": true, "requires": { - "@img/sharp-libvips-linux-x64": "1.0.4" + "@img/sharp-libvips-linux-x64": "1.2.4" } }, "@img/sharp-linuxmusl-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.33.5.tgz", - "integrity": "sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g==", + "version": "0.34.5", + "resolved": 
"https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.34.5.tgz", + "integrity": "sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==", "optional": true, "requires": { - "@img/sharp-libvips-linuxmusl-arm64": "1.0.4" + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" } }, "@img/sharp-linuxmusl-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.33.5.tgz", - "integrity": "sha512-WT+d/cgqKkkKySYmqoZ8y3pxx7lx9vVejxW/W4DOFMYVSkErR+w7mf2u8m/y4+xHe7yY9DAXQMWQhpnMuFfScw==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.34.5.tgz", + "integrity": "sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==", "optional": true, "requires": { - "@img/sharp-libvips-linuxmusl-x64": "1.0.4" + "@img/sharp-libvips-linuxmusl-x64": "1.2.4" } }, "@img/sharp-wasm32": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.33.5.tgz", - "integrity": "sha512-ykUW4LVGaMcU9lu9thv85CbRMAwfeadCJHRsg2GmeRa/cJxsVY9Rbd57JcMxBkKHag5U/x7TSBpScF4U8ElVzg==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.34.5.tgz", + "integrity": "sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==", "optional": true, "requires": { - "@emnapi/runtime": "^1.2.0" + "@emnapi/runtime": "^1.7.0" } }, + "@img/sharp-win32-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-arm64/-/sharp-win32-arm64-0.34.5.tgz", + "integrity": "sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==", + "optional": true + }, "@img/sharp-win32-ia32": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.33.5.tgz", - "integrity": 
"sha512-T36PblLaTwuVJ/zw/LaH0PdZkRz5rd3SmMHX8GSmR7vtNSP5Z6bQkExdSK7xGWyxLw4sUknBuugTelgw2faBbQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.34.5.tgz", + "integrity": "sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==", "optional": true }, "@img/sharp-win32-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.33.5.tgz", - "integrity": "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.34.5.tgz", + "integrity": "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==", "optional": true }, "@napi-rs/wasm-runtime": { @@ -6348,9 +6321,9 @@ } }, "@next/env": { - "version": "15.2.6", - "resolved": "https://registry.npmjs.org/@next/env/-/env-15.2.6.tgz", - "integrity": "sha512-kp1Mpm4K1IzSSJ5ZALfek0JBD2jBw9VGMXR/aT7ykcA2q/ieDARyBzg+e8J1TkeIb5AFj/YjtZdoajdy5uNy6w==" + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/env/-/env-15.5.15.tgz", + "integrity": "sha512-vcmyu5/MyFzN7CdqRHO3uHO44p/QPCZkuTUXroeUmhNP8bL5PHFEhik22JUazt+CDDoD6EpBYRCaS2pISL+/hg==" }, "@next/eslint-plugin-next": { "version": "15.2.6", @@ -6362,51 +6335,51 @@ } }, "@next/swc-darwin-arm64": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-15.2.5.tgz", - "integrity": "sha512-4OimvVlFTbgzPdA0kh8A1ih6FN9pQkL4nPXGqemEYgk+e7eQhsst/p35siNNqA49eQA6bvKZ1ASsDtu9gtXuog==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-15.5.15.tgz", + "integrity": "sha512-6PvFO2Tzt10GFK2Ro9tAVEtacMqRmTarYMFKAnV2vYMdwWc73xzmDQyAV7SwEdMhzmiRoo7+m88DuiXlJlGeaw==", "optional": true }, "@next/swc-darwin-x64": { - "version": "15.2.5", - 
"resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-15.2.5.tgz", - "integrity": "sha512-ohzRaE9YbGt1ctE0um+UGYIDkkOxHV44kEcHzLqQigoRLaiMtZzGrA11AJh2Lu0lv51XeiY1ZkUvkThjkVNBMA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-15.5.15.tgz", + "integrity": "sha512-G+YNV+z6FDZTp/+IdGyIMFqalBTaQSnvAA+X/hrt+eaTRFSznRMz9K7rTmzvM6tDmKegNtyzgufZW0HwVzEqaQ==", "optional": true }, "@next/swc-linux-arm64-gnu": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-15.2.5.tgz", - "integrity": "sha512-FMSdxSUt5bVXqqOoZCc/Seg4LQep9w/fXTazr/EkpXW2Eu4IFI9FD7zBDlID8TJIybmvKk7mhd9s+2XWxz4flA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-15.5.15.tgz", + "integrity": "sha512-eVkrMcVIBqGfXB+QUC7jjZ94Z6uX/dNStbQFabewAnk13Uy18Igd1YZ/GtPRzdhtm7QwC0e6o7zOQecul4iC1w==", "optional": true }, "@next/swc-linux-arm64-musl": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-15.2.5.tgz", - "integrity": "sha512-4ZNKmuEiW5hRKkGp2HWwZ+JrvK4DQLgf8YDaqtZyn7NYdl0cHfatvlnLFSWUayx9yFAUagIgRGRk8pFxS8Qniw==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-15.5.15.tgz", + "integrity": "sha512-RwSHKMQ7InLy5GfkY2/n5PcFycKA08qI1VST78n09nN36nUPqCvGSMiLXlfUmzmpQpF6XeBYP2KRWHi0UW3uNg==", "optional": true }, "@next/swc-linux-x64-gnu": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-15.2.5.tgz", - "integrity": "sha512-bE6lHQ9GXIf3gCDE53u2pTl99RPZW5V1GLHSRMJ5l/oB/MT+cohu9uwnCK7QUph2xIOu2a6+27kL0REa/kqwZw==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-15.5.15.tgz", + "integrity": 
"sha512-nplqvY86LakS+eeiuWsNWvfmK8pFcOEW7ZtVRt4QH70lL+0x6LG/m1OpJ/tvrbwjmR8HH9/fH2jzW1GlL03TIg==", "optional": true }, "@next/swc-linux-x64-musl": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-15.2.5.tgz", - "integrity": "sha512-y7EeQuSkQbTAkCEQnJXm1asRUuGSWAchGJ3c+Qtxh8LVjXleZast8Mn/rL7tZOm7o35QeIpIcid6ufG7EVTTcA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-15.5.15.tgz", + "integrity": "sha512-eAgl9NKQ84/sww0v81DQINl/vL2IBxD7sMybd0cWRw6wqgouVI53brVRBrggqBRP/NWeIAE1dm5cbKYoiMlqDQ==", "optional": true }, "@next/swc-win32-arm64-msvc": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-15.2.5.tgz", - "integrity": "sha512-gQMz0yA8/dskZM2Xyiq2FRShxSrsJNha40Ob/M2n2+JGRrZ0JwTVjLdvtN6vCxuq4ByhOd4a9qEf60hApNR2gQ==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-15.5.15.tgz", + "integrity": "sha512-GJVZC86lzSquh0MtvZT+L7G8+jMnJcldloOjA8Kf3wXvBrvb6OGe2MzPuALxFshSm/IpwUtD2mIoof39ymf52A==", "optional": true }, "@next/swc-win32-x64-msvc": { - "version": "15.2.5", - "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-15.2.5.tgz", - "integrity": "sha512-tBDNVUcI7U03+3oMvJ11zrtVin5p0NctiuKmTGyaTIEAVj9Q77xukLXGXRnWxKRIIdFG4OTA2rUVGZDYOwgmAA==", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-15.5.15.tgz", + "integrity": "sha512-nFucjVdwlFqxh/JG3hWSJ4p8+YJV7Ii8aPDuBQULB6DzUF4UNZETXLfEUk+oI2zEznWWULPt7MeuTE6xtK1HSA==", "optional": true }, "@nodelib/fs.scandir": { @@ -6453,11 +6426,6 @@ "integrity": "sha512-TvZbIpeKqGQQ7X0zSCvPH9riMSFQFSggnfBjFZ1mEoILW+UuXCKwOoPcgjMwiUtRqFZ8jWhPJc4um14vC6I4ag==", "dev": true }, - "@swc/counter": { - "version": "0.1.3", - "resolved": 
"https://registry.npmjs.org/@swc/counter/-/counter-0.1.3.tgz", - "integrity": "sha512-e2BR4lsJkkRlKZ/qCHPw9ZaSxc0MVUd7gtbtaB7aMvHeJVYe8sOB8DBZkP2DtISHGSku9sCK6T6cnY0CtXrOCQ==" - }, "@swc/helpers": { "version": "0.5.15", "resolved": "https://registry.npmjs.org/@swc/helpers/-/helpers-0.5.15.tgz", @@ -7071,14 +7039,6 @@ "fill-range": "^7.1.1" } }, - "busboy": { - "version": "1.6.0", - "resolved": "https://registry.npmjs.org/busboy/-/busboy-1.6.0.tgz", - "integrity": "sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==", - "requires": { - "streamsearch": "^1.1.0" - } - }, "bytes": { "version": "3.1.2", "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz", @@ -7140,21 +7100,11 @@ "resolved": "https://registry.npmjs.org/client-only/-/client-only-0.0.1.tgz", "integrity": "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==" }, - "color": { - "version": "4.2.3", - "resolved": "https://registry.npmjs.org/color/-/color-4.2.3.tgz", - "integrity": "sha512-1rXeuUUiGGrykh+CeBdu5Ie7OJwinCgQY0bc7GCRxy5xVHy+moaqkpL/jqQq0MtQOeYcrqEz4abc5f0KtU7W4A==", - "optional": true, - "requires": { - "color-convert": "^2.0.1", - "color-string": "^1.9.0" - } - }, "color-convert": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", - "devOptional": true, + "dev": true, "requires": { "color-name": "~1.1.4" } @@ -7163,17 +7113,7 @@ "version": "1.1.4", "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==", - "devOptional": true - }, - "color-string": { - "version": "1.9.1", - "resolved": "https://registry.npmjs.org/color-string/-/color-string-1.9.1.tgz", - "integrity": 
"sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg==", - "optional": true, - "requires": { - "color-name": "^1.0.0", - "simple-swizzle": "^0.2.2" - } + "dev": true }, "concat-map": { "version": "0.0.1", @@ -8211,12 +8151,6 @@ "get-intrinsic": "^1.2.6" } }, - "is-arrayish": { - "version": "0.3.4", - "resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.3.4.tgz", - "integrity": "sha512-m6UrgzFVUYawGBh1dUsWR5M2Clqic9RVXC/9f8ceNlv2IcO9j9J/z8UoCLPqtsPBFNzEpfR3xftohbfqDx8EQA==", - "optional": true - }, "is-async-function": { "version": "2.1.1", "resolved": "https://registry.npmjs.org/is-async-function/-/is-async-function-2.1.1.tgz", @@ -8676,25 +8610,23 @@ "integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg==" }, "next": { - "version": "15.2.6", - "resolved": "https://registry.npmjs.org/next/-/next-15.2.6.tgz", - "integrity": "sha512-DIKFctUpZoCq5ok2ztVU+PqhWsbiqM9xNP7rHL2cAp29NQcmDp7Y6JnBBhHRbFt4bCsCZigj6uh+/Gwh2158Wg==", - "requires": { - "@next/env": "15.2.6", - "@next/swc-darwin-arm64": "15.2.5", - "@next/swc-darwin-x64": "15.2.5", - "@next/swc-linux-arm64-gnu": "15.2.5", - "@next/swc-linux-arm64-musl": "15.2.5", - "@next/swc-linux-x64-gnu": "15.2.5", - "@next/swc-linux-x64-musl": "15.2.5", - "@next/swc-win32-arm64-msvc": "15.2.5", - "@next/swc-win32-x64-msvc": "15.2.5", - "@swc/counter": "0.1.3", + "version": "15.5.15", + "resolved": "https://registry.npmjs.org/next/-/next-15.5.15.tgz", + "integrity": "sha512-VSqCrJwtLVGwAVE0Sb/yikrQfkwkZW9p+lL/J4+xe+G3ZA+QnWPqgcfH1tDUEuk9y+pthzzVFp4L/U8JerMfMQ==", + "requires": { + "@next/env": "15.5.15", + "@next/swc-darwin-arm64": "15.5.15", + "@next/swc-darwin-x64": "15.5.15", + "@next/swc-linux-arm64-gnu": "15.5.15", + "@next/swc-linux-arm64-musl": "15.5.15", + "@next/swc-linux-x64-gnu": "15.5.15", + "@next/swc-linux-x64-musl": "15.5.15", + "@next/swc-win32-arm64-msvc": "15.5.15", + "@next/swc-win32-x64-msvc": 
"15.5.15", "@swc/helpers": "0.5.15", - "busboy": "1.6.0", "caniuse-lite": "^1.0.30001579", "postcss": "8.4.31", - "sharp": "^0.33.5", + "sharp": "^0.34.3", "styled-jsx": "5.1.6" } }, @@ -9214,33 +9146,38 @@ "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==" }, "sharp": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.33.5.tgz", - "integrity": "sha512-haPVm1EkS9pgvHrQ/F3Xy+hgcuMV0Wm9vfIBSiwZ05k+xgb0PkBQpGsAA/oWdDobNaZTH5ppvHtzCFbnSEwHVw==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.34.5.tgz", + "integrity": "sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==", "optional": true, "requires": { - "@img/sharp-darwin-arm64": "0.33.5", - "@img/sharp-darwin-x64": "0.33.5", - "@img/sharp-libvips-darwin-arm64": "1.0.4", - "@img/sharp-libvips-darwin-x64": "1.0.4", - "@img/sharp-libvips-linux-arm": "1.0.5", - "@img/sharp-libvips-linux-arm64": "1.0.4", - "@img/sharp-libvips-linux-s390x": "1.0.4", - "@img/sharp-libvips-linux-x64": "1.0.4", - "@img/sharp-libvips-linuxmusl-arm64": "1.0.4", - "@img/sharp-libvips-linuxmusl-x64": "1.0.4", - "@img/sharp-linux-arm": "0.33.5", - "@img/sharp-linux-arm64": "0.33.5", - "@img/sharp-linux-s390x": "0.33.5", - "@img/sharp-linux-x64": "0.33.5", - "@img/sharp-linuxmusl-arm64": "0.33.5", - "@img/sharp-linuxmusl-x64": "0.33.5", - "@img/sharp-wasm32": "0.33.5", - "@img/sharp-win32-ia32": "0.33.5", - "@img/sharp-win32-x64": "0.33.5", - "color": "^4.2.3", - "detect-libc": "^2.0.3", - "semver": "^7.6.3" + "@img/colour": "^1.0.0", + "@img/sharp-darwin-arm64": "0.34.5", + "@img/sharp-darwin-x64": "0.34.5", + "@img/sharp-libvips-darwin-arm64": "1.2.4", + "@img/sharp-libvips-darwin-x64": "1.2.4", + "@img/sharp-libvips-linux-arm": "1.2.4", + "@img/sharp-libvips-linux-arm64": "1.2.4", + "@img/sharp-libvips-linux-ppc64": "1.2.4", + "@img/sharp-libvips-linux-riscv64": "1.2.4", + 
"@img/sharp-libvips-linux-s390x": "1.2.4", + "@img/sharp-libvips-linux-x64": "1.2.4", + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4", + "@img/sharp-libvips-linuxmusl-x64": "1.2.4", + "@img/sharp-linux-arm": "0.34.5", + "@img/sharp-linux-arm64": "0.34.5", + "@img/sharp-linux-ppc64": "0.34.5", + "@img/sharp-linux-riscv64": "0.34.5", + "@img/sharp-linux-s390x": "0.34.5", + "@img/sharp-linux-x64": "0.34.5", + "@img/sharp-linuxmusl-arm64": "0.34.5", + "@img/sharp-linuxmusl-x64": "0.34.5", + "@img/sharp-wasm32": "0.34.5", + "@img/sharp-win32-arm64": "0.34.5", + "@img/sharp-win32-ia32": "0.34.5", + "@img/sharp-win32-x64": "0.34.5", + "detect-libc": "^2.1.2", + "semver": "^7.7.3" } }, "shebang-command": { @@ -9302,15 +9239,6 @@ "side-channel-map": "^1.0.1" } }, - "simple-swizzle": { - "version": "0.2.4", - "resolved": "https://registry.npmjs.org/simple-swizzle/-/simple-swizzle-0.2.4.tgz", - "integrity": "sha512-nAu1WFPQSMNr2Zn9PGSZK9AGn4t/y97lEm+MXTtUDwfP0ksAIX4nO+6ruD9Jwut4C49SB1Ws+fbXsm/yScWOHw==", - "optional": true, - "requires": { - "is-arrayish": "^0.3.1" - } - }, "source-map-js": { "version": "1.2.1", "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", @@ -9337,11 +9265,6 @@ "internal-slot": "^1.1.0" } }, - "streamsearch": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/streamsearch/-/streamsearch-1.1.0.tgz", - "integrity": "sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==" - }, "string.prototype.includes": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/string.prototype.includes/-/string.prototype.includes-2.0.1.tgz", diff --git a/frontend/package.json b/frontend/package.json index 9b8c014..575f70a 100644 --- a/frontend/package.json +++ b/frontend/package.json @@ -14,7 +14,7 @@ }, "dependencies": { "express": "4.21.2", - "next": "15.2.6", + "next": "15.5.15", "react": "19.0.3", "react-dom": "19.0.3" }, From 1dd69f44663a29393e0fc6991b8dfd1e880ab729 Mon Sep 
17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 21 Apr 2026 15:46:21 +0200 Subject: [PATCH 03/48] Bump qs and express in /frontend (#228) Bumps [qs](https://github.com/ljharb/qs) to 6.14.2 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together. Updates `qs` from 6.13.0 to 6.14.2 - [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md) - [Commits](https://github.com/ljharb/qs/compare/v6.13.0...v6.14.2) Updates `express` from 4.21.2 to 4.22.1 - [Release notes](https://github.com/expressjs/express/releases) - [Changelog](https://github.com/expressjs/express/blob/v4.22.1/History.md) - [Commits](https://github.com/expressjs/express/compare/4.21.2...v4.22.1) --- updated-dependencies: - dependency-name: qs dependency-version: 6.14.2 dependency-type: indirect - dependency-name: express dependency-version: 4.22.1 dependency-type: direct:production ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- frontend/package-lock.json | 89 +++++++++++++++++++++++--------------- frontend/package.json | 2 +- 2 files changed, 56 insertions(+), 35 deletions(-) diff --git a/frontend/package-lock.json b/frontend/package-lock.json index f7a9f1a..ec34336 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -8,7 +8,7 @@ "name": "guardex-how-it-works-frontend", "version": "0.1.0", "dependencies": { - "express": "4.21.2", + "express": "4.22.1", "next": "15.5.15", "react": "19.0.3", "react-dom": "19.0.3" @@ -2917,39 +2917,38 @@ } }, "node_modules/express": { - "version": "4.21.2", - "resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz", - "integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==", - "license": "MIT", + "version": "4.22.1", + "resolved": 
"https://registry.npmjs.org/express/-/express-4.22.1.tgz", + "integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==", "dependencies": { "accepts": "~1.3.8", "array-flatten": "1.1.1", - "body-parser": "1.20.3", - "content-disposition": "0.5.4", + "body-parser": "~1.20.3", + "content-disposition": "~0.5.4", "content-type": "~1.0.4", - "cookie": "0.7.1", - "cookie-signature": "1.0.6", + "cookie": "~0.7.1", + "cookie-signature": "~1.0.6", "debug": "2.6.9", "depd": "2.0.0", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "etag": "~1.8.1", - "finalhandler": "1.3.1", - "fresh": "0.5.2", - "http-errors": "2.0.0", + "finalhandler": "~1.3.1", + "fresh": "~0.5.2", + "http-errors": "~2.0.0", "merge-descriptors": "1.0.3", "methods": "~1.1.2", - "on-finished": "2.4.1", + "on-finished": "~2.4.1", "parseurl": "~1.3.3", - "path-to-regexp": "0.1.12", + "path-to-regexp": "~0.1.12", "proxy-addr": "~2.0.7", - "qs": "6.13.0", + "qs": "~6.14.0", "range-parser": "~1.2.1", "safe-buffer": "5.2.1", - "send": "0.19.0", - "serve-static": "1.16.2", + "send": "~0.19.0", + "serve-static": "~1.16.2", "setprototypeof": "1.2.0", - "statuses": "2.0.1", + "statuses": "~2.0.1", "type-is": "~1.6.18", "utils-merge": "1.0.1", "vary": "~1.1.2" @@ -2977,6 +2976,20 @@ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==", "license": "MIT" }, + "node_modules/express/node_modules/qs": { + "version": "6.14.2", + "resolved": "https://registry.npmjs.org/qs/-/qs-6.14.2.tgz", + "integrity": "sha512-V/yCWTTF7VJ9hIh18Ugr2zhJMP01MY7c5kh4J870L7imm6/DIzBsNLTXzMwUA3yZ5b/KBqLx8Kp3uRvd7xSe3Q==", + "dependencies": { + "side-channel": "^1.1.0" + }, + "engines": { + "node": ">=0.6" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, "node_modules/fast-deep-equal": { "version": "3.1.3", "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz", @@ -7723,38 +7736,38 @@ 
"integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==" }, "express": { - "version": "4.21.2", - "resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz", - "integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==", + "version": "4.22.1", + "resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz", + "integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==", "requires": { "accepts": "~1.3.8", "array-flatten": "1.1.1", - "body-parser": "1.20.3", - "content-disposition": "0.5.4", + "body-parser": "~1.20.3", + "content-disposition": "~0.5.4", "content-type": "~1.0.4", - "cookie": "0.7.1", - "cookie-signature": "1.0.6", + "cookie": "~0.7.1", + "cookie-signature": "~1.0.6", "debug": "2.6.9", "depd": "2.0.0", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "etag": "~1.8.1", - "finalhandler": "1.3.1", - "fresh": "0.5.2", - "http-errors": "2.0.0", + "finalhandler": "~1.3.1", + "fresh": "~0.5.2", + "http-errors": "~2.0.0", "merge-descriptors": "1.0.3", "methods": "~1.1.2", - "on-finished": "2.4.1", + "on-finished": "~2.4.1", "parseurl": "~1.3.3", - "path-to-regexp": "0.1.12", + "path-to-regexp": "~0.1.12", "proxy-addr": "~2.0.7", - "qs": "6.13.0", + "qs": "~6.14.0", "range-parser": "~1.2.1", "safe-buffer": "5.2.1", - "send": "0.19.0", - "serve-static": "1.16.2", + "send": "~0.19.0", + "serve-static": "~1.16.2", "setprototypeof": "1.2.0", - "statuses": "2.0.1", + "statuses": "~2.0.1", "type-is": "~1.6.18", "utils-merge": "1.0.1", "vary": "~1.1.2" @@ -7772,6 +7785,14 @@ "version": "2.0.0", "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz", "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==" + }, + "qs": { + "version": "6.14.2", + "resolved": "https://registry.npmjs.org/qs/-/qs-6.14.2.tgz", + "integrity": 
"sha512-V/yCWTTF7VJ9hIh18Ugr2zhJMP01MY7c5kh4J870L7imm6/DIzBsNLTXzMwUA3yZ5b/KBqLx8Kp3uRvd7xSe3Q==", + "requires": { + "side-channel": "^1.1.0" + } } } }, diff --git a/frontend/package.json b/frontend/package.json index 575f70a..9e0f33c 100644 --- a/frontend/package.json +++ b/frontend/package.json @@ -13,7 +13,7 @@ "lint": "next lint" }, "dependencies": { - "express": "4.21.2", + "express": "4.22.1", "next": "15.5.15", "react": "19.0.3", "react-dom": "19.0.3" From 20f9312868517757b97269faa4f0167b56681b02 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 15:53:19 +0200 Subject: [PATCH 04/48] Auto-finish: gx doctor repairs (#241) Co-authored-by: NagyVikt --- .gitignore | 17 ++++------------- AGENTS.md | 9 ++++++--- scripts/agent-file-locks.py | 1 - 3 files changed, 10 insertions(+), 17 deletions(-) diff --git a/.gitignore b/.gitignore index 6c2f557..9cfa814 100644 --- a/.gitignore +++ b/.gitignore @@ -78,23 +78,14 @@ openspec/plan/* # multiagent-safety:START .omx/ .omc/ +scripts/* scripts/agent-branch-start.sh -scripts/agent-branch-finish.sh -scripts/codex-agent.sh -scripts/review-bot-watch.sh -scripts/agent-worktree-prune.sh scripts/agent-file-locks.py -scripts/install-agent-git-hooks.sh -scripts/openspec/init-plan-workspace.sh -scripts/openspec/init-change-workspace.sh -.githooks/pre-commit -.githooks/pre-push -.githooks/post-merge -.githooks/post-checkout +.githooks oh-my-codex/ -.codex/skills/guardex/SKILL.md +.codex/skills/gitguardex/SKILL.md .codex/skills/guardex-merge-skills-to-dev/SKILL.md -.claude/commands/guardex.md +.claude/commands/gitguardex.md .omx/state/agent-file-locks.json # multiagent-safety:END diff --git a/AGENTS.md b/AGENTS.md index 2910a4f..d386fed 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -253,7 +253,7 @@ scripts/openspec/init-plan-workspace.sh ``` -## Multi-Agent Safety Contract +## Multi-Agent Execution Contract (GX) **Repo toggle.** Guardex is enabled by default. 
If the repo root `.env` sets `GUARDEX_ON=0`, `false`, `no`, or `off`, treat this entire Guardex contract as disabled for the repo and do not require Guardex worktrees, lock claims, completion flow, or OpenSpec workflow until `GUARDEX_ON` is re-enabled. @@ -261,13 +261,16 @@ scripts/openspec/init-plan-workspace.sh `GUARDEX_ON=0` disables Guardex for that repo. `GUARDEX_ON=1` explicitly enables Guardex for that repo again. -**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `scripts/agent-branch-start.sh "<task>" "<description>"`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout <base>` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions — bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. +**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `scripts/agent-branch-start.sh "<task>" "<description>"`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout <base>` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. +For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. +Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree. **Ownership.** Before editing, claim files: `scripts/agent-file-locks.py claim --branch "<branch>" <paths...>`. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned.
**Handoff gate.** Post a one-line handoff note (plan/change, owned scope, intended action) before editing. Re-read the latest handoffs before replacing others' code. -**Completion.** Finish with `scripts/agent-branch-finish.sh --branch "<branch>" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop — don't half-finish. +**Completion.** Finish with `scripts/agent-branch-finish.sh --branch "<branch>" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish. +OMX completion policy: when a task is done, the agent must commit the task changes, push the agent branch, and create/update a PR before considering the branch complete. **Parallel safety.** Assume other agents edit nearby. Never revert unrelated changes. Report conflicts in the handoff. diff --git a/scripts/agent-file-locks.py b/scripts/agent-file-locks.py index 53c2d2f..06cdd7a 100755 --- a/scripts/agent-file-locks.py +++ b/scripts/agent-file-locks.py @@ -27,7 +27,6 @@ 'AGENTS.md', '.githooks/pre-commit', '.githooks/pre-push', - '.githooks/post-merge', 'scripts/agent-branch-start.sh', 'scripts/agent-branch-finish.sh', 'scripts/agent-file-locks.py', From 887d8d3b270d99d26ef7227e4585b216db6bbf14 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 15:53:29 +0200 Subject: [PATCH 05/48] Clarify how GitGuardex updates AGENTS contract docs (#242) README now explains the current setup/doctor behavior for repo guidance files.
It makes the marker-managed AGENTS refresh explicit, shows that existing repo-owned text is preserved outside the managed block, and clarifies that root CLAUDE.md is left alone unless the repo chooses to symlink it to AGENTS. A small visual flow and before/after examples make the behavior easier to scan during setup decisions. Constraint: README wording needed to match the current ensureAgentsSnippet implementation and install regression tests Rejected: Describe root CLAUDE.md as auto-managed | inaccurate because setup installs .claude/commands/gitguardex.md instead Confidence: high Scope-risk: narrow Directive: Keep this section aligned with ensureAgentsSnippet semantics and AGENTS refresh tests when setup/doctor behavior changes Tested: npm test -- --test-name-pattern="setup refreshes existing managed AGENTS block by default|doctor refreshes existing managed AGENTS block by default" Not-tested: Mermaid and ASCII rendering differences across npm and GitHub surfaces Co-authored-by: NagyVikt --- README.md | 56 +++++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 56 insertions(+) diff --git a/README.md b/README.md index f5c0e1c..5e22f3f 100644 --- a/README.md +++ b/README.md @@ -74,6 +74,62 @@ That's it. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/cave --- +## AGENTS.md and CLAUDE.md behavior + +GitGuardex does **not** clear repo-owned guidance just because it needs to install its own contract. + +- `AGENTS.md` is managed by marker block. `gx setup` and `gx doctor` only refresh the `<!-- multiagent-safety:START --> ... <!-- multiagent-safety:END -->` section. +- Text before or after that marker block stays yours. +- If `AGENTS.md` exists but has no Guardex markers yet, GitGuardex appends its contract to the end instead of replacing the file. +- If `AGENTS.md` does not exist, GitGuardex creates it. +- Root `CLAUDE.md` is **not** separately rewritten by setup/doctor. If your repo keeps a separate `CLAUDE.md`, GitGuardex leaves it alone.
In this repo `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. GitGuardex also installs `.claude/commands/gitguardex.md` for Claude command guidance. + +```mermaid +flowchart TD + A[Existing AGENTS.md] --> B{Guardex markers already present?} + B -->|Yes| C[Refresh only the managed block] + B -->|No| D[Append managed block to the end] + E[No AGENTS.md yet] --> F[Create AGENTS.md with managed block] + C --> G[Keep repo-owned text before and after] + D --> G +``` + +Before / after: + +```md +# AGENTS + +Project-specific guidance before managed block. + +<!-- multiagent-safety:START --> +- old managed contract +<!-- multiagent-safety:END --> + +Trailing repo notes after managed block. +``` + +```md +# AGENTS + +Project-specific guidance before managed block. + +<!-- multiagent-safety:START --> +- current GitGuardex-managed contract +<!-- multiagent-safety:END --> + +Trailing repo notes after managed block. +``` + +Visual model: + +```text +CLAUDE.md -> AGENTS.md + ├─ repo-owned text stays + └─ Guardex refreshes only the marker block +``` + +--- + ## What `gx` shows first Before you branch, repair, or start agents, run plain `gx`. It gives you a one-screen status view for the CLI, global helpers, repo safety service, current repo path, and active branch. From 412258495bb386b5bf057776e47ff0654c362be6 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 16:11:50 +0200 Subject: [PATCH 06/48] Make doctor failures visually obvious in terminal output (#243) Doctor already prints compact actionable status lines, but fail and success output blend together during long recursive runs. This colors semantic doctor lines only and supports FORCE_COLOR so ANSI coverage can be verified deterministically.
Constraint: Human-readable doctor output must remain plain in non-color terminals Rejected: Add a dedicated doctor color flag | FORCE_COLOR already covers explicit ANSI opt-in without expanding the CLI surface Confidence: high Scope-risk: narrow Directive: Keep doctor status coloring scoped to semantic outcome lines; do not color every repair bullet without rechecking readability Tested: node --check bin/multiagent-safety.js; node --test --test-name-pattern "doctor" test/install.test.js; openspec validate agent-codex-doctor-status-colors-2026-04-21-15-58 --type change --strict; openspec validate --specs Not-tested: Full repo non-doctor test suite Co-authored-by: NagyVikt --- bin/multiagent-safety.js | 105 ++++++++++++++++-- .../.openspec.yaml | 2 + .../notes.md | 5 + .../proposal.md | 17 +++ .../specs/doctor-workflow/spec.md | 21 ++++ .../tasks.md | 23 ++++ test/install.test.js | 53 +++++++++ 7 files changed, 215 insertions(+), 11 deletions(-) create mode 100644 openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/.openspec.yaml create mode 100644 openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/notes.md create mode 100644 openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/proposal.md create mode 100644 openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/specs/doctor-workflow/spec.md create mode 100644 openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/tasks.md diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index 8a33b4d..b291bee 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -357,7 +357,17 @@ function runtimeVersion() { } function supportsAnsiColors() { - return Boolean(process.stdout.isTTY) && !process.env.NO_COLOR && process.env.TERM !== 'dumb'; + const forced = String(process.env.FORCE_COLOR || '').trim().toLowerCase(); + if (['0', 'false', 'no', 'off'].includes(forced)) { + return false; + } + if (forced.length > 0) { + return true; + } + 
if (process.env.NO_COLOR) { + return false; + } + return Boolean(process.stdout.isTTY) && process.env.TERM !== 'dumb'; } function colorize(text, colorCode) { @@ -367,6 +377,56 @@ function colorize(text, colorCode) { return `\u001B[${colorCode}m${text}\u001B[0m`; } +function doctorOutputColorCode(status) { + const normalized = String(status || '').trim().toLowerCase(); + if (['active', 'done', 'ok', 'safe', 'success'].includes(normalized)) { + return '32'; + } + if (normalized === 'disabled') { + return '36'; + } + if (['degraded', 'pending', 'skip', 'warn', 'warning'].includes(normalized)) { + return '33'; + } + if (['error', 'fail', 'inactive', 'unsafe'].includes(normalized)) { + return '31'; + } + return null; +} + +function colorizeDoctorOutput(text, status) { + const colorCode = doctorOutputColorCode(status); + return colorCode ? colorize(text, colorCode) : text; +} + +function detectAutoFinishDetailStatus(detail) { + const trimmed = String(detail || '').trim(); + const match = trimmed.match(/^\[(\w+)\]/); + if (match) { + return match[1].toLowerCase(); + } + if (/^Skipped\b/i.test(trimmed) || /^No local agent branches found\b/i.test(trimmed)) { + return 'skip'; + } + return null; +} + +function detectAutoFinishSummaryStatus(summary) { + if (!summary || summary.enabled === false) { + return detectAutoFinishDetailStatus(summary?.details?.[0]); + } + if ((summary.failed || 0) > 0) { + return 'fail'; + } + if ((summary.completed || 0) > 0) { + return 'done'; + } + if ((summary.skipped || 0) > 0) { + return 'skip'; + } + return null; +} + function statusDot(status) { if (status === 'active') { return colorize('●', '32'); // green @@ -604,22 +664,29 @@ function printAutoFinishSummary(summary, options = {}) { if (enabled) { console.log( - `[${TOOL_NAME}] Auto-finish sweep (base=${baseBranch}): attempted=${summary.attempted}, completed=${summary.completed}, skipped=${summary.skipped}, failed=${summary.failed}`, + colorizeDoctorOutput( + `[${TOOL_NAME}] Auto-finish 
sweep (base=${baseBranch}): attempted=${summary.attempted}, completed=${summary.completed}, skipped=${summary.skipped}, failed=${summary.failed}`, + detectAutoFinishSummaryStatus(summary), + ), ); const visibleDetails = verbose ? details : details.slice(0, detailLimit).map(summarizeAutoFinishDetail); for (const detail of visibleDetails) { - console.log(`[${TOOL_NAME}] ${detail}`); + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail))); } if (!verbose && details.length > detailLimit) { console.log( - `[${TOOL_NAME}] … ${details.length - detailLimit} more branch result(s). Re-run with --verbose-auto-finish for full details.`, + colorizeDoctorOutput( + `[${TOOL_NAME}] … ${details.length - detailLimit} more branch result(s). Re-run with --verbose-auto-finish for full details.`, + 'warn', + ), ); } return; } if (details.length > 0) { - console.log(`[${TOOL_NAME}] ${verbose ? details[0] : summarizeAutoFinishDetail(details[0])}`); + const detail = verbose ? details[0] : summarizeAutoFinishDetail(details[0]); + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail))); } } @@ -5043,21 +5110,34 @@ function printScanResult(scan, json = false) { if (scan.guardexEnabled === false) { console.log( - `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(scan.guardexToggle)}).`, + colorizeDoctorOutput( + `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(scan.guardexToggle)}).`, + 'disabled', + ), ); return; } if (scan.findings.length === 0) { - console.log(`[${TOOL_NAME}] ✅ No safety issues detected.`); + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ No safety issues detected.`, 'safe')); return; } for (const item of scan.findings) { const target = item.path ? 
` (${item.path})` : ''; - console.log(`[${item.level.toUpperCase()}] ${item.code}${target}: ${item.message}`); + console.log( + colorizeDoctorOutput( + `[${item.level.toUpperCase()}] ${item.code}${target}: ${item.message}`, + item.level, + ), + ); } - console.log(`[${TOOL_NAME}] Summary: ${scan.errors} error(s), ${scan.warnings} warning(s).`); + console.log( + colorizeDoctorOutput( + `[${TOOL_NAME}] Summary: ${scan.errors} error(s), ${scan.warnings} warning(s).`, + scan.errors > 0 ? 'error' : 'warn', + ), + ); } function setExitCodeFromScan(scan) { @@ -5498,10 +5578,13 @@ function doctor(rawArgs) { verbose: singleRepoOptions.verboseAutoFinish, }); if (safe) { - console.log(`[${TOOL_NAME}] ✅ Repo is fully safe.`); + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ Repo is fully safe.`, 'safe')); } else { console.log( - `[${TOOL_NAME}] ⚠️ Repo is not fully safe yet (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`, + colorizeDoctorOutput( + `[${TOOL_NAME}] ⚠️ Repo is not fully safe yet (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`, + scanResult.errors > 0 ? 
'unsafe' : 'warn', + ), ); } setExitCodeFromScan(scanResult); diff --git a/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/.openspec.yaml b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/notes.md b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/notes.md new file mode 100644 index 0000000..f85e383 --- /dev/null +++ b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/notes.md @@ -0,0 +1,5 @@ +# T1 Notes + +- Color `gx doctor` failure lines red so blocked auto-finish rows are visible in long recursive runs. +- Color doctor success lines green, including `No safety issues detected.` and `Repo is fully safe.`, while keeping non-TTY output unchanged. +- Add a regression that forces ANSI output and proves doctor renders colors for both failure and success status lines. diff --git a/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/proposal.md b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/proposal.md new file mode 100644 index 0000000..3005336 --- /dev/null +++ b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/proposal.md @@ -0,0 +1,17 @@ +## Why + +- `gx doctor` already prints compact actionable statuses, but the success and failure lines all use the same default terminal color and are easy to miss in long recursive runs. +- Auto-finish failures are the most actionable doctor output, yet they visually blend into the surrounding safe scan output. +- The CLI needs a deterministic way to emit ANSI colors during automated verification so status-color regressions can be tested. 
+ +## What Changes + +- Color human-readable `gx doctor` success lines green. +- Color doctor failure lines red and skip/pending lines yellow. +- Honor the standard `FORCE_COLOR` environment variable so ANSI output can be verified in tests without changing non-color output defaults. + +## Impact + +- Affects only the human-readable doctor/status CLI output when ANSI colors are enabled. +- JSON output and non-color terminals remain unchanged. +- Main risk: over-coloring could reduce readability, so the change stays scoped to doctor scan/final status lines and auto-finish summary/detail rows. diff --git a/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/specs/doctor-workflow/spec.md b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/specs/doctor-workflow/spec.md new file mode 100644 index 0000000..ee1405f --- /dev/null +++ b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/specs/doctor-workflow/spec.md @@ -0,0 +1,21 @@ +## ADDED Requirements + +### Requirement: `gx doctor` uses semantic status colors +When ANSI color output is enabled, the human-readable `gx doctor` workflow SHALL color success lines green, failure lines red, and skip or pending lines yellow. 
+ +#### Scenario: safe doctor lines render green +- **GIVEN** `gx doctor` runs in human-readable mode with ANSI color output enabled +- **WHEN** the repo scan reports `No safety issues detected.` and doctor reaches `Repo is fully safe.` +- **THEN** both success lines SHALL be emitted in green + +#### Scenario: doctor auto-finish failures render red +- **GIVEN** `gx doctor` runs in human-readable mode with ANSI color output enabled +- **AND** the auto-finish sweep reports at least one failed branch result +- **WHEN** doctor prints the auto-finish summary and failed branch detail +- **THEN** the failure summary line SHALL be emitted in red +- **AND** the failed branch detail line SHALL be emitted in red + +#### Scenario: doctor skip or pending lines render yellow +- **GIVEN** `gx doctor` runs in human-readable mode with ANSI color output enabled +- **WHEN** doctor prints a skipped or pending auto-finish line +- **THEN** that line SHALL be emitted in yellow diff --git a/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/tasks.md b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/tasks.md new file mode 100644 index 0000000..87cbc48 --- /dev/null +++ b/openspec/changes/agent-codex-doctor-status-colors-2026-04-21-15-58/tasks.md @@ -0,0 +1,23 @@ +## 1. Specification + +- [x] 1.1 Define the doctor status-color requirement for human-readable output. + +## 2. Implementation + +- [x] 2.1 Color doctor success/failure/pending lines with semantic ANSI colors when color output is enabled. +- [x] 2.2 Add a regression that forces ANSI output and checks both red failure lines and green success lines. + +## 3. Verification + +- [x] 3.1 Run `node --check bin/multiagent-safety.js`. +- [x] 3.2 Run `node --test --test-name-pattern "doctor" test/install.test.js`. +- [x] 3.3 Run `openspec validate agent-codex-doctor-status-colors-2026-04-21-15-58 --type change --strict`. +- [x] 3.4 Run `openspec validate --specs`. 
+ +Verification note: `node --check bin/multiagent-safety.js` passed. `node --test --test-name-pattern "doctor" test/install.test.js` passed with 18 doctor-focused tests, including the new forced-color regression. `openspec validate agent-codex-doctor-status-colors-2026-04-21-15-58 --type change --strict` passed, and `openspec validate --specs` returned `No items found to validate.` + +## 4. Completion + +- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch <branch> --base <base> --via-pr --wait-for-merge --cleanup`). +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. diff --git a/test/install.test.js b/test/install.test.js index 9717db2..96f96e7 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -1526,6 +1526,59 @@ exit 1 assert.match(verboseOutput, /git -C ".+rebase --continue/); }); +test('doctor colors failure and success status lines when color output is enabled', () => { + const repoDir = initRepoOnBranch('main'); + seedCommit(repoDir); + attachOriginRemoteForBranch(repoDir, 'main'); + const { readyBranch, readyWorktree, fileName } = prepareDoctorAutoFinishReadyBranch(repoDir, { + taskName: 'doctor-color-status', + fileName: 'doctor-color-status.txt', + }); + + let result = runCmd('git', ['worktree', 'remove', readyWorktree, '--force'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + + fs.writeFileSync(path.join(repoDir, fileName), 'main branch conflicting color change\n', 'utf8'); + result = runCmd('git', ['add', fileName], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + result = runCmd('git', ['commit', '-m', 'main branch conflicting color change'], repoDir, { + ALLOW_COMMIT_ON_PROTECTED_BRANCH: '1', + }); + assert.equal(result.status, 0,
result.stderr || result.stdout); + result = runCmd('git', ['push', 'origin', 'main'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const { fakePath: fakeGhPath } = createFakeGhScript(` +if [[ "$1" == "--version" ]]; then + echo "gh version 2.0.0" + exit 0 +fi +echo "unexpected gh args: $*" >&2 +exit 1 +`); + + result = runNodeWithEnv( + ['doctor', '--target', repoDir, '--allow-protected-base-write'], + repoDir, + { GUARDEX_GH_BIN: fakeGhPath, FORCE_COLOR: '1' }, + ); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const ansiOutput = `${result.stdout}\n${result.stderr}`; + assert.match(ansiOutput, /\u001B\[32m\[gitguardex\] ✅ No safety issues detected\.\u001B\[0m/); + assert.match( + ansiOutput, + /\u001B\[31m\[gitguardex\] Auto-finish sweep \(base=main\): attempted=1, completed=0, skipped=\d+, failed=1\u001B\[0m/, + ); + assert.match( + ansiOutput, + new RegExp( + `\\u001B\\[31m\\[gitguardex\\]\\s+\\[fail\\] ${escapeRegexLiteral(readyBranch)}: rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree\\u001B\\[0m`, + ), + ); + assert.match(ansiOutput, /\u001B\[32m\[gitguardex\] ✅ Repo is fully safe\.\u001B\[0m/); +}); + test('setup pre-commit blocks codex session commits on non-agent branches by default', () => { const repoDir = initRepo(); From 0a4ec3e38e23275f3bd971dfc62c2050eab4b25c Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 16:26:59 +0200 Subject: [PATCH 07/48] Clarify how Guardex sandboxes get cleaned up (#244) The README still showed post-merge cleanup as optional and did not explain the case where one agent implements a change but another agent has to close the sandbox after a usage-limit stop. This aligns the daily workflow with the current finish flags, adds a direct worktree-removal escape hatch, and ties the OpenSpec collaboration checklist to real cleanup handoffs. 
Constraint: README should match the current agent-branch-finish --cleanup flow on main Rejected: Mention cleanup only in Known rough edges | did not tell users how to remove unwanted worktrees or why collaboration handoffs exist Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep README cleanup examples aligned with finish-script defaults and OpenSpec collaboration tasks Tested: git diff --check; npm test partial pass before hang in install.test.js Not-tested: Full npm test completion (hung in test/install.test.js hook-fail-task path) Co-authored-by: NagyVikt --- README.md | 18 ++++++++++++------ .../.openspec.yaml | 2 ++ .../notes.md | 5 +++++ 3 files changed, 19 insertions(+), 6 deletions(-) create mode 100644 openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/.openspec.yaml create mode 100644 openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/notes.md diff --git a/README.md b/README.md index 5e22f3f..42e5a8c 100644 --- a/README.md +++ b/README.md @@ -155,17 +155,22 @@ python3 scripts/agent-file-locks.py claim \ # 3) Implement + verify npm test -# 4) Finish (commit + push + PR + merge) +# 4) Finish (commit + push + PR + merge + cleanup) bash scripts/agent-branch-finish.sh \ --branch "$(git rev-parse --abbrev-ref HEAD)" \ - --base dev --via-pr --wait-for-merge - -# 5) Optional: cleanup after merge -gx cleanup --branch "$(git rev-parse --abbrev-ref HEAD)" + --base main --via-pr --wait-for-merge --cleanup ``` If you use `scripts/codex-agent.sh`, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. +Guardex normally prunes merged sandboxes for you as part of the finish flow. 
If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane:
+
+```sh
+git worktree remove .omx/agent-worktrees/<branch>
+# Claude Code sandboxes live under .omc/agent-worktrees/<branch>
+git branch -D agent/<agent>/<task>  # optional, only if you are discarding the lane
+```
+
 Running Codex across several existing worktrees (e.g. from VS Code Source Control)? Finalize everything ready at once:

 ```sh
@@ -489,6 +494,7 @@ Expanded flow:
 - `scripts/codex-agent.sh` enforces OpenSpec workspaces before launching Codex.
 - `scripts/agent-branch-start.sh` can scaffold both `openspec/changes/<change>/` and `openspec/plan/<change>/` when `GUARDEX_OPENSPEC_AUTO_INIT=true`.
+- The collaboration section in `tasks.md` is there for real cleanup handoffs too. If the first Codex/Claude session finishes the implementation work but hits a usage limit before `agent-branch-finish --cleanup`, hand the same sandbox to another agent, let that agent finish cleanup, and record the join/handoff in the change task.

 Environment variables:
@@ -553,7 +559,7 @@ gh workflow run sync-frontend-mirror.yml

 Being honest about where this still has issues:

-- **Usage limit mid-task.** When an agent hits its Codex/Claude usage limit partway through, the cleanup flow currently has to be handed to a different agent. It works, but the handoff is uglier than I'd like.
+- **Usage limit mid-task.** When an agent hits its Codex/Claude usage limit partway through, another agent may need to take over the same sandbox and run the remaining finish/cleanup steps. The OpenSpec collaboration checklist is there to capture that handoff, but it is still uglier than I'd like.
 - **Conflict-stuck probes.** Fixed in v7.0.2 — earlier versions could leak `__source-probe-*` worktrees when the sync-guard rebase hit conflicts. If you're on an older release, `gx cleanup` sweeps these.
 - **Windows.** Most of the hook surface assumes a POSIX shell.
Use WSL or symlink-enabled git if you're on Windows. diff --git a/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/.openspec.yaml b/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/notes.md b/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/notes.md new file mode 100644 index 0000000..abd2483 --- /dev/null +++ b/openspec/changes/agent-codex-document-worktree-cleanup-handoff-2026-04-21-16-19/notes.md @@ -0,0 +1,5 @@ +# T1 Notes + +- Update the README daily workflow example so finish-time cleanup is the default merged-sandbox path. +- Document the manual `git worktree remove` escape hatch for users who simply want a local worktree gone. +- Explain that OpenSpec collaboration checklists also cover cross-agent cleanup handoffs when the original Codex/Claude session runs out of runway. From 75aae919e6805b843f80aef6338748ec85fcc0ca Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 16:37:33 +0200 Subject: [PATCH 08/48] Keep npm release verification green when the version is already live (#245) The current main branch already passes the release-path test suite, but workflow_dispatch and backfill release runs can still fail later on npm publish when the exact package version is already on the registry. This teaches the release workflow to detect that case and skip publish cleanly while leaving verification intact. 
Constraint: Existing published versions on npm are immutable and should not make healthy verification runs fail Constraint: Release verification must still run on every workflow invocation before publish decisions are made Rejected: Remove workflow_dispatch support | would block maintainer-side verification on current main Rejected: Skip npm publish unconditionally on manual runs | hides legitimate unpublished release failures Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep the release workflow idempotent for already-published versions so GitHub release state can be backfilled without forcing a version bump Tested: npm test Tested: node --check bin/multiagent-safety.js Tested: npm pack --dry-run Tested: openspec validate agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30 --type change --strict Tested: openspec validate --specs Not-tested: Live GitHub Actions workflow execution before merge Co-authored-by: NagyVikt --- .github/workflows/release.yml | 26 +++++++++++++++++++ .../.openspec.yaml | 2 ++ .../proposal.md | 14 ++++++++++ .../specs/release-workflow/spec.md | 10 +++++++ .../tasks.md | 21 +++++++++++++++ test/metadata.test.js | 11 ++++++++ 6 files changed, 84 insertions(+) create mode 100644 openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/.openspec.yaml create mode 100644 openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/proposal.md create mode 100644 openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/specs/release-workflow/spec.md create mode 100644 openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/tasks.md diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml index ab02297..40df43a 100644 --- a/.github/workflows/release.yml +++ b/.github/workflows/release.yml @@ -35,5 +35,31 @@ jobs: node --check bin/multiagent-safety.js npm pack --dry-run + - 
name: Resolve package metadata + id: pkg + run: | + echo "name=$(node -p "require('./package.json').name")" >> "$GITHUB_OUTPUT" + echo "version=$(node -p "require('./package.json').version")" >> "$GITHUB_OUTPUT" + + - name: Check npm registry for current version + id: registry + env: + PACKAGE_NAME: ${{ steps.pkg.outputs.name }} + PACKAGE_VERSION: ${{ steps.pkg.outputs.version }} + run: | + if npm view "${PACKAGE_NAME}@${PACKAGE_VERSION}" version >/dev/null 2>&1; then + echo "already_published=true" >> "$GITHUB_OUTPUT" + else + echo "already_published=false" >> "$GITHUB_OUTPUT" + fi + - name: Publish with provenance + if: ${{ steps.registry.outputs.already_published != 'true' }} run: npm publish --provenance --access public + + - name: Skip already-published npm version + if: ${{ steps.registry.outputs.already_published == 'true' }} + env: + PACKAGE_NAME: ${{ steps.pkg.outputs.name }} + PACKAGE_VERSION: ${{ steps.pkg.outputs.version }} + run: echo "${PACKAGE_NAME}@${PACKAGE_VERSION} is already on npm; skipping publish." 
diff --git a/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/.openspec.yaml b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/proposal.md b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/proposal.md new file mode 100644 index 0000000..fe6eb1a --- /dev/null +++ b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/proposal.md @@ -0,0 +1,14 @@ +## Why + +- The failing `Release to npm (provenance)` runs in GitHub Actions are stale release-tag executions; current `main` is green locally, and `@imdeadpool/guardex@7.0.16` is already published on npm. +- Today the workflow always executes `npm publish`, so manual verification runs or backfill GitHub releases for an already-published version would fail even when the code under test is healthy. + +## What Changes + +- Teach `.github/workflows/release.yml` to resolve the package name/version from `package.json`, check npm for that exact version, and skip `npm publish` when it already exists. +- Add a metadata regression test in `test/metadata.test.js` that locks the workflow's already-published skip behavior. + +## Impact + +- Makes the release workflow idempotent for already-published versions while preserving the existing verify steps and normal publish path for new versions. +- Lets maintainers run `workflow_dispatch` on a healthy `main` release commit without re-breaking on an already-published npm version. 
diff --git a/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/specs/release-workflow/spec.md b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/specs/release-workflow/spec.md new file mode 100644 index 0000000..0e3ae90 --- /dev/null +++ b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/specs/release-workflow/spec.md @@ -0,0 +1,10 @@ +## ADDED Requirements + +### Requirement: Release workflow skips already-published package versions +The `Release to npm (provenance)` workflow SHALL verify the package on every run, but it SHALL skip `npm publish` when the exact `package.json` version is already present on npm. + +#### Scenario: workflow_dispatch on an already-published version +- **GIVEN** the workflow is running against a commit whose `package.json` version already exists on npm +- **WHEN** the verify steps complete successfully +- **THEN** the workflow SHALL detect that published version before the publish step +- **AND** it SHALL report that publish is being skipped instead of failing on `npm publish`. diff --git a/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/tasks.md b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/tasks.md new file mode 100644 index 0000000..b326bbc --- /dev/null +++ b/openspec/changes/agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30/tasks.md @@ -0,0 +1,21 @@ +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30`. +- [x] 1.2 Define normative requirements in `specs/release-workflow/spec.md`. + +## 2. Implementation + +- [x] 2.1 Update `.github/workflows/release.yml` so it skips `npm publish` when the current package version already exists on npm. 
+- [x] 2.2 Add/update `test/metadata.test.js` regression coverage for the release-workflow skip behavior. + +## 3. Verification + +- [x] 3.1 Run `npm test`, `node --check bin/multiagent-safety.js`, and `npm pack --dry-run`. Result: `npm test` passed `152/152`; `node --check bin/multiagent-safety.js` passed; `npm pack --dry-run` produced `imdeadpool-guardex-7.0.16.tgz`. +- [x] 3.2 Run `openspec validate agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30 --type change --strict`. Result: `Change 'agent-codex-make-release-workflow-idempotent-when-ve-2026-04-21-16-30' is valid`. +- [x] 3.3 Run `openspec validate --specs`. Result: `No items found to validate.` + +## 4. Completion + +- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. 
diff --git a/test/metadata.test.js b/test/metadata.test.js index e2b91a6..4d63748 100644 --- a/test/metadata.test.js +++ b/test/metadata.test.js @@ -31,6 +31,17 @@ test('release workflow publishes with provenance in CI', () => { assert.match(workflow, /npm publish --provenance --access public/); }); +test('release workflow skips publish when the current version is already on npm', () => { + const workflowPath = path.join(repoRoot, '.github', 'workflows', 'release.yml'); + const workflow = fs.readFileSync(workflowPath, 'utf8'); + assert.match(workflow, /name:\s+Resolve package metadata/); + assert.match(workflow, /name:\s+Check npm registry for current version/); + assert.match(workflow, /npm view "\$\{PACKAGE_NAME\}@\$\{PACKAGE_VERSION\}" version/); + assert.match(workflow, /if:\s+\$\{\{\s*steps\.registry\.outputs\.already_published != 'true'\s*\}\}/); + assert.match(workflow, /if:\s+\$\{\{\s*steps\.registry\.outputs\.already_published == 'true'\s*\}\}/); + assert.match(workflow, /skipping publish\./); +}); + test('release workflow only publishes from published releases or manual dispatch', () => { const workflowPath = path.join(repoRoot, '.github', 'workflows', 'release.yml'); const workflow = fs.readFileSync(workflowPath, 'utf8'); From 495c867028f4f19250dcd5a0a435832d9c40b610 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 16:38:36 +0200 Subject: [PATCH 09/48] Bring the README docs in line with current Guardex behavior (#246) The docs still showed the older AGENTS marker explanation and an outdated gx status illustration, so the README was refreshed to match the current CLI surface and the managed-block guidance the user asked to add. This keeps the changes confined to documentation: one richer README section, one updated terminal SVG, and the matching T1 note for the branch. 
Constraint: Keep the change docs-only and preserve the user-provided README wording for the AGENTS.md/CLAUDE.md section Rejected: Add the new AGENTS guidance as a second duplicate section | would leave conflicting docs in the same README Confidence: high Scope-risk: narrow Reversibility: clean Directive: Update the terminal-status SVG whenever gx command/service output materially changes so README screenshots do not drift Tested: git diff --check; rendered docs/images/workflow-gx-terminal-status.svg to /tmp/workflow-gx-terminal-status.png for visual review Not-tested: npm test (skipped because this is a docs/SVG-only change and the suite previously hung in test/install.test.js) Co-authored-by: NagyVikt --- README.md | 100 ++++++++++------- docs/images/workflow-gx-terminal-status.svg | 103 ++++++++++++------ .../.openspec.yaml | 2 + .../notes.md | 5 + 4 files changed, 138 insertions(+), 72 deletions(-) create mode 100644 openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/.openspec.yaml create mode 100644 openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/notes.md diff --git a/README.md b/README.md index 42e5a8c..94886ea 100644 --- a/README.md +++ b/README.md @@ -74,60 +74,84 @@ That's it. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/cave --- -## AGENTS.md and CLAUDE.md behavior +## How `AGENTS.md` and `CLAUDE.md` are handled -GitGuardex does **not** clear repo-owned guidance just because it needs to install its own contract. +> [!IMPORTANT] +> **GitGuardex never overwrites your guidance.** Only the content between these markers is managed: +> +> ```text +> +> ... managed content ... +> +> ``` +> +> Everything outside that block is preserved byte-for-byte. -- `AGENTS.md` is managed by marker block. `gx setup` and `gx doctor` only refresh the ` ... ` section. -- Text before or after that marker block stays yours. 
-- If `AGENTS.md` exists but has no Guardex markers yet, GitGuardex appends its contract to the end instead of replacing the file. -- If `AGENTS.md` does not exist, GitGuardex creates it. -- Root `CLAUDE.md` is **not** separately rewritten by setup/doctor. If your repo keeps a separate `CLAUDE.md`, GitGuardex leaves it alone. In this repo `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. GitGuardex also installs `.claude/commands/gitguardex.md` for Claude command guidance. +### Behavior at a glance -```mermaid -flowchart TD - A[Existing AGENTS.md] --> B{Guardex markers already present?} - B -->|Yes| C[Refresh only the managed block] - B -->|No| D[Append managed block to the end] - E[No AGENTS.md yet] --> F[Create AGENTS.md with managed block] - C --> G[Keep repo-owned text before and after] - D --> G -``` +
-Before / after: +| Your repo has… | `gx setup` / `gx doctor` does… | +| :--- | :--- | +| `AGENTS.md` **with** markers | Refreshes **only** the managed block | +| `AGENTS.md` **without** markers | Appends the managed block to the end | +| No `AGENTS.md` | Creates it with the managed block | +| A root `CLAUDE.md` | Leaves it alone | -```md -# AGENTS +
-Project-specific guidance before managed block. +> [!NOTE] +> In this repo, `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. Claude-specific command guidance is installed separately at `.claude/commands/gitguardex.md`. - -- old managed contract - +### Decision flow -Trailing repo notes after managed block. -``` - -```md -# AGENTS +```mermaid +flowchart TD + Start([gx setup / gx doctor]) + Check{AGENTS.md
exists?} + Markers{Markers
present?} + Create[Create AGENTS.md
with managed block] + Refresh[Refresh the
managed block] + Append[Append managed block
to end of file] + Done([Repo-owned text preserved]) -Project-specific guidance before managed block. + Start --> Check + Check -- No --> Create + Check -- Yes --> Markers + Markers -- Yes --> Refresh + Markers -- No --> Append + Create --> Done + Refresh --> Done + Append --> Done - -- current GitGuardex-managed contract - + classDef entry fill:#0b76c5,stroke:#60a5fa,stroke-width:2px,color:#fff + classDef decide fill:#78350f,stroke:#fbbf24,stroke-width:2px,color:#fff + classDef action fill:#374151,stroke:#94a3b8,stroke-width:1.5px,color:#f1f5f9 + classDef finish fill:#064e3b,stroke:#34d399,stroke-width:2px,color:#fff -Trailing repo notes after managed block. + class Start entry + class Check,Markers decide + class Create,Refresh,Append action + class Done finish ``` -Visual model: +### What actually changes -```text -CLAUDE.md -> AGENTS.md - ├─ repo-owned text stays - └─ Guardex refreshes only the marker block +```diff + # AGENTS + + Project-specific guidance before managed block. + + +- - old managed contract ++ - current GitGuardex-managed contract + + + Trailing repo notes after managed block. ``` +Only lines **inside** the marker block change. Everything above and below is preserved exactly. + --- ## What `gx` shows first diff --git a/docs/images/workflow-gx-terminal-status.svg b/docs/images/workflow-gx-terminal-status.svg index f1a0e3f..b49eac7 100644 --- a/docs/images/workflow-gx-terminal-status.svg +++ b/docs/images/workflow-gx-terminal-status.svg @@ -1,45 +1,80 @@ - + GitGuardex terminal status output - A terminal screenshot style illustration showing the gx status help screen with active services, repo path, branch, commands, and doctor hint. + A terminal screenshot style illustration showing the updated gx help screen with active services, repo path, branch, merge and release commands, agent bot controls, repo toggle guidance, and the doctor hint. 
- - + + - - - deadpool@recodee:~/Documents/recodee$ gx - + + deadpool@recodee:~/ + Documents/recodee + $ gx - - [gitguardex] CLI: @imdeadpool/guardex/7.0.10 linux-x64 node-v22.22.0 - [gitguardex] Global services: - - ● oh-my-codex: active - - ● oh-my-claude-sisyphus: active - - ● @fission-ai/openspec: active - - ● cavemem: active - - ● @imdeadpool/codex-account-switcher: active - - ● GitHub (gh): active - [gitguardex] Repo safety service: ● active. - [gitguardex] Repo: /home/deadpool/Documents/recodee - [gitguardex] Branch: dev - gitguardex-tools logs: - - └─ USAGE - $ gx <command> [options] - └─ COMMANDS - status Show GitGuardex CLI + service health without modifying files - setup Install, repair, and verify guardrails (flags: --repair, --install-only, --target) - doctor Repair drift + verify (auto-sandboxes on protected main) - protect Manage protected branches (list/add/remove/set/reset) - sync Sync agent branches with origin/<base> - finish Commit + PR + merge completed agent branches (--all, --branch) - cleanup Prune merged/stale agent branches and worktrees - prompt Print AI setup checklist (--exec, --snippet) - Try 'gitguardex doctor' for one-step repair + verification. + + [gitguardex] CLI: @imdeadpool/guardex/7.0.16 linux-x64 node-v22.22.0 + [gitguardex] Global services: + + - + + oh-my-codex: active + - + + oh-my-claudecode: active + - + + @fission-ai/openspec: active + - + + cavemem: active + - + + @imdeadpool/codex-account-switcher: active + - + + cavekit: active + - + + caveman: active + - + + GitHub (gh): active + + [gitguardex] Repo safety service: + + active. 
+ [gitguardex] Repo: /home/deadpool/Documents/recodee + [gitguardex] Branch: dev + gitguardex-tools logs: + + └─ USAGE + $ gx <command> [options] + + └─ COMMANDS + status Show GitGuardex CLI + service health without modifying files + setup Install, repair, and verify guardrails (flags: --repair, --install-only, --target) + doctor Repair drift + verify (auto-sandboxes on protected main) + protect Manage protected branches (list/add/remove/set/reset) + merge Create/reuse an integration lane and merge overlapping agent branches + sync Sync agent branches with origin/<base> + finish Commit + PR + merge completed agent branches (--all, --branch) + cleanup Prune merged/stale agent branches and worktrees + release Create or update the current GitHub release with README-generated notes + agents Start/stop repo-scoped review + cleanup bots + prompt Print AI setup checklist (--exec, --snippet) + report Security/safety reports (e.g. OpenSSF scorecard) + help Show this help output + version Print Guardex version + + └─ AGENT BOT + agents Start/stop review + cleanup bots for this repo + + └─ REPO TOGGLE + Set repo-root .env: GUARDEX_ON=0 disables Guardex, GUARDEX_ON=1 enables it again + └─ Try 'gitguardex doctor' for one-step repair + verification. 
diff --git a/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/.openspec.yaml b/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/notes.md b/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/notes.md new file mode 100644 index 0000000..714049d --- /dev/null +++ b/openspec/changes/agent-codex-refresh-gx-status-svg-2026-04-21-16-31/notes.md @@ -0,0 +1,5 @@ +# T1 Notes + +- Refresh `docs/images/workflow-gx-terminal-status.svg` so the README terminal snapshot matches the current `gx` output. +- Replace the README `AGENTS.md` / `CLAUDE.md` handling section with the fuller marker-preservation table, note, flowchart, and diff. +- Keep the docs change scoped to the SVG, README section, and matching T1 note only. 
From b9d1b4c341d67ecbdebc89892e41ee284a4ca6da Mon Sep 17 00:00:00 2001 From: NagyVikt Date: Tue, 21 Apr 2026 17:07:25 +0200 Subject: [PATCH 10/48] new --- frontend/scripts/agent-branch-finish.sh | 583 +++++++++++ frontend/scripts/agent-branch-merge.sh | 421 ++++++++ frontend/scripts/agent-branch-start.sh | 614 ++++++++++++ frontend/scripts/agent-file-locks.py | 407 ++++++++ frontend/scripts/agent-worktree-prune.sh | 572 +++++++++++ frontend/scripts/codex-agent.sh | 910 ++++++++++++++++++ frontend/scripts/guardex-docker-loader.sh | 123 +++ frontend/scripts/guardex-env.sh | 74 ++ frontend/scripts/install-agent-git-hooks.sh | 21 + .../scripts/openspec/init-change-workspace.sh | 93 ++ .../scripts/openspec/init-plan-workspace.sh | 160 +++ frontend/scripts/review-bot-watch.sh | 330 +++++++ 12 files changed, 4308 insertions(+) create mode 100755 frontend/scripts/agent-branch-finish.sh create mode 100755 frontend/scripts/agent-branch-merge.sh create mode 100755 frontend/scripts/agent-branch-start.sh create mode 100755 frontend/scripts/agent-file-locks.py create mode 100755 frontend/scripts/agent-worktree-prune.sh create mode 100755 frontend/scripts/codex-agent.sh create mode 100755 frontend/scripts/guardex-docker-loader.sh create mode 100644 frontend/scripts/guardex-env.sh create mode 100755 frontend/scripts/install-agent-git-hooks.sh create mode 100755 frontend/scripts/openspec/init-change-workspace.sh create mode 100755 frontend/scripts/openspec/init-plan-workspace.sh create mode 100755 frontend/scripts/review-bot-watch.sh diff --git a/frontend/scripts/agent-branch-finish.sh b/frontend/scripts/agent-branch-finish.sh new file mode 100755 index 0000000..fb528e1 --- /dev/null +++ b/frontend/scripts/agent-branch-finish.sh @@ -0,0 +1,583 @@ +#!/usr/bin/env bash +set -euo pipefail + +BASE_BRANCH="" +BASE_BRANCH_EXPLICIT=0 +SOURCE_BRANCH="" +PUSH_ENABLED=1 +DELETE_REMOTE_BRANCH=0 +DELETE_REMOTE_BRANCH_EXPLICIT=0 +MERGE_MODE="auto" +GH_BIN="${GUARDEX_GH_BIN:-gh}" 
+CLEANUP_AFTER_MERGE_RAW="${GUARDEX_FINISH_CLEANUP:-false}" +WAIT_FOR_MERGE_RAW="${GUARDEX_FINISH_WAIT_FOR_MERGE:-false}" +WAIT_TIMEOUT_SECONDS_RAW="${GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS:-1800}" +WAIT_POLL_SECONDS_RAW="${GUARDEX_FINISH_WAIT_POLL_SECONDS:-10}" + +normalize_bool() { + local raw="${1:-}" + local fallback="${2:-0}" + local lowered + lowered="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" + case "$lowered" in + 1|true|yes|on) printf '1' ;; + 0|false|no|off) printf '0' ;; + '') printf '%s' "$fallback" ;; + *) printf '%s' "$fallback" ;; + esac +} + +normalize_int() { + local raw="${1:-}" + local fallback="${2:-0}" + local min_value="${3:-0}" + local value="$raw" + + if [[ -z "$value" || ! "$value" =~ ^[0-9]+$ ]]; then + value="$fallback" + fi + + if (( value < min_value )); then + value="$min_value" + fi + + printf '%s' "$value" +} + +CLEANUP_AFTER_MERGE="$(normalize_bool "$CLEANUP_AFTER_MERGE_RAW" "0")" +WAIT_FOR_MERGE="$(normalize_bool "$WAIT_FOR_MERGE_RAW" "0")" +WAIT_TIMEOUT_SECONDS="$(normalize_int "$WAIT_TIMEOUT_SECONDS_RAW" "1800" "30")" +WAIT_POLL_SECONDS="$(normalize_int "$WAIT_POLL_SECONDS_RAW" "10" "0")" + +while [[ $# -gt 0 ]]; do + case "$1" in + --base) + BASE_BRANCH="${2:-}" + BASE_BRANCH_EXPLICIT=1 + shift 2 + ;; + --branch) + SOURCE_BRANCH="${2:-}" + shift 2 + ;; + --no-push) + PUSH_ENABLED=0 + shift + ;; + --keep-remote-branch) + DELETE_REMOTE_BRANCH=0 + DELETE_REMOTE_BRANCH_EXPLICIT=1 + shift + ;; + --delete-remote-branch) + DELETE_REMOTE_BRANCH=1 + DELETE_REMOTE_BRANCH_EXPLICIT=1 + shift + ;; + --cleanup) + CLEANUP_AFTER_MERGE=1 + shift + ;; + --no-cleanup) + CLEANUP_AFTER_MERGE=0 + shift + ;; + --wait-for-merge) + WAIT_FOR_MERGE=1 + shift + ;; + --no-wait-for-merge) + WAIT_FOR_MERGE=0 + shift + ;; + --wait-timeout-seconds) + WAIT_TIMEOUT_SECONDS="$(normalize_int "${2:-}" "1800" "30")" + shift 2 + ;; + --wait-poll-seconds) + WAIT_POLL_SECONDS="$(normalize_int "${2:-}" "10" "0")" + shift 2 + ;; + --mode) + MERGE_MODE="${2:-auto}" + 
shift 2 + ;; + --via-pr) + MERGE_MODE="pr" + shift + ;; + --direct-only) + MERGE_MODE="direct" + shift + ;; + *) + echo "[agent-branch-finish] Unknown argument: $1" >&2 + echo "Usage: $0 [--base ] [--branch ] [--no-push] [--cleanup|--no-cleanup] [--wait-for-merge|--no-wait-for-merge] [--wait-timeout-seconds ] [--wait-poll-seconds ] [--keep-remote-branch|--delete-remote-branch] [--mode auto|direct|pr|--via-pr|--direct-only]" >&2 + exit 1 + ;; + esac +done + +if [[ "$CLEANUP_AFTER_MERGE" -eq 1 && "$DELETE_REMOTE_BRANCH_EXPLICIT" -eq 0 ]]; then + DELETE_REMOTE_BRANCH=1 +fi + +case "$MERGE_MODE" in + auto|direct|pr) ;; + *) + echo "[agent-branch-finish] Invalid --mode value: ${MERGE_MODE} (expected auto|direct|pr)" >&2 + exit 1 + ;; +esac + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[agent-branch-finish] Not inside a git repository." >&2 + exit 1 +fi + +repo_root="$(git rev-parse --show-toplevel)" +current_worktree="$(pwd -P)" +common_git_dir_raw="$(git -C "$repo_root" rev-parse --git-common-dir)" +if [[ "$common_git_dir_raw" == /* ]]; then + common_git_dir="$common_git_dir_raw" +else + common_git_dir="$(cd "$repo_root/$common_git_dir_raw" && pwd -P)" +fi +repo_common_root="$(cd "$common_git_dir/.." && pwd -P)" + +if [[ -z "$SOURCE_BRANCH" ]]; then + SOURCE_BRANCH="$(git rev-parse --abbrev-ref HEAD)" +fi + +stored_worktree_root_rel="$(git -C "$repo_root" config --get "branch.${SOURCE_BRANCH}.guardexWorktreeRoot" || true)" +if [[ -z "$stored_worktree_root_rel" ]]; then + stored_worktree_root_rel=".omx/agent-worktrees" +fi +agent_worktree_root="${repo_common_root}/${stored_worktree_root_rel}" + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -z "$BASE_BRANCH" ]]; then + echo "[agent-branch-finish] --base requires a non-empty branch name." 
>&2 + exit 1 +fi + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 0 ]]; then + source_branch_base="$(git -C "$repo_root" config --get "branch.${SOURCE_BRANCH}.guardexBase" || true)" + if [[ -n "$source_branch_base" ]]; then + BASE_BRANCH="$source_branch_base" + else + configured_base="$(git -C "$repo_root" config --get multiagent.baseBranch || true)" + if [[ -n "$configured_base" ]]; then + BASE_BRANCH="$configured_base" + fi + fi +fi + +if [[ -z "$BASE_BRANCH" ]]; then + for fallback_branch in dev main master; do + if git -C "$repo_root" show-ref --verify --quiet "refs/heads/${fallback_branch}" \ + || git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${fallback_branch}"; then + BASE_BRANCH="$fallback_branch" + break + fi + done +fi + +if [[ -z "$BASE_BRANCH" ]]; then + BASE_BRANCH="dev" +fi + +if [[ "$SOURCE_BRANCH" == "$BASE_BRANCH" ]]; then + echo "[agent-branch-finish] Source branch and base branch are both '$BASE_BRANCH'." >&2 + echo "[agent-branch-finish] Switch to your agent branch or pass --branch ." >&2 + exit 1 +fi + +if ! git -C "$repo_root" show-ref --verify --quiet "refs/heads/${SOURCE_BRANCH}"; then + echo "[agent-branch-finish] Local source branch does not exist: ${SOURCE_BRANCH}" >&2 + exit 1 +fi + +get_worktree_for_branch() { + local branch="$1" + git -C "$repo_root" worktree list --porcelain | awk -v target="refs/heads/${branch}" ' + $1 == "worktree" { wt = $2 } + $1 == "branch" && $2 == target { print wt; exit } + ' +} + +is_clean_worktree() { + local wt="$1" + git -C "$wt" diff --quiet -- . ":(exclude).omx/state/agent-file-locks.json" \ + && git -C "$wt" diff --cached --quiet -- . 
":(exclude).omx/state/agent-file-locks.json" +} + +source_worktree="$(get_worktree_for_branch "$SOURCE_BRANCH")" +created_source_probe=0 +source_probe_path="" +integration_worktree="" + +cleanup() { + if [[ -n "$integration_worktree" && -d "$integration_worktree" ]]; then + git -C "$repo_root" worktree remove "$integration_worktree" --force >/dev/null 2>&1 || true + fi + if [[ "$created_source_probe" -eq 1 && -n "$source_probe_path" && -d "$source_probe_path" ]]; then + # Abort any in-progress git op so `worktree remove --force` succeeds on conflict-stuck probes. + git -C "$source_probe_path" rebase --abort >/dev/null 2>&1 || true + git -C "$source_probe_path" merge --abort >/dev/null 2>&1 || true + git -C "$repo_root" worktree remove "$source_probe_path" --force >/dev/null 2>&1 || true + fi +} +trap cleanup EXIT + +if [[ -z "$source_worktree" ]]; then + source_probe_path="${agent_worktree_root}/__source-probe-${SOURCE_BRANCH//\//__}-$(date +%Y%m%d-%H%M%S)" + mkdir -p "$(dirname "$source_probe_path")" + git -C "$repo_root" worktree add "$source_probe_path" "$SOURCE_BRANCH" >/dev/null + source_worktree="$source_probe_path" + created_source_probe=1 +fi + +if ! is_clean_worktree "$source_worktree"; then + echo "[agent-branch-finish] Source worktree is not clean for '${SOURCE_BRANCH}': ${source_worktree}" >&2 + echo "[agent-branch-finish] Commit/stash changes on the source branch before finishing." 
>&2 + exit 1 +fi + +start_ref="$BASE_BRANCH" +if git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + git -C "$repo_root" fetch origin "$BASE_BRANCH" --quiet + start_ref="origin/${BASE_BRANCH}" +fi + +require_before_finish_raw="$(git -C "$repo_root" config --get multiagent.sync.requireBeforeFinish || true)" +if [[ -z "$require_before_finish_raw" ]]; then + require_before_finish_raw="true" +fi +require_before_finish="$(printf '%s' "$require_before_finish_raw" | tr '[:upper:]' '[:lower:]')" +should_require_sync=0 +case "$require_before_finish" in + 1|true|yes|on) should_require_sync=1 ;; + 0|false|no|off) should_require_sync=0 ;; + *) should_require_sync=1 ;; +esac + +if [[ "$should_require_sync" -eq 1 ]] && git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + behind_count="$(git -C "$repo_root" rev-list --left-right --count "${SOURCE_BRANCH}...origin/${BASE_BRANCH}" 2>/dev/null | awk '{print $2}')" + behind_count="${behind_count:-0}" + if [[ "$behind_count" -gt 0 ]]; then + echo "[agent-sync-guard] Branch '${SOURCE_BRANCH}' is behind origin/${BASE_BRANCH} by ${behind_count} commit(s)." >&2 + echo "[agent-sync-guard] Auto-syncing '${SOURCE_BRANCH}' onto origin/${BASE_BRANCH} before finish..." >&2 + if ! git -C "$source_worktree" rebase "origin/${BASE_BRANCH}"; then + git_dir="$(git -C "$source_worktree" rev-parse --git-dir)" + rebase_active=0 + if [[ -e "${git_dir}/rebase-merge" || -e "${git_dir}/rebase-apply" ]]; then + rebase_active=1 + fi + + echo "[agent-sync-guard] Auto-sync failed while rebasing '${SOURCE_BRANCH}' onto origin/${BASE_BRANCH}." 
>&2 + if [[ "$rebase_active" -eq 1 ]]; then + echo "[agent-sync-guard] Resolve conflicts, then run: git -C \"$source_worktree\" rebase --continue" >&2 + echo "[agent-sync-guard] Or abort: git -C \"$source_worktree\" rebase --abort" >&2 + fi + exit 1 + fi + + behind_after="$(git -C "$repo_root" rev-list --left-right --count "${SOURCE_BRANCH}...origin/${BASE_BRANCH}" 2>/dev/null | awk '{print $2}')" + behind_after="${behind_after:-0}" + echo "[agent-sync-guard] Auto-sync complete (behind now: ${behind_after})." >&2 + fi +fi + +integration_stamp="$(date +%Y%m%d-%H%M%S)" +integration_worktree_base="${agent_worktree_root}/__integrate-${BASE_BRANCH//\//__}-${integration_stamp}" +integration_branch_base="__agent_integrate_${BASE_BRANCH//\//_}_$(date +%Y%m%d_%H%M%S)" +integration_worktree="$integration_worktree_base" +integration_branch="$integration_branch_base" +integration_suffix=1 +while [[ -e "$integration_worktree" ]] || git -C "$repo_root" show-ref --verify --quiet "refs/heads/${integration_branch}"; do + integration_worktree="${integration_worktree_base}-${integration_suffix}" + integration_branch="${integration_branch_base}_${integration_suffix}" + integration_suffix=$((integration_suffix + 1)) +done +mkdir -p "$(dirname "$integration_worktree")" + +git -C "$repo_root" worktree add "$integration_worktree" "$start_ref" >/dev/null +git -C "$integration_worktree" checkout -b "$integration_branch" >/dev/null + +if git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + git -C "$source_worktree" fetch origin "$BASE_BRANCH" --quiet + + if ! git -C "$source_worktree" merge --no-commit --no-ff "origin/${BASE_BRANCH}" >/dev/null 2>&1; then + conflict_files="$(git -C "$source_worktree" diff --name-only --diff-filter=U || true)" + git -C "$source_worktree" merge --abort >/dev/null 2>&1 || true + + echo "[agent-branch-finish] Preflight conflict detected between '${SOURCE_BRANCH}' and latest origin/${BASE_BRANCH}." 
>&2 + if [[ -n "$conflict_files" ]]; then + echo "[agent-branch-finish] Conflicting files:" >&2 + while IFS= read -r file; do + [[ -n "$file" ]] && echo " - ${file}" >&2 + done <<< "$conflict_files" + fi + echo "[agent-branch-finish] Rebase/merge '${BASE_BRANCH}' into '${SOURCE_BRANCH}' and resolve conflicts before finishing." >&2 + exit 1 + fi + + git -C "$source_worktree" merge --abort >/dev/null 2>&1 || true +fi + +if ! git -C "$integration_worktree" merge --no-ff --no-edit "$SOURCE_BRANCH"; then + echo "[agent-branch-finish] Merge conflict detected while merging '${SOURCE_BRANCH}' into '${BASE_BRANCH}'." >&2 + git -C "$integration_worktree" merge --abort >/dev/null 2>&1 || true + exit 1 +fi + +merge_completed=1 +merge_status="direct" +direct_push_error="" +pr_url="" + +is_local_branch_delete_error() { + local output="$1" + if [[ "$output" != *"failed to delete local branch"* ]]; then + return 1 + fi + if [[ "$output" == *"cannot delete branch"* ]] || [[ "$output" == *"used by worktree"* ]]; then + return 0 + fi + return 1 +} + +read_pr_state() { + local state_line + state_line="$("$GH_BIN" pr view "$SOURCE_BRANCH" --json state,mergedAt,url --jq '[.state, (.mergedAt // ""), (.url // "")] | join("\u001f")' 2>/dev/null || true)" + if [[ -z "$state_line" ]]; then + return 1 + fi + + local parsed_state="" + local parsed_merged_at="" + local parsed_url="" + IFS=$'\x1f' read -r parsed_state parsed_merged_at parsed_url <<< "$state_line" + PR_STATE="$parsed_state" + PR_MERGED_AT="$parsed_merged_at" + if [[ -n "$parsed_url" ]]; then + pr_url="$parsed_url" + fi + return 0 +} + +wait_for_pr_merge() { + local deadline + deadline=$(( $(date +%s) + WAIT_TIMEOUT_SECONDS )) + local wait_notice_printed=0 + local merge_output="" + + while true; do + if merge_output="$("$GH_BIN" pr merge "$SOURCE_BRANCH" --squash --delete-branch 2>&1)"; then + return 0 + fi + if is_local_branch_delete_error "$merge_output"; then + echo "[agent-branch-finish] PR merged but gh could not delete the 
local branch (active worktree); continuing local cleanup." >&2 + return 0 + fi + + PR_STATE="" + PR_MERGED_AT="" + if read_pr_state; then + if [[ "$PR_STATE" == "MERGED" || -n "$PR_MERGED_AT" ]]; then + return 0 + fi + if [[ "$PR_STATE" == "CLOSED" ]]; then + echo "[agent-branch-finish] PR closed without merge; cannot continue auto-finish." >&2 + if [[ -n "$pr_url" ]]; then + echo "[agent-branch-finish] PR: ${pr_url}" >&2 + fi + if [[ -n "$merge_output" ]]; then + echo "$merge_output" >&2 + fi + return 1 + fi + fi + + if [[ "$wait_notice_printed" -eq 0 ]]; then + echo "[agent-branch-finish] Waiting for required checks/reviews, then retrying merge automatically (timeout ${WAIT_TIMEOUT_SECONDS}s)." >&2 + if [[ -n "$pr_url" ]]; then + echo "[agent-branch-finish] PR: ${pr_url}" >&2 + fi + wait_notice_printed=1 + fi + + if (( $(date +%s) >= deadline )); then + echo "[agent-branch-finish] Timed out waiting for PR merge after ${WAIT_TIMEOUT_SECONDS}s." >&2 + if [[ -n "$merge_output" ]]; then + echo "$merge_output" >&2 + fi + return 2 + fi + + sleep "$WAIT_POLL_SECONDS" + done +} + +run_pr_flow() { + if ! command -v "$GH_BIN" >/dev/null 2>&1; then + echo "[agent-branch-finish] PR fallback requested but GitHub CLI not found: ${GH_BIN}" >&2 + return 1 + fi + + git -C "$source_worktree" push -u origin "$SOURCE_BRANCH" + + pr_title="$(git -C "$repo_root" log -1 --pretty=%s "$SOURCE_BRANCH" 2>/dev/null || true)" + if [[ -z "$pr_title" ]]; then + pr_title="Merge ${SOURCE_BRANCH} into ${BASE_BRANCH}" + fi + pr_body="Automated by scripts/agent-branch-finish.sh (PR flow)." 
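The `is_local_branch_delete_error` classifier above exists because `gh pr merge --delete-branch` can succeed on GitHub yet fail locally when the branch is checked out in another worktree; the script must treat that as a merged PR and continue cleanup. A standalone sketch of the same matcher (the sample message is illustrative, not a verbatim gh error string):

```shell
#!/usr/bin/env bash
# Standalone copy of the classifier used after `gh pr merge`.
# Returns 0 only for the "merged, but local branch delete failed
# because a worktree still uses it" case, which is safe to ignore.
is_local_branch_delete_error() {
  local output="$1"
  if [[ "$output" != *"failed to delete local branch"* ]]; then
    return 1
  fi
  if [[ "$output" == *"cannot delete branch"* ]] || [[ "$output" == *"used by worktree"* ]]; then
    return 0
  fi
  return 1
}

# Illustrative message shaped like the failure this guards against.
msg="failed to delete local branch: cannot delete branch 'agent/x' used by worktree"
if is_local_branch_delete_error "$msg"; then
  echo "treat merge as success, continue local cleanup"
fi
```

Any other failure text (network errors, permission problems) falls through to the retry/wait path instead of being swallowed.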
+ + "$GH_BIN" pr create \ + --base "$BASE_BRANCH" \ + --head "$SOURCE_BRANCH" \ + --title "$pr_title" \ + --body "$pr_body" >/dev/null 2>&1 || true + + pr_url="$("$GH_BIN" pr view "$SOURCE_BRANCH" --json url --jq '.url' 2>/dev/null || true)" + + merge_output="" + if merge_output="$("$GH_BIN" pr merge "$SOURCE_BRANCH" --squash --delete-branch 2>&1)"; then + return 0 + fi + if is_local_branch_delete_error "$merge_output"; then + echo "[agent-branch-finish] PR merged but gh could not delete the local branch (active worktree); continuing local cleanup." >&2 + return 0 + fi + + if [[ "$WAIT_FOR_MERGE" -eq 1 ]]; then + wait_for_pr_merge + return $? + fi + + auto_output="" + if auto_output="$("$GH_BIN" pr merge "$SOURCE_BRANCH" --squash --delete-branch --auto 2>&1)"; then + echo "[agent-branch-finish] PR auto-merge enabled; waiting for required checks/reviews." >&2 + return 2 + fi + + if [[ -n "$merge_output" ]]; then + echo "[agent-branch-finish] PR merge not completed yet; leaving PR open." >&2 + echo "${merge_output}" >&2 + fi + if [[ -n "$auto_output" ]]; then + echo "${auto_output}" >&2 + fi + return 2 +} + +if [[ "$PUSH_ENABLED" -eq 1 ]]; then + if [[ "$MERGE_MODE" != "pr" ]]; then + if ! direct_push_output="$(git -C "$integration_worktree" push origin "HEAD:${BASE_BRANCH}" 2>&1)"; then + direct_push_error="$direct_push_output" + merge_completed=0 + fi + else + merge_completed=0 + fi + + if [[ "$merge_completed" -eq 0 ]]; then + if [[ "$MERGE_MODE" == "direct" ]]; then + echo "[agent-branch-finish] Direct push/merge failed in --direct-only mode." >&2 + if [[ -n "$direct_push_error" ]]; then + echo "$direct_push_error" >&2 + fi + exit 1 + fi + + if run_pr_flow; then + merge_completed=1 + merge_status="pr" + else + pr_exit=$? + if [[ "$pr_exit" -eq 2 ]]; then + echo "[agent-branch-finish] PR flow created/updated branch '${SOURCE_BRANCH}' against '${BASE_BRANCH}'." 
>&2 + if [[ -n "$pr_url" ]]; then + echo "[agent-branch-finish] PR: ${pr_url}" >&2 + fi + if [[ "$WAIT_FOR_MERGE" -eq 1 ]]; then + echo "[agent-branch-finish] Merge did not complete within wait window; keeping branch open." >&2 + exit 1 + fi + echo "[agent-branch-finish] Merge pending review/check policy. Branch cleanup skipped for now." >&2 + exit 0 + fi + echo "[agent-branch-finish] PR flow failed." >&2 + if [[ -n "$direct_push_error" ]]; then + echo "[agent-branch-finish] Direct push failure details:" >&2 + echo "$direct_push_error" >&2 + fi + exit 1 + fi + fi +fi + +if [[ -x "${repo_root}/scripts/agent-file-locks.py" ]]; then + python3 "${repo_root}/scripts/agent-file-locks.py" release --branch "$SOURCE_BRANCH" >/dev/null 2>&1 || true +fi + +base_worktree="$(get_worktree_for_branch "$BASE_BRANCH")" +if [[ -n "$base_worktree" ]] && is_clean_worktree "$base_worktree" && [[ "$PUSH_ENABLED" -eq 1 ]]; then + git -C "$base_worktree" pull --ff-only origin "$BASE_BRANCH" >/dev/null 2>&1 || true +fi + +if [[ "$CLEANUP_AFTER_MERGE" -eq 1 ]]; then + if [[ "$source_worktree" == "$repo_root" ]]; then + if is_clean_worktree "$source_worktree"; then + switched_to_base=0 + if git -C "$source_worktree" checkout "$BASE_BRANCH" >/dev/null 2>&1; then + switched_to_base=1 + else + git -C "$source_worktree" checkout --detach >/dev/null 2>&1 || true + fi + if [[ "$switched_to_base" -eq 1 && "$PUSH_ENABLED" -eq 1 ]] && git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + git -C "$source_worktree" pull --ff-only origin "$BASE_BRANCH" >/dev/null 2>&1 || true + fi + fi + elif [[ "$source_worktree" == "$current_worktree" && "$source_worktree" == "${agent_worktree_root}"/* ]]; then + git -C "$source_worktree" checkout --detach >/dev/null 2>&1 || true + fi + + if [[ "$source_worktree" != "$current_worktree" && "$source_worktree" == "${agent_worktree_root}"/* ]]; then + git -C "$repo_root" worktree remove "$source_worktree" --force >/dev/null 2>&1 || 
true + fi + + git -C "$repo_root" branch -d "$SOURCE_BRANCH" + + if [[ "$PUSH_ENABLED" -eq 1 && "$DELETE_REMOTE_BRANCH" -eq 1 ]]; then + if git -C "$repo_root" ls-remote --exit-code --heads origin "$SOURCE_BRANCH" >/dev/null 2>&1; then + git -C "$repo_root" push origin --delete "$SOURCE_BRANCH" + fi + fi + + if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then + prune_args=(--base "$BASE_BRANCH" --only-dirty-worktrees --delete-branches) + if [[ "$DELETE_REMOTE_BRANCH" -eq 1 ]]; then + prune_args+=(--delete-remote-branches) + fi + if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" "${prune_args[@]}"; then + echo "[agent-branch-finish] Warning: automatic worktree prune failed." >&2 + echo "[agent-branch-finish] You can run manual cleanup: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches" >&2 + fi + fi + + echo "[agent-branch-finish] Merged '${SOURCE_BRANCH}' into '${BASE_BRANCH}' via ${merge_status} flow and cleaned source branch/worktree." + if [[ "$source_worktree" == "$current_worktree" && "$source_worktree" == "${agent_worktree_root}"/* ]]; then + echo "[agent-branch-finish] Current worktree '${source_worktree}' still exists because it is the active shell cwd." >&2 + echo "[agent-branch-finish] Leave this directory, then run: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches" >&2 + fi +else + if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then + if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" --base "$BASE_BRANCH"; then + echo "[agent-branch-finish] Warning: temporary worktree prune failed." >&2 + fi + fi + + echo "[agent-branch-finish] Merged '${SOURCE_BRANCH}' into '${BASE_BRANCH}' via ${merge_status} flow and kept source branch/worktree." 
+ echo "[agent-branch-finish] Cleanup later with: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches --delete-remote-branches" +fi diff --git a/frontend/scripts/agent-branch-merge.sh b/frontend/scripts/agent-branch-merge.sh new file mode 100755 index 0000000..c8ab622 --- /dev/null +++ b/frontend/scripts/agent-branch-merge.sh @@ -0,0 +1,421 @@ +#!/usr/bin/env bash +set -euo pipefail + +BASE_BRANCH="" +BASE_BRANCH_EXPLICIT=0 +TARGET_BRANCH="" +TASK_NAME="" +AGENT_NAME="${GUARDEX_MERGE_AGENT_NAME:-codex}" +declare -a SOURCE_BRANCHES=() + +usage() { + cat <<'EOF' +Usage: scripts/agent-branch-merge.sh --branch <agent-branch> [--branch <agent-branch> ...] [--into <target-branch>] [--task <name>] [--agent <name>] [--base <branch>] + +Examples: + bash scripts/agent-branch-merge.sh --branch agent/codex/ui-a --branch agent/codex/ui-b + bash scripts/agent-branch-merge.sh --into agent/codex/owner-lane --branch agent/codex/helper-a --branch agent/codex/helper-b +EOF +} + +sanitize_slug() { + local raw="$1" + local fallback="${2:-merge-agent-branches}" + local slug + slug="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + if [[ -z "$slug" ]]; then + slug="$fallback" + fi + printf '%s' "$slug" +} + +resolve_base_branch() { + local repo="$1" + local explicit_target="$2" + local configured="" + local branch_base="" + + if [[ -n "$explicit_target" ]]; then + branch_base="$(git -C "$repo" config --get "branch.${explicit_target}.guardexBase" || true)" + if [[ -n "$branch_base" ]]; then + printf '%s' "$branch_base" + return 0 + fi + fi + + configured="$(git -C "$repo" config --get multiagent.baseBranch || true)" + if [[ -n "$configured" ]]; then + printf '%s' "$configured" + return 0 + fi + + for fallback in dev main master; do + if git -C "$repo" show-ref --verify --quiet "refs/heads/${fallback}" \ + || git -C "$repo" show-ref --verify --quiet "refs/remotes/origin/${fallback}"; then + printf '%s' "$fallback" + return 0 + fi + done + + printf '%s' "dev" +} +
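The `sanitize_slug` helper defined above drives every generated branch, worktree, and openspec name, so its normalization rules are worth pinning down. A standalone copy showing the intended transform (the input strings are made-up examples):

```shell
#!/usr/bin/env bash
# Standalone copy of the slug normalizer used by these scripts:
# lowercase, collapse runs of non-alphanumerics to '-', trim edge
# dashes, collapse doubled dashes, and fall back when nothing is left.
sanitize_slug() {
  local raw="$1"
  local fallback="${2:-task}"
  local slug
  slug="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' \
    | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')"
  if [[ -z "$slug" ]]; then
    slug="$fallback"
  fi
  printf '%s' "$slug"
}

echo "$(sanitize_slug 'Fix: UI Bug #42')"            # fix-ui-bug-42
echo "$(sanitize_slug 'agent/codex/UI-A')"           # agent-codex-ui-a
echo "$(sanitize_slug '///' 'merge-agent-branches')" # merge-agent-branches
```

The fallback matters for the last case: punctuation-only input sanitizes to an empty string, and an empty slug would otherwise produce branch names like `agent/codex/-2026-04-21`.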
+get_worktree_for_branch() { + local repo="$1" + local branch="$2" + git -C "$repo" worktree list --porcelain | awk -v target="refs/heads/${branch}" ' + $1 == "worktree" { wt = $2 } + $1 == "branch" && $2 == target { print wt; exit } + ' +} + +is_clean_worktree() { + local wt="$1" + git -C "$wt" diff --quiet -- . ":(exclude).omx/state/agent-file-locks.json" \ + && git -C "$wt" diff --cached --quiet -- . ":(exclude).omx/state/agent-file-locks.json" \ + && [[ -z "$(git -C "$wt" ls-files --others --exclude-standard)" ]] +} + +has_in_progress_git_op() { + local wt="$1" + local git_dir="" + git_dir="$(git -C "$wt" rev-parse --git-dir 2>/dev/null || true)" + if [[ -z "$git_dir" ]]; then + return 1 + fi + if [[ "$git_dir" != /* ]]; then + git_dir="$(cd "$wt/$git_dir" 2>/dev/null && pwd -P || true)" + fi + if [[ -z "$git_dir" ]]; then + return 1 + fi + [[ -f "${git_dir}/MERGE_HEAD" || -d "${git_dir}/rebase-merge" || -d "${git_dir}/rebase-apply" ]] +} + +select_unique_worktree_path() { + local root="$1" + local name="$2" + local candidate="${root}/${name}" + local suffix=2 + while [[ -e "$candidate" ]]; do + candidate="${root}/${name}-${suffix}" + suffix=$((suffix + 1)) + done + printf '%s' "$candidate" +} + +branch_exists() { + local repo="$1" + local branch="$2" + git -C "$repo" show-ref --verify --quiet "refs/heads/${branch}" +} + +branch_is_agent_lane() { + local branch="$1" + [[ "$branch" == agent/* ]] +} + +array_contains() { + local needle="$1" + shift || true + local item + for item in "$@"; do + if [[ "$item" == "$needle" ]]; then + return 0 + fi + done + return 1 +} + +collect_branch_files() { + local repo="$1" + local base_ref="$2" + local branch="$3" + git -C "$repo" diff --name-only "${base_ref}...${branch}" -- . 
":(exclude).omx/state/agent-file-locks.json" 2>/dev/null || true +} + +while [[ $# -gt 0 ]]; do + case "$1" in + --base) + BASE_BRANCH="${2:-}" + BASE_BRANCH_EXPLICIT=1 + shift 2 + ;; + --into) + TARGET_BRANCH="${2:-}" + shift 2 + ;; + --branch) + SOURCE_BRANCHES+=("${2:-}") + shift 2 + ;; + --task) + TASK_NAME="${2:-}" + shift 2 + ;; + --agent) + AGENT_NAME="${2:-codex}" + shift 2 + ;; + -h|--help) + usage + exit 0 + ;; + *) + echo "[agent-branch-merge] Unknown argument: $1" >&2 + usage >&2 + exit 1 + ;; + esac +done + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[agent-branch-merge] Not inside a git repository." >&2 + exit 1 +fi + +repo_root="$(git rev-parse --show-toplevel)" +common_git_dir_raw="$(git -C "$repo_root" rev-parse --git-common-dir)" +if [[ "$common_git_dir_raw" == /* ]]; then + common_git_dir="$common_git_dir_raw" +else + common_git_dir="$(cd "$repo_root/$common_git_dir_raw" && pwd -P)" +fi +repo_common_root="$(cd "$common_git_dir/.." && pwd -P)" +agent_worktree_root="${repo_common_root}/.omx/agent-worktrees" +mkdir -p "$agent_worktree_root" + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -z "$BASE_BRANCH" ]]; then + echo "[agent-branch-merge] --base requires a branch value." >&2 + exit 1 +fi + +if [[ -z "$TARGET_BRANCH" && "${#SOURCE_BRANCHES[@]}" -lt 1 ]]; then + echo "[agent-branch-merge] Provide at least one --branch source lane." >&2 + exit 1 +fi + +if [[ -n "$TARGET_BRANCH" ]] && ! branch_is_agent_lane "$TARGET_BRANCH"; then + echo "[agent-branch-merge] --into must reference an agent/* branch: ${TARGET_BRANCH}" >&2 + exit 1 +fi + +deduped_sources=() +for branch in "${SOURCE_BRANCHES[@]}"; do + if [[ -z "$branch" ]]; then + echo "[agent-branch-merge] --branch requires an agent/* branch value." >&2 + exit 1 + fi + if ! branch_is_agent_lane "$branch"; then + echo "[agent-branch-merge] Source branch must be agent/*: ${branch}" >&2 + exit 1 + fi + if ! 
branch_exists "$repo_root" "$branch"; then + echo "[agent-branch-merge] Local source branch not found: ${branch}" >&2 + exit 1 + fi + if ! array_contains "$branch" "${deduped_sources[@]}"; then + deduped_sources+=("$branch") + fi +done +SOURCE_BRANCHES=("${deduped_sources[@]}") + +if [[ "${#SOURCE_BRANCHES[@]}" -eq 0 ]]; then + echo "[agent-branch-merge] No unique source branches were provided." >&2 + exit 1 +fi + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 0 ]]; then + BASE_BRANCH="$(resolve_base_branch "$repo_root" "$TARGET_BRANCH")" +fi + +if [[ -z "$BASE_BRANCH" ]]; then + echo "[agent-branch-merge] Unable to resolve a base branch." >&2 + exit 1 +fi + +start_ref="$BASE_BRANCH" +if git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + git -C "$repo_root" fetch origin "$BASE_BRANCH" --quiet + start_ref="origin/${BASE_BRANCH}" +elif ! git -C "$repo_root" show-ref --verify --quiet "refs/heads/${BASE_BRANCH}"; then + echo "[agent-branch-merge] Base branch not found locally or on origin: ${BASE_BRANCH}" >&2 + exit 1 +fi + +target_worktree="" +target_created=0 + +if [[ -z "$TARGET_BRANCH" ]]; then + if [[ -z "$TASK_NAME" ]]; then + first_hint="$(printf '%s' "${SOURCE_BRANCHES[0]}" | sed -E 's#^agent/[^/]+/##; s#^agent/##')" + source_count="${#SOURCE_BRANCHES[@]}" + if [[ "$source_count" -gt 1 ]]; then + TASK_NAME="$(sanitize_slug "merge-${first_hint}-and-$((source_count - 1))-more" "merge-agent-branches")" + else + TASK_NAME="$(sanitize_slug "merge-${first_hint}" "merge-agent-branches")" + fi + else + TASK_NAME="$(sanitize_slug "$TASK_NAME" "merge-agent-branches")" + fi + + start_output="" + if ! 
start_output="$( + cd "$repo_root" + env GUARDEX_OPENSPEC_AUTO_INIT=1 bash "scripts/agent-branch-start.sh" "$TASK_NAME" "$AGENT_NAME" "$BASE_BRANCH" 2>&1 + )"; then + printf '%s\n' "$start_output" >&2 + exit 1 + fi + + printf '%s\n' "$start_output" + TARGET_BRANCH="$(printf '%s\n' "$start_output" | sed -n 's/^\[agent-branch-start\] Created branch: //p' | head -n 1)" + target_worktree="$(printf '%s\n' "$start_output" | sed -n 's/^\[agent-branch-start\] Worktree: //p' | head -n 1)" + if [[ -z "$TARGET_BRANCH" || -z "$target_worktree" ]]; then + echo "[agent-branch-merge] Unable to parse target branch/worktree from agent-branch-start output." >&2 + exit 1 + fi + target_created=1 +else + if ! branch_exists "$repo_root" "$TARGET_BRANCH"; then + echo "[agent-branch-merge] Target branch not found: ${TARGET_BRANCH}" >&2 + exit 1 + fi + + target_worktree="$(get_worktree_for_branch "$repo_root" "$TARGET_BRANCH")" + if [[ -z "$target_worktree" ]]; then + target_worktree="$(select_unique_worktree_path "$agent_worktree_root" "${TARGET_BRANCH//\//__}")" + git -C "$repo_root" worktree add "$target_worktree" "$TARGET_BRANCH" >/dev/null + target_created=1 + echo "[agent-branch-merge] Attached worktree for target branch '${TARGET_BRANCH}': ${target_worktree}" + fi +fi + +if [[ "$TARGET_BRANCH" == "$BASE_BRANCH" ]]; then + echo "[agent-branch-merge] Target branch must not equal the protected base branch '${BASE_BRANCH}'." >&2 + exit 1 +fi + +if ! is_clean_worktree "$target_worktree"; then + if [[ "$target_created" -eq 1 ]]; then + echo "[agent-branch-merge] Target worktree has freshly generated scaffold changes; continuing inside the new integration lane." + else + echo "[agent-branch-merge] Target worktree is not clean: ${target_worktree}" >&2 + echo "[agent-branch-merge] Commit, stash, or discard local changes before merging agent lanes." 
>&2 + exit 1 + fi +fi + +if has_in_progress_git_op "$target_worktree"; then + echo "[agent-branch-merge] Target worktree has an in-progress merge/rebase: ${target_worktree}" >&2 + echo "[agent-branch-merge] Resolve or abort that git operation before running the merge workflow." >&2 + exit 1 +fi + +for source_branch in "${SOURCE_BRANCHES[@]}"; do + if [[ "$source_branch" == "$TARGET_BRANCH" ]]; then + echo "[agent-branch-merge] Source branch list includes the target branch: ${source_branch}" >&2 + exit 1 + fi + source_worktree="$(get_worktree_for_branch "$repo_root" "$source_branch")" + if [[ -n "$source_worktree" ]] && ! is_clean_worktree "$source_worktree"; then + echo "[agent-branch-merge] Source worktree is not clean for '${source_branch}': ${source_worktree}" >&2 + echo "[agent-branch-merge] Commit or stash source-lane changes before integration." >&2 + exit 1 + fi +done + +pending_branches=() +for source_branch in "${SOURCE_BRANCHES[@]}"; do + if git -C "$repo_root" merge-base --is-ancestor "$source_branch" "$TARGET_BRANCH" >/dev/null 2>&1; then + echo "[agent-branch-merge] Skipping '${source_branch}' because it is already integrated into '${TARGET_BRANCH}'." + continue + fi + pending_branches+=("$source_branch") +done + +if [[ "${#pending_branches[@]}" -eq 0 ]]; then + echo "[agent-branch-merge] No pending source branches remain for target '${TARGET_BRANCH}'." + echo "[agent-branch-merge] Target worktree: ${target_worktree}" + exit 0 +fi + +declare -A file_to_branches=() +declare -a overlap_files=() +for source_branch in "${pending_branches[@]}"; do + while IFS= read -r changed_file; do + [[ -z "$changed_file" ]] && continue + existing="${file_to_branches[$changed_file]:-}" + if [[ -z "$existing" ]]; then + file_to_branches["$changed_file"]="$source_branch" + continue + fi + if [[ ",${existing}," == *",${source_branch},"* ]]; then + continue + fi + file_to_branches["$changed_file"]="${existing},${source_branch}" + if ! 
array_contains "$changed_file" "${overlap_files[@]}"; then + overlap_files+=("$changed_file") + fi + done < <(collect_branch_files "$repo_root" "$start_ref" "$source_branch") +done + +echo "[agent-branch-merge] Target branch: ${TARGET_BRANCH}" +echo "[agent-branch-merge] Target worktree: ${target_worktree}" +echo "[agent-branch-merge] Base branch: ${BASE_BRANCH} (${start_ref})" +echo "[agent-branch-merge] Merge order: ${pending_branches[*]}" + +if [[ "${#overlap_files[@]}" -gt 0 ]]; then + echo "[agent-branch-merge] Overlapping changed files detected across requested branches:" + for overlap_file in "${overlap_files[@]}"; do + branches_csv="${file_to_branches[$overlap_file]}" + branches_display="$(printf '%s' "$branches_csv" | sed 's/,/, /g')" + echo " - ${overlap_file} <- ${branches_display}" + done +else + echo "[agent-branch-merge] No overlapping changed files detected across requested branches." +fi + +for index in "${!pending_branches[@]}"; do + source_branch="${pending_branches[$index]}" + echo "[agent-branch-merge] Merging '${source_branch}' into '${TARGET_BRANCH}'..." + if git -C "$target_worktree" merge --no-ff --no-edit "$source_branch"; then + echo "[agent-branch-merge] Merged '${source_branch}'." + continue + fi + + conflict_files="$(git -C "$target_worktree" diff --name-only --diff-filter=U || true)" + echo "[agent-branch-merge] Merge conflict detected while merging '${source_branch}' into '${TARGET_BRANCH}'." 
>&2 + echo "[agent-branch-merge] Target worktree: ${target_worktree}" >&2 + if [[ -n "$conflict_files" ]]; then + echo "[agent-branch-merge] Conflicting files:" >&2 + while IFS= read -r conflict_file; do + [[ -n "$conflict_file" ]] && echo " - ${conflict_file}" >&2 + done <<< "$conflict_files" + fi + echo "[agent-branch-merge] Resolve or abort inside the integration worktree:" >&2 + echo " cd \"${target_worktree}\"" >&2 + echo " git status" >&2 + echo " git add <files> && git commit" >&2 + echo " # or: git merge --abort" >&2 + + remaining_branches=("${pending_branches[@]:$((index + 1))}") + if [[ "${#remaining_branches[@]}" -gt 0 ]]; then + echo "[agent-branch-merge] Remaining branches:" >&2 + for remaining in "${remaining_branches[@]}"; do + echo " - ${remaining}" >&2 + done + resume_cmd="gx merge --into ${TARGET_BRANCH} --base ${BASE_BRANCH}" + for remaining in "${remaining_branches[@]}"; do + resume_cmd="${resume_cmd} --branch ${remaining}" + done + echo "[agent-branch-merge] Resume after resolving with: ${resume_cmd}" >&2 + fi + exit 1 +done + +echo "[agent-branch-merge] Merge sequence complete for '${TARGET_BRANCH}'." +if [[ "$target_created" -eq 1 ]]; then + echo "[agent-branch-merge] Review and verify in '${target_worktree}', then finish the integration branch when ready."
+fi +echo "[agent-branch-merge] Next step: bash scripts/agent-branch-finish.sh --branch \"${TARGET_BRANCH}\" --base \"${BASE_BRANCH}\" --via-pr --wait-for-merge --cleanup" diff --git a/frontend/scripts/agent-branch-start.sh b/frontend/scripts/agent-branch-start.sh new file mode 100755 index 0000000..ef8cc11 --- /dev/null +++ b/frontend/scripts/agent-branch-start.sh @@ -0,0 +1,614 @@ +#!/usr/bin/env bash +set -euo pipefail + +TASK_NAME="task" +AGENT_NAME="agent" +BASE_BRANCH="" +BASE_BRANCH_EXPLICIT=0 +WORKTREE_ROOT_REL="" +WORKTREE_ROOT_EXPLICIT=0 +OPENSPEC_AUTO_INIT_RAW="${GUARDEX_OPENSPEC_AUTO_INIT:-false}" +OPENSPEC_PLAN_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_PLAN_SLUG:-}" +OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" +OPENSPEC_CAPABILITY_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CAPABILITY_SLUG:-}" +OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" +PRINT_NAME_ONLY=0 +POSITIONAL_ARGS=() + +while [[ $# -gt 0 ]]; do + case "$1" in + --task) + TASK_NAME="${2:-task}" + shift 2 + ;; + --agent) + AGENT_NAME="${2:-agent}" + shift 2 + ;; + --base) + BASE_BRANCH="${2:-}" + BASE_BRANCH_EXPLICIT=1 + shift 2 + ;; + --print-name-only) + PRINT_NAME_ONLY=1 + shift + ;; + --tier) + # Accepted for CLAUDE.md compatibility; scaffold size is not yet wired + # through this script. Consume the value so callers can pass it. + shift 2 + ;; + --in-place|--allow-in-place) + echo "[agent-branch-start] In-place branch mode is disabled." >&2 + echo "[agent-branch-start] This command always creates an isolated worktree to keep your active checkout unchanged." 
>&2 + exit 1 + ;; + --worktree-root) + WORKTREE_ROOT_REL="${2:-.omx/agent-worktrees}" + WORKTREE_ROOT_EXPLICIT=1 + shift 2 + ;; + --) + shift + while [[ $# -gt 0 ]]; do + POSITIONAL_ARGS+=("$1") + shift + done + break + ;; + -*) + echo "[agent-branch-start] Unknown option: $1" >&2 + echo "Usage: $0 [task] [agent] [base] [--worktree-root <relative-path>] [--print-name-only]" >&2 + exit 1 + ;; + *) + POSITIONAL_ARGS+=("$1") + shift + ;; + esac +done + +if [[ "${#POSITIONAL_ARGS[@]}" -gt 3 ]]; then + echo "[agent-branch-start] Too many positional arguments." >&2 + echo "Usage: $0 [task] [agent] [base] [--worktree-root <relative-path>]" >&2 + exit 1 +fi + +if [[ "${#POSITIONAL_ARGS[@]}" -ge 1 ]]; then + TASK_NAME="${POSITIONAL_ARGS[0]}" +fi + +if [[ "${#POSITIONAL_ARGS[@]}" -ge 2 ]]; then + AGENT_NAME="${POSITIONAL_ARGS[1]}" +fi + +if [[ "${#POSITIONAL_ARGS[@]}" -ge 3 ]]; then + BASE_BRANCH="${POSITIONAL_ARGS[2]}" + BASE_BRANCH_EXPLICIT=1 +fi + +sanitize_slug() { + local raw="$1" + local fallback="${2:-task}" + local slug + slug="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + if [[ -z "$slug" ]]; then + slug="$fallback" + fi + printf '%s' "$slug" +} + +normalize_positive_int() { + local raw="$1" + local fallback="$2" + if [[ "$raw" =~ ^[0-9]+$ ]] && [[ "$raw" -gt 0 ]]; then + printf '%s' "$raw" + return 0 + fi + printf '%s' "$fallback" +} + +shorten_slug() { + local slug="$1" + local raw_max="$2" + local max_len + max_len="$(normalize_positive_int "$raw_max" "32")" + if [[ "${#slug}" -le "$max_len" ]]; then + printf '%s' "$slug" + return 0 + fi + local shortened="${slug:0:max_len}" + shortened="$(printf '%s' "$shortened" | sed -E 's/-+$//')" + if [[ -z "$shortened" ]]; then + shortened="${slug:0:max_len}" + fi + printf '%s' "$shortened" +} + +env_flag_truthy() { + local raw="${1:-}" + local lowered + lowered="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" + case "$lowered" in + 1|true|yes|on) return 0 ;; + *) return 1 ;; + esac
+} + +default_worktree_root_rel() { + local raw_agent="$1" + local override="${GUARDEX_AGENT_TYPE:-}" + local lowered_agent lowered_override + lowered_agent="$(printf '%s' "$raw_agent" | tr '[:upper:]' '[:lower:]')" + lowered_override="$(printf '%s' "$override" | tr '[:upper:]' '[:lower:]')" + + if [[ -n "${CLAUDE_CODE_SESSION_ID:-}" ]] || env_flag_truthy "${CLAUDECODE:-}"; then + printf '.omc/agent-worktrees' + return 0 + fi + + if [[ "$lowered_agent" == *claude* ]] || [[ "$lowered_override" == *claude* ]]; then + printf '.omc/agent-worktrees' + return 0 + fi + + printf '.omx/agent-worktrees' +} + +# Collapse arbitrary agent identifiers to a clean role token. Priority: +# GUARDEX_AGENT_TYPE env override, then recognizable claude/codex aliases, then +# a small legacy compatibility set, then the literal requested role after slug +# sanitization. This preserves explicit roles such as planner/executor while +# keeping the older bot -> codex fallback stable for existing callers. +normalize_role() { + local raw_agent="$1" + local override="${GUARDEX_AGENT_TYPE:-}" + if [[ -n "$override" ]]; then + sanitize_slug "$override" "agent" + return 0 + fi + local lowered + lowered="$(printf '%s' "$raw_agent" | tr '[:upper:]' '[:lower:]')" + if [[ "$lowered" == *claude* ]]; then + printf 'claude' + return 0 + fi + if [[ "$lowered" == *codex* ]]; then + printf 'codex' + return 0 + fi + if [[ -n "${CLAUDECODE:-}" && "${CLAUDECODE}" != "0" && "${CLAUDECODE}" != "false" ]]; then + printf 'claude' + return 0 + fi + local sanitized + sanitized="$(sanitize_slug "$raw_agent" "codex")" + if [[ "$sanitized" == "bot" ]]; then + printf 'codex' + return 0 + fi + printf '%s' "$sanitized" +} + +# Timestamp the branch/worktree/openspec slug so parallel agents never collide +# and names sort chronologically. Format: YYYY-MM-DD-HH-MM (local time). +# Colons are illegal in git refs, so the HH:MM the user sees is stored as +# HH-MM in the slug. 
Can be overridden for tests via GUARDEX_BRANCH_TIMESTAMP. +compose_branch_timestamp() { + if [[ -n "${GUARDEX_BRANCH_TIMESTAMP:-}" ]]; then + printf '%s' "$GUARDEX_BRANCH_TIMESTAMP" + return 0 + fi + date +%Y-%m-%d-%H-%M +} + +compose_branch_descriptor() { + local task_slug="$1" + local stamp="$2" + local task_max task_part + task_max="$(normalize_positive_int "${GUARDEX_BRANCH_TASK_SLUG_MAX:-40}" "40")" + task_part="$(shorten_slug "$task_slug" "$task_max")" + printf '%s-%s' "$task_part" "$stamp" +} + +normalize_bool() { + local raw="${1:-}" + local fallback="${2:-0}" + local lowered + lowered="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" + case "$lowered" in + 1|true|yes|on) printf '1' ;; + 0|false|no|off) printf '0' ;; + '') printf '%s' "$fallback" ;; + *) printf '%s' "$fallback" ;; + esac +} + +OPENSPEC_AUTO_INIT="$(normalize_bool "$OPENSPEC_AUTO_INIT_RAW" "1")" + +resolve_openspec_masterplan_label() { + local raw="${OPENSPEC_MASTERPLAN_LABEL_RAW:-}" + local label + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]] || [[ -z "$raw" ]]; then + printf '' + return 0 + fi + + label="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + printf '%s' "$label" +} + +resolve_openspec_plan_slug() { + local branch_name="$1" + local agent_slug="$2" + local task_slug="$3" + local masterplan_label="" + local branch_leaf="" + if [[ -n "$OPENSPEC_PLAN_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_PLAN_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + masterplan_label="$(resolve_openspec_masterplan_label)" + if [[ -n "$masterplan_label" ]] && [[ "$branch_name" == "agent/${agent_slug}/"* ]]; then + branch_leaf="${branch_name#agent/${agent_slug}/}" + sanitize_slug "agent-${agent_slug}-${masterplan_label}-${branch_leaf}" "$task_slug" + return 0 + fi + sanitize_slug "${branch_name//\//-}" "$task_slug" +} + +resolve_openspec_change_slug() { + local branch_name="$1" + local task_slug="$2" + if [[ -n 
"$OPENSPEC_CHANGE_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_CHANGE_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + sanitize_slug "${branch_name//\//-}" "$task_slug" +} + +resolve_openspec_capability_slug() { + local task_slug="$1" + if [[ -n "$OPENSPEC_CAPABILITY_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_CAPABILITY_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + sanitize_slug "$task_slug" "general-behavior" +} + +resolve_worktree_leaf() { + local branch_name="$1" + local agent_slug="$2" + local masterplan_label="" + local branch_leaf="" + + masterplan_label="$(resolve_openspec_masterplan_label)" + if [[ -n "$masterplan_label" ]] && [[ "$branch_name" == "agent/${agent_slug}/"* ]]; then + branch_leaf="${branch_name#agent/${agent_slug}/}" + printf 'agent__%s__%s__%s' "$agent_slug" "$masterplan_label" "$branch_leaf" + return 0 + fi + + printf '%s' "${branch_name//\//__}" +} + +has_local_changes() { + local root="$1" + if ! git -C "$root" diff --quiet; then + return 0 + fi + if ! 
git -C "$root" diff --cached --quiet; then + return 0 + fi + if [[ -n "$(git -C "$root" ls-files --others --exclude-standard)" ]]; then + return 0 + fi + return 1 +} + +resolve_protected_branches() { + local root="$1" + local raw + raw="${GUARDEX_PROTECTED_BRANCHES:-$(git -C "$root" config --get multiagent.protectedBranches || true)}" + if [[ -z "$raw" ]]; then + raw="dev main master" + fi + raw="${raw//,/ }" + printf '%s' "$raw" +} + +is_protected_branch_name() { + local branch="$1" + local protected_raw="$2" + for protected_branch in $protected_raw; do + if [[ "$branch" == "$protected_branch" ]]; then + return 0 + fi + done + return 1 +} + +hydrate_local_helper_in_worktree() { + local repo="$1" + local worktree="$2" + local relative_path="$3" + local worktree_target="${worktree}/${relative_path}" + local source_path="" + + if [[ -e "$worktree_target" ]]; then + return 0 + fi + + if [[ -f "${repo}/${relative_path}" ]]; then + source_path="${repo}/${relative_path}" + elif [[ -f "${repo}/templates/${relative_path}" ]]; then + source_path="${repo}/templates/${relative_path}" + fi + + if [[ -z "$source_path" ]]; then + return 0 + fi + + mkdir -p "$(dirname "$worktree_target")" + cp "$source_path" "$worktree_target" + if [[ -x "$source_path" ]]; then + chmod +x "$worktree_target" + fi + + echo "[agent-branch-start] Hydrated local helper in worktree: ${relative_path}" +} + +hydrate_dependency_dir_symlink_in_worktree() { + local repo="$1" + local worktree="$2" + local relative_path="$3" + local source_path="${repo}/${relative_path}" + local target_path="${worktree}/${relative_path}" + + if [[ ! 
-d "$source_path" ]]; then + return 0 + fi + + if [[ -e "$target_path" ]]; then + return 0 + fi + + mkdir -p "$(dirname "$target_path")" + ln -s "$source_path" "$target_path" + echo "[agent-branch-start] Linked dependency dir in worktree: ${relative_path}" +} + +initialize_openspec_plan_workspace() { + local repo="$1" + local worktree="$2" + local plan_slug="$3" + + hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-plan-workspace.sh" + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then + return 0 + fi + + local openspec_script="${worktree}/scripts/openspec/init-plan-workspace.sh" + if [[ ! -f "$openspec_script" ]]; then + echo "[agent-branch-start] OpenSpec init script is missing in sandbox worktree." >&2 + echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 + return 1 + fi + if [[ ! -x "$openspec_script" ]]; then + chmod +x "$openspec_script" 2>/dev/null || true + fi + + local init_output="" + if ! init_output="$( + cd "$worktree" + bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1 + )"; then + printf '%s\n' "$init_output" >&2 + echo "[agent-branch-start] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2 + return 1 + fi + + if [[ -n "$init_output" ]]; then + printf '%s\n' "$init_output" + fi + echo "[agent-branch-start] OpenSpec plan workspace: ${worktree}/openspec/plan/${plan_slug}" +} + +initialize_openspec_change_workspace() { + local repo="$1" + local worktree="$2" + local change_slug="$3" + local capability_slug="$4" + + hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-change-workspace.sh" + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then + return 0 + fi + + local openspec_script="${worktree}/scripts/openspec/init-change-workspace.sh" + if [[ ! -f "$openspec_script" ]]; then + echo "[agent-branch-start] OpenSpec change init script is missing in sandbox worktree." 
>&2 + echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 + return 1 + fi + if [[ ! -x "$openspec_script" ]]; then + chmod +x "$openspec_script" 2>/dev/null || true + fi + + local init_output="" + if ! init_output="$( + cd "$worktree" + bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1 + )"; then + printf '%s\n' "$init_output" >&2 + echo "[agent-branch-start] OpenSpec workspace initialization failed for change '${change_slug}'." >&2 + return 1 + fi + + if [[ -n "$init_output" ]]; then + printf '%s\n' "$init_output" + fi + echo "[agent-branch-start] OpenSpec change workspace: ${worktree}/openspec/changes/${change_slug}" +} + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[agent-branch-start] Not inside a git repository." >&2 + exit 1 +fi + +repo_root="$(git rev-parse --show-toplevel)" + +guardex_env_helper="${repo_root}/scripts/guardex-env.sh" +if [[ -f "$guardex_env_helper" ]]; then + # shellcheck source=/dev/null + source "$guardex_env_helper" +fi +if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! guardex_repo_is_enabled "$repo_root"; then + toggle_source="$(guardex_repo_toggle_source "$repo_root" || true)" + toggle_raw="$(guardex_repo_toggle_raw "$repo_root" || true)" + if [[ -n "$toggle_source" && -n "$toggle_raw" ]]; then + echo "[agent-branch-start] Guardex is disabled for this repo (${toggle_source}: GUARDEX_ON=${toggle_raw})." >&2 + else + echo "[agent-branch-start] Guardex is disabled for this repo." >&2 + fi + echo "[agent-branch-start] Skip Guardex worktree/OpenSpec flow or re-enable with GUARDEX_ON=1." >&2 + exit 1 +fi + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -z "$BASE_BRANCH" ]]; then + echo "[agent-branch-start] --base requires a non-empty branch name." 
>&2 + exit 1 +fi + +task_slug="$(sanitize_slug "$TASK_NAME" "task")" +agent_slug="$(normalize_role "$AGENT_NAME")" +if [[ "$WORKTREE_ROOT_EXPLICIT" -eq 0 ]]; then + WORKTREE_ROOT_REL="$(default_worktree_root_rel "$AGENT_NAME")" +fi +branch_timestamp="$(compose_branch_timestamp)" +branch_descriptor="$(compose_branch_descriptor "$task_slug" "$branch_timestamp")" +branch_name_base="agent/${agent_slug}/${branch_descriptor}" + +branch_name="$branch_name_base" +if [[ "$PRINT_NAME_ONLY" -eq 1 ]]; then + printf '%s\n' "$branch_name" + exit 0 +fi + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 0 ]]; then + current_branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" + protected_branches_raw="$(resolve_protected_branches "$repo_root")" + if [[ -n "$current_branch" && "$current_branch" != "HEAD" ]] && is_protected_branch_name "$current_branch" "$protected_branches_raw"; then + BASE_BRANCH="$current_branch" + else + configured_base="$(git -C "$repo_root" config --get multiagent.baseBranch || true)" + if [[ -n "$configured_base" ]]; then + BASE_BRANCH="$configured_base" + elif [[ -n "$current_branch" && "$current_branch" != "HEAD" ]]; then + BASE_BRANCH="$current_branch" + else + BASE_BRANCH="dev" + fi + fi +fi + +if git show-ref --verify --quiet "refs/remotes/origin/${BASE_BRANCH}"; then + git fetch origin "${BASE_BRANCH}" --quiet + start_ref="origin/${BASE_BRANCH}" +else + if ! 
git show-ref --verify --quiet "refs/heads/${BASE_BRANCH}"; then + echo "[agent-branch-start] Base branch not found locally or on origin: ${BASE_BRANCH}" >&2 + exit 1 + fi + start_ref="${BASE_BRANCH}" +fi + +timestamp="$(date +%Y%m%d-%H%M%S)" +branch_suffix=2 +while git show-ref --verify --quiet "refs/heads/${branch_name}"; do + branch_name="${branch_name_base}-${branch_suffix}" + branch_suffix=$((branch_suffix + 1)) +done + +worktree_root="${repo_root}/${WORKTREE_ROOT_REL}" +mkdir -p "$worktree_root" +worktree_leaf="$(resolve_worktree_leaf "$branch_name" "$agent_slug")" +worktree_path="${worktree_root}/${worktree_leaf}" +openspec_plan_slug="$(resolve_openspec_plan_slug "$branch_name" "$agent_slug" "$task_slug")" +openspec_change_slug="$(resolve_openspec_change_slug "$branch_name" "$task_slug")" +openspec_capability_slug="$(resolve_openspec_capability_slug "$task_slug")" + +if [[ -e "$worktree_path" ]]; then + echo "[agent-branch-start] Worktree path already exists: ${worktree_path}" >&2 + exit 1 +fi + +auto_transfer_stash_ref="" +auto_transfer_message="" +auto_transfer_source_branch="" +current_branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" +protected_branches_raw="$(resolve_protected_branches "$repo_root")" +if [[ -n "$current_branch" && "$current_branch" != "HEAD" ]] && is_protected_branch_name "$current_branch" "$protected_branches_raw"; then + if has_local_changes "$repo_root"; then + auto_transfer_message="guardex-auto-transfer-${timestamp}-${agent_slug}-${task_slug}" + if git -C "$repo_root" stash push --include-untracked --message "$auto_transfer_message" >/dev/null 2>&1; then + auto_transfer_stash_ref="$( + git -C "$repo_root" stash list \ + | awk -v msg="$auto_transfer_message" '$0 ~ msg { ref=$1; sub(/:$/, "", ref); print ref; exit }' + )" + if [[ -n "$auto_transfer_stash_ref" ]]; then + auto_transfer_source_branch="$current_branch" + echo "[agent-branch-start] Detected local changes on protected branch 
'${current_branch}'. Moving them to '${branch_name}'..." + fi + fi + fi +fi + +worktree_add_output="" +if ! worktree_add_output="$(git -C "$repo_root" worktree add -b "$branch_name" "$worktree_path" "$start_ref" 2>&1)"; then + printf '%s\n' "$worktree_add_output" >&2 + exit 1 +fi +git -C "$repo_root" config "branch.${branch_name}.guardexBase" "$BASE_BRANCH" >/dev/null 2>&1 || true +git -C "$repo_root" config "branch.${branch_name}.guardexWorktreeRoot" "$WORKTREE_ROOT_REL" >/dev/null 2>&1 || true +# Fresh agent branches should start unpublished; clear any inherited base-branch tracking. +git -C "$worktree_path" branch --unset-upstream "$branch_name" >/dev/null 2>&1 || true + +if [[ -n "$auto_transfer_stash_ref" ]]; then + if git -C "$worktree_path" stash apply "$auto_transfer_stash_ref" >/dev/null 2>&1; then + git -C "$repo_root" stash drop "$auto_transfer_stash_ref" >/dev/null 2>&1 || true + transfer_label="${auto_transfer_source_branch:-$BASE_BRANCH}" + echo "[agent-branch-start] Moved local changes from '${transfer_label}' into '${branch_name}'." + else + echo "[agent-branch-start] Failed to auto-apply moved changes in new worktree." >&2 + transfer_label="${auto_transfer_source_branch:-$BASE_BRANCH}" + echo "[agent-branch-start] Changes are preserved in ${auto_transfer_stash_ref} on ${transfer_label}." >&2 + echo "[agent-branch-start] Apply manually with: git -C \"$worktree_path\" stash apply \"${auto_transfer_stash_ref}\"" >&2 + exit 1 + fi +fi + +hydrate_local_helper_in_worktree "$repo_root" "$worktree_path" "scripts/codex-agent.sh" +hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "node_modules" +hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/frontend/node_modules" +hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/backend/node_modules" +if ! 
initialize_openspec_change_workspace "$repo_root" "$worktree_path" "$openspec_change_slug" "$openspec_capability_slug"; then + exit 1 +fi +if ! initialize_openspec_plan_workspace "$repo_root" "$worktree_path" "$openspec_plan_slug"; then + exit 1 +fi + +echo "[agent-branch-start] Created branch: ${branch_name}" +echo "[agent-branch-start] Worktree: ${worktree_path}" +echo "[agent-branch-start] OpenSpec change: openspec/changes/${openspec_change_slug}" +echo "[agent-branch-start] OpenSpec plan: openspec/plan/${openspec_plan_slug}" +echo "[agent-branch-start] Next steps:" +echo " cd \"${worktree_path}\"" +echo " python3 scripts/agent-file-locks.py claim --branch \"${branch_name}\" <files-to-edit>" +echo " # implement + commit" +echo " bash scripts/agent-branch-finish.sh --branch \"${branch_name}\" --base ${BASE_BRANCH} --via-pr --wait-for-merge" diff --git a/frontend/scripts/agent-file-locks.py b/frontend/scripts/agent-file-locks.py new file mode 100755 index 0000000..06cdd7a --- /dev/null +++ b/frontend/scripts/agent-file-locks.py @@ -0,0 +1,407 @@ +#!/usr/bin/env python3 +"""Per-file lock registry for concurrent agent branches.
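Before the lock-registry implementation, a sketch of the state it manipulates may help: the lock file maps repo-relative paths to an owner record, and a claim conflicts only when a *different* branch already owns the path. The field names (`branch`, `claimed_at`, `allow_delete`) follow the script's schema; everything else here is illustrative.

```python
# Illustrative contents of .omx/state/agent-file-locks.json after one claim.
state = {
    "locks": {
        "src/app.py": {
            "branch": "agent/codex/fix-login",
            "claimed_at": "2026-04-21T13:00:00+00:00",
            "allow_delete": False,
        },
    }
}

def claim_conflicts(locks: dict, branch: str, files: list[str]) -> list[tuple[str, str]]:
    # A path conflicts only when it is already locked by another branch;
    # unclaimed paths and re-claims by the same branch are allowed.
    return [
        (path, locks[path]["branch"])
        for path in files
        if path in locks and locks[path]["branch"] != branch
    ]

print(claim_conflicts(state["locks"], "agent/claude/refactor",
                      ["src/app.py", "docs/README.md"]))
# -> [('src/app.py', 'agent/codex/fix-login')]
```

This is the same rule `cmd_claim` enforces before it rewrites the registry.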
+ +Usage examples: + python3 scripts/agent-file-locks.py claim --branch agent/a path/to/file1 path/to/file2 + python3 scripts/agent-file-locks.py claim --branch agent/a --allow-delete path/to/obsolete-file + python3 scripts/agent-file-locks.py allow-delete --branch agent/a path/to/obsolete-file + python3 scripts/agent-file-locks.py validate --branch agent/a --staged + python3 scripts/agent-file-locks.py release --branch agent/a +""" + +from __future__ import annotations + +import argparse +import json +import os +import subprocess +import sys +from dataclasses import dataclass +from datetime import datetime, timezone +from pathlib import Path +from typing import Any + + +LOCK_FILE_RELATIVE = Path('.omx/state/agent-file-locks.json') +CRITICAL_GUARDRAIL_PATHS = { + 'AGENTS.md', + '.githooks/pre-commit', + '.githooks/pre-push', + 'scripts/agent-branch-start.sh', + 'scripts/agent-branch-finish.sh', + 'scripts/agent-file-locks.py', +} +ALLOW_GUARDRAIL_DELETE_ENV = 'AGENT_ALLOW_GUARDRAIL_DELETE' + + +@dataclass +class LockEntry: + branch: str + claimed_at: str + allow_delete: bool = False + + +class LockError(Exception): + pass + + +def run_git(args: list[str], cwd: Path) -> str: + result = subprocess.run( + ['git', *args], + cwd=str(cwd), + check=False, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + text=True, + ) + if result.returncode != 0: + raise LockError(result.stderr.strip() or f"git {' '.join(args)} failed") + return result.stdout.strip() + + +def resolve_repo_root() -> Path: + output = run_git(['rev-parse', '--show-toplevel'], cwd=Path.cwd()) + return Path(output).resolve() + + +def normalize_repo_path(repo_root: Path, raw_path: str) -> str: + joined = Path(raw_path) + abs_path = joined if joined.is_absolute() else (repo_root / joined) + normalized_abs = Path(os.path.normpath(str(abs_path))) + try: + relative = normalized_abs.relative_to(repo_root) + except ValueError as exc: + raise LockError(f"Path is outside repository: {raw_path}") from exc + return 
relative.as_posix() + + +def lock_file_path(repo_root: Path) -> Path: + return repo_root / LOCK_FILE_RELATIVE + + +def load_state(repo_root: Path) -> dict[str, Any]: + path = lock_file_path(repo_root) + if not path.exists(): + return {'locks': {}} + try: + data = json.loads(path.read_text()) + except json.JSONDecodeError as exc: + raise LockError(f'Lock file is invalid JSON: {path}') from exc + + if not isinstance(data, dict): + return {'locks': {}} + locks = data.get('locks', {}) + if not isinstance(locks, dict): + return {'locks': {}} + + # Backward-compat normalization for older lock schema. + normalized_locks: dict[str, dict[str, Any]] = {} + for file_path, entry in locks.items(): + if not isinstance(entry, dict): + continue + branch = str(entry.get('branch', '')) + claimed_at = str(entry.get('claimed_at', '')) + allow_delete = bool(entry.get('allow_delete', False)) + normalized_locks[str(file_path)] = { + 'branch': branch, + 'claimed_at': claimed_at, + 'allow_delete': allow_delete, + } + + return {'locks': normalized_locks} + + +def write_state(repo_root: Path, state: dict[str, Any]) -> None: + path = lock_file_path(repo_root) + path.parent.mkdir(parents=True, exist_ok=True) + tmp = path.with_suffix(path.suffix + '.tmp') + tmp.write_text(json.dumps(state, indent=2, sort_keys=True) + '\n') + tmp.replace(path) + + +def now_iso() -> str: + return datetime.now(timezone.utc).isoformat() + + +def env_truthy(value: str | None) -> bool: + if value is None: + return False + return value.strip().lower() in {'1', 'true', 'yes', 'on'} + + +def staged_changes(repo_root: Path) -> list[tuple[str, str]]: + out = run_git(['diff', '--cached', '--name-status', '--diff-filter=ACMRDTUXB'], cwd=repo_root) + if not out: + return [] + + results: list[tuple[str, str]] = [] + for raw_line in out.splitlines(): + line = raw_line.strip() + if not line: + continue + parts = line.split('\t') + status_token = parts[0] + status = status_token[0] + if status in {'R', 'C'}: + if len(parts) < 3: 
+ continue + path = parts[-1] + else: + if len(parts) < 2: + continue + path = parts[1] + normalized = normalize_repo_path(repo_root, path) + results.append((status, normalized)) + return results + + +def cmd_claim(args: argparse.Namespace, repo_root: Path) -> int: + state = load_state(repo_root) + locks: dict[str, dict[str, Any]] = state['locks'] + + files = [normalize_repo_path(repo_root, p) for p in args.files] + conflicts: list[tuple[str, str]] = [] + + for file_path in files: + existing = locks.get(file_path) + if existing and existing.get('branch') != args.branch: + conflicts.append((file_path, str(existing.get('branch')))) + + if conflicts: + print('[agent-file-locks] Cannot claim files already locked by other branches:', file=sys.stderr) + for file_path, owner_branch in conflicts: + print(f' - {file_path} (locked by {owner_branch})', file=sys.stderr) + return 1 + + for file_path in files: + existing = locks.get(file_path, {}) + existing_allow_delete = bool(existing.get('allow_delete', False)) + locks[file_path] = LockEntry( + branch=args.branch, + claimed_at=now_iso(), + allow_delete=args.allow_delete or existing_allow_delete, + ).__dict__ + + write_state(repo_root, state) + delete_note = ' (delete-approved)' if args.allow_delete else '' + print(f"[agent-file-locks] Claimed {len(files)} file(s) for {args.branch}{delete_note}.") + return 0 + + +def cmd_allow_delete(args: argparse.Namespace, repo_root: Path) -> int: + state = load_state(repo_root) + locks: dict[str, dict[str, Any]] = state['locks'] + files = [normalize_repo_path(repo_root, p) for p in args.files] + + missing: list[str] = [] + foreign: list[tuple[str, str]] = [] + for file_path in files: + entry = locks.get(file_path) + if not entry: + missing.append(file_path) + continue + owner = str(entry.get('branch', '')) + if owner != args.branch: + foreign.append((file_path, owner)) + continue + entry['allow_delete'] = True + + if missing or foreign: + if missing: + print('[agent-file-locks] Cannot 
enable delete: files are not claimed yet:', file=sys.stderr) + for file_path in missing: + print(f' - {file_path}', file=sys.stderr) + if foreign: + print('[agent-file-locks] Cannot enable delete: files are owned by another branch:', file=sys.stderr) + for file_path, owner in foreign: + print(f' - {file_path} (owner: {owner})', file=sys.stderr) + return 1 + + write_state(repo_root, state) + print(f"[agent-file-locks] Enabled delete approval for {len(files)} file(s) on {args.branch}.") + return 0 + + +def cmd_release(args: argparse.Namespace, repo_root: Path) -> int: + state = load_state(repo_root) + locks: dict[str, dict[str, Any]] = state['locks'] + + to_release: set[str] + if args.files: + requested = {normalize_repo_path(repo_root, p) for p in args.files} + to_release = {p for p in requested if locks.get(p, {}).get('branch') == args.branch} + else: + to_release = {p for p, entry in locks.items() if entry.get('branch') == args.branch} + + for file_path in to_release: + locks.pop(file_path, None) + + write_state(repo_root, state) + print(f"[agent-file-locks] Released {len(to_release)} file(s) for {args.branch}.") + return 0 + + +def cmd_status(args: argparse.Namespace, repo_root: Path) -> int: + state = load_state(repo_root) + locks: dict[str, dict[str, Any]] = state['locks'] + + rows: list[tuple[str, str, str, bool]] = [] + for file_path, entry in sorted(locks.items()): + branch = str(entry.get('branch', '')) + if args.branch and branch != args.branch: + continue + claimed_at = str(entry.get('claimed_at', '')) + allow_delete = bool(entry.get('allow_delete', False)) + rows.append((file_path, branch, claimed_at, allow_delete)) + + if not rows: + print('[agent-file-locks] No active locks.') + return 0 + + print('[agent-file-locks] Active locks:') + for file_path, branch, claimed_at, allow_delete in rows: + delete_flag = ' delete-ok' if allow_delete else '' + print(f' - {file_path} | {branch} | {claimed_at}{delete_flag}') + return 0 + + +def cmd_validate(args: 
argparse.Namespace, repo_root: Path) -> int: + state = load_state(repo_root) + locks: dict[str, dict[str, Any]] = state['locks'] + + if args.staged: + file_changes = staged_changes(repo_root) + else: + file_changes = [('M', normalize_repo_path(repo_root, p)) for p in args.files] + + file_changes = [ + (status, file_path) + for status, file_path in file_changes + if file_path and file_path != LOCK_FILE_RELATIVE.as_posix() + ] + if not file_changes: + return 0 + + missing: list[str] = [] + foreign: list[tuple[str, str]] = [] + delete_not_allowed: list[str] = [] + guardrail_delete_blocked: list[str] = [] + + allow_guardrail_delete = env_truthy(os.environ.get(ALLOW_GUARDRAIL_DELETE_ENV)) + + for status, file_path in file_changes: + entry = locks.get(file_path) + if not entry: + missing.append(file_path) + continue + + owner = str(entry.get('branch', '')) + if owner != args.branch: + foreign.append((file_path, owner)) + continue + + if status == 'D': + if file_path in CRITICAL_GUARDRAIL_PATHS and not allow_guardrail_delete: + guardrail_delete_blocked.append(file_path) + + allow_delete = bool(entry.get('allow_delete', False)) + if not allow_delete: + delete_not_allowed.append(file_path) + + if not missing and not foreign and not delete_not_allowed and not guardrail_delete_blocked: + return 0 + + print('[agent-file-locks] Commit blocked: staged files must be safely claimed by this branch first.', file=sys.stderr) + if missing: + print(' Unclaimed files:', file=sys.stderr) + for file_path in missing: + print(f' - {file_path}', file=sys.stderr) + if foreign: + print(' Files claimed by another branch:', file=sys.stderr) + for file_path, owner in foreign: + print(f' - {file_path} (owner: {owner})', file=sys.stderr) + if delete_not_allowed: + print(' Delete not approved for claimed files:', file=sys.stderr) + for file_path in delete_not_allowed: + print(f' - {file_path}', file=sys.stderr) + print(' Approve explicit deletions with one of:', file=sys.stderr) + print( + f' 
python3 scripts/agent-file-locks.py claim --branch "{args.branch}" --allow-delete <files...>', + file=sys.stderr, + ) + print( + f' python3 scripts/agent-file-locks.py allow-delete --branch "{args.branch}" <files...>', + file=sys.stderr, + ) + if guardrail_delete_blocked: + print(' Critical guardrail file deletion blocked:', file=sys.stderr) + for file_path in guardrail_delete_blocked: + print(f' - {file_path}', file=sys.stderr) + print( + f' To intentionally allow this rare operation, set {ALLOW_GUARDRAIL_DELETE_ENV}=1 for the commit command.', + file=sys.stderr, + ) + + print('\nClaim files with:', file=sys.stderr) + print(f' python3 scripts/agent-file-locks.py claim --branch "{args.branch}" <files...>', file=sys.stderr) + return 1 + + +def build_parser() -> argparse.ArgumentParser: + parser = argparse.ArgumentParser(description='Concurrent agent file-lock utility') + sub = parser.add_subparsers(dest='command', required=True) + + claim = sub.add_parser('claim', help='Claim file locks for a branch') + claim.add_argument('--branch', required=True, help='Owner branch name (e.g., agent/foo/...)') + claim.add_argument( + '--allow-delete', + action='store_true', + help='Mark these files as explicitly approved for deletion by this branch', + ) + claim.add_argument('files', nargs='+', help='Files to claim (repo-relative or absolute)') + + allow_delete = sub.add_parser('allow-delete', help='Enable delete approval on already claimed files') + allow_delete.add_argument('--branch', required=True, help='Owner branch name') + allow_delete.add_argument('files', nargs='+', help='Files to mark as delete-approved') + + release = sub.add_parser('release', help='Release file locks for a branch') + release.add_argument('--branch', required=True, help='Owner branch name') + release.add_argument('files', nargs='*', help='Optional files; omit to release all branch locks') + + status = sub.add_parser('status', help='Show lock status') + status.add_argument('--branch', help='Filter by branch') + + validate =
sub.add_parser('validate', help='Validate staged files are locked by branch') + validate.add_argument('--branch', required=True, help='Owner branch name') + validate.add_argument('--staged', action='store_true', help='Validate staged files from git index') + validate.add_argument('files', nargs='*', help='Files to validate when --staged is not used') + + return parser + + +def main() -> int: + parser = build_parser() + args = parser.parse_args() + + try: + repo_root = resolve_repo_root() + if args.command == 'claim': + return cmd_claim(args, repo_root) + if args.command == 'allow-delete': + return cmd_allow_delete(args, repo_root) + if args.command == 'release': + return cmd_release(args, repo_root) + if args.command == 'status': + return cmd_status(args, repo_root) + if args.command == 'validate': + if not args.staged and not args.files: + raise LockError('validate requires --staged or one or more file paths') + return cmd_validate(args, repo_root) + raise LockError(f'Unknown command: {args.command}') + except LockError as exc: + print(f'[agent-file-locks] {exc}', file=sys.stderr) + return 2 + + +if __name__ == '__main__': + raise SystemExit(main()) diff --git a/frontend/scripts/agent-worktree-prune.sh b/frontend/scripts/agent-worktree-prune.sh new file mode 100755 index 0000000..c967140 --- /dev/null +++ b/frontend/scripts/agent-worktree-prune.sh @@ -0,0 +1,572 @@ +#!/usr/bin/env bash +set -euo pipefail + +BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" +BASE_BRANCH_EXPLICIT=0 +DRY_RUN=0 +FORCE_DIRTY=0 +DELETE_BRANCHES=0 +DELETE_REMOTE_BRANCHES=0 +ONLY_DIRTY_WORKTREES=0 +INCLUDE_PR_MERGED=0 +TARGET_BRANCH="" +IDLE_MINUTES=0 +NOW_EPOCH_RAW="${GUARDEX_PRUNE_NOW_EPOCH:-}" +IDLE_SECONDS=0 +NOW_EPOCH=0 +GH_BIN="${GUARDEX_GH_BIN:-gh}" +PR_MERGED_LOOKUP_DISABLED=0 +PR_MERGED_LOOKUP_LOADED=0 +declare -A MERGED_PR_BRANCHES=() +WORKTREE_ROOT_RELS=( + ".omx/agent-worktrees" + ".omc/agent-worktrees" +) + +if [[ -n "$BASE_BRANCH" ]]; then + BASE_BRANCH_EXPLICIT=1 +fi + +while [[ $# 
-gt 0 ]]; do + case "$1" in + --base) + BASE_BRANCH="${2:-}" + BASE_BRANCH_EXPLICIT=1 + shift 2 + ;; + --dry-run) + DRY_RUN=1 + shift + ;; + --force-dirty) + FORCE_DIRTY=1 + shift + ;; + --delete-branches) + DELETE_BRANCHES=1 + shift + ;; + --delete-remote-branches) + DELETE_REMOTE_BRANCHES=1 + shift + ;; + --only-dirty-worktrees) + ONLY_DIRTY_WORKTREES=1 + shift + ;; + --include-pr-merged) + INCLUDE_PR_MERGED=1 + shift + ;; + --branch) + TARGET_BRANCH="${2:-}" + shift 2 + ;; + --idle-minutes) + IDLE_MINUTES="${2:-}" + shift 2 + ;; + *) + echo "[agent-worktree-prune] Unknown argument: $1" >&2 + echo "Usage: $0 [--base <branch>] [--dry-run] [--force-dirty] [--delete-branches] [--delete-remote-branches] [--only-dirty-worktrees] [--include-pr-merged] [--branch <agent-branch>] [--idle-minutes <minutes>]" >&2 + exit 1 + ;; + esac +done + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[agent-worktree-prune] Not inside a git repository." >&2 + exit 1 +fi + +repo_root="$(git rev-parse --show-toplevel)" +current_pwd="$(pwd -P)" +repo_common_dir="$( + git -C "$repo_root" rev-parse --git-common-dir \ + | awk -v root="$repo_root" '{ if ($0 ~ /^\//) { print $0 } else { print root "/" $0 } }' +)" +repo_common_dir="$(cd "$repo_common_dir" && pwd -P)" + +resolve_worktree_root_rel_for_entry() { + local entry="$1" + case "$entry" in + */.omc/agent-worktrees/*) + printf '%s' '.omc/agent-worktrees' + ;; + *) + printf '%s' '.omx/agent-worktrees' + ;; + esac +} + +is_managed_worktree_path() { + local entry="$1" + local rel root + for rel in "${WORKTREE_ROOT_RELS[@]}"; do + root="${repo_root}/${rel}" + if [[ "$entry" == "${root}"/* ]]; then + return 0 + fi + done + return 1 +} + +resolve_base_branch() { + local configured="" + local current="" + + configured="$(git -C "$repo_root" config --get multiagent.baseBranch || true)" + if [[ -n "$configured" ]] && git -C "$repo_root" show-ref --verify --quiet "refs/heads/${configured}"; then + printf '%s' "$configured" + return 0 + fi + 
current="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" + if [[ -n "$current" && "$current" != "HEAD" ]] && git -C "$repo_root" show-ref --verify --quiet "refs/heads/${current}"; then + printf '%s' "$current" + return 0 + fi + + for fallback in main dev; do + if git -C "$repo_root" show-ref --verify --quiet "refs/heads/${fallback}"; then + printf '%s' "$fallback" + return 0 + fi + done + + printf '%s' "" +} + +load_merged_pr_branches() { + if [[ "$INCLUDE_PR_MERGED" -ne 1 ]]; then + return 1 + fi + if [[ "$PR_MERGED_LOOKUP_DISABLED" -eq 1 ]]; then + return 1 + fi + if [[ "$PR_MERGED_LOOKUP_LOADED" -eq 1 ]]; then + return 0 + fi + if ! command -v "$GH_BIN" >/dev/null 2>&1; then + PR_MERGED_LOOKUP_DISABLED=1 + return 1 + fi + + local merged_branches="" + merged_branches="$( + "$GH_BIN" pr list --state merged --base "$BASE_BRANCH" --limit 200 --json headRefName --jq '.[].headRefName' 2>/dev/null || true + )" + if [[ -n "$merged_branches" ]]; then + while IFS= read -r merged_branch; do + [[ -z "$merged_branch" ]] && continue + MERGED_PR_BRANCHES["$merged_branch"]=1 + done <<< "$merged_branches" + fi + PR_MERGED_LOOKUP_LOADED=1 + return 0 +} + +branch_has_merged_pr() { + local branch="$1" + if [[ "$INCLUDE_PR_MERGED" -ne 1 ]]; then + return 1 + fi + load_merged_pr_branches || return 1 + [[ -n "${MERGED_PR_BRANCHES[$branch]:-}" ]] +} + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -z "$BASE_BRANCH" ]]; then + echo "[agent-worktree-prune] --base requires a non-empty branch name." >&2 + exit 1 +fi + +if [[ -n "$TARGET_BRANCH" && "$TARGET_BRANCH" != agent/* ]]; then + echo "[agent-worktree-prune] --branch must reference an agent/* branch: ${TARGET_BRANCH}" >&2 + exit 1 +fi + +if [[ ! "$IDLE_MINUTES" =~ ^[0-9]+$ ]]; then + echo "[agent-worktree-prune] --idle-minutes must be an integer >= 0." >&2 + exit 1 +fi + +if [[ -n "$NOW_EPOCH_RAW" && ! 
"$NOW_EPOCH_RAW" =~ ^[0-9]+$ ]]; then + echo "[agent-worktree-prune] GUARDEX_PRUNE_NOW_EPOCH must be a unix timestamp integer." >&2 + exit 1 +fi + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 0 ]]; then + BASE_BRANCH="$(resolve_base_branch)" +fi + +if [[ -z "$BASE_BRANCH" ]]; then + echo "[agent-worktree-prune] Unable to infer base branch. Pass --base <branch>." >&2 + exit 1 +fi + +if ! git -C "$repo_root" show-ref --verify --quiet "refs/heads/${BASE_BRANCH}"; then + echo "[agent-worktree-prune] Base branch not found: ${BASE_BRANCH}" >&2 + exit 1 +fi + +IDLE_SECONDS=$((IDLE_MINUTES * 60)) +if [[ -n "$NOW_EPOCH_RAW" ]]; then + NOW_EPOCH="$NOW_EPOCH_RAW" +else + NOW_EPOCH="$(date +%s)" +fi + +run_cmd() { + if [[ "$DRY_RUN" -eq 1 ]]; then + echo "[agent-worktree-prune] [dry-run] $*" + return 0 + fi + "$@" +} + +branch_has_worktree() { + local branch="$1" + git -C "$repo_root" worktree list --porcelain | grep -q "^branch refs/heads/${branch}$" +} + +is_clean_worktree() { + local wt="$1" + git -C "$wt" diff --quiet -- . ":(exclude).omx/state/agent-file-locks.json" \ + && git -C "$wt" diff --cached --quiet -- . 
":(exclude).omx/state/agent-file-locks.json" \ + && [[ -z "$(git -C "$wt" ls-files --others --exclude-standard)" ]] +} + +resolve_worktree_common_dir() { + local wt="$1" + local common_dir="" + common_dir="$(git -C "$wt" rev-parse --git-common-dir 2>/dev/null || true)" + if [[ -z "$common_dir" ]]; then + return 1 + fi + if [[ "$common_dir" == /* ]]; then + common_dir="$(cd "$common_dir" 2>/dev/null && pwd -P || true)" + else + common_dir="$(cd "$wt/$common_dir" 2>/dev/null && pwd -P || true)" + fi + if [[ -z "$common_dir" ]]; then + return 1 + fi + printf '%s' "$common_dir" +} + +select_unique_worktree_path() { + local root="$1" + local name="$2" + local candidate="${root}/${name}" + local suffix=2 + while [[ -e "$candidate" ]]; do + candidate="${root}/${name}-${suffix}" + suffix=$((suffix + 1)) + done + printf '%s' "$candidate" +} + +read_branch_activity_epoch() { + local branch="$1" + local wt="${2:-}" + local activity_epoch="" + + activity_epoch="$( + git -C "$repo_root" reflog show --format='%ct' -n 1 "refs/heads/${branch}" 2>/dev/null \ + | head -n 1 \ + | tr -d '[:space:]' + )" + if [[ -z "$activity_epoch" ]]; then + activity_epoch="$( + git -C "$repo_root" log -1 --format='%ct' "$branch" 2>/dev/null \ + | head -n 1 \ + | tr -d '[:space:]' + )" + fi + + if [[ -n "$wt" && -d "$wt" ]]; then + local lock_file="${wt}/.omx/state/agent-file-locks.json" + if [[ -f "$lock_file" ]]; then + local lock_mtime="" + lock_mtime="$(stat -c %Y "$lock_file" 2>/dev/null || stat -f %m "$lock_file" 2>/dev/null || true)" + if [[ "$lock_mtime" =~ ^[0-9]+$ ]]; then + if [[ -z "$activity_epoch" || "$lock_mtime" -gt "$activity_epoch" ]]; then + activity_epoch="$lock_mtime" + fi + fi + fi + fi + + printf '%s' "$activity_epoch" +} + +skipped_recent=0 + +branch_idle_gate() { + local branch="$1" + local wt="$2" + local reason="$3" + if [[ "$IDLE_SECONDS" -le 0 ]]; then + return 0 + fi + if [[ -z "$branch" ]]; then + return 0 + fi + + local last_activity_epoch="" + 
last_activity_epoch="$(read_branch_activity_epoch "$branch" "$wt")" + if [[ ! "$last_activity_epoch" =~ ^[0-9]+$ ]]; then + return 0 + fi + + local idle_age=$((NOW_EPOCH - last_activity_epoch)) + if [[ "$idle_age" -lt 0 ]]; then + idle_age=0 + fi + if [[ "$idle_age" -lt "$IDLE_SECONDS" ]]; then + skipped_recent=$((skipped_recent + 1)) + echo "[agent-worktree-prune] Skipping recent branch (${reason}): ${branch} (idle=${idle_age}s < ${IDLE_SECONDS}s)" + return 1 + fi + return 0 +} + +relocated_foreign=0 +skipped_foreign=0 + +relocate_foreign_worktree_entries() { + local rel="" worktree_root="" entry="" + for rel in "${WORKTREE_ROOT_RELS[@]}"; do + worktree_root="${repo_root}/${rel}" + [[ -d "$worktree_root" ]] || continue + + for entry in "${worktree_root}"/*; do + [[ -d "$entry" ]] || continue + if ! git -C "$entry" rev-parse --is-inside-work-tree >/dev/null 2>&1; then + continue + fi + + local entry_common_dir="" + entry_common_dir="$(resolve_worktree_common_dir "$entry" || true)" + [[ -n "$entry_common_dir" ]] || continue + + if [[ "$entry_common_dir" == "$repo_common_dir" ]]; then + continue + fi + + if [[ "$(basename "$entry_common_dir")" != ".git" ]]; then + skipped_foreign=$((skipped_foreign + 1)) + echo "[agent-worktree-prune] Skipping foreign worktree with unsupported git common dir: ${entry}" + continue + fi + + local owner_repo_root + owner_repo_root="$(dirname "$entry_common_dir")" + local owner_worktree_root_rel owner_worktree_root + owner_worktree_root_rel="$(resolve_worktree_root_rel_for_entry "$entry")" + owner_worktree_root="${owner_repo_root}/${owner_worktree_root_rel}" + local target_path + target_path="$(select_unique_worktree_path "$owner_worktree_root" "$(basename "$entry")")" + + if [[ "$entry" == "$current_pwd" || "$current_pwd" == "${entry}"/* ]]; then + skipped_foreign=$((skipped_foreign + 1)) + echo "[agent-worktree-prune] Skipping active foreign worktree: ${entry}" + continue + fi + + echo "[agent-worktree-prune] Relocating foreign 
worktree to owning repo: ${entry} -> ${target_path}" + if [[ "$DRY_RUN" -eq 1 ]]; then + relocated_foreign=$((relocated_foreign + 1)) + continue + fi + + mkdir -p "$owner_worktree_root" + if git -C "$owner_repo_root" worktree move "$entry" "$target_path" >/dev/null 2>&1; then + relocated_foreign=$((relocated_foreign + 1)) + else + skipped_foreign=$((skipped_foreign + 1)) + echo "[agent-worktree-prune] Failed to relocate foreign worktree: ${entry}" >&2 + fi + done + done +} + +removed_worktrees=0 +removed_branches=0 +skipped_active=0 +skipped_dirty=0 + +relocate_foreign_worktree_entries + +process_entry() { + local wt="$1" + local branch_ref="$2" + + [[ -z "$wt" ]] && return + if ! is_managed_worktree_path "$wt"; then + return + fi + + local branch="" + if [[ -n "$branch_ref" ]]; then + branch="${branch_ref#refs/heads/}" + fi + + if [[ -n "$TARGET_BRANCH" && "$branch" != "$TARGET_BRANCH" ]]; then + return + fi + + if [[ "$wt" == "$current_pwd" ]]; then + skipped_active=$((skipped_active + 1)) + echo "[agent-worktree-prune] Skipping active cwd worktree: ${wt}" + return + fi + + local remove_reason="" + local branch_delete_mode="safe" + + if [[ -z "$branch_ref" ]]; then + remove_reason="detached-worktree" + elif ! git -C "$repo_root" show-ref --verify --quiet "refs/heads/${branch}"; then + remove_reason="missing-branch" + elif [[ "$branch" == agent/* ]]; then + if git -C "$repo_root" merge-base --is-ancestor "$branch" "$BASE_BRANCH"; then + if [[ "$DELETE_BRANCHES" -eq 1 ]]; then + remove_reason="merged-agent-branch" + fi + elif [[ "$DELETE_BRANCHES" -eq 1 ]] && branch_has_merged_pr "$branch"; then + remove_reason="merged-agent-pr" + branch_delete_mode="force" + elif [[ "$ONLY_DIRTY_WORKTREES" -eq 1 ]] && is_clean_worktree "$wt"; then + remove_reason="clean-agent-worktree" + fi + elif [[ "$branch" == __agent_integrate_* || "$branch" == __source-probe-* ]]; then + remove_reason="temporary-worktree" + fi + + if [[ -z "$remove_reason" ]]; then + return + fi + + if ! 
branch_idle_gate "$branch" "$wt" "$remove_reason"; then + return + fi + + if [[ "$FORCE_DIRTY" -ne 1 ]] && ! is_clean_worktree "$wt"; then + skipped_dirty=$((skipped_dirty + 1)) + echo "[agent-worktree-prune] Skipping dirty worktree (${remove_reason}): ${wt}" + return + fi + + echo "[agent-worktree-prune] Removing worktree (${remove_reason}): ${wt}" + run_cmd git -C "$repo_root" worktree remove "$wt" --force + removed_worktrees=$((removed_worktrees + 1)) + + if [[ -z "$branch" ]]; then + return + fi + + if git -C "$repo_root" show-ref --verify --quiet "refs/heads/${branch}" && ! branch_has_worktree "$branch"; then + if [[ "$branch" == agent/* && "$DELETE_BRANCHES" -eq 1 ]]; then + local delete_flag="-d" + local deleted_label="merged" + if [[ "$branch_delete_mode" == "force" ]]; then + delete_flag="-D" + deleted_label="merged PR" + fi + if run_cmd git -C "$repo_root" branch "$delete_flag" "$branch" >/dev/null 2>&1; then + removed_branches=$((removed_branches + 1)) + echo "[agent-worktree-prune] Deleted ${deleted_label} branch: ${branch}" + if [[ "$DELETE_REMOTE_BRANCHES" -eq 1 ]]; then + if git -C "$repo_root" ls-remote --exit-code --heads origin "$branch" >/dev/null 2>&1; then + run_cmd git -C "$repo_root" push origin --delete "$branch" >/dev/null 2>&1 || true + echo "[agent-worktree-prune] Deleted ${deleted_label} remote branch: ${branch}" + fi + fi + fi + elif [[ "$branch" == __agent_integrate_* || "$branch" == __source-probe-* ]]; then + run_cmd git -C "$repo_root" branch -D "$branch" >/dev/null 2>&1 || true + removed_branches=$((removed_branches + 1)) + echo "[agent-worktree-prune] Deleted temporary branch: ${branch}" + fi + fi +} + +current_wt="" +current_branch_ref="" + +while IFS= read -r line || [[ -n "$line" ]]; do + if [[ -z "$line" ]]; then + process_entry "$current_wt" "$current_branch_ref" + current_wt="" + current_branch_ref="" + continue + fi + + case "$line" in + worktree\ *) + current_wt="${line#worktree }" + ;; + branch\ *) + 
current_branch_ref="${line#branch }" + ;; + esac + done < <(git -C "$repo_root" worktree list --porcelain) + +process_entry "$current_wt" "$current_branch_ref" + +if [[ "$DELETE_BRANCHES" -eq 1 ]]; then + while IFS= read -r branch; do + [[ -z "$branch" ]] && continue + if [[ -n "$TARGET_BRANCH" && "$branch" != "$TARGET_BRANCH" ]]; then + continue + fi + if branch_has_worktree "$branch"; then + continue + fi + if ! branch_idle_gate "$branch" "" "stale-merged-branch"; then + continue + fi + merged_by_ancestor=0 + merged_by_pr=0 + if git -C "$repo_root" merge-base --is-ancestor "$branch" "$BASE_BRANCH"; then + merged_by_ancestor=1 + elif branch_has_merged_pr "$branch"; then + merged_by_pr=1 + fi + if [[ "$merged_by_ancestor" -eq 1 || "$merged_by_pr" -eq 1 ]]; then + delete_flag="-d" + deleted_label="merged" + if [[ "$merged_by_pr" -eq 1 && "$merged_by_ancestor" -eq 0 ]]; then + delete_flag="-D" + deleted_label="merged PR" + fi + if run_cmd git -C "$repo_root" branch "$delete_flag" "$branch" >/dev/null 2>&1; then + removed_branches=$((removed_branches + 1)) + echo "[agent-worktree-prune] Deleted stale ${deleted_label} branch: ${branch}" + if [[ "$DELETE_REMOTE_BRANCHES" -eq 1 ]]; then + if git -C "$repo_root" ls-remote --exit-code --heads origin "$branch" >/dev/null 2>&1; then + run_cmd git -C "$repo_root" push origin --delete "$branch" >/dev/null 2>&1 || true + echo "[agent-worktree-prune] Deleted stale ${deleted_label} remote branch: ${branch}" + fi + fi + fi + fi + done < <(git -C "$repo_root" for-each-ref --format='%(refname:short)' refs/heads/agent) +fi + +run_cmd git -C "$repo_root" worktree prune + +echo "[agent-worktree-prune] Summary: base=${BASE_BRANCH}, idle_minutes=${IDLE_MINUTES}, removed_worktrees=${removed_worktrees}, removed_branches=${removed_branches}, skipped_active=${skipped_active}, skipped_dirty=${skipped_dirty}, skipped_recent=${skipped_recent}" +if [[ "$relocated_foreign" -gt 0 || "$skipped_foreign" -gt 0 ]]; then + echo "[agent-worktree-prune] 
Foreign routing: relocated=${relocated_foreign}, skipped=${skipped_foreign}" +fi +if [[ "$skipped_active" -gt 0 ]]; then + echo "[agent-worktree-prune] Tip: leave active agent worktree directories, then run this command again for full cleanup." >&2 +fi +if [[ "$skipped_dirty" -gt 0 ]]; then + echo "[agent-worktree-prune] Tip: dirty worktrees were preserved. Clean/finish them first, or pass --force-dirty to remove anyway." >&2 +fi +if [[ "$IDLE_SECONDS" -gt 0 && "$skipped_recent" -gt 0 ]]; then + echo "[agent-worktree-prune] Tip: recent branches were preserved by --idle-minutes=${IDLE_MINUTES}. Re-run later or lower the threshold." >&2 +fi diff --git a/frontend/scripts/codex-agent.sh b/frontend/scripts/codex-agent.sh new file mode 100755 index 0000000..a8a53c8 --- /dev/null +++ b/frontend/scripts/codex-agent.sh @@ -0,0 +1,910 @@ +#!/usr/bin/env bash +set -euo pipefail + +TASK_NAME="${GUARDEX_TASK_NAME:-task}" +AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" +BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" +BASE_BRANCH_EXPLICIT=0 +CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" +AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" +AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" +AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" +AUTO_WAIT_FOR_MERGE_RAW="${GUARDEX_CODEX_WAIT_FOR_MERGE:-true}" +OPENSPEC_AUTO_INIT_RAW="${GUARDEX_OPENSPEC_AUTO_INIT:-true}" +OPENSPEC_PLAN_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_PLAN_SLUG:-}" +OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" +OPENSPEC_CAPABILITY_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CAPABILITY_SLUG:-}" +OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" + +normalize_bool() { + local raw="${1:-}" + local fallback="${2:-0}" + local lowered + lowered="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" + case "$lowered" in + 1|true|yes|on) printf '1' ;; + 0|false|no|off) printf '0' ;; + '') printf '%s' "$fallback" ;; + *) printf '%s' "$fallback" ;; + esac +} + 
+AUTO_FINISH="$(normalize_bool "$AUTO_FINISH_RAW" "1")" +AUTO_REVIEW_ON_CONFLICT="$(normalize_bool "$AUTO_REVIEW_ON_CONFLICT_RAW" "1")" +AUTO_CLEANUP="$(normalize_bool "$AUTO_CLEANUP_RAW" "1")" +AUTO_WAIT_FOR_MERGE="$(normalize_bool "$AUTO_WAIT_FOR_MERGE_RAW" "1")" +OPENSPEC_AUTO_INIT="$(normalize_bool "$OPENSPEC_AUTO_INIT_RAW" "1")" + +resolve_openspec_masterplan_label() { + local raw="${OPENSPEC_MASTERPLAN_LABEL_RAW:-}" + local label + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]] || [[ -z "$raw" ]]; then + printf '' + return 0 + fi + + label="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + printf '%s' "$label" +} + +if [[ -n "$BASE_BRANCH" ]]; then + BASE_BRANCH_EXPLICIT=1 +fi + +while [[ $# -gt 0 ]]; do + case "$1" in + --task) + TASK_NAME="${2:-$TASK_NAME}" + shift 2 + ;; + --agent) + AGENT_NAME="${2:-$AGENT_NAME}" + shift 2 + ;; + --base) + BASE_BRANCH="${2:-$BASE_BRANCH}" + BASE_BRANCH_EXPLICIT=1 + shift 2 + ;; + --codex-bin) + CODEX_BIN="${2:-$CODEX_BIN}" + shift 2 + ;; + --auto-finish) + AUTO_FINISH=1 + shift + ;; + --no-auto-finish) + AUTO_FINISH=0 + shift + ;; + --auto-review-on-conflict) + AUTO_REVIEW_ON_CONFLICT=1 + shift + ;; + --no-auto-review-on-conflict) + AUTO_REVIEW_ON_CONFLICT=0 + shift + ;; + --cleanup) + AUTO_CLEANUP=1 + shift + ;; + --no-cleanup) + AUTO_CLEANUP=0 + shift + ;; + --wait-for-merge) + AUTO_WAIT_FOR_MERGE=1 + shift + ;; + --no-wait-for-merge) + AUTO_WAIT_FOR_MERGE=0 + shift + ;; + --) + shift + break + ;; + -*) + break + ;; + *) + TASK_NAME="$1" + shift + if [[ $# -gt 0 && "${1:-}" != -* ]]; then + AGENT_NAME="$1" + shift + fi + if [[ $# -gt 0 && "${1:-}" != -* ]]; then + BASE_BRANCH="$1" + BASE_BRANCH_EXPLICIT=1 + shift + fi + break + ;; + esac +done + +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -z "$BASE_BRANCH" ]]; then + echo "[codex-agent] --base requires a non-empty branch name." >&2 + exit 1 +fi + +if ! 
command -v "$CODEX_BIN" >/dev/null 2>&1; then + echo "[codex-agent] Missing Codex CLI command: $CODEX_BIN" >&2 + echo "[codex-agent] Install Codex first, then retry." >&2 + exit 127 +fi + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[codex-agent] Not inside a git repository." >&2 + exit 1 +fi +repo_root="$(git rev-parse --show-toplevel)" + +guardex_env_helper="${repo_root}/scripts/guardex-env.sh" +if [[ -f "$guardex_env_helper" ]]; then + # shellcheck source=/dev/null + source "$guardex_env_helper" +fi +if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! guardex_repo_is_enabled "$repo_root"; then + toggle_source="$(guardex_repo_toggle_source "$repo_root" || true)" + toggle_raw="$(guardex_repo_toggle_raw "$repo_root" || true)" + if [[ -n "$toggle_source" && -n "$toggle_raw" ]]; then + echo "[codex-agent] Guardex is disabled for this repo (${toggle_source}: GUARDEX_ON=${toggle_raw})." >&2 + else + echo "[codex-agent] Guardex is disabled for this repo." >&2 + fi + echo "[codex-agent] Skip Guardex sandbox flow or re-enable with GUARDEX_ON=1." 
>&2 + exit 1 +fi + +sanitize_slug() { + local raw="$1" + local fallback="${2:-task}" + local slug + slug="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + if [[ -z "$slug" ]]; then + slug="$fallback" + fi + printf '%s' "$slug" +} + +resolve_openspec_plan_slug() { + local branch_name="$1" + local task_slug + local masterplan_label="" + local branch_role="" + local branch_leaf="" + task_slug="$(sanitize_slug "$TASK_NAME" "task")" + if [[ -n "$OPENSPEC_PLAN_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_PLAN_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + masterplan_label="$(resolve_openspec_masterplan_label)" + if [[ -n "$masterplan_label" ]] && [[ "$branch_name" =~ ^agent/([^/]+)/(.+)$ ]]; then + branch_role="${BASH_REMATCH[1]}" + branch_leaf="${BASH_REMATCH[2]}" + sanitize_slug "agent-${branch_role}-${masterplan_label}-${branch_leaf}" "$task_slug" + return 0 + fi + sanitize_slug "${branch_name//\//-}" "$task_slug" +} + +resolve_openspec_change_slug() { + local branch_name="$1" + local task_slug + task_slug="$(sanitize_slug "$TASK_NAME" "task")" + if [[ -n "$OPENSPEC_CHANGE_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_CHANGE_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + sanitize_slug "${branch_name//\//-}" "$task_slug" +} + +resolve_openspec_capability_slug() { + local task_slug + task_slug="$(sanitize_slug "$TASK_NAME" "task")" + if [[ -n "$OPENSPEC_CAPABILITY_SLUG_OVERRIDE" ]]; then + sanitize_slug "$OPENSPEC_CAPABILITY_SLUG_OVERRIDE" "$task_slug" + return 0 + fi + sanitize_slug "$task_slug" "general-behavior" +} + +resolve_worktree_leaf() { + local branch_name="$1" + local masterplan_label="" + local branch_role="" + local branch_leaf="" + + masterplan_label="$(resolve_openspec_masterplan_label)" + if [[ -n "$masterplan_label" ]] && [[ "$branch_name" =~ ^agent/([^/]+)/(.+)$ ]]; then + branch_role="${BASH_REMATCH[1]}" + branch_leaf="${BASH_REMATCH[2]}" + printf 'agent__%s__%s__%s' 
"$branch_role" "$masterplan_label" "$branch_leaf" + return 0 + fi + + printf '%s' "${branch_name//\//__}" +} + +hydrate_local_helper_in_worktree() { + local worktree="$1" + local relative_path="$2" + local worktree_target="${worktree}/${relative_path}" + local source_path="" + + if [[ -e "$worktree_target" ]]; then + return 0 + fi + + if [[ -f "${repo_root}/${relative_path}" ]]; then + source_path="${repo_root}/${relative_path}" + elif [[ -f "${repo_root}/templates/${relative_path}" ]]; then + source_path="${repo_root}/templates/${relative_path}" + fi + + if [[ -z "$source_path" ]]; then + return 0 + fi + + mkdir -p "$(dirname "$worktree_target")" + cp "$source_path" "$worktree_target" + if [[ -x "$source_path" ]]; then + chmod +x "$worktree_target" + fi + + echo "[codex-agent] Hydrated local helper in sandbox: ${relative_path}" +} + +resolve_start_base_branch() { + if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -n "$BASE_BRANCH" ]]; then + printf '%s' "$BASE_BRANCH" + return 0 + fi + + local configured_base + configured_base="$(git -C "$repo_root" config --get multiagent.baseBranch || true)" + if [[ -n "$configured_base" ]]; then + printf '%s' "$configured_base" + return 0 + fi + + printf 'dev' +} + +resolve_start_ref() { + local base_branch="$1" + git -C "$repo_root" fetch origin "$base_branch" --quiet >/dev/null 2>&1 || true + if git -C "$repo_root" show-ref --verify --quiet "refs/remotes/origin/${base_branch}"; then + printf 'origin/%s' "$base_branch" + return 0 + fi + if git -C "$repo_root" show-ref --verify --quiet "refs/heads/${base_branch}"; then + printf '%s' "$base_branch" + return 0 + fi + return 1 +} + +origin_remote_looks_like_github() { + local wt="$1" + local origin_url="" + origin_url="$(git -C "$wt" remote get-url origin 2>/dev/null || true)" + [[ -n "$origin_url" && "$origin_url" =~ github\.com[:/] ]] +} + +auto_finish_context_is_ready() { + local wt="$1" + local gh_bin="${GUARDEX_GH_BIN:-gh}" + + if ! 
git -C "$wt" remote get-url origin >/dev/null 2>&1; then + return 1 + fi + if ! command -v "$gh_bin" >/dev/null 2>&1; then + return 1 + fi + + if [[ -n "${GUARDEX_GH_BIN:-}" ]]; then + return 0 + fi + + if ! origin_remote_looks_like_github "$wt"; then + return 1 + fi + + "$gh_bin" auth status >/dev/null 2>&1 +} + +restore_repo_branch_if_changed() { + local expected_branch="$1" + if [[ -z "$expected_branch" || "$expected_branch" == "HEAD" ]]; then + return 0 + fi + local current_branch + current_branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" + if [[ -z "$current_branch" || "$current_branch" == "$expected_branch" ]]; then + return 0 + fi + git -C "$repo_root" checkout "$expected_branch" >/dev/null 2>&1 +} + +start_sandbox_fallback() { + local base_branch start_ref timestamp task_slug agent_slug branch_name_base branch_name suffix + local worktree_root worktree_path + + base_branch="$(resolve_start_base_branch)" + if ! start_ref="$(resolve_start_ref "$base_branch")"; then + echo "[codex-agent] Unable to resolve base ref for fallback sandbox start: ${base_branch}" >&2 + return 1 + fi + + timestamp="$(date +%Y%m%d-%H%M%S)" + task_slug="$(sanitize_slug "$TASK_NAME" "task")" + agent_slug="$(sanitize_slug "$AGENT_NAME" "agent")" + branch_name_base="agent/${agent_slug}/${timestamp}-${task_slug}" + branch_name="$branch_name_base" + suffix=2 + while git -C "$repo_root" show-ref --verify --quiet "refs/heads/${branch_name}"; do + branch_name="${branch_name_base}-${suffix}" + suffix=$((suffix + 1)) + done + + worktree_root="${repo_root}/.omx/agent-worktrees" + mkdir -p "$worktree_root" + worktree_path="${worktree_root}/$(resolve_worktree_leaf "$branch_name")" + if [[ -e "$worktree_path" ]]; then + echo "[codex-agent] Fallback worktree path already exists: $worktree_path" >&2 + return 1 + fi + + local worktree_add_output="" + if ! 
worktree_add_output="$(git -C "$repo_root" worktree add -b "$branch_name" "$worktree_path" "$start_ref" 2>&1)"; then + printf '%s\n' "$worktree_add_output" >&2 + return 1 + fi + git -C "$repo_root" config "branch.${branch_name}.guardexBase" "$base_branch" >/dev/null 2>&1 || true + git -C "$worktree_path" branch --unset-upstream "$branch_name" >/dev/null 2>&1 || true + + printf '[agent-branch-start] Created branch: %s\n' "$branch_name" + printf '[agent-branch-start] Worktree: %s\n' "$worktree_path" +} + +if [[ ! -x "${repo_root}/scripts/agent-branch-start.sh" ]]; then + echo "[codex-agent] Missing scripts/agent-branch-start.sh. Run: gx setup" >&2 + exit 1 +fi + +start_args=("$TASK_NAME" "$AGENT_NAME") +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then + start_args+=("$BASE_BRANCH") +fi + +initial_repo_branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" +start_output="" +start_status=0 +set +e +start_output="$( + GUARDEX_OPENSPEC_AUTO_INIT="$OPENSPEC_AUTO_INIT" \ + GUARDEX_OPENSPEC_MASTERPLAN_LABEL="$OPENSPEC_MASTERPLAN_LABEL_RAW" \ + bash "${repo_root}/scripts/agent-branch-start.sh" "${start_args[@]}" 2>&1 +)" +start_status=$? 
+set -e + +worktree_path="$(printf '%s\n' "$start_output" | sed -n 's/^\[agent-branch-start\] Worktree: //p' | tail -n1)" +current_repo_branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" +resolved_repo_root="$(cd "$repo_root" && pwd -P)" +resolved_worktree_path="" +if [[ -n "$worktree_path" && -d "$worktree_path" ]]; then + resolved_worktree_path="$(cd "$worktree_path" && pwd -P)" +fi + +fallback_reason="" +if [[ "$start_status" -ne 0 ]]; then + fallback_reason="starter exited with status ${start_status}" +elif [[ -z "$worktree_path" ]]; then + fallback_reason="starter did not report worktree path" +elif [[ -n "$resolved_worktree_path" && "$resolved_worktree_path" == "$resolved_repo_root" ]]; then + fallback_reason="starter pointed to active checkout path" +elif [[ -n "$initial_repo_branch" && -n "$current_repo_branch" && "$current_repo_branch" != "$initial_repo_branch" ]]; then + fallback_reason="starter switched active checkout branch" +fi + +if [[ -n "$fallback_reason" ]]; then + if ! restore_repo_branch_if_changed "$initial_repo_branch"; then + echo "[codex-agent] agent-branch-start changed the active checkout branch and restore failed." >&2 + echo "[codex-agent] Run 'gx setup --target ${repo_root}' and 'gx doctor --target ${repo_root}', then retry." >&2 + exit 1 + fi + if [[ -n "$start_output" ]]; then + printf '%s\n' "$start_output" >&2 + fi + echo "[codex-agent] Unsafe starter output (${fallback_reason}); creating sandbox worktree directly." >&2 + start_output="$(start_sandbox_fallback)" + printf '%s\n' "$start_output" + worktree_path="$(printf '%s\n' "$start_output" | sed -n 's/^\[agent-branch-start\] Worktree: //p' | tail -n1)" +else + printf '%s\n' "$start_output" +fi + +if [[ -z "$worktree_path" ]]; then + echo "[codex-agent] Could not determine sandbox worktree path from sandbox startup output." >&2 + echo "[codex-agent] Run 'gx setup --target ${repo_root}' and 'gx doctor --target ${repo_root}', then retry." 
>&2 + exit 1 +fi + +if [[ ! -d "$worktree_path" ]]; then + echo "[codex-agent] Reported worktree path does not exist: $worktree_path" >&2 + exit 1 +fi + +has_origin_remote() { + git -C "$repo_root" remote get-url origin >/dev/null 2>&1 +} + +origin_remote_supports_pr_finish() { + local origin_url + origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" + case "$origin_url" in + ''|/*|./*|../*|file://*) + return 1 + ;; + esac + return 0 +} + +resolve_worktree_base_branch() { + local _wt="$1" + if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -n "$BASE_BRANCH" ]]; then + printf '%s' "$BASE_BRANCH" + return 0 + fi + + local configured_base + configured_base="$(git -C "$repo_root" config --get multiagent.baseBranch || true)" + if [[ -n "$configured_base" ]]; then + printf '%s' "$configured_base" + return 0 + fi + + printf 'dev' +} + +sync_worktree_with_base() { + local wt="$1" + if ! has_origin_remote; then + return 0 + fi + + local base_branch + base_branch="$(resolve_worktree_base_branch "$wt")" + if [[ -z "$base_branch" ]]; then + return 0 + fi + + if ! git -C "$wt" fetch origin "$base_branch" --quiet; then + echo "[codex-agent] Warning: could not fetch origin/${base_branch} before task start." >&2 + return 0 + fi + + if ! git -C "$wt" show-ref --verify --quiet "refs/remotes/origin/${base_branch}"; then + return 0 + fi + + local behind_count + behind_count="$(git -C "$wt" rev-list --left-right --count "HEAD...origin/${base_branch}" 2>/dev/null | awk '{print $2}')" + behind_count="${behind_count:-0}" + if [[ "$behind_count" -le 0 ]]; then + return 0 + fi + + local branch + branch="$(git -C "$wt" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" + echo "[codex-agent] Task sync: '${branch}' is behind origin/${base_branch} by ${behind_count} commit(s). Rebasing before launch..." + if ! git -C "$wt" rebase "origin/${base_branch}"; then + echo "[codex-agent] Task sync failed. 
Resolve and continue in sandbox:" >&2 + echo " git -C \"$wt\" rebase --continue" >&2 + echo " # or abort" >&2 + echo " git -C \"$wt\" rebase --abort" >&2 + return 1 + fi + echo "[codex-agent] Task sync complete." + return 0 +} + +ensure_openspec_plan_workspace() { + local wt="$1" + local branch="$2" + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then + return 0 + fi + + hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-plan-workspace.sh" + + local openspec_script="${wt}/scripts/openspec/init-plan-workspace.sh" + if [[ ! -f "$openspec_script" ]]; then + echo "[codex-agent] Missing OpenSpec init script in sandbox: ${openspec_script}" >&2 + echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2 + return 1 + fi + if [[ ! -x "$openspec_script" ]]; then + chmod +x "$openspec_script" 2>/dev/null || true + fi + + local plan_slug + plan_slug="$(resolve_openspec_plan_slug "$branch")" + local init_output="" + if ! init_output="$( + cd "$wt" + bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1 + )"; then + printf '%s\n' "$init_output" >&2 + echo "[codex-agent] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2 + return 1 + fi + if [[ -n "$init_output" ]]; then + printf '%s\n' "$init_output" + fi + echo "[codex-agent] OpenSpec plan workspace: ${wt}/openspec/plan/${plan_slug}" +} + +ensure_openspec_change_workspace() { + local wt="$1" + local branch="$2" + + if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then + return 0 + fi + + hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-change-workspace.sh" + + local openspec_script="${wt}/scripts/openspec/init-change-workspace.sh" + if [[ ! -f "$openspec_script" ]]; then + echo "[codex-agent] Missing OpenSpec change init script in sandbox: ${openspec_script}" >&2 + echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2 + return 1 + fi + if [[ ! 
-x "$openspec_script" ]]; then + chmod +x "$openspec_script" 2>/dev/null || true + fi + + local change_slug capability_slug init_output="" + change_slug="$(resolve_openspec_change_slug "$branch")" + capability_slug="$(resolve_openspec_capability_slug)" + if ! init_output="$( + cd "$wt" + bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1 + )"; then + printf '%s\n' "$init_output" >&2 + echo "[codex-agent] OpenSpec workspace initialization failed for change '${change_slug}'." >&2 + return 1 + fi + if [[ -n "$init_output" ]]; then + printf '%s\n' "$init_output" + fi + echo "[codex-agent] OpenSpec change workspace: ${wt}/openspec/changes/${change_slug}" +} + +worktree_has_changes() { + local wt="$1" + if ! git -C "$wt" diff --quiet -- . ":(exclude).omx/state/agent-file-locks.json"; then + return 0 + fi + if ! git -C "$wt" diff --cached --quiet -- . ":(exclude).omx/state/agent-file-locks.json"; then + return 0 + fi + if [[ -n "$(git -C "$wt" ls-files --others --exclude-standard)" ]]; then + return 0 + fi + return 1 +} + +claim_changed_files() { + local wt="$1" + local branch="$2" + local lock_script="${repo_root}/scripts/agent-file-locks.py" + + if [[ ! -x "$lock_script" ]]; then + return 0 + fi + + local changed_raw deleted_raw + changed_raw="$({ + git -C "$wt" diff --name-only -- . ":(exclude).omx/state/agent-file-locks.json"; + git -C "$wt" diff --cached --name-only -- . ":(exclude).omx/state/agent-file-locks.json"; + git -C "$wt" ls-files --others --exclude-standard; + } | sed '/^$/d' | sort -u)" + + if [[ -n "$changed_raw" ]]; then + mapfile -t changed_files < <(printf '%s\n' "$changed_raw") + python3 "$lock_script" claim --branch "$branch" "${changed_files[@]}" >/dev/null 2>&1 || true + fi + + deleted_raw="$({ + git -C "$wt" diff --name-only --diff-filter=D -- . ":(exclude).omx/state/agent-file-locks.json"; + git -C "$wt" diff --cached --name-only --diff-filter=D -- . 
":(exclude).omx/state/agent-file-locks.json"; + } | sed '/^$/d' | sort -u)" + + if [[ -n "$deleted_raw" ]]; then + mapfile -t deleted_files < <(printf '%s\n' "$deleted_raw") + python3 "$lock_script" allow-delete --branch "$branch" "${deleted_files[@]}" >/dev/null 2>&1 || true + fi +} + +auto_commit_worktree_changes() { + local wt="$1" + local branch="$2" + + if ! worktree_has_changes "$wt"; then + return 0 + fi + + claim_changed_files "$wt" "$branch" + git -C "$wt" add -A + + if git -C "$wt" diff --cached --quiet -- . ":(exclude).omx/state/agent-file-locks.json"; then + return 0 + fi + + local default_message="Auto-finish: ${TASK_NAME}" + local commit_message="${GUARDEX_CODEX_AUTO_COMMIT_MESSAGE:-$default_message}" + local commit_output="" + + if commit_output="$(git -C "$wt" commit -m "$commit_message" 2>&1)"; then + echo "[codex-agent] Auto-committed sandbox changes on '${branch}'." + return 0 + fi + + if auto_sync_for_commit_retry "$wt" "$branch"; then + claim_changed_files "$wt" "$branch" + git -C "$wt" add -A + if commit_output="$(git -C "$wt" commit -m "$commit_message" 2>&1)"; then + echo "[codex-agent] Auto-committed sandbox changes on '${branch}' after sync retry." + return 0 + fi + fi + + echo "[codex-agent] Auto-commit failed in sandbox. Keeping branch for manual review: $branch" >&2 + if [[ -n "$commit_output" ]]; then + printf '%s\n' "$commit_output" >&2 + fi + return 1 +} + +auto_sync_for_commit_retry() { + local wt="$1" + local branch="$2" + + if ! has_origin_remote; then + return 1 + fi + + local base_branch + base_branch="$(resolve_worktree_base_branch "$wt")" + if [[ -z "$base_branch" ]]; then + return 1 + fi + + if ! git -C "$wt" fetch origin "$base_branch" --quiet; then + return 1 + fi + + if ! 
git -C "$wt" show-ref --verify --quiet "refs/remotes/origin/${base_branch}"; then + return 1 + fi + + local behind_count + behind_count="$(git -C "$wt" rev-list --left-right --count "HEAD...origin/${base_branch}" 2>/dev/null | awk '{print $2}')" + behind_count="${behind_count:-0}" + if [[ "$behind_count" -le 0 ]]; then + return 1 + fi + + echo "[codex-agent] Auto-commit retry: '${branch}' is behind origin/${base_branch} by ${behind_count} commit(s). Syncing and retrying..." + + local stash_ref="" + local stash_output="" + if worktree_has_changes "$wt"; then + if ! stash_output="$(git -C "$wt" stash push --include-untracked -m "codex-agent-autocommit-sync-${branch}-$(date +%s)" 2>&1)"; then + return 1 + fi + stash_ref="$(printf '%s\n' "$stash_output" | grep -o 'stash@{[0-9]\+}' | head -n 1 || true)" + fi + + if ! git -C "$wt" rebase "origin/${base_branch}" >/dev/null 2>&1; then + git -C "$wt" rebase --abort >/dev/null 2>&1 || true + if [[ -n "$stash_ref" ]]; then + git -C "$wt" stash pop "$stash_ref" >/dev/null 2>&1 || true + fi + return 1 + fi + + if [[ -n "$stash_ref" ]]; then + if ! git -C "$wt" stash pop "$stash_ref" >/dev/null 2>&1; then + echo "[codex-agent] Auto-commit retry could not re-apply local changes after sync. 
Manual resolution required in: $wt" >&2 + return 1 + fi + fi + + return 0 +} + +looks_like_conflict_failure() { + local output="$1" + if grep -qiE 'preflight conflict detected|merge conflict detected|auto-sync failed while rebasing|rebase --continue|rebase --abort' <<< "$output"; then + return 0 + fi + return 1 +} + +run_finish_flow() { + local wt="$1" + local branch="$2" + local finish_base_branch="" + local finish_output="" + local -a finish_args + + finish_args=(--branch "$branch") + if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 && -n "$BASE_BRANCH" ]]; then + finish_base_branch="$BASE_BRANCH" + else + finish_base_branch="$(resolve_worktree_base_branch "$wt")" + fi + if [[ -n "$finish_base_branch" ]]; then + finish_args+=(--base "$finish_base_branch") + fi + if [[ "$AUTO_CLEANUP" -eq 1 ]]; then + finish_args+=(--cleanup) + fi + if [[ "$AUTO_WAIT_FOR_MERGE" -eq 1 ]]; then + finish_args+=(--wait-for-merge) + fi + + if has_origin_remote; then + if ! command -v "${GUARDEX_GH_BIN:-gh}" >/dev/null 2>&1 && ! command -v gh >/dev/null 2>&1; then + echo "[codex-agent] Auto-finish requires GitHub CLI for PR flow; command not found: ${GUARDEX_GH_BIN:-gh}" >&2 + return 2 + fi + if origin_remote_supports_pr_finish; then + finish_args+=(--via-pr) + else + echo "[codex-agent] Origin remote does not provide a mergeable PR surface; skipping auto-finish merge/PR pipeline." >&2 + return 2 + fi + else + echo "[codex-agent] No origin remote detected; skipping auto-finish merge/PR pipeline." >&2 + return 2 + fi + + if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then + printf '%s\n' "$finish_output" + return 0 + fi + + printf '%s\n' "$finish_output" >&2 + + if [[ "$AUTO_REVIEW_ON_CONFLICT" -eq 1 ]] && looks_like_conflict_failure "$finish_output"; then + echo "[codex-agent] Auto-finish hit conflicts. Launching Codex conflict-review pass in sandbox..." 
>&2 + local review_prompt + review_prompt="Resolve git conflicts for branch ${branch} against ${finish_base_branch:-dev}, then commit the resolution in this sandbox worktree and exit." + + ( + cd "$wt" + set +e + "$CODEX_BIN" "$review_prompt" + review_exit="$?" + set -e + if [[ "$review_exit" -ne 0 ]]; then + echo "[codex-agent] Conflict-review Codex pass exited with status ${review_exit}." >&2 + fi + ) + + if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then + printf '%s\n' "$finish_output" + return 0 + fi + + printf '%s\n' "$finish_output" >&2 + fi + + return 1 +} + +if ! sync_worktree_with_base "$worktree_path"; then + exit 1 +fi + +worktree_branch="$(git -C "$worktree_path" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" +if [[ -z "$worktree_branch" || "$worktree_branch" == "HEAD" ]]; then + echo "[codex-agent] Could not determine sandbox branch for worktree: $worktree_path" >&2 + exit 1 +fi + +if ! ensure_openspec_change_workspace "$worktree_path" "$worktree_branch"; then + exit 1 +fi + +if ! ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then + exit 1 +fi + +echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" +cd "$worktree_path" +set +e +"$CODEX_BIN" "$@" +codex_exit="$?" +set -e + +cd "$repo_root" +final_exit="$codex_exit" +auto_finish_completed=0 + +if [[ "$AUTO_FINISH" -eq 1 && -n "$worktree_branch" && "$worktree_branch" != "HEAD" ]]; then + if [[ "$AUTO_WAIT_FOR_MERGE" -eq 1 && "$AUTO_CLEANUP" -eq 1 ]]; then + echo "[codex-agent] Auto-finish enabled: commit -> push/PR -> wait for merge -> cleanup." + elif [[ "$AUTO_WAIT_FOR_MERGE" -eq 1 ]]; then + echo "[codex-agent] Auto-finish enabled: commit -> push/PR -> wait for merge (keep branch/worktree)." + elif [[ "$AUTO_CLEANUP" -eq 1 ]]; then + echo "[codex-agent] Auto-finish enabled: commit -> push/PR -> merge -> cleanup." 
+ else + echo "[codex-agent] Auto-finish enabled: commit -> push/PR -> merge (keep branch/worktree)." + fi + if ! auto_finish_context_is_ready "$worktree_path"; then + echo "[codex-agent] Auto-finish skipped for '${worktree_branch}' (no mergeable remote context)." >&2 + elif auto_commit_worktree_changes "$worktree_path" "$worktree_branch"; then + if run_finish_flow "$worktree_path" "$worktree_branch"; then + auto_finish_completed=1 + echo "[codex-agent] Auto-finish completed for '${worktree_branch}'." + else + finish_status="$?" + if [[ "$finish_status" -eq 2 ]]; then + echo "[codex-agent] Auto-finish skipped for '${worktree_branch}' (no mergeable remote context)." >&2 + else + echo "[codex-agent] Auto-finish did not complete; keeping sandbox for manual review: $worktree_path" >&2 + if [[ "$final_exit" -eq 0 ]]; then + final_exit=1 + fi + fi + fi + else + if [[ "$final_exit" -eq 0 ]]; then + final_exit=1 + fi + fi +fi + +if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then + echo "[codex-agent] Session ended (exit=${codex_exit}). Running worktree cleanup..." + prune_args=() + if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then + prune_args+=(--base "$BASE_BRANCH") + fi + if [[ "$AUTO_CLEANUP" -eq 1 && "$auto_finish_completed" -eq 1 ]]; then + prune_args+=(--only-dirty-worktrees --delete-branches --delete-remote-branches) + fi + if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" "${prune_args[@]}"; then + echo "[codex-agent] Warning: automatic worktree cleanup failed." >&2 + fi +fi + +if [[ ! -d "$worktree_path" ]]; then + echo "[codex-agent] Auto-cleaned sandbox worktree: $worktree_path" +else + worktree_branch="$(git -C "$worktree_path" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" + echo "[codex-agent] Sandbox worktree kept: $worktree_path" + if [[ -n "$worktree_branch" && "$worktree_branch" != "HEAD" ]]; then + if [[ "$auto_finish_completed" -eq 1 ]]; then + echo "[codex-agent] Branch kept intentionally. 
Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" + else + echo "[codex-agent] If finished, merge with: bash scripts/agent-branch-finish.sh --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge" + echo "[codex-agent] Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" + fi + fi +fi + +exit "$final_exit" diff --git a/frontend/scripts/guardex-docker-loader.sh b/frontend/scripts/guardex-docker-loader.sh new file mode 100755 index 0000000..f44568a --- /dev/null +++ b/frontend/scripts/guardex-docker-loader.sh @@ -0,0 +1,123 @@ +#!/usr/bin/env bash +set -euo pipefail + +repo_root="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)" +compose_file="${GUARDEX_DOCKER_COMPOSE_FILE:-}" +service="${GUARDEX_DOCKER_SERVICE:-}" +mode="${GUARDEX_DOCKER_MODE:-auto}" +workdir_override="${GUARDEX_DOCKER_WORKDIR:-}" + +usage() { + cat >&2 <<'EOF' +Usage: bash scripts/guardex-docker-loader.sh [--] + +Environment: + GUARDEX_DOCKER_SERVICE= required unless compose defines exactly one service + GUARDEX_DOCKER_COMPOSE_FILE= optional docker compose file override + GUARDEX_DOCKER_MODE=auto|exec|run default: auto + GUARDEX_DOCKER_WORKDIR= optional working directory override inside the container +EOF +} + +choose_compose_cmd() { + if command -v docker >/dev/null 2>&1 && docker compose version >/dev/null 2>&1; then + printf 'docker compose' + return 0 + fi + if command -v docker-compose >/dev/null 2>&1; then + printf 'docker-compose' + return 0 + fi + return 1 +} + +mapfile_from_lines() { + local raw="$1" + local -n out_ref="$2" + out_ref=() + while IFS= read -r line; do + [[ -n "$line" ]] || continue + out_ref+=("$line") + done <<<"$raw" +} + +if [[ "${1:-}" == "--" ]]; then + shift +fi + +if [[ $# -eq 0 ]]; then + usage + exit 1 +fi + +if [[ "$mode" != "auto" && "$mode" != "exec" && "$mode" != "run" ]]; then + echo "[guardex-docker-loader] Invalid GUARDEX_DOCKER_MODE: $mode" >&2 + usage + exit 1 +fi + +compose_cmd_raw="$(choose_compose_cmd)" || { + echo 
"[guardex-docker-loader] Docker Compose is not available. Install docker compose or docker-compose first." >&2 + exit 1 +} +IFS=' ' read -r -a compose_cmd <<<"$compose_cmd_raw" +compose_args=() +if [[ -n "$compose_file" ]]; then + compose_args=(-f "$compose_file") +fi + +cd "$repo_root" + +services_raw="$("${compose_cmd[@]}" "${compose_args[@]}" config --services 2>/dev/null || true)" +declare -a services +mapfile_from_lines "$services_raw" services +if [[ ${#services[@]} -eq 0 ]]; then + echo "[guardex-docker-loader] No Docker Compose services found. Add a compose file or set GUARDEX_DOCKER_COMPOSE_FILE." >&2 + exit 1 +fi + +if [[ -z "$service" ]]; then + if [[ ${#services[@]} -eq 1 ]]; then + service="${services[0]}" + else + echo "[guardex-docker-loader] Multiple services found (${services[*]}). Set GUARDEX_DOCKER_SERVICE." >&2 + exit 1 + fi +fi + +service_known=0 +for candidate in "${services[@]}"; do + if [[ "$candidate" == "$service" ]]; then + service_known=1 + break + fi +done +if [[ $service_known -ne 1 ]]; then + echo "[guardex-docker-loader] Compose service not found: $service" >&2 + exit 1 +fi + +run_mode="$mode" +if [[ "$run_mode" == "auto" ]]; then + run_mode="run" + running_raw="$("${compose_cmd[@]}" "${compose_args[@]}" ps --status running --services 2>/dev/null || true)" + declare -a running_services + mapfile_from_lines "$running_raw" running_services + for candidate in "${running_services[@]}"; do + if [[ "$candidate" == "$service" ]]; then + run_mode="exec" + break + fi + done +fi + +workdir_args=() +if [[ -n "$workdir_override" ]]; then + workdir_args=(-w "$workdir_override") +fi + +if [[ "$run_mode" == "exec" ]]; then + exec "${compose_cmd[@]}" "${compose_args[@]}" exec -T "${workdir_args[@]}" "$service" "$@" +fi + +exec "${compose_cmd[@]}" "${compose_args[@]}" run --rm -T "${workdir_args[@]}" "$service" "$@" diff --git a/frontend/scripts/guardex-env.sh b/frontend/scripts/guardex-env.sh new file mode 100644 index 0000000..f406ea8 --- /dev/null 
+++ b/frontend/scripts/guardex-env.sh @@ -0,0 +1,74 @@ +#!/usr/bin/env bash + +guardex_normalize_bool() { + local raw="${1:-}" + local fallback="${2:-}" + local lowered + lowered="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" + case "$lowered" in + 1|true|yes|on) printf '1' ;; + 0|false|no|off) printf '0' ;; + '') printf '%s' "$fallback" ;; + *) printf '%s' "$fallback" ;; + esac +} + +guardex_read_repo_dotenv_var() { + local repo_root="$1" + local key="${2:-GUARDEX_ON}" + local env_file="${repo_root}/.env" + local line value + + [[ -f "$env_file" ]] || return 1 + + while IFS= read -r line || [[ -n "$line" ]]; do + [[ "$line" =~ ^[[:space:]]*# ]] && continue + if [[ "$line" =~ ^[[:space:]]*(export[[:space:]]+)?${key}[[:space:]]*=(.*)$ ]]; then + value="${BASH_REMATCH[2]}" + value="$(printf '%s' "$value" | sed -E 's/[[:space:]]+#.*$//; s/^[[:space:]]+//; s/[[:space:]]+$//')" + if [[ "$value" == \"*\" && "$value" == *\" ]]; then + value="${value:1:${#value}-2}" + elif [[ "$value" == \'*\' && "$value" == *\' ]]; then + value="${value:1:${#value}-2}" + fi + printf '%s' "$value" + return 0 + fi + done < "$env_file" + + return 1 +} + +guardex_repo_toggle_raw() { + local repo_root="$1" + if [[ -n "${GUARDEX_ON:-}" ]]; then + printf '%s' "$GUARDEX_ON" + return 0 + fi + guardex_read_repo_dotenv_var "$repo_root" "GUARDEX_ON" +} + +guardex_repo_toggle_source() { + local repo_root="$1" + if [[ -n "${GUARDEX_ON:-}" ]]; then + printf 'process environment' + return 0 + fi + if guardex_read_repo_dotenv_var "$repo_root" "GUARDEX_ON" >/dev/null; then + printf 'repo .env' + return 0 + fi + return 1 +} + +guardex_repo_is_enabled() { + local repo_root="$1" + local raw normalized + if raw="$(guardex_repo_toggle_raw "$repo_root")"; then + normalized="$(guardex_normalize_bool "$raw" "")" + if [[ "$normalized" == "0" ]]; then + return 1 + fi + fi + return 0 +} diff --git a/frontend/scripts/install-agent-git-hooks.sh b/frontend/scripts/install-agent-git-hooks.sh new file mode 100755 
index 0000000..d3257af --- /dev/null +++ b/frontend/scripts/install-agent-git-hooks.sh @@ -0,0 +1,21 @@ +#!/usr/bin/env bash +set -euo pipefail + +repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" +if [[ -z "$repo_root" ]]; then + echo "[install-agent-git-hooks] Not inside a git repository." >&2 + exit 1 +fi + +hooks_dir="$repo_root/.githooks" +if [[ ! -d "$hooks_dir" ]]; then + echo "[install-agent-git-hooks] Missing hooks directory: $hooks_dir" >&2 + exit 1 +fi + +chmod +x "$hooks_dir"/* 2>/dev/null || true + +git -C "$repo_root" config core.hooksPath .githooks + +echo "[install-agent-git-hooks] Installed repo hooks path: .githooks" +echo "[install-agent-git-hooks] Branch protection hook is now active for this repo clone." diff --git a/frontend/scripts/openspec/init-change-workspace.sh b/frontend/scripts/openspec/init-change-workspace.sh new file mode 100755 index 0000000..a71940a --- /dev/null +++ b/frontend/scripts/openspec/init-change-workspace.sh @@ -0,0 +1,93 @@ +#!/usr/bin/env bash +set -euo pipefail + +if [[ $# -lt 1 || $# -gt 2 ]]; then + echo "Usage: $0 [capability-slug]" + echo "Example: $0 add-dashboard-live-usage runtime-migration" + exit 1 +fi + +CHANGE_SLUG="$1" +CAPABILITY_SLUG="${2:-$CHANGE_SLUG}" + +if [[ "$CHANGE_SLUG" =~ [^a-z0-9-] ]]; then + echo "Error: change slug must be kebab-case (lowercase letters, numbers, hyphens)." + exit 1 +fi + +if [[ "$CAPABILITY_SLUG" =~ [^a-z0-9-] ]]; then + echo "Error: capability slug must be kebab-case (lowercase letters, numbers, hyphens)." + exit 1 +fi + +CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" +SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" +TODAY="$(date -u +%Y-%m-%d)" + +mkdir -p "$SPEC_DIR" + +if [[ ! -f "${CHANGE_DIR}/.openspec.yaml" ]]; then + cat > "${CHANGE_DIR}/.openspec.yaml" < "${CHANGE_DIR}/proposal.md" < "${CHANGE_DIR}/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`). +- [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. 
+- [ ] 4.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or capture a \`BLOCKED:\` handoff if merge/cleanup is pending. +TASKSEOF +fi + +if [[ ! -f "${SPEC_DIR}/spec.md" ]]; then + cat > "${SPEC_DIR}/spec.md" < [role ...]" + echo "Example: $0 stabilize-dashboard planner architect critic executor writer verifier" + exit 1 +fi + +PLAN_SLUG="$1" +shift || true + +if [[ "$PLAN_SLUG" =~ [^a-z0-9-] ]]; then + echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." >&2 + exit 1 +fi + +if [[ $# -gt 0 ]]; then + ROLES=("$@") +else + ROLES=(planner architect critic executor writer verifier) +fi + +PLAN_DIR="openspec/plan/${PLAN_SLUG}" +mkdir -p "$PLAN_DIR" + +write_if_missing() { + local file="$1" + shift + if [[ ! -f "$file" ]]; then + mkdir -p "$(dirname "$file")" + cat > "$file" < +\`\`\` +" + +write_if_missing "$PLAN_DIR/planner/plan.md" "# ExecPlan: ${PLAN_SLUG} + +This document is a living plan. Keep progress and decisions current. + +## Purpose / Big Picture + +## Progress + +- [ ] Initial draft +- [ ] Review + iterate +- [ ] Approved for execution + +## Surprises & Discoveries + +## Decision Log + +## Outcomes & Retrospective + +## Validation and Acceptance +" + +for role in "${ROLES[@]}"; do + ROLE_DIR="$PLAN_DIR/$role" + mkdir -p "$ROLE_DIR" + + write_if_missing "$ROLE_DIR/README.md" "# ${role} + +Role workspace for \`${role}\`. +" + + write_if_missing "$ROLE_DIR/prompt.md" "# ${role} prompt + +You are the \`${role}\` lane for shared plan \`${PLAN_SLUG}\`. + +## Scope + +- Work inside \`openspec/plan/${PLAN_SLUG}/${role}/\` plus directly-related shared plan files you explicitly claim. +- Reuse the owner's branch/worktree instead of creating a separate sandbox unless the owner says otherwise. 
+ +## Ownership + +- Before editing, claim this role's files in the shared owner lane: + \`python3 scripts/agent-file-locks.py claim --branch openspec/plan/${PLAN_SLUG}/${role}/README.md openspec/plan/${PLAN_SLUG}/${role}/prompt.md openspec/plan/${PLAN_SLUG}/${role}/tasks.md openspec/plan/${PLAN_SLUG}/checkpoints.md\` +- Record branch, worktree, and scope in \`tasks.md\`. +- Do not change another role's files without reassignment. + +## Deliverables + +- Complete the role checklist in \`tasks.md\`. +- Leave a handoff with files changed, verification, and risks. +- The owner alone runs the change completion flow and sandbox cleanup after change tasks 4.1-4.3 are done. +" + + write_if_missing "$ROLE_DIR/tasks.md" "# ${role} tasks + +## Ownership + +- [ ] Claim this role's files in the shared owner branch/worktree before editing. +- [ ] Record branch, worktree, and scope for this role. +- [ ] Copy or hand off \`prompt.md\` when another agent joins this role. + +## 1. Spec + +- [ ] Define requirements and scope for ${role} +- [ ] Confirm acceptance criteria are explicit and testable + +## 2. Tests + +- [ ] Define verification approach and evidence requirements +- [ ] List concrete commands for verification + +## 3. Implementation + +- [ ] Execute role-specific deliverables +- [ ] Capture decisions, risks, and handoff notes + +## 4. Checkpoints + +- [ ] Publish checkpoint update for this role + +## 5. Collaboration + +- [ ] Leave a role handoff with files changed, verification, and risks. +- [ ] Owner records \`accept\`, \`revise\`, or \`reject\` for joined output, or marks \`N/A\` if no helper joined. + +## 6. Completion + +- [ ] Keep sandbox cleanup blocked until change tasks 4.1-4.3 are complete. 
+" +done + +echo "[gitguardex] OpenSpec plan workspace ready: ${PLAN_DIR}" +echo "[gitguardex] Roles: ${ROLES[*]}" diff --git a/frontend/scripts/review-bot-watch.sh b/frontend/scripts/review-bot-watch.sh new file mode 100755 index 0000000..f98d0ef --- /dev/null +++ b/frontend/scripts/review-bot-watch.sh @@ -0,0 +1,330 @@ +#!/usr/bin/env bash +set -euo pipefail + +INTERVAL_SECONDS="${GUARDEX_REVIEW_BOT_INTERVAL_SECONDS:-30}" +AGENT_NAME="${GUARDEX_REVIEW_BOT_AGENT_NAME:-guardex-review-bot}" +TASK_PREFIX="${GUARDEX_REVIEW_BOT_TASK_PREFIX:-review-merge}" +STATE_FILE="${GUARDEX_REVIEW_BOT_STATE_FILE:-}" +BASE_BRANCH="${GUARDEX_REVIEW_BOT_BASE_BRANCH:-}" +ONLY_PR="${GUARDEX_REVIEW_BOT_ONLY_PR:-}" +RETRY_FAILED_RAW="${GUARDEX_REVIEW_BOT_RETRY_FAILED:-false}" +INCLUDE_DRAFT_RAW="${GUARDEX_REVIEW_BOT_INCLUDE_DRAFT:-false}" + +usage() { + cat <<'USAGE' +Usage: bash scripts/review-bot-watch.sh [options] + +Continuously monitor GitHub pull requests targeting a base branch and dispatch +one Codex-agent task per newly opened/updated PR. 
+ +Options: + --base Base branch to watch (default: current branch) + --interval Poll interval (default: 30) + --agent Agent name for codex-agent (default: guardex-review-bot) + --task-prefix Task prefix for codex-agent branches (default: review-merge) + --state-file State file path (default: .omx/state/review-bot-watch-.tsv) + --only-pr Watch only one PR number + --include-draft Include draft PRs + --retry-failed Retry PRs that previously failed even when SHA is unchanged + --once Run one poll cycle and exit + -h, --help Show this help + +Environment overrides: + GUARDEX_REVIEW_BOT_PROMPT_APPEND Additional instructions appended to each Codex prompt +USAGE +} + +normalize_bool() { + local raw="${1:-}" + local fallback="${2:-0}" + case "$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]')" in + 1|true|yes|on) printf '1' ;; + 0|false|no|off) printf '0' ;; + '') printf '%s' "$fallback" ;; + *) printf '%s' "$fallback" ;; + esac +} + +ONCE=0 + +while [[ $# -gt 0 ]]; do + case "$1" in + --base) + BASE_BRANCH="${2:-}" + shift 2 + ;; + --interval) + INTERVAL_SECONDS="${2:-}" + shift 2 + ;; + --agent) + AGENT_NAME="${2:-}" + shift 2 + ;; + --task-prefix) + TASK_PREFIX="${2:-}" + shift 2 + ;; + --state-file) + STATE_FILE="${2:-}" + shift 2 + ;; + --only-pr) + ONLY_PR="${2:-}" + shift 2 + ;; + --retry-failed) + RETRY_FAILED_RAW="true" + shift + ;; + --include-draft) + INCLUDE_DRAFT_RAW="true" + shift + ;; + --once) + ONCE=1 + shift + ;; + -h|--help) + usage + exit 0 + ;; + *) + echo "[review-bot-watch] Unknown option: $1" >&2 + usage >&2 + exit 1 + ;; + esac +done + +RETRY_FAILED="$(normalize_bool "$RETRY_FAILED_RAW" "0")" +INCLUDE_DRAFT="$(normalize_bool "$INCLUDE_DRAFT_RAW" "0")" + +if [[ ! "$INTERVAL_SECONDS" =~ ^[0-9]+$ ]] || [[ "$INTERVAL_SECONDS" -lt 5 ]]; then + echo "[review-bot-watch] --interval must be an integer >= 5 seconds." >&2 + exit 1 +fi + +if [[ -n "$ONLY_PR" ]] && [[ ! 
"$ONLY_PR" =~ ^[0-9]+$ ]]; then + echo "[review-bot-watch] --only-pr must be a numeric PR id." >&2 + exit 1 +fi + +if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then + echo "[review-bot-watch] Not inside a git repository." >&2 + exit 1 +fi +repo_root="$(git rev-parse --show-toplevel)" + +if [[ -z "$BASE_BRANCH" ]]; then + BASE_BRANCH="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" +fi +if [[ -z "$BASE_BRANCH" || "$BASE_BRANCH" == "HEAD" ]]; then + BASE_BRANCH="main" +fi + +if ! command -v gh >/dev/null 2>&1; then + echo "[review-bot-watch] Missing GitHub CLI (gh)." >&2 + echo "[review-bot-watch] Install gh and run: gh auth login" >&2 + exit 127 +fi + +if ! command -v codex >/dev/null 2>&1; then + echo "[review-bot-watch] Missing Codex CLI command: codex" >&2 + exit 127 +fi + +if [[ ! -x "$repo_root/scripts/codex-agent.sh" ]]; then + echo "[review-bot-watch] Missing scripts/codex-agent.sh. Run: gx setup" >&2 + exit 1 +fi + +if ! gh auth status >/dev/null 2>&1; then + echo "[review-bot-watch] gh is not authenticated. Run: gh auth login" >&2 + exit 1 +fi + +sanitize_slug() { + local raw="$1" + local fallback="$2" + local slug + slug="$(printf '%s' "$raw" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//; s/-{2,}/-/g')" + if [[ -z "$slug" ]]; then + slug="$fallback" + fi + printf '%s' "$slug" +} + +base_slug="$(sanitize_slug "$BASE_BRANCH" "base")" +if [[ -z "$STATE_FILE" ]]; then + STATE_FILE="$repo_root/.omx/state/review-bot-watch-${base_slug}.tsv" +fi +mkdir -p "$(dirname "$STATE_FILE")" + +declare -A LAST_SHA + +declare -A LAST_STATUS + +load_state() { + if [[ ! 
-f "$STATE_FILE" ]]; then + return 0 + fi + while IFS=$'\t' read -r pr sha status updated_at; do + if [[ -z "${pr:-}" ]] || [[ "${pr:0:1}" == "#" ]]; then + continue + fi + LAST_SHA["$pr"]="$sha" + LAST_STATUS["$pr"]="$status" + done < "$STATE_FILE" +} + +save_state() { + { + printf '# pr\thead_sha\tstatus\tupdated_at\n' + for pr in "${!LAST_SHA[@]}"; do + printf '%s\t%s\t%s\t%s\n' "${pr}" "${LAST_SHA[$pr]}" "${LAST_STATUS[$pr]:-unknown}" "$(date -u +%Y-%m-%dT%H:%M:%SZ)" + done | sort -n + } > "$STATE_FILE" +} + +build_prompt() { + local pr="$1" + local head_branch="$2" + local head_sha="$3" + local pr_title="$4" + local pr_url="$5" + + cat <&2 + fi + + save_state +} + +load_state + +echo "[review-bot-watch] Starting monitor" +echo "[review-bot-watch] Base branch : ${BASE_BRANCH}" +echo "[review-bot-watch] Interval : ${INTERVAL_SECONDS}s" +echo "[review-bot-watch] State file : ${STATE_FILE}" +if [[ -n "$ONLY_PR" ]]; then + echo "[review-bot-watch] Only PR : #${ONLY_PR}" +fi + +trap 'echo "[review-bot-watch] Stopped."; exit 0' INT TERM + +while true; do + found=0 + while IFS=$'\t' read -r pr head_branch sha is_draft title url; do + if [[ -z "${pr:-}" ]]; then + continue + fi + + found=1 + + if [[ -n "$ONLY_PR" && "$pr" != "$ONLY_PR" ]]; then + continue + fi + + if [[ "$INCLUDE_DRAFT" != "1" && "$is_draft" == "true" ]]; then + continue + fi + + if ! should_process_pr "$pr" "$sha"; then + continue + fi + + process_one_pr "$pr" "$head_branch" "$sha" "$title" "$url" + done < <(list_open_prs || true) + + if [[ "$found" -eq 0 ]]; then + echo "[review-bot-watch] No open PRs for base '${BASE_BRANCH}'." 
+ fi + + if [[ "$ONCE" -eq 1 ]]; then + break + fi + + sleep "$INTERVAL_SECONDS" +done From af45a92a7bfe9d0570f4bdcafbfce2493ee2972b Mon Sep 17 00:00:00 2001 From: NagyVikt Date: Tue, 21 Apr 2026 17:07:31 +0200 Subject: [PATCH 11/48] new --- AGENTS.md | 2 +- frontend/scripts/openspec/init-change-workspace.sh | 2 +- frontend/scripts/openspec/init-plan-workspace.sh | 6 ++++-- scripts/openspec/init-change-workspace.sh | 2 +- scripts/openspec/init-plan-workspace.sh | 6 ++++-- templates/scripts/openspec/init-change-workspace.sh | 2 +- templates/scripts/openspec/init-plan-workspace.sh | 6 ++++-- test/install.test.js | 5 +++-- 8 files changed, 19 insertions(+), 12 deletions(-) diff --git a/AGENTS.md b/AGENTS.md index d386fed..d177a49 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -238,7 +238,7 @@ Use `openspec/plan/README.md` as the operational runbook and `openspec/plan/PLAN Default quick flow: 1. Create/maintain `openspec/plan//`. 2. Keep role `tasks.md` files current (`planner`, `architect`, `critic`, `executor`, `writer`, `verifier`). -3. Keep checklist headings visible: `## 1. Spec`, `## 2. Tests`, `## 3. Implementation`, `## 4. Checkpoints`. +3. Keep checklist headings visible: `## 1. Spec`, `## 2. Tests`, `## 3. Implementation`, `## 4. Checkpoints`, plus a final cleanup section (`## 5. Cleanup` or `## 6. Cleanup`). 4. Update checkboxes continuously while work progresses. 5. Execute from approved `planner/plan.md` with role ownership. 6. Verify with evidence before archive/finish. diff --git a/frontend/scripts/openspec/init-change-workspace.sh b/frontend/scripts/openspec/init-change-workspace.sh index a71940a..e417495 100755 --- a/frontend/scripts/openspec/init-change-workspace.sh +++ b/frontend/scripts/openspec/init-change-workspace.sh @@ -67,7 +67,7 @@ if [[ ! -f "${CHANGE_DIR}/tasks.md" ]]; then - [ ] 3.2 Run \`openspec validate ${CHANGE_SLUG} --type change --strict\`. - [ ] 3.3 Run \`openspec validate --specs\`. -## 4. Completion +## 4. 
Completion +## 4. Cleanup - [ ] 4.1 Finish the agent branch via PR merge + cleanup (\`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`). - [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. diff --git a/frontend/scripts/openspec/init-plan-workspace.sh b/frontend/scripts/openspec/init-plan-workspace.sh index f05f28c..c6dd4da 100755 --- a/frontend/scripts/openspec/init-plan-workspace.sh +++ b/frontend/scripts/openspec/init-plan-workspace.sh @@ -150,9 +150,11 @@ You are the \`${role}\` lane for shared plan \`${PLAN_SLUG}\`. - [ ] Leave a role handoff with files changed, verification, and risks. - [ ] Owner records \`accept\`, \`revise\`, or \`reject\` for joined output, or marks \`N/A\` if no helper joined. -## 6. Completion +## 6. Cleanup -- [ ] Keep sandbox cleanup blocked until change tasks 4.1-4.3 are complete. +- [ ] If this role owns finish, run \`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`. +- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. +- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. " done diff --git a/scripts/openspec/init-change-workspace.sh b/scripts/openspec/init-change-workspace.sh index a71940a..e417495 100755 --- a/scripts/openspec/init-change-workspace.sh +++ b/scripts/openspec/init-change-workspace.sh @@ -67,7 +67,7 @@ if [[ ! -f "${CHANGE_DIR}/tasks.md" ]]; then - [ ] 3.2 Run \`openspec validate ${CHANGE_SLUG} --type change --strict\`. - [ ] 3.3 Run \`openspec validate --specs\`. -## 4. Completion +## 4. Cleanup - [ ] 4.1 Finish the agent branch via PR merge + cleanup (\`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`). 
- [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. diff --git a/scripts/openspec/init-plan-workspace.sh b/scripts/openspec/init-plan-workspace.sh index f05f28c..c6dd4da 100755 --- a/scripts/openspec/init-plan-workspace.sh +++ b/scripts/openspec/init-plan-workspace.sh @@ -150,9 +150,11 @@ You are the \`${role}\` lane for shared plan \`${PLAN_SLUG}\`. - [ ] Leave a role handoff with files changed, verification, and risks. - [ ] Owner records \`accept\`, \`revise\`, or \`reject\` for joined output, or marks \`N/A\` if no helper joined. -## 6. Completion +## 6. Cleanup -- [ ] Keep sandbox cleanup blocked until change tasks 4.1-4.3 are complete. +- [ ] If this role owns finish, run \`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`. +- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. +- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. " done diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh index a71940a..e417495 100644 --- a/templates/scripts/openspec/init-change-workspace.sh +++ b/templates/scripts/openspec/init-change-workspace.sh @@ -67,7 +67,7 @@ if [[ ! -f "${CHANGE_DIR}/tasks.md" ]]; then - [ ] 3.2 Run \`openspec validate ${CHANGE_SLUG} --type change --strict\`. - [ ] 3.3 Run \`openspec validate --specs\`. -## 4. Completion +## 4. Cleanup - [ ] 4.1 Finish the agent branch via PR merge + cleanup (\`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`). - [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. 
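The hunks in this patch repeat the same `Completion` -> `Cleanup` rename across each mirror of the helper (scripts/, frontend/scripts/, and templates/scripts/). A quick way to confirm those mirrors stay in lockstep is a `cmp`-based sweep like the sketch below — the root list, helper names, and the `check_openspec_mirrors` function name are assumptions drawn from the paths in this series, not an existing script in the repo:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical drift check: fail when any mirror of the OpenSpec helpers
# diverges from the canonical copy under scripts/. Roots and helper names
# are assumed from the files touched in this patch series.
check_openspec_mirrors() {
  local base="${1:-.}"
  local roots=(scripts frontend/scripts templates/scripts)
  local helpers=(openspec/init-change-workspace.sh openspec/init-plan-workspace.sh)
  local helper root reference candidate status=0

  for helper in "${helpers[@]}"; do
    reference="${base}/${roots[0]}/${helper}"
    for root in "${roots[@]:1}"; do
      candidate="${base}/${root}/${helper}"
      # cmp -s is silent; a nonzero status means the copies differ or one is missing.
      if ! cmp -s "$reference" "$candidate"; then
        echo "drift: ${candidate} differs from ${reference}"
        status=1
      fi
    done
  done
  return "$status"
}
```

Run from the repo root (`check_openspec_mirrors .`), this reports each drifted mirror and returns nonzero, which makes it easy to wire into an install test or pre-push hook.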
diff --git a/templates/scripts/openspec/init-plan-workspace.sh b/templates/scripts/openspec/init-plan-workspace.sh index f05f28c..c6dd4da 100644 --- a/templates/scripts/openspec/init-plan-workspace.sh +++ b/templates/scripts/openspec/init-plan-workspace.sh @@ -150,9 +150,11 @@ You are the \`${role}\` lane for shared plan \`${PLAN_SLUG}\`. - [ ] Leave a role handoff with files changed, verification, and risks. - [ ] Owner records \`accept\`, \`revise\`, or \`reject\` for joined output, or marks \`N/A\` if no helper joined. -## 6. Completion +## 6. Cleanup -- [ ] Keep sandbox cleanup blocked until change tasks 4.1-4.3 are complete. +- [ ] If this role owns finish, run \`gx finish --via-pr --wait-for-merge --cleanup\` or \`bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup\`. +- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. +- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. " done diff --git a/test/install.test.js b/test/install.test.js index 96f96e7..f8d4fa3 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -4126,7 +4126,8 @@ test('OpenSpec plan workspace scaffold creates expected role/task structure', () assert.match(plannerTasks, /## 3\. Implementation/); assert.match(plannerTasks, /## 4\. Checkpoints/); assert.match(plannerTasks, /## 5\. Collaboration/); - assert.match(plannerTasks, /## 6\. Completion/); + assert.match(plannerTasks, /## 6\. 
Cleanup/); + assert.match(plannerTasks, /gx finish --via-pr --wait-for-merge --cleanup/); assert.match(plannerTasks, /Claim this role's files in the shared owner branch\/worktree before editing/); const plannerPrompt = fs.readFileSync(path.join(planDir, 'planner', 'prompt.md'), 'utf8'); @@ -4156,7 +4157,7 @@ test('OpenSpec change workspace scaffold creates proposal/tasks/spec defaults', assert.equal(fs.existsSync(path.join(changeDir, 'specs', capabilitySlug, 'spec.md')), true, 'spec.md missing'); const tasksContent = fs.readFileSync(path.join(changeDir, 'tasks.md'), 'utf8'); - assert.match(tasksContent, /## 4\. Completion/); + assert.match(tasksContent, /## 4\. Cleanup/); assert.match(tasksContent, /gx finish --via-pr --wait-for-merge --cleanup/); assert.match(tasksContent, /Record PR URL \+ final `MERGED` state in the completion handoff\./); assert.match(tasksContent, /Confirm sandbox cleanup/); From f3c57079a1ea51220e46c2148c0dd79275e77b08 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 21 Apr 2026 17:30:21 +0200 Subject: [PATCH 12/48] Bump path-to-regexp from 0.1.12 to 0.1.13 in /frontend (#240) Bumps [path-to-regexp](https://github.com/pillarjs/path-to-regexp) from 0.1.12 to 0.1.13. - [Release notes](https://github.com/pillarjs/path-to-regexp/releases) - [Changelog](https://github.com/pillarjs/path-to-regexp/blob/v.0.1.13/History.md) - [Commits](https://github.com/pillarjs/path-to-regexp/compare/v0.1.12...v.0.1.13) --- updated-dependencies: - dependency-name: path-to-regexp dependency-version: 0.1.13 dependency-type: indirect ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- frontend/package-lock.json | 13 ++++++------- 1 file changed, 6 insertions(+), 7 deletions(-) diff --git a/frontend/package-lock.json b/frontend/package-lock.json index ec34336..4c0f26e 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -4630,10 +4630,9 @@ "license": "MIT" }, "node_modules/path-to-regexp": { - "version": "0.1.12", - "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz", - "integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==", - "license": "MIT" + "version": "0.1.13", + "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.13.tgz", + "integrity": "sha512-A/AGNMFN3c8bOlvV9RreMdrv7jsmF9XIfDeCd87+I8RNg6s78BhJxMu69NEMHBSJFxKidViTEdruRwEk/WIKqA==" }, "node_modules/picocolors": { "version": "1.1.1", @@ -8833,9 +8832,9 @@ "dev": true }, "path-to-regexp": { - "version": "0.1.12", - "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz", - "integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==" + "version": "0.1.13", + "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.13.tgz", + "integrity": "sha512-A/AGNMFN3c8bOlvV9RreMdrv7jsmF9XIfDeCd87+I8RNg6s78BhJxMu69NEMHBSJFxKidViTEdruRwEk/WIKqA==" }, "picocolors": { "version": "1.1.1", From 18c4ac1397acbdcff6c93e17e8408c95491ca75b Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 17:31:20 +0200 Subject: [PATCH 13/48] Align downstream OpenSpec scaffolds with recodee bootstrap (#247) Sync the setup-managed change and plan helpers to the richer recodee versions and lock the new scaffold shape with install regressions. 
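The Directive below ("keep scripts/openspec and templates/scripts/openspec in lockstep") implies a mechanical parity check. The sketch below shows one way such a check could look; the temp tree and file name are stand-ins for a real checkout, and the comparison is deliberately byte-exact (`cmp`) rather than semantic.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Build a throwaway tree mirroring the two helper locations from the commit.
root="$(mktemp -d)"
mkdir -p "$root/scripts/openspec" "$root/templates/scripts/openspec"
printf 'echo scaffold\n' > "$root/scripts/openspec/init-change-workspace.sh"
cp "$root/scripts/openspec/init-change-workspace.sh" \
   "$root/templates/scripts/openspec/"

# Every runtime helper must be byte-identical to its template mirror.
status=0
for f in "$root"/scripts/openspec/*.sh; do
  mirror="$root/templates/scripts/openspec/$(basename "$f")"
  if ! cmp -s "$f" "$mirror"; then
    echo "drift: $(basename "$f")" >&2
    status=1
  fi
done
echo "parity status: $status"
rm -rf "$root"
```

Run against the real repo, drift would print the offending helper name and leave a nonzero status for CI to fail on.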
Constraint: gx setup should install the same OpenSpec helper behavior already used in recodee Rejected: Update only templates | runtime/template drift would break parity guarantees Rejected: Copy the recodee plan scaffold verbatim without fixes | unescaped backticks corrupted generated cleanup text Confidence: high Scope-risk: moderate Reversibility: clean Directive: Keep scripts/openspec and templates/scripts/openspec in lockstep with frontend mirrors and install tests Tested: node --test --test-name-pattern OpenSpec test/install.test.js; node --test test/metadata.test.js; node --check bin/multiagent-safety.js; openspec validate agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20 --type change --strict; openspec validate --specs; git diff --check Not-tested: Full npm test suite Co-authored-by: NagyVikt --- .../scripts/openspec/init-change-workspace.sh | 60 +- .../scripts/openspec/init-plan-workspace.sh | 650 ++++++++++++++++-- .../.openspec.yaml | 2 + .../proposal.md | 23 + .../spec.md | 26 + .../tasks.md | 25 + scripts/openspec/init-change-workspace.sh | 60 +- scripts/openspec/init-plan-workspace.sh | 650 ++++++++++++++++-- .../scripts/openspec/init-change-workspace.sh | 60 +- .../scripts/openspec/init-plan-workspace.sh | 650 ++++++++++++++++-- test/install.test.js | 99 ++- 11 files changed, 2025 insertions(+), 280 deletions(-) create mode 100644 openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/.openspec.yaml create mode 100644 openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/proposal.md create mode 100644 openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/specs/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/spec.md create mode 100644 openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/tasks.md mode change 100644 => 100755 templates/scripts/openspec/init-change-workspace.sh mode 
change 100644 => 100755 templates/scripts/openspec/init-plan-workspace.sh diff --git a/frontend/scripts/openspec/init-change-workspace.sh b/frontend/scripts/openspec/init-change-workspace.sh index e417495..8f878e7 100755 --- a/frontend/scripts/openspec/init-change-workspace.sh +++ b/frontend/scripts/openspec/init-change-workspace.sh @@ -1,14 +1,15 @@ #!/usr/bin/env bash set -euo pipefail -if [[ $# -lt 1 || $# -gt 2 ]]; then - echo "Usage: $0 [capability-slug]" - echo "Example: $0 add-dashboard-live-usage runtime-migration" +if [[ $# -lt 1 || $# -gt 3 ]]; then + echo "Usage: $0 [capability-slug] [agent-branch]" + echo "Example: $0 add-dashboard-live-usage runtime-migration agent/claude-odin/add-dashboard-live-usage-123456" exit 1 fi CHANGE_SLUG="$1" CAPABILITY_SLUG="${2:-$CHANGE_SLUG}" +AGENT_BRANCH="${3:-agent//}" if [[ "$CHANGE_SLUG" =~ [^a-z0-9-] ]]; then echo "Error: change slug must be kebab-case (lowercase letters, numbers, hyphens)." @@ -24,7 +25,17 @@ CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" TODAY="$(date -u +%Y-%m-%d)" -mkdir -p "$SPEC_DIR" +MINIMAL_RAW="${GUARDEX_OPENSPEC_MINIMAL:-0}" +case "$(printf '%s' "$MINIMAL_RAW" | tr '[:upper:]' '[:lower:]')" in + 1|true|yes|on) MINIMAL=1 ;; + *) MINIMAL=0 ;; +esac + +if [[ "$MINIMAL" -eq 1 ]]; then + mkdir -p "$CHANGE_DIR" +else + mkdir -p "$SPEC_DIR" +fi if [[ ! -f "${CHANGE_DIR}/.openspec.yaml" ]]; then cat > "${CHANGE_DIR}/.openspec.yaml" < "${CHANGE_DIR}/notes.md" < "${CHANGE_DIR}/proposal.md" < "${CHANGE_DIR}/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`). -- [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. -- [ ] 4.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or capture a \`BLOCKED:\` handoff if merge/cleanup is pending. +- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\`. 
This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). TASKSEOF fi @@ -89,5 +129,5 @@ The system SHALL enforce ${CAPABILITY_SLUG} behavior as defined by this change. SPECEOF fi -echo "[gitguardex] OpenSpec change workspace ready: ${CHANGE_DIR}" -echo "[gitguardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" +echo "[guardex] OpenSpec change workspace ready: ${CHANGE_DIR}" +echo "[guardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" diff --git a/frontend/scripts/openspec/init-plan-workspace.sh b/frontend/scripts/openspec/init-plan-workspace.sh index c6dd4da..c96dba3 100755 --- a/frontend/scripts/openspec/init-plan-workspace.sh +++ b/frontend/scripts/openspec/init-plan-workspace.sh @@ -2,8 +2,8 @@ set -euo pipefail if [[ $# -lt 1 ]]; then - echo "Usage: $0 [role ...]" - echo "Example: $0 stabilize-dashboard planner architect critic executor writer verifier" + echo "Usage: $0 [agent-role ...]" + echo "Example: $0 add-ralplan-openspec-plan-export planner architect critic executor writer verifier" exit 1 fi @@ -11,10 +11,14 @@ PLAN_SLUG="$1" shift || true if [[ "$PLAN_SLUG" =~ [^a-z0-9-] ]]; then - echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." >&2 + echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." exit 1 fi +to_kebab() { + printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//' +} + if [[ $# -gt 0 ]]; then ROLES=("$@") else @@ -24,139 +28,635 @@ fi PLAN_DIR="openspec/plan/${PLAN_SLUG}" mkdir -p "$PLAN_DIR" -write_if_missing() { - local file="$1" - shift - if [[ ! 
-f "$file" ]]; then - mkdir -p "$(dirname "$file")" - cat > "$file" < "$PLAN_DIR/summary.md" < "$PLAN_DIR/checkpoints.md" </spec.md\`" + echo "Planner also gets \`plan.md\`; executor also gets \`checkpoints.md\`." + echo "Planner plans should follow \`openspec/plan/PLANS.md\`." + } > "$PLAN_DIR/README.md" +fi + +if [[ ! -f "$PLAN_DIR/coordinator-prompt.md" ]]; then + cat > "$PLAN_DIR/coordinator-prompt.md" < "$PLAN_DIR/kickoff-prompts.md" < +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt B — Wave B (Secondary lane) + +\`\`\`text +You own Wave-B for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-B scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. + +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups \`\`\` -" -write_if_missing "$PLAN_DIR/planner/plan.md" "# ExecPlan: ${PLAN_SLUG} +## Prompt C — Wave C (Secondary lane) + +\`\`\`text +You own Wave-C for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-C scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. + +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt D — Integrator lane + +\`\`\`text +You are the integrator for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Integrate completed waves, resolve conflicts, run final verification, and prepare rollout/cutover notes. 
+ +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Preserve safety-critical behavior unless explicitly planned and tested. +- Keep final output evidence-first. + +Owned scope: +- integration glue and shared touchpoints +- final validation + handoff summary + +Verification: +- + +Final report: +- Files changed +- Integration decisions +- Verification outputs +- Remaining risks +\`\`\` +KICKOFFPROMPTEOF +fi + +if [[ ! -f "$PLAN_DIR/phases.md" ]]; then + cat > "$PLAN_DIR/phases.md" <\` = in progress, space = pending. +Indented sub-bullets are optional metadata consumed by the Plans UI: + +- \`session\`: which agent kind runs the phase (\`codex\` / \`claude\`). +- \`checkpoints\`: comma-separated role checkpoint ids delivered within the phase. +- \`summary\`: one short sentence rendered under the phase title. + +One phase is intended to fit into a single Codex or Claude session task. + +- [ ] [PH01] First milestone title goes here + - session: codex + - checkpoints: P1, A1 + - summary: Describe the single session outcome expected for this phase. +PHASESEOF +fi + +for role in "${ROLES[@]}"; do + ROLE_DIR="$PLAN_DIR/$role" + mkdir -p "$ROLE_DIR" + + if [[ ! -f "$ROLE_DIR/README.md" ]]; then + cat > "$ROLE_DIR/README.md" </spec.md\` + +Use this folder for role notes, artifacts, and status updates. +ROLEEOF + fi + + ROLE_SPEC_SLUG="$(to_kebab "$role")" + if [[ -z "$ROLE_SPEC_SLUG" ]]; then + ROLE_SPEC_SLUG="role" + fi + + if [[ ! -f "$ROLE_DIR/.openspec.yaml" ]]; then + cat > "$ROLE_DIR/.openspec.yaml" < "$ROLE_DIR/proposal.md" < "$ROLE_SPEC_DIR/spec.md" < "$ROLE_DIR/plan.md" < "$ROLE_DIR/checkpoints.md" < "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
+TASKEOF + ;; + architect) + cat > "$ROLE_DIR/tasks.md" < openspec/plan/${PLAN_SLUG}/${role}/README.md openspec/plan/${PLAN_SLUG}/${role}/prompt.md openspec/plan/${PLAN_SLUG}/${role}/tasks.md openspec/plan/${PLAN_SLUG}/checkpoints.md\` -- Record branch, worktree, and scope in \`tasks.md\`. -- Do not change another role's files without reassignment. +## 3. Implementation -## Deliverables +- [ ] 3.1 Review plan for strongest antithesis/tradeoff tensions +- [ ] 3.2 Propose synthesis path and guardrails for implementation teams +- [ ] 3.3 Record architecture sign-off notes for downstream execution -- Complete the role checklist in \`tasks.md\`. -- Leave a handoff with files changed, verification, and risks. -- The owner alone runs the change completion flow and sandbox cleanup after change tasks 4.1-4.3 are done. -" +## 4. Checkpoints - write_if_missing "$ROLE_DIR/tasks.md" "# ${role} tasks +- [ ] [A1] READY - Architecture review checkpoint -## Ownership +## 5. Collaboration + +- [ ] 5.1 Owner recorded this lane before edits. +- [ ] 5.2 Record joined agents / handoffs, or mark \`N/A\` when solo. + +## 6. Cleanup -- [ ] Claim this role's files in the shared owner branch/worktree before editing. -- [ ] Record branch, worktree, and scope for this role. -- [ ] Copy or hand off \`prompt.md\` when another agent joins this role. +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + critic) + cat > "$ROLE_DIR/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`. -- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. -- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. 
-" +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + executor) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + writer) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + verifier) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + *) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
+TASKEOF + ;; + esac + fi done -echo "[gitguardex] OpenSpec plan workspace ready: ${PLAN_DIR}" -echo "[gitguardex] Roles: ${ROLES[*]}" +echo "Plan workspace ready: $PLAN_DIR" diff --git a/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/.openspec.yaml b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/proposal.md b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/proposal.md new file mode 100644 index 0000000..c620545 --- /dev/null +++ b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/proposal.md @@ -0,0 +1,23 @@ +## Why + +- `gx setup` installs OpenSpec helper scripts that are older than the live `recodee` copies already guiding current work. +- The published `gitguardex` AGENTS contract already promises tier-aware OpenSpec scaffolding, but the installed helper scripts still miss the richer change and plan workspace behavior. +- Downstream repos bootstrapped with `gx setup` should get the same OpenSpec workspace shape that `recodee` is already using. 
+ +## What Changes + +- Sync the published `init-change-workspace.sh` helper to the richer `recodee` version, including: + - T1/minimal notes-only change scaffolding support + - stronger cleanup and Definition-of-Done guidance in `tasks.md` + - optional agent-branch placeholder support for manual scaffolds +- Sync the published `init-plan-workspace.sh` helper to the richer `recodee` version, including: + - coordinator, kickoff, and phases artifacts + - per-role OpenSpec proposal/spec/task scaffolds + - stronger ExecPlan and checkpoint prompts +- Keep runtime/template copies aligned and expand regression coverage for the richer scaffold outputs. + +## Impact + +- Affected surface: `gx setup` / `gx doctor` managed OpenSpec helper scripts plus their runtime/template parity tests. +- User-facing effect: new repos bootstrapped with Guardex get the same richer OpenSpec workspace shape already present in `recodee`. +- Risk: low-to-moderate because the change is limited to scaffold content, but stale tests would hide drift if they are not updated together. diff --git a/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/specs/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/spec.md b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/specs/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/spec.md new file mode 100644 index 0000000..ba4073a --- /dev/null +++ b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/specs/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/spec.md @@ -0,0 +1,26 @@ +## ADDED Requirements + +### Requirement: setup-managed repos receive the richer change scaffold +Guardex setup-managed repos SHALL receive the same richer OpenSpec change scaffold already present in `recodee`. 
+ +#### Scenario: full change scaffold is initialized +- **GIVEN** a repo managed by `gx setup` or `gx doctor` +- **WHEN** the operator runs `scripts/openspec/init-change-workspace.sh ` +- **THEN** the script writes `.openspec.yaml`, `proposal.md`, `tasks.md`, and `specs//spec.md` +- **AND** `tasks.md` includes Definition-of-Done language plus explicit cleanup and merge evidence steps. + +#### Scenario: minimal T1 scaffold is initialized +- **GIVEN** `GUARDEX_OPENSPEC_MINIMAL=1` +- **WHEN** the operator runs `scripts/openspec/init-change-workspace.sh ` +- **THEN** the script creates `.openspec.yaml` and `notes.md` without the full change bundle +- **AND** `notes.md` records the provided agent branch placeholder plus cleanup expectations. + +### Requirement: setup-managed repos receive the richer plan scaffold +Guardex setup-managed repos SHALL receive the same richer OpenSpec plan scaffold already present in `recodee`. + +#### Scenario: plan workspace is initialized +- **GIVEN** a repo managed by `gx setup` or `gx doctor` +- **WHEN** the operator runs `scripts/openspec/init-plan-workspace.sh ` +- **THEN** the script creates summary, checkpoint, and root plan artifacts for the workspace +- **AND** it creates coordinator, kickoff, and phases artifacts +- **AND** each default role receives proposal, spec, and task scaffolds that preserve Spec, Tests, Implementation, Checkpoints, Collaboration, and Cleanup structure. diff --git a/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/tasks.md b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/tasks.md new file mode 100644 index 0000000..23885f5 --- /dev/null +++ b/openspec/changes/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/tasks.md @@ -0,0 +1,25 @@ +## 1. Specification + +- [x] 1.1 Capture the parity gap between `recodee`'s live OpenSpec helpers and the published `gitguardex` setup-managed copies. 
+- [x] 1.2 Define normative requirements for richer change and plan scaffolds in `specs/agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20/spec.md`. + +## 2. Implementation + +- [x] 2.1 Sync `scripts/openspec/*` and `templates/scripts/openspec/*` to the richer `recodee` scaffolds. +- [x] 2.2 Keep mirrored frontend copies aligned where Guardex still ships duplicate helper surfaces. +- [x] 2.3 Update regression tests to lock the richer scaffold outputs. + +## 3. Verification + +- [x] 3.1 Run focused `node --test` coverage for the OpenSpec scaffold/install tests. +- [x] 3.2 Run `node --check bin/multiagent-safety.js`. +- [x] 3.3 Run `openspec validate agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20 --type change --strict`. +- [x] 3.4 Run `openspec validate --specs`. + +Verification note: `node --test --test-name-pattern OpenSpec test/install.test.js` passed with 4 focused OpenSpec/install tests. `node --test test/metadata.test.js` passed with 14 metadata/parity tests. `node --check bin/multiagent-safety.js` passed. `git diff --check` passed. `openspec validate agent-codex-sync-recodee-openspec-bootstrap-into-gx-2026-04-21-17-20 --type change --strict` passed, and `openspec validate --specs` returned `No items found to validate.` + +## 4. Cleanup + +- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. 
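The 4.3 cleanup step above asks for evidence that no worktree or branch refs survive after finish. A hedged sketch of that verification, using a throwaway repo and a hypothetical agent branch name (the real flow would run inside the bootstrapped repo against the actual `agent/...` branch):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Throwaway repo standing in for the real checkout.
repo="$(mktemp -d)"
git -C "$repo" init -q -b main
git -C "$repo" -c user.email=a@b -c user.name=a \
  commit -q --allow-empty -m init

# Hypothetical agent branch; substitute the real one.
branch="agent/example/demo-change"

# Cleanup is confirmed only if neither listing still mentions the branch.
leftover=0
if git -C "$repo" worktree list | grep -q "$branch"; then leftover=1; fi
if git -C "$repo" branch -a | grep -q "$branch"; then leftover=1; fi

if [[ "$leftover" -eq 0 ]]; then
  echo "cleanup confirmed for $branch"
else
  echo "BLOCKED: refs remain for $branch" >&2
fi
rm -rf "$repo"
```

A nonzero `leftover` maps onto the `BLOCKED:` handoff the task text prescribes.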
diff --git a/scripts/openspec/init-change-workspace.sh b/scripts/openspec/init-change-workspace.sh index e417495..8f878e7 100755 --- a/scripts/openspec/init-change-workspace.sh +++ b/scripts/openspec/init-change-workspace.sh @@ -1,14 +1,15 @@ #!/usr/bin/env bash set -euo pipefail -if [[ $# -lt 1 || $# -gt 2 ]]; then - echo "Usage: $0 [capability-slug]" - echo "Example: $0 add-dashboard-live-usage runtime-migration" +if [[ $# -lt 1 || $# -gt 3 ]]; then + echo "Usage: $0 [capability-slug] [agent-branch]" + echo "Example: $0 add-dashboard-live-usage runtime-migration agent/claude-odin/add-dashboard-live-usage-123456" exit 1 fi CHANGE_SLUG="$1" CAPABILITY_SLUG="${2:-$CHANGE_SLUG}" +AGENT_BRANCH="${3:-agent//}" if [[ "$CHANGE_SLUG" =~ [^a-z0-9-] ]]; then echo "Error: change slug must be kebab-case (lowercase letters, numbers, hyphens)." @@ -24,7 +25,17 @@ CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" TODAY="$(date -u +%Y-%m-%d)" -mkdir -p "$SPEC_DIR" +MINIMAL_RAW="${GUARDEX_OPENSPEC_MINIMAL:-0}" +case "$(printf '%s' "$MINIMAL_RAW" | tr '[:upper:]' '[:lower:]')" in + 1|true|yes|on) MINIMAL=1 ;; + *) MINIMAL=0 ;; +esac + +if [[ "$MINIMAL" -eq 1 ]]; then + mkdir -p "$CHANGE_DIR" +else + mkdir -p "$SPEC_DIR" +fi if [[ ! -f "${CHANGE_DIR}/.openspec.yaml" ]]; then cat > "${CHANGE_DIR}/.openspec.yaml" < "${CHANGE_DIR}/notes.md" < "${CHANGE_DIR}/proposal.md" < "${CHANGE_DIR}/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`). -- [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. -- [ ] 4.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or capture a \`BLOCKED:\` handoff if merge/cleanup is pending. +- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. 
+- [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). TASKSEOF fi @@ -89,5 +129,5 @@ The system SHALL enforce ${CAPABILITY_SLUG} behavior as defined by this change. SPECEOF fi -echo "[gitguardex] OpenSpec change workspace ready: ${CHANGE_DIR}" -echo "[gitguardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" +echo "[guardex] OpenSpec change workspace ready: ${CHANGE_DIR}" +echo "[guardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" diff --git a/scripts/openspec/init-plan-workspace.sh b/scripts/openspec/init-plan-workspace.sh index c6dd4da..c96dba3 100755 --- a/scripts/openspec/init-plan-workspace.sh +++ b/scripts/openspec/init-plan-workspace.sh @@ -2,8 +2,8 @@ set -euo pipefail if [[ $# -lt 1 ]]; then - echo "Usage: $0 [role ...]" - echo "Example: $0 stabilize-dashboard planner architect critic executor writer verifier" + echo "Usage: $0 [agent-role ...]" + echo "Example: $0 add-ralplan-openspec-plan-export planner architect critic executor writer verifier" exit 1 fi @@ -11,10 +11,14 @@ PLAN_SLUG="$1" shift || true if [[ "$PLAN_SLUG" =~ [^a-z0-9-] ]]; then - echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." >&2 + echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." exit 1 fi +to_kebab() { + printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//' +} + if [[ $# -gt 0 ]]; then ROLES=("$@") else @@ -24,139 +28,635 @@ fi PLAN_DIR="openspec/plan/${PLAN_SLUG}" mkdir -p "$PLAN_DIR" -write_if_missing() { - local file="$1" - shift - if [[ ! -f "$file" ]]; then - mkdir -p "$(dirname "$file")" - cat > "$file" < "$PLAN_DIR/summary.md" < "$PLAN_DIR/checkpoints.md" </spec.md\`" + echo "Planner also gets \`plan.md\`; executor also gets \`checkpoints.md\`." 
+ echo "Planner plans should follow \`openspec/plan/PLANS.md\`." + } > "$PLAN_DIR/README.md" +fi + +if [[ ! -f "$PLAN_DIR/coordinator-prompt.md" ]]; then + cat > "$PLAN_DIR/coordinator-prompt.md" < "$PLAN_DIR/kickoff-prompts.md" < +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt B — Wave B (Secondary lane) + +\`\`\`text +You own Wave-B for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-B scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. + +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups \`\`\` -" -write_if_missing "$PLAN_DIR/planner/plan.md" "# ExecPlan: ${PLAN_SLUG} +## Prompt C — Wave C (Secondary lane) + +\`\`\`text +You own Wave-C for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-C scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. + +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt D — Integrator lane + +\`\`\`text +You are the integrator for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Integrate completed waves, resolve conflicts, run final verification, and prepare rollout/cutover notes. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Preserve safety-critical behavior unless explicitly planned and tested. +- Keep final output evidence-first. 
+ +Owned scope: +- integration glue and shared touchpoints +- final validation + handoff summary + +Verification: +- + +Final report: +- Files changed +- Integration decisions +- Verification outputs +- Remaining risks +\`\`\` +KICKOFFPROMPTEOF +fi + +if [[ ! -f "$PLAN_DIR/phases.md" ]]; then + cat > "$PLAN_DIR/phases.md" <\` = in progress, space = pending. +Indented sub-bullets are optional metadata consumed by the Plans UI: + +- \`session\`: which agent kind runs the phase (\`codex\` / \`claude\`). +- \`checkpoints\`: comma-separated role checkpoint ids delivered within the phase. +- \`summary\`: one short sentence rendered under the phase title. + +One phase is intended to fit into a single Codex or Claude session task. + +- [ ] [PH01] First milestone title goes here + - session: codex + - checkpoints: P1, A1 + - summary: Describe the single session outcome expected for this phase. +PHASESEOF +fi + +for role in "${ROLES[@]}"; do + ROLE_DIR="$PLAN_DIR/$role" + mkdir -p "$ROLE_DIR" + + if [[ ! -f "$ROLE_DIR/README.md" ]]; then + cat > "$ROLE_DIR/README.md" </spec.md\` + +Use this folder for role notes, artifacts, and status updates. +ROLEEOF + fi + + ROLE_SPEC_SLUG="$(to_kebab "$role")" + if [[ -z "$ROLE_SPEC_SLUG" ]]; then + ROLE_SPEC_SLUG="role" + fi + + if [[ ! -f "$ROLE_DIR/.openspec.yaml" ]]; then + cat > "$ROLE_DIR/.openspec.yaml" < "$ROLE_DIR/proposal.md" < "$ROLE_SPEC_DIR/spec.md" < "$ROLE_DIR/plan.md" < "$ROLE_DIR/checkpoints.md" < "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
+TASKEOF + ;; + architect) + cat > "$ROLE_DIR/tasks.md" < openspec/plan/${PLAN_SLUG}/${role}/README.md openspec/plan/${PLAN_SLUG}/${role}/prompt.md openspec/plan/${PLAN_SLUG}/${role}/tasks.md openspec/plan/${PLAN_SLUG}/checkpoints.md\` -- Record branch, worktree, and scope in \`tasks.md\`. -- Do not change another role's files without reassignment. +## 3. Implementation -## Deliverables +- [ ] 3.1 Review plan for strongest antithesis/tradeoff tensions +- [ ] 3.2 Propose synthesis path and guardrails for implementation teams +- [ ] 3.3 Record architecture sign-off notes for downstream execution -- Complete the role checklist in \`tasks.md\`. -- Leave a handoff with files changed, verification, and risks. -- The owner alone runs the change completion flow and sandbox cleanup after change tasks 4.1-4.3 are done. -" +## 4. Checkpoints - write_if_missing "$ROLE_DIR/tasks.md" "# ${role} tasks +- [ ] [A1] READY - Architecture review checkpoint -## Ownership +## 5. Collaboration + +- [ ] 5.1 Owner recorded this lane before edits. +- [ ] 5.2 Record joined agents / handoffs, or mark \`N/A\` when solo. + +## 6. Cleanup -- [ ] Claim this role's files in the shared owner branch/worktree before editing. -- [ ] Record branch, worktree, and scope for this role. -- [ ] Copy or hand off \`prompt.md\` when another agent joins this role. +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + critic) + cat > "$ROLE_DIR/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`. -- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. -- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. 
-" +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + executor) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + writer) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + verifier) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + *) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
+TASKEOF + ;; + esac + fi done -echo "[gitguardex] OpenSpec plan workspace ready: ${PLAN_DIR}" -echo "[gitguardex] Roles: ${ROLES[*]}" +echo "Plan workspace ready: $PLAN_DIR" diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh old mode 100644 new mode 100755 index e417495..8f878e7 --- a/templates/scripts/openspec/init-change-workspace.sh +++ b/templates/scripts/openspec/init-change-workspace.sh @@ -1,14 +1,15 @@ #!/usr/bin/env bash set -euo pipefail -if [[ $# -lt 1 || $# -gt 2 ]]; then - echo "Usage: $0 [capability-slug]" - echo "Example: $0 add-dashboard-live-usage runtime-migration" +if [[ $# -lt 1 || $# -gt 3 ]]; then + echo "Usage: $0 [capability-slug] [agent-branch]" + echo "Example: $0 add-dashboard-live-usage runtime-migration agent/claude-odin/add-dashboard-live-usage-123456" exit 1 fi CHANGE_SLUG="$1" CAPABILITY_SLUG="${2:-$CHANGE_SLUG}" +AGENT_BRANCH="${3:-agent//}" if [[ "$CHANGE_SLUG" =~ [^a-z0-9-] ]]; then echo "Error: change slug must be kebab-case (lowercase letters, numbers, hyphens)." @@ -24,7 +25,17 @@ CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" TODAY="$(date -u +%Y-%m-%d)" -mkdir -p "$SPEC_DIR" +MINIMAL_RAW="${GUARDEX_OPENSPEC_MINIMAL:-0}" +case "$(printf '%s' "$MINIMAL_RAW" | tr '[:upper:]' '[:lower:]')" in + 1|true|yes|on) MINIMAL=1 ;; + *) MINIMAL=0 ;; +esac + +if [[ "$MINIMAL" -eq 1 ]]; then + mkdir -p "$CHANGE_DIR" +else + mkdir -p "$SPEC_DIR" +fi if [[ ! -f "${CHANGE_DIR}/.openspec.yaml" ]]; then cat > "${CHANGE_DIR}/.openspec.yaml" < "${CHANGE_DIR}/notes.md" < "${CHANGE_DIR}/proposal.md" < "${CHANGE_DIR}/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`). -- [ ] 4.2 Record PR URL + final \`MERGED\` state in the completion handoff. -- [ ] 4.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or capture a \`BLOCKED:\` handoff if merge/cleanup is pending. 
+- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). TASKSEOF fi @@ -89,5 +129,5 @@ The system SHALL enforce ${CAPABILITY_SLUG} behavior as defined by this change. SPECEOF fi -echo "[gitguardex] OpenSpec change workspace ready: ${CHANGE_DIR}" -echo "[gitguardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" +echo "[guardex] OpenSpec change workspace ready: ${CHANGE_DIR}" +echo "[guardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" diff --git a/templates/scripts/openspec/init-plan-workspace.sh b/templates/scripts/openspec/init-plan-workspace.sh old mode 100644 new mode 100755 index c6dd4da..c96dba3 --- a/templates/scripts/openspec/init-plan-workspace.sh +++ b/templates/scripts/openspec/init-plan-workspace.sh @@ -2,8 +2,8 @@ set -euo pipefail if [[ $# -lt 1 ]]; then - echo "Usage: $0 [role ...]" - echo "Example: $0 stabilize-dashboard planner architect critic executor writer verifier" + echo "Usage: $0 [agent-role ...]" + echo "Example: $0 add-ralplan-openspec-plan-export planner architect critic executor writer verifier" exit 1 fi @@ -11,10 +11,14 @@ PLAN_SLUG="$1" shift || true if [[ "$PLAN_SLUG" =~ [^a-z0-9-] ]]; then - echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." >&2 + echo "Error: plan slug must be kebab-case (lowercase letters, numbers, hyphens)." 
exit 1 fi +to_kebab() { + printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//' +} + if [[ $# -gt 0 ]]; then ROLES=("$@") else @@ -24,139 +28,635 @@ fi PLAN_DIR="openspec/plan/${PLAN_SLUG}" mkdir -p "$PLAN_DIR" -write_if_missing() { - local file="$1" - shift - if [[ ! -f "$file" ]]; then - mkdir -p "$(dirname "$file")" - cat > "$file" < "$PLAN_DIR/summary.md" < "$PLAN_DIR/checkpoints.md" </spec.md\`" + echo "Planner also gets \`plan.md\`; executor also gets \`checkpoints.md\`." + echo "Planner plans should follow \`openspec/plan/PLANS.md\`." + } > "$PLAN_DIR/README.md" +fi + +if [[ ! -f "$PLAN_DIR/coordinator-prompt.md" ]]; then + cat > "$PLAN_DIR/coordinator-prompt.md" < "$PLAN_DIR/kickoff-prompts.md" < +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt B — Wave B (Secondary lane) + +\`\`\`text +You own Wave-B for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-B scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. + +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups \`\`\` -" -write_if_missing "$PLAN_DIR/planner/plan.md" "# ExecPlan: ${PLAN_SLUG} +## Prompt C — Wave C (Secondary lane) + +\`\`\`text +You own Wave-C for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Implement the assigned Wave-C scope and return verification evidence. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Stay in your owned files/modules only. +- Record explicit handoff notes for integration. 
+ +Owned scope: +- + +Verification: +- + +Handoff format: +- Files changed +- Behavior touched +- Verification outputs +- Risks/follow-ups +\`\`\` + +## Prompt D — Integrator lane + +\`\`\`text +You are the integrator for plan \`${PLAN_SLUG}\` in /home/deadpool/Documents/codex-lb. + +Goal: +Integrate completed waves, resolve conflicts, run final verification, and prepare rollout/cutover notes. + +Hard constraints: +- You are not alone in the codebase; do not revert others' work. +- Preserve safety-critical behavior unless explicitly planned and tested. +- Keep final output evidence-first. + +Owned scope: +- integration glue and shared touchpoints +- final validation + handoff summary + +Verification: +- + +Final report: +- Files changed +- Integration decisions +- Verification outputs +- Remaining risks +\`\`\` +KICKOFFPROMPTEOF +fi + +if [[ ! -f "$PLAN_DIR/phases.md" ]]; then + cat > "$PLAN_DIR/phases.md" <\` = in progress, space = pending. +Indented sub-bullets are optional metadata consumed by the Plans UI: + +- \`session\`: which agent kind runs the phase (\`codex\` / \`claude\`). +- \`checkpoints\`: comma-separated role checkpoint ids delivered within the phase. +- \`summary\`: one short sentence rendered under the phase title. + +One phase is intended to fit into a single Codex or Claude session task. + +- [ ] [PH01] First milestone title goes here + - session: codex + - checkpoints: P1, A1 + - summary: Describe the single session outcome expected for this phase. +PHASESEOF +fi + +for role in "${ROLES[@]}"; do + ROLE_DIR="$PLAN_DIR/$role" + mkdir -p "$ROLE_DIR" + + if [[ ! -f "$ROLE_DIR/README.md" ]]; then + cat > "$ROLE_DIR/README.md" </spec.md\` + +Use this folder for role notes, artifacts, and status updates. +ROLEEOF + fi + + ROLE_SPEC_SLUG="$(to_kebab "$role")" + if [[ -z "$ROLE_SPEC_SLUG" ]]; then + ROLE_SPEC_SLUG="role" + fi + + if [[ ! 
-f "$ROLE_DIR/.openspec.yaml" ]]; then + cat > "$ROLE_DIR/.openspec.yaml" < "$ROLE_DIR/proposal.md" < "$ROLE_SPEC_DIR/spec.md" < "$ROLE_DIR/plan.md" < "$ROLE_DIR/checkpoints.md" < "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + architect) + cat > "$ROLE_DIR/tasks.md" < openspec/plan/${PLAN_SLUG}/${role}/README.md openspec/plan/${PLAN_SLUG}/${role}/prompt.md openspec/plan/${PLAN_SLUG}/${role}/tasks.md openspec/plan/${PLAN_SLUG}/checkpoints.md\` -- Record branch, worktree, and scope in \`tasks.md\`. -- Do not change another role's files without reassignment. +## 3. Implementation -## Deliverables +- [ ] 3.1 Review plan for strongest antithesis/tradeoff tensions +- [ ] 3.2 Propose synthesis path and guardrails for implementation teams +- [ ] 3.3 Record architecture sign-off notes for downstream execution -- Complete the role checklist in \`tasks.md\`. -- Leave a handoff with files changed, verification, and risks. -- The owner alone runs the change completion flow and sandbox cleanup after change tasks 4.1-4.3 are done. -" +## 4. Checkpoints - write_if_missing "$ROLE_DIR/tasks.md" "# ${role} tasks +- [ ] [A1] READY - Architecture review checkpoint -## Ownership +## 5. Collaboration + +- [ ] 5.1 Owner recorded this lane before edits. +- [ ] 5.2 Record joined agents / handoffs, or mark \`N/A\` when solo. + +## 6. Cleanup -- [ ] Claim this role's files in the shared owner branch/worktree before editing. -- [ ] Record branch, worktree, and scope for this role. -- [ ] Copy or hand off \`prompt.md\` when another agent joins this role. +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. 
+- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + critic) + cat > "$ROLE_DIR/tasks.md" < --base --via-pr --wait-for-merge --cleanup\`. -- [ ] Record PR URL + final \`MERGED\` state in the role handoff or owner change task. -- [ ] Confirm sandbox cleanup, or capture a \`BLOCKED:\` handoff if finish is still pending. -" +- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + executor) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + writer) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + verifier) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. +TASKEOF + ;; + *) + cat > "$ROLE_DIR/tasks.md" < --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. +- [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
+TASKEOF + ;; + esac + fi done -echo "[gitguardex] OpenSpec plan workspace ready: ${PLAN_DIR}" -echo "[gitguardex] Roles: ${ROLES[*]}" +echo "Plan workspace ready: $PLAN_DIR" diff --git a/test/install.test.js b/test/install.test.js index f8d4fa3..5d329dc 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -4098,41 +4098,58 @@ test('OpenSpec plan workspace scaffold creates expected role/task structure', () assert.equal(scaffold.status, 0, scaffold.stderr || scaffold.stdout); const planDir = path.join(repoDir, 'openspec', 'plan', planSlug); - const expected = [ + const rootExpected = [ + 'README.md', 'summary.md', 'checkpoints.md', - 'planner/plan.md', - 'planner/prompt.md', - 'planner/tasks.md', - 'architect/prompt.md', - 'architect/tasks.md', - 'critic/prompt.md', - 'critic/tasks.md', - 'executor/prompt.md', - 'executor/tasks.md', - 'writer/prompt.md', - 'writer/tasks.md', - 'verifier/prompt.md', - 'verifier/tasks.md', + 'coordinator-prompt.md', + 'kickoff-prompts.md', + 'phases.md', ]; - for (const rel of expected) { + for (const rel of rootExpected) { assert.equal(fs.existsSync(path.join(planDir, rel)), true, `${rel} missing`); } + for (const role of ['planner', 'architect', 'critic', 'executor', 'writer', 'verifier']) { + assert.equal(fs.existsSync(path.join(planDir, role, 'README.md')), true, `${role}/README.md missing`); + assert.equal(fs.existsSync(path.join(planDir, role, '.openspec.yaml')), true, `${role}/.openspec.yaml missing`); + assert.equal(fs.existsSync(path.join(planDir, role, 'proposal.md')), true, `${role}/proposal.md missing`); + assert.equal(fs.existsSync(path.join(planDir, role, 'tasks.md')), true, `${role}/tasks.md missing`); + assert.equal( + fs.existsSync(path.join(planDir, role, 'specs', role, 'spec.md')), + true, + `${role}/specs/${role}/spec.md missing`, + ); + } + assert.equal(fs.existsSync(path.join(planDir, 'planner', 'plan.md')), true, 'planner/plan.md missing'); + assert.equal( + fs.existsSync(path.join(planDir, 
'executor', 'checkpoints.md')),
+    true,
+    'executor/checkpoints.md missing',
+  );
+
+  const coordinatorPrompt = fs.readFileSync(path.join(planDir, 'coordinator-prompt.md'), 'utf8');
+  assert.match(coordinatorPrompt, /Drive this plan from draft to execution-ready status/);
+  assert.match(coordinatorPrompt, /kickoff-prompts\.md/);
+
+  const phasesContent = fs.readFileSync(path.join(planDir, 'phases.md'), 'utf8');
+  assert.match(phasesContent, /\[PH01\]/);
+  assert.match(phasesContent, /session: codex/);
+
   const plannerTasks = fs.readFileSync(path.join(planDir, 'planner', 'tasks.md'), 'utf8');
-  assert.match(plannerTasks, /## Ownership/);
+  assert.match(plannerTasks, /# planner tasks/);
   assert.match(plannerTasks, /## 1\. Spec/);
   assert.match(plannerTasks, /## 2\. Tests/);
   assert.match(plannerTasks, /## 3\. Implementation/);
   assert.match(plannerTasks, /## 4\. Checkpoints/);
   assert.match(plannerTasks, /## 5\. Collaboration/);
   assert.match(plannerTasks, /## 6\. Cleanup/);
-  assert.match(plannerTasks, /gx finish --via-pr --wait-for-merge --cleanup/);
-  assert.match(plannerTasks, /Claim this role's files in the shared owner branch\/worktree before editing/);
+  assert.match(plannerTasks, /\[P1\] READY - Initial planning draft checkpoint/);
+  assert.match(plannerTasks, /bash scripts\/agent-branch-finish\.sh/);
 
-  const plannerPrompt = fs.readFileSync(path.join(planDir, 'planner', 'prompt.md'), 'utf8');
-  assert.match(plannerPrompt, /claim --branch /);
-  assert.match(plannerPrompt, /change tasks 4\.1-4\.3/);
+  const plannerPlan = fs.readFileSync(path.join(planDir, 'planner', 'plan.md'), 'utf8');
+  assert.match(plannerPlan, /This ExecPlan is a living document/);
+  assert.match(plannerPlan, /## Idempotence and Recovery/);
 });
 
 test('OpenSpec change workspace scaffold creates proposal/tasks/spec defaults', () => {
@@ -4157,10 +4174,42 @@ test('OpenSpec change workspace scaffold creates proposal/tasks/spec defaults',
   assert.equal(fs.existsSync(path.join(changeDir, 'specs',
capabilitySlug, 'spec.md')), true, 'spec.md missing');
 
   const tasksContent = fs.readFileSync(path.join(changeDir, 'tasks.md'), 'utf8');
-  assert.match(tasksContent, /## 4\. Cleanup/);
-  assert.match(tasksContent, /gx finish --via-pr --wait-for-merge --cleanup/);
-  assert.match(tasksContent, /Record PR URL \+ final `MERGED` state in the completion handoff\./);
-  assert.match(tasksContent, /Confirm sandbox cleanup/);
+  assert.match(tasksContent, /## Definition of Done/);
+  assert.match(tasksContent, /append a `BLOCKED:` line under section 4/);
+  assert.match(tasksContent, /## 4\. Cleanup \(mandatory; run before claiming completion\)/);
+  assert.match(tasksContent, /Run the cleanup pipeline:/);
+  assert.match(tasksContent, /Record the PR URL and final merge state \(`MERGED`\)/);
+  assert.match(tasksContent, /Confirm the sandbox worktree is gone/);
+});
+
+test('OpenSpec change workspace scaffold supports minimal T1 notes mode', () => {
+  const repoDir = initRepo();
+
+  const setupResult = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
+  assert.equal(setupResult.status, 0, setupResult.stderr || setupResult.stdout);
+
+  const changeSlug = 'change-workspace-minimal';
+  const capabilitySlug = 'runtime-migration';
+  const agentBranch = 'agent/codex/minimal-change';
+  const scaffold = runCmd(
+    'bash',
+    ['scripts/openspec/init-change-workspace.sh', changeSlug, capabilitySlug, agentBranch],
+    repoDir,
+    { GUARDEX_OPENSPEC_MINIMAL: '1' },
+  );
+  assert.equal(scaffold.status, 0, scaffold.stderr || scaffold.stdout);
+
+  const changeDir = path.join(repoDir, 'openspec', 'changes', changeSlug);
+  assert.equal(fs.existsSync(path.join(changeDir, '.openspec.yaml')), true, '.openspec.yaml missing');
+  assert.equal(fs.existsSync(path.join(changeDir, 'notes.md')), true, 'notes.md missing');
+  assert.equal(fs.existsSync(path.join(changeDir, 'proposal.md')), false, 'proposal.md should not exist in minimal mode');
+  assert.equal(fs.existsSync(path.join(changeDir,
'tasks.md')), false, 'tasks.md should not exist in minimal mode');
+
+  const notesContent = fs.readFileSync(path.join(changeDir, 'notes.md'), 'utf8');
+  assert.match(notesContent, /minimal \/ T1/);
+  assert.match(notesContent, new RegExp(agentBranch.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')));
+  assert.match(notesContent, /Commit message is the spec of record/);
+  assert.match(notesContent, /Record PR URL \+ `MERGED` state/);
+});
+
 test('validate blocks unapproved deletions until allow-delete is set', () => {

From a151351c856763d2e559a10bb20ff1fd380171f4 Mon Sep 17 00:00:00 2001
From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com>
Date: Tue, 21 Apr 2026 17:54:53 +0200
Subject: [PATCH 14/48] Expose live Guardex sandbox sessions inside VS Code
 Source Control (#248)

Guardex now writes repo-local active session presence from codex-agent,
scaffolds a lightweight VS Code companion into target repos, and ships a
local install helper so users can see running lanes with native
loading~spin rows in the Source Control container.
Constraint: VS Code strips custom SVG animation from sidebar surfaces, so the companion must use native animated codicons
Constraint: Guardex setup scaffolds from templates, so the companion and helpers must ship through template mapping as well as the source repo
Rejected: Webview-only sidebar clone | would miss the native Source Control container users already watch
Confidence: high
Scope-risk: moderate
Reversibility: clean
Directive: Keep the companion consuming .omx/state/active-sessions and ignore dead PIDs before adding richer runtime signals
Tested: node --check bin/multiagent-safety.js; node --check scripts/agent-session-state.js; node --check scripts/install-vscode-active-agents-extension.js; node --check templates/vscode/guardex-active-agents/extension.js; node --check templates/vscode/guardex-active-agents/session-schema.js; bash -n scripts/codex-agent.sh; bash -n templates/scripts/codex-agent.sh; node --test test/vscode-active-agents-session-state.test.js; temporary gx setup smoke; openspec validate agent-codex-vscode-active-agents-extension-2026-04-21-17-38 --type change --strict; openspec validate --specs; git diff --check
Not-tested: Live activation inside an already-open VS Code window without a manual reload

Co-authored-by: NagyVikt
---
 README.md                                                |   8 +
 bin/multiagent-safety.js                                 |  12 ++
 .../.openspec.yaml                                       |   2 +
 .../proposal.md                                          |  16 ++
 .../vscode-active-agents-extension/spec.md               |  36 ++++
 .../tasks.md                                             |  31 +++
 scripts/agent-session-state.js                           | 110 ++++++++++
 scripts/codex-agent.sh                                   |  51 +++++
 .../install-vscode-active-agents-extension.js            |  92 ++++++++
 templates/scripts/agent-session-state.js                 | 110 ++++++++++
 templates/scripts/codex-agent.sh                         |  51 +++++
 .../install-vscode-active-agents-extension.js            |  92 ++++++++
 templates/vscode/guardex-active-agents/README.md         |  18 ++
 templates/vscode/guardex-active-agents/extension.js      | 147 +++++++++++++
 templates/vscode/guardex-active-agents/package.json      |  56 +++++
 templates/vscode/guardex-active-agents/session-schema.js | 203 ++++++++++++++++++
 ...vscode-active-agents-session-state.test.js            | 134 ++++++++++++
 17 files changed, 1169 insertions(+)
 create mode 100644 openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/.openspec.yaml
 create mode 100644 openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/proposal.md
 create mode 100644 openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/specs/vscode-active-agents-extension/spec.md
 create mode 100644 openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/tasks.md
 create mode 100755 scripts/agent-session-state.js
 create mode 100755 scripts/install-vscode-active-agents-extension.js
 create mode 100755 templates/scripts/agent-session-state.js
 create mode 100755 templates/scripts/install-vscode-active-agents-extension.js
 create mode 100644 templates/vscode/guardex-active-agents/README.md
 create mode 100644 templates/vscode/guardex-active-agents/extension.js
 create mode 100644 templates/vscode/guardex-active-agents/package.json
 create mode 100644 templates/vscode/guardex-active-agents/session-schema.js
 create mode 100644 test/vscode-active-agents-session-state.test.js

diff --git a/README.md b/README.md
index 94886ea..4151e14 100644
--- a/README.md
+++ b/README.md
@@ -221,6 +221,14 @@ This is the real Source Control shape Guardex is aiming for: isolated agent bran
 
 ![Guarded VS Code Source Control example](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-source-control-grouped.png)
 
+To install the real companion into local VS Code from a Guardex-wired repo:
+
+```sh
+node scripts/install-vscode-active-agents-extension.js
+```
+
+It adds an `Active Agents` view to the Source Control container, reads `.omx/state/active-sessions/*.json`, and uses VS Code's native `loading~spin` codicon for the running-state affordance. Reload the VS Code window after install.
+
 ---
 
 ## Commands
 
diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js
index b291bee..7b14a16 100755
--- a/bin/multiagent-safety.js
+++ b/bin/multiagent-safety.js
@@ -90,8 +90,10 @@ const TEMPLATE_FILES = [
   'scripts/agent-branch-start.sh',
   'scripts/agent-branch-finish.sh',
   'scripts/agent-branch-merge.sh',
+  'scripts/agent-session-state.js',
   'scripts/codex-agent.sh',
   'scripts/guardex-docker-loader.sh',
+  'scripts/install-vscode-active-agents-extension.js',
   'scripts/review-bot-watch.sh',
   'scripts/agent-worktree-prune.sh',
   'scripts/agent-file-locks.py',
@@ -108,12 +110,17 @@ const TEMPLATE_FILES = [
   'claude/commands/gitguardex.md',
   'github/pull.yml.example',
   'github/workflows/cr.yml',
+  'vscode/guardex-active-agents/package.json',
+  'vscode/guardex-active-agents/extension.js',
+  'vscode/guardex-active-agents/session-schema.js',
+  'vscode/guardex-active-agents/README.md',
 ];
 
 const REQUIRED_WORKFLOW_FILES = [
   'scripts/agent-branch-start.sh',
   'scripts/agent-branch-finish.sh',
   'scripts/agent-branch-merge.sh',
+  'scripts/agent-session-state.js',
   'scripts/guardex-docker-loader.sh',
   'scripts/agent-worktree-prune.sh',
   'scripts/agent-file-locks.py',
@@ -153,8 +160,10 @@ const EXECUTABLE_RELATIVE_PATHS = new Set([
   'scripts/agent-branch-start.sh',
   'scripts/agent-branch-finish.sh',
   'scripts/agent-branch-merge.sh',
+  'scripts/agent-session-state.js',
   'scripts/codex-agent.sh',
   'scripts/guardex-docker-loader.sh',
+  'scripts/install-vscode-active-agents-extension.js',
   'scripts/review-bot-watch.sh',
   'scripts/agent-worktree-prune.sh',
   'scripts/agent-file-locks.py',
@@ -814,6 +823,9 @@ function toDestinationPath(relativeTemplatePath) {
   if (relativeTemplatePath.startsWith('github/')) {
     return `.${relativeTemplatePath}`;
   }
+  if (relativeTemplatePath.startsWith('vscode/')) {
+    return relativeTemplatePath;
+  }
   throw new Error(`Unsupported template path: ${relativeTemplatePath}`);
 }
 
diff --git
a/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/.openspec.yaml b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/.openspec.yaml
new file mode 100644
index 0000000..4b8c565
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/.openspec.yaml
@@ -0,0 +1,2 @@
+schema: spec-driven
+created: 2026-04-21
diff --git a/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/proposal.md b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/proposal.md
new file mode 100644
index 0000000..ca32a0d
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/proposal.md
@@ -0,0 +1,16 @@
+## Why
+
+- Guardex already documents a polished VS Code Source Control view, but the repo does not ship a real companion that surfaces live agent lanes inside VS Code.
+- Users need a concrete way to see running Guardex/Codex sandboxes without reading lock JSON or switching to terminal status output.
+
+## What Changes
+
+- Add repo-local active-session state that `scripts/codex-agent.sh` writes while a sandbox session is running and removes on exit.
+- Add a lightweight VS Code companion extension that contributes an `Active Agents` view inside the Source Control container and renders one spinning row per live session.
+- Add a local install path for the companion extension so users can enable it in their real VS Code without publishing to the Marketplace first.
+
+## Impact
+
+- Affected surfaces: `scripts/codex-agent.sh`, new session-state/install helpers, README, tests, and a new `vscode/guardex-active-agents` companion directory.
+- Primary risk is stale or misleading session rows if lifecycle cleanup fails, so the companion must ignore dead PIDs and the writer must remove state on wrapper exit.
+The companion should stay native to VS Code constraints by using built-in spinning codicons instead of custom animated SVGs.
diff --git a/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/specs/vscode-active-agents-extension/spec.md b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/specs/vscode-active-agents-extension/spec.md
new file mode 100644
index 0000000..3762eed
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/specs/vscode-active-agents-extension/spec.md
@@ -0,0 +1,36 @@
+## ADDED Requirements
+
+### Requirement: Guardex writes active session presence for sandboxed Codex runs
+The system SHALL write repo-local active session presence records while `scripts/codex-agent.sh` is running an interactive sandbox session.
+
+#### Scenario: Session start records live metadata
+- **WHEN** `scripts/codex-agent.sh` launches Codex in a sandbox worktree
+- **THEN** Guardex writes a JSON record under `.omx/state/active-sessions/`
+- **AND** the record includes the repo root, sandbox branch, task name, agent name, worktree path, launch PID, CLI name, and start timestamp.
+
+#### Scenario: Session exit removes presence record
+- **WHEN** the wrapper exits after Codex finishes
+- **THEN** the corresponding `.omx/state/active-sessions/` record is removed
+- **AND** later launches for the same branch can recreate it cleanly.
+
+### Requirement: VS Code companion shows active Guardex lanes in Source Control
+The system SHALL provide a VS Code companion extension that surfaces live Guardex sessions in the Source Control container.
+
+#### Scenario: Live sessions render with native spinner
+- **WHEN** the companion finds live session records for the current workspace
+- **THEN** it shows an `Active Agents` view in the Source Control container
+- **AND** each live session renders with a native animated VS Code icon equivalent to `loading~spin`
+- **AND** the row includes the branch identity plus an elapsed-time description.
+
+#### Scenario: Dead or stale sessions are ignored
+- **WHEN** a session record references a PID that is no longer running or contains invalid JSON
+- **THEN** the companion does not render it as an active agent row
+- **AND** valid rows continue to render.
+
+### Requirement: Local install path enables the companion without Marketplace publishing
+The system SHALL provide a local install path for the companion extension from the repo checkout.
+
+#### Scenario: Local install copies the extension to the VS Code extensions directory
+- **WHEN** the local install helper is run
+- **THEN** it copies the companion extension into the target VS Code extensions directory using the extension package version
+- **AND** it replaces older local installs for the same extension identifier so reload picks up the newest sources.
diff --git a/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/tasks.md b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/tasks.md
new file mode 100644
index 0000000..470b906
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-active-agents-extension-2026-04-21-17-38/tasks.md
@@ -0,0 +1,31 @@
+## Definition of Done
+
+This change is complete only when **all** of the following are true:
+
+- Every checkbox below is checked.
+- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff.
+- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**.
Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline.
+
+## 1. Specification
+
+- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-vscode-active-agents-extension-2026-04-21-17-38`.
+- [x] 1.2 Define normative requirements in `specs/vscode-active-agents-extension/spec.md`.
+
+## 2. Implementation
+
+- [x] 2.1 Add repo-local active-session state writing/cleanup around `scripts/codex-agent.sh`.
+- [x] 2.2 Add the VS Code Source Control companion view and local install path.
+- [x] 2.3 Add/update focused regression coverage for session-state parsing and install behavior.
+- [x] 2.4 Update README guidance for the real VS Code companion flow.
+
+## 3. Verification
+
+- [x] 3.1 Run targeted project verification commands for the session-state helper, extension sources, and install path.
+- [x] 3.2 Run `openspec validate agent-codex-vscode-active-agents-extension-2026-04-21-17-38 --type change --strict`.
+- [x] 3.3 Run `openspec validate --specs`.
+
+## 4. Cleanup (mandatory; run before claiming completion)
+
+- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent// --base main --via-pr --wait-for-merge --cleanup`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation.
+- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff.
+- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch).
diff --git a/scripts/agent-session-state.js b/scripts/agent-session-state.js
new file mode 100755
index 0000000..ae65c18
--- /dev/null
+++ b/scripts/agent-session-state.js
@@ -0,0 +1,110 @@
+#!/usr/bin/env node
+
+const fs = require('node:fs');
+const path = require('node:path');
+
+function resolveSessionSchemaModule() {
+  const candidates = [
+    path.resolve(__dirname, '..', 'vscode', 'guardex-active-agents', 'session-schema.js'),
+    path.resolve(__dirname, '..', 'templates', 'vscode', 'guardex-active-agents', 'session-schema.js'),
+  ];
+
+  for (const candidate of candidates) {
+    if (fs.existsSync(candidate)) {
+      return require(candidate);
+    }
+  }
+
+  throw new Error('Could not resolve Guardex active-agent session schema module.');
+}
+
+const sessionSchema = resolveSessionSchemaModule();
+
+function usage() {
+  return (
+    'Usage:\n' +
+    '  node scripts/agent-session-state.js start --repo --branch --task --agent --worktree --pid --cli \n' +
+    '  node scripts/agent-session-state.js stop --repo --branch \n'
+  );
+}
+
+function parseOptions(argv) {
+  const options = {};
+  for (let index = 0; index < argv.length; index += 1) {
+    const token = argv[index];
+    if (!token.startsWith('--')) {
+      throw new Error(`Unexpected argument: ${token}`);
+    }
+    const key = token.slice(2);
+    const value = argv[index + 1];
+    if (!value || value.startsWith('--')) {
+      throw new Error(`Missing value for --${key}`);
+    }
+    options[key] = value;
+    index += 1;
+  }
+  return options;
+}
+
+function requireOption(options, key) {
+  const value = options[key];
+  if (!value) {
+    throw new Error(`Missing required option --${key}`);
+  }
+  return value;
+}
+
+function writeSessionRecord(options) {
+  const repoRoot = requireOption(options, 'repo');
+  const branch = requireOption(options, 'branch');
+  const record = sessionSchema.buildSessionRecord({
+    repoRoot,
+    branch,
+    taskName: requireOption(options, 'task'),
+    agentName: requireOption(options, 'agent'),
+    worktreePath: requireOption(options,
'worktree'), + pid: requireOption(options, 'pid'), + cliName: requireOption(options, 'cli'), + }); + + const targetPath = sessionSchema.sessionFilePathForBranch(repoRoot, branch); + fs.mkdirSync(path.dirname(targetPath), { recursive: true }); + fs.writeFileSync(targetPath, `${JSON.stringify(record, null, 2)}\n`, 'utf8'); +} + +function removeSessionRecord(options) { + const repoRoot = requireOption(options, 'repo'); + const branch = requireOption(options, 'branch'); + const targetPath = sessionSchema.sessionFilePathForBranch(repoRoot, branch); + if (fs.existsSync(targetPath)) { + fs.unlinkSync(targetPath); + } +} + +function main() { + const [command, ...rest] = process.argv.slice(2); + if (!command || ['-h', '--help', 'help'].includes(command)) { + process.stdout.write(usage()); + return; + } + + const options = parseOptions(rest); + if (command === 'start') { + writeSessionRecord(options); + return; + } + if (command === 'stop') { + removeSessionRecord(options); + return; + } + + throw new Error(`Unknown subcommand: ${command}`); +} + +try { + main(); +} catch (error) { + process.stderr.write(`[guardex-active-session] ${error.message}\n`); + process.stderr.write(usage()); + process.exitCode = 1; +} diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index a8a53c8..8b509dd 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -6,6 +6,7 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -143,6 +144,7 @@ if ! 
git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" +active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -446,6 +448,40 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } +run_active_session_state() { + local action="$1" + shift + + if [[ ! -f "$active_session_state_script" ]]; then + return 0 + fi + if ! command -v "$NODE_BIN" >/dev/null 2>&1; then + return 0 + fi + + "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true +} + +record_active_session_state() { + local wt="$1" + local branch="$2" + + run_active_session_state \ + start \ + --repo "$repo_root" \ + --branch "$branch" \ + --task "$TASK_NAME" \ + --agent "$AGENT_NAME" \ + --worktree "$wt" \ + --pid "$$" \ + --cli "$CODEX_BIN" +} + +clear_active_session_state() { + local branch="$1" + run_active_session_state stop --repo "$repo_root" --branch "$branch" +} + origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -833,6 +869,19 @@ if ! ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi +active_session_recorded=0 +cleanup_active_session_state_on_exit() { + set +e + if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then + clear_active_session_state "$worktree_branch" + active_session_recorded=0 + fi +} + +record_active_session_state "$worktree_path" "$worktree_branch" +active_session_recorded=1 +trap cleanup_active_session_state_on_exit EXIT INT TERM + echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -841,6 +890,8 @@ codex_exit="$?" 
set -e cd "$repo_root" +cleanup_active_session_state_on_exit +trap - EXIT INT TERM final_exit="$codex_exit" auto_finish_completed=0 diff --git a/scripts/install-vscode-active-agents-extension.js b/scripts/install-vscode-active-agents-extension.js new file mode 100755 index 0000000..9a7647f --- /dev/null +++ b/scripts/install-vscode-active-agents-extension.js @@ -0,0 +1,92 @@ +#!/usr/bin/env node + +const fs = require('node:fs'); +const os = require('node:os'); +const path = require('node:path'); + +function parseOptions(argv) { + const options = {}; + for (let index = 0; index < argv.length; index += 1) { + const token = argv[index]; + if (!token.startsWith('--')) { + throw new Error(`Unexpected argument: ${token}`); + } + const key = token.slice(2); + const value = argv[index + 1]; + if (!value || value.startsWith('--')) { + throw new Error(`Missing value for --${key}`); + } + options[key] = value; + index += 1; + } + return options; +} + +function resolveExtensionSource(repoRoot) { + const candidates = [ + path.join(repoRoot, 'vscode', 'guardex-active-agents'), + path.join(repoRoot, 'templates', 'vscode', 'guardex-active-agents'), + ]; + + for (const candidate of candidates) { + if (fs.existsSync(path.join(candidate, 'package.json'))) { + return candidate; + } + } + + throw new Error('Could not find the Guardex VS Code companion sources.'); +} + +function removeIfExists(targetPath) { + if (fs.existsSync(targetPath)) { + fs.rmSync(targetPath, { recursive: true, force: true }); + } +} + +function main() { + const repoRoot = path.resolve(__dirname, '..'); + const options = parseOptions(process.argv.slice(2)); + const sourceDir = resolveExtensionSource(repoRoot); + const manifest = JSON.parse(fs.readFileSync(path.join(sourceDir, 'package.json'), 'utf8')); + const extensionId = `${manifest.publisher}.${manifest.name}`; + const extensionsDir = path.resolve( + options['extensions-dir'] || + process.env.GUARDEX_VSCODE_EXTENSIONS_DIR || + process.env.VSCODE_EXTENSIONS_DIR 
|| + path.join(os.homedir(), '.vscode', 'extensions'), + ); + + fs.mkdirSync(extensionsDir, { recursive: true }); + const targetDir = path.join(extensionsDir, `${extensionId}-${manifest.version}`); + + for (const entry of fs.readdirSync(extensionsDir, { withFileTypes: true })) { + if (!entry.isDirectory()) { + continue; + } + if (entry.name === path.basename(targetDir)) { + continue; + } + if (entry.name.startsWith(`${extensionId}-`)) { + removeIfExists(path.join(extensionsDir, entry.name)); + } + } + + removeIfExists(targetDir); + fs.cpSync(sourceDir, targetDir, { + recursive: true, + force: true, + preserveTimestamps: true, + }); + + process.stdout.write( + `[guardex-active-agents] Installed ${extensionId}@${manifest.version} to ${targetDir}\n` + + '[guardex-active-agents] Reload the VS Code window to activate the Source Control companion.\n', + ); +} + +try { + main(); +} catch (error) { + process.stderr.write(`[guardex-active-agents] ${error.message}\n`); + process.exitCode = 1; +} diff --git a/templates/scripts/agent-session-state.js b/templates/scripts/agent-session-state.js new file mode 100755 index 0000000..ae65c18 --- /dev/null +++ b/templates/scripts/agent-session-state.js @@ -0,0 +1,110 @@ +#!/usr/bin/env node + +const fs = require('node:fs'); +const path = require('node:path'); + +function resolveSessionSchemaModule() { + const candidates = [ + path.resolve(__dirname, '..', 'vscode', 'guardex-active-agents', 'session-schema.js'), + path.resolve(__dirname, '..', 'templates', 'vscode', 'guardex-active-agents', 'session-schema.js'), + ]; + + for (const candidate of candidates) { + if (fs.existsSync(candidate)) { + return require(candidate); + } + } + + throw new Error('Could not resolve Guardex active-agent session schema module.'); +} + +const sessionSchema = resolveSessionSchemaModule(); + +function usage() { + return ( + 'Usage:\n' + + ' node scripts/agent-session-state.js start --repo --branch --task --agent --worktree --pid --cli \n' + + ' node 
scripts/agent-session-state.js stop --repo --branch \n' + ); +} + +function parseOptions(argv) { + const options = {}; + for (let index = 0; index < argv.length; index += 1) { + const token = argv[index]; + if (!token.startsWith('--')) { + throw new Error(`Unexpected argument: ${token}`); + } + const key = token.slice(2); + const value = argv[index + 1]; + if (!value || value.startsWith('--')) { + throw new Error(`Missing value for --${key}`); + } + options[key] = value; + index += 1; + } + return options; +} + +function requireOption(options, key) { + const value = options[key]; + if (!value) { + throw new Error(`Missing required option --${key}`); + } + return value; +} + +function writeSessionRecord(options) { + const repoRoot = requireOption(options, 'repo'); + const branch = requireOption(options, 'branch'); + const record = sessionSchema.buildSessionRecord({ + repoRoot, + branch, + taskName: requireOption(options, 'task'), + agentName: requireOption(options, 'agent'), + worktreePath: requireOption(options, 'worktree'), + pid: requireOption(options, 'pid'), + cliName: requireOption(options, 'cli'), + }); + + const targetPath = sessionSchema.sessionFilePathForBranch(repoRoot, branch); + fs.mkdirSync(path.dirname(targetPath), { recursive: true }); + fs.writeFileSync(targetPath, `${JSON.stringify(record, null, 2)}\n`, 'utf8'); +} + +function removeSessionRecord(options) { + const repoRoot = requireOption(options, 'repo'); + const branch = requireOption(options, 'branch'); + const targetPath = sessionSchema.sessionFilePathForBranch(repoRoot, branch); + if (fs.existsSync(targetPath)) { + fs.unlinkSync(targetPath); + } +} + +function main() { + const [command, ...rest] = process.argv.slice(2); + if (!command || ['-h', '--help', 'help'].includes(command)) { + process.stdout.write(usage()); + return; + } + + const options = parseOptions(rest); + if (command === 'start') { + writeSessionRecord(options); + return; + } + if (command === 'stop') { + 
removeSessionRecord(options); + return; + } + + throw new Error(`Unknown subcommand: ${command}`); +} + +try { + main(); +} catch (error) { + process.stderr.write(`[guardex-active-session] ${error.message}\n`); + process.stderr.write(usage()); + process.exitCode = 1; +} diff --git a/templates/scripts/codex-agent.sh b/templates/scripts/codex-agent.sh index a8a53c8..8b509dd 100755 --- a/templates/scripts/codex-agent.sh +++ b/templates/scripts/codex-agent.sh @@ -6,6 +6,7 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -143,6 +144,7 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" +active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -446,6 +448,40 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } +run_active_session_state() { + local action="$1" + shift + + if [[ ! -f "$active_session_state_script" ]]; then + return 0 + fi + if ! 
command -v "$NODE_BIN" >/dev/null 2>&1; then + return 0 + fi + + "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true +} + +record_active_session_state() { + local wt="$1" + local branch="$2" + + run_active_session_state \ + start \ + --repo "$repo_root" \ + --branch "$branch" \ + --task "$TASK_NAME" \ + --agent "$AGENT_NAME" \ + --worktree "$wt" \ + --pid "$$" \ + --cli "$CODEX_BIN" +} + +clear_active_session_state() { + local branch="$1" + run_active_session_state stop --repo "$repo_root" --branch "$branch" +} + origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -833,6 +869,19 @@ if ! ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi +active_session_recorded=0 +cleanup_active_session_state_on_exit() { + set +e + if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then + clear_active_session_state "$worktree_branch" + active_session_recorded=0 + fi +} + +record_active_session_state "$worktree_path" "$worktree_branch" +active_session_recorded=1 +trap cleanup_active_session_state_on_exit EXIT INT TERM + echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -841,6 +890,8 @@ codex_exit="$?" 
set -e cd "$repo_root" +cleanup_active_session_state_on_exit +trap - EXIT INT TERM final_exit="$codex_exit" auto_finish_completed=0 diff --git a/templates/scripts/install-vscode-active-agents-extension.js b/templates/scripts/install-vscode-active-agents-extension.js new file mode 100755 index 0000000..9a7647f --- /dev/null +++ b/templates/scripts/install-vscode-active-agents-extension.js @@ -0,0 +1,92 @@ +#!/usr/bin/env node + +const fs = require('node:fs'); +const os = require('node:os'); +const path = require('node:path'); + +function parseOptions(argv) { + const options = {}; + for (let index = 0; index < argv.length; index += 1) { + const token = argv[index]; + if (!token.startsWith('--')) { + throw new Error(`Unexpected argument: ${token}`); + } + const key = token.slice(2); + const value = argv[index + 1]; + if (!value || value.startsWith('--')) { + throw new Error(`Missing value for --${key}`); + } + options[key] = value; + index += 1; + } + return options; +} + +function resolveExtensionSource(repoRoot) { + const candidates = [ + path.join(repoRoot, 'vscode', 'guardex-active-agents'), + path.join(repoRoot, 'templates', 'vscode', 'guardex-active-agents'), + ]; + + for (const candidate of candidates) { + if (fs.existsSync(path.join(candidate, 'package.json'))) { + return candidate; + } + } + + throw new Error('Could not find the Guardex VS Code companion sources.'); +} + +function removeIfExists(targetPath) { + if (fs.existsSync(targetPath)) { + fs.rmSync(targetPath, { recursive: true, force: true }); + } +} + +function main() { + const repoRoot = path.resolve(__dirname, '..'); + const options = parseOptions(process.argv.slice(2)); + const sourceDir = resolveExtensionSource(repoRoot); + const manifest = JSON.parse(fs.readFileSync(path.join(sourceDir, 'package.json'), 'utf8')); + const extensionId = `${manifest.publisher}.${manifest.name}`; + const extensionsDir = path.resolve( + options['extensions-dir'] || + process.env.GUARDEX_VSCODE_EXTENSIONS_DIR || + 
process.env.VSCODE_EXTENSIONS_DIR || + path.join(os.homedir(), '.vscode', 'extensions'), + ); + + fs.mkdirSync(extensionsDir, { recursive: true }); + const targetDir = path.join(extensionsDir, `${extensionId}-${manifest.version}`); + + for (const entry of fs.readdirSync(extensionsDir, { withFileTypes: true })) { + if (!entry.isDirectory()) { + continue; + } + if (entry.name === path.basename(targetDir)) { + continue; + } + if (entry.name.startsWith(`${extensionId}-`)) { + removeIfExists(path.join(extensionsDir, entry.name)); + } + } + + removeIfExists(targetDir); + fs.cpSync(sourceDir, targetDir, { + recursive: true, + force: true, + preserveTimestamps: true, + }); + + process.stdout.write( + `[guardex-active-agents] Installed ${extensionId}@${manifest.version} to ${targetDir}\n` + + '[guardex-active-agents] Reload the VS Code window to activate the Source Control companion.\n', + ); +} + +try { + main(); +} catch (error) { + process.stderr.write(`[guardex-active-agents] ${error.message}\n`); + process.exitCode = 1; +} diff --git a/templates/vscode/guardex-active-agents/README.md b/templates/vscode/guardex-active-agents/README.md new file mode 100644 index 0000000..ea3ffb4 --- /dev/null +++ b/templates/vscode/guardex-active-agents/README.md @@ -0,0 +1,18 @@ +# GitGuardex Active Agents + +Local VS Code companion for Guardex-managed repos. + +What it does: + +- Adds an `Active Agents` view to the Source Control container. +- Renders one row per live Guardex sandbox session. +- Uses VS Code's native animated `loading~spin` icon for the running-state affordance. +- Reads repo-local presence files from `.omx/state/active-sessions/`. + +Install from a Guardex-wired repo: + +```sh +node scripts/install-vscode-active-agents-extension.js +``` + +Then reload the VS Code window. 
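A detail of the install scripts above that is easy to miss: before copying the new version into place, they delete every sibling directory named `extensionId-version` for the same extension, so exactly one installed copy survives. That pruning decision, isolated as pure string filtering (the directory names below are illustrative, not from the patch):

```javascript
// Given the entries of a VS Code extensions directory, decide which
// sibling installs of the same extension to delete: every directory
// named "<extensionId>-<version>" except the version being installed.
function dirsToPrune(entries, extensionId, version) {
  const keep = `${extensionId}-${version}`;
  return entries.filter(
    (name) => name !== keep && name.startsWith(`${extensionId}-`),
  );
}

// Illustrative directory names:
const entries = [
  'recodeee.gitguardex-active-agents-0.0.0',
  'recodeee.gitguardex-active-agents-0.0.1',
  'some.other-extension-1.2.3',
];
console.log(dirsToPrune(entries, 'recodeee.gitguardex-active-agents', '0.0.1'));
```

Unrelated extensions are untouched because the `startsWith` check is anchored on the full `publisher.name-` prefix.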
diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js new file mode 100644 index 0000000..7473f73 --- /dev/null +++ b/templates/vscode/guardex-active-agents/extension.js @@ -0,0 +1,147 @@ +const path = require('node:path'); +const vscode = require('vscode'); +const { formatElapsedFrom, readActiveSessions } = require('./session-schema.js'); + +class InfoItem extends vscode.TreeItem { + constructor(label, description = '') { + super(label, vscode.TreeItemCollapsibleState.None); + this.description = description; + this.iconPath = new vscode.ThemeIcon('info'); + } +} + +class RepoItem extends vscode.TreeItem { + constructor(repoRoot, sessions) { + super(path.basename(repoRoot), vscode.TreeItemCollapsibleState.Expanded); + this.repoRoot = repoRoot; + this.sessions = sessions; + this.description = `${sessions.length} active`; + this.tooltip = repoRoot; + this.iconPath = new vscode.ThemeIcon('repo'); + this.contextValue = 'gitguardex.repo'; + } +} + +class SessionItem extends vscode.TreeItem { + constructor(session) { + super(session.label, vscode.TreeItemCollapsibleState.None); + this.session = session; + this.description = `thinking · ${formatElapsedFrom(session.startedAt)}`; + this.tooltip = [ + session.branch, + `${session.agentName} · ${session.taskName}`, + `Started ${session.startedAt}`, + session.worktreePath, + ].join('\n'); + this.iconPath = new vscode.ThemeIcon('loading~spin'); + this.contextValue = 'gitguardex.session'; + this.command = { + command: 'gitguardex.activeAgents.openWorktree', + title: 'Open Agent Worktree', + arguments: [session], + }; + } +} + +function repoRootFromSessionFile(filePath) { + return path.resolve(path.dirname(filePath), '..', '..', '..'); +} + +class ActiveAgentsProvider { + constructor() { + this.onDidChangeTreeDataEmitter = new vscode.EventEmitter(); + this.onDidChangeTreeData = this.onDidChangeTreeDataEmitter.event; + } + + refresh() { + 
this.onDidChangeTreeDataEmitter.fire(); + } + + async getChildren(element) { + if (element instanceof RepoItem) { + return element.sessions.map((session) => new SessionItem(session)); + } + + const sessionsByRepo = await this.loadSessionsByRepo(); + const repos = [...sessionsByRepo.entries()] + .map(([repoRoot, sessions]) => ({ repoRoot, sessions })) + .filter((entry) => entry.sessions.length > 0) + .sort((left, right) => left.repoRoot.localeCompare(right.repoRoot)); + + if (repos.length === 0) { + return [new InfoItem('No active Guardex agents', 'Start a sandbox session to populate this view.')]; + } + + if (repos.length === 1) { + return repos[0].sessions.map((session) => new SessionItem(session)); + } + + return repos.map((entry) => new RepoItem(entry.repoRoot, entry.sessions)); + } + + async loadSessionsByRepo() { + const sessionFiles = await vscode.workspace.findFiles( + '**/.omx/state/active-sessions/*.json', + '**/{node_modules,.git,.omx/agent-worktrees,.omc/agent-worktrees}/**', + 200, + ); + + const repoRoots = new Set(); + for (const uri of sessionFiles) { + repoRoots.add(repoRootFromSessionFile(uri.fsPath)); + } + + if (repoRoots.size === 0) { + for (const workspaceFolder of vscode.workspace.workspaceFolders || []) { + repoRoots.add(workspaceFolder.uri.fsPath); + } + } + + const sessionsByRepo = new Map(); + for (const repoRoot of repoRoots) { + const sessions = readActiveSessions(repoRoot); + if (sessions.length > 0) { + sessionsByRepo.set(repoRoot, sessions); + } + } + + return sessionsByRepo; + } +} + +function activate(context) { + const provider = new ActiveAgentsProvider(); + const refresh = () => provider.refresh(); + const watcher = vscode.workspace.createFileSystemWatcher('**/.omx/state/active-sessions/*.json'); + const interval = setInterval(refresh, 5_000); + + context.subscriptions.push( + vscode.window.registerTreeDataProvider('gitguardex.activeAgents', provider), + vscode.commands.registerCommand('gitguardex.activeAgents.refresh', refresh), 
+ vscode.commands.registerCommand('gitguardex.activeAgents.openWorktree', async (session) => { + if (!session?.worktreePath) { + return; + } + + await vscode.commands.executeCommand( + 'vscode.openFolder', + vscode.Uri.file(session.worktreePath), + { forceNewWindow: true }, + ); + }), + vscode.workspace.onDidChangeWorkspaceFolders(refresh), + watcher, + { dispose: () => clearInterval(interval) }, + ); + + watcher.onDidCreate(refresh, undefined, context.subscriptions); + watcher.onDidChange(refresh, undefined, context.subscriptions); + watcher.onDidDelete(refresh, undefined, context.subscriptions); +} + +function deactivate() {} + +module.exports = { + activate, + deactivate, +}; diff --git a/templates/vscode/guardex-active-agents/package.json b/templates/vscode/guardex-active-agents/package.json new file mode 100644 index 0000000..354efa2 --- /dev/null +++ b/templates/vscode/guardex-active-agents/package.json @@ -0,0 +1,56 @@ +{ + "name": "gitguardex-active-agents", + "displayName": "GitGuardex Active Agents", + "description": "Shows live Guardex sandbox sessions inside VS Code Source Control.", + "publisher": "recodeee", + "version": "0.0.1", + "license": "MIT", + "engines": { + "vscode": "^1.88.0" + }, + "categories": [ + "Source Control", + "Other" + ], + "activationEvents": [ + "workspaceContains:.omx/state/active-sessions", + "onView:gitguardex.activeAgents" + ], + "main": "./extension.js", + "contributes": { + "commands": [ + { + "command": "gitguardex.activeAgents.refresh", + "title": "Refresh Active Agents" + }, + { + "command": "gitguardex.activeAgents.openWorktree", + "title": "Open Agent Worktree" + } + ], + "views": { + "scm": [ + { + "id": "gitguardex.activeAgents", + "name": "Active Agents" + } + ] + }, + "menus": { + "view/title": [ + { + "command": "gitguardex.activeAgents.refresh", + "when": "view == gitguardex.activeAgents", + "group": "navigation" + } + ], + "view/item/context": [ + { + "command": "gitguardex.activeAgents.openWorktree", + "when": 
"view == gitguardex.activeAgents && viewItem == gitguardex.session", + "group": "inline" + } + ] + } + } +} diff --git a/templates/vscode/guardex-active-agents/session-schema.js b/templates/vscode/guardex-active-agents/session-schema.js new file mode 100644 index 0000000..eed9851 --- /dev/null +++ b/templates/vscode/guardex-active-agents/session-schema.js @@ -0,0 +1,203 @@ +const fs = require('node:fs'); +const path = require('node:path'); + +const ACTIVE_SESSIONS_RELATIVE_DIR = path.join('.omx', 'state', 'active-sessions'); +const SESSION_SCHEMA_VERSION = 1; + +function toNonEmptyString(value, fallback = '') { + const normalized = typeof value === 'string' ? value.trim() : String(value || '').trim(); + return normalized || fallback; +} + +function toPositiveInteger(value) { + const normalized = Number.parseInt(String(value || ''), 10); + return Number.isInteger(normalized) && normalized > 0 ? normalized : null; +} + +function sanitizeBranchForFile(branch) { + const normalized = toNonEmptyString(branch, 'session'); + return normalized.replace(/[^a-zA-Z0-9._-]+/g, '__').replace(/^_+|_+$/g, '') || 'session'; +} + +function sessionFileNameForBranch(branch) { + return `${sanitizeBranchForFile(branch)}.json`; +} + +function activeSessionsDirForRepo(repoRoot) { + return path.join(path.resolve(repoRoot), ACTIVE_SESSIONS_RELATIVE_DIR); +} + +function sessionFilePathForBranch(repoRoot, branch) { + return path.join(activeSessionsDirForRepo(repoRoot), sessionFileNameForBranch(branch)); +} + +function buildSessionRecord(input) { + const repoRoot = path.resolve(toNonEmptyString(input.repoRoot)); + const worktreePath = path.resolve(toNonEmptyString(input.worktreePath)); + const branch = toNonEmptyString(input.branch); + const pid = toPositiveInteger(input.pid); + const startedAt = input.startedAt ? 
new Date(input.startedAt) : new Date(); + + if (!branch) { + throw new Error('branch is required'); + } + if (!repoRoot) { + throw new Error('repoRoot is required'); + } + if (!worktreePath) { + throw new Error('worktreePath is required'); + } + if (!pid) { + throw new Error('pid must be a positive integer'); + } + if (Number.isNaN(startedAt.getTime())) { + throw new Error('startedAt must be a valid date'); + } + + return { + schemaVersion: SESSION_SCHEMA_VERSION, + repoRoot, + branch, + taskName: toNonEmptyString(input.taskName, 'task'), + agentName: toNonEmptyString(input.agentName, 'agent'), + worktreePath, + pid, + cliName: toNonEmptyString(input.cliName, 'codex'), + startedAt: startedAt.toISOString(), + }; +} + +function deriveSessionLabel(branch, worktreePath) { + const worktreeLeaf = toNonEmptyString(path.basename(worktreePath || '')); + if (worktreeLeaf) { + return worktreeLeaf; + } + return toNonEmptyString(branch).replace(/[\\/]+/g, '-') || 'unknown-agent'; +} + +function normalizeSessionRecord(input, options = {}) { + if (!input || typeof input !== 'object') { + return null; + } + + const repoRoot = toNonEmptyString(input.repoRoot); + const branch = toNonEmptyString(input.branch); + const worktreePath = toNonEmptyString(input.worktreePath); + const startedAt = new Date(input.startedAt); + const pid = toPositiveInteger(input.pid); + + if (!repoRoot || !branch || !worktreePath || !pid || Number.isNaN(startedAt.getTime())) { + return null; + } + + return { + schemaVersion: toPositiveInteger(input.schemaVersion) || SESSION_SCHEMA_VERSION, + repoRoot: path.resolve(repoRoot), + branch, + taskName: toNonEmptyString(input.taskName, 'task'), + agentName: toNonEmptyString(input.agentName, 'agent'), + worktreePath: path.resolve(worktreePath), + pid, + cliName: toNonEmptyString(input.cliName, 'codex'), + startedAt: startedAt.toISOString(), + filePath: toNonEmptyString(options.filePath), + label: deriveSessionLabel(branch, worktreePath), + }; +} + +function 
formatElapsedFrom(startedAt, now = Date.now()) { + const startedAtMs = startedAt instanceof Date ? startedAt.getTime() : Date.parse(startedAt); + if (!Number.isFinite(startedAtMs)) { + return '0s'; + } + + const totalSeconds = Math.max(0, Math.floor((now - startedAtMs) / 1000)); + const days = Math.floor(totalSeconds / 86_400); + const hours = Math.floor((totalSeconds % 86_400) / 3_600); + const minutes = Math.floor((totalSeconds % 3_600) / 60); + const seconds = totalSeconds % 60; + + if (days > 0) { + return `${days}d ${hours}h`; + } + if (hours > 0) { + return `${hours}h ${minutes}m`; + } + if (minutes > 0) { + return `${minutes}m ${seconds}s`; + } + return `${seconds}s`; +} + +function isPidAlive(pid) { + const normalizedPid = toPositiveInteger(pid); + if (!normalizedPid) { + return false; + } + + try { + process.kill(normalizedPid, 0); + return true; + } catch (_error) { + return false; + } +} + +function readActiveSessions(repoRoot, options = {}) { + const activeSessionsDir = activeSessionsDirForRepo(repoRoot); + if (!fs.existsSync(activeSessionsDir)) { + return []; + } + + const now = options.now || Date.now(); + const sessions = []; + for (const entry of fs.readdirSync(activeSessionsDir, { withFileTypes: true })) { + if (!entry.isFile() || !entry.name.endsWith('.json')) { + continue; + } + + const filePath = path.join(activeSessionsDir, entry.name); + let parsed; + try { + parsed = JSON.parse(fs.readFileSync(filePath, 'utf8')); + } catch (_error) { + continue; + } + + const normalized = normalizeSessionRecord(parsed, { filePath }); + if (!normalized) { + continue; + } + if (!options.includeStale && !isPidAlive(normalized.pid)) { + continue; + } + + normalized.elapsedLabel = formatElapsedFrom(normalized.startedAt, now); + sessions.push(normalized); + } + + sessions.sort((left, right) => { + const timeDelta = Date.parse(right.startedAt) - Date.parse(left.startedAt); + if (timeDelta !== 0) { + return timeDelta; + } + return 
left.label.localeCompare(right.label); + }); + + return sessions; +} + +module.exports = { + ACTIVE_SESSIONS_RELATIVE_DIR, + SESSION_SCHEMA_VERSION, + activeSessionsDirForRepo, + buildSessionRecord, + deriveSessionLabel, + formatElapsedFrom, + isPidAlive, + normalizeSessionRecord, + readActiveSessions, + sanitizeBranchForFile, + sessionFileNameForBranch, + sessionFilePathForBranch, +}; diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js new file mode 100644 index 0000000..9922076 --- /dev/null +++ b/test/vscode-active-agents-session-state.test.js @@ -0,0 +1,134 @@ +const test = require('node:test'); +const assert = require('node:assert/strict'); +const fs = require('node:fs'); +const os = require('node:os'); +const path = require('node:path'); +const cp = require('node:child_process'); + +const repoRoot = path.resolve(__dirname, '..'); +const sessionScript = path.join(repoRoot, 'scripts', 'agent-session-state.js'); +const installScript = path.join(repoRoot, 'scripts', 'install-vscode-active-agents-extension.js'); +const sessionSchema = require(path.join( + repoRoot, + 'templates', + 'vscode', + 'guardex-active-agents', + 'session-schema.js', +)); + +function runNode(scriptPath, args, options = {}) { + return cp.spawnSync('node', [scriptPath, ...args], { + encoding: 'utf8', + ...options, + }); +} + +test('agent-session-state writes and removes active session records', () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-active-session-')); + const branch = 'agent/codex/demo-task'; + const worktreePath = path.join(tempRoot, '.omx', 'agent-worktrees', 'agent__codex__demo-task'); + fs.mkdirSync(worktreePath, { recursive: true }); + + const start = runNode(sessionScript, [ + 'start', + '--repo', + tempRoot, + '--branch', + branch, + '--task', + 'demo-task', + '--agent', + 'codex', + '--worktree', + worktreePath, + '--pid', + String(process.pid), + '--cli', + 'codex', + ]); + 
assert.equal(start.status, 0, start.stderr); + + const sessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, branch); + assert.equal(path.basename(sessionPath), 'agent__codex__demo-task.json'); + assert.equal(fs.existsSync(sessionPath), true); + + const parsed = JSON.parse(fs.readFileSync(sessionPath, 'utf8')); + assert.equal(parsed.branch, branch); + assert.equal(parsed.taskName, 'demo-task'); + assert.equal(parsed.agentName, 'codex'); + assert.equal(parsed.worktreePath, worktreePath); + + const sessions = sessionSchema.readActiveSessions(tempRoot); + assert.equal(sessions.length, 1); + assert.equal(sessions[0].label, 'agent__codex__demo-task'); + + const stop = runNode(sessionScript, [ + 'stop', + '--repo', + tempRoot, + '--branch', + branch, + ]); + assert.equal(stop.status, 0, stop.stderr); + assert.equal(fs.existsSync(sessionPath), false); +}); + +test('session-schema ignores stale or invalid session records', () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-active-session-stale-')); + const activeSessionsDir = sessionSchema.activeSessionsDirForRepo(tempRoot); + fs.mkdirSync(activeSessionsDir, { recursive: true }); + + const liveRecord = sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/live-task', + taskName: 'live-task', + agentName: 'codex', + worktreePath: path.join(tempRoot, '.omx', 'agent-worktrees', 'live-task'), + pid: process.pid, + cliName: 'codex', + }); + fs.writeFileSync( + sessionSchema.sessionFilePathForBranch(tempRoot, liveRecord.branch), + `${JSON.stringify(liveRecord, null, 2)}\n`, + 'utf8', + ); + + const staleRecord = sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/stale-task', + taskName: 'stale-task', + agentName: 'codex', + worktreePath: path.join(tempRoot, '.omx', 'agent-worktrees', 'stale-task'), + pid: 999999, + cliName: 'codex', + }); + fs.writeFileSync( + sessionSchema.sessionFilePathForBranch(tempRoot, staleRecord.branch), + 
`${JSON.stringify(staleRecord, null, 2)}\n`, + 'utf8', + ); + fs.writeFileSync(path.join(activeSessionsDir, 'broken.json'), '{broken json', 'utf8'); + + const sessions = sessionSchema.readActiveSessions(tempRoot); + assert.equal(sessions.length, 1); + assert.equal(sessions[0].branch, liveRecord.branch); +}); + +test('install-vscode-active-agents-extension installs the current extension version and prunes older copies', () => { + const tempExtensionsDir = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-ext-')); + const staleDir = path.join(tempExtensionsDir, 'recodeee.gitguardex-active-agents-0.0.0'); + fs.mkdirSync(staleDir, { recursive: true }); + fs.writeFileSync(path.join(staleDir, 'stale.txt'), 'old', 'utf8'); + + const result = runNode(installScript, ['--extensions-dir', tempExtensionsDir], { + cwd: repoRoot, + }); + assert.equal(result.status, 0, result.stderr); + + const installedDir = path.join(tempExtensionsDir, 'recodeee.gitguardex-active-agents-0.0.1'); + assert.equal(fs.existsSync(installedDir), true); + assert.equal(fs.existsSync(path.join(installedDir, 'extension.js')), true); + assert.equal(fs.existsSync(path.join(installedDir, 'session-schema.js')), true); + assert.equal(fs.existsSync(staleDir), false); + assert.match(result.stdout, /Reload the VS Code window/); +}); From 4c74aadd964300be64e426b9e3b048cee7fbcc53 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 17:57:08 +0200 Subject: [PATCH 15/48] Enhance README with sidebar options and install section (#249) Added options for GitHub About sidebar description and installation instructions. 
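The stale-record test above (pid 999999) leans on a zero-signal liveness probe. A minimal sketch of that check, matching `isPidAlive` in `session-schema.js`:

```javascript
// Zero-signal liveness probe: process.kill(pid, 0) delivers no signal,
// it only errors if the pid is unreachable. Any failure (ESRCH, and
// also EPERM for another user's live process) is treated as "not
// alive", matching the schema module's conservative behavior.
function isPidAlive(pid) {
  if (!Number.isInteger(pid) || pid <= 0) {
    return false;
  }
  try {
    process.kill(pid, 0);
    return true;
  } catch (_error) {
    return false;
  }
}

console.log(isPidAlive(process.pid)); // the current process is always alive
console.log(isPidAlive(-1));          // rejected before probing
```

The EPERM caveat means sessions owned by a different OS user would be filtered as stale; that is acceptable here because the presence files are repo-local and written by the same user who runs the agents.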
--- README.md | 61 +++++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 61 insertions(+) diff --git a/README.md b/README.md index 4151e14..31c8608 100644 --- a/README.md +++ b/README.md @@ -42,6 +42,67 @@ flowchart LR I --> S ``` + + + + + + + + + + + + +Option A — leads with the tagline (matches README h1): + Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, + and PR-only merges stop parallel Codex & Claude agents from overwriting + each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and + Caveman. + +Option B — leads with the benefit: + Run many Codex & Claude agents in parallel without them overwriting each + other. Isolated worktrees, file locks, PR-only merges. Auto-wires Oh My + Codex, Oh My Claude, OpenSpec, and Caveman in every worktree. + +Option C — punchiest, under 200 chars: + Safety layer for parallel Codex & Claude agents. Isolated worktrees + + file locks + PR-only merges. Auto-wires Oh My Codex, Oh My Claude, + OpenSpec, and Caveman. + + + + + + + + + + + +

+<!-- HTML hero block; markup lost in extraction. Recoverable copy: an
+     "Install GitGuardex" banner image, an "Install in one line" heading,
+     the install command, a gx setup pointer, and npm / downloads / stars
+     badges. -->
+
+Install in one line
+
+```bash
+npm i -g @imdeadpool/guardex
+```
+
+Then cd into your repo and run gx setup — hooks, scripts, templates,
+and OMX / OpenSpec / caveman wiring all scaffold in one go.

+ ### Dashboard ![Multi-agent dashboard example](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/dashboard-multi-agent.png) From ef319dfb20789e2e07d0235e54e90e9ed3534b28 Mon Sep 17 00:00:00 2001 From: NagyVikt Date: Tue, 21 Apr 2026 17:58:33 +0200 Subject: [PATCH 16/48] new --- docs/images/install-hero.svg | 40 ++++++++++++++++++++++++++++++++++++ 1 file changed, 40 insertions(+) create mode 100644 docs/images/install-hero.svg diff --git a/docs/images/install-hero.svg b/docs/images/install-hero.svg new file mode 100644 index 0000000..1739d5f --- /dev/null +++ b/docs/images/install-hero.svg @@ -0,0 +1,40 @@ + +GitGuardex install terminal card +A stylized terminal window showing the install command, its success output, and a next-step command with a blinking cursor. + + + + + + + +~/my-repo — zsh + + +npm i -g +@imdeadpool/guardex + +added 47 packages in 3.2s + ++ @imdeadpool/guardex@ +7.0.16 + + + +cd ~/my-repo && +gx setup + +✓ hooks installed +✓ scripts scaffolded + + + + + + + + From e563bab021a6c274e5f36fb9d0e8b46021408014 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:15:13 +0200 Subject: [PATCH 17/48] Clarify README problem and About copy surfaces (#250) Move the collision visual into The problem, move the Guardex workflow visual into Solution, and replace the three-option About copy dump with a single source file plus README pointer. This keeps the top of the README readable while preserving a copyable GitHub About description in-repo. 
Constraint: README needs one canonical GitHub About description instead of multiple alternatives Rejected: Keep all three About-copy options inline | adds noise and leaves no single source of truth Confidence: high Scope-risk: narrow Directive: Keep GitHub About copy in about_description.txt and mirror it from README instead of adding new inline variants Tested: git diff --check; openspec validate agent-codex-readme-about-description-2026-04-21-18-05 --type change --strict; openspec validate --specs (no items found) Not-tested: npm test times out after four passing subtests; npm run agent:safety:scan fails because gx status rejects --strict Co-authored-by: NagyVikt --- README.md | 52 +++------- about_description.txt | 1 + docs/images/problem-agent-collision.svg | 98 +++++++++++++++++++ .../.openspec.yaml | 2 + .../notes.md | 5 + .../proposal.md | 15 +++ .../specs/readme-about-description/spec.md | 22 +++++ .../tasks.md | 19 ++++ 8 files changed, 173 insertions(+), 41 deletions(-) create mode 100644 about_description.txt create mode 100644 docs/images/problem-agent-collision.svg create mode 100644 openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/.openspec.yaml create mode 100644 openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/notes.md create mode 100644 openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/proposal.md create mode 100644 openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/specs/readme-about-description/spec.md create mode 100644 openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/tasks.md diff --git a/README.md b/README.md index 31c8608..1026b94 100644 --- a/README.md +++ b/README.md @@ -22,53 +22,23 @@ ## The problem -I was running ~30 Codex agents in parallel and hit a wall: they kept working on the same files at the same time — especially tests — and started overwriting or deleting each other's changes. 
More agents meant *less* forward progress, not more. Classic de-progressive loop. +![Parallel agents colliding in the same files](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/problem-agent-collision.svg) -GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. +I was running ~30 Codex agents in parallel and hit a wall: they kept working on the same files at the same time — especially tests — and started overwriting or deleting each other's changes. More agents meant *less* forward progress, not more. Classic de-progressive loop. ### Solution -```mermaid -flowchart LR - A[Agent A adds assertions in a shared test] --> S[Several agents touch the same files] - B[Agent B rewrites the same test flow] --> S - C[Agent C updates the shared helper] --> S - D[Agent D deletes lines Agent A just added] --> S - E[Agent E saves an older snapshot of the file] --> S - S --> F[One agent overwrites another agent's edits] - F --> G[Another agent deletes code the others just added] - G --> H[Lost work, rework, and review confusion] - H --> I[Regression risk and flaky fixes grow] - I --> S -``` +![Agent branch/worktree start protocol](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-branch-start.svg) - - - - - - - - - - - +GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. + +## GitHub About description -Option A — leads with the tagline (matches README h1): - Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, - and PR-only merges stop parallel Codex & Claude agents from overwriting - each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and - Caveman. 
- -Option B — leads with the benefit: - Run many Codex & Claude agents in parallel without them overwriting each - other. Isolated worktrees, file locks, PR-only merges. Auto-wires Oh My - Codex, Oh My Claude, OpenSpec, and Caveman in every worktree. - -Option C — punchiest, under 200 chars: - Safety layer for parallel Codex & Claude agents. Isolated worktrees + - file locks + PR-only merges. Auto-wires Oh My Codex, Oh My Claude, - OpenSpec, and Caveman. +The canonical GitHub About copy lives in [about_description.txt](./about_description.txt): + +```text +Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. +``` diff --git a/about_description.txt b/about_description.txt new file mode 100644 index 0000000..aca3383 --- /dev/null +++ b/about_description.txt @@ -0,0 +1 @@ +Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. diff --git a/docs/images/problem-agent-collision.svg b/docs/images/problem-agent-collision.svg new file mode 100644 index 0000000..b8e82b6 --- /dev/null +++ b/docs/images/problem-agent-collision.svg @@ -0,0 +1,98 @@ + + parallel agent collision loop + Five agents touch the same files, one overwrite leads to deletion, lost work, and a regression loop. 
+ + + + + + + + + + + + + + + + + + Agent A adds assertions + in a shared test + + + + + Agent B rewrites + the same flow + + + + + Agent C updates + the shared helper + + + + + Agent D deletes lines + Agent A just added + + + + + Agent E saves + an older snapshot + + + + + + Several agents touch + the same files + + + + + One agent overwrites + another agent's edits + + + + + Another agent deletes + code the others added + + + + + Lost work, rework, + and review confusion + + + + + Regression risk + and flaky fixes grow + + + + + + + + + + + + + + + Five agents touch the same files. One overwrite turns into deletion, lost work, and a regression loop. + diff --git a/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/.openspec.yaml b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/notes.md b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/notes.md new file mode 100644 index 0000000..db664f7 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/notes.md @@ -0,0 +1,5 @@ +# T1 Notes + +- Handoff: own `README.md`, `about_description.txt`, `docs/images/problem-agent-collision.svg`, and this change note. +- Replace the README About-options dump with one canonical `about_description.txt` source and a short README section that points to it. +- Move the collision visual under `## The problem`, move the solution copy under `### Solution`, and add the existing branch-start visual under the solution heading. 
diff --git a/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/proposal.md b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/proposal.md new file mode 100644 index 0000000..8fb2413 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/proposal.md @@ -0,0 +1,15 @@ +## Why + +- The README currently mixes three alternative GitHub About descriptions into the product narrative, which adds noise near the top of the page. +- The user requested a single canonical `about_description` source and a clearer visual split between the problem statement and the solution. + +## What Changes + +- Add a root `about_description.txt` file with the canonical GitHub About copy and point the README at it. +- Move the collision visual under `## The problem` and place the existing branch-start workflow image under `### Solution`. +- Replace the README option dump with a short "GitHub About description" section that mirrors the canonical copy. + +## Impact + +- Affected surfaces: top-level `README.md`, one new docs image, one new root copy file, and the matching OpenSpec change record. +- Risk is low and limited to docs/rendered README structure. diff --git a/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/specs/readme-about-description/spec.md b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/specs/readme-about-description/spec.md new file mode 100644 index 0000000..d14e9eb --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/specs/readme-about-description/spec.md @@ -0,0 +1,22 @@ +## ADDED Requirements + +### Requirement: README problem and solution sections use separate visuals +The README SHALL place the multi-agent collision visual directly under `## The problem` and SHALL place a Guardex solution visual directly under `### Solution`. 
+ +#### Scenario: problem visual appears before the collision narrative +- **WHEN** a reader opens the top-level README +- **THEN** the `## The problem` heading is followed by the collision visual +- **AND** the narrative about parallel agents overwriting each other appears below that visual. + +#### Scenario: solution visual appears under the solution heading +- **WHEN** a reader reaches the `### Solution` section +- **THEN** the README shows a Guardex workflow image directly under that heading +- **AND** the solution copy appears below the image. + +### Requirement: README points to one canonical GitHub About description source +The repo SHALL keep one canonical GitHub About description in `about_description.txt`, and the README SHALL reference that file instead of listing multiple About-copy options. + +#### Scenario: canonical About copy is linked from the README +- **WHEN** a reader opens the `## GitHub About description` section +- **THEN** the README links to `about_description.txt` +- **AND** the section mirrors the same canonical copy in a copyable text block. diff --git a/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/tasks.md b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/tasks.md new file mode 100644 index 0000000..0c5707d --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-2026-04-21-18-05/tasks.md @@ -0,0 +1,19 @@ +## 1. Specification + +- [x] 1.1 Capture the README/about-copy scope in proposal + spec artifacts. + +## 2. Implementation + +- [x] 2.1 Add the canonical `about_description.txt` source file. +- [x] 2.2 Update `README.md` so `The problem` and `Solution` each have the intended visual placement. +- [x] 2.3 Add the static collision visual used by `The problem`. + +## 3. Verification + +- [x] 3.1 Run `git diff --check`. +- [x] 3.2 Run `openspec validate agent-codex-readme-about-description-2026-04-21-18-05 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. 
+ +## 4. Cleanup + +- [ ] 4.1 Branch is ready for `agent-branch-finish --via-pr --wait-for-merge --cleanup` with PR URL + `MERGED` evidence recorded. From b4869cac6be4e75ef54bd4f336e3a9e16b1f0d95 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:18:35 +0200 Subject: [PATCH 18/48] Restore Active Agents view registration in VS Code (#251) The Active Agents provider was missing getTreeItem, so VS Code crashed the SCM view during render even when session records were present. This restores the required provider method in the installable template bundle and locks the failure mode with a mocked extension-host regression test. Constraint: This branch installs the VS Code companion from templates/vscode/guardex-active-agents Rejected: Reordering installer source precedence | broader behavior change than needed for the reported crash Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep the provider contract test aligned with any future extension activation refactors Tested: node --test test/vscode-active-agents-session-state.test.js Tested: openspec validate agent-codex-demo-vscode-active-agents-2026-04-21-18-04 --type change --strict Tested: openspec validate --specs Not-tested: Live VS Code extension host after window reload Co-authored-by: NagyVikt --- .../.openspec.yaml | 2 + .../proposal.md | 14 +++ .../specs/demo-vscode-active-agents/spec.md | 14 +++ .../tasks.md | 29 +++++ .../vscode/guardex-active-agents/extension.js | 4 + ...vscode-active-agents-session-state.test.js | 110 ++++++++++++++++++ 6 files changed, 173 insertions(+) create mode 100644 openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/.openspec.yaml create mode 100644 openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/proposal.md create mode 100644 openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/specs/demo-vscode-active-agents/spec.md create mode 100644 
openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/tasks.md diff --git a/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/.openspec.yaml b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/proposal.md b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/proposal.md new file mode 100644 index 0000000..6c368d9 --- /dev/null +++ b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/proposal.md @@ -0,0 +1,14 @@ +## Why + +- The Guardex Active Agents VS Code companion currently throws `this._dataProvider.getTreeItem is not a function` at render time. +- That crash keeps the Source Control view from showing live sandbox sessions even when `.omx/state/active-sessions/*.json` exists. + +## What Changes + +- Add the missing `getTreeItem` implementation to the Active Agents tree provider in the extension bundle this branch installs. +- Add focused regression coverage that activates the extension against a mocked VS Code host and asserts the provider satisfies the tree-data contract. + +## Impact + +- Affected surfaces: `templates/vscode/guardex-active-agents/extension.js`, `test/vscode-active-agents-session-state.test.js`, and this change workspace. +- Risk is narrow because the fix only restores the provider method VS Code already expects and the new test covers the exact failure mode. 
diff --git a/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/specs/demo-vscode-active-agents/spec.md b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/specs/demo-vscode-active-agents/spec.md new file mode 100644 index 0000000..edff5d2 --- /dev/null +++ b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/specs/demo-vscode-active-agents/spec.md @@ -0,0 +1,14 @@ +## ADDED Requirements + +### Requirement: Active Agents tree provider satisfies the VS Code contract +The Guardex Active Agents extension SHALL register a tree data provider that implements the tree item resolution contract required by `gitguardex.activeAgents`. + +#### Scenario: Provider exposes tree item resolution +- **WHEN** the Active Agents extension activates +- **THEN** the registered provider exposes a callable `getTreeItem` +- **AND** rendering tree items does not throw `this._dataProvider.getTreeItem is not a function`. + +#### Scenario: Regression coverage locks the provider contract +- **WHEN** the extension is exercised under the repo test harness with a mocked VS Code host +- **THEN** activation succeeds +- **AND** the regression test fails if the provider is registered without `getTreeItem`. diff --git a/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/tasks.md b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/tasks.md new file mode 100644 index 0000000..12a20ba --- /dev/null +++ b/openspec/changes/agent-codex-demo-vscode-active-agents-2026-04-21-18-04/tasks.md @@ -0,0 +1,29 @@ +## Definition of Done + +This change is complete only when **all** of the following are true: + +- Every checkbox below is checked. +- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff. +- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. 
Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. + +## 1. Specification + +- [x] 1.1 Capture the Active Agents VS Code failure mode and acceptance criteria for `agent-codex-demo-vscode-active-agents-2026-04-21-18-04`. +- [x] 1.2 Define the tree-provider requirements in `specs/demo-vscode-active-agents/spec.md`. + +## 2. Implementation + +- [x] 2.1 Add the missing `getTreeItem` implementation to the Active Agents extension bundle this branch installs. +- [x] 2.2 Add focused regression coverage for extension activation/provider registration. + +## 3. Verification + +- [x] 3.1 Run `node --test test/vscode-active-agents-session-state.test.js`. +- [x] 3.2 Run `openspec validate agent-codex-demo-vscode-active-agents-2026-04-21-18-04 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. + +## 4. Cleanup (mandatory; run before claiming completion) + +- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent// --base dev --via-pr --wait-for-merge --cleanup`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch). 
diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js index 7473f73..7b340f5 100644 --- a/templates/vscode/guardex-active-agents/extension.js +++ b/templates/vscode/guardex-active-agents/extension.js @@ -53,6 +53,10 @@ class ActiveAgentsProvider { this.onDidChangeTreeData = this.onDidChangeTreeDataEmitter.event; } + getTreeItem(element) { + return element; + } + refresh() { this.onDidChangeTreeDataEmitter.fire(); } diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js index 9922076..4d89e5f 100644 --- a/test/vscode-active-agents-session-state.test.js +++ b/test/vscode-active-agents-session-state.test.js @@ -15,6 +15,7 @@ const sessionSchema = require(path.join( 'guardex-active-agents', 'session-schema.js', )); +const extensionEntry = path.join(repoRoot, 'templates', 'vscode', 'guardex-active-agents', 'extension.js'); function runNode(scriptPath, args, options = {}) { return cp.spawnSync('node', [scriptPath, ...args], { @@ -23,6 +24,92 @@ function runNode(scriptPath, args, options = {}) { }); } +function loadExtensionWithMockVscode(mockVscode) { + const Module = require('node:module'); + const originalLoad = Module._load; + delete require.cache[require.resolve(extensionEntry)]; + + Module._load = function patchedModuleLoad(request, parent, isMain) { + if (request === 'vscode') { + return mockVscode; + } + return originalLoad.call(this, request, parent, isMain); + }; + + try { + return require(extensionEntry); + } finally { + Module._load = originalLoad; + } +} + +function createMockVscode(tempRoot) { + const registrations = { + providers: [], + }; + + class TreeItem { + constructor(label, collapsibleState) { + this.label = label; + this.collapsibleState = collapsibleState; + } + } + + class ThemeIcon { + constructor(id) { + this.id = id; + } + } + + class EventEmitter { + constructor() { + this.event = () => {}; + } + + fire() {} + } + + const 
disposable = () => ({ dispose() {} }); + const fileWatcher = { + onDidCreate() {}, + onDidChange() {}, + onDidDelete() {}, + dispose() {}, + }; + + return { + registrations, + vscode: { + TreeItem, + ThemeIcon, + EventEmitter, + TreeItemCollapsibleState: { + None: 0, + Expanded: 1, + }, + commands: { + executeCommand: async () => {}, + registerCommand: () => disposable(), + }, + Uri: { + file: (fsPath) => ({ fsPath }), + }, + window: { + registerTreeDataProvider: (viewId, provider) => { + registrations.providers.push({ viewId, provider }); + return disposable(); + }, + }, + workspace: { + createFileSystemWatcher: () => fileWatcher, + findFiles: async () => [], + onDidChangeWorkspaceFolders: () => disposable(), + workspaceFolders: [{ uri: { fsPath: tempRoot } }], + }, + }, + }; +} + test('agent-session-state writes and removes active session records', () => { const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-active-session-')); const branch = 'agent/codex/demo-task'; @@ -132,3 +219,26 @@ test('install-vscode-active-agents-extension installs the current extension vers assert.equal(fs.existsSync(staleDir), false); assert.match(result.stdout, /Reload the VS Code window/); }); + +test('active-agents extension registers a provider with getTreeItem', async () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-view-')); + const { registrations, vscode } = createMockVscode(tempRoot); + const extension = loadExtensionWithMockVscode(vscode); + const context = { subscriptions: [] }; + + extension.activate(context); + + assert.equal(registrations.providers.length, 1); + assert.equal(registrations.providers[0].viewId, 'gitguardex.activeAgents'); + + const provider = registrations.providers[0].provider; + assert.equal(typeof provider.getTreeItem, 'function'); + + const [rootItem] = await provider.getChildren(); + assert.equal(rootItem.label, 'No active Guardex agents'); + assert.equal(provider.getTreeItem(rootItem), rootItem); + + for 
(const subscription of context.subscriptions) { + subscription.dispose?.(); + } +}); From 3b008f511e52ab101c39ee6251b646c0863189a1 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:21:56 +0200 Subject: [PATCH 19/48] Protect README About copy with regression coverage (#252) Add one metadata test that locks the merged README problem/solution visual placement and ensures the README mirrors the canonical about_description.txt copy. This keeps the docs contract from drifting without adding broader test churn. Constraint: Follow-up must stay test-only and scoped to the merged README/about-description contract Rejected: Re-run the entire install.test.js suite to prove this docs change | unrelated long-running suite still times out without incremental output Confidence: high Scope-risk: narrow Directive: If the README About copy changes again, update about_description.txt first and keep this metadata test aligned with the same source Tested: node --test test/metadata.test.js; openspec validate agent-codex-readme-about-description-regression-test-2026-04-21-18-20 --type change --strict; openspec validate --specs; git diff --check Not-tested: full npm test aggregate run still contains a long-running install.test.js file that timed out during ad hoc verification Co-authored-by: NagyVikt --- .../.openspec.yaml | 2 ++ .../notes.md | 5 +++++ .../proposal.md | 17 +++++++++++++++++ .../readme-about-description-regression/spec.md | 11 +++++++++++ .../tasks.md | 17 +++++++++++++++++ test/metadata.test.js | 17 +++++++++++++++++ 6 files changed, 69 insertions(+) create mode 100644 openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/.openspec.yaml create mode 100644 openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/notes.md create mode 100644 openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/proposal.md create mode 
100644 openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/specs/readme-about-description-regression/spec.md create mode 100644 openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/tasks.md diff --git a/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/.openspec.yaml b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/notes.md b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/notes.md new file mode 100644 index 0000000..98258bb --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/notes.md @@ -0,0 +1,5 @@ +# T1 Notes + +- Handoff: own `test/metadata.test.js` and this change record. +- Add focused regression coverage for the merged README/about-description contract. +- Keep the change test-only: assert the problem visual, solution visual, README link to `about_description.txt`, and mirrored canonical copy. diff --git a/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/proposal.md b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/proposal.md new file mode 100644 index 0000000..8041955 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/proposal.md @@ -0,0 +1,17 @@ +## Why + +- The merged README/about-description update is currently verified by manual inspection and OpenSpec, but not by a focused regression test. 
+- A narrow metadata test is enough to catch accidental drift in the README section layout or canonical About copy reference. + +## What Changes + +- Add one metadata test that checks: + - the collision visual stays under `## The problem` + - the branch-start visual stays under `### Solution` + - the README links to `about_description.txt` + - the canonical About copy in the README matches `about_description.txt` + +## Impact + +- Affected surface: `test/metadata.test.js` and the matching OpenSpec change docs. +- Risk is low and limited to test coverage. diff --git a/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/specs/readme-about-description-regression/spec.md b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/specs/readme-about-description-regression/spec.md new file mode 100644 index 0000000..a34e658 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/specs/readme-about-description-regression/spec.md @@ -0,0 +1,11 @@ +## ADDED Requirements + +### Requirement: metadata tests protect the canonical README About copy contract +The repo SHALL keep focused regression coverage that verifies the README problem/solution visual placement and the canonical `about_description.txt` mirror. + +#### Scenario: metadata test locks the merged README structure +- **WHEN** the metadata test suite runs +- **THEN** it verifies the collision visual appears under `## The problem` +- **AND** it verifies the branch-start visual appears under `### Solution` +- **AND** it verifies the README links to `about_description.txt` +- **AND** it verifies the README mirrors the same canonical About copy stored in `about_description.txt`. 
diff --git a/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/tasks.md b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/tasks.md new file mode 100644 index 0000000..2cccb11 --- /dev/null +++ b/openspec/changes/agent-codex-readme-about-description-regression-test-2026-04-21-18-20/tasks.md @@ -0,0 +1,17 @@ +## 1. Specification + +- [x] 1.1 Capture the regression-test scope in proposal + spec artifacts. + +## 2. Implementation + +- [x] 2.1 Add focused metadata coverage for the README/about-description contract. + +## 3. Verification + +- [x] 3.1 Run `node --test test/metadata.test.js`. +- [x] 3.2 Run `openspec validate agent-codex-readme-about-description-regression-test-2026-04-21-18-20 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. + +## 4. Cleanup + +- [ ] 4.1 Branch is ready for `agent-branch-finish --via-pr --wait-for-merge --cleanup` with PR URL + `MERGED` evidence recorded. diff --git a/test/metadata.test.js b/test/metadata.test.js index 4d63748..41ea192 100644 --- a/test/metadata.test.js +++ b/test/metadata.test.js @@ -6,6 +6,7 @@ const path = require('node:path'); const repoRoot = path.resolve(__dirname, '..'); const packageJsonPath = path.join(repoRoot, 'package.json'); const readmePath = path.join(repoRoot, 'README.md'); +const aboutDescriptionPath = path.join(repoRoot, 'about_description.txt'); function escapeRegexLiteral(value) { return String(value).replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); @@ -68,6 +69,22 @@ test('README documents gx release as README-driven GitHub release writer', () => assert.match(readme, /finds the last published GitHub release, and writes one grouped GitHub release body/); }); +test('README keeps canonical About copy and problem-solution visuals aligned', () => { + const readme = fs.readFileSync(readmePath, 'utf8'); + const aboutDescription = fs.readFileSync(aboutDescriptionPath, 'utf8').trim(); + + assert.match( + readme, + /## The 
problem\s+!\[Parallel agents colliding in the same files\]\(https:\/\/raw\.githubusercontent\.com\/recodeee\/gitguardex\/main\/docs\/images\/problem-agent-collision\.svg\)/s, + ); + assert.match( + readme, + /### Solution\s+!\[Agent branch\/worktree start protocol\]\(https:\/\/raw\.githubusercontent\.com\/recodeee\/gitguardex\/main\/docs\/images\/workflow-branch-start\.svg\)/s, + ); + assert.match(readme, /\[about_description\.txt\]\(\.\/about_description\.txt\)/); + assert.match(readme, new RegExp(escapeRegexLiteral(aboutDescription))); +}); + test('security workflows are present and use pinned GitHub Actions SHAs', () => { const workflowDir = path.join(repoRoot, '.github', 'workflows'); const expected = ['ci.yml', 'release.yml', 'scorecard.yml', 'codeql.yml', 'cr.yml']; From 837b88b57778bf1a263634e652b1f290a8e6d87b Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:35:31 +0200 Subject: [PATCH 20/48] Make Active Agents visible in SCM by default (#253) The companion could render session rows but it could not match the intended Source Control presentation because it never created a TreeView handle. This switches the SCM contribution to createTreeView so the header can expose a live badge and empty-state message, and marks the view visible by default on first install. 
Constraint: The fix must stay inside the installable VS Code companion surface Rejected: Show stale sessions to force a non-empty badge | would misrepresent inactive agents Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep header badge coverage aligned with any future view-container refactors Tested: node --test test/vscode-active-agents-session-state.test.js Tested: openspec validate agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31 --type change --strict Tested: openspec validate --specs Not-tested: Live VS Code SCM rendering after manual window reload Co-authored-by: NagyVikt --- .../proposal.md | 15 +++++ .../specs/vscode-active-agents-scm/spec.md | 21 +++++++ .../tasks.md | 10 ++++ .../vscode/guardex-active-agents/extension.js | 33 ++++++++++- .../vscode/guardex-active-agents/package.json | 3 +- ...vscode-active-agents-session-state.test.js | 56 +++++++++++++++++++ 6 files changed, 135 insertions(+), 3 deletions(-) create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/proposal.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/specs/vscode-active-agents-scm/spec.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/tasks.md diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/proposal.md b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/proposal.md new file mode 100644 index 0000000..9c7ff1f --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/proposal.md @@ -0,0 +1,15 @@ +## Why + +- The Active Agents SCM contribution can render session rows, but it cannot render the header count badge shown in the intended VS Code screenshot because it only registers a tree-data provider. 
+- The view also depends on prior user view-state persistence, so the section may stay hidden instead of appearing by default in Source Control. + +## What Changes + +- Create the SCM view with `createTreeView(...)` so the extension can set a live badge and empty-state message. +- Mark the contributed SCM view as visible by default. +- Add regression coverage for both the empty-state message and the live-session badge count. + +## Impact + +- Affected surfaces: `templates/vscode/guardex-active-agents/package.json`, `templates/vscode/guardex-active-agents/extension.js`, and `test/vscode-active-agents-session-state.test.js`. +- Risk is narrow because the change stays inside the VS Code companion and does not alter Guardex session-state generation. diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/specs/vscode-active-agents-scm/spec.md b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/specs/vscode-active-agents-scm/spec.md new file mode 100644 index 0000000..9e1bb72 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/specs/vscode-active-agents-scm/spec.md @@ -0,0 +1,21 @@ +## ADDED Requirements + +### Requirement: Active Agents SCM view exposes header state +The Guardex Active Agents VS Code companion SHALL create the `gitguardex.activeAgents` SCM view through a tree-view handle so the view can expose header state in addition to item rows. + +#### Scenario: Live sessions set a header badge +- **WHEN** one or more live Guardex sessions are available in the current workspace +- **THEN** the SCM view shows the session rows +- **AND** the view header badge reflects the live session count. 
+ +#### Scenario: Empty state sets a view message +- **WHEN** no live Guardex sessions are available in the current workspace +- **THEN** the SCM view remains available in Source Control +- **AND** the view exposes an empty-state message that tells the operator to start a sandbox session. + +### Requirement: Active Agents SCM view is visible by default +The `gitguardex.activeAgents` SCM contribution SHALL default to visible so operators do not need to discover it manually in the SCM views menu on first install. + +#### Scenario: First load shows the section +- **WHEN** the extension is installed in a workspace with Source Control open +- **THEN** the Active Agents section is available in the SCM container without requiring a manual enable step. diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/tasks.md b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/tasks.md new file mode 100644 index 0000000..34fa58a --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31/tasks.md @@ -0,0 +1,10 @@ +## Definition of Done + +- [x] 1.1 Capture the SCM badge and default-visibility acceptance criteria in the proposal/spec. +- [x] 2.1 Create the Active Agents SCM view with badge/message support. +- [x] 2.2 Default the SCM view contribution to visible. +- [x] 2.3 Add regression coverage for empty and live SCM view states. +- [x] 3.1 Run `node --test test/vscode-active-agents-session-state.test.js`. +- [x] 3.2 Run `openspec validate agent-codex-vscode-active-agents-scm-badge-visibilit-2026-04-21-18-31 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. +- [ ] 4.1 Run the Guardex finish flow with PR merge + cleanup, or record a `BLOCKED:` note. 
diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js index 7b340f5..1028cd3 100644 --- a/templates/vscode/guardex-active-agents/extension.js +++ b/templates/vscode/guardex-active-agents/extension.js @@ -51,12 +51,34 @@ class ActiveAgentsProvider { constructor() { this.onDidChangeTreeDataEmitter = new vscode.EventEmitter(); this.onDidChangeTreeData = this.onDidChangeTreeDataEmitter.event; + this.treeView = null; } getTreeItem(element) { return element; } + attachTreeView(treeView) { + this.treeView = treeView; + this.updateViewState(0); + } + + updateViewState(sessionCount) { + if (!this.treeView) { + return; + } + + this.treeView.badge = sessionCount > 0 + ? { + value: sessionCount, + tooltip: `${sessionCount} active agent${sessionCount === 1 ? '' : 's'}`, + } + : undefined; + this.treeView.message = sessionCount > 0 + ? undefined + : 'Start a sandbox session to populate this view.'; + } + refresh() { this.onDidChangeTreeDataEmitter.fire(); } @@ -67,13 +89,15 @@ class ActiveAgentsProvider { } const sessionsByRepo = await this.loadSessionsByRepo(); + const sessionCount = [...sessionsByRepo.values()].reduce((total, sessions) => total + sessions.length, 0); + this.updateViewState(sessionCount); const repos = [...sessionsByRepo.entries()] .map(([repoRoot, sessions]) => ({ repoRoot, sessions })) .filter((entry) => entry.sessions.length > 0) .sort((left, right) => left.repoRoot.localeCompare(right.repoRoot)); if (repos.length === 0) { - return [new InfoItem('No active Guardex agents', 'Start a sandbox session to populate this view.')]; + return [new InfoItem('No active Guardex agents', 'Open or start a sandbox session.')]; } if (repos.length === 1) { @@ -115,12 +139,17 @@ class ActiveAgentsProvider { function activate(context) { const provider = new ActiveAgentsProvider(); + const treeView = vscode.window.createTreeView('gitguardex.activeAgents', { + treeDataProvider: provider, + showCollapseAll: true, + 
}); + provider.attachTreeView(treeView); const refresh = () => provider.refresh(); const watcher = vscode.workspace.createFileSystemWatcher('**/.omx/state/active-sessions/*.json'); const interval = setInterval(refresh, 5_000); context.subscriptions.push( - vscode.window.registerTreeDataProvider('gitguardex.activeAgents', provider), + treeView, vscode.commands.registerCommand('gitguardex.activeAgents.refresh', refresh), vscode.commands.registerCommand('gitguardex.activeAgents.openWorktree', async (session) => { if (!session?.worktreePath) { diff --git a/templates/vscode/guardex-active-agents/package.json b/templates/vscode/guardex-active-agents/package.json index 354efa2..3c3b0d1 100644 --- a/templates/vscode/guardex-active-agents/package.json +++ b/templates/vscode/guardex-active-agents/package.json @@ -32,7 +32,8 @@ "scm": [ { "id": "gitguardex.activeAgents", - "name": "Active Agents" + "name": "Active Agents", + "visibility": "visible" } ] }, diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js index 4d89e5f..7497b51 100644 --- a/test/vscode-active-agents-session-state.test.js +++ b/test/vscode-active-agents-session-state.test.js @@ -46,6 +46,7 @@ function loadExtensionWithMockVscode(mockVscode) { function createMockVscode(tempRoot) { const registrations = { providers: [], + treeViews: [], }; class TreeItem { @@ -95,6 +96,18 @@ function createMockVscode(tempRoot) { file: (fsPath) => ({ fsPath }), }, window: { + createTreeView: (viewId, options) => { + const treeView = { + viewId, + options, + badge: undefined, + message: undefined, + dispose() {}, + }; + registrations.treeViews.push(treeView); + registrations.providers.push({ viewId, provider: options.treeDataProvider }); + return treeView; + }, registerTreeDataProvider: (viewId, provider) => { registrations.providers.push({ viewId, provider }); return disposable(); @@ -228,6 +241,8 @@ test('active-agents extension registers a provider with getTreeItem', 
async () = extension.activate(context); + assert.equal(registrations.treeViews.length, 1); + assert.equal(registrations.treeViews[0].viewId, 'gitguardex.activeAgents'); assert.equal(registrations.providers.length, 1); assert.equal(registrations.providers[0].viewId, 'gitguardex.activeAgents'); @@ -237,6 +252,47 @@ test('active-agents extension registers a provider with getTreeItem', async () = const [rootItem] = await provider.getChildren(); assert.equal(rootItem.label, 'No active Guardex agents'); assert.equal(provider.getTreeItem(rootItem), rootItem); + assert.equal(registrations.treeViews[0].badge, undefined); + assert.equal(registrations.treeViews[0].message, 'Start a sandbox session to populate this view.'); + + for (const subscription of context.subscriptions) { + subscription.dispose?.(); + } +}); + +test('active-agents extension updates the SCM badge for live sessions', async () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-live-view-')); + const sessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, 'agent/codex/live-task'); + fs.mkdirSync(path.dirname(sessionPath), { recursive: true }); + fs.writeFileSync( + sessionPath, + `${JSON.stringify(sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/live-task', + taskName: 'live-task', + agentName: 'codex', + worktreePath: path.join(tempRoot, '.omx', 'agent-worktrees', 'live-task'), + pid: process.pid, + cliName: 'codex', + }), null, 2)}\n`, + 'utf8', + ); + + const { registrations, vscode } = createMockVscode(tempRoot); + vscode.workspace.findFiles = async () => [{ fsPath: sessionPath }]; + const extension = loadExtensionWithMockVscode(vscode); + const context = { subscriptions: [] }; + + extension.activate(context); + + const provider = registrations.providers[0].provider; + const [sessionItem] = await provider.getChildren(); + assert.equal(sessionItem.label, 'live-task'); + assert.deepEqual(registrations.treeViews[0].badge, { + value: 1, + 
tooltip: '1 active agent', + }); + assert.equal(registrations.treeViews[0].message, undefined); for (const subscription of context.subscriptions) { subscription.dispose?.(); From 8283e48abbfca437fc7cbed2792f3ca45c61730b Mon Sep 17 00:00:00 2001 From: NagyVikt Date: Tue, 21 Apr 2026 18:37:40 +0200 Subject: [PATCH 21/48] new img --- docs/images/problem-agent-collision.svg | 194 ++++++++++++------------ 1 file changed, 97 insertions(+), 97 deletions(-) diff --git a/docs/images/problem-agent-collision.svg b/docs/images/problem-agent-collision.svg index b8e82b6..74f93a0 100644 --- a/docs/images/problem-agent-collision.svg +++ b/docs/images/problem-agent-collision.svg @@ -1,98 +1,98 @@ - - parallel agent collision loop - Five agents touch the same files, one overwrite leads to deletion, lost work, and a regression loop. - - - - - - - - - - - - - - - - - - Agent A adds assertions - in a shared test - - - - - Agent B rewrites - the same flow - - - - - Agent C updates - the shared helper - - - - - Agent D deletes lines - Agent A just added - - - - - Agent E saves - an older snapshot - - - - - - Several agents touch - the same files - - - - - One agent overwrites - another agent's edits - - - - - Another agent deletes - code the others added - - - - - Lost work, rework, - and review confusion - - - - - Regression risk - and flaky fixes grow - - - - - - - - - - - - - - - Five agents touch the same files. One overwrite turns into deletion, lost work, and a regression loop. + +Parallel agents overwriting each other — the de-progressive loop +Five agents stream edits toward a shared-files node that pulses on impact. Consequences cascade down the right side and a dashed loop arrow circles back, visualizing the regression cycle. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + +Agent A adds assertions +in a shared test + +Agent B rewrites +the same test flow + +Agent C updates +the shared helper + +Agent D deletes lines +Agent A just added + +Agent E saves +an older snapshot + + + + + + + + + + + + + + + + + +Several agents +touch the same files + + + + + +One agent overwrites +another agent's edits + + + +Another agent deletes +code the others added + + + +Lost work, rework, +and review confusion + + + +Regression risk +and flaky fixes grow + + + + +…and the cycle starts again +Five agents touch the same files. One overwrite turns into deletion, lost work, and a regression loop. From 313fd5c2a873d52b4820ebf7de5d7384e1e373ac Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:39:04 +0200 Subject: [PATCH 22/48] Update README.md (#254) --- README.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/README.md b/README.md index 1026b94..b6bce25 100644 --- a/README.md +++ b/README.md @@ -28,8 +28,6 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on ### Solution -![Agent branch/worktree start protocol](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-branch-start.svg) - GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. 
## GitHub About description From 89cbfaca727ad1d05b8ee30ab95b5398c101783a Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:39:59 +0200 Subject: [PATCH 23/48] Update README.md (#255) --- README.md | 3 --- 1 file changed, 3 deletions(-) diff --git a/README.md b/README.md index b6bce25..f1a69fb 100644 --- a/README.md +++ b/README.md @@ -30,9 +30,6 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. -## GitHub About description - -The canonical GitHub About copy lives in [about_description.txt](./about_description.txt): ```text Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. From 46522477f8f63272671c1dc0ba02ef003a59bd3a Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:43:52 +0200 Subject: [PATCH 24/48] Update README.md (#256) --- README.md | 14 -------------- 1 file changed, 14 deletions(-) diff --git a/README.md b/README.md index f1a69fb..4950aae 100644 --- a/README.md +++ b/README.md @@ -31,20 +31,6 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. -```text -Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. 
-``` - - - - - - - - - - -

Install GitGuardex

From b90c3824f65bb164b0779d02be17cef214f9724c Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 18:56:45 +0200 Subject: [PATCH 25/48] Align package metadata with the canonical Guardian T-Rex pitch (#257) README and about_description.txt already carried the approved product copy, but package.json still advertised the older generic description. This change brings npm metadata onto the same canonical line, restores the README About and Solution blocks required by the current metadata regression, and records the alignment in a matching OpenSpec delta. Constraint: Package metadata, README About copy, and about_description.txt must stay aligned Rejected: Update package.json only | left README/spec regressions red in this branch Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep package.json description, README About copy, and about_description.txt in lockstep Tested: node --test test/metadata.test.js; npm test; git diff --check; openspec validate agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46 --type change --strict; openspec validate --specs Not-tested: npm registry rendering of the longer package description Co-authored-by: NagyVikt --- README.md | 10 ++++++++ .../.openspec.yaml | 2 ++ .../notes.md | 7 ++++++ .../proposal.md | 16 ++++++++++++ .../specs/readme-about-description/spec.md | 14 +++++++++++ .../tasks.md | 25 +++++++++++++++++++ package.json | 2 +- test/metadata.test.js | 2 ++ 8 files changed, 77 insertions(+), 1 deletion(-) create mode 100644 openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/.openspec.yaml create mode 100644 openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/notes.md create mode 100644 openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/proposal.md create mode 100644 
openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/specs/readme-about-description/spec.md create mode 100644 openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/tasks.md diff --git a/README.md b/README.md index 4950aae..db33645 100644 --- a/README.md +++ b/README.md @@ -28,8 +28,18 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on ### Solution +![Agent branch/worktree start protocol](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-branch-start.svg) + GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. +## GitHub About description + +The canonical GitHub About copy lives in [about_description.txt](./about_description.txt): + +```text +Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. +``` +

Install GitGuardex diff --git a/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/.openspec.yaml b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/notes.md b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/notes.md new file mode 100644 index 0000000..19a31c8 --- /dev/null +++ b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/notes.md @@ -0,0 +1,7 @@ +# T1 Notes + +- Handoff: change=`agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46`; scope=`package.json`; action=`align the npm package description with the existing Guardian T-Rex README/About copy`. +- Replace the generic package description with the approved Guardian T-Rex positioning so npm metadata matches README and `about_description.txt`. +- Restore the README GitHub About section so the canonical copy is linked and mirrored from the documented source. +- Restore the README solution image required by the existing metadata/spec regression so the copy-alignment suite stays green after the metadata update. +- Keep the change narrowly scoped to package metadata, one README parity fix, and the matching regression coverage; no version bump or runtime behavior change. 
diff --git a/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/proposal.md b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/proposal.md new file mode 100644 index 0000000..950e935 --- /dev/null +++ b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/proposal.md @@ -0,0 +1,16 @@ +## Why + +- `about_description.txt` and the README already use the approved Guardian T-Rex positioning, but `package.json` still exposes the older generic package description. +- That drift makes the npm package metadata tell a different story than the canonical About copy. + +## What Changes + +- Align `package.json` `description` with the canonical text in `about_description.txt`. +- Restore the README GitHub About section so the canonical copy is visible and linked from the documented source file. +- Restore the missing README solution visual required by the current `readme-about-description` regression. +- Add an OpenSpec delta that requires package metadata to stay aligned with the canonical About description source. + +## Impact + +- Affected surfaces: `package.json`, `README.md`, one metadata regression test, and the matching OpenSpec change record. +- Risk is low and limited to package metadata / product copy consistency. 
diff --git a/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/specs/readme-about-description/spec.md b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/specs/readme-about-description/spec.md new file mode 100644 index 0000000..5fc2e52 --- /dev/null +++ b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/specs/readme-about-description/spec.md @@ -0,0 +1,14 @@ +## MODIFIED Requirements + +### Requirement: README points to one canonical GitHub About description source +The repo SHALL keep one canonical GitHub About description in `about_description.txt`, and the README plus package metadata SHALL mirror that same copy instead of drifting across product surfaces. + +#### Scenario: package metadata matches the canonical About copy +- **WHEN** a maintainer inspects `package.json` and `about_description.txt` +- **THEN** `package.json` `description` matches the full canonical text in `about_description.txt` +- **AND** the README continues to reference that same canonical source. + +#### Scenario: solution visual remains under the solution heading +- **WHEN** a reader opens the top-level README +- **THEN** the `### Solution` heading is followed by the workflow image +- **AND** the Guardex solution copy appears below that image. diff --git a/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/tasks.md b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/tasks.md new file mode 100644 index 0000000..8464451 --- /dev/null +++ b/openspec/changes/agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46/tasks.md @@ -0,0 +1,25 @@ +## 1. Metadata + +- [x] 1.1 Update `package.json` so the npm description matches the canonical Guardian T-Rex copy. +- [x] 1.2 Restore the README GitHub About section so it links to `about_description.txt` and mirrors the canonical copy. 
+- [x] 1.3 Restore the README solution visual expected by the existing metadata/about regression. + +## 2. OpenSpec + +- [x] 2.1 Record the scope and handoff in the change notes. +- [x] 2.2 Add a spec delta covering package metadata alignment with `about_description.txt`. + +## 3. Verification + +- [x] 3.1 Validate `package.json` parses as JSON. +- [x] 3.2 Run `git diff --check`. +- [x] 3.3 Update/add regression coverage for canonical About copy alignment. +- [x] 3.4 Run `npm test`. +- [x] 3.5 Run `openspec validate agent-codex-align-guardian-t-rex-package-description-2026-04-21-18-46 --type change --strict`. +- [x] 3.6 Run `openspec validate --specs`. + +## 4. Cleanup + +- [ ] 4.1 Run `bash scripts/agent-branch-finish.sh --branch "agent/codex/align-guardian-t-rex-package-description-2026-04-21-18-46" --base main --via-pr --wait-for-merge --cleanup`. +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. diff --git a/package.json b/package.json index 7422cd6..561864e 100644 --- a/package.json +++ b/package.json @@ -1,7 +1,7 @@ { "name": "@imdeadpool/guardex", "version": "7.0.16", - "description": "GitGuardex: hardened multi-agent git guardrails for parallel agent work.", + "description": "Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. 
Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman.", "license": "MIT", "preferGlobal": true, "bin": { diff --git a/test/metadata.test.js b/test/metadata.test.js index 41ea192..85f9117 100644 --- a/test/metadata.test.js +++ b/test/metadata.test.js @@ -72,6 +72,7 @@ test('README documents gx release as README-driven GitHub release writer', () => test('README keeps canonical About copy and problem-solution visuals aligned', () => { const readme = fs.readFileSync(readmePath, 'utf8'); const aboutDescription = fs.readFileSync(aboutDescriptionPath, 'utf8').trim(); + const pkg = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); assert.match( readme, @@ -83,6 +84,7 @@ test('README keeps canonical About copy and problem-solution visuals aligned', ( ); assert.match(readme, /\[about_description\.txt\]\(\.\/about_description\.txt\)/); assert.match(readme, new RegExp(escapeRegexLiteral(aboutDescription))); + assert.equal(pkg.description, aboutDescription); }); test('security workflows are present and use pinned GitHub Actions SHAs', () => { From 8cd94e1df412ec88284ca8dad8d2a3434425938c Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 20:39:41 +0200 Subject: [PATCH 26/48] Update README.md (#258) --- README.md | 8 -------- 1 file changed, 8 deletions(-) diff --git a/README.md b/README.md index db33645..c1601ae 100644 --- a/README.md +++ b/README.md @@ -32,14 +32,6 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. -## GitHub About description - -The canonical GitHub About copy lives in [about_description.txt](./about_description.txt): - -```text -Guardian T-Rex for your multi-agent repo. 
Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. -``` -

Install GitGuardex From ef51383fdc73691260470ca606d73f4beed6bba1 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 20:40:19 +0200 Subject: [PATCH 27/48] Update README.md (#259) --- README.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/README.md b/README.md index c1601ae..4950aae 100644 --- a/README.md +++ b/README.md @@ -28,8 +28,6 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on ### Solution -![Agent branch/worktree start protocol](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-branch-start.svg) - GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. From 1a858d848282029d3490f468170a31d8d7b05208 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 21:20:29 +0200 Subject: [PATCH 28/48] Auto-finish: gx doctor repairs (#260) Co-authored-by: NagyVikt --- scripts/codex-agent.sh | 51 ------------------------------------------ 1 file changed, 51 deletions(-) diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index 8b509dd..a8a53c8 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -6,7 +6,6 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" -NODE_BIN="${GUARDEX_NODE_BIN:-node}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -144,7 +143,6 @@ if ! 
git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" -active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -448,40 +446,6 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } -run_active_session_state() { - local action="$1" - shift - - if [[ ! -f "$active_session_state_script" ]]; then - return 0 - fi - if ! command -v "$NODE_BIN" >/dev/null 2>&1; then - return 0 - fi - - "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true -} - -record_active_session_state() { - local wt="$1" - local branch="$2" - - run_active_session_state \ - start \ - --repo "$repo_root" \ - --branch "$branch" \ - --task "$TASK_NAME" \ - --agent "$AGENT_NAME" \ - --worktree "$wt" \ - --pid "$$" \ - --cli "$CODEX_BIN" -} - -clear_active_session_state() { - local branch="$1" - run_active_session_state stop --repo "$repo_root" --branch "$branch" -} - origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -869,19 +833,6 @@ if ! ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi -active_session_recorded=0 -cleanup_active_session_state_on_exit() { - set +e - if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then - clear_active_session_state "$worktree_branch" - active_session_recorded=0 - fi -} - -record_active_session_state "$worktree_path" "$worktree_branch" -active_session_recorded=1 -trap cleanup_active_session_state_on_exit EXIT INT TERM - echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -890,8 +841,6 @@ codex_exit="$?" 
set -e cd "$repo_root" -cleanup_active_session_state_on_exit -trap - EXIT INT TERM final_exit="$codex_exit" auto_finish_completed=0 From 569c30728590b84453de0e7f295348720ac3f34f Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 21:31:02 +0200 Subject: [PATCH 29/48] Show live Guardex worktree activity in the VS Code SCM view (#261) The Active Agents companion already showed presence, but every row was hardcoded to thinking. This change derives live activity from each sandbox worktree so clean lanes stay thinking while dirty lanes surface working with changed-file counts and tooltip previews. Constraint: VS Code companion state remains repo-local and read-only; do not require a new runtime daemon. Rejected: Add a periodic activity writer protocol | unnecessary complexity for a read-only status hint. Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep activity inference resilient; if git inspection fails, fall back to thinking rather than dropping the session row. 
Tested: node --test test/vscode-active-agents-session-state.test.js Tested: openspec validate agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23 --type change --strict Tested: openspec validate --specs Not-tested: VS Code interactive rendering in a live editor window Co-authored-by: NagyVikt --- README.md | 2 +- .../.openspec.yaml | 2 + .../proposal.md | 16 +++ .../vscode-active-agents-extension/spec.md | 20 ++++ .../tasks.md | 29 ++++++ .../vscode/guardex-active-agents/README.md | 1 + .../vscode/guardex-active-agents/extension.js | 16 ++- .../guardex-active-agents/session-schema.js | 98 +++++++++++++++++++ ...vscode-active-agents-session-state.test.js | 93 ++++++++++++++++++ 9 files changed, 273 insertions(+), 4 deletions(-) create mode 100644 openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/.openspec.yaml create mode 100644 openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/proposal.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/specs/vscode-active-agents-extension/spec.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/tasks.md diff --git a/README.md b/README.md index 4950aae..a05337b 100644 --- a/README.md +++ b/README.md @@ -239,7 +239,7 @@ To install the real companion into local VS Code from a Guardex-wired repo: node scripts/install-vscode-active-agents-extension.js ``` -It adds an `Active Agents` view to the Source Control container, reads `.omx/state/active-sessions/*.json`, and uses VS Code's native `loading~spin` codicon for the running-state affordance. Reload the VS Code window after install. +It adds an `Active Agents` view to the Source Control container, reads `.omx/state/active-sessions/*.json`, derives `thinking` versus `working` from each live sandbox worktree, and uses VS Code's native `loading~spin` codicon for the running-state affordance. 
Reload the VS Code window after install. --- diff --git a/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/.openspec.yaml b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/proposal.md b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/proposal.md new file mode 100644 index 0000000..8f1ea12 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/proposal.md @@ -0,0 +1,16 @@ +## Why + +- The Active Agents companion already shows live Guardex lanes, but every row is hardcoded to `thinking` even after the agent starts changing files in its sandbox. +- In multi-agent VS Code flows, users need to tell which worktree is still planning versus which one is actively moving without leaving Source Control. + +## What Changes + +- Derive per-session activity from the live sandbox worktree so clean lanes stay `thinking` while dirty lanes surface `working`. +- Update the Active Agents SCM rows and tooltips to include the live activity state, changed-file count, and changed-path preview. +- Add focused regression coverage for the activity inference and the rendered SCM row copy. + +## Impact + +- Affected surfaces: `templates/vscode/guardex-active-agents/extension.js`, `templates/vscode/guardex-active-agents/session-schema.js`, `test/vscode-active-agents-session-state.test.js`, and README/OpenSpec docs. +- Risk is narrow because the change stays read-only and derives activity from the existing live worktree instead of introducing a new runtime protocol. 
+- If git activity cannot be inspected for a live worktree, the companion must fall back to `thinking` instead of crashing or hiding the session row. diff --git a/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/specs/vscode-active-agents-extension/spec.md b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/specs/vscode-active-agents-extension/spec.md new file mode 100644 index 0000000..006adf6 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/specs/vscode-active-agents-extension/spec.md @@ -0,0 +1,20 @@ +## ADDED Requirements + +### Requirement: Active Agents rows reflect live sandbox worktree activity +The system SHALL describe whether each live Guardex sandbox is still thinking or is actively working inside its worktree. + +#### Scenario: Clean worktree stays thinking +- **WHEN** a live session points at a clean sandbox worktree +- **THEN** the Active Agents row description begins with `thinking` +- **AND** it still includes the elapsed time for that live lane. + +#### Scenario: Dirty worktree surfaces working state +- **WHEN** a live session points at a sandbox worktree with tracked or untracked file changes +- **THEN** the Active Agents row description begins with `working` +- **AND** it includes the changed-file count before the elapsed time +- **AND** the row tooltip includes a preview of the changed paths. + +#### Scenario: Activity inference falls back safely +- **WHEN** the companion cannot inspect the worktree git state for an otherwise live session +- **THEN** the row still renders as an active agent +- **AND** the description falls back to `thinking` instead of crashing or disappearing. 
diff --git a/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/tasks.md b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/tasks.md new file mode 100644 index 0000000..8951241 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23/tasks.md @@ -0,0 +1,29 @@ +## Definition of Done + +This change is complete only when **all** of the following are true: + +- Every checkbox below is checked. +- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff. +- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. + +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23`. +- [x] 1.2 Define normative requirements in `specs/vscode-active-agents-extension/spec.md`. + +## 2. Implementation + +- [x] 2.1 Derive live `thinking` versus `working` status from each active sandbox worktree and surface it in the SCM row description/tooltip. +- [x] 2.2 Add/update focused regression coverage plus README guidance for the richer Active Agents status copy. + +## 3. Verification + +- [x] 3.1 Run targeted project verification commands. +- [x] 3.2 Run `openspec validate agent-codex-vscode-active-agents-worktree-status-2026-04-21-21-23 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. + +## 4. Cleanup (mandatory; run before claiming completion) + +- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent// --base main --via-pr --wait-for-merge --cleanup`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. 
+- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch). diff --git a/templates/vscode/guardex-active-agents/README.md b/templates/vscode/guardex-active-agents/README.md index ea3ffb4..aa9fabf 100644 --- a/templates/vscode/guardex-active-agents/README.md +++ b/templates/vscode/guardex-active-agents/README.md @@ -6,6 +6,7 @@ What it does: - Adds an `Active Agents` view to the Source Control container. - Renders one row per live Guardex sandbox session. +- Derives `thinking` versus `working` from the live sandbox worktree and shows changed-file counts for active edits. - Uses VS Code's native animated `loading~spin` icon for the running-state affordance. - Reads repo-local presence files from `.omx/state/active-sessions/`. diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js index 1028cd3..2c89e51 100644 --- a/templates/vscode/guardex-active-agents/extension.js +++ b/templates/vscode/guardex-active-agents/extension.js @@ -26,13 +26,23 @@ class SessionItem extends vscode.TreeItem { constructor(session) { super(session.label, vscode.TreeItemCollapsibleState.None); this.session = session; - this.description = `thinking · ${formatElapsedFrom(session.startedAt)}`; - this.tooltip = [ + const descriptionParts = [session.activityLabel || 'thinking']; + if (session.activityCountLabel) { + descriptionParts.push(session.activityCountLabel); + } + descriptionParts.push(session.elapsedLabel || formatElapsedFrom(session.startedAt)); + this.description = descriptionParts.join(' · '); + const tooltipLines = [ session.branch, `${session.agentName} · ${session.taskName}`, + `Status ${this.description}`, + session.changeCount > 0 + ? 
`Changed ${session.activityCountLabel}: ${session.activitySummary}` + : session.activitySummary, `Started ${session.startedAt}`, session.worktreePath, - ].join('\n'); + ]; + this.tooltip = tooltipLines.filter(Boolean).join('\n'); this.iconPath = new vscode.ThemeIcon('loading~spin'); this.contextValue = 'gitguardex.session'; this.command = { diff --git a/templates/vscode/guardex-active-agents/session-schema.js b/templates/vscode/guardex-active-agents/session-schema.js index eed9851..bd689af 100644 --- a/templates/vscode/guardex-active-agents/session-schema.js +++ b/templates/vscode/guardex-active-agents/session-schema.js @@ -1,8 +1,11 @@ const fs = require('node:fs'); const path = require('node:path'); +const cp = require('node:child_process'); const ACTIVE_SESSIONS_RELATIVE_DIR = path.join('.omx', 'state', 'active-sessions'); const SESSION_SCHEMA_VERSION = 1; +const LOCK_FILE_RELATIVE = path.join('.omx', 'state', 'agent-file-locks.json'); +const MAX_CHANGED_PATH_PREVIEW = 3; function toNonEmptyString(value, fallback = '') { const normalized = typeof value === 'string' ? value.trim() : String(value || '').trim(); @@ -31,6 +34,96 @@ function sessionFilePathForBranch(repoRoot, branch) { return path.join(activeSessionsDirForRepo(repoRoot), sessionFileNameForBranch(branch)); } +function splitOutputLines(output) { + if (typeof output !== 'string') { + return null; + } + + return output + .split(/\r?\n/) + .map((line) => line.trim()) + .filter(Boolean); +} + +function runGitLines(worktreePath, args) { + try { + const output = cp.execFileSync('git', ['-C', worktreePath, ...args], { + encoding: 'utf8', + stdio: ['ignore', 'pipe', 'ignore'], + }); + return splitOutputLines(output); + } catch (_error) { + return null; + } +} + +function formatFileCount(count) { + return `${count} file${count === 1 ? 
'' : 's'}`; +} + +function previewChangedPaths(paths) { + if (!Array.isArray(paths) || paths.length === 0) { + return ''; + } + + if (paths.length <= MAX_CHANGED_PATH_PREVIEW) { + return paths.join(', '); + } + + const preview = paths.slice(0, MAX_CHANGED_PATH_PREVIEW).join(', '); + return `${preview}, +${paths.length - MAX_CHANGED_PATH_PREVIEW} more`; +} + +function collectWorktreeChangedPaths(worktreePath) { + const changedGroups = [ + runGitLines(worktreePath, ['diff', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`]), + runGitLines(worktreePath, ['diff', '--cached', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`]), + runGitLines(worktreePath, ['ls-files', '--others', '--exclude-standard']), + ]; + + if (changedGroups.some((group) => group === null)) { + return null; + } + + return [...new Set(changedGroups.flat())] + .filter((relativePath) => relativePath && relativePath !== LOCK_FILE_RELATIVE) + .sort((left, right) => left.localeCompare(right)); +} + +function deriveSessionActivity(session) { + const changedPaths = collectWorktreeChangedPaths(session.worktreePath); + if (!changedPaths) { + return { + activityKind: 'thinking', + activityLabel: 'thinking', + activityCountLabel: '', + activitySummary: 'Worktree activity unavailable.', + changeCount: 0, + changedPaths: [], + }; + } + + if (changedPaths.length === 0) { + return { + activityKind: 'thinking', + activityLabel: 'thinking', + activityCountLabel: '', + activitySummary: 'Worktree clean.', + changeCount: 0, + changedPaths: [], + }; + } + + return { + activityKind: 'working', + activityLabel: 'working', + activityCountLabel: formatFileCount(changedPaths.length), + activitySummary: previewChangedPaths(changedPaths), + changeCount: changedPaths.length, + changedPaths, + }; +} + function buildSessionRecord(input) { const repoRoot = path.resolve(toNonEmptyString(input.repoRoot)); const worktreePath = path.resolve(toNonEmptyString(input.worktreePath)); @@ -173,6 +266,7 @@ function 
readActiveSessions(repoRoot, options = {}) { } normalized.elapsedLabel = formatElapsedFrom(normalized.startedAt, now); + Object.assign(normalized, deriveSessionActivity(normalized)); sessions.push(normalized); } @@ -192,10 +286,14 @@ module.exports = { SESSION_SCHEMA_VERSION, activeSessionsDirForRepo, buildSessionRecord, + collectWorktreeChangedPaths, deriveSessionLabel, + deriveSessionActivity, formatElapsedFrom, + formatFileCount, isPidAlive, normalizeSessionRecord, + previewChangedPaths, readActiveSessions, sanitizeBranchForFile, sessionFileNameForBranch, diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js index 7497b51..a55519f 100644 --- a/test/vscode-active-agents-session-state.test.js +++ b/test/vscode-active-agents-session-state.test.js @@ -24,6 +24,22 @@ function runNode(scriptPath, args, options = {}) { }); } +function runGit(repoPath, args, options = {}) { + const result = cp.spawnSync('git', ['-C', repoPath, ...args], { + encoding: 'utf8', + ...options, + }); + assert.equal(result.status, 0, result.stderr || result.stdout); + return result; +} + +function initGitRepo(repoPath) { + fs.mkdirSync(repoPath, { recursive: true }); + runGit(repoPath, ['init']); + runGit(repoPath, ['config', 'user.email', 'guardex-tests@example.com']); + runGit(repoPath, ['config', 'user.name', 'Guardex Tests']); +} + function loadExtensionWithMockVscode(mockVscode) { const Module = require('node:module'); const originalLoad = Module._load; @@ -214,6 +230,38 @@ test('session-schema ignores stale or invalid session records', () => { assert.equal(sessions[0].branch, liveRecord.branch); }); +test('session-schema derives working activity from dirty sandbox worktrees', () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-active-session-working-')); + const worktreePath = path.join(tempRoot, 'sandbox'); + initGitRepo(worktreePath); + fs.writeFileSync(path.join(worktreePath, 'tracked.txt'), 'base\n', 
'utf8'); + runGit(worktreePath, ['add', 'tracked.txt']); + runGit(worktreePath, ['commit', '-m', 'baseline']); + + fs.writeFileSync(path.join(worktreePath, 'tracked.txt'), 'base\nchanged\n', 'utf8'); + fs.writeFileSync(path.join(worktreePath, 'new-file.txt'), 'new\n', 'utf8'); + + const record = sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/working-task', + taskName: 'working-task', + agentName: 'codex', + worktreePath, + pid: process.pid, + cliName: 'codex', + }); + const sessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, record.branch); + fs.mkdirSync(path.dirname(sessionPath), { recursive: true }); + fs.writeFileSync(sessionPath, `${JSON.stringify(record, null, 2)}\n`, 'utf8'); + + const [session] = sessionSchema.readActiveSessions(tempRoot); + assert.equal(session.activityKind, 'working'); + assert.equal(session.changeCount, 2); + assert.equal(session.activityCountLabel, '2 files'); + assert.deepEqual(session.changedPaths, ['new-file.txt', 'tracked.txt']); + assert.equal(session.activitySummary, 'new-file.txt, tracked.txt'); +}); + test('install-vscode-active-agents-extension installs the current extension version and prunes older copies', () => { const tempExtensionsDir = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-ext-')); const staleDir = path.join(tempExtensionsDir, 'recodeee.gitguardex-active-agents-0.0.0'); @@ -288,6 +336,7 @@ test('active-agents extension updates the SCM badge for live sessions', async () const provider = registrations.providers[0].provider; const [sessionItem] = await provider.getChildren(); assert.equal(sessionItem.label, 'live-task'); + assert.match(sessionItem.description, /^thinking · \d+[smhd]/); assert.deepEqual(registrations.treeViews[0].badge, { value: 1, tooltip: '1 active agent', @@ -298,3 +347,47 @@ test('active-agents extension updates the SCM badge for live sessions', async () subscription.dispose?.(); } }); + +test('active-agents extension shows working rows when the 
sandbox has changes', async () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-working-view-')); + const worktreePath = path.join(tempRoot, 'sandbox'); + initGitRepo(worktreePath); + fs.writeFileSync(path.join(worktreePath, 'tracked.txt'), 'base\n', 'utf8'); + runGit(worktreePath, ['add', 'tracked.txt']); + runGit(worktreePath, ['commit', '-m', 'baseline']); + fs.writeFileSync(path.join(worktreePath, 'tracked.txt'), 'base\nchanged\n', 'utf8'); + fs.writeFileSync(path.join(worktreePath, 'new-file.txt'), 'new\n', 'utf8'); + + const sessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, 'agent/codex/live-task'); + fs.mkdirSync(path.dirname(sessionPath), { recursive: true }); + fs.writeFileSync( + sessionPath, + `${JSON.stringify(sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/live-task', + taskName: 'live-task', + agentName: 'codex', + worktreePath, + pid: process.pid, + cliName: 'codex', + }), null, 2)}\n`, + 'utf8', + ); + + const { registrations, vscode } = createMockVscode(tempRoot); + vscode.workspace.findFiles = async () => [{ fsPath: sessionPath }]; + const extension = loadExtensionWithMockVscode(vscode); + const context = { subscriptions: [] }; + + extension.activate(context); + + const provider = registrations.providers[0].provider; + const [sessionItem] = await provider.getChildren(); + assert.equal(sessionItem.label, 'sandbox'); + assert.match(sessionItem.description, /^working · 2 files · /); + assert.match(sessionItem.tooltip, /Changed 2 files: new-file\.txt, tracked\.txt/); + + for (const subscription of context.subscriptions) { + subscription.dispose?.(); + } +}); From 4eb0be1f6a3a2bfb54b55b54014809f17cf2f7f2 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 21:38:21 +0200 Subject: [PATCH 30/48] Align the published package name with the GitGuardex brand (#262) Rename package metadata and install/update surfaces to 
@imdeadpool/gitguardex while keeping gx preferred and guardex as a legacy alias. Refresh docs, tutorial assets, and OpenSpec scaffolding to point at the renamed package, and keep historical release notes tied to the package scope that was actually published at each version. Constraint: Existing gx and guardex entry points must keep working during the package-name transition Rejected: Drop the guardex bin alias now | would break existing shells and automation during the rename Rejected: Rewrite historical release entries to the new npm scope | inaccurate because older releases shipped as @imdeadpool/guardex Confidence: high Scope-risk: moderate Reversibility: clean Directive: Keep historical release-note package references aligned with the package name that was actually published at that version Tested: node --test --test-name-pattern "(self-update verifies on-disk version after @latest install and retries with pinned version when stale|self-update restarts into the installed CLI after a successful on-disk upgrade|status --json returns cli, services, and repo summary|prompt outputs AI setup instructions|prompt --exec outputs command-only checklist|deprecated copy-commands alias still works and warns)" test/install.test.js Tested: node --check bin/multiagent-safety.js Tested: npm pack --dry-run Tested: openspec validate agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02 --type change --strict Tested: openspec validate --specs Not-tested: full node --test test/install.test.js still hangs in an unrelated gx setup path in this repo Co-authored-by: NagyVikt --- CONTRIBUTING.md | 2 +- README.md | 41 ++++++++++--------- bin/multiagent-safety.js | 9 ++-- docs/branch-sync-spec.md | 2 +- docs/images/guardex-service-status.svg | 10 ++--- docs/images/install-hero.svg | 4 +- docs/images/setup-success.svg | 24 +++++------ docs/images/status-tools-logs.svg | 14 +++---- docs/images/workflow-gx-terminal-status.svg | 6 +-- docs/redditpost.md | 22 +++++----- 
frontend/app/layout.tsx | 4 +- frontend/app/page.tsx | 37 +++++++++-------- .../scripts/openspec/init-change-workspace.sh | 8 ++-- .../proposal.md | 17 ++++++++ .../spec.md | 22 ++++++++++ .../tasks.md | 23 +++++++++++ package-lock.json | 4 +- package.json | 2 +- templates/githooks/pre-commit | 2 +- .../scripts/openspec/init-change-workspace.sh | 8 ++-- test/install.test.js | 28 ++++++------- 21 files changed, 177 insertions(+), 112 deletions(-) create mode 100644 openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/proposal.md create mode 100644 openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/specs/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/spec.md create mode 100644 openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/tasks.md diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index c311fa7..588386c 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,6 +1,6 @@ # Contributing -Thanks for contributing to `GuardeX`. +Thanks for contributing to `GitGuardex`. 
## Development setup diff --git a/README.md b/README.md index a05337b..84bbe51 100644 --- a/README.md +++ b/README.md @@ -1,9 +1,9 @@ # GitGuardex — Guardian T-Rex for your repo -[![npm version](https://img.shields.io/npm/v/%40imdeadpool%2Fguardex?label=npm&color=cb3837&logo=npm)](https://www.npmjs.com/package/@imdeadpool/guardex) -[![npm downloads](https://img.shields.io/npm/dm/%40imdeadpool%2Fguardex?label=downloads&color=0b76c5)](https://www.npmjs.com/package/@imdeadpool/guardex) +[![npm version](https://img.shields.io/npm/v/%40imdeadpool%2Fgitguardex?label=npm&color=cb3837&logo=npm)](https://www.npmjs.com/package/@imdeadpool/gitguardex) +[![npm downloads](https://img.shields.io/npm/dm/%40imdeadpool%2Fgitguardex?label=downloads&color=0b76c5)](https://www.npmjs.com/package/@imdeadpool/gitguardex) [![GitHub stars](https://img.shields.io/github/stars/recodeee/gitguardex?label=stars&color=d4ac0d)](https://github.com/recodeee/gitguardex/stargazers) -[![License](https://img.shields.io/npm/l/%40imdeadpool%2Fguardex?label=License&color=97ca00)](./LICENSE) +[![License](https://img.shields.io/npm/l/%40imdeadpool%2Fgitguardex?label=License&color=97ca00)](./LICENSE) [![CI](https://img.shields.io/github/actions/workflow/status/recodeee/gitguardex/ci.yml?branch=main&label=CI)](https://github.com/recodeee/gitguardex/actions/workflows/ci.yml) [![Release](https://img.shields.io/github/actions/workflow/status/recodeee/gitguardex/release.yml?label=Release)](https://github.com/recodeee/gitguardex/actions/workflows/release.yml) @@ -38,7 +38,7 @@ GitGuardex exists to stop that loop. Every agent gets its own worktree, claims t

Install in one line

```bash -npm i -g @imdeadpool/guardex +npm i -g @imdeadpool/gitguardex ```

@@ -49,8 +49,8 @@ npm i -g @imdeadpool/guardex

- npm - downloads + npm + downloads stars

@@ -70,19 +70,19 @@ Coming soon: [recodee.com](https://recodee.com) — live account health, usage, - **Protected-base safety** — `main`, `dev`, `master` are blocked by default; agents must go through PRs. - **Auto-merges agent configs into every worktree** — `oh-my-codex`, `oh-my-claudecode`, caveman mode, and OpenSpec all get applied automatically so every spawned agent starts tuned, not bare. - **Repair/doctor flow** — when drift happens (and it will), `gx doctor` gets you back to a clean state. -- **Auto-finish** — when Codex exits a session, Guardex commits sandbox changes, syncs against the base, retries once if the base moved, and opens a PR. +- **Auto-finish** — when Codex exits a session, GitGuardex commits sandbox changes, syncs against the base, retries once if the base moved, and opens a PR. --- ## Quick start ```sh -npm i -g @imdeadpool/guardex +npm i -g @imdeadpool/gitguardex cd /path/to/your/repo gx setup ``` -That's it. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy). +That's it. New installs should use `@imdeadpool/gitguardex` so the published package matches the GitGuardex name. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). --- @@ -199,7 +199,7 @@ bash scripts/agent-branch-finish.sh \ If you use `scripts/codex-agent.sh`, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. -Guardex normally prunes merged sandboxes for you as part of the finish flow. If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane: +GitGuardex normally prunes merged sandboxes for you as part of the finish flow. 
If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane: ```sh git worktree remove .omx/agent-worktrees/ @@ -229,11 +229,11 @@ Codex sessions default to `.omx/agent-worktrees/`. Claude Code sessions default ### How It Works In VS Code -This is the real Source Control shape Guardex is aiming for: isolated agent branches, clear OpenSpec artifacts, and no pile-up on one shared checkout. +This is the real Source Control shape GitGuardex is aiming for: isolated agent branches, clear OpenSpec artifacts, and no pile-up on one shared checkout. ![Guarded VS Code Source Control example](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-source-control-grouped.png) -To install the real companion into local VS Code from a Guardex-wired repo: +To install the real companion into local VS Code from a GitGuardex-wired repo: ```sh node scripts/install-vscode-active-agents-extension.js @@ -406,7 +406,7 @@ GitGuardex is designed to work alongside these. All optional — but if you're r ### oh-my-codex — Codex config + skills framework -Loads skills, slash commands, and session defaults into Codex. Guardex merges `oh-my-codex` into every agent worktree automatically, so every spawned agent starts with the same tuned config instead of vanilla Codex. +Loads skills, slash commands, and session defaults into Codex. GitGuardex merges `oh-my-codex` into every agent worktree automatically, so every spawned agent starts with the same tuned config instead of vanilla Codex. ```sh npm i -g oh-my-codex @@ -417,7 +417,7 @@ Repo: ### oh-my-claudecode — Claude Code equivalent -Claude-side mirror of oh-my-codex. Same idea: skills, commands, and defaults loaded into every Claude Code session. Guardex merges it into worktrees alongside oh-my-codex so mixed Codex + Claude agent fleets behave consistently. 
For the npm CLI/runtime path, the published package name is `oh-my-claude-sisyphus`. +Claude-side mirror of oh-my-codex. Same idea: skills, commands, and defaults loaded into every Claude Code session. GitGuardex merges it into worktrees alongside oh-my-codex so mixed Codex + Claude agent fleets behave consistently. For the npm CLI/runtime path, the published package name is `oh-my-claude-sisyphus`. ```sh npm i -g oh-my-claude-sisyphus@latest @@ -636,10 +636,11 @@ npm pack --dry-run v7.x ### v7.0.16 +- GitGuardex now publishes under the matching npm package name `@imdeadpool/gitguardex`, and install/help/docs surfaces point at the renamed package instead of the older `@imdeadpool/guardex` scope. - `gx doctor` now keeps nested repo repair runs visibly progressing, and overlapping integration work stays off the protected base branch instead of trying to merge back on `main`. - Cleanup and finish flows are less brittle: `codex-agent` no longer waits on PRs that can never exist, and prune cleanup now walks both managed worktree roots so stale sandboxes get removed consistently. -- Mirror-sync diagnostics are quieter: when the mirror PAT is unset, Guardex now skips the sync path instead of marking the run red, and shared `ralplan` lanes stay easier to identify during handoff/debugging. -- Bumped `@imdeadpool/guardex` from `7.0.15` → `7.0.16` after npm rejected a republish over the already-published `7.0.15`. +- Mirror-sync diagnostics are quieter: when the mirror PAT is unset, GitGuardex now skips the sync path instead of marking the run red, and shared `ralplan` lanes stay easier to identify during handoff/debugging. +- Bumped the release from `7.0.15` → `7.0.16` after npm rejected a republish of `7.0.15`. ### v7.0.15 - `gx doctor` no longer blocks recursive nested protected-repo repairs on child PR merge waits; nested sandboxes now force `--no-wait-for-merge` so the parent repair loop can continue. 
@@ -652,7 +653,7 @@ npm pack --dry-run ### v7.0.13 - `gx status` and `gx setup` now present the Claude companion as `oh-my-claudecode` while still installing the published npm package `oh-my-claude-sisyphus`. -- When that dependency is inactive or the user declines the optional install, Guardex now prints the upstream repo URL so the missing dependency is explicit instead of hidden behind the npm package name. +- When that dependency is inactive or the user declines the optional install, GitGuardex now prints the upstream repo URL so the missing dependency is explicit instead of hidden behind the npm package name. - Bumped `@imdeadpool/guardex` from `7.0.12` → `7.0.13` after npm rejected a republish over the already-published `7.0.12`. ### v7.0.12 @@ -710,8 +711,8 @@ npm pack --dry-run - **Breaking (soft).** Consolidated 17 commands into 12 visible commands with flag-based subcommands. Removed names still work but print a deprecation notice; will be removed in v8. - **Token-usage improvements.** Trimmed auto-installed agent templates that live in every consumer repo and get loaded into every session: - `templates/AGENTS.multiagent-safety.md`: 6990 B → 1615 B (−77%) - - `templates/codex/skills/guardex/SKILL.md`: 2732 B → 1086 B (−60%) - - `templates/claude/commands/guardex.md`: 472 B → 357 B (−24%) + - `templates/codex/skills/gitguardex/SKILL.md`: 2732 B → 1086 B (−60%) + - `templates/claude/commands/gitguardex.md`: 472 B → 357 B (−24%) - Total: 10194 B → 3058 B per consumer repo (−70%, ~1.5k fewer tokens per agent session). - New `gx prompt` command replaces three prompt-emitting commands. - New flag surface on `gx setup`: `--install-only`, `--repair`. @@ -752,7 +753,7 @@ Version bumps for npm publish continuity plus incremental fixes: doctor arg-pars - Allows tightly guarded Codex-only commits for `AGENTS.md` / `.gitignore` on protected branches. ### v5.0.0 -- Rebranded CLI to **GuardeX** with `gx`-first command UX. 
+- Rebranded CLI to **GitGuardex** with `gx`-first command UX. - Published under scoped package name `@imdeadpool/guardex`. - Enforced repeatable per-message agent branch lifecycle in setup/init flows. - Added codex-auth-aware sandbox branch naming support. diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index 7b14a16..8110c39 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -11,6 +11,7 @@ const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); const TOOL_NAME = 'gitguardex'; const SHORT_TOOL_NAME = 'gx'; const LEGACY_NAMES = ['guardex', 'multiagent-safety']; +const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`; const OPENSPEC_PACKAGE = '@fission-ai/openspec'; const OMC_PACKAGE = 'oh-my-claude-sisyphus'; const OMC_REPO_URL = 'https://github.com/Yeachan-Heo/oh-my-claudecode'; @@ -281,7 +282,7 @@ const CLI_COMMAND_DESCRIPTIONS = [ ['prompt', 'Print AI setup checklist (--exec, --snippet)'], ['report', 'Security/safety reports (e.g. OpenSSF scorecard)'], ['help', 'Show this help output'], - ['version', 'Print GuardeX version'], + ['version', 'Print GitGuardex version'], ]; const DEPRECATED_COMMAND_ALIASES = new Map([ ['init', { target: 'setup', hint: 'gx setup' }], @@ -315,7 +316,7 @@ function defaultAgentWorktreeRelativeDir(env = process.env) { const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in this repo. 
-1) Install: npm i -g @imdeadpool/guardex && gh --version +1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version 2) Bootstrap: gx setup 3) Repair: gx doctor 4) Task loop: bash scripts/codex-agent.sh "" "" @@ -330,7 +331,7 @@ const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in thi 12) Fork sync: install https://github.com/apps/pull + cp .github/pull.yml.example .github/pull.yml `; -const AI_SETUP_COMMANDS = `npm i -g @imdeadpool/guardex +const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND} gh --version gx setup gx doctor @@ -2517,7 +2518,7 @@ function runDoctorInSandbox(options, blocked) { if (finishResult.stderr) process.stderr.write(finishResult.stderr); } else if (finishResult.status === 'failed') { console.log(`[${TOOL_NAME}] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); - console.log(`[guardex] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); + console.log(`[${TOOL_NAME}] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); if (finishResult.stdout) process.stdout.write(finishResult.stdout); if (finishResult.stderr) process.stderr.write(finishResult.stderr); } else { diff --git a/docs/branch-sync-spec.md b/docs/branch-sync-spec.md index 222cb14..a14b199 100644 --- a/docs/branch-sync-spec.md +++ b/docs/branch-sync-spec.md @@ -1,4 +1,4 @@ -# GuardeX branch sync feature spec (pre-implementation) +# GitGuardex branch sync feature spec (pre-implementation) Status: draft Scope: CLI UX + behavior spec + test matrix for keeping `agent/*` branches synced with `origin/dev` safely. 
diff --git a/docs/images/guardex-service-status.svg b/docs/images/guardex-service-status.svg index 22609f4..bf1ff1d 100644 --- a/docs/images/guardex-service-status.svg +++ b/docs/images/guardex-service-status.svg @@ -9,13 +9,13 @@ - [guardex] CLI: guardex/0.4.7 linux-x64 node-v22.22.0 - [guardex] Global services: + [gitguardex] CLI: gitguardex/0.4.7 linux-x64 node-v22.22.0 + [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active - [guardex] Repo safety service: ● active. - [guardex] Repo: /home/deadpool/YOUREPO - [guardex] Branch: ksskkfb02 + [gitguardex] Repo safety service: ● active. + [gitguardex] Repo: /home/deadpool/YOUREPO + [gitguardex] Branch: ksskkfb02 guardex-tools logs: ├─ USAGE diff --git a/docs/images/install-hero.svg b/docs/images/install-hero.svg index 1739d5f..783bb27 100644 --- a/docs/images/install-hero.svg +++ b/docs/images/install-hero.svg @@ -16,11 +16,11 @@ npm i -g -@imdeadpool/guardex +@imdeadpool/gitguardex added 47 packages in 3.2s -+ @imdeadpool/guardex@ ++ @imdeadpool/gitguardex@ 7.0.16 diff --git a/docs/images/setup-success.svg b/docs/images/setup-success.svg index a56260a..1dac87a 100644 --- a/docs/images/setup-success.svg +++ b/docs/images/setup-success.svg @@ -1,5 +1,5 @@ - guardex setup behavior outside git repository + gitguardex setup behavior outside git repository Terminal screenshot showing gx status, gx setup failure outside a git repo, and gx init unknown command output. @@ -12,31 +12,31 @@ - guardex status + setup + gitguardex status + setup deadpool@recodee:~/Documents/testguardex$ gx - [guardex] CLI: @imdeadpool/guardex/5.0.0 linux-x64 node-v22.22.0 - [guardex] Global services: + [gitguardex] CLI: @imdeadpool/gitguardex/5.0.0 linux-x64 node-v22.22.0 + [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active - [guardex] Repo safety service: ● inactive (no git repository at target). + [gitguardex] Repo safety service: ● inactive (no git repository at target). 
deadpool@recodee:~/Documents/testguardex$ gx setup - [guardex] Already installed globally: oh-my-codex, @fission-ai/openspec - [guardex] ✅ OMX/OpenSpec global tools already installed. Skipping. - [guardex] Target is not inside a git repository: /home/deadpool/Documents/testguardex + [gitguardex] Already installed globally: oh-my-codex, @fission-ai/openspec + [gitguardex] ✅ OMX/OpenSpec global tools already installed. Skipping. + [gitguardex] Target is not inside a git repository: /home/deadpool/Documents/testguardex fatal: not a git repository (or any of the parent directories): .git deadpool@recodee:~/Documents/testguardex$ gx - [guardex] CLI: @imdeadpool/guardex/5.0.0 linux-x64 node-v22.22.0 - [guardex] Global services: + [gitguardex] CLI: @imdeadpool/gitguardex/5.0.0 linux-x64 node-v22.22.0 + [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active - [guardex] Repo safety service: ● inactive (no git repository at target). + [gitguardex] Repo safety service: ● inactive (no git repository at target). deadpool@recodee:~/Documents/testguardex$ gx init - [guardex] Unknown command: init + [gitguardex] Unknown command: init deadpool@recodee:~/Documents/testguardex$ diff --git a/docs/images/status-tools-logs.svg b/docs/images/status-tools-logs.svg index ac0ff80..36b1c34 100644 --- a/docs/images/status-tools-logs.svg +++ b/docs/images/status-tools-logs.svg @@ -15,15 +15,15 @@ gx status - [guardex] CLI: @imdeadpool/guardex/5.0.2 linux-x64 node-v22.22.0 - [guardex] Global services: + [gitguardex] CLI: @imdeadpool/gitguardex/5.0.2 linux-x64 node-v22.22.0 + [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active - ● @imdeadpool/codex-account-switcher: active - [guardex] Repo safety service: ● active. - [guardex] Repo: /path/to/your/repo - [guardex] Branch: agent/your-branch - guardex-tools logs: + [gitguardex] Repo safety service: ● active. 
+ [gitguardex] Repo: /path/to/your/repo + [gitguardex] Branch: agent/your-branch + gitguardex-tools logs: ├─ USAGE @@ -31,7 +31,7 @@ ├─ COMMANDS - status Show GuardeX CLI + service health without modifying files + status Show GitGuardex CLI + service health without modifying files setup Install + repair guardrails in a git repo (supports --no-gitignore) diff --git a/docs/images/workflow-gx-terminal-status.svg b/docs/images/workflow-gx-terminal-status.svg index b49eac7..ac68d76 100644 --- a/docs/images/workflow-gx-terminal-status.svg +++ b/docs/images/workflow-gx-terminal-status.svg @@ -16,7 +16,7 @@ - [gitguardex] CLI: @imdeadpool/guardex/7.0.16 linux-x64 node-v22.22.0 + [gitguardex] CLI: @imdeadpool/gitguardex/7.0.16 linux-x64 node-v22.22.0 [gitguardex] Global services: - @@ -68,13 +68,13 @@ prompt Print AI setup checklist (--exec, --snippet) report Security/safety reports (e.g. OpenSSF scorecard) help Show this help output - version Print Guardex version + version Print GitGuardex version └─ AGENT BOT agents Start/stop review + cleanup bots for this repo └─ REPO TOGGLE - Set repo-root .env: GUARDEX_ON=0 disables Guardex, GUARDEX_ON=1 enables it again + Set repo-root .env: GUARDEX_ON=0 disables GitGuardex, GUARDEX_ON=1 enables it again └─ Try 'gitguardex doctor' for one-step repair + verification. diff --git a/docs/redditpost.md b/docs/redditpost.md index 8fc386b..254b5db 100644 --- a/docs/redditpost.md +++ b/docs/redditpost.md @@ -1,20 +1,20 @@ -# Reddit Post Kit for GuardeX +# Reddit Post Kit for GitGuardex Source baseline: [`README.md`](../README.md) Project links: - GitHub: https://github.com/recodeecom/multiagent-safety -- npm: https://www.npmjs.com/package/@imdeadpool/guardex +- npm: https://www.npmjs.com/package/@imdeadpool/gitguardex ## Recommended Title Options -1. `I open-sourced GuardeX: Git guardrails for multi-agent coding workflows` -2. `GuardeX (npm): safer branch + file-ownership workflow for parallel coding agents` -3. 
`Built an OSS CLI to stop multi-agent Git collisions (GuardeX)` +1. `I open-sourced GitGuardex: Git guardrails for multi-agent coding workflows` +2. `GitGuardex (npm): safer branch + file-ownership workflow for parallel coding agents` +3. `Built an OSS CLI to stop multi-agent Git collisions (GitGuardex)` ## Copy-Ready Reddit Post (long) -I open-sourced **GuardeX**, a CLI that adds guardrails for multi-agent coding in Git repos. +I open-sourced **GitGuardex**, a CLI that adds guardrails for multi-agent coding in Git repos. The goal is simple: if several agents (or teammates) work in parallel, prevent the common failure modes before they land in `main`. @@ -29,7 +29,7 @@ What it does: Quick start: ```bash -npm i -g @imdeadpool/guardex +npm i -g @imdeadpool/gitguardex gx setup ``` @@ -45,21 +45,21 @@ bash scripts/agent-branch-finish.sh --branch "$(git rev-parse --abbrev-ref HEAD) If you run Codex/Claude-style parallel workflows, I would value feedback on edge cases your team hits in production. GitHub: https://github.com/recodeecom/multiagent-safety -npm: https://www.npmjs.com/package/@imdeadpool/guardex +npm: https://www.npmjs.com/package/@imdeadpool/gitguardex ## Copy-Ready Reddit Post (short) -I open-sourced **GuardeX** for safer multi-agent Git workflows. +I open-sourced **GitGuardex** for safer multi-agent Git workflows. It adds branch/worktree guardrails, protected-branch enforcement, file-lock ownership, and repair scripts (`gx setup` / `gx doctor`) so parallel agent execution is safer by default. 
```bash -npm i -g @imdeadpool/guardex +npm i -g @imdeadpool/gitguardex gx setup ``` GitHub: https://github.com/recodeecom/multiagent-safety -npm: https://www.npmjs.com/package/@imdeadpool/guardex +npm: https://www.npmjs.com/package/@imdeadpool/gitguardex ## Images to include in the Reddit post diff --git a/frontend/app/layout.tsx b/frontend/app/layout.tsx index 3c7e8e2..c45d4cf 100644 --- a/frontend/app/layout.tsx +++ b/frontend/app/layout.tsx @@ -2,8 +2,8 @@ import type { Metadata } from 'next' import './globals.css' export const metadata: Metadata = { - title: 'GuardeX | How It Works', - description: 'A workflow-style GuardeX onboarding preview built with Next.js.' + title: 'GitGuardex | How It Works', + description: 'A workflow-style GitGuardex onboarding preview built with Next.js.' } export default function RootLayout({ diff --git a/frontend/app/page.tsx b/frontend/app/page.tsx index cb23fe3..7856df2 100644 --- a/frontend/app/page.tsx +++ b/frontend/app/page.tsx @@ -126,7 +126,7 @@ interface ModeGuide { } const MODE_ORDER: ModeKey[] = ['execute', 'plan', 'merge', 'installation'] -const INSTALL_COMMAND = 'npm i -g @imdeadpool/guardex' +const INSTALL_COMMAND = 'npm i -g @imdeadpool/gitguardex' const PRODUCT_LABEL = 'Recodee' const EDITOR_LABEL = 'recodee — VS Code' @@ -1669,8 +1669,9 @@ const INSTALL_STEPS: TutorialStep[] = [ label: 'Install the CLI globally', description: ( <> - GuardeX ships as a single npm package. Install once and you get the gx,{' '} - guardex, and multiagent-safety binaries on your PATH. + GitGuardex ships as a single npm package. Install once and you get the gx,{' '} + gitguardex, guardex (legacy), and multiagent-safety{' '} + binaries on your PATH. 
), messages: [ @@ -1696,7 +1697,7 @@ const INSTALL_STEPS: TutorialStep[] = [ sub: '· global install', elapsed: '3.8s', rows: [ - { kind: 'shell', label: 'bash:', value: 'npm i -g @imdeadpool/guardex' }, + { kind: 'shell', label: 'bash:', value: 'npm i -g @imdeadpool/gitguardex' }, ], }, ], @@ -1711,11 +1712,11 @@ const INSTALL_STEPS: TutorialStep[] = [ ], worktrees: [], codeLines: [ - { parts: [c('$ npm i -g @imdeadpool/guardex', 'c')] }, + { parts: [c('$ npm i -g @imdeadpool/gitguardex', 'c')] }, { parts: [c('added 1 package in 3.8s')] }, { parts: [c('')] }, { parts: [c('$ gx --version')] }, - { parts: [c('guardex 7.0.6', 'f')] }, + { parts: [c('gitguardex 7.0.16', 'f')] }, ], statusBranch: 'dev', }, @@ -1724,7 +1725,7 @@ const INSTALL_STEPS: TutorialStep[] = [ label: 'Audit the repo with gx doctor', description: ( <> - gx doctor scans for the guardrails GuardeX needs: git hooks, file-lock + gx doctor scans for the guardrails GitGuardex needs: git hooks, file-lock scripts, OpenSpec workspace, and ignore patterns. It reports what’s missing and offers to repair. 
@@ -1759,7 +1760,7 @@ const INSTALL_STEPS: TutorialStep[] = [ worktrees: [], codeLines: [ { parts: [c('$ gx doctor', 'c')] }, - { parts: [c('[guardex] Doctor/fix: ', 'n'), c('/home/you/your-repo', 's')] }, + { parts: [c('[gitguardex] Doctor/fix: ', 'n'), c('/home/you/your-repo', 's')] }, { parts: [c(' - unchanged .omx')] }, { parts: [c(' - unchanged .omx/state')] }, { parts: [c(' - unchanged .omx/logs')] }, @@ -1778,12 +1779,12 @@ const INSTALL_STEPS: TutorialStep[] = [ { parts: [c(' - unchanged .githooks/post-checkout')] }, { parts: [c(' - unchanged .gitignore')] }, { parts: [c(' - hooksPath set core.hooksPath=.githooks')] }, - { parts: [c('[guardex] Scan target: ', 'n'), c('/home/you/your-repo', 's')] }, - { parts: [c('[guardex] Branch: ', 'n'), c('dev', 'f')] }, - { parts: [c('[guardex] '), c('✅ No safety issues detected.', 'f')] }, + { parts: [c('[gitguardex] Scan target: ', 'n'), c('/home/you/your-repo', 's')] }, + { parts: [c('[gitguardex] Branch: ', 'n'), c('dev', 'f')] }, + { parts: [c('[gitguardex] '), c('✅ No safety issues detected.', 'f')] }, { parts: [ - c('[guardex] Auto-finish sweep (base=dev): ', 'n'), + c('[gitguardex] Auto-finish sweep (base=dev): ', 'n'), c('attempted=5', 'f'), c(', completed=0, skipped=15, failed=5'), ], @@ -1791,7 +1792,7 @@ const INSTALL_STEPS: TutorialStep[] = [ { parts: [c(' [skip] agent/claude-16-11/… already merged into dev.', 'c')] }, { parts: [c(' [skip] agent/codex-20-20/… already merged into dev.', 'c')] }, { parts: [c(' [fail] agent/…/rebase-in-progress → resolve conflicts.', 'c')] }, - { parts: [c('[guardex] '), c('✅ Repo is fully safe.', 'f')] }, + { parts: [c('[gitguardex] '), c('✅ Repo is fully safe.', 'f')] }, ], statusBranch: 'dev', }, @@ -1838,7 +1839,7 @@ const INSTALL_STEPS: TutorialStep[] = [ { parts: [c('scaffolding OpenSpec → openspec/'), c(' ✓', 'f')] }, { parts: [c('protecting main, dev branches'), c(' ✓', 'f')] }, { parts: [c('')] }, - { parts: [c('GuardeX ready. 
Start your first agent with `gx start`.', 'p')] }, + { parts: [c('GitGuardex ready. Start your first agent with `gx start`.', 'p')] }, ], statusBranch: 'dev', }, @@ -1934,7 +1935,7 @@ const INSTALL_STEPS: TutorialStep[] = [ kind: 'assistant', content: ( <> - Work done. Firing the full finish chain — GuardeX will commit, push, PR, wait for merge, + Work done. Firing the full finish chain — GitGuardex will commit, push, PR, wait for merge, and clean the sandbox. ), @@ -2026,7 +2027,7 @@ const MODE_GUIDES: Record = { }, merge: { eyebrow: 'Conflict recovery lane', - title: 'Show how GuardeX merges without trashing either branch', + title: 'Show how GitGuardex merges without trashing either branch', summary: 'Merge mode visualizes the recovery path when two PRs collide. A dedicated merge lane appears, reads both intents, writes a semantic resolution, and proves it with tests.', highlights: [ @@ -2146,7 +2147,7 @@ const getTakeawayCopy = (mode: ModeKey, step: TutorialStep): string => { if (step.worktrees.length === 1) { return 'Once the sandbox exists, every visible file mutation becomes reviewable and reversible.' } - return 'The execute demo starts by proving that GuardeX does not skip straight to writing code.' + return 'The execute demo starts by proving that GitGuardex does not skip straight to writing code.' } if (mode === 'plan') { @@ -2740,7 +2741,7 @@ export default function Home() {
guardian workflow
-GuardeX
+GitGuardex
the Guardian T-Rex for your repo ·{' '} &2 <<'MSG' [guardex-preedit-guard] Codex edit/commit detected on a protected branch. -GuardeX requires Codex work to run from an isolated agent/* branch. +GitGuardex requires Codex work to run from an isolated agent/* branch. Start the sub-branch/worktree with: bash scripts/codex-agent.sh "" "" Or manually: diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh index 8f878e7..07f26c7 100755 --- a/templates/scripts/openspec/init-change-workspace.sh +++ b/templates/scripts/openspec/init-change-workspace.sh @@ -60,8 +60,8 @@ Describe the change in a sentence or two. Commit message is the spec of record. - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`). NOTESEOF fi - echo "[guardex] OpenSpec change workspace (minimal) ready: ${CHANGE_DIR}" - echo "[guardex] Notes-only scaffold: ${CHANGE_DIR}/notes.md" + echo "[gitguardex] OpenSpec change workspace (minimal) ready: ${CHANGE_DIR}" + echo "[gitguardex] Notes-only scaffold: ${CHANGE_DIR}/notes.md" exit 0 fi @@ -129,5 +129,5 @@ The system SHALL enforce ${CAPABILITY_SLUG} behavior as defined by this change. 
SPECEOF fi -echo "[guardex] OpenSpec change workspace ready: ${CHANGE_DIR}" -echo "[guardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" +echo "[gitguardex] OpenSpec change workspace ready: ${CHANGE_DIR}" +echo "[gitguardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" diff --git a/test/install.test.js b/test/install.test.js index 5d329dc..2b64254 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -191,7 +191,7 @@ function seedReleasePackageManifest(repoDir, overrides = {}) { const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); const mergedPackageJson = { ...packageJson, - name: packageJson.name || '@imdeadpool/guardex', + name: packageJson.name || '@imdeadpool/gitguardex', version: cliVersion, repository: { type: 'git', @@ -2447,7 +2447,7 @@ if [[ "$1" == "list" ]]; then echo '{"dependencies":{"oh-my-codex":{},"@fission-ai/openspec":{}}}' exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then echo "updated" > "${markerPath}" exit 0 fi @@ -2472,11 +2472,11 @@ exit 1 test('self-update verifies on-disk version after @latest install and retries with pinned version when stale', () => { const repoDir = initRepo(); const fakeGlobalRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-global-root-')); - const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'guardex'); + const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'gitguardex'); fs.mkdirSync(installedPkgDir, { recursive: true }); fs.writeFileSync( path.join(installedPkgDir, 'package.json'), - JSON.stringify({ name: '@imdeadpool/guardex', version: cliVersion }), + JSON.stringify({ name: '@imdeadpool/gitguardex', version: cliVersion }), 'utf8', ); const markerLatest = path.join(repoDir, '.npm-at-latest-called'); @@ -2494,15 +2494,15 @@ if [[ "$1" == "root" && "$2" == "-g" ]]; then echo "${fakeGlobalRoot}" exit 0 fi 
-if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then touch "${markerLatest}" # Simulate the npm quirk: report success without rewriting the on-disk package.json. exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@9.9.9" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@9.9.9" ]]; then touch "${markerPinned}" # Pinned retry actually advances the on-disk version. - printf '%s' '{"name":"@imdeadpool/guardex","version":"9.9.9"}' > "${installedPkgDir}/package.json" + printf '%s' '{"name":"@imdeadpool/gitguardex","version":"9.9.9"}' > "${installedPkgDir}/package.json" exit 0 fi echo "unexpected npm args: $*" >&2 @@ -2527,14 +2527,14 @@ exit 1 test('self-update restarts into the installed CLI after a successful on-disk upgrade', () => { const repoDir = initRepo(); const fakeGlobalRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-global-root-')); - const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'guardex'); + const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'gitguardex'); const installedBinDir = path.join(installedPkgDir, 'bin'); const reexecMarker = path.join(repoDir, '.self-update-reexec-called'); fs.mkdirSync(installedBinDir, { recursive: true }); fs.writeFileSync( path.join(installedPkgDir, 'package.json'), JSON.stringify({ - name: '@imdeadpool/guardex', + name: '@imdeadpool/gitguardex', version: '9.9.9', bin: { gx: 'bin/multiagent-safety.js' }, }), @@ -2562,7 +2562,7 @@ if [[ "$1" == "root" && "$2" == "-g" ]]; then echo "${fakeGlobalRoot}" exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then exit 0 fi echo "unexpected npm args: $*" >&2 @@ -2651,7 +2651,7 @@ test('status --json returns cli, services, and repo summary', () => { 
assert.equal(result.status, 0, result.stderr || result.stdout); const parsed = JSON.parse(result.stdout); - assert.equal(parsed.cli.name, '@imdeadpool/guardex'); + assert.equal(parsed.cli.name, '@imdeadpool/gitguardex'); assert.equal(typeof parsed.cli.version, 'string'); assert.equal(Array.isArray(parsed.services), true); const claudeService = parsed.services.find((service) => service.name === 'oh-my-claudecode'); @@ -4535,7 +4535,7 @@ test('prompt outputs AI setup instructions', () => { const repoDir = initRepo(); const result = runNode(['prompt'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /npm i -g @imdeadpool\/guardex/); + assert.match(result.stdout, /npm i -g @imdeadpool\/gitguardex/); assert.match(result.stdout, /GitGuardex \(gx\) setup checklist/); assert.match(result.stdout, /gx setup/); assert.match(result.stdout, /gx doctor/); @@ -4552,7 +4552,7 @@ test('prompt --exec outputs command-only checklist', () => { const repoDir = initRepo(); const result = runNode(['prompt', '--exec'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /^npm i -g @imdeadpool\/guardex/m); + assert.match(result.stdout, /^npm i -g @imdeadpool\/gitguardex/m); assert.match(result.stdout, /^gh --version/m); assert.match(result.stdout, /^gx setup$/m); assert.match(result.stdout, /^gx doctor$/m); @@ -4575,7 +4575,7 @@ test('deprecated copy-commands alias still works and warns', () => { const repoDir = initRepo(); const result = runNode(['copy-commands'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /^npm i -g @imdeadpool\/guardex/m); + assert.match(result.stdout, /^npm i -g @imdeadpool\/gitguardex/m); assert.match(result.stderr, /'copy-commands' is deprecated/); assert.match(result.stderr, /gx prompt --exec/); }); From d43ed5999cbfcec175e926c16fc486e0cf416c50 Mon Sep 17 00:00:00 2001 From: Viktor Nagy 
<137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 22:00:20 +0200 Subject: [PATCH 31/48] Restore post-rename parity so branding CI stays green (#263) The package rename left one stale doctor-output assertion, README structure drift, and two runtime-template mismatches. This follow-up restores the canonical README anchors and keeps installed runtime scripts aligned with their templates so GitGuardex verifies cleanly after the rename. Constraint: Protected main requires agent branch plus PR finish flow Rejected: Revert the branding rename | the repo and published package already use GitGuardex Confidence: high Scope-risk: narrow Reversibility: clean Directive: When branding or log prefixes change, update runtime helpers and their templates together Tested: node --test --test-name-pattern "(doctor on protected main fails when sandbox PR is not merged|README keeps canonical About copy and problem-solution visuals aligned|critical runtime helper scripts stay in sync with templates)" test/install.test.js test/metadata.test.js; npm test; node --check bin/multiagent-safety.js; openspec validate agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02 --type change --strict; openspec validate --specs Not-tested: Remote PR merge path Co-authored-by: NagyVikt --- README.md | 6 +++ .../tasks.md | 3 ++ scripts/codex-agent.sh | 51 +++++++++++++++++++ scripts/openspec/init-change-workspace.sh | 8 +-- test/install.test.js | 2 +- 5 files changed, 65 insertions(+), 5 deletions(-) diff --git a/README.md b/README.md index 84bbe51..8f05ff0 100644 --- a/README.md +++ b/README.md @@ -10,6 +10,10 @@ [![CodeQL](https://img.shields.io/github/actions/workflow/status/recodeee/gitguardex/codeql.yml?branch=main&label=CodeQL)](https://github.com/recodeee/gitguardex/actions/workflows/codeql.yml) [![OpenSSF 
Scorecard](https://api.securityscorecards.dev/projects/github.com/recodeee/gitguardex/badge)](https://securityscorecards.dev/viewer/?uri=github.com/recodeee/gitguardex) +[about_description.txt](./about_description.txt) + +Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman. + **GitGuardex is a safety layer for parallel agent work in git repos.** If you're running more than one Codex or Claude agent on the same codebase, this is what keeps them from deleting each other's work. > [!WARNING] @@ -28,6 +32,8 @@ I was running ~30 Codex agents in parallel and hit a wall: they kept working on ### Solution +![Agent branch/worktree start protocol](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-branch-start.svg) + GitGuardex exists to stop that loop. Every agent gets its own worktree, claims the files it's touching, and can't clobber files another agent has claimed. Your local branch stays clean; agents stay in their lanes. diff --git a/openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/tasks.md b/openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/tasks.md index 9db1d8d..86d05c5 100644 --- a/openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/tasks.md +++ b/openspec/changes/agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02/tasks.md @@ -5,18 +5,21 @@ ## 2. Tests - [x] 2.1 Update targeted install/self-update/status expectations for the renamed package. +- [x] 2.2 Restore post-rename regression coverage for branded doctor output, README canonical copy, and runtime/template parity. ## 3. Implementation - [x] 3.1 Rename the published package metadata to `@imdeadpool/gitguardex`. - [x] 3.2 Refresh CLI install prompts and package-name-dependent surfaces. 
- [x] 3.3 Refresh README, tutorial/docs copy, and README-linked assets to use GitGuardex npm/install wording. +- [x] 3.4 Repair post-rename regressions in README structure, branded doctor-output assertions, and `codex-agent` runtime parity. ## 4. Verification - [x] 4.1 Run renamed-package verification (`node --test --test-name-pattern "(self-update verifies on-disk version after @latest install and retries with pinned version when stale|self-update restarts into the installed CLI after a successful on-disk upgrade|status --json returns cli, services, and repo summary|prompt outputs AI setup instructions|prompt --exec outputs command-only checklist|deprecated copy-commands alias still works and warns)" test/install.test.js`, `node --check bin/multiagent-safety.js`, `npm pack --dry-run`). Result: targeted renamed-package tests passed `6/6`; `node --check bin/multiagent-safety.js` passed; `npm pack --dry-run` produced `imdeadpool-gitguardex-7.0.16.tgz`. - [x] 4.2 Run `openspec validate agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02 --type change --strict`. Result: `Change 'agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02' is valid`. - [x] 4.3 Run `openspec validate --specs`. Result: `No items found to validate.` +- [x] 4.4 Re-run the affected metadata/install assertions plus `npm test` after the post-rename regression repair and record the results. Result: `diff -u templates/scripts/codex-agent.sh scripts/codex-agent.sh` passed; `node --check bin/multiagent-safety.js` passed; `openspec validate agent-codex-rename-npm-package-to-gitguardex-everywh-2026-04-21-21-02 --type change --strict` passed; `openspec validate --specs` returned `No items found to validate.`; targeted regression rerun passed `3/3`; full `npm test` passed `161/161`. ## 5. 
Cleanup diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index a8a53c8..8b509dd 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -6,6 +6,7 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -143,6 +144,7 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" +active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -446,6 +448,40 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } +run_active_session_state() { + local action="$1" + shift + + if [[ ! -f "$active_session_state_script" ]]; then + return 0 + fi + if ! command -v "$NODE_BIN" >/dev/null 2>&1; then + return 0 + fi + + "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true +} + +record_active_session_state() { + local wt="$1" + local branch="$2" + + run_active_session_state \ + start \ + --repo "$repo_root" \ + --branch "$branch" \ + --task "$TASK_NAME" \ + --agent "$AGENT_NAME" \ + --worktree "$wt" \ + --pid "$$" \ + --cli "$CODEX_BIN" +} + +clear_active_session_state() { + local branch="$1" + run_active_session_state stop --repo "$repo_root" --branch "$branch" +} + origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -833,6 +869,19 @@ if ! 
ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi +active_session_recorded=0 +cleanup_active_session_state_on_exit() { + set +e + if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then + clear_active_session_state "$worktree_branch" + active_session_recorded=0 + fi +} + +record_active_session_state "$worktree_path" "$worktree_branch" +active_session_recorded=1 +trap cleanup_active_session_state_on_exit EXIT INT TERM + echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -841,6 +890,8 @@ codex_exit="$?" set -e cd "$repo_root" +cleanup_active_session_state_on_exit +trap - EXIT INT TERM final_exit="$codex_exit" auto_finish_completed=0 diff --git a/scripts/openspec/init-change-workspace.sh b/scripts/openspec/init-change-workspace.sh index 8f878e7..07f26c7 100755 --- a/scripts/openspec/init-change-workspace.sh +++ b/scripts/openspec/init-change-workspace.sh @@ -60,8 +60,8 @@ Describe the change in a sentence or two. Commit message is the spec of record. - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`). NOTESEOF fi - echo "[guardex] OpenSpec change workspace (minimal) ready: ${CHANGE_DIR}" - echo "[guardex] Notes-only scaffold: ${CHANGE_DIR}/notes.md" + echo "[gitguardex] OpenSpec change workspace (minimal) ready: ${CHANGE_DIR}" + echo "[gitguardex] Notes-only scaffold: ${CHANGE_DIR}/notes.md" exit 0 fi @@ -129,5 +129,5 @@ The system SHALL enforce ${CAPABILITY_SLUG} behavior as defined by this change. 
SPECEOF fi -echo "[guardex] OpenSpec change workspace ready: ${CHANGE_DIR}" -echo "[guardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" +echo "[gitguardex] OpenSpec change workspace ready: ${CHANGE_DIR}" +echo "[gitguardex] OpenSpec change spec scaffold: ${SPEC_DIR}/spec.md" diff --git a/test/install.test.js b/test/install.test.js index 2b64254..89c7c5f 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -1339,7 +1339,7 @@ exit 1 assert.doesNotMatch(ghCalls, /pr merge .* --auto/); const combinedOutput = `${result.stdout}\n${result.stderr}`; assert.match(combinedOutput, /PR closed without merge; cannot continue auto-finish/); - assert.match(combinedOutput, /\[guardex\] Auto-finish flow failed for sandbox branch/); + assert.match(combinedOutput, /\[gitguardex\] Auto-finish flow failed for sandbox branch/); assert.doesNotMatch(combinedOutput, /Auto-finish flow completed for sandbox branch/); }); From 02aa2939b7252074f3b3b650e33b08614ba0955f Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 23:33:23 +0200 Subject: [PATCH 32/48] Match the active-agents view to repo-scoped SCM grouping (#264) The VS Code companion was rendering a flat live-session list, which could not mirror the grouped Source Control layout operators expect. This change keeps repo context in the SCM-side tree, groups live lanes under ACTIVE AGENTS, surfaces repo-root changes beside them, and filters runtime presence files out of the change list while locking the behavior with focused tests. 
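The presence-file filtering described above can be sketched as follows. This is a minimal illustration, not the shipped helper: `filterPresenceFiles` is a hypothetical name, and it assumes change rows arrive as `git status --porcelain=v1` lines (`XY path`).

```javascript
// Hypothetical sketch: drop runtime presence files under
// .omx/state/active-sessions/ from a porcelain-v1 change list
// before rendering repo change rows.
const PRESENCE_DIR = '.omx/state/active-sessions';

function filterPresenceFiles(statusLines) {
  return statusLines.filter((line) => {
    // Porcelain v1: two status characters, a space, then the path.
    const relativePath = line.slice(3).trim().replace(/\\/g, '/');
    return relativePath !== PRESENCE_DIR
      && !relativePath.startsWith(`${PRESENCE_DIR}/`);
  });
}

console.log(filterPresenceFiles([
  ' M templates/vscode/guardex-active-agents/extension.js',
  '?? .omx/state/active-sessions/agent-codex-live-task.json',
]));
// keeps only the extension.js row; the session presence file is a
// runtime signal, not an operator-facing change
```

The same exclusion logic lands in `parseRepoChangeLine` in the diff below, where it sits beside rename and quoted-path handling.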
Constraint: Must stay within VS Code contributed tree-view APIs without mutating the built-in Git provider Rejected: Keep the flat active-agent list only | it still loses repo and change context Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep .omx/state/active-sessions presence files out of repo change rows; they are runtime signals, not operator-facing changes Tested: node --test test/vscode-active-agents-session-state.test.js Tested: openspec validate agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22 --type change --strict Tested: openspec validate --specs Not-tested: Manual rendering inside a real VS Code window Co-authored-by: NagyVikt --- README.md | 2 +- .../.openspec.yaml | 2 + .../proposal.md | 16 ++ .../spec.md | 21 +++ .../tasks.md | 31 ++++ .../vscode/guardex-active-agents/README.md | 4 +- .../vscode/guardex-active-agents/extension.js | 167 +++++++++++++++--- .../vscode/guardex-active-agents/package.json | 2 +- .../guardex-active-agents/session-schema.js | 110 +++++++++++- ...vscode-active-agents-session-state.test.js | 56 +++++- 10 files changed, 380 insertions(+), 31 deletions(-) create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/.openspec.yaml create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/proposal.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/specs/vscode-active-agents-scm-provider-layout/spec.md create mode 100644 openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/tasks.md diff --git a/README.md b/README.md index 8f05ff0..294b83e 100644 --- a/README.md +++ b/README.md @@ -245,7 +245,7 @@ To install the real companion into local VS Code from a GitGuardex-wired repo: node scripts/install-vscode-active-agents-extension.js ``` -It adds an `Active Agents` view to the Source Control container, reads 
`.omx/state/active-sessions/*.json`, derives `thinking` versus `working` from each live sandbox worktree, and uses VS Code's native `loading~spin` codicon for the running-state affordance. Reload the VS Code window after install. +It adds an `Active Agents` view to the Source Control container, groups each live repo into `ACTIVE AGENTS` and `CHANGES` sections, reads `.omx/state/active-sessions/*.json`, derives `thinking` versus `working` from each live sandbox worktree, and uses VS Code's native `loading~spin` codicon for the running-state affordance. Reload the VS Code window after install. --- diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/.openspec.yaml b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/proposal.md b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/proposal.md new file mode 100644 index 0000000..9202ec7 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/proposal.md @@ -0,0 +1,16 @@ +## Why + +- The shipped VS Code companion only renders a flat `Active Agents` list, so it cannot resemble the grouped Source Control layout operators expect from the real Guardex workflow. +- Users need the companion to show active lanes in repo context, with nearby repo changes visible in the same SCM-side tree instead of forcing a separate mental model. + +## What Changes + +- Reshape the SCM-container tree so each repo renders as a top-level node with grouped `ACTIVE AGENTS` and `CHANGES` sections. 
+- Keep the existing live-agent activity copy (`thinking` / `working`, changed-file counts, elapsed time), while also deriving repo-root git changes for the new `CHANGES` group. +- Add focused regression coverage for the new grouped tree structure and update the extension/readme copy to describe the repo-context view. + +## Impact + +- Affected surfaces: `templates/vscode/guardex-active-agents/extension.js`, `templates/vscode/guardex-active-agents/session-schema.js`, `templates/vscode/guardex-active-agents/README.md`, `test/vscode-active-agents-session-state.test.js`, and the root `README.md`. +- Risk is narrow because the change remains read-only; it only changes how repo/session state is presented in the VS Code companion. +- If repo git state cannot be inspected, the extension should keep showing active agents and simply omit repo-change rows instead of failing the whole view. diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/specs/vscode-active-agents-scm-provider-layout/spec.md b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/specs/vscode-active-agents-scm-provider-layout/spec.md new file mode 100644 index 0000000..a167ba1 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/specs/vscode-active-agents-scm-provider-layout/spec.md @@ -0,0 +1,21 @@ +## ADDED Requirements + +### Requirement: Active Agents SCM tree keeps repo context +The Guardex Active Agents VS Code companion SHALL render live sessions under a repo-scoped tree layout so operators can see active lanes in the same SCM-side structure as nearby repo changes. 
+ +#### Scenario: Live repo renders grouped sections +- **WHEN** the companion finds one or more live Guardex sessions for a repo in the current workspace +- **THEN** the SCM tree shows a repo node for that repo +- **AND** the repo node contains an `ACTIVE AGENTS` section with the live session rows +- **AND** each session row keeps its activity state plus elapsed-time description. + +#### Scenario: Repo changes render beside active agents +- **WHEN** a repo with live Guardex sessions also has local git modifications in its root working tree +- **THEN** the repo node also contains a `CHANGES` section +- **AND** the change rows reflect the repo-relative changed paths +- **AND** the change rows surface concise git status markers. + +#### Scenario: Change inspection failure degrades safely +- **WHEN** the companion cannot inspect repo git status for a repo that still has live Guardex sessions +- **THEN** the `ACTIVE AGENTS` section still renders +- **AND** the repo simply omits `CHANGES` rows instead of crashing or hiding the repo node. diff --git a/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/tasks.md b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/tasks.md new file mode 100644 index 0000000..aab6ad3 --- /dev/null +++ b/openspec/changes/agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22/tasks.md @@ -0,0 +1,31 @@ +## Definition of Done + +This change is complete only when **all** of the following are true: + +- Every checkbox below is checked. +- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff. +- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. 
+ +Handoff: 2026-04-21 21:24Z codex owns `templates/vscode/guardex-active-agents/*`, `test/vscode-active-agents-session-state.test.js`, `README.md`, and this change workspace to ship a repo-grouped SCM tree with `ACTIVE AGENTS` plus repo `CHANGES`. + +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22`. +- [x] 1.2 Define normative requirements in `specs/vscode-active-agents-scm-provider-layout/spec.md`. + +## 2. Implementation + +- [x] 2.1 Replace the flat session list with a repo-scoped tree that groups `ACTIVE AGENTS` and repo `CHANGES`. +- [x] 2.2 Add/update focused regression coverage for the grouped tree structure and repo-change parsing. + +## 3. Verification + +- [x] 3.1 Run targeted project verification commands. +- [x] 3.2 Run `openspec validate agent-codex-vscode-active-agents-scm-provider-layout-2026-04-21-23-22 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. + +## 4. Cleanup (mandatory; run before claiming completion) + +- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent/codex/vscode-active-agents-scm-provider-layout-2026-04-21-23-22 --base main --via-pr --wait-for-merge --cleanup`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch). diff --git a/templates/vscode/guardex-active-agents/README.md b/templates/vscode/guardex-active-agents/README.md index aa9fabf..06bb43e 100644 --- a/templates/vscode/guardex-active-agents/README.md +++ b/templates/vscode/guardex-active-agents/README.md @@ -5,7 +5,9 @@ Local VS Code companion for Guardex-managed repos. 
What it does: - Adds an `Active Agents` view to the Source Control container. -- Renders one row per live Guardex sandbox session. +- Renders one repo node per live Guardex workspace with grouped `ACTIVE AGENTS` and `CHANGES` sections. +- Shows one row per live Guardex sandbox session inside the repo's `ACTIVE AGENTS` section. +- Shows repo-root git changes in a sibling `CHANGES` section when the guarded repo itself is dirty. - Derives `thinking` versus `working` from the live sandbox worktree and shows changed-file counts for active edits. - Uses VS Code's native animated `loading~spin` icon for the running-state affordance. - Reads repo-local presence files from `.omx/state/active-sessions/`. diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js index 2c89e51..c67eb05 100644 --- a/templates/vscode/guardex-active-agents/extension.js +++ b/templates/vscode/guardex-active-agents/extension.js @@ -1,6 +1,7 @@ +const fs = require('node:fs'); const path = require('node:path'); const vscode = require('vscode'); -const { formatElapsedFrom, readActiveSessions } = require('./session-schema.js'); +const { formatElapsedFrom, readActiveSessions, readRepoChanges } = require('./session-schema.js'); class InfoItem extends vscode.TreeItem { constructor(label, description = '') { @@ -11,17 +12,31 @@ class InfoItem extends vscode.TreeItem { } class RepoItem extends vscode.TreeItem { - constructor(repoRoot, sessions) { + constructor(repoRoot, sessions, changes) { super(path.basename(repoRoot), vscode.TreeItemCollapsibleState.Expanded); this.repoRoot = repoRoot; this.sessions = sessions; - this.description = `${sessions.length} active`; + this.changes = changes; + const descriptionParts = [`${sessions.length} active`]; + if (changes.length > 0) { + descriptionParts.push(`${changes.length} changed`); + } + this.description = descriptionParts.join(' · '); this.tooltip = repoRoot; this.iconPath = new 
vscode.ThemeIcon('repo'); this.contextValue = 'gitguardex.repo'; } } +class SectionItem extends vscode.TreeItem { + constructor(label, items) { + super(label, vscode.TreeItemCollapsibleState.Expanded); + this.items = items; + this.description = items.length > 0 ? String(items.length) : ''; + this.contextValue = 'gitguardex.section'; + } +} + class SessionItem extends vscode.TreeItem { constructor(session) { super(session.label, vscode.TreeItemCollapsibleState.None); @@ -53,10 +68,103 @@ class SessionItem extends vscode.TreeItem { } } +class FolderItem extends vscode.TreeItem { + constructor(label, relativePath, items) { + super(label, vscode.TreeItemCollapsibleState.Expanded); + this.relativePath = relativePath; + this.items = items; + this.tooltip = relativePath; + this.iconPath = new vscode.ThemeIcon('folder'); + this.contextValue = 'gitguardex.folder'; + } +} + +class ChangeItem extends vscode.TreeItem { + constructor(change) { + super(path.basename(change.relativePath), vscode.TreeItemCollapsibleState.None); + this.change = change; + this.description = change.statusLabel; + this.tooltip = [ + change.relativePath, + `Status ${change.statusText}`, + change.originalPath ? `Renamed from ${change.originalPath}` : '', + change.absolutePath, + ].filter(Boolean).join('\n'); + this.resourceUri = vscode.Uri.file(change.absolutePath); + this.contextValue = 'gitguardex.change'; + this.command = { + command: 'gitguardex.activeAgents.openChange', + title: 'Open Changed File', + arguments: [change], + }; + } +} + function repoRootFromSessionFile(filePath) { return path.resolve(path.dirname(filePath), '..', '..', '..'); } +function buildChangeTreeNodes(changes) { + const root = []; + + function sortNodes(nodes) { + nodes.sort((left, right) => { + const leftIsFolder = left.kind === 'folder'; + const rightIsFolder = right.kind === 'folder'; + if (leftIsFolder !== rightIsFolder) { + return leftIsFolder ? 
-1 : 1; + } + return left.label.localeCompare(right.label); + }); + + for (const node of nodes) { + if (node.kind === 'folder') { + sortNodes(node.children); + } + } + } + + for (const change of changes) { + const segments = change.relativePath.split(/[\\/]+/).filter(Boolean); + if (segments.length <= 1) { + root.push({ kind: 'change', label: change.relativePath, change }); + continue; + } + + let nodes = root; + let folderPath = ''; + for (const segment of segments.slice(0, -1)) { + folderPath = folderPath ? path.posix.join(folderPath, segment) : segment; + let folderNode = nodes.find((node) => node.kind === 'folder' && node.relativePath === folderPath); + if (!folderNode) { + folderNode = { + kind: 'folder', + label: segment, + relativePath: folderPath, + children: [], + }; + nodes.push(folderNode); + } + nodes = folderNode.children; + } + + nodes.push({ kind: 'change', label: change.relativePath, change }); + } + + sortNodes(root); + + function materialize(nodes) { + return nodes.map((node) => { + if (node.kind === 'folder') { + return new FolderItem(node.label, node.relativePath, materialize(node.children)); + } + return new ChangeItem(node.change); + }); + } + + return materialize(root); +} + class ActiveAgentsProvider { constructor() { this.onDidChangeTreeDataEmitter = new vscode.EventEmitter(); @@ -95,29 +203,31 @@ class ActiveAgentsProvider { async getChildren(element) { if (element instanceof RepoItem) { - return element.sessions.map((session) => new SessionItem(session)); + const sectionItems = [ + new SectionItem('ACTIVE AGENTS', element.sessions.map((session) => new SessionItem(session))), + ]; + if (element.changes.length > 0) { + sectionItems.push(new SectionItem('CHANGES', buildChangeTreeNodes(element.changes))); + } + return sectionItems; } - const sessionsByRepo = await this.loadSessionsByRepo(); - const sessionCount = [...sessionsByRepo.values()].reduce((total, sessions) => total + sessions.length, 0); + if (element instanceof SectionItem || 
element instanceof FolderItem) { + return element.items; + } + + const repoEntries = await this.loadRepoEntries(); + const sessionCount = repoEntries.reduce((total, entry) => total + entry.sessions.length, 0); this.updateViewState(sessionCount); - const repos = [...sessionsByRepo.entries()] - .map(([repoRoot, sessions]) => ({ repoRoot, sessions })) - .filter((entry) => entry.sessions.length > 0) - .sort((left, right) => left.repoRoot.localeCompare(right.repoRoot)); - if (repos.length === 0) { + if (repoEntries.length === 0) { return [new InfoItem('No active Guardex agents', 'Open or start a sandbox session.')]; } - if (repos.length === 1) { - return repos[0].sessions.map((session) => new SessionItem(session)); - } - - return repos.map((entry) => new RepoItem(entry.repoRoot, entry.sessions)); + return repoEntries.map((entry) => new RepoItem(entry.repoRoot, entry.sessions, entry.changes)); } - async loadSessionsByRepo() { + async loadRepoEntries() { const sessionFiles = await vscode.workspace.findFiles( '**/.omx/state/active-sessions/*.json', '**/{node_modules,.git,.omx/agent-worktrees,.omc/agent-worktrees}/**', @@ -135,15 +245,20 @@ class ActiveAgentsProvider { } } - const sessionsByRepo = new Map(); + const repoEntries = []; for (const repoRoot of repoRoots) { const sessions = readActiveSessions(repoRoot); if (sessions.length > 0) { - sessionsByRepo.set(repoRoot, sessions); + repoEntries.push({ + repoRoot, + sessions, + changes: readRepoChanges(repoRoot), + }); } } - return sessionsByRepo; + repoEntries.sort((left, right) => left.repoRoot.localeCompare(right.repoRoot)); + return repoEntries; } } @@ -172,6 +287,18 @@ function activate(context) { { forceNewWindow: true }, ); }), + vscode.commands.registerCommand('gitguardex.activeAgents.openChange', async (change) => { + if (!change?.absolutePath) { + return; + } + + if (!fs.existsSync(change.absolutePath)) { + vscode.window.showInformationMessage?.(`Changed path is no longer on disk: ${change.relativePath}`); + 
return; + } + + await vscode.commands.executeCommand('vscode.open', vscode.Uri.file(change.absolutePath)); + }), vscode.workspace.onDidChangeWorkspaceFolders(refresh), watcher, { dispose: () => clearInterval(interval) }, diff --git a/templates/vscode/guardex-active-agents/package.json b/templates/vscode/guardex-active-agents/package.json index 3c3b0d1..da83ad5 100644 --- a/templates/vscode/guardex-active-agents/package.json +++ b/templates/vscode/guardex-active-agents/package.json @@ -1,7 +1,7 @@ { "name": "gitguardex-active-agents", "displayName": "GitGuardex Active Agents", - "description": "Shows live Guardex sandbox sessions inside VS Code Source Control.", + "description": "Shows live Guardex sandbox sessions and repo changes inside VS Code Source Control.", "publisher": "recodeee", "version": "0.0.1", "license": "MIT", diff --git a/templates/vscode/guardex-active-agents/session-schema.js b/templates/vscode/guardex-active-agents/session-schema.js index bd689af..aaeef7b 100644 --- a/templates/vscode/guardex-active-agents/session-schema.js +++ b/templates/vscode/guardex-active-agents/session-schema.js @@ -6,6 +6,7 @@ const ACTIVE_SESSIONS_RELATIVE_DIR = path.join('.omx', 'state', 'active-sessions const SESSION_SCHEMA_VERSION = 1; const LOCK_FILE_RELATIVE = path.join('.omx', 'state', 'agent-file-locks.json'); const MAX_CHANGED_PATH_PREVIEW = 3; +const ACTIVE_SESSIONS_FILTER_PREFIX = ACTIVE_SESSIONS_RELATIVE_DIR.split(path.sep).join('/'); function toNonEmptyString(value, fallback = '') { const normalized = typeof value === 'string' ? 
value.trim() : String(value || '').trim(); @@ -41,8 +42,7 @@ function splitOutputLines(output) { return output .split(/\r?\n/) - .map((line) => line.trim()) - .filter(Boolean); + .filter((line) => line.trim().length > 0); } function runGitLines(worktreePath, args) { @@ -57,6 +57,23 @@ function runGitLines(worktreePath, args) { } } +function unquoteGitPath(value) { + if (typeof value !== 'string') { + return ''; + } + + const trimmed = value.trim(); + if (!trimmed.startsWith('"') || !trimmed.endsWith('"')) { + return trimmed; + } + + try { + return JSON.parse(trimmed); + } catch (_error) { + return trimmed.slice(1, -1); + } +} + function formatFileCount(count) { return `${count} file${count === 1 ? '' : 's'}`; } @@ -74,6 +91,80 @@ function previewChangedPaths(paths) { return `${preview}, +${paths.length - MAX_CHANGED_PATH_PREVIEW} more`; } +function deriveRepoChangeStatus(statusPair) { + if (statusPair === '??') { + return { + statusCode: '??', + statusLabel: 'U', + statusText: 'Untracked', + }; + } + + const code = [statusPair[1], statusPair[0]].find((value) => value && value !== ' ') || 'M'; + const statusTextByCode = { + A: 'Added', + C: 'Copied', + D: 'Deleted', + M: 'Modified', + R: 'Renamed', + T: 'Type changed', + U: 'Conflicted', + }; + + return { + statusCode: code, + statusLabel: code, + statusText: statusTextByCode[code] || 'Changed', + }; +} + +function parseRepoChangeLine(repoRoot, line) { + if (typeof line !== 'string' || line.length < 4) { + return null; + } + + const statusPair = line.slice(0, 2); + if (statusPair === '!!') { + return null; + } + + const rawPath = line.slice(3).trim(); + if (!rawPath) { + return null; + } + + let relativePath = rawPath; + let originalPath = ''; + if (rawPath.includes(' -> ')) { + const parts = rawPath.split(' -> '); + if (parts.length === 2) { + originalPath = unquoteGitPath(parts[0]); + relativePath = parts[1]; + } + } + + relativePath = unquoteGitPath(relativePath); + if (!relativePath) { + return null; + } + + 
const normalizedRelativePath = relativePath.split(path.sep).join('/'); + if ( + normalizedRelativePath === ACTIVE_SESSIONS_FILTER_PREFIX + || normalizedRelativePath.startsWith(`${ACTIVE_SESSIONS_FILTER_PREFIX}/`) + ) { + return null; + } + + const status = deriveRepoChangeStatus(statusPair); + return { + ...status, + originalPath, + relativePath, + absolutePath: path.join(path.resolve(repoRoot), relativePath), + }; +} + function collectWorktreeChangedPaths(worktreePath) { const changedGroups = [ runGitLines(worktreePath, ['diff', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`]), @@ -281,6 +372,18 @@ function readActiveSessions(repoRoot, options = {}) { return sessions; } +function readRepoChanges(repoRoot) { + const statusLines = runGitLines(repoRoot, ['status', '--porcelain=v1', '--untracked-files=all']); + if (!statusLines) { + return []; + } + + return statusLines + .map((line) => parseRepoChangeLine(repoRoot, line)) + .filter(Boolean) + .sort((left, right) => left.relativePath.localeCompare(right.relativePath)); +} + module.exports = { ACTIVE_SESSIONS_RELATIVE_DIR, SESSION_SCHEMA_VERSION, @@ -293,8 +396,11 @@ module.exports = { formatFileCount, isPidAlive, normalizeSessionRecord, + parseRepoChangeLine, previewChangedPaths, readActiveSessions, + readRepoChanges, + deriveRepoChangeStatus, sanitizeBranchForFile, sessionFileNameForBranch, sessionFilePathForBranch, diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js index a55519f..698b022 100644 --- a/test/vscode-active-agents-session-state.test.js +++ b/test/vscode-active-agents-session-state.test.js @@ -112,6 +112,7 @@ function createMockVscode(tempRoot) { file: (fsPath) => ({ fsPath }), }, window: { + showInformationMessage: async () => {}, createTreeView: (viewId, options) => { const treeView = { viewId, @@ -262,6 +263,26 @@ test('session-schema derives working activity from dirty sandbox worktrees', () assert.equal(session.activitySummary, 
'new-file.txt, tracked.txt'); }); +test('session-schema derives repo change rows from root git status', () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-active-session-root-status-')); + initGitRepo(tempRoot); + fs.writeFileSync(path.join(tempRoot, 'tracked.txt'), 'base\n', 'utf8'); + runGit(tempRoot, ['add', 'tracked.txt']); + runGit(tempRoot, ['commit', '-m', 'baseline']); + + fs.writeFileSync(path.join(tempRoot, 'tracked.txt'), 'base\nchanged\n', 'utf8'); + fs.writeFileSync(path.join(tempRoot, 'new-file.txt'), 'new\n', 'utf8'); + + const changes = sessionSchema.readRepoChanges(tempRoot); + assert.deepEqual( + changes.map((change) => [change.relativePath, change.statusLabel]), + [ + ['new-file.txt', 'U'], + ['tracked.txt', 'M'], + ], + ); +}); + test('install-vscode-active-agents-extension installs the current extension version and prunes older copies', () => { const tempExtensionsDir = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-ext-')); const staleDir = path.join(tempExtensionsDir, 'recodeee.gitguardex-active-agents-0.0.0'); @@ -308,7 +329,7 @@ test('active-agents extension registers a provider with getTreeItem', async () = } }); -test('active-agents extension updates the SCM badge for live sessions', async () => { +test('active-agents extension groups live sessions under a repo node', async () => { const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-live-view-')); const sessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, 'agent/codex/live-task'); fs.mkdirSync(path.dirname(sessionPath), { recursive: true }); @@ -334,7 +355,14 @@ test('active-agents extension updates the SCM badge for live sessions', async () extension.activate(context); const provider = registrations.providers[0].provider; - const [sessionItem] = await provider.getChildren(); + const [repoItem] = await provider.getChildren(); + assert.equal(repoItem.label, path.basename(tempRoot)); + assert.equal(repoItem.description, '1 active'); 
+ + const [agentsSection] = await provider.getChildren(repoItem); + assert.equal(agentsSection.label, 'ACTIVE AGENTS'); + + const [sessionItem] = await provider.getChildren(agentsSection); assert.equal(sessionItem.label, 'live-task'); assert.match(sessionItem.description, /^thinking · \d+[smhd]/); assert.deepEqual(registrations.treeViews[0].badge, { @@ -348,9 +376,15 @@ test('active-agents extension updates the SCM badge for live sessions', async () } }); -test('active-agents extension shows working rows when the sandbox has changes', async () => { +test('active-agents extension shows grouped repo changes beside active agents', async () => { const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-working-view-')); - const worktreePath = path.join(tempRoot, 'sandbox'); + initGitRepo(tempRoot); + fs.writeFileSync(path.join(tempRoot, 'root-file.txt'), 'base\n', 'utf8'); + runGit(tempRoot, ['add', 'root-file.txt']); + runGit(tempRoot, ['commit', '-m', 'baseline']); + fs.writeFileSync(path.join(tempRoot, 'root-file.txt'), 'base\nchanged\n', 'utf8'); + + const worktreePath = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-working-session-')); initGitRepo(worktreePath); fs.writeFileSync(path.join(worktreePath, 'tracked.txt'), 'base\n', 'utf8'); runGit(worktreePath, ['add', 'tracked.txt']); @@ -382,11 +416,21 @@ test('active-agents extension shows working rows when the sandbox has changes', extension.activate(context); const provider = registrations.providers[0].provider; - const [sessionItem] = await provider.getChildren(); - assert.equal(sessionItem.label, 'sandbox'); + const [repoItem] = await provider.getChildren(); + const [agentsSection, changesSection] = await provider.getChildren(repoItem); + assert.equal(agentsSection.label, 'ACTIVE AGENTS'); + assert.equal(changesSection.label, 'CHANGES'); + + const [sessionItem] = await provider.getChildren(agentsSection); + assert.equal(sessionItem.label, path.basename(worktreePath)); 
assert.match(sessionItem.description, /^working · 2 files · /); assert.match(sessionItem.tooltip, /Changed 2 files: new-file\.txt, tracked\.txt/); + const [changeItem] = await provider.getChildren(changesSection); + assert.equal(changeItem.label, 'root-file.txt'); + assert.equal(changeItem.description, 'M'); + assert.match(changeItem.tooltip, /Status Modified/); + for (const subscription of context.subscriptions) { subscription.dispose?.(); } From ae8bb76be745cc7d378085736ef0548b520a506b Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Tue, 21 Apr 2026 23:38:39 +0200 Subject: [PATCH 33/48] Make quota-hit agent takeovers copy-pasteable (#265) The launcher already preserved unfinished sandboxes, but replacement agents only got a generic merge hint and had to reconstruct the exact resume steps. This change makes codex-agent print a concrete takeover prompt and teaches new OpenSpec change workspaces to scaffold matching handoff/copy-prompt lines so a second agent can continue the same lane without inventing a new sandbox. 
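The takeover prompt this patch has `codex-agent` print might take roughly the following shape. The branch, sandbox path, and base below are placeholder values for illustration, not the shipped wording:

```shell
# Illustrative takeover-prompt shape; real values come from codex-agent.
BRANCH="agent/codex/example-task-2026-04-21"
SANDBOX=".omx/agent-worktrees/${BRANCH##*/}"
BASE="main"

cat <<EOF
Continue branch ${BRANCH} from the kept sandbox at ${SANDBOX}.
Do not create a new sandbox; resume from the current state, then finish with:
bash scripts/agent-branch-finish.sh --branch ${BRANCH} --base ${BASE} --via-pr --wait-for-merge --cleanup
EOF
```

Because the prompt embeds the existing branch, sandbox path, and finish command, a replacement agent can paste it verbatim instead of reconstructing the resume steps by hand.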
Constraint: Must keep the existing finish pipeline and template/runtime script parity intact Rejected: Persist stale active-session records | would widen the VS Code active-session model without being necessary for the handoff path Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep the codex-agent takeover prompt and init-change-workspace handoff copy aligned so manual and scaffolded rescue flows do not drift Tested: node --test --test-name-pattern "(codex-agent prints a takeover prompt when the sandbox is kept after an incomplete run|OpenSpec change workspace scaffold creates proposal/tasks/spec defaults|OpenSpec change workspace scaffold supports minimal T1 notes mode|critical runtime helper scripts stay in sync with templates)" test/install.test.js test/metadata.test.js; bash -n scripts/codex-agent.sh scripts/openspec/init-change-workspace.sh templates/scripts/codex-agent.sh templates/scripts/openspec/init-change-workspace.sh; openspec validate agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30 --type change --strict; openspec validate --specs Not-tested: Full remote PR merge from a real usage-limit interrupted session Co-authored-by: NagyVikt --- .../.openspec.yaml | 2 + .../proposal.md | 25 +++++++ .../spec.md | 32 +++++++++ .../tasks.md | 34 ++++++++++ scripts/codex-agent.sh | 26 ++++++++ scripts/openspec/init-change-workspace.sh | 32 ++++++++- templates/scripts/codex-agent.sh | 26 ++++++++ .../scripts/openspec/init-change-workspace.sh | 32 ++++++++- test/install.test.js | 66 +++++++++++++++++++ 9 files changed, 271 insertions(+), 4 deletions(-) create mode 100644 openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/.openspec.yaml create mode 100644 openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/proposal.md create mode 100644 openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/specs/improve-usage-limit-takeover-handoff/spec.md 
create mode 100644 openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/tasks.md diff --git a/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/.openspec.yaml b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/.openspec.yaml new file mode 100644 index 0000000..4b8c565 --- /dev/null +++ b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-21 diff --git a/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/proposal.md b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/proposal.md new file mode 100644 index 0000000..67afd87 --- /dev/null +++ b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/proposal.md @@ -0,0 +1,25 @@ +## Why + +- When `codex-agent` keeps a sandbox after the first lane exits early, the next + agent only gets generic merge/cleanup hints and has to reconstruct the exact + takeover flow by hand. +- Fresh OpenSpec change workspaces do not scaffold a copy-paste takeover note, + so usage-limit handoffs are inconsistent and easy to botch. + +## What Changes + +- Make `codex-agent` print a concrete takeover prompt with the existing branch, + sandbox path, OpenSpec artifact, and finish command whenever auto-finish does + not complete and the worktree stays alive. +- Teach `init-change-workspace.sh` to scaffold structured `Handoff:` and + `Copy prompt:` lines, and resolve the cleanup command base branch from repo + metadata instead of hardcoding `dev`. +- Add regression coverage for both the launcher handoff output and the scaffold + defaults. + +## Impact + +- Affects `scripts/codex-agent.sh`, `scripts/openspec/init-change-workspace.sh`, + and `test/install.test.js`. 
+- Keeps the existing finish pipeline intact while making quota-hit/manual + takeovers copy-pasteable instead of reconstructive. diff --git a/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/specs/improve-usage-limit-takeover-handoff/spec.md b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/specs/improve-usage-limit-takeover-handoff/spec.md new file mode 100644 index 0000000..aaf4c90 --- /dev/null +++ b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/specs/improve-usage-limit-takeover-handoff/spec.md @@ -0,0 +1,32 @@ +## ADDED Requirements + +### Requirement: `codex-agent` SHALL emit a takeover prompt when a sandbox is kept +`codex-agent` SHALL print a copy-paste takeover prompt whenever it leaves a +branch/worktree alive for manual follow-up. + +#### Scenario: incomplete run keeps the sandbox alive +- **GIVEN** `codex-agent` keeps the sandbox because auto-finish did not complete +- **WHEN** it reports the kept worktree +- **THEN** it SHALL print a takeover prompt that references the existing branch + and sandbox path +- **AND** the prompt SHALL tell the next agent to continue from the current + state instead of creating a new sandbox +- **AND** the prompt SHALL include the cleanup/finish command for + `agent-branch-finish.sh`. + +### Requirement: OpenSpec change scaffolds SHALL include structured takeover copy +OpenSpec change workspaces SHALL scaffold a structured handoff line plus a +copy-paste takeover prompt for usage-limit/manual handoffs. + +#### Scenario: standard change workspace scaffold +- **WHEN** `scripts/openspec/init-change-workspace.sh` creates a non-minimal + change workspace +- **THEN** `tasks.md` SHALL include a `Handoff:` line and a `Copy prompt:` line +- **AND** the generated cleanup command SHALL resolve the branch base from repo + metadata when available. 
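The resolution order this requirement describes — per-branch git config, then a repo-wide default, then a hardcoded fallback — can be sketched in isolation. In this sketch the two `git config` lookups (`branch.<name>.guardexBase`, `multiagent.baseBranch`) are replaced by plain arguments so the fallback chain runs without a repository; the function name and argument names are stand-ins, not part of the shipped script:

```shell
#!/usr/bin/env bash
# Stand-in for the scaffold's base-branch resolution. The real script reads
# `git config branch.<name>.guardexBase` and `git config multiagent.baseBranch`;
# here those lookups arrive as plain arguments so the chain is testable alone.
resolve_base_branch_sketch() {
  local branch="$1" branch_base="$2" repo_base="$3"
  local base=""
  if [[ -n "$branch" && "$branch" != "agent//" ]]; then
    base="$branch_base"            # 1. per-branch override
  fi
  if [[ -z "$base" ]]; then
    base="$repo_base"              # 2. repo-wide default
  fi
  if [[ -z "$base" ]]; then
    base="dev"                     # 3. hardcoded last resort
  fi
  printf '%s' "$base"
}

resolve_base_branch_sketch "agent/codex/demo" "" "main"   # prints: main
```

The same ordering is what keeps the scaffolded cleanup command from hardcoding `dev` when repo metadata says otherwise.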
+ +#### Scenario: minimal notes workspace scaffold +- **WHEN** `scripts/openspec/init-change-workspace.sh` runs in minimal mode +- **THEN** `notes.md` SHALL include the same `Handoff:` and `Copy prompt:` flow +- **AND** the generated cleanup command SHALL resolve the branch base from repo + metadata when available. diff --git a/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/tasks.md b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/tasks.md new file mode 100644 index 0000000..9701d8b --- /dev/null +++ b/openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/tasks.md @@ -0,0 +1,34 @@ +## Definition of Done + +This change is complete only when **all** of the following are true: + +- Every checkbox below is checked. +- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff. +- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. + +## Handoff + +- Handoff: change=`agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30`; branch=`agent/codex/improve-usage-limit-takeover-handoff-2026-04-21-23-30`; scope=`OpenSpec change docs, codex-agent takeover output, change scaffold defaults, install regressions`; action=`emit a copy-paste takeover prompt and keep usage-limit/manual rescue flow inside the existing sandbox`. +- Copy prompt: Continue `agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30` on branch `agent/codex/improve-usage-limit-takeover-handoff-2026-04-21-23-30`. 
Work inside the existing sandbox, review `openspec/changes/agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30/tasks.md`, continue from the current state instead of creating a new sandbox, and when the work is done run `bash scripts/agent-branch-finish.sh --branch "agent/codex/improve-usage-limit-takeover-handoff-2026-04-21-23-30" --base main --via-pr --wait-for-merge --cleanup`. + +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30`. +- [x] 1.2 Define normative requirements in `specs/improve-usage-limit-takeover-handoff/spec.md`. + +## 2. Implementation + +- [x] 2.1 Implement scoped behavior changes in `scripts/codex-agent.sh` and `scripts/openspec/init-change-workspace.sh`, then mirror the same changes into the managed templates. +- [x] 2.2 Add/update focused regression coverage in `test/install.test.js` for the launcher takeover prompt and the scaffolded handoff/copy prompt defaults. + +## 3. Verification + +- [x] 3.1 Run targeted project verification commands. Result: `node --test --test-name-pattern "(codex-agent prints a takeover prompt when the sandbox is kept after an incomplete run|OpenSpec change workspace scaffold creates proposal/tasks/spec defaults|OpenSpec change workspace scaffold supports minimal T1 notes mode|critical runtime helper scripts stay in sync with templates)" test/install.test.js test/metadata.test.js` passed `4/4`; `bash -n scripts/codex-agent.sh scripts/openspec/init-change-workspace.sh templates/scripts/codex-agent.sh templates/scripts/openspec/init-change-workspace.sh` passed. +- [x] 3.2 Run `openspec validate agent-codex-improve-usage-limit-takeover-handoff-2026-04-21-23-30 --type change --strict`. Result: passed. +- [x] 3.3 Run `openspec validate --specs`. Result: `No items found to validate.` + +## 4. 
Cleanup (mandatory; run before claiming completion) + +- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent/codex/improve-usage-limit-takeover-handoff-2026-04-21-23-30 --base dev --via-pr --wait-for-merge --cleanup`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch). diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index 8b509dd..1829d92 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -510,6 +510,31 @@ resolve_worktree_base_branch() { printf 'dev' } +print_takeover_prompt() { + local wt="$1" + local branch="$2" + local base_branch change_slug change_artifact finish_cmd + + base_branch="$(resolve_worktree_base_branch "$wt")" + if [[ -z "$base_branch" ]]; then + base_branch="dev" + fi + + change_slug="$(resolve_openspec_change_slug "$branch")" + change_artifact="openspec/changes/${change_slug}/tasks.md" + if [[ ! -f "${wt}/${change_artifact}" ]]; then + change_artifact="openspec/changes/${change_slug}/notes.md" + fi + if [[ ! -f "${wt}/${change_artifact}" ]]; then + change_artifact="openspec/changes/${change_slug}/" + fi + + finish_cmd="bash scripts/agent-branch-finish.sh --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup" + + echo "[codex-agent] Takeover sandbox: ${wt}" + echo "[codex-agent] Takeover prompt: Continue \`${change_slug}\` on branch \`${branch}\`. Work inside \`${wt}\`, review \`${change_artifact}\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`${finish_cmd}\`." +} + sync_worktree_with_base() { local wt="$1" if ! 
has_origin_remote; then @@ -952,6 +977,7 @@ else if [[ "$auto_finish_completed" -eq 1 ]]; then echo "[codex-agent] Branch kept intentionally. Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" else + print_takeover_prompt "$worktree_path" "$worktree_branch" echo "[codex-agent] If finished, merge with: bash scripts/agent-branch-finish.sh --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge" echo "[codex-agent] Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" fi diff --git a/scripts/openspec/init-change-workspace.sh b/scripts/openspec/init-change-workspace.sh index 07f26c7..6f0930d 100755 --- a/scripts/openspec/init-change-workspace.sh +++ b/scripts/openspec/init-change-workspace.sh @@ -21,9 +21,27 @@ if [[ "$CAPABILITY_SLUG" =~ [^a-z0-9-] ]]; then exit 1 fi +resolve_base_branch() { + local branch="$1" + local base_branch="" + + if [[ -n "$branch" ]] && [[ "$branch" != "agent//" ]]; then + base_branch="$(git config --get "branch.${branch}.guardexBase" || true)" + fi + if [[ -z "$base_branch" ]]; then + base_branch="$(git config --get multiagent.baseBranch || true)" + fi + if [[ -z "$base_branch" ]]; then + base_branch="dev" + fi + + printf '%s' "$base_branch" +} + CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" TODAY="$(date -u +%Y-%m-%d)" +BASE_BRANCH="$(resolve_base_branch "$AGENT_BRANCH")" MINIMAL_RAW="${GUARDEX_OPENSPEC_MINIMAL:-0}" case "$(printf '%s' "$MINIMAL_RAW" | tr '[:upper:]' '[:lower:]')" in @@ -53,9 +71,14 @@ Branch: \`${AGENT_BRANCH}\` Describe the change in a sentence or two. Commit message is the spec of record. +## Handoff + +- Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. 
Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. + ## Cleanup -- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\` +- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\` - [ ] Record PR URL + \`MERGED\` state in the completion handoff. - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`). NOTESEOF @@ -91,6 +114,11 @@ This change is complete only when **all** of the following are true: - The agent branch reaches \`MERGED\` state on \`origin\` and the PR URL + state are recorded in the completion handoff. - If any step blocks (test failure, conflict, ambiguous result), append a \`BLOCKED:\` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. +## Handoff + +- Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. + ## 1. Specification - [ ] 1.1 Finalize proposal scope and acceptance criteria for \`${CHANGE_SLUG}\`. @@ -109,7 +137,7 @@ This change is complete only when **all** of the following are true: ## 4. 
Cleanup (mandatory; run before claiming completion) -- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. - [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. - [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). TASKSEOF diff --git a/templates/scripts/codex-agent.sh b/templates/scripts/codex-agent.sh index 8b509dd..1829d92 100755 --- a/templates/scripts/codex-agent.sh +++ b/templates/scripts/codex-agent.sh @@ -510,6 +510,31 @@ resolve_worktree_base_branch() { printf 'dev' } +print_takeover_prompt() { + local wt="$1" + local branch="$2" + local base_branch change_slug change_artifact finish_cmd + + base_branch="$(resolve_worktree_base_branch "$wt")" + if [[ -z "$base_branch" ]]; then + base_branch="dev" + fi + + change_slug="$(resolve_openspec_change_slug "$branch")" + change_artifact="openspec/changes/${change_slug}/tasks.md" + if [[ ! -f "${wt}/${change_artifact}" ]]; then + change_artifact="openspec/changes/${change_slug}/notes.md" + fi + if [[ ! -f "${wt}/${change_artifact}" ]]; then + change_artifact="openspec/changes/${change_slug}/" + fi + + finish_cmd="bash scripts/agent-branch-finish.sh --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup" + + echo "[codex-agent] Takeover sandbox: ${wt}" + echo "[codex-agent] Takeover prompt: Continue \`${change_slug}\` on branch \`${branch}\`. 
Work inside \`${wt}\`, review \`${change_artifact}\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`${finish_cmd}\`." +} + sync_worktree_with_base() { local wt="$1" if ! has_origin_remote; then @@ -952,6 +977,7 @@ else if [[ "$auto_finish_completed" -eq 1 ]]; then echo "[codex-agent] Branch kept intentionally. Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" else + print_takeover_prompt "$worktree_path" "$worktree_branch" echo "[codex-agent] If finished, merge with: bash scripts/agent-branch-finish.sh --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge" echo "[codex-agent] Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" fi diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh index 07f26c7..6f0930d 100755 --- a/templates/scripts/openspec/init-change-workspace.sh +++ b/templates/scripts/openspec/init-change-workspace.sh @@ -21,9 +21,27 @@ if [[ "$CAPABILITY_SLUG" =~ [^a-z0-9-] ]]; then exit 1 fi +resolve_base_branch() { + local branch="$1" + local base_branch="" + + if [[ -n "$branch" ]] && [[ "$branch" != "agent//" ]]; then + base_branch="$(git config --get "branch.${branch}.guardexBase" || true)" + fi + if [[ -z "$base_branch" ]]; then + base_branch="$(git config --get multiagent.baseBranch || true)" + fi + if [[ -z "$base_branch" ]]; then + base_branch="dev" + fi + + printf '%s' "$base_branch" +} + CHANGE_DIR="openspec/changes/${CHANGE_SLUG}" SPEC_DIR="${CHANGE_DIR}/specs/${CAPABILITY_SLUG}" TODAY="$(date -u +%Y-%m-%d)" +BASE_BRANCH="$(resolve_base_branch "$AGENT_BRANCH")" MINIMAL_RAW="${GUARDEX_OPENSPEC_MINIMAL:-0}" case "$(printf '%s' "$MINIMAL_RAW" | tr '[:upper:]' '[:lower:]')" in @@ -53,9 +71,14 @@ Branch: \`${AGENT_BRANCH}\` Describe the change in a sentence or two. Commit message is the spec of record. 
+## Handoff + +- Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. + ## Cleanup -- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\` +- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\` - [ ] Record PR URL + \`MERGED\` state in the completion handoff. - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`). NOTESEOF @@ -91,6 +114,11 @@ This change is complete only when **all** of the following are true: - The agent branch reaches \`MERGED\` state on \`origin\` and the PR URL + state are recorded in the completion handoff. - If any step blocks (test failure, conflict, ambiguous result), append a \`BLOCKED:\` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. +## Handoff + +- Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. 
Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. + ## 1. Specification - [ ] 1.1 Finalize proposal scope and acceptance criteria for \`${CHANGE_SLUG}\`. @@ -109,7 +137,7 @@ This change is complete only when **all** of the following are true: ## 4. Cleanup (mandatory; run before claiming completion) -- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch ${AGENT_BRANCH} --base dev --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. - [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. - [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). 
TASKSEOF diff --git a/test/install.test.js b/test/install.test.js index 89c7c5f..65cd18f 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -3498,6 +3498,63 @@ exit 1 assert.match(launchedArgs, /--model gpt-5\.4-mini/); }); +test('codex-agent prints a takeover prompt when the sandbox is kept after an incomplete run', () => { + const repoDir = initRepo(); + seedCommit(repoDir); + + let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + result = runCmd('git', ['add', '.'], repoDir); + assert.equal(result.status, 0, result.stderr); + result = runCmd('git', ['commit', '-m', 'apply gx setup'], repoDir, { + ALLOW_COMMIT_ON_PROTECTED_BRANCH: '1', + }); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const fakeCodexBin = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-codex-takeover-')); + const fakeCodexPath = path.join(fakeCodexBin, 'codex'); + fs.writeFileSync( + fakeCodexPath, + '#!/usr/bin/env bash\n' + + 'pwd > "${GUARDEX_TEST_CODEX_CWD}"\n' + + 'echo "partial" > codex-partial.txt\n' + + 'exit 42\n', + 'utf8', + ); + fs.chmodSync(fakeCodexPath, 0o755); + + const cwdMarker = path.join(repoDir, '.codex-agent-cwd-takeover'); + const launch = runCmd( + 'bash', + ['scripts/codex-agent.sh', 'usage-limit-task', 'planner', 'dev'], + repoDir, + { + PATH: `${fakeCodexBin}:${process.env.PATH}`, + GUARDEX_TEST_CODEX_CWD: cwdMarker, + }, + ); + assert.equal(launch.status, 42, launch.stderr || launch.stdout); + + const combinedOutput = `${launch.stdout}\n${launch.stderr}`; + const launchedBranch = extractCreatedBranch(launch.stdout); + const changeSlug = launchedBranch.replace(/\//g, '-'); + assert.match(combinedOutput, /\[codex-agent\] Sandbox worktree kept:/); + assert.match(combinedOutput, new RegExp(`\\[codex-agent\\] Takeover sandbox: ${escapeRegexLiteral(fs.readFileSync(cwdMarker, 'utf8').trim())}`)); + assert.match( + combinedOutput, + 
new RegExp(`\\[codex-agent\\] Takeover prompt: Continue \`${escapeRegexLiteral(changeSlug)}\` on branch \`${escapeRegexLiteral(launchedBranch)}\``), + ); + assert.match(combinedOutput, /continue from the current state instead of creating a new sandbox/); + assert.match( + combinedOutput, + new RegExp(`openspec/changes/${escapeRegexLiteral(changeSlug)}/tasks\\.md`), + ); + assert.match( + combinedOutput, + new RegExp(`agent-branch-finish\\.sh --branch "${escapeRegexLiteral(launchedBranch)}" --base dev --via-pr --wait-for-merge --cleanup`), + ); +}); + test('codex-agent keeps the sandbox when base branch advances without a mergeable remote context', () => { const repoDir = initRepo(); seedCommit(repoDir); @@ -4176,6 +4233,9 @@ test('OpenSpec change workspace scaffold creates proposal/tasks/spec defaults', const tasksContent = fs.readFileSync(path.join(changeDir, 'tasks.md'), 'utf8'); assert.match(tasksContent, /## Definition of Done/); assert.match(tasksContent, /append a `BLOCKED:` line under section 4/); + assert.match(tasksContent, /## Handoff/); + assert.match(tasksContent, /Handoff: change=`change-workspace-smoke`/); + assert.match(tasksContent, /Copy prompt: Continue `change-workspace-smoke` on branch `agent\/\/`/); assert.match(tasksContent, /## 4\. 
Cleanup \(mandatory; run before claiming completion\)/); assert.match(tasksContent, /Run the cleanup pipeline:/); assert.match(tasksContent, /Record the PR URL and final merge state \(`MERGED`\)/); @@ -4187,6 +4247,8 @@ test('OpenSpec change workspace scaffold supports minimal T1 notes mode', () => const setupResult = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); assert.equal(setupResult.status, 0, setupResult.stderr || setupResult.stdout); + let result = runCmd('git', ['config', 'multiagent.baseBranch', 'main'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); const changeSlug = 'change-workspace-minimal'; const capabilitySlug = 'runtime-migration'; @@ -4209,6 +4271,10 @@ test('OpenSpec change workspace scaffold supports minimal T1 notes mode', () => assert.match(notesContent, /minimal \/ T1/); assert.match(notesContent, new RegExp(agentBranch.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'))); assert.match(notesContent, /Commit message is the spec of record/); + assert.match(notesContent, /## Handoff/); + assert.match(notesContent, /Handoff: change=`change-workspace-minimal`/); + assert.match(notesContent, /Copy prompt: Continue `change-workspace-minimal` on branch `agent\/codex\/minimal-change`/); + assert.match(notesContent, /--base main --via-pr --wait-for-merge --cleanup/); assert.match(notesContent, /Record PR URL \+ `MERGED` state/); }); From aca7f8a20a9f197c007504424b8f9926ac93f0b9 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 00:11:04 +0200 Subject: [PATCH 34/48] Restore the published npm package identity users already install (#266) Changing package.json from @imdeadpool/guardex to @imdeadpool/gitguardex created a separate npm package identity instead of renaming the existing registry entry. 
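Because npm refuses to publish over an already-taken version, the restored name also had to advance past the latest published `@imdeadpool/guardex` release. A minimal pre-publish guard along those lines (a hypothetical helper, assuming `sort -V` version ordering is available) might be:

```shell
#!/usr/bin/env bash
# Hypothetical guard: allow a publish only when the local version is
# strictly greater than the latest published one. `sort -V` provides
# the version ordering; equal versions are rejected outright.
version_is_publishable() {
  local local_ver="$1" published_ver="$2"
  [[ "$local_ver" == "$published_ver" ]] && return 1
  [[ "$(printf '%s\n%s\n' "$local_ver" "$published_ver" | sort -V | tail -n1)" == "$local_ver" ]]
}

version_is_publishable "7.0.17" "7.0.16" && echo "ok to publish"
if ! version_is_publishable "7.0.16" "7.0.16"; then echo "version taken"; fi
```

This is the same check npm applies server-side; running it locally just fails faster than a rejected `npm publish`.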
This restores the manifest, install/update copy, tutorial/docs assets, and regression tests to the package users can actually install today, and bumps the version so the next publish can succeed against the existing @imdeadpool/guardex release line. Constraint: npm package names are immutable registry identities; changing name publishes a different package Constraint: @imdeadpool/guardex@7.0.16 is already published on npm, so the revert must advance the version Rejected: Keep @imdeadpool/gitguardex and only tweak docs | npm view returns 404 so install and self-update guidance would still point at a non-existent package Confidence: high Scope-risk: moderate Reversibility: clean Directive: Do not change package.json.name again without confirming the target package exists on npm and the next version is publishable Tested: node --test --test-name-pattern "(default invocation checks for update and can auto-approve latest install|self-update verifies on-disk version after @latest install and retries with pinned version when stale|self-update restarts into the installed CLI after a successful on-disk upgrade|status --json returns cli, services, and repo summary|prompt outputs AI setup instructions|prompt --exec outputs command-only checklist|deprecated copy-commands alias still works and warns)" test/install.test.js; node --check bin/multiagent-safety.js; npm pack --dry-run; openspec validate agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02 --type change --strict; openspec validate --specs; npm test Not-tested: live npm publish; live GitHub release flow Co-authored-by: NagyVikt --- README.md | 21 ++++++++------ docs/images/install-hero.svg | 6 ++-- docs/images/setup-success.svg | 4 +-- docs/images/status-tools-logs.svg | 2 +- docs/images/workflow-gx-terminal-status.svg | 2 +- docs/redditpost.md | 10 +++---- frontend/app/page.tsx | 6 ++-- .../proposal.md | 17 +++++++++++ .../spec.md | 27 ++++++++++++++++++ .../tasks.md | 23 +++++++++++++++ package-lock.json | 
8 +++--- package.json | 4 +-- test/install.test.js | 28 +++++++++---------- 13 files changed, 115 insertions(+), 43 deletions(-) create mode 100644 openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/proposal.md create mode 100644 openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/specs/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/spec.md create mode 100644 openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/tasks.md diff --git a/README.md b/README.md index 294b83e..6361524 100644 --- a/README.md +++ b/README.md @@ -1,9 +1,9 @@ # GitGuardex — Guardian T-Rex for your repo -[![npm version](https://img.shields.io/npm/v/%40imdeadpool%2Fgitguardex?label=npm&color=cb3837&logo=npm)](https://www.npmjs.com/package/@imdeadpool/gitguardex) -[![npm downloads](https://img.shields.io/npm/dm/%40imdeadpool%2Fgitguardex?label=downloads&color=0b76c5)](https://www.npmjs.com/package/@imdeadpool/gitguardex) +[![npm version](https://img.shields.io/npm/v/%40imdeadpool%2Fguardex?label=npm&color=cb3837&logo=npm)](https://www.npmjs.com/package/@imdeadpool/guardex) +[![npm downloads](https://img.shields.io/npm/dm/%40imdeadpool%2Fguardex?label=downloads&color=0b76c5)](https://www.npmjs.com/package/@imdeadpool/guardex) [![GitHub stars](https://img.shields.io/github/stars/recodeee/gitguardex?label=stars&color=d4ac0d)](https://github.com/recodeee/gitguardex/stargazers) -[![License](https://img.shields.io/npm/l/%40imdeadpool%2Fgitguardex?label=License&color=97ca00)](./LICENSE) +[![License](https://img.shields.io/npm/l/%40imdeadpool%2Fguardex?label=License&color=97ca00)](./LICENSE) [![CI](https://img.shields.io/github/actions/workflow/status/recodeee/gitguardex/ci.yml?branch=main&label=CI)](https://github.com/recodeee/gitguardex/actions/workflows/ci.yml) 
[![Release](https://img.shields.io/github/actions/workflow/status/recodeee/gitguardex/release.yml?label=Release)](https://github.com/recodeee/gitguardex/actions/workflows/release.yml) @@ -44,7 +44,7 @@ GitGuardex exists to stop that loop. Every agent gets its own worktree, claims t

Install in one line

```bash -npm i -g @imdeadpool/gitguardex +npm i -g @imdeadpool/guardex ```

@@ -55,8 +55,8 @@ npm i -g @imdeadpool/gitguardex

- npm - downloads + npm + downloads stars

@@ -83,12 +83,12 @@ Coming soon: [recodee.com](https://recodee.com) — live account health, usage, ## Quick start ```sh -npm i -g @imdeadpool/gitguardex +npm i -g @imdeadpool/guardex cd /path/to/your/repo gx setup ``` -That's it. New installs should use `@imdeadpool/gitguardex` so the published package matches the GitGuardex name. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). +That's it. Install and update via `@imdeadpool/guardex`. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). --- @@ -641,6 +641,11 @@ npm pack --dry-run
v7.x +### v7.0.17 +- Restored the published npm package name to `@imdeadpool/guardex` after the `@imdeadpool/gitguardex` rename only changed the package identity locally and could not rename the existing npm registry entry. +- README/install/tutorial/self-update surfaces now point back at `@imdeadpool/guardex` while keeping GitGuardex as the product/repo brand and `gitguardex` as the long-form command. +- Bumped the release from `7.0.16` → `7.0.17` because `@imdeadpool/guardex@7.0.16` is already published on npm. + ### v7.0.16 - GitGuardex now publishes under the matching npm package name `@imdeadpool/gitguardex`, and install/help/docs surfaces point at the renamed package instead of the older `@imdeadpool/guardex` scope. - `gx doctor` now keeps nested repo repair runs visibly progressing, and overlapping integration work stays off the protected base branch instead of trying to merge back on `main`. diff --git a/docs/images/install-hero.svg b/docs/images/install-hero.svg index 783bb27..668691b 100644 --- a/docs/images/install-hero.svg +++ b/docs/images/install-hero.svg @@ -16,12 +16,12 @@ npm i -g -@imdeadpool/gitguardex +@imdeadpool/guardex added 47 packages in 3.2s -+ @imdeadpool/gitguardex@ -7.0.16 ++ @imdeadpool/guardex@ +7.0.17 diff --git a/docs/images/setup-success.svg b/docs/images/setup-success.svg index 1dac87a..83f0cc8 100644 --- a/docs/images/setup-success.svg +++ b/docs/images/setup-success.svg @@ -16,7 +16,7 @@ deadpool@recodee:~/Documents/testguardex$ gx - [gitguardex] CLI: @imdeadpool/gitguardex/5.0.0 linux-x64 node-v22.22.0 + [gitguardex] CLI: @imdeadpool/guardex/5.0.0 linux-x64 node-v22.22.0 [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active @@ -29,7 +29,7 @@ fatal: not a git repository (or any of the parent directories): .git deadpool@recodee:~/Documents/testguardex$ gx - [gitguardex] CLI: @imdeadpool/gitguardex/5.0.0 linux-x64 node-v22.22.0 + [gitguardex] CLI: @imdeadpool/guardex/5.0.0 linux-x64 node-v22.22.0 
[gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active diff --git a/docs/images/status-tools-logs.svg b/docs/images/status-tools-logs.svg index 36b1c34..077a060 100644 --- a/docs/images/status-tools-logs.svg +++ b/docs/images/status-tools-logs.svg @@ -15,7 +15,7 @@ gx status - [gitguardex] CLI: @imdeadpool/gitguardex/5.0.2 linux-x64 node-v22.22.0 + [gitguardex] CLI: @imdeadpool/guardex/5.0.2 linux-x64 node-v22.22.0 [gitguardex] Global services: - ● oh-my-codex: active - ● @fission-ai/openspec: active diff --git a/docs/images/workflow-gx-terminal-status.svg b/docs/images/workflow-gx-terminal-status.svg index ac68d76..bf402d9 100644 --- a/docs/images/workflow-gx-terminal-status.svg +++ b/docs/images/workflow-gx-terminal-status.svg @@ -16,7 +16,7 @@ - [gitguardex] CLI: @imdeadpool/gitguardex/7.0.16 linux-x64 node-v22.22.0 + [gitguardex] CLI: @imdeadpool/guardex/7.0.17 linux-x64 node-v22.22.0 [gitguardex] Global services: - diff --git a/docs/redditpost.md b/docs/redditpost.md index 254b5db..57c7cdc 100644 --- a/docs/redditpost.md +++ b/docs/redditpost.md @@ -4,7 +4,7 @@ Source baseline: [`README.md`](../README.md) Project links: - GitHub: https://github.com/recodeecom/multiagent-safety -- npm: https://www.npmjs.com/package/@imdeadpool/gitguardex +- npm: https://www.npmjs.com/package/@imdeadpool/guardex ## Recommended Title Options @@ -29,7 +29,7 @@ What it does: Quick start: ```bash -npm i -g @imdeadpool/gitguardex +npm i -g @imdeadpool/guardex gx setup ``` @@ -45,7 +45,7 @@ bash scripts/agent-branch-finish.sh --branch "$(git rev-parse --abbrev-ref HEAD) If you run Codex/Claude-style parallel workflows, I would value feedback on edge cases your team hits in production. 
GitHub: https://github.com/recodeecom/multiagent-safety -npm: https://www.npmjs.com/package/@imdeadpool/gitguardex +npm: https://www.npmjs.com/package/@imdeadpool/guardex ## Copy-Ready Reddit Post (short) @@ -54,12 +54,12 @@ I open-sourced **GitGuardex** for safer multi-agent Git workflows. It adds branch/worktree guardrails, protected-branch enforcement, file-lock ownership, and repair scripts (`gx setup` / `gx doctor`) so parallel agent execution is safer by default. ```bash -npm i -g @imdeadpool/gitguardex +npm i -g @imdeadpool/guardex gx setup ``` GitHub: https://github.com/recodeecom/multiagent-safety -npm: https://www.npmjs.com/package/@imdeadpool/gitguardex +npm: https://www.npmjs.com/package/@imdeadpool/guardex ## Images to include in the Reddit post diff --git a/frontend/app/page.tsx b/frontend/app/page.tsx index 7856df2..63b0f1c 100644 --- a/frontend/app/page.tsx +++ b/frontend/app/page.tsx @@ -126,7 +126,7 @@ interface ModeGuide { } const MODE_ORDER: ModeKey[] = ['execute', 'plan', 'merge', 'installation'] -const INSTALL_COMMAND = 'npm i -g @imdeadpool/gitguardex' +const INSTALL_COMMAND = 'npm i -g @imdeadpool/guardex' const PRODUCT_LABEL = 'Recodee' const EDITOR_LABEL = 'recodee — VS Code' @@ -1697,7 +1697,7 @@ const INSTALL_STEPS: TutorialStep[] = [ sub: '· global install', elapsed: '3.8s', rows: [ - { kind: 'shell', label: 'bash:', value: 'npm i -g @imdeadpool/gitguardex' }, + { kind: 'shell', label: 'bash:', value: 'npm i -g @imdeadpool/guardex' }, ], }, ], @@ -1712,7 +1712,7 @@ const INSTALL_STEPS: TutorialStep[] = [ ], worktrees: [], codeLines: [ - { parts: [c('$ npm i -g @imdeadpool/gitguardex', 'c')] }, + { parts: [c('$ npm i -g @imdeadpool/guardex', 'c')] }, { parts: [c('added 1 package in 3.8s')] }, { parts: [c('')] }, { parts: [c('$ gx --version')] }, diff --git a/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/proposal.md 
b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/proposal.md
new file mode 100644
index 0000000..4f3b7c2
--- /dev/null
+++ b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/proposal.md
@@ -0,0 +1,17 @@
+## Why
+
+- The repo changed `package.json.name` to `@imdeadpool/gitguardex`, but npm treats a package name change as a different package identity, not a rename of the existing registry entry.
+- The live npm registry still serves `@imdeadpool/guardex@7.0.16`, while `npm view @imdeadpool/gitguardex version` returns `404`.
+- README, tutorial, Reddit kit, and self-update expectations now point at a package name that is not the install target users actually have.
+
+## What Changes
+
+- Restore the published package metadata to `@imdeadpool/guardex`.
+- Bump the package version from `7.0.16` to `7.0.17` so the next publish is valid against the existing `@imdeadpool/guardex@7.0.16` release.
+- Refresh install, self-update, tutorial, and README-linked asset surfaces to reference `@imdeadpool/guardex`.
+
+## Compatibility
+
+- Keep `gx` as the preferred short command.
+- Keep `gitguardex` as the long-form command and product/repo brand.
+- Keep `guardex` as the legacy compatibility bin alias.
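The versioning constraint in the proposal above (the next publish must clear the already-taken `@imdeadpool/guardex@7.0.16`) can be sketched as a pre-publish guard. This is an illustrative script, not part of the patch; the `semver_gt` helper and the hard-coded versions are assumptions, and a real run would read the published version from `npm view @imdeadpool/guardex version`.

```shell
#!/usr/bin/env bash
# Hypothetical pre-publish guard: refuse to publish a semver that the
# registry already serves for @imdeadpool/guardex.
set -eu

# Strict greater-than via version sort; assumes sort(1) supports -V
# (GNU coreutils and BusyBox do).
semver_gt() {
  [ "$1" != "$2" ] && \
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | tail -n1)" = "$1" ]
}

published="7.0.16"   # in practice: npm view @imdeadpool/guardex version
next="7.0.17"        # in practice: node -p "require('./package.json').version"

if semver_gt "$next" "$published"; then
  echo "ok: $next can publish over $published"
else
  echo "refusing: $next is not greater than published $published" >&2
  exit 1
fi
```

A guard like this fails fast locally instead of letting `npm publish` reject the upload server-side.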
diff --git a/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/specs/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/spec.md b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/specs/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/spec.md
new file mode 100644
index 0000000..a79ee14
--- /dev/null
+++ b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/specs/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/spec.md
@@ -0,0 +1,27 @@
+## ADDED Requirements
+
+### Requirement: Published npm package name stays on the existing guardex registry entry
+The system SHALL publish and document the primary npm package as `@imdeadpool/guardex`.
+
+#### Scenario: package metadata advertises the existing package
+- **WHEN** the root `package.json` metadata is inspected
+- **THEN** `name` equals `@imdeadpool/guardex`
+- **AND** downstream package metadata snapshots use the same package name.
+
+#### Scenario: the next release stays publishable on npm
+- **WHEN** the package metadata is prepared for the next publish
+- **THEN** the version is greater than the already-published `@imdeadpool/guardex@7.0.16`
+- **AND** the package can publish without colliding with the existing registry version.
+
+### Requirement: Install and update guidance references the real npm package
+The README, tutorial UI, and self-update/install guidance SHALL use `@imdeadpool/guardex` while keeping GitGuardex as the product brand.
+
+#### Scenario: install and self-update prompts use the restored package
+- **WHEN** a user reads CLI install/setup guidance or the self-update flow
+- **THEN** the npm command examples reference `@imdeadpool/guardex`
+- **AND** the CLI keeps `gx`, `gitguardex`, and `guardex` command compatibility.
+
+#### Scenario: docs and README-linked assets are aligned
+- **WHEN** the README, tutorial page, Reddit kit, or README-linked SVG assets are inspected
+- **THEN** install commands, npm badges, and package-name callouts reference `@imdeadpool/guardex`
+- **AND** GitGuardex remains the visible product/repo name.
diff --git a/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/tasks.md b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/tasks.md
new file mode 100644
index 0000000..8dc09af
--- /dev/null
+++ b/openspec/changes/agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02/tasks.md
@@ -0,0 +1,23 @@
+## 1. Spec
+
+- [x] 1.1 Capture why the npm package rename has to revert and the versioning constraint for the next publish.
+
+## 2. Tests
+
+- [x] 2.1 Update the package-name-dependent install/self-update/status assertions back to `@imdeadpool/guardex`.
+
+## 3. Implementation
+
+- [x] 3.1 Restore package metadata to `@imdeadpool/guardex` and bump the package version to `7.0.17`.
+- [x] 3.2 Refresh README, tutorial, Reddit kit, and README-linked assets to reference `@imdeadpool/guardex`.
+
+## 4. Verification
+
+- [x] 4.1 Run targeted package-name verification (`node --test --test-name-pattern "(default invocation checks for update and can auto-approve latest install|self-update verifies on-disk version after @latest install and retries with pinned version when stale|self-update restarts into the installed CLI after a successful on-disk upgrade|status --json returns cli, services, and repo summary|prompt outputs AI setup instructions|prompt --exec outputs command-only checklist|deprecated copy-commands alias still works and warns)" test/install.test.js`, `node --check bin/multiagent-safety.js`, `npm pack --dry-run`) and record the results.
Result: targeted package-name verification passed `7/7`; `node --check bin/multiagent-safety.js` passed; `npm pack --dry-run` produced `imdeadpool-guardex-7.0.17.tgz`. +- [x] 4.2 Run `openspec validate agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02 --type change --strict`. Result: `Change 'agent-codex-restore-guardex-npm-package-name-2026-04-22-00-02' is valid`. +- [x] 4.3 Run `openspec validate --specs`. Result: `No items found to validate.` +- [x] 4.4 Run `npm test` after the package-name revert to confirm broader repo integrity. Result: full suite passed `163/163`. + +## 5. Cleanup + +- [ ] 5.1 Finish branch via PR merge + cleanup and record final `MERGED` evidence. diff --git a/package-lock.json b/package-lock.json index 1601785..e6ae4fb 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { - "name": "@imdeadpool/gitguardex", - "version": "7.0.16", + "name": "@imdeadpool/guardex", + "version": "7.0.17", "lockfileVersion": 3, "requires": true, "packages": { "": { - "name": "@imdeadpool/gitguardex", - "version": "7.0.16", + "name": "@imdeadpool/guardex", + "version": "7.0.17", "license": "MIT", "bin": { "gitguardex": "bin/multiagent-safety.js", diff --git a/package.json b/package.json index 8864ff6..1503f8f 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { - "name": "@imdeadpool/gitguardex", - "version": "7.0.16", + "name": "@imdeadpool/guardex", + "version": "7.0.17", "description": "Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. 
Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman.", "license": "MIT", "preferGlobal": true, diff --git a/test/install.test.js b/test/install.test.js index 65cd18f..80455ac 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -191,7 +191,7 @@ function seedReleasePackageManifest(repoDir, overrides = {}) { const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); const mergedPackageJson = { ...packageJson, - name: packageJson.name || '@imdeadpool/gitguardex', + name: packageJson.name || '@imdeadpool/guardex', version: cliVersion, repository: { type: 'git', @@ -2447,7 +2447,7 @@ if [[ "$1" == "list" ]]; then echo '{"dependencies":{"oh-my-codex":{},"@fission-ai/openspec":{}}}' exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then echo "updated" > "${markerPath}" exit 0 fi @@ -2472,11 +2472,11 @@ exit 1 test('self-update verifies on-disk version after @latest install and retries with pinned version when stale', () => { const repoDir = initRepo(); const fakeGlobalRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-global-root-')); - const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'gitguardex'); + const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'guardex'); fs.mkdirSync(installedPkgDir, { recursive: true }); fs.writeFileSync( path.join(installedPkgDir, 'package.json'), - JSON.stringify({ name: '@imdeadpool/gitguardex', version: cliVersion }), + JSON.stringify({ name: '@imdeadpool/guardex', version: cliVersion }), 'utf8', ); const markerLatest = path.join(repoDir, '.npm-at-latest-called'); @@ -2494,15 +2494,15 @@ if [[ "$1" == "root" && "$2" == "-g" ]]; then echo "${fakeGlobalRoot}" exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then touch 
"${markerLatest}" # Simulate the npm quirk: report success without rewriting the on-disk package.json. exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@9.9.9" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@9.9.9" ]]; then touch "${markerPinned}" # Pinned retry actually advances the on-disk version. - printf '%s' '{"name":"@imdeadpool/gitguardex","version":"9.9.9"}' > "${installedPkgDir}/package.json" + printf '%s' '{"name":"@imdeadpool/guardex","version":"9.9.9"}' > "${installedPkgDir}/package.json" exit 0 fi echo "unexpected npm args: $*" >&2 @@ -2527,14 +2527,14 @@ exit 1 test('self-update restarts into the installed CLI after a successful on-disk upgrade', () => { const repoDir = initRepo(); const fakeGlobalRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-global-root-')); - const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'gitguardex'); + const installedPkgDir = path.join(fakeGlobalRoot, '@imdeadpool', 'guardex'); const installedBinDir = path.join(installedPkgDir, 'bin'); const reexecMarker = path.join(repoDir, '.self-update-reexec-called'); fs.mkdirSync(installedBinDir, { recursive: true }); fs.writeFileSync( path.join(installedPkgDir, 'package.json'), JSON.stringify({ - name: '@imdeadpool/gitguardex', + name: '@imdeadpool/guardex', version: '9.9.9', bin: { gx: 'bin/multiagent-safety.js' }, }), @@ -2562,7 +2562,7 @@ if [[ "$1" == "root" && "$2" == "-g" ]]; then echo "${fakeGlobalRoot}" exit 0 fi -if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/gitguardex@latest" ]]; then +if [[ "$1" == "i" && "$2" == "-g" && "$3" == "@imdeadpool/guardex@latest" ]]; then exit 0 fi echo "unexpected npm args: $*" >&2 @@ -2651,7 +2651,7 @@ test('status --json returns cli, services, and repo summary', () => { assert.equal(result.status, 0, result.stderr || result.stdout); const parsed = JSON.parse(result.stdout); - assert.equal(parsed.cli.name, '@imdeadpool/gitguardex'); + 
assert.equal(parsed.cli.name, '@imdeadpool/guardex'); assert.equal(typeof parsed.cli.version, 'string'); assert.equal(Array.isArray(parsed.services), true); const claudeService = parsed.services.find((service) => service.name === 'oh-my-claudecode'); @@ -4601,7 +4601,7 @@ test('prompt outputs AI setup instructions', () => { const repoDir = initRepo(); const result = runNode(['prompt'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /npm i -g @imdeadpool\/gitguardex/); + assert.match(result.stdout, /npm i -g @imdeadpool\/guardex/); assert.match(result.stdout, /GitGuardex \(gx\) setup checklist/); assert.match(result.stdout, /gx setup/); assert.match(result.stdout, /gx doctor/); @@ -4618,7 +4618,7 @@ test('prompt --exec outputs command-only checklist', () => { const repoDir = initRepo(); const result = runNode(['prompt', '--exec'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /^npm i -g @imdeadpool\/gitguardex/m); + assert.match(result.stdout, /^npm i -g @imdeadpool\/guardex/m); assert.match(result.stdout, /^gh --version/m); assert.match(result.stdout, /^gx setup$/m); assert.match(result.stdout, /^gx doctor$/m); @@ -4641,7 +4641,7 @@ test('deprecated copy-commands alias still works and warns', () => { const repoDir = initRepo(); const result = runNode(['copy-commands'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - assert.match(result.stdout, /^npm i -g @imdeadpool\/gitguardex/m); + assert.match(result.stdout, /^npm i -g @imdeadpool\/guardex/m); assert.match(result.stderr, /'copy-commands' is deprecated/); assert.match(result.stderr, /gx prompt --exec/); }); From dcc7154d7f8b12a7b996e3249651788c76135d5c Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 00:52:10 +0200 Subject: [PATCH 35/48] Move installed guard workflows to CLI-owned shims (#267) Setup and doctor 
now keep thin repo-local dispatch shims while package-owned assets handle hooks, finish flows, migrations, and OpenSpec scaffolding. This also aligns the prompt/test surfaces with the gx-first workflow and restores inherited CLI dispatch for doctor and hook child processes. Constraint: Installed repos must keep protected-branch repair and hook compatibility while copied workflow logic moves into the package Rejected: Keep repo-local workflow copies | doctor drift remains the steady state Confidence: high Scope-risk: broad Directive: Repo hooks and scripts are compatibility shims; change package-owned assets first and keep shim dispatch aligned Tested: node --check bin/multiagent-safety.js Tested: node --test test/install.test.js Tested: openspec validate agent-codex-cli-owned-install-surface-2026-04-21-23-09 --type change --strict Tested: openspec validate --specs Not-tested: Live GitHub PR or merge against a real GitHub remote beyond fake-gh harnesses Co-authored-by: NagyVikt --- README.md | 53 +- bin/multiagent-safety.js | 692 +++++++++++++++--- .../proposal.md | 26 + .../specs/cli-owned-install-surface/spec.md | 54 ++ .../tasks.md | 33 + templates/AGENTS.multiagent-safety.md | 8 +- templates/githooks/post-checkout | 2 +- templates/githooks/post-merge | 25 +- templates/githooks/pre-commit | 14 +- .../scripts/openspec/init-change-workspace.sh | 8 +- .../scripts/openspec/init-plan-workspace.sh | 14 +- test/install.test.js | 229 +++--- 12 files changed, 907 insertions(+), 251 deletions(-) create mode 100644 openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md create mode 100644 openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md create mode 100644 openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md diff --git a/README.md b/README.md index 6361524..c644f90 100644 --- a/README.md +++ b/README.md @@ -49,7 +49,7 @@ npm i -g @imdeadpool/guardex

- Then cd into your repo and run gx setup — hooks, scripts, templates, + Then cd into your repo and run gx setup — hook shims, workflow shims, repo state, and OMX / OpenSpec / caveman wiring all scaffold in one go.

@@ -88,7 +88,7 @@ cd /path/to/your/repo gx setup ``` -That's it. Install and update via `@imdeadpool/guardex`. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). +That's it. Install and update via `@imdeadpool/guardex`. Setup installs the minimal repo footprint: managed hook/workflow shims, lock state, AGENTS wiring, and OpenSpec/caveman/OMX scaffolding. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). --- @@ -119,7 +119,7 @@ That's it. Install and update via `@imdeadpool/guardex`. Setup installs hooks, s
> [!NOTE] -> In this repo, `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. Claude-specific command guidance is installed separately at `.claude/commands/gitguardex.md`. +> In this repo, `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. Optional Codex/Claude companion files are installed at the user level with `gx install-agent-skills`, not copied into each repo. ### Decision flow @@ -178,7 +178,7 @@ Before you branch, repair, or start agents, run plain `gx`. It gives you a one-s ![GitGuardex terminal status output](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-gx-terminal-status.svg) -Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hooks, scripts, templates, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean. +Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hook/workflow shims, repo state, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean. 
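The `gx doctor` auto-sandbox behavior described above turns on a single question: is HEAD currently on a protected branch? A minimal sketch of that predicate, assuming `main`/`master` as the defaults (the real protected set is configurable via `gx protect`, and the helper name here is illustrative):

```shell
#!/usr/bin/env bash
# Illustrative predicate for the doctor auto-sandbox decision: repair runs
# in a sandbox worktree only when HEAD sits on a protected branch.
set -eu

is_protected() {
  case "$1" in
    main|master) return 0 ;;  # assumed defaults; gx protect manages the real list
    *) return 1 ;;
  esac
}

branch="main"   # in practice: git rev-parse --abbrev-ref HEAD
if is_protected "$branch"; then
  echo "sandboxing repair off $branch"
else
  echo "repairing in place on $branch"
fi
```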
--- @@ -188,22 +188,22 @@ Per new agent task: ```sh # 1) Start isolated branch/worktree -bash scripts/agent-branch-start.sh "task-name" "agent-name" +gx branch start "task-name" "agent-name" # 2) Claim the files you're going to touch -python3 scripts/agent-file-locks.py claim \ +gx locks claim \ --branch "$(git rev-parse --abbrev-ref HEAD)" # 3) Implement + verify npm test # 4) Finish (commit + push + PR + merge + cleanup) -bash scripts/agent-branch-finish.sh \ +gx branch finish \ --branch "$(git rev-parse --abbrev-ref HEAD)" \ --base main --via-pr --wait-for-merge --cleanup ``` -If you use `scripts/codex-agent.sh`, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. +If you use the managed Codex launcher shim, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. GitGuardex normally prunes merged sandboxes for you as part of the finish flow. If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane: @@ -383,7 +383,7 @@ A few things worth knowing up front: - Direct commits/pushes to protected branches are **blocked** by default. Agents must use the `agent/*` + PR flow. - **Exception:** VS Code Source Control commits are allowed on protected branches that exist only locally (no upstream, no remote branch). - On protected `main`, `gx doctor` auto-runs in a sandbox agent branch/worktree so it can't touch your real main. -- In-place agent branching is disabled. `scripts/agent-branch-start.sh` always creates a separate worktree so your visible local/base branch never changes. +- In-place agent branching is disabled. `gx branch start` always creates a separate worktree so your visible local/base branch never changes. 
- Fresh sandbox branches start with no git upstream. Guardex records the protected base in `branch..guardexBase`, and the first `git push -u` publishes the real upstream. - Interactive self-update prompt defaults to **No** (`[y/N]`). @@ -542,8 +542,8 @@ Expanded flow: ### OpenSpec in agent sub-branches -- `scripts/codex-agent.sh` enforces OpenSpec workspaces before launching Codex. -- `scripts/agent-branch-start.sh` can scaffold both `openspec/changes//` and `openspec/plan//` when `GUARDEX_OPENSPEC_AUTO_INIT=true`. +- The managed Codex launcher shim enforces OpenSpec workspaces before launching Codex. +- `gx branch start` can scaffold both `openspec/changes//` and `openspec/plan//` when `GUARDEX_OPENSPEC_AUTO_INIT=true`. - The collaboration section in `tasks.md` is there for real cleanup handoffs too. If the first Codex/Claude session finishes the implementation work but hits a usage limit before `agent-branch-finish --cleanup`, hand the same sandbox to another agent, let that agent finish cleanup, and record the join/handoff in the change task. 
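The `GUARDEX_OPENSPEC_AUTO_INIT=true` scaffolding described above can be sketched as follows. The slug and the empty seed files are placeholders; the package-owned init scripts also populate proposal/tasks content:

```shell
#!/usr/bin/env bash
# Sketch: scaffold both OpenSpec workspaces for a sandbox branch when
# GUARDEX_OPENSPEC_AUTO_INIT=true. Runs in a throwaway directory.
set -eu

workdir="$(mktemp -d)"
cd "$workdir"

GUARDEX_OPENSPEC_AUTO_INIT=true
slug="demo-change"   # placeholder; real slugs derive from the agent branch name

if [ "${GUARDEX_OPENSPEC_AUTO_INIT:-false}" = "true" ]; then
  mkdir -p "openspec/changes/$slug" "openspec/plan/$slug"
  : > "openspec/changes/$slug/proposal.md"
  : > "openspec/changes/$slug/tasks.md"
fi

ls openspec
```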
Environment variables: @@ -560,25 +560,26 @@ Environment variables: ## Files installed by setup ```text -scripts/agent-branch-start.sh -scripts/agent-branch-finish.sh -scripts/codex-agent.sh -scripts/review-bot-watch.sh -scripts/agent-worktree-prune.sh -scripts/agent-file-locks.py -scripts/install-agent-git-hooks.sh -scripts/openspec/init-plan-workspace.sh -.githooks/pre-commit -.githooks/pre-push -.codex/skills/gitguardex/SKILL.md -.claude/commands/gitguardex.md -.github/pull.yml.example -.github/workflows/cr.yml +.githooks/pre-commit # shim -> gx hook run pre-commit +.githooks/pre-push # shim -> gx hook run pre-push +.githooks/post-merge # shim -> gx hook run post-merge +.githooks/post-checkout # shim -> gx hook run post-checkout +scripts/agent-branch-start.sh # shim -> gx branch start +scripts/agent-branch-finish.sh # shim -> gx branch finish +scripts/agent-branch-merge.sh # shim -> gx branch merge +scripts/agent-worktree-prune.sh # shim -> gx worktree prune +scripts/agent-file-locks.py # shim -> gx locks ... +scripts/codex-agent.sh # shim -> CLI-owned Codex launcher +scripts/review-bot-watch.sh # shim -> CLI-owned review bot +scripts/openspec/init-plan-workspace.sh # shim -> CLI-owned OpenSpec init +scripts/openspec/init-change-workspace.sh # shim -> CLI-owned OpenSpec init .omc/agent-worktrees .omx/state/agent-file-locks.json +.github/pull.yml.example +.github/workflows/cr.yml ``` -If `package.json` exists, setup also adds `agent:*` helper scripts. +Repo-local Codex/Claude helper files are no longer copied into the repo. Install the optional user-level companions with `gx install-agent-skills`. 
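The shims listed above all resolve the backing CLI the same way: an explicit override wins, then the short `gx` command, then the long-form `gitguardex`. Here is that fallback order extracted into a standalone function for illustration; the installed shims are generated by the CLI and additionally honor `GUARDEX_CLI_ENTRY` for direct dispatch:

```shell
#!/usr/bin/env bash
# Mirrors the shim's CLI resolution order: explicit GUARDEX_CLI_BIN
# override first, then gx, then gitguardex.
set -eu

resolve_guardex_cli() {
  if [ -n "${GUARDEX_CLI_BIN:-}" ]; then
    printf '%s' "$GUARDEX_CLI_BIN"
    return 0
  fi
  if command -v gx >/dev/null 2>&1; then
    printf '%s' "gx"
    return 0
  fi
  if command -v gitguardex >/dev/null 2>&1; then
    printf '%s' "gitguardex"
    return 0
  fi
  echo "[gitguardex-shim] Missing gx CLI in PATH." >&2
  return 1
}

GUARDEX_CLI_BIN=/opt/guardex/bin/gx resolve_guardex_cli; echo
```

Because the override is checked first, tests and sandboxed repos can pin a shim to a specific CLI binary without touching PATH.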
--- diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index 8110c39..f30a943 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -5,11 +5,18 @@ const os = require('node:os'); const path = require('node:path'); const cp = require('node:child_process'); -const packageJsonPath = path.resolve(__dirname, '..', 'package.json'); +const PACKAGE_ROOT = path.resolve(__dirname, '..'); +const packageJsonPath = path.join(PACKAGE_ROOT, 'package.json'); const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); const TOOL_NAME = 'gitguardex'; const SHORT_TOOL_NAME = 'gx'; +if (!process.env.GUARDEX_CLI_ENTRY) { + process.env.GUARDEX_CLI_ENTRY = __filename; +} +if (!process.env.GUARDEX_NODE_BIN) { + process.env.GUARDEX_NODE_BIN = process.execPath; +} const LEGACY_NAMES = ['guardex', 'multiagent-safety']; const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`; const OPENSPEC_PACKAGE = '@fission-ai/openspec'; @@ -85,30 +92,15 @@ const COMPOSE_HINT_FILES = [ 'compose.yaml', ]; -const TEMPLATE_ROOT = path.resolve(__dirname, '..', 'templates'); +const TEMPLATE_ROOT = path.join(PACKAGE_ROOT, 'templates'); + +const HOOK_NAMES = ['pre-commit', 'pre-push', 'post-merge', 'post-checkout']; const TEMPLATE_FILES = [ - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-branch-merge.sh', 'scripts/agent-session-state.js', - 'scripts/codex-agent.sh', 'scripts/guardex-docker-loader.sh', - 'scripts/install-vscode-active-agents-extension.js', - 'scripts/review-bot-watch.sh', - 'scripts/agent-worktree-prune.sh', - 'scripts/agent-file-locks.py', 'scripts/guardex-env.sh', - 'scripts/install-agent-git-hooks.sh', - 'scripts/openspec/init-plan-workspace.sh', - 'scripts/openspec/init-change-workspace.sh', - 'githooks/pre-commit', - 'githooks/pre-push', - 'githooks/post-merge', - 'githooks/post-checkout', - 'codex/skills/gitguardex/SKILL.md', - 'codex/skills/guardex-merge-skills-to-dev/SKILL.md', - 
'claude/commands/gitguardex.md', + 'scripts/install-vscode-active-agents-extension.js', 'github/pull.yml.example', 'github/workflows/cr.yml', 'vscode/guardex-active-agents/package.json', @@ -117,22 +109,50 @@ const TEMPLATE_FILES = [ 'vscode/guardex-active-agents/README.md', ]; -const REQUIRED_WORKFLOW_FILES = [ +const SCRIPT_SHIMS = [ + { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, + { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, + { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, + { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, + { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, + { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, + { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, + { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, + { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'changeInit'] }, +]; + +const LEGACY_MANAGED_REPO_FILES = [ 'scripts/agent-branch-start.sh', 'scripts/agent-branch-finish.sh', 'scripts/agent-branch-merge.sh', 'scripts/agent-session-state.js', + 'scripts/codex-agent.sh', 'scripts/guardex-docker-loader.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'scripts/review-bot-watch.sh', 'scripts/agent-worktree-prune.sh', 'scripts/agent-file-locks.py', 'scripts/guardex-env.sh', 'scripts/install-agent-git-hooks.sh', + 'scripts/openspec/init-plan-workspace.sh', + 'scripts/openspec/init-change-workspace.sh', '.githooks/pre-commit', + '.githooks/pre-push', '.githooks/post-merge', + '.githooks/post-checkout', + '.codex/skills/gitguardex/SKILL.md', + 
'.codex/skills/guardex-merge-skills-to-dev/SKILL.md', + '.claude/commands/gitguardex.md', +]; + +const REQUIRED_WORKFLOW_FILES = [ + ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)), + ...SCRIPT_SHIMS.map((entry) => entry.relativePath), + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), '.omx/state/agent-file-locks.json', ]; -const REQUIRED_PACKAGE_SCRIPTS = { +const LEGACY_MANAGED_PACKAGE_SCRIPTS = { 'agent:codex': 'bash ./scripts/codex-agent.sh', 'agent:branch:start': 'bash ./scripts/agent-branch-start.sh', 'agent:branch:finish': 'bash ./scripts/agent-branch-finish.sh', @@ -157,32 +177,44 @@ const REQUIRED_PACKAGE_SCRIPTS = { 'agent:finish': 'gx finish --all', }; +const PACKAGE_SCRIPT_ASSETS = { + branchStart: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-start.sh'), + branchFinish: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-finish.sh'), + branchMerge: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-merge.sh'), + codexAgent: path.join(TEMPLATE_ROOT, 'scripts', 'codex-agent.sh'), + reviewBot: path.join(TEMPLATE_ROOT, 'scripts', 'review-bot-watch.sh'), + worktreePrune: path.join(TEMPLATE_ROOT, 'scripts', 'agent-worktree-prune.sh'), + lockTool: path.join(TEMPLATE_ROOT, 'scripts', 'agent-file-locks.py'), + planInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-plan-workspace.sh'), + changeInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-change-workspace.sh'), +}; + +const USER_LEVEL_SKILL_ASSETS = [ + { + source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'gitguardex', 'SKILL.md'), + destination: path.join('.codex', 'skills', 'gitguardex', 'SKILL.md'), + }, + { + source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'), + destination: path.join('.codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'), + }, + { + source: path.join(TEMPLATE_ROOT, 'claude', 'commands', 'gitguardex.md'), + destination: path.join('.claude', 'commands', 'gitguardex.md'), + }, +]; + 
const EXECUTABLE_RELATIVE_PATHS = new Set([ - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-branch-merge.sh', 'scripts/agent-session-state.js', - 'scripts/codex-agent.sh', 'scripts/guardex-docker-loader.sh', 'scripts/install-vscode-active-agents-extension.js', - 'scripts/review-bot-watch.sh', - 'scripts/agent-worktree-prune.sh', - 'scripts/agent-file-locks.py', - 'scripts/install-agent-git-hooks.sh', - 'scripts/openspec/init-plan-workspace.sh', - 'scripts/openspec/init-change-workspace.sh', - '.githooks/pre-commit', - '.githooks/pre-push', - '.githooks/post-merge', - '.githooks/post-checkout', + ...SCRIPT_SHIMS.map((entry) => entry.relativePath), + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), ]); const CRITICAL_GUARDRAIL_PATHS = new Set([ 'AGENTS.md', - '.githooks/pre-commit', - '.githooks/pre-push', - '.githooks/post-merge', - '.githooks/post-checkout', + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), 'scripts/agent-branch-start.sh', 'scripts/agent-branch-finish.sh', 'scripts/agent-branch-merge.sh', @@ -212,9 +244,6 @@ const MANAGED_GITIGNORE_PATHS = [ 'scripts/agent-file-locks.py', '.githooks', 'oh-my-codex/', - '.codex/skills/gitguardex/SKILL.md', - '.codex/skills/guardex-merge-skills-to-dev/SKILL.md', - '.claude/commands/gitguardex.md', LOCK_FILE_RELATIVE, ]; const REPO_SCAFFOLD_DIRECTORIES = ['bin']; @@ -247,6 +276,12 @@ const SUGGESTIBLE_COMMANDS = [ 'status', 'setup', 'doctor', + 'branch', + 'locks', + 'worktree', + 'hook', + 'migrate', + 'install-agent-skills', 'agents', 'merge', 'finish', @@ -272,6 +307,12 @@ const CLI_COMMAND_DESCRIPTIONS = [ ['status', 'Show GitGuardex CLI + service health without modifying files'], ['setup', 'Install, repair, and verify guardrails (flags: --repair, --install-only, --target)'], ['doctor', 'Repair drift + verify (auto-sandboxes on protected main)'], + ['branch', 'CLI-owned branch workflow surface (start/finish/merge)'], + ['locks', 
'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'], + ['worktree', 'CLI-owned worktree cleanup surface (prune)'], + ['hook', 'Hook dispatch/install surface used by managed shims'], + ['migrate', 'Convert legacy repo-local installs to the new shim-based CLI-owned surface'], + ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'], ['protect', 'Manage protected branches (list/add/remove/set/reset)'], ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'], ['sync', 'Sync agent branches with origin/'], @@ -319,8 +360,8 @@ const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in thi 1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version 2) Bootstrap: gx setup 3) Repair: gx doctor -4) Task loop: bash scripts/codex-agent.sh "" "" - or branch-start -> python3 scripts/agent-file-locks.py claim -> branch-finish +4) Task loop: gx branch start "" "" + then gx locks claim --branch "" -> gx branch finish 5) Integrate: gx merge --branch --branch 6) Finish: gx finish --all 7) Cleanup: gx cleanup @@ -335,8 +376,8 @@ const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND} gh --version gx setup gx doctor -bash scripts/codex-agent.sh "" "" -python3 scripts/agent-file-locks.py claim --branch "" +gx branch start "" "" +gx locks claim --branch "" gx merge --branch "" --branch "" gx finish --all gx cleanup @@ -582,10 +623,74 @@ function run(cmd, args, options = {}) { encoding: 'utf8', stdio: options.stdio || 'pipe', cwd: options.cwd, + env: options.env ? 
{ ...process.env, ...options.env } : process.env, timeout: options.timeout, }); } +function extractTargetedArgs(rawArgs, defaultTarget = process.cwd()) { + const passthrough = []; + let target = defaultTarget; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target' || arg === '-t') { + target = requireValue(rawArgs, index, '--target'); + index += 1; + continue; + } + passthrough.push(arg); + } + + return { target, passthrough }; +} + +function packageAssetEnv(extraEnv = {}) { + return { + GUARDEX_CLI_ENTRY: __filename, + GUARDEX_NODE_BIN: process.execPath, + ...extraEnv, + }; +} + +function packageAssetPath(assetKey) { + const assetPath = PACKAGE_SCRIPT_ASSETS[assetKey]; + if (!assetPath) { + throw new Error(`Unknown package asset: ${assetKey}`); + } + if (!fs.existsSync(assetPath)) { + throw new Error(`Missing package asset: ${assetPath}`); + } + return assetPath; +} + +function runPackageAsset(assetKey, rawArgs, options = {}) { + const assetPath = packageAssetPath(assetKey); + let cmd = 'bash'; + if (assetPath.endsWith('.py')) { + cmd = 'python3'; + } else if (assetPath.endsWith('.js')) { + cmd = process.execPath; + } + return run(cmd, [assetPath, ...rawArgs], { + cwd: options.cwd || process.cwd(), + stdio: options.stdio || 'pipe', + timeout: options.timeout, + env: packageAssetEnv(options.env), + }); +} + +function invokePackageAsset(assetKey, rawArgs, options = {}) { + const result = runPackageAsset(assetKey, rawArgs, options); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + if (result.status !== 0) { + throw new Error(`${assetKey} command failed with status ${result.status}`); + } + process.exitCode = 0; + return result; +} + function formatElapsedDuration(ms) { const durationMs = Number.isFinite(ms) ? 
Math.max(0, ms) : 0; if (durationMs < 1000) { @@ -864,6 +969,111 @@ function isCriticalGuardrailPath(relativePath) { return CRITICAL_GUARDRAIL_PATHS.has(relativePath); } +function shellSingleQuote(value) { + return `'${String(value).replace(/'/g, `'\"'\"'`)}'`; +} + +function renderShellDispatchShim(commandParts) { + const rendered = commandParts.map((part) => shellSingleQuote(part)).join(' '); + return ( + '#!/usr/bin/env bash\n' + + 'set -euo pipefail\n' + + '\n' + + 'if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then\n' + + ' node_bin="${GUARDEX_NODE_BIN:-node}"\n' + + ` exec "$node_bin" "$GUARDEX_CLI_ENTRY" ${rendered} "$@"\n` + + 'fi\n' + + '\n' + + 'resolve_guardex_cli() {\n' + + ' if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then\n' + + ' printf \'%s\' "$GUARDEX_CLI_BIN"\n' + + ' return 0\n' + + ' fi\n' + + ' if command -v gx >/dev/null 2>&1; then\n' + + ' printf \'%s\' "gx"\n' + + ' return 0\n' + + ' fi\n' + + ' if command -v gitguardex >/dev/null 2>&1; then\n' + + ' printf \'%s\' "gitguardex"\n' + + ' return 0\n' + + ' fi\n' + + ' echo "[gitguardex-shim] Missing gx CLI in PATH." 
>&2\n' + + ' exit 1\n' + + '}\n' + + '\n' + + 'cli_bin="$(resolve_guardex_cli)"\n' + + `exec "$cli_bin" ${rendered} "$@"\n` + ); +} + +function renderPythonDispatchShim(commandParts) { + return ( + '#!/usr/bin/env python3\n' + + 'import os\n' + + 'import shutil\n' + + 'import subprocess\n' + + 'import sys\n' + + '\n' + + `COMMAND = ${JSON.stringify(commandParts)}\n` + + '\n' + + 'entry = os.environ.get("GUARDEX_CLI_ENTRY")\n' + + 'if entry:\n' + + ' node_bin = os.environ.get("GUARDEX_NODE_BIN") or shutil.which("node") or "node"\n' + + ' raise SystemExit(subprocess.call([node_bin, entry, *COMMAND, *sys.argv[1:]]))\n' + + 'cli = os.environ.get("GUARDEX_CLI_BIN") or shutil.which("gx") or shutil.which("gitguardex")\n' + + 'if not cli:\n' + + ' sys.stderr.write("[gitguardex-shim] Missing gx CLI in PATH.\\n")\n' + + ' raise SystemExit(1)\n' + + 'raise SystemExit(subprocess.call([cli, *COMMAND, *sys.argv[1:]]))\n' + ); +} + +function renderManagedFile(repoRoot, relativePath, content, options = {}) { + const destinationPath = path.join(repoRoot, relativePath); + const destinationExists = fs.existsSync(destinationPath); + const force = Boolean(options.force); + const dryRun = Boolean(options.dryRun); + + if (destinationExists) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === content) { + ensureExecutable(destinationPath, relativePath, dryRun); + return { status: 'unchanged', file: relativePath }; + } + if (!force && !isCriticalGuardrailPath(relativePath)) { + throw new Error(`Refusing to overwrite existing file without --force: ${relativePath}`); + } + } + + ensureParentDir(repoRoot, destinationPath, dryRun); + if (!dryRun) { + fs.writeFileSync(destinationPath, content, 'utf8'); + ensureExecutable(destinationPath, relativePath, dryRun); + } + + if (destinationExists && !force && isCriticalGuardrailPath(relativePath)) { + return { status: dryRun ? 
'would-repair-critical' : 'repaired-critical', file: relativePath }; + } + + return { status: destinationExists ? 'overwritten' : 'created', file: relativePath }; +} + +function ensureGeneratedScriptShim(repoRoot, spec, options = {}) { + const content = spec.kind === 'python' + ? renderPythonDispatchShim(spec.command) + : renderShellDispatchShim(spec.command); + return renderManagedFile(repoRoot, spec.relativePath, content, options); +} + +function ensureHookShim(repoRoot, hookName, options = {}) { + return renderManagedFile( + repoRoot, + path.posix.join('.githooks', hookName), + renderShellDispatchShim(['hook', 'run', hookName]), + options, + ); +} + function copyTemplateFile(repoRoot, relativeTemplatePath, force, dryRun) { const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath); const destinationRelativePath = toDestinationPath(relativeTemplatePath); @@ -1041,8 +1251,7 @@ function writeLockState(repoRoot, payload, dryRun) { fs.writeFileSync(lockPath, JSON.stringify(payload, null, 2) + '\n', 'utf8'); } -function ensurePackageScripts(repoRoot, dryRun, options = {}) { - const force = Boolean(options.force); +function removeLegacyPackageScripts(repoRoot, dryRun) { const packagePath = path.join(repoRoot, 'package.json'); if (!fs.existsSync(packagePath)) { return { status: 'skipped', file: 'package.json', note: 'package.json not found' }; @@ -1058,29 +1267,87 @@ function ensurePackageScripts(repoRoot, dryRun, options = {}) { const existingScripts = pkg.scripts && typeof pkg.scripts === 'object' ? 
pkg.scripts : {}; - const hasExistingAgentScripts = Object.keys(existingScripts).some((key) => key.startsWith('agent:')); - if (hasExistingAgentScripts && !force) { - return { status: 'unchanged', file: 'package.json', note: 'preserved existing agent:* scripts' }; - } - pkg.scripts = existingScripts; let changed = false; - for (const [key, value] of Object.entries(REQUIRED_PACKAGE_SCRIPTS)) { - if (pkg.scripts[key] !== value) { - pkg.scripts[key] = value; + for (const [key, value] of Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS)) { + if (existingScripts[key] === value) { + delete existingScripts[key]; changed = true; } } if (!changed) { - return { status: 'unchanged', file: 'package.json' }; + return { status: 'unchanged', file: 'package.json', note: 'no Guardex-managed agent:* scripts found' }; } if (!dryRun) { fs.writeFileSync(packagePath, JSON.stringify(pkg, null, 2) + '\n', 'utf8'); } - return { status: 'updated', file: 'package.json' }; + return { status: dryRun ? 'would-update' : 'updated', file: 'package.json', note: 'removed Guardex-managed agent:* scripts' }; +} + +function installUserLevelAsset(asset, options = {}) { + const dryRun = Boolean(options.dryRun); + const force = Boolean(options.force); + const destinationPath = path.join(GUARDEX_HOME_DIR, asset.destination); + const sourceContent = fs.readFileSync(asset.source, 'utf8'); + const destinationExists = fs.existsSync(destinationPath); + + if (destinationExists) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === sourceContent) { + return { status: 'unchanged', file: asset.destination }; + } + if (!force) { + return { status: 'skipped-conflict', file: asset.destination }; + } + } + + if (!dryRun) { + fs.mkdirSync(path.dirname(destinationPath), { recursive: true }); + fs.writeFileSync(destinationPath, sourceContent, 'utf8'); + } + return { status: destinationExists ? (dryRun ? 
'would-update' : 'updated') : 'created', file: asset.destination }; +} + +function removeLegacyManagedRepoFile(repoRoot, relativePath, options = {}) { + const dryRun = Boolean(options.dryRun); + const force = Boolean(options.force); + const absolutePath = path.join(repoRoot, relativePath); + if (!fs.existsSync(absolutePath)) { + return { status: 'unchanged', file: relativePath, note: 'not present' }; + } + if (!fs.statSync(absolutePath).isFile()) { + return { status: 'skipped-conflict', file: relativePath, note: 'not a regular file' }; + } + + const skillAsset = USER_LEVEL_SKILL_ASSETS.find((asset) => asset.destination === relativePath); + if (skillAsset) { + const userLevelPath = path.join(GUARDEX_HOME_DIR, skillAsset.destination); + if (!fs.existsSync(userLevelPath)) { + return { status: 'skipped', file: relativePath, note: 'user-level replacement not installed' }; + } + } + + const templateRelative = skillAsset + ? skillAsset.source.slice(TEMPLATE_ROOT.length + 1) + : relativePath.replace(/^\./, ''); + const sourcePath = path.join(TEMPLATE_ROOT, templateRelative); + if (!fs.existsSync(sourcePath)) { + return { status: 'skipped', file: relativePath, note: 'template source missing' }; + } + + const sourceContent = fs.readFileSync(sourcePath, 'utf8'); + const existingContent = fs.readFileSync(absolutePath, 'utf8'); + if (existingContent !== sourceContent && !force) { + return { status: 'skipped-conflict', file: relativePath, note: 'local edits differ from managed template' }; + } + + if (!dryRun) { + fs.rmSync(absolutePath, { force: true }); + } + return { status: dryRun ? 
'would-remove' : 'removed', file: relativePath }; } function ensureAgentsSnippet(repoRoot, dryRun, options = {}) { @@ -1446,7 +1713,7 @@ function assertProtectedMainWriteAllowed(options, commandName) { throw new Error( `${commandName} blocked on protected branch '${blocked.branch}' in an initialized repo.\n` + `Keep local '${blocked.branch}' pull-only: start an agent branch/worktree first:\n` + - ` bash scripts/agent-branch-start.sh "" "codex"\n` + + ` gx branch start "" "codex"\n` + `Override once only when intentional: --allow-protected-base-write`, ); } @@ -1672,8 +1939,7 @@ function startProtectedBaseSandbox(blocked, { taskName, sandboxSuffix }) { return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); } - const startResult = run('bash', [ - startScript, + const startResult = runPackageAsset('branchStart', [ '--task', taskName, '--agent', @@ -1822,8 +2088,7 @@ function collectWorktreeDirtyPaths(worktreePath) { } function collectDoctorForceAddPaths(worktreePath) { - return TEMPLATE_FILES - .map((entry) => toDestinationPath(entry)) + return REQUIRED_WORKFLOW_FILES .filter((relativePath) => relativePath.startsWith('scripts/') || relativePath.startsWith('.githooks/')) .filter((relativePath) => fs.existsSync(path.join(worktreePath, relativePath))); } @@ -1875,13 +2140,13 @@ function claimDoctorChangedLocks(metadata) { ])); const deletedPaths = collectDoctorDeletedPaths(metadata.worktreePath); if (changedPaths.length > 0) { - run('python3', [lockScript, 'claim', '--branch', metadata.branch, ...changedPaths], { + runPackageAsset('lockTool', ['claim', '--branch', metadata.branch, ...changedPaths], { cwd: metadata.worktreePath, timeout: 30_000, }); } if (deletedPaths.length > 0) { - run('python3', [lockScript, 'allow-delete', '--branch', metadata.branch, ...deletedPaths], { + runPackageAsset('lockTool', ['allow-delete', '--branch', metadata.branch, ...deletedPaths], { cwd: metadata.worktreePath, timeout: 30_000, }); @@ -2027,7 +2292,7 @@ function 
finishDoctorSandboxBranch(blocked, metadata, options = {}) { const finishResult = run( 'bash', - [finishScript, '--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg], + [finishScript, '--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'], { cwd: metadata.worktreePath, timeout: finishTimeoutMs }, ); if (isSpawnFailure(finishResult)) { @@ -2098,7 +2363,7 @@ function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata ...(autoCommitResult.stagedFiles || []), ...OMX_SCAFFOLD_DIRECTORIES, ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)), + ...REQUIRED_WORKFLOW_FILES, 'bin', 'package.json', '.gitignore', @@ -2236,9 +2501,7 @@ function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata } function syncDoctorLocalSupportFiles(repoRoot, dryRun) { - return TEMPLATE_FILES - .filter((entry) => entry.startsWith('codex/') || entry.startsWith('claude/')) - .map((entry) => ensureTemplateFilePresent(repoRoot, entry, dryRun)); + return []; } function runDoctorInSandbox(options, blocked) { @@ -3196,7 +3459,6 @@ function autoFinishReadyAgentBranches(repoRoot, options = {}) { summary.attempted += 1; const finishArgs = [ - finishScript, '--branch', branch, '--base', @@ -3205,7 +3467,7 @@ function autoFinishReadyAgentBranches(repoRoot, options = {}) { waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', '--cleanup', ]; - const finishResult = run('bash', finishArgs, { cwd: repoRoot }); + const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot }); const combinedOutput = [finishResult.stdout || '', finishResult.stderr || ''].join('\n').trim(); if (finishResult.status === 0) { @@ -3358,9 +3620,9 @@ function printSetupRepoHints(repoRoot, baseBranch, repoLabel = '') { console.log(`[${TOOL_NAME}] Bootstrap commit${label}: git add . 
&& git commit -m "bootstrap gitguardex"`); console.log( `[${TOOL_NAME}] First agent flow${label}: ` + - `bash scripts/agent-branch-start.sh "" "codex" -> ` + - `python3 scripts/agent-file-locks.py claim --branch "$(git branch --show-current)" -> ` + - `bash scripts/agent-branch-finish.sh --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`, + `gx branch start "" "codex" -> ` + + `gx locks claim --branch "$(git branch --show-current)" -> ` + + `gx branch finish --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`, ); } if (!hasOrigin) { @@ -3708,19 +3970,20 @@ function parseMergeArgs(rawArgs) { return options; } -function parseFinishArgs(rawArgs) { +function parseFinishArgs(rawArgs, defaults = {}) { const options = { target: process.cwd(), base: '', branch: '', all: false, dryRun: false, - waitForMerge: true, - cleanup: true, + waitForMerge: defaults.waitForMerge ?? true, + cleanup: defaults.cleanup ?? true, keepRemote: false, noAutoCommit: false, failFast: false, commitMessage: '', + mergeMode: defaults.mergeMode || 'pr', }; for (let index = 0; index < rawArgs.length; index += 1) { @@ -3777,6 +4040,26 @@ function parseFinishArgs(rawArgs) { options.waitForMerge = false; continue; } + if (arg === '--via-pr') { + options.mergeMode = 'pr'; + continue; + } + if (arg === '--direct-only') { + options.mergeMode = 'direct'; + continue; + } + if (arg === '--mode') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--mode requires a value'); + } + if (!['auto', 'direct', 'pr'].includes(next)) { + throw new Error(`Invalid --mode value: ${next} (expected auto|direct|pr)`); + } + options.mergeMode = next; + index += 1; + continue; + } if (arg === '--cleanup') { options.cleanup = true; continue; @@ -3941,7 +4224,7 @@ function claimLocksForAutoCommit(repoRoot, worktreePath, branch) { ]); if (changedFiles.length > 0) { - const claim = run('python3', [lockScript, 'claim', '--branch', branch, 
...changedFiles], { + const claim = runPackageAsset('lockTool', ['claim', '--branch', branch, ...changedFiles], { cwd: repoRoot, stdio: 'pipe', }); @@ -3975,7 +4258,7 @@ function claimLocksForAutoCommit(repoRoot, worktreePath, branch) { ]); if (deletedFiles.length > 0) { - const allowDelete = run('python3', [lockScript, 'allow-delete', '--branch', branch, ...deletedFiles], { + const allowDelete = runPackageAsset('lockTool', ['allow-delete', '--branch', branch, ...deletedFiles], { cwd: repoRoot, stdio: 'pipe', }); @@ -4753,6 +5036,16 @@ function askGlobalInstallForMissing(options, missingPackages, missingLocalTools) } function installGlobalToolchain(options) { + const approval = resolveGlobalInstallApproval(options); + if (approval.source === 'flag' && !approval.approved) { + return { + status: 'skipped', + reason: approval.source, + missingPackages: [], + missingLocalTools: [], + }; + } + if (options.dryRun) { return { status: 'dry-run-skip' }; } @@ -4781,11 +5074,11 @@ function installGlobalToolchain(options) { const missingPackages = detection.ok ? 
detection.missing : [...GLOBAL_TOOLCHAIN_PACKAGES]; const missingLocalTools = localCompanionTools.filter((tool) => tool.status !== 'active'); - const approval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools); - if (!approval.approved) { + const installApproval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools); + if (!installApproval.approved) { return { status: 'skipped', - reason: approval.source, + reason: installApproval.source, missingPackages, missingLocalTools, }; @@ -4880,13 +5173,15 @@ function runInstallInternal(options) { for (const templateFile of TEMPLATE_FILES) { operations.push(copyTemplateFile(repoRoot, templateFile, Boolean(options.force), Boolean(options.dryRun))); } + for (const shim of SCRIPT_SHIMS) { + operations.push(ensureGeneratedScriptShim(repoRoot, shim, options)); + } + for (const hookName of HOOK_NAMES) { + operations.push(ensureHookShim(repoRoot, hookName, options)); + } operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun))); - if (!options.skipPackageJson) { - operations.push(ensurePackageScripts(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) })); - } - if (!options.skipAgents) { operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) })); } @@ -4925,6 +5220,12 @@ function runFixInternal(options) { for (const templateFile of TEMPLATE_FILES) { operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun))); } + for (const shim of SCRIPT_SHIMS) { + operations.push(ensureGeneratedScriptShim(repoRoot, shim, options)); + } + for (const hookName of HOOK_NAMES) { + operations.push(ensureHookShim(repoRoot, hookName, options)); + } operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun))); @@ -4954,10 +5255,6 @@ function runFixInternal(options) { } } - if (!options.skipPackageJson) { - operations.push(ensurePackageScripts(repoRoot, Boolean(options.dryRun), { force: 
Boolean(options.force) })); - } - if (!options.skipAgents) { operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) })); } @@ -4987,8 +5284,7 @@ function runScanInternal(options) { const requiredPaths = [ ...OMX_SCAFFOLD_DIRECTORIES, ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)), - LOCK_FILE_RELATIVE, + ...REQUIRED_WORKFLOW_FILES, ]; for (const relativePath of requiredPaths) { @@ -6607,17 +6903,18 @@ function doctorAudit(rawArgs) { const packagePath = path.join(repoRoot, 'package.json'); if (!fs.existsSync(packagePath)) { - warn('package.json not found (npm helper scripts cannot be verified)'); + warn('package.json not found (legacy agent:* script drift cannot be checked)'); } else { try { const pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8')); const scripts = pkg.scripts || {}; - for (const [name, expectedValue] of Object.entries(REQUIRED_PACKAGE_SCRIPTS)) { - if (scripts[name] !== expectedValue) { - fail(`package.json script mismatch for "${name}"`); - } else { - ok(`package.json script "${name}" is configured`); - } + const legacyAgentScripts = Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS) + .filter(([name, expectedValue]) => scripts[name] === expectedValue) + .map(([name]) => name); + if (legacyAgentScripts.length > 0) { + warn(`legacy agent:* package.json scripts remain (${legacyAgentScripts.join(', ')}); run '${SHORT_TOOL_NAME} migrate' to remove them`); + } else { + ok('package.json does not contain Guardex-managed agent:* helper scripts'); } } catch (error) { fail(`package.json is invalid JSON: ${error.message}`); @@ -6699,6 +6996,167 @@ function prompt(rawArgs) { return copyPrompt(); } +function printStandaloneOperations(title, rootLabel, operations, dryRun = false) { + console.log(`[${TOOL_NAME}] ${title}: ${rootLabel}`); + for (const operation of operations) { + const note = operation.note ? 
` (${operation.note})` : ''; + console.log(` - ${operation.status.padEnd(12)} ${operation.file}${note}`); + } + if (dryRun) { + console.log(`[${TOOL_NAME}] Dry run complete. No files were modified.`); + } +} + +function branch(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'start') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('branchStart', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + if (subcommand === 'finish') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('branchFinish', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + if (subcommand === 'merge') return merge(rest); + throw new Error( + `Usage: ${SHORT_TOOL_NAME} branch [options] ` + + `(examples: '${SHORT_TOOL_NAME} branch start "" ""', '${SHORT_TOOL_NAME} branch finish --branch ')`, + ); +} + +function locks(rawArgs) { + const { target, passthrough } = extractTargetedArgs(rawArgs); + const result = runPackageAsset('lockTool', passthrough, { cwd: resolveRepoRoot(target) }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; +} + +function worktree(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'prune') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('worktreePrune', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + throw new Error(`Usage: ${SHORT_TOOL_NAME} worktree prune [cleanup-options]`); +} + +function hook(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'run') { + const [hookName, ...hookArgs] = rest; + if (!HOOK_NAMES.includes(hookName)) { + throw new Error(`Unknown hook name: ${hookName || '(missing)'}`); + } + const { target, passthrough } = extractTargetedArgs(hookArgs); + const hookAssetPath = path.join(TEMPLATE_ROOT, 'githooks', hookName); + const result = 
run('bash', [hookAssetPath, ...passthrough], { + cwd: resolveRepoRoot(target), + stdio: hookName === 'pre-push' ? 'inherit' : 'pipe', + env: packageAssetEnv(), + }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; + return; + } + if (subcommand === 'install') { + const { target, passthrough } = extractTargetedArgs(rest); + if (passthrough.length > 0) { + throw new Error(`Unknown hook install option: ${passthrough[0]}`); + } + const repoRoot = resolveRepoRoot(target); + const hookResult = configureHooks(repoRoot, false); + console.log(`[${TOOL_NAME}] Hook install target: ${repoRoot}`); + console.log(` - hooksPath ${hookResult.status} ${hookResult.key}=${hookResult.value}`); + process.exitCode = 0; + return; + } + throw new Error(`Usage: ${SHORT_TOOL_NAME} hook ...`); +} + +function internal(rawArgs) { + const [subcommand, assetKey, ...rest] = rawArgs; + if (subcommand !== 'run-shell') { + throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`); + } + const { target, passthrough } = extractTargetedArgs(rest); + const result = runPackageAsset(assetKey, passthrough, { cwd: resolveRepoRoot(target) }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; +} + +function installAgentSkills(rawArgs) { + let dryRun = false; + let force = false; + for (const arg of rawArgs) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const operations = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('User-level Guardex skills', GUARDEX_HOME_DIR, operations, dryRun); + process.exitCode = 0; +} + +function migrate(rawArgs) { + const { target, passthrough } = 
extractTargetedArgs(rawArgs); + let dryRun = false; + let force = false; + let installSkills = false; + for (const arg of passthrough) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + if (arg === '--install-agent-skills') { + installSkills = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const repoRoot = resolveRepoRoot(target); + const fixPayload = runFixInternal({ + target: repoRoot, + dryRun, + force, + skipAgents: false, + skipPackageJson: true, + skipGitignore: false, + dropStaleLocks: true, + }); + printOperations('Migrate/fix', fixPayload, dryRun); + + if (installSkills) { + const skillOps = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('Migrate/install-agent-skills', GUARDEX_HOME_DIR, skillOps, dryRun); + } + + const removableLegacyFiles = LEGACY_MANAGED_REPO_FILES.filter( + (relativePath) => !REQUIRED_WORKFLOW_FILES.includes(relativePath), + ); + const removalOps = removableLegacyFiles.map((relativePath) => removeLegacyManagedRepoFile(repoRoot, relativePath, { dryRun, force })); + removalOps.push(removeLegacyPackageScripts(repoRoot, dryRun)); + printStandaloneOperations('Migrate/cleanup', repoRoot, removalOps, dryRun); + process.exitCode = 0; +} + function cleanup(rawArgs) { const options = parseCleanupArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); @@ -6707,7 +7165,7 @@ function cleanup(rawArgs) { throw new Error(`Missing cleanup script: ${pruneScript}. 
Run '${SHORT_TOOL_NAME} setup' first.`); } - const args = [pruneScript]; + const args = []; if (options.base) { args.push('--base', options.base); } @@ -6738,7 +7196,7 @@ function cleanup(rawArgs) { } const runCleanupCycle = () => { - const runResult = run('bash', args, { cwd: repoRoot, stdio: 'inherit' }); + const runResult = runPackageAsset('worktreePrune', args, { cwd: repoRoot, stdio: 'inherit' }); if (runResult.status !== 0) { throw new Error('Cleanup command failed'); } @@ -6777,7 +7235,7 @@ function merge(rawArgs) { throw new Error(`Missing merge script: ${mergeScript}. Run '${SHORT_TOOL_NAME} setup' first.`); } - const args = [mergeScript]; + const args = []; if (options.base) { args.push('--base', options.base); } @@ -6794,7 +7252,7 @@ function merge(rawArgs) { args.push('--branch', branch); } - const mergeResult = run('bash', args, { cwd: repoRoot, stdio: 'pipe' }); + const mergeResult = runPackageAsset('branchMerge', args, { cwd: repoRoot, stdio: 'pipe' }); if (mergeResult.stdout) { process.stdout.write(mergeResult.stdout); } @@ -6808,8 +7266,8 @@ function merge(rawArgs) { process.exitCode = 0; } -function finish(rawArgs) { - const options = parseFinishArgs(rawArgs); +function finish(rawArgs, defaults = {}) { + const options = parseFinishArgs(rawArgs, defaults); const repoRoot = resolveRepoRoot(options.target); const finishScript = path.join(repoRoot, 'scripts', 'agent-branch-finish.sh'); @@ -6880,26 +7338,31 @@ function finish(rawArgs) { } const finishArgs = [ - finishScript, '--branch', branch, '--base', baseBranch, - '--via-pr', options.waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', options.cleanup ? 
'--cleanup' : '--no-cleanup', ]; + if (options.mergeMode === 'pr') { + finishArgs.push('--via-pr'); + } else if (options.mergeMode === 'direct') { + finishArgs.push('--direct-only'); + } else { + finishArgs.push('--mode', 'auto'); + } if (options.keepRemote) { finishArgs.push('--keep-remote-branch'); } if (options.dryRun) { - console.log(`[${TOOL_NAME}] [dry-run] Would run: bash ${finishArgs.join(' ')}`); + console.log(`[${TOOL_NAME}] [dry-run] Would run: gx branch finish ${finishArgs.join(' ')}`); succeeded += 1; continue; } - const finishResult = run('bash', finishArgs, { cwd: repoRoot, stdio: 'pipe' }); + const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot, stdio: 'pipe' }); if (finishResult.stdout) { process.stdout.write(finishResult.stdout); } @@ -7293,6 +7756,13 @@ function main() { if (command === 'prompt') return prompt(rest); if (command === 'doctor') return doctor(rest); + if (command === 'branch') return branch(rest); + if (command === 'locks') return locks(rest); + if (command === 'worktree') return worktree(rest); + if (command === 'hook') return hook(rest); + if (command === 'migrate') return migrate(rest); + if (command === 'install-agent-skills') return installAgentSkills(rest); + if (command === 'internal') return internal(rest); if (command === 'agents') return agents(rest); if (command === 'merge') return merge(rest); if (command === 'finish') return finish(rest); diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md new file mode 100644 index 0000000..9656094 --- /dev/null +++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md @@ -0,0 +1,26 @@ +## Why + +- `gx setup` and `gx doctor` currently copy workflow implementations, full hook files, repo-local skills, and `agent:*` package scripts into every consumer repo. 
+- That distribution model creates drift by design, which is why doctor needs protected-branch sandbox repair flows just to re-sync copied logic that actually belongs to the CLI package.
+- Repo-local copies also make the public workflow noisy: consumers are taught to call pasted scripts instead of the `gx` CLI that owns the behavior.
+
+## What Changes
+
+- Add CLI-owned workflow subcommands for branch start/finish, lock operations, hook dispatch, worktree prune, repo migration, and user-level agent-skill installation.
+- Replace installed repo hooks with tiny shims that dispatch into `gx hook run ...`.
+- Stop setup/doctor from copying repo-local workflow implementations or repo-local skills, and stop injecting Guardex-managed `agent:*` package scripts into target repos while keeping repo-local dispatch shims.
+- Add a `gx migrate` path that converts old-style installs to the new minimal repo footprint.
+- Update docs, prompts, and managed templates to teach the `gx ...` surface instead of pasted script paths.
+
+## Scope
+
+- `bin/multiagent-safety.js`
+- hook templates and package-owned workflow assets under `templates/`
+- setup/doctor/install/migrate tests in `test/install.test.js`
+- user-facing docs/templates (`README.md`, managed AGENTS block, skill templates)
+
+## Risks
+
+- Hook shims must still work in repos that only have the CLI on `PATH`; the tests need to lock in that behavior.
+- Existing repos may keep stale copied files until `gx migrate` runs, so migration must be conservative and explicit about what it removes.
+- Setup/doctor/status output will change materially because the managed repo footprint is smaller.
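The "tiny shims that dispatch into `gx hook run ...`" described in this proposal are rendered as plain shell text by the CLI. As a condensed, illustrative sketch of that rendering (it mirrors `shellSingleQuote`/`renderShellDispatchShim` from `bin/multiagent-safety.js` in this change, but omits the `GUARDEX_CLI_ENTRY` fast path and the `gx`/`gitguardex` PATH fallback the real renderer emits):

```javascript
// Condensed sketch, not the shipped renderer: shows the quoting and the
// dispatch line a generated .githooks/pre-commit shim is built around.
function shellSingleQuote(value) {
  // POSIX-safe quoting: close the quote, emit ' inside double quotes, reopen.
  return `'${String(value).replace(/'/g, `'"'"'`)}'`;
}

function renderShim(commandParts) {
  const rendered = commandParts.map(shellSingleQuote).join(' ');
  return [
    '#!/usr/bin/env bash',
    'set -euo pipefail',
    // Real shims first try GUARDEX_CLI_ENTRY, then gx/gitguardex on PATH.
    `exec "\${GUARDEX_CLI_BIN:-gx}" ${rendered} "$@"`,
  ].join('\n') + '\n';
}

const shim = renderShim(['hook', 'run', 'pre-commit']);
console.log(shim.includes(`'hook' 'run' 'pre-commit' "$@"`)); // true
```

The `'"'"'` replacement is the standard POSIX idiom for embedding a single quote in a single-quoted word: close the quote, emit `'` inside double quotes, then reopen.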
diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md new file mode 100644 index 0000000..0692774 --- /dev/null +++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md @@ -0,0 +1,54 @@ +## ADDED Requirements + +### Requirement: Setup installs only repo-local state and dispatch shims + +`gx setup` and `gx doctor` SHALL keep the managed repo footprint limited to repo-local state and dispatch shims, not copied workflow logic. + +#### Scenario: setup installs the minimal repo footprint + +- **GIVEN** a repo opts into Guardex +- **WHEN** `gx setup` runs +- **THEN** it installs the managed AGENTS block, `.githooks/*` dispatch shims, `scripts/*` workflow shims, `.omx/.omc` scaffold, lock registry state, and the managed `.gitignore` block +- **AND** it does not copy workflow implementations, repo-local Codex/Claude skills, or inject Guardex-managed `agent:*` helper scripts into `package.json` + +#### Scenario: doctor repairs the minimal footprint without restoring copied scripts + +- **GIVEN** a repo already uses the CLI-owned install surface +- **WHEN** `gx doctor` repairs drift +- **THEN** it restores the managed AGENTS block, hook/workflow shims, lock registry, and managed `.gitignore` entries as needed +- **AND** it does not recreate copied workflow implementations, repo-local skills, or Guardex-managed `agent:*` package scripts + +### Requirement: Hook shims dispatch through `gx` + +Installed repo hooks SHALL delegate to CLI-owned hook logic instead of embedding guard behavior inline. 
+
+#### Scenario: pre-commit hook is a shim
+
+- **GIVEN** `gx setup` installed repo hooks
+- **WHEN** `.githooks/pre-commit` is inspected or executed
+- **THEN** it delegates to `gx hook run pre-commit`
+- **AND** the guarded pre-commit behavior still enforces the same branch and lock rules
+
+### Requirement: CLI-owned workflow commands remain available without copied workflow implementations
+
+The CLI SHALL expose the guard workflow directly so consumers do not need copied repo workflow logic.
+
+#### Scenario: branch and lock commands run from the CLI
+
+- **GIVEN** a repo with the minimal install footprint
+- **WHEN** a user runs `gx branch start`, `gx branch finish`, `gx locks claim`, or `gx worktree prune`
+- **THEN** the command executes using package-owned logic
+- **AND** any repo-local `scripts/agent-branch-*.sh` or `scripts/agent-file-locks.py` files remain thin dispatch shims instead of copied workflow logic
+
+### Requirement: Migration removes old-style copied workflow files
+
+The CLI SHALL provide a migration path from old repo-local installs to the CLI-owned surface.
+
+#### Scenario: migrate converts an old-style install
+
+- **GIVEN** a repo still contains Guardex-managed workflow scripts, repo-local skills, and injected `agent:*` package scripts
+- **WHEN** `gx migrate` runs
+- **THEN** it replaces hooks with dispatch shims
+- **AND** it removes the copied workflow scripts and managed `agent:*` script injections
+- **AND** it removes repo-local Guardex skill copies when matching user-level installs are present
+- **AND** it leaves the AGENTS block, lock registry, and managed `.gitignore` in the new minimal form
diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md
new file mode 100644
index 0000000..e2df242
--- /dev/null
+++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md
@@ -0,0 +1,33 @@
+## 1. Spec
+
+- [x] 1.1 Define the CLI-owned install surface and minimal repo footprint in `specs/cli-owned-install-surface/spec.md`.
+- [x] 1.2 Update the proposal/tasks to reflect the migration constraints and cleanup goal.
+
+## 2. Tests
+
+- [x] 2.1 Add or update install/setup/doctor regressions for the new minimal repo footprint:
+  - hooks install as `gx hook run ...` shims
+  - setup/doctor keep repo-local dispatch shims but stop copying workflow implementations or repo-local skills
+  - migration converts old-style installs and removes Guardex-managed `agent:*` package scripts
+- [x] 2.2 Add coverage for the new CLI-owned command surface (`gx branch ...`, `gx locks ...`, `gx worktree prune`, `gx migrate --install-agent-skills` as applicable).
+
+## 3. Implementation
+
+- [x] 3.1 Add CLI-owned workflow subcommands and package-asset execution paths in `bin/multiagent-safety.js`.
+- [x] 3.2 Convert installed hook templates to shims and route hook logic through package-owned assets.
+- [x] 3.3 Remove repo-local workflow implementation/skill/package-script installation from setup/doctor while preserving AGENTS, hook/workflow shims, lock state, and managed gitignore behavior.
+- [x] 3.4 Add `gx migrate` and user-level skill installation support.
+- [x] 3.5 Update docs/templates/prompts to teach the `gx` surface instead of pasted repo scripts.
+
+## 4. Verification
+
+- [x] 4.1 Run `node --check bin/multiagent-safety.js`.
+- [x] 4.2 Run the focused install/doctor suite: `node --test test/install.test.js`.
+- [x] 4.3 Run `openspec validate agent-codex-cli-owned-install-surface-2026-04-21-23-09 --type change --strict`.
+- [x] 4.4 Run `openspec validate --specs`.
+
+## 5. Cleanup
+
+- [x] 5.1 Confirm the OpenSpec tasks reflect the shipped behavior and note any deferred follow-ups.
+- [ ] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`).
+- [ ] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff.
diff --git a/templates/AGENTS.multiagent-safety.md b/templates/AGENTS.multiagent-safety.md
index a9fa4b6..9ab89d7 100644
--- a/templates/AGENTS.multiagent-safety.md
+++ b/templates/AGENTS.multiagent-safety.md
@@ -7,22 +7,22 @@
 `GUARDEX_ON=0` disables Guardex for that repo. `GUARDEX_ON=1` explicitly enables Guardex for that repo again.
 
-**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `scripts/agent-branch-start.sh "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`.
+**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `gx branch start "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`.
 
 For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree.
 
-**Ownership.** Before editing, claim files: `scripts/agent-file-locks.py claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned.
+**Ownership.** Before editing, claim files: `gx locks claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned.
 
 **Handoff gate.** Post a one-line handoff note (plan/change, owned scope, intended action) before editing. Re-read the latest handoffs before replacing others' code.
 
-**Completion.** Finish with `scripts/agent-branch-finish.sh --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish.
+**Completion.** Finish with `gx branch finish --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish.
 
 OMX completion policy: when a task is done, the agent must commit the task changes, push the agent branch, and create/update a PR before considering the branch complete.
 
 **Parallel safety.** Assume other agents edit nearby. Never revert unrelated changes. Report conflicts in the handoff.
 
 **Reporting.** Every completion handoff includes: files changed, behavior touched, verification commands + results, risks/follow-ups.
 
-**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `scripts/agent-branch-finish.sh ... --cleanup`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified.
+**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `gx branch finish ... --cleanup`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified.
 
 **Version bumps.** If a change bumps a published version, the same PR updates release notes/changelog.
diff --git a/templates/githooks/post-checkout b/templates/githooks/post-checkout
index ad90fad..21a2305 100755
--- a/templates/githooks/post-checkout
+++ b/templates/githooks/post-checkout
@@ -64,7 +64,7 @@ echo "[agent-primary-branch-guard] Primary checkout switched branches." >&2
 echo "[agent-primary-branch-guard] from: $prev_branch (protected)" >&2
 echo "[agent-primary-branch-guard] to: $new_branch" >&2
 echo "[agent-primary-branch-guard] The primary working tree must stay on its base/protected branch." >&2
-echo "[agent-primary-branch-guard] Use 'git worktree add' (or scripts/agent-branch-start.sh) for feature work." >&2
+echo "[agent-primary-branch-guard] Use 'git worktree add' (or gx branch start) for feature work." >&2
 
 if [[ "$is_agent" == "1" ]]; then
   echo "[agent-primary-branch-guard] Agent session detected — reverting to '$prev_branch'." >&2
diff --git a/templates/githooks/post-merge b/templates/githooks/post-merge
index 8bd7809..fc405e8 100755
--- a/templates/githooks/post-merge
+++ b/templates/githooks/post-merge
@@ -32,17 +32,30 @@
 if [[ "$branch" != "$base_branch" ]]; then
   exit 0
 fi
 
-cli_path="$repo_root/bin/multiagent-safety.js"
-if [[ ! -f "$cli_path" ]]; then
+if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then
+  node_bin="${GUARDEX_NODE_BIN:-node}"
+  if command -v "$node_bin" >/dev/null 2>&1; then
+    "$node_bin" "$GUARDEX_CLI_ENTRY" cleanup \
+      --target "$repo_root" \
+      --base "$base_branch" \
+      --include-pr-merged \
+      --keep-clean-worktrees >/dev/null 2>&1 || true
+  fi
   exit 0
 fi
 
-node_bin="${GUARDEX_NODE_BIN:-node}"
-if ! command -v "$node_bin" >/dev/null 2>&1; then
-  exit 0
+cli_bin="${GUARDEX_CLI_BIN:-}"
+if [[ -z "$cli_bin" ]]; then
+  if command -v gx >/dev/null 2>&1; then
+    cli_bin="gx"
+  elif command -v gitguardex >/dev/null 2>&1; then
+    cli_bin="gitguardex"
+  else
+    exit 0
+  fi
 fi
 
-"$node_bin" "$cli_path" cleanup \
+"$cli_bin" cleanup \
   --target "$repo_root" \
   --base "$base_branch" \
   --include-pr-merged \
diff --git a/templates/githooks/pre-commit b/templates/githooks/pre-commit
index facd3ff..444cb3e 100755
--- a/templates/githooks/pre-commit
+++ b/templates/githooks/pre-commit
@@ -118,9 +118,9 @@
 [guardex-preedit-guard] Codex edit/commit detected on a protected branch.
 GitGuardex requires Codex work to run from an isolated agent/* branch.
 Start the sub-branch/worktree with:
-  bash scripts/codex-agent.sh "" ""
+  gx branch start "" ""
 Or manually:
-  bash scripts/agent-branch-start.sh "" ""
+  gx branch start "" ""
 Then commit from the created agent/* branch.
 Temporary bypass (not recommended):
@@ -132,7 +132,7 @@
 MSG
 cat >&2 <<'MSG'
 [codex-branch-guard] Codex agent commit blocked on non-agent branch.
 Use isolated branch/worktree first:
-  bash scripts/agent-branch-start.sh "" ""
+  gx branch start "" ""
 Then commit from the created agent/* branch.
 Temporary bypass (not recommended):
@@ -163,9 +163,9 @@
 if [[ "$is_protected_branch" == "1" ]]; then
   cat >&2 <<'MSG'
 [agent-branch-guard] Direct commits on protected branches are blocked.
 Use an agent branch first:
-  bash scripts/agent-branch-start.sh "" ""
+  gx branch start "" ""
 After finishing work:
-  bash scripts/agent-branch-finish.sh
+  gx branch finish
 Temporary bypass (not recommended):
   ALLOW_COMMIT_ON_PROTECTED_BRANCH=1 git commit ...
@@ -177,7 +177,7 @@
 if [[ "$is_agent_session" == "1" && "$branch" != agent/* ]]; then
   cat >&2 <<'MSG'
 [agent-branch-guard] Agent commits must run on dedicated agent/* branches.
 Start an agent branch first:
-  bash scripts/agent-branch-start.sh "" ""
+  gx branch start "" ""
 Then commit on that branch.
 
 Temporary bypass (not recommended):
@@ -199,7 +199,7 @@
 if [[ "$branch" == agent/* ]]; then
   cat >&2 <<'MSG'
 [agent-branch-guard] Agent branch commits require file ownership locks.
 Claim files first:
-  python3 scripts/agent-file-locks.py claim --branch "$(git rev-parse --abbrev-ref HEAD)"
+  gx locks claim --branch "$(git rev-parse --abbrev-ref HEAD)"
 MSG
 exit 1
 fi
diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh
index 6f0930d..7a89cdf 100755
--- a/templates/scripts/openspec/init-change-workspace.sh
+++ b/templates/scripts/openspec/init-change-workspace.sh
@@ -74,11 +74,11 @@
 Describe the change in a sentence or two. Commit message is the spec of record.
 
 ## Handoff
 
 - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`.
-- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
+- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
 
 ## Cleanup
 
-- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`
+- [ ] Run: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`
 - [ ] Record PR URL + \`MERGED\` state in the completion handoff.
 - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`).
 NOTESEOF
@@ -117,7 +117,7 @@
 This change is complete only when **all** of the following are true:
 
 ## Handoff
 
 - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`.
-- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
+- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
 
 ## 1. Specification
@@ -137,7 +137,7 @@
 This change is complete only when **all** of the following are true:
 
 ## 4. Cleanup (mandatory; run before claiming completion)
 
-- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation.
+- [ ] 4.1 Run the cleanup pipeline: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation.
 - [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff.
 - [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch).
 TASKSEOF
diff --git a/templates/scripts/openspec/init-plan-workspace.sh b/templates/scripts/openspec/init-plan-workspace.sh
index c96dba3..b664569 100755
--- a/templates/scripts/openspec/init-plan-workspace.sh
+++ b/templates/scripts/openspec/init-plan-workspace.sh
@@ -434,7 +434,7 @@ EXCCPTEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -470,7 +470,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -506,7 +506,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -542,7 +542,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -578,7 +578,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -614,7 +614,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -650,7 +650,7 @@ TASKEOF
 
 ## 6. Cleanup
 
-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
diff --git a/test/install.test.js b/test/install.test.js
index 80455ac..96734d3 100644
--- a/test/install.test.js
+++ b/test/install.test.js
@@ -55,7 +55,13 @@ function runCmd(cmd, args, cwd, options = {}) {
   return cp.spawnSync(cmd, args, {
     cwd,
     encoding: 'utf8',
-    env: { ...sanitizedEnv, ...pushBypassEnv, ...overrideEnv },
+    env: {
+      ...sanitizedEnv,
+      GUARDEX_CLI_ENTRY: cliPath,
+      GUARDEX_NODE_BIN: process.execPath,
+      ...pushBypassEnv,
+      ...overrideEnv,
+    },
   });
 }
@@ -412,7 +418,7 @@ if (!canSpawnChildProcesses) {
 test('setup provisions workflow files and repo config', () => {
   const repoDir = initRepo();
-  let result = runNode(['setup', '--target', repoDir], repoDir);
+  let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.match(result.stdout, /OpenSpec core workflow: \/opsx:propose -> \/opsx:apply -> \/opsx:archive/);
   assert.match(result.stdout, /OpenSpec guide: docs\/openspec-getting-started\.md/);
@@ -435,15 +441,12 @@ test('setup provisions workflow files and repo config', () => {
     'scripts/agent-worktree-prune.sh',
     'scripts/agent-file-locks.py',
     'scripts/guardex-env.sh',
-    'scripts/install-agent-git-hooks.sh',
     'scripts/openspec/init-plan-workspace.sh',
     'scripts/openspec/init-change-workspace.sh',
     '.githooks/pre-commit',
     '.githooks/pre-push',
     '.githooks/post-merge',
-    '.codex/skills/gitguardex/SKILL.md',
-    '.codex/skills/guardex-merge-skills-to-dev/SKILL.md',
-    '.claude/commands/gitguardex.md',
+    '.githooks/post-checkout',
     '.github/pull.yml.example',
     '.github/workflows/cr.yml',
     '.omx/state/agent-file-locks.json',
@@ -455,6 +458,14 @@ test('setup provisions workflow files and repo config', () => {
     assert.equal(fs.existsSync(path.join(repoDir, relativePath)), true, `${relativePath} missing`);
   }
 
+  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
+  assert.match(branchStartShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'branch' 'start' "\$@"/);
+  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+
+  const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
+  assert.match(preCommitShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'hook' 'run' 'pre-commit' "\$@"/);
+  assert.match(preCommitShim, /exec "\$cli_bin" 'hook' 'run' 'pre-commit' "\$@"/);
+
   const crWorkflow = fs.readFileSync(path.join(repoDir, '.github', 'workflows', 'cr.yml'), 'utf8');
   assert.match(crWorkflow, /name:\s+Code Review/);
   assert.match(crWorkflow, /pull_request:/);
@@ -464,18 +475,8 @@ test('setup provisions workflow files and repo config', () => {
   assert.doesNotMatch(crWorkflow, /if:\s+\$\{\{\s*secrets\.OPENAI_API_KEY/);
 
   const packageJson = JSON.parse(fs.readFileSync(path.join(repoDir, 'package.json'), 'utf8'));
-  assert.equal(packageJson.scripts['agent:codex'], 'bash ./scripts/codex-agent.sh');
-  assert.equal(packageJson.scripts['agent:review:watch'], 'bash ./scripts/review-bot-watch.sh');
-  assert.equal(packageJson.scripts['agent:branch:start'], 'bash ./scripts/agent-branch-start.sh');
-  assert.equal(packageJson.scripts['agent:finish'], 'gx finish --all');
-  assert.equal(packageJson.scripts['agent:plan:init'], 'bash ./scripts/openspec/init-plan-workspace.sh');
-  assert.equal(packageJson.scripts['agent:change:init'], 'bash ./scripts/openspec/init-change-workspace.sh');
-  assert.equal(packageJson.scripts['agent:protect:list'], 'gx protect list');
-  assert.equal(packageJson.scripts['agent:branch:sync'], 'gx sync');
-  assert.equal(packageJson.scripts['agent:branch:sync:check'], 'gx sync --check');
-  assert.equal(packageJson.scripts['agent:docker:load'], 'bash ./scripts/guardex-docker-loader.sh');
-  assert.equal(packageJson.scripts['agent:safety:setup'], 'gx setup');
-  assert.equal(packageJson.scripts['agent:cleanup'], 'gx cleanup');
+  const managedAgentScripts = Object.keys(packageJson.scripts || {}).filter((name) => name.startsWith('agent:'));
+  assert.deepEqual(managedAgentScripts, [], 'setup should not inject agent:* helper scripts');
 
   const agentsContent = fs.readFileSync(path.join(repoDir, 'AGENTS.md'), 'utf8');
   assert.equal(agentsContent.includes(''), true);
@@ -495,9 +496,6 @@ test('setup provisions workflow files and repo config', () => {
   assert.match(gitignoreContent, /\.omx\//);
   assert.match(gitignoreContent, /\.omc\//);
   assert.match(gitignoreContent, /oh-my-codex\//);
-  assert.match(gitignoreContent, /\.codex\/skills\/gitguardex\/SKILL\.md/);
-  assert.match(gitignoreContent, /\.codex\/skills\/guardex-merge-skills-to-dev\/SKILL\.md/);
-  assert.match(gitignoreContent, /\.claude\/commands\/gitguardex\.md/);
   assert.match(gitignoreContent, /\.omx\/state\/agent-file-locks\.json/);
   assert.match(gitignoreContent, /# multiagent-safety:END/);
@@ -505,7 +503,7 @@ test('setup provisions workflow files and repo config', () => {
   assert.equal(result.status, 0, result.stderr);
   assert.equal(result.stdout.trim(), '.githooks');
 
-  const secondRun = runNode(['setup', '--target', repoDir], repoDir);
+  const secondRun = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(secondRun.status, 0, secondRun.stderr || secondRun.stdout);
 });
@@ -527,7 +525,8 @@ test('setup on a fresh compose repo prints onboarding hints and installs a worki
   assert.match(result.stdout, /GUARDEX_DOCKER_SERVICE/);
 
   const packageJson = JSON.parse(fs.readFileSync(path.join(repoDir, 'package.json'), 'utf8'));
-  assert.equal(packageJson.scripts['agent:docker:load'], 'bash ./scripts/guardex-docker-loader.sh');
+  const managedAgentScripts = Object.keys(packageJson.scripts || {}).filter((name) => name.startsWith('agent:'));
+  assert.deepEqual(managedAgentScripts, [], 'setup should not inject agent:* helper scripts');
 
   const { fakeBin } = createFakeDockerScript(
     'if [[ "$1" == "compose" && "$2" == "version" ]]; then\n' +
@@ -561,26 +560,46 @@ test('setup on a fresh compose repo prints onboarding hints and installs a worki
   assert.match(result.stdout, /EXEC:compose exec -T app echo hello/);
 });
 
-test('setup and doctor explain .codex file conflicts and still write managed gitignore first', () => {
+test('setup --no-global-install skips npm global toolchain probing', () => {
   const repoDir = initRepo();
-  fs.writeFileSync(path.join(repoDir, '.codex'), '', 'utf8');
+  const markerPath = path.join(repoDir, '.npm-probe-marker');
+  const fakeNpmPath = createFakeNpmScript(
+    'printf \'%s\\n\' "called" > "${GUARDEX_TEST_NPM_MARKER}"\n' +
+    'exit 99\n',
+  );
+
+  const result = runNodeWithEnv(
+    ['setup', '--target', repoDir, '--no-global-install'],
+    repoDir,
+    {
+      GUARDEX_NPM_BIN: fakeNpmPath,
+      GUARDEX_TEST_NPM_MARKER: markerPath,
+    },
+  );
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+  assert.equal(fs.existsSync(markerPath), false, '--no-global-install should bypass npm probing entirely');
+});
+
+test('setup and doctor explain .githooks file conflicts and still write managed gitignore first', () => {
+  const repoDir = initRepo();
+  fs.writeFileSync(path.join(repoDir, '.githooks'), '', 'utf8');
 
   let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
-  assert.notEqual(result.status, 0, 'setup should fail when .codex is a file');
+  assert.notEqual(result.status, 0, 'setup should fail when .githooks is a file');
   let combined = `${result.stdout}\n${result.stderr}`;
-  assert.match(combined, /Path conflict: \.codex exists as a file/);
-  assert.match(combined, /\.codex\/skills\/gitguardex\/SKILL\.md needs it to be a directory/);
+  assert.match(combined, /Path conflict: \.githooks exists as a file/);
+  assert.match(combined, /\.githooks\/pre-commit needs it to be a directory/);
 
   let gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(gitignoreContent, /# multiagent-safety:START/);
   assert.match(gitignoreContent, /scripts\/agent-branch-start\.sh/);
   assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
-  assert.match(gitignoreContent, /\.codex\/skills\/gitguardex\/SKILL\.md/);
+  assert.match(gitignoreContent, /^\.githooks$/m);
 
   result = runNode(['doctor', '--target', repoDir], repoDir);
-  assert.notEqual(result.status, 0, 'doctor should fail when .codex is a file');
+  assert.notEqual(result.status, 0, 'doctor should fail when .githooks is a file');
   combined = `${result.stdout}\n${result.stderr}`;
-  assert.match(combined, /Path conflict: \.codex exists as a file/);
+  assert.match(combined, /Path conflict: \.githooks exists as a file/);
 
   gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
@@ -738,13 +757,81 @@ test('setup and doctor preserve existing agent scripts in package.json by defaul
   assert.equal(result.status, 0, result.stderr || result.stdout);
   let currentPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
   assert.deepEqual(currentPackage.scripts, customPackage.scripts, 'setup should preserve existing agent scripts');
-  assert.match(result.stdout, /preserved existing agent:\* scripts/);
 
   result = runNode(['doctor', '--target', repoDir], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   currentPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
   assert.deepEqual(currentPackage.scripts, customPackage.scripts, 'doctor should preserve existing agent scripts');
-  assert.match(result.stdout, /preserved existing agent:\* scripts/);
+});
+
+test('migrate removes legacy copied assets and installs user-level skills on request', () => {
+  const repoDir = initRepo();
+  const repoRoot = path.resolve(__dirname, '..');
+  const guardexHomeDir = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-migrate-home-'));
+  const packagePath = path.join(repoDir, 'package.json');
+
+  fs.mkdirSync(path.join(repoDir, '.codex', 'skills', 'gitguardex'), { recursive: true });
+  fs.mkdirSync(path.join(repoDir, '.claude', 'commands'), { recursive: true });
+  fs.mkdirSync(path.join(repoDir, 'scripts'), { recursive: true });
+
+  fs.writeFileSync(
+    path.join(repoDir, 'scripts', 'install-agent-git-hooks.sh'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'scripts', 'install-agent-git-hooks.sh'), 'utf8'),
+    'utf8',
+  );
+  fs.writeFileSync(
+    path.join(repoDir, '.codex', 'skills', 'gitguardex', 'SKILL.md'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'codex', 'skills', 'gitguardex', 'SKILL.md'), 'utf8'),
+    'utf8',
+  );
+  fs.writeFileSync(
+    path.join(repoDir, '.claude', 'commands', 'gitguardex.md'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'claude', 'commands', 'gitguardex.md'), 'utf8'),
+    'utf8',
+  );
+
+  fs.writeFileSync(
+    packagePath,
+    JSON.stringify(
+      {
+        name: path.basename(repoDir),
+        private: true,
+        scripts: {
+          'agent:codex': 'bash ./scripts/codex-agent.sh',
+          'agent:cleanup': 'gx cleanup',
+          'agent:branch:start': 'bash ./scripts/custom-branch-start.sh',
+          test: 'node --test',
+        },
+      },
+      null,
+      2,
+    ) + '\n',
+    'utf8',
+  );
+
+  const result = runNodeWithEnv(
+    ['migrate', '--target', repoDir, '--install-agent-skills'],
+    repoDir,
+    { GUARDEX_HOME_DIR: guardexHomeDir },
+  );
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'install-agent-git-hooks.sh')), false);
+  assert.equal(fs.existsSync(path.join(repoDir, '.codex', 'skills', 'gitguardex', 'SKILL.md')), false);
+  assert.equal(fs.existsSync(path.join(repoDir, '.claude', 'commands', 'gitguardex.md')), false);
+
+  const migratedPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
+  assert.equal(migratedPackage.scripts['agent:codex'], undefined);
+  assert.equal(migratedPackage.scripts['agent:cleanup'], undefined);
+  assert.equal(migratedPackage.scripts['agent:branch:start'], 'bash ./scripts/custom-branch-start.sh');
+
+  assert.equal(fs.existsSync(path.join(guardexHomeDir, '.codex', 'skills', 'gitguardex', 'SKILL.md')), true);
+  assert.equal(fs.existsSync(path.join(guardexHomeDir, '.claude', 'commands', 'gitguardex.md')), true);
+
+  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
+  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+  const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
+  assert.match(preCommitShim, /exec "\$cli_bin" 'hook' 'run' 'pre-commit' "\$@"/);
 });
 
 test('setup --parent-workspace-view creates one-level-up VS Code workspace for repo + agent worktrees', () => {
@@ -1264,9 +1351,9 @@ exit 1
   const createdBranch = extractCreatedBranch(result.stdout);
 
   result = runCmd('git', ['show-ref', '--verify', '--quiet', `refs/heads/${createdBranch}`], repoDir);
-  assert.equal(result.status, 0, 'doctor auto-finish should keep sandbox branch locally by default');
+  assert.notEqual(result.status, 0, 'doctor auto-finish should clean up the merged sandbox branch locally by default');
 
   result = runCmd('git', ['ls-remote', '--heads', 'origin', createdBranch], repoDir);
-  assert.match(result.stdout, /refs\/heads\//, 'doctor auto-finish should push sandbox branch to origin');
+  assert.equal(result.stdout.trim(), '', 'doctor auto-finish should clean up the merged sandbox branch remotely by default');
 
   const rootStatus = runCmd('git', ['status', '--short', '--untracked-files=no'], repoDir);
   assert.equal(rootStatus.status, 0, rootStatus.stderr || rootStatus.stdout);
@@ -1615,7 +1702,7 @@ test('setup pre-commit detects codex commit attempts on protected main (includin
   });
   assert.notEqual(result.status, 0, result.stdout);
   assert.match(result.stderr, /\[guardex-preedit-guard\] Codex edit\/commit detected on a protected branch\./);
-  assert.match(result.stderr, /bash scripts\/codex-agent\.sh/);
+  assert.match(result.stderr, /gx branch start/);
 });
 
 test('setup pre-commit allows codex managed guardrail commits on protected main only for AGENTS.md/.gitignore', () => {
@@ -2356,16 +2443,6 @@ test('finish command auto-commits dirty agent worktree and runs PR finish flow f
   fs.writeFileSync(path.join(agentWorktree, 'finisher-note.txt'), 'pending branch finish\n', 'utf8');
 
-  const finishLog = path.join(repoDir, '.finish-invocations.log');
-  fs.writeFileSync(
-    path.join(repoDir, 'scripts', 'agent-branch-finish.sh'),
-    '#!/usr/bin/env bash\n' +
-    'set -euo pipefail\n' +
-    `printf '%s\\n' \"$*\" >> \"${finishLog}\"\n`,
-    'utf8',
-  );
-  fs.chmodSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh'), 0o755);
-
   result = runNode(
     ['finish', '--target', repoDir, '--branch', agentBranch, '--base', 'main', '--no-wait-for-merge', '--no-cleanup'],
     repoDir,
   );
@@ -2374,13 +2451,9 @@ test('finish command auto-commits dirty agent worktree and runs PR finish flow f
   assert.match(result.stdout, new RegExp(`Finishing '${escapeRegexLiteral(agentBranch)}' -> 'main'`));
   assert.match(result.stdout, /Auto-committed/);
   assert.match(result.stdout, /Finish summary: total=1, success=1, failed=0, autoCommitted=1/);
-
-  const finishInvocations = fs.readFileSync(finishLog, 'utf8');
-  assert.match(finishInvocations, new RegExp(`--branch ${escapeRegexLiteral(agentBranch)}`));
-  assert.match(finishInvocations, /--base main/);
-  assert.match(finishInvocations, /--via-pr/);
-  assert.match(finishInvocations, /--no-wait-for-merge/);
-  assert.match(finishInvocations, /--no-cleanup/);
+  assert.equal(fs.existsSync(agentWorktree), true, 'finish --no-cleanup should keep the agent worktree');
+  let branchResult = runCmd('git', ['show-ref', '--verify', '--quiet', `refs/heads/${agentBranch}`], repoDir);
+  assert.equal(branchResult.status, 0, 'finish --no-cleanup should keep the local agent branch');
 
   const worktreeStatus = runCmd('git', ['status', '--short'], agentWorktree);
   assert.equal(worktreeStatus.status, 0, worktreeStatus.stderr || worktreeStatus.stdout);
@@ -3103,10 +3176,14 @@ test('post-merge auto-runs cleanup on base branch and skips non-base branches',
     "if (marker) fs.appendFileSync(marker, process.argv.slice(2).join(' ') + '\\n', 'utf8');\n",
     'utf8',
   );
-
-  let result = runCmd('bash', ['.githooks/post-merge', '0'], repoDir, {
+  const postMergeAsset = path.join(__dirname, '..', 'templates', 'githooks', 'post-merge');
+  const hookDispatchEnv = {
     GUARDEX_POST_MERGE_MARKER: markerPath,
-  });
+    GUARDEX_CLI_ENTRY: path.join(repoDir, 'bin', 'multiagent-safety.js'),
+    GUARDEX_NODE_BIN: process.execPath,
+  };
+
+  let result = runCmd('bash', [postMergeAsset, '0'], repoDir, hookDispatchEnv);
   assert.equal(result.status, 0, result.stderr || result.stdout);
 
   let invocations = fs
@@ -3124,9 +3201,7 @@ test('post-merge auto-runs cleanup on base branch and skips non-base branches',
   result = runCmd('git', ['checkout', '-b', 'feature/post-merge-skip'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
 
-  result = runCmd('bash', ['.githooks/post-merge', '0'], repoDir, {
-    GUARDEX_POST_MERGE_MARKER: markerPath,
-  });
+  result = runCmd('bash', [postMergeAsset, '0'], repoDir, hookDispatchEnv);
   assert.equal(result.status, 0, result.stderr || result.stdout);
 
   invocations = fs
@@ -4202,7 +4277,7 @@ test('OpenSpec plan workspace scaffold creates expected role/task structure', ()
   assert.match(plannerTasks, /## 5\. Collaboration/);
   assert.match(plannerTasks, /## 6\.
Cleanup/); assert.match(plannerTasks, /\[P1\] READY - Initial planning draft checkpoint/); - assert.match(plannerTasks, /bash scripts\/agent-branch-finish\.sh/); + assert.match(plannerTasks, /gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup/); const plannerPlan = fs.readFileSync(path.join(planDir, 'planner', 'plan.md'), 'utf8'); assert.match(plannerPlan, /This ExecPlan is a living document/); @@ -4406,7 +4481,7 @@ test('doctor repairs setup drift and confirms repo is safe', () => { assert.match(result.stdout, /Repo is fully safe/); const repairedHook = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8'); - assert.match(repairedHook, /AGENTS\.md\|\.gitignore/); + assert.match(repairedHook, /'hook' 'run' 'pre-commit'/); assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'notepad.md')), true); assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'project-memory.json')), true); assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'logs')), true); @@ -4463,7 +4538,7 @@ test('doctor recurses into nested frontend repos and repairs protected-main drif assert.match(repairedFrontendGitignore, /^scripts\/\*$/m); assert.match(repairedFrontendGitignore, /^\.githooks$/m); const repairedFrontendHook = fs.readFileSync(path.join(frontendDir, '.githooks', 'pre-commit'), 'utf8'); - assert.match(repairedFrontendHook, /AGENTS\.md\|\.gitignore/); + assert.match(repairedFrontendHook, /'hook' 'run' 'pre-commit'/); const frontendScanAfter = runNode(['scan', '--target', frontendDir], repoDir); assert.equal(frontendScanAfter.status, 0, frontendScanAfter.stderr || frontendScanAfter.stdout); @@ -4605,8 +4680,8 @@ test('prompt outputs AI setup instructions', () => { assert.match(result.stdout, /GitGuardex \(gx\) setup checklist/); assert.match(result.stdout, /gx setup/); assert.match(result.stdout, /gx doctor/); - assert.match(result.stdout, /codex-agent\.sh/); - assert.match(result.stdout, /scripts\/agent-file-locks\.py claim/); + 
assert.match(result.stdout, /gx branch start/); + assert.match(result.stdout, /gx locks claim/); assert.match(result.stdout, /gx finish --all/); assert.match(result.stdout, /\/opsx:propose/); assert.match(result.stdout, /https:\/\/github\.com\/apps\/pull/); @@ -4622,7 +4697,7 @@ test('prompt --exec outputs command-only checklist', () => { assert.match(result.stdout, /^gh --version/m); assert.match(result.stdout, /^gx setup$/m); assert.match(result.stdout, /^gx doctor$/m); - assert.match(result.stdout, /codex-agent\.sh/); + assert.match(result.stdout, /^gx branch start "" ""$/m); assert.match(result.stdout, /^gx finish --all$/m); assert.match(result.stdout, /^gx cleanup$/m); assert.doesNotMatch(result.stdout, /GitGuardex \(gx\) setup checklist/); @@ -4746,10 +4821,6 @@ exit 1 assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(marker), false, 'global install should not run'); assert.match(result.stdout, /Companion installs skipped by user choice/); - assert.match( - result.stdout, - /Guardex needs oh-my-claudecode as a dependency: https:\/\/github\.com\/Yeachan-Heo\/oh-my-claudecode/, - ); }); test('setup installs missing local companion tools with explicit approval', () => { @@ -5084,25 +5155,13 @@ test('cleanup command can remove squash-merged agent branches via merged PR dete test('cleanup command watch mode defaults to 60-minute idle threshold and supports one-cycle execution', () => { const repoDir = initRepo(); - const scriptsDir = path.join(repoDir, 'scripts'); - fs.mkdirSync(scriptsDir, { recursive: true }); - - const pruneScriptPath = path.join(scriptsDir, 'agent-worktree-prune.sh'); - const markerArgs = path.join(repoDir, '.cleanup-watch-args'); - fs.writeFileSync( - pruneScriptPath, - '#!/usr/bin/env bash\n' + - 'set -euo pipefail\n' + - `printf '%s\\n' \"$*\" > \"${markerArgs}\"\n`, - 'utf8', - ); - fs.chmodSync(pruneScriptPath, 0o755); + const resultSetup = runNode(['setup', '--target', repoDir, 
'--no-global-install'], repoDir); + assert.equal(resultSetup.status, 0, resultSetup.stderr || resultSetup.stdout); + seedCommit(repoDir); const result = runNode(['cleanup', '--target', repoDir, '--watch', '--once', '--interval', '15'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - const passedArgs = fs.readFileSync(markerArgs, 'utf8').trim(); - assert.match(passedArgs, /--idle-minutes 60/); - assert.match(passedArgs, /--only-dirty-worktrees/); + assert.match(result.stdout, /Cleanup watch cycle=1 \(interval=15s, idleMinutes=60, maxBranches=unbounded\)\./); }); test('release fails outside the maintainer repo path', () => { From 17de4f4c69e2efa654b0116b2f1368c3096f315a Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 01:17:42 +0200 Subject: [PATCH 36/48] Ship the CLI-owned install surface on a fresh npm version (#268) The CLI-owned install surface needs a publishable package version and README release entry, and the scaffold regressions need to assert the same gx-owned surface that setup now installs. This bumps the package to 7.0.18, records the release note, keeps runtime OpenSpec scaffolds aligned with packaged templates, and updates merge workflow tests so hook shims exercise the source CLI instead of an older global gx binary. 
Constraint: npm publish cannot reuse 7.0.17
Constraint: setup no longer injects Guardex-managed agent:* package scripts into consumer repos
Rejected: Leave scaffold parity failures as unrelated baseline drift | the packaged release surface includes those templates and full verification would stay red
Confidence: high
Scope-risk: moderate
Directive: Keep generated repo shims and packaged templates behaviorally aligned when changing CLI-owned workflow wording
Tested: node --check bin/multiagent-safety.js
Tested: bash -n scripts/openspec/init-plan-workspace.sh
Tested: bash -n scripts/openspec/init-change-workspace.sh
Tested: npm pack --dry-run
Tested: openspec validate --specs
Tested: npm test (165/165 pass)
Not-tested: live npm publish
Co-authored-by: NagyVikt
---
 README.md | 6 ++++++
 .../notes.md | 10 ++++++++++
 package-lock.json | 4 ++--
 package.json | 2 +-
 scripts/openspec/init-change-workspace.sh | 8 ++++----
 scripts/openspec/init-plan-workspace.sh | 14 +++++++-------
 test/merge-workflow.test.js | 12 +++++++++---
 7 files changed, 39 insertions(+), 17 deletions(-)
 create mode 100644 openspec/changes/agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53/notes.md

diff --git a/README.md b/README.md
index c644f90..b94a82c 100644
--- a/README.md
+++ b/README.md
@@ -642,6 +642,12 @@ npm pack --dry-run
v7.x +### v7.0.18 +- GitGuardex now keeps the install workflow in `gx` itself: `gx branch ...`, `gx locks ...`, `gx worktree prune`, `gx migrate`, and user-level agent-skill install now own the agent lifecycle instead of teaching pasted repo scripts as the primary surface. +- Fresh installs switch repo hooks to tiny `gx hook run ...` shims, stop copying repo-local workflow implementations and repo-local skills, and stop injecting Guardex-managed `agent:*` package scripts into consumer repos. +- `gx migrate` can move older repos onto the smaller CLI-owned install surface while preserving the managed AGENTS block, lock registry state, repo-local dispatch shims, and required gitignore entries. +- Bumped the release from `7.0.17` → `7.0.18` so the shipped CLI-owned install-surface changes land on a fresh publishable npm version. + ### v7.0.17 - Restored the published npm package name to `@imdeadpool/guardex` after the `@imdeadpool/gitguardex` rename only changed the package identity locally and could not rename the existing npm registry entry. - README/install/tutorial/self-update surfaces now point back at `@imdeadpool/guardex` while keeping GitGuardex as the product/repo brand and `gitguardex` as the long-form command. diff --git a/openspec/changes/agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53/notes.md b/openspec/changes/agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53/notes.md new file mode 100644 index 0000000..476cfc0 --- /dev/null +++ b/openspec/changes/agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53/notes.md @@ -0,0 +1,10 @@ +# agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53 (minimal / T1) + +- Bump the package metadata from `7.0.17` to `7.0.18` so the CLI-owned install-surface changes can ship under a fresh publishable npm version. 
+- Add a `README.md` release-notes entry for `v7.0.18` that documents the shipped `gx`-owned branch/lock/worktree/migrate surface, hook shims, smaller repo footprint, and user-level agent-skill install path. +- Keep the release scoped to metadata and operator-facing release history only; no runtime behavior changes are introduced in this follow-up. +- Verification: + - `node --check bin/multiagent-safety.js` + - `npm pack --dry-run` + - `openspec validate --specs` + - `openspec validate agent-codex-release-cli-owned-install-surface-v7-0-1-2026-04-22-00-53 --type change --strict` is intentionally not applicable to this T1 notes-only change because there are no delta specs. diff --git a/package-lock.json b/package-lock.json index e6ae4fb..0bc6729 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.17", + "version": "7.0.18", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "@imdeadpool/guardex", - "version": "7.0.17", + "version": "7.0.18", "license": "MIT", "bin": { "gitguardex": "bin/multiagent-safety.js", diff --git a/package.json b/package.json index 1503f8f..61ce278 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.17", + "version": "7.0.18", "description": "Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work. Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman.", "license": "MIT", "preferGlobal": true, diff --git a/scripts/openspec/init-change-workspace.sh b/scripts/openspec/init-change-workspace.sh index 6f0930d..7a89cdf 100755 --- a/scripts/openspec/init-change-workspace.sh +++ b/scripts/openspec/init-change-workspace.sh @@ -74,11 +74,11 @@ Describe the change in a sentence or two. Commit message is the spec of record. 
## Handoff - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. -- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. ## Cleanup -- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\` +- [ ] Run: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\` - [ ] Record PR URL + \`MERGED\` state in the completion handoff. - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`). NOTESEOF @@ -117,7 +117,7 @@ This change is complete only when **all** of the following are true: ## Handoff - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`. -- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. 
Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. +- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. ## 1. Specification @@ -137,7 +137,7 @@ This change is complete only when **all** of the following are true: ## 4. Cleanup (mandatory; run before claiming completion) -- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.1 Run the cleanup pipeline: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. - [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff. - [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch). TASKSEOF diff --git a/scripts/openspec/init-plan-workspace.sh b/scripts/openspec/init-plan-workspace.sh index c96dba3..b664569 100755 --- a/scripts/openspec/init-plan-workspace.sh +++ b/scripts/openspec/init-plan-workspace.sh @@ -434,7 +434,7 @@ EXCCPTEOF ## 6. 
Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -470,7 +470,7 @@ TASKEOF ## 6. Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -506,7 +506,7 @@ TASKEOF ## 6. Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -542,7 +542,7 @@ TASKEOF ## 6. Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -578,7 +578,7 @@ TASKEOF ## 6. 
Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -614,7 +614,7 @@ TASKEOF ## 6. Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. TASKEOF @@ -650,7 +650,7 @@ TASKEOF ## 6. Cleanup -- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`. +- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`. - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff. - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop. 
TASKEOF diff --git a/test/merge-workflow.test.js b/test/merge-workflow.test.js index 797c180..3f3986d 100644 --- a/test/merge-workflow.test.js +++ b/test/merge-workflow.test.js @@ -40,7 +40,13 @@ function runCmd(cmd, args, cwd, extraEnv = {}) { return cp.spawnSync(cmd, args, { cwd, encoding: 'utf8', - env: { ...sanitizedEnv, ...pushBypassEnv, ...extraEnv }, + env: { + ...sanitizedEnv, + GUARDEX_CLI_ENTRY: cliPath, + GUARDEX_NODE_BIN: process.execPath, + ...pushBypassEnv, + ...extraEnv, + }, }); } @@ -127,7 +133,7 @@ function extractMergeTargetWorktree(output) { return match[1].trim(); } -test('setup installs the managed merge workflow script and package entry', () => { +test('setup installs the managed merge workflow shim without package script churn', () => { const repoDir = initRepo(); seedCommit(repoDir); @@ -139,7 +145,7 @@ test('setup installs the managed merge workflow script and package entry', () => fs.accessSync(mergeScriptPath, fs.constants.X_OK); const packageJson = JSON.parse(fs.readFileSync(path.join(repoDir, 'package.json'), 'utf8')); - assert.equal(packageJson.scripts['agent:branch:merge'], 'bash ./scripts/agent-branch-merge.sh'); + assert.equal(packageJson.scripts['agent:branch:merge'], undefined); }); test('merge command creates an integration lane, reports overlaps, and merges cleanly', () => { From 41261922c43c7e9c82ee96c6da95b40d336520b2 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 01:22:58 +0200 Subject: [PATCH 37/48] Keep small Guardex tasks on caveman-only routing by default (#269) Guardex installs a managed AGENTS block into downstream repos, but it did not tell agents when to stay lightweight versus when to escalate into OMX orchestration. This change adds a task-size routing clause to the managed template so small asks stay direct and caveman-only while larger cross-cutting work still has an explicit OMX path. 
Constraint: Guardex owns the managed AGENTS block, so the durable fix must live in the template and setup/doctor refresh coverage rather than repo-local docs only
Rejected: Add a second Guardex-side runtime classifier | would duplicate OMX task-size heuristics and drift from the existing advisory detector
Confidence: high
Scope-risk: narrow
Directive: Keep the managed Guardex AGENTS routing aligned with OMX task-size guidance when heavy-mode keywords or lightweight escape hatches change
Tested: targeted install AGENTS refresh tests
Tested: openspec change validation strict
Not-tested: fresh downstream repo end-to-end after rerunning gx setup or gx doctor
Co-authored-by: NagyVikt
---
 .../proposal.md | 12 ++++++++++
 .../specs/guardex-task-size-routing/spec.md | 22 +++++++++++++++++++
 .../tasks.md | 20 +++++++++++++++++
 templates/AGENTS.multiagent-safety.md | 3 +++
 test/install.test.js | 6 +++++
 5 files changed, 63 insertions(+)
 create mode 100644 openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/proposal.md
 create mode 100644 openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/specs/guardex-task-size-routing/spec.md
 create mode 100644 openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/tasks.md

diff --git a/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/proposal.md b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/proposal.md
new file mode 100644
index 0000000..aab6912
--- /dev/null
+++ b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/proposal.md
@@ -0,0 +1,12 @@
+## Why
+Guardex installs a managed AGENTS block, but it does not currently tell downstream repos when to stay lightweight versus when to invoke heavier OMX orchestration. That makes small asks burn unnecessary orchestration tokens even when Caveman/direct mode is enough.
+ +## What Changes +- Add a task-size routing clause to the managed Guardex AGENTS template. +- Keep small, bounded asks in direct caveman-only mode by default. +- Reserve heavy OMX modes for medium/large scope and allow explicit lightweight escape-hatch prefixes. + +## Impact +- Repos bootstrapped or refreshed by `gx setup` / `gx doctor` get a clearer default routing policy. +- Small asks stay cheaper and simpler. +- Larger, cross-cutting work still has an explicit path into OMX orchestration. diff --git a/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/specs/guardex-task-size-routing/spec.md b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/specs/guardex-task-size-routing/spec.md new file mode 100644 index 0000000..4ead104 --- /dev/null +++ b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/specs/guardex-task-size-routing/spec.md @@ -0,0 +1,22 @@ +## ADDED Requirements + +### Requirement: Guardex-managed AGENTS task-size routing +The managed AGENTS block produced by `gx setup` and `gx doctor` SHALL tell downstream repos to keep small, bounded tasks in direct caveman-only mode and reserve heavy OMX orchestration for larger scope. + +#### Scenario: setup refresh writes small-task lightweight routing +- **WHEN** `gx setup` refreshes or installs the managed AGENTS block +- **THEN** the block says small tasks stay in direct caveman-only mode +- **AND** it treats bounded asks such as typos, single-file tweaks, one-liners, and version bumps as lightweight by default. + +#### Scenario: setup refresh writes heavy-mode promotion rules +- **WHEN** the managed AGENTS block is refreshed +- **THEN** it says OMX orchestration is promoted only for medium/large work +- **AND** it names heavy OMX modes as the larger-scope path instead of the default path. 
+ +### Requirement: lightweight escape hatches stay explicit +The managed AGENTS block SHALL document explicit lightweight prefixes that force small-task handling. + +#### Scenario: lightweight prefixes remain available +- **WHEN** the agent reads the managed AGENTS block +- **THEN** it sees `quick:`, `simple:`, `tiny:`, `minor:`, `small:`, `just:`, and `only:` as explicit lightweight escape hatches +- **AND** those prefixes bias the task toward direct caveman-only handling. diff --git a/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/tasks.md b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/tasks.md new file mode 100644 index 0000000..0938c06 --- /dev/null +++ b/openspec/changes/agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20/tasks.md @@ -0,0 +1,20 @@ +## 1. Spec + +- [x] 1.1 Define Guardex-managed task-size routing requirements for small versus larger tasks. + +## 2. Implementation + +- [x] 2.1 Add the task-size routing clause to `templates/AGENTS.multiagent-safety.md`. +- [x] 2.2 Update install/setup refresh tests so the managed AGENTS block is locked to the new routing policy. + +## 3. Verification + +- [x] 3.1 Run targeted Guardex install tests for managed AGENTS refresh coverage. +- [x] 3.2 Run `openspec validate agent-codex-auto-route-small-tasks-to-caveman-and-la-2026-04-22-01-20 --type change --strict`. + +## 4. Cleanup + +- [ ] 4.1 Commit the Guardex template/test update with Lore trailers. +- [ ] 4.2 Push the agent branch and open/update the PR. +- [ ] 4.3 Merge to `main` and prune the sandbox worktree. +- [ ] 4.4 Record PR URL and final `MERGED` evidence. diff --git a/templates/AGENTS.multiagent-safety.md b/templates/AGENTS.multiagent-safety.md index 9ab89d7..64f93b4 100644 --- a/templates/AGENTS.multiagent-safety.md +++ b/templates/AGENTS.multiagent-safety.md @@ -7,6 +7,9 @@ `GUARDEX_ON=0` disables Guardex for that repo. 
`GUARDEX_ON=1` explicitly enables Guardex for that repo again. +**Task-size routing.** Small tasks stay in direct caveman-only mode. For typos, single-file tweaks, one-liners, version bumps, or similarly bounded asks, solve directly and do not escalate into heavy OMX orchestration just because a keyword appears. Treat `quick:`, `simple:`, `tiny:`, `minor:`, `small:`, `just:`, and `only:` as explicit lightweight escape hatches. +Promote to OMX orchestration only when the task is medium/large: multi-file behavior changes, API/schema work, refactors, migrations, architecture, cross-cutting scope, or long prompts. Heavy OMX modes (`ralph`, `autopilot`, `team`, `ultrawork`, `swarm`, `ralplan`) are for that larger scope. If the task grows while working, upgrade then. + **Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `gx branch start "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree. 
diff --git a/test/install.test.js b/test/install.test.js index 96734d3..942918a 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -655,6 +655,8 @@ test('setup refreshes existing managed AGENTS block by default', () => { assert.match(currentAgents, /Guardex is enabled by default/); assert.match(currentAgents, /GUARDEX_ON=0/); assert.match(currentAgents, /GUARDEX_ON=1/); + assert.match(currentAgents, /Small tasks stay in direct caveman-only mode\./); + assert.match(currentAgents, /Promote to OMX orchestration only when the task is medium\/large/); assert.match(currentAgents, /explicit final completion\/cleanup section/); assert.match(currentAgents, /PR URL \+ final `MERGED` evidence/); assert.doesNotMatch(currentAgents, /legacy managed clause/); @@ -688,6 +690,8 @@ Trailing project notes after managed block. assert.match(currentAgents, /Guardex is enabled by default/); assert.match(currentAgents, /GUARDEX_ON=0/); assert.match(currentAgents, /GUARDEX_ON=1/); + assert.match(currentAgents, /Small tasks stay in direct caveman-only mode\./); + assert.match(currentAgents, /Promote to OMX orchestration only when the task is medium\/large/); assert.match(currentAgents, /explicit final completion\/cleanup section/); assert.match(currentAgents, /PR URL \+ final `MERGED` evidence/); assert.doesNotMatch(currentAgents, /legacy managed clause/); @@ -902,6 +906,8 @@ Trailing project notes after managed block. 
nextAgents, /Never implement directly on the local\/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch\/worktree\./, ); + assert.match(nextAgents, /Small tasks stay in direct caveman-only mode\./); + assert.match(nextAgents, /Promote to OMX orchestration only when the task is medium\/large/); assert.match(nextAgents, /explicit final completion\/cleanup section/); assert.match(nextAgents, /PR URL \+ final `MERGED` evidence/); assert.doesNotMatch(nextAgents, /legacy managed clause/); From 38df0d743beb271c795b435933e4497ccfc40188 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 09:11:14 +0200 Subject: [PATCH 38/48] Make working VS Code agent lanes visible at a glance (#270) The Active Agents view already exposed per-row activity, but active edit lanes still blended into a flat list. This groups sessions into WORKING NOW vs THINKING, adds repo/header working counts, and keeps docs/tests aligned. 
Constraint: Keep the existing active-session JSON contract and SCM companion surface intact
Rejected: Add a separate top-level working-lanes view | would duplicate the existing Source Control companion
Confidence: high
Scope-risk: narrow
Directive: Keep WORKING NOW ahead of THINKING so actively editing lanes stay scan-first in the tree
Tested: node --check templates/vscode/guardex-active-agents/extension.js; node --test test/vscode-active-agents-session-state.test.js; openspec validate agent-codex-vscode-working-agents-groups-2026-04-22-09-05 --type change --strict; openspec validate --specs
Not-tested: npm test stopped after early passing output in this environment and did not complete

Co-authored-by: NagyVikt
---
 README.md                                     |  2 +-
 .../proposal.md                               | 15 +++
 .../vscode-working-agents-groups/spec.md      | 20 ++++
 .../tasks.md                                  | 32 ++++++
 .../vscode/guardex-active-agents/README.md    |  5 +-
 .../vscode/guardex-active-agents/extension.js | 58 +++++++++--
 ...vscode-active-agents-session-state.test.js | 98 ++++++++++++++++++-
 7 files changed, 216 insertions(+), 14 deletions(-)
 create mode 100644 openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/proposal.md
 create mode 100644 openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/specs/vscode-working-agents-groups/spec.md
 create mode 100644 openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/tasks.md

diff --git a/README.md b/README.md
index b94a82c..551c9ab 100644
--- a/README.md
+++ b/README.md
@@ -245,7 +245,7 @@ To install the real companion into local VS Code from a GitGuardex-wired repo:
 node scripts/install-vscode-active-agents-extension.js
 ```
 
-It adds an `Active Agents` view to the Source Control container, groups each live repo into `ACTIVE AGENTS` and `CHANGES` sections, reads `.omx/state/active-sessions/*.json`, derives `thinking` versus `working` from each live sandbox worktree, and uses VS Code's native `loading~spin` codicon for the running-state affordance. Reload the VS Code window after install.
+It adds an `Active Agents` view to the Source Control container, groups each live repo into `ACTIVE AGENTS` and `CHANGES` sections, splits `ACTIVE AGENTS` into `WORKING NOW` and `THINKING` when both states are present, reads `.omx/state/active-sessions/*.json`, derives `thinking` versus `working` from each live sandbox worktree, and surfaces a working-count summary in the repo/header affordances. Reload the VS Code window after install.
 
 ---
 
diff --git a/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/proposal.md b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/proposal.md
new file mode 100644
index 0000000..f08c6c7
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/proposal.md
@@ -0,0 +1,15 @@
+## Why
+
+The VS Code Active Agents companion already shows per-row `working` versus `thinking`, but the busy lanes still blend into one flat `ACTIVE AGENTS` list. When several sandboxes are live, the user has to inspect each row one by one to find the branches actively editing files.
+
+## What Changes
+
+- Split the `ACTIVE AGENTS` tree into visible `WORKING NOW` and `THINKING` subgroups.
+- Surface a repo-level working count in the repo summary row and the SCM badge tooltip.
+- Use a distinct VS Code codicon for actively working lanes so they stand out from thinking-only sessions.
+- Update README/test coverage to lock the new grouping and summary behavior.
+
+## Impact
+
+- Affected surfaces: `templates/vscode/guardex-active-agents/extension.js`, `templates/vscode/guardex-active-agents/README.md`, `test/vscode-active-agents-session-state.test.js`, and the root `README.md`.
+- No runtime/session-file schema changes; the companion still reads the existing `.omx/state/active-sessions/*.json` records.
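The activity inference and badge summary described above can be sketched as follows. This is a minimal, hypothetical model, not the extension's actual code: the field `changedFileCount` and both function names are illustrative stand-ins for how the companion derives `working` versus `thinking` from a sandbox worktree and builds the tooltip text.

```javascript
// Hypothetical sketch: a live session counts as "working" when its sandbox
// worktree has uncommitted changes, otherwise it is still "thinking".
function inferActivityKind(changedFileCount) {
  return changedFileCount > 0 ? 'working' : 'thinking';
}

// Summarize a repo's sessions the way the badge tooltip in this patch does:
// "<n> active agent(s)" plus " · <m> working now" when any lane is editing.
function summarizeSessions(sessions) {
  const working = sessions.filter(
    (s) => inferActivityKind(s.changedFileCount) === 'working',
  );
  const tooltip =
    `${sessions.length} active agent${sessions.length === 1 ? '' : 's'}` +
    (working.length > 0 ? ` · ${working.length} working now` : '');
  return { total: sessions.length, working: working.length, tooltip };
}
```

One working and one idle session would summarize to `2 active agents · 1 working now`, matching the tooltip format the regression tests assert.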
diff --git a/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/specs/vscode-working-agents-groups/spec.md b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/specs/vscode-working-agents-groups/spec.md
new file mode 100644
index 0000000..ee321d9
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/specs/vscode-working-agents-groups/spec.md
@@ -0,0 +1,20 @@
+## ADDED Requirements
+
+### Requirement: Active Agents highlights currently working lanes
+The VS Code Active Agents companion SHALL separate actively editing Guardex lanes from idle-thinking lanes inside the `ACTIVE AGENTS` section.
+
+#### Scenario: Working and thinking sessions render in separate groups
+- **WHEN** a repo has both live `working` and `thinking` Guardex sessions
+- **THEN** the repo node contains an `ACTIVE AGENTS` section
+- **AND** that section contains `WORKING NOW` and `THINKING` child groups
+- **AND** the working group appears before the thinking group.
+
+#### Scenario: Repo summary exposes working counts
+- **WHEN** a repo has one or more live working sessions
+- **THEN** the repo row description includes the working count in addition to the active session count
+- **AND** the Source Control badge tooltip mentions how many active sessions are currently working.
+
+#### Scenario: Working sessions use a distinct visual affordance
+- **WHEN** a live Guardex session is inferred as `working`
+- **THEN** its row uses a distinct codicon from `thinking` rows
+- **AND** the row still keeps the existing activity/count/elapsed description text.
diff --git a/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/tasks.md b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/tasks.md
new file mode 100644
index 0000000..bd57762
--- /dev/null
+++ b/openspec/changes/agent-codex-vscode-working-agents-groups-2026-04-22-09-05/tasks.md
@@ -0,0 +1,32 @@
+## Definition of Done
+
+This change is complete only when **all** of the following are true:
+
+- Every checkbox below is checked.
+- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff.
+- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline.
+
+Handoff: 2026-04-22 09:05Z codex owns `templates/vscode/guardex-active-agents/*`, `test/vscode-active-agents-session-state.test.js`, `README.md`, and this change workspace to make actively working Guardex lanes easier to spot in VS Code.
+
+## 1. Specification
+
+- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-vscode-working-agents-groups-2026-04-22-09-05`.
+- [x] 1.2 Define normative requirements in `specs/vscode-working-agents-groups/spec.md`.
+
+## 2. Implementation
+
+- [x] 2.1 Split the `ACTIVE AGENTS` section into visible `WORKING NOW` and `THINKING` groups, preserving live session rows.
+- [x] 2.2 Surface working counts in the repo row / view badge summary and add a distinct icon for working lanes.
+- [x] 2.3 Update README guidance and focused regression tests for the new grouping behavior.
+
+## 3. Verification
+
+- [x] 3.1 Run `node --test test/vscode-active-agents-session-state.test.js`.
+- [x] 3.2 Run `openspec validate agent-codex-vscode-working-agents-groups-2026-04-22-09-05 --type change --strict`.
+- [x] 3.3 Run `openspec validate --specs`.
+
+## 4. Cleanup
+
+- [ ] 4.1 Run the cleanup pipeline: `bash scripts/agent-branch-finish.sh --branch agent/codex/vscode-working-agents-groups-2026-04-22-09-05 --base main --via-pr --wait-for-merge --cleanup`.
+- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff.
+- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch).
diff --git a/templates/vscode/guardex-active-agents/README.md b/templates/vscode/guardex-active-agents/README.md
index 06bb43e..b63a8c3 100644
--- a/templates/vscode/guardex-active-agents/README.md
+++ b/templates/vscode/guardex-active-agents/README.md
@@ -6,9 +6,10 @@ What it does:
 
 - Adds an `Active Agents` view to the Source Control container.
 - Renders one repo node per live Guardex workspace with grouped `ACTIVE AGENTS` and `CHANGES` sections.
-- Shows one row per live Guardex sandbox session inside the repo's `ACTIVE AGENTS` section.
+- Splits live sessions inside `ACTIVE AGENTS` into `WORKING NOW` and `THINKING` groups so active edit lanes stand out immediately.
+- Shows one row per live Guardex sandbox session inside those activity groups.
 - Shows repo-root git changes in a sibling `CHANGES` section when the guarded repo itself is dirty.
-- Derives `thinking` versus `working` from the live sandbox worktree and shows changed-file counts for active edits.
+- Derives `thinking` versus `working` from the live sandbox worktree, surfaces working counts in the repo/header summary, and shows changed-file counts for active edits.
 - Uses VS Code's native animated `loading~spin` icon for the running-state affordance.
 - Reads repo-local presence files from `.omx/state/active-sessions/`.
 
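The grouping behavior that the extension change implements can be sketched as a simple partition. This is a simplified stand-in, not the extension's real tree code: plain objects replace the `SectionItem`/`SessionItem` tree nodes, but the ordering and omit-when-empty rules match the spec scenarios above.

```javascript
// Partition live sessions by activity, keep WORKING NOW ahead of THINKING,
// and omit either group entirely when it has no sessions.
function buildActivityGroups(sessions) {
  const working = sessions.filter((s) => s.activityKind === 'working');
  const thinking = sessions.filter((s) => s.activityKind !== 'working');
  const groups = [];
  if (working.length > 0) groups.push({ label: 'WORKING NOW', items: working });
  if (thinking.length > 0) groups.push({ label: 'THINKING', items: thinking });
  return groups;
}
```

With one working and one thinking session this yields two groups with `WORKING NOW` first; with only thinking sessions, the `WORKING NOW` group is absent rather than rendered empty.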
diff --git a/templates/vscode/guardex-active-agents/extension.js b/templates/vscode/guardex-active-agents/extension.js index c67eb05..a375c52 100644 --- a/templates/vscode/guardex-active-agents/extension.js +++ b/templates/vscode/guardex-active-agents/extension.js @@ -18,21 +18,29 @@ class RepoItem extends vscode.TreeItem { this.sessions = sessions; this.changes = changes; const descriptionParts = [`${sessions.length} active`]; + const workingCount = countWorkingSessions(sessions); + if (workingCount > 0) { + descriptionParts.push(`${workingCount} working`); + } if (changes.length > 0) { descriptionParts.push(`${changes.length} changed`); } this.description = descriptionParts.join(' · '); - this.tooltip = repoRoot; + this.tooltip = [ + repoRoot, + this.description, + ].join('\n'); this.iconPath = new vscode.ThemeIcon('repo'); this.contextValue = 'gitguardex.repo'; } } class SectionItem extends vscode.TreeItem { - constructor(label, items) { + constructor(label, items, options = {}) { super(label, vscode.TreeItemCollapsibleState.Expanded); this.items = items; - this.description = items.length > 0 ? String(items.length) : ''; + this.description = options.description + || (items.length > 0 ? String(items.length) : ''); this.contextValue = 'gitguardex.section'; } } @@ -58,7 +66,9 @@ class SessionItem extends vscode.TreeItem { session.worktreePath, ]; this.tooltip = tooltipLines.filter(Boolean).join('\n'); - this.iconPath = new vscode.ThemeIcon('loading~spin'); + this.iconPath = session.activityKind === 'working' + ? 
new vscode.ThemeIcon('edit') + : new vscode.ThemeIcon('loading~spin'); this.contextValue = 'gitguardex.session'; this.command = { command: 'gitguardex.activeAgents.openWorktree', @@ -165,6 +175,29 @@ function buildChangeTreeNodes(changes) { return materialize(root); } +function countWorkingSessions(sessions) { + return sessions.filter((session) => session.activityKind === 'working').length; +} + +function buildActiveAgentGroupNodes(sessions) { + const workingSessions = sessions + .filter((session) => session.activityKind === 'working') + .map((session) => new SessionItem(session)); + const thinkingSessions = sessions + .filter((session) => session.activityKind !== 'working') + .map((session) => new SessionItem(session)); + const groups = []; + + if (workingSessions.length > 0) { + groups.push(new SectionItem('WORKING NOW', workingSessions)); + } + if (thinkingSessions.length > 0) { + groups.push(new SectionItem('THINKING', thinkingSessions)); + } + + return groups; +} + class ActiveAgentsProvider { constructor() { this.onDidChangeTreeDataEmitter = new vscode.EventEmitter(); @@ -178,10 +211,10 @@ class ActiveAgentsProvider { attachTreeView(treeView) { this.treeView = treeView; - this.updateViewState(0); + this.updateViewState(0, 0); } - updateViewState(sessionCount) { + updateViewState(sessionCount, workingCount) { if (!this.treeView) { return; } @@ -189,7 +222,8 @@ class ActiveAgentsProvider { this.treeView.badge = sessionCount > 0 ? { value: sessionCount, - tooltip: `${sessionCount} active agent${sessionCount === 1 ? '' : 's'}`, + tooltip: `${sessionCount} active agent${sessionCount === 1 ? '' : 's'}` + + (workingCount > 0 ? 
` · ${workingCount} working now` : ''), } : undefined; this.treeView.message = sessionCount > 0 @@ -204,7 +238,9 @@ class ActiveAgentsProvider { async getChildren(element) { if (element instanceof RepoItem) { const sectionItems = [ - new SectionItem('ACTIVE AGENTS', element.sessions.map((session) => new SessionItem(session))), + new SectionItem('ACTIVE AGENTS', buildActiveAgentGroupNodes(element.sessions), { + description: String(element.sessions.length), + }), ]; if (element.changes.length > 0) { sectionItems.push(new SectionItem('CHANGES', buildChangeTreeNodes(element.changes))); @@ -218,7 +254,11 @@ class ActiveAgentsProvider { const repoEntries = await this.loadRepoEntries(); const sessionCount = repoEntries.reduce((total, entry) => total + entry.sessions.length, 0); - this.updateViewState(sessionCount); + const workingCount = repoEntries.reduce( + (total, entry) => total + countWorkingSessions(entry.sessions), + 0, + ); + this.updateViewState(sessionCount, workingCount); if (repoEntries.length === 0) { return [new InfoItem('No active Guardex agents', 'Open or start a sandbox session.')]; diff --git a/test/vscode-active-agents-session-state.test.js b/test/vscode-active-agents-session-state.test.js index 698b022..2687e65 100644 --- a/test/vscode-active-agents-session-state.test.js +++ b/test/vscode-active-agents-session-state.test.js @@ -361,10 +361,15 @@ test('active-agents extension groups live sessions under a repo node', async () const [agentsSection] = await provider.getChildren(repoItem); assert.equal(agentsSection.label, 'ACTIVE AGENTS'); + assert.equal(agentsSection.description, '1'); - const [sessionItem] = await provider.getChildren(agentsSection); + const [thinkingSection] = await provider.getChildren(agentsSection); + assert.equal(thinkingSection.label, 'THINKING'); + + const [sessionItem] = await provider.getChildren(thinkingSection); assert.equal(sessionItem.label, 'live-task'); assert.match(sessionItem.description, /^thinking · \d+[smhd]/); + 
assert.equal(sessionItem.iconPath.id, 'loading~spin'); assert.deepEqual(registrations.treeViews[0].badge, { value: 1, tooltip: '1 active agent', @@ -417,14 +422,23 @@ test('active-agents extension shows grouped repo changes beside active agents', const provider = registrations.providers[0].provider; const [repoItem] = await provider.getChildren(); + assert.equal(repoItem.description, '1 active · 1 working · 1 changed'); const [agentsSection, changesSection] = await provider.getChildren(repoItem); assert.equal(agentsSection.label, 'ACTIVE AGENTS'); assert.equal(changesSection.label, 'CHANGES'); - const [sessionItem] = await provider.getChildren(agentsSection); + const [workingSection] = await provider.getChildren(agentsSection); + assert.equal(workingSection.label, 'WORKING NOW'); + + const [sessionItem] = await provider.getChildren(workingSection); assert.equal(sessionItem.label, path.basename(worktreePath)); assert.match(sessionItem.description, /^working · 2 files · /); assert.match(sessionItem.tooltip, /Changed 2 files: new-file\.txt, tracked\.txt/); + assert.equal(sessionItem.iconPath.id, 'edit'); + assert.deepEqual(registrations.treeViews[0].badge, { + value: 1, + tooltip: '1 active agent · 1 working now', + }); const [changeItem] = await provider.getChildren(changesSection); assert.equal(changeItem.label, 'root-file.txt'); @@ -435,3 +449,83 @@ test('active-agents extension shows grouped repo changes beside active agents', subscription.dispose?.(); } }); + +test('active-agents extension splits working and thinking sessions into separate groups', async () => { + const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-mixed-view-')); + + const workingPath = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-mixed-working-')); + initGitRepo(workingPath); + fs.writeFileSync(path.join(workingPath, 'tracked.txt'), 'base\n', 'utf8'); + runGit(workingPath, ['add', 'tracked.txt']); + runGit(workingPath, ['commit', '-m', 'baseline']); + 
fs.writeFileSync(path.join(workingPath, 'tracked.txt'), 'base\nchanged\n', 'utf8'); + + const thinkingPath = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-vscode-mixed-thinking-')); + initGitRepo(thinkingPath); + fs.writeFileSync(path.join(thinkingPath, 'tracked.txt'), 'base\n', 'utf8'); + runGit(thinkingPath, ['add', 'tracked.txt']); + runGit(thinkingPath, ['commit', '-m', 'baseline']); + + const workingSessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, 'agent/codex/working-task'); + fs.mkdirSync(path.dirname(workingSessionPath), { recursive: true }); + fs.writeFileSync( + workingSessionPath, + `${JSON.stringify(sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/working-task', + taskName: 'working-task', + agentName: 'codex', + worktreePath: workingPath, + pid: process.pid, + cliName: 'codex', + }), null, 2)}\n`, + 'utf8', + ); + + const thinkingSessionPath = sessionSchema.sessionFilePathForBranch(tempRoot, 'agent/codex/thinking-task'); + fs.writeFileSync( + thinkingSessionPath, + `${JSON.stringify(sessionSchema.buildSessionRecord({ + repoRoot: tempRoot, + branch: 'agent/codex/thinking-task', + taskName: 'thinking-task', + agentName: 'codex', + worktreePath: thinkingPath, + pid: process.pid, + cliName: 'codex', + }), null, 2)}\n`, + 'utf8', + ); + + const { registrations, vscode } = createMockVscode(tempRoot); + vscode.workspace.findFiles = async () => [ + { fsPath: workingSessionPath }, + { fsPath: thinkingSessionPath }, + ]; + const extension = loadExtensionWithMockVscode(vscode); + const context = { subscriptions: [] }; + + extension.activate(context); + + const provider = registrations.providers[0].provider; + const [repoItem] = await provider.getChildren(); + assert.equal(repoItem.description, '2 active · 1 working'); + + const [agentsSection] = await provider.getChildren(repoItem); + const [workingSection, thinkingSection] = await provider.getChildren(agentsSection); + assert.equal(workingSection.label, 'WORKING 
NOW'); + assert.equal(thinkingSection.label, 'THINKING'); + + const [workingItem] = await provider.getChildren(workingSection); + const [thinkingItem] = await provider.getChildren(thinkingSection); + assert.match(workingItem.description, /^working · 1 file · /); + assert.match(thinkingItem.description, /^thinking · \d+[smhd]/); + assert.deepEqual(registrations.treeViews[0].badge, { + value: 2, + tooltip: '2 active agents · 1 working now', + }); + + for (const subscription of context.subscriptions) { + subscription.dispose?.(); + } +}); From 7c5bd067ec2376464b82caf20dadf04938448a82 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 09:16:58 +0200 Subject: [PATCH 39/48] Stop repo-local workflow shims from defining Guardex command surface (#271) Guardex already runs branch, lock, review, and cleanup flows from gx, but repo installs still taught and depended on scripts/ workflow shims. This change shrinks the required repo footprint to hook shims plus repo-local state, removes CLI checks that required repo-local workflow scripts, and updates shipped docs, AGENTS guidance, skill references, and lock/review helper output around the zero-copy surface. 
Constraint: Existing repos may still carry legacy workflow scripts during migration; direct gx commands must keep working without weakening branch and lock guardrails Rejected: Keep thin repo-local workflow shims as permanent entrypoints | still leaves a second command surface and drift noise Confidence: high Scope-risk: moderate Directive: Keep repo-local workflow commands optional compatibility only; install, doctor, status, and skills should teach gx subcommands first Tested: node --check bin/multiagent-safety.js; node --test test/install.test.js Not-tested: test/metadata.test.js parity gate for non-targeted local runtime/template sync Co-authored-by: NagyVikt --- .githooks/pre-commit | 37 +- AGENTS.md | 32 +- README.md | 39 +- bin/multiagent-safety.js | 194 +++--- .../proposal.md | 26 + .../zero-copy-cli-install-surface/spec.md | 66 +++ .../tasks.md | 31 + scripts/agent-branch-merge.sh | 29 +- scripts/agent-file-locks.py | 22 +- scripts/review-bot-watch.sh | 37 +- templates/codex/skills/gitguardex/SKILL.md | 2 +- .../guardex-merge-skills-to-dev/SKILL.md | 6 +- templates/githooks/pre-commit | 23 +- templates/scripts/agent-branch-finish.sh | 51 +- templates/scripts/agent-branch-merge.sh | 29 +- templates/scripts/agent-branch-start.sh | 52 +- templates/scripts/agent-file-locks.py | 22 +- templates/scripts/codex-agent.sh | 92 ++- templates/scripts/review-bot-watch.sh | 37 +- test/install.test.js | 559 ++++++++---------- 20 files changed, 750 insertions(+), 636 deletions(-) create mode 100644 openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/proposal.md create mode 100644 openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/specs/zero-copy-cli-install-surface/spec.md create mode 100644 openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md diff --git a/.githooks/pre-commit b/.githooks/pre-commit index 89667bc..c2d751f 100755 --- a/.githooks/pre-commit +++ b/.githooks/pre-commit @@ -13,6 
+13,8 @@ repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" if [[ -z "$repo_root" ]]; then exit 0 fi +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then # shellcheck source=/dev/null @@ -22,6 +24,23 @@ if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! guardex_repo_is_enabl exit 0 fi +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-guard] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + if [[ "${ALLOW_COMMIT_ON_PROTECTED_BRANCH:-0}" == "1" ]]; then exit 0 fi @@ -118,9 +137,9 @@ if [[ "$should_require_codex_agent_branch" == "1" && "${GUARDEX_ALLOW_CODEX_ON_N [guardex-preedit-guard] Codex edit/commit detected on a protected branch. GuardeX requires Codex work to run from an isolated agent/* branch. Start the sub-branch/worktree with: - bash scripts/codex-agent.sh "" "" + gx branch start "" "" Or manually: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" Then commit from the created agent/* branch. Temporary bypass (not recommended): @@ -132,7 +151,7 @@ MSG cat >&2 <<'MSG' [codex-branch-guard] Codex agent commit blocked on non-agent branch. Use isolated branch/worktree first: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" Then commit from the created agent/* branch. Temporary bypass (not recommended): @@ -163,9 +182,9 @@ if [[ "$is_protected_branch" == "1" ]]; then cat >&2 <<'MSG' [agent-branch-guard] Direct commits on protected branches are blocked. 
Use an agent branch first: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" After finishing work: - bash scripts/agent-branch-finish.sh + gx branch finish Temporary bypass (not recommended): ALLOW_COMMIT_ON_PROTECTED_BRANCH=1 git commit ... @@ -177,7 +196,7 @@ if [[ "$is_agent_session" == "1" && "$branch" != agent/* ]]; then cat >&2 <<'MSG' [agent-branch-guard] Agent commits must run on dedicated agent/* branches. Start an agent branch first: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" Then commit on that branch. Temporary bypass (not recommended): @@ -191,15 +210,15 @@ if [[ "$branch" == agent/* ]]; then while IFS= read -r staged_file; do [[ -z "$staged_file" ]] && continue [[ "$staged_file" == ".omx/state/agent-file-locks.json" ]] && continue - python3 scripts/agent-file-locks.py claim --branch "$branch" "$staged_file" >/dev/null 2>&1 || true + run_guardex_cli locks claim --branch "$branch" "$staged_file" >/dev/null 2>&1 || true done < <(git diff --cached --name-only --diff-filter=ACMRDTUXB) fi - if ! python3 scripts/agent-file-locks.py validate --branch "$branch" --staged; then + if ! run_guardex_cli locks validate --branch "$branch" --staged; then cat >&2 <<'MSG' [agent-branch-guard] Agent branch commits require file ownership locks. Claim files first: - python3 scripts/agent-file-locks.py claim --branch "$(git rev-parse --abbrev-ref HEAD)" + gx locks claim --branch "$(git rev-parse --abbrev-ref HEAD)" MSG exit 1 fi diff --git a/AGENTS.md b/AGENTS.md index d177a49..63b5ed8 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -54,7 +54,7 @@ When Guardex is enabled, Claude Code sessions use the same agent-worktree + Open ### Tiering (token-aware scaffolding) -`agent-branch-start.sh` and `agent-branch-finish.sh` accept `--tier {T0|T1|T2|T3}` to size the OpenSpec scaffolding to the change's blast radius. Default is `T3` (full scaffolding; current behavior). 
The tier is recorded in the bootstrap manifest so `finish` picks it up automatically. +`gx branch start` and `gx branch finish` accept `--tier {T0|T1|T2|T3}` to size the OpenSpec scaffolding to the change's blast radius. Default is `T3` (full scaffolding; current behavior). The tier is recorded in the bootstrap manifest so `finish` picks it up automatically. | Tier | Use for | Scaffolding on `start` | Gates on `finish` | |------|---------|------------------------|--------------------| @@ -67,16 +67,16 @@ Examples: ```bash # T0 (typo / trivial): fastest path, no OpenSpec artifacts -bash scripts/agent-branch-start.sh --tier T0 "fix-typo-in-readme" "claude-name" +gx branch start --tier T0 "fix-typo-in-readme" "claude-name" # T1 (small fix): notes-only scaffold, commit message is the spec of record -bash scripts/agent-branch-start.sh --tier T1 "tighten-retry-backoff" "claude-name" +gx branch start --tier T1 "tighten-retry-backoff" "claude-name" # T2 (default for real behavior changes): full change spec, no plan workspace -bash scripts/agent-branch-start.sh --tier T2 "add-oauth-endpoint" "claude-name" +gx branch start --tier T2 "add-oauth-endpoint" "claude-name" # T3 (current default if --tier is omitted): plan workspace + full OpenSpec -bash scripts/agent-branch-start.sh "refactor-payment-pipeline" "claude-name" +gx branch start "refactor-payment-pipeline" "claude-name" ``` `finish` reads the tier from the manifest automatically; passing `--tier` on finish is only needed to override (e.g., upgrading to a fuller gate). @@ -84,7 +84,7 @@ bash scripts/agent-branch-start.sh "refactor-payment-pipeline" "claude-name" 1. Start a sandbox worktree: ```bash - bash scripts/agent-branch-start.sh [--tier T0|T1|T2|T3] "" "claude-" + gx branch start [--tier T0|T1|T2|T3] "" "claude-" ``` Creates `agent/claude-/` under `.omc/agent-worktrees/`, scaffolds the OpenSpec change + plan workspaces (sized by tier), and records the bootstrap manifest. 
Codex sessions keep using `.omx/agent-worktrees/`. Missing `codex-auth` silently falls back to an empty snapshot slug (expected for Claude sessions). @@ -93,7 +93,7 @@ bash scripts/agent-branch-start.sh "refactor-payment-pipeline" "claude-name" ```bash cd .omc/agent-worktrees/agent__claude-__ - python3 scripts/agent-file-locks.py claim --branch "agent/claude-/" + gx locks claim --branch "agent/claude-/" # implement + commit inside this worktree ``` @@ -102,7 +102,7 @@ bash scripts/agent-branch-start.sh "refactor-payment-pipeline" "claude-name" 3. Finish via PR + cleanup: ```bash - bash scripts/agent-branch-finish.sh \ + gx branch finish \ --branch "agent/claude-/" \ --base dev --via-pr --wait-for-merge --cleanup ``` @@ -113,11 +113,11 @@ Notes: - Slash commands `/opsx:*` in `.claude/commands/opsx/` drive the OpenSpec artifact flow. - `.claude/settings.json` already wires the `skill_activation` / `skill_guard` hooks, so project-conventions enforcement runs automatically on edits. -- `skill_guard` blocks most Bash commands while the shell is on `dev`; run the start/claim/finish commands from within the worktree, or prefix the invocation with `ALLOW_BASH_ON_NON_AGENT_BRANCH=1` when calling from the primary checkout. +- `skill_guard` blocks most Bash commands while the shell is on `dev`; run the `gx branch ...`, `gx locks ...`, and `gx branch finish ...` commands from within the worktree, or prefix the invocation with `ALLOW_BASH_ON_NON_AGENT_BRANCH=1` when calling from the primary checkout. ### Stalled agent worktree recovery -`codex-agent.sh` auto-finishes a branch only when the codex CLI exits cleanly inside it. If the agent is killed, crashes, runs out of budget, or is started directly via `agent-branch-start.sh` (no `codex-agent.sh` wrapper), the worktree is left dirty with no commits and no PR — a "stalled" worktree. +The Guardex Codex launcher auto-finishes a branch only when the codex CLI exits cleanly inside it. 
If the agent is killed, crashes, runs out of budget, or is started directly via `gx branch start` without the launcher, the worktree is left dirty with no commits and no PR — a "stalled" worktree. `scripts/agent-stalled-report.sh` is a quiet wrapper around `scripts/agent-autofinish-watch.sh --once --dry-run` that surfaces stalled worktrees. It is wired as a `SessionStart` hook in `.claude/settings.json`, so each Claude Code session begins with a one-line summary per stalled branch (and is silent when nothing is stalled). @@ -144,7 +144,7 @@ Apply these repo-specific supplements in addition to that canonical contract: - `agent-branch-start` and `agent-branch-finish` must fast-forward local `dev` from `origin/dev` before branch creation/merge. 2. Ownership and lock discipline -- Claim owned files before edits: `python3 scripts/agent-file-locks.py claim --branch "" `. +- Claim owned files before edits: `gx locks claim --branch "" `. - If `main.rs` is in scope, claim lock first: `python3 scripts/main_rs_lock.py claim --owner "" --branch ""`. - Non-integrator branches must not edit `main.rs` unless explicit emergency override is approved. - Pre-commit blocks `agent/*` commits with unclaimed files or missing valid `main.rs` lock. @@ -181,7 +181,7 @@ When Guardex is enabled, this repo uses **OpenSpec as the primary workflow and S 3. Keep artifacts editable throughout implementation (proposal/spec/design/tasks are living docs, not rigid phase gates). 4. Implement from `tasks.md`; keep code and specs in sync (update `spec.md` as behavior changes). 5. Keep `tasks.md` checkpoint status updated continuously during execution; mark items as soon as they complete (do not batch-update at the end). -6. Default `tasks.md` scaffolds and manual task edits must include a final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `scripts/agent-branch-finish.sh ... 
--cleanup`) and captures PR URL + final `MERGED` handoff evidence. +6. Default `tasks.md` scaffolds and manual task edits must include a final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx branch finish ... --cleanup` or `gx finish --all`) and captures PR URL + final `MERGED` handoff evidence. 7. Validate specs locally: `openspec validate --specs`. 8. Verify before archiving (`/opsx:verify ` when applicable); never archive unverified changes. @@ -261,22 +261,22 @@ scripts/openspec/init-plan-workspace.sh `GUARDEX_ON=0` disables Guardex for that repo. `GUARDEX_ON=1` explicitly enables Guardex for that repo again. -**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `scripts/agent-branch-start.sh "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. +**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `gx branch start "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree. 
-**Ownership.** Before editing, claim files: `scripts/agent-file-locks.py claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned. +**Ownership.** Before editing, claim files: `gx locks claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned. **Handoff gate.** Post a one-line handoff note (plan/change, owned scope, intended action) before editing. Re-read the latest handoffs before replacing others' code. -**Completion.** Finish with `scripts/agent-branch-finish.sh --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish. +**Completion.** Finish with `gx branch finish --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish. OMX completion policy: when a task is done, the agent must commit the task changes, push the agent branch, and create/update a PR before considering the branch complete. **Parallel safety.** Assume other agents edit nearby. Never revert unrelated changes. Report conflicts in the handoff. **Reporting.** Every completion handoff includes: files changed, behavior touched, verification commands + results, risks/follow-ups. -**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `scripts/agent-branch-finish.sh ... --cleanup`) and records PR URL + final `MERGED` evidence. 
Verify specs with `openspec validate --specs` before archive. Don't archive unverified. +**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx branch finish ... --cleanup` or `gx finish --all`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified. **Version bumps.** If a change bumps a published version, the same PR updates release notes/changelog. diff --git a/README.md b/README.md index 551c9ab..1b14fff 100644 --- a/README.md +++ b/README.md @@ -49,7 +49,7 @@ npm i -g @imdeadpool/guardex

- Then cd into your repo and run gx setup — hook shims, workflow shims, repo state,
+ Then cd into your repo and run gx setup — hook shims, repo state, and OMX / OpenSpec / caveman wiring all scaffold in one go.

@@ -88,7 +88,7 @@ cd /path/to/your/repo gx setup ``` -That's it. Install and update via `@imdeadpool/guardex`. Setup installs the minimal repo footprint: managed hook/workflow shims, lock state, AGENTS wiring, and OpenSpec/caveman/OMX scaffolding. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). +That's it. Install and update via `@imdeadpool/guardex`. Setup installs the minimal repo footprint: managed hook shims, repo-local state, AGENTS wiring, OpenSpec/caveman/OMX scaffolding, and a small set of repo-local helper assets. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility). --- @@ -178,7 +178,7 @@ Before you branch, repair, or start agents, run plain `gx`. It gives you a one-s ![GitGuardex terminal status output](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-gx-terminal-status.svg) -Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hook/workflow shims, repo state, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean. +Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hook shims, repo-local state, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean. --- @@ -203,7 +203,7 @@ gx branch finish \ --base main --via-pr --wait-for-merge --cleanup ``` -If you use the managed Codex launcher shim, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. 
+If you launch Codex through Guardex, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR. GitGuardex normally prunes merged sandboxes for you as part of the finish flow. If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane: @@ -542,7 +542,7 @@ Expanded flow: ### OpenSpec in agent sub-branches -- The managed Codex launcher shim enforces OpenSpec workspaces before launching Codex. +- The Guardex Codex launcher enforces OpenSpec workspaces before launching Codex. - `gx branch start` can scaffold both `openspec/changes//` and `openspec/plan//` when `GUARDEX_OPENSPEC_AUTO_INIT=true`. - The collaboration section in `tasks.md` is there for real cleanup handoffs too. If the first Codex/Claude session finishes the implementation work but hits a usage limit before `agent-branch-finish --cleanup`, hand the same sandbox to another agent, let that agent finish cleanup, and record the join/handoff in the change task. @@ -560,26 +560,29 @@ Environment variables: ## Files installed by setup ```text +AGENTS.md # managed multi-agent block appended/refreshed in place .githooks/pre-commit # shim -> gx hook run pre-commit .githooks/pre-push # shim -> gx hook run pre-push .githooks/post-merge # shim -> gx hook run post-merge .githooks/post-checkout # shim -> gx hook run post-checkout -scripts/agent-branch-start.sh # shim -> gx branch start -scripts/agent-branch-finish.sh # shim -> gx branch finish -scripts/agent-branch-merge.sh # shim -> gx branch merge -scripts/agent-worktree-prune.sh # shim -> gx worktree prune -scripts/agent-file-locks.py # shim -> gx locks ... 
-scripts/codex-agent.sh                    # shim -> CLI-owned Codex launcher
-scripts/review-bot-watch.sh               # shim -> CLI-owned review bot
-scripts/openspec/init-plan-workspace.sh   # shim -> CLI-owned OpenSpec init
-scripts/openspec/init-change-workspace.sh # shim -> CLI-owned OpenSpec init
-.omc/agent-worktrees
-.omx/state/agent-file-locks.json
+scripts/guardex-env.sh                    # repo toggle + hook/helper env bridge
+scripts/guardex-docker-loader.sh          # compose env/loader helper
+scripts/agent-session-state.js            # active-session state helper
+scripts/install-vscode-active-agents-extension.js
+.omc/agent-worktrees                      # Claude sandbox root
+.omx/agent-worktrees                      # Codex sandbox root
+.omx/state/agent-file-locks.json          # file-lock registry
 .github/pull.yml.example
 .github/workflows/cr.yml
+vscode/guardex-active-agents/package.json
+vscode/guardex-active-agents/extension.js
+vscode/guardex-active-agents/session-schema.js
+vscode/guardex-active-agents/README.md
 ```

-Repo-local Codex/Claude helper files are no longer copied into the repo. Install the optional user-level companions with `gx install-agent-skills`.
+Legacy compatibility note: older repos may still contain repo-local workflow scripts under `scripts/`. Direct `gx branch ...`, `gx locks ...`, `gx finish`, `gx cleanup`, `gx merge`, and `gx migrate` do not require them. `gx migrate` removes those leftover workflow shims by default. The CLI still honors repo-local `scripts/review-bot-watch.sh` and `scripts/codex-agent.sh` when they are already present so older repos can keep working during migration.
+
+Optional Codex/Claude user-level companions still install with `gx install-agent-skills`; they are not copied into each repo.

 ---

@@ -645,7 +648,7 @@ npm pack --dry-run

 ### v7.0.18

 - GitGuardex now keeps the install workflow in `gx` itself: `gx branch ...`, `gx locks ...`, `gx worktree prune`, `gx migrate`, and user-level agent-skill install now own the agent lifecycle instead of teaching pasted repo scripts as the primary surface.
- Fresh installs switch repo hooks to tiny `gx hook run ...` shims, stop copying repo-local workflow implementations and repo-local skills, and stop injecting Guardex-managed `agent:*` package scripts into consumer repos. -- `gx migrate` can move older repos onto the smaller CLI-owned install surface while preserving the managed AGENTS block, lock registry state, repo-local dispatch shims, and required gitignore entries. +- `gx migrate` can move older repos onto the smaller CLI-owned install surface while preserving the managed AGENTS block, lock registry state, hook shims, required gitignore entries, and the repo-local helper assets that still carry local state. - Bumped the release from `7.0.17` → `7.0.18` so the shipped CLI-owned install-surface changes land on a fresh publishable npm version. ### v7.0.17 diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index f30a943..6f57748 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -109,33 +109,30 @@ const TEMPLATE_FILES = [ 'vscode/guardex-active-agents/README.md', ]; -const SCRIPT_SHIMS = [ - { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, - { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, - { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, - { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, - { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, - { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, - { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, - { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, - { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 
'shell', command: ['internal', 'run-shell', 'changeInit'] }, -]; - -const LEGACY_MANAGED_REPO_FILES = [ +const LEGACY_WORKFLOW_SHIMS = [ 'scripts/agent-branch-start.sh', 'scripts/agent-branch-finish.sh', 'scripts/agent-branch-merge.sh', - 'scripts/agent-session-state.js', 'scripts/codex-agent.sh', - 'scripts/guardex-docker-loader.sh', - 'scripts/install-vscode-active-agents-extension.js', 'scripts/review-bot-watch.sh', 'scripts/agent-worktree-prune.sh', 'scripts/agent-file-locks.py', - 'scripts/guardex-env.sh', - 'scripts/install-agent-git-hooks.sh', 'scripts/openspec/init-plan-workspace.sh', 'scripts/openspec/init-change-workspace.sh', +]; + +const MANAGED_TEMPLATE_DESTINATIONS = TEMPLATE_FILES.map((entry) => toDestinationPath(entry)); +const MANAGED_TEMPLATE_SCRIPT_FILES = MANAGED_TEMPLATE_DESTINATIONS.filter((entry) => + entry.startsWith('scripts/'), +); + +const LEGACY_MANAGED_REPO_FILES = [ + ...LEGACY_WORKFLOW_SHIMS, + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'scripts/guardex-env.sh', + 'scripts/install-agent-git-hooks.sh', '.githooks/pre-commit', '.githooks/pre-push', '.githooks/post-merge', @@ -145,9 +142,8 @@ const LEGACY_MANAGED_REPO_FILES = [ '.claude/commands/gitguardex.md', ]; -const REQUIRED_WORKFLOW_FILES = [ - ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)), - ...SCRIPT_SHIMS.map((entry) => entry.relativePath), +const REQUIRED_MANAGED_REPO_FILES = [ + ...MANAGED_TEMPLATE_DESTINATIONS, ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), '.omx/state/agent-file-locks.json', ]; @@ -205,22 +201,13 @@ const USER_LEVEL_SKILL_ASSETS = [ ]; const EXECUTABLE_RELATIVE_PATHS = new Set([ - 'scripts/agent-session-state.js', - 'scripts/guardex-docker-loader.sh', - 'scripts/install-vscode-active-agents-extension.js', - ...SCRIPT_SHIMS.map((entry) => entry.relativePath), + ...MANAGED_TEMPLATE_SCRIPT_FILES, ...HOOK_NAMES.map((entry) => 
path.posix.join('.githooks', entry)), ]); const CRITICAL_GUARDRAIL_PATHS = new Set([ 'AGENTS.md', ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-branch-merge.sh', - 'scripts/agent-worktree-prune.sh', - 'scripts/codex-agent.sh', - 'scripts/agent-file-locks.py', 'scripts/guardex-env.sh', ]); @@ -239,9 +226,10 @@ const AGENT_WORKTREE_RELATIVE_DIRS = [ const MANAGED_GITIGNORE_PATHS = [ '.omx/', '.omc/', - 'scripts/*', - 'scripts/agent-branch-start.sh', - 'scripts/agent-file-locks.py', + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/guardex-env.sh', + 'scripts/install-vscode-active-agents-extension.js', '.githooks', 'oh-my-codex/', LOCK_FILE_RELATIVE, @@ -311,7 +299,7 @@ const CLI_COMMAND_DESCRIPTIONS = [ ['locks', 'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'], ['worktree', 'CLI-owned worktree cleanup surface (prune)'], ['hook', 'Hook dispatch/install surface used by managed shims'], - ['migrate', 'Convert legacy repo-local installs to the new shim-based CLI-owned surface'], + ['migrate', 'Convert legacy repo-local installs to the zero-copy CLI-owned surface'], ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'], ['protect', 'Manage protected branches (list/add/remove/set/reset)'], ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'], @@ -680,6 +668,27 @@ function runPackageAsset(assetKey, rawArgs, options = {}) { }); } +function repoLocalLegacyScriptPath(repoRoot, relativePath) { + const assetPath = path.join(repoRoot, relativePath); + return fs.existsSync(assetPath) ? 
assetPath : null; +} + +function runReviewBotCommand(repoRoot, rawArgs, options = {}) { + const legacyScript = repoLocalLegacyScriptPath(repoRoot, 'scripts/review-bot-watch.sh'); + if (legacyScript) { + return run('bash', [legacyScript, ...rawArgs], { + cwd: repoRoot, + stdio: options.stdio || 'pipe', + timeout: options.timeout, + env: packageAssetEnv(options.env), + }); + } + return runPackageAsset('reviewBot', rawArgs, { + ...options, + cwd: repoRoot, + }); +} + function invokePackageAsset(assetKey, rawArgs, options = {}) { const result = runPackageAsset(assetKey, rawArgs, options); if (result.stdout) process.stdout.write(result.stdout); @@ -1670,7 +1679,6 @@ function ensureParentWorkspaceView(repoRoot, dryRun) { function hasGuardexBootstrapFiles(repoRoot) { const required = [ 'AGENTS.md', - 'scripts/agent-branch-start.sh', '.githooks/pre-commit', '.githooks/pre-push', LOCK_FILE_RELATIVE, @@ -1934,11 +1942,6 @@ function startProtectedBaseSandbox(blocked, { taskName, sandboxSuffix }) { return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); } - const startScript = path.join(blocked.repoRoot, 'scripts', 'agent-branch-start.sh'); - if (!fs.existsSync(startScript)) { - return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); - } - const startResult = runPackageAsset('branchStart', [ '--task', taskName, @@ -2088,7 +2091,7 @@ function collectWorktreeDirtyPaths(worktreePath) { } function collectDoctorForceAddPaths(worktreePath) { - return REQUIRED_WORKFLOW_FILES + return REQUIRED_MANAGED_REPO_FILES .filter((relativePath) => relativePath.startsWith('scripts/') || relativePath.startsWith('.githooks/')) .filter((relativePath) => fs.existsSync(path.join(worktreePath, relativePath))); } @@ -2124,11 +2127,10 @@ function stripDoctorSandboxLocks(rawContent, branchName) { } function claimDoctorChangedLocks(metadata) { - const lockScript = path.join(metadata.worktreePath, 'scripts', 'agent-file-locks.py'); - if (!fs.existsSync(lockScript) || !metadata.branch) { 
+ if (!metadata.branch) { return { status: 'skipped', - note: 'lock helper unavailable in sandbox', + note: 'missing sandbox branch metadata', changedCount: 0, deletedCount: 0, }; @@ -2247,13 +2249,6 @@ function doctorFinishFlowIsPending(output) { } function finishDoctorSandboxBranch(blocked, metadata, options = {}) { - const finishScript = path.join(metadata.worktreePath, 'scripts', 'agent-branch-finish.sh'); - if (!fs.existsSync(finishScript)) { - return { - status: 'skipped', - note: `${path.relative(metadata.worktreePath, finishScript)} missing in sandbox`, - }; - } if (!hasOriginRemote(blocked.repoRoot)) { return { status: 'skipped', @@ -2290,9 +2285,9 @@ function finishDoctorSandboxBranch(blocked, metadata, options = {}) { const finishTimeoutMs = Math.max(180_000, (waitTimeoutSeconds + 60) * 1000); const waitForMergeArg = options.waitForMerge === false ? '--no-wait-for-merge' : '--wait-for-merge'; - const finishResult = run( - 'bash', - [finishScript, '--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'], + const finishResult = runPackageAsset( + 'branchFinish', + ['--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'], { cwd: metadata.worktreePath, timeout: finishTimeoutMs }, ); if (isSpawnFailure(finishResult)) { @@ -2363,7 +2358,7 @@ function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata ...(autoCommitResult.stagedFiles || []), ...OMX_SCAFFOLD_DIRECTORIES, ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...REQUIRED_WORKFLOW_FILES, + ...REQUIRED_MANAGED_REPO_FILES, 'bin', 'package.json', '.gitignore', @@ -3387,13 +3382,6 @@ function autoFinishReadyAgentBranches(repoRoot, options = {}) { return summary; } - const finishScript = path.join(repoRoot, 'scripts', 'agent-branch-finish.sh'); - if (!fs.existsSync(finishScript)) { - summary.enabled = false; - summary.details.push(`Skipped auto-finish sweep (missing ${path.relative(repoRoot, finishScript)}).`); - 
return summary; - } - const hasOrigin = gitRun(repoRoot, ['remote', 'get-url', 'origin'], { allowFailure: true }).status === 0; if (!hasOrigin) { summary.enabled = false; @@ -4212,11 +4200,6 @@ function gitOutputLines(worktreePath, args) { } function claimLocksForAutoCommit(repoRoot, worktreePath, branch) { - const lockScript = path.join(repoRoot, 'scripts', 'agent-file-locks.py'); - if (!fs.existsSync(lockScript)) { - return; - } - const changedFiles = uniquePreserveOrder([ ...gitOutputLines(worktreePath, ['diff', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), ...gitOutputLines(worktreePath, ['diff', '--cached', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), @@ -5173,9 +5156,6 @@ function runInstallInternal(options) { for (const templateFile of TEMPLATE_FILES) { operations.push(copyTemplateFile(repoRoot, templateFile, Boolean(options.force), Boolean(options.dryRun))); } - for (const shim of SCRIPT_SHIMS) { - operations.push(ensureGeneratedScriptShim(repoRoot, shim, options)); - } for (const hookName of HOOK_NAMES) { operations.push(ensureHookShim(repoRoot, hookName, options)); } @@ -5220,9 +5200,6 @@ function runFixInternal(options) { for (const templateFile of TEMPLATE_FILES) { operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun))); } - for (const shim of SCRIPT_SHIMS) { - operations.push(ensureGeneratedScriptShim(repoRoot, shim, options)); - } for (const hookName of HOOK_NAMES) { operations.push(ensureHookShim(repoRoot, hookName, options)); } @@ -5284,7 +5261,7 @@ function runScanInternal(options) { const requiredPaths = [ ...OMX_SCAFFOLD_DIRECTORIES, ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...REQUIRED_WORKFLOW_FILES, + ...REQUIRED_MANAGED_REPO_FILES, ]; for (const relativePath of requiredPaths) { @@ -5294,7 +5271,7 @@ function runScanInternal(options) { level: 'error', code: 'missing-managed-file', path: relativePath, - message: `Missing managed workflow file: 
${relativePath}`, + message: `Missing managed repo file: ${relativePath}`, }); } } @@ -5902,15 +5879,7 @@ function doctor(rawArgs) { function review(rawArgs) { const options = parseReviewArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); - const reviewScriptPath = path.join(repoRoot, 'scripts', 'review-bot-watch.sh'); - if (!fs.existsSync(reviewScriptPath)) { - throw new Error( - `Missing review bot script: ${reviewScriptPath}\n` + - `Run '${SHORT_TOOL_NAME} setup --target ${repoRoot}' then '${SHORT_TOOL_NAME} doctor --target ${repoRoot}'.`, - ); - } - - const result = run('bash', [reviewScriptPath, ...options.passthroughArgs], { cwd: repoRoot }); + const result = runReviewBotCommand(repoRoot, options.passthroughArgs); if (isSpawnFailure(result)) { throw result.error; } @@ -6049,24 +6018,9 @@ function spawnDetachedAgentProcess({ command, args, cwd, logPath }) { function agents(rawArgs) { const options = parseAgentsArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); - const reviewScriptPath = path.join(repoRoot, 'scripts', 'review-bot-watch.sh'); - const pruneScriptPath = path.join(repoRoot, 'scripts', 'agent-worktree-prune.sh'); const statePath = agentsStatePathForRepo(repoRoot); if (options.subcommand === 'start') { - if (!fs.existsSync(reviewScriptPath)) { - throw new Error( - `Missing review bot script: ${reviewScriptPath}\n` + - `Run '${SHORT_TOOL_NAME} setup --target ${repoRoot}' then '${SHORT_TOOL_NAME} doctor --target ${repoRoot}'.`, - ); - } - if (!fs.existsSync(pruneScriptPath)) { - throw new Error( - `Missing cleanup script: ${pruneScriptPath}\n` + - `Run '${SHORT_TOOL_NAME} setup --target ${repoRoot}' then '${SHORT_TOOL_NAME} doctor --target ${repoRoot}'.`, - ); - } - const existingState = readAgentsState(repoRoot); const existingReviewPid = Number.parseInt(String(existingState?.review?.pid || ''), 10); const existingCleanupPid = Number.parseInt(String(existingState?.cleanup?.pid || ''), 10); @@ -6091,8 +6045,17 @@ function 
agents(rawArgs) { if (!reviewRunning) { reviewPid = spawnDetachedAgentProcess({ - command: 'bash', - args: [reviewScriptPath, '--interval', String(options.reviewIntervalSeconds)], + command: process.execPath, + args: [ + path.resolve(__filename), + 'internal', + 'run-shell', + 'reviewBot', + '--target', + repoRoot, + '--interval', + String(options.reviewIntervalSeconds), + ], cwd: repoRoot, logPath: reviewLogPath, }); @@ -6143,7 +6106,7 @@ function agents(rawArgs) { review: { pid: reviewPid, intervalSeconds: reviewIntervalSeconds, - script: reviewScriptPath, + script: path.resolve(__filename), logPath: reviewLogPath, }, cleanup: { @@ -6174,7 +6137,7 @@ function agents(rawArgs) { return; } - const reviewStop = stopAgentProcessByPid(existingState?.review?.pid, 'review-bot-watch.sh'); + const reviewStop = stopAgentProcessByPid(existingState?.review?.pid, 'internal run-shell reviewBot'); const cleanupStop = stopAgentProcessByPid(existingState?.cleanup?.pid, `${path.basename(__filename)} cleanup`); if (fs.existsSync(statePath)) { @@ -6870,7 +6833,7 @@ function doctorAudit(rawArgs) { ok('git core.hooksPath is .githooks'); } - for (const relativePath of REQUIRED_WORKFLOW_FILES) { + for (const relativePath of REQUIRED_MANAGED_REPO_FILES) { const absolutePath = path.join(repoRoot, relativePath); if (!fs.existsSync(absolutePath)) { fail(`missing ${relativePath}`); @@ -7084,7 +7047,10 @@ function internal(rawArgs) { throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`); } const { target, passthrough } = extractTargetedArgs(rest); - const result = runPackageAsset(assetKey, passthrough, { cwd: resolveRepoRoot(target) }); + const repoRoot = resolveRepoRoot(target); + const result = assetKey === 'reviewBot' + ? 
runReviewBotCommand(repoRoot, passthrough) + : runPackageAsset(assetKey, passthrough, { cwd: repoRoot }); if (result.stdout) process.stdout.write(result.stdout); if (result.stderr) process.stderr.write(result.stderr); process.exitCode = result.status; @@ -7149,7 +7115,7 @@ function migrate(rawArgs) { } const removableLegacyFiles = LEGACY_MANAGED_REPO_FILES.filter( - (relativePath) => !REQUIRED_WORKFLOW_FILES.includes(relativePath), + (relativePath) => !REQUIRED_MANAGED_REPO_FILES.includes(relativePath), ); const removalOps = removableLegacyFiles.map((relativePath) => removeLegacyManagedRepoFile(repoRoot, relativePath, { dryRun, force })); removalOps.push(removeLegacyPackageScripts(repoRoot, dryRun)); @@ -7160,10 +7126,6 @@ function migrate(rawArgs) { function cleanup(rawArgs) { const options = parseCleanupArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); - const pruneScript = path.join(repoRoot, 'scripts', 'agent-worktree-prune.sh'); - if (!fs.existsSync(pruneScript)) { - throw new Error(`Missing cleanup script: ${pruneScript}. Run '${SHORT_TOOL_NAME} setup' first.`); - } const args = []; if (options.base) { @@ -7229,11 +7191,6 @@ function cleanup(rawArgs) { function merge(rawArgs) { const options = parseMergeArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); - const mergeScript = path.join(repoRoot, 'scripts', 'agent-branch-merge.sh'); - - if (!fs.existsSync(mergeScript)) { - throw new Error(`Missing merge script: ${mergeScript}. Run '${SHORT_TOOL_NAME} setup' first.`); - } const args = []; if (options.base) { @@ -7269,11 +7226,6 @@ function merge(rawArgs) { function finish(rawArgs, defaults = {}) { const options = parseFinishArgs(rawArgs, defaults); const repoRoot = resolveRepoRoot(options.target); - const finishScript = path.join(repoRoot, 'scripts', 'agent-branch-finish.sh'); - - if (!fs.existsSync(finishScript)) { - throw new Error(`Missing finish script: ${finishScript}. 
Run '${SHORT_TOOL_NAME} setup' first.`); - } const worktreeEntries = listAgentWorktrees(repoRoot); const worktreeByBranch = new Map(worktreeEntries.map((entry) => [entry.branch, entry.worktreePath])); diff --git a/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/proposal.md b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/proposal.md new file mode 100644 index 0000000..dd34cfe --- /dev/null +++ b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/proposal.md @@ -0,0 +1,26 @@ +## Why + +- `gx` already owns the branch, lock, hook, and cleanup behavior, but fresh installs still leave repo-local `scripts/*` workflow shims behind as a second command surface. +- That remaining shim layer keeps `doctor`, `setup`, status checks, docs, and templates coupled to files that do not hold repo-specific state. +- The result is still a distributed install model: the CLI is authoritative, but repos are treated as a file-distribution medium for command entrypoints and presence markers. + +## What Changes + +- Remove repo-local workflow shims from the managed install surface and make `gx` subcommands the only canonical workflow entrypoints for branch, finish, merge, lock, cleanup, review, and OpenSpec bootstrap flows. +- Keep repo-local hook shims only; each installed hook stays a tiny `gx hook run ...` dispatcher. +- Shrink the managed repo footprint to only repo-local state and guidance: managed AGENTS block, hook shims, `.omx/.omc` scaffold, lock registry, and managed `.gitignore`. +- Teach `gx migrate` to remove leftover repo-local workflow shims and legacy command-script injections while preserving real repo-local state. +- Simplify `gx doctor` and related health checks so they validate the smaller install surface and stop treating missing repo-local workflow shims as drift. 
+
+## Scope
+
+- `bin/multiagent-safety.js`
+- managed install/doctor/migrate/status logic
+- hook templates and docs/templates that still mention repo-local workflow shims
+- install/migrate/status tests
+
+## Risks
+
+- Existing repos may still rely on `scripts/*` paths in local habits, CI, or agent prompts; migration and docs need a clear deprecation path.
+- `gx finish`, `gx merge`, and `gx cleanup` currently still use repo-local shim presence as an install marker; removing that check must not weaken repo-root validation.
+- `doctor` can shrink materially, but protected-branch AGENTS/hook repair still needs a deliberate posture instead of silently regressing branch-safety guarantees.
diff --git a/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/specs/zero-copy-cli-install-surface/spec.md b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/specs/zero-copy-cli-install-surface/spec.md
new file mode 100644
index 0000000..c00f439
--- /dev/null
+++ b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/specs/zero-copy-cli-install-surface/spec.md
@@ -0,0 +1,66 @@
+## ADDED Requirements
+
+### Requirement: Setup installs only repo-local state plus hook shims
+
+`gx setup` and `gx doctor` SHALL keep the managed repo footprint limited to repo-specific state, guidance, and hook dispatch shims.
+
+#### Scenario: setup installs the zero-copy footprint
+
+- **GIVEN** a repo opts into Guardex
+- **WHEN** `gx setup` runs
+- **THEN** it installs the managed AGENTS block, `.githooks/*` dispatch shims, `.omx/.omc` scaffold, lock registry state, and the managed `.gitignore` block
+- **AND** it does not install repo-local workflow command shims under `scripts/`
+- **AND** it does not copy workflow implementations, repo-local Codex/Claude skills, or inject Guardex-managed `agent:*` helper scripts into `package.json`
+
+#### Scenario: doctor repairs the zero-copy footprint without restoring workflow shims
+
+- **GIVEN** a repo already uses the zero-copy Guardex surface
+- **WHEN** `gx doctor` repairs drift
+- **THEN** it restores the managed AGENTS block, hook shims, lock registry, `.omx/.omc` scaffold, and managed `.gitignore` entries as needed
+- **AND** it does not recreate repo-local workflow command shims, copied workflow implementations, repo-local skills, or Guardex-managed `agent:*` package scripts
+
+### Requirement: Workflow commands run directly from the CLI
+
+The CLI SHALL expose the Guardex workflow directly without requiring repo-local command shims to exist.
+
+#### Scenario: direct CLI workflow commands succeed in a zero-copy repo
+
+- **GIVEN** a repo with the zero-copy Guardex install surface
+- **WHEN** a user runs `gx branch start`, `gx branch finish`, `gx branch merge`, `gx locks claim`, `gx worktree prune`, `gx finish`, or `gx cleanup`
+- **THEN** the command executes using package-owned logic
+- **AND** it does not require `scripts/agent-branch-*.sh`, `scripts/agent-file-locks.py`, `scripts/review-bot-watch.sh`, `scripts/codex-agent.sh`, or `scripts/openspec/*.sh` to exist in the repo
+
+### Requirement: Hook shims remain tiny dispatchers
+
+Installed repo hooks SHALL continue delegating to CLI-owned hook logic instead of embedding guard behavior inline.
+
+#### Scenario: pre-commit hook is a shim
+
+- **GIVEN** `gx setup` installed repo hooks
+- **WHEN** `.githooks/pre-commit` is inspected or executed
+- **THEN** it delegates to `gx hook run pre-commit`
+- **AND** the guarded pre-commit behavior still enforces the same branch and lock rules
+
+### Requirement: Migration removes repo-local workflow shims
+
+The CLI SHALL provide a migration path from the partial CLI-owned surface to the zero-copy surface.
+
+#### Scenario: migrate removes leftover workflow shims
+
+- **GIVEN** a repo still contains Guardex-managed workflow command shims under `scripts/`, copied repo-local skills, or injected `agent:*` package scripts
+- **WHEN** `gx migrate` runs
+- **THEN** it replaces hooks with dispatch shims when needed
+- **AND** it removes the leftover repo-local workflow command shims and managed `agent:*` script injections
+- **AND** it removes repo-local Guardex skill copies when matching user-level installs are present
+- **AND** it leaves the AGENTS block, `.omx/.omc` scaffold, lock registry, and managed `.gitignore` in the zero-copy form
+
+### Requirement: Status and doctor ignore removed workflow shims
+
+Guardex health checks SHALL treat the zero-copy footprint as authoritative.
+
+#### Scenario: status reports healthy without repo-local workflow shims
+
+- **GIVEN** a repo is fully migrated to the zero-copy Guardex surface
+- **WHEN** `gx status --strict` or `gx doctor` inspects the repo
+- **THEN** missing repo-local workflow command shims do not count as drift
+- **AND** health reporting focuses on the managed AGENTS block, hook shims, `.omx/.omc` scaffold, lock registry, and managed `.gitignore`
diff --git a/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md
new file mode 100644
index 0000000..915ab2a
--- /dev/null
+++ b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md
@@ -0,0 +1,31 @@
+## 1. Spec
+
+- [x] 1.1 Capture the zero-copy repo footprint and direct-CLI workflow requirements in `specs/zero-copy-cli-install-surface/spec.md`.
+- [x] 1.2 Record the rationale, migration posture, and doctor simplification target in `proposal.md`.
+
+## 2. Tests
+
+- [x] 2.1 Update install/setup/doctor/status coverage so zero-copy repos stay healthy without any Guardex-managed workflow shims under `scripts/`.
+- [x] 2.2 Add regression coverage proving direct CLI commands (`gx branch ...`, `gx locks ...`, `gx finish`, `gx cleanup`, `gx migrate`) work without repo-local workflow shims present.
+- [x] 2.3 Add migration coverage that removes leftover `scripts/*` command shims while preserving hook shims, AGENTS, `.omx/.omc`, lock registry, and managed `.gitignore`.
+
+## 3. Implementation
+
+- [x] 3.1 Remove repo-local workflow command shims from the managed install/repair footprint (`SCRIPT_SHIMS`, related required-path lists, critical-path lists, and docs/templates).
+- [x] 3.2 Remove CLI runtime checks that still require repo-local workflow shims to exist before `gx finish`, `gx merge`, `gx cleanup`, or related direct commands run.
+- [x] 3.3 Update `gx migrate` cleanup to delete leftover workflow command shims by default and keep only the zero-copy footprint.
+- [x] 3.4 Simplify status/doctor/install output and drift detection so missing repo-local workflow shims no longer trigger repair noise.
+- [x] 3.5 Update README, managed AGENTS guidance, and skill/prompt references to teach `gx ...` as the only workflow command surface.
+
+## 4. Verification
+
+- [x] 4.1 Run `node --check bin/multiagent-safety.js`. Result: passed on `2026-04-22`.
+- [x] 4.2 Run the focused install/migrate/status suite: `node --test test/install.test.js`. Result: passed `135/135` on `2026-04-22`.
+- [x] 4.3 Run `openspec validate agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28 --type change --strict`.
+- [x] 4.4 Run `openspec validate --specs`.
+
+## 5. Cleanup
+
+- [x] 5.1 Reconcile the shipped README/install docs with the zero-copy repo footprint and note any intentional compatibility leftovers. README now documents the hook-only install footprint and notes that `gx migrate` removes leftover workflow shims while the CLI still honors repo-local `scripts/review-bot-watch.sh` / `scripts/codex-agent.sh` during migration.
+- [ ] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`).
+- [ ] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff.
diff --git a/scripts/agent-branch-merge.sh b/scripts/agent-branch-merge.sh
index c8ab622..ac47f6d 100755
--- a/scripts/agent-branch-merge.sh
+++ b/scripts/agent-branch-merge.sh
@@ -6,18 +6,37 @@ BASE_BRANCH_EXPLICIT=0 TARGET_BRANCH="" TASK_NAME="" AGENT_NAME="${GUARDEX_MERGE_AGENT_NAME:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" declare -a SOURCE_BRANCHES=() usage() { cat <<'EOF' -Usage: scripts/agent-branch-merge.sh --branch [--branch ...]
[--into ] [--task ] [--agent ] [--base ] +Usage: gx branch merge --branch [--branch ...] [--into ] [--task ] [--agent ] [--base ] Examples: - bash scripts/agent-branch-merge.sh --branch agent/codex/ui-a --branch agent/codex/ui-b - bash scripts/agent-branch-merge.sh --into agent/codex/owner-lane --branch agent/codex/helper-a --branch agent/codex/helper-b + gx branch merge --branch agent/codex/ui-a --branch agent/codex/ui-b + gx branch merge --into agent/codex/owner-lane --branch agent/codex/helper-a --branch agent/codex/helper-b EOF } +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-merge] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + sanitize_slug() { local raw="$1" local fallback="${2:-merge-agent-branches}" @@ -262,7 +281,7 @@ if [[ -z "$TARGET_BRANCH" ]]; then start_output="" if ! start_output="$( cd "$repo_root" - env GUARDEX_OPENSPEC_AUTO_INIT=1 bash "scripts/agent-branch-start.sh" "$TASK_NAME" "$AGENT_NAME" "$BASE_BRANCH" 2>&1 + GUARDEX_OPENSPEC_AUTO_INIT=1 run_guardex_cli branch start "$TASK_NAME" "$AGENT_NAME" "$BASE_BRANCH" 2>&1 )"; then printf '%s\n' "$start_output" >&2 exit 1 @@ -418,4 +437,4 @@ echo "[agent-branch-merge] Merge sequence complete for '${TARGET_BRANCH}'." if [[ "$target_created" -eq 1 ]]; then echo "[agent-branch-merge] Review and verify in '${target_worktree}', then finish the integration branch when ready." 
fi -echo "[agent-branch-merge] Next step: bash scripts/agent-branch-finish.sh --branch \"${TARGET_BRANCH}\" --base \"${BASE_BRANCH}\" --via-pr --wait-for-merge --cleanup" +echo "[agent-branch-merge] Next step: gx branch finish --branch \"${TARGET_BRANCH}\" --base \"${BASE_BRANCH}\" --via-pr --wait-for-merge --cleanup" diff --git a/scripts/agent-file-locks.py b/scripts/agent-file-locks.py index 06cdd7a..2abaa53 100755 --- a/scripts/agent-file-locks.py +++ b/scripts/agent-file-locks.py @@ -2,11 +2,11 @@ """Per-file lock registry for concurrent agent branches. Usage examples: - python3 scripts/agent-file-locks.py claim --branch agent/a path/to/file1 path/to/file2 - python3 scripts/agent-file-locks.py claim --branch agent/a --allow-delete path/to/obsolete-file - python3 scripts/agent-file-locks.py allow-delete --branch agent/a path/to/obsolete-file - python3 scripts/agent-file-locks.py validate --branch agent/a --staged - python3 scripts/agent-file-locks.py release --branch agent/a + gx locks claim --branch agent/a path/to/file1 path/to/file2 + gx locks claim --branch agent/a --allow-delete path/to/obsolete-file + gx locks allow-delete --branch agent/a path/to/obsolete-file + gx locks validate --branch agent/a --staged + gx locks release --branch agent/a """ from __future__ import annotations @@ -27,9 +27,9 @@ 'AGENTS.md', '.githooks/pre-commit', '.githooks/pre-push', - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-file-locks.py', + '.githooks/post-merge', + '.githooks/post-checkout', + 'scripts/guardex-env.sh', } ALLOW_GUARDRAIL_DELETE_ENV = 'AGENT_ALLOW_GUARDRAIL_DELETE' @@ -326,11 +326,11 @@ def cmd_validate(args: argparse.Namespace, repo_root: Path) -> int: print(f' - {file_path}', file=sys.stderr) print(' Approve explicit deletions with one of:', file=sys.stderr) print( - f' python3 scripts/agent-file-locks.py claim --branch "{args.branch}" --allow-delete ', + f' gx locks claim --branch "{args.branch}" --allow-delete ', 
file=sys.stderr, ) print( - f' python3 scripts/agent-file-locks.py allow-delete --branch "{args.branch}" ', + f' gx locks allow-delete --branch "{args.branch}" ', file=sys.stderr, ) if guardrail_delete_blocked: @@ -343,7 +343,7 @@ def cmd_validate(args: argparse.Namespace, repo_root: Path) -> int: ) print('\nClaim files with:', file=sys.stderr) - print(f' python3 scripts/agent-file-locks.py claim --branch "{args.branch}" ', file=sys.stderr) + print(f' gx locks claim --branch "{args.branch}" ', file=sys.stderr) return 1 diff --git a/scripts/review-bot-watch.sh b/scripts/review-bot-watch.sh index f98d0ef..064a71a 100755 --- a/scripts/review-bot-watch.sh +++ b/scripts/review-bot-watch.sh @@ -9,10 +9,12 @@ BASE_BRANCH="${GUARDEX_REVIEW_BOT_BASE_BRANCH:-}" ONLY_PR="${GUARDEX_REVIEW_BOT_ONLY_PR:-}" RETRY_FAILED_RAW="${GUARDEX_REVIEW_BOT_RETRY_FAILED:-false}" INCLUDE_DRAFT_RAW="${GUARDEX_REVIEW_BOT_INCLUDE_DRAFT:-false}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" usage() { cat <<'USAGE' -Usage: bash scripts/review-bot-watch.sh [options] +Usage: gx review [options] Continuously monitor GitHub pull requests targeting a base branch and dispatch one Codex-agent task per newly opened/updated PR. @@ -34,6 +36,23 @@ Environment overrides: USAGE } +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[review-bot-watch] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + normalize_bool() { local raw="${1:-}" local fallback="${2:-0}" @@ -134,16 +153,20 @@ if ! command -v codex >/dev/null 2>&1; then exit 127 fi -if [[ ! -x "$repo_root/scripts/codex-agent.sh" ]]; then - echo "[review-bot-watch] Missing scripts/codex-agent.sh. Run: gx setup" >&2 - exit 1 -fi - if ! 
gh auth status >/dev/null 2>&1; then echo "[review-bot-watch] gh is not authenticated. Run: gh auth login" >&2 exit 1 fi +run_codex_agent() { + local local_script="$repo_root/scripts/codex-agent.sh" + if [[ -x "$local_script" ]]; then + bash "$local_script" "$@" + return $? + fi + run_guardex_cli internal run-shell codexAgent --target "$repo_root" "$@" +} + sanitize_slug() { local raw="$1" local fallback="$2" @@ -262,7 +285,7 @@ process_one_pr() { echo "[review-bot-watch] Dispatching Codex agent for PR #${pr} (${head_branch})" set +e - bash "$repo_root/scripts/codex-agent.sh" \ + run_codex_agent \ --task "$task_name" \ --agent "$AGENT_NAME" \ --base "$BASE_BRANCH" \ diff --git a/templates/codex/skills/gitguardex/SKILL.md b/templates/codex/skills/gitguardex/SKILL.md index 0d3c6a6..e156736 100644 --- a/templates/codex/skills/gitguardex/SKILL.md +++ b/templates/codex/skills/gitguardex/SKILL.md @@ -8,4 +8,4 @@ Use when repo safety may be broken. `gx status` -> `gx doctor` -> `gx status --strict` Bootstrap: `gx setup` -Ops: `bash scripts/codex-agent.sh "" ""`, `gx finish --all`, `gx cleanup` +Ops: `gx branch start "" ""`, `gx locks claim --branch "" `, `gx branch finish --branch "" --base --via-pr --wait-for-merge --cleanup`, `gx finish --all`, `gx cleanup` diff --git a/templates/codex/skills/guardex-merge-skills-to-dev/SKILL.md b/templates/codex/skills/guardex-merge-skills-to-dev/SKILL.md index 7e2d13e..dc2920f 100644 --- a/templates/codex/skills/guardex-merge-skills-to-dev/SKILL.md +++ b/templates/codex/skills/guardex-merge-skills-to-dev/SKILL.md @@ -24,7 +24,7 @@ echo "$BASE_BRANCH" 2. Start a dedicated integration sandbox from base: ```sh -bash scripts/agent-branch-start.sh "merge-skill-files-to-${BASE_BRANCH}" "skill-merge" "$BASE_BRANCH" +gx branch start "merge-skill-files-to-${BASE_BRANCH}" "skill-merge" "$BASE_BRANCH" ``` 3. Enter the sandbox worktree printed by the command above. 
@@ -48,11 +48,11 @@ git diff --name-only ```sh git add .codex/skills templates/codex/skills git commit -m "Merge skill file updates into ${BASE_BRANCH}" -bash scripts/agent-branch-finish.sh --branch "$(git rev-parse --abbrev-ref HEAD)" --base "$BASE_BRANCH" --via-pr --wait-for-merge --cleanup +gx branch finish --branch "$(git rev-parse --abbrev-ref HEAD)" --base "$BASE_BRANCH" --via-pr --wait-for-merge --cleanup ``` ## Notes - If a source branch has non-skill changes, this runbook keeps them out of the merge. -- If merge conflicts occur, resolve only within the skill files, then rerun `agent-branch-finish.sh`. +- If merge conflicts occur, resolve only within the skill files, then rerun `gx branch finish`. - Do not commit directly on `dev`/`main`; always merge through an agent branch/worktree. diff --git a/templates/githooks/pre-commit b/templates/githooks/pre-commit index 444cb3e..b34c919 100755 --- a/templates/githooks/pre-commit +++ b/templates/githooks/pre-commit @@ -13,6 +13,8 @@ repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" if [[ -z "$repo_root" ]]; then exit 0 fi +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then # shellcheck source=/dev/null @@ -22,6 +24,23 @@ if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! guardex_repo_is_enabl exit 0 fi +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-guard] Guardex CLI entrypoint unavailable; rerun via gx." 
>&2 + return 127 +} + if [[ "${ALLOW_COMMIT_ON_PROTECTED_BRANCH:-0}" == "1" ]]; then exit 0 fi @@ -191,11 +210,11 @@ if [[ "$branch" == agent/* ]]; then while IFS= read -r staged_file; do [[ -z "$staged_file" ]] && continue [[ "$staged_file" == ".omx/state/agent-file-locks.json" ]] && continue - python3 scripts/agent-file-locks.py claim --branch "$branch" "$staged_file" >/dev/null 2>&1 || true + run_guardex_cli locks claim --branch "$branch" "$staged_file" >/dev/null 2>&1 || true done < <(git diff --cached --name-only --diff-filter=ACMRDTUXB) fi - if ! python3 scripts/agent-file-locks.py validate --branch "$branch" --staged; then + if ! run_guardex_cli locks validate --branch "$branch" --staged; then cat >&2 <<'MSG' [agent-branch-guard] Agent branch commits require file ownership locks. Claim files first: diff --git a/templates/scripts/agent-branch-finish.sh b/templates/scripts/agent-branch-finish.sh index fb528e1..fa67866 100755 --- a/templates/scripts/agent-branch-finish.sh +++ b/templates/scripts/agent-branch-finish.sh @@ -9,11 +9,30 @@ DELETE_REMOTE_BRANCH=0 DELETE_REMOTE_BRANCH_EXPLICIT=0 MERGE_MODE="auto" GH_BIN="${GUARDEX_GH_BIN:-gh}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" CLEANUP_AFTER_MERGE_RAW="${GUARDEX_FINISH_CLEANUP:-false}" WAIT_FOR_MERGE_RAW="${GUARDEX_FINISH_WAIT_FOR_MERGE:-false}" WAIT_TIMEOUT_SECONDS_RAW="${GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS:-1800}" WAIT_POLL_SECONDS_RAW="${GUARDEX_FINISH_WAIT_POLL_SECONDS:-10}" +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-finish] Guardex CLI entrypoint unavailable; rerun via gx." 
>&2 + return 127 +} + normalize_bool() { local raw="${1:-}" local fallback="${2:-0}" @@ -431,7 +450,7 @@ run_pr_flow() { if [[ -z "$pr_title" ]]; then pr_title="Merge ${SOURCE_BRANCH} into ${BASE_BRANCH}" fi - pr_body="Automated by scripts/agent-branch-finish.sh (PR flow)." + pr_body="Automated by gx branch finish (PR flow)." "$GH_BIN" pr create \ --base "$BASE_BRANCH" \ @@ -517,9 +536,7 @@ if [[ "$PUSH_ENABLED" -eq 1 ]]; then fi fi -if [[ -x "${repo_root}/scripts/agent-file-locks.py" ]]; then - python3 "${repo_root}/scripts/agent-file-locks.py" release --branch "$SOURCE_BRANCH" >/dev/null 2>&1 || true -fi +run_guardex_cli locks release --branch "$SOURCE_BRANCH" >/dev/null 2>&1 || true base_worktree="$(get_worktree_for_branch "$BASE_BRANCH")" if [[ -n "$base_worktree" ]] && is_clean_worktree "$base_worktree" && [[ "$PUSH_ENABLED" -eq 1 ]]; then @@ -555,29 +572,25 @@ if [[ "$CLEANUP_AFTER_MERGE" -eq 1 ]]; then fi fi - if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then - prune_args=(--base "$BASE_BRANCH" --only-dirty-worktrees --delete-branches) - if [[ "$DELETE_REMOTE_BRANCH" -eq 1 ]]; then - prune_args+=(--delete-remote-branches) - fi - if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" "${prune_args[@]}"; then - echo "[agent-branch-finish] Warning: automatic worktree prune failed." >&2 - echo "[agent-branch-finish] You can run manual cleanup: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches" >&2 - fi + prune_args=(--base "$BASE_BRANCH" --only-dirty-worktrees --delete-branches) + if [[ "$DELETE_REMOTE_BRANCH" -eq 1 ]]; then + prune_args+=(--delete-remote-branches) + fi + if ! run_guardex_cli worktree prune "${prune_args[@]}"; then + echo "[agent-branch-finish] Warning: automatic worktree prune failed." 
>&2 + echo "[agent-branch-finish] You can run manual cleanup: gx cleanup --base ${BASE_BRANCH}" >&2 fi echo "[agent-branch-finish] Merged '${SOURCE_BRANCH}' into '${BASE_BRANCH}' via ${merge_status} flow and cleaned source branch/worktree." if [[ "$source_worktree" == "$current_worktree" && "$source_worktree" == "${agent_worktree_root}"/* ]]; then echo "[agent-branch-finish] Current worktree '${source_worktree}' still exists because it is the active shell cwd." >&2 - echo "[agent-branch-finish] Leave this directory, then run: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches" >&2 + echo "[agent-branch-finish] Leave this directory, then run: gx cleanup --base ${BASE_BRANCH}" >&2 fi else - if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then - if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" --base "$BASE_BRANCH"; then - echo "[agent-branch-finish] Warning: temporary worktree prune failed." >&2 - fi + if ! run_guardex_cli worktree prune --base "$BASE_BRANCH"; then + echo "[agent-branch-finish] Warning: temporary worktree prune failed." >&2 fi echo "[agent-branch-finish] Merged '${SOURCE_BRANCH}' into '${BASE_BRANCH}' via ${merge_status} flow and kept source branch/worktree." 
- echo "[agent-branch-finish] Cleanup later with: bash scripts/agent-worktree-prune.sh --base ${BASE_BRANCH} --delete-branches --delete-remote-branches" + echo "[agent-branch-finish] Cleanup later with: gx cleanup --base ${BASE_BRANCH}" fi diff --git a/templates/scripts/agent-branch-merge.sh b/templates/scripts/agent-branch-merge.sh index c8ab622..ac47f6d 100755 --- a/templates/scripts/agent-branch-merge.sh +++ b/templates/scripts/agent-branch-merge.sh @@ -6,18 +6,37 @@ BASE_BRANCH_EXPLICIT=0 TARGET_BRANCH="" TASK_NAME="" AGENT_NAME="${GUARDEX_MERGE_AGENT_NAME:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" declare -a SOURCE_BRANCHES=() usage() { cat <<'EOF' -Usage: scripts/agent-branch-merge.sh --branch [--branch ...] [--into ] [--task ] [--agent ] [--base ] +Usage: gx branch merge --branch [--branch ...] [--into ] [--task ] [--agent ] [--base ] Examples: - bash scripts/agent-branch-merge.sh --branch agent/codex/ui-a --branch agent/codex/ui-b - bash scripts/agent-branch-merge.sh --into agent/codex/owner-lane --branch agent/codex/helper-a --branch agent/codex/helper-b + gx branch merge --branch agent/codex/ui-a --branch agent/codex/ui-b + gx branch merge --into agent/codex/owner-lane --branch agent/codex/helper-a --branch agent/codex/helper-b EOF } +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-merge] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + sanitize_slug() { local raw="$1" local fallback="${2:-merge-agent-branches}" @@ -262,7 +281,7 @@ if [[ -z "$TARGET_BRANCH" ]]; then start_output="" if ! 
start_output="$( cd "$repo_root" - env GUARDEX_OPENSPEC_AUTO_INIT=1 bash "scripts/agent-branch-start.sh" "$TASK_NAME" "$AGENT_NAME" "$BASE_BRANCH" 2>&1 + GUARDEX_OPENSPEC_AUTO_INIT=1 run_guardex_cli branch start "$TASK_NAME" "$AGENT_NAME" "$BASE_BRANCH" 2>&1 )"; then printf '%s\n' "$start_output" >&2 exit 1 @@ -418,4 +437,4 @@ echo "[agent-branch-merge] Merge sequence complete for '${TARGET_BRANCH}'." if [[ "$target_created" -eq 1 ]]; then echo "[agent-branch-merge] Review and verify in '${target_worktree}', then finish the integration branch when ready." fi -echo "[agent-branch-merge] Next step: bash scripts/agent-branch-finish.sh --branch \"${TARGET_BRANCH}\" --base \"${BASE_BRANCH}\" --via-pr --wait-for-merge --cleanup" +echo "[agent-branch-merge] Next step: gx branch finish --branch \"${TARGET_BRANCH}\" --base \"${BASE_BRANCH}\" --via-pr --wait-for-merge --cleanup" diff --git a/templates/scripts/agent-branch-start.sh b/templates/scripts/agent-branch-start.sh index ef8cc11..c871372 100755 --- a/templates/scripts/agent-branch-start.sh +++ b/templates/scripts/agent-branch-start.sh @@ -7,6 +7,8 @@ BASE_BRANCH="" BASE_BRANCH_EXPLICIT=0 WORKTREE_ROOT_REL="" WORKTREE_ROOT_EXPLICIT=0 +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" OPENSPEC_AUTO_INIT_RAW="${GUARDEX_OPENSPEC_AUTO_INIT:-false}" OPENSPEC_PLAN_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_PLAN_SLUG:-}" OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" @@ -15,6 +17,23 @@ OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" PRINT_NAME_ONLY=0 POSITIONAL_ARGS=() +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-start] Guardex CLI entrypoint unavailable; rerun via gx." 
>&2 + return 127 +} + while [[ $# -gt 0 ]]; do case "$1" in --task) @@ -385,26 +404,14 @@ initialize_openspec_plan_workspace() { local worktree="$2" local plan_slug="$3" - hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-plan-workspace.sh" - if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then return 0 fi - local openspec_script="${worktree}/scripts/openspec/init-plan-workspace.sh" - if [[ ! -f "$openspec_script" ]]; then - echo "[agent-branch-start] OpenSpec init script is missing in sandbox worktree." >&2 - echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local init_output="" if ! init_output="$( cd "$worktree" - bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1 + run_guardex_cli internal run-shell planInit "$plan_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[agent-branch-start] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2 @@ -423,26 +430,14 @@ initialize_openspec_change_workspace() { local change_slug="$3" local capability_slug="$4" - hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-change-workspace.sh" - if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then return 0 fi - local openspec_script="${worktree}/scripts/openspec/init-change-workspace.sh" - if [[ ! -f "$openspec_script" ]]; then - echo "[agent-branch-start] OpenSpec change init script is missing in sandbox worktree." >&2 - echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local init_output="" if ! 
init_output="$( cd "$worktree" - bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1 + run_guardex_cli internal run-shell changeInit "$change_slug" "$capability_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[agent-branch-start] OpenSpec workspace initialization failed for change '${change_slug}'." >&2 @@ -592,7 +587,6 @@ if [[ -n "$auto_transfer_stash_ref" ]]; then fi fi -hydrate_local_helper_in_worktree "$repo_root" "$worktree_path" "scripts/codex-agent.sh" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "node_modules" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/frontend/node_modules" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/backend/node_modules" @@ -609,6 +603,6 @@ echo "[agent-branch-start] OpenSpec change: openspec/changes/${openspec_change_s echo "[agent-branch-start] OpenSpec plan: openspec/plan/${openspec_plan_slug}" echo "[agent-branch-start] Next steps:" echo " cd \"${worktree_path}\"" -echo " python3 scripts/agent-file-locks.py claim --branch \"${branch_name}\" " +echo " gx locks claim --branch \"${branch_name}\" " echo " # implement + commit" -echo " bash scripts/agent-branch-finish.sh --branch \"${branch_name}\" --base ${BASE_BRANCH} --via-pr --wait-for-merge" +echo " gx branch finish --branch \"${branch_name}\" --base ${BASE_BRANCH} --via-pr --wait-for-merge" diff --git a/templates/scripts/agent-file-locks.py b/templates/scripts/agent-file-locks.py index 06cdd7a..2abaa53 100755 --- a/templates/scripts/agent-file-locks.py +++ b/templates/scripts/agent-file-locks.py @@ -2,11 +2,11 @@ """Per-file lock registry for concurrent agent branches. 
Usage examples: - python3 scripts/agent-file-locks.py claim --branch agent/a path/to/file1 path/to/file2 - python3 scripts/agent-file-locks.py claim --branch agent/a --allow-delete path/to/obsolete-file - python3 scripts/agent-file-locks.py allow-delete --branch agent/a path/to/obsolete-file - python3 scripts/agent-file-locks.py validate --branch agent/a --staged - python3 scripts/agent-file-locks.py release --branch agent/a + gx locks claim --branch agent/a path/to/file1 path/to/file2 + gx locks claim --branch agent/a --allow-delete path/to/obsolete-file + gx locks allow-delete --branch agent/a path/to/obsolete-file + gx locks validate --branch agent/a --staged + gx locks release --branch agent/a """ from __future__ import annotations @@ -27,9 +27,9 @@ 'AGENTS.md', '.githooks/pre-commit', '.githooks/pre-push', - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-file-locks.py', + '.githooks/post-merge', + '.githooks/post-checkout', + 'scripts/guardex-env.sh', } ALLOW_GUARDRAIL_DELETE_ENV = 'AGENT_ALLOW_GUARDRAIL_DELETE' @@ -326,11 +326,11 @@ def cmd_validate(args: argparse.Namespace, repo_root: Path) -> int: print(f' - {file_path}', file=sys.stderr) print(' Approve explicit deletions with one of:', file=sys.stderr) print( - f' python3 scripts/agent-file-locks.py claim --branch "{args.branch}" --allow-delete ', + f' gx locks claim --branch "{args.branch}" --allow-delete ', file=sys.stderr, ) print( - f' python3 scripts/agent-file-locks.py allow-delete --branch "{args.branch}" ', + f' gx locks allow-delete --branch "{args.branch}" ', file=sys.stderr, ) if guardrail_delete_blocked: @@ -343,7 +343,7 @@ def cmd_validate(args: argparse.Namespace, repo_root: Path) -> int: ) print('\nClaim files with:', file=sys.stderr) - print(f' python3 scripts/agent-file-locks.py claim --branch "{args.branch}" ', file=sys.stderr) + print(f' gx locks claim --branch "{args.branch}" ', file=sys.stderr) return 1 diff --git 
a/templates/scripts/codex-agent.sh b/templates/scripts/codex-agent.sh index 1829d92..6a05817 100755 --- a/templates/scripts/codex-agent.sh +++ b/templates/scripts/codex-agent.sh @@ -7,6 +7,7 @@ BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -17,6 +18,23 @@ OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" OPENSPEC_CAPABILITY_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CAPABILITY_SLUG:-}" OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[codex-agent] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + normalize_bool() { local raw="${1:-}" local fallback="${2:-0}" @@ -375,11 +393,6 @@ start_sandbox_fallback() { printf '[agent-branch-start] Worktree: %s\n' "$worktree_path" } -if [[ ! -x "${repo_root}/scripts/agent-branch-start.sh" ]]; then - echo "[codex-agent] Missing scripts/agent-branch-start.sh. Run: gx setup" >&2 - exit 1 -fi - start_args=("$TASK_NAME" "$AGENT_NAME") if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then start_args+=("$BASE_BRANCH") @@ -392,7 +405,7 @@ set +e start_output="$( GUARDEX_OPENSPEC_AUTO_INIT="$OPENSPEC_AUTO_INIT" \ GUARDEX_OPENSPEC_MASTERPLAN_LABEL="$OPENSPEC_MASTERPLAN_LABEL_RAW" \ - bash "${repo_root}/scripts/agent-branch-start.sh" "${start_args[@]}" 2>&1 + run_guardex_cli branch start "${start_args[@]}" 2>&1 )" start_status=$? 
 set -e
@@ -529,7 +542,7 @@ print_takeover_prompt() {
     change_artifact="openspec/changes/${change_slug}/"
   fi

-  finish_cmd="bash scripts/agent-branch-finish.sh --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup"
+  finish_cmd="gx branch finish --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup"

   echo "[codex-agent] Takeover sandbox: ${wt}"
   echo "[codex-agent] Takeover prompt: Continue \`${change_slug}\` on branch \`${branch}\`. Work inside \`${wt}\`, review \`${change_artifact}\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`${finish_cmd}\`."
@@ -585,24 +598,12 @@ ensure_openspec_plan_workspace() {
     return 0
   fi

-  hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-plan-workspace.sh"
-
-  local openspec_script="${wt}/scripts/openspec/init-plan-workspace.sh"
-  if [[ ! -f "$openspec_script" ]]; then
-    echo "[codex-agent] Missing OpenSpec init script in sandbox: ${openspec_script}" >&2
-    echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2
-    return 1
-  fi
-  if [[ ! -x "$openspec_script" ]]; then
-    chmod +x "$openspec_script" 2>/dev/null || true
-  fi
-
   local plan_slug
   plan_slug="$(resolve_openspec_plan_slug "$branch")"
   local init_output=""
   if ! init_output="$(
     cd "$wt"
-    bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1
+    run_guardex_cli internal run-shell planInit "$plan_slug" 2>&1
   )"; then
     printf '%s\n' "$init_output" >&2
     echo "[codex-agent] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2
@@ -622,24 +623,12 @@ ensure_openspec_change_workspace() {
     return 0
   fi

-  hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-change-workspace.sh"
-
-  local openspec_script="${wt}/scripts/openspec/init-change-workspace.sh"
-  if [[ ! -f "$openspec_script" ]]; then
-    echo "[codex-agent] Missing OpenSpec change init script in sandbox: ${openspec_script}" >&2
-    echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2
-    return 1
-  fi
-  if [[ ! -x "$openspec_script" ]]; then
-    chmod +x "$openspec_script" 2>/dev/null || true
-  fi
-
   local change_slug capability_slug init_output=""
   change_slug="$(resolve_openspec_change_slug "$branch")"
   capability_slug="$(resolve_openspec_capability_slug)"
   if ! init_output="$(
     cd "$wt"
-    bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1
+    run_guardex_cli internal run-shell changeInit "$change_slug" "$capability_slug" 2>&1
   )"; then
     printf '%s\n' "$init_output" >&2
     echo "[codex-agent] OpenSpec workspace initialization failed for change '${change_slug}'." >&2
@@ -668,11 +657,6 @@ worktree_has_changes() {
 claim_changed_files() {
   local wt="$1"
   local branch="$2"
-  local lock_script="${repo_root}/scripts/agent-file-locks.py"
-
-  if [[ ! -x "$lock_script" ]]; then
-    return 0
-  fi

   local changed_raw deleted_raw
   changed_raw="$({
@@ -683,7 +667,7 @@ claim_changed_files() {

   if [[ -n "$changed_raw" ]]; then
     mapfile -t changed_files < <(printf '%s\n' "$changed_raw")
-    python3 "$lock_script" claim --branch "$branch" "${changed_files[@]}" >/dev/null 2>&1 || true
+    run_guardex_cli locks claim --branch "$branch" "${changed_files[@]}" >/dev/null 2>&1 || true
   fi

   deleted_raw="$({
@@ -693,7 +677,7 @@ claim_changed_files() {

   if [[ -n "$deleted_raw" ]]; then
     mapfile -t deleted_files < <(printf '%s\n' "$deleted_raw")
-    python3 "$lock_script" allow-delete --branch "$branch" "${deleted_files[@]}" >/dev/null 2>&1 || true
+    run_guardex_cli locks allow-delete --branch "$branch" "${deleted_files[@]}" >/dev/null 2>&1 || true
   fi
 }
@@ -842,7 +826,7 @@ run_finish_flow() {
     return 2
   fi

-  if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then
+  if finish_output="$(run_guardex_cli branch finish "${finish_args[@]}" 2>&1)"; then
     printf '%s\n' "$finish_output"
     return 0
   fi
@@ -865,7 +849,7 @@ run_finish_flow() {
     fi
   )

-  if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then
+  if finish_output="$(run_guardex_cli branch finish "${finish_args[@]}" 2>&1)"; then
     printf '%s\n' "$finish_output"
     return 0
   fi
@@ -954,18 +938,16 @@ if [[ "$AUTO_FINISH" -eq 1 && -n "$worktree_branch" && "$worktree_branch" != "HE
   fi
 fi

-if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then
-  echo "[codex-agent] Session ended (exit=${codex_exit}). Running worktree cleanup..."
-  prune_args=()
-  if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then
-    prune_args+=(--base "$BASE_BRANCH")
-  fi
-  if [[ "$AUTO_CLEANUP" -eq 1 && "$auto_finish_completed" -eq 1 ]]; then
-    prune_args+=(--only-dirty-worktrees --delete-branches --delete-remote-branches)
-  fi
-  if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" "${prune_args[@]}"; then
-    echo "[codex-agent] Warning: automatic worktree cleanup failed." >&2
-  fi
+echo "[codex-agent] Session ended (exit=${codex_exit}). Running worktree cleanup..."
+prune_args=()
+if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then
+  prune_args+=(--base "$BASE_BRANCH")
+fi
+if [[ "$AUTO_CLEANUP" -eq 1 && "$auto_finish_completed" -eq 1 ]]; then
+  prune_args+=(--only-dirty-worktrees --delete-branches --delete-remote-branches)
+fi
+if ! run_guardex_cli worktree prune "${prune_args[@]}"; then
+  echo "[codex-agent] Warning: automatic worktree cleanup failed." >&2
 fi

 if [[ ! -d "$worktree_path" ]]; then
@@ -978,7 +960,7 @@ else
     echo "[codex-agent] Branch kept intentionally. Cleanup on demand: gx cleanup --branch \"${worktree_branch}\""
   else
     print_takeover_prompt "$worktree_path" "$worktree_branch"
-    echo "[codex-agent] If finished, merge with: bash scripts/agent-branch-finish.sh --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge"
+    echo "[codex-agent] If finished, merge with: gx branch finish --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge"
     echo "[codex-agent] Cleanup on demand: gx cleanup --branch \"${worktree_branch}\""
   fi
 fi
diff --git a/templates/scripts/review-bot-watch.sh b/templates/scripts/review-bot-watch.sh
index f98d0ef..064a71a 100755
--- a/templates/scripts/review-bot-watch.sh
+++ b/templates/scripts/review-bot-watch.sh
@@ -9,10 +9,12 @@ BASE_BRANCH="${GUARDEX_REVIEW_BOT_BASE_BRANCH:-}"
 ONLY_PR="${GUARDEX_REVIEW_BOT_ONLY_PR:-}"
 RETRY_FAILED_RAW="${GUARDEX_REVIEW_BOT_RETRY_FAILED:-false}"
 INCLUDE_DRAFT_RAW="${GUARDEX_REVIEW_BOT_INCLUDE_DRAFT:-false}"
+NODE_BIN="${GUARDEX_NODE_BIN:-node}"
+CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}"

 usage() {
   cat <<'USAGE'
-Usage: bash scripts/review-bot-watch.sh [options]
+Usage: gx review [options]

 Continuously monitor GitHub pull requests targeting a base branch and
 dispatch one Codex-agent task per newly opened/updated PR.
@@ -34,6 +36,23 @@ Environment overrides:
 USAGE
 }

+run_guardex_cli() {
+  if [[ -n "$CLI_ENTRY" ]]; then
+    "$NODE_BIN" "$CLI_ENTRY" "$@"
+    return $?
+  fi
+  if command -v gx >/dev/null 2>&1; then
+    gx "$@"
+    return $?
+  fi
+  if command -v gitguardex >/dev/null 2>&1; then
+    gitguardex "$@"
+    return $?
+  fi
+  echo "[review-bot-watch] Guardex CLI entrypoint unavailable; rerun via gx." >&2
+  return 127
+}
+
 normalize_bool() {
   local raw="${1:-}"
   local fallback="${2:-0}"
@@ -134,16 +153,20 @@ if ! command -v codex >/dev/null 2>&1; then
   exit 127
 fi

-if [[ ! -x "$repo_root/scripts/codex-agent.sh" ]]; then
-  echo "[review-bot-watch] Missing scripts/codex-agent.sh. Run: gx setup" >&2
-  exit 1
-fi
-
 if ! gh auth status >/dev/null 2>&1; then
   echo "[review-bot-watch] gh is not authenticated. Run: gh auth login" >&2
   exit 1
 fi

+run_codex_agent() {
+  local local_script="$repo_root/scripts/codex-agent.sh"
+  if [[ -x "$local_script" ]]; then
+    bash "$local_script" "$@"
+    return $?
+  fi
+  run_guardex_cli internal run-shell codexAgent --target "$repo_root" "$@"
+}
+
 sanitize_slug() {
   local raw="$1"
   local fallback="$2"
@@ -262,7 +285,7 @@ process_one_pr() {
   echo "[review-bot-watch] Dispatching Codex agent for PR #${pr} (${head_branch})"

   set +e
-  bash "$repo_root/scripts/codex-agent.sh" \
+  run_codex_agent \
     --task "$task_name" \
     --agent "$AGENT_NAME" \
     --base "$BASE_BRANCH" \
diff --git a/test/install.test.js b/test/install.test.js
index 942918a..58e7ab5 100644
--- a/test/install.test.js
+++ b/test/install.test.js
@@ -36,6 +36,42 @@ function runNodeWithEnv(args, cwd, extraEnv) {
   });
 }

+function runBranchStart(args, cwd, extraEnv = {}) {
+  return runNodeWithEnv(['branch', 'start', ...args], cwd, extraEnv);
+}
+
+function runBranchFinish(args, cwd, extraEnv = {}) {
+  return runNodeWithEnv(['branch', 'finish', ...args], cwd, extraEnv);
+}
+
+function runWorktreePrune(args, cwd, extraEnv = {}) {
+  return runNodeWithEnv(['worktree', 'prune', ...args], cwd, extraEnv);
+}
+
+function runLockTool(args, cwd, extraEnv = {}) {
+  return runNodeWithEnv(['locks', ...args], cwd, extraEnv);
+}
+
+function runInternalShell(assetKey, args, cwd, extraEnv = {}) {
+  return runNodeWithEnv(['internal', 'run-shell', assetKey, ...args], cwd, extraEnv);
+}
+
+function runCodexAgent(args, cwd, extraEnv = {}) {
+  return runInternalShell('codexAgent', args, cwd, extraEnv);
+}
+
+function runReviewBot(args, cwd, extraEnv = {}) {
+  return runInternalShell('reviewBot', args, cwd, extraEnv);
+}
+
+function runPlanInit(args, cwd, extraEnv = {}) {
+  return runInternalShell('planInit', args, cwd, extraEnv);
+}
+
+function runChangeInit(args, cwd, extraEnv = {}) {
+  return runInternalShell('changeInit', args, cwd, extraEnv);
+}
+
 function runCmd(cmd, args, cwd, options = {}) {
   const sanitizedEnv = { ...process.env };
   delete sanitizedEnv.CODEX_THREAD_ID;
@@ -65,6 +101,19 @@ function runCmd(cmd, args, cwd, options = {}) {
   });
 }

+function assertZeroCopyManagedGitignore(content) {
+  assert.match(content, /# multiagent-safety:START/);
+  assert.match(content, /^scripts\/agent-session-state\.js$/m);
+  assert.match(content, /^scripts\/guardex-docker-loader\.sh$/m);
+  assert.match(content, /^scripts\/guardex-env\.sh$/m);
+  assert.match(content, /^scripts\/install-vscode-active-agents-extension\.js$/m);
+  assert.doesNotMatch(content, /^scripts\/\*$/m);
+  assert.doesNotMatch(content, /^scripts\/agent-branch-start\.sh$/m);
+  assert.doesNotMatch(content, /^scripts\/agent-file-locks\.py$/m);
+  assert.match(content, /^\.githooks$/m);
+  assert.match(content, /# multiagent-safety:END/);
+}
+
 function createFakeNpmScript(scriptBody) {
   const fakeBin = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-npm-'));
   const fakeNpmPath = path.join(fakeBin, 'npm');
@@ -225,11 +274,7 @@ function prepareDoctorAutoFinishReadyBranch(repoDir, options = {}) {
   result = runCmd('git', ['push', 'origin', baseBranch], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', taskName, agentName, baseBranch],
-    repoDir,
-  );
+  result = runBranchStart([taskName, agentName, baseBranch], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const readyBranch = extractCreatedBranch(result.stdout);
   const readyWorktree = extractCreatedWorktree(result.stdout);
@@ -275,13 +320,8 @@ function commitFile(repoDir, relativePath, contents, message) {
   const currentBranch = runCmd('git', ['branch', '--show-current'], repoDir);
   assert.equal(currentBranch.status, 0, currentBranch.stderr);
   const branchName = currentBranch.stdout.trim();
-  const lockScriptPath = path.join(repoDir, 'scripts', 'agent-file-locks.py');
-  if (branchName.startsWith('agent/') && fs.existsSync(lockScriptPath)) {
-    const claim = runCmd(
-      'python3',
-      ['scripts/agent-file-locks.py', 'claim', '--branch', branchName, relativePath],
-      repoDir,
-    );
+  if (branchName.startsWith('agent/')) {
+    const claim = runLockTool(['claim', '--branch', branchName, relativePath], repoDir);
     assert.equal(claim.status, 0, claim.stderr || claim.stdout);
   }
@@ -433,16 +473,10 @@ test('setup provisions workflow files and repo config', () => {
     '.omc/agent-worktrees',
     '.omx/notepad.md',
     '.omx/project-memory.json',
-    'scripts/agent-branch-start.sh',
-    'scripts/agent-branch-finish.sh',
-    'scripts/codex-agent.sh',
+    'scripts/agent-session-state.js',
     'scripts/guardex-docker-loader.sh',
-    'scripts/review-bot-watch.sh',
-    'scripts/agent-worktree-prune.sh',
-    'scripts/agent-file-locks.py',
     'scripts/guardex-env.sh',
-    'scripts/openspec/init-plan-workspace.sh',
-    'scripts/openspec/init-change-workspace.sh',
+    'scripts/install-vscode-active-agents-extension.js',
     '.githooks/pre-commit',
     '.githooks/pre-push',
     '.githooks/post-merge',
@@ -458,9 +492,20 @@ test('setup provisions workflow files and repo config', () => {
     assert.equal(fs.existsSync(path.join(repoDir, relativePath)), true, `${relativePath} missing`);
   }

-  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
-  assert.match(branchStartShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'branch' 'start' "\$@"/);
-  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+  const removedWorkflowShims = [
+    'scripts/agent-branch-start.sh',
+    'scripts/agent-branch-finish.sh',
+    'scripts/agent-branch-merge.sh',
+    'scripts/codex-agent.sh',
+    'scripts/review-bot-watch.sh',
+    'scripts/agent-worktree-prune.sh',
+    'scripts/agent-file-locks.py',
+    'scripts/openspec/init-plan-workspace.sh',
+    'scripts/openspec/init-change-workspace.sh',
+  ];
+  for (const relativePath of removedWorkflowShims) {
+    assert.equal(fs.existsSync(path.join(repoDir, relativePath)), false, `${relativePath} should not be installed`);
+  }

   const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
   assert.match(preCommitShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'hook' 'run' 'pre-commit' "\$@"/);
@@ -488,9 +533,13 @@ test('setup provisions workflow files and repo config', () => {
   const gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(gitignoreContent, /# multiagent-safety:START/);
-  assert.match(gitignoreContent, /^scripts\/\*$/m);
-  assert.match(gitignoreContent, /^scripts\/agent-branch-start\.sh$/m);
-  assert.match(gitignoreContent, /^scripts\/agent-file-locks\.py$/m);
+  assert.match(gitignoreContent, /^scripts\/agent-session-state\.js$/m);
+  assert.match(gitignoreContent, /^scripts\/guardex-docker-loader\.sh$/m);
+  assert.match(gitignoreContent, /^scripts\/guardex-env\.sh$/m);
+  assert.match(gitignoreContent, /^scripts\/install-vscode-active-agents-extension\.js$/m);
+  assert.doesNotMatch(gitignoreContent, /^scripts\/\*$/m);
+  assert.doesNotMatch(gitignoreContent, /^scripts\/agent-branch-start\.sh$/m);
+  assert.doesNotMatch(gitignoreContent, /^scripts\/agent-file-locks\.py$/m);
   assert.match(gitignoreContent, /^\.githooks$/m);
   assert.doesNotMatch(gitignoreContent, /^\.githooks\/pre-commit$/m);
   assert.match(gitignoreContent, /\.omx\//);
@@ -591,10 +640,7 @@ test('setup and doctor explain .githooks file conflicts and still write managed
   assert.match(combined, /\.githooks\/pre-commit needs it to be a directory/);

   let gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
-  assert.match(gitignoreContent, /# multiagent-safety:START/);
-  assert.match(gitignoreContent, /scripts\/agent-branch-start\.sh/);
-  assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
-  assert.match(gitignoreContent, /^\.githooks$/m);
+  assertZeroCopyManagedGitignore(gitignoreContent);

   result = runNode(['doctor', '--target', repoDir], repoDir);
   assert.notEqual(result.status, 0, 'doctor should fail when .githooks is a file');
@@ -602,7 +648,7 @@ test('setup and doctor explain .githooks file conflicts and still write managed
   assert.match(combined, /Path conflict: \.githooks exists as a file/);

   gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
-  assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
+  assertZeroCopyManagedGitignore(gitignoreContent);
 });

 test('setup and doctor skip repo bootstrap when repo .env disables Guardex', () => {
@@ -832,8 +878,7 @@ test('migrate removes legacy copied assets and installs user-level skills on req
   assert.equal(fs.existsSync(path.join(guardexHomeDir, '.codex', 'skills', 'gitguardex', 'SKILL.md')), true);
   assert.equal(fs.existsSync(path.join(guardexHomeDir, '.claude', 'commands', 'gitguardex.md')), true);

-  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
-  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh')), false);
   const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
   assert.match(preCommitShim, /exec "\$cli_bin" 'hook' 'run' 'pre-commit' "\$@"/);
 });
@@ -940,8 +985,8 @@ test('init aliases setup and provisions workflow files', () => {
   const result = runNode(['init', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh')), true);
-  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh')), true);
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'guardex-env.sh')), true);
+  assert.equal(fs.existsSync(path.join(repoDir, '.githooks', 'pre-commit')), true);
   assert.equal(fs.existsSync(path.join(repoDir, 'AGENTS.md')), true);
 });
@@ -973,9 +1018,9 @@ test('setup recursively installs into nested git repos, skipping node_modules/wo
   for (const repo of [topDir, nestedA, nestedB]) {
     assert.equal(fs.existsSync(path.join(repo, 'AGENTS.md')), true, `AGENTS.md missing in ${repo}`);
     assert.equal(
-      fs.existsSync(path.join(repo, 'scripts', 'agent-branch-start.sh')),
+      fs.existsSync(path.join(repo, 'scripts', 'guardex-env.sh')),
       true,
-      `agent-branch-start.sh missing in ${repo}`,
+      `guardex-env.sh missing in ${repo}`,
     );
     assert.equal(
       fs.existsSync(path.join(repo, '.githooks', 'pre-commit')),
@@ -1025,13 +1070,13 @@ test('setup --no-recursive limits install to the top-level repo', () => {
   );
 });

-test('review-bot-watch script prints help after setup', () => {
+test('review bot helper prints help after setup', () => {
   const repoDir = initRepo();

   const setupResult = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(setupResult.status, 0, setupResult.stderr || setupResult.stdout);

-  const helpResult = runCmd('bash', ['scripts/review-bot-watch.sh', '--help'], repoDir);
+  const helpResult = runReviewBot(['--help'], repoDir);
   assert.equal(helpResult.status, 0, helpResult.stderr || helpResult.stdout);
   assert.match(helpResult.stdout, /Continuously monitor GitHub pull requests targeting a base branch/);
 });
@@ -1052,7 +1097,11 @@ test('setup refreshes initialized protected main through a sandbox and prunes it
   assert.equal(result.status, 0, result.stderr || result.stdout);

   const initialGitignore = fs.readFileSync(gitignorePath, 'utf8');
-  fs.writeFileSync(gitignorePath, initialGitignore.replace(/^scripts\/\*\n/m, ''), 'utf8');
+  fs.writeFileSync(
+    gitignorePath,
+    initialGitignore.replace(/^scripts\/agent-session-state\.js\n/m, ''),
+    'utf8',
+  );

   result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
@@ -1072,7 +1121,7 @@ test('setup refreshes initialized protected main through a sandbox and prunes it
   assert.equal(sandboxBranchCheck.stdout.trim(), '', 'setup sandbox branch should be pruned');

   const refreshedGitignore = fs.readFileSync(gitignorePath, 'utf8');
-  assert.match(refreshedGitignore, /^scripts\/\*$/m);
+  assert.match(refreshedGitignore, /^scripts\/agent-session-state\.js$/m);
 });

 test('setup allows explicit protected-main override for in-place maintenance', () => {
@@ -1132,19 +1181,14 @@ test('doctor on protected main auto-runs in a sandbox branch/worktree', () => {
   result = runCmd('git', ['push', 'origin', 'main'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  fs.rmSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh'));
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh')), false);

   result = runNode(['doctor', '--target', repoDir], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.match(result.stdout, /doctor detected protected branch 'main'/);
   const createdBranch = extractCreatedBranch(result.stdout);
-  const createdWorktree = extractCreatedWorktree(result.stdout);
   assert.match(createdBranch, /^agent\/gx\/.+-gx-doctor$/);
-  assert.equal(
-    fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh')),
-    true,
-    'protected main checkout should regain finish script',
-  );
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh')), false);

   const rootStatus = runCmd('git', ['status', '--short', '--untracked-files=no'], repoDir);
   assert.equal(rootStatus.status, 0, rootStatus.stderr || rootStatus.stdout);
@@ -1269,9 +1313,9 @@ test('doctor on protected main bootstraps sandbox branch even before setup exist
   const createdWorktree = extractCreatedWorktree(result.stdout);
   assert.match(createdBranch, /^agent\/gx\/.+-gx-doctor$/);
   assert.equal(
-    fs.existsSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh')),
+    fs.existsSync(path.join(repoDir, 'scripts', 'guardex-env.sh')),
     true,
-    'protected main checkout should regain agent-branch-start.sh',
+    'protected main checkout should regain zero-copy managed scripts',
   );
   assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'state')), true);
   assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'logs')), true);
@@ -1352,8 +1396,7 @@ exit 1
     'protected main checkout should stay untouched while sandbox finish flow delivers the repair',
   );
   const repairedRootGitignore = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
-  assert.match(repairedRootGitignore, /^scripts\/\*$/m);
-  assert.match(repairedRootGitignore, /^\.githooks$/m);
+  assertZeroCopyManagedGitignore(repairedRootGitignore);

   const createdBranch = extractCreatedBranch(result.stdout);
   result = runCmd('git', ['show-ref', '--verify', '--quiet', `refs/heads/${createdBranch}`], repoDir);
@@ -1745,12 +1788,12 @@ test('setup agent-branch-start rejects in-place flags to keep local checkout unc
   seedCommit(repoDir);

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'demo', 'bot', 'dev', '--in-place'], repoDir);
+  result = runBranchStart(['demo', 'bot', 'dev', '--in-place'], repoDir);
   assert.notEqual(result.status, 0, result.stdout);
   assert.match(result.stderr, /In-place branch mode is disabled/);
   assert.match(result.stderr, /always creates an isolated worktree/);

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'demo', 'bot', 'dev', '--allow-in-place'], repoDir);
+  result = runBranchStart(['demo', 'bot', 'dev', '--allow-in-place'], repoDir);
   assert.notEqual(result.status, 0, result.stdout);
   assert.match(result.stderr, /In-place branch mode is disabled/);
 });
@@ -1774,17 +1817,10 @@ cat <<'OUT'
 OUT
 `);

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', 'restore-snapshot', 'planner', 'dev'],
-    repoDir,
-    {
-      env: {
-        PATH: `${fakeBin}:${process.env.PATH || ''}`,
-        GUARDEX_AGENT_TYPE: 'planner',
-      },
-    },
-  );
+  result = runBranchStart(['restore-snapshot', 'planner', 'dev'], repoDir, {
+    PATH: `${fakeBin}:${process.env.PATH || ''}`,
+    GUARDEX_AGENT_TYPE: 'planner',
+  });
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.match(
     result.stdout,
@@ -1801,12 +1837,10 @@ test('setup agent-branch-start ignores GUARDEX_CODEX_AUTH_SNAPSHOT for branch na
   assert.equal(result.status, 0, result.stderr || result.stdout);
   seedCommit(repoDir);

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', 'ship-fix', 'bot', 'dev'],
-    repoDir,
-    { env: { GUARDEX_CODEX_AUTH_SNAPSHOT: 'Prod Snapshot One', CLAUDECODE: '0' } },
-  );
+  result = runBranchStart(['ship-fix', 'bot', 'dev'], repoDir, {
+    GUARDEX_CODEX_AUTH_SNAPSHOT: 'Prod Snapshot One',
+    CLAUDECODE: '0',
+  });
   assert.equal(result.status, 0, result.stderr || result.stdout);
   // 'bot' has no claude/codex substring and no CLAUDECODE sentinel → role falls back to 'codex'.
   assert.match(
@@ -1824,16 +1858,14 @@ test('setup agent-branch-start keeps role-datetime branch labels compact (v7.0.3
   assert.equal(result.status, 0, result.stderr || result.stdout);
   seedCommit(repoDir);

-  result = runCmd(
-    'bash',
+  result = runBranchStart(
     [
-      'scripts/agent-branch-start.sh',
       'rust-layer-phase7-dashboard-read-name-columns-and-badges',
       'codex-admin-recodee-com',
       'dev',
     ],
     repoDir,
-    { env: { GUARDEX_CODEX_AUTH_SNAPSHOT: 'Zeus Portasmosonmagyarovar Hu Snapshot' } },
+    { GUARDEX_CODEX_AUTH_SNAPSHOT: 'Zeus Portasmosonmagyarovar Hu Snapshot' },
   );
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const createdBranch = extractCreatedBranch(result.stdout);
@@ -1853,17 +1885,10 @@ test('setup agent-branch-start routes Claude sessions into .omc worktrees and st
   assert.equal(result.status, 0, result.stderr || result.stdout);
   seedCommit(repoDir);

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', 'claude-session-task', 'bot', 'dev'],
-    repoDir,
-    {
-      env: {
-        CLAUDECODE: '1',
-        GUARDEX_AGENT_TYPE: 'planner',
-      },
-    },
-  );
+  result = runBranchStart(['claude-session-task', 'bot', 'dev'], repoDir, {
+    CLAUDECODE: '1',
+    GUARDEX_AGENT_TYPE: 'planner',
+  });
   assert.equal(result.status, 0, result.stderr || result.stdout);

   const createdBranch = extractCreatedBranch(result.stdout);
@@ -1896,12 +1921,9 @@ test('setup agent-branch-start supports optional OpenSpec auto-bootstrap toggles
   assert.equal(result.status, 0, result.stderr || result.stdout);
   seedCommit(repoDir);

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', 'openspec-default', 'bot', 'dev'],
-    repoDir,
-    { env: { GUARDEX_OPENSPEC_AUTO_INIT: 'true' } },
-  );
+  result = runBranchStart(['openspec-default', 'bot', 'dev'], repoDir, {
+    GUARDEX_OPENSPEC_AUTO_INIT: 'true',
+  });
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const defaultBranch = extractCreatedBranch(result.stdout);
   const defaultWorktree = extractCreatedWorktree(result.stdout);
@@ -1940,12 +1962,9 @@ test('setup agent-branch-start supports optional OpenSpec auto-bootstrap toggles
     'default branch start should scaffold OpenSpec change spec',
   );

-  result = runCmd(
-    'bash',
-    ['scripts/agent-branch-start.sh', 'openspec-disabled', 'bot', 'dev'],
-    repoDir,
-    { env: { GUARDEX_OPENSPEC_AUTO_INIT: 'false' } },
-  );
+  result = runBranchStart(['openspec-disabled', 'bot', 'dev'], repoDir, {
+    GUARDEX_OPENSPEC_AUTO_INIT: 'false',
+  });
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const disabledWorktree = extractCreatedWorktree(result.stdout);
   const disabledPlanSlug = extractOpenSpecPlanSlug(result.stdout);
@@ -1979,7 +1998,7 @@ test('setup agent-branch-start defaults base to current branch, stores base meta
   result = runCmd('git', ['push', 'origin', 'main'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'auto-base', 'bot'], repoDir);
+  result = runBranchStart(['auto-base', 'bot'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.doesNotMatch(`${result.stdout}\n${result.stderr}`, /set up to track/i);
   const agentBranch = extractCreatedBranch(result.stdout);
@@ -2027,7 +2046,7 @@ test('agent-branch-start prefers current protected branch over stale configured
   fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, 'utf8');
   fs.writeFileSync(path.join(repoDir, 'dev-untracked.txt'), 'dev untracked change\n', 'utf8');

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'prefer-dev', 'bot'], repoDir);
+  result = runBranchStart(['prefer-dev', 'bot'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.match(result.stdout, /Moved local changes from 'dev' into 'agent\/codex\//);
@@ -2075,7 +2094,7 @@ test('agent-branch-start moves protected-branch local changes into the new agent
   fs.writeFileSync(packageJsonPath, `${JSON.stringify(packageJson, null, 2)}\n`, 'utf8');
   fs.writeFileSync(path.join(repoDir, 'scratch-note.txt'), 'untracked change\n', 'utf8');

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'move-readme', 'bot'], repoDir);
+  result = runBranchStart(['move-readme', 'bot'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const agentWorktree = extractCreatedWorktree(result.stdout);
   assert.match(result.stdout, /Moved local changes from 'main' into 'agent\/codex\//);
@@ -2092,7 +2111,7 @@ test('agent-branch-start moves protected-branch local changes into the new agent
   assert.doesNotMatch(stashList.stdout, /guardex-auto-transfer-/);
 });

-test('agent-branch-start hydrates codex-agent helper into new worktrees when missing locally', () => {
+test('agent-branch-start leaves removed workflow helpers out of new worktrees', () => {
   const repoDir = initRepo();
   seedCommit(repoDir);
@@ -2107,17 +2126,15 @@ test('agent-branch-start hydrates codex-agent helper into new worktrees when mis
   assert.equal(result.status, 0, result.stderr || result.stdout);

   const localCodexAgent = path.join(repoDir, 'scripts', 'codex-agent.sh');
-  assert.equal(fs.existsSync(localCodexAgent), true, 'setup should provision local codex-agent helper');
+  assert.equal(fs.existsSync(localCodexAgent), false, 'zero-copy setup should not provision local codex-agent helper');

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'hydrate-codex', 'bot'], repoDir);
+  result = runBranchStart(['hydrate-codex', 'bot'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
-  assert.match(result.stdout, /Hydrated local helper in worktree: scripts\/codex-agent\.sh/);
+  assert.doesNotMatch(result.stdout, /Hydrated local helper in worktree: scripts\/codex-agent\.sh/);

   const createdWorktree = extractCreatedWorktree(result.stdout);
   const worktreeCodexAgent = path.join(createdWorktree, 'scripts', 'codex-agent.sh');
-  assert.equal(fs.existsSync(worktreeCodexAgent), true, 'worktree should receive codex-agent helper');
-  const mode = fs.statSync(worktreeCodexAgent).mode;
-  assert.equal((mode & 0o111) !== 0, true, 'hydrated codex-agent helper should be executable');
+  assert.equal(fs.existsSync(worktreeCodexAgent), false, 'worktree should stay zero-copy for codex-agent helper');
 });

 test('agent-branch-start links dependency node_modules directories into new worktrees when present', () => {
@@ -2144,7 +2161,7 @@ test('agent-branch-start links dependency node_modules directories into new work
     fs.writeFileSync(path.join(sourceDir, '.guardex-link-marker'), 'present\n', 'utf8');
   }

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'hydrate-deps', 'bot'], repoDir, {
+  result = runBranchStart(['hydrate-deps', 'bot'], repoDir, {
     GUARDEX_PROTECTED_BRANCHES: 'main',
   });
   assert.equal(result.status, 0, result.stderr || result.stdout);
@@ -2183,9 +2200,7 @@ test('agent-branch-finish handles Claude-root worktrees when inferring base from
   result = runCmd('git', ['push', 'origin', 'main'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'finish-from-dev', 'bot'], repoDir, {
-    env: { CLAUDECODE: '1' },
-  });
+  result = runBranchStart(['finish-from-dev', 'bot'], repoDir, { CLAUDECODE: '1' });
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const agentBranch = extractCreatedBranch(result.stdout);
   const agentWorktree = extractCreatedWorktree(result.stdout);
@@ -2199,7 +2214,7 @@ test('agent-branch-finish handles Claude-root worktrees when inferring base from
   result = runCmd('git', ['worktree', 'add', auxWorktree, 'main'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  const finish = runCmd('bash', ['scripts/agent-branch-finish.sh', '--branch', agentBranch], repoDir);
+  const finish = runBranchFinish(['--branch', agentBranch], repoDir);
   assert.equal(finish.status, 0, finish.stderr || finish.stdout);
   assert.match(finish.stdout, new RegExp(`Merged '${escapeRegexLiteral(agentBranch)}' into 'main'`));
@@ -2267,15 +2282,32 @@ test('review command launches local review-bot script and accepts legacy start t
   assert.equal(fs.readFileSync(markerArgs, 'utf8').trim(), '--interval 45 --once');
 });

-test('review command explains setup + doctor steps when script is missing in target repo', () => {
+test('review command falls back to the package review bot when the repo has no local helper', () => {
   const repoDir = initRepo();
-
-  const result = runNode(['review', '--target', repoDir], repoDir);
-  assert.equal(result.status, 1, result.stderr || result.stdout);
-  assert.match(
-    result.stderr,
-    new RegExp(`Run 'gx setup --target ${escapeRegexLiteral(repoDir)}' then 'gx doctor --target ${escapeRegexLiteral(repoDir)}'`),
+  seedCommit(repoDir);
+  const { fakeBin: fakeGhBin } = createFakeGhScript(
+    'if [[ "$1" == "auth" && "$2" == "status" ]]; then\n' +
+      '  exit 0\n' +
+      'fi\n' +
+      'if [[ "$1" == "pr" && "$2" == "list" ]]; then\n' +
+      '  exit 0\n' +
+      'fi\n' +
+      'echo "unexpected gh args: $*" >&2\n' +
+      'exit 1\n',
   );
+  const fakeCodexBin = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-fake-codex-review-'));
+  const fakeCodexPath = path.join(fakeCodexBin, 'codex');
+  fs.writeFileSync(fakeCodexPath, '#!/usr/bin/env bash\nset -e\nexit 0\n', 'utf8');
+  fs.chmodSync(fakeCodexPath, 0o755);
+
+  const result = runNodeWithEnv(['review', '--target', repoDir, '--once'], repoDir, {
+    PATH: `${fakeGhBin}:${fakeCodexBin}:${process.env.PATH}`,
+  });
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'review-bot-watch.sh')), false);
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'codex-agent.sh')), false);
+  assert.match(result.stdout, /\[review-bot-watch\] Starting monitor/);
+  assert.match(result.stdout, /\[review-bot-watch\] No open PRs for base 'dev'\./);
 });

 test('agents command starts review+cleanup bots for the target repo and stops them', () => {
@@ -2442,7 +2474,7 @@ test('finish command auto-commits dirty agent worktree and runs PR finish flow f
   result = runCmd('git', ['push', 'origin', 'main'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'finish-all', 'bot'], repoDir);
+  result = runBranchStart(['finish-all', 'bot'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   const agentBranch = extractCreatedBranch(result.stdout);
   const agentWorktree = extractCreatedWorktree(result.stdout);
@@ -2809,10 +2841,7 @@ test('setup appends managed gitignore block without clobbering existing entries'
   const first = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(first, /node_modules\//);
-  assert.match(first, /# multiagent-safety:START/);
-  assert.match(first, /^scripts\/\*$/m);
-  assert.match(first, /^\.githooks$/m);
-  assert.match(first, /# multiagent-safety:END/);
+  assertZeroCopyManagedGitignore(first);

   result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
@@ -3133,7 +3162,7 @@ test('repo .env GUARDEX_ON=false disables bootstrap scripts and git hook enforce
   fs.writeFileSync(path.join(repoDir, '.env'), 'GUARDEX_ON=false\n', 'utf8');

-  result = runCmd('bash', ['scripts/agent-branch-start.sh', 'disabled-toggle', 'bot', 'dev'], repoDir);
+  result = runBranchStart(['disabled-toggle', 'bot', 'dev'], repoDir);
   assert.notEqual(result.status, 0, result.stderr || result.stdout);
   assert.match(result.stderr, /Guardex is disabled for this repo/);
@@ -3244,16 +3273,11 @@ test('codex-agent launches codex inside a fresh sandbox worktree and keeps branc
   const cwdMarker = path.join(repoDir, '.codex-agent-cwd');
   const argsMarker = path.join(repoDir, '.codex-agent-args');

-  const launch = runCmd(
-    'bash',
-    ['scripts/codex-agent.sh', 'launch-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'],
-    repoDir,
-    {
-      PATH: `${fakeBin}:${process.env.PATH}`,
-      GUARDEX_TEST_CODEX_CWD: cwdMarker,
-      GUARDEX_TEST_CODEX_ARGS: argsMarker,
-    },
-  );
+  const launch = runCodexAgent(['launch-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'], repoDir, {
+    PATH: `${fakeBin}:${process.env.PATH}`,
+    GUARDEX_TEST_CODEX_CWD: cwdMarker,
+    GUARDEX_TEST_CODEX_ARGS: argsMarker,
+  });
   assert.equal(launch.status, 0, launch.stderr || launch.stdout);
   assert.match(launch.stdout, /\[codex-agent\] Launching codex in sandbox:/);
   assert.match(launch.stdout, /\[codex-agent\] Session ended \(exit=0\)\. Running worktree cleanup\.\.\./);
@@ -3300,7 +3324,7 @@ test('codex-agent launches codex inside a fresh sandbox worktree and keeps branc
   );
 });

-test('codex-agent restores local branch and falls back to safe worktree start when starter script switches in-place', () => {
+test('codex-agent ignores stale repo-local starter shims and keeps the visible checkout stable', () => {
   const repoDir = initRepo();
   seedCommit(repoDir);
   attachOriginRemote(repoDir);
@@ -3338,21 +3362,16 @@
   const cwdMarker = path.join(repoDir, '.codex-agent-cwd-fallback');
   const argsMarker = path.join(repoDir, '.codex-agent-args-fallback');

-  const launch = runCmd(
-    'bash',
-    ['scripts/codex-agent.sh', 'fallback-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'],
-    repoDir,
-    {
-      PATH: `${fakeBin}:${process.env.PATH}`,
-      GUARDEX_TEST_CODEX_CWD: cwdMarker,
-      GUARDEX_TEST_CODEX_ARGS: argsMarker,
-    },
-  );
+  const launch = runCodexAgent(['fallback-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'], repoDir, {
+    PATH: `${fakeBin}:${process.env.PATH}`,
+    GUARDEX_TEST_CODEX_CWD: cwdMarker,
+    GUARDEX_TEST_CODEX_ARGS: argsMarker,
+  });
   assert.equal(launch.status, 0, launch.stderr || launch.stdout);
   const combinedOutput = `${launch.stdout}\n${launch.stderr}`;
-  assert.match(combinedOutput, /Unsafe starter output/);
   assert.match(combinedOutput, /\[agent-branch-start\] Created branch: agent\/planner\//);
   assert.match(combinedOutput, /\[codex-agent\] Auto-finish skipped.*no mergeable remote context/);
+  assert.doesNotMatch(combinedOutput, /Unsafe starter output/);

   const launchedCwd = fs.readFileSync(cwdMarker, 'utf8').trim();
   assert.match(
@@ -3414,18 +3433,8 @@ test('codex-agent supports --codex-bin override before positional arguments', ()
   const cwdMarker = path.join(repoDir, '.codex-agent-cwd-override');
   const argsMarker = path.join(repoDir, '.codex-agent-args-override');

-  const launch = runCmd(
-    'bash',
-    [
-      'scripts/codex-agent.sh',
-      '--codex-bin',
-      fakeCodexPath,
-      'launch-task',
-      'planner',
-      'dev',
-      '--model',
-      'gpt-5.4-mini',
-    ],
+  const launch = runCodexAgent(
+    ['--codex-bin', fakeCodexPath, 'launch-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'],
     repoDir,
     {
       GUARDEX_TEST_CODEX_CWD: cwdMarker,
@@ -3467,16 +3476,11 @@ test('codex-agent keeps dirty sandbox worktrees after session exit', () => {
   const cwdMarker = path.join(repoDir, '.codex-agent-cwd-dirty');
   const argsMarker = path.join(repoDir, '.codex-agent-args-dirty');

-  const launch = runCmd(
-    'bash',
-    ['scripts/codex-agent.sh', 'dirty-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'],
-    repoDir,
-    {
-      PATH: `${fakeBin}:${process.env.PATH}`,
-      GUARDEX_TEST_CODEX_CWD: cwdMarker,
-      GUARDEX_TEST_CODEX_ARGS: argsMarker,
-    },
-  );
+  const launch = runCodexAgent(['dirty-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'], repoDir, {
+    PATH: `${fakeBin}:${process.env.PATH}`,
+    GUARDEX_TEST_CODEX_CWD: cwdMarker,
+    GUARDEX_TEST_CODEX_ARGS: argsMarker,
+  });
   assert.equal(launch.status, 0, launch.stderr || launch.stdout);
   assert.match(launch.stdout, /\[agent-worktree-prune\] Summary: .*removed_worktrees=0/);
   assert.match(launch.stdout, /\[codex-agent\] Sandbox worktree kept:/);
@@ -3547,20 +3551,15 @@ exit 1
   const cwdMarker = path.join(repoDir, '.codex-agent-cwd-autofinish');
   const argsMarker = path.join(repoDir, '.codex-agent-args-autofinish');

-  const launch = runCmd(
-    'bash',
-    ['scripts/codex-agent.sh', 'autofinish-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'],
-    repoDir,
-    {
-      PATH: `${fakeCodexBin}:${process.env.PATH}`,
-      GUARDEX_TEST_CODEX_CWD: cwdMarker,
-      GUARDEX_TEST_CODEX_ARGS: argsMarker,
-      GUARDEX_TEST_GH_MERGE_STATE: ghMergeState,
-      GUARDEX_GH_BIN: fakeGhPath,
-      GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '60',
-      GUARDEX_FINISH_WAIT_POLL_SECONDS: '0',
-    },
-  );
+  const launch = runCodexAgent(['autofinish-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'], repoDir, {
+    PATH:
`${fakeCodexBin}:${process.env.PATH}`, + GUARDEX_TEST_CODEX_CWD: cwdMarker, + GUARDEX_TEST_CODEX_ARGS: argsMarker, + GUARDEX_TEST_GH_MERGE_STATE: ghMergeState, + GUARDEX_GH_BIN: fakeGhPath, + GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '60', + GUARDEX_FINISH_WAIT_POLL_SECONDS: '0', + }); assert.equal(launch.status, 0, launch.stderr || launch.stdout); const combinedOutput = `${launch.stdout}\n${launch.stderr}`; assert.match(combinedOutput, /\[codex-agent\] Auto-finish enabled: commit -> push\/PR -> wait for merge -> cleanup\./); @@ -3605,15 +3604,10 @@ test('codex-agent prints a takeover prompt when the sandbox is kept after an inc fs.chmodSync(fakeCodexPath, 0o755); const cwdMarker = path.join(repoDir, '.codex-agent-cwd-takeover'); - const launch = runCmd( - 'bash', - ['scripts/codex-agent.sh', 'usage-limit-task', 'planner', 'dev'], - repoDir, - { - PATH: `${fakeCodexBin}:${process.env.PATH}`, - GUARDEX_TEST_CODEX_CWD: cwdMarker, - }, - ); + const launch = runCodexAgent(['usage-limit-task', 'planner', 'dev'], repoDir, { + PATH: `${fakeCodexBin}:${process.env.PATH}`, + GUARDEX_TEST_CODEX_CWD: cwdMarker, + }); assert.equal(launch.status, 42, launch.stderr || launch.stdout); const combinedOutput = `${launch.stdout}\n${launch.stderr}`; @@ -3632,7 +3626,7 @@ test('codex-agent prints a takeover prompt when the sandbox is kept after an inc ); assert.match( combinedOutput, - new RegExp(`agent-branch-finish\\.sh --branch "${escapeRegexLiteral(launchedBranch)}" --base dev --via-pr --wait-for-merge --cleanup`), + new RegExp(`gx branch finish --branch "${escapeRegexLiteral(launchedBranch)}" --base dev --via-pr --wait-for-merge --cleanup`), ); }); @@ -3706,21 +3700,16 @@ exit 1 const cwdMarker = path.join(repoDir, '.codex-agent-cwd-autocommit-retry'); const argsMarker = path.join(repoDir, '.codex-agent-args-autocommit-retry'); const originAdvanceClone = path.join(repoDir, '.origin-advance-clone'); - const launch = runCmd( - 'bash', - ['scripts/codex-agent.sh', 'autocommit-retry-task', 
'planner', 'dev', '--model', 'gpt-5.4-mini'], - repoDir, - { - PATH: `${fakeCodexBin}:${process.env.PATH}`, - GUARDEX_TEST_CODEX_CWD: cwdMarker, - GUARDEX_TEST_CODEX_ARGS: argsMarker, - GUARDEX_TEST_ORIGIN_PATH: originPath, - GUARDEX_TEST_ORIGIN_ADVANCE_CLONE: originAdvanceClone, - GUARDEX_GH_BIN: fakeGhPath, - GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '60', - GUARDEX_FINISH_WAIT_POLL_SECONDS: '0', - }, - ); + const launch = runCodexAgent(['autocommit-retry-task', 'planner', 'dev', '--model', 'gpt-5.4-mini'], repoDir, { + PATH: `${fakeCodexBin}:${process.env.PATH}`, + GUARDEX_TEST_CODEX_CWD: cwdMarker, + GUARDEX_TEST_CODEX_ARGS: argsMarker, + GUARDEX_TEST_ORIGIN_PATH: originPath, + GUARDEX_TEST_ORIGIN_ADVANCE_CLONE: originAdvanceClone, + GUARDEX_GH_BIN: fakeGhPath, + GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '60', + GUARDEX_FINISH_WAIT_POLL_SECONDS: '0', + }); assert.equal(launch.status, 0, launch.stderr || launch.stdout); const combinedOutput = `${launch.stdout}\n${launch.stderr}`; assert.match(combinedOutput, /\[codex-agent\] Auto-committed sandbox changes on 'agent\/planner\/autocommit-retry-task-/); @@ -3770,18 +3759,13 @@ echo "unexpected gh args: $*" >&2 exit 1 `); - const launch = runCmd( - 'bash', - ['scripts/codex-agent.sh', 'hook-fail-task', 'planner', 'dev'], - repoDir, - { - PATH: `${fakeCodexBin}:${process.env.PATH}`, - GUARDEX_CODEX_WAIT_FOR_MERGE: 'false', - GUARDEX_GH_BIN: fakeGhPath, - GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '30', - GUARDEX_FINISH_WAIT_POLL_SECONDS: '0', - }, - ); + const launch = runCodexAgent(['hook-fail-task', 'planner', 'dev'], repoDir, { + PATH: `${fakeCodexBin}:${process.env.PATH}`, + GUARDEX_CODEX_WAIT_FOR_MERGE: 'false', + GUARDEX_GH_BIN: fakeGhPath, + GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS: '30', + GUARDEX_FINISH_WAIT_POLL_SECONDS: '0', + }); assert.notEqual(launch.status, 0, launch.stderr || launch.stdout); assert.match(launch.stderr, /Auto-commit failed in sandbox/); assert.match(launch.stderr, /forced pre-commit failure for test/); @@ 
-3866,11 +3850,7 @@ test('pre-commit sync gate blocks agent commits when branch is too far behind ba assert.equal(result.status, 0, result.stderr); fs.writeFileSync(path.join(repoDir, 'agent-blocked.txt'), 'blocked\n'); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'claim', '--branch', 'agent/test-behind-gate', 'agent-blocked.txt'], - repoDir, - ); + result = runLockTool(['claim', '--branch', 'agent/test-behind-gate', 'agent-blocked.txt'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); result = runCmd('git', ['add', 'agent-blocked.txt'], repoDir); assert.equal(result.status, 0, result.stderr); @@ -3914,11 +3894,7 @@ test('pre-commit sync gate honors maxBehindCommits threshold', () => { assert.equal(result.status, 0, result.stderr); fs.writeFileSync(path.join(repoDir, 'agent-allowed.txt'), 'allowed\n'); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'claim', '--branch', 'agent/test-behind-threshold', 'agent-allowed.txt'], - repoDir, - ); + result = runLockTool(['claim', '--branch', 'agent/test-behind-threshold', 'agent-allowed.txt'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); result = runCmd('git', ['add', 'agent-allowed.txt'], repoDir); assert.equal(result.status, 0, result.stderr); @@ -3956,11 +3932,7 @@ test('agent-branch-finish auto-syncs source branch when behind origin/dev', () = result = runCmd('git', ['checkout', 'agent/test-finish-sync-guard'], repoDir); assert.equal(result.status, 0, result.stderr); - const finish = runCmd( - 'bash', - ['scripts/agent-branch-finish.sh', '--branch', 'agent/test-finish-sync-guard'], - repoDir, - ); + const finish = runBranchFinish(['--branch', 'agent/test-finish-sync-guard'], repoDir); assert.equal(finish.status, 0, finish.stderr || finish.stdout); assert.match(finish.stderr, /agent-sync-guard/); assert.match(finish.stderr, /Auto-syncing 'agent\/test-finish-sync-guard' onto origin\/dev before finish/); @@ -4015,9 +3987,8 @@ echo 
"unexpected gh args: $*" >&2 exit 1 `); - const finish = runCmd( - 'bash', - ['scripts/agent-branch-finish.sh', '--branch', 'agent/test-pr-delete-error', '--mode', 'pr', '--cleanup'], + const finish = runBranchFinish( + ['--branch', 'agent/test-pr-delete-error', '--mode', 'pr', '--cleanup'], repoDir, { GUARDEX_GH_BIN: fakeGhPath }, ); @@ -4093,18 +4064,8 @@ echo "unexpected gh args: $*" >&2 exit 1 `); - const finish = runCmd( - 'bash', - [ - path.join(repoDir, 'scripts', 'agent-branch-finish.sh'), - '--branch', - 'agent/test-active-worktree-cleanup', - '--base', - 'dev', - '--mode', - 'pr', - '--cleanup', - ], + const finish = runBranchFinish( + ['--branch', 'agent/test-active-worktree-cleanup', '--base', 'dev', '--mode', 'pr', '--cleanup'], agentWorktreePath, { GUARDEX_GH_BIN: fakeGhPath }, ); @@ -4187,10 +4148,8 @@ echo "unexpected gh args: $*" >&2 exit 1 `); - const finish = runCmd( - 'bash', + const finish = runBranchFinish( [ - 'scripts/agent-branch-finish.sh', '--branch', 'agent/test-pr-wait-merge', '--mode', @@ -4228,11 +4187,7 @@ test('OpenSpec plan workspace scaffold creates expected role/task structure', () assert.equal(setupResult.status, 0, setupResult.stderr || setupResult.stdout); const planSlug = 'plan-workspace-smoke'; - const scaffold = runCmd( - 'bash', - ['scripts/openspec/init-plan-workspace.sh', planSlug], - repoDir, - ); + const scaffold = runPlanInit([planSlug], repoDir); assert.equal(scaffold.status, 0, scaffold.stderr || scaffold.stdout); const planDir = path.join(repoDir, 'openspec', 'plan', planSlug); @@ -4298,11 +4253,7 @@ test('OpenSpec change workspace scaffold creates proposal/tasks/spec defaults', const changeSlug = 'change-workspace-smoke'; const capabilitySlug = 'runtime-migration'; - const scaffold = runCmd( - 'bash', - ['scripts/openspec/init-change-workspace.sh', changeSlug, capabilitySlug], - repoDir, - ); + const scaffold = runChangeInit([changeSlug, capabilitySlug], repoDir); assert.equal(scaffold.status, 0, scaffold.stderr 
|| scaffold.stdout); const changeDir = path.join(repoDir, 'openspec', 'changes', changeSlug); @@ -4334,12 +4285,9 @@ test('OpenSpec change workspace scaffold supports minimal T1 notes mode', () => const changeSlug = 'change-workspace-minimal'; const capabilitySlug = 'runtime-migration'; const agentBranch = 'agent/codex/minimal-change'; - const scaffold = runCmd( - 'bash', - ['scripts/openspec/init-change-workspace.sh', changeSlug, capabilitySlug, agentBranch], - repoDir, - { GUARDEX_OPENSPEC_MINIMAL: '1' }, - ); + const scaffold = runChangeInit([changeSlug, capabilitySlug, agentBranch], repoDir, { + GUARDEX_OPENSPEC_MINIMAL: '1', + }); assert.equal(scaffold.status, 0, scaffold.stderr || scaffold.stdout); const changeDir = path.join(repoDir, 'openspec', 'changes', changeSlug); @@ -4374,37 +4322,21 @@ test('validate blocks unapproved deletions until allow-delete is set', () => { result = runCmd('git', ['commit', '-m', 'seed'], repoDir); assert.equal(result.status, 0, result.stderr); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'claim', '--branch', 'agent/test', 'src/logic.txt'], - repoDir, - ); + result = runLockTool(['claim', '--branch', 'agent/test', 'src/logic.txt'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); fs.unlinkSync(featureFile); result = runCmd('git', ['add', '-A'], repoDir); assert.equal(result.status, 0, result.stderr); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'validate', '--branch', 'agent/test', '--staged'], - repoDir, - ); + result = runLockTool(['validate', '--branch', 'agent/test', '--staged'], repoDir); assert.equal(result.status, 1, 'deletion should be blocked without allow-delete'); assert.match(result.stderr, /Delete not approved/); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'allow-delete', '--branch', 'agent/test', 'src/logic.txt'], - repoDir, - ); + result = runLockTool(['allow-delete', '--branch', 'agent/test', 'src/logic.txt'], repoDir); 
assert.equal(result.status, 0, result.stderr || result.stdout); - result = runCmd( - 'python3', - ['scripts/agent-file-locks.py', 'validate', '--branch', 'agent/test', '--staged'], - repoDir, - ); + result = runLockTool(['validate', '--branch', 'agent/test', '--staged'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); }); @@ -4415,7 +4347,7 @@ test('fix repairs stale lock issues so scan becomes clean', () => { assert.equal(result.status, 0, result.stderr || result.stdout); // Simulate broken state - fs.rmSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh')); + fs.rmSync(path.join(repoDir, 'scripts', 'guardex-env.sh')); result = runCmd('git', ['config', 'core.hooksPath', '.git/hooks'], repoDir); assert.equal(result.status, 0, result.stderr); @@ -4454,7 +4386,7 @@ test('doctor repairs setup drift and confirms repo is safe', () => { assert.equal(result.status, 0, result.stderr || result.stdout); // Simulate broken setup + stale lock. - fs.rmSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh')); + fs.rmSync(path.join(repoDir, 'scripts', 'guardex-env.sh')); fs.rmSync(path.join(repoDir, '.omx', 'notepad.md')); fs.rmSync(path.join(repoDir, '.omx', 'project-memory.json')); fs.rmSync(path.join(repoDir, '.omx', 'logs'), { recursive: true, force: true }); @@ -4512,16 +4444,15 @@ test('doctor recurses into nested frontend repos and repairs protected-main drif assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(path.join(frontendDir, 'AGENTS.md')), true, 'nested frontend should be bootstrapped by setup'); const initialFrontendGitignore = fs.readFileSync(frontendGitignorePath, 'utf8'); - assert.match(initialFrontendGitignore, /^scripts\/\*$/m); - assert.match(initialFrontendGitignore, /^\.githooks$/m); + assertZeroCopyManagedGitignore(initialFrontendGitignore); fs.rmSync(path.join(frontendDir, 'AGENTS.md')); - fs.rmSync(path.join(frontendDir, 'scripts', 'agent-branch-start.sh')); + 
fs.rmSync(path.join(frontendDir, 'scripts', 'guardex-env.sh')); fs.rmSync(path.join(frontendDir, '.githooks', 'pre-commit')); fs.writeFileSync( frontendGitignorePath, initialFrontendGitignore - .replace(/^scripts\/\*\n/m, '') + .replace(/^scripts\/guardex-env\.sh\n/m, '') .replace(/^\.githooks\n/m, ''), 'utf8', ); @@ -4536,13 +4467,12 @@ test('doctor recurses into nested frontend repos and repairs protected-main drif assert.equal(fs.existsSync(path.join(frontendDir, 'AGENTS.md')), true, 'nested frontend AGENTS.md should be restored'); assert.equal( - fs.existsSync(path.join(frontendDir, 'scripts', 'agent-branch-start.sh')), + fs.existsSync(path.join(frontendDir, 'scripts', 'guardex-env.sh')), true, - 'nested frontend sandbox starter should be restored', + 'nested frontend zero-copy managed script should be restored', ); const repairedFrontendGitignore = fs.readFileSync(frontendGitignorePath, 'utf8'); - assert.match(repairedFrontendGitignore, /^scripts\/\*$/m); - assert.match(repairedFrontendGitignore, /^\.githooks$/m); + assertZeroCopyManagedGitignore(repairedFrontendGitignore); const repairedFrontendHook = fs.readFileSync(path.join(frontendDir, '.githooks', 'pre-commit'), 'utf8'); assert.match(repairedFrontendHook, /'hook' 'run' 'pre-commit'/); @@ -4579,12 +4509,12 @@ test('recursive doctor forwards no-wait-for-merge to protected nested sandbox re assert.equal(result.status, 0, result.stderr || result.stdout); fs.rmSync(path.join(frontendDir, 'AGENTS.md')); - fs.rmSync(path.join(frontendDir, 'scripts', 'agent-branch-start.sh')); + fs.rmSync(path.join(frontendDir, 'scripts', 'guardex-env.sh')); fs.rmSync(path.join(frontendDir, '.githooks', 'pre-commit')); fs.writeFileSync( frontendGitignorePath, initialFrontendGitignore - .replace(/^scripts\/\*\n/m, '') + .replace(/^scripts\/guardex-env\.sh\n/m, '') .replace(/^\.githooks\n/m, ''), 'utf8', ); @@ -4930,13 +4860,13 @@ test('worktree prune keeps merged agent worktrees/branches unless delete flags a 
assert.equal(result.status, 0, result.stderr); assert.equal(fs.existsSync(worktreePath), true); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh'], repoDir); + result = runWorktreePrune([], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); const branchResult = runCmd('git', ['show-ref', '--verify', '--quiet', 'refs/heads/agent/test-prune'], repoDir); assert.equal(branchResult.status, 0, 'merged agent branch should remain by default'); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh', '--delete-branches'], repoDir); + result = runWorktreePrune(['--delete-branches'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), false); const branchAfterDelete = runCmd('git', ['show-ref', '--verify', '--quiet', 'refs/heads/agent/test-prune'], repoDir); @@ -4955,11 +4885,11 @@ test('worktree prune preserves dirty agent worktrees unless --force-dirty is use fs.writeFileSync(path.join(worktreePath, 'dirty.txt'), 'dirty\n', 'utf8'); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh', '--delete-branches'], repoDir); + result = runWorktreePrune(['--delete-branches'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), true, 'dirty worktree should remain without --force-dirty'); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh', '--force-dirty', '--delete-branches'], repoDir); + result = runWorktreePrune(['--force-dirty', '--delete-branches'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), false, 'dirty worktree should be removable with --force-dirty'); }); @@ -4980,7 +4910,7 @@ test('worktree prune --only-dirty-worktrees removes clean agent worktrees but ke result = runCmd('git', ['-C', worktreePath, 'commit', '-m', 'unmerged clean worktree commit'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); 
- result = runCmd('bash', ['scripts/agent-worktree-prune.sh', '--only-dirty-worktrees'], repoDir); + result = runWorktreePrune(['--only-dirty-worktrees'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), false, 'clean agent worktree should be removed'); @@ -5006,7 +4936,7 @@ test('worktree prune reroutes foreign worktrees to the owning repo .omx root', ( assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(misplacedPath), true, 'foreign worktree should start misplaced under current repo'); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh'], repoDir); + result = runWorktreePrune([], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.match(result.stdout, /Relocating foreign worktree to owning repo/); assert.equal(fs.existsSync(misplacedPath), false, 'misplaced foreign worktree should be moved out'); @@ -5039,19 +4969,14 @@ test('worktree prune --idle-minutes preserves recent branch activity and prunes result = runCmd('git', ['-C', worktreePath, 'commit', '-m', 'idle threshold branch commit'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); - result = runCmd('bash', ['scripts/agent-worktree-prune.sh', '--only-dirty-worktrees', '--idle-minutes', '10'], repoDir); + result = runWorktreePrune(['--only-dirty-worktrees', '--idle-minutes', '10'], repoDir); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), true, 'recent branch should remain inside idle threshold'); const fakeNowEpoch = Math.floor(Date.now() / 1000) + 3600; - result = runCmd( - 'bash', - ['scripts/agent-worktree-prune.sh', '--only-dirty-worktrees', '--idle-minutes', '10'], - repoDir, - { - GUARDEX_PRUNE_NOW_EPOCH: String(fakeNowEpoch), - }, - ); + result = runWorktreePrune(['--only-dirty-worktrees', '--idle-minutes', '10'], repoDir, { + GUARDEX_PRUNE_NOW_EPOCH: String(fakeNowEpoch), + 
}); assert.equal(result.status, 0, result.stderr || result.stdout); assert.equal(fs.existsSync(worktreePath), false, 'idle branch should be pruned after threshold is exceeded'); }); From db38432abada2bbba15309172046ace04216ec06 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 09:54:56 +0200 Subject: [PATCH 40/48] Record merged zero-copy closeout evidence (#272) The zero-copy install-surface change already merged via PR #271, but the follow-up OpenSpec tasks file still showed cleanup unchecked. This commit marks cleanup complete and records the merged PR, commit, and local cleanup evidence so the change log matches repo state. Constraint: OpenSpec cleanup bookkeeping must match merged branch state before the change is considered closed Rejected: Leave tasks.md stale | hides verified merge completion and keeps false cleanup debt Confidence: high Scope-risk: narrow Reversibility: clean Directive: Use evidence-only follow-up branches only to repair stale bookkeeping; do not reopen implementation scope without a new change Tested: openspec validate agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28 --type change --strict; openspec validate --specs Not-tested: No runtime or CLI behavior changed Co-authored-by: NagyVikt --- .../tasks.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md index 915ab2a..17c80c4 100644 --- a/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md +++ b/openspec/changes/agent-codex-zero-copy-cli-install-surface-2026-04-22-01-28/tasks.md @@ -27,5 +27,7 @@ ## 5. Cleanup - [x] 5.1 Reconcile the shipped README/install docs with the zero-copy repo footprint and note any intentional compatibility leftovers. 
README now documents the hook-only install footprint and notes that `gx migrate` removes leftover workflow shims while the CLI still honors repo-local `scripts/review-bot-watch.sh` / `scripts/codex-agent.sh` during migration. -- [ ] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). -- [ ] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff. +- [x] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). +- [x] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff. + +Completion note: PR #271 (`https://github.com/recodeee/gitguardex/pull/271`) reached `MERGED` at `2026-04-22T07:16:58Z` with merge commit `7c5bd067ec2376464b82caf20dadf04938448a82` (`7c5bd06`). In the nested `gitguardex` repo, `git branch -a --list '*zero-copy-cli-install-surface-2026-04-22-01-28*'` returns no remaining source branch, and `git worktree list` shows no leftover zero-copy task worktree. From 2f6d564a2159f302cae52f9be7e7ea80a445b652 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 10:07:13 +0200 Subject: [PATCH 41/48] Allow targeted managed-file force rewrites during setup and doctor (#273) Setup/doctor/install/fix now treat managed relative paths that follow --force as targeted rewrite selectors instead of unknown args. The overwrite-conflict message now teaches both targeted and full-surface recovery, and the install regressions lock doctor/setup behavior for the review-bot shim and cr.yml template. 
Constraint: Existing bare --force must keep whole-surface rewrite behavior Rejected: Add a separate single-file repair command | heavier UX than extending the existing recovery path Confidence: high Scope-risk: moderate Directive: Keep targeted selectors limited to explicit managed paths with dedicated rewrite helpers; do not widen them to arbitrary repo files Tested: node --check bin/multiagent-safety.js; node --test --test-name-pattern "(doctor --force |setup --force |setup conflict message teaches)" test/install.test.js; openspec validate agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58 --type change --strict; openspec validate --specs Not-tested: PR merge/cleanup flow on origin/main Co-authored-by: NagyVikt --- bin/multiagent-safety.js | 163 ++++++++++++++++-- .../proposal.md | 24 +++ .../specs/doctor-setup-force-targets/spec.md | 32 ++++ .../tasks.md | 29 ++++ test/install.test.js | 63 +++++++ 5 files changed, 292 insertions(+), 19 deletions(-) create mode 100644 openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/proposal.md create mode 100644 openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/specs/doctor-setup-force-targets/spec.md create mode 100644 openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/tasks.md diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index 6f57748..2d1e140 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -109,18 +109,20 @@ const TEMPLATE_FILES = [ 'vscode/guardex-active-agents/README.md', ]; -const LEGACY_WORKFLOW_SHIMS = [ - 'scripts/agent-branch-start.sh', - 'scripts/agent-branch-finish.sh', - 'scripts/agent-branch-merge.sh', - 'scripts/codex-agent.sh', - 'scripts/review-bot-watch.sh', - 'scripts/agent-worktree-prune.sh', - 'scripts/agent-file-locks.py', - 'scripts/openspec/init-plan-workspace.sh', - 'scripts/openspec/init-change-workspace.sh', +const LEGACY_WORKFLOW_SHIM_SPECS = [ + { 
relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, + { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, + { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, + { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, + { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, + { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, + { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, + { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, + { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'changeInit'] }, ]; +const LEGACY_WORKFLOW_SHIMS = LEGACY_WORKFLOW_SHIM_SPECS.map((entry) => entry.relativePath); + const MANAGED_TEMPLATE_DESTINATIONS = TEMPLATE_FILES.map((entry) => toDestinationPath(entry)); const MANAGED_TEMPLATE_SCRIPT_FILES = MANAGED_TEMPLATE_DESTINATIONS.filter((entry) => entry.startsWith('scripts/'), @@ -248,6 +250,13 @@ const OMX_SCAFFOLD_FILES = new Map([ ['.omx/notepad.md', '\n\n## WORKING MEMORY\n'], ['.omx/project-memory.json', '{}\n'], ]); +const TARGETED_FORCEABLE_MANAGED_PATHS = new Set([ + 'AGENTS.md', + '.gitignore', + ...Array.from(OMX_SCAFFOLD_FILES.keys()), + ...REQUIRED_MANAGED_REPO_FILES, + ...LEGACY_WORKFLOW_SHIMS, +]); const COMMAND_TYPO_ALIASES = new Map([ ['relaese', 'release'], ['realaese', 'release'], @@ -1037,6 +1046,13 @@ function renderPythonDispatchShim(commandParts) { ); } +function managedForceConflictMessage(relativePath) { + return ( + `Refusing to overwrite existing file without --force: ${relativePath}\n` + + `Use '--force ${relativePath}' to rewrite only this managed file, or 
'--force' to rewrite all managed files.` + ); +} + function renderManagedFile(repoRoot, relativePath, content, options = {}) { const destinationPath = path.join(repoRoot, relativePath); const destinationExists = fs.existsSync(destinationPath); @@ -1050,7 +1066,7 @@ function renderManagedFile(repoRoot, relativePath, content, options = {}) { return { status: 'unchanged', file: relativePath }; } if (!force && !isCriticalGuardrailPath(relativePath)) { - throw new Error(`Refusing to overwrite existing file without --force: ${relativePath}`); + throw new Error(managedForceConflictMessage(relativePath)); } } @@ -1098,9 +1114,7 @@ function copyTemplateFile(repoRoot, relativeTemplatePath, force, dryRun) { return { status: 'unchanged', file: destinationRelativePath }; } if (!force && !isCriticalGuardrailPath(destinationRelativePath)) { - throw new Error( - `Refusing to overwrite existing file without --force: ${destinationRelativePath}`, - ); + throw new Error(managedForceConflictMessage(destinationRelativePath)); } } @@ -1151,6 +1165,22 @@ function ensureTemplateFilePresent(repoRoot, relativeTemplatePath, dryRun) { return { status: 'created', file: destinationRelativePath }; } +function ensureTargetedLegacyWorkflowShims(repoRoot, options) { + const targetedPaths = Array.isArray(options.forceManagedPaths) ? 
options.forceManagedPaths : []; + if (targetedPaths.length === 0) { + return []; + } + + const operations = []; + for (const shim of LEGACY_WORKFLOW_SHIM_SPECS) { + if (!shouldForceManagedPath(options, shim.relativePath)) { + continue; + } + operations.push(ensureGeneratedScriptShim(repoRoot, shim, { dryRun: options.dryRun, force: true })); + } + return operations; +} + function lockFilePath(repoRoot) { return path.join(repoRoot, LOCK_FILE_RELATIVE); } @@ -1457,8 +1487,65 @@ function requireValue(rawArgs, index, flagName) { return value; } +function normalizeManagedForcePath(rawPath) { + if (typeof rawPath !== 'string') { + return null; + } + const normalized = path.posix.normalize(rawPath.replace(/\\/g, '/')); + if (!normalized || normalized === '.' || normalized.startsWith('../') || path.posix.isAbsolute(normalized)) { + return null; + } + return normalized.startsWith('./') ? normalized.slice(2) : normalized; +} + +function collectForceManagedPaths(rawArgs, startIndex) { + const forceManagedPaths = []; + let nextIndex = startIndex; + + while (nextIndex + 1 < rawArgs.length) { + const candidate = rawArgs[nextIndex + 1]; + if (!candidate || candidate.startsWith('-')) { + break; + } + const normalized = normalizeManagedForcePath(candidate); + if (!normalized || !TARGETED_FORCEABLE_MANAGED_PATHS.has(normalized)) { + throw new Error(`Unknown managed path after --force: ${candidate}`); + } + forceManagedPaths.push(normalized); + nextIndex += 1; + } + + return { forceManagedPaths, nextIndex }; +} + +function appendForceArgs(args, options) { + if (!options.force) { + return; + } + args.push('--force'); + for (const managedPath of options.forceManagedPaths || []) { + args.push(managedPath); + } +} + +function shouldForceManagedPath(options, relativePath) { + if (!options.force) { + return false; + } + const targetedPaths = Array.isArray(options.forceManagedPaths) ? 
options.forceManagedPaths : []; + if (targetedPaths.length === 0) { + return true; + } + const normalized = normalizeManagedForcePath(relativePath); + return normalized !== null && targetedPaths.includes(normalized); +} + function parseCommonArgs(rawArgs, defaults) { const options = { ...defaults }; + const supportsForce = Object.prototype.hasOwnProperty.call(options, 'force'); + if (supportsForce && !Array.isArray(options.forceManagedPaths)) { + options.forceManagedPaths = []; + } for (let index = 0; index < rawArgs.length; index += 1) { const arg = rawArgs[index]; @@ -1480,7 +1567,17 @@ function parseCommonArgs(rawArgs, defaults) { continue; } if (arg === '--force') { + if (!supportsForce) { + throw new Error(`Unknown option: ${arg}`); + } options.force = true; + const parsed = collectForceManagedPaths(rawArgs, index); + if (parsed.forceManagedPaths.length > 0) { + options.forceManagedPaths = Array.from( + new Set([...(options.forceManagedPaths || []), ...parsed.forceManagedPaths]), + ); + } + index = parsed.nextIndex; continue; } if (arg === '--keep-stale-locks') { @@ -1598,6 +1695,7 @@ function parseSetupArgs(rawArgs, defaults) { function parseDoctorArgs(rawArgs) { const doctorDefaults = { target: process.cwd(), + force: false, dropStaleLocks: true, skipAgents: false, skipPackageJson: false, @@ -1746,6 +1844,7 @@ function runSetupBootstrapInternal(options) { target: installPayload.repoRoot, dryRun: options.dryRun, force: options.force, + forceManagedPaths: options.forceManagedPaths, dropStaleLocks: true, skipAgents: options.skipAgents, skipPackageJson: options.skipPackageJson, @@ -1783,7 +1882,7 @@ function resolveSandboxTarget(repoRoot, worktreePath, targetPath) { function buildSandboxSetupArgs(options, sandboxTarget) { const args = ['setup', '--target', sandboxTarget, '--no-global-install', '--no-recursive']; - if (options.force) args.push('--force'); + appendForceArgs(args, options); if (options.skipAgents) args.push('--skip-agents'); if 
(options.skipPackageJson) args.push('--skip-package-json'); if (options.skipGitignore) args.push('--no-gitignore'); @@ -1794,7 +1893,7 @@ function buildSandboxSetupArgs(options, sandboxTarget) { function buildSandboxDoctorArgs(options, sandboxTarget) { const args = ['doctor', '--target', sandboxTarget]; if (options.dryRun) args.push('--dry-run'); - if (options.force) args.push('--force'); + appendForceArgs(args, options); if (options.skipAgents) args.push('--skip-agents'); if (options.skipPackageJson) args.push('--skip-package-json'); if (options.skipGitignore) args.push('--no-gitignore'); @@ -5154,10 +5253,24 @@ function runInstallInternal(options) { operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun))); for (const templateFile of TEMPLATE_FILES) { - operations.push(copyTemplateFile(repoRoot, templateFile, Boolean(options.force), Boolean(options.dryRun))); + operations.push( + copyTemplateFile( + repoRoot, + templateFile, + shouldForceManagedPath(options, toDestinationPath(templateFile)), + Boolean(options.dryRun), + ), + ); } + operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options)); for (const hookName of HOOK_NAMES) { - operations.push(ensureHookShim(repoRoot, hookName, options)); + const hookRelativePath = path.posix.join('.githooks', hookName); + operations.push( + ensureHookShim(repoRoot, hookName, { + dryRun: options.dryRun, + force: shouldForceManagedPath(options, hookRelativePath), + }), + ); } operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun))); @@ -5198,10 +5311,21 @@ function runFixInternal(options) { operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun))); for (const templateFile of TEMPLATE_FILES) { + if (shouldForceManagedPath(options, toDestinationPath(templateFile))) { + operations.push(copyTemplateFile(repoRoot, templateFile, true, Boolean(options.dryRun))); + continue; + } operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun))); } + 
operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options)); for (const hookName of HOOK_NAMES) { - operations.push(ensureHookShim(repoRoot, hookName, options)); + const hookRelativePath = path.posix.join('.githooks', hookName); + operations.push( + ensureHookShim(repoRoot, hookName, { + dryRun: options.dryRun, + force: shouldForceManagedPath(options, hookRelativePath), + }), + ); } operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun))); @@ -5715,6 +5839,7 @@ function doctor(rawArgs) { '--single-repo', '--target', repoPath, + ...(options.force ? ['--force', ...(options.forceManagedPaths || [])] : []), ...(options.dropStaleLocks ? [] : ['--keep-stale-locks']), ...(options.skipAgents ? ['--skip-agents'] : []), ...(options.skipPackageJson ? ['--skip-package-json'] : []), diff --git a/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/proposal.md b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/proposal.md new file mode 100644 index 0000000..5cbb074 --- /dev/null +++ b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/proposal.md @@ -0,0 +1,24 @@ +## Why + +- `gx doctor` currently surfaces managed-file conflicts like `scripts/review-bot-watch.sh`, but the recovery hint is incomplete: users can reasonably try `gx doctor --force scripts/review-bot-watch.sh` and hit `Unknown option`. +- `gx setup` has the same gap for managed template conflicts like `.github/workflows/cr.yml`. +- The CLI already distinguishes managed files from repo-owned package scripts and AGENTS content, so the remaining missing piece is a safe, explicit way to force only the named managed path instead of all managed files. + +## What Changes + +- Allow `gx setup`, `gx doctor`, and the shared repair/install aliases to accept managed relative paths after `--force`. +- Keep plain `--force` as the whole-surface rewrite path. 
+- Update managed-file conflict errors to explain both recovery options: + - `--force <path>` to rewrite only that file + - `--force` to rewrite all managed files + - Add install regressions for targeted doctor/setup force-path recovery. + +## Scope + +- `bin/multiagent-safety.js` +- `test/install.test.js` + +## Risks + +- Path matching must stay relative and deterministic so targeted force rewrites only the named managed file. +- The parser change must not accidentally relax other commands into accepting stray positional arguments. diff --git a/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/specs/doctor-setup-force-targets/spec.md b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/specs/doctor-setup-force-targets/spec.md new file mode 100644 index 0000000..4da395b --- /dev/null +++ b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/specs/doctor-setup-force-targets/spec.md @@ -0,0 +1,32 @@ +## ADDED Requirements + +### Requirement: setup and doctor accept targeted managed-file force paths + +`gx setup` and `gx doctor` SHALL accept one or more managed relative paths after `--force` so users can repair only the named managed files instead of rewriting the entire managed surface.
+ +#### Scenario: doctor rewrites one named managed shim + +- **GIVEN** a repo has a conflicting managed `scripts/review-bot-watch.sh` +- **WHEN** the user runs `gx doctor --force scripts/review-bot-watch.sh` +- **THEN** the command succeeds +- **AND** `scripts/review-bot-watch.sh` is rewritten to the current managed shim +- **AND** the path selector is not treated as an unknown option + +#### Scenario: setup rewrites one named managed template + +- **GIVEN** a repo has a conflicting managed `.github/workflows/cr.yml` +- **WHEN** the user runs `gx setup --force .github/workflows/cr.yml` +- **THEN** the command succeeds +- **AND** `.github/workflows/cr.yml` is rewritten to the current managed template + +### Requirement: conflict output teaches targeted and global force recovery + +When a managed file conflict blocks `gx setup` or `gx doctor`, the CLI SHALL tell the user how to recover with either a targeted `--force <path>` or a full-surface `--force`. + +#### Scenario: conflict message names both force paths + +- **GIVEN** a managed file differs from the current Guardex output +- **WHEN** `gx setup` or `gx doctor` hits that conflict without `--force` +- **THEN** the error names the conflicting managed path +- **AND** the error teaches `--force <path>` for one-file recovery +- **AND** the error teaches plain `--force` for rewriting all managed files diff --git a/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/tasks.md b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/tasks.md new file mode 100644 index 0000000..23e3d20 --- /dev/null +++ b/openspec/changes/agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58/tasks.md @@ -0,0 +1,29 @@ +## 1. Spec + +- [x] 1.1 Define the targeted managed-file `--force` behavior in `specs/doctor-setup-force-targets/spec.md`. +- [x] 1.2 Capture the recovery UX problem and bounded scope in `proposal.md`. + +## 2.
Tests + +- [x] 2.1 Add a regression that `gx doctor --force scripts/review-bot-watch.sh` rewrites the named managed shim instead of throwing `Unknown option`. +- [x] 2.2 Add a regression that `gx setup --force .github/workflows/cr.yml` rewrites the named managed template. +- [x] 2.3 Lock the conflict message so it teaches both targeted `--force <path>` and global `--force`. + +## 3. Implementation + +- [x] 3.1 Extend the shared setup/doctor/install/fix arg parsing to accept managed path selectors only after `--force`. +- [x] 3.2 Route targeted force-path matching through the managed file/template rewrite helpers. +- [x] 3.3 Preserve the existing plain `--force` behavior for whole-surface rewrites. + +## 4. Verification + +- [x] 4.1 Run `node --check bin/multiagent-safety.js`. +- [x] 4.2 Run targeted install regressions in `test/install.test.js`. +- [x] 4.3 Run `openspec validate agent-codex-fix-doctor-setup-force-conflict-ux-2026-04-22-08-58 --type change --strict`. +- [x] 4.4 Run `openspec validate --specs`. + +## 5. Cleanup + +- [x] 5.1 Confirm the OpenSpec tasks reflect the shipped behavior and note any residual risk. Residual risk: targeted `--force` selectors intentionally fail fast for unlisted paths, and this worktree currently has no main specs for `openspec validate --specs` beyond the clean `No items found to validate.` result. +- [ ] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch <branch> --base <base> --via-pr --wait-for-merge --cleanup`). +- [ ] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff.
diff --git a/test/install.test.js b/test/install.test.js index 58e7ab5..b3a5931 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -651,6 +651,69 @@ test('setup and doctor explain .githooks file conflicts and still write managed assertZeroCopyManagedGitignore(gitignoreContent); }); +test('doctor --force rewrites only the named managed shim', () => { + const repoDir = initRepo(); + + let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const reviewScriptPath = path.join(repoDir, 'scripts', 'review-bot-watch.sh'); + const workflowPath = path.join(repoDir, '.github', 'workflows', 'cr.yml'); + fs.writeFileSync(reviewScriptPath, '#!/usr/bin/env bash\nprintf "custom review shim\\n"\n', 'utf8'); + fs.chmodSync(reviewScriptPath, 0o755); + fs.writeFileSync(workflowPath, '# custom workflow\n', 'utf8'); + + result = runNode( + ['doctor', '--target', repoDir, '--force', 'scripts/review-bot-watch.sh'], + repoDir, + ); + assert.equal(result.status, 0, result.stderr || result.stdout); + assert.doesNotMatch(`${result.stdout}\n${result.stderr}`, /Unknown option:/); + const managedReviewShim = fs.readFileSync(reviewScriptPath, 'utf8'); + assert.match(managedReviewShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'internal' 'run-shell' 'reviewBot' "\$@"/); + assert.match(managedReviewShim, /exec "\$cli_bin" 'internal' 'run-shell' 'reviewBot' "\$@"/); + assert.equal(fs.readFileSync(workflowPath, 'utf8'), '# custom workflow\n'); + assert.match(result.stdout, /skipped-conflict\s+\.github\/workflows\/cr\.yml/); +}); + +test('setup --force rewrites the named managed template', () => { + const repoDir = initRepo(); + + let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const workflowPath = path.join(repoDir, '.github', 'workflows', 'cr.yml'); + const managedWorkflow = 
fs.readFileSync(workflowPath, 'utf8'); + fs.writeFileSync(workflowPath, '# custom workflow\n', 'utf8'); + + result = runNode( + ['setup', '--target', repoDir, '--force', '.github/workflows/cr.yml', '--no-global-install'], + repoDir, + ); + assert.equal(result.status, 0, result.stderr || result.stdout); + assert.doesNotMatch(`${result.stdout}\n${result.stderr}`, /Unknown option:/); + assert.equal(fs.readFileSync(workflowPath, 'utf8'), managedWorkflow); +}); + +test('setup conflict message teaches targeted and global managed --force recovery', () => { + const repoDir = initRepo(); + + let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); + assert.equal(result.status, 0, result.stderr || result.stdout); + + const dockerLoaderPath = path.join(repoDir, 'scripts', 'guardex-docker-loader.sh'); + fs.writeFileSync(dockerLoaderPath, '#!/usr/bin/env bash\nprintf "custom docker loader\\n"\n', 'utf8'); + fs.chmodSync(dockerLoaderPath, 0o755); + + result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir); + assert.notEqual(result.status, 0, 'setup should fail on non-critical managed conflicts without --force'); + + const combined = `${result.stdout}\n${result.stderr}`; + assert.match(combined, /Refusing to overwrite existing file without --force: scripts\/guardex-docker-loader\.sh/); + assert.match(combined, /--force scripts\/guardex-docker-loader\.sh/); + assert.match(combined, /--force' to rewrite all managed files/); +}); + test('setup and doctor skip repo bootstrap when repo .env disables Guardex', () => { const repoDir = initRepo(); fs.writeFileSync(path.join(repoDir, '.env'), 'GUARDEX_ON=0\n', 'utf8'); From 343b8a82f2860aefb8195b76d5de4c28f35d07b3 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 10:33:27 +0200 Subject: [PATCH 42/48] Unblock the next Guardex npm publish (#274) This bumps @imdeadpool/guardex from 7.0.18 to 7.0.19 and records the shipped 
post-7.0.18 changes in the README release history and OpenSpec release artifact. No runtime behavior changed in this branch; the package metadata now matches the next publishable version the user requested. Constraint: Release metadata and README release history must move together for gitguardex version bumps Rejected: Version bump without README/OpenSpec update | would drift release state again Confidence: high Scope-risk: narrow Reversibility: clean Directive: Keep README release notes aligned with package.json and package-lock.json on every npm version bump Tested: node --check bin/multiagent-safety.js Tested: npm pack --dry-run Tested: openspec validate agent-codex-release-guardex-7-0-19-2026-04-22-10-27 --type change --strict Tested: openspec validate --specs Not-tested: node --test test/metadata.test.js still fails on the pre-existing helper-template parity mismatch already red on current main Co-authored-by: NagyVikt --- README.md | 6 ++++++ .../.openspec.yaml | 2 ++ .../proposal.md | 14 +++++++++++++ .../specs/release-version-bump/spec.md | 10 +++++++++ .../tasks.md | 21 +++++++++++++++++++ package-lock.json | 4 ++-- package.json | 2 +- 7 files changed, 56 insertions(+), 3 deletions(-) create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/.openspec.yaml create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/proposal.md create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/specs/release-version-bump/spec.md create mode 100644 openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/tasks.md diff --git a/README.md b/README.md index 1b14fff..7276114 100644 --- a/README.md +++ b/README.md @@ -645,6 +645,12 @@ npm pack --dry-run
v7.x +### v7.0.19 +- `gx setup` and `gx doctor` now accept targeted managed-file recovery after `--force`, so `gx doctor --force scripts/review-bot-watch.sh` repairs the named managed file instead of failing on an unknown argument. +- Managed-file conflict output now teaches both recovery forms directly: `--force <path>` for one file and plain `--force` for whole-surface rewrites. +- GitGuardex now keeps small-task routing caveman-only by default and makes working VS Code agent lanes easier to spot at a glance while keeping the CLI-owned install-surface rollout intact. +- Bumped the release from `7.0.18` → `7.0.19` so the shipped setup/doctor recovery and UX refinements land on a fresh publishable npm version. + ### v7.0.18 +- GitGuardex now keeps the install workflow in `gx` itself: `gx branch ...`, `gx locks ...`, `gx worktree prune`, `gx migrate`, and user-level agent-skill install now own the agent lifecycle instead of teaching pasted repo scripts as the primary surface. +- Fresh installs switch repo hooks to tiny `gx hook run ...` shims, stop copying repo-local workflow implementations and repo-local skills, and stop injecting Guardex-managed `agent:*` package scripts into consumer repos.
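The commit's directive — keep README release notes aligned with `package.json` and `package-lock.json` on every npm version bump — can be expressed as a small guard. This is an illustrative sketch, not the shipped release tooling; the object shapes mirror real npm metadata, and the README check only looks for the matching `### vX.Y.Z` heading:

```javascript
// A release bump is aligned when package.json, both version fields in
// package-lock.json (top-level and the root "" package entry), and the
// README release-notes heading all carry the same semver.
function releaseMetadataAligned(pkg, lock, readmeText) {
  const version = pkg.version;
  return (
    lock.version === version &&
    lock.packages[''].version === version &&
    readmeText.includes(`### v${version}`)
  );
}

const pkg = { version: '7.0.19' };
const lock = { version: '7.0.19', packages: { '': { version: '7.0.19' } } };
const readme = '## v7.x\n\n### v7.0.19\n- targeted --force recovery\n\n### v7.0.18\n- gx-owned install workflow\n';
console.log(releaseMetadataAligned(pkg, lock, readme)); // true

// A lockfile left on the previous version is exactly the drift this patch fixes.
const staleLock = { version: '7.0.18', packages: { '': { version: '7.0.18' } } };
console.log(releaseMetadataAligned(pkg, staleLock, readme)); // false
```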
diff --git a/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/.openspec.yaml b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/.openspec.yaml new file mode 100644 index 0000000..25345f4 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-22 diff --git a/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/proposal.md b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/proposal.md new file mode 100644 index 0000000..06c4db8 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/proposal.md @@ -0,0 +1,14 @@ +## Why + +- The user asked for the next npm version in `gitguardex`, so the package metadata needs the next publishable patch release after `7.0.18`. +- The shipped behavior since `7.0.18` is not recorded in the README release history yet, so the package version and release notes would drift again without a matching docs update. + +## What Changes + +- Bump the package release metadata from `7.0.18` to `7.0.19` in `package.json` and `package-lock.json`. +- Add a `README.md` release-notes entry for `v7.0.19` that captures the shipped targeted `--force <path>` recovery flow plus the post-`7.0.18` UX refinements already merged on `main`. + +## Impact + +- Unblocks the next npm publish without changing runtime behavior beyond what is already merged on `main`. +- Keeps the packaged version, lockfile metadata, and README release history aligned so the release state is easier to trust.
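The "next publishable patch release" step the proposal describes is a plain patch-component increment. A minimal sketch, assuming a bare `MAJOR.MINOR.PATCH` version string with no prerelease or build-metadata suffix:

```javascript
// Bump only the patch component of a plain MAJOR.MINOR.PATCH semver string.
function nextPatchVersion(current) {
  const [major, minor, patch] = current.split('.').map(Number);
  return `${major}.${minor}.${patch + 1}`;
}

console.log(nextPatchVersion('7.0.18')); // 7.0.19
console.log(nextPatchVersion('7.0.15')); // 7.0.16 — the earlier recovery bump in this series
```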
diff --git a/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/specs/release-version-bump/spec.md b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/specs/release-version-bump/spec.md new file mode 100644 index 0000000..0ce736c --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/specs/release-version-bump/spec.md @@ -0,0 +1,10 @@ +## ADDED Requirements + +### Requirement: Release recovery version alignment +The release metadata SHALL move to the next publishable package version when maintainers intentionally request the next npm release after the current published version. + +#### Scenario: Prepare the next publishable npm patch release +- **GIVEN** the current Guardex package version is already the latest published release metadata in the repo +- **WHEN** maintainers request the next npm version bump +- **THEN** `package.json` and `package-lock.json` SHALL be bumped to the next publishable semver +- **AND** `README.md` SHALL record the new release version with the newly shipped behavior that the package now contains. diff --git a/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/tasks.md b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/tasks.md new file mode 100644 index 0000000..45d72a8 --- /dev/null +++ b/openspec/changes/agent-codex-release-guardex-7-0-19-2026-04-22-10-27/tasks.md @@ -0,0 +1,21 @@ +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-release-guardex-7-0-19-2026-04-22-10-27`. +- [x] 1.2 Define normative requirements in `specs/release-version-bump/spec.md`. + +## 2. Implementation + +- [x] 2.1 Bump `package.json`, `package-lock.json`, and `README.md` to the next publishable Guardex release version. +- [x] 2.2 No new runtime regression coverage is required because this change only updates release metadata for already-merged behavior. + +## 3. 
Verification + +- [x] 3.1 Run `node --check bin/multiagent-safety.js`, `node --test test/metadata.test.js`, and `npm pack --dry-run` for the release-only change. `node --check` and `npm pack --dry-run` passed; `node --test test/metadata.test.js` still fails on the pre-existing `critical runtime helper scripts stay in sync with templates` parity mismatch, and the same failure reproduces on current `main`. +- [x] 3.2 Run `openspec validate agent-codex-release-guardex-7-0-19-2026-04-22-10-27 --type change --strict`. +- [x] 3.3 Run `openspec validate --specs`. + +## 4. Completion + +- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch <branch> --base <base> --via-pr --wait-for-merge --cleanup`). + +- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff. +- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending. diff --git a/package-lock.json b/package-lock.json index 0bc6729..17229c7 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.18", + "version": "7.0.19", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "@imdeadpool/guardex", - "version": "7.0.18", + "version": "7.0.19", "license": "MIT", "bin": { "gitguardex": "bin/multiagent-safety.js", diff --git a/package.json b/package.json index 61ce278..617d2ee 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "@imdeadpool/guardex", - "version": "7.0.18", + "version": "7.0.19", "description": "Guardian T-Rex for your multi-agent repo. Isolated worktrees, file locks, and PR-only merges stop parallel Codex & Claude agents from overwriting each other's work.
Auto-wires Oh My Codex, Oh My Claude, OpenSpec, and Caveman.", "license": "MIT", "preferGlobal": true, From babcf07ecb386f7c3ce1a8effb751c63ac45235b Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 10:45:09 +0200 Subject: [PATCH 43/48] Auto-finish: gx doctor repairs (#275) Co-authored-by: NagyVikt --- .githooks/post-checkout | 94 +--- .githooks/post-merge | 72 ++-- .githooks/pre-commit | 288 +------------ .githooks/pre-push | 121 +----- .gitignore | 10 +- AGENTS.md | 5 +- scripts/guardex-env.sh | 0 vscode/guardex-active-agents/README.md | 22 + vscode/guardex-active-agents/extension.js | 357 +++++++++++++++ vscode/guardex-active-agents/package.json | 57 +++ .../guardex-active-agents/session-schema.js | 407 ++++++++++++++++++ 11 files changed, 927 insertions(+), 506 deletions(-) mode change 100644 => 100755 scripts/guardex-env.sh create mode 100644 vscode/guardex-active-agents/README.md create mode 100644 vscode/guardex-active-agents/extension.js create mode 100644 vscode/guardex-active-agents/package.json create mode 100644 vscode/guardex-active-agents/session-schema.js diff --git a/.githooks/post-checkout b/.githooks/post-checkout index ad90fad..6db7c20 100755 --- a/.githooks/post-checkout +++ b/.githooks/post-checkout @@ -1,81 +1,27 @@ #!/usr/bin/env bash set -euo pipefail -# post-checkout -branch_checkout="${3:-0}" -[[ "$branch_checkout" == "1" ]] || exit 0 - -if [[ "${GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH:-0}" == "1" ]]; then - exit 0 -fi - -repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" -if [[ -z "$repo_root" ]]; then - exit 0 -fi -guardex_env_helper="${repo_root}/scripts/guardex-env.sh" -if [[ -f "$guardex_env_helper" ]]; then - # shellcheck source=/dev/null - source "$guardex_env_helper" -fi -if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! 
guardex_repo_is_enabled "$repo_root"; then - exit 0 +if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then + node_bin="${GUARDEX_NODE_BIN:-node}" + exec "$node_bin" "$GUARDEX_CLI_ENTRY" 'hook' 'run' 'post-checkout' "$@" fi -# Skip in secondary worktrees — only the primary checkout is guarded. -git_dir_abs="$(cd "$(git rev-parse --git-dir)" && pwd -P)" -common_dir_abs="$(cd "$(git rev-parse --git-common-dir)" && pwd -P)" -if [[ "$git_dir_abs" != "$common_dir_abs" ]]; then - exit 0 -fi - -new_branch="$(git rev-parse --abbrev-ref HEAD 2>/dev/null || true)" -# Parse the latest reflog entry; post-checkout writes "checkout: moving from to ". -prev_branch="$(git reflog -1 HEAD 2>/dev/null | sed -n 's/.*checkout: moving from \([^ ]*\) to .*/\1/p' || true)" - -[[ -n "$prev_branch" && -n "$new_branch" && "$prev_branch" != "$new_branch" ]] || exit 0 - -protected_raw="${GUARDEX_PROTECTED_BRANCHES:-$(git config --get multiagent.protectedBranches || true)}" -[[ -n "$protected_raw" ]] || protected_raw="dev main master" -protected_raw="${protected_raw//,/ }" - -is_protected() { - local branch="$1" - for p in $protected_raw; do - [[ "$branch" == "$p" ]] && return 0 - done - return 1 +resolve_guardex_cli() { + if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then + printf '%s' "$GUARDEX_CLI_BIN" + return 0 + fi + if command -v gx >/dev/null 2>&1; then + printf '%s' "gx" + return 0 + fi + if command -v gitguardex >/dev/null 2>&1; then + printf '%s' "gitguardex" + return 0 + fi + echo "[gitguardex-shim] Missing gx CLI in PATH." >&2 + exit 1 } -# Only guard when moving AWAY from a protected primary branch. -is_protected "$prev_branch" || exit 0 - -is_agent=0 -if [[ -n "${CLAUDECODE:-}" \ - || -n "${CLAUDE_CODE_SESSION_ID:-}" \ - || -n "${CODEX_THREAD_ID:-}" \ - || -n "${OMX_SESSION_ID:-}" \ - || "${CODEX_CI:-0}" == "1" ]]; then - is_agent=1 -fi - -echo "" >&2 -echo "[agent-primary-branch-guard] Primary checkout switched branches." 
>&2 -echo "[agent-primary-branch-guard] from: $prev_branch (protected)" >&2 -echo "[agent-primary-branch-guard] to: $new_branch" >&2 -echo "[agent-primary-branch-guard] The primary working tree must stay on its base/protected branch." >&2 -echo "[agent-primary-branch-guard] Use 'git worktree add' (or scripts/agent-branch-start.sh) for feature work." >&2 - -if [[ "$is_agent" == "1" ]]; then - echo "[agent-primary-branch-guard] Agent session detected — reverting to '$prev_branch'." >&2 - echo "[agent-primary-branch-guard] Bypass with GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1 if truly intentional." >&2 - if git diff --quiet && git diff --cached --quiet; then - GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1 git checkout "$prev_branch" >/dev/null 2>&1 || true - echo "[agent-primary-branch-guard] Reverted to '$prev_branch'." >&2 - else - echo "[agent-primary-branch-guard] Working tree dirty — auto-revert skipped." >&2 - echo "[agent-primary-branch-guard] Fix manually: git stash && git checkout $prev_branch" >&2 - fi -else - echo "[agent-primary-branch-guard] Bypass with GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1 if intentional." >&2 -fi +cli_bin="$(resolve_guardex_cli)" +exec "$cli_bin" 'hook' 'run' 'post-checkout' "$@" diff --git a/.githooks/post-merge b/.githooks/post-merge index 8bd7809..4da7d05 100755 --- a/.githooks/post-merge +++ b/.githooks/post-merge @@ -1,51 +1,27 @@ #!/usr/bin/env bash set -euo pipefail -if [[ "${GUARDEX_DISABLE_POST_MERGE_CLEANUP:-0}" == "1" ]]; then - exit 0 -fi - -repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" -if [[ -z "$repo_root" ]]; then - exit 0 -fi -guardex_env_helper="${repo_root}/scripts/guardex-env.sh" -if [[ -f "$guardex_env_helper" ]]; then - # shellcheck source=/dev/null - source "$guardex_env_helper" -fi -if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! 
guardex_repo_is_enabled "$repo_root"; then - exit 0 -fi - -branch="$(git -C "$repo_root" rev-parse --abbrev-ref HEAD 2>/dev/null || true)" -if [[ -z "$branch" || "$branch" == "HEAD" ]]; then - exit 0 -fi - -base_branch="${GUARDEX_BASE_BRANCH:-$(git -C "$repo_root" config --get multiagent.baseBranch || true)}" -if [[ -z "$base_branch" ]]; then - base_branch="dev" -fi - -if [[ "$branch" != "$base_branch" ]]; then - exit 0 -fi - -cli_path="$repo_root/bin/multiagent-safety.js" -if [[ ! -f "$cli_path" ]]; then - exit 0 -fi - -node_bin="${GUARDEX_NODE_BIN:-node}" -if ! command -v "$node_bin" >/dev/null 2>&1; then - exit 0 -fi - -"$node_bin" "$cli_path" cleanup \ - --target "$repo_root" \ - --base "$base_branch" \ - --include-pr-merged \ - --keep-clean-worktrees >/dev/null 2>&1 || true - -exit 0 +if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then + node_bin="${GUARDEX_NODE_BIN:-node}" + exec "$node_bin" "$GUARDEX_CLI_ENTRY" 'hook' 'run' 'post-merge' "$@" +fi + +resolve_guardex_cli() { + if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then + printf '%s' "$GUARDEX_CLI_BIN" + return 0 + fi + if command -v gx >/dev/null 2>&1; then + printf '%s' "gx" + return 0 + fi + if command -v gitguardex >/dev/null 2>&1; then + printf '%s' "gitguardex" + return 0 + fi + echo "[gitguardex-shim] Missing gx CLI in PATH." 
>&2 + exit 1 +} + +cli_bin="$(resolve_guardex_cli)" +exec "$cli_bin" 'hook' 'run' 'post-merge' "$@" diff --git a/.githooks/pre-commit b/.githooks/pre-commit index c2d751f..11a0ba7 100755 --- a/.githooks/pre-commit +++ b/.githooks/pre-commit @@ -1,285 +1,27 @@ #!/usr/bin/env bash set -euo pipefail -branch="$(git rev-parse --abbrev-ref HEAD 2>/dev/null || true)" -if [[ -z "$branch" || "$branch" == "HEAD" ]]; then - branch="$(git symbolic-ref --quiet --short HEAD 2>/dev/null || true)" -fi -if [[ -z "$branch" ]]; then - exit 0 -fi - -repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" -if [[ -z "$repo_root" ]]; then - exit 0 -fi -NODE_BIN="${GUARDEX_NODE_BIN:-node}" -CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" -guardex_env_helper="${repo_root}/scripts/guardex-env.sh" -if [[ -f "$guardex_env_helper" ]]; then - # shellcheck source=/dev/null - source "$guardex_env_helper" -fi -if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! guardex_repo_is_enabled "$repo_root"; then - exit 0 +if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then + node_bin="${GUARDEX_NODE_BIN:-node}" + exec "$node_bin" "$GUARDEX_CLI_ENTRY" 'hook' 'run' 'pre-commit' "$@" fi -run_guardex_cli() { - if [[ -n "$CLI_ENTRY" ]]; then - "$NODE_BIN" "$CLI_ENTRY" "$@" - return $? +resolve_guardex_cli() { + if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then + printf '%s' "$GUARDEX_CLI_BIN" + return 0 fi if command -v gx >/dev/null 2>&1; then - gx "$@" - return $? + printf '%s' "gx" + return 0 fi if command -v gitguardex >/dev/null 2>&1; then - gitguardex "$@" - return $? - fi - echo "[agent-branch-guard] Guardex CLI entrypoint unavailable; rerun via gx." >&2 - return 127 -} - -if [[ "${ALLOW_COMMIT_ON_PROTECTED_BRANCH:-0}" == "1" ]]; then - exit 0 -fi - -is_unborn_branch=0 -if ! 
git rev-parse --verify HEAD >/dev/null 2>&1; then - is_unborn_branch=1 -fi - -is_codex_session=0 -if [[ -n "${CODEX_THREAD_ID:-}" || -n "${OMX_SESSION_ID:-}" || "${CODEX_CI:-0}" == "1" ]]; then - is_codex_session=1 -fi - -# Superset of is_codex_session that also covers Claude Code sessions so the -# protected-branch gate below only triggers for automated agents — humans stay -# free to commit directly on main/dev/master. -is_agent_session=$is_codex_session -if [[ -n "${CLAUDECODE:-}" || -n "${CLAUDE_CODE_SESSION_ID:-}" ]]; then - is_agent_session=1 -fi - -is_vscode_git_context=0 -if [[ -n "${VSCODE_GIT_IPC_HANDLE:-}" || -n "${VSCODE_GIT_ASKPASS_NODE:-}" || -n "${VSCODE_IPC_HOOK_CLI:-}" ]]; then - is_vscode_git_context=1 -fi - -allow_vscode_protected_raw="${GUARDEX_ALLOW_VSCODE_PROTECTED_BRANCH_WRITES:-$(git config --get multiagent.allowVscodeProtectedBranchWrites || true)}" -if [[ -z "$allow_vscode_protected_raw" ]]; then - allow_vscode_protected_raw="false" -fi -allow_vscode_protected="$(printf '%s' "$allow_vscode_protected_raw" | tr '[:upper:]' '[:lower:]')" - -allow_vscode_protected_branch_writes=0 -case "$allow_vscode_protected" in - 1|true|yes|on) allow_vscode_protected_branch_writes=1 ;; - 0|false|no|off) allow_vscode_protected_branch_writes=0 ;; - *) allow_vscode_protected_branch_writes=0 ;; -esac - -protected_branches_raw="${GUARDEX_PROTECTED_BRANCHES:-$(git config --get multiagent.protectedBranches || true)}" -if [[ -z "$protected_branches_raw" ]]; then - protected_branches_raw="dev main master" -fi -protected_branches_raw="${protected_branches_raw//,/ }" - -is_protected_branch=0 -for protected_branch in $protected_branches_raw; do - if [[ "$branch" == "$protected_branch" ]]; then - is_protected_branch=1 - break - fi -done - -codex_require_agent_branch_raw="${GUARDEX_CODEX_REQUIRE_AGENT_BRANCH:-$(git config --get multiagent.codexRequireAgentBranch || true)}" -if [[ -z "$codex_require_agent_branch_raw" ]]; then - codex_require_agent_branch_raw="true" -fi 
-codex_require_agent_branch="$(printf '%s' "$codex_require_agent_branch_raw" | tr '[:upper:]' '[:lower:]')" - -should_require_codex_agent_branch=0 -case "$codex_require_agent_branch" in - 1|true|yes|on) should_require_codex_agent_branch=1 ;; - 0|false|no|off) should_require_codex_agent_branch=0 ;; - *) should_require_codex_agent_branch=1 ;; -esac - -is_codex_managed_only_commit_on_protected=0 -if [[ "$is_codex_session" == "1" && "$is_protected_branch" == "1" ]]; then - deleted_paths="$(git diff --cached --name-only --diff-filter=D)" - staged_paths="$(git diff --cached --name-only --diff-filter=ACMRTUXB)" - if [[ -z "$deleted_paths" && -n "$staged_paths" ]]; then - managed_only=1 - while IFS= read -r staged_path; do - case "$staged_path" in - AGENTS.md|.gitignore) ;; - *) managed_only=0; break ;; - esac - done <<< "$staged_paths" - if [[ "$managed_only" == "1" ]]; then - is_codex_managed_only_commit_on_protected=1 - fi - fi -fi - -if [[ "$should_require_codex_agent_branch" == "1" && "${GUARDEX_ALLOW_CODEX_ON_NON_AGENT:-0}" != "1" ]]; then - if [[ "$is_codex_session" == "1" && "$branch" != agent/* ]]; then - if [[ "$is_protected_branch" == "1" ]]; then - if [[ "$is_codex_managed_only_commit_on_protected" == "1" ]]; then - exit 0 - fi - - cat >&2 <<'MSG' -[guardex-preedit-guard] Codex edit/commit detected on a protected branch. -GuardeX requires Codex work to run from an isolated agent/* branch. -Start the sub-branch/worktree with: - gx branch start "" "" -Or manually: - gx branch start "" "" -Then commit from the created agent/* branch. - -Temporary bypass (not recommended): - GUARDEX_ALLOW_CODEX_ON_NON_AGENT=1 git commit ... -MSG - exit 1 - fi - - cat >&2 <<'MSG' -[codex-branch-guard] Codex agent commit blocked on non-agent branch. -Use isolated branch/worktree first: - gx branch start "" "" -Then commit from the created agent/* branch. - -Temporary bypass (not recommended): - GUARDEX_ALLOW_CODEX_ON_NON_AGENT=1 git commit ... 
-Disable this rule for a repo (not recommended): - git config multiagent.codexRequireAgentBranch false -MSG - exit 1 - fi -fi - -if [[ "$is_protected_branch" == "1" ]]; then - # Humans may commit directly on protected branches; only agent sessions - # (Codex / Claude Code / OMX) are blocked. - if [[ "$is_agent_session" != "1" ]]; then - exit 0 - fi - - if [[ "$is_unborn_branch" == "1" && "$is_codex_session" != "1" ]]; then - exit 0 - fi - - git_dir="$(git rev-parse --git-dir)" - if [[ -f "$git_dir/MERGE_HEAD" ]]; then - exit 0 + printf '%s' "gitguardex" + return 0 fi - - cat >&2 <<'MSG' -[agent-branch-guard] Direct commits on protected branches are blocked. -Use an agent branch first: - gx branch start "" "" -After finishing work: - gx branch finish - -Temporary bypass (not recommended): - ALLOW_COMMIT_ON_PROTECTED_BRANCH=1 git commit ... -MSG - exit 1 -fi - -if [[ "$is_agent_session" == "1" && "$branch" != agent/* ]]; then - cat >&2 <<'MSG' -[agent-branch-guard] Agent commits must run on dedicated agent/* branches. -Start an agent branch first: - gx branch start "" "" -Then commit on that branch. - -Temporary bypass (not recommended): - ALLOW_COMMIT_ON_PROTECTED_BRANCH=1 git commit ... -MSG + echo "[gitguardex-shim] Missing gx CLI in PATH." >&2 exit 1 -fi - -if [[ "$branch" == agent/* ]]; then - if [[ "${GUARDEX_AUTOCLAIM_STAGED_LOCKS:-1}" == "1" ]]; then - while IFS= read -r staged_file; do - [[ -z "$staged_file" ]] && continue - [[ "$staged_file" == ".omx/state/agent-file-locks.json" ]] && continue - run_guardex_cli locks claim --branch "$branch" "$staged_file" >/dev/null 2>&1 || true - done < <(git diff --cached --name-only --diff-filter=ACMRDTUXB) - fi - - if ! run_guardex_cli locks validate --branch "$branch" --staged; then - cat >&2 <<'MSG' -[agent-branch-guard] Agent branch commits require file ownership locks. 
-Claim files first: - gx locks claim --branch "$(git rev-parse --abbrev-ref HEAD)" -MSG - exit 1 - fi - - require_sync_before_commit_raw="$(git config --get multiagent.sync.requireBeforeCommit || true)" - if [[ -z "$require_sync_before_commit_raw" ]]; then - require_sync_before_commit_raw="false" - fi - require_sync_before_commit="$(printf '%s' "$require_sync_before_commit_raw" | tr '[:upper:]' '[:lower:]')" - - should_require_sync=0 - case "$require_sync_before_commit" in - 1|true|yes|on) should_require_sync=1 ;; - 0|false|no|off) should_require_sync=0 ;; - *) should_require_sync=0 ;; - esac - - if [[ "$should_require_sync" == "1" ]]; then - base_branch="$(git config --get multiagent.baseBranch || true)" - if [[ -z "$base_branch" ]]; then - base_branch="dev" - fi - - max_behind_raw="$(git config --get multiagent.sync.maxBehindCommits || true)" - if [[ -z "$max_behind_raw" ]]; then - max_behind_raw="0" - fi - if [[ ! "$max_behind_raw" =~ ^[0-9]+$ ]]; then - echo "[agent-sync-guard] Invalid multiagent.sync.maxBehindCommits value: ${max_behind_raw}" >&2 - echo "[agent-sync-guard] Expected non-negative integer. Example: git config multiagent.sync.maxBehindCommits 0" >&2 - exit 1 - fi - - if ! git fetch origin "$base_branch" --quiet >/dev/null 2>&1; then - echo "[agent-sync-guard] Unable to fetch origin/${base_branch} while commit sync gate is enabled." >&2 - echo "[agent-sync-guard] Disable gate temporarily with: git config multiagent.sync.requireBeforeCommit false" >&2 - exit 1 - fi - - if ! 
git show-ref --verify --quiet "refs/remotes/origin/${base_branch}"; then - echo "[agent-sync-guard] Remote base branch not found: origin/${base_branch}" >&2 - exit 1 - fi - - behind_count="$(git rev-list --left-right --count "${branch}...origin/${base_branch}" 2>/dev/null | awk '{print $2}')" - behind_count="${behind_count:-0}" - max_behind="${max_behind_raw}" - - if [[ "$behind_count" -gt "$max_behind" ]]; then - cat >&2 < -MSG - exit 1 - fi - fi -fi +} -if command -v pre-commit >/dev/null 2>&1 && [[ -f .pre-commit-config.yaml ]]; then - pre-commit run --hook-stage pre-commit -fi +cli_bin="$(resolve_guardex_cli)" +exec "$cli_bin" 'hook' 'run' 'pre-commit' "$@" diff --git a/.githooks/pre-push b/.githooks/pre-push index b064af4..48cf72a 100755 --- a/.githooks/pre-push +++ b/.githooks/pre-push @@ -1,114 +1,27 @@ #!/usr/bin/env bash set -euo pipefail -if [[ "${ALLOW_PUSH_ON_PROTECTED_BRANCH:-0}" == "1" || "${ALLOW_COMMIT_ON_PROTECTED_BRANCH:-0}" == "1" ]]; then - exit 0 +if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then + node_bin="${GUARDEX_NODE_BIN:-node}" + exec "$node_bin" "$GUARDEX_CLI_ENTRY" 'hook' 'run' 'pre-push' "$@" fi -repo_root="$(git rev-parse --show-toplevel 2>/dev/null || true)" -if [[ -z "$repo_root" ]]; then - exit 0 -fi -guardex_env_helper="${repo_root}/scripts/guardex-env.sh" -if [[ -f "$guardex_env_helper" ]]; then - # shellcheck source=/dev/null - source "$guardex_env_helper" -fi -if declare -F guardex_repo_is_enabled >/dev/null 2>&1 && ! 
guardex_repo_is_enabled "$repo_root"; then - exit 0 -fi - -is_vscode_git_context=0 -if [[ -n "${VSCODE_GIT_IPC_HANDLE:-}" || -n "${VSCODE_GIT_ASKPASS_NODE:-}" || -n "${VSCODE_IPC_HOOK_CLI:-}" ]]; then - is_vscode_git_context=1 -fi - -allow_vscode_protected_raw="${GUARDEX_ALLOW_VSCODE_PROTECTED_BRANCH_WRITES:-$(git config --get multiagent.allowVscodeProtectedBranchWrites || true)}" -if [[ -z "$allow_vscode_protected_raw" ]]; then - allow_vscode_protected_raw="false" -fi -allow_vscode_protected="$(printf '%s' "$allow_vscode_protected_raw" | tr '[:upper:]' '[:lower:]')" - -allow_vscode_protected_branch_writes=0 -case "$allow_vscode_protected" in - 1|true|yes|on) allow_vscode_protected_branch_writes=1 ;; - 0|false|no|off) allow_vscode_protected_branch_writes=0 ;; - *) allow_vscode_protected_branch_writes=0 ;; -esac - -is_codex_session=0 -if [[ -n "${CODEX_THREAD_ID:-}" || -n "${OMX_SESSION_ID:-}" || "${CODEX_CI:-0}" == "1" ]]; then - is_codex_session=1 -fi - -# Superset covering Claude Code so only agents are blocked from pushing to -# protected refs; humans push directly from their primary checkout. 
-is_agent_session=$is_codex_session -if [[ -n "${CLAUDECODE:-}" || -n "${CLAUDE_CODE_SESSION_ID:-}" ]]; then - is_agent_session=1 -fi - -protected_branches_raw="${GUARDEX_PROTECTED_BRANCHES:-$(git config --get multiagent.protectedBranches || true)}" -if [[ -z "$protected_branches_raw" ]]; then - protected_branches_raw="dev main master" -fi -protected_branches_raw="${protected_branches_raw//,/ }" - -is_protected_branch() { - local branch="$1" - for protected_branch in $protected_branches_raw; do - if [[ "$branch" == "$protected_branch" ]]; then - return 0 - fi - done - return 1 -} - -blocked_refs=() -while IFS=' ' read -r local_ref local_sha remote_ref remote_sha; do - if [[ -z "${remote_ref:-}" || "$remote_ref" != refs/heads/* ]]; then - continue +resolve_guardex_cli() { + if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then + printf '%s' "$GUARDEX_CLI_BIN" + return 0 fi - - remote_branch="${remote_ref#refs/heads/}" - if is_protected_branch "$remote_branch"; then - blocked_refs+=("$remote_branch") + if command -v gx >/dev/null 2>&1; then + printf '%s' "gx" + return 0 fi -done - -if [[ "${#blocked_refs[@]}" -gt 0 ]]; then - if [[ "$is_codex_session" == "1" ]]; then - { - echo "[guardex-preedit-guard] Codex push detected toward protected branch." - echo "[guardex-preedit-guard] Protected target(s): ${blocked_refs[*]}" - echo "[guardex-preedit-guard] Run Codex from an agent/* branch and merge via PR." - echo - echo "Temporary bypass (not recommended):" - echo " ALLOW_PUSH_ON_PROTECTED_BRANCH=1 git push ..." - } >&2 - exit 1 + if command -v gitguardex >/dev/null 2>&1; then + printf '%s' "gitguardex" + return 0 fi - - # Humans may push directly to protected branches; only agent sessions are blocked. - if [[ "$is_agent_session" != "1" ]]; then - exit 0 - fi - - if [[ "$is_vscode_git_context" == "1" && "$allow_vscode_protected_branch_writes" == "1" ]]; then - exit 0 - fi - - { - echo "[agent-branch-guard] Push to protected branch blocked." 
- echo "[agent-branch-guard] Protected target(s): ${blocked_refs[*]}" - echo "[agent-branch-guard] Use an agent branch and merge via PR." - echo "[agent-branch-guard] Optional repo opt-in for VS Code protected-branch push:" - echo " git config multiagent.allowVscodeProtectedBranchWrites true" - echo - echo "Temporary bypass (not recommended):" - echo " ALLOW_PUSH_ON_PROTECTED_BRANCH=1 git push ..." - } >&2 + echo "[gitguardex-shim] Missing gx CLI in PATH." >&2 exit 1 -fi +} -exit 0 +cli_bin="$(resolve_guardex_cli)" +exec "$cli_bin" 'hook' 'run' 'pre-push' "$@" diff --git a/.gitignore b/.gitignore index 9cfa814..dc5ec96 100644 --- a/.gitignore +++ b/.gitignore @@ -78,14 +78,12 @@ openspec/plan/* # multiagent-safety:START .omx/ .omc/ -scripts/* -scripts/agent-branch-start.sh -scripts/agent-file-locks.py +scripts/agent-session-state.js +scripts/guardex-docker-loader.sh +scripts/guardex-env.sh +scripts/install-vscode-active-agents-extension.js .githooks oh-my-codex/ -.codex/skills/gitguardex/SKILL.md -.codex/skills/guardex-merge-skills-to-dev/SKILL.md -.claude/commands/gitguardex.md .omx/state/agent-file-locks.json # multiagent-safety:END diff --git a/AGENTS.md b/AGENTS.md index 63b5ed8..b009532 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -261,6 +261,9 @@ scripts/openspec/init-plan-workspace.sh `GUARDEX_ON=0` disables Guardex for that repo. `GUARDEX_ON=1` explicitly enables Guardex for that repo again. +**Task-size routing.** Small tasks stay in direct caveman-only mode. For typos, single-file tweaks, one-liners, version bumps, or similarly bounded asks, solve directly and do not escalate into heavy OMX orchestration just because a keyword appears. Treat `quick:`, `simple:`, `tiny:`, `minor:`, `small:`, `just:`, and `only:` as explicit lightweight escape hatches. +Promote to OMX orchestration only when the task is medium/large: multi-file behavior changes, API/schema work, refactors, migrations, architecture, cross-cutting scope, or long prompts. 
Heavy OMX modes (`ralph`, `autopilot`, `team`, `ultrawork`, `swarm`, `ralplan`) are for that larger scope. If the task grows while working, upgrade then. + **Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `gx branch start "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree. @@ -276,7 +279,7 @@ OMX completion policy: when a task is done, the agent must commit the task chang **Reporting.** Every completion handoff includes: files changed, behavior touched, verification commands + results, risks/follow-ups. -**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx branch finish ... --cleanup` or `gx finish --all`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified. +**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. 
Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `gx branch finish ... --cleanup`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified. **Version bumps.** If a change bumps a published version, the same PR updates release notes/changelog. diff --git a/scripts/guardex-env.sh b/scripts/guardex-env.sh old mode 100644 new mode 100755 diff --git a/vscode/guardex-active-agents/README.md b/vscode/guardex-active-agents/README.md new file mode 100644 index 0000000..b63a8c3 --- /dev/null +++ b/vscode/guardex-active-agents/README.md @@ -0,0 +1,22 @@ +# GitGuardex Active Agents + +Local VS Code companion for Guardex-managed repos. + +What it does: + +- Adds an `Active Agents` view to the Source Control container. +- Renders one repo node per live Guardex workspace with grouped `ACTIVE AGENTS` and `CHANGES` sections. +- Splits live sessions inside `ACTIVE AGENTS` into `WORKING NOW` and `THINKING` groups so active edit lanes stand out immediately. +- Shows one row per live Guardex sandbox session inside those activity groups. +- Shows repo-root git changes in a sibling `CHANGES` section when the guarded repo itself is dirty. +- Derives `thinking` versus `working` from the live sandbox worktree, surfaces working counts in the repo/header summary, and shows changed-file counts for active edits. +- Uses VS Code's native animated `loading~spin` icon for the running-state affordance. +- Reads repo-local presence files from `.omx/state/active-sessions/`. + +Install from a Guardex-wired repo: + +```sh +node scripts/install-vscode-active-agents-extension.js +``` + +Then reload the VS Code window. 
diff --git a/vscode/guardex-active-agents/extension.js b/vscode/guardex-active-agents/extension.js new file mode 100644 index 0000000..a375c52 --- /dev/null +++ b/vscode/guardex-active-agents/extension.js @@ -0,0 +1,357 @@ +const fs = require('node:fs'); +const path = require('node:path'); +const vscode = require('vscode'); +const { formatElapsedFrom, readActiveSessions, readRepoChanges } = require('./session-schema.js'); + +class InfoItem extends vscode.TreeItem { + constructor(label, description = '') { + super(label, vscode.TreeItemCollapsibleState.None); + this.description = description; + this.iconPath = new vscode.ThemeIcon('info'); + } +} + +class RepoItem extends vscode.TreeItem { + constructor(repoRoot, sessions, changes) { + super(path.basename(repoRoot), vscode.TreeItemCollapsibleState.Expanded); + this.repoRoot = repoRoot; + this.sessions = sessions; + this.changes = changes; + const descriptionParts = [`${sessions.length} active`]; + const workingCount = countWorkingSessions(sessions); + if (workingCount > 0) { + descriptionParts.push(`${workingCount} working`); + } + if (changes.length > 0) { + descriptionParts.push(`${changes.length} changed`); + } + this.description = descriptionParts.join(' · '); + this.tooltip = [ + repoRoot, + this.description, + ].join('\n'); + this.iconPath = new vscode.ThemeIcon('repo'); + this.contextValue = 'gitguardex.repo'; + } +} + +class SectionItem extends vscode.TreeItem { + constructor(label, items, options = {}) { + super(label, vscode.TreeItemCollapsibleState.Expanded); + this.items = items; + this.description = options.description + || (items.length > 0 ? 
String(items.length) : ''); + this.contextValue = 'gitguardex.section'; + } +} + +class SessionItem extends vscode.TreeItem { + constructor(session) { + super(session.label, vscode.TreeItemCollapsibleState.None); + this.session = session; + const descriptionParts = [session.activityLabel || 'thinking']; + if (session.activityCountLabel) { + descriptionParts.push(session.activityCountLabel); + } + descriptionParts.push(session.elapsedLabel || formatElapsedFrom(session.startedAt)); + this.description = descriptionParts.join(' · '); + const tooltipLines = [ + session.branch, + `${session.agentName} · ${session.taskName}`, + `Status ${this.description}`, + session.changeCount > 0 + ? `Changed ${session.activityCountLabel}: ${session.activitySummary}` + : session.activitySummary, + `Started ${session.startedAt}`, + session.worktreePath, + ]; + this.tooltip = tooltipLines.filter(Boolean).join('\n'); + this.iconPath = session.activityKind === 'working' + ? new vscode.ThemeIcon('edit') + : new vscode.ThemeIcon('loading~spin'); + this.contextValue = 'gitguardex.session'; + this.command = { + command: 'gitguardex.activeAgents.openWorktree', + title: 'Open Agent Worktree', + arguments: [session], + }; + } +} + +class FolderItem extends vscode.TreeItem { + constructor(label, relativePath, items) { + super(label, vscode.TreeItemCollapsibleState.Expanded); + this.relativePath = relativePath; + this.items = items; + this.tooltip = relativePath; + this.iconPath = new vscode.ThemeIcon('folder'); + this.contextValue = 'gitguardex.folder'; + } +} + +class ChangeItem extends vscode.TreeItem { + constructor(change) { + super(path.basename(change.relativePath), vscode.TreeItemCollapsibleState.None); + this.change = change; + this.description = change.statusLabel; + this.tooltip = [ + change.relativePath, + `Status ${change.statusText}`, + change.originalPath ? 
`Renamed from ${change.originalPath}` : '', + change.absolutePath, + ].filter(Boolean).join('\n'); + this.resourceUri = vscode.Uri.file(change.absolutePath); + this.contextValue = 'gitguardex.change'; + this.command = { + command: 'gitguardex.activeAgents.openChange', + title: 'Open Changed File', + arguments: [change], + }; + } +} + +function repoRootFromSessionFile(filePath) { + return path.resolve(path.dirname(filePath), '..', '..', '..'); +} + +function buildChangeTreeNodes(changes) { + const root = []; + + function sortNodes(nodes) { + nodes.sort((left, right) => { + const leftIsFolder = left.kind === 'folder'; + const rightIsFolder = right.kind === 'folder'; + if (leftIsFolder !== rightIsFolder) { + return leftIsFolder ? -1 : 1; + } + return left.label.localeCompare(right.label); + }); + + for (const node of nodes) { + if (node.kind === 'folder') { + sortNodes(node.children); + } + } + } + + for (const change of changes) { + const segments = change.relativePath.split(/[\\/]+/).filter(Boolean); + if (segments.length <= 1) { + root.push({ kind: 'change', label: change.relativePath, change }); + continue; + } + + let nodes = root; + let folderPath = ''; + for (const segment of segments.slice(0, -1)) { + folderPath = folderPath ? 
path.posix.join(folderPath, segment) : segment; + let folderNode = nodes.find((node) => node.kind === 'folder' && node.relativePath === folderPath); + if (!folderNode) { + folderNode = { + kind: 'folder', + label: segment, + relativePath: folderPath, + children: [], + }; + nodes.push(folderNode); + } + nodes = folderNode.children; + } + + nodes.push({ kind: 'change', label: change.relativePath, change }); + } + + sortNodes(root); + + function materialize(nodes) { + return nodes.map((node) => { + if (node.kind === 'folder') { + return new FolderItem(node.label, node.relativePath, materialize(node.children)); + } + return new ChangeItem(node.change); + }); + } + + return materialize(root); +} + +function countWorkingSessions(sessions) { + return sessions.filter((session) => session.activityKind === 'working').length; +} + +function buildActiveAgentGroupNodes(sessions) { + const workingSessions = sessions + .filter((session) => session.activityKind === 'working') + .map((session) => new SessionItem(session)); + const thinkingSessions = sessions + .filter((session) => session.activityKind !== 'working') + .map((session) => new SessionItem(session)); + const groups = []; + + if (workingSessions.length > 0) { + groups.push(new SectionItem('WORKING NOW', workingSessions)); + } + if (thinkingSessions.length > 0) { + groups.push(new SectionItem('THINKING', thinkingSessions)); + } + + return groups; +} + +class ActiveAgentsProvider { + constructor() { + this.onDidChangeTreeDataEmitter = new vscode.EventEmitter(); + this.onDidChangeTreeData = this.onDidChangeTreeDataEmitter.event; + this.treeView = null; + } + + getTreeItem(element) { + return element; + } + + attachTreeView(treeView) { + this.treeView = treeView; + this.updateViewState(0, 0); + } + + updateViewState(sessionCount, workingCount) { + if (!this.treeView) { + return; + } + + this.treeView.badge = sessionCount > 0 + ? { + value: sessionCount, + tooltip: `${sessionCount} active agent${sessionCount === 1 ? 
'' : 's'}` + + (workingCount > 0 ? ` · ${workingCount} working now` : ''), + } + : undefined; + this.treeView.message = sessionCount > 0 + ? undefined + : 'Start a sandbox session to populate this view.'; + } + + refresh() { + this.onDidChangeTreeDataEmitter.fire(); + } + + async getChildren(element) { + if (element instanceof RepoItem) { + const sectionItems = [ + new SectionItem('ACTIVE AGENTS', buildActiveAgentGroupNodes(element.sessions), { + description: String(element.sessions.length), + }), + ]; + if (element.changes.length > 0) { + sectionItems.push(new SectionItem('CHANGES', buildChangeTreeNodes(element.changes))); + } + return sectionItems; + } + + if (element instanceof SectionItem || element instanceof FolderItem) { + return element.items; + } + + const repoEntries = await this.loadRepoEntries(); + const sessionCount = repoEntries.reduce((total, entry) => total + entry.sessions.length, 0); + const workingCount = repoEntries.reduce( + (total, entry) => total + countWorkingSessions(entry.sessions), + 0, + ); + this.updateViewState(sessionCount, workingCount); + + if (repoEntries.length === 0) { + return [new InfoItem('No active Guardex agents', 'Open or start a sandbox session.')]; + } + + return repoEntries.map((entry) => new RepoItem(entry.repoRoot, entry.sessions, entry.changes)); + } + + async loadRepoEntries() { + const sessionFiles = await vscode.workspace.findFiles( + '**/.omx/state/active-sessions/*.json', + '**/{node_modules,.git,.omx/agent-worktrees,.omc/agent-worktrees}/**', + 200, + ); + + const repoRoots = new Set(); + for (const uri of sessionFiles) { + repoRoots.add(repoRootFromSessionFile(uri.fsPath)); + } + + if (repoRoots.size === 0) { + for (const workspaceFolder of vscode.workspace.workspaceFolders || []) { + repoRoots.add(workspaceFolder.uri.fsPath); + } + } + + const repoEntries = []; + for (const repoRoot of repoRoots) { + const sessions = readActiveSessions(repoRoot); + if (sessions.length > 0) { + repoEntries.push({ + repoRoot, + 
sessions, + changes: readRepoChanges(repoRoot), + }); + } + } + + repoEntries.sort((left, right) => left.repoRoot.localeCompare(right.repoRoot)); + return repoEntries; + } +} + +function activate(context) { + const provider = new ActiveAgentsProvider(); + const treeView = vscode.window.createTreeView('gitguardex.activeAgents', { + treeDataProvider: provider, + showCollapseAll: true, + }); + provider.attachTreeView(treeView); + const refresh = () => provider.refresh(); + const watcher = vscode.workspace.createFileSystemWatcher('**/.omx/state/active-sessions/*.json'); + const interval = setInterval(refresh, 5_000); + + context.subscriptions.push( + treeView, + vscode.commands.registerCommand('gitguardex.activeAgents.refresh', refresh), + vscode.commands.registerCommand('gitguardex.activeAgents.openWorktree', async (session) => { + if (!session?.worktreePath) { + return; + } + + await vscode.commands.executeCommand( + 'vscode.openFolder', + vscode.Uri.file(session.worktreePath), + { forceNewWindow: true }, + ); + }), + vscode.commands.registerCommand('gitguardex.activeAgents.openChange', async (change) => { + if (!change?.absolutePath) { + return; + } + + if (!fs.existsSync(change.absolutePath)) { + vscode.window.showInformationMessage?.(`Changed path is no longer on disk: ${change.relativePath}`); + return; + } + + await vscode.commands.executeCommand('vscode.open', vscode.Uri.file(change.absolutePath)); + }), + vscode.workspace.onDidChangeWorkspaceFolders(refresh), + watcher, + { dispose: () => clearInterval(interval) }, + ); + + watcher.onDidCreate(refresh, undefined, context.subscriptions); + watcher.onDidChange(refresh, undefined, context.subscriptions); + watcher.onDidDelete(refresh, undefined, context.subscriptions); +} + +function deactivate() {} + +module.exports = { + activate, + deactivate, +}; diff --git a/vscode/guardex-active-agents/package.json b/vscode/guardex-active-agents/package.json new file mode 100644 index 0000000..da83ad5 --- /dev/null +++ 
b/vscode/guardex-active-agents/package.json @@ -0,0 +1,57 @@ +{ + "name": "gitguardex-active-agents", + "displayName": "GitGuardex Active Agents", + "description": "Shows live Guardex sandbox sessions and repo changes inside VS Code Source Control.", + "publisher": "recodeee", + "version": "0.0.1", + "license": "MIT", + "engines": { + "vscode": "^1.88.0" + }, + "categories": [ + "Source Control", + "Other" + ], + "activationEvents": [ + "workspaceContains:.omx/state/active-sessions", + "onView:gitguardex.activeAgents" + ], + "main": "./extension.js", + "contributes": { + "commands": [ + { + "command": "gitguardex.activeAgents.refresh", + "title": "Refresh Active Agents" + }, + { + "command": "gitguardex.activeAgents.openWorktree", + "title": "Open Agent Worktree" + } + ], + "views": { + "scm": [ + { + "id": "gitguardex.activeAgents", + "name": "Active Agents", + "visibility": "visible" + } + ] + }, + "menus": { + "view/title": [ + { + "command": "gitguardex.activeAgents.refresh", + "when": "view == gitguardex.activeAgents", + "group": "navigation" + } + ], + "view/item/context": [ + { + "command": "gitguardex.activeAgents.openWorktree", + "when": "view == gitguardex.activeAgents && viewItem == gitguardex.session", + "group": "inline" + } + ] + } + } +} diff --git a/vscode/guardex-active-agents/session-schema.js b/vscode/guardex-active-agents/session-schema.js new file mode 100644 index 0000000..aaeef7b --- /dev/null +++ b/vscode/guardex-active-agents/session-schema.js @@ -0,0 +1,407 @@ +const fs = require('node:fs'); +const path = require('node:path'); +const cp = require('node:child_process'); + +const ACTIVE_SESSIONS_RELATIVE_DIR = path.join('.omx', 'state', 'active-sessions'); +const SESSION_SCHEMA_VERSION = 1; +const LOCK_FILE_RELATIVE = path.join('.omx', 'state', 'agent-file-locks.json'); +const MAX_CHANGED_PATH_PREVIEW = 3; +const ACTIVE_SESSIONS_FILTER_PREFIX = ACTIVE_SESSIONS_RELATIVE_DIR.split(path.sep).join('/'); + +function toNonEmptyString(value, 
fallback = '') { + const normalized = typeof value === 'string' ? value.trim() : String(value || '').trim(); + return normalized || fallback; +} + +function toPositiveInteger(value) { + const normalized = Number.parseInt(String(value || ''), 10); + return Number.isInteger(normalized) && normalized > 0 ? normalized : null; +} + +function sanitizeBranchForFile(branch) { + const normalized = toNonEmptyString(branch, 'session'); + return normalized.replace(/[^a-zA-Z0-9._-]+/g, '__').replace(/^_+|_+$/g, '') || 'session'; +} + +function sessionFileNameForBranch(branch) { + return `${sanitizeBranchForFile(branch)}.json`; +} + +function activeSessionsDirForRepo(repoRoot) { + return path.join(path.resolve(repoRoot), ACTIVE_SESSIONS_RELATIVE_DIR); +} + +function sessionFilePathForBranch(repoRoot, branch) { + return path.join(activeSessionsDirForRepo(repoRoot), sessionFileNameForBranch(branch)); +} + +function splitOutputLines(output) { + if (typeof output !== 'string') { + return null; + } + + return output + .split(/\r?\n/) + .filter((line) => line.trim().length > 0); +} + +function runGitLines(worktreePath, args) { + try { + const output = cp.execFileSync('git', ['-C', worktreePath, ...args], { + encoding: 'utf8', + stdio: ['ignore', 'pipe', 'ignore'], + }); + return splitOutputLines(output); + } catch (_error) { + return null; + } +} + +function unquoteGitPath(value) { + if (typeof value !== 'string') { + return ''; + } + + const trimmed = value.trim(); + if (!trimmed.startsWith('"') || !trimmed.endsWith('"')) { + return trimmed; + } + + try { + return JSON.parse(trimmed); + } catch (_error) { + return trimmed.slice(1, -1); + } +} + +function formatFileCount(count) { + return `${count} file${count === 1 ? 
'' : 's'}`; +} + +function previewChangedPaths(paths) { + if (!Array.isArray(paths) || paths.length === 0) { + return ''; + } + + if (paths.length <= MAX_CHANGED_PATH_PREVIEW) { + return paths.join(', '); + } + + const preview = paths.slice(0, MAX_CHANGED_PATH_PREVIEW).join(', '); + return `${preview}, +${paths.length - MAX_CHANGED_PATH_PREVIEW} more`; +} + +function deriveRepoChangeStatus(statusPair) { + if (statusPair === '??') { + return { + statusCode: '??', + statusLabel: 'U', + statusText: 'Untracked', + }; + } + + const code = [statusPair[1], statusPair[0]].find((value) => value && value !== ' ') || 'M'; + const statusTextByCode = { + A: 'Added', + C: 'Copied', + D: 'Deleted', + M: 'Modified', + R: 'Renamed', + T: 'Type changed', + U: 'Conflicted', + }; + + return { + statusCode: code, + statusLabel: code, + statusText: statusTextByCode[code] || 'Changed', + }; +} + +function parseRepoChangeLine(repoRoot, line) { + if (typeof line !== 'string' || line.length < 4) { + return null; + } + + const statusPair = line.slice(0, 2); + if (statusPair === '!!') { + return null; + } + + const rawPath = line.slice(3).trim(); + if (!rawPath) { + return null; + } + + let relativePath = rawPath; + let originalPath = ''; + if (rawPath.includes(' -> ')) { + const parts = rawPath.split(' -> '); + if (parts.length === 2) { + originalPath = unquoteGitPath(parts[0]); + relativePath = parts[1]; + } + } + + relativePath = unquoteGitPath(relativePath); + if (!relativePath) { + return null; + } + + const normalizedRelativePath = relativePath.split(path.sep).join('/'); + if ( + normalizedRelativePath === ACTIVE_SESSIONS_FILTER_PREFIX + || normalizedRelativePath.startsWith(`${ACTIVE_SESSIONS_FILTER_PREFIX}/`) + ) { + return null; + } + + const status = deriveRepoChangeStatus(statusPair); + return { + ...status, + originalPath, + relativePath, + absolutePath: path.join(path.resolve(repoRoot), relativePath), + }; +} + +function collectWorktreeChangedPaths(worktreePath) { + const 
changedGroups = [ + runGitLines(worktreePath, ['diff', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`]), + runGitLines(worktreePath, ['diff', '--cached', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`]), + runGitLines(worktreePath, ['ls-files', '--others', '--exclude-standard']), + ]; + + if (changedGroups.some((group) => group === null)) { + return null; + } + + return [...new Set(changedGroups.flat())] + .filter((relativePath) => relativePath && relativePath !== LOCK_FILE_RELATIVE) + .sort((left, right) => left.localeCompare(right)); +} + +function deriveSessionActivity(session) { + const changedPaths = collectWorktreeChangedPaths(session.worktreePath); + if (!changedPaths) { + return { + activityKind: 'thinking', + activityLabel: 'thinking', + activityCountLabel: '', + activitySummary: 'Worktree activity unavailable.', + changeCount: 0, + changedPaths: [], + }; + } + + if (changedPaths.length === 0) { + return { + activityKind: 'thinking', + activityLabel: 'thinking', + activityCountLabel: '', + activitySummary: 'Worktree clean.', + changeCount: 0, + changedPaths: [], + }; + } + + return { + activityKind: 'working', + activityLabel: 'working', + activityCountLabel: formatFileCount(changedPaths.length), + activitySummary: previewChangedPaths(changedPaths), + changeCount: changedPaths.length, + changedPaths, + }; +} + +function buildSessionRecord(input) { + const repoRoot = path.resolve(toNonEmptyString(input.repoRoot)); + const worktreePath = path.resolve(toNonEmptyString(input.worktreePath)); + const branch = toNonEmptyString(input.branch); + const pid = toPositiveInteger(input.pid); + const startedAt = input.startedAt ? 
new Date(input.startedAt) : new Date(); + + if (!branch) { + throw new Error('branch is required'); + } + if (!repoRoot) { + throw new Error('repoRoot is required'); + } + if (!worktreePath) { + throw new Error('worktreePath is required'); + } + if (!pid) { + throw new Error('pid must be a positive integer'); + } + if (Number.isNaN(startedAt.getTime())) { + throw new Error('startedAt must be a valid date'); + } + + return { + schemaVersion: SESSION_SCHEMA_VERSION, + repoRoot, + branch, + taskName: toNonEmptyString(input.taskName, 'task'), + agentName: toNonEmptyString(input.agentName, 'agent'), + worktreePath, + pid, + cliName: toNonEmptyString(input.cliName, 'codex'), + startedAt: startedAt.toISOString(), + }; +} + +function deriveSessionLabel(branch, worktreePath) { + const worktreeLeaf = toNonEmptyString(path.basename(worktreePath || '')); + if (worktreeLeaf) { + return worktreeLeaf; + } + return toNonEmptyString(branch).replace(/[\\/]+/g, '-') || 'unknown-agent'; +} + +function normalizeSessionRecord(input, options = {}) { + if (!input || typeof input !== 'object') { + return null; + } + + const repoRoot = toNonEmptyString(input.repoRoot); + const branch = toNonEmptyString(input.branch); + const worktreePath = toNonEmptyString(input.worktreePath); + const startedAt = new Date(input.startedAt); + const pid = toPositiveInteger(input.pid); + + if (!repoRoot || !branch || !worktreePath || !pid || Number.isNaN(startedAt.getTime())) { + return null; + } + + return { + schemaVersion: toPositiveInteger(input.schemaVersion) || SESSION_SCHEMA_VERSION, + repoRoot: path.resolve(repoRoot), + branch, + taskName: toNonEmptyString(input.taskName, 'task'), + agentName: toNonEmptyString(input.agentName, 'agent'), + worktreePath: path.resolve(worktreePath), + pid, + cliName: toNonEmptyString(input.cliName, 'codex'), + startedAt: startedAt.toISOString(), + filePath: toNonEmptyString(options.filePath), + label: deriveSessionLabel(branch, worktreePath), + }; +} + +function 
formatElapsedFrom(startedAt, now = Date.now()) { + const startedAtMs = startedAt instanceof Date ? startedAt.getTime() : Date.parse(startedAt); + if (!Number.isFinite(startedAtMs)) { + return '0s'; + } + + const totalSeconds = Math.max(0, Math.floor((now - startedAtMs) / 1000)); + const days = Math.floor(totalSeconds / 86_400); + const hours = Math.floor((totalSeconds % 86_400) / 3_600); + const minutes = Math.floor((totalSeconds % 3_600) / 60); + const seconds = totalSeconds % 60; + + if (days > 0) { + return `${days}d ${hours}h`; + } + if (hours > 0) { + return `${hours}h ${minutes}m`; + } + if (minutes > 0) { + return `${minutes}m ${seconds}s`; + } + return `${seconds}s`; +} + +function isPidAlive(pid) { + const normalizedPid = toPositiveInteger(pid); + if (!normalizedPid) { + return false; + } + + try { + process.kill(normalizedPid, 0); + return true; + } catch (_error) { + return false; + } +} + +function readActiveSessions(repoRoot, options = {}) { + const activeSessionsDir = activeSessionsDirForRepo(repoRoot); + if (!fs.existsSync(activeSessionsDir)) { + return []; + } + + const now = options.now || Date.now(); + const sessions = []; + for (const entry of fs.readdirSync(activeSessionsDir, { withFileTypes: true })) { + if (!entry.isFile() || !entry.name.endsWith('.json')) { + continue; + } + + const filePath = path.join(activeSessionsDir, entry.name); + let parsed; + try { + parsed = JSON.parse(fs.readFileSync(filePath, 'utf8')); + } catch (_error) { + continue; + } + + const normalized = normalizeSessionRecord(parsed, { filePath }); + if (!normalized) { + continue; + } + if (!options.includeStale && !isPidAlive(normalized.pid)) { + continue; + } + + normalized.elapsedLabel = formatElapsedFrom(normalized.startedAt, now); + Object.assign(normalized, deriveSessionActivity(normalized)); + sessions.push(normalized); + } + + sessions.sort((left, right) => { + const timeDelta = Date.parse(right.startedAt) - Date.parse(left.startedAt); + if (timeDelta !== 0) { + 
return timeDelta; + } + return left.label.localeCompare(right.label); + }); + + return sessions; +} + +function readRepoChanges(repoRoot) { + const statusLines = runGitLines(repoRoot, ['status', '--porcelain=v1', '--untracked-files=all']); + if (!statusLines) { + return []; + } + + return statusLines + .map((line) => parseRepoChangeLine(repoRoot, line)) + .filter(Boolean) + .sort((left, right) => left.relativePath.localeCompare(right.relativePath)); +} + +module.exports = { + ACTIVE_SESSIONS_RELATIVE_DIR, + SESSION_SCHEMA_VERSION, + activeSessionsDirForRepo, + buildSessionRecord, + collectWorktreeChangedPaths, + deriveSessionLabel, + deriveSessionActivity, + formatElapsedFrom, + formatFileCount, + isPidAlive, + normalizeSessionRecord, + parseRepoChangeLine, + previewChangedPaths, + readActiveSessions, + readRepoChanges, + deriveRepoChangeStatus, + sanitizeBranchForFile, + sessionFileNameForBranch, + sessionFilePathForBranch, +}; From 4ded9ddfa35d3bd0894adf9682810aea237779f8 Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 10:45:20 +0200 Subject: [PATCH 44/48] Auto-finish: gx doctor repairs (#276) Co-authored-by: NagyVikt --- scripts/codex-agent.sh | 51 ------------------------------------------ 1 file changed, 51 deletions(-) diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index 1829d92..75b349c 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -6,7 +6,6 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" -NODE_BIN="${GUARDEX_NODE_BIN:-node}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -144,7 +143,6 @@ if ! 
git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" -active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -448,40 +446,6 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } -run_active_session_state() { - local action="$1" - shift - - if [[ ! -f "$active_session_state_script" ]]; then - return 0 - fi - if ! command -v "$NODE_BIN" >/dev/null 2>&1; then - return 0 - fi - - "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true -} - -record_active_session_state() { - local wt="$1" - local branch="$2" - - run_active_session_state \ - start \ - --repo "$repo_root" \ - --branch "$branch" \ - --task "$TASK_NAME" \ - --agent "$AGENT_NAME" \ - --worktree "$wt" \ - --pid "$$" \ - --cli "$CODEX_BIN" -} - -clear_active_session_state() { - local branch="$1" - run_active_session_state stop --repo "$repo_root" --branch "$branch" -} - origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -894,19 +858,6 @@ if ! ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi -active_session_recorded=0 -cleanup_active_session_state_on_exit() { - set +e - if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then - clear_active_session_state "$worktree_branch" - active_session_recorded=0 - fi -} - -record_active_session_state "$worktree_path" "$worktree_branch" -active_session_recorded=1 -trap cleanup_active_session_state_on_exit EXIT INT TERM - echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -915,8 +866,6 @@ codex_exit="$?" 
 set -e
 cd "$repo_root"
-cleanup_active_session_state_on_exit
-trap - EXIT INT TERM
 
 final_exit="$codex_exit"
 auto_finish_completed=0

From 8a49fbaa2a9d75c9255f116733c4e563f5893ec1 Mon Sep 17 00:00:00 2001
From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com>
Date: Wed, 22 Apr 2026 10:50:33 +0200
Subject: [PATCH 45/48] Keep gx doctor from treating manual conflict cleanup as repo failure (#277)

Recoverable auto-finish rebase and merge conflicts mean the repo is safe but the branch still needs a human conflict-resolution step. Doctor now reports those rows as actionable skips, keeps the compact summary honest, and preserves verbose tail text for the operator.

Constraint: Doctor should not look unsafe when only manual branch cleanup remains
Rejected: Keep recoverable rebase conflicts under failed count | it makes safe repos read as broken
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Manual conflict detection is text-pattern based; update it if finish-script conflict wording changes
Tested: node --check bin/multiagent-safety.js
Tested: node --test --test-name-pattern "doctor" test/install.test.js
Tested: openspec validate agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42 --type change --strict
Tested: openspec validate --specs
Not-tested: npm test remains red on pre-existing metadata parity assertion for scripts/agent-branch-start.sh vs templates/scripts/agent-branch-start.sh

Co-authored-by: NagyVikt
---
 bin/multiagent-safety.js          | 43 ++++++++++++++++++-
 .../proposal.md                   | 16 +++++++
 .../specs/doctor-workflow/spec.md | 18 ++++++++
 .../tasks.md                      | 24 +++++++++++
 test/install.test.js              | 13 +++---
 5 files changed, 107 insertions(+), 7 deletions(-)
 create mode 100644 openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/proposal.md
 create mode 100644 openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/specs/doctor-workflow/spec.md
 create mode 100644
openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js index 2d1e140..428af80 100755 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -754,6 +754,36 @@ function compactAutoFinishPathSegments(message) { }); } +function detectRecoverableAutoFinishConflict(message) { + const text = String(message || '').trim(); + if (!text) { + return null; + } + + if (/rebase --continue/i.test(text) && /rebase --abort/i.test(text)) { + return { + rawLabel: 'auto-finish requires manual rebase.', + summary: 'manual rebase required in the source-probe worktree; run rebase --continue or rebase --abort', + }; + } + + if (/Rebase\/merge '.+' into '.+' and resolve conflicts before finishing\./i.test(text)) { + return { + rawLabel: 'auto-finish requires manual rebase or merge.', + summary: 'manual rebase or merge required before auto-finish can continue', + }; + } + + if (/Merge conflict detected while merging/i.test(text)) { + return { + rawLabel: 'auto-finish requires manual merge resolution.', + summary: 'manual merge resolution required before auto-finish can continue', + }; + } + + return null; +} + function summarizeAutoFinishDetail(detail) { const trimmed = String(detail || '').trim(); const match = trimmed.match(/^\[(\w+)\]\s+([^:]+):\s*(.*)$/); @@ -764,8 +794,11 @@ function summarizeAutoFinishDetail(detail) { const [, status, rawBranch, rawMessage] = match; const branch = truncateMiddle(rawBranch, DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX); let message = String(rawMessage || '').trim(); + const recoverableConflict = status === 'skip' ? 
detectRecoverableAutoFinishConflict(message) : null; - if (status === 'fail') { + if (recoverableConflict) { + message = recoverableConflict.summary; + } else if (status === 'fail') { message = message.replace(/^auto-finish failed\.?\s*/i, ''); if (/\[agent-sync-guard\]/.test(message) && /Resolve conflicts/i.test(message)) { message = 'rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree'; @@ -3563,6 +3596,14 @@ function autoFinishReadyAgentBranches(repoRoot, options = {}) { continue; } + const recoverableConflict = detectRecoverableAutoFinishConflict(combinedOutput); + if (recoverableConflict) { + summary.skipped += 1; + const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; + summary.details.push(`[skip] ${branch}: ${recoverableConflict.rawLabel}${tail}`); + continue; + } + summary.failed += 1; const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; summary.details.push(`[fail] ${branch}: auto-finish failed.${tail}`); diff --git a/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/proposal.md b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/proposal.md new file mode 100644 index 0000000..5a8eea7 --- /dev/null +++ b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/proposal.md @@ -0,0 +1,16 @@ +## Why + +- `gx doctor` currently counts recoverable auto-finish rebase conflicts as hard failures even when the repo itself is safe and the only remaining work is a manual conflict resolution step. +- That makes long doctor sweeps look broken or unsafe when the real state is narrower: the branch cannot be auto-finished yet and needs a human to rebase or merge it. + +## What Changes + +- Reclassify recoverable auto-finish conflict states during `gx doctor` from `[fail]` to a manual-action `[skip]` status. 
+- Keep the compact default summary actionable and keep `--verbose-auto-finish` useful by preserving the raw tail text behind the skip line.
+- Add install-test coverage for the new summary counts and color behavior.
+
+## Impact
+
+- Affects only doctor auto-finish reporting for branches that hit recoverable rebase or merge conflicts.
+- Keeps true auto-finish failures red and failed; only manual-resolution conflict cases move to the skip/pending bucket.
+- Main risk: conflict detection could miss a new finish-script wording, so the pattern matching should stay narrow and test-backed.
diff --git a/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/specs/doctor-workflow/spec.md b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/specs/doctor-workflow/spec.md
new file mode 100644
index 0000000..93786ed
--- /dev/null
+++ b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/specs/doctor-workflow/spec.md
@@ -0,0 +1,18 @@
+## ADDED Requirements
+
+### Requirement: doctor sweep classifies manual conflict work as actionable skips
+The human-readable `gx doctor` auto-finish sweep SHALL classify recoverable manual conflict states as skip/manual-action rows instead of hard failures.
+
+#### Scenario: auto-finish rebase conflict becomes a skip/manual-action row
+- **GIVEN** a ready local `agent/*` branch exists during `gx doctor`
+- **AND** `scripts/agent-branch-finish.sh` stops because it needs a human to continue or abort a source-probe rebase
+- **WHEN** doctor prints the auto-finish summary
+- **THEN** the summary SHALL not count that branch as failed
+- **AND** the branch detail SHALL be emitted as a skip/manual-action row with the rebase instructions preserved in verbose mode
+
+#### Scenario: true auto-finish failures remain failures
+- **GIVEN** a ready local `agent/*` branch exists during `gx doctor`
+- **AND** `scripts/agent-branch-finish.sh` fails for a reason other than a recoverable manual conflict
+- **WHEN** doctor prints the auto-finish summary
+- **THEN** the summary SHALL still count that branch as failed
+- **AND** the branch detail SHALL remain a failed row
diff --git a/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md
new file mode 100644
index 0000000..cd23542
--- /dev/null
+++ b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md
@@ -0,0 +1,24 @@
+## 1. Specification
+
+- [x] 1.1 Finalize proposal scope and acceptance criteria for `doctor-auto-finish-manual-conflict`.
+- [x] 1.2 Define normative requirements in `specs/doctor-workflow/spec.md`.
+
+## 2. Implementation
+
+- [x] 2.1 Reclassify recoverable doctor auto-finish rebase/merge conflicts from failed rows to manual-action skip rows.
+- [x] 2.2 Keep compact default output actionable and preserve verbose raw tail text for manual conflict rows.
+- [x] 2.3 Update focused doctor/install regressions for counts, detail text, and ANSI colors.
+
+## 3. Verification
+
+- [x] 3.1 Run focused doctor/install verification (`node --test --test-name-pattern "doctor" test/install.test.js`, `node --check bin/multiagent-safety.js`).
+- [x] 3.2 Run `openspec validate agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42 --type change --strict`.
+- [x] 3.3 Run `openspec validate --specs`.
+
+Verification note: `node --check bin/multiagent-safety.js` passed. `node --test --test-name-pattern "doctor" test/install.test.js` passed with `18/18` doctor-focused tests, including the new skip/manual-conflict regressions. `openspec validate agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42 --type change --strict` passed, and `openspec validate --specs` returned `No items found to validate.` Extra check: `npm test` still fails on the pre-existing metadata parity assertion that `scripts/agent-branch-start.sh` diverges from `templates/scripts/agent-branch-start.sh`; this branch did not modify either file.
+
+## 4. Completion
+
+- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`).
+- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff.
+- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending.
diff --git a/test/install.test.js b/test/install.test.js index b3a5931..1af295c 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -1669,7 +1669,7 @@ exit 1 assert.match(combinedOutput, /Auto-finish sweep \(base=main\): attempted=1, completed=1, skipped=\d+, failed=0/); }); -test('doctor compacts auto-finish failures by default and expands them with --verbose-auto-finish', () => { +test('doctor treats recoverable auto-finish rebase conflicts as actionable skips', () => { const repoDir = initRepoOnBranch('main'); seedCommit(repoDir); attachOriginRemoteForBranch(repoDir, 'main'); @@ -1706,10 +1706,11 @@ exit 1 ); assert.equal(result.status, 0, result.stderr || result.stdout); const compactOutput = `${result.stdout}\n${result.stderr}`; + assert.match(compactOutput, /Auto-finish sweep \(base=main\): attempted=1, completed=0, skipped=\d+, failed=0/); assert.match( compactOutput, new RegExp( - `\\[fail\\] ${escapeRegexLiteral(readyBranch)}: rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree`, + `\\[skip\\] ${escapeRegexLiteral(readyBranch)}: manual rebase required in the source-probe worktree; run rebase --continue or rebase --abort`, ), ); assert.doesNotMatch(compactOutput, /git -C "\/tmp\/very\/long\/path\/for\/source-probe-agent-worktree/); @@ -1721,11 +1722,11 @@ exit 1 ); assert.equal(result.status, 0, result.stderr || result.stdout); const verboseOutput = `${result.stdout}\n${result.stderr}`; - assert.match(verboseOutput, new RegExp(`\\[fail\\] ${escapeRegexLiteral(readyBranch)}: auto-finish failed\\.`)); + assert.match(verboseOutput, new RegExp(`\\[skip\\] ${escapeRegexLiteral(readyBranch)}: auto-finish requires manual rebase\\.`)); assert.match(verboseOutput, /git -C ".+rebase --continue/); }); -test('doctor colors failure and success status lines when color output is enabled', () => { +test('doctor colors manual conflict skips yellow and success status lines green', () => { const repoDir = 
initRepoOnBranch('main'); seedCommit(repoDir); attachOriginRemoteForBranch(repoDir, 'main'); @@ -1767,12 +1768,12 @@ exit 1 assert.match(ansiOutput, /\u001B\[32m\[gitguardex\] ✅ No safety issues detected\.\u001B\[0m/); assert.match( ansiOutput, - /\u001B\[31m\[gitguardex\] Auto-finish sweep \(base=main\): attempted=1, completed=0, skipped=\d+, failed=1\u001B\[0m/, + /\u001B\[33m\[gitguardex\] Auto-finish sweep \(base=main\): attempted=1, completed=0, skipped=\d+, failed=0\u001B\[0m/, ); assert.match( ansiOutput, new RegExp( - `\\u001B\\[31m\\[gitguardex\\]\\s+\\[fail\\] ${escapeRegexLiteral(readyBranch)}: rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree\\u001B\\[0m`, + `\\u001B\\[33m\\[gitguardex\\]\\s+\\[skip\\] ${escapeRegexLiteral(readyBranch)}: manual rebase required in the source-probe worktree; run rebase --continue or rebase --abort\\u001B\\[0m`, ), ); assert.match(ansiOutput, /\u001B\[32m\[gitguardex\] ✅ Repo is fully safe\.\u001B\[0m/); From 9f1a129360b6293018f13505f68d7f52f055ee0b Mon Sep 17 00:00:00 2001 From: Viktor Nagy <137165288+NagyVikt@users.noreply.github.com> Date: Wed, 22 Apr 2026 10:54:34 +0200 Subject: [PATCH 46/48] Record the merged doctor fix as complete in its OpenSpec task log (#278) The behavior fix already merged, but its change record still showed the completion section unchecked. This follow-up only updates the existing task log so the OpenSpec artifact matches the actual PR, merge commit, and cleanup state. 
Constraint: Keep the follow-up scoped to the existing change record only
Rejected: Leave the merged change with stale completion state | it keeps the execution record inaccurate
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: When finish/cleanup evidence is only known after merge, record it in the merged change rather than leaving unchecked completion tasks behind
Tested: openspec validate agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42 --type change --strict
Not-tested: No code-path tests run; docs-only OpenSpec completion update

Co-authored-by: NagyVikt
---
 .../tasks.md | 13 ++++++++++---
 1 file changed, 10 insertions(+), 3 deletions(-)

diff --git a/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md
index cd23542..c4b60e4 100644
--- a/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md
+++ b/openspec/changes/agent-codex-doctor-auto-finish-manual-conflict-2026-04-22-10-42/tasks.md
@@ -19,6 +19,13 @@ Verification note: `node --check bin/multiagent-safety.js` passed. `node --test
 
 ## 4. Completion
 
-- [ ] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`).
-- [ ] 4.2 Record PR URL + final `MERGED` state in the completion handoff.
-- [ ] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending.
+- [x] 4.1 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`).
+- [x] 4.2 Record PR URL + final `MERGED` state in the completion handoff.
+- [x] 4.3 Confirm sandbox cleanup (`git worktree list`, `git branch -a`) or capture a `BLOCKED:` handoff if merge/cleanup is pending.
+
+Completion evidence:
+- PR: `#277`
+- Final state: `MERGED` into `main` at `2026-04-22T08:50:34Z`
+- Merge commit: `8a49fbaa2a9d75c9255f116733c4e563f5893ec1`
+- Merge/cleanup path: `bash scripts/agent-branch-finish.sh --branch "agent/codex/doctor-auto-finish-rebase-conflict-statu-2026-04-22-10-42" --base main --via-pr --wait-for-merge --cleanup`
+- Cleanup confirmation: `git worktree list` now shows only the primary repo plus one unrelated active `agent/gx/...` doctor sandbox, and `git branch -a | rg "doctor-auto-finish-rebase-conflict-statu-2026-04-22-10-42|main$|origin/main"` shows only `main` and `origin/main` after `git remote prune origin`

From c695b57f944a9ad99e8c39cecc19cf3f9a450807 Mon Sep 17 00:00:00 2001
From: NagyVikt
Date: Wed, 22 Apr 2026 12:36:45 +0200
Subject: [PATCH 47/48] Make extracted Guardex CLI seams executable and publishable

The branch already moved the Guardex CLI into src/**, but the extracted scaffold and git seams were still incomplete and the thin entrypoint/package regressions were only half-switched. This wires the runtime through the extracted scaffold, hooks, toolchain, and finish modules, restores the helper export contract that src/cli/main.js depends on, ships src/** in the published package, and locks the modular entrypoint with focused metadata and command-route regressions.
Constraint: The thin bin entrypoint must preserve the existing gx command surface for installed CLIs Constraint: The existing agent branch already carried partial refactor state in bin/src/scripts, so the fix had to resume that lane instead of restarting from dev Rejected: Re-expand logic back into bin/multiagent-safety.js | would undo the modularization goal Confidence: medium Scope-risk: moderate Reversibility: clean Directive: Keep future CLI refactors moving behavior into src/** and verify module export contracts before thinning the bin entrypoint Tested: node --check bin/multiagent-safety.js; find src -type f -name '*.js' -exec node --check {} +; node -e "require('./src/cli/main'); console.log('cli-main-ok')"; node bin/multiagent-safety.js --version; node --test test/metadata.test.js; node --test --test-name-pattern "thin entrypoint still routes hook install through the extracted hooks module|thin entrypoint routes finish --all --dry-run through the extracted finish module" test/install.test.js; npm pack --dry-run; openspec validate agent-codex-decompose-cli-monolith-2026-04-22-11-06 --type change --strict; openspec validate --specs Not-tested: Full test/install.test.js bundle (long-running background path; replaced with targeted command-route coverage) --- bin/multiagent-safety.js | 7904 +---------------- .../.openspec.yaml | 2 + .../proposal.md | 18 + .../specs/cli-modularization/spec.md | 29 + .../tasks.md | 40 + package.json | 1 + scripts/agent-branch-start.sh | 52 +- scripts/codex-agent.sh | 143 +- src/cli/args.js | 837 ++ src/cli/dispatch.js | 86 + src/cli/main.js | 6150 +++++++++++++ src/context.js | 503 ++ src/core/runtime.js | 119 + src/finish/index.js | 425 + src/git/index.js | 112 + src/hooks/index.js | 77 + src/output/index.js | 398 + src/sandbox/index.js | 68 + src/scaffold/index.js | 169 + src/toolchain/index.js | 223 + test/install.test.js | 26 +- test/metadata.test.js | 57 +- 22 files changed, 9441 insertions(+), 7998 deletions(-) mode change 
100755 => 100644 bin/multiagent-safety.js create mode 100644 openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/.openspec.yaml create mode 100644 openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/proposal.md create mode 100644 openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/specs/cli-modularization/spec.md create mode 100644 openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/tasks.md create mode 100644 src/cli/args.js create mode 100644 src/cli/dispatch.js create mode 100644 src/cli/main.js create mode 100644 src/context.js create mode 100644 src/core/runtime.js create mode 100644 src/finish/index.js create mode 100644 src/git/index.js create mode 100644 src/hooks/index.js create mode 100644 src/output/index.js create mode 100644 src/sandbox/index.js create mode 100644 src/scaffold/index.js create mode 100644 src/toolchain/index.js diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js old mode 100755 new mode 100644 index 428af80..a38e9e9 --- a/bin/multiagent-safety.js +++ b/bin/multiagent-safety.js @@ -1,7905 +1,5 @@ #!/usr/bin/env node -const fs = require('node:fs'); -const os = require('node:os'); -const path = require('node:path'); -const cp = require('node:child_process'); +const { runFromBin } = require('../src/cli/main'); -const PACKAGE_ROOT = path.resolve(__dirname, '..'); -const packageJsonPath = path.join(PACKAGE_ROOT, 'package.json'); -const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); - -const TOOL_NAME = 'gitguardex'; -const SHORT_TOOL_NAME = 'gx'; -if (!process.env.GUARDEX_CLI_ENTRY) { - process.env.GUARDEX_CLI_ENTRY = __filename; -} -if (!process.env.GUARDEX_NODE_BIN) { - process.env.GUARDEX_NODE_BIN = process.execPath; -} -const LEGACY_NAMES = ['guardex', 'multiagent-safety']; -const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`; -const OPENSPEC_PACKAGE = '@fission-ai/openspec'; -const OMC_PACKAGE = 'oh-my-claude-sisyphus'; -const 
OMC_REPO_URL = 'https://github.com/Yeachan-Heo/oh-my-claudecode'; -const CAVEMEM_PACKAGE = 'cavemem'; -const NPX_BIN = process.env.GUARDEX_NPX_BIN || 'npx'; -const GUARDEX_HOME_DIR = path.resolve(process.env.GUARDEX_HOME_DIR || os.homedir()); -const GLOBAL_TOOLCHAIN_SERVICES = [ - { name: 'oh-my-codex', packageName: 'oh-my-codex' }, - { - name: 'oh-my-claudecode', - packageName: OMC_PACKAGE, - dependencyUrl: OMC_REPO_URL, - }, - { name: OPENSPEC_PACKAGE, packageName: OPENSPEC_PACKAGE }, - { name: CAVEMEM_PACKAGE, packageName: CAVEMEM_PACKAGE }, - { - name: '@imdeadpool/codex-account-switcher', - packageName: '@imdeadpool/codex-account-switcher', - }, -]; -const GLOBAL_TOOLCHAIN_PACKAGES = [ - ...GLOBAL_TOOLCHAIN_SERVICES.map((service) => service.packageName), -]; -const OPTIONAL_LOCAL_COMPANION_TOOLS = [ - { - name: 'cavekit', - candidatePaths: [ - '.cavekit/plugin.json', - '.codex/local-marketplaces/cavekit/.agents/plugins/marketplace.json', - ], - installCommand: `${NPX_BIN} skills add JuliusBrussee/cavekit`, - installArgs: ['skills', 'add', 'JuliusBrussee/cavekit'], - }, - { - name: 'caveman', - candidatePaths: [ - '.config/caveman/config.json', - '.cavekit/skills/caveman/SKILL.md', - ], - installCommand: `${NPX_BIN} skills add JuliusBrussee/caveman`, - installArgs: ['skills', 'add', 'JuliusBrussee/caveman'], - }, -]; -const GH_BIN = process.env.GUARDEX_GH_BIN || 'gh'; -const REQUIRED_SYSTEM_TOOLS = [ - { - name: 'gh', - displayName: 'GitHub (gh)', - command: GH_BIN, - installHint: 'https://cli.github.com/', - }, -]; -const MAINTAINER_RELEASE_REPO = path.resolve( - process.env.GUARDEX_RELEASE_REPO || path.resolve(__dirname, '..'), -); -const NPM_BIN = process.env.GUARDEX_NPM_BIN || 'npm'; -const OPENSPEC_BIN = process.env.GUARDEX_OPENSPEC_BIN || 'openspec'; -const SCORECARD_BIN = process.env.GUARDEX_SCORECARD_BIN || 'scorecard'; -const GIT_PROTECTED_BRANCHES_KEY = 'multiagent.protectedBranches'; -const GIT_BASE_BRANCH_KEY = 'multiagent.baseBranch'; -const 
GIT_SYNC_STRATEGY_KEY = 'multiagent.sync.strategy'; -const GUARDEX_REPO_TOGGLE_ENV = 'GUARDEX_ON'; -const DEFAULT_PROTECTED_BRANCHES = ['dev', 'main', 'master']; -const DEFAULT_BASE_BRANCH = 'dev'; -const DEFAULT_SYNC_STRATEGY = 'rebase'; -const DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES = 60; -const COMPOSE_HINT_FILES = [ - 'docker-compose.yml', - 'docker-compose.yaml', - 'compose.yml', - 'compose.yaml', -]; - -const TEMPLATE_ROOT = path.join(PACKAGE_ROOT, 'templates'); - -const HOOK_NAMES = ['pre-commit', 'pre-push', 'post-merge', 'post-checkout']; - -const TEMPLATE_FILES = [ - 'scripts/agent-session-state.js', - 'scripts/guardex-docker-loader.sh', - 'scripts/guardex-env.sh', - 'scripts/install-vscode-active-agents-extension.js', - 'github/pull.yml.example', - 'github/workflows/cr.yml', - 'vscode/guardex-active-agents/package.json', - 'vscode/guardex-active-agents/extension.js', - 'vscode/guardex-active-agents/session-schema.js', - 'vscode/guardex-active-agents/README.md', -]; - -const LEGACY_WORKFLOW_SHIM_SPECS = [ - { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, - { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, - { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, - { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, - { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, - { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, - { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, - { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, - { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'changeInit'] }, -]; - 
-const LEGACY_WORKFLOW_SHIMS = LEGACY_WORKFLOW_SHIM_SPECS.map((entry) => entry.relativePath);
-
-const MANAGED_TEMPLATE_DESTINATIONS = TEMPLATE_FILES.map((entry) => toDestinationPath(entry));
-const MANAGED_TEMPLATE_SCRIPT_FILES = MANAGED_TEMPLATE_DESTINATIONS.filter((entry) =>
-  entry.startsWith('scripts/'),
-);
-
-const LEGACY_MANAGED_REPO_FILES = [
-  ...LEGACY_WORKFLOW_SHIMS,
-  'scripts/agent-session-state.js',
-  'scripts/guardex-docker-loader.sh',
-  'scripts/install-vscode-active-agents-extension.js',
-  'scripts/guardex-env.sh',
-  'scripts/install-agent-git-hooks.sh',
-  '.githooks/pre-commit',
-  '.githooks/pre-push',
-  '.githooks/post-merge',
-  '.githooks/post-checkout',
-  '.codex/skills/gitguardex/SKILL.md',
-  '.codex/skills/guardex-merge-skills-to-dev/SKILL.md',
-  '.claude/commands/gitguardex.md',
-];
-
-const REQUIRED_MANAGED_REPO_FILES = [
-  ...MANAGED_TEMPLATE_DESTINATIONS,
-  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
-  '.omx/state/agent-file-locks.json',
-];
-
-const LEGACY_MANAGED_PACKAGE_SCRIPTS = {
-  'agent:codex': 'bash ./scripts/codex-agent.sh',
-  'agent:branch:start': 'bash ./scripts/agent-branch-start.sh',
-  'agent:branch:finish': 'bash ./scripts/agent-branch-finish.sh',
-  'agent:branch:merge': 'bash ./scripts/agent-branch-merge.sh',
-  'agent:cleanup': 'gx cleanup',
-  'agent:hooks:install': 'bash ./scripts/install-agent-git-hooks.sh',
-  'agent:locks:claim': 'python3 ./scripts/agent-file-locks.py claim',
-  'agent:locks:allow-delete': 'python3 ./scripts/agent-file-locks.py allow-delete',
-  'agent:locks:release': 'python3 ./scripts/agent-file-locks.py release',
-  'agent:locks:status': 'python3 ./scripts/agent-file-locks.py status',
-  'agent:plan:init': 'bash ./scripts/openspec/init-plan-workspace.sh',
-  'agent:change:init': 'bash ./scripts/openspec/init-change-workspace.sh',
-  'agent:protect:list': 'gx protect list',
-  'agent:branch:sync': 'gx sync',
-  'agent:branch:sync:check': 'gx sync --check',
-  'agent:safety:setup': 'gx setup',
-  'agent:safety:scan': 'gx status --strict',
-  'agent:safety:fix': 'gx setup --repair',
-  'agent:safety:doctor': 'gx doctor',
-  'agent:docker:load': 'bash ./scripts/guardex-docker-loader.sh',
-  'agent:review:watch': 'bash ./scripts/review-bot-watch.sh',
-  'agent:finish': 'gx finish --all',
-};
-
-const PACKAGE_SCRIPT_ASSETS = {
-  branchStart: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-start.sh'),
-  branchFinish: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-finish.sh'),
-  branchMerge: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-merge.sh'),
-  codexAgent: path.join(TEMPLATE_ROOT, 'scripts', 'codex-agent.sh'),
-  reviewBot: path.join(TEMPLATE_ROOT, 'scripts', 'review-bot-watch.sh'),
-  worktreePrune: path.join(TEMPLATE_ROOT, 'scripts', 'agent-worktree-prune.sh'),
-  lockTool: path.join(TEMPLATE_ROOT, 'scripts', 'agent-file-locks.py'),
-  planInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-plan-workspace.sh'),
-  changeInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-change-workspace.sh'),
-};
-
-const USER_LEVEL_SKILL_ASSETS = [
-  {
-    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'gitguardex', 'SKILL.md'),
-    destination: path.join('.codex', 'skills', 'gitguardex', 'SKILL.md'),
-  },
-  {
-    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
-    destination: path.join('.codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
-  },
-  {
-    source: path.join(TEMPLATE_ROOT, 'claude', 'commands', 'gitguardex.md'),
-    destination: path.join('.claude', 'commands', 'gitguardex.md'),
-  },
-];
-
-const EXECUTABLE_RELATIVE_PATHS = new Set([
-  ...MANAGED_TEMPLATE_SCRIPT_FILES,
-  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
-]);
-
-const CRITICAL_GUARDRAIL_PATHS = new Set([
-  'AGENTS.md',
-  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
-  'scripts/guardex-env.sh',
-]);
-
-const LOCK_FILE_RELATIVE = '.omx/state/agent-file-locks.json';
-const AGENTS_BOTS_STATE_RELATIVE = '.omx/state/agents-bots.json';
-const AGENTS_MARKER_START = '';
-const AGENTS_MARKER_END = '';
-const GITIGNORE_MARKER_START = '# multiagent-safety:START';
-const GITIGNORE_MARKER_END = '# multiagent-safety:END';
-const CODEX_WORKTREE_RELATIVE_DIR = path.join('.omx', 'agent-worktrees');
-const CLAUDE_WORKTREE_RELATIVE_DIR = path.join('.omc', 'agent-worktrees');
-const AGENT_WORKTREE_RELATIVE_DIRS = [
-  CODEX_WORKTREE_RELATIVE_DIR,
-  CLAUDE_WORKTREE_RELATIVE_DIR,
-];
-const MANAGED_GITIGNORE_PATHS = [
-  '.omx/',
-  '.omc/',
-  'scripts/agent-session-state.js',
-  'scripts/guardex-docker-loader.sh',
-  'scripts/guardex-env.sh',
-  'scripts/install-vscode-active-agents-extension.js',
-  '.githooks',
-  'oh-my-codex/',
-  LOCK_FILE_RELATIVE,
-];
-const REPO_SCAFFOLD_DIRECTORIES = ['bin'];
-const OMX_SCAFFOLD_DIRECTORIES = [
-  '.omx',
-  '.omx/state',
-  '.omx/logs',
-  '.omx/plans',
-  CODEX_WORKTREE_RELATIVE_DIR,
-  '.omc',
-  CLAUDE_WORKTREE_RELATIVE_DIR,
-];
-const OMX_SCAFFOLD_FILES = new Map([
-  ['.omx/notepad.md', '\n\n## WORKING MEMORY\n'],
-  ['.omx/project-memory.json', '{}\n'],
-]);
-const TARGETED_FORCEABLE_MANAGED_PATHS = new Set([
-  'AGENTS.md',
-  '.gitignore',
-  ...Array.from(OMX_SCAFFOLD_FILES.keys()),
-  ...REQUIRED_MANAGED_REPO_FILES,
-  ...LEGACY_WORKFLOW_SHIMS,
-]);
-const COMMAND_TYPO_ALIASES = new Map([
-  ['relaese', 'release'],
-  ['realaese', 'release'],
-  ['relase', 'release'],
-  ['setpu', 'setup'],
-  ['inti', 'init'],
-  ['intsall', 'install'],
-  ['docter', 'doctor'],
-  ['doctro', 'doctor'],
-  ['cleunup', 'cleanup'],
-  ['scna', 'scan'],
-]);
-const SUGGESTIBLE_COMMANDS = [
-  'status',
-  'setup',
-  'doctor',
-  'branch',
-  'locks',
-  'worktree',
-  'hook',
-  'migrate',
-  'install-agent-skills',
-  'agents',
-  'merge',
-  'finish',
-  'report',
-  'protect',
-  'sync',
-  'cleanup',
-  'prompt',
-  'help',
-  'version',
-  // deprecated aliases still routable with a warning
-  'init',
-  'install',
-  'fix',
-  'scan',
-  'review',
-  'copy-prompt',
-  'copy-commands',
-  'print-agents-snippet',
-  'release',
-];
-const CLI_COMMAND_DESCRIPTIONS = [
-  ['status', 'Show GitGuardex CLI + service health without modifying files'],
-  ['setup', 'Install, repair, and verify guardrails (flags: --repair, --install-only, --target)'],
-  ['doctor', 'Repair drift + verify (auto-sandboxes on protected main)'],
-  ['branch', 'CLI-owned branch workflow surface (start/finish/merge)'],
-  ['locks', 'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'],
-  ['worktree', 'CLI-owned worktree cleanup surface (prune)'],
-  ['hook', 'Hook dispatch/install surface used by managed shims'],
-  ['migrate', 'Convert legacy repo-local installs to the zero-copy CLI-owned surface'],
-  ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'],
-  ['protect', 'Manage protected branches (list/add/remove/set/reset)'],
-  ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'],
-  ['sync', 'Sync agent branches with origin/'],
-  ['finish', 'Commit + PR + merge completed agent branches (--all, --branch)'],
-  ['cleanup', 'Prune merged/stale agent branches and worktrees'],
-  ['release', 'Create or update the current GitHub release with README-generated notes'],
-  ['agents', 'Start/stop repo-scoped review + cleanup bots'],
-  ['prompt', 'Print AI setup checklist (--exec, --snippet)'],
-  ['report', 'Security/safety reports (e.g. OpenSSF scorecard)'],
-  ['help', 'Show this help output'],
-  ['version', 'Print GitGuardex version'],
-];
-const DEPRECATED_COMMAND_ALIASES = new Map([
-  ['init', { target: 'setup', hint: 'gx setup' }],
-  ['install', { target: 'setup', hint: 'gx setup --install-only' }],
-  ['fix', { target: 'setup', hint: 'gx setup --repair' }],
-  ['scan', { target: 'status', hint: 'gx status --strict' }],
-  ['copy-prompt', { target: 'prompt', hint: 'gx prompt' }],
-  ['copy-commands', { target: 'prompt', hint: 'gx prompt --exec' }],
-  ['print-agents-snippet', { target: 'prompt', hint: 'gx prompt --snippet' }],
-  ['review', { target: 'agents', hint: 'gx agents start (runs review + cleanup)' }],
-]);
-const AGENT_BOT_DESCRIPTIONS = [
-  ['agents', 'Start/stop review + cleanup bots for this repo'],
-];
-const DOCTOR_AUTO_FINISH_DETAIL_LIMIT = 6;
-const DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX = 72;
-const DOCTOR_AUTO_FINISH_MESSAGE_MAX = 160;
-
-function envFlagIsTruthy(raw) {
-  const lowered = String(raw || '').trim().toLowerCase();
-  return lowered === '1' || lowered === 'true' || lowered === 'yes' || lowered === 'on';
-}
-
-function isClaudeCodeSession(env = process.env) {
-  return envFlagIsTruthy(env.CLAUDECODE) || Boolean(env.CLAUDE_CODE_SESSION_ID);
-}
-
-function defaultAgentWorktreeRelativeDir(env = process.env) {
-  return isClaudeCodeSession(env) ? CLAUDE_WORKTREE_RELATIVE_DIR : CODEX_WORKTREE_RELATIVE_DIR;
-}
-
-const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in this repo.
-
-1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version
-2) Bootstrap: gx setup
-3) Repair: gx doctor
-4) Task loop: gx branch start "" ""
-   then gx locks claim --branch "" -> gx branch finish
-5) Integrate: gx merge --branch --branch
-6) Finish: gx finish --all
-7) Cleanup: gx cleanup
-8) OpenSpec: /opsx:propose -> /opsx:apply -> /opsx:archive
-9) Optional: gx protect add release staging
-10) Optional: gx sync --check && gx sync
-11) Review bot: install https://github.com/apps/cr-gpt + set OPENAI_API_KEY
-12) Fork sync: install https://github.com/apps/pull + cp .github/pull.yml.example .github/pull.yml
-`;
-
-const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND}
-gh --version
-gx setup
-gx doctor
-gx branch start "" ""
-gx locks claim --branch ""
-gx merge --branch "" --branch ""
-gx finish --all
-gx cleanup
-gx protect add release staging
-gx sync --check && gx sync
-`;
-
-const SCORECARD_RISK_BY_CHECK = {
-  'Dangerous-Workflow': 'Critical',
-  'Code-Review': 'High',
-  Maintained: 'High',
-  'Binary-Artifacts': 'High',
-  'Dependency-Update-Tool': 'High',
-  'Token-Permissions': 'High',
-  Vulnerabilities: 'High',
-  'Branch-Protection': 'High',
-  Fuzzing: 'Medium',
-  'Pinned-Dependencies': 'Medium',
-  SAST: 'Medium',
-  'Security-Policy': 'Medium',
-  'CII-Best-Practices': 'Low',
-  Contributors: 'Low',
-  License: 'Low',
-};
-
-function runtimeVersion() {
-  return `${packageJson.name}/${packageJson.version} ${process.platform}-${process.arch} node-${process.version}`;
-}
-
-function supportsAnsiColors() {
-  const forced = String(process.env.FORCE_COLOR || '').trim().toLowerCase();
-  if (['0', 'false', 'no', 'off'].includes(forced)) {
-    return false;
-  }
-  if (forced.length > 0) {
-    return true;
-  }
-  if (process.env.NO_COLOR) {
-    return false;
-  }
-  return Boolean(process.stdout.isTTY) && process.env.TERM !== 'dumb';
-}
-
-function colorize(text, colorCode) {
-  if (!supportsAnsiColors()) {
-    return text;
-  }
-  return `\u001B[${colorCode}m${text}\u001B[0m`;
-}
-
-function doctorOutputColorCode(status) {
-  const normalized = String(status || '').trim().toLowerCase();
-  if (['active', 'done', 'ok', 'safe', 'success'].includes(normalized)) {
-    return '32';
-  }
-  if (normalized === 'disabled') {
-    return '36';
-  }
-  if (['degraded', 'pending', 'skip', 'warn', 'warning'].includes(normalized)) {
-    return '33';
-  }
-  if (['error', 'fail', 'inactive', 'unsafe'].includes(normalized)) {
-    return '31';
-  }
-  return null;
-}
-
-function colorizeDoctorOutput(text, status) {
-  const colorCode = doctorOutputColorCode(status);
-  return colorCode ? colorize(text, colorCode) : text;
-}
-
-function detectAutoFinishDetailStatus(detail) {
-  const trimmed = String(detail || '').trim();
-  const match = trimmed.match(/^\[(\w+)\]/);
-  if (match) {
-    return match[1].toLowerCase();
-  }
-  if (/^Skipped\b/i.test(trimmed) || /^No local agent branches found\b/i.test(trimmed)) {
-    return 'skip';
-  }
-  return null;
-}
-
-function detectAutoFinishSummaryStatus(summary) {
-  if (!summary || summary.enabled === false) {
-    return detectAutoFinishDetailStatus(summary?.details?.[0]);
-  }
-  if ((summary.failed || 0) > 0) {
-    return 'fail';
-  }
-  if ((summary.completed || 0) > 0) {
-    return 'done';
-  }
-  if ((summary.skipped || 0) > 0) {
-    return 'skip';
-  }
-  return null;
-}
-
-function statusDot(status) {
-  if (status === 'active') {
-    return colorize('●', '32'); // green
-  }
-  if (status === 'inactive') {
-    return colorize('●', '31'); // red
-  }
-  if (status === 'disabled') {
-    return colorize('●', '36'); // cyan
-  }
-  return colorize('●', '33'); // yellow for degraded/unknown
-}
-
-function commandCatalogLines(indent = '  ') {
-  const maxCommandLength = CLI_COMMAND_DESCRIPTIONS.reduce(
-    (max, [command]) => Math.max(max, command.length),
-    0,
-  );
-  return CLI_COMMAND_DESCRIPTIONS.map(
-    ([command, description]) => `${indent}${command.padEnd(maxCommandLength + 2)}${description}`,
-  );
-}
-
-function agentBotCatalogLines(indent = '  ') {
-  const maxCommandLength = AGENT_BOT_DESCRIPTIONS.reduce(
-    (max, [command]) => Math.max(max, command.length),
-    0,
-  );
-  return AGENT_BOT_DESCRIPTIONS.map(
-    ([command, description]) => `${indent}${command.padEnd(maxCommandLength + 2)}${description}`,
-  );
-}
-
-function repoToggleLines(indent = '  ') {
-  return [
-    `${indent}Set repo-root .env: ${GUARDEX_REPO_TOGGLE_ENV}=0 disables Guardex, ${GUARDEX_REPO_TOGGLE_ENV}=1 enables it again`,
-  ];
-}
-
-function printToolLogsSummary() {
-  const usageLine = `  $ ${SHORT_TOOL_NAME} [options]`;
-  const commandDetails = commandCatalogLines('  ');
-  const agentBotDetails = agentBotCatalogLines('  ');
-  const repoToggleDetails = repoToggleLines('  ');
-
-  if (!supportsAnsiColors()) {
-    console.log(`${TOOL_NAME}-tools logs:`);
-    console.log('  USAGE');
-    console.log(usageLine);
-    console.log('  COMMANDS');
-    for (const line of commandDetails) {
-      console.log(line);
-    }
-    console.log('  AGENT BOT');
-    for (const line of agentBotDetails) {
-      console.log(line);
-    }
-    console.log('  REPO TOGGLE');
-    for (const line of repoToggleDetails) {
-      console.log(line);
-    }
-    return;
-  }
-
-  const title = colorize(`${TOOL_NAME}-tools logs`, '1;36');
-  const usageHeader = colorize('USAGE', '1');
-  const commandsHeader = colorize('COMMANDS', '1');
-  const agentBotHeader = colorize('AGENT BOT', '1');
-  const repoToggleHeader = colorize('REPO TOGGLE', '1');
-  const pipe = colorize('│', '90');
-  const tee = colorize('├', '90');
-  const corner = colorize('└', '90');
-
-  console.log(`${title}:`);
-  console.log(` ${tee}─ ${usageHeader}`);
-  console.log(` ${pipe}${usageLine}`);
-  console.log(` ${tee}─ ${commandsHeader}`);
-  for (const line of commandDetails) {
-    if (!line) {
-      console.log(` ${pipe}`);
-      continue;
-    }
-    console.log(` ${pipe}${line.slice(2)}`);
-  }
-  console.log(` ${tee}─ ${agentBotHeader}`);
-  for (const line of agentBotDetails) {
-    if (!line) {
-      console.log(` ${pipe}`);
-      continue;
-    }
-    console.log(` ${pipe}${line.slice(2)}`);
-  }
-  console.log(` ${tee}─ ${repoToggleHeader}`);
-  for (const line of repoToggleDetails) {
-    if (!line) {
-      console.log(` ${pipe}`);
-      continue;
-    }
-    console.log(` ${pipe}${line.slice(2)}`);
-  }
-  console.log(` ${corner}─ ${colorize(`Try '${TOOL_NAME} doctor' for one-step repair + verification.`, '2')}`);
-}
-
-function usage(options = {}) {
-  const { outsideGitRepo = false } = options;
-
-  console.log(`A command-line tool that sets up hardened multi-agent safety for git repositories.
-
-VERSION
-  ${runtimeVersion()}
-
-USAGE
-  $ ${SHORT_TOOL_NAME} [options]
-
-COMMANDS
-${commandCatalogLines().join('\n')}
-
-AGENT BOT
-${agentBotCatalogLines().join('\n')}
-
-REPO TOGGLE
-${repoToggleLines().join('\n')}
-
-NOTES
-  - No command = ${SHORT_TOOL_NAME} status. ${SHORT_TOOL_NAME} init is an alias of ${SHORT_TOOL_NAME} setup.
-  - Global installs need Y/N approval; GitHub CLI (gh) is required for PR automation.
-  - Target another repo: ${SHORT_TOOL_NAME} --target .
-  - On protected main, setup/install/fix/doctor auto-sandbox via agent branch + PR flow.
-  - Run '${SHORT_TOOL_NAME} cleanup' to prune merged agent branches/worktrees.
-  - Legacy aliases: ${LEGACY_NAMES.join(', ')}.`);
-
-  if (outsideGitRepo) {
-    console.log(`
-[${TOOL_NAME}] No git repository detected in current directory.
-[${TOOL_NAME}] Start from a repo root, or pass an explicit target:
-  ${TOOL_NAME} setup --target
-  ${TOOL_NAME} doctor --target `);
-  }
-}
-
-function run(cmd, args, options = {}) {
-  return cp.spawnSync(cmd, args, {
-    encoding: 'utf8',
-    stdio: options.stdio || 'pipe',
-    cwd: options.cwd,
-    env: options.env ? { ...process.env, ...options.env } : process.env,
-    timeout: options.timeout,
-  });
-}
-
-function extractTargetedArgs(rawArgs, defaultTarget = process.cwd()) {
-  const passthrough = [];
-  let target = defaultTarget;
-
-  for (let index = 0; index < rawArgs.length; index += 1) {
-    const arg = rawArgs[index];
-    if (arg === '--target' || arg === '-t') {
-      target = requireValue(rawArgs, index, '--target');
-      index += 1;
-      continue;
-    }
-    passthrough.push(arg);
-  }
-
-  return { target, passthrough };
-}
-
-function packageAssetEnv(extraEnv = {}) {
-  return {
-    GUARDEX_CLI_ENTRY: __filename,
-    GUARDEX_NODE_BIN: process.execPath,
-    ...extraEnv,
-  };
-}
-
-function packageAssetPath(assetKey) {
-  const assetPath = PACKAGE_SCRIPT_ASSETS[assetKey];
-  if (!assetPath) {
-    throw new Error(`Unknown package asset: ${assetKey}`);
-  }
-  if (!fs.existsSync(assetPath)) {
-    throw new Error(`Missing package asset: ${assetPath}`);
-  }
-  return assetPath;
-}
-
-function runPackageAsset(assetKey, rawArgs, options = {}) {
-  const assetPath = packageAssetPath(assetKey);
-  let cmd = 'bash';
-  if (assetPath.endsWith('.py')) {
-    cmd = 'python3';
-  } else if (assetPath.endsWith('.js')) {
-    cmd = process.execPath;
-  }
-  return run(cmd, [assetPath, ...rawArgs], {
-    cwd: options.cwd || process.cwd(),
-    stdio: options.stdio || 'pipe',
-    timeout: options.timeout,
-    env: packageAssetEnv(options.env),
-  });
-}
-
-function repoLocalLegacyScriptPath(repoRoot, relativePath) {
-  const assetPath = path.join(repoRoot, relativePath);
-  return fs.existsSync(assetPath) ? assetPath : null;
-}
-
-function runReviewBotCommand(repoRoot, rawArgs, options = {}) {
-  const legacyScript = repoLocalLegacyScriptPath(repoRoot, 'scripts/review-bot-watch.sh');
-  if (legacyScript) {
-    return run('bash', [legacyScript, ...rawArgs], {
-      cwd: repoRoot,
-      stdio: options.stdio || 'pipe',
-      timeout: options.timeout,
-      env: packageAssetEnv(options.env),
-    });
-  }
-  return runPackageAsset('reviewBot', rawArgs, {
-    ...options,
-    cwd: repoRoot,
-  });
-}
-
-function invokePackageAsset(assetKey, rawArgs, options = {}) {
-  const result = runPackageAsset(assetKey, rawArgs, options);
-  if (result.stdout) process.stdout.write(result.stdout);
-  if (result.stderr) process.stderr.write(result.stderr);
-  if (result.status !== 0) {
-    throw new Error(`${assetKey} command failed with status ${result.status}`);
-  }
-  process.exitCode = 0;
-  return result;
-}
-
-function formatElapsedDuration(ms) {
-  const durationMs = Number.isFinite(ms) ? Math.max(0, ms) : 0;
-  if (durationMs < 1000) {
-    return `${Math.round(durationMs)}ms`;
-  }
-  if (durationMs < 10_000) {
-    return `${(durationMs / 1000).toFixed(1)}s`;
-  }
-  return `${Math.round(durationMs / 1000)}s`;
-}
-
-function truncateMiddle(value, maxLength) {
-  const text = String(value || '');
-  const limit = Number.isFinite(maxLength) ? Math.max(4, maxLength) : 0;
-  if (!limit || text.length <= limit) {
-    return text;
-  }
-
-  const visible = limit - 1;
-  const headLength = Math.ceil(visible / 2);
-  const tailLength = Math.floor(visible / 2);
-  return `${text.slice(0, headLength)}…${text.slice(text.length - tailLength)}`;
-}
-
-function truncateTail(value, maxLength) {
-  const text = String(value || '');
-  const limit = Number.isFinite(maxLength) ? Math.max(4, maxLength) : 0;
-  if (!limit || text.length <= limit) {
-    return text;
-  }
-  return `${text.slice(0, limit - 1)}…`;
-}
-
-function compactAutoFinishPathSegments(message) {
-  return String(message || '').replace(/\((\/[^)]+)\)/g, (_, rawPath) => {
-    if (
-      rawPath.includes(`${path.sep}.omx${path.sep}agent-worktrees${path.sep}`) ||
-      rawPath.includes(`${path.sep}.omc${path.sep}agent-worktrees${path.sep}`)
-    ) {
-      return `(${path.basename(rawPath)})`;
-    }
-    return `(${truncateMiddle(rawPath, 72)})`;
-  });
-}
-
-function detectRecoverableAutoFinishConflict(message) {
-  const text = String(message || '').trim();
-  if (!text) {
-    return null;
-  }
-
-  if (/rebase --continue/i.test(text) && /rebase --abort/i.test(text)) {
-    return {
-      rawLabel: 'auto-finish requires manual rebase.',
-      summary: 'manual rebase required in the source-probe worktree; run rebase --continue or rebase --abort',
-    };
-  }
-
-  if (/Rebase\/merge '.+' into '.+' and resolve conflicts before finishing\./i.test(text)) {
-    return {
-      rawLabel: 'auto-finish requires manual rebase or merge.',
-      summary: 'manual rebase or merge required before auto-finish can continue',
-    };
-  }
-
-  if (/Merge conflict detected while merging/i.test(text)) {
-    return {
-      rawLabel: 'auto-finish requires manual merge resolution.',
-      summary: 'manual merge resolution required before auto-finish can continue',
-    };
-  }
-
-  return null;
-}
-
-function summarizeAutoFinishDetail(detail) {
-  const trimmed = String(detail || '').trim();
-  const match = trimmed.match(/^\[(\w+)\]\s+([^:]+):\s*(.*)$/);
-  if (!match) {
-    return truncateTail(compactAutoFinishPathSegments(trimmed), DOCTOR_AUTO_FINISH_MESSAGE_MAX);
-  }
-
-  const [, status, rawBranch, rawMessage] = match;
-  const branch = truncateMiddle(rawBranch, DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX);
-  let message = String(rawMessage || '').trim();
-  const recoverableConflict = status === 'skip' ? detectRecoverableAutoFinishConflict(message) : null;
-
-  if (recoverableConflict) {
-    message = recoverableConflict.summary;
-  } else if (status === 'fail') {
-    message = message.replace(/^auto-finish failed\.?\s*/i, '');
-    if (/\[agent-sync-guard\]/.test(message) && /Resolve conflicts/i.test(message)) {
-      message = 'rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree';
-    } else if (/unable to compute ahead\/behind/i.test(message)) {
-      const aheadBehindMatch = message.match(/unable to compute ahead\/behind(?: \([^)]+\))?/i);
-      if (aheadBehindMatch) {
-        message = aheadBehindMatch[0];
-      }
-    } else if (/remote ref does not exist/i.test(message)) {
-      message = 'branch merged, but the remote ref was already removed during cleanup';
-    }
-  }
-
-  message = compactAutoFinishPathSegments(message)
-    .replace(/\s+\|\s+/g, '; ')
-    .trim();
-
-  return `[${status}] ${branch}: ${truncateTail(message, DOCTOR_AUTO_FINISH_MESSAGE_MAX)}`;
-}
-
-function printAutoFinishSummary(summary, options = {}) {
-  const enabled = Boolean(summary && summary.enabled);
-  const details = Array.isArray(summary && summary.details) ? summary.details : [];
-  const baseBranch = String(options.baseBranch || summary?.baseBranch || '').trim();
-  const verbose = Boolean(options.verbose);
-  const detailLimit = Number.isFinite(options.detailLimit)
-    ? Math.max(0, options.detailLimit)
-    : DOCTOR_AUTO_FINISH_DETAIL_LIMIT;
-
-  if (enabled) {
-    console.log(
-      colorizeDoctorOutput(
-        `[${TOOL_NAME}] Auto-finish sweep (base=${baseBranch}): attempted=${summary.attempted}, completed=${summary.completed}, skipped=${summary.skipped}, failed=${summary.failed}`,
-        detectAutoFinishSummaryStatus(summary),
-      ),
-    );
-    const visibleDetails = verbose ? details : details.slice(0, detailLimit).map(summarizeAutoFinishDetail);
-    for (const detail of visibleDetails) {
-      console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail)));
-    }
-    if (!verbose && details.length > detailLimit) {
-      console.log(
-        colorizeDoctorOutput(
-          `[${TOOL_NAME}] … ${details.length - detailLimit} more branch result(s). Re-run with --verbose-auto-finish for full details.`,
-          'warn',
-        ),
-      );
-    }
-    return;
-  }
-
-  if (details.length > 0) {
-    const detail = verbose ? details[0] : summarizeAutoFinishDetail(details[0]);
-    console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail)));
-  }
-}
-
-function gitRun(repoRoot, args, { allowFailure = false } = {}) {
-  const result = run('git', ['-C', repoRoot, ...args]);
-  if (!allowFailure && result.status !== 0) {
-    throw new Error(`git ${args.join(' ')} failed: ${(result.stderr || '').trim()}`);
-  }
-  return result;
-}
-
-function resolveRepoRoot(targetPath) {
-  const resolvedTarget = path.resolve(targetPath || process.cwd());
-  const result = run('git', ['-C', resolvedTarget, 'rev-parse', '--show-toplevel']);
-  if (result.status !== 0) {
-    const stderr = (result.stderr || '').trim();
-    throw new Error(
-      `Target is not inside a git repository: ${resolvedTarget}${stderr ? `\n${stderr}` : ''}`,
-    );
-  }
-  return result.stdout.trim();
-}
-
-function isGitRepo(targetPath) {
-  const resolvedTarget = path.resolve(targetPath || process.cwd());
-  const result = run('git', ['-C', resolvedTarget, 'rev-parse', '--show-toplevel']);
-  return result.status === 0;
-}
-
-const NESTED_REPO_DEFAULT_MAX_DEPTH = 6;
-const NESTED_REPO_DEFAULT_SKIP_DIRS = new Set([
-  'node_modules',
-  '.git',
-  'dist',
-  'build',
-  '.next',
-  '.cache',
-  'target',
-  'vendor',
-  '.venv',
-  '.pnpm-store',
-]);
-function discoverNestedGitRepos(rootPath, opts = {}) {
-  const maxDepth = Number.isFinite(opts.maxDepth) ? Math.max(1, opts.maxDepth) : NESTED_REPO_DEFAULT_MAX_DEPTH;
-  const extraSkip = new Set(Array.isArray(opts.extraSkip) ? opts.extraSkip : []);
-  const includeSubmodules = Boolean(opts.includeSubmodules);
-  const resolvedRoot = path.resolve(rootPath);
-
-  const rootCommonDir = (() => {
-    const result = run('git', ['-C', resolvedRoot, 'rev-parse', '--git-common-dir'], { cwd: resolvedRoot });
-    if (result.status !== 0) return null;
-    const raw = result.stdout.trim();
-    if (!raw) return null;
-    return path.resolve(resolvedRoot, raw);
-  })();
-
-  const worktreeSkipAbsolutes = AGENT_WORKTREE_RELATIVE_DIRS.map((relativeDir) => path.join(resolvedRoot, relativeDir));
-  const found = new Set();
-  found.add(resolvedRoot);
-
-  function shouldSkipDir(dirName) {
-    return NESTED_REPO_DEFAULT_SKIP_DIRS.has(dirName) || extraSkip.has(dirName);
-  }
-
-  function walk(currentPath, depth) {
-    if (depth > maxDepth) return;
-    let entries;
-    try {
-      entries = fs.readdirSync(currentPath, { withFileTypes: true });
-    } catch {
-      return;
-    }
-
-    for (const entry of entries) {
-      const entryPath = path.join(currentPath, entry.name);
-
-      if (entry.name === '.git') {
-        if (entry.isDirectory()) {
-          if (entryPath === path.join(resolvedRoot, '.git')) continue;
-          found.add(path.dirname(entryPath));
-        } else if (includeSubmodules && entry.isFile()) {
-          found.add(path.dirname(entryPath));
-        }
-        continue;
-      }
-
-      if (!entry.isDirectory() || entry.isSymbolicLink()) continue;
-      if (shouldSkipDir(entry.name)) continue;
-      if (worktreeSkipAbsolutes.includes(entryPath)) continue;
-      walk(entryPath, depth + 1);
-    }
-  }
-
-  walk(resolvedRoot, 0);
-
-  const filtered = Array.from(found).filter((repoPath) => {
-    if (repoPath === resolvedRoot) return true;
-    if (!rootCommonDir) return true;
-    const childResult = run('git', ['-C', repoPath, 'rev-parse', '--git-common-dir'], { cwd: repoPath });
-    if (childResult.status !== 0) return true;
-    const childCommonDirRaw = childResult.stdout.trim();
-    if (!childCommonDirRaw) return true;
-    const childCommonDir = path.resolve(repoPath, childCommonDirRaw);
-    return childCommonDir !== rootCommonDir;
-  });
-
-  const [root, ...rest] = filtered;
-  rest.sort((a, b) => a.localeCompare(b));
-  return [root, ...rest];
-}
-
-function toDestinationPath(relativeTemplatePath) {
-  if (relativeTemplatePath.startsWith('scripts/')) {
-    return relativeTemplatePath;
-  }
-  if (relativeTemplatePath.startsWith('githooks/')) {
-    return `.${relativeTemplatePath}`;
-  }
-  if (relativeTemplatePath.startsWith('codex/')) {
-    return `.${relativeTemplatePath}`;
-  }
-  if (relativeTemplatePath.startsWith('claude/')) {
-    return `.${relativeTemplatePath}`;
-  }
-  if (relativeTemplatePath.startsWith('github/')) {
-    return `.${relativeTemplatePath}`;
-  }
-  if (relativeTemplatePath.startsWith('vscode/')) {
-    return relativeTemplatePath;
-  }
-  throw new Error(`Unsupported template path: ${relativeTemplatePath}`);
-}
-
-function ensureParentDir(repoRoot, filePath, dryRun) {
-  if (dryRun) return;
-
-  const parentDir = path.dirname(filePath);
-  const relativeParentDir = path.relative(repoRoot, parentDir);
-  const segments = relativeParentDir.split(path.sep).filter(Boolean);
-  let currentPath = repoRoot;
-
-  for (const segment of segments) {
-    currentPath = path.join(currentPath, segment);
-    if (fs.existsSync(currentPath) && !fs.statSync(currentPath).isDirectory()) {
-      const blockingPath = path.relative(repoRoot, currentPath) || path.basename(currentPath);
-      const targetPath = path.relative(repoRoot, filePath) || path.basename(filePath);
-      throw new Error(
-        `Path conflict: ${blockingPath} exists as a file, but ${targetPath} needs it to be a directory. ` +
-          `Remove or rename ${blockingPath} and rerun '${SHORT_TOOL_NAME} setup'.`,
-      );
-    }
-  }
-
-  fs.mkdirSync(parentDir, { recursive: true });
-}
-
-function ensureExecutable(destinationPath, relativePath, dryRun) {
-  if (dryRun) return;
-  if (EXECUTABLE_RELATIVE_PATHS.has(relativePath)) {
-    fs.chmodSync(destinationPath, 0o755);
-  }
-}
-
-function isCriticalGuardrailPath(relativePath) {
-  return CRITICAL_GUARDRAIL_PATHS.has(relativePath);
-}
-
-function shellSingleQuote(value) {
-  return `'${String(value).replace(/'/g, `'\"'\"'`)}'`;
-}
-
-function renderShellDispatchShim(commandParts) {
-  const rendered = commandParts.map((part) => shellSingleQuote(part)).join(' ');
-  return (
-    '#!/usr/bin/env bash\n' +
-    'set -euo pipefail\n' +
-    '\n' +
-    'if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then\n' +
-    '  node_bin="${GUARDEX_NODE_BIN:-node}"\n' +
-    `  exec "$node_bin" "$GUARDEX_CLI_ENTRY" ${rendered} "$@"\n` +
-    'fi\n' +
-    '\n' +
-    'resolve_guardex_cli() {\n' +
-    '  if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then\n' +
-    '    printf \'%s\' "$GUARDEX_CLI_BIN"\n' +
-    '    return 0\n' +
-    '  fi\n' +
-    '  if command -v gx >/dev/null 2>&1; then\n' +
-    '    printf \'%s\' "gx"\n' +
-    '    return 0\n' +
-    '  fi\n' +
-    '  if command -v gitguardex >/dev/null 2>&1; then\n' +
-    '    printf \'%s\' "gitguardex"\n' +
-    '    return 0\n' +
-    '  fi\n' +
-    '  echo "[gitguardex-shim] Missing gx CLI in PATH." >&2\n' +
-    '  exit 1\n' +
-    '}\n' +
-    '\n' +
-    'cli_bin="$(resolve_guardex_cli)"\n' +
-    `exec "$cli_bin" ${rendered} "$@"\n`
-  );
-}
-
-function renderPythonDispatchShim(commandParts) {
-  return (
-    '#!/usr/bin/env python3\n' +
-    'import os\n' +
-    'import shutil\n' +
-    'import subprocess\n' +
-    'import sys\n' +
-    '\n' +
-    `COMMAND = ${JSON.stringify(commandParts)}\n` +
-    '\n' +
-    'entry = os.environ.get("GUARDEX_CLI_ENTRY")\n' +
-    'if entry:\n' +
-    '    node_bin = os.environ.get("GUARDEX_NODE_BIN") or shutil.which("node") or "node"\n' +
-    '    raise SystemExit(subprocess.call([node_bin, entry, *COMMAND, *sys.argv[1:]]))\n' +
-    'cli = os.environ.get("GUARDEX_CLI_BIN") or shutil.which("gx") or shutil.which("gitguardex")\n' +
-    'if not cli:\n' +
-    '    sys.stderr.write("[gitguardex-shim] Missing gx CLI in PATH.\\n")\n' +
-    '    raise SystemExit(1)\n' +
-    'raise SystemExit(subprocess.call([cli, *COMMAND, *sys.argv[1:]]))\n'
-  );
-}
-
-function managedForceConflictMessage(relativePath) {
-  return (
-    `Refusing to overwrite existing file without --force: ${relativePath}\n` +
-    `Use '--force ${relativePath}' to rewrite only this managed file, or '--force' to rewrite all managed files.`
-  );
-}
-
-function renderManagedFile(repoRoot, relativePath, content, options = {}) {
-  const destinationPath = path.join(repoRoot, relativePath);
-  const destinationExists = fs.existsSync(destinationPath);
-  const force = Boolean(options.force);
-  const dryRun = Boolean(options.dryRun);
-
-  if (destinationExists) {
-    const existingContent = fs.readFileSync(destinationPath, 'utf8');
-    if (existingContent === content) {
-      ensureExecutable(destinationPath, relativePath, dryRun);
-      return { status: 'unchanged', file: relativePath };
-    }
-    if (!force && !isCriticalGuardrailPath(relativePath)) {
-      throw new Error(managedForceConflictMessage(relativePath));
-    }
-  }
-
-  ensureParentDir(repoRoot, destinationPath, dryRun);
-  if (!dryRun) {
-    fs.writeFileSync(destinationPath, content, 'utf8');
-    ensureExecutable(destinationPath, relativePath, dryRun);
-  }
-
-  if (destinationExists && !force && isCriticalGuardrailPath(relativePath)) {
-    return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: relativePath };
-  }
-
-  return { status: destinationExists ? 'overwritten' : 'created', file: relativePath };
-}
-
-function ensureGeneratedScriptShim(repoRoot, spec, options = {}) {
-  const content = spec.kind === 'python'
-    ? renderPythonDispatchShim(spec.command)
-    : renderShellDispatchShim(spec.command);
-  return renderManagedFile(repoRoot, spec.relativePath, content, options);
-}
-
-function ensureHookShim(repoRoot, hookName, options = {}) {
-  return renderManagedFile(
-    repoRoot,
-    path.posix.join('.githooks', hookName),
-    renderShellDispatchShim(['hook', 'run', hookName]),
-    options,
-  );
-}
-
-function copyTemplateFile(repoRoot, relativeTemplatePath, force, dryRun) {
-  const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath);
-  const destinationRelativePath = toDestinationPath(relativeTemplatePath);
-  const destinationPath = path.join(repoRoot, destinationRelativePath);
-
-  const sourceContent = fs.readFileSync(sourcePath, 'utf8');
-  const destinationExists = fs.existsSync(destinationPath);
-
-  if (destinationExists) {
-    const existingContent = fs.readFileSync(destinationPath, 'utf8');
-    if (existingContent === sourceContent) {
-      ensureExecutable(destinationPath, destinationRelativePath, dryRun);
-      return { status: 'unchanged', file: destinationRelativePath };
-    }
-    if (!force && !isCriticalGuardrailPath(destinationRelativePath)) {
-      throw new Error(managedForceConflictMessage(destinationRelativePath));
-    }
-  }
-
-  ensureParentDir(repoRoot, destinationPath, dryRun);
-  if (!dryRun) {
-    fs.writeFileSync(destinationPath, sourceContent, 'utf8');
-    ensureExecutable(destinationPath, destinationRelativePath, dryRun);
-  }
-
-  if (destinationExists && !force && isCriticalGuardrailPath(destinationRelativePath)) {
-    return { status: dryRun ?
'would-repair-critical' : 'repaired-critical', file: destinationRelativePath }; - } - - return { status: destinationExists ? 'overwritten' : 'created', file: destinationRelativePath }; -} - -function ensureTemplateFilePresent(repoRoot, relativeTemplatePath, dryRun) { - const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath); - const destinationRelativePath = toDestinationPath(relativeTemplatePath); - const destinationPath = path.join(repoRoot, destinationRelativePath); - const sourceContent = fs.readFileSync(sourcePath, 'utf8'); - - if (fs.existsSync(destinationPath)) { - const existingContent = fs.readFileSync(destinationPath, 'utf8'); - if (existingContent === sourceContent) { - ensureExecutable(destinationPath, destinationRelativePath, dryRun); - return { status: 'unchanged', file: destinationRelativePath }; - } - - if (isCriticalGuardrailPath(destinationRelativePath)) { - if (!dryRun) { - fs.writeFileSync(destinationPath, sourceContent, 'utf8'); - ensureExecutable(destinationPath, destinationRelativePath, dryRun); - } - return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: destinationRelativePath }; - } - - // In fix mode, avoid silently replacing local customizations. - return { status: 'skipped-conflict', file: destinationRelativePath }; - } - - ensureParentDir(repoRoot, destinationPath, dryRun); - if (!dryRun) { - fs.writeFileSync(destinationPath, sourceContent, 'utf8'); - ensureExecutable(destinationPath, destinationRelativePath, dryRun); - } - - return { status: 'created', file: destinationRelativePath }; -} - -function ensureTargetedLegacyWorkflowShims(repoRoot, options) { - const targetedPaths = Array.isArray(options.forceManagedPaths) ? 
options.forceManagedPaths : []; - if (targetedPaths.length === 0) { - return []; - } - - const operations = []; - for (const shim of LEGACY_WORKFLOW_SHIM_SPECS) { - if (!shouldForceManagedPath(options, shim.relativePath)) { - continue; - } - operations.push(ensureGeneratedScriptShim(repoRoot, shim, { dryRun: options.dryRun, force: true })); - } - return operations; -} - -function lockFilePath(repoRoot) { - return path.join(repoRoot, LOCK_FILE_RELATIVE); -} - -function ensureOmxScaffold(repoRoot, dryRun) { - const operations = []; - - for (const relativeDir of REPO_SCAFFOLD_DIRECTORIES) { - const absoluteDir = path.join(repoRoot, relativeDir); - if (fs.existsSync(absoluteDir)) { - if (!fs.statSync(absoluteDir).isDirectory()) { - throw new Error(`Expected directory at ${relativeDir} but found a file.`); - } - operations.push({ status: 'unchanged', file: relativeDir }); - continue; - } - - if (!dryRun) { - fs.mkdirSync(absoluteDir, { recursive: true }); - } - operations.push({ status: 'created', file: relativeDir }); - } - - for (const relativeDir of OMX_SCAFFOLD_DIRECTORIES) { - const absoluteDir = path.join(repoRoot, relativeDir); - if (fs.existsSync(absoluteDir)) { - if (!fs.statSync(absoluteDir).isDirectory()) { - throw new Error(`Expected directory at ${relativeDir} but found a file.`); - } - operations.push({ status: 'unchanged', file: relativeDir }); - continue; - } - - if (!dryRun) { - fs.mkdirSync(absoluteDir, { recursive: true }); - } - operations.push({ status: 'created', file: relativeDir }); - } - - for (const [relativeFile, defaultContent] of OMX_SCAFFOLD_FILES.entries()) { - const absoluteFile = path.join(repoRoot, relativeFile); - if (fs.existsSync(absoluteFile)) { - if (!fs.statSync(absoluteFile).isFile()) { - throw new Error(`Expected file at ${relativeFile} but found a directory.`); - } - operations.push({ status: 'unchanged', file: relativeFile }); - continue; - } - - if (!dryRun) { - fs.mkdirSync(path.dirname(absoluteFile), { recursive: true }); - 
fs.writeFileSync(absoluteFile, defaultContent, 'utf8'); - } - operations.push({ status: 'created', file: relativeFile }); - } - - return operations; -} - -function ensureLockRegistry(repoRoot, dryRun) { - const absolutePath = lockFilePath(repoRoot); - if (fs.existsSync(absolutePath)) { - return { status: 'unchanged', file: LOCK_FILE_RELATIVE }; - } - - if (!dryRun) { - fs.mkdirSync(path.dirname(absolutePath), { recursive: true }); - fs.writeFileSync(absolutePath, JSON.stringify({ locks: {} }, null, 2) + '\n', 'utf8'); - } - - return { status: 'created', file: LOCK_FILE_RELATIVE }; -} - -function lockStateOrError(repoRoot) { - const lockPath = lockFilePath(repoRoot); - if (!fs.existsSync(lockPath)) { - return { ok: false, error: `${LOCK_FILE_RELATIVE} is missing` }; - } - - try { - const parsed = JSON.parse(fs.readFileSync(lockPath, 'utf8')); - if (!parsed || typeof parsed !== 'object' || typeof parsed.locks !== 'object' || parsed.locks === null) { - return { ok: false, error: `${LOCK_FILE_RELATIVE} has invalid schema (expected { locks: {} })` }; - } - - // Normalize older schema entries. 
- for (const [filePath, entry] of Object.entries(parsed.locks)) { - if (!entry || typeof entry !== 'object') { - parsed.locks[filePath] = { branch: '', claimed_at: '', allow_delete: false }; - continue; - } - if (!Object.prototype.hasOwnProperty.call(entry, 'allow_delete')) { - entry.allow_delete = false; - } - } - - return { ok: true, raw: parsed, locks: parsed.locks }; - } catch (error) { - return { ok: false, error: `${LOCK_FILE_RELATIVE} is invalid JSON: ${error.message}` }; - } -} - -function writeLockState(repoRoot, payload, dryRun) { - if (dryRun) return; - const lockPath = lockFilePath(repoRoot); - fs.mkdirSync(path.dirname(lockPath), { recursive: true }); - fs.writeFileSync(lockPath, JSON.stringify(payload, null, 2) + '\n', 'utf8'); -} - -function removeLegacyPackageScripts(repoRoot, dryRun) { - const packagePath = path.join(repoRoot, 'package.json'); - if (!fs.existsSync(packagePath)) { - return { status: 'skipped', file: 'package.json', note: 'package.json not found' }; - } - - let pkg; - try { - pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8')); - } catch (error) { - throw new Error(`Unable to parse package.json in target repo: ${error.message}`); - } - - const existingScripts = pkg.scripts && typeof pkg.scripts === 'object' - ? pkg.scripts - : {}; - pkg.scripts = existingScripts; - let changed = false; - for (const [key, value] of Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS)) { - if (existingScripts[key] === value) { - delete existingScripts[key]; - changed = true; - } - } - - if (!changed) { - return { status: 'unchanged', file: 'package.json', note: 'no Guardex-managed agent:* scripts found' }; - } - - if (!dryRun) { - fs.writeFileSync(packagePath, JSON.stringify(pkg, null, 2) + '\n', 'utf8'); - } - - return { status: dryRun ? 
'would-update' : 'updated', file: 'package.json', note: 'removed Guardex-managed agent:* scripts' }; -} - -function installUserLevelAsset(asset, options = {}) { - const dryRun = Boolean(options.dryRun); - const force = Boolean(options.force); - const destinationPath = path.join(GUARDEX_HOME_DIR, asset.destination); - const sourceContent = fs.readFileSync(asset.source, 'utf8'); - const destinationExists = fs.existsSync(destinationPath); - - if (destinationExists) { - const existingContent = fs.readFileSync(destinationPath, 'utf8'); - if (existingContent === sourceContent) { - return { status: 'unchanged', file: asset.destination }; - } - if (!force) { - return { status: 'skipped-conflict', file: asset.destination }; - } - } - - if (!dryRun) { - fs.mkdirSync(path.dirname(destinationPath), { recursive: true }); - fs.writeFileSync(destinationPath, sourceContent, 'utf8'); - } - return { status: destinationExists ? (dryRun ? 'would-update' : 'updated') : 'created', file: asset.destination }; -} - -function removeLegacyManagedRepoFile(repoRoot, relativePath, options = {}) { - const dryRun = Boolean(options.dryRun); - const force = Boolean(options.force); - const absolutePath = path.join(repoRoot, relativePath); - if (!fs.existsSync(absolutePath)) { - return { status: 'unchanged', file: relativePath, note: 'not present' }; - } - if (!fs.statSync(absolutePath).isFile()) { - return { status: 'skipped-conflict', file: relativePath, note: 'not a regular file' }; - } - - const skillAsset = USER_LEVEL_SKILL_ASSETS.find((asset) => asset.destination === relativePath); - if (skillAsset) { - const userLevelPath = path.join(GUARDEX_HOME_DIR, skillAsset.destination); - if (!fs.existsSync(userLevelPath)) { - return { status: 'skipped', file: relativePath, note: 'user-level replacement not installed' }; - } - } - - const templateRelative = skillAsset - ? 
skillAsset.source.slice(TEMPLATE_ROOT.length + 1) - : relativePath.replace(/^\./, ''); - const sourcePath = path.join(TEMPLATE_ROOT, templateRelative); - if (!fs.existsSync(sourcePath)) { - return { status: 'skipped', file: relativePath, note: 'template source missing' }; - } - - const sourceContent = fs.readFileSync(sourcePath, 'utf8'); - const existingContent = fs.readFileSync(absolutePath, 'utf8'); - if (existingContent !== sourceContent && !force) { - return { status: 'skipped-conflict', file: relativePath, note: 'local edits differ from managed template' }; - } - - if (!dryRun) { - fs.rmSync(absolutePath, { force: true }); - } - return { status: dryRun ? 'would-remove' : 'removed', file: relativePath }; -} - -function ensureAgentsSnippet(repoRoot, dryRun, options = {}) { - const agentsPath = path.join(repoRoot, 'AGENTS.md'); - const snippet = fs.readFileSync(path.join(TEMPLATE_ROOT, 'AGENTS.multiagent-safety.md'), 'utf8').trimEnd(); - const managedRegex = new RegExp( - `${AGENTS_MARKER_START.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}[\\s\\S]*?${AGENTS_MARKER_END.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`, - 'm', - ); - - if (!fs.existsSync(agentsPath)) { - if (!dryRun) { - fs.writeFileSync(agentsPath, `# AGENTS\n\n${snippet}\n`, 'utf8'); - } - return { status: 'created', file: 'AGENTS.md' }; - } - - const existing = fs.readFileSync(agentsPath, 'utf8'); - if (managedRegex.test(existing)) { - const next = existing.replace(managedRegex, snippet); - if (next === existing) { - return { status: 'unchanged', file: 'AGENTS.md' }; - } - if (!dryRun) { - fs.writeFileSync(agentsPath, next, 'utf8'); - } - return { status: 'updated', file: 'AGENTS.md', note: 'refreshed gitguardex-managed block' }; - } - - if (existing.includes(AGENTS_MARKER_START)) { - return { status: 'unchanged', file: 'AGENTS.md', note: 'existing marker found without managed end marker' }; - } - - const separator = existing.endsWith('\n') ? 
'\n' : '\n\n'; - if (!dryRun) { - fs.writeFileSync(agentsPath, `${existing}${separator}${snippet}\n`, 'utf8'); - } - - return { status: 'updated', file: 'AGENTS.md' }; -} - -function ensureManagedGitignore(repoRoot, dryRun) { - const gitignorePath = path.join(repoRoot, '.gitignore'); - const managedBlock = [ - GITIGNORE_MARKER_START, - ...MANAGED_GITIGNORE_PATHS, - GITIGNORE_MARKER_END, - ].join('\n'); - const managedRegex = new RegExp( - `${GITIGNORE_MARKER_START.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}[\\s\\S]*?${GITIGNORE_MARKER_END.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`, - 'm', - ); - - if (!fs.existsSync(gitignorePath)) { - if (!dryRun) { - fs.writeFileSync(gitignorePath, `${managedBlock}\n`, 'utf8'); - } - return { status: 'created', file: '.gitignore', note: 'added gitguardex-managed entries' }; - } - - const existing = fs.readFileSync(gitignorePath, 'utf8'); - if (managedRegex.test(existing)) { - const next = existing.replace(managedRegex, managedBlock); - if (next === existing) { - return { status: 'unchanged', file: '.gitignore' }; - } - if (!dryRun) { - fs.writeFileSync(gitignorePath, next, 'utf8'); - } - return { status: 'updated', file: '.gitignore', note: 'refreshed gitguardex-managed entries' }; - } - - const separator = existing.endsWith('\n') ? 
'\n' : '\n\n'; - if (!dryRun) { - fs.writeFileSync(gitignorePath, `${existing}${separator}${managedBlock}\n`, 'utf8'); - } - return { status: 'updated', file: '.gitignore', note: 'appended gitguardex-managed entries' }; -} - -function configureHooks(repoRoot, dryRun) { - if (dryRun) { - return { status: 'would-set', key: 'core.hooksPath', value: '.githooks' }; - } - - const result = run('git', ['-C', repoRoot, 'config', 'core.hooksPath', '.githooks']); - if (result.status !== 0) { - throw new Error(`Failed to set git hooksPath: ${(result.stderr || '').trim()}`); - } - - return { status: 'set', key: 'core.hooksPath', value: '.githooks' }; -} - -function requireValue(rawArgs, index, flagName) { - const value = rawArgs[index + 1]; - if (!value || value.startsWith('-')) { - throw new Error(`${flagName} requires a value`); - } - return value; -} - -function normalizeManagedForcePath(rawPath) { - if (typeof rawPath !== 'string') { - return null; - } - const normalized = path.posix.normalize(rawPath.replace(/\\/g, '/')); - if (!normalized || normalized === '.' || normalized.startsWith('../') || path.posix.isAbsolute(normalized)) { - return null; - } - return normalized.startsWith('./') ? 
normalized.slice(2) : normalized; -} - -function collectForceManagedPaths(rawArgs, startIndex) { - const forceManagedPaths = []; - let nextIndex = startIndex; - - while (nextIndex + 1 < rawArgs.length) { - const candidate = rawArgs[nextIndex + 1]; - if (!candidate || candidate.startsWith('-')) { - break; - } - const normalized = normalizeManagedForcePath(candidate); - if (!normalized || !TARGETED_FORCEABLE_MANAGED_PATHS.has(normalized)) { - throw new Error(`Unknown managed path after --force: ${candidate}`); - } - forceManagedPaths.push(normalized); - nextIndex += 1; - } - - return { forceManagedPaths, nextIndex }; -} - -function appendForceArgs(args, options) { - if (!options.force) { - return; - } - args.push('--force'); - for (const managedPath of options.forceManagedPaths || []) { - args.push(managedPath); - } -} - -function shouldForceManagedPath(options, relativePath) { - if (!options.force) { - return false; - } - const targetedPaths = Array.isArray(options.forceManagedPaths) ? options.forceManagedPaths : []; - if (targetedPaths.length === 0) { - return true; - } - const normalized = normalizeManagedForcePath(relativePath); - return normalized !== null && targetedPaths.includes(normalized); -} - -function parseCommonArgs(rawArgs, defaults) { - const options = { ...defaults }; - const supportsForce = Object.prototype.hasOwnProperty.call(options, 'force'); - if (supportsForce && !Array.isArray(options.forceManagedPaths)) { - options.forceManagedPaths = []; - } - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target' || arg === '-t') { - options.target = requireValue(rawArgs, index, '--target'); - index += 1; - continue; - } - if (arg === '--dry-run') { - options.dryRun = true; - continue; - } - if (arg === '--skip-agents') { - options.skipAgents = true; - continue; - } - if (arg === '--skip-package-json') { - options.skipPackageJson = true; - continue; - } - if (arg === '--force') { - if 
(!supportsForce) { - throw new Error(`Unknown option: ${arg}`); - } - options.force = true; - const parsed = collectForceManagedPaths(rawArgs, index); - if (parsed.forceManagedPaths.length > 0) { - options.forceManagedPaths = Array.from( - new Set([...(options.forceManagedPaths || []), ...parsed.forceManagedPaths]), - ); - } - index = parsed.nextIndex; - continue; - } - if (arg === '--keep-stale-locks') { - options.dropStaleLocks = false; - continue; - } - if (arg === '--json') { - options.json = true; - continue; - } - if (arg === '--yes-global-install') { - options.yesGlobalInstall = true; - continue; - } - if (arg === '--no-global-install') { - options.noGlobalInstall = true; - continue; - } - if (arg === '--no-gitignore') { - options.skipGitignore = true; - continue; - } - if (arg === '--allow-protected-base-write') { - options.allowProtectedBaseWrite = true; - continue; - } - if (Object.prototype.hasOwnProperty.call(options, 'waitForMerge') && arg === '--wait-for-merge') { - options.waitForMerge = true; - continue; - } - if (Object.prototype.hasOwnProperty.call(options, 'waitForMerge') && arg === '--no-wait-for-merge') { - options.waitForMerge = false; - continue; - } - - throw new Error(`Unknown option: ${arg}`); - } - - if (!options.target) { - throw new Error('--target requires a path value'); - } - - return options; -} - -function parseRepoTraversalArgs(rawArgs, defaults) { - const traversalDefaults = { - ...defaults, - recursive: true, - nestedMaxDepth: NESTED_REPO_DEFAULT_MAX_DEPTH, - nestedSkipDirs: [], - includeSubmodules: false, - }; - const forwardedArgs = []; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--no-recursive' || arg === '--no-nested' || arg === '--single-repo') { - traversalDefaults.recursive = false; - continue; - } - if (arg === '--recursive' || arg === '--nested') { - traversalDefaults.recursive = true; - continue; - } - if (arg === '--max-depth') { - const raw = 
requireValue(rawArgs, index, '--max-depth'); - const parsed = Number.parseInt(raw, 10); - if (!Number.isFinite(parsed) || parsed < 1) { - throw new Error('--max-depth requires a positive integer'); - } - traversalDefaults.nestedMaxDepth = parsed; - index += 1; - continue; - } - if (arg === '--skip-nested') { - const raw = requireValue(rawArgs, index, '--skip-nested'); - traversalDefaults.nestedSkipDirs.push(raw); - index += 1; - continue; - } - if (arg === '--include-submodules') { - traversalDefaults.includeSubmodules = true; - continue; - } - forwardedArgs.push(arg); - } - - return parseCommonArgs(forwardedArgs, traversalDefaults); -} - -function parseSetupArgs(rawArgs, defaults) { - const setupDefaults = { - ...defaults, - parentWorkspaceView: false, - }; - const forwardedArgs = []; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--parent-workspace-view') { - setupDefaults.parentWorkspaceView = true; - continue; - } - if (arg === '--no-parent-workspace-view') { - setupDefaults.parentWorkspaceView = false; - continue; - } - forwardedArgs.push(arg); - } - - return parseRepoTraversalArgs(forwardedArgs, setupDefaults); -} - -function parseDoctorArgs(rawArgs) { - const doctorDefaults = { - target: process.cwd(), - force: false, - dropStaleLocks: true, - skipAgents: false, - skipPackageJson: false, - skipGitignore: false, - dryRun: false, - json: false, - allowProtectedBaseWrite: false, - waitForMerge: true, - verboseAutoFinish: false, - }; - const forwardedArgs = []; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--verbose-auto-finish') { - doctorDefaults.verboseAutoFinish = true; - continue; - } - if (arg === '--compact-auto-finish') { - doctorDefaults.verboseAutoFinish = false; - continue; - } - forwardedArgs.push(arg); - } - - return parseRepoTraversalArgs(forwardedArgs, doctorDefaults); -} - -function normalizeWorkspacePath(relativePath) { - 
return String(relativePath || '.').replace(/\\/g, '/'); -} - -function buildParentWorkspaceView(repoRoot) { - const parentDir = path.dirname(repoRoot); - const workspaceFileName = `${path.basename(repoRoot)}-branches.code-workspace`; - const workspacePath = path.join(parentDir, workspaceFileName); - const repoRelativePath = normalizeWorkspacePath(path.relative(parentDir, repoRoot) || '.'); - - return { - workspacePath, - payload: { - folders: [ - { path: repoRelativePath }, - ...AGENT_WORKTREE_RELATIVE_DIRS.map((relativeDir) => ({ - path: normalizeWorkspacePath(path.join(repoRelativePath === '.' ? '' : repoRelativePath, relativeDir)), - })), - ], - settings: { - 'scm.alwaysShowRepositories': true, - }, - }, - }; -} - -function ensureParentWorkspaceView(repoRoot, dryRun) { - const { workspacePath, payload } = buildParentWorkspaceView(repoRoot); - const operationFile = path.relative(repoRoot, workspacePath) || path.basename(workspacePath); - const nextContent = `${JSON.stringify(payload, null, 2)}\n`; - const note = 'parent VS Code workspace view'; - - if (!fs.existsSync(workspacePath)) { - if (!dryRun) { - fs.writeFileSync(workspacePath, nextContent, 'utf8'); - } - return { status: dryRun ? 'would-create' : 'created', file: operationFile, note }; - } - - const currentContent = fs.readFileSync(workspacePath, 'utf8'); - if (currentContent === nextContent) { - return { status: 'unchanged', file: operationFile, note }; - } - - if (!dryRun) { - fs.writeFileSync(workspacePath, nextContent, 'utf8'); - } - return { status: dryRun ? 
'would-update' : 'updated', file: operationFile, note }; -} - -function hasGuardexBootstrapFiles(repoRoot) { - const required = [ - 'AGENTS.md', - '.githooks/pre-commit', - '.githooks/pre-push', - LOCK_FILE_RELATIVE, - ]; - return required.every((relativePath) => fs.existsSync(path.join(repoRoot, relativePath))); -} - -function protectedBaseWriteBlock(options, { requireBootstrap = true } = {}) { - if (options.dryRun || options.allowProtectedBaseWrite) { - return null; - } - - const repoRoot = resolveRepoRoot(options.target); - if (requireBootstrap && !hasGuardexBootstrapFiles(repoRoot)) { - return null; - } - - const branch = currentBranchName(repoRoot); - if (branch !== 'main') { - return null; - } - - const protectedBranches = readProtectedBranches(repoRoot); - if (!protectedBranches.includes(branch)) { - return null; - } - - return { - repoRoot, - branch, - }; -} - -function assertProtectedMainWriteAllowed(options, commandName) { - const blocked = protectedBaseWriteBlock(options); - if (!blocked) { - return; - } - - throw new Error( - `${commandName} blocked on protected branch '${blocked.branch}' in an initialized repo.\n` + - `Keep local '${blocked.branch}' pull-only: start an agent branch/worktree first:\n` + - ` gx branch start "" "codex"\n` + - `Override once only when intentional: --allow-protected-base-write`, - ); -} - -function runSetupBootstrapInternal(options) { - const installPayload = runInstallInternal(options); - installPayload.operations.push( - ensureSetupProtectedBranches(installPayload.repoRoot, Boolean(options.dryRun)), - ); - - let parentWorkspace = null; - if (options.parentWorkspaceView) { - installPayload.operations.push( - ensureParentWorkspaceView(installPayload.repoRoot, Boolean(options.dryRun)), - ); - if (!options.dryRun) { - parentWorkspace = buildParentWorkspaceView(installPayload.repoRoot); - } - } - - const fixPayload = runFixInternal({ - target: installPayload.repoRoot, - dryRun: options.dryRun, - force: options.force, - 
forceManagedPaths: options.forceManagedPaths, - dropStaleLocks: true, - skipAgents: options.skipAgents, - skipPackageJson: options.skipPackageJson, - skipGitignore: options.skipGitignore, - allowProtectedBaseWrite: options.allowProtectedBaseWrite, - }); - - return { - installPayload, - fixPayload, - parentWorkspace, - }; -} - -function extractAgentBranchStartMetadata(output) { - const branchMatch = String(output || '').match(/^\[agent-branch-start\] Created branch: (.+)$/m); - const worktreeMatch = String(output || '').match(/^\[agent-branch-start\] Worktree: (.+)$/m); - return { - branch: branchMatch ? branchMatch[1].trim() : '', - worktreePath: worktreeMatch ? worktreeMatch[1].trim() : '', - }; -} - -function resolveSandboxTarget(repoRoot, worktreePath, targetPath) { - const resolvedTarget = path.resolve(targetPath); - const relativeTarget = path.relative(repoRoot, resolvedTarget); - if (relativeTarget.startsWith('..') || path.isAbsolute(relativeTarget)) { - throw new Error(`sandbox target must stay inside repo root: ${resolvedTarget}`); - } - if (!relativeTarget || relativeTarget === '.') { - return worktreePath; - } - return path.join(worktreePath, relativeTarget); -} - -function buildSandboxSetupArgs(options, sandboxTarget) { - const args = ['setup', '--target', sandboxTarget, '--no-global-install', '--no-recursive']; - appendForceArgs(args, options); - if (options.skipAgents) args.push('--skip-agents'); - if (options.skipPackageJson) args.push('--skip-package-json'); - if (options.skipGitignore) args.push('--no-gitignore'); - if (options.dryRun) args.push('--dry-run'); - return args; -} - -function buildSandboxDoctorArgs(options, sandboxTarget) { - const args = ['doctor', '--target', sandboxTarget]; - if (options.dryRun) args.push('--dry-run'); - appendForceArgs(args, options); - if (options.skipAgents) args.push('--skip-agents'); - if (options.skipPackageJson) args.push('--skip-package-json'); - if (options.skipGitignore) args.push('--no-gitignore'); - if 
(!options.dropStaleLocks) args.push('--keep-stale-locks'); - args.push(options.waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge'); - if (options.verboseAutoFinish) args.push('--verbose-auto-finish'); - if (options.json) args.push('--json'); - return args; -} - -function isSpawnFailure(result) { - return Boolean(result?.error) && typeof result?.status !== 'number'; -} - -function ensureRepoBranch(repoRoot, branch) { - const current = currentBranchName(repoRoot); - if (current === branch) { - return { ok: true, changed: false }; - } - - const checkoutResult = run('git', ['-C', repoRoot, 'checkout', branch], { timeout: 20_000 }); - if (isSpawnFailure(checkoutResult)) { - return { - ok: false, - changed: false, - stdout: checkoutResult.stdout || '', - stderr: checkoutResult.stderr || '', - }; - } - if (checkoutResult.status !== 0) { - return { - ok: false, - changed: false, - stdout: checkoutResult.stdout || '', - stderr: checkoutResult.stderr || '', - }; - } - - return { ok: true, changed: true }; -} - -function protectedBaseSandboxBranchPrefix() { - const now = new Date(); - const stamp = [ - now.getUTCFullYear(), - String(now.getUTCMonth() + 1).padStart(2, '0'), - String(now.getUTCDate()).padStart(2, '0'), - ].join('') + '-' + [ - String(now.getUTCHours()).padStart(2, '0'), - String(now.getUTCMinutes()).padStart(2, '0'), - String(now.getUTCSeconds()).padStart(2, '0'), - ].join(''); - return `agent/gx/${stamp}`; -} - -function protectedBaseSandboxWorktreePath(repoRoot, branchName) { - return path.join(repoRoot, defaultAgentWorktreeRelativeDir(), branchName.replace(/\//g, '__')); -} - -function gitRefExists(repoRoot, ref) { - return run('git', ['-C', repoRoot, 'show-ref', '--verify', '--quiet', ref]).status === 0; -} - -function resolveProtectedBaseSandboxStartRef(repoRoot, baseBranch) { - run('git', ['-C', repoRoot, 'fetch', 'origin', baseBranch, '--quiet'], { timeout: 20_000 }); - if (gitRefExists(repoRoot, `refs/remotes/origin/${baseBranch}`)) { - return 
`origin/${baseBranch}`; - } - if (gitRefExists(repoRoot, `refs/heads/${baseBranch}`)) { - return baseBranch; - } - if (currentBranchName(repoRoot) === baseBranch) { - return null; - } - throw new Error(`Unable to find base ref for sandbox bootstrap: ${baseBranch}`); -} - -function startProtectedBaseSandboxFallback(blocked, sandboxSuffix) { - const branchPrefix = protectedBaseSandboxBranchPrefix(); - let selectedBranch = ''; - let selectedWorktreePath = ''; - - for (let attempt = 0; attempt < 30; attempt += 1) { - const suffix = attempt === 0 ? sandboxSuffix : `${attempt + 1}-${sandboxSuffix}`; - const candidateBranch = `${branchPrefix}-${suffix}`; - const candidateWorktreePath = protectedBaseSandboxWorktreePath(blocked.repoRoot, candidateBranch); - if (gitRefExists(blocked.repoRoot, `refs/heads/${candidateBranch}`)) { - continue; - } - if (fs.existsSync(candidateWorktreePath)) { - continue; - } - selectedBranch = candidateBranch; - selectedWorktreePath = candidateWorktreePath; - break; - } - - if (!selectedBranch || !selectedWorktreePath) { - throw new Error('Unable to allocate unique sandbox branch/worktree'); - } - - fs.mkdirSync(path.dirname(selectedWorktreePath), { recursive: true }); - const startRef = resolveProtectedBaseSandboxStartRef(blocked.repoRoot, blocked.branch); - const addArgs = startRef - ? 
['-C', blocked.repoRoot, 'worktree', 'add', '-b', selectedBranch, selectedWorktreePath, startRef] - : ['-C', blocked.repoRoot, 'worktree', 'add', '--orphan', selectedWorktreePath]; - const addResult = run('git', addArgs); - if (isSpawnFailure(addResult)) { - throw addResult.error; - } - if (addResult.status !== 0) { - throw new Error((addResult.stderr || addResult.stdout || 'failed to create sandbox').trim()); - } - - if (!startRef) { - const renameResult = run( - 'git', - ['-C', selectedWorktreePath, 'branch', '-m', selectedBranch], - { timeout: 20_000 }, - ); - if (isSpawnFailure(renameResult)) { - throw renameResult.error; - } - if (renameResult.status !== 0) { - throw new Error( - (renameResult.stderr || renameResult.stdout || 'failed to name orphan sandbox branch').trim(), - ); - } - } - - return { - metadata: { - branch: selectedBranch, - worktreePath: selectedWorktreePath, - }, - stdout: - `[agent-branch-start] Created branch: ${selectedBranch}\n` + - `[agent-branch-start] Worktree: ${selectedWorktreePath}\n`, - stderr: addResult.stderr || '', - }; -} - -function startProtectedBaseSandbox(blocked, { taskName, sandboxSuffix }) { - if (sandboxSuffix === 'gx-doctor') { - return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); - } - - const startResult = runPackageAsset('branchStart', [ - '--task', - taskName, - '--agent', - SHORT_TOOL_NAME, - '--base', - blocked.branch, - ], { cwd: blocked.repoRoot }); - if (isSpawnFailure(startResult)) { - throw startResult.error; - } - if (startResult.status !== 0) { - return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); - } - - const metadata = extractAgentBranchStartMetadata(startResult.stdout); - const currentBranch = currentBranchName(blocked.repoRoot); - const worktreePath = metadata.worktreePath ? 
path.resolve(metadata.worktreePath) : ''; - const repoRootPath = path.resolve(blocked.repoRoot); - const hasSafeWorktree = Boolean(worktreePath) && worktreePath !== repoRootPath; - const branchChanged = Boolean(currentBranch) && currentBranch !== blocked.branch; - - if (!hasSafeWorktree || branchChanged) { - const restoreResult = ensureRepoBranch(blocked.repoRoot, blocked.branch); - if (!restoreResult.ok) { - const detail = [restoreResult.stderr, restoreResult.stdout].filter(Boolean).join('\n').trim(); - throw new Error( - `sandbox startup switched protected base checkout and could not restore '${blocked.branch}'.` + - (detail ? `\n${detail}` : ''), - ); - } - return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); - } - - return { - metadata, - stdout: startResult.stdout || '', - stderr: startResult.stderr || '', - }; -} - -function cleanupProtectedBaseSandbox(repoRoot, metadata) { - const result = { - worktree: 'skipped', - branch: 'skipped', - note: 'missing sandbox metadata', - }; - - if (!metadata?.worktreePath || !metadata?.branch) { - return result; - } - - if (fs.existsSync(metadata.worktreePath)) { - const removeResult = run( - 'git', - ['-C', repoRoot, 'worktree', 'remove', '--force', metadata.worktreePath], - { timeout: 30_000 }, - ); - if (isSpawnFailure(removeResult)) { - throw removeResult.error; - } - if (removeResult.status !== 0) { - throw new Error( - (removeResult.stderr || removeResult.stdout || 'failed to remove sandbox worktree').trim(), - ); - } - result.worktree = 'removed'; - } else { - result.worktree = 'missing'; - } - - if (gitRefExists(repoRoot, `refs/heads/${metadata.branch}`)) { - const branchDeleteResult = run( - 'git', - ['-C', repoRoot, 'branch', '-D', metadata.branch], - { timeout: 20_000 }, - ); - if (isSpawnFailure(branchDeleteResult)) { - throw branchDeleteResult.error; - } - if (branchDeleteResult.status !== 0) { - throw new Error( - (branchDeleteResult.stderr || branchDeleteResult.stdout || 'failed to delete sandbox 
branch').trim(), - ); - } - result.branch = 'deleted'; - } else { - result.branch = 'missing'; - } - - result.note = 'sandbox worktree pruned'; - return result; -} - -function parseGitPathList(output) { - return String(output || '') - .split('\n') - .map((line) => line.trim()) - .filter((line) => line && line !== LOCK_FILE_RELATIVE); -} - -function collectDoctorChangedPaths(worktreePath) { - const changed = new Set(); - const commands = [ - ['diff', '--name-only'], - ['diff', '--cached', '--name-only'], - ['ls-files', '--others', '--exclude-standard'], - ]; - for (const gitArgs of commands) { - const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); - for (const filePath of parseGitPathList(result.stdout)) { - changed.add(filePath); - } - } - return Array.from(changed); -} - -function collectDoctorDeletedPaths(worktreePath) { - const deleted = new Set(); - const commands = [ - ['diff', '--name-only', '--diff-filter=D'], - ['diff', '--cached', '--name-only', '--diff-filter=D'], - ]; - for (const gitArgs of commands) { - const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); - for (const filePath of parseGitPathList(result.stdout)) { - deleted.add(filePath); - } - } - return Array.from(deleted); -} - -function collectWorktreeDirtyPaths(worktreePath) { - const dirty = new Set(); - const commands = [ - ['diff', '--name-only'], - ['diff', '--cached', '--name-only'], - ['ls-files', '--others', '--exclude-standard'], - ]; - for (const gitArgs of commands) { - const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); - for (const filePath of parseGitPathList(result.stdout)) { - dirty.add(filePath); - } - } - return Array.from(dirty); -} - -function collectDoctorForceAddPaths(worktreePath) { - return REQUIRED_MANAGED_REPO_FILES - .filter((relativePath) => relativePath.startsWith('scripts/') || relativePath.startsWith('.githooks/')) - .filter((relativePath) => fs.existsSync(path.join(worktreePath, 
relativePath))); -} - -function stripDoctorSandboxLocks(rawContent, branchName) { - if (!rawContent || !branchName) { - return rawContent; - } - try { - const parsed = JSON.parse(rawContent); - const locks = parsed && typeof parsed === 'object' && parsed.locks && typeof parsed.locks === 'object' - ? parsed.locks - : null; - if (!locks) { - return rawContent; - } - let changed = false; - const filteredLocks = {}; - for (const [filePath, lockInfo] of Object.entries(locks)) { - if (lockInfo && lockInfo.branch === branchName) { - changed = true; - continue; - } - filteredLocks[filePath] = lockInfo; - } - if (!changed) { - return rawContent; - } - return `${JSON.stringify({ ...parsed, locks: filteredLocks }, null, 2)}\n`; - } catch { - return rawContent; - } -} - -function claimDoctorChangedLocks(metadata) { - if (!metadata.branch) { - return { - status: 'skipped', - note: 'missing sandbox branch metadata', - changedCount: 0, - deletedCount: 0, - }; - } - - const changedPaths = Array.from(new Set([ - ...collectDoctorChangedPaths(metadata.worktreePath), - ...collectDoctorForceAddPaths(metadata.worktreePath), - ])); - const deletedPaths = collectDoctorDeletedPaths(metadata.worktreePath); - if (changedPaths.length > 0) { - runPackageAsset('lockTool', ['claim', '--branch', metadata.branch, ...changedPaths], { - cwd: metadata.worktreePath, - timeout: 30_000, - }); - } - if (deletedPaths.length > 0) { - runPackageAsset('lockTool', ['allow-delete', '--branch', metadata.branch, ...deletedPaths], { - cwd: metadata.worktreePath, - timeout: 30_000, - }); - } - - return { - status: 'claimed', - note: 'claimed locks for doctor auto-commit', - changedCount: changedPaths.length, - deletedCount: deletedPaths.length, - }; -} - -function autoCommitDoctorSandboxChanges(metadata) { - if (!metadata.worktreePath || !metadata.branch) { - return { - status: 'skipped', - note: 'missing sandbox branch metadata', - }; - } - - claimDoctorChangedLocks(metadata); - run( - 'git', - ['-C', 
metadata.worktreePath, 'add', '-A', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`], - { timeout: 20_000 }, - ); - const forceAddPaths = collectDoctorForceAddPaths(metadata.worktreePath); - if (forceAddPaths.length > 0) { - run( - 'git', - ['-C', metadata.worktreePath, 'add', '-f', '--', ...forceAddPaths], - { timeout: 20_000 }, - ); - } - const staged = run( - 'git', - ['-C', metadata.worktreePath, 'diff', '--cached', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`], - { timeout: 20_000 }, - ); - const stagedFiles = parseGitPathList(staged.stdout); - if (stagedFiles.length === 0) { - return { - status: 'no-changes', - note: 'no committable doctor changes found in sandbox', - }; - } - - const commitResult = run( - 'git', - ['-C', metadata.worktreePath, 'commit', '-m', 'Auto-finish: gx doctor repairs'], - { timeout: 30_000 }, - ); - if (commitResult.status !== 0) { - return { - status: 'failed', - note: 'doctor sandbox auto-commit failed', - stdout: commitResult.stdout || '', - stderr: commitResult.stderr || '', - }; - } - - return { - status: 'committed', - note: 'doctor sandbox repairs committed', - commitMessage: 'Auto-finish: gx doctor repairs', - stagedFiles, - }; -} - -function hasOriginRemote(repoRoot) { - return run('git', ['-C', repoRoot, 'remote', 'get-url', 'origin']).status === 0; -} - -function originRemoteLooksLikeGithub(repoRoot) { - const originUrl = readGitConfig(repoRoot, 'remote.origin.url'); - if (!originUrl) { - return false; - } - return /github\.com[:/]/i.test(originUrl); -} - -function isCommandAvailable(commandName) { - return run('which', [commandName]).status === 0; -} - -function extractAgentBranchFinishPrUrl(output) { - const match = String(output || '').match(/\[agent-branch-finish\] PR:\s*(\S+)/); - return match ? 
match[1] : ''; -} - -function doctorFinishFlowIsPending(output) { - return ( - /\[agent-branch-finish\] PR merge not completed yet; leaving PR open\./.test(output) || - /\[agent-branch-finish\] Merge pending review\/check policy\. Branch cleanup skipped for now\./.test(output) || - /\[agent-branch-finish\] PR auto-merge enabled; waiting for required checks\/reviews\./.test(output) - ); -} - -function finishDoctorSandboxBranch(blocked, metadata, options = {}) { - if (!hasOriginRemote(blocked.repoRoot)) { - return { - status: 'skipped', - note: 'origin remote missing; skipped auto-finish', - }; - } - const explicitGhBin = Boolean(String(process.env.GUARDEX_GH_BIN || '').trim()); - if (!explicitGhBin && !originRemoteLooksLikeGithub(blocked.repoRoot)) { - return { - status: 'skipped', - note: 'origin remote is not GitHub; skipped auto-finish PR flow', - }; - } - - const ghBin = process.env.GUARDEX_GH_BIN || 'gh'; - if (!isCommandAvailable(ghBin)) { - return { - status: 'skipped', - note: `'${ghBin}' not available; skipped auto-finish PR flow`, - }; - } - const ghAuthStatus = run(ghBin, ['auth', 'status'], { timeout: 20_000 }); - if (ghAuthStatus.status !== 0) { - return { - status: 'skipped', - note: `'${ghBin}' auth unavailable; skipped auto-finish PR flow`, - stderr: ghAuthStatus.stderr || '', - }; - } - - const rawWaitTimeoutSeconds = Number.parseInt(process.env.GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS || '1800', 10); - const waitTimeoutSeconds = - Number.isFinite(rawWaitTimeoutSeconds) && rawWaitTimeoutSeconds >= 30 ? rawWaitTimeoutSeconds : 1800; - const finishTimeoutMs = Math.max(180_000, (waitTimeoutSeconds + 60) * 1000); - const waitForMergeArg = options.waitForMerge === false ? 
'--no-wait-for-merge' : '--wait-for-merge'; - - const finishResult = runPackageAsset( - 'branchFinish', - ['--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'], - { cwd: metadata.worktreePath, timeout: finishTimeoutMs }, - ); - if (isSpawnFailure(finishResult)) { - return { - status: 'failed', - note: 'doctor sandbox finish flow errored', - stdout: finishResult.stdout || '', - stderr: finishResult.stderr || '', - }; - } - if (finishResult.status !== 0) { - return { - status: 'failed', - note: 'doctor sandbox finish flow failed', - stdout: finishResult.stdout || '', - stderr: finishResult.stderr || '', - }; - } - - const combinedOutput = `${finishResult.stdout || ''}\n${finishResult.stderr || ''}`; - if (doctorFinishFlowIsPending(combinedOutput)) { - return { - status: 'pending', - note: 'PR created and waiting for merge policy/checks', - prUrl: extractAgentBranchFinishPrUrl(combinedOutput), - stdout: finishResult.stdout || '', - stderr: finishResult.stderr || '', - }; - } - - return { - status: 'completed', - note: 'doctor sandbox finish flow completed', - stdout: finishResult.stdout || '', - stderr: finishResult.stderr || '', - }; -} - -function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata, autoCommitResult, finishResult) { - if (options.dryRun) { - return { - status: autoCommitResult.status === 'committed' ? 'would-merge' : 'skipped', - note: autoCommitResult.status === 'committed' - ? 'dry run: would fast-forward tracked doctor repairs into the protected base workspace' - : 'dry run skips tracked repair merge', - }; - } - - if (autoCommitResult.status !== 'committed') { - return { - status: autoCommitResult.status === 'no-changes' ? 'unchanged' : 'skipped', - note: autoCommitResult.status === 'no-changes' - ? 
'no tracked doctor repairs needed in the protected base workspace' - : 'tracked doctor repair merge skipped', - }; - } - - if (finishResult.status !== 'skipped') { - return { - status: 'skipped', - note: finishResult.status === 'failed' - ? 'tracked doctor repairs remain in the sandbox after finish failure' - : 'tracked doctor repairs are being delivered through the sandbox finish flow', - }; - } - - const allowedPaths = new Set([ - ...(autoCommitResult.stagedFiles || []), - ...OMX_SCAFFOLD_DIRECTORIES, - ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...REQUIRED_MANAGED_REPO_FILES, - 'bin', - 'package.json', - '.gitignore', - 'AGENTS.md', - ]); - const dirtyPaths = collectWorktreeDirtyPaths(blocked.repoRoot); - let stashRef = ''; - if (dirtyPaths.length > 0) { - const unexpectedPaths = dirtyPaths.filter((filePath) => { - if (allowedPaths.has(filePath)) { - return false; - } - return !AGENT_WORKTREE_RELATIVE_DIRS.some( - (relativeDir) => filePath === relativeDir || filePath.startsWith(`${relativeDir}/`), - ); - }); - if (unexpectedPaths.length > 0) { - return { - status: 'failed', - note: `protected branch workspace has unrelated local changes: ${unexpectedPaths.join(', ')}`, - }; - } - const stashMessage = `guardex-doctor-merge-${Date.now()}`; - const stashResult = run( - 'git', - ['-C', blocked.repoRoot, 'stash', 'push', '--all', '--message', stashMessage], - { timeout: 30_000 }, - ); - if (isSpawnFailure(stashResult)) { - return { - status: 'failed', - note: 'could not stash protected branch doctor drift before merge', - stdout: stashResult.stdout || '', - stderr: stashResult.stderr || '', - }; - } - if (stashResult.status !== 0) { - return { - status: 'failed', - note: 'stashing protected branch doctor drift failed', - stdout: stashResult.stdout || '', - stderr: stashResult.stderr || '', - }; - } - - const stashLookup = run( - 'git', - ['-C', blocked.repoRoot, 'stash', 'list'], - { timeout: 20_000 }, - ); - stashRef = String(stashLookup.stdout || '') - 
.split('\n') - .find((line) => line.includes(stashMessage)) - ?.split(':')[0] - ?.trim() || ''; - } - - const restoreResult = ensureRepoBranch(blocked.repoRoot, blocked.branch); - if (!restoreResult.ok) { - if (stashRef) { - run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); - } - return { - status: 'failed', - note: `could not restore protected branch '${blocked.branch}' before applying sandbox repairs`, - stdout: restoreResult.stdout || '', - stderr: restoreResult.stderr || '', - }; - } - - const mergeResult = run( - 'git', - ['-C', blocked.repoRoot, 'merge', '--ff-only', metadata.branch], - { timeout: 30_000 }, - ); - if (isSpawnFailure(mergeResult)) { - if (stashRef) { - run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); - } - return { - status: 'failed', - note: 'tracked doctor repair merge errored', - stdout: mergeResult.stdout || '', - stderr: mergeResult.stderr || '', - }; - } - if (mergeResult.status !== 0) { - if (stashRef) { - run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); - } - return { - status: 'failed', - note: 'tracked doctor repair merge failed', - stdout: mergeResult.stdout || '', - stderr: mergeResult.stderr || '', - }; - } - - let cleanupResult; - try { - cleanupResult = cleanupProtectedBaseSandbox(blocked.repoRoot, metadata); - } catch (error) { - return { - status: 'failed', - note: `tracked doctor repair merge succeeded but sandbox cleanup failed: ${error.message}`, - stdout: mergeResult.stdout || '', - stderr: mergeResult.stderr || '', - }; - } - - let hookRefreshResult; - try { - hookRefreshResult = configureHooks(blocked.repoRoot, false); - } catch (error) { - return { - status: 'failed', - note: `tracked doctor repair merge succeeded but local hook refresh failed: ${error.message}`, - stdout: mergeResult.stdout || '', - stderr: mergeResult.stderr || '', - }; - } - - if (stashRef) { - run('git', ['-C', blocked.repoRoot, 'stash', 
'drop', stashRef], { timeout: 20_000 }); - } - - return { - status: 'merged', - note: 'fast-forwarded tracked doctor repairs into the protected base workspace', - stdout: mergeResult.stdout || '', - stderr: mergeResult.stderr || '', - cleanup: cleanupResult, - hookRefresh: hookRefreshResult, - }; -} - -function syncDoctorLocalSupportFiles(repoRoot, dryRun) { - return []; -} - -function runDoctorInSandbox(options, blocked) { - const startResult = startProtectedBaseSandbox(blocked, { - taskName: `${SHORT_TOOL_NAME}-doctor`, - sandboxSuffix: 'gx-doctor', - }); - const metadata = startResult.metadata; - - const sandboxTarget = resolveSandboxTarget(blocked.repoRoot, metadata.worktreePath, options.target); - const nestedResult = run( - process.execPath, - [__filename, ...buildSandboxDoctorArgs(options, sandboxTarget)], - { cwd: metadata.worktreePath }, - ); - if (isSpawnFailure(nestedResult)) { - throw nestedResult.error; - } - - let autoCommitResult = { - status: 'skipped', - note: 'sandbox doctor did not complete successfully', - }; - let finishResult = { - status: 'skipped', - note: 'sandbox doctor did not complete successfully', - }; - - let protectedBaseRepairSyncResult = { - status: 'skipped', - note: 'sandbox doctor did not complete successfully', - }; - let lockSyncResult = { - status: 'skipped', - note: 'sandbox doctor did not complete successfully', - }; - let sandboxLockContent = null; - let postSandboxAutoFinishSummary = { - enabled: false, - attempted: 0, - completed: 0, - skipped: 0, - failed: 0, - details: ['Skipped auto-finish sweep (sandbox doctor did not complete successfully).'], - }; - let omxScaffoldSyncResult = { - status: 'skipped', - note: 'sandbox doctor did not complete successfully', - }; - if (nestedResult.status === 0) { - const omxScaffoldOps = ensureOmxScaffold(blocked.repoRoot, Boolean(options.dryRun)); - const changedOmxPaths = omxScaffoldOps.filter((operation) => operation.status !== 'unchanged'); - if (changedOmxPaths.length === 0) { - 
omxScaffoldSyncResult = { - status: 'unchanged', - note: '.omx scaffold already in sync', - operations: omxScaffoldOps, - }; - } else { - omxScaffoldSyncResult = { - status: options.dryRun ? 'would-sync' : 'synced', - note: `${options.dryRun ? 'would sync' : 'synced'} ${changedOmxPaths.length} .omx path(s)`, - operations: omxScaffoldOps, - }; - } - - if (!options.dryRun) { - autoCommitResult = autoCommitDoctorSandboxChanges(metadata); - if (autoCommitResult.status === 'committed') { - finishResult = finishDoctorSandboxBranch(blocked, metadata, options); - } else if (autoCommitResult.status === 'no-changes') { - finishResult = { - status: 'skipped', - note: 'no doctor changes to auto-finish', - }; - } else if (autoCommitResult.status !== 'failed') { - finishResult = { - status: 'skipped', - note: 'auto-commit did not run', - }; - } - } else { - autoCommitResult = { - status: 'skipped', - note: 'dry-run skips doctor sandbox auto-commit', - }; - finishResult = { - status: 'skipped', - note: 'dry-run skips doctor sandbox finish flow', - }; - } - - const sandboxLockPath = path.join(metadata.worktreePath, LOCK_FILE_RELATIVE); - const baseLockPath = path.join(blocked.repoRoot, LOCK_FILE_RELATIVE); - if (!fs.existsSync(baseLockPath)) { - lockSyncResult = { - status: 'skipped', - note: `${LOCK_FILE_RELATIVE} missing in protected base workspace`, - }; - } else if (!fs.existsSync(sandboxLockPath)) { - lockSyncResult = { - status: 'skipped', - note: `${LOCK_FILE_RELATIVE} missing in sandbox worktree`, - }; - } else { - const sourceContent = stripDoctorSandboxLocks( - fs.readFileSync(sandboxLockPath, 'utf8'), - metadata.branch, - ); - sandboxLockContent = sourceContent; - const destinationContent = fs.readFileSync(baseLockPath, 'utf8'); - if (sourceContent === destinationContent) { - lockSyncResult = { - status: 'unchanged', - note: `${LOCK_FILE_RELATIVE} already in sync`, - }; - } else { - fs.mkdirSync(path.dirname(baseLockPath), { recursive: true }); - 
fs.writeFileSync(baseLockPath, sourceContent, 'utf8'); - lockSyncResult = { - status: 'synced', - note: `${LOCK_FILE_RELATIVE} synced from sandbox`, - }; - } - } - - protectedBaseRepairSyncResult = mergeDoctorSandboxRepairsBackToProtectedBase( - options, - blocked, - metadata, - autoCommitResult, - finishResult, - ); - - syncDoctorLocalSupportFiles(blocked.repoRoot, Boolean(options.dryRun)); - - const postMergeOmxScaffoldOps = ensureOmxScaffold(blocked.repoRoot, Boolean(options.dryRun)); - const postMergeChangedOmxPaths = postMergeOmxScaffoldOps.filter((operation) => operation.status !== 'unchanged'); - if (postMergeChangedOmxPaths.length === 0) { - omxScaffoldSyncResult = { - status: 'unchanged', - note: '.omx scaffold already in sync', - operations: postMergeOmxScaffoldOps, - }; - } else { - omxScaffoldSyncResult = { - status: options.dryRun ? 'would-sync' : 'synced', - note: `${options.dryRun ? 'would sync' : 'synced'} ${postMergeChangedOmxPaths.length} .omx path(s)`, - operations: postMergeOmxScaffoldOps, - }; - } - - const postMergeBaseLockPath = path.join(blocked.repoRoot, LOCK_FILE_RELATIVE); - if (sandboxLockContent === null) { - lockSyncResult = { - status: 'skipped', - note: `${LOCK_FILE_RELATIVE} missing in sandbox worktree`, - }; - } else if (!fs.existsSync(postMergeBaseLockPath)) { - fs.mkdirSync(path.dirname(postMergeBaseLockPath), { recursive: true }); - fs.writeFileSync(postMergeBaseLockPath, sandboxLockContent, 'utf8'); - lockSyncResult = { - status: 'synced', - note: `${LOCK_FILE_RELATIVE} recreated from sandbox`, - }; - } else { - const destinationContent = fs.readFileSync(postMergeBaseLockPath, 'utf8'); - if (sandboxLockContent === destinationContent) { - lockSyncResult = { - status: 'unchanged', - note: `${LOCK_FILE_RELATIVE} already in sync`, - }; - } else { - fs.mkdirSync(path.dirname(postMergeBaseLockPath), { recursive: true }); - fs.writeFileSync(postMergeBaseLockPath, sandboxLockContent, 'utf8'); - lockSyncResult = { - status: 'synced', - 
note: `${LOCK_FILE_RELATIVE} synced from sandbox`, - }; - } - } - - postSandboxAutoFinishSummary = autoFinishReadyAgentBranches(blocked.repoRoot, { - baseBranch: blocked.branch, - dryRun: options.dryRun, - waitForMerge: options.waitForMerge, - excludeBranches: [metadata.branch], - }); - } - - if (options.json) { - if (nestedResult.stdout) { - if (nestedResult.status === 0) { - try { - const parsed = JSON.parse(nestedResult.stdout); - process.stdout.write( - JSON.stringify( - { - ...parsed, - protectedBaseRepairSync: protectedBaseRepairSyncResult, - sandboxOmxScaffoldSync: omxScaffoldSyncResult, - sandboxLockSync: lockSyncResult, - sandboxAutoCommit: autoCommitResult, - sandboxFinish: finishResult, - autoFinish: postSandboxAutoFinishSummary, - }, - null, - 2, - ) + '\n', - ); - } catch { - process.stdout.write(nestedResult.stdout); - } - } else { - process.stdout.write(nestedResult.stdout); - } - } - if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); - } else { - console.log( - `[${TOOL_NAME}] doctor detected protected branch '${blocked.branch}'. 
` + - `Running repairs in sandbox branch '${metadata.branch || 'agent/'}'.`, - ); - if (startResult.stdout) process.stdout.write(startResult.stdout); - if (startResult.stderr) process.stderr.write(startResult.stderr); - if (nestedResult.stdout) process.stdout.write(nestedResult.stdout); - if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); - if (nestedResult.status === 0) { - if (autoCommitResult.status === 'committed') { - console.log( - `[${TOOL_NAME}] Auto-committed doctor repairs in sandbox branch '${metadata.branch}'.`, - ); - } else if (autoCommitResult.status === 'failed') { - console.log(`[${TOOL_NAME}] Doctor sandbox auto-commit failed; branch left for manual follow-up.`); - if (autoCommitResult.stdout) process.stdout.write(autoCommitResult.stdout); - if (autoCommitResult.stderr) process.stderr.write(autoCommitResult.stderr); - } else { - console.log(`[${TOOL_NAME}] Doctor sandbox auto-commit skipped: ${autoCommitResult.note}.`); - } - - if (protectedBaseRepairSyncResult.status === 'merged') { - console.log(`[${TOOL_NAME}] Fast-forwarded tracked doctor repairs into the protected branch workspace.`); - } else if (protectedBaseRepairSyncResult.status === 'unchanged') { - console.log(`[${TOOL_NAME}] Protected branch workspace already had the tracked doctor repairs.`); - } else if (protectedBaseRepairSyncResult.status === 'would-merge') { - console.log(`[${TOOL_NAME}] Dry run: would fast-forward tracked doctor repairs into the protected branch workspace.`); - } else if (protectedBaseRepairSyncResult.status === 'failed') { - console.log(`[${TOOL_NAME}] Protected branch tracked repair merge failed: ${protectedBaseRepairSyncResult.note}.`); - if (protectedBaseRepairSyncResult.stdout) process.stdout.write(protectedBaseRepairSyncResult.stdout); - if (protectedBaseRepairSyncResult.stderr) process.stderr.write(protectedBaseRepairSyncResult.stderr); - } else { - console.log(`[${TOOL_NAME}] Protected branch tracked repair merge skipped: 
${protectedBaseRepairSyncResult.note}.`); - } - - if (lockSyncResult.status === 'synced') { - console.log( - `[${TOOL_NAME}] Synced repaired lock registry back to protected branch workspace (${LOCK_FILE_RELATIVE}).`, - ); - } else if (lockSyncResult.status === 'unchanged') { - console.log(`[${TOOL_NAME}] Lock registry already synced in protected branch workspace.`); - } else { - console.log(`[${TOOL_NAME}] Lock registry sync skipped: ${lockSyncResult.note}.`); - } - - if (finishResult.status === 'completed') { - console.log(`[${TOOL_NAME}] Auto-finish flow completed for sandbox branch '${metadata.branch}'.`); - if (finishResult.stdout) process.stdout.write(finishResult.stdout); - if (finishResult.stderr) process.stderr.write(finishResult.stderr); - } else if (finishResult.status === 'pending') { - console.log( - `[${TOOL_NAME}] Auto-finish pending for sandbox branch '${metadata.branch}': ${finishResult.note}.`, - ); - if (finishResult.prUrl) { - console.log(`[${TOOL_NAME}] PR: ${finishResult.prUrl}`); - } - if (finishResult.stdout) process.stdout.write(finishResult.stdout); - if (finishResult.stderr) process.stderr.write(finishResult.stderr); - } else if (finishResult.status === 'failed') { - console.log(`[${TOOL_NAME}] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); - console.log(`[${TOOL_NAME}] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); - if (finishResult.stdout) process.stdout.write(finishResult.stdout); - if (finishResult.stderr) process.stderr.write(finishResult.stderr); - } else { - console.log(`[${TOOL_NAME}] Auto-finish skipped: ${finishResult.note}.`); - } - - printAutoFinishSummary(postSandboxAutoFinishSummary, { - baseBranch: blocked.branch, - verbose: options.verboseAutoFinish, - }); - if (omxScaffoldSyncResult.status === 'synced') { - console.log(`[${TOOL_NAME}] Synced .omx scaffold back to protected branch workspace.`); - } else if (omxScaffoldSyncResult.status === 'unchanged') { - 
console.log(`[${TOOL_NAME}] .omx scaffold already aligned in protected branch workspace.`); - } else if (omxScaffoldSyncResult.status === 'would-sync') { - console.log(`[${TOOL_NAME}] Dry run: would sync .omx scaffold back to protected branch workspace.`); - } else { - console.log(`[${TOOL_NAME}] .omx scaffold sync skipped: ${omxScaffoldSyncResult.note}.`); - } - } - } - - if (typeof nestedResult.status === 'number') { - let exitCode = nestedResult.status; - if (exitCode === 0 && autoCommitResult.status === 'failed') { - exitCode = 1; - } - if ( - exitCode === 0 && - autoCommitResult.status === 'committed' && - (finishResult.status === 'failed' || finishResult.status === 'pending') - ) { - exitCode = 1; - } - if (exitCode === 0 && protectedBaseRepairSyncResult.status === 'failed') { - exitCode = 1; - } - process.exitCode = exitCode; - return; - } - process.exitCode = 1; -} - -function runSetupInSandbox(options, blocked, repoLabel = '') { - const startResult = startProtectedBaseSandbox(blocked, { - taskName: `${SHORT_TOOL_NAME}-setup`, - sandboxSuffix: 'gx-setup', - }); - const metadata = startResult.metadata; - - if (startResult.stdout) process.stdout.write(startResult.stdout); - if (startResult.stderr) process.stderr.write(startResult.stderr); - console.log( - `[${TOOL_NAME}] setup blocked on protected branch '${blocked.branch}' in an initialized repo; ` + - 'refreshing through a sandbox worktree and syncing managed bootstrap files back locally.', - ); - - const sandboxTarget = resolveSandboxTarget(blocked.repoRoot, metadata.worktreePath, options.target); - const nestedResult = run( - process.execPath, - [__filename, ...buildSandboxSetupArgs(options, sandboxTarget)], - { cwd: metadata.worktreePath }, - ); - if (isSpawnFailure(nestedResult)) { - throw nestedResult.error; - } - if (nestedResult.status !== 0) { - if (nestedResult.stdout) process.stdout.write(nestedResult.stdout); - if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); - throw new Error( 
- `sandboxed setup failed for protected branch '${blocked.branch}'. ` + - `Inspect sandbox at ${metadata.worktreePath}`, - ); - } - - const syncOptions = { - ...options, - target: blocked.repoRoot, - recursive: false, - allowProtectedBaseWrite: true, - }; - const { installPayload, fixPayload, parentWorkspace } = runSetupBootstrapInternal(syncOptions); - printOperations(`Setup/install${repoLabel}`, installPayload, syncOptions.dryRun); - printOperations(`Setup/fix${repoLabel}`, fixPayload, syncOptions.dryRun); - if (!syncOptions.dryRun && parentWorkspace) { - console.log(`[${TOOL_NAME}] Parent workspace view: ${parentWorkspace.workspacePath}`); - } - - const scanResult = runScanInternal({ target: blocked.repoRoot, json: false }); - const currentBaseBranch = currentBranchName(scanResult.repoRoot); - const autoFinishSummary = autoFinishReadyAgentBranches(scanResult.repoRoot, { - baseBranch: currentBaseBranch, - dryRun: syncOptions.dryRun, - }); - printScanResult(scanResult, false); - if (autoFinishSummary.enabled) { - console.log( - `[${TOOL_NAME}] Auto-finish sweep (base=${currentBaseBranch}): attempted=${autoFinishSummary.attempted}, completed=${autoFinishSummary.completed}, skipped=${autoFinishSummary.skipped}, failed=${autoFinishSummary.failed}`, - ); - for (const detail of autoFinishSummary.details) { - console.log(`[${TOOL_NAME}] ${detail}`); - } - } else if (autoFinishSummary.details.length > 0) { - console.log(`[${TOOL_NAME}] ${autoFinishSummary.details[0]}`); - } - - const cleanupResult = cleanupProtectedBaseSandbox(blocked.repoRoot, metadata); - console.log( - `[${TOOL_NAME}] Protected-base setup sandbox cleanup: ${cleanupResult.note} ` + - `(worktree=${cleanupResult.worktree}, branch=${cleanupResult.branch}).`, - ); - - return { - scanResult, - }; -} - -function parseTargetFlag(rawArgs, defaultTarget = process.cwd()) { - const remaining = []; - let target = defaultTarget; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = 
rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--target requires a path value'); - } - target = next; - index += 1; - continue; - } - remaining.push(arg); - } - - return { target, args: remaining }; -} - -function parseReviewArgs(rawArgs) { - const parsed = parseTargetFlag(rawArgs, process.cwd()); - const passthroughArgs = [...parsed.args]; - if (passthroughArgs[0] === 'start') { - passthroughArgs.shift(); - } - return { - target: parsed.target, - passthroughArgs, - }; -} - -function parseAgentsArgs(rawArgs) { - const parsed = parseTargetFlag(rawArgs, process.cwd()); - const [subcommandRaw = '', ...rest] = parsed.args; - const subcommand = subcommandRaw || 'status'; - const options = { - target: parsed.target, - subcommand, - reviewIntervalSeconds: 30, - cleanupIntervalSeconds: 60, - idleMinutes: DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES, - }; - - for (let index = 0; index < rest.length; index += 1) { - const arg = rest[index]; - if (arg === '--review-interval') { - const next = rest[index + 1]; - if (!next) { - throw new Error('--review-interval requires an integer seconds value'); - } - const parsedValue = Number.parseInt(next, 10); - if (!Number.isInteger(parsedValue) || parsedValue < 5) { - throw new Error('--review-interval must be an integer >= 5 seconds'); - } - options.reviewIntervalSeconds = parsedValue; - index += 1; - continue; - } - if (arg === '--cleanup-interval') { - const next = rest[index + 1]; - if (!next) { - throw new Error('--cleanup-interval requires an integer seconds value'); - } - const parsedValue = Number.parseInt(next, 10); - if (!Number.isInteger(parsedValue) || parsedValue < 5) { - throw new Error('--cleanup-interval must be an integer >= 5 seconds'); - } - options.cleanupIntervalSeconds = parsedValue; - index += 1; - continue; - } - if (arg === '--idle-minutes') { - const next = rest[index + 1]; - if (!next) { - throw new Error('--idle-minutes requires an integer minutes 
value'); - } - const parsedValue = Number.parseInt(next, 10); - if (!Number.isInteger(parsedValue) || parsedValue < 1) { - throw new Error('--idle-minutes must be an integer >= 1'); - } - options.idleMinutes = parsedValue; - index += 1; - continue; - } - throw new Error(`Unknown option: ${arg}`); - } - - if (!['start', 'stop', 'status'].includes(options.subcommand)) { - throw new Error(`Unknown agents subcommand: ${options.subcommand}`); - } - - return options; -} - -function parseReportArgs(rawArgs) { - const options = { - target: process.cwd(), - subcommand: '', - repo: '', - scorecardJson: '', - outputDir: '', - date: '', - dryRun: false, - json: false, - }; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) throw new Error('--target requires a path value'); - options.target = next; - index += 1; - continue; - } - if (arg === '--repo') { - const next = rawArgs[index + 1]; - if (!next) throw new Error('--repo requires a value like github.com/owner/repo'); - options.repo = next; - index += 1; - continue; - } - if (arg === '--scorecard-json') { - const next = rawArgs[index + 1]; - if (!next) throw new Error('--scorecard-json requires a path value'); - options.scorecardJson = next; - index += 1; - continue; - } - if (arg === '--output-dir') { - const next = rawArgs[index + 1]; - if (!next) throw new Error('--output-dir requires a path value'); - options.outputDir = next; - index += 1; - continue; - } - if (arg === '--date') { - const next = rawArgs[index + 1]; - if (!next) throw new Error('--date requires a YYYY-MM-DD value'); - options.date = next; - index += 1; - continue; - } - if (arg === '--dry-run') { - options.dryRun = true; - continue; - } - if (arg === '--json') { - options.json = true; - continue; - } - if (arg.startsWith('-')) { - throw new Error(`Unknown option: ${arg}`); - } - if (!options.subcommand) { - options.subcommand = arg; - 
continue; - } - throw new Error(`Unexpected argument: ${arg}`); - } - - return options; -} - -function todayDateStamp() { - return new Date().toISOString().slice(0, 10); -} - -function inferGithubRepoFromOrigin(repoRoot) { - const rawOrigin = readGitConfig(repoRoot, 'remote.origin.url'); - if (!rawOrigin) return ''; - - const httpsMatch = rawOrigin.match(/github\.com[:/](.+?)(?:\.git)?$/i); - if (!httpsMatch) return ''; - const slug = (httpsMatch[1] || '').replace(/^\/+/, '').trim(); - if (!slug || !slug.includes('/')) return ''; - return `github.com/${slug}`; -} - -function inferGithubRepoSlug(rawValue) { - const raw = String(rawValue || '').trim(); - if (!raw) return ''; - const match = raw.match(/github\.com[:/](.+?)(?:\.git)?$/i); - if (!match) return ''; - const slug = String(match[1] || '') - .replace(/^\/+/, '') - .replace(/^github\.com\//i, '') - .trim(); - if (!slug || !slug.includes('/')) return ''; - return slug; -} - -function resolveScorecardRepo(repoRoot, explicitRepo) { - if (explicitRepo) { - return explicitRepo.trim(); - } - const inferred = inferGithubRepoFromOrigin(repoRoot); - if (inferred) return inferred; - throw new Error( - 'Unable to infer GitHub repo from origin remote. Pass --repo github.com/<owner>/<repo>.', - ); -} - -function runScorecardJson(repo) { - const result = run(SCORECARD_BIN, ['--repo', repo, '--format', 'json'], { allowFailure: true }); - if (result.status !== 0) { - const details = (result.stderr || result.stdout || '').trim(); - throw new Error( - `Failed to run scorecard CLI ('${SCORECARD_BIN} --repo ${repo} --format json').${details ? 
`\n${details}` : ''}`, - ); - } - - try { - return JSON.parse(result.stdout || '{}'); - } catch (error) { - throw new Error(`Unable to parse scorecard JSON output: ${error.message}`); - } -} - -function readScorecardJsonFile(filePath) { - const absolute = path.resolve(filePath); - if (!fs.existsSync(absolute)) { - throw new Error(`scorecard JSON file not found: ${absolute}`); - } - try { - return JSON.parse(fs.readFileSync(absolute, 'utf8')); - } catch (error) { - throw new Error(`Unable to parse scorecard JSON file: ${error.message}`); - } -} - -function normalizeScorecardChecks(payload) { - const rawChecks = Array.isArray(payload?.checks) ? payload.checks : []; - return rawChecks.map((check) => { - const name = String(check?.name || 'Unknown'); - const rawScore = Number(check?.score); - const score = Number.isFinite(rawScore) ? rawScore : 0; - return { - name, - score, - risk: SCORECARD_RISK_BY_CHECK[name] || 'Unknown', - }; - }); -} - -function renderScorecardBaselineMarkdown({ repo, score, checks, capturedAt, scorecardVersion, reportDate }) { - const rows = checks - .map((item) => `| ${item.name} | ${item.score} | ${item.risk} |`) - .join('\n'); - - return [ - '# OpenSSF Scorecard Baseline Report', - '', - `- **Repository:** \`${repo}\``, - '- **Source:** generated by `gx report scorecard`', - `- **Captured at:** ${capturedAt}`, - `- **Scorecard version:** \`${scorecardVersion}\``, - `- **Overall score:** **${score} / 10**`, - '', - '## Check breakdown', - '', - '| Check | Score | Risk |', - '|---|---:|---|', - rows || '| (none) | 0 | Unknown |', - '', - `## Report date`, - '', - `- ${reportDate}`, - '', - ].join('\n'); -} - -function renderScorecardRemediationPlanMarkdown({ baselineRelativePath, checks }) { - const failing = checks.filter((item) => item.score < 10); - const failingRows = failing - .sort((a, b) => a.score - b.score || a.name.localeCompare(b.name)) - .map((item) => `| ${item.name} | ${item.score} | ${item.risk} |`) - .join('\n'); - - return [ - 
'# OpenSSF Scorecard Remediation Plan', - '', - `Based on baseline report: \`${baselineRelativePath}\`.`, - '', - '## Failing checks', - '', - '| Check | Score | Risk |', - '|---|---:|---|', - (failingRows || '| None | 10 | N/A |'), - '', - '## Priority order', - '', - '1. Fix **High** risk checks first (especially score 0 items).', - '2. Then close **Medium** risk checks with score < 10.', - '3. Finally address **Low** risk ecosystem/process checks.', - '', - '## Verification loop', - '', - '1. Run scorecard again.', - '2. Re-generate baseline + remediation files.', - '3. Compare score deltas and track improved checks.', - '', - ].join('\n'); -} - -function parseBranchList(rawValue) { - return String(rawValue || '') - .split(/[\s,]+/) - .map((item) => item.trim()) - .filter(Boolean); -} - -function uniquePreserveOrder(items) { - const seen = new Set(); - const result = []; - for (const item of items) { - if (seen.has(item)) continue; - seen.add(item); - result.push(item); - } - return result; -} - -function readConfiguredProtectedBranches(repoRoot) { - const result = gitRun(repoRoot, ['config', '--get', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); - if (result.status !== 0) { - return null; - } - const parsed = uniquePreserveOrder(parseBranchList(result.stdout.trim())); - if (parsed.length === 0) { - return null; - } - return parsed; -} - -function listLocalUserBranches(repoRoot) { - const result = gitRun(repoRoot, ['for-each-ref', '--format=%(refname:short)', 'refs/heads'], { allowFailure: true }); - const branchNames = result.status === 0 - ? 
uniquePreserveOrder( - String(result.stdout || '') - .split('\n') - .map((item) => item.trim()) - .filter(Boolean), - ) - : []; - - const additionalUserBranches = branchNames.filter( - (branchName) => - !branchName.startsWith('agent/') && - !DEFAULT_PROTECTED_BRANCHES.includes(branchName), - ); - if (additionalUserBranches.length > 0) { - return additionalUserBranches; - } - - const current = gitRun(repoRoot, ['branch', '--show-current'], { allowFailure: true }); - if (current.status !== 0) { - return []; - } - - const branchName = String(current.stdout || '').trim(); - if ( - !branchName || - branchName.startsWith('agent/') || - DEFAULT_PROTECTED_BRANCHES.includes(branchName) - ) { - return []; - } - - return [branchName]; -} - -function listLocalAgentBranches(repoRoot) { - const result = gitRun( - repoRoot, - ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/'], - { allowFailure: true }, - ); - if (result.status !== 0) { - return []; - } - return uniquePreserveOrder( - String(result.stdout || '') - .split('\n') - .map((item) => item.trim()) - .filter(Boolean), - ); -} - -function mapWorktreePathsByBranch(repoRoot) { - const result = gitRun(repoRoot, ['worktree', 'list', '--porcelain'], { allowFailure: true }); - const map = new Map(); - if (result.status !== 0) { - return map; - } - - const lines = String(result.stdout || '').split('\n'); - let currentWorktree = ''; - for (const line of lines) { - if (line.startsWith('worktree ')) { - currentWorktree = line.slice('worktree '.length).trim(); - continue; - } - if (line.startsWith('branch refs/heads/')) { - const branchName = line.slice('branch refs/heads/'.length).trim(); - if (currentWorktree && branchName) { - map.set(branchName, currentWorktree); - } - } - } - return map; -} - -function hasSignificantWorkingTreeChanges(worktreePath) { - const result = run('git', [ - '-C', - worktreePath, - 'status', - '--porcelain', - '--untracked-files=normal', - '--', - ]); - if (result.status !== 0) { - return 
true; - } - - const lines = String(result.stdout || '') - .split('\n') - .map((line) => line.trimEnd()) - .filter((line) => line.length > 0); - - for (const line of lines) { - const pathPart = (line.length > 3 ? line.slice(3) : '').trim(); - if (!pathPart) continue; - if (pathPart === LOCK_FILE_RELATIVE) continue; - if (pathPart.startsWith(`${LOCK_FILE_RELATIVE} -> `)) continue; - if (pathPart.endsWith(` -> ${LOCK_FILE_RELATIVE}`)) continue; - return true; - } - return false; -} - -function autoFinishReadyAgentBranches(repoRoot, options = {}) { - const baseBranch = String(options.baseBranch || '').trim(); - const dryRun = Boolean(options.dryRun); - const waitForMerge = options.waitForMerge !== false; - const excludedBranches = new Set( - Array.isArray(options.excludeBranches) - ? options.excludeBranches.map((branch) => String(branch || '').trim()).filter(Boolean) - : [], - ); - - const summary = { - enabled: true, - baseBranch, - attempted: 0, - completed: 0, - skipped: 0, - failed: 0, - details: [], - }; - - if (!baseBranch || baseBranch === 'HEAD' || baseBranch.startsWith('agent/')) { - summary.enabled = false; - summary.details.push('Skipped auto-finish sweep (base branch is missing or not a non-agent local branch).'); - return summary; - } - - if (String(process.env.GUARDEX_DOCTOR_SANDBOX || '') === '1') { - summary.enabled = false; - summary.details.push('Skipped auto-finish sweep inside doctor sandbox pass.'); - return summary; - } - - if (String(process.env.GUARDEX_SKIP_AUTO_FINISH_READY_BRANCHES || '') === '1') { - summary.enabled = false; - summary.details.push('Skipped auto-finish sweep (GUARDEX_SKIP_AUTO_FINISH_READY_BRANCHES=1).'); - return summary; - } - - if (dryRun) { - summary.enabled = false; - summary.details.push('Skipped auto-finish sweep in dry-run mode.'); - return summary; - } - - const hasOrigin = gitRun(repoRoot, ['remote', 'get-url', 'origin'], { allowFailure: true }).status === 0; - if (!hasOrigin) { - summary.enabled = false; - 
summary.details.push('Skipped auto-finish sweep (origin remote missing).'); - return summary; - } - const explicitGhBin = Boolean(String(process.env.GUARDEX_GH_BIN || '').trim()); - if (!explicitGhBin && !originRemoteLooksLikeGithub(repoRoot)) { - summary.enabled = false; - summary.details.push('Skipped auto-finish sweep (origin remote is not GitHub).'); - return summary; - } - - const ghBin = process.env.GUARDEX_GH_BIN || 'gh'; - if (run(ghBin, ['--version']).status !== 0) { - summary.enabled = false; - summary.details.push(`Skipped auto-finish sweep (${ghBin} not available).`); - return summary; - } - - const branchWorktrees = mapWorktreePathsByBranch(repoRoot); - const agentBranches = listLocalAgentBranches(repoRoot); - if (agentBranches.length === 0) { - summary.enabled = false; - summary.details.push('No local agent branches found for auto-finish sweep.'); - return summary; - } - - for (const branch of agentBranches) { - if (excludedBranches.has(branch)) { - summary.skipped += 1; - summary.details.push(`[skip] ${branch}: excluded from this auto-finish sweep.`); - continue; - } - - if (branch === baseBranch) { - summary.skipped += 1; - summary.details.push(`[skip] ${branch}: source branch equals base branch.`); - continue; - } - - let counts; - try { - counts = aheadBehind(repoRoot, branch, baseBranch); - } catch (error) { - summary.failed += 1; - summary.details.push(`[fail] ${branch}: unable to compute ahead/behind (${error.message}).`); - continue; - } - - if (counts.ahead <= 0) { - summary.skipped += 1; - summary.details.push(`[skip] ${branch}: already merged into ${baseBranch}.`); - continue; - } - - const branchWorktree = branchWorktrees.get(branch) || ''; - if (branchWorktree && hasSignificantWorkingTreeChanges(branchWorktree)) { - summary.skipped += 1; - summary.details.push(`[skip] ${branch}: dirty worktree (${branchWorktree}).`); - continue; - } - - summary.attempted += 1; - const finishArgs = [ - '--branch', - branch, - '--base', - baseBranch, - 
'--via-pr', - waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', - '--cleanup', - ]; - const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot }); - const combinedOutput = [finishResult.stdout || '', finishResult.stderr || ''].join('\n').trim(); - - if (finishResult.status === 0) { - summary.completed += 1; - summary.details.push(`[done] ${branch}: auto-finish completed.`); - continue; - } - - const recoverableConflict = detectRecoverableAutoFinishConflict(combinedOutput); - if (recoverableConflict) { - summary.skipped += 1; - const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; - summary.details.push(`[skip] ${branch}: ${recoverableConflict.rawLabel}${tail}`); - continue; - } - - summary.failed += 1; - const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; - summary.details.push(`[fail] ${branch}: auto-finish failed.${tail}`); - } - - return summary; -} - -function ensureSetupProtectedBranches(repoRoot, dryRun) { - const localUserBranches = listLocalUserBranches(repoRoot); - if (localUserBranches.length === 0) { - return { - status: 'unchanged', - file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, - note: 'no additional local user branches detected', - }; - } - - const configured = readConfiguredProtectedBranches(repoRoot); - const currentBranches = configured || [...DEFAULT_PROTECTED_BRANCHES]; - const missingBranches = localUserBranches.filter((branchName) => !currentBranches.includes(branchName)); - if (missingBranches.length === 0) { - return { - status: 'unchanged', - file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, - note: 'local user branches already protected', - }; - } - - const nextBranches = uniquePreserveOrder([...currentBranches, ...missingBranches]); - if (!dryRun) { - writeProtectedBranches(repoRoot, nextBranches); - } - - return { - status: dryRun ? 
'would-update' : 'updated', - file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, - note: `added local user branch(es): ${missingBranches.join(', ')}`, - }; -} - -function readProtectedBranches(repoRoot) { - const result = gitRun(repoRoot, ['config', '--get', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); - if (result.status !== 0) { - return [...DEFAULT_PROTECTED_BRANCHES]; - } - - const parsed = uniquePreserveOrder(parseBranchList(result.stdout.trim())); - if (parsed.length === 0) { - return [...DEFAULT_PROTECTED_BRANCHES]; - } - return parsed; -} - -function writeProtectedBranches(repoRoot, branches) { - if (branches.length === 0) { - gitRun(repoRoot, ['config', '--unset-all', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); - return; - } - gitRun(repoRoot, ['config', GIT_PROTECTED_BRANCHES_KEY, branches.join(' ')]); -} - -function readGitConfig(repoRoot, key) { - const result = gitRun(repoRoot, ['config', '--get', key], { allowFailure: true }); - if (result.status !== 0) { - return ''; - } - return (result.stdout || '').trim(); -} - -function resolveBaseBranch(repoRoot, explicitBase) { - if (explicitBase) { - return explicitBase; - } - const configured = readGitConfig(repoRoot, GIT_BASE_BRANCH_KEY); - return configured || DEFAULT_BASE_BRANCH; -} - -function resolveSyncStrategy(repoRoot, explicitStrategy) { - const strategy = (explicitStrategy || readGitConfig(repoRoot, GIT_SYNC_STRATEGY_KEY) || DEFAULT_SYNC_STRATEGY) - .trim() - .toLowerCase(); - if (strategy !== 'rebase' && strategy !== 'merge') { - throw new Error(`Invalid sync strategy '${strategy}' (expected: rebase or merge)`); - } - return strategy; -} - -function currentBranchName(repoRoot) { - const result = gitRun(repoRoot, ['branch', '--show-current'], { allowFailure: true }); - if (result.status !== 0) { - throw new Error('Unable to detect current branch'); - } - const branch = (result.stdout || '').trim(); - if (!branch) { - throw new Error('Detached HEAD is not supported for sync 
operations'); - } - return branch; -} - -function repoHasHeadCommit(repoRoot) { - return gitRun(repoRoot, ['rev-parse', '--verify', 'HEAD'], { allowFailure: true }).status === 0; -} - -function readBranchDisplayName(repoRoot) { - const symbolic = gitRun(repoRoot, ['symbolic-ref', '--quiet', '--short', 'HEAD'], { allowFailure: true }); - if (symbolic.status === 0) { - const branch = String(symbolic.stdout || '').trim(); - if (!branch) { - return '(unknown)'; - } - return repoHasHeadCommit(repoRoot) ? branch : `${branch} (unborn; no commits yet)`; - } - - const detached = gitRun(repoRoot, ['rev-parse', '--short', 'HEAD'], { allowFailure: true }); - if (detached.status === 0) { - return `(detached at ${String(detached.stdout || '').trim()})`; - } - return '(unknown)'; -} - -function repoHasOriginRemote(repoRoot) { - return gitRun(repoRoot, ['remote', 'get-url', 'origin'], { allowFailure: true }).status === 0; -} - -function detectComposeHintFiles(repoRoot) { - return COMPOSE_HINT_FILES.filter((relativePath) => fs.existsSync(path.join(repoRoot, relativePath))); -} - -function printSetupRepoHints(repoRoot, baseBranch, repoLabel = '') { - const branchDisplay = readBranchDisplayName(repoRoot); - const hasHeadCommit = repoHasHeadCommit(repoRoot); - const hasOrigin = repoHasOriginRemote(repoRoot); - const composeFiles = detectComposeHintFiles(repoRoot); - if (hasHeadCommit && hasOrigin && composeFiles.length === 0) { - return; - } - - const label = repoLabel ? ` ${repoLabel}` : ''; - if (!hasHeadCommit) { - console.log(`[${TOOL_NAME}] Fresh repo onboarding${label}: current branch is ${branchDisplay}.`); - console.log(`[${TOOL_NAME}] Bootstrap commit${label}: git add . 
&& git commit -m "bootstrap gitguardex"`); - console.log( - `[${TOOL_NAME}] First agent flow${label}: ` + - `gx branch start "<task>" "codex" -> ` + - `gx locks claim --branch "$(git branch --show-current)" -> ` + - `gx branch finish --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`, - ); - } - if (!hasOrigin) { - console.log(`[${TOOL_NAME}] No origin remote${label}: finish and auto-merge flows stay local until you add one.`); - } - if (composeFiles.length > 0) { - console.log( - `[${TOOL_NAME}] Docker Compose helper${label}: detected ${composeFiles.join(', ')}. ` + - `Set GUARDEX_DOCKER_SERVICE and run 'bash scripts/guardex-docker-loader.sh -- <command>'.`, - ); - } -} - -function workingTreeIsDirty(repoRoot) { - const result = gitRun(repoRoot, ['status', '--porcelain'], { allowFailure: true }); - if (result.status !== 0) { - throw new Error('Unable to inspect git working tree status'); - } - const lines = (result.stdout || '').split('\n').filter((line) => line.length > 0); - const significant = lines.filter((line) => { - const pathPart = (line.length > 3 ? line.slice(3) : '').trim(); - if (!pathPart) return false; - if (pathPart === LOCK_FILE_RELATIVE) return false; - if (pathPart.startsWith(`${LOCK_FILE_RELATIVE} -> `)) return false; - if (pathPart.endsWith(` -> ${LOCK_FILE_RELATIVE}`)) return false; - return true; - }); - return significant.length > 0; -} - -function ensureOriginBaseRef(repoRoot, baseBranch) { - const fetch = gitRun(repoRoot, ['fetch', 'origin', baseBranch, '--quiet'], { allowFailure: true }); - if (fetch.status !== 0) { - throw new Error( - `Unable to fetch origin/${baseBranch}. 
Ensure remote 'origin' exists and branch '${baseBranch}' is available.`, - ); - } - const hasRemoteBase = gitRun(repoRoot, ['show-ref', '--verify', '--quiet', `refs/remotes/origin/${baseBranch}`], { - allowFailure: true, - }); - if (hasRemoteBase.status !== 0) { - throw new Error(`Remote base branch not found: origin/${baseBranch}`); - } -} - -function aheadBehind(repoRoot, branchRef, baseRef) { - const result = gitRun(repoRoot, ['rev-list', '--left-right', '--count', `${branchRef}...${baseRef}`], { - allowFailure: true, - }); - if (result.status !== 0) { - throw new Error(`Unable to compute ahead/behind for ${branchRef} vs ${baseRef}`); - } - const parts = (result.stdout || '').trim().split(/\s+/).filter(Boolean); - const ahead = Number.parseInt(parts[0] || '0', 10); - const behind = Number.parseInt(parts[1] || '0', 10); - return { ahead: Number.isFinite(ahead) ? ahead : 0, behind: Number.isFinite(behind) ? behind : 0 }; -} - -function lockRegistryStatus(repoRoot) { - const result = gitRun(repoRoot, ['status', '--porcelain', '--', LOCK_FILE_RELATIVE], { allowFailure: true }); - if (result.status !== 0) { - return { dirty: false, untracked: false }; - } - const lines = (result.stdout || '').split('\n').filter((line) => line.length > 0); - if (lines.length === 0) { - return { dirty: false, untracked: false }; - } - const untracked = lines.some((line) => line.startsWith('??')); - return { dirty: true, untracked }; -} - -function parseSyncArgs(rawArgs) { - const options = { - target: process.cwd(), - check: false, - base: '', - strategy: '', - ffOnly: false, - dryRun: false, - json: false, - allAgentBranches: false, - allowNonAgent: false, - allowDirty: false, - }; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--target requires a path value'); - } - options.target = next; - index += 1; - continue; - } - if (arg === '--base') { 
- const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--base requires a branch value'); - } - options.base = next; - index += 1; - continue; - } - if (arg === '--strategy') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--strategy requires a value (rebase|merge)'); - } - options.strategy = next; - index += 1; - continue; - } - if (arg === '--check') { - options.check = true; - continue; - } - if (arg === '--ff-only') { - options.ffOnly = true; - continue; - } - if (arg === '--dry-run') { - options.dryRun = true; - continue; - } - if (arg === '--json') { - options.json = true; - continue; - } - if (arg === '--all-agent-branches') { - options.allAgentBranches = true; - continue; - } - if (arg === '--allow-non-agent') { - options.allowNonAgent = true; - continue; - } - if (arg === '--allow-dirty') { - options.allowDirty = true; - continue; - } - throw new Error(`Unknown option: ${arg}`); - } - - return options; -} - -function parseCleanupArgs(rawArgs) { - const options = { - target: process.cwd(), - base: '', - branch: '', - dryRun: false, - forceDirty: false, - keepRemote: false, - keepCleanWorktrees: false, - includePrMerged: false, - idleMinutes: 0, - watch: false, - intervalSeconds: 60, - once: false, - maxBranches: 0, - }; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--target requires a path value'); - } - options.target = next; - index += 1; - continue; - } - if (arg === '--base') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--base requires a branch value'); - } - options.base = next; - index += 1; - continue; - } - if (arg === '--branch') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--branch requires an agent branch value'); - } - options.branch = next; - index += 1; - continue; - } - if (arg === '--dry-run') { - options.dryRun = 
true; - continue; - } - if (arg === '--force-dirty') { - options.forceDirty = true; - continue; - } - if (arg === '--keep-remote') { - options.keepRemote = true; - continue; - } - if (arg === '--keep-clean-worktrees') { - options.keepCleanWorktrees = true; - continue; - } - if (arg === '--include-pr-merged') { - options.includePrMerged = true; - continue; - } - if (arg === '--idle-minutes') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--idle-minutes requires an integer value'); - } - const parsed = Number.parseInt(next, 10); - if (!Number.isInteger(parsed) || parsed < 0) { - throw new Error('--idle-minutes must be an integer >= 0'); - } - options.idleMinutes = parsed; - index += 1; - continue; - } - if (arg === '--watch') { - options.watch = true; - continue; - } - if (arg === '--interval') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--interval requires an integer seconds value'); - } - const parsed = Number.parseInt(next, 10); - if (!Number.isInteger(parsed) || parsed < 5) { - throw new Error('--interval must be an integer >= 5 seconds'); - } - options.intervalSeconds = parsed; - index += 1; - continue; - } - if (arg === '--once') { - options.once = true; - continue; - } - if (arg === '--max-branches') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--max-branches requires an integer value'); - } - const parsed = Number.parseInt(next, 10); - if (!Number.isInteger(parsed) || parsed < 1) { - throw new Error('--max-branches must be an integer >= 1'); - } - options.maxBranches = parsed; - index += 1; - continue; - } - throw new Error(`Unknown option: ${arg}`); - } - - if (options.watch && options.idleMinutes === 0) { - options.idleMinutes = DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES; - } - - return options; -} - -function parseMergeArgs(rawArgs) { - const options = { - target: process.cwd(), - base: '', - into: '', - branches: [], - task: '', - agent: '', - }; - - for (let index = 0; index < 
rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--target requires a path value'); - } - options.target = next; - index += 1; - continue; - } - if (arg === '--base') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--base requires a branch value'); - } - options.base = next; - index += 1; - continue; - } - if (arg === '--into') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--into requires an agent/* branch value'); - } - options.into = next; - index += 1; - continue; - } - if (arg === '--branch') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--branch requires an agent/* branch value'); - } - options.branches.push(next); - index += 1; - continue; - } - if (arg === '--task') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--task requires a task value'); - } - options.task = next; - index += 1; - continue; - } - if (arg === '--agent') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--agent requires an agent value'); - } - options.agent = next; - index += 1; - continue; - } - throw new Error(`Unknown option: ${arg}`); - } - - if (options.branches.length === 0) { - throw new Error('merge requires at least one --branch input'); - } - - return options; -} - -function parseFinishArgs(rawArgs, defaults = {}) { - const options = { - target: process.cwd(), - base: '', - branch: '', - all: false, - dryRun: false, - waitForMerge: defaults.waitForMerge ?? true, - cleanup: defaults.cleanup ?? 
true, - keepRemote: false, - noAutoCommit: false, - failFast: false, - commitMessage: '', - mergeMode: defaults.mergeMode || 'pr', - }; - - for (let index = 0; index < rawArgs.length; index += 1) { - const arg = rawArgs[index]; - if (arg === '--target') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--target requires a path value'); - } - options.target = next; - index += 1; - continue; - } - if (arg === '--base') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--base requires a branch value'); - } - options.base = next; - index += 1; - continue; - } - if (arg === '--branch') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--branch requires an agent/* branch value'); - } - options.branch = next; - index += 1; - continue; - } - if (arg === '--commit-message') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--commit-message requires a value'); - } - options.commitMessage = next; - index += 1; - continue; - } - if (arg === '--all') { - options.all = true; - continue; - } - if (arg === '--dry-run') { - options.dryRun = true; - continue; - } - if (arg === '--wait-for-merge') { - options.waitForMerge = true; - continue; - } - if (arg === '--no-wait-for-merge') { - options.waitForMerge = false; - continue; - } - if (arg === '--via-pr') { - options.mergeMode = 'pr'; - continue; - } - if (arg === '--direct-only') { - options.mergeMode = 'direct'; - continue; - } - if (arg === '--mode') { - const next = rawArgs[index + 1]; - if (!next) { - throw new Error('--mode requires a value'); - } - if (!['auto', 'direct', 'pr'].includes(next)) { - throw new Error(`Invalid --mode value: ${next} (expected auto|direct|pr)`); - } - options.mergeMode = next; - index += 1; - continue; - } - if (arg === '--cleanup') { - options.cleanup = true; - continue; - } - if (arg === '--no-cleanup') { - options.cleanup = false; - continue; - } - if (arg === '--keep-remote') { - options.keepRemote = true; - 
continue; - } - if (arg === '--no-auto-commit') { - options.noAutoCommit = true; - continue; - } - if (arg === '--fail-fast') { - options.failFast = true; - continue; - } - throw new Error(`Unknown option: ${arg}`); - } - - if (options.branch && !options.branch.startsWith('agent/')) { - throw new Error(`--branch must reference an agent/* branch (received: ${options.branch})`); - } - - return options; -} - -function listAgentWorktrees(repoRoot) { - const result = gitRun(repoRoot, ['worktree', 'list', '--porcelain'], { allowFailure: true }); - if (result.status !== 0) { - throw new Error('Unable to list git worktrees for finish command'); - } - - const entries = []; - let currentPath = ''; - let currentBranchRef = ''; - const lines = String(result.stdout || '').split('\n'); - for (const line of lines) { - if (!line.trim()) { - if (currentPath && currentBranchRef.startsWith('refs/heads/agent/')) { - entries.push({ - worktreePath: currentPath, - branch: currentBranchRef.replace(/^refs\/heads\//, ''), - }); - } - currentPath = ''; - currentBranchRef = ''; - continue; - } - if (line.startsWith('worktree ')) { - currentPath = line.slice('worktree '.length).trim(); - continue; - } - if (line.startsWith('branch ')) { - currentBranchRef = line.slice('branch '.length).trim(); - continue; - } - } - if (currentPath && currentBranchRef.startsWith('refs/heads/agent/')) { - entries.push({ - worktreePath: currentPath, - branch: currentBranchRef.replace(/^refs\/heads\//, ''), - }); - } - - return entries; -} - -function listLocalAgentBranchesForFinish(repoRoot) { - const result = gitRun( - repoRoot, - ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/'], - { allowFailure: true }, - ); - if (result.status !== 0) { - throw new Error('Unable to list local agent branches'); - } - return uniquePreserveOrder( - String(result.stdout || '') - .split('\n') - .map((line) => line.trim()) - .filter((line) => line.startsWith('agent/')), - ); -} - -function 
gitQuietChangeResult(worktreePath, args) { - const result = run('git', ['-C', worktreePath, ...args], { stdio: 'pipe' }); - if (result.status === 0) { - return false; - } - if (result.status === 1) { - return true; - } - throw new Error( - `git ${args.join(' ')} failed in ${worktreePath}: ${( - result.stderr || result.stdout || '' - ).trim()}`, - ); -} - -function worktreeHasLocalChanges(worktreePath) { - const hasUnstaged = gitQuietChangeResult(worktreePath, [ - 'diff', - '--quiet', - '--', - '.', - ':(exclude).omx/state/agent-file-locks.json', - ]); - if (hasUnstaged) { - return true; - } - - const hasStaged = gitQuietChangeResult(worktreePath, [ - 'diff', - '--cached', - '--quiet', - '--', - '.', - ':(exclude).omx/state/agent-file-locks.json', - ]); - if (hasStaged) { - return true; - } - - const untracked = run('git', ['-C', worktreePath, 'ls-files', '--others', '--exclude-standard'], { - stdio: 'pipe', - }); - if (untracked.status !== 0) { - throw new Error(`Unable to inspect untracked files in ${worktreePath}`); - } - return String(untracked.stdout || '').trim().length > 0; -} - -function gitOutputLines(worktreePath, args) { - const result = run('git', ['-C', worktreePath, ...args], { stdio: 'pipe' }); - if (result.status !== 0) { - throw new Error( - `git ${args.join(' ')} failed in ${worktreePath}: ${( - result.stderr || result.stdout || '' - ).trim()}`, - ); - } - return String(result.stdout || '') - .split('\n') - .map((line) => line.trim()) - .filter(Boolean); -} - -function claimLocksForAutoCommit(repoRoot, worktreePath, branch) { - const changedFiles = uniquePreserveOrder([ - ...gitOutputLines(worktreePath, ['diff', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), - ...gitOutputLines(worktreePath, ['diff', '--cached', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), - ...gitOutputLines(worktreePath, ['ls-files', '--others', '--exclude-standard']), - ]); - - if (changedFiles.length > 0) { - const claim = 
runPackageAsset('lockTool', ['claim', '--branch', branch, ...changedFiles], {
-      cwd: repoRoot,
-      stdio: 'pipe',
-    });
-    if (claim.status !== 0) {
-      throw new Error(
-        `Lock claim failed for ${branch}: ${(
-          claim.stderr || claim.stdout || ''
-        ).trim()}`,
-      );
-    }
-  }
-
-  const deletedFiles = uniquePreserveOrder([
-    ...gitOutputLines(worktreePath, [
-      'diff',
-      '--name-only',
-      '--diff-filter=D',
-      '--',
-      '.',
-      ':(exclude).omx/state/agent-file-locks.json',
-    ]),
-    ...gitOutputLines(worktreePath, [
-      'diff',
-      '--cached',
-      '--name-only',
-      '--diff-filter=D',
-      '--',
-      '.',
-      ':(exclude).omx/state/agent-file-locks.json',
-    ]),
-  ]);
-
-  if (deletedFiles.length > 0) {
-    const allowDelete = runPackageAsset('lockTool', ['allow-delete', '--branch', branch, ...deletedFiles], {
-      cwd: repoRoot,
-      stdio: 'pipe',
-    });
-    if (allowDelete.status !== 0) {
-      throw new Error(
-        `Delete-lock grant failed for ${branch}: ${(
-          allowDelete.stderr || allowDelete.stdout || ''
-        ).trim()}`,
-      );
-    }
-  }
-}
-
-function branchExists(repoRoot, branch) {
-  const result = gitRun(repoRoot, ['show-ref', '--verify', '--quiet', `refs/heads/${branch}`], {
-    allowFailure: true,
-  });
-  return result.status === 0;
-}
-
-function resolveFinishBaseBranch(repoRoot, _sourceBranch, explicitBase) {
-  if (explicitBase) {
-    return explicitBase;
-  }
-
-  const configured = readGitConfig(repoRoot, GIT_BASE_BRANCH_KEY);
-  if (configured) {
-    return configured;
-  }
-
-  return DEFAULT_BASE_BRANCH;
-}
-
-function branchMergedIntoBase(repoRoot, branch, baseBranch) {
-  if (!branchExists(repoRoot, baseBranch)) {
-    return false;
-  }
-  const result = gitRun(repoRoot, ['merge-base', '--is-ancestor', branch, baseBranch], {
-    allowFailure: true,
-  });
-  if (result.status === 0) {
-    return true;
-  }
-  if (result.status === 1) {
-    return false;
-  }
-  throw new Error(`Unable to determine merge status for ${branch} -> ${baseBranch}`);
-}
-
-function autoCommitWorktreeForFinish(repoRoot, worktreePath, branch, options) {
-  const hasChanges = worktreeHasLocalChanges(worktreePath);
-  if (!hasChanges) {
-    return { changed: false, committed: false };
-  }
-
-  if (options.noAutoCommit) {
-    throw new Error(
-      `Branch '${branch}' has local changes in ${worktreePath}. Re-run without --no-auto-commit or commit manually first.`,
-    );
-  }
-
-  if (options.dryRun) {
-    return { changed: true, committed: false, dryRun: true };
-  }
-
-  claimLocksForAutoCommit(repoRoot, worktreePath, branch);
-
-  const addResult = run('git', ['-C', worktreePath, 'add', '-A'], { stdio: 'pipe' });
-  if (addResult.status !== 0) {
-    throw new Error(`git add failed in ${worktreePath}: ${(addResult.stderr || addResult.stdout || '').trim()}`);
-  }
-
-  const stagedHasChanges = gitQuietChangeResult(worktreePath, [
-    'diff',
-    '--cached',
-    '--quiet',
-    '--',
-    '.',
-    ':(exclude).omx/state/agent-file-locks.json',
-  ]);
-  if (!stagedHasChanges) {
-    return { changed: true, committed: false };
-  }
-
-  const commitMessage = options.commitMessage || `Auto-finish: ${branch}`;
-  const commitResult = run('git', ['-C', worktreePath, 'commit', '-m', commitMessage], { stdio: 'pipe' });
-  if (commitResult.status !== 0) {
-    throw new Error(
-      `Auto-commit failed on '${branch}': ${(
-        commitResult.stderr || commitResult.stdout || ''
-      ).trim()}`,
-    );
-  }
-
-  return { changed: true, committed: true, message: commitMessage };
-}
-
-function syncOperation(repoRoot, strategy, baseRef, ffOnly) {
-  if (strategy === 'rebase') {
-    if (ffOnly) {
-      throw new Error('--ff-only is only supported with --strategy merge');
-    }
-    const rebased = run('git', ['-C', repoRoot, 'rebase', baseRef], { stdio: 'pipe' });
-    if (rebased.status !== 0) {
-      const details = (rebased.stderr || rebased.stdout || '').trim();
-      const gitDir = path.join(repoRoot, '.git');
-      const rebaseActive = fs.existsSync(path.join(gitDir, 'rebase-merge')) || fs.existsSync(path.join(gitDir, 'rebase-apply'));
-      const help = rebaseActive
-        ? '\nResolve conflicts, then run: git rebase --continue\nOr abort: git rebase --abort'
-        : '';
-      throw new Error(`Sync failed during rebase onto ${baseRef}.${details ? `\n${details}` : ''}${help}`);
-    }
-    return;
-  }
-
-  const mergeArgs = ['-C', repoRoot, 'merge', '--no-edit'];
-  if (ffOnly) {
-    mergeArgs.push('--ff-only');
-  }
-  mergeArgs.push(baseRef);
-  const merged = run('git', mergeArgs, { stdio: 'pipe' });
-  if (merged.status !== 0) {
-    const details = (merged.stderr || merged.stdout || '').trim();
-    const gitDir = path.join(repoRoot, '.git');
-    const mergeActive = fs.existsSync(path.join(gitDir, 'MERGE_HEAD'));
-    const help = mergeActive ? '\nResolve conflicts, then run: git commit\nOr abort: git merge --abort' : '';
-    throw new Error(`Sync failed during merge from ${baseRef}.${details ? `\n${details}` : ''}${help}`);
-  }
-}
-
-function isInteractiveTerminal() {
-  return Boolean(process.stdin.isTTY && process.stdout.isTTY);
-}
-
-const stdinWaitArray = new Int32Array(new SharedArrayBuffer(4));
-
-function sleepSyncMs(milliseconds) {
-  Atomics.wait(stdinWaitArray, 0, 0, milliseconds);
-}
-
-function readSingleLineFromStdin() {
-  let input = '';
-  const buffer = Buffer.alloc(1);
-
-  while (true) {
-    let bytesRead = 0;
-    try {
-      bytesRead = fs.readSync(process.stdin.fd, buffer, 0, 1);
-    } catch (error) {
-      if (error && ['EAGAIN', 'EWOULDBLOCK', 'EINTR'].includes(error.code)) {
-        sleepSyncMs(15);
-        continue;
-      }
-      return input;
-    }
-
-    if (bytesRead === 0) {
-      if (process.stdin.isTTY) {
-        sleepSyncMs(15);
-        continue;
-      }
-      return input;
-    }
-
-    const char = buffer.toString('utf8', 0, bytesRead);
-    if (char === '\n' || char === '\r') {
-      return input;
-    }
-    input += char;
-  }
-}
-
-function promptYesNo(question, defaultYes = true) {
-  const hint = defaultYes ? '[Y/n]' : '[y/N]';
-  while (true) {
-    process.stdout.write(`${question} ${hint} `);
-    const answer = readSingleLineFromStdin().trim().toLowerCase();
-
-    if (!answer) {
-      return defaultYes;
-    }
-    if (answer === 'y' || answer === 'yes') {
-      return true;
-    }
-    if (answer === 'n' || answer === 'no') {
-      return false;
-    }
-    process.stdout.write('Please answer with y or n.\n');
-  }
-}
-
-function envFlagEnabled(name) {
-  const raw = process.env[name];
-  if (raw == null) return false;
-  return ['1', 'true', 'yes', 'on'].includes(String(raw).trim().toLowerCase());
-}
-
-function parseAutoApproval(name) {
-  const raw = process.env[name];
-  if (raw == null) return null;
-  const normalized = String(raw).trim().toLowerCase();
-  if (['1', 'true', 'yes', 'y', 'on'].includes(normalized)) return true;
-  if (['0', 'false', 'no', 'n', 'off'].includes(normalized)) return false;
-  return null;
-}
-
-function parseBooleanLike(raw) {
-  if (raw == null) return null;
-  const normalized = String(raw).trim().toLowerCase();
-  if (!normalized) return null;
-  if (['1', 'true', 'yes', 'y', 'on'].includes(normalized)) return true;
-  if (['0', 'false', 'no', 'n', 'off'].includes(normalized)) return false;
-  return null;
-}
-
-function parseDotenvAssignmentValue(raw) {
-  let value = String(raw || '').trim();
-  if (!value) return '';
-  if ((value.startsWith('"') && value.endsWith('"')) || (value.startsWith('\'') && value.endsWith('\''))) {
-    return value.slice(1, -1).trim();
-  }
-  value = value.replace(/\s+#.*$/, '').trim();
-  return value;
-}
-
-function readRepoDotenvValue(repoRoot, name) {
-  const envPath = path.join(repoRoot, '.env');
-  if (!fs.existsSync(envPath)) return null;
-  const pattern = new RegExp(`^\\s*(?:export\\s+)?${name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}\\s*=\\s*(.*)$`);
-  const lines = fs.readFileSync(envPath, 'utf8').split(/\r?\n/);
-  for (const line of lines) {
-    const trimmed = line.trim();
-    if (!trimmed || trimmed.startsWith('#')) continue;
-    const match = line.match(pattern);
-    if (!match) continue;
-    return parseDotenvAssignmentValue(match[1]);
-  }
-  return null;
-}
-
-function resolveGuardexRepoToggle(repoRoot, env = process.env) {
-  const envRaw = env[GUARDEX_REPO_TOGGLE_ENV];
-  const envEnabled = parseBooleanLike(envRaw);
-  if (envEnabled !== null) {
-    return {
-      enabled: envEnabled,
-      source: 'process environment',
-      raw: String(envRaw).trim(),
-    };
-  }
-
-  const dotenvRaw = readRepoDotenvValue(repoRoot, GUARDEX_REPO_TOGGLE_ENV);
-  const dotenvEnabled = parseBooleanLike(dotenvRaw);
-  if (dotenvEnabled !== null) {
-    return {
-      enabled: dotenvEnabled,
-      source: 'repo .env',
-      raw: String(dotenvRaw).trim(),
-    };
-  }
-
-  return {
-    enabled: true,
-    source: 'default',
-    raw: '',
-  };
-}
-
-function describeGuardexRepoToggle(toggle) {
-  if (!toggle || toggle.source === 'default') {
-    return 'default enabled mode';
-  }
-  return `${toggle.source} (${GUARDEX_REPO_TOGGLE_ENV}=${toggle.raw})`;
-}
-
-function parseVersionString(version) {
-  const match = String(version || '').trim().match(/^v?(\d+)\.(\d+)\.(\d+)/);
-  if (!match) return null;
-  return [
-    Number.parseInt(match[1], 10),
-    Number.parseInt(match[2], 10),
-    Number.parseInt(match[3], 10),
-  ];
-}
-
-function compareParsedVersions(left, right) {
-  if (!left || !right) return 0;
-  for (let index = 0; index < Math.max(left.length, right.length); index += 1) {
-    const leftValue = left[index] || 0;
-    const rightValue = right[index] || 0;
-    if (leftValue > rightValue) return 1;
-    if (leftValue < rightValue) return -1;
-  }
-  return 0;
-}
-
-function isNewerVersion(latest, current) {
-  const latestParts = parseVersionString(latest);
-  const currentParts = parseVersionString(current);
-
-  if (!latestParts || !currentParts) {
-    return String(latest || '').trim() !== String(current || '').trim();
-  }
-
-  return compareParsedVersions(latestParts, currentParts) > 0;
-}
-
-function parseNpmVersionOutput(stdout) {
-  const trimmed = String(stdout || '').trim();
-  if (!trimmed) return '';
-
-  try {
-    const parsed = JSON.parse(trimmed);
-    if (Array.isArray(parsed)) {
-      return String(parsed[parsed.length - 1] || '').trim();
-    }
-    return String(parsed || '').trim();
-  } catch {
-    const firstLine = trimmed.split('\n').map((line) => line.trim()).find(Boolean);
-    return firstLine || '';
-  }
-}
-
-function checkForGuardexUpdate() {
-  if (envFlagEnabled('GUARDEX_SKIP_UPDATE_CHECK')) {
-    return { checked: false, reason: 'disabled' };
-  }
-
-  const forceCheck = envFlagEnabled('GUARDEX_FORCE_UPDATE_CHECK');
-  if (!forceCheck && !isInteractiveTerminal()) {
-    return { checked: false, reason: 'non-interactive' };
-  }
-
-  const result = run(NPM_BIN, ['view', packageJson.name, 'version', '--json'], { timeout: 5000 });
-  if (result.status !== 0) {
-    return { checked: false, reason: 'lookup-failed' };
-  }
-
-  const latest = parseNpmVersionOutput(result.stdout);
-  if (!latest) {
-    return { checked: false, reason: 'invalid-latest-version' };
-  }
-
-  return {
-    checked: true,
-    current: packageJson.version,
-    latest,
-    updateAvailable: isNewerVersion(latest, packageJson.version),
-  };
-}
-
-function printUpdateAvailableBanner(current, latest) {
-  const title = colorize('UPDATE AVAILABLE', '1;33');
-  console.log(`[${TOOL_NAME}] ${title}`);
-  console.log(`[${TOOL_NAME}] Current: ${current}`);
-  console.log(`[${TOOL_NAME}] Latest : ${latest}`);
-  console.log(`[${TOOL_NAME}] Command: ${NPM_BIN} i -g ${packageJson.name}@latest`);
-}
-
-function maybeSelfUpdateBeforeStatus() {
-  const check = checkForGuardexUpdate();
-  if (!check.checked || !check.updateAvailable) {
-    return;
-  }
-
-  printUpdateAvailableBanner(check.current, check.latest);
-
-  const autoApproval = parseAutoApproval('GUARDEX_AUTO_UPDATE_APPROVAL');
-  const interactive = isInteractiveTerminal();
-
-  if (!interactive && autoApproval == null) {
-    console.log(`[${TOOL_NAME}] Non-interactive shell; skipping auto-update prompt.`);
-    return;
-  }
-
-  const shouldUpdate = interactive
-    ? promptYesNoStrict(
-        `Update now? (${NPM_BIN} i -g ${packageJson.name}@latest)`,
-      )
-    : autoApproval;
-
-  if (!shouldUpdate) {
-    console.log(`[${TOOL_NAME}] Skipped update.`);
-    return;
-  }
-
-  const installResult = run(NPM_BIN, ['i', '-g', `${packageJson.name}@latest`], { stdio: 'inherit' });
-  if (installResult.status !== 0) {
-    console.log(`[${TOOL_NAME}] ⚠️ Update failed. You can retry manually.`);
-    return;
-  }
-
-  // Verify the install actually advanced the on-disk version. npm sometimes
-  // reports "changed 1 package" with status 0 while leaving the old files
-  // in place (version resolution cache / dedupe quirks). If the installed
-  // version doesn't match check.latest, retry with the pinned version so
-  // npm bypasses whatever heuristic made it skip the upgrade.
-  const postInstallVersion = readInstalledGuardexVersion();
-  if (postInstallVersion != null && postInstallVersion !== check.latest) {
-    console.log(
-      `[${TOOL_NAME}] Installed version is still ${postInstallVersion} (expected ${check.latest}). ` +
-        `Retrying with pinned version ${check.latest}…`,
-    );
-    const pinnedResult = run(
-      NPM_BIN,
-      ['i', '-g', `${packageJson.name}@${check.latest}`],
-      { stdio: 'inherit' },
-    );
-    if (pinnedResult.status !== 0) {
-      console.log(
-        `[${TOOL_NAME}] ⚠️ Pinned retry failed. Run manually: ${NPM_BIN} i -g ${packageJson.name}@${check.latest}`,
-      );
-      return;
-    }
-    const pinnedVersion = readInstalledGuardexVersion();
-    if (pinnedVersion != null && pinnedVersion !== check.latest) {
-      console.log(
-        `[${TOOL_NAME}] ⚠️ On-disk version still ${pinnedVersion} after pinned retry. ` +
-          `Investigate: ${NPM_BIN} root -g && ${NPM_BIN} cache verify`,
-      );
-      return;
-    }
-  }
-
-  console.log(`[${TOOL_NAME}] ✅ Updated to latest published version.`);
-  restartIntoUpdatedGuardex(check.latest);
-}
-
-function readInstalledGuardexVersion() {
-  const installInfo = readInstalledGuardexInstallInfo();
-  return installInfo ? installInfo.version : null;
-}
-
-function readInstalledGuardexInstallInfo() {
-  // Resolves the globally-installed package's on-disk version so we can
-  // verify npm actually wrote new bytes. Uses `npm root -g` to locate the
-  // global install root so we don't accidentally read the running source
-  // tree (which is the file the CLI was spawned from — that IS the global
-  // copy in the normal case, but a bump should be visible via a fresh read
-  // either way). Returns null if we can't determine it.
-  try {
-    const rootResult = run(NPM_BIN, ['root', '-g'], { timeout: 5000 });
-    if (rootResult.status !== 0) {
-      return null;
-    }
-    const globalRoot = String(rootResult.stdout || '').trim();
-    if (!globalRoot) {
-      return null;
-    }
-    const installedPkgPath = path.join(globalRoot, packageJson.name, 'package.json');
-    if (!fs.existsSync(installedPkgPath)) {
-      return null;
-    }
-    const parsed = JSON.parse(fs.readFileSync(installedPkgPath, 'utf8'));
-    if (parsed && typeof parsed.version === 'string') {
-      let binRelative = null;
-      if (typeof parsed.bin === 'string') {
-        binRelative = parsed.bin;
-      } else if (parsed.bin && typeof parsed.bin === 'object') {
-        const invokedName = path.basename(process.argv[1] || '');
-        binRelative =
-          parsed.bin[invokedName] ||
-          parsed.bin[SHORT_TOOL_NAME] ||
-          Object.values(parsed.bin).find((value) => typeof value === 'string') ||
-          null;
-      }
-      const packageRoot = path.dirname(installedPkgPath);
-      const binPath = binRelative ? path.join(packageRoot, binRelative) : null;
-      return {
-        version: parsed.version,
-        packageRoot,
-        binPath,
-      };
-    }
-  } catch (error) {
-    return null;
-  }
-  return null;
-}
-
-function restartIntoUpdatedGuardex(expectedVersion) {
-  const installInfo = readInstalledGuardexInstallInfo();
-  if (!installInfo || installInfo.version !== expectedVersion || installInfo.version === packageJson.version) {
-    return;
-  }
-  if (!installInfo.binPath || !fs.existsSync(installInfo.binPath)) {
-    console.log(`[${TOOL_NAME}] Restart required to use ${installInfo.version}. Rerun ${SHORT_TOOL_NAME}.`);
-    return;
-  }
-
-  console.log(`[${TOOL_NAME}] Restarting into ${installInfo.version}…`);
-  const restartResult = cp.spawnSync(
-    process.execPath,
-    [installInfo.binPath, ...process.argv.slice(2)],
-    {
-      cwd: process.cwd(),
-      env: {
-        ...process.env,
-        GUARDEX_SKIP_UPDATE_CHECK: '1',
-      },
-      stdio: 'inherit',
-    },
-  );
-  if (restartResult.error) {
-    console.log(
-      `[${TOOL_NAME}] Restart into ${installInfo.version} failed. Rerun ${SHORT_TOOL_NAME}.`,
-    );
-    return;
-  }
-  process.exit(restartResult.status == null ? 0 : restartResult.status);
-}
-
-function checkForOpenSpecPackageUpdate() {
-  if (envFlagEnabled('GUARDEX_SKIP_OPENSPEC_UPDATE_CHECK')) {
-    return { checked: false, reason: 'disabled' };
-  }
-
-  const forceCheck = envFlagEnabled('GUARDEX_FORCE_OPENSPEC_UPDATE_CHECK');
-  if (!forceCheck && !isInteractiveTerminal()) {
-    return { checked: false, reason: 'non-interactive' };
-  }
-
-  const detection = detectGlobalToolchainPackages();
-  if (!detection.ok) {
-    return { checked: false, reason: 'package-detect-failed' };
-  }
-
-  const current = String((detection.installedVersions || {})[OPENSPEC_PACKAGE] || '').trim();
-  if (!current) {
-    return { checked: false, reason: 'not-installed' };
-  }
-
-  const latestResult = run(NPM_BIN, ['view', OPENSPEC_PACKAGE, 'version', '--json'], { timeout: 5000 });
-  if (latestResult.status !== 0) {
-    return { checked: false, reason: 'lookup-failed' };
-  }
-
-  const latest = parseNpmVersionOutput(latestResult.stdout);
-  if (!latest) {
-    return { checked: false, reason: 'invalid-latest-version' };
-  }
-
-  return {
-    checked: true,
-    current,
-    latest,
-    updateAvailable: isNewerVersion(latest, current),
-  };
-}
-
-function printOpenSpecUpdateAvailableBanner(current, latest) {
-  const title = colorize('OPENSPEC UPDATE AVAILABLE', '1;33');
-  console.log(`[${TOOL_NAME}] ${title}`);
-  console.log(`[${TOOL_NAME}] Current: ${current}`);
-  console.log(`[${TOOL_NAME}] Latest : ${latest}`);
-  console.log(`[${TOOL_NAME}] Command: ${NPM_BIN} i -g ${OPENSPEC_PACKAGE}@latest`);
-  console.log(`[${TOOL_NAME}] Then : ${OPENSPEC_BIN} update`);
-}
-
-function maybeOpenSpecUpdateBeforeStatus() {
-  const check = checkForOpenSpecPackageUpdate();
-  if (!check.checked || !check.updateAvailable) {
-    return;
-  }
-
-  printOpenSpecUpdateAvailableBanner(check.current, check.latest);
-
-  const autoApproval = parseAutoApproval('GUARDEX_AUTO_OPENSPEC_UPDATE_APPROVAL');
-  const interactive = isInteractiveTerminal();
-
-  if (!interactive && autoApproval == null) {
-    console.log(`[${TOOL_NAME}] Non-interactive shell; skipping OpenSpec update prompt.`);
-    return;
-  }
-
-  const shouldUpdate = interactive
-    ? promptYesNoStrict(
-        `Update OpenSpec now? (${NPM_BIN} i -g ${OPENSPEC_PACKAGE}@latest && ${OPENSPEC_BIN} update)`,
-      )
-    : autoApproval;
-
-  if (!shouldUpdate) {
-    console.log(`[${TOOL_NAME}] Skipped OpenSpec update.`);
-    return;
-  }
-
-  const installResult = run(NPM_BIN, ['i', '-g', `${OPENSPEC_PACKAGE}@latest`], { stdio: 'inherit' });
-  if (installResult.status !== 0) {
-    console.log(`[${TOOL_NAME}] ⚠️ OpenSpec npm install failed. You can retry manually.`);
-    return;
-  }
-
-  const toolUpdateResult = run(OPENSPEC_BIN, ['update'], { stdio: 'inherit' });
-  if (toolUpdateResult.status !== 0) {
-    console.log(`[${TOOL_NAME}] ⚠️ OpenSpec tool update failed. Run '${OPENSPEC_BIN} update' manually.`);
-    return;
-  }
-
-  console.log(`[${TOOL_NAME}] ✅ OpenSpec updated to latest package and tool plugins refreshed.`);
-}
-
-function promptYesNoStrict(question) {
-  while (true) {
-    process.stdout.write(`${question} [y/n] `);
-    const answer = readSingleLineFromStdin().trim().toLowerCase();
-
-    if (answer === 'y' || answer === 'yes') {
-      process.stdout.write('\n');
-      return true;
-    }
-    if (answer === 'n' || answer === 'no') {
-      process.stdout.write('\n');
-      return false;
-    }
-
-    process.stdout.write('Please answer with y or n.\n');
-  }
-}
-
-function resolveGlobalInstallApproval(options) {
-  if (options.yesGlobalInstall && options.noGlobalInstall) {
-    throw new Error('Cannot use both --yes-global-install and --no-global-install');
-  }
-
-  if (options.yesGlobalInstall) {
-    return { approved: true, source: 'flag' };
-  }
-
-  if (options.noGlobalInstall) {
-    return { approved: false, source: 'flag' };
-  }
-
-  if (!isInteractiveTerminal()) {
-    return { approved: false, source: 'non-interactive-default' };
-  }
-  return { approved: true, source: 'prompt' };
-}
-
-function getGlobalToolchainService(packageName) {
-  const service = GLOBAL_TOOLCHAIN_SERVICES.find(
-    (candidate) => candidate.packageName === packageName,
-  );
-  return service || { name: packageName, packageName };
-}
-
-function formatGlobalToolchainServiceName(packageName) {
-  return getGlobalToolchainService(packageName).name;
-}
-
-function describeMissingGlobalDependencyWarnings(packageNames) {
-  return packageNames
-    .map((packageName) => getGlobalToolchainService(packageName))
-    .filter((service) => service.dependencyUrl)
-    .map(
-      (service) =>
-        `Guardex needs ${service.name} as a dependency: ${service.dependencyUrl}`,
-    );
-}
-
-function buildMissingCompanionInstallPrompt(missingPackages, missingLocalTools) {
-  const dependencyWarnings = describeMissingGlobalDependencyWarnings(missingPackages);
-  const installCommands = describeCompanionInstallCommands(missingPackages, missingLocalTools);
-  const dependencyPrefix = dependencyWarnings.length > 0
-    ? `${dependencyWarnings.join(' ')} `
-    : '';
-  return `${dependencyPrefix}Install missing companion tools now? (${installCommands.join(' && ')})`;
-}
-
-function detectGlobalToolchainPackages() {
-  const result = run(NPM_BIN, ['list', '-g', '--depth=0', '--json']);
-  if (result.status !== 0) {
-    const stderr = (result.stderr || '').trim();
-    return {
-      ok: false,
-      error: stderr || 'Unable to detect globally installed npm packages',
-    };
-  }
-
-  let parsed;
-  try {
-    parsed = JSON.parse(result.stdout || '{}');
-  } catch (error) {
-    return {
-      ok: false,
-      error: `Failed to parse npm list output: ${error.message}`,
-    };
-  }
-
-  const dependencyMap = parsed && parsed.dependencies && typeof parsed.dependencies === 'object'
-    ? parsed.dependencies
-    : {};
-  const installedSet = new Set(Object.keys(dependencyMap));
-
-  const installed = [];
-  const missing = [];
-  const installedVersions = {};
-  for (const pkg of GLOBAL_TOOLCHAIN_PACKAGES) {
-    if (installedSet.has(pkg)) {
-      installed.push(pkg);
-      const rawVersion = dependencyMap[pkg] && dependencyMap[pkg].version;
-      const version = String(rawVersion || '').trim();
-      if (version) {
-        installedVersions[pkg] = version;
-      }
-    } else {
-      missing.push(pkg);
-    }
-  }
-
-  return { ok: true, installed, missing, installedVersions };
-}
-
-function detectRequiredSystemTools() {
-  const services = [];
-  for (const tool of REQUIRED_SYSTEM_TOOLS) {
-    const result = run(tool.command, ['--version']);
-    const active = result.status === 0;
-    const rawReason = result.error && result.error.code
-      ? result.error.code
-      : (result.stderr || '').trim();
-    const reason = rawReason.split('\n')[0] || '';
-    services.push({
-      name: tool.name,
-      displayName: tool.displayName || tool.name,
-      command: tool.command,
-      installHint: tool.installHint,
-      status: active ? 'active' : 'inactive',
-      reason,
-    });
-  }
-  return services;
-}
-
-function detectOptionalLocalCompanionTools() {
-  return OPTIONAL_LOCAL_COMPANION_TOOLS.map((tool) => {
-    const detectedPath = tool.candidatePaths
-      .map((relativePath) => path.join(GUARDEX_HOME_DIR, relativePath))
-      .find((candidatePath) => fs.existsSync(candidatePath));
-    return {
-      name: tool.name,
-      displayName: tool.displayName || tool.name,
-      installCommand: tool.installCommand,
-      installArgs: [...tool.installArgs],
-      status: detectedPath ? 'active' : 'inactive',
-      detectedPath: detectedPath || null,
-    };
-  });
-}
-
-function describeCompanionInstallCommands(missingPackages, missingLocalTools) {
-  const commands = [];
-  if (missingPackages.length > 0) {
-    commands.push(`${NPM_BIN} i -g ${missingPackages.join(' ')}`);
-  }
-  for (const tool of missingLocalTools) {
-    commands.push(tool.installCommand);
-  }
-  return commands;
-}
-
-function askGlobalInstallForMissing(options, missingPackages, missingLocalTools) {
-  const approval = resolveGlobalInstallApproval(options);
-  if (!approval.approved) {
-    return approval;
-  }
-
-  if (approval.source === 'prompt') {
-    const approved = promptYesNoStrict(
-      buildMissingCompanionInstallPrompt(missingPackages, missingLocalTools),
-    );
-    return { approved, source: 'prompt' };
-  }
-
-  return approval;
-}
-
-function installGlobalToolchain(options) {
-  const approval = resolveGlobalInstallApproval(options);
-  if (approval.source === 'flag' && !approval.approved) {
-    return {
-      status: 'skipped',
-      reason: approval.source,
-      missingPackages: [],
-      missingLocalTools: [],
-    };
-  }
-
-  if (options.dryRun) {
-    return { status: 'dry-run-skip' };
-  }
-
-  const detection = detectGlobalToolchainPackages();
-  const localCompanionTools = detectOptionalLocalCompanionTools();
-  if (!detection.ok) {
-    console.log(`[${TOOL_NAME}] ⚠️ Could not detect global packages: ${detection.error}`);
-  } else {
-    if (detection.installed.length > 0) {
-      console.log(
-        `[${TOOL_NAME}] Already installed globally: ` +
-          `${detection.installed.map((pkg) => formatGlobalToolchainServiceName(pkg)).join(', ')}`,
-      );
-    }
-    const installedLocalTools = localCompanionTools
-      .filter((tool) => tool.status === 'active')
-      .map((tool) => tool.name);
-    if (installedLocalTools.length > 0) {
-      console.log(`[${TOOL_NAME}] Already installed locally: ${installedLocalTools.join(', ')}`);
-    }
-    if (detection.missing.length === 0 && localCompanionTools.every((tool) => tool.status === 'active')) {
-      return { status: 'already-installed' };
-    }
-  }
-
-  const missingPackages = detection.ok ? detection.missing : [...GLOBAL_TOOLCHAIN_PACKAGES];
-  const missingLocalTools = localCompanionTools.filter((tool) => tool.status !== 'active');
-  const installApproval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools);
-  if (!installApproval.approved) {
-    return {
-      status: 'skipped',
-      reason: installApproval.source,
-      missingPackages,
-      missingLocalTools,
-    };
-  }
-
-  const installed = [];
-  if (missingPackages.length > 0) {
-    console.log(
-      `[${TOOL_NAME}] Installing global toolchain: npm i -g ${missingPackages.join(' ')}`,
-    );
-    const result = run(NPM_BIN, ['i', '-g', ...missingPackages], { stdio: 'inherit' });
-    if (result.status !== 0) {
-      const stderr = (result.stderr || '').trim();
-      return {
-        status: 'failed',
-        reason: stderr || 'npm global install failed',
-      };
-    }
-    installed.push(...missingPackages);
-  }
-
-  for (const tool of missingLocalTools) {
-    console.log(`[${TOOL_NAME}] Installing local companion tool: ${tool.installCommand}`);
-    const result = run(NPX_BIN, tool.installArgs, { stdio: 'inherit' });
-    if (result.status !== 0) {
-      const stderr = (result.stderr || '').trim();
-      return {
-        status: 'failed',
-        reason: stderr || `${tool.name} install failed`,
-      };
-    }
-    installed.push(tool.name);
-  }
-
-  return { status: 'installed', packages: installed };
-}
-
-function gitRefExists(repoRoot, refName) {
-  return gitRun(repoRoot, ['show-ref', '--verify', '--quiet', refName], { allowFailure: true }).status === 0;
-}
-
-function findStaleLockPaths(repoRoot, locks) {
-  const stale = [];
-
-  for (const [filePath, rawEntry] of Object.entries(locks)) {
-    const entry = rawEntry && typeof rawEntry === 'object' ? rawEntry : {};
-    const ownerBranch = String(entry.branch || '');
-
-    const hasOwner = ownerBranch.length > 0;
-    const localRef = hasOwner ? `refs/heads/${ownerBranch}` : null;
-    const remoteRef = hasOwner ? `refs/remotes/origin/${ownerBranch}` : null;
-    const branchExists = hasOwner
-      ? gitRefExists(repoRoot, localRef) || gitRefExists(repoRoot, remoteRef)
-      : false;
-
-    const pathExists = fs.existsSync(path.join(repoRoot, filePath));
-
-    if (!hasOwner || !branchExists || !pathExists) {
-      stale.push(filePath);
-    }
-  }
-
-  return stale;
-}
-
-function runInstallInternal(options) {
-  const repoRoot = resolveRepoRoot(options.target);
-  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
-  if (!guardexToggle.enabled) {
-    return {
-      repoRoot,
-      operations: [
-        {
-          status: 'skipped',
-          file: '.env',
-          note: `Guardex disabled by ${describeGuardexRepoToggle(guardexToggle)}`,
-        },
-      ],
-      hookResult: { status: 'skipped', key: 'core.hooksPath', value: '(unchanged)' },
-      guardexEnabled: false,
-      guardexToggle,
-    };
-  }
-  const operations = [];
-
-  if (!options.skipGitignore) {
-    operations.push(ensureManagedGitignore(repoRoot, Boolean(options.dryRun)));
-  }
-
-  operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun)));
-
-  for (const templateFile of TEMPLATE_FILES) {
-    operations.push(
-      copyTemplateFile(
-        repoRoot,
-        templateFile,
-        shouldForceManagedPath(options, toDestinationPath(templateFile)),
-        Boolean(options.dryRun),
-      ),
-    );
-  }
-  operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options));
-  for (const hookName of HOOK_NAMES) {
-    const hookRelativePath = path.posix.join('.githooks', hookName);
-    operations.push(
-      ensureHookShim(repoRoot, hookName, {
-        dryRun: options.dryRun,
-        force: shouldForceManagedPath(options, hookRelativePath),
-      }),
-    );
-  }
-
-  operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
-
-  if (!options.skipAgents) {
-    operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
-  }
-
-  const hookResult = configureHooks(repoRoot, Boolean(options.dryRun));
-
-  return { repoRoot, operations, hookResult, guardexEnabled: true, guardexToggle };
-}
-
-function runFixInternal(options) {
-  const repoRoot = resolveRepoRoot(options.target);
-  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
-  if (!guardexToggle.enabled) {
-    return {
-      repoRoot,
-      operations: [
-        {
-          status: 'skipped',
-          file: '.env',
-          note: `Guardex disabled by ${describeGuardexRepoToggle(guardexToggle)}`,
-        },
-      ],
-      hookResult: { status: 'skipped', key: 'core.hooksPath', value: '(unchanged)' },
-      guardexEnabled: false,
-      guardexToggle,
-    };
-  }
-  const operations = [];
-
-  if (!options.skipGitignore) {
-    operations.push(ensureManagedGitignore(repoRoot, Boolean(options.dryRun)));
-  }
-
-  operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun)));
-
-  for (const templateFile of TEMPLATE_FILES) {
-    if (shouldForceManagedPath(options, toDestinationPath(templateFile))) {
-      operations.push(copyTemplateFile(repoRoot, templateFile, true, Boolean(options.dryRun)));
-      continue;
-    }
-    operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun)));
-  }
-  operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options));
-  for (const hookName of HOOK_NAMES) {
-    const hookRelativePath = path.posix.join('.githooks', hookName);
-    operations.push(
-      ensureHookShim(repoRoot, hookName, {
-        dryRun: options.dryRun,
-        force: shouldForceManagedPath(options, hookRelativePath),
-      }),
-    );
-  }
-
-  operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
-
-  const lockState = lockStateOrError(repoRoot);
-  if (!lockState.ok) {
-    if (!options.dryRun) {
-      writeLockState(repoRoot, { locks: {} }, false);
-    }
-    operations.push({
-      status: options.dryRun ? 'would-reset' : 'reset',
-      file: LOCK_FILE_RELATIVE,
-      note: 'invalid lock state reset to empty',
-    });
-  } else {
-    const staleLockPaths = options.dropStaleLocks ? findStaleLockPaths(repoRoot, lockState.locks) : [];
-    if (staleLockPaths.length > 0) {
-      const updated = { ...lockState.raw, locks: { ...lockState.locks } };
-      for (const filePath of staleLockPaths) {
-        delete updated.locks[filePath];
-      }
-      writeLockState(repoRoot, updated, Boolean(options.dryRun));
-      operations.push({
-        status: options.dryRun ? 'would-prune' : 'pruned',
-        file: LOCK_FILE_RELATIVE,
-        note: `removed ${staleLockPaths.length} stale lock(s)`,
-      });
-    }
-  }
-
-  if (!options.skipAgents) {
-    operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
-  }
-
-  const hookResult = configureHooks(repoRoot, Boolean(options.dryRun));
-
-  return { repoRoot, operations, hookResult, guardexEnabled: true, guardexToggle };
-}
-
-function runScanInternal(options) {
-  const repoRoot = resolveRepoRoot(options.target);
-  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
-  const branch = readBranchDisplayName(repoRoot);
-  if (!guardexToggle.enabled) {
-    return {
-      repoRoot,
-      branch,
-      findings: [],
-      errors: 0,
-      warnings: 0,
-      guardexEnabled: false,
-      guardexToggle,
-    };
-  }
-  const findings = [];
-
-  const requiredPaths = [
-    ...OMX_SCAFFOLD_DIRECTORIES,
-    ...Array.from(OMX_SCAFFOLD_FILES.keys()),
-    ...REQUIRED_MANAGED_REPO_FILES,
-  ];
-
-  for (const relativePath of requiredPaths) {
-    const absolutePath = path.join(repoRoot, relativePath);
-    if (!fs.existsSync(absolutePath)) {
-      findings.push({
-        level: 'error',
-        code: 'missing-managed-file',
-        path: relativePath,
-        message: `Missing managed repo file: ${relativePath}`,
-      });
-    }
-  }
-
-  const hooksPathResult = gitRun(repoRoot, ['config', '--get', 'core.hooksPath'], { allowFailure: true });
-  const hooksPath = hooksPathResult.status === 0 ? hooksPathResult.stdout.trim() : '';
-  if (hooksPath !== '.githooks') {
-    findings.push({
-      level: 'warn',
-      code: 'hooks-path-mismatch',
-      message: `git core.hooksPath is '${hooksPath || '(unset)'}' (expected '.githooks')`,
-    });
-  }
-
-  const lockState = lockStateOrError(repoRoot);
-  if (!lockState.ok) {
-    findings.push({
-      level: 'error',
-      code: 'lock-state-invalid',
-      message: lockState.error,
-    });
-  } else {
-    for (const [filePath, rawEntry] of Object.entries(lockState.locks)) {
-      const entry = rawEntry && typeof rawEntry === 'object' ? rawEntry : {};
-      const ownerBranch = String(entry.branch || '');
-      const allowDelete = Boolean(entry.allow_delete);
-
-      if (!ownerBranch) {
-        findings.push({
-          level: 'warn',
-          code: 'lock-missing-owner',
-          path: filePath,
-          message: `Lock entry has no owner branch: ${filePath}`,
-        });
-      }
-
-      const absolutePath = path.join(repoRoot, filePath);
-      if (!fs.existsSync(absolutePath)) {
-        findings.push({
-          level: 'warn',
-          code: 'lock-target-missing',
-          path: filePath,
-          message: `Locked path is missing from disk: ${filePath}`,
-        });
-      }
-
-      if (ownerBranch) {
-        const localRef = `refs/heads/${ownerBranch}`;
-        const remoteRef = `refs/remotes/origin/${ownerBranch}`;
-        if (!gitRefExists(repoRoot, localRef) && !gitRefExists(repoRoot, remoteRef)) {
-          findings.push({
-            level: 'warn',
-            code: 'stale-branch-lock',
-            path: filePath,
-            message: `Lock owner branch not found locally/remotely: ${ownerBranch} (${filePath})`,
-          });
-        }
-      }
-
-      if (allowDelete && CRITICAL_GUARDRAIL_PATHS.has(filePath)) {
-        findings.push({
-          level: 'error',
-          code: 'guardrail-delete-approved',
-          path: filePath,
-          message: `Critical guardrail file is delete-approved: ${filePath}`,
-        });
-      }
-    }
-  }
-
-  const errors = findings.filter((item) => item.level === 'error');
-  const warnings = findings.filter((item) => item.level === 'warn');
-
-  return {
-    repoRoot,
-    branch,
-    findings,
-    errors: errors.length,
-    warnings: warnings.length,
-    guardexEnabled: true,
-    guardexToggle,
-  };
-}
-
-function printOperations(title, payload, dryRun = false) {
-  console.log(`[${TOOL_NAME}] ${title}: ${payload.repoRoot}`);
-  for (const operation of payload.operations) {
-    const note = operation.note ? ` (${operation.note})` : '';
-    console.log(` - ${operation.status.padEnd(12)} ${operation.file}${note}`);
-  }
-  console.log(
-    ` - hooksPath ${payload.hookResult.status} ${payload.hookResult.key}=${payload.hookResult.value}`,
-  );
-
-  if (dryRun) {
-    console.log(`[${TOOL_NAME}] Dry run complete. No files were modified.`);
-  }
-}
-
-function printScanResult(scan, json = false) {
-  if (json) {
-    process.stdout.write(
-      JSON.stringify(
-        {
-          repoRoot: scan.repoRoot,
-          branch: scan.branch,
-          guardexEnabled: scan.guardexEnabled !== false,
-          guardexToggle: scan.guardexToggle || null,
-          errors: scan.errors,
-          warnings: scan.warnings,
-          findings: scan.findings,
-        },
-        null,
-        2,
-      ) + '\n',
-    );
-    return;
-  }
-
-  console.log(`[${TOOL_NAME}] Scan target: ${scan.repoRoot}`);
-  console.log(`[${TOOL_NAME}] Branch: ${scan.branch}`);
-
-  if (scan.guardexEnabled === false) {
-    console.log(
-      colorizeDoctorOutput(
-        `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(scan.guardexToggle)}).`,
-        'disabled',
-      ),
-    );
-    return;
-  }
-
-  if (scan.findings.length === 0) {
-    console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ No safety issues detected.`, 'safe'));
-    return;
-  }
-
-  for (const item of scan.findings) {
-    const target = item.path ? ` (${item.path})` : '';
-    console.log(
-      colorizeDoctorOutput(
-        `[${item.level.toUpperCase()}] ${item.code}${target}: ${item.message}`,
-        item.level,
-      ),
-    );
-  }
-  console.log(
-    colorizeDoctorOutput(
-      `[${TOOL_NAME}] Summary: ${scan.errors} error(s), ${scan.warnings} warning(s).`,
-      scan.errors > 0 ? 'error' : 'warn',
-    ),
-  );
-}
-
-function setExitCodeFromScan(scan) {
-  if (scan.guardexEnabled === false) {
-    process.exitCode = 0;
-    return;
-  }
-  if (scan.errors > 0) {
-    process.exitCode = 2;
-    return;
-  }
-  if (scan.warnings > 0) {
-    process.exitCode = 1;
-    return;
-  }
-  process.exitCode = 0;
-}
-
-function status(rawArgs) {
-  const options = parseCommonArgs(rawArgs, {
-    target: process.cwd(),
-    json: false,
-  });
-
-  const toolchain = detectGlobalToolchainPackages();
-  const npmServices = GLOBAL_TOOLCHAIN_PACKAGES.map((pkg) => {
-    const service = getGlobalToolchainService(pkg);
-    if (!toolchain.ok) {
-      return {
-        name: service.name,
-        displayName: service.name,
-        packageName: pkg,
-        dependencyUrl: service.dependencyUrl || null,
-        status: 'unknown',
-      };
-    }
-    return {
-      name: service.name,
-      displayName: service.name,
-      packageName: pkg,
-      dependencyUrl: service.dependencyUrl || null,
-      status: toolchain.installed.includes(pkg) ? 'active' : 'inactive',
-    };
-  });
-  const localCompanionServices = detectOptionalLocalCompanionTools().map((tool) => ({
-    name: tool.name,
-    displayName: tool.displayName || tool.name,
-    status: tool.status,
-  }));
-  const requiredSystemTools = detectRequiredSystemTools();
-  const services = [
-    ...npmServices,
-    ...localCompanionServices,
-    ...requiredSystemTools.map((tool) => ({
-      name: tool.name,
-      displayName: tool.displayName || tool.name,
-      status: tool.status,
-    })),
-  ];
-
-  const targetPath = path.resolve(options.target);
-  const inGitRepo = isGitRepo(targetPath);
-  const scanResult = inGitRepo ? runScanInternal({ target: targetPath, json: false }) : null;
-  const repoServiceStatus = scanResult
-    ? (scanResult.guardexEnabled === false
-      ? 'disabled'
-      : (scanResult.errors === 0 && scanResult.warnings === 0 ?
'active' : 'degraded')) - : 'inactive'; - - const payload = { - cli: { - name: packageJson.name, - version: packageJson.version, - runtime: runtimeVersion(), - }, - services, - repo: { - target: targetPath, - inGitRepo, - serviceStatus: repoServiceStatus, - guardexEnabled: scanResult ? scanResult.guardexEnabled !== false : null, - guardexToggle: scanResult ? scanResult.guardexToggle || null : null, - scan: scanResult - ? { - repoRoot: scanResult.repoRoot, - branch: scanResult.branch, - errors: scanResult.errors, - warnings: scanResult.warnings, - findings: scanResult.findings.length, - } - : null, - }, - detectionError: toolchain.ok ? null : toolchain.error, - }; - - if (options.json) { - process.stdout.write(`${JSON.stringify(payload, null, 2)}\n`); - process.exitCode = 0; - return; - } - - console.log(`[${TOOL_NAME}] CLI: ${payload.cli.runtime}`); - if (!toolchain.ok) { - console.log(`[${TOOL_NAME}] ⚠️ Could not detect global services: ${toolchain.error}`); - } - - console.log(`[${TOOL_NAME}] Global services:`); - for (const service of services) { - const serviceLabel = service.displayName || service.name; - console.log(` - ${statusDot(service.status)} ${serviceLabel}: ${service.status}`); - } - const inactiveOptionalCompanions = [...npmServices, ...localCompanionServices] - .filter((service) => service.status !== 'active') - .map((service) => service.displayName || service.name); - if (inactiveOptionalCompanions.length > 0) { - console.log( - `[${TOOL_NAME}] Optional companion tools inactive: ${inactiveOptionalCompanions.join(', ')}`, - ); - for (const warning of describeMissingGlobalDependencyWarnings( - npmServices - .filter((service) => service.status === 'inactive') - .map((service) => service.packageName), - )) { - console.log(`[${TOOL_NAME}] ${warning}`); - } - console.log( - `[${TOOL_NAME}] Run '${SHORT_TOOL_NAME} setup' to install missing companions with an explicit Y/N prompt.`, - ); - } - const missingSystemTools = requiredSystemTools.filter((tool) => 
tool.status !== 'active'); - if (missingSystemTools.length > 0) { - const tools = missingSystemTools - .map((tool) => tool.displayName || tool.name) - .join(', '); - console.log(`[${TOOL_NAME}] ⚠️ Missing required system tool(s): ${tools}`); - for (const tool of missingSystemTools) { - const reasonText = tool.reason ? ` (${tool.reason})` : ''; - console.log(` - install ${tool.name}: ${tool.installHint}${reasonText}`); - } - } - - if (!scanResult) { - console.log( - `[${TOOL_NAME}] Repo safety service: ${statusDot('inactive')} inactive (no git repository at target).`, - ); - process.exitCode = 0; - return; - } - - if (scanResult.guardexEnabled === false) { - console.log( - `[${TOOL_NAME}] Repo safety service: ${statusDot('disabled')} disabled (${describeGuardexRepoToggle(scanResult.guardexToggle)}).`, - ); - console.log(`[${TOOL_NAME}] Repo: ${scanResult.repoRoot}`); - console.log(`[${TOOL_NAME}] Branch: ${scanResult.branch}`); - printToolLogsSummary(); - process.exitCode = 0; - return; - } - - if (scanResult.errors === 0 && scanResult.warnings === 0) { - console.log(`[${TOOL_NAME}] Repo safety service: ${statusDot('active')} active.`); - } else if (scanResult.errors === 0) { - console.log( - `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.warnings} warning(s)).`, - ); - console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' to review warning details.`); - } else if (scanResult.warnings === 0) { - console.log( - `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.errors} error(s)).`, - ); - console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' for detailed findings.`); - } else { - console.log( - `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`, - ); - console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' for detailed findings.`); - } - console.log(`[${TOOL_NAME}] Repo: ${scanResult.repoRoot}`); - 
console.log(`[${TOOL_NAME}] Branch: ${scanResult.branch}`); - printToolLogsSummary(); - - process.exitCode = 0; -} - -function install(rawArgs) { - const options = parseCommonArgs(rawArgs, { - target: process.cwd(), - force: false, - skipAgents: false, - skipPackageJson: false, - skipGitignore: false, - dryRun: false, - allowProtectedBaseWrite: false, - }); - - assertProtectedMainWriteAllowed(options, 'install'); - const payload = runInstallInternal(options); - printOperations('Install target', payload, options.dryRun); - - if (!options.dryRun) { - if (payload.guardexEnabled === false) { - console.log( - `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(payload.guardexToggle)}). Skipping repo bootstrap.`, - ); - process.exitCode = 0; - return; - } - if (!options.skipAgents) { - console.log(`[${TOOL_NAME}] AGENTS.md managed policy block is configured by install.`); - } - console.log(`[${TOOL_NAME}] Installed. Next step: ${TOOL_NAME} setup`); - } - - process.exitCode = 0; -} - -function fix(rawArgs) { - const options = parseCommonArgs(rawArgs, { - target: process.cwd(), - dropStaleLocks: true, - skipAgents: false, - skipPackageJson: false, - skipGitignore: false, - dryRun: false, - allowProtectedBaseWrite: false, - }); - - assertProtectedMainWriteAllowed(options, 'fix'); - const payload = runFixInternal(options); - printOperations('Fix target', payload, options.dryRun); - - if (!options.dryRun) { - if (payload.guardexEnabled === false) { - console.log( - `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(payload.guardexToggle)}). Skipping repo repair.`, - ); - process.exitCode = 0; - return; - } - console.log(`[${TOOL_NAME}] Repair complete. 
Next step: ${TOOL_NAME} scan`); - } - - process.exitCode = 0; -} - -function scan(rawArgs) { - const options = parseCommonArgs(rawArgs, { - target: process.cwd(), - json: false, - }); - - const result = runScanInternal(options); - printScanResult(result, options.json); - setExitCodeFromScan(result); -} - -function doctor(rawArgs) { - const options = parseDoctorArgs(rawArgs); - const topRepoRoot = resolveRepoRoot(options.target); - const discoveredRepos = options.recursive - ? discoverNestedGitRepos(topRepoRoot, { - maxDepth: options.nestedMaxDepth, - extraSkip: options.nestedSkipDirs, - includeSubmodules: options.includeSubmodules, - }) - : [topRepoRoot]; - - if (discoveredRepos.length > 1) { - if (!options.json) { - console.log( - `[${TOOL_NAME}] Detected ${discoveredRepos.length} git repos under ${topRepoRoot}. ` + - `Repairing each with doctor (use --single-repo to limit to the target).`, - ); - } - - const repoResults = []; - let aggregateExitCode = 0; - for (let repoIndex = 0; repoIndex < discoveredRepos.length; repoIndex += 1) { - const repoPath = discoveredRepos[repoIndex]; - const progressLabel = `${repoIndex + 1}/${discoveredRepos.length}`; - if (!options.json) { - console.log(`[${TOOL_NAME}] ── Doctor target: ${repoPath} [${progressLabel}] ──`); - } - - const childArgs = [ - path.resolve(__filename), - 'doctor', - '--single-repo', - '--target', - repoPath, - ...(options.force ? ['--force', ...(options.forceManagedPaths || [])] : []), - ...(options.dropStaleLocks ? [] : ['--keep-stale-locks']), - ...(options.skipAgents ? ['--skip-agents'] : []), - ...(options.skipPackageJson ? ['--skip-package-json'] : []), - ...(options.skipGitignore ? ['--no-gitignore'] : []), - ...(options.dryRun ? ['--dry-run'] : []), - // Recursive child doctor runs should report pending PR state immediately instead of blocking the parent loop. - '--no-wait-for-merge', - ...(options.verboseAutoFinish ? ['--verbose-auto-finish'] : []), - ...(options.json ? 
['--json'] : []), - ...(options.allowProtectedBaseWrite ? ['--allow-protected-base-write'] : []), - ]; - const startedAt = Date.now(); - const nestedResult = options.json - ? run(process.execPath, childArgs, { cwd: topRepoRoot }) - : cp.spawnSync(process.execPath, childArgs, { - cwd: topRepoRoot, - encoding: 'utf8', - stdio: 'inherit', - }); - if (isSpawnFailure(nestedResult)) { - throw nestedResult.error; - } - - const exitCode = typeof nestedResult.status === 'number' ? nestedResult.status : 1; - if (exitCode !== 0 && aggregateExitCode === 0) { - aggregateExitCode = exitCode; - } - - if (options.json) { - let parsedResult = null; - if (nestedResult.stdout) { - try { - parsedResult = JSON.parse(nestedResult.stdout); - } catch { - parsedResult = null; - } - } - repoResults.push( - parsedResult - ? { repoRoot: repoPath, exitCode, result: parsedResult } - : { - repoRoot: repoPath, - exitCode, - stdout: nestedResult.stdout || '', - stderr: nestedResult.stderr || '', - }, - ); - } else { - console.log( - `[${TOOL_NAME}] Doctor target complete: ${repoPath} [${progressLabel}] in ${formatElapsedDuration(Date.now() - startedAt)}.`, - ); - if (repoIndex < discoveredRepos.length - 1) { - process.stdout.write('\n'); - } - } - } - - if (options.json) { - process.stdout.write( - JSON.stringify( - { - repoRoot: topRepoRoot, - recursive: true, - repos: repoResults, - }, - null, - 2, - ) + '\n', - ); - } - - process.exitCode = aggregateExitCode; - return; - } - - const singleRepoOptions = { - ...options, - target: topRepoRoot, - }; - - const blocked = protectedBaseWriteBlock(singleRepoOptions, { requireBootstrap: false }); - if (blocked) { - runDoctorInSandbox(singleRepoOptions, blocked); - return; - } - - assertProtectedMainWriteAllowed(singleRepoOptions, 'doctor'); - const fixPayload = runFixInternal(singleRepoOptions); - const scanResult = runScanInternal({ target: singleRepoOptions.target, json: false }); - const currentBaseBranch = currentBranchName(scanResult.repoRoot); - 
const autoFinishSummary = scanResult.guardexEnabled === false - ? { - enabled: false, - attempted: 0, - completed: 0, - skipped: 0, - failed: 0, - details: [], - } - : autoFinishReadyAgentBranches(scanResult.repoRoot, { - baseBranch: currentBaseBranch, - dryRun: singleRepoOptions.dryRun, - waitForMerge: singleRepoOptions.waitForMerge, - }); - const safe = scanResult.guardexEnabled === false || (scanResult.errors === 0 && scanResult.warnings === 0); - const musafe = safe; - - if (singleRepoOptions.json) { - process.stdout.write( - JSON.stringify( - { - repoRoot: scanResult.repoRoot, - branch: scanResult.branch, - safe, - musafe, - fix: { - operations: fixPayload.operations, - hookResult: fixPayload.hookResult, - dryRun: Boolean(singleRepoOptions.dryRun), - }, - scan: { - guardexEnabled: scanResult.guardexEnabled !== false, - guardexToggle: scanResult.guardexToggle || null, - errors: scanResult.errors, - warnings: scanResult.warnings, - findings: scanResult.findings, - }, - autoFinish: autoFinishSummary, - }, - null, - 2, - ) + '\n', - ); - setExitCodeFromScan(scanResult); - return; - } - - printOperations('Doctor/fix', fixPayload, options.dryRun); - printScanResult(scanResult, false); - if (scanResult.guardexEnabled === false) { - console.log(`[${TOOL_NAME}] Repo-local Guardex enforcement is intentionally disabled.`); - setExitCodeFromScan(scanResult); - return; - } - printAutoFinishSummary(autoFinishSummary, { - baseBranch: currentBaseBranch, - verbose: singleRepoOptions.verboseAutoFinish, - }); - if (safe) { - console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ Repo is fully safe.`, 'safe')); - } else { - console.log( - colorizeDoctorOutput( - `[${TOOL_NAME}] ⚠️ Repo is not fully safe yet (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`, - scanResult.errors > 0 ? 
'unsafe' : 'warn', - ), - ); - } - setExitCodeFromScan(scanResult); -} - -function review(rawArgs) { - const options = parseReviewArgs(rawArgs); - const repoRoot = resolveRepoRoot(options.target); - const result = runReviewBotCommand(repoRoot, options.passthroughArgs); - if (isSpawnFailure(result)) { - throw result.error; - } - - if (result.stdout) process.stdout.write(result.stdout); - if (result.stderr) process.stderr.write(result.stderr); - process.exitCode = typeof result.status === 'number' ? result.status : 1; -} - -function agentsStatePathForRepo(repoRoot) { - return path.join(repoRoot, AGENTS_BOTS_STATE_RELATIVE); -} - -function readAgentsState(repoRoot) { - const statePath = agentsStatePathForRepo(repoRoot); - if (!fs.existsSync(statePath)) { - return null; - } - try { - return JSON.parse(fs.readFileSync(statePath, 'utf8')); - } catch (_error) { - return null; - } -} - -function writeAgentsState(repoRoot, state) { - const statePath = agentsStatePathForRepo(repoRoot); - fs.mkdirSync(path.dirname(statePath), { recursive: true }); - fs.writeFileSync(statePath, `${JSON.stringify(state, null, 2)}\n`, 'utf8'); -} - -function processAlive(pid) { - const normalizedPid = Number.parseInt(String(pid || ''), 10); - if (!Number.isInteger(normalizedPid) || normalizedPid <= 0) { - return false; - } - try { - process.kill(normalizedPid, 0); - return true; - } catch (_error) { - return false; - } -} - -function sleepSeconds(seconds) { - const result = run('sleep', [String(seconds)]); - if (isSpawnFailure(result) || result.status !== 0) { - throw new Error(`sleep command failed for ${seconds}s`); - } -} - -function readProcessCommand(pid) { - const result = run('ps', ['-o', 'command=', '-p', String(pid)]); - if (isSpawnFailure(result) || result.status !== 0) { - return ''; - } - return String(result.stdout || '').trim(); -} - -function stopAgentProcessByPid(pid, expectedToken = '') { - const normalizedPid = Number.parseInt(String(pid || ''), 10); - if 
(!Number.isInteger(normalizedPid) || normalizedPid <= 0) { - return { status: 'invalid', pid: normalizedPid }; - } - if (!processAlive(normalizedPid)) { - return { status: 'not-running', pid: normalizedPid }; - } - - if (expectedToken) { - const cmdline = readProcessCommand(normalizedPid); - if (cmdline && !cmdline.includes(expectedToken)) { - return { status: 'mismatch', pid: normalizedPid, command: cmdline }; - } - } - - try { - process.kill(-normalizedPid, 'SIGTERM'); - } catch (_error) { - try { - process.kill(normalizedPid, 'SIGTERM'); - } catch (_err) { - return { status: 'term-failed', pid: normalizedPid }; - } - } - - const deadline = Date.now() + 3_000; - while (Date.now() < deadline) { - if (!processAlive(normalizedPid)) { - return { status: 'stopped', pid: normalizedPid }; - } - sleepSeconds(0.1); - } - - try { - process.kill(-normalizedPid, 'SIGKILL'); - } catch (_error) { - try { - process.kill(normalizedPid, 'SIGKILL'); - } catch (_err) { - return { status: 'kill-failed', pid: normalizedPid }; - } - } - sleepSeconds(0.1); - - return { - status: processAlive(normalizedPid) ? 
'kill-failed' : 'stopped', - pid: normalizedPid, - }; -} - -function spawnDetachedAgentProcess({ command, args, cwd, logPath }) { - fs.mkdirSync(path.dirname(logPath), { recursive: true }); - const logHandle = fs.openSync(logPath, 'a'); - fs.writeSync( - logHandle, - `[${new Date().toISOString()}] spawn: ${command} ${args.join(' ')}\n`, - ); - const child = cp.spawn(command, args, { - cwd, - detached: true, - stdio: ['ignore', logHandle, logHandle], - env: process.env, - }); - fs.closeSync(logHandle); - if (child.error) { - throw child.error; - } - child.unref(); - const pid = Number.parseInt(String(child.pid || ''), 10); - if (!Number.isInteger(pid) || pid <= 0) { - throw new Error(`Failed to spawn detached process for ${command}`); - } - return pid; -} - -function agents(rawArgs) { - const options = parseAgentsArgs(rawArgs); - const repoRoot = resolveRepoRoot(options.target); - const statePath = agentsStatePathForRepo(repoRoot); - - if (options.subcommand === 'start') { - const existingState = readAgentsState(repoRoot); - const existingReviewPid = Number.parseInt(String(existingState?.review?.pid || ''), 10); - const existingCleanupPid = Number.parseInt(String(existingState?.cleanup?.pid || ''), 10); - const reviewRunning = processAlive(existingReviewPid); - const cleanupRunning = processAlive(existingCleanupPid); - - if (reviewRunning && cleanupRunning) { - console.log( - `[${TOOL_NAME}] Repo agents already running (review pid=${existingReviewPid}, cleanup pid=${existingCleanupPid}).`, - ); - process.exitCode = 0; - return; - } - - const reviewLogPath = path.join(repoRoot, '.omx', 'logs', 'agent-review.log'); - const cleanupLogPath = path.join(repoRoot, '.omx', 'logs', 'agent-cleanup.log'); - - let reviewPid = existingReviewPid; - let cleanupPid = existingCleanupPid; - let startedAny = false; - let reusedAny = false; - - if (!reviewRunning) { - reviewPid = spawnDetachedAgentProcess({ - command: process.execPath, - args: [ - path.resolve(__filename), - 
'internal', - 'run-shell', - 'reviewBot', - '--target', - repoRoot, - '--interval', - String(options.reviewIntervalSeconds), - ], - cwd: repoRoot, - logPath: reviewLogPath, - }); - startedAny = true; - } else { - reusedAny = true; - } - - if (!cleanupRunning) { - cleanupPid = spawnDetachedAgentProcess({ - command: process.execPath, - args: [ - path.resolve(__filename), - 'cleanup', - '--target', - repoRoot, - '--watch', - '--interval', - String(options.cleanupIntervalSeconds), - '--idle-minutes', - String(options.idleMinutes), - ], - cwd: repoRoot, - logPath: cleanupLogPath, - }); - startedAny = true; - } else { - reusedAny = true; - } - - const priorReviewInterval = Number.parseInt(String(existingState?.review?.intervalSeconds || ''), 10); - const priorCleanupInterval = Number.parseInt(String(existingState?.cleanup?.intervalSeconds || ''), 10); - const priorIdleMinutes = Number.parseInt(String(existingState?.cleanup?.idleMinutes || ''), 10); - const reviewIntervalSeconds = reviewRunning && Number.isInteger(priorReviewInterval) && priorReviewInterval >= 5 - ? priorReviewInterval - : options.reviewIntervalSeconds; - const cleanupIntervalSeconds = cleanupRunning && Number.isInteger(priorCleanupInterval) && priorCleanupInterval >= 5 - ? priorCleanupInterval - : options.cleanupIntervalSeconds; - const idleMinutes = cleanupRunning && Number.isInteger(priorIdleMinutes) && priorIdleMinutes >= 1 - ? 
priorIdleMinutes - : options.idleMinutes; - - writeAgentsState(repoRoot, { - schemaVersion: 1, - repoRoot, - startedAt: new Date().toISOString(), - review: { - pid: reviewPid, - intervalSeconds: reviewIntervalSeconds, - script: path.resolve(__filename), - logPath: reviewLogPath, - }, - cleanup: { - pid: cleanupPid, - intervalSeconds: cleanupIntervalSeconds, - idleMinutes, - script: path.resolve(__filename), - logPath: cleanupLogPath, - }, - }); - - console.log( - `[${TOOL_NAME}] Started repo agents in ${repoRoot} (review pid=${reviewPid}, cleanup pid=${cleanupPid}).`, - ); - if (reusedAny && startedAny) { - console.log(`[${TOOL_NAME}] Reused healthy bot process(es) and started only missing ones.`); - } - console.log(`[${TOOL_NAME}] Logs: ${reviewLogPath}, ${cleanupLogPath}`); - process.exitCode = 0; - return; - } - - if (options.subcommand === 'stop') { - const existingState = readAgentsState(repoRoot); - if (!existingState) { - console.log(`[${TOOL_NAME}] Repo agents are not running for ${repoRoot}.`); - process.exitCode = 0; - return; - } - - const reviewStop = stopAgentProcessByPid(existingState?.review?.pid, 'internal run-shell reviewBot'); - const cleanupStop = stopAgentProcessByPid(existingState?.cleanup?.pid, `${path.basename(__filename)} cleanup`); - - if (fs.existsSync(statePath)) { - fs.unlinkSync(statePath); - } - - console.log( - `[${TOOL_NAME}] Stopped repo agents in ${repoRoot} (review=${reviewStop.status}, cleanup=${cleanupStop.status}).`, - ); - process.exitCode = 0; - return; - } - - const existingState = readAgentsState(repoRoot); - if (!existingState) { - console.log(`[${TOOL_NAME}] Repo agents status: inactive (${repoRoot})`); - process.exitCode = 0; - return; - } - - const reviewPid = Number.parseInt(String(existingState?.review?.pid || ''), 10); - const cleanupPid = Number.parseInt(String(existingState?.cleanup?.pid || ''), 10); - console.log( - `[${TOOL_NAME}] Repo agents status: review=${processAlive(reviewPid) ? 
'running' : 'stopped'}(pid=${reviewPid || 0}), cleanup=${processAlive(cleanupPid) ? 'running' : 'stopped'}(pid=${cleanupPid || 0})`, - ); - process.exitCode = 0; -} - -function report(rawArgs) { - const options = parseReportArgs(rawArgs); - const subcommand = options.subcommand || 'help'; - if (subcommand === 'help' || subcommand === '--help' || subcommand === '-h') { - console.log( - `${TOOL_NAME} report commands:\n` + - ` ${TOOL_NAME} report scorecard [--target ] [--repo github.com//] [--scorecard-json ] [--output-dir ] [--date YYYY-MM-DD] [--dry-run] [--json]\n` + - `\n` + - `Examples:\n` + - ` ${TOOL_NAME} report scorecard --repo github.com/recodeecom/multiagent-safety\n` + - ` ${TOOL_NAME} report scorecard --scorecard-json ./scorecard.json --date 2026-04-10`, - ); - process.exitCode = 0; - return; - } - - if (subcommand !== 'scorecard') { - throw new Error(`Unknown report subcommand: ${subcommand}`); - } - - const repoRoot = resolveRepoRoot(options.target); - const repo = resolveScorecardRepo(repoRoot, options.repo); - const payload = options.scorecardJson - ? readScorecardJsonFile(options.scorecardJson) - : runScorecardJson(repo); - - const reportDate = options.date || todayDateStamp(); - const outputDir = path.resolve(options.outputDir || path.join(repoRoot, 'docs', 'reports')); - const baselinePath = path.join(outputDir, `openssf-scorecard-baseline-${reportDate}.md`); - const remediationPath = path.join(outputDir, `openssf-scorecard-remediation-plan-${reportDate}.md`); - - const checks = normalizeScorecardChecks(payload); - const rawScore = Number(payload?.score); - const score = Number.isFinite(rawScore) ? 
rawScore : 0; - const capturedAt = String(payload?.date || new Date().toISOString()); - const scorecardVersion = String(payload?.scorecard?.version || payload?.version || 'unknown'); - - const baselineMarkdown = renderScorecardBaselineMarkdown({ - repo, - score, - checks, - capturedAt, - scorecardVersion, - reportDate, - }); - - const remediationMarkdown = renderScorecardRemediationPlanMarkdown({ - baselineRelativePath: path.relative(repoRoot, baselinePath) || path.basename(baselinePath), - checks, - }); - - if (!options.dryRun) { - fs.mkdirSync(outputDir, { recursive: true }); - fs.writeFileSync(baselinePath, baselineMarkdown, 'utf8'); - fs.writeFileSync(remediationPath, remediationMarkdown, 'utf8'); - } - - if (options.json) { - process.stdout.write( - JSON.stringify( - { - repoRoot, - repo, - score, - checks: checks.length, - outputDir, - baselinePath, - remediationPath, - dryRun: Boolean(options.dryRun), - }, - null, - 2, - ) + '\n', - ); - process.exitCode = 0; - return; - } - - console.log(`[${TOOL_NAME}] Report target: ${repoRoot}`); - console.log(`[${TOOL_NAME}] Scorecard repo: ${repo}`); - console.log(`[${TOOL_NAME}] Score: ${score}/10`); - if (options.dryRun) { - console.log(`[${TOOL_NAME}] Dry run report paths:`); - } else { - console.log(`[${TOOL_NAME}] Generated reports:`); - } - console.log(` - ${baselinePath}`); - console.log(` - ${remediationPath}`); - process.exitCode = 0; -} - -function setup(rawArgs) { - const options = parseSetupArgs(rawArgs, { - target: process.cwd(), - force: false, - skipAgents: false, - skipPackageJson: false, - skipGitignore: false, - dryRun: false, - yesGlobalInstall: false, - noGlobalInstall: false, - allowProtectedBaseWrite: false, - }); - - const globalInstallStatus = installGlobalToolchain(options); - if (globalInstallStatus.status === 'installed') { - console.log( - `[${TOOL_NAME}] ✅ Companion tools installed (${(globalInstallStatus.packages || []).join(', ')}).`, - ); - } else if (globalInstallStatus.status === 
'already-installed') { - console.log(`[${TOOL_NAME}] ✅ Companion tools already installed. Skipping.`); - } else if (globalInstallStatus.status === 'failed') { - const installCommands = describeCompanionInstallCommands( - GLOBAL_TOOLCHAIN_PACKAGES, - OPTIONAL_LOCAL_COMPANION_TOOLS, - ); - console.log( - `[${TOOL_NAME}] ⚠️ Global install failed: ${globalInstallStatus.reason}\n` + - `[${TOOL_NAME}] Continue with local safety setup. You can retry later with:\n` + - installCommands.map((command) => ` ${command}`).join('\n'), - ); - } else if (globalInstallStatus.status === 'skipped' && globalInstallStatus.reason === 'non-interactive-default') { - console.log( - `[${TOOL_NAME}] Skipping companion installs (non-interactive mode). ` + - `Use --yes-global-install to force or run interactively for Y/N prompt.`, - ); - } else if (globalInstallStatus.status === 'skipped') { - console.log(`[${TOOL_NAME}] ⚠️ Companion installs skipped by user choice.`); - for (const warning of describeMissingGlobalDependencyWarnings( - globalInstallStatus.missingPackages || [], - )) { - console.log(`[${TOOL_NAME}] ⚠️ ${warning}`); - } - } - const requiredSystemTools = detectRequiredSystemTools(); - const missingSystemTools = requiredSystemTools.filter((tool) => tool.status !== 'active'); - if (missingSystemTools.length === 0) { - console.log(`[${TOOL_NAME}] ✅ Required system tools available (${requiredSystemTools.map((tool) => tool.name).join(', ')}).`); - } else { - const names = missingSystemTools.map((tool) => tool.name).join(', '); - console.log(`[${TOOL_NAME}] ⚠️ Missing required system tool(s): ${names}`); - for (const tool of missingSystemTools) { - const reasonText = tool.reason ? ` (${tool.reason})` : ''; - console.log(`[${TOOL_NAME}] Install ${tool.name}: ${tool.installHint}${reasonText}`); - } - } - - const topRepoRoot = resolveRepoRoot(options.target); - const discoveredRepos = options.recursive - ? 
discoverNestedGitRepos(topRepoRoot, { - maxDepth: options.nestedMaxDepth, - extraSkip: options.nestedSkipDirs, - includeSubmodules: options.includeSubmodules, - }) - : [topRepoRoot]; - - if (discoveredRepos.length > 1) { - console.log( - `[${TOOL_NAME}] Detected ${discoveredRepos.length} git repos under ${topRepoRoot}. Installing into each (use --no-recursive to limit to the top-level).`, - ); - for (const repoPath of discoveredRepos) { - const marker = repoPath === topRepoRoot ? ' (top-level)' : ''; - console.log(`[${TOOL_NAME}] - ${repoPath}${marker}`); - } - } - - let aggregateErrors = 0; - let aggregateWarnings = 0; - let lastScanResult = null; - - for (const repoPath of discoveredRepos) { - const perRepoOptions = { ...options, target: repoPath }; - const repoLabel = discoveredRepos.length > 1 ? ` [${path.relative(topRepoRoot, repoPath) || '.'}]` : ''; - - if (discoveredRepos.length > 1) { - console.log(`[${TOOL_NAME}] ── Setup target: ${repoPath} ──`); - } - - const blocked = protectedBaseWriteBlock(perRepoOptions); - if (blocked) { - const sandboxResult = runSetupInSandbox(perRepoOptions, blocked, repoLabel); - aggregateErrors += sandboxResult.scanResult.errors; - aggregateWarnings += sandboxResult.scanResult.warnings; - lastScanResult = sandboxResult.scanResult; - continue; - } - - const { installPayload, fixPayload, parentWorkspace } = runSetupBootstrapInternal(perRepoOptions); - printOperations(`Setup/install${repoLabel}`, installPayload, perRepoOptions.dryRun); - printOperations(`Setup/fix${repoLabel}`, fixPayload, perRepoOptions.dryRun); - - if (perRepoOptions.dryRun) { - continue; - } - - if (parentWorkspace) { - console.log(`[${TOOL_NAME}] Parent workspace view: ${parentWorkspace.workspacePath}`); - } - - const scanResult = runScanInternal({ target: repoPath, json: false }); - const currentBaseBranch = currentBranchName(scanResult.repoRoot); - const autoFinishSummary = autoFinishReadyAgentBranches(scanResult.repoRoot, { - baseBranch: currentBaseBranch, 
-      dryRun: perRepoOptions.dryRun,
-    });
-    printScanResult(scanResult, false);
-    if (autoFinishSummary.enabled) {
-      console.log(
-        `[${TOOL_NAME}] Auto-finish sweep (base=${currentBaseBranch}): attempted=${autoFinishSummary.attempted}, completed=${autoFinishSummary.completed}, skipped=${autoFinishSummary.skipped}, failed=${autoFinishSummary.failed}`,
-      );
-      for (const detail of autoFinishSummary.details) {
-        console.log(`[${TOOL_NAME}] ${detail}`);
-      }
-    } else if (autoFinishSummary.details.length > 0) {
-      console.log(`[${TOOL_NAME}] ${autoFinishSummary.details[0]}`);
-    }
-    printSetupRepoHints(scanResult.repoRoot, currentBaseBranch, repoLabel);
-
-    aggregateErrors += scanResult.errors;
-    aggregateWarnings += scanResult.warnings;
-    lastScanResult = scanResult;
-  }
-
-  if (options.dryRun) {
-    console.log(`[${TOOL_NAME}] Dry run setup done.`);
-    process.exitCode = 0;
-    return;
-  }
-
-  if (aggregateErrors === 0 && aggregateWarnings === 0) {
-    const repoCount = discoveredRepos.length;
-    const suffix = repoCount > 1 ? ` (${repoCount} repos)` : '';
-    console.log(`[${TOOL_NAME}] ✅ Setup complete.${suffix}`);
-    console.log(`[${TOOL_NAME}] Copy AI setup prompt with: ${SHORT_TOOL_NAME} prompt`);
-    console.log(
-      `[${TOOL_NAME}] OpenSpec core workflow: /opsx:propose -> /opsx:apply -> /opsx:archive`,
-    );
-    console.log(
-      `[${TOOL_NAME}] Optional expanded OpenSpec profile: openspec config profile && openspec update`,
-    );
-    console.log(`[${TOOL_NAME}] OpenSpec guide: docs/openspec-getting-started.md`);
-  }
-
-  if (lastScanResult) {
-    setExitCodeFromScan({
-      ...lastScanResult,
-      errors: aggregateErrors,
-      warnings: aggregateWarnings,
-    });
-  }
-}
-
-function ensureMainBranch(repoRoot) {
-  const branchResult = gitRun(repoRoot, ['rev-parse', '--abbrev-ref', 'HEAD'], { allowFailure: true });
-  if (branchResult.status !== 0) {
-    throw new Error(`Unable to detect current branch in ${repoRoot}`);
-  }
-
-  const branch = branchResult.stdout.trim();
-  if (branch !== 'main') {
-    throw new Error(`Release blocked: current branch is '${branch}' (required: 'main')`);
-  }
-}
-
-function ensureCleanWorkingTree(repoRoot) {
-  const statusResult = gitRun(repoRoot, ['status', '--porcelain'], { allowFailure: true });
-  if (statusResult.status !== 0) {
-    throw new Error(`Unable to read git status in ${repoRoot}`);
-  }
-
-  const dirty = statusResult.stdout.trim();
-  if (dirty.length > 0) {
-    throw new Error('Release blocked: working tree is not clean');
-  }
-}
-
-function readReleaseRepoPackageJson(repoRoot) {
-  const manifestPath = path.join(repoRoot, 'package.json');
-  if (!fs.existsSync(manifestPath)) {
-    throw new Error(`Release blocked: package.json missing in ${repoRoot}`);
-  }
-
-  try {
-    return JSON.parse(fs.readFileSync(manifestPath, 'utf8'));
-  } catch (error) {
-    throw new Error(`Release blocked: unable to parse package.json in ${repoRoot}: ${error.message}`);
-  }
-}
-
-function resolveReleaseGithubRepo(repoRoot) {
-  const releasePackageJson = readReleaseRepoPackageJson(repoRoot);
-  const fromManifest = inferGithubRepoSlug(
-    releasePackageJson.repository &&
-      (releasePackageJson.repository.url || releasePackageJson.repository),
-  );
-  if (fromManifest) {
-    return fromManifest;
-  }
-
-  const fromOrigin = inferGithubRepoSlug(readGitConfig(repoRoot, 'remote.origin.url'));
-  if (fromOrigin) {
-    return fromOrigin;
-  }
-
-  throw new Error(
-    'Release blocked: unable to resolve GitHub repo from package.json repository URL or origin remote.',
-  );
-}
-
-function readRepoReadme(repoRoot) {
-  const readmePath = path.join(repoRoot, 'README.md');
-  if (!fs.existsSync(readmePath)) {
-    throw new Error(`Release blocked: README.md missing in ${repoRoot}`);
-  }
-  return fs.readFileSync(readmePath, 'utf8');
-}
-
-function parseReadmeReleaseEntries(readmeContent) {
-  const releaseNotesIndex = String(readmeContent || '').indexOf('## Release notes');
-  if (releaseNotesIndex < 0) {
-    throw new Error('Release blocked: README.md is missing the "## Release notes" section');
-  }
-
-  const releaseNotesContent = String(readmeContent || '').slice(releaseNotesIndex);
-  const entries = [];
-  const lines = releaseNotesContent.split(/\r?\n/);
-  let currentTag = '';
-  let currentLines = [];
-
-  function flushEntry() {
-    if (!currentTag) {
-      return;
-    }
-    const body = currentLines.join('\n').trim();
-    if (body) {
-      entries.push({ tag: currentTag, body, version: parseVersionString(currentTag) });
-    }
-    currentTag = '';
-    currentLines = [];
-  }
-
-  for (const line of lines) {
-    const headingMatch = line.match(/^###\s+(v\d+\.\d+\.\d+)\s*$/);
-    if (headingMatch) {
-      flushEntry();
-      currentTag = headingMatch[1];
-      continue;
-    }
-
-    if (!currentTag) {
-      continue;
-    }
-
-    if (/^<\/details>\s*$/.test(line) || /^##\s+/.test(line)) {
-      flushEntry();
-      continue;
-    }
-
-    currentLines.push(line);
-  }
-
-  flushEntry();
-
-  if (entries.length === 0) {
-    throw new Error('Release blocked: README.md did not yield any versioned release-note sections');
-  }
-
-  return entries;
-}
-
-function resolvePreviousPublishedReleaseTag(repoSlug, currentTag) {
-  const result = run(GH_BIN, ['release', 'list', '--repo', repoSlug, '--limit', '20'], {
-    timeout: 20_000,
-  });
-  if (result.error) {
-    throw new Error(`Release blocked: unable to run '${GH_BIN} release list': ${result.error.message}`);
-  }
-  if (result.status !== 0) {
-    const details = (result.stderr || result.stdout || '').trim();
-    throw new Error(`Release blocked: unable to list GitHub releases.${details ? `\n${details}` : ''}`);
-  }
-
-  const tags = String(result.stdout || '')
-    .split('\n')
-    .map((line) => line.split('\t')[0].trim())
-    .filter(Boolean);
-
-  return tags.find((tag) => tag !== currentTag) || '';
-}
-
-function selectReleaseEntriesForWindow(entries, currentTag, previousTag) {
-  const currentVersion = parseVersionString(currentTag);
-  if (!currentVersion) {
-    throw new Error(`Release blocked: invalid current version tag '${currentTag}'`);
-  }
-  const previousVersion = previousTag ? parseVersionString(previousTag) : null;
-
-  const selected = entries.filter((entry) => {
-    if (!entry.version) return false;
-    if (compareParsedVersions(entry.version, currentVersion) > 0) return false;
-    if (!previousVersion) return entry.tag === currentTag;
-    return compareParsedVersions(entry.version, previousVersion) > 0;
-  });
-
-  if (!selected.some((entry) => entry.tag === currentTag)) {
-    throw new Error(`Release blocked: README.md is missing release notes for ${currentTag}`);
-  }
-
-  return selected;
-}
-
-function renderGeneratedReleaseNotes(entries, currentTag, previousTag) {
-  const intro = previousTag ? `Changes since ${previousTag}.` : `Changes in ${currentTag}.`;
-  const sections = entries
-    .map((entry) => `### ${entry.tag}\n${entry.body}`)
-    .join('\n\n');
-  return `GitGuardex ${currentTag}\n\n${intro}\n\n${sections}`;
-}
-
-function buildReleaseNotesFromReadme(repoRoot, currentTag, previousTag) {
-  const readme = readRepoReadme(repoRoot);
-  const entries = parseReadmeReleaseEntries(readme);
-  const selected = selectReleaseEntriesForWindow(entries, currentTag, previousTag);
-  return renderGeneratedReleaseNotes(selected, currentTag, previousTag);
-}
-
-function release(rawArgs) {
-  if (rawArgs.length > 0) {
-    throw new Error(`Unknown option: ${rawArgs[0]}`);
-  }
-
-  const repoRoot = resolveRepoRoot(process.cwd());
-  if (path.resolve(repoRoot) !== MAINTAINER_RELEASE_REPO) {
-    throw new Error(
-      `Release blocked: command only allowed in ${MAINTAINER_RELEASE_REPO} (current: ${repoRoot})`,
-    );
-  }
-
-  ensureMainBranch(repoRoot);
-  ensureCleanWorkingTree(repoRoot);
-
-  if (!isCommandAvailable(GH_BIN)) {
-    throw new Error(`Release blocked: '${GH_BIN}' is not available`);
-  }
-
-  const ghAuthStatus = run(GH_BIN, ['auth', 'status'], { timeout: 20_000 });
-  if (ghAuthStatus.error) {
-    throw new Error(`Release blocked: unable to run '${GH_BIN} auth status': ${ghAuthStatus.error.message}`);
-  }
-  if (ghAuthStatus.status !== 0) {
-    const details = (ghAuthStatus.stderr || ghAuthStatus.stdout || '').trim();
-    throw new Error(`Release blocked: '${GH_BIN}' auth is unavailable.${details ? `\n${details}` : ''}`);
-  }
-
-  const releasePackageJson = readReleaseRepoPackageJson(repoRoot);
-  const repoSlug = resolveReleaseGithubRepo(repoRoot);
-  const currentTag = `v${releasePackageJson.version}`;
-  const previousTag = resolvePreviousPublishedReleaseTag(repoSlug, currentTag);
-  const notes = buildReleaseNotesFromReadme(repoRoot, currentTag, previousTag);
-  const headCommit = gitRun(repoRoot, ['rev-parse', 'HEAD']).stdout.trim();
-
-  const existingRelease = run(GH_BIN, ['release', 'view', currentTag, '--repo', repoSlug], {
-    timeout: 20_000,
-  });
-  if (existingRelease.error) {
-    throw new Error(`Release blocked: unable to run '${GH_BIN} release view': ${existingRelease.error.message}`);
-  }
-
-  const releaseArgs =
-    existingRelease.status === 0
-      ? ['release', 'edit', currentTag, '--repo', repoSlug, '--title', currentTag, '--notes', notes]
-      : [
-          'release',
-          'create',
-          currentTag,
-          '--repo',
-          repoSlug,
-          '--target',
-          headCommit,
-          '--title',
-          currentTag,
-          '--notes',
-          notes,
-        ];
-
-  console.log(
-    `[${TOOL_NAME}] ${existingRelease.status === 0 ? 'Updating' : 'Creating'} GitHub release ${currentTag} on ${repoSlug}`,
-  );
-  if (previousTag) {
-    console.log(`[${TOOL_NAME}] Aggregating README release notes newer than ${previousTag}.`);
-  } else {
-    console.log(`[${TOOL_NAME}] No earlier published GitHub release found; using only ${currentTag}.`);
-  }
-
-  const releaseResult = run(GH_BIN, releaseArgs, { cwd: repoRoot, timeout: 60_000 });
-  if (releaseResult.error) {
-    throw new Error(`Release blocked: unable to run '${GH_BIN} release': ${releaseResult.error.message}`);
-  }
-  if (releaseResult.status !== 0) {
-    const details = (releaseResult.stderr || releaseResult.stdout || '').trim();
-    throw new Error(`GitHub release command failed.${details ? `\n${details}` : ''}`);
-  }
-
-  const releaseUrl = String(releaseResult.stdout || '').trim();
-  if (releaseUrl) {
-    console.log(releaseUrl);
-  }
-
-  console.log(`[${TOOL_NAME}] ✅ GitHub release ${currentTag} is synced to the README history.`);
-  process.exitCode = 0;
-}
-
-function installMany(rawArgs) {
-  const options = parseInstallManyArgs(rawArgs);
-  const targets = collectInstallManyTargets(options);
-
-  if (!targets.length) {
-    throw new Error('install-many did not find any targets to process.');
-  }
-
-  if (options.usedImplicitWorkspaceDefault) {
-    console.log(
-      `[multiagent-safety] No explicit targets provided. Defaulting to workspace scan: ${path.resolve(
-        options.workspace,
-      )} (max depth ${options.maxDepth})`,
-    );
-  }
-
-  console.log(
-    `[multiagent-safety] install-many starting for ${targets.length} target path(s)${
-      options.dryRun ? ' [dry-run]' : ''
-    }`,
-  );
-
-  let installed = 0;
-  let duplicateRepos = 0;
-  const seenRepoRoots = new Set();
-  const failures = [];
-
-  for (const targetPath of targets) {
-    let repoRoot;
-    try {
-      repoRoot = resolveRepoRoot(targetPath);
-    } catch (error) {
-      failures.push({ target: targetPath, message: error.message });
-      if (options.failFast) {
-        break;
-      }
-      continue;
-    }
-
-    if (seenRepoRoots.has(repoRoot)) {
-      duplicateRepos += 1;
-      console.log(`[multiagent-safety] Skipping duplicate repo target: ${targetPath} -> ${repoRoot}`);
-      continue;
-    }
-
-    seenRepoRoots.add(repoRoot);
-
-    try {
-      const report = installIntoRepoRoot(repoRoot, options);
-      printInstallReport(report);
-      installed += 1;
-    } catch (error) {
-      failures.push({ target: repoRoot, message: error.message });
-      if (options.failFast) {
-        break;
-      }
-    }
-  }
-
-  console.log(
-    `[multiagent-safety] install-many summary: installed=${installed}, failures=${failures.length}, duplicate-targets=${duplicateRepos}`,
-  );
-
-  if (failures.length > 0) {
-    console.error('[multiagent-safety] Failed targets:');
-    for (const failure of failures) {
-      console.error(`  - ${failure.target}`);
-      console.error(`    ${failure.message}`);
-    }
-    throw new Error(`install-many completed with ${failures.length} failure(s)`);
-  }
-
-  if (options.dryRun) {
-    console.log('[multiagent-safety] Dry run complete. No files were modified.');
-  } else {
-    console.log('[multiagent-safety] Installed multi-agent safety workflow across all targets.');
-  }
-}
-
-function initWorkspace(rawArgs) {
-  const options = parseInitWorkspaceArgs(rawArgs);
-  const resolvedWorkspace = path.resolve(options.workspace);
-  const repos = discoverGitRepos(resolvedWorkspace, options.maxDepth)
-    .map((repoPath) => path.resolve(repoPath))
-    .sort();
-
-  const outputPath = options.output
-    ? path.resolve(options.output)
-    : path.join(resolvedWorkspace, DEFAULT_WORKSPACE_TARGETS_FILE);
-
-  if (fs.existsSync(outputPath) && !options.force) {
-    throw new Error(`Refusing to overwrite existing file without --force: ${outputPath}`);
-  }
-
-  const headerLines = [
-    '# multiagent-safety workspace targets',
-    `# generated: ${new Date().toISOString()}`,
-    `# workspace: ${resolvedWorkspace}`,
-    `# max-depth: ${options.maxDepth}`,
-    '#',
-    '# Run:',
-    `#   multiagent-safety install-many --targets-file "${outputPath}"`,
-    '',
-  ];
-  const content = `${headerLines.join('\n')}${repos.join('\n')}${repos.length ? '\n' : ''}`;
-
-  fs.mkdirSync(path.dirname(outputPath), { recursive: true });
-  fs.writeFileSync(outputPath, content, 'utf8');
-
-  console.log(`[multiagent-safety] Workspace target file written: ${outputPath}`);
-  console.log(`[multiagent-safety] Repos discovered: ${repos.length}`);
-  if (repos.length === 0) {
-    console.log('[multiagent-safety] No git repos found. You can add target paths manually to the file.');
-  } else {
-    console.log(`[multiagent-safety] Next step: multiagent-safety install-many --targets-file "${outputPath}"`);
-  }
-}
-
-function doctorAudit(rawArgs) {
-  const options = parseDoctorArgs(rawArgs);
-  const repoRoot = resolveRepoRoot(options.target);
-  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
-  const failures = [];
-  const warnings = [];
-
-  function ok(message) {
-    console.log(`  [ok] ${message}`);
-  }
-  function warn(message) {
-    warnings.push(message);
-    console.log(`  [warn] ${message}`);
-  }
-  function fail(message) {
-    failures.push(message);
-    console.log(`  [fail] ${message}`);
-  }
-
-  console.log(`[multiagent-safety] doctor target: ${repoRoot}`);
-  if (!guardexToggle.enabled) {
-    console.log(
-      `[multiagent-safety] Guardex is disabled for this repo (${describeGuardexRepoToggle(guardexToggle)}).`,
-    );
-    console.log('[multiagent-safety] doctor passed.');
-    return;
-  }
-
-  const hooksPath = run('git', ['-C', repoRoot, 'config', '--get', 'core.hooksPath']);
-  if (hooksPath.status !== 0) {
-    fail('git core.hooksPath is not configured');
-  } else if (hooksPath.stdout.trim() !== '.githooks') {
-    fail(`git core.hooksPath is "${hooksPath.stdout.trim()}" (expected ".githooks")`);
-  } else {
-    ok('git core.hooksPath is .githooks');
-  }
-
-  for (const relativePath of REQUIRED_MANAGED_REPO_FILES) {
-    const absolutePath = path.join(repoRoot, relativePath);
-    if (!fs.existsSync(absolutePath)) {
-      fail(`missing ${relativePath}`);
-      continue;
-    }
-    ok(`found ${relativePath}`);
-
-    if (EXECUTABLE_RELATIVE_PATHS.has(relativePath)) {
-      try {
-        fs.accessSync(absolutePath, fs.constants.X_OK);
-      } catch {
-        fail(`${relativePath} exists but is not executable`);
-      }
-    }
-  }
-
-  const lockFilePath = path.join(repoRoot, '.omx/state/agent-file-locks.json');
-  if (fs.existsSync(lockFilePath)) {
-    try {
-      const parsed = JSON.parse(fs.readFileSync(lockFilePath, 'utf8'));
-      if (!parsed || typeof parsed !== 'object' || typeof parsed.locks !== 'object') {
-        fail('.omx/state/agent-file-locks.json does not contain a valid { locks: {} } object');
-      } else {
-        ok('lock registry JSON is valid');
-      }
-    } catch (error) {
-      fail(`lock registry JSON is invalid: ${error.message}`);
-    }
-  }
-
-  const packagePath = path.join(repoRoot, 'package.json');
-  if (!fs.existsSync(packagePath)) {
-    warn('package.json not found (legacy agent:* script drift cannot be checked)');
-  } else {
-    try {
-      const pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
-      const scripts = pkg.scripts || {};
-      const legacyAgentScripts = Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS)
-        .filter(([name, expectedValue]) => scripts[name] === expectedValue)
-        .map(([name]) => name);
-      if (legacyAgentScripts.length > 0) {
-        warn(`legacy agent:* package.json scripts remain (${legacyAgentScripts.join(', ')}); run '${SHORT_TOOL_NAME} migrate' to remove them`);
-      } else {
-        ok('package.json does not contain Guardex-managed agent:* helper scripts');
-      }
-    } catch (error) {
-      fail(`package.json is invalid JSON: ${error.message}`);
-    }
-  }
-
-  const agentsPath = path.join(repoRoot, 'AGENTS.md');
-  if (!fs.existsSync(agentsPath)) {
-    warn('AGENTS.md not found (multi-agent contract snippet not present)');
-  } else {
-    const agentsContent = fs.readFileSync(agentsPath, 'utf8');
-    if (!agentsContent.includes(AGENTS_MARKER_START)) {
-      warn('AGENTS.md exists but multiagent-safety snippet marker is missing');
-    } else {
-      ok('AGENTS.md contains multiagent-safety snippet marker');
-    }
-  }
-
-  if (warnings.length) {
-    console.log(`[multiagent-safety] warnings: ${warnings.length}`);
-  }
-  if (failures.length) {
-    console.log(`[multiagent-safety] failures: ${failures.length}`);
-  }
-
-  if (failures.length === 0 && (!options.strict || warnings.length === 0)) {
-    console.log('[multiagent-safety] doctor passed.');
-    if (warnings.length > 0) {
-      console.log('[multiagent-safety] tip: run with --strict to treat warnings as failures.');
-    }
-    return;
-  }
-
-  if (options.strict && warnings.length > 0 && failures.length === 0) {
-    console.log('[multiagent-safety] strict mode failed due to warnings.');
-  } else {
-    console.log('[multiagent-safety] doctor failed.');
-  }
-  throw new Error('doctor detected configuration issues');
-}
-
-function printAgentsSnippet() {
-  const snippetPath = path.join(TEMPLATE_ROOT, 'AGENTS.multiagent-safety.md');
-  process.stdout.write(fs.readFileSync(snippetPath, 'utf8'));
-}
-
-function copyPrompt() {
-  process.stdout.write(AI_SETUP_PROMPT);
-  process.exitCode = 0;
-}
-
-function copyCommands() {
-  process.stdout.write(AI_SETUP_COMMANDS);
-  process.exitCode = 0;
-}
-
-function prompt(rawArgs) {
-  const args = Array.isArray(rawArgs) ? rawArgs : [];
-  let variant = 'prompt';
-  for (const arg of args) {
-    if (arg === '--exec' || arg === '--commands') variant = 'exec';
-    else if (arg === '--snippet' || arg === '--agents') variant = 'snippet';
-    else if (arg === '--prompt' || arg === '--full') variant = 'prompt';
-    else if (arg === '-h' || arg === '--help') variant = 'help';
-    else throw new Error(`Unknown option: ${arg}`);
-  }
-  if (variant === 'help') {
-    console.log(
-      `${SHORT_TOOL_NAME} prompt commands:\n` +
-        `  ${SHORT_TOOL_NAME} prompt            Print AI setup checklist\n` +
-        `  ${SHORT_TOOL_NAME} prompt --exec     Print setup commands only (shell-ready)\n` +
-        `  ${SHORT_TOOL_NAME} prompt --snippet  Print the AGENTS.md managed-block template`,
-    );
-    process.exitCode = 0;
-    return;
-  }
-  if (variant === 'exec') return copyCommands();
-  if (variant === 'snippet') return printAgentsSnippet();
-  return copyPrompt();
-}
-
-function printStandaloneOperations(title, rootLabel, operations, dryRun = false) {
-  console.log(`[${TOOL_NAME}] ${title}: ${rootLabel}`);
-  for (const operation of operations) {
-    const note = operation.note ? ` (${operation.note})` : '';
-    console.log(`  - ${operation.status.padEnd(12)} ${operation.file}${note}`);
-  }
-  if (dryRun) {
-    console.log(`[${TOOL_NAME}] Dry run complete. No files were modified.`);
-  }
-}
-
-function branch(rawArgs) {
-  const [subcommand, ...rest] = rawArgs;
-  if (subcommand === 'start') {
-    const { target, passthrough } = extractTargetedArgs(rest);
-    invokePackageAsset('branchStart', passthrough, { cwd: resolveRepoRoot(target) });
-    return;
-  }
-  if (subcommand === 'finish') {
-    const { target, passthrough } = extractTargetedArgs(rest);
-    invokePackageAsset('branchFinish', passthrough, { cwd: resolveRepoRoot(target) });
-    return;
-  }
-  if (subcommand === 'merge') return merge(rest);
-  throw new Error(
-    `Usage: ${SHORT_TOOL_NAME} branch <start|finish|merge> [options] ` +
-      `(examples: '${SHORT_TOOL_NAME} branch start "<task>" "<agent>"', '${SHORT_TOOL_NAME} branch finish --branch <name>')`,
-  );
-}
-
-function locks(rawArgs) {
-  const { target, passthrough } = extractTargetedArgs(rawArgs);
-  const result = runPackageAsset('lockTool', passthrough, { cwd: resolveRepoRoot(target) });
-  if (result.stdout) process.stdout.write(result.stdout);
-  if (result.stderr) process.stderr.write(result.stderr);
-  process.exitCode = result.status;
-}
-
-function worktree(rawArgs) {
-  const [subcommand, ...rest] = rawArgs;
-  if (subcommand === 'prune') {
-    const { target, passthrough } = extractTargetedArgs(rest);
-    invokePackageAsset('worktreePrune', passthrough, { cwd: resolveRepoRoot(target) });
-    return;
-  }
-  throw new Error(`Usage: ${SHORT_TOOL_NAME} worktree prune [cleanup-options]`);
-}
-
-function hook(rawArgs) {
-  const [subcommand, ...rest] = rawArgs;
-  if (subcommand === 'run') {
-    const [hookName, ...hookArgs] = rest;
-    if (!HOOK_NAMES.includes(hookName)) {
-      throw new Error(`Unknown hook name: ${hookName || '(missing)'}`);
-    }
-    const { target, passthrough } = extractTargetedArgs(hookArgs);
-    const hookAssetPath = path.join(TEMPLATE_ROOT, 'githooks', hookName);
-    const result = run('bash', [hookAssetPath, ...passthrough], {
-      cwd: resolveRepoRoot(target),
-      stdio: hookName === 'pre-push' ? 'inherit' : 'pipe',
-      env: packageAssetEnv(),
-    });
-    if (result.stdout) process.stdout.write(result.stdout);
-    if (result.stderr) process.stderr.write(result.stderr);
-    process.exitCode = result.status;
-    return;
-  }
-  if (subcommand === 'install') {
-    const { target, passthrough } = extractTargetedArgs(rest);
-    if (passthrough.length > 0) {
-      throw new Error(`Unknown hook install option: ${passthrough[0]}`);
-    }
-    const repoRoot = resolveRepoRoot(target);
-    const hookResult = configureHooks(repoRoot, false);
-    console.log(`[${TOOL_NAME}] Hook install target: ${repoRoot}`);
-    console.log(`  - hooksPath ${hookResult.status} ${hookResult.key}=${hookResult.value}`);
-    process.exitCode = 0;
-    return;
-  }
-  throw new Error(`Usage: ${SHORT_TOOL_NAME} hook <run|install> ...`);
-}
-
-function internal(rawArgs) {
-  const [subcommand, assetKey, ...rest] = rawArgs;
-  if (subcommand !== 'run-shell') {
-    throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`);
-  }
-  const { target, passthrough } = extractTargetedArgs(rest);
-  const repoRoot = resolveRepoRoot(target);
-  const result = assetKey === 'reviewBot'
-    ? runReviewBotCommand(repoRoot, passthrough)
-    : runPackageAsset(assetKey, passthrough, { cwd: repoRoot });
-  if (result.stdout) process.stdout.write(result.stdout);
-  if (result.stderr) process.stderr.write(result.stderr);
-  process.exitCode = result.status;
-}
-
-function installAgentSkills(rawArgs) {
-  let dryRun = false;
-  let force = false;
-  for (const arg of rawArgs) {
-    if (arg === '--dry-run') {
-      dryRun = true;
-      continue;
-    }
-    if (arg === '--force') {
-      force = true;
-      continue;
-    }
-    throw new Error(`Unknown option: ${arg}`);
-  }
-
-  const operations = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force }));
-  printStandaloneOperations('User-level Guardex skills', GUARDEX_HOME_DIR, operations, dryRun);
-  process.exitCode = 0;
-}
-
-function migrate(rawArgs) {
-  const { target, passthrough } = extractTargetedArgs(rawArgs);
-  let dryRun = false;
-  let force = false;
-  let installSkills = false;
-  for (const arg of passthrough) {
-    if (arg === '--dry-run') {
-      dryRun = true;
-      continue;
-    }
-    if (arg === '--force') {
-      force = true;
-      continue;
-    }
-    if (arg === '--install-agent-skills') {
-      installSkills = true;
-      continue;
-    }
-    throw new Error(`Unknown option: ${arg}`);
-  }
-
-  const repoRoot = resolveRepoRoot(target);
-  const fixPayload = runFixInternal({
-    target: repoRoot,
-    dryRun,
-    force,
-    skipAgents: false,
-    skipPackageJson: true,
-    skipGitignore: false,
-    dropStaleLocks: true,
-  });
-  printOperations('Migrate/fix', fixPayload, dryRun);
-
-  if (installSkills) {
-    const skillOps = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force }));
-    printStandaloneOperations('Migrate/install-agent-skills', GUARDEX_HOME_DIR, skillOps, dryRun);
-  }
-
-  const removableLegacyFiles = LEGACY_MANAGED_REPO_FILES.filter(
-    (relativePath) => !REQUIRED_MANAGED_REPO_FILES.includes(relativePath),
-  );
-  const removalOps = removableLegacyFiles.map((relativePath) => removeLegacyManagedRepoFile(repoRoot, relativePath, { dryRun, force }));
-  removalOps.push(removeLegacyPackageScripts(repoRoot, dryRun));
-  printStandaloneOperations('Migrate/cleanup', repoRoot, removalOps, dryRun);
-  process.exitCode = 0;
-}
-
-function cleanup(rawArgs) {
-  const options = parseCleanupArgs(rawArgs);
-  const repoRoot = resolveRepoRoot(options.target);
-
-  const args = [];
-  if (options.base) {
-    args.push('--base', options.base);
-  }
-  if (options.branch) {
-    args.push('--branch', options.branch);
-  }
-  if (options.forceDirty) {
-    args.push('--force-dirty');
-  }
-  if (options.dryRun) {
-    args.push('--dry-run');
-  }
-  if (!options.keepCleanWorktrees) {
-    args.push('--only-dirty-worktrees');
-  }
-  if (options.includePrMerged) {
-    args.push('--include-pr-merged');
-  }
-  if (options.idleMinutes > 0) {
-    args.push('--idle-minutes', String(options.idleMinutes));
-  }
-  if (options.maxBranches > 0) {
-    args.push('--max-branches', String(options.maxBranches));
-  }
-  args.push('--delete-branches');
-  if (!options.keepRemote) {
-    args.push('--delete-remote-branches');
-  }
-
-  const runCleanupCycle = () => {
-    const runResult = runPackageAsset('worktreePrune', args, { cwd: repoRoot, stdio: 'inherit' });
-    if (runResult.status !== 0) {
-      throw new Error('Cleanup command failed');
-    }
-  };
-
-  if (options.watch) {
-    let cycle = 0;
-    while (true) {
-      cycle += 1;
-      console.log(
-        `[${TOOL_NAME}] Cleanup watch cycle=${cycle} (interval=${options.intervalSeconds}s, idleMinutes=${options.idleMinutes}, maxBranches=${options.maxBranches > 0 ? options.maxBranches : "unbounded"}).`,
-      );
-      runCleanupCycle();
-      if (options.once) {
-        break;
-      }
-      const sleepResult = run('sleep', [String(options.intervalSeconds)], { cwd: repoRoot });
-      if (sleepResult.status !== 0) {
-        throw new Error(`Cleanup watch sleep failed (interval=${options.intervalSeconds}s)`);
-      }
-    }
-    process.exitCode = 0;
-    return;
-  }
-
-  runCleanupCycle();
-  process.exitCode = 0;
-}
-
-function merge(rawArgs) {
-  const options = parseMergeArgs(rawArgs);
-  const repoRoot = resolveRepoRoot(options.target);
-
-  const args = [];
-  if (options.base) {
-    args.push('--base', options.base);
-  }
-  if (options.into) {
-    args.push('--into', options.into);
-  }
-  if (options.task) {
-    args.push('--task', options.task);
-  }
-  if (options.agent) {
-    args.push('--agent', options.agent);
-  }
-  for (const branch of options.branches) {
-    args.push('--branch', branch);
-  }
-
-  const mergeResult = runPackageAsset('branchMerge', args, { cwd: repoRoot, stdio: 'pipe' });
-  if (mergeResult.stdout) {
-    process.stdout.write(mergeResult.stdout);
-  }
-  if (mergeResult.stderr) {
-    process.stderr.write(mergeResult.stderr);
-  }
-  if (mergeResult.status !== 0) {
-    throw new Error(`merge command failed with status ${mergeResult.status}`);
-  }
-
-  process.exitCode = 0;
-}
-
-function finish(rawArgs, defaults = {}) {
-  const options = parseFinishArgs(rawArgs, defaults);
-  const repoRoot = resolveRepoRoot(options.target);
-
-  const worktreeEntries = listAgentWorktrees(repoRoot);
-  const worktreeByBranch = new Map(worktreeEntries.map((entry) => [entry.branch, entry.worktreePath]));
-
-  let candidateBranches = [];
-  if (options.branch) {
-    if (!branchExists(repoRoot, options.branch)) {
-      throw new Error(`Local branch not found: ${options.branch}`);
-    }
-    candidateBranches = [options.branch];
-  } else {
-    candidateBranches = uniquePreserveOrder([
-      ...listLocalAgentBranchesForFinish(repoRoot),
-      ...worktreeEntries.map((entry) => entry.branch),
-    ]);
-  }
-
-  const candidates = [];
for (const branch of candidateBranches) { - const worktreePath = worktreeByBranch.get(branch) || ''; - const baseBranch = resolveFinishBaseBranch(repoRoot, branch, options.base); - const hasChanges = worktreePath ? worktreeHasLocalChanges(worktreePath) : false; - const alreadyMerged = branchMergedIntoBase(repoRoot, branch, baseBranch); - if (options.all || options.branch || hasChanges || !alreadyMerged) { - candidates.push({ - branch, - baseBranch, - worktreePath, - hasChanges, - alreadyMerged, - }); - } - } - - if (candidates.length === 0) { - console.log(`[${TOOL_NAME}] No pending agent branches to finish.`); - process.exitCode = 0; - return; - } - - let succeeded = 0; - let failed = 0; - let autoCommitted = 0; - - for (const candidate of candidates) { - const { branch, baseBranch, worktreePath } = candidate; - console.log( - `[${TOOL_NAME}] Finishing '${branch}' -> '${baseBranch}'${worktreePath ? ` (${worktreePath})` : ''}...`, - ); - - try { - let commitState = { changed: false, committed: false }; - if (worktreePath) { - commitState = autoCommitWorktreeForFinish(repoRoot, worktreePath, branch, options); - } - - if (commitState.committed) { - autoCommitted += 1; - console.log(`[${TOOL_NAME}] Auto-committed '${branch}' before finish.`); - } else if (commitState.changed && commitState.dryRun) { - console.log(`[${TOOL_NAME}] [dry-run] Would auto-commit pending changes on '${branch}'.`); - } - - const finishArgs = [ - '--branch', - branch, - '--base', - baseBranch, - options.waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', - options.cleanup ? 
'--cleanup' : '--no-cleanup', - ]; - if (options.mergeMode === 'pr') { - finishArgs.push('--via-pr'); - } else if (options.mergeMode === 'direct') { - finishArgs.push('--direct-only'); - } else { - finishArgs.push('--mode', 'auto'); - } - if (options.keepRemote) { - finishArgs.push('--keep-remote-branch'); - } - - if (options.dryRun) { - console.log(`[${TOOL_NAME}] [dry-run] Would run: gx branch finish ${finishArgs.join(' ')}`); - succeeded += 1; - continue; - } - - const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot, stdio: 'pipe' }); - if (finishResult.stdout) { - process.stdout.write(finishResult.stdout); - } - if (finishResult.stderr) { - process.stderr.write(finishResult.stderr); - } - if (finishResult.status !== 0) { - throw new Error(`agent-branch-finish exited with status ${finishResult.status}`); - } - - succeeded += 1; - } catch (error) { - failed += 1; - console.error(`[${TOOL_NAME}] Finish failed for '${branch}': ${error.message}`); - if (options.failFast) { - break; - } - } - } - - console.log( - `[${TOOL_NAME}] Finish summary: total=${candidates.length}, success=${succeeded}, failed=${failed}, autoCommitted=${autoCommitted}`, - ); - - if (failed > 0) { - throw new Error('finish command failed for one or more agent branches'); - } - - process.exitCode = 0; -} - -function sync(rawArgs) { - const options = parseSyncArgs(rawArgs); - const repoRoot = resolveRepoRoot(options.target); - const baseBranch = resolveBaseBranch(repoRoot, options.base); - const strategy = resolveSyncStrategy(repoRoot, options.strategy); - const baseRef = `origin/${baseBranch}`; - - ensureOriginBaseRef(repoRoot, baseBranch); - - if (options.allAgentBranches) { - const refs = gitRun(repoRoot, ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/*'], { allowFailure: true }); - if (refs.status !== 0) { - throw new Error('Unable to list local agent branches'); - } - const branches = (refs.stdout || '').split('\n').map((item) => 
item.trim()).filter(Boolean); - const rows = branches.map((branch) => { - const counts = aheadBehind(repoRoot, branch, baseRef); - return { - branch, - base: baseRef, - ahead: counts.ahead, - behind: counts.behind, - syncRequired: counts.behind > 0, - }; - }); - - if (options.json) { - process.stdout.write(`${JSON.stringify({ - repoRoot, - base: baseRef, - branchCount: rows.length, - rows, - }, null, 2)}\n`); - } else { - console.log(`[${TOOL_NAME}] Sync report target: ${repoRoot}`); - console.log(`[${TOOL_NAME}] Base: ${baseRef}`); - if (rows.length === 0) { - console.log(`[${TOOL_NAME}] No local agent branches found.`); - } else { - for (const row of rows) { - console.log(` - ${row.branch} | ahead ${row.ahead} | behind ${row.behind} | syncRequired=${row.syncRequired}`); - } - } - } - - const hasBehind = rows.some((row) => row.behind > 0); - process.exitCode = options.check && hasBehind ? 1 : 0; - return; - } - - const branch = currentBranchName(repoRoot); - if (!options.allowNonAgent && !branch.startsWith('agent/')) { - throw new Error(`sync is limited to agent/* branches by default (current: ${branch}). Use --allow-non-agent to override.`); - } - - const dirty = workingTreeIsDirty(repoRoot); - if (!options.check && !options.allowDirty && dirty) { - throw new Error('Sync blocked: working tree is not clean. 
Commit or stash changes first, or pass --allow-dirty.'); - } - - const before = aheadBehind(repoRoot, branch, baseRef); - - const payload = { - repoRoot, - branch, - base: baseRef, - strategy, - dirty, - aheadBefore: before.ahead, - behindBefore: before.behind, - syncRequired: before.behind > 0, - status: 'checked', - }; - - if (options.check) { - if (options.json) { - process.stdout.write(`${JSON.stringify(payload, null, 2)}\n`); - } else { - console.log(`[${TOOL_NAME}] Sync check target: ${repoRoot}`); - console.log(`[${TOOL_NAME}] Branch: ${branch}`); - console.log(`[${TOOL_NAME}] Base: ${baseRef}`); - console.log(`[${TOOL_NAME}] Ahead: ${before.ahead}`); - console.log(`[${TOOL_NAME}] Behind: ${before.behind}`); - console.log(`[${TOOL_NAME}] Sync required: ${before.behind > 0 ? 'yes' : 'no'}`); - } - process.exitCode = before.behind > 0 ? 1 : 0; - return; - } - - if (before.behind === 0) { - const result = { ...payload, status: 'no-op', aheadAfter: before.ahead, behindAfter: before.behind }; - if (options.json) { - process.stdout.write(`${JSON.stringify(result, null, 2)}\n`); - } else { - console.log(`[${TOOL_NAME}] Branch '${branch}' is already up to date with ${baseRef}.`); - } - process.exitCode = 0; - return; - } - - if (options.dryRun) { - const result = { ...payload, status: 'dry-run' }; - if (options.json) { - process.stdout.write(`${JSON.stringify(result, null, 2)}\n`); - } else { - console.log(`[${TOOL_NAME}] Dry run: would sync '${branch}' onto ${baseRef} via ${strategy}.`); - } - process.exitCode = 0; - return; - } - - const lockPath = path.join(repoRoot, LOCK_FILE_RELATIVE); - const lockState = lockRegistryStatus(repoRoot); - let lockBackup = null; - if (lockState.dirty && fs.existsSync(lockPath)) { - lockBackup = fs.readFileSync(lockPath, 'utf8'); - } - - if (lockState.dirty) { - if (lockState.untracked) { - fs.rmSync(lockPath, { force: true }); - } else { - const resetLock = gitRun(repoRoot, ['checkout', '--', LOCK_FILE_RELATIVE], { allowFailure: 
true }); - if (resetLock.status !== 0) { - throw new Error(`Unable to temporarily reset ${LOCK_FILE_RELATIVE} before sync`); - } - } - } - - try { - syncOperation(repoRoot, strategy, baseRef, options.ffOnly); - } finally { - if (lockBackup !== null) { - fs.mkdirSync(path.dirname(lockPath), { recursive: true }); - fs.writeFileSync(lockPath, lockBackup, 'utf8'); - } - } - const after = aheadBehind(repoRoot, branch, baseRef); - const result = { - ...payload, - status: 'success', - aheadAfter: after.ahead, - behindAfter: after.behind, - }; - - if (options.json) { - process.stdout.write(`${JSON.stringify(result, null, 2)}\n`); - } else { - console.log(`[${TOOL_NAME}] Sync target: ${repoRoot}`); - console.log(`[${TOOL_NAME}] Branch: ${branch}`); - console.log(`[${TOOL_NAME}] Base: ${baseRef}`); - console.log(`[${TOOL_NAME}] Strategy: ${strategy}`); - console.log(`[${TOOL_NAME}] Behind before sync: ${before.behind}`); - console.log(`[${TOOL_NAME}] Result: success (behind now: ${after.behind})`); - } - - process.exitCode = 0; -} - -function protect(rawArgs) { - const parsed = parseTargetFlag(rawArgs, process.cwd()); - const [subcommand, ...rest] = parsed.args; - const repoRoot = resolveRepoRoot(parsed.target); - - if (!subcommand || subcommand === 'help' || subcommand === '--help' || subcommand === '-h') { - console.log( - `${TOOL_NAME} protect commands:\n` + - ` ${TOOL_NAME} protect list [--target <path>]\n` + - ` ${TOOL_NAME} protect add <branch...> [--target <path>]\n` + - ` ${TOOL_NAME} protect remove <branch...> [--target <path>]\n` + - ` ${TOOL_NAME} protect set <branch...> [--target <path>]\n` + - ` ${TOOL_NAME} protect reset [--target <path>]`, - ); - process.exitCode = 0; - return; - } - - const requestedBranches = uniquePreserveOrder(parseBranchList(rest.join(' '))); - - if (subcommand === 'list') { - const branches = readProtectedBranches(repoRoot); - console.log(`[${TOOL_NAME}] Protected branches (${branches.length}): ${branches.join(', ')}`); - process.exitCode = 0; - return; - } - - if (subcommand === 'add') { - if
(requestedBranches.length === 0) { - throw new Error('protect add requires one or more branch names'); - } - const current = readProtectedBranches(repoRoot); - const next = uniquePreserveOrder([...current, ...requestedBranches]); - writeProtectedBranches(repoRoot, next); - console.log(`[${TOOL_NAME}] Protected branches updated: ${next.join(', ')}`); - process.exitCode = 0; - return; - } - - if (subcommand === 'remove') { - if (requestedBranches.length === 0) { - throw new Error('protect remove requires one or more branch names'); - } - const current = readProtectedBranches(repoRoot); - const removals = new Set(requestedBranches); - const next = current.filter((branch) => !removals.has(branch)); - writeProtectedBranches(repoRoot, next); - console.log( - `[${TOOL_NAME}] Protected branches updated: ` + - `${(next.length > 0 ? next : DEFAULT_PROTECTED_BRANCHES).join(', ')}`, - ); - if (next.length === 0) { - console.log(`[${TOOL_NAME}] Reset to defaults (${DEFAULT_PROTECTED_BRANCHES.join(', ')}) because list was empty.`); - } - process.exitCode = 0; - return; - } - - if (subcommand === 'set') { - if (requestedBranches.length === 0) { - throw new Error('protect set requires one or more branch names'); - } - writeProtectedBranches(repoRoot, requestedBranches); - console.log(`[${TOOL_NAME}] Protected branches set: ${requestedBranches.join(', ')}`); - process.exitCode = 0; - return; - } - - if (subcommand === 'reset') { - writeProtectedBranches(repoRoot, []); - console.log(`[${TOOL_NAME}] Protected branches reset to defaults: ${DEFAULT_PROTECTED_BRANCHES.join(', ')}`); - process.exitCode = 0; - return; - } - - throw new Error(`Unknown protect subcommand: ${subcommand}`); -} - -function levenshteinDistance(a, b) { - const rows = a.length + 1; - const cols = b.length + 1; - const matrix = Array.from({ length: rows }, () => Array(cols).fill(0)); - - for (let i = 0; i < rows; i += 1) matrix[i][0] = i; - for (let j = 0; j < cols; j += 1) matrix[0][j] = j; - - for (let i = 1; i 
< rows; i += 1) { - for (let j = 1; j < cols; j += 1) { - const cost = a[i - 1] === b[j - 1] ? 0 : 1; - matrix[i][j] = Math.min( - matrix[i - 1][j] + 1, // deletion - matrix[i][j - 1] + 1, // insertion - matrix[i - 1][j - 1] + cost, // substitution - ); - } - } - return matrix[a.length][b.length]; -} - -function maybeSuggestCommand(command) { - let best = null; - let bestDistance = Number.POSITIVE_INFINITY; - - for (const candidate of SUGGESTIBLE_COMMANDS) { - const dist = levenshteinDistance(command, candidate); - if (dist < bestDistance) { - bestDistance = dist; - best = candidate; - } - } - - if (best && bestDistance <= 2) { - return best; - } - - return null; -} - -function normalizeCommandOrThrow(command) { - if (COMMAND_TYPO_ALIASES.has(command)) { - const mapped = COMMAND_TYPO_ALIASES.get(command); - console.log(`[${TOOL_NAME}] Interpreting '${command}' as '${mapped}'.`); - return mapped; - } - return command; -} - -function warnDeprecatedAlias(aliasName) { - const entry = DEPRECATED_COMMAND_ALIASES.get(aliasName); - if (!entry) return; - console.error( - `[${TOOL_NAME}] '${aliasName}' is deprecated and will be removed in a future major release. 
` + - `Use: ${entry.hint}`, - ); -} - -function extractFlag(args, ...names) { - const flagSet = new Set(names); - let found = false; - const remaining = []; - for (const arg of args) { - if (flagSet.has(arg)) { - found = true; - } else { - remaining.push(arg); - } - } - return { found, remaining }; -} - -function main() { - const args = process.argv.slice(2); - - if (args.length === 0) { - maybeSelfUpdateBeforeStatus(); - maybeOpenSpecUpdateBeforeStatus(); - status([]); - return; - } - - const [rawCommand, ...rest] = args; - const command = normalizeCommandOrThrow(rawCommand); - - if (command === '--help' || command === '-h' || command === 'help') { - usage(); - return; - } - - if (command === '--version' || command === '-v' || command === 'version') { - maybeSelfUpdateBeforeStatus(); - console.log(packageJson.version); - return; - } - - // Deprecated direct aliases — route to new surface and warn once. - if (DEPRECATED_COMMAND_ALIASES.has(command)) { - warnDeprecatedAlias(command); - if (command === 'init') return setup(rest); - if (command === 'install') return install(rest); - if (command === 'fix') return fix(rest); - if (command === 'scan') return scan(rest); - if (command === 'copy-prompt') return copyPrompt(); - if (command === 'copy-commands') return copyCommands(); - if (command === 'print-agents-snippet') return printAgentsSnippet(); - if (command === 'review') return review(rest); - } - - if (command === 'status') { - const { found: strict, remaining } = extractFlag(rest, '--strict'); - if (strict) return scan(remaining); - return status(remaining); - } - - if (command === 'setup') { - const installOnly = extractFlag(rest, '--install-only', '--only-install'); - if (installOnly.found) return install(installOnly.remaining); - const repairOnly = extractFlag(installOnly.remaining, '--repair', '--fix-only'); - if (repairOnly.found) return fix(repairOnly.remaining); - return setup(repairOnly.remaining); - } - - if (command === 'prompt') return prompt(rest); - 
if (command === 'doctor') return doctor(rest); - if (command === 'branch') return branch(rest); - if (command === 'locks') return locks(rest); - if (command === 'worktree') return worktree(rest); - if (command === 'hook') return hook(rest); - if (command === 'migrate') return migrate(rest); - if (command === 'install-agent-skills') return installAgentSkills(rest); - if (command === 'internal') return internal(rest); - if (command === 'agents') return agents(rest); - if (command === 'merge') return merge(rest); - if (command === 'finish') return finish(rest); - if (command === 'report') return report(rest); - if (command === 'protect') return protect(rest); - if (command === 'sync') return sync(rest); - if (command === 'cleanup') return cleanup(rest); - if (command === 'release') return release(rest); - - const suggestion = maybeSuggestCommand(command); - if (suggestion) { - throw new Error(`Unknown command: ${command}. Did you mean '${suggestion}'?`); - } - throw new Error(`Unknown command: ${command}`); -} - -try { - main(); -} catch (error) { - console.error(`[${TOOL_NAME}] ${error.message}`); - process.exitCode = 1; -} +runFromBin(); diff --git a/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/.openspec.yaml b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/.openspec.yaml new file mode 100644 index 0000000..25345f4 --- /dev/null +++ b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-04-22 diff --git a/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/proposal.md b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/proposal.md new file mode 100644 index 0000000..acebb3b --- /dev/null +++ b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/proposal.md @@ -0,0 +1,18 @@ +## Why + +- `bin/multiagent-safety.js` is roughly 7,864 lines long and currently mixes CLI parsing, template 
rendering, git/worktree plumbing, protected-base sandboxing, finish/merge logic, toolchain self-update, and output/report formatting. +- That shared module scope makes even small changes hard to review and easy to regress because unrelated helpers are tightly coupled and tests have to exercise one monolith. +- The requested outcome is a seam-based decomposition so future Guardex CLI changes can land in smaller diffs with clearer ownership and lower regression risk. + +## What Changes + +- Introduce a `src/` runtime layout that separates `cli`, `output`, `git`, `scaffold`, `hooks`, `toolchain`, `sandbox`, and `finish`, with only small shared helpers left in `src/context.js` and `src/core/runtime.js`. +- Reduce `bin/multiagent-safety.js` to a thin entrypoint that boots `src/cli/main.js`. +- Preserve the current command surface, aliases, and targeted behavior while moving the existing logic wholesale into the new modules. +- Update package shipping and regression coverage so installed CLIs still include `src/**` and the extracted runtime stays exercised by install/metadata tests. + +## Impact + +- Primary surfaces: `bin/multiagent-safety.js`, new `src/**` modules, `package.json`, and CLI regression tests. +- Main refactor risk is hidden cross-module coupling in `doctor`, protected-main sandbox flows, and finish/cleanup helpers, so extraction should move lower-risk seams first and verify after each pass. +- This is an internal architecture cleanup only; it must not intentionally change command names, output contracts, or the zero-copy install surface. 
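The thin-entrypoint shape this proposal describes (and that the diff realizes with `runFromBin()`) can be sketched roughly as follows. This is an illustrative, self-contained stand-in: the command names, return values, and inlined dispatch table below are assumptions for the sketch, not Guardex's actual surface, which lives in `src/cli/main.js`.

```javascript
// Minimal sketch of the "thin entrypoint" pattern: the bin script only reads
// argv and delegates into a dispatch table owned by the modular runtime, the
// way bin/multiagent-safety.js now delegates into src/cli/main.js.
// Commands and outputs here are illustrative placeholders.
const commands = {
  status: () => 'status: ok',
  version: () => '7.0.16',
};

function runFromBin(argv) {
  // No-arg invocation defaults to the status flow, mirroring the real CLI.
  const [command = 'status'] = argv;
  const handler = commands[command];
  if (!handler) {
    throw new Error(`Unknown command: ${command}`);
  }
  return handler();
}

console.log(runFromBin(process.argv.slice(2)));
```

Keeping the dispatch table in a required module rather than in the bin file is what lets regression suites exercise command routing directly, without spawning the installed binary.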
diff --git a/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/specs/cli-modularization/spec.md b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/specs/cli-modularization/spec.md new file mode 100644 index 0000000..f009646 --- /dev/null +++ b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/specs/cli-modularization/spec.md @@ -0,0 +1,29 @@ +## ADDED Requirements + +### Requirement: Thin CLI entrypoint +The CLI SHALL keep `bin/multiagent-safety.js` as a thin bootstrap surface that delegates command execution into `src/cli`. + +#### Scenario: Entrypoint delegates into src/cli +- **WHEN** the published CLI binary is executed +- **THEN** `bin/multiagent-safety.js` loads the modular runtime from `src/cli/main.js` +- **AND** command dispatch logic no longer depends on the monolithic file body. + +### Requirement: Module seams mirror operational responsibility +The CLI SHALL separate major operational seams into dedicated modules under `src/` instead of keeping them in one file. + +#### Scenario: Responsibilities live under dedicated src modules +- **WHEN** a maintainer inspects the refactored CLI +- **THEN** argument parsing and dispatch live under `src/cli` +- **AND** output formatting lives under `src/output` +- **AND** git/worktree helpers live under `src/git` +- **AND** managed-file and template logic live under `src/scaffold` and `src/hooks` +- **AND** toolchain and self-update logic live under `src/toolchain` +- **AND** protected-base sandbox and finish flows live under `src/sandbox` and `src/finish`. + +### Requirement: Refactor preserves targeted CLI behavior +The modularization SHALL preserve the current command surface for targeted verified flows. 
+ +#### Scenario: Targeted CLI regressions stay green after extraction +- **WHEN** the focused install/metadata/command regression suites and packaging checks are run after the extraction +- **THEN** they pass without command-name regressions +- **AND** the published package still contains the runtime files required by the extracted `src/**` modules. diff --git a/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/tasks.md b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/tasks.md new file mode 100644 index 0000000..49ff729 --- /dev/null +++ b/openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/tasks.md @@ -0,0 +1,40 @@ +## Definition of Done + +This change is complete only when **all** of the following are true: + +- Every checkbox below is checked. +- The agent branch reaches `MERGED` state on `origin` and the PR URL + state are recorded in the completion handoff. +- If any step blocks (test failure, conflict, ambiguous result), append a `BLOCKED:` line under section 4 explaining the blocker and **STOP**. Do not tick remaining cleanup boxes; do not silently skip the cleanup pipeline. + +## Handoff + +- Handoff: change=`agent-codex-decompose-cli-monolith-2026-04-22-11-06`; branch=`agent/codex/decompose-cli-monolith-2026-04-22-11-06`; scope=`bin/multiagent-safety.js`, new `src/**` runtime modules, packaging metadata, and targeted CLI regression tests; action=`decompose the monolithic CLI into seam-owned modules while preserving the command surface`. +- Copy prompt: Continue `agent-codex-decompose-cli-monolith-2026-04-22-11-06` on branch `agent/codex/decompose-cli-monolith-2026-04-22-11-06`. 
Work inside the existing sandbox, review `openspec/changes/agent-codex-decompose-cli-monolith-2026-04-22-11-06/tasks.md`, continue from the current state instead of creating a new sandbox, and when the work is done run `gx branch finish --branch agent/codex/decompose-cli-monolith-2026-04-22-11-06 --base dev --via-pr --wait-for-merge --cleanup`. + +## 1. Specification + +- [x] 1.1 Finalize proposal scope and acceptance criteria for `agent-codex-decompose-cli-monolith-2026-04-22-11-06`. +- [x] 1.2 Define normative requirements in `specs/cli-modularization/spec.md`. + +## 2. Implementation + +- [x] 2.1 Add shared `src/context.js` / `src/core/runtime.js` foundations for constants, process helpers, and low-level utilities. +- [x] 2.2 Extract low-risk seams into `src/output`, `src/git`, `src/scaffold`, `src/hooks`, and `src/toolchain`. +- [x] 2.3 Extract higher-coupling seams into `src/sandbox`, `src/finish`, and `src/cli`. +- [x] 2.4 Reduce `bin/multiagent-safety.js` to a thin launcher that boots `src/cli/main.js`. +- [x] 2.5 Update publish packaging / metadata so installed CLIs ship the new `src/**` runtime. + +## 3. Verification + +- [x] 3.1 Add/update targeted regression coverage for the thin entrypoint, representative command routes, and package shipping of `src/**`. +- [x] 3.2 Run syntax checks for the entrypoint and extracted modules (`node --check bin/multiagent-safety.js` plus `node --check` on `src/**`). +- [x] 3.3 Run focused install/metadata/command regression suites. +- [x] 3.4 Run `npm pack --dry-run` to confirm `src/**` ships in the package. +- [x] 3.5 Run `openspec validate agent-codex-decompose-cli-monolith-2026-04-22-11-06 --type change --strict`. +- [x] 3.6 Run `openspec validate --specs`. + +## 4. Cleanup (mandatory; run before claiming completion) + +- [ ] 4.1 Run the cleanup pipeline: `gx branch finish --branch agent/codex/decompose-cli-monolith-2026-04-22-11-06 --base dev --via-pr --wait-for-merge --cleanup`. 
This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation. +- [ ] 4.2 Record the PR URL and final merge state (`MERGED`) in the completion handoff. +- [ ] 4.3 Confirm the sandbox worktree is gone (`git worktree list` no longer shows the agent path; `git branch -a` shows no surviving local/remote refs for the branch). diff --git a/package.json b/package.json index 617d2ee..9979ca2 100644 --- a/package.json +++ b/package.json @@ -38,6 +38,7 @@ }, "files": [ "bin", + "src", "templates", "README.md", "LICENSE", diff --git a/scripts/agent-branch-start.sh b/scripts/agent-branch-start.sh index ef8cc11..c871372 100755 --- a/scripts/agent-branch-start.sh +++ b/scripts/agent-branch-start.sh @@ -7,6 +7,8 @@ BASE_BRANCH="" BASE_BRANCH_EXPLICIT=0 WORKTREE_ROOT_REL="" WORKTREE_ROOT_EXPLICIT=0 +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" OPENSPEC_AUTO_INIT_RAW="${GUARDEX_OPENSPEC_AUTO_INIT:-false}" OPENSPEC_PLAN_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_PLAN_SLUG:-}" OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" @@ -15,6 +17,23 @@ OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" PRINT_NAME_ONLY=0 POSITIONAL_ARGS=() +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? + fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[agent-branch-start] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + while [[ $# -gt 0 ]]; do case "$1" in --task) @@ -385,26 +404,14 @@ initialize_openspec_plan_workspace() { local worktree="$2" local plan_slug="$3" - hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-plan-workspace.sh" - if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then return 0 fi - local openspec_script="${worktree}/scripts/openspec/init-plan-workspace.sh" - if [[ ! 
-f "$openspec_script" ]]; then - echo "[agent-branch-start] OpenSpec init script is missing in sandbox worktree." >&2 - echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local init_output="" if ! init_output="$( cd "$worktree" - bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1 + run_guardex_cli internal run-shell planInit "$plan_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[agent-branch-start] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2 @@ -423,26 +430,14 @@ initialize_openspec_change_workspace() { local change_slug="$3" local capability_slug="$4" - hydrate_local_helper_in_worktree "$repo" "$worktree" "scripts/openspec/init-change-workspace.sh" - if [[ "$OPENSPEC_AUTO_INIT" -ne 1 ]]; then return 0 fi - local openspec_script="${worktree}/scripts/openspec/init-change-workspace.sh" - if [[ ! -f "$openspec_script" ]]; then - echo "[agent-branch-start] OpenSpec change init script is missing in sandbox worktree." >&2 - echo "[agent-branch-start] Run 'gx setup --target \"$repo\"' to repair templates, then retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local init_output="" if ! init_output="$( cd "$worktree" - bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1 + run_guardex_cli internal run-shell changeInit "$change_slug" "$capability_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[agent-branch-start] OpenSpec workspace initialization failed for change '${change_slug}'." 
>&2 @@ -592,7 +587,6 @@ if [[ -n "$auto_transfer_stash_ref" ]]; then fi fi -hydrate_local_helper_in_worktree "$repo_root" "$worktree_path" "scripts/codex-agent.sh" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "node_modules" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/frontend/node_modules" hydrate_dependency_dir_symlink_in_worktree "$repo_root" "$worktree_path" "apps/backend/node_modules" @@ -609,6 +603,6 @@ echo "[agent-branch-start] OpenSpec change: openspec/changes/${openspec_change_s echo "[agent-branch-start] OpenSpec plan: openspec/plan/${openspec_plan_slug}" echo "[agent-branch-start] Next steps:" echo " cd \"${worktree_path}\"" -echo " python3 scripts/agent-file-locks.py claim --branch \"${branch_name}\" " +echo " gx locks claim --branch \"${branch_name}\" " echo " # implement + commit" -echo " bash scripts/agent-branch-finish.sh --branch \"${branch_name}\" --base ${BASE_BRANCH} --via-pr --wait-for-merge" +echo " gx branch finish --branch \"${branch_name}\" --base ${BASE_BRANCH} --via-pr --wait-for-merge" diff --git a/scripts/codex-agent.sh b/scripts/codex-agent.sh index 75b349c..6a05817 100755 --- a/scripts/codex-agent.sh +++ b/scripts/codex-agent.sh @@ -6,6 +6,8 @@ AGENT_NAME="${GUARDEX_AGENT_NAME:-agent}" BASE_BRANCH="${GUARDEX_BASE_BRANCH:-}" BASE_BRANCH_EXPLICIT=0 CODEX_BIN="${GUARDEX_CODEX_BIN:-codex}" +NODE_BIN="${GUARDEX_NODE_BIN:-node}" +CLI_ENTRY="${GUARDEX_CLI_ENTRY:-}" AUTO_FINISH_RAW="${GUARDEX_CODEX_AUTO_FINISH:-true}" AUTO_REVIEW_ON_CONFLICT_RAW="${GUARDEX_CODEX_AUTO_REVIEW_ON_CONFLICT:-true}" AUTO_CLEANUP_RAW="${GUARDEX_CODEX_AUTO_CLEANUP:-true}" @@ -16,6 +18,23 @@ OPENSPEC_CHANGE_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CHANGE_SLUG:-}" OPENSPEC_CAPABILITY_SLUG_OVERRIDE="${GUARDEX_OPENSPEC_CAPABILITY_SLUG:-}" OPENSPEC_MASTERPLAN_LABEL_RAW="${GUARDEX_OPENSPEC_MASTERPLAN_LABEL-masterplan}" +run_guardex_cli() { + if [[ -n "$CLI_ENTRY" ]]; then + "$NODE_BIN" "$CLI_ENTRY" "$@" + return $? 
+ fi + if command -v gx >/dev/null 2>&1; then + gx "$@" + return $? + fi + if command -v gitguardex >/dev/null 2>&1; then + gitguardex "$@" + return $? + fi + echo "[codex-agent] Guardex CLI entrypoint unavailable; rerun via gx." >&2 + return 127 +} + normalize_bool() { local raw="${1:-}" local fallback="${2:-0}" @@ -143,6 +162,7 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then exit 1 fi repo_root="$(git rev-parse --show-toplevel)" +active_session_state_script="${repo_root}/scripts/agent-session-state.js" guardex_env_helper="${repo_root}/scripts/guardex-env.sh" if [[ -f "$guardex_env_helper" ]]; then @@ -373,11 +393,6 @@ start_sandbox_fallback() { printf '[agent-branch-start] Worktree: %s\n' "$worktree_path" } -if [[ ! -x "${repo_root}/scripts/agent-branch-start.sh" ]]; then - echo "[codex-agent] Missing scripts/agent-branch-start.sh. Run: gx setup" >&2 - exit 1 -fi - start_args=("$TASK_NAME" "$AGENT_NAME") if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then start_args+=("$BASE_BRANCH") @@ -390,7 +405,7 @@ set +e start_output="$( GUARDEX_OPENSPEC_AUTO_INIT="$OPENSPEC_AUTO_INIT" \ GUARDEX_OPENSPEC_MASTERPLAN_LABEL="$OPENSPEC_MASTERPLAN_LABEL_RAW" \ - bash "${repo_root}/scripts/agent-branch-start.sh" "${start_args[@]}" 2>&1 + run_guardex_cli branch start "${start_args[@]}" 2>&1 )" start_status=$? set -e @@ -446,6 +461,40 @@ has_origin_remote() { git -C "$repo_root" remote get-url origin >/dev/null 2>&1 } +run_active_session_state() { + local action="$1" + shift + + if [[ ! -f "$active_session_state_script" ]]; then + return 0 + fi + if ! 
command -v "$NODE_BIN" >/dev/null 2>&1; then + return 0 + fi + + "$NODE_BIN" "$active_session_state_script" "$action" "$@" >/dev/null 2>&1 || true +} + +record_active_session_state() { + local wt="$1" + local branch="$2" + + run_active_session_state \ + start \ + --repo "$repo_root" \ + --branch "$branch" \ + --task "$TASK_NAME" \ + --agent "$AGENT_NAME" \ + --worktree "$wt" \ + --pid "$$" \ + --cli "$CODEX_BIN" +} + +clear_active_session_state() { + local branch="$1" + run_active_session_state stop --repo "$repo_root" --branch "$branch" +} + origin_remote_supports_pr_finish() { local origin_url origin_url="$(git -C "$repo_root" remote get-url origin 2>/dev/null || true)" @@ -493,7 +542,7 @@ print_takeover_prompt() { change_artifact="openspec/changes/${change_slug}/" fi - finish_cmd="bash scripts/agent-branch-finish.sh --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup" + finish_cmd="gx branch finish --branch \"${branch}\" --base ${base_branch} --via-pr --wait-for-merge --cleanup" echo "[codex-agent] Takeover sandbox: ${wt}" echo "[codex-agent] Takeover prompt: Continue \`${change_slug}\` on branch \`${branch}\`. Work inside \`${wt}\`, review \`${change_artifact}\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`${finish_cmd}\`." @@ -549,24 +598,12 @@ ensure_openspec_plan_workspace() { return 0 fi - hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-plan-workspace.sh" - - local openspec_script="${wt}/scripts/openspec/init-plan-workspace.sh" - if [[ ! -f "$openspec_script" ]]; then - echo "[codex-agent] Missing OpenSpec init script in sandbox: ${openspec_script}" >&2 - echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local plan_slug plan_slug="$(resolve_openspec_plan_slug "$branch")" local init_output="" if ! 
init_output="$( cd "$wt" - bash "scripts/openspec/init-plan-workspace.sh" "$plan_slug" 2>&1 + run_guardex_cli internal run-shell planInit "$plan_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[codex-agent] OpenSpec workspace initialization failed for plan '${plan_slug}'." >&2 @@ -586,24 +623,12 @@ ensure_openspec_change_workspace() { return 0 fi - hydrate_local_helper_in_worktree "$wt" "scripts/openspec/init-change-workspace.sh" - - local openspec_script="${wt}/scripts/openspec/init-change-workspace.sh" - if [[ ! -f "$openspec_script" ]]; then - echo "[codex-agent] Missing OpenSpec change init script in sandbox: ${openspec_script}" >&2 - echo "[codex-agent] Run 'gx setup --target ${repo_root}' and retry." >&2 - return 1 - fi - if [[ ! -x "$openspec_script" ]]; then - chmod +x "$openspec_script" 2>/dev/null || true - fi - local change_slug capability_slug init_output="" change_slug="$(resolve_openspec_change_slug "$branch")" capability_slug="$(resolve_openspec_capability_slug)" if ! init_output="$( cd "$wt" - bash "scripts/openspec/init-change-workspace.sh" "$change_slug" "$capability_slug" 2>&1 + run_guardex_cli internal run-shell changeInit "$change_slug" "$capability_slug" 2>&1 )"; then printf '%s\n' "$init_output" >&2 echo "[codex-agent] OpenSpec workspace initialization failed for change '${change_slug}'." >&2 @@ -632,11 +657,6 @@ worktree_has_changes() { claim_changed_files() { local wt="$1" local branch="$2" - local lock_script="${repo_root}/scripts/agent-file-locks.py" - - if [[ ! 
-x "$lock_script" ]]; then - return 0 - fi local changed_raw deleted_raw changed_raw="$({ @@ -647,7 +667,7 @@ claim_changed_files() { if [[ -n "$changed_raw" ]]; then mapfile -t changed_files < <(printf '%s\n' "$changed_raw") - python3 "$lock_script" claim --branch "$branch" "${changed_files[@]}" >/dev/null 2>&1 || true + run_guardex_cli locks claim --branch "$branch" "${changed_files[@]}" >/dev/null 2>&1 || true fi deleted_raw="$({ @@ -657,7 +677,7 @@ claim_changed_files() { if [[ -n "$deleted_raw" ]]; then mapfile -t deleted_files < <(printf '%s\n' "$deleted_raw") - python3 "$lock_script" allow-delete --branch "$branch" "${deleted_files[@]}" >/dev/null 2>&1 || true + run_guardex_cli locks allow-delete --branch "$branch" "${deleted_files[@]}" >/dev/null 2>&1 || true fi } @@ -806,7 +826,7 @@ run_finish_flow() { return 2 fi - if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then + if finish_output="$(run_guardex_cli branch finish "${finish_args[@]}" 2>&1)"; then printf '%s\n' "$finish_output" return 0 fi @@ -829,7 +849,7 @@ run_finish_flow() { fi ) - if finish_output="$(bash "${repo_root}/scripts/agent-branch-finish.sh" "${finish_args[@]}" 2>&1)"; then + if finish_output="$(run_guardex_cli branch finish "${finish_args[@]}" 2>&1)"; then printf '%s\n' "$finish_output" return 0 fi @@ -858,6 +878,19 @@ if ! 
ensure_openspec_plan_workspace "$worktree_path" "$worktree_branch"; then exit 1 fi +active_session_recorded=0 +cleanup_active_session_state_on_exit() { + set +e + if [[ "${active_session_recorded:-0}" -eq 1 && -n "${worktree_branch:-}" && "${worktree_branch:-}" != "HEAD" ]]; then + clear_active_session_state "$worktree_branch" + active_session_recorded=0 + fi +} + +record_active_session_state "$worktree_path" "$worktree_branch" +active_session_recorded=1 +trap cleanup_active_session_state_on_exit EXIT INT TERM + echo "[codex-agent] Launching ${CODEX_BIN} in sandbox: $worktree_path" cd "$worktree_path" set +e @@ -866,6 +899,8 @@ codex_exit="$?" set -e cd "$repo_root" +cleanup_active_session_state_on_exit +trap - EXIT INT TERM final_exit="$codex_exit" auto_finish_completed=0 @@ -903,18 +938,16 @@ if [[ "$AUTO_FINISH" -eq 1 && -n "$worktree_branch" && "$worktree_branch" != "HE fi fi -if [[ -x "${repo_root}/scripts/agent-worktree-prune.sh" ]]; then - echo "[codex-agent] Session ended (exit=${codex_exit}). Running worktree cleanup..." - prune_args=() - if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then - prune_args+=(--base "$BASE_BRANCH") - fi - if [[ "$AUTO_CLEANUP" -eq 1 && "$auto_finish_completed" -eq 1 ]]; then - prune_args+=(--only-dirty-worktrees --delete-branches --delete-remote-branches) - fi - if ! bash "${repo_root}/scripts/agent-worktree-prune.sh" "${prune_args[@]}"; then - echo "[codex-agent] Warning: automatic worktree cleanup failed." >&2 - fi +echo "[codex-agent] Session ended (exit=${codex_exit}). Running worktree cleanup..." +prune_args=() +if [[ "$BASE_BRANCH_EXPLICIT" -eq 1 ]]; then + prune_args+=(--base "$BASE_BRANCH") +fi +if [[ "$AUTO_CLEANUP" -eq 1 && "$auto_finish_completed" -eq 1 ]]; then + prune_args+=(--only-dirty-worktrees --delete-branches --delete-remote-branches) +fi +if ! run_guardex_cli worktree prune "${prune_args[@]}"; then + echo "[codex-agent] Warning: automatic worktree cleanup failed." >&2 fi if [[ ! 
-d "$worktree_path" ]]; then @@ -927,7 +960,7 @@ else echo "[codex-agent] Branch kept intentionally. Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" else print_takeover_prompt "$worktree_path" "$worktree_branch" - echo "[codex-agent] If finished, merge with: bash scripts/agent-branch-finish.sh --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge" + echo "[codex-agent] If finished, merge with: gx branch finish --branch \"${worktree_branch}\" --base dev --via-pr --wait-for-merge" echo "[codex-agent] Cleanup on demand: gx cleanup --branch \"${worktree_branch}\"" fi fi diff --git a/src/cli/args.js b/src/cli/args.js new file mode 100644 index 0000000..aa4d906 --- /dev/null +++ b/src/cli/args.js @@ -0,0 +1,837 @@ +const { + TARGETED_FORCEABLE_MANAGED_PATHS, + DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES, +} = require('../context'); + +function requireValue(rawArgs, index, flagName) { + const value = rawArgs[index + 1]; + if (!value || value.startsWith('-')) { + throw new Error(`${flagName} requires a value`); + } + return value; +} + +function normalizeManagedForcePath(rawPath) { + if (typeof rawPath !== 'string') { + return null; + } + const normalized = require('node:path').posix.normalize(rawPath.replace(/\\/g, '/')); + if ( + !normalized || + normalized === '.' || + normalized.startsWith('../') || + require('node:path').posix.isAbsolute(normalized) + ) { + return null; + } + return normalized.startsWith('./') ? 
normalized.slice(2) : normalized; +} + +function collectForceManagedPaths(rawArgs, startIndex) { + const forceManagedPaths = []; + let nextIndex = startIndex; + + while (nextIndex + 1 < rawArgs.length) { + const candidate = rawArgs[nextIndex + 1]; + if (!candidate || candidate.startsWith('-')) { + break; + } + const normalized = normalizeManagedForcePath(candidate); + if (!normalized || !TARGETED_FORCEABLE_MANAGED_PATHS.has(normalized)) { + throw new Error(`Unknown managed path after --force: ${candidate}`); + } + forceManagedPaths.push(normalized); + nextIndex += 1; + } + + return { forceManagedPaths, nextIndex }; +} + +function appendForceArgs(args, options) { + if (!options.force) { + return; + } + args.push('--force'); + for (const managedPath of options.forceManagedPaths || []) { + args.push(managedPath); + } +} + +function shouldForceManagedPath(options, relativePath) { + if (!options.force) { + return false; + } + const targetedPaths = Array.isArray(options.forceManagedPaths) ? options.forceManagedPaths : []; + if (targetedPaths.length === 0) { + return true; + } + const normalized = normalizeManagedForcePath(relativePath); + return normalized !== null && targetedPaths.includes(normalized); +} + +function parseCommonArgs(rawArgs, defaults, options = {}) { + const parsed = { ...defaults }; + const supportsForce = Object.prototype.hasOwnProperty.call(parsed, 'force'); + if (supportsForce && !Array.isArray(parsed.forceManagedPaths)) { + parsed.forceManagedPaths = []; + } + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target' || arg === '-t') { + parsed.target = requireValue(rawArgs, index, '--target'); + index += 1; + continue; + } + if (arg === '--dry-run') { + parsed.dryRun = true; + continue; + } + if (arg === '--skip-agents') { + parsed.skipAgents = true; + continue; + } + if (arg === '--skip-package-json') { + parsed.skipPackageJson = true; + continue; + } + if (arg === '--force') { + if 
(!supportsForce) { + throw new Error(`Unknown option: ${arg}`); + } + parsed.force = true; + const collected = collectForceManagedPaths(rawArgs, index); + if (collected.forceManagedPaths.length > 0) { + parsed.forceManagedPaths = Array.from( + new Set([...(parsed.forceManagedPaths || []), ...collected.forceManagedPaths]), + ); + } + index = collected.nextIndex; + continue; + } + if (arg === '--keep-stale-locks') { + parsed.dropStaleLocks = false; + continue; + } + if (arg === '--json') { + parsed.json = true; + continue; + } + if (arg === '--yes-global-install') { + parsed.yesGlobalInstall = true; + continue; + } + if (arg === '--no-global-install') { + parsed.noGlobalInstall = true; + continue; + } + if (arg === '--no-gitignore') { + parsed.skipGitignore = true; + continue; + } + if (arg === '--allow-protected-base-write') { + parsed.allowProtectedBaseWrite = true; + continue; + } + if (Object.prototype.hasOwnProperty.call(parsed, 'waitForMerge') && arg === '--wait-for-merge') { + parsed.waitForMerge = true; + continue; + } + if (Object.prototype.hasOwnProperty.call(parsed, 'waitForMerge') && arg === '--no-wait-for-merge') { + parsed.waitForMerge = false; + continue; + } + + throw new Error(`Unknown option: ${arg}`); + } + + if (!parsed.target) { + throw new Error('--target requires a path value'); + } + + return parsed; +} + +function parseRepoTraversalArgs(rawArgs, defaults, options = {}) { + const nestedRepoDefaultMaxDepth = options.nestedRepoDefaultMaxDepth ?? 
6; + const traversalDefaults = { + ...defaults, + recursive: true, + nestedMaxDepth: nestedRepoDefaultMaxDepth, + nestedSkipDirs: [], + includeSubmodules: false, + }; + const forwardedArgs = []; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--no-recursive' || arg === '--no-nested' || arg === '--single-repo') { + traversalDefaults.recursive = false; + continue; + } + if (arg === '--recursive' || arg === '--nested') { + traversalDefaults.recursive = true; + continue; + } + if (arg === '--max-depth') { + const raw = requireValue(rawArgs, index, '--max-depth'); + const parsed = Number.parseInt(raw, 10); + if (!Number.isFinite(parsed) || parsed < 1) { + throw new Error('--max-depth requires a positive integer'); + } + traversalDefaults.nestedMaxDepth = parsed; + index += 1; + continue; + } + if (arg === '--skip-nested') { + const raw = requireValue(rawArgs, index, '--skip-nested'); + traversalDefaults.nestedSkipDirs.push(raw); + index += 1; + continue; + } + if (arg === '--include-submodules') { + traversalDefaults.includeSubmodules = true; + continue; + } + forwardedArgs.push(arg); + } + + return parseCommonArgs(forwardedArgs, traversalDefaults, options); +} + +function parseSetupArgs(rawArgs, defaults, options = {}) { + const setupDefaults = { + ...defaults, + parentWorkspaceView: false, + }; + const forwardedArgs = []; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--parent-workspace-view') { + setupDefaults.parentWorkspaceView = true; + continue; + } + if (arg === '--no-parent-workspace-view') { + setupDefaults.parentWorkspaceView = false; + continue; + } + forwardedArgs.push(arg); + } + + return parseRepoTraversalArgs(forwardedArgs, setupDefaults, options); +} + +function parseDoctorArgs(rawArgs, options = {}) { + const doctorDefaults = { + target: process.cwd(), + force: false, + dropStaleLocks: true, + skipAgents: false, + skipPackageJson: 
false, + skipGitignore: false, + dryRun: false, + json: false, + allowProtectedBaseWrite: false, + waitForMerge: true, + verboseAutoFinish: false, + }; + const forwardedArgs = []; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--verbose-auto-finish') { + doctorDefaults.verboseAutoFinish = true; + continue; + } + if (arg === '--compact-auto-finish') { + doctorDefaults.verboseAutoFinish = false; + continue; + } + forwardedArgs.push(arg); + } + + return parseRepoTraversalArgs(forwardedArgs, doctorDefaults, options); +} + +function parseTargetFlag(rawArgs, defaultTarget = process.cwd()) { + const remaining = []; + let target = defaultTarget; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--target requires a path value'); + } + target = next; + index += 1; + continue; + } + remaining.push(arg); + } + + return { target, args: remaining }; +} + +function parseReviewArgs(rawArgs) { + const parsed = parseTargetFlag(rawArgs, process.cwd()); + const passthroughArgs = [...parsed.args]; + if (passthroughArgs[0] === 'start') { + passthroughArgs.shift(); + } + return { + target: parsed.target, + passthroughArgs, + }; +} + +function parseAgentsArgs(rawArgs) { + const parsed = parseTargetFlag(rawArgs, process.cwd()); + const [subcommandRaw = '', ...rest] = parsed.args; + const subcommand = subcommandRaw || 'status'; + const options = { + target: parsed.target, + subcommand, + reviewIntervalSeconds: 30, + cleanupIntervalSeconds: 60, + idleMinutes: DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES, + }; + + for (let index = 0; index < rest.length; index += 1) { + const arg = rest[index]; + if (arg === '--review-interval') { + const next = rest[index + 1]; + if (!next) { + throw new Error('--review-interval requires an integer seconds value'); + } + const parsedValue = Number.parseInt(next, 10); + if 
(!Number.isInteger(parsedValue) || parsedValue < 5) { + throw new Error('--review-interval must be an integer >= 5 seconds'); + } + options.reviewIntervalSeconds = parsedValue; + index += 1; + continue; + } + if (arg === '--cleanup-interval') { + const next = rest[index + 1]; + if (!next) { + throw new Error('--cleanup-interval requires an integer seconds value'); + } + const parsedValue = Number.parseInt(next, 10); + if (!Number.isInteger(parsedValue) || parsedValue < 5) { + throw new Error('--cleanup-interval must be an integer >= 5 seconds'); + } + options.cleanupIntervalSeconds = parsedValue; + index += 1; + continue; + } + if (arg === '--idle-minutes') { + const next = rest[index + 1]; + if (!next) { + throw new Error('--idle-minutes requires an integer minutes value'); + } + const parsedValue = Number.parseInt(next, 10); + if (!Number.isInteger(parsedValue) || parsedValue < 1) { + throw new Error('--idle-minutes must be an integer >= 1'); + } + options.idleMinutes = parsedValue; + index += 1; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + if (!['start', 'stop', 'status'].includes(options.subcommand)) { + throw new Error(`Unknown agents subcommand: ${options.subcommand}`); + } + + return options; +} + +function parseReportArgs(rawArgs) { + const options = { + target: process.cwd(), + subcommand: '', + repo: '', + scorecardJson: '', + outputDir: '', + date: '', + dryRun: false, + json: false, + }; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) throw new Error('--target requires a path value'); + options.target = next; + index += 1; + continue; + } + if (arg === '--repo') { + const next = rawArgs[index + 1]; + if (!next) throw new Error('--repo requires a value like github.com/owner/repo'); + options.repo = next; + index += 1; + continue; + } + if (arg === '--scorecard-json') { + const next = rawArgs[index + 1]; + if 
(!next) throw new Error('--scorecard-json requires a path value'); + options.scorecardJson = next; + index += 1; + continue; + } + if (arg === '--output-dir') { + const next = rawArgs[index + 1]; + if (!next) throw new Error('--output-dir requires a path value'); + options.outputDir = next; + index += 1; + continue; + } + if (arg === '--date') { + const next = rawArgs[index + 1]; + if (!next) throw new Error('--date requires a YYYY-MM-DD value'); + options.date = next; + index += 1; + continue; + } + if (arg === '--dry-run') { + options.dryRun = true; + continue; + } + if (arg === '--json') { + options.json = true; + continue; + } + if (arg.startsWith('-')) { + throw new Error(`Unknown option: ${arg}`); + } + if (!options.subcommand) { + options.subcommand = arg; + continue; + } + throw new Error(`Unexpected argument: ${arg}`); + } + + return options; +} + +function parseSyncArgs(rawArgs) { + const options = { + target: process.cwd(), + check: false, + base: '', + strategy: '', + ffOnly: false, + dryRun: false, + json: false, + allAgentBranches: false, + allowNonAgent: false, + allowDirty: false, + }; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--target requires a path value'); + } + options.target = next; + index += 1; + continue; + } + if (arg === '--base') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--base requires a branch value'); + } + options.base = next; + index += 1; + continue; + } + if (arg === '--strategy') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--strategy requires a value (rebase|merge)'); + } + options.strategy = next; + index += 1; + continue; + } + if (arg === '--check') { + options.check = true; + continue; + } + if (arg === '--ff-only') { + options.ffOnly = true; + continue; + } + if (arg === '--dry-run') { + options.dryRun = true; + continue; + } 
+ if (arg === '--json') { + options.json = true; + continue; + } + if (arg === '--all-agent-branches') { + options.allAgentBranches = true; + continue; + } + if (arg === '--allow-non-agent') { + options.allowNonAgent = true; + continue; + } + if (arg === '--allow-dirty') { + options.allowDirty = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + return options; +} + +function parseCleanupArgs(rawArgs) { + const options = { + target: process.cwd(), + base: '', + branch: '', + dryRun: false, + forceDirty: false, + keepRemote: false, + keepCleanWorktrees: false, + includePrMerged: false, + idleMinutes: 0, + watch: false, + intervalSeconds: 60, + once: false, + maxBranches: 0, + }; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--target requires a path value'); + } + options.target = next; + index += 1; + continue; + } + if (arg === '--base') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--base requires a branch value'); + } + options.base = next; + index += 1; + continue; + } + if (arg === '--branch') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--branch requires an agent branch value'); + } + options.branch = next; + index += 1; + continue; + } + if (arg === '--dry-run') { + options.dryRun = true; + continue; + } + if (arg === '--force-dirty') { + options.forceDirty = true; + continue; + } + if (arg === '--keep-remote') { + options.keepRemote = true; + continue; + } + if (arg === '--keep-clean-worktrees') { + options.keepCleanWorktrees = true; + continue; + } + if (arg === '--include-pr-merged') { + options.includePrMerged = true; + continue; + } + if (arg === '--idle-minutes') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--idle-minutes requires an integer value'); + } + const parsed = Number.parseInt(next, 10); + if 
(!Number.isInteger(parsed) || parsed < 0) { + throw new Error('--idle-minutes must be an integer >= 0'); + } + options.idleMinutes = parsed; + index += 1; + continue; + } + if (arg === '--watch') { + options.watch = true; + continue; + } + if (arg === '--interval') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--interval requires an integer seconds value'); + } + const parsed = Number.parseInt(next, 10); + if (!Number.isInteger(parsed) || parsed < 5) { + throw new Error('--interval must be an integer >= 5 seconds'); + } + options.intervalSeconds = parsed; + index += 1; + continue; + } + if (arg === '--once') { + options.once = true; + continue; + } + if (arg === '--max-branches') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--max-branches requires an integer value'); + } + const parsed = Number.parseInt(next, 10); + if (!Number.isInteger(parsed) || parsed < 1) { + throw new Error('--max-branches must be an integer >= 1'); + } + options.maxBranches = parsed; + index += 1; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + if (options.watch && options.idleMinutes === 0) { + options.idleMinutes = DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES; + } + + return options; +} + +function parseMergeArgs(rawArgs) { + const options = { + target: process.cwd(), + base: '', + into: '', + branches: [], + task: '', + agent: '', + }; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--target requires a path value'); + } + options.target = next; + index += 1; + continue; + } + if (arg === '--base') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--base requires a branch value'); + } + options.base = next; + index += 1; + continue; + } + if (arg === '--into') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--into requires an agent/* branch 
value'); + } + options.into = next; + index += 1; + continue; + } + if (arg === '--branch') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--branch requires an agent/* branch value'); + } + options.branches.push(next); + index += 1; + continue; + } + if (arg === '--task') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--task requires a task value'); + } + options.task = next; + index += 1; + continue; + } + if (arg === '--agent') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--agent requires an agent value'); + } + options.agent = next; + index += 1; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + if (options.branches.length === 0) { + throw new Error('merge requires at least one --branch input'); + } + + return options; +} + +function parseFinishArgs(rawArgs, defaults = {}) { + const options = { + target: process.cwd(), + base: '', + branch: '', + all: false, + dryRun: false, + waitForMerge: defaults.waitForMerge ?? true, + cleanup: defaults.cleanup ?? 
true, + keepRemote: false, + noAutoCommit: false, + failFast: false, + commitMessage: '', + mergeMode: defaults.mergeMode || 'pr', + }; + + for (let index = 0; index < rawArgs.length; index += 1) { + const arg = rawArgs[index]; + if (arg === '--target') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--target requires a path value'); + } + options.target = next; + index += 1; + continue; + } + if (arg === '--base') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--base requires a branch value'); + } + options.base = next; + index += 1; + continue; + } + if (arg === '--branch') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--branch requires an agent/* branch value'); + } + options.branch = next; + index += 1; + continue; + } + if (arg === '--commit-message') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--commit-message requires a value'); + } + options.commitMessage = next; + index += 1; + continue; + } + if (arg === '--all') { + options.all = true; + continue; + } + if (arg === '--dry-run') { + options.dryRun = true; + continue; + } + if (arg === '--wait-for-merge') { + options.waitForMerge = true; + continue; + } + if (arg === '--no-wait-for-merge') { + options.waitForMerge = false; + continue; + } + if (arg === '--via-pr') { + options.mergeMode = 'pr'; + continue; + } + if (arg === '--direct-only') { + options.mergeMode = 'direct'; + continue; + } + if (arg === '--mode') { + const next = rawArgs[index + 1]; + if (!next) { + throw new Error('--mode requires a value'); + } + if (!['auto', 'direct', 'pr'].includes(next)) { + throw new Error(`Invalid --mode value: ${next} (expected auto|direct|pr)`); + } + options.mergeMode = next; + index += 1; + continue; + } + if (arg === '--cleanup') { + options.cleanup = true; + continue; + } + if (arg === '--no-cleanup') { + options.cleanup = false; + continue; + } + if (arg === '--keep-remote') { + options.keepRemote = true; + 
continue;
+    }
+    if (arg === '--no-auto-commit') {
+      options.noAutoCommit = true;
+      continue;
+    }
+    if (arg === '--fail-fast') {
+      options.failFast = true;
+      continue;
+    }
+    throw new Error(`Unknown option: ${arg}`);
+  }
+
+  if (options.branch && !options.branch.startsWith('agent/')) {
+    throw new Error(`--branch must reference an agent/* branch (received: ${options.branch})`);
+  }
+
+  return options;
+}
+
+module.exports = {
+  requireValue,
+  normalizeManagedForcePath,
+  collectForceManagedPaths,
+  appendForceArgs,
+  shouldForceManagedPath,
+  parseCommonArgs,
+  parseRepoTraversalArgs,
+  parseSetupArgs,
+  parseDoctorArgs,
+  parseTargetFlag,
+  parseReviewArgs,
+  parseAgentsArgs,
+  parseReportArgs,
+  parseSyncArgs,
+  parseCleanupArgs,
+  parseMergeArgs,
+  parseFinishArgs,
+};
diff --git a/src/cli/dispatch.js b/src/cli/dispatch.js
new file mode 100644
index 0000000..63ec94e
--- /dev/null
+++ b/src/cli/dispatch.js
@@ -0,0 +1,86 @@
+const {
+  TOOL_NAME,
+  COMMAND_TYPO_ALIASES,
+  DEPRECATED_COMMAND_ALIASES,
+  SUGGESTIBLE_COMMANDS,
+} = require('../context');
+
+function levenshteinDistance(a, b) {
+  const rows = a.length + 1;
+  const cols = b.length + 1;
+  const matrix = Array.from({ length: rows }, () => Array(cols).fill(0));
+
+  for (let i = 0; i < rows; i += 1) matrix[i][0] = i;
+  for (let j = 0; j < cols; j += 1) matrix[0][j] = j;
+
+  for (let i = 1; i < rows; i += 1) {
+    for (let j = 1; j < cols; j += 1) {
+      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
+      matrix[i][j] = Math.min(
+        matrix[i - 1][j] + 1,
+        matrix[i][j - 1] + 1,
+        matrix[i - 1][j - 1] + cost,
+      );
+    }
+  }
+  return matrix[a.length][b.length];
+}
+
+function maybeSuggestCommand(command) {
+  let best = null;
+  let bestDistance = Number.POSITIVE_INFINITY;
+
+  for (const candidate of SUGGESTIBLE_COMMANDS) {
+    const dist = levenshteinDistance(command, candidate);
+    if (dist < bestDistance) {
+      bestDistance = dist;
+      best = candidate;
+    }
+  }
+
+  if (best && bestDistance <= 2) {
+    return best;
+  }
+
+  return null;
+}
+
+function normalizeCommandOrThrow(command) {
+  if (COMMAND_TYPO_ALIASES.has(command)) {
+    const mapped = COMMAND_TYPO_ALIASES.get(command);
+    console.log(`[${TOOL_NAME}] Interpreting '${command}' as '${mapped}'.`);
+    return mapped;
+  }
+  return command;
+}
+
+function warnDeprecatedAlias(aliasName) {
+  const entry = DEPRECATED_COMMAND_ALIASES.get(aliasName);
+  if (!entry) return;
+  console.error(
+    `[${TOOL_NAME}] '${aliasName}' is deprecated and will be removed in a future major release. ` +
+      `Use: ${entry.hint}`,
+  );
+}
+
+function extractFlag(args, ...names) {
+  const flagSet = new Set(names);
+  let found = false;
+  const remaining = [];
+  for (const arg of args) {
+    if (flagSet.has(arg)) {
+      found = true;
+    } else {
+      remaining.push(arg);
+    }
+  }
+  return { found, remaining };
+}
+
+module.exports = {
+  levenshteinDistance,
+  maybeSuggestCommand,
+  normalizeCommandOrThrow,
+  warnDeprecatedAlias,
+  extractFlag,
+};
diff --git a/src/cli/main.js b/src/cli/main.js
new file mode 100644
index 0000000..2eb1d97
--- /dev/null
+++ b/src/cli/main.js
@@ -0,0 +1,6150 @@
+#!/usr/bin/env node
+
+const fs = require('node:fs');
+const os = require('node:os');
+const path = require('node:path');
+const cp = require('node:child_process');
+const outputModule = require('../output');
+const runtimeModule = require('../core/runtime');
+const gitModule = require('../git');
+const scaffoldModule = require('../scaffold');
+const hooksModule = require('../hooks');
+const sandboxModule = require('../sandbox');
+const toolchainModule = require('../toolchain');
+const finishModule = require('../finish');
+const cliArgsModule = require('./args');
+const cliDispatchModule = require('./dispatch');
+
+let sandboxApi;
+let toolchainApi;
+let finishApi;
+
+const PACKAGE_ROOT = path.resolve(__dirname, '../..');
+const CLI_ENTRY_PATH = path.join(PACKAGE_ROOT, 'bin', 'multiagent-safety.js');
+const packageJsonPath = path.join(PACKAGE_ROOT, 'package.json');
+const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8'));
+
+const TOOL_NAME = 'gitguardex';
+const SHORT_TOOL_NAME = 'gx';
+if (!process.env.GUARDEX_CLI_ENTRY) {
+  process.env.GUARDEX_CLI_ENTRY = CLI_ENTRY_PATH;
+}
+if (!process.env.GUARDEX_NODE_BIN) {
+  process.env.GUARDEX_NODE_BIN = process.execPath;
+}
+const LEGACY_NAMES = ['guardex', 'multiagent-safety'];
+const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`;
+const OPENSPEC_PACKAGE = '@fission-ai/openspec';
+const OMC_PACKAGE =
'oh-my-claude-sisyphus'; +const OMC_REPO_URL = 'https://github.com/Yeachan-Heo/oh-my-claudecode'; +const CAVEMEM_PACKAGE = 'cavemem'; +const NPX_BIN = process.env.GUARDEX_NPX_BIN || 'npx'; +const GUARDEX_HOME_DIR = path.resolve(process.env.GUARDEX_HOME_DIR || os.homedir()); +const GLOBAL_TOOLCHAIN_SERVICES = [ + { name: 'oh-my-codex', packageName: 'oh-my-codex' }, + { + name: 'oh-my-claudecode', + packageName: OMC_PACKAGE, + dependencyUrl: OMC_REPO_URL, + }, + { name: OPENSPEC_PACKAGE, packageName: OPENSPEC_PACKAGE }, + { name: CAVEMEM_PACKAGE, packageName: CAVEMEM_PACKAGE }, + { + name: '@imdeadpool/codex-account-switcher', + packageName: '@imdeadpool/codex-account-switcher', + }, +]; +const GLOBAL_TOOLCHAIN_PACKAGES = [ + ...GLOBAL_TOOLCHAIN_SERVICES.map((service) => service.packageName), +]; +const OPTIONAL_LOCAL_COMPANION_TOOLS = [ + { + name: 'cavekit', + candidatePaths: [ + '.cavekit/plugin.json', + '.codex/local-marketplaces/cavekit/.agents/plugins/marketplace.json', + ], + installCommand: `${NPX_BIN} skills add JuliusBrussee/cavekit`, + installArgs: ['skills', 'add', 'JuliusBrussee/cavekit'], + }, + { + name: 'caveman', + candidatePaths: [ + '.config/caveman/config.json', + '.cavekit/skills/caveman/SKILL.md', + ], + installCommand: `${NPX_BIN} skills add JuliusBrussee/caveman`, + installArgs: ['skills', 'add', 'JuliusBrussee/caveman'], + }, +]; +const GH_BIN = process.env.GUARDEX_GH_BIN || 'gh'; +const REQUIRED_SYSTEM_TOOLS = [ + { + name: 'gh', + displayName: 'GitHub (gh)', + command: GH_BIN, + installHint: 'https://cli.github.com/', + }, +]; +const MAINTAINER_RELEASE_REPO = path.resolve( + process.env.GUARDEX_RELEASE_REPO || PACKAGE_ROOT, +); +const NPM_BIN = process.env.GUARDEX_NPM_BIN || 'npm'; +const OPENSPEC_BIN = process.env.GUARDEX_OPENSPEC_BIN || 'openspec'; +const SCORECARD_BIN = process.env.GUARDEX_SCORECARD_BIN || 'scorecard'; +const GIT_PROTECTED_BRANCHES_KEY = 'multiagent.protectedBranches'; +const GIT_BASE_BRANCH_KEY = 
'multiagent.baseBranch'; +const GIT_SYNC_STRATEGY_KEY = 'multiagent.sync.strategy'; +const GUARDEX_REPO_TOGGLE_ENV = 'GUARDEX_ON'; +const DEFAULT_PROTECTED_BRANCHES = ['dev', 'main', 'master']; +const DEFAULT_BASE_BRANCH = 'dev'; +const DEFAULT_SYNC_STRATEGY = 'rebase'; +const DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES = 60; +const COMPOSE_HINT_FILES = [ + 'docker-compose.yml', + 'docker-compose.yaml', + 'compose.yml', + 'compose.yaml', +]; + +const TEMPLATE_ROOT = path.join(PACKAGE_ROOT, 'templates'); + +const HOOK_NAMES = ['pre-commit', 'pre-push', 'post-merge', 'post-checkout']; + +const TEMPLATE_FILES = [ + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/guardex-env.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'github/pull.yml.example', + 'github/workflows/cr.yml', + 'vscode/guardex-active-agents/package.json', + 'vscode/guardex-active-agents/extension.js', + 'vscode/guardex-active-agents/session-schema.js', + 'vscode/guardex-active-agents/README.md', +]; + +const LEGACY_WORKFLOW_SHIM_SPECS = [ + { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, + { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, + { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, + { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, + { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, + { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, + { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, + { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, + { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 
'run-shell', 'changeInit'] }, +]; + +const LEGACY_WORKFLOW_SHIMS = LEGACY_WORKFLOW_SHIM_SPECS.map((entry) => entry.relativePath); + +const MANAGED_TEMPLATE_DESTINATIONS = TEMPLATE_FILES.map((entry) => toDestinationPath(entry)); +const MANAGED_TEMPLATE_SCRIPT_FILES = MANAGED_TEMPLATE_DESTINATIONS.filter((entry) => + entry.startsWith('scripts/'), +); + +const LEGACY_MANAGED_REPO_FILES = [ + ...LEGACY_WORKFLOW_SHIMS, + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'scripts/guardex-env.sh', + 'scripts/install-agent-git-hooks.sh', + '.githooks/pre-commit', + '.githooks/pre-push', + '.githooks/post-merge', + '.githooks/post-checkout', + '.codex/skills/gitguardex/SKILL.md', + '.codex/skills/guardex-merge-skills-to-dev/SKILL.md', + '.claude/commands/gitguardex.md', +]; + +const REQUIRED_MANAGED_REPO_FILES = [ + ...MANAGED_TEMPLATE_DESTINATIONS, + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), + '.omx/state/agent-file-locks.json', +]; + +const LEGACY_MANAGED_PACKAGE_SCRIPTS = { + 'agent:codex': 'bash ./scripts/codex-agent.sh', + 'agent:branch:start': 'bash ./scripts/agent-branch-start.sh', + 'agent:branch:finish': 'bash ./scripts/agent-branch-finish.sh', + 'agent:branch:merge': 'bash ./scripts/agent-branch-merge.sh', + 'agent:cleanup': 'gx cleanup', + 'agent:hooks:install': 'bash ./scripts/install-agent-git-hooks.sh', + 'agent:locks:claim': 'python3 ./scripts/agent-file-locks.py claim', + 'agent:locks:allow-delete': 'python3 ./scripts/agent-file-locks.py allow-delete', + 'agent:locks:release': 'python3 ./scripts/agent-file-locks.py release', + 'agent:locks:status': 'python3 ./scripts/agent-file-locks.py status', + 'agent:plan:init': 'bash ./scripts/openspec/init-plan-workspace.sh', + 'agent:change:init': 'bash ./scripts/openspec/init-change-workspace.sh', + 'agent:protect:list': 'gx protect list', + 'agent:branch:sync': 'gx sync', + 'agent:branch:sync:check': 'gx sync 
--check', + 'agent:safety:setup': 'gx setup', + 'agent:safety:scan': 'gx status --strict', + 'agent:safety:fix': 'gx setup --repair', + 'agent:safety:doctor': 'gx doctor', + 'agent:docker:load': 'bash ./scripts/guardex-docker-loader.sh', + 'agent:review:watch': 'bash ./scripts/review-bot-watch.sh', + 'agent:finish': 'gx finish --all', +}; + +const PACKAGE_SCRIPT_ASSETS = { + branchStart: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-start.sh'), + branchFinish: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-finish.sh'), + branchMerge: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-merge.sh'), + codexAgent: path.join(TEMPLATE_ROOT, 'scripts', 'codex-agent.sh'), + reviewBot: path.join(TEMPLATE_ROOT, 'scripts', 'review-bot-watch.sh'), + worktreePrune: path.join(TEMPLATE_ROOT, 'scripts', 'agent-worktree-prune.sh'), + lockTool: path.join(TEMPLATE_ROOT, 'scripts', 'agent-file-locks.py'), + planInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-plan-workspace.sh'), + changeInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-change-workspace.sh'), +}; + +const USER_LEVEL_SKILL_ASSETS = [ + { + source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'gitguardex', 'SKILL.md'), + destination: path.join('.codex', 'skills', 'gitguardex', 'SKILL.md'), + }, + { + source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'), + destination: path.join('.codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'), + }, + { + source: path.join(TEMPLATE_ROOT, 'claude', 'commands', 'gitguardex.md'), + destination: path.join('.claude', 'commands', 'gitguardex.md'), + }, +]; + +const EXECUTABLE_RELATIVE_PATHS = new Set([ + ...MANAGED_TEMPLATE_SCRIPT_FILES, + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), +]); + +const CRITICAL_GUARDRAIL_PATHS = new Set([ + 'AGENTS.md', + ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)), + 'scripts/guardex-env.sh', +]); + +const LOCK_FILE_RELATIVE = 
'.omx/state/agent-file-locks.json';
+const AGENTS_BOTS_STATE_RELATIVE = '.omx/state/agents-bots.json';
+const AGENTS_MARKER_START = '';
+const AGENTS_MARKER_END = '';
+const GITIGNORE_MARKER_START = '# multiagent-safety:START';
+const GITIGNORE_MARKER_END = '# multiagent-safety:END';
+const CODEX_WORKTREE_RELATIVE_DIR = path.join('.omx', 'agent-worktrees');
+const CLAUDE_WORKTREE_RELATIVE_DIR = path.join('.omc', 'agent-worktrees');
+const AGENT_WORKTREE_RELATIVE_DIRS = [
+  CODEX_WORKTREE_RELATIVE_DIR,
+  CLAUDE_WORKTREE_RELATIVE_DIR,
+];
+const MANAGED_GITIGNORE_PATHS = [
+  '.omx/',
+  '.omc/',
+  'scripts/agent-session-state.js',
+  'scripts/guardex-docker-loader.sh',
+  'scripts/guardex-env.sh',
+  'scripts/install-vscode-active-agents-extension.js',
+  '.githooks',
+  'oh-my-codex/',
+  LOCK_FILE_RELATIVE,
+];
+const REPO_SCAFFOLD_DIRECTORIES = ['bin'];
+const OMX_SCAFFOLD_DIRECTORIES = [
+  '.omx',
+  '.omx/state',
+  '.omx/logs',
+  '.omx/plans',
+  CODEX_WORKTREE_RELATIVE_DIR,
+  '.omc',
+  CLAUDE_WORKTREE_RELATIVE_DIR,
+];
+const OMX_SCAFFOLD_FILES = new Map([
+  ['.omx/notepad.md', '\n\n## WORKING MEMORY\n'],
+  ['.omx/project-memory.json', '{}\n'],
+]);
+const TARGETED_FORCEABLE_MANAGED_PATHS = new Set([
+  'AGENTS.md',
+  '.gitignore',
+  ...Array.from(OMX_SCAFFOLD_FILES.keys()),
+  ...REQUIRED_MANAGED_REPO_FILES,
+  ...LEGACY_WORKFLOW_SHIMS,
+]);
+const COMMAND_TYPO_ALIASES = new Map([
+  ['relaese', 'release'],
+  ['realaese', 'release'],
+  ['relase', 'release'],
+  ['setpu', 'setup'],
+  ['inti', 'init'],
+  ['intsall', 'install'],
+  ['docter', 'doctor'],
+  ['doctro', 'doctor'],
+  ['cleunup', 'cleanup'],
+  ['scna', 'scan'],
+]);
+const SUGGESTIBLE_COMMANDS = [
+  'status',
+  'setup',
+  'doctor',
+  'branch',
+  'locks',
+  'worktree',
+  'hook',
+  'migrate',
+  'install-agent-skills',
+  'agents',
+  'merge',
+  'finish',
+  'report',
+  'protect',
+  'sync',
+  'cleanup',
+  'prompt',
+  'help',
+  'version',
+  // deprecated aliases still routable with a warning
+  'init',
+  'install',
+  'fix',
+  'scan',
+  'review',
+  'copy-prompt',
+  'copy-commands',
+  'print-agents-snippet',
+  'release',
+];
+const CLI_COMMAND_DESCRIPTIONS = [
+  ['status', 'Show GitGuardex CLI + service health without modifying files'],
+  ['setup', 'Install, repair, and verify guardrails (flags: --repair, --install-only, --target)'],
+  ['doctor', 'Repair drift + verify (auto-sandboxes on protected main)'],
+  ['branch', 'CLI-owned branch workflow surface (start/finish/merge)'],
+  ['locks', 'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'],
+  ['worktree', 'CLI-owned worktree cleanup surface (prune)'],
+  ['hook', 'Hook dispatch/install surface used by managed shims'],
+  ['migrate', 'Convert legacy repo-local installs to the zero-copy CLI-owned surface'],
+  ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'],
+  ['protect', 'Manage protected branches (list/add/remove/set/reset)'],
+  ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'],
+  ['sync', 'Sync agent branches with origin/'],
+  ['finish', 'Commit + PR + merge completed agent branches (--all, --branch)'],
+  ['cleanup', 'Prune merged/stale agent branches and worktrees'],
+  ['release', 'Create or update the current GitHub release with README-generated notes'],
+  ['agents', 'Start/stop repo-scoped review + cleanup bots'],
+  ['prompt', 'Print AI setup checklist (--exec, --snippet)'],
+  ['report', 'Security/safety reports (e.g. OpenSSF scorecard)'],
+  ['help', 'Show this help output'],
+  ['version', 'Print GitGuardex version'],
+];
+const DEPRECATED_COMMAND_ALIASES = new Map([
+  ['init', { target: 'setup', hint: 'gx setup' }],
+  ['install', { target: 'setup', hint: 'gx setup --install-only' }],
+  ['fix', { target: 'setup', hint: 'gx setup --repair' }],
+  ['scan', { target: 'status', hint: 'gx status --strict' }],
+  ['copy-prompt', { target: 'prompt', hint: 'gx prompt' }],
+  ['copy-commands', { target: 'prompt', hint: 'gx prompt --exec' }],
+  ['print-agents-snippet', { target: 'prompt', hint: 'gx prompt --snippet' }],
+  ['review', { target: 'agents', hint: 'gx agents start (runs review + cleanup)' }],
+]);
+const AGENT_BOT_DESCRIPTIONS = [
+  ['agents', 'Start/stop review + cleanup bots for this repo'],
+];
+const DOCTOR_AUTO_FINISH_DETAIL_LIMIT = 6;
+const DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX = 72;
+const DOCTOR_AUTO_FINISH_MESSAGE_MAX = 160;
+
+function envFlagIsTruthy(raw) {
+  const lowered = String(raw || '').trim().toLowerCase();
+  return lowered === '1' || lowered === 'true' || lowered === 'yes' || lowered === 'on';
+}
+
+function isClaudeCodeSession(env = process.env) {
+  return envFlagIsTruthy(env.CLAUDECODE) || Boolean(env.CLAUDE_CODE_SESSION_ID);
+}
+
+function defaultAgentWorktreeRelativeDir(env = process.env) {
+  return isClaudeCodeSession(env) ? CLAUDE_WORKTREE_RELATIVE_DIR : CODEX_WORKTREE_RELATIVE_DIR;
+}
+
+const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in this repo.
+ +1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version +2) Bootstrap: gx setup +3) Repair: gx doctor +4) Task loop: gx branch start "" "" + then gx locks claim --branch "" -> gx branch finish +5) Integrate: gx merge --branch --branch +6) Finish: gx finish --all +7) Cleanup: gx cleanup +8) OpenSpec: /opsx:propose -> /opsx:apply -> /opsx:archive +9) Optional: gx protect add release staging +10) Optional: gx sync --check && gx sync +11) Review bot: install https://github.com/apps/cr-gpt + set OPENAI_API_KEY +12) Fork sync: install https://github.com/apps/pull + cp .github/pull.yml.example .github/pull.yml +`; + +const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND} +gh --version +gx setup +gx doctor +gx branch start "" "" +gx locks claim --branch "" +gx merge --branch "" --branch "" +gx finish --all +gx cleanup +gx protect add release staging +gx sync --check && gx sync +`; + +const SCORECARD_RISK_BY_CHECK = { + 'Dangerous-Workflow': 'Critical', + 'Code-Review': 'High', + Maintained: 'High', + 'Binary-Artifacts': 'High', + 'Dependency-Update-Tool': 'High', + 'Token-Permissions': 'High', + Vulnerabilities: 'High', + 'Branch-Protection': 'High', + Fuzzing: 'Medium', + 'Pinned-Dependencies': 'Medium', + SAST: 'Medium', + 'Security-Policy': 'Medium', + 'CII-Best-Practices': 'Low', + Contributors: 'Low', + License: 'Low', +}; + +function getSandboxApi() { + if (!sandboxApi) { + sandboxApi = sandboxModule.createSandboxApi({ + protectedBaseWriteBlock, + runInstallInternal, + ensureSetupProtectedBranches, + ensureParentWorkspaceView, + buildParentWorkspaceView, + runFixInternal, + }); + } + return sandboxApi; +} + +function getToolchainApi() { + if (!toolchainApi) { + toolchainApi = toolchainModule.createToolchainApi({ + TOOL_NAME, + NPM_BIN, + NPX_BIN, + packageJson, + OPENSPEC_PACKAGE, + OPENSPEC_BIN, + GLOBAL_TOOLCHAIN_PACKAGES, + parseAutoApproval, + isInteractiveTerminal, + promptYesNoStrict, + run, + checkForGuardexUpdate, + printUpdateAvailableBanner, + 
readInstalledGuardexVersion, + restartIntoUpdatedGuardex, + checkForOpenSpecPackageUpdate, + printOpenSpecUpdateAvailableBanner, + resolveGlobalInstallApproval, + detectGlobalToolchainPackages, + detectOptionalLocalCompanionTools, + formatGlobalToolchainServiceName, + askGlobalInstallForMissing, + }); + } + return toolchainApi; +} + +function getFinishApi() { + if (!finishApi) { + finishApi = finishModule.createFinishApi({ + TOOL_NAME, + LOCK_FILE_RELATIVE, + path, + fs, + run, + runPackageAsset, + resolveRepoRoot, + parseCleanupArgs, + parseMergeArgs, + parseFinishArgs, + parseSyncArgs, + listAgentWorktrees, + listLocalAgentBranchesForFinish, + uniquePreserveOrder, + branchExists, + resolveFinishBaseBranch, + worktreeHasLocalChanges, + branchMergedIntoBase, + autoCommitWorktreeForFinish, + resolveBaseBranch, + resolveSyncStrategy, + ensureOriginBaseRef, + gitRun, + currentBranchName, + workingTreeIsDirty, + aheadBehind, + lockRegistryStatus, + syncOperation, + }); + } + return finishApi; +} + +function runtimeVersion() { + return outputModule.runtimeVersion(); +} + +function supportsAnsiColors() { + return outputModule.supportsAnsiColors(); +} + +function colorize(text, colorCode) { + return outputModule.colorize(text, colorCode); +} + +function doctorOutputColorCode(status) { + return outputModule.doctorOutputColorCode(status); +} + +function colorizeDoctorOutput(text, status) { + return outputModule.colorizeDoctorOutput(text, status); +} + +function detectAutoFinishDetailStatus(detail) { + return outputModule.detectAutoFinishDetailStatus(detail); +} + +function detectAutoFinishSummaryStatus(summary) { + return outputModule.detectAutoFinishSummaryStatus(summary); +} + +function statusDot(status) { + return outputModule.statusDot(status); +} + +function commandCatalogLines(indent = ' ') { + return outputModule.commandCatalogLines(indent); +} + +function agentBotCatalogLines(indent = ' ') { + return outputModule.agentBotCatalogLines(indent); +} + +function 
repoToggleLines(indent = ' ') { + return outputModule.repoToggleLines(indent); +} + +function printToolLogsSummary() { + return outputModule.printToolLogsSummary(); +} + +function usage(options = {}) { + return outputModule.usage(options); +} + +function run(cmd, args, options = {}) { + return runtimeModule.run(cmd, args, options); +} + +function extractTargetedArgs(rawArgs, defaultTarget = process.cwd()) { + return runtimeModule.extractTargetedArgs(rawArgs, defaultTarget); +} + +function packageAssetEnv(extraEnv = {}) { + return runtimeModule.packageAssetEnv(extraEnv); +} + +function packageAssetPath(assetKey) { + return runtimeModule.packageAssetPath(assetKey); +} + +function runPackageAsset(assetKey, rawArgs, options = {}) { + return runtimeModule.runPackageAsset(assetKey, rawArgs, options); +} + +function repoLocalLegacyScriptPath(repoRoot, relativePath) { + return runtimeModule.repoLocalLegacyScriptPath(repoRoot, relativePath); +} + +function runReviewBotCommand(repoRoot, rawArgs, options = {}) { + return runtimeModule.runReviewBotCommand(repoRoot, rawArgs, options); +} + +function invokePackageAsset(assetKey, rawArgs, options = {}) { + return runtimeModule.invokePackageAsset(assetKey, rawArgs, options); +} + +function formatElapsedDuration(ms) { + return outputModule.formatElapsedDuration(ms); +} + +function truncateMiddle(value, maxLength) { + return outputModule.truncateMiddle(value, maxLength); +} + +function truncateTail(value, maxLength) { + return outputModule.truncateTail(value, maxLength); +} + +function compactAutoFinishPathSegments(message) { + return outputModule.compactAutoFinishPathSegments(message); +} + +function detectRecoverableAutoFinishConflict(message) { + return outputModule.detectRecoverableAutoFinishConflict(message); +} + +function summarizeAutoFinishDetail(detail) { + return outputModule.summarizeAutoFinishDetail(detail); +} + +function printAutoFinishSummary(summary, options = {}) { + return 
outputModule.printAutoFinishSummary(summary, options); +} + +function gitRun(repoRoot, args, { allowFailure = false } = {}) { + return gitModule.gitRun(repoRoot, args, { allowFailure }); +} + +function resolveRepoRoot(targetPath) { + return gitModule.resolveRepoRoot(targetPath); +} + +function isGitRepo(targetPath) { + return gitModule.isGitRepo(targetPath); +} + +const NESTED_REPO_DEFAULT_MAX_DEPTH = gitModule.DEFAULT_NESTED_REPO_MAX_DEPTH; +function discoverNestedGitRepos(rootPath, opts = {}) { + return gitModule.discoverNestedGitRepos(rootPath, opts); +} + +function toDestinationPath(relativeTemplatePath) { + return scaffoldModule.toDestinationPath(relativeTemplatePath); +} + +function ensureParentDir(repoRoot, filePath, dryRun) { + return scaffoldModule.ensureParentDir(repoRoot, filePath, dryRun); +} + +function ensureExecutable(destinationPath, relativePath, dryRun) { + return scaffoldModule.ensureExecutable(destinationPath, relativePath, dryRun); +} + +function isCriticalGuardrailPath(relativePath) { + return scaffoldModule.isCriticalGuardrailPath(relativePath); +} + +function shellSingleQuote(value) { + return scaffoldModule.shellSingleQuote(value); +} + +function renderShellDispatchShim(commandParts) { + return scaffoldModule.renderShellDispatchShim(commandParts); +} + +function renderPythonDispatchShim(commandParts) { + return scaffoldModule.renderPythonDispatchShim(commandParts); +} + +function managedForceConflictMessage(relativePath) { + return scaffoldModule.managedForceConflictMessage(relativePath); +} + +function renderManagedFile(repoRoot, relativePath, content, options = {}) { + const destinationPath = path.join(repoRoot, relativePath); + const destinationExists = fs.existsSync(destinationPath); + const force = Boolean(options.force); + const dryRun = Boolean(options.dryRun); + + if (destinationExists) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === content) { + ensureExecutable(destinationPath, 
relativePath, dryRun); + return { status: 'unchanged', file: relativePath }; + } + if (!force && !isCriticalGuardrailPath(relativePath)) { + throw new Error(managedForceConflictMessage(relativePath)); + } + } + + ensureParentDir(repoRoot, destinationPath, dryRun); + if (!dryRun) { + fs.writeFileSync(destinationPath, content, 'utf8'); + ensureExecutable(destinationPath, relativePath, dryRun); + } + + if (destinationExists && !force && isCriticalGuardrailPath(relativePath)) { + return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: relativePath }; + } + + return { status: destinationExists ? 'overwritten' : 'created', file: relativePath }; +} + +function ensureGeneratedScriptShim(repoRoot, spec, options = {}) { + const content = spec.kind === 'python' + ? renderPythonDispatchShim(spec.command) + : renderShellDispatchShim(spec.command); + return renderManagedFile(repoRoot, spec.relativePath, content, options); +} + +function ensureHookShim(repoRoot, hookName, options = {}) { + return renderManagedFile( + repoRoot, + path.posix.join('.githooks', hookName), + renderShellDispatchShim(['hook', 'run', hookName]), + options, + ); +} + +function copyTemplateFile(repoRoot, relativeTemplatePath, force, dryRun) { + const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath); + const destinationRelativePath = toDestinationPath(relativeTemplatePath); + const destinationPath = path.join(repoRoot, destinationRelativePath); + + const sourceContent = fs.readFileSync(sourcePath, 'utf8'); + const destinationExists = fs.existsSync(destinationPath); + + if (destinationExists) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === sourceContent) { + ensureExecutable(destinationPath, destinationRelativePath, dryRun); + return { status: 'unchanged', file: destinationRelativePath }; + } + if (!force && !isCriticalGuardrailPath(destinationRelativePath)) { + throw new 
Error(managedForceConflictMessage(destinationRelativePath)); + } + } + + ensureParentDir(repoRoot, destinationPath, dryRun); + if (!dryRun) { + fs.writeFileSync(destinationPath, sourceContent, 'utf8'); + ensureExecutable(destinationPath, destinationRelativePath, dryRun); + } + + if (destinationExists && !force && isCriticalGuardrailPath(destinationRelativePath)) { + return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: destinationRelativePath }; + } + + return { status: destinationExists ? 'overwritten' : 'created', file: destinationRelativePath }; +} + +function ensureTemplateFilePresent(repoRoot, relativeTemplatePath, dryRun) { + const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath); + const destinationRelativePath = toDestinationPath(relativeTemplatePath); + const destinationPath = path.join(repoRoot, destinationRelativePath); + const sourceContent = fs.readFileSync(sourcePath, 'utf8'); + + if (fs.existsSync(destinationPath)) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === sourceContent) { + ensureExecutable(destinationPath, destinationRelativePath, dryRun); + return { status: 'unchanged', file: destinationRelativePath }; + } + + if (isCriticalGuardrailPath(destinationRelativePath)) { + if (!dryRun) { + fs.writeFileSync(destinationPath, sourceContent, 'utf8'); + ensureExecutable(destinationPath, destinationRelativePath, dryRun); + } + return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: destinationRelativePath }; + } + + // In fix mode, avoid silently replacing local customizations. 
+ return { status: 'skipped-conflict', file: destinationRelativePath }; + } + + ensureParentDir(repoRoot, destinationPath, dryRun); + if (!dryRun) { + fs.writeFileSync(destinationPath, sourceContent, 'utf8'); + ensureExecutable(destinationPath, destinationRelativePath, dryRun); + } + + return { status: 'created', file: destinationRelativePath }; +} + +function ensureTargetedLegacyWorkflowShims(repoRoot, options) { + const targetedPaths = Array.isArray(options.forceManagedPaths) ? options.forceManagedPaths : []; + if (targetedPaths.length === 0) { + return []; + } + + const operations = []; + for (const shim of LEGACY_WORKFLOW_SHIM_SPECS) { + if (!shouldForceManagedPath(options, shim.relativePath)) { + continue; + } + operations.push(ensureGeneratedScriptShim(repoRoot, shim, { dryRun: options.dryRun, force: true })); + } + return operations; +} + +function lockFilePath(repoRoot) { + return path.join(repoRoot, LOCK_FILE_RELATIVE); +} + +function ensureOmxScaffold(repoRoot, dryRun) { + const operations = []; + + for (const relativeDir of REPO_SCAFFOLD_DIRECTORIES) { + const absoluteDir = path.join(repoRoot, relativeDir); + if (fs.existsSync(absoluteDir)) { + if (!fs.statSync(absoluteDir).isDirectory()) { + throw new Error(`Expected directory at ${relativeDir} but found a file.`); + } + operations.push({ status: 'unchanged', file: relativeDir }); + continue; + } + + if (!dryRun) { + fs.mkdirSync(absoluteDir, { recursive: true }); + } + operations.push({ status: 'created', file: relativeDir }); + } + + for (const relativeDir of OMX_SCAFFOLD_DIRECTORIES) { + const absoluteDir = path.join(repoRoot, relativeDir); + if (fs.existsSync(absoluteDir)) { + if (!fs.statSync(absoluteDir).isDirectory()) { + throw new Error(`Expected directory at ${relativeDir} but found a file.`); + } + operations.push({ status: 'unchanged', file: relativeDir }); + continue; + } + + if (!dryRun) { + fs.mkdirSync(absoluteDir, { recursive: true }); + } + operations.push({ status: 'created', file: 
relativeDir }); + } + + for (const [relativeFile, defaultContent] of OMX_SCAFFOLD_FILES.entries()) { + const absoluteFile = path.join(repoRoot, relativeFile); + if (fs.existsSync(absoluteFile)) { + if (!fs.statSync(absoluteFile).isFile()) { + throw new Error(`Expected file at ${relativeFile} but found a directory.`); + } + operations.push({ status: 'unchanged', file: relativeFile }); + continue; + } + + if (!dryRun) { + fs.mkdirSync(path.dirname(absoluteFile), { recursive: true }); + fs.writeFileSync(absoluteFile, defaultContent, 'utf8'); + } + operations.push({ status: 'created', file: relativeFile }); + } + + return operations; +} + +function ensureLockRegistry(repoRoot, dryRun) { + const absolutePath = lockFilePath(repoRoot); + if (fs.existsSync(absolutePath)) { + return { status: 'unchanged', file: LOCK_FILE_RELATIVE }; + } + + if (!dryRun) { + fs.mkdirSync(path.dirname(absolutePath), { recursive: true }); + fs.writeFileSync(absolutePath, JSON.stringify({ locks: {} }, null, 2) + '\n', 'utf8'); + } + + return { status: 'created', file: LOCK_FILE_RELATIVE }; +} + +function lockStateOrError(repoRoot) { + const lockPath = lockFilePath(repoRoot); + if (!fs.existsSync(lockPath)) { + return { ok: false, error: `${LOCK_FILE_RELATIVE} is missing` }; + } + + try { + const parsed = JSON.parse(fs.readFileSync(lockPath, 'utf8')); + if (!parsed || typeof parsed !== 'object' || typeof parsed.locks !== 'object' || parsed.locks === null) { + return { ok: false, error: `${LOCK_FILE_RELATIVE} has invalid schema (expected { locks: {} })` }; + } + + // Normalize older schema entries. 
+ for (const [filePath, entry] of Object.entries(parsed.locks)) { + if (!entry || typeof entry !== 'object') { + parsed.locks[filePath] = { branch: '', claimed_at: '', allow_delete: false }; + continue; + } + if (!Object.prototype.hasOwnProperty.call(entry, 'allow_delete')) { + entry.allow_delete = false; + } + } + + return { ok: true, raw: parsed, locks: parsed.locks }; + } catch (error) { + return { ok: false, error: `${LOCK_FILE_RELATIVE} is invalid JSON: ${error.message}` }; + } +} + +function writeLockState(repoRoot, payload, dryRun) { + if (dryRun) return; + const lockPath = lockFilePath(repoRoot); + fs.mkdirSync(path.dirname(lockPath), { recursive: true }); + fs.writeFileSync(lockPath, JSON.stringify(payload, null, 2) + '\n', 'utf8'); +} + +function removeLegacyPackageScripts(repoRoot, dryRun) { + const packagePath = path.join(repoRoot, 'package.json'); + if (!fs.existsSync(packagePath)) { + return { status: 'skipped', file: 'package.json', note: 'package.json not found' }; + } + + let pkg; + try { + pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8')); + } catch (error) { + throw new Error(`Unable to parse package.json in target repo: ${error.message}`); + } + + const existingScripts = pkg.scripts && typeof pkg.scripts === 'object' + ? pkg.scripts + : {}; + pkg.scripts = existingScripts; + let changed = false; + for (const [key, value] of Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS)) { + if (existingScripts[key] === value) { + delete existingScripts[key]; + changed = true; + } + } + + if (!changed) { + return { status: 'unchanged', file: 'package.json', note: 'no Guardex-managed agent:* scripts found' }; + } + + if (!dryRun) { + fs.writeFileSync(packagePath, JSON.stringify(pkg, null, 2) + '\n', 'utf8'); + } + + return { status: dryRun ? 
'would-update' : 'updated', file: 'package.json', note: 'removed Guardex-managed agent:* scripts' }; +} + +function installUserLevelAsset(asset, options = {}) { + const dryRun = Boolean(options.dryRun); + const force = Boolean(options.force); + const destinationPath = path.join(GUARDEX_HOME_DIR, asset.destination); + const sourceContent = fs.readFileSync(asset.source, 'utf8'); + const destinationExists = fs.existsSync(destinationPath); + + if (destinationExists) { + const existingContent = fs.readFileSync(destinationPath, 'utf8'); + if (existingContent === sourceContent) { + return { status: 'unchanged', file: asset.destination }; + } + if (!force) { + return { status: 'skipped-conflict', file: asset.destination }; + } + } + + if (!dryRun) { + fs.mkdirSync(path.dirname(destinationPath), { recursive: true }); + fs.writeFileSync(destinationPath, sourceContent, 'utf8'); + } + return { status: destinationExists ? (dryRun ? 'would-update' : 'updated') : 'created', file: asset.destination }; +} + +function removeLegacyManagedRepoFile(repoRoot, relativePath, options = {}) { + const dryRun = Boolean(options.dryRun); + const force = Boolean(options.force); + const absolutePath = path.join(repoRoot, relativePath); + if (!fs.existsSync(absolutePath)) { + return { status: 'unchanged', file: relativePath, note: 'not present' }; + } + if (!fs.statSync(absolutePath).isFile()) { + return { status: 'skipped-conflict', file: relativePath, note: 'not a regular file' }; + } + + const skillAsset = USER_LEVEL_SKILL_ASSETS.find((asset) => asset.destination === relativePath); + if (skillAsset) { + const userLevelPath = path.join(GUARDEX_HOME_DIR, skillAsset.destination); + if (!fs.existsSync(userLevelPath)) { + return { status: 'skipped', file: relativePath, note: 'user-level replacement not installed' }; + } + } + + const templateRelative = skillAsset + ? 
skillAsset.source.slice(TEMPLATE_ROOT.length + 1) + : relativePath.replace(/^\./, ''); + const sourcePath = path.join(TEMPLATE_ROOT, templateRelative); + if (!fs.existsSync(sourcePath)) { + return { status: 'skipped', file: relativePath, note: 'template source missing' }; + } + + const sourceContent = fs.readFileSync(sourcePath, 'utf8'); + const existingContent = fs.readFileSync(absolutePath, 'utf8'); + if (existingContent !== sourceContent && !force) { + return { status: 'skipped-conflict', file: relativePath, note: 'local edits differ from managed template' }; + } + + if (!dryRun) { + fs.rmSync(absolutePath, { force: true }); + } + return { status: dryRun ? 'would-remove' : 'removed', file: relativePath }; +} + +function ensureAgentsSnippet(repoRoot, dryRun, options = {}) { + const agentsPath = path.join(repoRoot, 'AGENTS.md'); + const snippet = fs.readFileSync(path.join(TEMPLATE_ROOT, 'AGENTS.multiagent-safety.md'), 'utf8').trimEnd(); + const managedRegex = new RegExp( + `${AGENTS_MARKER_START.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}[\\s\\S]*?${AGENTS_MARKER_END.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`, + 'm', + ); + + if (!fs.existsSync(agentsPath)) { + if (!dryRun) { + fs.writeFileSync(agentsPath, `# AGENTS\n\n${snippet}\n`, 'utf8'); + } + return { status: 'created', file: 'AGENTS.md' }; + } + + const existing = fs.readFileSync(agentsPath, 'utf8'); + if (managedRegex.test(existing)) { + const next = existing.replace(managedRegex, snippet); + if (next === existing) { + return { status: 'unchanged', file: 'AGENTS.md' }; + } + if (!dryRun) { + fs.writeFileSync(agentsPath, next, 'utf8'); + } + return { status: 'updated', file: 'AGENTS.md', note: 'refreshed gitguardex-managed block' }; + } + + if (existing.includes(AGENTS_MARKER_START)) { + return { status: 'unchanged', file: 'AGENTS.md', note: 'existing marker found without managed end marker' }; + } + + const separator = existing.endsWith('\n') ? 
'\n' : '\n\n'; + if (!dryRun) { + fs.writeFileSync(agentsPath, `${existing}${separator}${snippet}\n`, 'utf8'); + } + + return { status: 'updated', file: 'AGENTS.md' }; +} + +function ensureManagedGitignore(repoRoot, dryRun) { + const gitignorePath = path.join(repoRoot, '.gitignore'); + const managedBlock = [ + GITIGNORE_MARKER_START, + ...MANAGED_GITIGNORE_PATHS, + GITIGNORE_MARKER_END, + ].join('\n'); + const managedRegex = new RegExp( + `${GITIGNORE_MARKER_START.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}[\\s\\S]*?${GITIGNORE_MARKER_END.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`, + 'm', + ); + + if (!fs.existsSync(gitignorePath)) { + if (!dryRun) { + fs.writeFileSync(gitignorePath, `${managedBlock}\n`, 'utf8'); + } + return { status: 'created', file: '.gitignore', note: 'added gitguardex-managed entries' }; + } + + const existing = fs.readFileSync(gitignorePath, 'utf8'); + if (managedRegex.test(existing)) { + const next = existing.replace(managedRegex, managedBlock); + if (next === existing) { + return { status: 'unchanged', file: '.gitignore' }; + } + if (!dryRun) { + fs.writeFileSync(gitignorePath, next, 'utf8'); + } + return { status: 'updated', file: '.gitignore', note: 'refreshed gitguardex-managed entries' }; + } + + const separator = existing.endsWith('\n') ? 
'\n' : '\n\n'; + if (!dryRun) { + fs.writeFileSync(gitignorePath, `${existing}${separator}${managedBlock}\n`, 'utf8'); + } + return { status: 'updated', file: '.gitignore', note: 'appended gitguardex-managed entries' }; +} + +function configureHooks(repoRoot, dryRun) { + return hooksModule.configureHooks(repoRoot, dryRun); +} + +function requireValue(rawArgs, index, flagName) { + const value = rawArgs[index + 1]; + if (!value || value.startsWith('-')) { + throw new Error(`${flagName} requires a value`); + } + return value; +} + +function normalizeManagedForcePath(rawPath) { + if (typeof rawPath !== 'string') { + return null; + } + const normalized = path.posix.normalize(rawPath.replace(/\\/g, '/')); + if (!normalized || normalized === '.' || normalized.startsWith('../') || path.posix.isAbsolute(normalized)) { + return null; + } + return normalized.startsWith('./') ? normalized.slice(2) : normalized; +} + +function collectForceManagedPaths(rawArgs, startIndex) { + const forceManagedPaths = []; + let nextIndex = startIndex; + + while (nextIndex + 1 < rawArgs.length) { + const candidate = rawArgs[nextIndex + 1]; + if (!candidate || candidate.startsWith('-')) { + break; + } + const normalized = normalizeManagedForcePath(candidate); + if (!normalized || !TARGETED_FORCEABLE_MANAGED_PATHS.has(normalized)) { + throw new Error(`Unknown managed path after --force: ${candidate}`); + } + forceManagedPaths.push(normalized); + nextIndex += 1; + } + + return { forceManagedPaths, nextIndex }; +} + +function appendForceArgs(args, options) { + if (!options.force) { + return; + } + args.push('--force'); + for (const managedPath of options.forceManagedPaths || []) { + args.push(managedPath); + } +} + +function shouldForceManagedPath(options, relativePath) { + if (!options.force) { + return false; + } + const targetedPaths = Array.isArray(options.forceManagedPaths) ? 
options.forceManagedPaths : []; + if (targetedPaths.length === 0) { + return true; + } + const normalized = normalizeManagedForcePath(relativePath); + return normalized !== null && targetedPaths.includes(normalized); +} + +function parseCommonArgs(rawArgs, defaults) { + return cliArgsModule.parseCommonArgs(rawArgs, defaults); +} + +function parseRepoTraversalArgs(rawArgs, defaults) { + return cliArgsModule.parseRepoTraversalArgs(rawArgs, defaults, { + nestedRepoDefaultMaxDepth: NESTED_REPO_DEFAULT_MAX_DEPTH, + }); +} + +function parseSetupArgs(rawArgs, defaults) { + return cliArgsModule.parseSetupArgs(rawArgs, defaults, { + nestedRepoDefaultMaxDepth: NESTED_REPO_DEFAULT_MAX_DEPTH, + }); +} + +function parseDoctorArgs(rawArgs) { + return cliArgsModule.parseDoctorArgs(rawArgs, { + nestedRepoDefaultMaxDepth: NESTED_REPO_DEFAULT_MAX_DEPTH, + }); +} + +function normalizeWorkspacePath(relativePath) { + return String(relativePath || '.').replace(/\\/g, '/'); +} + +function buildParentWorkspaceView(repoRoot) { + const parentDir = path.dirname(repoRoot); + const workspaceFileName = `${path.basename(repoRoot)}-branches.code-workspace`; + const workspacePath = path.join(parentDir, workspaceFileName); + const repoRelativePath = normalizeWorkspacePath(path.relative(parentDir, repoRoot) || '.'); + + return { + workspacePath, + payload: { + folders: [ + { path: repoRelativePath }, + ...AGENT_WORKTREE_RELATIVE_DIRS.map((relativeDir) => ({ + path: normalizeWorkspacePath(path.join(repoRelativePath === '.' ? 
'' : repoRelativePath, relativeDir)), + })), + ], + settings: { + 'scm.alwaysShowRepositories': true, + }, + }, + }; +} + +function ensureParentWorkspaceView(repoRoot, dryRun) { + const { workspacePath, payload } = buildParentWorkspaceView(repoRoot); + const operationFile = path.relative(repoRoot, workspacePath) || path.basename(workspacePath); + const nextContent = `${JSON.stringify(payload, null, 2)}\n`; + const note = 'parent VS Code workspace view'; + + if (!fs.existsSync(workspacePath)) { + if (!dryRun) { + fs.writeFileSync(workspacePath, nextContent, 'utf8'); + } + return { status: dryRun ? 'would-create' : 'created', file: operationFile, note }; + } + + const currentContent = fs.readFileSync(workspacePath, 'utf8'); + if (currentContent === nextContent) { + return { status: 'unchanged', file: operationFile, note }; + } + + if (!dryRun) { + fs.writeFileSync(workspacePath, nextContent, 'utf8'); + } + return { status: dryRun ? 'would-update' : 'updated', file: operationFile, note }; +} + +function hasGuardexBootstrapFiles(repoRoot) { + const required = [ + 'AGENTS.md', + '.githooks/pre-commit', + '.githooks/pre-push', + LOCK_FILE_RELATIVE, + ]; + return required.every((relativePath) => fs.existsSync(path.join(repoRoot, relativePath))); +} + +function protectedBaseWriteBlock(options, { requireBootstrap = true } = {}) { + if (options.dryRun || options.allowProtectedBaseWrite) { + return null; + } + + const repoRoot = resolveRepoRoot(options.target); + if (requireBootstrap && !hasGuardexBootstrapFiles(repoRoot)) { + return null; + } + + const branch = currentBranchName(repoRoot); + if (branch !== 'main') { + return null; + } + + const protectedBranches = readProtectedBranches(repoRoot); + if (!protectedBranches.includes(branch)) { + return null; + } + + return { + repoRoot, + branch, + }; +} + +function assertProtectedMainWriteAllowed(options, commandName) { + return getSandboxApi().assertProtectedMainWriteAllowed(options, commandName); +} + +function 
runSetupBootstrapInternal(options) { + return getSandboxApi().runSetupBootstrapInternal(options); +} + +function extractAgentBranchStartMetadata(output) { + const branchMatch = String(output || '').match(/^\[agent-branch-start\] Created branch: (.+)$/m); + const worktreeMatch = String(output || '').match(/^\[agent-branch-start\] Worktree: (.+)$/m); + return { + branch: branchMatch ? branchMatch[1].trim() : '', + worktreePath: worktreeMatch ? worktreeMatch[1].trim() : '', + }; +} + +function resolveSandboxTarget(repoRoot, worktreePath, targetPath) { + const resolvedTarget = path.resolve(targetPath); + const relativeTarget = path.relative(repoRoot, resolvedTarget); + if (relativeTarget.startsWith('..') || path.isAbsolute(relativeTarget)) { + throw new Error(`sandbox target must stay inside repo root: ${resolvedTarget}`); + } + if (!relativeTarget || relativeTarget === '.') { + return worktreePath; + } + return path.join(worktreePath, relativeTarget); +} + +function buildSandboxSetupArgs(options, sandboxTarget) { + const args = ['setup', '--target', sandboxTarget, '--no-global-install', '--no-recursive']; + appendForceArgs(args, options); + if (options.skipAgents) args.push('--skip-agents'); + if (options.skipPackageJson) args.push('--skip-package-json'); + if (options.skipGitignore) args.push('--no-gitignore'); + if (options.dryRun) args.push('--dry-run'); + return args; +} + +function buildSandboxDoctorArgs(options, sandboxTarget) { + const args = ['doctor', '--target', sandboxTarget]; + if (options.dryRun) args.push('--dry-run'); + appendForceArgs(args, options); + if (options.skipAgents) args.push('--skip-agents'); + if (options.skipPackageJson) args.push('--skip-package-json'); + if (options.skipGitignore) args.push('--no-gitignore'); + if (!options.dropStaleLocks) args.push('--keep-stale-locks'); + args.push(options.waitForMerge ? 
'--wait-for-merge' : '--no-wait-for-merge'); + if (options.verboseAutoFinish) args.push('--verbose-auto-finish'); + if (options.json) args.push('--json'); + return args; +} + +function isSpawnFailure(result) { + return Boolean(result?.error) && typeof result?.status !== 'number'; +} + +function ensureRepoBranch(repoRoot, branch) { + const current = currentBranchName(repoRoot); + if (current === branch) { + return { ok: true, changed: false }; + } + + const checkoutResult = run('git', ['-C', repoRoot, 'checkout', branch], { timeout: 20_000 }); + if (isSpawnFailure(checkoutResult)) { + return { + ok: false, + changed: false, + stdout: checkoutResult.stdout || '', + stderr: checkoutResult.stderr || '', + }; + } + if (checkoutResult.status !== 0) { + return { + ok: false, + changed: false, + stdout: checkoutResult.stdout || '', + stderr: checkoutResult.stderr || '', + }; + } + + return { ok: true, changed: true }; +} + +function protectedBaseSandboxBranchPrefix() { + const now = new Date(); + const stamp = [ + now.getUTCFullYear(), + String(now.getUTCMonth() + 1).padStart(2, '0'), + String(now.getUTCDate()).padStart(2, '0'), + ].join('') + '-' + [ + String(now.getUTCHours()).padStart(2, '0'), + String(now.getUTCMinutes()).padStart(2, '0'), + String(now.getUTCSeconds()).padStart(2, '0'), + ].join(''); + return `agent/gx/${stamp}`; +} + +function protectedBaseSandboxWorktreePath(repoRoot, branchName) { + return path.join(repoRoot, defaultAgentWorktreeRelativeDir(), branchName.replace(/\//g, '__')); +} + +function gitRefExists(repoRoot, ref) { + return run('git', ['-C', repoRoot, 'show-ref', '--verify', '--quiet', ref]).status === 0; +} + +function resolveProtectedBaseSandboxStartRef(repoRoot, baseBranch) { + run('git', ['-C', repoRoot, 'fetch', 'origin', baseBranch, '--quiet'], { timeout: 20_000 }); + if (gitRefExists(repoRoot, `refs/remotes/origin/${baseBranch}`)) { + return `origin/${baseBranch}`; + } + if (gitRefExists(repoRoot, `refs/heads/${baseBranch}`)) { + return 
baseBranch; + } + if (currentBranchName(repoRoot) === baseBranch) { + return null; + } + throw new Error(`Unable to find base ref for sandbox bootstrap: ${baseBranch}`); +} + +function startProtectedBaseSandboxFallback(blocked, sandboxSuffix) { + const branchPrefix = protectedBaseSandboxBranchPrefix(); + let selectedBranch = ''; + let selectedWorktreePath = ''; + + for (let attempt = 0; attempt < 30; attempt += 1) { + const suffix = attempt === 0 ? sandboxSuffix : `${attempt + 1}-${sandboxSuffix}`; + const candidateBranch = `${branchPrefix}-${suffix}`; + const candidateWorktreePath = protectedBaseSandboxWorktreePath(blocked.repoRoot, candidateBranch); + if (gitRefExists(blocked.repoRoot, `refs/heads/${candidateBranch}`)) { + continue; + } + if (fs.existsSync(candidateWorktreePath)) { + continue; + } + selectedBranch = candidateBranch; + selectedWorktreePath = candidateWorktreePath; + break; + } + + if (!selectedBranch || !selectedWorktreePath) { + throw new Error('Unable to allocate unique sandbox branch/worktree'); + } + + fs.mkdirSync(path.dirname(selectedWorktreePath), { recursive: true }); + const startRef = resolveProtectedBaseSandboxStartRef(blocked.repoRoot, blocked.branch); + const addArgs = startRef + ? 
['-C', blocked.repoRoot, 'worktree', 'add', '-b', selectedBranch, selectedWorktreePath, startRef] + : ['-C', blocked.repoRoot, 'worktree', 'add', '--orphan', selectedWorktreePath]; + const addResult = run('git', addArgs); + if (isSpawnFailure(addResult)) { + throw addResult.error; + } + if (addResult.status !== 0) { + throw new Error((addResult.stderr || addResult.stdout || 'failed to create sandbox').trim()); + } + + if (!startRef) { + const renameResult = run( + 'git', + ['-C', selectedWorktreePath, 'branch', '-m', selectedBranch], + { timeout: 20_000 }, + ); + if (isSpawnFailure(renameResult)) { + throw renameResult.error; + } + if (renameResult.status !== 0) { + throw new Error( + (renameResult.stderr || renameResult.stdout || 'failed to name orphan sandbox branch').trim(), + ); + } + } + + return { + metadata: { + branch: selectedBranch, + worktreePath: selectedWorktreePath, + }, + stdout: + `[agent-branch-start] Created branch: ${selectedBranch}\n` + + `[agent-branch-start] Worktree: ${selectedWorktreePath}\n`, + stderr: addResult.stderr || '', + }; +} + +function startProtectedBaseSandbox(blocked, { taskName, sandboxSuffix }) { + if (sandboxSuffix === 'gx-doctor') { + return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); + } + + const startResult = runPackageAsset('branchStart', [ + '--task', + taskName, + '--agent', + SHORT_TOOL_NAME, + '--base', + blocked.branch, + ], { cwd: blocked.repoRoot }); + if (isSpawnFailure(startResult)) { + throw startResult.error; + } + if (startResult.status !== 0) { + return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); + } + + const metadata = extractAgentBranchStartMetadata(startResult.stdout); + const currentBranch = currentBranchName(blocked.repoRoot); + const worktreePath = metadata.worktreePath ? 
path.resolve(metadata.worktreePath) : ''; + const repoRootPath = path.resolve(blocked.repoRoot); + const hasSafeWorktree = Boolean(worktreePath) && worktreePath !== repoRootPath; + const branchChanged = Boolean(currentBranch) && currentBranch !== blocked.branch; + + if (!hasSafeWorktree || branchChanged) { + const restoreResult = ensureRepoBranch(blocked.repoRoot, blocked.branch); + if (!restoreResult.ok) { + const detail = [restoreResult.stderr, restoreResult.stdout].filter(Boolean).join('\n').trim(); + throw new Error( + `sandbox startup switched protected base checkout and could not restore '${blocked.branch}'.` + + (detail ? `\n${detail}` : ''), + ); + } + return startProtectedBaseSandboxFallback(blocked, sandboxSuffix); + } + + return { + metadata, + stdout: startResult.stdout || '', + stderr: startResult.stderr || '', + }; +} + +function cleanupProtectedBaseSandbox(repoRoot, metadata) { + const result = { + worktree: 'skipped', + branch: 'skipped', + note: 'missing sandbox metadata', + }; + + if (!metadata?.worktreePath || !metadata?.branch) { + return result; + } + + if (fs.existsSync(metadata.worktreePath)) { + const removeResult = run( + 'git', + ['-C', repoRoot, 'worktree', 'remove', '--force', metadata.worktreePath], + { timeout: 30_000 }, + ); + if (isSpawnFailure(removeResult)) { + throw removeResult.error; + } + if (removeResult.status !== 0) { + throw new Error( + (removeResult.stderr || removeResult.stdout || 'failed to remove sandbox worktree').trim(), + ); + } + result.worktree = 'removed'; + } else { + result.worktree = 'missing'; + } + + if (gitRefExists(repoRoot, `refs/heads/${metadata.branch}`)) { + const branchDeleteResult = run( + 'git', + ['-C', repoRoot, 'branch', '-D', metadata.branch], + { timeout: 20_000 }, + ); + if (isSpawnFailure(branchDeleteResult)) { + throw branchDeleteResult.error; + } + if (branchDeleteResult.status !== 0) { + throw new Error( + (branchDeleteResult.stderr || branchDeleteResult.stdout || 'failed to delete sandbox 
branch').trim(), + ); + } + result.branch = 'deleted'; + } else { + result.branch = 'missing'; + } + + result.note = 'sandbox worktree pruned'; + return result; +} + +function parseGitPathList(output) { + return String(output || '') + .split('\n') + .map((line) => line.trim()) + .filter((line) => line && line !== LOCK_FILE_RELATIVE); +} + +function collectDoctorChangedPaths(worktreePath) { + const changed = new Set(); + const commands = [ + ['diff', '--name-only'], + ['diff', '--cached', '--name-only'], + ['ls-files', '--others', '--exclude-standard'], + ]; + for (const gitArgs of commands) { + const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); + for (const filePath of parseGitPathList(result.stdout)) { + changed.add(filePath); + } + } + return Array.from(changed); +} + +function collectDoctorDeletedPaths(worktreePath) { + const deleted = new Set(); + const commands = [ + ['diff', '--name-only', '--diff-filter=D'], + ['diff', '--cached', '--name-only', '--diff-filter=D'], + ]; + for (const gitArgs of commands) { + const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); + for (const filePath of parseGitPathList(result.stdout)) { + deleted.add(filePath); + } + } + return Array.from(deleted); +} + +function collectWorktreeDirtyPaths(worktreePath) { + const dirty = new Set(); + const commands = [ + ['diff', '--name-only'], + ['diff', '--cached', '--name-only'], + ['ls-files', '--others', '--exclude-standard'], + ]; + for (const gitArgs of commands) { + const result = run('git', ['-C', worktreePath, ...gitArgs], { timeout: 20_000 }); + for (const filePath of parseGitPathList(result.stdout)) { + dirty.add(filePath); + } + } + return Array.from(dirty); +} + +function collectDoctorForceAddPaths(worktreePath) { + return REQUIRED_MANAGED_REPO_FILES + .filter((relativePath) => relativePath.startsWith('scripts/') || relativePath.startsWith('.githooks/')) + .filter((relativePath) => fs.existsSync(path.join(worktreePath, 
relativePath))); +} + +function stripDoctorSandboxLocks(rawContent, branchName) { + if (!rawContent || !branchName) { + return rawContent; + } + try { + const parsed = JSON.parse(rawContent); + const locks = parsed && typeof parsed === 'object' && parsed.locks && typeof parsed.locks === 'object' + ? parsed.locks + : null; + if (!locks) { + return rawContent; + } + let changed = false; + const filteredLocks = {}; + for (const [filePath, lockInfo] of Object.entries(locks)) { + if (lockInfo && lockInfo.branch === branchName) { + changed = true; + continue; + } + filteredLocks[filePath] = lockInfo; + } + if (!changed) { + return rawContent; + } + return `${JSON.stringify({ ...parsed, locks: filteredLocks }, null, 2)}\n`; + } catch { + return rawContent; + } +} + +function claimDoctorChangedLocks(metadata) { + if (!metadata.branch) { + return { + status: 'skipped', + note: 'missing sandbox branch metadata', + changedCount: 0, + deletedCount: 0, + }; + } + + const changedPaths = Array.from(new Set([ + ...collectDoctorChangedPaths(metadata.worktreePath), + ...collectDoctorForceAddPaths(metadata.worktreePath), + ])); + const deletedPaths = collectDoctorDeletedPaths(metadata.worktreePath); + if (changedPaths.length > 0) { + runPackageAsset('lockTool', ['claim', '--branch', metadata.branch, ...changedPaths], { + cwd: metadata.worktreePath, + timeout: 30_000, + }); + } + if (deletedPaths.length > 0) { + runPackageAsset('lockTool', ['allow-delete', '--branch', metadata.branch, ...deletedPaths], { + cwd: metadata.worktreePath, + timeout: 30_000, + }); + } + + return { + status: 'claimed', + note: 'claimed locks for doctor auto-commit', + changedCount: changedPaths.length, + deletedCount: deletedPaths.length, + }; +} + +function autoCommitDoctorSandboxChanges(metadata) { + if (!metadata.worktreePath || !metadata.branch) { + return { + status: 'skipped', + note: 'missing sandbox branch metadata', + }; + } + + claimDoctorChangedLocks(metadata); + run( + 'git', + ['-C', 
metadata.worktreePath, 'add', '-A', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`], + { timeout: 20_000 }, + ); + const forceAddPaths = collectDoctorForceAddPaths(metadata.worktreePath); + if (forceAddPaths.length > 0) { + run( + 'git', + ['-C', metadata.worktreePath, 'add', '-f', '--', ...forceAddPaths], + { timeout: 20_000 }, + ); + } + const staged = run( + 'git', + ['-C', metadata.worktreePath, 'diff', '--cached', '--name-only', '--', '.', `:(exclude)${LOCK_FILE_RELATIVE}`], + { timeout: 20_000 }, + ); + const stagedFiles = parseGitPathList(staged.stdout); + if (stagedFiles.length === 0) { + return { + status: 'no-changes', + note: 'no committable doctor changes found in sandbox', + }; + } + + const commitResult = run( + 'git', + ['-C', metadata.worktreePath, 'commit', '-m', 'Auto-finish: gx doctor repairs'], + { timeout: 30_000 }, + ); + if (commitResult.status !== 0) { + return { + status: 'failed', + note: 'doctor sandbox auto-commit failed', + stdout: commitResult.stdout || '', + stderr: commitResult.stderr || '', + }; + } + + return { + status: 'committed', + note: 'doctor sandbox repairs committed', + commitMessage: 'Auto-finish: gx doctor repairs', + stagedFiles, + }; +} + +function hasOriginRemote(repoRoot) { + return run('git', ['-C', repoRoot, 'remote', 'get-url', 'origin']).status === 0; +} + +function originRemoteLooksLikeGithub(repoRoot) { + const originUrl = readGitConfig(repoRoot, 'remote.origin.url'); + if (!originUrl) { + return false; + } + return /github\.com[:/]/i.test(originUrl); +} + +function isCommandAvailable(commandName) { + return run('which', [commandName]).status === 0; +} + +function extractAgentBranchFinishPrUrl(output) { + const match = String(output || '').match(/\[agent-branch-finish\] PR:\s*(\S+)/); + return match ? 
match[1] : ''; +} + +function doctorFinishFlowIsPending(output) { + return ( + /\[agent-branch-finish\] PR merge not completed yet; leaving PR open\./.test(output) || + /\[agent-branch-finish\] Merge pending review\/check policy\. Branch cleanup skipped for now\./.test(output) || + /\[agent-branch-finish\] PR auto-merge enabled; waiting for required checks\/reviews\./.test(output) + ); +} + +function finishDoctorSandboxBranch(blocked, metadata, options = {}) { + if (!hasOriginRemote(blocked.repoRoot)) { + return { + status: 'skipped', + note: 'origin remote missing; skipped auto-finish', + }; + } + const explicitGhBin = Boolean(String(process.env.GUARDEX_GH_BIN || '').trim()); + if (!explicitGhBin && !originRemoteLooksLikeGithub(blocked.repoRoot)) { + return { + status: 'skipped', + note: 'origin remote is not GitHub; skipped auto-finish PR flow', + }; + } + + const ghBin = process.env.GUARDEX_GH_BIN || 'gh'; + if (!isCommandAvailable(ghBin)) { + return { + status: 'skipped', + note: `'${ghBin}' not available; skipped auto-finish PR flow`, + }; + } + const ghAuthStatus = run(ghBin, ['auth', 'status'], { timeout: 20_000 }); + if (ghAuthStatus.status !== 0) { + return { + status: 'skipped', + note: `'${ghBin}' auth unavailable; skipped auto-finish PR flow`, + stderr: ghAuthStatus.stderr || '', + }; + } + + const rawWaitTimeoutSeconds = Number.parseInt(process.env.GUARDEX_FINISH_WAIT_TIMEOUT_SECONDS || '1800', 10); + const waitTimeoutSeconds = + Number.isFinite(rawWaitTimeoutSeconds) && rawWaitTimeoutSeconds >= 30 ? rawWaitTimeoutSeconds : 1800; + const finishTimeoutMs = Math.max(180_000, (waitTimeoutSeconds + 60) * 1000); + const waitForMergeArg = options.waitForMerge === false ? 
'--no-wait-for-merge' : '--wait-for-merge'; + + const finishResult = runPackageAsset( + 'branchFinish', + ['--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'], + { cwd: metadata.worktreePath, timeout: finishTimeoutMs }, + ); + if (isSpawnFailure(finishResult)) { + return { + status: 'failed', + note: 'doctor sandbox finish flow errored', + stdout: finishResult.stdout || '', + stderr: finishResult.stderr || '', + }; + } + if (finishResult.status !== 0) { + return { + status: 'failed', + note: 'doctor sandbox finish flow failed', + stdout: finishResult.stdout || '', + stderr: finishResult.stderr || '', + }; + } + + const combinedOutput = `${finishResult.stdout || ''}\n${finishResult.stderr || ''}`; + if (doctorFinishFlowIsPending(combinedOutput)) { + return { + status: 'pending', + note: 'PR created and waiting for merge policy/checks', + prUrl: extractAgentBranchFinishPrUrl(combinedOutput), + stdout: finishResult.stdout || '', + stderr: finishResult.stderr || '', + }; + } + + return { + status: 'completed', + note: 'doctor sandbox finish flow completed', + stdout: finishResult.stdout || '', + stderr: finishResult.stderr || '', + }; +} + +function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata, autoCommitResult, finishResult) { + if (options.dryRun) { + return { + status: autoCommitResult.status === 'committed' ? 'would-merge' : 'skipped', + note: autoCommitResult.status === 'committed' + ? 'dry run: would fast-forward tracked doctor repairs into the protected base workspace' + : 'dry run skips tracked repair merge', + }; + } + + if (autoCommitResult.status !== 'committed') { + return { + status: autoCommitResult.status === 'no-changes' ? 'unchanged' : 'skipped', + note: autoCommitResult.status === 'no-changes' + ? 
'no tracked doctor repairs needed in the protected base workspace' + : 'tracked doctor repair merge skipped', + }; + } + + if (finishResult.status !== 'skipped') { + return { + status: 'skipped', + note: finishResult.status === 'failed' + ? 'tracked doctor repairs remain in the sandbox after finish failure' + : 'tracked doctor repairs are being delivered through the sandbox finish flow', + }; + } + + const allowedPaths = new Set([ + ...(autoCommitResult.stagedFiles || []), + ...OMX_SCAFFOLD_DIRECTORIES, + ...Array.from(OMX_SCAFFOLD_FILES.keys()), + ...REQUIRED_MANAGED_REPO_FILES, + 'bin', + 'package.json', + '.gitignore', + 'AGENTS.md', + ]); + const dirtyPaths = collectWorktreeDirtyPaths(blocked.repoRoot); + let stashRef = ''; + if (dirtyPaths.length > 0) { + const unexpectedPaths = dirtyPaths.filter((filePath) => { + if (allowedPaths.has(filePath)) { + return false; + } + return !AGENT_WORKTREE_RELATIVE_DIRS.some( + (relativeDir) => filePath === relativeDir || filePath.startsWith(`${relativeDir}/`), + ); + }); + if (unexpectedPaths.length > 0) { + return { + status: 'failed', + note: `protected branch workspace has unrelated local changes: ${unexpectedPaths.join(', ')}`, + }; + } + const stashMessage = `guardex-doctor-merge-${Date.now()}`; + const stashResult = run( + 'git', + ['-C', blocked.repoRoot, 'stash', 'push', '--all', '--message', stashMessage], + { timeout: 30_000 }, + ); + if (isSpawnFailure(stashResult)) { + return { + status: 'failed', + note: 'could not stash protected branch doctor drift before merge', + stdout: stashResult.stdout || '', + stderr: stashResult.stderr || '', + }; + } + if (stashResult.status !== 0) { + return { + status: 'failed', + note: 'stashing protected branch doctor drift failed', + stdout: stashResult.stdout || '', + stderr: stashResult.stderr || '', + }; + } + + const stashLookup = run( + 'git', + ['-C', blocked.repoRoot, 'stash', 'list'], + { timeout: 20_000 }, + ); + stashRef = String(stashLookup.stdout || '') + 
.split('\n') + .find((line) => line.includes(stashMessage)) + ?.split(':')[0] + ?.trim() || ''; + } + + const restoreResult = ensureRepoBranch(blocked.repoRoot, blocked.branch); + if (!restoreResult.ok) { + if (stashRef) { + run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); + } + return { + status: 'failed', + note: `could not restore protected branch '${blocked.branch}' before applying sandbox repairs`, + stdout: restoreResult.stdout || '', + stderr: restoreResult.stderr || '', + }; + } + + const mergeResult = run( + 'git', + ['-C', blocked.repoRoot, 'merge', '--ff-only', metadata.branch], + { timeout: 30_000 }, + ); + if (isSpawnFailure(mergeResult)) { + if (stashRef) { + run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); + } + return { + status: 'failed', + note: 'tracked doctor repair merge errored', + stdout: mergeResult.stdout || '', + stderr: mergeResult.stderr || '', + }; + } + if (mergeResult.status !== 0) { + if (stashRef) { + run('git', ['-C', blocked.repoRoot, 'stash', 'apply', stashRef], { timeout: 30_000 }); + } + return { + status: 'failed', + note: 'tracked doctor repair merge failed', + stdout: mergeResult.stdout || '', + stderr: mergeResult.stderr || '', + }; + } + + let cleanupResult; + try { + cleanupResult = cleanupProtectedBaseSandbox(blocked.repoRoot, metadata); + } catch (error) { + return { + status: 'failed', + note: `tracked doctor repair merge succeeded but sandbox cleanup failed: ${error.message}`, + stdout: mergeResult.stdout || '', + stderr: mergeResult.stderr || '', + }; + } + + let hookRefreshResult; + try { + hookRefreshResult = configureHooks(blocked.repoRoot, false); + } catch (error) { + return { + status: 'failed', + note: `tracked doctor repair merge succeeded but local hook refresh failed: ${error.message}`, + stdout: mergeResult.stdout || '', + stderr: mergeResult.stderr || '', + }; + } + + if (stashRef) { + run('git', ['-C', blocked.repoRoot, 'stash', 
'drop', stashRef], { timeout: 20_000 }); + } + + return { + status: 'merged', + note: 'fast-forwarded tracked doctor repairs into the protected base workspace', + stdout: mergeResult.stdout || '', + stderr: mergeResult.stderr || '', + cleanup: cleanupResult, + hookRefresh: hookRefreshResult, + }; +} + +function syncDoctorLocalSupportFiles(repoRoot, dryRun) { + return []; +} + +function runDoctorInSandbox(options, blocked) { + const startResult = startProtectedBaseSandbox(blocked, { + taskName: `${SHORT_TOOL_NAME}-doctor`, + sandboxSuffix: 'gx-doctor', + }); + const metadata = startResult.metadata; + + const sandboxTarget = resolveSandboxTarget(blocked.repoRoot, metadata.worktreePath, options.target); + const nestedResult = run( + process.execPath, + [__filename, ...buildSandboxDoctorArgs(options, sandboxTarget)], + { cwd: metadata.worktreePath }, + ); + if (isSpawnFailure(nestedResult)) { + throw nestedResult.error; + } + + let autoCommitResult = { + status: 'skipped', + note: 'sandbox doctor did not complete successfully', + }; + let finishResult = { + status: 'skipped', + note: 'sandbox doctor did not complete successfully', + }; + + let protectedBaseRepairSyncResult = { + status: 'skipped', + note: 'sandbox doctor did not complete successfully', + }; + let lockSyncResult = { + status: 'skipped', + note: 'sandbox doctor did not complete successfully', + }; + let sandboxLockContent = null; + let postSandboxAutoFinishSummary = { + enabled: false, + attempted: 0, + completed: 0, + skipped: 0, + failed: 0, + details: ['Skipped auto-finish sweep (sandbox doctor did not complete successfully).'], + }; + let omxScaffoldSyncResult = { + status: 'skipped', + note: 'sandbox doctor did not complete successfully', + }; + if (nestedResult.status === 0) { + const omxScaffoldOps = ensureOmxScaffold(blocked.repoRoot, Boolean(options.dryRun)); + const changedOmxPaths = omxScaffoldOps.filter((operation) => operation.status !== 'unchanged'); + if (changedOmxPaths.length === 0) { + 
omxScaffoldSyncResult = { + status: 'unchanged', + note: '.omx scaffold already in sync', + operations: omxScaffoldOps, + }; + } else { + omxScaffoldSyncResult = { + status: options.dryRun ? 'would-sync' : 'synced', + note: `${options.dryRun ? 'would sync' : 'synced'} ${changedOmxPaths.length} .omx path(s)`, + operations: omxScaffoldOps, + }; + } + + if (!options.dryRun) { + autoCommitResult = autoCommitDoctorSandboxChanges(metadata); + if (autoCommitResult.status === 'committed') { + finishResult = finishDoctorSandboxBranch(blocked, metadata, options); + } else if (autoCommitResult.status === 'no-changes') { + finishResult = { + status: 'skipped', + note: 'no doctor changes to auto-finish', + }; + } else if (autoCommitResult.status !== 'failed') { + finishResult = { + status: 'skipped', + note: 'auto-commit did not run', + }; + } + } else { + autoCommitResult = { + status: 'skipped', + note: 'dry-run skips doctor sandbox auto-commit', + }; + finishResult = { + status: 'skipped', + note: 'dry-run skips doctor sandbox finish flow', + }; + } + + const sandboxLockPath = path.join(metadata.worktreePath, LOCK_FILE_RELATIVE); + const baseLockPath = path.join(blocked.repoRoot, LOCK_FILE_RELATIVE); + if (!fs.existsSync(baseLockPath)) { + lockSyncResult = { + status: 'skipped', + note: `${LOCK_FILE_RELATIVE} missing in protected base workspace`, + }; + } else if (!fs.existsSync(sandboxLockPath)) { + lockSyncResult = { + status: 'skipped', + note: `${LOCK_FILE_RELATIVE} missing in sandbox worktree`, + }; + } else { + const sourceContent = stripDoctorSandboxLocks( + fs.readFileSync(sandboxLockPath, 'utf8'), + metadata.branch, + ); + sandboxLockContent = sourceContent; + const destinationContent = fs.readFileSync(baseLockPath, 'utf8'); + if (sourceContent === destinationContent) { + lockSyncResult = { + status: 'unchanged', + note: `${LOCK_FILE_RELATIVE} already in sync`, + }; + } else { + fs.mkdirSync(path.dirname(baseLockPath), { recursive: true }); + 
fs.writeFileSync(baseLockPath, sourceContent, 'utf8'); + lockSyncResult = { + status: 'synced', + note: `${LOCK_FILE_RELATIVE} synced from sandbox`, + }; + } + } + + protectedBaseRepairSyncResult = mergeDoctorSandboxRepairsBackToProtectedBase( + options, + blocked, + metadata, + autoCommitResult, + finishResult, + ); + + syncDoctorLocalSupportFiles(blocked.repoRoot, Boolean(options.dryRun)); + + const postMergeOmxScaffoldOps = ensureOmxScaffold(blocked.repoRoot, Boolean(options.dryRun)); + const postMergeChangedOmxPaths = postMergeOmxScaffoldOps.filter((operation) => operation.status !== 'unchanged'); + if (postMergeChangedOmxPaths.length === 0) { + omxScaffoldSyncResult = { + status: 'unchanged', + note: '.omx scaffold already in sync', + operations: postMergeOmxScaffoldOps, + }; + } else { + omxScaffoldSyncResult = { + status: options.dryRun ? 'would-sync' : 'synced', + note: `${options.dryRun ? 'would sync' : 'synced'} ${postMergeChangedOmxPaths.length} .omx path(s)`, + operations: postMergeOmxScaffoldOps, + }; + } + + const postMergeBaseLockPath = path.join(blocked.repoRoot, LOCK_FILE_RELATIVE); + if (sandboxLockContent === null) { + lockSyncResult = { + status: 'skipped', + note: `${LOCK_FILE_RELATIVE} missing in sandbox worktree`, + }; + } else if (!fs.existsSync(postMergeBaseLockPath)) { + fs.mkdirSync(path.dirname(postMergeBaseLockPath), { recursive: true }); + fs.writeFileSync(postMergeBaseLockPath, sandboxLockContent, 'utf8'); + lockSyncResult = { + status: 'synced', + note: `${LOCK_FILE_RELATIVE} recreated from sandbox`, + }; + } else { + const destinationContent = fs.readFileSync(postMergeBaseLockPath, 'utf8'); + if (sandboxLockContent === destinationContent) { + lockSyncResult = { + status: 'unchanged', + note: `${LOCK_FILE_RELATIVE} already in sync`, + }; + } else { + fs.mkdirSync(path.dirname(postMergeBaseLockPath), { recursive: true }); + fs.writeFileSync(postMergeBaseLockPath, sandboxLockContent, 'utf8'); + lockSyncResult = { + status: 'synced', + 
note: `${LOCK_FILE_RELATIVE} synced from sandbox`, + }; + } + } + + postSandboxAutoFinishSummary = autoFinishReadyAgentBranches(blocked.repoRoot, { + baseBranch: blocked.branch, + dryRun: options.dryRun, + waitForMerge: options.waitForMerge, + excludeBranches: [metadata.branch], + }); + } + + if (options.json) { + if (nestedResult.stdout) { + if (nestedResult.status === 0) { + try { + const parsed = JSON.parse(nestedResult.stdout); + process.stdout.write( + JSON.stringify( + { + ...parsed, + protectedBaseRepairSync: protectedBaseRepairSyncResult, + sandboxOmxScaffoldSync: omxScaffoldSyncResult, + sandboxLockSync: lockSyncResult, + sandboxAutoCommit: autoCommitResult, + sandboxFinish: finishResult, + autoFinish: postSandboxAutoFinishSummary, + }, + null, + 2, + ) + '\n', + ); + } catch { + process.stdout.write(nestedResult.stdout); + } + } else { + process.stdout.write(nestedResult.stdout); + } + } + if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); + } else { + console.log( + `[${TOOL_NAME}] doctor detected protected branch '${blocked.branch}'. 
` + + `Running repairs in sandbox branch '${metadata.branch || 'agent/'}'.`, + ); + if (startResult.stdout) process.stdout.write(startResult.stdout); + if (startResult.stderr) process.stderr.write(startResult.stderr); + if (nestedResult.stdout) process.stdout.write(nestedResult.stdout); + if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); + if (nestedResult.status === 0) { + if (autoCommitResult.status === 'committed') { + console.log( + `[${TOOL_NAME}] Auto-committed doctor repairs in sandbox branch '${metadata.branch}'.`, + ); + } else if (autoCommitResult.status === 'failed') { + console.log(`[${TOOL_NAME}] Doctor sandbox auto-commit failed; branch left for manual follow-up.`); + if (autoCommitResult.stdout) process.stdout.write(autoCommitResult.stdout); + if (autoCommitResult.stderr) process.stderr.write(autoCommitResult.stderr); + } else { + console.log(`[${TOOL_NAME}] Doctor sandbox auto-commit skipped: ${autoCommitResult.note}.`); + } + + if (protectedBaseRepairSyncResult.status === 'merged') { + console.log(`[${TOOL_NAME}] Fast-forwarded tracked doctor repairs into the protected branch workspace.`); + } else if (protectedBaseRepairSyncResult.status === 'unchanged') { + console.log(`[${TOOL_NAME}] Protected branch workspace already had the tracked doctor repairs.`); + } else if (protectedBaseRepairSyncResult.status === 'would-merge') { + console.log(`[${TOOL_NAME}] Dry run: would fast-forward tracked doctor repairs into the protected branch workspace.`); + } else if (protectedBaseRepairSyncResult.status === 'failed') { + console.log(`[${TOOL_NAME}] Protected branch tracked repair merge failed: ${protectedBaseRepairSyncResult.note}.`); + if (protectedBaseRepairSyncResult.stdout) process.stdout.write(protectedBaseRepairSyncResult.stdout); + if (protectedBaseRepairSyncResult.stderr) process.stderr.write(protectedBaseRepairSyncResult.stderr); + } else { + console.log(`[${TOOL_NAME}] Protected branch tracked repair merge skipped: 
${protectedBaseRepairSyncResult.note}.`); + } + + if (lockSyncResult.status === 'synced') { + console.log( + `[${TOOL_NAME}] Synced repaired lock registry back to protected branch workspace (${LOCK_FILE_RELATIVE}).`, + ); + } else if (lockSyncResult.status === 'unchanged') { + console.log(`[${TOOL_NAME}] Lock registry already synced in protected branch workspace.`); + } else { + console.log(`[${TOOL_NAME}] Lock registry sync skipped: ${lockSyncResult.note}.`); + } + + if (finishResult.status === 'completed') { + console.log(`[${TOOL_NAME}] Auto-finish flow completed for sandbox branch '${metadata.branch}'.`); + if (finishResult.stdout) process.stdout.write(finishResult.stdout); + if (finishResult.stderr) process.stderr.write(finishResult.stderr); + } else if (finishResult.status === 'pending') { + console.log( + `[${TOOL_NAME}] Auto-finish pending for sandbox branch '${metadata.branch}': ${finishResult.note}.`, + ); + if (finishResult.prUrl) { + console.log(`[${TOOL_NAME}] PR: ${finishResult.prUrl}`); + } + if (finishResult.stdout) process.stdout.write(finishResult.stdout); + if (finishResult.stderr) process.stderr.write(finishResult.stderr); + } else if (finishResult.status === 'failed') { + console.log(`[${TOOL_NAME}] Auto-finish flow failed for sandbox branch '${metadata.branch}'.`); + if (finishResult.stdout) process.stdout.write(finishResult.stdout); + if (finishResult.stderr) process.stderr.write(finishResult.stderr); + } else { + console.log(`[${TOOL_NAME}] Auto-finish skipped: ${finishResult.note}.`); + } + + printAutoFinishSummary(postSandboxAutoFinishSummary, { + baseBranch: blocked.branch, + verbose: options.verboseAutoFinish, + }); + if (omxScaffoldSyncResult.status === 'synced') { + console.log(`[${TOOL_NAME}] Synced .omx scaffold back to protected branch workspace.`); + } else if (omxScaffoldSyncResult.status === 'unchanged') { + 
console.log(`[${TOOL_NAME}] .omx scaffold already aligned in protected branch workspace.`); + } else if (omxScaffoldSyncResult.status === 'would-sync') { + console.log(`[${TOOL_NAME}] Dry run: would sync .omx scaffold back to protected branch workspace.`); + } else { + console.log(`[${TOOL_NAME}] .omx scaffold sync skipped: ${omxScaffoldSyncResult.note}.`); + } + } + } + + if (typeof nestedResult.status === 'number') { + let exitCode = nestedResult.status; + if (exitCode === 0 && autoCommitResult.status === 'failed') { + exitCode = 1; + } + if ( + exitCode === 0 && + autoCommitResult.status === 'committed' && + (finishResult.status === 'failed' || finishResult.status === 'pending') + ) { + exitCode = 1; + } + if (exitCode === 0 && protectedBaseRepairSyncResult.status === 'failed') { + exitCode = 1; + } + process.exitCode = exitCode; + return; + } + process.exitCode = 1; +} + +function runSetupInSandbox(options, blocked, repoLabel = '') { + const startResult = startProtectedBaseSandbox(blocked, { + taskName: `${SHORT_TOOL_NAME}-setup`, + sandboxSuffix: 'gx-setup', + }); + const metadata = startResult.metadata; + + if (startResult.stdout) process.stdout.write(startResult.stdout); + if (startResult.stderr) process.stderr.write(startResult.stderr); + console.log( + `[${TOOL_NAME}] setup blocked on protected branch '${blocked.branch}' in an initialized repo; ` + + 'refreshing through a sandbox worktree and syncing managed bootstrap files back locally.', + ); + + const sandboxTarget = resolveSandboxTarget(blocked.repoRoot, metadata.worktreePath, options.target); + const nestedResult = run( + process.execPath, + [__filename, ...buildSandboxSetupArgs(options, sandboxTarget)], + { cwd: metadata.worktreePath }, + ); + if (isSpawnFailure(nestedResult)) { + throw nestedResult.error; + } + if (nestedResult.status !== 0) { + if (nestedResult.stdout) process.stdout.write(nestedResult.stdout); + if (nestedResult.stderr) process.stderr.write(nestedResult.stderr); + throw new Error( 
+ `sandboxed setup failed for protected branch '${blocked.branch}'. ` + + `Inspect sandbox at ${metadata.worktreePath}`, + ); + } + + const syncOptions = { + ...options, + target: blocked.repoRoot, + recursive: false, + allowProtectedBaseWrite: true, + }; + const { installPayload, fixPayload, parentWorkspace } = runSetupBootstrapInternal(syncOptions); + printOperations(`Setup/install${repoLabel}`, installPayload, syncOptions.dryRun); + printOperations(`Setup/fix${repoLabel}`, fixPayload, syncOptions.dryRun); + if (!syncOptions.dryRun && parentWorkspace) { + console.log(`[${TOOL_NAME}] Parent workspace view: ${parentWorkspace.workspacePath}`); + } + + const scanResult = runScanInternal({ target: blocked.repoRoot, json: false }); + const currentBaseBranch = currentBranchName(scanResult.repoRoot); + const autoFinishSummary = autoFinishReadyAgentBranches(scanResult.repoRoot, { + baseBranch: currentBaseBranch, + dryRun: syncOptions.dryRun, + }); + printScanResult(scanResult, false); + if (autoFinishSummary.enabled) { + console.log( + `[${TOOL_NAME}] Auto-finish sweep (base=${currentBaseBranch}): attempted=${autoFinishSummary.attempted}, completed=${autoFinishSummary.completed}, skipped=${autoFinishSummary.skipped}, failed=${autoFinishSummary.failed}`, + ); + for (const detail of autoFinishSummary.details) { + console.log(`[${TOOL_NAME}] ${detail}`); + } + } else if (autoFinishSummary.details.length > 0) { + console.log(`[${TOOL_NAME}] ${autoFinishSummary.details[0]}`); + } + + const cleanupResult = cleanupProtectedBaseSandbox(blocked.repoRoot, metadata); + console.log( + `[${TOOL_NAME}] Protected-base setup sandbox cleanup: ${cleanupResult.note} ` + + `(worktree=${cleanupResult.worktree}, branch=${cleanupResult.branch}).`, + ); + + return { + scanResult, + }; +} + +function parseTargetFlag(rawArgs, defaultTarget = process.cwd()) { + return cliArgsModule.parseTargetFlag(rawArgs, defaultTarget); +} + +function parseReviewArgs(rawArgs) { + return 
cliArgsModule.parseReviewArgs(rawArgs); + } + + function parseAgentsArgs(rawArgs) { + return cliArgsModule.parseAgentsArgs(rawArgs); + } + + function parseReportArgs(rawArgs) { + return cliArgsModule.parseReportArgs(rawArgs); + } + + function todayDateStamp() { + return new Date().toISOString().slice(0, 10); + } + + function inferGithubRepoFromOrigin(repoRoot) { + const rawOrigin = readGitConfig(repoRoot, 'remote.origin.url'); + if (!rawOrigin) return ''; + + const httpsMatch = rawOrigin.match(/github\.com[:/](.+?)(?:\.git)?$/i); + if (!httpsMatch) return ''; + const slug = (httpsMatch[1] || '').replace(/^\/+/, '').trim(); + if (!slug || !slug.includes('/')) return ''; + return `github.com/${slug}`; + } + + function inferGithubRepoSlug(rawValue) { + const raw = String(rawValue || '').trim(); + if (!raw) return ''; + const match = raw.match(/github\.com[:/](.+?)(?:\.git)?$/i); + if (!match) return ''; + const slug = String(match[1] || '') + .replace(/^\/+/, '') + .replace(/^github\.com\//i, '') + .trim(); + if (!slug || !slug.includes('/')) return ''; + return slug; + } + + function resolveScorecardRepo(repoRoot, explicitRepo) { + if (explicitRepo) { + return explicitRepo.trim(); + } + const inferred = inferGithubRepoFromOrigin(repoRoot); + if (inferred) return inferred; + throw new Error( + 'Unable to infer GitHub repo from origin remote. Pass --repo github.com/<owner>/<repo>.', + ); + } + + function runScorecardJson(repo) { + const result = run(SCORECARD_BIN, ['--repo', repo, '--format', 'json'], { allowFailure: true }); + if (result.status !== 0) { + const details = (result.stderr || result.stdout || '').trim(); + throw new Error( + `Failed to run scorecard CLI ('${SCORECARD_BIN} --repo ${repo} --format json').${details ? 
`\n${details}` : ''}`, + ); + } + + try { + return JSON.parse(result.stdout || '{}'); + } catch (error) { + throw new Error(`Unable to parse scorecard JSON output: ${error.message}`); + } +} + +function readScorecardJsonFile(filePath) { + const absolute = path.resolve(filePath); + if (!fs.existsSync(absolute)) { + throw new Error(`scorecard JSON file not found: ${absolute}`); + } + try { + return JSON.parse(fs.readFileSync(absolute, 'utf8')); + } catch (error) { + throw new Error(`Unable to parse scorecard JSON file: ${error.message}`); + } +} + +function normalizeScorecardChecks(payload) { + const rawChecks = Array.isArray(payload?.checks) ? payload.checks : []; + return rawChecks.map((check) => { + const name = String(check?.name || 'Unknown'); + const rawScore = Number(check?.score); + const score = Number.isFinite(rawScore) ? rawScore : 0; + return { + name, + score, + risk: SCORECARD_RISK_BY_CHECK[name] || 'Unknown', + }; + }); +} + +function renderScorecardBaselineMarkdown({ repo, score, checks, capturedAt, scorecardVersion, reportDate }) { + const rows = checks + .map((item) => `| ${item.name} | ${item.score} | ${item.risk} |`) + .join('\n'); + + return [ + '# OpenSSF Scorecard Baseline Report', + '', + `- **Repository:** \`${repo}\``, + '- **Source:** generated by `gx report scorecard`', + `- **Captured at:** ${capturedAt}`, + `- **Scorecard version:** \`${scorecardVersion}\``, + `- **Overall score:** **${score} / 10**`, + '', + '## Check breakdown', + '', + '| Check | Score | Risk |', + '|---|---:|---|', + rows || '| (none) | 0 | Unknown |', + '', + `## Report date`, + '', + `- ${reportDate}`, + '', + ].join('\n'); +} + +function renderScorecardRemediationPlanMarkdown({ baselineRelativePath, checks }) { + const failing = checks.filter((item) => item.score < 10); + const failingRows = failing + .sort((a, b) => a.score - b.score || a.name.localeCompare(b.name)) + .map((item) => `| ${item.name} | ${item.score} | ${item.risk} |`) + .join('\n'); + + return [ + 
'# OpenSSF Scorecard Remediation Plan', + '', + `Based on baseline report: \`${baselineRelativePath}\`.`, + '', + '## Failing checks', + '', + '| Check | Score | Risk |', + '|---|---:|---|', + (failingRows || '| None | 10 | N/A |'), + '', + '## Priority order', + '', + '1. Fix **High** risk checks first (especially score 0 items).', + '2. Then close **Medium** risk checks with score < 10.', + '3. Finally address **Low** risk ecosystem/process checks.', + '', + '## Verification loop', + '', + '1. Run scorecard again.', + '2. Re-generate baseline + remediation files.', + '3. Compare score deltas and track improved checks.', + '', + ].join('\n'); +} + +function parseBranchList(rawValue) { + return String(rawValue || '') + .split(/[\s,]+/) + .map((item) => item.trim()) + .filter(Boolean); +} + +function uniquePreserveOrder(items) { + const seen = new Set(); + const result = []; + for (const item of items) { + if (seen.has(item)) continue; + seen.add(item); + result.push(item); + } + return result; +} + +function readConfiguredProtectedBranches(repoRoot) { + const result = gitRun(repoRoot, ['config', '--get', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); + if (result.status !== 0) { + return null; + } + const parsed = uniquePreserveOrder(parseBranchList(result.stdout.trim())); + if (parsed.length === 0) { + return null; + } + return parsed; +} + +function listLocalUserBranches(repoRoot) { + const result = gitRun(repoRoot, ['for-each-ref', '--format=%(refname:short)', 'refs/heads'], { allowFailure: true }); + const branchNames = result.status === 0 + ? 
uniquePreserveOrder( + String(result.stdout || '') + .split('\n') + .map((item) => item.trim()) + .filter(Boolean), + ) + : []; + + const additionalUserBranches = branchNames.filter( + (branchName) => + !branchName.startsWith('agent/') && + !DEFAULT_PROTECTED_BRANCHES.includes(branchName), + ); + if (additionalUserBranches.length > 0) { + return additionalUserBranches; + } + + const current = gitRun(repoRoot, ['branch', '--show-current'], { allowFailure: true }); + if (current.status !== 0) { + return []; + } + + const branchName = String(current.stdout || '').trim(); + if ( + !branchName || + branchName.startsWith('agent/') || + DEFAULT_PROTECTED_BRANCHES.includes(branchName) + ) { + return []; + } + + return [branchName]; +} + +function listLocalAgentBranches(repoRoot) { + const result = gitRun( + repoRoot, + ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/'], + { allowFailure: true }, + ); + if (result.status !== 0) { + return []; + } + return uniquePreserveOrder( + String(result.stdout || '') + .split('\n') + .map((item) => item.trim()) + .filter(Boolean), + ); +} + +function mapWorktreePathsByBranch(repoRoot) { + const result = gitRun(repoRoot, ['worktree', 'list', '--porcelain'], { allowFailure: true }); + const map = new Map(); + if (result.status !== 0) { + return map; + } + + const lines = String(result.stdout || '').split('\n'); + let currentWorktree = ''; + for (const line of lines) { + if (line.startsWith('worktree ')) { + currentWorktree = line.slice('worktree '.length).trim(); + continue; + } + if (line.startsWith('branch refs/heads/')) { + const branchName = line.slice('branch refs/heads/'.length).trim(); + if (currentWorktree && branchName) { + map.set(branchName, currentWorktree); + } + } + } + return map; +} + +function hasSignificantWorkingTreeChanges(worktreePath) { + const result = run('git', [ + '-C', + worktreePath, + 'status', + '--porcelain', + '--untracked-files=normal', + '--', + ]); + if (result.status !== 0) { + return 
true; + } + + const lines = String(result.stdout || '') + .split('\n') + .map((line) => line.trimEnd()) + .filter((line) => line.length > 0); + + for (const line of lines) { + const pathPart = (line.length > 3 ? line.slice(3) : '').trim(); + if (!pathPart) continue; + if (pathPart === LOCK_FILE_RELATIVE) continue; + if (pathPart.startsWith(`${LOCK_FILE_RELATIVE} -> `)) continue; + if (pathPart.endsWith(` -> ${LOCK_FILE_RELATIVE}`)) continue; + return true; + } + return false; +} + +function autoFinishReadyAgentBranches(repoRoot, options = {}) { + const baseBranch = String(options.baseBranch || '').trim(); + const dryRun = Boolean(options.dryRun); + const waitForMerge = options.waitForMerge !== false; + const excludedBranches = new Set( + Array.isArray(options.excludeBranches) + ? options.excludeBranches.map((branch) => String(branch || '').trim()).filter(Boolean) + : [], + ); + + const summary = { + enabled: true, + baseBranch, + attempted: 0, + completed: 0, + skipped: 0, + failed: 0, + details: [], + }; + + if (!baseBranch || baseBranch === 'HEAD' || baseBranch.startsWith('agent/')) { + summary.enabled = false; + summary.details.push('Skipped auto-finish sweep (base branch is missing or not a non-agent local branch).'); + return summary; + } + + if (String(process.env.GUARDEX_DOCTOR_SANDBOX || '') === '1') { + summary.enabled = false; + summary.details.push('Skipped auto-finish sweep inside doctor sandbox pass.'); + return summary; + } + + if (String(process.env.GUARDEX_SKIP_AUTO_FINISH_READY_BRANCHES || '') === '1') { + summary.enabled = false; + summary.details.push('Skipped auto-finish sweep (GUARDEX_SKIP_AUTO_FINISH_READY_BRANCHES=1).'); + return summary; + } + + if (dryRun) { + summary.enabled = false; + summary.details.push('Skipped auto-finish sweep in dry-run mode.'); + return summary; + } + + const hasOrigin = gitRun(repoRoot, ['remote', 'get-url', 'origin'], { allowFailure: true }).status === 0; + if (!hasOrigin) { + summary.enabled = false; + 
summary.details.push('Skipped auto-finish sweep (origin remote missing).'); + return summary; + } + const explicitGhBin = Boolean(String(process.env.GUARDEX_GH_BIN || '').trim()); + if (!explicitGhBin && !originRemoteLooksLikeGithub(repoRoot)) { + summary.enabled = false; + summary.details.push('Skipped auto-finish sweep (origin remote is not GitHub).'); + return summary; + } + + const ghBin = process.env.GUARDEX_GH_BIN || 'gh'; + if (run(ghBin, ['--version']).status !== 0) { + summary.enabled = false; + summary.details.push(`Skipped auto-finish sweep (${ghBin} not available).`); + return summary; + } + + const branchWorktrees = mapWorktreePathsByBranch(repoRoot); + const agentBranches = listLocalAgentBranches(repoRoot); + if (agentBranches.length === 0) { + summary.enabled = false; + summary.details.push('No local agent branches found for auto-finish sweep.'); + return summary; + } + + for (const branch of agentBranches) { + if (excludedBranches.has(branch)) { + summary.skipped += 1; + summary.details.push(`[skip] ${branch}: excluded from this auto-finish sweep.`); + continue; + } + + if (branch === baseBranch) { + summary.skipped += 1; + summary.details.push(`[skip] ${branch}: source branch equals base branch.`); + continue; + } + + let counts; + try { + counts = aheadBehind(repoRoot, branch, baseBranch); + } catch (error) { + summary.failed += 1; + summary.details.push(`[fail] ${branch}: unable to compute ahead/behind (${error.message}).`); + continue; + } + + if (counts.ahead <= 0) { + summary.skipped += 1; + summary.details.push(`[skip] ${branch}: already merged into ${baseBranch}.`); + continue; + } + + const branchWorktree = branchWorktrees.get(branch) || ''; + if (branchWorktree && hasSignificantWorkingTreeChanges(branchWorktree)) { + summary.skipped += 1; + summary.details.push(`[skip] ${branch}: dirty worktree (${branchWorktree}).`); + continue; + } + + summary.attempted += 1; + const finishArgs = [ + '--branch', + branch, + '--base', + baseBranch, + 
'--via-pr', + waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', + '--cleanup', + ]; + const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot }); + const combinedOutput = [finishResult.stdout || '', finishResult.stderr || ''].join('\n').trim(); + + if (finishResult.status === 0) { + summary.completed += 1; + summary.details.push(`[done] ${branch}: auto-finish completed.`); + continue; + } + + const recoverableConflict = detectRecoverableAutoFinishConflict(combinedOutput); + if (recoverableConflict) { + summary.skipped += 1; + const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; + summary.details.push(`[skip] ${branch}: ${recoverableConflict.rawLabel}${tail}`); + continue; + } + + summary.failed += 1; + const tail = combinedOutput ? ` ${combinedOutput.split('\n').slice(-2).join(' | ')}` : ''; + summary.details.push(`[fail] ${branch}: auto-finish failed.${tail}`); + } + + return summary; +} + +function ensureSetupProtectedBranches(repoRoot, dryRun) { + const localUserBranches = listLocalUserBranches(repoRoot); + if (localUserBranches.length === 0) { + return { + status: 'unchanged', + file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, + note: 'no additional local user branches detected', + }; + } + + const configured = readConfiguredProtectedBranches(repoRoot); + const currentBranches = configured || [...DEFAULT_PROTECTED_BRANCHES]; + const missingBranches = localUserBranches.filter((branchName) => !currentBranches.includes(branchName)); + if (missingBranches.length === 0) { + return { + status: 'unchanged', + file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, + note: 'local user branches already protected', + }; + } + + const nextBranches = uniquePreserveOrder([...currentBranches, ...missingBranches]); + if (!dryRun) { + writeProtectedBranches(repoRoot, nextBranches); + } + + return { + status: dryRun ? 
'would-update' : 'updated', + file: `git config ${GIT_PROTECTED_BRANCHES_KEY}`, + note: `added local user branch(es): ${missingBranches.join(', ')}`, + }; +} + +function readProtectedBranches(repoRoot) { + const result = gitRun(repoRoot, ['config', '--get', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); + if (result.status !== 0) { + return [...DEFAULT_PROTECTED_BRANCHES]; + } + + const parsed = uniquePreserveOrder(parseBranchList(result.stdout.trim())); + if (parsed.length === 0) { + return [...DEFAULT_PROTECTED_BRANCHES]; + } + return parsed; +} + +function writeProtectedBranches(repoRoot, branches) { + if (branches.length === 0) { + gitRun(repoRoot, ['config', '--unset-all', GIT_PROTECTED_BRANCHES_KEY], { allowFailure: true }); + return; + } + gitRun(repoRoot, ['config', GIT_PROTECTED_BRANCHES_KEY, branches.join(' ')]); +} + +function readGitConfig(repoRoot, key) { + const result = gitRun(repoRoot, ['config', '--get', key], { allowFailure: true }); + if (result.status !== 0) { + return ''; + } + return (result.stdout || '').trim(); +} + +function resolveBaseBranch(repoRoot, explicitBase) { + if (explicitBase) { + return explicitBase; + } + const configured = readGitConfig(repoRoot, GIT_BASE_BRANCH_KEY); + return configured || DEFAULT_BASE_BRANCH; +} + +function resolveSyncStrategy(repoRoot, explicitStrategy) { + const strategy = (explicitStrategy || readGitConfig(repoRoot, GIT_SYNC_STRATEGY_KEY) || DEFAULT_SYNC_STRATEGY) + .trim() + .toLowerCase(); + if (strategy !== 'rebase' && strategy !== 'merge') { + throw new Error(`Invalid sync strategy '${strategy}' (expected: rebase or merge)`); + } + return strategy; +} + +function currentBranchName(repoRoot) { + const result = gitRun(repoRoot, ['branch', '--show-current'], { allowFailure: true }); + if (result.status !== 0) { + throw new Error('Unable to detect current branch'); + } + const branch = (result.stdout || '').trim(); + if (!branch) { + throw new Error('Detached HEAD is not supported for sync 
operations'); + } + return branch; +} + +function repoHasHeadCommit(repoRoot) { + return gitRun(repoRoot, ['rev-parse', '--verify', 'HEAD'], { allowFailure: true }).status === 0; +} + +function readBranchDisplayName(repoRoot) { + const symbolic = gitRun(repoRoot, ['symbolic-ref', '--quiet', '--short', 'HEAD'], { allowFailure: true }); + if (symbolic.status === 0) { + const branch = String(symbolic.stdout || '').trim(); + if (!branch) { + return '(unknown)'; + } + return repoHasHeadCommit(repoRoot) ? branch : `${branch} (unborn; no commits yet)`; + } + + const detached = gitRun(repoRoot, ['rev-parse', '--short', 'HEAD'], { allowFailure: true }); + if (detached.status === 0) { + return `(detached at ${String(detached.stdout || '').trim()})`; + } + return '(unknown)'; +} + +function repoHasOriginRemote(repoRoot) { + return gitRun(repoRoot, ['remote', 'get-url', 'origin'], { allowFailure: true }).status === 0; +} + +function detectComposeHintFiles(repoRoot) { + return COMPOSE_HINT_FILES.filter((relativePath) => fs.existsSync(path.join(repoRoot, relativePath))); +} + +function printSetupRepoHints(repoRoot, baseBranch, repoLabel = '') { + const branchDisplay = readBranchDisplayName(repoRoot); + const hasHeadCommit = repoHasHeadCommit(repoRoot); + const hasOrigin = repoHasOriginRemote(repoRoot); + const composeFiles = detectComposeHintFiles(repoRoot); + if (hasHeadCommit && hasOrigin && composeFiles.length === 0) { + return; + } + + const label = repoLabel ? ` ${repoLabel}` : ''; + if (!hasHeadCommit) { + console.log(`[${TOOL_NAME}] Fresh repo onboarding${label}: current branch is ${branchDisplay}.`); + console.log(`[${TOOL_NAME}] Bootstrap commit${label}: git add . 
&& git commit -m "bootstrap gitguardex"`); + console.log( + `[${TOOL_NAME}] First agent flow${label}: ` + + `gx branch start "<task>" "codex" -> ` + + `gx locks claim --branch "$(git branch --show-current)" -> ` + + `gx branch finish --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`, + ); + } + if (!hasOrigin) { + console.log(`[${TOOL_NAME}] No origin remote${label}: finish and auto-merge flows stay local until you add one.`); + } + if (composeFiles.length > 0) { + console.log( + `[${TOOL_NAME}] Docker Compose helper${label}: detected ${composeFiles.join(', ')}. ` + + `Set GUARDEX_DOCKER_SERVICE and run 'bash scripts/guardex-docker-loader.sh -- <command>'.`, + ); + } + } + + function workingTreeIsDirty(repoRoot) { + const result = gitRun(repoRoot, ['status', '--porcelain'], { allowFailure: true }); + if (result.status !== 0) { + throw new Error('Unable to inspect git working tree status'); + } + const lines = (result.stdout || '').split('\n').filter((line) => line.length > 0); + const significant = lines.filter((line) => { + const pathPart = (line.length > 3 ? line.slice(3) : '').trim(); + if (!pathPart) return false; + if (pathPart === LOCK_FILE_RELATIVE) return false; + if (pathPart.startsWith(`${LOCK_FILE_RELATIVE} -> `)) return false; + if (pathPart.endsWith(` -> ${LOCK_FILE_RELATIVE}`)) return false; + return true; + }); + return significant.length > 0; + } + + function ensureOriginBaseRef(repoRoot, baseBranch) { + const fetch = gitRun(repoRoot, ['fetch', 'origin', baseBranch, '--quiet'], { allowFailure: true }); + if (fetch.status !== 0) { + throw new Error( + `Unable to fetch origin/${baseBranch}. 
Ensure remote 'origin' exists and branch '${baseBranch}' is available.`, + ); + } + const hasRemoteBase = gitRun(repoRoot, ['show-ref', '--verify', '--quiet', `refs/remotes/origin/${baseBranch}`], { + allowFailure: true, + }); + if (hasRemoteBase.status !== 0) { + throw new Error(`Remote base branch not found: origin/${baseBranch}`); + } +} + +function aheadBehind(repoRoot, branchRef, baseRef) { + const result = gitRun(repoRoot, ['rev-list', '--left-right', '--count', `${branchRef}...${baseRef}`], { + allowFailure: true, + }); + if (result.status !== 0) { + throw new Error(`Unable to compute ahead/behind for ${branchRef} vs ${baseRef}`); + } + const parts = (result.stdout || '').trim().split(/\s+/).filter(Boolean); + const ahead = Number.parseInt(parts[0] || '0', 10); + const behind = Number.parseInt(parts[1] || '0', 10); + return { ahead: Number.isFinite(ahead) ? ahead : 0, behind: Number.isFinite(behind) ? behind : 0 }; +} + +function lockRegistryStatus(repoRoot) { + const result = gitRun(repoRoot, ['status', '--porcelain', '--', LOCK_FILE_RELATIVE], { allowFailure: true }); + if (result.status !== 0) { + return { dirty: false, untracked: false }; + } + const lines = (result.stdout || '').split('\n').filter((line) => line.length > 0); + if (lines.length === 0) { + return { dirty: false, untracked: false }; + } + const untracked = lines.some((line) => line.startsWith('??')); + return { dirty: true, untracked }; +} + +function parseSyncArgs(rawArgs) { + return cliArgsModule.parseSyncArgs(rawArgs); +} + +function parseCleanupArgs(rawArgs) { + return cliArgsModule.parseCleanupArgs(rawArgs); +} + +function parseMergeArgs(rawArgs) { + return cliArgsModule.parseMergeArgs(rawArgs); +} + +function parseFinishArgs(rawArgs, defaults = {}) { + return cliArgsModule.parseFinishArgs(rawArgs, defaults); +} + +function listAgentWorktrees(repoRoot) { + const result = gitRun(repoRoot, ['worktree', 'list', '--porcelain'], { allowFailure: true }); + if (result.status !== 0) { + throw 
new Error('Unable to list git worktrees for finish command'); + } + + const entries = []; + let currentPath = ''; + let currentBranchRef = ''; + const lines = String(result.stdout || '').split('\n'); + for (const line of lines) { + if (!line.trim()) { + if (currentPath && currentBranchRef.startsWith('refs/heads/agent/')) { + entries.push({ + worktreePath: currentPath, + branch: currentBranchRef.replace(/^refs\/heads\//, ''), + }); + } + currentPath = ''; + currentBranchRef = ''; + continue; + } + if (line.startsWith('worktree ')) { + currentPath = line.slice('worktree '.length).trim(); + continue; + } + if (line.startsWith('branch ')) { + currentBranchRef = line.slice('branch '.length).trim(); + continue; + } + } + if (currentPath && currentBranchRef.startsWith('refs/heads/agent/')) { + entries.push({ + worktreePath: currentPath, + branch: currentBranchRef.replace(/^refs\/heads\//, ''), + }); + } + + return entries; +} + +function listLocalAgentBranchesForFinish(repoRoot) { + const result = gitRun( + repoRoot, + ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/'], + { allowFailure: true }, + ); + if (result.status !== 0) { + throw new Error('Unable to list local agent branches'); + } + return uniquePreserveOrder( + String(result.stdout || '') + .split('\n') + .map((line) => line.trim()) + .filter((line) => line.startsWith('agent/')), + ); +} + +function gitQuietChangeResult(worktreePath, args) { + const result = run('git', ['-C', worktreePath, ...args], { stdio: 'pipe' }); + if (result.status === 0) { + return false; + } + if (result.status === 1) { + return true; + } + throw new Error( + `git ${args.join(' ')} failed in ${worktreePath}: ${( + result.stderr || result.stdout || '' + ).trim()}`, + ); +} + +function worktreeHasLocalChanges(worktreePath) { + const hasUnstaged = gitQuietChangeResult(worktreePath, [ + 'diff', + '--quiet', + '--', + '.', + ':(exclude).omx/state/agent-file-locks.json', + ]); + if (hasUnstaged) { + return true; + } + + const 
hasStaged = gitQuietChangeResult(worktreePath, [ + 'diff', + '--cached', + '--quiet', + '--', + '.', + ':(exclude).omx/state/agent-file-locks.json', + ]); + if (hasStaged) { + return true; + } + + const untracked = run('git', ['-C', worktreePath, 'ls-files', '--others', '--exclude-standard'], { + stdio: 'pipe', + }); + if (untracked.status !== 0) { + throw new Error(`Unable to inspect untracked files in ${worktreePath}`); + } + return String(untracked.stdout || '').trim().length > 0; +} + +function gitOutputLines(worktreePath, args) { + const result = run('git', ['-C', worktreePath, ...args], { stdio: 'pipe' }); + if (result.status !== 0) { + throw new Error( + `git ${args.join(' ')} failed in ${worktreePath}: ${( + result.stderr || result.stdout || '' + ).trim()}`, + ); + } + return String(result.stdout || '') + .split('\n') + .map((line) => line.trim()) + .filter(Boolean); +} + +function claimLocksForAutoCommit(repoRoot, worktreePath, branch) { + const changedFiles = uniquePreserveOrder([ + ...gitOutputLines(worktreePath, ['diff', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), + ...gitOutputLines(worktreePath, ['diff', '--cached', '--name-only', '--', '.', ':(exclude).omx/state/agent-file-locks.json']), + ...gitOutputLines(worktreePath, ['ls-files', '--others', '--exclude-standard']), + ]); + + if (changedFiles.length > 0) { + const claim = runPackageAsset('lockTool', ['claim', '--branch', branch, ...changedFiles], { + cwd: repoRoot, + stdio: 'pipe', + }); + if (claim.status !== 0) { + throw new Error( + `Lock claim failed for ${branch}: ${( + claim.stderr || claim.stdout || '' + ).trim()}`, + ); + } + } + + const deletedFiles = uniquePreserveOrder([ + ...gitOutputLines(worktreePath, [ + 'diff', + '--name-only', + '--diff-filter=D', + '--', + '.', + ':(exclude).omx/state/agent-file-locks.json', + ]), + ...gitOutputLines(worktreePath, [ + 'diff', + '--cached', + '--name-only', + '--diff-filter=D', + '--', + '.', + 
+      ':(exclude).omx/state/agent-file-locks.json',
+    ]),
+  ]);
+
+  if (deletedFiles.length > 0) {
+    const allowDelete = runPackageAsset('lockTool', ['allow-delete', '--branch', branch, ...deletedFiles], {
+      cwd: repoRoot,
+      stdio: 'pipe',
+    });
+    if (allowDelete.status !== 0) {
+      throw new Error(
+        `Delete-lock grant failed for ${branch}: ${(
+          allowDelete.stderr || allowDelete.stdout || ''
+        ).trim()}`,
+      );
+    }
+  }
+}
+
+function branchExists(repoRoot, branch) {
+  const result = gitRun(repoRoot, ['show-ref', '--verify', '--quiet', `refs/heads/${branch}`], {
+    allowFailure: true,
+  });
+  return result.status === 0;
+}
+
+function resolveFinishBaseBranch(repoRoot, _sourceBranch, explicitBase) {
+  if (explicitBase) {
+    return explicitBase;
+  }
+
+  const configured = readGitConfig(repoRoot, GIT_BASE_BRANCH_KEY);
+  if (configured) {
+    return configured;
+  }
+
+  return DEFAULT_BASE_BRANCH;
+}
+
+function branchMergedIntoBase(repoRoot, branch, baseBranch) {
+  if (!branchExists(repoRoot, baseBranch)) {
+    return false;
+  }
+  const result = gitRun(repoRoot, ['merge-base', '--is-ancestor', branch, baseBranch], {
+    allowFailure: true,
+  });
+  if (result.status === 0) {
+    return true;
+  }
+  if (result.status === 1) {
+    return false;
+  }
+  throw new Error(`Unable to determine merge status for ${branch} -> ${baseBranch}`);
+}
+
+function autoCommitWorktreeForFinish(repoRoot, worktreePath, branch, options) {
+  const hasChanges = worktreeHasLocalChanges(worktreePath);
+  if (!hasChanges) {
+    return { changed: false, committed: false };
+  }
+
+  if (options.noAutoCommit) {
+    throw new Error(
+      `Branch '${branch}' has local changes in ${worktreePath}. Re-run without --no-auto-commit or commit manually first.`,
+    );
+  }
+
+  if (options.dryRun) {
+    return { changed: true, committed: false, dryRun: true };
+  }
+
+  claimLocksForAutoCommit(repoRoot, worktreePath, branch);
+
+  const addResult = run('git', ['-C', worktreePath, 'add', '-A'], { stdio: 'pipe' });
+  if (addResult.status !== 0) {
+    throw new Error(`git add failed in ${worktreePath}: ${(addResult.stderr || addResult.stdout || '').trim()}`);
+  }
+
+  const stagedHasChanges = gitQuietChangeResult(worktreePath, [
+    'diff',
+    '--cached',
+    '--quiet',
+    '--',
+    '.',
+    ':(exclude).omx/state/agent-file-locks.json',
+  ]);
+  if (!stagedHasChanges) {
+    return { changed: true, committed: false };
+  }
+
+  const commitMessage = options.commitMessage || `Auto-finish: ${branch}`;
+  const commitResult = run('git', ['-C', worktreePath, 'commit', '-m', commitMessage], { stdio: 'pipe' });
+  if (commitResult.status !== 0) {
+    throw new Error(
+      `Auto-commit failed on '${branch}': ${(
+        commitResult.stderr || commitResult.stdout || ''
+      ).trim()}`,
+    );
+  }
+
+  return { changed: true, committed: true, message: commitMessage };
+}
+
+function syncOperation(repoRoot, strategy, baseRef, ffOnly) {
+  if (strategy === 'rebase') {
+    if (ffOnly) {
+      throw new Error('--ff-only is only supported with --strategy merge');
+    }
+    const rebased = run('git', ['-C', repoRoot, 'rebase', baseRef], { stdio: 'pipe' });
+    if (rebased.status !== 0) {
+      const details = (rebased.stderr || rebased.stdout || '').trim();
+      const gitDir = path.join(repoRoot, '.git');
+      const rebaseActive = fs.existsSync(path.join(gitDir, 'rebase-merge')) || fs.existsSync(path.join(gitDir, 'rebase-apply'));
+      const help = rebaseActive
+        ? '\nResolve conflicts, then run: git rebase --continue\nOr abort: git rebase --abort'
+        : '';
+      throw new Error(`Sync failed during rebase onto ${baseRef}.${details ? `\n${details}` : ''}${help}`);
+    }
+    return;
+  }
+
+  const mergeArgs = ['-C', repoRoot, 'merge', '--no-edit'];
+  if (ffOnly) {
+    mergeArgs.push('--ff-only');
+  }
+  mergeArgs.push(baseRef);
+  const merged = run('git', mergeArgs, { stdio: 'pipe' });
+  if (merged.status !== 0) {
+    const details = (merged.stderr || merged.stdout || '').trim();
+    const gitDir = path.join(repoRoot, '.git');
+    const mergeActive = fs.existsSync(path.join(gitDir, 'MERGE_HEAD'));
+    const help = mergeActive ? '\nResolve conflicts, then run: git commit\nOr abort: git merge --abort' : '';
+    throw new Error(`Sync failed during merge from ${baseRef}.${details ? `\n${details}` : ''}${help}`);
+  }
+}
+
+function isInteractiveTerminal() {
+  return Boolean(process.stdin.isTTY && process.stdout.isTTY);
+}
+
+const stdinWaitArray = new Int32Array(new SharedArrayBuffer(4));
+
+function sleepSyncMs(milliseconds) {
+  Atomics.wait(stdinWaitArray, 0, 0, milliseconds);
+}
+
+function readSingleLineFromStdin() {
+  let input = '';
+  const buffer = Buffer.alloc(1);
+
+  while (true) {
+    let bytesRead = 0;
+    try {
+      bytesRead = fs.readSync(process.stdin.fd, buffer, 0, 1);
+    } catch (error) {
+      if (error && ['EAGAIN', 'EWOULDBLOCK', 'EINTR'].includes(error.code)) {
+        sleepSyncMs(15);
+        continue;
+      }
+      return input;
+    }
+
+    if (bytesRead === 0) {
+      if (process.stdin.isTTY) {
+        sleepSyncMs(15);
+        continue;
+      }
+      return input;
+    }
+
+    const char = buffer.toString('utf8', 0, bytesRead);
+    if (char === '\n' || char === '\r') {
+      return input;
+    }
+    input += char;
+  }
+}
+
+function promptYesNo(question, defaultYes = true) {
+  const hint = defaultYes ? '[Y/n]' : '[y/N]';
+  while (true) {
+    process.stdout.write(`${question} ${hint} `);
+    const answer = readSingleLineFromStdin().trim().toLowerCase();
+
+    if (!answer) {
+      return defaultYes;
+    }
+    if (answer === 'y' || answer === 'yes') {
+      return true;
+    }
+    if (answer === 'n' || answer === 'no') {
+      return false;
+    }
+    process.stdout.write('Please answer with y or n.\n');
+  }
+}
+
+function envFlagEnabled(name) {
+  const raw = process.env[name];
+  if (raw == null) return false;
+  return ['1', 'true', 'yes', 'on'].includes(String(raw).trim().toLowerCase());
+}
+
+function parseAutoApproval(name) {
+  const raw = process.env[name];
+  if (raw == null) return null;
+  const normalized = String(raw).trim().toLowerCase();
+  if (['1', 'true', 'yes', 'y', 'on'].includes(normalized)) return true;
+  if (['0', 'false', 'no', 'n', 'off'].includes(normalized)) return false;
+  return null;
+}
+
+function parseBooleanLike(raw) {
+  if (raw == null) return null;
+  const normalized = String(raw).trim().toLowerCase();
+  if (!normalized) return null;
+  if (['1', 'true', 'yes', 'y', 'on'].includes(normalized)) return true;
+  if (['0', 'false', 'no', 'n', 'off'].includes(normalized)) return false;
+  return null;
+}
+
+function parseDotenvAssignmentValue(raw) {
+  let value = String(raw || '').trim();
+  if (!value) return '';
+  if ((value.startsWith('"') && value.endsWith('"')) || (value.startsWith('\'') && value.endsWith('\''))) {
+    return value.slice(1, -1).trim();
+  }
+  value = value.replace(/\s+#.*$/, '').trim();
+  return value;
+}
+
+function readRepoDotenvValue(repoRoot, name) {
+  const envPath = path.join(repoRoot, '.env');
+  if (!fs.existsSync(envPath)) return null;
+  const pattern = new RegExp(`^\\s*(?:export\\s+)?${name.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}\\s*=\\s*(.*)$`);
+  const lines = fs.readFileSync(envPath, 'utf8').split(/\r?\n/);
+  for (const line of lines) {
+    const trimmed = line.trim();
+    if (!trimmed || trimmed.startsWith('#')) continue;
+    const match = line.match(pattern);
+    if (!match) continue;
+    return parseDotenvAssignmentValue(match[1]);
+  }
+  return null;
+}
+
+function resolveGuardexRepoToggle(repoRoot, env = process.env) {
+  const envRaw = env[GUARDEX_REPO_TOGGLE_ENV];
+  const envEnabled = parseBooleanLike(envRaw);
+  if (envEnabled !== null) {
+    return {
+      enabled: envEnabled,
+      source: 'process environment',
+      raw: String(envRaw).trim(),
+    };
+  }
+
+  const dotenvRaw = readRepoDotenvValue(repoRoot, GUARDEX_REPO_TOGGLE_ENV);
+  const dotenvEnabled = parseBooleanLike(dotenvRaw);
+  if (dotenvEnabled !== null) {
+    return {
+      enabled: dotenvEnabled,
+      source: 'repo .env',
+      raw: String(dotenvRaw).trim(),
+    };
+  }
+
+  return {
+    enabled: true,
+    source: 'default',
+    raw: '',
+  };
+}
+
+function describeGuardexRepoToggle(toggle) {
+  if (!toggle || toggle.source === 'default') {
+    return 'default enabled mode';
+  }
+  return `${toggle.source} (${GUARDEX_REPO_TOGGLE_ENV}=${toggle.raw})`;
+}
+
+function parseVersionString(version) {
+  const match = String(version || '').trim().match(/^v?(\d+)\.(\d+)\.(\d+)/);
+  if (!match) return null;
+  return [
+    Number.parseInt(match[1], 10),
+    Number.parseInt(match[2], 10),
+    Number.parseInt(match[3], 10),
+  ];
+}
+
+function compareParsedVersions(left, right) {
+  if (!left || !right) return 0;
+  for (let index = 0; index < Math.max(left.length, right.length); index += 1) {
+    const leftValue = left[index] || 0;
+    const rightValue = right[index] || 0;
+    if (leftValue > rightValue) return 1;
+    if (leftValue < rightValue) return -1;
+  }
+  return 0;
+}
+
+function isNewerVersion(latest, current) {
+  const latestParts = parseVersionString(latest);
+  const currentParts = parseVersionString(current);
+
+  if (!latestParts || !currentParts) {
+    return String(latest || '').trim() !== String(current || '').trim();
+  }
+
+  return compareParsedVersions(latestParts, currentParts) > 0;
+}
+
+function parseNpmVersionOutput(stdout) {
+  const trimmed = String(stdout || '').trim();
+  if (!trimmed) return '';
+
+  try {
+    const parsed = JSON.parse(trimmed);
+    if (Array.isArray(parsed)) {
+      return String(parsed[parsed.length - 1] || '').trim();
+    }
+    return String(parsed || '').trim();
+  } catch {
+    const firstLine = trimmed.split('\n').map((line) => line.trim()).find(Boolean);
+    return firstLine || '';
+  }
+}
+
+function checkForGuardexUpdate() {
+  if (envFlagEnabled('GUARDEX_SKIP_UPDATE_CHECK')) {
+    return { checked: false, reason: 'disabled' };
+  }
+
+  const forceCheck = envFlagEnabled('GUARDEX_FORCE_UPDATE_CHECK');
+  if (!forceCheck && !isInteractiveTerminal()) {
+    return { checked: false, reason: 'non-interactive' };
+  }
+
+  const result = run(NPM_BIN, ['view', packageJson.name, 'version', '--json'], { timeout: 5000 });
+  if (result.status !== 0) {
+    return { checked: false, reason: 'lookup-failed' };
+  }
+
+  const latest = parseNpmVersionOutput(result.stdout);
+  if (!latest) {
+    return { checked: false, reason: 'invalid-latest-version' };
+  }
+
+  return {
+    checked: true,
+    current: packageJson.version,
+    latest,
+    updateAvailable: isNewerVersion(latest, packageJson.version),
+  };
+}
+
+function printUpdateAvailableBanner(current, latest) {
+  const title = colorize('UPDATE AVAILABLE', '1;33');
+  console.log(`[${TOOL_NAME}] ${title}`);
+  console.log(`[${TOOL_NAME}] Current: ${current}`);
+  console.log(`[${TOOL_NAME}] Latest : ${latest}`);
+  console.log(`[${TOOL_NAME}] Command: ${NPM_BIN} i -g ${packageJson.name}@latest`);
+}
+
+function maybeSelfUpdateBeforeStatus() {
+  return getToolchainApi().maybeSelfUpdateBeforeStatus();
+}
+
+function readInstalledGuardexVersion() {
+  const installInfo = readInstalledGuardexInstallInfo();
+  return installInfo ? installInfo.version : null;
+}
+
+function readInstalledGuardexInstallInfo() {
+  // Resolves the globally-installed package's on-disk version so we can
+  // verify npm actually wrote new bytes. Uses `npm root -g` to locate the
+  // global install root so we don't accidentally read the running source
+  // tree (which is the file the CLI was spawned from — that IS the global
+  // copy in the normal case, but a bump should be visible via a fresh read
+  // either way). Returns null if we can't determine it.
+  try {
+    const rootResult = run(NPM_BIN, ['root', '-g'], { timeout: 5000 });
+    if (rootResult.status !== 0) {
+      return null;
+    }
+    const globalRoot = String(rootResult.stdout || '').trim();
+    if (!globalRoot) {
+      return null;
+    }
+    const installedPkgPath = path.join(globalRoot, packageJson.name, 'package.json');
+    if (!fs.existsSync(installedPkgPath)) {
+      return null;
+    }
+    const parsed = JSON.parse(fs.readFileSync(installedPkgPath, 'utf8'));
+    if (parsed && typeof parsed.version === 'string') {
+      let binRelative = null;
+      if (typeof parsed.bin === 'string') {
+        binRelative = parsed.bin;
+      } else if (parsed.bin && typeof parsed.bin === 'object') {
+        const invokedName = path.basename(process.argv[1] || '');
+        binRelative =
+          parsed.bin[invokedName] ||
+          parsed.bin[SHORT_TOOL_NAME] ||
+          Object.values(parsed.bin).find((value) => typeof value === 'string') ||
+          null;
+      }
+      const packageRoot = path.dirname(installedPkgPath);
+      const binPath = binRelative ? path.join(packageRoot, binRelative) : null;
+      return {
+        version: parsed.version,
+        packageRoot,
+        binPath,
+      };
+    }
+  } catch (error) {
+    return null;
+  }
+  return null;
+}
+
+function restartIntoUpdatedGuardex(expectedVersion) {
+  const installInfo = readInstalledGuardexInstallInfo();
+  if (!installInfo || installInfo.version !== expectedVersion || installInfo.version === packageJson.version) {
+    return;
+  }
+  if (!installInfo.binPath || !fs.existsSync(installInfo.binPath)) {
+    console.log(`[${TOOL_NAME}] Restart required to use ${installInfo.version}. Rerun ${SHORT_TOOL_NAME}.`);
+    return;
+  }
+
+  console.log(`[${TOOL_NAME}] Restarting into ${installInfo.version}…`);
+  const restartResult = cp.spawnSync(
+    process.execPath,
+    [installInfo.binPath, ...process.argv.slice(2)],
+    {
+      cwd: process.cwd(),
+      env: {
+        ...process.env,
+        GUARDEX_SKIP_UPDATE_CHECK: '1',
+      },
+      stdio: 'inherit',
+    },
+  );
+  if (restartResult.error) {
+    console.log(
+      `[${TOOL_NAME}] Restart into ${installInfo.version} failed. Rerun ${SHORT_TOOL_NAME}.`,
+    );
+    return;
+  }
+  process.exit(restartResult.status == null ? 0 : restartResult.status);
+}
+
+function checkForOpenSpecPackageUpdate() {
+  if (envFlagEnabled('GUARDEX_SKIP_OPENSPEC_UPDATE_CHECK')) {
+    return { checked: false, reason: 'disabled' };
+  }
+
+  const forceCheck = envFlagEnabled('GUARDEX_FORCE_OPENSPEC_UPDATE_CHECK');
+  if (!forceCheck && !isInteractiveTerminal()) {
+    return { checked: false, reason: 'non-interactive' };
+  }
+
+  const detection = detectGlobalToolchainPackages();
+  if (!detection.ok) {
+    return { checked: false, reason: 'package-detect-failed' };
+  }
+
+  const current = String((detection.installedVersions || {})[OPENSPEC_PACKAGE] || '').trim();
+  if (!current) {
+    return { checked: false, reason: 'not-installed' };
+  }
+
+  const latestResult = run(NPM_BIN, ['view', OPENSPEC_PACKAGE, 'version', '--json'], { timeout: 5000 });
+  if (latestResult.status !== 0) {
+    return { checked: false, reason: 'lookup-failed' };
+  }
+
+  const latest = parseNpmVersionOutput(latestResult.stdout);
+  if (!latest) {
+    return { checked: false, reason: 'invalid-latest-version' };
+  }
+
+  return {
+    checked: true,
+    current,
+    latest,
+    updateAvailable: isNewerVersion(latest, current),
+  };
+}
+
+function printOpenSpecUpdateAvailableBanner(current, latest) {
+  const title = colorize('OPENSPEC UPDATE AVAILABLE', '1;33');
+  console.log(`[${TOOL_NAME}] ${title}`);
+  console.log(`[${TOOL_NAME}] Current: ${current}`);
+  console.log(`[${TOOL_NAME}] Latest : ${latest}`);
+  console.log(`[${TOOL_NAME}] Command: ${NPM_BIN} i -g ${OPENSPEC_PACKAGE}@latest`);
+  console.log(`[${TOOL_NAME}] Then : ${OPENSPEC_BIN} update`);
+}
+
+function maybeOpenSpecUpdateBeforeStatus() {
+  return getToolchainApi().maybeOpenSpecUpdateBeforeStatus();
+}
+
+function promptYesNoStrict(question) {
+  while (true) {
+    process.stdout.write(`${question} [y/n] `);
+    const answer = readSingleLineFromStdin().trim().toLowerCase();
+
+    if (answer === 'y' || answer === 'yes') {
+      process.stdout.write('\n');
+      return true;
+    }
+    if (answer === 'n' || answer === 'no') {
+      process.stdout.write('\n');
+      return false;
+    }
+
+    process.stdout.write('Please answer with y or n.\n');
+  }
+}
+
+function resolveGlobalInstallApproval(options) {
+  if (options.yesGlobalInstall && options.noGlobalInstall) {
+    throw new Error('Cannot use both --yes-global-install and --no-global-install');
+  }
+
+  if (options.yesGlobalInstall) {
+    return { approved: true, source: 'flag' };
+  }
+
+  if (options.noGlobalInstall) {
+    return { approved: false, source: 'flag' };
+  }
+
+  if (!isInteractiveTerminal()) {
+    return { approved: false, source: 'non-interactive-default' };
+  }
+  return { approved: true, source: 'prompt' };
+}
+
+function getGlobalToolchainService(packageName) {
+  const service = GLOBAL_TOOLCHAIN_SERVICES.find(
+    (candidate) => candidate.packageName === packageName,
+  );
+  return service || { name: packageName, packageName };
+}
+
+function formatGlobalToolchainServiceName(packageName) {
+  return getGlobalToolchainService(packageName).name;
+}
+
+function describeMissingGlobalDependencyWarnings(packageNames) {
+  return packageNames
+    .map((packageName) => getGlobalToolchainService(packageName))
+    .filter((service) => service.dependencyUrl)
+    .map(
+      (service) =>
+        `Guardex needs ${service.name} as a dependency: ${service.dependencyUrl}`,
+    );
+}
+
+function buildMissingCompanionInstallPrompt(missingPackages, missingLocalTools) {
+  const dependencyWarnings = describeMissingGlobalDependencyWarnings(missingPackages);
+  const installCommands = describeCompanionInstallCommands(missingPackages, missingLocalTools);
+  const dependencyPrefix = dependencyWarnings.length > 0
+    ? `${dependencyWarnings.join(' ')} `
+    : '';
+  return `${dependencyPrefix}Install missing companion tools now? (${installCommands.join(' && ')})`;
+}
+
+function detectGlobalToolchainPackages() {
+  const result = run(NPM_BIN, ['list', '-g', '--depth=0', '--json']);
+  if (result.status !== 0) {
+    const stderr = (result.stderr || '').trim();
+    return {
+      ok: false,
+      error: stderr || 'Unable to detect globally installed npm packages',
+    };
+  }
+
+  let parsed;
+  try {
+    parsed = JSON.parse(result.stdout || '{}');
+  } catch (error) {
+    return {
+      ok: false,
+      error: `Failed to parse npm list output: ${error.message}`,
+    };
+  }
+
+  const dependencyMap = parsed && parsed.dependencies && typeof parsed.dependencies === 'object'
+    ? parsed.dependencies
+    : {};
+  const installedSet = new Set(Object.keys(dependencyMap));
+
+  const installed = [];
+  const missing = [];
+  const installedVersions = {};
+  for (const pkg of GLOBAL_TOOLCHAIN_PACKAGES) {
+    if (installedSet.has(pkg)) {
+      installed.push(pkg);
+      const rawVersion = dependencyMap[pkg] && dependencyMap[pkg].version;
+      const version = String(rawVersion || '').trim();
+      if (version) {
+        installedVersions[pkg] = version;
+      }
+    } else {
+      missing.push(pkg);
+    }
+  }
+
+  return { ok: true, installed, missing, installedVersions };
+}
+
+function detectRequiredSystemTools() {
+  const services = [];
+  for (const tool of REQUIRED_SYSTEM_TOOLS) {
+    const result = run(tool.command, ['--version']);
+    const active = result.status === 0;
+    const rawReason = result.error && result.error.code
+      ? result.error.code
+      : (result.stderr || '').trim();
+    const reason = rawReason.split('\n')[0] || '';
+    services.push({
+      name: tool.name,
+      displayName: tool.displayName || tool.name,
+      command: tool.command,
+      installHint: tool.installHint,
+      status: active ? 'active' : 'inactive',
+      reason,
+    });
+  }
+  return services;
+}
+
+function detectOptionalLocalCompanionTools() {
+  return OPTIONAL_LOCAL_COMPANION_TOOLS.map((tool) => {
+    const detectedPath = tool.candidatePaths
+      .map((relativePath) => path.join(GUARDEX_HOME_DIR, relativePath))
+      .find((candidatePath) => fs.existsSync(candidatePath));
+    return {
+      name: tool.name,
+      displayName: tool.displayName || tool.name,
+      installCommand: tool.installCommand,
+      installArgs: [...tool.installArgs],
+      status: detectedPath ? 'active' : 'inactive',
+      detectedPath: detectedPath || null,
+    };
+  });
+}
+
+function describeCompanionInstallCommands(missingPackages, missingLocalTools) {
+  const commands = [];
+  if (missingPackages.length > 0) {
+    commands.push(`${NPM_BIN} i -g ${missingPackages.join(' ')}`);
+  }
+  for (const tool of missingLocalTools) {
+    commands.push(tool.installCommand);
+  }
+  return commands;
+}
+
+function askGlobalInstallForMissing(options, missingPackages, missingLocalTools) {
+  const approval = resolveGlobalInstallApproval(options);
+  if (!approval.approved) {
+    return approval;
+  }
+
+  if (approval.source === 'prompt') {
+    const approved = promptYesNoStrict(
+      buildMissingCompanionInstallPrompt(missingPackages, missingLocalTools),
+    );
+    return { approved, source: 'prompt' };
+  }
+
+  return approval;
+}
+
+function installGlobalToolchain(options) {
+  return getToolchainApi().installGlobalToolchain(options);
+}
+
+function gitRefExists(repoRoot, refName) {
+  return gitRun(repoRoot, ['show-ref', '--verify', '--quiet', refName], { allowFailure: true }).status === 0;
+}
+
+function findStaleLockPaths(repoRoot, locks) {
+  const stale = [];
+
+  for (const [filePath, rawEntry] of Object.entries(locks)) {
+    const entry = rawEntry && typeof rawEntry === 'object' ? rawEntry : {};
+    const ownerBranch = String(entry.branch || '');
+
+    const hasOwner = ownerBranch.length > 0;
+    const localRef = hasOwner ? `refs/heads/${ownerBranch}` : null;
+    const remoteRef = hasOwner ? `refs/remotes/origin/${ownerBranch}` : null;
+    const branchExists = hasOwner
+      ? gitRefExists(repoRoot, localRef) || gitRefExists(repoRoot, remoteRef)
+      : false;
+
+    const pathExists = fs.existsSync(path.join(repoRoot, filePath));
+
+    if (!hasOwner || !branchExists || !pathExists) {
+      stale.push(filePath);
+    }
+  }
+
+  return stale;
+}
+
+function runInstallInternal(options) {
+  const repoRoot = resolveRepoRoot(options.target);
+  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
+  if (!guardexToggle.enabled) {
+    return {
+      repoRoot,
+      operations: [
+        {
+          status: 'skipped',
+          file: '.env',
+          note: `Guardex disabled by ${describeGuardexRepoToggle(guardexToggle)}`,
+        },
+      ],
+      hookResult: { status: 'skipped', key: 'core.hooksPath', value: '(unchanged)' },
+      guardexEnabled: false,
+      guardexToggle,
+    };
+  }
+  const operations = [];
+
+  if (!options.skipGitignore) {
+    operations.push(ensureManagedGitignore(repoRoot, Boolean(options.dryRun)));
+  }
+
+  operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun)));
+
+  for (const templateFile of TEMPLATE_FILES) {
+    operations.push(
+      copyTemplateFile(
+        repoRoot,
+        templateFile,
+        shouldForceManagedPath(options, toDestinationPath(templateFile)),
+        Boolean(options.dryRun),
+      ),
+    );
+  }
+  operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options));
+  for (const hookName of HOOK_NAMES) {
+    const hookRelativePath = path.posix.join('.githooks', hookName);
+    operations.push(
+      ensureHookShim(repoRoot, hookName, {
+        dryRun: options.dryRun,
+        force: shouldForceManagedPath(options, hookRelativePath),
+      }),
+    );
+  }
+
+  operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
+
+  if (!options.skipAgents) {
+    operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
+  }
+
+  const hookResult = configureHooks(repoRoot, Boolean(options.dryRun));
+
+  return { repoRoot, operations, hookResult, guardexEnabled: true, guardexToggle };
+}
+
+function runFixInternal(options) {
+  const repoRoot = resolveRepoRoot(options.target);
+  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
+  if (!guardexToggle.enabled) {
+    return {
+      repoRoot,
+      operations: [
+        {
+          status: 'skipped',
+          file: '.env',
+          note: `Guardex disabled by ${describeGuardexRepoToggle(guardexToggle)}`,
+        },
+      ],
+      hookResult: { status: 'skipped', key: 'core.hooksPath', value: '(unchanged)' },
+      guardexEnabled: false,
+      guardexToggle,
+    };
+  }
+  const operations = [];
+
+  if (!options.skipGitignore) {
+    operations.push(ensureManagedGitignore(repoRoot, Boolean(options.dryRun)));
+  }
+
+  operations.push(...ensureOmxScaffold(repoRoot, Boolean(options.dryRun)));
+
+  for (const templateFile of TEMPLATE_FILES) {
+    if (shouldForceManagedPath(options, toDestinationPath(templateFile))) {
+      operations.push(copyTemplateFile(repoRoot, templateFile, true, Boolean(options.dryRun)));
+      continue;
+    }
+    operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun)));
+  }
+  operations.push(...ensureTargetedLegacyWorkflowShims(repoRoot, options));
+  for (const hookName of HOOK_NAMES) {
+    const hookRelativePath = path.posix.join('.githooks', hookName);
+    operations.push(
+      ensureHookShim(repoRoot, hookName, {
+        dryRun: options.dryRun,
+        force: shouldForceManagedPath(options, hookRelativePath),
+      }),
+    );
+  }
+
+  operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
+
+  const lockState = lockStateOrError(repoRoot);
+  if (!lockState.ok) {
+    if (!options.dryRun) {
+      writeLockState(repoRoot, { locks: {} }, false);
+    }
+    operations.push({
+      status: options.dryRun ? 'would-reset' : 'reset',
+      file: LOCK_FILE_RELATIVE,
+      note: 'invalid lock state reset to empty',
+    });
+  } else {
+    const staleLockPaths = options.dropStaleLocks ? findStaleLockPaths(repoRoot, lockState.locks) : [];
+    if (staleLockPaths.length > 0) {
+      const updated = { ...lockState.raw, locks: { ...lockState.locks } };
+      for (const filePath of staleLockPaths) {
+        delete updated.locks[filePath];
+      }
+      writeLockState(repoRoot, updated, Boolean(options.dryRun));
+      operations.push({
+        status: options.dryRun ? 'would-prune' : 'pruned',
+        file: LOCK_FILE_RELATIVE,
+        note: `removed ${staleLockPaths.length} stale lock(s)`,
+      });
+    }
+  }
+
+  if (!options.skipAgents) {
+    operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
+  }
+
+  const hookResult = configureHooks(repoRoot, Boolean(options.dryRun));
+
+  return { repoRoot, operations, hookResult, guardexEnabled: true, guardexToggle };
+}
+
+function runScanInternal(options) {
+  const repoRoot = resolveRepoRoot(options.target);
+  const guardexToggle = resolveGuardexRepoToggle(repoRoot);
+  const branch = readBranchDisplayName(repoRoot);
+  if (!guardexToggle.enabled) {
+    return {
+      repoRoot,
+      branch,
+      findings: [],
+      errors: 0,
+      warnings: 0,
+      guardexEnabled: false,
+      guardexToggle,
+    };
+  }
+  const findings = [];
+
+  const requiredPaths = [
+    ...OMX_SCAFFOLD_DIRECTORIES,
+    ...Array.from(OMX_SCAFFOLD_FILES.keys()),
+    ...REQUIRED_MANAGED_REPO_FILES,
+  ];
+
+  for (const relativePath of requiredPaths) {
+    const absolutePath = path.join(repoRoot, relativePath);
+    if (!fs.existsSync(absolutePath)) {
+      findings.push({
+        level: 'error',
+        code: 'missing-managed-file',
+        path: relativePath,
+        message: `Missing managed repo file: ${relativePath}`,
+      });
+    }
+  }
+
+  const hooksPathResult = gitRun(repoRoot, ['config', '--get', 'core.hooksPath'], { allowFailure: true });
+  const hooksPath = hooksPathResult.status === 0 ? hooksPathResult.stdout.trim() : '';
+  if (hooksPath !== '.githooks') {
+    findings.push({
+      level: 'warn',
+      code: 'hooks-path-mismatch',
+      message: `git core.hooksPath is '${hooksPath || '(unset)'}' (expected '.githooks')`,
+    });
+  }
+
+  const lockState = lockStateOrError(repoRoot);
+  if (!lockState.ok) {
+    findings.push({
+      level: 'error',
+      code: 'lock-state-invalid',
+      message: lockState.error,
+    });
+  } else {
+    for (const [filePath, rawEntry] of Object.entries(lockState.locks)) {
+      const entry = rawEntry && typeof rawEntry === 'object' ? rawEntry : {};
+      const ownerBranch = String(entry.branch || '');
+      const allowDelete = Boolean(entry.allow_delete);
+
+      if (!ownerBranch) {
+        findings.push({
+          level: 'warn',
+          code: 'lock-missing-owner',
+          path: filePath,
+          message: `Lock entry has no owner branch: ${filePath}`,
+        });
+      }
+
+      const absolutePath = path.join(repoRoot, filePath);
+      if (!fs.existsSync(absolutePath)) {
+        findings.push({
+          level: 'warn',
+          code: 'lock-target-missing',
+          path: filePath,
+          message: `Locked path is missing from disk: ${filePath}`,
+        });
+      }
+
+      if (ownerBranch) {
+        const localRef = `refs/heads/${ownerBranch}`;
+        const remoteRef = `refs/remotes/origin/${ownerBranch}`;
+        if (!gitRefExists(repoRoot, localRef) && !gitRefExists(repoRoot, remoteRef)) {
+          findings.push({
+            level: 'warn',
+            code: 'stale-branch-lock',
+            path: filePath,
+            message: `Lock owner branch not found locally/remotely: ${ownerBranch} (${filePath})`,
+          });
+        }
+      }
+
+      if (allowDelete && CRITICAL_GUARDRAIL_PATHS.has(filePath)) {
+        findings.push({
+          level: 'error',
+          code: 'guardrail-delete-approved',
+          path: filePath,
+          message: `Critical guardrail file is delete-approved: ${filePath}`,
+        });
+      }
+    }
+  }
+
+  const errors = findings.filter((item) => item.level === 'error');
+  const warnings = findings.filter((item) => item.level === 'warn');
+
+  return {
+    repoRoot,
+    branch,
+    findings,
+    errors: errors.length,
+    warnings: warnings.length,
+    guardexEnabled: true,
+    guardexToggle,
+  };
+}
+
+function printOperations(title, payload, dryRun = false) {
+  return scaffoldModule.printOperations(title, payload, dryRun);
+}
+
+function printScanResult(scan, json = false) {
+  if (json) {
+    process.stdout.write(
+      JSON.stringify(
+        {
+          repoRoot: scan.repoRoot,
+          branch: scan.branch,
+          guardexEnabled: scan.guardexEnabled !== false,
+          guardexToggle: scan.guardexToggle || null,
+          errors: scan.errors,
+          warnings: scan.warnings,
+          findings: scan.findings,
+        },
+        null,
+        2,
+      ) + '\n',
+    );
+    return;
+  }
+
+  console.log(`[${TOOL_NAME}] Scan target: ${scan.repoRoot}`);
+  console.log(`[${TOOL_NAME}] Branch: ${scan.branch}`);
+
+  if (scan.guardexEnabled === false) {
+    console.log(
+      colorizeDoctorOutput(
+        `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(scan.guardexToggle)}).`,
+        'disabled',
+      ),
+    );
+    return;
+  }
+
+  if (scan.findings.length === 0) {
+    console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ No safety issues detected.`, 'safe'));
+    return;
+  }
+
+  for (const item of scan.findings) {
+    const target = item.path ? ` (${item.path})` : '';
+    console.log(
+      colorizeDoctorOutput(
+        `[${item.level.toUpperCase()}] ${item.code}${target}: ${item.message}`,
+        item.level,
+      ),
+    );
+  }
+  console.log(
+    colorizeDoctorOutput(
+      `[${TOOL_NAME}] Summary: ${scan.errors} error(s), ${scan.warnings} warning(s).`,
+      scan.errors > 0 ? 'error' : 'warn',
+    ),
+  );
+}
+
+function setExitCodeFromScan(scan) {
+  if (scan.guardexEnabled === false) {
+    process.exitCode = 0;
+    return;
+  }
+  if (scan.errors > 0) {
+    process.exitCode = 2;
+    return;
+  }
+  if (scan.warnings > 0) {
+    process.exitCode = 1;
+    return;
+  }
+  process.exitCode = 0;
+}
+
+function status(rawArgs) {
+  const options = parseCommonArgs(rawArgs, {
+    target: process.cwd(),
+    json: false,
+  });
+
+  const toolchain = detectGlobalToolchainPackages();
+  const npmServices = GLOBAL_TOOLCHAIN_PACKAGES.map((pkg) => {
+    const service = getGlobalToolchainService(pkg);
+    if (!toolchain.ok) {
+      return {
+        name: service.name,
+        displayName: service.name,
+        packageName: pkg,
+        dependencyUrl: service.dependencyUrl || null,
+        status: 'unknown',
+      };
+    }
+    return {
+      name: service.name,
+      displayName: service.name,
+      packageName: pkg,
+      dependencyUrl: service.dependencyUrl || null,
+      status: toolchain.installed.includes(pkg) ? 'active' : 'inactive',
+    };
+  });
+  const localCompanionServices = detectOptionalLocalCompanionTools().map((tool) => ({
+    name: tool.name,
+    displayName: tool.displayName || tool.name,
+    status: tool.status,
+  }));
+  const requiredSystemTools = detectRequiredSystemTools();
+  const services = [
+    ...npmServices,
+    ...localCompanionServices,
+    ...requiredSystemTools.map((tool) => ({
+      name: tool.name,
+      displayName: tool.displayName || tool.name,
+      status: tool.status,
+    })),
+  ];
+
+  const targetPath = path.resolve(options.target);
+  const inGitRepo = isGitRepo(targetPath);
+  const scanResult = inGitRepo ? runScanInternal({ target: targetPath, json: false }) : null;
+  const repoServiceStatus = scanResult
+    ? (scanResult.guardexEnabled === false
+      ? 'disabled'
+      : (scanResult.errors === 0 && scanResult.warnings === 0 ? 'active' : 'degraded'))
+    : 'inactive';
+
+  const payload = {
+    cli: {
+      name: packageJson.name,
+      version: packageJson.version,
+      runtime: runtimeVersion(),
+    },
+    services,
+    repo: {
+      target: targetPath,
+      inGitRepo,
+      serviceStatus: repoServiceStatus,
+      guardexEnabled: scanResult ? scanResult.guardexEnabled !== false : null,
+      guardexToggle: scanResult ? scanResult.guardexToggle || null : null,
+      scan: scanResult
+        ? {
+          repoRoot: scanResult.repoRoot,
+          branch: scanResult.branch,
+          errors: scanResult.errors,
+          warnings: scanResult.warnings,
+          findings: scanResult.findings.length,
+        }
+        : null,
+    },
+    detectionError: toolchain.ok ? null : toolchain.error,
+  };
+
+  if (options.json) {
+    process.stdout.write(`${JSON.stringify(payload, null, 2)}\n`);
+    process.exitCode = 0;
+    return;
+  }
+
+  console.log(`[${TOOL_NAME}] CLI: ${payload.cli.runtime}`);
+  if (!toolchain.ok) {
+    console.log(`[${TOOL_NAME}] ⚠️ Could not detect global services: ${toolchain.error}`);
+  }
+
+  console.log(`[${TOOL_NAME}] Global services:`);
+  for (const service of services) {
+    const serviceLabel = service.displayName || service.name;
+    console.log(` - ${statusDot(service.status)} ${serviceLabel}: ${service.status}`);
+  }
+  const inactiveOptionalCompanions = [...npmServices, ...localCompanionServices]
+    .filter((service) => service.status !== 'active')
+    .map((service) => service.displayName || service.name);
+  if (inactiveOptionalCompanions.length > 0) {
+    console.log(
+      `[${TOOL_NAME}] Optional companion tools inactive: ${inactiveOptionalCompanions.join(', ')}`,
+    );
+    for (const warning of describeMissingGlobalDependencyWarnings(
+      npmServices
+        .filter((service) => service.status === 'inactive')
+        .map((service) => service.packageName),
+    )) {
+      console.log(`[${TOOL_NAME}] ${warning}`);
+    }
+    console.log(
+      `[${TOOL_NAME}] Run '${SHORT_TOOL_NAME} setup' to install missing companions with an explicit Y/N prompt.`,
+    );
+  }
+  const missingSystemTools = requiredSystemTools.filter((tool) => tool.status !== 'active');
+  if (missingSystemTools.length > 0) {
+    const tools = missingSystemTools
+      .map((tool) => tool.displayName || tool.name)
+      .join(', ');
+    console.log(`[${TOOL_NAME}] ⚠️ Missing required system tool(s): ${tools}`);
+    for (const tool of missingSystemTools) {
+      const reasonText = tool.reason ? ` (${tool.reason})` : '';
+      console.log(` - install ${tool.name}: ${tool.installHint}${reasonText}`);
+    }
+  }
+
+  if (!scanResult) {
+    console.log(
+      `[${TOOL_NAME}] Repo safety service: ${statusDot('inactive')} inactive (no git repository at target).`,
+    );
+    process.exitCode = 0;
+    return;
+  }
+
+  if (scanResult.guardexEnabled === false) {
+    console.log(
+      `[${TOOL_NAME}] Repo safety service: ${statusDot('disabled')} disabled (${describeGuardexRepoToggle(scanResult.guardexToggle)}).`,
+    );
+    console.log(`[${TOOL_NAME}] Repo: ${scanResult.repoRoot}`);
+    console.log(`[${TOOL_NAME}] Branch: ${scanResult.branch}`);
+    printToolLogsSummary();
+    process.exitCode = 0;
+    return;
+  }
+
+  if (scanResult.errors === 0 && scanResult.warnings === 0) {
+    console.log(`[${TOOL_NAME}] Repo safety service: ${statusDot('active')} active.`);
+  } else if (scanResult.errors === 0) {
+    console.log(
+      `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.warnings} warning(s)).`,
+    );
+    console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' to review warning details.`);
+  } else if (scanResult.warnings === 0) {
+    console.log(
+      `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.errors} error(s)).`,
+    );
+    console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' for detailed findings.`);
+  } else {
+    console.log(
+      `[${TOOL_NAME}] Repo safety service: ${statusDot('degraded')} degraded (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`,
+    );
+    console.log(`[${TOOL_NAME}] Run '${TOOL_NAME} scan' for detailed findings.`);
+  }
+  console.log(`[${TOOL_NAME}] Repo: ${scanResult.repoRoot}`);
console.log(`[${TOOL_NAME}] Branch: ${scanResult.branch}`); + printToolLogsSummary(); + + process.exitCode = 0; +} + +function install(rawArgs) { + const options = parseCommonArgs(rawArgs, { + target: process.cwd(), + force: false, + skipAgents: false, + skipPackageJson: false, + skipGitignore: false, + dryRun: false, + allowProtectedBaseWrite: false, + }); + + assertProtectedMainWriteAllowed(options, 'install'); + const payload = runInstallInternal(options); + printOperations('Install target', payload, options.dryRun); + + if (!options.dryRun) { + if (payload.guardexEnabled === false) { + console.log( + `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(payload.guardexToggle)}). Skipping repo bootstrap.`, + ); + process.exitCode = 0; + return; + } + if (!options.skipAgents) { + console.log(`[${TOOL_NAME}] AGENTS.md managed policy block is configured by install.`); + } + console.log(`[${TOOL_NAME}] Installed. Next step: ${TOOL_NAME} setup`); + } + + process.exitCode = 0; +} + +function fix(rawArgs) { + const options = parseCommonArgs(rawArgs, { + target: process.cwd(), + dropStaleLocks: true, + skipAgents: false, + skipPackageJson: false, + skipGitignore: false, + dryRun: false, + allowProtectedBaseWrite: false, + }); + + assertProtectedMainWriteAllowed(options, 'fix'); + const payload = runFixInternal(options); + printOperations('Fix target', payload, options.dryRun); + + if (!options.dryRun) { + if (payload.guardexEnabled === false) { + console.log( + `[${TOOL_NAME}] Guardex is disabled for this repo (${describeGuardexRepoToggle(payload.guardexToggle)}). Skipping repo repair.`, + ); + process.exitCode = 0; + return; + } + console.log(`[${TOOL_NAME}] Repair complete. 
Next step: ${TOOL_NAME} scan`); + } + + process.exitCode = 0; +} + +function scan(rawArgs) { + const options = parseCommonArgs(rawArgs, { + target: process.cwd(), + json: false, + }); + + const result = runScanInternal(options); + printScanResult(result, options.json); + setExitCodeFromScan(result); +} + +function doctor(rawArgs) { + const options = parseDoctorArgs(rawArgs); + const topRepoRoot = resolveRepoRoot(options.target); + const discoveredRepos = options.recursive + ? discoverNestedGitRepos(topRepoRoot, { + maxDepth: options.nestedMaxDepth, + extraSkip: options.nestedSkipDirs, + includeSubmodules: options.includeSubmodules, + }) + : [topRepoRoot]; + + if (discoveredRepos.length > 1) { + if (!options.json) { + console.log( + `[${TOOL_NAME}] Detected ${discoveredRepos.length} git repos under ${topRepoRoot}. ` + + `Repairing each with doctor (use --single-repo to limit to the target).`, + ); + } + + const repoResults = []; + let aggregateExitCode = 0; + for (let repoIndex = 0; repoIndex < discoveredRepos.length; repoIndex += 1) { + const repoPath = discoveredRepos[repoIndex]; + const progressLabel = `${repoIndex + 1}/${discoveredRepos.length}`; + if (!options.json) { + console.log(`[${TOOL_NAME}] ── Doctor target: ${repoPath} [${progressLabel}] ──`); + } + + const childArgs = [ + path.resolve(__filename), + 'doctor', + '--single-repo', + '--target', + repoPath, + ...(options.force ? ['--force', ...(options.forceManagedPaths || [])] : []), + ...(options.dropStaleLocks ? [] : ['--keep-stale-locks']), + ...(options.skipAgents ? ['--skip-agents'] : []), + ...(options.skipPackageJson ? ['--skip-package-json'] : []), + ...(options.skipGitignore ? ['--no-gitignore'] : []), + ...(options.dryRun ? ['--dry-run'] : []), + // Recursive child doctor runs should report pending PR state immediately instead of blocking the parent loop. + '--no-wait-for-merge', + ...(options.verboseAutoFinish ? ['--verbose-auto-finish'] : []), + ...(options.json ? 
['--json'] : []), + ...(options.allowProtectedBaseWrite ? ['--allow-protected-base-write'] : []), + ]; + const startedAt = Date.now(); + const nestedResult = options.json + ? run(process.execPath, childArgs, { cwd: topRepoRoot }) + : cp.spawnSync(process.execPath, childArgs, { + cwd: topRepoRoot, + encoding: 'utf8', + stdio: 'inherit', + }); + if (isSpawnFailure(nestedResult)) { + throw nestedResult.error; + } + + const exitCode = typeof nestedResult.status === 'number' ? nestedResult.status : 1; + if (exitCode !== 0 && aggregateExitCode === 0) { + aggregateExitCode = exitCode; + } + + if (options.json) { + let parsedResult = null; + if (nestedResult.stdout) { + try { + parsedResult = JSON.parse(nestedResult.stdout); + } catch { + parsedResult = null; + } + } + repoResults.push( + parsedResult + ? { repoRoot: repoPath, exitCode, result: parsedResult } + : { + repoRoot: repoPath, + exitCode, + stdout: nestedResult.stdout || '', + stderr: nestedResult.stderr || '', + }, + ); + } else { + console.log( + `[${TOOL_NAME}] Doctor target complete: ${repoPath} [${progressLabel}] in ${formatElapsedDuration(Date.now() - startedAt)}.`, + ); + if (repoIndex < discoveredRepos.length - 1) { + process.stdout.write('\n'); + } + } + } + + if (options.json) { + process.stdout.write( + JSON.stringify( + { + repoRoot: topRepoRoot, + recursive: true, + repos: repoResults, + }, + null, + 2, + ) + '\n', + ); + } + + process.exitCode = aggregateExitCode; + return; + } + + const singleRepoOptions = { + ...options, + target: topRepoRoot, + }; + + const blocked = protectedBaseWriteBlock(singleRepoOptions, { requireBootstrap: false }); + if (blocked) { + runDoctorInSandbox(singleRepoOptions, blocked); + return; + } + + assertProtectedMainWriteAllowed(singleRepoOptions, 'doctor'); + const fixPayload = runFixInternal(singleRepoOptions); + const scanResult = runScanInternal({ target: singleRepoOptions.target, json: false }); + const currentBaseBranch = currentBranchName(scanResult.repoRoot); + 
const autoFinishSummary = scanResult.guardexEnabled === false + ? { + enabled: false, + attempted: 0, + completed: 0, + skipped: 0, + failed: 0, + details: [], + } + : autoFinishReadyAgentBranches(scanResult.repoRoot, { + baseBranch: currentBaseBranch, + dryRun: singleRepoOptions.dryRun, + waitForMerge: singleRepoOptions.waitForMerge, + }); + const safe = scanResult.guardexEnabled === false || (scanResult.errors === 0 && scanResult.warnings === 0); + const musafe = safe; + + if (singleRepoOptions.json) { + process.stdout.write( + JSON.stringify( + { + repoRoot: scanResult.repoRoot, + branch: scanResult.branch, + safe, + musafe, + fix: { + operations: fixPayload.operations, + hookResult: fixPayload.hookResult, + dryRun: Boolean(singleRepoOptions.dryRun), + }, + scan: { + guardexEnabled: scanResult.guardexEnabled !== false, + guardexToggle: scanResult.guardexToggle || null, + errors: scanResult.errors, + warnings: scanResult.warnings, + findings: scanResult.findings, + }, + autoFinish: autoFinishSummary, + }, + null, + 2, + ) + '\n', + ); + setExitCodeFromScan(scanResult); + return; + } + + printOperations('Doctor/fix', fixPayload, options.dryRun); + printScanResult(scanResult, false); + if (scanResult.guardexEnabled === false) { + console.log(`[${TOOL_NAME}] Repo-local Guardex enforcement is intentionally disabled.`); + setExitCodeFromScan(scanResult); + return; + } + printAutoFinishSummary(autoFinishSummary, { + baseBranch: currentBaseBranch, + verbose: singleRepoOptions.verboseAutoFinish, + }); + if (safe) { + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ✅ Repo is fully safe.`, 'safe')); + } else { + console.log( + colorizeDoctorOutput( + `[${TOOL_NAME}] ⚠️ Repo is not fully safe yet (${scanResult.errors} error(s), ${scanResult.warnings} warning(s)).`, + scanResult.errors > 0 ? 
'unsafe' : 'warn', + ), + ); + } + setExitCodeFromScan(scanResult); +} + +function review(rawArgs) { + const options = parseReviewArgs(rawArgs); + const repoRoot = resolveRepoRoot(options.target); + const result = runReviewBotCommand(repoRoot, options.passthroughArgs); + if (isSpawnFailure(result)) { + throw result.error; + } + + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = typeof result.status === 'number' ? result.status : 1; +} + +function agentsStatePathForRepo(repoRoot) { + return path.join(repoRoot, AGENTS_BOTS_STATE_RELATIVE); +} + +function readAgentsState(repoRoot) { + const statePath = agentsStatePathForRepo(repoRoot); + if (!fs.existsSync(statePath)) { + return null; + } + try { + return JSON.parse(fs.readFileSync(statePath, 'utf8')); + } catch (_error) { + return null; + } +} + +function writeAgentsState(repoRoot, state) { + const statePath = agentsStatePathForRepo(repoRoot); + fs.mkdirSync(path.dirname(statePath), { recursive: true }); + fs.writeFileSync(statePath, `${JSON.stringify(state, null, 2)}\n`, 'utf8'); +} + +function processAlive(pid) { + const normalizedPid = Number.parseInt(String(pid || ''), 10); + if (!Number.isInteger(normalizedPid) || normalizedPid <= 0) { + return false; + } + try { + process.kill(normalizedPid, 0); + return true; + } catch (_error) { + return false; + } +} + +function sleepSeconds(seconds) { + const result = run('sleep', [String(seconds)]); + if (isSpawnFailure(result) || result.status !== 0) { + throw new Error(`sleep command failed for ${seconds}s`); + } +} + +function readProcessCommand(pid) { + const result = run('ps', ['-o', 'command=', '-p', String(pid)]); + if (isSpawnFailure(result) || result.status !== 0) { + return ''; + } + return String(result.stdout || '').trim(); +} + +function stopAgentProcessByPid(pid, expectedToken = '') { + const normalizedPid = Number.parseInt(String(pid || ''), 10); + if 
(!Number.isInteger(normalizedPid) || normalizedPid <= 0) { + return { status: 'invalid', pid: normalizedPid }; + } + if (!processAlive(normalizedPid)) { + return { status: 'not-running', pid: normalizedPid }; + } + + if (expectedToken) { + const cmdline = readProcessCommand(normalizedPid); + if (cmdline && !cmdline.includes(expectedToken)) { + return { status: 'mismatch', pid: normalizedPid, command: cmdline }; + } + } + + try { + process.kill(-normalizedPid, 'SIGTERM'); + } catch (_error) { + try { + process.kill(normalizedPid, 'SIGTERM'); + } catch (_err) { + return { status: 'term-failed', pid: normalizedPid }; + } + } + + const deadline = Date.now() + 3_000; + while (Date.now() < deadline) { + if (!processAlive(normalizedPid)) { + return { status: 'stopped', pid: normalizedPid }; + } + sleepSeconds(0.1); + } + + try { + process.kill(-normalizedPid, 'SIGKILL'); + } catch (_error) { + try { + process.kill(normalizedPid, 'SIGKILL'); + } catch (_err) { + return { status: 'kill-failed', pid: normalizedPid }; + } + } + sleepSeconds(0.1); + + return { + status: processAlive(normalizedPid) ? 
'kill-failed' : 'stopped',
+    pid: normalizedPid,
+  };
+}
+
+function spawnDetachedAgentProcess({ command, args, cwd, logPath }) {
+  fs.mkdirSync(path.dirname(logPath), { recursive: true });
+  const logHandle = fs.openSync(logPath, 'a');
+  fs.writeSync(
+    logHandle,
+    `[${new Date().toISOString()}] spawn: ${command} ${args.join(' ')}\n`,
+  );
+  const child = cp.spawn(command, args, {
+    cwd,
+    detached: true,
+    stdio: ['ignore', logHandle, logHandle],
+    env: process.env,
+  });
+  fs.closeSync(logHandle);
+  child.unref();
+  const pid = Number.parseInt(String(child.pid || ''), 10);
+  if (!Number.isInteger(pid) || pid <= 0) {
+    // cp.spawn() never sets a synchronous `error` property (that is spawnSync's
+    // result shape); a missing pid is the reliable signal that the spawn failed.
+    throw new Error(`Failed to spawn detached process for ${command}`);
+  }
+  return pid;
+}
+
+function agents(rawArgs) {
+  const options = parseAgentsArgs(rawArgs);
+  const repoRoot = resolveRepoRoot(options.target);
+  const statePath = agentsStatePathForRepo(repoRoot);
+
+  if (options.subcommand === 'start') {
+    const existingState = readAgentsState(repoRoot);
+    const existingReviewPid = Number.parseInt(String(existingState?.review?.pid || ''), 10);
+    const existingCleanupPid = Number.parseInt(String(existingState?.cleanup?.pid || ''), 10);
+    const reviewRunning = processAlive(existingReviewPid);
+    const cleanupRunning = processAlive(existingCleanupPid);
+
+    if (reviewRunning && cleanupRunning) {
+      console.log(
+        `[${TOOL_NAME}] Repo agents already running (review pid=${existingReviewPid}, cleanup pid=${existingCleanupPid}).`,
+      );
+      process.exitCode = 0;
+      return;
+    }
+
+    const reviewLogPath = path.join(repoRoot, '.omx', 'logs', 'agent-review.log');
+    const cleanupLogPath = path.join(repoRoot, '.omx', 'logs', 'agent-cleanup.log');
+
+    let reviewPid = existingReviewPid;
+    let cleanupPid = existingCleanupPid;
+    let startedAny = false;
+    let reusedAny = false;
+
+    if (!reviewRunning) {
+      reviewPid = spawnDetachedAgentProcess({
+        command: process.execPath,
+        args: [
+          path.resolve(__filename),
+
'internal', + 'run-shell', + 'reviewBot', + '--target', + repoRoot, + '--interval', + String(options.reviewIntervalSeconds), + ], + cwd: repoRoot, + logPath: reviewLogPath, + }); + startedAny = true; + } else { + reusedAny = true; + } + + if (!cleanupRunning) { + cleanupPid = spawnDetachedAgentProcess({ + command: process.execPath, + args: [ + path.resolve(__filename), + 'cleanup', + '--target', + repoRoot, + '--watch', + '--interval', + String(options.cleanupIntervalSeconds), + '--idle-minutes', + String(options.idleMinutes), + ], + cwd: repoRoot, + logPath: cleanupLogPath, + }); + startedAny = true; + } else { + reusedAny = true; + } + + const priorReviewInterval = Number.parseInt(String(existingState?.review?.intervalSeconds || ''), 10); + const priorCleanupInterval = Number.parseInt(String(existingState?.cleanup?.intervalSeconds || ''), 10); + const priorIdleMinutes = Number.parseInt(String(existingState?.cleanup?.idleMinutes || ''), 10); + const reviewIntervalSeconds = reviewRunning && Number.isInteger(priorReviewInterval) && priorReviewInterval >= 5 + ? priorReviewInterval + : options.reviewIntervalSeconds; + const cleanupIntervalSeconds = cleanupRunning && Number.isInteger(priorCleanupInterval) && priorCleanupInterval >= 5 + ? priorCleanupInterval + : options.cleanupIntervalSeconds; + const idleMinutes = cleanupRunning && Number.isInteger(priorIdleMinutes) && priorIdleMinutes >= 1 + ? 
priorIdleMinutes + : options.idleMinutes; + + writeAgentsState(repoRoot, { + schemaVersion: 1, + repoRoot, + startedAt: new Date().toISOString(), + review: { + pid: reviewPid, + intervalSeconds: reviewIntervalSeconds, + script: path.resolve(__filename), + logPath: reviewLogPath, + }, + cleanup: { + pid: cleanupPid, + intervalSeconds: cleanupIntervalSeconds, + idleMinutes, + script: path.resolve(__filename), + logPath: cleanupLogPath, + }, + }); + + console.log( + `[${TOOL_NAME}] Started repo agents in ${repoRoot} (review pid=${reviewPid}, cleanup pid=${cleanupPid}).`, + ); + if (reusedAny && startedAny) { + console.log(`[${TOOL_NAME}] Reused healthy bot process(es) and started only missing ones.`); + } + console.log(`[${TOOL_NAME}] Logs: ${reviewLogPath}, ${cleanupLogPath}`); + process.exitCode = 0; + return; + } + + if (options.subcommand === 'stop') { + const existingState = readAgentsState(repoRoot); + if (!existingState) { + console.log(`[${TOOL_NAME}] Repo agents are not running for ${repoRoot}.`); + process.exitCode = 0; + return; + } + + const reviewStop = stopAgentProcessByPid(existingState?.review?.pid, 'internal run-shell reviewBot'); + const cleanupStop = stopAgentProcessByPid(existingState?.cleanup?.pid, `${path.basename(__filename)} cleanup`); + + if (fs.existsSync(statePath)) { + fs.unlinkSync(statePath); + } + + console.log( + `[${TOOL_NAME}] Stopped repo agents in ${repoRoot} (review=${reviewStop.status}, cleanup=${cleanupStop.status}).`, + ); + process.exitCode = 0; + return; + } + + const existingState = readAgentsState(repoRoot); + if (!existingState) { + console.log(`[${TOOL_NAME}] Repo agents status: inactive (${repoRoot})`); + process.exitCode = 0; + return; + } + + const reviewPid = Number.parseInt(String(existingState?.review?.pid || ''), 10); + const cleanupPid = Number.parseInt(String(existingState?.cleanup?.pid || ''), 10); + console.log( + `[${TOOL_NAME}] Repo agents status: review=${processAlive(reviewPid) ? 
'running' : 'stopped'}(pid=${reviewPid || 0}), cleanup=${processAlive(cleanupPid) ? 'running' : 'stopped'}(pid=${cleanupPid || 0})`,
+  );
+  process.exitCode = 0;
+}
+
+function report(rawArgs) {
+  const options = parseReportArgs(rawArgs);
+  const subcommand = options.subcommand || 'help';
+  if (subcommand === 'help' || subcommand === '--help' || subcommand === '-h') {
+    console.log(
+      `${TOOL_NAME} report commands:\n` +
+        `  ${TOOL_NAME} report scorecard [--target <path>] [--repo github.com/<owner>/<repo>] [--scorecard-json <path>] [--output-dir <dir>] [--date YYYY-MM-DD] [--dry-run] [--json]\n` +
+        `\n` +
+        `Examples:\n` +
+        `  ${TOOL_NAME} report scorecard --repo github.com/recodeecom/multiagent-safety\n` +
+        `  ${TOOL_NAME} report scorecard --scorecard-json ./scorecard.json --date 2026-04-10`,
+    );
+    process.exitCode = 0;
+    return;
+  }
+
+  if (subcommand !== 'scorecard') {
+    throw new Error(`Unknown report subcommand: ${subcommand}`);
+  }
+
+  const repoRoot = resolveRepoRoot(options.target);
+  const repo = resolveScorecardRepo(repoRoot, options.repo);
+  const payload = options.scorecardJson
+    ? readScorecardJsonFile(options.scorecardJson)
+    : runScorecardJson(repo);
+
+  const reportDate = options.date || todayDateStamp();
+  const outputDir = path.resolve(options.outputDir || path.join(repoRoot, 'docs', 'reports'));
+  const baselinePath = path.join(outputDir, `openssf-scorecard-baseline-${reportDate}.md`);
+  const remediationPath = path.join(outputDir, `openssf-scorecard-remediation-plan-${reportDate}.md`);
+
+  const checks = normalizeScorecardChecks(payload);
+  const rawScore = Number(payload?.score);
+  const score = Number.isFinite(rawScore) ?
rawScore : 0; + const capturedAt = String(payload?.date || new Date().toISOString()); + const scorecardVersion = String(payload?.scorecard?.version || payload?.version || 'unknown'); + + const baselineMarkdown = renderScorecardBaselineMarkdown({ + repo, + score, + checks, + capturedAt, + scorecardVersion, + reportDate, + }); + + const remediationMarkdown = renderScorecardRemediationPlanMarkdown({ + baselineRelativePath: path.relative(repoRoot, baselinePath) || path.basename(baselinePath), + checks, + }); + + if (!options.dryRun) { + fs.mkdirSync(outputDir, { recursive: true }); + fs.writeFileSync(baselinePath, baselineMarkdown, 'utf8'); + fs.writeFileSync(remediationPath, remediationMarkdown, 'utf8'); + } + + if (options.json) { + process.stdout.write( + JSON.stringify( + { + repoRoot, + repo, + score, + checks: checks.length, + outputDir, + baselinePath, + remediationPath, + dryRun: Boolean(options.dryRun), + }, + null, + 2, + ) + '\n', + ); + process.exitCode = 0; + return; + } + + console.log(`[${TOOL_NAME}] Report target: ${repoRoot}`); + console.log(`[${TOOL_NAME}] Scorecard repo: ${repo}`); + console.log(`[${TOOL_NAME}] Score: ${score}/10`); + if (options.dryRun) { + console.log(`[${TOOL_NAME}] Dry run report paths:`); + } else { + console.log(`[${TOOL_NAME}] Generated reports:`); + } + console.log(` - ${baselinePath}`); + console.log(` - ${remediationPath}`); + process.exitCode = 0; +} + +function setup(rawArgs) { + const options = parseSetupArgs(rawArgs, { + target: process.cwd(), + force: false, + skipAgents: false, + skipPackageJson: false, + skipGitignore: false, + dryRun: false, + yesGlobalInstall: false, + noGlobalInstall: false, + allowProtectedBaseWrite: false, + }); + + const globalInstallStatus = installGlobalToolchain(options); + if (globalInstallStatus.status === 'installed') { + console.log( + `[${TOOL_NAME}] ✅ Companion tools installed (${(globalInstallStatus.packages || []).join(', ')}).`, + ); + } else if (globalInstallStatus.status === 
'already-installed') { + console.log(`[${TOOL_NAME}] ✅ Companion tools already installed. Skipping.`); + } else if (globalInstallStatus.status === 'failed') { + const installCommands = describeCompanionInstallCommands( + GLOBAL_TOOLCHAIN_PACKAGES, + OPTIONAL_LOCAL_COMPANION_TOOLS, + ); + console.log( + `[${TOOL_NAME}] ⚠️ Global install failed: ${globalInstallStatus.reason}\n` + + `[${TOOL_NAME}] Continue with local safety setup. You can retry later with:\n` + + installCommands.map((command) => ` ${command}`).join('\n'), + ); + } else if (globalInstallStatus.status === 'skipped' && globalInstallStatus.reason === 'non-interactive-default') { + console.log( + `[${TOOL_NAME}] Skipping companion installs (non-interactive mode). ` + + `Use --yes-global-install to force or run interactively for Y/N prompt.`, + ); + } else if (globalInstallStatus.status === 'skipped') { + console.log(`[${TOOL_NAME}] ⚠️ Companion installs skipped by user choice.`); + for (const warning of describeMissingGlobalDependencyWarnings( + globalInstallStatus.missingPackages || [], + )) { + console.log(`[${TOOL_NAME}] ⚠️ ${warning}`); + } + } + const requiredSystemTools = detectRequiredSystemTools(); + const missingSystemTools = requiredSystemTools.filter((tool) => tool.status !== 'active'); + if (missingSystemTools.length === 0) { + console.log(`[${TOOL_NAME}] ✅ Required system tools available (${requiredSystemTools.map((tool) => tool.name).join(', ')}).`); + } else { + const names = missingSystemTools.map((tool) => tool.name).join(', '); + console.log(`[${TOOL_NAME}] ⚠️ Missing required system tool(s): ${names}`); + for (const tool of missingSystemTools) { + const reasonText = tool.reason ? ` (${tool.reason})` : ''; + console.log(`[${TOOL_NAME}] Install ${tool.name}: ${tool.installHint}${reasonText}`); + } + } + + const topRepoRoot = resolveRepoRoot(options.target); + const discoveredRepos = options.recursive + ? 
discoverNestedGitRepos(topRepoRoot, { + maxDepth: options.nestedMaxDepth, + extraSkip: options.nestedSkipDirs, + includeSubmodules: options.includeSubmodules, + }) + : [topRepoRoot]; + + if (discoveredRepos.length > 1) { + console.log( + `[${TOOL_NAME}] Detected ${discoveredRepos.length} git repos under ${topRepoRoot}. Installing into each (use --no-recursive to limit to the top-level).`, + ); + for (const repoPath of discoveredRepos) { + const marker = repoPath === topRepoRoot ? ' (top-level)' : ''; + console.log(`[${TOOL_NAME}] - ${repoPath}${marker}`); + } + } + + let aggregateErrors = 0; + let aggregateWarnings = 0; + let lastScanResult = null; + + for (const repoPath of discoveredRepos) { + const perRepoOptions = { ...options, target: repoPath }; + const repoLabel = discoveredRepos.length > 1 ? ` [${path.relative(topRepoRoot, repoPath) || '.'}]` : ''; + + if (discoveredRepos.length > 1) { + console.log(`[${TOOL_NAME}] ── Setup target: ${repoPath} ──`); + } + + const blocked = protectedBaseWriteBlock(perRepoOptions); + if (blocked) { + const sandboxResult = runSetupInSandbox(perRepoOptions, blocked, repoLabel); + aggregateErrors += sandboxResult.scanResult.errors; + aggregateWarnings += sandboxResult.scanResult.warnings; + lastScanResult = sandboxResult.scanResult; + continue; + } + + const { installPayload, fixPayload, parentWorkspace } = runSetupBootstrapInternal(perRepoOptions); + printOperations(`Setup/install${repoLabel}`, installPayload, perRepoOptions.dryRun); + printOperations(`Setup/fix${repoLabel}`, fixPayload, perRepoOptions.dryRun); + + if (perRepoOptions.dryRun) { + continue; + } + + if (parentWorkspace) { + console.log(`[${TOOL_NAME}] Parent workspace view: ${parentWorkspace.workspacePath}`); + } + + const scanResult = runScanInternal({ target: repoPath, json: false }); + const currentBaseBranch = currentBranchName(scanResult.repoRoot); + const autoFinishSummary = autoFinishReadyAgentBranches(scanResult.repoRoot, { + baseBranch: currentBaseBranch, 
+ dryRun: perRepoOptions.dryRun, + }); + printScanResult(scanResult, false); + if (autoFinishSummary.enabled) { + console.log( + `[${TOOL_NAME}] Auto-finish sweep (base=${currentBaseBranch}): attempted=${autoFinishSummary.attempted}, completed=${autoFinishSummary.completed}, skipped=${autoFinishSummary.skipped}, failed=${autoFinishSummary.failed}`, + ); + for (const detail of autoFinishSummary.details) { + console.log(`[${TOOL_NAME}] ${detail}`); + } + } else if (autoFinishSummary.details.length > 0) { + console.log(`[${TOOL_NAME}] ${autoFinishSummary.details[0]}`); + } + printSetupRepoHints(scanResult.repoRoot, currentBaseBranch, repoLabel); + + aggregateErrors += scanResult.errors; + aggregateWarnings += scanResult.warnings; + lastScanResult = scanResult; + } + + if (options.dryRun) { + console.log(`[${TOOL_NAME}] Dry run setup done.`); + process.exitCode = 0; + return; + } + + if (aggregateErrors === 0 && aggregateWarnings === 0) { + const repoCount = discoveredRepos.length; + const suffix = repoCount > 1 ? 
` (${repoCount} repos)` : ''; + console.log(`[${TOOL_NAME}] ✅ Setup complete.${suffix}`); + console.log(`[${TOOL_NAME}] Copy AI setup prompt with: ${SHORT_TOOL_NAME} prompt`); + console.log( + `[${TOOL_NAME}] OpenSpec core workflow: /opsx:propose -> /opsx:apply -> /opsx:archive`, + ); + console.log( + `[${TOOL_NAME}] Optional expanded OpenSpec profile: openspec config profile && openspec update`, + ); + console.log(`[${TOOL_NAME}] OpenSpec guide: docs/openspec-getting-started.md`); + } + + if (lastScanResult) { + setExitCodeFromScan({ + ...lastScanResult, + errors: aggregateErrors, + warnings: aggregateWarnings, + }); + } +} + +function ensureMainBranch(repoRoot) { + const branchResult = gitRun(repoRoot, ['rev-parse', '--abbrev-ref', 'HEAD'], { allowFailure: true }); + if (branchResult.status !== 0) { + throw new Error(`Unable to detect current branch in ${repoRoot}`); + } + + const branch = branchResult.stdout.trim(); + if (branch !== 'main') { + throw new Error(`Release blocked: current branch is '${branch}' (required: 'main')`); + } +} + +function ensureCleanWorkingTree(repoRoot) { + const statusResult = gitRun(repoRoot, ['status', '--porcelain'], { allowFailure: true }); + if (statusResult.status !== 0) { + throw new Error(`Unable to read git status in ${repoRoot}`); + } + + const dirty = statusResult.stdout.trim(); + if (dirty.length > 0) { + throw new Error('Release blocked: working tree is not clean'); + } +} + +function readReleaseRepoPackageJson(repoRoot) { + const manifestPath = path.join(repoRoot, 'package.json'); + if (!fs.existsSync(manifestPath)) { + throw new Error(`Release blocked: package.json missing in ${repoRoot}`); + } + + try { + return JSON.parse(fs.readFileSync(manifestPath, 'utf8')); + } catch (error) { + throw new Error(`Release blocked: unable to parse package.json in ${repoRoot}: ${error.message}`); + } +} + +function resolveReleaseGithubRepo(repoRoot) { + const releasePackageJson = readReleaseRepoPackageJson(repoRoot); + const 
fromManifest = inferGithubRepoSlug( + releasePackageJson.repository && + (releasePackageJson.repository.url || releasePackageJson.repository), + ); + if (fromManifest) { + return fromManifest; + } + + const fromOrigin = inferGithubRepoSlug(readGitConfig(repoRoot, 'remote.origin.url')); + if (fromOrigin) { + return fromOrigin; + } + + throw new Error( + 'Release blocked: unable to resolve GitHub repo from package.json repository URL or origin remote.', + ); +} + +function readRepoReadme(repoRoot) { + const readmePath = path.join(repoRoot, 'README.md'); + if (!fs.existsSync(readmePath)) { + throw new Error(`Release blocked: README.md missing in ${repoRoot}`); + } + return fs.readFileSync(readmePath, 'utf8'); +} + +function parseReadmeReleaseEntries(readmeContent) { + const releaseNotesIndex = String(readmeContent || '').indexOf('## Release notes'); + if (releaseNotesIndex < 0) { + throw new Error('Release blocked: README.md is missing the "## Release notes" section'); + } + + const releaseNotesContent = String(readmeContent || '').slice(releaseNotesIndex); + const entries = []; + const lines = releaseNotesContent.split(/\r?\n/); + let currentTag = ''; + let currentLines = []; + + function flushEntry() { + if (!currentTag) { + return; + } + const body = currentLines.join('\n').trim(); + if (body) { + entries.push({ tag: currentTag, body, version: parseVersionString(currentTag) }); + } + currentTag = ''; + currentLines = []; + } + + for (const line of lines) { + const headingMatch = line.match(/^###\s+(v\d+\.\d+\.\d+)\s*$/); + if (headingMatch) { + flushEntry(); + currentTag = headingMatch[1]; + continue; + } + + if (!currentTag) { + continue; + } + + if (/^<\/details>\s*$/.test(line) || /^##\s+/.test(line)) { + flushEntry(); + continue; + } + + currentLines.push(line); + } + + flushEntry(); + + if (entries.length === 0) { + throw new Error('Release blocked: README.md did not yield any versioned release-note sections'); + } + + return entries; +} + +function 
resolvePreviousPublishedReleaseTag(repoSlug, currentTag) { + const result = run(GH_BIN, ['release', 'list', '--repo', repoSlug, '--limit', '20'], { + timeout: 20_000, + }); + if (result.error) { + throw new Error(`Release blocked: unable to run '${GH_BIN} release list': ${result.error.message}`); + } + if (result.status !== 0) { + const details = (result.stderr || result.stdout || '').trim(); + throw new Error(`Release blocked: unable to list GitHub releases.${details ? `\n${details}` : ''}`); + } + + const tags = String(result.stdout || '') + .split('\n') + .map((line) => line.split('\t')[0].trim()) + .filter(Boolean); + + return tags.find((tag) => tag !== currentTag) || ''; +} + +function selectReleaseEntriesForWindow(entries, currentTag, previousTag) { + const currentVersion = parseVersionString(currentTag); + if (!currentVersion) { + throw new Error(`Release blocked: invalid current version tag '${currentTag}'`); + } + const previousVersion = previousTag ? parseVersionString(previousTag) : null; + + const selected = entries.filter((entry) => { + if (!entry.version) return false; + if (compareParsedVersions(entry.version, currentVersion) > 0) return false; + if (!previousVersion) return entry.tag === currentTag; + return compareParsedVersions(entry.version, previousVersion) > 0; + }); + + if (!selected.some((entry) => entry.tag === currentTag)) { + throw new Error(`Release blocked: README.md is missing release notes for ${currentTag}`); + } + + return selected; +} + +function renderGeneratedReleaseNotes(entries, currentTag, previousTag) { + const intro = previousTag ? 
`Changes since ${previousTag}.` : `Changes in ${currentTag}.`; + const sections = entries + .map((entry) => `### ${entry.tag}\n${entry.body}`) + .join('\n\n'); + return `GitGuardex ${currentTag}\n\n${intro}\n\n${sections}`; +} + +function buildReleaseNotesFromReadme(repoRoot, currentTag, previousTag) { + const readme = readRepoReadme(repoRoot); + const entries = parseReadmeReleaseEntries(readme); + const selected = selectReleaseEntriesForWindow(entries, currentTag, previousTag); + return renderGeneratedReleaseNotes(selected, currentTag, previousTag); +} + +function release(rawArgs) { + if (rawArgs.length > 0) { + throw new Error(`Unknown option: ${rawArgs[0]}`); + } + + const repoRoot = resolveRepoRoot(process.cwd()); + if (path.resolve(repoRoot) !== MAINTAINER_RELEASE_REPO) { + throw new Error( + `Release blocked: command only allowed in ${MAINTAINER_RELEASE_REPO} (current: ${repoRoot})`, + ); + } + + ensureMainBranch(repoRoot); + ensureCleanWorkingTree(repoRoot); + + if (!isCommandAvailable(GH_BIN)) { + throw new Error(`Release blocked: '${GH_BIN}' is not available`); + } + + const ghAuthStatus = run(GH_BIN, ['auth', 'status'], { timeout: 20_000 }); + if (ghAuthStatus.error) { + throw new Error(`Release blocked: unable to run '${GH_BIN} auth status': ${ghAuthStatus.error.message}`); + } + if (ghAuthStatus.status !== 0) { + const details = (ghAuthStatus.stderr || ghAuthStatus.stdout || '').trim(); + throw new Error(`Release blocked: '${GH_BIN}' auth is unavailable.${details ? 
`\n${details}` : ''}`); + } + + const releasePackageJson = readReleaseRepoPackageJson(repoRoot); + const repoSlug = resolveReleaseGithubRepo(repoRoot); + const currentTag = `v${releasePackageJson.version}`; + const previousTag = resolvePreviousPublishedReleaseTag(repoSlug, currentTag); + const notes = buildReleaseNotesFromReadme(repoRoot, currentTag, previousTag); + const headCommit = gitRun(repoRoot, ['rev-parse', 'HEAD']).stdout.trim(); + + const existingRelease = run(GH_BIN, ['release', 'view', currentTag, '--repo', repoSlug], { + timeout: 20_000, + }); + if (existingRelease.error) { + throw new Error(`Release blocked: unable to run '${GH_BIN} release view': ${existingRelease.error.message}`); + } + + const releaseArgs = + existingRelease.status === 0 + ? ['release', 'edit', currentTag, '--repo', repoSlug, '--title', currentTag, '--notes', notes] + : [ + 'release', + 'create', + currentTag, + '--repo', + repoSlug, + '--target', + headCommit, + '--title', + currentTag, + '--notes', + notes, + ]; + + console.log( + `[${TOOL_NAME}] ${existingRelease.status === 0 ? 'Updating' : 'Creating'} GitHub release ${currentTag} on ${repoSlug}`, + ); + if (previousTag) { + console.log(`[${TOOL_NAME}] Aggregating README release notes newer than ${previousTag}.`); + } else { + console.log(`[${TOOL_NAME}] No earlier published GitHub release found; using only ${currentTag}.`); + } + + const releaseResult = run(GH_BIN, releaseArgs, { cwd: repoRoot, timeout: 60_000 }); + if (releaseResult.error) { + throw new Error(`Release blocked: unable to run '${GH_BIN} release': ${releaseResult.error.message}`); + } + if (releaseResult.status !== 0) { + const details = (releaseResult.stderr || releaseResult.stdout || '').trim(); + throw new Error(`GitHub release command failed.${details ? 
`\n${details}` : ''}`); + } + + const releaseUrl = String(releaseResult.stdout || '').trim(); + if (releaseUrl) { + console.log(releaseUrl); + } + + console.log(`[${TOOL_NAME}] ✅ GitHub release ${currentTag} is synced to the README history.`); + process.exitCode = 0; +} + +function installMany(rawArgs) { + const options = parseInstallManyArgs(rawArgs); + const targets = collectInstallManyTargets(options); + + if (!targets.length) { + throw new Error('install-many did not find any targets to process.'); + } + + if (options.usedImplicitWorkspaceDefault) { + console.log( + `[multiagent-safety] No explicit targets provided. Defaulting to workspace scan: ${path.resolve( + options.workspace, + )} (max depth ${options.maxDepth})`, + ); + } + + console.log( + `[multiagent-safety] install-many starting for ${targets.length} target path(s)${ + options.dryRun ? ' [dry-run]' : '' + }`, + ); + + let installed = 0; + let duplicateRepos = 0; + const seenRepoRoots = new Set(); + const failures = []; + + for (const targetPath of targets) { + let repoRoot; + try { + repoRoot = resolveRepoRoot(targetPath); + } catch (error) { + failures.push({ target: targetPath, message: error.message }); + if (options.failFast) { + break; + } + continue; + } + + if (seenRepoRoots.has(repoRoot)) { + duplicateRepos += 1; + console.log(`[multiagent-safety] Skipping duplicate repo target: ${targetPath} -> ${repoRoot}`); + continue; + } + + seenRepoRoots.add(repoRoot); + + try { + const report = installIntoRepoRoot(repoRoot, options); + printInstallReport(report); + installed += 1; + } catch (error) { + failures.push({ target: repoRoot, message: error.message }); + if (options.failFast) { + break; + } + } + } + + console.log( + `[multiagent-safety] install-many summary: installed=${installed}, failures=${failures.length}, duplicate-targets=${duplicateRepos}`, + ); + + if (failures.length > 0) { + console.error('[multiagent-safety] Failed targets:'); + for (const failure of failures) { + console.error(` 
- ${failure.target}`); + console.error(` ${failure.message}`); + } + throw new Error(`install-many completed with ${failures.length} failure(s)`); + } + + if (options.dryRun) { + console.log('[multiagent-safety] Dry run complete. No files were modified.'); + } else { + console.log('[multiagent-safety] Installed multi-agent safety workflow across all targets.'); + } +} + +function initWorkspace(rawArgs) { + const options = parseInitWorkspaceArgs(rawArgs); + const resolvedWorkspace = path.resolve(options.workspace); + const repos = discoverGitRepos(resolvedWorkspace, options.maxDepth) + .map((repoPath) => path.resolve(repoPath)) + .sort(); + + const outputPath = options.output + ? path.resolve(options.output) + : path.join(resolvedWorkspace, DEFAULT_WORKSPACE_TARGETS_FILE); + + if (fs.existsSync(outputPath) && !options.force) { + throw new Error(`Refusing to overwrite existing file without --force: ${outputPath}`); + } + + const headerLines = [ + '# multiagent-safety workspace targets', + `# generated: ${new Date().toISOString()}`, + `# workspace: ${resolvedWorkspace}`, + `# max-depth: ${options.maxDepth}`, + '#', + '# Run:', + `# multiagent-safety install-many --targets-file "${outputPath}"`, + '', + ]; + const content = `${headerLines.join('\n')}${repos.join('\n')}${repos.length ? '\n' : ''}`; + + fs.mkdirSync(path.dirname(outputPath), { recursive: true }); + fs.writeFileSync(outputPath, content, 'utf8'); + + console.log(`[multiagent-safety] Workspace target file written: ${outputPath}`); + console.log(`[multiagent-safety] Repos discovered: ${repos.length}`); + if (repos.length === 0) { + console.log('[multiagent-safety] No git repos found. 
You can add target paths manually to the file.'); + } else { + console.log(`[multiagent-safety] Next step: multiagent-safety install-many --targets-file "${outputPath}"`); + } +} + +function doctorAudit(rawArgs) { + const options = parseDoctorArgs(rawArgs); + const repoRoot = resolveRepoRoot(options.target); + const guardexToggle = resolveGuardexRepoToggle(repoRoot); + const failures = []; + const warnings = []; + + function ok(message) { + console.log(` [ok] ${message}`); + } + function warn(message) { + warnings.push(message); + console.log(` [warn] ${message}`); + } + function fail(message) { + failures.push(message); + console.log(` [fail] ${message}`); + } + + console.log(`[multiagent-safety] doctor target: ${repoRoot}`); + if (!guardexToggle.enabled) { + console.log( + `[multiagent-safety] Guardex is disabled for this repo (${describeGuardexRepoToggle(guardexToggle)}).`, + ); + console.log('[multiagent-safety] doctor passed.'); + return; + } + + const hooksPath = run('git', ['-C', repoRoot, 'config', '--get', 'core.hooksPath']); + if (hooksPath.status !== 0) { + fail('git core.hooksPath is not configured'); + } else if (hooksPath.stdout.trim() !== '.githooks') { + fail(`git core.hooksPath is "${hooksPath.stdout.trim()}" (expected ".githooks")`); + } else { + ok('git core.hooksPath is .githooks'); + } + + for (const relativePath of REQUIRED_MANAGED_REPO_FILES) { + const absolutePath = path.join(repoRoot, relativePath); + if (!fs.existsSync(absolutePath)) { + fail(`missing ${relativePath}`); + continue; + } + ok(`found ${relativePath}`); + + if (EXECUTABLE_RELATIVE_PATHS.has(relativePath)) { + try { + fs.accessSync(absolutePath, fs.constants.X_OK); + } catch { + fail(`${relativePath} exists but is not executable`); + } + } + } + + const lockFilePath = path.join(repoRoot, '.omx/state/agent-file-locks.json'); + if (fs.existsSync(lockFilePath)) { + try { + const parsed = JSON.parse(fs.readFileSync(lockFilePath, 'utf8')); + if (!parsed || typeof parsed !== 'object' 
|| typeof parsed.locks !== 'object') { + fail('.omx/state/agent-file-locks.json does not contain a valid { locks: {} } object'); + } else { + ok('lock registry JSON is valid'); + } + } catch (error) { + fail(`lock registry JSON is invalid: ${error.message}`); + } + } + + const packagePath = path.join(repoRoot, 'package.json'); + if (!fs.existsSync(packagePath)) { + warn('package.json not found (legacy agent:* script drift cannot be checked)'); + } else { + try { + const pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8')); + const scripts = pkg.scripts || {}; + const legacyAgentScripts = Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS) + .filter(([name, expectedValue]) => scripts[name] === expectedValue) + .map(([name]) => name); + if (legacyAgentScripts.length > 0) { + warn(`legacy agent:* package.json scripts remain (${legacyAgentScripts.join(', ')}); run '${SHORT_TOOL_NAME} migrate' to remove them`); + } else { + ok('package.json does not contain Guardex-managed agent:* helper scripts'); + } + } catch (error) { + fail(`package.json is invalid JSON: ${error.message}`); + } + } + + const agentsPath = path.join(repoRoot, 'AGENTS.md'); + if (!fs.existsSync(agentsPath)) { + warn('AGENTS.md not found (multi-agent contract snippet not present)'); + } else { + const agentsContent = fs.readFileSync(agentsPath, 'utf8'); + if (!agentsContent.includes(AGENTS_MARKER_START)) { + warn('AGENTS.md exists but multiagent-safety snippet marker is missing'); + } else { + ok('AGENTS.md contains multiagent-safety snippet marker'); + } + } + + if (warnings.length) { + console.log(`[multiagent-safety] warnings: ${warnings.length}`); + } + if (failures.length) { + console.log(`[multiagent-safety] failures: ${failures.length}`); + } + + if (failures.length === 0 && (!options.strict || warnings.length === 0)) { + console.log('[multiagent-safety] doctor passed.'); + if (warnings.length > 0) { + console.log('[multiagent-safety] tip: run with --strict to treat warnings as failures.'); + } + 
return; + } + + if (options.strict && warnings.length > 0 && failures.length === 0) { + console.log('[multiagent-safety] strict mode failed due to warnings.'); + } else { + console.log('[multiagent-safety] doctor failed.'); + } + throw new Error('doctor detected configuration issues'); +} + +function printAgentsSnippet() { + const snippetPath = path.join(TEMPLATE_ROOT, 'AGENTS.multiagent-safety.md'); + process.stdout.write(fs.readFileSync(snippetPath, 'utf8')); +} + +function copyPrompt() { + process.stdout.write(AI_SETUP_PROMPT); + process.exitCode = 0; +} + +function copyCommands() { + process.stdout.write(AI_SETUP_COMMANDS); + process.exitCode = 0; +} + +function prompt(rawArgs) { + const args = Array.isArray(rawArgs) ? rawArgs : []; + let variant = 'prompt'; + for (const arg of args) { + if (arg === '--exec' || arg === '--commands') variant = 'exec'; + else if (arg === '--snippet' || arg === '--agents') variant = 'snippet'; + else if (arg === '--prompt' || arg === '--full') variant = 'prompt'; + else if (arg === '-h' || arg === '--help') variant = 'help'; + else throw new Error(`Unknown option: ${arg}`); + } + if (variant === 'help') { + console.log( + `${SHORT_TOOL_NAME} prompt commands:\n` + + ` ${SHORT_TOOL_NAME} prompt Print AI setup checklist\n` + + ` ${SHORT_TOOL_NAME} prompt --exec Print setup commands only (shell-ready)\n` + + ` ${SHORT_TOOL_NAME} prompt --snippet Print the AGENTS.md managed-block template`, + ); + process.exitCode = 0; + return; + } + if (variant === 'exec') return copyCommands(); + if (variant === 'snippet') return printAgentsSnippet(); + return copyPrompt(); +} + +function printStandaloneOperations(title, rootLabel, operations, dryRun = false) { + return scaffoldModule.printStandaloneOperations(title, rootLabel, operations, dryRun); +} + +function branch(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'start') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('branchStart', 
passthrough, { cwd: resolveRepoRoot(target) });
+    return;
+  }
+  if (subcommand === 'finish') {
+    const { target, passthrough } = extractTargetedArgs(rest);
+    invokePackageAsset('branchFinish', passthrough, { cwd: resolveRepoRoot(target) });
+    return;
+  }
+  if (subcommand === 'merge') return merge(rest);
+  throw new Error(
+    `Usage: ${SHORT_TOOL_NAME} branch <start|finish|merge> [options] ` +
+      `(examples: '${SHORT_TOOL_NAME} branch start "<task>" "<description>"', '${SHORT_TOOL_NAME} branch finish --branch <branch>')`,
+  );
+}
+
+function locks(rawArgs) {
+  const { target, passthrough } = extractTargetedArgs(rawArgs);
+  const result = runPackageAsset('lockTool', passthrough, { cwd: resolveRepoRoot(target) });
+  if (result.stdout) process.stdout.write(result.stdout);
+  if (result.stderr) process.stderr.write(result.stderr);
+  process.exitCode = result.status;
+}
+
+function worktree(rawArgs) {
+  const [subcommand, ...rest] = rawArgs;
+  if (subcommand === 'prune') {
+    const { target, passthrough } = extractTargetedArgs(rest);
+    invokePackageAsset('worktreePrune', passthrough, { cwd: resolveRepoRoot(target) });
+    return;
+  }
+  throw new Error(`Usage: ${SHORT_TOOL_NAME} worktree prune [cleanup-options]`);
+}
+
+function hook(rawArgs) {
+  return hooksModule.hook(rawArgs);
+}
+
+function internal(rawArgs) {
+  const [subcommand, assetKey, ...rest] = rawArgs;
+  if (subcommand !== 'run-shell') {
+    throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`);
+  }
+  const { target, passthrough } = extractTargetedArgs(rest);
+  const repoRoot = resolveRepoRoot(target);
+  const result = assetKey === 'reviewBot'
+    ?
runReviewBotCommand(repoRoot, passthrough) + : runPackageAsset(assetKey, passthrough, { cwd: repoRoot }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; +} + +function installAgentSkills(rawArgs) { + let dryRun = false; + let force = false; + for (const arg of rawArgs) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const operations = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('User-level Guardex skills', GUARDEX_HOME_DIR, operations, dryRun); + process.exitCode = 0; +} + +function migrate(rawArgs) { + const { target, passthrough } = extractTargetedArgs(rawArgs); + let dryRun = false; + let force = false; + let installSkills = false; + for (const arg of passthrough) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + if (arg === '--install-agent-skills') { + installSkills = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const repoRoot = resolveRepoRoot(target); + const fixPayload = runFixInternal({ + target: repoRoot, + dryRun, + force, + skipAgents: false, + skipPackageJson: true, + skipGitignore: false, + dropStaleLocks: true, + }); + printOperations('Migrate/fix', fixPayload, dryRun); + + if (installSkills) { + const skillOps = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('Migrate/install-agent-skills', GUARDEX_HOME_DIR, skillOps, dryRun); + } + + const removableLegacyFiles = LEGACY_MANAGED_REPO_FILES.filter( + (relativePath) => !REQUIRED_MANAGED_REPO_FILES.includes(relativePath), + ); + const removalOps = removableLegacyFiles.map((relativePath) => removeLegacyManagedRepoFile(repoRoot, 
relativePath, { dryRun, force }));
+  removalOps.push(removeLegacyPackageScripts(repoRoot, dryRun));
+  printStandaloneOperations('Migrate/cleanup', repoRoot, removalOps, dryRun);
+  process.exitCode = 0;
+}
+
+function cleanup(rawArgs) {
+  return getFinishApi().cleanup(rawArgs);
+}
+
+function merge(rawArgs) {
+  return getFinishApi().merge(rawArgs);
+}
+
+function finish(rawArgs, defaults = {}) {
+  return getFinishApi().finish(rawArgs, defaults);
+}
+
+function sync(rawArgs) {
+  return getFinishApi().sync(rawArgs);
+}
+
+function protect(rawArgs) {
+  const parsed = parseTargetFlag(rawArgs, process.cwd());
+  const [subcommand, ...rest] = parsed.args;
+  const repoRoot = resolveRepoRoot(parsed.target);
+
+  if (!subcommand || subcommand === 'help' || subcommand === '--help' || subcommand === '-h') {
+    console.log(
+      `${TOOL_NAME} protect commands:\n` +
+        `  ${TOOL_NAME} protect list [--target <path>]\n` +
+        `  ${TOOL_NAME} protect add <branch...> [--target <path>]\n` +
+        `  ${TOOL_NAME} protect remove <branch...> [--target <path>]\n` +
+        `  ${TOOL_NAME} protect set <branch...> [--target <path>]\n` +
+        `  ${TOOL_NAME} protect reset [--target <path>]`,
+    );
+    process.exitCode = 0;
+    return;
+  }
+
+  const requestedBranches = uniquePreserveOrder(parseBranchList(rest.join(' ')));
+
+  if (subcommand === 'list') {
+    const branches = readProtectedBranches(repoRoot);
+    console.log(`[${TOOL_NAME}] Protected branches (${branches.length}): ${branches.join(', ')}`);
+    process.exitCode = 0;
+    return;
+  }
+
+  if (subcommand === 'add') {
+    if (requestedBranches.length === 0) {
+      throw new Error('protect add requires one or more branch names');
+    }
+    const current = readProtectedBranches(repoRoot);
+    const next = uniquePreserveOrder([...current, ...requestedBranches]);
+    writeProtectedBranches(repoRoot, next);
+    console.log(`[${TOOL_NAME}] Protected branches updated: ${next.join(', ')}`);
+    process.exitCode = 0;
+    return;
+  }
+
+  if (subcommand === 'remove') {
+    if (requestedBranches.length === 0) {
+      throw new Error('protect remove requires one or more
branch names'); + } + const current = readProtectedBranches(repoRoot); + const removals = new Set(requestedBranches); + const next = current.filter((branch) => !removals.has(branch)); + writeProtectedBranches(repoRoot, next); + console.log( + `[${TOOL_NAME}] Protected branches updated: ` + + `${(next.length > 0 ? next : DEFAULT_PROTECTED_BRANCHES).join(', ')}`, + ); + if (next.length === 0) { + console.log(`[${TOOL_NAME}] Reset to defaults (${DEFAULT_PROTECTED_BRANCHES.join(', ')}) because list was empty.`); + } + process.exitCode = 0; + return; + } + + if (subcommand === 'set') { + if (requestedBranches.length === 0) { + throw new Error('protect set requires one or more branch names'); + } + writeProtectedBranches(repoRoot, requestedBranches); + console.log(`[${TOOL_NAME}] Protected branches set: ${requestedBranches.join(', ')}`); + process.exitCode = 0; + return; + } + + if (subcommand === 'reset') { + writeProtectedBranches(repoRoot, []); + console.log(`[${TOOL_NAME}] Protected branches reset to defaults: ${DEFAULT_PROTECTED_BRANCHES.join(', ')}`); + process.exitCode = 0; + return; + } + + throw new Error(`Unknown protect subcommand: ${subcommand}`); +} + +function levenshteinDistance(a, b) { + const rows = a.length + 1; + const cols = b.length + 1; + const matrix = Array.from({ length: rows }, () => Array(cols).fill(0)); + + for (let i = 0; i < rows; i += 1) matrix[i][0] = i; + for (let j = 0; j < cols; j += 1) matrix[0][j] = j; + + for (let i = 1; i < rows; i += 1) { + for (let j = 1; j < cols; j += 1) { + const cost = a[i - 1] === b[j - 1] ? 
0 : 1; + matrix[i][j] = Math.min( + matrix[i - 1][j] + 1, // deletion + matrix[i][j - 1] + 1, // insertion + matrix[i - 1][j - 1] + cost, // substitution + ); + } + } + return matrix[a.length][b.length]; +} + +function maybeSuggestCommand(command) { + let best = null; + let bestDistance = Number.POSITIVE_INFINITY; + + for (const candidate of SUGGESTIBLE_COMMANDS) { + const dist = levenshteinDistance(command, candidate); + if (dist < bestDistance) { + bestDistance = dist; + best = candidate; + } + } + + if (best && bestDistance <= 2) { + return best; + } + + return null; +} + +function normalizeCommandOrThrow(command) { + if (COMMAND_TYPO_ALIASES.has(command)) { + const mapped = COMMAND_TYPO_ALIASES.get(command); + console.log(`[${TOOL_NAME}] Interpreting '${command}' as '${mapped}'.`); + return mapped; + } + return command; +} + +function warnDeprecatedAlias(aliasName) { + const entry = DEPRECATED_COMMAND_ALIASES.get(aliasName); + if (!entry) return; + console.error( + `[${TOOL_NAME}] '${aliasName}' is deprecated and will be removed in a future major release. 
` + + `Use: ${entry.hint}`, + ); +} + +function extractFlag(args, ...names) { + const flagSet = new Set(names); + let found = false; + const remaining = []; + for (const arg of args) { + if (flagSet.has(arg)) { + found = true; + } else { + remaining.push(arg); + } + } + return { found, remaining }; +} + +function main(argv = process.argv.slice(2)) { + const args = [...argv]; + + if (args.length === 0) { + maybeSelfUpdateBeforeStatus(); + maybeOpenSpecUpdateBeforeStatus(); + status([]); + return; + } + + const [rawCommand, ...rest] = args; + const command = normalizeCommandOrThrow(rawCommand); + + if (command === '--help' || command === '-h' || command === 'help') { + usage(); + return; + } + + if (command === '--version' || command === '-v' || command === 'version') { + maybeSelfUpdateBeforeStatus(); + console.log(packageJson.version); + return; + } + + // Deprecated direct aliases — route to new surface and warn once. + if (DEPRECATED_COMMAND_ALIASES.has(command)) { + warnDeprecatedAlias(command); + if (command === 'init') return setup(rest); + if (command === 'install') return install(rest); + if (command === 'fix') return fix(rest); + if (command === 'scan') return scan(rest); + if (command === 'copy-prompt') return copyPrompt(); + if (command === 'copy-commands') return copyCommands(); + if (command === 'print-agents-snippet') return printAgentsSnippet(); + if (command === 'review') return review(rest); + } + + if (command === 'status') { + const { found: strict, remaining } = extractFlag(rest, '--strict'); + if (strict) return scan(remaining); + return status(remaining); + } + + if (command === 'setup') { + const installOnly = extractFlag(rest, '--install-only', '--only-install'); + if (installOnly.found) return install(installOnly.remaining); + const repairOnly = extractFlag(installOnly.remaining, '--repair', '--fix-only'); + if (repairOnly.found) return fix(repairOnly.remaining); + return setup(repairOnly.remaining); + } + + if (command === 'prompt') return 
prompt(rest); + if (command === 'doctor') return doctor(rest); + if (command === 'branch') return branch(rest); + if (command === 'locks') return locks(rest); + if (command === 'worktree') return worktree(rest); + if (command === 'hook') return hook(rest); + if (command === 'migrate') return migrate(rest); + if (command === 'install-agent-skills') return installAgentSkills(rest); + if (command === 'internal') return internal(rest); + if (command === 'agents') return agents(rest); + if (command === 'merge') return merge(rest); + if (command === 'finish') return finish(rest); + if (command === 'report') return report(rest); + if (command === 'protect') return protect(rest); + if (command === 'sync') return sync(rest); + if (command === 'cleanup') return cleanup(rest); + if (command === 'release') return release(rest); + + const suggestion = maybeSuggestCommand(command); + if (suggestion) { + throw new Error(`Unknown command: ${command}. Did you mean '${suggestion}'?`); + } + throw new Error(`Unknown command: ${command}`); +} + +function runFromBin(argv = process.argv.slice(2)) { + try { + main(argv); + } catch (error) { + console.error(`[${TOOL_NAME}] ${error.message}`); + process.exitCode = 1; + } +} + +module.exports = { + main, + runFromBin, +}; + +if (require.main === module) { + runFromBin(); +} diff --git a/src/context.js b/src/context.js new file mode 100644 index 0000000..f64dbf3 --- /dev/null +++ b/src/context.js @@ -0,0 +1,503 @@ +const fs = require('node:fs'); +const os = require('node:os'); +const path = require('node:path'); +const cp = require('node:child_process'); + +const PACKAGE_ROOT = path.resolve(__dirname, '..'); +const CLI_ENTRY_PATH = path.join(PACKAGE_ROOT, 'bin', 'multiagent-safety.js'); +const packageJsonPath = path.join(PACKAGE_ROOT, 'package.json'); +const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8')); + +const TOOL_NAME = 'gitguardex'; +const SHORT_TOOL_NAME = 'gx'; +if (!process.env.GUARDEX_CLI_ENTRY) { + 
process.env.GUARDEX_CLI_ENTRY = CLI_ENTRY_PATH; +} +if (!process.env.GUARDEX_NODE_BIN) { + process.env.GUARDEX_NODE_BIN = process.execPath; +} +const LEGACY_NAMES = ['guardex', 'multiagent-safety']; +const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`; +const OPENSPEC_PACKAGE = '@fission-ai/openspec'; +const OMC_PACKAGE = 'oh-my-claude-sisyphus'; +const OMC_REPO_URL = 'https://github.com/Yeachan-Heo/oh-my-claudecode'; +const CAVEMEM_PACKAGE = 'cavemem'; +const NPX_BIN = process.env.GUARDEX_NPX_BIN || 'npx'; +const GUARDEX_HOME_DIR = path.resolve(process.env.GUARDEX_HOME_DIR || os.homedir()); +const GLOBAL_TOOLCHAIN_SERVICES = [ + { name: 'oh-my-codex', packageName: 'oh-my-codex' }, + { + name: 'oh-my-claudecode', + packageName: OMC_PACKAGE, + dependencyUrl: OMC_REPO_URL, + }, + { name: OPENSPEC_PACKAGE, packageName: OPENSPEC_PACKAGE }, + { name: CAVEMEM_PACKAGE, packageName: CAVEMEM_PACKAGE }, + { + name: '@imdeadpool/codex-account-switcher', + packageName: '@imdeadpool/codex-account-switcher', + }, +]; +const GLOBAL_TOOLCHAIN_PACKAGES = [ + ...GLOBAL_TOOLCHAIN_SERVICES.map((service) => service.packageName), +]; +const OPTIONAL_LOCAL_COMPANION_TOOLS = [ + { + name: 'cavekit', + candidatePaths: [ + '.cavekit/plugin.json', + '.codex/local-marketplaces/cavekit/.agents/plugins/marketplace.json', + ], + installCommand: `${NPX_BIN} skills add JuliusBrussee/cavekit`, + installArgs: ['skills', 'add', 'JuliusBrussee/cavekit'], + }, + { + name: 'caveman', + candidatePaths: [ + '.config/caveman/config.json', + '.cavekit/skills/caveman/SKILL.md', + ], + installCommand: `${NPX_BIN} skills add JuliusBrussee/caveman`, + installArgs: ['skills', 'add', 'JuliusBrussee/caveman'], + }, +]; +const GH_BIN = process.env.GUARDEX_GH_BIN || 'gh'; +const REQUIRED_SYSTEM_TOOLS = [ + { + name: 'gh', + displayName: 'GitHub (gh)', + command: GH_BIN, + installHint: 'https://cli.github.com/', + }, +]; +const MAINTAINER_RELEASE_REPO = path.resolve( + process.env.GUARDEX_RELEASE_REPO || 
path.resolve(PACKAGE_ROOT), +); +const NPM_BIN = process.env.GUARDEX_NPM_BIN || 'npm'; +const OPENSPEC_BIN = process.env.GUARDEX_OPENSPEC_BIN || 'openspec'; +const SCORECARD_BIN = process.env.GUARDEX_SCORECARD_BIN || 'scorecard'; +const GIT_PROTECTED_BRANCHES_KEY = 'multiagent.protectedBranches'; +const GIT_BASE_BRANCH_KEY = 'multiagent.baseBranch'; +const GIT_SYNC_STRATEGY_KEY = 'multiagent.sync.strategy'; +const GUARDEX_REPO_TOGGLE_ENV = 'GUARDEX_ON'; +const DEFAULT_PROTECTED_BRANCHES = ['dev', 'main', 'master']; +const DEFAULT_BASE_BRANCH = 'dev'; +const DEFAULT_SYNC_STRATEGY = 'rebase'; +const DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES = 60; +const COMPOSE_HINT_FILES = [ + 'docker-compose.yml', + 'docker-compose.yaml', + 'compose.yml', + 'compose.yaml', +]; + +const TEMPLATE_ROOT = path.join(PACKAGE_ROOT, 'templates'); + +const HOOK_NAMES = ['pre-commit', 'pre-push', 'post-merge', 'post-checkout']; + +function toDestinationPath(relativeTemplatePath) { + if (relativeTemplatePath.startsWith('scripts/')) { + return relativeTemplatePath; + } + if (relativeTemplatePath.startsWith('githooks/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('codex/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('claude/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('github/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('vscode/')) { + return relativeTemplatePath; + } + throw new Error(`Unsupported template path: ${relativeTemplatePath}`); +} + +const TEMPLATE_FILES = [ + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/guardex-env.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'github/pull.yml.example', + 'github/workflows/cr.yml', + 'vscode/guardex-active-agents/package.json', + 'vscode/guardex-active-agents/extension.js', + 'vscode/guardex-active-agents/session-schema.js', + 
'vscode/guardex-active-agents/README.md', +]; + +const LEGACY_WORKFLOW_SHIM_SPECS = [ + { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] }, + { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] }, + { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] }, + { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] }, + { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] }, + { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] }, + { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] }, + { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] }, + { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'changeInit'] }, +]; + +const LEGACY_WORKFLOW_SHIMS = LEGACY_WORKFLOW_SHIM_SPECS.map((entry) => entry.relativePath); + +const MANAGED_TEMPLATE_DESTINATIONS = TEMPLATE_FILES.map((entry) => toDestinationPath(entry)); +const MANAGED_TEMPLATE_SCRIPT_FILES = MANAGED_TEMPLATE_DESTINATIONS.filter((entry) => + entry.startsWith('scripts/'), +); + +const LEGACY_MANAGED_REPO_FILES = [ + ...LEGACY_WORKFLOW_SHIMS, + 'scripts/agent-session-state.js', + 'scripts/guardex-docker-loader.sh', + 'scripts/install-vscode-active-agents-extension.js', + 'scripts/guardex-env.sh', + 'scripts/install-agent-git-hooks.sh', + '.githooks/pre-commit', + '.githooks/pre-push', + '.githooks/post-merge', + '.githooks/post-checkout', + '.codex/skills/gitguardex/SKILL.md', + '.codex/skills/guardex-merge-skills-to-dev/SKILL.md', + '.claude/commands/gitguardex.md', +]; + +const REQUIRED_MANAGED_REPO_FILES = [ + ...MANAGED_TEMPLATE_DESTINATIONS, + ...HOOK_NAMES.map((entry) => 
path.posix.join('.githooks', entry)), + '.omx/state/agent-file-locks.json', +]; + +const LEGACY_MANAGED_PACKAGE_SCRIPTS = { + 'agent:codex': 'bash ./scripts/codex-agent.sh', + 'agent:branch:start': 'bash ./scripts/agent-branch-start.sh', + 'agent:branch:finish': 'bash ./scripts/agent-branch-finish.sh', + 'agent:branch:merge': 'bash ./scripts/agent-branch-merge.sh', + 'agent:cleanup': 'gx cleanup', + 'agent:hooks:install': 'bash ./scripts/install-agent-git-hooks.sh', + 'agent:locks:claim': 'python3 ./scripts/agent-file-locks.py claim', + 'agent:locks:allow-delete': 'python3 ./scripts/agent-file-locks.py allow-delete', + 'agent:locks:release': 'python3 ./scripts/agent-file-locks.py release', + 'agent:locks:status': 'python3 ./scripts/agent-file-locks.py status', + 'agent:plan:init': 'bash ./scripts/openspec/init-plan-workspace.sh', + 'agent:change:init': 'bash ./scripts/openspec/init-change-workspace.sh', + 'agent:protect:list': 'gx protect list', + 'agent:branch:sync': 'gx sync', + 'agent:branch:sync:check': 'gx sync --check', + 'agent:safety:setup': 'gx setup', + 'agent:safety:scan': 'gx status --strict', + 'agent:safety:fix': 'gx setup --repair', + 'agent:safety:doctor': 'gx doctor', + 'agent:docker:load': 'bash ./scripts/guardex-docker-loader.sh', + 'agent:review:watch': 'bash ./scripts/review-bot-watch.sh', + 'agent:finish': 'gx finish --all', +}; + +const PACKAGE_SCRIPT_ASSETS = { + branchStart: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-start.sh'), + branchFinish: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-finish.sh'), + branchMerge: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-merge.sh'), + codexAgent: path.join(TEMPLATE_ROOT, 'scripts', 'codex-agent.sh'), + reviewBot: path.join(TEMPLATE_ROOT, 'scripts', 'review-bot-watch.sh'), + worktreePrune: path.join(TEMPLATE_ROOT, 'scripts', 'agent-worktree-prune.sh'), + lockTool: path.join(TEMPLATE_ROOT, 'scripts', 'agent-file-locks.py'), + planInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 
'init-plan-workspace.sh'),
+  changeInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-change-workspace.sh'),
+};
+
+const USER_LEVEL_SKILL_ASSETS = [
+  {
+    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'gitguardex', 'SKILL.md'),
+    destination: path.join('.codex', 'skills', 'gitguardex', 'SKILL.md'),
+  },
+  {
+    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
+    destination: path.join('.codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
+  },
+  {
+    source: path.join(TEMPLATE_ROOT, 'claude', 'commands', 'gitguardex.md'),
+    destination: path.join('.claude', 'commands', 'gitguardex.md'),
+  },
+];
+
+const EXECUTABLE_RELATIVE_PATHS = new Set([
+  ...MANAGED_TEMPLATE_SCRIPT_FILES,
+  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
+]);
+
+const CRITICAL_GUARDRAIL_PATHS = new Set([
+  'AGENTS.md',
+  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
+  'scripts/guardex-env.sh',
+]);
+
+const LOCK_FILE_RELATIVE = '.omx/state/agent-file-locks.json';
+const AGENTS_BOTS_STATE_RELATIVE = '.omx/state/agents-bots.json';
+const AGENTS_MARKER_START = '<!-- multiagent-safety:START -->';
+const AGENTS_MARKER_END = '<!-- multiagent-safety:END -->';
+const GITIGNORE_MARKER_START = '# multiagent-safety:START';
+const GITIGNORE_MARKER_END = '# multiagent-safety:END';
+const CODEX_WORKTREE_RELATIVE_DIR = path.join('.omx', 'agent-worktrees');
+const CLAUDE_WORKTREE_RELATIVE_DIR = path.join('.omc', 'agent-worktrees');
+const AGENT_WORKTREE_RELATIVE_DIRS = [
+  CODEX_WORKTREE_RELATIVE_DIR,
+  CLAUDE_WORKTREE_RELATIVE_DIR,
+];
+const MANAGED_GITIGNORE_PATHS = [
+  '.omx/',
+  '.omc/',
+  'scripts/agent-session-state.js',
+  'scripts/guardex-docker-loader.sh',
+  'scripts/guardex-env.sh',
+  'scripts/install-vscode-active-agents-extension.js',
+  '.githooks',
+  'oh-my-codex/',
+  LOCK_FILE_RELATIVE,
+];
+const REPO_SCAFFOLD_DIRECTORIES = ['bin'];
+const OMX_SCAFFOLD_DIRECTORIES = [
+  '.omx',
+  '.omx/state',
+  '.omx/logs',
+  '.omx/plans',
+
CODEX_WORKTREE_RELATIVE_DIR, + '.omc', + CLAUDE_WORKTREE_RELATIVE_DIR, +]; +const OMX_SCAFFOLD_FILES = new Map([ + ['.omx/notepad.md', '\n\n## WORKING MEMORY\n'], + ['.omx/project-memory.json', '{}\n'], +]); +const TARGETED_FORCEABLE_MANAGED_PATHS = new Set([ + 'AGENTS.md', + '.gitignore', + ...Array.from(OMX_SCAFFOLD_FILES.keys()), + ...REQUIRED_MANAGED_REPO_FILES, + ...LEGACY_WORKFLOW_SHIMS, +]); +const COMMAND_TYPO_ALIASES = new Map([ + ['relaese', 'release'], + ['realaese', 'release'], + ['relase', 'release'], + ['setpu', 'setup'], + ['inti', 'init'], + ['intsall', 'install'], + ['docter', 'doctor'], + ['doctro', 'doctor'], + ['cleunup', 'cleanup'], + ['scna', 'scan'], +]); +const SUGGESTIBLE_COMMANDS = [ + 'status', + 'setup', + 'doctor', + 'branch', + 'locks', + 'worktree', + 'hook', + 'migrate', + 'install-agent-skills', + 'agents', + 'merge', + 'finish', + 'report', + 'protect', + 'sync', + 'cleanup', + 'prompt', + 'help', + 'version', + 'init', + 'install', + 'fix', + 'scan', + 'review', + 'copy-prompt', + 'copy-commands', + 'print-agents-snippet', + 'release', +]; +const CLI_COMMAND_DESCRIPTIONS = [ + ['status', 'Show GitGuardex CLI + service health without modifying files'], + ['setup', 'Install, repair, and verify guardrails (flags: --repair, --install-only, --target)'], + ['doctor', 'Repair drift + verify (auto-sandboxes on protected main)'], + ['branch', 'CLI-owned branch workflow surface (start/finish/merge)'], + ['locks', 'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'], + ['worktree', 'CLI-owned worktree cleanup surface (prune)'], + ['hook', 'Hook dispatch/install surface used by managed shims'], + ['migrate', 'Convert legacy repo-local installs to the zero-copy CLI-owned surface'], + ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'], + ['protect', 'Manage protected branches (list/add/remove/set/reset)'], + ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'], 
+  ['sync', 'Sync agent branches with origin/'],
+  ['finish', 'Commit + PR + merge completed agent branches (--all, --branch)'],
+  ['cleanup', 'Prune merged/stale agent branches and worktrees'],
+  ['release', 'Create or update the current GitHub release with README-generated notes'],
+  ['agents', 'Start/stop repo-scoped review + cleanup bots'],
+  ['prompt', 'Print AI setup checklist (--exec, --snippet)'],
+  ['report', 'Security/safety reports (e.g. OpenSSF scorecard)'],
+  ['help', 'Show this help output'],
+  ['version', 'Print GitGuardex version'],
+];
+const DEPRECATED_COMMAND_ALIASES = new Map([
+  ['init', { target: 'setup', hint: 'gx setup' }],
+  ['install', { target: 'setup', hint: 'gx setup --install-only' }],
+  ['fix', { target: 'setup', hint: 'gx setup --repair' }],
+  ['scan', { target: 'status', hint: 'gx status --strict' }],
+  ['copy-prompt', { target: 'prompt', hint: 'gx prompt' }],
+  ['copy-commands', { target: 'prompt', hint: 'gx prompt --exec' }],
+  ['print-agents-snippet', { target: 'prompt', hint: 'gx prompt --snippet' }],
+  ['review', { target: 'agents', hint: 'gx agents start (runs review + cleanup)' }],
+]);
+const AGENT_BOT_DESCRIPTIONS = [
+  ['agents', 'Start/stop review + cleanup bots for this repo'],
+];
+const DOCTOR_AUTO_FINISH_DETAIL_LIMIT = 6;
+const DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX = 72;
+const DOCTOR_AUTO_FINISH_MESSAGE_MAX = 160;
+
+function envFlagIsTruthy(raw) {
+  const lowered = String(raw || '').trim().toLowerCase();
+  return lowered === '1' || lowered === 'true' || lowered === 'yes' || lowered === 'on';
+}
+
+function isClaudeCodeSession(env = process.env) {
+  return envFlagIsTruthy(env.CLAUDECODE) || Boolean(env.CLAUDE_CODE_SESSION_ID);
+}
+
+function defaultAgentWorktreeRelativeDir(env = process.env) {
+  return isClaudeCodeSession(env) ? CLAUDE_WORKTREE_RELATIVE_DIR : CODEX_WORKTREE_RELATIVE_DIR;
+}
+
+const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in this repo.
+
+1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version
+2) Bootstrap: gx setup
+3) Repair: gx doctor
+4) Task loop: gx branch start "" "" + then gx locks claim --branch "" -> gx branch finish
+5) Integrate: gx merge --branch --branch
+6) Finish: gx finish --all
+7) Cleanup: gx cleanup
+8) OpenSpec: /opsx:propose -> /opsx:apply -> /opsx:archive
+9) Optional: gx protect add release staging
+10) Optional: gx sync --check && gx sync
+11) Review bot: install https://github.com/apps/cr-gpt + set OPENAI_API_KEY
+12) Fork sync: install https://github.com/apps/pull + cp .github/pull.yml.example .github/pull.yml
+`;
+
+const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND}
+gh --version
+gx setup
+gx doctor
+gx branch start "" ""
+gx locks claim --branch ""
+gx merge --branch "" --branch ""
+gx finish --all
+gx cleanup
+gx protect add release staging
+gx sync --check && gx sync
+`;
+
+const SCORECARD_RISK_BY_CHECK = {
+  'Dangerous-Workflow': 'Critical',
+  'Code-Review': 'High',
+  Maintained: 'High',
+  'Binary-Artifacts': 'High',
+  'Dependency-Update-Tool': 'High',
+  'Token-Permissions': 'High',
+  Vulnerabilities: 'High',
+  'Branch-Protection': 'High',
+  Fuzzing: 'Medium',
+  'Pinned-Dependencies': 'Medium',
+  SAST: 'Medium',
+  'Security-Policy': 'Medium',
+  'CII-Best-Practices': 'Low',
+  Contributors: 'Low',
+  License: 'Low',
+};
+
+module.exports = {
+  fs,
+  os,
+  path,
+  cp,
+  PACKAGE_ROOT,
+  CLI_ENTRY_PATH,
+  packageJsonPath,
+  packageJson,
+  TOOL_NAME,
+  SHORT_TOOL_NAME,
+  LEGACY_NAMES,
+  GLOBAL_INSTALL_COMMAND,
+  OPENSPEC_PACKAGE,
+  OMC_PACKAGE,
+  OMC_REPO_URL,
+  CAVEMEM_PACKAGE,
+  NPX_BIN,
+  GUARDEX_HOME_DIR,
+  GLOBAL_TOOLCHAIN_SERVICES,
+  GLOBAL_TOOLCHAIN_PACKAGES,
+  OPTIONAL_LOCAL_COMPANION_TOOLS,
+  GH_BIN,
+  REQUIRED_SYSTEM_TOOLS,
+  MAINTAINER_RELEASE_REPO,
+  NPM_BIN,
+  OPENSPEC_BIN,
+  SCORECARD_BIN,
+  GIT_PROTECTED_BRANCHES_KEY,
+  GIT_BASE_BRANCH_KEY,
+  GIT_SYNC_STRATEGY_KEY,
+  GUARDEX_REPO_TOGGLE_ENV,
+  DEFAULT_PROTECTED_BRANCHES,
+  DEFAULT_BASE_BRANCH,
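Editor's aside (illustrative, not part of the patch): `envFlagIsTruthy`, as defined above, accepts `'1'`, `'true'`, `'yes'`, and `'on'` case-insensitively and ignores surrounding whitespace; everything else, including an unset variable, is falsy. The function body below is copied verbatim from the patch; the demo calls are ours:

```javascript
// Verbatim from the patch: tolerant boolean parsing for environment flags.
function envFlagIsTruthy(raw) {
  const lowered = String(raw || '').trim().toLowerCase();
  return lowered === '1' || lowered === 'true' || lowered === 'yes' || lowered === 'on';
}

console.log(envFlagIsTruthy(' YES '));   // → true
console.log(envFlagIsTruthy('On'));      // → true
console.log(envFlagIsTruthy('0'));       // → false
console.log(envFlagIsTruthy(undefined)); // → false
```

This is why `CLAUDECODE=1`, `CLAUDECODE=true`, and `CLAUDECODE=on` all switch `defaultAgentWorktreeRelativeDir` to the Claude worktree directory.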
+  DEFAULT_SYNC_STRATEGY,
+  DEFAULT_SHADOW_CLEANUP_IDLE_MINUTES,
+  COMPOSE_HINT_FILES,
+  TEMPLATE_ROOT,
+  HOOK_NAMES,
+  toDestinationPath,
+  TEMPLATE_FILES,
+  LEGACY_WORKFLOW_SHIM_SPECS,
+  LEGACY_WORKFLOW_SHIMS,
+  MANAGED_TEMPLATE_DESTINATIONS,
+  MANAGED_TEMPLATE_SCRIPT_FILES,
+  LEGACY_MANAGED_REPO_FILES,
+  REQUIRED_MANAGED_REPO_FILES,
+  LEGACY_MANAGED_PACKAGE_SCRIPTS,
+  PACKAGE_SCRIPT_ASSETS,
+  USER_LEVEL_SKILL_ASSETS,
+  EXECUTABLE_RELATIVE_PATHS,
+  CRITICAL_GUARDRAIL_PATHS,
+  LOCK_FILE_RELATIVE,
+  AGENTS_BOTS_STATE_RELATIVE,
+  AGENTS_MARKER_START,
+  AGENTS_MARKER_END,
+  GITIGNORE_MARKER_START,
+  GITIGNORE_MARKER_END,
+  CODEX_WORKTREE_RELATIVE_DIR,
+  CLAUDE_WORKTREE_RELATIVE_DIR,
+  AGENT_WORKTREE_RELATIVE_DIRS,
+  MANAGED_GITIGNORE_PATHS,
+  REPO_SCAFFOLD_DIRECTORIES,
+  OMX_SCAFFOLD_DIRECTORIES,
+  OMX_SCAFFOLD_FILES,
+  TARGETED_FORCEABLE_MANAGED_PATHS,
+  COMMAND_TYPO_ALIASES,
+  SUGGESTIBLE_COMMANDS,
+  CLI_COMMAND_DESCRIPTIONS,
+  DEPRECATED_COMMAND_ALIASES,
+  AGENT_BOT_DESCRIPTIONS,
+  DOCTOR_AUTO_FINISH_DETAIL_LIMIT,
+  DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX,
+  DOCTOR_AUTO_FINISH_MESSAGE_MAX,
+  envFlagIsTruthy,
+  isClaudeCodeSession,
+  defaultAgentWorktreeRelativeDir,
+  AI_SETUP_PROMPT,
+  AI_SETUP_COMMANDS,
+  SCORECARD_RISK_BY_CHECK,
+};
diff --git a/src/core/runtime.js b/src/core/runtime.js
new file mode 100644
index 0000000..c5044ca
--- /dev/null
+++ b/src/core/runtime.js
@@ -0,0 +1,119 @@
+const {
+  fs,
+  path,
+  CLI_ENTRY_PATH,
+  PACKAGE_SCRIPT_ASSETS,
+} = require('../context');
+
+function requireValue(rawArgs, index, flagName) {
+  const value = rawArgs[index + 1];
+  if (!value || value.startsWith('-')) {
+    throw new Error(`${flagName} requires a value`);
+  }
+  return value;
+}
+
+function run(cmd, args, options = {}) {
+  return require('node:child_process').spawnSync(cmd, args, {
+    encoding: 'utf8',
+    stdio: options.stdio || 'pipe',
+    cwd: options.cwd,
+    env: options.env ? { ...process.env, ...options.env } : process.env,
+    timeout: options.timeout,
+  });
+}
+
+function extractTargetedArgs(rawArgs, defaultTarget = process.cwd()) {
+  const passthrough = [];
+  let target = defaultTarget;
+
+  for (let index = 0; index < rawArgs.length; index += 1) {
+    const arg = rawArgs[index];
+    if (arg === '--target' || arg === '-t') {
+      target = requireValue(rawArgs, index, '--target');
+      index += 1;
+      continue;
+    }
+    passthrough.push(arg);
+  }
+
+  return { target, passthrough };
+}
+
+function packageAssetEnv(extraEnv = {}) {
+  return {
+    GUARDEX_CLI_ENTRY: CLI_ENTRY_PATH,
+    GUARDEX_NODE_BIN: process.execPath,
+    ...extraEnv,
+  };
+}
+
+function packageAssetPath(assetKey) {
+  const assetPath = PACKAGE_SCRIPT_ASSETS[assetKey];
+  if (!assetPath) {
+    throw new Error(`Unknown package asset: ${assetKey}`);
+  }
+  if (!fs.existsSync(assetPath)) {
+    throw new Error(`Missing package asset: ${assetPath}`);
+  }
+  return assetPath;
+}
+
+function runPackageAsset(assetKey, rawArgs, options = {}) {
+  const assetPath = packageAssetPath(assetKey);
+  let cmd = 'bash';
+  if (assetPath.endsWith('.py')) {
+    cmd = 'python3';
+  } else if (assetPath.endsWith('.js')) {
+    cmd = process.execPath;
+  }
+  return run(cmd, [assetPath, ...rawArgs], {
+    cwd: options.cwd || process.cwd(),
+    stdio: options.stdio || 'pipe',
+    timeout: options.timeout,
+    env: packageAssetEnv(options.env),
+  });
+}
+
+function repoLocalLegacyScriptPath(repoRoot, relativePath) {
+  const assetPath = path.join(repoRoot, relativePath);
+  return fs.existsSync(assetPath) ? assetPath : null;
+}
+
+function runReviewBotCommand(repoRoot, rawArgs, options = {}) {
+  const legacyScript = repoLocalLegacyScriptPath(repoRoot, 'scripts/review-bot-watch.sh');
+  if (legacyScript) {
+    return run('bash', [legacyScript, ...rawArgs], {
+      cwd: repoRoot,
+      stdio: options.stdio || 'pipe',
+      timeout: options.timeout,
+      env: packageAssetEnv(options.env),
+    });
+  }
+  return runPackageAsset('reviewBot', rawArgs, {
+    ...options,
+    cwd: repoRoot,
+  });
+}
+
+function invokePackageAsset(assetKey, rawArgs, options = {}) {
+  const result = runPackageAsset(assetKey, rawArgs, options);
+  if (result.stdout) process.stdout.write(result.stdout);
+  if (result.stderr) process.stderr.write(result.stderr);
+  if (result.status !== 0) {
+    throw new Error(`${assetKey} command failed with status ${result.status}`);
+  }
+  process.exitCode = 0;
+  return result;
+}
+
+module.exports = {
+  run,
+  extractTargetedArgs,
+  packageAssetEnv,
+  packageAssetPath,
+  runPackageAsset,
+  repoLocalLegacyScriptPath,
+  runReviewBotCommand,
+  invokePackageAsset,
+};
diff --git a/src/finish/index.js b/src/finish/index.js
new file mode 100644
index 0000000..9ff9eec
--- /dev/null
+++ b/src/finish/index.js
@@ -0,0 +1,425 @@
+function createFinishApi(deps) {
+  const {
+    TOOL_NAME,
+    LOCK_FILE_RELATIVE,
+    path,
+    fs,
+    run,
+    runPackageAsset,
+    resolveRepoRoot,
+    parseCleanupArgs,
+    parseMergeArgs,
+    parseFinishArgs,
+    parseSyncArgs,
+    listAgentWorktrees,
+    listLocalAgentBranchesForFinish,
+    uniquePreserveOrder,
+    branchExists,
+    resolveFinishBaseBranch,
+    worktreeHasLocalChanges,
+    branchMergedIntoBase,
+    autoCommitWorktreeForFinish,
+    resolveBaseBranch,
+    resolveSyncStrategy,
+    ensureOriginBaseRef,
+    gitRun,
+    currentBranchName,
+    workingTreeIsDirty,
+    aheadBehind,
+    lockRegistryStatus,
+    syncOperation,
+  } = deps;
+
+  function cleanup(rawArgs) {
+    const options = parseCleanupArgs(rawArgs);
+    const repoRoot = resolveRepoRoot(options.target);
+
+    const args = [];
+    if (options.base) {
+      args.push('--base', options.base);
+    }
+    if (options.branch) {
+      args.push('--branch', options.branch);
+    }
+    if (options.forceDirty) {
+      args.push('--force-dirty');
+    }
+    if (options.dryRun) {
+      args.push('--dry-run');
+    }
+    if (!options.keepCleanWorktrees) {
+      args.push('--only-dirty-worktrees');
+    }
+    if (options.includePrMerged) {
+      args.push('--include-pr-merged');
+    }
+    if (options.idleMinutes > 0) {
+      args.push('--idle-minutes', String(options.idleMinutes));
+    }
+    if (options.maxBranches > 0) {
+      args.push('--max-branches', String(options.maxBranches));
+    }
+    args.push('--delete-branches');
+    if (!options.keepRemote) {
+      args.push('--delete-remote-branches');
+    }
+
+    const runCleanupCycle = () => {
+      const runResult = runPackageAsset('worktreePrune', args, { cwd: repoRoot, stdio: 'inherit' });
+      if (runResult.status !== 0) {
+        throw new Error('Cleanup command failed');
+      }
+    };
+
+    if (options.watch) {
+      let cycle = 0;
+      while (true) {
+        cycle += 1;
+        console.log(
+          `[${TOOL_NAME}] Cleanup watch cycle=${cycle} (interval=${options.intervalSeconds}s, idleMinutes=${options.idleMinutes}, maxBranches=${options.maxBranches > 0 ? options.maxBranches : 'unbounded'}).`,
+        );
+        runCleanupCycle();
+        if (options.once) {
+          break;
+        }
+        const sleepResult = run('sleep', [String(options.intervalSeconds)], { cwd: repoRoot });
+        if (sleepResult.status !== 0) {
+          throw new Error(`Cleanup watch sleep failed (interval=${options.intervalSeconds}s)`);
+        }
+      }
+      process.exitCode = 0;
+      return;
+    }
+
+    runCleanupCycle();
+    process.exitCode = 0;
+  }
+
+  function merge(rawArgs) {
+    const options = parseMergeArgs(rawArgs);
+    const repoRoot = resolveRepoRoot(options.target);
+
+    const args = [];
+    if (options.base) {
+      args.push('--base', options.base);
+    }
+    if (options.into) {
+      args.push('--into', options.into);
+    }
+    if (options.task) {
+      args.push('--task', options.task);
+    }
+    if (options.agent) {
+      args.push('--agent', options.agent);
+    }
+    for (const branch of options.branches) {
+      args.push('--branch', branch);
+    }
+
+    const mergeResult = runPackageAsset('branchMerge', args, { cwd: repoRoot, stdio: 'pipe' });
+    if (mergeResult.stdout) {
+      process.stdout.write(mergeResult.stdout);
+    }
+    if (mergeResult.stderr) {
+      process.stderr.write(mergeResult.stderr);
+    }
+    if (mergeResult.status !== 0) {
+      throw new Error(`merge command failed with status ${mergeResult.status}`);
+    }
+
+    process.exitCode = 0;
+  }
+
+  function finish(rawArgs, defaults = {}) {
+    const options = parseFinishArgs(rawArgs, defaults);
+    const repoRoot = resolveRepoRoot(options.target);
+
+    const worktreeEntries = listAgentWorktrees(repoRoot);
+    const worktreeByBranch = new Map(worktreeEntries.map((entry) => [entry.branch, entry.worktreePath]));
+
+    let candidateBranches = [];
+    if (options.branch) {
+      if (!branchExists(repoRoot, options.branch)) {
+        throw new Error(`Local branch not found: ${options.branch}`);
+      }
+      candidateBranches = [options.branch];
+    } else {
+      candidateBranches = uniquePreserveOrder([
+        ...listLocalAgentBranchesForFinish(repoRoot),
+        ...worktreeEntries.map((entry) => entry.branch),
+      ]);
+    }
+
+    const candidates = [];
+    for (const branch of candidateBranches) {
+      const worktreePath = worktreeByBranch.get(branch) || '';
+      const baseBranch = resolveFinishBaseBranch(repoRoot, branch, options.base);
+      const hasChanges = worktreePath ? worktreeHasLocalChanges(worktreePath) : false;
+      const alreadyMerged = branchMergedIntoBase(repoRoot, branch, baseBranch);
+      if (options.all || options.branch || hasChanges || !alreadyMerged) {
+        candidates.push({
+          branch,
+          baseBranch,
+          worktreePath,
+          hasChanges,
+          alreadyMerged,
+        });
+      }
+    }
+
+    if (candidates.length === 0) {
+      console.log(`[${TOOL_NAME}] No pending agent branches to finish.`);
+      process.exitCode = 0;
+      return;
+    }
+
+    let succeeded = 0;
+    let failed = 0;
+    let autoCommitted = 0;
+
+    for (const candidate of candidates) {
+      const { branch, baseBranch, worktreePath } = candidate;
+      console.log(
+        `[${TOOL_NAME}] Finishing '${branch}' -> '${baseBranch}'${worktreePath ? ` (${worktreePath})` : ''}...`,
+      );
+
+      try {
+        let commitState = { changed: false, committed: false };
+        if (worktreePath) {
+          commitState = autoCommitWorktreeForFinish(repoRoot, worktreePath, branch, options);
+        }
+
+        if (commitState.committed) {
+          autoCommitted += 1;
+          console.log(`[${TOOL_NAME}] Auto-committed '${branch}' before finish.`);
+        } else if (commitState.changed && commitState.dryRun) {
+          console.log(`[${TOOL_NAME}] [dry-run] Would auto-commit pending changes on '${branch}'.`);
+        }
+
+        const finishArgs = [
+          '--branch',
+          branch,
+          '--base',
+          baseBranch,
+          options.waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge',
+          options.cleanup ? '--cleanup' : '--no-cleanup',
+        ];
+        if (options.mergeMode === 'pr') {
+          finishArgs.push('--via-pr');
+        } else if (options.mergeMode === 'direct') {
+          finishArgs.push('--direct-only');
+        } else {
+          finishArgs.push('--mode', 'auto');
+        }
+        if (options.keepRemote) {
+          finishArgs.push('--keep-remote-branch');
+        }
+
+        if (options.dryRun) {
+          console.log(`[${TOOL_NAME}] [dry-run] Would run: gx branch finish ${finishArgs.join(' ')}`);
+          succeeded += 1;
+          continue;
+        }
+
+        const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot, stdio: 'pipe' });
+        if (finishResult.stdout) {
+          process.stdout.write(finishResult.stdout);
+        }
+        if (finishResult.stderr) {
+          process.stderr.write(finishResult.stderr);
+        }
+        if (finishResult.status !== 0) {
+          throw new Error(`agent-branch-finish exited with status ${finishResult.status}`);
+        }
+
+        succeeded += 1;
+      } catch (error) {
+        failed += 1;
+        console.error(`[${TOOL_NAME}] Finish failed for '${branch}': ${error.message}`);
+        if (options.failFast) {
+          break;
+        }
+      }
+    }
+
+    console.log(
+      `[${TOOL_NAME}] Finish summary: total=${candidates.length}, success=${succeeded}, failed=${failed}, autoCommitted=${autoCommitted}`,
+    );
+
+    if (failed > 0) {
+      throw new Error('finish command failed for one or more agent branches');
+    }
+
+    process.exitCode = 0;
+  }
+
+  function sync(rawArgs) {
+    const options = parseSyncArgs(rawArgs);
+    const repoRoot = resolveRepoRoot(options.target);
+    const baseBranch = resolveBaseBranch(repoRoot, options.base);
+    const strategy = resolveSyncStrategy(repoRoot, options.strategy);
+    const baseRef = `origin/${baseBranch}`;
+
+    ensureOriginBaseRef(repoRoot, baseBranch);
+
+    if (options.allAgentBranches) {
+      const refs = gitRun(repoRoot, ['for-each-ref', '--format=%(refname:short)', 'refs/heads/agent/*'], { allowFailure: true });
+      if (refs.status !== 0) {
+        throw new Error('Unable to list local agent branches');
+      }
+      const branches = (refs.stdout || '').split('\n').map((item) => item.trim()).filter(Boolean);
+      const rows = branches.map((branch) => {
+        const counts = aheadBehind(repoRoot, branch, baseRef);
+        return {
+          branch,
+          base: baseRef,
+          ahead: counts.ahead,
+          behind: counts.behind,
+          syncRequired: counts.behind > 0,
+        };
+      });
+
+      if (options.json) {
+        process.stdout.write(`${JSON.stringify({
+          repoRoot,
+          base: baseRef,
+          branchCount: rows.length,
+          rows,
+        }, null, 2)}\n`);
+      } else {
+        console.log(`[${TOOL_NAME}] Sync report target: ${repoRoot}`);
+        console.log(`[${TOOL_NAME}] Base: ${baseRef}`);
+        if (rows.length === 0) {
+          console.log(`[${TOOL_NAME}] No local agent branches found.`);
+        } else {
+          for (const row of rows) {
+            console.log(`  - ${row.branch} | ahead ${row.ahead} | behind ${row.behind} | syncRequired=${row.syncRequired}`);
+          }
+        }
+      }
+
+      const hasBehind = rows.some((row) => row.behind > 0);
+      process.exitCode = options.check && hasBehind ? 1 : 0;
+      return;
+    }
+
+    const branch = currentBranchName(repoRoot);
+    if (!options.allowNonAgent && !branch.startsWith('agent/')) {
+      throw new Error(`sync is limited to agent/* branches by default (current: ${branch}). Use --allow-non-agent to override.`);
+    }
+
+    const dirty = workingTreeIsDirty(repoRoot);
+    if (!options.check && !options.allowDirty && dirty) {
+      throw new Error('Sync blocked: working tree is not clean. Commit or stash changes first, or pass --allow-dirty.');
+    }
+
+    const before = aheadBehind(repoRoot, branch, baseRef);
+
+    const payload = {
+      repoRoot,
+      branch,
+      base: baseRef,
+      strategy,
+      dirty,
+      aheadBefore: before.ahead,
+      behindBefore: before.behind,
+      syncRequired: before.behind > 0,
+      status: 'checked',
+    };
+
+    if (options.check) {
+      if (options.json) {
+        process.stdout.write(`${JSON.stringify(payload, null, 2)}\n`);
+      } else {
+        console.log(`[${TOOL_NAME}] Sync check target: ${repoRoot}`);
+        console.log(`[${TOOL_NAME}] Branch: ${branch}`);
+        console.log(`[${TOOL_NAME}] Base: ${baseRef}`);
+        console.log(`[${TOOL_NAME}] Ahead: ${before.ahead}`);
+        console.log(`[${TOOL_NAME}] Behind: ${before.behind}`);
+        console.log(`[${TOOL_NAME}] Sync required: ${before.behind > 0 ? 'yes' : 'no'}`);
+      }
+      process.exitCode = before.behind > 0 ? 1 : 0;
+      return;
+    }
+
+    if (before.behind === 0) {
+      const result = { ...payload, status: 'no-op', aheadAfter: before.ahead, behindAfter: before.behind };
+      if (options.json) {
+        process.stdout.write(`${JSON.stringify(result, null, 2)}\n`);
+      } else {
+        console.log(`[${TOOL_NAME}] Branch '${branch}' is already up to date with ${baseRef}.`);
+      }
+      process.exitCode = 0;
+      return;
+    }
+
+    if (options.dryRun) {
+      const result = { ...payload, status: 'dry-run' };
+      if (options.json) {
+        process.stdout.write(`${JSON.stringify(result, null, 2)}\n`);
+      } else {
+        console.log(`[${TOOL_NAME}] Dry run: would sync '${branch}' onto ${baseRef} via ${strategy}.`);
+      }
+      process.exitCode = 0;
+      return;
+    }

+    const lockPath = path.join(repoRoot, LOCK_FILE_RELATIVE);
+    const lockState = lockRegistryStatus(repoRoot);
+    let lockBackup = null;
+    if (lockState.dirty && fs.existsSync(lockPath)) {
+      lockBackup = fs.readFileSync(lockPath, 'utf8');
+    }
+
+    if (lockState.dirty) {
+      if (lockState.untracked) {
+        fs.rmSync(lockPath, { force: true });
+      } else {
+        const resetLock = gitRun(repoRoot, ['checkout', '--', LOCK_FILE_RELATIVE], { allowFailure: true });
+        if (resetLock.status !== 0) {
+          throw new Error(`Unable to temporarily reset ${LOCK_FILE_RELATIVE} before sync`);
+        }
+      }
+    }
+
+    try {
+      syncOperation(repoRoot, strategy, baseRef, options.ffOnly);
+    } finally {
+      if (lockBackup !== null) {
+        fs.mkdirSync(path.dirname(lockPath), { recursive: true });
+        fs.writeFileSync(lockPath, lockBackup, 'utf8');
+      }
+    }
+    const after = aheadBehind(repoRoot, branch, baseRef);
+    const result = {
+      ...payload,
+      status: 'success',
+      aheadAfter: after.ahead,
+      behindAfter: after.behind,
+    };
+
+    if (options.json) {
+      process.stdout.write(`${JSON.stringify(result, null, 2)}\n`);
+    } else {
+      console.log(`[${TOOL_NAME}] Sync target: ${repoRoot}`);
+      console.log(`[${TOOL_NAME}] Branch: ${branch}`);
+      console.log(`[${TOOL_NAME}] Base: ${baseRef}`);
+      console.log(`[${TOOL_NAME}] Strategy: ${strategy}`);
+      console.log(`[${TOOL_NAME}] Behind before sync: ${before.behind}`);
+      console.log(`[${TOOL_NAME}] Result: success (behind now: ${after.behind})`);
+    }
+
+    process.exitCode = 0;
+  }
+
+  return {
+    cleanup,
+    merge,
+    finish,
+    sync,
+  };
+}
+
+module.exports = {
+  createFinishApi,
+};
diff --git a/src/git/index.js b/src/git/index.js
new file mode 100644
index 0000000..45edb88
--- /dev/null
+++ b/src/git/index.js
@@ -0,0 +1,112 @@
+const { path } = require('../context');
+const { run } = require('../core/runtime');
+
+function gitRun(repoRoot, args, { allowFailure = false } = {}) {
+  const result = run('git', ['-C', repoRoot, ...args]);
+  if (!allowFailure && result.status !== 0) {
+    throw new Error(`git ${args.join(' ')} failed: ${(result.stderr || '').trim()}`);
+  }
+  return result;
+}
+
+function resolveRepoRoot(targetPath) {
+  const resolvedTarget = path.resolve(targetPath || process.cwd());
+  const result = run('git', ['-C', resolvedTarget, 'rev-parse', '--show-toplevel']);
+  if (result.status !== 0) {
+    const stderr = (result.stderr || '').trim();
+    throw new Error(
+      `Target is not inside a git repository: ${resolvedTarget}${stderr ? `\n${stderr}` : ''}`,
+    );
+  }
+  return result.stdout.trim();
+}
+
+function isGitRepo(targetPath) {
+  const resolvedTarget = path.resolve(targetPath || process.cwd());
+  const result = run('git', ['-C', resolvedTarget, 'rev-parse', '--show-toplevel']);
+  return result.status === 0;
+}
+
+const NESTED_REPO_DEFAULT_MAX_DEPTH = 6;
+const NESTED_REPO_DEFAULT_SKIP_DIRS = new Set([
+  'node_modules',
+  '.git',
+  'dist',
+  'build',
+  '.next',
+  '.cache',
+  'target',
+  'vendor',
+  '.venv',
+  '.pnpm-store',
+]);
+
+function discoverNestedGitRepos(rootPath, opts = {}) {
+  const maxDepth = Number.isFinite(opts.maxDepth)
+    ? Math.max(1, opts.maxDepth)
+    : NESTED_REPO_DEFAULT_MAX_DEPTH;
+  const extraSkip = new Set(Array.isArray(opts.extraSkip) ? opts.extraSkip : []);
+  const includeSubmodules = Boolean(opts.includeSubmodules);
+  const resolvedRoot = path.resolve(rootPath);
+
+  if (!isGitRepo(resolvedRoot)) {
+    throw new Error(`Target is not inside a git repository: ${resolvedRoot}`);
+  }
+
+  const results = [];
+  const seen = new Set();
+
+  function visit(directoryPath, depth) {
+    const repoRoot = resolveRepoRoot(directoryPath);
+    if (!seen.has(repoRoot)) {
+      seen.add(repoRoot);
+      results.push(repoRoot);
+    }
+
+    if (depth >= maxDepth) {
+      return;
+    }
+
+    let entries = [];
+    try {
+      entries = require('node:fs').readdirSync(directoryPath, { withFileTypes: true });
+    } catch {
+      return;
+    }
+
+    for (const entry of entries) {
+      if (!entry.isDirectory()) {
+        continue;
+      }
+      if (NESTED_REPO_DEFAULT_SKIP_DIRS.has(entry.name) || extraSkip.has(entry.name)) {
+        continue;
+      }
+
+      const childPath = path.join(directoryPath, entry.name);
+      const gitDir = path.join(childPath, '.git');
+      if (require('node:fs').existsSync(gitDir)) {
+        if (!includeSubmodules) {
+          const gitInfo = require('node:fs').lstatSync(gitDir);
+          if (gitInfo.isFile()) {
+            continue;
+          }
+        }
+        visit(childPath, depth + 1);
+        continue;
+      }
+
+      visit(childPath, depth + 1);
+    }
+  }
+
+  visit(resolvedRoot, 0);
+  return results;
+}
+
+module.exports = {
+  DEFAULT_NESTED_REPO_MAX_DEPTH: NESTED_REPO_DEFAULT_MAX_DEPTH,
+  gitRun,
+  resolveRepoRoot,
+  isGitRepo,
+  discoverNestedGitRepos,
+};
diff --git a/src/hooks/index.js b/src/hooks/index.js
new file mode 100644
index 0000000..8ba2fc2
--- /dev/null
+++ b/src/hooks/index.js
@@ -0,0 +1,77 @@
+const { path, TEMPLATE_ROOT, HOOK_NAMES, TOOL_NAME, SHORT_TOOL_NAME } = require('../context');
+const {
+  run,
+  runPackageAsset,
+  runReviewBotCommand,
+  packageAssetEnv,
+  extractTargetedArgs,
+} = require('../core/runtime');
+const { resolveRepoRoot } = require('../git');
+
+function configureHooks(repoRoot, dryRun) {
+  if (dryRun) {
+    return { status: 'would-set', key: 'core.hooksPath', value: '.githooks' };
+  }
+
+  const result = run('git', ['-C', repoRoot, 'config', 'core.hooksPath', '.githooks']);
+  if (result.status !== 0) {
+    throw new Error(`Failed to set git hooksPath: ${(result.stderr || '').trim()}`);
+  }
+
+  return { status: 'set', key: 'core.hooksPath', value: '.githooks' };
+}
+
+function hook(rawArgs) {
+  const [subcommand, ...rest] = rawArgs;
+  if (subcommand === 'run') {
+    const [hookName, ...hookArgs] = rest;
+    if (!HOOK_NAMES.includes(hookName)) {
+      throw new Error(`Unknown hook name: ${hookName || '(missing)'}`);
+    }
+    const { target, passthrough } = extractTargetedArgs(hookArgs);
+    const hookAssetPath = path.join(TEMPLATE_ROOT, 'githooks', hookName);
+    const result = run('bash', [hookAssetPath, ...passthrough], {
+      cwd: resolveRepoRoot(target),
+      stdio: hookName === 'pre-push' ? 'inherit' : 'pipe',
+      env: packageAssetEnv(),
+    });
+    if (result.stdout) process.stdout.write(result.stdout);
+    if (result.stderr) process.stderr.write(result.stderr);
+    process.exitCode = result.status;
+    return;
+  }
+  if (subcommand === 'install') {
+    const { target, passthrough } = extractTargetedArgs(rest);
+    if (passthrough.length > 0) {
+      throw new Error(`Unknown hook install option: ${passthrough[0]}`);
+    }
+    const repoRoot = resolveRepoRoot(target);
+    const hookResult = configureHooks(repoRoot, false);
+    console.log(`[${TOOL_NAME}] Hook install target: ${repoRoot}`);
+    console.log(`  - hooksPath ${hookResult.status} ${hookResult.key}=${hookResult.value}`);
+    process.exitCode = 0;
+    return;
+  }
+  throw new Error(`Usage: ${SHORT_TOOL_NAME} hook ...`);
+}
+
+function internal(rawArgs) {
+  const [subcommand, assetKey, ...rest] = rawArgs;
+  if (subcommand !== 'run-shell') {
+    throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`);
+  }
+  const { target, passthrough } = extractTargetedArgs(rest);
+  const repoRoot = resolveRepoRoot(target);
+  const result = assetKey === 'reviewBot'
+    ? runReviewBotCommand(repoRoot, passthrough)
+    : runPackageAsset(assetKey, passthrough, { cwd: repoRoot });
+  if (result.stdout) process.stdout.write(result.stdout);
+  if (result.stderr) process.stderr.write(result.stderr);
+  process.exitCode = result.status;
+}
+
+module.exports = {
+  configureHooks,
+  hook,
+  internal,
+};
diff --git a/src/output/index.js b/src/output/index.js
new file mode 100644
index 0000000..8d43f57
--- /dev/null
+++ b/src/output/index.js
@@ -0,0 +1,398 @@
+const {
+  path,
+  packageJson,
+  TOOL_NAME,
+  SHORT_TOOL_NAME,
+  LEGACY_NAMES,
+  GUARDEX_REPO_TOGGLE_ENV,
+  CLI_COMMAND_DESCRIPTIONS,
+  AGENT_BOT_DESCRIPTIONS,
+  DOCTOR_AUTO_FINISH_DETAIL_LIMIT,
+  DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX,
+  DOCTOR_AUTO_FINISH_MESSAGE_MAX,
+} = require('../context');
+
+function runtimeVersion() {
+  return `${packageJson.name}/${packageJson.version} ${process.platform}-${process.arch} node-${process.version}`;
+}
+
+function supportsAnsiColors() {
+  const forced = String(process.env.FORCE_COLOR || '').trim().toLowerCase();
+  if (['0', 'false', 'no', 'off'].includes(forced)) {
+    return false;
+  }
+  if (forced.length > 0) {
+    return true;
+  }
+  if (process.env.NO_COLOR) {
+    return false;
+  }
+  return Boolean(process.stdout.isTTY) && process.env.TERM !== 'dumb';
+}
+
+function colorize(text, colorCode) {
+  if (!supportsAnsiColors()) {
+    return text;
+  }
+  return `\u001B[${colorCode}m${text}\u001B[0m`;
+}
+
+function doctorOutputColorCode(status) {
+  const normalized = String(status || '').trim().toLowerCase();
+  if (['active', 'done', 'ok', 'safe', 'success'].includes(normalized)) {
+    return '32';
+  }
+  if (normalized === 'disabled') {
+    return '36';
+  }
+  if (['degraded', 'pending', 'skip', 'warn', 'warning'].includes(normalized)) {
+    return '33';
+  }
+  if (['error', 'fail', 'inactive', 'unsafe'].includes(normalized)) {
+    return '31';
+  }
+  return null;
+}
+
+function colorizeDoctorOutput(text, status) {
+  const colorCode = doctorOutputColorCode(status);
+  return colorCode ? colorize(text, colorCode) : text;
+}
+
+function detectAutoFinishDetailStatus(detail) {
+  const trimmed = String(detail || '').trim();
+  const match = trimmed.match(/^\[(\w+)\]/);
+  if (match) {
+    return match[1].toLowerCase();
+  }
+  if (/^Skipped\b/i.test(trimmed) || /^No local agent branches found\b/i.test(trimmed)) {
+    return 'skip';
+  }
+  return null;
+}
+
+function detectAutoFinishSummaryStatus(summary) {
+  if (!summary || summary.enabled === false) {
+    return detectAutoFinishDetailStatus(summary?.details?.[0]);
+  }
+  if ((summary.failed || 0) > 0) {
+    return 'fail';
+  }
+  if ((summary.completed || 0) > 0) {
+    return 'done';
+  }
+  if ((summary.skipped || 0) > 0) {
+    return 'skip';
+  }
+  return null;
+}
+
+function statusDot(status) {
+  if (status === 'active') {
+    return colorize('●', '32');
+  }
+  if (status === 'inactive') {
+    return colorize('●', '31');
+  }
+  if (status === 'disabled') {
+    return colorize('●', '36');
+  }
+  return colorize('●', '33');
+}
+
+function commandCatalogLines(indent = '  ') {
+  const maxCommandLength = CLI_COMMAND_DESCRIPTIONS.reduce(
+    (max, [command]) => Math.max(max, command.length),
+    0,
+  );
+  return CLI_COMMAND_DESCRIPTIONS.map(
+    ([command, description]) => `${indent}${command.padEnd(maxCommandLength + 2)}${description}`,
+  );
+}
+
+function agentBotCatalogLines(indent = '  ') {
+  const maxCommandLength = AGENT_BOT_DESCRIPTIONS.reduce(
+    (max, [command]) => Math.max(max, command.length),
+    0,
+  );
+  return AGENT_BOT_DESCRIPTIONS.map(
+    ([command, description]) => `${indent}${command.padEnd(maxCommandLength + 2)}${description}`,
+  );
+}
+
+function repoToggleLines(indent = '  ') {
+  return [
+    `${indent}Set repo-root .env: ${GUARDEX_REPO_TOGGLE_ENV}=0 disables Guardex, ${GUARDEX_REPO_TOGGLE_ENV}=1 enables it again`,
+  ];
+}
+
+function printToolLogsSummary() {
+  const usageLine = `  $ ${SHORT_TOOL_NAME} [options]`;
+  const commandDetails = commandCatalogLines('  ');
+  const agentBotDetails = agentBotCatalogLines('  ');
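Editor's aside (illustrative, not part of the patch): `supportsAnsiColors` above implements the common FORCE_COLOR / NO_COLOR precedence — an explicit falsy `FORCE_COLOR` wins, any other non-empty `FORCE_COLOR` forces color on, then `NO_COLOR` disables, and finally TTY plus a non-`dumb` TERM decide. The sketch below parameterizes the environment and TTY state for testability, which the patched function does not (it reads `process.env` and `process.stdout` directly):

```javascript
// Testable restatement of the precedence chain in supportsAnsiColors;
// env/isTTY/term parameters are demo-only assumptions.
function supportsAnsiColors(env = process.env, isTTY = false, term = '') {
  const forced = String(env.FORCE_COLOR || '').trim().toLowerCase();
  if (['0', 'false', 'no', 'off'].includes(forced)) return false; // explicit off wins
  if (forced.length > 0) return true;                             // any other value forces on
  if (env.NO_COLOR) return false;                                 // NO_COLOR convention
  return Boolean(isTTY) && term !== 'dumb';                       // fall back to terminal state
}

console.log(supportsAnsiColors({ FORCE_COLOR: '1', NO_COLOR: '1' })); // → true (FORCE_COLOR wins)
console.log(supportsAnsiColors({ NO_COLOR: '1' }, true, 'xterm'));    // → false
console.log(supportsAnsiColors({}, true, 'dumb'));                    // → false
```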
+  const repoToggleDetails = repoToggleLines('  ');
+
+  if (!supportsAnsiColors()) {
+    console.log(`${TOOL_NAME}-tools logs:`);
+    console.log('  USAGE');
+    console.log(usageLine);
+    console.log('  COMMANDS');
+    for (const line of commandDetails) {
+      console.log(line);
+    }
+    console.log('  AGENT BOT');
+    for (const line of agentBotDetails) {
+      console.log(line);
+    }
+    console.log('  REPO TOGGLE');
+    for (const line of repoToggleDetails) {
+      console.log(line);
+    }
+    return;
+  }
+
+  const title = colorize(`${TOOL_NAME}-tools logs`, '1;36');
+  const usageHeader = colorize('USAGE', '1');
+  const commandsHeader = colorize('COMMANDS', '1');
+  const agentBotHeader = colorize('AGENT BOT', '1');
+  const repoToggleHeader = colorize('REPO TOGGLE', '1');
+  const pipe = colorize('│', '90');
+  const tee = colorize('├', '90');
+  const corner = colorize('└', '90');
+
+  console.log(`${title}:`);
+  console.log(`  ${tee}─ ${usageHeader}`);
+  console.log(`  ${pipe}${usageLine}`);
+  console.log(`  ${tee}─ ${commandsHeader}`);
+  for (const line of commandDetails) {
+    if (!line) {
+      console.log(`  ${pipe}`);
+      continue;
+    }
+    console.log(`  ${pipe}${line.slice(2)}`);
+  }
+  console.log(`  ${tee}─ ${agentBotHeader}`);
+  for (const line of agentBotDetails) {
+    if (!line) {
+      console.log(`  ${pipe}`);
+      continue;
+    }
+    console.log(`  ${pipe}${line.slice(2)}`);
+  }
+  console.log(`  ${tee}─ ${repoToggleHeader}`);
+  for (const line of repoToggleDetails) {
+    if (!line) {
+      console.log(`  ${pipe}`);
+      continue;
+    }
+    console.log(`  ${pipe}${line.slice(2)}`);
+  }
+  console.log(`  ${corner}─ ${colorize(`Try '${TOOL_NAME} doctor' for one-step repair + verification.`, '2')}`);
+}
+
+function usage(options = {}) {
+  const { outsideGitRepo = false } = options;
+
+  console.log(`A command-line tool that sets up hardened multi-agent safety for git repositories.
+
+VERSION
+  ${runtimeVersion()}
+
+USAGE
+  $ ${SHORT_TOOL_NAME} [options]
+
+COMMANDS
+${commandCatalogLines().join('\n')}
+
+AGENT BOT
+${agentBotCatalogLines().join('\n')}
+
+REPO TOGGLE
+${repoToggleLines().join('\n')}
+
+NOTES
+  - No command = ${SHORT_TOOL_NAME} status. ${SHORT_TOOL_NAME} init is an alias of ${SHORT_TOOL_NAME} setup.
+  - Global installs need Y/N approval; GitHub CLI (gh) is required for PR automation.
+  - Target another repo: ${SHORT_TOOL_NAME} --target .
+  - On protected main, setup/install/fix/doctor auto-sandbox via agent branch + PR flow.
+  - Run '${SHORT_TOOL_NAME} cleanup' to prune merged agent branches/worktrees.
+  - Legacy aliases: ${LEGACY_NAMES.join(', ')}.`);
+
+  if (outsideGitRepo) {
+    console.log(`
+[${TOOL_NAME}] No git repository detected in current directory.
+[${TOOL_NAME}] Start from a repo root, or pass an explicit target:
+  ${TOOL_NAME} setup --target
+  ${TOOL_NAME} doctor --target `);
+  }
+}
+
+function formatElapsedDuration(ms) {
+  const durationMs = Number.isFinite(ms) ? Math.max(0, ms) : 0;
+  if (durationMs < 1000) {
+    return `${Math.round(durationMs)}ms`;
+  }
+  if (durationMs < 10_000) {
+    return `${(durationMs / 1000).toFixed(1)}s`;
+  }
+  return `${Math.round(durationMs / 1000)}s`;
+}
+
+function truncateMiddle(value, maxLength) {
+  const text = String(value || '');
+  const limit = Number.isFinite(maxLength) ? Math.max(4, maxLength) : 0;
+  if (!limit || text.length <= limit) {
+    return text;
+  }
+
+  const visible = limit - 1;
+  const headLength = Math.ceil(visible / 2);
+  const tailLength = Math.floor(visible / 2);
+  return `${text.slice(0, headLength)}…${text.slice(text.length - tailLength)}`;
+}
+
+function truncateTail(value, maxLength) {
+  const text = String(value || '');
+  const limit = Number.isFinite(maxLength) ? Math.max(4, maxLength) : 0;
+  if (!limit || text.length <= limit) {
+    return text;
+  }
+  return `${text.slice(0, limit - 1)}…`;
+}
+
+function compactAutoFinishPathSegments(message) {
+  return String(message || '').replace(/\((\/[^)]+)\)/g, (_, rawPath) => {
+    if (
+      rawPath.includes(`${path.sep}.omx${path.sep}agent-worktrees${path.sep}`) ||
+      rawPath.includes(`${path.sep}.omc${path.sep}agent-worktrees${path.sep}`)
+    ) {
+      return `(${path.basename(rawPath)})`;
+    }
+    return `(${truncateMiddle(rawPath, 72)})`;
+  });
+}
+
+function detectRecoverableAutoFinishConflict(message) {
+  const text = String(message || '').trim();
+  if (!text) {
+    return null;
+  }
+
+  if (/rebase --continue/i.test(text) && /rebase --abort/i.test(text)) {
+    return {
+      rawLabel: 'auto-finish requires manual rebase.',
+      summary: 'manual rebase required in the source-probe worktree; run rebase --continue or rebase --abort',
+    };
+  }
+
+  if (/Rebase\/merge '.+' into '.+' and resolve conflicts before finishing\./i.test(text)) {
+    return {
+      rawLabel: 'auto-finish requires manual rebase or merge.',
+      summary: 'manual rebase or merge required before auto-finish can continue',
+    };
+  }
+
+  if (/Merge conflict detected while merging/i.test(text)) {
+    return {
+      rawLabel: 'auto-finish requires manual merge resolution.',
+      summary: 'manual merge resolution required before auto-finish can continue',
+    };
+  }
+
+  return null;
+}
+
+function summarizeAutoFinishDetail(detail) {
+  const trimmed = String(detail || '').trim();
+  const match = trimmed.match(/^\[(\w+)\]\s+([^:]+):\s*(.*)$/);
+  if (!match) {
+    return truncateTail(compactAutoFinishPathSegments(trimmed), DOCTOR_AUTO_FINISH_MESSAGE_MAX);
+  }
+
+  const [, status, rawBranch, rawMessage] = match;
+  const branch = truncateMiddle(rawBranch, DOCTOR_AUTO_FINISH_BRANCH_LABEL_MAX);
+  let message = String(rawMessage || '').trim();
+  const recoverableConflict = status === 'skip' ? detectRecoverableAutoFinishConflict(message) : null;
+
+  if (recoverableConflict) {
+    message = recoverableConflict.summary;
+  } else if (status === 'fail') {
+    message = message.replace(/^auto-finish failed\.?\s*/i, '');
+    if (/\[agent-sync-guard\]/.test(message) && /Resolve conflicts/i.test(message)) {
+      message = 'rebase conflict in finish flow; run rebase --continue or rebase --abort in the source-probe worktree';
+    } else if (/unable to compute ahead\/behind/i.test(message)) {
+      const aheadBehindMatch = message.match(/unable to compute ahead\/behind(?: \([^)]+\))?/i);
+      if (aheadBehindMatch) {
+        message = aheadBehindMatch[0];
+      }
+    } else if (/remote ref does not exist/i.test(message)) {
+      message = 'branch merged, but the remote ref was already removed during cleanup';
+    }
+  }
+
+  message = compactAutoFinishPathSegments(message)
+    .replace(/\s+\|\s+/g, '; ')
+    .trim();
+
+  return `[${status}] ${branch}: ${truncateTail(message, DOCTOR_AUTO_FINISH_MESSAGE_MAX)}`;
+}
+
+function printAutoFinishSummary(summary, options = {}) {
+  const enabled = Boolean(summary && summary.enabled);
+  const details = Array.isArray(summary && summary.details) ? summary.details : [];
+  const baseBranch = String(options.baseBranch || summary?.baseBranch || '').trim();
+  const verbose = Boolean(options.verbose);
+  const detailLimit = Number.isFinite(options.detailLimit)
+    ? Math.max(0, options.detailLimit)
+    : DOCTOR_AUTO_FINISH_DETAIL_LIMIT;
+
+  if (enabled) {
+    console.log(
+      colorizeDoctorOutput(
+        `[${TOOL_NAME}] Auto-finish sweep (base=${baseBranch}): attempted=${summary.attempted}, completed=${summary.completed}, skipped=${summary.skipped}, failed=${summary.failed}`,
+        detectAutoFinishSummaryStatus(summary),
+      ),
+    );
+    const visibleDetails = verbose ?
details : details.slice(0, detailLimit).map(summarizeAutoFinishDetail); + for (const detail of visibleDetails) { + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail))); + } + if (!verbose && details.length > visibleDetails.length) { + console.log( + colorizeDoctorOutput( + `[${TOOL_NAME}] … ${details.length - visibleDetails.length} more branch result(s). Re-run with --verbose-auto-finish for full details.`, + 'warn', + ), + ); + } + return; + } + + if (details.length > 0) { + const detail = verbose ? details[0] : summarizeAutoFinishDetail(details[0]); + console.log(colorizeDoctorOutput(`[${TOOL_NAME}] ${detail}`, detectAutoFinishDetailStatus(detail))); + } +} + +module.exports = { + runtimeVersion, + supportsAnsiColors, + colorize, + doctorOutputColorCode, + colorizeDoctorOutput, + detectAutoFinishDetailStatus, + detectAutoFinishSummaryStatus, + statusDot, + commandCatalogLines, + agentBotCatalogLines, + repoToggleLines, + printToolLogsSummary, + usage, + formatElapsedDuration, + truncateMiddle, + truncateTail, + compactAutoFinishPathSegments, + detectRecoverableAutoFinishConflict, + summarizeAutoFinishDetail, + printAutoFinishSummary, +}; diff --git a/src/sandbox/index.js b/src/sandbox/index.js new file mode 100644 index 0000000..66a7726 --- /dev/null +++ b/src/sandbox/index.js @@ -0,0 +1,68 @@ +function createSandboxApi(deps) { + const { + protectedBaseWriteBlock, + runInstallInternal, + ensureSetupProtectedBranches, + ensureParentWorkspaceView, + buildParentWorkspaceView, + runFixInternal, + } = deps; + + function assertProtectedMainWriteAllowed(options, commandName) { + const blocked = protectedBaseWriteBlock(options); + if (!blocked) { + return; + } + + throw new Error( + `${commandName} blocked on protected branch '${blocked.branch}' in an initialized repo.\n` + + `Keep local '${blocked.branch}' pull-only: start an agent branch/worktree first:\n` + + ` gx branch start "" "codex"\n` + + `Override once only when 
intentional: --allow-protected-base-write`, + ); + } + + function runSetupBootstrapInternal(options) { + const installPayload = runInstallInternal(options); + installPayload.operations.push( + ensureSetupProtectedBranches(installPayload.repoRoot, Boolean(options.dryRun)), + ); + + let parentWorkspace = null; + if (options.parentWorkspaceView) { + installPayload.operations.push( + ensureParentWorkspaceView(installPayload.repoRoot, Boolean(options.dryRun)), + ); + if (!options.dryRun) { + parentWorkspace = buildParentWorkspaceView(installPayload.repoRoot); + } + } + + const fixPayload = runFixInternal({ + target: installPayload.repoRoot, + dryRun: options.dryRun, + force: options.force, + forceManagedPaths: options.forceManagedPaths, + dropStaleLocks: true, + skipAgents: options.skipAgents, + skipPackageJson: options.skipPackageJson, + skipGitignore: options.skipGitignore, + allowProtectedBaseWrite: options.allowProtectedBaseWrite, + }); + + return { + installPayload, + fixPayload, + parentWorkspace, + }; + } + + return { + assertProtectedMainWriteAllowed, + runSetupBootstrapInternal, + }; +} + +module.exports = { + createSandboxApi, +}; diff --git a/src/scaffold/index.js b/src/scaffold/index.js new file mode 100644 index 0000000..d6b3b08 --- /dev/null +++ b/src/scaffold/index.js @@ -0,0 +1,169 @@ +const { + fs, + path, + TOOL_NAME, + SHORT_TOOL_NAME, + EXECUTABLE_RELATIVE_PATHS, + CRITICAL_GUARDRAIL_PATHS, +} = require('../context'); + +function toDestinationPath(relativeTemplatePath) { + if (relativeTemplatePath.startsWith('scripts/')) { + return relativeTemplatePath; + } + if (relativeTemplatePath.startsWith('githooks/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('codex/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('claude/')) { + return `.${relativeTemplatePath}`; + } + if (relativeTemplatePath.startsWith('github/')) { + return `.${relativeTemplatePath}`; + } + if 
(relativeTemplatePath.startsWith('vscode/')) { + return relativeTemplatePath; + } + throw new Error(`Unsupported template path: ${relativeTemplatePath}`); +} + +function ensureParentDir(repoRoot, filePath, dryRun) { + if (dryRun) return; + + const parentDir = path.dirname(filePath); + const relativeParentDir = path.relative(repoRoot, parentDir); + const segments = relativeParentDir.split(path.sep).filter(Boolean); + let currentPath = repoRoot; + + for (const segment of segments) { + currentPath = path.join(currentPath, segment); + if (fs.existsSync(currentPath) && !fs.statSync(currentPath).isDirectory()) { + const blockingPath = path.relative(repoRoot, currentPath) || path.basename(currentPath); + const targetPath = path.relative(repoRoot, filePath) || path.basename(filePath); + throw new Error( + `Path conflict: ${blockingPath} exists as a file, but ${targetPath} needs it to be a directory. ` + + `Remove or rename ${blockingPath} and rerun '${SHORT_TOOL_NAME} setup'.`, + ); + } + } + + fs.mkdirSync(parentDir, { recursive: true }); +} + +function ensureExecutable(destinationPath, relativePath, dryRun) { + if (dryRun) return; + if (EXECUTABLE_RELATIVE_PATHS.has(relativePath)) { + fs.chmodSync(destinationPath, 0o755); + } +} + +function isCriticalGuardrailPath(relativePath) { + return CRITICAL_GUARDRAIL_PATHS.has(relativePath); +} + +function shellSingleQuote(value) { + return `'${String(value).replace(/'/g, `'\"'\"'`)}'`; +} + +function renderShellDispatchShim(commandParts) { + const rendered = commandParts.map((part) => shellSingleQuote(part)).join(' '); + return ( + '#!/usr/bin/env bash\n' + + 'set -euo pipefail\n' + + '\n' + + 'if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then\n' + + ' node_bin="${GUARDEX_NODE_BIN:-node}"\n' + + ` exec "$node_bin" "$GUARDEX_CLI_ENTRY" ${rendered} "$@"\n` + + 'fi\n' + + '\n' + + 'resolve_guardex_cli() {\n' + + ' if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then\n' + + ' printf \'%s\' "$GUARDEX_CLI_BIN"\n' + + ' return 0\n' + + ' fi\n' + + ' if 
command -v gx >/dev/null 2>&1; then\n' + + ' printf \'%s\' "gx"\n' + + ' return 0\n' + + ' fi\n' + + ' if command -v gitguardex >/dev/null 2>&1; then\n' + + ' printf \'%s\' "gitguardex"\n' + + ' return 0\n' + + ' fi\n' + + ' echo "[gitguardex-shim] Missing gx CLI in PATH." >&2\n' + + ' exit 1\n' + + '}\n' + + '\n' + + 'cli_bin="$(resolve_guardex_cli)"\n' + + `exec "$cli_bin" ${rendered} "$@"\n` + ); +} + +function renderPythonDispatchShim(commandParts) { + return ( + '#!/usr/bin/env python3\n' + + 'import os\n' + + 'import shutil\n' + + 'import subprocess\n' + + 'import sys\n' + + '\n' + + `COMMAND = ${JSON.stringify(commandParts)}\n` + + '\n' + + 'entry = os.environ.get("GUARDEX_CLI_ENTRY")\n' + + 'if entry:\n' + + ' node_bin = os.environ.get("GUARDEX_NODE_BIN") or shutil.which("node") or "node"\n' + + ' raise SystemExit(subprocess.call([node_bin, entry, *COMMAND, *sys.argv[1:]]))\n' + + 'cli = os.environ.get("GUARDEX_CLI_BIN") or shutil.which("gx") or shutil.which("gitguardex")\n' + + 'if not cli:\n' + + ' sys.stderr.write("[gitguardex-shim] Missing gx CLI in PATH.\\n")\n' + + ' raise SystemExit(1)\n' + + 'raise SystemExit(subprocess.call([cli, *COMMAND, *sys.argv[1:]]))\n' + ); +} + +function managedForceConflictMessage(relativePath) { + return ( + `Refusing to overwrite existing file without --force: ${relativePath}\n` + + `Use '--force ${relativePath}' to rewrite only this managed file, or '--force' to rewrite all managed files.` + ); +} + +function printOperations(title, payload, dryRun = false) { + console.log(`[${TOOL_NAME}] ${title}: ${payload.repoRoot}`); + for (const operation of payload.operations) { + const note = operation.note ? ` (${operation.note})` : ''; + console.log(` - ${operation.status.padEnd(12)} ${operation.file}${note}`); + } + console.log( + ` - hooksPath ${payload.hookResult.status} ${payload.hookResult.key}=${payload.hookResult.value}`, + ); + + if (dryRun) { + console.log(`[${TOOL_NAME}] Dry run complete. 
No files were modified.`); + } +} + +function printStandaloneOperations(title, rootLabel, operations, dryRun = false) { + console.log(`[${TOOL_NAME}] ${title}: ${rootLabel}`); + for (const operation of operations) { + const note = operation.note ? ` (${operation.note})` : ''; + console.log(` - ${operation.status.padEnd(12)} ${operation.file}${note}`); + } + if (dryRun) { + console.log(`[${TOOL_NAME}] Dry run complete. No files were modified.`); + } +} + +module.exports = { + toDestinationPath, + ensureParentDir, + ensureExecutable, + isCriticalGuardrailPath, + shellSingleQuote, + renderShellDispatchShim, + renderPythonDispatchShim, + managedForceConflictMessage, + printOperations, + printStandaloneOperations, +}; diff --git a/src/toolchain/index.js b/src/toolchain/index.js new file mode 100644 index 0000000..ea746ec --- /dev/null +++ b/src/toolchain/index.js @@ -0,0 +1,223 @@ +function createToolchainApi(deps) { + const { + TOOL_NAME, + NPM_BIN, + NPX_BIN, + packageJson, + OPENSPEC_PACKAGE, + OPENSPEC_BIN, + GLOBAL_TOOLCHAIN_PACKAGES, + parseAutoApproval, + isInteractiveTerminal, + promptYesNoStrict, + run, + checkForGuardexUpdate, + printUpdateAvailableBanner, + readInstalledGuardexVersion, + restartIntoUpdatedGuardex, + checkForOpenSpecPackageUpdate, + printOpenSpecUpdateAvailableBanner, + resolveGlobalInstallApproval, + detectGlobalToolchainPackages, + detectOptionalLocalCompanionTools, + formatGlobalToolchainServiceName, + askGlobalInstallForMissing, + } = deps; + + function maybeSelfUpdateBeforeStatus() { + const check = checkForGuardexUpdate(); + if (!check.checked || !check.updateAvailable) { + return; + } + + printUpdateAvailableBanner(check.current, check.latest); + + const autoApproval = parseAutoApproval('GUARDEX_AUTO_UPDATE_APPROVAL'); + const interactive = isInteractiveTerminal(); + + if (!interactive && autoApproval == null) { + console.log(`[${TOOL_NAME}] Non-interactive shell; skipping auto-update prompt.`); + return; + } + + const shouldUpdate = 
interactive + ? promptYesNoStrict( + `Update now? (${NPM_BIN} i -g ${packageJson.name}@latest)`, + ) + : autoApproval; + + if (!shouldUpdate) { + console.log(`[${TOOL_NAME}] Skipped update.`); + return; + } + + const installResult = run(NPM_BIN, ['i', '-g', `${packageJson.name}@latest`], { stdio: 'inherit' }); + if (installResult.status !== 0) { + console.log(`[${TOOL_NAME}] ⚠️ Update failed. You can retry manually.`); + return; + } + + const postInstallVersion = readInstalledGuardexVersion(); + if (postInstallVersion != null && postInstallVersion !== check.latest) { + console.log( + `[${TOOL_NAME}] Installed version is still ${postInstallVersion} (expected ${check.latest}). ` + + `Retrying with pinned version ${check.latest}…`, + ); + const pinnedResult = run( + NPM_BIN, + ['i', '-g', `${packageJson.name}@${check.latest}`], + { stdio: 'inherit' }, + ); + if (pinnedResult.status !== 0) { + console.log( + `[${TOOL_NAME}] ⚠️ Pinned retry failed. Run manually: ${NPM_BIN} i -g ${packageJson.name}@${check.latest}`, + ); + return; + } + const pinnedVersion = readInstalledGuardexVersion(); + if (pinnedVersion != null && pinnedVersion !== check.latest) { + console.log( + `[${TOOL_NAME}] ⚠️ On-disk version still ${pinnedVersion} after pinned retry. 
` + + `Investigate: ${NPM_BIN} root -g && ${NPM_BIN} cache verify`, + ); + return; + } + } + + console.log(`[${TOOL_NAME}] ✅ Updated to latest published version.`); + restartIntoUpdatedGuardex(check.latest); + } + + function maybeOpenSpecUpdateBeforeStatus() { + const check = checkForOpenSpecPackageUpdate(); + if (!check.checked || !check.updateAvailable) { + return; + } + + printOpenSpecUpdateAvailableBanner(check.current, check.latest); + + const autoApproval = parseAutoApproval('GUARDEX_AUTO_OPENSPEC_UPDATE_APPROVAL'); + const interactive = isInteractiveTerminal(); + + if (!interactive && autoApproval == null) { + console.log(`[${TOOL_NAME}] Non-interactive shell; skipping OpenSpec update prompt.`); + return; + } + + const shouldUpdate = interactive + ? promptYesNoStrict( + `Update OpenSpec now? (${NPM_BIN} i -g ${OPENSPEC_PACKAGE}@latest && ${OPENSPEC_BIN} update)`, + ) + : autoApproval; + + if (!shouldUpdate) { + console.log(`[${TOOL_NAME}] Skipped OpenSpec update.`); + return; + } + + const installResult = run(NPM_BIN, ['i', '-g', `${OPENSPEC_PACKAGE}@latest`], { stdio: 'inherit' }); + if (installResult.status !== 0) { + console.log(`[${TOOL_NAME}] ⚠️ OpenSpec npm install failed. You can retry manually.`); + return; + } + + const toolUpdateResult = run(OPENSPEC_BIN, ['update'], { stdio: 'inherit' }); + if (toolUpdateResult.status !== 0) { + console.log(`[${TOOL_NAME}] ⚠️ OpenSpec tool update failed. 
Run '${OPENSPEC_BIN} update' manually.`); + return; + } + + console.log(`[${TOOL_NAME}] ✅ OpenSpec updated to latest package and tool plugins refreshed.`); + } + + function installGlobalToolchain(options) { + const approval = resolveGlobalInstallApproval(options); + if (approval.source === 'flag' && !approval.approved) { + return { + status: 'skipped', + reason: approval.source, + missingPackages: [], + missingLocalTools: [], + }; + } + + if (options.dryRun) { + return { status: 'dry-run-skip' }; + } + + const detection = detectGlobalToolchainPackages(); + const localCompanionTools = detectOptionalLocalCompanionTools(); + if (!detection.ok) { + console.log(`[${TOOL_NAME}] ⚠️ Could not detect global packages: ${detection.error}`); + } else { + if (detection.installed.length > 0) { + console.log( + `[${TOOL_NAME}] Already installed globally: ` + + `${detection.installed.map((pkg) => formatGlobalToolchainServiceName(pkg)).join(', ')}`, + ); + } + const installedLocalTools = localCompanionTools + .filter((tool) => tool.status === 'active') + .map((tool) => tool.name); + if (installedLocalTools.length > 0) { + console.log(`[${TOOL_NAME}] Already installed locally: ${installedLocalTools.join(', ')}`); + } + if (detection.missing.length === 0 && localCompanionTools.every((tool) => tool.status === 'active')) { + return { status: 'already-installed' }; + } + } + + const missingPackages = detection.ok ? 
detection.missing : [...GLOBAL_TOOLCHAIN_PACKAGES]; + const missingLocalTools = localCompanionTools.filter((tool) => tool.status !== 'active'); + const installApproval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools); + if (!installApproval.approved) { + return { + status: 'skipped', + reason: installApproval.source, + missingPackages, + missingLocalTools, + }; + } + + const installed = []; + if (missingPackages.length > 0) { + console.log( + `[${TOOL_NAME}] Installing global toolchain: npm i -g ${missingPackages.join(' ')}`, + ); + const result = run(NPM_BIN, ['i', '-g', ...missingPackages], { stdio: 'inherit' }); + if (result.status !== 0) { + const stderr = (result.stderr || '').trim(); + return { + status: 'failed', + reason: stderr || 'npm global install failed', + }; + } + installed.push(...missingPackages); + } + + for (const tool of missingLocalTools) { + console.log(`[${TOOL_NAME}] Installing local companion tool: ${tool.installCommand}`); + const result = run(NPX_BIN, tool.installArgs, { stdio: 'inherit' }); + if (result.status !== 0) { + const stderr = (result.stderr || '').trim(); + return { + status: 'failed', + reason: stderr || `${tool.name} install failed`, + }; + } + installed.push(tool.name); + } + + return { status: 'installed', packages: installed }; + } + + return { + maybeSelfUpdateBeforeStatus, + maybeOpenSpecUpdateBeforeStatus, + installGlobalToolchain, + }; +} + +module.exports = { + createToolchainApi, +}; diff --git a/test/install.test.js b/test/install.test.js index 1af295c..b74456a 100644 --- a/test/install.test.js +++ b/test/install.test.js @@ -6,6 +6,8 @@ const path = require('node:path'); const cp = require('node:child_process'); const cliPath = path.resolve(__dirname, '..', 'bin', 'multiagent-safety.js'); +const cliSourcePath = path.resolve(__dirname, '..', 'src', 'cli', 'main.js'); +const toolchainSourcePath = path.resolve(__dirname, '..', 'src', 'toolchain', 'index.js'); const cliVersion = JSON.parse( 
   fs.readFileSync(path.resolve(__dirname, '..', 'package.json'), 'utf8'),
 ).version;
@@ -445,7 +447,7 @@ const spawnUnavailableReason = spawnProbe.error
 if (!canSpawnChildProcesses) {
   test('self-update prompt requires explicit y/n when approval is not preconfigured', () => {
-    const source = fs.readFileSync(cliPath, 'utf8');
+    const source = fs.readFileSync(toolchainSourcePath, 'utf8');
     assert.match(
       source,
       /const shouldUpdate = interactive\s*\?\s*promptYesNoStrict\(\s*`Update now\?\s*\(\$\{NPM_BIN\} i -g \$\{packageJson\.name\}@latest\)`\s*,?\s*\)\s*:\s*autoApproval;/s,
@@ -2758,7 +2760,7 @@ exit 1
 });
 
 test('self-update prompt requires explicit y/n when approval is not preconfigured', () => {
-  const source = fs.readFileSync(cliPath, 'utf8');
+  const source = fs.readFileSync(toolchainSourcePath, 'utf8');
   assert.match(
     source,
     /const shouldUpdate = interactive\s*\?\s*promptYesNoStrict\(\s*`Update now\?\s*\(\$\{NPM_BIN\} i -g \$\{packageJson\.name\}@latest\)`\s*,?\s*\)\s*:\s*autoApproval;/s,
@@ -2812,7 +2814,7 @@ exit 1
 });
 
 test('openspec update prompt requires explicit y/n when approval is not preconfigured', () => {
-  const source = fs.readFileSync(cliPath, 'utf8');
+  const source = fs.readFileSync(toolchainSourcePath, 'utf8');
   assert.match(
     source,
     /const shouldUpdate = interactive\s*\?\s*promptYesNoStrict\(\s*`Update OpenSpec now\?\s*\(\$\{NPM_BIN\} i -g \$\{OPENSPEC_PACKAGE\}@latest && \$\{OPENSPEC_BIN\} update\)`\s*,?\s*\)\s*:\s*autoApproval;/s,
@@ -4703,6 +4705,24 @@ test('prompt --exec outputs command-only checklist', () => {
   assert.doesNotMatch(result.stdout, /GitGuardex \(gx\) setup checklist/);
 });
 
+test('thin entrypoint still routes hook install through the extracted hooks module', () => {
+  const repoDir = initRepo();
+  const result = runNode(['hook', 'install', '--target', repoDir], repoDir);
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+  assert.match(result.stdout, /\[gitguardex\] Hook install target:/);
+
+  const hooksPath = runCmd('git', ['config', '--get', 'core.hooksPath'], repoDir);
+  assert.equal(hooksPath.status, 0, hooksPath.stderr || hooksPath.stdout);
+  assert.equal(hooksPath.stdout.trim(), '.githooks');
+});
+
+test('thin entrypoint routes finish --all --dry-run through the extracted finish module', () => {
+  const repoDir = initRepo();
+  const result = runNode(['finish', '--all', '--dry-run', '--target', repoDir], repoDir);
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+  assert.match(result.stdout, /\[gitguardex\] No pending agent branches to finish\./);
+});
+
 test('deprecated copy-prompt alias still works and warns', () => {
   const repoDir = initRepo();
   const result = runNode(['copy-prompt'], repoDir);
diff --git a/test/metadata.test.js b/test/metadata.test.js
index 85f9117..e4dafd3 100644
--- a/test/metadata.test.js
+++ b/test/metadata.test.js
@@ -141,23 +141,62 @@ test('critical runtime helper scripts stay in sync with templates', () => {
   }
 });
 
-test('doctor CLI parser exists to prevent runtime ReferenceError regressions', () => {
-  const cliPath = path.join(repoRoot, 'bin', 'multiagent-safety.js');
-  const cliSource = fs.readFileSync(cliPath, 'utf8');
-  assert.match(cliSource, /function parseDoctorArgs\(rawArgs\)/);
+test('thin CLI entrypoint delegates to src/cli runtime', () => {
+  const entryPath = path.join(repoRoot, 'bin', 'multiagent-safety.js');
+  const entrySource = fs.readFileSync(entryPath, 'utf8');
+  assert.match(entrySource, /require\('\.\.\/src\/cli\/main'\)/);
+  assert.match(entrySource, /runFromBin\(\)/);
+});
+
+test('package manifest ships the extracted src runtime', () => {
+  const pkg = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8'));
+  assert.ok(Array.isArray(pkg.files), 'package.json files must stay explicit');
+  assert.match(pkg.files.join('\n'), /^src$/m);
+});
+
+test('doctor CLI parser exists in src/cli args and main runtime to prevent ReferenceError regressions', () => {
+  const argsSource = fs.readFileSync(path.join(repoRoot, 'src', 'cli', 'args.js'), 'utf8');
+  const cliSource = fs.readFileSync(path.join(repoRoot, 'src', 'cli', 'main.js'), 'utf8');
+  assert.match(argsSource, /function parseDoctorArgs\(rawArgs(?:, options = \{\})?\)/);
   assert.match(cliSource, /function doctorAudit\(rawArgs\)/);
 });
 
-test('active doctor command remains single-source and runs the repair-first path', () => {
-  const cliPath = path.join(repoRoot, 'bin', 'multiagent-safety.js');
-  const cliSource = fs.readFileSync(cliPath, 'utf8');
+test('cli main delegates extracted seams and keeps doctor single-source', () => {
+  const cliSource = fs.readFileSync(path.join(repoRoot, 'src', 'cli', 'main.js'), 'utf8');
   const doctorDefs = cliSource.match(/function doctor\(rawArgs\)/g) || [];
   assert.equal(doctorDefs.length, 1, 'doctor() must not be duplicated');
+  assert.match(cliSource, /function assertProtectedMainWriteAllowed\(options, commandName\)\s*{\s*return getSandboxApi\(\)\.assertProtectedMainWriteAllowed\(options, commandName\);\s*}/s);
+  assert.match(cliSource, /function maybeSelfUpdateBeforeStatus\(\)\s*{\s*return getToolchainApi\(\)\.maybeSelfUpdateBeforeStatus\(\);\s*}/s);
+  assert.match(cliSource, /function configureHooks\(repoRoot, dryRun\)\s*{\s*return hooksModule\.configureHooks\(repoRoot, dryRun\);\s*}/s);
+  assert.match(cliSource, /function printOperations\(title, payload, dryRun = false\)\s*{\s*return scaffoldModule\.printOperations\(title, payload, dryRun\);\s*}/s);
+  assert.match(cliSource, /function hook\(rawArgs\)\s*{\s*return hooksModule\.hook\(rawArgs\);\s*}/s);
+  assert.match(cliSource, /function finish\(rawArgs, defaults = \{\}\)\s*{\s*return getFinishApi\(\)\.finish\(rawArgs, defaults\);\s*}/s);
   assert.match(cliSource, /printOperations\('Doctor\/fix', fixPayload, (?:singleRepoOptions|options)\.dryRun\);/);
 });
 
+test('extracted scaffold and git helpers expose the runtime contract expected by cli main', () => {
+  const scaffold = require(path.join(repoRoot, 'src', 'scaffold'));
+  const git = require(path.join(repoRoot, 'src', 'git'));
+
+  for (const key of [
+    'toDestinationPath',
+    'ensureParentDir',
+    'ensureExecutable',
+    'isCriticalGuardrailPath',
+    'shellSingleQuote',
+    'renderShellDispatchShim',
+    'renderPythonDispatchShim',
+    'managedForceConflictMessage',
+    'printOperations',
+    'printStandaloneOperations',
+  ]) {
+    assert.equal(typeof scaffold[key], 'function', `src/scaffold must export ${key}`);
+  }
+
+  assert.equal(git.DEFAULT_NESTED_REPO_MAX_DEPTH, 6);
+});
+
 test('worktree-change detection uses normal untracked-file mode', () => {
-  const cliPath = path.join(repoRoot, 'bin', 'multiagent-safety.js');
-  const cliSource = fs.readFileSync(cliPath, 'utf8');
+  const cliSource = fs.readFileSync(path.join(repoRoot, 'src', 'cli', 'main.js'), 'utf8');
   assert.match(cliSource, /'status',\s*'--porcelain',\s*'--untracked-files=normal',\s*'--'/s);
 });

From d5b3ed5d55b1b94b4e324c254c62c8a828c846c8 Mon Sep 17 00:00:00 2001
From: NagyVikt
Date: Wed, 22 Apr 2026 12:38:05 +0200
Subject: [PATCH 48/48] Keep the thin Guardex entrypoint executable

Rewriting bin/multiagent-safety.js into a tiny bootstrap reset its file
mode to 0644, which would break direct repo usage even though the
modular runtime itself was correct. This restores the executable bit and
adds a metadata regression test so future entrypoint rewrites keep the
bin file runnable.
Constraint: The published and repo-local CLI entrypoint must remain directly executable after the modularization rewrite
Rejected: Rely on npm install behavior alone | does not protect direct repo execution or future pack regressions
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: When rewriting files under bin/, preserve the existing executable mode and add an assertion if the rewrite path can reset it
Tested: node --test --test-name-pattern "thin CLI entrypoint delegates to src/cli runtime" test/metadata.test.js; ./bin/multiagent-safety.js --version
Not-tested: Full metadata and install suites re-run after the mode-only fix
---
 bin/multiagent-safety.js | 0
 test/metadata.test.js    | 1 +
 2 files changed, 1 insertion(+)
 mode change 100644 => 100755 bin/multiagent-safety.js

diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js
old mode 100644
new mode 100755
diff --git a/test/metadata.test.js b/test/metadata.test.js
index e4dafd3..891bd2a 100644
--- a/test/metadata.test.js
+++ b/test/metadata.test.js
@@ -146,6 +146,7 @@ test('thin CLI entrypoint delegates to src/cli runtime', () => {
   const entrySource = fs.readFileSync(entryPath, 'utf8');
   assert.match(entrySource, /require\('\.\.\/src\/cli\/main'\)/);
   assert.match(entrySource, /runFromBin\(\)/);
+  assert.ok((fs.statSync(entryPath).mode & 0o111) !== 0, 'bin/multiagent-safety.js must stay executable');
 });
 
 test('package manifest ships the extracted src runtime', () => {