diff --git a/README.md b/README.md
index 6361524..c644f90 100644
--- a/README.md
+++ b/README.md
@@ -49,7 +49,7 @@ npm i -g @imdeadpool/guardex

-Then cd into your repo and run gx setup — hooks, scripts, templates, and OMX / OpenSpec / caveman wiring all scaffold in one go.
+Then cd into your repo and run gx setup — hook shims, workflow shims, repo state, and OMX / OpenSpec / caveman wiring all scaffold in one go.

@@ -88,7 +88,7 @@
 cd /path/to/your/repo
 gx setup
 ```
-That's it. Install and update via `@imdeadpool/guardex`. Setup installs hooks, scripts, templates, and scaffolds OpenSpec/caveman/OMX wiring. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility).
+That's it. Install and update via `@imdeadpool/guardex`. Setup installs the minimal repo footprint: managed hook/workflow shims, lock state, AGENTS wiring, and OpenSpec/caveman/OMX scaffolding. Aliases: `gx` (preferred), `gitguardex` (full), `guardex` (legacy compatibility).
 
 ---
 
@@ -119,7 +119,7 @@ That's it. Install and update via `@imdeadpool/guardex`. Setup installs hooks, s
 
 > [!NOTE]
-> In this repo, `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. Claude-specific command guidance is installed separately at `.claude/commands/gitguardex.md`.
+> In this repo, `CLAUDE.md` is a symlink to `AGENTS.md`, so Claude reads the same contract. Optional Codex/Claude companion files are installed at the user level with `gx install-agent-skills`, not copied into each repo.
 
 ### Decision flow
 
@@ -178,7 +178,7 @@ Before you branch, repair, or start agents, run plain `gx`. It gives you a one-s
 
 ![GitGuardex terminal status output](https://raw.githubusercontent.com/recodeee/gitguardex/main/docs/images/workflow-gx-terminal-status.svg)
 
-Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hooks, scripts, templates, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean.
+Use `gx setup` the first time you wire GitGuardex into a repo. It bootstraps the managed hook/workflow shims, repo state, and optional workspace/OpenSpec wiring. If the repo drifts later, use `gx doctor` as the repair path: it reapplies the managed safety files, verifies the setup, and on protected `main` it auto-sandboxes the repair so your visible base branch stays clean.
 
 ---
 
@@ -188,22 +188,22 @@ Per new agent task:
 
 ```sh
 # 1) Start isolated branch/worktree
-bash scripts/agent-branch-start.sh "task-name" "agent-name"
+gx branch start "task-name" "agent-name"
 
 # 2) Claim the files you're going to touch
-python3 scripts/agent-file-locks.py claim \
+gx locks claim \
   --branch "$(git rev-parse --abbrev-ref HEAD)"
 
 # 3) Implement + verify
 npm test
 
 # 4) Finish (commit + push + PR + merge + cleanup)
-bash scripts/agent-branch-finish.sh \
+gx branch finish \
   --branch "$(git rev-parse --abbrev-ref HEAD)" \
   --base main --via-pr --wait-for-merge --cleanup
 ```
 
-If you use `scripts/codex-agent.sh`, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR.
+If you use the managed Codex launcher shim, the finish flow runs automatically when the Codex session exits — it auto-commits, retries once after syncing if the base moved during the run, then pushes and opens the PR.
 
 GitGuardex normally prunes merged sandboxes for you as part of the finish flow. If you simply do not want a local sandbox/worktree anymore, remove that worktree directly; delete the branch too only if you are intentionally abandoning that lane:
 
@@ -383,7 +383,7 @@ A few things worth knowing up front:
 
 - Direct commits/pushes to protected branches are **blocked** by default. Agents must use the `agent/*` + PR flow.
 - **Exception:** VS Code Source Control commits are allowed on protected branches that exist only locally (no upstream, no remote branch).
 - On protected `main`, `gx doctor` auto-runs in a sandbox agent branch/worktree so it can't touch your real main.
-- In-place agent branching is disabled. `scripts/agent-branch-start.sh` always creates a separate worktree so your visible local/base branch never changes.
+- In-place agent branching is disabled. `gx branch start` always creates a separate worktree so your visible local/base branch never changes.
 - Fresh sandbox branches start with no git upstream. Guardex records the protected base in `branch..guardexBase`, and the first `git push -u` publishes the real upstream.
 - Interactive self-update prompt defaults to **No** (`[y/N]`).
 
@@ -542,8 +542,8 @@ Expanded flow:
 
 ### OpenSpec in agent sub-branches
 
-- `scripts/codex-agent.sh` enforces OpenSpec workspaces before launching Codex.
-- `scripts/agent-branch-start.sh` can scaffold both `openspec/changes//` and `openspec/plan//` when `GUARDEX_OPENSPEC_AUTO_INIT=true`.
+- The managed Codex launcher shim enforces OpenSpec workspaces before launching Codex.
+- `gx branch start` can scaffold both `openspec/changes//` and `openspec/plan//` when `GUARDEX_OPENSPEC_AUTO_INIT=true`.
 - The collaboration section in `tasks.md` is there for real cleanup handoffs too. If the first Codex/Claude session finishes the implementation work but hits a usage limit before `agent-branch-finish --cleanup`, hand the same sandbox to another agent, let that agent finish cleanup, and record the join/handoff in the change task.
 
 Environment variables:
 
@@ -560,25 +560,26 @@ Environment variables:
 
 ## Files installed by setup
 
 ```text
-scripts/agent-branch-start.sh
-scripts/agent-branch-finish.sh
-scripts/codex-agent.sh
-scripts/review-bot-watch.sh
-scripts/agent-worktree-prune.sh
-scripts/agent-file-locks.py
-scripts/install-agent-git-hooks.sh
-scripts/openspec/init-plan-workspace.sh
-.githooks/pre-commit
-.githooks/pre-push
-.codex/skills/gitguardex/SKILL.md
-.claude/commands/gitguardex.md
-.github/pull.yml.example
-.github/workflows/cr.yml
+.githooks/pre-commit                      # shim -> gx hook run pre-commit
+.githooks/pre-push                        # shim -> gx hook run pre-push
+.githooks/post-merge                      # shim -> gx hook run post-merge
+.githooks/post-checkout                   # shim -> gx hook run post-checkout
+scripts/agent-branch-start.sh             # shim -> gx branch start
+scripts/agent-branch-finish.sh            # shim -> gx branch finish
+scripts/agent-branch-merge.sh             # shim -> gx branch merge
+scripts/agent-worktree-prune.sh           # shim -> gx worktree prune
+scripts/agent-file-locks.py               # shim -> gx locks ...
+scripts/codex-agent.sh                    # shim -> CLI-owned Codex launcher
+scripts/review-bot-watch.sh               # shim -> CLI-owned review bot
+scripts/openspec/init-plan-workspace.sh   # shim -> CLI-owned OpenSpec init
+scripts/openspec/init-change-workspace.sh # shim -> CLI-owned OpenSpec init
 .omc/agent-worktrees
 .omx/state/agent-file-locks.json
+.github/pull.yml.example
+.github/workflows/cr.yml
 ```
 
-If `package.json` exists, setup also adds `agent:*` helper scripts.
+Repo-local Codex/Claude helper files are no longer copied into the repo. Install the optional user-level companions with `gx install-agent-skills`.
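The `# shim -> …` annotations in the file list above all follow one pattern: the installed file is a thin wrapper that forwards its arguments to the `gx` CLI. As a minimal sketch of that idea (illustrative names only, not the shipped implementation), a shim generator needs little more than POSIX single-quote escaping plus an `exec` line:

```javascript
// Minimal sketch of a dispatch-shim generator (hypothetical helper names).
// Each generated file just forwards to the gx CLI with fixed command parts.
function posixSingleQuote(value) {
  // 'a'"'"'b' is the standard trick for embedding a single quote
  // inside a single-quoted POSIX shell word.
  return `'${String(value).replace(/'/g, `'"'"'`)}'`;
}

function renderShim(commandParts) {
  const rendered = commandParts.map(posixSingleQuote).join(' ');
  return `#!/usr/bin/env bash\nset -euo pipefail\nexec gx ${rendered} "$@"\n`;
}

// Example: the body a pre-commit hook shim could carry.
console.log(renderShim(['hook', 'run', 'pre-commit']));
```

Because the shim carries no logic of its own, upgrading the globally installed CLI upgrades every repo's behavior without rewriting repo-local files.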
 
 ---
 
diff --git a/bin/multiagent-safety.js b/bin/multiagent-safety.js
index 8110c39..f30a943 100755
--- a/bin/multiagent-safety.js
+++ b/bin/multiagent-safety.js
@@ -5,11 +5,18 @@
 const os = require('node:os');
 const path = require('node:path');
 const cp = require('node:child_process');
 
-const packageJsonPath = path.resolve(__dirname, '..', 'package.json');
+const PACKAGE_ROOT = path.resolve(__dirname, '..');
+const packageJsonPath = path.join(PACKAGE_ROOT, 'package.json');
 const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8'));
 
 const TOOL_NAME = 'gitguardex';
 const SHORT_TOOL_NAME = 'gx';
+
+if (!process.env.GUARDEX_CLI_ENTRY) {
+  process.env.GUARDEX_CLI_ENTRY = __filename;
+}
+if (!process.env.GUARDEX_NODE_BIN) {
+  process.env.GUARDEX_NODE_BIN = process.execPath;
+}
 
 const LEGACY_NAMES = ['guardex', 'multiagent-safety'];
 const GLOBAL_INSTALL_COMMAND = `npm i -g ${packageJson.name}`;
 const OPENSPEC_PACKAGE = '@fission-ai/openspec';
@@ -85,30 +92,15 @@ const COMPOSE_HINT_FILES = [
   'compose.yaml',
 ];
 
-const TEMPLATE_ROOT = path.resolve(__dirname, '..', 'templates');
+const TEMPLATE_ROOT = path.join(PACKAGE_ROOT, 'templates');
+
+const HOOK_NAMES = ['pre-commit', 'pre-push', 'post-merge', 'post-checkout'];
 
 const TEMPLATE_FILES = [
-  'scripts/agent-branch-start.sh',
-  'scripts/agent-branch-finish.sh',
-  'scripts/agent-branch-merge.sh',
   'scripts/agent-session-state.js',
-  'scripts/codex-agent.sh',
   'scripts/guardex-docker-loader.sh',
-  'scripts/install-vscode-active-agents-extension.js',
-  'scripts/review-bot-watch.sh',
-  'scripts/agent-worktree-prune.sh',
-  'scripts/agent-file-locks.py',
   'scripts/guardex-env.sh',
-  'scripts/install-agent-git-hooks.sh',
-  'scripts/openspec/init-plan-workspace.sh',
-  'scripts/openspec/init-change-workspace.sh',
-  'githooks/pre-commit',
-  'githooks/pre-push',
-  'githooks/post-merge',
-  'githooks/post-checkout',
-  'codex/skills/gitguardex/SKILL.md',
-  'codex/skills/guardex-merge-skills-to-dev/SKILL.md',
-  'claude/commands/gitguardex.md',
+  'scripts/install-vscode-active-agents-extension.js',
   'github/pull.yml.example',
   'github/workflows/cr.yml',
   'vscode/guardex-active-agents/package.json',
@@ -117,22 +109,50 @@ const TEMPLATE_FILES = [
   'vscode/guardex-active-agents/README.md',
 ];
 
-const REQUIRED_WORKFLOW_FILES = [
+const SCRIPT_SHIMS = [
+  { relativePath: 'scripts/agent-branch-start.sh', kind: 'shell', command: ['branch', 'start'] },
+  { relativePath: 'scripts/agent-branch-finish.sh', kind: 'shell', command: ['branch', 'finish'] },
+  { relativePath: 'scripts/agent-branch-merge.sh', kind: 'shell', command: ['branch', 'merge'] },
+  { relativePath: 'scripts/codex-agent.sh', kind: 'shell', command: ['internal', 'run-shell', 'codexAgent'] },
+  { relativePath: 'scripts/review-bot-watch.sh', kind: 'shell', command: ['internal', 'run-shell', 'reviewBot'] },
+  { relativePath: 'scripts/agent-worktree-prune.sh', kind: 'shell', command: ['worktree', 'prune'] },
+  { relativePath: 'scripts/agent-file-locks.py', kind: 'python', command: ['locks'] },
+  { relativePath: 'scripts/openspec/init-plan-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'planInit'] },
+  { relativePath: 'scripts/openspec/init-change-workspace.sh', kind: 'shell', command: ['internal', 'run-shell', 'changeInit'] },
+];
+
+const LEGACY_MANAGED_REPO_FILES = [
   'scripts/agent-branch-start.sh',
   'scripts/agent-branch-finish.sh',
   'scripts/agent-branch-merge.sh',
   'scripts/agent-session-state.js',
+  'scripts/codex-agent.sh',
   'scripts/guardex-docker-loader.sh',
+  'scripts/install-vscode-active-agents-extension.js',
+  'scripts/review-bot-watch.sh',
   'scripts/agent-worktree-prune.sh',
   'scripts/agent-file-locks.py',
   'scripts/guardex-env.sh',
   'scripts/install-agent-git-hooks.sh',
+  'scripts/openspec/init-plan-workspace.sh',
+  'scripts/openspec/init-change-workspace.sh',
   '.githooks/pre-commit',
+  '.githooks/pre-push',
   '.githooks/post-merge',
+  '.githooks/post-checkout',
+  '.codex/skills/gitguardex/SKILL.md',
+  '.codex/skills/guardex-merge-skills-to-dev/SKILL.md',
+  '.claude/commands/gitguardex.md',
+];
+
+const REQUIRED_WORKFLOW_FILES = [
+  ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)),
+  ...SCRIPT_SHIMS.map((entry) => entry.relativePath),
+  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
   '.omx/state/agent-file-locks.json',
 ];
 
-const REQUIRED_PACKAGE_SCRIPTS = {
+const LEGACY_MANAGED_PACKAGE_SCRIPTS = {
   'agent:codex': 'bash ./scripts/codex-agent.sh',
   'agent:branch:start': 'bash ./scripts/agent-branch-start.sh',
   'agent:branch:finish': 'bash ./scripts/agent-branch-finish.sh',
@@ -157,32 +177,44 @@ const REQUIRED_PACKAGE_SCRIPTS = {
   'agent:finish': 'gx finish --all',
 };
 
+const PACKAGE_SCRIPT_ASSETS = {
+  branchStart: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-start.sh'),
+  branchFinish: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-finish.sh'),
+  branchMerge: path.join(TEMPLATE_ROOT, 'scripts', 'agent-branch-merge.sh'),
+  codexAgent: path.join(TEMPLATE_ROOT, 'scripts', 'codex-agent.sh'),
+  reviewBot: path.join(TEMPLATE_ROOT, 'scripts', 'review-bot-watch.sh'),
+  worktreePrune: path.join(TEMPLATE_ROOT, 'scripts', 'agent-worktree-prune.sh'),
+  lockTool: path.join(TEMPLATE_ROOT, 'scripts', 'agent-file-locks.py'),
+  planInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-plan-workspace.sh'),
+  changeInit: path.join(TEMPLATE_ROOT, 'scripts', 'openspec', 'init-change-workspace.sh'),
+};
+
+const USER_LEVEL_SKILL_ASSETS = [
+  {
+    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'gitguardex', 'SKILL.md'),
+    destination: path.join('.codex', 'skills', 'gitguardex', 'SKILL.md'),
+  },
+  {
+    source: path.join(TEMPLATE_ROOT, 'codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
+    destination: path.join('.codex', 'skills', 'guardex-merge-skills-to-dev', 'SKILL.md'),
+  },
+  {
+    source: path.join(TEMPLATE_ROOT, 'claude', 'commands', 'gitguardex.md'),
+    destination: path.join('.claude', 'commands', 'gitguardex.md'),
+  },
+];
+
 const EXECUTABLE_RELATIVE_PATHS = new Set([
-  'scripts/agent-branch-start.sh',
-  'scripts/agent-branch-finish.sh',
-  'scripts/agent-branch-merge.sh',
   'scripts/agent-session-state.js',
-  'scripts/codex-agent.sh',
   'scripts/guardex-docker-loader.sh',
   'scripts/install-vscode-active-agents-extension.js',
-  'scripts/review-bot-watch.sh',
-  'scripts/agent-worktree-prune.sh',
-  'scripts/agent-file-locks.py',
-  'scripts/install-agent-git-hooks.sh',
-  'scripts/openspec/init-plan-workspace.sh',
-  'scripts/openspec/init-change-workspace.sh',
-  '.githooks/pre-commit',
-  '.githooks/pre-push',
-  '.githooks/post-merge',
-  '.githooks/post-checkout',
+  ...SCRIPT_SHIMS.map((entry) => entry.relativePath),
+  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
 ]);
 
 const CRITICAL_GUARDRAIL_PATHS = new Set([
   'AGENTS.md',
-  '.githooks/pre-commit',
-  '.githooks/pre-push',
-  '.githooks/post-merge',
-  '.githooks/post-checkout',
+  ...HOOK_NAMES.map((entry) => path.posix.join('.githooks', entry)),
   'scripts/agent-branch-start.sh',
   'scripts/agent-branch-finish.sh',
   'scripts/agent-branch-merge.sh',
@@ -212,9 +244,6 @@ const MANAGED_GITIGNORE_PATHS = [
   'scripts/agent-file-locks.py',
   '.githooks',
   'oh-my-codex/',
-  '.codex/skills/gitguardex/SKILL.md',
-  '.codex/skills/guardex-merge-skills-to-dev/SKILL.md',
-  '.claude/commands/gitguardex.md',
   LOCK_FILE_RELATIVE,
 ];
 
 const REPO_SCAFFOLD_DIRECTORIES = ['bin'];
@@ -247,6 +276,12 @@ const SUGGESTIBLE_COMMANDS = [
   'status',
   'setup',
   'doctor',
+  'branch',
+  'locks',
+  'worktree',
+  'hook',
+  'migrate',
+  'install-agent-skills',
   'agents',
   'merge',
   'finish',
@@ -272,6 +307,12 @@ const CLI_COMMAND_DESCRIPTIONS = [
   ['status', 'Show GitGuardex CLI + service health without modifying files'],
   ['setup', 'Install, repair, and verify guardrails (flags: --repair, --install-only, --target)'],
   ['doctor', 'Repair drift + verify (auto-sandboxes on protected main)'],
+  ['branch', 'CLI-owned branch workflow surface (start/finish/merge)'],
+  ['locks', 'CLI-owned file lock surface (claim/allow-delete/release/status/validate)'],
+  ['worktree', 'CLI-owned worktree cleanup surface (prune)'],
+  ['hook', 'Hook dispatch/install surface used by managed shims'],
+  ['migrate', 'Convert legacy repo-local installs to the new shim-based CLI-owned surface'],
+  ['install-agent-skills', 'Install Guardex Codex/Claude skills into the user home'],
   ['protect', 'Manage protected branches (list/add/remove/set/reset)'],
   ['merge', 'Create/reuse an integration lane and merge overlapping agent branches'],
   ['sync', 'Sync agent branches with origin/'],
@@ -319,8 +360,8 @@ const AI_SETUP_PROMPT = `GitGuardex (gx) setup checklist for Codex/Claude in thi
 1) Install: ${GLOBAL_INSTALL_COMMAND} && gh --version
 2) Bootstrap: gx setup
 3) Repair: gx doctor
-4) Task loop: bash scripts/codex-agent.sh "" ""
-   or branch-start -> python3 scripts/agent-file-locks.py claim -> branch-finish
+4) Task loop: gx branch start "" ""
+   then gx locks claim --branch "" -> gx branch finish
 5) Integrate: gx merge --branch --branch
 6) Finish: gx finish --all
 7) Cleanup: gx cleanup
@@ -335,8 +376,8 @@ const AI_SETUP_COMMANDS = `${GLOBAL_INSTALL_COMMAND}
 gh --version
 gx setup
 gx doctor
-bash scripts/codex-agent.sh "" ""
-python3 scripts/agent-file-locks.py claim --branch ""
+gx branch start "" ""
+gx locks claim --branch ""
 gx merge --branch "" --branch ""
 gx finish --all
 gx cleanup
@@ -582,10 +623,74 @@ function run(cmd, args, options = {}) {
     encoding: 'utf8',
     stdio: options.stdio || 'pipe',
     cwd: options.cwd,
+    env: options.env ? { ...process.env, ...options.env } : process.env,
     timeout: options.timeout,
   });
 }
 
+function extractTargetedArgs(rawArgs, defaultTarget = process.cwd()) {
+  const passthrough = [];
+  let target = defaultTarget;
+
+  for (let index = 0; index < rawArgs.length; index += 1) {
+    const arg = rawArgs[index];
+    if (arg === '--target' || arg === '-t') {
+      target = requireValue(rawArgs, index, '--target');
+      index += 1;
+      continue;
+    }
+    passthrough.push(arg);
+  }
+
+  return { target, passthrough };
+}
+
+function packageAssetEnv(extraEnv = {}) {
+  return {
+    GUARDEX_CLI_ENTRY: __filename,
+    GUARDEX_NODE_BIN: process.execPath,
+    ...extraEnv,
+  };
+}
+
+function packageAssetPath(assetKey) {
+  const assetPath = PACKAGE_SCRIPT_ASSETS[assetKey];
+  if (!assetPath) {
+    throw new Error(`Unknown package asset: ${assetKey}`);
+  }
+  if (!fs.existsSync(assetPath)) {
+    throw new Error(`Missing package asset: ${assetPath}`);
+  }
+  return assetPath;
+}
+
+function runPackageAsset(assetKey, rawArgs, options = {}) {
+  const assetPath = packageAssetPath(assetKey);
+  let cmd = 'bash';
+  if (assetPath.endsWith('.py')) {
+    cmd = 'python3';
+  } else if (assetPath.endsWith('.js')) {
+    cmd = process.execPath;
+  }
+  return run(cmd, [assetPath, ...rawArgs], {
+    cwd: options.cwd || process.cwd(),
+    stdio: options.stdio || 'pipe',
+    timeout: options.timeout,
+    env: packageAssetEnv(options.env),
+  });
+}
+
+function invokePackageAsset(assetKey, rawArgs, options = {}) {
+  const result = runPackageAsset(assetKey, rawArgs, options);
+  if (result.stdout) process.stdout.write(result.stdout);
+  if (result.stderr) process.stderr.write(result.stderr);
+  if (result.status !== 0) {
+    throw new Error(`${assetKey} command failed with status ${result.status}`);
+  }
+  process.exitCode = 0;
+  return result;
+}
+
 function formatElapsedDuration(ms) {
   const durationMs = Number.isFinite(ms) ? Math.max(0, ms) : 0;
   if (durationMs < 1000) {
@@ -864,6 +969,111 @@ function isCriticalGuardrailPath(relativePath) {
   return CRITICAL_GUARDRAIL_PATHS.has(relativePath);
 }
 
+function shellSingleQuote(value) {
+  return `'${String(value).replace(/'/g, `'\"'\"'`)}'`;
+}
+
+function renderShellDispatchShim(commandParts) {
+  const rendered = commandParts.map((part) => shellSingleQuote(part)).join(' ');
+  return (
+    '#!/usr/bin/env bash\n' +
+    'set -euo pipefail\n' +
+    '\n' +
+    'if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then\n' +
+    '  node_bin="${GUARDEX_NODE_BIN:-node}"\n' +
+    `  exec "$node_bin" "$GUARDEX_CLI_ENTRY" ${rendered} "$@"\n` +
+    'fi\n' +
+    '\n' +
+    'resolve_guardex_cli() {\n' +
+    '  if [[ -n "${GUARDEX_CLI_BIN:-}" ]]; then\n' +
+    '    printf \'%s\' "$GUARDEX_CLI_BIN"\n' +
+    '    return 0\n' +
+    '  fi\n' +
+    '  if command -v gx >/dev/null 2>&1; then\n' +
+    '    printf \'%s\' "gx"\n' +
+    '    return 0\n' +
+    '  fi\n' +
+    '  if command -v gitguardex >/dev/null 2>&1; then\n' +
+    '    printf \'%s\' "gitguardex"\n' +
+    '    return 0\n' +
+    '  fi\n' +
+    '  echo "[gitguardex-shim] Missing gx CLI in PATH." >&2\n' +
+    '  exit 1\n' +
+    '}\n' +
+    '\n' +
+    'cli_bin="$(resolve_guardex_cli)"\n' +
+    `exec "$cli_bin" ${rendered} "$@"\n`
+  );
+}
+
+function renderPythonDispatchShim(commandParts) {
+  return (
+    '#!/usr/bin/env python3\n' +
+    'import os\n' +
+    'import shutil\n' +
+    'import subprocess\n' +
+    'import sys\n' +
+    '\n' +
+    `COMMAND = ${JSON.stringify(commandParts)}\n` +
+    '\n' +
+    'entry = os.environ.get("GUARDEX_CLI_ENTRY")\n' +
+    'if entry:\n' +
+    '    node_bin = os.environ.get("GUARDEX_NODE_BIN") or shutil.which("node") or "node"\n' +
+    '    raise SystemExit(subprocess.call([node_bin, entry, *COMMAND, *sys.argv[1:]]))\n' +
+    'cli = os.environ.get("GUARDEX_CLI_BIN") or shutil.which("gx") or shutil.which("gitguardex")\n' +
+    'if not cli:\n' +
+    '    sys.stderr.write("[gitguardex-shim] Missing gx CLI in PATH.\\n")\n' +
+    '    raise SystemExit(1)\n' +
+    'raise SystemExit(subprocess.call([cli, *COMMAND, *sys.argv[1:]]))\n'
+  );
+}
+
+function renderManagedFile(repoRoot, relativePath, content, options = {}) {
+  const destinationPath = path.join(repoRoot, relativePath);
+  const destinationExists = fs.existsSync(destinationPath);
+  const force = Boolean(options.force);
+  const dryRun = Boolean(options.dryRun);
+
+  if (destinationExists) {
+    const existingContent = fs.readFileSync(destinationPath, 'utf8');
+    if (existingContent === content) {
+      ensureExecutable(destinationPath, relativePath, dryRun);
+      return { status: 'unchanged', file: relativePath };
+    }
+    if (!force && !isCriticalGuardrailPath(relativePath)) {
+      throw new Error(`Refusing to overwrite existing file without --force: ${relativePath}`);
+    }
+  }
+
+  ensureParentDir(repoRoot, destinationPath, dryRun);
+  if (!dryRun) {
+    fs.writeFileSync(destinationPath, content, 'utf8');
+    ensureExecutable(destinationPath, relativePath, dryRun);
+  }
+
+  if (destinationExists && !force && isCriticalGuardrailPath(relativePath)) {
+    return { status: dryRun ? 'would-repair-critical' : 'repaired-critical', file: relativePath };
+  }
+
+  return { status: destinationExists ? 'overwritten' : 'created', file: relativePath };
+}
+
+function ensureGeneratedScriptShim(repoRoot, spec, options = {}) {
+  const content = spec.kind === 'python'
+    ? renderPythonDispatchShim(spec.command)
+    : renderShellDispatchShim(spec.command);
+  return renderManagedFile(repoRoot, spec.relativePath, content, options);
+}
+
+function ensureHookShim(repoRoot, hookName, options = {}) {
+  return renderManagedFile(
+    repoRoot,
+    path.posix.join('.githooks', hookName),
+    renderShellDispatchShim(['hook', 'run', hookName]),
+    options,
+  );
+}
+
 function copyTemplateFile(repoRoot, relativeTemplatePath, force, dryRun) {
   const sourcePath = path.join(TEMPLATE_ROOT, relativeTemplatePath);
   const destinationRelativePath = toDestinationPath(relativeTemplatePath);
@@ -1041,8 +1251,7 @@ function writeLockState(repoRoot, payload, dryRun) {
   fs.writeFileSync(lockPath, JSON.stringify(payload, null, 2) + '\n', 'utf8');
 }
 
-function ensurePackageScripts(repoRoot, dryRun, options = {}) {
-  const force = Boolean(options.force);
+function removeLegacyPackageScripts(repoRoot, dryRun) {
   const packagePath = path.join(repoRoot, 'package.json');
   if (!fs.existsSync(packagePath)) {
    return { status: 'skipped', file: 'package.json', note: 'package.json not found' };
@@ -1058,29 +1267,87 @@
   const existingScripts = pkg.scripts && typeof pkg.scripts === 'object' ? pkg.scripts : {};
 
-  const hasExistingAgentScripts = Object.keys(existingScripts).some((key) => key.startsWith('agent:'));
-  if (hasExistingAgentScripts && !force) {
-    return { status: 'unchanged', file: 'package.json', note: 'preserved existing agent:* scripts' };
-  }
-
   pkg.scripts = existingScripts;
   let changed = false;
-  for (const [key, value] of Object.entries(REQUIRED_PACKAGE_SCRIPTS)) {
-    if (pkg.scripts[key] !== value) {
-      pkg.scripts[key] = value;
+  for (const [key, value] of Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS)) {
+    if (existingScripts[key] === value) {
+      delete existingScripts[key];
       changed = true;
     }
   }
 
   if (!changed) {
-    return { status: 'unchanged', file: 'package.json' };
+    return { status: 'unchanged', file: 'package.json', note: 'no Guardex-managed agent:* scripts found' };
   }
 
   if (!dryRun) {
     fs.writeFileSync(packagePath, JSON.stringify(pkg, null, 2) + '\n', 'utf8');
   }
-  return { status: 'updated', file: 'package.json' };
+  return { status: dryRun ? 'would-update' : 'updated', file: 'package.json', note: 'removed Guardex-managed agent:* scripts' };
+}
+
+function installUserLevelAsset(asset, options = {}) {
+  const dryRun = Boolean(options.dryRun);
+  const force = Boolean(options.force);
+  const destinationPath = path.join(GUARDEX_HOME_DIR, asset.destination);
+  const sourceContent = fs.readFileSync(asset.source, 'utf8');
+  const destinationExists = fs.existsSync(destinationPath);
+
+  if (destinationExists) {
+    const existingContent = fs.readFileSync(destinationPath, 'utf8');
+    if (existingContent === sourceContent) {
+      return { status: 'unchanged', file: asset.destination };
+    }
+    if (!force) {
+      return { status: 'skipped-conflict', file: asset.destination };
+    }
+  }
+
+  if (!dryRun) {
+    fs.mkdirSync(path.dirname(destinationPath), { recursive: true });
+    fs.writeFileSync(destinationPath, sourceContent, 'utf8');
+  }
+  return { status: destinationExists ? (dryRun ? 'would-update' : 'updated') : 'created', file: asset.destination };
+}
+
+function removeLegacyManagedRepoFile(repoRoot, relativePath, options = {}) {
+  const dryRun = Boolean(options.dryRun);
+  const force = Boolean(options.force);
+  const absolutePath = path.join(repoRoot, relativePath);
+  if (!fs.existsSync(absolutePath)) {
+    return { status: 'unchanged', file: relativePath, note: 'not present' };
+  }
+  if (!fs.statSync(absolutePath).isFile()) {
+    return { status: 'skipped-conflict', file: relativePath, note: 'not a regular file' };
+  }
+
+  const skillAsset = USER_LEVEL_SKILL_ASSETS.find((asset) => asset.destination === relativePath);
+  if (skillAsset) {
+    const userLevelPath = path.join(GUARDEX_HOME_DIR, skillAsset.destination);
+    if (!fs.existsSync(userLevelPath)) {
+      return { status: 'skipped', file: relativePath, note: 'user-level replacement not installed' };
+    }
+  }
+
+  const templateRelative = skillAsset
+    ? skillAsset.source.slice(TEMPLATE_ROOT.length + 1)
+    : relativePath.replace(/^\./, '');
+  const sourcePath = path.join(TEMPLATE_ROOT, templateRelative);
+  if (!fs.existsSync(sourcePath)) {
+    return { status: 'skipped', file: relativePath, note: 'template source missing' };
+  }
+
+  const sourceContent = fs.readFileSync(sourcePath, 'utf8');
+  const existingContent = fs.readFileSync(absolutePath, 'utf8');
+  if (existingContent !== sourceContent && !force) {
+    return { status: 'skipped-conflict', file: relativePath, note: 'local edits differ from managed template' };
+  }
+
+  if (!dryRun) {
+    fs.rmSync(absolutePath, { force: true });
+  }
+  return { status: dryRun ? 'would-remove' : 'removed', file: relativePath };
 }
 
 function ensureAgentsSnippet(repoRoot, dryRun, options = {}) {
@@ -1446,7 +1713,7 @@ function assertProtectedMainWriteAllowed(options, commandName) {
     throw new Error(
       `${commandName} blocked on protected branch '${blocked.branch}' in an initialized repo.\n` +
       `Keep local '${blocked.branch}' pull-only: start an agent branch/worktree first:\n` +
-      `  bash scripts/agent-branch-start.sh "" "codex"\n` +
+      `  gx branch start "" "codex"\n` +
       `Override once only when intentional: --allow-protected-base-write`,
     );
   }
@@ -1672,8 +1939,8 @@ function startProtectedBaseSandbox(blocked, { taskName, sandboxSuffix }) {
     return startProtectedBaseSandboxFallback(blocked, sandboxSuffix);
   }
 
-  const startResult = run('bash', [
-    startScript,
+  const startResult = runPackageAsset('branchStart', [
     '--task',
     taskName,
     '--agent',
@@ -1822,8 +2088,8 @@ function collectWorktreeDirtyPaths(worktreePath) {
 }
 
 function collectDoctorForceAddPaths(worktreePath) {
-  return TEMPLATE_FILES
-    .map((entry) => toDestinationPath(entry))
+  return REQUIRED_WORKFLOW_FILES
     .filter((relativePath) => relativePath.startsWith('scripts/') || relativePath.startsWith('.githooks/'))
     .filter((relativePath) => fs.existsSync(path.join(worktreePath, relativePath)));
 }
@@ -1875,13 +2140,13 @@ function claimDoctorChangedLocks(metadata) {
   ]));
   const deletedPaths = collectDoctorDeletedPaths(metadata.worktreePath);
   if (changedPaths.length > 0) {
-    run('python3', [lockScript, 'claim', '--branch', metadata.branch, ...changedPaths], {
+    runPackageAsset('lockTool', ['claim', '--branch', metadata.branch, ...changedPaths], {
       cwd: metadata.worktreePath,
       timeout: 30_000,
     });
   }
   if (deletedPaths.length > 0) {
-    run('python3', [lockScript, 'allow-delete', '--branch', metadata.branch, ...deletedPaths], {
+    runPackageAsset('lockTool', ['allow-delete', '--branch', metadata.branch, ...deletedPaths], {
       cwd: metadata.worktreePath,
       timeout: 30_000,
     });
@@ -2027,7 +2292,7 @@ function finishDoctorSandboxBranch(blocked, metadata, options = {}) {
   const finishResult = run(
     'bash',
-    [finishScript, '--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg],
+    [finishScript, '--branch', metadata.branch, '--base', blocked.branch, '--via-pr', waitForMergeArg, '--cleanup'],
     { cwd: metadata.worktreePath, timeout: finishTimeoutMs },
   );
   if (isSpawnFailure(finishResult)) {
@@ -2098,7 +2363,7 @@ function mergeDoctorSandboxRepairsBackToProtectedBase(options, blocked, metadata
     ...(autoCommitResult.stagedFiles || []),
     ...OMX_SCAFFOLD_DIRECTORIES,
     ...Array.from(OMX_SCAFFOLD_FILES.keys()),
-    ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)),
+    ...REQUIRED_WORKFLOW_FILES,
     'bin',
     'package.json',
     '.gitignore',
@@ -2236,9 +2501,7 @@
 }
 
 function syncDoctorLocalSupportFiles(repoRoot, dryRun) {
-  return TEMPLATE_FILES
-    .filter((entry) => entry.startsWith('codex/') || entry.startsWith('claude/'))
-    .map((entry) => ensureTemplateFilePresent(repoRoot, entry, dryRun));
+  return [];
 }
 
 function runDoctorInSandbox(options, blocked) {
@@ -3196,7 +3459,6 @@ function autoFinishReadyAgentBranches(repoRoot, options = {}) {
     summary.attempted += 1;
 
     const finishArgs = [
-      finishScript,
       '--branch',
      branch,
       '--base',
@@ -3205,7 +3467,7 @@
       waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge',
       '--cleanup',
     ];
-    const finishResult = run('bash', finishArgs, { cwd: repoRoot });
+    const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot });
     const combinedOutput = [finishResult.stdout || '', finishResult.stderr || ''].join('\n').trim();
 
     if (finishResult.status === 0) {
@@ -3358,9 +3620,9 @@ function printSetupRepoHints(repoRoot, baseBranch, repoLabel = '') {
     console.log(`[${TOOL_NAME}] Bootstrap commit${label}: git add . && git commit -m "bootstrap gitguardex"`);
     console.log(
       `[${TOOL_NAME}] First agent flow${label}: ` +
-        `bash scripts/agent-branch-start.sh "" "codex" -> ` +
-        `python3 scripts/agent-file-locks.py claim --branch "$(git branch --show-current)" -> ` +
-        `bash scripts/agent-branch-finish.sh --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`,
+        `gx branch start "" "codex" -> ` +
+        `gx locks claim --branch "$(git branch --show-current)" -> ` +
+        `gx branch finish --branch "$(git branch --show-current)" --base ${baseBranch} --via-pr --wait-for-merge`,
     );
   }
   if (!hasOrigin) {
@@ -3708,19 +3970,20 @@ function parseMergeArgs(rawArgs) {
   return options;
 }
 
-function parseFinishArgs(rawArgs) {
+function parseFinishArgs(rawArgs, defaults = {}) {
   const options = {
     target: process.cwd(),
     base: '',
     branch: '',
     all: false,
     dryRun: false,
-    waitForMerge: true,
-    cleanup: true,
+    waitForMerge: defaults.waitForMerge ?? true,
+    cleanup: defaults.cleanup ?? true,
     keepRemote: false,
     noAutoCommit: false,
     failFast: false,
     commitMessage: '',
+    mergeMode: defaults.mergeMode || 'pr',
   };
 
   for (let index = 0; index < rawArgs.length; index += 1) {
@@ -3777,6 +4040,26 @@
       options.waitForMerge = false;
       continue;
     }
+    if (arg === '--via-pr') {
+      options.mergeMode = 'pr';
+      continue;
+    }
+    if (arg === '--direct-only') {
+      options.mergeMode = 'direct';
+      continue;
+    }
+    if (arg === '--mode') {
+      const next = rawArgs[index + 1];
+      if (!next) {
+        throw new Error('--mode requires a value');
+      }
+      if (!['auto', 'direct', 'pr'].includes(next)) {
+        throw new Error(`Invalid --mode value: ${next} (expected auto|direct|pr)`);
+      }
+      options.mergeMode = next;
+      index += 1;
+      continue;
+    }
     if (arg === '--cleanup') {
       options.cleanup = true;
       continue;
     }
@@ -3941,7 +4224,7 @@ function claimLocksForAutoCommit(repoRoot, worktreePath, branch) {
   ]);
 
   if (changedFiles.length > 0) {
-    const claim = run('python3', [lockScript, 'claim', '--branch', branch, ...changedFiles], {
+    const claim = runPackageAsset('lockTool', ['claim', '--branch', branch, ...changedFiles], {
       cwd: repoRoot,
       stdio: 'pipe',
     });
@@ -3975,7 +4258,7 @@
   ]);
 
   if (deletedFiles.length > 0) {
-    const allowDelete = run('python3', [lockScript, 'allow-delete', '--branch', branch, ...deletedFiles], {
+    const allowDelete = runPackageAsset('lockTool', ['allow-delete', '--branch', branch, ...deletedFiles], {
      cwd: repoRoot,
       stdio: 'pipe',
     });
@@ -4753,6 +5036,16 @@ function askGlobalInstallForMissing(options, missingPackages, missingLocalTools)
 }
 
 function installGlobalToolchain(options) {
+  const approval = resolveGlobalInstallApproval(options);
+  if (approval.source === 'flag' && !approval.approved) {
+    return {
+      status: 'skipped',
+      reason: approval.source,
+      missingPackages: [],
+      missingLocalTools: [],
+    };
+  }
+
   if (options.dryRun) {
     return { status: 'dry-run-skip' };
   }
@@ -4781,11 +5074,11 @@
   const missingPackages = detection.ok ? detection.missing : [...GLOBAL_TOOLCHAIN_PACKAGES];
   const missingLocalTools = localCompanionTools.filter((tool) => tool.status !== 'active');
 
-  const approval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools);
-  if (!approval.approved) {
+  const installApproval = askGlobalInstallForMissing(options, missingPackages, missingLocalTools);
+  if (!installApproval.approved) {
     return {
       status: 'skipped',
-      reason: approval.source,
+      reason: installApproval.source,
       missingPackages,
       missingLocalTools,
     };
   }
@@ -4880,13 +5173,15 @@ function runInstallInternal(options) {
   for (const templateFile of TEMPLATE_FILES) {
     operations.push(copyTemplateFile(repoRoot, templateFile, Boolean(options.force), Boolean(options.dryRun)));
   }
+  for (const shim of SCRIPT_SHIMS) {
+    operations.push(ensureGeneratedScriptShim(repoRoot, shim, options));
+  }
+  for (const hookName of HOOK_NAMES) {
+    operations.push(ensureHookShim(repoRoot, hookName, options));
+  }
 
   operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
 
-  if (!options.skipPackageJson) {
-    operations.push(ensurePackageScripts(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
-  }
-
   if (!options.skipAgents) {
     operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) }));
   }
@@ -4925,6 +5220,12 @@ function runFixInternal(options) {
   for (const templateFile of TEMPLATE_FILES) {
     operations.push(ensureTemplateFilePresent(repoRoot, templateFile, Boolean(options.dryRun)));
   }
+  for (const shim of SCRIPT_SHIMS) {
+    operations.push(ensureGeneratedScriptShim(repoRoot, shim, options));
+  }
+  for (const hookName of HOOK_NAMES) {
+    operations.push(ensureHookShim(repoRoot, hookName, options));
+  }
 
   operations.push(ensureLockRegistry(repoRoot, Boolean(options.dryRun)));
 
@@ -4954,10 +5255,6 @@
     }
   }
 
-  if (!options.skipPackageJson) {
-    operations.push(ensurePackageScripts(repoRoot, Boolean(options.dryRun), { force:
Boolean(options.force) })); - } - if (!options.skipAgents) { operations.push(ensureAgentsSnippet(repoRoot, Boolean(options.dryRun), { force: Boolean(options.force) })); } @@ -4987,8 +5284,7 @@ function runScanInternal(options) { const requiredPaths = [ ...OMX_SCAFFOLD_DIRECTORIES, ...Array.from(OMX_SCAFFOLD_FILES.keys()), - ...TEMPLATE_FILES.map((entry) => toDestinationPath(entry)), - LOCK_FILE_RELATIVE, + ...REQUIRED_WORKFLOW_FILES, ]; for (const relativePath of requiredPaths) { @@ -6607,17 +6903,18 @@ function doctorAudit(rawArgs) { const packagePath = path.join(repoRoot, 'package.json'); if (!fs.existsSync(packagePath)) { - warn('package.json not found (npm helper scripts cannot be verified)'); + warn('package.json not found (legacy agent:* script drift cannot be checked)'); } else { try { const pkg = JSON.parse(fs.readFileSync(packagePath, 'utf8')); const scripts = pkg.scripts || {}; - for (const [name, expectedValue] of Object.entries(REQUIRED_PACKAGE_SCRIPTS)) { - if (scripts[name] !== expectedValue) { - fail(`package.json script mismatch for "${name}"`); - } else { - ok(`package.json script "${name}" is configured`); - } + const legacyAgentScripts = Object.entries(LEGACY_MANAGED_PACKAGE_SCRIPTS) + .filter(([name, expectedValue]) => scripts[name] === expectedValue) + .map(([name]) => name); + if (legacyAgentScripts.length > 0) { + warn(`legacy agent:* package.json scripts remain (${legacyAgentScripts.join(', ')}); run '${SHORT_TOOL_NAME} migrate' to remove them`); + } else { + ok('package.json does not contain Guardex-managed agent:* helper scripts'); } } catch (error) { fail(`package.json is invalid JSON: ${error.message}`); @@ -6699,6 +6996,167 @@ function prompt(rawArgs) { return copyPrompt(); } +function printStandaloneOperations(title, rootLabel, operations, dryRun = false) { + console.log(`[${TOOL_NAME}] ${title}: ${rootLabel}`); + for (const operation of operations) { + const note = operation.note ? 
` (${operation.note})` : ''; + console.log(` - ${operation.status.padEnd(12)} ${operation.file}${note}`); + } + if (dryRun) { + console.log(`[${TOOL_NAME}] Dry run complete. No files were modified.`); + } +} + +function branch(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'start') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('branchStart', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + if (subcommand === 'finish') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('branchFinish', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + if (subcommand === 'merge') return merge(rest); + throw new Error( + `Usage: ${SHORT_TOOL_NAME} branch [options] ` + + `(examples: '${SHORT_TOOL_NAME} branch start "" ""', '${SHORT_TOOL_NAME} branch finish --branch ')`, + ); +} + +function locks(rawArgs) { + const { target, passthrough } = extractTargetedArgs(rawArgs); + const result = runPackageAsset('lockTool', passthrough, { cwd: resolveRepoRoot(target) }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; +} + +function worktree(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'prune') { + const { target, passthrough } = extractTargetedArgs(rest); + invokePackageAsset('worktreePrune', passthrough, { cwd: resolveRepoRoot(target) }); + return; + } + throw new Error(`Usage: ${SHORT_TOOL_NAME} worktree prune [cleanup-options]`); +} + +function hook(rawArgs) { + const [subcommand, ...rest] = rawArgs; + if (subcommand === 'run') { + const [hookName, ...hookArgs] = rest; + if (!HOOK_NAMES.includes(hookName)) { + throw new Error(`Unknown hook name: ${hookName || '(missing)'}`); + } + const { target, passthrough } = extractTargetedArgs(hookArgs); + const hookAssetPath = path.join(TEMPLATE_ROOT, 'githooks', hookName); + const result = 
run('bash', [hookAssetPath, ...passthrough], { + cwd: resolveRepoRoot(target), + stdio: hookName === 'pre-push' ? 'inherit' : 'pipe', + env: packageAssetEnv(), + }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; + return; + } + if (subcommand === 'install') { + const { target, passthrough } = extractTargetedArgs(rest); + if (passthrough.length > 0) { + throw new Error(`Unknown hook install option: ${passthrough[0]}`); + } + const repoRoot = resolveRepoRoot(target); + const hookResult = configureHooks(repoRoot, false); + console.log(`[${TOOL_NAME}] Hook install target: ${repoRoot}`); + console.log(` - hooksPath ${hookResult.status} ${hookResult.key}=${hookResult.value}`); + process.exitCode = 0; + return; + } + throw new Error(`Usage: ${SHORT_TOOL_NAME} hook ...`); +} + +function internal(rawArgs) { + const [subcommand, assetKey, ...rest] = rawArgs; + if (subcommand !== 'run-shell') { + throw new Error(`Unknown internal command: ${subcommand || '(missing)'}`); + } + const { target, passthrough } = extractTargetedArgs(rest); + const result = runPackageAsset(assetKey, passthrough, { cwd: resolveRepoRoot(target) }); + if (result.stdout) process.stdout.write(result.stdout); + if (result.stderr) process.stderr.write(result.stderr); + process.exitCode = result.status; +} + +function installAgentSkills(rawArgs) { + let dryRun = false; + let force = false; + for (const arg of rawArgs) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const operations = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('User-level Guardex skills', GUARDEX_HOME_DIR, operations, dryRun); + process.exitCode = 0; +} + +function migrate(rawArgs) { + const { target, passthrough } = 
extractTargetedArgs(rawArgs); + let dryRun = false; + let force = false; + let installSkills = false; + for (const arg of passthrough) { + if (arg === '--dry-run') { + dryRun = true; + continue; + } + if (arg === '--force') { + force = true; + continue; + } + if (arg === '--install-agent-skills') { + installSkills = true; + continue; + } + throw new Error(`Unknown option: ${arg}`); + } + + const repoRoot = resolveRepoRoot(target); + const fixPayload = runFixInternal({ + target: repoRoot, + dryRun, + force, + skipAgents: false, + skipPackageJson: true, + skipGitignore: false, + dropStaleLocks: true, + }); + printOperations('Migrate/fix', fixPayload, dryRun); + + if (installSkills) { + const skillOps = USER_LEVEL_SKILL_ASSETS.map((asset) => installUserLevelAsset(asset, { dryRun, force })); + printStandaloneOperations('Migrate/install-agent-skills', GUARDEX_HOME_DIR, skillOps, dryRun); + } + + const removableLegacyFiles = LEGACY_MANAGED_REPO_FILES.filter( + (relativePath) => !REQUIRED_WORKFLOW_FILES.includes(relativePath), + ); + const removalOps = removableLegacyFiles.map((relativePath) => removeLegacyManagedRepoFile(repoRoot, relativePath, { dryRun, force })); + removalOps.push(removeLegacyPackageScripts(repoRoot, dryRun)); + printStandaloneOperations('Migrate/cleanup', repoRoot, removalOps, dryRun); + process.exitCode = 0; +} + function cleanup(rawArgs) { const options = parseCleanupArgs(rawArgs); const repoRoot = resolveRepoRoot(options.target); @@ -6707,7 +7165,7 @@ function cleanup(rawArgs) { throw new Error(`Missing cleanup script: ${pruneScript}. 
Run '${SHORT_TOOL_NAME} setup' first.`); } - const args = [pruneScript]; + const args = []; if (options.base) { args.push('--base', options.base); } @@ -6738,7 +7196,7 @@ function cleanup(rawArgs) { } const runCleanupCycle = () => { - const runResult = run('bash', args, { cwd: repoRoot, stdio: 'inherit' }); + const runResult = runPackageAsset('worktreePrune', args, { cwd: repoRoot, stdio: 'inherit' }); if (runResult.status !== 0) { throw new Error('Cleanup command failed'); } @@ -6777,7 +7235,7 @@ function merge(rawArgs) { throw new Error(`Missing merge script: ${mergeScript}. Run '${SHORT_TOOL_NAME} setup' first.`); } - const args = [mergeScript]; + const args = []; if (options.base) { args.push('--base', options.base); } @@ -6794,7 +7252,7 @@ function merge(rawArgs) { args.push('--branch', branch); } - const mergeResult = run('bash', args, { cwd: repoRoot, stdio: 'pipe' }); + const mergeResult = runPackageAsset('branchMerge', args, { cwd: repoRoot, stdio: 'pipe' }); if (mergeResult.stdout) { process.stdout.write(mergeResult.stdout); } @@ -6808,8 +7266,8 @@ function merge(rawArgs) { process.exitCode = 0; } -function finish(rawArgs) { - const options = parseFinishArgs(rawArgs); +function finish(rawArgs, defaults = {}) { + const options = parseFinishArgs(rawArgs, defaults); const repoRoot = resolveRepoRoot(options.target); const finishScript = path.join(repoRoot, 'scripts', 'agent-branch-finish.sh'); @@ -6880,26 +7338,31 @@ function finish(rawArgs) { } const finishArgs = [ - finishScript, '--branch', branch, '--base', baseBranch, - '--via-pr', options.waitForMerge ? '--wait-for-merge' : '--no-wait-for-merge', options.cleanup ? 
'--cleanup' : '--no-cleanup', ]; + if (options.mergeMode === 'pr') { + finishArgs.push('--via-pr'); + } else if (options.mergeMode === 'direct') { + finishArgs.push('--direct-only'); + } else { + finishArgs.push('--mode', 'auto'); + } if (options.keepRemote) { finishArgs.push('--keep-remote-branch'); } if (options.dryRun) { - console.log(`[${TOOL_NAME}] [dry-run] Would run: bash ${finishArgs.join(' ')}`); + console.log(`[${TOOL_NAME}] [dry-run] Would run: gx branch finish ${finishArgs.join(' ')}`); succeeded += 1; continue; } - const finishResult = run('bash', finishArgs, { cwd: repoRoot, stdio: 'pipe' }); + const finishResult = runPackageAsset('branchFinish', finishArgs, { cwd: repoRoot, stdio: 'pipe' }); if (finishResult.stdout) { process.stdout.write(finishResult.stdout); } @@ -7293,6 +7756,13 @@ function main() { if (command === 'prompt') return prompt(rest); if (command === 'doctor') return doctor(rest); + if (command === 'branch') return branch(rest); + if (command === 'locks') return locks(rest); + if (command === 'worktree') return worktree(rest); + if (command === 'hook') return hook(rest); + if (command === 'migrate') return migrate(rest); + if (command === 'install-agent-skills') return installAgentSkills(rest); + if (command === 'internal') return internal(rest); if (command === 'agents') return agents(rest); if (command === 'merge') return merge(rest); if (command === 'finish') return finish(rest); diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md new file mode 100644 index 0000000..9656094 --- /dev/null +++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/proposal.md @@ -0,0 +1,26 @@ +## Why + +- `gx setup` and `gx doctor` currently copy workflow implementations, full hook files, repo-local skills, and `agent:*` package scripts into every consumer repo. 
+- That distribution model creates drift by design, which is why doctor needs protected-branch sandbox repair flows just to re-sync copied logic that actually belongs to the CLI package. +- Repo-local copies also make the public workflow noisy: consumers are taught to call pasted scripts instead of the `gx` CLI that owns the behavior. + +## What Changes + +- Add CLI-owned workflow subcommands for branch start/finish, lock operations, hook dispatch, worktree prune, repo migration, and user-level agent-skill installation. +- Replace installed repo hooks with tiny shims that dispatch into `gx hook run ...`. +- Stop setup/doctor from copying repo-local workflow implementations or repo-local skills, and stop injecting Guardex-managed `agent:*` package scripts into target repos while keeping repo-local dispatch shims. +- Add a `gx migrate` path that converts old-style installs to the new minimal repo footprint. +- Update docs, prompts, and managed templates to teach the `gx ...` surface instead of pasted script paths. + +## Scope + +- `bin/multiagent-safety.js` +- hook templates and package-owned workflow assets under `templates/` +- setup/doctor/install/migrate tests in `test/install.test.js` +- user-facing docs/templates (`README.md`, managed AGENTS block, skill templates) + +## Risks + +- Hook shims must still work in repos that only have the CLI on `PATH`; the tests need to lock that behavior. +- Existing repos may keep stale copied files until `gx migrate` runs, so migration must be conservative and explicit about what it removes. +- Setup/doctor/status output will change materially because the managed repo footprint is smaller. 
diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md new file mode 100644 index 0000000..0692774 --- /dev/null +++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/specs/cli-owned-install-surface/spec.md @@ -0,0 +1,54 @@ +## ADDED Requirements + +### Requirement: Setup installs only repo-local state and dispatch shims + +`gx setup` and `gx doctor` SHALL keep the managed repo footprint limited to repo-local state and dispatch shims, not copied workflow logic. + +#### Scenario: setup installs the minimal repo footprint + +- **GIVEN** a repo opts into Guardex +- **WHEN** `gx setup` runs +- **THEN** it installs the managed AGENTS block, `.githooks/*` dispatch shims, `scripts/*` workflow shims, `.omx/.omc` scaffold, lock registry state, and the managed `.gitignore` block +- **AND** it does not copy workflow implementations, repo-local Codex/Claude skills, or inject Guardex-managed `agent:*` helper scripts into `package.json` + +#### Scenario: doctor repairs the minimal footprint without restoring copied scripts + +- **GIVEN** a repo already uses the CLI-owned install surface +- **WHEN** `gx doctor` repairs drift +- **THEN** it restores the managed AGENTS block, hook/workflow shims, lock registry, and managed `.gitignore` entries as needed +- **AND** it does not recreate copied workflow implementations, repo-local skills, or Guardex-managed `agent:*` package scripts + +### Requirement: Hook shims dispatch through `gx` + +Installed repo hooks SHALL delegate to CLI-owned hook logic instead of embedding guard behavior inline. 
+ +#### Scenario: pre-commit hook is a shim + +- **GIVEN** `gx setup` installed repo hooks +- **WHEN** `.githooks/pre-commit` is inspected or executed +- **THEN** it delegates to `gx hook run pre-commit` +- **AND** the guarded pre-commit behavior still enforces the same branch and lock rules + +### Requirement: CLI-owned workflow commands remain available without copied workflow implementations + +The CLI SHALL expose the guard workflow directly so consumers do not need copied repo workflow logic. + +#### Scenario: branch and lock commands run from the CLI + +- **GIVEN** a repo with the minimal install footprint +- **WHEN** a user runs `gx branch start`, `gx branch finish`, `gx locks claim`, or `gx worktree prune` +- **THEN** the command executes using package-owned logic +- **AND** any repo-local `scripts/agent-branch-*.sh` or `scripts/agent-file-locks.py` files remain thin dispatch shims instead of copied workflow logic + +### Requirement: Migration removes old-style copied workflow files + +The CLI SHALL provide a migration path from old repo-local installs to the CLI-owned surface. 
+ +#### Scenario: migrate converts an old-style install + +- **GIVEN** a repo still contains Guardex-managed workflow scripts, repo-local skills, and injected `agent:*` package scripts +- **WHEN** `gx migrate` runs +- **THEN** it replaces hooks with dispatch shims +- **AND** it removes the copied workflow scripts and managed `agent:*` script injections +- **AND** it removes repo-local Guardex skill copies when matching user-level installs are present +- **AND** it leaves the AGENTS block, lock registry, and managed `.gitignore` in the new minimal form diff --git a/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md new file mode 100644 index 0000000..e2df242 --- /dev/null +++ b/openspec/changes/agent-codex-cli-owned-install-surface-2026-04-21-23-09/tasks.md @@ -0,0 +1,33 @@ +## 1. Spec + +- [x] 1.1 Define the CLI-owned install surface and minimal repo footprint in `specs/cli-owned-install-surface/spec.md`. +- [x] 1.2 Update the proposal/tasks to reflect the migration constraints and cleanup goal. + +## 2. Tests + +- [x] 2.1 Add or update install/setup/doctor regressions for the new minimal repo footprint: + - hooks install as `gx hook run ...` shims + - setup/doctor keep repo-local dispatch shims but stop copying workflow implementations or repo-local skills + - migration converts old-style installs and removes Guardex-managed `agent:*` package scripts +- [x] 2.2 Add coverage for the new CLI-owned command surface (`gx branch ...`, `gx locks ...`, `gx worktree prune`, `gx migrate --install-agent-skills` as applicable). + +## 3. Implementation + +- [x] 3.1 Add CLI-owned workflow subcommands and package-asset execution paths in `bin/multiagent-safety.js`. +- [x] 3.2 Convert installed hook templates to shims and route hook logic through package-owned assets. 
+- [x] 3.3 Remove repo-local workflow implementation/skill/package-script installation from setup/doctor while preserving AGENTS, hook/workflow shims, lock state, and managed gitignore behavior. +- [x] 3.4 Add `gx migrate` and user-level skill installation support. +- [x] 3.5 Update docs/templates/prompts to teach the `gx` surface instead of pasted repo scripts. + +## 4. Verification + +- [x] 4.1 Run `node --check bin/multiagent-safety.js`. +- [x] 4.2 Run the focused install/doctor suite: `node --test test/install.test.js`. +- [x] 4.3 Run `openspec validate agent-codex-cli-owned-install-surface-2026-04-21-23-09 --type change --strict`. +- [x] 4.4 Run `openspec validate --specs`. + +## 5. Cleanup + +- [x] 5.1 Confirm the OpenSpec tasks reflect the shipped behavior and note any deferred follow-ups. +- [ ] 5.2 Finish the agent branch via PR merge + cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `bash scripts/agent-branch-finish.sh --branch --base --via-pr --wait-for-merge --cleanup`). +- [ ] 5.3 Record PR URL + final `MERGED` evidence in the completion handoff. diff --git a/templates/AGENTS.multiagent-safety.md b/templates/AGENTS.multiagent-safety.md index a9fa4b6..9ab89d7 100644 --- a/templates/AGENTS.multiagent-safety.md +++ b/templates/AGENTS.multiagent-safety.md @@ -7,22 +7,22 @@ `GUARDEX_ON=0` disables Guardex for that repo. `GUARDEX_ON=1` explicitly enables Guardex for that repo again. -**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. Start with `scripts/agent-branch-start.sh "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. +**Isolation.** Every task runs on a dedicated `agent/*` branch + worktree. 
Start with `gx branch start "" ""`. Treat the base branch (`main`/`dev`) as read-only while an agent branch is active. Never `git checkout ` on a primary working tree (including nested repos); use `git worktree add` instead. The `.githooks/post-checkout` hook auto-reverts primary-branch switches during agent sessions - bypass only with `GUARDEX_ALLOW_PRIMARY_BRANCH_SWITCH=1`. For every new task, including follow-up work in the same chat/session, if an assigned agent sub-branch/worktree is already open, continue in that sub-branch instead of creating a fresh lane unless the user explicitly redirects scope. Never implement directly on the local/base branch checkout; keep it unchanged and perform all edits in the agent sub-branch/worktree. -**Ownership.** Before editing, claim files: `scripts/agent-file-locks.py claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned. +**Ownership.** Before editing, claim files: `gx locks claim --branch "" `. Before deleting, confirm the path is in your claim. Don't edit outside your scope unless reassigned. **Handoff gate.** Post a one-line handoff note (plan/change, owned scope, intended action) before editing. Re-read the latest handoffs before replacing others' code. -**Completion.** Finish with `scripts/agent-branch-finish.sh --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish. +**Completion.** Finish with `gx branch finish --branch "" --via-pr --wait-for-merge --cleanup` (or `gx finish --all`). Task is only complete when: commit pushed, PR URL recorded, state = `MERGED`, sandbox worktree pruned. If anything blocks, append a `BLOCKED:` note and stop - don't half-finish. 
OMX completion policy: when a task is done, the agent must commit the task changes, push the agent branch, and create/update a PR before considering the branch complete. **Parallel safety.** Assume other agents edit nearby. Never revert unrelated changes. Report conflicts in the handoff. **Reporting.** Every completion handoff includes: files changed, behavior touched, verification commands + results, risks/follow-ups. -**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `scripts/agent-branch-finish.sh ... --cleanup`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified. +**OpenSpec (when change-driven).** Keep `openspec/changes//tasks.md` checkboxes current during work, not batched at the end. Task scaffolds and manual task edits must include an explicit final completion/cleanup section that ends with PR merge + sandbox cleanup (`gx finish --via-pr --wait-for-merge --cleanup` or `gx branch finish ... --cleanup`) and records PR URL + final `MERGED` evidence. Verify specs with `openspec validate --specs` before archive. Don't archive unverified. **Version bumps.** If a change bumps a published version, the same PR updates release notes/changelog. diff --git a/templates/githooks/post-checkout b/templates/githooks/post-checkout index ad90fad..21a2305 100755 --- a/templates/githooks/post-checkout +++ b/templates/githooks/post-checkout @@ -64,7 +64,7 @@ echo "[agent-primary-branch-guard] Primary checkout switched branches." 
>&2 echo "[agent-primary-branch-guard] from: $prev_branch (protected)" >&2 echo "[agent-primary-branch-guard] to: $new_branch" >&2 echo "[agent-primary-branch-guard] The primary working tree must stay on its base/protected branch." >&2 -echo "[agent-primary-branch-guard] Use 'git worktree add' (or scripts/agent-branch-start.sh) for feature work." >&2 +echo "[agent-primary-branch-guard] Use 'git worktree add' (or gx branch start) for feature work." >&2 if [[ "$is_agent" == "1" ]]; then echo "[agent-primary-branch-guard] Agent session detected — reverting to '$prev_branch'." >&2 diff --git a/templates/githooks/post-merge b/templates/githooks/post-merge index 8bd7809..fc405e8 100755 --- a/templates/githooks/post-merge +++ b/templates/githooks/post-merge @@ -32,17 +32,30 @@ if [[ "$branch" != "$base_branch" ]]; then exit 0 fi -cli_path="$repo_root/bin/multiagent-safety.js" -if [[ ! -f "$cli_path" ]]; then +if [[ -n "${GUARDEX_CLI_ENTRY:-}" ]]; then + node_bin="${GUARDEX_NODE_BIN:-node}" + if command -v "$node_bin" >/dev/null 2>&1; then + "$node_bin" "$GUARDEX_CLI_ENTRY" cleanup \ + --target "$repo_root" \ + --base "$base_branch" \ + --include-pr-merged \ + --keep-clean-worktrees >/dev/null 2>&1 || true + fi exit 0 fi -node_bin="${GUARDEX_NODE_BIN:-node}" -if ! 
command -v "$node_bin" >/dev/null 2>&1; then - exit 0 +cli_bin="${GUARDEX_CLI_BIN:-}" +if [[ -z "$cli_bin" ]]; then + if command -v gx >/dev/null 2>&1; then + cli_bin="gx" + elif command -v gitguardex >/dev/null 2>&1; then + cli_bin="gitguardex" + else + exit 0 + fi fi -"$node_bin" "$cli_path" cleanup \ +"$cli_bin" cleanup \ --target "$repo_root" \ --base "$base_branch" \ --include-pr-merged \ diff --git a/templates/githooks/pre-commit b/templates/githooks/pre-commit index facd3ff..444cb3e 100755 --- a/templates/githooks/pre-commit +++ b/templates/githooks/pre-commit @@ -118,9 +118,9 @@ if [[ "$should_require_codex_agent_branch" == "1" && "${GUARDEX_ALLOW_CODEX_ON_N [guardex-preedit-guard] Codex edit/commit detected on a protected branch. GitGuardex requires Codex work to run from an isolated agent/* branch. Start the sub-branch/worktree with: - bash scripts/codex-agent.sh "" "" + gx branch start "" "" Or manually: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" Then commit from the created agent/* branch. Temporary bypass (not recommended): @@ -132,7 +132,7 @@ MSG cat >&2 <<'MSG' [codex-branch-guard] Codex agent commit blocked on non-agent branch. Use isolated branch/worktree first: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" Then commit from the created agent/* branch. Temporary bypass (not recommended): @@ -163,9 +163,9 @@ if [[ "$is_protected_branch" == "1" ]]; then cat >&2 <<'MSG' [agent-branch-guard] Direct commits on protected branches are blocked. Use an agent branch first: - bash scripts/agent-branch-start.sh "" "" + gx branch start "" "" After finishing work: - bash scripts/agent-branch-finish.sh + gx branch finish Temporary bypass (not recommended): ALLOW_COMMIT_ON_PROTECTED_BRANCH=1 git commit ... @@ -177,7 +177,7 @@ if [[ "$is_agent_session" == "1" && "$branch" != agent/* ]]; then cat >&2 <<'MSG' [agent-branch-guard] Agent commits must run on dedicated agent/* branches. 
 Start an agent branch first:
-  bash scripts/agent-branch-start.sh "" ""
+  gx branch start "" ""
 Then commit on that branch.
 Temporary bypass (not recommended):
@@ -199,7 +199,7 @@ if [[ "$branch" == agent/* ]]; then
   cat >&2 <<'MSG'
 [agent-branch-guard] Agent branch commits require file ownership locks. Claim files first:
-  python3 scripts/agent-file-locks.py claim --branch "$(git rev-parse --abbrev-ref HEAD)"
+  gx locks claim --branch "$(git rev-parse --abbrev-ref HEAD)"
 MSG
   exit 1
 fi
diff --git a/templates/scripts/openspec/init-change-workspace.sh b/templates/scripts/openspec/init-change-workspace.sh
index 6f0930d..7a89cdf 100755
--- a/templates/scripts/openspec/init-change-workspace.sh
+++ b/templates/scripts/openspec/init-change-workspace.sh
@@ -74,11 +74,11 @@ Describe the change in a sentence or two. Commit message is the spec of record.
 ## Handoff

 - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`.
-- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
+- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/notes.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.

 ## Cleanup

-- [ ] Run: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`
+- [ ] Run: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`
 - [ ] Record PR URL + \`MERGED\` state in the completion handoff.
 - [ ] Confirm sandbox worktree is gone (\`git worktree list\`, \`git branch -a\`).
 NOTESEOF
@@ -117,7 +117,7 @@ This change is complete only when **all** of the following are true:

 ## Handoff

 - Handoff: change=\`${CHANGE_SLUG}\`; branch=\`${AGENT_BRANCH}\`; scope=\`TODO\`; action=\`continue this sandbox or finish cleanup after a usage-limit/manual takeover\`.
-- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.
+- Copy prompt: Continue \`${CHANGE_SLUG}\` on branch \`${AGENT_BRANCH}\`. Work inside the existing sandbox, review \`openspec/changes/${CHANGE_SLUG}/tasks.md\`, continue from the current state instead of creating a new sandbox, and when the work is done run \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`.

 ## 1. Specification
@@ -137,7 +137,7 @@ This change is complete only when **all** of the following are true:

 ## 4. Cleanup (mandatory; run before claiming completion)

-- [ ] 4.1 Run the cleanup pipeline: \`bash scripts/agent-branch-finish.sh --branch "${AGENT_BRANCH}" --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation.
+- [ ] 4.1 Run the cleanup pipeline: \`gx branch finish --branch ${AGENT_BRANCH} --base ${BASE_BRANCH} --via-pr --wait-for-merge --cleanup\`. This handles commit -> push -> PR create -> merge wait -> worktree prune in one invocation.
 - [ ] 4.2 Record the PR URL and final merge state (\`MERGED\`) in the completion handoff.
 - [ ] 4.3 Confirm the sandbox worktree is gone (\`git worktree list\` no longer shows the agent path; \`git branch -a\` shows no surviving local/remote refs for the branch).
 TASKSEOF
diff --git a/templates/scripts/openspec/init-plan-workspace.sh b/templates/scripts/openspec/init-plan-workspace.sh
index c96dba3..b664569 100755
--- a/templates/scripts/openspec/init-plan-workspace.sh
+++ b/templates/scripts/openspec/init-plan-workspace.sh
@@ -434,7 +434,7 @@ EXCCPTEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -470,7 +470,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -506,7 +506,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -542,7 +542,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -578,7 +578,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -614,7 +614,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
@@ -650,7 +650,7 @@ TASKEOF

 ## 6. Cleanup

-- [ ] 6.1 If this lane owns finalization, run \`bash scripts/agent-branch-finish.sh --branch --base dev --via-pr --wait-for-merge --cleanup\`.
+- [ ] 6.1 If this lane owns finalization, run \`gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup\`.
 - [ ] 6.2 Record PR URL + final \`MERGED\` state in the handoff.
 - [ ] 6.3 Confirm sandbox cleanup (\`git worktree list\`, \`git branch -a\`) or append \`BLOCKED:\` and stop.
 TASKEOF
diff --git a/test/install.test.js b/test/install.test.js
index 80455ac..96734d3 100644
--- a/test/install.test.js
+++ b/test/install.test.js
@@ -55,7 +55,13 @@ function runCmd(cmd, args, cwd, options = {}) {
   return cp.spawnSync(cmd, args, {
     cwd,
     encoding: 'utf8',
-    env: { ...sanitizedEnv, ...pushBypassEnv, ...overrideEnv },
+    env: {
+      ...sanitizedEnv,
+      GUARDEX_CLI_ENTRY: cliPath,
+      GUARDEX_NODE_BIN: process.execPath,
+      ...pushBypassEnv,
+      ...overrideEnv,
+    },
   });
 }

@@ -412,7 +418,7 @@ if (!canSpawnChildProcesses) {
 test('setup provisions workflow files and repo config', () => {
   const repoDir = initRepo();
-  let result = runNode(['setup', '--target', repoDir], repoDir);
+  let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.match(result.stdout, /OpenSpec core workflow: \/opsx:propose -> \/opsx:apply -> \/opsx:archive/);
   assert.match(result.stdout, /OpenSpec guide: docs\/openspec-getting-started\.md/);
@@ -435,15 +441,12 @@ test('setup provisions workflow files and repo config', () => {
     'scripts/agent-worktree-prune.sh',
     'scripts/agent-file-locks.py',
     'scripts/guardex-env.sh',
-    'scripts/install-agent-git-hooks.sh',
     'scripts/openspec/init-plan-workspace.sh',
     'scripts/openspec/init-change-workspace.sh',
     '.githooks/pre-commit',
     '.githooks/pre-push',
     '.githooks/post-merge',
-    '.codex/skills/gitguardex/SKILL.md',
-    '.codex/skills/guardex-merge-skills-to-dev/SKILL.md',
-    '.claude/commands/gitguardex.md',
+    '.githooks/post-checkout',
     '.github/pull.yml.example',
     '.github/workflows/cr.yml',
     '.omx/state/agent-file-locks.json',
@@ -455,6 +458,14 @@ test('setup provisions workflow files and repo config', () => {
     assert.equal(fs.existsSync(path.join(repoDir, relativePath)), true, `${relativePath} missing`);
   }

+  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
+  assert.match(branchStartShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'branch' 'start' "\$@"/);
+  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+
+  const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
+  assert.match(preCommitShim, /exec "\$node_bin" "\$GUARDEX_CLI_ENTRY" 'hook' 'run' 'pre-commit' "\$@"/);
+  assert.match(preCommitShim, /exec "\$cli_bin" 'hook' 'run' 'pre-commit' "\$@"/);
+
   const crWorkflow = fs.readFileSync(path.join(repoDir, '.github', 'workflows', 'cr.yml'), 'utf8');
   assert.match(crWorkflow, /name:\s+Code Review/);
   assert.match(crWorkflow, /pull_request:/);
@@ -464,18 +475,8 @@ test('setup provisions workflow files and repo config', () => {
   assert.doesNotMatch(crWorkflow, /if:\s+\$\{\{\s*secrets\.OPENAI_API_KEY/);

   const packageJson = JSON.parse(fs.readFileSync(path.join(repoDir, 'package.json'), 'utf8'));
-  assert.equal(packageJson.scripts['agent:codex'], 'bash ./scripts/codex-agent.sh');
-  assert.equal(packageJson.scripts['agent:review:watch'], 'bash ./scripts/review-bot-watch.sh');
-  assert.equal(packageJson.scripts['agent:branch:start'], 'bash ./scripts/agent-branch-start.sh');
-  assert.equal(packageJson.scripts['agent:finish'], 'gx finish --all');
-  assert.equal(packageJson.scripts['agent:plan:init'], 'bash ./scripts/openspec/init-plan-workspace.sh');
-  assert.equal(packageJson.scripts['agent:change:init'], 'bash ./scripts/openspec/init-change-workspace.sh');
-  assert.equal(packageJson.scripts['agent:protect:list'], 'gx protect list');
-  assert.equal(packageJson.scripts['agent:branch:sync'], 'gx sync');
-  assert.equal(packageJson.scripts['agent:branch:sync:check'], 'gx sync --check');
-  assert.equal(packageJson.scripts['agent:docker:load'], 'bash ./scripts/guardex-docker-loader.sh');
-  assert.equal(packageJson.scripts['agent:safety:setup'], 'gx setup');
-  assert.equal(packageJson.scripts['agent:cleanup'], 'gx cleanup');
+  const managedAgentScripts = Object.keys(packageJson.scripts || {}).filter((name) => name.startsWith('agent:'));
+  assert.deepEqual(managedAgentScripts, [], 'setup should not inject agent:* helper scripts');

   const agentsContent = fs.readFileSync(path.join(repoDir, 'AGENTS.md'), 'utf8');
   assert.equal(agentsContent.includes(''), true);
@@ -495,9 +496,6 @@ test('setup provisions workflow files and repo config', () => {
   assert.match(gitignoreContent, /\.omx\//);
   assert.match(gitignoreContent, /\.omc\//);
   assert.match(gitignoreContent, /oh-my-codex\//);
-  assert.match(gitignoreContent, /\.codex\/skills\/gitguardex\/SKILL\.md/);
-  assert.match(gitignoreContent, /\.codex\/skills\/guardex-merge-skills-to-dev\/SKILL\.md/);
-  assert.match(gitignoreContent, /\.claude\/commands\/gitguardex\.md/);
   assert.match(gitignoreContent, /\.omx\/state\/agent-file-locks\.json/);
   assert.match(gitignoreContent, /# multiagent-safety:END/);

@@ -505,7 +503,7 @@ test('setup provisions workflow files and repo config', () => {
   assert.equal(result.status, 0, result.stderr);
   assert.equal(result.stdout.trim(), '.githooks');

-  const secondRun = runNode(['setup', '--target', repoDir], repoDir);
+  const secondRun = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
   assert.equal(secondRun.status, 0, secondRun.stderr || secondRun.stdout);
 });

@@ -527,7 +525,8 @@ test('setup on a fresh compose repo prints onboarding hints and installs a worki
   assert.match(result.stdout, /GUARDEX_DOCKER_SERVICE/);

   const packageJson = JSON.parse(fs.readFileSync(path.join(repoDir, 'package.json'), 'utf8'));
-  assert.equal(packageJson.scripts['agent:docker:load'], 'bash ./scripts/guardex-docker-loader.sh');
+  const managedAgentScripts = Object.keys(packageJson.scripts || {}).filter((name) => name.startsWith('agent:'));
+  assert.deepEqual(managedAgentScripts, [], 'setup should not inject agent:* helper scripts');

   const { fakeBin } = createFakeDockerScript(
     'if [[ "$1" == "compose" && "$2" == "version" ]]; then\n' +
@@ -561,26 +560,46 @@ test('setup on a fresh compose repo prints onboarding hints and installs a worki
   assert.match(result.stdout, /EXEC:compose exec -T app echo hello/);
 });

-test('setup and doctor explain .codex file conflicts and still write managed gitignore first', () => {
+test('setup --no-global-install skips npm global toolchain probing', () => {
   const repoDir = initRepo();
-  fs.writeFileSync(path.join(repoDir, '.codex'), '', 'utf8');
+  const markerPath = path.join(repoDir, '.npm-probe-marker');
+  const fakeNpmPath = createFakeNpmScript(
+    'printf \'%s\\n\' "called" > "${GUARDEX_TEST_NPM_MARKER}"\n' +
+    'exit 99\n',
+  );
+
+  const result = runNodeWithEnv(
+    ['setup', '--target', repoDir, '--no-global-install'],
+    repoDir,
+    {
+      GUARDEX_NPM_BIN: fakeNpmPath,
+      GUARDEX_TEST_NPM_MARKER: markerPath,
+    },
+  );
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+  assert.equal(fs.existsSync(markerPath), false, '--no-global-install should bypass npm probing entirely');
+});
+
+test('setup and doctor explain .githooks file conflicts and still write managed gitignore first', () => {
+  const repoDir = initRepo();
+  fs.writeFileSync(path.join(repoDir, '.githooks'), '', 'utf8');

   let result = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
-  assert.notEqual(result.status, 0, 'setup should fail when .codex is a file');
+  assert.notEqual(result.status, 0, 'setup should fail when .githooks is a file');
   let combined = `${result.stdout}\n${result.stderr}`;
-  assert.match(combined, /Path conflict: \.codex exists as a file/);
-  assert.match(combined, /\.codex\/skills\/gitguardex\/SKILL\.md needs it to be a directory/);
+  assert.match(combined, /Path conflict: \.githooks exists as a file/);
+  assert.match(combined, /\.githooks\/pre-commit needs it to be a directory/);

   let gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(gitignoreContent, /# multiagent-safety:START/);
   assert.match(gitignoreContent, /scripts\/agent-branch-start\.sh/);
   assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
-  assert.match(gitignoreContent, /\.codex\/skills\/gitguardex\/SKILL\.md/);
+  assert.match(gitignoreContent, /^\.githooks$/m);

   result = runNode(['doctor', '--target', repoDir], repoDir);
-  assert.notEqual(result.status, 0, 'doctor should fail when .codex is a file');
+  assert.notEqual(result.status, 0, 'doctor should fail when .githooks is a file');
   combined = `${result.stdout}\n${result.stderr}`;
-  assert.match(combined, /Path conflict: \.codex exists as a file/);
+  assert.match(combined, /Path conflict: \.githooks exists as a file/);

   gitignoreContent = fs.readFileSync(path.join(repoDir, '.gitignore'), 'utf8');
   assert.match(gitignoreContent, /scripts\/agent-file-locks\.py/);
@@ -738,13 +757,81 @@ test('setup and doctor preserve existing agent scripts in package.json by defaul
   assert.equal(result.status, 0, result.stderr || result.stdout);
   let currentPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
   assert.deepEqual(currentPackage.scripts, customPackage.scripts, 'setup should preserve existing agent scripts');
-  assert.match(result.stdout, /preserved existing agent:\* scripts/);

   result = runNode(['doctor', '--target', repoDir], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
   currentPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
   assert.deepEqual(currentPackage.scripts, customPackage.scripts, 'doctor should preserve existing agent scripts');
-  assert.match(result.stdout, /preserved existing agent:\* scripts/);
+});
+
+test('migrate removes legacy copied assets and installs user-level skills on request', () => {
+  const repoDir = initRepo();
+  const repoRoot = path.resolve(__dirname, '..');
+  const guardexHomeDir = fs.mkdtempSync(path.join(os.tmpdir(), 'guardex-migrate-home-'));
+  const packagePath = path.join(repoDir, 'package.json');
+
+  fs.mkdirSync(path.join(repoDir, '.codex', 'skills', 'gitguardex'), { recursive: true });
+  fs.mkdirSync(path.join(repoDir, '.claude', 'commands'), { recursive: true });
+  fs.mkdirSync(path.join(repoDir, 'scripts'), { recursive: true });
+
+  fs.writeFileSync(
+    path.join(repoDir, 'scripts', 'install-agent-git-hooks.sh'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'scripts', 'install-agent-git-hooks.sh'), 'utf8'),
+    'utf8',
+  );
+  fs.writeFileSync(
+    path.join(repoDir, '.codex', 'skills', 'gitguardex', 'SKILL.md'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'codex', 'skills', 'gitguardex', 'SKILL.md'), 'utf8'),
+    'utf8',
+  );
+  fs.writeFileSync(
+    path.join(repoDir, '.claude', 'commands', 'gitguardex.md'),
+    fs.readFileSync(path.join(repoRoot, 'templates', 'claude', 'commands', 'gitguardex.md'), 'utf8'),
+    'utf8',
+  );
+
+  fs.writeFileSync(
+    packagePath,
+    JSON.stringify(
+      {
+        name: path.basename(repoDir),
+        private: true,
+        scripts: {
+          'agent:codex': 'bash ./scripts/codex-agent.sh',
+          'agent:cleanup': 'gx cleanup',
+          'agent:branch:start': 'bash ./scripts/custom-branch-start.sh',
+          test: 'node --test',
+        },
+      },
+      null,
+      2,
+    ) + '\n',
+    'utf8',
+  );
+
+  const result = runNodeWithEnv(
+    ['migrate', '--target', repoDir, '--install-agent-skills'],
+    repoDir,
+    { GUARDEX_HOME_DIR: guardexHomeDir },
+  );
+  assert.equal(result.status, 0, result.stderr || result.stdout);
+
+  assert.equal(fs.existsSync(path.join(repoDir, 'scripts', 'install-agent-git-hooks.sh')), false);
+  assert.equal(fs.existsSync(path.join(repoDir, '.codex', 'skills', 'gitguardex', 'SKILL.md')), false);
+  assert.equal(fs.existsSync(path.join(repoDir, '.claude', 'commands', 'gitguardex.md')), false);
+
+  const migratedPackage = JSON.parse(fs.readFileSync(packagePath, 'utf8'));
+  assert.equal(migratedPackage.scripts['agent:codex'], undefined);
+  assert.equal(migratedPackage.scripts['agent:cleanup'], undefined);
+  assert.equal(migratedPackage.scripts['agent:branch:start'], 'bash ./scripts/custom-branch-start.sh');
+
+  assert.equal(fs.existsSync(path.join(guardexHomeDir, '.codex', 'skills', 'gitguardex', 'SKILL.md')), true);
+  assert.equal(fs.existsSync(path.join(guardexHomeDir, '.claude', 'commands', 'gitguardex.md')), true);
+
+  const branchStartShim = fs.readFileSync(path.join(repoDir, 'scripts', 'agent-branch-start.sh'), 'utf8');
+  assert.match(branchStartShim, /exec "\$cli_bin" 'branch' 'start' "\$@"/);
+  const preCommitShim = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
+  assert.match(preCommitShim, /exec "\$cli_bin" 'hook' 'run' 'pre-commit' "\$@"/);
 });

 test('setup --parent-workspace-view creates one-level-up VS Code workspace for repo + agent worktrees', () => {
@@ -1264,9 +1351,9 @@ exit 1
   const createdBranch = extractCreatedBranch(result.stdout);
   result = runCmd('git', ['show-ref', '--verify', '--quiet', `refs/heads/${createdBranch}`], repoDir);
-  assert.equal(result.status, 0, 'doctor auto-finish should keep sandbox branch locally by default');
+  assert.notEqual(result.status, 0, 'doctor auto-finish should clean up the merged sandbox branch locally by default');

   result = runCmd('git', ['ls-remote', '--heads', 'origin', createdBranch], repoDir);
-  assert.match(result.stdout, /refs\/heads\//, 'doctor auto-finish should push sandbox branch to origin');
+  assert.equal(result.stdout.trim(), '', 'doctor auto-finish should clean up the merged sandbox branch remotely by default');

   const rootStatus = runCmd('git', ['status', '--short', '--untracked-files=no'], repoDir);
   assert.equal(rootStatus.status, 0, rootStatus.stderr || rootStatus.stdout);
@@ -1615,7 +1702,7 @@ test('setup pre-commit detects codex commit attempts on protected main (includin
   });
   assert.notEqual(result.status, 0, result.stdout);
   assert.match(result.stderr, /\[guardex-preedit-guard\] Codex edit\/commit detected on a protected branch\./);
-  assert.match(result.stderr, /bash scripts\/codex-agent\.sh/);
+  assert.match(result.stderr, /gx branch start/);
 });

 test('setup pre-commit allows codex managed guardrail commits on protected main only for AGENTS.md/.gitignore', () => {
@@ -2356,16 +2443,6 @@ test('finish command auto-commits dirty agent worktree and runs PR finish flow f
   fs.writeFileSync(path.join(agentWorktree, 'finisher-note.txt'), 'pending branch finish\n', 'utf8');

-  const finishLog = path.join(repoDir, '.finish-invocations.log');
-  fs.writeFileSync(
-    path.join(repoDir, 'scripts', 'agent-branch-finish.sh'),
-    '#!/usr/bin/env bash\n' +
-    'set -euo pipefail\n' +
-    `printf '%s\\n' \"$*\" >> \"${finishLog}\"\n`,
-    'utf8',
-  );
-  fs.chmodSync(path.join(repoDir, 'scripts', 'agent-branch-finish.sh'), 0o755);
-
   result = runNode(
     ['finish', '--target', repoDir, '--branch', agentBranch, '--base', 'main', '--no-wait-for-merge', '--no-cleanup'],
     repoDir,
@@ -2374,13 +2451,9 @@ test('finish command auto-commits dirty agent worktree and runs PR finish flow f
   assert.match(result.stdout, new RegExp(`Finishing '${escapeRegexLiteral(agentBranch)}' -> 'main'`));
   assert.match(result.stdout, /Auto-committed/);
   assert.match(result.stdout, /Finish summary: total=1, success=1, failed=0, autoCommitted=1/);
-
-  const finishInvocations = fs.readFileSync(finishLog, 'utf8');
-  assert.match(finishInvocations, new RegExp(`--branch ${escapeRegexLiteral(agentBranch)}`));
-  assert.match(finishInvocations, /--base main/);
-  assert.match(finishInvocations, /--via-pr/);
-  assert.match(finishInvocations, /--no-wait-for-merge/);
-  assert.match(finishInvocations, /--no-cleanup/);
+  assert.equal(fs.existsSync(agentWorktree), true, 'finish --no-cleanup should keep the agent worktree');
+  let branchResult = runCmd('git', ['show-ref', '--verify', '--quiet', `refs/heads/${agentBranch}`], repoDir);
+  assert.equal(branchResult.status, 0, 'finish --no-cleanup should keep the local agent branch');

   const worktreeStatus = runCmd('git', ['status', '--short'], agentWorktree);
   assert.equal(worktreeStatus.status, 0, worktreeStatus.stderr || worktreeStatus.stdout);
@@ -3103,10 +3176,14 @@ test('post-merge auto-runs cleanup on base branch and skips non-base branches',
     "if (marker) fs.appendFileSync(marker, process.argv.slice(2).join(' ') + '\\n', 'utf8');\n",
     'utf8',
   );
-
-  let result = runCmd('bash', ['.githooks/post-merge', '0'], repoDir, {
+  const postMergeAsset = path.join(__dirname, '..', 'templates', 'githooks', 'post-merge');
+  const hookDispatchEnv = {
     GUARDEX_POST_MERGE_MARKER: markerPath,
-  });
+    GUARDEX_CLI_ENTRY: path.join(repoDir, 'bin', 'multiagent-safety.js'),
+    GUARDEX_NODE_BIN: process.execPath,
+  };
+
+  let result = runCmd('bash', [postMergeAsset, '0'], repoDir, hookDispatchEnv);
   assert.equal(result.status, 0, result.stderr || result.stdout);

   let invocations = fs
@@ -3124,9 +3201,7 @@ test('post-merge auto-runs cleanup on base branch and skips non-base branches',
   result = runCmd('git', ['checkout', '-b', 'feature/post-merge-skip'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);

-  result = runCmd('bash', ['.githooks/post-merge', '0'], repoDir, {
-    GUARDEX_POST_MERGE_MARKER: markerPath,
-  });
+  result = runCmd('bash', [postMergeAsset, '0'], repoDir, hookDispatchEnv);
   assert.equal(result.status, 0, result.stderr || result.stdout);

   invocations = fs
@@ -4202,7 +4277,7 @@ test('OpenSpec plan workspace scaffold creates expected role/task structure', ()
   assert.match(plannerTasks, /## 5\. Collaboration/);
   assert.match(plannerTasks, /## 6\. Cleanup/);
   assert.match(plannerTasks, /\[P1\] READY - Initial planning draft checkpoint/);
-  assert.match(plannerTasks, /bash scripts\/agent-branch-finish\.sh/);
+  assert.match(plannerTasks, /gx branch finish --branch --base dev --via-pr --wait-for-merge --cleanup/);

   const plannerPlan = fs.readFileSync(path.join(planDir, 'planner', 'plan.md'), 'utf8');
   assert.match(plannerPlan, /This ExecPlan is a living document/);
@@ -4406,7 +4481,7 @@ test('doctor repairs setup drift and confirms repo is safe', () => {
   assert.match(result.stdout, /Repo is fully safe/);

   const repairedHook = fs.readFileSync(path.join(repoDir, '.githooks', 'pre-commit'), 'utf8');
-  assert.match(repairedHook, /AGENTS\.md\|\.gitignore/);
+  assert.match(repairedHook, /'hook' 'run' 'pre-commit'/);
   assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'notepad.md')), true);
   assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'project-memory.json')), true);
   assert.equal(fs.existsSync(path.join(repoDir, '.omx', 'logs')), true);
@@ -4463,7 +4538,7 @@ test('doctor recurses into nested frontend repos and repairs protected-main drif
   assert.match(repairedFrontendGitignore, /^scripts\/\*$/m);
   assert.match(repairedFrontendGitignore, /^\.githooks$/m);
   const repairedFrontendHook = fs.readFileSync(path.join(frontendDir, '.githooks', 'pre-commit'), 'utf8');
-  assert.match(repairedFrontendHook, /AGENTS\.md\|\.gitignore/);
+  assert.match(repairedFrontendHook, /'hook' 'run' 'pre-commit'/);

   const frontendScanAfter = runNode(['scan', '--target', frontendDir], repoDir);
   assert.equal(frontendScanAfter.status, 0, frontendScanAfter.stderr || frontendScanAfter.stdout);
@@ -4605,8 +4680,8 @@ test('prompt outputs AI setup instructions', () => {
   assert.match(result.stdout, /GitGuardex \(gx\) setup checklist/);
   assert.match(result.stdout, /gx setup/);
   assert.match(result.stdout, /gx doctor/);
-  assert.match(result.stdout, /codex-agent\.sh/);
-  assert.match(result.stdout, /scripts\/agent-file-locks\.py claim/);
+  assert.match(result.stdout, /gx branch start/);
+  assert.match(result.stdout, /gx locks claim/);
   assert.match(result.stdout, /gx finish --all/);
   assert.match(result.stdout, /\/opsx:propose/);
   assert.match(result.stdout, /https:\/\/github\.com\/apps\/pull/);
@@ -4622,7 +4697,7 @@ test('prompt --exec outputs command-only checklist', () => {
   assert.match(result.stdout, /^gh --version/m);
   assert.match(result.stdout, /^gx setup$/m);
   assert.match(result.stdout, /^gx doctor$/m);
-  assert.match(result.stdout, /codex-agent\.sh/);
+  assert.match(result.stdout, /^gx branch start "" ""$/m);
   assert.match(result.stdout, /^gx finish --all$/m);
   assert.match(result.stdout, /^gx cleanup$/m);
   assert.doesNotMatch(result.stdout, /GitGuardex \(gx\) setup checklist/);
@@ -4746,10 +4821,6 @@ exit 1
   assert.equal(result.status, 0, result.stderr || result.stdout);
   assert.equal(fs.existsSync(marker), false, 'global install should not run');
   assert.match(result.stdout, /Companion installs skipped by user choice/);
-  assert.match(
-    result.stdout,
-    /Guardex needs oh-my-claudecode as a dependency: https:\/\/github\.com\/Yeachan-Heo\/oh-my-claudecode/,
-  );
 });

 test('setup installs missing local companion tools with explicit approval', () => {
@@ -5084,25 +5155,13 @@ test('cleanup command can remove squash-merged agent branches via merged PR dete

 test('cleanup command watch mode defaults to 60-minute idle threshold and supports one-cycle execution', () => {
   const repoDir = initRepo();
-  const scriptsDir = path.join(repoDir, 'scripts');
-  fs.mkdirSync(scriptsDir, { recursive: true });
-
-  const pruneScriptPath = path.join(scriptsDir, 'agent-worktree-prune.sh');
-  const markerArgs = path.join(repoDir, '.cleanup-watch-args');
-  fs.writeFileSync(
-    pruneScriptPath,
-    '#!/usr/bin/env bash\n' +
-    'set -euo pipefail\n' +
-    `printf '%s\\n' \"$*\" > \"${markerArgs}\"\n`,
-    'utf8',
-  );
-  fs.chmodSync(pruneScriptPath, 0o755);
+  const resultSetup = runNode(['setup', '--target', repoDir, '--no-global-install'], repoDir);
+  assert.equal(resultSetup.status, 0, resultSetup.stderr || resultSetup.stdout);
+  seedCommit(repoDir);

   const result = runNode(['cleanup', '--target', repoDir, '--watch', '--once', '--interval', '15'], repoDir);
   assert.equal(result.status, 0, result.stderr || result.stdout);
-  const passedArgs = fs.readFileSync(markerArgs, 'utf8').trim();
-  assert.match(passedArgs, /--idle-minutes 60/);
-  assert.match(passedArgs, /--only-dirty-worktrees/);
+  assert.match(result.stdout, /Cleanup watch cycle=1 \(interval=15s, idleMinutes=60, maxBranches=unbounded\)\./);
 });

 test('release fails outside the maintainer repo path', () => {