
[FEATURE-001]: Native Lifecycle Hooks and Slash Command Support for Cursor and Gemini CLI #27

@ascensum

Description

Feature Type

Integration with Another Tool

Problem Description

I'm frustrated when using ai-memory across different AI interfaces (Cursor IDE, Gemini CLI) because the automated memory capture (hooks) and retrieval (skills/slash commands) only work out of the box for Claude Code.

Currently, to get a "shared brain" across all my development tools, I have to:

  1. Write custom bash/python shims to translate Cursor's afterFileEdit format to Claude's PostToolUse format.
  2. Manually configure .gemini/settings.json and .cursor/hooks.json to point to the ai-memory scripts.
  3. Manually create Markdown (.cursor/commands/) and TOML (.gemini/commands/) files so that /search-memory and /memory-status appear in the chat slash-command menus.

Without these manual steps, ai-memory is effectively a "Claude Code only" tool, which limits its value for developers who switch between IDEs and CLI agents.
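To illustrate the first manual step, my current shim translates Cursor's afterFileEdit payload into the PostToolUse shape that ai-memory's Claude Code hook expects. A minimal sketch follows; the Cursor-side field names (file_path, session_id, workspace_root) are assumptions from my own setup, not documented guarantees:

```python
#!/usr/bin/env python3
"""Hypothetical shim: read a Cursor afterFileEdit event from stdin and
emit a Claude Code PostToolUse-style event for ai-memory's hook script.
Cursor-side field names here are assumptions, not a documented schema."""
import json
import sys

def translate(cursor_event: dict) -> dict:
    # Map the edited file into the tool_input block the hook script reads,
    # carrying session and cwd context across so memories stay scoped.
    return {
        "hook_event_name": "PostToolUse",
        "tool_name": "Edit",
        "tool_input": {"file_path": cursor_event.get("file_path", "")},
        "session_id": cursor_event.get("session_id", ""),
        "cwd": cursor_event.get("workspace_root", ""),
    }

if __name__ == "__main__":
    raw = sys.stdin.read()
    if raw.strip():  # ignore empty stdin (e.g. when invoked without a pipe)
        json.dump(translate(json.loads(raw)), sys.stdout)
```

A native cursor-adapter.sh could own this mapping so it tracks ai-memory's internal schema instead of every user's hand-rolled copy.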

Proposed Solution

I would like the ai-memory installer (install.sh) to natively detect and support Cursor and Gemini CLI by generating the necessary configuration and adapter files automatically.

Key components:

  1. Native Adapters: A built-in cursor-adapter.sh that handles the JSON transformation from Cursor's afterFileEdit to PostToolUse, including CWD and SessionID context.
  2. Auto-Config: The installer should detect the presence of .gemini/ and .cursor/ directories and append the appropriate hook configurations to settings.json and hooks.json.
  3. Command Templates: Generate native command definitions:
    • .cursor/commands/*.md (Markdown instructions for Cursor agents).
    • .gemini/commands/*.toml (Command definitions for Gemini CLI).
  4. Security-First Shims: Ensure all shims receive file paths via sys.argv rather than string interpolation, preventing shell-injection vulnerabilities (a lesson we learned during our custom setup in Story 3.6).
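As a sketch of the command templates in item 3, a generated .gemini/commands/search-memory.toml might look like the following. The prompt wording is illustrative; Gemini CLI's custom-command TOML uses description and prompt fields, with {{args}} standing in for whatever the user types after the command:

```toml
# .gemini/commands/search-memory.toml (would be generated by install.sh)
description = "Search the shared ai-memory store"

# {{args}} is replaced by the text the user types after /search-memory.
prompt = """
Search the project's ai-memory store for: {{args}}
Summarize the most relevant memories before answering.
"""
```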

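The security-first rule in item 4 boils down to one pattern: the shim takes the file path as a sys.argv element and forwards it as a separate argv element, never spliced into a shell string. A minimal sketch (capture_memory.sh is a hypothetical script name, not ai-memory's real one):

```python
#!/usr/bin/env python3
"""Sketch of a security-first shim: file paths travel as argv elements,
never via string interpolation into a shell command."""
import subprocess
import sys

def build_command(file_path: str) -> list:
    # An argv list (no shell=True) keeps a hostile path such as
    # "notes.md; rm -rf ~" as one literal argument instead of shell code.
    return ["./capture_memory.sh", file_path]

def main() -> None:
    if len(sys.argv) > 1:
        # The UNSAFE version this replaces would be:
        #   subprocess.run(f"./capture_memory.sh {sys.argv[1]}", shell=True)
        subprocess.run(build_command(sys.argv[1]), check=True)

if __name__ == "__main__":
    main()
```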
Alternatives Considered

I considered maintaining these adapters manually in my project repo, but that creates a "maintenance tax": every time ai-memory renames an internal script or changes its collection structures, the shims break. A native integration in ai-memory would let the engine evolve while keeping all IDE shims in sync.

Example Usage

```shell
# After running: ai-memory install /path/to/project
# I can immediately go into Cursor and type:
/search-memory "how did we implement the shadow bridge?"

# Or in Gemini CLI:
/search-memory "why is jobRunner.js frozen?"

# Both should trigger the same underlying ai-memory engine without manual setup.
```

How Important Is This Feature?

Critical - Blocking my work

Additional Context

Critical for users who want a truly "interface-agnostic" AI memory system.

Contribution

  • I would be willing to submit a PR for this feature
  • I can help with design/specification
  • I can help with testing

Pre-submission Checklist

  • I have searched existing issues and feature requests to avoid duplicates
  • This feature aligns with the project's goals (memory management for Claude Code)

Metadata

Assignees

No one assigned

    Labels

    • feature:multi-ide - Multi-IDE support (Cursor, Gemini CLI)
    • priority: medium - Should be fixed soon but not blocking
    • status: confirmed - Triaged, validated, ready for work

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
