Description
Feature Type
Integration with Another Tool
Problem Description
I'm frustrated when using ai-memory across different AI interfaces (Cursor IDE, Gemini CLI) because the automated memory capture (hooks) and retrieval (skills/slash commands) only work out of the box for Claude Code.
Currently, to get a "shared brain" across all my development tools, I have to:
- Write custom bash/python shims to translate Cursor's afterFileEdit format to Claude's PostToolUse format.
- Manually configure .gemini/settings.json and .cursor/hooks.json to point to the ai-memory scripts.
- Manually create Markdown (.cursor/commands/) and TOML (.gemini/commands/) files so that /search-memory and /memory-status appear in the chat slash-command menus.
Without these manual steps, ai-memory is effectively a "Claude Code only" tool, which limits its value for developers who switch between IDEs and CLI agents.
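To make the first manual step concrete, here is a minimal sketch of the kind of shim I currently maintain by hand. The payload field names (`filePath`, `workspaceRoot`, `sessionId`) and the target keys are illustrative assumptions from my own setup, not a documented schema for either tool:

```python
"""Hypothetical shim: translate a Cursor afterFileEdit event into the
PostToolUse shape that ai-memory's Claude Code hook consumes.
All field names here are assumptions, not a published contract."""
import json
import sys


def cursor_to_post_tool_use(event: dict) -> dict:
    # Map the (assumed) Cursor payload onto the (assumed) Claude hook payload,
    # carrying the CWD and session ID context along with the edited file.
    return {
        "hook_event_name": "PostToolUse",
        "tool_name": "Edit",
        "tool_input": {"file_path": event.get("filePath", "")},
        "cwd": event.get("workspaceRoot", ""),
        "session_id": event.get("sessionId", ""),
    }


def main() -> None:
    # Read the Cursor event from stdin, emit the translated event on stdout.
    print(json.dumps(cursor_to_post_tool_use(json.load(sys.stdin))))
```

A native `cursor-adapter.sh` (or equivalent) shipped with ai-memory would replace exactly this kind of per-project glue.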
Proposed Solution
I would like the ai-memory installer (install.sh) to natively detect and support Cursor and Gemini CLI by generating the necessary configuration and adapter files automatically.
Key components:
- Native Adapters: A built-in cursor-adapter.sh that handles the JSON transformation from Cursor's afterFileEdit to PostToolUse, including CWD and SessionID context.
- Auto-Config: The installer should detect the presence of .gemini/ and .cursor/ directories and append the appropriate hook configurations to settings.json and hooks.json.
- Command Templates: Generate native command definitions:
- .cursor/commands/*.md (Markdown instructions for Cursor agents).
- .gemini/commands/*.toml (Command definitions for Gemini CLI).
- Security-First Shims: Ensure all shims use sys.argv for file paths instead of string interpolation to prevent shell injection vulnerabilities (learned this during our custom setup in Story 3.6).
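The sys.argv point is worth illustrating. A sketch of the safe pattern follows; the hook script name `./memory-hook.sh` is hypothetical, the point is that the file path travels as a discrete argv element rather than being spliced into a shell string:

```python
"""Illustrative argv-based shim pattern (hypothetical script name).
The shell never parses the file path, so metacharacters in a
filename such as ';' or '$(...)' are inert."""
import subprocess
import sys

HOOK_SCRIPT = "./memory-hook.sh"  # hypothetical; stands in for the real hook


def build_hook_argv(file_path: str) -> list:
    # Safe: the path is one argv element, never interpolated into a command string.
    return [HOOK_SCRIPT, file_path]


def capture(file_path: str) -> None:
    subprocess.run(build_hook_argv(file_path), check=True)
    # Anti-pattern this feature request wants installers to avoid:
    #   subprocess.run(f"{HOOK_SCRIPT} {file_path}", shell=True)  # injection risk


if __name__ == "__main__" and len(sys.argv) > 1:
    capture(sys.argv[1])
```

With the list form, a hostile filename like `notes; rm -rf ~` stays a single literal argument instead of becoming a second shell command.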
Alternatives Considered
I considered maintaining these adapters manually in my project repo, but that creates a "maintenance tax" whenever ai-memory updates its internal script names or collection structures. A native integration in ai-memory would allow the engine to evolve while keeping all IDE shims in sync.
Example Usage
```shell
# After running: ai-memory install /path/to/project
# I can immediately go into Cursor and type:
/search-memory "how did we implement the shadow bridge?"

# Or in Gemini CLI:
/search-memory "why is jobRunner.js frozen?"

# Both should trigger the same underlying ai-memory engine without manual setup.
```
How Important Is This Feature?
Critical - Blocking my work
Additional Context
Critical for users who want a truly "interface-agnostic" AI memory system.
Contribution
- I would be willing to submit a PR for this feature
- I can help with design/specification
- I can help with testing
Pre-submission Checklist
- I have searched existing issues and feature requests to avoid duplicates
- This feature aligns with the project's goals (memory management for Claude Code)