
Tiny MCP server for AI assisted decomp #39

Draft
GRAnimated wants to merge 2 commits into open-ead:master from GRAnimated:mcp

Conversation

Contributor

@GRAnimated commented Feb 22, 2026

We can save output tokens for LLMs by exposing tools for them to call. A tool call is almost the same as a bash call, but it is more standardized and works the same across coding models.

Requires the `mcp` Python package to use.
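A minimal sketch of the idea, not this PR's actual implementation: the server registers named tools, and the model invokes them with JSON arguments instead of free-form shell commands, so every coding model speaks the same protocol. The registry, the `diff_asm` tool name, and its output format below are all hypothetical stand-ins for whatever decomp helpers the real server exposes via the `mcp` package.

```python
import json
from typing import Callable, Dict

# Hypothetical stand-in for an MCP tool registry. The real `mcp` Python
# package exposes a similar decorator-based API for registering tools.
TOOLS: Dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register a function as a callable tool, keyed by its name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def diff_asm(func_name: str) -> str:
    """Hypothetical decomp helper: report match status for a function.

    A real server would shell out to the project's diff tooling here.
    """
    return f"{func_name}: 0 mismatched instructions"

def handle_call(request_json: str) -> str:
    """Dispatch a standardized tool call, as an MCP server would."""
    request = json.loads(request_json)
    result = TOOLS[request["name"]](**request["arguments"])
    return json.dumps({"result": result})

print(handle_call('{"name": "diff_asm", "arguments": {"func_name": "main"}}'))
```

The payoff is on the model side: instead of emitting a full bash invocation and parsing raw stdout, the model sends a small structured request and gets back only the fields it asked for, which is where the output-token savings come from.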


@GRAnimated GRAnimated marked this pull request as draft February 23, 2026 05:25
