
LOG-mcp

Stop guessing which AI model to use. This MCP server builds a dataset of your preferences each time you choose between draft responses. It learns from your actual choices, not from general benchmarks.

Live URL: https://log-mcp.casey-digennaro.workers.dev License: MIT • Runtime: Cloudflare Workers • Dependencies: 0

Why This Exists

Public model rankings often don't reflect your specific needs. This server learns your preferences directly from the choices you make while working, helping it route future prompts to the model you'd likely choose.

Quick Start

To create your own instance:

  1. Fork this repository.
  2. Deploy to Cloudflare Workers using the one-click button in your fork.
  3. Add your model API keys in the Worker's environment variables.

For local development:

git clone https://github.com/your-username/log-mcp
cd log-mcp
cp .env.example .env
# Add your API keys to the .env file
npm run dev

How It Works

When you submit a prompt, LOG-mcp generates draft responses from each configured model. You select the best one. Each choice trains your private preference profile. Over time, it begins routing prompts directly to the model you would have selected. All choice data remains within your Worker.
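The loop above can be sketched in a few lines. This is a hypothetical illustration of the idea, not LOG-mcp's actual types or routing code: each choice is recorded, wins are tallied per model, and routing picks the model with the most wins.

```typescript
// Illustrative shapes only; the real server's schema is not shown in this README.
type Choice = { prompt: string; chosenModel: string; rejectedModels: string[] };

// Tally how often each model's draft was chosen.
function tallyWins(choices: Choice[]): Map<string, number> {
  const wins = new Map<string, number>();
  for (const c of choices) {
    wins.set(c.chosenModel, (wins.get(c.chosenModel) ?? 0) + 1);
  }
  return wins;
}

// Route to the model with the most recorded wins; undefined until data exists.
function routeModel(choices: Choice[]): string | undefined {
  let best: string | undefined;
  let bestWins = 0;
  for (const [model, n] of tallyWins(choices)) {
    if (n > bestWins) {
      best = model;
      bestWins = n;
    }
  }
  return best;
}
```

A real router would weight recent choices more heavily and account for prompt similarity, but the core signal is the same: your selections, nothing else.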

Features

  • Preference Learning: Routes prompts based on your past choices.
  • PII Stripping: Removes identifiable information (names, emails, phone numbers) by default before sending to model APIs.
  • Semantic Caching: Skips regeneration for semantically similar, previously answered prompts.
  • Training Data Export: Export your preference pairs in a DPO-style format suitable for LoRA fine-tuning.
  • Multi-Model Support: Configure endpoints for OpenAI, Anthropic, Google, and open-weight models.
  • MCP Compliance: Works with any client supporting the Model Context Protocol.
  • Zero Dependencies: A single, self-contained script that deploys quickly.
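To make the export feature concrete, here is a sketch of how one recorded choice could expand into DPO preference pairs. The `{prompt, chosen, rejected}` field names follow the common DPO convention; they are an assumption, not LOG-mcp's confirmed export schema.

```typescript
// Map of model name -> that model's draft response (illustrative).
type Drafts = Record<string, string>;

// One choice among N drafts yields N-1 preference pairs:
// the chosen draft paired against each rejected one.
function toDpoPairs(prompt: string, drafts: Drafts, chosenModel: string) {
  const chosen = drafts[chosenModel];
  return Object.entries(drafts)
    .filter(([model]) => model !== chosenModel)
    .map(([, rejected]) => ({ prompt, chosen, rejected }));
}
```

Expanding each choice this way is a standard trick for squeezing more training signal out of multi-way comparisons.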

Limitations

The system requires explicit choice data to learn. If you rarely select between drafts, it cannot build an effective routing profile and will continue to show all model outputs.

Architecture

LOG-mcp runs statelessly on Cloudflare Workers. All preference data is stored in a Cloudflare KV namespace. There are no external databases or background processes to manage.
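The stateless pattern this implies is read-modify-write against KV on each request. A minimal sketch, assuming a hypothetical key layout (`prefs:<userId>`) and record shape; `KVLike` mirrors only the subset of Cloudflare's `KVNamespace` interface used here:

```typescript
// Subset of Cloudflare's KVNamespace needed for this sketch.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

// Stateless update: read the user's win counts, bump one, write back.
// Key and record shapes are illustrative, not the server's actual schema.
async function recordWin(kv: KVLike, userId: string, model: string): Promise<void> {
  const key = `prefs:${userId}`;
  const wins: Record<string, number> = JSON.parse((await kv.get(key)) ?? "{}");
  wins[model] = (wins[model] ?? 0) + 1;
  await kv.put(key, JSON.stringify(wins));
}
```

Because every request rebuilds its state from KV, Workers can scale out freely with nothing to coordinate, at the cost of KV's eventual consistency for rapid back-to-back writes.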

