This project improves Reddit with an AI-powered "Truth Checker" that runs directly in the browser. It is inspired by X/Twitter's "Community Notes" feature and aims to counter the rise of misinformation on public forums like Reddit.
It consists of three main pieces:
- Frontend Chrome Extension (`reddit-extension/`): injects a panel into eligible Reddit posts (r/news, r/politics, r/TheOnion) and calls the backend to display the verification results.
- Backend FastAPI Server (`src/app.py`, `src/main.py`): wraps LastMile's `mcp-agent` app, exposes a `/verify` endpoint, and orchestrates Tavily searches plus LLM reasoning.
- AI + MCP Tooling: Tavily MCP server for search, LastMile's `mcp-agent` for workflow management, and OpenAI LLM backends via augmented LLMs.
- MCP Agent: https://1eb4wbtqipdcbkqwm8ve7wtnmo9mk0wk.deployments.mcp-agent.com/
- FastAPI Backend: https://cef5c5f9c1f3.ngrok-free.app/
Reddit Community Notes is a free browser extension that automatically fact-checks posts on Reddit. It works like Twitter's Community Notes: when you visit a news post, it shows you whether the information is verified, disputed, or needs more context.
1. **Download the Extension**
   - Click the green "Code" button at the top of this page
   - Select "Download ZIP"
   - Extract the ZIP file to a folder on your computer
2. **Open Chrome Extensions**
   - Open Google Chrome
   - Type `chrome://extensions` in the address bar and press Enter
   - Turn on Developer Mode (toggle switch in the top-right corner)
3. **Load the Extension**
   - Click the "Load unpacked" button
   - Select the `reddit-extension` folder from the downloaded files
   - You're done! The extension is now active
4. **Visit Reddit**
   - Go to reddit.com
   - Navigate to one of these subreddits:
     - `r/news` - News articles and current events
     - `r/politics` - Political news and discussions
     - `r/TheOnion` - Satirical news (to see how it detects satire)
5. **View Any Post**
   - Click on any post in these subreddits
   - Scroll down below the post content
   - You'll see an "AI Truth Checker" panel appear automatically
6. **Read the Results**
   - The panel will show one of three statuses:
     - ✅ Correct (green) - The information has been verified by reputable sources
     - ❌ Not Correct (red) - The information is disputed or inaccurate
     - ⚠️ Unable to Verify (gray) - Not enough information available to verify
   - Each result includes:
     - An explanation of why it was marked that way
     - Links to news sources that support or dispute the claim
     - Publication dates to show when the information was verified
- First Time: Verification takes 10-20 seconds while the AI searches for sources
- Cached Results: If you or someone else verified the same post before, results appear instantly
- Source Links: Click any source link to read the original article
- Automatic: The extension works automatically; there are no buttons to click
Want to see it in action? Try these posts:
**The panel doesn't appear:**
- Make sure you're on `r/news`, `r/politics`, or `r/TheOnion`
- Refresh the page (F5 or Ctrl+R)
- Check that the extension is enabled in `chrome://extensions`
It says "Error" or "Unable to verify":
- This is normal for very new posts or obscure topics
- The backend might be temporarily unavailableβtry again in a few minutes
**Want to disable it?**
- Go to `chrome://extensions`
- Toggle off the "Reddit Community Notes" extension
- Tavily MCP (search/extract tools)
- LastMile's MCP-Agent
- MongoDB (`pymongo`) driver
- OpenAI LLMs
- Chrome Extension Manifest V3 (JavaScript)
- FastAPI (Python 3)
- Fetch/Web tooling for REST calls and packaging
- `uv` for dependency/env management
- Lives in `reddit-extension/`.
- Injects a Twitter "Community Notes"-style card into each Reddit post in the supported subreddits.
- Sends a POST request to the `/verify` endpoint with the post URL, title, subtext, and detected timestamp.
- Renders the structured response (verdict + sources) returned by the backend.
- Build/prepare the backend first (see next section).
- In Chrome, open `chrome://extensions`, enable Developer Mode, then click "Load unpacked".
- Select the `reddit-extension/` folder.
- Visit Reddit and open a post in a supported subreddit to see the injected panel.
- Reddit r/news
- Reddit r/TheOnion
- Reddit r/news
- The core MCP agent logic lives in `src/main.py` (adapted from LastMile's `mcp-agent` example).
- `src/app.py` wraps the MCP app with FastAPI and exposes the `/verify` endpoint.
- Uses the Tavily MCP server to fetch reputable sources and an OpenAI LLM (via `OpenAIAugmentedLLM`).
- A MongoDB caching layer (`src/db/`) stores verification results to reduce API costs; each Reddit post URL is hashed to produce a unique cache id.
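As a rough sketch of how these pieces fit together (illustrative only, not the repo's actual code; `run_verification` and the in-memory `_cache` are hypothetical stand-ins for the MCP agent call and the MongoDB layer in `src/db/`):

```python
# Minimal illustrative sketch of the cache-first /verify flow in src/app.py.
from fastapi import FastAPI
from pydantic import BaseModel

fastapi_app = FastAPI()
_cache: dict[str, dict] = {}  # stand-in for the MongoDB cache


class VerifyRequest(BaseModel):
    url: str
    title: str
    subtext: str = ""
    postDate: str = ""


async def run_verification(req: VerifyRequest) -> dict:
    # Hypothetical stub: the real code invokes the mcp-agent workflow here.
    return {"is_correct": None, "explanation": "stub", "sources": [], "status": "success"}


@fastapi_app.post("/verify")
async def verify(req: VerifyRequest) -> dict:
    if req.url in _cache:              # cache hit: return instantly
        return _cache[req.url]
    result = await run_verification(req)
    _cache[req.url] = result           # cache write (MongoDB in the real app)
    return result
```

The real implementation replaces `_cache` with the MongoDB-backed cache and runs the Tavily + LLM workflow inside the agent call.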
```
.
├── src/
│   ├── app.py               # FastAPI server with /verify endpoint
│   ├── main.py              # MCP agent with verify_content_agent tool
│   └── db/
│       ├── cache.py         # MongoDB cache for verification results
│       ├── mongodb.py       # MongoDB connection management
│       └── init_indexes.py  # Database index initialization
├── reddit-extension/        # Chrome extension frontend
│   ├── content.js           # Extension content script
│   └── manifest.json        # Extension manifest
├── mcp_agent.config.yaml    # MCP agent configuration
├── mcp_agent.secrets.yaml   # API keys (references .env)
└── pyproject.toml           # Python dependencies
```
- Python 3.11+
- uv for managing the virtual environment.
- Tavily API key and LLM provider API keys (OpenAI), stored in the secrets files / `.env`.
- MongoDB (Atlas or local): optional but recommended for caching verification results.
- Highly recommended: OpenAI's `gpt-5-mini-2025-08-07` model for efficiency and/or `gpt-5.1-2025-11-13` for best accuracy.
```bash
# Install dependencies
uv sync

# Start the FastAPI server
uv run uvicorn src.app:fastapi_app --reload --host 0.0.0.0 --port 8000

# Or use the console script entry point
uv run app
```

The server listens on http://0.0.0.0:8000. The browser extension should point there for verification requests.
Note: The extension is currently configured to use the deployed backend at `https://cef5c5f9c1f3.ngrok-free.app/verify`. To use a local server, update the URL in `reddit-extension/content.js`.
| Component | Purpose |
|---|---|
| Tavily MCP server | Performs date-bounded searches on reputable domains using `topic="news"` to filter for news sources, returning structured search results with publication dates. |
| LastMile's `mcp-agent` | Provides the MCP workflow framework, agent lifecycle, logging, and server connections. |
| OpenAI LLM | Reasons over Tavily results, enforces the JSON schema, and summarizes the verification. `OpenAIAugmentedLLM` is currently configured with the `gpt-5-mini-2025-08-07` model. |
The workflow enforces:
- Date filters aligned with the Reddit post timestamp.
- The `topic="news"` parameter in Tavily searches to prioritize recent news sources with publication dates.
- Reputable domain whitelists.
- Satire/fake-source detection (don't treat original satire articles as "proof").
- Returning multiple independent sources with descriptions.
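As an illustration of these constraints, here is a minimal sketch using the `tavily-python` client directly; the project itself routes searches through the Tavily MCP server, and the query, day window, and domain whitelist below are placeholder assumptions:

```python
# Illustrative only: a date-bounded, domain-restricted news search of the
# kind the verification workflow performs via the Tavily MCP server.
import os

from tavily import TavilyClient

client = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])

response = client.search(
    query="Senate passes budget bill",  # placeholder claim to verify
    topic="news",       # prioritize news sources with publication dates
    days=7,             # placeholder window around the post timestamp
    max_results=5,
    include_domains=["reuters.com", "apnews.com", "bbc.com"],  # example whitelist
)
for result in response["results"]:
    print(result["title"], result["url"], result.get("published_date"))
```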
Create a `.env` in the repo root, structured as follows:

```
OPENAI_API_KEY=sk-...
GOOGLE_API_KEY=AIza...
TAVILY_API_KEY=tvly-...
MONGODB_URI=mongodb+srv://username:password@cluster.mongodb.net/reddit_verifier?retryWrites=true&w=majority
```

`uv` automatically reads `.env` files when running commands. The FastAPI server and MCP config reference these environment variables.
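As a quick optional sanity check (not part of the repo), you can confirm the keys are visible to the process:

```python
# Hypothetical helper script: prints whether each key from .env is set
# in the current environment (run it via `uv run python <script>.py`).
import os

for var in ("OPENAI_API_KEY", "GOOGLE_API_KEY", "TAVILY_API_KEY", "MONGODB_URI"):
    print(f"{var}: {'set' if os.getenv(var) else 'MISSING'}")
```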
MongoDB is used to cache verification results, reducing API costs and improving response times for repeated post verifications.
- Create a free account at MongoDB Atlas
- Create a new cluster (free tier M0 is sufficient)
- Create a database user and set a password
- Add your IP address to the network access list (or use `0.0.0.0/0` for development)
- Click "Connect" → "Connect your application" → copy the connection string
- Replace `<password>` with your database user password and `<dbname>` with `reddit_verifier`
- Add the connection string to your `.env` file as `MONGODB_URI`
Example connection string format:
```
mongodb+srv://username:password@cluster.mongodb.net/reddit_verifier?retryWrites=true&w=majority
```
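To confirm the URI is reachable before starting the server, a short illustrative check with `pymongo` (not part of the repo):

```python
# Illustrative connectivity check using the MONGODB_URI from .env.
import os

from pymongo import MongoClient

client = MongoClient(os.environ["MONGODB_URI"], serverSelectionTimeoutMS=5000)
client.admin.command("ping")  # raises if the cluster is unreachable
print("Connected to database:", client.get_default_database().name)
```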
- Cache Key: SHA256 hash of normalized URL (handles URL variations)
- TTL: 30 days (results auto-expire)
- Graceful Degradation: If MongoDB is unavailable, verification still works (just without caching)
- Indexes: Automatically created on first startup (TTL index on `expires_at`, unique index on `cache_key`)
The cache significantly reduces API costs when the same Reddit posts are verified multiple times.
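A sketch of this scheme, assuming a collection named `verification_cache` (the exact code in `src/db/cache.py` and `src/db/init_indexes.py` may differ):

```python
# Illustrative sketch: SHA-256 cache key over a normalized URL, a TTL index
# on expires_at, and a unique index on cache_key, as described above.
import hashlib
import os
from datetime import datetime, timedelta, timezone

from pymongo import MongoClient

db = MongoClient(os.environ["MONGODB_URI"]).get_default_database()
cache = db["verification_cache"]  # assumed collection name

# Idempotent index setup (mirrors what init_indexes.py is described as doing).
cache.create_index("expires_at", expireAfterSeconds=0)  # expire at the stored time
cache.create_index("cache_key", unique=True)


def cache_key(url: str) -> str:
    """SHA-256 of a normalized URL, so trivial variations map to one entry."""
    normalized = url.strip().lower().rstrip("/")
    return hashlib.sha256(normalized.encode()).hexdigest()


def store_result(url: str, result: dict) -> None:
    """Upsert a verification result with a 30-day expiry."""
    key = cache_key(url)
    cache.replace_one(
        {"cache_key": key},
        {
            "cache_key": key,
            "result": result,
            "expires_at": datetime.now(timezone.utc) + timedelta(days=30),
        },
        upsert=True,
    )
```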
- Core MCP agent configuration (logger, MCP servers, agent definitions, default models).
- Update `openai.default_model` or add other provider defaults if needed.
- The Tavily server runs via `python -m mcp_server_tavily` and expects `TAVILY_API_KEY` in the environment.
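For orientation, the relevant portion of the config might look roughly like this (field names follow `mcp-agent` conventions; check the repo's actual `mcp_agent.config.yaml`):

```yaml
# Illustrative shape only, not the repo's exact file.
openai:
  default_model: gpt-5-mini-2025-08-07  # see the model recommendation above
mcp:
  servers:
    tavily:
      command: python
      args: ["-m", "mcp_server_tavily"]
      # TAVILY_API_KEY is injected via mcp_agent.secrets.yaml / .env
```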
- Stores provider API keys and MCP server env overrides.
- Structure as follows:
```yaml
openai:
  api_key: "${OPENAI_API_KEY}"
google:
  api_key: "${GOOGLE_API_KEY}"
mcp:
  servers:
    tavily:
      env:
        TAVILY_API_KEY: "${TAVILY_API_KEY}"
```

Never commit real keys. Reference environment variables via `${VAR_NAME}` and keep the `.env` file local.
- Update dependencies or code.
- Run `uv run python src/main.py` (or the specific test script) to ensure the agent still returns valid JSON with at least two sources.
- Start the FastAPI server (`uv run uvicorn src.app:fastapi_app --reload`).
- Load the Chrome extension and verify posts.
- Check the terminal logs for "LLM Raw Response" and "Parsed JSON" to debug any schema issues.
With this setup, you can quickly iterate on the Chrome UI, backend logic, or the MCP/LLM instructions to improve verification quality.
`GET /`: Health check endpoint. Returns `{"status": "ok", "message": "Reddit Content Verifier API is running"}`.
`POST /verify`: Main verification endpoint. Accepts a JSON body with:

- `url`: The URL of the Reddit post or linked article
- `title`: Post title
- `subtext`: Post body text (first 300 characters)
- `postDate`: Post timestamp (ISO format or relative time)
Returns a JSON response with:
- `is_correct`: `true`, `false`, or `null` (unable to verify)
- `explanation`: Human-readable explanation
- `sources`: Array of source objects with `source_url` and `source_description`
- `status`: Always `"success"`
The endpoint checks MongoDB cache first, and only calls the MCP agent if no cached result exists.
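An illustrative client call against a locally running backend (the post URL and field values below are placeholders):

```python
# Illustrative usage of the /verify endpoint; start the backend locally first.
import requests

payload = {
    "url": "https://www.reddit.com/r/news/comments/abc123/example_post/",  # placeholder
    "title": "Example headline",
    "subtext": "First 300 characters of the post body...",
    "postDate": "2025-01-15T12:00:00Z",
}
resp = requests.post("http://localhost:8000/verify", json=payload, timeout=60)
resp.raise_for_status()
data = resp.json()
print(data["is_correct"], data["explanation"])
for source in data["sources"]:
    print(source["source_url"], "-", source["source_description"])
```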