Track AI traffic and analytics for any site.
That means both:
- AI crawlers/bots hitting your site (GPTBot, ClaudeBot, PerplexityBot, Bytespider, etc.)
- Human visits + conversions referred by AI tools (ChatGPT, Perplexity, Gemini, Copilot, Claude, DeepSeek, etc.)
The goal: answer "What did AI send me?" and "What did AI crawl?" with hard numbers.
```
packages/
  registry/          AI bot + referrer classification data + matching functions
  referral-snippet/  Drop-in <script> tag for tracking AI-referred human traffic
  log-parser/        CLI to parse nginx/CloudFront logs and classify bot traffic
  server-collector/  Express server: /ingest + /metrics + shortlink redirects
  collector-config/  Docker Compose stack: Grafana + Prometheus + Tempo + OTel Collector
dashboards/
  grafana/           Pre-built Grafana dashboard with 15+ panels
examples/
  basic/             Full end-to-end demo (Docker Compose + demo page + seed script)
```
```sh
cd examples/basic
docker compose up --build
```

Then:

- Open http://localhost:8080 — click buttons to simulate AI traffic
- Open http://localhost:3000 — Grafana dashboard (admin/admin)
- Run `./seed.sh 100` to bulk-seed 100 events
```sh
npm install
npm run build
npm test
```

To run the collector in dev mode:

```sh
npm run dev:server
# Listening on http://localhost:3456
```

Maintained registry of 30+ AI bots and 19 AI referrer sources. Exposes classification functions:
```ts
import { classifyBot, classifyReferrer } from "@llm-telemetry/registry";

classifyBot("GPTBot/1.0");
// { isBot: true, name: "gptbot", operator: "OpenAI", purpose: "Training data collection..." }

classifyReferrer("https://chatgpt.com/");
// { isAIReferrer: true, name: "chatgpt", operator: "OpenAI" }
```

Drop-in `<script>` tag that detects AI referral traffic and beacons events to your endpoint:
```html
<script src="https://cdn.example.com/snippet.js"
        data-endpoint="https://yoursite.com/api/ingest"
        data-site-id="my-site">
</script>
```

Emits `ai_pageview` on load. Exposes `__llmTelemetry.trackConversion(name, value)` for conversions.
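Under the hood, referral detection boils down to matching `document.referrer` against known AI hosts. A minimal sketch of that idea — the host-to-source map below is a hypothetical subset, not the registry's actual data (the real list covers 19 referrer sources and `classifyReferrer` does the real matching):

```typescript
// Illustrative sketch only, not the snippet's actual code.
// The host map is a hypothetical subset of the registry's referrer list.
const aiReferrerHosts: Record<string, string> = {
  "chatgpt.com": "chatgpt",
  "perplexity.ai": "perplexity",
  "gemini.google.com": "gemini",
};

function detectAIReferrer(referrer: string): string | null {
  if (!referrer) return null; // direct visits have no referrer
  try {
    const host = new URL(referrer).hostname.replace(/^www\./, "");
    return aiReferrerHosts[host] ?? null;
  } catch {
    return null; // malformed referrer string
  }
}
```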
CLI tool to parse server logs and produce AI bot traffic aggregates:
```sh
npx llm-log-parser parse access.log --format nginx --output csv
npx llm-log-parser parse cloudfront.log --format cloudfront --output json --bots-only
```

Express server with Prometheus metrics:
- `POST /ingest` -- receive beacon events from the snippet
- `GET /events` -- query stored events
- `GET /r/:code` -- redirect shortlinks with first-party cookie + UTMs
- `POST /shortlinks` -- create shortlinks
- `GET /metrics` -- Prometheus scrape endpoint
- `GET /health` -- health check
Supports memory (default) and SQLite storage backends.
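The `/r/:code` redirect exists so attribution survives even when the AI tool strips the referrer: the server tags the destination URL before issuing the redirect. A sketch of that idea — the UTM parameter choices below are illustrative assumptions, not the server's exact scheme:

```typescript
// Sketch: decorate a shortlink destination with UTM parameters before the
// 302 redirect, so the landing page can attribute the visit.
// The utm_medium value and campaign mapping are assumed conventions.
function buildRedirectUrl(destination: string, code: string, source: string): string {
  const url = new URL(destination);
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", "ai_referral");
  url.searchParams.set("utm_campaign", code);
  return url.toString();
}
```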
| Metric | Type | Labels | Description |
|---|---|---|---|
| `llmt_ingest_events_total` | Counter | `event_type`, `source`, `operator`, `site_id` | Total ingest events received |
| `llmt_ingest_bytes_total` | Counter | | Payload bytes received |
| `llmt_shortlink_clicks_total` | Counter | `code`, `utm_source` | Shortlink redirect clicks |
| `llmt_shortlinks_created_total` | Counter | | Shortlinks created |
| `llmt_http_requests_total` | Counter | `method`, `route`, `status_code` | HTTP requests |
| `llmt_http_request_duration_seconds` | Histogram | `method`, `route` | Request latency |
| `llmt_events_stored_total` | Gauge | | Events currently in storage |
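As an example, a dashboard panel charting ingest rate by AI source can use a standard PromQL query over the counters above (metric and label names come from the table; the 5-minute window is just a typical choice):

```promql
sum by (source) (rate(llmt_ingest_events_total[5m]))
```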
Docker Compose stack for observability:
```sh
cd packages/collector-config
docker compose up
```

- Grafana: http://localhost:3000 (admin/admin)
- Prometheus: http://localhost:9090
- Tempo: http://localhost:3200
The pre-built dashboard (dashboards/grafana/llm-telemetry-overview.json) includes:
- Overview row — Total events, unique sources, shortlink clicks, storage gauge
- AI Traffic by Source — Time series of events by source + operator donut chart
- Event Breakdown — Top sources bar gauge, event type pie chart, conversion counter
- Shortlinks — Click time series + per-code performance bars
- Collector Health — HTTP request rate, p50/p95/p99 latency, error rate, payload volume
Some AI tools strip referrers. For high-confidence attribution, we support:

- UTM conventions
- Optional redirect/shortlink endpoint (`/r/:code`)
- Server-side log correlation
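When the referrer is gone but the link carried UTM parameters, the source can still be recovered from the landing URL. A minimal sketch, assuming a `utm_medium=ai_referral` convention (an illustrative choice, not a mandated value):

```typescript
// Sketch: fall back to UTM parameters for attribution when document.referrer
// is empty. The "ai_referral" medium value is an assumed convention.
function aiSourceFromUtm(landingUrl: string): string | null {
  const params = new URL(landingUrl).searchParams;
  const source = params.get("utm_source");
  const medium = params.get("utm_medium");
  return medium === "ai_referral" && source ? source : null;
}
```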
To add a new bot or referrer:
- Add the entry to `packages/registry/ai-bots.json` or `packages/registry/ai-referrers.json`
- Add a test case in `packages/registry/__tests__/registry.test.ts`
- Run `npm test` to verify
- Submit a PR
MIT