trssantos/agent-readiness-kit
# Agent Readiness Kit — agent-card.json Template, llms.txt Template & More

Ready-to-use templates for making your SaaS product discoverable by AI agents. Includes an agent-card.json template (A2A Protocol), llms.txt template, AGENTS.md, robots.txt AI directives, and JSON-LD structured data — deploy in 30 minutes.


## Why This Matters

AI agents discover tools through protocols like A2A and MCP, not Google searches. If your product doesn't have an agent-card.json or llms.txt, you're invisible to the fastest-growing segment of software users.

Almost nobody has deployed an agent-card.json yet. This is a first-mover advantage — deploy now and be discoverable when agent crawlers start indexing.

The numbers: 97M+ monthly MCP SDK downloads. 5,800+ MCP servers. Gartner projects that 40% of enterprise apps will embed agents by the end of 2026. The agent market, $7.8B today, is projected to reach $52B by 2030.

This kit gives you every file and template you need to become agent-ready.


## Templates Included

| Template | What It Is | Deploy To |
|---|---|---|
| `agent-card.json` | A2A Protocol agent card template — fill in your product's skills, auth, and capabilities | `yourdomain.com/.well-known/agent-card.json` |
| `llms.txt` | llms.txt template — AI-readable product description with pricing | `yourdomain.com/llms.txt` |
| `AGENTS.md` | Agent capability description for GitHub repos | Root of your GitHub repo |
| `robots-ai-directives.txt` | AI crawler permissions (GPTBot, ClaudeBot, etc.) | Merge into `yourdomain.com/robots.txt` |
| `structured-data/*.json` | JSON-LD schema markup (Organization, SoftwareApplication, Product, FAQ) | Embed in your HTML `<head>` |
| `checklist.md` | 100-point agent readiness scoring checklist | Your reference |

## Step-by-Step Guide

### Step 1: llms.txt (5 minutes)

The llms.txt file is the most important agent discoverability file. It's a plain text file that tells AI systems what your product does, how it's priced, and how to use it.

  1. Open templates/llms.txt
  2. Replace all [bracketed] fields with your product information
  3. Delete the instructions section at the bottom
  4. Upload to your web server at the root: https://yourdomain.com/llms.txt
  5. Test: curl https://yourdomain.com/llms.txt

Tips:

  • Write as if explaining to an AI agent making a recommendation, not a human browsing
  • Include pricing — agents making purchase decisions need this
  • Be specific about capabilities and limitations
  • Update when features or pricing change

Reference: llmstxt.org

### Step 2: agent-card.json — A2A Protocol (10 minutes)

The A2A (Agent-to-Agent) Protocol is a Google + Linux Foundation standard for agent discovery. The agent-card.json file tells other agents what your product can do, how to authenticate, and how to interact.

This is the biggest opportunity in the kit. Almost nobody has deployed these yet, so being early means maximum visibility when agent crawlers start indexing.

  1. Open templates/agent-card.json
  2. Fill in your product details:
    • name, description, url — basic identity
    • provider — your company information
    • capabilities — does your API support streaming? Push notifications?
    • authentication — how do agents authenticate with your product?
    • skills — map each of your product's key capabilities as a skill
  3. Delete ALL fields starting with _instructions and _deployment_instructions
  4. Validate your JSON at jsonlint.com
  5. Create the .well-known directory on your web server
  6. Upload: https://yourdomain.com/.well-known/agent-card.json
  7. Test: curl https://yourdomain.com/.well-known/agent-card.json
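A minimal filled-in card might look like this sketch (all values are placeholders; the template file and the A2A specification are authoritative on exact field names and required fields):

```json
{
  "name": "ExampleApp Agent",
  "description": "Schedules meetings via the ExampleApp API.",
  "url": "https://example.com/api/a2a",
  "provider": { "organization": "Example Inc.", "url": "https://example.com" },
  "version": "1.0.0",
  "capabilities": { "streaming": true, "pushNotifications": false },
  "skills": [
    {
      "id": "schedule-meeting",
      "name": "Schedule a meeting",
      "description": "Creates a calendar event for the given attendees and time.",
      "tags": ["calendar", "scheduling"]
    }
  ]
}
```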

Important: The .well-known directory must be served by your web server. On most platforms:

  • Vercel: Add a public/.well-known/ directory
  • Netlify: Add a _redirects rule or public/.well-known/
  • Nginx: Ensure location /.well-known/ is configured
  • Apache: Works by default if the directory exists in your web root
  • GitHub Pages: Create .well-known/ in your repo root (note: GitHub Pages may require a .nojekyll file)
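For the Nginx case above, a minimal configuration sketch might look like this (the root path is a placeholder; adjust it to your deployment):

```nginx
# Serve static files under /.well-known/ so agent crawlers can fetch the card
location /.well-known/ {
    root /var/www/yourdomain;
    try_files $uri =404;
}
```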

Reference: A2A Protocol

### Step 3: AGENTS.md (5 minutes)

AGENTS.md is a file readable by both humans and agents that describes your product's capabilities. It goes in the root of your GitHub repository.

  1. Open templates/AGENTS.md
  2. Fill in your product details
  3. Be specific about what your product CAN and CANNOT do
  4. Include authentication details and pricing
  5. Delete the instructions section
  6. Commit to the root of your GitHub repository

Tips:

  • Write the "What I Cannot Do" section honestly — otherwise agents waste tokens trying to invoke capabilities you don't have
  • Include example interactions — this helps agents understand how to use your product
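A skeleton of the sections the steps above describe might look like this (product name, capabilities, and pricing are placeholders; the template file defines the actual layout):

```markdown
# ExampleApp

## What I Can Do
- Create, update, and cancel meetings via the REST API

## What I Cannot Do
- Send email on your behalf
- Access calendars outside a connected workspace

## Authentication
API key via the `Authorization: Bearer` header. Plans start at $9/user/month.
```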

### Step 4: robots.txt AI Directives (2 minutes)

Most websites already have a robots.txt. You need to add explicit AI crawler directives.

  1. Open templates/robots-ai-directives.txt
  2. Choose Option A (allow all), B (allow with restrictions), or C (selective)
  3. Replace yourdomain.com with your actual domain
  4. Add these directives to your existing robots.txt — don't replace the whole file
  5. Make sure your Sitemap: URL is correct

Key AI crawlers to allow:

  • GPTBot — OpenAI
  • ClaudeBot — Anthropic
  • Google-Extended — Google AI
  • Bingbot — Microsoft/Copilot
  • PerplexityBot — Perplexity AI
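Allowing the crawlers listed above looks roughly like this in robots.txt (an Option A "allow all" sketch; the template file's exact wording may differ, and the Sitemap URL is a placeholder):

```txt
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```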

### Step 5: Structured Data (10 minutes)

JSON-LD structured data helps search engines AND AI systems understand your product. Embed these in your HTML <head> section.

  1. Choose the templates that apply to your product:
    • organization.json — Every company should have this
    • software-application.json — For SaaS products and web apps
    • product.json — For physical products or digital goods
    • faq.json — For pages with FAQ content
  2. Fill in your details in each template
  3. Remove any _instructions fields
  4. Embed in your HTML:

```html
<head>
  <!-- Other meta tags -->
  <script type="application/ld+json">
    { paste your organization.json content here }
  </script>
  <script type="application/ld+json">
    { paste your software-application.json content here }
  </script>
</head>
```

  5. Validate at Google's Rich Results Test
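As a point of reference, a filled-in organization.json might look like this sketch (the company name and URLs are placeholders; `@context` and `@type` follow schema.org):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Inc.",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": ["https://github.com/example"]
}
```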

### Step 6: Verify Everything (3 minutes)

Run through the checklist in checklist.md to make sure nothing was missed.

Quick verification commands:

```sh
# Check llms.txt
curl -s https://yourdomain.com/llms.txt | head -5

# Check agent-card.json
curl -s https://yourdomain.com/.well-known/agent-card.json | python3 -m json.tool

# Check robots.txt
curl -s https://yourdomain.com/robots.txt | grep -i "gptbot\|claudebot\|sitemap"
```
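The curl checks above confirm the files are reachable. As a complementary local check before upload, this hedged Python sketch catches the two most common template mistakes: leftover `_instructions` fields and missing basics. The `REQUIRED_FIELDS` set is an assumption for illustration; the A2A specification is authoritative on which fields are actually required.

```python
import json

# Assumed minimal field set for illustration; consult the A2A spec for the real list.
REQUIRED_FIELDS = {"name", "description", "url", "skills"}

def check_agent_card(raw: str) -> list[str]:
    """Return a list of problems found in an agent-card.json payload."""
    problems = []
    try:
        card = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    if not isinstance(card, dict):
        return ["top-level value is not an object"]
    # Flag required fields that were never filled in
    missing = REQUIRED_FIELDS - card.keys()
    problems += [f"missing field: {f}" for f in sorted(missing)]
    # Flag template instruction fields that should have been deleted
    leftovers = [k for k in card if k.startswith("_")]
    problems += [f"template field not removed: {k}" for k in leftovers]
    return problems

sample = '{"name": "Example", "_instructions": "delete me"}'
for problem in check_agent_card(sample):
    print(problem)
```

Run it against your card with `check_agent_card(open("agent-card.json").read())` before uploading.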

## What Happens Next

After deploying these files:

  1. Immediate: Your site becomes crawlable by AI systems. Structured data starts appearing in Google's knowledge graph.
  2. Days 1-7: AI crawlers index your llms.txt and structured data. Your product starts appearing in AI-generated recommendations.
  3. Weeks 2-4: A2A Protocol crawlers discover your agent-card.json. Other agents can now find and interact with your product programmatically.
  4. Ongoing: Keep files updated as your product evolves. Add new skills to agent-card.json as you ship features.

## Need More?

This kit gets you to "agent-discoverable." If you want to go further:

  • Agent Readiness Scanner — Automated scoring and agent-card.json generation: Conduit
  • MCP Integration — Make your API callable by AI agents via Model Context Protocol
  • Managed Agent Infrastructure — Hosted agent-card.json, managed MCP endpoint, usage billing

## Keywords

agent-card.json, agent-card.json template, a2a protocol, llms.txt, llms.txt template, agent readiness, ai agent discovery, mcp, agents.md, structured data, json-ld, ai crawler, gptbot, claudebot


Created by Conduit — infrastructure for the agent economy.
