Ready-to-use templates for making your SaaS product discoverable by AI agents. Includes an agent-card.json template (A2A Protocol), llms.txt template, AGENTS.md, robots.txt AI directives, and JSON-LD structured data — deploy in 30 minutes.
AI agents discover tools through protocols like A2A and MCP, not Google searches. If your product doesn't have an agent-card.json or llms.txt, you're invisible to the fastest-growing segment of software users.
Almost nobody has deployed an agent-card.json yet. This is a first-mover advantage — deploy now and be discoverable when agent crawlers start indexing.
The numbers: 97M+ monthly MCP SDK downloads. 5,800+ MCP servers. 40% of enterprise apps will embed agents by end of 2026 (Gartner). The $7.8B agent market is projected to hit $52B by 2030.
This kit gives you every file and template you need to become agent-ready.
| Template | What It Is | Deploy To |
|---|---|---|
| `agent-card.json` | A2A Protocol agent card template — fill in your product's skills, auth, and capabilities | `yourdomain.com/.well-known/agent-card.json` |
| `llms.txt` | llms.txt template — AI-readable product description with pricing | `yourdomain.com/llms.txt` |
| `AGENTS.md` | Agent capability description for GitHub repos | Root of your GitHub repo |
| `robots-ai-directives.txt` | AI crawler permissions (GPTBot, ClaudeBot, etc.) | Merge into `yourdomain.com/robots.txt` |
| `structured-data/*.json` | JSON-LD schema markup (Organization, SoftwareApplication, Product, FAQ) | Embed in your HTML `<head>` |
| `checklist.md` | 100-point agent readiness scoring checklist | Your reference |
The llms.txt file is the most important agent discoverability file. It's a plain text file that tells AI systems what your product does, how it's priced, and how to use it.
- Open `templates/llms.txt`
- Replace all `[bracketed]` fields with your product information
- Delete the instructions section at the bottom
- Upload to your web server root: `https://yourdomain.com/llms.txt`
- Test: `curl https://yourdomain.com/llms.txt`
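For orientation, here is a minimal sketch of a filled-in llms.txt. The product, URLs, and prices are invented placeholders, and llmstxt.org defines the canonical structure:

```
# ExampleApp

> ExampleApp is a scheduling API for small teams.

## Pricing

- Free: 100 bookings/month
- Pro: $29/month, unlimited bookings

## Capabilities

- Create, reschedule, and cancel bookings via REST API
- Webhook notifications on booking changes

## Limitations

- No on-premise deployment
- Rate limit: 60 requests/minute

## Docs

- [API reference](https://example.com/docs)
```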
Tips:
- Write as if explaining to an AI agent making a recommendation, not a human browsing
- Include pricing — agents making purchase decisions need this
- Be specific about capabilities and limitations
- Update when features or pricing change
Reference: llmstxt.org
The A2A (Agent-to-Agent) Protocol is an open standard for agent discovery, created by Google and now governed by the Linux Foundation. The agent-card.json file tells other agents what your product can do, how to authenticate, and how to interact.
This is the biggest opportunity in the kit. Almost nobody has deployed these yet, so being early means maximum visibility when agent crawlers start indexing.
- Open `templates/agent-card.json`
- Fill in your product details:
  - `name`, `description`, `url` — basic identity
  - `provider` — your company information
  - `capabilities` — does your API support streaming? Push notifications?
  - `authentication` — how do agents authenticate with your product?
  - `skills` — map each of your product's key capabilities as a skill
- Delete ALL fields starting with `_instructions` and `_deployment_instructions`
- Validate your JSON at jsonlint.com
- Create the `.well-known` directory on your web server
- Upload to `https://yourdomain.com/.well-known/agent-card.json`
- Test: `curl https://yourdomain.com/.well-known/agent-card.json`
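For orientation, a minimal sketch of a filled-in card. All values are invented placeholders, and the field names below follow the steps above loosely; the kit's template and the A2A spec are authoritative for the exact schema:

```json
{
  "name": "ExampleApp",
  "description": "Scheduling API for small teams",
  "url": "https://example.com",
  "provider": {
    "organization": "Example Inc",
    "url": "https://example.com"
  },
  "capabilities": {
    "streaming": true,
    "pushNotifications": false
  },
  "authentication": {
    "schemes": ["bearer"]
  },
  "skills": [
    {
      "id": "create-booking",
      "name": "Create booking",
      "description": "Create, reschedule, or cancel a booking via the REST API",
      "tags": ["scheduling", "calendar"]
    }
  ]
}
```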
Important: The `.well-known` directory must be served by your web server. On most platforms:

- Vercel: Add a `public/.well-known/` directory
- Netlify: Add a `_redirects` rule or `public/.well-known/`
- Nginx: Ensure `location /.well-known/` is configured (see the sketch after this list)
- Apache: Works by default if the directory exists in your web root
- GitHub Pages: Create `.well-known/` in your repo root (note: GitHub Pages may require a `.nojekyll` file)
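For the Nginx case, a minimal sketch of the relevant location block (the web root path is a placeholder; adjust to your setup):

```nginx
# Minimal sketch (placeholder paths): serve /.well-known/ as static files
# so that /.well-known/agent-card.json resolves directly from disk.
location /.well-known/ {
    root /var/www/yourdomain;  # placeholder web root
    try_files $uri =404;       # return the file if it exists, else 404
}
```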
Reference: A2A Protocol
AGENTS.md is a human+agent readable file that describes your product's capabilities. It goes in the root of your GitHub repository.
- Open `templates/AGENTS.md`
- Fill in your product details
- Be specific about what your product CAN and CANNOT do
- Include authentication details and pricing
- Delete the instructions section
- Commit to the root of your GitHub repository
Tips:
- Write the "What I Cannot Do" section honestly — otherwise agents waste tokens probing for capabilities you don't have
- Include example interactions — this helps agents understand how to use your product
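Putting the tips together, a skeleton of what a filled-in AGENTS.md might look like (all names, endpoints, and prices are invented placeholders):

```markdown
# ExampleApp

Scheduling API for small teams.

## What I Can Do
- Create, reschedule, and cancel bookings via REST API
- Send webhook notifications on booking changes

## What I Cannot Do
- No on-premise deployment
- No sync with self-hosted CalDAV servers

## Authentication
Bearer token in the `Authorization` header. Get a key at https://example.com/keys

## Pricing
Free: 100 bookings/month. Pro: $29/month, unlimited.

## Example Interaction
Request:  POST /v1/bookings {"team": "eng", "start": "2025-06-01T10:00Z"}
Response: {"id": "bk_123", "status": "confirmed"}
```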
Most websites already have a robots.txt. You need to add explicit AI crawler directives.
- Open `templates/robots-ai-directives.txt`
- Choose Option A (allow all), B (allow with restrictions), or C (selective)
- Replace `yourdomain.com` with your actual domain
- Add these directives to your existing `robots.txt` — don't replace the whole file
- Make sure your `Sitemap:` URL is correct
Key AI crawlers to allow:
- `GPTBot` — OpenAI
- `ClaudeBot` — Anthropic
- `Google-Extended` — Google AI
- `Bingbot` — Microsoft/Copilot
- `PerplexityBot` — Perplexity AI
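For the "allow all" option, the merged directives might look like this (the sitemap URL is a placeholder):

```
# Allow major AI crawlers full access
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Bingbot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```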
JSON-LD structured data helps search engines AND AI systems understand your product. Embed these in your HTML `<head>` section.
- Choose the templates that apply to your product:
  - `organization.json` — Every company should have this
  - `software-application.json` — For SaaS products and web apps
  - `product.json` — For physical products or digital goods
  - `faq.json` — For pages with FAQ content
- Fill in your details in each template
- Remove any `_instructions` fields
- Embed in your HTML:
```html
<head>
  <!-- Other meta tags -->
  <script type="application/ld+json">
  { paste your organization.json content here }
  </script>
  <script type="application/ld+json">
  { paste your software-application.json content here }
  </script>
</head>
```

- Validate at Google's Rich Results Test
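For reference, a minimal filled-in Organization example (all values are placeholders; schema.org defines the vocabulary):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Inc",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://github.com/example",
    "https://www.linkedin.com/company/example"
  ]
}
```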
Run through the checklist in checklist.md to make sure nothing was missed.
Quick verification commands:
```bash
# Check llms.txt
curl -s https://yourdomain.com/llms.txt | head -5

# Check agent-card.json
curl -s https://yourdomain.com/.well-known/agent-card.json | python3 -m json.tool

# Check robots.txt
curl -s https://yourdomain.com/robots.txt | grep -i "gptbot\|claudebot\|sitemap"
```

After deploying these files:
- Immediate: Your site becomes crawlable by AI systems. Structured data starts appearing in Google's knowledge graph.
- Days 1-7: AI crawlers index your llms.txt and structured data. Your product starts appearing in AI-generated recommendations.
- Weeks 2-4: A2A Protocol crawlers discover your agent-card.json. Other agents can now find and interact with your product programmatically.
- Ongoing: Keep files updated as your product evolves. Add new skills to agent-card.json as you ship features.
This kit gets you to "agent-discoverable." If you want to go further:
- Agent Readiness Scanner — Automated scoring and agent-card.json generation: Conduit
- MCP Integration — Make your API callable by AI agents via Model Context Protocol
- Managed Agent Infrastructure — Hosted agent-card.json, managed MCP endpoint, usage billing
Created by Conduit — infrastructure for the agent economy.