A Claude Code skill that optimizes title tags and meta descriptions for SEO at scale.
Give it a website URL or CSV export and it will:
- Crawl your site — automatically extract all current titles and meta descriptions via sitemap or link crawling
- Fix title tags — shorten to ≤60 characters using grammar-aware truncation
- Fix meta descriptions — expand short ones and trim long ones to the 120-160 character sweet spot
- Detect grammar issues — trailing commas, dangling prepositions, unclosed parentheses, incomplete clauses
- Eliminate duplicates — find and differentiate pages sharing identical titles or metas
- Proper-case tech terms — PostgreSQL, AWS, Kubernetes, etc.
- Generate from context — create titles/metas for pages that have none, using URL path analysis
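As an illustration of the proper-casing step, here is a minimal sketch: a lookup table of canonical spellings applied case-insensitively. The term list and matching rules in the actual skill may differ; the three terms below are just the ones named above.

```python
import re

# Hypothetical sketch of tech-term proper-casing: map lowercase forms
# to their canonical spellings and rewrite each word that matches.
CANONICAL = {"postgresql": "PostgreSQL", "aws": "AWS", "kubernetes": "Kubernetes"}

def proper_case(text: str) -> str:
    def fix(match: re.Match) -> str:
        word = match.group(0)
        # Fall back to the original word when it's not a known tech term.
        return CANONICAL.get(word.lower(), word)
    return re.sub(r"[A-Za-z]+", fix, text)

print(proper_case("Deploying postgresql on aws with kubernetes"))
# → Deploying PostgreSQL on AWS with Kubernetes
```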
Clone the repo and copy the skill files into your Claude Code commands directory:
Global install (available in all projects):
```shell
git clone https://github.com/djforge/seo-meta-optimizer.git
mkdir -p ~/.claude/commands
cp seo-meta-optimizer/skills/optimize-meta-tags/SKILL.md ~/.claude/commands/optimize-meta-tags.md
cat seo-meta-optimizer/skills/optimize-meta-tags/reference.md >> ~/.claude/commands/optimize-meta-tags.md
```

Per-project install (available only in a specific project):
```shell
git clone https://github.com/djforge/seo-meta-optimizer.git
mkdir -p .claude/commands
cp seo-meta-optimizer/skills/optimize-meta-tags/SKILL.md .claude/commands/optimize-meta-tags.md
cat seo-meta-optimizer/skills/optimize-meta-tags/reference.md >> .claude/commands/optimize-meta-tags.md
```

After installing, restart Claude Code. The /optimize-meta-tags command will be available.
From a website URL (crawls the site automatically):

```shell
/optimize-meta-tags https://example.com
```

Or from a CSV export:

```shell
/optimize-meta-tags path/to/your-meta-tags.csv
```
The skill will ask for:
- Your brand name (e.g., "Acme Corp") for the title suffix
- A one-line brand description for generating meta descriptions
- Your website URL (if not already provided — fetched for brand voice and positioning context)
- Any audit CSVs (nice-to-have — Ahrefs, Screaming Frog, etc. for organic traffic data and pre-flagged issues)
The skill outputs a CSV file with the following columns:
| Column | Description |
|---|---|
| URL | Page URL |
| Page_Type | website, blog, docs, learn, etc. |
| Organic_Traffic | From audit data (0 if unavailable) |
| Current_Title | Original title tag |
| Optimized_Title | New title tag |
| Title_Changed | YES or NO |
| Current_Meta | Original meta description |
| Optimized_Meta | New meta description |
| Meta_Changed | YES or NO |
Plus an optional per-section split (website, blog, docs, learn) in a by_section/ subfolder.
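The output CSV works with standard tooling. As a minimal sketch, here is how you might filter it down to rows where the title actually changed; the column names come from the table above, while the sample data is made up for illustration:

```python
import csv
import io

# Illustrative sample matching the documented output columns.
sample = io.StringIO(
    "URL,Page_Type,Organic_Traffic,Current_Title,Optimized_Title,Title_Changed,"
    "Current_Meta,Optimized_Meta,Meta_Changed\n"
    "https://example.com/a,blog,120,Old Title,New Title | Acme,YES,old,new,YES\n"
    "https://example.com/b,docs,0,Kept Title,Kept Title,NO,old,old,NO\n"
)

# Keep only the rows where the optimizer rewrote the title.
changed = [row for row in csv.DictReader(sample) if row["Title_Changed"] == "YES"]
for row in changed:
    print(row["URL"], "->", row["Optimized_Title"])
# prints: https://example.com/a -> New Title | Acme
```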
Every run ends with a validation report targeting 0 issues across:
- Title length violations (>60 chars)
- Meta length violations (<120 or >160 chars)
- Grammar issues (30+ checks)
- Duplicate titles
- Duplicate meta descriptions
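The length and duplicate checks above can be sketched as follows (the 30+ grammar checks are not reproduced here, and the `url`/`title`/`meta` field names are assumptions for illustration — thresholds follow the text):

```python
from collections import Counter

def validate(rows):
    """Return (url, issue) pairs for the length and duplicate checks."""
    issues = []
    # Count occurrences so duplicates can be flagged on every page involved.
    titles = Counter(r["title"] for r in rows)
    metas = Counter(r["meta"] for r in rows)
    for r in rows:
        if len(r["title"]) > 60:
            issues.append((r["url"], "title > 60 chars"))
        if not 120 <= len(r["meta"]) <= 160:
            issues.append((r["url"], "meta outside 120-160 chars"))
        if titles[r["title"]] > 1:
            issues.append((r["url"], "duplicate title"))
        if metas[r["meta"]] > 1:
            issues.append((r["url"], "duplicate meta"))
    return issues
```

A clean run is simply one where `validate` returns an empty list.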
The optimizer uses a strategy chain for titles — trying progressively more aggressive approaches until one fits:
1. Full title + brand suffix
2. Apply shortenings (remove "Understanding", "Introducing", etc.) + brand
3. Grammar-aware truncation + brand
4. Drop brand suffix entirely
5. Unpack/remove parentheticals
6. Aggressive truncation at first colon or dash
Every truncation point is validated against grammar rules — it will never create a title ending with "and", "for", "the", a trailing comma, or an unclosed parenthesis.
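A minimal sketch of that grammar-aware truncation step: cut at a word boundary under the limit, then back off word by word while the ending violates a rule (trailing stop-word, trailing comma, unclosed parenthesis). The real skill's rule set is much larger; the stop-word list here is illustrative.

```python
# Illustrative subset of words a title must not end with.
BAD_ENDINGS = {"and", "for", "the", "of", "to", "with", "in", "on"}

def grammar_aware_truncate(title: str, limit: int = 60) -> str:
    if len(title) <= limit:
        return title
    # Cut at the last whole word that fits within the limit.
    words = title[:limit].rsplit(" ", 1)[0].split(" ")
    while words:
        candidate = " ".join(words).rstrip(",")
        # Accept only endings that pass the grammar rules.
        if (words[-1].lower() not in BAD_ENDINGS
                and candidate.count("(") == candidate.count(")")):
            return candidate
        words.pop()
    return title[:limit]

print(grammar_aware_truncate(
    "Understanding Connection Pooling for PostgreSQL and the Cloud", 45))
# → Understanding Connection Pooling
```

Note how the naive cut would have ended on "for"; the back-off loop drops it and returns a title that ends on a complete noun phrase.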
License: MIT