robots-patch.txt
The robots-patch.txt file contains robots.txt directives specifically for AI engine crawlers.
What it is
A set of User-agent and Allow/Disallow rules for AI-specific crawlers. These directives tell AI engines which parts of your site they can access.
Why it matters
Many sites accidentally block AI crawlers through overly broad robots.txt rules. robots-patch.txt provides explicit allow rules for the major AI crawlers, ensuring your content is accessible to AI engines.
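As a quick illustration of how a blanket rule blocks AI crawlers, the sketch below uses Python's standard urllib.robotparser against a hypothetical overly broad robots.txt (the example.com URL is a placeholder):

```python
from urllib import robotparser

# A hypothetical overly broad robots.txt: one blanket rule for all crawlers.
broad_rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(broad_rules.splitlines())

# GPTBot has no group of its own, so it falls back to the wildcard
# group and is blocked site-wide.
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))  # False
```

Because nothing in the file mentions GPTBot explicitly, the wildcard Disallow applies to it just like any other crawler; this is the failure mode the patch file is meant to prevent.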
AI crawlers covered

| Crawler | Engine |
|---|---|
| GPTBot | OpenAI (ChatGPT) |
| ClaudeBot | Anthropic (Claude) |
| PerplexityBot | Perplexity |
| Google-Extended | Google AI (Gemini, AI Overviews) |
| anthropic-ai | Anthropic (alternative) |
Example output

# AI Crawler Access Rules
# Generated by AEOrank

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /

How to deploy
Append the contents to your existing robots.txt file:
cat robots-patch.txt >> public/robots.txt
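After appending, you can sanity-check the result by parsing the merged rules with Python's standard urllib.robotparser and confirming the AI crawlers are allowed. This is a minimal sketch: the inline rules mirror the example output above, and the example.com URL is a placeholder for your own site.

```python
from urllib import robotparser

# Rules mirroring the patch file's example output: an explicit Allow
# group for each AI crawler.
patched = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(patched.splitlines())

for agent in ("GPTBot", "ClaudeBot", "PerplexityBot"):
    # Each crawler should be permitted to fetch any path.
    print(agent, rp.can_fetch(agent, "https://example.com/any/page"))
```

In a real deployment you would read public/robots.txt from disk (or fetch it with RobotFileParser.set_url plus read) rather than inlining the rules, and check that your existing wildcard rules did not slip a Disallow above the appended groups.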