robots-patch.txt

The robots-patch.txt file contains robots.txt directives specifically for AI engine crawlers.

It consists of User-agent and Allow/Disallow groups for AI-specific crawlers. These directives tell AI engines which parts of your site they may access.

Many sites accidentally block AI crawlers through overly broad robots.txt rules. robots-patch.txt provides explicit allow rules for the major AI crawlers, ensuring your content is accessible to AI engines.
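For instance, an (illustrative) blanket rule like the following shuts out every crawler that lacks its own group, AI bots included, because a crawler only falls back to the * group when no group names it explicitly:

```
User-agent: *
Disallow: /
```

The per-crawler groups in robots-patch.txt give each AI bot its own group, so a wildcard rule like this no longer applies to them.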

Crawler          Engine
GPTBot           OpenAI (ChatGPT)
ClaudeBot        Anthropic (Claude)
PerplexityBot    Perplexity
Google-Extended  Google AI (Gemini, AI Overviews)
anthropic-ai     Anthropic (alternative)

# AI Crawler Access Rules
# Generated by AEOrank
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /
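You can sanity-check rules like these with Python's standard `urllib.robotparser`. The snippet below (the URL and the combined rules are illustrative) confirms that an explicit GPTBot group wins over a pre-existing blanket Disallow:

```python
from urllib.robotparser import RobotFileParser

# Patch rules merged with a pre-existing blanket block (illustrative)
rules = """\
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The explicit group overrides the wildcard for GPTBot...
print(rp.can_fetch("GPTBot", "https://example.com/pricing"))     # True
# ...while crawlers without their own group fall back to the wildcard
print(rp.can_fetch("RandomBot", "https://example.com/pricing"))  # False
```

The same check works against your deployed file via `rp.set_url(...)` followed by `rp.read()`.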

Append the contents to your existing robots.txt file:

cat robots-patch.txt >> public/robots.txt
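Note that a plain append is not idempotent: re-running it duplicates the rules. A guarded version only appends when the patch is missing (the demo below builds the assumed file layout in a throwaway directory):

```shell
# Work in a throwaway directory; the files mimic the layout assumed above
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p public
printf 'User-agent: *\nDisallow: /admin\n' > public/robots.txt
printf 'User-agent: GPTBot\nAllow: /\n' > robots-patch.txt

# Append only if a patch marker is not already present
grep -q 'GPTBot' public/robots.txt || cat robots-patch.txt >> public/robots.txt
grep -q 'GPTBot' public/robots.txt || cat robots-patch.txt >> public/robots.txt  # no-op on re-run

grep -c 'GPTBot' public/robots.txt  # prints 1
```

Here `GPTBot` serves as the marker; any line unique to robots-patch.txt works.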