Technical SEO
Robots.txt for the AI Era
Your robots.txt is probably outdated. Here is the 2026 version that handles AI crawlers correctly.
robots.txt has gone essentially unchanged for 30 years. The web has not. AI crawlers (GPTBot, Claude-User, PerplexityBot, OAI-SearchBot, Meta-ExternalAgent, ClaudeBot, ChatGPT-User, and dozens more) now visit B2B sites regularly. Most robots.txt files do not mention them, so each bot falls back to its own default behavior, which usually means crawling whatever your wildcard rules do not block.
The strategic question is whether to allow AI crawlers or block them. Blocking is rare and usually reserved for genuinely IP-sensitive content; for most B2B companies, the answer is to allow with intent, because you want to be discoverable. Explicitly allow the major AI crawlers in robots.txt so they know they have permission, and so your llms.txt and other machine surfaces get fetched.
The 2026 robots.txt template includes explicit allow directives for OAI-SearchBot, GPTBot, ClaudeBot, Claude-User, PerplexityBot, Meta-ExternalAgent, and any other AI agents you want to engage with; a Sitemap pointer to your sitemap.xml; a pointer to your llms.txt; and Disallow directives only for genuinely sensitive paths (admin and account areas).
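A minimal sketch of what that file can look like. The domain and the /admin/ and /account/ paths are placeholders, and robots.txt has no standard directive for llms.txt yet, so that pointer travels as a comment:

```
# AI crawlers we explicitly welcome. A bot that matches a named group
# ignores the * group, so repeat any sensitive-path rules here too.
User-agent: OAI-SearchBot
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: Claude-User
User-agent: PerplexityBot
User-agent: Meta-ExternalAgent
Allow: /
Disallow: /admin/
Disallow: /account/

# Everyone else
User-agent: *
Disallow: /admin/
Disallow: /account/

# Machine surfaces
Sitemap: https://example.com/sitemap.xml
# No standard llms.txt directive exists; a comment pointer documents it:
# llms.txt: https://example.com/llms.txt
```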
Audit your robots.txt this quarter. If it has not been updated since 2023, it almost certainly does not handle the current AI crawler ecosystem correctly. The 30-minute update is one of the higher-leverage GEO investments you can make.
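If you want the audit to be more than an eyeball check, a short script can report which AI user agents your live file actually addresses. A sketch using only the Python standard library; the domain and crawler list are assumptions to adjust for your own stack:

```python
from urllib import request, robotparser

SITE = "https://example.com"  # assumption: replace with your own domain
AI_CRAWLERS = [
    "OAI-SearchBot", "GPTBot", "ChatGPT-User",
    "ClaudeBot", "Claude-User",
    "PerplexityBot", "Meta-ExternalAgent",
]

# Fetch the raw file once so we can report which bots are explicitly named.
robots_url = f"{SITE}/robots.txt"
raw = request.urlopen(robots_url).read().decode("utf-8", errors="replace")

# The standard-library parser answers "would this bot be allowed to crawl /?"
parser = robotparser.RobotFileParser(robots_url)
parser.parse(raw.splitlines())

for bot in AI_CRAWLERS:
    named = "named explicitly" if bot.lower() in raw.lower() else "default rules only"
    allowed = parser.can_fetch(bot, f"{SITE}/")
    print(f"{bot:20} allowed={allowed}  ({named})")
```

Any crawler showing "default rules only" is one your file does not address, which is exactly the gap this update closes.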