The Cost You Cannot See
When you block GPTBot, ClaudeBot, or PerplexityBot in your robots.txt, nothing visibly breaks. Your site works fine. Your Google rankings do not change. Your analytics dashboard looks the same. Everything appears normal.
That is exactly the problem. The cost of blocking AI crawlers is invisible – it shows up as opportunities that never happen rather than things that stop working. You cannot measure revenue from recommendations that never occurred. You cannot track traffic from citations that were never made.
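For concreteness, a full block of those three crawlers is only a few lines of robots.txt, which is part of why it feels so low-stakes:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
```

Nothing in this file changes how your pages render or rank; the only thing it changes is who is allowed to read them.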
The Three Hidden Costs
1. Lost Product Recommendations
Every day, millions of people ask AI assistants for product recommendations. "What is the best air purifier for a small apartment?" "Which running shoes are best for beginners?" "Recommend a CRM for a 10-person team."
If your products are blocked from AI crawlers, they cannot be recommended in these conversations. Your competitors who allow AI crawler access get recommended instead. This is not a theoretical future – it is happening right now, millions of times per day.
The math is straightforward. If 30% of product research involves an AI assistant, and your products are invisible to all of them, you are missing roughly 30% of your potential discovery surface. That percentage is growing every quarter.
2. Lost Citation Traffic
When AI assistants cite sources, they drive traffic. ChatGPT, Claude, and Perplexity all include links to source material. Sites that appear in these citations see measurable referral traffic – and this traffic converts well because users arriving from an AI recommendation have high purchase intent.
Botjar data shows that AI referral traffic converts at roughly 1.4x the rate of organic search traffic. Users who arrive via an AI recommendation have already been told your product is good. They are further along the buying journey.
Block AI crawlers and this entire traffic channel goes to zero.
3. Stale AI Knowledge
Even if your products were in the training data before you blocked AI crawlers, that data ages out. AI models are retrained periodically, and blocked sites get dropped from new training runs. Over time, the AI's knowledge of your products becomes outdated – wrong prices, discontinued products, missing new launches.
Worse, the AI may keep mentioning your brand but with inaccurate details, creating a poor customer experience when someone arrives at your site expecting something different from what the AI described.
Quantifying the Cost
Let us put rough numbers on this for a mid-size ecommerce site doing $5M in annual revenue:
- AI referral traffic opportunity: 3-8% of total traffic, growing 40% year-over-year
- Average conversion rate from AI referrals: 1.4x your organic search rate
- Estimated annual revenue from AI referrals (if allowed): $150,000-$400,000
- Cost of blocking: $0 in direct costs, $150,000-$400,000 in missed opportunity
These numbers are conservative estimates based on current AI adoption rates. As AI-powered shopping grows, the opportunity cost of blocking increases proportionally.
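The $150,000-$400,000 figure is the conservative end of a very simple model. Here is a minimal sketch, assuming AI referral revenue scales with its share of traffic; every input is illustrative rather than measured:

```python
# Rough opportunity-cost model for a site doing $5M in annual revenue.
# Every input below is an illustrative assumption, not a measured value.
annual_revenue = 5_000_000          # USD per year
ai_traffic_share = (0.03, 0.08)     # AI referrals as a share of total traffic
conversion_multiplier = 1.4         # AI referral conversion vs. organic search

for share in ai_traffic_share:
    # Conservative case: AI referral revenue simply scales with traffic share.
    low = annual_revenue * share
    # Upper case: credit the higher conversion rate of AI referrals.
    high = low * conversion_multiplier
    print(f"{share:.0%} traffic share: ${low:,.0f} to ${high:,.0f} per year")

# 3% traffic share: $150,000 to $210,000 per year
# 8% traffic share: $400,000 to $560,000 per year
```

The conversion multiplier only pushes the range upward, which is why the quoted figure counts as conservative.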
The Server Cost Argument
The most common justification for blocking AI crawlers is server load. And it is true – AI crawlers consume resources. But let us put this in perspective:
- GPTBot typically generates 100-500 requests per day on a mid-size site
- At an average page size of 200KB, that is 20-100MB of bandwidth per day
- At typical cloud hosting rates, this costs $1-5 per month (worked through in the sketch below)
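A minimal sketch of that arithmetic, with every input taken from the rough ranges above rather than measured on your own infrastructure:

```python
# Rough cost of allowing one AI crawler, using the figures cited above.
# All inputs are assumptions taken from the ranges in this section.
requests_per_day = (100, 500)     # typical GPTBot volume on a mid-size site
avg_page_kb = 200                 # average page weight
hosting_cost_per_month = (1, 5)   # USD, rough estimate at typical cloud rates

for req, cost in zip(requests_per_day, hosting_cost_per_month):
    mb_per_day = req * avg_page_kb / 1024
    print(f"{req} req/day -> {mb_per_day:.0f} MB/day, about ${cost * 12}/year")

# 100 req/day -> 20 MB/day, about $12/year
# 500 req/day -> 98 MB/day, about $60/year
```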
You are potentially sacrificing $150,000 or more in annual revenue opportunity to save, at most, about $60 per year in hosting costs. The math does not work.
When Blocking Actually Makes Sense
There are legitimate reasons to block AI crawlers, but they are narrower than most teams think:
- Paywalled content publishers – if your revenue comes from content subscriptions, AI training on your content directly undermines your business model
- Legal or compliance requirements – some industries have data handling regulations that conflict with AI training
- Genuinely limited infrastructure – if you are running on a single server and every request matters, reducing any non-essential traffic makes sense
For the vast majority of ecommerce businesses, none of these apply. Your products are public, your pages are already indexed by search engines, and your servers can handle a few hundred extra requests per day.
What to Do Instead
Rather than blocking AI crawlers, optimize for them:
- Allow access to product and content pages while blocking sensitive areas (see the robots.txt sketch after this list)
- Improve your schema markup so crawlers understand your products
- Ensure fast server response times for bot requests
- Monitor crawler behavior to catch issues early
- Track your AI Visibility Score to measure improvement
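For the first item, here is a sketch of what a selective robots.txt can look like. The paths are placeholders, so substitute your own checkout, account, and admin routes, and repeat the group for each crawler you want to allow:

```
# Example paths only: adjust to your own site structure.
User-agent: GPTBot
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /admin/
Allow: /
```

Everything not explicitly disallowed stays crawlable, so product and content pages remain visible while the sensitive areas stay out of reach.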
Calculate your hidden cost. Botjar shows you exactly which AI crawlers are blocked, which pages they cannot access, and the estimated revenue opportunity you are missing. Get your free bot audit →