
Cloudflare report shows AI bots driving traffic surge


Cloudflare says global Internet traffic surged again, and the reason isn’t subtle. AI bots now sit at the centre of how the web behaves, how fast it feels, and how much noise site owners must absorb. The company’s latest snapshot reads less like a trend report and more like a rules update for 2025.

According to Cloudflare, Internet requests across its network rose about 19% year over year. That growth isn’t evenly spread.

Automated systems, model-training crawlers, and large-scale scraping account for a growing share, making networks noisier and harder to manage.

AI crawling changed shape fast. Bots built for training large models generated seven to eight times more traffic than other AI bot types.

In some cases, they produced up to 25 times more traffic than bots triggered by direct user actions. For site operators, it can feel relentless. Logs fill up. Bandwidth disappears. Pages slow down.

Cloudflare describes 2025 as an era of bot-on-bot competition. A small number of players pull content at industrial scale.

  • Meta’s llama-3-8b-instruct ranked as the most widely observed model on Cloudflare’s network, appearing across more than three times as many customer accounts as models from other vendors.
  • Googlebot still dominates overall and continues to out-crawl almost everyone else combined.
  • OpenAI’s GPTBot and newer entrants showed sharp spikes, then drop-offs. Predictability isn’t part of the deal.

One bright signal cuts through the noise. About 52% of human web traffic now runs under post-quantum encryption.

That level of adoption marks one of the first large-scale deployments of cryptography designed to withstand future quantum attacks. It’s mostly invisible to users, but it changes the math for defenders and attackers alike.

According to Beinsure analysts, this shift raises baseline security expectations across hosting, payments, and identity systems.

Attack patterns shifted too. Civil society groups and non-profits became the most targeted sector on Cloudflare’s network. The reason looks straightforward.

These organisations store sensitive donor and beneficiary data, often without enterprise-grade security budgets. The impact goes beyond downtime. Breaches hit trust, funding, and people already exposed to risk.

Cloudflare also logged a rough year for availability. More than 25 distributed denial-of-service (DDoS) attacks exceeded previous traffic records. Massive scraping waves stacked on top.

Nearly half of serious outages traced back to government actions rather than physical failures. Power-related disruptions doubled. Politics, not just cables, now shape uptime.

Speed tells a different story. Europe still leads on quality and performance. Several markets averaged download speeds above 200 Mbps, and Spain topped Cloudflare’s quality rankings. Infrastructure choices still matter, even as bot pressure rises.

For anyone running a website, the takeaways stay practical. Expect more automated traffic. Expect sharper DDoS peaks. Expect AI crawlers that don’t politely knock.

Make sure CDN and DDoS protections are actually switched on, not just licensed. Check cryptographic settings and confirm support for post-quantum or other modern ciphers.
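In practice, the crypto check starts with confirming the server negotiates TLS 1.3, since hybrid post-quantum key exchange only runs over TLS 1.3 and depends on the underlying crypto library. A minimal nginx sketch under that assumption (the hostname and certificate paths are placeholders; with OpenSSL 3.5+ on the host, hybrid groups such as X25519MLKEM768 are offered automatically):

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # placeholder

    # TLS 1.3 is a prerequisite for hybrid post-quantum key exchange;
    # whether a PQ group is actually negotiated depends on the crypto
    # library (e.g. OpenSSL 3.5+) and the connecting client.
    ssl_protocols TLSv1.3;

    ssl_certificate     /etc/ssl/certs/example.pem;   # placeholder path
    ssl_certificate_key /etc/ssl/private/example.key; # placeholder path
}
```

The same check applies behind a CDN: if the edge provider terminates TLS, post-quantum support is their setting to confirm, not the origin's.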

Content owners face another decision. Manage crawlers aggressively, or monetise access. Cloudflare already offers tools aimed at controlling or charging for large-scale crawling. Robots.txt alone won’t cut it anymore. Clear policies, API limits, and traffic controls start to look like table stakes.
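A crawler policy still starts with robots.txt, even though it is advisory and non-compliant scrapers ignore it. A minimal sketch, assuming an operator who wants to refuse model-training crawlers while leaving ordinary search indexing alone (GPTBot and CCBot are published training-crawler user agents; any other names or paths would be site-specific):

```text
# robots.txt — advisory only: compliant crawlers honour it, scrapers may not

User-agent: GPTBot   # OpenAI's training crawler
Disallow: /

User-agent: CCBot    # Common Crawl
Disallow: /

User-agent: *        # everyone else, including search crawlers
Allow: /
```

Because this file is voluntary, it pairs with enforced controls, which is what "traffic controls" means in practice here: rate limits at the edge, bot-management rules in the CDN, and authentication or metering on any API the crawlers hit.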

For shoppers and everyday users, the changes surface as slower pages on some sites, faster ones on others, and more frequent friction around access. The web still works. It just feels busier. And maybe a little more hostile.