WP Engine’s 2025 Website Traffic Trends Report
Key takeaways:
- AI crawlers are draining resources. AI crawlers can consume up to 70% of a site’s most costly resources, spanning hosting, environment, and performance budgets. 76% of this bot activity comes from “unverified” bots, creating significant security and cost risks.
- Passive monitoring is no longer sufficient; active traffic governance is mandatory. Web teams must adopt “Intelligent Traffic Management,” including edge security tools (like GES) to filter bot traffic before it reaches the server.
- Security is a performance accelerator. Data confirms that security and speed are now linked. Sites served exclusively over Hypertext Transfer Protocol Secure (HTTPS) are 1–5 seconds faster in Largest Contentful Paint (LCP) than sites using older protocols.
- The global “speed gap” persists. Despite the availability of optimization tools, nearly 50% of the top 10 million sites still do not use a Content Delivery Network (CDN).
- You can download the full report and get a BONUS checklist of essential steps for future-proofing your digital experiences here.
The internet is undergoing its most significant shift since the rise of mobile. Artificial intelligence (AI), particularly large language models (LLMs) and automated agents, now shapes website traffic trends, infrastructure demands, and digital strategy. The WP Engine 2025 Website Traffic Trends Report shows that non-human traffic is rapidly reshaping how the web operates. Security and performance remain top concerns, but the fast rise of AI-driven bot activity now influences decisions about hosting, costs, and even plugin adoption.
Drawing on proprietary first-party data, plus insights from the Google Chrome User Experience Report (CrUX) and Cloudflare Bot Management, this report assesses how the web is performing across North America, Europe, Asia-Pacific, and the Middle East, identifying clear actions web teams should take to remain competitive.
Three pillars for 2026: Intelligent traffic management, security maturity, and performance parity
1. Intelligent traffic management
Automated, non-human activity now accounts for a large and growing share of all web requests. We have seen AI crawlers consuming as much as 70% of a site’s most costly resources across hosting, environment, and performance. This makes proactive traffic management not only a performance booster, but also a financial and security necessity.
Globally, bot traffic averages ~30% (Cloudflare), but distribution is uneven. The U.S. sees far more AI and bot activity than any other region, while Europe, APAC, and the Middle East continue to trend upward.
Not all bots are the same. Verified bots (such as GoogleBot, GPTBot, and Meta-ExternalAds) self-identify and are generally benign. But unverified bots either don’t identify themselves or can’t pass platform security checks, and they make up the majority of bot traffic. Across WP Engine-hosted sites, unverified bots account for 76% of all bot activity, creating additional risk and resource strain.
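The “verified” distinction above typically rests on a double reverse-DNS check, the method Google documents for confirming Googlebot: reverse-resolve the client IP, check that the hostname belongs to the crawler’s published domain, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch (the domain suffix list is illustrative, and the lookup functions are injectable so the check can be exercised without network access):

```python
import socket

# Domains whose operators document reverse-DNS verification for their
# crawlers; an illustrative list, not an exhaustive one.
VERIFIED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_bot(ip, reverse_lookup=None, forward_lookup=None):
    """Double reverse-DNS check: IP -> hostname -> back to the same IP."""
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        hostname = reverse_lookup(ip)
        if not hostname.endswith(VERIFIED_SUFFIXES):
            return False
        # Forward-confirm so a spoofed PTR record alone cannot pass the check.
        return forward_lookup(hostname) == ip
    except OSError:
        return False  # unresolvable IPs are treated as unverified
```

A bot that fails this check (or sends no identifiable user agent at all) falls into the “unverified” bucket that makes up the 76% figure above.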
2. Security maturity
Security and speed now reinforce each other. Our data shows that organizations using powerful add-ons like Global Edge Security (GES) benefit from stronger protection and faster performance due to attack filtering and traffic management at the edge.
However, a clear security divide has emerged. Organizations with 10+ employees or 100+ domains have near-universal adoption of HTTPS and MFA/2FA. Small and solo site owners lag 25% behind, primarily due to limited resources, manual workflows, and the belief they’re “too small to be a target.” The opposite is true: smaller sites are often the victims of cyberattacks.
HTTPS adoption is also becoming a performance factor. Sites served exclusively over HTTPS are 1–5 seconds faster in Largest Contentful Paint (LCP), showing encryption is now a speed enabler thanks to modern protocols and better bot handling.
3. Global performance parity
The global “speed gap” persists. North America remains the fastest region, while Europe, the Middle East, and APAC lag behind. This is largely due to lower adoption of CDNs and edge caching.
Despite longstanding evidence that CDNs improve LCP by ~20%, nearly 50% of the top 10 million sites tracked by Google CrUX still do not use one. Multi-region traffic further slows down sites, especially when bot traffic artificially inflates cross-border requests.
Mobile speed continues to trail desktop globally, even as mobile becomes the dominant traffic source. This widening gap creates major engagement and SEO consequences.
Bot management and edge optimization are now mandatory
Only 38% of the customers we analyzed used a dedicated solution for bot mitigation, security, and performance, such as GES. This leaves significant performance and security gains unrealized. Web teams must shift from passive monitoring to active bot mitigation by:
- Factoring human-to-bot ratios into hosting and bandwidth planning.
- Using edge security tools like GES to filter AI bots, DDoS patterns, and unverified crawlers.
- Adopting llms.txt to govern how AI crawlers interact with site content, and using crawler directives to let sites opt out of training use.
- Reducing unexpected regional traffic with better firewalling and fingerprint-based bot mitigation.
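As a starting point for the opt-out step above, the documented mechanism for declining AI training crawls is a robots.txt directive; the user-agent tokens below are the ones OpenAI (GPTBot) and Common Crawl (CCBot) publish for their crawlers. A minimal sketch:

```
# robots.txt — opt specific AI crawlers out of content collection.
# User-agent tokens are those published by the crawler operators.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

By contrast, llms.txt is a proposed convention for a curated markdown index at /llms.txt that tells AI agents what a site is about and which pages are worth reading; the two files complement each other, with robots.txt restricting access and llms.txt guiding it.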
Security strategies for different team sizes
Depending on the size of your team, there are a few key strategies you can implement to prepare your site for the agentic web.
For small organizations, automation is key.
- Enforce MFA/2FA.
- Choose managed hosting for WordPress®[1] that includes automated updates and vulnerability scanning.
- Offload routine maintenance to your hosting provider to reduce risk.
Agencies and larger teams should adopt DevSecOps workflows.
- Enforce MFA across all environments.
- Integrate security scans into CI/CD pipelines.
- Validate backups regularly.
- Standardize security expectations for every project and deployment.
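For teams integrating scans into CI/CD, a pipeline step as small as a dependency audit catches known vulnerabilities before deployment. A minimal sketch of a GitHub Actions workflow, assuming a Composer-managed WordPress project (the file path and job names are placeholders):

```
# .github/workflows/security.yml — illustrative only
name: security-scan
on: [push, pull_request]
jobs:
  audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check dependencies for known vulnerabilities
        run: composer audit --locked
```

The same pattern extends to plugin checksum verification and scheduled backup validation, so every deployment carries the standardized security expectations described above.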
Closing the global performance gap
To ensure competitive performance across regions and devices, web teams must:
- Adopt CDNs and edge caching as foundational infrastructure.
- Prioritize ongoing mobile performance improvements.
- Consistently optimize page weight and static requests, targeting a total payload under 400KB and fewer than five static requests per page.
- Localize assets: bringing content physically closer to users reduces latency and improves load times.
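The payload and request budgets above are easy to enforce mechanically. A minimal sketch of a budget check that a team might run against page-audit data (the function name and report format are illustrative, not part of any particular tool):

```python
# Budgets from the report: total payload under 400 KB and fewer than
# five static requests per page.
MAX_PAYLOAD_BYTES = 400 * 1024
MAX_STATIC_REQUESTS = 5

def audit_page(payload_bytes, static_requests):
    """Return a list of budget violations for one page (empty list = pass)."""
    issues = []
    if payload_bytes > MAX_PAYLOAD_BYTES:
        issues.append(f"payload {payload_bytes / 1024:.0f} KB exceeds 400 KB budget")
    if static_requests >= MAX_STATIC_REQUESTS:
        issues.append(f"{static_requests} static requests (target is fewer than 5)")
    return issues
```

Wired into a CI step or a scheduled crawl, a check like this turns the “speed gap” from a quarterly report finding into a per-deployment gate.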
Plugin usage patterns indicate different regional maturity levels. More established markets tend to rely on backend and SEO-focused tools such as ACF PRO, Gravity Forms, and Yoast, while emerging markets gravitate toward visual builders like Elementor and Genesis Blocks. Together, these trends reveal differing priorities around scalability, development practices, and performance optimization across the global web ecosystem.
The mandate for modernization
The message from the data is clear: Traditional approaches are becoming less effective under the weight of AI-driven traffic and global performance expectations. Organizations that modernize their infrastructure, secure every layer, and actively manage both human and non-human traffic will define the next era of high-performing digital experiences.
The time for passive monitoring is over. To stay competitive, teams must take action now. You can download the full report and get a BONUS checklist of essential steps for future-proofing your digital experiences here.