Here at WP Engine we have several default server settings in place to help optimize your site and the server it resides on. One of these settings is called “Redirect Bots.” Essentially, this setting redirects well-known user agents (bots) crawling the site to the parent page whenever they request a page ending in a number.
How does it work?
Any page that ends in a number, e.g. site.com/page/1 or site.com/category/2, will be redirected to the page before the number sequence begins (site.com/page, site.com/category). This is due to known bot behavior. When a bot sees a URL ending in a number, it assumes there may be more pages after it, such as page 4, 5, 6, and so on, and tries them all. This kind of behavior can sometimes be harmless, but on large sites with many bot crawlers it can add up to a lot of unnecessary server hits. So by default, these requests are redirected.
Why is this good?
When a bot crawls every numbered page of your site, e.g. site.com/page/123 and site.com/page/456, each request is a unique, uncached hit to your server. These hits can contribute to high server load, especially if your site is frequently crawled by many bots. With the Redirect Bots setting turned on, requests for pages like site.com/page/456 and site.com/page/123 are sent to site.com/page/ instead, avoiding excess hits to the server for those extra pages of content.
When should it be turned off?
If a service you’re using to scan your site is having issues or receiving a 301 redirect, there’s a chance this is due to the Redirect Bots setting. For example, Facebook’s URL debugger tool attempts to scrape a specific page of the site, using one of the well-known user agents that is redirected by default. If that page ends in a number, the tool receives the redirect instead of the page and shows an error. With this setting turned off, Facebook can properly scrape and analyze the page.
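One way to check whether a numeric URL on your site is being redirected for bots is to request it with a bot user agent and look at the raw status code without following redirects. A sketch using Python's standard library (the domain and user-agent string below are placeholders, not values from this article):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Redirect handler that refuses to follow redirects, so a 301
    surfaces as the response status instead of being chased."""
    def redirect_request(self, *args, **kwargs):
        return None

def bot_status(url: str, user_agent: str = "facebookexternalhit/1.1") -> int:
    """Request `url` pretending to be a well-known bot and return the
    HTTP status code of the first response."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        return opener.open(req).getcode()
    except urllib.error.HTTPError as e:
        # urllib raises on unfollowed 3xx responses; the code is what we want.
        return e.code

# Usage (replace with your own domain):
# bot_status("https://example.com/page/2")
# A 301 here suggests the Redirect Bots setting is intercepting the request.
```

A 200 for a normal browser user agent combined with a 301 for a bot user agent on the same numeric URL is a strong hint that this setting, rather than your own site configuration, is the source of the redirect.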
How do I turn it off?
Simply contact us via 24/7 Live Chat from your User Portal and we would be happy to turn it off for you. Turning this setting off typically has very few side effects, and it can help in the rare circumstances when tools are unable to read or scrape numeric URLs.