The Redirect Bots setting issues a 301 redirect to the parent page when a well-known crawler user agent (bot) requests a page whose URL ends in a number or contains a query argument.
How “redirect bots” works
Bots that see a URL ending in a number or query argument, such as 1, or even a year like 2009, will keep incrementing it (2010, 2011, and so on) as high as they like. None of these new pages is cached, so the server can begin to experience lag and high load.
With the redirect bots setting turned on (the default), the server automatically 301 redirects all of these requests to the parent page.
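The redirect behavior described above can be sketched in Python. This is an illustrative approximation of the logic, not the platform's actual implementation; the function name and the exact matching rules are assumptions:

```python
import re
from urllib.parse import urlsplit

def parent_redirect(url: str):
    """Sketch of the redirect logic: a bot request whose URL carries a
    query argument or ends in a number is 301-redirected to the parent
    page; return the redirect target, or None if the URL is left alone."""
    parts = urlsplit(url)
    if parts.query:
        # Strip the query argument entirely.
        return f"{parts.scheme}://{parts.netloc}{parts.path}"
    m = re.search(r"/\d+/?$", parts.path)
    if m:
        # Drop the trailing numeric segment, keeping the parent page.
        parent = parts.path[:m.start()].rstrip("/") + "/"
        return f"{parts.scheme}://{parts.netloc}{parent}"
    return None  # not a candidate; serve the page normally

# Example: an archive URL a crawler keeps incrementing.
print(parent_redirect("https://example.com/archive/2009/"))
# → https://example.com/archive/
```

A URL like `/archive/2010/` or `/archive/2011/` resolves to the same cached parent page instead of generating an endless series of uncached requests.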
NOTE: We highly advise keeping a clean sitemap and robots.txt available for bots to help curb this behavior as well.
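A minimal robots.txt along these lines can discourage well-behaved crawlers from walking query-argument URLs in the first place. The paths and sitemap URL below are placeholders; adjust them for your own site:

```
# Hypothetical example only: block query-string crawling, point bots
# at the sitemap so they discover real pages instead of guessing URLs.
User-agent: *
Disallow: /*?

Sitemap: https://example.com/sitemap.xml
```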
How to disable “redirect bots”
If a service you use to scan your site is reporting errors or receiving a 301 redirect, the redirect bots setting may be the cause.
For example, Facebook’s URL debugger tool scrapes pages using one of the well-known user agents that is redirected by default. If it requests a page whose URL ends in a number, it receives the 301 instead of the page and shows an error. With the setting turned off, Facebook can properly scrape and analyze the page.
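The user-agent side of the check can be sketched as a simple substring match. The actual list of well-known crawler agents is maintained by the host; the substrings below (including Facebook's real `facebookexternalhit` agent) are illustrative examples only:

```python
# Hypothetical sketch: the host's real bot list is longer and may
# match differently; these substrings are examples.
KNOWN_BOT_SUBSTRINGS = (
    "facebookexternalhit",  # Facebook's URL debugger / scraper
    "googlebot",
    "bingbot",
)

def is_known_bot(user_agent: str) -> bool:
    """Return True if the request's User-Agent looks like a known crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOT_SUBSTRINGS)

# Facebook's debugger identifies itself like this, so while the setting
# is on, its request for a numeric-ending URL gets the 301.
print(is_known_bot("facebookexternalhit/1.1"))  # → True
```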
- Log in to the User Portal
- Select Sites
- Click on the environment name you wish to disable this on
- Click Utilities
- Locate Redirect Bots
- Check Disable