In our Advanced and Agency plans we check a maximum of 10,000 pages.
In some cases we might check fewer than the maximum number of pages for your site. There are several reasons for this:
- Your site's robots.txt file is blocking our crawler from checking pages. The Alexa Web and Site Audit Crawlers article has tips for updating your robots.txt file to unblock our crawler.
- Your site's server is returning errors to our crawler. You can find details about these errors in the Crawler Errors section at the bottom of the audit report. To fix them, reach out to your webmaster or server host.
- Your site is redirecting our crawler to another domain. The audit only runs on your current domain and any pages and subdomains on it. If you use redirects or canonical links to point crawlers to another domain, we will skip those pages on your site. To fix this, make sure your redirects and canonical links point to URLs on your own domain.
- Your site is returning a redirect loop to our crawler. This is similar to the server errors mentioned above, but it won't appear in the Crawler Errors section. To fix this, make sure each redirect points forward to a new page rather than back to the previous one, and that every canonical link points to the current page.
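For the robots.txt point above, a common cause is a blanket `Disallow` rule that also blocks audit crawlers. A minimal sketch of an unblocking rule is below; the `ia_archiver` user-agent is used purely as an illustration here, so check the Alexa Web and Site Audit Crawlers article for the actual user-agent string our crawler sends.

```
# Hypothetical example only - confirm the real user-agent in our crawler article.
# Allow the audit crawler everywhere, while keeping other bots out of /private/:
User-agent: ia_archiver
Disallow:

User-agent: *
Disallow: /private/
```

An empty `Disallow:` line means "nothing is disallowed" for that user-agent, so the crawler can reach every page.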
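For the canonical-link point above, the key is that the `href` stays on your own domain. A sketch, using the placeholder domain `example.com`:

```
<!-- Good: canonical URL is on the same domain the audit is running on -->
<link rel="canonical" href="https://example.com/products/widget">

<!-- Problematic: canonical points to a different domain, so this page
     will be skipped by the audit -->
<link rel="canonical" href="https://other-site.com/products/widget">
</link>
```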
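The redirect-loop problem above can be checked mechanically: follow each redirect and stop if you revisit a URL you have already seen. A minimal sketch, where the `redirects` dictionary stands in for real HTTP `Location` headers and the function name is illustrative, not part of any audit tooling:

```python
def find_redirect_loop(redirects, start, max_hops=10):
    """Follow a chain of redirects (a {source: target} map standing in
    for real HTTP Location headers) and return the list of URLs that
    form a loop, or None if the chain terminates normally."""
    seen = []
    url = start
    while url in redirects and len(seen) <= max_hops:
        if url in seen:
            # The chain revisits an earlier URL: this is a redirect loop.
            return seen[seen.index(url):]
        seen.append(url)
        url = redirects[url]
    return None

# A loop: /old -> /new -> /old (each page redirects back to the previous one)
print(find_redirect_loop({"/old": "/new", "/new": "/old"}, "/old"))

# A healthy chain: /a -> /b, and /b is a real page with no further redirect
print(find_redirect_loop({"/a": "/b"}, "/a"))
```

The fix described above amounts to making every arrow in this map point forward to a final page instead of back to an earlier one.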