Why isn't Alexa checking all of my pages for the audit?

In our Essentials and Advanced plans we check a maximum of 10,000 pages.

In some cases we might check fewer than the maximum number of pages on your site. There are several reasons for this:

  1. You are blocking our crawler in your site's robots.txt file. The Alexa's Web and Site Audit Crawlers article has tips for updating your robots.txt file to unblock our crawler, and the robots.txt check sketched after this list can help you confirm the change worked.
  2. Your site's server is returning errors to our crawler. You can find these errors in the Crawler Errors section at the bottom of the audit report. To fix them, reach out to your webmaster or server host; the status-code check after this list shows how to reproduce what our crawler sees.
  3. Your site is redirecting our crawler to another domain. The audit only runs on your current domain and any pages and subdomains on it, so if your redirects or canonical links point crawlers to another domain, we skip those pages. To fix this, make sure your redirects and canonical links point to URLs on your own domain; the redirect-chain sketch after this list can help you confirm where a page ends up.
  4. Your site is returning a redirect loop to our crawler. This is similar to the errors mentioned in item 2, but it won't be shown in the Crawler Errors section. To fix this, make sure each redirect points to a new page rather than back to the previous page, and that all canonical links point to the current page. The same redirect-chain sketch below will flag a loop.
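
If you want to confirm what your robots.txt file allows before re-running the audit, a quick programmatic check can help. The sketch below uses Python's standard urllib.robotparser module; the domain and the "ia_archiver" user agent are placeholders, so substitute your own site and the crawler name given in the Alexa's Web and Site Audit Crawlers article.

```python
from urllib.robotparser import RobotFileParser

# Placeholder values: substitute your own domain, and the crawler user
# agent named in the Alexa's Web and Site Audit Crawlers article.
robots_url = "https://www.example.com/robots.txt"
user_agent = "ia_archiver"  # assumed name for the Alexa crawler
page_url = "https://www.example.com/some-page/"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetch and parse the live robots.txt file

# True: the crawler may fetch the page. False: a Disallow rule blocks it,
# which would keep that page out of the audit.
print(parser.can_fetch(user_agent, page_url))
```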
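
For server errors (item 2), you can reproduce what our crawler sees by requesting a flagged page yourself and checking the status code. This is a minimal sketch using the third-party requests library with a placeholder URL; the audit report's Crawler Errors section remains the authoritative list of affected pages.

```python
import requests  # third-party; install with `pip install requests`

# Placeholder URL: use a page listed in the audit's Crawler Errors section.
response = requests.get("https://www.example.com/some-page/", timeout=10)

# 2xx means the page is served normally; 4xx or 5xx is the kind of error
# our crawler would report. Share the code with your webmaster or host.
print(response.status_code, response.reason)
```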
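
For redirects and redirect loops (items 3 and 4), the same requests library will follow a redirect chain and raise TooManyRedirects if it never resolves, which makes loops easy to spot. The URL below is a placeholder for one of your redirected pages.

```python
import requests  # third-party; install with `pip install requests`

page_url = "https://www.example.com/old-page/"  # placeholder redirected page

try:
    response = requests.get(page_url, timeout=10)
    # Print each hop so you can confirm the chain ends at a real page
    # on your own domain rather than on another domain.
    for hop in response.history:
        print(hop.status_code, "->", hop.headers.get("Location"))
    print("Final URL:", response.url)
except requests.TooManyRedirects:
    # requests gives up after a fixed number of redirects, which is the
    # same symptom our crawler hits when your site returns a redirect loop.
    print("Redirect loop detected:", page_url)
```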