Alexa Reachability


A well structured, well linked site makes it easy for your visitors and for search engine crawlers to find what they are looking for. Deeply nested pages or, worse, those with no links at all may be evaluated as unimportant by search engines.

  • Search engine crawlers tend to initially visit the most popular pages on the web. They then follow the links on those pages to discover less popular pages. If all of your pages can be found within a few clicks, crawlers will index your site optimally. However, if your site has pages that are nested very deeply, crawlers may have trouble finding all the pages on your site.
  • Unreachable pages include those that require crawlers to follow multiple redirects before any content is served.
  • Our calculation of the optimal path length is based on the total number of pages on your site and a consideration of the number of clicks required to reach each page. Because optimally available sites tend to have a fan-out factor of at least ten unique links per page, our calculation is based on that model. When your site falls short of that minimum fan-out factor, crawlers will be less likely to index all of the pages on your site.
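The fan-out model above implies a simple relationship: with at least ten unique links per page, a site with N pages should be fully reachable within roughly log₁₀(N) clicks. As a minimal sketch (not Alexa's actual calculation, whose details are not published), the minimum click depth under that model can be computed like this:

```python
def min_depth(num_pages, fan_out=10):
    """Smallest click depth d such that a site with `fan_out` unique
    links per page can expose `num_pages` pages within d clicks.

    At depth 0 only the starting page is reachable; each extra click
    multiplies the newly reachable pages by `fan_out`.
    """
    reachable, depth = 1, 0  # the starting page itself
    while reachable < num_pages:
        depth += 1
        reachable += fan_out ** depth
    return depth

# A 1,000-page site needs only 3 clicks under the ten-link model,
# whereas a site with a fan-out of 2 would need 9.
print(min_depth(1000))            # → 3
print(min_depth(1000, fan_out=2)) # → 9
```

If your deepest important pages sit well beyond this depth, that is a sign your internal linking is thinner than the ten-link fan-out the calculation assumes.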

Recommendations

Review the table below to see if you have any important pages that are difficult to find from the report starting page. If you see hard-to-reach pages that are important, consider adding links to them from other, easier-to-find pages.

  • Listing a sitemap in your robots.txt file can help crawlers find all the URLs on your website, but that won’t necessarily help your visitors. Be careful and deliberate with your linking structure so that visitors can find what they are looking for too.
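Listing a sitemap in robots.txt is a single `Sitemap:` line, and you can verify that crawlers will see it using only Python's standard library. The domain and sitemap URL below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that advertises a sitemap to crawlers.
robots_txt = """\
User-agent: *
Allow: /
Sitemap: https://www.mydomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() (Python 3.8+) returns the sitemap URLs found, or None.
print(parser.site_maps())  # → ['https://www.mydomain.com/sitemap.xml']
```

In production you would point the parser at the live file with `parser.set_url("https://www.mydomain.com/robots.txt")` followed by `parser.read()`.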

We found some pages that are unreachable from the report starting page. Review the table below, and if you see any important pages that are unreachable, consider adding links to them from pages that are reachable.

  • There are two common causes for unreachable pages. First, our crawler may have discovered the URLs in either a sitemap you provided or one listed in your robots.txt. If that is the case, simply adding links to these “dark” regions of your site can help visitors find them.

    Second, if your site is configured such that both _mydomain.com_ and _www.mydomain.com_ serve content, then search engines like Google will treat them as independent sites. You should make sure that either _mydomain.com_ redirects to _www.mydomain.com_, or vice versa.
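The first cause above can be checked mechanically: crawl the link graph from the starting page and subtract the reachable set from the set of URLs in your sitemap. A minimal sketch, using a hypothetical in-memory link graph in place of a real crawl:

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "/": ["/about", "/products"],
    "/about": [],
    "/products": ["/products/widget"],
    "/products/widget": [],
    "/blog": ["/blog/post-1"],  # in the sitemap, but nothing links to it
    "/blog/post-1": [],
}
sitemap_urls = set(links)

def reachable(start, links):
    """Breadth-first search: every page reachable by clicking from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in links.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# "Dark" pages: listed in the sitemap but unreachable from the start page.
dark = sitemap_urls - reachable("/", links)
print(sorted(dark))  # → ['/blog', '/blog/post-1']
```

Here `/blog` and `/blog/post-1` only exist in the sitemap, so adding a link to `/blog` from any reachable page would bring both back into the crawlable site.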

About The Author: Rosendo Cuyasen Jr. is the head of Eyewebmaster, a web development and SEO firm in the country. You can follow him on Twitter at @Eyewebmaster, or connect with him on Facebook and Google+.
