Wendy Danko

Is Your Site Crawlable & Indexable?

Updated: Sep 13, 2022


Search bots check your site for indexing.
"I'm here to help you. What are you looking for?"

Crawling issues could be holding your web pages back from indexing.


When there are issues with your website's web pages, they may not be indexed in search engines — not indexed means the search engine does not "see" the individual page for some reason. Finding and fixing unindexed pages is crucial for your SEO.


An SEO audit will uncover issues on your website. Through an SEO audit, you will see the status code for each web page. The status code will tell you which web pages need investigation.


The following are a few of the issues you will find in your SEO audit.


  • 400 error status code

  • 500 error status code

  • restricted from indexing

  • 404 page set-up

  • robots.txt file

  • XML sitemap

  • 200 status code


Below are short descriptions for each crawlability issue.


400 error status code


A 400 error code is a "bad request" error, alerting you that there is a problem with the request made for a web page. Have you ever been on a website, clicked a link, and landed on an "I'm sorry" page, a "this page does not exist" page, or no page at all?


400-level errors, bad requests and broken links, need to be fixed immediately, or they could have a negative effect on your site's authority. Don't leave your users flustered; make sure no links lead to 400-level error codes.
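
If you would like to spot-check status codes yourself, here is a minimal sketch using Python's requests library; the URLs are placeholders for your own pages:

    import requests

    # Placeholder URLs; replace with your own pages.
    urls_to_check = [
        "https://www.example.com/",
        "https://www.example.com/about",
    ]

    for url in urls_to_check:
        response = requests.get(url, timeout=10)
        # 200 = OK, 4xx = client error (e.g., broken link), 5xx = server error
        print(response.status_code, url)

An SEO audit tool will do this for every page automatically, but a quick script like this is handy for re-checking a handful of links after you fix them.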


500 error status code


A 500 error code is a "server error" and is usually a sign of a problem with the server. When you see a 500 error, check to make sure your website is loading okay. If the site is not loading correctly, or not at all, contact your hosting provider.


When I see this problem, I wait a few minutes, restart my browser, and check to see if the site is loading correctly. If it is not loading correctly, I contact the hosting provider.
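
If you want to automate that wait-and-retry habit, here is a minimal sketch, again using Python's requests library with a placeholder URL:

    import time
    import requests

    url = "https://www.example.com/"  # placeholder for your own page

    for attempt in range(3):
        status = requests.get(url, timeout=10).status_code
        print(f"Attempt {attempt + 1}: {status}")
        if status < 500:
            break  # not a server error, so no need to keep retrying
        time.sleep(60)  # wait a minute before trying again

If the 500 persists across several attempts, that is your cue to contact the hosting provider.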


Restricted from indexing


A restricted page is one that is blocked from indexing; it usually means a robots.txt rule, a noindex X-Robots-Tag header, or a noindex meta tag has been set up to stop the indexing.


Go over the web page to see if it is something you would like indexed. If it is, remove the tag that is preventing the indexing; the snippets below show what those directives typically look like.
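
For reference, a noindex meta tag sits inside the page's <head> and looks like this; the exact wording on your site may vary:

    <meta name="robots" content="noindex">

The header version is sent with the HTTP response rather than in the page itself:

    X-Robots-Tag: noindex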


If you are not familiar with HTML and don't know how to access and use your cPanel or root folder, it is best to hire someone to do this for you. Making mistakes could be more expensive in the long run.


404 page set-up


A 404 page is a "page not found" page. When someone clicks a broken link, an error page shows up telling them the page does not exist and suggesting something else to do, for example, visiting a different page.


Check for broken links and fix them right away. A broken link hurts the user's experience and can result in poorer rankings in search engines.


Some companies customize their 404 pages to send a unique message that encourages visitors to stay on their website. A custom page may have a search box, a navigation menu, or a humorous message about why the page is missing.


If you are setting up a custom 404 error page, make sure it still returns a 404 response code. If it returns a 200 instead, you may never know you have a broken link.
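
An easy way to test this is to request a page that should not exist on your site and look at the status code. Here is a minimal sketch with a made-up URL, using Python's requests library:

    import requests

    # A URL that should not exist on your site (placeholder domain)
    response = requests.get("https://www.example.com/this-page-should-not-exist")

    if response.status_code == 404:
        print("Good: the custom error page returns a real 404.")
    else:
        print(f"Check your setup: it returned {response.status_code} instead.")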


FYI: a 410 error code means a page is permanently gone. A 410 should be treated the same way as a 404 error code.


robots.txt file


A robots.txt file suggests to crawling bots what they should or should not do; disallow directives are a common example, as shown below.
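
A minimal robots.txt might look like this; the paths and sitemap URL are placeholders for your own site:

    # Applies to all crawlers
    User-agent: *
    Disallow: /private/
    Disallow: /drafts/

    # Tell crawlers where your sitemap lives
    Sitemap: https://www.example.com/sitemap.xml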


Beware: not all search engines pay attention to robots.txt files. Make sure any critical data you don't want indexed is secured another way, such as behind a login.


The robots.txt file is a plain text file that lives in your website's root folder. As noted above, if you don't know how to access and use your cPanel or root folder, it is best to hire someone to make these changes for you.


XML sitemap


A sitemap is a list of the web pages on your website and where to find them. Search engines read the sitemap when indexing your web pages.


Make sure you provide a sitemap when you publish your website and an updated sitemap every time you add a new page. Check your sitemap to make sure it is correct.


An XML sitemap should list all the website pages you want indexed. The sitemap typically lives in your website's root directory, for example at yoursite.com/sitemap.xml. Again, if you don't know how to access your root folder, it is best to hire someone to do this for you; mistakes could be more expensive in the long run.
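
For reference, a minimal XML sitemap looks like this; the URLs and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2022-09-13</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
      </url>
    </urlset>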


200 status code

If you are seeing a 200 status code for each web page, the pages are loading correctly and can be crawled and indexed!


The main takeaway: an SEO audit report will help you uncover any of the issues listed above. An SEO audit also contains a multitude of other information you should go over. If you have never had an SEO audit completed, I'm guessing your website has a few issues that need to be dealt with to get it working for your company.


If you have any questions about crawling and indexing your web pages, please contact WenKo LLC; we would be happy to help.


Make sure to check your web pages for crawling issues with an SEO audit.


Happy Optimizing!


Wendy Danko

Setting Your Website Up For Success!






About the Author

Wendy Danko is an SEO Consultant and Web Designer at WenKo LLC. Her design background has its perks, especially when it comes to SEO and website design. Wendy is determined to help all her clients understand and improve their SEO and website design. As a result, her clients have uncovered new leads and grown their client base through website improvements.


