If search bots were visible, what would they look like?

On-page SEO Will Improve Your Website Traffic 

 

Web designers, marketers, business owners, and entrepreneurs are finding their online presence improves when on-page SEO is applied.

 

On-page SEO techniques improve crawlability for indexing on search engines. Once your webpages are indexed, your customers have a better chance of finding your products and services. When there are crawlability problems, your findability goes down.

On-page SEO is an ongoing process that keeps your webpages crawlable for indexing. Through audits and analytics, improvements will be made. When there is a problem, an audit will find it. When you need to know which webpages are working well and which need work, analytics will help you understand where to focus.

When your webpages go unchecked, you may be unaware of problems search engines are having crawling them. While you're unaware, search engines could be issuing warnings and even de-indexing some of your webpages. A regularly scheduled audit will solve this problem.

An SEO audit report is your friend when it comes to keeping your webpages running smoothly.

Is my website crawlable or crawler friendly?

Let's find out.

Website crawlability issues are extremely important to fix.

Here is what you need to know. First, it is good you're asking about crawlability issues. Some people don't even consider that their site might have crawlability issues. So you are ahead of the game!

What are crawlability problems?

 

If your website is not crawlable, then there will be indexing issues. You want to get out ahead of this problem sooner rather than later if you wish to have your webpages indexed for all visitors to see. You must make your webpages crawlable for search engines.

 

Having no crawlability issues is the ideal. You want to ensure all your webpages are error-free, making your website available and friendly for search engine bots to scan. Search engine bots go through every page of your website to determine whether the page is set up correctly, or crawlable. Addressing the error messages, which are given in the form of a number (an HTTP status code), is critical. What you want to find is a 200 code, meaning the page is good to go. Other codes you will see include 301 (page has moved permanently), 302 (page has temporarily moved), 404 (page is missing), 403 (access denied), and 500 (server error). These codes should be checked regularly for each webpage on your site to make sure they stay correct.
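
To see these codes yourself, a quick script can help. Here is a minimal sketch, assuming you have Python with the third-party requests library installed; the URLs are placeholders you would swap for your own webpages.

```python
# Minimal sketch: print the HTTP status code for a few pages.
# Assumes the third-party "requests" library; URLs below are placeholders.
import requests

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services",
    "https://www.example.com/old-page",
]

for url in PAGES:
    try:
        # allow_redirects=False so 301/302 codes are reported rather than followed
        response = requests.get(url, allow_redirects=False, timeout=10)
        print(response.status_code, url)
    except requests.RequestException as error:
        print("ERROR", url, error)
```

An SEO audit tool does the same check at scale, across every page of your site.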

 

For example, if a webpage is returning a 404 code for a page not found, and you know the page exists, the link pointing to it may be broken. A broken link makes for a poor user experience (UX). Fixing the link then becomes a priority.

How do I know if there are crawlability problems?

 

Listed below are two programs that could help you find crawlability problems.

 

• Google Search Console

Google Search Console will send you warnings and also list the issues your site may be having. Consider opening a Google Search Console account. It’s FREE! What’s holding you back?

 

• Screaming Frog

Screaming Frog is an audit program that reports the status code for each webpage after the software has scanned your pages. Screaming Frog is FREE! Why are you waiting? However, you will not have access to all the features in the free version.

 

A third option is investing in an SEO Audit program or software.

 

• SEO Audit Program

 

There are hundreds of SEO audit programs out there to choose from. Be careful. They vary widely in their capabilities.

 

You may want to dig deeper, with an SEO audit of your whole website. 

 

An SEO audit report helps you find crawlability problems and may even tell you where the problem is and how to fix it.

 

The three tools listed above are useful if you understand how to use them to your advantage. Using more than one tool can help you make sure you are catching all the crawlability problems, not just some.

 

If you are unsure what you are doing, maybe it is time to hire a professional.

 

Here is a list of items that may cause a poor scanning experience for search engine bots.

 

1. Pages blocked from indexing by robots meta tags, nofollow links, or robots.txt files

 

Webpages that are blocked, for any reason, should be investigated; find out why each one was blocked. Maybe it was unintentional. To find blocked pages, you will need to check three different areas:

 

• robots meta tags

• nofollow links

• robots.txt files

 

If you find blocking in any of these three areas and you want it removed, you or your SEO professional will need to remove the offending blocking directive. Correcting this type of issue may require knowledge of HTML. With the blocking directive removed, search engines know the page is ready to scan. An SEO audit will uncover blocked pages to investigate.
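
If you want a quick way to spot-check a single page, here is a minimal sketch covering two of the three areas: robots.txt blocking and a noindex robots meta tag. It assumes Python with the requests library, and the URLs are placeholders; a full audit tool checks every page, and nofollow links as well.

```python
# Minimal sketch: check one page for robots.txt blocking and a noindex meta tag.
# Assumes the "requests" library; SITE and PAGE are placeholders.
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
PAGE = SITE + "/services"

# 1. Is the page blocked by robots.txt?
parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()
print("Allowed by robots.txt:", parser.can_fetch("*", PAGE))

# 2. Does the page carry a noindex robots meta tag? (crude string check)
html = requests.get(PAGE, timeout=10).text.lower()
print("Possible noindex meta tag:", 'name="robots"' in html and "noindex" in html)
```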

 

2. URL structure errors

 

Pay attention to your URLs; they may contain crawlability issues too. Check for typos. Correct these typos ASAP to get better results for all your SEO efforts.

 

3. Old URL issues

 

When you update your website, make sure you follow best practices and update old URLs so they point to your new pages.

 

If you are adding security (HTTPS) to your site, make sure all the pages and websites you link to are secured as well.

 

If you are linking to other insecure websites or webpages, you are compromising your site security. Make sure all your links are to sites that are secure (https).

 

4. Denied access

 

Check your content access. Do you have a page that all users should or could read? Or is there content only registered users have access to? Make sure to check content for access privileges, because you may be losing customers when content they want to see is blocked.

 

5. Server errors, capacity, or misconfiguration

 

When you see a 500 code, there could be a problem with the server. To fix this error, it is a good idea to bring it to the attention of the website developer or hosting company. The server may be offline, your site may have reached its capacity on the server, or something may be misconfigured. Your hosting provider or web developer will help you pinpoint these issues.

 

6. Formatting issues

 

When there are formatting or coding issues, search engines are going to have crawlability problems. It could be a server issue, a firewall issue, or one of several other issues. You will need a specialist to find this kind of problem. Start with your web developer, hosting company, or SEO specialist to track down these issues.

 

For example, I was working on a website update for a law firm. My client wanted to see the updated site on his office computer and could not get to it. I contacted the hosting company, and they insisted there was no problem with the server. I thought about it for a few days and contacted the hosting company again; they also provided and oversaw the networking within the office. I asked whether the onsite server was blocking the new site. The technician grumbled and was not happy with me, as if I were bothering him, and said that if I didn't understand what I was doing, I shouldn't be working on the website. But when he spent a little time working on the server's configuration, he found that the security setup on the server was blocking the law firm's ability to see their own website. Problem solved. The message here is to work together to resolve the issue. Be kind to each other. There is no need to place blame or be rude.

7. Sitemap missing or not set up correctly

 

Your sitemap is an essential document for the search engine to read. The sitemap informs the search engine about your site structure and which pages to scan and index, or not to scan. Every time you update your website, make sure to update your sitemap. If you are adding new pages or deleting old pages, the sitemap should reflect those changes.

 

There are different types of sitemaps:

• Sitemap.xml for webpages

• Sitemap.xml for photos

• Sitemap.xml for PDFs

• Sitemap.html (an HTML sitemap for human visitors)

The first three on the list, the XML sitemaps for webpages, photos, and PDFs, can be combined in the same document.

 

The sitemap.xml is the one you should use; it belongs in the website root folder. There is nothing wrong with an HTML sitemap, and it may add to a human visitor's understanding of your website, but make sure you also provide the sitemap.xml format because most search bots are looking for that file.
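
If you do not have a tool that generates a sitemap for you, here is a minimal sketch of what creating one looks like, using only the Python standard library; the page URLs are placeholders for your own indexable pages.

```python
# Minimal sketch: write a basic sitemap.xml for a handful of placeholder pages.
from xml.etree.ElementTree import Element, SubElement, ElementTree

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services",
    "https://www.example.com/blog/raising-rabbits",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = page

# The finished file belongs in the website root folder.
ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Most content management systems and audit tools will build and update this file for you; the point is simply that every indexable page should be listed.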

8. Internal links not set up correctly or broken

 

The search engine also looks at the links you have within your website. There should be links between pages with similar information about the subject. 

 

For example, if you have a blog post talking about bunnies and what they eat and a webpage about how to raise rabbits, the two should link to each other within the copy.

 

• Make sure internal links connect related subjects. Don't just add a link to check that box off.

 

• When you provide this type of link, you are showing people and search engines that you are an expert on the subject by providing related information.

 

• Page links also help search engines rank your pages for authority.

 

Beware of having too many internal links on one page; this can trigger a warning from Google. Three to five would be ideal; if your content is lengthy, add a few more.

 

Links to other pages should be correct; any broken links should be found and repaired for a better user experience (UX).
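
Here is a minimal sketch of a broken-internal-link check for a single page, assuming Python with the requests library; the page URL is a placeholder, and a real audit tool crawls every page rather than just one.

```python
# Minimal sketch: list internal links on one page that do not return a 200 code.
# Assumes the "requests" library; PAGE is a placeholder URL.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

PAGE = "https://www.example.com/"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(PAGE, value))

collector = LinkCollector()
collector.feed(requests.get(PAGE, timeout=10).text)

site = urlparse(PAGE).netloc
for link in collector.links:
    if urlparse(link).netloc != site:
        continue  # this sketch only checks internal links
    code = requests.head(link, allow_redirects=True, timeout=10).status_code
    if code != 200:
        print(code, link)
```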

 

9. Redirects set up incorrectly

 

Incorrect redirects create a poor user experience (UX). As explained in the introduction to this article, you need to check the codes that show up in your audit report.

 

• 200 page is loaded correctly

 

• 404 page is not found

 

• 301 page has moved permanently

 

• 302 page has been temporarily moved

 

• 500 server error

 

When you check the code of the page, you may find broken links and other problems that will prevent your customers and search engines from reaching the information on your webpages. Broken links will cause a poor user experience and also cause crawlability issues.
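
If you want to see exactly how a redirect behaves, here is a minimal sketch that follows a redirect chain and prints each hop, assuming Python with the requests library; the URL is a placeholder.

```python
# Minimal sketch: print each hop in a redirect chain (301/302) for one URL.
# Assumes the "requests" library; the URL is a placeholder.
import requests

response = requests.get("https://www.example.com/old-page", timeout=10)

for hop in response.history:  # each intermediate redirect response
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(response.status_code, response.url, "(final page)")
```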

 

10. Speed issues

 

Speed is a big problem that you may need help to solve. What I have found is that many speed issues involve image sizes. Make sure you're compressing your images to get them as small as possible without compromising quality.
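
Image editors and many website platforms can compress images for you. If you prefer to script it, here is a minimal sketch using the third-party Pillow library; the folder names and quality setting are assumptions to adjust for your own site.

```python
# Minimal sketch: compress all JPEGs in an "images" folder with Pillow.
# Folder names and quality=80 are placeholders/assumptions.
from pathlib import Path

from PIL import Image

Path("compressed").mkdir(exist_ok=True)
for source in Path("images").glob("*.jpg"):
    with Image.open(source) as img:
        img.save(Path("compressed") / source.name, "JPEG", optimize=True, quality=80)
```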

 

Here is why you need to be concerned about page speed: you have three seconds to load your webpage before your customer gives up and moves on to a more user-friendly site.

 

If your site speed does not improve after you compress all your images, it would be a good idea to hire a professional to sort out the issues causing the slow speed.

 

11. Duplicate content

 

Duplicate content is one of the most frequent issues on a website, and it needs to be addressed so search bots use their time well.

 

What you don't want to do is run out of crawl budget, wasting the search engine's time. Most search engines dedicate only a certain amount of time to crawling each website.

 

There are millions of websites, and time spent on one site is limited. 

 

If you have duplicate content, your pages may not all get scanned and indexed by the search engine. You need to eliminate any duplicate content.

 

Address duplicate content issues with one of the following:

• Remove or redirect the duplicate webpage.

• Use the robots.txt file correctly.

• Use meta tags correctly.

• Use a 301 redirect.

• Use rel=canonical.

 

You don’t need to do all five listed above. Pick one and implement it properly.
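As a rough illustration of how duplicate content can be detected, here is a minimal sketch that hashes the visible text of a few pages and flags exact matches; the URLs are placeholders, it assumes the requests library, and real audit tools also catch near-duplicates.

```python
# Minimal sketch: flag pages whose text content is identical by comparing hashes.
# Assumes the "requests" library; URLs are placeholders.
import hashlib
import re

import requests

PAGES = [
    "https://www.example.com/rabbits",
    "https://www.example.com/bunnies",
]

seen = {}
for url in PAGES:
    html = requests.get(url, timeout=10).text
    text = re.sub(r"<[^>]+>", " ", html)      # strip tags (crude)
    text = " ".join(text.split()).lower()     # normalize whitespace and case
    digest = hashlib.sha256(text.encode()).hexdigest()
    if digest in seen:
        print("Duplicate content:", url, "matches", seen[digest])
    else:
        seen[digest] = url
```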

 

12. JavaScript (JS) and Cascading Style Sheet (CSS) issues

 

When implementing JavaScript and Cascading Style Sheets, be careful not to overdo it. These types of scripts can be read incorrectly when a page is scanned for indexing. It is best to use JavaScript and CSS sparingly to limit any confusion.

 

13. Flash usage

 

Flash is old technology and should be avoided. It will cause poor indexing.

 

14. Frame usage

 

Frames are old technology and should be avoided. They will cause poor indexing.

 

Many of the above fourteen issues are very technical. You should contact a professional to assist in finding and correcting most of these errors.

 

When you have a website, you should consider the regular maintenance involved to keep the site running smoothly. Anything can happen at any moment that will cause a problem with crawlability and indexing. The most important thing to think about is giving your customers the best possible user experience (UX) by keeping up with and correcting crawlability issues. 

 

A monthly check-up with an SEO audit report will find crawlability problems. Eliminating crawlability issues can be time-consuming, but it is critical to the website, the customer experience, and indexing.

 

I hope you found this article helpful. If you found it a bit too technical, hire a professional SEO to assist with finding and correcting crawlability problems on your website.

 

It is an investment, but can you afford to upset your clients when they can't find what they are looking for on your website? What is a sale worth to you? Measure what you could be losing. When you weigh these two items, increased clientele and lost clients, you will understand how valuable an SEO professional will be to your website's success.

 

FYI: Google support has a wealth of information. I have linked to it often in this article. It is well worth your time to keep this link and refer to it often for updated information.
