What is On-page SEO?

If search bots were visible, what would they look like?

A Guide to On-page SEO

 

The purpose of this guide is to help you understand what is involved in on-page SEO. When you have a firm understanding, you have a better idea of how your web pages rank on search engines. This guide is not a how-to guide; that will come later, once you understand the parts involved.

The Web Page Foundation

Web designers, marketers, business owners, and entrepreneurs are finding their online presence improves when on-page SEO is applied.

 

On-page SEO techniques improve crawlability for indexing on search engines. Once your web pages are indexed, your customers have a better chance of finding your products and services. When there are indexing problems, your findability goes down.

On-page SEO is an ongoing process that keeps your web pages crawlable for indexing. Audits and analytics drive the improvements: when there is a problem, an audit will find it, and analytics will show you which web pages are working well and which need more work.

When your web pages go unchecked, you may be unaware of problems the search engines are having with indexing them. While you're unaware, search engines could be issuing warnings and even de-indexing some of your web pages. A regularly scheduled audit will solve this problem.

An SEO audit report is your friend when it comes to keeping your web pages running smoothly.

Keyword Research

Keyword research will give you a base to work from when creating content. This research will help you find the words that are already ranking for the topics on your webpage. The research will also give you new ideas to write about for your web page and your blog by revealing ideas you never thought about.

 

Your keywords should be the same words your customers use and not industry jargon. Niche jargon may be what you use when talking to people within your industry, but the people looking for your products and services will be using simpler words in their online searching. People don't know or understand your niche industry words. Think simple.

When researching keywords, you will look at what your competitors are doing and check frequency of use and relevance. With the right keywords, you have a better chance of ranking higher.

Content

Provide content your customer wants to find. Make sure it is in their language or verbiage. Educate your customers and answer any questions they may have. Don't assume your customer understands every aspect of your product or service and how it will benefit them. 

Your end goal is to produce what the customer wants to find; be the expert in your field and provide the information they want. Content includes written information, guides, infographics, charts and graphs, videos, images, GIFs, and more.

 

Be the expert in your field or niche. Experts get respect, and then they get link shares from other pages and on social media, which in turn gives your site authority. When search engines see you as the expert through that authority, they will rank your web pages high on their SERPs.

HTTPS, what is it?

HTTPS stands for Hypertext Transfer Protocol Secure. The S tells your visitors that your site is secure. In the not-too-distant past, Google warned site owners that security would become a ranking factor. Present-day, your site NEEDS to be secure, or Google will label it with a warning saying your site is not secure. A not-secure website will have no chance of a page-one listing.

Meta Title, what is it?

Next to keyword research, meta titles are the most critical items on your web page. Visitors and search engines will use your meta title to decide if they want to visit your webpage.

  • Use your keywords near the beginning of your meta title.

  • The title should be no more than 50 to 60 characters (including spaces), or about 600 pixels.

  • Make sure each page has a unique title that is relevant to the subject of the page.
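To make this concrete, here is a minimal sketch of where the meta title lives in a page's HTML; the title wording is just a made-up example:

    <head>
      <title>A Guide to On-page SEO | WenKo LLC</title>
    </head>

Whatever sits between the <title> tags is what usually appears as the clickable headline on the SERP and in the browser tab.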

Meta Description, what is it?

The meta description will show right below the meta title and web address on the search engine results page (SERP). Meta descriptions describe what your customers will find on the web page when they arrive.

  • Write enticing copy; make your web page a must-see page.

  • Make sure the text is no longer than 120 to 160 characters, including spaces.

  • Don't be alarmed if the search engine changes this copy. Sometimes the search engine will change things to match a searcher query better. This change will be another relevant snippet from your webpage.
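For reference, the meta description is a single tag in the page's <head>; the wording below is only an example:

    <meta name="description" content="Learn what goes into on-page SEO, from keyword research to sitemaps, so your customers can find you.">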

Headers and h1 through h6, what are they?

The header is the main headline at the top of your web page. Don't get this confused with the meta title. The header and the meta title are two different things, and each one should be unique. Search engines will be looking at your headline and will use it as a ranking factor for the web page.

  • Don't be tempted to use the meta titles for your page headline, you will be losing out on your SEO if you do.

  • The headline is at the top of your page and is labeled h1 in your code.

  • There should be only one h1 per page.

  • Use the most important information in the headline.

  • The average headline is 50 characters long but can go longer if needed.

  • Subheads are labeled h2 through h6 in the code.
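Here is a minimal sketch of how the heading tags nest in HTML; the headline wording is invented for illustration:

    <h1>A Guide to On-page SEO</h1>
    <!-- page content -->
    <h2>Keyword Research</h2>
    <h3>Finding Your Customers' Words</h3>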

Anchor Text Linking, what is it?

Anchor text links are the hypertext links within your copy that point to other web pages with supporting information on the subject. Anchor text can also be a citation to give credit for the source you used to find the information or ideas you are applying to your content. Anchor text linking will give you more authority with search engines, but make sure the sites and pages you're linking to are high quality. Low-quality pages and sites might hurt your rankings.

  • Make sure you use sources for your articles and content. Give credit where credit is due.

  • Use descriptive words instead of using 'click here.' People have been reading information on the internet long enough to understand how a link works.

  • Have anchor text linking to other pages within your website. Make sure these additional pages are relevant to the subject on the webpage.
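As a quick HTML sketch, here is the difference between weak and descriptive anchor text (the URL is a placeholder):

    <!-- Weak: the link text says nothing about the destination -->
    <a href="https://www.example.com/rabbit-diet">Click here</a>

    <!-- Better: descriptive anchor text -->
    <a href="https://www.example.com/rabbit-diet">what rabbits eat</a>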

Image Optimization, what is it?

Image optimization is a big deal when your web page is loading onto your customers' devices. If the images are not optimized, the site will take too long to load, and you may lose their attention, sending them to your competitor.

 

Optimization means using fewer pixels and smaller file sizes for photos, artwork, charts, and graphs. It is easy to compress images, but most people either don't know how to do it or forget to take this step. Don't skip it. One site I use to compress my images is www.tinypng.com. It's free and easy to use. There is a paid version if you tend to use it a lot.

Image Alt Text, what is it?

Image alt text is another area people don't know about, or they don't know how or why to include it in their on-page SEO. Many people use a screen reader to have internet information read to them. When you add image alt text, you are telling the listener what the image represents. FYI: Google gives a higher ranking to pages that include image alt text.

  • Make sure you add image alt text to all images, including your logo.

  • Keep it short and straightforward.

  • Use your keywords in this description.
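A minimal sketch of alt text in HTML, using a made-up file name:

    <img src="wenko-logo.png" alt="WenKo LLC logo">

Keep the description short, straightforward, and keyword-aware, as the list above suggests.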

Breadcrumbs, what are they?

Breadcrumbs help keep customers on your website. Breadcrumbs are the text you see at the top of most web pages that tells you where you are within the site. Using breadcrumbs will lower the bounce rate of your website. The bounce rate is the percentage of visitors who leave after viewing only one page; the lower the percentage, the more visitors are exploring your site. Your goal is to keep your customers interested in the information within your web pages. Breadcrumbs help your customers find their way back to a previous page so they can find what they are looking for. For more information on breadcrumbs, the Search Engine Journal article listed in the sources covers the topic in detail.
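Breadcrumbs are usually a simple row of links near the top of the page. A minimal HTML sketch, with placeholder paths:

    <nav aria-label="Breadcrumb">
      <a href="/">Home</a> &gt;
      <a href="/blog/">Blog</a> &gt;
      On-page SEO
    </nav>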

Robots.txt, what is it?

A robots.txt file is used to help search engines understand how to operate within your website. There may be web pages within your website that need to be online, but that you don't want the public to access. For example, a log-in page does not have information that will be searched for by a search engine. You would use directives in a robots.txt file to give instructions about what to do with these kinds of pages.

  • Use Robots.txt files for pages you don't want to include in search results.

  • Don't use the robots.txt file for pages you have removed from your website. There are other ways to deal with deleted pages.

  • The robots.txt file is for bots, or web crawlers, that scan your pages.
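A minimal robots.txt sketch, assuming a hypothetical /login/ page you want crawlers to skip:

    User-agent: *
    Disallow: /login/
    Sitemap: https://www.example.com/sitemap.xml

The file lives in the website root folder, and the Sitemap line is optional but helps bots find your sitemap.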

Site Structure, what is it?

Your site structure is essential to the search engine as well as how people use your website. If the site structure is hard for search engines to decipher, your page may not be indexed. Keep it simple to follow.

Are my web pages crawlable or crawler friendly?

Let's find out.

Fixing any website crawlability issues is extremely important for indexing.

Here is what you need to know. First, it is good that you're asking the crawlability question. Some people don't even consider that their site might have issues with indexing. So you are ahead of the game!

What are crawlability problems?

 

If your website is not crawlable, then there will be indexing issues. You want to get out ahead of this problem sooner rather than later if you wish to have your web pages indexed for all visitors to see. You must make your web pages crawlable for search engines.

 

Cleaning up indexing issues is essential. You want to ensure all your web pages are error-free, making your website available and friendly for search engine bots to scan. Search engine bots go through every page of your website to determine whether the page is set up correctly, or crawlable. Search engines report each page's status as a numeric code, and addressing any error codes is critical. What you want to find is a 200 code, meaning the page loaded correctly and is ready to be indexed for your customers. Other codes could be:

  • 301, the page has moved permanently

  • 302, the page has temporarily moved

  • 404, the page is missing

  • 403, access denied

  • 500, a server error

There are more status codes; these are the main ones to look at. Check these codes regularly for each web page on your site to make sure they stay accurate.

 

For example, if a web page is returning a 404 code for a page not found and you know the page exists, maybe the link is broken. A broken link provides a poor user experience (UX). Fixing the link then becomes a priority.

How do I know if there are indexing problems?

 

Listed below are two programs that could help you find indexing problems.

 

• Google Search Console

Google Search Console will send you warnings and also list the issues your site may be having. Consider opening a Google Search Console account. It’s FREE! What’s holding you back?

 

• Screaming Frog

Screaming Frog is an audit program that reports the status code for each web page after the software has scanned your site. Screaming Frog is FREE! Why are you waiting? However, you will not have access to all the features in the free version.

 

A third option is investing in an SEO Audit program or software.

 

• SEO Audit Program

 

There are hundreds of SEO audit programs out there to choose from. Be careful. They vary widely in their capabilities and pricing.

 

You may want to dig deeper with an SEO audit of your whole website. 

 

An SEO audit report helps you find indexing problems and may even tell you where the problem is and how to fix it.

 

The three tools listed above are useful if you understand how to use them to your advantage. Using more than one tool can help you catch all the indexing problems.

 

If you are unsure what you are doing, maybe it is time to hire a professional.

 

Here is a list of items that may cause a poor scanning experience for search engine bots.

 

1. Pages blocked from indexing by robot tags, nofollow links or robots.txt files

 

If a web page is blocked, for any reason, you should investigate why it was blocked. Maybe it was unintentional. When checking for blocked pages, you will need to look in three different areas (see the sketch after this list):

 

• robot tags

• nofollow links

• robots.txt files

 

If you find blocking in any of these three areas, and you want to unblock, you or your professional SEO will need to remove the offending blocking directive. Correcting this type of issue may require knowledge of HTML. With the blocking directive removed, search engines know the page is ready to scan. An SEO audit will uncover blocked pages to investigate.
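For reference, here is what the blocking directives look like in two of the three areas named above; remove them only after confirming the blocking was unintentional:

    <!-- Robot tag, in the page's <head>: don't index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">

    <!-- Nofollow link: tells bots not to follow this particular link -->
    <a href="https://www.example.com/" rel="nofollow">example link</a>

A robots.txt block looks like the sketch in the robots.txt section above.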

 

2. URL structure errors

 

Pay attention to your URLs; they may contain indexing issues too. Check for typos. Correct these typos ASAP to get better results for all your SEO efforts.

 

3. Old URL issues

 

When you update your website, make sure you follow best practices and update the links on all new pages.

 

If you are adding security to your site, make sure all of your links point to secure (https) pages and websites. If you are linking to unsecured websites or web pages, you are compromising your site's security.

 

4. Denied access

 

Check your content access. Do you have a page that all users should or could be reading? Or is there content only registered users have access to? Make sure to check content for access privileges, because you may be losing customers when content they want to see is blocked.

 

5. Server errors, capacity, or misconfiguration

 

When you see a 500 code, there could be a problem with the server. To fix this error, it is a good idea to bring it to the attention of the website developer or hosting company. The server may be offline, your site may have reached its capacity on the server, or something may be misconfigured. Your hosting provider or web developer will help you pinpoint these issues.

 

6. Formatting issues

 

When there are formatting or coding issues, search engines are going to have crawlability problems. It could be a server issue, a firewall issue, or several other issues. You may need a specialist to find this kind of problem. Start with your web developer, hosting company, or SEO specialist.

 

For example, I was working on a website update for a law firm. My client wanted to see the updated site on his office computer and could not get to it. I contacted the hosting company, and they insisted there was no problem with the server. I thought about it for a few days and contacted the hosting company again. The hosting company also provided and oversaw the networking within the office, so I asked whether the onsite server was blocking the new site. The technician grumbled and was not happy with me, as if I was bothering him. He said that if I didn't understand what I was doing, I shouldn't be working on their website. But when he spent a little time looking at the server configuration, he found the security setup on the server was blocking the law firm's ability to see their own website. Problem solved. The message here is to work together to resolve the issue. Be kind to each other. There is no need to place blame or be rude.

7. Sitemap missing or not set up correctly

 

Your sitemap is an essential document for the search engine to read. The sitemap informs the search engine about your site structure and which pages to scan and index, or not scan. Every time you update your website, make sure to update your sitemap. If you add new pages or delete old pages, the sitemap should reflect those changes.

 

There are different types of sitemaps:

• Sitemap.xml for web pages

• Sitemap.xml for photos

• Sitemap.xml for PDFs

• An HTML sitemap for people

The first three on the list can be combined in the same document.

 

The sitemap.xml is the one you should use; it belongs in the website root folder. There is nothing wrong with an HTML sitemap, and it may add to a human's understanding of your website, but make sure you also use the sitemap.xml format because most search bots look for that file.
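A minimal sitemap.xml sketch with a single page entry; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2020-01-15</lastmod>
      </url>
    </urlset>

Add one <url> entry for each page you want scanned and indexed.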

8. Internal links not set up correctly or broken

 

The search engine also looks at the links you have within your website. There should be links between pages with similar information about the subject. 

 

For example, if you have a blog post talking about bunnies and what they eat, and a web page about how to raise rabbits, the two should link to each other within the body copy (see the sketch at the end of this section).

 

• Make sure internal links connect related subjects. Don't just add a link to check that box off.

 

• When you provide this type of link, you are showing people and search engines that you are the expert on the subject by providing information.

 

• Page links also help search engines rank your pages for authority.

 

Beware of having too many internal links on one page; this will trigger a warning from Google. Three to five is ideal unless your content is lengthy; then add a few more.

 

Links to other pages should be correct; any broken links should be found and repaired for a better user experience (UX).
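Using the rabbit example above, an internal link inside the blog post's copy might look like this (the path is hypothetical):

    <p>Now that you know what bunnies eat, see our guide on
    <a href="/how-to-raise-rabbits">how to raise rabbits</a>.</p>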

 

9. Redirects set up wrong

 

Wrong redirects create a poor user experience (UX). As explained above, you need to check the codes that show up in your audit report.

 

• 200, the page loaded correctly

• 404, the page is not found

• 301, the page has moved permanently

• 302, the page has been temporarily moved

• 500, server error

 

When you check the code of the page, you may find broken links and other problems that will prevent your customers and search engines from reaching the information on your web pages. Broken links will cause a poor user experience and also cause indexing issues.
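If your site runs on an Apache server (an assumption; other servers and CMSs handle redirects differently), a permanent 301 redirect can be set with one line in the .htaccess file; the paths below are placeholders:

    Redirect 301 /old-page https://www.example.com/new-page

Avoid chaining redirects; point the old URL directly at the final destination.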

 

10. Speed issues

 

Speed issues are a big problem, and you may need help to solve them. In my experience, many speed issues involve image sizes. Make sure you're compressing your images to get them as small as possible without compromising quality.

 

Here is why you need to be concerned about page speed. You have about three seconds to load your web page before your customer gives up and goes to the next, friendlier site.

 

If your site speed does not improve after you compress all your images, it would be a good idea to hire a professional to sort out the issues causing the slow speed.

 

11. Duplicate content

 

Duplicate content is one of the most frequent issues for a website, and it needs to be addressed so search bots can make better use of your site.

 

What you don't want to do is run out of crawl budget, wasting the search engine's time. Most search engines dedicate only a certain amount of time to crawling each website.

 

There are millions of websites, and time spent on one site is limited. 

 

If you have duplicate content, your entire site may not get scanned and indexed by the search engine. You need to eliminate the duplicate content or redirect it to the original document.

Address duplicate issues with:

• Removing or redirecting the duplicate web page.

• Using the robots.txt file correctly.

• Using meta tags correctly.

• Using a 301 redirect.

• Using rel=canonical (see the sketch below).
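The rel=canonical tag goes in the <head> of the duplicate page and points search engines to the original; the URL below is a placeholder:

    <link rel="canonical" href="https://www.example.com/original-page">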

 

12. JavaScript (JS) and Cascading Style Sheets (CSS) issues

 

When implementing JavaScript and Cascading Style Sheets, be careful not to overdo it. These types of scripts can be read incorrectly when a page is scanned for indexing. It is best to use JavaScript and CSS sparingly to limit any confusion.

 

13. Flash usage

 

Flash is old technology, and avoiding it is best. If you use Flash, it will cause poor indexing.

14. Frame usage

 

Frames are old technology and should be avoided. Frames will cause poor indexing.

 

Many of the above fourteen issues are very technical. You should contact a professional to assist in finding and correcting most of these errors.

 

When you have a website, you should consider the regular maintenance involved to keep the site running smoothly. Anything can happen at any time,  causing problems with crawlability and indexing. The most important thing to think about is giving your customers the best possible user experience (UX) by keeping up with and correcting indexing issues. 

 

A monthly check-up with an SEO audit report will find crawlability and indexing problems. Eliminating indexing issues can be time-consuming, but it is critical to the website, the customer experience, and your rankings.

 

I hope you found this article helpful. If you found it a bit too technical, hire a professional SEO to assist with finding and correcting crawlability and indexing issues on your website.

 

It is an investment, but can you afford to upset your clients when they can't find what they are looking for on your website? What is a sale worth to you? Measure what you could be losing. When you weigh these two outcomes, increased clientele versus lost clients, you will understand how valuable an SEO professional can be to your website's success.

 

FYI: Google support has a wealth of information. I have linked to it often in this article. It is well worth your time to keep this link and refer to it often for updated information.

On-page SEO is an essential process that should not be put on hold any longer. To be competitive on the World Wide Web, you need to optimize every page of your website. The information listed above is just the tip of the iceberg. Keeping up with the changing aspects of SEO is a daily journey in education and information searching.

 

Are your website, web pages, and blog posts up to date with SEO?

WenKo LLC enjoys learning everything about SEO and keeps up to date daily with the algorithm changes search engines like Google are introducing.

How do you think your SEO is doing compared to your competitors?

 

Is your company lagging behind and losing out on possible customers?

 

It's time to either learn more about SEO or hire an SEO expert and stop wasting time. If you're serious about getting more clients with search marketing and on-page SEO,  WenKo LLC would like to be your go-to company.

Sources:

Conway, Richard. How to Get to the Top of Google Search. 2019.

"Guidelines - Search Console Help." Google, https://support.google.com/webmasters/topic/9456575?hl=en&ref_topic=9428048.

Grybniak, Sergey. "Everything You Need to Know About Breadcrumbs & SEO." Search Engine Journal, 15 June 2018, https://www.searchenginejournal.com/breadcrumbs-seo/255007/#close.

"Home." Screaming Frog, https://www.screamingfrog.co.uk/.

"What Are the SEO Benefits of XML & HTML Sitemaps?" The Daily Egg, 30 Apr. 2019, https://www.crazyegg.com/blog/seo-benefits-of-xml-html-sitemaps/.