What is On-page SEO?


If search bots were visible, what would they look like?

A Guide to On-page SEO


The purpose of this guide is to help you understand what on-page SEO involves. With a firm understanding, you will have a better idea of how your web pages rank on search engines.

The Web Page Foundation

Web designers, marketers, business owners, and entrepreneurs are finding that their online presence improves with on-page SEO.


On-page SEO techniques improve crawlability for indexing on search engines. Once your web pages are indexed, your customers have a better chance of finding your products and services. When there are indexing problems, your findability goes down.

On-page SEO is an ongoing process that keeps your web pages crawlable for indexing. Improvements come through audits and analytics: when there is trouble, an audit will find the problem, and analytics will show you which web pages are working well and which need more work.

When your web pages go unchecked, you may be unaware of problems search engines are having indexing them. While you're unaware, search engines could be issuing warnings and even de-indexing some of your web pages. A regularly scheduled audit will catch this indexing trouble.

SEO audits and analytics reports are your friends when it comes to keeping your web pages running smoothly.

Keyword Research

Keyword research gives you a base to work from when creating content. It helps you find the words already ranking for the topics on your web page, and it will give you new content ideas for your web pages and blog by revealing angles you never thought of.


Your keywords should be the same words your customers use, not industry jargon. Niche jargon may be what you use when talking to people within your industry, but the customers looking for your products and services will use simpler words in their online searches. Most people don't know or understand your niche industry's terms. Think simpler.

When researching keywords, you will be looking at what your competitors are doing, checking frequency of use, and relevancy. With the right keywords, you may have a chance at ranking higher.


Provide content your customers want to find, and make sure it is in their language. Educate your customers and answer any questions they may have. Don't assume your customer understands every aspect of your product or service and how it will benefit them.

Your end goal is to produce what the customer wants to find; be the expert in your field and provide the information they want. Content includes written information, guides, infographics, charts and graphs, videos, images, GIFs, and more.


Be the expert in your field or niche. When you provide the information your customers are looking for you become the expert. 

HTTPS, what is it?

HTTPS stands for Hypertext Transfer Protocol Secure. The S tells your visitors that your site is secure. In the not-too-distant past, Google warned site owners to secure their websites, saying security would become a ranking factor. Today, your site NEEDS to be secure, or Google will label it with a warning saying your site is not secure. Unsecured websites have little chance of a page-one listing.

Meta Title, what is it?

Next to keyword research, meta titles are the most critical items on your web page. Visitors and search engines will use your meta title to decide if they want to visit your webpage.

  • Use your keywords near the beginning of your meta title.

  • The title should be no more than 50 to 60 characters (including spaces), or about 600 pixels.

  • Make sure each page has a unique title that is relevant to the subject of the page.
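In the HTML, the meta title is the `<title>` tag inside the page's `<head>`. A hypothetical sketch (the page topic and wording are made up) that keeps the keyword near the front and the length under 60 characters:

```html
<head>
  <!-- Keyword near the beginning; under 60 characters including spaces -->
  <title>Raising Rabbits at Home: A Beginner's Guide</title>
</head>
```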

Meta Title and Description Examples

Meta Description, what is it?

The meta description shows right below the meta title and web address on the search engine results page (SERP). It describes what your customers will find on the web page when they arrive.

  • Write enticing copy, make your web page a must-see page.

  • Make sure the text is no longer than 120 to 160 characters, including spaces.

  • Don't be alarmed if the search engine changes this copy. Sometimes the search engine will rewrite it to better match a searcher's query.
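In the HTML, the meta description sits in the `<head>` alongside the title. A hypothetical sketch, aiming for the 120-160 character range:

```html
<head>
  <!-- Enticing, accurate copy in the 120-160 character range -->
  <meta name="description"
        content="Learn how to raise healthy rabbits at home, with housing, feeding, and daily care tips every first-time rabbit owner should know.">
</head>
```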

Headlines, or h1 through h6, what are they?

The h1 is the main headline at the top of your web page. Don't get this confused with the meta tag or title tag. The h1 headline and the meta tag or page title are two different things. Each one should be different and unique. Search engines will look at your h1 headline and use it as a ranking factor for the web page.

  • Don't be tempted to reuse the meta title as your page headline; you will be losing out on SEO value if you do.

  • The h1 headline is at the top of your page and is labeled h1 in your code.

  • There should be only one h1 headline per page.

  • Use the most critical information in the headline.

  • The average headline is 50 characters long but can go longer if needed.

  • Subheads are labeled h2 through h6 in the code.
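In the code, the heading structure can be sketched like this hypothetical outline: one h1 at the top, with h2 through h6 subheads beneath it:

```html
<h1>Raising Rabbits at Home</h1>      <!-- one h1 per page -->
  <h2>Choosing a Hutch</h2>           <!-- subheads use h2 through h6 -->
  <h2>What Rabbits Eat</h2>
    <h3>Fresh Greens and Hay</h3>
```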

Anchor Text Linking, what is it?

Anchor text links are the hypertext links within your copy that go to other web pages for supporting information on the subject. An anchor text link can also be a citation, giving credit to the source of the information or ideas you apply in your content. Anchor text linking provides you with more authority in the search engines, but make sure the sites and pages you're linking to are high quality. Low-quality pages and sites might hurt your rankings.

  • Make sure you use sources for your articles and content. Give credit where credit is due.

  • Use descriptive words instead of using 'click here.' People have been reading information on the internet long enough to understand how a link works.

  • Have anchor text linking to other pages within your website too. Make sure these additional links are relevant to the subject on the webpage.
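A hypothetical example of descriptive anchor text, with one external citation and one internal link (all URLs are made up):

```html
<!-- Descriptive anchor text instead of 'click here' -->
<p>Rabbits need plenty of hay; see this
  <a href="https://www.example.org/rabbit-diet-study">rabbit diet study</a>
  and our own guide to
  <a href="/blog/what-rabbits-eat/">what rabbits eat</a>.</p>
```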

Image Optimization, what is it?

Image optimization is a big deal when your web page is loading onto your customers' devices. If the images are not optimized, the site will take too long to load, and you may lose their attention, sending them to your competitor. 


Optimization means compressing photos, artwork, charts, and graphs. It is easy to compress images, but most people either don't know how to do it or forget to take this step. Please don't skip it. One site I use to compress my images is www.tinypng.com. It's free and easy to use. They do have a paid version if you tend to use them a lot.

Image Alt Text, what is it?

Image alt text is another area people don't know about, or don't know how or why to include it in their on-page SEO. Many people use a screen reader to read web content aloud to them; when you add image alt text, you tell the listener what the image represents. FYI: Google gives a higher ranking to pages that include image alt text.

  • Make sure you add image alt text to all images, including your logo.

  • Keep it short and straightforward.

  • Use your keywords in this description.
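In the HTML, alt text is the `alt` attribute on each image. A hypothetical sketch (the filenames are made up), keeping the descriptions short and working a keyword in naturally:

```html
<img src="wenko-logo.png" alt="WenKo LLC logo">
<img src="hutch.jpg" alt="Two-story wooden rabbit hutch for raising rabbits">
```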

Breadcrumbs, what are they?

Breadcrumbs help keep customers on your website. Breadcrumbs are the text you see at the top of most web pages that tells you where you are within the site. Using breadcrumbs can lower your website's bounce rate. The bounce rate is the percentage of visitors who leave after viewing only one page; the lower the percentage, the more of your pages visitors are exploring. Your goal is to keep your customers interested in the information within your web pages, and breadcrumbs help them find their way back to a previous page. For more information on breadcrumbs, this article covers the topic in detail.

Breadcrumb Example
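A breadcrumb trail can be as simple as this hypothetical markup near the top of a page:

```html
<!-- Shows visitors where they are and gives them a path back -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/rabbits/">Rabbits</a> &gt;
  <span>Raising Rabbits at Home</span>
</nav>
```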

Robots.txt, what is it?

A robots.txt file helps search engines understand how to operate within your website. There may be pages that need to be online but that you don't want surfaced in search results. For example, a log-in page contains no information searchers are looking for. You use directives in a robots.txt file to give instructions about what to do with these pages.

  • Use Robots.txt files for pages you don't want to include in search results.

  • Don't use the robots.txt file for pages you have removed from your website. There are other ways to deal with deleted pages.

  • Robots.txt file is for bots or web crawlers that scan your pages.

  • Not all search engines read or honor robots.txt files.

  • If you need to secure a page on your website, a robots.txt is NOT the way to do it.
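A minimal robots.txt sketch (the paths and domain are hypothetical) that keeps bots out of a log-in area while pointing them at the sitemap:

```
# robots.txt - lives at the root of the website
User-agent: *
Disallow: /login/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```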

Site Structure, what is it?

Your site structure is essential to the search engine as well as how people use your website. If the site structure is complicated for search engines to decipher, your page may not be indexed. So please keep it simple to follow.


Are my webpages crawlable
or crawler friendly?

Let's find out.

Fixing indexing trouble is extremely important for listing on the Search Engine Results Page (SERP).

Here is what you need to know.  First, it is good you’re asking the crawlability question. Some people don’t even consider their site might have issues with indexing. So you are ahead of the game!

What is crawlability trouble?


If your website is not crawlable, search engine spiders will have indexing trouble. You want to get ahead of this problem sooner rather than later if you want your web pages indexed for all visitors to see.


Cleaning up indexing issues is essential. You want to ensure all your web pages are error-free; making your website available and friendly for search engine bots to scan is critical.


Search engine bots go through every page of your website to determine if the page is set up correctly or crawlable. You want to find a 200 code, meaning the page is good and indexed, ready for your customers. Other codes could be:

  • 200, the page is indexed

  • 301, the page has moved 

  • 302, the page has temporarily moved 

  • 404, the page is missing

  • 403, denied access

  • 500, a server error

There are more status codes, but these are the main ones to look for. Check the codes regularly for each web page on your site to make sure they stay accurate.


For example, if a web page is returning a 404 code for a page not found and you know the page exists, maybe the link is broken. A broken link provides a poor user experience (UX). Fixing the link then becomes a priority.
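As a sketch of what an audit tool does behind the scenes, the following Python (standard library only; the helper names and wording are my own, not from any particular tool) fetches a page's status code and translates the common codes listed above into plain language:

```python
# A minimal sketch of a status-code check, using only the standard library.
from urllib import error, request

# Plain-language meanings for the common codes discussed above
STATUS_MEANINGS = {
    200: "OK - the page loaded and can be indexed",
    301: "Moved permanently",
    302: "Moved temporarily",
    403: "Access denied",
    404: "Page not found",
    500: "Server error",
}


def describe_status(code: int) -> str:
    """Translate an HTTP status code into the plain-language meaning above."""
    return STATUS_MEANINGS.get(code, f"Other status ({code})")


def check_page(url: str) -> int:
    """Return the HTTP status code a URL responds with."""
    try:
        with request.urlopen(url) as resp:
            return resp.status
    except error.HTTPError as exc:
        # urllib raises on 4xx/5xx responses; the code is still what we want
        return exc.code
```

For example, `describe_status(check_page("https://www.example.com/"))` would report whether that home page returns a 200.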

How do I know if there are indexing problems?


Listed below are two programs that could help you find indexing problems.


• Google Search Console

Google Search Console will send you warnings and also list the issues your site may be having. Consider opening a Google Search Console account. It’s FREE! What’s holding you back?


• Screaming Frog

Screaming Frog is an audit program that reports each web page's status code after the software has scanned your pages. Screaming Frog is FREE! Why are you waiting? However, you will not have access to all the features in the free version.


A third option is investing in an SEO Audit program or software.


• SEO Audit Program


There are hundreds of SEO audit programs out there to choose from. Be careful. They vary widely in their capabilities and pricing.


You may want to dig deeper with an SEO audit of your whole website. 


An SEO audit report helps you find indexing problems and may even tell you where the problem is and how to fix it.


The three tools listed above are useful if you understand how to apply them to your advantage. Using more than one tool can help you catch all the indexing problems.


If you are unsure what you are doing, maybe it is time to hire a professional.


Here is a list of items that may cause a poor scanning experience for search engine bots.


1. Pages blocked from indexing by robot tags, no-follow links, or robots.txt files


If a web page is blocked, for any reason, you should investigate why. Maybe the block was unintentional. When unblocking pages, you will need to check three different areas:


• robots meta tags

• nofollow links

• robots.txt files


If you find blocking in any of these three areas and you want to unblock, you or your SEO professional will need to remove the offending blocking directive. Correcting this type of issue may require knowledge of HTML. With the blocking directive removed, search engines know the page is ready to scan. An SEO audit will uncover blocked pages to investigate.
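For reference, here is what the first two blocking mechanisms look like in hypothetical HTML (the URL is made up):

```html
<!-- In the page's <head>: a robots meta tag that blocks indexing -->
<meta name="robots" content="noindex, nofollow">

<!-- On an individual link: a nofollow attribute -->
<a href="https://www.example.com/page/" rel="nofollow">Example link</a>
```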


2. URL structure errors


Please pay attention to your URLs; they may contain indexing issues too. Check for typos and correct them ASAP to get better results from all your SEO efforts. Also, make sure you set up a 301 redirect if you change any URLs.


3. Old URL issues


When you update your website, make sure you follow best practices and redirect all old URLs to the new pages.


If you add security (HTTPS) to your site, make sure all your links point to secure pages and websites. Linking to unsecured websites or web pages compromises your site's security.


4. Denied access


Check your content access. Do you have a page that all users should or could be reading? Or, is there content only registered users have access to? Make sure to check content for access privileges because you may be losing customers with blocked content they want to see.


5. Server errors, capacity, or misconfiguration


When you see a 500 code, there could be a problem with the server. It is a good idea to bring it to the website developer or hosting company's attention to fix this error. The server may be offline, your site may have reached its capacity on the server, or something may be misconfigured. Your hosting provider or web developer will help you pinpoint these issues.


6. Formatting issues


When there are formatting or coding issues, search engines are going to have crawlability problems. It could be a server issue, a firewall issue, or several other things. You will need to contact a specialist to find the problem. Start with your web developer, hosting company, or SEO specialist to track down these issues.


For example, I was working on a website update for a law firm. My client wanted to see the updated site on his office computer and could not get to it. I contacted the hosting company, and they insisted there was no problem with the server. I thought about it for a few days and contacted the hosting company again. The hosting company also provided and oversaw the networking within the office, so I asked whether the onsite server was blocking the new site. The technician grumbled, acting as if I was bothering him and implying that if I didn't understand what I was doing, I shouldn't be working on the website. But when he spent a little time checking the server's configuration, he found its security setup was blocking the law firm's ability to see their own website. Problem solved. The message here is to work together to resolve the issue. Be kind to each other. There is no need to place blame or be rude.

7. Sitemap missing or not set up correctly


Your sitemap is an essential document for the search engine to read. The sitemap informs the search engine about your site structure and which pages to scan and index or skip. Every time you update your website, make sure to update your sitemap. If you add new pages or delete old ones, the sitemap should reflect those changes.


There are different types of sitemaps:


• Sitemap.xml for web pages

• Sitemap.xml for images

• Sitemap.xml for PDFs

• An HTML sitemap for people

The first three on the list, the XML sitemaps, can be in the same document or separate files.


The sitemap.xml is the one you should use, and it belongs in the website's root folder. There is nothing wrong with an HTML sitemap, and it may add to a human's understanding of your website. But make sure you also provide the sitemap.xml format, because most search bots are looking for this file.
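A minimal sitemap.xml sketch (the URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/raising-rabbits/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>
```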

8. Internal links not set up correctly or broken


The search engine also looks at the links you have within your website. There should be links between pages with similar information about the subject. 


For example, if you have a blog post talking about bunnies and what they eat and a web page about raising rabbits, the two should have links within the copy content to each other. 


• Make sure internal links point to related subjects.

• When you provide this type of link, you show people and search engines that you are the expert on the subject through the information you provide.

• Page links also help search engines rank your pages for authority.


Links to other pages should be correct; find and repair any broken links for a better user experience (UX).


9. Redirects set up incorrectly


Wrong redirects create a poor user experience (UX). As explained above, you need to check the codes that show up in your audit report.


• 200, the page loaded correctly

• 404, the page is not found

• 301, the page has moved permanently

• 302, the page has been temporarily moved

• 500, server error


When you check each page's code, you may find broken links and other problems that prevent your customers and search engines from reaching the information on your web pages. Broken links cause a poor user experience and indexing issues.
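On an Apache server, a permanent redirect is often set up in the .htaccess file. A one-line sketch (the paths and domain are hypothetical):

```
# Permanently (301) redirect an old URL to its new location
Redirect 301 /old-page.html https://www.example.com/new-page/
```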


10. Speed issues


In my experience, many speed issues involve image sizes. Make sure you're compressing your images to get them as small as possible without compromising quality.


Here is why you need to be concerned about page speed: you have about three seconds to load your web page before your customer gives up and goes to the next, friendlier site.


If your site speed does not improve after compressing all your images, it would be a good idea to hire a professional to sort out the issues causing the slow speed.
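One more image-related tweak worth knowing: modern browsers can defer loading off-screen images with the loading="lazy" attribute. A hypothetical example (the filename is made up):

```html
<!-- width/height reserve space; loading="lazy" defers off-screen images -->
<img src="hutch.jpg" alt="Wooden rabbit hutch"
     width="800" height="600" loading="lazy">
```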


11. Duplicate content


Duplicate content is one of the most frequent issues for a website. It needs to be addressed so search bots don't waste time crawling the same material twice.


What you don’t want to do is use up your crawl budget and waste the search engine’s time. Most search engines dedicate only a certain amount of time to crawling each website.


There are millions of websites, and time spent on one site is limited. 


If you have duplicate content, your entire site may not get scanned and indexed by the search engine. You need to eliminate the duplicate content, or use rel=canonical to tell the search engine which page is the original.

Address duplicate-content issues by:

• Removing duplicates or using rel=canonical.

• Using the robots.txt file correctly.

• Using meta tags correctly.

• Using a 301 redirect.
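The rel=canonical tag goes in the `<head>` of the duplicate page and points to the original. A hypothetical sketch (the URL is made up):

```html
<!-- Tells search engines which URL is the original version -->
<link rel="canonical" href="https://www.example.com/raising-rabbits/">
```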


12. JavaScript (JS) and Cascading Style Sheet (CSS) issues


When implementing JavaScript (JS) and Cascading Style Sheets (CSS), be careful not to overdo it. These files can be misread when a page is scanned for indexing, so keep your JavaScript and CSS as lean as possible to limit any confusion.


13. Flash usage

Flash and frames are old technologies and should be avoided. In addition, both cause poor indexing.

Many of the above thirteen issues are very technical. You should contact a professional SEO to assist in finding and correcting most of these errors.


When you have a website, you should consider the regular maintenance involved to keep the site running smoothly. Anything can happen at any time,  causing problems with crawlability and indexing.


The most important thing to think about is giving your customers the best possible user experience (UX) by keeping up with and correcting indexing issues. 


A monthly check-up with an SEO audit report will find crawlability and indexing problems. Eliminating indexing issues can be time-consuming, but it is critical to the website, the customer experience, and indexing.


I hope you find this article helpful. If you found it a bit too technical, hire a professional SEO to assist with finding and correcting crawlability and indexing issues on your website. 


It is an investment, but can you afford to upset your clients when they can't find what they are looking for on your website? What is a sale worth to you? Measure what you could be losing. When you weigh the two, increased clientele versus lost clients, you will understand how valuable an SEO professional can be to your website's success.


FYI: Google support has a wealth of information. I have linked to it often in this article. It is well worth your time to keep this link and refer to it often for updated information.

On-page SEO is an essential process that, when not done, will hold your site back. To be competitive on the World Wide Web, you need to optimize every page of your website. The information listed above is just the tip of the iceberg. Keeping up with the changing aspects of SEO is a daily journey in education and information searching.


Are your website, web pages, and blog posts up to date with SEO?

WenKo LLC enjoys learning everything about SEO and keeps up to date daily with the algorithm changes search engines like Google introduce.

How do you think your SEO is doing compared to your competitors?


Is your company lagging and losing out on possible customers?


It's time to either learn more about SEO or hire an SEO expert and stop wasting time. If you're serious about getting more clients with search marketing and on-page SEO, WenKo would like to be your go-to company.


Conway, Richard. How to Get to the Top of Google Search. 2019.

“Guidelines - Search Console Help.” Google.

Grybniak, Sergey. “Everything You Need to Know About Breadcrumbs & SEO.” Search Engine Journal, 15 June 2018, https://www.searchenginejournal.com/breadcrumbs-seo/255007/#close.

“Home.” Screaming Frog, https://www.screamingfrog.co.uk/.

“What Are the SEO Benefits of XML & HTML Sitemaps?” The Daily Egg, 30 Apr. 2019.