
A Guide to On-page SEO

 

This guide aims to help you understand what is involved in on-page SEO. With a firm grasp of the basics, you can better see how your web pages rank on search engines.

The Web Page Foundation

Web designers, marketers, business owners, and entrepreneurs are finding that their online presence improves with on-page SEO.

On-page SEO techniques improve crawlability for indexing on search engines. Once your web pages are indexed, your customers have a better chance of finding your products and services. However, when there are indexing problems, your findability goes down. 

On-page SEO is an ongoing process that keeps your web pages crawlable for indexing. Your site will do better when errors are found and corrected. An audit will uncover problems, and analytics will show you which web pages are working well and which ones need more work.

When your web pages go unchecked, you may be unaware of the problems search engines are having indexing them. Unfortunately, when you're unaware of problems, search engines may issue warnings and even drop some of your web pages from the index. A regularly scheduled audit will catch this indexing trouble.

An SEO audit and an analytics report are your friends for keeping your web pages running smoothly.

Keyword Phrase Research

Keyword phrase research gives you a base to work from when creating content. This research will help you find the words already working for the topics on your web page. It will also reveal new content ideas for your web pages and blog that you never thought about.

Your keyword phrases should be the same words your customers use, not industry jargon. Niche jargon is fine when talking to people within your industry, but the customers looking for your products and services will use simpler words in their online searches. Most people don't know or understand your niche industry's terms. Think simpler.

Also look at what your competitors are doing, checking frequency of use and relevancy. With the right keywords, you have a better chance of ranking higher.

Content

Provide content your customer wants to find. Make sure it is in their language or verbiage. Educate your customers and answer any questions they may have. Don't assume your customer understands every aspect of your product or service and how it will benefit them. 

Your end goal is to produce what the customer wants to find and be the expert in your field to provide the information they want. Content includes written information, guides, infographics, charts and graphs, videos, images, gifs, and more.

Be the expert in your field or niche. You become the expert when you provide the information your customers are looking for.

HTTPS, you must have it.

HTTPS stands for Hypertext Transfer Protocol Secure. The S tells your visitors that your site is secure. A few years ago, Google announced that HTTPS would be a ranking factor and warned site owners to secure their websites. Today, your site NEEDS to be secure, or browsers like Chrome will label it "Not secure." A website that is not secure will struggle to rank and will scare visitors away.

Meta Titles are essential.

Next to keyword phrase research, meta titles are the most critical items on your web page. Meta titles are essential because visitors and search engines use your meta title to decide whether your web page is worth visiting. A simple example follows the checklist below.

  • Use your keyword phrase near the beginning of your meta title.

  • The title should be no more than 50 to 60 characters (including spaces), or about 600 pixels.

  • Make sure each page has a unique title that is relevant to the page's topic.
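For example, a title tag for a page like this one might look something like this in the page's HTML head (an illustrative sketch, not the only right way to write it):

  <head>
    <title>On-page SEO | WenKo LLC | Minneapolis</title>
  </head>

The keyword phrase sits at the front, and the whole title runs about 37 characters, well within the limit.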

Meta Title and Description Examples

Meta Description, tell what your page topic is about.

The meta description shows right below the meta title and web address on the search engine results page (SERP). The meta description tells your customers what they will find on the web page when they arrive. There is an example after the checklist below.

  • Write an enticing description for your web page; make it a must-see page, but be truthful.

  • Ensure the text is no longer than 120 to 160 characters, including spaces.

  • Don't be alarmed if the search engine changes this copy. Sometimes the search engine will rewrite the description to better match a searcher's query.
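Here is a sketch of what a meta description might look like in the page's HTML head (the wording is only an illustration):

  <meta name="description" content="Learn what goes into on-page SEO: keyword research, meta titles, alt text, robots.txt, sitemaps, and how to keep your pages crawlable.">

That example runs about 135 characters, comfortably inside the 120-to-160-character range.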

Headlines, or h1 through h6, are essential too.

 

The h1 is the main headline at the top of your web page. Don't confuse it with the meta title (title tag). The h1 headline and the meta title are different things, and each should be unique. Search engines look at your h1 headline to understand the topic of each page. A markup sketch follows the list below.

  • Don't be tempted to reuse the meta title as your page headline; you will be losing out on SEO value if you do.

  • The h1 headline is at the top of your page and is labeled h1 in your code.

  • There should be only one h1 headline per page.

  • Use the most critical information in the headline.

  • The average headline is 50 characters long but can go longer if needed.

  • Subheads are labeled h2 through h6 in the code.
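To make this concrete, the heading structure of a guide like this one might look roughly like the following (a sketch using this article's own section names):

  <h1>A Guide to On-page SEO</h1>
  <h2>Keyword Phrase Research</h2>
  <h2>Meta Titles</h2>
  <h2>Image Optimization</h2>

One h1 at the top, with h2 (and, if needed, h3 through h6) subheads for the sections underneath it.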

Anchor Text Linking.

Anchor text links are the hyperlinks within your copy that point to other web pages with supporting information on the subject. An anchor text link can also act as a citation, crediting the source you used for the information or ideas in your content. Anchor text linking will give you more authority with the search engines, but make sure the sites and pages you link to are high quality. Low-quality pages and sites might hurt your rankings. A short example appears after the list below.

  • Make sure you use sources for your articles and content. Give credit where credit is due.

  • Use descriptive words instead of using 'click here.' People have been reading information on the internet long enough to understand how a link works.

  • Have anchor text linking to other pages within your website too. Make sure these additional links are relevant to the subject on the webpage.
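Borrowing the rabbit example used later in this guide, descriptive anchor text might look like this in the HTML (the URL is hypothetical):

  <!-- Descriptive anchor text tells people and search engines where the link goes -->
  Read more about <a href="https://www.example.com/what-rabbits-eat/">what rabbits eat</a>.

  <!-- Avoid vague anchor text like this -->
  To learn what rabbits eat, <a href="https://www.example.com/what-rabbits-eat/">click here</a>.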

Image Optimization.

Image optimization is a big deal when your web page loads onto your customers' devices. If the images are not optimized, the site will take too long to load, and you may lose their attention, sending them to your competitor. 

Optimization means compressing photos, artwork, charts, and graphs. It is easy to compress images, but most people either don't know how or forget to take this step. Please don't skip it. One site I use to compress my images is www.tinypng.com. It's free and easy to use. They have a paid version if you use them a lot.

Image Alt Text, what is it?

Image alt text is another area people don't know about, or they don't know how or why to include it in their on-page SEO. Many people use a screen reader to read web pages aloud to them. When you add image alt text, you tell the listener what the image represents. Alt text also helps search engines understand your images, which can help your pages rank. See the example after the list below.

  • Make sure you add image alt text to all images, including your logo.

  • Keep it short and straightforward.

  • Use your keywords in this description.
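In the HTML, alt text is simply an attribute on the image tag. A quick sketch with hypothetical file names:

  <img src="wenko-logo.png" alt="WenKo LLC logo">
  <img src="rabbit-eating.jpg" alt="Rabbit eating timothy hay in a backyard hutch">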

Breadcrumbs.

Breadcrumbs help keep customers on your website. Breadcrumbs are the text you see at the top of most web pages that tells you where you are within the site. Using breadcrumbs can lower the bounce rate of your website. The bounce rate is the percentage of visitors who leave after viewing only one page; the lower the percentage, the more of your site people are exploring. Your goal is to keep your customers interested in the information within your web pages. Breadcrumbs help your customers find their way back to a previous page so they can find what they are looking for. For more information on breadcrumbs, this article covers the topic in detail.

Breadcrumb Example
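A minimal sketch of what breadcrumb navigation might look like in the markup, using hypothetical page names:

  <nav aria-label="Breadcrumb">
    <a href="/">Home</a> &gt;
    <a href="/guides/">Guides</a> &gt;
    <span>On-page SEO</span>
  </nav>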

Robots.txt.

A robots.txt file helps search engines understand how to operate within your website. There may be pages on your website that need to be online but that you don't want showing up in search results. For example, a login page does not contain information anyone is searching for. You use directives in a robots.txt file to tell crawlers what to do with these pages. A sample file appears after the list below.

  • Use Robots.txt files for pages you don't want to include in search results.

  • Don't use the robots.txt file for pages you have removed from your website; there are other ways to deal with deleted pages.

  • Robots.txt file is for bots or web crawlers that scan your pages.

  • Not all search engine bots obey robots.txt files.

  • If you need to secure a page on your website, robots.txt is NOT the way to do it.
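Here is a minimal sketch of a robots.txt file that keeps crawlers out of a hypothetical login area and points them to the sitemap:

  User-agent: *
  Disallow: /login/

  Sitemap: https://www.example.com/sitemap.xml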

 

Site Structure.

Your site structure is essential to search engines and to how people use your website. If the structure is complicated for search engines to decipher, your pages may not be indexed. Keep it simple to follow.


Are my web pages crawlable or crawler friendly?

Let's find out.

Fixing indexing trouble is crucial for listing on the Search Engine Results Page (SERP).

Here is what you need to know. First, it is good that you're asking the crawlability question. Some people don't even consider that their site might have issues with indexing. So you are ahead of the game!

What is crawlability trouble?

 

If your website is not crawlable, indexing might not happen. You want to get ahead of this problem sooner rather than later if you wish to have your web pages indexed and findable by visitors. Make your web pages crawlable so search engines can index them.

 

Cleaning up indexing issues is essential. In addition, you want to ensure all your web pages are error-free; making your website available and friendly for search engine bots to scan is critical.

 

Search engine bots go through every page of your website to determine whether the page is set up correctly and crawlable. You want each page to return a 200 code, meaning the page loaded successfully and is ready for indexing. The main codes are:

  • 200, the page loaded successfully

  • 301, the page has moved 

  • 302, the page has temporarily moved 

  • 404, the page is missing

  • 403, denied access

  • 500, a server error

There are more error codes, but these are the main ones to look for. Check these codes regularly for each web page on your site to catch new errors.

 

For example, if a web page returns a 404 code for a page not found and you know the page exists, maybe the link is broken. A broken link provides a poor user experience (UX). Fixing the link then becomes a priority.

How do I know if there are indexing problems?

 

Listed below are two programs that could help you find indexing problems.

• Google Search Console

 

Google Search Console will send you warnings and list the issues your site may have. Consider opening a Google Search Console account. It's FREE! What's holding you back?

• Screaming Frog

 

Screaming Frog is an audit program that reports each web page's response code after it scans your site. Screaming Frog is FREE! Why are you waiting? However, you will not have access to all the features in the free version.

A third option is investing in an SEO Audit program or software.

• SEO Audit Program

There are hundreds of SEO audit programs out there to choose from. But be careful; they vary widely in their capabilities and pricing.

You may want to dig deeper with an SEO audit of your website.

 

An SEO audit report helps you find indexing problems and may even tell you where the problem is and how to fix it.

The three tools listed above are beneficial if you understand how to use them to your advantage. In addition, using more than one tool can help you catch all the indexing problems.

If you are unsure what you are doing, maybe it is time to hire a professional.

Here is a list of items that may cause a poor scanning experience for search engine bots.


 

1. Pages are blocked from indexing by robots meta tags, nofollow links, or robots.txt files.

 

If web pages are blocked for any reason, investigate why they were blocked. Maybe it was unintentional. When unblocking these pages, you will need to check three different areas.

• Robots meta tags

• Nofollow links 

• robots.txt files

If you find blocking in any of these three areas and want to unblock the pages, you or your SEO professional will need to remove the offending directive. Correcting this type of issue may require knowledge of HTML. With the blocking directive removed, search engines know the page is ready to scan. An SEO audit will uncover blocked pages to investigate. A sketch of what these directives look like follows below.
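For reference, a robots meta tag and a nofollow link look something like this in a page's HTML (the URL is hypothetical):

  <!-- Tells search engines not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">

  <!-- A single link marked nofollow -->
  <a href="https://www.example.com/partner/" rel="nofollow">partner page</a>

If a page you want indexed carries a noindex tag like this, removing the tag is the fix.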

 

2. URL structure errors

 

Please pay attention to your URLs; they may contain indexing issues too. Check for typos and correct them ASAP to get better results from all your SEO efforts. Also, make sure you set up 301 redirects if you change any URLs.
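If your site runs on an Apache server, a 301 redirect for a renamed page can be added to the .htaccess file; here is a minimal sketch with hypothetical paths:

  # Permanently send the old address to the new one
  Redirect 301 /old-services-page/ https://www.example.com/services/

Most content management systems and website builders offer their own redirect settings, but the idea is the same: the old address should point permanently to the new one.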

 

3. Old URL issues

 

When you update your website, follow best practices and update your links so they point to the new pages.

If you add security (HTTPS) to your site, update your links so they point to the secure versions of your pages. Linking to unsecured websites or web pages compromises your site's safety, so make sure your links go to secure (HTTPS) addresses wherever possible.

4. Denied access

 

Check your content access. Is there a page that all users should be able to read? Or is there content only registered users can access? Check content for access privileges, because you may lose customers when content they want to see is blocked.

 

5. Server errors, capacity, or misconfiguration

 

When you see a 500 code, there could be a problem with the server. It is a good idea to bring it to the website developer or hosting company's attention to fix this error. For example, the server may be offline, your site may have reached its capacity on the server, or something may be misconfigured. Your hosting provider or web developer will help you pinpoint these issues.

 

6. Formatting issues

 

When there are formatting or coding issues, search engines will have crawlability problems. It could be the server, a firewall, or several other issues. You may need to contact a specialist to find the problem. Start with your web developer, hosting company, or SEO specialist to track down these issues.

 

For example, I was working on a website update for a law firm. My client wanted to see the updated site on his office computer and could not get to it. I contacted the hosting company, and they insisted there was no problem with the server. I thought about it for a few days and contacted the hosting company again. The hosting company also provided and oversaw the networking within the office, so I asked whether the onsite server was blocking the new site. The technician grumbled and was unhappy with me, as if I was bothering him and shouldn't be working on the website if I didn't understand what I was doing. But when he spent a little time checking the server's configuration, he found that its security setup was blocking the law firm's ability to see its own website. Problem solved. The message here is to work together to resolve the issue. Be kind to each other. There is no need to place blame or be rude.

7. Sitemap missing or not set up correctly


Your sitemap is an essential document for the search engine to read. The sitemap informs the search engine about the site structure and which pages to scan and index or skip. Every time you update your website, make sure to update your sitemap. If you add new pages or delete old ones, the sitemap should reflect those changes.

There are different types of sitemaps:

• Sitemap.xml for web pages

• Sitemap.xml for photos

• Sitemap.xml for PDFs

• An HTML sitemap (sitemap.html) for people

The first three on the list (the XML sitemaps for web pages, photos, and PDFs) can be combined in the same document or kept separate.

The sitemap.xml is the one search bots look for, and it sits in the website's root folder. There is nothing wrong with also having an HTML sitemap, which can add to a human visitor's understanding of your website, but make sure you have the sitemap.xml format because most search bots look for that file. A small example follows below.
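Here is a sketch of a small sitemap.xml listing two hypothetical pages:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
    </url>
    <url>
      <loc>https://www.example.com/services/</loc>
    </url>
  </urlset>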

8. Internal links not set up correctly or broken

 

The search engine also looks at the links you have within your website. There should be links between pages with similar information about the subject. 

For example, if you have a blog post about bunnies and what they eat and a web page about raising rabbits, the two should have links within the copy content. 

• Make sure internal links are related to topics. 

• When you provide these links, you show people and search engines that you are the expert on the subject.

• Page links also help search engines rank your pages for authority.

Links to other pages should be correct; find and repair any broken links for a better user experience (UX).

 

9. Redirects set up wrong

 

Wrong redirects create a poor user experience (UX). As explained above, you need to check the codes in your audit report. 

• 200 page is loaded correctly

• 404 page is not found

• 301 page has moved permanently

• 302 page has been temporarily moved

• 500, a server error

When you check each page's code, you may find broken links and other problems that prevent your customers and search engines from reaching the information on your web pages. Broken links cause a poor user experience and also cause indexing issues.
 

 

10. Speed issues

 

I have found that many speed issues involve image sizes. So ensure you're compressing your images to get them as small as possible without compromising quality.

Here is why you need to be concerned about page speed: you have about three seconds to load your web page before your customer gives up and goes to a more user-friendly site.

 

If your site speed does not improve after compressing all your images, it would be a good idea to hire a professional to sort out the issues causing the slow speed.

 

11. Duplicate content

 

Duplicate content is one of the most frequent issues for a website. Duplicate content will need to be addressed for better search bot usage.

You don't want to waste your crawl budget by wasting the search engine's time. Most search engines dedicate only a certain amount of time to crawling each website.

There are millions of websites, and time spent on one site is limited. 

If you have duplicate content, some of your pages may not get scanned and indexed by the search engine. You need to eliminate the duplicate content or use rel=canonical to tell the search engine which page is the original. An example appears after the list below.

Address duplicate content issues by:

• Removing the duplicate or using rel=canonical.

• Using the robots.txt file correctly.

• Using meta tags correctly.

• Using a 301 redirect.
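For example, a rel=canonical tag placed in the head of the duplicate page points search engines to the original version (the URL is hypothetical):

  <link rel="canonical" href="https://www.example.com/original-page/">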

 

12. JavaScript (JS) and Cascading Style Sheet (CSS) issues

 

When implementing JavaScript and Cascading Style Sheets, be careful not to overdo it. These types of scripts can be misread when a page is scanned for indexing. Therefore, it is best to keep JavaScript and CSS as lean as possible to limit confusion.

 

13. Flash usage

Flash, like frames, is old technology and should be avoided. It will cause poor indexing.

Many of the above thirteen issues are very technical. Therefore, you should contact a professional SEO to assist in finding and correcting most of these errors.

 

When you have a website, you should consider the regular maintenance involved to keep the site running smoothly. Anything can happen at any time, causing problems with crawlability and indexing.

The most important thing to consider is giving your customers the best possible user experience (UX) by keeping up with and correcting indexing issues. 

A monthly check-up with an SEO audit report will find crawlability and indexing problems. Eliminating indexing issues can be time-consuming, but it is critical to the website, the customer experience, and indexing.

I hope you find this article helpful. If you found it a bit too technical, hire a professional SEO to assist with finding and correcting crawlability and indexing issues on your website. 

FYI: Google support has a wealth of information. I have linked to it often in this article. It is well worth your time to keep this link and refer to it often for updated information.

On-page SEO is an essential process; when it's not done, it will hold your site back. To be competitive on the internet, you need to optimize every page of your website. The information listed above is just the tip of the iceberg. Keeping up with the changing aspects of SEO is a daily journey of education and research.

On-page SEO is an investment you need for the long haul on your website. So don't skip the SEO. 

Are your website, web pages, and blog posts up to date with SEO?

WenKo LLC enjoys learning everything about SEO and keeps up to date daily with the algorithm changes that search engines like Google introduce.

How do you think your SEO is doing compared to your competitors?

 

Is your company lagging and losing out on possible customers?

 

It's time to either learn more about SEO or hire an SEO expert and stop wasting time. If you're serious about getting more clients with search marketing and on-page SEO,  WenKo would like to be your go-to company.

Sources:

Conway, Richard. How to Get to the Top of Google Search. 2019.

"Guidelines - Search Console Help." Google, https://support.google.com/webmasters/topic/9456575?hl=en&ref_topic=9428048.

Grybniak, Sergey. "Everything You Need to Know About Breadcrumbs & SEO." Search Engine Journal, 15 June 2018, https://www.searchenginejournal.com/breadcrumbs-seo/255007/#close.

"Home." Screaming Frog, https://www.screamingfrog.co.uk/.

"What Are the SEO Benefits of XML & HTML Sitemaps?" The Daily Egg, 30 Apr. 2019, https://www.crazyegg.com/blog/seo-benefits-of-xml-html-sitemaps/.
