On-Page SEO: Improving Your Website's SEO

On-page SEO is the process of ensuring that your site is readable to search engines. Learning correct on-page SEO is not only important for ensuring Google picks up the keywords you want; it is also an opportunity to achieve easy wins and improve your website’s overall performance.

On-page SEO includes the following considerations:

1. Making sure site content is visible to search engines.
2. Making sure your site is not blocking search engines.
3. Making sure search engines pick up the keywords you want.

You can do most on-page SEO yourself, if you have a basic level of experience working with sites.

If you are not technically inclined, please note there are technical sections in this chapter. You should still read these so you understand what has to be done to achieve rankings in Google. Once you know what it takes to achieve top rankings, you can easily hire a web designer or web developer to implement the SEO techniques in this chapter.

How to structure your website for better on-page SEO

These best practices will ensure your site is structured for better recognition by Google and other search engines.

Search engine friendly URLs.

Have you ever visited a web page where the URL looked something like this:

http://www.examplesite.com/~articlepage21/post-entry321.asp?q=3

What a mess!

These kinds of URLs are a quick way to confuse search engines and site visitors. Clean URLs are more logical, user friendly and search engine friendly.

Here is an example of a clean URL:

http://www.examplesite.com/football-jerseys

Much better.

Take a quick look at Google’s search engine results. You will see a very large portion of sites in the top 10 have clean and readable URLs like the above example. And by a very large portion… I mean the vast majority.

Most site content management systems have search engine friendly URLs built into the site. It is often a matter of simply enabling the option in your site settings. If your site doesn’t have search engine friendly URLs, it’s time for a friendly chat with your web developer to fix this up.
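How this option is enabled depends on your system; in WordPress, for example, it is a setting under “Permalinks”. If it has to be set up by hand on an Apache server, the rewrite rule might look something like the sketch below. This is only an illustration, and the page and parameter names are made up:

# .htaccess sketch — assumes Apache with mod_rewrite enabled
RewriteEngine On

# Map the clean URL /football-jerseys to the real dynamic page
RewriteRule ^football-jerseys$ /products.php?category=21 [L]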

Internal navigation

There are no rules on how to structure your site’s navigation. This can be a blessing or a curse.

Some people force visitors to watch an animation or intro before they can even access the site. In the process, these sites make it harder for visitors and more confusing for search engines to pick up the content on the site.

Other sites keep it simple by having a menu running along the top of the site or running down the left-hand side of the browser window. This has pretty much become an industry standard for most sites.

By following this standard, you make it significantly easier for visitors and search engines to understand your site. If you intend to break this convention, you must understand it is likely you will make it harder for search engines to pick up all of the pages on your site.

As a general rule, making it easier for users makes it easier for Google.

Above all else, your website navigation must be made of real text links—not images.

If your main site navigation is currently made up of images, slap your web designer and change them to text now! If you do not have the main navigation featured in text, your internal pages will be almost invisible to Google and other search engines.

For an additional SEO boost, include links on the home page to the pages you want visible to search engines and visitors.

By placing links specifically on the home page, Google’s search engine spider can come along to your site and quickly understand which pages on your site are important and worth including in the search results.
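For example, a simple text-based navigation menu is nothing more than a list of regular HTML links. The page names below are invented for illustration:

<!-- A plain text navigation menu search engines can read -->
<ul>
  <li><a href="/football-jerseys" title="Football Jerseys">Football Jerseys</a></li>
  <li><a href="/basketball-jerseys" title="Basketball Jerseys">Basketball Jerseys</a></li>
  <li><a href="/contact">Contact Us</a></li>
</ul>

Every link here is plain text, so search engines can read the anchor text and follow it to the page.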

How to make Google pick up the keywords you want.

There are many misconceptions being circulated about what to do, and what not to do, when it comes to optimizing keywords into your page.

Some bloggers are going so far as telling their readers not to put keywords in the content of targeted pages at all. These bloggers—I’m not naming names—have the best of intentions, but they have taken worry about Google’s spam detection to the next level.

But it is complete madness.

Not having keywords on your page makes it almost impossible for Google to match your page with the keyword you want to rank for. If Google completely devalued having keywords on the page, Google would be a crappy search engine.

Think about it. If you search for “Ford Mustang 65 Auto Parts” and arrive on pages without those words on the page at all, it’s extremely unlikely you have found what you’re looking for.

Google needs to see the keywords on your page, and these keywords must be visible to your users. The easy approach is to either create content around your keyword, or naturally weave your keyword into the page. I’m not saying your page should look like the following example.

“Welcome to the NFL jersey store. Here we have NFL jerseys galore, with a wide range of NFL jerseys including women’s NFL jerseys, men’s NFL jerseys and children’s NFL jerseys and much, much more.”

This approach may have worked 10 years ago, but not now. The keyword should appear naturally in your page. Any attempts to go bonkers with your keywords will look horrible and may set off spam filters in search engines. Use your keyword naturally throughout the content. Repeating your keyword a couple of times is more than enough.

It’s really that simple.

Next up, you need to ensure you have a handful of LSI keywords on your page. LSI stands for Latent Semantic Indexing. Don’t be discouraged by the technical term; LSI keywords are simply an SEO term for related phrases. Google believes a page is more naturally written, and has a higher tendency to be good quality and relevant, if it also includes keywords related to your main phrase.

To successfully optimize a page, you need to have your main keyword and related keywords in the page. Find two or three keywords related to your main keyword, and repeat each of these in the page two or three times. For example, if your main keyword is “NFL jerseys”, related keywords might be “cheap NFL jerseys” and “authentic NFL jerseys”. Ubersuggest is a great tool for finding keywords Google considers related to your main keywords—it does this by pulling suggestions from Google’s auto-suggest box. Use Ubersuggest and your keyword research to determine a list of the most related keywords.

Ubersuggest – Free
http://ubersuggest.org

Areas you can weave keywords into the page include the following (see the example after this list):

  • Meta description and meta title tags
  • Navigation anchor text
  • Navigation anchor title tags
  • Headings (h1, h2, h3, and h4 tags)
  • Content text
  • Bolded and italicized text
  • Internal links in content
  • Image filename, image alt tag and image title tag
  • Video filename, video title
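To make this concrete, below is a rough sketch of a few of these placements for the keyword “NFL jerseys”. The page text, link, and image names are invented for illustration:

<title>NFL Jerseys | Paul's NFL Jersey Store</title>
<meta name="description" content="Shop our wide range of NFL jerseys for men, women and kids."/>
...
<h1>NFL Jerseys</h1>
<p>Browse our range of <strong>NFL jerseys</strong>, including
<a href="/womens-nfl-jerseys" title="Women's NFL Jerseys">women's NFL jerseys</a>.</p>
<img src="nfl-jerseys.jpg" alt="NFL jerseys" title="NFL jerseys"/>

Notice the keyword reads naturally in each spot; nothing here looks stuffed.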

How to get more people clicking on your rankings in Google.

Meta tags have been widely misunderstood as mysterious pieces of code SEO professionals mess around with, and the secret to attaining top rankings. This couldn’t be further from the truth.

The function of meta tags is really quite simple. Meta tags are bits of code on your site controlling how your site appears in Google.

If you don’t fill out your meta tags, Google will automatically use text from your site to create your search listing. This is exactly what you don’t want Google to do, because your listing can end up looking like gibberish! Fill out these tags correctly, and you can increase the number of people clicking through to your site from the search engine results.

Below is an example of the meta tag code.

<title>Paul's NFL Jerseys</title>
<meta name="description" content="Buy NFL jerseys online. Wide range of colors and sizes."/>
<meta name="robots" content="noodp, noydir"/>

Below is an example of how a page with the above meta tag should appear as a search engine result in Google:

Paul’s NFL Jerseys
Buy NFL jerseys online. Wide range of colors and sizes.
http://www.yoursite.com/

Pretty simple, huh?

The title tag has a character limit of roughly 70 characters in Google. Use any more than 70 characters and it is likely Google will truncate your title tag in the search engine results.

The meta description tag has a character limit of roughly 155 characters. Just like the title tag, Google will shorten your listing if it has any more than 155 characters in the tag.

The last tag, the meta robots tag, indicates to Google that you want to control how your listing appears in the search results. It’s good to include this; otherwise, Google may ignore your tags and instead use descriptions listed in directories such as the Open Directory Project and the Yahoo Directory.

To change these tags on your site you have three options:

1. Use the software your site is built on. Most content management systems have the option to change these tags. If it doesn’t, you may need to install a plugin to change these tags.

2. Ask your web designer or web developer to manually change your meta tags for you.

3. If you are tech-savvy and familiar with HTML, you can change these tags in the code yourself.

Site load speed—Google magic dust.

How fast (or slow) your site loads is another factor Google takes into account when deciding how it should rank your pages in the search results.

A very well-known Google employee, Matt Cutts, publicly admitted fast load speed is a positive ranking factor.

If your site is as slow as a dead snail, then it is likely your site is not living up to its potential in the search engines. If your site load time is average, improving the load speed is an opportunity for an easy SEO boost.

Not only is load speed a contributing factor to achieving top rankings in Google, extensive industry reports have shown that for each second shaved off a site’s load time, there is an average increase of 7% in the site’s conversion rate. In other words, the faster your site loads, the more chance you have of people completing a sale or filling out an inquiry form. Clearly this is not an aspect of your site to be overlooked.

Fortunately there are a handful of tools that make it easy to improve your load speed.

1. Google PageSpeed Insights
https://developers.google.com/speed/pagespeed/insights

Google’s great free tool, PageSpeed Insights, gives your page a load score out of 100, so you can see how your load speed compares to other sites. You can also see how well your site loads on mobile and desktop. Scores closer to 100 are near perfect.

After running a test on your site, the tool will give you a list of high priority, medium priority and low priority areas for improvement. You can forward these on to your developer to speed up your site, or if you are a bit of a tech-head, you can have a crack at fixing these up yourself.

2. Pingdom Tools
http://tools.pingdom.com/

Pingdom Tools is great for an overview of how long your site takes to load in different areas of the world, and for a quick breakdown of files and resources that are slowing your site down.

After the test is completed, if you scroll down you will see a list of the files each visitor has to download each time they visit your site. If you discover files that can be decreased in size, you can improve your site load speed.

Easy targets for improvement are large images. If you have any images over 200kb, these can usually be optimized and shrunk down to a fraction of their size without any noticeable loss in quality. Take note of these files, send them to your web developer or web designer, and ask them to compress them to a smaller file size.
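Many of the other “high priority” items these tools flag boil down to compression and caching. If your site runs on Apache, the fixes can be as simple as a few lines in the .htaccess file. The below is a sketch only, and assumes the mod_deflate and mod_expires modules are enabled:

# Compress HTML, CSS and JavaScript before sending them to visitors
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Tell browsers to keep images for a month instead of re-downloading them
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>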

The usual suspects—sitemaps.xml and robots.txt

Sitemaps.xml

Search engines automatically look for a special file on each site called the sitemaps.xml file. Having this file on your site is a must for making it easy for search engines to discover pages on your site. Sitemaps are essentially a giant map of all the pages on your site. Fortunately, creating this file and getting it onto your site is a straightforward process.

Most CMS systems automatically generate a sitemaps file. This includes systems like WordPress, Magento and Shopify. If this is not the case on your site, you may need to install a plugin or use the free XML Sitemaps Generator tool, which will automatically create a sitemaps.xml file for you.

XML Sitemaps Generator
http://www.xml-sitemaps.com

Next ask your web developer or web designer to upload it into the main directory of your site, or do it yourself if you have FTP access. Once uploaded, the file should be publicly accessible with an address like the below example:

http://www.yoursite.com/sitemaps.xml
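The contents of the file are just a plain list of your pages in a standard XML format. A minimal example, using made-up addresses, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/football-jerseys</loc>
  </url>
</urlset>

Generator tools and CMS plugins produce exactly this kind of file, usually with extra optional details such as the date each page last changed.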

Once you have done this, you should submit your sitemap to the Google Search Console account for your site.

If you do not have a Google Search Console account, the below article by Google gives simple instructions for web developers or web designers to set this up.

Add and verify a site to Google Search Console
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34592

Log in to your account and click on your site. Under “site configuration” click “sitemaps”, and in the textbox, enter the full address of your sitemap.

Robots.txt

Another must-have for every site is a robots.txt file. This should sit in the same place as your sitemaps.xml file. The address to this file should look the same as the example below:

http://www.yoursite.com/robots.txt

The robots.txt file is a simple file that tells search engines which areas of your site you don’t want listed in the search engine results.

There is no real ranking boost from having a robots.txt file on your site, but it is essential to check you don’t have one blocking areas of your site you want search engines to find.

The robots.txt file is just a plain text document; its contents should look something like the below:

# robots.txt good example

User-agent: *
Disallow: /admin
Disallow: /logs

If you want to tell search engines not to crawl your site at all, your file will look like the next example. If you do not want your entire site blocked, you must make sure it does not look like the example below. It is always a good idea to double-check it is not set up this way, just to be safe.

# robots.txt – blocking the entire site

User-agent: *
Disallow: /

The forward slash in this example tells search engines their software should not visit anything from the root directory down, which blocks the entire site.

To create your robots.txt file, simply create a plain text document with Notepad if you are on Windows, or TextEdit if you are on Mac OS. Make sure the file is saved as plain text, and use the “robots.txt good example” above as a guide for how it should look. Take care to list any directories you do not want search engines to visit, such as internal folders for staff, admin areas, CMS back-end areas, and so on.

If there aren’t any areas you would like to block, you can skip your robots.txt file altogether, but just double check you don’t have one blocking important areas of the site like the above example.

Duplicate content—canonical tags and other fun.

In later chapters I will describe how Google Panda penalizes sites with duplicate content. Unfortunately, many site content management systems will sometimes automatically create multiple versions of one page.

For example, let’s say your site has a product page on socket wrenches, but because of the system your site is built on, the exact same page can be accessed from multiple URLs from different areas of your site:

http://www.yoursite.com/products.aspx?=23213
http://www.yoursite.com/socket-wrenches
http://www.yoursite.com/tool-kits/socket-wrenches

In the search engine’s eyes this is confusing as hell and multiple versions of the page are considered duplicate content.

To account for this, you should always ensure a special tag, called the rel canonical tag, is placed on every page of your site.

The rel canonical tag indicates the original version of a web page to search engines. By putting the URL of the page you consider the “true” version into the tag, you can indicate which page you want listed in the search results.

Choose the URL that makes the most sense to users and provides the best SEO benefit; this should usually be the URL that reads like plain English.

Using the earlier socket wrenches example, with the tag below, Google would be more likely to display the best version of the page in the search engine results.

<link rel="canonical" href="http://www.yoursite.com/socket-wrenches"/>

As a general rule, include this tag on every page on your site, shortly before the </head> tag in the code.

Usability—the new SEO explained

As mentioned in the first chapter, the trust and relevancy of sites has become increasingly important for Google. Market share for mobile and tablet Internet users skyrocketed to over 29% in 2015—to keep search a good experience for everyone, Google has started to give preference to sites providing a good user experience on all devices. Usability has taken on increased importance in the SEO industry as a result, as SEO pundits found you can gain an advantage by making your site easy to use.

For example, let’s say a mobile user is searching for late night pizza delivery in Los Angeles. One local business has a site with a large number of backlinks but no special support for mobile users. It’s difficult for the user to navigate around the site because it doesn’t automatically fit the screen, and the navigation text is small and hard to use on a touch screen.

Another competing local business has few backlinks, but good support for mobile users. Its design fits perfectly to the screen and has special navigation for mobile users, making it easy to get around.

In many cases, the second site will rank higher than the first for mobile users. This is just one example of how usability can have a significant impact on your rankings.

While a term like usability can understandably seem a little vague, let’s look at practical steps to improve your usability and the SEO strength of your site.

1. Make your site accessible for all devices.

Make your site accessible and easy for all users: desktop, mobile and tablet. The simple way to do this is to make sure your site is responsive, which means it automatically resizes across all devices and has mobile-friendly navigation for mobile users. Mobile support is covered in more detail in Bonus Chapter 2 in the Mobile SEO Update section, but you can enter your site into the below tool to quickly see if Google registers your site as mobile friendly.

Mobile-Friendly Test
https://www.google.com/webmasters/tools/mobile-friendly/
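One small detail worth checking with your developer: a responsive site needs the standard viewport tag in the <head> of every page, otherwise mobile browsers will render pages at desktop width regardless of the design. It is a single line:

<meta name="viewport" content="width=device-width, initial-scale=1"/>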

2. Increase your content quality.

Gone are the days of hiring a bunch of writers in India to bulk out the content on your site. Your content needs to be proofread and edited, and the more “sticky” you make it, the better results you will get. If you provide compelling content, users will spend more time on your site and are less likely to bounce back to the search results. Users will also be much more likely to share your content. Google will see this and give your rankings a boost.

3. Use clean code in your site.

There’s a surprisingly high number of sites with dodgy code that is difficult for both search engines and Internet browsers to read. If there are HTML errors in your site (meaning it hasn’t been coded according to industry best practices), it’s possible your design will break when your site is viewed in different browsers or, even worse, confuse search engines when they come along and look at your site. Run your site through the below tool and ask your web developer to fix any errors.

Web standards validator
https://validator.w3.org/
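If you are curious what “coded according to industry best practices” looks like, the skeleton of a valid HTML5 page is quite small. This is only a bare-bones sketch; a real CMS template will include far more:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8"/>
  <title>Page Title</title>
</head>
<body>
  <h1>Page Heading</h1>
  <p>Page content goes here.</p>
</body>
</html>

Pages with a correct doctype, matching opening and closing tags, and properly quoted attributes are far less likely to break in different browsers.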

4. Take it easy on the popups and advertisements

Sites with spammy and aggressive ads often rank poorly in the search results. SEO gurus have reached no consensus on how many ads lead to a penalty from Google, so use your common sense: ensure advertisements don’t overshadow your content or occupy the majority of screen real estate.

5. Improve the overall ‘operability’ of your site.

Does your site have slow web hosting, or a bunch of broken links and images? Simple technical oversights like these contribute to a poor user experience.

Make sure your site is with a reliable web hosting company and doesn’t go down in peak traffic. Even better, host your site on a server in or near your target city, as this will make it faster for local users.

Next up, chase up any 404 errors with your web developer. 404 errors occur when users click on links on your site and are sent to a page that doesn’t exist. This contributes to a poor user experience in Google’s eyes. Fortunately, these errors are easily fixed (see the redirect example at the end of this section).

You can find 404 errors on your site by logging into your Google Search Console account, clicking on your site, then clicking on “Crawl” and “Crawl Errors”. Here you will find a list of 404 errors. If you click on the error and then click “Linked From” you can find the pages with the broken links. Fix these yourself, or discuss with your web developer.

Google Search Console
https://www.google.com/webmasters/tools/
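A common fix for a 404 error is a 301 redirect, which permanently sends visitors (and search engines) from the broken address to the correct page. On an Apache server this can be a single line in the .htaccess file; the paths below are hypothetical:

# Permanently redirect a removed page to its replacement
Redirect 301 /old-jerseys-page /football-jerseys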

Readability—SEO for the future.

One of the strongest ranking factors has been flying under the radar, overlooked by many SEO professionals in their optimization checklists, leaving a golden opportunity for those that know about it. I’m talking about readability.

Google has been outspoken about readability as an important consideration for webmasters. Google’s SEO spam king himself, Matt Cutts, has gone on record saying that poorly researched and misspelled content will rank poorly, and that clarity should be your focus. And readability means not just avoiding spelling mistakes, but making your content readable for the widest possible audience, with simple language and sentence structures.

Flesch readability has since surfaced in Searchmetrics’ Google ranking factors report, which shows a high correlation between high ranking sites and easy-to-read content. The report found that sites appearing in the top 10 have an average Flesch reading score of 76—content that is fairly easy to read for students aged 13-15 and up.

It makes sense that readability is a concern for Google. By encouraging search results with content readable to a wide audience, Google maximises its advertising revenue. If Google encouraged complicated results that mostly appeal to a smaller demographic, such as postgraduates, it would lower Google’s general appeal and market share.

You can achieve an on-page SEO boost, while also increasing user engagement, by making your content readable to a wide audience. Run your content through a Flesch readability test. It will look at your word and sentence usage, and give you a score for how readable your content is. Scores between 90 and 100 are easily understood by an 11-year-old student, scores between 60 and 70 are easily understood by 13 to 15-year-old students, and scores between 0 and 30 are best understood by university graduates. You can use the free tool below, and should aim for a readability score between 60 and 100. To improve your score, edit your content to use fewer words per sentence, and use words with fewer syllables.
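For reference, the Flesch Reading Ease score is calculated from average sentence length and average word length:

Score = 206.835 − (1.015 × average words per sentence) − (84.6 × average syllables per word)

In other words, shorter sentences and shorter words push the score up, which is exactly why the editing advice above works.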

Readability Score
https://readability-score.com

Next Chapter

  • SEO
    Three powerful SEO strategies explained
  • Link Building
    How to get high-quality, authoritative backlinks
  • Content
    What Google tells us about quality content
  • Keyword Research
    Why keyword research is the most important step of every SEO project
  • Local SEO
    Getting started with local SEO
  • Mobile SEO
    An introduction to mobile SEO
  • SEO Tools
    Powerful SEO tools that can save hours, days or even weeks of your time
  • Google’s Updates
    How to stay ahead of Google’s updates