The Ultimate Technical SEO Checklist to Improve Rankings Instantly

When your website is not ranking as expected, you should run a thorough technical SEO audit before moving on to content optimisation or link building.

Not sure where to start?


Check out this in-depth checklist of the most common technical SEO elements. You will learn why they are important and how to resolve the issues your audit uncovers.

Technical SEO – The Most Neglected and Underrated Aspect of SEO

The term technical SEO refers to search engine optimisation processes that relate to the technical aspects of a website.

The goal of technical SEO is to improve the crawlability, indexability, and speed of the site. It also shapes the site’s architecture.

Why is Technical SEO Important For Your Website?

Technical SEO may not be as exciting as keyword research, developing a content strategy, or starting a link building campaign. However, your SEO efforts will go to waste without proper technical SEO.

Think of technical SEO as the foundation of a house. If a site is not optimised for technical SEO, no amount of creative content, promotion or link building will increase its organic traffic.

We are moving forward in an increasingly fast-paced digital world where users expect a site to load almost instantly.

Google, being the dominant search engine, is committed to serving its users with pages that are optimised for speed.

This is where technical SEO kicks in: it ensures that your website is optimised to create the best user experience. A good technical SEO implementation paves the way for on-page and off-page SEO optimisation.

Simple and Effective Technical SEO Best Practices and Checklist

Not all technical SEO work requires intervention from a web developer. If you have set up your website via WordPress or another platform, you can run through most of the audit and resolve the issues yourself.

Simple checklist for companies without web developers:

1. Version of Domain to Use

Only one version of your website should be accessible at any one time. By default, a site is accessible with or without the ‘www.’ prefix.

For example, both of these URLs will return the same content.

http://www.example.com
http://example.com

While the pages may appear the same to you, they are treated as different versions by Google. Having different versions of the website may lead to Google treating the pages as duplicate content, or weakening the SEO value of the pages.

The same applies to a site with ‘http’ and ‘https’. Both of the following URLs could return the same content but are treated as different versions by Google.

http://example.com
https://example.com

If nothing is done to address this issue, Google will index the version that it considers the best option.

You will need to ensure that Google only uses one version of the site. It does not matter whether you choose the ‘www’ or non-‘www’ version, but you should use ‘https’, as it benefits your SEO ranking.

Once you have decided on the version, you will need to set canonical tags on the duplicate versions, pointing to the preferred one.

Here’s an example of a canonical tag.

A canonical tag placed on the homepage of the ‘www’ or ‘http’ version tells Google that the ‘https’ version is the preferred one to index:
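A minimal sketch, with example.com standing in for your own domain:

<link rel="canonical" href="https://example.com/" />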

With the canonical tags in place, Google is clear on which version is to be indexed and displayed in the search results.

Canonical tags are placed in the <head> of your page’s HTML code. If you are using the Yoast plugin, you can insert the target URL directly into the Canonical URL field.


2. Implement SSL Cert (HTTPS)

When you are browsing the internet, you may find that some websites have ‘http’ prefix while others have ‘https’.

HTTP stands for hypertext transfer protocol, and HTTPS is the secure version of it. With HTTPS, data transmitted between your browser and the web server is encrypted, which prevents it from being intercepted by malicious third parties.

Sites that deal with financial transactions are required to be protected by HTTPS. In 2014, Google made HTTPS one of its SEO ranking signals. If you are still stuck with HTTP, or have not configured HTTPS properly, your site may struggle to rank on Google.

In order to implement HTTPS, you will need to install an SSL cert for your site. Usually, this is performed by the web hosting provider.

If your site has an SSL cert installed, you will find a lock icon next to the URL.


Otherwise, you will see a ‘Not secure’ warning if your site does not have an SSL cert or it is not properly configured.


You will need to contact your hosting provider to purchase and install an SSL cert. When the HTTPS version is up, redirect pages from the old HTTP URLs to their HTTPS counterparts. This prevents Google from detecting duplicate content.
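If your site runs on an Apache server with mod_rewrite enabled, a rule like this in .htaccess is one common way to force the redirect (a minimal sketch, not a definitive setup — check your host’s documentation):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]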

3. Optimise Site Architecture

Your site architecture is how you organise and structure the pages within your site. When properly optimised, it makes navigation easy for readers and improves crawlability for Googlebot.

Good site architecture groups pages according to categories and interlinks them. It’s arranged hierarchically from the home page, with the deepest page being 4 clicks or fewer from the top.

Conversely, bad site architecture, characterised by a disorganised structure, poor interlinking and deeply-nested pages, hurts SEO. If a reader needs 10 clicks to navigate to a specific page, Google may have trouble crawling it.

To optimise your site’s architecture, place important pages close to the home page. For a business website, these pages usually form the top tier of the architecture.

  • Products
  • Services
  • News
  • Support
  • Blog
  • About Us

These first-level pages are then expanded into product items, landing pages, service descriptions, and articles, each linked to its corresponding upper-level page.

To further assist readers in navigating your site, you can use breadcrumbs. A breadcrumb trail is a navigational feature that shows where the current page sits within the site architecture.

Here’s roughly how a breadcrumb trail looks on a page:

Home » Blog » Technical SEO Checklist

Users can easily navigate to other pages with the breadcrumbs, instead of clicking through the menus.
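If you also want Google to display breadcrumbs in its search results, you can mark them up with schema.org BreadcrumbList structured data. A minimal sketch, with hypothetical page names and URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Checklist" }
  ]
}
</script>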

4. Set Up a Proper URL Structure

A proper URL structure is easy to read for both users and search engines. If it’s not properly optimised, the URL can be unintelligible.

A URL should give a brief idea of the page’s content. It should also contain the page’s primary keyword, as URL structure slightly influences Google SEO.

For example, a URL made up of random characters provides no information about the photos on the page.

When you are setting up URL structures, there are a few things to keep in mind.

  • Use lower case letters.
  • Include the keyword.
  • Make it intelligible.
  • Avoid stopwords like ‘and’, ‘or’, ‘for’, ‘but’.
  • Separate words with hyphens.

A good URL structure often reflects the site’s architecture. The subfolder appears in the URL, which helps users and Google understand the context of the content.
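Putting these rules together, here’s a hypothetical before-and-after for a photo gallery page (the domain and paths are placeholders):

Poor:   https://example.com/p?id=8gs7d9xq
Better: https://example.com/photos/sunset-beach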

5. Optimise robots.txt

Robots.txt is a file hosted on your site that contains instructions for search engine crawlers. It can be used to instruct specific crawlers on which pages are to be crawled or skipped.

An incorrectly set up robots.txt is one of the most common technical SEO issues, and it can lead to low visibility or pages failing to be indexed.

You can find your site’s robots.txt by keying the filename after the domain as follows:

https://example.com/robots.txt

A basic robots.txt that allows all search engine crawlers to access all pages looks like this:

User-agent: *
Disallow:

Here’s one with specific allows and disallows for the crawlers.

User-agent: *
Disallow: /tag/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

It tells crawlers to avoid URLs starting with /tag/ or /wp-admin/ but allows access to /wp-admin/admin-ajax.php.

You can also specify instructions for a particular crawler by naming it in the User-agent line.
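A sketch, where the /private/ path is just a placeholder:

User-agent: Googlebot
Disallow: /private/

This blocks only Googlebot from the /private/ section while leaving other crawlers unaffected.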

If you have specific pages or subfolders that are used for testing, it will be a good idea to disallow access in the robots.txt.

More importantly, you don’t want to accidentally block access by Googlebot by setting the wrong instruction in the file.

You can use Google’s robots.txt tester tool in the Search Console to validate the file.

6. XML Sitemap Usage and Optimisation

An XML sitemap contains links to important pages and resources on your website. The goal of having an XML sitemap is to enable Google to crawl the pages easily.

If your site has thousands of pages and is missing an XML sitemap, there’s a chance that Google may miss some of them. Without an XML sitemap, some pages may not be reflected correctly in the search results.

An XML sitemap should not be confused with an HTML sitemap. The former is created for search engine crawlers, while the latter is meant for humans. The HTML sitemap has no effect on crawlability or SEO.


A sitemap index typically links out to several category sitemaps, and clicking into one of them reveals an extensive list of URLs.

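For reference, here’s a minimal sketch of what a sitemap file looks like (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/services/technical-seo-audit</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>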

Here’s how to optimise your XML sitemap to make full use of Google’s crawl budget.

  • Include only important pages.
  • Remove non-indexed, canonicalised, or broken pages.
  • Organise sitemaps by categories. Usually, this is done automatically when you are using plugins like Yoast.

You will need to ensure that the sitemap is accessible to Google. Go to Search Console → Sitemaps and enter the URL of the sitemap.


7. Noindex Pages That Are Not Important

We have mentioned the importance of getting your pages indexed. At the same time, there are pages that should not be indexed.

If you are running an e-commerce business, a particular product may be accessed via different URLs. This can create a duplicate content issue, so you will want to ensure only one version is indexed.

Other pages, like the privacy policy and terms of service, are quite generic; most businesses copy the content from similar templates. These pages are better kept as noindex to prevent duplicate content issues.

Some pages, such as ‘Thank You’ or ‘Download’ pages, are not meant to be found via a search engine. These pages are part of your marketing funnel, and you don’t want users finding the download links in the search results.

The login page of your website’s CMS should not appear in search results either.
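To noindex a page, add a robots meta tag to its <head>. A minimal sketch:

<meta name="robots" content="noindex, follow" />

The ‘follow’ value lets crawlers still pass through any links on the page. If you use Yoast, the same setting is available in each post’s Advanced section.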


8. Use Canonical URLs

Canonical URLs are used to point search engines to the main version of a page. The main purpose of URL canonicalisation is to prevent duplicate content from affecting SEO. It also prevents SEO value from being diluted across multiple versions of the site.

Usually, there are a few versions of your site that can be returned by Google. This is particularly true for the home page.

Google may return ‘www.yourhomepage.com’ or ‘yourhomepage.com’ when someone searches for the company’s name.

Including a canonical URL on the home page tells Google the exact version that should be displayed.

A canonical URL is defined in the HTML code of the page and has the following format.

<link rel="canonical" href="target URL" />

It tells the search engine that the version it should refer to is at the target URL. Even if you don’t maintain multiple versions of your site, it is still best practice to have a canonical URL referencing itself.

Backlinko, for example, includes a self-referencing canonical URL on its pages.
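Reconstructed for illustration, a self-referencing tag on its homepage would look like this:

<link rel="canonical" href="https://backlinko.com/" />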

9. Eliminate Redirect Chains

When you change the URL of a page or move to a new domain, you set up a 301 redirect. A redirect chain occurs when more than one 301 redirect sits between the original URL and the final destination.

For example, your site shifted from HTTP to HTTPS, giving you your first redirect:

Site A (HTTP) → Site B (HTTPS)

Then, you decided to move the site to a new domain and you will have:

Site A (HTTP) → Site B (HTTPS) → Site C (New Domain)

As you keep making changes to the site, such as amending the URL for certain pages, you will have instances of lengthy redirect chains.

Redirect chains affect SEO in 2 ways:

  • They increase the page’s loading time, which hurts user experience.
  • They dilute the link equity built up by the original URL.

If you have built high-quality backlinks to the previous URL, each redirect hop reportedly bleeds away roughly 15% of their value.

Therefore, you will want to eliminate redirect chains. To do that, you first need to identify the chains on your website, which you can do with Screaming Frog.

Once you have identified the chains, flatten them by removing the intermediary URLs and pointing each redirect straight at the final destination.

The final fix will be:

Site A (HTTP) → Site C (New Domain)
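On an Apache server, that one-hop fix could be a single .htaccess rule (a sketch; old-domain.com and new-domain.com are placeholders, and mod_rewrite is assumed):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://new-domain.com/$1 [L,R=301]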

10. Eliminate Thin and Duplicate Content

Since the Google Panda update in 2011, thin and duplicate content can lead to a loss of rankings or manual penalties. Churning out such content as a shortcut simply doesn’t work anymore.

If you are not sure whether you are producing thin content, check the existing pages that top the rankings and look at the topics they cover.

Remember that it’s not all about word count, but whether the content is written in detail and answers the query satisfactorily.

Another common issue that greatly affects SEO is publishing duplicate content. Having multiple copies of the same content is bad for SEO, as Google struggles to decide which version is more important.

Duplicate content that results from plagiarism carries stiffer penalties.

Whether it’s an exact or partial copy, Google can detect plagiarism and issue manual penalties. Duplicate content may also lead to removal from search results if Google receives a complaint that you are publishing someone else’s content without permission.

You can use tools like Ahrefs, SE Ranking or Copyscape to check for duplicate content.


11. Optimise 404 Pages

A basic, unoptimised 404 page is little more than a plain error message. On their own, 404 errors generally don’t hurt your rankings. However, some can affect SEO: if you have deleted a page that you previously built backlinks to, you are missing out on that link power.

Despite the ambiguous effect on SEO, you will still want to optimise your 404 pages. Landing on a plain 404 page is bad for user experience, and it doesn’t reflect well on your brand.

Instead, create a customised 404 page to capture visitors who land on a deleted page or mistype a URL.

Some optimisation tips for a 404 page include:

  • Include a search bar.
  • Link to your popular posts.
  • Display a brand-centred message.
  • Direct users back to the homepage.
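On Apache, pointing all 404s to your custom page takes a single .htaccess directive (the file path is a placeholder):

ErrorDocument 404 /custom-404.html

If you are on WordPress, your theme’s 404.php template serves the same purpose and can be customised directly.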

Here’s how we have created an optimised 404 page for our website.
