Do you find yourself using different techniques for yourself versus your clients? To ensure you stay organised and on track for success, our in-house technical team have created the ultimate SEO checklist. These are the basic processes and checks to make sure your own and your clients’ sites are in shape.

It won’t be possible to fix absolutely everything at once, but by working through the list and identifying any issues, you can see where your biggest opportunities for improvement lie.

In this article, we will cover the following:

  • SEO tools

  • Basic SEO set-up checks

  • Technical SEO checks

  • Content checks

  • Off-site content & link building checks

Essential SEO tools

Before we delve into the different checks you’ll need to perform, we’ll share our favourite SEO tools that we think you should add to your strategy.

There are plenty of SEO tools out there to choose from to help you with reporting, analysis, competitor research and more. We haven’t got space in this blog to list them all, and everyone will have their own preferences – but here are our essential SEO tools from the technical team themselves.

Google Analytics

Google Analytics can give you so much insight into what your users are doing on your website and where they are coming from.

For SEO purposes, we want to focus on the Organic Traffic in Google Analytics.

Google Search Console

Google Search Console is important for SEO and can help to get the basics set up right – more on this in the next section.

Google Search Console allows you to see your website through the eyes of Google’s crawlers, find and fix issues, and alert Google when you add new content to your site.

Yoast SEO

Yoast SEO is one of the most commonly used SEO plugins.

Unfortunately, this won’t be accessible for everyone, but if you’ve got a WordPress site or another CMS that supports Yoast, this plugin is highly recommended!

Screaming Frog

Screaming Frog is an industry favourite amongst SEO professionals.

It’s very handy for full site crawls and can be used to identify technical issues on your site.

SEMRush

SEMRush has all-round marketing capabilities but it is most useful in the SEO world for keyword research and competitor analysis.

Ahrefs

Ahrefs is a great tool for link building.

Using this, you can benchmark yourself against your competitors and identify opportunities to build more links.

Basic SEO set-up checks

Set up and install Google Analytics

First, you’ll need to set up a Google Analytics account for your business; this can then be connected to your site via a tracking tag.

Set up Google Search Console

Next, set up a Google Search Console account for your site and link this to your Google Analytics account. This will allow you to see Search Console data in Analytics.

Install Yoast SEO

If you’re using a CMS that has a Yoast plugin, we recommend installing this.

It can help you with optimising your meta data, excluding groups of pages from being indexed and making sure all your other basic SEO settings are in place.

Create & submit your sitemap  

An XML sitemap lists your site’s pages for search engine crawlers, directing them around your site to make sure every important page can be crawled and indexed.

A sitemap normally sits at yourdomain.com/sitemap.xml

You should submit this to Google in Google Search Console to make sure the crawlers can find it.
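If you need to generate a sitemap by hand, it is a simple XML file. Here is a minimal sketch using Python’s standard library; the domain and URLs are placeholder examples, not real pages.

```python
# Build a minimal sitemap.xml string with Python's standard library.
# The URLs below are hypothetical placeholders - swap in your own.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Return a sitemap.xml string for a list of page URLs."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url
    return tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://yourdomain.com/",
    "https://yourdomain.com/category/page",
])
print(sitemap)
```

In practice, most CMSs (and plugins like Yoast) generate this file for you; this is just to show how little is inside one.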

Create a Robots.txt file

The robots.txt file is used to tell search engine crawlers what they can and can’t access on your site.

A robots.txt file normally sits at yourdomain.com/robots.txt
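You can sanity-check your robots.txt rules before deploying them using Python’s built-in parser. The rules and domain below are illustrative examples only, not a recommendation for your site.

```python
# Sanity-check robots.txt rules with Python's standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a crawler matching "User-agent: *" may fetch each URL
print(parser.can_fetch("*", "https://yourdomain.com/products/"))
print(parser.can_fetch("*", "https://yourdomain.com/checkout/basket"))
```

This is a quick way to confirm a new Disallow rule doesn’t accidentally block pages you want crawled.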

Technical SEO checks

A solid technical SEO set-up lays the foundations for strong SEO performance.

Although the on-page content and links to your site are very important, they won’t make a difference if your technical set-up isn’t correct, as Google won’t be able to access your site properly.

So, we start our site checks with the technical basics.

Crawling & indexing

Check for crawl errors

Crawl errors may include full site errors, such as server errors, meaning your whole website is not crawlable; or URL errors, which are much more manageable. A crawl error is essentially any page that returns a status code other than 200 (OK) – the status we want all, or at least most, of our pages to return.

You can find crawl errors and warnings in Google Search Console’s Coverage report.
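If you export a crawl (for example from Screaming Frog), a small script can bucket pages by status code so errors stand out. This is a sketch with made-up sample data.

```python
# Bucket crawl results by HTTP status code, e.g. from a crawl export.
# The sample URLs and codes below are invented for illustration.
def bucket_by_status(pages):
    """pages: iterable of (url, status_code) pairs."""
    buckets = {"ok": [], "redirect": [], "client_error": [], "server_error": []}
    for url, status in pages:
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)
        elif 400 <= status < 500:
            buckets["client_error"].append(url)
        else:
            buckets["server_error"].append(url)
    return buckets

crawl = [("/", 200), ("/old-page", 301), ("/missing", 404), ("/api", 500)]
print(bucket_by_status(crawl))
```

Anything in the client or server error buckets is worth investigating first.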

Check how Google sees your pages

Using the Inspect URL tool in Google Search Console you can see the code which Google crawls to understand your pages.

Alternatively, you can use the Web Developer Chrome extension to see how your page looks with JavaScript and CSS disabled, which also shows you how Google sees your pages as a crawler.

Check that no important pages are broken

A broken page returning a 404 error can interrupt the search engine bot’s crawl, meaning important pages that you want to rank, such as product or category pages, may not be indexed.

You can identify these by running a Screaming Frog crawl and filtering by 404 status code pages, or in Google Analytics, by setting 404 pages up as events to track.

Check for redirect chains

Redirect chains significantly increase load time and make the site harder to crawl; crawlers may even give up if they encounter too many redirects in a row.

Identify redirect chains through Screaming Frog and break them down to a single redirect, where a redirect is necessary.

Site speed

Site speed is more important than ever with the introduction of Core Web Vitals, due to factor into Google’s algorithm from mid-2021.

Make sure you look to improve your Core Web Vital scores before then.

Google PageSpeed insights

Use Google PageSpeed Insights to see your scores and identify issues slowing your site down.

GTMetrix

You can run another speed test and find further recommendations using GTMetrix. We often find it’s best to use a combination of the tools, rather than just one.

Speed test recommendations can then be passed on to developers for assistance with implementation.

Mobile friendly

The introduction of mobile-first indexing over the last couple of years has reflected the shift towards mobile use amongst internet users. With around 60% of searches now carried out on mobile devices, it is essential that your site is mobile friendly.

John Mueller recently confirmed that from March 2021, Google will only index mobile versions of sites, and desktop-only sites will be completely dropped from Google’s index.

If you’re reading this and you have a desktop only site – we recommend you amend this.

Google’s Mobile Friendliness Test

Enter your URL into Google’s Mobile Friendliness Check to see whether your site is mobile-friendly and, if it isn’t, see what usability problems your site has.

Site architecture & navigation

It is important to get your site architecture right to establish a hierarchy between pages, indicating the most important ones and allowing search engines, as well as users, to navigate around your site effectively.

Ensure that your navigation includes all of your most important pages, that there are breadcrumbs throughout your site to allow backward navigation, and that the structure is logical for a user.

Canonicals

Canonical tags indicate to search engines the page that you want to be indexed when there are several similar versions of a page. They can be used to solve duplicate content issues that can arise from paginated pages and infinite scroll pages being indexed.

Ensure you have canonical tags set for your pages to ensure the right version is indexed.
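A canonical tag is just a `<link rel="canonical">` element in the page’s `<head>`. Here is a sketch of pulling it out of a page’s HTML with Python’s standard-library parser, so you can check pages in bulk; the HTML below is a made-up example.

```python
# Extract the canonical URL from a page's HTML using Python's
# standard-library HTML parser. The sample HTML is invented.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

html = '<head><link rel="canonical" href="https://yourdomain.com/page"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)
```

Running this over a list of similar pages (paginated or filtered versions, say) lets you confirm they all point at the same preferred URL.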

Site security

Site security has been a ranking factor for several years now, but sites served over an HTTP connection still appear as a common issue.

Ensure that all URLs on your site are being served via HTTPS.

On-page SEO checks

On-page SEO involves optimising the content on your pages to rank higher, so you are able to attract more traffic organically.

URLs

A URL is the unique address of a webpage that shows in the address bar at the top of your browser. URLs act as a signal to both search engines and users to tell them where they are on the site.

Short & Relevant URLs

URLs should be short and relevant to the specific page in order to be user friendly and give an indication of what your page is going to be about.

Consistent URL structure

URLs across a website should follow a consistent structure to meet SEO best practice. The simplest way to do this is to arrange your site into subfolders, according to your categories, for example: yourdomain.com/category/sub-category/page
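If you’re generating URLs programmatically, a small “slugify” helper keeps them short, lowercase and consistently hyphenated in the subfolder pattern above. A sketch, with made-up category names:

```python
# Build consistent, user-friendly URLs from page titles, following the
# yourdomain.com/category/sub-category/page pattern described above.
import re

def slugify(text):
    """Lowercase, strip punctuation and join words with hyphens."""
    text = re.sub(r"[^a-z0-9\s-]", "", text.lower())
    return re.sub(r"[\s-]+", "-", text).strip("-")

def page_url(domain, *segments):
    return "https://" + domain + "/" + "/".join(slugify(s) for s in segments)

print(page_url("yourdomain.com", "Mens Shoes", "Running Trainers"))
```

Most CMSs offer a slug setting that does the same job; the point is that every page should go through one consistent rule.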

Metadata

Metadata is information tagged up in a webpage’s code to tell search engines what the page’s main topic is. Search engines then use it to work out the relevance of your page to users’ search queries and decide where to rank you.

Metadata includes your title tag and meta description, each of which should be unique to a single page in order to avoid confusing the search engine over which page to rank for which queries.

One page title & meta description per page

Having multiple titles or meta descriptions will confuse the search engine.

No missing page titles & meta descriptions

Pages without a title or meta description are missing an opportunity to be optimised to appear in SERPs for relevant keywords.

No duplicate page titles & descriptions

Duplicated page titles or meta descriptions across the site risk a duplicate content penalisation.

Page titles & meta descriptions are keyword optimised for their specific page

In order to convey the relevance of a webpage for a certain search query to search engines, the title and meta description of a page should be optimised with keywords around that search query.

Page titles & meta descriptions include CTAs to optimise for CTR

Adding calls-to-action to titles and meta descriptions will help to increase the click-through rate of your pages; this can indirectly improve SEO rankings by increasing engagement levels with the page.
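The missing and duplicate checks above are easy to automate over a crawl export of URL, title and description. A sketch with invented sample pages:

```python
# Audit page titles and meta descriptions from a crawl export,
# flagging missing and duplicated values. Sample data is made up.
from collections import Counter

def audit_metadata(pages):
    """pages: list of (url, title, description) tuples."""
    missing = [url for url, title, desc in pages if not title or not desc]
    title_counts = Counter(title for _, title, _ in pages if title)
    duplicates = [t for t, n in title_counts.items() if n > 1]
    return missing, duplicates

pages = [
    ("/", "Home | Example", "Welcome to our shop."),
    ("/shoes", "Shoes | Example", ""),             # missing description
    ("/boots", "Shoes | Example", "Boots range."), # duplicate title
]
missing, duplicates = audit_metadata(pages)
print(missing)      # pages with a missing title or description
print(duplicates)   # titles used on more than one page
```

Screaming Frog surfaces the same information in its Page Titles and Meta Description tabs; a script like this is useful when you want the checks inside your own reporting.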

Headings

The headings of a page help to give the page content a readable structure by breaking down the content into relevant sections. They also indicate to search engines the topic of a page and so should be tightly optimised at page level.

Only one H1 per page

H1s have long been one of the most significant on-page signals, as they indicate to both search engines and users the main topic of the rest of the page. The H1 should be the first and most visible heading on a page.

Headings are structured in the right order

As mentioned above, the H1 should be the first and most prominent heading; this should then be followed by a H2, which can be followed by a H3, and so on.
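Both heading rules can be checked automatically from a page’s heading levels in order. A sketch:

```python
# Check heading structure: exactly one H1, and no level skipped
# (an H3 should not follow an H1 directly, for example).
def check_headings(levels):
    """levels: list of heading levels in page order, e.g. [1, 2, 3, 2]."""
    problems = []
    if levels.count(1) != 1:
        problems.append("expected exactly one H1, found %d" % levels.count(1))
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            problems.append("H%d followed directly by H%d" % (prev, cur))
    return problems

print(check_headings([1, 2, 3, 2]))  # well-structured: no problems
print(check_headings([1, 3]))        # skips straight from H1 to H3
```

Dropping back up a level (H3 back to H2, as in the first example) is fine; it’s skipping levels on the way down that breaks the outline.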

Internal linking

Internal links are used by crawlers and users to navigate around your website, establishing a hierarchy so that important pages can be distinguished. Not only that, but internal links help to spread link equity throughout your site.

Ensure there are no Orphan pages

Orphan pages are those pages that have no incoming links pointing to them.

Each page needs at least one link from somewhere else on your site so that it is discoverable; if a page doesn’t have any, search engine crawlers following links won’t reach it and users won’t find it.
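Given a list of known pages and the internal links between them (both available from a crawl export), orphans are simply the pages with no incoming links. A sketch with made-up pages:

```python
# Spot orphan pages: given all known pages and the internal links
# between them, list pages with no incoming links. The homepage is
# excluded, as it is the crawl's entry point.
def find_orphans(pages, links):
    """links: iterable of (from_page, to_page) internal links."""
    linked_to = {to for _, to in links}
    return sorted(p for p in pages if p not in linked_to and p != "/")

pages = ["/", "/shoes", "/boots", "/old-sale-page"]
links = [("/", "/shoes"), ("/shoes", "/boots")]
print(find_orphans(pages, links))  # /old-sale-page has no incoming links
```

Note that a pure crawl can’t discover orphans by itself, since it only follows links; the full page list usually comes from your sitemap or CMS.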

Content checks

The content on a website refers to the content across all pages, including your product and category pages, as well as your blog and informational resources.

Content should be high quality, relevant and optimised for both head term and long tail keywords.

Keyword research has been carried out

Thorough keyword research should be carried out as the first port of call for any SEO content strategy. As mentioned earlier, SEMRush is our tool of choice for keyword research.

Through this you can map out which keywords should be the focus of which pages, to ensure that no pages are competing to rank for the same terms.

Content should be optimised according to your keyword research

Once you’ve done the keyword research, the next step is to use the identified keyword opportunities within your content to improve your rankings for them.

You should aim to use variations of head term keywords within your main product or service page content, headings and metadata; and longer tail keywords in informational content, such as FAQs and blogs, to create content clusters.

No hidden content appearing for search engines only and not users

An old, black hat SEO technique was to stuff pages with keywords that the search engine could read but users could not see. This was done by adding styling that pushed the text off the visible screen, or by making the hidden text the same colour as the background to camouflage it.

Rightly, this tactic goes against Google’s current guidelines and could mean you end up being penalised. Check that you haven’t got any hidden content by disabling JavaScript and CSS, or by using Google Search Console to identify any within the code.

No duplicate content

Duplicate content refers to content duplication both internally within your own site, and externally on other sites. This risks confusing Google over which page to rank and can result in the duplicate page ranking higher than the original.

Check for duplicate content across your site by using a tool such as Siteliner.
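The idea behind tools like Siteliner can be illustrated with a rough near-duplicate check: break each page’s text into overlapping word “shingles” and compare the sets with a Jaccard similarity score. The sample sentences are invented.

```python
# Flag near-duplicate content by comparing word "shingles"
# (overlapping word triples) with a Jaccard similarity score.
def shingles(text, size=3):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "our handmade leather boots are built to last for years"
copied = "our handmade leather boots are built to last a lifetime"
print(similarity(original, copied))
```

A score near 1.0 means near-identical text; where you set the threshold for “duplicate” is a judgement call, and dedicated tools are far more robust than this sketch.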

No thin content

Content should not just be put on a page for the sake of it; all content on your site should be high quality and optimised.

You can check for thin content using Screaming Frog and looking at the word count across each of your pages.

Link building checks

Although we’ve saved link building until last, that does not mean it is the least important. In fact, the links and domain authority gained from link building are among the most influential factors in Google’s algorithm.

That said, it is important to have your basics in place first, before you attempt to gain links, as no one will want to link to a low-quality piece of content on a low-quality site that looks untrustworthy.

If you’ve started link building, or are just about to, follow these checks to make sure you build the right links in the right way.

Competitor research

Without looking at what your competitors are doing, how will you be able to compete with them for the top rankings?

We recommend starting this process by looking into your competitors’ links and seeing where they have gained links from and how.

Link gap analysis

Now you’ve identified your competitors and where they are gaining links from, you can create a link intersect to see where you are missing links that more than one of your competitors has.

This enables you to identify warm link leads; if a site has already linked to two or three of your competitors, you have a better chance of gaining a link from them to your site.
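At its core, a link intersect is a set operation over referring domains, which tools like Ahrefs export for each site. A sketch with made-up domains:

```python
# A simple link intersect: find referring domains that link to two or
# more competitors but not to you. All domains here are invented.
from collections import Counter

def link_gap(your_domains, competitor_domains, min_competitors=2):
    """competitor_domains: one set of referring domains per competitor."""
    counts = Counter()
    for domains in competitor_domains:
        for d in set(domains):
            counts[d] += 1
    return sorted(d for d, n in counts.items()
                  if n >= min_competitors and d not in your_domains)

yours = {"blog-a.com"}
competitors = [
    {"blog-a.com", "news-site.com", "guides.com"},
    {"news-site.com", "guides.com"},
    {"news-site.com"},
]
print(link_gap(yours, competitors))  # warm leads linking to 2+ competitors
```

Raising `min_competitors` shortens the list to the warmest leads; sites linking to all of your competitors are usually the easiest wins.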

Convert existing brand mentions

A quick win in terms of link building – if your company has been mentioned on a website but not linked to, you should reach out to the publisher and request a link within the content that mentions you.

The worst thing they can do is say no, so you’ve got nothing to lose but may end up gaining an extra link from it.

Generate new content ideas

Looking at how your competitors have gained links can give you some content ideas for your own site.

If you see that informational how-to guides commonly earn links on your competitors’ sites, chances are they will work for you too.

Outreach

Before you start outreach, you need to identify relevant sites to your business offering that you want to aim to gain links from.

Again, you can use your competitor research and link gap analysis for this and find contact details for individuals to reach out to from these warm link leads.

This checklist is by no means exhaustive but it’s a good place to start if you’re looking to improve your SEO.

For further SEO advice, BOSCO™ can help you to find out where you are missing keyword opportunities that can help increase revenue. Or, contact us for a chat at team@askbosco.io; we are happy to help.