Technical SEO, as scary as it sounds, is simply all the bits of Search Engine Optimisation that make it possible for Google to access your content. It’s the nuts and bolts of your website that make sure your site is found and indexed by search engines. We’ve put together a quick guide of the top technical SEO best practices to help you get your website right, and we’ll point you in the direction of the many online tools out there to help you further.

1. Conduct an SEO Audit

Before you start addressing your technical SEO, you need to know what needs fixing. To do this manually would involve drawing up a list of possible things that could be wrong with your site, and systematically working through it to check each one.

But you’re in luck; there are plenty of tools out there, including WooRank's software, to provide you with an audit of your site in no time.

2. Address Your Site Speed

WHY?

Clicking on a website that takes aaaages to load is a frustration many of us could do without. But slow sites are not just unappealing to us humans. Search engines also find them less than ideal and penalize them accordingly.

Search engines such as Google measure load time by “time to first byte” (TTFB): the time it takes for a browser to receive the first byte of information from your server.
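
If you'd like to sanity-check this yourself, the short Python sketch below approximates TTFB using the third-party requests library. The URL is just a placeholder, and requests' elapsed attribute measures the wait for the response headers, which is close to, but not exactly, time to first byte:

    # Rough approximation of time to first byte (TTFB) from Python.
    # response.elapsed is the time until the response headers arrive,
    # which is a reasonable stand-in for TTFB. The URL is a placeholder.
    import requests

    def approximate_ttfb(url):
        # stream=True stops requests from downloading the whole body up front,
        # so elapsed reflects the wait for the server's first response.
        response = requests.get(url, stream=True, timeout=10)
        return response.elapsed.total_seconds()

    if __name__ == "__main__":
        print(f"~TTFB: {approximate_ttfb('https://www.example.com'):.3f}s")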

“Pogo-sticking”, a nifty term given to the act of a user bouncing off a slow website back to the search results and then clicking on another result, is also an ingredient for a low-ranking site.

HOW?

First, it’s time to diagnose your page speed issues. The best way to do this is by using Google's free PageSpeed Insights tool, which gives you a page speed score and a breakdown of the issues that might be stopping you from getting above the desired 80% mark.

Google Page Speed Test

Next, you need to repair the damage. You may have a list as long as your arm of issues to fix, but worry not – only a few minor changes will make a lot of difference. Here are a few high-impact changes you could prioritize:

  • Optimize images – compress them to more manageable file sizes using online tools such as ImageOptimizer.
  • Use content delivery networks (CDNs) – these essentially serve the data your visitors are requesting from servers geographically closer to them.
  • Enable HTTP compression – see the quick illustration below of how much smaller a compressed response can be.
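
To see why compression is worth the effort, here's a small Python illustration of how much gzip can shrink a typical text response. This isn't server configuration, only a demonstration of the size savings, and the sample HTML is made up:

    # Quick illustration of how much HTTP compression (gzip) can shrink
    # a typical text response. Real compression is enabled on your web
    # server or CDN; this only demonstrates the size difference.
    import gzip

    html = ("<p>Repetitive HTML, CSS and JavaScript compress very well.</p>\n" * 200).encode("utf-8")
    compressed = gzip.compress(html)

    print(f"Uncompressed: {len(html)} bytes")
    print(f"Gzipped:      {len(compressed)} bytes")
    print(f"Saved:        {100 - 100 * len(compressed) / len(html):.0f}%")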

3. Optimize Your Site for Mobile

WHY?

Given that mobile users have officially overtaken desktop users, it’s crucial that your website works well on a mobile phone. If it doesn’t, you’re likely to not only frustrate the majority of your visitors, but get penalized by Google too. Remember, Google has started its move toward a mobile-first index.

HOW?

Increasingly, web designers and developers are building mobile-first websites. However, if your site has already been built, you can check your website’s mobile friendliness with Google's "Mobile-Friendly Test".

Google Mobile Friendliness test

Responsive site design: A website that’s mobile-friendly should automatically respond to the size of the screen on which you’re browsing. If your site doesn’t do this, you can add some code to instruct browsers to display your content correctly on smaller screens. Also avoid using Flash or pop-ups, which might prevent your mobile users from seeing your content.
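
As a rough sanity check, you can look for the responsive viewport declaration in a page's HTML. The sketch below uses the third-party requests library and a simple substring test, so treat it as a hint rather than a verdict; the URL is a placeholder:

    # Rough check for the responsive viewport meta tag in a page's HTML.
    # A missing tag is a strong hint the page isn't built for small screens.
    # The URL below is just a placeholder.
    import requests

    def has_viewport_tag(url):
        html = requests.get(url, timeout=10).text.lower()
        return 'name="viewport"' in html

    print(has_viewport_tag("https://www.example.com"))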

Page speed: Increasing your page speed is even more important for mobile, so go back to point 2 and double-check you’ve done as much as possible to reduce obstacles.

4. Improve Your Site Architecture, Starting with a Site Map

WHY?

An HTML sitemap will help your users get to understand your website’s navigation. An XML sitemap, the more important of the two, will help search engines to crawl your site.

HOW?

Manually constructing a sitemap is a waste of time given the number of free tools out there to help you. A generator such as Google XML Sitemaps is a good place to start. Once your sitemap has been generated, submit it to Google Search Console.
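
If you're curious what a generated sitemap actually contains, here's a minimal Python sketch that builds one from a list of URLs using the standard library. In practice a plugin or generator does this for you, and the URLs below are placeholders:

    # Minimal sketch of the XML inside an XML sitemap.
    # In practice a plugin or generator builds this for you; the URLs
    # below are placeholders.
    from xml.etree import ElementTree as ET

    urls = ["https://www.example.com/", "https://www.example.com/about"]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url

    # Writes a sitemap.xml file you could then submit to Google Search Console.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)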

XML sitemap

5. Avoid Duplicate Content

WHY?

Search engines get very hot and bothered about duplicate content because they simply can’t tell which version is the original, and therefore which one should rank. Excessive duplicate content can result in a penalty, so it should be avoided. Content that looks too similar to other pages could be left out of search results altogether.

HOW?

Make your content unique! Nearly every business website has an “About Us” section, so it’s easy to plagiarize other people’s content without even realizing it. Keep your “About Us” pages unique and interesting. Not only will your visitors thank you, but search engines will too!

Beware of ambiguity. You might have two very similar sections on your site, say “Beauty” and “Hair”, each with its own “Products” page. To avoid confusing search engines, you need to differentiate these pages (and their URLs). Start by calling them “Beauty Products” and “Hair Products”, and make sure you use different copy on each page.

Tools. Crawl your site with tools such as WooRank’s Site Crawl or Screaming Frog to make sure you don’t have duplicate content.
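
For a quick spot check between crawls, you can compare two pages yourself. The sketch below (using the third-party requests library and placeholder URLs) computes a crude similarity ratio; it's only a heuristic, not how search engines assess duplication:

    # Rough similarity check between two pages' HTML.
    # A high ratio suggests the copy may be too similar; this is only a
    # heuristic, not how search engines measure duplicate content.
    # Both URLs are placeholders.
    from difflib import SequenceMatcher
    import requests

    def page_similarity(url_a, url_b):
        text_a = requests.get(url_a, timeout=10).text
        text_b = requests.get(url_b, timeout=10).text
        return SequenceMatcher(None, text_a, text_b).ratio()

    print(page_similarity("https://www.example.com/beauty-products",
                          "https://www.example.com/hair-products"))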

WooRank Site Crawl Duplicate Content

6. Beware of Robots

WHY?

Your robots.txt file (part of the Robots Exclusion Protocol) tells search engines which parts of your site they can or can’t crawl. Blocking crawlers from pages you do want crawled, or letting them into pages you want hidden, can damage your website’s authority.

HOW?

  1. Find your robots.txt file in the root directory of your site by opening your FTP tool and looking in your public_html folder. The robots.txt file is a small plain text file that you can open in any text editor, such as Notepad.
  2. Now decide how you want to instruct bots to crawl (or not crawl) your site.

For example:

  • Block all robots from crawling your site (no robots can access your site):
    User-agent: *
    Disallow: /
    
  • Block all bots from accessing a particular folder (in this case an /images/ folder):
    User-agent: *
    Disallow: /images/
    
  • Block specific bots (here, Googlebot is not allowed to access your site):
    User-agent: Googlebot
    Disallow: /
    

For a complete step-by-step guide, along with tools to help you look into your robots.txt file, check out our guide.
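
You can also double-check how the rules you’ve written will be interpreted using the robots.txt parser in Python’s standard library. A small sketch, with a placeholder domain:

    # Check which URLs a given crawler may fetch, according to robots.txt.
    # The domain is a placeholder; point it at your own site.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://www.example.com/robots.txt")
    parser.read()

    print(parser.can_fetch("Googlebot", "https://www.example.com/"))
    print(parser.can_fetch("*", "https://www.example.com/images/photo.jpg"))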

7. Resolve Crawl Errors

WHY?

Crawl errors are the many and varied faults on your site that prevent Google from crawling all your pages. They might include URLs that are no longer available and give a 404 message, or a server error as a result of your server timing out. Ensuring these errors are kept to a minimum is crucial to maintaining the health and authority of your site.

HOW?

Head to Google Search Console and open the Crawl Errors report. You’ll be given a list of all the errors on your site, divided into “Site Errors” and “URL Errors”.

  • Site Errors: These are high-level issues such as DNS errors, server errors or robots.txt failures.
  • URL Errors: These affect specific pages on your site, including: 404s, Access Denied, and Not Followed.
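
Search Console is the authoritative source, but between checks you can probe a handful of URLs yourself. Here’s a minimal Python sketch using the third-party requests library with placeholder URLs:

    # Quick spot check for crawl errors: request each URL and flag anything
    # that doesn't return a healthy status code. The URLs are placeholders;
    # Google Search Console remains the authoritative report.
    import requests

    urls = [
        "https://www.example.com/",
        "https://www.example.com/old-page",
    ]

    for url in urls:
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as error:
            print(f"{url} -> request failed ({error})")
            continue
        marker = "OK" if status < 400 else "ERROR"
        print(f"{url} -> {status} {marker}")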

For a more comprehensive guide to identifying and fixing crawl errors, check out the Crawl Errors report.

Google Search Console crawl errors

Image source: Rebelytics.com

Final Thoughts

These are just some of the elements of technical SEO best practices that will get you started. Remember, there are some excellent SEO tools and resources out there to help you, and just a few minor changes to your site can make all the difference when it comes to getting your website found online. Good luck!