
Technical SEO forms the foundation of your site. Getting it right can impact the success of all your future SEO efforts. After all, what’s the point in having an awesome site when nobody can find it?

In this guide, we’ll take a look at some of the most common technical SEO problems faced by SEO beginners and experts alike, and how to solve them. Follow our guide and you’ll soon be on your way to improved visibility and ranking potential.

What is Technical SEO?

This analogy has probably been used to death - but it’s a good one. Think of technical SEO as the foundation of a house. Use the right tools and follow the correct procedures, and that foundation will stand the test of time. In other words, strong foundations will ensure longevity and stability.

In reality, technical SEO aims to make your site friendlier to both users and search engines, and examines key components such as website speed, crawling and indexing.

Technical SEO also includes on-page technical issues, like meta descriptions and title tags, that should be optimized to improve usability and user experience.

Before we get down to the nitty gritty, it’s worth registering and verifying your site on Google Search Console (GSC), as this will be one of your most valuable tools for diagnosing the technical issues we’ll discuss.

From the main GSC dashboard, press the ‘Add a Property’ button.

Add property in Google Search Console

Type in your site’s URL and click ‘Add’.

Add URL in GSC to verify property

You will be asked to verify your site. The recommended way is to download the HTML file supplied by Google and then upload it to your home page directory using an FTP client like FileZilla.

Load HTML to verify site ownership

Click ‘Verify’ once you have done this to test that it has been successful. If you’re a WordPress user with the Yoast plugin (if you don’t have it - get it now, it’s the best plugin for managing your on-page SEO), select the HTML tag option from the ‘Alternate methods’ tab and copy the last part of the code, as shown below.

Alternate method of adding HTML verification for GSC

Log in to your WordPress dashboard and navigate to the Yoast plugin. Go to ‘Webmaster Tools’, copy the code into the corresponding box and save your changes.

Verify site ownership with Yoast SEO plugin

Your site is now verified.

Common Technical SEO Problems and How to Solve Them

1. Site not getting indexed

If your site isn’t showing up in search engine results pages, this could indicate that it isn’t being indexed.

Let’s rewind a little.

Search engines use bots to crawl content on the web. They look through content to determine quality and assess whether to include it in their searchable index. There are ways to help search bots determine quality and relevance, which include placing keywords in vital places such as title tags, H1s and URLs (we’ll come back to this later).

The bottom line is, if you’re in doubt as to whether all of your pages are showing up in search results, you’ll need to check and then rectify the problem.

How to check

A quick way to check which pages are being indexed is to type site:mydomain.com into Google. Google will return all of the pages on that domain that it has indexed. So, if the number of results doesn’t reflect the number of pages on your site, then you have a problem.

Alternatively, you can use GSC to check exactly how many URLs Google has indexed. From the dashboard, select the entity or site you wish to examine. Navigate to Google index > Index status and here you’ll see the number of pages that Google has indexed.

Indexed pages in GSC

It’s worth mentioning that Google states,

The number of indexed URLs is almost always significantly smaller than the number of crawled URLs, because Total indexed excludes URLs identified as duplicates, non-canonical, or those that contain a meta noindex tag.

So what’s the purpose of the tool?

  • If you’re publishing new pages on a regular basis, the graph in GSC (shown above) should reflect this, showing that Google is able to access and index the new information.

  • Sudden dips in the number of pages indexed can indicate a problem with the server or that Google is finding it hard to access your site’s content.

  • Google suggests regular monitoring of the number of indexed pages. Any sudden or drastic changes will need addressing.

How to fix

1. Check your Robots.txt file

Robots.txt files tell search engine bots which parts of your site they can and can’t crawl. They can also keep private content from appearing online, save bandwidth and reduce load on your server. If the file is missing, this can cause additional errors in your Apache log whenever robots request one.

Use a free tool like WooRank to detect whether a robots.txt file is present. GSC can also tell you if one is missing. GSC states that in the event of a missing robots.txt file, Google will assume that there are no restrictions and crawl everything on your site.

Using Robots.txt for SEO


Creating a robots.txt file is relatively easy. Use Notepad to create a new file and save it as ‘robots.txt’ (all lower case).

User-agent: *
Disallow: /images/
Disallow: /pages/thankyou.html

In the example above, the ‘*’ means that the rules apply to all search engine bots, and the Disallow lines block access to the images directory and the thank-you page. This should be saved as a .txt file and then uploaded to the highest-level directory of your site (the root of your domain). If you want more information, refer to Google’s own guide.

It’s important to use a plain text file to create your robots.txt since using a word processor can add code to the file that will break it.

Once you’ve added it, use GSC’s robots.txt tester to see whether your file blocks Google’s web crawlers from specific URLs on your site.
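You can also sanity-check your rules offline before uploading. Here’s a minimal sketch using Python’s standard library robots.txt parser, fed the example rules from above (the paths are placeholders):

```python
# Offline sanity check of robots.txt rules using Python's standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /images/
Disallow: /pages/thankyou.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches the '*' user-agent, so these paths are blocked:
print(parser.can_fetch("Googlebot", "/images/logo.png"))      # False
print(parser.can_fetch("Googlebot", "/pages/thankyou.html"))  # False
# ...while everything else stays crawlable:
print(parser.can_fetch("Googlebot", "/blog/my-post.html"))    # True
```

This catches typos in your Disallow paths before a bad rule ever reaches a live crawler.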

2. Check for specific crawl issues

If you’re a WooRank user then this is going to be really simple. Use the Site Crawl feature to detect any pages that can’t be indexed.

Non-indexable pages in Site Crawl

In this example you can see that there are 5 Non-Indexable Pages. I can also see that these pages are showing as non-canonical pages.

Check the page URLs listed here. If you want these pages to be indexed, take a look at the reasons given, which could be:

  • Noindex meta robots or header tags
  • Robots.txt file blocking access
  • Use of a canonical tag pointing to another URL.

2. Broken links or pages

There are several reasons for broken links or pages. Unfortunately, no matter the reason, if Google can’t access them Google can’t index them. Use GSC to investigate crawl errors.

  • 5xx errors: These errors relate to a problem at the server end. They could indicate that the server is down or overloaded.

  • 404 errors: Although these don’t directly impact your SEO, they will drive your site visitors crazy. These errors indicate that the page being visited is missing.

  • 3xx redirects: These are redirects, and although users will often be oblivious to a redirect, Google knows and sees all. Although these aren’t errors, they can cause problems if you don’t use them correctly.

How to Fix

HTTP Errors in Site Crawl

Using Site Crawl, click on HTTP Status. It will show you how many errors - by code type - exist on the site.

If you find a lot of 5xx errors it might be worth raising the issue with your hosting provider. Check your Advanced Review to see any periods of server downtime.

If the problem persists it may be time to change to a different host.

WooRank server uptime monitoring

For 4xx errors, redirect users from the broken page to a different page (one that isn’t broken). Make sure the page is relevant to the content that would’ve been found on the broken page to avoid disappointing your visitors.

301 redirects are the most common and tell search engines that the requested web page has moved permanently to a different URL. These should be used in the following instances:

  • When the URL of a page gets changed

  • Site has been moved to a new domain

  • Site can be accessed through several different URL variations, e.g. www.name, http://name. Here you can use 301 redirects to send all traffic to the preferred domain name

  • If you’re merging two websites.
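If your site runs on Apache, redirects like these can be set up in an .htaccess file. A minimal sketch, using placeholder domain and page names:

```apache
# Hypothetical example: send all traffic to the preferred www domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Permanently redirect a single moved page
Redirect 301 /old-page.html /new-page.html
```

Other servers (nginx, IIS) have their own equivalents, but the principle is the same: one rule per permanent move, answering with a 301 status.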

You might be tempted, but don’t use 301 redirects for duplicate content. There are legitimate reasons duplicate content exists (same content in two different categories on your website), so you want your visitors to be able to access them both. However, besides the typical issues with Panda, duplicate content can water down your link building results and page authority.

For duplicate content, the rel=canonical tag should be used on the duplicate page to tell search engine bots to crawl and pass juice to the preferred page.

3. Poor web & mobile speed

A slow loading website will have a negative impact on the user experience. Google picks up on this by monitoring bounce rate and abandonment - on top of measuring its own wait time - which will impact your ranking ability.

How to check

Use the free WooRank SEO audit to check the speed of the different versions of your site.

If it indicates that your site is slow then you need to rectify this quickly (excuse the pun).

WooRank speed tips criteria

How to fix website speed

  1. Utilize browser caching: Caching stores website resources (stylesheets, images and JavaScript files) on a local computer, so revisiting a site doesn’t mean reloading all of the content every time. Take a look at the expiration date set for your cache using a tool like YSlow. If you’re not making drastic, regular changes you could set the expiry date to 1 year.

  2. Design: Nested tables (tables within tables) may be something you want to include for a specific design reason, but they massively slow down functionality. If you really, really want to use these then keep them to a minimum.

  3. Think carefully about what content appears above-the-fold (ATF): This is the first thing that people see when visiting your site and although you want to grab your audience’s attention, don’t be tempted to include hefty videos that will slow down loading time.

  4. Compress images: Use an image editor like Photoshop to save images for web devices. This compresses images without losing quality and definition. Always use the correct format for graphics. Photographs should be saved as JPEGs and graphics or illustrations should be saved as PNGs. Consider using CSS sprites for images that appear throughout your site, e.g. logos, banners and social share buttons. Sprites combine these images into one larger file that loads in a single request, reducing the number of HTTP requests.

  5. Minify code: Minify your HTML, JavaScript and CSS files to strip out unnecessary characters, and enable Gzip on your server to compress them further before they’re sent to the browser.
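As a rough illustration of what minification does, here’s a naive sketch that strips comments and collapses whitespace in a CSS string. Real minifiers (cssnano, UglifyJS and the like) are far more thorough; this is only to show the idea:

```python
import re

def naive_minify_css(css: str) -> str:
    """Very rough CSS minification: strips /* */ comments and
    collapses runs of whitespace. Real minifiers do much more."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # trim around punctuation
    return css.strip()

original = """
/* main heading */
h1 {
    color: #333;
    margin: 0 auto;
}
"""
print(naive_minify_css(original))  # h1{color:#333;margin:0 auto;}
```

Every byte removed this way is a byte the browser never has to download, which is why minification plus Gzip is standard practice.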

How to fix mobile speed

  1. Consider connection issues: Mobile users are on the go and may not always be connected to superfast wifi. Your mobile site should take into account data usage requirements. Disable any unnecessary images or data heavy videos.

  2. Simplify templates: Remove any unnecessary plugins, pop-ups and widgets that you may have running on your main site. These can all eat into data usage and slow loading times right down.

  3. Consider a separate mobile site: If you have a large site it might be worth considering an independent mobile site. These should be designed specifically around a mobile user and ensure that information is accessible. Mobile sites are often stripped-down versions of desktop sites to enhance the user experience.

  4. Limit number of redirects: If you have to use redirects make sure that they are limited to three in a chain (a page redirecting to a page that redirects to another page). Each redirect is a separate HTTP request and therefore impacts page load speed, which can have a bigger impact on the mobile user.
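To make the chain rule concrete, here’s a small sketch that walks a hypothetical map of redirects and flags any chain longer than three hops (the URLs and the `redirects` map are invented for illustration - in practice you’d build the map from a crawl report):

```python
def redirect_chain(redirects: dict, url: str, max_hops: int = 3) -> list:
    """Follow url through the redirects map and return the full chain.
    Raises if the chain exceeds max_hops or loops back on itself."""
    chain = [url]
    while url in redirects:
        url = redirects[url]
        if url in chain:
            raise ValueError(f"redirect loop at {url}")
        chain.append(url)
        if len(chain) - 1 > max_hops:
            raise ValueError(f"chain longer than {max_hops} hops: {chain}")
    return chain

# Hypothetical redirect map, e.g. parsed from a crawl report
redirects = {
    "/old-home": "/home-2016",
    "/home-2016": "/home",
}

print(redirect_chain(redirects, "/old-home"))
# ['/old-home', '/home-2016', '/home']
```

The fix for a long chain is simple: point every old URL directly at the final destination, so each request costs exactly one redirect.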

4. On-page issues

Unfortunately, search engines cannot read the content of a webpage the way a human can. So they rely on certain on-page elements to determine relevance to a search query. Luckily, there are several ways you can help out, which will ensure that your pages get indexed and rank for specific keywords.

How to check

To monitor positions for specific keywords, you can use a free tool like RankScanner to check five keywords or WooRank’s Keyword Tool allows you to track up to 250 keywords along with performance over time.

How to fix

Well researched keywords need to be included in the following on-page locations:

  • Page URL
  • H1 title (One per page unless you’re using HTML5)
  • H2-H6 titles
  • Opening paragraph
  • A minimum of 3 times throughout the body of content
  • Title tag
  • Meta description
  • Image file names

This may not seem very technical but problems can arise as your site grows or is developed. Which brings us nicely onto our next point.

5. Duplicate content

There are several legitimate reasons why duplicate content may exist on your site (as in, not from scraping or plagiarism). The bad news is that Google treats duplicate content negatively even when it’s caused legitimately, which can affect SEO performance.

Duplicate content can exist because:

  • Your site doesn’t redirect different domain variations, such as http://, https://, www. and http://www., to one place.

  • Site migration from HTTP to HTTPS

  • Duplicate on-page elements, including title tags and meta descriptions.

How to check

First, check whether your site directs users to a preferred domain by simply typing your URL into the address bar with and without the http:// or www. prefixes. If your site always redirects to your preferred domain - great! If it doesn’t, follow the steps below.

To check for duplicate content within your on-page elements, you can use GSC’s HTML Improvements under Search Appearance. Here it details duplicate meta descriptions and title tags, an indication that there are duplicated pages.

However, when I have done this in the past I have found that Site Crawl displays a lot more pages with errors - presumably because GSC doesn’t monitor pages that it hasn’t indexed. For this reason I suggest using WooRank, because fixing any on-page issues will only improve your chances of being indexed.

How to Fix

Set your preferred domain in GSC. Go to your Site Settings in GSC and click the button for the version of your domain that you prefer.

For duplicate pages - two versions of the same page on different URLs - use the rel=canonical tag on the duplicated (non-canonical) pages pointing back to the original version.

First, choose your preferred page by selecting either the page with the shortest URL or the one with the most backlinks.

Next, add a rel=canonical link from the non-canonical page to the canonical page in the <head> section of the page:

<link rel="canonical" href="http://example.com/preferredpage/">

From a search engine perspective the pages now exist as one page but users can still visit both pages.

For on-page duplication errors, head over to the Site Crawl function in WooRank.

Site Crawl duplicate content issues

Here you will see duplications in:

  • Title tags
  • Meta descriptions
  • H1 tags
  • Body content

Work your way through these and refresh the site crawl.
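A report like this boils down to grouping pages that share an identical element. As a sketch, assuming you already have each page’s title tag (the URLs and titles here are invented):

```python
from collections import defaultdict

def find_duplicates(pages: dict) -> dict:
    """Group URLs that share the same value (e.g. title tag)."""
    groups = defaultdict(list)
    for url, value in pages.items():
        groups[value.strip().lower()].append(url)
    return {value: urls for value, urls in groups.items() if len(urls) > 1}

titles = {
    "/shoes/red": "Red Shoes | Example Store",
    "/shoes/red?sort=price": "Red Shoes | Example Store",
    "/shoes/blue": "Blue Shoes | Example Store",
}

print(find_duplicates(titles))
# {'red shoes | example store': ['/shoes/red', '/shoes/red?sort=price']}
```

The same function works for meta descriptions, H1s or body content hashes - each group it returns is a candidate for a rewrite or a rel=canonical tag.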

What Are You Waiting For?

There are many technical SEO problems that can cause a site to perform badly. While some are more obvious to detect than others, using the right tools can help you identify any issues with speed, performance, crawling and indexing.

Ultimately, dismissing common technical SEO problems can be disastrous and will impact future success. So, optimize your site, rank higher and achieve more by working your way through this guide to common technical SEO problems and how to fix them.