
In a recent post, we explored the importance of technical SEO and made the case that we're in the midst of a Technical SEO Renaissance. In an era of marketing where the emphasis is placed heavily on content creation, we argued that a content-first approach to SEO is flawed if your site isn't being found and indexed.

As SEOs, we should also take into account user behavior and the user experience. Since mobile usage now outweighs desktop, sites need to be optimized for these users. With this in mind, we’re going to be looking at the most important technical SEO factors, how they impact your site and what you can do to put them right.

What is Technical SEO?

Technical SEO refers to everything that happens behind the scenes. If we compare SEO to a car (I'll give it a go), technical SEO would be the engine, driving the car and its passengers to their destination. The car's accessories, like wheels, trim and color, represent the on-page SEO and user experience, while the car's destination and the passengers it carries relate to off-page SEO techniques, like link building.

The important thing to take away from this analogy is that technical SEO factors, much like a car's engine, are vital to driving and delivering your website's success. But that doesn't stop them from getting overlooked (much like your car's check engine light).

In fact, I frequently work with customers who complain about their site's underperformance despite the fact they've spent money on SEO. Often, site owners end up paying for the creation of landing pages and backlinks, yet a quick SEO audit will reveal a number of fundamental technical SEO factors missing from the site. In a nutshell, technical SEO is important.

1. Crawling and Indexing

For your site to rank highly, or at all, search engines must be able to crawl and index your site.

Robots.txt

Robots.txt files - placed in a site's top-level directory (as in, example.com/robots.txt) - tell search engines which parts of your site they can and can't crawl. Blocking search engines from crawling parts of a site can be useful for keeping information private and for preventing certain files, like images and PDFs, from being crawled. However, one wrong move and you could block search engines from crawling your site altogether.

To check if your site has a robots.txt file, type your full domain name followed by /robots.txt into your browser. For example, www.example.com/robots.txt. Here is the robots.txt file for my site:

Example of robots.txt file

If your site doesn't have one, you need to create one. It's also a good idea to include the locations of your sitemaps.
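
As a rough sketch (every site is different, so treat the paths here as placeholders), a robots.txt file combines User-agent, Disallow and Sitemap lines along these lines:

    # Example robots.txt - adjust the paths to match your own site
    User-agent: *
    Disallow: /admin/        # keep a private section out of the crawl
    Disallow: /downloads/    # stop files in this folder (e.g. PDFs) from being crawled

    # Tell crawlers where to find your sitemap(s)
    Sitemap: https://www.example.com/sitemap.xml

Be careful with a bare "Disallow: /" under "User-agent: *", as that one line is enough to block compliant crawlers from your entire site.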

Sitemaps

XML sitemaps tell search engines about your site's structure and help them find and index your content faster. Keep them free of non-canonical pages, 404 pages, blocked pages and redirected URLs, and update them regularly whenever you publish new content.

Make sure you verify your site with the relevant search engine webmaster tools, like Bing Webmaster Tools and Google's Search Console, so that you can test and submit your optimized sitemaps. Keeping them free from clutter and unnecessary URLs will ensure that your most important pages get found faster.
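
If you've never looked inside one, a minimal XML sitemap is just a list of your canonical, indexable URLs with optional metadata; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2018-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/post1</loc>
        <lastmod>2018-01-10</lastmod>
      </url>
    </urlset>

Most CMSs and SEO plugins can generate and update a file like this for you automatically.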

If you're confident that your site has both a sitemap and a robots.txt file, you should check that your site's pages are being indexed and that they actually show up in search results. There are a few simple ways of doing this:

  1. Check on Google Search Console to see how many pages it shows as being indexed.

  2. Review your site using an audit tool to check the number of discovered pages.

    WooRank audit discovered pages

  3. Perform a site:example.com search. Type 'site:' followed by your domain name into Google and it will display all of the pages from your site that are in Google's index. If the number of pages displayed here is much lower than the number of pages that exist on your site, you'll need a more precise way of checking which pages are and are not being indexed.

  4. Use a tool like WooRank’s Site Crawl to obtain a better understanding of why these pages are not being indexed.

    Site Crawl non-indexable pages

As you can see, on this particular site there are 6 pages not being indexed because the canonical tags on those pages instruct Google to index only the canonical URLs.

There may be other reasons why your site isn’t indexed, so make sure you find out why by referring to our guide.

2. Get rid of duplicate or thin content

Search engines hate duplicate content. Not only does it serve little purpose to the user and add confusion, but search engines may also see it as an attempt to dupe or manipulate them, which could result in your site being penalized.

Duplicate content across your site will also dilute link juice, which can massively harm your SEO.

There are several ways that duplicate content can come to exist on a site:

  1. Directly copying content from another site: This is not recommended. Search engines will know which content is the original simply by looking at which was published first, and will therefore ignore the copied content completely.

  2. www resolve: Google sees URLs starting with www and those without as two different sites. If you don't redirect one version to the other, Google will see two copies of every page. It is therefore important to fix this issue.

    If you’re still unsure whether this is something that needs fixing on your site, it’s worthwhile running a review.

    WWW resolve in WooRank review

    To fix www resolve issues, you will need to use 301 redirects to send traffic to the preferred domain (see the redirect sketch after this list). This should be done in the root directory of your site. Once you've done this, head over to Google Search Console and set your preferred domain. You will need to add and verify every version of your site as a 'new property', including the versions with and without https:// if you use it.

  3. Tags/categories/authors: Websites with blogs or portfolios that use tags and categories can also fall victim to duplicate content. Because the same content can be found at different URLs, for example www.examplesite.com/blog/post1 and www.examplesite.com/categories/blogposts, these pages may be treated by search engines as duplicate content.

    Use a tool that offers a site crawl to check for any instances of this. What you're looking for here is the use or absence of canonical tags. Use the rel="canonical" tag on duplicate pages to point search engines to the original content (see the canonical tag example after this list). You do not want to use 301 redirects here because this will send users to different pages, which may not be what they'd expect to see.

  4. Duplicate meta descriptions/title tags/H1 tags: Each of these on-page elements should be unique, so they are important things to assess. They give search engines the best indication of the content on your pages.

    Again, the site crawl feature by WooRank can detect these for you and tell you which URLs share the same elements.

    Title tags in SiteCrawl
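
To make the fixes above concrete: on an Apache server, the www resolve redirect from point 2 usually lives in the .htaccess file in your root directory. This is only a sketch, assuming Apache with mod_rewrite enabled and the www version over HTTPS as your preferred domain; swap in your own domain and scheme:

    # 301 redirect the non-www version to the www version (Apache, mod_rewrite assumed)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

For the tag/category duplicates in point 3, the canonical tag is a single link element placed in the <head> of the duplicate page, pointing search engines at the version you want indexed:

    <!-- In the <head> of www.examplesite.com/categories/blogposts (the duplicate) -->
    <link rel="canonical" href="https://www.examplesite.com/blog/post1" />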

Fix all of the issues above and you should see your link equity consolidated and your pages' ability to rank improve.

3. Mobile friendliness

In response to the rising popularity of mobile usage compared with desktop, Google made several mobile-related updates to its algorithm, including mobile-first indexing and mobile optimization.

Both updates aimed to enhance the experience for mobile users by encouraging site owners to:

  • Keep designs responsive: This ensures content adapts to screen size, allowing the user to access it easily on touchscreen devices. The user shouldn't have to pinch or zoom to focus on content (see the viewport snippet after these bullets).

  • Touchscreen readiness: Ensures that important buttons and links can be accessed and are large enough to be tapped.

  • Mobile speed: Speed is an important issue for mobile users. Google sets the threshold for loading above-the-fold content at one second or less.
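
On the responsive design bullet, the usual first step is the viewport meta tag in the <head> of each page. It tells mobile browsers to match the page width to the device's screen rather than rendering a shrunken desktop layout; this is the standard snippet:

    <!-- Scale the page to the width of the device -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

Without it, even a responsive stylesheet will often be displayed zoomed out on phones.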

Let's not forget that mobile usage surpasses desktop and TV. Mobile searches are also full of intent: over 50% of people who conduct a local search on mobile visit a store within a day. Therefore it is paramount to have an awesome, mobile-friendly site.

A simple way of auditing your mobile site is by using a tool that tests mobile compatibility. WooRank examines mobile friendliness by assessing:

  • Mobile rendering
  • Touchscreen readiness
  • Mobile compatibility
  • Font size legibility
  • Mobile viewport and mobile speed

WooRank mobile friendliness audit

4. Speed

Remember that Google aims to serve its users pages that are relevant to the search query and offer a good user experience. Among the factors that contribute to that experience are page load speed and overall site speed.

If you're not sure how your site is performing, you can use Google's PageSpeed Insights. Google will give your site a rating out of 100 and also provide some in-depth guides on how to optimize site speed.

Other great tools for testing speed, such as YSlow (covered below), are also available.

Top tips for improving mobile speed include:

  1. Optimize your images: Compress image file sizes by using the 'Save for Web' option in a photo editor like Photoshop. Photographic images should be saved in .jpeg format and illustrations should be saved as .png. Consider using CSS sprites if your site uses the same images in several places. This is particularly handy for logos, social icons and banner images. CSS sprites let you combine several small images into one larger image and display only the part you need, reducing the number of HTTP requests (see the sketches after these tips).

  2. Site compression: Gzip is a widely supported method of file compression. Use it to compress HTML, JavaScript and CSS files that are larger than about 150 bytes (a server configuration sketch follows these tips).

  3. Optimize browser caching: Browser caching stores website resources (stylesheets, images and JavaScript files) on the visitor's computer. Pages that have already been visited are remembered, so the browser doesn't have to re-download all of the content on subsequent visits.

YSlow will analyze the expiration dates set for your cached resources. Unless you plan on making frequent major changes to your site, you should look at setting the cache to expire after a year. Google has a great developer's guide on Leveraging Browser Caching.
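
To make tips 1 to 3 concrete, here are two short sketches. The first assumes an Apache server with mod_deflate and mod_expires enabled (other servers use different directives) and goes in your .htaccess file; it compresses text-based files and lets browsers cache static resources for a year:

    # Gzip-compress text-based responses (Apache, mod_deflate assumed)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    # Let browsers cache static files for a year (Apache, mod_expires assumed)
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 year"
      ExpiresByType image/png "access plus 1 year"
      ExpiresByType text/css "access plus 1 year"
      ExpiresByType application/javascript "access plus 1 year"
    </IfModule>

The second shows the CSS sprite idea from tip 1: one combined image (a hypothetical icons.png with two 32px icons stacked vertically) is downloaded once and cropped with background-position:

    /* One HTTP request covers both icons; icons.png is a hypothetical combined image */
    .icon { background-image: url("/images/icons.png"); width: 32px; height: 32px; }
    .icon-facebook { background-position: 0 0; }     /* top icon in the sprite */
    .icon-twitter { background-position: 0 -32px; }  /* icon 32px further down */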

5. Structured data markup

Structured data markup uses a universal vocabulary to annotate key information on your site, like opening hours, reviews, product information, logos and biographical information. It powers the semantic web and enables highly accurate, enhanced search results like rich snippets, featured snippets and knowledge panels.

Although using structured data markup isn't a ranking signal, it can increase click-throughs by enhancing your search results. And because less than one-third of websites currently use it, we recommend getting in there quickly, before everyone starts doing it and before Google insists on it. (It could happen.)
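
As a rough illustration of what the markup looks like in practice, here is a small JSON-LD block using the schema.org LocalBusiness type, placed in the page's HTML; the business details are placeholders, and the properties you include should reflect what actually appears on the page:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Coffee Shop",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/images/logo.png",
      "openingHours": "Mo-Fr 08:00-18:00",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Brussels",
        "addressCountry": "BE"
      }
    }
    </script>

Once it's in place, run the page through Google's structured data testing tool to confirm that the markup is read correctly.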

You can also use Google's Data Highlighter tool, found under Search Appearance in Search Console, to annotate the important information on your pages. Google provides a link to a great overview video, making it simple to implement.

Google Data Highlighter to structure data

There are lots of other tools out there to help you structure your website’s data for the semantic web.

What we’ve learned

Although we're often encouraged to create great content or establish a robust social media strategy, neglecting important technical SEO factors can result in a lot of wasted effort, especially if your site isn't being indexed. It's therefore vital to address, test and regularly revisit these technical SEO factors to ensure that your site has the best chance of ranking highly.

If you're unsure about how your site fares on these technical factors, use a site auditing tool to carefully examine your site. Prioritize these factors, work through them, and you'll see improvements in traffic and visibility in no time.