
When we think about SEO, many of us place an emphasis on producing ‘amazing’ content in the hope that it will be shared, generate backlinks and be found based on the keywords used.

While content is an important part of SEO, search engines have become highly intelligent indexing machines that aim to serve only the best-performing sites to their users. Yes, search engines judge the quality of the content that exists on a site, but performance is also defined by a site’s technical construction, leading many to prioritize technical aspects.

This new emphasis on technical SEO is what is being referred to as a ‘technical SEO renaissance’. So, to better understand this new era of SEO, we’re going to explain what it is, why it has come about and what we can do to help our sites perform better on a technical level.

The Dawning of a Technical SEO Renaissance

Search engines, especially industry giants like Google, are constantly aiming to improve the user experience. For Google, this means only serving websites that perform well and are full of original, helpful content that will benefit the user. Get this right and your site will stand a better chance of ranking highly.

How does Google ensure it selects the best websites? Through a complex system of programs that can crawl, decipher and measure how effective your site is: this is known as the ‘algorithm’.

Historically, SEO was a practice of keyword stuffing, hacks and black-hat techniques, all used to trick or mislead search engines into believing a site was relevant to the search terms being input by the user. In a bid to eradicate this manipulation, Google began to develop ways of tackling these problems.

Today there are over 200 different ranking signals that Google examines to determine where your site should rank. In addition, Google continues to make significant changes to its algorithm, which has resulted in a dramatic shift in the way we approach SEO. This is the dawning of the technical SEO renaissance.

Google Updates Made Us Obsessed with Content

Panda

Rolled out globally in April 2011, Panda aims to detect sites with poor-quality content, ensuring they don’t rank as well as sites with in-depth content. Panda primarily focuses on removing duplicate content and thin pages from search results.

Penguin

Because backlinks – both their quantity and quality – are a significant ranking signal, many sites went about obtaining them unnaturally. Penguin, rolled out in April 2012, targets sites that acquired links by:

  • Buying links
  • Participating in link exchange programs
  • Using low quality, unmoderated directory sites
  • Spamming blog comments with links
  • Overusing exact-match anchor text

Although Panda and Penguin were useful updates bringing many benefits to the user, they’re probably among the major reasons such a strong emphasis has been put on creating content, which can overshadow technical considerations.

However, the next updates we’ll examine played a major role in emphasizing the importance of technical SEO.

The Technical SEO Renaissance

Hummingbird

The Hummingbird update was a beast of an update. Instead of tackling specific problems like the previous updates, Hummingbird changed the way Google reads, interprets and understands content. Hummingbird uses semantic technology to better decipher what exists on your site and how well it answers users’ queries.

A commonly used example: prior to Hummingbird, a user would have searched for ‘pizza delivery, Manchester’. Because the semantic web is about accessing data from across the web and across applications, Google can now find sites based on your location, so all you need to search for is ‘pizza delivery’ or ‘pizza delivery near me’, without specifying where you are. This also means that sites no longer have to target location-based keywords, because Google will do it for them.

However, this does mean that your site needs to be optimized for local search. But we’ll get on to this shortly.

Mobile-Friendly Update

Released in April 2015, Google’s mobile-friendly update disrupted the equilibrium of site owners everywhere. Promising to give mobile-friendly sites an added rankings boost, it left many wondering about the impact on sites with a less-than-friendly mobile version. The update was therefore presumptuously nicknamed ‘Mobilegeddon’.

In reality, sites that weren’t responsive weren’t penalized all that heavily, but the update did cause some fluctuation in rankings, favoring sites that were easier to navigate on mobile devices.

Everyone obviously took the update quite seriously: 85% of all pages are now deemed mobile-friendly, which has since led Google to remove its mobile-friendly label from search results.

You should test your site for mobile-friendliness, or use Woorank to see how your site renders on mobile devices.
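At its simplest, mobile-friendliness means a responsive layout. Here is a minimal sketch of a starting point (the class name and breakpoint are just illustrative): a viewport meta tag plus a media query.

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- Tells mobile browsers to match the device width instead of
       rendering a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Responsive starting point</title>
  <style>
    /* Single-column layout by default (mobile first) */
    .content { padding: 1rem; }
    /* Widen the layout only when the screen allows it */
    @media (min-width: 768px) {
      .content { max-width: 720px; margin: 0 auto; }
    }
  </style>
</head>
<body>
  <div class="content">Readable on any screen size.</div>
</body>
</html>
```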

Mobile First Indexing

When mobile use overtook desktop, Google responded with its Mobile First Index, which it began testing in 2016. Mobile-first indexing ranks sites primarily on their mobile version, even when the search is conducted on a desktop.

In practice, sites that serve the same content, URLs and structured data markup to mobile and desktop visitors shouldn’t feel any impact, because Google crawls and indexes the same pages and the same inbound links either way.

However, sites whose mobile versions have different content, URLs or markup will see some changes.
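If your mobile pages do live on separate URLs (an m. subdomain is the classic pattern), the approach Google has long documented is to connect the two versions with rel="alternate" and rel="canonical" annotations so crawlers treat them as one page. A sketch, with hypothetical URLs:

```html
<!-- On the desktop page (https://www.example.com/page),
     point to the mobile version -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page),
     point back to the desktop version -->
<link rel="canonical" href="https://www.example.com/page">
```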

Accelerated Mobile Pages

In 2015 Google launched the Accelerated Mobile Pages (AMP) project which, according to Google’s official blog:

“aims to dramatically improve the performance of the mobile web. We want web pages with rich content like video, animations and graphics to work alongside smart ads, and to load instantaneously. We also want the same code to work across multiple platforms and devices so that content can appear everywhere in an instant—no matter what type of phone, tablet or mobile device you’re using.”

This was aimed primarily at publishers, who produce content and rely on users accessing and viewing it at a constant rate. Static (non-interactive) AMP pages are designed to be pre-renderable, meaning everything above the fold has rendered before a user even clicks through.
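To give a feel for what this looks like, here is an abridged sketch of a minimal AMP page. The mandatory amp-boilerplate style is shortened to a stand-in here; use the full snippet from the AMP project when building a real page.

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime, loaded asynchronously from the AMP CDN -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Every AMP page points back to its canonical (non-AMP) version -->
  <link rel="canonical" href="https://www.example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- Abridged stand-in for the mandatory amp-boilerplate style,
       which hides the page until the AMP runtime has loaded -->
  <style amp-boilerplate>body{visibility:hidden}</style>
  <noscript><style amp-boilerplate>body{visibility:visible}</style></noscript>
</head>
<body>
  <h1>An instant-loading article</h1>
  <!-- Images use the amp-img component instead of <img>, so the
       runtime controls when and how they render -->
  <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```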

The AMP project not only highlighted the benefit of extremely fast-loading pages, it also made a compelling case to many developers for further investment in speed and in a technical SEO infrastructure.

If you want more information about how AMP works, check out our blog.

Speed

Since as early as 2010, Google has been highlighting the importance of page-load speed. After conducting compelling research into how speed affects the user experience, Google went on to announce that fast-loading pages would be rewarded.

Woorank can assess vital elements that impact speed, so it is well worth checking out. Alternatively, you can use Google’s own tool to assess performance, or use YSlow to identify ways of speeding up your site. They also provide 34 rules for speeding up your site, which include (two of them are sketched after this list):

  • Reduce DNS lookups
  • Make JavaScript and CSS external
  • Minify JavaScript and CSS
  • Avoid redirects
  • Make AJAX cacheable
  • Don’t scale images in HTML
  • Minimize HTTP requests
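To make a couple of these concrete, here is a sketch of the ‘external’ and ‘minify’ rules applied to a page’s head (the file names are hypothetical):

```html
<head>
  <!-- External, minified stylesheet: one cacheable request
       shared by every page on the site -->
  <link rel="stylesheet" href="/assets/styles.min.css">

  <!-- External, minified script; async lets the HTML parser carry on
       while the file downloads, instead of blocking rendering -->
  <script async src="/assets/app.min.js"></script>
</head>
```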

SSL Security

Although Matt Cutts announced back in 2014 that HTTPS is a ‘very lightweight’ ranking signal, many experts predict that Google, which has long been obsessed with security, will eventually favor secure sites more heavily.

The reason Google favors HTTPS sites is that they indicate trust and credibility.

You will need to purchase an SSL certificate and plan your site migration to HTTPS carefully.

Once you’ve implemented HTTPS, use Woorank’s Site Crawl to check for any errors that may arise, paying particular attention to ‘HTTP with HTTPS’ errors.

This will identify any pages hosted on HTTPS URLs that contain assets not served over HTTPS, such as images and CSS files.
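Most of these mixed-content errors come down to hard-coded http:// asset URLs. A before-and-after sketch (the URLs are hypothetical):

```html
<!-- Before: an HTTPS page requesting an image over plain HTTP.
     Browsers flag this as mixed content and may refuse to load it -->
<img src="http://www.example.com/images/logo.png" alt="Logo">

<!-- After: the same asset requested over HTTPS -->
<img src="https://www.example.com/images/logo.png" alt="Logo">
```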

Semantic Web

We touched on this when we looked at the Hummingbird update, but the semantic web is probably the most significant development to bring about a technical SEO renaissance. Although most of what the semantic web is trying to achieve depends on creating content that is fresh and able to answer user questions, some attributes of this futuristic web will require technical knowledge.

The semantic web again aims to improve the user experience, which has resulted in enhanced search results like rich snippets, featured boxes and Google’s own Knowledge Graph. Most of these work by annotating the content on your site’s pages to help Google decipher your HTML. For example, with the help of a universal vocabulary called Schema.org, site owners can tag key information like opening hours, reviews, recipes, products and contact information with structured data markup.
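For instance, a local pizza business like the one in the Hummingbird example could describe itself with Schema.org’s LocalBusiness type, shown here in JSON-LD, the format Google recommends (the business details are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Pizza Delivery",
  "telephone": "+44 161 555 0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Manchester",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Su 11:00-23:00"
}
</script>
```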

Currently, the semantic web is still in its infancy, with only 17% of SEOs utilizing Schema.org markup, but since Google is always emphasizing an exceptional user experience, it might become a strong ranking signal in the not-too-distant future.

Are you ready to start your own technical SEO renaissance?

For a long while, marketers and SEO practitioners have placed a large emphasis on creating content, and not without good reason. Content is, and will always be, an important strategy; it is surely the very reason for the web’s existence – people are searching for content!

However, SEO used to be – and should remain – the practice of optimizing sites so they can be found, not a contest over how much keyword-specific content one can produce. Obviously, I’m being a little extreme here, but only to drive the point home.

Google updates seek to make the web better and enhance the user experience. Many of these updates and advancements have highlighted the importance of technical SEO and how, without it, our sites will suffer.

So, take a good look at the technical aspects of your site and begin your very own technical SEO renaissance.