
Mention Technical SEO and most of us run a mile. But with 81% of consumers conducting online searches before buying, and an estimated 4.49 billion indexed web pages, the time to master some new skills is now.

We’ve put together a handy checklist, specifically for all you non-developers. We’ll be showing you how to use the checklist as your very own technical SEO audit, explaining how to implement changes and pointing you towards some great tools for success.

What is Technical SEO?

When it comes to SEO, many place an emphasis on content creation/optimization (otherwise referred to as on-page SEO) and external link building (off-page SEO). Technical SEO is the foundation on which these two should be built and ensures that your site is Search Engine friendly. Without it, you could be hindering your site’s chances of being indexed and ranking well.

What is a Technical SEO Audit?

A technical SEO audit is an assessment of how Search Engine friendly your site is. Instead of focusing on great content to boost visibility, a technical audit allows you to identify and address any technical issues that may be hindering your content’s chances of being seen. It examines a number of key technical criteria that will increase your site’s performance if optimized correctly.

Use the following technical SEO audit to manually examine your own site and rectify any issues that arise.

1. Website Speed

Website or page load speed is one of the biggest ranking factors and therefore requires consideration. When it comes to performance, you can always be faster, and there are several ways to ensure that pages load quickly without developer knowledge.

How to Optimize

  • Simplify page templates: Remove any unnecessary plugins, tracking codes, advertisements and widgets.
  • Minimize image sizes: Use an editor like Adobe Photoshop to save your images for the web. This compresses image file sizes while retaining image quality.
  • Set browser caching: The browser cache stores website resources locally on a visitor’s computer, so a site they have already visited loads much more quickly on subsequent visits. Setting longer cache lifetimes for static resources improves website speed for returning visitors.

WooRank Advanced Review Speed Tips section

Helpful Tools

There are a number of tools available for testing website speed, which we have already put together for you.

2. Secure Browsing: HTTPS

Hypertext Transfer Protocol (HTTP) defines how messages are formatted and transmitted. HTTPS transmits web information securely, builds trust and credibility, and improves performance. If that wasn’t enough, back in 2014 Google announced that it was a ranking factor.

Matt Cutts says HTTPS is ranking signal on Twitter

How to Optimize

  • If you haven’t already, purchase an SSL certificate and migrate your site to HTTPS URLs (this isn’t necessarily a quick fix, but it will be worth it in the long run; see the redirect sketch after this list).
  • Check that all assets are hosted on new secure URLs
    • Use a tool like Screaming Frog to compile a list of all URLs and find all non-HTTPS assets such as images, CSS files, videos and scripts

  • Update your Robots.txt and XML Sitemap to include HTTPS URLs.
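If your site runs on Apache, the HTTPS migration mentioned above is commonly enforced with a site-wide 301 redirect in .htaccess. A minimal sketch, assuming mod_rewrite is enabled (adapt it to your own server setup):

# Redirect every HTTP request to the equivalent HTTPS URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]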

Helpful Tools

WooRank’s very own Site Crawler tool is great for identifying HTTP assets hosted on HTTPS URLs.

3. Duplicate Content

Having duplicate content on your site is bad news and, unfortunately, search engines treat the HTTP, HTTPS and www versions of your site as separate sites, meaning a huge potential for duplicate content.

How to Optimize

The easiest way around this is to set your preferred domain in Google Search Console. Your preferred domain will receive all link juice and will be the one to appear in Search Engine Results Pages (SERPs).

Use Google Search Console to search for duplicate meta descriptions and title tags too. Under Search Appearance > HTML Improvements, look to see if duplicates in these sections have been identified. Re-write any duplicate descriptions and remember that title tags need to include your page’s primary keyword for best results.

Google Search Console HTML Improvements
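As an illustration, a rewritten title tag and meta description for a hypothetical dress-shirt category page might look like this (the page and wording are placeholders):

<title>Men's Fancy White Dress Shirts | Example.com</title>
<meta name="description" content="Browse our range of fancy white dress shirts for men, with free delivery and returns.">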

4. URLs

URLs have an impact on both your site’s user experience and SEO. Both humans and bots expect them to provide at least a basic description of what the page is about, and where that page sits in the site’s hierarchy. Optimize your URLs by including the page’s target keyword as well as the page’s folder and subfolders. Take a look at these two URLs:
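https://www.example.com/mens-clothing/dress-shirts/fancy-white-dress-shirt

https://www.example.com/product/12345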

Search engines crawling the page will see that URL and will be able to tell that not only is the page about fancy white dress shirts, but it’s also topically related to men’s clothing. The second URL, unfortunately, doesn’t really tell you anything about what you’ll find on that page, except maybe that it’s a product produced by example.com. Which do you think will appear more relevant to a search for "men’s fancy white dress shirts"?

When creating URLs, follow these best practices:

  • Concise: Your URLs should be descriptive and contain keywords, but they should also be concise. Generally speaking, your URLs should be 100 characters or less.

  • Clean: When possible, avoid using URL parameters like session IDs and sorting/filtering. They lower usability and run the risk of creating duplicate content problems.

  • Hyphens: When using multiple words in your URL, separate them using hyphens, not underscores. Search engines use hyphens as word separators but don’t recognize underscores, so url_keyword looks the same to them as urlkeyword. Since humans use spaces in searches, hyphens in your URL will look more relevant.

TIP: We recommend only changing URLs that are really hard to read. If you make changes to your URLs, you’ll have to create a 301 redirect to the new URL, and too many of these can affect site performance.

Canonical URLs

URL optimization isn’t just about how you use keywords, though. It’s also part of preventing duplicate content and consolidating ranking signals like link juice. It can be easy to inadvertently host duplicate content on a few pages thanks to URL parameters and syndicated content. This is bad not just for the duplicate pages - the Panda "penalty" affects the whole site. If you wind up with duplicate content thanks to your content management system or e-commerce platform, just use rel="canonical" to point search engines toward the original version.

When using canonical URLs, first implement the WWW resolve. To do this, set a preferred domain in Google Search Console under Site Settings. Google takes preferred domains into account when crawling the web and displaying search results. So if you set your preferred domain to www.example.com, all links to example.com will send link juice to www.example.com, which is the URL Google will display in SERPs. Next, add the canonical tag to the <head> of HTML pages or the HTTP header of non-HTML pages:

  • HTML: <link rel="canonical" href="https://www.example.com"/>
  • HTTP: Link: <https://www.example.com>; rel="canonical"

When adding the canonical tag, make absolutely sure the URLs you’re using match 100% to your canonical URLs. Google sees http://www.example.com, https://www.example.com and example.com as three different pages. Google will simply ignore the canonical link if you use more than one on a page or link to a page that returns a 404 error.
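For instance, a category page reached through sorting or filtering parameters can declare the clean version of the URL as its canonical (illustrative URLs):

<!-- served on https://www.example.com/shirts?sort=price -->
<link rel="canonical" href="https://www.example.com/shirts"/>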

5. Indexing/Crawling

Search Engines need to be able to crawl and index your pages in order to show them in SERPs.

Sitemaps

At a basic level, an XML sitemap is a list of every URL on your site, stored as a text file in your site’s root directory. In reality, there’s a little bit more to it than that. Yes, it lists every URL on your site (or at least the URL for every page you want crawled and indexed), but it also lists extra information about each page and serves an important SEO function. Search engines use the information in sitemaps to crawl sites more intelligently and efficiently so they won’t waste their crawl budget on unimportant or unchanged content. When done correctly, your basic sitemap looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-08-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.9</priority>
        <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/"/>
    </url>
</urlset>

What does all that mean? Here’s a synopsis:

  • <urlset>: Marks where the sitemap starts and ends; every URL entry sits inside this tag.
  • <url>: Denotes the beginning and end of each URL entry in the sitemap.
  • <loc>: This defines the URL of the page. While the rest of the attributes found in the <url> tag are optional, <loc> is required.
  • <lastmod>: The date, in YYYY-MM-DD format, the page was updated or modified.
  • <changefreq>: This indicates how frequently you update the page, which will help search engines decide how often to crawl it to make sure they’re indexing the freshest content. You might be tempted to lie to increase your crawl frequency, but don’t. If search engines see <changefreq> doesn’t match the actual change frequency, they’ll just ignore this parameter.
  • <priority>: Sets the priority of the page in relation to the rest of the site. Valid values range from 0.0 to 1.0, from least to most important. Use this tag to help search engines crawl your site more intelligently. Note that this only tells crawlers how important your pages are compared to your other pages. It does not affect how your pages are compared to other sites.
  • <xhtml:link>: This tag points to alternate versions of the page. In this example it indicates the French version of https://www.example.com.

Sitemaps aren’t a ranking signal, but they help search engines find all your pages and content, which makes it easier for you to rank well.

If you don’t want to write your own sitemap, there are plenty of tools out there that can help you create one. Once you have your XML sitemap, validate and submit it using Google Search Console. You can also submit your sitemap to Bing via Webmaster Tools. Make sure you fix any errors so you don’t wind up impeding your site’s indexing.

Submit Pages Directly

The easiest way to get indexed is to submit your site directly to Google and Bing. Submitting a URL to Google doesn’t require a Google Search Console account, but if you do have one, you can use the Fetch as Google tool in the Crawl section. After Googlebot successfully fetches your site, click the "Submit to index" button.

Fetch as Google in Google Search Console account

Submitting your site to Bing requires a Bing Webmaster Tools account.

Robots.txt files

Like XML sitemaps, robots.txt files are plain text files stored in the root directory of your site, and they help crawlers navigate your site. The file contains lines of code that specify which user agents have access to which files, file types or folders. The code is broken up into blocks, with one user agent line per block. Basic robots.txt code looks like this:

User-agent: *
Disallow:

User-agent: googlebot
Disallow: /*.ppt$

The asterisk (*) is used as a wild card. In the user agent line, the wild card represents all bots. In a disallow line, it represents the URL up to a specified point. In our example above, our robots.txt disallows Googlebot from crawling pages that end with a PowerPoint file extension - the $ denotes the end of the URL.

You can block bots from crawling your entire site by using a slash in the disallow line like this:

User-agent: *
Disallow: /

It’s good practice to disallow all robots from accessing the entire server when you’re building, redesigning or migrating your site. However, you have to be sure to restore access once you’re done, or your shiny new site won’t get indexed.

Use Google Search Console to test your robots.txt file for syntax errors or other problems.

Google Search Console robots.txt Tester

Meta Robots Tag

One problem with the robots.txt file is that it won’t stop search engines from following external links to your site, so disallowed pages could still wind up indexed. Add an extra layer of protection to individual pages using the robots meta tag:

<meta name="robots” content=”noindex”>

Or, in the context of getting your site indexed, make sure you don’t have a noindex robots meta tag on the pages you want to get indexed.

How to Optimize

  1. Submit your XML sitemap to Google through Google Search Console, under the Crawl section. Bing also has a webmaster tool where your sitemap can be submitted.
  2. Make sure that your robots.txt file isn’t blocking Search Engines from crawling your site. To allow all search engines to crawl your entire site, your robots.txt file should look like this:
    User-agent: *
    Disallow:
    

Helpful Tools

There are loads of free tools for sitemap creation and you can test your Robots.txt file in Google Search Console.

Page Speed

Page load time is a crucial aspect of site usability and SEO. Google is out to give its users the best results, so it doesn’t want to send people to slow websites. When you audit your site using WooRank, check the Usability section to see how fast your page loads and how that compares to your competitors.

WooRank Load Time criterion

If your site is slow, optimize these elements to improve your page speed:

  • Images: Images are one of the biggest culprits of slow page speed. Don’t rely on HTML to reduce the file size of an image - it can only change its dimensions. Use image editing software like Photoshop to reduce the file size. Consider using other image optimization tools to further compress your images.

  • Dependencies: Certain plugins and scripts, like social share buttons and tracking software, are required for you to get the most out of your website. Whenever possible, use plugins made by your CMS and stick to just one tracking system at a time. Keep your CMS software up to date, but test each update in a test environment in case the upgrade breaks something on your site.

  • Caching: Use expires headers to control the length of time that your site’s resources can be cached, and tell browsers that they can cache images, stylesheets, scripts and Flash. This will reduce the number of HTTP requests on repeat visits, therefore improving your page speed (see the .htaccess sketch after this list).

  • Gzip Encoding: Use Gzip compression to compress large files on your page to save bandwidth and download time.

  • Redirects: Some redirects are unavoidable. However, remember that every redirect is a new HTTP request and adds milliseconds to your load time.
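To illustrate the caching and compression tips above, here is a minimal .htaccess sketch, assuming an Apache server with mod_expires and mod_deflate enabled (adjust the file types and lifetimes to suit your site):

# Let browsers cache static assets for one month
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Compress text-based resources before sending them
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>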

If all else fails, use browser developer consoles to find files that are bottlenecking your page load.

6. Mobile & Structured Markup

At the tail-end of 2016 Google announced Mobile-First Indexing, which means that results ranked in SERPs are based on the mobile version of a site’s content – even results shown to those using a non-mobile device. This is because searches on mobile devices like smartphones and tablets have now surpassed searches made on laptops and PCs. It is therefore important to:

  1. Establish a mobile version of your site (if you don’t already have one)
  2. Optimize your mobile version.

How to Optimize

There are several aspects to ensuring that your mobile site works well on handheld devices:

  1. Use a responsive website design to ensure that it adapts to screen size
  2. Make sure that information and CTAs are accessible without having to zoom in or out
  3. Carefully consider the information that appears above the fold (ATF) – the content that fits on the screen before the user needs to scroll down
  4. Disable Flash as many devices are not compatible with Flash elements
  5. Ensure that the user isn’t required to scroll horizontally. This is terrible for the user experience
  6. Optimize your mobile speed by minifying code, reducing image size and keeping the number of redirects to a minimum.

Helpful Tools

Use device emulators or operating system simulators to test the mobile rendering of your site and check out some of our previous blogs that focus on mobile SEO.

Mobile Page Speed

Mobile friendliness has a direct tie to site speed, as load time is a major factor in mobile search rankings. It’s arguably even more important for mobile pages to be fast than it is for desktop sites. There are numbers that back this up: 40% of mobile users will leave a page after waiting three seconds for it to load. Google’s criterion for a mobile-friendly page is that above-the-fold content loads in one second or less.

You can optimize your mobile speed the same way you do your desktop speed: reducing image size, relying on caching, reducing dependencies and minimizing redirects. Or, you can create an Accelerated Mobile Page (AMP). AMP is an open source initiative to create mobile pages that are fast and have an enhanced user experience. There are three main parts to AMP, illustrated in the sketch after this list:

  • HTML: HTML for AMP pages is basically normal HTML. It just has a few custom variations and limitations for resources like images, videos and iframes.

  • JavaScript: AMP pages use a custom JavaScript library that loads asynchronously. You’re also required to set element sizes in HTML, so browsers know how the page will look before resources are loaded and the page won’t jump around as they load.

  • Cache: Google has a dedicated cache for AMP pages that it uses to serve in search results. When Google loads a page saved in the AMP cache, everything is coming from the same location, which means better efficiency.
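To give a feel for how these pieces fit together, here is a stripped-down sketch of an AMP page (the mandatory amp-boilerplate style block from the AMP documentation is abbreviated to a comment, and the URLs and image are placeholders):

<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <link rel="canonical" href="https://www.example.com/regular-page.html">
    <!-- the required amp-boilerplate <style> snippet goes here -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Fancy White Dress Shirts</title>
  </head>
  <body>
    <!-- images use the amp-img component instead of a plain <img> tag -->
    <amp-img src="/images/dress-shirt.jpg" width="600" height="400" layout="responsive"></amp-img>
  </body>
</html>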

Mobile Site Structure

There are three main options you have when creating the mobile version of your site:

  • Mobile Subdomain: This is the most labor and time intensive option of the three since it requires building an entirely separate mobile website, hosted on a subdomain (normally something like mobile.example.com or m.example.com). Google won’t be able to tell that the subdomain indicates the site is just for mobile users, so you’ll have to use rel="canonical" tags on any duplicate pages. This method requires a lot of resources, more than the other two, and generally isn’t recommended.

  • Dynamic Design: This method detects the user agent and serves different HTML to mobile and desktop browsers. Use the Vary: User-Agent HTTP header to tell search engines that you will be serving different code based on user agent.

    Add this code if you’re working in PHP:

    <?php
    header("Vary: User-Agent, Accept");
    ?>
    

    To do this in Apache, add the following code to your .htaccess:

    Header append Vary User-Agent
    

    Add this code in functions.php if you’re working with WordPress:

    function add_vary_header($headers) {
        $headers['Vary'] = 'User-Agent';
        return $headers;
    }
    add_filter('wp_headers', 'add_vary_header');
    
  • Responsive Design: The simplest and easiest way to create a mobile version of your site, responsive design is Google’s recommended method. It just requires you to set the viewport meta tag. The viewport tells browsers what dimensions to use when displaying a page. Set the viewport to scale to the device to make your pages mobile friendly:

    <meta name="viewport" content=”width-device-width, initial-scale=1.0”/>
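Responsive layouts typically pair this viewport tag with CSS media queries that adapt the layout to the screen width. A minimal illustrative sketch (the class names are hypothetical):

/* Two columns on wide screens, a single column on small screens */
.content { width: 70%; float: left; }
.sidebar { width: 30%; float: left; }

@media (max-width: 600px) {
  .content, .sidebar { width: 100%; float: none; }
}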
    

Structured Data Markup

Structured data markup gives meaning to the content on your page so search engines can understand it. You can use Schema.org markup on your About page, for example, to tell search engines where to find your address, opening hours and phone number. Add it to product pages so search engines can easily find reviews and ratings of your products. If you’ve got a personal brand, add Schema markup to denote education, family and professional information.
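For example, a local business could describe its contact details with JSON-LD Schema.org markup in the page’s <head>; the business details below are placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Shirt Shop",
  "telephone": "+32 2 555 01 00",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Brussels",
    "postalCode": "1000"
  },
  "openingHours": "Mo-Fr 09:00-18:00"
}
</script>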

Schema.org markup won’t necessarily make you outrank a page that doesn’t use it. But it’s a big help for SEO because it’s used in Google’s rich snippets. The easiest way to see rich snippets live is to search for a recipe. In the search results you’ll see the normal search snippet: title, URL and description, along with a picture and star rating. Those last two are thanks to semantic markup.

So while semantic markup isn’t a ranking signal, it can help improve your search ranking. The better Google understands what’s on your page, the more likely you are to have better rankings. Plus, semantic markup also helps assistive applications like screen readers, improving your site’s user experience.

Final thoughts

The checklist above is a great way of conducting a manual technical SEO audit. Alternatively, you could use our very own WooRank auditing tool to instantly highlight areas that require attention. Use it in conjunction with this checklist to rectify any potential issues.

SEO can sometimes feel a little overwhelming, but one of the most effective ways to enhance performance is to work through the technical SEO checklist above. Keep in mind that a Search Engine’s main aim is to provide a great user experience, and it does this by finding the crème de la crème of web pages to serve to its users. By optimizing these technical SEO elements you are effectively telling Search Engines that your website is one of sheer quality. Master these technical aspects and you’ll notice improved visibility and increased traffic.