If you’ve had your website for any length of time, you’ve probably realized that organic search engine traffic is one of the most valuable traffic channels you can have. Even if you don’t have an online shop, your website can still drive a large number of in-store visits.

How do you get more search traffic?


Many of those new to SEO see it as some combination of keyword usage and backlinks. And while those are two important concepts in SEO, there’s a lot more to it.

In fact, the way you set up your website will have a big, big, big impact on how well it performs from an SEO perspective.

This is known as technical SEO.

A bit worried about the word "technical"?

Don’t be. We’ll set you up with some of the technical SEO basics that will make it easier for Google to find and display your site, and that will give your website a solid foundation moving forward as you add new content, pages and features.

  1. Crawlability

Crawling and indexing are the most basic fundamentals of SEO. Google can’t recommend a website if it can’t access, read or understand it. So the first focus of your technical SEO should be making sure your website is crawlable.

If you already have a website, you can check its crawlability and indexability a couple of ways, depending on whether or not you’ve set up Google Search Console:

  • Site: search operator: One of the easiest ways to check your site’s indexing is to do a quick search using the site: operator. Just type site:yourdomain.com into Google and it will return the pages it has indexed from your domain. If the results look like this, your site isn’t indexed:

Unindexed site search operator results

  • Google Search Console: If you want a more reliable number, and you’ve added your site in Google Search Console, just check the Index Status report under Google Index. Google will tell you the number of pages on your domain it has in its index.

Indexed pages in Google Search Console

If the number of indexed pages is much lower than you’d expect, or you see a sudden drop in the number of indexed pages, it’s time to get a shovel and do some digging. It’s likely that something on your site is blocking Google from crawling it.

Likely culprits include:

  • Problems with your robots.txt file. It’s possible that you have accidentally blocked your entire site, or a large chunk of it, with the syntax you’ve used in your disallow lines. You can check this pretty easily by clicking the Advanced button in the Index Status area of Search Console. This will show the number of pages on your site blocked by your robots.txt.

  • Issues with meta robots tags. Meta robots tags that include the "noindex" attribute tell search engines not to index a page. That’s great if you want an extra layer of protection for pages you don’t want indexed, and disastrous if you add them to pages you do want indexed. Use Site Crawl to find instances of noindex robots tags.

  • Crawl errors. Check for crawl errors in the Crawl section in GSC. As the name implies, this will list all of the errors Google encountered while trying to crawl your pages. These errors include pages blocked by robots.txt (which you’ve already seen), but also instances of server downtime and pages that return 404 errors. Even if some of your pages are still crawlable, sites with a lot of errors can be de-indexed, so you should check this section frequently.

  • Missing sitemap. Search engines can find pages even when they’re not in your sitemap, but it makes discovery much less likely. If you see that pages aren’t getting indexed, check your sitemap to make sure you aren’t leaving out any pages.
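If you suspect a robots.txt problem, you can test a rule set locally before (or after) deploying it. Here’s a minimal sketch using Python’s standard-library robotparser; the rules and URLs are illustrative, not taken from any real site:

```python
from urllib import robotparser

# A disallow rule that accidentally blocks the whole site -- a common
# misconfiguration when only one directory was meant to be excluded.
bad_robots = """\
User-agent: *
Disallow: /
"""

# The rule the author probably intended: block only /private/.
good_robots = """\
User-agent: *
Disallow: /private/
"""

def is_crawlable(robots_txt, url, agent="Googlebot"):
    """Check whether `agent` may fetch `url` under the given robots.txt."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(is_crawlable(bad_robots, "https://example.com/products/"))   # False
print(is_crawlable(good_robots, "https://example.com/products/"))  # True
```

Running a check like this against your live robots.txt is a quick sanity test before you go digging through Search Console reports.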

  2. Page Speed

Page speed is vital to a website’s user experience and, therefore, its SEO. Now, speed can mean something different to different people. Is it the amount of time it takes your site to completely load?

The time it takes for your server to start sending information?

The time it takes to load content above the fold?

For SEO purposes, page speed is measured as the time it takes a user’s browser to receive the first byte from your server: time to first byte (TTFB).
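To make TTFB concrete, here’s a small Python sketch that measures it. To stay self-contained it spins up a throwaway local server; in practice you’d point `measure_ttfb` at your own host (the handler and values here are illustrative, not a benchmark):

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """Tiny demo handler that returns a short plain-text body."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):
        pass  # keep the demo quiet

def measure_ttfb(host, port, path="/"):
    """Time from sending the request until the first response byte arrives."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns once the response starts arriving
    resp.read(1)               # force the first body byte
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Spin up a throwaway local server on an ephemeral port.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", port)
print(f"TTFB: {ttfb * 1000:.1f} ms")
```

Against a localhost server the number will be tiny; against your production server it includes DNS, network latency and server processing time, which is what you actually want to track.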

What do you do if you’ve got a slow website? Start with the basics:

  • Minification: Minification is the process of removing unnecessary data without changing the resource or how it’s loaded by the browser. So things such as removing code comments, using shorter variable names or removing unused code.

  • Compression: Compression, as you’re probably aware, reduces file size and download time by up to 90%, which is exactly what you want when optimizing page speed. Since modern browsers can automatically handle gzip compression, all you have to do is enable it on your server.

  • Caching: Caching headers tell browsers that, until a certain amount of time has passed, they should load the version of a resource already saved in the browser instead of downloading it again. This can significantly reduce load time for repeat visitors to your website. To learn how to add caching to your site, check out this guide from Google.
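Compression and caching are usually enabled in your web server configuration. As one illustration, here’s what they might look like in nginx; the directive names are standard nginx, but the file types and the seven-day lifetime are example values to adapt to your site:

```nginx
# Enable gzip compression for common text-based assets.
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Send long-lived caching headers for static assets.
location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 7d;
    add_header Cache-Control "public";
}
```

Apache offers equivalents via mod_deflate and mod_expires, and most managed hosts expose similar settings in their control panels.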

Even if you already have a fast site, you should still check for opportunities to use minification, compression and caching. There’s no such thing as a website that’s too fast.

  3. Mobile Friendliness

Mobile friendliness is the measure of how well a page is optimized to load on a mobile device (phone or tablet). It seeks to answer the question "will someone have a good user experience on this page using their phone?"

One major component of mobile friendliness is speed. While desktop users will wait a (relatively) long time for a page to load, you’ve only got a few seconds before mobile users will bounce. Almost half will give up on a page that doesn’t load in 3 seconds or less.

Fortunately for you, mobile speed optimizations are basically the same as normal speed optimizations; anything that reduces load time is good. If you’re primarily a content publisher, you should also consider Google’s AMP project to further reduce load times through the AMP cache.

Aside from speed, Google also wants to see pages that display well on mobile screens. It wants to see responsive design.

Responsive design options include:

  • Mobile subdomain — Creating a whole new version of a website, hosting it on a subdomain (mobile.example.com or m.example.com) and redirecting mobile users to it when they click through to a page. This generally isn’t recommended because it’s really labor- and resource-intensive.

  • Dynamic design — Detecting the user-agent and serving different HTML to mobile browsers. There’s a danger to this method, however, since serving different content based on user-agent can look like cloaking. Be sure to use the Vary: User-Agent HTTP header to tell Google you’re serving content based on user-agent to help avoid this issue.

  • Viewport meta tag — The simplest method of responsive design, this option requires you to add the viewport meta tag to your pages and set it to render based on screen width. A properly configured viewport looks like this: <meta name="viewport" content="width=device-width, initial-scale=1.0"/>. This tells browsers to scale pages up or down to fit on the device screen.
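The dynamic design option above can be sketched in a few lines. This is a hypothetical handler, not a production user-agent detector: the HTML payloads and the token list are illustrative, and real sites typically lean on a maintained device-detection library instead.

```python
# Illustrative HTML payloads for the two layouts.
DESKTOP_HTML = "<html><body>Full desktop layout</body></html>"
MOBILE_HTML = "<html><body>Slimmed-down mobile layout</body></html>"

# A naive (illustrative) list of tokens that suggest a mobile browser.
MOBILE_TOKENS = ("Mobile", "Android", "iPhone", "iPad")

def serve_page(user_agent):
    """Return (headers, body) for a request with the given User-Agent.

    The Vary: User-Agent header tells caches and crawlers that the
    response differs by user-agent, which helps avoid the appearance
    of cloaking.
    """
    is_mobile = any(tok in user_agent for tok in MOBILE_TOKENS)
    headers = {"Content-Type": "text/html", "Vary": "User-Agent"}
    body = MOBILE_HTML if is_mobile else DESKTOP_HTML
    return headers, body

headers, body = serve_page("Mozilla/5.0 (iPhone; CPU iPhone OS 15_0) Mobile Safari")
print(headers["Vary"])  # User-Agent
```

The key detail is that the Vary header goes out on every response, desktop and mobile alike, so intermediaries never serve the wrong variant.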

  4. Structured Data

Google is smart, but it can’t fully understand human language yet (although it’s getting there). It needs some help in the form of structured data.

Structured data adds context to the content on a page and helps search engines and other machines better interpret what that text means. You can use it to tell search engines about your store’s location, business hours, accepted payment options and prices. Or, if you don’t have a physical location, annotate your site navigation, product data and customer ratings.
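For a physical store, that markup might look like the JSON-LD fragment below, dropped into the page head. The types and property names (LocalBusiness, PostalAddress, openingHours, paymentAccepted, priceRange) come from schema.org; the business details are made up for the example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Hardware Store",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield"
  },
  "openingHours": "Mo-Sa 09:00-18:00",
  "paymentAccepted": "Cash, Credit Card",
  "priceRange": "$$"
}
</script>
```

JSON-LD is the format Google recommends because it lives in a single script block rather than being woven through your HTML attributes.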

This structured data will often be reflected in the search results in the form of rich snippets:

Product data in SERP rich snippet

Structured data is also a boon to content publishers.

Adding structured data to content through taxonomies like schema.org will help Google understand what topics your articles are about. For example, this article is marked up to add context surrounding its topic of "technical SEO" including:

  • Meta robots tags

  • Crawl errors

  • XML sitemaps

  • Page speed and TTFB

  • Mobile friendliness

  • Structured data

This structure helps Google understand how all of these terms are related to each other.

Move On to Advanced SEO

This is by no means an exhaustive list of technical SEO tasks. But your SEO success is predicated on getting these basics right and building from there.

Once you’ve mastered these four technical SEO basics it’s time to move on to more advanced SEO concepts like dealing with duplicate content, semantic SEO and link building.