This Keyword Cloud provides insight into the frequency of keyword usage within the page.
It's important to carry out keyword research to get an understanding of the keywords that your audience is using. There are a number of keyword research tools available online to help you choose which keywords to target.
This is the number of pages on your website that are indexed by search engines. Aim to have all of your web pages crawled and indexed by the search engines, as this gives your website more opportunities to be found.
A low number (relative to the total number of pages/URLs on your website) indicates that there is an issue, whether it's due to a bad internal linking structure, or you're unknowingly preventing search engines from crawling your pages.
Make sure your website's XML sitemap is present and you have submitted it to the major search engines. Building backlinks to your website's internal pages will also help bots to discover, crawl and index them, while building authority to help them rank in the search engines.
Check Google™ Webmaster Tools under 'Google Index' and 'Crawl' to keep track of the status of your site's indexed/crawled pages.
Great, your website has a robots.txt file.
A robots.txt file allows you to restrict the access of search engine robots that crawl the web and it can prevent these robots from accessing specific directories and pages. It also specifies where the XML sitemap file is located.
You can check for errors in your robots.txt file using Google Search Console (formerly Webmaster Tools) by selecting 'Robots.txt Tester' under 'Crawl'. This also allows you to test individual pages to make sure that Googlebot has the appropriate access.
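As a sketch, a minimal robots.txt follows this structure; the disallowed paths and the sitemap URL below are illustrative placeholders, not taken from this site:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is how you point crawlers at your XML sitemap directly from robots.txt, as recommended below.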
Great, your website has an XML sitemap.
A sitemap lists URLs that are available for crawling and can include additional information like your site's latest updates, frequency of changes and importance of the URLs. This allows search engines to crawl the site more intelligently.
It is also good practice to specify your sitemap's location in your robots.txt file.
It's important to only include pages that you want the search engines to crawl, so avoid any that have been blocked via your robots.txt file. Check the URLs to ensure that none of them cause redirects or return error codes. This includes being consistent with your URLs, for example, including your preferred URLs (with or without www.), including the correct protocol (http or https) and making sure URLs all end with or without a trailing slash.
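For illustration, a minimal XML sitemap entry looks like the following; the domain, date, and values are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Note that every `<loc>` URL should use your preferred, canonical form (protocol, www prefix, and trailing slash all consistent).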
Good, the URLs look clean.
Created 3 years ago
Expires in 10 years
Did you know that you can register your domain for up to 10 years? Doing so shows the world that you are serious about your business.
Your website does not have a blog.
In today's competitive internet marketing landscape, content marketing rules. While publishing your content on other sites can be a good strategy, publishing it on your own site yields greater benefits.
Starting a blog is a great way to engage with your audience and increase your online visibility by attracting qualified traffic from new sources.
Use our tips to optimize your blog and improve performance.
If you don't feel that a blog is right for your business, consider other ways to build useful evergreen content, such as online guides and whitepapers.
This website is not optimized for mobile visitors
Perfect, the most important buttons/links are large enough to be tapped easily.
Perfect, no embedded objects detected.
Font Size Legibility
This web page’s text is too small for legibility on mobile devices.
- Use a base font size of 16 CSS pixels.
- Use sizes relative to the base size to define the typographic scale.
- The general recommendation for line spacing is 1.2 times the font size (1.2em).
- Restrict the number of fonts used and the typographic scale.
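The tips above can be sketched as a small CSS baseline; the selectors and scale values here are illustrative, not taken from this site:

```css
/* Illustrative typographic base following the recommendations above. */
body {
  font-size: 16px;   /* base size of 16 CSS pixels */
  line-height: 1.2;  /* line spacing of 1.2 times the font size */
}

/* Sizes relative to the base define the typographic scale. */
h1    { font-size: 2em; }
h2    { font-size: 1.5em; }
small { font-size: 0.875em; }
```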
This page does not specify a viewport, or the viewport is not well configured.
The content fits within the specified viewport size.
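A well-configured viewport is typically declared with a single meta tag in the page's `<head>`, along these lines:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

`width=device-width` matches the page to the device's screen width, and `initial-scale=1` sets the initial zoom level so mobile browsers do not shrink the page.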
Avoid landing page redirects
Leverage browser caching
Reduce server response time
See Google's PageSpeed Insights Rules for more information on how to improve each of the elements in this section.
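As one way to leverage browser caching, a server configuration sketch for Apache (assuming the mod_expires module is available; lifetimes are illustrative) might look like:

```apache
# Assumes Apache with mod_expires enabled; adjust lifetimes to your needs.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png              "access plus 1 year"
  ExpiresByType image/jpeg             "access plus 1 year"
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

Longer cache lifetimes mean returning visitors re-download fewer static assets, which reduces page load time and server load.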
Custom 404 Page
Your website does not have a custom 404 Error Page.
Your site does not appear to have a custom 404 error page, which is bad for usability.
Take the opportunity to provide visitors with a beautiful and helpful 404 Error Page to increase user retention.
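On an Apache server (assumed here for illustration), pointing visitors at a custom error page is a one-line directive; the file path is a placeholder:

```apache
# Serve a custom, helpful page for "not found" errors.
ErrorDocument 404 /404.html
```

The 404 page itself should keep your site's navigation and offer a search box or links to popular pages, so visitors can recover instead of leaving.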
Structured Data Markup
No Structured Data Markup has been detected
Google™ supports a number of rich snippets for content types, including: Reviews, People, Products, Businesses and Organizations, Recipes, Events, Videos, and Music. If your website covers one of these topics, then we suggest that you annotate it with Schema.org markup (for example, microdata or JSON-LD).
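A minimal JSON-LD sketch for an Organization follows; the name and URL below are placeholders to be replaced with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com"
}
</script>
```

This block goes in the page's `<head>` or `<body>`; search engines read it without it affecting what visitors see.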
Register the various extensions of your domain (.net, .org, and so on) to protect your brand from cybersquatters.
Register common misspellings (typo domains) of your domain to protect your brand from cybersquatters.
Great, your website is SSL secured (HTTPS).
Your website's URLs do not redirect to HTTPS pages.
Your headers are not properly set up to use HSTS (HTTP Strict Transport Security).
The SSL certificate expires in 4 months.
The certificate issuer is GoDaddy.com, Inc.
While switching to HTTPS, make sure your site remains optimized and continues to load quickly. Follow these best practices for a smooth transition:
- Purchase your SSL certificate from a reputable certificate authority
- Redirect all of your HTTP pages to the HTTPS version of your website
- Enable HTTP Strict Transport Security (HSTS) in your headers
- Renew your SSL certificate every year, before it expires
- Make sure that all of your content (CSS, etc.) is linked to HTTPS
- Update your XML sitemap to ensure the URLs include HTTPS and update the robots.txt file to reference this version
- Register the HTTPS website in Google & Bing Webmaster Tools
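Two of the steps above (redirecting HTTP to HTTPS and enabling HSTS) can be sketched for an Apache server, assuming mod_rewrite and mod_headers are enabled; the max-age value is illustrative:

```apache
# Redirect all HTTP requests to the HTTPS version of the site.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# HSTS: tell browsers to use HTTPS only, for one year (send over HTTPS only).
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains" env=HTTPS
```

Once HSTS is in place, browsers that have seen the header will refuse plain-HTTP connections to your site until max-age expires, so enable it only after the HTTPS setup is confirmed working.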
128,888th most visited website in the world