This Keyword Cloud provides insight into how frequently keywords are used within the page.
It's important to carry out keyword research to get an understanding of the keywords that your audience is using. There are a number of keyword research tools available online to help you choose which keywords to target.
This table highlights the importance of being consistent with your use of keywords. To improve the chance of ranking well in search results for a specific keyword, make sure you include it in some or all of the following: page URL, page content, title tag, meta description, header tags, image alt attributes, internal link anchor text and backlink anchor text.
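As an illustrative sketch (the keyword, page name and file paths below are hypothetical), a target keyword such as "handmade candles" could appear consistently across these elements:

```html
<!-- Hypothetical page at https://www.example.com/handmade-candles (keyword in the URL) -->
<title>Handmade Candles | Example Shop</title>                          <!-- title tag -->
<meta name="description" content="Browse our range of handmade candles made from natural wax.">
<h1>Handmade Candles</h1>                                               <!-- header tag -->
<img src="/images/handmade-candles.jpg" alt="Handmade candles on a shelf"> <!-- image alt -->
<a href="/handmade-candles">handmade candles</a>                        <!-- anchor text -->
```

The point is not to repeat the keyword mechanically, but to make sure each of the elements listed above is consistent with the topic of the page.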
Warning: no 301 redirects are in place to route traffic to your preferred domain. Pages that load successfully both with and without www. are treated as duplicate content!
Redirecting requests from a non-preferred hostname is important because search engines consider URLs with and without "www" as two different websites.
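For example, on an Apache server the non-www hostname could be permanently redirected to the www variant with a rule like the following (a sketch assuming mod_rewrite is enabled and example.com stands in for your domain):

```apache
RewriteEngine On
# Redirect requests for example.com to www.example.com with a 301 (permanent) status
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The equivalent can be configured on other servers (e.g., an nginx `return 301` in the non-preferred server block); the key point is that the redirect uses status 301 so search engines consolidate the two hostnames.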
Great, your website has a robots.txt file.
A robots.txt file allows you to restrict the access of search engine robots that crawl the web and it can prevent these robots from accessing specific directories and pages. It also specifies where the XML sitemap file is located.
You can check for errors in your robots.txt file using Google Search Console (formerly Webmaster Tools) by selecting 'robots.txt Tester' under 'Crawl'. This also allows you to test individual pages to make sure that Googlebot has the appropriate access.
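A minimal robots.txt might look like this (the blocked directories and sitemap URL are placeholders for your own):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` block applies to all crawlers, each `Disallow` line keeps robots out of a path, and the `Sitemap` line tells search engines where to find your XML sitemap.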
Your website does not have an XML sitemap - this can be problematic.
A sitemap lists URLs that are available for crawling and can include additional information like your site's latest updates, frequency of changes and importance of the URLs. This allows search engines to crawl the site more intelligently.
We recommend that you generate an XML sitemap for your website and submit it to both Google Search Console and Bing Webmaster Tools. It is also good practice to specify your sitemap's location in your robots.txt file.
It's important to include only pages that you want search engines to crawl, so leave out any that have been blocked via your robots.txt file. Check the URLs to ensure that none of them cause redirects or return error codes, and be consistent: use your preferred URLs (with or without www.), use the correct protocol (http or https), and make sure every URL consistently ends with or without a trailing slash.
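The sitemap itself is a small XML file following the sitemaps.org protocol. A minimal sketch (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/products</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>` and `<priority>` are the optional hints mentioned above.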
Length: 9 character(s)
Keep your URLs short and avoid long domain names when possible.
A descriptive URL is better recognized by search engines. A user should be able to look at the address bar and make an accurate guess about the content of the page before reaching it (e.g., http://www.mysite.com/en/products).
Custom 404 Page
Your website does not have a custom 404 Error Page.
It appears your site does not serve a custom 404 Error Page, which hurts usability.
Take the opportunity to provide visitors with a beautiful and helpful 404 Error Page to increase user retention.
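On Apache, for instance, a custom error page can be wired up with a single directive (assuming your page lives at /404.html):

```apache
# Serve /404.html whenever a request returns a 404 status
ErrorDocument 404 /404.html
```

Make sure the page itself still returns a 404 status code (not 200), so search engines don't index the error page as regular content.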
We could not find a print-friendly CSS style sheet.
This is a special style sheet that strips out unnecessary interface elements and images when pages from your site are printed, saving the user a lot of ink.
It is just another way to provide a rich user experience.
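A print style sheet can be as simple as an `@media print` block; the selectors below are hypothetical placeholders for your site's own navigation and ad elements:

```css
@media print {
  /* Hide interface chrome that wastes ink on paper */
  nav, footer, .sidebar, .ad-banner { display: none; }
  /* Black text on white, sized for print */
  body { color: #000; background: #fff; font-size: 12pt; }
  /* Show link targets after the link text, since paper can't be clicked */
  a[href]::after { content: " (" attr(href) ")"; }
}
```

Alternatively, the same rules can live in a separate file referenced with `<link rel="stylesheet" href="/css/print.css" media="print">`.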
Register the various extensions of your domain to protect your brand from cybersquatters.
Likewise, register common typo variants of your domain so that mistyped addresses reach you rather than a cybersquatter.