Marketing Checklist: Top priorities for lapolar.cl
79 personalized tasks for online marketing success!
This Keyword Cloud provides an insight into the frequency of keyword usage within the page.
It's important to carry out keyword research to get an understanding of the keywords that your audience is using. There are a number of keyword research tools available online to help you choose which keywords to target.
This is the number of pages that we have discovered on your website.
A low number can indicate that bots are unable to discover your webpages. This is commonly caused by poor site architecture and internal linking, or by unknowingly preventing bots and search engines from crawling and indexing your pages.
Make sure your website's XML sitemap is present and you have submitted it to the major search engines. Building backlinks to your website's internal pages will also help bots to discover, crawl and index them, while building authority to help them rank in the search engines.
Check Google™ Search Console under 'Google Index' and 'Crawl' to keep track of the status of your site's indexed/crawled pages.
Warning: no 301 redirects are in place to redirect traffic to your preferred domain. Pages that load successfully both with and without www. are treated as duplicate content!
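As a sketch, a permanent redirect can be set up with an Apache .htaccess rule (this assumes Apache with mod_rewrite enabled and that the bare domain is preferred; swap the hostnames if you prefer www.):

```apache
# Permanently (301) redirect all www. requests to the bare domain.
# Hostnames below are illustrative; adjust to your preferred domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.lapolar\.cl$ [NC]
RewriteRule ^(.*)$ https://lapolar.cl/$1 [R=301,L]
```

On nginx, the equivalent is a separate `server` block for the non-preferred hostname that returns `301 https://lapolar.cl$request_uri;`.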
Great, your website has a robots.txt file.
A robots.txt file allows you to restrict the access of search engine robots that crawl the web and it can prevent these robots from accessing specific directories and pages. It also specifies where the XML sitemap file is located.
You can check for errors in your robots.txt file using Google Search Console (Formerly Webmaster Tools) by selecting 'Robots.txt Tester' under 'Crawl'. This also allows you to test individual pages to make sure that Googlebot has the appropriate access.
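A minimal robots.txt illustrating both capabilities looks like this (the disallowed paths are placeholders, not lapolar.cl's actual rules):

```
# Apply to all crawlers; block example private sections.
User-agent: *
Disallow: /admin/
Disallow: /checkout/

# Advertise the XML sitemap location.
Sitemap: https://www.lapolar.cl/sitemap.xml
```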
Your website does not have an XML sitemap - this can be problematic.
A sitemap lists URLs that are available for crawling and can include additional information like your site's latest updates, frequency of changes and importance of the URLs. This allows search engines to crawl the site more intelligently.
We recommend that you generate an XML sitemap for your website and submit it to both Google™ Search Console and Bing Webmaster Tools. It is also good practice to specify your sitemap's location in your robots.txt file.
It's important to only include pages that you want the search engines to crawl, so leave out any that have been blocked via your robots.txt file. Check the URLs to ensure that none of them trigger redirects or return error codes, and be consistent: use your preferred hostname (with or without www.), use the correct protocol (http or https), and make sure every URL consistently ends with or without a trailing slash.
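A minimal XML sitemap following the sitemaps.org protocol looks like the sketch below (the URL, date, and frequency values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.lapolar.cl/</loc>
    <!-- Optional hints for crawlers; values here are examples only. -->
    <lastmod>2024-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints that search engines may ignore.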
Good, the URLs look clean.
Frames can cause problems on your web page because search engines will not crawl or index the content within them. Avoid frames whenever possible and use a &lt;noframes&gt; tag if you must use them.
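If frames are unavoidable, the fallback pattern looks like this (note that framesets are HTML 4/XHTML only and were removed in HTML5; file names are placeholders):

```html
<frameset cols="200,*">
  <frame src="menu.html">
  <frame src="content.html">
  <!-- Crawlers and non-frame browsers see this content instead. -->
  <noframes>
    <body>
      <p>Browse our catalog: <a href="menu.html">Menu</a>,
         <a href="content.html">Products</a></p>
    </body>
  </noframes>
</frameset>
```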
Length: 7 character(s)
Keep your URLs short and avoid long domain names when possible.
A descriptive URL is better recognized by search engines. A user should be able to look at the address bar and make an accurate guess about the content of the page before reaching it (e.g., http://www.mysite.com/en/products).
Structured Data Markup
No Structured Data Markup has been detected
Google™ supports a number of rich snippets for content types, including Reviews, People, Products, Businesses and Organizations, Recipes, Events, Videos, and Music. If your website covers one of these topics, we suggest that you annotate it with Schema.org using microdata.
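For a retail site, a Product annotated with Schema.org microdata might look like this sketch (names and prices are placeholders, not real catalog data):

```html
<!-- Illustrative Schema.org Product markup using microdata. -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="priceCurrency" content="CLP">$</span>
    <span itemprop="price" content="19990">19.990</span>
    <link itemprop="availability" href="https://schema.org/InStock">
  </div>
</div>
```

You can validate markup like this with Google's Rich Results Test before deploying it site-wide.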
Register the various extensions of your domain to protect your brand from cybersquatters.
Warning: your website's speed should be improved.
Perfect, your server is using a caching method to speed up page display.
Too bad, your website is using nested tables, which can slow down page rendering.
Too bad, your website is using inline styles.
Too bad, your website has too many CSS files (more than 4).
Too bad, your website does not take advantage of gzip.
Website speed has a huge impact on performance, affecting user experience, conversion rates and even rankings. By reducing page load-times, users are less likely to get distracted and the search engines are more likely to reward you by ranking your pages higher in the SERPs.
Websites that load faster than their competitors typically see significantly higher conversion rates.
See Google's PageSpeed Insights Rules for more information on how to improve each of the elements in this section.
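Two of the issues flagged above, missing gzip compression and asset caching, are commonly addressed at the server level. As a sketch, assuming an nginx server (directive values are example choices, not requirements):

```nginx
# Enable gzip compression for compressible text assets.
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;

# Serve static assets with long-lived cache headers.
location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```

On Apache, the equivalents are mod_deflate for compression and mod_expires for cache headers. Consolidating the 4+ CSS files into one or two bundles also reduces request overhead.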
Great, language/character encoding is specified: ISO-8859-1
Specifying language/character encoding can prevent problems with the rendering of special characters.
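The encoding can be declared either in the HTTP `Content-Type` response header or, as shown below, with a meta tag placed early inside `<head>` (ISO-8859-1 matches what was detected on this site; UTF-8 is the more common modern choice):

```html
<meta charset="ISO-8859-1">
```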
61,806th most visited website in the world