Marketing Checklist: Top priorities for nabi.cat
79 personalized tasks for online marketing success!
|<H2> Cercavila “Per un país de tots, decidim escola catalana”|
|<H2> La ginesta altra vegada…|
|<H2> Colònies de 5è a Esterri d’Àneu|
|<H2> Jocs Florals 2015|
|<H2> Casal d’estiu Nabí 2015|
|<H2> Colònies 1r i 2n a Mas Suro|
|<H2> Les fotos de la Nabígresca|
|<H2> Guanyador del sorteig Nabígresca|
|<H2> Marató de Lectura|
|<H2> Nabígresca: Diumenge 19 d’abril|
|<H2> Marató de lectura|
|<H3> Subscriu-te al butlletí|
|<H3> Comentaris recents|
Great, your website is structured using HTML headings (<H1> to <H6>).
Use your keywords in the headings and make sure the first level (<H1>) includes your most important keywords. Never duplicate your title tag content in your header tag.
While it is important to ensure every page has an <H1> tag, never include more than one per page. Instead, use multiple <H2> - <H6> tags.
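As a sketch, a well-structured page on this site might arrange its existing headings like this (the `<h1>` text is a hypothetical example, not taken from the audited page):

```html
<!-- One <h1> carrying the main keyword, with the page's
     existing headings nested as <h2>/<h3> below it -->
<h1>Escola Nabí</h1>
  <h2>Colònies de 5è a Esterri d’Àneu</h2>
  <h2>Casal d’estiu Nabí 2015</h2>
    <h3>Subscriu-te al butlletí</h3>
```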
This Keyword Cloud provides an insight into the frequency of keyword usage within the page.
It's important to carry out keyword research to get an understanding of the keywords that your audience is using. There are a number of keyword research tools available online to help you choose which keywords to target.
|Keywords (2 words)||Freq||Title||Desc||<H>|
This table highlights the importance of being consistent with your use of keywords. To improve the chance of ranking well in search results for a specific keyword, make sure you include it in some or all of the following: page URL, page content, title tag, meta description, header tags, image alt attributes, internal link anchor text and backlink anchor text.
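To illustrate, a hypothetical target keyword such as "escola catalana" could appear consistently across these placements (all values below are invented examples):

```html
<!-- Hypothetical placements for one target keyword -->
<title>Escola catalana Nabí</title>
<meta name="description" content="Nabí és una escola catalana.">
<h1>Escola catalana</h1>
<img src="cercavila.jpg" alt="Cercavila per l'escola catalana">
<a href="/escola-catalana/">La nostra escola catalana</a>
```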
Great, a redirect is in place to redirect traffic from your non-preferred domain.
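Such a redirect is commonly set up at the server level. A minimal sketch for an Apache `.htaccess` file, assuming the bare domain is the preferred version:

```apache
# Hypothetical .htaccess: 301-redirect the non-preferred
# www host to the bare domain (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.nabi\.cat$ [NC]
RewriteRule ^(.*)$ http://nabi.cat/$1 [R=301,L]
```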
Great, your website has a robots.txt file.
A robots.txt file allows you to restrict the access of search engine robots that crawl the web and it can prevent these robots from accessing specific directories and pages. It also specifies where the XML sitemap file is located.
You can check for errors in your robots.txt file using Google Search Console (formerly Webmaster Tools) by selecting 'robots.txt Tester' under 'Crawl'. This also lets you test individual pages to make sure that Googlebot has the appropriate access.
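A minimal robots.txt looks like this (the blocked directory is a hypothetical example):

```text
# Hypothetical robots.txt: block one private directory for all
# crawlers and point them at the XML sitemap
User-agent: *
Disallow: /wp-admin/
Sitemap: http://nabi.cat/sitemap.xml
```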
Great, your website has an XML sitemap.
A sitemap lists URLs that are available for crawling and can include additional information like your site's latest updates, frequency of changes and importance of the URLs. This allows search engines to crawl the site more intelligently.
It is also good practice to specify your sitemap's location in your robots.txt file.
It's important to only include pages that you want the search engines to crawl, so avoid any that have been blocked via your robots.txt file. Check the URLs to ensure that none of them cause redirects or return error codes. This includes being consistent with your URLs, for example, including your preferred URLs (with or without www.), including the correct protocol (http or https) and making sure URLs all end with or without a trailing slash.
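A single-URL XML sitemap entry, as a sketch (the `lastmod`, `changefreq`, and `priority` values are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://nabi.cat/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```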
Good, the URLs look clean.
Frames can cause problems on your web page because search engines will not crawl or index the content within them. Avoid frames whenever possible and provide a <noframes> fallback if you must use them.
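If frames are unavoidable, the fallback goes inside the frameset, as in this hypothetical sketch (file names are invented):

```html
<!-- Legacy HTML 4 frameset with a <noframes> fallback that
     search engines and non-frame browsers can read -->
<frameset cols="25%,75%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <body>Fallback content that search engines can index.</body>
  </noframes>
</frameset>
```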
Created 5 years ago
Did you know that you can register your domain for up to 10 years? Doing so shows the world that you are serious about your business.
Length: 4 character(s)
Keep your URLs short and avoid long domain names when possible.
A descriptive URL is better recognized by search engines. A user should be able to look at the address bar and make an accurate guess about the content of the page before reaching it (e.g., http://www.mysite.com/en/products).
Structured Data Markup
No Structured Data Markup has been detected
Google™ supports a number of rich snippets for content types, including: Reviews, People, Products, Businesses and Organizations, Recipes, Events, Videos, and Music. If your website covers one of these topics, we suggest annotating it with Schema.org using microdata.
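Since the page announces events such as the Nabígresca, a Schema.org Event annotated with microdata would be a natural fit. A sketch, with the location name assumed for illustration:

```html
<!-- Hypothetical Schema.org Event markup using microdata -->
<div itemscope itemtype="http://schema.org/Event">
  <h2 itemprop="name">Nabígresca</h2>
  <time itemprop="startDate" datetime="2015-04-19">Diumenge 19 d’abril</time>
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Escola Nabí</span>
  </span>
</div>
```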
|nabi.com||Expires in 4 months|
|nabi.net||Expires in 3 months|
|nabi.org||This domain is booked|
|nabi.info||This domain is booked|
|nabi.biz||Expires in a year|
|nabi.eu||This domain is booked|
Register the various extensions of your domain to protect your brand from cybersquatters.
Register the various typos of your domain to protect your brand from cybersquatters.
The Doctype is used to instruct web browsers about the document type being used. For example, what version of HTML the page is written in.
Declaring a doctype helps web browsers to render content correctly.
Great, language/character encoding is specified: UTF-8
Specifying language/character encoding can prevent problems with the rendering of special characters.
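Both the doctype and the character encoding are declared at the top of every page. A minimal HTML5 skeleton (the title text is a hypothetical example):

```html
<!DOCTYPE html>
<html lang="ca">
  <head>
    <!-- Declare UTF-8 before any text content so special
         characters (à, è, í…) render correctly -->
    <meta charset="UTF-8">
    <title>Escola Nabí</title>
  </head>
  <body>
  </body>
</html>
```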
Warning! Your website is not SSL secured (HTTPS).
- Purchase your SSL certificate from a reputable certificate authority (CA)
- Redirect all of your HTTP pages to the HTTPS version of your website
- Set an HTTP Strict Transport Security (HSTS) header in your responses
- Renew your SSL certificate before it expires
- Make sure that all of your content (CSS, etc.) is linked to HTTPS
- Update your XML sitemap to ensure the URLs include HTTPS and update the robots.txt file to reference this version
- Register the HTTPS website in Google & Bing Search Console/Webmaster Tools.
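The redirect and HSTS steps above can be sketched as Apache configuration, assuming a certificate is already installed (directives require mod_rewrite and mod_headers):

```apache
# Hypothetical Apache config: force HTTPS site-wide
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://nabi.cat/$1 [R=301,L]

# HTTP Strict Transport Security: tell browsers to use
# HTTPS only for the next year, including subdomains
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```

The `max-age` value is a common choice, not a requirement; start with a shorter value while testing so a misconfiguration does not lock visitors out of the HTTP site.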