This case is a bit more complicated, and although it usually appears in eCommerce, we also see it on content websites: for example, with all the parameters that a product customization option generates.
The first thing we are going to do in this case is solve these problems with 301 redirects that tell Google the correct version of our content, the one we want it to index.
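As an illustrative sketch (assuming an Apache server and a hypothetical `color` customization parameter, neither of which is named above), a 301 redirect from the parameterized URL back to the clean product URL could look like this in `.htaccess`:

```apache
# Hypothetical example: any /product/<name> URL carrying a "color"
# customization parameter is 301-redirected to the clean URL.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)color= [NC]
RewriteRule ^product/([a-z0-9-]+)$ /product/$1? [R=301,L]
```

The trailing `?` in the target drops the query string, so the redirect lands on the version we want indexed.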
The next step is to choose between several techniques:
- Add a noindex robots meta tag to the duplicate pages, leaving them out of the index.
- Use the “rel=canonical” tag to tell the robot which page is the correct one (being careful, of course, when the canonical points to a different domain, such as from www.domainone.com to www.domaintwo.com).
- Add a special character to our dynamic URLs: the hash (#), a technique for parameterizing AJAX pages. Google does not index what comes after it, leaving the URL as www.example.com/#!personalization-product1-whatever2.
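For illustration, assuming https://www.example.com/product1 is the version we want indexed (an invented URL), the first two techniques are written as tags in the duplicate page's `<head>`, roughly like this:

```html
<head>
  <!-- Option 1: point robots at the version we want indexed -->
  <link rel="canonical" href="https://www.example.com/product1" />
  <!-- Option 2: keep this duplicate out of the index entirely -->
  <meta name="robots" content="noindex" />
</head>
```

In practice you would pick one of the two for a given page: canonical consolidates signals onto the chosen URL, while noindex simply removes the duplicate from results.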
Caution: do not use your server's robots.txt to block crawling of those folders. If robots cannot crawl those pages, they cannot detect that those URLs lead to the same content, and they will therefore treat them as separate pages.
The first problem falls within the domain options: a visitor sees the same website under slightly different domains. Here we find everything from example.com versus example.net to the http and https versions.
One way to fix this is with 301 redirects, in other words, telling Google which of all those domains is the “good” one.
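A minimal sketch of that consolidation, assuming an Apache server and www.example.com as the chosen “good” domain (both assumptions, not stated above):

```apache
# Hypothetical .htaccess sketch: send every variant
# (example.net, http, missing www) to https://www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC,OR]
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Any request whose host or scheme differs from the chosen version gets a single 301 to the canonical domain.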
We can also use the canonical tag, telling search engines which version of the website is the “primary” one, without affecting users.
On the other hand, one of the cases I particularly remember is having and not having “www.” before our domain. There, Google Webmaster Tools is very helpful for telling the search engine which version it should treat as the main domain.
Titles and descriptions
Now a simpler mistake.
We have the same title and the same description on every page, either because, for example, we sell spare parts for electronic products and they are all very similar, or for whatever other reason.
The problem here is that this practice not only leaves Google without a clear idea of what to show in search results, it also confuses users, hurting the CTR.
To fix it we have several options… we can use everything from Google Webmaster Tools to ScreamingFrog to identify which titles and descriptions are duplicated, and from there, it’s a piece of craftsmanship (as I like to say).
It’s about rewriting every title and description we can, thinking about the people who are going to visit us.
The ideal measurements? Between 120 and 140 characters for the meta description and 60 for the title.
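As a hypothetical example within those measurements (the product and shop names are invented), a spare-parts page might carry tags like these:

```html
<head>
  <!-- Title: unique per page, around 60 characters -->
  <title>Replacement battery for AcmePhone X3 | Acme Spares</title>
  <!-- Meta description: unique, between 120 and 140 characters -->
  <meta name="description"
        content="Original replacement battery for the AcmePhone X3, with 12-month warranty, free shipping in 24 hours and easy installation guide." />
</head>
```

Even for near-identical products, a detail the user cares about (model, warranty, shipping) is usually enough to make each snippet distinct.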