URL parameters used for link tracking matter at the SEO level and are part of any content positioning strategy. To improve your results in search engines, you need to understand what they are and how they work, as well as how to address the problems their use can create.
Google explains that a URL parameter is a way of passing information about a click through the URL itself. The first parameter always comes after a question mark. A typical example: http://example.com?product=1234&utm_source=google.
How to use link tracking parameters
When a URL is parameterized, it is easier to find specific results because you can differentiate between them. Imagine you visit an e-commerce store and want to see clothes filtered by size. The URL parameters then carry the data used to filter the listings. Example: tiendaderopa.com/camisetas?talle=xl (here "talle=xl" filters the t-shirts by size XL).
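In Python, the standard library's urllib.parse can build a parameterized URL like the one above. The domain and parameter names here simply mirror the article's example; they are not a real site's API:

```python
from urllib.parse import urlencode

# Build a filtered URL by appending a content parameter to a base URL.
# "talle=xl" is the size filter from the article's example.
base = "https://tiendaderopa.com/camisetas"
params = {"talle": "xl"}
url = f"{base}?{urlencode(params)}"
# url == "https://tiendaderopa.com/camisetas?talle=xl"
```

urlencode also takes care of escaping values that contain spaces or special characters, so the resulting URL is always valid.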
URL tracking parameters allow you to filter and organize the content of a website, and also to track different publications, articles or topics. They can take different keys and values, and can be combined in various ways.
Tracking parameters and content parameters
Content parameters change the information displayed on the web page; they are what let you filter between categories within an online store. Tracking parameters, by contrast, record data. They can store different kinds of information, from the network the user came from to the campaign or ad group they clicked on. They do not modify the content of the page and can even be configured in a personalized way. With advanced tracking parameters, you can follow the user's experience on your page in much greater detail.
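The distinction can be sketched in Python. The utm_* naming convention is real and widely used, but the exact set of "tracking" keys below is an illustrative assumption:

```python
from urllib.parse import urlsplit, parse_qsl

# Illustrative list of well-known tracking keys; real sites may use others.
TRACKING_KEYS = {"gclid", "fbclid"}

def split_params(url: str):
    """Separate a URL's query string into content and tracking parameters."""
    pairs = parse_qsl(urlsplit(url).query)
    tracking = {k: v for k, v in pairs
                if k in TRACKING_KEYS or k.startswith("utm_")}
    content = {k: v for k, v in pairs if k not in tracking}
    return content, tracking

split_params("https://example.com/camisetas?talle=xl&utm_source=google")
# → ({'talle': 'xl'}, {'utm_source': 'google'})
```

Here "talle=xl" changes what the page shows (a content parameter), while "utm_source=google" only records where the click came from (a tracking parameter).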
When can the use of parameters become an SEO problem?
There is a widespread view that URL parameters should be used sparingly. As useful as they are, they can cause problems with content crawlability and indexing, resulting in poor search engine rankings.
If a URL parameter is badly structured, or parameters are used purely for tracking (passive), the result can be an endless set of URLs that all repeat the same content. That is why you should consider ways to clean up these parameters. The most common problems caused by misconfigured parameters include the following situations.
Duplicate content
Since search engines treat each URL as an independent page, several versions of the same page created by URL parameters can lead to a penalty. This counts as duplicate content and ends up hurting the overall performance of the website. Pages reordered via a URL parameter are usually very similar to the original, and some parameters lead to exactly the same content as the original page.
Keyword cannibalization
A filtered version of the original URL targets the same set of keywords, so several pages compete to rank for the same term. The search engine then struggles to choose which page should rank first for that particular keyword.
Waste of crawl resources
Complex, multi-parameter setups generate many different URLs that point to similar (if not identical) content. According to Google, these kinds of parameters can waste crawl budget or cause problems when indexing all of the site's content. It is preferable to stay away from such configurations as much as possible.
Poor URL readability
If you use custom parameters in your URLs, users will struggle to read them. Displayed in the SERPs, such URLs look untrustworthy, which reduces the chances of organic clicks. If a user distrusts a link because it looks too complex, it will be difficult for them to reach your website naturally unless it is recommended by friends or acquaintances.
Diluted ranking signals
When multiple URLs carry the same content, users may link to any of the parameterized versions of the page. Those links split the ranking signals across the variations, so the main page will struggle to rank well in the search engine.
How to clean link tracking parameters?
To improve your website's SEO and manage both tracking and content parameters, you need to take some specific actions. These strategies require work and patience, but they can bear fruit in a short time.
First, analyze your internal linking to keep it consistent. If you have many parameterized URLs, it is important to mark which pages crawlers should not index. Internal links should point only to the static page, never to the parameterized versions. This sends consistent signals to search engines and strengthens the original page.
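As a sketch of what "linking only to the static page" means in practice, this hypothetical Python helper strips common tracking parameters from a URL before it is used in an internal link (the list of tracking prefixes is an assumption, not a standard):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking-parameter prefixes to drop; extend as needed.
TRACKING_PREFIXES = ("utm_", "gclid", "fbclid")

def clean_internal_link(url: str) -> str:
    """Return the URL with tracking parameters removed,
    keeping content parameters (e.g. filters) intact."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit(parts._replace(query=urlencode(kept)))

clean_internal_link("https://example.com/camisetas?talle=xl&utm_source=google")
# → "https://example.com/camisetas?talle=xl"
```

Running every internal link through a cleaner like this keeps tracking parameters out of your site's own link graph while leaving genuine content filters in place.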
Another procedure is URL canonicalization. By setting canonical tags on parameterized URLs, you tell search engines which preferred URL you want indexed. Every variation of a page should include the canonical tag that identifies the original URL. This helps the SEO positioning of your page.
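For example, a parameterized variation such as /camisetas?talle=xl would carry a canonical tag in its head pointing at the clean URL (example.com is a placeholder domain):

```html
<!-- On every parameterized variation, e.g. /camisetas?talle=xl -->
<link rel="canonical" href="https://example.com/camisetas">
```

Search engines then consolidate the ranking signals from all the variations onto the canonical URL.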
Blocking crawlers with Disallow
If you are suffering crawl budget issues, you can choose to block crawlers from accessing your parameterized URLs. This is configured in a robots.txt file that includes the following text:
User-agent: *
Disallow: /*?*
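To illustrate how the * wildcard in that Disallow rule matches URLs, here is a minimal Python sketch of the pattern semantics (a simplified model for illustration, not a full robots.txt parser):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Simplified robots.txt matching: '*' matches any sequence of
    characters, '$' anchors the end, everything else is literal."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

robots_pattern_matches("/*?*", "/camisetas?talle=xl")  # True: blocked
robots_pattern_matches("/*?*", "/camisetas")           # False: allowed
```

The rule /*?* therefore blocks any path containing a question mark, i.e. every parameterized URL, while leaving clean URLs crawlable.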
Compliant crawlers check robots.txt before crawling, so with the directives above they will skip the parameterized pages. Cleaning the tracking parameters in your links is a somewhat tedious task, but a necessary one. If you have not configured this correctly before, it may take a while; but once you start applying these strategies to your linking, your ranking results will benefit greatly. The better your pages rank in search engines, the greater the chances of making the web project profitable.