Should I block my affiliate links in my robots.txt file?


If you manage a website that includes many affiliate links, you might have noticed a drop in traffic, especially from Google Search. One of the reasons could be that your content is saturated with affiliate links.

Your website’s link profile plays a key role in search algorithms. If your outbound links are almost exclusively affiliate links, Google will catch on to that and possibly devalue your site.

By also linking to other, non-affiliate sources, you essentially tell Google that your site has some legitimacy, at least in terms of its link profile.

This improves the affiliate-to-normal link ratio and could help lift any algorithmic link penalties Google imposes on low-quality sites.

Affiliates often use link cloaking

Cloaking essentially means replacing long, strange-looking affiliate links with user-friendly URLs on your own domain that redirect to the affiliate destination, so they can be easily remembered and recalled.

Link redirection, or masking as it is sometimes called, is usually done with a 301 redirect in the .htaccess file. Cloaked affiliate links can look completely different from the originals.
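As a minimal sketch, assuming an Apache server (the /recommends/ slug and the destination URL below are made-up examples, not real affiliate links), such a redirect could look like this in .htaccess:

```apache
# Redirect the friendly "cloaked" URL to the real affiliate link.
# Both the slug and the destination are hypothetical placeholders.
Redirect 301 /recommends/product-name https://affiliatenetwork.example/track?affid=12345
```

Many affiliates use a plugin to manage these redirects instead of editing .htaccess by hand, but the underlying mechanism is the same 301 redirect.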

For example, an affiliate link to a product page such as this one (a hypothetical placeholder, as real affiliate links vary by network):

https://affiliatenetwork.example/track?affid=12345&campaign=67890

can be modified to:

https://www.yoursite.example/recommends/product-name/

This looks much nicer to the end user, doesn’t it? As far as search engines are concerned, the cloaked link can also include targeted keywords.

All nice, right? Wrong.

Cloaking links is actually considered bad practice for SEO and is generally not recommended, simply because it is so often abused. That said, link cloaking can also offer certain advantages.

For one, it lets affiliates organize all those unwieldy links into something much more memorable and easier to manage.

If you use the same cloaked link on many different pages throughout your website and your actual affiliate link changes, all you need to do is update the original affiliate link once in the redirection settings.

That’s it.

Your outgoing cloaked link remains the same throughout your entire website.

Managing affiliate partners and links on big, content-driven websites with many contextual links becomes much easier.

Another big advantage of using cloaked links is that all the outgoing affiliate links can use a common directory structure.

In the cloaked link example above, you can see that it goes through the /recommends/ directory structure.

And this brings us to the original question…

Blocking affiliate links in the robots.txt file

One of the most effective ways to hide your affiliate links from search engines (or at least lessen their value) is to block crawling of any links that go through a specific folder structure.

Still using the /recommends/ example from above, the robots.txt file would look like this:

User-agent: *
Disallow: /recommends/

Every link going through the /recommends/ directory structure will be blocked from crawling by compliant search engines.
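You can verify that a robots.txt rule like this actually blocks the intended paths using Python's standard-library robots.txt parser (the domain and paths below are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# The same hypothetical robots.txt rule as in the example above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /recommends/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Links under /recommends/ are blocked for all compliant crawlers...
print(parser.can_fetch("*", "https://example.com/recommends/product-name/"))  # False
# ...while normal pages remain crawlable.
print(parser.can_fetch("*", "https://example.com/blog/some-post/"))  # True
```

This is a quick sanity check before deploying; it mirrors how compliant crawlers interpret the Disallow rule, though it cannot guarantee how any particular search engine will behave.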

Using this type of elimination technique can also be a good way of minimizing the importance of the external affiliate links on your website.

There may be a legitimate reason for doing so; however, as outlined above, Google does not like hidden links and may treat them as spam.

Having said that, there are currently many sites that use cloaking on affiliate links and they are absolutely fine…

…for now.
