Creating links that search engines can't click?

Hey there everyone!

Part of a site I’m building includes vote links. To keep the count as legitimate as possible, I’ve gone to great lengths to stop people from circumventing the system, and I’m pretty sure I’ve got that covered. The one thing I haven’t figured out, however, is how to protect the links from a search engine/crawler. I thought I was gonna be all clever and hide them in JS, but then I read this. Unless I’m mistaken, it’s not a valid way of doing this.

Any suggestions on how to create a link that Google and the others can’t or won’t click? Nofollow doesn’t do this, as the link is still followed. That just stops the link from counting towards PageRank.

Thanks for your time!

[quote=“schwim, post:1, topic:98710”]
Any suggestions on how to create a link that Google and the others can’t or won’t click? [Nofollow][2] doesn’t do this, as the link is still followed. That just stops the link from counting towards a pagerank.
[2]: http://en.wikipedia.org/wiki/Nofollow
[/quote]Actually, Google says it generally doesn’t follow nofollowed links.

[quote=Google]How does Google handle nofollowed links?

In general, we don’t follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it’s important to note that other search engines may handle nofollow in slightly different ways.[/quote]

https://support.google.com/webmasters/answer/96569?hl=en

Sorry, I was going by the Wikipedia article, which states that the link is followed but doesn’t pass rank. This, however, doesn’t resolve my issue, as nofollow doesn’t have to be honoured by any search engine/crawler/bot that hits the site.

I’m thinking JS is still going to end up being a better way to protect the links, but would appreciate it if anyone has a more bulletproof method.

As far as I know, web crawlers shouldn’t submit forms, so you could set the vote links up as form buttons or something like that.
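For example, each vote could be rendered as a tiny POST form instead of an anchor (the `/vote` action URL and the `item_id` field here are just placeholders for whatever your site actually uses):

```html
<!-- A vote as a POST form rather than a crawlable <a href> link.
     /vote and item_id are placeholders for your own endpoint and data. -->
<form action="/vote" method="post">
  <input type="hidden" name="item_id" value="42">
  <button type="submit">Vote</button>
</form>
```

Since there is no `href` for a crawler to follow, the vote can only be cast by an actual form submission.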

Wow, I totally overlooked that. That seems like a fantastic way to handle it. POST data would be excluded from the link reporting, and I’ve already set it up to quietly drop visitors back at the parent page if all the vote-submitting requirements aren’t met.

Thanks very much for your help!

Anything that performs an update on the server end should use POST.

The difference between a POST call and the GET call an ordinary link uses is that GET tells everything involved (browsers, caches, crawlers) that you are only retrieving information, not updating anything. That’s exactly why crawlers feel free to follow GET links.
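To make the distinction concrete, here is the same vote expressed both ways (URLs and field names are placeholders): a crawler following the GET link would cast a vote as a side effect, whereas it won’t submit the POST form.

```html
<!-- GET: a crawler following this link casts a vote as a side effect. -->
<a href="/vote?item_id=42">Vote</a>

<!-- POST: crawlers don't submit forms, so the vote only happens on a real click. -->
<form action="/vote" method="post">
  <input type="hidden" name="item_id" value="42">
  <button type="submit">Vote</button>
</form>
```

On the server, you can then reject any GET request to the vote endpoint outright.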


I don’t think it can be done. I tried it on my website and it didn’t work out.