Blocking links via .htaccess

I saw on another forum a suggestion for blocking unwanted incoming links via .htaccess so that visitors following them would get a 403 error. The thinking behind this was that the links might eventually be removed by the site concerned, because they wouldn’t want broken links, or that Google would eventually de-index them.
I was thinking about using this in cases where I haven’t been able to get links removed, and I was wondering what the pros and cons of doing so would be: would I actually gain anything, and could it cause me any problems?
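For reference, the sort of rule I saw suggested looks something like this (a minimal sketch only, assuming mod_rewrite is available on the server; example-spam.com and bad-directory.net are just placeholder names for the referring domains, not real sites):

    <IfModule mod_rewrite.c>
      RewriteEngine On
      # Refuse any request arriving with one of the unwanted sites in the Referer header
      RewriteCond %{HTTP_REFERER} example-spam\.com [NC,OR]
      RewriteCond %{HTTP_REFERER} bad-directory\.net [NC]
      # The [F] flag makes Apache answer with 403 Forbidden
      RewriteRule .* - [F]
    </IfModule>

As I understand it, this could be tested with something like curl -I -e "http://example-spam.com/page" http://www.mysite.com/ (which should return 403) versus a plain curl -I http://www.mysite.com/ (which should still return 200), though I haven’t tried it myself.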

Seems risky… If there’s a really bad link out there that worries you, reporting it seems safer than blocking it. After all, when you block a link you’re telling Google your site is not accessible, and who knows what doors that opens. If Google starts seeing 403s here, there and everywhere, it’s not a good sign for the stability of your site.

If you have links you just don’t like, I fail to see the value in cutting the user, and potentially the engine, off.

Of course, having never done this myself, it’s merely a comment; perhaps someone who has been listed by some unscrupulous sites will come back with better insight…

Reporting would be the best option, but unfortunately Google doesn’t (currently) have this facility. In this type of instance it only advises contacting the webmaster of the site concerned, and there is no suggestion as to what to do if that fails (if anyone knows otherwise, that information would be greatly appreciated).

I would be telling Google that my site is not accessible, but only to requests coming from specific URLs? (I should explain that I am not envisaging blocking hundreds of URLs, maybe 10 or so.) A couple of months ago I noticed a 403 error in GWT, and as it is an error returned by the server I contacted my hosts to ask what the problem was. They replied that they do this as a matter of routine where they suspect malware or hacking attempts, and that I need have no concerns. If it is OK in those instances, would it not be if I wanted to prevent access from certain domains for other reasons?
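To illustrate what I mean by blocking only specific domains, I had something like this in mind (again just a sketch with placeholder domain names, using mod_setenvif and Apache 2.2-style directives). As far as I can tell, only requests that arrive with one of those domains in the Referer header would be refused, so ordinary visitors and crawlers fetching pages directly, which send no such Referer, should be unaffected:

    <IfModule mod_setenvif.c>
      # Flag requests referred by the unwanted domains (placeholders)
      SetEnvIfNoCase Referer "example-spam\.com" block_referer
      SetEnvIfNoCase Referer "bad-directory\.net" block_referer
      Order Allow,Deny
      Allow from all
      # Deny (403) only the flagged requests
      Deny from env=block_referer
    </IfModule>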

This is my other major concern, i.e. is there any value in doing it anyway, especially if there are risks involved? But it isn’t that I just don’t like the links; I feel they could be potentially harmful.
Users never click on these links anyway, the sites where they are placed have no relevance to mine, and there is no reason why anyone should have any interest in them. The sole purpose would be to render the link useless, so that whatever the reason it is there, it cannot fulfil its objective.
I was (perhaps optimistically) thinking that Google would maybe investigate those sites which are being blocked.

Thanks very much. I appreciate your reply and that you are only commenting, and this reply is in no way a criticism of your thoughts (which largely echo my own) but merely an exploration of them.

I had not heard about blocking links in the .htaccess file; it seems a little risky to me. As far as I know, the server responses that search engines receive help decide the reputation of a website in their eyes. If my site is showing plenty of 404s and soft 404s, that would suggest my site is not well structured. In the same way, I feel 403s would not be safe once they make up a noticeable percentage of a site’s total link prospects. I’m not talking about 100 or 200 links in absolute terms, but about the percentage.

I think that there are no bad links. All links help rankings, and even links from non-relevant websites are not bad. I would never think about blocking links. If you have some really bad links from casino or porn websites, then you can build some quality links to compensate.

Even if you move or redirect the URL, the links will still be there, so you just need to build quality links. Google understands that anyone can build links to your website to damage its reputation, and it won’t de-rank you just because of some unwanted links.