zygoma — 2011-02-08T03:30:29-05:00 — #1
Bonjour from medieval York...
Following a refurb of my site, I've got 30-plus pages I do not want Google to index. Now I know there are two ways to block the bots, but I'm not sure which will be best.
Method 1 would be to put this in the head of each unwanted page: <meta name="robots" content="noindex">
Method 2 would be to do a robots.txt file, which terrifies me.
But what about deleting the pages? Would that be a problem? These pages have no SEO benefit to me, and they only link to now-defunct parts of the site.
So could I hit the self-destruct button and delete the pages I don't want...
Answers on a postcard please
ralphm — 2011-02-08T03:42:38-05:00 — #2
If you don't want the pages any more, I would say delete them and then set up some redirects in case people click on a link they find in a search engine (or elsewhere). Redirects are easy to set up in a .htaccess file. E.g.
Redirect 301 /old/page/ /new/page/
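With 30-odd pages, if the old ones all live under one directory, a single RedirectMatch pattern can cover the lot. A sketch, with /defunct/ and example.com as placeholders for your real paths:

RedirectMatch 301 ^/defunct/ http://www.example.com/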
erik_j — 2011-02-08T07:25:52-05:00 — #3
Hi, why not do all three: the header meta tag, robots.txt, and a sitemap?
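A robots.txt is nothing to be scared of, by the way. A minimal sketch, assuming the unwanted pages sit under a /defunct/ folder (swap in your real paths), saved as robots.txt at the site root:

User-agent: *
Disallow: /defunct/

Bear in mind that robots.txt only stops crawling; a blocked URL can still show up in results if other sites link to it, which is why the meta noindex tag is worth keeping as well.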
Of course, the safest option would be to delete the pages. If they are of no use to your visitors, or visitors can't find them anyway, I suggest you delete them.
If you do, then use mod_alias or maybe mod_rewrite to tell clients the pages are "gone" (410). Ask in the Apache Configuration forum how to point visitors at a 410 page.
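As a rough sketch with mod_alias in .htaccess (the paths here are hypothetical):

Redirect gone /defunct/old-page.html

Or, with mod_rewrite, to mark a whole folder as gone:

RewriteEngine On
# Return 410 Gone for everything under /defunct/
RewriteRule ^defunct/ - [G,L]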
PS: This is an SP postcard.
EDIT: Apparently this was not a recently opened browser tab of mine. 8.33 GMT, I see it was.
baybossplaya — 2011-02-17T02:47:18-05:00 — #4
The best way would be to 301 redirect those pages. You don't want the link juice to disappear.
acmeous — 2011-03-21T10:14:25-04:00 — #5
I would like to mention another point against deleting those pages. Do you want visitors arriving from other sites that may link to those pages to get a 404 Page Not Found?
So redirect them, or use the first method (the meta noindex tag) to block search engines from them.