I need to create a robots.txt file that does the following:
Disallow all robots from accessing certain pages on the site.
But, as an exception, allow robot X to access the entire site.
My robots.txt contains something like this:
Can someone tell me if this will achieve my aim? I'm not sure whether the second record will completely override the first (in which case, the * would also apply to robot-x) or whether the * means, in effect, "all robots except the one mentioned above".
Hope this makes sense.
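The setup described above would look something like this. The original listing isn't shown, so this is a sketch: "robot-x" and "/somepage.htm" are placeholder names, with the page path borrowed from the reply below.

```
User-agent: robot-x
Disallow:

User-agent: *
Disallow: /somepage.htm
```

An empty Disallow line means "allow everything" for that record, so robot-x gets the whole site while the * record restricts everyone else.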
Hi, in my opinion you should just use a Disallow: line for the specific pages you want blocked (the logon page, or the somepage.htm you mentioned in your post), so Google won't consider those pages; robots.txt only applies to the pages you list in it.
According to http://www.robotstxt.org/faq/robotstxt.html, the User-agent: * line means "Any other robot not already listed", so yes, that would work fine: robot-x gets access to the entire site, while all other (well-behaved) robots are restricted.
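This can be sanity-checked with Python's standard-library robots.txt parser. The file contents here are a guess at the setup described in the question, with "robot-x" and "/somepage.htm" as stand-ins for the real names:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt matching the thread's description:
# robot-x is allowed everywhere, every other robot is blocked
# from the listed page.
robots_txt = """\
User-agent: robot-x
Disallow:

User-agent: *
Disallow: /somepage.htm
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# robot-x matches its own record, so the * record never applies to it.
print(parser.can_fetch("robot-x", "/somepage.htm"))    # True
# Any other robot falls through to the User-agent: * record.
print(parser.can_fetch("other-bot", "/somepage.htm"))  # False
```

The parser confirms the FAQ's reading: a robot uses the first record whose User-agent line matches it, and * only catches robots with no record of their own.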
Thanks, Stevie. That's spot on - exactly what I needed.