Robots.txt to allow only one robot

I need to create a robots.txt file that does the following:

  • Disallow all robots from accessing certain pages on the site.

  • But, as an exception, allow robot X to access the entire site.

My robots.txt contains something like this:

User-agent: robot-x

User-agent: *
Disallow: /somepage.htm
Disallow: /anotherpage.htm

Can someone tell me if this will achieve my aim? I’m not sure whether the second record will completely override the first (in which case, the * would also apply to robot-x) or whether the * means, in effect, “all robots except the one mentioned above”.

Hope this makes sense.


Hi, in my opinion you should use the real page names (e.g. Disallow: /logon) rather than the placeholder somepage.htm from your post, so that Google applies the rules to the pages you actually want covered by robots.txt.

A User-agent: * record applies to any robot not matched by a more specific record, so yes, that would work fine for allowing robot-x global access while restricting all other (well-behaved) robots. One caveat: the original robots.txt spec requires at least one Disallow field per record, so the robot-x record should include an empty Disallow: line, which means "allow everything".
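If you want to sanity-check the precedence yourself, Python's standard-library urllib.robotparser applies the same matching rules. The sketch below uses your file with one assumed change: an empty Disallow: line added under robot-x, since a record needs at least one Disallow field.

    import urllib.robotparser

    # The proposed robots.txt, with an empty Disallow: added to the
    # robot-x record (an empty value means "allow everything").
    rules = """\
    User-agent: robot-x
    Disallow:

    User-agent: *
    Disallow: /somepage.htm
    Disallow: /anotherpage.htm
    """

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    # robot-x matches its own record, so it may fetch everything.
    print(rp.can_fetch("robot-x", "/somepage.htm"))    # True

    # Any other robot falls through to the * record.
    print(rp.can_fetch("other-bot", "/somepage.htm"))  # False
    print(rp.can_fetch("other-bot", "/index.htm"))     # True

So the more specific record wins for robot-x, and * only catches robots with no record of their own.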

Thanks, Stevie. That’s spot on - exactly what I needed.