Robots.txt question

Hello,
In my robots.txt file I have a list of robots that are not allowed to index my site, and all other robots should be allowed. But I would like to know the real difference between these two rules:

    User-agent: *
    Disallow:

and this:

    User-agent: *
    Allow: /

Thank you

Other than the very obvious difference in wording (one says that nothing is disallowed, the other says that everything is allowed), I’m not sure what else you might be thinking of?

But if you have, as you describe, a list of blocked bots, you do not need to have another list of allowed bots: being allowed is the default position.
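For example, a file like the sketch below blocks two specific bots and leaves the site open to everything else. The bot names are made up; you’d substitute the actual user-agent strings of the crawlers you want to block.

    # "BadBot" and "SpamCrawler" are made-up example names;
    # use the real user-agent strings of the bots you want to block.
    User-agent: BadBot
    Disallow: /

    User-agent: SpamCrawler
    Disallow: /

    # Optional: any bot not matched above is allowed by default anyway.
    User-agent: *
    Disallow:

The final record is optional for exactly the reason above: being allowed is the default position for any bot not matched by an earlier User-agent line.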

There’s no “Allow” rule in the original robots.txt standard; you can only disallow robots from particular folders. Some major crawlers, such as Googlebot, do support Allow as a non-standard extension, but you can’t count on every bot understanding it.

There’s a very good guide to how to set up a robots.txt file and what the syntax is at www.robotstxt.org.

The meaning of the first one is that all the crawlers are allowed to read and index the web site: the * denotes all crawlers, and the empty Disallow: line means nothing is blocked. If the line were Disallow: / instead, it would mean that none of the crawlers are allowed to read and index the web site.
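For example, this is the rule that would block every crawler from the whole site:

    User-agent: *
    Disallow: /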
Thanks,
Off Page SEO

The first one

    User-agent: *
    Disallow:

is correctly formed and allows all bots (denoted by *) access to all areas of the site, i.e. there are no directories listed as disallowed. The second one, as Stevie D has already explained, relies on “Allow”, which is not part of the original robots.txt standard, so not every bot can be expected to honour it.
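For instance, if you did want to keep bots out of particular directories, you would list each one on its own Disallow line. The paths here are only examples:

    # example paths only; list the directories you actually want blocked
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/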

Thank you very much for your help!
I’ll opt for TechnoBear’s solution.

    User-agent: *
    Disallow:

and

    User-agent: *
    Allow: /

Both are the same. :slight_smile:

No, they are not quite the same. “Allow” is not part of the original robots.txt standard, so while the big search engines treat the two rules as equivalent, some bots won’t recognise it. If you want your robots.txt file to work with every crawler, you need to use the standard syntax.