Need Advice on Robots.txt

Hello All,

I have a sports blog built on self-hosted WordPress.

I am currently using the following robots.txt: http://sportsbun.com/robots.txt

Is this SEO friendly? Do I need to change the robots.txt to block spidering of certain directories?

Please advise me on what I need to change to make it SEO friendly while also reducing unnecessary bandwidth usage.

Thanks & Regards,
Nuk

Your robots.txt is misconfigured: the Allow directive is only needed to counter a specific Disallow directive, and since you have no Disallow, that line is redundant.
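For context (the plugin path below is made up for illustration), Allow only earns its keep when it carves an exception out of a broader Disallow, and it's an extension honoured by the big crawlers like Googlebot rather than part of the original standard:

User-agent: *
# block the whole plugins directory...
Disallow: /wp-content/plugins/
# ...but let crawlers back into this one hypothetical sub-folder
Allow: /wp-content/plugins/example-plugin/css/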

Read: http://www.robotstxt.org/robotstxt.html and http://en.wikipedia.org/wiki/Robots.txt

There’s little you need to worry about with robots.txt and WordPress, but there are a few directories that you can ensure don’t get indexed:

Here’s my current robots.txt

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Sitemap: http://www.example.com/sitemap.xml.gz

*Note: you need to have a sitemap generated to use the Sitemap: directive.

I have placed my new robots.txt at http://www.sportsbun.com/robots.txt

Please let me know if it looks fine to you…

Dude, you do realize that a robots.txt file is only a recommendation to crawlers?
And what "unnecessary bandwidth wastage" are you even talking about?
Your website would need millions upon millions of visitors every month to actually eat through your bandwidth, and besides, most hosts these days offer unlimited bandwidth…

Now, if you use a robots.txt file without knowing how to, you can only screw up your own progress.

A robots.txt file tells Google which of your pages not to crawl, not which ones not to "index".
Unless you know exactly which directories to block, don't touch it.
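To put that crawl-vs-index distinction concretely (the path below is just an example), a Disallow only stops the fetching; the URL can still turn up in Google's results as a bare link if other sites point at it:

User-agent: *
# Googlebot won't fetch this page, but the URL itself can still
# appear in search results if it's linked from elsewhere
Disallow: /some-private-page/

If you genuinely want something kept out of the index, a noindex meta tag on the page itself is the usual way to do it.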

Just think about which directories you don't need crawled and add those, if you still want to play with robots.txt files.
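If you do go that route, here's a rough sketch of what people often end up with on WordPress; treat it as an illustration, not a recommendation, and only block a directory if you're sure you don't want it crawled (/wp-content/cache/ only exists if a caching plugin creates it):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/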