Hi from South Africa,
I have noticed that Google Webmaster Tools says my blog has a robots.txt file, and because of it 15 of my pages are not visited by the crawlers. What is a robots.txt file? How do you get rid of it? I use Blogger. I am not allowed to give a link yet.
A robots.txt file tells Google and other "honest" bots which folders/files on the site they are allowed to access and which they are not.
Which URLs is the robots.txt file disallowing? They could be files you really don't want the bots to reach. And/or they could be "duplicate" content, i.e. the same post appearing under date/archive/category URLs.
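For reference, Blogger's default robots.txt usually looks something like this (the blog address below is just a placeholder; check yours at yourblog.blogspot.com/robots.txt to see the actual rules):

```
# Allow Google's AdSense crawler everywhere
User-agent: Mediapartners-Google
Disallow:

# All other bots: skip /search (label and archive pages), crawl the rest
User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/sitemap.xml
```

If that is what yours looks like, the "blocked" pages Webmaster Tools reports are most likely just label/search pages under /search, which is normal and usually nothing to worry about.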
Submitting a sitemap to the search engines could also help.
cyjetsu, did you even read what he said? He's asking for help about the robots.txt file, not optimizing his site for the search engines. :(