Problem with robots.txt

Hi everyone,
I am using a robots.txt file on my site, which looks like this:

User-agent: *
Disallow: /

but why are some (only a few) of the pages still being crawled by Google?

It would be great if I could get some help quickly.

Thanks in advance.

Perhaps Google crawled those pages before you added the robots.txt. Also keep in mind that robots.txt only stops crawling, not indexing: pages that are linked to from elsewhere can still show up in search results as bare URLs.
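If you want to double-check that crawlers will read those two lines the way you intend, Python's standard `urllib.robotparser` module can evaluate the rules directly. A quick sketch (example.com and the paths are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the original post, parsed without any network fetch
rules = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A blanket "Disallow: /" should block every crawler from every path
print(rp.can_fetch("Googlebot", "http://example.com/"))      # False
print(rp.can_fetch("Googlebot", "http://example.com/page"))  # False
```

If both checks come back `False`, the file itself is fine, and the pages you are still seeing were most likely picked up before the file was in place.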

Will those entries be deleted at all?
How many days will Google keep the links in its cache? Any idea?

You can ask for those entries to be removed via Google Webmaster Tools.

I also read some time ago (and I can’t remember where, so I can’t vouch for this advice) that Google likes to be mentioned by name in a robots.txt file, e.g.

# For Googlebot
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /scripts/

# For all bots
User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/

As I say, I don’t know how true it is, but it’s worth a try if you still have problems.
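One way to confirm that both groups parse as expected is again Python's standard `urllib.robotparser`; a sketch, with example.com and the test paths as placeholders:

```python
from urllib.robotparser import RobotFileParser

# The Googlebot-specific example from above, parsed without a network fetch
rules = """\
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /scripts/

User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The named Googlebot group blocks the listed directories but nothing else
print(rp.can_fetch("Googlebot", "http://example.com/cgi-bin/test"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))    # True
```

Since both groups contain identical rules, naming Googlebot here changes nothing functionally; at most it makes the intent explicit.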