To exclude the folder /ppc/ from all bots, all you need is
User-agent: *
Disallow: /ppc/
I read somewhere (and I can’t now remember where) that Googlebot likes to be called by name, and therefore one should add
User-agent: Googlebot
Disallow: /ppc/
I always keep to that practice and have run into no problems; with identical rules in both groups it is indeed redundant.
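One caveat, though: the named group does more than flatter Googlebot. Under the robots exclusion protocol, a crawler that finds a group matching its own name follows only that group and ignores the wildcard group, so the two sets of rules matter the moment they differ. A quick sketch with Python's standard urllib.robotparser (the /private/ path here is made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt where the wildcard and Googlebot groups differ.
rules = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /ppc/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot obeys only its own group, so the wildcard's /private/
# rule does not apply to it:
print(rp.can_fetch("Googlebot", "https://example.com/private/"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/ppc/"))      # False
```

So if you do keep a separate Googlebot group, remember to repeat in it every rule you also want Googlebot to honour.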
The original robots.txt protocol defined only “Disallow”, not “Allow” (most major crawlers, Googlebot included, later added support for an “Allow” directive, and RFC 9309 has since standardised it). By default, the whole site will be crawled unless folders/files are specifically excluded.
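That default-allow behaviour is easy to confirm with the same stdlib parser, using the two-line file from the top of this post (the bot name and URLs are invented for the example):

```python
from urllib.robotparser import RobotFileParser

# The minimal robots.txt discussed above: block /ppc/ for everyone.
rules = """\
User-agent: *
Disallow: /ppc/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The excluded folder is blocked for any bot...
print(rp.can_fetch("SomeBot", "https://example.com/ppc/landing.html"))  # False
# ...while everything not listed remains crawlable by default.
print(rp.can_fetch("SomeBot", "https://example.com/index.html"))        # True
```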