I am writing some scripts that will cache output into files, which will need to be stored together in directories. The directories will be created dynamically by the script. Then, daily, a cron job will delete the cached files (and directories). This is on an i686 running RedHat 7.1.

So I am just wondering: is there a rule of thumb for working out the optimal hashing scheme for directories in the filesystem?

For example, if I expect to create, say, 1000 directories under the cache-data directory, what level of hashing should I use for the directory structure?

None? One level, two, ...? Is there an algorithm? I'm not very good with data structures such as trees.
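To make the question concrete, here is a rough sketch (in Python) of the kind of scheme I have in mind. The `cache_path` helper, the `cache-data` root, and the one-level/256-bucket choice are all just placeholders for illustration, not anything I've settled on:

```python
import hashlib
import os

CACHE_ROOT = "cache-data"  # hypothetical cache root directory

def cache_path(key, levels=1, width=2):
    """Map a cache key to a hashed subdirectory path.

    With levels=1 and width=2, the first two hex digits of the
    key's MD5 digest pick one of 256 buckets, e.g.
    cache-data/ab/<key>. levels=2 would give cache-data/ab/cd/<key>.
    """
    digest = hashlib.md5(key.encode()).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(CACHE_ROOT, *parts, key)

# The script would create bucket directories on demand:
path = cache_path("some-output-file")
os.makedirs(os.path.dirname(path), exist_ok=True)
```

The daily cron job could then just remove everything under `cache-data`. My question is really about how to pick `levels` (and the bucket count) given an expected number of entries.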