VERY perceptive ... and a brilliant solution (using a hash of the file)! Of course, I'd prefer a database and a CRON job: compute the current hash, compare it against the stored value, update the database and report any changes discovered.
VERY nice to have written the WP plugin, too! I hope you made that available on the WP website as it should help with WP security issues.
As for your FTP comment, there's a major difference between downloading individual files and downloading whole directories (the latter can be done just as easily via PHP while feeding the hash script and the database comparison).
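If you're comfortable with a bit of scripting, the whole hash/database/compare loop described above fits in one short script that a CRON job can run. Here's a minimal sketch in Python (the thread talks about PHP, but the idea is identical); the directory and database names are my own placeholders:

```python
# Sketch of the hash-and-compare idea: walk a directory tree, hash every
# file with SHA-256, and diff against the hashes stored on the previous
# run. "file_hashes.db" and "site" are hypothetical names, not anything
# from the thread.
import hashlib
import os
import sqlite3

DB = "file_hashes.db"   # hypothetical database file
WATCH_DIR = "site"      # hypothetical web root to monitor


def file_hash(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def scan(root):
    """Map relative path -> hash for every file under root."""
    hashes = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            hashes[os.path.relpath(full, root)] = file_hash(full)
    return hashes


def check(root, db=DB):
    """Compare the current scan with the stored one; report and update."""
    conn = sqlite3.connect(db)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hashes (path TEXT PRIMARY KEY, hash TEXT)"
    )
    old = dict(conn.execute("SELECT path, hash FROM hashes"))
    new = scan(root)
    changed = [p for p in new if p in old and old[p] != new[p]]
    added = [p for p in new if p not in old]
    removed = [p for p in old if p not in new]
    conn.execute("DELETE FROM hashes")
    conn.executemany("INSERT INTO hashes VALUES (?, ?)", new.items())
    conn.commit()
    conn.close()
    return changed, added, removed
```

The changed/added/removed lists are what you'd email yourself from the CRON run.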
THANK YOU! I'd never seen WatchThatPage before but it seems like it should be a major benefit to all. As a free service, it's affordable for junior webmasters, and the "donation" requested of professional webmasters is certainly dirt cheap - well worth the cost! I scanned for this: "If you want, we can keep the emails short and only tell you which pages did change, and leave it up to you to visit the pages yourself and find the changes." That's all I'd want from this service and would find it invaluable.
As above, if you're a coder, Mitt's hash (to a database for comparison via CRON) would be a good exercise and something you'd have direct control over. If you're not into coding, though, John's WatchThatPage would identify page changes on a frequent basis (though I believe using the allowed "hourly" setting would be abusing an excellent service). Take your pick!
Passwords: I hope you used STRONG passwords of the sort generated for you by http://strongpasswordgenerator.com. I think that's an excellent service and it certainly creates incredibly strong passwords - just be sure that it's long enough to deter brute force attacks.
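If you'd rather generate strong passwords locally than rely on a website, a few lines of code do the same job. A minimal Python sketch; the 24-character length and the character set are my own choices, not anything strongpasswordgenerator.com specifies:

```python
# Generate a strong random password from a cryptographically secure
# source (the secrets module), drawing from letters, digits and
# punctuation. The default length of 24 is a hypothetical choice.
import secrets
import string


def strong_password(length=24):
    """Build a password using a CSPRNG; longer deters brute force."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```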
OMG! That's exactly what Martin Psinas was doing in your linked article! If you're controlling the hosting on a reseller, VPS or dedi account, a single setup should cover all the "addon domains" (in their subdirectories). If they're spread all over the place, it's still just a single setup (file, database and CRON) for each "master account." To keep that hash/compare/report file relatively safe, keep it out of your "webspace," i.e., in a higher-level directory which cannot be accessed via HTTP. Then, with webstats, you'll know the "lull" periods of your websites, so leave the processing on the server during those hours rather than wasting bandwidth downloading and processing locally.
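Scheduling that during a lull period is one crontab line. A hypothetical example (the 03:30 time, paths and script name are all placeholders - the point is that the script lives above the web root, unreachable over HTTP):

```crontab
# Run the hash/compare script daily at 03:30, a typical "lull" period.
# /home/account/tools/ sits above public_html, so it's not web-accessible.
30 3 * * * /usr/bin/php /home/account/tools/hashcheck.php >> /home/account/tools/hashcheck.log 2>&1
```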
You can't find anything? Martin's given it to you - except for the CRON and database setup - so I wouldn't complain about not finding anything. Personally, I'm thrilled that you started this thread and received two excellent responses so far (not to mention your own linked article, which added my suggestions to Mitt's hashing suggestion [don't use MD5 - its collision resistance has been broken - use the strongest SHA variant you have available])!
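That SHA-over-MD5 advice is a one-line swap in most languages. In Python, for instance, hashlib offers both, and changing the constructor is the only change the hashing step needs (the sample data here is just an illustration):

```python
# MD5 vs. SHA-256 on the same data: the API is identical, so upgrading
# an existing hash-check script is trivial.
import hashlib

data = b"<?php echo 'hello'; ?>"  # hypothetical file contents
weak = hashlib.md5(data).hexdigest()      # collision-prone; avoid for integrity checks
strong = hashlib.sha256(data).hexdigest()  # current, widely supported choice
```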