As a Certified Ethical Hacker, I'm fully aware that prevention is the best defense against hackers. But should one break through, the sooner you know about it, the quicker you can act to limit the damage.
A while back, I presented a script called hashscan, designed to track site changes. Executed via a daily CRON job, the script reads the files in a specified directory (e.g., an account’s
public_html directory on a server), generates hashes for files with specific file extensions, and compares them with the previous scan’s hashes stored in a database. It's a great way for site owners to be alerted to files that have been added, altered or deleted by a hacker.
In this article, I'll present an updated version of the script, called SuperScan.
Benefits of SuperScan
The primary benefit is that SuperScan will report any change to the files in an account, whether that change is an addition, alteration or deletion. SuperScan was designed not to overwhelm the webmaster: it only reports changes since the last scan (hourly by default, configurable via CRON) and sends a summary report (daily by default, again configurable via CRON).
Because a scan of a 1,500-file account takes roughly 0.75 seconds, SuperScan can be run frequently without affecting server performance.
To support forensic investigation, the file last modified date and time are held in the database, along with the hash value of the most recent scan (and prior scan for altered files).
The scanner file need not be changed, as all variables are set within a required configure script. It's in the configure script where you can select specific (or ALL) file extensions to be scanned or, if ALL, the file extensions to omit. Additionally, you may specify directories which the scanner will not scan.
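As a rough illustration, a configure script of that kind might look like this. The variable names and values below are assumptions based on the description, not the actual contents of configure.php:

```php
<?php
// Hypothetical configure.php settings, based on the description above;
// variable names here are assumptions, not the script's actual contents.
$account    = 'example';                    // account name (for shared databases)
$scan_dir   = '/home/example/public_html';  // directory tree to scan
$scan_all   = false;                        // true = scan ALL file extensions
$extensions = ['php', 'html', 'js'];        // scanned when $scan_all is false
$omit_ext   = ['jpg', 'png', 'gif'];        // omitted when $scan_all is true
$omit_dirs  = ['cache', 'tmp'];             // directories the scanner skips
$testing    = false;                        // verbose output; NEVER true in CRON
```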
While the SuperScan files can be tested within a webspace, I recommend that it be moved outside the webspace for production use via CRON to protect against casual hackers.
Finally, a curious additional benefit is that changes in (extensionless) error_log files are captured and can direct the webmaster’s attention to coding problems that have slipped through the testing procedures.
The logic flow of SuperScan is:
- Read the baseline information about the files in the database
- Scan the system’s files and compute their hashes
- Compare the baseline files against the current files to determine the changed files to generate:
- A list of added files
- A list of altered files and
- A list of deleted files
- Handle each of the changed files lists (update the database)
- Prepare and send a report (if required).
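The scan step above can be sketched in a few lines of PHP. This is a minimal illustration, not the production script: it builds a small temporary file tree just for the demo, and the extension list is an assumption:

```php
<?php
// Build a tiny temporary tree to scan (for illustration only).
$scan_dir = sys_get_temp_dir() . '/superscan_demo_' . getmypid();
@mkdir($scan_dir . '/sub', 0777, true);
file_put_contents($scan_dir . '/index.php', '<?php echo "hi";');
file_put_contents($scan_dir . '/sub/page.html', '<p>hello</p>');
file_put_contents($scan_dir . '/notes.txt', 'skipped');  // extension not scanned

$extensions = ['php', 'html'];  // extensions to scan (an assumption)

// Walk the tree and hash every file with a matching extension,
// keyed by path so the baseline comparison is a simple array diff.
$current  = [];
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($scan_dir, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $file) {
    if (in_array(strtolower($file->getExtension()), $extensions, true)) {
        $current[$file->getPathname()] = hash_file('sha256', $file->getPathname());
    }
}
```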
Database, Variables and the Working Arrays
Rather than bore you with the details here, I've inserted comments in all the scripts.
Thus, in short, there are three database tables:
- baseline: this contains the $file_path, the file’s hash, and the file’s last modified date and time. (I also added the account field so that multiple accounts could share a single database.)
- history: this records every detected change—or lack thereof—in each scan.
- scanned: this records scan summary date and time, as well as the number of changes and associated account.
I can’t stress enough that the $testing variable set by configure.php will trigger an immense amount of output, so it must only be used for testing and never during a CRON job!
Because the path/to/file is used as a key, it must be unique. That means that multiple accounts can never scan the same files.
In addition, Windows servers use backslashes in paths; these are immediately changed to forward slashes, because backslashes cause characters to go missing in the database. Also, an apostrophe in a file name will cause problems with database queries.
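One way to handle both issues (a sketch, not necessarily the script's exact approach) is to normalize separators as paths are read, and to let prepared statements deal with awkward characters such as apostrophes:

```php
<?php
// Normalize Windows backslashes to forward slashes so the stored key is
// consistent and backslashes can't be eaten as escape characters.
function normalize_path(string $path): string
{
    return str_replace('\\', '/', $path);
}

// For file names containing apostrophes (e.g. "o'brien.php"), a prepared
// statement avoids broken queries. $db is a hypothetical PDO connection:
//   $stmt = $db->prepare('SELECT file_hash FROM baseline WHERE file_path = ?');
//   $stmt->execute([normalize_path($path)]);
```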
The working arrays are designed to make use of PHP’s array functions, which access the key ($file_path; this is also the file structure iterator, so never alter it). $baseline is read before starting the scan, $current is the result of the scan, and the $added, $altered and $deleted arrays accumulate the changes from the $baseline and are used to update the $baseline for the next scan.
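With those arrays keyed by $file_path, the change detection falls out of PHP’s array functions. Here's a minimal sketch using dummy hashes (the array names follow the description above):

```php
<?php
// Previous scan (from the database) and current scan, keyed by file path.
$baseline = ['index.php' => 'aaa', 'old.php' => 'bbb', 'page.php' => 'ccc'];
$current  = ['index.php' => 'aaa', 'page.php' => 'ddd', 'new.php' => 'eee'];

$added   = array_diff_key($current, $baseline);   // in current, not baseline
$deleted = array_diff_key($baseline, $current);   // in baseline, not current
$altered = [];
foreach (array_intersect_key($current, $baseline) as $path => $hash) {
    if ($baseline[$path] !== $hash) {
        $altered[$path] = $hash;                  // same path, different hash
    }
}

// Roll the changes forward so $baseline matches $current for the next scan.
$baseline = array_diff_key(array_merge($baseline, $added, $altered), $deleted);
```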
The superscan.zip file contains seven files:
- CreateTables.sql, which can be used to set up your database tables
- ReadMe.txt, which provides an overview of the SuperScan script
- configure.php, which holds all the configuration variables required by the other scripts
- scanner.php, the scanning script, which requires
- scandb.php (which connects to your MySQL server and returns the database connection)
- reporter.php, which will provide a summary of recent scans via CRON
- CRON.txt, which provides sample CRON instructions for both scanner.php and reporter.php
The $report is created as the file changes are detected, and is stored and emailed unless it's a “negative report.” The summary report provides the “warm, fuzzy feeling” when you’re not receiving change reports.
During the cleanup, records older than 30 days in the history and scanned tables are automatically purged to prevent unlimited growth of the database, the large working arrays are destroyed (reset to empty), and the database connection is closed.
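The purge amounts to a single dated DELETE per table. Here's a sketch using an in-memory SQLite database as a stand-in for MySQL (where the condition would instead be something like `scan_time < NOW() - INTERVAL 30 DAY`); the column names are assumptions:

```php
<?php
// Stand-in for the MySQL connection, for illustration only.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE history (file_path TEXT, scan_time TEXT)');
$db->exec("INSERT INTO history VALUES
           ('stale.php', datetime('now', '-45 days')),
           ('fresh.php', datetime('now', '-1 day'))");

// Purge records older than 30 days so the table cannot grow without bound.
$db->exec("DELETE FROM history WHERE scan_time < datetime('now', '-30 days')");
$remaining = (int) $db->query('SELECT COUNT(*) FROM history')->fetchColumn();

// Release the large working arrays and close the connection.
$baseline = $current = [];
$db = null;
```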
I believe that SuperScan is a massive improvement over my prior effort, and is a worthy upgrade. It provides frequent notice of changed files, while “negative reports” won't overwhelm the webmaster with unnecessary “Unchanged” notices.
SuperScan was suggested by Han Wechgelaer (NL), who emailed to suggest that my earlier hashscan script be extended to capture a history of the changes to an account’s files, make more frequent assessments, and add a daily summary.
Han was kind enough to provide a copy of his start on this project and, between us, this evolved into SuperScan. Without Han’s gentle prodding and assistance, SuperScan would never have gotten off the ground and would certainly not be the exceptional tool it is today.
I'd love to know how you find this script, or if you have any questions about it or feedback.