Search recursively for a file


There are various PHP error log files, and I need to know on a daily basis whether any new ones have been created. So, some sort of cron job running a PHP script that does a recursive search, I guess.


Found this code from 2010; not sure if it will still work?


$document_root = "/home/********/";
$output_file = "/home/********/public_html/********/mysite_" . date('d-M-Y') . ".txt";

echo `ls -alR /home/********/public_html`;
exec('ls -alR /home/********/public_html', $output_file, $return);

echo '<p>Returned: ' . $return . '</p>';
foreach ($output_file as $file) {
    echo $file . '<br>';
}


So, surely I just need to look for a specific file, add some code to email me, then have it run as a cron job once a day?


Maybe you could just try it?


Yes, it works. However, I need to search for only one filename; that code gets everything. I have been testing ls under Kubuntu but can't seem to get the correct syntax.


So what now: do you search for multiple files, or for just one? If just one, I would say you already have the file name, so why search for it?

Also, setting $output_file as a string and then letting exec() fill it up as an array makes no sense to me according to the documentation.


At present I need to search for one file that can exist in multiple paths.

It was simply a small PHP script that I found and used years ago, and the exec was within it. I may not need that command at all; in fact, possibly ls is the wrong approach? The script simply needs to recursively look for a filename, which could be in any path. After reading up on the file_exists() function, it would be easier to do that. As soon as one occurrence of the file was found, send an email.
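The recursive-lookup idea above can be sketched without shelling out at all, using PHP's SPL directory iterators. This is only a sketch: the root path, filename, and email address below are placeholders, not the poster's real values, and mail() assumes a working local MTA.

```php
<?php
// Recursively collect every copy of a given filename under $root.
// findErrorLogs() is a hypothetical helper name for this thread.
function findErrorLogs(string $root, string $filename): array
{
    $matches = [];
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $file) {
        if ($file->getFilename() === $filename) {
            $matches[] = $file->getPathname();
        }
    }
    return $matches;
}

$root = '/home/example/public_html';   // placeholder path
if (is_dir($root)) {
    $found = findErrorLogs($root, 'error_log');
    if ($found !== []) {
        // mail() assumes a working local MTA; substitute your own mailer.
        mail('you@example.com', 'error_log files found', implode("\n", $found));
    }
}
```

Unlike file_exists(), which checks a single known path, the iterator walks every subdirectory for you.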


This might do the job -


I’m wondering if instead of having a reoccurring job that looks for error log files, it might be better to set up a custom error handler and have error_log send an email when an error happens?
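A minimal sketch of that handler idea, assuming a working local MTA; formatPhpError() and the address are hypothetical names introduced here, not an existing API:

```php
<?php
// Format one PHP error into a single human-readable line.
function formatPhpError(int $errno, string $errstr, string $errfile, int $errline): string
{
    return sprintf('PHP error [%d]: %s in %s on line %d', $errno, $errstr, $errfile, $errline);
}

// Trap errors as they happen and email a summary, instead of
// polling log files once a day.
set_error_handler(function (int $errno, string $errstr, string $errfile, int $errline): bool {
    $msg = formatPhpError($errno, $errstr, $errfile, $errline);
    error_log($msg);                                        // keep the normal log entry too
    @mail('you@example.com', 'PHP error on mysite', $msg);  // assumes a working MTA
    return false;  // false = let PHP's default handling continue as well
});
```

In practice you would also rate-limit the emails, since one broken page can fire the handler many times per request.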


Thanks, but there seems to be no easy way to distinguish between an error produced by a 404 and one produced by a PHP error. I only want the ones produced by PHP errors. Those already get logged to an 'error_log' file, hence a simple cron job, once a day, to see whether any of those files exist.


Beware: these notes have not been checked, because they were posted from an iPad.

Assuming Linux is the operating system, here’s a few notes which may be helpful:

  1. Linux error log file settings and the 404 redirect are set in /etc/apache2/sites-available/mysite.conf;
    usually the Apache error log location is /var/log/apache2/error.log

  2. Apache2 can also read a per-site .htaccess file in the mysite document root, with settings that override the mysite.conf global defaults

  3. The default PHP error_log file location can be viewed and overridden using PHP:
    a. echo ini_get("error_log");
    b. echo ini_set("error_log", "new-error-log.log");
    c. note that ini_set() returns the previous value, so the first call above echoes the old (default) setting.

  4. Test the new-error-log.log file by:
    a. creating a new PHP file
    b. calling ini_set(…)
    c. adding an error script, e.g. echo $fred/$jack;
    d. running the newly created PHP file
    e. examining both the default error log and the revised error log.

  5. Beware of new-error-log.log file permissions. Try manually creating the file and setting its permissions to octal 0777 while testing.


The new-error-log.log could be named:

$logfile = 'error-' . date('Y-m-d') . '.log';
ini_set('error_log', $logfile);

Beware: There could be file creation problems.


Thanks for your reply. That looks quite involved and complicated. I have been brushing up on my bash/shell skills, and this works perfectly:

find . -name 'error_log'

It found them all, very quickly. So in a PHP script I can now run that with the exec command and send an email if any of these files are found.
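Wiring that up might look like the sketch below: let find(1) do the recursive search via exec(), then email if anything came back. The root path, helper name, and address are placeholders.

```php
<?php
// Run find(1) and return one matching path per array element.
// findFiles() is a hypothetical helper name for this thread.
function findFiles(string $root, string $filename): array
{
    $output = [];
    exec('find ' . escapeshellarg($root)
        . ' -name ' . escapeshellarg($filename) . ' 2>/dev/null', $output);
    return $output;
}

$found = findFiles('/home/example/public_html', 'error_log');
if ($found !== []) {
    // mail() assumes a working local MTA; substitute your own mailer.
    mail('you@example.com', 'error_log files found', implode("\n", $found));
}
```

A crontab line such as `0 6 * * * /usr/bin/php /home/example/check_error_logs.php` (path is an example) would then run it once a day at 06:00.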
