Avoiding Out of Memory Errors on large PHP executions

I’m working on a personal project that involves several large PHP executions. I wanted to discuss this at a high level to determine the best way to handle the problem.

The script I’m working on has a long execution time (potentially minutes, due to a large number of HTTP connections). While building this script I’ve run into a couple of design problems:

  1. How do I prevent the script from running out of memory or timing out, other than raising the memory/timeout limits in php.ini? In other words, how can I chunk the requests? Cron jobs? Or is there another way to manage executing the PHP script in chunks?
  2. Is there a dynamic way to display a progress indicator (percentage) with PHP, or does this need to be done with JavaScript? How would the JavaScript communicate with the PHP request?

A common technique is to break the job into chunks and then send a page with a meta-refresh tag:

<meta http-equiv="refresh" content="1; url=batch.php?batch=<?php echo $batchNumber ?>">

This page will load, then refresh to the next page. Each time you send the page, advance the batch number.
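A minimal sketch of what such a `batch.php` might look like. The batch size, total item count, and the `batchRange()` helper are all assumptions for illustration; the only parts taken from the answer are the `batch` query parameter and the meta-refresh tag. Note that each refresh is a fresh PHP request, so memory and execution time are reset for every chunk.

```php
<?php
// Hypothetical helper: compute the [start, end) item range for one batch.
function batchRange(int $batchNumber, int $batchSize, int $totalItems): array
{
    $start = $batchNumber * $batchSize;
    $end   = min($start + $batchSize, $totalItems);
    return [$start, $end];
}

$batchSize   = 50;   // items per request -- tune so one batch stays well under the limits
$totalItems  = 500;  // hypothetical total workload
$batchNumber = isset($_GET['batch']) ? (int) $_GET['batch'] : 0;

[$start, $end] = batchRange($batchNumber, $batchSize, $totalItems);

for ($i = $start; $i < $end; $i++) {
    // process item $i here (e.g. perform one HTTP request)
}

if ($end < $totalItems) {
    // Not finished: meta-refresh into the next batch, advancing the batch number.
    printf('<meta http-equiv="refresh" content="1; url=batch.php?batch=%d">', $batchNumber + 1);
    // This also answers the progress question: each page can show the percentage so far.
    printf('<p>Progress: %d%%</p>', (int) round(100 * $end / $totalItems));
} else {
    echo '<p>Done.</p>';
}
```

Because the browser requests a whole new page per chunk, the progress percentage can be rendered server-side with plain PHP, with no JavaScript required.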

Now, depending on what you are doing, you may still need to write a cache file to disk between batches using file_put_contents(), since each request starts with a clean slate and cannot see the previous batch's variables.
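For example, each batch could load the results accumulated so far, append its own, and write the file back for the next request to pick up. The file name and JSON encoding here are assumptions, not part of the original answer:

```php
<?php
// Hypothetical cache file shared by all batches of the job.
$cacheFile = sys_get_temp_dir() . '/results.cache';

// Load what previous batches saved (empty array on the first run).
$results = file_exists($cacheFile)
    ? json_decode(file_get_contents($cacheFile), true)
    : [];

// Append this batch's output (illustrative entry).
$results[] = ['batch' => count($results), 'itemsFetched' => 50];

// Write it back so the next request can continue where this one left off.
file_put_contents($cacheFile, json_encode($results));
```

Once the final batch runs, the accumulated file holds the complete result, and the last page can read it and render (or clean up) as needed.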