I run a website which sells audio downloads for doctors. At present, when a customer buys a podcast he or she is shown a direct HTML link to start the download. The link points to a ZIP file which can be 1Gb in size. My first question is: why can the download run successfully for some hours when the Apache server has a Timeout of only a few minutes? I presume this timeout applies to the server processing the GET request and sending the resultant output to the browser via an output buffer, and that perhaps there is no timeout for the buffer to be transmitted to the user. Am I correct?
My second question is about PHP timeouts. I'm not keen on a direct HTML link to a download because, if the URL became known to other users, they could download for free. A solution could be to use a PHP script to transfer the download file: send the appropriate headers, then, say, 'echo' a variable preloaded with the download data. My question is whether the timeouts in php.ini would affect the download; max_execution_time is usually 30 secs for the script to complete. So, is the script still running during the download? Or does the script complete quickly and put the download data into a PHP output buffer? If the latter, is there any timeout for the buffer to empty?
I presume I'd have to set a large value for 'memory_limit' in php.ini (128M by default) to allow for the size of the 1Gb download.
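For what it's worth, a gatekeeper script doesn't have to load the whole file into a variable at all: if it reads and echoes the file in small chunks, memory use stays tiny regardless of file size, so memory_limit never needs to approach 1Gb. (Also worth knowing: on non-Windows systems, max_execution_time counts only time the script itself spends executing, not time spent waiting on I/O, so long transfers often don't trip it anyway; set_time_limit(0) removes the limit explicitly.) A minimal sketch, with the path and filename as assumptions and the purchase check omitted:

```php
<?php
// Hypothetical gatekeeper download script (sketch, not production code).
// Assumes the ZIP lives outside the web root; checking that this customer
// actually bought the podcast is not shown.

function stream_download(string $path, string $downloadName): void
{
    set_time_limit(0); // lift max_execution_time for this request

    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');

    // Stream in 8 KB chunks: memory use stays small no matter how big
    // the file is, so memory_limit can stay at its default.
    $fh = fopen($path, 'rb');
    while (!feof($fh)) {
        echo fread($fh, 8192);
        flush(); // hand each chunk to the web server instead of buffering it all
    }
    fclose($fh);
}

// e.g. after verifying the customer's purchase:
// stream_download('/srv/protected/podcast.zip', 'podcast.zip');
```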
I host a client's "radio journals" (I won't identify them here - PM me if you want to see them in action) in which 28Mb audio files are served regularly. That's a far cry from 1Gb; however, once the "streaming" link is established, neither Apache nor PHP is involved - only the player on the client side. Downloads are the same: once started, the browser acts much like an FTP client (some have the ability to "resume", which is very important with files of this size).
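One caveat on resume: browsers resume by sending an HTTP "Range" header, and Apache honours it automatically for a plain file link, but a PHP gatekeeper script has to handle it itself. A sketch of the idea, handling only the simple open-ended "bytes=start-" form (a production script would also cover "start-end" and multi-range requests):

```php
<?php
// Sketch of honouring a simple resume request in a PHP download script.
// Only the "Range: bytes=start-" form is handled here.

function parse_range(?string $header, int $size): array
{
    // Returns [start, length] of the bytes to send.
    if ($header !== null && preg_match('/^bytes=(\d+)-$/', $header, $m)) {
        $start = (int)$m[1];
        if ($start < $size) {
            return [$start, $size - $start];
        }
    }
    return [0, $size]; // no usable Range header: send the whole file
}

function stream_range(string $path): void
{
    $size = filesize($path);
    [$start, $length] = parse_range($_SERVER['HTTP_RANGE'] ?? null, $size);

    if ($start > 0) {
        header('HTTP/1.1 206 Partial Content');
        header(sprintf('Content-Range: bytes %d-%d/%d', $start, $size - 1, $size));
    }
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . $length);

    $fh = fopen($path, 'rb');
    fseek($fh, $start);                       // skip the part already downloaded
    while ($length > 0 && !feof($fh)) {
        $chunk = fread($fh, min(8192, $length));
        echo $chunk;
        $length -= strlen($chunk);
    }
    fclose($fh);
}
```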
Think “out-of-the-box” recommendation: A 1Gb file seems ridiculous even with today’s technology. Wouldn’t it be better to put the file on a DVD (where you should be able to add copy protection - a simple “corrupted file” should suffice for the disk) and mail it? Even with an ADSL 2+ connection, it takes something approaching forever for me to download 150Mb - 300Mb video files of TV shows! More often than not, the link simply dies and the download is lost; it would be far worse with your 1,000Mb file!
Thanks for your help; your comments were very useful. Our downloads go worldwide, so postage is not an option. In any case, most downloads complete in two hours or so and we don't get complaints about download times.
No problem. If your downloads are going through in 2 hours, it's obviously not a PHP/Apache problem. Your customers are simply fetching the file, so only the end-to-end bandwidth should be a limiting factor.
Having said that, though, one client of mine does get an occasional complaint about slow downloads (to OK from GA), and I suspect there are too many simultaneous requests for the same file. Since your file is so large, may I suggest that you host it at a couple of locations and rotate your links accordingly?
If it's applicable, I'd avoid shared hosting for this sort of thing, primarily because on a heavily contended server, long-running Apache threads/processes can end up being killed due to other resource use on the server. There is also the issue, as David mentioned, of bandwidth contention. You could use a CDN that has an authentication API, such as Amazon S3, to create single-use or time-limited download links, giving you good resilience and reliability at reasonable cost.
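With S3 the AWS SDK generates these "presigned" URLs for you, but the underlying idea - a link that carries an expiry time and an HMAC signature over it, so it can't be forged or reused after it expires - can also be rolled by hand if you stay on your own server. A hypothetical sketch (the secret, script name, and parameter names are all made up for illustration):

```php
<?php
// Home-rolled time-limited download link: the same idea as an S3
// presigned URL. All names here are hypothetical.

const SECRET = 'replace-with-a-long-random-server-side-secret';

function make_link(string $file, int $expires): string
{
    // Sign the filename and expiry so neither can be tampered with.
    $sig = hash_hmac('sha256', $file . '|' . $expires, SECRET);
    return sprintf('/download.php?file=%s&expires=%d&sig=%s',
        urlencode($file), $expires, $sig);
}

function link_is_valid(string $file, int $expires, string $sig, int $now): bool
{
    if ($now > $expires) {
        return false; // link has expired
    }
    $expected = hash_hmac('sha256', $file . '|' . $expires, SECRET);
    return hash_equals($expected, $sig); // constant-time comparison
}
```

download.php would call link_is_valid() and, only if it passes, stream the file; a shared URL then stops working as soon as the expiry passes.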
You could always split your 1Gb file into what are presumably individual podcast episodes. Not all of your customers will have the fastest broadband, and smaller files may be more manageable for them.