Hi,
we need to be able to provide our client with the ability to upload large files through their PHP application. They actually requested ‘unlimited’ file sizes, but in reality we’re looking at 5GB max.
We started off using a standard form with a file field. This was fine for files up to about 1GB, although we did have to raise the server’s post_max_size and upload_max_filesize settings to cater for this. As the client has requested the ability to upload larger and larger files, this method has become problematic, both in terms of memory usage and the reliability of the HTTP connection remaining stable long enough for the uploads to complete.
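For reference, these are the php.ini directives involved for whole-file uploads (the directive names are real; the values below are examples, not our exact settings):

```ini
; Illustrative php.ini settings for large whole-file uploads
upload_max_filesize = 1G
post_max_size = 1G          ; must be >= upload_max_filesize
max_execution_time = 0      ; 0 = no time limit for the handling script
max_input_time = -1         ; -1 = no limit on parsing the request body
memory_limit = 256M         ; uploads stream to temp files, not into RAM
```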
I know it must be possible to give them the ability to upload 5GB files because I see other web services offering much larger uploads (YouTube can handle over 20GB).
We’ve looked at the HTML5 File API, which should allow uploads to be completed in ‘chunks’ that are then stitched together on the server. I’m not entirely clear on how this works - for example, does each ‘chunk’ just count as a single 200KB upload, or will we still need a large memory setting on the server? Even this is not ideal, as the majority of the company are on IE8, where the HTML5 APIs don’t work.
I just wondered what the best way to approach this is - we’re open to trying pretty much anything.
Thank you
Hi punkstjimmy,
I’ve used Plupload to handle uploads in the past, and that has built-in support for chunking… you should be able to handle files of any size that way, and as far as I know it shouldn’t require large amounts of memory, because each chunk is sent in a separate request. Also, the range of browsers supported is broad, so you don’t have to confine yourself to HTML5 only.
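To answer the memory question: each chunk arrives as its own small HTTP request, so the server only ever buffers one chunk at a time and appends it to a partial file. Here’s a language-agnostic sketch of that server-side shape (Python for brevity; a PHP handler with `fopen(..., 'ab')` and `move_uploaded_file` works the same way - the names like `save_chunk` and the staging directory are mine, not part of Plupload):

```python
import os
import tempfile

# Hypothetical staging directory for partial uploads
CHUNK_DIR = os.path.join(tempfile.gettempdir(), "uploads")

def save_chunk(upload_id, chunk_index, total_chunks, data):
    """Append one chunk to the partial file; memory use is one chunk, not the whole file."""
    os.makedirs(CHUNK_DIR, exist_ok=True)
    partial = os.path.join(CHUNK_DIR, upload_id + ".part")
    # A plain append assumes chunks arrive in order; Plupload sends
    # them sequentially by default.
    with open(partial, "ab") as f:
        f.write(data)
    if chunk_index == total_chunks - 1:
        final = os.path.join(CHUNK_DIR, upload_id)
        os.replace(partial, final)  # atomic "upload complete"
        return final
    return None

# Simulated client: slice a 1MB payload into 200KB chunks and "upload" each
payload = os.urandom(1024 * 1024)
chunk_size = 200 * 1024
chunks = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]
result = None
for idx, chunk in enumerate(chunks):
    result = save_chunk("demo-upload", idx, len(chunks), chunk)

with open(result, "rb") as f:
    assert f.read() == payload  # reassembled file matches the original
```

The same idea scales to 5GB: the peak memory on the server is one chunk plus file-handle overhead, whatever the total file size.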
I’ve used chunking before too. (I created my own code to do it - it was a long time ago and there were no readily-available code snippets at the time). As long as it is handled right it can work really well for large files. 
Maybe the best option is ‘chunking’ very large files, but I don’t think PHP is going to make uploading large files easy, or even consistently possible - especially for anyone on a shared hosting environment. In those cases you need something that actually speaks FTP, such as a Java applet. http://www.simple2ftp.com has a PHP script that wraps the Java FTP code and makes it pretty easy for users and site owners to set up and use. Depending on your needs and the resources you have access to, you might want to try this script.
Sounds like the client is trying to do the equivalent of flying around the world on a bicycle.
The form upload option uses HTTP, which was designed for delivering web pages rather than for long-running bulk transfers (and most consumer connections are far slower upstream than downstream anyway). A far better solution when dealing with large files would be to use a service actually intended for handling files, such as FTP, SFTP, XCOM etc., where 50TB files are no more of a problem than 5KB files.
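As a sketch of what that looks like scripted (Python’s stdlib ftplib here purely for illustration; the host, credentials and filenames are placeholders), the file is streamed from disk in small blocks, so memory stays flat regardless of file size:

```python
from ftplib import FTP

def upload_via_ftp(local_path, host, user, password, remote_name):
    """Stream a local file to an FTP server in 8KB blocks (constant memory)."""
    # Assumes a plain-FTP server; prefer FTP_TLS or SFTP in production
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            # storbinary reads and sends blocksize bytes at a time
            ftp.storbinary("STOR " + remote_name, f, blocksize=8192)

# Example call (placeholder details - not run here):
# upload_via_ftp("/path/to/huge.iso", "ftp.example.com", "user", "secret", "huge.iso")
```

Resumable transfers are also easier this way - storbinary’s `rest` argument lets you restart a broken upload from a byte offset instead of from zero.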