I’m using the following script to download large files (>100 MB). It works well except that it never seems to save the final chunk: if the file is 149,499 on the server, the download finishes at 145,996. Why? How do I get the last 2% or so to flush so the download completes? Thanks very much for your help.
FYI, the same thing happens on smaller files, so it isn't stopping because of a timeout or file-size limit.
$path = "the/file/path.mp4";
$headsize = get_headers($path, 1);
$ext = str_from_last_occurrence($_vars['filepath'], ".");
if ($ext == "mp3") { $type = "audio"; }
elseif ($ext == "mp4") { $type = "video"; }
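// For completeness: str_from_last_occurrence is a small helper of mine; for the
// purposes of this question it behaves roughly like this (returns the part of
// $haystack after the last occurrence of $needle):
function str_from_last_occurrence($haystack, $needle) {
    $pos = strrpos($haystack, $needle);
    return ($pos === false) ? $haystack : substr($haystack, $pos + strlen($needle));
}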
function readfile_chunked($filename, $retbytes = true) {
    // Stream the file out in chunks
    $handle = fopen($filename, 'rb');
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk (1 MB)
    $buffer = '';
    $cnt = 0;
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return number of bytes delivered, like readfile() does
    }
    return $status;
}
header('Cache-Control: no-cache, no-store, max-age=0, must-revalidate');
header('Content-Type: ' . $type . '/' . $ext);
header('Content-Length: ' . $headsize['Content-Length']);
header('Content-Disposition: attachment; filename="' . str_from_last_occurrence($_vars['filepath'], "/") . '"');
header('Content-Transfer-Encoding: binary');
readfile_chunked($path);
exit;
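If it helps narrow this down, readfile_chunked() already returns the number of bytes it echoed (when $retbytes is true), so the last two lines could be swapped for something like this to compare what actually gets sent against the advertised Content-Length (the error_log() call is just for illustration):

$sent = readfile_chunked($path);
error_log('Content-Length header: ' . $headsize['Content-Length'] . ' / bytes echoed: ' . $sent);
exit;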