Generally, downloaded files are loaded into memory as a byte array. This is not a problem for small files, but with large files (hundreds of MB, or even GB), even a simple download can trigger an out-of-memory error. The simplest way to avoid this in PHP is chunking: reading and writing the file in small, fixed-size pieces. Let’s see how to accomplish that in this tutorial.
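To make the idea concrete, here is a minimal sketch of chunked copying. The helper name `download_in_chunks` and the 8 MB default are illustrative choices, not a library API; because `fopen()` accepts both local paths and URLs, the same function works for either.

```php
<?php

// Hypothetical helper illustrating the chunking idea: only $chunk_size
// bytes of the source are held in memory at any moment, no matter how
// large the file is.
function download_in_chunks(string $url, string $dest, int $chunk_size = 8 * 1024 * 1024): int
{
    $in = fopen($url, 'rb');
    $out = fopen($dest, 'wb');
    if ($in === false || $out === false) {
        return 0;
    }
    $written = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunk_size);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $written += fwrite($out, $chunk);
    }
    fclose($in);
    fclose($out);
    return $written;
}
```

Contrast this with `file_put_contents($dest, file_get_contents($url))`, which buffers the entire remote file in memory before writing a single byte.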
Prerequisite: the php.ini directive allow_url_fopen must be enabled, since the script opens a remote URL with fopen().
The script:
<?php

$url = 'https://my_file_url';

// use basename() to derive the file name from the URL
$file_name = basename($url);

// 8 MB per chunk
$chunk_size = 8 * (1024 * 1024);

// send the proper headers to the browser
// (Content-Length is omitted because the remote file's size is not known in advance)
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
header('Content-Disposition: attachment; filename="' . $file_name . '"');

// open the URL as a read stream ('b' keeps binary data intact on all platforms)
$input_stream = fopen($url, 'rb');

// download the file from the web in small chunks
if ($input_stream) {
    // save the file to the current directory
    $output_stream = fopen('./' . $file_name, 'wb');
    if ($output_stream) {
        // only one chunk is held in memory at any time
        while (!feof($input_stream)) {
            $chunk = fread($input_stream, $chunk_size);
            fwrite($output_stream, $chunk);
            // relay the chunk to the browser as well, so the headers
            // above actually deliver the file to the client
            echo $chunk;
            flush();
        }
        fclose($output_stream);
    }
    fclose($input_stream);
}
Conclusion:
There are other, more professional methods for downloading large files with PHP, such as cURL or mod_xsendfile. But if you just want a pure-PHP solution, the script above will work for most use cases.
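For reference, a minimal sketch of the cURL alternative mentioned above. The function name and paths are illustrative; the key option is CURLOPT_FILE, which makes libcurl write the response body straight to an open file handle instead of buffering it in memory.

```php
<?php

// Illustrative cURL-based downloader (requires the php-curl extension).
function curl_download(string $url, string $dest): bool
{
    $fp = fopen($dest, 'wb');
    if ($fp === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body to $fp
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok === true;
}
```

Unlike the stream-based script, this does not require allow_url_fopen, and cURL handles redirects, timeouts, and TLS options more robustly.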