I have to download a big file (1xx MB) using PHP.

How can I download it without wasting as much memory (RAM) as the file itself on a temporary copy?

When I use

$something = file_get_contents('http://somehost.example/file.zip');
file_put_contents('myfile.zip', $something);

I need as much memory as the size of that file.

Is it possible to download it some other way?

For example, in parts (say 1024 bytes each): write one part to disk, download the next part, and repeat until the file is fully downloaded?
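The chunked approach described above can be sketched like this: read a fixed-size buffer from the remote stream and append it to a local file, so memory use stays at roughly one buffer regardless of file size. This is a minimal sketch assuming allow_url_fopen is enabled; the function name, URL, and paths are illustrative, not from any library.

```php
<?php
/**
 * Copy a remote file to disk one small buffer at a time.
 * Memory use stays near $chunk bytes no matter how big the file is.
 */
function download_in_chunks(string $url, string $dest, int $chunk = 8192): bool
{
    $in  = fopen($url, 'rb');   // remote stream (needs allow_url_fopen for http://)
    $out = fopen($dest, 'wb');  // local destination
    if ($in === false || $out === false) {
        return false;
    }
    while (!feof($in)) {
        $buf = fread($in, $chunk);
        if ($buf === false || fwrite($out, $buf) === false) {
            fclose($in);
            fclose($out);
            return false;
        }
    }
    fclose($in);
    fclose($out);
    return true;
}
```

Usage would look like `download_in_chunks('http://somehost.example/file.zip', 'myfile.zip');`.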

2 Answers

Copy the file one small chunk at a time:

/**
 * Copy remote file over HTTP one small chunk at a time.
 *
 * @param $infile  The full URL to the remote file
 * @param $outfile The path where to save the file
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 Megs

    /**
     * parse_url breaks a URL apart into its parts, i.e. host, path,
     * query string, etc.
     */
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle === false || $o_handle === false) {
        return false;
    }

    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    /**
     * Send the request to the server for the file
     */
    $request  = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    /**
     * Now read the headers from the remote server. We'll need
     * to get the content length.
     */
    $headers = array();
    while (!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break;
        $headers[] = $line;
    }

    /**
     * Look for the Content-Length header, and get the size
     * of the remote file.
     */
    $length = 0;
    foreach ($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int)str_replace('Content-Length: ', '', $header);
            break;
        }
    }

    /**
     * Start reading in the remote file, and writing it to the
     * local file one chunk at a time.
     */
    $cnt = 0;
    while (!feof($i_handle)) {
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes === false) {
            return false;
        }
        $cnt += $bytes;

        /**
         * We're done reading when we've reached the content length
         */
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}

Adjust the $chunksize variable to your needs. This has only been mildly tested. It could easily break for a number of reasons.

Usage:

copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg'); 
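When allow_url_fopen is not available, the cURL extension offers another route: CURLOPT_FILE makes libcurl write the response body straight to an open file handle, so the whole file never sits in PHP's memory. This is a sketch assuming the php-curl extension is installed; the function name is illustrative.

```php
<?php
/**
 * Stream a URL straight to disk via cURL; returns true on success.
 */
function curl_download(string $url, string $dest): bool
{
    $fp = fopen($dest, 'wb');
    if ($fp === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write body directly to $fp
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP >= 400 as failure
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok === true;
}
```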

4 Comments

Code looks good, but it's impossible to allow users to open remote files that way. Maybe you have similar code using fsockopen?
It should work if the allow_url_fopen directive is turned on in PHP. But I'll update my example to show socket use.
Fairly sure your $errorcode and $errorstr are backwards in fsockopen.
I have allow_url_fopen enabled and still get a 500 error.

You can shell out to wget using exec(); this will result in the lowest memory usage, since PHP never holds the file itself.

<?php exec("wget -O outputfilename.tar.gz http://pathtofile/file.tar.gz"); ?>

You can also try using fopen(), fread(), and fwrite(). That way you only download x bytes into memory at a time.
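The fopen()/fread()/fwrite() loop mentioned above can also be written with stream_copy_to_stream(), which copies between two open handles in small internal chunks, so memory use stays constant. A minimal sketch, assuming allow_url_fopen is on for http:// sources; the function name here is illustrative.

```php
<?php
/**
 * Copy a URL to a local file via PHP's stream copy helper.
 * Returns the number of bytes copied, or false on failure.
 */
function stream_download(string $url, string $dest)
{
    $in  = fopen($url, 'rb');
    $out = fopen($dest, 'wb');
    if ($in === false || $out === false) {
        return false;
    }
    $bytes = stream_copy_to_stream($in, $out); // chunked copy, constant memory
    fclose($in);
    fclose($out);
    return $bytes;
}
```

Usage would look like `stream_download('http://somehost.example/file.zip', 'myfile.zip');`.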

2 Comments

This is not working for larger files, around 1 GB+.
Wrong flag: lowercase -o writes wget's download log to a file; uppercase -O names the actual output file.
