
I am using Laravel Storage and I want to serve users some files that are larger than the memory limit. My code was inspired by a post on SO and goes like this:

$fs = Storage::getDriver();
$stream = $fs->readStream($file->path);

return response()->stream(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $file->mime,
        'Content-disposition' => 'attachment; filename="' . $file->original_name . '"',
    ]
);

Unfortunately, I run into an error for large files:

[2016-04-21 13:37:13] production.ERROR: exception 'Symfony\Component\Debug\Exception\FatalErrorException' with message 'Allowed memory size of 134217728 bytes exhausted (tried to allocate 201740288 bytes)' in /path/app/Http/Controllers/FileController.php:131
Stack trace:
#0 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(133): Symfony\Component\Debug\Exception\FatalErrorException->__construct()
#1 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(118): Illuminate\Foundation\Bootstrap\HandleExceptions->fatalExceptionFromError()
#2 /path/vendor/laravel/framework/src/Illuminate/Foundation/Bootstrap/HandleExceptions.php(0): Illuminate\Foundation\Bootstrap\HandleExceptions->handleShutdown()
#3 /path/app/Http/Controllers/FileController.php(131): fpassthru()
#4 /path/vendor/symfony/http-foundation/StreamedResponse.php(95): App\Http\Controllers\FileController->App\Http\Controllers\{closure}()
#5 /path/vendor/symfony/http-foundation/StreamedResponse.php(95): call_user_func:{/path/vendor/symfony/http-foundation/StreamedResponse.php:95}()
#6 /path/vendor/symfony/http-foundation/Response.php(370): Symfony\Component\HttpFoundation\StreamedResponse->sendContent()
#7 /path/public/index.php(56): Symfony\Component\HttpFoundation\Response->send()
#8 /path/public/index.php(0): {main}()
#9 {main}

It seems that it tries to load the whole file into memory. I was expecting that using a stream and fpassthru would avoid this. Is something missing in my code? Do I have to specify a chunk size somehow?

The versions I am using are Laravel 5.1 and PHP 5.6.

  • The only scenario I can think of where fpassthru allocates into memory is when using output buffering. You might therefore try a loop on fread with an echo. Commented May 2, 2016 at 15:23

5 Answers


It seems that output buffering is still building up the response in memory.

Try disabling output buffering before calling fpassthru:

function () use ($stream) {
    while (ob_get_level() > 0) ob_end_flush();
    fpassthru($stream);
},

There could be multiple output buffers active, which is why the while loop is needed.
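Put together with the code from the question, a minimal sketch of the full controller method could look like this (same $file model and headers as in the question):

$fs = Storage::getDriver();
$stream = $fs->readStream($file->path);

return response()->stream(
    function () use ($stream) {
        // Flush and close every active output buffer so the file is
        // streamed straight to the client instead of piling up in memory.
        while (ob_get_level() > 0) ob_end_flush();
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $file->mime,
        'Content-disposition' => 'attachment; filename="' . $file->original_name . '"',
    ]
);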


1 Comment

This answer addresses the actual issue that was causing problems in my attempted implementation, so I accept it and award you the bounty. Thanks to everyone for the other responses, which are also valuable pieces of information!

Instead of loading the whole file into memory at once, try to use fread to read and send it chunk by chunk.

Here is a very good article: http://zinoui.com/blog/download-large-files-with-php

<?php
// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';
$metaData = $fs->getMetadata($fileName);
$handle = $fs->readStream($fileName);

header('Pragma: public');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Content-Transfer-Encoding: binary');
header('Content-Disposition: attachment; filename="' . $metaData['path'] . '";');
header('Content-Type: ' . $metaData['type']);

/*
 I've commented the following line out, because
 \League\Flysystem\Filesystem uses int for the file size.
 For a file larger than PHP_INT_MAX (2147483647) bytes
 it may return 0, which results in

     Content-Length: 0

 and that stops the browser from downloading the file.
 Try to figure out a way to get the file size represented
 by a string (e.g. using a shell command / 3rd-party plugin?).
*/
//header('Content-Length: ' . $metaData['size']);

// Read and send the file in 1 MB chunks.
$chunkSize = 1024 * 1024;
while (!feof($handle)) {
    $buffer = fread($handle, $chunkSize);
    echo $buffer;
    ob_flush();
    flush();
}
fclose($handle);
exit;
?>
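As a hedged sketch of the workaround the comment in the code alludes to, you could shell out for the size so the value never passes through a PHP integer. This assumes a local disk and a GNU stat binary (on BSD/macOS the equivalent flag is stat -f%z); storage_path is Laravel's helper for the local storage directory:

// Hedged sketch, not from the original answer: fetch the file size as a
// string via the shell so it is never truncated to PHP_INT_MAX.
$path = storage_path('app/' . $fileName);
$size = trim((string) shell_exec('stat -c%s ' . escapeshellarg($path)));
if (ctype_digit($size)) {
    // Emit the size as a string; it is never cast to an int in PHP.
    header('Content-Length: ' . $size);
}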

Update

A simpler way to do this: just call

if (ob_get_level()) ob_end_clean(); 

before returning a response.

Credit to @Christiaan

// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';
$metaData = $fs->getMetadata($fileName);
$stream = $fs->readStream($fileName);

if (ob_get_level()) ob_end_clean();

return response()->stream(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $metaData['type'],
        'Content-disposition' => 'attachment; filename="' . $metaData['path'] . '"',
    ]
);

7 Comments

This is exactly what fpassthru is for, no need to complicate things.
I don't think so. I did an experiment: fpassthru resulted in exactly the same error. With this method I'm able to download the file.
@Christiaan I've updated the code in my answer and you may run this experiment on your own computer (just generate a 20 GB file).
With fpassthru, did you make sure you disabled output buffering? Because that is what your example does by calling flush every time.
@Christiaan You are right, thanks for pointing it out. Yeah, it's actually a very simple problem, how could I have missed the point? Just call if (ob_get_level()) ob_end_clean(); before returning a response. I will update the answer and give credit to you.

X-Send-File.

X-Send-File is an internal directive with variants for Apache, nginx, and lighttpd. It lets you skip serving the file through PHP entirely: it instructs the webserver what to send as the response in place of the actual response from the FastCGI backend.

I've dealt with this before on a personal project and if you want to see the sum of my work, you can access it here:
https://github.com/infinity-next/infinity-next/blob/master/app/Http/Controllers/Content/ImageController.php#L250-L450

This deals not only with distributing files, but also with handling seeking in streaming media. You are free to use that code.

Here is the official nginx documentation on X-Send-File.
https://www.nginx.com/resources/wiki/start/topics/examples/xsendfile/

You do have to edit your webserver configuration and mark specific directories as internal for nginx to honor X-Send-File directives.
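For illustration, here is a minimal sketch of the nginx variant, where the directive is called X-Accel-Redirect; the /protected prefix and the storage path are hypothetical and must match an internal location in your server configuration:

// Hypothetical nginx location, marked internal so it cannot be
// requested directly from outside:
//
//     location /protected/ {
//         internal;
//         alias /path/to/storage/app/;
//     }
//
// After authorizing the request in the controller, return an empty body
// and let nginx perform the actual file transfer:
return response('', 200, [
    'X-Accel-Redirect'    => '/protected/' . $file->path,
    'Content-Type'        => $file->mime,
    'Content-Disposition' => 'attachment; filename="' . $file->original_name . '"',
]);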

I have example configuration for both Apache and nginx for my above code here.
https://github.com/infinity-next/infinity-next/wiki/Installation

This has been tested on high-traffic websites. Do not buffer media through a PHP daemon unless your site has next to no traffic, or you will bleed resources.

2 Comments

I would really like to implement this, but I am not sure about the security. Can you explain if using X-Send-File adds any risks of exposing the file to unauthorized clients?
You can use controller policies with this which is why I love the solution so much. However, you should be aware that nginx and potentially CDNs like Cloudflare may cache the file and distribute it to anyone who has the URL.

You could try using Symfony's StreamedResponse component directly, instead of the Laravel wrapper for it.
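A minimal sketch of that, reusing the $file model from the question:

use Symfony\Component\HttpFoundation\StreamedResponse;

$stream = Storage::getDriver()->readStream($file->path);

// Bypass Laravel's response() helper and build the Symfony response directly.
return new StreamedResponse(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type'        => $file->mime,
        'Content-Disposition' => 'attachment; filename="' . $file->original_name . '"',
    ]
);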



https://www.php.net/readfile

<?php
$file = 'monkey.gif';

if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    readfile($file);
    exit;
}
?>
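Note that readfile() writes the file straight to the output; according to the PHP manual it will not present memory issues even for large files, provided output buffering is disabled first (see the ob_end_clean() call in the accepted answer). It does bypass Laravel's Storage abstraction, though, so it is mainly suited to locally stored files.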

