Why does readfile() exhaust PHP memory?

tmsimont · Jul 8, 2011 · Viewed 14.5k times

I've seen many questions about how to use PHP to serve file downloads efficiently, rather than allowing direct HTTP requests to the files (to keep files secure, to track downloads, etc.).

The answer is almost always PHP readfile().

But although it works great in testing with huge files, on a live site with hundreds of users, downloads start to hang and PHP memory limits are exhausted.

So what is it about how readfile() works that causes memory to blow up so badly when traffic is high? I thought it was supposed to avoid heavy use of PHP memory by writing directly to the output buffer?

EDIT: (To clarify, I'm looking for a "why", not a "what can I do". I think Apache's mod_xsendfile is the best way to circumvent the problem.)

Answer

user482594 · Jul 8, 2011
Description

```php
int readfile ( string $filename [, bool $use_include_path = false [, resource $context ]] )
```

Reads a file and writes it to the output buffer.

PHP has to read the file and write it to the output buffer. So for a 300 MB file, no matter how your implementation reads it (in many small segments or in one big chunk), PHP eventually has to push all 300 MB through.
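A frequent culprit is PHP-level output buffering: the manual notes that readfile() on its own should not cause memory issues even for large files, but if an output buffer is active, the output piles up in that buffer and counts against memory_limit. A minimal sketch, assuming a hypothetical, pre-validated $filePath:

```php
<?php
// Sketch only: $filePath is a hypothetical, pre-validated local path.
$filePath = '/var/files/big.iso';

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filePath));
header('Content-Disposition: attachment; filename="big.iso"');

// Flush and disable every nested output buffer so readfile()
// streams straight to the client instead of accumulating the
// whole file in PHP memory.
while (ob_get_level() > 0) {
    ob_end_flush();
}

readfile($filePath);
exit;
```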

If multiple users have to download the file at once, there will be a problem. (On a shared server, hosting providers limit the memory given to each account; with such limited memory, buffering whole files is not a good idea.)
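When many requests run concurrently, even per-request buffers add up. A chunked loop bounds the peak memory per request at the chunk size rather than the file size; a hedged sketch, where the 8 KB chunk size and $filePath are assumptions:

```php
<?php
// Sketch only: stream in fixed-size chunks so peak memory per
// request stays near the chunk size, regardless of file size.
$filePath = '/var/files/big.iso'; // hypothetical path, as above

$handle = fopen($filePath, 'rb');
if ($handle === false) {
    http_response_code(404);
    exit;
}
while (!feof($handle)) {
    echo fread($handle, 8192); // read and emit one 8 KB chunk
    flush();                   // push it to the client now
}
fclose($handle);
```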

I think using a direct link, so the web server serves the file without PHP in the middle, is a much better approach for big files.
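The mod_xsendfile approach the asker mentions gets that direct-serving behavior while keeping PHP in the loop for authentication and download tracking: PHP emits an X-Sendfile header and Apache performs the actual transfer. A sketch, assuming the module is installed and XSendFile On is enabled for the vhost:

```php
<?php
// Sketch only: requires Apache's mod_xsendfile module with
// "XSendFile On" configured. PHP handles auth/tracking here,
// then Apache streams the file itself.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big.iso"');
header('X-Sendfile: /var/files/big.iso'); // absolute path; hypothetical
exit;
```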