Client download of large files using readfile()

JSylvia007
Forum Newbie
Posts: 1
Joined: Mon Apr 17, 2006 4:10 pm

Client download of large files using readfile()

Post by JSylvia007 »

feyd | Please use [code][/code] and [syntax="..."] tags where appropriate when posting code. Your post has been edited to reflect how we'd like it posted. Please read: [url=http://forums.devnetwork.net/viewtopic.php?t=21171]Posting Code in the Forums[/url] to learn how to do it too.

First off... I don't know if this is a PHP problem or an Apache problem, but here is the situation, and I'm hoping that someone can at least narrow it down.

We have a machine here that hosts some processed videos. These files are LARGE; most are 1 GB, with a max of about 2 GB. What I have is a small site which basically lists the files from the directories, and when you click a link, a PHP script serves the file. Below is the PHP script:

Code: Select all

<?php
header("Pragma: public"); // required
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private", false); // required for certain browsers
header("Content-Type: video/quicktime"); // "video/mov" is not a registered MIME type
header('Content-Disposition: attachment; filename="' . basename($file_path) . '"'); // was basename($file), a different variable than the rest of the script uses
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file_path));
readfile($file_path);
?>
I know the above script works, because we use it all over the place for serving smaller files (about 100 MB and smaller). The problem is with these larger files. They start to download fine: the client machine gets the full file size and the download begins. Then, 9 times out of 10, the file only downloads 8-10 MB and the download cancels automatically. It's not the client machines, as we have tested on every browser known and the exact same behavior exists.

In short - am I missing a config entry somewhere that I need to enable? Is this definitely a PHP issue? Can anyone help?

~Jake


timvw
DevNet Master
Posts: 4897
Joined: Mon Jan 19, 2004 11:11 pm
Location: Leuven, Belgium

Post by timvw »

And what do your error logs say? Ran out of memory? Maximum execution time exceeded?

Anyway, use http://www.php.net/fpassthru instead of readfile()...
Another nice thing to have is an option that allows the user to 'resume' his download...
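A minimal sketch of the fpassthru() route, assuming the same $file_path and headers as the original script (send_with_fpassthru() is a made-up helper name, not anything from the poster's code):

```php
<?php
// Sketch: stream a large file with fpassthru() instead of readfile().
// $file_path is assumed to already hold a validated path to the video.
function send_with_fpassthru($file_path)
{
    $fp = fopen($file_path, 'rb');
    if ($fp === false) {
        return false;
    }
    header('Content-Type: video/quicktime');
    header('Content-Disposition: attachment; filename="' . basename($file_path) . '"');
    header('Content-Length: ' . filesize($file_path));
    fpassthru($fp); // writes the remaining bytes of the stream straight to output
    fclose($fp);
    return true;
}
```

fpassthru() streams from the already-open file handle, so nothing forces PHP to build the whole file in a string first.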
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Post by Chris Corbyn »

PHP's memory_limit defaults to 8 MB. timvw's suggestion should work, or you can maybe use output buffering with ob_flush().
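One way to keep under that memory cap is to send the file in fixed-size chunks, flushing each one before reading the next, so PHP never holds more than one chunk at a time. A rough sketch (readfile_chunked() and the 1 MB chunk size are illustrative choices, not the poster's code):

```php
<?php
// Sketch: stream a file in fixed-size chunks so memory use stays flat.
// readfile_chunked() is a hypothetical helper; 1 MB chunks are an arbitrary choice.
function readfile_chunked($file_path, $chunk_size = 1048576)
{
    $fp = fopen($file_path, 'rb');
    if ($fp === false) {
        return false;
    }
    $bytes_sent = 0;
    while (!feof($fp)) {
        $chunk = fread($fp, $chunk_size);
        echo $chunk;
        $bytes_sent += strlen($chunk);
        // Push this chunk out to the client before reading the next one.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }
    fclose($fp);
    return $bytes_sent; // total bytes echoed
}
```

Because each iteration flushes, the output buffer never grows past one chunk regardless of the file's total size.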
jmut
Forum Regular
Posts: 945
Joined: Tue Jul 05, 2005 3:54 am
Location: Sofia, Bulgaria

Post by jmut »

I don't think http is about downloading large files. Why don't you supply ftp or something which is meant for this purpose?
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

I've never had problems with readfile() sending quite large files (multiple gigabytes).
timvw
DevNet Master
Posts: 4897
Joined: Mon Jan 19, 2004 11:11 pm
Location: Leuven, Belgium

Post by timvw »

jmut wrote:I don't think http is about downloading large files. Why don't you supply ftp or something which is meant for this purpose?
Could you explain how ftp would be any better than http? Imho they're more or less the same..

(Apart from the fact that many ppl consider ftp broken by design)
jmut
Forum Regular
Posts: 945
Joined: Tue Jul 05, 2005 3:54 am
Location: Sofia, Bulgaria

Post by jmut »

timvw wrote:
jmut wrote:I don't think http is about downloading large files. Why don't you supply ftp or something which is meant for this purpose?
Could you explain how ftp would be any better than http? Imho they're more or less the same..

(Apart from the fact that many ppl consider ftp broken by design)
I can't. I just thought it is better as it is file transfer protocol.
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Post by Chris Corbyn »

jmut wrote:
timvw wrote:
jmut wrote:I don't think http is about downloading large files. Why don't you supply ftp or something which is meant for this purpose?
Could you explain how ftp would be any better than http? Imho they're more or less the same..

(Apart from the fact that many ppl consider ftp broken by design)
I can't. I just thought it is better as it is file transfer protocol.
FTP would reveal the full path to the original file also, whereas over HTTP you can use PHP to mask it.

HTTP is fine for downloading files. It sends the headers needed and can transfer the data, so it's all good :) Most Linux distros let you download full ISO images over HTTP.
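On the resume point timvw raised, HTTP supports it through the Range request header. A rough sketch of serving an open-ended byte range (serve_range() is a hypothetical helper that only handles the simple "bytes=N-" form):

```php
<?php
// Sketch: honor a simple "Range: bytes=N-" request so clients can resume.
// Only open-ended ranges are handled; a real handler would cover more cases.
function serve_range($file_path)
{
    $size = filesize($file_path);
    $start = 0;
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int)$m[1];
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-" . ($size - 1) . "/$size");
    }
    header('Accept-Ranges: bytes'); // tells clients resuming is supported
    header('Content-Length: ' . ($size - $start));
    $fp = fopen($file_path, 'rb');
    fseek($fp, $start); // skip the part the client already has
    fpassthru($fp);
    fclose($fp);
}
```

A client that downloaded 8 MB before the connection died can then re-request with "Range: bytes=8388608-" instead of starting over.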