[SOLVED] Trying to speed up reading from multiple ftp files

Posted: Thu Sep 09, 2004 6:35 pm
by dayhuffb
I have a webserver and an FTP server in two different locations. The webserver needs to give the user the option to download whichever files they choose in a single request. I can accomplish this if I concatenate all the selected files into one large file and download that. The only problem is that when too many files are selected, it takes longer than 30 seconds to open, read, and close them all, at which point the server times out. Our ISP, understandably, doesn't want to extend the timeout period, which leaves me looking for a faster solution. Does anybody have any ideas?

Code: Select all

<?php
$base_dir = "ftp://username:password@ipaddress" . $_GET["directory"];
$files = $_GET["files"];
$output_files = explode(",", $files);
$buffer = "";

// Loop over every entry; a trailing comma in $files leaves an empty
// last element, so skip blanks rather than stopping one file short.
for ($i = 0; $i < count($output_files); $i++) {
	if ($output_files[$i] === "") {
		continue;
	}
	$connection = $base_dir . $output_files[$i];

	if ($fp = @fopen($connection, "rb")) {
		while (!feof($fp)) {
			$buffer .= fread($fp, 8000);
		}
		fclose($fp);
	}
}
?>
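One thing that might shave time and memory off the loop above: instead of accumulating every file into `$buffer` and sending it at the end, each file can be streamed straight to the client as it is read, using the standard `fpassthru()` and `flush()` functions. A rough sketch (the `$base_dir` and `$output_files` values here are placeholders standing in for the ones the original script builds from `$_GET`):

```php
<?php
// Sketch: stream each selected file directly to the client instead of
// building one large $buffer in memory first.
$base_dir     = "./";               // placeholder for the ftp:// prefix
$output_files = ["a.txt", "b.txt"]; // placeholder for the exploded file list

foreach ($output_files as $name) {
    if ($name === "") {
        continue; // a trailing comma in the query leaves an empty entry
    }
    if ($fp = @fopen($base_dir . $name, "rb")) {
        fpassthru($fp); // copy the rest of the stream straight to output
        fclose($fp);
        flush();        // push the data to the browser as we go
    }
}
?>
```

This doesn't make the FTP reads themselves any faster, but the browser starts receiving data immediately instead of waiting for the whole concatenation, and the script never holds more than one read chunk in memory.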

Posted: Thu Sep 09, 2004 6:51 pm
by feyd
Do they allow calling [php_man]set_time_limit[/php_man]?
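For anyone finding this later: the call would go at the top of the script, before the slow FTP loop. A minimal sketch (the value 300 is an arbitrary example; some hosts disable this function via safe mode or `disable_functions`):

```php
<?php
// Raise this script's execution limit before the slow FTP loop.
// 0 would mean "no limit"; a finite value is friendlier to shared hosts.
set_time_limit(300);

// ... the fopen()/fread() loop from the post above runs here ...
?>
```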

Posted: Thu Sep 09, 2004 6:58 pm
by dayhuffb
That worked, thanks for your help!!!