[SOLVED] Trying to speed up reading from multiple ftp files

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

dayhuffb
Forum Newbie
Posts: 8
Joined: Thu Aug 12, 2004 5:45 pm

[SOLVED] Trying to speed up reading from multiple ftp files

Post by dayhuffb »

I have a webserver and an FTP server in two different locations. The webserver needs to give the user the option to download whichever files they choose in one go. I can accomplish this if I concatenate all the selected files into one large file and download that. The only problem is that when too many files are selected, it takes longer than 30 seconds to open, read, and close all the files, at which point the server times out. Our ISP, understandably, doesn't want to extend the timeout period, which leaves me looking for a faster solution. Does anybody have any ideas?

Code: Select all

<?php
$base_dir = "ftp://username:password@ipaddress" . $_GET["directory"];
$files = $_GET["files"];
$output_files = explode(",", $files);

$buffer = ""; // initialize before appending to it

// iterate every entry (the old for loop with count()-1 skipped the last file)
foreach ($output_files as $file) {
	if ($file === "") continue; // guard against a trailing comma in the list
	$connection = $base_dir . $file;

	if ($fp = @fopen($connection, "rb")) {
		while (!feof($fp)) {
			$buffer .= fread($fp, 8000);
		}
		fclose($fp);
	}
}
?>
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

do they allow calling [php_man]set_time_limit[/php_man] ?
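For reference, a minimal sketch of what feyd is suggesting (the time values and the surrounding loop are example assumptions, not anything prescribed by the thread): each call to set_time_limit() restarts PHP's execution timer from zero, so calling it once at the top, and again before each file, means only a single transfer has to finish within the limit rather than the whole batch. Note that set_time_limit() has no effect when PHP is running in safe mode.

```php
<?php
// Sketch only: reset the execution timer so the 30-second limit
// doesn't kill the script partway through a long batch of downloads.
// $base_dir and $output_files are assumed to be set as in the original post.

set_time_limit(300);        // allow up to 300s from this point onward

$buffer = "";
foreach ($output_files as $file) {
	set_time_limit(60);     // restart the counter before each file, so
	                        // only one transfer at a time must finish in 60s
	if ($fp = @fopen($base_dir . $file, "rb")) {
		while (!feof($fp)) {
			$buffer .= fread($fp, 8000);
		}
		fclose($fp);
	}
}
?>
```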
dayhuffb
Forum Newbie
Posts: 8
Joined: Thu Aug 12, 2004 5:45 pm

Post by dayhuffb »

That worked. Thanks for your help!!!