I have a webserver and an FTP server in two different locations. The webserver needs to give the user the option to download whichever files they choose in a single request. I can accomplish this by concatenating all the selected files into one large file and sending that. The only problem is that when too many files are selected, opening, reading, and closing all of them takes longer than 30 seconds, at which point the server times out. Our ISP, understandably, doesn't want to extend the timeout period, which leaves me looking for a faster solution. Does anybody have any ideas?
Code:
<?php
// Base FTP URL; the target directory comes from the query string.
$base_dir = "ftp://username:password@ipaddress" . $_GET["directory"];
$files = $_GET["files"];
$output_files = explode(",", $files);

$buffer = "";                     // must be initialised before the .= below
$size = count($output_files);     // the original count()-1 skipped the last file
for ($i = 0; $i < $size; $i++) {
    $connection = $base_dir . $output_files[$i];
    // Open each file over the FTP stream wrapper and append it to the buffer.
    if ($fp = @fopen($connection, "rb")) {
        while (!feof($fp)) {
            $buffer .= fread($fp, 8000);
        }
        fclose($fp);
    }
}
// $buffer now holds all of the files concatenated, ready to be sent.
?>
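For what it's worth, here is a rough sketch of the same idea without building the whole concatenation in memory first: each file is echoed to the client in small chunks and flushed immediately, so data starts flowing right away instead of after every read has finished. The function names (`build_file_urls`, `stream_files`) are mine, not from the snippet above, and this assumes output buffering is off on the server.

```php
<?php
// Build full FTP URLs from a base directory and a comma-separated list,
// dropping empty entries (e.g. from a trailing comma) and encoding names.
function build_file_urls($base_dir, $file_list) {
    $names = array_filter(array_map('trim', explode(',', $file_list)), 'strlen');
    $urls = array();
    foreach ($names as $name) {
        $urls[] = $base_dir . rawurlencode($name);
    }
    return $urls;
}

// Stream each file to the client chunk by chunk, flushing after every
// chunk so nothing accumulates server-side while the FTP reads run.
function stream_files($urls) {
    foreach ($urls as $url) {
        if ($fp = @fopen($url, "rb")) {
            while (!feof($fp)) {
                echo fread($fp, 8192);
                flush();   // push this chunk to the client now
            }
            fclose($fp);
        }
    }
}
?>
```

Whether this beats the 30-second limit depends on what is actually timing out: if the ISP kills the script on total execution time it won't help, but if it is an idle/output timeout, sending bytes continuously usually keeps the connection alive.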