PHP FTP help
Posted: Wed Aug 08, 2007 9:14 am
by legend986
1. I'm trying to fetch a large file from another FTP server onto my local server. For files larger than, say, 30 MB, the PHP execution timeout kicks in. Is there a way I could pause the transfer, reload the PHP script, and resume it to complete the whole transfer?
2. I can transfer a file from one server to another. How would I do the same for an entire directory?
Posted: Wed Aug 08, 2007 10:05 am
by pickle
1)
set_time_limit()
2) There is no FTP function to retrieve an entire directory. If your FTP client has a command window and you watch closely, you'll see that the client opens a directory, lists all the files in it, then downloads them individually. You'll have to do the same.
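As a rough sketch of what that loop looks like: list the directory, skip the `.` and `..` entries a listing usually includes, then fetch each file with ftp_get(). The connection handling is omitted, and `$ftp` and the paths below are illustrative, assuming an already-authenticated connection.

```php
<?php
// Filter a directory listing down to the entries worth downloading.
// A listing usually includes "." and ".."; drop those.
function files_to_fetch(array $entries)
{
    return array_values(array_diff($entries, ['.', '..']));
}

// Sketch of the list-then-fetch loop (assumes $ftp is already connected):
// $entries = ftp_nlist($ftp, '/public_html');
// foreach (files_to_fetch($entries) as $name) {
//     ftp_get($ftp, "local/$name", "/public_html/$name", FTP_BINARY);
// }
```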
Posted: Wed Aug 08, 2007 10:52 am
by legend986
1. Thank you... One doubt, however: I read about it on the website, but what if max_execution_time is defined by the host? Will that be ignored?
2. In that case, I need to fetch each and every file name, right?
I am getting the listing using
Code:
$list = ftp_rawlist($resource, '/public_html');
but how would I fetch the file names from that listing so I can transfer them?
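If you do want to pull names out of ftp_rawlist() output, one approach is to split each line on whitespace and take everything after the metadata fields. Raw listing formats vary by server, so this is a sketch for Unix-style `LIST` output only; the helper name `rawlist_name` is made up.

```php
<?php
// Pull the file name out of one line of a Unix-style raw listing such as
// "-rw-r--r--   1 user group  1234 Aug  8 09:14 index.php".
// Fields 0-7 are metadata; field 8 (everything after the timestamp) is the
// name, so a split limit of 9 keeps names containing spaces intact.
function rawlist_name($line)
{
    $parts = preg_split('/\s+/', trim($line), 9);
    return count($parts) === 9 ? $parts[8] : null;
}

$raw = [
    '-rw-r--r--   1 user group  1234 Aug  8 09:14 index.php',
    'drwxr-xr-x   2 user group  4096 Aug  8 09:14 images',
];
$names = array_map('rawlist_name', $raw);
// $names is ['index.php', 'images']
```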
Posted: Wed Aug 08, 2007 12:05 pm
by timvw
Have a look at the ftp_nlist function...
Here's an example:
Code:
// make sure we have enough time to execute this script
set_time_limit(1200);
// connect to the FTP server
$ftp = ftp_connect('ftp.scarlet.be');
ftp_login($ftp, 'anonymous', 'password');
// get the files that are available here
$local = glob('*.*');
// get the files that are available there
$remote = ftp_nlist($ftp, '.');
// download the files there that are not available here
foreach ($remote as $file)
{
    if (!in_array($file, $local))
    {
        // we don't have the file, so download it
        ftp_get($ftp, $file, $file, FTP_BINARY);
    }
}
// close the connection
ftp_close($ftp);
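On the first question (resuming a large transfer), ftp_get() takes an optional offset argument that tells the server where to restart, so a script that timed out partway can pick up from the size of the partial local file on the next run. This is a sketch only: `$ftp`, `$remote`, and `$local` are illustrative names, and it assumes the server supports resumed transfers (the REST command).

```php
<?php
// Where should a resumed download restart? From the end of whatever
// partial file the previous (timed-out) run left behind.
function resume_offset($localFile)
{
    return is_file($localFile) ? filesize($localFile) : 0;
}

// Usage sketch (assumes $ftp is an authenticated connection):
// $offset = resume_offset($local);
// ftp_get($ftp, $local, $remote, FTP_BINARY, $offset);
```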