
Downloading Files Problem

Posted: Thu Jul 19, 2007 7:19 pm
by NoSpin
I am starting to think that PHP was not the right language to choose for my problem, and I would like to know whether my issue can be solved easily, or whether I need to switch to Perl.

I need to set up a cron job to download a list of files from a PHP server. So far I have written the code to log in, get a list of files available on the server, and download one file from the server.

The files are quite large so I need to begin the download in the background.

For some reason, after I start the first download, the PHP code stops running. This may be because it ran into an error, because the server's security settings stopped the execution, or simply because of the way PHP is designed.

At the moment I am considering having a download script run every five minutes. All it does is check the database to see whether a file needs to be downloaded and, if so, start downloading that file. A second script would fill the database once a night. This is of course a bad way to do it, but it is the only way I can think of doing it in PHP.
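For what it's worth, that polling design is workable. Here is a minimal sketch of the five-minute worker, assuming a hypothetical `downloads` table with `id`, `url`, and `status` columns (the table, column names, and file paths are all made up for illustration, not from the original post):

```php
<?php
// Hypothetical worker, run from cron every five minutes.
// Table and column names are assumptions for this sketch.
$db = new PDO('sqlite:queue.db');
$db->exec("CREATE TABLE IF NOT EXISTS downloads
           (id INTEGER PRIMARY KEY, url TEXT, status TEXT DEFAULT 'pending')");

// Grab one pending file, if any.
$row = $db->query("SELECT id, url FROM downloads
                   WHERE status = 'pending' LIMIT 1")->fetch(PDO::FETCH_ASSOC);

if ($row) {
    // Mark it first so the next cron run does not pick it up again.
    $stmt = $db->prepare("UPDATE downloads SET status = 'running' WHERE id = ?");
    $stmt->execute([$row['id']]);

    // Launch wget detached; redirecting all output lets exec() return
    // immediately instead of waiting for the download to finish.
    $cmd = '/usr/local/bin/wget -t 45 ' . escapeshellarg($row['url'])
         . ' > /dev/null 2>&1 &';
    exec($cmd);
}
```

The nightly script would just INSERT the file list with status 'pending'.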

What do you think?

In case you are wondering, this is the command I am using to download the file:
$command = '/usr/local/bin/wget -t 45 -o ' . "$logname $filename &";
echo exec($command);
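One likely culprit with that exact command: even with the trailing &, exec() will wait as long as the child process can still write to PHP's output pipes. Redirecting stdout and stderr lets it return at once. A sketch of a safer launch (the example values are placeholders; `echo $!` is just a shell idiom that prints the background process's PID so you can track it):

```php
<?php
// Placeholder values; in the real script these come from the
// login/listing code described above.
$logname  = '/tmp/wget.log';
$filename = 'http://example.com/big-file.iso';

// Redirect stdout/stderr and background the process so exec()
// returns immediately; 'echo $!' prints the child's PID, which
// becomes exec()'s return value.
$cmd = '/usr/local/bin/wget -t 45 -o ' . escapeshellarg($logname) . ' '
     . escapeshellarg($filename) . ' > /dev/null 2>&1 & echo $!';
$pid = (int) exec($cmd);
```

escapeshellarg() also protects you if a filename ever contains spaces or shell metacharacters.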

Thanks,
Brett

Posted: Fri Jul 20, 2007 9:50 am
by Begby
It sounds like the PHP script is timing out. You may want to check the max_execution_time directive in your php.ini (the CLI often uses its own php.ini) to see the maximum running time allowed for a script.

Posted: Fri Jul 20, 2007 11:44 am
by Christopher
You can modify that setting in your script at runtime (if your settings allow it). That way you can keep the default for all scripts but set a longer timeout just for your download script.

Posted: Fri Jul 20, 2007 11:45 am
by Ambush Commander
set_time_limit() is the function you want. You may want to write a shell script instead, though.
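To sketch that suggestion: cron can drive a plain shell script directly and sidestep PHP's time limit entirely. A minimal example, where the list file and paths are placeholders for whatever your nightly script produces:

```shell
#!/bin/sh
# Hypothetical cron job: fetch every URL listed in a file, one per line.
# Both paths are placeholders; adjust for your setup.
URL_LIST=/var/spool/downloads/urls.txt
LOG=/var/log/fetch.log

if [ -f "$URL_LIST" ]; then
    while read -r url; do
        # -t 45: retry up to 45 times (matching the flag in the original
        # post); -a appends to the log instead of overwriting it.
        /usr/local/bin/wget -t 45 -a "$LOG" "$url"
    done < "$URL_LIST"
fi
```

Since cron runs the script outside the web server, there is no script timeout to worry about, and the downloads run one after another instead of piling up.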