
Single Cron Job Limit Circumventor

Posted: Sat Dec 02, 2006 3:36 pm
by DaveTheAve
Name: Single Cron Job Limit Circumventor
Why: My shared website provider allows each account only one concurrent cron job; to many this is hell. My solution was to make a single file that, once run, executes multiple files as if a cron job had run each of them.
Notes: To add a new cron job, append a new inner array to the $runFiles array. Each inner array has two elements: the first is the time limit (in seconds) to give the cron job to run, and the second is the URL or relative path of the cron job.

Code:

<?php
// Each entry: array(seconds to allow the job to run, URL or relative path of the job)
$runFiles = array(
	array('60', 'http://www.example.com/cron.job.php'),
);

for ($i = 0, $k = count($runFiles); $i < $k; $i++) {
	set_time_limit((int) $runFiles[$i][0]); // reset the execution timer for this job
	$run = file_get_contents($runFiles[$i][1]); // request the job; its output is discarded
}
So any feedback? Anyone even need it?

Posted: Sat Dec 02, 2006 3:45 pm
by feyd
Using include() can be dangerous; file_get_contents() is likely better and safer. Even better would be fsockopen() if you're doing remote requests. For local requests I'd likely use system() or its siblings.
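A minimal sketch of the fsockopen() idea feyd mentions (the function name, host, and path below are my own illustration, not from this thread): send the request headers and close the socket without reading the reply, so the calling script doesn't block while the remote cron job runs.

```php
<?php
// Fire-and-forget HTTP request via fsockopen(): connect, send a
// minimal GET, and disconnect immediately. The remote server keeps
// executing the cron script even though we never read the response.
function fire_and_forget($host, $path, $timeout = 5)
{
    $fp = @fsockopen($host, 80, $errno, $errstr, $timeout);
    if (!$fp) {
        return false; // connection failed
    }
    $request  = "GET $path HTTP/1.1\r\n";
    $request .= "Host: $host\r\n";
    $request .= "Connection: Close\r\n\r\n";
    fwrite($fp, $request);
    fclose($fp); // don't wait for the response body
    return true;
}

// Example (hypothetical URL):
// fire_and_forget('www.example.com', '/cron.job.php');
```

One caveat: some servers abort a script when the client disconnects, so the cron script may need ignore_user_abort(true) at the top for this to work reliably.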

Posted: Sat Dec 02, 2006 4:39 pm
by DaveTheAve
I liked the file_get_contents() idea, so I've implemented it. Thanks feyd.

Posted: Sat Dec 02, 2006 11:36 pm
by AKA Panama Jack
If they don't have the system() function blocked and you know the path to the wget command, you can do something like this...

Code:

$runFiles = array(
	array('60', 'http://www.example.com/cron.job.php'),
);

for ($i = 0, $k = count($runFiles); $i < $k; $i++) {
	// --background detaches wget, so the loop doesn't wait for each job to finish
	system('/usr/bin/wget --background -q -o /dev/null -O /dev/null ' . escapeshellarg($runFiles[$i][1]));
}
You do not need set_time_limit() here, as this runs each task as a background process rather than waiting for one to finish before starting the next.