Single Cron Job Limit Circumventor

Coding Critique is the place to post source code for peer review by other members of DevNetwork. Any kind of code can be posted. Code posted does not have to be limited to PHP. All members are invited to contribute constructive criticism with the goal of improving the code. Posted code should include some background information about it and what areas you specifically would like help with.

Popular code excerpts may be moved to "Code Snippets" by the moderators.

Moderator: General Moderators

DaveTheAve
Forum Contributor
Posts: 385
Joined: Tue Oct 03, 2006 2:25 pm
Location: 127.0.0.1

Single Cron Job Limit Circumventor

Post by DaveTheAve »

Name: Single Cron Job Limit Circumventor
Why: My shared hosting provider allows each account only one concurrent cron job, which for many people is far too restrictive. My solution was to make a single file that, once run, requests multiple files as if a cron job had run each of them.
Notes: To add a new cron job, append a new second-dimension array to the first-dimension array called $runFiles. Each second-dimension array has two elements: the first is the number of seconds to give the cron job to run, the second is the URL or relative path of the cron job.

Code:

<?php
// Each entry: array(seconds to allow the job, URL or path of the job).
$runFiles = array(
	array(60, 'http://www.example.com/cron.job.php'),
);

foreach ($runFiles as $job) {
	set_time_limit($job[0]);     // give this job its own time budget
	file_get_contents($job[1]);  // hit the script as if cron had run it
}
So any feedback? Anyone even need it?
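One caveat worth flagging for reviewers: set_time_limit() raises the limit for the whole PHP script, not for an individual fetch. If the intent is a per-job timeout, a stream context on file_get_contents() expresses that directly. A minimal sketch of that variant (the function names are mine, and the URL is the same placeholder as in the post above):

```php
<?php
// Build a context whose HTTP timeout applies to a single fetch.
function makeJobContext($seconds)
{
    return stream_context_create(array(
        'http' => array('timeout' => $seconds),
    ));
}

// Run each job with its own timeout instead of one script-wide limit.
function runJobs(array $runFiles)
{
    foreach ($runFiles as $job) {
        file_get_contents($job[1], false, makeJobContext($job[0]));
    }
}

// Usage: runJobs(array(array(60, 'http://www.example.com/cron.job.php')));
```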
Last edited by DaveTheAve on Sat Dec 02, 2006 4:38 pm, edited 1 time in total.
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Using include() can be dangerous; file_get_contents() is likely better, or at least safer. Even better would be fsockopen() if you're doing remote requests. For local requests I'd likely use system() or one of its siblings.
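For the fsockopen() route, the appeal is that a request can be fired and the socket closed without reading the response, so the loop does not block on each job. A rough sketch of that idea (the URL parsing and raw request format below are my assumptions, not feyd's code):

```php
<?php
// Build the raw HTTP request for a given URL. Kept as a pure helper
// so the request format is easy to inspect.
function buildHttpRequest($url)
{
    $parts = parse_url($url);
    $path  = isset($parts['path']) ? $parts['path'] : '/';
    return "GET {$path} HTTP/1.1\r\n"
         . "Host: {$parts['host']}\r\n"
         . "Connection: Close\r\n\r\n";
}

// Send the request and return immediately, ignoring the response body.
function fireAndForget($url, $timeout = 5)
{
    $parts = parse_url($url);
    $port  = isset($parts['port']) ? $parts['port'] : 80;
    $fp = @fsockopen($parts['host'], $port, $errno, $errstr, $timeout);
    if ($fp === false) {
        return false; // could not connect; caller decides what to do
    }
    fwrite($fp, buildHttpRequest($url));
    fclose($fp); // close without waiting for the reply
    return true;
}
```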
DaveTheAve
Forum Contributor
Posts: 385
Joined: Tue Oct 03, 2006 2:25 pm
Location: 127.0.0.1

Post by DaveTheAve »

I liked the file_get_contents() idea, so I implemented it. Thanks feyd.
AKA Panama Jack
Forum Regular
Posts: 878
Joined: Mon Nov 14, 2005 4:21 pm

Post by AKA Panama Jack »

If they don't have the system() function blocked and you know the path to the wget command, you can do something like this...

Code:

$runFiles = array(
	array('60', 'http://www.example.com/cron.job.php'),
);

foreach ($runFiles as $job) {
	// --background forks wget; -q silences it; -o/-O discard log and page
	system("/usr/bin/wget --background -q -o /dev/null -O /dev/null " . $job[1]);
}
You do not need set_time_limit() here: wget runs each task as a background process, so the loop does not wait for one job to finish before starting the next.
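One hardening note on the approach above: if the URLs ever come from anywhere other than a hard-coded array, the argument should be shell-escaped before it reaches system(). A small sketch using escapeshellarg() (the helper name is mine; the wget path and flags mirror the post above):

```php
<?php
// Build the wget command with the URL safely quoted for the shell.
// escapeshellarg() is the addition over the original one-liner.
function buildWgetCommand($url)
{
    return '/usr/bin/wget --background -q -o /dev/null -O /dev/null '
         . escapeshellarg($url);
}

// Usage: system(buildWgetCommand('http://www.example.com/cron.job.php'));
```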