Intensive loops / max_execution_time
Moderator: General Moderators
-
determinedmoth
- Forum Commoner
- Posts: 33
- Joined: Wed Jul 07, 2004 9:13 am
Intensive loops / max_execution_time
Hi
I'm currently working on functions that, among other things, will be looping through credit card transactions via XMLRPC.
I'll potentially be processing 500+ at one time.
During testing, it currently takes about 10 seconds to loop through 10 credit card transactions. This poses the problem of the script timing out.
I basically need some advice on the route to take.
My "biller" page will be executed via cron on a set date, preferably only ONCE.
Can I then loop through say, 50 transactions, and then use a header redirect to hit the script again and do another batch of 50? Or will that still time-out? I'm unsure if php will think this is 1 execution, or several.
I will have full access to the server and its configuration... Linux-based.
Thanks.
Code:
set_time_limit(0); // lift the max_execution_time limit for this run
determinedmoth
- Forum Commoner
- Posts: 33
- Joined: Wed Jul 07, 2004 9:13 am
I know I can set the time limit to whatever I like - but like you said, I should be doing something else here, and my question is really: what should I be doing?
What's the best (safest) method of breaking it down?
Can I use header() until there are no transactions left, batching 50 at a time, or is this not best practice?
- jayshields
- DevNet Resident
- Posts: 1912
- Joined: Mon Aug 22, 2005 12:11 pm
- Location: Leeds/Manchester, England
I'm not sure if that's a good idea but it is a solution.
You could have the cron run an ample number of times every month. Say you've got 500 things to process and can handle 20 at a time before timing out: every 30 days, run the script 25 times in succession with about a minute between each run.
You could use a database to track how far through the transactions you've got, so that the next run of the script knows where to start from.
This solution is flawed if you have an unpredictable amount of things for it to process, so I'd wait for a better opinion before doing anything.
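A minimal sketch of that tracking idea. An in-memory array stands in for the database here just to keep the example self-contained; in practice the 'done' flag would be a status column you UPDATE after each successful charge:

```php
<?php
// Sketch: each run asks for the next batch of not-yet-processed transactions,
// so a new run automatically picks up where the last one stopped.

function nextBatch(array $transactions, int $size): array
{
    // Keep only transactions not yet processed, then take the first $size.
    $pending = array_filter($transactions, function ($t) {
        return !$t['done'];
    });
    return array_slice(array_values($pending), 0, $size);
}

$txns = [
    ['id' => 1, 'done' => true],  // finished in an earlier run
    ['id' => 2, 'done' => false],
    ['id' => 3, 'done' => false],
];

$batch = nextBatch($txns, 2);
echo count($batch) . "\n"; // 2
```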
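For the "25 runs with a minute between each" schedule, a crontab entry along these lines would do it (the paths are placeholders):

```
# Hypothetical crontab: on the 1st of the month, run biller.php once a minute
# from 02:00 to 02:24 - 25 runs in total. The script itself should exit early
# when there is nothing left to process.
0-24 2 1 * * /usr/bin/php /var/www/scripts/biller.php >> /var/log/biller.log 2>&1
```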
-
determinedmoth
- Forum Commoner
- Posts: 33
- Joined: Wed Jul 07, 2004 9:13 am
determinedmoth wrote: Can I use header() until there are no transactions left, batching 50 at a time, or is this not best practice?
header? Hmm........
A quick question: For each card, you do have to communicate back with some processing gateway?
Assuming the answer to that is yes, take a look at CURL.
And just to make sure we are all on the same page here, you don't need to use a browser at all, OK? Write this as a cron job that's called at whatever interval you feel is helpful.
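A hedged sketch of what talking to a gateway with cURL from a cron script could look like. The URL and field names here are placeholders, not any real gateway's API:

```php
<?php
// Post form-encoded card data to a processing gateway and return its response.
// Runs fine from the CLI - no browser or header() redirects involved.

function postToGateway($url, array $fields)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the response
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang on one card
    $response = curl_exec($ch);
    curl_close($ch);
    return $response; // false on failure
}
```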
- Maugrim_The_Reaper
- DevNet Master
- Posts: 2704
- Joined: Tue Nov 02, 2004 5:43 am
- Location: Ireland
Basically you're estimating 10-15 minutes for processing 500+ cards. I'd break it down into batches, run each under a 0 time limit, and keep a running record in the database of all successfully completed transactions. If there's ever a break in the flow, the next script run should be intelligent enough to pick up where the last one failed/ended.
You could still do it all in one process without batching - just manage the time limit as suggested earlier. PHP isn't unstable and about to break up after 15 minutes of activity. You still need to have regular scheduled calls to pick up on any problems afterwards.
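Putting that together, a sketch of one long resumable run. chargeCard() is a stand-in for the real XMLRPC call (here it just succeeds), and the queue is an array for illustration; in practice the status would live in the database:

```php
<?php
// One run under set_time_limit(0), marking each transaction as soon as its
// outcome is known so a crashed run can resume from the remaining 'pending' rows.

set_time_limit(0); // allow the full 10-15 minute run

function chargeCard(array $txn)
{
    return true; // placeholder: real code would talk to the gateway
}

$queue = [
    ['id' => 1, 'status' => 'done'],    // finished in a previous run
    ['id' => 2, 'status' => 'pending'],
    ['id' => 3, 'status' => 'pending'],
];

$processed = 0;
foreach ($queue as &$txn) {
    if ($txn['status'] !== 'pending') {
        continue; // already handled - this is what makes the run resumable
    }
    // Record the outcome immediately, not at the end of the whole run.
    $txn['status'] = chargeCard($txn) ? 'done' : 'failed';
    $processed++;
}
unset($txn);

echo "Processed $processed transactions\n"; // Processed 2 transactions
```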