Intensive loops / max_execution_time

Posted: Tue Jun 12, 2007 7:37 am
by determinedmoth
Hi
I'm currently working on functions that, among other things, will be looping through credit card transactions via XMLRPC.

I'll potentially be processing 500+ at one time.

During testing, it currently takes about 10 seconds to loop through 10 credit card transactions. This poses the problem of the script timing out.

I basically need some advice on the route to take.

My "biller" page will be executed via cron on a set date, preferably only ONCE.

Can I then loop through, say, 50 transactions, and then use a header redirect to hit the script again and do another batch of 50? Or will that still time out? I'm unsure whether PHP will treat this as one execution or several.

I will have full access to the server and its configuration... Linux-based.

Thanks.

Posted: Tue Jun 12, 2007 7:46 am
by Oren

Code:

set_time_limit(0);
But you really should split it into smaller chunks.

Posted: Tue Jun 12, 2007 7:51 am
by determinedmoth
I know I can set the time limit to whatever I like - but like you said, I should be doing something else here, and my question is really "what" should I be doing here?

What's the best (safest) method of breaking it down?

Can I use header() until there are no transactions left, batching 50 at a time, or is this not best practice?

Posted: Tue Jun 12, 2007 7:59 am
by jayshields
I'm not sure if that's a good idea but it is a solution.

You could have the cron run enough times each month to cover the workload (say you've got 500 things to process and can handle 20 at a time before timing out: every 30 days, run the script 25 times in succession with about a minute between each run).

You could use a database to track which transactions you've done up to, so that the next run of the script knows where to start from.
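Something like this, roughly (names and structure are made up for illustration; in practice the transactions would be rows in a table, with the SELECT/UPDATE shown in the comments):

```php
<?php
// Each transaction here is an array with an 'id' and a 'processed' flag,
// keyed by id. With a real database you'd do something like:
//   SELECT * FROM transactions WHERE processed = 0 ORDER BY id LIMIT 50

// Return the next batch of unprocessed transactions.
function next_batch(array $transactions, $batchSize)
{
    $batch = array();
    foreach ($transactions as $t) {
        if (!$t['processed']) {
            $batch[] = $t;
            if (count($batch) >= $batchSize) {
                break;
            }
        }
    }
    return $batch;
}

// Mark a batch as done so the next run skips it.
// (DB equivalent: UPDATE transactions SET processed = 1 WHERE id = ?)
function mark_processed(array $transactions, array $batch)
{
    foreach ($batch as $done) {
        $transactions[$done['id']]['processed'] = true;
    }
    return $transactions;
}
```

Each cron run then just calls next_batch(), processes it, and marks it off; if a run dies, nothing already marked gets re-billed.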

This solution is flawed if you have an unpredictable amount of things for it to process, so I'd wait for a better opinion before doing anything.

Posted: Tue Jun 12, 2007 8:25 am
by determinedmoth
jayshields wrote:You could use a database to track which transactions you've done up to, so that the next run of the script knows where to start from.
For reference, I am doing this.

Posted: Tue Jun 12, 2007 8:32 am
by BDKR
determinedmoth wrote: Can use header() until there are no transaction left, batching 50 at a time, or is this not best practice?
header? Hmm........

A quick question: For each card, you do have to communicate back with some processing gateway?

Assuming the answer to that is yes, take a look at cURL.

And just to make sure we are all on the same page here, you don't need to use a browser at all, OK? Write this as a cron job that's called at whatever interval you feel is helpful.
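For the cURL side, a bare sketch of POSTing an XML-RPC call to a gateway (the method name, parameter, and URL below are placeholders; SecPay's actual interface will differ):

```php
<?php
// Build a minimal XML-RPC <methodCall> payload.
function xmlrpc_payload($method, array $params)
{
    $xml  = "<?xml version=\"1.0\"?>\n<methodCall>\n";
    $xml .= "  <methodName>" . htmlspecialchars($method) . "</methodName>\n";
    $xml .= "  <params>\n";
    foreach ($params as $p) {
        $xml .= "    <param><value><string>" . htmlspecialchars($p)
              . "</string></value></param>\n";
    }
    $xml .= "  </params>\n</methodCall>\n";
    return $xml;
}

// POST it to the gateway with cURL (URL is a placeholder):
// $ch = curl_init('https://example-gateway/xmlrpc');
// curl_setopt($ch, CURLOPT_POST, true);
// curl_setopt($ch, CURLOPT_POSTFIELDS, xmlrpc_payload('processCard', array($cardRef)));
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $response = curl_exec($ch);
// curl_close($ch);
```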

Posted: Tue Jun 12, 2007 8:41 am
by determinedmoth
BDKR wrote:
A quick question: For each card, you do have to communicate back with some processing gateway?
Yes - SecPay via XMLRPC library - that's working fine.
BDKR wrote: And just to make sure we are all on the same page here, you don't need to use a browser at all OK?
Nope. No browser.

Posted: Tue Jun 12, 2007 11:41 am
by Maugrim_The_Reaper
Basically you're estimating 10-15 minutes for processing 500+ cards. I'd break it down into batches, run each under a 0 time limit, and keep a running database record of all successfully completed transactions. If there's ever a break in the flow, the next run of the script should be intelligent enough to pick up where the last one failed/ended.
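The overall shape of that loop, sketched (the $done array stands in for whatever the database records; everything else is illustrative):

```php
<?php
set_time_limit(0); // no PHP timeout for this cron/CLI run

// Hypothetical resume logic: $done holds the IDs already completed
// (in practice loaded from the database, not kept in an array).
function run_batch(array $allIds, array $done, $batchSize)
{
    $pending = array_values(array_diff($allIds, $done));
    $batch   = array_slice($pending, 0, $batchSize);
    foreach ($batch as $id) {
        // ... call the payment gateway for $id here ...
        $done[] = $id; // record success immediately, so a crash loses at most one
    }
    return $done;
}
```

Recording each success as it happens (rather than at the end of the batch) is what makes the "pick up where it failed" part safe.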

You could still do it all in one process without batching - just manage the time limit as suggested earlier. PHP isn't so unstable that it will break down after 15 minutes of activity ;). You still need regular scheduled runs afterwards to pick up on any problems.