Intensive loops / max_execution_time

Not for 'how-to' coding questions but PHP theory instead, this forum is here for those of us who wish to learn about design aspects of programming with PHP.

Moderator: General Moderators

determinedmoth
Forum Commoner
Posts: 33
Joined: Wed Jul 07, 2004 9:13 am

Intensive loops / max_execution_time

Post by determinedmoth »

Hi
I'm currently working on functions that will, among other things, loop through credit card transactions via XMLRPC.

I'll potentially be processing 500+ at one time.

During testing, it currently takes about 10 seconds to loop through 10 credit card transactions. This poses the problem of the script timing out.

I basically need some advice on the route to take.

My "biller" page will be executed via cron on a set date, preferably only ONCE.

Can I then loop through, say, 50 transactions and then use a header redirect to hit the script again for another batch of 50? Or will that still time out? I'm not sure whether PHP will treat this as one execution or several.

I will have full access to the server and its configuration... Linux-based.

Thanks.
Oren
DevNet Resident
Posts: 1640
Joined: Fri Apr 07, 2006 5:13 am
Location: Israel

Post by Oren »

Code:

set_time_limit(0);
But you really should split it into smaller chunks.
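A minimal sketch of the suggestion above, assuming the script is started from cron rather than a browser. (Note that the CLI SAPI already defaults max_execution_time to 0, so this mainly matters for web-invoked scripts.)

```php
<?php
// Disable the execution-time limit for a long-running billing run.
set_time_limit(0);        // 0 = no execution-time limit
ignore_user_abort(true);  // keep running even if the caller disconnects

// ... loop over the card transactions here ...
```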
determinedmoth
Forum Commoner
Posts: 33
Joined: Wed Jul 07, 2004 9:13 am

Post by determinedmoth »

I know I can set the time limit to whatever I like, but as you said, I should be doing something else here. My question is really: what should I be doing?

What's the best (safest) method of breaking it down?

Can I use header() until there are no transactions left, batching 50 at a time, or is this not best practice?
jayshields
DevNet Resident
Posts: 1912
Joined: Mon Aug 22, 2005 12:11 pm
Location: Leeds/Manchester, England

Post by jayshields »

I'm not sure if that's a good idea but it is a solution.

You could have cron run the script enough times each month to cover the load. Say you've got 500 things for it to process and can handle 20 at a time before timing out: every 30 days, run the script 25 times in succession with about a minute between each run.

You could use a database to track how far you've got, so that the next run of the script knows where to start from.

This solution is flawed if you have an unpredictable number of things to process, so I'd wait for a better opinion before doing anything.
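The batch-and-resume idea could be sketched like this, assuming a database table of pending transactions. The schema and names here (transactions, progress, runBatch) are illustrative, not from the thread, and SQLite stands in for the real database.

```php
<?php
// Each cron run processes one fixed-size batch and records the last id it
// finished, so the next run resumes where this one stopped.
const BATCH_SIZE = 50;

$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE transactions (id INTEGER PRIMARY KEY, done INTEGER DEFAULT 0)');
$db->exec('CREATE TABLE progress (last_done INTEGER)');
$db->exec('INSERT INTO progress VALUES (0)');
for ($i = 1; $i <= 120; $i++) {
    $db->exec("INSERT INTO transactions (id) VALUES ($i)");
}

// One run: pick up where the last run stopped, process one batch, and record
// the highest id completed.
function runBatch(PDO $db): int {
    $last = (int) $db->query('SELECT last_done FROM progress')->fetchColumn();
    $stmt = $db->prepare(
        'SELECT id FROM transactions WHERE id > ? ORDER BY id LIMIT ' . BATCH_SIZE
    );
    $stmt->execute([$last]);
    $processed = 0;
    foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $id) {
        // ... charge card $id through the gateway here ...
        $db->exec("UPDATE transactions SET done = 1 WHERE id = $id");
        $db->exec("UPDATE progress SET last_done = $id");
        $processed++;
    }
    return $processed;
}

// Three successive runs clear 120 transactions in batches of 50, 50, 20.
echo runBatch($db), ' ', runBatch($db), ' ', runBatch($db), "\n";
```

Because progress is only advanced after each transaction completes, a timed-out or crashed run loses nothing: the next run simply continues from the last recorded id.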
determinedmoth
Forum Commoner
Posts: 33
Joined: Wed Jul 07, 2004 9:13 am

Post by determinedmoth »

jayshields wrote:You could use a database to track how far you've got, so that the next run of the script knows where to start from.
For reference, I am doing this.
BDKR
DevNet Resident
Posts: 1207
Joined: Sat Jun 08, 2002 1:24 pm
Location: Florida

Post by BDKR »

determinedmoth wrote: Can I use header() until there are no transactions left, batching 50 at a time, or is this not best practice?
header()? Hmm...

A quick question: For each card, you do have to communicate back with some processing gateway?

Assuming the answer to that is yes, take a look at cURL.

And just to make sure we are all on the same page here: you don't need to use a browser at all, OK? Write this as a cron job that's called at whatever interval you find helpful.
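A browserless entry point could look like the sketch below. The crontab line and file path are illustrative; php_sapi_name() returns 'cli' when the script is launched by cron as, say, `0 2 1 * * /usr/bin/php /path/to/biller.php` (2 a.m. on the 1st of the month).

```php
<?php
// Refuse accidental web hits; billing should only start from cron.
if (php_sapi_name() !== 'cli') {
    exit("This script runs from cron only.\n");
}

set_time_limit(0);

// foreach ($pendingTransactions as $txn) {
//     ... POST to the gateway, e.g. with curl_init()/curl_setopt()/curl_exec() ...
// }

echo 'biller started under the ', php_sapi_name(), " SAPI\n";
```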
determinedmoth
Forum Commoner
Posts: 33
Joined: Wed Jul 07, 2004 9:13 am

Post by determinedmoth »

BDKR wrote:
A quick question: For each card, you do have to communicate back with some processing gateway?
Yes - SecPay via XMLRPC library - that's working fine.
BDKR wrote: And just to make sure we are all on the same page here, you don't need to use a browser at all OK?
Nope. No browser.
Maugrim_The_Reaper
DevNet Master
Posts: 2704
Joined: Tue Nov 02, 2004 5:43 am
Location: Ireland

Post by Maugrim_The_Reaper »

Basically you're estimating 10-15 minutes to process 500+ cards. I'd break it down into batches, run each under a zero time limit, and keep a running database record of all successfully completed transactions. If there's ever a break in the flow, the next run of the script should be intelligent enough to pick up where the last one failed or ended.

You could still do it all in one process without batching; just manage the time limit as suggested earlier. PHP isn't so unstable that it falls apart after 15 minutes of activity ;). You still need regular scheduled runs afterwards to pick up on any problems.