
Very High Server Load & Abrupt halt of PHP script

Posted: Wed Mar 16, 2005 2:31 pm
by neyo
Hi, I run Apache/PHP/MySQL on a Linux box, and I have a PHP script, e.g. hello.php, which is triggered after a table is populated with, let's say, 2000 records.
The basic logic behind the triggered script is that it checks the table, retrieves the first record, processes it, deletes the record, and then checks whether there are still more records in the table. If no records are left, it exits; if there are more, it does an HTTP POST to itself (hello.php).

Problem 1:
Throughput is very low; it does about 1 record per second. How can one read records from the same table simultaneously without reading the same record more than once, given the speed of PHP scripts?

Problem 2:
I found out that after about 2-3 minutes it stops running, and I had to set up a cron job to run every 5 minutes so that the records can be processed. Could anybody tell me why?

Problem 3:
Server load increased tremendously. How can one manage server load effectively? Should an exit command be issued at the end of the script, since I declare require_once for the connection script? e.g.
<?php
//Connection statement
require_once('Connections/your_connection.php');

?>

Posted: Wed Mar 16, 2005 2:40 pm
by feyd
Use a loop, not posting to itself. Posting to itself amounts to recursion, and recursion quickly adds up RAM usage; moreover, each instance will not end until its call returns, forcing the system to open many more connections, among other things.

You can: loop, deleting/marking records as you go until the script is halted; OR loop for 5, 10, or however many records you can pack into the time allotted; OR set the script to never time out and process all records (you must still mark or delete them as you go, however, in case the script stops for uncontrolled reasons, like a memory fault).
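A minimal sketch of the loop-based approach described above. The table name `queue`, the `$fetch`/`$handle`/`$remove` callbacks, and the MySQL snippets in the comments are all illustrative assumptions, not taken from the original posts; the loop is written around callbacks so the same logic works whether the rows come from MySQL or anywhere else:

```php
<?php
// Process records one at a time until the source is exhausted.
//   $fetch  : callable returning the next record, or null when the queue is empty
//   $handle : callable that processes one record
//   $remove : callable that deletes/marks the record so it is never re-read
// Returns the number of records processed.
function drain_queue(callable $fetch, callable $handle, callable $remove): int
{
    set_time_limit(0); // let the script run past max_execution_time
    $count = 0;
    while (($record = $fetch()) !== null) {
        $handle($record);  // do the real work
        $remove($record);  // delete/mark immediately, in case we crash later
        $count++;
    }
    return $count;
}

// Against MySQL the callbacks might look like (illustrative, assuming an
// open $db link and a `queue` table with an `id` column):
//   SELECT * FROM queue ORDER BY id LIMIT 1;   -- fetch one record
//   DELETE FROM queue WHERE id = <that id>;    -- remove it once processed
```

Deleting (or marking) each record before fetching the next is what makes the loop safe to restart: if the process dies mid-run, the next run simply picks up the remaining rows.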

Posted: Wed Mar 16, 2005 3:03 pm
by neyo
Thanks for your prompt response.

I'd need to know what you mean by recursion.

The problem with a loop is that it times out, and I can't set the script to never time out; the best I can do is set the timeout interval to 90 seconds. But in that case it stops after processing 500 records, whereas a client had sent in 5000 - 10000 records to be processed.

The idea of posting to itself is that a record is handled by a particular instance of the script. Could an exit() at the end of the script, or the apache_child_terminate() function, help?

I have 1GB of RAM. What can I change on this line in my phpinfo() output to help matters?

Per Child: 0 - Keep Alive: on - Max Per Connection: 100

How else can it be done?

Posted: Wed Mar 16, 2005 4:06 pm
by feyd
First off, I'd like to ask you to stop using abbreviations for already short words.

Next, my suggestion would be to use a cron job to process small batches, frequently. You could probably also optimize the script to take less time. As long as the cron job keeps up, it doesn't really matter whether the client has 10 or 10 million records to process; they will eventually get processed, provided the insertion rate is lower than the processing rate. Considering you are only processing 1 record per second, it sounds like you could do with some major optimization.
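A sketch of the cron-driven batch approach, under the same illustrative assumptions as before (a `queue` of pending records; the cron line and SQL in the comments are examples, not the original setup). Each run takes a fixed-size batch, processes it, removes it, and exits well inside the cron interval:

```php
<?php
// One cron run: take at most $batch_size records from the front of the
// queue, process each one, and return how many were handled.
// An example crontab entry for a 5-minute interval (illustrative):
//   */5 * * * * php /path/to/process_batch.php
function process_batch(array &$queue, int $batch_size, callable $handle): int
{
    // array_splice removes the batch from $queue, so the next run
    // (or the next loop iteration) can never see the same records twice.
    $batch = array_splice($queue, 0, $batch_size);
    foreach ($batch as $record) {
        $handle($record);
    }
    return count($batch);
}

// Against MySQL the same shape would be (illustrative):
//   SELECT * FROM queue ORDER BY id LIMIT 100;  -- fetch the batch
//   ... process each row ...
//   DELETE FROM queue WHERE id IN (...);        -- so the next run skips them
```

Because each run is short and self-contained, the PHP timeout never matters, and a backlog of any size drains at a steady rate across successive cron runs.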