Problem with long running script
Posted: Thu Apr 01, 2010 10:41 am
I've written a script that is supposed to execute a number of long running processes. Trouble is, it stops.
This is for a job board, written in PHP with MySQL as a database, running on a Linux box. The processes are set up as functions. I'm using Lynx to run the script as a cron job.
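The cron entry is essentially this (the time and paths here are simplified examples, not the real ones):

```shell
# crontab entry (example times/paths) -- Lynx fetches the page,
# which is what kicks off the import script on the server
0 2 * * * /usr/bin/lynx -dump http://localhost/import.php > /dev/null 2>&1
```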
When the main script stops, I have to complete the jobs by running separate scripts from my desktop in a regular browser like Firefox or IE. Since the script stops early in the process, I spend a good deal of time running the subsequent steps "manually", one at a time.
Here are the steps:
1. Delete the old jobs2 table, create a new one, and insert data from jobs into jobs2. No problems here.
2. Download the XML data feeds (jobs) to the hard drive. There are about 165 feeds now, of varying sizes, some pretty big. This usually stops executing after a few feeds, and I have to finish it manually.
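A stripped-down sketch of what the download loop does (the feed URLs, paths, and timeout values below are placeholders, not my real ones -- the real script just fetches each feed in a loop with no per-feed error handling, which may be part of the problem):

```php
<?php
// Sketch of the per-feed download loop. URLs and paths are placeholders.
// Giving each feed its own timeout means one slow or dead feed can't
// hang the whole run, and failures get logged instead of killing the loop.
$feeds = array(
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
    // ... ~165 feeds in the real list
);

foreach ($feeds as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // give up on dead hosts
    curl_setopt($ch, CURLOPT_TIMEOUT, 600);        // cap big downloads at 10 min
    $xml = curl_exec($ch);

    if ($xml === false) {
        error_log("feed $i ($url) failed: " . curl_error($ch));
        curl_close($ch);
        continue;  // move on to the next feed instead of dying
    }
    curl_close($ch);
    file_put_contents("/var/feeds/feed$i.xml", $xml);
}
?>
```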
From this point on, I am usually running the individual scripts one at a time to complete the job.
3. Read the XML data feeds and insert the records into the jobs2 table. There are currently about 500,000 records. This stops and needs to be restarted at least once.
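The insert step boils down to something like this (column names and the $records array are placeholders -- I'm showing multi-row batches here, which isn't exactly what the script does now but is the shape of it):

```php
<?php
// Sketch of the insert step. Table/column names and $records are assumed.
// Batching multiple rows per INSERT cuts the round trips (and the number
// of full-text index updates) compared to 500,000 single-row INSERTs.
$batch = array();
$batchSize = 500;

foreach ($records as $r) {
    $batch[] = sprintf("('%s','%s')",
        mysql_real_escape_string($r['title']),
        mysql_real_escape_string($r['description']));

    if (count($batch) >= $batchSize) {
        mysql_query('INSERT INTO jobs2 (title, description) VALUES '
                    . implode(',', $batch))
            or error_log('batch insert failed: ' . mysql_error());
        $batch = array();
    }
}
// flush whatever is left over after the loop
if ($batch) {
    mysql_query('INSERT INTO jobs2 (title, description) VALUES '
                . implode(',', $batch))
        or error_log('batch insert failed: ' . mysql_error());
}
?>
```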
The other steps seem to run OK, so I guess it's really steps 2 and 3 that are the problem.
I should mention that I'm using full-text indexing, so the inserts are pretty intensive, but when I run top or mtop everything looks OK.
I've got set_time_limit(0); so a timeout is (I imagine) not the problem.
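For reference, this is what's at the top of the script now, plus the kind of thing I'm thinking of adding to catch a silent death (the ini_set lines are guesses on my part, not what's actually in there):

```php
<?php
set_time_limit(0);  // this is the only limit-related line currently in the script

// Guesses, not current code: if the script is dying silently, forcing
// errors into a log file might at least show why it stopped.
ini_set('memory_limit', '512M');              // rule out memory exhaustion
ini_set('log_errors', '1');
ini_set('error_log', '/tmp/import_errors.log');
error_reporting(E_ALL);
?>
```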
Any clues on how I can track this down?
Thanks in advance.
John