
PHP timeout on large XML feeds

Posted: Thu Nov 15, 2007 9:39 am
by nutkenz
I'm using MagicParser to parse an XML file containing about 65,000 entries. The browser screen stays white for about a minute and then stops loading... Files of up to 10,000 entries are no problem at all; those get added to the database properly. To counter this, I raised the resource limits in php.ini 10-fold:

max_execution_time = 6000
max_input_nesting_level = 64
max_input_time = 3000
memory_limit = 500M

The behaviour did not change, though; there is still some kind of timeout... Should I be sending some data back to the browser with the ob_ functions, or is there something else I should be looking out for?
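As an aside, the same limits can also be raised per-script at runtime, which is easier to verify than editing php.ini and hoping the right file was picked up. A minimal sketch (the values mirror the ones above; whether your host allows overriding them is an assumption):

```php
<?php
// Sketch: raise the limits for this one import script at runtime
// instead of globally in php.ini.
ini_set('memory_limit', '500M');
ini_set('max_execution_time', '6000');
set_time_limit(6000);   // also restarts the timeout counter from zero

// Confirm the values actually took effect:
echo ini_get('memory_limit'), "\n";
echo ini_get('max_execution_time'), "\n";
```

If the echoed values differ from what you set, the host has locked those directives and the php.ini edits are likely being ignored too.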

Posted: Thu Nov 15, 2007 10:19 am
by feyd
The browser may be giving up, so sending some data back periodically may help. The output buffering functions are not what you want here; flush() is more useful.

Posted: Thu Nov 15, 2007 5:00 pm
by nutkenz
I notice the timeout is occurring during the curl_exec() call that reads the XML data, because no data ever gets saved to the file. I set CURLOPT_TIMEOUT to 6000, but that did not help; it still times out before the data is saved. I don't think the browser timing out has anything to do with this, because the process should keep running independently of the browser?
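For reference, a sketch of downloading the feed straight to a local file with cURL (via CURLOPT_FILE) rather than holding it in memory, so MagicParser can then be pointed at the local copy. The function name, URL, and path are placeholders, not anything from MagicParser:

```php
<?php
// Sketch: stream the feed body directly to disk with cURL.
function downloadFeed($url, $path)
{
    $fp = fopen($path, 'w');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);           // write the response straight to the file
    curl_setopt($ch, CURLOPT_TIMEOUT, 6000);       // overall transfer limit, in seconds
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // separate limit for the connect phase
    $ok = curl_exec($ch);
    if ($ok === false) {
        error_log('feed download failed: ' . curl_error($ch));
    }
    curl_close($ch);
    fclose($fp);
    return $ok !== false;
}

// Usage (hypothetical URL and path):
// downloadFeed('http://example.com/feed.xml', '/tmp/feed.xml');
```

Note that CURLOPT_TIMEOUT caps the whole transfer, while the connect phase has its own CURLOPT_CONNECTTIMEOUT; a hang before any data arrives can be hitting the latter.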

Posted: Thu Nov 15, 2007 5:02 pm
by feyd
nutkenz wrote:I notice the timeout is occurring during the curl_exec() call that reads the XML data, because no data ever gets saved to the file. I set CURLOPT_TIMEOUT to 6000, but that did not help; it still times out before the data is saved. I don't think the browser timing out has anything to do with this, because the process should keep running independently of the browser?
Unless your script times out or explodes, yes, it should keep running regardless.

Have you looked at your error logs in detail?