I'm using MagicParser to parse an XML file containing about 65,000 entries. The browser screen stays white for about a minute and then stops loading... Files of up to 10,000 entries are no problem at all; those get added to the database properly. I raised the resource limits in php.ini tenfold to counter the problem:
max_execution_time = 6000
max_input_nesting_level = 64
max_input_time = 3000
memory_limit = 500M
The execution did not change, though; there is still some kind of timeout... Should I be sending some data back to the browser with the ob_ functions, or is there something else I should be looking out for?
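For what it's worth, here is a minimal sketch of the flush-to-browser idea: raise the limits at runtime, disable output buffering, and emit a progress marker every so often so the connection is not silently dropped by the browser or a proxy. The values and entry count below are illustrative, not taken from the original script.

```php
<?php
// Sketch: raise limits at runtime and flush progress to the browser.
// Numbers are placeholders; tune them to your environment.
set_time_limit(6000);             // per-script execution time (seconds)
ini_set('memory_limit', '500M');

// Turn off buffering so echo'd progress reaches the client immediately.
ob_implicit_flush(true);
while (ob_get_level() > 0) {
    ob_end_flush();
}

$total = 65000; // illustrative entry count
for ($i = 1; $i <= $total; $i++) {
    // ... parse and insert one entry here ...
    if ($i % 10000 === 0) {
        echo "Processed $i entries\n"; // keep-alive progress marker
        flush();
    }
}
echo "Done: $total entries\n";
```

Note that flushing only keeps the *client* happy; it does not extend PHP's own max_execution_time, which still has to be large enough for the whole import.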
PHP timeout on large XML feeds
I notice the timeout occurs during the call to curl_exec that reads the XML data, because no data is being saved to the file. I set CURLOPT_TIMEOUT to 6000, but this did not help; it still times out before the data is saved to the file. I don't think the browser timing out has anything to do with this, because the process should keep running independent of the browser?
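One way to rule out memory pressure during the download is to stream the response straight to disk with CURLOPT_FILE instead of buffering it, and to set a separate connect timeout so a dead server fails fast. A hedged sketch, with a placeholder URL and path:

```php
<?php
// Sketch: stream a large XML feed directly to a local file with cURL,
// so the full response never has to fit in PHP's memory.
// $url and $path are placeholders, not from the original post.
$url  = 'http://example.com/feed.xml';
$path = '/tmp/feed.xml';

$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write body straight to $fp
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // fail fast if unreachable
curl_setopt($ch, CURLOPT_TIMEOUT, 6000);        // allow a long transfer
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects

if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
fclose($fp);
```

Checking curl_error after a failed curl_exec should at least tell you whether cURL itself gave up (timeout, reset) or whether PHP was killed from outside.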
- feyd
nutkenz wrote:
    I notice the timeout occurs during the call to curl_exec that reads the XML data, because no data is being saved to the file. I set CURLOPT_TIMEOUT to 6000, but this did not help; it still times out before the data is saved to the file. I don't think the browser timing out has anything to do with this, because the process should keep running independent of the browser?

Unless your script times out or explodes, yes, it should keep running regardless.
Have you looked at your error logs in detail?