Parsing any number of URLs without a timeout
Posted: Sat Oct 11, 2008 2:16 am
Hi All,
I am working on an RSS feed aggregator; the concept/algorithm is:
1> I connect to a source URL
2> Parse the text content
3> Write to the destination server in steps of 40KB
4> After every 40KB, the content is transferred to the next file (myfile2.txt), and on reaching the 40KB threshold again, to myfile3.txt
These steps run for each URL.
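The post names no language, so here is a minimal sketch of steps 1-4 in Python under stated assumptions: `ChunkedWriter` and `aggregate` are hypothetical names, the parse step is omitted, and the raw body is streamed straight into rotating 40KB output files named `myfile1.txt`, `myfile2.txt`, ... as described above.

```python
import os
import urllib.request

class ChunkedWriter:
    """Writes incoming bytes to myfile1.txt, myfile2.txt, ... in a
    directory, starting a new file each time `threshold` bytes are
    reached (the 40KB figure and file names follow the post)."""

    def __init__(self, directory, threshold=40 * 1024, prefix="myfile"):
        self.directory = directory
        self.threshold = threshold
        self.prefix = prefix
        self.index = 0
        self.written = 0
        self.out = None
        self._rotate()

    def _rotate(self):
        # Close the current file (if any) and open the next one.
        if self.out:
            self.out.close()
        self.index += 1
        self.written = 0
        path = os.path.join(self.directory, f"{self.prefix}{self.index}.txt")
        self.out = open(path, "wb")

    def write(self, data):
        # Split the incoming bytes across file boundaries as needed.
        while data:
            room = self.threshold - self.written
            self.out.write(data[:room])
            self.written += min(len(data), room)
            data = data[room:]
            if self.written >= self.threshold:
                self._rotate()

    def close(self):
        self.out.close()

def aggregate(urls, directory, timeout=10):
    """Steps 1-4 from the post: fetch each source URL and stream its
    body into the rotating 40KB output files.  Parsing of the text
    content (step 2) would go where the chunk is written."""
    writer = ChunkedWriter(directory)
    for url in urls:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            for chunk in iter(lambda: resp.read(8192), b""):
                writer.write(chunk)
    writer.close()
```

Separating the file-rotation logic from the fetch loop also makes the timeout question easier to attack later: the writer does not care whether the chunks arrive from one blocking fetch or from several sources processed incrementally.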
Now, the issue I am facing is that after I connect and grab the data from source 1, when I then grab the data from source 2, I get a script timeout.
Is there any workaround/logic/trick so that I can connect to any number of URLs and grab the data from all of them without this timeout?
Increasing the script's execution time limit is not the solution I am looking for... any other ideas?