Parsing 1000+ Files

Posted: Thu Mar 04, 2004 6:12 pm
by jreece
I am trying to loop through a bunch of HTML pages and parse some data out of each page. I execute my code by accessing it from a browser. The code runs great for a couple of minutes, but then it just hangs on me. It does not show any errors, it just stops. Here is the relevant code - I am wondering if I should be able to use this to parse through 1000+ files, or if someone has a suggestion on a different route?

Code:

--loops through this code with a different URL each iteration
$handle2 = fopen($myURL, "r");
if ($handle2 === false) {
   // fopen() returns false on failure; skip pages that cannot be opened
   continue;
}
while (!feof($handle2)) {
   $buffer = fgets($handle2, 4096);
   // strpos() returns false when the needle is absent, so compare with
   // !== false rather than checking strlen() of the returned position
   if (strpos($buffer, 'Recording Date') !== false) {
      //do something with this line
      echo $buffer . "<br>\n";
   }
}
fclose($handle2);

Posted: Thu Mar 04, 2004 6:26 pm
by Weirdan
try [php_man]set_time_limit[/php_man](0)
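By default PHP kills a script once it exceeds the max_execution_time setting (30 seconds out of the box), which matches the "runs for a couple of minutes, then silently stops" symptom. A minimal sketch of where the call would go (the URL list and loop body are illustrative, not from the original post):

```php
<?php
// Passing 0 removes the execution time limit for this request,
// so the script can work through all 1000+ pages.
set_time_limit(0);

foreach ($urls as $myURL) {   // $urls: your list of pages (illustrative)
    // ... open, parse, and close each page as in the code above ...
}
```

Note that set_time_limit() has no effect when PHP runs in safe mode; in that case the limit has to be raised via max_execution_time in php.ini instead.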