
fgetcsv freezes after 26,xxx records

Posted: Thu Jan 09, 2003 2:29 pm
by caseymanus
I am parsing a large CSV file, but parsing stops after 26,xxx lines out of about 168,xxx, and CPU utilization spikes. It's like there is a memory leak or something. Anyone have any thoughts? By the way, I have already increased my script timeout, so that is not the issue.

Posted: Thu Jan 09, 2003 6:57 pm
by skehoe
Are you parsing the whole thing into memory?

Are there any messages being sent to the error log?

Posted: Fri Jan 10, 2003 4:01 pm
by caseymanus
This actually turned out to be an error caused by a stray " (double quote) in the CSV file. There is a patch in the latest CVS snapshot that supposedly fixes this. For my work, though, it was simple enough to just remove the double quotes.
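In case anyone else wants to try the same workaround, here is a minimal sketch of the quote-stripping approach: drop the double quotes from each raw line before splitting it yourself, rather than handing it to fgetcsv(). The function name and delimiter are my own choices, not anything from the patch.

```php
<?php
// Sketch of the workaround, not the patched fgetcsv():
// strip stray double quotes from one raw line, then split on the delimiter.
function parse_line($line, $delim = "|") {
    $clean = str_replace('"', '', $line); // drop the quotes that trip fgetcsv()
    return explode($delim, trim($clean));
}
?>
```

This obviously throws away any quoting the file relied on, so it only works when the quotes are junk, as they were in my file.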

Thanks

Posted: Wed Jan 22, 2003 3:33 pm
by caseymanus
There turned out to be a bug in fgetcsv() that caused a lock-up when it tried to parse multiple double quotes. This was supposedly fixed, but I never got the fix to work. Instead I read each line with fgets() and used explode() to split it into an array...something like this:

Code:

while ($line = fgets($fp, 3000)) { // read each line
    $data = explode("|", $line);  // create the array
    // ...do some more code...
}
Just wanted to document this in case anyone else runs into the problem.
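For completeness, the fragment above wrapped up as a self-contained function (the function name and the php://memory usage in the example are mine; the original post only showed the loop):

```php
<?php
// Self-contained version of the fgets() + explode() workaround.
// Reads a pipe-delimited stream line by line, avoiding fgetcsv() entirely.
function read_pipe_delimited($fp) {
    $rows = [];
    while ($line = fgets($fp, 3000)) {          // read each line
        $rows[] = explode("|", rtrim($line, "\r\n")); // create the array
    }
    return $rows;
}
?>
```

You can test it against an in-memory stream instead of a real file, which is handy for checking that the multiple-double-quote case no longer hangs anything.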