fgetcsv freezes after 26,xxx records

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

caseymanus
Forum Commoner
Posts: 34
Joined: Wed Nov 20, 2002 10:32 pm
Contact:

fgetcsv freezes after 26,xxx records

Post by caseymanus »

I am parsing a large CSV, but it stops parsing after 26,xxx lines out of about 168,xxx and CPU utilization spikes. It's like there is a memory leak or something. Anyone have any thoughts? By the way, I have increased my script timeout, so that is not the issue.
skehoe
Forum Commoner
Posts: 59
Joined: Sun Dec 22, 2002 5:57 am
Location: Denver

Post by skehoe »

Are you parsing the whole thing into memory?

Are there any messages being sent to the error log?
caseymanus
Forum Commoner
Posts: 34
Joined: Wed Nov 20, 2002 10:32 pm
Contact:

Post by caseymanus »

This actually turned out to be an error caused by a stray " (double quote) in the CSV file. There is a patch in the latest CVS snapshot that supposedly fixes this. For my work, though, it was simple enough to eliminate the double quotes.

Thanks
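For anyone finding this thread later, here is a minimal sketch of the workaround described above: strip the stray double quotes from each line before parsing it. The filename "data.csv" is a placeholder, and str_getcsv() is assumed to be available (it was added in a later PHP release than this thread dates from).

```php
<?php
// Hypothetical sketch: remove stray double quotes from each line before
// parsing it as CSV, to avoid the fgetcsv() quote bug described above.
// "data.csv" is a placeholder filename, not from the original post.
$fp = fopen('data.csv', 'r');
while (($line = fgets($fp)) !== false) {
    $clean  = str_replace('"', '', $line); // eliminate the problem quotes
    $fields = str_getcsv($clean);          // split the cleaned line into fields
    // ... process $fields ...
}
fclose($fp);
```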
caseymanus
Forum Commoner
Posts: 34
Joined: Wed Nov 20, 2002 10:32 pm
Contact:

Post by caseymanus »

There turned out to be a bug in fgetcsv that caused a lock-up when trying to parse multiple double quotes. This was supposedly fixed, but I never got it to work. Instead I read each line into a string, then used explode() to split it into an array... something like this:

Code:

while ($line = fgets($fp, 3000)) { // read each line
    $data = explode("|", $line);   // create the array
    // do some more code
}
Just wanted to document this in case anyone else runs into the problem.
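A fuller, self-contained version of the fgets()/explode() loop above might look like this; the filename "data.txt" and the pipe-delimited layout are assumptions for illustration:

```php
<?php
// Sketch of the fgets()/explode() workaround as a complete script.
// "data.txt" and the pipe-delimited layout are assumptions, not from the post.
$fp = fopen('data.txt', 'r');
if ($fp === false) {
    die('could not open file');
}
while (($line = fgets($fp, 3000)) !== false) {
    $data = explode('|', rtrim($line, "\r\n")); // split the line on "|"
    // do something with $data here
}
fclose($fp);
```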