Reading a File - Not Enough Memory

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

webcan
Forum Commoner
Posts: 66
Joined: Tue Oct 28, 2003 2:25 pm
Location: Toronto, Canada

Reading a File - Not Enough Memory

Post by webcan »

Hello:

I'm trying to read in a rather large tab-delimited file; it's almost 8 MB. I get an error while reading the file, and I'm assuming it's because the script runs out of memory.

The error is: Fatal error: Allowed memory size of 8388608 bytes exhausted at (null):0 (tried to allocate 74 bytes)

The code that I wrote to read this file is:

Code: Select all

<?php
$filedata = '';   // initialize before appending, to avoid an undefined-variable notice
$newdata = fopen("/home/www/auser/data/datafile.txt", "r");
while (!feof($newdata)) {
    $filedata .= fgets($newdata, 1024);
}
fclose($newdata);
$filelines = explode("\n", $filedata);
?>
Does anyone have another way of reading this file that would work? :)

Ideally I'd like to open the file and read one line at a time, but I don't know how to do that. I find it strange to read "x" bytes at a time from the file, because how do you know the length of each line?

Thanks,
Peter.
markl999
DevNet Resident
Posts: 1972
Joined: Thu Oct 16, 2003 5:49 pm
Location: Manchester (UK)

Post by markl999 »

You could always try an ini_set('memory_limit', '16M') at the top of the script and see if that's enough memory to read it in.
Wayne
Forum Contributor
Posts: 339
Joined: Wed Jun 05, 2002 10:59 am

Post by Wayne »

If you are going to explode on \n, then why don't you just use file(), which will read the file into an array with each line as an array element?
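
A quick sketch of what that looks like (the file path and sample data here are made up for the demo; substitute your own data file). Note that file() still holds the entire file in memory at once, so on an 8 MB file it may hit the same memory limit:

```php
<?php
// file() reads the whole file into an array, one line per element
// (each element keeps its trailing "\n").
$path = tempnam(sys_get_temp_dir(), 'demo');   // stand-in for the real data file
file_put_contents($path, "a\tb\nc\td\n");

$filelines = file($path);
foreach ($filelines as $line) {
    // strip the newline, then split the tab-delimited fields
    $fields = explode("\t", rtrim($line, "\r\n"));
    echo $fields[0], " / ", $fields[1], "\n";
}
unlink($path);
?>
```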
kettle_drum
DevNet Resident
Posts: 1150
Joined: Sun Jul 20, 2003 9:25 pm
Location: West Yorkshire, England

Post by kettle_drum »

Just read each line into the variable one at a time, like you said, and you shouldn't run out of memory. Keep what you have at the moment, but parse the line and do what you want with it inside the while loop:

Code: Select all

<?php
$newdata = fopen("/home/www/auser/data/datafile.txt", "r");
while (!feof($newdata)) {
    $filedata = fgets($newdata, 1024);       // one line (or up to 1023 bytes) at a time
    $filelines = explode("\n", $filedata);
    // do whatever else you want to do before you read the next line
}
fclose($newdata);
?>
If you want to read more than one line at a time, read more, then unset() the variable the data is stored in to remove it from memory.
webcan
Forum Commoner
Posts: 66
Joined: Tue Oct 28, 2003 2:25 pm
Location: Toronto, Canada

Post by webcan »

OK, I tried the ini_set thing; that didn't work, and I still get the same problem.

If I do it the way kettle_drum suggested, does fgets read UP TO 1024 characters or stop at the first newline, or does it just read the file in 1024-character chunks? Do I unset $filelines?

Thanks,
Peter.
kettle_drum
DevNet Resident
Posts: 1150
Joined: Sun Jul 20, 2003 9:25 pm
Location: West Yorkshire, England

Post by kettle_drum »

fgets will read until a newline or until 1024 bytes (1023, really, since the length limit is length - 1), whichever comes first. You would unset the variable that you store the data in... which is... $filedata, but you can also unset $filelines if you're holding the data there as well.
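
A quick way to convince yourself of that (a throwaway sketch; the sample lines are made up, and php://temp is just an in-memory stream for the demo):

```php
<?php
// Demo: fgets() stops at a newline even when the length limit is much larger.
$fp = fopen('php://temp', 'r+');
fwrite($fp, "short line\nmuch longer second line\n");
rewind($fp);

$first = fgets($fp, 1024);   // returns "short line\n", not 1023 bytes
fclose($fp);
echo $first;
?>
```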
webcan
Forum Commoner
Posts: 66
Joined: Tue Oct 28, 2003 2:25 pm
Location: Toronto, Canada

Post by webcan »

OK, thanks! That totally worked.

So, for future reference, if I'm not sure how long a file line might be, I can just do:

$filedata = fgets($newsources, 500000);

... right?
kettle_drum
DevNet Resident
Posts: 1150
Joined: Sun Jul 20, 2003 9:25 pm
Location: West Yorkshire, England

Post by kettle_drum »

Or you could do something like:

Code: Select all

// Read one character at a time until the end of the line.
// Note: fgetc(), not fgets($fp, 1) -- a length of 1 would read 0 bytes.
// "\n" must be in double quotes, or the escape isn't interpreted.
$line = '';
while (($char = fgetc($fp)) !== false && $char != "\n") {
    $line .= $char;
}
or something similar.
webcan
Forum Commoner
Posts: 66
Joined: Tue Oct 28, 2003 2:25 pm
Location: Toronto, Canada

Post by webcan »

OK, wow, I'm surprised that in PHP there's nothing like in Perl, where you just do while(<FILEHANDLE>) and it reads in one line at a time.
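
For what it's worth, the closest PHP idiom to Perl's while(<FILEHANDLE>) is probably putting the fgets() call in the while condition (a sketch; the file here is made up for the demo):

```php
<?php
// Roughly the PHP equivalent of Perl's while(<FILEHANDLE>):
// fgets() returns false at end-of-file, which ends the loop.
$path = tempnam(sys_get_temp_dir(), 'demo');   // stand-in for a real file
file_put_contents($path, "one\ntwo\nthree\n");

$lines = array();
$fp = fopen($path, "r");
while (($line = fgets($fp)) !== false) {
    $lines[] = rtrim($line, "\r\n");   // strip the trailing newline, like Perl's chomp
}
fclose($fp);
unlink($path);
?>
```

Calling fgets() with no length argument (supported as of PHP 4.3) keeps reading to the end of the line however long it is, so you don't have to guess a maximum line length.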