Page 1 of 2
Memory management.
Posted: Sun Apr 16, 2006 1:56 pm
by stevietee
Hi All,
Forgive me if this is a noob question, but I am at something of a loss. I've only just started working with PHP (I installed it on my local PC along with an Apache webserver on Good Friday, in fact) and I've been trying to write a small script to search through some specific text files and return the results of the search as HTML. It worked fine on my local setup, but when I upload it to my website and run it I get the following error:
Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 20387808 bytes)
Now this seems to occur within the following code segment:
Code:
if ($counter == "0") {
    $lines = file('./file1.txt');
} elseif ($counter == "1") {
    $lines = file('./file2.txt');
} else {
    $lines = file('./file3.txt');
}
I can get around the problem, but only if I include ini_set("memory_limit", "60M");
I reckon 60 MB is too high, however (and in any case I plan on adding more files as I get my head around this, so I guess that limit would have to go through the roof). I'm guessing that for every call to the file() function memory is allocated, and more memory is then allocated to store the contents in the array $lines?
Currently file1.txt is 9 MB, file2.txt is 20 MB, and file3.txt is 8 MB. The error occurs when it's processing the 20 MB file2.txt.
Any suggestions would be much appreciated, thanks, bearing in mind I am trying to load the contents of the file into an array.
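[Editor's note: the thread moves on to a database below, but for reference, here is a minimal sketch of the usual in-PHP fix: read the file with fgets() so only one line is held in memory at a time, instead of file(), which loads the entire file into an array. The sample file, contents, and keyword are made up for illustration.]

```php
<?php
// Sketch: search a large text file line by line with fgets(),
// holding only one line in memory at a time. file() is what
// exhausts the memory limit, because it loads the whole file.
// The sample file and keyword here are placeholders.
file_put_contents('./file1.txt', "apple pie\nbanana split\napple tart\n");

$needle  = 'apple';
$matches = array();

$fh = fopen('./file1.txt', 'r');
while (($line = fgets($fh)) !== false) {      // one line per call
    if (stripos($line, $needle) !== false) {  // case-insensitive match
        $matches[] = rtrim($line);
    }
}
fclose($fh);

foreach ($matches as $m) {
    echo htmlspecialchars($m), "<br>\n";
}
?>
```

Memory use stays roughly constant regardless of file size, since only the matching lines are kept.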
Posted: Sun Apr 16, 2006 1:59 pm
by John Cartwright
I would likely throw it into a database and avoid having to load the entire file into memory, since most likely you do not need the entire file at once. Regardless, databases are much better for storing large amounts of data; consider indexing, etc.
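[Editor's note: purely as an illustration of that suggestion, a hypothetical schema — the table name and index are made up, though the 'url' and 'name' columns match the fields used later in the thread:]

```sql
-- Hypothetical schema: one row per line of the text files,
-- with an index on the column being searched.
CREATE TABLE links (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    url  VARCHAR(255) NOT NULL,
    name VARCHAR(255) NOT NULL,
    INDEX (name)
);
```

One caveat: a LIKE pattern with a leading wildcard ('%foo%') cannot use that index, but the database still spares PHP from loading whole files into memory.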
Posted: Sun Apr 16, 2006 2:16 pm
by stevietee
Whoa, that was a fast response. So you're saying I should write a script to import the contents of these files into a database (SQL?) and then search that?
I was hoping I'd be able to do it without recourse to a database, to be honest, since my SQL skills are negligible at the moment. However, if there is no alternative then I'll look into it.
One question though: given the size of the files I'm processing, would you expect me to have to set my upper memory limit to more than 60 MB?
TIA
Posted: Sun Apr 16, 2006 2:25 pm
by John Cartwright
First I would take the site down for maintenance, then temporarily change PHP to allow as much memory as needed.
If your files are delimited, you can simply use LOAD DATA INFILE (http://dev.mysql.com/doc/refman/4.1/en/load-data.html) to import the files into the database, so there is no need to write a script to do it.
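[Editor's note: for a tab-delimited file, the statement might look something like this — the file path, table, and column names are made up for illustration:]

```sql
-- Hypothetical example of importing a tab-delimited file.
LOAD DATA LOCAL INFILE '/path/to/file1.txt'
INTO TABLE links
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(url, name);
```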
Posted: Sun Apr 16, 2006 3:54 pm
by stevietee
John Cartwright wrote: First I would take the site down for maintenance, then temporarily change PHP to allow as much memory as needed.
What benefit would this give me? Is "load data infile" hampered by memory restrictions too?
Posted: Sun Apr 16, 2006 4:13 pm
by John Cartwright
I would imagine so; your server may experience high load when importing large files because, regardless, the files have to be loaded into memory. Although, now that I think about it, you won't even need to adjust the PHP memory limit, since this can all be done from the command line.
Taking your site off for a few minutes was implied because of the shift in how your data handling is done. I personally don't let my users see updates as they progress, or any errors that may occur during the upgrade.
Posted: Sun Apr 16, 2006 4:16 pm
by stevietee
Oh, I see. Many thanks, jcart. I'll post back on how I get on.
Posted: Mon Apr 17, 2006 2:36 pm
by stevietee
OK, I'm picking this up slowly, so please bear with me. I've configured an SQL database on my localhost now and successfully loaded the contents of my text files, thanks to jcart. Now I'm almost back to the point I was at yesterday, in that I've found some code to echo the specific contents of SQL tables based upon keyword entry.
It goes a little something like this...
Code:
$q = " ";
$kt = split(" ", 'keyword1 keyword2'); // break the string into an array of words
while (list($key, $val) = each($kt)) {
    if ($val <> " " and strlen($val) > 0) { $q .= " name like '%$val%' and "; }
} // end of while
$q = substr($q, 0, strlen($q) - 4); // this removes the trailing "and " from the string
$query = "select * from TABLE where $q";
$line = mysql_query($query);
echo mysql_error();
while ($row = mysql_fetch_array($line)) {
    echo "$row[url],$row[name],<br>";
    //echo "<a href=",$row[url],">,$row[name],</a> <br>";
}
All seems to work OK. The appropriate records are extracted from the database and dumped to screen using the standard echo statement as above ('url' and 'name' are two fields in my table, btw). The problem I have now is when I try to reformat that output as a hyperlink (see the commented-out final line). I get the following error message:
NOTICE: Use of undefined constant url - assumed 'url'... and the output is less than perfect.
I'm sure it's just some stupid syntax thing, but I'm beat. I feel overdosed on PHP/SQL, and it's getting hard to hold sensible conversations with my loved ones without resorting to three-letter acronyms. Someone help, please...
Posted: Mon Apr 17, 2006 2:41 pm
by John Cartwright
Code:
while ($row = mysql_fetch_array($line)) {
    echo '<a href="'.$row['url'].'">'.$row['name'].'</a> <br>';
}
Always quote your array indices, unless of course they are numerical.
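[Editor's note: a quick standalone illustration of the difference. Outside a string, an unquoted index is parsed as a bare constant; PHP 5.x falls back to the string 'url' and raises the NOTICE seen above, while PHP 8 makes it a fatal error. The array contents here are made up.]

```php
<?php
// Quoted vs unquoted array indices.
error_reporting(E_ALL);
$row = array('url' => 'http://example.com', 'name' => 'Example');

echo $row['url'], "\n";  // correct: quoted index
echo "$row[url]\n";      // also fine: the simple syntax inside a
                         // double-quoted string takes an *unquoted* key
// echo $row[url];       // NOTICE (fatal error in PHP 8): undefined constant url
echo '<a href="'.$row['url'].'">'.$row['name'].'</a>', "\n";
?>
```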

Posted: Mon Apr 17, 2006 2:47 pm
by stevietee
Excellent, that worked. This place is fantastic. Many Thanks once again, Jcart.
Posted: Mon Apr 17, 2006 2:51 pm
by John Cartwright
P.S. I hate looping when I don't have to, so instead of generating SQL by looping I typically do something like:
Code:
$keywords = explode(" ", 'keyword1 keyword2');
if (count($keywords)) {
    $keywords = "`name` LIKE '%".implode("%' AND `name` LIKE '%", $keywords)."%'";
} else {
    $keywords = "`name` LIKE '%".$keyword."%'";
}
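[Editor's note: a quick standalone check of what that implode() trick should produce, with the keywords hardcoded. Note that the LIKE patterns must end up single-quoted inside the SQL, and real keyword input would still need escaping before being interpolated.]

```php
<?php
// Standalone check of the loop-free WHERE-clause builder.
$keywords = explode(' ', 'keyword1 keyword2');
$where = "`name` LIKE '%" . implode("%' AND `name` LIKE '%", $keywords) . "%'";
echo $where, "\n";
// `name` LIKE '%keyword1%' AND `name` LIKE '%keyword2%'
?>
```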
enjoy
Posted: Thu Apr 20, 2006 4:32 am
by stevietee

Aargh! It looks like I need an alternative to LOAD DATA INFILE. It turns out that my host has disabled the access privileges for this very useful little function, so I can't use it. So, short of switching hosts, are there any other viable alternatives? (I'll be sure to check with my host first before spending three days getting scripts working on my local machine, though.. grr.)
Posted: Thu Apr 20, 2006 7:25 am
by stevietee
Thinking about this clearly, I guess I could use a PHP script to emulate LOAD DATA INFILE. Could anyone point me at an open-source example?
Posted: Thu Apr 20, 2006 11:08 am
by feyd
Search the regex board under raghavan20's posts; I remember several threads dealing with the creation of just such a thing.
Posted: Thu Apr 20, 2006 12:13 pm
by stevietee
feyd wrote: Search the regex board under raghavan20's posts; I remember several threads dealing with the creation of just such a thing.
Sorry, feyd. Can't find anything that helps in those forums. I have come up with the following, however:
Code:
// Load new records
$fcontents = file('file.txt');
for ($inCnt = 0; $inCnt < sizeof($fcontents); $inCnt++) {
    $line = trim($fcontents[$inCnt]);
    $lineArray = explode("\t", $line);
    $sql = "insert into table_name values ('".implode("','", $lineArray)."')";
    mysql_query($sql);
}
However, this gives me my original error (i.e. Fatal error: Allowed memory size of 20971520 bytes exhausted) when I try to open my file. In this instance file.txt is roughly 10 MB. Short of making the file smaller or increasing the memory limit, is there anything else I can try?
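[Editor's note: the same fgets() streaming idea applies here: read the file one line at a time instead of calling file(), so memory use no longer depends on file size. The mysql_query() call is commented out so this sketch runs on its own; the sample data and table name are made up, and real data containing quotes would still need mysql_real_escape_string() before interpolation.]

```php
<?php
// Sketch: emulate LOAD DATA INFILE by streaming the file with fgets()
// and building one INSERT per line, instead of file() + a for loop.
file_put_contents('file.txt', "http://a.example\tAlpha\nhttp://b.example\tBeta\n");

$queries = array();
$fh = fopen('file.txt', 'r');
while (($line = fgets($fh)) !== false) {   // one line in memory at a time
    $lineArray = explode("\t", trim($line));
    $sql = "insert into table_name values ('".implode("','", $lineArray)."')";
    // mysql_query($sql);                  // uncomment on the server
    $queries[] = $sql;
}
fclose($fh);

echo implode("\n", $queries), "\n";
?>
```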