Error: Allowed memory size exhausted

Posted: Tue Sep 04, 2007 10:57 am
by JeffD
feyd | Please use [code] and [/code], and [syntax="..."] tags where appropriate when posting code. Your post has been edited to reflect how we'd like it posted. Please read: [url=http://forums.devnetwork.net/viewtopic.php?t=21171]Posting Code in the Forums[/url] to learn how to do it too.


I am encountering the following error message...

Code:

Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 600 bytes) in [path/]functions.php on line 219

...while trying to run a utility script that uses GD 2.0.28 (under PHP 4.3.10) to create (a) a thumbnail image and (b) a "normal-sized" image for each uploaded JPEG in a staging directory on the Linux server.

One function populates an array with the file names of all images in the staging directory and then calls another function to create and store the images (in other directories).

In my testing, I'm using only 3 JPEGs, each between 385KB and 575KB.

When I invoke only the call for the thumbnails (fixed width of 120px; height calculated), things work. But when I call the function for the normal images with a target width greater than 365px (even with the thumbnail call commented out and bypassed), I get the above message, and the byte count in the "tried to allocate" part is always a multiple of the target width. (Ideally, I want the target width of the normal images to be 600px.)
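The pattern in the "tried to allocate" byte counts fits how GD works: it decompresses the whole JPEG into raw pixels and allocates row buffers whose size is proportional to the image width, which may explain why the failing allocations were multiples of the target width. Memory use therefore depends on pixel dimensions, not file size on disk. A rough estimate can be sketched like this (the 4-bytes-per-pixel figure and the 1.8 overhead factor are assumptions for illustration, not official GD numbers):

```php
<?php
// Rough estimate of the RAM GD needs for one decoded image:
// width x height pixels at ~4 bytes each, plus bookkeeping overhead.
// The bytes-per-pixel and overhead values are assumptions, not
// official GD figures.
function estimated_gd_bytes($width, $height, $bytes_per_pixel = 4, $overhead = 1.8)
{
    return (int) ($width * $height * $bytes_per_pixel * $overhead);
}

// A typical 3-megapixel camera JPEG (2048x1536, maybe ~500KB on disk):
$needed = estimated_gd_bytes(2048, 1536);
// $needed comes out well above the 16777216-byte (16MB) limit,
// even before the destination image is allocated on top of it.
```

So a 500KB upload can easily need 20MB+ once the source and destination images are both open, which is consistent with the 16MB limit being hit.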

Here are pertinent code snippets:

The loop:

Code:

for ( $f=0; $f<count($file_listing); $f++ )
{
	$filename = GALLERY_IMAGES_SOURCE_DIR . $file_listing[$f];
	make_images_step2($filename, DEST_DIR_THUMB, WIDTH_THUMB, QUALITY_THUMB);
	make_images_step2($filename, DEST_DIR_NORMAL, WIDTH_NORMAL, QUALITY_NORMAL);
}
The called function:

Code:

function make_images_step2($filename, $where, $width, $quality)
{
	// Create the resized image and write it to its destination directory
	$fx = basename($filename); // just the file name and extension
	$source = imagecreatefromjpeg($filename);
	$thumbX = $width;
	$imageX = imagesx($source);
	$imageY = imagesy($source);
	$thumbY = (int)(($thumbX * $imageY) / $imageX); // preserve aspect ratio
	$dest = imagecreatetruecolor($thumbX, $thumbY);

	imagecopyresampled($dest, $source, 0, 0, 0, 0, $thumbX, $thumbY, $imageX, $imageY);

	// Write the file, then free both GD images
	imagejpeg($dest, $where . $fx, $quality);
	imagedestroy($dest);
	imagedestroy($source);
} // End function
Any insights?

Thank you in advance


Posted: Tue Sep 04, 2007 11:04 am
by anjanesh
You seem to have crossed the 16MB limit! Is your GD function really taking that much memory?
What's the value of memory_limit in your php.ini file? Increase it and check whether you get the error message again.

Posted: Tue Sep 04, 2007 11:07 am
by RobertGonzalez
There is something in your code that is using memory and not letting go of it. You may want to look at putting a few calls to memory_get_usage() in your script to see where the memory is piling up. Another function to look at is memory_get_peak_usage().
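That instrumentation might be sketched with a small helper like the hypothetical one below, called before and after each resize. (Note: memory_get_peak_usage() requires a newer PHP than the 4.3.10 mentioned above, so the sketch falls back to memory_get_usage() alone where it is missing.)

```php
<?php
// Hypothetical helper: snapshot memory use at a labelled point so you
// can see where usage piles up between calls.
function memory_snapshot($label)
{
    return array(
        'label'   => $label,
        'current' => memory_get_usage(),
        // Peak tracking only exists on newer PHP versions.
        'peak'    => function_exists('memory_get_peak_usage')
                         ? memory_get_peak_usage()
                         : null,
    );
}

// e.g. bracket each make_images_step2() call:
$before = memory_snapshot('before resize');
// make_images_step2($filename, DEST_DIR_NORMAL, WIDTH_NORMAL, QUALITY_NORMAL);
$after = memory_snapshot('after resize');
```

Comparing the snapshots across loop iterations shows whether memory is actually being released between images or climbing toward the limit.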

Posted: Tue Sep 04, 2007 11:25 am
by JeffD
Thanks, both anjanesh and Everah -

The memory_limit value was set to 16MB. I increased it with an ini_set() to 128MB, and things worked!

Now the next question: what are the downsides of setting memory_limit higher? And if there are downsides, am I right to assume the easiest solution is to re-set it to the default 16MB after running the utility scripts?
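For what it's worth, ini_set() only affects the current request; the php.ini default comes back on its own when the script finishes, so no manual re-set should be needed. The per-script override might look like this:

```php
<?php
// Raise the limit for this one utility script only. The change lasts
// for the current request, and the php.ini default is restored
// automatically afterwards -- no manual re-set needed.
ini_set('memory_limit', '128M');

// Confirm the override took effect for this request:
$limit = ini_get('memory_limit');
```

Scoping the increase to the one script that needs it keeps the lower limit in force for ordinary web requests.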

Thanks,
Jeff

Posted: Tue Sep 04, 2007 11:56 am
by RobertGonzalez
I typically leave my setting at the default of 8M unless I need to move it up for a good reason. Right now, on one of my company machines, it is set to 24M, but only because we are working with rather large datasets that come out of the database at 9 to 15M a pop.

I personally like leaving the limit as low as possible, since the memory used by one request is multiplied when many requests arrive at once. Imagine your app taking up 32M on one request. How much memory is being put on your server if, say, 100 people were accessing the site at the same time?
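That point can be put in numbers with a back-of-the-envelope estimate (an illustration, not a measurement -- real concurrent usage depends on how long each request holds its peak):

```php
<?php
// Worst-case aggregate memory: per-request peak times the number of
// simultaneous requests.
$per_request_mb = 32;   // hypothetical peak per request
$concurrent     = 100;  // hypothetical simultaneous requests
$total_gb = ($per_request_mb * $concurrent) / 1024;
// $total_gb works out to 3.125 -- over 3 GB of RAM for PHP alone.
```

Which is exactly why a high memory_limit that is harmless for one admin script can be dangerous as a server-wide default.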