imageColorAt() and imageColorsForIndex() seem redundant
Posted: Fri May 16, 2003 8:10 pm
When looking at the documentation for these two functions, they seem to work together, but there is a redundancy that I don't understand.
The "imageColorAt()" function takes two parameters as explained in the documentation:
+ int imagecolorat ( resource image, int x, int y)
+ Returns the index of the color of the pixel at the specified location in the image specified by image.
The "imageColorsForIndex()" function also takes two parameters as noted:
+ array imagecolorsforindex ( resource image, int index)
+ This returns an associative array with red, green, and blue keys that contain the appropriate values for the specified color index.
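For reference, here is how I understand the two functions chaining together (just a sketch; the filename and coordinates are placeholders, and this assumes a palette-based rather than truecolor image):

```php
<?php
// Assumes "palette.gif" is a palette-based (non-truecolor) image.
$im = imagecreatefromgif('palette.gif');

// Step 1: get the palette index of the pixel at (10, 15).
$index = imagecolorat($im, 10, 15);

// Step 2: look that index up to get the actual RGB values.
$rgb = imagecolorsforindex($im, $index);
printf("R: %d, G: %d, B: %d\n", $rgb['red'], $rgb['green'], $rgb['blue']);

imagedestroy($im);
?>
```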
Since you need to use the imageColorAt() function to obtain the index, and imageColorAt() has already accessed the image resource to get that value, why does the imageColorsForIndex() function need to access the image resource again?
Shouldn't it just be able to return the RGB values by evaluating the color index?
If anyone can give some insight, it would be greatly appreciated.
Thanks.
Nik