I am finally setting up a decent caching system, or at least using Zend Framework's caching system. Right now I have it set up on just the memory-intensive parts, basically the queries that I run on our largest table (~1.4 million rows). This seems like the simplest way to go about this, but I know I could go deeper. I have functions that create an array that I can use in my template; each array is different depending on the user and the selected month. Caching every single database return and putting it into a serialized array in a tmp file seems like overkill, though. So, how far should I go?
Just cache the result set for the memory intensive queries?
Cache all queries?
Cache all the arrays that I create for the templates (there are a lot of them)?
The last option would basically be caching everything possible, but I imagine this would make my total cache size huge, because every user for every month would add another file to the cache. Also, it seems like I could mess up renewing my cache if I don't have the correct tags set for an entry, or if I forget that something is dependent on something else. Is it wise to just cache it all and hope for the best, or what are the best practices for this? I don't want to litter my code with huge cache if-statements and then realize that my gain from using the cache is nominal at best. What do you use/suggest?
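For reference, this is roughly what I'm doing now for the heavy queries — a minimal sketch using Zend_Cache from ZF1 (the cache directory, lifetime, query, and variable names are just examples, not my real code):

```php
<?php
// Sketch: cache only the expensive query results with Zend_Cache (ZF1).
$cache = Zend_Cache::factory(
    'Core',
    'File',
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cache/')
);

// One entry per user + month, tagged so related entries can be
// cleaned together later with Zend_Cache::CLEANING_MODE_MATCHING_TAG.
$cacheId = 'report_' . (int) $userId . '_' . $month;
if (($rows = $cache->load($cacheId)) === false) {
    // Cache miss: run the expensive query against the big table.
    $rows = $db->fetchAll($expensiveSql, array($userId, $month));
    $cache->save($rows, $cacheId, array('reports', 'user_' . (int) $userId));
}
```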
Caching, how far should one go?
Re: Caching, how far should one go?
First, I would run a profiler to see exactly where the performance issues are and how big each one is relative to the others. I would also consider caching only for users who are regularly active or log in regularly. I imagine you wouldn't keep the cache files around permanently either; the cache isn't a replacement for your storage layer. You should also weigh the pros and cons of caching model data vs. caching generated HTML, etc.
Lastly, the security implications of using a cache are important: you don't want to put anything user-specific into global cache files and end up showing private data to an unauthorized party. It seems simple, but there have been some slips by high-profile websites.
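A cheap way to guard against that slip is to bake the user ID into the cache ID, so a per-user entry can never collide with a global one (a sketch only; the function and IDs are made-up names, not part of Zend_Cache):

```php
<?php
// Sketch: never share a cache ID between users for user-specific data.
// A global ID like 'dashboard' would serve the first user's private
// data to everyone; prefixing with the user ID keeps entries separate.
function userCacheId($baseId, $userId)
{
    return 'u' . (int) $userId . '_' . $baseId;
}

// Assumes $cache is a Zend_Cache Core frontend built elsewhere.
$data = $cache->load(userCacheId('dashboard', $userId));
```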
Re: Caching, how far should one go?
@ jshpro2:
Do you know any good introductory resources for profiling PHP? I've never done it, but have a few sites that could benefit. I'd appreciate any links or book recommendations you could provide.
Thanks.
Re: Caching, how far should one go?
Try installing xdebug. Right now I'm going to take a look at http://code.google.com/p/webgrind/ and see whether it will be useful to me.
Re: Caching, how far should one go?
Yeah, I use PhpED, which uses xdebug and shows you right where your bottlenecks are. (It can run the profile session on the live server under real load if you want.)
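For anyone setting xdebug up by hand instead of through an IDE, the relevant php.ini settings are roughly these (xdebug 2; the output directory is just an example):

```ini
; php.ini — xdebug 2 profiler settings
xdebug.profiler_enable = 0            ; don't profile every request
xdebug.profiler_enable_trigger = 1    ; profile only when XDEBUG_PROFILE is passed
xdebug.profiler_output_dir = /tmp/xdebug
```

The profiler writes cachegrind.out.* files into the output directory, which you can open in webgrind or KCachegrind.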

Pretty charts
- shiznatix
- DevNet Master
- Posts: 2745
- Joined: Tue Dec 28, 2004 5:57 pm
- Location: Tallinn, Estonia
- Contact:
Re: Caching, how far should one go?
xdebug is looking good. I installed it and ran the scripts through KCachegrind, and it looks like some useful information. Good stuff, thanks.
Re: Caching, how far should one go?
You should also configure and enable logging of slow queries, then use MySQL's slow query log analysis tool to find problem queries. Xdebug will show you this too, but it's sometimes helpful to work with just database profiling and not have file-based bottlenecks mixed into the same result set.
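For the archives, on a MySQL 5.0-era server that's roughly these my.cnf lines (the log path and threshold are examples):

```ini
# my.cnf — enable the slow query log (MySQL 5.0 syntax)
[mysqld]
log-slow-queries = /var/log/mysql/slow.log
long_query_time  = 1        # log anything slower than 1 second
```

Then `mysqldumpslow -s t /var/log/mysql/slow.log` summarizes the log, grouping similar queries and sorting by time so the worst offenders float to the top.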
Re: Caching, how far should one go?
The queries are a problem, but the biggest problem was having to run the same queries around 300 times to generate a report; there was no other way. But I am fixing the table up (it's big, and the guy before me was doing some pretty stupid things with it), and with the caching I don't think I will have a problem now. That xdebug is awesome, super extra helpful.
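In case it helps anyone else with the same-query-300-times problem: a tiny in-request memoization wrapper avoids hitting MySQL more than once per distinct query, without touching the file cache at all (a sketch; the function name is made up, and `$db` is assumed to be a Zend_Db adapter):

```php
<?php
// Sketch: memoize identical queries within one request, so a report
// that asks the same question 300 times only runs it once.
function cachedFetchAll($db, $sql, array $bind = array())
{
    static $memo = array();

    // Key on the SQL text plus bound parameters.
    $key = md5($sql . '|' . serialize($bind));
    if (!array_key_exists($key, $memo)) {
        $memo[$key] = $db->fetchAll($sql, $bind);
    }
    return $memo[$key];
}
```

Unlike Zend_Cache, this holds nothing between requests, so there's no staleness or tagging to worry about; it only collapses duplicates inside one report run.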