Page 2 of 2
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 8:39 am
by VladSun
Make sure that ALL fields used in WHERE and ORDER BY clauses have their indexes built.
You could even try building composite indexes for cases where more than one field is used in these clauses.
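For example, a composite index covering both the WHERE column and the ORDER BY column might look like this (table and column names here are hypothetical, not from the OP's schema):

```sql
-- Speeds up queries of the form:
--   SELECT ... FROM `pkg_load` WHERE `pkg_line_id` = ? ORDER BY `time_stamp`
ALTER TABLE `pkg_load`
  ADD INDEX `idx_line_ts` (`pkg_line_id`, `time_stamp`);
```

Column order matters: the equality-filtered column goes first, the sort column second, so MySQL can both filter and sort off the same index.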
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 8:41 am
by drewh
Hey, thanks again. I'm using MDB2 right now instead of MySQL's built-in stuff. I'm using the command result->free(); I thought this was similar to what you were talking about.
Also, I think it's memory, because every time I increase the memory the query gets further down the page and more results are displayed than the time before.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 8:46 am
by Eran
The index on pkg_line_id (I think it's called "FK_pkg_load_pkg_line_id" for some reason) needs to be composite and include time_stamp and year as secondary fields (in that order). Make that modification and run EXPLAIN again.
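A sketch of that change (the table name `pkg_load` is an assumption inferred from the index name):

```sql
-- Replace the single-column index with a composite one,
-- keyed on pkg_line_id first, then time_stamp, then year.
ALTER TABLE `pkg_load` DROP INDEX `FK_pkg_load_pkg_line_id`;
ALTER TABLE `pkg_load`
  ADD INDEX `FK_pkg_load_pkg_line_id` (`pkg_line_id`, `time_stamp`, `year`);
```

Then re-run EXPLAIN on the slow query and check that the `key` column shows the new index.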
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 8:47 am
by drewh
Ok, I'm sorry. I set the time limit to 0 and it worked fine. Thank you so much. What should I default it to, or where should I actually set it back, say if they just exit the browser in the middle of the query? Thanks a lot.
Hey again. Thanks to all who are helping. These are really good tips. Is there a rep system here?
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 8:52 am
by onion2k
drewh wrote:Is there a rep system here?
There was, but I broke it by being too awesome.

Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 9:08 am
by Eran
There is a rep system, if you check the colors of the usernames. The more "green" you are, the more awesome you are.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 9:21 am
by drewh
Haha ok.
Hey guys, one more question: how would I go about converting all the time stamps to year, month, day, and hour columns? It seems a little tricky.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 9:52 am
by onion2k
Add the columns with ALTER TABLE `table` ADD COLUMN `hour` blah blah blah, then do something like "UPDATE `table` SET `hour` = HOUR(`timestamp`), `day` = DAYOFYEAR(`timestamp`), `year` = YEAR(`timestamp`);".
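Spelled out in full, that approach might look like this (column types are assumptions; pick whatever fits your data):

```sql
-- Add the derived columns, then backfill them from the existing timestamp.
ALTER TABLE `table`
  ADD COLUMN `year` SMALLINT UNSIGNED,
  ADD COLUMN `day`  SMALLINT UNSIGNED,
  ADD COLUMN `hour` TINYINT UNSIGNED;

UPDATE `table`
SET `hour` = HOUR(`timestamp`),
    `day`  = DAYOFYEAR(`timestamp`),
    `year` = YEAR(`timestamp`);
```

Note the UPDATE rewrites every row, so on a big table it can take a while and lock things up; run it off-peak.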
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 10:10 am
by Eran
Why would you want to do that, though? It's better to leverage MySQL's internal date handling.
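By "internal date handling" Eran presumably means using MySQL's own date functions, or better still, a range predicate that can use an index on the timestamp column (table and column names are illustrative):

```sql
-- Function applied to the column: correct, but MySQL can't use
-- an index on `timestamp` to satisfy it.
SELECT * FROM `table` WHERE YEAR(`timestamp`) = 2009;

-- Equivalent range predicate: same rows, but index-friendly.
SELECT * FROM `table`
WHERE `timestamp` >= '2009-01-01' AND `timestamp` < '2010-01-01';
```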
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 10:11 am
by drewh
Thanks onion I'll try that.
Also, onion, I thought it would cut down on the temp tables and function calls.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 10:15 am
by onion2k
pytrin wrote:why would you want to do that though? better to leverage MySQL internal date handling
When you're fetching 60,000 rows and need to run the function on every row, it's a lot faster to have done the transformations beforehand, especially if you're doing the same operation over and over again on the same data.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 10:19 am
by Eran
Right, I missed that. It could help, but you should try to optimize your indexes first. You would gain much more performance from that in my opinion, and it's better to exhaust all other options before denormalizing the schema.
Re: Executing huge queries (60,000 rows+)
Posted: Wed Aug 05, 2009 10:54 am
by drewh
Thanks a lot guys. Adding the index to the time_stamp made the page load almost instantly. And that's like 60,000 rows. Awesome!
Re: Executing huge queries (60,000 rows+)
Posted: Thu Aug 06, 2009 10:16 am
by drewh
Also, guys, for future reference for people who may look at this topic down the road: pre-calculating the functions did not drop times by much at all (0.1 to 0.2 seconds off). I did page load calculations and put them into an Excel spreadsheet, one database with the function calls and another with the values pre-calculated. Thanks a lot though.