Executing huge queries (60,000 rows+)
Re: Executing huge queries (60,000 rows+)
Make sure that ALL fields used in WHERE and ORDER BY clauses have their indexes built.
You could even try building composite indexes for cases where more than one field is used in these clauses.
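As a minimal sketch of that advice (the table and column names here are illustrative, not from this thread):

```sql
-- Single-column index for a field used alone in WHERE:
CREATE INDEX idx_orders_status ON orders (status);

-- Composite index covering a WHERE + ORDER BY combination, e.g. for
-- SELECT * FROM orders WHERE status = ? ORDER BY created_at;
CREATE INDEX idx_orders_status_created ON orders (status, created_at);
```

Column order in the composite index matters: MySQL can use it for the equality filter on `status` and then read rows already sorted by `created_at`, avoiding a filesort.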
Last edited by VladSun on Wed Aug 05, 2009 8:47 am, edited 1 time in total.
There are 10 types of people in this world, those who understand binary and those who don't
Re: Executing huge queries (60,000 rows+)
Hey, thanks again. I'm using MDB2 right now instead of MySQL's built-in stuff. I'm using the command $result->free(); I thought this was similar to what you were talking about.
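For anyone following along, a rough sketch of that MDB2 pattern (connection string and table name are placeholders): stream rows one at a time instead of fetching everything, then free the result set.

```php
<?php
require_once 'MDB2.php';

// Placeholder DSN -- substitute your own credentials and database.
$mdb2 = MDB2::connect('mysql://user:pass@localhost/dbname');

$result = $mdb2->query('SELECT * FROM big_table');

// Fetch row by row so PHP never holds all 60,000 rows at once.
while ($row = $result->fetchRow(MDB2_FETCHMODE_ASSOC)) {
    // process one row at a time here
}

// Release the result set's memory when done.
$result->free();
```

Using fetchRow() in a loop rather than fetchAll() is what keeps the peak memory usage flat as the result set grows.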
Also, I think it's memory, because every time I increase the memory the query gets further and further down the page and more results are displayed than the time before.
Re: Executing huge queries (60,000 rows+)
The index on pkg_line_id (I think it's called "FK_pkg_load_pkg_line_id" for some reason) needs to be a composite and include time_stamp and year as secondary fields (in that order). Make that modification and run EXPLAIN again.
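A sketch of what that modification might look like (the table name `pkg_load` and the sample WHERE clause are assumptions, not taken from the poster's actual schema):

```sql
-- Replace the single-column index with the suggested composite.
ALTER TABLE pkg_load
    DROP INDEX FK_pkg_load_pkg_line_id,
    ADD INDEX idx_line_time_year (pkg_line_id, time_stamp, year);

-- Verify the optimizer actually picks the new index:
EXPLAIN SELECT *
FROM pkg_load
WHERE pkg_line_id = 1
ORDER BY time_stamp;
```

In the EXPLAIN output, look for `idx_line_time_year` in the `key` column and the absence of "Using filesort" in `Extra`.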
Re: Executing huge queries (60,000 rows+)
Ok, I'm sorry. I set the time limit to 0 and it worked fine. Thank you so much. What should I default it to, or where should I actually set it back, say, if they just exit the browser in the middle of the query? Thanks a lot.
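For reference, a hedged sketch of what "setting the time limit" looks like in PHP (the 120-second value is just an example, not a recommendation from this thread):

```php
<?php
// set_time_limit(0) removes PHP's execution-time cap entirely.
// A safer pattern is a generous but finite limit on the heavy page only,
// leaving php.ini's max_execution_time (30s by default) for everything else.
set_time_limit(120); // allow this page up to two minutes

// By default PHP aborts the script when the client disconnects
// mid-request; uncomment the next line only if the query must finish
// even after the user closes the browser.
// ignore_user_abort(true);
```

Since set_time_limit() applies per request, there is nothing to "set back" afterward; other pages keep the php.ini default.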
Hey again. Thanks to all who are helping. These are really good tips. Is there a rep system here?
Re: Executing huge queries (60,000 rows+)
drewh wrote:Is there a rep system here?
There was, but I broke it by being too awesome.
Re: Executing huge queries (60,000 rows+)
There is a rep system, if you check the colors of the usernames. The more "green" you are, the more awesome you are.
Re: Executing huge queries (60,000 rows+)
Haha ok.
Hey guys, one more question: how would I go about converting all the time stamps to year, month, day, hour columns? It seems a little tricky.
Re: Executing huge queries (60,000 rows+)
Add the columns with ALTER TABLE `table` ADD COLUMN `hour` blah blah blah, then do something like "UPDATE `table` SET `hour` = HOUR(`timestamp`), `day` = DAYOFYEAR(`timestamp`), `year` = YEAR(`timestamp`);".
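Expanded into a full sketch (column types are assumptions; pick whatever fits your data):

```sql
-- Add the derived columns; unsigned integer types are a guess here.
ALTER TABLE `table`
    ADD COLUMN `year` SMALLINT UNSIGNED,
    ADD COLUMN `day`  SMALLINT UNSIGNED,
    ADD COLUMN `hour` TINYINT UNSIGNED;

-- Backfill them from the existing timestamp in one pass.
UPDATE `table`
SET `hour` = HOUR(`timestamp`),
    `day`  = DAYOFYEAR(`timestamp`),
    `year` = YEAR(`timestamp`);
```

Note that DAYOFYEAR() returns 1 to 366; if you want the day of the month instead, use DAY().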
Re: Executing huge queries (60,000 rows+)
Why would you want to do that, though? Better to leverage MySQL's internal date handling.
Re: Executing huge queries (60,000 rows+)
Thanks, onion, I'll try that.
Also, onion thought it would cut down on the temp tables and function calls.
Re: Executing huge queries (60,000 rows+)
pytrin wrote:why would you want to do that though? better to leverage MySQL internal date handling
When you're fetching 60,000 rows and need to run the function on every row, it's a lot faster to have done the transformations beforehand, especially if you're doing the same operation over and over again on the same data.
Re: Executing huge queries (60,000 rows+)
Right, I missed that. It could help, but you should try to optimize your indexes first. You would gain much more performance from that in my opinion, and it's better to exhaust all other options before denormalizing the schema.
Re: Executing huge queries (60,000 rows+)
Thanks a lot guys. Adding the index to the time_stamp made the page load almost instantly. And that's like 60,000 rows. Awesome!
Re: Executing huge queries (60,000 rows+)
Also, guys, for future reference for people who may look at this topic down the road: pre-calculating the functions did not drop times by much at all (0.1 to 0.2 seconds off). I did page-load calculations and put them into an Excel spreadsheet, comparing one database with the function calls and another database with the values pre-calculated. Thanks a lot though.