Increase Memory Limit...
Posted: Fri Nov 07, 2008 8:52 am
by shiznatix
So I have this query that is run only once a month but is becoming a pain. Every month it gets bigger; this month it returned 137,937 rows. I can't slim down this query any more, it's as good as I can get it. The indexes are there and everything, and I need all the rows that are returned.
The problem is that it is using over 200 MB of memory to run this query! I had the memory limit for PHP set to 200 MB and it was maxing that out. I set it to 300 MB and all is well, but I don't want to have to keep doing that every few months because this is growing by about 10k+ rows every month.
So, how do you manage such large returned arrays? I have to do lots of calculations on them and I use caching where I can, but this is a bit too much. What options are there to speed this up and have it use less memory?
Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 9:07 am
by Eran
Try to use mysql_unbuffered_query(); it doesn't automatically fetch and buffer the results into memory the way mysql_query() does. This will allow you to process one row at a time without loading everything into memory.
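A minimal sketch of the idea, assuming made-up connection details and a hypothetical `big_table` with an `amount` column:

```php
<?php
// Hypothetical connection details; adjust for your setup.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

// mysql_unbuffered_query() streams rows from the server instead of
// buffering the entire result set inside PHP first.
$result = mysql_unbuffered_query('SELECT id, amount FROM big_table', $link);

$total = 0;
while ($row = mysql_fetch_assoc($result)) {
    // Only one row is held in PHP's memory at a time.
    $total += $row['amount'];
}

mysql_free_result($result);
echo $total;
?>
```

One caveat from the manual: you cannot issue another query on the same connection until all rows of an unbuffered result have been fetched (or the result freed).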
Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 9:10 am
by shiznatix
Looks good, but I am not so sure this is a MySQL problem as much as it is a PHP-trying-to-process-such-a-huge-array problem. Are there any other ways than "do it in chunks"?
Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 9:24 am
by VladSun
shiznatix wrote:I have to do lots of calculations on them and I use caching were I can but this is a bit too much.
What kind of calculations? Maybe you can move them into the DB layer, because I can't think of a calculation which can be done in PHP but not in SQL.

Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 9:39 am
by Eran
shiznatix wrote:looks good but I am not so sure that this is a mysql problem as much as it is a php trying to process such a huge array problem. Are there any other ways than "do it in chunks"
Is there a reason you must have all the rows at once in the same array? The only way to lower the memory cost is by storing less in memory; there's no escaping that.
Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 10:29 am
by Hannes2k
Hi,
what calculations are you doing? For normal calculations mysql_unbuffered_query() would be the answer:
On the one hand, this saves a considerable amount of memory with SQL queries that produce large result sets. On the other hand, you can start working on the result set immediately after the first row has been retrieved: you don't have to wait until the complete SQL query has been performed.
Without code, I think no one can help you reduce the amount of memory you need.
Re: Increase Memory Limit...
Posted: Fri Nov 07, 2008 10:36 am
by pickle
mysql_unbuffered_query() might be what you want actually. You're right in that your problem isn't MySQL. Your problem is when MySQL returns that result set to PHP, it exhausts PHP's memory limit.
mysql_unbuffered_query() will allow you to retrieve one row at a time into PHP - much lower memory overhead.
Continuing from what ~pytrin said: You could run one query that just returns the number of rows, then in a loop run a query that retrieves rows 1 - 1000, 1001 - 2000, 2001 - 3000, etc. until it's done.
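The batching loop could be sketched like this; table, column, and connection details are made up for illustration:

```php
<?php
// Hypothetical connection details; adjust for your setup.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

// One cheap query up front to get the total row count.
$count = mysql_result(
    mysql_query('SELECT COUNT(*) FROM big_table', $link), 0
);

// Then fetch and process the rows in fixed-size batches, so PHP
// never holds more than $batchSize rows at once.
$batchSize = 1000;
for ($offset = 0; $offset < $count; $offset += $batchSize) {
    $result = mysql_query(
        "SELECT id, amount FROM big_table ORDER BY id LIMIT $offset, $batchSize",
        $link
    );
    while ($row = mysql_fetch_assoc($result)) {
        // Do the per-row calculations here.
    }
    mysql_free_result($result);
}
?>
```

The ORDER BY matters: without a stable ordering, LIMIT/OFFSET batches are not guaranteed to cover each row exactly once.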
Re: Increase Memory Limit...
Posted: Mon Nov 24, 2008 3:16 pm
by John Cartwright
pickle wrote:mysql_unbuffered_query() might be what you want actually. You're right in that your problem isn't MySQL. Your problem is when MySQL returns that result set to PHP, it exhausts PHP's memory limit.
mysql_unbuffered_query() will allow you to retrieve one row at a time into PHP - much lower memory overhead.
Continuing from what ~pytrin said: You could do 1 query to just return the number of rows, then in a loop, run a query that retrieves rows 1 - 1000, 1001 - 2000, 2001 - 3000, etc until it's done.
Indeed. Working with databases of many millions of rows in PHP can be a pain. I've usually opted for a batch approach: process x rows, and repeat until all rows have been exhausted. Other projects, such as BigDump (for importing large databases), follow the same approach.
Without some more information about the query and your calculations, it's a bit difficult to assess. Sometimes I wish PHP handled its memory far better.
Re: Increase Memory Limit...
Posted: Tue Nov 25, 2008 6:18 am
by shiznatix
Well, I added a forgotten index and did some code rearranging, and it works now, no problemos. But this new function is awesome and I'm sure I'll be using it a lot in the future, since my database is growing at a ridiculous rate. Thanks, kids.