Increase Memory Limit...
Moderator: General Moderators
- shiznatix
- DevNet Master
- Posts: 2745
- Joined: Tue Dec 28, 2004 5:57 pm
- Location: Tallinn, Estonia
- Contact:
Increase Memory Limit...
So I have this query that is run only once a month but is becoming a pain. Every month it gets bigger; this month it returned 137,937 rows. I can't slim down this query any more, it's as good as I can get it. Indexes are there and everything, and I need all the rows that are returned.
The problem is that it uses over 200 MB of memory to run this query! I had the memory limit for PHP set to 200 and it was maxing that out. I set it to 300 and all is well, but I don't want to have to keep doing that every few months because this is growing by about 10k+ rows every month.
So, how do you manage such large returned arrays? I have to do lots of calculations on them and I use caching where I can, but this is a bit too much. What options are there to speed this up and make it use less memory?
Re: Increase Memory Limit...
Try using mysql_unbuffered_query(); it doesn't automatically fetch and buffer the results into memory the way mysql_query() does. This will allow you to process one row at a time without loading everything into memory.
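A minimal sketch of the row-at-a-time approach (connection details, table, and column names here are made up for illustration; this needs a live MySQL server to run):

```php
<?php
// Hypothetical connection details for illustration.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

// mysql_unbuffered_query() streams rows from the server instead of
// buffering the entire result set in PHP's memory.
$result = mysql_unbuffered_query('SELECT id, amount FROM sales', $link);

$total = 0;
while ($row = mysql_fetch_assoc($result)) {
    // Only one row is held in PHP memory at a time.
    $total += $row['amount'];
}
echo $total;
```

One caveat: with an unbuffered result you can't use mysql_num_rows() or seek around in the result set, and you must fetch every row before you can issue another query on the same connection.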
- shiznatix
- DevNet Master
- Posts: 2745
- Joined: Tue Dec 28, 2004 5:57 pm
- Location: Tallinn, Estonia
- Contact:
Re: Increase Memory Limit...
Looks good, but I'm not so sure this is a MySQL problem so much as a PHP-trying-to-process-such-a-huge-array problem. Are there any other ways besides "do it in chunks"?
Re: Increase Memory Limit...
shiznatix wrote: I have to do lots of calculations on them and I use caching where I can but this is a bit too much.
What kind of calculations? Maybe you can push them into the DB layer, because I can't think of a calculation which can be done in PHP and not in SQL.
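As an illustration of pushing a calculation into the DB layer, a summing loop over thousands of detail rows can often become a single aggregate query (table and column names here are hypothetical):

```php
<?php
// Instead of fetching every sales row and summing in PHP,
// let MySQL aggregate server-side and return one row per customer.
$sql = 'SELECT customer_id, SUM(amount) AS total
        FROM sales
        GROUP BY customer_id';

$result = mysql_query($sql);
while ($row = mysql_fetch_assoc($result)) {
    // A handful of aggregated rows instead of ~138k detail rows in PHP.
    echo $row['customer_id'] . ': ' . $row['total'] . "\n";
}
```

The same idea applies to averages, counts, min/max, and date bucketing (GROUP BY on a date expression), all of which MySQL can do far more cheaply than a PHP loop over the raw rows.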
There are 10 types of people in this world, those who understand binary and those who don't
Re: Increase Memory Limit...
shiznatix wrote: looks good but I am not so sure that this is a mysql problem as much as it is a php trying to process such a huge array problem. Are there any other ways than "do it in chunks"
Is there a reason you must have all the rows at once in the same array? The only way to lower the memory cost is to store less in memory; there's no escaping that.
Re: Increase Memory Limit...
Hi,
what calculations are you doing? Because on normal calculations mysql_unbuffered_query would be the answer:
On the one hand, this saves a considerable amount of memory with SQL queries that produce large result sets. On the other hand, you can start working on the result set immediately after the first row has been retrieved: you don't have to wait until the complete SQL query has been performed.
Without code, I think no one can help you reduce the amount of memory needed.
Re: Increase Memory Limit...
mysql_unbuffered_query() might be what you want actually. You're right in that your problem isn't MySQL. Your problem is when MySQL returns that result set to PHP, it exhausts PHP's memory limit. mysql_unbuffered_query() will allow you to retrieve one row at a time into PHP - much lower memory overhead.
Continuing from what ~pytrin said: You could do 1 query to just return the number of rows, then in a loop, run a query that retrieves rows 1 - 1000, 1001 - 2000, 2001 - 3000, etc until it's done.
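A sketch of that chunked approach using LIMIT with an offset (the table name and chunk size are illustrative, and this assumes an open connection):

```php
<?php
$chunkSize = 1000;

// First query: just count the rows.
$result = mysql_query('SELECT COUNT(*) FROM sales');
$row    = mysql_fetch_row($result);
$total  = (int) $row[0];

// Then loop, pulling 1000 rows at a time.
for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    $sql    = sprintf('SELECT id, amount FROM sales ORDER BY id LIMIT %d, %d',
                      $offset, $chunkSize);
    $result = mysql_query($sql);
    while ($row = mysql_fetch_assoc($result)) {
        // ... do the per-chunk calculations here ...
    }
    mysql_free_result($result); // release each chunk before fetching the next
}
```

The ORDER BY matters, since without it MySQL makes no guarantee that successive LIMIT windows don't overlap or skip rows. Also note that large offsets get progressively slower, because MySQL still has to scan past the skipped rows; keyset pagination (WHERE id > $lastId LIMIT 1000) scales better on big tables.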
Real programmers don't comment their code. If it was hard to write, it should be hard to understand.
- John Cartwright
- Site Admin
- Posts: 11470
- Joined: Tue Dec 23, 2003 2:10 am
- Location: Toronto
- Contact:
Re: Increase Memory Limit...
pickle wrote: mysql_unbuffered_query() might be what you want actually. You're right in that your problem isn't MySQL. Your problem is when MySQL returns that result set to PHP, it exhausts PHP's memory limit. mysql_unbuffered_query() will allow you to retrieve one row at a time into PHP - much lower memory overhead.
Continuing from what ~pytrin said: You could do 1 query to just return the number of rows, then in a loop, run a query that retrieves rows 1 - 1000, 1001 - 2000, 2001 - 3000, etc until it's done.
Indeed. Working with databases with many millions of rows in PHP can be a pain. I've usually opted for a batch approach: process x rows at a time and repeat until all rows have been exhausted. Other projects, such as BigDump (for importing large databases), follow the same approach.
Without some more information about the query and your calculations, it's a bit difficult to assess. Sometimes I wish PHP handled its memory far better...
- shiznatix
- DevNet Master
- Posts: 2745
- Joined: Tue Dec 28, 2004 5:57 pm
- Location: Tallinn, Estonia
- Contact:
Re: Increase Memory Limit...
Well, I added a forgotten index and did some code rearranging, and it works now with no problemos. But this new function is awesome, and I'm sure I'll be using it a lot in the future since my database is growing at a ridiculous rate. Thanks kids.