timvw wrote: It's probably something like:
Code:
SELECT
SUM(invoices.total) - SUM(returns_table.total) AS result, invoices.date, invoices.client_id
FROM
invoices, returns_table
WHERE
invoices.client_id = returns_table.client_id
AND invoices.date = returns_table.date
AND invoices.date BETWEEN '....' AND '...'
GROUP BY
invoices.date, invoices.client_id
That's not going to work, because the JOIN pairs every invoice row with every matching return row for the same client_id and date. So each invoice total gets counted once per matching return row, and you end up with (number of matching return rows X the real sum of the invoice totals).
Say, for example, table invoices has 3 rows for a client that fall in the date range in question...
// table invoices (total)
Code:
row 1 = 10.00
row 2 = 12.00
row 3 = 11.00
If returns_table also has 3 matching rows for that client_id, your SUM(invoices.total) would return 99.00 (3 x 33.00) for that client instead of 33.00. There is no way to JOIN the (2) queries flat unless column (total) were a true or false (IE: 0 or 1) type column; then you could get away with the JOIN. Seeing the column total is not like that, you need to pre-aggregate in a sub-query and LEFT OUTER JOIN on the sub-query!
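To see the fan-out happen, here's a small sketch using Python's sqlite3 with made-up numbers (the table and column names follow the post; the data and dates are hypothetical):

```python
# Sketch of the join fan-out problem, using sqlite3 with hypothetical data.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE invoices      (client_id INT, date TEXT, total REAL);
    CREATE TABLE returns_table (client_id INT, date TEXT, total REAL);
    -- 3 invoice rows for client 1: 10 + 12 + 11 = 33.00
    INSERT INTO invoices VALUES
        (1, '2007-01-05', 10.00), (1, '2007-01-05', 12.00), (1, '2007-01-05', 11.00);
    -- 3 return rows for the same client and date
    INSERT INTO returns_table VALUES
        (1, '2007-01-05', 2.00), (1, '2007-01-05', 3.00), (1, '2007-01-05', 1.00);
""")

# The flat join pairs every invoice row with every matching return row:
# 3 x 3 = 9 joined rows, so each invoice total is summed 3 times over.
naive = cur.execute("""
    SELECT SUM(i.total)
    FROM invoices AS i, returns_table AS r
    WHERE i.client_id = r.client_id AND i.date = r.date
""").fetchone()[0]
print(naive)  # 99.0 instead of the expected 33.0
```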
So the optimized query would look like this (change the dates [2007-01-01, 2007-03-31] to the range you're wanting to query for):
Code:
SELECT
    a.client_id,
    ( SUM(a.total) - IFNULL(b.total, 0) ) AS total_difference
FROM invoices AS a
LEFT OUTER JOIN (
    SELECT client_id, SUM(total) AS total
    FROM returns_table
    WHERE `date` BETWEEN '2007-01-01' AND '2007-03-31'
    GROUP BY client_id
) AS b ON a.client_id = b.client_id
WHERE a.date BETWEEN '2007-01-01' AND '2007-03-31'
GROUP BY a.client_id
ORDER BY a.client_id;
Note the date filter on invoices goes in the WHERE clause, not the JOIN condition; in the ON clause of a LEFT OUTER JOIN it would not actually drop out-of-range invoice rows. The IFNULL covers clients who have invoices but no returns in the range.
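If you want to sanity-check the sub-query + LEFT OUTER JOIN pattern, here's a quick sqlite3 run on hypothetical data (client 2 is added with no returns at all, to show why the NULL case matters):

```python
# Quick check of the sub-query + LEFT OUTER JOIN pattern (hypothetical data).
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE invoices      (client_id INT, date TEXT, total REAL);
    CREATE TABLE returns_table (client_id INT, date TEXT, total REAL);
    INSERT INTO invoices VALUES
        (1, '2007-01-05', 10.00), (1, '2007-01-05', 12.00), (1, '2007-01-05', 11.00),
        (2, '2007-02-10', 20.00);  -- client 2 has no returns
    INSERT INTO returns_table VALUES
        (1, '2007-01-05', 2.00), (1, '2007-01-05', 3.00), (1, '2007-01-05', 1.00);
""")

# Pre-aggregate returns to one row per client, then join; each invoice row
# now matches at most one sub-query row, so no fan-out occurs.
rows = cur.execute("""
    SELECT a.client_id, SUM(a.total) - IFNULL(b.total, 0) AS total_difference
    FROM invoices AS a
    LEFT OUTER JOIN (
        SELECT client_id, SUM(total) AS total
        FROM returns_table
        WHERE date BETWEEN '2007-01-01' AND '2007-03-31'
        GROUP BY client_id
    ) AS b ON a.client_id = b.client_id
    WHERE a.date BETWEEN '2007-01-01' AND '2007-03-31'
    GROUP BY a.client_id
    ORDER BY a.client_id
""").fetchall()
print(rows)  # [(1, 27.0), (2, 20.0)]  -- 33 - 6 = 27; client 2 keeps its 20
```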