Fastest query to determine if it matches at least one record

Questions about MySQL, PostgreSQL, and most other databases, as well as using them with PHP, can be asked here.

Moderator: General Moderators

User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Fastest query to determine if it matches at least one record

Post by anjanesh »

Hi

I would like to know if there's at least one record in a table for a specified id. I don't need to know how many there are.

Code: Select all

1. SELECT COUNT(`ID`) FROM `table` WHERE `KeywordID` = 15;
2. SELECT COUNT(*) FROM `table` WHERE `KeywordID` = 15;
3. SELECT `ID` FROM `table` WHERE `KeywordID` = 15;
4. SELECT `ID` FROM `table` WHERE `KeywordID` = 15 LIMIT 0, 1;
5. SELECT * FROM `table` WHERE `KeywordID` = 15;
6. SELECT * FROM `table` WHERE `KeywordID` = 15 LIMIT 0, 1;
For queries using COUNT :

Code: Select all

$row = mysql_fetch_row($res);
if ($row[0] > 0) // at least one record
 { ... }
And for queries not using COUNT :

Code: Select all

if (mysql_num_rows($res) > 0)
 { ... }
Which one of these should be fastest?

Thanks
User avatar
Christopher
Site Administrator
Posts: 13596
Joined: Wed Aug 25, 2004 7:54 pm
Location: New York, NY, US

Post by Christopher »

Good question. Since you have a number of possible options, have you benchmarked them yourself? If you don't know how to do timings, search for "timer" or "microtime".
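If your MySQL server is recent enough (5.0.37+), you can also get per-query timings from the server itself rather than from PHP; a sketch, assuming the table from the first post:

```sql
SET profiling = 1;

-- run the candidate queries back to back
SELECT COUNT(*) FROM `table` WHERE `KeywordID` = 15;
SELECT `ID` FROM `table` WHERE `KeywordID` = 15 LIMIT 0, 1;

-- lists each statement above with its duration in seconds
SHOW PROFILES;
```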
(#10850)
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

I did try benchmarking these. The problem is, the results vary because the table has very few records, and at that size the time taken is the same for all of the queries.

I need to know these timings for a large number of records, say a table with a million rows or so.
User avatar
John Cartwright
Site Admin
Posts: 11470
Joined: Tue Dec 23, 2003 2:10 am
Location: Toronto
Contact:

Post by John Cartwright »

Generate a million rows then perform your tests again :)
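One quick way to do that, sketched against a hypothetical table matching the queries above, is to seed one row and then double the row count with repeated INSERT ... SELECT statements:

```sql
-- hypothetical schema matching the queries in the first post
CREATE TABLE `table` (
  `ID` INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  `KeywordID` INT NOT NULL
);

-- seed one row
INSERT INTO `table` (`KeywordID`) VALUES (15);

-- each run doubles the row count; repeat ~20 times for 2^20 ≈ 1,000,000 rows
INSERT INTO `table` (`KeywordID`)
SELECT FLOOR(1 + RAND() * 1000) FROM `table`;
```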
nickvd
DevNet Resident
Posts: 1027
Joined: Thu Mar 10, 2005 5:27 pm
Location: Southern Ontario
Contact:

Post by nickvd »

Jcart wrote:Generate a million rows then perform your tests again :)
Just wanted to agree with Jcart... simply generate test data and see what happens.
User avatar
mikeq
Forum Regular
Posts: 512
Joined: Fri May 03, 2002 3:33 am
Location: Edinburgh, Scotland

Post by mikeq »

You should also use SQL_NO_CACHE when running the tests, i.e.

Code: Select all

SELECT SQL_NO_CACHE COUNT(id) FROM table WHERE keywordid=15
That way the MySQL query cache will not come into play when running subsequent tests with the same queries.

Maybe also look at using MAX() and MIN(); if keywordid is indexed, these should be the fastest.

Something like:

Code: Select all

SELECT SQL_NO_CACHE MAX(keywordid) FROM table WHERE keywordid=15;
I ran this type of query on a table I had with 20,000 records. The EXPLAIN plan showed 'Select tables optimized away' in the Extra column, which means the table was removed from the query altogether; it doesn't get any better than that :)

For your queries, add a multi-column index on (keywordid, id). If both columns are integers and both are used in the query, running EXPLAIN should show 'Using index' in the Extra column. This means the query didn't need to go to the data table at all but got all the information straight from the index file: fast. Not as fast as my query above, but fast.
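A sketch of that setup, assuming the table and column names from the first post (the index name here is made up):

```sql
-- covering index: both columns used by the query are in the index
ALTER TABLE `table` ADD INDEX `idx_keyword_id` (`KeywordID`, `ID`);

-- check the plan; the Extra column should read 'Using index'
EXPLAIN SELECT `ID` FROM `table` WHERE `KeywordID` = 15 LIMIT 0, 1;
```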