Common Query performance

Questions about MySQL, PostgreSQL, and most other databases, as well as using them with PHP, can be asked here.

Moderator: General Moderators

MathewByrne
Forum Commoner
Posts: 38
Joined: Sat Mar 27, 2004 9:49 pm
Location: Australia

Common Query performance

Post by MathewByrne »

The site I'm currently working on has an SQL query that gets run on almost every page request, along with any other queries that need to be run for that particular page. Because the results of this query will only change roughly once every two weeks, I decided to store the data in a PHP file which gets included on every request. The idea is that every time the data gets updated, this file is overwritten with the new data.

The file was going to look something like:

Code: Select all

<?php

// AUTO GENERATED ##

$_DATA = array
   (
      "item1" => 1,
      "item2" => 2,
      "item3" => 3
   );

?>
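A minimal sketch of how a file like that could be regenerated after each data update (the file name, and the hard-coded rows standing in for the real query results, are assumptions for illustration, not from the thread):

```php
<?php

// Hypothetical regeneration step: run after the fortnightly data update.
// $rows would normally come from the SQL query; hard-coded here for the sketch.
$rows = array(
    "item1" => 1,
    "item2" => 2,
    "item3" => 3,
);

// var_export() produces valid PHP source for the array.
$code = "<?php\n\n// AUTO GENERATED ##\n\n\$_DATA = " . var_export($rows, true) . ";\n";

// Write to a temp file first and rename it into place, so a concurrent
// page request never includes a half-written file.
$tmp = 'data.php.tmp';
file_put_contents($tmp, $code);
rename($tmp, 'data.php');

// Every page request then just does: include 'data.php';
?>
```

The write-then-rename step matters because the cache file is included on every request while it is being rewritten.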

My question is: is it worth going to this trouble, and is it likely to be faster to simply include a PHP file with the data defined as variables rather than executing an SQL query each request?
josh
DevNet Master
Posts: 4872
Joined: Wed Feb 11, 2004 3:23 pm
Location: Palm beach, Florida

Post by josh »

It depends on the speed of your query. Just because the data hardly ever changes doesn't mean you have to have a cache, though as a general rule static cached data is always going to be faster than hitting the database. Put your query into a loop, run it 1,000 times, and divide it out to get an accurate measure of how long your query takes to run. It's up to you to decide whether or not to implement a cache. You could also check your MySQL table's indexes with

Code: Select all

EXPLAIN select * from `a`
, substituting your actual query and checking whether MySQL is using your indexes properly.

Also, if it's something like selecting DISTINCT category from a table, you might be better off making a category table and then using its id in your actual content table to reference the category names. That way you're not touching the large content table just to pull a few categories. This is called database normalization, and it has its advantages and disadvantages.
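A sketch of that category split (table and column names here are made up for illustration, not from the thread):

```sql
-- Hypothetical normalized layout: categories live in their own small table.
CREATE TABLE category (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(64)  NOT NULL UNIQUE
);

-- The content table stores only the category id, not the name.
CREATE TABLE content (
    id          INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    category_id INT UNSIGNED NOT NULL,   -- references category.id
    body        TEXT,
    INDEX (category_id)
);

-- Pulling the category list now only touches the small table:
SELECT name FROM category ORDER BY name;
```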
MathewByrne
Forum Commoner
Posts: 38
Joined: Sat Mar 27, 2004 9:49 pm
Location: Australia

Post by MathewByrne »

Thanks for the info. Would you mind posting a quick benchmark test that I can use to compare the two methods? AFAIK there isn't any way to get the query time from the last mysql_query() call, but I could be wrong.
josh
DevNet Master
Posts: 4872
Joined: Wed Feb 11, 2004 3:23 pm
Location: Palm beach, Florida

Post by josh »

You aren't capable of running the benchmarks yourself? Heck -- I don't even have any idea what your queries are..

here's something to start with [untested]

Code: Select all

<?php

// total times to run the query
$total = 10;

// the query itself
$query = 'select * from table';

// begin magic (assumes an open mysql connection)
$start = microtime_float();
for ($i = 0; $i < $total; $i++) {
    mysql_query($query);
}
echo 'The query took an average of ' . round((microtime_float() - $start) / $total, 4) . ' seconds each time';

/**
 * Simple function to replicate PHP 5 behaviour
 */
function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return ((float)$usec + (float)$sec);
}
?>
raghavan20
DevNet Resident
Posts: 1451
Joined: Sat Jun 11, 2005 6:57 am
Location: London, UK

Post by raghavan20 »

The benchmarking option is available in MySQL itself.

Code: Select all

mysql> select benchmark(10000, "select * from mysql.user");
+----------------------------------------------+
| benchmark(10000, "select * from mysql.user") |
+----------------------------------------------+
|                                            0 |
+----------------------------------------------+
1 row in set (0.36 sec)
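One caveat worth flagging about the example above: MySQL's BENCHMARK() repeatedly evaluates a scalar expression, not a full query. Passing a query as a string literal just evaluates that string constant, which is why it returns 0 almost instantly; a sketch of the intended usage:

```sql
-- BENCHMARK(count, expr) evaluates a scalar expression count times.
-- This times one million MD5() evaluations:
SELECT BENCHMARK(1000000, MD5('test'));

-- A quoted query is just a string constant, so this measures
-- essentially nothing about query execution:
SELECT BENCHMARK(10000, "select * from mysql.user");
```

That distinction may explain why it appears not to work for complex queries.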
josh
DevNet Master
Posts: 4872
Joined: Wed Feb 11, 2004 3:23 pm
Location: Palm beach, Florida

Post by josh »

Try running a more complex query with that, like one with many nested subqueries and joins... it didn't work for me, at least.