
Is this server intensive?

Posted: Sun Feb 05, 2006 1:27 pm
by Covenant
Is it bad on the server (slows things down) to have a mysql_query or while() statement within another while() statement?

Here's an example:

Code:

$boards = mysql_query("SELECT * FROM cats");

while ($row = mysql_fetch_array($boards))
{
    echo $row['title'];
    $boards2 = mysql_query("SELECT * FROM boards WHERE catid='{$row['id']}'");
    while ($row2 = mysql_fetch_array($boards2))
    {
        echo $row2['title'];
    }
}

Posted: Sun Feb 05, 2006 1:48 pm
by raghavan20
It is a bit server intensive; a join is quicker. But do you think you can easily get the information from the join's result set displayed the way you want? You would then be delegating that work to the web server instead of the database server. For a simple listing, I think it is better to use single SELECT statements inside loops instead of joins. If anybody has a better way to handle this result easily, I would be eager to hear it.

Code:

select a.`title`, a.`id`, b.`title` from
cats a
left join `boards` b on b.`catid` = a.`id`
order by a.`title`
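
To actually display that join result grouped by category, one option is to detect when the category changes while looping over the rows. A hedged sketch (untested) follows; note that both `title` columns need aliases, since with `mysql_fetch_assoc()` the second `title` would otherwise overwrite the first:

```php
<?php
// Sketch: grouping a join result by category as it streams.
// Column aliases (cat_title, cat_id, board_title) are assumptions added
// here so the two `title` columns don't collide in the assoc array.
$result = mysql_query(
    'SELECT a.`title` AS cat_title, a.`id` AS cat_id, b.`title` AS board_title
     FROM `cats` a
     LEFT JOIN `boards` b ON b.`catid` = a.`id`
     ORDER BY a.`title`'
) or die(mysql_error());

$lastCat = null;
while ($row = mysql_fetch_assoc($result)) {
    if ($row['cat_id'] !== $lastCat) {       // a new category starts here
        echo '<b>' . $row['cat_title'] . '</b><br />';
        $lastCat = $row['cat_id'];
    }
    if ($row['board_title'] !== null) {      // LEFT JOIN can yield empty cats
        echo $row['board_title'] . '<br />';
    }
}
mysql_free_result($result);
?>
```

This keeps the grouping logic on the web server but still issues only one query.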

Posted: Mon Feb 06, 2006 5:09 am
by duk
I think what he is doing is faster than the join query... the join query puts more load on the database server, while his SELECTs-inside-whiles approach spreads the work across both the web server and the database server... so it should work faster... I didn't test it, though....

Posted: Mon Feb 06, 2006 10:03 am
by feyd
The OP's approach is very likely not faster. It runs n + 1 queries, where n is the number of category records that match.

There are two possible routes to make it faster, one is raghavan20's. The other is running two queries and making the connections yourself after the fact, or using a mild amount of preprocessing.

Code:

$cats = mysql_query('SELECT * FROM `cats` ORDER BY `order`') or die(mysql_error());
$titles = array();
while($row = mysql_fetch_assoc($cats)) {
  $titles[$row['id']] = $row['title'];
}
$boards = mysql_query('SELECT * FROM `boards` WHERE `catid` IN(\'' . implode('\',\'', array_keys($titles)) . '\') ORDER BY `catid`, `order`') or die(mysql_error());
$subtitles = array();
while($row = mysql_fetch_assoc($boards)) {
  if(!isset($subtitles[$row['catid']])) {
    $subtitles[$row['catid']] = array();
  }
  $subtitles[$row['catid']][$row['boardid']] = $row['title'];
}

foreach($titles as $catid => $title) {
  echo '<hr><b>'.$title.'</b><br />';
  if(isset($subtitles[$catid])) {
    foreach($subtitles[$catid] as $boardid => $title) {
      echo $title.'<br />';
    }
  }
}

Posted: Mon Feb 06, 2006 8:53 pm
by s.dot
I have done a lot of loading query results into arrays and referencing them later in the script, to avoid running queries inside loops.

However, on a few scripts I am loading 4 or 5 separate arrays which I think is using up more memory than it should. I think I should look into using joins.

Posted: Mon Feb 06, 2006 10:49 pm
by josh
@scrotaye ~

or only run the queries when you need them; instead of loading them into arrays, loop over the result as you need it and call mysql_free_result() as soon as possible in the script
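
A minimal sketch of that pattern (untested, table and column names follow the OP's schema): consume each row as it is fetched, then release the result set immediately rather than buffering it in an array:

```php
<?php
// Run the query only where the page needs it, stream the rows,
// and free the result buffer as soon as the loop is done.
$result = mysql_query('SELECT `title` FROM `cats` ORDER BY `title`')
    or die(mysql_error());
while ($row = mysql_fetch_assoc($result)) {
    echo $row['title'] . '<br />';   // use each row as it arrives
}
mysql_free_result($result);          // release the memory right away
?>
```

This trades the random-access convenience of prebuilt arrays for a lower peak memory footprint.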