I have a function that loops through lots of tables, gathering various information, and exports the whole lot in CSV format. It works OK, but the problem is that it's really slow.
Here is what it does in a nutshell (I am using a MySQL library called "ezsql"):
Code: Select all
$csv_output = ''; // initialise so the .= below doesn't append to an undefined variable
$contacts = $db->get_results('SELECT * FROM contact');
foreach ($contacts as $contact) {
    $row = array();
    // get address for this contact from the address table
    // get phone numbers from the phone table
    // get lots of stuff from other tables
    $row[] = $info1;
    $row[] = $info2;
    // ... and I keep on adding things
    $csv_output .= join(',', $row) . "\n";
}
$filename = "cbt_user_db.csv";
header('Content-Type: text/x-csv');
header('Expires: ' . gmdate('D, d M Y H:i:s') . ' GMT');
// PMA_USR_BROWSER_AGENT is defined elsewhere (borrowed from phpMyAdmin)
if (PMA_USR_BROWSER_AGENT == 'IE') {
    header('Content-Disposition: inline; filename="' . $filename . '"');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
} else {
    header('Content-Disposition: attachment; filename="' . $filename . '"');
    header('Pragma: no-cache');
}
print $csv_output;
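In case it helps, here is the CSV-building step in isolation as a stripped-down, runnable sketch (dummy data, no database; the names are made up). I used fputcsv() on a memory stream here instead of join(','), since join() doesn't quote fields that contain commas:

```php
<?php
// Dummy rows standing in for the per-contact query results.
$contacts = [
    ['name' => 'Alice, A.', 'phone' => '555-0100'],
    ['name' => 'Bob',       'phone' => '555-0101'],
];

// Write the CSV into a memory stream; fputcsv handles quoting and
// escaping (e.g. a comma inside a field) that a plain join(',') misses.
$fh = fopen('php://temp', 'r+');
foreach ($contacts as $contact) {
    fputcsv($fh, [$contact['name'], $contact['phone']]);
}
rewind($fh);
$csv_output = stream_get_contents($fh);
fclose($fh);

echo $csv_output;
```

The real slowdown is presumably the per-contact queries rather than this part, but I wanted to show the whole pattern.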
Does anyone have any suggestions on how to optimize this?
Many thanks,
Hubert.