
Database Backup

Posted: Thu Jul 07, 2005 4:11 am
by phpnoobie
What should I do if I want to take a backup of a MySQL database on my system, then upload it to my web server and restore it there?
Thanks in advance

Posted: Thu Jul 07, 2005 6:27 am
by Chris Corbyn
N00b method: Use phpMyAdmin to export the SQL (the Export tab).

Command line method:

Code:

mysqldump -u username -p databasename > databasename.sql
That creates a .sql file containing MySQL statements (CREATE TABLE and INSERT queries and the like).

To load that into the other database, upload the file, then use phpMyAdmin's Import feature, OR

use the command line:

Code:

-- Log in to MySQL using
mysql -u username -p

-- Then select the database you want to add it to (create one if you don't have one)
use databasename;

-- Then run the backup SQL (note the \. command takes the filename with no trailing semicolon)
\. /path/to/sql/file.sql

Posted: Thu Jul 07, 2005 10:15 am
by onion2k
Personally I find exporting the database structure and the data separately is a good idea. Generally the data isn't a problem; it just works. However, the structure sometimes needs a bit of a tweak, especially if the MySQL versions are different. If you export the structure separately, you won't have to trawl through masses of data to fix things.
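That split can also be done at dump time: mysqldump has flags for structure-only and data-only exports. A sketch (the database and user names are placeholders, and this needs a running MySQL server):

```shell
# Structure only: CREATE TABLE statements, no rows
mysqldump --no-data -u username -p databasename > structure.sql

# Data only: INSERT statements, no CREATE TABLE statements
mysqldump --no-create-info -u username -p databasename > data.sql
```

Importing structure.sql first, fixing any version quirks, then importing data.sql keeps the two concerns apart.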

Also, if you resort to using phpMyAdmin, you might find this helpful: it takes an SQL dump and breaks it into 1.8 MB chunks (and separates out the structure too).

Code:

<?php

	// Split a large SQL dump into structure (tables.sql) and
	// numbered data files of at most $maxsize bytes each.
	$maxsize = 1887436; // 1.8 MB

	$d = 0;
	$count = 0;

	$file = fopen("hugegreatdumpfile.sql", "r");
	$tables = fopen("tables.sql", "w");
	$data = fopen("data_".++$d.".sql", "w");

	while (!feof($file)) {

		// Start a new data file once the current one is full.
		if ($count >= $maxsize) {
			fclose($data);
			$data = fopen("data_".++$d.".sql", "w");
			$count = 0;
		}

		$line = fgets($file, 100000);
		if (substr($line, 0, 6) == "INSERT") {
			fwrite($data, $line);
			$count += strlen($line);
		} elseif (substr($line, 0, 1) == "#") {
			// Skip comment lines.
		} else {
			fwrite($tables, $line);
		}

	}

	fclose($file);
	fclose($tables);
	fclose($data);

	echo "Completed: ".$d." data files created.";

?>
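On a Unix box the same structure/data split can be sketched with grep, since the script above only keys on whether a line starts with "INSERT". The sample dump below is made up for demonstration:

```shell
# Create a small sample dump (hypothetical content) for demonstration
cat > dump.sql <<'EOF'
CREATE TABLE t (id INT);
INSERT INTO t VALUES (1);
INSERT INTO t VALUES (2);
EOF

# INSERT lines go to data.sql, everything else to tables.sql
grep '^INSERT' dump.sql > data.sql
grep -v '^INSERT' dump.sql > tables.sql

wc -l < data.sql    # 2
wc -l < tables.sql  # 1
```

Unlike the PHP script this doesn't chunk the data file, but it's handy for a quick one-off split.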

Posted: Thu Jul 07, 2005 12:45 pm
by Burrito
I use the command line too, but I do it without logging in to MySQL first:

Code:

mysqldump -u root -pyourpassword yourdatabase > d:\yourpath\yourfile.sql
Then, to import on the new machine:

Code:

mysql -B -u root --password=yourpassword -h localhost yourdatabase < d:\yourpath\yourfile.sql
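For large databases it can also help to compress the dump before transferring it. A sketch along the same lines (credentials, database name, and paths are placeholders; gzip must be installed and a MySQL server running):

```shell
# Dump and compress in one step
mysqldump -u root -p yourdatabase | gzip > yourfile.sql.gz

# On the new machine, decompress and import in one step
gunzip -c yourfile.sql.gz | mysql -u root -p yourdatabase
```

Piping avoids ever writing the uncompressed dump to disk, which matters when the .sql file is bigger than the free space you have.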