Database Backup

Questions about MySQL, PostgreSQL, and most other databases, as well as using them with PHP, can be asked here.

Moderator: General Moderators

Post Reply
phpnoobie
Forum Newbie
Posts: 7
Joined: Wed Jun 22, 2005 1:30 am
Contact:

Database Backup

Post by phpnoobie »

What should I do if I want to take a backup of a MySQL database on my system, upload it to my web server, and apply the backup there?
Thanks in advance
User avatar
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Post by Chris Corbyn »

N00b method: use phpMyAdmin to export the SQL (the Export tab).

Command line method:

Code: Select all

mysqldump databasename -u username -p > databasename.sql
That creates a .sql file which has a bunch of MySQL statements in it (CREATE TABLE and INSERT queries and the like).

To load that into the other database: upload the file, then use phpMyAdmin's Import feature, OR

use the command line:

Code: Select all

# Log in to MySQL using
mysql -u username -p

# Then select the database you want to add it to (create one if you don't have one)
USE databasename;

# Then run the backup SQL (note the \. — it takes the rest of the line as the
# file name, so don't add a semicolon)
\. /path/to/sql/file.sql
User avatar
onion2k
Jedi Mod
Posts: 5263
Joined: Tue Dec 21, 2004 5:03 pm
Location: usrlab.com

Post by onion2k »

Personally I find exporting the database structure and the data separately is a good idea. Generally the data isn't a problem; it just works. However, the structure sometimes needs a bit of a tweak, especially if the MySQL versions are different. If you export the structure separately you won't have to trawl through masses of data to fix things.

Also, if you resort to using phpMyAdmin, you might find this helpful: it takes an SQL dump and breaks it into 1.8 MB chunks (and separates the structure stuff too).

Code: Select all

<?php

	// Split a dump into structure (tables.sql) and data chunks
	// (data_1.sql, data_2.sql, ...), each at most ~1.8 MB.
	$maxsize = 1887436;

	$d = 0;
	$count = 0;

	$file = fopen("hugegreatdumpfile.sql", "r");
	$tables = fopen("tables.sql", "w");
	$data = fopen("data_".++$d.".sql", "w");

	while (!feof($file)) {

		// Start a new data file once the current one hits the size limit
		if ($count >= $maxsize) {
			fclose($data);
			$data = fopen("data_".++$d.".sql", "w");
			$count = 0;
		}

		$line = fgets($file, 100000);
		if ($line === false) {
			break; // EOF
		}

		if (substr($line, 0, 6) == "INSERT") {
			fwrite($data, $line);    // data rows
			$count += strlen($line);
		} elseif (substr($line, 0, 1) == "#") {
			// dump comment lines: skip
		} else {
			fwrite($tables, $line);  // structure (CREATE TABLE etc.)
		}

	}

	fclose($file);
	fclose($tables);
	fclose($data);

	echo "Completed: ".$d." data files created.";

?>
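If you only need the structure/data split and not the size chunking, the same filtering can be sketched with standard shell tools. The file names and dump contents below are made-up examples, and, like the script above, this assumes each statement sits on a single line (mysqldump's default output does):

```shell
# Tiny sample dump so the example runs standalone (contents are hypothetical)
printf '%s\n' '# dump comment' 'CREATE TABLE t (id INT);' \
    'INSERT INTO t VALUES (1);' 'INSERT INTO t VALUES (2);' > dump.sql

# INSERT lines (the data) go to one file...
grep '^INSERT' dump.sql > data.sql

# ...and everything except INSERTs and # comment lines (the structure) to another
grep -v '^INSERT' dump.sql | grep -v '^#' > tables.sql
```

This mirrors the script's three-way branch: data to one output, comments dropped, everything else treated as structure.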
User avatar
Burrito
Spockulator
Posts: 4715
Joined: Wed Feb 04, 2004 8:15 pm
Location: Eden, Utah

Post by Burrito »

I use the command line also, but I do it without logging in to MySQL:

Code: Select all

mysqldump -u root -pyourpassword yourdatabase > d:\yourpath\yourfile.sql
Then, to import on the new machine:

Code: Select all

mysql -B -u root --password=yourpassword -h localhost yourdatabase < d:\yourpath\yourfile.sql
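One common variation on these one-liners is compressing the dump on the way out, since SQL dumps compress very well. The real commands would pipe straight through gzip (shown in the comments); the runnable part below uses a stand-in one-line dump file so it works without a MySQL server, and all file names are just examples:

```shell
# A real run would pipe the dump through gzip, e.g.:
#   mysqldump -u root -pyourpassword yourdatabase | gzip > yourfile.sql.gz
# and restore with:
#   gunzip -c yourfile.sql.gz | mysql -u root -pyourpassword yourdatabase

# Stand-in demo with a fake one-line dump:
printf 'INSERT INTO t VALUES (1);\n' > example.sql
gzip -c example.sql > example.sql.gz   # compressed copy, original kept
gunzip -c example.sql.gz               # round-trips back to the original SQL
```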
Post Reply