What should I do if I want to take a backup of a MySQL database on my system, upload it to my web server, and apply the backup there?
Thanks in advance
Database Backup
Moderator: General Moderators
- Chris Corbyn
- Breakbeat Nuttzer
- Posts: 13098
- Joined: Wed Mar 24, 2004 7:57 am
- Location: Melbourne, Australia
N00b method: Use phpMyAdmin to export the SQL (the Export tab).
Command line method:
Code: Select all
mysqldump databasename -u username -p > databasename.sql
That creates a .sql file which has a bunch of MySQL code in it (INSERT queries and suchlike).
To put that on the other database: upload the file, then use phpMyAdmin's Import feature, OR use the command line:
Code: Select all
# Log in to MySQL:
mysql -u username -p
# Then select the database you want to add it to (create one if you don't have one):
use databasename;
# Then source the backup SQL (note the \. ):
\. /path/to/sql/file.sql
Personally I find exporting the database structure and the data separately is a good idea. Generally the data isn't a problem; it just works. However, the structure sometimes needs a bit of a tweak, especially if the MySQL versions are different. If you export the structure separately you won't have to trawl through masses of data to fix things.
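If you want to try that separate export, mysqldump can do it directly. A sketch (the database and user names are placeholders; `--no-data` and `--no-create-info` are standard mysqldump options):

```shell
# Structure only: CREATE TABLE statements, no rows.
mysqldump --no-data databasename -u username -p > structure.sql

# Data only: INSERT statements, no CREATE TABLE statements.
mysqldump --no-create-info databasename -u username -p > data.sql
```

Import structure.sql first on the new server, fix anything that needs tweaking, then load data.sql.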
Also, if you resort to using phpMyAdmin, you might find this helpful: it takes an SQL dump and breaks it into 1.8 MB chunks (and separates the structure statements out too).
Code: Select all
<?php
// Split a big dump: INSERT lines go into numbered data files,
// everything else (the structure) goes into tables.sql.
$maxsize = 1887436; // ~1.8 MB per data file
$d = 0;
$count = 0;
$file = fopen("hugegreatdumpfile.sql", "r");
$tables = fopen("tables.sql", "w");
$data = fopen("data_".++$d.".sql", "w");
while (!feof($file)) {
    // Start a new data file once the current one is full.
    if ($count >= $maxsize) {
        fclose($data);
        $data = fopen("data_".++$d.".sql", "w");
        $count = 0;
    }
    $line = fgets($file, 100000);
    if (substr($line, 0, 6) == "INSERT") {
        fwrite($data, $line);
        $count += strlen($line);
    } elseif (substr($line, 0, 1) == "#") {
        // Skip comment lines.
    } else {
        fwrite($tables, $line);
    }
}
fclose($file);
fclose($tables);
fclose($data);
echo "Completed: ".$d." data files created.";
?>
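On the server, the pieces then go back in with tables.sql first and the data chunks after it. A sketch, assuming the file names the script above produces:

```shell
# Import the structure first, then each data chunk.
mysql -u username -p databasename < tables.sql
for f in data_*.sql; do
    mysql -u username -p databasename < "$f"
done
```

(Or feed each file to phpMyAdmin's Import tab in the same order.)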
I use the command line also, but I do it w/o logging in to mysql:
Code: Select all
mysqldump -u root -pyourpassword yourdatabase > d:\yourpath\yourfile.sql
then to import on the new machine:
Code: Select all
mysql -B -u root --password=yourpassword -h localhost yourdatabase < d:\yourpath\yourfile.sql
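Since the dump has to travel to the web server anyway, it can be worth compressing it on the way out and decompressing on the way in. A sketch, assuming gzip is available on both machines (names are placeholders):

```shell
# Compress while dumping, so the upload is smaller:
mysqldump -u root -p yourdatabase | gzip > yourfile.sql.gz

# On the server, decompress straight into mysql:
gunzip < yourfile.sql.gz | mysql -u root -p yourdatabase
```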