
large db backup

Posted: Thu Sep 14, 2006 3:53 am
by s.dot
Currently I'm using phpMyAdmin to back up my database. But with a 75 MB gzipped database... the file always seems to get corrupted! About 25% of it comes out as garbage, so I've been downloading the raw SQL file uncompressed. When I do that, the server gets too stressed to even load my website (and downloading a 400 MB SQL file takes around 20 minutes).

I don't really need to download the file. Just store it somewhere on my server. Is there a way to do a backup without putting so much stress on the server?

Posted: Thu Sep 14, 2006 1:28 pm
by ody
Why not shell in and simply:

Code: Select all

mysqldump --all-databases > backup-20060914.sql
or:

Code: Select all

cp -a /var/lib/mysql /some/backup/path/20060914/
When backing up, if you want to make sure you have a consistent backup you may want to:

Code: Select all

mysql> flush tables with read lock;
back up the data in either of the two ways above, then:

Code: Select all

mysql> unlock tables;
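For what it's worth, the dump and the compression can also be combined in one pass on the server, so the uncompressed .sql file never hits the disk (a rough sketch, assuming shell access; "mydb" is just a placeholder database name). mysqldump's default options already lock each table while dumping it, so the explicit read lock above mainly matters for the cp method:

```shell
# Dump and gzip in one pipeline; "mydb" is a placeholder database name.
mysqldump mydb | gzip > backup-$(date +%Y%m%d).sql.gz
```

Restoring is the reverse: gunzip < backup-20060914.sql.gz | mysql mydb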

Posted: Thu Sep 14, 2006 2:29 pm
by RobertGonzalez
Can you hit your database with MySQL Administrator, SQLyog, or something like that? Those client-side apps tend to dump large databases quickly and easily.

Posted: Thu Sep 14, 2006 4:05 pm
by ok
If you don't have what Everah or ody mentioned, you can make a PHP script that backs up only one table at a time.
If a table is too large, split it into 3-5 MB chunks, and continue backing up your DB that way.

When you want to import it, you also import one table at a time and then insert its data.
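The same table-at-a-time idea can be sketched from the shell, without writing any PHP (again assuming shell access and a placeholder database name "mydb"); each table lands in its own gzipped file, which keeps the individual files small:

```shell
# One gzipped dump per table; "mydb" is a placeholder database name.
# -N skips the column-name header, -B gives tab-separated batch output.
mkdir -p ~/backups
for t in $(mysql -N -B -e 'SHOW TABLES' mydb); do
    mysqldump mydb "$t" | gzip > ~/backups/"$t".sql.gz
done
```

To restore a single table: gunzip < ~/backups/sometable.sql.gz | mysql mydb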

Posted: Thu Sep 14, 2006 4:15 pm
by s.dot
I am trying out SQLyog, it looks nifty!

Posted: Thu Sep 14, 2006 6:55 pm
by RobertGonzalez
That is my tool of choice for MySQL DB admin work. The enterprise version kicks tail.