

full manual backup




Posted by ti_nhatrang, 09-27-2007, 09:04 PM
hello guys, not sure if it's available, but is there any way we can manually back up a MySQL DB with a single command line? any kind of help would be appreciated... note: this is a pure MySQL server, not managed by cPanel or DirectAdmin in any way..

Posted by Patrick, 09-27-2007, 09:26 PM
Yes, the command is called mysqldump:

mysqldump -udatabase_username -p database_name > database_backup.sql

The -p flag will prompt you for a password, and that will export your database into a file called database_backup.sql.

Posted by ti_nhatrang, 09-27-2007, 09:53 PM
hmm, if I have 10 DBs, can I do them all at once? how would I put this in a script and also prevent it from prompting for a password?

Posted by cywkevin, 09-27-2007, 10:31 PM
if you're logged in as that user, create a ~/.my.cnf file that contains your login credentials
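A minimal example of such a client option file (the standard per-user location is ~/.my.cnf, and the values here are placeholders) might look like:

[client]
user=database_username
password=YOUR_PASSWORD

With that in place, mysql and mysqldump pick up the credentials automatically, so there is no password prompt; keep the file readable only by you (chmod 600 ~/.my.cnf).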

Posted by cywkevin, 09-27-2007, 11:26 PM
or just add a couple of variables for the user and password in the script and pass them with --password=
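For example, a minimal sketch of that approach (the variable names and values are just illustrative):

#!/bin/bash
# hypothetical credentials stored in shell variables
DBUSER="database_username"
DBPASS="PASSWORD"

mysqldump --user="$DBUSER" --password="$DBPASS" database_name > database_backup.sql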

Posted by david510, 09-28-2007, 07:11 AM
To prevent password prompting, use this command (note there is no space between -p and the password):

mysqldump -udatabase_username -pPASSWORD database_name > database_backup.sql

Posted by Patrick, 09-28-2007, 09:33 AM
mysqldump --all-databases > backup.sql

That's not as ideal as creating separate backups for each database, as the above command will put all of your databases into one SQL file.

Posted by ti_nhatrang, 09-28-2007, 11:10 AM
yeah, i wouldn't want that either, I'd want it to dump all the DBs into separate files...

Posted by Patrick, 09-28-2007, 11:25 AM
You could create a shell script that will automate the whole backup process (see the sketch below): just keep adding as many lines as required and change the placeholder details to the correct database credentials. You will also want to make sure you set the correct file permissions on this script, so no one else can view your passwords...
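A minimal sketch of such a script (the usernames, passwords, database names and paths are placeholders to replace with your own):

#!/bin/bash
# one mysqldump line per database; add more lines as you add databases
mysqldump -udb1_user -pDB1_PASSWORD database1 > /root/backup/database1.sql
mysqldump -udb2_user -pDB2_PASSWORD database2 > /root/backup/database2.sql
# ... and so on for each additional database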

Posted by ti_nhatrang, 09-28-2007, 11:35 AM
I will be doing this via root... should I just place this file under root somewhere? Also, how would I put this into /etc/crontab? Thanks for all your help thus far, I really appreciate it!

Posted by ti_nhatrang, 09-28-2007, 11:38 AM
also, how would I set it up so that it's incremental, using dates for the directory or file names rather than overwriting the previous backup?

Posted by Patrick, 09-28-2007, 11:41 AM
I would put it under the /root directory, then chmod 700 backup.sh (change backup.sh to whatever you named the script) to prevent anyone else from reading it. You could add it to /etc/crontab, or just run crontab -e -u root, which will basically do the same thing.
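For example, a nightly entry added via crontab -e -u root (the time and path here are only an example) could look like:

# run the backup script every night at 2:30 AM
30 2 * * * /root/backup.sh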

Posted by Patrick, 09-28-2007, 11:50 AM
You could try something like:

mysqldump -udatabase_username -pPASSWORD database_name > "`date`.sql"

If you want to make it organized, have it dump the databases to individual directories. For example:

mysqldump -udatabase_username -pPASSWORD database_name > /root/database_name/"`date`.sql"

That will create: /root/database_name/Fri Sep 28 11:47:49 EDT 2007.sql

Posted by ti_nhatrang, 09-28-2007, 11:54 AM
how about this:

mysqldump -udatabase_username -pPASSWORD database_name > /root/`date`/database_name.sql

would it work?

Posted by ti_nhatrang, 09-28-2007, 11:55 AM
can we also get rid of the time?

Posted by Patrick, 09-28-2007, 12:15 PM
No, that won't work because the date directory under /root does not exist. In order to dump a database there, the directory will need to exist. As for removing the time, you could try:

mysqldump -udatabase_username -pPASSWORD database_name > /root/database_name/"`date +%m-%d-%y`.sql"

That will create: /root/database_name/09-28-07.sql
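If you do want a per-date directory, one way (just a sketch) is to create the directory before dumping into it:

# make the dated directory first, then dump the database into it
backupdir=/root/`date +%m-%d-%y`
mkdir -p "$backupdir"
mysqldump -udatabase_username -pPASSWORD database_name > "$backupdir/database_name.sql"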

Posted by ti_nhatrang, 09-28-2007, 12:50 PM
right, but if you have 10 sql dbs... how would you do it?

Posted by Patrick, 09-28-2007, 04:15 PM
I would make a directory under your root directory, then additional directories for each database. For example:

/root/backup/db1/
/root/backup/db2/
/root/backup/db3/

... and so forth. You would use the same shell script, but with 10 different entries (one for each database), as sketched below:
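A sketch of what those entries might look like (the credentials and database names are placeholders):

#!/bin/bash
# one entry per database, each dumping into its own directory with a dated filename
mysqldump -udb1_user -pDB1_PASSWORD db1 > /root/backup/db1/"`date +%m-%d-%y`.sql"
mysqldump -udb2_user -pDB2_PASSWORD db2 > /root/backup/db2/"`date +%m-%d-%y`.sql"
# ... repeat for db3 through db10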

Posted by david510, 09-28-2007, 11:40 PM
date -I" will output the date only. In this script we can use it as a variable for ease date=`date -I` and supply $date as variable. mysqldump -udatabase_username1 -pPASSWORD database_name1 > /root/backup/db1/$date.sql

Posted by derek.bodner, 09-29-2007, 09:34 AM
try this script (sketched below): it will dump all your databases (you don't have to go in and manually add anything when you add a DB), each in its own file named backup_date-database.sql, then archive them all into one .tar.gz (backup-date.tar.gz) to save space. Modify DBHOST, DBUSER and DBPASSWD, chmod it u+x, and it should be good to go. Last edited by derek.bodner; 09-29-2007 at 09:38 AM.
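A sketch along those lines (the paths and the DBHOST/DBUSER/DBPASSWD values are illustrative):

#!/bin/bash
# illustrative sketch: dump every database to its own file, then archive them into one .tar.gz
DBHOST=localhost
DBUSER=root
DBPASSWD=PASSWORD
DATE=`date +%Y-%m-%d`
BACKUPDIR=/root/backup

cd "$BACKUPDIR" || exit 1

# list all databases and dump each one to backup_DATE-DBNAME.sql
for db in `mysql -h $DBHOST -u $DBUSER -p$DBPASSWD -N -e 'show databases'`; do
    [ "$db" = "information_schema" ] && continue
    mysqldump -h $DBHOST -u $DBUSER -p$DBPASSWD $db > backup_$DATE-$db.sql
done

# bundle the individual dumps into one compressed archive and remove the loose .sql files
tar czf backup-$DATE.tar.gz backup_$DATE-*.sql
rm -f backup_$DATE-*.sql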

Posted by ti_nhatrang, 09-29-2007, 11:29 AM
is there any way we can use the script you suggested and split out the sql into their own files? these sql files are at least 200gb each...

Posted by derek.bodner, 09-29-2007, 11:11 PM
Each database does have its own .sql file in the above example. If you want each database gzip'd individually rather than all in one .tar.gz, modify it like this: Last edited by derek.bodner; 09-29-2007 at 11:16 PM.
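One way to do that (again only a sketch) is to gzip each dump as it is written instead of building a single archive:

# gzip each database dump individually rather than creating one tar.gz
for db in `mysql -h $DBHOST -u $DBUSER -p$DBPASSWD -N -e 'show databases'`; do
    [ "$db" = "information_schema" ] && continue
    mysqldump -h $DBHOST -u $DBUSER -p$DBPASSWD $db | gzip > backup_$DATE-$db.sql.gz
done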

Posted by activelobby4u, 09-30-2007, 02:17 AM
Yes, the time can be removed; the date just needs to be formatted, e.g. date=`/bin/date "+%Y%m%d"`. Running the following line of commands:

date=`/bin/date "+%Y%m%d"`; mkdir /root/$date; mysqldump -udatabase_username -pPASSWORD database_name > /root/$date/database_name.sql

will create a dump file like: /root/20070930/database_name.sql


