Backing Up MySQL


There are several ways to back up MySQL data, but automation makes the process far easier. Ensuring your backed-up data is also stored remotely supports your business continuity should you need to restore data onto new systems after a hardware failure or similar event.

Having spent some time digging through my network (not wanting to reinvent the wheel by researching an efficient backup procedure from scratch), I stumbled onto some backup scripts from PJ Doland.

PJ (Patrick) is a web designer in the Washington, DC area with former ties to the Cato Institute web site, and he was willing to share two scripts he uses. I like these scripts because they are simple and include the functionality to automatically drop the backups on a remote server for business continuity.

The first is for use as a standard MySQL backup and is found below:


#!/bin/bash

#####################################
### MySQL Configuration Variables ###
#####################################

# MySQL Hostname
DBHOST='localhost'

# MySQL Username
DBUSER='root'

# MySQL Password
DBPASSWD='password'

#####################################
### FTP Configuration Variables #####
#####################################

# FTP Hostname
FTPHOST='www.example.com'

# FTP Username
FTPUSER='username'

# FTP Password
FTPPASSWD='password'

# Local Directory for Dump Files
LOCALDIR=/path/to/local/directory/

# Remote Directory for Offsite Backup
REMOTEDIR=/path/to/remote/directory/

# Prefix for offsite .tar file backup
TARPREFIX=db1

#####################################
### Edit Below If Necessary #########
#####################################

cd $LOCALDIR
SUFFIX=$(date +%y%m%d)

DBS=$(mysql -u$DBUSER -p$DBPASSWD -h$DBHOST -e "show databases")

for DATABASE in $DBS
do
  # Skip the "Database" column header returned by the mysql client
  if [ "$DATABASE" != "Database" ]; then
    FILENAME=$SUFFIX-$DATABASE.gz
    mysqldump -u$DBUSER -p$DBPASSWD -h$DBHOST $DATABASE | gzip --best > $LOCALDIR$FILENAME
  fi
done

chmod 400 $LOCALDIR*.gz

tar -cf $TARPREFIX-$SUFFIX.tar $SUFFIX-*.gz

# Upload the combined tar file in a single non-interactive FTP session
ftp -n $FTPHOST <<END_SCRIPT
quote USER $FTPUSER
quote PASS $FTPPASSWD
cd $REMOTEDIR
put $TARPREFIX-$SUFFIX.tar
quit
END_SCRIPT

It is a straightforward script: it uses mysqldump to dump each database, gzips the dumps and tars them together, then uses ftp to move the data to a remote server. This script and the one below can easily be modified to use scp if you prefer running the transfer over SSH.
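As one possible sketch (not part of PJ's original scripts), the final ftp section could be swapped for a small function that reuses the script's existing variables as the SSH user, host, and destination directory. This assumes key-based SSH authentication is already configured, so no password prompt interrupts the cron run:

```shell
# Hypothetical scp-based upload, reusing the script's variables.
# Assumes SSH key authentication is set up for $FTPUSER on $FTPHOST.
upload_backup() {
  scp "$TARPREFIX-$SUFFIX.tar" "$FTPUSER@$FTPHOST:$REMOTEDIR"
}
```

You would then call upload_backup in place of the ftp block at the end of the script.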

The second script is updated to use mysqlhotcopy and is for use with ISAM/MyISAM tables only. The script below will NOT work with InnoDB tables.
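If you are unsure which storage engines your tables use, one way to check is a hypothetical helper like the following, which assumes MySQL 5.0 or later (for the information_schema database) and the same credential variables the scripts already define:

```shell
# Hypothetical helper: list tables that mysqlhotcopy cannot copy.
# Assumes MySQL 5.0+ and the DBUSER, DBPASSWD, and DBHOST variables
# from the backup scripts.
list_non_myisam_tables() {
  mysql -u$DBUSER -p$DBPASSWD -h$DBHOST -e \
    "SELECT table_schema, table_name, engine
     FROM information_schema.tables
     WHERE engine NOT IN ('MyISAM', 'ISAM')
       AND table_schema <> 'information_schema'"
}
```

If this prints any rows, stick with the mysqldump-based script above for those databases.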


#!/bin/bash

### Configuration Variables

DBHOST='localhost'
DBUSER='root'
DBPASSWD='password'
FTPHOST='ftp.example.com'
FTPUSER='username'
FTPPASSWD='password'
LOCALDIR=/path/to/local/
REMOTEDIR=/path/to/remote/
TARPREFIX=db1

### Do not edit anything below this line

cd $LOCALDIR
SUFFIX=$(date +%y%m%d)

DBS=$(mysql -u$DBUSER -p$DBPASSWD -h$DBHOST -e "show databases")

for DATABASE in $DBS
do
  # Skip the "Database" column header returned by the mysql client
  if [ "$DATABASE" != "Database" ]; then
    FILENAME=$SUFFIX-$DATABASE.tar.gz
    mysqlhotcopy -u $DBUSER -p $DBPASSWD $DATABASE $LOCALDIR
    tar -czf $LOCALDIR$FILENAME $LOCALDIR$DATABASE
    rm -rf $LOCALDIR$DATABASE
    rm -rf $LOCALDIR$DATABASE-replicate
  fi
done

chmod 400 $LOCALDIR*.tar.gz

tar -cf $TARPREFIX-$SUFFIX.tar $SUFFIX-*.tar.gz

# Upload the combined tar file in a single non-interactive FTP session
ftp -n $FTPHOST <<END_SCRIPT
quote USER $FTPUSER
quote PASS $FTPPASSWD
cd $REMOTEDIR
put $TARPREFIX-$SUFFIX.tar
quit
END_SCRIPT

It is recommended to set 700 permissions on these shell scripts, limiting them to the root user, since once updated with your information they contain usernames, passwords, and paths for multiple servers.

They can then be added to cron to run on a schedule of your preference.
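For example, a crontab entry running the backup nightly at 2:30 a.m. might look like the following (the script path here is hypothetical; use wherever you saved yours):

```
# m h dom mon dow  command
30 2 * * * /root/scripts/mysql_backup.sh
```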

Blane Warrene