Backing up is one of the most important things to do when running a website, and this is not limited to WordPress sites. So many things can go wrong: your hosting provider might be down, you might forget to pay the hosting bill, or you might accidentally lock yourself out of your VPS while your hosting provider's support team isn't responding.
There are many different ways to back up your WordPress blog, and your hosting provider might already be doing it for you, but wouldn't it be safer not to entrust all your data to a third party? Especially if you've been blogging for years, you really cannot afford to lose all that data you've worked so hard for.
Of course you can use various WordPress plugins for automated backups, or back up manually using database and file transfer tools. Many of these approaches are described in the WordPress Backups entry in the Codex. But today it's all about geeky backups, so we're going to write a shell script. Make sure that you're comfortable with SSH and with Unix/Linux in general, and also make sure that your hosting provider actually gives you SSH access and access to the task scheduler (known as crontab). This post is more of a tutorial than just a snippet, so if you're here for the short story, proceed to the download link.
Our shell script will be quite simple — it will create a full backup archive of the website’s directory, including the WordPress core, themes and plugins, and most importantly, the uploaded content. Next we’ll create a full database dump and add it to the archive. The archive will be gzipped to make sure we save some space, and dated, so that we know when the archives were created and which are safe to remove.
We will then set up a cron entry to run the shell script periodically, and get rid of old backups so that we don't run out of hard disk space. The script itself can also be run manually, for example immediately after you publish a new and important blog post, just to be on the safe side.
Figuring Out the Structure & Creating the Backup Script
If you're still reading, I'm quite sure that you're familiar with shell scripts, but if you're not, just keep in mind that they're simple executable text files that run one or more commands when executed, much like batch files in Windows environments.
Before diving into writing the actual script, I’ll make a few assumptions so that you can follow along and adapt the code I’m writing to your own file structure and database credentials.
- I've got SSH access on example.org, and my username is `username`
- My home directory is `/home/username/` and my website's root directory is `/home/username/www/example.org/`, so that's where my WordPress files are
- The MySQL database is running on `localhost`. The username is `mysqluser`, the password is `mysqlpass`, and the database name for example.org is `example_org`
- The shell script will be stored in `/home/username/scripts/` and the backups will go to `/home/username/backups/`, so make sure those directories exist before continuing
Make sure that you've logged on to your server via SSH, and let's start off by browsing to our scripts directory and creating a new executable file there with these three simple commands:
```shell
$ cd /home/username/scripts/
$ touch backup.script
$ chmod +x backup.script
```
Now use your favorite text editor to edit the contents of the backup.script file. My preference is `vi`, but you're free to use `nano` or any other plain-text file editor. We'll go through the script step by step, but again, if you're here for the short story, proceed to the download link.
Defining the Variables and Transforms
We’ll be using quite a lot of variables throughout the code so that it’ll be easier for you to manage the script afterwards or replicate it for other websites. Make sure you’re familiar with variables in bash and don’t forget the interpreter header at the very beginning of your script file.
```shell
#!/bin/bash

NOW=$(date +"%Y-%m-%d-%H%M")
FILE="example.org.$NOW.tar"
BACKUP_DIR="/home/username/backups"
WWW_DIR="/home/username/www/example.org/"
```
So we’ve defined the current date and time and decided on how to name our output backup file. Let’s proceed on to the database variables that we’ll use when making the actual MySQL tables dump and some transforms which I’ll explain in a second. Append the following code to your script file:
```shell
DB_USER="mysqluser"
DB_PASS="mysqlpass"
DB_NAME="example_org"
DB_FILE="example.org.$NOW.sql"
WWW_TRANSFORM='s,^home/username/www/example.org,www,'
DB_TRANSFORM='s,^home/username/backups,database,'
```
That first part is quite simple. The transforms I have defined are not mandatory, but they really help keep your archive structure clean. We'll use them as the transform expressions when creating the archive: they make sure that the WordPress files are stored in a directory called `www` and the database dump goes alongside them in a folder called `database`. This is for convenience, especially when it comes to browsing your backup archives: you won't get those nasty 3-4 level `home/username/www/example.org/` prefixes. When writing the transforms, look carefully at the commas and slashes; one single mistake can result in an invalid transform expression. More about modifying file and member names with tar.
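Since tar transforms use the same replace syntax as sed, you can preview what an expression will do to a stored member name by feeding a sample path through sed itself (the sample path below is just for illustration):

```shell
# tar strips the leading "/" from member names before applying --transform,
# which is why the expression matches paths without a leading slash.
echo "home/username/www/example.org/wp-content/uploads/photo.jpg" \
    | sed 's,^home/username/www/example.org,www,'
# → www/wp-content/uploads/photo.jpg
```

If the output isn't what you expect, the same tweak-and-retry loop works on the real transform before you ever run the backup.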
Creating the Files Archive and the MySQL Backup
This part is quite simple now that we have defined all the variables above. We'll use them in expressions with the `tar` and `mysqldump` commands; you should get familiar with both, as they're very powerful in your day-to-day server maintenance life. Append the following code to the backup script:
```shell
tar -cvf $BACKUP_DIR/$FILE --transform $WWW_TRANSFORM $WWW_DIR
mysqldump -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE
```
The `-v` flag makes the `tar` command run in verbose mode, which is useful for debugging when things go wrong. You might also like `tar`'s `--show-transformed-names` flag if your transform expression is not working for some reason. The second line creates an SQL dump of your database in the backups directory.
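One more thing worth considering, since the script will eventually run unattended from cron: by default, bash keeps going after a failed command, so a failed mysqldump would still get archived. Adding `set -euo pipefail` near the top of the script (my suggestion, not part of the original listing) makes it abort on the first error instead. A quick demonstration with a throwaway script:

```shell
# Create a throwaway script that fails midway, mimicking a failed dump.
cat > /tmp/guard-demo.sh <<'EOF'
#!/bin/bash
set -euo pipefail
false                # stands in for a failing mysqldump
echo "never reached" # skipped, because set -e aborts on the failure above
EOF

bash /tmp/guard-demo.sh || echo "script aborted with status $?"
# → script aborted with status 1
```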
So in theory we’ll end up with a new archive in the backups directory and a database dump right next to it, but as I mentioned earlier, we’re going to take this further.
Merging, Cleaning Up & Compressing
The following commands append the database file to the archive so that we don't have to deal with two separate files. This is where the database transform variable comes in: it puts the dump into the database directory within the archive, keeping the structure very clean. We then get rid of the standalone database file, since it's already in the archive. Finally, we compress the whole archive, so a .gz file is our final result. Append the following piece to your script:
```shell
tar --append --file=$BACKUP_DIR/$FILE --transform $DB_TRANSFORM $BACKUP_DIR/$DB_FILE
rm $BACKUP_DIR/$DB_FILE
gzip -9 $BACKUP_DIR/$FILE
```
That pretty much completes our backup script. You can test it by running it directly from the command line and then looking at the backups directory. You can also try extracting the resulting file with `tar -xvf` to get a feeling for how the data is stored inside the archive; if you're unhappy with the layout, play around with the transforms to achieve a structure that you're comfortable with.
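To sanity-check the layout without extracting anything, you can also build a tiny archive in a scratch directory (all paths below are throwaway examples) and list its members with `tar -t`:

```shell
# Recreate a miniature version of the site tree under /tmp.
mkdir -p /tmp/tar-demo/home/username/www/example.org
echo "<?php // stub" > /tmp/tar-demo/home/username/www/example.org/index.php

# Archive it with the same transform style the backup script uses.
tar -C /tmp/tar-demo -czf /tmp/tar-demo/site.tar.gz \
    --transform 's,^home/username/www/example.org,www,' \
    home/username/www/example.org

# List the stored member names: the long prefix is replaced by "www".
tar -tzf /tmp/tar-demo/site.tar.gz
# → www/
# → www/index.php
```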
Now that you’re comfortable with the script itself and the backups that it produces, it’s time to actually set it up for automatic execution, which brings us to the next part of this post.
Task Scheduling: WordPress Backups with Crontab
If you're unfamiliar with Cron, you should check out the Wikipedia article; it pretty much explains what Cron is and how scheduled jobs are executed. For detailed information, look for the Cron manual for your Linux distribution. I'm using Ubuntu Server, which keeps the main configuration in `/etc/crontab` and the periodic parts in the `/etc/cron.*` directories for daily, weekly, hourly, and so on.
Before creating the cron entry, make sure you know what you're doing, and if you don't, refer to the Intro to Cron guide. Unless you need granular control over when your backups are made, you might as well add an entry to the weekly, daily or hourly cron directories, which are set up by default in Ubuntu (and probably in your OS too). So let's create a new daily executable file:
```shell
$ cd /etc/cron.daily/
$ sudo touch example_org_backup
$ sudo chmod +x example_org_backup
```
And then add the following contents to the script:
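The contents are just a one-line wrapper that calls the backup script (the path assumes the layout from the beginning of this post):

```shell
#!/bin/sh
# Daily cron job: run the WordPress backup script created earlier.
/home/username/scripts/backup.script
```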
This is pretty straightforward: the backup script that we created earlier is launched once a day (the exact time depends on the time and timezone settings of your operating system). Note that the default setup will run the script as root, so the resulting backup files will be owned by root and readable by everyone. This can be solved either by configuring cron to launch the script as a different user (perhaps `username`), or by adding a `chmod`/`chown` command to the backup script itself to change the permissions or the owner/group of the resulting backup file.
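The `chmod` route is the simpler of the two. As a sketch, using a scratch file rather than a real backup, restricting the archive to owner-only access looks like this:

```shell
# Stand-in for a freshly created backup archive.
touch /tmp/example-backup.tar.gz

# Owner can read and write; group and others get nothing.
chmod 600 /tmp/example-backup.tar.gz

# Verify the resulting permission bits (GNU coreutils stat).
stat -c '%a' /tmp/example-backup.tar.gz
# → 600
```

In the real script you'd run the `chmod` on `$BACKUP_DIR/$FILE.gz` as the final step, after `gzip` has produced the compressed archive.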
So now we're left with a backup script that creates archives of our WordPress application files, content files and a full database backup, launched once daily (or hourly, weekly, however you have configured it). Where should we take it from here? Well, it's pretty obvious that if your server fails, your backup files will not be accessible either, so you have to distribute the backup files to other destinations. Good options for storage are Amazon S3, a different server via FTP, Dropbox, or your local machine.
Amazon S3 and Dropbox require special protocols, but there are tools to work with them. FTP or SFTP is a good choice and could be bundled into the backup script itself, so that each backup is immediately sent to a remote server rather than kept on the application server. In this tutorial, though, we'll use `rsync` to download your WordPress backups to your local machine.
Downloading Your Backup Files
If you're running a Mac or a Linux environment locally, then your system already has all the UNIX tools needed to grab your files. If you're running Windows, you should Google for "rsync windows" or some other tool that will let you grab your files over the SSH protocol.
Why is rsync the right tool? Well, there's no right or wrong when it comes to utility programs; many others can help you achieve the same (or almost the same) results. I picked rsync because it's simple to use from the command line and from a task scheduler, and it saves time and bandwidth by only grabbing the new files, without having to re-download the old backups every time it runs.
To try rsync out, create a new directory where you'd like to store your backup files and run the following:
```shell
$ mkdir my_backups && cd my_backups
$ rsync example.org:/home/username/backups/* .
```
Of course, replace the example.org domain with your own, and adjust the full path according to your setup. Rsync will receive an incremental file list from the server and download only the files that don't already exist in the destination, which in our case is the "dot", meaning the current directory.
Try running this from time to time as more backups are created. For testing purposes you may set the backup cron to run hourly, or fine-tune it to once every few minutes, just to watch the backup files get downloaded to your local machine via rsync.
You may also configure `rsync` so that it doesn't ask for your SSH credentials when launched, by using a password file or key-based SSH authentication. There's a great post on Oracle on how to do it, and this way you can use rsync in the task scheduler on your local machine, so that backups are downloaded automatically. I personally use an RSA key pair for password-less access to all the servers I manage, but which method to use is up to you.
Recap & Conclusion
In this tutorial we went through the process of coding a bash script which generates a tar archive of our application directory where the WordPress core sits, together with the themes, plugins and the contents directory. A database dump of the WordPress tables is then created and appended to the archive. The archive structure is laid out nicely using tar transforms, and the script finally compresses the archive for storage and bandwidth optimization.
The script is then configured for an hourly, daily or weekly launch using Crontab on the remote server, and another short script is configured to grab those backup files from the remote server to your local machine, using `rsync` for incremental file downloads.
The final backup script can be downloaded right here, but don't forget to modify the variables and path names according to your WordPress installation and server setup, and make sure the file is executable.
You can go further by creating a clean-up script, or bundling it into the backup script itself, to get rid of backup files older than a couple of days, weeks or months, so that your remote server doesn't eventually run out of storage space. Don't worry: when synchronizing, `rsync` won't delete the corresponding backup files on your local machine.
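Such a clean-up step can be a single `find` invocation. This sketch, demonstrated on a scratch directory (adjust the path and age to your own setup), deletes archives older than seven days:

```shell
# Simulate an old backup and a fresh one (GNU touch "-d" sets the mtime).
mkdir -p /tmp/cleanup-demo
touch -d "10 days ago" /tmp/cleanup-demo/old.tar.gz
touch /tmp/cleanup-demo/new.tar.gz

# Delete archives whose modification time is more than 7 days old.
find /tmp/cleanup-demo -name "*.tar.gz" -mtime +7 -delete

ls /tmp/cleanup-demo
# → new.tar.gz
```

In the real script, point `find` at `$BACKUP_DIR` and append it after the `gzip` step.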
We at Theme.fm are now working on quite a different approach to backups specifically for WordPress, along with some other great tools to manage your WordPress installation, theme and plugin files, but that deserves a separate blog post ;)
We’d love to hear how you manage backups of your WordPress installations, and web server backups in general, so leave a note in the comments section. If you’re new to this and have adapted the above script, we’ve got another exercise for you — take the script further by syncing it to Dropbox, Amazon S3 or a remote FTP server and share your bash scripts (using a pastebin service) via the comments!