This article explains, in a few short steps, how to back up your WordPress installation to the Amazon S3 service. It deals with Ubuntu as the operating system and Amazon EC2 for hosting, but is not limited to that environment. Prerequisites are an Amazon account for the S3 service, a WordPress installation, and basic familiarity with these products.
How to back up WordPress
There are numerous plugins for WordPress when it comes to backups. I have learnt to appreciate the plugin BackUpWordPress, as it not only backs up your database but also all the files in the installation.
It offers a small but fairly complete set of configuration options:
– manual or daily scheduled backups
– number of backups to keep
– whether to include files or database or both
The backups are stored on disk at a path of your choice, and can optionally be emailed to you. Having this kind of backup data makes it very comfortable to restore your WordPress installation.
But what if your complete machine/instance fails? On Amazon EC2, for example, you should take snapshots of instances whenever their setup changes, but this does not cover content changes: restoring from a snapshot does not bring back content created or updated since the snapshot was taken, such as new WordPress posts. Moving those WordPress backup files to a safer place therefore becomes vital. So, s3cmd to the rescue.
What is s3cmd?
Extracted from its homepage:
“S3cmd is a command line tool for uploading, retrieving and managing data in Amazon S3. It is best suited for power users who don’t fear command line. It is also ideal for scripts, automated backups triggered from cron, etc.”
At the time of writing, s3cmd requires Python 2.4 or newer and some fairly common Python modules. On Ubuntu, installing it is merely a matter of running:
sudo apt-get install s3cmd
To configure it, run
s3cmd --configure
You will be asked for the two keys of your Amazon account (the access key and the secret key) – copy and paste them from your confirmation email or from your Amazon account page.
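If you don't already have a bucket to sync to, s3cmd can create one for you. The bucket name below is just an example – S3 bucket names must be globally unique, so substitute your own:

```shell
# Create a new S3 bucket ("mb" = make bucket); the name is a placeholder
s3cmd mb s3://my-wp-backups
```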
To list all your buckets run
s3cmd ls
and to list the content of a bucket run
s3cmd ls s3://[bucket name]
Now, you could use ‘s3cmd put’ to upload files and ‘s3cmd del’ to remove them remotely, but then you would have to do some scripting to select the right files and keep the bucket contents to an acceptable size. There is another option that simply syncs the contents of your local directory to the remote bucket, handling removals automatically: ‘s3cmd sync’.
s3cmd sync --delete-removed [source directory] s3://[bucket name]
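Since –delete-removed will delete remote files, it can be worth previewing what a sync would do before running it for real; s3cmd supports a –dry-run flag for this. The directory and bucket name below are placeholders:

```shell
# Show what would be uploaded and deleted, without touching anything
s3cmd sync --dry-run --delete-removed /home/ubuntu/wp-backups/ s3://my-wp-backups/
```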
The source directory is naturally replaced with the directory of where you put your WordPress backups.
To make it a little easier to schedule via crontab, preferably put the sync command in a shell script, for example ‘sync-wp-backups.sh’:
#!/bin/bash
/usr/bin/s3cmd sync --delete-removed [source directory] s3://[bucket name]
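Putting it together, one way to create the script and make it executable – the backup directory and bucket name below are placeholders, so substitute your own:

```shell
# Write the sync script; adjust the source directory and bucket name
cat > sync-wp-backups.sh <<'EOF'
#!/bin/bash
/usr/bin/s3cmd sync --delete-removed /home/ubuntu/wp-backups/ s3://my-wp-backups/
EOF

# cron will only run it if it is executable
chmod +x sync-wp-backups.sh
```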
To schedule this script run
crontab -e
which opens your cron table in the editor of your choice. For example, you could add a line to schedule the script to run daily at 23:30 (11:30 PM):
30 23 * * * /home/ubuntu/sync-wp-backups.sh >> /home/ubuntu/s3-backup.log
which appends the output to a log file (s3-backup.log).
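That redirection only captures standard output; if you also want any errors from s3cmd to end up in the log, redirect stderr as well (same schedule, same paths as above):

```shell
30 23 * * * /home/ubuntu/sync-wp-backups.sh >> /home/ubuntu/s3-backup.log 2>&1
```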
That’s it! Now I have several days’ worth of backups automatically synced from the Amazon instance running WordPress to a more fail-safe S3 bucket. At least I sleep a lot better with this in place.