How to backup your cloud server to Amazon S3

15-08-2014

By Rob Stevenson

Following on from how to set up a web server on ElasticHosts, this covers backing up your server. It's not especially ElasticHosts-specific and should work on most Linux environments, virtual or real, with very little modification.

Amazon Web Services

You'll need to sign up first. Go to aws.amazon.com and click Sign Up. I'll leave it there because the official documentation covers it pretty well. You'll need to sign up for AWS and create an IAM user. Make a note of the Access Key ID and Secret Access Key, as you'll need these later to connect and copy files.

Once you're set up on AWS, head to the S3 section and create a bucket. Give it a meaningful name (e.g. myserverbackup) and pick a geographical region based on pricing and proximity to your existing servers.

Now head back to the Identity and Access Management (IAM) area and find your user. Click on it, scroll down to Permissions, click Attach User Policy, and paste in the policy below, changing the bucket name to match yours.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::myserverbackup"]
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": ["arn:aws:s3:::myserverbackup/*"]
        }
    ]
}

That's the AWS part done for now. To access your backups online you can go back to the S3 admin interface.

Setup s3cmd to upload files

s3cmd is a Python tool for putting files on and retrieving files from AWS S3 buckets. It requires Python 2.4 or above, plus python-dateutil and optionally python-magic. Python usually comes with Linux, but you may need to install the dependencies. On CentOS it goes like this:

yum -y install python-dateutil python-magic

Next, download s3cmd. I tested with 1.5.0-rc1, but feel free to try a later release if one exists - let me know how it goes!

curl -L "http://downloads.sourceforge.net/project/s3tools/s3cmd/1.5.0-rc1/s3cmd-1.5.0-rc1.tar.gz?r=&ts=1408106691&use_mirror=heanet" -o s3cmd-1.5.0-rc1.tar.gz

Unpack the archive and, optionally, move it somewhere sensible:

tar zxf s3cmd-1.5.0-rc1.tar.gz
mkdir /opt/s3
mv s3cmd-1.5.0-rc1/S3 s3cmd-1.5.0-rc1/s3cmd /opt/s3
ln -s /opt/s3/s3cmd /usr/local/bin/s3cmd

Configure it with the access keys you got from the IAM user setup. You'll be asked for your IAM Access Key ID and Secret Access Key, along with an optional password and GPG key; these protect your data from being read by Amazon staff, or by anyone else who gains access to your files while they're stored on Amazon S3.

s3cmd --configure s3://myserverbackup
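The answers are saved to ~/.s3cfg. As a rough guide (the exact set of keys varies between s3cmd versions, and the values below are placeholders), it looks something like this:

```ini
; ~/.s3cfg - placeholder values, written by `s3cmd --configure`
[default]
access_key = AKIAXXXXXXXXXXXXXXXX
secret_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
use_https = True
gpg_passphrase = your-optional-passphrase
```

Since this file contains your secret key, keep it private (chmod 600 ~/.s3cfg).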

Next, grab this backup script and put it somewhere sensible.

curl https://gist.github.com/merob/4f8923b014ffb3248f84/download# -o daily.backup.tgz
tar zxf daily.backup.tgz
mv gist4f8923b014ffb3248f84-*/daily.backup ~/bin
rmdir gist4f8923b014ffb3248f84-*

Take a look at daily.backup; there are a few things that need to be customized.

First, create a file containing the password for the database user that has access to back up your databases:

echo mypassword > ~/.dbrootpw
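Since this file holds a plaintext password, it's sensible to lock it down so only its owner can read it:

```shell
echo mypassword > ~/.dbrootpw   # as above; replace with your real password
chmod 600 ~/.dbrootpw           # owner read/write only; no access for group/others
```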

Add the name of your S3 bucket and modify any config files or directories that you want backed up. Here's a snippet of daily.backup:

DBPW=`cat /root/.dbrootpw`              # location of file which _just_ contains database root password
S3PATH="s3://nameofs3bucket"            # s3 bucket name
 
                                        # all lists separated by white space
ACTIVEUSERS="root"                      # any usernames who have homedirs to backup
DIRS=`ls -d /var/www/*`                 # list all directories to be backed up. (Will be backed up in separate files)
                                        # below: any custom config files
CONFIG="/etc/httpd/conf/httpd.conf /etc/httpd/conf.d /etc/postfix /etc/sysconfig/network /etc/sysconfig/iptables /etc/ssh/sshd_config /etc/aliases"
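To give a feel for what the script does with those variables, here's a minimal sketch of the per-directory archive step. The paths, staging area, and filename pattern below are stand-ins (check daily.backup for the real logic), and the actual s3cmd upload is left commented out:

```shell
# Sketch only: archive each directory in $DIRS into a dated tarball.
DATE=$(date +%F)
STAGING=/tmp/backup-staging            # assumed staging area
mkdir -p "$STAGING" /tmp/demo-site
echo "hello" > /tmp/demo-site/index.html   # demo content standing in for /var/www/*

DIRS=$(ls -d /tmp/demo-site)           # the real script uses `ls -d /var/www/*`
for d in $DIRS; do
  name=$(basename "$d")
  # -C changes into the parent dir so the archive contains a relative path
  tar czf "$STAGING/$name.$DATE.tgz" -C "$(dirname "$d")" "$name"
  # s3cmd put "$STAGING/$name.$DATE.tgz" "$S3PATH/"   # upload to the bucket
done
ls "$STAGING"
```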

Add this to your crontab, ideally scheduled at an off-peak time of day:

(crontab -l ; echo "7 1 * * * /root/bin/daily.backup") | sort - | uniq - | crontab -

And that's it. It's probably worth checking that it works by running it manually, then logging on to AWS and checking your bucket contains the files you expect.

/root/bin/daily.backup