Backing Up Files from a Windows Instance to S3

Part of a backup strategy in AWS could involve file-level backups. Since S3 storage is cheaper than EBS volumes, it can make sense to store your file backups in S3.

One way to go about this is to use a utility like Duplicati to back up files to an S3 bucket.

Another method uses the AWS CLI and PowerShell. Let's walk through how to do it.

First, we'll set up a bucket. We'll name our bucket reallyreallyimportantfiles and accept the defaults for the rest of the setup.

[Screenshot: creating the S3 bucket (s32.png)]
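If you'd rather create the bucket from the command line instead of the console, the AWS CLI equivalent is a one-liner (this assumes the CLI is already installed and configured, which we'll cover below):

aws s3 mb s3://reallyreallyimportantfiles --region us-east-1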

Next, we'll create an IAM user with programmatic access, noting the access key and secret. We'll also give the user full access to the backup bucket, but to no other buckets, with the policy below.


{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1507632416000",
            "Effect": "Allow",
            "Action": [
                "s3:ListAllMyBuckets"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Sid": "Stmt1507632438000",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::reallyreallyimportantfiles",
                "arn:aws:s3:::reallyreallyimportantfiles/*"
            ]
        }
    ]
}
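This step can also be scripted with the CLI. Here's a sketch, where the user name s3-backup-user and the file policy.json (containing the JSON above) are just placeholder names:

aws iam create-user --user-name s3-backup-user
aws iam put-user-policy --user-name s3-backup-user --policy-name s3-backup-policy --policy-document file://policy.json
aws iam create-access-key --user-name s3-backup-user

The last command returns the access key and secret we'll need for aws configure.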

 

Now, we'll connect to our instance and install the AWS CLI. You can download the CLI from https://aws.amazon.com/cli/.

After it's installed, we'll open a command prompt and run aws configure, where we'll enter the access key, secret, and default region (for this example, us-east-1). It's not necessary to specify a default output format for this example.

[Screenshot: running aws configure (AWSconfigure.png)]
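The exchange looks roughly like this (keys redacted):

C:\> aws configure
AWS Access Key ID [None]: AKIAxxxxxxxxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: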

We'll use the following PowerShell script to:

  • Clean up local backups from previous runs that are older than one day
  • Create a zipped copy of the folder we want to back up
  • Upload (PUT) the zip to our S3 bucket

#Backup working directory
$backupdir = "C:\BACKUPS"

#Enter S3 bucket name
$s3bucket = "reallyreallyimportantfiles"

#Delete local backups older than one day
$limit = (Get-Date).AddDays(-1)
Get-ChildItem -Path $backupdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

#Zip file information
$source = "C:\FOLDER\IMPORTANT_FILES"
$destination = "C:\BACKUPS\importantfiles$(Get-Date -Format "yyyy-MM-dd").zip"
Add-Type -AssemblyName "System.IO.Compression.FileSystem"
#CreateFromDirectory throws if the zip already exists, so remove any same-day archive first
if (Test-Path $destination) { Remove-Item $destination -Force }
[IO.Compression.ZipFile]::CreateFromDirectory($source, $destination)

#Sync the backup directory to the S3 bucket
Start-Process -FilePath "C:\Program Files\Amazon\AWSCLI\aws.exe" -ArgumentList "s3 sync $backupdir s3://$s3bucket" -Wait

After executing the script, we’ve got a backup in our C:\BACKUPS folder…

[Screenshot: the zipped backup in C:\BACKUPS (s3.png)]

…and we see it made it to our S3 bucket as well.

[Screenshot: the backup in the S3 bucket (se2.png)]
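You can verify from the command line as well; listing the bucket should show the new archive:

aws s3 ls s3://reallyreallyimportantfiles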

There are many other useful features that could still be added to the script: logging, sending emails on error, etc. This serves as a starting point, and it can be set up as a scheduled task to take care of file-level backups.
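For example, assuming the script is saved as C:\Scripts\s3backup.ps1 (a placeholder path), a nightly 2 AM task could be registered like this:

schtasks /Create /TN "S3FileBackup" /SC DAILY /ST 02:00 /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\s3backup.ps1"

By default the task runs as the user who created it; for unattended runs you may want to specify a service account with /RU.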

 
