Category Archives: backups

Test Your Backups

Hat tip to Andy Donaldson for pointing me at this, a great reminder not only to take backups, but to test them.

How Toy Story 2 Was Almost Lost Forever

Automatic File Deletion in Amazon S3 Revisited

Back in October I wrote about automating my Apache/MySQL backups to Amazon S3. I then spent considerable time working out automatic file deletion in S3, a slightly complicated process, but one that has worked well since I got it configured.

This week Amazon rolled out Object Expiration for S3, allowing you to automatically expire files in a bucket based on name and time criteria. Amazon has an excellent guide to configuring Object Expiration via the AWS Console.

To test this I set one of my buckets up for 5-day expiration of any files named ‘backup’ (my existing scripts all maintain 7 days of backups).

So much easier than Python and Boto.

When I checked this morning there were 5 backups remaining.

Still easier than Boto and Python.

You can verify which rule applies to an object by selecting the object and examining its properties. One word of caution: Object Expiration rules apply to a whole bucket, so even if you have folders within a bucket the rule is global; make sure you understand which objects you are expiring.
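For reference, the same 5-day ‘backup’ rule can be sketched in code rather than console clicks. This is a hypothetical illustration using boto3 (a later successor to the boto library discussed below); the bucket name and rule ID are made up.

```python
def backup_expiration_rule(prefix="backup", days=5):
    """Build an S3 lifecycle configuration that expires objects whose
    keys start with `prefix` after `days` days."""
    return {
        "Rules": [{
            "ID": "expire-old-backups",       # made-up rule name
            "Prefix": prefix,                 # applies bucket-wide to matching keys
            "Status": "Enabled",
            "Expiration": {"Days": days},
        }]
    }

# With boto3 installed and AWS credentials configured, applying the rule
# would look roughly like this (bucket name is a placeholder):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket",
#     LifecycleConfiguration=backup_expiration_rule())
```

Note that, as with the console, the prefix matches keys anywhere in the bucket, which is exactly the caution above.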

Why expire objects in S3? In S3 (and all of the AWS services) you pay for what you use; by managing the number of files (in this case backups) stored at any given time, my costs are kept to a minimum. For December so far my S3 charges are $0.17 (yes, that is 17 cents to store backups for 3 websites and a number of MySQL databases).


Amazon S3 announces Object Expiration

Amazon S3 Developer Guide: Object Expiration

Managing Lifecycle Configuration

Automatic File Deletion in Amazon S3

A couple of weeks ago I wrote about backing up Apache and MySQL to Amazon S3. The final piece of the puzzle I was looking for was to automatically purge all backups older than 7 days from S3. It took me a little while to put all the pieces in place, and I wanted to write it all down as much for myself as for anyone else, since I am sure I won’t remember all this if I need it again.

I found a nice utility named s3cleaner, a simple Python script that uses age and regex patterns to delete files from S3 buckets.

The first challenge was running a Python script on my server (hosted at Dreamhost): I needed to create a sandboxed Python environment, and I found some instructions here. The first step is to download the setup script; from the command line you can simply run wget to download it.

Once it is downloaded, run the script.

Next you have to download the second script; from the command line, run the download, and once it is downloaded execute it. The key here is that you are now using the sandboxed install of Python, so point to Python at the path shown in the image above, not simply python as in the first step.
You now have a sandboxed installation of Python which will let you run s3cleaner. But not so fast: s3cleaner relies on boto, which provides the Python interface to Amazon Web Services. Download boto here and copy it up to your server. Once it is on your server, install it, again making sure you use the sandboxed Python, not the system-wide one.
You are now ready to place s3cleaner on your server and run it; you need to specify a few flags when running it, including the bucket name, --maxage, and --regex. A word of caution: for the bucket name it accepts only a bucket name, not a subdirectory, so everything that matches your deletion criteria (--maxage and --regex) anywhere in the bucket will be deleted.
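The core of what s3cleaner does with those flags can be sketched like this: match keys against --regex, keep anything newer than --maxage, and delete the rest. This is my own illustration of the idea, not s3cleaner’s actual code, and the sample keys and bucket name are made up.

```python
import re
from datetime import datetime, timedelta, timezone

def keys_to_delete(objects, max_age_days, pattern, now=None):
    """Given (key, last_modified) pairs, return the keys that match
    `pattern` and are older than `max_age_days` days."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    regex = re.compile(pattern)
    return [key for key, modified in objects
            if regex.search(key) and modified < cutoff]

# With boto, the deletion loop would be driven by something like:
# bucket = boto.connect_s3().get_bucket("my-backup-bucket")
# then listing the bucket's keys, filtering with keys_to_delete(),
# and calling delete() on each match.
```

Because the listing covers the whole bucket, anything matching the regex anywhere in the bucket is a candidate for deletion, which is the caution above.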

Backing up Apache and MySQL to Amazon S3

Backups, you know they are important, right? I have written before about backing up my PC, but over time I have accumulated a bunch of data in web sites and MySQL databases. Starting with my blog, as well as Thinkup and Tweetnest, an installation of YOURLS, and a few other miscellaneous PHP files and databases, I have a lot of backups to worry about. Sure, my host Dreamhost maintains backups (which I have used), but Dreamhost recommends, and common sense dictates, that I have some kind of backup plan of my own.

I have used a number of different methods over time, including WordPress plugins, xcloner, and others, but nothing was quite as simple and complete as I was looking for.

Finally, over the last couple of weeks, I found something that meets all my criteria: automatic, includes files and databases, and backs up to another site (i.e. not on a Dreamhost server).

The destination for my backups is Amazon S3: cheap (no, really, cheap) reliable storage accessible from anywhere. Next I found S3 Tools, command line tools for Linux to interact with S3. I simply copied the files up to my server, ran s3cmd --configure, plugged in my access and secret keys and S3 bucket name, and I was connected to S3.

I was going to write my own scripts to dump MySQL data and tar up some directories, but happily I found someone had already done that. With a few simple modifications to that script and a quick cron job, I had my backups up and running in no time.
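The shape of that kind of backup script is simple: tar up a site directory, dump a MySQL database, and hand both files to s3cmd. Here is a minimal sketch of the idea; all paths, credentials, and the bucket name are placeholders, and this is my reconstruction, not the script the post refers to.

```python
import subprocess
import tarfile
from datetime import date

def backup_name(prefix):
    """Date-stamped archive name, e.g. 'backup-blog-2011-10-01.tar.gz'."""
    return "backup-{}-{}.tar.gz".format(prefix, date.today().isoformat())

def run_backup():
    archive = backup_name("blog")
    # Archive the site's files
    with tarfile.open(archive, "w:gz") as tar:
        tar.add("/home/user/example.com")
    # Dump the database (password handling simplified for the sketch)
    with open("blog_db.sql", "wb") as out:
        subprocess.check_call(
            ["mysqldump", "-u", "backup_user", "-pSECRET", "blog_db"],
            stdout=out)
    # Upload both to S3; a cron entry would call this script nightly
    subprocess.check_call(
        ["s3cmd", "put", archive, "blog_db.sql", "s3://my-backup-bucket/"])
```

The date-stamped names are what make the 7-day retention in the earlier posts possible: each night produces a new file, and old ones can be expired by age.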

There are still a few tweaks I would like to make to this process, but for now I finally have my backups running the way I like them.

Backups, Backups, Backups

This is a public service announcement

This past Friday I was sitting at my desk working when suddenly I found myself staring at a lovely blue screen of death. When I powered the machine back on I heard the telltale click of a hard drive that had died. The good news is my data was all backed up, so beyond the downtime until a new hard drive arrived, and the frustration of having to rebuild the machine (I did have an image of the machine from when I first got it, but it failed to boot after restoring, so I did have to manually install Windows), losing the hard drive was not a big deal.

My personal backup system is made of up a few components

1. Carbonite – drop dead simple and easy to use: once you have it set up you don’t actually have to do anything other than have your computer connected to the internet.  I have looked at their competition; Carbonite is the only one to offer a flat fee for unlimited storage.  With remote file access and an iPhone app, you get access to your files anywhere.

This is actually the second time I have recovered my system from my Carbonite backup.

2. Dropbox – I use the free version, which gives me about 3GB of storage.  I mainly use Dropbox to keep certain files on both my laptop and netbook, and to share folders with others.  Dropbox also offers remote file access and an iPhone app, but for the amount of data I am backing up it would cost me approximately $185.00 more a year than Carbonite. (A word of caution: if you decide to use both Carbonite and Dropbox on the same machine, do NOT put your Dropbox folder within a folder being backed up by Carbonite; Windows Explorer was not at all amused when I tried that!)

3. Lotus Notes –  while most of the Notes databases I use on a regular basis are local, they are all ultimately hosted on servers that are backed up.  With background replication scheduled and running throughout the day this is as effortless as backing up can be.  Bringing back my Notes Databases takes no time at all with zero loss of data.  

Here are three great rules for backups from an article that caught my eye on Monday, Yes. Another Backup Lecture:

  • If it’s not automated, it’s not a real backup.
  • If it’s not redundant, it’s not a real backup.
  • If it’s not regularly rotated off-site, it’s not a real backup.

So how do you back up your data? If your answer is “I don’t,” drop whatever it is you are doing and go figure out a backup strategy; one day you will be glad you did!

The best $39.99 I spent in 2008

I welcomed in the new year with a complete meltdown of my laptop.  Things started to go wrong about 4:30 PM on New Year’s Eve, and within an hour I realized that I would be reinstalling my laptop completely.  I might have been really aggravated except for two things.  First, and most importantly, I was prepared for a disaster!  Second, for a while now I had really wanted a clean start.  I have had this machine about 18 months, which is probably longer than I have ever gone before without starting clean; I just didn’t think I would have the time to do it before Lotusphere, until fate intervened.

When I first configured my laptop I had the foresight to take an image of the OS install before anything else, and it took about 45 minutes to restore that image.  After a few Windows updates and a Lenovo system update I was ready to go.  (I could snapshot the machine with all my software installed, but let’s face it, over 18 months the software I use has changed significantly, so I don’t see the point.)

Over the years I have tried various backup strategies with varying degrees of success (OK, if I had a Mac I would be using Time Machine), but last year I was looking for a solution that just worked and required no intervention on my part to maintain it.  After looking at a few options I went with Carbonite.  Once you subscribe you select which folders should be backed up, and just let it run.  Over the year I used it a few times to restore individual files (sometimes just to test it, a couple of times due to accidental deletions).  After restoring my OS I logged in to the Carbonite site, installed the application, and put it in recovery mode; like magic my data started flowing back to my machine.  For the most part I do keep local backups as well on external drives, but those drives are not always connected and are at times out of date, while the Carbonite backup just keeps on going.

In a complete coincidence, earlier in the day on December 31 I had blogged some 2008 statistics.  This left me with a screen shot of how many pictures I had by year since 2002, giving me something to match my restored data against, which it did perfectly!

So how do you back up your PC? Are you ready for a disaster (or at least a crash)?

In the process of reinstalling I made some changes to the software I use: Trillian is being replaced by Digsby, and Adobe Reader is gone, replaced by Foxit Reader.  I have not yet reinstalled Microsoft Office; for the time being I am going to hold off and see how far I can go with just Symphony.  Speaking of Symphony, I am running the version in the Notes 8.5 client.  It would have been nice if Notes 8.5 had shipped in time for my little disaster, so I am back to a recent beta drop until 8.5 gold is released.