
Some useful Amazon S3 Tools

I find myself using Amazon S3 for storage more and more these days.  Yes, I have a Dropbox account and a Box account (and probably a Copy account too).  There are times when I need a little more space than I have available in Dropbox, and among other things I have been using Amazon S3 to give myself a little extra space as needed.

What is Amazon S3, you ask? Here is a quick explanation: it is Amazon's Simple Storage Service, pay-as-you-go object storage that you read and write over a web API, with files stored as objects inside buckets.
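To make that concrete, here is a minimal sketch of pushing a file into a bucket with boto, the Python interface to AWS (the bucket and file names below are made up for illustration):

from boto.s3.connection import S3Connection
from boto.s3.key import Key

# Connect with your AWS credentials (these can also come from
# environment variables or a boto config file)
conn = S3Connection('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY')
bucket = conn.get_bucket('my-extra-space')  # hypothetical bucket name

# Upload a local file; the key is the object's name inside the bucket
key = Key(bucket)
key.key = 'documents/bigfile.zip'
key.set_contents_from_filename('bigfile.zip')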

Here are some tools I have found useful working with my files in S3

Android: S3Anywhere. Free and Pro versions available.

Windows: CloudBerry Explorer for Amazon S3. Free and Pro versions available; so far I have been using the free one and it is meeting my needs.

iOS: AirFile. Free and Pro versions available.  This one is interesting, as it allows me to move files from S3 to Dropbox, or the other way, right from my iPad (it supports other cloud storage services as well).

Are you using Amazon S3? If you have any good tools, leave a comment.

Managing your Files in the Cloud Made Simple with Otixo

I find that I have files stored in a number of places these days: Google Docs, Dropbox, Amazon S3, Box, and Picasa, to name a few.  A while back I came across Otixo, which allows me to access all of them in one place and move files between the services, all from one dashboard.  Here is how Otixo explains their service:

(Wouldn’t it be nice to see IBM Connections Files and IBM SmartCloud on their list of supported services?)

Otixo also supports WebDAV access to your account, allowing you to see all of your configured services as a mapped drive in Windows, which is great for uploading or moving files between services.
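If you are curious what that WebDAV access looks like under the hood, here is a minimal sketch in Python using the requests library (the endpoint URL is an assumption; your Otixo account settings list the real one):

import requests

# PROPFIND with a Depth of 1 asks a WebDAV server to list the
# immediate children of a folder
resp = requests.request(
    'PROPFIND',
    'https://dav.otixo.com/',  # assumed endpoint; verify in your account
    auth=('you@example.com', 'your-password'),
    headers={'Depth': '1'},
)
print(resp.status_code)  # 207 Multi-Status on success
print(resp.text)         # XML listing of your linked services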

Otixo has free and paid plans, the only difference between the two being monthly bandwidth limits.


Automatic File Deletion in Amazon S3 Revisited

Back in October I wrote about automating my Apache/MySQL backups to Amazon S3.  I then spent considerable time working out automatic file deletion in S3, a slightly complicated process, but one that worked well once I had it configured.

This week Amazon rolled out Object Expiration for S3, allowing you to automatically expire files in a bucket based on name and time criteria.  Amazon has an excellent guide to configuring Object Expiration via the AWS Console.

To test this I set one of my buckets up for 5-day expiration of any files named ‘backup’ (my existing scripts all maintain 7 days of backups).

So much easier than Python and Boto.
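(If you ever do want to script it, newer versions of boto can manage the same lifecycle rules.  A rough sketch, with a made-up bucket name:)

from boto.s3.connection import S3Connection
from boto.s3.lifecycle import Lifecycle, Expiration

conn = S3Connection()  # credentials from the environment or boto config
bucket = conn.get_bucket('my-backup-bucket')  # hypothetical bucket name

# Expire any object whose key starts with 'backup' after 5 days
lifecycle = Lifecycle()
lifecycle.add_rule('expire-backups', prefix='backup', status='Enabled',
                   expiration=Expiration(days=5))
bucket.configure_lifecycle(lifecycle)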


When I checked this morning, there were 5 backups remaining.

Still easier than Boto and Python.


You can verify which rule applies to an object by selecting the object and examining its properties.  One word of caution: Object Expiration rules are set per bucket, so even if you have folders within a bucket, the rule is global.  Make sure you understand which objects you are expiring.
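(You can also pull the configured rules back with boto; a quick sketch, again with a made-up bucket name:)

from boto.s3.connection import S3Connection

conn = S3Connection()
bucket = conn.get_bucket('my-backup-bucket')  # hypothetical bucket name

# List every lifecycle rule configured on the bucket
for rule in bucket.get_lifecycle_config():
    print('%s: prefix=%s status=%s' % (rule.id, rule.prefix, rule.status))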

Why expire objects in S3? In S3 (and all of the AWS services) you pay for what you use, so by managing the number of files (in this case backups) stored at any given time, my costs are kept to a minimum.  For December so far my S3 charges are $0.17 (yes, that is 17 cents to store the backups for 3 websites and a number of MySQL databases).
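As a rough sanity check, at the roughly $0.14 per GB-month standard rate S3 charged at the time:

$0.17 / ($0.14 per GB-month) ≈ 1.2 GB of backups on hand in an average month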

Links

Amazon S3 announces Object Expiration

Amazon S3 Developer Guide: Object Expiration

Managing Lifecycle Configuration

Automatic File Deletion in Amazon S3

A couple of weeks ago I wrote about backing up Apache and MySQL to Amazon S3.  The final piece of the puzzle I was looking for was to automatically purge all backups older than 7 days from S3.  It took me a little while to put all the pieces in place, and I wanted to write it all down, as much for myself as for anyone else, since I am sure I won't remember all of this if I need it again.

I found a nice utility named s3cleaner, a simple Python script that uses age and regex patterns to delete files from S3 buckets.

The first challenge: to run a Python script on my server (hosted at DreamHost), I needed to create a sandboxed Python environment.  I found some instructions here.  The first step is to download virtual-python.py and run it.  From a command line you can simply run

wget http://peak.telecommunity.com/dist/virtual-python.py (to download the script)


Once it is downloaded, run the script:
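python virtual-python.py

When it finishes, the script reports the location of the new sandboxed interpreter, typically ~/bin/python under your home directory (the exact path on your host may differ); that path is what you will use in the steps below.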


Next you have to download ez_setup.py.  From the command line, run

wget http://peak.telecommunity.com/dist/ez_setup.py
 
Once it is downloaded, execute it.  The key here is that you are now using the sandboxed install of Python: point to the interpreter at the path reported in the previous step, not simply python as in the first step.
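For example, assuming the sandbox landed in the default location in your home directory:

~/bin/python ez_setup.py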
 
You now have a sandboxed installation of Python which will let you run s3cleaner.  But wait, not so fast: s3cleaner relies on boto, which provides the Python interface to Amazon Web Services.  Download boto here and copy it up to your server.  Once it is on your server, install it; again, make sure you use the sandboxed Python, not the system-wide one.
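Assuming you uploaded the boto source tarball to your home directory, the install looks something like this:

tar xzf boto-*.tar.gz
cd boto-*
~/bin/python setup.py install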
 
You are now ready to place s3cleaner.py on your server and run it.  You need to specify a handful of flags when you run it: your AWS credentials, the bucket name, the maximum age, and a filename regex.
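The invocation looks something like this (a sketch: --maxage and --regex are the script's real flags as noted below, but the credential and bucket flag names and the age format are assumptions; run the script without arguments to see its real usage):

~/bin/python s3cleaner.py --key=YOUR_ACCESS_KEY --secret=YOUR_SECRET_KEY --bucket=my-backup-bucket --maxage=7d --regex='backup'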
 
 
A word of caution: for bucket name it only accepts a bucket name, not a subdirectory, so everything that matches your deletion criteria (--maxage and --regex) anywhere in the bucket will be deleted.
 
Download List 
virtual-python.py and ez_setup.py
boto
s3cleaner