I find myself using Amazon S3 for storage more and more these days. Yes, I have a Dropbox account and a Box account (and probably a Copy account too). There are times when I need a little more space than I have available in Dropbox, and among other things I have been using Amazon S3 to give myself a little extra space as needed.
What is Amazon S3, you ask? Here is a quick explanation:
https://www.youtube.com/watch?v=idO_uy6FicI
Here are some tools I have found useful for working with my files in S3:
Android: S3Anywhere. Free and Pro versions available.
Windows: CloudBerry Explorer for Amazon S3. Free and Pro versions available; so far I have been using the free one and it is meeting my needs.
iOS: AirFile. Free and Pro versions available. This one is interesting as it allows me to move files from S3 to Dropbox, or the other way, right from my iPad (it supports other cloud storage services as well).
Are you using Amazon S3? If you have any good tools, leave a comment.
I find that I have files stored in a number of places these days: Google Docs, Dropbox, Amazon S3, Box, and Picasa, to name a few. A while back I came across Otixo, which allows me to access all of them in one place and move files between the services, all from one dashboard. Here is how Otixo explains their service:
Otixo also supports WebDAV access to your account, allowing you to see all your configured services as a mapped drive in Windows, which is great for uploading or moving files between services.
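If you want to try the mapped drive, Windows can mount a WebDAV URL directly with the built-in net use command. The drive letter, endpoint, and username below are placeholders, so check Otixo's documentation for the actual WebDAV address:

    net use Z: https://dav.otixo.com /user:you@example.com

Windows will prompt for your password and then show the Z: drive in Explorer like any other disk.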
To test this, I set one of my buckets up with a 5-day expiration rule for any files named 'backup' (my existing scripts all maintain 7 days of backups).
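If you prefer to script the rule rather than set it up in the AWS console, here is a minimal sketch using boto3, the AWS SDK for Python (the bucket name is a placeholder):

    import boto3

    s3 = boto3.client("s3")

    # Expire any object whose key starts with 'backup' after 5 days.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",  # hypothetical bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "expire-backups",
                    "Filter": {"Prefix": "backup"},
                    "Status": "Enabled",
                    "Expiration": {"Days": 5},
                }
            ]
        },
    )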
When I checked this morning, there were 5 backups remaining.
You can verify which rule applies to an object by selecting the object and examining its properties. One word of caution: Object Expiration rules are set per bucket, so even if you have folders within a bucket, the rule is global. Make sure you understand which objects you are expiring.
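You can check the same thing programmatically: a HEAD request on an object that matches a lifecycle rule returns an Expiration header containing the expiry date and the rule ID. A minimal boto3 sketch, with placeholder bucket and key names:

    import boto3

    s3 = boto3.client("s3")
    resp = s3.head_object(Bucket="my-backup-bucket", Key="backup-db.sql.gz")

    # For an object matched by an expiration rule this prints something like:
    #   expiry-date="Sat, 08 Dec 2012 00:00:00 GMT", rule-id="expire-backups"
    print(resp.get("Expiration"))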
Why expire objects in S3? In S3 (and all of the AWS services) you pay for what you use, so by managing the number of files (in this case backups) stored at any given time, my costs are kept to a minimum. For December so far, my S3 charges are $0.17 (yes, that is 17 cents to store backups for 3 websites and a number of MySQL databases).
A couple of weeks ago I wrote about backing up Apache and MySQL to Amazon S3; the final piece of the puzzle was automatically purging all backups older than 7 days from S3. It took me a little while to put all the pieces in place, and I wanted to write it all down as much for myself as for anyone else, since I am sure I won't remember all this if I need it again.
I found a nice utility named s3cleaner, a simple Python script that uses age and regex patterns to delete files from S3 buckets.
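The idea behind s3cleaner boils down to listing a bucket, matching keys against a regex, and deleting anything older than a cutoff. This is not the actual script, just a minimal sketch of the same approach using boto3 (the bucket name and pattern are placeholders):

    import re
    from datetime import datetime, timedelta, timezone

    import boto3

    BUCKET = "my-backup-bucket"       # hypothetical bucket name
    PATTERN = re.compile(r"^backup")  # keys to consider
    MAX_AGE = timedelta(days=7)

    s3 = boto3.client("s3")
    cutoff = datetime.now(timezone.utc) - MAX_AGE

    # Walk the bucket and delete matching objects older than the cutoff.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            if PATTERN.match(obj["Key"]) and obj["LastModified"] < cutoff:
                s3.delete_object(Bucket=BUCKET, Key=obj["Key"])
                print("deleted", obj["Key"])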
The first challenge was running a Python script on my server (hosted at DreamHost): I needed to create a sandboxed Python environment, and I found some instructions here. The first step is to download virtual-python.py and run it. From a command line you can simply run:
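    python virtual-python.py

This assumes you downloaded virtual-python.py to your current directory; adjust the path if you saved it somewhere else.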