Category Archives: cloud

Speaking at IBM Connect


It’s almost time to board a plane and head to Orlando; I’m looking forward to seeing many of you there next week. I will be speaking again this year, so come check out my session on Tuesday morning.

To Infinity and Beyond: Migrating your Users and Their Data into IBM Connections Cloud 

Tuesday February 2, 2016 - 9:15-10:15 AM

Room: Lake Highland AB

 

IBM Connections Cloud Meetings Get a New Look and Audio/Video

Another weekend, another set of updates to IBM Connections Cloud, one of which is a new look and feel for IBM Connections Cloud Meetings. The Meeting UI has been updated to the Verse theme; the blank gray box that used to dominate the screen is gone, replaced with quick actions to share your screen or files.


Coming next is audio/video integration in Meetings. I have been beta testing it for a while now and am happy to see it coming to release shortly.

Check out Luis Benitez’s blog for more of the new capabilities released this weekend.

IBM SmartCloud rebranding to IBM Connections Cloud

We heard last January at IBM Connect that the IBM Collaboration Solutions portfolio would be rebranded under the IBM Connections name. IBM SmartCloud for Social Business is no exception and will be renamed later in September.

Current Name → New Name
IBM SmartCloud Engage Advanced → IBM Connections Cloud S1
IBM SmartCloud Engage Standard → IBM Connections Cloud S2
IBM SmartCloud iNotes → IBM Connections Web Mail Cloud
IBM SmartCloud Archive Essentials → IBM Connections Archive Essentials Cloud
IBM SmartCloud Connections → IBM Connections Social Cloud
IBM SmartCloud Docs → IBM Connections Docs Cloud
IBM SmartCloud Meetings → IBM Connections Meetings Cloud
(New Offering) → IBM Connections Chat Cloud

It looks like they are holding off on renaming the Notes mail products until Mail Next launches.

The URLs will not be changing, just the product names.

CloudGOO – combine all of your cloud storage into one drive

I installed CloudGOO on my Android phone today (they say an iOS version is in the works). I love the idea of seeing all my cloud-based files across different services as if they were all together on one big drive. CloudGOO also supports automatic upload, where you can specify which service to use or let CloudGOO decide based on the space you have available.

https://www.youtube.com/watch?v=0iGG8FQWchQ


Some useful Amazon S3 Tools

I find myself using Amazon S3 for storage more and more these days. Yes, I have a Dropbox account and a Box account (and probably a Copy account too). There are times when I need a little more space than I have available in Dropbox, so among other things I have been using Amazon S3 to give myself a little extra space as needed.

What is Amazon S3, you ask? Here is a quick explanation:

https://www.youtube.com/watch?v=idO_uy6FicI

Here are some tools I have found useful for working with my files in S3:

Android: S3Anywhere (Free and Pro versions available)

Windows: Cloudberry Explorer for Amazon S3 (Free and Pro versions available). So far I have been using the free one and it is meeting my needs.

iOS: AirFile (Free and Pro versions available). This one is interesting, as it allows me to move files from S3 to Dropbox or the other way around right from my iPad (it supports other cloud storage services as well).

Are you using Amazon S3? If you have any good tools leave a comment.

Automatic File Deletion in Amazon S3 Revisited

Back in October I wrote about automating my Apache/MySQL backups to Amazon S3. I then spent considerable time working out automatic file deletion in S3, a slightly complicated process, but one that worked well once I had it configured.

This week Amazon rolled out Object Expiration for S3, allowing you to automatically expire files in a bucket based on name and time criteria. Amazon has an excellent guide to configuring Object Expiration via the AWS Console.

To test this, I set one of my buckets up for 5-day expiration of any files whose names start with ‘backup’ (my existing scripts all maintain 7 days of backups).
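For reference, the lifecycle configuration document behind a rule like this looks roughly as follows (a minimal sketch based on the Object Expiration documentation linked below; the rule ID is my own placeholder):

<LifecycleConfiguration>
  <Rule>
    <!-- "expire-backups" is a placeholder ID; the prefix and days match the rule described above -->
    <ID>expire-backups</ID>
    <Prefix>backup</Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>5</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>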

So much easier than Python and Boto.

When I checked this morning, there were 5 backups remaining.

Still easier than Boto and Python.

You can verify which rule applies to an object by selecting the object and examining its properties. One word of caution: Object Expiration rules are defined per bucket, so even if you have folders within a bucket, the rule is global to the bucket. Make sure you understand which objects you are expiring.

Why expire objects in S3? In S3 (and all of the AWS services) you pay for what you use, so by managing the number of files (in this case, backups) stored at any given time, my costs are kept to a minimum. For December so far my S3 charges are $0.17 (yes, that is 17 cents to store the backups for 3 websites and a number of MySQL databases).

Links

Amazon S3 announces Object Expiration

Amazon S3 Developer Guide: Object Expiration

Managing Lifecycle Configuration

Automatic File Deletion in Amazon S3

A couple of weeks ago I wrote about backing up Apache and MySQL to Amazon S3. The final piece of the puzzle I was looking for was to automatically purge all backups older than 7 days from S3. It took me a little while to put all the pieces in place, and I wanted to write it all down, as much for myself as for anyone else, since I am sure I won’t remember all this if I need it again.

I found a nice utility named s3cleaner, a simple Python script that uses age and regex patterns to delete files from S3 buckets.

The first challenge was running a Python script on my server (hosted at Dreamhost): I needed to create a sandboxed Python environment, and I found some instructions here. The first step is to download virtual-python.py and run it. From a command line you can simply run:

wget http://peak.telecommunity.com/dist/virtual-python.py (to download the script)


Once it is downloaded, run the script:

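On my Dreamhost server that step looked something like this (a sketch; virtual-python.py installs a private copy of the interpreter under your home directory, typically ~/bin/python):

python virtual-python.py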

Next you have to download ez_setup.py; from the command line, run:

wget http://peak.telecommunity.com/dist/ez_setup.py
 
Once it is downloaded, execute it. The key here is that you are now using the sandboxed install of Python: invoke the python inside your sandbox (typically ~/bin/python), not simply python as in the first step.
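In other words, something like this (assuming the sandboxed interpreter from the previous step landed in the default ~/bin):

~/bin/python ez_setup.py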
 
You now have a sandboxed installation of Python which will let you run s3cleaner. But wait, not so fast: s3cleaner relies on boto, which provides the Python interface to Amazon Web Services. Download boto here and copy it up to your server. Once it is on your server, install it; again, make sure you use the sandboxed Python, not the system-wide one.
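The install step looks roughly like this (a sketch; I am assuming a boto source tarball uploaded to the current directory and the sandboxed interpreter in ~/bin):

# unpack the boto tarball, then install it with the sandboxed Python
tar xzf boto-*.tar.gz
cd boto-*
~/bin/python setup.py install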
 
You are now ready to place s3cleaner.py on your server and run it. You need to pass it your AWS credentials, the bucket to clean, and the deletion criteria (--maxage and --regex).
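Here is a hypothetical invocation. Only --maxage and --regex are flag names taken from this post; the bucket flag name, the age format, and how credentials are passed are my guesses, so check the script’s usage output for the real details:

# hypothetical: flag names and the "7d" age format are assumptions
~/bin/python s3cleaner.py --bucket mybucket --maxage 7d --regex 'backup.*'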
 
A word of caution: for bucket name it only accepts a bucket name, not a subdirectory, so everything that matches your deletion criteria (--maxage and --regex) anywhere in the bucket will be deleted.
 
Download List 
virtual-python.py and ez_setup.py
boto
s3cleaner
 

Backing up Apache and MySQL to Amazon S3

Backups: you know they are important, right? I have written before about backing up my PC, but over time I have accumulated a bunch of data in web sites and MySQL databases. Between my blog, Thinkup and Tweetnest, an installation of YOURLS, and a few other miscellaneous PHP files and databases, I have a lot of backups to worry about. Sure, my host Dreamhost maintains backups (which I have used), but Dreamhost recommends, and common sense dictates, that I have some kind of backup plan of my own.

I have used a number of different methods over time, including WordPress plugins, xcloner, and others, but nothing was quite as simple and complete as I was looking for.

Finally, over the last couple of weeks, I found something that meets all my criteria: automatic, includes files and databases, and backs up to another site (i.e. not on a Dreamhost server).

The destination for my backups is Amazon S3: cheap (really cheap), reliable storage accessible from anywhere. Next I found S3 Tools, command-line tools for Linux for interacting with S3. I simply copied the files up to my server, ran s3cmd --configure, plugged in my access key, secret key, and S3 bucket name, and I was connected to S3.
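Once configured, getting a file into a bucket is a one-liner (mybucket is a placeholder name):

s3cmd put backup.tar.gz s3://mybucket/backups/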

I was going to write my own scripts to dump MySQL data and tar up some directories, but happily I found someone had already done that. With a few simple modifications to that script and a quick cron job, I had my backups up and running in no time.
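For anyone who does want to roll their own, the shape of such a script is roughly this (my own sketch, not the script linked above; the user, password, database, paths, and bucket are all placeholders):

#!/bin/bash
# Sketch of a nightly backup job: dump a database, tar up a site
# directory, and push both to S3, keyed by weekday so seven rotating
# copies are kept. Run from cron, e.g.: 0 3 * * * /home/me/backup-to-s3.sh
DAY=$(date +%A)
mysqldump -u backupuser -pPASSWORD mydatabase > /tmp/mydatabase-$DAY.sql
tar czf /tmp/site-$DAY.tar.gz /home/me/example.com
s3cmd put /tmp/mydatabase-$DAY.sql s3://mybucket/backup-mydatabase-$DAY.sql
s3cmd put /tmp/site-$DAY.tar.gz s3://mybucket/backup-site-$DAY.tar.gz
rm -f /tmp/mydatabase-$DAY.sql /tmp/site-$DAY.tar.gz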

There are still a few tweaks I would like to make to this process, but for now I finally have my backups running the way I like them.

Connecting to the Cloud

I have been enjoying Rob Cottingham’s cartoons for a while now; they appear occasionally on Read Write Web and on his blog. Rob is coming up on the fourth anniversary of Noise to Signal; here is his latest cartoon.

[Noise to Signal cartoon: 2011.05.23.cloud]

Cartoon via Read Write Web