To Infinity and Beyond: Migrating your Users and Their Data into IBM Connections Cloud
Tuesday February 2, 2016 - 9:15-10:15 AM
Room: Lake Highland AB
Another weekend, another set of updates to IBM Connections Cloud, one of which is a new look and feel to
Coming next is Audio/Video integration in Meetings. I have been beta testing it for a while now and am happy to see it coming to release shortly.
Check out Luis Benitez’s blog for more of the new capabilities released this weekend.
We heard last January at IBM Connect that the IBM Collaboration Solutions products would all be rebranded under the IBM Connections name. IBM has now announced the new names:
| Current Name | New Name |
| --- | --- |
| IBM SmartCloud Engage Advanced | IBM Connections Cloud S1 |
| IBM SmartCloud Engage Standard | IBM Connections Cloud S2 |
| IBM SmartCloud iNotes | IBM Connections Web Mail Cloud |
| IBM SmartCloud Archive Essentials | IBM Connections Archive Essentials Cloud |
| IBM SmartCloud Connections | IBM Connections Social Cloud |
| IBM SmartCloud Docs | IBM Connections Docs Cloud |
| IBM SmartCloud Meetings | IBM Connections Meetings Cloud |
| (New Offering) | IBM Connections Chat Cloud |
Looks like they are holding off on changing the Notes Mail product names until Mail Next launches.
The URLs will not be changing, just the product names.
I installed CloudGOO on
https://www.youtube.com/watch?v=0iGG8FQWchQ
I find myself using Amazon S3 more and more.
What is Amazon S3, you ask? Here is a quick explanation:
https://www.youtube.com/watch?v=idO_uy6FicI
Here are some tools I have found useful for working with my files in S3:
Android: S3Anywhere (Free and Pro versions available)
Windows: Cloudberry Explorer for Amazon S3 (Free and Pro versions available). So far I have been using the free one and it is meeting my needs.
iOS: AirFile (Free and Pro versions available). This one is interesting as it allows me to move files from S3 to Dropbox, or the other way around, right from my iPad (it supports other cloud storage services as well).
Are you using Amazon S3? If you have any good tools leave a comment.
Back in October I wrote about automating my backups to Amazon S3.
This week Amazon rolled out Object Expiration for S3, allowing you to automatically expire files in a bucket based on name and time criteria. Amazon has an excellent guide to configuring Object Expiration via the AWS Console.
To test this I set one of my buckets up for 5-day expiration of any files named ‘backup’ (my existing scripts all maintain 7 days of backups).
When I checked this morning there were 5 backups remaining.
You can verify which rule applies to an object by selecting the object and examining its properties. One word of caution: Object Expiration rules are set per bucket, so even if you have folders within a bucket the rule is global; make sure you understand which objects you are expiring.
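The post above sets the rule up through the AWS Console; for reference, here is a minimal sketch of an equivalent rule expressed with boto3, the current AWS SDK for Python (a newer library than the boto used elsewhere on this blog). The bucket name and rule ID are placeholders, and the Prefix filter is what scopes the rule to keys starting with "backup".

```python
# Hedged sketch only: an Object Expiration rule like the one described above,
# created with boto3 instead of the AWS Console. Names are placeholders.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-backup-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-backups",
                "Filter": {"Prefix": "backup"},  # only keys starting with "backup"
                "Status": "Enabled",
                "Expiration": {"Days": 5},       # matches the 5-day test above
            }
        ]
    },
)
```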
Why expire objects in S3? In S3 (and all of the AWS services) you pay for what you use, so by managing the number of files (in this case backups) stored at any given time, my costs are kept to a minimum. For December so far my S3 charges are $0.17 (yes, that is 17 cents to store my backups for 3 websites and a number of MySQL databases).
Links
Amazon S3 announces Object Expiration
I found a nice utility named s3cleaner, a simple Python script that uses age and regex patterns to delete files from S3 buckets.
The first challenge was running a Python script on my server (hosted at Dreamhost); I needed to create a sandboxed Python environment. I found some instructions here. The first step is to download virtual-python.py and run it. From a command line you can simply run:
wget http://peak.telecommunity.com/dist/virtual-python.py (to download the script)
Once it is downloaded, run the script.
Next you have to download ez_setup.py; from the command line run:
wget http://peak.telecommunity.com/dist/ez_setup.py
Once it is downloaded, execute it. The key here is that you are now using the sandboxed install of Python: point to the Python binary at the path shown in the image above, not simply python as in the first step.
You now have a sandboxed installation of Python which will let you run s3cleaner. But wait, not so fast: s3cleaner relies on boto, which provides the Python interface to Amazon Web Services. Download boto here and copy it up to your server. Once it is on your server, install it; again, make sure you use the sandboxed Python, not the system-wide one.
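A quick way to confirm that boto installed into the sandboxed Python correctly is to list your buckets from a short script. This is only a sanity-check sketch; the access and secret keys below are placeholders, and it should be run with the sandboxed interpreter rather than the system one.

```python
# Sanity check: run this with the sandboxed python, not the system-wide one.
# The keys are placeholders for your own AWS credentials.
import boto

conn = boto.connect_s3("YOUR_ACCESS_KEY", "YOUR_SECRET_KEY")
print(conn.get_all_buckets())  # should print the list of your S3 buckets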
You are now ready to place s3cleaner.py on your server and run it. When running it you need to specify a few flags, including the bucket name, the maximum age, and the regex pattern to match.
A word of caution: for the bucket name it only accepts a bucket name, not a subdirectory, so everything that matches your deletion criteria (--maxage and --regex) anywhere in the bucket will be deleted.
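To make the age-plus-regex idea concrete, here is a rough sketch of the same kind of cleanup written directly against boto. This is not the actual s3cleaner script and does not reproduce its flags; the bucket name, credentials, pattern, and age threshold below are all placeholders.

```python
# Rough sketch of an age-plus-regex cleanup using boto (not the real s3cleaner).
# Anything matching PATTERN and older than MAX_AGE is deleted, bucket-wide.
import re
from datetime import datetime, timedelta

import boto

MAX_AGE = timedelta(days=7)        # placeholder age threshold
PATTERN = re.compile(r"backup")    # placeholder key pattern
cutoff = datetime.utcnow() - MAX_AGE

conn = boto.connect_s3("YOUR_ACCESS_KEY", "YOUR_SECRET_KEY")
bucket = conn.get_bucket("my-backup-bucket")

for key in bucket.list():
    # last_modified from a bucket listing looks like '2011-12-18T04:00:00.000Z'
    modified = datetime.strptime(key.last_modified, "%Y-%m-%dT%H:%M:%S.%fZ")
    if PATTERN.search(key.name) and modified < cutoff:
        print("Deleting %s" % key.name)
        bucket.delete_key(key.name)
```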
Download List
virtual-python.py and ez_setup.py
boto
s3cleaner
Backups,
I have used a number of different methods over time, including WordPress plugins, xcloner, and others, but nothing was quite as simple and complete as I was looking for.
Finally, over the last couple of weeks I found something that meets all my criteria: automatic, includes files and databases, and backs up to another site (i.e. not on a Dreamhost server).
The destination for my backups is Amazon S3: cheap (no, really cheap), reliable storage accessible from anywhere. Next I found S3 Tools, command-line tools for Linux to interact with S3. I simply copied the files up to my server, ran s3cmd --configure, plugged in my access key, secret key, and S3 bucket name, and I was connected to S3.
I was going to write my own scripts to dump MySQL data and tar up some directories, but happily I found someone had already done that. With a few simple modifications to that script and a quick cron job, I had my backups up and running in no time.
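For anyone curious what such a script roughly looks like, here is a hedged sketch of the idea: dump the database, tar up the site files, and push both to S3 with s3cmd. It is not the script linked above, and the bucket, paths, and database name are placeholders (database credentials are assumed to come from ~/.my.cnf).

```python
# Sketch only: dump MySQL, tar the site, push both to S3 with s3cmd.
# Assumes mysqldump, tar, and s3cmd are installed and s3cmd --configure has been run.
import datetime
import subprocess

STAMP = datetime.date.today().isoformat()
BUCKET = "s3://my-backup-bucket"          # placeholder bucket
SITE_DIR = "/home/user/example.com"       # placeholder site directory
DB_NAME = "example_db"                    # placeholder database (credentials via ~/.my.cnf)

db_dump = "/tmp/backup-%s-%s.sql.gz" % (DB_NAME, STAMP)
site_tar = "/tmp/backup-site-%s.tar.gz" % STAMP

subprocess.run("mysqldump %s | gzip > %s" % (DB_NAME, db_dump), shell=True, check=True)
subprocess.run(["tar", "czf", site_tar, SITE_DIR], check=True)
subprocess.run(["s3cmd", "put", db_dump, BUCKET + "/"], check=True)
subprocess.run(["s3cmd", "put", site_tar, BUCKET + "/"], check=True)
```

A nightly crontab entry pointing at a script like this (for example, 0 3 * * * python /home/user/backup.py, again hypothetical) is all the scheduling it needs.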
There are still a few tweaks I would like to make to this process, but for now I finally have my backups running the way I like them.
Cartoon via Read Write Web