
Posts from the ‘AWS’ Category

Data Without Limits – Insights from Werner Vogels of Amazon

O’Reilly Media’s Strata: Making Data Work conference, held February 1st – 3rd, 2011 in Santa Clara, California, was one of the most interesting and multifaceted events of the year.  It included presentations on data science, real-time data processing and analytics, data acquisition and crowdsourcing, and visualization, in addition to many other topics.  You can find the complete list of speaker slides and videos for the event at this link, Strata 2011 Speaker Slides & Videos.

What enriches this conference is the quality of the case studies presented.  Be sure to check out the presentation from DJ Patil of LinkedIn on Innovating Data Teams.  His discussion illustrates just how critical big data is to LinkedIn and how their approach to managing it enriches the user experience, and is transforming LinkedIn functionality at the same time.

One of the best overall presentations features Dr. Werner Vogels, CTO of Amazon, and is titled Data Without Limits.  The video below provides a glimpse into how pervasive AWS is becoming as a foundation for accessing, aggregating and transforming data in real time.

Building a High Performance Cluster with Amazon Web Services


Amazon Web Services has released the following video that provides a fascinating look at how straightforward it is to create, launch and monitor high performance cluster instances.

CPU utilization, disk I/O and network utilization are tracked as part of the metrics, and the video also covers how to specify hardware-assisted virtualization (HVM).   It walks through the steps of creating an 8-node, 64-core, ad hoc cluster with the goal of running a molecular dynamics simulation.
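The video works through the AWS console, but the same cluster can be requested programmatically.  The sketch below uses the present-day boto3 SDK (which postdates the video) with a placeholder AMI ID and instance type; only the 8-node / 64-core shape comes from the video, and real use requires AWS credentials.

```python
# Hypothetical sketch: request an 8-node EC2 cluster in a "cluster"
# placement group for low-latency networking. The AMI and instance
# type are placeholders, not values from the video.

CLUSTER_NODES = 8       # from the video: 8 nodes...
CORES_PER_NODE = 8      # ...at 8 cores each = 64 cores total

def total_cores(nodes: int, cores_per_node: int) -> int:
    """Total core count of the cluster."""
    return nodes * cores_per_node

def launch_cluster(ami_id: str, instance_type: str) -> dict:
    """Create a cluster placement group and launch every node into it."""
    import boto3  # imported here so the pure helper above works without it
    ec2 = boto3.client("ec2")
    ec2.create_placement_group(GroupName="md-sim-cluster",
                               Strategy="cluster")
    return ec2.run_instances(
        ImageId=ami_id,            # placeholder AMI ID
        InstanceType=instance_type,
        MinCount=CLUSTER_NODES,    # all-or-nothing: we need every node
        MaxCount=CLUSTER_NODES,
        Placement={"GroupName": "md-sim-cluster"},
    )

print(total_cores(CLUSTER_NODES, CORES_PER_NODE))  # 64
```

Launching all nodes into one placement group is what gives the cluster the low-latency interconnect a molecular dynamics simulation depends on.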

What is interesting about this video is how Amazon Web Services continues to show the practicality of its broad spectrum of server capacities on the Elastic Compute Cloud (EC2).   This is the first in a series of videos Amazon Web Services will be releasing on creating high performance clusters.  It’s worth checking out as the walk-through of steps shows how rapidly EC2 is maturing as an enterprise platform.

Implications for the Enterprise

EC2 has language-agnostic Web Services APIs that show potential for integrating legacy systems, databases, master data management (MDM), CRM and enterprise systems.  For enterprises that have data-centric operations and business models, EC2 could become the foundation of contextual search and role-based access of their legacy data.  Decades of data accessed via contextual search would provide insights that aren’t possible today using existing methods of data access, integration and analysis.

Bottom line: Creating high performance clusters in AWS EC2 shows potential to increase the accuracy and precision of business intelligence and analytics, and potentially solve the most complex data-driven challenges of social CRM.


Web 2.0 Summit 2010 Point of Control: The Cloud


O’Reilly Media and UBM TechNet sponsored the Web 2.0 Summit 2010 held in San Francisco from November 15th to 19th.  This event has become one of the premier conferences globally due to the quality of the content and speakers it attracts, and the thought leadership of the concepts presented there.

The following panel discussion includes Marc Benioff, Founder and CEO of Salesforce.com; Andy Jassy, SVP of Amazon Web Services and Amazon Infrastructure; Paul Maritz, President and CEO of VMware; and Tim O’Reilly, founder and CEO of O’Reilly Media, Inc.

The discussion covers application independence in Cloud environments, the future direction of cloud integration technologies, and the emergence of Cloud-based operating systems.

You can find the site by clicking on the Web 2.0 Summit Points of Control map below.

Google Cloud Technologies Overview


Google’s efforts at App Engine evangelism continue to accelerate with the announcement of new APIs and products from Google Labs.

The complete listing of Products in Labs and Graduates of Labs can be found on the Google Code Labs site.
Whereas Amazon Web Services (AWS) frequently changes many elements of its platform, pricing, and services, Google is taking an incremental approach, rolling out new features based on innovation and extensive work in Google Labs.

The following slide deck, authored by Chris Schalk, is a case in point.  Included in this presentation is an update on Google Storage, the Google Prediction API, and Google BigQuery.  It’s an excellent overview of these APIs and services, explaining the evolving role of Google’s cloud technologies in the process.

Amazon Announces S3 Storage Price Reduction


Amazon Web Services announced a new pricing schedule for S3 storage that takes effect today, November 1st.  Existing customers could see as much as a 19% reduction in monthly fees.  Amazon also created a new pricing tier at the 1TB level and deleted the previous 50 – 100 TB tier.  Amazon says these pricing changes apply to the US Standard, EU – Ireland, and APAC – Singapore regions.

The full price list can be found on the Amazon Simple Storage Service (Amazon S3) page here.
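Tiered storage pricing like S3’s bills each slice of usage at the rate of the tier it falls into, which is why adding or removing a tier changes customers’ monthly fees.  The sketch below illustrates the mechanics with made-up tier boundaries and per-GB rates; they are not the prices from this announcement.

```python
# Illustrative tiered-storage billing: each slice of usage is charged
# at its own tier's rate. Boundaries and rates below are hypothetical.

# (upper bound in GB, price per GB-month in USD); None = no upper bound
HYPOTHETICAL_TIERS = [
    (1_000, 0.140),    # first 1 TB
    (50_000, 0.125),   # next 49 TB
    (None, 0.110),     # everything beyond 50 TB
]

def monthly_cost(usage_gb: float, tiers=HYPOTHETICAL_TIERS) -> float:
    """Sum the cost of each tier slice the usage spans."""
    cost, prev_bound = 0.0, 0
    for bound, rate in tiers:
        if bound is None or usage_gb <= bound:
            cost += (usage_gb - prev_bound) * rate
            break
        cost += (bound - prev_bound) * rate
        prev_bound = bound
    return cost

print(round(monthly_cost(500), 2))    # all in the first tier: 70.0
print(round(monthly_cost(2_000), 2))  # spans two tiers: 265.0
```

Note that the 2 TB example pays the first-tier rate on its first terabyte and the cheaper second-tier rate only on the remainder, rather than one flat rate on the whole amount.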


Netflix in the Cloud – Lessons Learned Deploying AWS


Adrian Cockcroft, Cloud Architect at Netflix, recently published a summary slide deck for a presentation he will be giving on November 3rd at QConSF.  It is a fascinating look into how Netflix chose AWS and the lessons learned.  Adrian discusses the presentation on his blog here.

It is going to be very interesting to see the entire slide deck after QConSF, which is when Adrian plans to upload it per a note on his blog.

Test Driving An Amazon EC2 Micro Instance


Last month Amazon Web Services launched Micro Instances for EC2, the lowest-cost instance type they have offered to date.  A Micro instance includes 613MB of memory and can support 32- and 64-bit platforms on both Linux and Windows operating systems.

Pricing begins at $0.02 per hour for Linux and $0.03 per hour for Windows.  A Micro instance also supports Amazon Machine Images (AMIs) for defining applications, data structures and databases in addition to configurations, and Amazon is including a templated image to get up and running quickly.
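At those hourly rates, the cost of leaving a Micro instance running around the clock is simple arithmetic.  The sketch below assumes a 30-day (720-hour) month for illustration; only the $0.02 and $0.03 hourly rates come from the announcement.

```python
# Monthly cost of a continuously running EC2 Micro instance, using the
# hourly rates quoted above and an assumed 30-day (720-hour) month.
HOURS_PER_MONTH = 24 * 30  # assumption: a 30-day month

def micro_monthly_cost(hourly_rate: float, hours: int = HOURS_PER_MONTH) -> float:
    """Cost in USD of running one instance for the whole month."""
    return round(hourly_rate * hours, 2)

print(micro_monthly_cost(0.02))  # Linux:   14.4
print(micro_monthly_cost(0.03))  # Windows: 21.6
```

In other words, a Linux Micro instance left on all month costs under $15, which is what makes the low-end, utilitarian applications discussed below economically plausible.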

Greg Wilson, who is a Sr. Technical Evangelist for Adobe Systems, produced one of the best tutorials and test drives available, which is shown below. He has also written a blog entry regarding lessons learned which can be found at My dive into the world of Amazon EC2 and the new crazy cheap Micro instance.

Bottom line: Micro instances are going to broaden cloud-based development beyond compute- and data-intensive applications to smaller applications and web services.  Given the price point, the use of Micro instances could also lead to a proliferation of new low-end, utilitarian applications.

Ingram Micro Seeing Traction with Cloud Conduit Initiative

Bottom line: Reselling cloud computing services shows much potential as a market for technology platform and application providers. The challenge is tailoring the services mix efficiently and accurately enough to capitalize on the scalability and selective demand of mid-tier and small-business end users.


Architectural Design Patterns in Cloud Computing, Excellent Presentation by AWS

Jinesh Varia of Amazon Web Services (AWS) authored the following presentation, which is an excellent overview of AWS services and the basic terminology used on this specific cloud platform. The presentation describes the lessons learned by AWS in terms of scalability and cloud architectural trade-offs, and also provides guidance on which storage option to choose.

