Amazon Web Services has released the following video that provides a fascinating look at how straightforward it is to create, launch and monitor high performance cluster instances.
The video tracks CPU utilization, disk I/O and network utilization as part of the metrics, and also provides guidance on configuring hardware-assisted virtualization (HVM). Its steps walk through creating an ad hoc, 8-node, 64-core cluster with the intent of running a molecular dynamics simulation.
What is interesting about this video is how Amazon Web Services continues to show the practicality of its broad spectrum of server capacities on the Elastic Compute Cloud (EC2). This is the first in a series of videos Amazon Web Services will be releasing on creating high performance clusters. It’s worth checking out as the walk-through of steps shows how rapidly EC2 is maturing as an enterprise platform.
Implications for the Enterprise
EC2 has language-agnostic Web Services APIs that show potential for integrating legacy systems, databases, master data management (MDM), CRM and enterprise systems. For enterprises that have data-centric operations and business models, EC2 could become the foundation of contextual search and role-based access to their legacy data. Decades of data accessed via contextual search would provide insights that aren’t possible today using existing methods of data access, integration and analysis.
Bottom line: Creating high performance clusters in AWS EC2 shows potential to increase the accuracy and precision of business intelligence and analytics, and potentially solve the most complex data-driven challenges of social CRM.
During the last four months of 2010, the pace of published forecasts on cloud computing, IaaS, PaaS and SaaS quickened, yielding an eclectic and at times conflicting view of this emerging market. Judging from the daily Google Alerts, RSS feeds, e-mail subscriptions and offers to buy cloud computing research reports I receive, the pace is being matched by the variety of research being completed.
I did a quick review of the term “cloud computing” on Google Insights for Search, which produced the following graphic. Google Insights for Search is an excellent analytical tool, as it will render a forecast based on previous results and show geographic concentrations. Please click on the image to expand it for easier viewing.
Cloud Computing Was Gartner’s Most Popular Inquiry Topic Last Year
Gartner analyst Ben Pring sums it all up when he writes in the report, The Influence of Cloud in Outsourcing, 2010-2011 that cloud computing was the #1 area of inquiry for the advisory firm in 2010. The Google Insights analysis and the proliferation of reports underscore that point.
2011: When Cloud Computing Customer Results Became King
You can debate which area of the hype cycle the industry is in, yet after reviewing all these forecasts and projections, the urgent need for real-world results is clear. As 2011 begins, any software company that has measurable results from customers, not just projections, for its cloud and SaaS-based strategies will be much further ahead of the mainstream.
Hopefully this year the research firms will cite more users than ever before and anchor these forecasts, as varied as they are, back to customer results. That said, the energy and intensity going into forecasting the cloud computing and SaaS markets is impressive.
Experton Group forecasts that the German cloud computing market will grow from EUR 1.14 billion in 2010 to EUR 8.2 billion in 2015, equal to average annual growth of 48 percent. In 2015, cloud computing will account for around 10 percent of total IT expenditure in Germany. Around half of revenue in 2015 will be generated from cloud services, with a third coming from investment in cloud infrastructure, mainly data centres. The use of so-called ‘private clouds’ by businesses will account for EUR 2.6 billion in revenues by 2015, up from EUR 400 million in 2010. Source: http://professional.wsj.com/article/TPDMEUR00020101007e6a700061.html
Gartner forecasts worldwide software as a service (SaaS) revenue within the enterprise application software market will reach $9.2 billion in 2010, up 15.7 percent from 2009 revenue of $7.9 billion. The market is projected for stronger growth in 2011, with worldwide SaaS revenue totaling $10.7 billion, a 16.2 percent increase over 2010 revenue. These market forecasts are included in the report Forecast Analysis: Software as a Service, Worldwide, 2009-2014, Update.
Source: http://www.gartner.com/it/page.jsp?id=1492814
Gartner analysts write in the report Predicts 2011: New Relationships Will Change BI and Analytics, that by 2013, 33% of business intelligence functionality will be consumed via handheld devices, and 15% of BI deployments will combine BI, collaboration and social software into decision-making environments. By 2014, 30% of analytic applications will use in-memory functions to add scale and computational speed. In addition, 30% of analytic applications will use proactive, predictive and forecasting capabilities and 40% of spending on business analytics will go to system integrators, not software vendors. All of this is predicated on the security and scalability of cloud-based analytics.
Source: Predicts 2011: New Relationships Will Change BI and Analytics
International Data Corporation (IDC) expects the automated software quality (ASQ) and emerging testing as a service (TaaS) segments of the market to generate a 35.9% CAGR from 2009 to 2014 and $954 million in projected revenue in 2014.
Source: Worldwide Automated Software Quality Software as a Service and Testing as a Service 2010–2014 Forecast and 2009 Vendor Shares: Driving Cloud Quality http://www.idc.com/research/viewdocsynopsis.jsp?containerId=225003&sectionId=null&elementId=null&pageType=SYNOPSIS
TechMarketView predicts the value of the UK cloud computing market will more than double between now and 2014 from £2.4bn to £6.1bn according to the study UK Software and IT Services Market Forecast published in December by the firm.
MarketsandMarkets.com, in its report Cloud Computing Market – Global Forecast (2010-2015), predicts the global cloud computing market will grow from $37.8 billion in 2010 to $121.1 billion in 2015, a CAGR of 26.2%. SaaS is the largest contributor to the cloud computing services market, accounting for 73% of the market’s revenues in 2010. Source: http://www.marketsandmarkets.com/Market-Reports/cloud-computing-234.html
Pike Research has released the report Cloud Computing Energy Efficiency, which is one of the most ambitious to date in quantifying the sustainability advantages of cloud computing. The analysis states that the adoption of cloud computing services will lead to a 38% reduction in worldwide data center energy expenditures by 2020, lowering total data center energy costs from $23.3bn in 2010 to $16.0bn in 2020. Another key finding is a 28% reduction in GHG emissions from 2010 levels. Source: http://energyefficiency.cleantechnology-business-review.com/news/cloud-computing-to-cut-38-of-data-center-energy-costs-by-2020-pike-research-071210
Renub Research has made the following predictions in their latest report, Cloud Computing – SaaS, PaaS, IaaS Market, Mobile Cloud Computing, M&A, Investments, and Future Forecast, Worldwide. Here are the key takeaways from the summary of the study sent to me:
Worldwide Cloud Computing market is growing at a rapid rate and it is expected to cross $25 Billion by the end of 2013
Renub predicts the Platform as a Service (PaaS) market size will reach US$ 400 Million by the year 2013
Renub also predicts that the Infrastructure as a Service (IaaS) market will increase at a CAGR of 52.53% for the period 2010-2013
US Federal IT budget devoted to Cloud Computing Spending will reach nearly US$ 1 Billion by 2014
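The growth rates quoted in these forecasts can be sanity-checked with a quick compound annual growth rate (CAGR) calculation. A minimal sketch in Python, using the Experton and MarketsandMarkets figures above:

```python
def cagr(start, end, years):
    """Compound annual growth rate between a start and end value."""
    return (end / start) ** (1 / years) - 1

# Experton: EUR 1.14bn (2010) to EUR 8.2bn (2015)
print(round(cagr(1.14, 8.2, 5) * 100, 1))   # 48.4, close to the quoted 48 percent

# MarketsandMarkets: $37.8bn (2010) to $121.1bn (2015)
print(round(cagr(37.8, 121.1, 5) * 100, 1))  # 26.2, matching the quoted CAGR
```

Running the numbers this way is a useful habit when reading reports like these, since published CAGRs and endpoint figures do not always agree.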
Happy New Year and I hope you find these links useful. I’ve been tracking this activity a while and thought this would be a good time to publish the list.
The emerging field of data science is a fascinating one that has major implications for the potential of cloud-based analytics, CRM, search, supply chain management and logistics.
Instead of relying purely on latent semantic indexing or the Google PageRank algorithm to define the relevance of a search, data science techniques analyze content and its context to determine relevance. Google today looks at the content of a page; data science also considers the data surrounding it.
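As a toy illustration of that difference (the 70/30 weighting and the context fields here are invented for the example, not Google's or anyone's production algorithm), a data-science-style ranker might blend a content match score with signals drawn from a document's surrounding context:

```python
def relevance(doc, query_terms):
    """Toy score: a content match blended with contextual signals.
    The weighting and the context fields are illustrative only."""
    words = doc["text"].lower().split()
    content = sum(words.count(t) for t in query_terms) / max(len(words), 1)
    # Context: signals surrounding the page, not the page text itself.
    context = 0.5 * min(doc["inbound_links"] / 100, 1.0) + 0.5 * doc["freshness"]
    return 0.7 * content + 0.3 * context

docs = [
    {"id": "archived-post", "text": "cloud crm forecast",
     "inbound_links": 5, "freshness": 0.1},
    {"id": "fresh-post", "text": "cloud crm analytics",
     "inbound_links": 90, "freshness": 0.9},
]
# Both documents match "cloud crm" equally well; context breaks the tie.
ranked = sorted(docs, key=lambda d: relevance(d, ["cloud", "crm"]), reverse=True)
print(ranked[0]["id"])  # fresh-post
```

A pure content-based ranker would score these two documents identically; it is the surrounding data that separates them.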
Earlier this month TechCrunch published the blog post Marissa Mayer’s Next Big Thing: “Contextual Discovery” — Google Results Without Search. The techniques of contextual discovery Google is experimenting with rely on very rapid aggregation and transformation of data, which are part of the methodologies of data science. When Google moves fully into contextual discovery, the potential exists for cloud-based analytics, CRM, search, supply chain management and logistics to be completely revolutionized, as contextual discovery solves the big data problems associated with each of these areas.
In CRM, this would mean finally being able to access external and internal content (including the massive amount of data on social networks), aggregate the data, and transform it into meaningful analysis. The vision of social CRM would be realized once data science serves as the catalyst of contextual search or as Google calls it, contextual discovery.
Exploring Data Science
Two of the best blog posts on the emerging topic of data science are both from O’Reilly Radar: What is data science? by Mike Loukides and Six months after “What is data science?” by Mac Slocum. Both are worth reading and giving some serious thought to. O’Reilly has also created a free report titled What is Data Science?, which can be downloaded here.
Authors Mike Loukides and Mac Slocum set the foundation for how transformational data science can be by concentrating on the nascent area of data products. A data product is the result of accessing, aggregating and transforming content regardless of its location – and capturing data on its attributes, not just the data itself. Both authors point to recommendation systems and guided recommendation engines on e-commerce sites as just the beginning. Yet after reading their assessments and listening to Roger Magoulas, O’Reilly’s Director of Research, interviewed about data science below, there are clearly many more potential uses of this evolving area.
Potential Impact of Data Science on Analytics
The blog posts by Mike Loukides and Mac Slocum go into detail explaining how each area of data science is in varying levels of maturity. After reading these over and considering the big data problems in cloud-based analytics, CRM, search, supply chain management and logistics, the following methodology starts to make sense:
Access – For data science to realize its full potential, there needs to be a technology layer that provides real-time access to structured and unstructured content both within and outside an enterprise. More than a traditional Enterprise Application Integration (EAI) layer, the technologies driving data access need to selectively pull content from every unstructured and structured data source available. Mike Loukides mentions Google Goggles and how MapReduce has made this application possible. Hadoop, as a means to create greater access across federated content, has much potential in this phase as well.
Aggregate – Called data conditioning by Mike Loukides, the aggregation phase is where contextual discovery happens. This could be accomplished through contextual search filters, taxonomies defined by specific alerts, or the use of the MapReduce and Hadoop query and relevance tools in use today.
Transform – Where Hadoop could be used to drive data analysis, or what Mike Loukides calls data jiujitsu. Both Mike Loukides and Mac Slocum mention examples, including the Hadoop Online Prototype (HOP), which supports real-time stream processing, among several others. The impact of the access, aggregate and transform methodology on visualization can be seen at Flowing Data, one of the best sites on the Web for seeing how MapReduce, Hadoop and other data science-related techniques are taking on massive amounts of data and delivering insights.
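The access, aggregate and transform methodology can be sketched in miniature as a single-process, MapReduce-style term count (a toy sketch: the source strings stand in for federated content, and a real deployment would use Hadoop across many machines):

```python
from collections import Counter
from functools import reduce

# Access: pull raw text from heterogeneous sources (stubbed as strings here).
sources = [
    "cloud analytics for crm",           # e.g. an internal database export
    "crm data and cloud data science",   # e.g. a crawled social feed
]

# Aggregate ("data conditioning"): map each record to its term counts.
mapped = [Counter(text.split()) for text in sources]

# Transform: reduce the partial counts into one aggregate view for analysis.
totals = reduce(lambda a, b: a + b, mapped, Counter())
print(totals["cloud"], totals["crm"], totals["data"])  # 2 2 2
```

The map and reduce steps are independent per record, which is exactly what lets frameworks like Hadoop distribute them across a cluster.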
Conclusion
Solving the big data problems of social media monitoring, sentiment analysis, forming a scalable platform for social CRM, and integrating CRM, supply chain management and logistics data into demand management – and tying all of these areas to financial performance – is potentially achievable with data science. Deployed as a cloud-based platform, data science opens up even greater potential for getting the most from social networks, free data sources and third-party databases than is possible today.
Be sure to check out the video below of Roger Magoulas, O’Reilly’s Director of Research, where he was interviewed about data science.
Bottom line: Cloud-based applications combined with the ability to measure program and strategy results are accelerating the efficiency and focus of marketing, bringing an entirely new level of intensity to this area.
The bottom line is that the need to stay current on analytics, business intelligence, data mining and linguistic modeling using cloud-based data is going to grow in importance faster than anyone expects.
Bottom line: Altimeter and Web Analytics Demystified have delivered a landmark report on social marketing analytics that is both pragmatic in advice and strategic in the framework delivered.
Bottom line: SAS has set a new standard on how to successfully launch a new service using social media. There’s great irony in the fact they are immediately analyzing this launch using their own software – what a great series of lessons for so many companies who will launch new apps and services this year.
The optimal or maximum group of friends any person can keep up with is 150, or to be precise, 147.8, according to Dr. Robin Dunbar, Professor of Evolutionary Anthropology, University of Oxford. He has further stated that language is the means of “social grooming” which conjures up an image you would expect from an anthropologist, namely one of reciprocity and mutual support through contact.
Popularized by best-selling books, including one of my favorites, The Tipping Point: How Little Things Can Make a Big Difference by Malcolm Gladwell, and by vocal discussions from Chris Brogan, Seth Godin and others, the Dunbar Number continues to be either assailed as irrelevant or praised as the truth.
Sixteen Ventures has updated one of their more popular reports, SaaS Revenue Modeling: Details of the 7 Revenue Streams. It's an excellent report to review on the dynamics of SaaS pricing and strategy, areas Sixteen Ventures specializes in. You can view the report on Slideshare below and also download the slides here.
Bottom line: Creating passionate users starts when applications become integral to attaining their goals, and the quick delivery times of SaaS apps can do that when they are developed with a mindset of user experience over development expediency.