

What’s New In Gartner’s Hype Cycle For AI, 2020

AI is starting to deliver on its potential and its benefits for businesses are becoming a reality.

  • 47% of artificial intelligence (AI) investments were unchanged since the start of the pandemic and 30% of organizations plan to increase their AI investments, according to a recent Gartner poll.
  • 30% of CEOs own AI initiatives in their organizations and regularly redefine resources, reporting structures and systems to ensure success.
  • AI projects continue to accelerate this year in healthcare, bioscience, manufacturing, financial services and supply chain sectors despite greater economic and social uncertainty.
  • Five new technology categories are included in this year’s Hype Cycle for AI, including small data, generative AI, composite AI, responsible AI and things as customers.

These and many other new insights are from the Gartner Hype Cycle for Artificial Intelligence, 2020, published on July 27th of this year and summarized in the recent article, 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020. Two dominant themes emerge from the 30 diverse AI technologies in this year's Hype Cycle. The first is the democratization, or broader adoption, of AI across organizations; the more democratized AI becomes, the more important developers and DevOps are to creating enterprise-grade applications. The second is the industrialization of AI platforms, driven by reusability, scalability, safety, responsible use of AI and AI governance. The Gartner Hype Cycle for Artificial Intelligence, 2020, is shown below:

Smarter with Gartner, 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020.

Details Of What’s New In Gartner’s Hype Cycle for Artificial Intelligence, 2020

  • Chatbots are the leading AI use case in enterprises today and are projected to see over a 100% increase in adoption rates in the next two to five years. Gartner revised the bots' penetration rate from a range of 5% to 20% last year to 20% to 50% this year. Gartner points to chatbots' successful adoption as the face of AI today and the technology's contributions to streamlining automated, touchless customer interactions aimed at keeping customers and employees safe. Bot vendors to watch include Amazon Web Services (AWS), Cognigy, Google, IBM, Microsoft, NTT DOCOMO, Oracle, Rasa and Rulai.
  • GPU Accelerators are the nearest-term technology to mainstream adoption and are predicted to deliver a high level of benefit, according to Gartner's Priority Matrix for AI, 2020. Gartner predicts GPU Accelerators will see a 100% improvement in adoption in two to five years, increasing from 5% to 20% adoption last year to 20% to 50% this year. Gartner advises its clients that GPU-accelerated computing can deliver extreme performance for highly parallel, compute-intensive workloads such as HPC and DNN training and inferencing. GPU computing is also available as a cloud service; according to the Hype Cycle, it may be economical for applications where utilization is low but the urgency of completion is high.
  • AI-based minimum viable products and accelerated AI development cycles are replacing pilot projects across Gartner's client base due to the pandemic. Before the pandemic, a pilot project's success or failure depended largely on whether it had an executive sponsor and how much influence that sponsor had. Gartner clients are wisely moving to minimum viable products and accelerated AI development to get results quickly during the pandemic. Gartner recommends prioritizing projects involving Natural Language Processing (NLP), machine learning, chatbots and computer vision above other AI initiatives. It also recommends organizations look at insight engines' potential to deliver value across a business.
  • Artificial General Intelligence (AGI) lacks commercial viability today, and organizations need to focus instead on more narrowly scoped AI use cases to get results for their business. Gartner warns there's a lot of hype surrounding AGI, and organizations would do best to ignore vendors' claims of having commercial-grade products or platforms ready today with this technology. A better AI deployment strategy is to consider the full scope of technologies on the Hype Cycle and choose those delivering proven financial value to the organizations adopting them.
  • Small Data is now a category in the Hype Cycle for AI for the first time. Gartner defines this technology as a series of techniques that enable organizations to manage production models that are more resilient and adapt to major world events like the pandemic or future disruptions. These techniques are ideal for AI problems where there are no big datasets available.
  • Generative AI is the second new category added to this year's Hype Cycle. It's defined as various machine learning (ML) methods that learn a representation of artifacts from data and generate brand-new, completely original, realistic artifacts that preserve a likeness to the training data rather than repeating it.
  • Gartner sees potential for Composite AI helping its enterprise clients and has included it as the third new category in this year’s Hype Cycle. Composite AI refers to the combined application of different AI techniques to improve learning efficiency, increase the level of “common sense,” and ultimately to much more efficiently solve a wider range of business problems.
  • Responsible AI, the fourth new category in the Hype Cycle for AI, concentrates on the ethical and social aspects of AI. Gartner defines it as an umbrella term encompassing the many aspects of making the right business and ethical choices when adopting AI that organizations often address independently. These include business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy and regulatory compliance.
  • Exponential gains in accuracy, price/performance and low power consumption, along with Internet of Things sensors that collect AI model data, have led to the fifth new category this year, Things as Customers. Gartner defines Things as Customers as a smart device or machine that obtains goods or services in exchange for payment. Examples include virtual personal assistants, smart appliances, connected cars and IoT-enabled factory equipment.
  • Thirteen technologies have either been removed, re-classified, or moved to other Hype Cycles compared to last year.  Gartner has chosen to remove VPA-enabled wireless speakers from all Hype Cycles this year. AI developer toolkits are now part of the AI developer and teaching kits category. AI PaaS is now part of AI cloud services. Gartner chose to move AI-related C&SI services, AutoML, Explainable AI (also now part of the Responsible AI category in 2020), graph analytics and Reinforcement Learning to the Hype Cycle for Data Science and Machine Learning, 2020. Conversational User Interfaces, Speech Recognition and Virtual Assistants are now part of the Hype Cycle for Natural Language Technologies, 2020. Gartner has also chosen to move Quantum computing to the Hype Cycle for Compute Infrastructure, 2020. Robotic process automation software is now removed from the Hype Cycle for AI, as Gartner mentions the technology in several other Hype Cycles.

Why Cybersecurity Needs To Focus More On Customer Endpoints


  • Cloud-based endpoint protection platforms (EPP) are proliferating across enterprises today as CIOs and CISOs prioritize greater resiliency in their endpoint security strategies going into 2020.
  • Gartner forecasts that global information security and risk management end-user spending will grow at a five-year CAGR of 9.2% to reach $174.5 billion in 2022, with approximately $50B spent on endpoint security.
  • Endpoint security tools are 24% of all IT security spending, and by 2020 global IT security spending will reach $128B according to Morgan Stanley Research.
  • 70% of all breaches still originate at endpoints, despite the increased IT spending on this threat surface, according to IDC.
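The Morgan Stanley figures above imply a specific endpoint dollar figure that's worth making explicit. A quick back-of-the-envelope check (the multiplication is my own arithmetic, not a Morgan Stanley number):

```python
# Endpoint tools at 24% of a projected $128B global IT security market
# implies roughly $30.7B of endpoint security spending by 2020.
total_it_security_spend_b = 128.0   # projected 2020 global IT security spend, $B
endpoint_share = 0.24               # endpoint tools' share of all IT security spending

endpoint_spend_b = total_it_security_spend_b * endpoint_share
print(f"Implied endpoint security spend: ${endpoint_spend_b:.2f}B")
```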

There’s a surge of activity happening right now in enterprises that are prioritizing more resiliency in their endpoint security strategies going into 2020. The factors motivating CIOs, CISOs, IT, and Practice Directors to prioritize endpoint resiliency include more effective asset management based on real-time data while securing and ensuring every endpoint can heal itself using designed-in regenerative software at the BIOS level of every device. CIOs say the real-time monitoring helps reduce asset management operating expense, a big plus many of them appreciate give their tight budgets. Sean Maxwell, Chief Commercial Officer at Absolute, says, “Trust is at the center of every endpoint discussion today as CIOs, CISOs and their teams want the assurance every endpoint will be able to heal itself and keep functioning.”

The Endpoint Market Is Heating Up Going Into 2020

Over thirty vendors are competing in the endpoint security market right now. A few of the most interesting are Absolute Software, Microsoft, Palo Alto Networks and others that are seeing a surge of activity from enterprises, based on discussions with CIOs and CISOs. Absolute Software's Persistence self-healing endpoint security technology is embedded in the firmware of more than 500 million devices and gives CIOs, CISOs and their teams complete visibility and control over devices and data. Absolute is the leading visibility and control platform that provides enterprises with tamper-proof resilience and protection of all devices, data and applications.

Like Absolute, Microsoft builds protection in at the device level; it is the only vendor whose endpoint protection is native to the OS itself. Windows 10 now has Windows Defender Antivirus integrated at the OS level, delivering the same protection System Center Endpoint Protection provides on Windows 7 and 8. The Microsoft Defender Advanced Threat Protection (ATP) incident response console aggregates alerts and incident response activities across Microsoft Defender ATP, Office 365 ATP, Azure ATP and Active Directory, in addition to Azure.

Further evidence of how enterprise customers are placing a high priority on endpoint security is the increase in valuations of key providers in this market, including Absolute Software (TSE: ABT) and others. Absolute's stock price has jumped 13% in just a month, following the company's latest earnings announcement on November 12th. Absolute's CEO Christy Wyatt commented during the most recent earnings call: "The ability to utilize near real-time data from the endpoint to… to deliver actionable insights to IT about where controls are failing and the ability to apply resilience to self-heal and reinforce those security controls will become a critical skill for every one of our customers. This is the essence of Absolute's platform, which adds resiliency to our customers' operations." It's evident from what CIOs and CISOs are saying that resiliency is transforming endpoint security today and will accelerate in 2020.

Key Takeaways From Conversations With Enterprise Cybersecurity Leaders

The conversations with CIOs, CISOs, and IT Directors provided valuable insights into why resiliency is becoming a high priority for endpoint security strategies today. The following are key takeaways from the conversations:

  • Known humorously as the "fun button," the ability to brick any device at any time while monitoring its activity in real-time is a favorite of cybersecurity teams. One CIO told the story of how their laptops had been given to a service provider who was supposed to destroy them to stay in compliance with the Health Insurance Portability and Accountability Act (HIPAA), and one had been resold on the black market, ending up in a third-world country. As the hacker attempted to rebuild the machine, the security team watched as each new image was loaded, at which time they would promptly brick the machine. After 19 tries, the hacker gave up and called the image rebuild "brick me."
  • IT budgets for 2020 are flat or slightly up, with many CIOs being given the goal of reducing asset management operating expenses, making resiliency ideal for better managing device costs. The more effectively assets are managed, the more secure an organization becomes. That's another factor motivating enterprises to adopt resiliency as a core part of their endpoint security strategies.
  • One CIO was adamant they had nine software agents on every endpoint, but Absolute’s Resilience platform found 16, saving the enterprise from potential security gaps. The gold image an enterprise IT team was using had inadvertently captured only a subset of the total number of software endpoints active on their networks. Absolute’s Resilience offering and Persistence technology enabled the CIO to discover gaps in endpoint security the team didn’t know existed before.
  • Endpoints enabled with Resilience have proven their ability to autonomously self-heal, earning the trust of CIOs and CISOs, who are adopting Absolute to alleviate costly network interruptions and potential breaches. 19% of endpoints across a typical IT network require at least one client or patch management repair monthly, according to Absolute's 2019 Endpoint Security Trends Report. The report also found that increasing security spending on protecting endpoints doesn't increase an organization's safety, and in some instances reduces it. Having a systematic, designed-in solution to these challenges gives CIOs, CISOs and their teams greater peace of mind and reduces the expensive interruptions and potential breaches that impede their organizations' growth.

 

5 Proven Ways Manufacturers Can Get Started With Analytics


Going into 2020, manufacturers are at an inflection point in their adoption of analytics and business intelligence (BI). Analytics applications and tools make it possible for them to gain greater insights from the massive amount of data they produce every day. And with manufacturing leading all industries in the amount of data generated from daily operations, the potential to improve shop floor productivity has never been more within reach for those adopting analytics and BI applications.

Analytics and BI Are High Priorities In Manufacturing Today

Increasing yield rates and quality levels for each shop floor, machine and work center is a high priority for manufacturers today. Add to that the pressure to stay flexible and take on configure-to-order and engineer-to-order special products fulfilled through short-notice production runs, and the need for more insight into how each phase of production can be improved becomes clear. Gartner's latest survey of heavy manufacturing CIOs, the 2019 CIO Agenda: Heavy Manufacturing, Industry Insights, by Dr. Marc Halpern, October 15, 2018 (Gartner subscription required), reflects the reality all manufacturers are dealing with today. I believe they're in a tough situation, with customers wanting short-notice production time while supply chains often need to be redesigned to reduce or eliminate tariffs. They're turning to analytics to gain the insights they need to take on these challenges and more. The graphic below, from Gartner's survey, indicates the technology areas where heavy manufacturing CIOs' organizations will be spending the largest amount of new or additional funding in 2019, as well as the technology areas where their organizations will be reducing funding by the highest amount in 2019 compared with 2018:

Knowing Which Problems To Solve With Analytics

Manufacturers getting the most value from analytics start with a solid business case, based on a known problem they've been trying to solve in their supply chains, production or fulfillment operations. The manufacturers I've worked with focus on how to get more orders produced in less time while gaining greater visibility across production operations. They're all under pressure to stay in compliance with customer and regulatory reporting, in many cases needing to ship product quality data with each order and host 60 to 70 customer audits a year in their plants. Analytics is becoming popular because it automates the drudgery of reporting that would otherwise take IT teams days or weeks to do manually.

As one CIO put it as we walked his shop floor, "We're using analytics to do the heavy data crunching when we're hosting customer audits so we can put our quality engineers to work raising the bar of product excellence instead of having them run reports for a week." As we walked the shop floor, he explained how dashboards are tailored to each role in manufacturing and how flat-screen monitors provide real-time data on five key areas of performance. Like many other CIOs facing the challenge of improving production efficiency and quality, he's relying on the five core metrics below in the initial roll-out of analytics across manufacturing operations, finance, accounting, supply chain management, procurement and service:

  • Manufacturing Cycle Time – One of the most popular metrics in manufacturing, Cycle Time quantifies the elapsed time from when an order is placed until the product is manufactured and entered into finished goods inventory. Cycle times vary by segment of the manufacturing industry, size of manufacturing operation, global location and relative stability of the supply chains supporting operations. Real-time integration, applying Six Sigma to identify process bottlenecks and re-engineering systems to be more customer-focused all improve this metric's performance. Cycle Time is also a leading indicator of manufacturing performance, as it immediately captures improvements made across systems and processes.
  • Supplier Inbound Quality Levels – Measuring the dimensions of how effective a given supplier is at consistently meeting a high level of product quality and on-time delivery is valuable in orchestrating a stable supply chain. Inbound quality levels often vary from one shipment to the next, so it’s helpful to have Statistical Process Control (SPC) charts that quantify and show the trends of quality levels over time. Nearly all manufacturers are relying on Six Sigma programs to troubleshoot specific trouble spots and problem areas of suppliers who may have wide variations in product quality in a given period. This metric is often used for ranking which suppliers are the most valuable to a factory and production network as well.
  • Production Yield Rates By Product, Process, and Plant Location – Yield rates reflect how efficient a machine or entire process is at transforming raw materials into finished products. Manufacturers rely on automated and manual approaches to capture this metric, with the latest generation of industrial machinery capable of reporting its own yield rates over time. Process-related manufacturers rely on this metric to manage every production run. Microprocessor, semiconductor and integrated circuit manufacturers continually monitor yield rates to determine how they are progressing against plans and goals. Greater real-time integration, improved quality management systems, and greater supply chain quality and compliance all have a positive impact on yield rates, which reflect how well-orchestrated entire production processes are.
  • Perfect Order Performance – Perfect order performance measures how effective a manufacturer is at delivering complete, accurate, damage-free orders to customers on time. The Perfect Order Index (POI) is calculated as (percent of orders delivered on time) × (percent of orders complete) × (percent of orders damage-free) × (percent of orders with accurate documentation) × 100. The majority of manufacturers attain a perfect order performance level of 90% or higher, according to the American Productivity and Quality Center (APQC). The more complex the product lines and configuration options, including build-to-order, configure-to-order and engineer-to-order, the more challenging it is to attain a high perfect order level. Greater analytics and insights gained from real-time integration and monitoring help complex manufacturers attain higher perfect order levels over time.
  • Return Material Authorization (RMA) Rate as % Of Manufacturing – This metric measures the percentage of products shipped to customers that are returned due to defective parts or otherwise not meeting requirements. RMAs are a good leading indicator of potential quality problems. RMAs are also a good measure of how well integrated PLM, ERP and CRM systems are, as tighter integration results in fewer product errors.
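The Perfect Order Index formula above is simple enough to compute directly. Here's a minimal sketch with illustrative component rates (the figures are hypothetical, not from any specific manufacturer):

```python
def perfect_order_index(on_time, complete, damage_free, accurate_docs):
    """Perfect Order Index: the product of the four component rates, as a percent.

    Each argument is a fraction between 0 and 1.
    """
    return on_time * complete * damage_free * accurate_docs * 100

# Hypothetical plant: 95% on time, 98% complete, 99% damage-free, 97% accurate docs.
poi = perfect_order_index(0.95, 0.98, 0.99, 0.97)
print(f"Perfect Order Index: {poi:.1f}%")
```

Note how four individually strong component rates still compound to a POI below 90%, which is why complex configure-to-order product lines find high perfect order levels so hard to attain.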

Conclusion

The manufacturers succeeding with analytics start with a compelling business case, one that has an immediate impact on the operations of their organizations. CIOs are prioritizing analytics and BI to gain greater insights and visibility across every phase of manufacturing. They’re also adopting analytics and BI to reduce the reporting drudgery their engineering, IT, and manufacturing teams are faced with as part of regular customer audits. There are also a core set of metrics manufacturers rely on to manage their business, and the five mentioned here are where many begin.

CIO’s Guide To Stopping Privileged Access Abuse – Part I

CIOs face the paradox of having to protect their businesses while at the same time streamlining access to the information and systems their companies need to grow. The threatscape they’re facing requires an approach to security that is adaptive to the risk context of each access attempt across any threat surface, anytime. Using risk scores to differentiate between privileged users attempting to access secured systems in a riskier context than normal versus privileged credential abuse by attackers has proven to be an effective approach for thwarting credential-based breaches.

Privileged credential abuse is one of the most popular breach strategies organized crime and state-sponsored cybercrime organizations use. They’d rather walk in the front door of enterprise systems than hack in. 74% of IT decision makers surveyed whose organizations have been breached in the past say it involved privileged access credential abuse, yet just 48% have a password vault. Just 21% have multi-factor authentication (MFA) implemented for privileged administrative access. These and many other insights are from Centrify’s recent survey, Privileged Access Management in the Modern Threatscape.

How CIOs Are Solving the Paradox of Privileged Credential Abuse

The challenge to every CIO’s security strategy is to adapt to risk contexts in real-time, accurately assessing every access attempt across every threat surface, risk-scoring each in milliseconds. By taking a “never trust, always verify, enforce least privilege” approach to security, CIOs can provide an adaptive, contextually accurate Zero Trust-based approach to verifying privileged credentials. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment.

By taking a least privilege access approach, organizations can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse by knowing that even if a privileged user has entered the right credentials but the request comes in with risky context, then stronger verification is needed to permit access.

Strategies For Stopping Privileged Credential Abuse

The following are five strategies CIOs need to concentrate on to stop privileged credential abuse. Starting with an inventory of privileged accounts and progressing through finding the gaps in IT infrastructure that create opportunities for privileged credential abuse, CIOs and their teams need to take preemptive action now to avert potential breaches in the future.

In Part 1 of a CIO’s Guide to Stopping Privileged Access Abuse, below are the steps they can take to get started:

  1. Discover and inventory all privileged accounts and their credentials to define who is accountable for managing their security and use. According to a Gartner survey, more than 65% of enterprises allow shared use of privileged accounts with no accountability for their use. CIOs realize that a lack of consistent governance policies creates many opportunities for privileged credential abuse. They're also finding orphaned accounts, multiple owners for privileged credentials, and system administrators holding superuser or root access rights across the majority of enterprise systems.
  2. Vault your cloud platforms' root accounts and federate access to AWS, Google Cloud Platform, Microsoft Azure and other public cloud consoles. Root passwords on each of the cloud platforms your business relies on are the "keys to the kingdom" and allow bad actors from inside or outside the company to exfiltrate data with ease. The recent news of how a fired employee deleted his former employer's 23 AWS servers is a cautionary tale of what happens when a Zero Trust approach to privileged credentials isn't adopted. Centrify's survey found that 63% of organizations take more than a day to shut off privileged access for an employee after they leave the company. Given that AWS root user accounts have the privilege to delete all instances immediately, it's imperative for organizations to have a password vault where AWS root account credentials are stored. Instead of local AWS IAM accounts and access keys, use centralized identities (e.g., Active Directory) and enable federated login. By doing so, you obviate the need for long-lived access keys.
  3. Audit privileged sessions and analyze patterns to find potential privileged credential sharing or abuse that isn't immediately obvious from audits. Audit and log authorized and unauthorized user sessions across all enterprise systems, focusing especially on root password use across all platforms. Taking this step is essential for assigning accountability for each privileged credential in use. It will also tell you if privileged credentials are being shared widely across the organization. Taking a Zero Trust approach to securing privileged credentials will quickly find areas where there could be potential lapses or gaps that invite breaches. For AWS accounts, be sure to use AWS CloudTrail and Amazon CloudWatch to monitor all API activity across all AWS instances and your AWS account.
  4. Enforce least privilege access now within your existing infrastructure as much as possible, defining a security roadmap based on the foundations of Zero Trust as your future direction. Using the inventory of all privileged accounts as the baseline, update least privilege access on each credential now and implement a process for privilege elevation that will lower the overall risk and attackers' ability to move laterally and extract data. The days of "trust but verify" are over. CIOs from insurance and financial services companies I've recently spoken with point out that their new business models, all of them heavily reliant on secured Internet connectivity, are making Zero Trust the cornerstone of their future services strategies. They're all moving beyond "trust but verify" to adopt a more adaptive approach to knowing the risk context by threat surface in real-time.
  5. Adopt multi-factor authentication (MFA) across all threat surfaces so that it can adapt and flex to the risk context of every request for resources. The CIOs running a series of insurance and financial services firms, a few of them former MBA students of mine, say multi-factor authentication is a must-have today for preventing privileged credential abuse. Their take is that adding an authentication layer that queries users for something they know (user name, password, PIN or security question), something they have (smartphone, one-time password token or smart card), something they are (biometric identification such as a fingerprint) and something they've done (contextual pattern matching of what they normally do and where) has dramatically reduced privileged credential abuse since they adopted it. This is low-hanging fruit: adaptive MFA has made the productivity impact of this additional validation practically moot.
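The adaptive, risk-context approach described in steps 4 and 5 can be sketched as a simple scoring function. The factors, weights and thresholds below are hypothetical illustrations of the pattern, not any vendor's actual model:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    known_device: bool        # is the request coming from a managed, known device?
    usual_location: bool      # does the location match the user's normal pattern?
    usual_hours: bool         # is the request inside the user's normal working hours?
    privileged_target: bool   # is the target a privileged system or root account?

def risk_score(req: AccessRequest) -> int:
    """Toy risk score: higher means a riskier context. Weights are illustrative."""
    score = 0
    if not req.known_device:
        score += 40
    if not req.usual_location:
        score += 30
    if not req.usual_hours:
        score += 10
    if req.privileged_target:
        score += 20
    return score

def required_verification(req: AccessRequest) -> str:
    """Map the score to an adaptive response: allow, step up to MFA, or block."""
    score = risk_score(req)
    if score < 30:
        return "allow"
    if score < 70:
        return "mfa"    # step-up: something you have or something you are
    return "block"

# Normal context, non-privileged system: credentials alone suffice.
print(required_verification(AccessRequest(True, True, True, False)))
# Unknown device requesting a privileged system: step up to MFA.
print(required_verification(AccessRequest(False, True, True, True)))
```

The point of the sketch is the shape of the decision, not the numbers: the right credentials presented in a risky context trigger stronger verification rather than automatic access.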

Conclusion

Every CIO I know is now expected to be a business strategist first and a technologist second. At the top of many of their lists of priorities is securing the business so it can achieve uninterrupted growth. The CIOs I regularly speak with who run insurance and financial services companies often say security is as much a part of their new business strategies as the financial products their product design teams are developing. The bottom line is that the more adaptive a company's access management posture becomes, and the better it can assess the risk context of each privileged access attempt, the more responsive the company can be to employees and customers alike, fueling future growth.

Gartner’s Top 10 Strategic Technology Trends For 2015

Gartner presented their top 10 strategic technology trends for 2015 at the annual Gartner Symposium/ITxpo 2014, held in Orlando earlier this month. Computing Everywhere, the Internet of Things (IoT) and 3D Printing are projected to be the three most important strategic technology trends in 2015.

3D Printing Will Continue To Revolutionize Prototyping And Manufacturing  

3D printing is forecast to reach a tipping point in the next three years due to streamlined prototyping and short-run manufacturing. Improving time-to-market, ensuring greater accuracy of highly customized products and reducing long-term production costs are three of the many benefits companies are adopting 3D printing for today. Be sure to read Larry Dignan's excellent post covering the conference and the top ten strategic technology trends, 3D printing turns strategic in 2015, says Gartner.

Taking Analytics To The Next Level in 2015

Advanced, persuasive and invisible analytics, context-rich systems, and smart machines also are included in the top 10 strategic technology trends for 2015. Given how quickly analytics is maturing as a technology category, it’s understandable why Gartner ranked this area as the 4th most strategic.  In 2015, analytics will move beyond providing dashboards with metrics and Key Performance Indicators (KPIs) to a more intuitive series of applications that give business analysts the flexibility to define models and test them in real-time. Alteryx and Tableau are interesting companies to watch in this area and Tableau Public is worth checking out and learning due to its advanced visualization features (free, opt-in).

Cloud Computing Becomes Part Of The New IT Reality

The last four technology trends Gartner mentions include cloud/client computing, software-defined applications and infrastructure, Web-scale IT and risk-based security and self-protection.

The following graphic provides an overview of the top 10 strategic technology trends for 2015.

Gartner’s Top 10 Strategic Technology Trends For 2015

Sizing the Public Cloud Services Market

Gartner’s latest forecast of the public cloud services market predicts that by 2015, this worldwide market will be worth $176.8 billion, achieving a five-year compound annual growth rate (CAGR) of 18.9%.
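The forecast figures above can be sanity-checked with the standard CAGR formula. A minimal sketch in Python (the implied 2010 base value is derived from the article’s numbers, not stated by Gartner in this excerpt):

```python
def cagr(begin_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / begin_value) ** (1 / years) - 1

# Work backward from the 2015 forecast to the implied 2010 market size.
end_2015 = 176.8                       # $B, Gartner's 2015 forecast
rate = 0.189                           # 18.9% five-year CAGR
implied_2010 = end_2015 / (1 + rate) ** 5
print(f"Implied 2010 market: ${implied_2010:.1f}B")

# Sanity check: the implied base compounds back to the stated growth rate.
assert abs(cagr(implied_2010, end_2015, 5) - rate) < 1e-9
```

Running this puts the implied 2010 base market at roughly $74–75 billion, consistent with an 18.9% CAGR reaching $176.8 billion in 2015.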

Their latest forecast is based on defining the public cloud services market from revenue generation, not an IT spending perspective.  This is in contrast to the public cloud services forecast IDC also released this week, stating that public IT cloud services spending would reach $72.9B by 2015.  Of the two approaches, the one that is revenue-based delivers a more granular, detailed look at Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) challenges and opportunities for growth (see tables below for details).  The Gartner report, Public Cloud Services, Worldwide and Regions, Industry Sectors, 2010-2015, 2011 Update, was published on June 29, 2011.

Gartner’s decision to base their methodology on revenue generated versus pure IT spending opens up the potential to evaluate entirely new business models based on services growth.  The forecast is based on revenue either directly or indirectly generated from the sales of services, whether to enterprises or to consumers.  Business process services are defined in this forecast as any process that can be delivered as a service over a scalable, elastic and secure web connection.  This includes advertising, payroll, printing and e-commerce, in addition to application and systems infrastructure services. Presented below are key take-aways and analysis from the report.

Key Take-Aways

  • By 2015, the total market will be worth $176.8 billion, which represents a five-year compound annual growth rate (CAGR) from 2010 of 18.9%. The largest part of this is revenue derived from advertising that is used to provide IT services ($77.1 billion in 2015), which represents an addition to the total size of the IT market.
  • The transition of software from licensed to service models continues, but it has yet to reach breakthrough proportions (9.6% in 2010, rising to 13.8% in 2015). Traditional outsourcing services also continue to transition to cloud delivery models, involving a high degree of service standardization. Gartner continues to take a conservative view of revenue recognition in terms of SaaS adoption compared to other research firms, as is shown in the following table.

  • Application and systems infrastructure are projected to grow the fastest in terms of revenue generation through 2015, with advertising-related revenue being a significant proportion of the total public cloud services market through the forecast period.  The following table breaks out public cloud revenue globally by business process services, applications, application infrastructure and systems infrastructure.
  • The high-tech, manufacturing and financial services sectors and the public sector will continue to be the most aggressive adopters of cloud services through 2015.  Presented below is a table comparing cloud services revenue by industry sector.
  • The North American market continues to be, by far, the largest regional market, currently representing 60% of the global market, though China shows interesting growth potential.
  • Financial services organizations in aggregate represent the largest users of public cloud services.
  • Some smaller countries will demonstrate very high growth (more than 25%) in e-commerce cloud services, because of high growth in underlying retail e-commerce. The Census Bureau of the U.S. Department of Commerce estimates that e-commerce sales in the fourth quarter of 2010 accounted for 4.3% of total U.S. retail sales.

Bottom line: Taking a revenue-based approach to defining cloud services shows how critical application and systems infrastructure is to overall market growth.  Gartner predicts the fastest-growing revenue-generating segment of public cloud services will be storage services (89.5%), followed by compute services (47.8%) and supply management (39.5%).
