Worldwide end-user spending on public cloud services is forecast to grow 23.1% in 2021 to total $332.3 billion, up from $270 billion in 2020.
Gartner predicts worldwide end-user spending on public cloud services will jump from $242.6B in 2019 to $692.1B in 2025, attaining a 16.1% Compound Annual Growth Rate (CAGR).
Spending on SaaS cloud services is predicted to grow 19.3% this year to reach $122.6B, rising to $145.3B in 2022.
These and many other insights are from Gartner Forecasts Worldwide Public Cloud End-User Spending to Grow 23% in 2021. The pandemic created the immediate need for virtual workforces and cloud resources to support them at scale, accelerating public cloud adoption in 2020 with momentum continuing this year. Containerization, virtualization, and edge computing have quickly become more mainstream and are driving additional cloud spending. Gartner notes that CIOs face continued pressures to scale infrastructure that supports moving complex workloads to the cloud and the demands of a hybrid workforce.
Key insights from Gartner’s latest forecast of public cloud end-user spending include the following:
36% of all public cloud services revenue is from SaaS applications and services this year, projected to reach $122.6B with CRM being the dominant application category. Customer Experience and Relationship Management (CRM) is the largest SaaS segment, growing from $44.7B in 2019 to $99.7B in 2025, attaining a 12.14% CAGR. SaaS-based Enterprise Resource Planning (ERP) systems are the second most popular type of SaaS application, generating $15.7B in revenue in 2019. Gartner predicts SaaS-based ERP sales will reach $35.8B in 2025, attaining a CAGR of 12.42%.
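For reference, the CAGR figures cited in these forecasts follow the standard compound-growth formula. Below is a quick sketch; note that the stated rates only line up if you assume seven compounding periods (i.e., a 2018 baseline) rather than the six a 2019 start would imply, so the period count here is an inference, not something Gartner states.

```python
def cagr(start: float, end: float, periods: int) -> float:
    """Compound Annual Growth Rate: the constant annual rate that
    compounds `start` into `end` over `periods` years."""
    return (end / start) ** (1 / periods) - 1

# Gartner's 16.1% overall figure is reproduced with seven compounding
# periods; the CRM figure likewise reproduces as roughly 12.14%.
overall = cagr(242.6, 692.1, 7)  # ~0.1616
crm = cagr(44.7, 99.7, 7)        # ~0.1214
```

The same helper reproduces the SaaS ERP figure ($15.7B to $35.8B) to within rounding of the stated 12.42%.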
Desktop as a Service (DaaS) is predicted to grow 67% in 2021, followed by Infrastructure-as-a-Service (IaaS) with a 38.5% jump in revenue. Platform-as-a-Service (PaaS) is the third-fastest growing area of public cloud services, projected to see a 28.3% jump in revenue this year. SaaS, the largest segment of public cloud spending at 36.9% this year, is forecast to grow 19.3% this year. The following graphic compares the growth rates of public cloud services between 2020 and 2021.
In 2021, SaaS end-user spending will grow by $19.8B, creating a $122.6B market this year. IaaS end-user spending will increase by $22.7B, the largest revenue gain of any cloud service in 2021. PaaS will follow, with end-user spending increasing $13.1B this year. CIOs and the IT teams they lead are investing in public cloud infrastructure to better scale operations and support virtual teams. CIOs from financial services and manufacturing firms I’ve recently spoken with are accelerating cloud spending for three reasons: first, to create a more virtual organization that can scale; second, to extend the value of legacy systems’ data by integrating their databases with new SaaS apps; and third, to address an urgent need to improve cloud cybersecurity.
CIOs and the organizations they serve are prioritizing cloud infrastructure investment to better support virtual workforces, supply chains, partners, and service partners. The CIOs I’ve spoken with also focus on getting the most value out of legacy systems by integrating them with cloud infrastructure and apps. As a result, cloud infrastructure investment starting with IaaS is projected to see end-user spending increase from $82B this year to $223B in 2025, growing 38.5% this year alone. End-user spending on Database Management Systems is projected to lead all categories of PaaS through 2025, increasing from $31.2B this year to $84.8B in 2025. The following graphic compares cloud services forecasts and growth rates:
Sage Intacct, Oracle ERP Cloud, and Microsoft Dynamics 365 ERP are the three highest-rated ERP systems by their users.
86% of Unit4 ERP users say their CRM system is the best of all vendors in the study. The survey-wide satisfaction rating for CRM is 73%, accentuating Unit4 ERP’s leadership in this area.
85% of Ramco ERP Suite users say their ERP systems’ analytics and reporting is the best of all 22 vendors evaluated.
These and many other insights are from SoftwareReviews’ latest customer rankings published recently in their Enterprise Data Quadrant Report, Enterprise Resource Planning, April 2021. The report is based entirely on attitudinal data captured from verified owners of each ERP system reviewed. 1,179 customer reviews were completed, evaluating 22 vendors. SoftwareReviews is a division of the world-class IT research and consulting firm Info-Tech Research Group. Their business model is based on providing research to enterprise buyers on subscription, alleviating the need to depend on vendor revenue, which helps them stay impartial in their many customer satisfaction studies. Key insights from the study include the following:
Sage Intacct, Oracle ERP Cloud, Microsoft Dynamics 365 ERP, Acumatica Cloud ERP, Unit4 ERP, and FinancialForce ERP are most popular with their users. SoftwareReviews found that these six ERP systems have the highest Net Emotional Footprint scores across all ERP vendors included in the study. The Net Emotional Footprint measures high-level user sentiment. It aggregates emotional response ratings across 25 questions, creating an indicator of overall user feeling toward the vendor and product. The following quadrant charts the results of the survey:
80% of Acumatica Cloud ERP users say their system helps create more business value, leading all vendors on this attribute. How effective an ERP system is at adapting to support new business and revenue models while providing greater cost visibility is the essence of how it delivers business value. The category average for this attribute is 75%. Of the 22 vendors profiled, 12 have scores at the average level or above, indicating many ERP vendors are focusing on these areas to improve the business case for adopting their systems.
86% of Sage Intacct ERP users say their system excels at ease of implementation, leading all vendors in the comparison by a wide margin. Implementing a new ERP system can be a costly and time-consuming process as it involves extensive training, change management, and integration. Ease of Implementation received a category score of 75% across the 22 vendors, indicating ERP vendors are doubling down on investments to improve this area. Just 11 of the 22 ERP vendors scored above the category average.
Bottom Line: Today’s largely distributed enterprises need to make sure they are putting endpoint security first in 2021, which includes closely managing every stage of the device lifecycle, from deployment to decommission, and ensuring all sensitive data remains protected.
There’s a looming paradox facing nearly every organization today: how to secure thousands of remote endpoints without having physical access to devices and without disrupting worker productivity. Whether there’s the need to retire hardware as part of downsizing or cost-cutting measures, or the need to equip virtual teams with newer equipment more suitable for long-term work-from-home scenarios, this is one of the most pressing issues facing CISOs and CIOs today.
Wanting to learn more about how their customers are tackling their endpoint security challenges and how their companies are helping to solve them, I sat down (virtually) with Absolute Software’s President and CEO Christy Wyatt and Matthew Zielinski, President of North America Intelligent Devices Group at Lenovo. The following is my interview with both of them:
Louis Columbus: Christy and Matt, thanks so much for your time today. To get started, I would like each of you to share what you’re hearing from your customers regarding their plans to refresh laptops and other endpoint devices in 2021.
Christy Wyatt: We’re seeing a strong desire from organizations to ensure that every individual is digitally enabled, and has access to a screen. In some cases, that means refreshing the hardware they already have in the field, and in other cases, that means buying or adding devices. From the endpoint security standpoint, there’s been a shift in focus around which tools matter the most. When laptops were primarily being used on campus, there was a certain set of solutions to monitor those devices and ensure they remained secure. Now that 90% of devices are out of the building, an entirely different set of capabilities is required – and delivering those has been our focus.
Matt Zielinski: We are seeing historic levels of demand from consumers, as many are transitioning from having maybe one or two devices per household to at least one device per person. We’re also seeing the same levels of demand on both the education and enterprise side. The new dynamic of work-from-anywhere, learn-from-anywhere, collaborate-from-anywhere underscores that the device hardware and software need to be current in order to support both the productivity and security needs of hugely distributed workforces. That’s our highest priority.
Louis: Where are CISOs in their understanding, evaluation, and adoption of endpoint security technologies?
Christy: The journey has been different for the education market than for the enterprise market. Most enterprise organizations were already on the digital path, with some percentage of their population already working remotely. And because of this, they typically have a more complex security stack to manage; our data shows that the total number of unique applications and versions installed on enterprise devices is nearly 1.5 million. What they’ve seen is a trifecta of vulnerabilities: employees taking data home with them, accessing it on unsecured connections, and not being aware of how their devices are protected beyond the WiFi connection and the network traffic.
In the education space, the challenges – and the amount of complexity – are completely different; they’re managing just a small fraction of that total number of apps and versions. That said, as the pandemic unfolded, education was hit harder because they were not yet at a point where every individual was digitally connected. There was a lot of reliance on being on campus, or being in a classroom. So, schools had to tackle digital and mobile transformation at the same time – and to their credit, they made multiple years of progress in a matter of weeks or months. This rapid rate of change will have a profound effect on how schools approach technology deployments going forward.
Matt: Whether in enterprise or education, our customers are looking to protect three things: their assets, their data, and their users’ productivity. It’s a daunting mission. But, the simplest way to accomplish it is to recognize the main control point has changed. It’s no longer the server sitting behind the firewall of your company’s or school’s IT environment. The vulnerability of the endpoint is that the network is now in the user’s hands; the edge is now the primary attack surface. I think CISOs realize this, and they are asking the right questions… I just don’t know if everyone understands the magnitude or the scale of the challenge. Because the problem is so critical, though, people are taking the time to make the right decisions and identify all the various components needed to be successful.
Louis: It seems like completing a laptop refresh during the conditions of a pandemic could be especially challenging, given how entire IT teams are remote. What do you anticipate will be the most challenging aspects of completing a hardware refresh this year (2021)?
Matt: The PC has always been a critical device for productivity. But now, without access to that technology, you are completely paralyzed; you can’t collaborate, you can’t engage, you can’t connect. Lenovo has always been focused on pushing intelligent transformation as far as possible to get the best devices into the hands of our customers. Beyond designing and building the device, we have the ability to distribute asset tags and to provide a 24/7 help desk for our customers whether you’re a consumer, a school, or a large institution. We can also decommission those devices at the end, so we’re able to support the entire journey or lifecycle.
The question has really become, how do you deliver secure devices to the masses? And, we’re fully equipped to do that. For example, every Lenovo X1 Carbon laptop comes out of the box with Lenovo Security Assurance, which is actually powered by Absolute; it is in our hardware. Our customers can open a Lenovo PC, and know that it is completely secure, right out of the box. Every one of our laptops is fortified with Absolute’s Persistence technology and self-healing capabilities that live in the BIOS. It’s that unbreakable, secure connection that makes it possible for us to serve our customers throughout the entire lifecycle of device ownership.
Louis: Why are the legacy approaches to decommissioning assets falling short or failing today? How would you redesign IT asset-decommissioning approaches to make them more automated and less dependent on centralized IT teams?
Christy: There have been a few very visible cases over the past year of highly regulated organizations experiencing vulnerabilities because of how they decommissioned – or did not properly decommission – their assets. But I don’t want anyone to believe that this is a problem unique to regulated industries, like financial services. The move to the cloud has given many organizations a false sense of security, and it seems that the more data running in the cloud, the more pronounced this false sense of security becomes. It’s a mistaken assumption to think that when hardware goes missing, the security problem is solved by shutting down password access and that all the data is protected because it is stored in the cloud. That’s just not true. When devices aren’t calling in anymore, it’s a major vulnerability – and the longer the device sits without being properly wiped or decommissioned, the greater the opportunity for bad actors to take advantage of those assets.
The other piece that should be top of mind is that once a device is decommissioned, it’s often sold. We want to ensure that nothing on that device gets passed on to the next owner, especially if it’s going to a service or leasing program. So, we’ve concentrated on making asset decommissioning as precise as possible and something that can be done at scale, anytime and anywhere.
Matt: Historically, reclaiming and decommissioning devices has required physical interaction. The pandemic has limited face-to-face encounters, so we’re leveraging many different software solutions to give our customers the ability to wipe the device clean if they aren’t able to get the asset back in their possession, so that at least they know it is secure. Since we’re all now distributed, we’re looking at several different solutions that will help with decommissioning, several of which are promising and scale well given today’s constraints. Our goal is to provide our enterprise customers with decommissioning flexibility, from ten units to several thousand.
Louis: Paradoxically, having everyone remote has made the business case for improving endpoint security more compelling too. What do you hear from enterprises about accelerating digital transformation initiatives that include the latest-generation endpoint devices?
Christy: The same acceleration that I spoke about on the education side, we absolutely see on the enterprise side as well, and with rapid transformation comes increased complexity. There has been a lot of conversation about moving to Zero Trust, moving more services to the cloud and putting more controls on the endpoint – and not having these sort of layers in between. Our data tells us that the average enterprise device today has 96 unique applications, and at least 10 of them are security applications. That is a massive amount of complexity to manage. So, we don’t believe that adding more controls to the endpoint is the answer; we believe that what’s most important is knowing the security controls you have are actually working. And we need to help devices and applications become more intelligent, self-aware, and capable of fixing themselves. This concept of resiliency is the cornerstone of effective endpoint security, and a critical part of the shift to a more modern security architecture.
Matt: I think there are two major forcing functions: connection and security. Because we are all now remote, there’s a huge desire to feel connected to one another even though we aren’t sitting in the same room together. We’re modifying our products in real-time with the goal of removing shared pain points and optimizing for the new reality in which we’re all living and working. We’re adding things like microphone noise suppression and multiple far-field microphones, so that if the dog barks or kids run into a room, the system will mute before you’ve even pressed the mute button. We’re improving camera technology from a processing standpoint to make things look better. Ultimately, our goal is to provide an immersive and connected experience.
Security, however, transcends specific features that deliver customer experiences – security is the experience. The features that make hardware more secure are those that lie beneath the operating system, in the firmware. That is why we have such a deep network of partners, including Absolute. Because you need to have a full ecosystem, and a program that takes advantage of all the best capabilities, in order to deliver the best security solution possible.
Louis: How is Absolute helping enterprise customers ensure greater endpoint security and resiliency in 2021 and beyond?
Christy: We spend a lot of time sitting with customers to understand their needs and how and where we can extend our endpoint security solutions to fit. We believe in taking a layered approach – which is the framework for defense in-depth, and an effective endpoint security strategy. The foundational piece, which we are able to deliver, is a permanent digital tether to every device; this is the lifeline. Not having an undeletable connection to every endpoint means you have a very large security gap, which must be closed fast. A layered, persistence-driven approach ensures our customers know their security controls are actually working and delivering business value. It enables our customers to pinpoint where a vulnerability is and take quick action to mitigate it.
Lenovo’s unique, high value-add approach to integrated security has helped drive innovation at Absolute while also providing Lenovo customers the strongest endpoint security possible. Their multilayer endpoint strategy capitalizes on Absolute’s many BIOS-level strengths to help their customers secure every endpoint they have. As our companies work together, we are both benefitting from a collaboration that seeks to strengthen and enrich all layers of endpoint security. Best of all, our shared customers are the beneficiaries of this collaboration and the results we are driving at the forefront of endpoint security.
Louis: How has the heightened focus on enterprise cybersecurity in general, and endpoint security specifically, influenced Lenovo’s product strategy in 2021 and beyond?
Matt: We have always been focused on our unique cybersecurity strengths from the device side and making sure we have all of the control points in manufacturing to ensure we build a secure platform. So, we’ve had to be open-minded about endpoint security, and diligent in envisioning how potential vulnerabilities and attack strategies can be thwarted before they impact our customers. Because of this mindset, we’re fortunate to have a very active partner community. We’re always scouring the earth for the next hot cybersecurity technology and potential partner with unique capabilities and the ability to scale with our model. This is a key reason we’ve standardized on Absolute for endpoint security, as it can accommodate a wide breadth of deployment scenarios. It’s a constant and very iterative process with a team of very smart people constantly looking at how we can excel at cybersecurity. It is this strategy that is driving us to fortify our Lenovo Security Assurance architecture over the long-term, while also seeking new ways of providing insights from existing and potentially new security applications.
Louis: What advice are you giving CISOs to strengthen endpoint security in 2021 and beyond?
Christy: One of our advisors is the former Global Head of Information Security at Citigroup, and former CISO of JP Morgan and Deutsche Bank. He talks a lot about his shared experiences of enabling business operations while defending organizations from ever-evolving threats, and the question that more IT and security leaders need to be asking – which is, “Is it working?” Included in his expert opinion is that cybersecurity needs to be integral to business strategy – and endpoint security is essential for creating a broader secure ecosystem that can adapt as a company’s needs change.
I believe there needs to be more boardroom-level conversations around how compliance frameworks can be best used to achieve a balance between cybersecurity and business operations. A big part of that is identifying resiliency as a critical KPI for measuring the strength of endpoint controls.
The human tragedy the COVID-19 pandemic has inflicted on the world is incalculable and continues to grow. Every human life is priceless and deserves the care needed to sustain it. COVID-19 is also impacting entire industries, causing them to gyrate in unpredictable ways, directly impacting IT and tech spending.
Computer Economics and Avasant predict major disruption to High Tech & Telecommunications based on the industry’s heavy reliance on Chinese supply chains, which were severely impacted by COVID-19. Based on conversations with U.S.-based high tech manufacturers, I’ve learned that a few are struggling to make deliveries to leading department stores and discount chains due to parts shortages and allocations from their Chinese suppliers. North American electronics suppliers aren’t an option due to their prices being higher than their Chinese competitors. Leading department stores and discount chains openly encourage high tech device manufacturers to compete with each other on supplier availability and delivery date performance.
In contrast to the parts shortage and unpredictability of supply chains dragging down the industry, software is a growth catalyst. The study notes that Zoom, Slack, GoToMyPC, Zoho Remotely, Microsoft Office365, Atlassian, and others are already seeing increased demand as companies increase their remote-working capabilities.
Key insights from Forrester’s latest IT spending forecast and predictions are shown below:
Forrester is revising its tech forecast downward, predicting US and global tech market growth slowing to around 2% in 2020. Mr. Bartels notes that this assumes the US and other major economies decline in the first half of 2020 but manage to recover in the second half.
If a full-fledged recession hits, there is a 50% probability that US and global tech markets will decline by 2% or more in 2020.
In either a second-half 2020 recovery or recession, Forrester predicts computer and communications equipment spending will be weakest, with potential declines of 5% to 10%.
Tech consulting and systems integration services spending will be flat in a temporary slowdown and could be down by up to 5% if firms cut back on new tech projects.
Software spending growth will slow to the 2% to 4% range in the best case and will post no growth in the worst case of a recession.
The only positive signs in the latest Forrester IT spending forecast are the continued growth in demand for cloud infrastructure services and potential increases in spending on specialized software. Forrester also predicts increased spending on communications equipment and telecom services for remote work and education as organizations encourage workers to work from home and schools move to online courses.
Every industry is economically hurting already from the COVID-19 pandemic. Now is the time for enterprise software providers to go the extra mile for their customers across all industries and help them recover and grow again. Strengthening customers in their time of need by freely providing remote collaboration tools, secure endpoint solutions, cloud-based storage, and CRM systems is an investment in the community that every software company needs to make it through this pandemic too.
Why CIOs Are Prioritizing Stopping Privileged Credential Abuse Now
Enterprise security approaches based on Zero Trust continue to gain more mindshare as organizations examine their strategic priorities. CIOs and senior management teams are most focused on securing infrastructure, DevOps, cloud, containers, and Big Data projects to stop the leading cause of breaches, which is privileged access abuse.
The following are the key reasons why CIOs are prioritizing privileged access management now:
Identities are the new security perimeter for any business, making privileged access abuse the greatest challenge CIOs face in keeping their businesses secure and growing. Gartner also sees privileged credential abuse as the greatest threat to organizations today, and has made Privileged Account Management one of the Gartner Top 10 Security Projects for 2018, and again in 2019. Forrester and Gartner’s findings and predictions reflect the growing complexity of the threatscapes every CIO must protect their business against while still enabling new business growth. Banking, financial services, and insurance (BFSI) CIOs often remark in my conversations with them that the attack surfaces in their organizations are proliferating at a pace that quickly scales beyond any legacy “trust but verify” approach to managing access. They need to provide applications, IoT-enabled devices, machines, cloud services, and humans with access to a broader base of business units than ever before.
CIOs are grappling with the paradox of protecting the rapidly expanding variety of attack surfaces from breaches while still providing immediate access to applications, systems, and services that support their business’ growth. CIOs I’ve met with also told me access to secured resources needs to happen in milliseconds, especially to support the development of new banking, financial services, and insurance applications in beta testing today, scheduled to be launched this summer. Their organizations’ development teams expect more intuitive, secure, and easily accessible applications than ever before, which is driving CIOs to prioritize privileged access management now.
Adapting and risk-scoring every access attempt in real-time is key to customer experiences on new services and applications, starting with response times. CIOs need a security strategy that can flex or adapt to risk contexts in real-time, assessing every access attempt across every threat surface and generating a risk score in milliseconds. The CIOs I’ve met with regularly see a “never trust, always verify, enforce least privilege” approach to security as the future of how they’ll protect every threat surface from privileged access abuse. Each of their development teams is on a tight deadline to get new services launched to drive revenue in Q3. Designing in Zero Trust with a strong focus on Zero Trust Privilege is saving valuable development time now and is enabling faster authentication times for the apps and services in testing today.
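To make the idea of real-time risk scoring concrete, here is a minimal, hypothetical sketch. The signals, weights, and scoring logic are purely illustrative, not any vendor’s actual model; a production system would weigh far more context and feed the score into an adaptive MFA or deny decision.

```python
def risk_score(signals: dict) -> float:
    """Combine access-attempt signals into a 0-100 risk score.
    Weights are illustrative only, not a production model."""
    weights = {
        "new_device": 30,         # first access attempt from this device
        "unusual_location": 25,   # atypical geography or network
        "off_hours": 15,          # outside the identity's normal work window
        "privileged_target": 30,  # request targets an admin-level resource
    }
    # Sum the weight of every signal that fired for this attempt
    return float(sum(w for k, w in weights.items() if signals.get(k)))

# A routine login from a known device scores low, while a privileged
# request from a new device in an unusual location scores high enough
# to trigger step-up authentication or denial.
low = risk_score({"new_device": False, "off_hours": False})
high = risk_score({"new_device": True, "unusual_location": True,
                   "privileged_target": True})
```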
Strategies For Stopping Privileged Credential Abuse – Part 2
Recently I wrote a CIO’s Guide To Stopping Privileged Access Abuse – Part 1 detailing five recommended strategies for CIOs on how to stop privileged credential abuse. The first five strategies focus on the following: discovering and inventorying all privileged accounts; vaulting all cloud platforms’ Root Accounts; auditing privileged sessions and analyzing patterns to find privileged credential sharing not found during audits; enforcing least privilege access now within your existing infrastructure as much as possible; and adopting multi-factor authentication (MFA) across all threat surfaces that can adapt and flex to the risk context of every request for resources.
The following are the second set of strategies CIOs need to prioritize to further protect their organizations from privileged access abuse:
After completing an inventory of privileged accounts, create a taxonomy of them by assigning users to each class or category, personalizing privileged credential access to the role and entitlement level for each. CIOs tell me this is a major time saver in scaling their Privileged Access Management (PAM) strategies. Assigning every human, machine, and sensor-based identity to a category is the goal, with the overarching objective of creating a Zero Trust-based enterprise security strategy. Recommended initial classes or categories include IT administrators who are also responsible for endpoint security; developers who require occasional access to production instances; service desk teams and service operations; the Project Management Office (PMO) and project IT; and external contractors and consultants.
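A taxonomy like the one above can be captured as data rather than tribal knowledge. The sketch below is a hypothetical starting point using the categories recommended here; the entitlement attributes are invented for illustration and would come from each organization’s own inventory and policy.

```python
from enum import Enum

class PrivilegedRole(Enum):
    """Initial account categories from the recommended taxonomy."""
    IT_ADMIN = "it_admin"                  # also owns endpoint security
    DEVELOPER = "developer"                # occasional production access
    SERVICE_DESK = "service_desk"          # service desk and operations
    PMO = "pmo"                            # PMO and project IT
    EXTERNAL_CONTRACTOR = "external"       # contractors and consultants

# Hypothetical per-role entitlements: whether production access is ever
# allowed, and the maximum session duration in hours.
ENTITLEMENTS = {
    PrivilegedRole.IT_ADMIN:            {"prod": True,  "duration_hours": 8},
    PrivilegedRole.DEVELOPER:           {"prod": True,  "duration_hours": 2},
    PrivilegedRole.SERVICE_DESK:        {"prod": False, "duration_hours": 4},
    PrivilegedRole.PMO:                 {"prod": False, "duration_hours": 8},
    PrivilegedRole.EXTERNAL_CONTRACTOR: {"prod": False, "duration_hours": 1},
}
```

Because every role has an explicit entitlement record, gaps (a role with no policy, or a policy with no role) become programmatically detectable.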
For each category in the taxonomy, automate the time, duration, scope, resources, and entitlements of privileged access, focusing on the estimated time to complete each typical task. Defining a governance structure that provides real-time access to resources based on successful authentication is a must-have for protecting privileged access credentials. By starting with the attributes of time, duration, scope, and properties, organizations have a head start on creating a separation of duties (SOD) model. Separation of duties is essential for ensuring that privileged user accounts don’t have the opportunity to carry out and conceal any illegal or unauthorized activities.
Using the taxonomy of user accounts created and hardened using the separation of duties model, automate privileged access and approval workflows for enterprise systems. Instead of having administrators approve or semi-automate the evaluation of every human- and machine-based request for access, consider automating the process with a request and approval workflow. With the time, duration, scope, and properties of privileged access already defined, human- and machine-based requests for access to IT systems and services are streamlined, saving hundreds of hours a year and providing a real-time log for audit and data analysis later.
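The request-and-approval workflow described above can be sketched in a few lines. This is a hypothetical illustration, not any PAM product’s API: the roles, resource names, and policy limits are invented, and a real system would also log every decision for later audit.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class AccessRequest:
    identity: str          # human or machine identity making the request
    role: str              # category from the privileged-account taxonomy
    resource: str          # system or service being requested
    requested_hours: float # desired session duration

# Per-role policy derived from the taxonomy: which resources a role may
# touch and for how long. Entries here are purely illustrative.
POLICY = {
    "developer":    {"resources": {"prod-db-readonly"}, "max_hours": 2},
    "service_desk": {"resources": {"ticketing-admin"},  "max_hours": 4},
}

def evaluate(req: AccessRequest) -> dict:
    """Grant time-boxed access when a request fits its role's policy;
    anything outside policy is escalated to a human approver."""
    rule = POLICY.get(req.role)
    if (rule and req.resource in rule["resources"]
            and req.requested_hours <= rule["max_hours"]):
        expires = datetime.now(timezone.utc) + timedelta(hours=req.requested_hours)
        return {"granted": True, "expires": expires.isoformat()}
    return {"granted": False, "reason": "escalate to manual approval"}
```

In-policy requests are granted a time-boxed session automatically; everything else falls through to a human, which is where the hundreds of hours of savings come from.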
Break-glass, emergency or firecall account passwords need to be vaulted, with no exceptions. When there’s a crisis of any kind, the seconds it takes to get a password could mean the difference between cloud instances and entire systems being inaccessible or not. Under that time pressure, administrators often fall back to managing root passwords to all systems, cloud platforms and containers included, manually rather than in a vault. This is the equivalent of leaving the front door open to the data center with all systems unlocked. The recent Centrify survey found that just 48% of organizations interviewed have a password vault, meaning 52% are leaving the keys to the kingdom available for hackers to walk through the front door of data centers and exfiltrate data whenever they want.
Continuous delivery and deployment platforms including Ansible, Chef, Puppet, and others need to be configured when first installed to eliminate the potential for privileged access abuse. The CIOs whose teams are creating new apps and services are using Chef and Puppet to design and create workloads, with real-time integration needed with customer, pricing, and services databases and the systems they run on. Given how highly regulated insurance is, CIOs are saying they need logs that show activity down to the API level in case of an audit. Based on CIOs’ feedback, the more regulated and audited a company, the more it sees trusted and untrusted domains as the past and Zero Trust as the future.
The CIOs I regularly meet with from the banking, financial services, and insurance industries are under pressure to get new applications and services launched while protecting their business’ daily operations. With more application and services development happening in their IT teams, they’re focusing on how they can optimize the balance between security and speed. New apps, services, and the new customers they attract are creating a proliferation of new threat surfaces, making every new identity the new security perimeter.
Machine learning patents grew at a 34% Compound Annual Growth Rate (CAGR) between 2013 and 2017, the third-fastest growing category of all patents granted.
International Data Corporation (IDC) forecasts that spending on AI and ML will grow from $12B in 2017 to $57.6B by 2021.
Deloitte Global predicts the number of machine learning pilots and implementations will double in 2018 compared to 2017, and double again by 2020.
These and many other fascinating insights are from the latest series of machine learning market forecasts, market estimates, and projections. Machine learning’s potential impact across many of the world’s most data-prolific industries continues to fuel venture capital investment, private equity (PE) funding, mergers, and acquisitions all focused on winning the race of Intellectual Property (IP) and patents in this field. One of the fastest growing areas of machine learning IP is the development of custom chipsets. Deloitte Global is predicting up to 800K machine learning chips will be in use across global data centers this year. Enterprises are increasing their research, investment, and piloting of machine learning programs in 2018. And while the methodologies vary across the many sources of forecasts, market estimates, and projections, all reflect how machine learning is sharpening companies’ insight into how to grow faster and more profitably. Key takeaways from the collection of machine learning market forecasts, market estimates and projections include the following:
Within the Business Intelligence (BI) & analytics market, Data Science platforms that support machine learning are predicted to grow at a 13% CAGR through 2021. Data Science platforms will outperform the broader BI & analytics market, which is predicted to grow at an 8% CAGR in the same period. Data Science platforms will grow in value from $3B in 2017 to $4.8B in 2021. Source: An Investors’ Guide to Artificial Intelligence, J.P. Morgan. November 27, 2017 (110 pp., PDF, no opt-in).
Machine learning patents grew at a 34% Compound Annual Growth Rate (CAGR) between 2013 and 2017, the third-fastest growing category of all patents granted. IBM, Microsoft, Google, LinkedIn, Facebook, Intel, and Fujitsu were the seven biggest ML patent producers in 2017. Source: IFI Claims Patent Services (Patent Analytics) 8 Fastest Growing Technologies SlideShare Presentation.
61% of organizations most frequently picked Machine Learning / Artificial Intelligence as their company’s most significant data initiative for next year. Of those respondent organizations indicating they actively use Machine Learning (ML) and Artificial Intelligence (AI), 58% indicated they ran models in production. Source: 2018 Outlook: Machine Learning and Artificial Intelligence, A Survey of 1,600+ Data Professionals (14 pp., PDF, no opt-in).
Tech market leaders including Amazon, Apple, Google, Tesla, and Microsoft are leading their industry sectors by a wide margin in machine learning (ML) and AI investment. Each is designing ML into future-generation products and using ML and AI to improve customer experiences and the efficiency of selling channels. Source: Will You Embrace AI Fast Enough? AT Kearney, January 2018.
Deloitte Global predicts the number of machine learning pilots and implementations will double in 2018 compared to 2017, and double again by 2020. Factors driving the increasing pace of ML pilots include more pervasive support of Application Program Interfaces (APIs), automation of data science tasks, a reduced need for training data, faster training, and greater insight into explaining results. Source: Deloitte Global Predictions 2018 Infographics.
60% of organizations are at varying stages of machine learning adoption, with nearly half (45%) saying the technology has led to more extensive data analysis & insights. 35% report faster data analysis and an increased speed of insight, delivering greater acuity to their organizations. 35% are also finding that machine learning is enhancing their R&D capabilities for next-generation products. Source: Google & MIT Technology Review study: Machine Learning: The New Proving Ground for Competitive Advantage (10 pp., PDF, no opt-in).
McKinsey estimates that total annual external investment in AI was between $8B and $12B in 2016, with machine learning attracting nearly 60% of that investment. Robotics and speech recognition are two of the most popular investment areas. Investors favor machine learning startups because code-based start-ups can scale up and add new features quickly. Software-based machine learning startups are preferred over their more cost-intensive machine-based robotics counterparts, which often can’t scale the way software-based start-ups can. As a result of these factors and more, corporate M&A is soaring in this area. The following graphic illustrates the distribution of external investments by category from the study. Source: McKinsey Global Institute Study, Artificial Intelligence, The Next Digital Frontier (80 pp., PDF, free, no opt-in).
Deloitte Global is predicting machine learning chips used in data centers will grow from a 100K to 200K run rate in 2016 to 800K this year. At least 25% of these will be Field Programmable Gate Arrays (FPGA) and Application Specific Integrated Circuits (ASICs). Deloitte found the Total Available Market (TAM) for Machine Learning (ML) Accelerator technologies could potentially reach $26B by 2020. Source: Deloitte Global Predictions 2018.
Amazon is relying on machine learning to improve customer experiences in key areas of their business including product recommendations, substitute product prediction, fraud detection, meta-data validation and knowledge acquisition. For additional details, please see the presentation, Machine Learning At Amazon, Amazon Web Services (47 pp., PDF no opt-in).
83% Of Enterprise Workloads Will Be In The Cloud By 2020. LogicMonitor’s survey is predicting that 41% of enterprise workloads will be run on public cloud platforms (Amazon AWS, Google Cloud Platform, IBM Cloud, Microsoft Azure and others) by 2020. An additional 20% are predicted to be private-cloud-based followed by another 22% running on hybrid cloud platforms by 2020. On-premise workloads are predicted to shrink from 37% today to 27% of all workloads by 2020.
Digitally transforming enterprises (63%) is the leading factor driving greater public cloud engagement or adoption followed by the pursuit of IT agility (62%). LogicMonitor’s survey found that the many challenges enterprises face in digitally transforming their business models are the leading contributing factor to cloud computing adoption. Attaining IT agility (62%), excelling at DevOps (58%), mobility (55%), Artificial Intelligence (AI) and Machine Learning (50%) and the Internet of Things (IoT) adoption (45%) round out the top six factors driving cloud adoption today. Artificial Intelligence (AI) and Machine Learning are predicted to be the leading factors driving greater cloud computing adoption by 2020.
66% of IT professionals say security is their greatest concern in adopting an enterprise cloud computing strategy. Cloud platform and service providers will go on a buying spree in 2018 to strengthen and harden their platforms in this area. Verizon (NYSE:VZ) acquiring Niddel this week is just the beginning. Niddel’s Magnet software is a machine learning-based threat-hunting system that will be integrated into Verizon’s enterprise-class cloud services and systems. Additional concerns include attaining governance and compliance goals on cloud-based platforms (60%), overcoming the challenges of having staff that lacks cloud experience (58%), Privacy (57%) and vendor lock-in (47%).
Just 27% of respondents predict that by 2022, 95% of all workloads will run in the cloud. One in five respondents believes it will take ten years to reach that level of workload migration. 13% of respondents don’t see this level of workload shift ever occurring. Based on conversations with CIOs and CEOs in the manufacturing and financial services industries, there will be a mix of workloads between on-premise and cloud for the foreseeable future. C-level executives evaluate shifting workloads based on each system’s contribution to new business models, cost, and revenue goals, in addition to accelerating time-to-market.
Microsoft Azure and Google Cloud Platform are predicted to gain market share versus Amazon AWS in the next three years, with AWS staying the clear market leader. The study found 42% of respondents are predicting Microsoft Azure will gain more market share by 2020. Google Cloud Platform is predicted to also gain ground according to 35% of the respondent base. AWS is predicted to extend its market dominance with 52% market share by 2020.
81% of IT leaders are currently investing in or planning to invest in Artificial Intelligence (AI).
Cowen predicts AI will drive user productivity to materially higher levels, with Microsoft at the forefront.
Digital Marketing/Marketing Automation, Salesforce Automation (CRM) and Data Analytics are the top three areas ripe for AI/ML adoption.
According to angel.co, there are 2,200+ Artificial Intelligence start-ups, and well over 50% have emerged in just the last two years.
Cowen sees Salesforce ($CRM), Adobe ($ADBE) and ServiceNow ($NOW) as well-positioned to deliver and monetize new AI-based application services.
These and many other fascinating insights are from the Cowen and Company Multi-Sector Equity Research study, Artificial Intelligence: Entering A Golden Age For Data Science (142 pp., PDF, client access required). The study is based on interviews with 146 leading AI researchers, entrepreneurs and VC executives globally who are involved in the field of artificial intelligence and related technologies. Please see the Appendix of the study for a thorough overview of the methodology. This study isn’t representative of global AI, data engineering and machine learning (ML) adoption trends. It does, however, provide a glimpse into the current and future direction of AI, data engineering, and machine learning. Cowen finds the market is still nascent, with CIOs eager to invest in new AI-related initiatives. Time-to-market, customer messaging, product positioning and the value proposition of AI solutions will be critical factors for winning over new project investments.
Key takeaways from the study include the following:
Digital Marketing/Marketing Automation, Salesforce Automation (CRM) and Data Analytics are the top three areas ripe for AI/ML adoption. Customer self-service, Enterprise Resource Planning (ERP), Human Resource Management (HRM) and E-Commerce are additional areas that have upside potential for AI/ML adoption. The following graphic provides an overview of the areas in software where Cowen found the greatest potential for AI/ML investment.
81% of IT leaders are currently investing in or planning to invest in Artificial Intelligence (AI). Based on the study, CIOs have a new mandate to integrate AI into IT technology stacks. The study found that 43% are evaluating and doing a Proof of Concept (POC) and 38% are already live and planning to invest more. The following graphic provides an overview of company readiness for machine learning and AI projects.
Market forecasts vary, but all consistently predict explosive growth. IDC predicts that the Cognitive Systems and AI market (including hardware & services) will grow from $8B in 2016 to $47B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 55%. This forecast includes $18B in software applications, $5B in software platforms, and $24B in services and hardware. IBM claims that Cognitive Computing is a $2T market, including $200B in healthcare/life sciences alone. Tractica forecasts direct and indirect applications of AI software to grow from $1.4B in 2016 to $59.8B by 2025, a 52% CAGR.
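The CAGR figures quoted throughout these forecasts follow the standard formula, CAGR = (ending value / starting value)^(1/years) − 1. As a quick sanity check of IDC’s numbers above:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound Annual Growth Rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# IDC's forecast above: $8B in 2016 growing to $47B in 2020 (4 years).
rate = cagr(8, 47, 4)
print(f"{rate:.1%}")  # ≈ 55.7%, consistent with the quoted 55% CAGR
```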
According to CBInsights, the number of financing transactions to AI start-ups increased 10x over the last six years, from 67 in 2011 to 698 in 2016. Accenture states that the total number of AI start-ups has increased 20-fold since 2011. The top verticals include FinTech, Healthcare, Transportation and Retail/e-Commerce. The following graphic provides an overview of the AI annual funding history from 2011 to 2016.
Algorithmic trading, image recognition/tagging, and patient data processing are predicted to be the top AI use cases by 2025. Tractica forecasts predictive maintenance and content distribution on social media will be the fourth and fifth highest revenue-producing AI use cases over the next eight years. The following graphic compares the top 10 use cases by projected global revenue.
Machine Learning is predicted to generate the most revenue and is attracting the most venture capital investment in all areas of AI. Venture Scanner found that ML raised $3.5B to date (from 400+ companies), far ahead of the next category, Natural Language Processing, which has seen just over $1B raised to date (from 200+ companies). Venture Scanner believes that Machine Learning Applications and Machine Learning Platforms are two relatively early stage markets that stand to have some of the greatest market disruptions.
Cowen predicts that an Intelligent App Stack will gain rapid adoption in enterprises as IT departments shift from system-of-record to system-of-intelligence apps, platforms, and priorities. The future of enterprise software is being defined by increasingly intelligent applications today, and this will accelerate in the future. Cowen predicts it will be commonplace for enterprise apps to have machine learning algorithms that can provide predictive insights across a broad base of scenarios encompassing a company’s entire value chain. The potential exists for enterprise apps to change selling and buying behavior, tailoring specific responses based on real-time data to optimize discounting, pricing, proposal and quoting decisions.
According to angel.co, there are 2,200+ Artificial Intelligence start-ups, and well over 50% have emerged in just the last two years. Machine Learning-based Applications and Deep Learning Neural Networks are experiencing the largest and widest amount of investment attention in the enterprise.
Accenture leverages machine learning in 40% of active Analytics engagements, and nearly 80% of proposed Analytics opportunities today. Cowen found that Accenture’s view is that they are in the early stages of AI technology adoption with their enterprise clients. Accenture sees the AI market growing exponentially, reaching $400B in spending by 2020. Their customers have moved on from piloting and testing AI to reinventing their business strategies and models.
The more integrated the systems are supporting any selling strategy, the greater the chances sales will increase. That’s because accuracy, speed, and quality of every quote matter more than ever. Being able to strengthen every customer interaction with insight and intelligence often means the difference between successful upsells, cross-sells and the chance to bid and win new projects. Defining a roadmap to enrich selling strategies using SAP integration is delivering results across a variety of manufacturing and service industries today.
Getting more value out of the customer data locked in legacy SAP systems can improve selling results starting with existing sales cycles. Knowing what each customer purchased, when, at what price, and for which project or location is invaluable in accelerating sales cycles today. There are many ways to improve selling results using SAP integration, and the following are the top three based on conversations with SAP Architects, CIOs and IT Directors working with Sales Operations to improve selling results. These three approaches are generating more leads, closing more deals, leading to better selling decisions and improving sales productivity.
3 Ways SAP Integration Is Improving Selling Results
Reducing and eliminating significant gaps in the Configure-Price-Quote (CPQ) process by integrating Salesforce and SAP systems improves selling and revenue results quickly. The following two illustrations compare how much time and revenue escape from the selling process. It’s common to see companies lose at least 20% of their orders when they rely on manual approaches to handling quotes, pricing, and configurations. The greater the complexity of the deal, the greater the potential for lost revenue. The second graphic shows how greater system integration leads to lower costs to complete an order, cycle time reductions, order rework reductions, and lead times for entire orders dropping from 69 to 22 days.
Having customer order history, pricing, discounts and previously purchased bundles stored in SAP ERP systems integrated into Salesforce will drive better decisions on which customers are most likely to buy upsells, cross-sells and new products, and when. Instead of relying only on current activity with a given customer, sales teams can analyze sales history to find potential purchasing trends and indications of who can sign off on deals in progress. Having real-time access to SAP data within Salesforce gives sales teams the most valuable competitive advantage there is: more time to focus on customers and closing deals. enosiX is taking a leadership role in the area of real-time SAP to Salesforce integration, enabling enterprises to sell and operate more effectively.
Improving Sales Operations and Customer Service productivity by providing customer data in real-time via Salesforce to support teams on a 24/7 basis worldwide. The two departments that rely on customer data most after sales need real-time access to that data on a 24/7 basis from any device, at any time, on a global scale. By integrating customer data held today in SAP ERP and related systems with Salesforce, Sales Operations and Customer Service will have visibility they’ve never had before. And that will translate into faster response times, higher customer satisfaction and potentially more sales too.
Bottom line: Machine learning is providing the needed algorithms, applications, and frameworks to bring greater predictive accuracy and value to enterprises’ data, leading to diverse company-wide strategies succeeding faster and more profitably than before.
Industries Where Machine Learning Is Making An Impact
The good news for businesses is that all the data they have been saving for years can now be turned into a competitive advantage and lead to strategic goals being accomplished. Revenue teams are using machine learning to optimize promotions, compensation and rebates to drive the desired behavior across selling channels. Predicting propensity to buy across all channels, making personalized recommendations to customers, forecasting long-term customer loyalty and anticipating potential credit risks of suppliers and buyers are just a few of the applications in use today. Figure 1 provides an overview of machine learning applications by industry.
Machine Learning Is Revolutionizing Sales and Marketing
Unlike advanced analytics techniques that seek out causality first, machine learning techniques are designed to seek out opportunities to optimize decisions based on the predictive value of large-scale data sets. And increasingly, data sets comprise structured and unstructured data, with the global proliferation of social networks fueling the growth of the latter. Machine learning is proving to be efficient at handling predictive tasks including defining which behaviors have the highest propensity to drive desired sales and marketing outcomes. Businesses eager to compete and win more customers are applying machine learning to sales and marketing challenges first. In the MIT Sloan Management Review article, Sales Gets a Machine-Learning Makeover, the Accenture Institute for High Performance shared the results of a recent survey of enterprises with at least $500M in sales that are targeting higher sales growth with machine learning. Key takeaways from their study results include the following:
76% say they are targeting higher sales growth with machine learning. Gaining greater predictive accuracy by creating and optimizing propensity models to guide up-sell and cross-sell is where machine learning is making contributions to omnichannel selling strategies today.
At least 40% of companies surveyed are already using machine learning to improve sales and marketing performance. Two out of five companies have already implemented machine learning in sales and marketing.
38% credited machine learning with improving sales performance metrics, including new leads, upsells, and sales cycle times, by a factor of 2 or more, while another 41% saw improvements by a factor of 5 or more.
Several European banks are increasing new product sales by 10% while reducing churn 20%. A recent McKinsey study found that a dozen European banks are replacing statistical modeling techniques with machine learning. The banks are also increasing customer satisfaction scores and customer lifetime value as well.
Why Machine Learning Adoption Is Accelerating
Machine learning’s ability to scale across the broad spectrum of contract management, customer service, finance, legal, sales, quote-to-cash, quality, pricing and production challenges enterprises face is attributable to its ability to continually learn and improve. Machine learning algorithms are iterative in nature, continually learning and seeking to optimize outcomes. Every time a miscalculation is made, machine learning algorithms correct the error and begin another iteration of the data analysis. These calculations happen in milliseconds which makes machine learning exceptionally efficient at optimizing decisions and predicting outcomes.
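That measure-correct-repeat loop is the essence of most learning algorithms. The toy example below (plain gradient descent on a one-variable linear model, written purely for illustration) makes the cycle concrete: each pass computes the miscalculation on every data point, nudges the parameters to reduce it, and begins another iteration.

```python
def fit_line(xs, ys, lr=0.01, iterations=1000):
    """Iteratively fit y = w*x + b by correcting prediction errors."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(iterations):
        # Measure the miscalculation for each point under the current model.
        errors = [(w * x + b) - y for x, y in zip(xs, ys)]
        # Correct the parameters in proportion to the error (gradient step).
        w -= lr * 2 / n * sum(e * x for e, x in zip(errors, xs))
        b -= lr * 2 / n * sum(errors)
    return w, b

# Data generated by y = 2x + 1; the loop should recover those parameters.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
w, b = fit_line(xs, ys)
```

A thousand such corrections complete in milliseconds, which is why iterative optimization scales to the pricing, forecasting, and production problems described above.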
The economics of cloud computing, cloud storage, the proliferation of sensors driving Internet of Things (IoT) connected devices growth, pervasive use of mobile devices that consume gigabytes of data in minutes are a few of the several factors accelerating machine learning adoption. Add to these the many challenges of creating context in search engines and the complicated problems companies face in optimizing operations while predicting most likely outcomes, and the perfect conditions exist for machine learning to proliferate.
The following are the key factors enabling machine learning growth today:
Exponential data growth with unstructured data being over 80% of the data an enterprise relies on to make decisions daily. Demand forecasts, CRM and ERP transaction data, transportation costs, barcode and inventory management data, historical pricing, service and support costs and accounting standard costing are just a few of the many sources of structured data enterprises make decisions with today. The exponential growth of unstructured data that includes social media, e-mail records, call logs, customer service and support records, Internet of Things sensing data, competitor and partner pricing and supply chain tracking data frequently has predictive patterns enterprises are completely missing out on today. Enterprises looking to become competitive leaders are going after the insights in these unstructured data sources and turning them into a competitive advantage with machine learning.
The Internet of Things (IoT) networks, embedded systems and devices are generating real-time data that is ideal for further optimizing supply chain networks and increasing demand forecast predictive accuracy. As IoT platforms, systems, applications and sensors permeate the value chains of businesses globally, there is exponential growth in the data generated. The availability and intrinsic value of these large-scale datasets are an impetus further driving machine learning adoption.
Generating massive data sets through synthetic means including extrapolation and projection of existing historical data to create realistic simulated data. From weather forecasting to optimizing a supply chain network using advanced simulation techniques that generate terabytes of data, the ability to fine-tune forecasts and attain greater optimization is also driving machine learning adoption. Simulated data sets of product launch and selling strategies are a nascent application today and one that shows promise in developing propensity models that predict purchase levels.
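A minimal illustration of the extrapolation-and-projection idea (the demand figures and noise level below are made up for the example): estimate the trend of a short historical series, project it forward, and perturb each projected point so downstream models train on realistic variance rather than a perfectly straight line.

```python
import random

def synthesize(history, periods, noise=0.05, seed=42):
    """Extrapolate a historical series into a larger simulated data set."""
    rng = random.Random(seed)
    # Estimate the average per-period change from the historical data.
    step = (history[-1] - history[0]) / (len(history) - 1)
    last = history[-1]
    simulated = []
    for i in range(1, periods + 1):
        projected = last + step * i
        # Perturb each projection within +/- noise to mimic real variance.
        simulated.append(projected * (1 + rng.uniform(-noise, noise)))
    return simulated

# Hypothetical monthly demand history; project 24 simulated future months.
history = [100, 108, 115, 124, 131]
future = synthesize(history, periods=24)
```

Real simulation pipelines use far richer models (seasonality, covariates, scenario trees), but the principle is the same: a small amount of real history seeds an arbitrarily large training set.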
The economics of digital storage and cloud computing are combining to put infrastructure costs into freefall, making machine learning more affordable for all businesses. Online storage and public cloud instances can be purchased literally in minutes online with a credit card. Migrating legacy data off databases where accessibility is limited compared to cloud platforms is becoming more commonplace as greater trust in secure cloud storage increases. For many small businesses that lack IT departments, the cloud provides a scalable, secure platform for managing their data across diverse geographic locations.