
Posts tagged ‘enterprise software’

2021 State Of The Machine Learning Market: Enterprise Adoption Is Strong

data science, machine learning, enterprise software, AI, artificial intelligence
  • 59% of all large enterprises are deploying data science (DS) and machine learning (ML) today.
  • Nearly 50% of all organizations have 25 or more ML models in use today.
  • 29% of enterprises are refreshing their data science and machine learning models every day.
  • The higher the data literacy an enterprise can achieve before launching Data Science & Machine Learning initiatives, the higher the probability of success.

These and many other insights defining the state of the data science and machine learning market in 2021 are from Dresner Advisory Services’ 2021 Data Science and Machine Learning Market Study. The 7th annual report is noteworthy for its depth of analysis and insight into how data science and machine learning adoption is growing stronger in enterprises. In addition, the study explains which factors drive adoption and identifies the key success factors that matter most when deploying data science and machine learning techniques. The methodology uses crowdsourcing techniques to recruit respondents from over 6,000 organizations and vendors’ customer communities. As a result, 52% of respondents are from North America and 34% from EMEA, with the balance from Asia-Pacific and Latin America.

“The perceived importance of data science and machine learning correlates with organizational success with BI, with users that self-report as completely successful with BI almost twice as likely to rate data science as critical,” said Jim Ericson, vice president and research director at Dresner Advisory. “The perceived level of data literacy also correlates directly and positively with the current or likely future use of data science and machine learning in 2021.”

Key insights from the study include the following:

  • 59% of large enterprises are deploying data science and machine learning in production today.  Enterprises with 10K employees or more lead all others in adopting and using DS and ML techniques, most often in R&D and Business Intelligence Competency Center (BICC)-related work. Large-scale enterprises often rely on DS and ML to identify how internal processes and workflows can be streamlined and made more cost-efficient. For example, the CEO of a manufacturing company explained on a recent conference call that DS and ML pilots bring much-needed visibility and control across multiple plants and help troubleshoot inventory management and supply chain allocation problems.
  • The importance of data science and ML to enterprises has nearly tripled since 2014, jumping from 25% to 70% in 2021. The Dresner study notes that a record share of enterprises see data science and ML as critically important to their business in 2021. Furthermore, 90% of enterprises consider these technologies essential to their operations, rating them critically important or very important. Successful projects in Business Intelligence Competency Centers (BICC) and R&D helped data science and ML gain broad adoption across all organizations. Larger-scale enterprises with over 10K employees are successfully scaling data science and ML to improve visibility, control, and profitability in organizations today.
  • Enterprises dominate the recruiting and retention of data science and machine learning talent. Large-scale enterprises with over 10K employees are the most likely to have BI experts and data scientists/statisticians on staff. In addition, large-scale enterprises lead hiring and retention in seven of the nine roles included in the survey. The Business Intelligence (BI) expertise of professionals in these roles is helping remove the roadblocks to getting more business value from data science and machine learning. Enterprises are learning how to scale data science and ML models to take on problems that were too complex to solve with analytics or BI alone.
  • 80% of DS and ML respondents most want model lifecycle management, model performance monitoring, model version control, and model lineage and history at a minimum. Keeping track of the state of each model, including version control, is a challenge for nearly all organizations adopting ML today. Enterprises reach ML scale when they can manage ML models across their lifecycles using an automated system. The next four most popular features of model rollback, searchable model repository, collaborative model co-creation tools, and model registration and certification are consistent with the feedback from Data Science teams on what they need most in an ML platform.
  • Financial Services prioritize model lifecycle management and model performance monitoring to achieve greater scale from the tens of thousands of models they’re using today. Consistent with other research that tracks ML adoption by industry, the Dresner study found that Financial Services leads all other industries in their need for the two most valuable features of ML platforms, model lifecycle management and model performance monitoring. Retail and Wholesale are reinventing their business models in real-time to become more virtual while also providing greater real-time visibility across supply chains. ML models in these two industries need automated model version control, model lineage and history, model rollback, collaborative model co-creation tools, and model registration and certification. In addition, Retailers and Wholesalers are doubling down on data science and machine learning to support new digital businesses, improve supply chain performance and increase productivity.
  • Enterprises need support for their expanding range of regression models, text analytics functions, and ensemble learning. Over the last seven years, the popularity of text analytics functions and sentiment analysis has grown continually. Martech vendors and the marketing technologists driving the market are increasing sentiment analysis’ practicality and importance. Recommendation engines and geospatial analysis are also experiencing greater adoption due to martech changing the nature of customer- and market-driven analysis and predictive modeling.
  • R, TensorFlow, and PyTorch are considered the three most critical open-source statistical and machine learning frameworks in 2021. Nearly 70% of respondents consider R important to getting work done in data science and ML. The R language has established itself as an industry standard and is well-respected across DevOps and IT teams in financial services, professional services, consulting, process, and discrete manufacturing. TensorFlow and PyTorch are considered important by the majority of organizations Dresner’s research team interviewed. They’re also among the most in-demand ML frameworks today, and candidates with experience in all three are being actively recruited.
  • Data literacy predicts DS and ML program success rates. 64% of organizations say they have extremely high data literacy rates, suggesting that DS and ML have reached mainstream adoption thanks in part to earlier investments in BI literacy. Enterprises that prioritize data literacy by providing training, certification, and ongoing education increase success odds with ML. A bonus is that employees will have a chance to learn marketable skills they can use in their current and future positions. Investing in training to improve data literacy is a win/win.
  • On-database analytics and in-memory analytics (both 91%), and multi-tenant cloud services (88%) are the three most popular technologies enterprises rely on for greater scalability. Dresner’s research team observes that scaling data science and machine learning often involves multiple requirements at once: high data volumes, large numbers of users, and data variety, all while supporting analytic throughput. Apache Spark support continues to grow in enterprises and is the fourth-most relied-on industry support for ML scalability.

Gartner Predicts Public Cloud Services Market Will Reach $397.4B by 2022

  • Worldwide end-user spending on public cloud services is forecast to grow 23.1% in 2021 to total $332.3 billion, up from $270 billion in 2020.
  • Gartner predicts worldwide end-user spending on public cloud services will jump from $242.6B in 2019 to $692.1B in 2025, attaining a 16.1% Compound Annual Growth Rate (CAGR).
  • Spending on SaaS cloud services is predicted to reach $122.6B this year, 19.3% growth over 2020, and to grow to $145.3B in 2022.
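Figures like the 16.1% CAGR above can be sanity-checked from the endpoint values alone. A minimal sketch in Python; the seven-period compounding window is an assumption on my part (Gartner doesn't state it explicitly), chosen because it reproduces the cited rate:

```python
def cagr(start_value: float, end_value: float, periods: int) -> float:
    """Compound Annual Growth Rate: the constant yearly rate that
    grows start_value into end_value over `periods` years."""
    return (end_value / start_value) ** (1 / periods) - 1

# Gartner's end-user spending endpoints from the bullets above ($B).
# Assumption: a seven-period window, which lands close to the ~16.1%
# CAGR cited; a strict six-year 2019-2025 window gives a higher rate.
print(f"{cagr(242.6, 692.1, 7):.1%}")
```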

These and many other insights are from Gartner Forecasts Worldwide Public Cloud End-User Spending to Grow 23% in 2021.  The pandemic created the immediate need for virtual workforces and cloud resources to support them at scale, accelerating public cloud adoption in 2020 with momentum continuing this year. Containerization, virtualization, and edge computing have quickly become more mainstream and are driving additional cloud spending. Gartner notes that CIOs face continued pressures to scale infrastructure that supports moving complex workloads to the cloud and the demands of a hybrid workforce.

Key insights from Gartner’s latest forecast of public cloud end-user spending include the following:

  • 36% of all public cloud services revenue is from SaaS applications and services this year, projected to reach $122.6B with CRM being the dominant application category. Customer Experience and Relationship Management (CRM) is the largest SaaS segment, growing from $44.7B in 2019 to $99.7B in 2025, attaining a 12.14% CAGR. SaaS-based Enterprise Resource Planning (ERP) systems are the second most popular type of SaaS application, generating $15.7B in revenue in 2019. Gartner predicts SaaS-based ERP sales will reach $35.8B in 2025, attaining a CAGR of 12.42%.
  • Desktop as a Service (DaaS) is predicted to grow 67% in 2021, followed by Infrastructure-as-a-Service (IaaS) with a 38.5% jump in revenue. Platform-as-a-Service (PaaS) is the third-fastest growing area of public cloud services, projected to see a 28.3% jump in revenue this year. SaaS, the largest segment of public cloud spending at 36.9% this year, is forecast to grow 19.3% this year. The following graphic compares the growth rates of public cloud services between 2020 and 2021.  
  • In 2021, SaaS end-user spending will grow by $19.8B, creating a $122.6B market this year. IaaS end-user spending will increase by $22.7B, the largest revenue gain by a cloud service in 2021. PaaS will follow, with end-user spending increasing $13.1B this year. CIOs and the IT teams they lead are investing in public cloud infrastructure to better scale operations and support virtual teams. CIOs from financial services and manufacturing firms I’ve recently spoken with are accelerating cloud spending for three reasons: first, to create a more virtual organization that can scale; second, to extend the legacy systems’ data value by integrating their databases with new SaaS apps; and third, to address an urgent need to improve cloud cybersecurity.
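The year-over-year growth rates in these bullets follow directly from the spending deltas. For example, SaaS growing by $19.8B to reach $122.6B implies a 2020 base of $102.8B, a sketch of the arithmetic:

```python
def yoy_growth(delta: float, end_total: float) -> float:
    """Year-over-year growth rate implied by an absolute spending
    increase `delta` ($B) and the resulting end-of-year total ($B)."""
    base = end_total - delta  # prior-year spending
    return delta / base

# SaaS: +$19.8B to $122.6B, which recovers the ~19.3% growth rate
# Gartner forecasts for SaaS this year.
print(f"{yoy_growth(19.8, 122.6):.1%}")
```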

Conclusion

CIOs and the organizations they serve are prioritizing cloud infrastructure investment to better support virtual workforces, supply chains, and partners. The CIOs I’ve spoken with also focus on getting the most value out of legacy systems by integrating them with cloud infrastructure and apps. As a result, cloud infrastructure investment starting with IaaS is projected to see end-user spending increase from $82B this year to $223B in 2025, growing 38.5% this year alone. End-user spending on Database Management Systems is projected to lead all categories of PaaS through 2025, increasing from $31.2B this year to $84.8B in 2025. The following graphic compares cloud services forecasts and growth rates:

Which ERP Systems Are Most Popular With Their Users In 2021?

  • Sage Intacct, Oracle ERP Cloud, and Microsoft Dynamics 365 ERP are the three highest-rated ERP systems by their users.
  • 86% of Unit4 ERP users say their CRM system is the best of all vendors in the study. The survey-wide satisfaction rating for CRM is 73%, accentuating Unit4 ERP’s leadership in this area.
  • 85% of Ramco ERP Suite users say their ERP systems’ analytics and reporting is the best of all 22 vendors evaluated.

These and many other insights are from SoftwareReviews’ latest customer rankings published recently in their Enterprise Data Quadrant Report, Enterprise Resource Planning, April 2021. The report is based entirely on attitudinal data captured from verified owners of each ERP system reviewed. 1,179 customer reviews were completed, evaluating 22 vendors. SoftwareReviews is a division of the IT research and consulting firm Info-Tech Research Group. Their business model is based on providing research to enterprise buyers on subscription, alleviating dependence on vendor revenue, which helps them stay impartial in their many customer satisfaction studies. Key insights from the study include the following:

  • Sage Intacct, Oracle ERP Cloud, Microsoft Dynamics 365 ERP, Acumatica Cloud ERP, Unit4 ERP and FinancialForce ERP are most popular with their users.  SoftwareReviews found that these six ERP systems have the highest Net Emotional Footprint scores across all ERP vendors included in the study. The Net Emotional Footprint measures high-level user sentiment. It aggregates emotional response ratings across 25 questions, creating an indicator of overall user feeling toward the vendor and product. The following quadrant charts the results of the survey:
  • 80% of Acumatica Cloud ERP users say their system helps create more business value, leading all vendors on this attribute. How effective an ERP system is at adapting to support new business and revenue models while providing greater cost visibility is the essence of how they deliver business value. The category average for this attribute is 75%. Of the 22 vendors profiled, 12 have scores at the average level or above, indicating many ERP vendors are focusing on these areas to improve the business case of adopting their systems.
  • 86% of Sage Intacct ERP users say their system excels at ease of implementation, leading all vendors in the comparison by a wide margin. Implementing a new ERP system can be a costly and time-consuming process as it involves extensive training, change management, and integration. Ease of Implementation received a category score of 75% across the 22 vendors, indicating ERP vendors are doubling down on investments to improve this area. Just 11 of the 22 ERP vendors scored above the category average.

What Enterprises Need To Plan For In 2021 When It Comes To Endpoint Security


Bottom Line: Today’s largely distributed enterprises need to put endpoint security first in 2021, which includes closely managing every stage of the device lifecycle, from deployment to decommissioning, and ensuring all sensitive data remains protected.

There’s a looming paradox facing nearly every organization today: how to secure thousands of remote endpoints without physical access to the devices and without disrupting worker productivity. Whether the need is to retire hardware as part of downsizing or cost-cutting measures, or to equip virtual teams with newer equipment better suited to long-term work-from-home scenarios, this is one of the most pressing issues facing CISOs and CIOs today.

Wanting to learn more about how their customers are tackling endpoint security challenges and how their companies are helping to solve them, I sat down (virtually) with Absolute Software’s President and CEO Christy Wyatt and Matthew Zielinski, President of North America Intelligent Devices Group at Lenovo. The following is my interview with both of them:

Louis Columbus: Christy and Matt, thanks so much for your time today. To get started, I would like each of you to share what you’re hearing from your customers regarding their plans to refresh laptops and other endpoint devices in 2021.

Christy Wyatt: We’re seeing a strong desire from organizations to ensure that every individual is digitally enabled, and has access to a screen. In some cases, that means refreshing the hardware they already have in the field, and in other cases, that means buying or adding devices. From the endpoint security standpoint, there’s been a shift in focus around which tools matter the most. When laptops were primarily being used on campus, there was a certain set of solutions to monitor those devices and ensure they remained secure. Now that 90% of devices are out of the building, an entirely different set of capabilities is required – and delivering those has been our focus.

Matt Zielinski: We are seeing historic levels of demand from consumers, as many are transitioning from having maybe one or two devices per household to at least one device per person. We’re also seeing the same levels of demand on both the education and enterprise side. The new dynamic of work-from-anywhere, learn-from-anywhere, collaborate-from-anywhere underscores that the device hardware and software need to be current in order to support both the productivity and security needs of hugely distributed workforces. That’s our highest priority.

Louis:  Where are CISOs in their understanding, evaluation, and adoption of endpoint security technologies?

Christy: The journey has been different for the education market than for the enterprise market. Most enterprise organizations were already on the digital path, with some percentage of their population already working remotely. And because of this, they typically have a more complex security stack to manage; our data shows that the total number of unique applications and versions installed on enterprise devices is nearly 1.5 million. What they’ve seen is a trifecta of vulnerabilities: employees taking data home with them, accessing it on unsecured connections, and not being aware of how their devices are protected beyond the WiFi connection and the network traffic.

In the education space, the challenges – and the amount of complexity – are completely different; they’re managing just a small fraction of that total number of apps and versions. That said, as the pandemic unfolded, education was hit harder because they were not yet at a point where every individual was digitally connected. There was a lot of reliance on being on campus, or being in a classroom. So, schools had to tackle digital and mobile transformation at the same time – and to their credit, they made multiple years of progress in a matter of weeks or months. This rapid rate of change will have a profound effect on how schools approach technology deployments going forward.

Matt: Whether in enterprise or education, our customers are looking to protect three things: their assets, their data, and their users’ productivity. It’s a daunting mission. But, the simplest way to accomplish it is to recognize the main control point has changed. It’s no longer the server sitting behind the firewall of your company’s or school’s IT environment. The vulnerability of the endpoint is that the network is now in the user’s hands; the edge is now the primary attack surface. I think CISOs realize this, and they are asking the right questions… I just don’t know if everyone understands the magnitude or the scale of the challenge. Because the problem is so critical, though, people are taking the time to make the right decisions and identify all the various components needed to be successful.

Louis:   It seems like completing a laptop refresh during the conditions of a pandemic could be especially challenging, given how entire IT teams are remote. What do you anticipate will be the most challenging aspects of completing a hardware refresh this year (2021)?

Matt:  The PC has always been a critical device for productivity. But now, without access to that technology, you are completely paralyzed; you can’t collaborate, you can’t engage, you can’t connect. Lenovo has always been focused on pushing intelligent transformation as far as possible to get the best devices into the hands of our customers. Beyond designing and building the device, we have the ability to distribute asset tags and to provide a 24/7 help desk for our customers whether you’re a consumer, a school, or a large institution. We can also decommission those devices at the end, so we’re able to support the entire journey or lifecycle.

The question has really become, how do you deliver secure devices to the masses? And, we’re fully equipped to do that. For example, every Lenovo X1 Carbon laptop comes out of the box with Lenovo Security Assurance, which is actually powered by Absolute; it is in our hardware. Our customers can open a Lenovo PC, and know that it is completely secure, right out of the box. Every one of our laptops is fortified with Absolute’s Persistence technology and self-healing capabilities that live in the BIOS. It’s that unbreakable, secure connection that makes it possible for us to serve our customers throughout the entire lifecycle of device ownership.

Louis: Why are the legacy approaches to decommissioning assets falling short / failing today? How would you redesign IT asset-decommissioning approaches to make them more automated, less dependent on centralized IT teams?

Christy: There have been a few very visible cases over the past year of highly regulated organizations experiencing vulnerabilities because of how they decommissioned, or did not properly decommission, their assets. But, I don’t want anyone to believe that this is a problem that is unique to regulated industries, like financial services. The move to the cloud has given many organizations a false sense of security, and it seems that the more data running in the cloud, the more pronounced this false sense of security becomes. It’s a mistaken assumption to think that when hardware goes missing, the security problem is solved by shutting down password access and that all the data is protected because it is stored in the cloud. That’s just not true. When devices aren’t calling in anymore, it’s a major vulnerability, and the longer the device sits without being properly wiped or decommissioned, the greater the opportunity for bad actors to take advantage of those assets.

The other piece that should be top of mind is that once a device is decommissioned, it’s often sold. We want to ensure that nothing on that device gets passed on to the next owner, especially if it’s going to a service or leasing program. So, we’ve concentrated on making asset decommissioning as precise as possible and something that can be done at scale, anytime and anywhere.

Matt:  Historically, reclaiming and decommissioning devices has required physical interaction. The pandemic has limited face-to-face encounters, so we’re leveraging many different software solutions to give our customers the ability to wipe the device clean if they aren’t able to get the asset back in their possession, so that at least they know it is secure. Since we’re all now distributed, we’re looking at several different solutions that will help with decommissioning, several of which are promising and scale well given today’s constraints. Our goal is to provide our enterprise customers with decommissioning flexibility, from ten units to several thousand.

Louis:  Paradoxically, having everyone remote has made the business case for improving endpoint security more compelling too. What do you hear from enterprises about accelerating digital transformation initiatives that include the latest-generation endpoint devices?

Christy:  The same acceleration that I spoke about on the education side, we absolutely see on the enterprise side as well, and with rapid transformation comes increased complexity. There has been a lot of conversation about moving to Zero Trust, moving more services to the cloud and putting more controls on the endpoint – and not having these sort of layers in between. Our data tells us that the average enterprise device today has 96 unique applications, and at least 10 of them are security applications. That is a massive amount of complexity to manage. So, we don’t believe that adding more controls to the endpoint is the answer; we believe that what’s most important is knowing the security controls you have are actually working. And we need to help devices and applications become more intelligent, self-aware, and capable of fixing themselves. This concept of resiliency is the cornerstone of effective endpoint security, and a critical part of the shift to a more modern security architecture.

Matt: I think there are two major forcing functions: connection and security. Because we are all now remote, there’s a huge desire to feel connected to one another even though we aren’t sitting in the same room together. We’re modifying our products in real-time with the goal of removing shared pain points and optimizing for the new reality in which we’re all living and working. Things like microphone noise suppression and multiple far-field microphones mean that if the dog barks or kids run into a room, the system will mute before you’ve even pressed the mute button. We’re improving camera technology from a processing standpoint to make things look better. Ultimately, our goal is to provide an immersive and connected experience.

Security, however, transcends specific features that deliver customer experiences – security is the experience. The features that make hardware more secure are those that lie beneath the operating system, in the firmware. That is why we have such a deep network of partners, including Absolute. Because you need to have a full ecosystem, and a program that takes advantage of all the best capabilities, in order to deliver the best security solution possible.

Louis: How is Absolute helping enterprise customers ensure greater endpoint security and resiliency in 2021 and beyond?

Christy: We spend a lot of time sitting with customers to understand their needs and how and where we can extend our endpoint security solutions to fit. We believe in taking a layered approach – which is the framework for defense in-depth, and an effective endpoint security strategy. The foundational piece, which we are able to deliver, is a permanent digital tether to every device; this is the lifeline. Not having an undeletable connection to every endpoint means you have a very large security gap, which must be closed fast. A layered, persistence-driven approach ensures our customers know their security controls are actually working and delivering business value. It enables our customers to pinpoint where a vulnerability is and take quick action to mitigate it.

Lenovo’s unique, high value-add approach to integrated security has helped drive innovation at Absolute while providing Lenovo customers the strongest endpoint security possible. Their multilayer approach to their endpoint strategy capitalizes on Absolute’s many BIOS-level strengths to help their customers secure every endpoint they have. As our companies work together, we are both benefitting from a collaboration that seeks to strengthen and enrich all layers of endpoint security. Best of all, our shared customers are the beneficiaries of this collaboration and the results we are driving at the forefront of endpoint security.

Louis:  How has the heightened focus on enterprise cybersecurity in general, and endpoint security specifically, influenced Lenovo’s product strategy in 2021 and beyond?

Matt:  We have always been focused on our unique cybersecurity strengths from the device side and making sure we have all of the control points in manufacturing to ensure we build a secure platform. So, we’ve had to be open-minded about endpoint security, and diligent in envisioning how potential vulnerabilities and attack strategies can be thwarted before they impact our customers. Because of this mindset, we’re fortunate to have a very active partner community. We’re always scouring the earth for the next hot cybersecurity technology and potential partner with unique capabilities and the ability to scale with our model. This is a key reason we’ve standardized on Absolute for endpoint security, as it can accommodate a wide breadth of deployment scenarios. It’s a constant and very iterative process with a team of very smart people constantly looking at how we can excel at cybersecurity. It is this strategy that is driving us to fortify our Lenovo Security Assurance architecture over the long-term, while also seeking new ways of providing insights from existing and potentially new security applications.

Louis: What advice are you giving CISOs to strengthen endpoint security in 2021 and beyond?

Christy: One of our advisors is the former Global Head of Information Security at Citigroup, and former CISO of JP Morgan and Deutsche Bank. He talks a lot about his shared experiences of enabling business operations while defending organizations from ever-evolving threats, and the question that more IT and security leaders need to be asking, which is, “Is it working?” Included in his expert opinion is that cybersecurity needs to be integral to business strategy, and endpoint security is essential for creating a broader secure ecosystem that can adapt as a company’s needs change.

I believe there needs to be more boardroom-level conversations around how compliance frameworks can be best used to achieve a balance between cybersecurity and business operations. A big part of that is identifying resiliency as a critical KPI for measuring the strength of endpoint controls.


COVID-19’s Impact On Tech Spending This Year


The human tragedy the COVID-19 pandemic has inflicted on the world is incalculable and continues to grow. Every human life is priceless and deserves the care needed to sustain it. COVID-19 is also impacting entire industries, causing them to gyrate in unpredictable ways, directly impacting IT and tech spending.

COVID-19’s Impact On Industries

Computer Economics, in collaboration with its parent company Avasant, published their Coronavirus Impact Index by Industry, which looks at how COVID-19 is affecting 11 major industry sectors in four dimensions: personnel, operations, supply chain, and revenue. Please see the Coronavirus Impact Index by Industry by Tom Dunlap, Dave Wagner, and Frank Scavo of Computer Economics for additional information and analysis. The resulting index is an overall rating of the impact of the pandemic on each industry and is shown below:

Computer Economics and Avasant predict major disruption to High Tech & Telecommunications based on the industry’s heavy reliance on Chinese supply chains, which were severely impacted by COVID-19. Based on conversations with U.S.-based high tech manufacturers, I’ve learned that a few are struggling to make deliveries to leading department stores and discount chains due to parts shortages and allocations from their Chinese suppliers. North American electronics suppliers aren’t an option due to their prices being higher than their Chinese competitors. Leading department stores and discount chains openly encourage high tech device manufacturers to compete with each other on supplier availability and delivery date performance.

In contrast to the parts shortage and unpredictability of supply chains dragging down the industry, software is a growth catalyst. The study notes that Zoom, Slack, GoToMyPC, Zoho Remotely, Microsoft Office365, Atlassian, and others are already seeing increased demand as companies increase their remote-working capabilities.

COVID-19’s Impact On IT Spending  

Further supporting the Coronavirus Impact Index by Industry analysis, Andrew Bartels, VP & Principal Analyst at Forrester, published his latest forecast of tech growth today in the post, The Odds of a Tech Market Decline In 2020 Have Just Gone Up To 50%.

Mr. Bartels is referencing the market forecasts published last month in New Forrester Forecast Shows That Global Tech Market Growth Will Slip To 3% In 2020 And 2021 and shown below:

Key insights from Forrester’s latest IT spending forecast and predictions are shown below:

  • Forrester is revising its tech forecast downward, predicting the US and global tech market growth slowing to around 2% in 2020. Mr. Bartels mentions that this assumes the US and other major economies have declined in the first half of 2020 but manage to recover in the second half.
  • If a full-fledged recession hits, there is a 50% probability that US and global tech markets will decline by 2% or more in 2020.
  • In either a second-half 2020 recovery or recession, Forrester predicts computer and communications equipment spending will be weakest, with potential declines of 5% to 10%.
  • Tech consulting and systems integration services spending will be flat in a temporary slowdown and could be down by up to 5% if firms cut back on new tech projects.
  • Software spending growth will slow to the 2% to 4% range in the best case and will post no growth in the worst case of a recession.
  • The only positive signs in Forrester’s latest IT spending forecast are continued growth in demand for cloud infrastructure services and potential increases in spending on specialized software. Forrester also predicts increased spending on communications equipment and telecom services for remote work and education as organizations encourage employees to work from home and schools move courses online.

Conclusion

Every industry is already hurting economically from the COVID-19 pandemic. Now is the time for enterprise software providers to go the extra mile for their customers across all industries and help them recover and grow again. Strengthening customers in their time of need by freely providing remote collaboration tools, secure endpoint solutions, cloud-based storage, and CRM systems is an investment in the community that every software company needs to make to get through this pandemic too.

CIO’s Guide To Stopping Privileged Access Abuse – Part 2

Why CIOs Are Prioritizing Privileged Credential Abuse Now

Enterprise security approaches based on Zero Trust continue to gain more mindshare as organizations examine their strategic priorities. CIOs and senior management teams are most focused on securing infrastructure, DevOps, cloud, containers, and Big Data projects to stop the leading cause of breaches, which is privileged access abuse.

Based on insights gained from advisory sessions with CIOs and senior management teams, Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. In another survey completed by Centrify, 74% of IT decision makers whose organizations have been breached in the past say the breach involved privileged access abuse. Furthermore, 65% of organizations are still sharing root or privileged access to systems and data at least somewhat often. Centrify’s survey, Privileged Access Management in the Modern Threatscape, is downloadable here.

The following are the key reasons why CIOs are prioritizing privileged access management now:

  • Identities are the new security perimeter for any business, making privileged access abuse the greatest challenge CIOs face in keeping their businesses secure and growing. Gartner also sees privileged credential abuse as the greatest threat to organizations today, and made Privileged Account Management one of the Gartner Top 10 Security Projects for 2018, and again in 2019. Forrester’s and Gartner’s findings and predictions reflect the growing complexity of the threatscapes every CIO must protect their business against while still enabling new business growth. Banking, financial services, and insurance (BFSI) CIOs often remark in my conversations with them that the attack surfaces in their organizations are proliferating at a pace that quickly scales beyond any “trust but verify” legacy approach to managing access. They need to provide applications, IoT-enabled devices, machines, cloud services, and human users access to a broader base of business units than ever before.
  • CIOs are grappling with the paradox of protecting a rapidly expanding variety of attack surfaces from breaches while still providing immediate access to the applications, systems, and services that support their business’ growth. CIOs I’ve met with also told me access to secured resources needs to happen in milliseconds, especially to support the development of new banking, financial services, and insurance applications in beta testing today and scheduled to launch this summer. Their organizations’ development teams expect more intuitive, secure, and easily accessible applications than ever before, which is driving CIOs to prioritize privileged access management now.
  • Adapting and risk-scoring every access attempt in real-time is key to customer experiences on new services and applications, starting with response times. CIOs need a security strategy that can flex or adapt to risk contexts in real-time, assessing every access attempt across every threat surface and generating a risk score in milliseconds. The CIOs I’ve met with regularly see a “never trust, always verify, enforce least privilege” approach to security as the future of how they’ll protect every threat surface from privileged access abuse. Each of their development teams is on a tight deadline to get new services launched to drive revenue in Q3. Designing in Zero Trust with a strong focus on Zero Trust Privilege is saving valuable development time now and enabling faster authentication times for the apps and services in testing today.
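The adaptive, risk-scored access these CIOs describe can be sketched in a few lines of code. The signals, weights, and thresholds below are illustrative assumptions, not any vendor's actual scoring model:

```python
# Illustrative sketch of risk-scoring an access attempt in real time.
# Signal names, weights, and thresholds are hypothetical assumptions.

def risk_score(attempt: dict) -> float:
    """Return a 0.0-1.0 risk score for a single access attempt."""
    score = 0.0
    if attempt.get("new_device"):          # unrecognized endpoint
        score += 0.3
    if attempt.get("unusual_location"):    # geo anomaly vs. user history
        score += 0.3
    if attempt.get("privileged_target"):   # root, vault, or prod system
        score += 0.2
    if attempt.get("off_hours"):           # outside normal work window
        score += 0.2
    return min(score, 1.0)

def decide(attempt: dict) -> str:
    """Map the score to an action: allow, step-up MFA, or deny."""
    score = risk_score(attempt)
    if score < 0.3:
        return "allow"
    if score < 0.7:
        return "mfa_challenge"   # step-up authentication
    return "deny"

print(decide({"new_device": True, "privileged_target": True}))  # mfa_challenge
```

In a real deployment the scoring would draw on behavioral baselines rather than fixed weights, but the shape is the same: every request is scored, and only low-risk requests pass without an additional challenge.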

Strategies For Stopping Privileged Credential Abuse – Part 2  

Recently I wrote a CIO’s Guide To Stopping Privileged Access Abuse – Part 1 detailing five recommended strategies for CIOs on how to stop privileged credential abuse. The first five strategies focus on the following: discovering and inventorying all privileged accounts; vaulting all cloud platforms’ Root Accounts; auditing privileged sessions and analyzing patterns to find privileged credential sharing not found during audits; enforcing least privilege access now within your existing infrastructure as much as possible; and adopting multi-factor authentication (MFA) across all threat surfaces that can adapt and flex to the risk context of every request for resources.

The following are the second set of strategies CIOs need to prioritize to further protect their organizations from privileged access abuse:

  1. After completing an inventory of privileged accounts, create a taxonomy of them by assigning users to each class or category, personalizing privileged credential access to the role and entitlement level of each. CIOs tell me this is a major time saver in scaling their Privileged Access Management (PAM) strategies. Assigning every human, machine, and sensor-based identity to a category is the goal, with the overarching objective being the creation of a Zero Trust-based enterprise security strategy. Recommended initial classes or categories include IT administrators who are also responsible for endpoint security; developers who require occasional access to production instances; service desk teams and service operations; the Project Management Office (PMO) and project IT; and external contractors and consultants.
  2. For each category in the taxonomy, automate the time, duration, scope, resources, and entitlements of privileged access, focusing on the estimated time to complete each typical task. Defining a governance structure that provides real-time access to resources based on successful authentication is a must-have for protecting privileged access credentials. By starting with the attributes of time, duration, scope, and entitlements, organizations have a head start on creating a separation of duties (SoD) model. Separation of duties is essential for ensuring that privileged user accounts don’t have the opportunity to carry out and conceal any illegal or unauthorized activities.
  3. Using the taxonomy of user accounts created and hardened using the separation of duties model, automate privileged access and approval workflows for enterprise systems. Instead of having administrators approve or semi-automate the evaluation of every human- and machine-based request for access, consider automating the process with a request and approval workflow. With the time, duration, scope, and properties of privileged access already defined, human- and machine-based requests for access to IT systems and services are streamlined, saving hundreds of hours a year and providing a real-time log for audit and data analysis later.
  4. Break-glass, emergency or firecall account passwords need to be vaulted, with no exceptions. When there’s a crisis of any kind, the seconds it takes to get a password could mean the difference between cloud instances and entire systems being inaccessible or not. That’s why administrators too often secure root passwords to all systems, cloud platforms and containers included, only manually. This is the equivalent of leaving the front door of the data center open with all systems unlocked. The recent Centrify survey found that just 48% of organizations interviewed have a password vault; the other 52% are leaving the keys to the kingdom available for hackers to walk through the front door of the data center and exfiltrate data whenever they want.
  5. Continuous delivery and deployment platforms, including Ansible, Chef, Puppet, and others, need to be configured when first installed to eliminate the potential for privileged access abuse. The CIOs whose teams are creating new apps and services are using Chef and Puppet to design and create workloads, with real-time integration needed with customer, pricing, and services databases and the systems they run on. Given how highly regulated insurance is, CIOs say they need logs that show activity down to the API level in case of an audit. The more regulated and audited a company is, the more its CIOs see trusted and untrusted domains as the past and Zero Trust as the future.
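Taken together, steps 1 through 3 amount to granting access by category, bounded by time and scope, with a separation-of-duties check on approval. A minimal sketch of that flow follows; the category names, scopes, durations, and SoD rule are illustrative assumptions, not any specific PAM product's behavior:

```python
# Sketch of time-bounded, taxonomy-driven privileged access grants.
# Categories, scopes, durations, and the SoD rule are illustrative only.
from datetime import datetime, timedelta

# Taxonomy: category -> (allowed resource scope, default grant duration)
TAXONOMY = {
    "it_admin":   ({"endpoints", "prod"}, timedelta(hours=4)),
    "developer":  ({"prod"},              timedelta(hours=1)),
    "contractor": ({"staging"},           timedelta(minutes=30)),
}

def request_access(category: str, resource: str, approver: str, requester: str) -> dict:
    """Return a time-bounded grant, enforcing scope and separation of duties."""
    if approver == requester:
        raise PermissionError("separation of duties: cannot self-approve")
    scope, duration = TAXONOMY[category]
    if resource not in scope:
        raise PermissionError(f"{category} is not entitled to {resource}")
    # Grant expires automatically; every grant is a loggable event for audit.
    expires = datetime.utcnow() + duration
    return {"requester": requester, "resource": resource, "expires": expires}

grant = request_access("developer", "prod", approver="alice", requester="bob")
print(grant["resource"])  # prod
```

Because every grant carries an expiry and an approver distinct from the requester, the log of grants doubles as the real-time audit trail step 3 calls for.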

Conclusion

The CIOs I regularly meet with from the banking, financial services, and insurance industries are under pressure to get new applications and services launched while protecting their business’ daily operations. With more application and services development happening in their IT teams, they’re focusing on how they can optimize the balance between security and speed. New apps, services, and the new customers they attract are creating a proliferation of new threat surfaces, making every new identity the new security perimeter.

Roundup Of Machine Learning Forecasts And Market Estimates, 2018

  • Machine learning patents grew at a 34% Compound Annual Growth Rate (CAGR) between 2013 and 2017, the third-fastest growing category of all patents granted.
  • International Data Corporation (IDC) forecasts that spending on AI and ML will grow from $12B in 2017 to $57.6B by 2021.
  • Deloitte Global predicts the number of machine learning pilots and implementations will double in 2018 compared to 2017, and double again by 2020.

These and many other fascinating insights are from the latest series of machine learning market forecasts, market estimates, and projections. Machine learning’s potential impact across many of the world’s most data-prolific industries continues to fuel venture capital investment, private equity (PE) funding, and mergers and acquisitions, all focused on winning the race for Intellectual Property (IP) and patents in this field. One of the fastest growing areas of machine learning IP is the development of custom chipsets. Deloitte Global is predicting up to 800K machine learning chips will be in use across global data centers this year. Enterprises are increasing their research, investment, and piloting of machine learning programs in 2018. And while the methodologies vary across the many sources of forecasts, market estimates, and projections, all reflect how machine learning is sharpening companies’ insights into how to grow faster and more profitably. Key takeaways from the collection of machine learning market forecasts, market estimates and projections include the following:

  • Within the Business Intelligence (BI) & analytics market, Data Science platforms that support machine learning are predicted to grow at a 13% CAGR through 2021. Data Science platforms will outperform the broader BI & analytics market, which is predicted to grow at an 8% CAGR in the same period. Data Science platforms will grow in value from $3B in 2017 to $4.8B in 2021. Source: An Investors’ Guide to Artificial Intelligence, J.P. Morgan. November 27, 2017 (110 pp., PDF, no opt-in).

  • Machine learning patents grew at a 34% Compound Annual Growth Rate (CAGR) between 2013 and 2017, the third-fastest growing category of all patents granted. IBM, Microsoft, Google, LinkedIn, Facebook, Intel, and Fujitsu were the seven biggest ML patent producers in 2017. Source: IFI Claims Patent Services (Patent Analytics) 8 Fastest Growing Technologies SlideShare Presentation.

  • 61% of organizations most frequently picked Machine Learning / Artificial Intelligence as their company’s most significant data initiative for next year. Of those respondent organizations indicating they actively use Machine Learning (ML) and Artificial Intelligence (AI), 58% indicated they run models in production. Source: 2018 Outlook: Machine Learning and Artificial Intelligence, A Survey of 1,600+ Data Professionals (14 pp., PDF, no opt-in).

  • Tech market leaders including Amazon, Apple, Google, Tesla, and Microsoft are leading their industry sectors by a wide margin in machine learning (ML) and AI investment. Each is designing ML into future-generation products and using ML and AI to improve customer experiences and improve the efficiency of selling channels. Source: Will You Embrace AI Fast Enough? AT Kearney, January 2018.

  • Deloitte Global predicts the number of machine learning pilots and implementations will double in 2018 compared to 2017, and double again by 2020. Factors driving the increasing pace of ML pilots include more pervasive support of Application Program Interfaces (APIs), automating data science tasks, reducing the need for training data, accelerating training and greater insight into explaining results. Source: Deloitte Global Predictions 2018 Infographics.

  • 60% of organizations at varying stages of machine learning adoption, with nearly half (45%) saying the technology has led to more extensive data analysis & insights. 35% can complete faster data analysis and increased the speed of insight, delivering greater acuity to their organizations. 35% are also finding that machine learning is enhancing their R&D capabilities for next-generation products. Source: Google & MIT Technology Review study: Machine Learning: The New Proving Ground for Competitive Advantage (10 pp., PDF, no opt-in).

  • McKinsey estimates that total annual external investment in AI was between $8B and $12B in 2016, with machine learning attracting nearly 60% of that investment. Robotics and speech recognition are two of the most popular investment areas. Investors most favor machine learning startups because of the speed with which code-based start-ups can scale up and add new features. Software-based machine learning startups are also preferred over their more cost-intensive, machine-based robotics counterparts, which can’t scale as quickly. As a result of these factors and more, corporate M&A is soaring in this area. The following graphic illustrates the distribution of external investments by category from the study. Source: McKinsey Global Institute Study, Artificial Intelligence, The Next Digital Frontier (80 pp., PDF, free, no opt-in).

  • Deloitte Global is predicting machine learning chips used in data centers will grow from a 100K to 200K run rate in 2016 to 800K this year. At least 25% of these will be Field Programmable Gate Arrays (FPGA) and Application Specific Integrated Circuits (ASICs). Deloitte found the Total Available Market (TAM) for Machine Learning (ML) Accelerator technologies could potentially reach $26B by 2020. Source: Deloitte Global Predictions 2018.

  • Amazon is relying on machine learning to improve customer experiences in key areas of their business including product recommendations, substitute product prediction, fraud detection, meta-data validation and knowledge acquisition. For additional details, please see the presentation, Machine Learning At Amazon, Amazon Web Services (47 pp., PDF no opt-in).

Sources of Market Data on Machine Learning:

2018 Outlook: Machine Learning and Artificial Intelligence, A Survey of 1,600+ Data Professionals. MEMSQL. (14 pp., PDF, no opt-in)

Advice for applying Machine Learning, Andrew Ng, Stanford University. (30 pp., PDF, no opt-in)

An Executive’s Guide to Machine Learning, McKinsey Quarterly. June 2015

An Investors’ Guide to Artificial Intelligence, J.P. Morgan. November 27, 2017 (110 pp., PDF, no opt-in)

Artificial intelligence and machine learning in financial services Market developments and financial stability implications, Financial Stability Board. (45 pp., PDF, no opt-in)

Big Data and AI Strategies Machine Learning and Alternative Data Approach to Investing, J.P. Morgan. (280 pp., PDF. No opt-in).

Google & MIT Technology Review study: Machine Learning: The New Proving Ground for Competitive Advantage (10 pp., PDF, no opt-in).

Hitting the accelerator: the next generation of machine-learning chips, Deloitte. (6 pp., PDF, no opt-in).

How Do Machines Learn? Algorithms are the Key to Machine Learning. Booz Allen Hamilton. (Infographic)

IBM Predicts Demand For Data Scientists Will Soar 28% By 2020, Forbes. May 13, 2017

Machine Learning At Amazon, Amazon Web Services (47 pp., PDF no opt-in).

Machine Learning Evolution (infographic). PwC. April 17, 2017

Machine learning: things are getting intense. Deloitte (6 pp., PDF, no opt-in)

Machine Learning: The Power and Promise Of Computers That Learn By Example. The Royal Society’s Machine Learning Project (128 pp., PDF, no opt-in)

McKinsey Global Institute Study, Artificial Intelligence, The Next Digital Frontier (80 pp., PDF, free, no opt-in)

McKinsey’s State Of Machine Learning And AI, 2017, Forbes, July 9, 2017

Predictions 2017: Artificial Intelligence Will Drive The Insights Revolution. Forrester, November 2, 2016 (9 pp., PDF, no opt-in)

Risks And Rewards: Scenarios around the economic impact of machine learning, The Economist Intelligence Unit. (80 pp., PDF, no opt-in)

Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? Digital/McKinsey & Company. (52 pp., PDF, no opt-in)

So What Is Machine Learning Anyway?  Business Insider. Nov. 23, 2017

The 10 Most Innovative Companies In AI/Machine Learning 2017, Wired

The Business Impact and Use Cases for Artificial Intelligence. Gartner (28 pp., PDF, no opt-in)

The Build-Or-Buy Dilemma In AI, Boston Consulting Group. January 4, 2018.

The Next Generation of Medicine: Artificial Intelligence and Machine Learning, TM Capital (25 pp., PDF, free, opt-in)

The Roadmap to Enterprise AI, Rage Networks Brief based on Gartner research. (17 pp., PDF, no opt-in)

Will You Embrace AI Fast Enough? AT Kearney. January 2018

 

By 2020 83% Of Enterprise Workloads Will Be In The Cloud

  • Digitally transforming enterprises (63%) is the leading factor driving greater public cloud engagement or adoption today.
  • 66% of IT professionals say security is their most significant concern in adopting an enterprise cloud computing strategy.
  • 50% of IT professionals believe artificial intelligence and machine learning are playing a role in cloud computing adoption today, growing to 67% by 2020.
  • Artificial Intelligence (AI) and Machine Learning will be the leading catalyst driving greater cloud computing adoption by 2020.

These insights and findings are from LogicMonitor’s Cloud Vision 2020: The Future of the Cloud Study (PDF, free, opt-in, 9 pp.). The survey is based on interviews with approximately 300 influencers conducted by LogicMonitor in November 2017. Respondents include Amazon Web Services (AWS) re:Invent 2017 attendees, industry analysts, media, consultants, and vendor strategists. The study’s primary goal is to explore the landscape for cloud services in 2020. While the study’s findings are not statistically significant, they do provide a fascinating glimpse into current and future enterprise cloud computing strategies.

Key takeaways include the following:

  • 83% Of Enterprise Workloads Will Be In The Cloud By 2020. LogicMonitor’s survey predicts that 41% of enterprise workloads will be run on public cloud platforms (Amazon AWS, Google Cloud Platform, IBM Cloud, Microsoft Azure, and others) by 2020. An additional 20% are predicted to be private-cloud-based, followed by another 22% running on hybrid cloud platforms by 2020. On-premise workloads are predicted to shrink from 37% today to 27% of all workloads by 2020.

  • Digitally transforming enterprises (63%) is the leading factor driving greater public cloud engagement or adoption, followed by the pursuit of IT agility (62%). LogicMonitor’s survey found that the many challenges enterprises face in digitally transforming their business models are the leading contributing factor to cloud computing adoption. Attaining IT agility (62%), excelling at DevOps (58%), mobility (55%), Artificial Intelligence (AI) and Machine Learning (50%) and Internet of Things (IoT) adoption (45%) round out the top six factors driving cloud adoption today. Artificial Intelligence (AI) and Machine Learning are predicted to be the leading factors driving greater cloud computing adoption by 2020.

  • 66% of IT professionals say security is their greatest concern in adopting an enterprise cloud computing strategy. Cloud platform and service providers will go on a buying spree in 2018 to strengthen and harden their platforms in this area. Verizon (NYSE:VZ) acquiring Niddel this week is just the beginning. Niddel’s Magnet software is a machine learning-based threat-hunting system that will be integrated into Verizon’s enterprise-class cloud services and systems. Additional concerns include attaining governance and compliance goals on cloud-based platforms (60%), overcoming the challenges of having staff that lacks cloud experience (58%), Privacy (57%) and vendor lock-in (47%).

  • Just 27% of respondents predict that by 2022, 95% of all workloads will run in the cloud. One in five respondents believes it will take ten years to reach that level of workload migration, and 13% of respondents don’t see this level of workload shift ever occurring. Based on conversations with CIOs and CEOs in the manufacturing and financial services industries, there will be a mix of workloads between on-premise and cloud for the foreseeable future. C-level executives evaluate shifting workloads based on each system’s contribution to new business models, cost, and revenue goals, in addition to accelerating time-to-market.

  • Microsoft Azure and Google Cloud Platform are predicted to gain market share versus Amazon AWS in the next three years, with AWS staying the clear market leader. The study found 42% of respondents are predicting Microsoft Azure will gain more market share by 2020. Google Cloud Platform is predicted to also gain ground according to 35% of the respondent base. AWS is predicted to extend its market dominance with 52% market share by 2020.
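The 83% headline figure above is simply the sum of the three cloud deployment models in LogicMonitor's forecast, which a few lines make explicit:

```python
# LogicMonitor's predicted 2020 enterprise workload mix, as cited above
# (percent of all enterprise workloads, by cloud deployment model).
cloud_shares_2020 = {"public": 41, "private": 20, "hybrid": 22}

cloud_total = sum(cloud_shares_2020.values())
print(cloud_total)  # 83 -- the "83% of workloads in the cloud" headline
```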

How Artificial Intelligence Is Revolutionizing Enterprise Software In 2017


  • 81% of IT leaders are currently investing in or planning to invest in Artificial Intelligence (AI).
  • Cowen predicts AI will drive user productivity to materially higher levels, with Microsoft at the forefront.
  • Digital Marketing/Marketing Automation, Salesforce Automation (CRM) and Data Analytics are the top three areas ripe for AI/ML adoption.
  • According to angel.co, there are 2,200+ Artificial Intelligence start-ups, and well over 50% have emerged in just the last two years.
  • Cowen sees Salesforce ($CRM), Adobe ($ADBE) and ServiceNow ($NOW) as well-positioned to deliver and monetize new AI-based application services.

These and many other fascinating insights are from the Cowen and Company Multi-Sector Equity Research study, Artificial Intelligence: Entering A Golden Age For Data Science (142 pp., PDF, client access required). The study is based on interviews with 146 leading AI researchers, entrepreneurs, and VC executives globally who are involved in the field of artificial intelligence and related technologies. Please see the Appendix of the study for a thorough overview of the methodology. This study isn’t representative of global AI, data engineering and machine learning (ML) adoption trends. It does, however, provide a glimpse into the current and future direction of AI, data engineering, and machine learning. Cowen finds the market is still nascent, with CIOs eager to invest in new AI-related initiatives. Time-to-market, customer messaging, product positioning and the value proposition of AI solutions will be critical factors for winning new project investments.

Key takeaways from the study include the following:

  • Digital Marketing/Marketing Automation, Salesforce Automation (CRM) and Data Analytics are the top three areas ripe for AI/ML adoption. Customer self-service, Enterprise Resource Planning (ERP), Human Resource Management (HRM) and E-Commerce are additional areas that have upside potential for AI/ML adoption. The following graphic provides an overview of the areas in software where Cowen found the greatest potential for AI/ML investment.


  • 81% of IT leaders are currently investing in or planning to invest in Artificial Intelligence (AI). Based on the study, CIOs have a new mandate to integrate AI into IT technology stacks. The study found that 43% are evaluating and doing a Proof of Concept (POC) and 38% are already live and planning to invest more.  The following graphic provides an overview of company readiness for machine learning and AI projects.


  • Market forecasts vary, but all consistently predict explosive growth. IDC predicts that the Cognitive Systems and AI market (including hardware & services) will grow from $8B in 2016 to $47B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 55%. This forecast includes $18B in software applications, $5B in software platforms, and $24B in services and hardware. IBM claims that Cognitive Computing is a $2T market, including $200B in healthcare/life sciences alone. Tractica forecasts direct and indirect applications of AI software to grow from $1.4B in 2016 to $59.8B by 2025, a 52% CAGR.


  • According to CBInsights, the number of financing transactions to AI start-ups increased 10x over the last six years, from 67 in 2011 to 698 in 2016. Accenture states that the total number of AI start-ups has increased 20-fold since 2011. The top verticals include FinTech, Healthcare, Transportation and Retail/e-Commerce. The following graphic provides an overview of the AI annual funding history from 2011 to 2016.


  • Algorithmic trading, image recognition/tagging, and patient data processing are predicted to be the top AI use cases by 2025. Tractica forecasts predictive maintenance and content distribution on social media will be the fourth and fifth highest revenue-producing AI use cases over the next eight years. The following graphic compares the top 10 use cases by projected global revenue.


  • Machine Learning is predicted to generate the most revenue and is attracting the most venture capital investment in all areas of AI. Venture Scanner found that ML has raised $3.5B to date (from 400+ companies), far ahead of the next category, Natural Language Processing, which has raised just over $1B to date (from 200+ companies). Venture Scanner believes that Machine Learning Applications and Machine Learning Platforms are two relatively early-stage markets that stand to see some of the greatest market disruptions.


  • Cowen predicts that an Intelligent App Stack will gain rapid adoption in enterprises as IT departments shift from system-of-record to system-of-intelligence apps, platforms, and priorities. The future of enterprise software is being defined by increasingly intelligent applications today, and this will accelerate in the future. Cowen predicts it will be commonplace for enterprise apps to have machine learning algorithms that can provide predictive insights across a broad base of scenarios encompassing a company’s entire value chain. The potential exists for enterprise apps to change selling and buying behavior, tailoring specific responses based on real-time data to optimize discounting, pricing, proposal and quoting decisions.


  • According to angel.co, there are 2,200+ Artificial Intelligence start-ups, and well over 50% have emerged in just the last two years. Machine Learning-based Applications and Deep Learning Neural Networks are experiencing the largest and widest amount of investment attention in the enterprise.
  • Accenture leverages machine learning in 40% of active Analytics engagements, and nearly 80% of proposed Analytics opportunities today. Cowen found that Accenture’s view is that they are in the early stages of AI technology adoption with their enterprise clients.  Accenture sees the AI market growing exponentially, reaching $400B in spending by 2020. Their customers have moved on from piloting and testing AI to reinventing their business strategies and models.

3 Ways To Improve Selling Results With SAP Integration


The more integrated the systems are supporting any selling strategy, the greater the chances sales will increase. That’s because accuracy, speed, and quality of every quote matter more than ever. Being able to strengthen every customer interaction with insight and intelligence often means the difference between successful upsells, cross-sells and the chance to bid and win new projects. Defining a roadmap to enrich selling strategies using SAP integration is delivering results across a variety of manufacturing and service industries today.

Getting more value out of the customer data locked in legacy SAP systems can improve selling results starting with existing sales cycles. Knowing what each customer purchased, when, at what price, and for which project or location is invaluable in accelerating sales cycles today. There are many ways to improve selling results using SAP integration, and the following are the top three based on conversations with SAP Architects, CIOs and IT Directors working with Sales Operations to improve selling results. These three approaches are generating more leads, closing more deals, leading to better selling decisions and improving sales productivity.

3 Ways SAP Integration Is Improving Selling Results

  1. Reducing and eliminating significant gaps in the Configure-Price-Quote (CPQ) process by integrating Salesforce and SAP systems improves selling and revenue results quickly. The following two illustrations compare how much time and revenue escape from the selling process. It's common to see companies lose at least 20% of their orders when they rely on manual approaches to handling quotes, pricing, and configurations. The greater the complexity of the deal, the greater the potential for lost revenue. The second graphic shows how greater system integration leads to lower costs to complete an order, cycle time reductions, order rework reductions, and lead times for entire orders dropping from 69 to 22 days.


  2. Having customer order history, pricing, discounts, and previously purchased bundles stored in SAP ERP systems integrated into Salesforce will drive better decisions on which customers are most likely to buy upsells, cross-sells, and new products, and when. Instead of relying only on current activity with a given customer, sales teams can analyze sales history to find potential purchasing trends and indications of who can sign off on deals in progress. Having real-time access to SAP data within Salesforce gives sales teams the most valuable competitive advantage there is: more time to focus on customers and closing deals. enosiX is taking a leadership role in the area of real-time SAP to Salesforce integration, enabling enterprises to sell and operate more effectively.
  3. Improving Sales Operations and Customer Service productivity by providing customer data in real time via Salesforce to support teams on a 24/7 basis worldwide. These two departments rely on customer data even more than sales does and need real-time access to it on a 24/7 basis, from any device, on a global scale. By integrating customer data held today in SAP ERP and related systems with Salesforce, Sales Operations and Customer Service will have visibility they've never had before. And that will translate into faster response times, higher customer satisfaction, and potentially more sales too.
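To make the upsell idea in the second point concrete, here is a minimal sketch in Python of how order history extracted from SAP might be joined with Salesforce account records to flag upsell candidates. The field names, data, and the "Service Plan" signal are illustrative assumptions, not actual SAP or Salesforce schemas or APIs.

```python
from datetime import date

# Order history as it might be extracted from an SAP ERP system (illustrative)
sap_orders = [
    {"customer_id": "C001", "product": "Base Unit", "amount": 12000, "order_date": date(2016, 3, 1)},
    {"customer_id": "C001", "product": "Service Plan", "amount": 3000, "order_date": date(2016, 9, 15)},
    {"customer_id": "C002", "product": "Base Unit", "amount": 11000, "order_date": date(2016, 5, 20)},
]

# Account records as they might appear in Salesforce (illustrative)
salesforce_accounts = [
    {"customer_id": "C001", "name": "Acme Manufacturing"},
    {"customer_id": "C002", "name": "Globex Services"},
]

def upsell_candidates(accounts, orders, min_spend=10000):
    """Flag accounts whose historical spend exceeds a threshold but who have
    never purchased a service plan -- one simple upsell signal derived from
    SAP order history surfaced inside Salesforce."""
    candidates = []
    for account in accounts:
        history = [o for o in orders if o["customer_id"] == account["customer_id"]]
        total = sum(o["amount"] for o in history)
        has_service_plan = any(o["product"] == "Service Plan" for o in history)
        if total >= min_spend and not has_service_plan:
            candidates.append({"name": account["name"], "total_spend": total})
    return candidates

print(upsell_candidates(salesforce_accounts, sap_orders))
# [{'name': 'Globex Services', 'total_spend': 11000}]
```

In practice the join would run against live SAP data rather than static records, but the principle is the same: history plus current account context yields a ranked list of who to call next.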

Additional Reading:

Accenture, Empowering Your Sales Force

Aberdeen Group, Configure-Price-Quote: Best-In-Class Deployments that Speed The Sale

Aberdeen Group, Configure/Price/Quote: Better, Faster Sales Deals Enabled

Aberdeen Group, Sales Enablement Advances In Configure/Price/Quote Solutions

Forbes, What’s Hot In CRM Applications, 2015: Why CPQ Continues To Accelerate

Forbes, Cloud-Based CPQ Continues To Be One Of The Hottest Enterprise Apps Of 2016

Forbes, Five Ways Cloud-Based CPQ Increases Sales Effectiveness And Drives Up CRM Adoption

The Sales Management Association, The Impact of Quoting Automation: Enabling the Sales Force, Optimizing Profits, and Improving Customer Engagement
