Bottom Line: Absolute’s 2020 Endpoint Resilience Report illustrates why the purpose of any cybersecurity program needs to be striking a balance between protecting an organization and keeping the business running, starting with secured endpoints.
Enterprises that have taken a blank-check approach to cybersecurity spending in the past are facing the stark reality that all that spending may have made them more vulnerable to attacks. While cybersecurity spending grew at a Compound Annual Growth Rate (CAGR) of 12% in 2018, Gartner’s latest projections predict a decline to only 7% CAGR through 2023. Nearly every CISO I’ve spoken with in the last three months says prioritizing cybersecurity programs by their ROI and contribution to the business is how funding gets done today.
Cybersecurity Has Always Been A Business Decision
Overcoming the paradox of keeping a business secure while fueling its growth is the essence of why cybersecurity is a business decision. Securing an entire enterprise completely is an unrealistic goal; balancing security and ongoing operations is an attainable one. CISOs speak of this paradox often, and of the need to better measure the effectiveness of their decisions.
This is why the findings from Absolute’s 2020 State of Endpoint Resilience Report are so timely given the shift to more spending accountability on cybersecurity programs. The report’s methodology is based on anonymized data from enterprise-specific subsets of nearly 8.5 million Absolute-enabled devices active across 12,000+ customer organizations in North America and Europe. Please see the last page of the study for additional details regarding the methodology.
Key insights from the study include the following:
More than one of every three enterprise devices had an Endpoint Protection (EP), client management or VPN application out of compliance, further exposing entire organizations to potential threats. More than 5% of enterprise devices were missing one or more of these critical controls altogether. Endpoints, encryption, VPN and client management are more, not less, fragile, despite millions of dollars being spent to protect them before the downturn. The following graphic illustrates how fragile endpoints are by showing average compliance rates alongside installation rates:
When cybersecurity spending isn’t being driven by a business case, endpoints become more complex, chaotic and nearly impossible to protect. Absolute’s survey reflects what happens when cybersecurity spending isn’t based on a solid business decision, often leading to multiple endpoint security agents. The survey found the typical organization has 10.2 endpoint agents on average, up from 9.8 last year. One of the most insightful series of findings in the study, and well worth a read, is the section on measuring Application Resilience. The study found that the resiliency of an application varies significantly based on what else it is paired with. Interestingly, same-vendor pairings don’t necessarily show higher average compliance rates than pairings from different vendors. The bottom line is that there’s no guarantee any combination of agents, whether sourced from a single vendor or from the most innovative vendors, will work seamlessly together and make an organization more secure. The following graphic explains this point:
60% of breaches can be linked to a vulnerability where a patch was available but not applied. When there’s a compelling business case to keep all machines current, patches get distributed and installed. When there isn’t, operating system patches are, on average, 95 days late. A count of the vulnerabilities addressed on Patch Tuesdays from February through May 2020 alone shows that the average Windows 10 enterprise device has hundreds of potential vulnerabilities without a fix applied, including four zero-day vulnerabilities. Absolute’s data shows that post-COVID-19, the average patch age has gone down slightly, driven by the business case of supporting an entirely remote workforce.
Organizations that had defined business cases for their cybersecurity programs are better able to adapt and to secure vulnerable endpoint devices, along with the sensitive data piling up on those devices as employees use them at home. Absolute’s study showed that the amount of sensitive data, such as Personally Identifiable Information (PII), Protected Health Information (PHI) and Personal Financial Information (PFI), identified on endpoints soared as the Covid-19 outbreak spread and devices went home for remote work. Without autonomous endpoints that have an unbreakable digital tether ensuring the health and security of the device, the chances of this kind of data being exposed grow, along with the potential for damages, compliance violations and more.
Absolute’s latest study on the state of endpoints amplifies what many CISOs and their teams are doing today. They’re prioritizing cybersecurity endpoint projects by ROI, looking to quantify agent effectiveness and moving beyond the myth that greater compliance is going to get them better security. The bottom line is that increasing cybersecurity spending is not going to make any business more secure; knowing the effectiveness of that spending will. Being capable of tracking how resilient and persistent every autonomous endpoint in an organization is makes defining the ROI of endpoint investments possible, which is what every CISO I’ve spoken with is focusing on this year.
One in 10 enterprises now use 10 or more AI applications; chatbots, process optimization, and fraud analysis lead a recent survey’s top use cases according to MMC Ventures.
83% of IT leaders say AI & ML is transforming customer engagement, and 69% say it is transforming their business according to Salesforce Research.
IDC predicts spending on AI systems will reach $97.9B in 2023.
AI pilots are progressing into production based on their combined contributions to improving customer experience, stabilizing and increasing revenues, and reducing costs. The most successful AI use cases contribute to all three areas and deliver measurable results. Of the many use cases where AI is delivering proven value in enterprises today, the ten areas discussed below are notable for the measurable results they are providing.
What each of these ten use cases has in common is the accuracy and efficiency with which they can analyze and recommend actions based on real-time monitoring of customer interactions, production, and service processes. Enterprises that get AI right the first time build the underlying data structures and frameworks to support the advanced analytics, machine learning, and AI techniques that show the best potential to deliver value. There are various frameworks available, with BMC’s Autonomous Digital Enterprise (ADE) encapsulating what enterprises need to scale out their AI pilots into production. What’s unique about BMC’s approach is its focus on delivering transcendent customer experiences by creating an ecosystem that uses technology to cater to every touchpoint on a customer’s journey, across any channel a customer chooses to interact with an enterprise on.
10 Areas Where AI Is Delivering Proven Value Today
These use cases have progressed from pilot to production across many of the world’s leading enterprises, making them great examples of where AI is delivering value today. The following are 10 areas where AI is delivering proven value in enterprises today:
Customer feedback systems lead all implementations of AI-based self-service platforms. That’s consistent with the discussions I’ve had with manufacturing CEOs who are committed to Voice of the Customer (VoC) programs that also fuel their new product development plans. The best-run manufacturers are using AI to gather customer feedback and to improve their configure-to-order product customization strategies. Mining contact center data while improving customer response times is already working on AI platforms today. Source: Forrester study, AI-Infused Contact Centers Optimize Customer Experience: Develop A Road Map Now For A Cognitive Contact Center.
McKinsey finds that AI is improving demand forecasting by reducing forecasting errors by 50% and reducing lost sales by 65% through better product availability. Supply chains are the lifeblood of any manufacturing business. McKinsey’s initial use case analysis is finding that AI can reduce costs related to transport and warehousing by 5% to 10%, and supply chain administration costs by 25% to 40%. With AI, overall inventory reductions of 20% to 50% are possible. Source: Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector?, McKinsey & Company.
The majority of CEOs and Chief Human Resource Officers (CHROs) globally plan to use more AI within three years, with the U.S. leading all other nations at 73%. Over 63% of all CEOs and CHROs interviewed say that new technologies have a positive impact overall on their operations. CEOs and CHROs introducing AI into their enterprises are doing an effective job at change management, as the majority of employees, 54%, are less concerned about AI now that they see its benefits. C-level executives who are upskilling their employees by enabling them to have stronger digital dexterity skills stand a better chance of winning the war for talent. Source: Harris Interactive, in collaboration with Eightfold, Talent Intelligence And Management Report 2019-2020.
AI is the foundation of the next generation of logistics technologies, with the most significant gains being made with advanced resource scheduling systems. AI-based techniques are the foundation of a broad spectrum of next-generation logistics and supply chain technologies now under development. The most significant gains are being made where AI can contribute to solving complex constraints, cost, and delivery problems manufacturers are facing today. For example, AI is providing insights into where automation can deliver the most significant scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019. By Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus.
AI sees the most significant adoption by marketers working in $500M to $1B companies, with conversational AI for customer service as the most dominant use case. Businesses with between $500M and $1B in revenue lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25M or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are the leaders in AI spending, at 38.1%, to improve marketing ROI by optimizing marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte, and American Marketing Association. (71 pp., PDF, free, no opt-in).
A semiconductor manufacturer is combining smart, connected machines with AI to improve yield rates by 30% or more, while also optimizing fab operations and streamlining the entire production process. They’ve also been able to reduce supply chain forecasting errors by 50% and lost sales by 65% by having more accurate product availability, both attributable to insights gained from AI. They’re also automating quality testing using machine learning, increasing defect detection rates up to 90%. These are the kind of measurable results manufacturers look for when deciding if a new technology is going to deliver results or not. These and many other findings from the semiconductor manufacturer’s interviews with McKinsey are in the study, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector?. The following graphic from the study illustrates the many ways AI and machine learning are improving semiconductor manufacturing.
AI is making it possible to create propensity models by persona, and they are invaluable for predicting which customers will act on a bundling or pricing offer. By definition, propensity models rely on predictive analytics, including machine learning, to predict the probability a given customer will act on a bundling or pricing offer, e-mail campaign or other call-to-action leading to a purchase, upsell or cross-sell. Propensity models have proven to be very effective at increasing customer retention and reducing churn. Every business excelling at omnichannel today relies on propensity models to better predict how customers’ preferences and past behavior will lead to future purchases. The following is a dashboard that shows how propensity models work. Source: the customer propensities dashboard is from TIBCO.
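To make the idea of a propensity model concrete, here is a minimal Python sketch of ranking customers by their probability of acting on an offer. The features and weights below are invented for illustration; in practice the weights would be learned from historical response data with a method such as logistic regression rather than hand-assigned.

```python
import math

# Hypothetical feature weights; a real model would fit these from
# past campaign outcomes (who did and didn't act on the offer).
WEIGHTS = {"recency_days": -0.03, "orders_last_year": 0.25, "opened_last_email": 0.9}
BIAS = -1.0

def propensity(customer: dict) -> float:
    """Return the modeled probability (0-1) that the customer acts on the offer."""
    z = BIAS + sum(WEIGHTS[k] * customer[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic function

customers = [
    {"id": "A", "recency_days": 5, "orders_last_year": 12, "opened_last_email": 1},
    {"id": "B", "recency_days": 180, "orders_last_year": 1, "opened_last_email": 0},
]

# Rank customers so the bundling or pricing offer reaches the
# likeliest responders first.
ranked = sorted(customers, key=propensity, reverse=True)
print([c["id"] for c in ranked])  # ['A', 'B']
```

The same scoring shape extends naturally to per-persona models: fit one set of weights per customer segment instead of one global set.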
AI is reducing logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings. BCG recently looked at how a decentralized supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration, when blockchain is used to share data in real-time across a supplier network, combined with better analytics insight, cost savings of $6M a year are achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra.
Detecting and acting on inconsistent supplier quality levels and deliveries using AI-based applications is reducing the cost of bad quality across electronic, high-tech, and discrete manufacturing. Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they’re facing today is suppliers’ lack of consistent quality and delivery performance. Using AI, manufacturers can discover quickly who their best and worst suppliers are, and which production centers are most accurate in catching errors. Manufacturers are using dashboards much like the one below for applying machine learning to supplier quality, delivery, and consistency challenges. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour.
Optimizing Shop Floor Operations with Real-Time Monitoring and AI is in production at Hitachi today. Combining real-time monitoring and AI to optimize shop floor operations, providing insights into machine-level loads and production schedule performance, is now in production at Hitachi. Knowing in real-time how each machine’s load level impacts overall production schedule performance leads to better decisions managing each production run. Optimizing the best possible set of machines for a given production run is now possible using AI. Source: Factories of the Future: How Symbiotic Production Systems, Real-Time Production Monitoring, Edge Analytics, and AI Are Making Factories Intelligent and Agile, Youichi Nonaka, Senior Chief Researcher, Hitachi R&D Group and Sudhanshu Gaur Director, Global Center for Social Innovation Hitachi America R&D.
Online conversion rates increased 8.8% in February, reflecting a level of shopping urgency typically seen during Cyber Mondays, according to QuantumMetric.
Just over 306 million Americans are affected by stay-at-home orders, nearly 95% of the U.S. population. COVID-19 will forever change retailing, and its initial impact on e-Commerce is creating challenges to online selling and service no one imagined in January. The following graphic from COVID-19 Commerce Insight, an Emarsys initiative in cooperation with GoodData, shows year-over-year revenue growth, comparing the last seven days to the same period last year:
Mobile devices are the most popular device for online shopping by a wide margin. 72% of consumers are using mobile devices to shop in stores according to the latest PYMNTS’ 2020 Remote Payments Study. E-Commerce and online retailers’ supply chains, order management, and fulfillment systems are all being tested by the triple-digit order and revenue growth going on today. And best of all, more energy and intensity is being put into improving customer experiences online.
E-Commerce’s Time Savings and Efficiency Are Here To Stay
Stay-at-home orders will eventually be lifted state by state, but in the interim, there are millions of consumers creating and reinforcing new online buying behaviors and habits. In many families, online grocery, apparel, and entertainment shopping will replace store and mall visits permanently until a vaccine is available.
Paradoxically, quarantines have helped alleviate the severe time shortages so many families and friends have had in their lives. Many are reluctant to go back to old shopping habits for fear of getting sick. A recent Morning Consult study found that 24% of consumers said they wouldn’t feel comfortable shopping in a mall for more than six months, while 16% said they would feel comfortable in the next three months. The results are based on surveys of 2,200 U.S. adults conducted between April 7 and April 9. Consumers are more motivated than ever to stay home and shop online, creating the ideal market conditions for retailers to fast-track, test, and launch new experience-driven mobile apps, sites, and touchpoints across their platforms.
Eliminating Friction Is Key; Customers No Longer Have to Trade Experience for Security
Just as the latest approaches to personalizing web content, offers, pricing, and promotions adapt to each customer’s unique preferences and buying history, account control and security must do the same. Customizing security for every online customer eliminates the friction of having the most loyal, VIP-level customers go through the same authentication steps as a new one. Instead of one-size-fits-all account verification, there needs to be a more adaptive approach to managing the friction customers experience. And the best place to start is by expanding the dataset used for defining and personalizing adaptive friction approaches by customer, all in real-time. Knowing the user type, device specifics, IP risk, geolocation, custom data, and more can be taken together to define a micro-segmentation-based strategy.
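The adaptive-friction idea described above can be sketched as a scoring function that maps a login's signals to a response tier instead of forcing every customer through the same verification. Everything here, the signal names, weights, and thresholds, is a hypothetical illustration, not any vendor's actual logic.

```python
# Hedged sketch of risk-based adaptive friction: combine login signals
# into a risk score, then choose a proportionate response tier.
def login_response(signals: dict) -> str:
    risk = 0
    if not signals.get("known_device"):
        risk += 2  # first time seeing this device for this account
    if signals.get("ip_risk_score", 0) > 70:
        risk += 3  # hypothetical third-party IP reputation score, 0-100
    if signals.get("geo_mismatch"):
        risk += 1  # far from the customer's usual locations
    if signals.get("failed_logins_last_hour", 0) > 5:
        risk += 3  # possible credential-stuffing activity
    if risk >= 5:
        return "block"
    if risk >= 2:
        return "step_up_auth"  # e.g., a one-time passcode
    return "frictionless"

# A loyal customer on a recognized device sails through untouched.
print(login_response({"known_device": True, "ip_risk_score": 10}))  # frictionless
```

The point of the tiering is that low-risk VIP logins never see extra steps, while genuinely suspicious logins escalate, which is the micro-segmentation the paragraph describes.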
One of the more innovative approaches to solving this challenge uses AI and machine learning algorithms to customize every e-commerce consumer’s experience, reducing friction by identifying and segmenting users based on common characteristics. It’s called Kount Control – Account Takeover Protection, and what’s noteworthy about this approach is its ability to identify returning customers even if they are logging in from a new location or a new device. Rather than instantly blocking their access, e-commerce businesses can give their customers an appropriate login response, such as step-up authentication or a frictionless experience.
Kount Control enables an e-commerce company’s fraud agents to see the health of logins and take a quick pulse of the number of blocks and challenges as well as failed login attempts. They’re also able to see the normal ebbs and flows of a business, which can help analysts identify when something is amiss. The following dashboard is crucial for showing location patterns to analyze logins during specific periods, or even filtering on a particular user to identify attempted fraud.
Knowing the patterns of failed login attempts helps identify and stop credential attacks, while also helping to improve users’ login and purchase experiences. What’s noteworthy about Kount’s approach is how they identify potentially risky IP addresses and feed those to Security Operations, while also identifying potential users under attack so they can be contacted via customer service outreach. The following is an example of a Kount dashboard tracking and analyzing failed login attempts in real-time.
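The kind of failed-login pattern analysis described above can start very simply: count failures per source IP over a time window and flag outliers, a common first signal of credential-stuffing attacks. This is a hedged sketch with an arbitrary threshold, not a reflection of Kount's actual implementation.

```python
from collections import Counter

# Illustrative sketch: flag source IPs with an abnormal number of failed
# logins in a window. The threshold is arbitrary for demonstration; real
# systems tune it against baseline traffic and combine it with other signals.
def risky_ips(failed_attempts, threshold=10):
    """failed_attempts: iterable of source-IP strings, one per failed login."""
    counts = Counter(failed_attempts)
    return {ip for ip, n in counts.items() if n >= threshold}

# 25 failures from one IP against 3 from another clearly stands out.
attempts = ["203.0.113.7"] * 25 + ["198.51.100.2"] * 3
print(risky_ips(attempts))  # {'203.0.113.7'}
```

Flagged IPs would then feed downstream actions like the ones the paragraph names: routing to Security Operations and proactively contacting the targeted users.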
Millions of online customers changing their behavior at the same time to opt for more time savings and convenience puts a considerable strain on e-commerce and online retailers today. How they choose to react will define the future of e-commerce. Many are choosing to remove the friction that stands in the way of turning occasional customers into the most loyal. And they’re starting by securing online identities and protecting accounts from takeover. Once state-by-state stay-at-home orders are lifted, e-commerce sales may stabilize at a lower growth rate than today’s, yet the behavioral changes already in motion will completely change the retailing landscape and commerce overall for years to come.
Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019.
Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today.
Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout enterprises today.
Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing.
These and many other fascinating insights are from Dresner Advisory Associates’ 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralized strategies to tactical ones that seek to improve daily operations. The Dresner research team’s broad assessment of the BI market makes this report unique, including their use of visualizations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm’s research community of over 5,000 organizations as well as vendors’ customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.
Key insights from the study include the following:
Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout their enterprises today. More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have most increased their influence over BI adoption, more than any other department included in the current and previous survey. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into use daily.
Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing. Retail/Wholesale and Tech companies’ sales leadership is primarily driving BI adoption in their respective industries. It’s not surprising to see the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management is most likely to drive business intelligence adoption in consulting practices.
Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today. Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner’s research team found reporting retains all-time high scores as the top priority, while data storytelling, governance, and data catalog hold momentum. Please click on the graphic to expand for easier reading.
BI software providers most commonly rely on executive-level personas to design their applications and add new features. Dresner’s research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable why the primary focus BI vendors rely on in selling to them are customer personas. The following graphic compares targeted users for BI by industry.
Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management. Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies and cost reductions or competitive advantages. Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from products to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies. Please click on the graphic to expand for easier reading.
In aggregate, BI is achieving its highest levels of adoption in R&D, Executive Management, and Operations departments today. The growing complexity of products and business models in tech companies, increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.
Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration. Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.
Marketing/Sales and Operations are using the greatest variety of BI tools today. The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments: every one of them knows how many, and most likely which types of, BI tools are deployed. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.
82% of enterprises are prioritizing analytics and BI as part of their budgets for new technologies and cloud-based services.
54% say AI, Machine Learning and Natural Language Processing (NLP) are also a high investment priority.
50% of enterprises say their stronger focus on metrics and Key Performance Indicators (KPIs) company-wide are a major driver of new investment in analytics and BI.
43% plan to both build and buy AI and machine learning applications and platforms.
42% are seeking to improve user experiences by automating discovery of data insights and 26% are using AI to provide user recommendations.
These and many other fascinating insights are from the recent TDWI Best Practices Report, BI and Analytics in the Age of AI and Big Data. An executive summary of the study is available online here. The entire study is available for download here (39 pp., PDF, free, opt-in). The study found that enterprises are placing a high priority on augmenting existing systems and replacing older technologies and data platforms with new cloud-based BI and predictive analytics ones. Transforming Data with Intelligence (TDWI) is a global community of AI, analytics, data science and machine learning professionals interested in staying current in these and more technology areas as part of their professional development. Please see page 3 of the study for specifics regarding the methodology.
Key takeaways from the study include the following:
82% of enterprises are prioritizing analytics and BI applications and platforms as part of their budgets for new technologies and cloud-based services. 78% of enterprises are prioritizing advanced analytics, and 76% data preparation. 54% say AI, machine learning and Natural Language Processing (NLP) are also a high investment priority. The following graphic ranks enterprises’ investment priorities for acquiring or subscribing to new technologies and cloud-based services by analytics and BI initiatives or strategies. Please click on the graphic to expand for easier reading.
Data warehouse or mart in the cloud (41%), data lake in the cloud (39%) and BI platform in the cloud (38%) are the top three types of technologies enterprises are planning to use. Based on this finding and others in the study, cloud platforms are the new normal in enterprises’ analytics and BI strategies going into 2019. Cloud data storage (object, file, or block) and data virtualization or federation (both 32%) are the next-most-planned technologies for enterprises investing in analytics and BI initiatives. Please click on the graphic to expand for easier reading.
The three most important factors in delivering a positive user experience include good query performance (61%), creating and editing visualizations (60%), and personalizing dashboards and reports (also 60%). The three activities that lead to the least amount of satisfaction are using predictive analytics and forecasting tools (27% dissatisfied), “What if” analysis and deriving new data (25%) and searching across data and reports (24%). Please click on the graphic to expand for easier reading.
82% of enterprises are looking to broaden the base of analytics and BI platforms they rely on for insights and intelligence, not just stay with the solutions they have in place today. Just 18% of enterprises plan to add more instances of existing platforms and systems. Cloud-native platforms (38%), a new analytics platform (35%) and cloud-based data lakes (31%) are the top three system areas enterprises are planning to augment or replace existing BI, analytics, and data warehousing systems in. Please click on the graphic to expand for easier reading.
The majority of enterprises plan to both build and buy Artificial Intelligence (AI) and machine learning (ML) solutions so that they can customize them to their specific needs. 43% of enterprises surveyed plan to both build and buy AI and ML applications and platforms, a figure higher than any other recent survey on this aspect of enterprise AI adoption. 13% of responding enterprises say they will exclusively build their own AI and ML applications.
Capitalizing on machine learning’s innate strengths of applying algorithms to large volumes of data to find actionable new insights (54%) is what’s most important to the majority of enterprises. 47% of enterprises look to AI and machine learning to improve the accuracy and quality of information. And 42% are configuring AI and machine learning applications and platforms to augment user decision making by giving recommendations. Please click on the graphic to expand for easier reading.
62% of the highest-performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce’s latest State of Sales research study.
By 2020, 30% of all B2B companies will employ AI to augment at least one of their primary sales processes according to Gartner.
High-performing sales teams are 4.1X more likely to use AI and machine learning applications than their peers according to the State of Sales published by Salesforce.
Intelligent forecasting, opportunity insights, and lead prioritization are the top three AI and machine learning use cases in sales.
Artificial Intelligence (AI) and machine learning show the potential to reduce the most time-consuming manual tasks that keep sales teams from spending more time with customers. Automating account-based marketing support with predictive analytics and supporting account-centered research, forecasting, reporting, and recommending which customers to upsell first are all techniques freeing sales teams from manually intensive tasks.
The Race for Sales-Focused AI & Machine Learning Patents Is On
Fueled by the proliferation of patents and the integration of AI and machine learning code into CRM, CPQ, Customer Service, Predictive Analytics and a wide variety of Sales Enablement applications, use cases are flourishing today. Presented below are the ten ways machine learning is most revolutionizing selling today:
AI and machine learning technologies excel at pattern recognition, enabling sales teams to find the highest-potential new prospects by matching data profiles with their most valuable customers. Nearly all AI-enabled CRM applications provide the ability to define a series of attributes, characteristics and their specific values that pinpoint the highest-potential prospects. Selecting and prioritizing new prospects using this approach saves sales teams thousands of hours a year.
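To make the pattern-matching idea concrete, here is a minimal sketch of lookalike-prospect scoring using Jaccard similarity over attribute sets. The attribute names, company names, and profiles are hypothetical, not any CRM vendor’s actual model:

```python
# Lookalike-prospect sketch: score prospects against the attribute profile
# of the best existing customers with Jaccard similarity over attribute sets.
# All attribute names and company profiles here are hypothetical.

def jaccard(a: set, b: set) -> float:
    """Share of attributes two profiles have in common (0.0 to 1.0)."""
    return len(a & b) / len(a | b)

ideal_customer = {"saas", "500-1000_employees", "na_region", "uses_crm"}

prospects = {
    "Acme":   {"saas", "500-1000_employees", "na_region", "uses_crm"},
    "Globex": {"manufacturing", "10-50_employees", "emea_region"},
}

for name, attrs in prospects.items():
    print(name, round(jaccard(attrs, ideal_customer), 2))
# Acme scores 1.0 (perfect match); Globex scores 0.0
```

Production systems weight attributes and compare against many ideal-customer profiles at once, but the underlying match-and-rank logic is the same.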
Lead scoring and nurturing based on AI and machine learning algorithms help guide sales and marketing teams to turn Marketing Qualified Leads (MQL) into Sales Qualified Leads (SQL), strengthening sales pipelines in the process. One of the most important areas of collaboration between sales and marketing is lead nurturing strategies that move prospects through the pipeline. AI and machine learning are enriching the collaboration with insights from third-party data, prospects’ activity at events and on the website, and from previous conversations with salespeople. Lead scoring and nurturing rely heavily on natural language generation (NLG) and natural language processing (NLP) to help improve each lead’s score.
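As a hedged illustration of the mechanics, a lead score can be sketched as a weighted sum of engagement signals, normalized to a 0–100 scale. The signal names and weights below are hypothetical, not any vendor’s actual scoring model:

```python
# Minimal lead-scoring sketch: a weighted sum of engagement signals,
# capped at 100. Signal names and weights are hypothetical.

WEIGHTS = {
    "visited_pricing_page": 25,
    "attended_webinar": 20,
    "opened_last_3_emails": 15,
    "title_matches_buyer_persona": 25,
    "company_size_in_target_range": 15,
}

def score_lead(signals: dict) -> int:
    """Return a 0-100 score; an MQL might graduate to SQL above a threshold."""
    raw = sum(w for name, w in WEIGHTS.items() if signals.get(name))
    return min(100, raw)

lead = {"visited_pricing_page": True, "attended_webinar": True,
        "title_matches_buyer_persona": True}
print(score_lead(lead))  # 70
```

Real lead-scoring models learn these weights from historical conversion data rather than setting them by hand, which is where the machine learning comes in.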
Combining historical selling, pricing and buying data in a single machine learning model improves the accuracy and scale of sales forecasts. Factoring in differences inherent in every account given their previous history and product and service purchasing cycles is invaluable in accurately predicting their future buying levels. AI and machine learning algorithms integrated into CRM, sales management and sales planning applications can explain variations in forecasts, provided they have the data available. Forecasting demand for new products and services is an area where AI and machine learning are reducing the risk of investing in entirely new selling strategies for new products.
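As a simplified illustration of account-level forecasting (not the model any CRM vendor actually ships), a least-squares trend fit to an account’s purchase history can project the next period; the revenue figures are illustrative:

```python
# Per-account forecast sketch: fit a least-squares trend line to an
# account's quarterly purchase history and project the next quarter.
# The revenue history below is illustrative.

def forecast_next(history):
    """Ordinary least-squares trend; returns the projected next value."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # project one step past the history

quarterly_revenue = [120_000, 135_000, 150_000, 165_000]  # steady growth
print(round(forecast_next(quarterly_revenue)))  # 180000
```

Production models factor in seasonality, product cycles, and cross-account signals, but the core idea of learning each account’s trajectory from its own history is the same.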
Knowing the propensity of a given customer to churn versus renew is invaluable in improving Customer Lifetime Value. Analyzing a diverse series of factors to see which customers are going to churn or leave versus those that will renew is among the most valuable insights AI and machine learning are delivering today. Being able to complete a Customer Lifetime Value Analysis for every customer a company has provides a prioritized roadmap of where the health of client relationships is excellent versus those that need attention. Many companies are using Customer Lifetime Value Analysis as a proxy for a customer health score that gets reviewed monthly.
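One common way to turn a churn model’s output into a Customer Lifetime Value figure is the standard simple CLV formula, CLV = margin × r / (1 + d − r), where r is the retention probability and d is a discount rate. The margin, retention, and discount inputs below are illustrative:

```python
# Customer Lifetime Value sketch using the standard simple formula
# CLV = margin * r / (1 + d - r), where r is the retention probability
# (e.g. from a churn model) and d is a discount rate. Inputs are illustrative.

def clv(annual_margin: float, retention: float, discount: float) -> float:
    return annual_margin * retention / (1 + discount - retention)

# A customer a churn model scores as 90% likely to renew is worth far more
# than one scored at 60%, which is why propensity-to-churn drives
# account prioritization.
print(round(clv(10_000, 0.90, 0.10)))  # 45000
print(round(clv(10_000, 0.60, 0.10)))  # 12000
```

This is why a small lift in predicted retention translates into an outsized lift in lifetime value, and why churn scores make a workable proxy for a monthly customer health score.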
Knowing the strategies, techniques and time management approaches the top 10% of salespeople rely on to excel far beyond quota, and scaling those practices across the sales team based on AI-driven insights. All sales managers and leaders think about this often, especially in sales teams where performance levels vary widely. Knowing the capabilities of the highest-achieving salespeople, then selectively recruiting sales team candidates who have comparable capabilities, delivers solid results. Leaders in the field of applying AI to talent management include Eightfold, whose approach to talent management is refining recruiting and every phase of managing an employee’s potential. Please see the recent New York Times feature of them here.
Guided Selling is progressing rapidly from a personalization-driven selling strategy to one that capitalizes on data-driven insights, further revolutionizing sales. AI- and machine learning-based guided selling relies on prescriptive analytics that provide recommendations to salespeople of which products, services, and bundles to offer at which price. 62% of the highest-performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce’s latest State of Sales research study.
Improving the sales team’s productivity by using AI and machine learning to analyze the most effective actions and behaviors that lead to more closed sales. AI and machine learning-based sales contact and customer predictive analytics take into account all sources of contacts with customers and determine which are the most effective. Knowing which actions and behaviors are correlated with the highest close rates, sales managers can use these insights to scale their sales teams to higher performance.
Sales and marketing are better able to define a price optimization strategy using all available data, analyzed with AI and machine learning algorithms. Pricing continues to be an area the majority of sales and marketing teams learn through trial and error. By analyzing pricing data, purchasing history, discounts taken, promotional programs participated in and many other factors, AI and machine learning can calculate the price elasticity for a given customer, making an optimized price more achievable.
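To illustrate the elasticity calculation the paragraph describes, here is the standard arc (midpoint) formula applied to two hypothetical (price, quantity) observations for a customer segment:

```python
# Arc (midpoint) price-elasticity sketch: one standard way to estimate
# elasticity from two observed (price, quantity) points. Numbers are
# illustrative, not from any customer data set.

def arc_elasticity(p1, q1, p2, q2):
    pct_q = (q2 - q1) / ((q1 + q2) / 2)  # % change in quantity (midpoint base)
    pct_p = (p2 - p1) / ((p1 + p2) / 2)  # % change in price (midpoint base)
    return pct_q / pct_p

# Cutting price from $100 to $90 lifted units from 1,000 to 1,200:
e = arc_elasticity(100, 1000, 90, 1200)
print(round(e, 2))  # -1.73 -- elastic demand, so the discount grew revenue
# (revenue rose from $100,000 to $108,000 despite the lower price)
```

An elasticity below −1 means the segment is price-sensitive enough that a discount raises total revenue, which is exactly the kind of per-customer calculation ML-driven pricing tools automate across thousands of accounts.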
Personalizing sales and marketing content that moves prospects from MQLs to SQLs is continually improving thanks to AI and machine learning. Marketing Automation applications including HubSpot and many others have for years been able to define which content asset needs to be presented to a given prospect at a given time. What’s changed is the interactive, personalized nature of the content itself. Combining analytics, personalization and machine learning, marketing automation applications are now able to tailor content and assets that move opportunities forward.
Solving the many challenges of sales engineering scheduling, sales enablement support and dedicating the greatest amount of time to the most high-value accounts is getting solved with machine learning. CRM applications including Salesforce can define a salesperson’s schedule based on the value of the potential sale combined with the strength of the sales lead, based on its lead score. AI and machine learning optimize a salesperson’s time so they can go from one customer meeting to the next, dedicating their time to the most valuable prospects.
Treating healthcare’s breach epidemic needs to start by viewing every threat surface, access point, identity, and login attempt as the new security perimeter. Healthcare providers urgently need to take a “never trust, always verify” approach, adopting Zero Trust Security to protect every threat surface using Next-Gen Access for end-user credentials and Privileged Access Management (PAM) for privileged credentials. One of the leaders in Next-Gen Access is Idaptive, a newly created spin-off of Centrify. Centrify itself is offering Zero Trust Privilege Services helping over half of the Fortune 100 to eliminate privileged access abuse, the leading cause of breaches today. Centrify Zero Trust Privilege grants least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.
There’s been a staggering 298.4% growth in the number of patient records breached as a result of insider wrongdoing this year alone. In Q1 of this year, there were 4,597 patient records exfiltrated by insider wrongdoing, jumping to 70,562 in Q2 and soaring to 290,689 in Q3. Healthcare insiders can easily thwart healthcare systems’ legacy security approaches today by using compromised access credentials. Zero Trust Security, either in the form of Next-Gen Access for end-user credentials or Zero Trust Privilege for privileged access credentials, has the potential to stop this.
The total number of breached patient records has soared from 1.1M in Q1 of this year to 4.4M in Q3, quadrupling in less than a year. Protenus found a total of 117 incidents were disclosed to U.S. Department of Health and Human Services (HHS) or the media in Q3 2018 alone. Details were disclosed for 100 of these incidents, affecting 4,390,512 patient records, the highest level ever recorded. Jumping from 1.1M medical records in Q1 to 4.4M in Q3, healthcare providers could easily see over 6.5M records breached in Q4 2018 alone.
Hackers targeted healthcare systems aggressively in Q3 of this year, exfiltrating 3.6M patient records in just 90 days. Compromised access credentials are hackers’ favorite technique for exfiltrating massive quantities of medical records they resell on the Dark Web or use to commit tax and credit card fraud. Healthcare providers need to minimize their attack surfaces, improve audit and compliance visibility, reduce risk, complexity, and costs across their modern, hybrid enterprises with Zero Trust. Healthcare providers need to shut down hackers now, taking away the opportunities they’re capitalizing on to exfiltrate medical records almost at will.
It takes 71 days on average for healthcare providers to realize their data has been breached, with one breach lasting over 15 years. Protenus found a wide variation in the length of time it takes healthcare providers to realize they’ve been breached, and one didn’t know until 15 years after the initial successful breach. All breaches tracked by Protenus found that the insiders and/or hackers were successful in gaining access to a wealth of patient information including addresses, dates of birth, medical record numbers, healthcare providers, visit date, health insurance information, financial histories, and payment information.
Zero Trust is the antidote healthcare needs to treat its raging breach epidemic. It’s exponentially growing as insiders’ intent on wrongdoing turn to exfiltrating patients’ data for personal gain. Hackers also find healthcare providers’ legacy systems among the easiest to access using stolen access credentials, exfiltrating millions of records in months. With every new employee and device being a new security perimeter on their networks, the time is now for healthcare providers to discard the old model of “trust but verify” which relied on well-defined boundaries. Zero Trust mandates a “never trust, always verify” approach to access, from inside or outside healthcare providers’ networks.
For smart, connected product strategies to succeed they require a product lifecycle view of configurations, best attained by integrating PLM, CAD, CRM, and ERP systems.
Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.
In 2018, $985B will be spent on IoT-enabled smart consumer devices, soaring to $1.49T in 2020, attaining a 23.1% compound annual growth rate (CAGR) according to Statista.
Industrial manufacturers will spend on average $121M a year on smart, connected products according to Statista.
Succeeding with a smart, connected product strategy requires manufacturers to build their IoT and software development expertise faster than they expected. By 2020, 50% of manufacturers will generate the majority of their revenues from smart, connected products according to Capgemini’s recent study. Manufacturers see 2019 as the breakout year for smart, connected products and the new revenue opportunities they provide.
Industrial Internet of Things (IIoT) platforms have the potential to provide a single, unified data model across an entire manufacturing operation, giving manufacturers a single unified view of product configurations across their lifecycles. Producing smart, connected products at scale also requires a system capable of presenting a unified view of configurations in the language each department understands. Engineering, production, marketing, sales, and service all need a unique view of product configurations to keep producing new products. Leaders in this field include Configit and their Configuration Lifecycle Management approach to CPQ and product configuration.
CPQ Needs To Scale Further To Sell Smart, Connected Products
Smart, connected products are redefining the principles of product design, manufacturing, sales, marketing, and service. CPQ systems need to grow beyond their current limitations by capitalizing on these new principles while scaling to support new business models that are services and subscription-based.
The following are the key areas where CPQ systems are innovating today, making progress towards enabling the custom configuration of smart, connected products:
For smart, connected product strategies to succeed they require a product lifecycle view of configurations, best attained by integrating PLM, CAD, CRM, and ERP systems. Smart, connected product strategies require real-time integration between front-end and back-end systems to optimize production performance. And they also require advanced visualization that provides prospects with an accurate, 3D-rendered view that can be accurately translated to a Bill of Materials (BOM) and into production. The following graphic is based on conversations with Configit customers, illustrating how they are combining PLM, CAD, CRM and ERP systems to support smart, connected products related to automotive manufacturing.
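One way to picture the integration at the data level is a unified configuration record that cross-references the IDs each system (PLM, CAD, CRM, ERP) uses for the same product configuration. The field names below are hypothetical; real configuration-lifecycle integrations map far more attributes:

```python
# Sketch of a unified configuration record linking the IDs that PLM, CAD,
# CRM, and ERP systems each use for the same configuration. Field names
# are hypothetical, not Configit's actual data model.

from dataclasses import dataclass, field

@dataclass
class ConfigurationRecord:
    config_id: str            # single source of truth across the lifecycle
    plm_item_id: str          # engineering's view
    cad_model_id: str         # design's 3D-rendered view
    crm_quote_id: str         # sales' quoted view
    erp_bom_id: str           # production's view, driving the BOM
    options: dict = field(default_factory=dict)

cfg = ConfigurationRecord(
    config_id="CFG-1001", plm_item_id="PLM-88", cad_model_id="CAD-3D-42",
    crm_quote_id="Q-2019-07", erp_bom_id="BOM-510",
    options={"connectivity": "wifi+lte", "telemetry": True},
)
print(cfg.erp_bom_id)  # BOM-510
```

The point of the shared record is that a change made in any one system (a new option sold in CRM, say) stays traceable through to the BOM in ERP, which is the lifecycle view of configurations the paragraph describes.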
CPQ and product configuration systems need to reflect that the products they’re specifying are part of a broader ecosystem, not stand-alone. The essence of smart, connected products is their contributions to broader, more complex networks and ecosystems. CPQ systems need to flex and support much greater system interoperability of products than they do today. Additional design principles include designing in connected service options, taking an evergreen, long-term view of the product as a platform, and building in support for entirely new pricing models.
Smart, connected products need CPQ systems to reduce physical complexity while scaling device intelligence through cross-sells, up-sells and upgrades. CPQ systems today need to minimize physical options while scaling support for device intelligence-based ones. For many CPQ providers, that’s going to require different data models and taxonomies of product definitions. Smart, connected products will be modified after purchase as well, evolving to customers’ unique requirements.
After-sales service for smart, connected products will redefine pricing and profit models for the better in 2019, and CPQ needs to keep up to make it happen. Giving products the ability to send back their usage rates and patterns, reliability and performance data along with their current condition opens up lucrative pricing and services models. CPQ applications need to be able to provide quotes for remote diagnostics, price breaks on subscriptions for sharing data, product-as-a-service and subscription-based options for additional services. Many CPQ systems will need to be updated to support entirely new services-driven business models manufacturers are quickly adopting today.
Eightfold’s analysis of hiring data has found the half-life of technical, marketable skills is 5 to 7 years, making the ability to unlearn and learn new concepts essential for career survival.
Applicant Tracking Systems (ATS) don’t capture applicants’ drive and intensity to unlearn and learn or their innate capabilities for growth.
Artificial Intelligence (AI) and machine learning are proving adept at discovering candidates’ innate capabilities to unlearn, learn and reinvent themselves throughout their careers.
Hiring managers in search of qualified job candidates who can scale with and contribute to their growing businesses are facing a crisis today. They’re not finding the right candidates, or in many cases any candidates at all, using resumes alone, Applicant Tracking Systems (ATS) or online job recruitment sites designed for employers’ convenience first and candidates last. These outmoded approaches to recruiting aren’t designed to find the candidates with the strongest capabilities. Add to this dynamic the fact that machine learning is making resumes obsolete by enabling employers to find candidates with precisely the right balance of capabilities needed, and its unbiased, data-driven approach to selecting candidates works. Resumes, job recruitment sites and ATS platforms force hiring managers to bet on the probability of making a great hire instead of being certain of it by basing their decisions on solid data.
Playing The Probability Hiring Game Versus Making Data-Driven Decisions
Many hiring managers and HR recruiters are playing the probability hiring game. It’s betting that the new hire chosen using imprecise methods will work out. And like any bet, it gets expensive quickly when a wrong choice is made. There’s a 30% chance the new hire will make it through one year, and if they don’t, it will cost at least 1.5 times their salary to replace them. When the median salary for a cloud computing professional is $146,350, and it takes, in the best case, 46 days to find them, the cost and time lost from losing just one recruited cloud computing professional can derail a project for months. It will cost at least $219,000 to replace just that one engineer. With the average engineering team at ten people, only three will remain in 12 months. These are the high costs of playing the probability hiring game, fueled by unconscious and conscious biases and systems that game recruiters into believing they are making progress when they’re automating mediocre or worse decisions. Hiring managers will have better luck betting in Las Vegas or playing Powerball than hiring the best possible candidate if they rely on systems that only deliver a marginal probability of success at best.
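The replacement-cost arithmetic above is worth making explicit, since it compounds with every failed hire:

```python
# Replacement-cost arithmetic from the paragraph above: at least 1.5x
# salary to replace a departed hire.

median_cloud_salary = 146_350     # median cloud computing salary cited above
replacement_multiplier = 1.5      # minimum replacement cost as a multiple of salary

cost = median_cloud_salary * replacement_multiplier
print(f"${cost:,.0f}")  # $219,525 -- the "at least $219,000" figure

# With a 30% one-year survival rate, seven of a ten-person cohort
# may need replacing:
cohort_replacement_cost = 7 * cost
print(f"${cohort_replacement_cost:,.0f}")  # $1,536,675
```

Seven failed hires out of ten at that rate puts the downside of the probability hiring game well above $1.5M for a single engineering team.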
Betting on solid data and personalization at scale, on the other hand, delivers real results. Real data slices through the probabilities and is the best equalizer there is at eradicating conscious and unconscious biases from hiring decisions. Hiring managers, HR recruiters, directors and Chief Human Resource Officers (CHROs) vow they are strong believers in diversity. Many are abandoning the probability hiring game for AI- and machine learning-based approaches to talent management that strip away any extraneous data that could lead to bias-driven hiring decisions. Now candidates get evaluated on their capabilities and innate strengths and how strong a match they are to ideal candidates for specific roles.
A Data-Driven Approach To Finding Employees Who Can Scale
Personalization at scale is more than just a recruiting strategy; it’s a talent management strategy intended to flex across the longevity of every employee’s tenure. Attaining personalization at scale is essential if any growing business is going to succeed in attracting, acquiring and growing talent that can support their growth goals and strategies. Eightfold’s approach makes it possible to scale personalized responses to specific candidates in a company’s candidate community while defining the ideal candidate for each open position. Personalization at scale has succeeded in helping companies find the right person for the right role at the right time and, for the first time, personalize every phase of recruitment, retention and talent management at scale.
Eightfold is pioneering the use of a self-updating corporate candidate database. Profiles in the system are now continually updated using external data gathering, without applicants reapplying or submitting updated profiles. The taxonomies supported in the corporate candidate database make it possible for hiring managers to define the optimal set of capabilities, innate skills, and strengths they need to fill open positions.
Lessons Learned at PARC
Russell Williams, former Vice President of Human Resources at PARC, says the best strategy he has found is to define the ideal attributes of high performers and look to match those profiles with potential candidates. “We’re finding that there are many more attributes that define a successful employee in our most in-demand positions, including data scientist, than are evident from just reviewing a resume, and with AI, I want to do it at scale,” Russell said. Ashutosh Garg, Eightfold founder, added: “That’s one of the greatest paradoxes that HR departments face, which is the need to know the contextual intelligence of a given candidate far beyond what a resume and existing recruiting systems can provide.” One of the most valuable lessons learned from PARC is that it’s possible to find the candidates who excel at unlearning, learning, and defining and diligently pursuing the learning roadmaps that lead to reinventing their skills, strengths, and marketability.
Machine learning algorithms capable of completing millions of pattern-matching comparisons per second provide valuable new insights, enabling companies to find those who excel at reinventing themselves. The most valuable employees who can scale any business see themselves as learning entrepreneurs and have an inner drive to master new knowledge and skills. And that select group of candidates is the catalyst most often responsible for making the greatest contributions to a company’s growth.
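A minimal sketch of that pattern matching is scoring candidates against an ideal-profile vector with cosine similarity. The capability names and ratings below are hypothetical, not Eightfold’s actual model:

```python
# Capability-matching sketch: rank candidates by cosine similarity of
# their capability vectors to an "ideal profile" for a role.
# Capability names and ratings are hypothetical.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

CAPABILITIES = ["learning_agility", "domain_depth", "collaboration", "reinvention"]
ideal = [0.9, 0.7, 0.8, 0.9]  # hypothetical profile of top performers in a role

candidates = {
    "A": [0.8, 0.6, 0.9, 0.9],   # strong learner and reinventor
    "B": [0.4, 0.9, 0.5, 0.3],   # deep specialist, low reinvention
}
ranked = sorted(candidates, key=lambda c: cosine(candidates[c], ideal),
                reverse=True)
print(ranked)  # ['A', 'B']
```

At production scale the same comparison runs across millions of profiles per second, which is what lets these systems surface candidates whose capability mix, rather than their resume keywords, matches proven high performers.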
Bottom Line: High-tech manufacturers need to urgently solve the paradox of improving supply chain security while attaining greater visibility across supplier networks if they’re going to make the most of smart, connected products’ many growth opportunities in 2019.
Smart, connected products free manufacturers and their supply chains from having to rely on transactions and the price wars they create. The smarter the product, the greater the services revenue opportunities. And the more connected a smart product is using IoT and Wi-Fi sensors the more security has to be designed into every potential supplier evaluation, onboarding, quality plan, and ongoing suppliers’ audits. High-tech manufacturers are undertaking all of these strategies today, fueling them with real-time monitoring using barcoding, RFID and IoT sensors to improve visibility across their supply chains.
By gaining even greater visibility into their supply chains with cloud-based track-and-trace systems capable of reporting back the condition of components in transit down to the lot and serialized pack level, high-tech suppliers are setting the gold standard for supply chain transparency and visibility. High-tech supply chains consistently outperform many other industries’ supplier networks on accuracy, speed, and scale metrics, yet the industry is behind on securing its vast supplier network. Every supplier identity and endpoint is a new security perimeter, and taking a Zero Trust approach to securing them is the future of complex supply chains. With Zero Trust Privilege, high-tech manufacturers can secure privileged access to infrastructure, DevOps, cloud, containers, Big Data, production, logistics and shipping facilities, systems and teams.
High-Tech Needs to Confront Its Supply Chain Security Problem, Not Dismiss It
It’s ironic that high-tech supply chains are making rapid advances in accuracy and visibility yet still aren’t vetting suppliers thoroughly enough to stop counterfeiting, or worse. Bloomberg’s controversial recent article, The Big Hack: How China Used a Tiny Chip to Infiltrate U.S. Companies, explains how Amazon Web Services (AWS) was considering buying Portland, Oregon-based Elemental Technologies for its video streaming technology, known today as Amazon Prime Video. As part of the due diligence, AWS hired a third-party company to scrutinize Elemental’s security all the way up to the board level. The Elemental servers that handle the video compression were assembled in China for Super Micro Computer Inc., a San Jose-based company. Nested on the servers’ motherboards, the testers found a tiny microchip, not much bigger than a grain of rice, that wasn’t part of the boards’ original design and that could create a stealth doorway into any network the machines were attached to. Apple (also an important Super Micro customer) and AWS deny this ever happened, yet 17 people have confirmed Super Micro had altered hardware, corroborating Bloomberg’s findings.
The hard reality is that the scenario Bloomberg writes about could happen to any high-tech manufacturer today. When it comes to security and 3rd party vendor risk management, many high-tech supply chains are stuck in the 90s while foreign governments, their militaries and the terrorist organizations they support are attempting to design in the ability to breach any network at will. How bad is it? 81% of senior executives involved in overseeing their companies’ global supply chains say 3rd party vendor management including recruiting suppliers is riskiest in China, India, Africa, Russia, and South America according to a recent survey by Baker & McKenzie.
Getting in Control of Security Risks in High-Tech Supply Chains
It’s time for high-tech supply chains to go with a least privilege-based approach to verifying who or what is requesting access to any confidential data across their supply chains. Further, high-tech manufacturers need to extend access request verification to include the context of the request and the risk of the access environment. Today it’s rare to find any high-tech manufacturer operating at this level of least-privilege access, yet it’s the most viable approach to securing the most critical parts of their supply chains.
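The decision logic the paragraph describes, verifying who is asking, the context of the request, and the risk of the environment, can be sketched as a deny-by-default policy check. The checks and threshold below are hypothetical, not Centrify’s actual implementation:

```python
# Least-privilege access-decision sketch: grant only when identity is
# verified AND the request context checks out AND environment risk is low.
# Checks and thresholds are hypothetical illustrations of the pattern.

def grant_access(identity_verified: bool, context_ok: bool,
                 risk_score: float, risk_threshold: float = 0.3) -> bool:
    """Deny by default; every condition must pass ('never trust, always verify')."""
    return identity_verified and context_ok and risk_score < risk_threshold

# Verified supplier engineer, from an expected device and location,
# in a low-risk environment:
print(grant_access(True, True, 0.1))   # True
# Same identity and context, but a high-risk environment (unknown network):
print(grant_access(True, True, 0.8))   # False
```

The design choice that matters is the conjunction: failing any one check denies access, which is what distinguishes “never trust, always verify” from perimeter models that trust everything inside the boundary.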
By taking a least-privilege access approach, high-tech manufacturers and their suppliers can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and operating costs across their hybrid manufacturing ecosystem.
Key actions that high-tech manufacturers can take to secure their supply chain and ensure they don’t end up in an investigative story of hacked supply chains include the following:
Taking a Zero Trust approach to securing every endpoint provides high-tech manufacturers with the scale they need to grow. High-tech supply chains are mercurial and fast-moving by nature, guaranteeing they will scale faster than any legacy approach to enterprise security management. Vetting and then onboarding new suppliers needs to start by protecting every endpoint down to the production and sourcing level, especially for next-generation smart, connected products.
Smart, connected products and the product-as-a-service business models they create are all based on real-time, rich, secured data streams that aren’t being eavesdropped on with components no one knows about. Taking a Zero Trust Privilege-based approach to securing access to diverse supply chains is needed if high-tech manufacturers are going to extend beyond legacy Privileged Access Management (PAM) to secure data being generated from real-time monitoring and data feeds from their smart, connected products today and in the future.
Quality management, compliance, and quality audits are all areas high-tech manufacturers excel in today and provide a great foundation to scale to Zero Trust Privilege. High-tech manufacturers have the most advanced quality management, inbound inspection and supplier quality audit techniques in the world. It’s time for the industry to step up on the security side too. By only granting least-privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment, high-tech manufacturers can make rapid strides to improve supply chain security.
Rethink the new product development cycles for smart, connected products and the sensors they rely on, so they’re protected as threat surfaces when built. Designing in security to the new product development process level and further advancing security scrutiny to the schematic and board design level is a must-do. In an era where we have to assume bad actors are everywhere, every producer of high-tech products needs to realize their designs, product plans, and roadmaps are at risk. Ensuring the IoT and Wi-Fi sensors in smart, connected products aren’t designed to be hackable starts with a Zero Trust approach to defining security for supplier, design, and development networks.
The era of smart, connected products is here, and supply chains are already reverberating with the increased emphasis on components that are easily integrated and have high-speed connectivity. Manufacturing CEOs say it’s exactly what their companies need to grow beyond transaction revenue and the price wars they create. While high-tech manufacturers excel at accuracy, speed, and scale, they are falling short on security. It’s time for the industry to re-evaluate how Zero Trust can stabilize and secure every identity and threat surface across their supply chains with the same precision and intensity they apply to quality today.