Identities Are The New Security Perimeter

  • Privileged credentials for accessing an airport’s security system were recently for sale on the Dark Web for just $10, according to McAfee.
  • 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey.
  • Apple employees in Ireland were offered as much as €20,000 ($22,878) in exchange for their privileged access credentials in 2016, according to Business Insider.
  • Privileged access credentials belonging to more than 1 million staff at a top UK law firm have been found for sale on the Dark Web.

There’s been a 135% year-over-year increase in financial data for sale on the Dark Web between the first half of 2017 and the first half of 2018. The Dark Web is now solidly established as a global trading marketplace for a myriad of privileged credentials, including access procedures with keywords and corporate logins and passwords, with transactions happening between anonymous buyers and sellers. It’s also the online marketplace of choice for disgruntled, angry employees seeking revenge against their employers. One Honeywell employee, angry over not getting a raise, created unauthorized accounts to access DEA satellite tracking system data and used the Dark Web as an intermediary to sell it to Mexican drug cartels for $2M. He was caught in a sting operation, the breach was thwarted, and he was arrested.

Your Most Vulnerable Threat Surface Is A Best Seller

Sites on the Dark Web offer lucrative payment in bitcoin and other anonymous currencies for administrators’ accounts at leading European, UK and North American banking institutions and corporations. Employees are anonymously offering their privileged credentials for sale to the highest bidder on online auction sites, motivated by anger, revenge or financial gain.

Privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags. This holds especially true when the organizations are not applying multi-factor authentication (MFA) or risk-based access controls to limit any type of lateral movement after unauthorized access. Without these security measures in place, hackers can quickly access any digital businesses’ most valuable systems to exfiltrate valuable data or sabotage systems and applications.

81% of all hacking-related breaches leverage stolen and/or weak passwords, according to Verizon’s 2017 Data Breach Investigations Report. A recent study by Centrify and Dow Jones Customer Intelligence titled, CEO Disconnect is Weakening Cybersecurity (31 pp, PDF, opt-in), found that CEOs can reduce the risk of a security breach by rethinking their Identity and Access Management (IAM) strategies. 68% of executives whose companies experienced significant breaches believe, in hindsight, that the breaches could have been prevented by implementing more mature identity and access management strategies.

In A Zero Trust World, Identities Are The New Security Perimeter

The buying and selling of privileged credentials is proliferating on the Dark Web today and will increase exponentially in the years to come. Digital businesses need to realize that dated concepts of trusted and untrusted domains have been rendered ineffective. Teams of hackers aren’t breaking into secured systems; they’re logging in.

Digital businesses that are effective in thwarting privileged credential abuse have standardized on Zero Trust Security (ZTS) to ensure every potentially compromised endpoint and threat surface, within and outside the company, is protected. No device, login attempt, requested resource or other user action is trusted by default; each is verified through Next-Gen Access (NGA).

Zero Trust Security relies upon four pillars: verifying users in real time, validating their devices, limiting access and privilege, and learning and adapting to verified user behaviors. Leaders in this area such as Centrify rely on machine learning to calculate risk scores based on a wide spectrum of variables that quantitatively define every access attempt, including device, operating system, location, time of day, and several other key factors.

Depending on their risk scores, users are asked to further validate their identity through MFA. If there are too many failed login attempts, the risk score rises quickly and the NGA platform automatically blocks and disables the account. All of this happens in seconds and runs 24/7, monitoring every attempted login from anywhere in the world.
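To make this concrete, below is a minimal, hypothetical sketch of how a risk-based access decision might look in code. The AccessRequest fields, weights and thresholds are illustrative assumptions only, not Centrify’s actual scoring model.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    known_device: bool    # device previously validated (e.g., enrolled via MDM/EMM)
    usual_location: bool  # login from a location the user normally works from
    usual_hours: bool     # login during the user's typical working hours
    failed_attempts: int  # recent failed login attempts on this account

def risk_score(req: AccessRequest) -> int:
    """Toy risk score: higher means riskier. Weights are illustrative only."""
    score = 0
    score += 0 if req.known_device else 40
    score += 0 if req.usual_location else 25
    score += 0 if req.usual_hours else 10
    score += min(req.failed_attempts, 5) * 10
    return score

def access_decision(req: AccessRequest) -> str:
    """Map the risk score to allow / step-up MFA / block, mirroring the NGA flow described above."""
    score = risk_score(req)
    if score >= 80:
        return "block"  # too risky: disable the account pending review
    if score >= 30:
        return "mfa"    # ask the user to further verify their identity
    return "allow"

# Example: an unknown device outside normal working hours triggers step-up MFA.
print(access_decision(AccessRequest(known_device=False, usual_location=True,
                                    usual_hours=False, failed_attempts=1)))
```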

A recent Forrester Research thought leadership paper titled, Adopt Next-Gen Access to Power Your Zero Trust Strategy (14 pp., PDF, opt-in), provides insights into how NGA enables ZTS to scale across enterprises, protecting every endpoint and threat surface. The study found 32% of enterprises are excelling at the four ZTS pillars: verifying the identity of every user, validating every device using Mobile Device Management (MDM) and Mobile App Management (MAM), limiting access and privileges, and learning and adapting by using machine learning to analyze user behavior and gain greater insights from analytics.

NGA is a proven strategy for preventing stolen and sold privileged access credentials from being used to gain access to a digital business’ network and systems, combining Identity-as-a-Service, Enterprise Mobility Management (EMM) and Privileged Access Management (PAM). Forrester found that scalable Zero Trust Security strategies empowered by NGA lead to increased organization-wide productivity (71%), reduced overall risk (70%) and reduced cost on compliance initiatives (70%).

Additionally, insights gained from user behavior through machine learning drive greater efficiency: respondents were more likely to be confident about reduced compliance costs (31% more likely), reduced overall security costs (40% more likely) and increased organizational productivity (8% more likely). The following graphic from the study ranks respondents’ answers.

Conclusion

Making sure your company’s privileged access credentials don’t make the best seller list on the Dark Web starts with a strong, scalable ZTS strategy driven by NGA. Next-Gen Access continually learns the behaviors of verified users, solving a long-standing paradox of user experience in security and access management. However, every digital business needs to focus on how the four pillars of Zero Trust Security apply to them and how they can take a pragmatic, thorough approach to secure every threat surface they have.

Glassdoor’s 10 Highest Paying Tech Jobs Of 2018

  • Software Engineering Manager is the highest paying position, with an average salary of $163,500 and 31,621 open positions on Glassdoor today.
  • Over 368,000 open positions are available across the 10 highest paying jobs on Glassdoor today.
  • $147,000 is the average salary of the top 10 tech jobs on Glassdoor today.
  • 12.7% of all open positions are for software engineers, making this job the most in-demand in tech today.

Glassdoor is best known for its candid, honest reviews of employers written anonymously by employees. It is now common practice, and a good idea, for anyone considering a position with a new employer to check them out on Glassdoor first. With nearly 40 million reviews of more than 770,000 companies, Glassdoor is now the 2nd most popular job site professionals rely on in the U.S., attracting approximately 59 million job seekers a month. The Chief Human Resources Officer of one of the largest and best-known cloud-based enterprise software companies told me recently that she gets 2X more applications from Glassdoor for any given position than from any other recruiting site or channel.

Earlier this month, Glassdoor Economic Research published the results of a study on how base pay compares between tech and non-tech jobs. The research team gathered a sample of tech companies with at least 100 job postings on Glassdoor as of June 26, 2018, and defined tech roles as positions requiring knowledge of code, software or data. The study found the following to be the 10 highest paying tech jobs today:

Walmart eCommerce, Microsoft, Intel, Amazon, and Google have the highest concentration of tech jobs as a percentage of all positions open. Workday, Salesforce, Verizon, and IBM have the highest concentration of non-tech positions available today.

Source: Glassdoor Economic Research Blog, Landing a Non-Tech Job in Tech: Who’s Hiring Today? July 19, 2018

IoT Market Predicted To Double By 2021, Reaching $520B

  • Bain predicts the combined markets of the Internet of Things (IoT) will grow to about $520B in 2021, more than double the $235B spent in 2017.
  • Data center and analytics will be the fastest growing IoT segment, reaching a 50% Compound Annual Growth Rate (CAGR) from 2017 to 2021.
  • IoT customers are planning and executing more proof of concept pilots, with many balancing their expectations regarding broader adoption.
  • Cloud Service Providers (CSP) are emerging as influential providers of IoT services, consulting and analytics for enterprises, leaving smaller opportunities for other providers in niche industries.
  • Security, integration with existing technology and uncertain returns on investment are the three biggest barriers to greater IoT adoption in the enterprise.
  • Bain sees the need for vendors to concentrate on a few core industries with greater intensity to deliver more targeted industry solutions.

Enterprises adopting IoT are finding that vendors aren’t making enough progress on lowering the most significant barriers to adoption: security, ease of integration with existing information technology (IT) and operational technology (OT) systems, and uncertain returns on investment. As a result, enterprises are extending their expectations of when their use cases will reach scale and deliver results. These and many other fascinating findings are from Bain’s latest IoT research brief, Unlocking Opportunities in the Internet of Things. The PDF is downloadable here (PDF, 12 pp, no opt-in).

Additional key takeaways from the research brief include the following:

  • The combined markets of the Internet of Things (IoT) will grow to about $520B in 2021, more than double the $235B spent in 2017. Data center and analytics will be the fastest growing IoT segment, reaching a 50% Compound Annual Growth Rate (CAGR) from 2017 to 2021. System integration, data center and analytics, network, consumer devices, connectors (or things) and legacy embedded systems are the six core technology and solution areas of the IoT market. The following graphic compares the CAGR of each area in addition to defining the worldwide revenue for each category.

  • Enterprises are still optimistic about IoT’s business value and potential to deliver a positive ROI; however, many are planning less extensive IoT implementations by 2020. Bain finds that enterprises are still running more proofs of concept than they were two years ago. They’ve also discovered that more customers are considering trying out new use cases: 60% in 2018 compared with fewer than 40% in 2016.

  • Security, integration with existing technology and uncertain returns on investment are the three biggest barriers to greater IoT adoption. Bain found that enterprises would buy more IoT devices and pay up to 22% more on average for them if security concerns were addressed. Integration continues to be a barrier to greater IoT adoption as well. Bain found that vendors haven’t simplified the integration of IoT solutions into business processes or IT and OT as much as enterprises have expected. The report calls for vendors to invest in learning more about typical implementation challenges in their customers’ industries so they can suggest more strategic, end-to-end solutions.

  • IoT vendors including CSPs generating the most sales are concentrating on two to three industries to scale the depth of their expertise quickly. More than 80% of vendors still target four to six industries, which makes it difficult to reach the expertise and knowledge scale that wins new clients. Bain finds that when vendors and CSPs concentrate on two or three domains, they gain mastery of specific markets faster and can provide insights to enterprises more effectively. Gaining expertise in two to three core industries is also an excellent differentiation strategy for vendors and CSPs who compete against price-driven IoT service providers.

  • Interest in remote and real-time monitoring is flourishing, making this one of the fastest-growing IoT use case categories. Being able to remotely monitor production systems down to the machine or asset level, and having the option to turn the data stream into a real-time source of knowledge, is a fast-growing area of IoT adoption today. Based on interviews with manufacturers, the popularity of Overall Equipment Effectiveness (OEE) is growing, fueled by the options available for remote and real-time monitoring of production assets (a simple OEE calculation is sketched after this list). Bain discovered that industrial equipment leader ABB bundles remote monitoring into its connected robotics systems and connected low-voltage networks, which allows customers to troubleshoot and quickly identify issues requiring greater attention.
  • Cloud Service Providers (CSP) are emerging as influential providers of IoT services, consulting and analytics for enterprises, leaving smaller opportunities for other providers in niche industries. Amazon Web Services (AWS) and Microsoft Azure have emerged as the dominant CSP leaders of the fast-moving global market for IoT software and solutions. Bain finds that CSPs are successful in lowering barriers to IoT adoption, allowing for simpler implementations and making it easier to try out new use cases and scale up quickly. The study finds that the broad horizontal services provide little optimization for industry-specific applications, leaving a significant opportunity for industry solutions from systems integrators, enterprise app developers, industry IoT specialists, device makers and telecommunications providers.
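For readers unfamiliar with OEE, the sketch below shows the standard calculation (Availability × Performance × Quality) applied to one shift of telemetry from a single remotely monitored machine. The figures and field names are illustrative assumptions, not data from Bain’s report.

```python
def oee(planned_minutes: float, downtime_minutes: float,
        ideal_cycle_time: float, total_count: int, good_count: int) -> dict:
    """Standard OEE calculation: Availability x Performance x Quality."""
    run_time = planned_minutes - downtime_minutes
    availability = run_time / planned_minutes
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return {
        "availability": round(availability, 3),
        "performance": round(performance, 3),
        "quality": round(quality, 3),
        "oee": round(availability * performance * quality, 3),
    }

# Illustrative shift: 480 planned minutes, 45 minutes of downtime,
# 0.8-minute ideal cycle time, 500 parts produced, 480 of them good.
print(oee(planned_minutes=480, downtime_minutes=45,
          ideal_cycle_time=0.8, total_count=500, good_count=480))
```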

Global State Of Enterprise Analytics, 2018

  • 71% of enterprises globally predict their investments in data and analytics will accelerate in the next three years and beyond.
  • 57% of enterprises globally have a Chief Data Officer, a leadership role that is pivotal in helping to democratize data and analytics across any organization.
  • 52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations.
  • 41% of all enterprises are considering a move to cloud-based analytics in the next year.
  • Cloud Computing (24%), Big Data (20%), and AI/Machine Learning (18%) are the three technologies predicted to have the greatest impact on analytics over the next five years.
  • Just 16% of enterprises have enabled at least 75% of their employees to have access to company data and analytics.

These and many other fascinating insights are from MicroStrategy’s latest research study, 2018 Global State of Enterprise Analytics Report. You can download a copy here (PDF, 44 pp., opt-in). The study is based on surveys completed in April 2018 with 500 globally-based enterprise analytics and business intelligence professionals on the state of their organizations’ analytics initiatives across 20 industries. Participants represented organizations with 250 to 20,000 employees worldwide from five nations: Brazil, Germany, Japan, the United Kingdom and the United States. For additional details on the methodology, please see the study here. The study’s results underscore how enterprises need a unified data strategy that reflects their growth strategies and new business models’ information needs.

Key takeaways from the study include the following:

  • Driving greater process and cost efficiencies (60%), strategy and change (57%) and monitoring and improving financial performance (52%) are the top three ways enterprises globally are using data and analytics today. The study found that enterprises are also relying on data and analytics to gain greater insights into how current products and services are used (51%), manage risk (50%) and attain customer growth and retention (49%). Across the five nations surveyed, Japan leads the world in the use of data and analytics to drive process and cost efficiencies (65%). UK-based enterprises lead all nations in their use of data and analytics to analyze how current products and services are being used. The report provides graphical comparisons of the five nations’ results.

  • Cloud Computing, Big Data, and AI/Machine Learning are the three technologies predicted to have the greatest global impact on analytics over the next five years. Among the five nations’ enterprises interviewed, Japanese enterprises are the most likely to predict that cloud computing will have the greatest impact on the future of analytics (28%). AI/Machine Learning is predicted to have the greatest impact on analytics in the U.K. (26%), as is Big Data in Germany (29%). Please see the study for country-specific prioritization of technologies.

  • 52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations. Additional leverage areas include distribution of analytics via e-mail and collaboration tools (49%), analytics embedded in other apps including Salesforce (44%) and mobile productivity apps (39%). Japanese enterprises lead the world in their adoption of advanced and predictive analytics (60%). German enterprises lead the world in the adoption of analytics for collaboration via e-mail and more real-time data and knowledge-sharing methods (50%).

  • 59% of enterprises are using Big Data Analytics, leading all categories of intelligence applications. Enterprise reporting (47%), data discovery (47%), mobile productivity apps (44%) and embedded apps (42%) round out the top five intelligence applications in use globally by enterprises today. Big Data’s dominance in the survey results can be attributed to the top five industries in the sampling frame being among the most prolific in data generation and use. Manufacturing (15%) is the most data-prolific industry on the planet. Additional industries that generate massive amounts of data dominate the survey’s demographics, including software technology-based businesses (14%), banking (13%), retail (11%), and financial services/business services (6%).

  • 27% of global enterprises prioritize security over any other factor when evaluating a new analytics vendor. The three core attributes of a scalable, comprehensive platform, ease of use, and a vendor’s products having an excellent reputation are all essential. Enterprises based in four of the five nations prioritize security as the most critical factor when evaluating potential analytics vendors. Enterprise scalability is most important in the U.S., with 26% of enterprises interviewed saying it is the most important priority in evaluating a new analytics vendor.

  • Data privacy and security concerns (49%) are the most formidable barrier enterprises face in gaining more effective use of their data and analytics. Enterprises from four of the five nations say data privacy and security are the most significant barrier they face in getting more value from analytics. In Japan, the greatest barrier is limited access to data across the organization (40%).

  • Globally, 41% of all enterprises are considering a move to the cloud in the next year. 64% of U.S.-based enterprises are considering moving to a cloud-based analytics platform or solution in the next year. The U.S. leads enterprises from all five nations in planned cloud-based analytics adoption, as the graphic below illustrates.

10 Ways To Improve Cloud ERP With AI And Machine Learning

Capitalizing on new digital business models and the growth opportunities they provide is forcing companies to re-evaluate ERP’s role. Made inflexible by years of customization, legacy ERP systems aren’t delivering what digital business models need today to scale and grow.

Legacy ERP systems were purpose-built to excel at production consistency first at the expense of flexibility and responsiveness to customers’ changing requirements. By taking a business case-based approach to integrating Artificial Intelligence (AI) and machine learning into their platforms, Cloud ERP providers can fill the gap legacy ERP systems can’t.

Closing Legacy ERP Gaps With Greater Intelligence And Insight

For new digital business models to succeed, companies need to respond quickly to unexpected, unfamiliar and unforeseen dilemmas with smart decisions, made fast. That’s not possible today with legacy ERP systems. Legacy IT technology stacks and the ERP systems built on them aren’t designed to deliver the data needed most.

That’s all changing fast. A clear, compelling business model and successful execution of its related strategies are what all successful Cloud ERP implementations share. Cloud ERP platforms and apps provide organizations the flexibility they need to prioritize growth plans over IT constraints. Many have taken an Application Programming Interface (API) approach to integrating with legacy ERP systems to gain the incremental data those systems provide. In today’s era of Cloud ERP, rip-and-replace is less commonplace than reorganizing entire IT architectures for greater speed, scale, and customer transparency using cloud-first platforms.
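As a simple illustration of the API-first integration pattern described above, the sketch below pulls open orders from a hypothetical legacy ERP REST endpoint and posts them to a hypothetical cloud ERP endpoint. The URLs, field names and token handling are assumptions for illustration only; real integrations depend on each vendor’s actual APIs.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoints, not real vendor APIs.
LEGACY_ERP_URL = "https://legacy-erp.example.com/api/v1/orders?status=open"
CLOUD_ERP_URL = "https://cloud-erp.example.com/api/v1/sales-orders"

def sync_open_orders(legacy_token: str, cloud_token: str) -> int:
    """Copy open orders from the legacy ERP into the cloud ERP via their REST APIs."""
    legacy = requests.get(LEGACY_ERP_URL,
                          headers={"Authorization": f"Bearer {legacy_token}"},
                          timeout=30)
    legacy.raise_for_status()
    synced = 0
    for order in legacy.json().get("orders", []):
        payload = {
            "external_id": order["id"],          # keep the legacy key for traceability
            "customer": order["customer_code"],
            "lines": order["lines"],
        }
        resp = requests.post(CLOUD_ERP_URL, json=payload,
                             headers={"Authorization": f"Bearer {cloud_token}"},
                             timeout=30)
        resp.raise_for_status()
        synced += 1
    return synced
```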

New business models thrive when an ERP system is constantly learning. That’s one of the greatest gaps between Cloud ERP platforms’ potential and where their legacy counterparts are today. Cloud platforms provide greater integration options and more flexibility to customize applications and improve usability, addressing one of the biggest drawbacks of legacy ERP systems. Designed to deliver results by providing AI- and machine learning-based insights, Cloud ERP platforms and apps can rejuvenate ERP systems and their contributions to business growth.

The following are the 10 ways to improve Cloud ERP with AI and machine learning, bridging the information gap with legacy ERP systems:

  1. Cloud ERP platforms need to create and strengthen a self-learning knowledge system that orchestrates AI and machine learning from the shop floor to the top floor and across supplier networks. Having a cloud-based infrastructure that integrates core ERP Web Services, apps, and real-time monitoring to deliver a steady stream of data to AI and machine learning algorithms accelerates how quickly the entire system learns. The Cloud ERP platform integration roadmap needs to include APIs and Web Services to connect with the many suppliers and buyer systems outside the walls of a manufacturer while integrating with legacy ERP systems to aggregate and analyze the decades of data they have generated.

  2. Virtual agents have the potential to redefine many areas of manufacturing operations, from pick-by-voice systems to advanced diagnostics. Apple’s Siri, Amazon’s Alexa, Google Voice, and Microsoft Cortana have the potential to be modified to streamline operations tasks and processes, bringing contextual guidance and direction to complex tasks. An example of one task virtual agents are being used for today is guiding production workers to select from the correct product bin as required by the Bill of Materials. Machinery manufacturers are piloting voice agents that can provide detailed work instructions that streamline configure-to-order and engineer-to-order production. Amazon has successfully partnered with automotive manufacturers and has the most design wins as of today. They could easily replicate this success with machinery manufacturers.

  3. Design in the Internet of Things (IoT) support at the data structure level to realize quick wins as data collection pilots go live and scale. Cloud ERP platforms have the potential to capitalize on the massive data stream IoT devices are generating today by designing in support at the data structure level first. Providing IoT-based data to AI and machine learning apps continually will bridge the intelligence gap many companies face today as they pursue new business models. Capgemini has provided an analysis of IoT use cases shown below, highlighting how production asset maintenance and asset tracking are quick wins waiting to happen. Cloud ERP platforms can accelerate them by designing in IoT support.

  4. AI and machine learning can provide insights into how Overall Equipment Effectiveness (OEE) can be improved that aren’t apparent today. Manufacturers will welcome the opportunity to have greater insights into how they can stabilize then normalize OEE performance across their shop floors. When a Cloud ERP platform serves as an always-learning knowledge system, real-time monitoring data from machinery and production assets provide much-needed insights into areas for improvement and what’s going well on the shop floor.

  5. Designing machine learning algorithms into track-and-traceability to predict which lots from which suppliers are most likely to be of the highest or lowest quality. Machine learning algorithms excel at finding patterns in diverse data sets by continually applying constraint-based algorithms. Suppliers vary widely in their quality and delivery schedule performance levels. Using machine learning, it’s possible to create a track-and-trace application that could indicate which lot from which supplier is the riskiest, as well as those that are of exceptional quality (a minimal sketch of this idea appears after this list).
  6. Cloud ERP providers need to pay attention to how they can help close the configuration gap that exists between PLM, CAD, ERP and CRM systems by using AI and machine learning. The most successful product configuration strategies rely on a single, lifecycle-based view of product configurations. They’re able to alleviate the conflicts between how engineering designs a product with CAD and PLM, how sales & marketing sell it with CRM, and how manufacturing builds it with an ERP system. AI and machine learning can enable configuration lifecycle management and avert lost time and sales, streamlining CPQ and product configuration strategies in the process.
  7. Improving demand forecasting accuracy and enabling better collaboration with suppliers based on insights from machine learning-based predictive models is attainable with higher quality data. By creating a self-learning knowledge system, Cloud ERP providers can vastly improve data latency rates that lead to higher forecast accuracy. Factoring in sales, marketing, and promotional programs further fine-tunes forecast accuracy.
  8. Reducing equipment breakdowns and increasing asset utilization by analyzing machine-level data to determine when a given part needs to be replaced. It’s possible to capture a steady stream of data on each machine’s health level using sensors equipped with an IP address. Cloud ERP providers have a great opportunity to capture machine-level data and use machine learning techniques to find patterns in production performance by using a production floor’s entire data set. This is especially important in process industries where machinery breakdowns lead to lost sales. Oil refineries are using machine learning models comprising more than 1,000 variables related to material inputs, outputs and process parameters, including weather conditions, to estimate equipment failures.
  9. Implementing self-learning algorithms that use production incident reports to predict production problems on assembly lines needs to happen in Cloud ERP platforms. A local aircraft manufacturer is doing this today by using predictive modeling and machine learning to compare past incident reports. With legacy ERP systems these problems would have gone undetected and turned into production slowdowns or worse, the line having to stop.
  10. Improving product quality by having machine learning algorithms aggregate, analyze and continually learn from supplier inspection, quality control, Return Material Authorization (RMA) and product failure data. Cloud ERP platforms are in a unique position of being able to scale across the entire lifecycle of a product and capture quality data from the supplier to the customer. With legacy ERP systems, manufacturers most often rely on an analysis of scrap materials by type or cause, followed by RMAs. It’s time to get to the truth about why products fail, and machine learning can deliver the insights to get there.
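As a minimal sketch of the track-and-trace idea in point 5 above, the example below trains a simple classifier to flag incoming supplier lots as high risk based on a handful of illustrative features (supplier defect history, incoming inspection failures, transit time). The features, sample data and choice of a scikit-learn random forest are assumptions for illustration, not a description of any specific vendor’s implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative training data: one row per previously received lot.
# Features: [supplier_defect_rate, incoming_inspection_failures, transit_days]
X = np.array([
    [0.010, 0, 2],
    [0.020, 1, 3],
    [0.080, 4, 9],
    [0.100, 5, 12],
    [0.015, 0, 4],
    [0.070, 3, 8],
])
# Label: 1 = lot later caused a quality problem, 0 = no problem reported.
y = np.array([0, 0, 1, 1, 0, 1])

model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# Score a newly received lot: the predicted probability that it is high risk.
new_lot = np.array([[0.06, 2, 7]])
risk = model.predict_proba(new_lot)[0, 1]
print(f"Predicted risk that this lot causes a quality problem: {risk:.0%}")
```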

IBM’s 2018 Data Breach Study Shows Why We’re In A Zero Trust World Now

  • Digital businesses that lost less than 1% of their customers due to a data breach incurred a cost of $2.8M, and if 4% or more were lost the cost soared to $6M.
  • U.S. based breaches are the most expensive globally, costing on average $7.91M with the highest global notification cost as well, $740,000.
  • A typical data breach costs a company $3.86M, up 6.4% from $3.62M last year.
  • Digital businesses that have security automation can minimize the costs of breaches by $1.55M versus those that do not ($2.88M versus $4.43M).
  • 48% of all breaches are initiated by malicious or criminal attacks.
  • Mean-time-to-identify (MTTI) a breach is 197 days, and the mean-time-to-contain (MTTC) is 69 days.

These and many other insights into the escalating costs of security breaches are from the 2018 Cost of a Data Breach Study sponsored by IBM Security with research independently conducted by Ponemon Institute LLC. The report is downloadable here (PDF, 47 pp. no opt-in).

The study is based on interviews with more than 2,200 compliance, data protection and IT professionals from 477 companies located in 15 countries and regions globally who have experienced a data breach in the last 12 months. This is the first year the use of Internet of Things (IoT) technologies and security automation are included in the study. The study also defines mega breaches as those involving over 1 million records and costing $40M or more. Please see pages 5, 6 and 7 of the study for specifics on the methodology.

The report is a quick read, and the data provided is fascinating. One can’t help but reflect on how legacy security technologies designed to protect digital businesses decades ago aren’t keeping up with the scale, speed and sophistication of today’s breach attempts. The most commonly attacked threat surface is compromised privileged credential access. 81% of all breaches exploit identity, according to an excellent study from Centrify and Dow Jones Customer Intelligence, CEO Disconnect is Weakening Cybersecurity (31 pp, PDF, opt-in).

The bottom line from the IBM, Centrify and many other studies is that we’re in a Zero Trust Security (ZTS) world now and the sooner a digital business can excel at it, the more protected they will be from security threats. ZTS begins with Next-Gen Access (NGA) by recognizing that every employee’s identity is the new security perimeter for any digital business.

Key takeaways from the study include the following:

  • U.S. based breaches are the most expensive globally, costing on average $7.91M, more than double the global average of $3.86M. Nations in the Middle East have the second-most expensive breaches globally, averaging $5.31M, followed by Canada, where the average breach costs a digital business $4.74M. Globally a breach costs a digital business $3.86M this year, up from $3.62M last year. With the costs of breaches escalating so quickly and the cost of a breach in the U.S. leading all nations and outdistancing the global average 2X, it’s time for more digital businesses to consider a Zero Trust Security strategy. See Forrester Principal Analyst Chase Cunningham’s recent blog post What ZTX Means For Vendors And Users, from the Forrester Research blog for where to get started.

  • The number of breached records is soaring in the U.S., which ranks 3rd among nations in breached records, 6,850 records above the global average. The Ponemon Institute found that the average size of a data breach increased 2.2% this year, with the U.S. leading all nations in breached records. It now takes an average of 266 days to identify and contain a breach (mean-time-to-identify (MTTI) is 197 days and mean-time-to-contain (MTTC) is 69 days), so more digital businesses in the Middle East, India, and the U.S. should consider reorienting their security strategies to a Zero Trust Security model.

  • French and U.S. digital businesses pay a heavy price in customer churn when a breach happens, among the highest in the world. The following graphic compares abnormally high customer churn rates, the size of the data breach, average total cost, and per capita costs by country.

  • U.S. companies lead the world in lost business caused by a security breach, with $4.2M lost per incident, over $2M more than digital businesses in the Middle East. Ponemon found that U.S.-based digital businesses pay an exceptionally high cost for customer churn caused by data breaches. Factors contributing to the high cost of lost business include abnormally high customer turnover, the high costs of acquiring new customers in the U.S., and loss of brand reputation and goodwill. U.S. customers also have a myriad of competitive options, and their loyalty is more difficult to preserve. The study finds that thanks to current notification laws, customers have greater awareness of data breaches and higher expectations regarding how the companies they are loyal to will protect customer records and data.

Conclusion

The IBM study foreshadows an increasing level of speed, scale, and sophistication in how breaches are orchestrated. With the average breach globally costing $3.86M and breach costs and lost customer revenue soaring in the U.S., it’s clear we’re living in a world where Zero Trust should be the new mandate.

Zero Trust Security starts with Next-Gen Access to secure every endpoint and attack surface a digital business relies on for daily operations, and limit access and privilege to protect the “keys to the kingdom,” which gives hackers the most leverage. Security software providers including Centrify are applying advanced analytics and machine learning to thwart breaches and many other forms of attacks that seek to exploit weak credentials and too much privilege. Zero Trust is a proven way to stay at parity or ahead of escalating threats.
