
IBM’s 2018 Data Breach Study Shows Why We’re In A Zero Trust World Now

  • Digital businesses that lost less than 1% of their customers due to a data breach incurred a cost of $2.8M, and if 4% or more were lost the cost soared to $6M.
  • U.S. based breaches are the most expensive globally, costing on average $7.91M with the highest global notification cost as well, $740,000.
  • A typical data breach costs a company $3.86M, up 6.4% from $3.62M last year.
  • Digital businesses with security automation reduce the cost of a breach by $1.55M compared to those without it ($2.88M versus $4.43M).
  • 48% of all breaches are initiated by malicious or criminal attacks.
  • Mean-time-to-identify (MTTI) a breach is 197 days, and the mean-time-to-contain (MTTC) is 69 days.
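Two of the headline figures above follow from simple arithmetic worth making explicit: the total breach lifecycle cited later in the report is the sum of MTTI and MTTC, and the automation savings is the difference between the two average breach costs. A quick sketch:

```python
# Sanity-check the arithmetic behind two of the headline figures above.

MTTI_DAYS = 197  # mean time to identify a breach
MTTC_DAYS = 69   # mean time to contain a breach

# Total breach "lifecycle": identify + contain.
breach_lifecycle_days = MTTI_DAYS + MTTC_DAYS

COST_WITH_AUTOMATION = 2.88     # $M, average breach cost with security automation
COST_WITHOUT_AUTOMATION = 4.43  # $M, average breach cost without it

automation_savings = COST_WITHOUT_AUTOMATION - COST_WITH_AUTOMATION

print(breach_lifecycle_days)         # 266 (days)
print(round(automation_savings, 2))  # 1.55 ($M)
```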

These and many other insights into the escalating costs of security breaches are from the 2018 Cost of a Data Breach Study sponsored by IBM Security with research independently conducted by Ponemon Institute LLC. The report is downloadable here (PDF, 47 pp., no opt-in).

The study is based on interviews with more than 2,200 compliance, data protection and IT professionals from 477 companies located in 15 countries and regions globally who have experienced a data breach in the last 12 months. This is the first year the use of Internet of Things (IoT) technologies and security automation are included in the study. The study also defines mega breaches as those involving over 1 million records and costing $40M or more. Please see pages 5, 6 and 7 of the study for specifics on the methodology.

The report is a quick read, and the data provided is fascinating. One can’t help but reflect on how legacy security technologies designed to protect digital businesses decades ago aren’t keeping up with the scale, speed and sophistication of today’s breach attempts. The most commonly attacked threat surface is compromised privileged credential access. 81% of all breaches exploit identity according to an excellent study from Centrify and Dow Jones Customer Intelligence, CEO Disconnect is Weakening Cybersecurity (31 pp., PDF, opt-in).

The bottom line from the IBM, Centrify and many other studies is that we’re in a Zero Trust Security (ZTS) world now, and the sooner a digital business can excel at it, the better protected it will be from security threats. ZTS begins with Next-Gen Access (NGA) by recognizing that every employee’s identity is the new security perimeter for any digital business.

Key takeaways from the study include the following:

  • U.S.-based breaches are the most expensive globally, costing on average $7.91M, more than double the global average of $3.86M. Nations in the Middle East have the second-most expensive breaches globally, averaging $5.31M, followed by Canada, where the average breach costs a digital business $4.74M. Globally, a breach costs a digital business $3.86M this year, up from $3.62M last year. With the costs of breaches escalating so quickly and the cost of a breach in the U.S. leading all nations, outdistancing the global average by 2X, it’s time for more digital businesses to consider a Zero Trust Security strategy. See Forrester Principal Analyst Chase Cunningham’s recent blog post What ZTX Means For Vendors And Users, from the Forrester Research blog, for where to get started.

  • The number of breached records is soaring in the U.S., which ranks third globally in records breached, 6,850 records above the global average. The Ponemon Institute found that the average size of a data breach increased 2.2% this year, with the U.S. leading all nations in breached records. It now takes an average of 266 days to identify and contain a breach (mean-time-to-identify (MTTI) is 197 days and mean-time-to-contain (MTTC) is 69 days), so more digital businesses in the Middle East, India, and the U.S. should consider reorienting their security strategies to a Zero Trust Security model.

  • French and U.S. digital businesses pay a heavy price in customer churn when a breach happens, among the highest in the world. The following graphic compares abnormally high customer churn rates, the size of the data breach, average total cost, and per capita costs by country.

  • U.S. companies lead the world in lost business caused by a security breach, with $4.2M lost per incident, over $2M more than digital businesses in the Middle East. Ponemon found that U.S.-based digital businesses pay an exceptionally high cost for customer churn caused by a data breach. Factors contributing to the high cost of lost business include abnormally high customer turnover, the high cost of acquiring new customers in the U.S., and loss of brand reputation and goodwill. U.S. customers also have a myriad of competitive options, and their loyalty is more difficult to preserve. The study finds that thanks to current notification laws, customers have a greater awareness of data breaches and higher expectations regarding how the companies they are loyal to will protect customer records and data.

Conclusion

The IBM study foreshadows an increasing level of speed, scale, and sophistication in how breaches are orchestrated. With the average breach globally costing $3.86M and breach costs and lost customer revenue soaring in the U.S., it’s clear we’re living in a world where Zero Trust should be the new mandate.

Zero Trust Security starts with Next-Gen Access to secure every endpoint and attack surface a digital business relies on for daily operations, and limit access and privilege to protect the “keys to the kingdom,” which gives hackers the most leverage. Security software providers including Centrify are applying advanced analytics and machine learning to thwart breaches and many other forms of attacks that seek to exploit weak credentials and too much privilege. Zero Trust is a proven way to stay at parity or ahead of escalating threats.


Zero Trust Security Is The Growth Catalyst IoT Needs

  • McKinsey predicts the Internet of Things (IoT) market will be worth $581B for ICT-based spend alone, growing at a Compound Annual Growth Rate (CAGR) between 7 and 15%, according to their study Internet of Things: The IoT opportunity – Are you ready to capture a once-in-a-lifetime value pool?
  • By 2020, Discrete Manufacturing, Transportation & Logistics and Utilities industries are projected to spend $40B each on IoT platforms, systems, and services according to Statista.
  • The Industrial Internet of Things (IIoT) market is predicted to reach $123B in 2021, attaining a CAGR of 7.3% through 2020 according to Accenture.

IoT is forecast to be one of the tech industry’s fastest-growing sectors in the next three to five years, as many market estimates like the ones above illustrate. The one factor that will fuel IoT to rapidly grow to new heights or deflate demand just as quickly is security across the myriad of endpoints.

Zero Trust Security (ZTS) is the force multiplier IoT needs to reach its true potential and must be designed into IoT networks if they are going to flex and scale for every endpoint and protect every threat surface.

IoT Needs A Security Wake-Up Call Now  

Industrial Control Systems (ICS) provides a cautionary tale for anyone who thinks enterprise networks don’t need endpoint security and the ability to control access from any point inside or outside an organization.

Chemical, electricity, food & beverage, gas, healthcare, oil, transportation, water services and other key infrastructure industries have relied on ICS applications and platforms for decades. They were designed to deliver reliability and uptime first with little if any effort put into securing them.

However, the glaring security gaps in ICS provide the following lessons for IoT adoption now and in the future:

  • Only digitally enable an endpoint that can verify if every person or device attempting access is authorized, down to the risk score and device level. ICS endpoints were added as fast as utility companies and manufacturers could enable them, with speed of deployment, reliability measurement, and uptime being the highest priorities. Security wasn’t a priority, with predictable results: many nations’ power grids are now vulnerable to attack due to this oversight. With IoT, utilities need to start designing in security to the sensor level using Next-Gen Access as the foundation, leveraging Identity-as-a-Service (IDaaS), Enterprise Mobility Management (EMM) and Privileged Access Management (PAM) to enable Zero Trust strategies organization-wide. Next-Gen Access calculates a risk score predicated on previous authorized login and resource access patterns for each verified account. When there is an anomaly in account credentials’ use, users are requested to verify with Multi-Factor Authentication (MFA).
  • An ICS doesn’t learn from security mistakes, while NGA gets smarter with every breach attempt. A typical ICS is designed to make operations more efficient and reliable, not secure. Even with many endpoints of an ICS being digitally-enabled today with device retrofitting common, security still isn’t a priority. Instead of digitally enabling IoT sensors purely for efficiency, Next-Gen Access needs to be designed in at the sensor level to protect entire networks. Zero Trust Security’s four main pillars are to verify the user, validate their device, limit access and privilege, and learn and adapt. Machine learning is relied on for learning and adapting in real-time to access requests and threats.
  • An ICS assumes no bad actors exist, while NGA knows how to stop them. Bad actors, or those who want to breach a system for financial gain or to harm a business, aren’t just outside. Verizon’s 2017 Data Breach Investigations Report finds that 25% of all breaches are initiated from inside an organization and 75% from outside, which makes NGA essential for attaining Zero Trust Security on an enterprise level. Of the ICS being protected today, the majority rely on trusted and untrusted domains, a security approach over two decades old. When organized crime, state-sponsored hacking organizations or internal employees can quickly compromise privileged credentials, entire utility systems are at risk.
  • Replacing security-obsolete ICS with IoT-based systems that have NGA designed in to flex for every person and device shuts down physical and digital attack vectors organization-wide. The strategic security plan for any IoT-enabled enterprise has to prioritize faster automated discovery, configuration and response if it’s going to survive against highly orchestrated attacks. NGA has proven effective at thwarting unauthorized privileged credential attacks while continually learning from usage patterns of authorized and unauthorized users.
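The “learn and adapt” pillar described above can be illustrated with a deliberately simple sketch. This is not any vendor’s actual algorithm; it invents a per-user baseline of login hours and steps up to Multi-Factor Authentication when a new login falls outside that baseline:

```python
# Illustrative sketch (not any NGA product's real algorithm) of
# anomaly-triggered MFA step-up based on a user's login-hour history.
from statistics import mean, stdev

def requires_mfa(login_hours_history, new_login_hour, z_threshold=2.0):
    """Return True when the new login hour is anomalous vs. the baseline."""
    if len(login_hours_history) < 5:
        return True  # not enough history to trust this account yet
    mu = mean(login_hours_history)
    sigma = stdev(login_hours_history) or 1.0  # avoid divide-by-zero
    z = abs(new_login_hour - mu) / sigma
    return z > z_threshold

history = [9, 9, 10, 8, 9, 10, 9]   # typical 9am-ish logins
print(requires_mfa(history, 9))      # False: in-pattern login is allowed
print(requires_mfa(history, 3))      # True: a 3am login triggers MFA
```

A production system would of course score many more signals (device, location, resource sensitivity) and retrain the baseline continuously; the point is that every new access attempt both gets verified and feeds the model.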

Conclusion

ICS have some of the most porous, incomplete security perimeters of any enterprise systems. 63% of all ICS-related vulnerabilities cause processing plants to lose control of operations, and 71% can obfuscate or block the view of operations immediately, according to the Dragos Industrial Control Vulnerabilities 2017 in Review. ICS need an overhaul starting with Next-Gen Access, enabling Zero Trust Security across every employee and device that forms an organization’s security perimeter.

Bain & Company released a study on the price elasticity of IoT-enabled products by security level. They found that 93% of the executives surveyed would pay an average of 22% more for devices with better security. Taken together, Bain estimates that improving security solutions for these devices could grow the IoT cybersecurity market by $9B to $11B.

The speed at which manufacturers are building smart, connected products accentuates the need for Zero Trust Security powered by Next-Gen Access from their inception. Security as an afterthought won’t be effective at the scale and pace of IoT.

Source: Bain Snap Chart, July 2018, Better IoT Security Could Grow Device Market

 

Zero Trust Security Update From The SecurIT Zero Trust Summit

  • Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords.
  • 53% of enterprises feel they are more susceptible to threats since 2015.
  • 51% of enterprises suffered at least one breach in the past 12 months and malicious insider incidents increased 11% year-over-year.

These and many other fascinating insights are from SecurIT: the Zero Trust Summit for CIOs and CISOs, held last month in San Francisco, CA. CIO and CSO produced the event, which included informative discussions and panels on how enterprises are adopting Next-Gen Access (NGA) and enabling Zero Trust Security (ZTS). What made the event noteworthy were the insights gained from presentations and panels where senior IT executives from Akamai, Centrify, Cisco, Cylance, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone shared their key insights and lessons learned from implementing Zero Trust Security.

Zero Trust’s creator is John Kindervag, a former Forrester analyst and now Field CTO at Palo Alto Networks. Zero Trust Security is predicated on the concept that an organization doesn’t trust anything inside or outside its boundaries and instead verifies anything and everything before granting access. Please see Dr. Chase Cunningham’s excellent recent blog post, What ZTX means for vendors and users, for an overview of the current state of ZTS. Dr. Cunningham is a Principal Analyst at Forrester.

Key takeaways from the Zero Trust Summit include the following:

  • Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords. Tom Kemp, Co-Founder and CEO, Centrify, provided key insights into the current state of enterprise IT security and how existing methods aren’t scaling to fully protect every application, endpoint, and infrastructure of a digital business. He illustrated how $86B was spent on cybersecurity, yet a stunning 66% of companies were still breached. Companies targeted for breaches had already averaged five or more separate breaches. The following graphic underscores how identities are the new enterprise perimeter, making NGA and ZTS a must-have for any digital business.

  • 53% of enterprises feel they are more susceptible to threats since 2015. Chase Cunningham’s presentation, Zero Trust and Why Does It Matter, provided insights into the threat landscape and a thorough definition of ZTX, which is the application of a Zero Trust framework to an enterprise. Dr. Cunningham is a Principal Analyst at Forrester Research serving security and risk professionals. Forrester found the percentage of enterprises who feel they are more susceptible to threats nearly doubled in two years, jumping from 28% in 2015 to 53% in 2017. Dr. Cunningham provided examples of how breaches have immediate financial implications on the market value of any business, with a specific focus on the Equifax breach.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs

  • 51% of enterprises suffered at least one breach in the past 12 months, and malicious insider incidents increased 11% year-over-year. 43% of confirmed breaches in the last 12 months are from an external attack, 24% from internal attacks, 17% from third-party incidents and 16% from lost or stolen assets. Consistent with Verizon’s 2018 Data Breach Investigations Report, compromised privileged credential access is a leading cause of breaches today.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs


  • One of Zero Trust Security’s innate strengths is the ability to flex and protect the perimeter of any growing digital business at the individual level, encompassing workforce, customers, and distributors. Akamai, Cisco, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone each provided examples of how their organizations are relying on NGA to enable ZTS enterprise-wide. Every speaker provided examples of how ZTS delivers several key benefits, including the following: First, ZTS reduces the time to breach detection and improves visibility throughout a network. Second, organizations provided examples of how ZTS is reducing capital and operational expenses for security, in addition to reducing the scope and cost of compliance initiatives. All companies presenting at the conference provided examples of how ZTS is enabling greater data awareness and insight, eliminating inter-silo finger-pointing over security responsibilities and, for several, enabling digital business transformation. Every organization is also seeing ZTS thwart the exfiltration and destruction of their data.

Conclusion

The SecurIT: the Zero Trust Summit for CIOs and CISOs event encapsulated the latest advances in how NGA is enabling ZTS by having enterprises who are adopting the framework share their insights and lessons learned. It’s fascinating to see how Akamai, Cisco, Intel, Live Nation Entertainment, YapStone, and others are tailoring ZTS to their specific customer-driven goals. Each also shared their plans for growth and how security in general and NGA and ZTS specifically are protecting customer and company data to ensure growth continues, uninterrupted.

 

 

Where Business Intelligence Is Delivering Value In 2018

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018.
  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018.
  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018.
  • Organizations successful with analytics and BI apps define success in business results, while unsuccessful organizations concentrate on adoption rate first.
  • 50% of vendors offer perpetual on-premises licensing in 2018, a notable decline over 2017. The number of vendors offering subscription licensing continues to grow for both on-premises and public cloud models.
  • Fewer than 15% of respondent organizations have a Chief Data Officer, and only about 10% have a Chief Analytics Officer today.

These and many other fascinating insights are from Dresner Advisory Services’ 2018 Wisdom of Crowds® Business Intelligence Market Study. In its ninth annual edition, the study provides a broad assessment of the business intelligence (BI) market and a comprehensive look at key user trends, attitudes, and intentions. The latest edition of the study adds Information Technology (IT) analytics, sales planning, and GDPR, bringing the total to 36 topics under study.

“The Wisdom of Crowds BI Market Study is the cornerstone of our annual research agenda, providing the most in-depth and data-rich portrait of the state of the BI market,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Drawn from the first-person perspective of users throughout all industries, geographies, and organization sizes, who are involved in varying aspects of BI projects, our report provides a unique look at the drivers of and success with BI.” Survey respondents include IT (28%), followed by Executive Management (22%), Finance (19%), Sales/Marketing (8%), and the Business Intelligence Competency Center (BICC) (7%). Please see page 15 of the study for specifics on the methodology.

Key takeaways from the study include the following:

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018. Executive management teams are taking more of an active ownership role in BI initiatives in 2018, as this group replaced Operations as the leading department driving BI adoption this year. The study found that the greatest percentage change in functional areas driving BI adoption includes Human Resources (7.3%), Marketing (5.9%), BICC (5.1%) and Sales (5%).

  • Making better decisions, improving operational efficiencies, growing revenues and increased competitive advantage are the top four BI objectives organizations have today. Additional goals include enhancing customer service and attaining greater degrees of compliance and risk management. The graph below rank orders the importance of BI objectives in 2018 compared to the percent change in BI objectives between 2017 and 2018. Enhanced customer service is the fastest growing objective enterprises adopt BI to accomplish, followed by growth in revenue (5.4%).

  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018. The study found that second-tier initiatives including data discovery, data mining/advanced algorithms, data storytelling, integration with operational processes, and enterprise and sales planning are also critical or very important to enterprises participating in the survey. Heavily hyped technology areas including the Internet of Things, cognitive BI, and in-memory analysis rank relatively low today, yet are growing; edge computing, for example, increased 32% as a priority between 2017 and 2018. The results indicate that the core priorities of using BI to drive better business decisions and more revenue still dominate at most businesses today.
  • Sales & Marketing, the Business Intelligence Competency Center (BICC) and Executive Management have the highest level of interest in dashboards and advanced visualization. Finance has the greatest interest in enterprise planning and budgeting. Operations (including manufacturing, supply chain management, and services) leads interest in data mining, data storytelling, integration with operational processes, mobile device support, data catalog and several other technologies and initiatives. It’s understandable that BICC leaders most advocate end-user self-service and attach high importance to many other categories, as they are internal service bureaus to all departments in an enterprise. It’s been my experience that BICCs are always looking for ways to scale BI adoption and enable every department to gain greater value from analytics and BI apps. BICCs in the best-run companies are knowledge hubs that encourage and educate all departments on how to excel with analytics and BI.

  • Insurance companies most prioritize dashboards, reporting, end-user self-service, data warehousing, data discovery and data mining. Business Services lead the adoption of advanced visualization, data storytelling, and embedded BI. Manufacturing most prioritizes sales planning and enterprise planning but trails in other high-ranking priorities. Technology prioritizes Software-as-a-Service (SaaS) given its scale and speed advantages. The retail & wholesale industry is going through an analytics and customer experience revolution today. Retailers and wholesalers lead all others in data catalog adoption and mobile device support.

  • Insurance, Technology and Business Services vertical industries have the highest rate of BI adoption today. The Insurance industry leads all others in BI adoption, followed by the Technology industry with 40% of organizations having 41% or greater adoption or penetration. Industries whose BI adoption is above average include Business Services and Retail & Wholesale. The following graphic illustrates penetration or adoption of Business Intelligence solutions today by industry.

  • Dashboards, reporting, advanced visualization, and data warehousing are the highest-priority investment areas for companies whose budgets increased from 2017 to 2018. The study found that less well-funded organizations are the most likely to invest in open source software to reduce costs.

  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018. Factors contributing to the high adoption rate for BI in small businesses include business models that need advanced analytics to function and scale, employees with the latest analytics and BI skills being hired to also scale high growth businesses and fewer barriers to adoption compared to larger enterprises. BI adoption tends to be more pervasive in small businesses as a greater percentage of employees are using analytics and BI apps daily.

  • Executive Management is most familiar with the type and number of BI tools in use across the organization. The majority of executive management respondents say their teams are using one or two BI tools today. Business Intelligence Competency Centers (BICC) consistently report a higher number of BI tools in use than other functional areas, given their heavy involvement in all phases of analytics and BI project execution. IT, Sales & Marketing and Finance are likely to have more BI tools in use than Operations.

  • Enterprises rate BI application usability and product quality & reliability at an all-time high in 2018. Other areas of major improvement on the part of vendors include ease of implementation, online training, forums and documentation, and completeness of functionality. Dresner’s research team found that between 2017 and 2018 ratings for integration of components within products dropped, as did scalability. The study concludes the drop in integration expertise is due to an increasing number of software company acquisitions aggregating dissimilar products from different platforms.

Analytics Are Empowering Next-Gen Access And Zero Trust Security

Employee identities are the new security perimeter of any business.

80% of IT security breaches involve privileged credential access according to a Forrester study. According to the Verizon Mobile Security Index 2018 Report, 89% of organizations are relying on just a single security strategy to keep their mobile networks safe. And with Gartner predicting worldwide security spending reaching $96B this year, up 8% from 2017, it’s evident enterprises must adopt a more vigilant, focused strategy for protecting every threat surface and access point of their companies. IT security strategies based on trusted and untrusted domains are being rendered insufficient as hackers camouflage their attacks through compromised, privileged credentials. It’s happening so often that eight in ten breaches are now the result of compromised employee identities.

Thus, taking a Zero Trust Security (ZTS) approach to ensure every potential threat surface and endpoint, both within and outside a company, is protected, has become vital in today’s dynamic threat landscape. ZTS is an essential strategy for any digital business whose perimeters flex in response to customer demand, are using the Internet of Things (IoT) sensors to streamline supply chain and production logistics, and have suppliers, sales teams, support, and services all using mobile apps.  ZTS begins with Next-Gen Access (NGA) by providing companies with the agility they need to secure applications, devices, endpoints, and infrastructure as quickly as needed to support company growth. Both NGA and ZTS are empowered by analytics to anticipate and thwart a wide variety of cyber threats, the most common of which is compromised credential access.

How NGA Leverages Analytics to Secure Every Endpoint

NGA validates every access attempt by capturing and quickly analyzing a wide breadth of data including user identity, device, device operating system, location, time, resource request, and several other factors. As NGA is designed to verify every user and access attempt, it’s foundational to attaining Zero Trust Security across an IT infrastructure. One of the fascinating areas of innovation in enterprise security today is the rapid adoption of analytics and machine learning for verifying users across diverse enterprise networks. NGA platforms calculate and assign a risk score to every access attempt, determining immediately if verified users will get immediate access to resources requested, or be asked to verify their identity further through Multi-Factor Authentication (MFA).
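The risk-scoring flow described above can be sketched in a few lines. The factor names, weights, and thresholds here are invented for illustration and are not taken from any NGA product:

```python
# Hypothetical illustration of scoring an access attempt from several
# contextual signals; all weights and thresholds are made up for the example.

RISK_WEIGHTS = {
    "unknown_device": 30,
    "unusual_location": 25,
    "unusual_time": 15,
    "sensitive_resource": 20,
    "outdated_os": 10,
}

def score_access_attempt(signals):
    """Sum the weights of the risk signals present in this attempt."""
    return sum(w for name, w in RISK_WEIGHTS.items() if signals.get(name))

def access_decision(score):
    """Map a risk score to an action: allow, step-up MFA, or block."""
    if score < 20:
        return "allow"
    if score < 60:
        return "mfa"
    return "block"

attempt = {"unknown_device": True, "unusual_time": True}
score = score_access_attempt(attempt)  # 30 + 15 = 45
print(score, access_decision(score))   # 45 mfa
```

In a real platform the weights would be learned and continuously tuned by machine learning rather than hard-coded, but the shape of the decision, score the context, then allow, step up, or block, is the same.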

Machine learning-based NGA platforms including Centrify calculate a risk score that quantifies the relative level of trust for every access attempt across an IT infrastructure. NGA platforms rely on machine learning algorithms to continuously learn and generate contextual intelligence that is used to streamline verified users’ access while thwarting many potential threats ― the most common of which is compromised credentials. IT security teams can combine the insights gained from machine learning, user profiles, and contextual intelligence to fine-tune the variables and attributes that calculate risk scores using cloud-enabled analytics services. An example of Centrify’s Analytics Services dashboard is shown below:

Visibility and Analytics are a Core Pillar of ZTS

Analytics, machine learning and their combined potential to produce contextual intelligence, real-time risk scores, and secure company perimeters to the individual access attempt level need a continual stream of data to increase their accuracy. Forrester’s Zero Trust Framework, shown below, illustrates how an enterprise-wide ZTS security strategy encompasses workloads, networks, devices, and people.  NGA is the catalyst that makes ZTS scale into each of these areas. It’s evident from the diagram how essential visibility and analytics are to a successful ZTS strategy. NGA provides incident data including reports of anomalous or atypical login and attempted resource behavior. Visibility and analytics applications from IBM, Splunk, Sumologic, and others are relied on to aggregate the data, anticipating and predicting breaches and advanced attacks. The result is a ZTS security strategy that begins with NGA that flexes and scales to the individual perimeter level as a digital business grows.

Source: What ZTX Means For Vendors And Users, Forrester Research Blog, January 23, 2018, Chase Cunningham, Principal Analyst.

Conclusion

Every company, whether it realizes it or not, is in a race against time to secure every threat surface that could be compromised and used to steal or destroy data and systems. Relying on yesterday’s security technologies to protect against tomorrow’s sophisticated, well-orchestrated threats isn’t scaling. Reading through the Verizon Mobile Security Index 2018 Report illustrates why Zero Trust Security is the future. Improving visibility throughout the network, reducing the time to breach detection, stopping malware propagation, and reducing the scope and cost of internal and regulatory-mandated compliance requirements are just a few of the business benefits. Analytics and machine learning are the fuel enabling NGA to scale and support the success of ZTS strategies today.

10 Charts That Will Change Your Perspective Of Big Data’s Growth

  • Worldwide Big Data market revenues for software and services are projected to increase from $42B in 2018 to $103B in 2027, attaining a Compound Annual Growth Rate (CAGR) of 10.48% according to Wikibon.
  • Forrester predicts the global Big Data software market will be worth $31B this year, growing 14% from the previous year. The entire global software market is forecast to be worth $628B in revenue, with $302B from applications.
  • According to an Accenture study, 79% of enterprise executives agree that companies that do not embrace Big Data will lose their competitive position and could face extinction. Even more, 83%, have pursued Big Data projects to seize a competitive edge.
  • 59% of executives say Big Data at their company would be improved through the use of AI according to PwC.

Sales and Marketing, Research & Development (R&D), Supply Chain Management (SCM) including distribution, Workplace Management and Operations are where advanced analytics including Big Data are making the greatest contributions to revenue growth today. McKinsey Analytics’ study Analytics Comes of Age, published in January 2018 (PDF, 100 pp., no opt-in) is a comprehensive overview of how analytics technologies and Big Data are enabling entirely new ecosystems, serving as a foundational technology for Artificial Intelligence (AI). McKinsey finds that analytics and Big Data are making the most valuable contributions in the Basic Materials and High Tech industries. The first chart in the following series of ten is from the McKinsey Analytics study, highlighting how analytics and Big Data are revolutionizing many of the foundational business processes of Sales and Marketing.

The following ten charts provide insights into Big Data’s growth:

  • Nearly 50% of respondents to a recent McKinsey Analytics survey say analytics and Big Data have fundamentally changed business practices in their sales and marketing functions. Also, more than 30% say the same about R&D across industries, with respondents in High Tech and Basic Materials & Energy reporting the greatest number of functions being transformed by analytics and Big Data. Source: Analytics Comes of Age, published in January 2018 (PDF, 100 pp., no opt-in).

  • Worldwide Big Data market revenues for software and services are projected to increase from $42B in 2018 to $103B in 2027, attaining a Compound Annual Growth Rate (CAGR) of 10.48%. As part of this forecast, Wikibon estimates the worldwide Big Data market is growing at an 11.4% CAGR between 2017 and 2027, growing from $35B to $103B. Source: Wikibon and reported by Statista.

  • According to NewVantage Venture Partners, Big Data is delivering the most value to enterprises by decreasing expenses (49.2%) and creating new avenues for innovation and disruption (44.3%). Discovering new opportunities to reduce costs by combining advanced analytics and Big Data delivers the most measurable results, further leading to this category being the most prevalent in the study. 69.4% have started using Big Data to create a data-driven culture, with 27.9% reporting results. Source: NewVantage Venture Partners, Big Data Executive Survey 2017 (PDF, 16 pp.)

  • The Hadoop and Big Data market is projected to grow from $17.1B in 2017 to $99.31B in 2022, attaining a 28.5% CAGR. The greatest period of projected growth is in 2021 and 2022, when the market is projected to jump $30B in value in a single year. Source: StrategyMRC and reported by Statista.

  • Big Data applications and analytics revenue is projected to grow from $5.3B in 2018 to $19.4B in 2026, attaining a CAGR of 15.49%. The worldwide Big Data market for professional services is projected to grow from $16.5B in 2018 to $21.3B in 2026. Source: Wikibon and reported by Statista.

  • Comparing the worldwide demand for advanced analytics and Big Data-related hardware, services and software, the software segment’s dominance becomes clear. Software is projected to increase the fastest of all categories, growing from $14B in 2018 to $46B in 2027, attaining a CAGR of 12.6%. Sources: Wikibon; SiliconANGLE; Statista estimates, reported by Statista.

  • Advanced analytics and Big Data revenue in China is projected to be worth ¥57.8B ($9B) by 2020. The Chinese market is predicted to be one of the fastest growing globally, growing at a CAGR of 31.72% over the forecast period. Sources: Social Sciences Academic Press (China) and Statista.

  • Non-relational analytic data stores are projected to be the fastest growing technology category in Big Data, growing at a CAGR of 38.6% between 2015 and 2020. Cognitive software platforms (23.3% CAGR) and Content Analytics (17.3%) round out the top three fastest growing technologies between 2015 and 2020. Source: Statista.

  • A decentralized general-merchandise retailer that used Big Data to create performance group clusters saw sales grow 3% to 4%. Big Data is the catalyst of a retailing industry makeover, bringing greater precision to localization than has been possible before. Big Data is being used today to increase the ROI of endcap promotions, optimize planograms, help to improve upsell and cross-sell sales performance and optimize prices on items that drive the greatest amount of foot traffic. Source: Use Big Data to Give Local Shoppers What They Want, Boston Consulting Group, February 8, 2018.

  • 84% of enterprises have launched advanced analytics and Big Data initiatives to bring greater accuracy and speed to their decision-making. Big Data initiatives focused on this area also have the greatest success rate (69%), according to the most recent NewVantage Venture Partners Survey. Over a third of enterprises, 36%, say this area is their top priority for advanced analytics and Big Data investment. Sources: NewVantage Venture Partners Survey and Statista.
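The CAGR figures cited in these charts follow directly from the start and end values and the number of growth years. A quick sketch reproducing the two Wikibon rates quoted above:

```python
# Reproduce the CAGR figures cited above.
# CAGR = (ending_value / starting_value) ** (1 / years) - 1

def cagr(start, end, years):
    """Compound Annual Growth Rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

# Wikibon: $42B (2018) -> $103B (2027), 9 years of growth
print(f"2018-2027: {cagr(42, 103, 9):.2%}")   # 10.48%

# Wikibon: $35B (2017) -> $103B (2027), 10 years of growth
print(f"2017-2027: {cagr(35, 103, 10):.2%}")  # 11.40%
```

Note that the growth-year count matters: the 10.48% figure uses nine years of growth between the 2018 and 2027 values, while the 11.4% figure uses ten years between 2017 and 2027.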

Additional Big Data Information Sources:

4 Pain Points of Big Data and how to solve them, Digital McKinsey via Medium, November 10, 2017

53% Of Companies Are Adopting Big Data Analytics, Forbes, December 24, 2017

6 Predictions For The $203 Billion Big Data Analytics Market, Forbes, Gil Press, January 20, 2017

Analytics Comes of Age, McKinsey Analytics, January 2018 (PDF, 100 pp.)

Big Data & Analytics Is The Most Wanted Expertise By 75% Of IoT Providers, Forbes, August 21, 2017

Big Data 2017 – Market Statistics, Use Cases, and Trends, Calsoft (36 pp., PDF)

Big Data and Business Analytics Revenues Forecast to Reach $150.8 Billion This Year, Led by Banking and Manufacturing Investments, According to IDC, March 14, 2017

Big Data Executive Survey 2018, Data and Innovation – How Big Data and AI are Driving Business Innovation, NewVantage Venture Partners, January 2018 (PDF, 18 pp.)

Big Data Tech Hadoop and Spark Get Slow Start in Enterprise, Information Week, March 20, 2018

Big Success With Big Data, Accenture  (PDF, 12 pp.)

Gartner Survey Shows Organizations Are Slow to Advance in Data and Analytics, Gartner, February 5, 2018

How Big Data and AI Are Driving Business Innovation in 2018, MIT Sloan Management Review, February 5, 2018

IDC forecasts big growth for Big Data, Analytics Magazine. April 2018

IDC Worldwide Big Data Technology and Services 2012 – 2015 Forecast, Courtesy of EC Europa (PDF, 34 pp.)

Midyear Global Tech Market Outlook For 2017 To 2018, Forrester, September 25, 2017 (client access reqd.)

Oracle Industry Analyst Reports – Data-rich website of industry analyst reports

Ten Ways Big Data Is Revolutionizing Marketing And Sales, Forbes, May 9, 2016

The Big Data Payoff: Turning Big Data into Business Value, CAP Gemini & Informatica Study, (PDF, 12 pp.)

The Forrester Wave™: Enterprise BI Platforms With Majority Cloud Deployments, Q3 2017 courtesy of Oracle

Three Ways Machine Learning Is Revolutionizing Zero Trust Security

Bottom Line: Zero Trust Security (ZTS) starts with Next-Gen Access (NGA). Capitalizing on machine learning technology to enable NGA is essential in achieving user adoption, scalability, and agility in securing applications, devices, endpoints, and infrastructure.

How Next-Gen Access and Machine Learning Enable Zero Trust Security

Zero Trust Security provides digital businesses with the security strategy they need to keep growing by scaling across each new perimeter and endpoint created as a result of growth. ZTS in the context of Next-Gen Access is built on four main pillars: (1) verify the user, (2) validate their device, (3) limit access and privilege, and (4) learn and adapt. The fourth pillar relies heavily on machine learning to discover risky user behavior and apply conditional access without impacting user experience, by looking for contextual and behavioral patterns in access data.

As ZTS assumes that untrusted users or actors already exist both inside and outside the network, machine learning provides NGA with the capability to assess data about users, their devices, and their behavior to allow access, block access, or enforce additional authentication. With machine learning, policies and user profiles can be adjusted automatically and in real time. While NGA enabled by machine learning delivers dashboards and alerts, it is the real-time response to security threats predicated on risk scores that is most effective in thwarting breaches before they start.

Building NGA apps on machine learning technology yields the benefits of being non-intrusive, supporting the productivity of the workforce and business partners, and ultimately allowing digital businesses to grow without interruption. For example, Centrify’s rapid advances in using machine learning and Next-Gen Access to enable ZTS strategies make it one of the most interesting companies to watch in enterprise security.

The following are three ways machine learning is revolutionizing Zero Trust Security:

  1. Machine learning enables enterprises to adopt a risk-based security strategy that can flex with their business as it grows. Many digital businesses have realized that “risk is security’s new compliance,” and therefore are implementing a risk-driven rather than a compliance-driven approach. Relying on machine learning technology to assess user, device, and behavioral data for each access request derives a real-time risk score. This risk score can then be used to determine whether to allow access, block access, or step up authentication. In evaluating each access request, machine learning engines process multiple factors, including the location of the access attempt, browser type, operating system, endpoint device status, user attributes, time of day, and recent unusual privilege changes. Machine learning algorithms are also scaling to take into account unusual commands run, unusual resource access histories, unusual accounts used, unusual privileges requested and used, and more. This approach helps thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon.
  2. Machine learning makes it possible to accomplish security policy alignment at scale. To keep pace with a growing digital business’ need to flex and scale to support new business models, machine learning also assists in automatically adjusting user profiles and access policies based on behavioral patterns. By doing so, the need for IT staffers to review and adjust policies vanishes, freeing them up to focus on work that will grow the business faster and more profitably. In turn, end users are not burdened with step-up authentication once previously abnormal behavior is identified as typical, and both the user profile and policies are updated accordingly.
  3. Machine learning brings greater contextual intelligence into authentication, streamlining the experience and increasing user adoption. Ultimately, the best security is transparent and non-intrusive. That’s where the use of risk-based authentication and machine learning technology comes into play. The main impediment to adoption of multi-factor authentication has been the perceived impact on the productivity and agility of end users. A recent study by Dow Jones Customer Intelligence and Centrify revealed that 62% of CEOs state that multi-factor authentication (MFA) is difficult to manage and is not user-friendly, while only 41% of technical officers (CIOs, CTOs, and CISOs) agree with this assessment. For example, having to manually type in a code transmitted via SMS in addition to the already supplied username and password is often seen as cumbersome. Technology advancements are removing some of these objections by offering a more user-friendly experience, such as eliminating the need to manually enter a one-time password on the endpoint by enabling the user to simply click a button on their smartphone. Nonetheless, some users still express frustration with this additional step, even if it is relatively quick and simple. To overcome these remaining barriers to adoption, machine learning technology helps minimize exposure to step-up authentication over time, as the engine learns and adapts to behavioral patterns.
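The risk-tiered decision logic described above can be sketched in a few lines. This is a hypothetical illustration only: the signals, weights, and thresholds below are invented for the example, not any vendor's actual scoring model.

```python
# Hypothetical sketch of risk-scored access decisions: each contextual
# signal contributes to a risk score, and the score selects the response
# (allow, step-up authentication, or block). Weights and thresholds
# are invented for illustration.

RISK_WEIGHTS = {
    "unknown_device": 30,
    "unusual_location": 25,
    "unusual_time_of_day": 10,
    "recent_privilege_change": 20,
    "atypical_resource_access": 25,
}

def risk_score(signals):
    """Sum the weights of the risk signals observed for this access attempt."""
    return sum(RISK_WEIGHTS[s] for s in signals)

def access_decision(signals, step_up_at=30, block_at=70):
    """Map the risk score to one of three responses."""
    score = risk_score(signals)
    if score >= block_at:
        return "block"
    if score >= step_up_at:
        return "step-up MFA"
    return "allow"

print(access_decision([]))                                      # allow (score 0)
print(access_decision(["unusual_location"]))                    # allow (score 25)
print(access_decision(["unknown_device", "unusual_location"]))  # step-up MFA (55)
print(access_decision(["unknown_device", "unusual_location",
                       "recent_privilege_change"]))             # block (75)
```

In a real system the weights themselves would be learned and continuously re-fit from access data rather than hand-set, which is what lets the step-up prompts fade as previously abnormal behavior becomes typical.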

In Conclusion

Zero Trust Security through the power of Next-Gen Access is allowing digital businesses to continue on their path of growth while safeguarding their patented ideas and intellectual property. Relying on machine learning technology for Next-Gen Access delivers real-time security, identifying high-risk events and greatly minimizing the effort required to identify threats across today’s hybrid IT environments.

The Best Big Data Companies And CEOs To Work For In 2018

Forbes readers’ most common requests center on who the best companies are to work for in analytics, big data, data management, data science and machine learning. The latest Computer Reseller News‘ 2018 Big Data 100 list of companies is used to complete the analysis as it is an impartial, independent list aggregated based on CRN’s analysis and perspectives of the market. Using the CRN list as a foundation, the following analysis captures the best companies in their respective areas today.

Using the 2018 Big Data 100 CRN list as a baseline, the following analysis compares each company’s Glassdoor scores for the percentage of employees who would recommend the company to a friend and the percentage who approve of the CEO. 25 companies on the list have very few (fewer than 15) Glassdoor reviews or none at all, so they are excluded from the rankings; analysis of Glassdoor score patterns over the last four years shows that the fewer the reviews, the more likely a company is to post 100% scores for referrals and CEO approval. These companies are, however, included in the full data set available here. If the image below is not visible in your browser, you can view the rankings here.
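The filtering and ranking method can be sketched as follows. The review counts below are illustrative placeholders (only the CEO approval percentages for Dataiku, SAP, and Cloudera come from the rankings in this article), and "Acme Data" is a hypothetical company included to show the exclusion rule:

```python
# Sketch of the ranking method: drop companies with fewer than 15 Glassdoor
# reviews, then sort the remainder by CEO approval. Review counts are
# placeholders; "Acme Data" is a made-up example of an excluded company.

companies = [
    {"name": "Dataiku",   "reviews": 40,   "ceo_approval": 100},
    {"name": "SAP",       "reviews": 9000, "ceo_approval": 97},
    {"name": "Acme Data", "reviews": 8,    "ceo_approval": 100},  # under 15 reviews
    {"name": "Cloudera",  "reviews": 400,  "ceo_approval": 96},
]

ranked = sorted(
    (c for c in companies if c["reviews"] >= 15),  # exclude thin review counts
    key=lambda c: c["ceo_approval"],
    reverse=True,
)

for c in ranked:
    print(f'{c["name"]}: CEO approval {c["ceo_approval"]}%')
```

The exclusion rule matters because a company with a handful of reviews can trivially show a 100% score, which would otherwise crowd out companies with large, statistically meaningful review bases.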


The highest rated CEOs on Glassdoor as of May 11, 2018 include the following:

Dataiku – Florian Douetteau: 100%
StreamSets – Girish Pancha: 100%
MemSQL – Nikita Shamgunov: 100%
1010 Data – Greg Munves: 99%
Salesforce.com – Marc Benioff: 98%
Attivio – Stephen Baker: 98%
SAP – Bill McDermott: 97%
Qubole – Ashish Thusoo: 97%
Trifacta – Adam Wilson: 97%
Zaloni – Ben Sharma: 97%
Reltio – Manish Sood: 96%
Microsoft – Satya Nadella: 96%
Cloudera – Thomas J. Reilly: 96%
Sumo Logic – Ramin Sayar: 96%
Google – Sundar Pichai: 95%
Looker – Frank Bien: 93%
MongoDB – Dev Ittycheria: 92%
Snowflake Computing – Bob Muglia: 92%
Talend – Mike Tuchen: 92%
Databricks – Ali Ghodsi: 90%
Informatica – Anil Chakravarthy: 90%


Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

  • Hiring companies nationwide miss out on 50% or more of qualified candidates, and tech firms incorrectly classify up to 80% of candidates, due to inaccuracies and shortcomings of existing Applicant Tracking Systems (ATS), illustrating how faulty these systems are for hiring.
  • It takes on average 42 days to fill a position, and up to 60 days or longer to fill positions requiring in-demand technical skills, at an average cost of $5,000 per position filled.
  • Women applicants have a 19% chance of being eliminated from consideration for a job after a recruiter screen and 30% after an onsite interview, leading to a massive loss of brainpower and insight every company needs to grow.

It’s time the hiring process got smarter and more infused with contextual intelligence and insight, evaluating candidates on their mastery of needed skills rather than judging them on resumes that reflect only what they’ve achieved in the past. Enriching the hiring process with greater machine learning-based contextual intelligence finds the candidates who are exceptional and have the intellectual skills to contribute beyond hiring managers’ expectations. Machine learning algorithms can also remove any ethnic- and gender-specific identification of a candidate so they are evaluated purely on expertise, experience, merit, and skills.

The hiring process relied on globally today hasn’t changed in over 500 years. From Leonardo da Vinci’s handwritten resume of 1482, which highlighted his ability to build bridges and support warfare rather than the genius behind the Mona Lisa, the Last Supper, the Vitruvian Man, and a myriad of scientific discoveries and inventions that modernized the world, the approach job seekers take to pursuing new positions has stubbornly defied innovation. ATS apps and platforms classify inbound resumes and rank candidates based on just a small glimpse of their skills, when what’s needed is insight into which managerial, leadership and technical skills and strengths any given candidate is attaining mastery of, and at what pace. Machine learning broadens the scope of what hiring companies can see in candidates by moving beyond the barriers of their resumes. Better hiring decisions get made, and the Return on Investment (ROI) improves drastically when hiring decisions are strengthened with greater intelligence. Key metrics including time-to-hire, cost-to-hire, retention rates, and performance all improve when greater contextual intelligence is relied on.

Look Beyond Resumes To Win The War For Talent

Last week I had the opportunity to speak with the Vice President of Human Resources at one of the leading technology think tanks globally. He’s focused on the hundreds of technical professionals his organization needs in six months, 12 months and more than a year from now to staff exciting new research projects that will deliver valuable Intellectual Property (IP), including patents and new products.

Their approach begins with seeking to understand the profiles and core strengths of current high performers, then seeking out matches with ideal candidates in their community of applicants and the broader technology community. Machine learning algorithms are perfectly suited to completing the needed comparative analysis of high performers’ capabilities and those of candidates, whose entire digital persona is taken into account when comparisons are completed. The following graphic illustrates the eightfold.ai Talent Intelligence Platform (TIP), showing how integrated it is with publicly available data, internal data repositories, Human Resource Management (HRM) systems, and ATS tools. Please click on the graphic to expand it for easier reading.

The comparative analysis of high achievers’ characteristics with applicants takes seconds to complete, providing a list of prospects complete with profiles. Machine learning-derived profiles of potential hires matching the high performers’ characteristics provide greater contextual intelligence than any resume ever could. Taking an integrated approach to creating the Talent Intelligence Platform (TIP) yields insights not available with typical hiring or ATS solutions today. The profile below reflects the contextual intelligence and depth of insight possible when machine learning is applied to an integrated dataset of candidates. Please click on the graphic to expand it for easier reading. Key elements in the profile below include the following:

  • Career Growth Bell Curve – Illustrates how a given candidate’s career progression and performance compare to others’.

  • Social Following On Public Sites – Provides a real-time glimpse into the candidate’s activity on GitHub, OpenStack, and other sites where technical professionals share their expertise. This also provides insight into how others perceive their contributions.

  • Highlights Of Background Relevant To Job(s) Under Review – Provides the most relevant data from the candidate’s history in the profile so recruiters and managers can more easily understand their strengths.

  • Recent Publications – Publications provide insights into current and previous interests, areas of focus, mindset and learning progression over the last 10 to 15 years or longer.

  • Professional overlap that makes it easier to validate achievements chronicled in the resume – Multiple sources of real-time career data validate and provide greater context and insight into resume-listed accomplishments.

The key is understanding the context in which a candidate’s capabilities are being evaluated. And a 2-page resume will never give enough latitude to the candidate to cover all bases. For medium to large companies – doing this accurately and quickly is a daunting task if done manually – across all roles, all the geographies, all the candidates sourced, all the candidates applying online, university recruiting, re-skilling inside the company, internal mobility for existing employees, and across all recruitment channels. This is where machine learning can be an ally to the recruiter, hiring manager, and the candidate.
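One way such a comparative analysis could work, as a sketch: represent high performers and candidates as vectors of skill strengths and rank candidates by similarity to the high-performer profile. This is an illustrative approach with hypothetical skills and scores, not eightfold.ai's actual algorithm.

```python
import math

# Illustrative sketch: score candidates by cosine similarity between their
# skill vector and the averaged profile of current high performers.
# Skill dimensions and scores are hypothetical.

SKILLS = ["machine_learning", "distributed_systems", "patents", "publications"]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Averaged skill profile of current high performers (hypothetical).
high_performer_profile = [0.9, 0.8, 0.6, 0.7]

# Candidate skill vectors assembled from resumes plus public digital data.
candidates = {
    "candidate_a": [0.85, 0.75, 0.5, 0.8],   # broadly similar to high performers
    "candidate_b": [0.2, 0.9, 0.1, 0.1],     # strong in one dimension only
}

ranked = sorted(
    candidates,
    key=lambda c: cosine(candidates[c], high_performer_profile),
    reverse=True,
)
print(ranked)  # candidate_a ranks first
```

A production system would build these vectors from many more dimensions and data sources, but the core idea is the same: candidates are compared against an empirical profile of success rather than against keywords on a resume.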

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

Reducing the costs and time-to-hire, increasing the quality of hires, and staffing new initiatives with the highest quality talent possible all fuel solid revenue growth. Relying on resumes alone is like being on a bad Skype call where you only hear every tenth word in the conversation. Using machine learning-based approaches brings greater acuity, clarity, and visibility into hiring decisions.

The following are the five reasons why machine learning needs to make resumes obsolete:

  1. Resumes are like rearview mirrors that primarily reflect the past. What’s needed is more focus on where someone is going, why (what motivates them), and what they are fascinated with and learning about on their own. Instead of a rearview mirror, what’s needed is an intelligent heads-up display of what a candidate’s future will look like based on their present interests and talent.
  2. By relying on a 500+-year-old process, there’s no way of knowing what skills, technologies and training a candidate is gaining momentum in. The depth and extent of mastery in specific areas aren’t reflected in the structure of resumes. By integrating multiple sources of data into a unified view of a candidate, it’s possible to see what areas they are growing the quickest in from a professional development standpoint.
  3. It’s impossible to game a machine learning algorithm that takes into account all the digital data available on a candidate, while resumes have a credibility problem. Anyone who has hired subordinates and staff or been involved in hiring decisions has faced the disappointment of finding out a promising candidate lied on a resume. It’s a huge let-down. Resumes often get gamed, with one recruiter saying at least 60% of resumes contain exaggerations and, in some cases, outright lies. Taking all data into account using a platform like TIP shows the true candidate and their actual skills.
  4. It’s time to take a more data-driven approach to diversity that removes unconscious biases. Resumes today carry inherent biases. Recruiters, hiring managers and final interview panels of senior managers draw on their unconscious biases based on a person’s name, gender, age, appearance, the schools they attended, and more. It’s more effective to know candidates’ skills, strengths, and core areas of intelligence, all of which are better predictors of job performance.
  5. Reduces the risk of making a bad hire that will churn out of the organization fast. Ultimately everyone hires based in part on their best judgment and in part on their often unconscious biases. It’s human nature. With more data the probability of making a bad hire is reduced, reducing the risk of churning through a new hire and costing thousands of dollars to hire then replace them. Having greater contextual intelligence reduces the downside risks of hiring, removes biases by showing with solid data just how much a person is qualified or not for a role, and verifies their background strengths, skills, and achievements. Factors contributing to unconscious biases including gender, race, age or any other factors can be removed from profiles, so candidates are evaluated only on their potential to excel in the roles they are being considered for.

Bottom line: It’s time to revolutionize resumes and hiring processes, moving them into the 21st century by redefining them with greater contextual intelligence and insight enabled by machine learning.


How Zero Trust Security Fuels New Business Growth

Bottom Line: Zero Trust Security (ZTS) strategies enabled by Next-Gen Access (NGA) are indispensable for assuring uninterrupted digital business growth, and are proving to be a scalable security framework for streamlining onboarding and systems access for sales channels, partners, patients, and customers of fast-growing businesses.

The era of Zero Trust Security is here, accelerated by NGA solutions and driven by the needs of digital businesses for security strategies that can keep up with the rapidly expanding perimeters of their businesses. Internet of Things (IoT) networks and the sensors that comprise them are proliferating network endpoints and extending the perimeters of growing businesses quickly.

Inherent in the DNA of Next-Gen Access is the ability to verify the user, validate the device (including any sensor connected to an IoT network), limit access and privilege, then learn and adapt using machine learning techniques to streamline the user experience while granting access to approved accounts and resources. Many digital businesses today rely on IoT-based networks to connect with suppliers, channels, service providers and customers and gain valuable data they use to grow their businesses. Next-Gen Access solutions including those from Centrify are enabling Zero Trust Security strategies that scale to secure the perimeters of growing businesses without interrupting growth.

How Zero Trust Security Fuels New Business Growth  

The greater the complexity, scale and growth potential of any new digital business, the more critical NGA becomes for enabling ZTS to scale and protect its expanding perimeters. One of the most valuable ways NGA enables ZTS is using machine learning to learn and adapt to users’ system access behaviors continuously. Insights gained from NGA strengthen ZTS frameworks, enabling them to make the following contributions to new business growth:

  1. Zero Trust Security prevents data breaches that cripple new digital business models and ventures just beginning to scale and grow. Verifying, validating, learning and adapting to every user’s access attempts and then quantifying their behavior in a risk score is at the core of Next-Gen Access’ DNA. The risk scores quantify the relative levels of trust for each system user and determine what, if any, additional authentication is needed before access is granted to requested resources. Risk scores are continuously updated with every access attempt, making authentication less intrusive over time while greatly reducing compromised credential attacks.
  2. Securing the expanding endpoints and perimeters of a digital business using NGA frees IT and senior management up to focus more on growing the business. In any growing digital business, there’s an exponential increase in the number of endpoints being created, rapidly expanding the global perimeter of the business. The greater the number of endpoints and the broader the perimeter, the more revenue potential there is. Relying on Next-Gen Access to scale ZTS across all endpoints saves valuable IT time that can be dedicated to direct revenue-producing projects and initiatives. And by relying on NGA as the trust engine that enables ZTS, senior management will have far fewer security-related emergencies, interruptions, and special projects and can dedicate more time to growing the business. A ZTS framework also centralizes security management across a digital business, alleviating the costly, time-consuming task of continually installing patches and updates.
  3. Zero Trust Security is enabling digital businesses globally to meet and exceed General Data Protection Regulation (GDPR) compliance requirements while protecting and growing their most valuable asset: customer trust. Every week brings new announcements of security breaches at many of the world’s most well-known companies. Quick stats on users affected, potential dollar loss to the company and the all-too-common 800 numbers for credit bureaus seem to be in every press release. What’s missing is the incalculable, unquantifiable cost of lost customer value and the millions of hours customers waste trying to avert financial chaos. In response to the need for greater oversight of how organizations respond to breaches and manage data security, the European Union (EU) launched the General Data Protection Regulation (GDPR), which goes into effect May 25, 2018. GDPR applies not only to European organizations but also to foreign businesses that offer goods or services in the European Union (EU) or monitor the behavior of individuals in the EU. The compliance directive also states that organizations need to process data in a way that “ensures appropriate security of the personal data, using appropriate technical and organizational measures,” taking into account “state of the art and the costs of implementation.”

Using an NGA approach that includes risk-based multi-factor authentication (MFA) to evaluate every login, combined with a least-privilege approach across the entire organization, is a first step toward excelling at GDPR compliance. Zero Trust Security provides every organization needing to comply with GDPR a solid roadmap for meeting and exceeding the initiative’s requirements and growing customer trust as a result.

Conclusion

Next-Gen Access enables Zero Trust Security strategies to scale and flex as a growing business expands. In the fastest growing businesses, endpoints are proliferating as new customers are gained and suppliers are brought onboard. NGA ensures growth continues uninterrupted, helping to thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon.
