
Posts from the ‘Enterprise software’ Category

High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019

Bottom Line: High-tech manufacturers urgently need to solve the paradox of improving supply chain security while attaining greater visibility across supplier networks if they’re going to make the most of smart, connected products’ many growth opportunities in 2019.

The era of smart, connected products is revolutionizing every aspect of manufacturing today, from suppliers to distribution networks. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020. Manufacturers expect close to 50 percent of their products to be smart, connected products by 2020, according to Capgemini’s Digital Engineering: The new growth engine for discrete manufacturers. The study is downloadable here (PDF, 40 pp., no opt-in).

Smart, connected products free manufacturers and their supply chains from having to rely on transactions and the price wars they create. The smarter the product, the greater the services revenue opportunities. And the more connected a smart product is via IoT and Wi-Fi sensors, the more security has to be designed into every supplier evaluation, onboarding process, quality plan, and ongoing supplier audit. High-tech manufacturers are undertaking all of these strategies today, fueling them with real-time monitoring using barcoding, RFID and IoT sensors to improve visibility across their supply chains.

By gaining even greater visibility into their supply chains with cloud-based track-and-trace systems capable of reporting the condition of components in transit down to the lot and serialized pack level, high-tech suppliers are setting the gold standard for supply chain transparency and visibility. High-tech supply chains consistently outperform many other industries’ supplier networks on accuracy, speed, and scale, yet the industry is behind on securing its vast supplier network. Every supplier identity and endpoint is a new security perimeter, and taking a Zero Trust approach to securing them is the future of complex supply chains. With Zero Trust Privilege, high-tech manufacturers can secure privileged access to infrastructure, DevOps, cloud, containers, Big Data, production, logistics and shipping facilities, systems and teams.

High-Tech Needs to Confront Its Supply Chain Security Problem, Not Dismiss It

It’s ironic that high-tech supply chains are making rapid advances in accuracy and visibility yet still aren’t vetting suppliers thoroughly enough to stop counterfeiting, or worse. Bloomberg’s controversial recent article, The Big Hack: How China Used a Tiny Chip to Infiltrate U.S. Companies, explains how Amazon Web Services (AWS) was considering buying Portland, Oregon-based Elemental Technologies for its video streaming technology, known today as Amazon Prime Video. As part of the due diligence, AWS hired a third-party company to scrutinize Elemental’s security all the way up to the board level. The Elemental servers that handle the video compression were assembled by Super Micro Computer Inc., a San Jose-based company whose motherboards are manufactured by subcontractors in China. Nested on the servers’ motherboards, the testers found a tiny microchip, not much bigger than a grain of rice, that wasn’t part of the boards’ original design and could create a stealth doorway into any network the machines were attached to. Apple (also an important Super Micro customer) and AWS deny this ever happened, yet Bloomberg cites 17 people who confirmed that Super Micro had altered hardware, corroborating its findings.

The hard reality is that the scenario Bloomberg writes about could happen to any high-tech manufacturer today. When it comes to security and third-party vendor risk management, many high-tech supply chains are stuck in the 90s, while foreign governments, their militaries and the terrorist organizations they support are attempting to design in the ability to breach any network at will. How bad is it? 81% of senior executives involved in overseeing their companies’ global supply chains say third-party vendor management, including recruiting suppliers, is riskiest in China, India, Africa, Russia, and South America, according to a recent survey by Baker & McKenzie.

PricewaterhouseCoopers (PwC) and the MIT Forum for Supply Chain Innovation collaborated on a study of 209 companies’ supply chain operations and approaches to third-party vendor risk management. The study, PwC and the MIT Forum for Supply Chain Innovation: Making the right risk decisions to strengthen operations performance, quantifies the quick-changing nature of supply chains: 94% of respondents say changes to their extended supply chain network configuration happen frequently. Relying on trusted and untrusted domain controllers from server operating systems that are decades old can’t keep up with the mercurial pace of supply chains today.

Getting in Control of Security Risks in High-Tech Supply Chains

It’s time for high-tech supply chains to adopt a least privilege-based approach to verifying who or what is requesting access to any confidential data across their supply chains. Further, high-tech manufacturers need to extend access request verification to include the context of the request and the risk of the access environment. Today it’s rare to find a high-tech manufacturer taking least-privilege access to this level, yet it’s the most viable approach to securing the most critical parts of their supply chains.

By taking a least-privilege access approach, high-tech manufacturers and their suppliers can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and operating costs across their hybrid manufacturing ecosystem.
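The verification flow described above — who is asking, the context of the request, and the risk of the access environment — can be sketched in a few lines of Python. The resource names, roles, signal weights, and threshold below are illustrative assumptions, not taken from any specific product:

```python
# Illustrative least-privilege access check: grant the minimum role only
# after verifying identity, request context, and environment risk.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    mfa_passed: bool           # who: identity verified via MFA
    resource: str              # context: what is being requested
    from_managed_device: bool  # risk: environment of the request
    network: str               # risk: "corporate", "vpn", or "public"

# Hypothetical mapping of resources to the minimum role allowed to read them
LEAST_PRIVILEGE_ROLES = {
    "supplier_quality_audits": "quality_auditor",
    "board_schematics": "design_engineer",
}

def risk_score(req: AccessRequest) -> int:
    """Higher score = riskier access environment (weights are assumptions)."""
    score = 0
    if not req.mfa_passed:
        score += 50
    if not req.from_managed_device:
        score += 30
    if req.network == "public":
        score += 20
    return score

def grant(req: AccessRequest, user_roles: set) -> bool:
    """Grant only if the user holds the minimum role AND the risk is low."""
    needed = LEAST_PRIVILEGE_ROLES.get(req.resource)
    if needed is None or needed not in user_roles:
        return False              # deny by default, never trust implicitly
    return risk_score(req) < 50   # risky environments require step-up first
```

For example, a supplier auditor with MFA passed, on a managed device on the corporate network, would be granted access to `supplier_quality_audits` but denied `board_schematics` — the least privilege needed, nothing more.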

Key actions that high-tech manufacturers can take to secure their supply chain and ensure they don’t end up in an investigative story of hacked supply chains include the following:

  • Taking a Zero Trust approach to securing every endpoint provides high-tech manufacturers with the scale they need to grow. High-tech supply chains are mercurial and fast-moving by nature, guaranteeing they will quickly outgrow any legacy approach to enterprise security management. Vetting and then onboarding new suppliers needs to start by protecting every endpoint down to the production and sourcing level, especially for next-generation smart, connected products.
  • Smart, connected products and the product-as-a-service business models they create all depend on real-time, rich, secured data streams that can’t be eavesdropped on by components no one knows about. Taking a Zero Trust Privilege-based approach to securing access to diverse supply chains is needed if high-tech manufacturers are going to extend beyond legacy Privileged Access Management (PAM) to secure the data generated by real-time monitoring and data feeds from their smart, connected products, today and in the future.
  • Quality management, compliance, and quality audits are all areas high-tech manufacturers excel in today and provide a great foundation to scale to Zero Trust Privilege. High-tech manufacturers have the most advanced quality management, inbound inspection and supplier quality audit techniques in the world. It’s time for the industry to step up on the security side too. By only granting least-privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment, high-tech manufacturers can make rapid strides to improve supply chain security.
  • Rethink new product development cycles for smart, connected products and the sensors they rely on, so they’re protected as threat surfaces from the moment they’re built. Designing security into the new product development process, and extending security scrutiny to the schematic and board design level, is a must-do. In an era where we have to assume bad actors are everywhere, every producer of high-tech products needs to realize their designs, product plans, and roadmaps are at risk. Ensuring the IoT and Wi-Fi sensors in smart, connected products aren’t designed to be hackable starts with a Zero Trust approach to defining security for supplier, design, and development networks.

Conclusion

The era of smart, connected products is here, and supply chains are already reverberating with the increased emphasis on components that are easily integrated and have high-speed connectivity. Manufacturing CEOs say it’s exactly what their companies need to grow beyond transaction revenue and the price wars it creates. While high-tech manufacturers excel at accuracy, speed, and scale, they are falling short on security. It’s time for the industry to re-evaluate how Zero Trust can stabilize and secure every identity and threat surface across their supply chains with the same precision and intensity they bring to quality today.


Which CRM Applications Matter Most In 2018

 

According to recent research by Gartner,

  • Marketing analytics continues to be hot for marketing leaders, who now see it as a key business requirement and a source of competitive differentiation
  • Artificial intelligence (AI) and predictive technologies are of high interest across all four CRM functional areas, and mobile remains in the top 10 in marketing, sales and customer service.
  • It’s in customer service where AI is receiving the highest investments in real use cases rather than proofs of concept (POCs) and experimentation.
  • Sales and customer service are the functional areas where machine learning and deep neural network (DNN) technology is advancing rapidly.

These and many other fascinating insights are from Gartner’s What’s Hot in CRM Applications in 2018 by Ed Thompson, Adam Sarner, Tad Travis, Guneet Bharaj, Sandy Shen and Olive Huang, published on August 14, 2018. Gartner clients can access the study here (10 pp., PDF, client access required).

Gartner continually tracks and analyzes the areas their clients have the most interest in and relies on that data to complete their yearly analysis of CRM’s hottest areas. Inquiry topics initiated by clients are an excellent leading indicator of relative interest and potential demand for specific technology solutions. Gartner organizes CRM technologies into the four category areas of Marketing, Sales, Customer Service, and Digital Commerce.

The following graphic from the report illustrates the top CRM applications priorities in Marketing, Sales, Customer Service, and Digital Commerce.

Key insights from the study include the following:

  • Marketing analytics continues to be hot for marketing leaders, who now see it as a key business requirement and a source of competitive differentiation. In my opinion and based on discussions with CMOs, interest in marketing analytics is soaring as they are all looking to quantify their team’s contribution to lead generation, pipeline growth, and revenue. I see analytics- and data-driven clarity as the new normal. I believe that knowing how to quantify marketing contributions and performance requires CMOs and their teams to stay on top of the latest marketing, mobile marketing, and predictive customer analytics apps and technologies constantly. The metrics marketers choose today define who they will be tomorrow and in the future.
  • Artificial intelligence (AI) and predictive technologies are of high interest across all four CRM functional areas, and mobile remains in the top 10 in marketing, sales and customer service. It’s been my experience that AI and machine learning are revolutionizing selling by guiding sales cycles, optimizing pricing and enabling CPQ to define and deliver smart, connected products. I’m also seeing CMOs and their teams gain value from Salesforce Einstein and comparable intelligent agents that exemplify the future of AI-enabled selling. CMOs are saying that Einstein can scale across every phase of customer relationships. Based on my previous consulting in CPQ and pricing, it’s good to see the decades-old core technologies underlying Price Optimization and Management getting a much-needed refresh with state-of-the-art AI and machine learning algorithms, which is one of the factors driving their popularity today. Using Salesforce Einstein and comparable AI-powered apps, I see sales teams get real-time guidance on the most profitable products to sell, the optimal price to charge, and which deal terms have the highest probability of closing deals. Across manufacturers on a global scale, sales teams are now taking a strategic view of Configure, Price, Quote (CPQ) as encompassing integration to ERP, CRM, PLM, CAD and price optimization systems. I’ve seen global manufacturers take this strategic view of integration and grow far faster than competitors. In my opinion, CPQ is one of the core technologies forward-thinking manufacturers are relying on to launch their next generation of smart, connected products.
  • It’s in customer service where AI is receiving the highest investments in real use cases rather than proofs of concept (POCs) and experimentation. It’s fascinating to visit with CMOs and see the pilots and full production implementations of AI being used to streamline customer service. One CMO remarked how effective AI is at providing greater contextual intelligence and suggesting recommendations to customers based on their previous buying and service histories. It’s interesting to watch how CMOs are attempting to integrate AI and its associated technologies, including chatbots, into their contribution to Net Promoter Scores (NPS). Every senior management team running a marketing organization today has strong opinions on NPS. They all agree that greater insights gained from predictive analytics and AI will help to clarify the true value of NPS as it relates to Customer Lifetime Value (CLV) and other key metrics of customer profitability.
  • Sales and customer service are the functional areas where machine learning and deep neural network (DNN) technology is advancing rapidly. It’s my observation that machine learning’s potential to revolutionize sales is still nascent, with many high-growth use cases completely unexplored. In speaking with the Vice President of Sales for a medical products manufacturer recently, she said her biggest challenge is hiring sales representatives who will stay longer than a 19-month tenure with the company, which is their average today. Imagine, she said, knowing the ideal attributes and strengths of her top performers and using machine learning and AI to find the best possible new sales hires. She and I discussed the spectrum of companies taking on this challenge, with Eightfold being one of the leaders in applying AI and machine learning to talent management challenges.

Source: Gartner, by Ed Thompson, Adam Sarner, Tad Travis, Guneet Bharaj, Sandy Shen and Olive Huang, published on August 14, 2018.

How To Protect Healthcare IoT Devices In A Zero Trust World

  • Over 100M healthcare IoT devices are installed worldwide today, growing to 161M by 2020, attaining a Compound Annual Growth Rate (CAGR) of 17.2% in just three years according to Statista.
  • Healthcare executives say privacy concerns (59%), legacy system integration (55%) and security concerns (54%) are the top three barriers holding back Internet of Things (IoT) adoption in healthcare organizations today according to the Accenture 2017 Internet of Health Things Survey.
  • The global IoT market is projected to soar from $249B in 2018 to $457B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 22.4% in just three years according to Statista.
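Both growth figures above are straightforward compound annual growth rate calculations, which a short sketch verifies using the device counts and market sizes quoted in the bullets:

```python
# CAGR = (ending_value / starting_value) ** (1 / years) - 1
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

# Healthcare IoT devices: 100M installed today -> 161M by 2020 (three years)
print(round(cagr(100, 161, 3) * 100, 1))   # ~17.2

# Global IoT market: $249B -> $457B, quoted as a three-year CAGR
print(round(cagr(249, 457, 3) * 100, 1))   # ~22.4
```

The same formula applied to any of the forecasts in this roundup makes it easy to sanity-check whether a quoted CAGR matches its endpoint values.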

Healthcare and medical device manufacturers are in a race to see who can create the smartest and most-connected IoT devices first. Capitalizing on the rich real-time data monitoring streams these devices can provide, many see the opportunity to break free of product sales and move into more lucrative digital service business models. According to Capgemini’s “Digital Engineering, The new growth engine for discrete manufacturers,” the global market for smart, connected products is projected to be worth $519B to $685B by 2020. The study can be downloaded here (PDF, 40 pp., no opt-in). By 2020, 47% of a typical manufacturer’s product portfolio will comprise smart, connected products. In the gold rush to new digital services, data security needs to be a primary design goal that protects the patients these machines are designed to serve. The following graphic from the study shows how organizations producing smart, connected products are making use of the data generated today.

Healthcare IoT Device Data Doesn’t Belong For Sale On The Dark Web

Every healthcare IoT device, from insulin pumps and diagnostic equipment to Remote Patient Monitoring, is a potential attack surface for cyber adversaries to exploit. And the healthcare industry is notorious for having the majority of system breaches initiated by insiders: 58% of healthcare system breach attempts involve inside actors, which makes this the leading industry for insider threats today, according to Verizon’s 2018 Protected Health Information Data Breach Report (PHIDBR).

Many employees working for medical providers are paid modest salaries and often have to regularly work hours of overtime to make ends meet. Stealing and selling medical records is one of the ways those facing financial challenges look to make side money quickly and discreetly. And with a market on the Dark Web willing to pay up to $1,000 or more for the most detailed healthcare data, according to Experian, medical employees have an always-on, 24/7 marketplace to sell stolen data. 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. Healthcare IoT devices are a potential treasure trove to inside and outside actors who are after financial gains by hacking the IoT connections to smart, connected devices and the networks they are installed on to exfiltrate valuable medical data.

Healthcare and medical device manufacturers need to start taking action now to secure these devices during the research and development, design and engineering phases of their next generation of IoT products. Specifying and validating that every IoT access point is compatible and can scale to support Zero Trust Security (ZTS) is essential if the network of devices being designed and sold will be secure. ZTS is proving to be very effective at thwarting potential breach attempts across every threat surface an organization has. Its four core pillars include verifying the identity of every user, validating every device, limiting access and privilege, and utilizing machine learning to analyze user behavior and gain greater insights from analytics.
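Of the four pillars above, device validation is the one manufacturers can bake in earliest in the design cycle. A minimal sketch, assuming a hypothetical enrollment registry of device serials and known-good firmware hashes (the names and registry format are illustrative, not from any ZTS product):

```python
import hashlib

# Hypothetical registry of enrolled devices: serial -> expected firmware hash
ENROLLED_DEVICES = {
    "pump-00417": hashlib.sha256(b"firmware-v2.3.1").hexdigest(),
}

def validate_device(serial: str, firmware_image: bytes) -> bool:
    """Zero Trust pillar: never trust a device by network location alone.
    The device must be enrolled AND running untampered firmware."""
    expected = ENROLLED_DEVICES.get(serial)
    if expected is None:
        return False  # unknown device: deny by default
    return hashlib.sha256(firmware_image).hexdigest() == expected
```

An insulin pump presenting an unknown serial, or a known serial with a modified firmware image, is denied before it ever reaches patient data — exactly the design-phase check the paragraph above argues for.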

The First Step Is Protecting Development Environments With Zero Trust Privilege

Product research & development, design, and engineering systems are all attack surfaces that cyber adversaries are looking to exploit as part of the modern threatscape. Their goals include gaining access to valuable Intellectual Property (IP), patents and designs that can be sold to competitors and on the Dark Web, or damaging and destroying development data to slow down the development of new products. Another tactic lies in planting malware in the firmware of IoT devices to exfiltrate data at scale.

The attack surfaces and identities that comprise a company’s new security perimeter aren’t just people; they are workloads, services, machines, and development systems and platforms. What’s needed is to protect every attack surface with cloud-ready Zero Trust Privilege (ZTP), which secures access to infrastructure, DevOps, cloud, containers, Big Data, and the entire development and production environment.

Zero Trust Privilege can harden healthcare and medical device manufacturers’ internal security, granting least privilege access only after verifying who is requesting access, the context of the request, and the risk of the access environment. By implementing least privilege access, healthcare and medical device manufacturers can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and costs across their development and production operations.

The Best Security Test Of All: An FDA Audit

Regulatory agencies across Asia, Europe, and North America are placing a higher priority than ever before on cybersecurity to the device level. The U.S. Food & Drug Administration’s Cybersecurity Initiative is one of the most comprehensive, providing prescriptive guidance to manufacturers on how to attain higher levels of cybersecurity in their products.

During a recent healthcare and medical device manufacturers’ conference, a former FDA auditor (now a Vice President of Compliance) gave a fascinating keynote on the FDA’s intent to audit medical device security at the production level. Security had been an afterthought, or at best a “trust but verify” approach that relied on trusted versus untrusted machine domains. That will no longer be the case, as the FDA will now complete audits comparable to Zero Trust across manufacturing operations and devices.

As Zero Trust Privilege enables greater auditability than has been possible in the past, combined with a “never trust, always verify” approach to system access, healthcare device and medical product manufacturers should start engineering Zero Trust into their development cycles now.

2018 Roundup Of Cloud Computing Forecasts And Market Estimates

Cloud computing platforms and applications are proliferating across enterprises today, serving as the IT infrastructure driving new digital businesses. The following roundup of cloud computing forecasts and market estimates reflect a maturing global market for cloud services, with proven scale, speed and security to support new business models.

CIOs who create compelling business cases that rely on cloud platforms as a growth catalyst are the architects enabling these new business initiatives to succeed. The era of the CIO strategist has arrived. Key takeaways include the following:

  • Amazon Web Services (AWS) accounted for 55% of the company’s operating profit in Q2, 2018, despite contributing only 12% to the company’s net sales. In Q1, 2018 services accounted for 40% of Amazon’s revenue, up from 26% three years earlier. Source: Cloud Business Drives Amazon’s Profits, Statista, July 27, 2018.

  • 80% of enterprises are running apps on or experimenting with Amazon Web Services (AWS) as their preferred cloud platform. 67% of enterprises are running apps on (45%) or experimenting with (22%) the Microsoft Azure platform. 18% of enterprises are using Google’s Cloud Platform for applications today, with 23% evaluating the platform for future use. RightScale’s 2018 survey was included in the original data set Statista used to create the comparison. Source: Statista, Current and planned usage of public cloud platform services running applications worldwide in 2018.

  • Enterprise adoption of Microsoft Azure increased significantly from 43% to 58%, a 35% jump year over year, while AWS adoption increased from 59% to 68%. Enterprise respondents with future projects (the combination of experimenting and planning to use) show the most interest in Google (41%). Source: RightScale 2018 State of the Cloud Report.

  • Wikibon projects the True Private Cloud (TPC) worldwide market will experience a compound annual growth rate of 29.2%, reaching $262.4B by 2027. The firm predicts TPC growth will far outpace the infrastructure-as-a-service (IaaS) growth of 15.2% over the same period. A true private cloud is distinguished from a private cloud by the completeness of the integration of all aspects of the offering, including performance characteristics such as price, agility, and service breadth. Please see the source link for additional details on TPC. Source: Wikibon’s 2018 True Private Cloud Forecast and Market Shares.

  • Quality Control, Computer-Aided Engineering, and Manufacturing Execution Systems (MES) are the three systems most widely adopted in the cloud by discrete and process manufacturers. The survey also found that 60% of discrete and process manufacturers say their end users prefer the cloud over on-premise. Source: Amazon Web Services & IDC: Industrial Customers Are Ready For The Cloud – Now (PDF, 13 pp., no opt-in, sponsored by AWS).

  • The Worldwide Public Cloud Services Market is projected to grow by 17.3% in 2019 to total $206.2B, up from $175.8B in 2018, according to Gartner. In 2018, the market will grow a healthy 21%, up from $145.3B in 2017, according to the research and advisory firm. Infrastructure-as-a-Service (IaaS) will be the fastest-growing segment of the market, forecast to grow by 27.6% in 2019 to reach $39.5B, up from $31B in 2018. By 2022, Gartner expects that 90% of enterprises purchasing public cloud IaaS will do so from a provider offering integrated IaaS and Platform-as-a-Service (PaaS), and will use both the IaaS and PaaS capabilities from that provider. Source: Gartner Forecasts Worldwide Public Cloud Revenue to Grow 17.3 Percent in 2019.

  • More than $1.3T in IT spending will be directly or indirectly affected by the shift to cloud by 2022. 28% of spending within key enterprise IT markets will shift to the cloud by 2022, up from 19% in 2018. The largest cloud shift before 2018 occurred in application software, particularly driven by customer relationship management (CRM) software, with Salesforce dominating as the market leader. CRM has already reached a tipping point where a higher proportion of spending occurs in the cloud than in traditional software. Source: Gartner Says 28 Percent of Spending in Key IT Segments Will Shift to the Cloud by 2022.

  • IDC predicts worldwide Public Cloud Services Spending will reach $180B in 2018, an increase of 23.7% over 2017. According to IDC, the market is expected to achieve a five-year compound annual growth rate (CAGR) of 21.9% with public cloud services spending totaling $277B in 2021. The industries that are forecast to spend the most on public cloud services in 2018 are discrete manufacturing ($19.7B), professional services ($18.1B), and banking ($16.7B). The process manufacturing and retail industries are also expected to spend more than $10B each on public cloud services in 2018. These five industries will remain at the top in 2021 due to their continued investment in public cloud solutions. The industries that will see the fastest spending growth over the five-year forecast period are professional services (24.4% CAGR), telecom (23.3% CAGR), and banking (23.0% CAGR). Source: Worldwide Public Cloud Services Spending Forecast to Reach $160 Billion This Year, According to IDC.


58% Of All Healthcare Breaches Are Initiated By Insiders

  • 58% of healthcare systems breach attempts involve inside actors, which makes this the leading industry for insider threats today.
  • Ransomware leads all malicious code categories, responsible for 70% of breach attempt incidents.
  • Stealing laptops from medical professionals’ cars to obtain privileged access credentials, then using them to install malware on healthcare networks, exfiltrate valuable data, or sabotage systems and applications, are all common breach strategies.

These and many other fascinating insights are from Verizon’s 2018 Protected Health Information Data Breach Report (PHIDBR). A copy of the study is available for download here (PDF, 20 pp., no opt-in).  The study is based on 1,368 incidents across 27 countries. Healthcare medical records were the focus of breaches, and the data victims were patients and their medical histories, treatment plans, and identities. The data comprising the report is a subset of Verizon’s Annual Data Breach Investigations Report (DBIR) and spans 2016 and 2017.

Why Healthcare Needs Zero Trust Security To Grow

One of the most compelling insights from the Verizon PHIDBR study is how quickly healthcare is becoming a digitally driven business with strong growth potential. What’s holding its growth back, however, is how porous healthcare digital security is: in 66% of misuse incidents, internal and external actors abuse privileged access credentials to access databases and exfiltrate proprietary information, and 58% of breach attempts involve internal actors.

Solving the security challenges healthcare providers face is going to fuel faster growth. Digitally-enabled healthcare providers and fast-growing digital businesses in other industries are standardizing on Zero Trust Security (ZTS), which aims to protect every internal and external endpoint and attack surface. ZTS is based on four pillars, which include verifying the identity of every user, validating every device, limiting access and privilege, and learning and adapting using machine learning to analyze user behavior and gain greater insights from analytics.

Identities Need to Be Every Healthcare Provider’s New Security Perimeter

ZTS starts by defining a digital business’ security perimeter as every employee’s and patient’s identity, regardless of location. Every login attempt, resource request, device operating system, and many other variables are analyzed in real time using machine learning algorithms to produce a risk score, which is used to empower Next-Gen Access (NGA).

The higher the risk score, the more authentication is required before access is granted. Multi-Factor Authentication (MFA) is required first, and if a login attempt doesn’t pass, additional screening is required, up to and including shutting off an account’s access.
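The escalation just described can be sketched as a simple policy table. The behavioral signals, weights, and thresholds below are illustrative assumptions, not values from any NGA product:

```python
# Illustrative step-up authentication driven by a login risk score.
def login_risk(new_device: bool, unusual_location: bool, odd_hours: bool) -> int:
    """Combine behavioral signals into a score (weights are assumptions)."""
    return 40 * new_device + 35 * unusual_location + 15 * odd_hours

def required_action(score: int) -> str:
    """Map the risk score to an authentication decision."""
    if score < 40:
        return "allow"            # low risk: primary credentials suffice
    if score < 75:
        return "require_mfa"      # elevated risk: step up to MFA
    return "block_account"        # high risk: shut off access, alert security

print(required_action(login_risk(False, False, False)))  # allow
print(required_action(login_risk(True, False, True)))    # require_mfa
print(required_action(login_risk(True, True, True)))     # block_account
```

In a real deployment the score would come from a machine learning model trained on user behavior, but the gating logic — more risk, more verification — follows the same shape.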

NGA is proving to be an effective strategy for preventing stolen and sold healthcare providers’ privileged access credentials from being used to gain access to networks and systems, combining Identity-as-a-Service (IDaaS), Enterprise Mobility Management (EMM) and Privileged Access Management (PAM). Centrify is one of the leaders in this field, with expertise in the healthcare industry.

NGA can also ensure healthcare providers’ privileged access credentials don’t make the best-seller list on the Dark Web. Another recent study from Accenture, “Losing the Cyber Culture War in Healthcare: Accenture 2018 Healthcare Workforce Survey on Cybersecurity,” found that 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and that 24% of employees know of someone who has sold privileged credentials to outsiders. By verifying every login attempt from any location, NGA can thwart the many privileged access credentials for sale on the Dark Web.

The following are the key takeaways from Verizon’s 2018 Protected Health Information Data Breach Report (PHIDBR):

  • 58% of healthcare security breach attempts involve inside actors, which makes it the leading industry for insider threats today. External actors are attempting 42% of healthcare breaches. Inside actors rely on their privileged access credentials or steal them from fellow employees to launch breaches the majority of the time. By utilizing NGA, healthcare providers can get this epidemic of internal security breaches under control by forcing verification for every access request, anywhere, on a 24/7 basis.

  • Most healthcare breaches are motivated by financial gain, with healthcare workers most often using patient data to commit tax return and credit fraud. Verizon found 876 breach incidents initiated by healthcare insiders in 2017, leading all categories; external actors initiated 523 breach incidents, and partners initiated 109. Across internal, external and partner actors, 496 breach incidents were motivated by financial gain. Internal actors are also known for attempting breaches for fun and out of curiosity, driven by interest in the celebrities' health histories accessible from the systems they use daily. When internal actors collaborate with external actors and partners for financial gain by accessing confidential patient health records, it's time for healthcare providers to take a more aggressive stance on securing patient records with a Zero Trust approach.

  • Abusing privileged access credentials (66%) and abusing credentials and physical access points (17%) together comprise 82.9% of all misuse-based breach attempts and incidents. Verizon's study accentuates that misuse of credentials and the breaching of poorly secured physical access points is intentional, deliberate and driven by financial gain the majority of the time. Internal, external and partner actors acting alone or in collaboration know the easiest attack surface to exploit is access credentials, with database access the goal half of the time. When there's little to no protection on web application and payment card access points to a network, breaches happen. Shutting down privilege abuse starts with a solid ZTS strategy based on NGA, where every login attempt is verified before access is granted and anomalies trigger MFA and further user validation. Please click on the graphic to expand it for easier reading.

  • 70.2% of all hacking attempts are based on stolen privileged access credentials (49.3%) combined with brute force to obtain credentials from POS terminals and controllers (20.9%). Hackers devise ingenious ways of stealing privileged access credentials, even resorting to hacking a POS terminal or controllers to get them. Healthcare insiders also steal credentials to gain access to mainframes, servers, databases and internal systems. Verizon’s findings below are supported by Accenture’s research showing that 18% of healthcare employees are willing to sell privileged access credentials and confidential data to unauthorized parties for as little as $500 to $1,000. Please click on the graphic to expand it for easier reading.

  • Hospitals are most often targeted for breaches using privileged access credentials, followed by ambulatory health care services, which is seen as the most penetrable business via hacking and brute-force credential acquisition. Verizon compared breach incidents by North American Industry Classification System (NAICS) code and found privileged credential misuse is flourishing in hospitals, where inside and outside actors seek to access databases and web applications. Internal, external and partner actors concentrate on hospitals due to the massive scale of sensitive data they can attain with stolen privileged access credentials, which they can quickly sell or profit from through fraudulent means. Verizon also notes a favorite hacking strategy is to use USB drives to exfiltrate proprietary information and sell it to health professionals intent on launching competing clinics and practices. Please click on the graphic to expand it for easier reading.

Conclusion

With the same intensity they invest in returning patients to health, healthcare providers need to strengthen their digital security, and Zero Trust Security is the best place to start. ZTS begins with Next-Gen Access by trusting no device, login attempt, or privileged access credential across every protected attack surface. Every device's login attempt, resource request, and access credentials are verified through NGA, thwarting the rampant misuse and hacking based on compromised privileged access credentials. The bottom line is that it's time for healthcare providers to get in better security shape by adopting a Zero Trust approach.

Global State Of Enterprise Analytics, 2018

  • 71% of enterprises globally predict their investments in data and analytics will accelerate in the next three years and beyond.
  • 57% of enterprises globally have a Chief Data Officer, a leadership role that is pivotal in helping to democratize data and analytics across any organization.
  • 52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations.
  • 41% of all enterprises are considering a move to cloud-based analytics in the next year.
  • Cloud Computing (24%), Big Data (20%), and AI/Machine Learning (18%) are the three technologies predicted to have the greatest impact on analytics over the next five years.
  • Just 16% of enterprises have enabled at least 75% of their employees to have access to company data and analytics.

These and many other fascinating insights are from MicroStrategy's latest research study, the 2018 Global State of Enterprise Analytics Report. You can download a copy here (PDF, 44 pp., opt-in). The study is based on surveys completed in April 2018 with 500 enterprise analytics and business intelligence professionals on the state of their organizations' analytics initiatives across 20 industries. Participants represented organizations with 250 to 20,000 employees in five nations: Brazil, Germany, Japan, the United Kingdom and the United States. For additional details, please see the methodology in the study. The study's results underscore how enterprises need a unified data strategy that reflects their growth strategies and the information needs of new business models.

Key takeaways from the study include the following:

  • Driving greater process and cost efficiencies (60%), strategy and change (57%) and monitoring and improving financial performance (52%) are the top three ways enterprises globally are using data and analytics today. The study found that enterprises are also relying on data and analytics to gain greater insight into how current products and services are used (51%), manage risk (50%) and attain customer growth and retention (49%). Across the five nations surveyed, Japan leads the world in the use of data and analytics to drive process and cost efficiencies (65%). UK-based enterprises lead all nations in their use of data and analytics to analyze how current products and services are being used. The report provides graphical comparisons of the five nations' results.

  • Cloud Computing, Big Data, and AI/Machine Learning are the three technologies predicted to have the greatest global impact on analytics over the next five years. Japanese enterprises predict cloud computing will have the greatest impact on the future of analytics (28%) across the five nations’ enterprises interviewed. AI/Machine Learning is predicted to have the greatest impact on analytics in the U.K. (26%) globally as is Big Data in Germany (29%). Please see the study for country-specific prioritization of technologies.

  • 52% of enterprises are leveraging advanced and predictive analytics today to provide greater insights and contextual intelligence into operations. Additional leverage areas include distribution of analytics via e-mail and collaboration tools (49%), analytics embedded in other apps including Salesforce (44%) and mobile productivity apps (39%). Japanese enterprises lead the world in their adoption of advanced and predictive analytics (60%). German enterprises lead the world in the adoption of analytics for collaboration via e-mail and more real-time data and knowledge-sharing methods (50%).

  • 59% of enterprises are using Big Data analytics, leading all categories of intelligence applications. Enterprise reporting (47%), data discovery (47%), mobile productivity apps (44%) and embedded apps (42%) round out the top five intelligence applications in use globally by enterprises today. Big Data's dominance in the survey results can be attributed to the top five industries in the sampling frame being among the most prolific in data generation and use. Manufacturing (15%) is the most data-prolific industry on the planet. Additional data-intensive industries dominate the survey's demographics, including software technology-based businesses (14%), banking (13%), retail (11%), and financial services/business services (6%).

  • 27% of global enterprises prioritize security over any other factor when evaluating a new analytics vendor. The three core attributes of a scalable, comprehensive platform, ease of use, and a vendor's products having an excellent reputation are all essential as well. Enterprises in four of the five nations prioritize security as the most critical factor when evaluating potential analytics vendors. Enterprise scalability matters most in the U.S., with 26% of enterprises interviewed saying it is the most important priority in evaluating a new analytics vendor.

  • Data privacy and security concerns (49%) are the most formidable barrier enterprises face in gaining more effective use of their data and analytics. Enterprises in four of the five nations say data privacy and security are the most significant barrier they face in getting more value from analytics. In Japan, the greatest barrier is limited access to data across the organization (40%).

  • Globally, 41% of all enterprises are considering a move to the cloud in the next year. 64% of U.S.-based enterprises are considering moving to a cloud-based analytics platform or solution in the next year, leading enterprises from all five nations in planned cloud-based analytics adoption, as the graphic below illustrates.

Zero Trust Security Update From The SecurIT Zero Trust Summit

  • Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords.
  • 53% of enterprises feel they are more susceptible to threats since 2015.
  • 51% of enterprises suffered at least one breach in the past 12 months and malicious insider incidents increased 11% year-over-year.

These and many other fascinating insights are from SecurIT: the Zero Trust Summit for CIOs and CISOs held last month in San Francisco, CA. CIO and CSO produced the event that included informative discussions and panels on how enterprises are adopting Next-Gen Access (NGA) and enabling Zero Trust Security (ZTS). What made the event noteworthy were the insights gained from presentations and panels where senior IT executives from Akamai, Centrify, Cisco, Cylance, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone shared their key insights and lessons learned from implementing Zero Trust Security.

Zero Trust’s creator is John Kindervag, a former Forrester Analyst, and Field CTO at Palo Alto Networks.  Zero Trust Security is predicated on the concept that an organization doesn’t trust anything inside or outside its boundaries and instead verifies anything and everything before granting access. Please see Dr. Chase Cunningham’s excellent recent blog post, What ZTX means for vendors and users, for an overview of the current state of ZTS. Dr. Chase Cunningham is a Principal Analyst at Forrester.

Key takeaways from the Zero Trust Summit include the following:

  • Identities, not systems, are the new security perimeter for any digital business, with 81% of breaches involving weak, default or stolen passwords. Tom Kemp, Co-Founder and CEO of Centrify, provided key insights into the current state of enterprise IT security and how existing methods aren't scaling well enough to protect every application, endpoint, and infrastructure of a digital business. He illustrated how $86B was spent on cybersecurity, yet a stunning 66% of companies were still breached, and companies targeted for breaches had already averaged five or more separate breaches. The following graphic underscores how identities are the new enterprise perimeter, making NGA and ZTS must-haves for any digital business.

  • 53% of enterprises feel they are more susceptible to threats since 2015. Dr. Chase Cunningham's presentation, Zero Trust and Why Does It Matter, provided insights into the threat landscape and a thorough definition of ZTX, the application of a Zero Trust framework to an enterprise. Forrester found the percentage of enterprises who feel they are more susceptible to threats nearly doubled in two years, jumping from 28% in 2015 to 53% in 2017. Dr. Cunningham provided examples of how breaches have immediate financial implications for the market value of any business, with a specific focus on the Equifax breach.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs

  • 51% of enterprises suffered at least one breach in the past 12 months, and malicious insider incidents increased 11% year-over-year. 43% of confirmed breaches in the last 12 months came from an external attack, 24% from internal attacks, 17% from third-party incidents and 16% from lost or stolen assets. Consistent with Verizon's 2018 Data Breach Investigations Report, misuse of privileged access credentials is a leading cause of breaches today.

Presented by Dr. Cunningham during SecurIT: the Zero Trust Summit for CIOs and CISOs

  • One of Zero Trust Security's innate strengths is the ability to flex and protect the perimeter of any growing digital business at the individual level, encompassing workforce, customers, and distributors. Akamai, Cisco, EdgeWise, Fortinet, Intel, Live Nation Entertainment and YapStone each provided examples of how their organizations are relying on NGA to enable ZTS enterprise-wide. Every speaker provided examples of how ZTS delivers several key benefits. First, ZTS reduces the time to breach detection and improves visibility throughout a network. Second, organizations provided examples of how ZTS is reducing capital and operational expenses for security, in addition to reducing the scope and cost of compliance initiatives. All companies presenting at the conference provided examples of how ZTS is enabling greater data awareness and insight, eliminating inter-silo finger-pointing over security responsibilities and, for several, enabling digital business transformation. Every organization is also seeing ZTS thwart the exfiltration and destruction of their data.

Conclusion

The SecurIT Zero Trust Summit for CIOs and CISOs encapsulated the latest advances in how NGA is enabling ZTS by having enterprises that are adopting the framework share their insights and lessons learned. It's fascinating to see how Akamai, Cisco, Intel, Live Nation Entertainment, YapStone, and others are tailoring ZTS to their specific customer-driven goals. Each also shared their plans for growth and how security in general, and NGA and ZTS specifically, are protecting customer and company data to ensure growth continues uninterrupted.

Where Business Intelligence Is Delivering Value In 2018

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018.
  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018.
  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018.
  • Organizations successful with analytics and BI apps define success in business results, while unsuccessful organizations concentrate on adoption rate first.
  • 50% of vendors offer perpetual on-premises licensing in 2018, a notable decline over 2017. The number of vendors offering subscription licensing continues to grow for both on-premises and public cloud models.
  • Fewer than 15% of respondent organizations have a Chief Data Officer, and only about 10% have a Chief Analytics Officer today.

These and many other fascinating insights are from Dresner Advisory Services' 2018 Wisdom of Crowds® Business Intelligence Market Study. In its ninth annual edition, the study provides a broad assessment of the business intelligence (BI) market and a comprehensive look at key user trends, attitudes, and intentions. The latest edition adds Information Technology (IT) analytics, sales planning, and GDPR, bringing the total to 36 topics under study.

“The Wisdom of Crowds BI Market Study is the cornerstone of our annual research agenda, providing the most in-depth and data-rich portrait of the state of the BI market,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Drawn from the first-person perspective of users throughout all industries, geographies, and organization sizes, who are involved in varying aspects of BI projects, our report provides a unique look at the drivers of and success with BI.” Survey respondents include IT (28%), followed by Executive Management (22%), Finance (19%), Sales/Marketing (8%) and the Business Intelligence Competency Center (BICC) (7%). Please see page 15 of the study for specifics on the methodology.

Key takeaways from the study include the following:

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018. Executive management teams are taking more of an active ownership role in BI initiatives in 2018, as this group replaced Operations as the leading department driving BI adoption this year. The study found that the greatest percentage change in functional areas driving BI adoption includes Human Resources (7.3%), Marketing (5.9%), BICC (5.1%) and Sales (5%).

  • Making better decisions, improving operational efficiencies, growing revenues and increasing competitive advantage are the top four BI objectives organizations have today. Additional goals include enhancing customer service and attaining greater degrees of compliance and risk management. The graph below rank-orders the importance of BI objectives in 2018 compared to the percent change in BI objectives between 2017 and 2018. Enhanced customer service is the fastest-growing objective driving BI adoption, followed by revenue growth (5.4%).

  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018. The study found that second-tier initiatives including data discovery, data mining/advanced algorithms, data storytelling, integration with operational processes, and enterprise and sales planning are also critical or very important to enterprises participating in the survey. Technology areas being hyped heavily today, including the Internet of Things, cognitive BI, and in-memory analysis, rank relatively low as of today yet are growing; edge computing, for example, increased 32% as a priority between 2017 and 2018. The results indicate that the core aspects of using BI to drive better business decisions and more revenue still dominate the priorities of most businesses today.
  • Sales & Marketing, the Business Intelligence Competency Center (BICC) and Executive Management have the highest levels of interest in dashboards and advanced visualization. Finance has the greatest interest in enterprise planning and budgeting. Operations (including manufacturing, supply chain management, and services) leads interest in data mining, data storytelling, integration with operational processes, mobile device support, data catalogs and several other technologies and initiatives. It's understandable that BICC leaders most advocate end-user self-service and attach high importance to many other categories, as they are internal service bureaus to all departments in an enterprise. It's been my experience that BICCs are always looking for ways to scale BI adoption and enable every department to gain greater value from analytics and BI apps. BICCs in the best-run companies are knowledge hubs that encourage and educate all departments on how to excel with analytics and BI.

  • Insurance companies most prioritize dashboards, reporting, end-user self-service, data warehousing, data discovery and data mining. Business Services lead the adoption of advanced visualization, data storytelling, and embedded BI. Manufacturing most prioritizes sales planning and enterprise planning but trails in other high-ranking priorities. Technology prioritizes Software-as-a-Service (SaaS) given its scale and speed advantages. The retail & wholesale industry is going through an analytics and customer experience revolution today. Retailers and wholesalers lead all others in data catalog adoption and mobile device support.

  • Insurance, Technology and Business Services vertical industries have the highest rate of BI adoption today. The Insurance industry leads all others in BI adoption, followed by the Technology industry with 40% of organizations having 41% or greater adoption or penetration. Industries whose BI adoption is above average include Business Services and Retail & Wholesale. The following graphic illustrates penetration or adoption of Business Intelligence solutions today by industry.

  • Dashboards, reporting, advanced visualization, and data warehousing are the highest-priority investment areas for companies whose budgets increased from 2017 to 2018. The study found that less well-funded organizations are the most likely to invest in open source software to reduce costs.

  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018. Factors contributing to the high adoption rate of BI in small businesses include business models that need advanced analytics to function and scale, the hiring of employees with the latest analytics and BI skills to scale high-growth businesses, and fewer barriers to adoption compared to larger enterprises. BI adoption tends to be more pervasive in small businesses, as a greater percentage of employees use analytics and BI apps daily.

  • Executive Management is most familiar with the type and number of BI tools in use across the organization. The majority of executive management respondents say their teams are using one or two BI tools today. Business Intelligence Competency Centers (BICC) consistently report a higher number of BI tools in use than other functional areas, given their heavy involvement in all phases of analytics and BI project execution. IT, Sales & Marketing and Finance are likely to have more BI tools in use than Operations.

  • Enterprises rate BI application usability and product quality & reliability at an all-time high in 2018. Other areas of major improvement on the part of vendors include ease of implementation, online training, forums and documentation, and completeness of functionality. Dresner's research team found that ratings for integration of components within the product dropped between 2017 and 2018, as did scalability. The study concludes the drop in integration ratings is due to an increasing number of software company acquisitions aggregating dissimilar products from different platforms.

The Best Big Data Companies And CEOs To Work For In 2018

Forbes readers’ most common requests center on who the best companies are to work for in analytics, big data, data management, data science and machine learning. The latest Computer Reseller News‘ 2018 Big Data 100 list of companies is used to complete the analysis as it is an impartial, independent list aggregated based on CRN’s analysis and perspectives of the market. Using the CRN list as a foundation, the following analysis captures the best companies in their respective areas today.

Using the 2018 Big Data 100 CRN list as a baseline, the following analysis compares the Glassdoor scores for the percentage of employees who would recommend the company to a friend and the percentage of employees who approve of the CEO. 25 companies on the list have very few (fewer than 15) or no Glassdoor reviews, so they are excluded from the rankings. Analysis of Glassdoor score patterns over the last four years shows that the fewer reviews a company has, the more likely it is to have 100% scores for referrals and CEOs. These companies, however, are included in the full data set available here. If the image below is not visible in your browser, you can view the rankings here.
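
The screening-and-ranking step described above is straightforward to reproduce. The sketch below uses made-up company rows to show the mechanics; the company names and scores are hypothetical placeholders, not figures from the actual data set.

```python
# Hypothetical sketch of the screening step described above: drop companies
# with too few Glassdoor reviews, then rank the rest by the percentage of
# employees who would recommend the company to a friend. All data is made up.

MIN_REVIEWS = 15  # companies below this threshold are excluded from rankings

companies = [
    {"name": "ExampleSoft", "reviews": 210, "recommend_pct": 92, "ceo_pct": 96},
    {"name": "DataDemo",    "reviews": 12,  "recommend_pct": 100, "ceo_pct": 100},
    {"name": "AnalyticsCo", "reviews": 58,  "recommend_pct": 88, "ceo_pct": 91},
]

def rank_companies(rows, min_reviews=MIN_REVIEWS):
    """Filter out thinly-reviewed companies, then sort by recommendation score."""
    eligible = [r for r in rows if r["reviews"] >= min_reviews]
    return sorted(eligible, key=lambda r: r["recommend_pct"], reverse=True)

for row in rank_companies(companies):
    print(f'{row["name"]}: {row["recommend_pct"]}% recommend, '
          f'{row["ceo_pct"]}% approve of CEO')
```

Note how "DataDemo" is dropped despite its perfect scores, mirroring the point that low review counts inflate 100% ratings.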

The highest rated CEOs on Glassdoor as of May 11, 2018 include the following:

  • Dataiku: Florian Douetteau, 100%
  • StreamSets: Girish Pancha, 100%
  • MemSQL: Nikita Shamgunov, 100%
  • 1010 Data: Greg Munves, 99%
  • Salesforce.com: Marc Benioff, 98%
  • Attivio: Stephen Baker, 98%
  • SAP: Bill McDermott, 97%
  • Qubole: Ashish Thusoo, 97%
  • Trifacta: Adam Wilson, 97%
  • Zaloni: Ben Sharma, 97%
  • Reltio: Manish Sood, 96%
  • Microsoft: Satya Nadella, 96%
  • Cloudera: Thomas J. Reilly, 96%
  • Sumo Logic: Ramin Sayar, 96%
  • Google: Sundar Pichai, 95%
  • Looker: Frank Bien, 93%
  • MongoDB: Dev Ittycheria, 92%
  • Snowflake Computing: Bob Muglia, 92%
  • Talend: Mike Tuchen, 92%
  • Databricks: Ali Ghodsi, 90%
  • Informatica: Anil Chakravarthy, 90%

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

  • Hiring companies nationwide miss out on 50% or more of qualified candidates, and tech firms incorrectly classify up to 80% of candidates, due to inaccuracies and shortcomings of existing Applicant Tracking Systems (ATS), illustrating how faulty these systems are for enabling hiring.
  • It takes 42 days on average to fill a position, and 60 days or longer to fill positions requiring in-demand technical skills, at an average cost of $5,000 per position.
  • Women applicants have a 19% chance of being eliminated from consideration for a job after a recruiter screen and 30% after an onsite interview, leading to a massive loss of brainpower and insight every company needs to grow.

It’s time the hiring process gets smarter, more infused with contextual intelligence, insight, evaluating candidates on their mastery of needed skills rather than judging candidates on resumes that reflect what they’ve achieved in the past. Enriching the hiring process with greater machine learning-based contextual intelligence finds the candidates who are exceptional and have the intellectual skills to contribute beyond hiring managers’ expectations. Machine learning algorithms can also remove any ethic- and gender-specific identification of a candidate and have them evaluated purely on expertise, experiences, merit, and skills.

The hiring process relied on globally today hasn't changed in over 500 years. From Leonardo da Vinci's handwritten resume of 1482, which highlighted his ability to build bridges and support warfare rather than the genius behind the Mona Lisa, the Last Supper, the Vitruvian Man, and a myriad of scientific discoveries and inventions that modernized the world, the approach job seekers take to pursuing new positions has stubbornly defied innovation. ATS apps and platforms classify inbound resumes and rank candidates based on just the small glimpse of their skills a resume provides, when what's needed is insight into which managerial, leadership and technical skills and strengths any given candidate is mastering, and at what pace. Machine learning broadens the scope of what hiring companies can see in candidates by moving beyond the barriers of their resumes. Better hiring decisions get made, and Return on Investment (ROI) improves drastically when hiring decisions are strengthened with greater intelligence. Key metrics including time-to-hire, cost-to-hire, retention rates, and performance will all improve when greater contextual intelligence is relied on.

Look Beyond Resumes To Win The War For Talent

Last week I had the opportunity to speak with the Vice President of Human Resources at one of the leading technology think tanks globally. He's focused on the hundreds of technical professionals his organization needs in six months, 12 months, and more than a year from now to staff exciting new research projects that will deliver valuable Intellectual Property (IP), including patents and new products.

Their approach begins by seeking to understand the profiles and core strengths of current high performers, then seeks out matches with ideal candidates in their community of applicants and the broader technology community. Machine learning algorithms are perfectly suited to completing the needed comparative analysis of high performers' capabilities and those of candidates, whose entire digital persona is taken into account when comparisons are completed. The following graphic illustrates the eightfold.ai Talent Intelligence Platform (TIP), showing how tightly it integrates with publicly available data, internal data repositories, Human Resource Management (HRM) systems, and ATS tools. Please click on the graphic to expand it for easier reading.

The comparative analysis of high achievers' characteristics with applicants takes seconds to complete, producing a list of prospects complete with profiles. Machine learning-derived profiles of potential hires matching the high performers' characteristics provide greater contextual intelligence than any resume ever could. Taking an integrated approach to creating the Talent Intelligence Platform (TIP) yields insights not available with typical hiring or ATS solutions today. The profile below reflects the contextual intelligence and depth of insight possible when machine learning is applied to an integrated dataset of candidates. Please click on the graphic to expand it for easier reading. Key elements in the profile below include the following:

  • Career Growth Bell Curve – Illustrates how a given candidate’s career progressions and performance compares relative to others.

  • Social Following On Public Sites – Provides a real-time glimpse into the candidate's activity on GitHub, OpenStack, and other sites where technical professionals can share their expertise. This also provides insight into how others perceive their contributions.

  • Highlights Of Background Relevant To Job(s) Under Review – Provides the most relevant data from the candidate's history in the profile so recruiters and managers can more easily understand their strengths.

  • Recent Publications – Publications provide insights into current and previous interests, areas of focus, mindset and learning progression over the last 10 to 15 years or longer.

  • Professional overlap that makes it easier to validate achievements chronicled in the resume – Multiple sources of real-time career data validate and provide greater context and insight into resume-listed accomplishments.

The key is understanding the context in which a candidate’s capabilities are being evaluated. And a 2-page resume will never give enough latitude to the candidate to cover all bases. For medium to large companies – doing this accurately and quickly is a daunting task if done manually – across all roles, all the geographies, all the candidates sourced, all the candidates applying online, university recruiting, re-skilling inside the company, internal mobility for existing employees, and across all recruitment channels. This is where machine learning can be an ally to the recruiter, hiring manager, and the candidate.
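
The matching step the think tank describes, comparing candidate profiles against the characteristics of current high performers, can be sketched as a similarity search over skill vectors. The skill taxonomy, scores, and candidate names below are illustrative assumptions; a platform like TIP uses far richer data and models than this.

```python
# Illustrative sketch of matching candidates to an aggregate high-performer
# profile via cosine similarity over skill vectors. The skill list, scores,
# and candidate names are hypothetical, not any platform's actual approach.

import math

SKILLS = ["python", "machine_learning", "distributed_systems", "leadership"]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Aggregate profile of current high performers (0-1 proficiency per skill).
high_performer = [0.9, 0.8, 0.7, 0.6]

candidates = {
    "candidate_a": [0.85, 0.75, 0.6, 0.5],   # broadly similar to high performers
    "candidate_b": [0.2, 0.1, 0.9, 0.3],     # strong in one area, weak elsewhere
}

# Rank candidates by how closely their skill mix matches the high performers.
ranked = sorted(candidates.items(),
                key=lambda kv: cosine(high_performer, kv[1]),
                reverse=True)
for name, vec in ranked:
    print(f"{name}: similarity {cosine(high_performer, vec):.3f}")
```

Because the comparison runs over whatever data populates the vectors, publications, public code activity, and validated work history can all feed the same ranking, which is what makes an integrated profile more informative than a resume alone.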

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

Reducing costs and time-to-hire, increasing the quality of hires, and staffing new initiatives with the highest-quality talent possible all fuel solid revenue growth. Relying on resumes alone is like being on a bad Skype call where you only hear every tenth word in the conversation. Machine learning-based approaches bring greater acuity, clarity, and visibility to hiring decisions.

The following are the five reasons why machine learning needs to make resumes obsolete:

  1. Resumes are like rearview mirrors that primarily reflect the past. What’s needed is a focus on where someone is going, what motivates them, and what they are fascinated with and learning about on their own: an intelligent heads-up display of what their future will look like based on present interests and talent.
  2. By relying on a 500+-year-old process, there’s no way of knowing which skills, technologies, and training a candidate is gaining momentum in. The depth and extent of mastery in specific areas aren’t reflected in the structure of resumes. By integrating multiple sources of data into a unified view of a candidate, it’s possible to see which areas they are developing fastest in professionally.
  3. It’s far harder to game a machine learning algorithm that takes into account all the digital data available on a candidate, while resumes have a credibility problem. Anyone who has been involved in hiring decisions has faced the disappointment of finding out a promising candidate lied on a resume. It’s a huge let-down. Resumes are often gamed; one recruiter estimates that at least 60% of resumes contain exaggerations and, in some cases, outright lies. Taking all data into account using a platform like TIP shows the true candidate and their actual skills.
  4. It’s time to take a more data-driven approach to diversity that removes unconscious biases. Resumes carry inherent biases with them. Recruiters, hiring managers, and final interview panels of senior managers bring unconscious biases based on a person’s name, gender, age, appearance, the schools they attended, and more. It’s more effective to know candidates’ skills, strengths, and core areas of intelligence, all of which are better predictors of job performance.
  5. Reduces the risk of making a bad hire who will churn out of the organization fast. Ultimately, everyone hires based partly on their best judgment and partly on their often unconscious biases; it’s human nature. With more data, the probability of making a bad hire drops, reducing the risk of churning through a new hire and spending thousands of dollars to hire and then replace them. Greater contextual intelligence reduces the downside risks of hiring, removes biases by showing with solid data just how qualified a person is for a role, and verifies their background, strengths, skills, and achievements. Factors contributing to unconscious bias, including gender, race, age, and more, can be removed from profiles so candidates are evaluated only on their potential to excel in the roles they are being considered for.
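
Point 4 and point 5 both hinge on stripping bias-prone fields from a profile before anyone evaluates it. A minimal sketch of that redaction step follows; the field names are hypothetical, since real platforms define their own schemas and redaction policies.

```python
# Hypothetical list of fields linked to unconscious bias (per the article:
# name, gender, age, appearance, schools attended).
BIAS_FIELDS = {"name", "gender", "race", "age", "photo_url", "school"}

def redact_profile(profile):
    """Return a copy of the profile with bias-prone fields removed,
    leaving only merit-related signals for evaluation."""
    return {k: v for k, v in profile.items() if k not in BIAS_FIELDS}

profile = {
    "name": "Jane Doe",
    "age": 34,
    "gender": "F",
    "skills": ["python", "machine learning"],
    "open_source_commits": 412,
}
clean = redact_profile(profile)
print(sorted(clean))  # → ['open_source_commits', 'skills']
```

Redacting before scoring, rather than after, matters: if the model never sees the bias-prone fields, they cannot influence the ranking, whereas hiding them only from the human reviewer leaves the algorithmic pipeline exposed.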

Bottom line: It’s time to revolutionize resumes and hiring processes, moving them into the 21st century by redefining them with greater contextual intelligence and insight enabled by machine learning.
