
Posts from the ‘Machine learning’ Category

What Matters Most In Business Intelligence, 2019

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019.
  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today.
  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout enterprises today.
  • Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing.

These and many other fascinating insights are from Dresner Advisory Associates’ 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralized strategies to tactical ones that seek to improve daily operations. The Dresner research team’s broad assessment of the BI market makes this report unique, including its use of visualizations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm’s research community of over 5,000 organizations as well as vendors’ customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.

Key insights from the study include the following:

  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout their enterprises today. More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have increased their influence over BI adoption more than any other department included in the current and previous surveys. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into daily use.

  • Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing. Retail/Wholesale and Tech companies’ sales leadership is primarily driving BI adoption in their respective industries. It’s not surprising to see that the leading influencer among Healthcare respondents is resource-intensive HR. The study also found that Executive Management is most likely to drive business intelligence adoption in consulting practices.

  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today. Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner’s research team found reporting retains all-time high scores as the top priority, while data storytelling, governance, and data catalogs hold momentum. Please click on the graphic to expand for easier reading.

  • BI software providers most commonly rely on executive-level personas to design their applications and add new features. Dresner’s research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable that BI vendors selling to them focus primarily on customer personas. The following graphic compares targeted users for BI by industry.

  • Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management. Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies, cost reductions, or competitive advantage. Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from product to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies. Please click on the graphic to expand for easier reading.

  • In aggregate, BI is achieving its highest levels of adoption in R&D, Executive Management, and Operations departments today. The growing complexity of products and business models in tech companies and the increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.

  • Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration. Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

  • Marketing/Sales and Operations are using the greatest variety of BI tools today. The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments; every one of them knows how many, and most likely which types of, BI tools are deployed in their departments. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.

How To Get Your Data Scientist Career Started

The most common request from this blog’s readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I’ve invited Alyssa Columbus, a Data Scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first one, isn’t easy given the surplus of analytics job seekers relative to open analytics positions.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I’ve referred them to many different resources, including discussions I’ve had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I’ve learned that works, I’ve put together the five most valuable lessons learned. I’ve written this article to make your data science job hunt as easy and efficient as possible.

  • Continuously build your statistical literacy and programming skills. Currently, there are 24,697 open Data Scientist positions on LinkedIn in the United States alone. Analyzing all of those open U.S. positions with data mining techniques produced the following list of the top 10 data science skills. As of April 14, the top three skills requested most often in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and TensorFlow. The following graphic provides a prioritized list of the most in-demand data science skills mentioned in LinkedIn job postings today, and a minimal sketch of this kind of keyword counting appears after the next paragraph. Please click on the graphic to expand for easier viewing.

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn’s job postings prioritize. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and not applying what you’ve learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for which big questions to ask of your dataset. Statistical literacy, or “how” to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.
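
As a rough illustration of the kind of skills analysis mentioned above, the sketch below counts how often a set of skill keywords appears in a handful of job-posting texts. The postings and the skill list are placeholders of my own, not LinkedIn data or its API, so treat it only as a starting point for your own export or scrape.

```python
# Minimal sketch: count skill keywords across job-posting texts.
# The postings and SKILLS list are illustrative placeholders, not real data.
import re
from collections import Counter

SKILLS = ["python", "r", "sql", "jupyter", "unix", "awk", "aws", "tensorflow"]

postings = [
    "Seeking a data scientist with strong Python and SQL skills; AWS a plus.",
    "Must know R, SQL, and Jupyter notebooks for exploratory analysis.",
    "Experience with TensorFlow, Python, and Unix shell scripting required.",
]

def count_skill_mentions(posting_texts, skills):
    """Count how many postings mention each skill at least once (word-boundary match)."""
    counts = Counter()
    for text in posting_texts:
        text_lower = text.lower()
        for skill in skills:
            if re.search(rf"\b{re.escape(skill)}\b", text_lower):
                counts[skill] += 1
    return counts

if __name__ == "__main__":
    for skill, n in count_skill_mentions(postings, SKILLS).most_common():
        print(f"{skill}: mentioned in {n} of {len(postings)} postings")
```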

  • Continually be creating your own, unique portfolio of analytics and machine learning projects. Having a good portfolio is essential to being hired as a data scientist, especially if you don’t come from a quantitative background or don’t have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you’re most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you’ve completed on your own. Some skills I’d recommend you highlight in your portfolio include:
    • Your programming language of choice (e.g., Python, R, Julia, etc.).
    • The ability to interact with databases (e.g., your ability to use SQL).
    • Visualization of data (static or interactive).
    • Storytelling with data. This is a critical skill. In essence, can someone with no background in your project’s subject area look at it and gain new understanding from it?
    • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard); a minimal sketch of such an API follows this list.
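
As a sketch of what that deployment skill can look like, here is a minimal REST API that serves predictions from a scikit-learn model using Flask. The iris dataset and the /predict route are my own illustrative choices, not a prescription for any particular portfolio project.

```python
# Minimal sketch: serve a trained scikit-learn model behind a REST endpoint.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Train a small model at startup; a real project would load a persisted model.
iris = load_iris()
model = LogisticRegression(max_iter=1000).fit(iris.data, iris.target)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [5.1, 3.5, 1.4, 0.2]}
    payload = request.get_json(force=True)
    prediction = model.predict([payload["features"]])[0]
    return jsonify({"species": str(iris.target_names[prediction])})

if __name__ == "__main__":
    app.run(port=5000)
```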

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

  • Get (or git!) yourself a website. If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website.  Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all it’s free to do. A strong online presence will not only help you in applying for jobs, but organizations may also reach out to you with freelance projects, interviews, and other opportunities.
  • Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network.  If you don’t meet all of a job’s requirements, apply anyway. You don’t have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you’re a great fit for the main requirements of the job’s description, you need to apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you’re hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I’ve found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specializing in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:
    • Recruiters
    • Friends, family, and colleagues
    • Career fairs and recruiting events
    • General job boards
    • Company websites
    • Tech job boards

Alyssa Columbus is a Data Scientist at Pacific Life and a member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine, and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics, and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.

The State Of 3D Printing, 2019

  • Proof of concept and prototyping dominate 3D printing applications in 2019.
  • 80% of enterprises say 3D printing is enabling them to innovate faster.
  • 51% of enterprises are actively using 3D printing in production.

These and many other fascinating insights are from Sculpteo’s 5th edition of their popular study, The State of 3D Printing (29 pp., PDF, opt-in). The study’s methodology is based on interviews with 1,300 respondents from Europe (64%), the United States (16.6%), and Asia (20.2%), the fastest-growing region internationally as measured by this survey over the last five years. Eight industries are included in the research design: Industrial Goods (13.6%), High Tech (10.6%), Services (9.9%), Consumer Goods (8.6%), Health & Medical (6.2%), Automotive (5.7%), Aerospace & Defense (5.5%), and Education (4.9%). For additional details on the methodology, please see pages 6 and 7 of the study. Key takeaways from the survey include the following:

  • Proof of concept and prototyping dominate 3D printing applications in 2019. Manufacturers are increasing their reliance on 3D printing as part of their broader manufacturing strategies, with production use up to 51% of all respondents from 38.7% in 2018. The following graphic compares 2019’s purposes for 3D prints with the last five years of survey data. Please click on the graphic to expand for easier reading.

  • Accelerating product development continues to be enterprises’ top focus guiding their 3D printing strategies in 2019. Mass customization and support for configure-to-order and engineer-to-order product strategies also continue to be a high priority this year, continuing a trend that began in 2015. Increasing production flexibility is the third area of focus guiding additive manufacturing strategies today. Please click on the graphic to expand for easier reading.

  • Nearly 50% of enterprises say that quality control is their top challenge in using their 3D printers. As enterprises increase their adoption of 3D printing to accelerate their additive manufacturing strategies, quality is becoming increasingly important. Manufacturers define their success largely by the perceived quality of the products they deliver to their customers, which makes quality control an increasingly necessary benefit of 3D printing. Please click on the graphic to expand for easier reading.

  • Adopting a design-to-manufacturing strategy accelerates new product development and innovation, which is why CAD design leads all other activities today. When respondents were asked in which areas related to 3D printing and additive manufacturing they spend the majority of their time, nearly 50% said CAD design. Building prototypes, research, and testing prototypes are also areas enterprises adopting additive manufacturing are investing in today. Please click on the graphic to expand for easier reading.

  • Additive manufacturing adoption is growing across shop floors globally, evidenced by more than 70% of enterprises finding new applications for 3D printing in 2019 and 60% using CAD, simulation, and reverse engineering internally. The leading indicators of additive manufacturing becoming more pervasively adopted across global shop floors are shown in the following graphic. New uses for 3D printing, experimenting with new materials, and extensive CAD design integration combined with simulation and reverse engineering provide further evidence of how ingrained additive manufacturing is becoming in daily production processes. 3D printing is now most commonly used alongside CNC machining, another strong indicator of how essential additive manufacturing is becoming to the production process. Please click on the graphic to expand for easier reading.

  • 3D printing’s innate strengths at producing items with complex geometries and iterating designs quickly are the two leading benefits of 3D printing in 2019. More than 40% of enterprises say that rapid iterations of prototypes and lead time reductions are the leading benefits, followed by mass customization (support for configure-to-order and engineer-to-order product strategies) and cost savings. Please click on the graphic to expand for easier reading.

  • 80% of high tech manufacturing respondents are relying on 3D printing for prototyping, leading all industries in this category. 47% use 3D printing to accelerate product development. High tech manufacturers are above average in their experimenting with new 3D printing materials and technologies, looking for greater competitive strength in their industry. Please click on the graphic to expand for easier reading.

  • North American-based enterprises see support for complex product concepts (complex geometries), speed (quick iterations), scale (mass customization), and cost savings as the top benefits of 3D printing. Sculpteo’s survey found that North American enterprises are more optimistic about the potential for 3D printing to become mainstream in a production environment. While budget and physical space are the two most significant barriers enterprises face in adopting 3D printing at scale, their optimistic outlook on the technology’s future is driving greater adoption on the shop floor. Please click on the graphic to expand for easier reading.

How To Improve Privileged User’s Security Experiences With Machine Learning

Bottom Line: One of the primary factors motivating employees to sacrifice security for speed is the frustration they face repeatedly re-authenticating who they are so they can get more work done and be more productive.

How Bad Security Experiences Lead to a Breach

Every business is facing the paradox of hardening security without sacrificing users’ login and system access experiences. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment across every threat surface an organization has.

Centrify’s recent survey Privileged Access Management In The Modern Threatscape found that 74% of data breaches start with privileged credential abuse. Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. On the Dark Web, privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags.

Frustrated with wasting time responding to the many account lock-outs, re-authentication procedures, and login errors outmoded Privileged Access Management (PAM) systems require, IT Help Desk teams, IT administrators, and admin users freely share privileged credentials, often resulting in them eventually being offered for sale on the Dark Web.

The Keys to the Kingdom Are In High Demand

18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. State-sponsored and organized crime organizations offer to pay bounties in bitcoin for privileged credentials for many of the world’s largest financial institutions on the Dark Web. And with the typical U.S.-based enterprise losing on average $7.91M from a breach, more than double the global average of $3.86M according to IBM’s 2018 Data Breach Study, it’s clear that improving admin user experiences to reduce the incidence of privileged credential sharing needs to happen now.

How Machine Learning Improves Admin User Experiences and Thwarts Breaches

Machine learning is making every aspect of security experiences more adaptive, taking into account the risk context of every privileged access attempt across any threat surface, anytime. Machine learning algorithms can continuously learn and generate contextual intelligence that is used to streamline verified privileged user’s access while thwarting many potential threats ― the most common of which is compromised credentials.

The following are a few of the many ways machine learning is improving privileged users’ experiences when they need to log in to secure critical infrastructure resources:

  • Machine learning is making it possible to provide adaptive, personalized login experiences at scale, using risk scoring of every access attempt in real time, all contributing to improved user experiences. Machine learning makes it possible to implement security strategies that flex or adapt to risk contexts in real time, assessing every access attempt across every threat surface and generating a risk score in milliseconds. Being able to respond in milliseconds, in real time, is essential for delivering excellent admin user experiences. The “never trust, always verify, enforce least privilege” approach to security is how many enterprises across a broad base of industries, including leading financial services and insurance companies, are protecting every threat surface from privileged access abuse. CIOs at these companies say taking a Zero Trust approach with a strong focus on Zero Trust Privilege corporate-wide is redefining the legacy approach to Privileged Access Management by delivering cloud-architected Zero Trust Privilege to secure access to infrastructure, DevOps, cloud, containers, Big Data, and other modern enterprise use cases. Taking a Zero Trust approach to security enables their departments to roll out new services across every threat surface their customers prefer to use without having to customize security strategies for each.
  • Quantify, track, and analyze every potential security threat and attempted breach, and apply threat analytics to the aggregated data sets in real time, thwarting data exfiltration attempts before they begin. One of the cornerstones of Zero Trust Privilege is adaptive control. Machine learning algorithms continually “learn” by analyzing and looking for anomalies in users’ behavior across every threat surface, device, and login attempt. When any user’s behavior appears to be outside the threshold of constraints defined for threat analytics and risk scoring, additional authentication is immediately requested, and access to the requested resources is denied until the identity can be verified. Machine learning makes adaptive preventative controls possible; a simplified sketch of this kind of risk scoring follows this list.
  • When every identity is a new security perimeter, machine learning’s ability to provide personalization at scale for every access attempt on every threat surface is essential for enabling a company to keep growing. Businesses that are growing the fastest often face the greatest challenges when it comes to improving their privileged users’ experiences. Getting new employees productive quickly needs to be based on four foundational elements: verifying the identity of every admin user, knowing the context of their access request, ensuring it’s coming from a clean source, and limiting access as well as privilege. Taken together, these pillars form the foundation of Zero Trust Privilege.
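
To make the risk-scoring idea above concrete, here is a minimal, hypothetical sketch: a classifier trained on past access attempts produces a probability that a new attempt is risky, which a policy can use to trigger step-up authentication. The features, threshold, and data are synthetic illustrations, not Centrify’s or any vendor’s actual model.

```python
# Hypothetical sketch: score an access attempt and step up authentication if risky.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: hour_of_day, is_new_device, failed_logins_last_hour, geo_velocity_kmh
X_train = np.array([
    [9,  0, 0,    0],   # normal working-hours login
    [10, 0, 1,   20],
    [14, 0, 0,    5],
    [3,  1, 6, 4000],   # unusual hour, new device, "impossible travel"
    [2,  1, 4, 3500],
    [23, 1, 8, 5000],
])
y_train = np.array([0, 0, 0, 1, 1, 1])  # 1 = later confirmed as credential abuse

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def risk_score(attempt):
    """Return the estimated probability (0.0 - 1.0) that an access attempt is risky."""
    return float(model.predict_proba([attempt])[0, 1])

attempt = [4, 1, 5, 4200]          # 4 a.m., new device, repeated failures
score = risk_score(attempt)
print(f"risk score: {score:.2f}")
if score > 0.7:                    # illustrative policy threshold
    print("step up: require MFA before granting privileged access")
```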

Conclusion

Organizations don’t have to sacrifice security for speed when they’re relying on machine learning-based approaches for improving the privileged user experience. Today, a majority of IT Help Desk teams, IT administrators, and admin users are freely sharing privileged credentials to be more productive, which often leads to breaches based on privileged access abuse. By taking a machine learning-based approach to validate every access request, the context of the request, and the risk of the access environment, roadblocks in the way of greater privileged user productivity disappear. Privileged credential abuse is greatly minimized.

Industry 4.0’s Potential Needs To Be Proven On The Shop Floor

  • 99% of mid-market manufacturing executives are familiar with Industry 4.0, yet only 5% are currently implementing or have implemented an Industry 4.0 strategy.
  • Investing in upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring including Manufacturing Execution Systems (MES) are manufacturers’ top three priorities based on interviews with them.
  • Mid-market manufacturers getting the most value out of Industry 4.0 excel at orchestrating a variety of technologies to find new ways to excel at product quality, improve shop floor productivity, meet delivery dates, and control costs.
  • Real-time monitoring is gaining momentum to improve order cycle times, troubleshoot quality problems, improve schedule accuracy, and support track-and-trace.

These and many other fascinating insights are from Industry 4.0: Defining How Mid-Market Manufacturers Derive and Deliver Value. BDO, a leading provider of assurance, tax, and financial advisory services, has made the report available for download here (PDF, 36 pp., no opt-in). The survey was conducted by Market Measurement, Inc., an independent market research consulting firm. The survey included 230 executives at U.S. manufacturing companies with annual revenues between $200M and $3B and was conducted in November and December of 2018. Please see page 2 of the study for additional details regarding the methodology. One of the most valuable findings of the study is that mid-market manufacturers need more evidence of Industry 4.0 delivering improved supply chain performance, quality, and shop floor productivity.

Insights from the Shop Floor: Machine Upgrades, Smart Machines, Real-Time Monitoring & MES Lead Investment Plans

In the many conversations I’ve had with mid-tier manufacturers located in North America this year, I’ve learned the following:

  • Their top investment priorities are upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring including Manufacturing Execution Systems (MES).
  • Manufacturers growing 10% or more this year over 2018 excel at integrating technologies that improve scheduling to enable more short-notice production runs, reduce order cycle times, and improve supplier quality.

Key Takeaways from BDO’s Industry 4.0 Study

  • Manufacturers are most motivated to evaluate Industry 4.0 technologies based on the potential for growth and business model diversification they offer. Building a business case for any new system or technology that delivers revenue, even during a pilot, receives the highest priority from manufacturers today. Based on my interviews with manufacturers, I found they were 1.7 times more likely to invest in machine upgrades and smart machines than to spend more on marketing. Manufacturers are very interested in any new technology that enables them to accept short-notice production runs from customers, excel at higher quality standards, and improve time-to-market, all while gaining better cost visibility and control. All those factors are inherent in the top three goals of business model diversification, improved operational efficiencies, and increased market penetration.

  • For Industry 4.0 technologies to gain more adoption, more use cases are needed to explain how traditional product sales, aftermarket sales, and product-as-a-service benefit from these new technologies. Manufacturers know the ROI of investing in a machinery upgrade, buying a smart, connected machine, or integrating real-time monitoring across their shop floors. What they’re struggling with is understanding how Industry 4.0 improves traditional product sales. 84% of upper mid-market manufacturers are generating revenue using Information-as-a-Service today, compared to 67% of middle market manufacturers overall.

  • Manufacturers who get the most value out of their Industry 4.0 investments begin with a customer-centric blueprint, integrating diverse technologies to deliver excellent customer experiences. Manufacturers growing 10% a year or more are relying on roadmaps to guide their technology buying decisions. These roadmaps are focused on how to reduce scrap, improve order cycle times, streamline supplier integration while improving inbound quality levels, and provide real-time order updates to customers. BDO’s survey results reflect what I’m hearing from manufacturers. They’re more focused than ever before on having an integrated engagement strategy combined with greater flexibility in responding to unique and often urgent production runs.

  • Industry 4.0’s potential to improve supply chains needs greater focus if mid-tier manufacturers are going to adopt the framework fully. Manufacturing executives most often equate Industry 4.0 with shop floor productivity improvements while the greatest gains are waiting in their supply chains. The BDO study found that manufacturers are divided on the metrics they rely on to evaluate their supply chains. Upper middle market manufacturers are aiming to speed up customer order cycle times and are less focused on getting their total delivered costs down. Lower mid-market manufacturers say reducing inventory turnover is their biggest priority. Overall, strengthening customer service increases in importance with the size of the organization.

  • By enabling integration between engineering, supply chain management, Manufacturing Execution Systems (MES) and CRM systems, more manufacturers are achieving product configuration strategies at scale. A key growth strategy for many manufacturers is to scale beyond the limitations of their longstanding Make-to-Stock production strategies. By integrating engineering, supply chains, MES, and CRM, manufacturers can offer more flexibility to their customers while expanding their product strategies to include Configure-to-Order, Make-to-Order, and for highly customized products, Engineer-to-Order. The more Industry 4.0 can be shown to enable design-to-manufacturing at scale, the more it will resonate with senior executives in mid-tier manufacturing.

  • Manufacturers are more likely than ever before to accept cloud-based platforms and systems that help them achieve their business strategies faster and more completely, with analytics being in the early stages of adoption. Manufacturing CEOs and their teams are most concerned about how quickly new applications and platforms can position their businesses for more growth. Whether a given application or platform is cloud-based often becomes secondary to the speed and time-to-market constraints every manufacturing business faces. The fastest-growing mid-tier manufacturers are putting greater effort and intensity into mastering analytics across every area of their business too. BDO found that Artificial Intelligence (AI) leads all other technologies in planned use.

How To Improve Supply Chains With Machine Learning: 10 Proven Ways

Bottom line: Enterprises are attaining double-digit improvements in forecast error rates, demand planning productivity, cost reductions and on-time shipments using machine learning today, revolutionizing supply chain management in the process.

Machine learning algorithms and the models they’re based on excel at finding anomalies, patterns, and predictive insights in large data sets. Many supply chain challenges are time, cost, and resource constraint-based, making machine learning an ideal technology to solve them. From Amazon’s Kiva robotics relying on machine learning to improve accuracy, speed, and scale, to DHL relying on AI and machine learning to power its Predictive Network Management system that analyzes 58 different parameters of internal data to identify the top factors influencing shipment delays, machine learning is defining the next generation of supply chain management. Gartner predicts that by 2020, 95% of Supply Chain Planning (SCP) vendors will be relying on supervised and unsupervised machine learning in their solutions. Gartner also predicts that by 2023, intelligent algorithms and AI techniques will be an embedded or augmented component across 25% of all supply chain technology solutions.
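
As a small illustration of the anomaly-finding strength described above, the sketch below flags shipments whose lead time and freight cost fall outside the historical pattern. The data and the choice of scikit-learn’s IsolationForest are my own illustrative assumptions, not a reference to any particular vendor’s system.

```python
# Illustrative sketch: flag anomalous shipments with an isolation forest.
import numpy as np
from sklearn.ensemble import IsolationForest

# Columns: lead_time_days, freight_cost_usd (synthetic values)
shipments = np.array([
    [5, 1200], [6, 1250], [5, 1180], [7, 1300], [6, 1220],
    [5, 1210], [6, 1260], [21, 4800],   # last row: a delayed, costly shipment
])

detector = IsolationForest(contamination=0.1, random_state=0).fit(shipments)
flags = detector.predict(shipments)     # -1 = anomaly, 1 = normal

for (lead_time, cost), flag in zip(shipments, flags):
    status = "ANOMALY" if flag == -1 else "ok"
    print(f"lead_time={lead_time:>3} days, cost=${cost:>5}  ->  {status}")
```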

The ten ways that machine learning is revolutionizing supply chain management include:

  • Machine learning-based algorithms are the foundation of the next generation of logistics technologies, with the most significant gains being made with advanced resource scheduling systems. Machine learning and AI-based techniques are the foundation of a broad spectrum of next-generation logistics and supply chain technologies now under development. The most significant gains are being made where machine learning can contribute to solving complex constraint, cost and delivery problems companies face today. McKinsey predicts machine learning’s most significant contributions will be in providing supply chain operators with more significant insights into how supply chain performance can be improved, anticipating anomalies in logistics costs and performance before they occur. Machine learning is also providing insights into where automation can deliver the most significant scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019. By Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus

  • The wide variation in data sets generated from Internet of Things (IoT) sensors, telematics, intelligent transport systems, and traffic data has the potential to deliver the most value to improving supply chains using machine learning. Applying machine learning algorithms and techniques to improve supply chains starts with the data sets that have the greatest variety and variability in them. The most challenging issues supply chains face are often found in optimizing logistics so that materials needed to complete a production run arrive on time. Source: KPMG, Supply Chain Big Data Series Part 1

  • Machine learning shows the potential to reduce logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings. BCG recently looked at how a decentralized supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration when blockchain is used to share data in real-time across a supplier network, combined with better analytics insight, cost savings of $6M a year is achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

  • Reducing forecast errors up to 50% is achievable using machine learning-based techniques. Lost sales due to products not being available are being reduced up to 65% through the use of machine learning-based planning and optimization techniques. Inventory reductions of 20 to 50% are also being achieved today when machine learning-based supply chain management systems are used. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).

  • DHL Research is finding that machine learning enables logistics and supply chain operations to optimize capacity utilization, improve customer experience, reduce risk, and create new business models. DHL’s research team continually tracks and evaluates the impact of emerging technologies on logistics and supply chain performance. They’re also predicting that AI will enable back-office automation, predictive operations, intelligent logistics assets, and new customer experience models. Source: DHL Trend Research, Logistics Trend Radar, Version 2018/2019 (PDF, 55 pp., no opt-in)

  • Detecting and acting on inconsistent supplier quality levels and deliveries using machine learning-based applications is an area manufacturers are investing in today. Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they’re facing today is suppliers’ lack of consistent quality and delivery performance. The greatest growth barrier is the lack of skilled labor available. Using machine learning and advanced analytics, manufacturers can quickly discover who their best and worst suppliers are, and which production centers are most accurate in catching errors. Manufacturers are using dashboards much like the one below for applying machine learning to supplier quality, delivery, and consistency challenges. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour, 2018

  • Reducing risk and the potential for fraud while improving product and process quality based on insights gained from machine learning is driving an inflection point in inspections across supply chains today. When inspections are automated using mobile technologies and results are uploaded in real time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. Inspectorio is a machine learning startup to watch in this area. They’re tackling the many problems that a lack of inspection and supply chain visibility creates, focusing on how they can solve them immediately for brands and retailers. The graphic below explains their platform. Source: Forbes, How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility, January 23, 2019

  • Machine learning is making rapid gains in end-to-end supply chain visibility possible, providing predictive and prescriptive insights that are helping companies react faster than before. Combining multi-enterprise commerce networks for global trade and supply chain management with AI and machine learning platforms is revolutionizing end-to-end supply chain visibility. One of the early leaders in this area is Infor’s Control Center. Control Center combines data from the Infor GT Nexus Commerce Network, acquired by the company in September 2015, with Infor’s Coleman Artificial Intelligence (AI) platform. Infor chose to name its AI platform after the inspiring physicist and mathematician Katherine Coleman Johnson, whose trail-blazing work helped NASA land on the moon. Be sure to pick up a copy of the book and see the movie Hidden Figures, if you haven’t already, to appreciate her and many other brilliant women mathematicians’ contributions to space exploration. ChainLink Research provides an overview of Control Center in their article, How Infor is Helping to Realize Human Potential, and two screens from Control Center are shown below.

  • Machine learning is proving to be foundational for thwarting privileged credential abuse, which is the leading cause of security breaches across global supply chains. By taking a least privilege access approach, organizations can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse in their supply chains by recognizing that if a privileged user enters the right credentials but the request comes in with risky context, stronger verification is needed to permit access. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment. Centrify is a leader in this area, with globally recognized suppliers including Cisco, Intel, Microsoft, and Salesforce being current customers. Source: Forbes, High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019, November 28, 2018.
  • Capitalizing on machine learning to predict preventative maintenance needs for freight and logistics machinery based on IoT data is improving asset utilization and reducing operating costs. McKinsey found that predictive maintenance enhanced by machine learning allows for better prediction and avoidance of machine failure by combining data from advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible, and overall maintenance costs may be reduced by up to 10%. A simplified sketch of this approach follows this list. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
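
To make the predictive-maintenance point above concrete, here is a minimal, hypothetical sketch that trains a classifier on sensor readings and maintenance history to estimate which assets are likely to fail soon. The sensor values, features, and labels are synthetic illustrations, not McKinsey’s or any operator’s actual data.

```python
# Hypothetical sketch: estimate near-term failure risk from IoT sensor features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Columns: vibration_mm_s, bearing_temp_c, hours_since_service (synthetic values)
X = np.array([
    [2.1, 55, 120], [2.3, 58, 200], [2.0, 54, 90],  [2.4, 60, 300],
    [6.8, 88, 950], [7.2, 92, 1100], [6.5, 85, 870], [7.0, 90, 1000],
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 1 = failed within the next 30 days

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score the current fleet and schedule the riskiest assets for service first.
fleet = np.array([[2.2, 57, 150], [6.9, 89, 980]])
for asset_id, prob in enumerate(model.predict_proba(fleet)[:, 1]):
    print(f"asset {asset_id}: estimated 30-day failure probability = {prob:.2f}")
```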

References

Accenture, Reinventing The Supply Chain With AI, 20 pp., PDF, no opt-in.

Bendoly, E. (2016). Fit, Bias, and Enacted Sensemaking in Data Visualization: Frameworks for Continuous Development in Operations and Supply Chain Management Analytics. Journal of Business Logistics, 37(1), 6-17.

Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

How To Secure Mobile Devices In A Zero Trust World

  • 86% of enterprises are seeing mobile threats growing the fastest this year, outpacing other threat types.
  • 48% say they’ve sacrificed security to “get the job done” up from 32% last year.
  • 41% of those affected say the compromise had major and lasting repercussions, and 43% said that their efforts to remediate the attacks were “difficult and expensive.”

Bottom Line: The majority of enterprises, 67%, are less confident in the security of their mobile assets than in that of any other device or platform today, according to Verizon’s Mobile Security Index 2019.

Why Mobile Devices Are the Fastest Growing Threat Surface Today     

Verizon found that 86% of enterprises see an upswing in the number, scale, and scope of mobile breach attempts in 2019. When broken out by industry, Financial Services, Professional Services, and Education are the most commonly targeted industries as the graphic below shows:

The threat surface every organization needs to protect is expanding exponentially today, driven by the combination of employee- and company-owned mobile devices. 41% of enterprises rate mobile devices as their most vulnerable threat surface this year:

Passwords and Mobile Devices Have Become A Hacker’s Paradise

“The only people who love usernames and passwords are hackers,” said Alex Simons, corporate vice president at Microsoft’s identity division, in a recent Wall Street Journal article, Username and Password Hell: Why the Internet Can’t Keep You Logged In. Verizon found that mobile devices are the most vulnerable, fastest-growing threat surface there is, making them a favorite with state-sponsored and organized crime syndicates. The speed at which mobile devices are proliferating in enterprises today frequently outpaces organizations’ ability to secure them, leaving them to fall back on legacy Privileged Access Management (PAM) approaches that hacking syndicates know how to get around easily using compromised passwords and privileged access credentials. Here’s proof of how lucrative a target passwords and mobile devices are for hackers:

  • Hackers’ favorite way to gain access to any business is by using privileged access credentials, which are increasingly being harvested from cellphones using malware. Hacking organizations would rather walk in the front door of any organization’s systems than expend the time and effort to hack in. It’s by far the most popular approach with hackers: 74% of IT decision makers whose organizations have been breached in the past say the breach involved privileged access credential abuse, according to a recent Centrify survey, Privileged Access Management in the Modern Threatscape. Only 48% of the organizations have a password vault, and just 21% have multi-factor authentication (MFA) implemented for privileged administrative access. The Verizon study found that malware is the most common strategy hackers use to gain access to corporate networks. MobileIron’s Global Threat Report, Mid-Year 2018, found that 3.5% of Android devices are harboring known malware. Of these malicious apps, over 80% had access to internal networks and were scanning nearby ports, suggesting the malware was part of a larger attack.

Securing Mobile Devices In A Zero Trust World Needs To Happen Now

Mobile devices are an integral part of everyone’s identity today. They are also the fastest growing threat surface for every business – making identities the new security perimeter. Passwords are proving to be problematic in scaling fast enough to protect these threat surfaces, as credential abuse is skyrocketing today. They’re perennial best-sellers on the Dark Web, where buyers and sellers negotiate in bitcoin for companies’ logins and passwords – often with specific financial firms, called out by name in “credentials wanted” ads. Organizations are waking up to the value of taking a Zero Trust approach to securing their businesses, which is a great start. Passwords are still the most widely relied-on security mechanism – and continue to be the weakest link in today’s enterprise security.  That needs to change. According to the Wall Street Journal, the World Wide Web Consortium has recently ratified a standard called WebAuthN, which allows websites to authenticate users with biometric information, or physical objects like security keys, and skip passwords altogether.

MobileIron is also taking a unique approach to this challenge by introducing zero sign-on (ZSO), built on the company’s unified endpoint management (UEM) platform and powered by the MobileIron Access solution. “By making mobile devices your identity, we create a world free from the constant pains of password recovery and the threat of data breaches due to easily compromised credentials,” wrote Simon Biddiscombe, MobileIron’s President and Chief Executive Officer in his recent blog post, Single sign-on is still one sign-on too many. Simon’s latest post MobileIron: We’re making history by making passwords history, provides the company’s vision going forward with ZSO. Zero sign-on eliminates passwords as the primary method for user authentication, unlike single sign-on, which still requires at least one username and password. MobileIron paved the way for a zero sign-on enterprise with its Access product in 2017, which enabled zero sign-on to cloud services on managed devices.

Conclusion

Mobile devices are the most quickly proliferating threat surface there is today and an integral part of everyone’s identity as well. Thwarting the many breaches attempted daily over mobile devices and across all threat surfaces needs to start with a solid Zero Trust framework. MobileIron’s introduction of zero sign-on (ZSO) eliminates passwords as the method for user authentication, replacing single sign-on, which still requires at least one username and password. ZSO is exactly what enterprises need to secure the proliferating number of mobile devices they rely on to operate and grow in a Zero Trust world.

CIO’s Guide To Stopping Privileged Access Abuse – Part I

CIOs face the paradox of having to protect their businesses while at the same time streamlining access to the information and systems their companies need to grow. The threatscape they’re facing requires an approach to security that is adaptive to the risk context of each access attempt across any threat surface, anytime. Using risk scores to differentiate between privileged users attempting to access secured systems in a riskier context than normal versus privileged credential abuse by attackers has proven to be an effective approach for thwarting credential-based breaches.

Privileged credential abuse is one of the most popular breach strategies organized crime and state-sponsored cybercrime organizations use. They’d rather walk in the front door of enterprise systems than hack in. 74% of IT decision makers surveyed whose organizations have been breached in the past say it involved privileged access credential abuse, yet just 48% have a password vault. Just 21% have multi-factor authentication (MFA) implemented for privileged administrative access. These and many other insights are from Centrify’s recent survey, Privileged Access Management in the Modern Threatscape.

How CIOs Are Solving the Paradox of Privileged Credential Abuse

The challenge to every CIO’s security strategy is to adapt to risk contexts in real-time, accurately assessing every access attempt across every threat surface, risk-scoring each in milliseconds. By taking a “never trust, always verify, enforce least privilege” approach to security, CIOs can provide an adaptive, contextually accurate Zero Trust-based approach to verifying privileged credentials. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment.

By taking a least privilege access approach, organizations can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse by recognizing that if a privileged user enters the right credentials but the request comes in with risky context, stronger verification is needed to permit access.

Strategies For Stopping Privileged Credential Abuse

The following are five strategies CIOs need to concentrate on to stop privileged credential abuse. Starting with an inventory of privileged accounts and progressing through finding the gaps in IT infrastructure that create opportunities for privileged credential abuse, CIOs and their teams need to take preemptive action now to avert potential breaches in the future.

In Part 1 of a CIO’s Guide to Stopping Privileged Access Abuse, below are the steps they can take to get started:

  1. Discover and inventory all privileged accounts and their credentials to define who is accountable for managing their security and use. According to a survey by Gartner, more than 65% of enterprises are allowing shared use of privileged accounts with no accountability for their use. CIOs realize that a lack of consistent governance policies creates many opportunities for privileged credential abuse. They’re also finding orphaned accounts, multiple owners for privileged credentials and the majority of system administrators having super user or root user access rights for the majority of enterprise systems.
  2. Vault your cloud platforms’ root accounts and federate access to AWS, Google Cloud Platform, Microsoft Azure, and other public cloud consoles. Root passwords on each of the cloud platforms your business relies on are the “keys to the kingdom” and give bad actors inside and outside the company the ability to exfiltrate data with ease. The recent news of how a fired employee deleted his former employer’s 23 AWS servers is a cautionary tale of what happens when a Zero Trust approach to privileged credentials isn’t adopted. Centrify’s survey found that 63% of organizations take more than a day to shut off privileged access for an employee after they leave the company. Given that AWS root user accounts have the privilege to delete all instances immediately, it’s imperative for organizations to have a password vault where AWS root account credentials are stored. Instead of local AWS IAM accounts and access keys, use centralized identities (e.g., Active Directory) and enable federated login. By doing so, you obviate the need for long-lived access keys; a short audit sketch for finding such keys follows this list.
  3. Audit privileged sessions and analyze patterns to find potential privileged credential sharing or abuse not immediately obvious from audits. Audit and log authorized and unauthorized user sessions across all enterprise systems, focusing especially on root password use across all platforms. Taking this step is essential for assigning accountability for each privileged credential in use. It will also tell you if privileged credentials are being shared widely across the organization. Taking a Zero Trust approach to securing privileged credentials will quickly find areas where there could be potential lapses or gaps that invite breaches. For AWS accounts, be sure to use AWS CloudTrail and Amazon CloudWatch to monitor all API activity across all AWS instances and your AWS account.
  4. Enforce least privilege access now within your existing infrastructure as much as possible, defining a security roadmap based on the foundations of Zero Trust as your future direction. Using the inventory of all privileged accounts as the baseline, update least privilege access on each credential now and implement a process for privilege elevation that will lower the overall risk and attackers’ ability to move laterally and extract data. The days of “trust but verify” are over. CIOs from insurance and financial services companies I’ve recently spoken with point out that their new business models, all of them heavily reliant on secured Internet connectivity, are making Zero Trust the cornerstone of their future services strategies. They’re all moving beyond “trust but verify” to adopt a more adaptive approach to knowing the risk context by threat surface in real time.
  5. Adopt multi-factor authentication (MFA) across all threat surfaces that can adapt and flex to the risk context of every request for resources. The CIOs running a series of insurance and financial services firms, a few of them former MBA students of mine, say multi-factor authentication is a must-have today for preventing privileged credential abuse. Their take on it is that adding in an authentication layer that queries users with something they know (user name, password, PIN or security question) with something they have (smartphone, one-time password token or smart card), something they are (biometric identification like fingerprint) and something they’ve done (contextual pattern matching of what they normally do where) has helped thwart privileged credential abuse exponentially since they adopted it. This is low-hanging fruit: adaptive MFA has made the productivity impact of this additional validation practically moot.
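
As a small, hedged illustration of step 2’s advice to move away from long-lived access keys, the sketch below uses boto3 (assuming AWS credentials with IAM read permissions are already configured) to list active access keys older than a chosen threshold. The 90-day threshold is my own illustrative choice, not an AWS or Centrify recommendation, and this is only an audit helper, not a complete PAM control.

```python
# Illustrative audit sketch: find long-lived, active IAM access keys with boto3.
from datetime import datetime, timezone

import boto3

MAX_KEY_AGE_DAYS = 90  # illustrative threshold, not a default of any kind

def find_stale_access_keys():
    """Yield (user_name, access_key_id, age_in_days) for old, active access keys."""
    iam = boto3.client("iam")
    now = datetime.now(timezone.utc)
    for page in iam.get_paginator("list_users").paginate():
        for user in page["Users"]:
            keys = iam.list_access_keys(UserName=user["UserName"])
            for key in keys["AccessKeyMetadata"]:
                age_days = (now - key["CreateDate"]).days
                if key["Status"] == "Active" and age_days > MAX_KEY_AGE_DAYS:
                    yield user["UserName"], key["AccessKeyId"], age_days

if __name__ == "__main__":
    for user_name, key_id, age in find_stale_access_keys():
        print(f"{user_name}: active access key {key_id} is {age} days old")
```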

Conclusion

Every CIO I know is now expected to be a business strategist first and a technologist second. At the top of many of their lists of priorities is securing the business so it can achieve uninterrupted growth. The CIOs I regularly speak with who run insurance and financial services companies often say that security is as much a part of their new business strategies as the financial products their product design teams are developing. The bottom line is that the more adaptive a company’s access management posture becomes, and the better it can assess the risk context of each privileged access attempt, the more responsive the company can be to employees and customers alike, fueling future growth.

Machine Learning Engineer Is The Best Job In The U.S. According To Indeed

  • Machine Learning Engineer job openings grew 344% between 2015 to 2018, and have an average base salary of $146,085.
  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech.
  • The average base salary of the 25 best jobs in the U.S. according to Indeed is $104,825, and the median base salary is $99,007.
  • Agile Coach is the highest paying job with an average base salary of $161,377.
  • 9 of the top 25 jobs in the U.S. this year are in tech fields according to Indeed.
  • Five jobs are heavily dependent on applicants’ Artificial Intelligence (AI) skills and expertise.

These and many other insights are from Indeed’s The Best Jobs in the U.S.: 2019 study, released this week. Indeed defined the best jobs as those experiencing the fastest growth, measured by the increase in job postings between 2015 and 2018, in conjunction with those offering the highest pay, using a baseline salary of $75,000. Indeed’s best jobs of 2019 data set is available here in Microsoft Excel.

Key insights from Indeed’s ranking of the best jobs of 2019 include the following:

  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech according to Indeed, followed by Machine Learning Engineers with a base salary of $146,085. The average base pay of the nine tech-related jobs that made Indeed’s list is $122,761, above the median salary of $99,007 for the entire group of the top 25 jobs. Indeed’s top 25 jobs for 2019 are illustrated below in descending salary order with the median salary providing a benchmark across the group. Please click on the graphic to expand for easier reading.

  • Three of the top 10 fastest growing jobs as measured by percentage growth in the number of job postings are in tech. From 2015 to 2018, job postings for Machine Learning Engineers grew 344%, followed by Full-stack developers (206%) and Salesforce developers (129%). In aggregate, postings for all nine technology-related jobs increased by 146% between 2015 and 2018. The graphic below illustrates the percentage of growth in the number of postings between 2015 and 2018. Please click on the graphic to expand for easier reading.

  • Comparing average base salary to percentage growth in job postings underscores the exceptionally high demand for Machine Learning Engineers in 2019. Technical professionals with machine learning expertise are in an excellent position today to negotiate base salaries at or above the average of $146,085. Full-stack developers and Salesforce developers are in such high demand that professionals with skills and experience in these areas can command salaries above the average base. The following graphic compares the average base salary to percentage growth in job postings for the years 2015 – 2018. Please click on the graphic to expand for easier reading.

74% Of Data Breaches Start With Privileged Credential Abuse


Enterprises that prioritize privileged credential security are creating a formidable competitive advantage over their peers, ensuring operations won’t be interrupted by a breach. However, there’s a widening gap between the businesses protected from a breach and the many that aren’t. To quantify this gap, consider that the typical U.S.-based enterprise will lose on average $7.91M from a breach, nearly double the global average of $3.68M, according to IBM’s 2018 Data Breach Study.

Further insights into how wide this gap is are revealed in Centrify’s Privileged Access Management in the Modern Threatscape survey results, published today. The study is noteworthy because it illustrates the distance between enterprises’ ability to avert and thwart breaches and their current levels of Privileged Access Management (PAM) and privileged credential security. 74% of IT decision makers surveyed whose organizations have been breached in the past say it involved privileged access credential abuse, yet just 48% have a password vault, just 21% have multi-factor authentication (MFA) implemented for privileged administrative access, and 65% share root or privileged access to systems and data at least somewhat often.

Addressing these three areas with a Zero Trust approach to PAM would make an immediate difference in security.

“What’s alarming is that the survey reveals many organizations, armed with the knowledge that they have been breached before, are doing too little to secure privileged access. IT teams need to be taking their Privileged Access Management much more seriously, and prioritizing basic PAM strategies like vaults and MFA while reducing shared passwords,” remarked Tim Steinkopf, Centrify CEO. FINN Partners, on behalf of Centrify, surveyed 1,000 IT decision makers (500 in the U.S. and 500 in the U.K.) online in October 2018. Please see the study here for more on the methodology.

How You Choose To Secure Privileged Credentials Determines Your Future 

Identities are the new security perimeter. Threats can emerge within and outside any organization, at any time. Bad actors, or those who want to breach a system for financial gain or to harm a business, aren’t just outside. 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey.

Attackers are increasingly logging in using weak, stolen, or otherwise compromised credentials. Centrify’s survey underscores how much room for improvement most organizations’ IT departments have when it comes to protecting privileged access credentials, which are the ‘keys to the kingdom.’ Reading the survey makes clear that forward-thinking enterprises that prioritize privileged credential security gain major cost and time advantages over their competitors. They’re able to keep their momentum going across every area of their business by not having to recover from breaches or incur millions of dollars in losses or fines as the result of a breach.

One of the most promising approaches to securing every privileged identity and threat surface within and outside an organization is Zero Trust Privilege (ZTP). ZTP enables an organization’s IT team to grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.
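
As a rough illustration of those three checks, the sketch below uses assumed role names, an assumed task-to-role map, and an assumed list of trusted networks rather than any actual ZTP product API. It verifies the requester, weighs the risk of the access environment, and then returns only the narrowest role that covers the requested task:

```python
# Illustrative-only sketch of a Zero Trust Privilege style check: verify who is
# asking, evaluate the request context, and grant only the narrowest role that
# covers the task. Role names, task map, and network list are hypothetical.
from typing import Optional

# Least-privilege role map: task -> minimal role that can perform it (assumed names)
TASK_ROLES = {
    "read_logs": "log-viewer",
    "restart_service": "service-operator",
    "rotate_keys": "key-admin",
}

TRUSTED_NETWORKS = {"corp-vpn", "office-lan"}  # assumed lower-risk access environments


def grant_least_privilege(identity_verified: bool,
                          mfa_passed: bool,
                          network: str,
                          task: str) -> Optional[str]:
    """Return the minimal role to grant, or None to deny the request."""
    if not identity_verified:                                # who is requesting access
        return None
    if network not in TRUSTED_NETWORKS and not mfa_passed:   # risk of the environment
        return None                                          # unknown network, no MFA -> deny
    return TASK_ROLES.get(task)                              # request context -> narrowest role


if __name__ == "__main__":
    print(grant_least_privilege(True, True, "coffee-shop-wifi", "rotate_keys"))   # key-admin
    print(grant_least_privilege(True, False, "coffee-shop-wifi", "rotate_keys"))  # None
```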

Key Lessons Learned from the Centrify Survey

The results of Centrify’s Privileged Access Management in the Modern Threatscape survey reflect how wide the gap is between organizations that see identities as the new security perimeter and are adopting a Zero Trust approach to securing them, and those that aren’t. The following are the key lessons learned about where and how organizations can begin to close the security gaps that leave them vulnerable to privileged credential abuse and many other potential threats:

  • Organizations’ most technologically advanced areas that are essential for future growth and attainment of strategic goals are often the most unprotected. Big Data, cloud, containers and network devices are the most important areas of any IT infrastructure. According to Centrify’s survey, they are the most unprotected as well. 72% of organizations aren’t securing containers with privileged access controls. 68% are not securing network devices like hubs, switches, and routers with privileged access controls. 58% are not securing Big Data projects with privileged access controls. 45% are not securing public and private cloud workloads with privileged access controls. The study finds that U.K.-based businesses lag U.S.-based ones in each of these areas, as the graphic below shows:

  • Only 36% of U.K. organizations are very confident in their company’s current IT security software strategies, compared to 65% in the U.S. The gap between U.K.- and U.S.-based businesses with hardened security strategies, the kind most likely to withstand breach attempts, is wide. 44% of U.K. respondents weren’t sure what Privileged Access Management is, versus 26% of U.S. respondents, and 60% of U.K. respondents don’t have a password vault.

  • Just 35% of U.S. organizations and 30% of those in the U.K. rely on Privileged Access Management to manage partners’ access to privileged credentials and infrastructure. Partners are indispensable for scaling any new business strategy and for expanding an existing one across new markets and countries. Forward-thinking organizations treat every partner associate’s identity as a new security perimeter. The 35% of U.S.-based organizations doing this have an immediate competitive advantage over the 65% that aren’t. By enforcing PAM across their alliances and partnerships, organizations can achieve uninterrupted growth by eliminating expensive and time-consuming breaches that many businesses never fully recover from.
  • Organizations’ top five security projects for 2019 include protecting cloud data, preventing data leakage, analyzing security incidents, improving security education/awareness, and encrypting data. All five could be achieved at scale by having IT teams implement a Zero Trust-based approach to Privileged Access Management (PAM). The time, cost, and scale advantages of completing the top five security projects using Zero Trust would free up IT teams to focus on projects that deliver direct revenue gains, for example.

Conclusion

Centrify’s survey shows organizations are granting too much trust and privilege, opening themselves up to potential internal and externally driven breaches initiated with compromised privileged access credentials. It also reveals a strong desire to adhere to best practices when it comes to PAM (51% of respondents), and that the reason PAM is not being adequately implemented rarely has to do with prioritization or difficulty, but rather with budget constraints and a lack of executive buy-in.

The survey also shows U.K.- and U.S.-based organizations need to realize identity is the new security perimeter. For example, only 37% of respondents’ organizations can turn off privileged access for an employee who leaves the company within one day, leaving a wide-open exposure point that can continue to be exploited.
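
Same-day deprovisioning is largely an automation problem. The following sketch is purely illustrative: the hr_feed records and the stubbed revoke_credential call are placeholders standing in for whatever HR system and PAM/directory APIs an organization actually runs. It disables every privileged credential tied to an identity the day HR marks it terminated:

```python
# Hypothetical sketch of same-day privileged-access deprovisioning: when HR
# marks an employee as terminated, revoke every privileged credential tied to
# that identity. hr_feed and revoke_credential are placeholders for the HR
# system and PAM/directory APIs an organization actually uses.
from datetime import date


def terminated_today(hr_feed: list[dict]) -> list[str]:
    """Return user IDs whose termination date is today."""
    today = date.today().isoformat()
    return [rec["user_id"] for rec in hr_feed if rec.get("termination_date") == today]


def revoke_credential(user_id: str, credential: str) -> None:
    """Placeholder for the PAM or directory API call that disables access."""
    print(f"revoked {credential} for {user_id}")


def deprovision(hr_feed: list[dict], privileged_inventory: dict[str, list[str]]) -> None:
    """Disable all privileged credentials for employees terminated today."""
    for user_id in terminated_today(hr_feed):
        for credential in privileged_inventory.get(user_id, []):
            revoke_credential(user_id, credential)


if __name__ == "__main__":
    hr = [{"user_id": "jdoe", "termination_date": date.today().isoformat()}]
    inventory = {"jdoe": ["prod-db-admin", "vpn-admin", "aws-root-alias"]}
    deprovision(hr, inventory)
```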

There are also forward-thinking organizations that rely on Zero Trust Privilege as a core part of their digital transformation efforts. The survey found that, given a choice, respondents are most likely to say digital transformation (40%) is one of the top three projects they’d prefer to work on, followed by Endpoint Security (37%) and Privileged Access Management (28%). Many enterprises see Zero Trust as digital transformation’s missing link and as the foundation for redefining their businesses, treating every identity as a new security perimeter so they can scale securely and grow faster than before.
