
Top 10 Cybersecurity Companies To Watch In 2019

Today’s Threatscape Has Made “Trust But Verify” Obsolete 

The threatscape every business operates in today is proving the old model of “trust but verify” obsolete and in need of a complete overhaul. To compete and grow in the increasingly complex and lethal threatscape of today, businesses need more adaptive, contextually intelligent security solutions based on the Zero Trust Security framework. Zero Trust takes a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. John Kindervag was the first to see how urgent the need was for enterprises to change their approach to cybersecurity, so he created the Zero Trust Security framework in 2010 while at Forrester. Chase Cunningham, Principal Analyst at Forrester, is a mentor to many worldwide wanting to expand their knowledge of Zero Trust and frequently speaks and writes on the topic. If you are interested in cybersecurity in general and Zero Trust specifically, be sure to follow his blog.

AI and machine learning applied to cybersecurity’s most significant challenges are creating a proliferation of commercially successful, innovative platforms. The size and scale of deals in cybersecurity continue to accelerate, the largest being BlackBerry’s acquisition of Cylance for $1.4B in cash, which closed in February of this year. TD Ameritrade’s annual survey of registered investment advisors (RIAs) showed nearly a 6X jump in cybersecurity investments this year compared to 2018.

The top ten cybersecurity companies below reflect the speed and scale of innovation that is driving the highest levels of investment this industry has ever seen. The following are the top ten cybersecurity companies to watch in 2019:

Absolute (ABT.TO)  – One of the world’s leading commercial enterprise security solutions, serving as the industry benchmark for endpoint resilience, visibility, and control. The company enables more than 12,000 customers with self-healing endpoint security, always-connected visibility into their devices, data, users, and applications whether endpoints are on or off the network, and the ultimate level of control and confidence required for the modern enterprise. Embedded in over one billion endpoint devices, Absolute delivers intelligence and real-time remediation capabilities that equip enterprises to stop data breaches at the source.

To thwart attackers, organizations continue to layer on security controls — Gartner estimates that more than $124B will be spent on security in 2019 alone. Absolute’s 2019 Endpoint Security Trends Report finds that much of that spend is in vain, however, revealing that 70% of all breaches still originate on the endpoint. The problem is complexity at the endpoint: it causes security agents to fail invariably, reliably, and predictably.

Absolute’s research found that 42% of all endpoints are unprotected at any given time, and 100% of endpoint security tools eventually fail. As a result, IT leaders see a negative ROI on their security spend. What makes Absolute one of the top 10 security companies to watch in 2019 is their purpose-driven design to mitigate this universal law of security decay.

Enterprises rely on Absolute to cut through the complexity to identify failures, model control options, and refocus security intent. Rather than perpetuating organizations’ false sense of security, Absolute enables uncompromised endpoint persistence, builds resilience and delivers the intelligence needed to ensure security agents, applications, and controls continue functioning and deliver value as intended. Absolute has proven very effective in validating safeguards, fortifying endpoints, and stopping data security compliance failures. The following is an example of the Absolute platform at work:

BlackBerry Artificial Intelligence and Predictive Security – BlackBerry is noteworthy for how quickly they are reinventing themselves into an enterprise-ready cybersecurity company, independent of the Cylance acquisition. Paying $1.4B in cash for Cylance brings much-needed AI and machine learning expertise to their platform portfolio, an acquisition that BlackBerry is moving quickly to integrate into their product and service strategies. BlackBerry Cylance uses AI and machine learning to protect the entire attack surface of an enterprise with automated threat prevention, detection, and response capabilities. Cylance was also the first company to apply artificial intelligence, algorithmic science, and machine learning to cybersecurity, improving the way companies, governments, and end users proactively solve the world’s most challenging security problems. Using a breakthrough mathematical process, BlackBerry Cylance quickly and accurately identifies what is safe and what is a threat, not just what is in a blacklist or whitelist. By coupling sophisticated math and machine learning with a unique understanding of a hacker’s mentality, BlackBerry Cylance provides the technology and services to be truly predictive and preventive against advanced threats. The following screen from CylancePROTECT provides an executive summary of CylancePROTECT usage, from the number of zones and devices to the percentage of devices covered by Auto-Quarantine and Memory Protection, Threat Events, Memory Violations, Agent Versions, and Offline Days for devices.

Centrify –  Centrify is redefining the legacy approach to Privileged Access Management by delivering cloud-ready Zero Trust Privilege to secure modern enterprise attack surfaces. Centrify Zero Trust Privilege helps customers grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment. Industry research firm Gartner predicted Privileged Access Management (PAM) to be the second-fastest growing segment for information security and risk management spending worldwide in 2019 in their recent Forecast Analysis: Information Security and Risk Management, Worldwide, 3Q18 Update (client access required). By implementing least privilege access, Centrify minimizes the attack surface, improves audit and compliance visibility, and reduces risk, complexity, and costs for the modern, hybrid enterprise. Over half of the Fortune 100, the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, all trust Centrify to stop the leading cause of breaches – privileged credential abuse. PAM was also named a Top 10 security project for 2019 in Gartner’s Top 10 Security Projects for 2019 (client access required).

Cloudflare – Cloudflare is a web performance and security company that provides online services to protect and accelerate websites. Its online platforms include Cloudflare CDN, which distributes content around the world to speed up websites; Cloudflare Optimizer, which enables web pages with ad servers and third-party widgets to download Snappy software on mobiles and computers; Cloudflare Security, which protects websites from a range of online threats including spam, SQL injection, and DDoS; Cloudflare Analytics, which gives insight into a website’s traffic, including threats and search engine crawlers; Keyless SSL, which allows organizations to keep secure sockets layer (SSL) keys private; and Cloudflare applications, which help its users install web applications on their websites.

CrowdStrike – Applying machine learning to endpoint detection of IT network threats is how CrowdStrike is differentiating itself in the rapidly growing cybersecurity market today. It’s also one of the top 25 machine learning startups to watch in 2019. CrowdStrike is credited with uncovering Russian hackers inside the servers of the US Democratic National Committee. The company’s IPO was last Tuesday night at an initial price of $34 per share. The IPO generated $610M, with the company’s valuation at one point reaching nearly $7B. Their Falcon platform stops breaches by detecting all attack types, even malware-free intrusions, providing five-second visibility across all current and past endpoint activity while reducing cost and complexity for customers. CrowdStrike’s Threat Graph provides real-time analysis of data from endpoint events across the global crowdsourcing community, allowing detection and prevention of attacks based on patented behavioral pattern recognition technology.

Hunters.AI – Hunters.AI excels at autonomous threat hunting with a system that connects to multiple channels within an organization and detects the signs of potential cyber-attacks. They are one of the top 25 machine learning startups to watch in 2019. What makes this startup one of the top ten cybersecurity companies to watch in 2019 is their innovative approach to creating AI- and machine learning-based algorithms that continually learn from an enterprise’s existing security data. Hunters.AI generates and delivers visualized attack stories, allowing organizations to more quickly and effectively identify, understand, and respond to attacks. Early customers include Snowflake Computing, whose VP of Security recently said, “Hunters.AI identified the attack in minutes. In my 20 years in security, I have not seen anything as effective, fast, and with high fidelity as what Hunters can do.” The following is a graphic overview of how their system works:

Idaptive – Idaptive is noteworthy for the Zero Trust approach they are taking to protecting organizations across every threat surface they rely on to operate their businesses daily. Idaptive secures access to applications and endpoints by verifying every user, validating their devices, and intelligently limiting their access. Their product and services strategy reflects a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. The Idaptive Next-Gen Access platform combines single sign-on (SSO), adaptive multi-factor authentication (MFA), enterprise mobility management (EMM), and user behavior analytics (UBA). They have over 2,000 organizations using their platform today. Idaptive was spun out from Centrify on January 1st of this year.

Kount – Kount has successfully differentiated itself in an increasingly crowded cybersecurity marketplace by providing fraud management, identity verification, and online authentication technologies that enable digital businesses, online merchants, and payment service providers to identify and thwart a wide spectrum of threats in real time. Kount has been able to show through customer references that its customers can approve more orders, uncover new revenue streams, and dramatically improve their bottom line, all while minimizing fraud management costs and losses. Through Kount’s global network and proprietary AI and machine learning technologies, combined with policy and rules management, its customers thwart online criminals and bad actors, driving them away from their sites, marketplaces, and networks. Kount’s continuously adaptive platform learns of new threats and continuously updates risk scores to further thwart breach and fraud attempts. Kount’s advances in both proprietary techniques and patented technology include superior mobile fraud detection, advanced artificial intelligence, multi-layer device fingerprinting, IP proxy detection and geolocation, transaction and custom scoring, global order linking, business intelligence reporting, comprehensive order management, and professional and managed services. Kount protects over 6,500 brands today.

MobileIron – The acknowledged leader in Mobile Device Management software, MobileIron’s latest series of developments makes them noteworthy and one of the top ten cybersecurity companies to watch in 2019. MobileIron was the first to deliver key innovations such as multi-OS mobile device management (MDM), mobile application management (MAM), and BYOD privacy controls. Last month MobileIron introduced zero sign-on (ZSO), built on the company’s unified endpoint management (UEM) platform and powered by the MobileIron Access solution. “By making mobile devices your identity, we create a world free from the constant pains of password recovery and the threat of data breaches due to easily compromised credentials,” wrote Simon Biddiscombe, MobileIron’s President and Chief Executive Officer, in his recent blog post, Single sign-on is still one sign-on too many. Simon’s latest post, MobileIron: We’re making history by making passwords history, provides the company’s vision going forward with ZSO. Zero sign-on eliminates passwords as the primary method for user authentication, unlike single sign-on, which still requires at least one username and password. MobileIron paved the way for a zero sign-on enterprise with its Access product in 2017, which enabled zero sign-on to cloud services on managed devices. Thanks to MobileIron Zero Sign-On, enterprise security teams no longer have to trade off security for a better user experience.

Sumo Logic – Sumo Logic is a fascinating cybersecurity company to track because it shows the ability to take on large-scale enterprise security challenges and turn them into a competitive advantage. An example of this is how quickly the company achieved FedRAMP Ready Designation, getting listed in the FedRAMP Marketplace. Sumo Logic is a secure, cloud-native, machine data analytics service, delivering real-time, continuous intelligence from structured, semi-structured, and unstructured data across the entire application lifecycle and stack. More than 2,000 customers around the globe rely on Sumo Logic for the analytics and insights to build, run, and secure their modern applications and cloud infrastructures. With Sumo Logic, customers gain a multi-tenant, service-model advantage to accelerate their shift to continuous innovation, increasing competitive advantage, business value, and growth. Founded in 2010, Sumo Logic is a privately held company based in Redwood City, Calif. and is backed by Accel Partners, Battery Ventures, DFJ, Franklin Templeton, Greylock Partners, IVP, Sapphire Ventures, Sequoia Capital, Sutter Hill Ventures and Tiger Global Management.


Machine Learning Is Helping To Stop Security Breaches With Threat Analytics

Bottom Line: Machine learning is enabling threat analytics to deliver greater precision regarding the risk context of privileged users’ behavior, creating notifications of risky activity in real time, while also being able to actively respond to incidents by cutting off sessions, adding additional monitoring, or flagging for forensic follow-up.

Separating Fact from Fiction in Security Hacks

It’s time to demystify the scale and severity of breaches happening globally today. A commonly held misconception is that millions of hackers have gone to the dark side and are orchestrating massive attacks on any and every business that is vulnerable. The facts reflect a much more brutal truth: businesses make themselves easy to hack into by not protecting their privileged access credentials. Cybercriminals aren’t expending the time and effort to hack into systems; they’re looking for ingenious ways to steal privileged access credentials and walk in the front door. According to Verizon’s 2019 Data Breach Investigations Report, ‘Phishing’ (as a precursor to credential misuse), ‘Stolen Credentials’, and ‘Privilege Abuse’ account for the majority of threat actions in breaches (see page 9 of the report).

It only really takes one compromised credential to potentially impact millions — whether it’s millions of individuals or millions of dollars. Undeniably, identities and the trust we place in them are being used against us. They have become the Achilles heel of our cybersecurity practices. According to a recent study by Centrify among 1,000 IT decision makers, 74% of respondents whose organizations have been breached acknowledged that it involved access to a privileged account. This number closely aligns with Forrester Research’s estimate “that at least 80% of data breaches . . . [involved] compromised privileged credentials, such as passwords, tokens, keys, and certificates.”

While the threat actors might vary, Verizon’s 2019 Data Breach Investigations Report found that cyber adversaries’ tactics, techniques, and procedures are the same across the board. Verizon also found that the fastest-growing source of threats is internal actors, as the graphic from the study below illustrates:


Internal actors are the fastest growing source of breaches because they’re able to obtain privileged access credentials with minimal effort, often obtaining them through legitimate access requests to internal systems or harvesting their co-workers’ credentials by going through the sticky notes in their cubicles. Privileged credential abuse is a challenge to detect as legacy approaches to cybersecurity trust the identity of the person using the privileged credentials. In effect, the hacker is camouflaged by the trust assigned to the privileged credentials they have and can roam internal systems undetected, exfiltrating sensitive data in the process.

The reality is that many breaches can be prevented by some of the most basic Privileged Access Management (PAM) tactics and solutions, coupled with a Zero Trust approach. Yet most organizations invest the largest chunk of their security budget in protecting their network perimeter rather than in the security controls that can effect positive change and protect against the leading attack vector: privileged access abuse.

The bottom line is that investing in securing perimeters leaves the most popular attack vector of all unprotected: privileged credentials. Making PAM a top priority is crucial to protect any business’s most valuable assets: its systems, data, and the intelligence they provide. Gartner has listed PAM on its Top 10 Security Projects for the past two years for good reason.

Part of a cohesive PAM strategy should include machine learning-based threat analytics to provide an extra layer of security that goes beyond a password vault, multi-factor authentication (MFA), or privilege elevation.

How Machine Learning and Threat Analytics Stop Privileged Credential Abuse 

Machine learning algorithms enable threat analytics to immediately detect anomalies and abnormal behavior by tracking login behavioral patterns, geolocation, time of login, and many other variables to calculate a risk score. Risk scores are calculated in real time and determine whether access is approved, additional authentication is required, or the request is blocked entirely.
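The following is a minimal sketch of how such a risk score might be produced and mapped to an access decision, using scikit-learn’s IsolationForest. The feature set, thresholds, and decision boundaries are illustrative assumptions, not any vendor’s actual model.

    # Minimal sketch: anomaly-based risk scoring of login attempts.
    # Features, thresholds, and model choice are illustrative assumptions,
    # not any specific vendor's implementation.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # One row per historical login: [hour_of_day, km_from_usual_location,
    # new_device (0/1), failed_attempts_last_hour]
    history = np.array([
        [9, 2, 0, 0], [10, 0, 0, 0], [14, 5, 0, 1], [9, 1, 0, 0],
        [11, 3, 0, 0], [10, 0, 0, 0], [15, 4, 0, 0], [9, 2, 0, 0],
    ])

    model = IsolationForest(contamination=0.05, random_state=0).fit(history)

    def risk_decision(attempt):
        """Map an anomaly score to allow / step-up authentication / block."""
        score = model.decision_function([attempt])[0]  # lower = more anomalous
        if score > 0.05:
            return "allow"
        if score > -0.05:
            return "require additional authentication"
        return "block"

    print(risk_decision([10, 1, 0, 0]))    # a typical login pattern
    print(risk_decision([3, 9500, 1, 6]))  # 3 a.m., far away, new device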

Machine learning-based threat analytics also provide the following benefits:

  • New insights into privileged user access activity based on real-time data related to unusual recent privilege changes, commands run, targets accessed, and privilege elevations.
  • Greater understanding of and insight into the specific risk nature of events, with a risk score computed in real time for every event and expressed as high, medium, or low for any anomalous activity.
  • Isolate, identify, and track which security factors triggered an anomaly alert.
  • Capture, play, and analyze video sessions of anomalous events within the same dashboard used for tracking overall security activity.
  • Create customizable alerts that provide context-relevant visibility and session recording and can also deliver notifications of anomalies, all leading to quicker, more informed investigative action.

What to Look for in Threat Analytics
Threat analytics providers are capitalizing on machine learning to continually improve the predictive accuracy and usability of their applications. What matters most is that any threat analytics application or solution you’re considering provides context-aware access decisions in real time. The best threat analytics applications on the market today use machine learning as the foundation of their threat analytics engine. These machine learning-based engines are very effective at profiling the normal behavior pattern for any user on any login attempt, or any privileged activity including commands, identifying anomalies in real time to enable risk-based access control. High-risk events are immediately flagged, alerted on, and elevated to IT’s attention, speeding analysis and greatly minimizing the effort required to assess risk across today’s hybrid IT environments.

The following is the minimum set of features to look for in any privilege threat analytics solution:

  • Immediate visibility with a flexible, holistic view of access activity across an enterprise-wide IT network and extended partner ecosystem. Look for threat analytics applications that provide dashboards and interactive widgets to better understand the context of IT risk and access patterns across your IT infrastructure, and that give you the flexibility of tailoring security policies to every user’s behavior, automatically flagging risky actions or access attempts so that you gain immediate visibility into account risk and eliminate the overhead of sifting through millions of log files and massive amounts of historical data.
  • Intuitively designed and customizable threat monitoring and investigation screens, workflows, and modules. Machine learning is enabling threat analytics applications to deliver more contextually relevant and data-rich insights than has ever been possible. Look for threat analytics vendors who offer intuitively designed and customizable threat monitoring features that provide insights into anomalous activity with a detailed timeline view. The best threat analytics vendors can identify the specific factors contributing to an anomaly for a comprehensive understanding of a potential threat, all from a single console. Security teams can then view system access and anomaly detections in high resolution with analytics tools such as dashboards, explorer views, and investigation tools.
  • Support for easy integration with Security Information and Event Management (SIEM) tools. Privileged access data is captured and stored to enable querying by log management and SIEM reporting tools. Make sure any threat analytics application you’re considering has installed and working integrations with SIEM tools and platforms such as Micro Focus® ArcSight™, IBM® QRadar™, and Splunk® to identify risks or suspicious activity quickly.
  • Support for alert notification through integration with webhook-enabled endpoints. Businesses getting the most value out of their threat analytics applications integrate them with Slack or existing incident response systems such as PagerDuty to enable real-time alert delivery, eliminating the need for multiple alert touch points and improving time to respond. When an alert event occurs, the threat analytics engine allows the user to send alerts into third-party applications via webhook, so the user can respond to a threat alert and contain the impact of a breach attempt. A minimal sketch of such a webhook push follows this list.
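A webhook integration like the one described in the last bullet can be as simple as an HTTP POST. The following sketch uses Python’s standard library against a Slack-style incoming webhook; the URL and alert fields are placeholders, not a real product integration.

    # Minimal sketch: pushing a threat-analytics alert to a webhook endpoint.
    # The webhook URL and alert fields below are placeholders.
    import json
    import urllib.request

    WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

    def send_alert(user, event, risk_level):
        payload = {"text": f"[{risk_level.upper()}] anomalous activity: "
                           f"{event} by privileged user {user}"}
        req = urllib.request.Request(
            WEBHOOK_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status  # 200 means the endpoint accepted the alert

    send_alert("svc-db-admin", "privilege elevation outside change window", "high")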

Conclusion 
Centrify, Forrester, Gartner, and Verizon have each used different methodologies and reached the same conclusion from their research: privileged access abuse is the most commonly used tactic for hackers to exfiltrate sensitive data. Breaches based on privileged credential abuse are extremely difficult to stop, as these credentials often have the greatest levels of trust and access rights associated with them. Threat analytics applications that use machine learning adept at finding anomalies in behavioral data, and that thwart a breach by denying access, are proving very effective against privileged credential abuse.

Companies, including Centrify, use risk scoring combined with adaptive MFA to empower a least-privilege access approach based on Zero Trust. This Zero Trust Privilege approach verifies who or what is requesting privileged access, the context behind the request, and the risk of the access environment to enforce least privilege. These are the foundations of Zero Trust Privilege and are reflected in how threat analytics apps are being created and improved today.

Smart Machines Are The Future Of Manufacturing


  • Industrial Internet of Things (IIoT) presents integration architecture challenges that once solved can enable use cases that deliver fast-growing revenue opportunities.
  • ISA-95 addressed the rise of global production and distributed supply chains yet is still deficient on data and security, specifically the proliferation of IIoT sensors, which are the real security perimeter of any manufacturing business.
  • Predictive maintenance and cross-vendor shop floor integration are the most promising applications.
  • IIoT manufacturing systems are quickly becoming digital manufacturing platforms that integrate ERP, MES, PLM and CRM systems to provide a single unified view of product configurations.

These and many other fascinating insights are from an article McKinsey published titled IIoT platforms: The technology stack as value driver in industrial equipment and machinery, which explores how the Industrial Internet of Things (IIoT) is redefining industrial equipment and machinery manufacturing. It’s based on a thorough study also published this month, Leveraging Industrial Software Stack Advancement For Digital Transformation. A copy of the study is downloadable here (PDF, 50 pp., no opt-in). The study shows how smart machines are the future of manufacturing, exploring how IIoT platforms are enabling greater machine-level autonomy and intelligence.

The following are the key takeaways from the study:

  • Capturing IIoT’s full value potential will require more sophisticated integrated approaches than current automation protocols provide. IIoT manufacturing systems are quickly becoming digital manufacturing platforms that integrate ERP, MES, PLM, and CRM systems to provide a single unified view of product configurations and support the design-to-manufacturing process. Digital manufacturing platforms are already enabling real-time monitoring down to the machine and shop floor level. The data streams that real-time monitoring delivers today are the catalyst for greater real-time analytics accuracy, broader machine learning adoption and precision, and a wider integration strategy down to the PLC level on legacy machinery.

  • Inconsistent data structures at the machine, line, factory, and company levels are slowing down data flows and making full transparency difficult to attain at many manufacturers today. Smart machines with their own operating systems that orchestrate IIoT data and ensure data structure accuracy are being developed and sold now, making this growth constraint less of an issue. The millions of legacy industrial manufacturing systems will continue to impede IIoT realizing its full potential, however. The following graphic reflects the complexities of making an IIoT platform consistent across a manufacturing operation.

  • Driven by price wars and commoditized products, manufacturers have no choice but to pursue smart, connected machinery that enables IIoT technology stacks across shop floors. The era of smart, connected machines is here, bringing with it the need to grow services and software revenue faster than transaction-based machinery sales. Machinery manufacturers are having to rethink their business models and redefine product strategies to concentrate on operating system-like functionality at the machine level that can scale and provide a greater level of autonomy, real-time data streams that power more accurate predictive maintenance, and cross-vendor shop floor integration.

  • Machines are being re-engineered starting with software and services as the primary design goals to support new business models. Machinery manufacturers are redefining existing product lines to be more software- and services-centric. A few are attempting to launch subscription-based business models that enable them to sell advanced analytics of machinery performance to customers. The resulting IIoT revenue growth will be driven by platforms as well as software and application development and is expected to be in the range of 20 to 35%.

What Matters Most In Business Intelligence, 2019

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019.
  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today.
  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout enterprises today.
  • Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing.

These and many other fascinating insights are from Dresner Advisory Associates’ 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralized strategies to tactical ones that seek to improve daily operations. The Dresner research team’s broad assessment of the BI market makes this report unique, including their use of visualizations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm’s research community of over 5,000 organizations as well as vendors’ customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.

Key insights from the study include the following:

  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout their enterprises today. More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have increased their influence over BI adoption more than any other department included in the current and previous surveys. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into daily use.

  • Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing. Retail/Wholesale and Tech companies’ sales leadership is primarily driving BI adoption in their respective industries. It’s not surprising to see that the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management is most likely to drive business intelligence in consulting practices.

  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today. Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner’s research team found that reporting retains all-time high scores as the top priority, and that data storytelling, governance, and data catalogs hold momentum.

  • BI software providers most commonly rely on executive-level personas to design their applications and add new features. Dresner’s research team found that all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable that BI vendors selling to them focus primarily on customer personas. The following graphic compares targeted users for BI by industry.

  • Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management. Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies, cost reductions, or competitive advantages. Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from product to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data that serves as the foundation for future services strategies.

  • In aggregate, BI is achieving its highest levels of adoption in R&D, Executive Management, and Operations departments today. The growing complexity of products and business models in tech companies, increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.

  • Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration. Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

  • Marketing/Sales and Operations are using the greatest variety of BI tools today. The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments: every one of them knows how many BI tools are deployed in their departments, and most likely which types. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are most likely to report they have multiple tools in use.

How The Top 21% Of PAM-Mature Enterprises Are Thwarting Privileged Credential Breaches

  • Energy, Technology & Finance are the most mature industries when it comes to Privileged Access Management (PAM) adoption and uses, outscoring peer industries by a wide margin.
  • 58% of organizations do not use Multi-Factor Authentication (MFA) for privileged administrative access to servers, leaving their IT systems and infrastructure exposed to hacking attempts, including unchallenged privileged access abuse.
  • 52% of organizations are using shared accounts for controlling privileged access, increasing the probability of privileged credential abuse.

These and many other fascinating insights are from the recently published Centrify 2019 Zero Trust Privilege Maturity Model Report, created in partnership with Techvangelism. You can download a copy of the study here (PDF, 22 pp., no opt-in). Over 1,300 organizations from 11 industries participated in the survey, with Technology, Finance, and Healthcare comprising 50% of all participating organizations. Please see page 4 of the study for additional details regarding the methodology.

What makes this study noteworthy is that it’s the first of its kind to create a Zero Trust Privilege Maturity Model designed to help organizations better understand and define their ability to discover, protect, secure, manage, and provide privileged access. Also, this model can be used to help mature existing security implementations towards one that provides the greatest level of protection of identity, privileged access, and its use.

Key takeaways from the study include the following:

  • The top 21% of enterprises who excel at thwarting privileged credential breaches share a common set of attributes that differentiate them from their peers. Enterprises who most succeed at stopping security breaches have progressed beyond vault- and identity-centric techniques by hardening their environments through the use of centralized management of service and application accounts and enforcing host-based session, file, and process auditing. In short, the most secure organizations globally have reached a level of Privileged Access Management (PAM) maturity that reduces the probability of a breach successfully occurring due to privileged credential abuse.

  • Energy, Technology & Finance are the most mature industries adopting Privileged Access Management (PAM), outscoring peer industries by a wide margin. Government, Education, and Manufacturing are the industries most lagging in their adoption of Zero Trust Privilege (ZTP), making them the most vulnerable to breaches caused by privileged credential abuse. Education and Manufacturing are the most vulnerable industries of all, where it’s common for multiple manufacturing sites to use shared accounts for controlling privileged access. The study found that shared accounts for controlling privileged access are commonplace, with 52% of all organizations reporting this occurs often. Presented below are the relative levels of Zero Trust Privilege Maturity by demographics, with the largest organizations having the most mature approaches to ZTP, which is expected given the size and scale of their IT and cybersecurity departments.

  • 51% of organizations do not control access to transformational technologies with privileged access controls, including modern attack surfaces such as cloud workloads (38%), Big Data projects (65%), and containers (50%). Artificial Intelligence (AI)/Bots and Internet of Things (IoT) are two of the most vulnerable threat surfaces, according to the 1,300 organizations surveyed. Just 16% of organizations have implemented a ZTP strategy to protect their AI/Bot technologies, and just 25% have implemented one for IoT. The graphic below compares usage or plans by transformational technology.

  • 58% of organizations aren’t using MFA for server login, and 25% have no plans for a password vault, two areas that are the first steps to defining a Privileged Access Management (PAM) strategy. Surprisingly, 26% do not use and do not plan to use MFA for server login, while approximately 32% do plan to use MFA for server logins. Organizations are missing out on opportunities to significantly harden their security posture by adopting password vaults and implementing MFA across all server logins. These two areas are essential for implementing a ZTP framework.

Conclusion

To minimize threats – both external and internal – Privileged Access Management needs to go beyond the fundamental gateway-based model and look to encompass host-enforced privileged access that addresses every means by which the organization leverages privileged credentials. With just 21% of organizations succeeding with mature Zero Trust Privilege deployments, 79% are vulnerable to privileged credential abuse-based breaches that are challenging to stop. Privileged credentials are the most trusted in an organization, allowing internal and external hackers the freedom to move throughout networks undetected. That’s why understanding where an organization is on the spectrum of ZTP maturity is so important, and why the findings from the Centrify and Techvangelism 2019 Zero Trust Privilege Maturity Model Report are worth noting and taking action on.

How To Get Your Data Scientist Career Started

The most common request from this blog’s readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I’ve invited Alyssa Columbus, a Data Scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first, isn’t easy, given the surplus of analytics job seekers relative to open analytics positions.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I’ve referred them to many different resources, including discussions I’ve had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I’ve learned that works, I’ve put together the five most valuable lessons learned. I’ve written this article to make your data science job hunt as easy and efficient as possible.

  • Continuously build your statistical literacy and programming skills. Currently, there are 24,697 open Data Scientist positions on LinkedIn in the United States alone. Analyzing all open U.S. positions with data mining techniques yields the following list of the top 10 data science skills. As of April 14, the top 3 most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and Tensorflow. The graphic below provides a prioritized list of the most in-demand data science skills mentioned in LinkedIn job postings today.

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn’s job postings prioritize. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and not applying what you’ve learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for what the big questions to ask of your dataset are. Statistical literacy, or knowing “how” to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.
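As a small example of getting hands-on, the kind of keyword counting behind a skills ranking like the one above can be prototyped in a few lines of Python. This is a sketch over a handful of invented posting excerpts, not the actual LinkedIn analysis.

    # Minimal sketch: ranking data science skills by mentions in job postings.
    # The postings below are invented stand-ins for scraped job descriptions.
    from collections import Counter

    SKILLS = ["python", "r", "sql", "jupyter", "aws", "tensorflow", "spark"]

    postings = [
        "Seeking a data scientist fluent in Python and SQL; AWS a plus.",
        "Requires R or Python, Jupyter notebooks, and TensorFlow experience.",
        "Analytics role: SQL, Python, Spark.",
    ]

    counts = Counter()
    for text in postings:
        # Normalize punctuation, then count each posting once per skill.
        words = set(text.lower().replace(",", " ").replace(";", " ")
                        .replace(".", " ").replace(":", " ").split())
        counts.update(skill for skill in SKILLS if skill in words)

    for skill, n in counts.most_common():
        print(f"{skill}: mentioned in {n} of {len(postings)} postings")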

  • Continually be creating your own, unique portfolio of analytics and machine learning projects. Having a good portfolio is essential to being hired as a data scientist, especially if you don’t come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist with both the passion and skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you’re most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you’ve completed on your own. Some skills I’d recommend you highlight in your portfolio include:
    • Your programming language of choice (e.g., Python, R, Julia, etc.).
    • The ability to interact with databases (e.g., your ability to use SQL).
    • Visualization of data (static or interactive).
    • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
    • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard); a minimal sketch of such an API follows this list.
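For instance, a deployment project can be as small as the following sketch, which serves predictions from a trained scikit-learn model behind a Flask endpoint. The model file and feature names are illustrative placeholders, not a prescribed stack. A POST to /predict with a JSON body containing the four features returns the model’s prediction, which is enough to demonstrate end-to-end deployment in a portfolio.

    # Minimal sketch: serving a trained model behind a REST endpoint with Flask.
    # "model.joblib" and the feature names are illustrative placeholders.
    import joblib
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    model = joblib.load("model.joblib")  # e.g., a scikit-learn pipeline you trained

    @app.route("/predict", methods=["POST"])
    def predict():
        body = request.get_json(force=True)
        features = [[body["sepal_length"], body["sepal_width"],
                     body["petal_length"], body["petal_width"]]]
        return jsonify({"prediction": str(model.predict(features)[0])})

    if __name__ == "__main__":
        app.run(port=5000)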

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

  • Get (or git!) yourself a website. If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website.  Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all it’s free to do. A strong online presence will not only help you in applying for jobs, but organizations may also reach out to you with freelance projects, interviews, and other opportunities.
  • Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network.  If you don’t meet all of a job’s requirements, apply anyway. You don’t have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you’re a great fit for the main requirements of the job’s description, you need to apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you’re hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I’ve found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specializing in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:
    • Recruiters
    • Friends, family, and colleagues
    • Career fairs and recruiting events
    • General job boards
    • Company websites
    • Tech job boards

Alyssa Columbus is a Data Scientist at Pacific Life and member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.

Seven Things You Need To Know About IIoT In Manufacturing

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years.
  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies.
  • Connected IoT technologies are enabling a new era of smart, connected products that often expand on the long-proven platforms of everyday products. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.

These and many other fascinating insights are from IoT Analytics’ study, IIoT Platforms For Manufacturing 2019 – 2024 (155 pp., PDF, client access required). IoT Analytics is a leading provider of market insights for the Internet of Things (IoT), M2M, and Industry 4.0. They specialize in providing insights on IoT markets and companies, focused market reports on specific IoT segments, and go-to-market services for emerging IoT companies. The study’s methodology includes interviews with twenty of the leading IoT platform providers, executive-level IoT experts, and IIoT end users. For additional details on the methodology, please see pages 136 and 137 of the report. IoT Analytics defines the Industrial IoT (IIoT) as heavy industries, including manufacturing, energy, oil and gas, and agriculture, in which industrial assets are connected to the internet.

The seven things you need to know about IIoT in manufacturing include the following:

  • IoT Analytics’ technology architecture of the Internet of Things reflects the proliferation of new products, software and services, and the practical needs manufacturers have for proven integration to make the Industrial Internet of Things (IIoT) work. IoT technology architectures are in their nascent phase, showing signs of potential in solving many of manufacturing’s most challenging problems. IoT Analytics’ technology architecture shown below is designed to scale in response to the diverse development across the industry landscape with a modular, standardized approach.

  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies. IoT Analytics is seeing IIoT platforms begin to replace existing industrial software systems that had been created to bridge the IT and OT gaps in manufacturing environments. Their research teams are finding that IIoT Platforms are an adjacent technology to these typical industrial software solutions but are now starting to replace some of them in smart connected factory settings. The following graphic explains how IoT Analytics sees the IIoT influence across the broader industrial landscape:

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) over the seven-year window; a quick sanity check of that rate appears after this list. IoT Analytics finds that manufacturing is the largest IoT platform industry segment and will continue to be one of the primary growth catalysts of the market through 2024. For purposes of their analysis, IoT Analytics defines manufacturing as standardized production environments, including factories and workshops, in addition to custom production worksites such as mines, offshore oil and gas platforms, and construction sites. The IIoT platforms for manufacturing segment has experienced growth in the traditionally large manufacturing-base countries such as Japan and China. IoT Analytics relies on econometric modeling to create their forecasts.

  • In 2018, the industrial IoT platforms market for manufacturing had an approximate 60%/40% split between within-factories and outside-factories deployments, respectively. IoT Analytics predicts this split will remain mostly unchanged for 2019, and that by 2024 the within-factories share will gain a few percentage points. The within-factories type (of IIoT Platforms for Manufacturing) is estimated to grow from a $1B market in 2018 to a $1.5B market by 2019, driven by an ever-increasing amount of automation (e.g., robots on the factory floor) being introduced to factory settings for increased efficiencies, while the outside-factories type is forecast to grow from $665M in 2018 to become a $960M market by 2019.

  • Discrete manufacturing is predicted to be the largest percentage of Industrial IoT platform spending for 2019, growing at a CAGR of 46% from 2018. Discrete manufacturing will outpace batch and process manufacturing, becoming 53% of all IIoT platform spending this year. IoT Analytics sees discrete manufacturers pursuing make-to-stock, make-to-order, and assemble-to-order production strategies that require sophisticated planning, scheduling, and tracking capabilities to improve operations and profitability. The greater the production complexity in discrete manufacturing, the more valuable data becomes. Discrete manufacturing is one of the most data-prolific industries there are, making it an ideal catalyst for IIoT platform’s continual growth.

  • Manufacturers rely most on IIoT platforms for general process optimization (43.1%), general dashboards & visualization (41.1%), and condition monitoring (32.7%). Batch, discrete, and process manufacturers are also prioritizing use cases such as predictive maintenance, asset tracking, and energy management, as all three areas make direct contributions to improving shop floor productivity. Discrete manufacturers are always looking to free up extra time in production schedules so that they can offer short-notice production runs to their customers. Combining IIoT platform use cases to uncover process and workflow inefficiencies, so more short-notice production runs can be sold, is driving Proof of Concepts (PoCs) today in North American manufacturing.

  • IIoT platform early adopters prioritize security as the most important feature, ahead of scalability and usability. Identity and Access Management, multi-factor authentication, consistency of security patch updates, and the ability to scale and protect every threat surface across an IIoT network are high priorities for IIoT platform adopters today. Scalability and usability are the second and third priorities. The following graphic compares IIoT platform manufacturers’ most important needs:
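As promised above, the headline growth rate is easy to sanity-check: compounding from $1.67B in 2018 to $12.44B in 2024 spans six annual growth periods, and the arithmetic below reproduces the roughly 40% CAGR the study cites.

    # Sanity check on the forecast: $1.67B (2018) to $12.44B (2024),
    # compounded over the six annual periods in the 2018-2024 window.
    cagr = (12.44 / 1.67) ** (1 / 6) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # prints roughly 40%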

For more information on the insights presented here, check out IoT Analytics’ report: IIoT Platforms For Manufacturing 2019 – 2024.

How To Improve Privileged User’s Security Experiences With Machine Learning

Bottom Line: One of the primary factors motivating employees to sacrifice security for speed is the many frustrations they face when attempting to re-authenticate who they are so they can get more work done and achieve greater productivity.

How Bad Security Experiences Lead to a Breach

Every business is facing the paradox of hardening security without sacrificing users’ login and system access experiences. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment across every threat surface an organization has.

Centrify’s recent survey Privileged Access Management In The Modern Threatscape found that 74% of data breaches start with privileged credential abuse. Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. On the Dark Web, privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags.

Frustrated with wasting time responding to the many account lock-outs, re-authentication procedures, and login errors outmoded Privileged Access Management (PAM) systems require, IT Help Desk teams, IT administrators, and admin users freely share privileged credentials, often resulting in them eventually being offered for sale on the Dark Web.

The Keys to the Kingdom Are In High Demand

18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. State-sponsored and organized crime organizations offer to pay bounties in bitcoin for privileged credentials for many of the world’s largest financial institutions on the Dark Web. And with the typical U.S.-based enterprise losing on average $7.91M from a breach, more than double the global average of $3.86M according to IBM’s 2018 Data Breach Study, it’s clear that improving admin user experiences to reduce the incidence of privileged credential sharing needs to happen now.

How Machine Learning Improves Admin User Experiences and Thwarts Breaches

Machine learning is making every aspect of security experiences more adaptive, taking into account the risk context of every privileged access attempt across any threat surface, anytime. Machine learning algorithms can continuously learn and generate contextual intelligence that is used to streamline verified privileged user’s access while thwarting many potential threats ― the most common of which is compromised credentials.

The following are a few of the many ways machine learning is improving privileged users’ experiences when they need to log in to secure critical infrastructure resources:

  • Machine learning is making it possible to provide adaptive, personalized login experiences at scale, using risk scoring of every access attempt in real time, all contributing to improved user experiences. Machine learning makes it possible to implement security strategies that flex or adapt to risk contexts in real time, assessing every access attempt across every threat surface and generating a risk score in milliseconds. Being able to respond in milliseconds, or in real time, is essential for delivering excellent admin user experiences. The “never trust, always verify, enforce least privilege” approach to security is how many enterprises from a broad base of industries, including leading financial services and insurance companies, are protecting every threat surface from privileged access abuse. CIOs at these companies say taking a Zero Trust approach with a strong focus on Zero Trust Privilege corporate-wide is redefining the legacy approach to Privileged Access Management by delivering cloud-architected Zero Trust Privilege to secure access to infrastructure, DevOps, cloud, containers, Big Data, and other modern enterprise use cases. Taking a Zero Trust approach to security enables their departments to roll out new services across every threat surface their customers prefer to use without having to customize security strategies for each.
  • Quantify, track, and analyze every potential security threat and attempted breach, and apply threat analytics to the aggregated data sets in real-time, thwarting data exfiltration attempts before they begin. One of the cornerstones of Zero Trust Privilege is adaptive control. Machine learning algorithms continually "learn" by analyzing users' behavior across every threat surface, device, and login attempt, looking for anomalies. When any user's behavior falls outside the thresholds defined for threat analytics and risk scoring, additional authentication is immediately requested, and access to the requested resources is denied until the identity can be verified. Machine learning makes adaptive preventative controls possible.
  • When every identity is a new security perimeter, machine learning's ability to personalize every access attempt on every threat surface at scale is essential for enabling a company to keep growing. The businesses growing fastest often face the greatest challenges in improving their privileged users' experiences. Getting new employees productive quickly rests on four foundational elements: verifying the identity of every admin user, knowing the context of their access request, ensuring it's coming from a clean source, and limiting access as well as privilege. Taken together, these pillars form the foundation of Zero Trust Privilege.
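
To make the adaptive-control loop described in the list above concrete, the following is a minimal sketch of risk-scoring a login attempt and stepping up authentication when behavior looks anomalous. The features, training data, and decision thresholds are illustrative assumptions only, not Centrify's or any other vendor's actual scoring model:

```python
# Minimal sketch of adaptive, risk-based privileged access control.
# All features, data, and thresholds below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is one historical login: [hour_of_day, failed_attempts_last_hour,
# is_new_device, is_new_location]
historical_logins = np.array([
    [9, 0, 0, 0], [10, 1, 0, 0], [14, 0, 0, 0], [11, 0, 1, 0],
    [8, 0, 0, 0], [16, 2, 0, 0], [13, 0, 0, 1], [9, 1, 0, 0],
])

# Train an anomaly detector on the admin user's normal login behavior.
model = IsolationForest(contamination=0.1, random_state=42).fit(historical_logins)

def handle_access_request(features):
    """Score one access attempt and decide: allow, step up, or deny."""
    score = model.decision_function([features])[0]  # higher = more normal
    if score > 0.05:
        return "allow"        # low risk: frictionless login
    if score > -0.05:
        return "step_up_mfa"  # medium risk: request additional authentication
    return "deny"             # high risk: block until identity is verified

# A 3 a.m. attempt from a new device in a new location scores as risky,
# triggering step-up authentication or denial instead of silent access.
print(handle_access_request(np.array([3, 4, 1, 1])))
```

In production, a model like this would be retrained continuously on fresh telemetry so the risk bands adapt as legitimate behavior drifts.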

Conclusion

Organizations don't have to sacrifice security for speed when they rely on machine learning-based approaches to improving the privileged user experience. Today, a majority of IT Help Desk teams, IT administrators, and admin users freely share privileged credentials to be more productive, which often leads to breaches based on privileged access abuse. Taking a machine learning-based approach to validating every access request, the context of the request, and the risk of the access environment removes the roadblocks to greater privileged user productivity and greatly minimizes privileged credential abuse.

Industry 4.0’s Potential Needs To Be Proven On The Shop Floor

  • 99% of mid-market manufacturing executives are familiar with Industry 4.0, yet only 5% are currently implementing or have implemented an Industry 4.0 strategy.
  • Investing in upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring, including Manufacturing Execution Systems (MES), are manufacturers' top three priorities, based on my interviews with them.
  • Mid-market manufacturers getting the most value out of Industry 4.0 excel at orchestrating a variety of technologies to find new ways to excel at product quality, improve shop floor productivity, meet delivery dates, and control costs.
  • Real-time monitoring is gaining momentum to improve order cycle times, troubleshoot quality problems, improve schedule accuracy, and support track-and-trace.

These and many other fascinating insights are from Industry 4.0: Defining How Mid-Market Manufacturers Derive and Deliver Value. BDO, a leading provider of assurance, tax, and financial advisory services, has made the report available for download here (PDF, 36 pp., no opt-in). The survey was conducted by Market Measurement, Inc., an independent market research consulting firm, in November and December of 2018 and included 230 executives at U.S. manufacturing companies with annual revenues between $200M and $3B. Please see page 2 of the study for additional details on the methodology. One of the most valuable findings of the study is that mid-market manufacturers need more evidence of Industry 4.0 delivering improved supply chain performance, quality, and shop floor productivity.

Insights from the Shop Floor: Machine Upgrades, Smart Machines, Real-Time Monitoring & MES Lead Investment Plans

In the many conversations I’ve had with mid-tier manufacturers located in North America this year, I’ve learned the following:

  • Their top investment priorities are upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring including Manufacturing Execution Systems (MES).
  • Manufacturers growing 10% or more this year over 2018 excel at integrating technologies that improve scheduling to enable more short-notice production runs, reduce order cycle times, and improve supplier quality.

Key Takeaways from BDO’s Industry 4.0 Study

  • Manufacturers are most motivated to evaluate Industry 4.0 technologies by the potential for growth and business model diversification they offer. Building a business case for any new system or technology that delivers revenue, even during a pilot, gets the highest priority from manufacturers today. In my interviews, manufacturers were 1.7 times more likely to invest in machine upgrades and smart machines than to spend more on marketing. They are very interested in any new technology that enables them to accept short-notice production runs from customers, meet higher quality standards, and improve time-to-market, all while gaining better cost visibility and control. All of those factors are inherent in the top three goals of business model diversification, improved operational efficiencies, and increased market penetration.

  • For Industry 4.0 technologies to gain more adoption, more use cases are needed that explain how traditional product sales, aftermarket sales, and product-as-a-service benefit from these new technologies. Manufacturers know the ROI of investing in a machinery upgrade, buying a smart, connected machine, or integrating real-time monitoring across their shop floors. What they're struggling with is how Industry 4.0 improves traditional product sales. 84% of upper mid-market manufacturers are generating revenue from Information-as-a-Service today, compared to 67% of middle-market manufacturers overall.

  • Manufacturers who get the most value out of their Industry 4.0 investments begin with a customer-centric blueprint, integrating diverse technologies to deliver excellent customer experiences. Manufacturers growing 10% a year or more rely on roadmaps to guide their technology buying decisions, focused on reducing scrap, improving order cycle times, streamlining supplier integration while improving inbound quality levels, and providing real-time order updates to customers. BDO's survey results reflect what I'm hearing from manufacturers: they're more focused than ever on having an integrated engagement strategy combined with greater flexibility in responding to unique and often urgent production runs.

  • Industry 4.0's potential to improve supply chains needs greater focus if mid-tier manufacturers are going to adopt the framework fully. Manufacturing executives most often equate Industry 4.0 with shop floor productivity improvements, while the greatest gains are waiting in their supply chains. The BDO study found that manufacturers are divided on the metrics they rely on to evaluate their supply chains. Upper middle-market manufacturers are aiming to speed up customer order cycle times and are less focused on getting their total delivered costs down, while lower mid-market manufacturers say inventory turnover is their biggest priority. Overall, strengthening customer service increases in importance with the size of the organization.

  • By enabling integration between engineering, supply chain management, Manufacturing Execution Systems (MES) and CRM systems, more manufacturers are achieving product configuration strategies at scale. A key growth strategy for many manufacturers is to scale beyond the limitations of their longstanding Make-to-Stock production strategies. By integrating engineering, supply chains, MES, and CRM, manufacturers can offer more flexibility to their customers while expanding their product strategies to include Configure-to-Order, Make-to-Order, and for highly customized products, Engineer-to-Order. The more Industry 4.0 can be shown to enable design-to-manufacturing at scale, the more it will resonate with senior executives in mid-tier manufacturing.

  • Manufacturers are more likely than ever before to accept cloud-based platforms and systems that help them achieve their business strategies faster and more completely, with analytics being in the early stages of adoption. Manufacturing CEOs and their teams are most concerned about how quickly new applications and platforms can position their businesses for more growth. Whether a given application or platform is cloud-based often becomes secondary to the speed and time-to-market constraints every manufacturing business faces. The fastest-growing mid-tier manufacturers are putting greater effort and intensity into mastering analytics across every area of their business too. BDO found that Artificial Intelligence (AI) leads all other technologies in planned use.

How To Improve Supply Chains With Machine Learning: 10 Proven Ways

Bottom line: Enterprises are attaining double-digit improvements in forecast error rates, demand planning productivity, cost reductions and on-time shipments using machine learning today, revolutionizing supply chain management in the process.

Machine learning algorithms and the models they're based on excel at finding anomalies, patterns, and predictive insights in large data sets. Many supply chain challenges are time, cost, and resource constraint-based, making machine learning an ideal technology for solving them. From Amazon's Kiva robotics relying on machine learning to improve accuracy, speed, and scale, to DHL's Predictive Network Management system relying on AI and machine learning to analyze 58 different parameters of internal data and identify the top factors influencing shipment delays, machine learning is defining the next generation of supply chain management. Gartner predicts that by 2020, 95% of Supply Chain Planning (SCP) vendors will be relying on supervised and unsupervised machine learning in their solutions, and that by 2023, intelligent algorithms and AI techniques will be an embedded or augmented component across 25% of all supply chain technology solutions.
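
As a concrete illustration of that anomaly-finding, the following is a minimal sketch that flags outlier transit times for a shipping lane. The data and the two-standard-deviation threshold are illustrative assumptions; production systems would use far richer features and models:

```python
# Minimal sketch of statistical anomaly detection on shipment transit times.
# The shipment data and threshold below are illustrative assumptions.
import pandas as pd

shipments = pd.DataFrame({
    "shipment_id": range(1, 11),
    "transit_days": [2.1, 2.3, 2.0, 2.2, 2.4, 2.1, 2.2, 2.3, 2.0, 9.5],
})

# Flag any shipment whose transit time is more than two standard deviations
# from the lane's historical mean, a simple statistical anomaly test.
mean = shipments["transit_days"].mean()
std = shipments["transit_days"].std()
shipments["anomaly"] = (shipments["transit_days"] - mean).abs() > 2 * std

# Surfaces shipment 10 (9.5 days vs. a ~2.2-day norm) for investigation.
print(shipments[shipments["anomaly"]])
```

The same pattern scales up: replace the single column with IoT sensor, telematics, and traffic features, and the threshold test with a trained model.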

The ten ways that machine learning is revolutionizing supply chain management include:

  • Machine learning-based algorithms are the foundation of the next generation of logistics technologies, with the most significant gains being made in advanced resource scheduling systems. Machine learning and AI-based techniques underpin a broad spectrum of next-generation logistics and supply chain technologies now under development. The biggest gains are being made where machine learning can help solve the complex constraint, cost, and delivery problems companies face today. McKinsey predicts machine learning's most significant contributions will be in giving supply chain operators greater insight into how performance can be improved and in anticipating anomalies in logistics costs and performance before they occur. Machine learning is also providing insights into where automation can deliver the greatest scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019, by Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus

  • The wide variation in data sets generated by Internet of Things (IoT) sensors, telematics, intelligent transport systems, and traffic data has the potential to deliver the most value in improving supply chains through machine learning. Applying machine learning algorithms and techniques to improve supply chains starts with the data sets that have the greatest variety and variability in them. The most challenging issues supply chains face are often found in optimizing logistics so that the materials needed to complete a production run arrive on time. Source: KPMG, Supply Chain Big Data Series Part 1

  • Machine learning shows the potential to reduce logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings. BCG recently looked at how a decentralized supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration, when blockchain is used to share data in real-time across a supplier network and is combined with better analytics insight, cost savings of $6M a year are achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

  • Reducing forecast errors by up to 50% is achievable using machine learning-based techniques (a minimal forecasting sketch follows this list). Lost sales due to products not being available are being reduced by up to 65% through machine learning-based planning and optimization techniques, and inventory reductions of 20% to 50% are being achieved when machine learning-based supply chain management systems are used. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What's in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).

  • DHL Research is finding that machine learning enables logistics and supply chain operations to optimize capacity utilization, improve customer experience, reduce risk, and create new business models. DHL’s research team continually tracks and evaluates the impact of emerging technologies on logistics and supply chain performance. They’re also predicting that AI will enable back-office automation, predictive operations, intelligent logistics assets, and new customer experience models. Source: DHL Trend Research, Logistics Trend Radar, Version 2018/2019 (PDF, 55 pp., no opt-in)

  • Detecting and acting on inconsistent supplier quality levels and deliveries using machine learning-based applications is an area manufacturers are investing in today. Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they're facing today is suppliers' lack of consistent quality and delivery performance; the greatest is the lack of skilled labor available. Using machine learning and advanced analytics, manufacturers can quickly discover who their best and worst suppliers are, and which production centers are most accurate in catching errors. Manufacturers are applying machine learning to supplier quality, delivery, and consistency challenges using dashboards much like Microsoft's Supplier Quality Analysis sample. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour, 2018

  • Reducing risk and the potential for fraud, while improving product and process quality based on insights gained from machine learning, is driving an inflection point in inspections across supply chains today. When inspections are automated using mobile technologies and results are uploaded in real-time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. Inspectorio is a machine learning startup to watch in this area; they're tackling the many problems that a lack of inspection and supply chain visibility creates, focusing on how they can solve them immediately for brands and retailers. Source: Forbes, How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility, January 23, 2019

  • Machine learning is making rapid gains in end-to-end supply chain visibility possible, providing predictive and prescriptive insights that help companies react faster than before. Combining multi-enterprise commerce networks for global trade and supply chain management with AI and machine learning platforms is revolutionizing end-to-end supply chain visibility. One of the early leaders in this area is Infor's Control Center, which combines data from the Infor GT Nexus Commerce Network, acquired by the company in September 2015, with Infor's Coleman Artificial Intelligence (AI) platform. Infor named the platform after the inspiring mathematician Katherine Coleman Johnson, whose trail-blazing work helped NASA land on the moon; be sure to pick up the book and see the movie Hidden Figures if you haven't already, to appreciate her and many other brilliant women mathematicians' contributions to space exploration. ChainLink Research provides an overview of Control Center in their article, How Infor is Helping to Realize Human Potential.

  • Machine learning is proving to be foundational for thwarting privileged credential abuse, the leading cause of security breaches across global supply chains. By taking a least privilege access approach, organizations can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse in their supply chains by requiring stronger verification whenever a privileged user presents the right credentials but the request arrives with risky context. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment. Centrify is a leader in this area, with globally recognized suppliers including Cisco, Intel, Microsoft, and Salesforce as current customers. Source: Forbes, High-Tech's Greatest Challenge Will Be Securing Supply Chains In 2019, November 28, 2018.
  • Capitalizing on machine learning to predict preventative maintenance for freight and logistics machinery based on IoT data is improving asset utilization and reducing operating costs (a sketch of this approach also follows this list). McKinsey found that predictive maintenance enhanced by machine learning allows for better prediction and avoidance of machine failure by combining data from advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible, and overall maintenance costs may be reduced by up to 10%. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What's in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
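
To ground the forecast-error claim in the fourth item above, here is a minimal sketch of machine learning-based demand forecasting compared against a naive baseline. The demand series, lag features, and model choice are illustrative assumptions, not taken from the McKinsey study:

```python
# Minimal sketch of ML demand forecasting vs. a naive baseline.
# The demand series and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic monthly unit demand with a mild trend and seasonality.
demand = np.array([100, 120, 130, 90, 110, 140, 150, 95, 115, 145, 160, 100,
                   118, 150, 165, 105, 125, 155, 170, 108])

# Reframe the series as supervised learning: predict month t from t-3..t-1.
X = np.array([demand[i - 3:i] for i in range(3, len(demand))])
y = demand[3:]

# Hold out the final month; train on everything before it.
model = GradientBoostingRegressor(random_state=42).fit(X[:-1], y[:-1])

ml_forecast = model.predict(X[-1:])[0]
naive_forecast = demand[-2]  # "next month equals last month" baseline
actual = y[-1]
print(f"ML forecast error:    {abs(ml_forecast - actual):.1f} units")
print(f"Naive forecast error: {abs(naive_forecast - actual):.1f} units")
```

Real demand-planning systems add causal features such as promotions, pricing, and weather; richer signals like these are where the larger error reductions come from.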
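
And for the final item on predictive maintenance, the following is a minimal sketch of scoring failure risk from IoT sensor readings. The sensor features, labels, and model are illustrative assumptions rather than any production system:

```python
# Minimal sketch of predictive maintenance from IoT sensor data.
# Features, labels, and thresholds below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [vibration_mm_s, bearing_temp_c, hours_since_service]
# Label: 1 if the asset failed within the following 30 days, else 0.
X = np.array([
    [1.2, 60, 200], [1.4, 62, 350], [4.8, 85, 900], [1.1, 58, 120],
    [5.2, 88, 1100], [1.6, 64, 400], [4.5, 82, 950], [1.3, 61, 300],
])
y = np.array([0, 0, 1, 0, 1, 0, 1, 0])

model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# Score two assets in the fleet and schedule the high-risk one for service.
fleet = np.array([[1.5, 63, 380], [4.9, 86, 1000]])
for asset, p_fail in zip(["truck-A", "truck-B"],
                         model.predict_proba(fleet)[:, 1]):
    flag = " -> schedule maintenance" if p_fail > 0.5 else ""
    print(f"{asset}: 30-day failure risk {p_fail:.0%}{flag}")
```

Combining sensor scores like these with maintenance logs and external data, as McKinsey describes, turns reactive repair schedules into risk-ranked work queues.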

References

Accenture, Reinventing The Supply Chain With AI, 20 pp., PDF, no opt-in.

Bendoly, E. (2016). Fit, Bias, and Enacted Sensemaking in Data Visualization: Frameworks for Continuous Development in Operations and Supply Chain Management Analytics. Journal of Business Logistics, 37(1), 6-17.

Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra
