Top 10 Cybersecurity Companies To Watch In 2019

Today’s Threatscape Has Made “Trust But Verify” Obsolete 

The threatscape every business operates in today is proving the old model of “trust but verify” obsolete and in need of a complete overhaul. To compete and grow in the increasingly complex and lethal threatscape of today, businesses need more adaptive, contextually intelligent security solutions based on the Zero Trust Security framework. Zero Trust takes a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. John Kindervag was the first to see how urgent the need was for enterprises to change their approach to cybersecurity, so he created the Zero Trust Security framework in 2010 while at Forrester. Chase Cunningham, Principal Analyst at Forrester, is a mentor to many worldwide wanting to expand their knowledge of Zero Trust and frequently speaks and writes on the topic. If you are interested in cybersecurity in general and Zero Trust specifically, be sure to follow his blog.

AI and machine learning applied to cybersecurity’s most significant challenges is creating a proliferation of commercially successful, innovative platforms. The size and scale of deals in cybersecurity continue to accelerate with BlackBerry’s acquisition of Cylance for $1.4B in cash closing in February of this year being the largest. TD Ameritrade’s annual survey of registered investment advisors (RIA) showed nearly a 6X jump in cybersecurity investments this year compared to 2018.

The top ten cybersecurity companies reflect the speed and scale of innovation happening today, which is driving the highest levels of investment this industry has ever seen. The following are the top ten cybersecurity companies to watch in 2019:

Absolute (ABT.TO)  – One of the world's leading providers of commercial enterprise security solutions, serving as the industry benchmark for endpoint resilience, visibility, and control. The company enables more than 12,000 customers with self-healing endpoint security, always-connected visibility into their devices, data, users, and applications whether endpoints are on or off the network, and the ultimate level of control and confidence required for the modern enterprise. Embedded in over one billion endpoint devices, Absolute delivers intelligence and real-time remediation capabilities that equip enterprises to stop data breaches at the source.

To thwart attackers, organizations continue to layer on security controls — Gartner estimates that more than $124B will be spent on security in 2019 alone. Absolute's 2019 Endpoint Security Trends Report finds that much of that spend is in vain, however, revealing that 70% of all breaches still originate on the endpoint. The problem is complexity at the endpoint – it causes security agents to fail, reliably and predictably.

Absolute’s research found that 42% of all endpoints are unprotected at any given time, and 100% of endpoint security tools eventually fail. As a result, IT leaders see a negative ROI on their security spend. What makes Absolute one of the top 10 security companies to watch in 2019 is their purpose-driven design to mitigate this universal law of security decay.

Enterprises rely on Absolute to cut through the complexity to identify failures, model control options, and refocus security intent. Rather than perpetuating organizations’ false sense of security, Absolute enables uncompromised endpoint persistence, builds resilience and delivers the intelligence needed to ensure security agents, applications, and controls continue functioning and deliver value as intended. Absolute has proven very effective in validating safeguards, fortifying endpoints, and stopping data security compliance failures. The following is an example of the Absolute platform at work:

BlackBerry Artificial Intelligence and Predictive Security  –  BlackBerry is noteworthy for how quickly they are reinventing themselves into an enterprise-ready cybersecurity company, independent of the Cylance acquisition. Paying $1.4B in cash for Cylance brings much-needed AI and machine learning expertise to their platform portfolio, an acquisition that BlackBerry is moving quickly to integrate into their product and service strategies. BlackBerry Cylance uses AI and machine learning to protect the entire attack surface of an enterprise with automated threat prevention, detection, and response capabilities. Cylance is also the first company to apply artificial intelligence, algorithmic science, and machine learning to cybersecurity, improving the way companies, governments, and end users proactively solve the world's most challenging security problems. Using a breakthrough mathematical process, BlackBerry Cylance quickly and accurately identifies what is safe and what is a threat, not just what is in a blacklist or whitelist. By coupling sophisticated math and machine learning with a unique understanding of a hacker's mentality, BlackBerry Cylance provides the technology and services to be truly predictive and preventive against advanced threats. The following screen from CylancePROTECT provides an executive summary of CylancePROTECT usage, from the number of zones and devices to the percentage of devices covered by Auto-Quarantine and Memory Protection, Threat Events, Memory Violations, Agent Versions, and Offline Days for devices.
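The blacklist-versus-model distinction above can be illustrated with a toy sketch. Everything here (the hash, the feature names, and the weights) is invented for illustration and has nothing to do with BlackBerry Cylance's actual model; it only shows why feature-based scoring can flag a file that no signature list has ever seen.

```python
# Toy contrast: signature blacklist vs. feature-based scoring.
# Features and weights are invented for illustration only.

KNOWN_BAD_HASHES = {"e3b0c44298fc1c14"}  # hypothetical signature list

def blacklist_verdict(file_hash: str) -> bool:
    # Signature matching only catches what has been seen before.
    return file_hash in KNOWN_BAD_HASHES

def feature_score(features: dict) -> float:
    """Weighted score over static file features; a trained model
    would learn such weights from millions of labeled samples."""
    weights = {"packed": 0.5,
               "writes_registry_run_key": 0.3,
               "entropy_above_7": 0.2}
    return sum(w for name, w in weights.items() if features.get(name))

sample = {"packed": True, "entropy_above_7": True}
print(blacklist_verdict("ffffffffffffffff"))  # False: hash never seen
print(feature_score(sample) >= 0.5)           # True: flagged anyway
```

A never-before-seen file produces no blacklist match, yet its static features still push its score over the (illustrative) threshold.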

Centrify –  Centrify is redefining the legacy approach to Privileged Access Management by delivering cloud-ready Zero Trust Privilege to secure modern enterprise attack surfaces. Centrify Zero Trust Privilege helps customers grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment. Industry research firm Gartner predicted Privileged Access Management (PAM) to be the second-fastest growing segment for information security and risk management spending worldwide in 2019 in their recent Forecast Analysis: Information Security and Risk Management, Worldwide, 3Q18 Update (client access required). By implementing least privilege access, Centrify minimizes the attack surface, improves audit and compliance visibility, and reduces risk, complexity, and costs for the modern, hybrid enterprise. Over half of the Fortune 100, the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, all trust Centrify to stop the leading cause of breaches – privileged credential abuse. PAM was also named a Top 10 security project for 2019 in Gartner’s Top 10 Security Projects for 2019 (client access required).
Cloudflare –  Cloudflare is a web performance and security company that provides online services to protect and accelerate websites online. Its online platforms include Cloudflare CDN that distributes content around the world to speed up websites, Cloudflare Optimizer that enables web pages with ad servers and third-party widgets to download Snappy software on mobiles and computers, Cloudflare Security that protects websites from a range of online threats including spam, SQL injection, and DDoS, Cloudflare Analytics that gives insight into a website's traffic including threats and search engine crawlers, Keyless SSL that allows organizations to keep secure sockets layer (SSL) keys private, and Cloudflare applications that help its users install web applications on their websites.

CrowdStrike – Applying machine learning to endpoint detection of IT network threats is how CrowdStrike is differentiating itself in the rapidly growing cybersecurity market today. It's also one of the top 25 machine learning startups to watch in 2019. CrowdStrike is credited with uncovering Russian hackers inside the servers of the US Democratic National Committee. The company's IPO was last Tuesday night, at an initial price of $34 per share. The IPO generated $610M at a valuation that at one point reached nearly $7B. Their Falcon platform stops breaches by detecting all attack types, even malware-free intrusions, providing five-second visibility across all current and past endpoint activity while reducing cost and complexity for customers. CrowdStrike's Threat Graph provides real-time analysis of data from endpoint events across the global crowdsourcing community, allowing detection and prevention of attacks based on patented behavioral pattern recognition technology.

Hunters.AI – Hunters.AI excels at autonomous threat hunting by capitalizing on its autonomous system that connects to multiple channels within an organization and detects the signs of potential cyber-attacks. They are one of the top 25 machine learning startups to watch in 2019. What makes this startup one of the top ten cybersecurity companies to watch in 2019 is their innovative approach to creating AI- and machine learning-based algorithms that continually learn from an enterprise's existing security data. Hunters.AI generates and delivers visualized attack stories, allowing organizations to more quickly and effectively identify, understand, and respond to attacks. Early customers include Snowflake Computing, whose VP of Security recently said, "Hunters.AI identified the attack in minutes. In my 20 years in security, I have not seen anything as effective, fast, and with high fidelity as what Hunters can do." The following is a graphic overview of how their system works:

Idaptive – Idaptive is noteworthy for the Zero Trust approach they are taking to protecting organizations across every threat surface they rely on to operate their businesses daily. Idaptive secures access to applications and endpoints by verifying every user, validating their devices, and intelligently limiting their access. Their product and services strategy reflects a "never trust, always verify, enforce least privilege" approach to privileged access, from inside or outside the network. The Idaptive Next-Gen Access platform combines single sign-on (SSO), adaptive multifactor authentication (MFA), enterprise mobility management (EMM), and user behavior analytics (UBA). They have over 2,000 organizations using their platform today. Idaptive was spun out from Centrify on January 1st of this year.

Kount – Kount has successfully differentiated itself in an increasingly crowded cybersecurity marketplace by providing fraud management, identity verification, and online authentication technologies that enable digital businesses, online merchants, and payment service providers to identify and thwart a wide spectrum of threats in real time. Kount has been able to show through customer references that their customers can approve more orders, uncover new revenue streams, and dramatically improve their bottom line, all while minimizing fraud management cost and losses. Through Kount's global network and proprietary technologies in AI and machine learning, combined with policy and rules management, their customers thwart online criminals and bad actors, driving them away from their site, their marketplace, and off their network. Kount's continuously adaptive platform learns of new threats and continuously updates risk scores to further thwart breach and fraud attempts. Kount's advances in both proprietary techniques and patented technology include superior mobile fraud detection, advanced artificial intelligence, multi-layer device fingerprinting, IP proxy detection and geolocation, transaction and custom scoring, global order linking, business intelligence reporting, comprehensive order management, and professional and managed services. Kount protects over 6,500 brands today.
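Multi-layer device fingerprinting, one of the capabilities mentioned above, can be sketched in a few lines: several independent device signals are combined into one stable identifier so that repeat visits from the same device can be linked. The signal names and hashing scheme here are assumptions for the sketch, not Kount's actual inputs or algorithm.

```python
import hashlib

def device_fingerprint(signals: dict) -> str:
    """Combine multiple device signals into a stable identifier.
    Signal names are illustrative only."""
    # Canonical ordering makes the fingerprint deterministic
    # regardless of the order signals were collected in.
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

signals = {
    "user_agent": "Mozilla/5.0",
    "screen": "1920x1080",
    "timezone": "UTC-7",
    "fonts_hash": "ab12cd34",
}
fp = device_fingerprint(signals)
# The same signals always yield the same fingerprint;
# changing any one signal yields a different one.
```

Real systems layer many more signals (and fuzzier matching) on top of this, which is why the marketing term is "multi-layer" fingerprinting.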

MobileIron –  The acknowledged leader in Mobile Device Management software, MobileIron’s latest series of developments make them noteworthy and one of the top ten cybersecurity companies to watch in 2019.   MobileIron was the first to deliver key innovations such as multi-OS mobile device management (MDM), mobile application management (MAM), and BYOD privacy controls. Last month MobileIron introduced zero sign-on (ZSO), built on the company’s unified endpoint management (UEM) platform and powered by the MobileIron Access solution. “By making mobile devices your identity, we create a world free from the constant pains of password recovery and the threat of data breaches due to easily compromised credentials,” wrote Simon Biddiscombe, MobileIron’s President and Chief Executive Officer in his recent blog post, Single sign-on is still one sign-on too many. Simon’s latest post, MobileIron: We’re making history by making passwords history, provides the company’s vision going forward with ZSO. Zero sign-on eliminates passwords as the primary method for user authentication, unlike single sign-on, which still requires at least one username and password. MobileIron paved the way for a zero sign-on enterprise with its Access product in 2017, which enabled zero sign-on to cloud services on managed devices. Enterprise security teams no longer have to trade off security for better user experience, thanks to the MobileIron Zero Sign-On.

Sumo Logic – Sumo Logic is a fascinating cybersecurity company to track because it shows the ability to take on large-scale enterprise security challenges and turn them into a competitive advantage. An example of this is how quickly the company achieved FedRAMP Ready Designation, getting listed in the FedRAMP Marketplace. Sumo Logic is a secure, cloud-native, machine data analytics service, delivering real-time, continuous intelligence from structured, semi-structured, and unstructured data across the entire application lifecycle and stack. More than 2,000 customers around the globe rely on Sumo Logic for the analytics and insights to build, run, and secure their modern applications and cloud infrastructures. With Sumo Logic, customers gain a multi-tenant, service-model advantage to accelerate their shift to continuous innovation, increasing competitive advantage, business value, and growth. Founded in 2010, Sumo Logic is a privately held company based in Redwood City, Calif. and is backed by Accel Partners, Battery Ventures, DFJ, Franklin Templeton, Greylock Partners, IVP, Sapphire Ventures, Sequoia Capital, Sutter Hill Ventures and Tiger Global Management.


Machine Learning Is Helping To Stop Security Breaches With Threat Analytics

Bottom Line: Machine learning is enabling threat analytics to deliver greater precision regarding the risk context of privileged users’ behavior, creating notifications of risky activity in real time, while also being able to actively respond to incidents by cutting off sessions, adding additional monitoring, or flagging for forensic follow-up.

Separating Security Hacks Fact from Fiction

It’s time to demystify the scale and severity of breaches happening globally today. A commonly-held misconception or fiction is that millions of hackers have gone to the dark side and are orchestrating massive attacks on any and every business that is vulnerable. The facts are far different and reflect a much more brutal truth, which is that businesses make themselves easy to hack into by not protecting their privileged access credentials. Cybercriminals aren’t expending the time and effort to hack into systems; they’re looking for ingenious ways to steal privileged access credentials and walk in the front door. According to Verizon’s 2019 Data Breach Investigations Report, ‘Phishing’ (as a pre-cursor to credential misuse), ‘Stolen Credentials’, and ‘Privilege Abuse’ account for the majority of threat actions in breaches (see page 9 of the report).

It only really takes one compromised credential to potentially impact millions — whether it’s millions of individuals or millions of dollars. Undeniably, identities and the trust we place in them are being used against us. They have become the Achilles heel of our cybersecurity practices. According to a recent study by Centrify among 1,000 IT decision makers, 74% of respondents whose organizations have been breached acknowledged that it involved access to a privileged account. This number closely aligns with Forrester Research’s estimate “that at least 80% of data breaches . . . [involved] compromised privileged credentials, such as passwords, tokens, keys, and certificates.”

While the threat actors might vary according to Verizon's 2019 Data Breach Investigations Report, the cyber adversaries' tactics, techniques, and procedures are the same across the board. Verizon found that the fastest-growing source of threats is internal actors, as the graphic from the study illustrates below:


Internal actors are the fastest growing source of breaches because they’re able to obtain privileged access credentials with minimal effort, often obtaining them through legitimate access requests to internal systems or harvesting their co-workers’ credentials by going through the sticky notes in their cubicles. Privileged credential abuse is a challenge to detect as legacy approaches to cybersecurity trust the identity of the person using the privileged credentials. In effect, the hacker is camouflaged by the trust assigned to the privileged credentials they have and can roam internal systems undetected, exfiltrating sensitive data in the process.

The reality is that many breaches can be prevented by some of the most basic Privileged Access Management (PAM) tactics and solutions, coupled with a Zero Trust approach. Most organizations are investing the largest chunk of their security budget on protecting their network perimeter rather than focusing on security controls that can effect positive change to protect against the leading attack vector: privileged access abuse.

The bottom line is that investing in securing perimeters leaves the most popular attack vector of all unprotected: privileged credentials. Making PAM a top priority is crucial to protect any business' most valuable assets: its systems, data, and the intelligence they provide. Gartner has listed PAM on its Top 10 Security Projects for the past two years for a good reason.

Part of a cohesive PAM strategy should include machine learning-based threat analytics to provide an extra layer of security that goes beyond a password vault, multi-factor authentication (MFA), or privilege elevation.

How Machine Learning and Threat Analytics Stop Privileged Credential Abuse 

Machine learning algorithms enable threat analytics to immediately detect anomalies and abnormal behavior by tracking login behavioral patterns, geolocation, time of login, and many more variables to calculate a risk score. Risk scores are calculated in real time and determine whether access is approved, additional authentication is needed, or the request is blocked entirely.
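As a toy sketch of this kind of risk-based access decision, a scoring function might compare a login attempt against a user's learned profile and map the score to a policy outcome. The features, weights, and thresholds below are invented for illustration and do not reflect any vendor's actual model.

```python
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    # Hypothetical features a threat analytics engine might track.
    hour: int          # local hour of the login, 0-23
    country: str       # geolocation of the source IP
    new_device: bool   # device fingerprint never seen before

def risk_score(attempt: LoginAttempt, profile: dict) -> float:
    """Score 0.0 (normal) to 1.0 (highly anomalous) against the
    user's learned behavioral profile. Weights are illustrative."""
    score = 0.0
    if attempt.hour not in profile["usual_hours"]:
        score += 0.4
    if attempt.country != profile["usual_country"]:
        score += 0.4
    if attempt.new_device:
        score += 0.2
    return min(score, 1.0)

def access_decision(score: float) -> str:
    # Real-time policy: approve, step up authentication, or block.
    if score < 0.3:
        return "approve"
    if score < 0.7:
        return "require_mfa"
    return "block"

profile = {"usual_hours": range(8, 19), "usual_country": "US"}
normal = LoginAttempt(hour=10, country="US", new_device=False)
odd    = LoginAttempt(hour=3, country="RO", new_device=True)
print(access_decision(risk_score(normal, profile)))  # approve
print(access_decision(risk_score(odd, profile)))     # block
```

A production engine would learn the profile and weights from behavioral data rather than hard-code them, but the decision flow is the same: score in real time, then approve, step up, or block.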

Machine learning-based threat analytics also provide the following benefits:

  • New insights into privileged user access activity based on real-time data related to unusual recent privilege changes, commands run, targets accessed, and privilege elevation.
  • Greater understanding of and insight into the risk nature of specific events, with a risk score computed in real time for every event and expressed as high, medium, or low for any anomalous activity.
  •  Isolate, identify, and track which security factors triggered an anomaly alert.
  • Capture, play, and analyze video sessions of anomalous events within the same dashboard used for tracking overall security activity.
  • Create customizable alerts that provide context-relevant visibility and session recording and can also deliver notifications of anomalies, all leading to quicker, more informed investigative action.

What to Look for In Threat Analytics 
Threat analytics providers are capitalizing on machine learning to continually improve the predictive accuracy and usability of their applications. What's most important is for any threat analytics application or solution you're considering to provide context-aware access decisions in real time. The best threat analytics applications on the market today are using machine learning as the foundation of their threat analytics engine. These machine learning-based engines are very effective at profiling the normal behavior pattern for any user on any login attempt, or any privileged activity including commands, identifying anomalies in real time to enable risk-based access control. High-risk events are immediately flagged, alerted, notified, and elevated to IT's attention, speeding analysis and greatly minimizing the effort required to assess risk across today's hybrid IT environments.

The following is the minimum set of features to look for in any privilege threat analytics solution:

  • Immediate visibility with a flexible, holistic view of access activity across an enterprise-wide IT network and extended partner ecosystem. Look for threat analytics applications that provide dashboards and interactive widgets to better understand the context of IT risk and access patterns across your IT infrastructure. Applications that let you tailor security policies to every user's behavior and automatically flag risky actions or access attempts give you immediate visibility into account risk, eliminating the overhead of sifting through millions of log files and massive amounts of historical data.
  • Intuitively designed and customizable threat monitoring and investigation screens, workflows, and modules. Machine learning is enabling threat analytics applications to deliver more contextually relevant and data-rich insights than has ever been possible in the past. Look for threat analytics vendors who offer intuitively designed and customizable threat monitoring features that provide insights into anomalous activity with a detailed timeline view. The best threat analytics vendors can identify the specific factors contributing to an anomaly for a comprehensive understanding of a potential threat, all from a single console. Security teams can then view system access and anomaly detection in high resolution with analytics tools such as dashboards, explorer views, and investigation tools.
  • Support for easy integration with Security Information and Event Management (SIEM) tools. Privileged access data is captured and stored to enable querying by log management and SIEM reporting tools. Make sure any threat analytics application you're considering has installed and working integrations with SIEM tools and platforms such as Micro Focus® ArcSight™, IBM® QRadar™, and Splunk® to identify risks or suspicious activity quickly.
  • Alert notification through integration with webhook-enabled endpoints. Businesses getting the most value out of their threat analytics applications are integrating with Slack or existing onboard incident response systems such as PagerDuty to enable real-time alert delivery, eliminating the need for multiple alert touch points and improving time to respond. When an alert event occurs, the threat analytics engine allows the user to send alerts into third-party applications via webhook. This capability enables the user to respond to a threat alert and contain the impact of a breach attempt.
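The webhook-style alert delivery described in the list above can be sketched with the Python standard library alone: package the alert as JSON and POST it to the endpoint. The URL and payload fields below are placeholders, not any vendor's actual schema.

```python
import json
import urllib.request

def build_alert_request(url: str, alert: dict) -> urllib.request.Request:
    """Package a threat alert as a JSON POST for a webhook-enabled
    endpoint (e.g., a Slack incoming webhook or an incident-response
    tool). The payload shape is illustrative, not a vendor schema."""
    body = json.dumps(alert).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send_webhook_alert(url: str, alert: dict) -> int:
    # Deliver the alert and return the HTTP status code.
    with urllib.request.urlopen(build_alert_request(url, alert), timeout=5) as resp:
        return resp.status

alert = {
    "severity": "high",
    "user": "svc-backup",
    "anomaly": "privilege elevation outside change window",
}
# send_webhook_alert("https://hooks.example.com/T000/B000", alert)
```

Separating request construction from delivery keeps the payload testable without a live endpoint, and the same pattern extends to retry logic or signing headers if an endpoint requires them.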

Conclusion 
Centrify, Forrester, Gartner, and Verizon each have used different methodologies and reached the same conclusion from their research: privileged access abuse is the most commonly used tactic for hackers to exfiltrate sensitive data. Breaches based on privileged credential abuse are extremely difficult to stop, as these credentials often have the greatest levels of trust and access rights associated with them. Leveraging threat analytics applications using machine learning that is adept at finding anomalies in behavioral data and thwarting a breach by denying access is proving very effective against privileged credential abuse.

Companies, including Centrify, use risk scoring combined with adaptive MFA to empower a least-privilege access approach based on Zero Trust. This Zero Trust Privilege approach verifies who or what is requesting privileged access, the context behind the request, and the risk of the access environment to enforce least privilege. These are the foundations of Zero Trust Privilege and are reflected in how threat analytics apps are being created and improved today.

Smart Machines Are The Future Of Manufacturing


  • Industrial Internet of Things (IIoT) presents integration architecture challenges that once solved can enable use cases that deliver fast-growing revenue opportunities.
  • ISA-95 addressed the rise of global production and distributed supply chains yet is still deficient on the issue of data and security, specifically the proliferation of IIoT sensors, which are the real security perimeter of any manufacturing business.
  • Finding new ways to excel at predictive maintenance and cross-vendor shop floor integration are the most promising applications.
  • IIoT manufacturing systems are quickly becoming digital manufacturing platforms that integrate ERP, MES, PLM and CRM systems to provide a single unified view of product configurations.

These and many other fascinating insights are from an article McKinsey published titled IIoT platforms: The technology stack as value driver in industrial equipment and machinery, which explores how the Industrial Internet of Things (IIoT) is redefining industrial equipment and machinery manufacturing. It's based on a thorough study also published this month, Leveraging Industrial Software Stack Advancement For Digital Transformation. A copy of the study is downloadable here (PDF, 50 pp., no opt-in). The study shows how smart machines are the future of manufacturing, exploring how IIoT platforms are enabling greater machine-level autonomy and intelligence.

The following are the key takeaways from the study:

  • Capturing IIoT's full value potential will require more sophisticated integrated approaches than current automation protocols provide. IIoT manufacturing systems are quickly becoming digital manufacturing platforms that integrate ERP, MES, PLM, and CRM systems to provide a single unified view of product configurations and support the design-to-manufacturing process. Digital manufacturing platforms are already enabling real-time monitoring down to the machine and shop floor level. The data streams that real-time monitoring delivers today are the catalyst for greater real-time analytics accuracy, broader machine learning adoption and precision, and a deeper integration strategy down to the PLC level on legacy machinery. Please click on the graphic to expand for easier reading.

  • Inconsistent data structures at the machine, line, factory, and company levels are slowing down data flows and making full transparency difficult to attain today for many manufacturers. Smart machines with their own operating systems that orchestrate IIoT data and ensure data structure accuracy are being developed and sold now, making this growth constraint less of an issue. The millions of legacy industrial manufacturing systems will continue to impede IIoT from realizing its full potential, however. The following graphic reflects the complexities of making an IIoT platform consistent across a manufacturing operation. Please click on the graphic to expand for easier reading.

  • Driven by price wars and commoditized products, manufacturers have no choice but to pursue smart, connected machinery that enables IIoT technology stacks across shop floors. The era of the smart, connected machines is here, bringing with it the need to grow services and software revenue faster than transaction-based machinery sales. Machinery manufacturers are having to rethink their business models and redefine product strategies to concentrate on operating system-like functionality at the machine level that can scale and provide a greater level of autonomy, real-time data streams that power more accurate predictive maintenance, and cross-vendor shop floor integration. Please click on the graphic for easier reading.

  • Machines are being re-engineered starting with software and services as the primary design goals to support new business models. Machinery manufacturers are redefining existing product lines to be more software- and services-centric. A few are attempting to launch subscription-based business models that enable them to sell advanced analytics of machinery performance to customers. The resulting IIoT revenue growth will be driven by platforms as well as software and application development and is expected to be in the range of 20 to 35%. Please click on the graphic to expand for easier reading.
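The data-consistency challenge in the takeaways above is, at its core, an adapter problem: each legacy machine reports telemetry in its own structure, and the platform must normalize everything into one schema before line- or factory-level analytics can work. The sketch below illustrates the idea; the vendors, field names, and schema are invented for this example, not drawn from ISA-95 or any real platform.

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    """Unified record an IIoT platform might normalize machine data
    into. Field names are illustrative, not from any standard."""
    machine_id: str
    metric: str
    value: float
    unit: str

def normalize(vendor: str, payload: dict) -> Telemetry:
    # Each legacy machine reports in its own structure; per-vendor
    # adapters map payloads to the shared schema so line- and
    # factory-level analytics see consistent data.
    if vendor == "vendor_a":
        return Telemetry(payload["id"], "spindle_temp", payload["tempC"], "C")
    if vendor == "vendor_b":
        temp_c = (payload["TEMP_F"] - 32) * 5 / 9  # convert to Celsius
        return Telemetry(payload["MACHINE"], "spindle_temp", temp_c, "C")
    raise ValueError(f"no adapter for {vendor}")

a = normalize("vendor_a", {"id": "M-101", "tempC": 71.5})
b = normalize("vendor_b", {"MACHINE": "M-202", "TEMP_F": 212.0})
# Both records now share one schema and one unit.
```

Smart machines that emit the shared schema natively remove the adapter layer, which is exactly the growth constraint the study says new machine operating systems are easing.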

What Matters Most In Business Intelligence, 2019

  • Improving revenues using BI is now the most popular objective enterprises are pursuing in 2019.
  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today.
  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout enterprises today.
  • Tech companies’ Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing.

These and many other fascinating insights are from Dresner Advisory Associates' 10th edition of its popular Wisdom of Crowds® Business Intelligence Market Study. The study is noteworthy in that it provides insights into how enterprises are expanding their adoption of Business Intelligence (BI) from centralized strategies to tactical ones that seek to improve daily operations. The Dresner research team's broad assessment of the BI market makes this report unique, including their use of visualizations that provide a strategic view of market trends. The study is based on interviews with respondents from the firm's research community of over 5,000 organizations as well as vendors' customers and qualified crowdsourced respondents recruited over social media. Please see pages 13 – 16 for the methodology.

Key insights from the study include the following:

  • Operations, Executive Management, Finance, and Sales are primarily driving Business Intelligence (BI) adoption throughout their enterprises today. More than half of the enterprises surveyed see these four departments as the primary initiators or drivers of BI initiatives. Over the last seven years, Operations departments have most increased their influence over BI adoption, more than any other department included in the current and previous survey. Marketing and Strategic Planning are also the most likely to be sponsoring BI pilots and looking for new ways to introduce BI applications and platforms into use daily.

  • Tech companies' Operations & Sales teams are the most effective at driving BI adoption across industries surveyed, with Advertising driving BI adoption across Marketing. Retail/Wholesale and Tech companies' sales leadership is primarily driving BI adoption in their respective industries. It's not surprising to see the leading influencer among Healthcare respondents is resource-intensive HR. The study found that Executive Management most often drives business intelligence adoption in consulting practices.

  • Reporting, dashboards, data integration, advanced visualization, and end-user self-service are the most strategic BI initiatives underway in enterprises today. Second-tier initiatives include data discovery, data warehousing, data mining/advanced algorithms, and data storytelling. Comparing the last four years of survey data, Dresner's research team found reporting retains all-time high scores as the top priority, and data storytelling, governance, and data catalog hold momentum. Please click on the graphic to expand for easier reading.

  • BI software providers most commonly rely on executive-level personas to design their applications and add new features. Dresner's research team found all vertical industries except Business Services target business executives first in their product design and messaging. Given the customer-centric nature of advertising and consulting services business models, it is understandable why BI vendors selling into those industries rely primarily on customer personas. The following graphic compares targeted users for BI by industry.

  • Improving revenues using BI is now the most popular objective in 2019, despite BI initially being positioned as a solution for compliance and risk management. Executive Management, Marketing/Sales, and Operations are driving the focus on improving revenues this year. Nearly 50% of enterprises now expect BI to deliver better decision making, making reporting and dashboards must-have features. Interestingly, enterprises aren’t looking to BI as much for improving operational efficiencies, cost reductions, or competitive advantage. Over the last 12 to 18 months, more tech manufacturing companies have initiated new business models that require their operations teams to support a shift from product to services revenues. An example of this shift is the introduction of smart, connected products that provide real-time data serving as the foundation for future services strategies. Please click on the graphic to expand for easier reading.

  • In aggregate, BI is achieving its highest levels of adoption in R&D, Executive Management, and Operations departments today. The growing complexity of products and business models in tech companies, along with the increasing reliance on analytics and BI in retail/wholesale to streamline supply chains and improve buying experiences, are contributing factors to the increasing levels of BI adoption in these three departments. The following graphic compares BI’s level of adoption by function today.

  • Enterprises with the largest BI budgets this year are investing more heavily into dashboards, reporting, and data integration. Conversely, those with smaller budgets are placing a higher priority on open source-based big data projects, end-user data preparation, collaborative support for group-based decision-making, and enterprise planning. The following graphic provides insights into technologies and initiatives strategic to BI at an enterprise level by budget plans.

  • Marketing/Sales and Operations are using the greatest variety of BI tools today. The survey shows how conversant Operations professionals are with the BI tools in use throughout their departments: every one of them knows how many, and most likely which types of, BI tools are deployed. Across all industries, Research & Development (R&D), Business Intelligence Competency Center (BICC), and IT respondents are the most likely to report having multiple tools in use.

How The Top 21% Of PAM-Mature Enterprises Are Thwarting Privileged Credential Breaches

  • Energy, Technology & Finance are the most mature industries when it comes to Privileged Access Management (PAM) adoption and uses, outscoring peer industries by a wide margin.
  • 58% of organizations do not use Multi-Factor Authentication (MFA) for privileged administrative access to servers, leaving their IT systems and infrastructure exposed to hacking attempts, including unchallenged privileged access abuse.
  • 52% of organizations are using shared accounts for controlling privileged access, increasing the probability of privileged credential abuse.

These and many other fascinating insights are from the recently published Centrify 2019 Zero Trust Privilege Maturity Model Report created in partnership with Techvangelism. You can download a copy of the study here (PDF, 22 pp., no opt-in). Over 1,300 organizations participated in the survey from 11 industries with Technology, Finance, and Healthcare, comprising 50% of all organizations participating. Please see page 4 of the study for additional details regarding the methodology.

What makes this study noteworthy is that it’s the first of its kind to create a Zero Trust Privilege Maturity Model designed to help organizations better understand and define their ability to discover, protect, secure, manage, and provide privileged access. The model can also be used to help mature existing security implementations toward ones that provide the greatest level of protection of identity, privileged access, and its use.

Key takeaways from the study include the following:

  • The top 21% of enterprises that excel at thwarting privileged credential breaches share a common set of attributes that differentiate them from their peers. The enterprises most successful at stopping security breaches have progressed beyond vault- and identity-centric techniques by hardening their environments through centralized management of service and application accounts and enforcing host-based session, file, and process auditing. In short, the most secure organizations globally have reached a level of Privileged Access Management (PAM) maturity that reduces the probability of a breach occurring due to privileged credential abuse.

  • Energy, Technology & Finance are the most mature industries adopting Privileged Access Management (PAM), outscoring peer industries by a wide margin. Government, Education, and Manufacturing are the industries most lagging in their adoption of Zero Trust Privilege (ZTP), making them the most vulnerable to breaches caused by privileged credential abuse. Education and Manufacturing are the most vulnerable industries of all, where it’s common for multiple manufacturing sites to use shared accounts for controlling privileged access. The study found that using shared accounts to control privileged access is commonplace, with 52% of all organizations reporting it occurs often. Presented below are the relative levels of Zero Trust Privilege Maturity by demographics, with the largest organizations having the most mature approaches to ZTP, which is expected given the size and scale of their IT and cybersecurity departments.

  • 51% of organizations do not control privileged access to transformational technologies, including modern attack surfaces such as cloud workloads (38%), Big Data projects (65%), and containers (50%). Artificial Intelligence (AI)/Bots and the Internet of Things (IoT) are two of the most vulnerable threat surfaces, according to the 1,300 organizations surveyed. Just 16% of organizations have implemented a ZTP strategy to protect their AI/Bots technologies, and just 25% have implemented one for IoT. The graphic below compares usage or plans by transformational technology.

  • 58% of organizations aren’t using MFA for server login, and 25% have no plans for a password vault, two areas that are the first steps to defining a Privileged Access Management (PAM) strategy. Surprisingly, 26% do not use and do not plan to use MFA for server login, while approximately 32% do plan to use MFA for server logins. Organizations are missing out on opportunities to significantly harden their security posture by adopting password vaults and implementing MFA across all server logins. These two areas are essential for implementing a ZTP framework.
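To make the MFA step more concrete, the sketch below derives a standard time-based one-time password (TOTP, RFC 6238, SHA-1 variant). It illustrates how MFA codes work in general and is not a description of any specific vendor's implementation mentioned in the report:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238, SHA-1 variant)."""
    counter = unix_time // step                      # number of 30-second intervals elapsed
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter (RFC 4226)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret; at t=59s the code matches the test vector
print(totp(b"12345678901234567890", 59))  # → 287082
```

In a real deployment the secret is provisioned per user (e.g., via a QR code into an authenticator app), and the server compares the submitted code against the one it derives for the current time window.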

Conclusion

To minimize threats – both external and internal – Privileged Access Management needs to go beyond the fundamental gateway-based model and look to encompass host-enforced privileged access that addresses every means by which the organization leverages privileged credentials. With just 21% of organizations succeeding with mature Zero Trust Privilege deployments, 79% are vulnerable to privileged credential abuse-based breaches that are challenging to stop. Privileged credentials are the most trusted in an organization, allowing internal and external hackers the freedom to move throughout networks undetected. That’s why understanding where an organization is on the spectrum of ZTP maturity is so important, and why the findings from the Centrify and Techvangelism 2019 Zero Trust Privilege Maturity Model Report are worth noting and taking action on.

How To Get Your Data Scientist Career Started

The most common request from this blog’s readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I’ve invited Alyssa Columbus, a Data Scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first job in data science, isn’t easy, given the surplus of analytics job-seekers relative to open analytics positions.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I’ve referred them to many different resources, including discussions I’ve had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I’ve learned works, I’ve put together the five most valuable lessons learned. I’ve written this article to make your data science job hunt as easy and efficient as possible.

  • Continuously build your statistical literacy and programming skills. Currently, there are 24,697 open Data Scientist positions on LinkedIn in the United States alone. As of April 14, the top 3 most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and TensorFlow. The following graphic, created by using data mining techniques to analyze all open positions in the U.S., provides a prioritized list of the top 10 most in-demand data science skills mentioned in LinkedIn job postings today. Please click on the graphic to expand for easier viewing.

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn’s job postings prioritize. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and never applying what you’ve learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for the big questions to ask of your dataset. Statistical literacy, or “how” to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.
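As a small illustration of that hands-on habit, here is a minimal sketch of asking basic questions of a dataset using only Python's standard library. The dataset values are invented for the example:

```python
import statistics

# Hypothetical dataset: monthly sign-ups for a product (illustrative values only)
signups = [120, 135, 128, 160, 210, 205, 198, 240, 260, 255, 300, 310]

# Start with the simple questions: what is typical, and how much does it vary?
mean = statistics.mean(signups)
median = statistics.median(signups)
stdev = statistics.stdev(signups)

# A follow-up question the summary stats suggest: is there a trend over the year?
first_half = statistics.mean(signups[:6])
second_half = statistics.mean(signups[6:])
growth = (second_half - first_half) / first_half

print(f"mean={mean:.1f} median={median:.1f} stdev={stdev:.1f} growth={growth:.0%}")
```

The point is less the arithmetic than the habit: each answer (here, a large half-over-half growth) should prompt the next question, such as whether the trend is seasonal or driven by a single month.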

  • Continually build your own unique portfolio of analytics and machine learning projects. Having a good portfolio is essential to being hired as a data scientist, especially if you don’t come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and the skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you’re the most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you’ve completed on your own. Some skills I’d recommend you highlight in your portfolio include:
    • Your programming language of choice (e.g., Python, R, Julia, etc.).
    • The ability to interact with databases (e.g., your ability to use SQL).
    • Visualization of data (static or interactive).
    • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
    • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard).
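For the database-interaction skill listed above, even a tiny self-contained script can demonstrate SQL fluency in a portfolio. The sketch below uses Python's built-in sqlite3 module; the table and values are invented for illustration:

```python
import sqlite3

# In-memory database with a hypothetical sales table (illustrative data only)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("West", 120.0), ("West", 80.0), ("East", 200.0), ("East", 150.0)],
)

# A portfolio-style question: total and average sales per region
rows = conn.execute(
    "SELECT region, SUM(amount), AVG(amount) FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()

for region, total, avg in rows:
    print(f"{region}: total={total}, avg={avg}")
conn.close()
```

A portfolio version of this idea would swap the toy table for a real public dataset and pair the queries with a short written interpretation of the results.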

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.

  • Get (or git!) yourself a website. If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website.  Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all it’s free to do. A strong online presence will not only help you in applying for jobs, but organizations may also reach out to you with freelance projects, interviews, and other opportunities.
  • Be confident in your skills and apply for any job you’re interested in, starting with opportunities available in your network. If you don’t meet all of a job’s requirements, apply anyway. You don’t have to know every skill (e.g., programming languages) on a job description, especially if there are more than ten listed. If you’re a great fit for the main requirements of the job description, you should apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you’re hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I’ve found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specializing in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:
    • Recruiters
    • Friends, family, and colleagues
    • Career fairs and recruiting events
    • General job boards
    • Company websites
    • Tech job boards

Alyssa Columbus is a Data Scientist at Pacific Life and member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.

The State Of 3D Printing, 2019

  • Proof of concept and prototyping dominate 3D printing applications in 2019.
  • 80% of enterprises say 3D printing is enabling them to innovate faster.
  • 51% of enterprises are actively using 3D printing in production.

These and many other fascinating insights are from Sculpteo’s 5th edition of their popular study, The State of 3D Printing (29 pp., PDF, opt-in). The study’s methodology is based on interviews with 1,300 respondents from Europe (64%), the United States (16.6%), and Asia (20.2%), which is the fastest-growing region internationally as measured by this survey over five years. Eight industries are included in the research design: Industrial Goods (13.6%), High Tech (10.6%), Services (9.9%), Consumer Goods (8.6%), Health & Medical (6.2%), Automotive (5.7%), Aerospace & Defense (5.5%), and Education (4.9%). For additional details on the methodology, please see pages 6 and 7 of the study. Key takeaways from the survey include the following:

  • Proof of concept and prototyping dominate 3D printing applications in 2019. Manufacturers are increasing their reliance on 3D printing as part of their broader manufacturing strategies, with production use rising to 51% of all respondents, up from 38.7% in 2018. The following compares 2019’s purpose of 3D prints versus the last five years of survey data. Please click on the graphic to expand for easier reading.

  • Accelerating product development continues to be enterprises’ top focus guiding their 3D printing strategies in 2019. Mass customization and support for configure-to-order and engineer-to-order product strategies also continue to be a high priority this year, continuing the trend that began in 2015. Increasing production flexibility is the third area of focus guiding additive manufacturing strategies today. Please click on the graphic to expand for easier reading.

  • Nearly 50% of enterprises say that quality control is their top challenge in using 3D printers. As enterprises increase their adoption of 3D printing to accelerate their additive manufacturing strategies, quality is becoming increasingly important. Manufacturers largely define their success by the perceived quality of the products they deliver to their customers, which makes quality control an increasingly necessary benefit of 3D printing. Please click on the graphic to expand for easier reading.

  • Adopting a design-to-manufacturing strategy accelerates new product development and innovation, which is why CAD design leads all other activities today. When respondents were asked in which areas related to 3D printing and additive manufacturing they spend the majority of their time, nearly 50% said CAD design. Building prototypes, research, and testing prototypes are also areas in which enterprises adopting additive manufacturing are investing today. Please click on the graphic to expand for easier reading.

  • Additive manufacturing adoption is growing across shop floors globally, evidenced by more than 70% of enterprises finding new applications for 3D printing in 2019 and 60% using CAD, simulation, and reverse engineering internally. The leading indicators of additive manufacturing becoming more pervasively adopted across global shop floors are shown in the following graphic. New uses for 3D printing, experimentation with new materials, and extensive CAD design integration combined with simulation and reverse engineering provide further evidence of how ingrained additive manufacturing is becoming in daily production processes. 3D printing is now most commonly used alongside CNC machining, another strong indicator of how essential additive manufacturing is becoming to the production process. Please click on the graphic to expand for easier reading.

  • 3D printing’s innate strengths in producing items with complex geometries quickly and iteratively are the two leading benefits of 3D printing in 2019. More than 40% of enterprises say that rapid iteration of prototypes and lead time reductions are the leading benefits, followed by mass customization (support for configure-to-order & engineer-to-order product strategies) and cost savings. Please click on the graphic to expand for easier reading.

  • 80% of high tech manufacturing respondents rely on 3D printing for prototyping, leading all industries in this category, and 47% use 3D printing to accelerate product development. High tech manufacturers are above average in experimenting with new 3D printing materials and technologies, looking for greater competitive strength in their industry. Please click on the graphic to expand for easier reading.

  • North American-based enterprises see the ability to support complex product concepts (complex geometries), speed (quick iterations), scale (mass customization), and cost savings as the top benefits of 3D printing. Sculpteo’s survey found that North American enterprises are more optimistic about the potential for 3D printing to become mainstream in production environments. While budget and physical space are the two most significant barriers enterprises face in adopting 3D printing at scale, their optimistic outlook on the technology’s future is driving greater adoption on the shop floor. Please click on the graphic to expand for easier reading.

Seven Things You Need To Know About IIoT In Manufacturing

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years.
  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies.
  • Connected IoT technologies are enabling a new era of smart, connected products that often expand on the long-proven platforms of everyday products. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.

These and many other fascinating insights are from IoT Analytics’ study, IIoT Platforms For Manufacturing 2019 – 2024 (155 pp., PDF, client access reqd). IoT Analytics is a leading provider of market insights for the Internet of Things (IoT), M2M, and Industry 4.0. They specialize in providing insights on IoT markets and companies, focused market reports on specific IoT segments, and go-to-market services for emerging IoT companies. The study’s methodology includes interviews with twenty of the leading IoT platform providers, executive-level IoT experts, and IIoT end users. For additional details on the methodology, please see pages 136 and 137 of the report. IoT Analytics defines the Industrial IoT (IIoT) as heavy industries, including manufacturing, energy, oil and gas, and agriculture, in which industrial assets are connected to the internet.

The seven things you need to know about IIoT in manufacturing include the following:

  • IoT Analytics’ technology architecture of the Internet of Things reflects the proliferation of new products, software and services, and the practical needs manufacturers have for proven integration to make the Industrial Internet of Things (IIoT) work. IoT technology architectures are in their nascent phase, showing signs of potential in solving many of manufacturing’s most challenging problems. IoT Analytics’ technology architecture shown below is designed to scale in response to the diverse development across the industry landscape with a modular, standardized approach.

  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies. IoT Analytics is seeing IIoT platforms begin to replace existing industrial software systems that had been created to bridge the IT and OT gaps in manufacturing environments. Their research teams are finding that IIoT Platforms are an adjacent technology to these typical industrial software solutions but are now starting to replace some of them in smart connected factory settings. The following graphic explains how IoT Analytics sees the IIoT influence across the broader industrial landscape:

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years. IoT Analytics finds that manufacturing is the largest IoT platform industry segment and will continue to be one of the primary growth catalysts of the market through 2024. For purposes of their analysis, IoT Analytics defines manufacturing as standardized production environments, including factories and workshops, in addition to custom production worksites such as mines, offshore oil and gas rigs, and construction sites. The IIoT platforms for manufacturing segment has experienced growth in traditionally large manufacturing-base countries such as Japan and China. IoT Analytics relies on econometric modeling to create their forecasts.
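The 40% figure can be checked directly with the standard CAGR formula, treating 2018 to 2024 as six compounding periods. The dollar figures are from the report; the rest is arithmetic:

```python
def cagr(start_value: float, end_value: float, periods: int) -> float:
    """Compound annual growth rate over the given number of compounding periods."""
    return (end_value / start_value) ** (1 / periods) - 1

# IIoT platforms for manufacturing: $1.67B (2018) -> $12.44B (2024)
growth = cagr(1.67, 12.44, periods=6)
print(f"{growth:.0%}")  # → 40%
```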

  • In 2018, the industrial IoT platforms market for manufacturing had an approximate 60%/40% split between within-factory and outside-factory deployments, respectively. IoT Analytics predicts this split will remain mostly unchanged for 2019, and that by 2024 within-factory deployments will gain a few percentage points. The within-factories segment of IIoT Platforms for Manufacturing is estimated to grow from a $1B market in 2018 to a $1.5B market by 2019, driven by an ever-increasing amount of automation (e.g., robots on the factory floor) being introduced to factory settings for increased efficiencies, while the outside-factories segment is forecast to grow from $665M in 2018 to become a $960M market by 2019.

  • Discrete manufacturing is predicted to be the largest percentage of Industrial IoT platform spending for 2019, growing at a CAGR of 46% from 2018. Discrete manufacturing will outpace batch and process manufacturing, becoming 53% of all IIoT platform spending this year. IoT Analytics sees discrete manufacturers pursuing make-to-stock, make-to-order, and assemble-to-order production strategies that require sophisticated planning, scheduling, and tracking capabilities to improve operations and profitability. The greater the production complexity in discrete manufacturing, the more valuable data becomes. Discrete manufacturing is one of the most data-prolific industries there are, making it an ideal catalyst for IIoT platform’s continual growth.

  • Manufacturers rely on IIoT platforms most heavily for general process optimization (43.1%), general dashboards & visualization (41.1%), and condition monitoring (32.7%). Batch, discrete, and process manufacturers are also prioritizing use cases such as predictive maintenance, asset tracking, and energy management, as all three areas make direct contributions to improving shop floor productivity. Discrete manufacturers are always looking to free up extra time in production schedules so that they can offer short-notice production runs to their customers. Combining IIoT platform use cases to uncover process and workflow inefficiencies, so that more short-notice production runs can be sold, is driving Proofs of Concept (PoC) in North American manufacturing today.
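As an illustration of the condition-monitoring use case, a common starting point is a rolling-average threshold check over sensor readings. The sketch below is a generic example with invented sensor values and thresholds, not any specific IIoT platform's API:

```python
from collections import deque

def detect_anomalies(readings, window=5, threshold=1.25):
    """Flag readings that exceed the rolling mean of the prior window by the given factor."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if value > baseline * threshold:
                alerts.append((i, value))   # index and offending reading
        recent.append(value)
    return alerts

# Hypothetical vibration readings from a machine on the shop floor
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 2.1, 1.0, 0.98, 1.0]
print(detect_anomalies(vibration))  # → [(6, 2.1)]
```

Production condition-monitoring systems layer far more on top of this idea (multiple sensors, statistical control limits, ML-based models), but the core pattern of comparing live readings against a learned baseline is the same.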

  • IIoT platform early adopters prioritize security as the most important feature, ahead of scalability and usability. Identity and Access Management, multi-factor authentication, consistency of security patch updates, and the ability to scale and protect every threat surface across an IIoT network are high priorities for IIoT platform adopters today. Scalability and usability are the second and third priorities. The following graphic compares IIoT platform adopters’ most important needs:

For more information on the insights presented here, check out IoT Analytics’ report: IIoT Platforms For Manufacturing 2019 – 2024.
