
Predicting The Future Of Next-Gen Access And Zero Trust Security In 2019

Bottom Line:  The most valuable catalyst all digital businesses need to continue growing in 2019 is a Zero Trust Security (ZTS) strategy based on Next-Gen Access (NGA) that scales to protect every access point to corporate data, recognizing that identities are the new security perimeter.

The faster any digital business is growing, the more identities, devices and network endpoints proliferate. The most successful businesses of 2019 and beyond are actively creating entirely new digital business models today. They're actively recruiting and onboarding needed experts regardless of geographic location and exploring new sourcing and patent ideas with R&D partners globally. Businesses are digitally transforming themselves at a faster rate than ever before. Statista projects businesses will spend $190B on digital transformation in 2019, soaring to $490B by 2025, attaining a 14.4% Compound Annual Growth Rate (CAGR) in six years.

Security Perimeters Make Or Break A Growing Business

80% of IT security breaches involve privileged credential access, according to a recent Forrester study. The Verizon Mobile Security Index 2018 Report found that 89% of organizations are relying on just a single security strategy to keep their mobile networks safe. The average data breach cost a company $3.86M in 2018, up 6.4% from $3.62M in 2017, according to IBM Security's 2018 Cost of a Data Breach Study.

The hard reality for any digital business is realizing that its greatest growth asset is how well it protects the constantly expanding perimeter of the business. Legacy approaches to securing infrastructure that rely on trusted and untrusted domains can't scale to protect every identity and device that comprises a company's rapidly changing new security perimeter. All these factors and more are why Zero Trust Security (ZTS) enabled by Next-Gen Access (NGA) is as essential to digital businesses' growth as their product roadmaps, pricing strategies, and services, with Idaptive being an early leader in the market. To learn more about Identity-as-a-Service, please see the Forrester report, The Forrester Wave™: Identity-As-A-Service, Q4 2017 (client access required).

Predicting The Future Of Next-Gen Access And Zero Trust Security

The following are predictions of how Next-Gen Access (NGA) powered by Zero Trust Security (ZTS) will evolve in 2019:

  • Behavior-based scoring algorithms will improve markedly in 2019, improving the user experience by calculating risk scores with greater precision than before. Thwarting attacks starts with a series of behavior-based algorithms that calculate a risk score based on a wide variety of variables including past access attempts, device security posture, operating system, location, time of day, and many other measurable factors (a minimal scoring sketch follows this list). Expect these algorithms, and the risk scores they generate using machine learning techniques, to improve from an accuracy and contextual intelligence standpoint in 2019. Leading companies in the field including Idaptive are actively investing in machine learning technologies to accomplish this today.
  • Multifactor Authentication (MFA) adoption will soar as digital businesses seek to protect new R&D projects, patents in progress, roadmaps, and product plans. State-sponsored hacking organizations and organized crime see the intellectual property in fast-growing digital businesses as among the most valuable assets they can exfiltrate and sell on the Dark Web. MFA, one of the most effective single defenses against compromised passwords, will be adopted by the most successful businesses in AI, aerospace & defense, chip design for cellular and IoT devices, e-commerce, enterprise software and more.
  • Smart, connected products without adequate security designed in will proliferate in 2019, further challenging the security perimeters of digital businesses. The era of smart, connected products is here, with Capgemini estimating the size of the connected products market will be $519B to $685B by 2020. Manufacturers expect close to 50% of their products to be smart, connected products by 2020, according to Capgemini's Digital Engineering: The new growth engine for discrete manufacturers. The study is downloadable here (PDF, 40 pp., no opt-in). With every smart, connected device creating a new threat surface for a company, expect to see at least one device manufacturer design in Zero Trust Security (ZTS) support at the board level to increase its sales into enterprises by reducing the threat of a breach starting from its device.
  • Seeking greater track-and-traceability, healthcare and medical products supply chains will adopt Zero Trust Security (ZTS). What's going to make this an urgent issue in healthcare and medical products are the combined effects of greater regulatory reporting and compliance requirements and the pressure to improve time-to-market for new products and delivery accuracy for current customers. The pillars of ZTS are a perfect fit for healthcare and medical supply chains' need for track-and-traceability. These pillars are real-time user verification, device validation, and intelligently limiting access, while also learning and adapting to verified user behaviors.
  • Real-time security analytics services are going to thrive in 2019 as digital businesses seek insights into how they can fine-tune their ZTS strategies across every threat surface, and as machine learning algorithms improve. Many enterprises are in for an epiphany in 2019 when they see just how many potential breaches they've stopped using a combination of security strategies including Single Sign-On (SSO) and Multi-factor Authentication (MFA). Machine learning algorithms will continue to improve using behavior-based scoring, further improving the user experience. Leaders in the field include Idaptive, which is setting a rapid pace of innovation in real-time security analytics services.
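To make the behavior-based scoring idea above concrete, here is a minimal Python sketch of how a risk engine might combine login signals into a single score. The signal names, weights, and thresholds are illustrative assumptions, not any vendor's actual model.

```python
# Minimal sketch of behavior-based risk scoring; weights and thresholds are
# illustrative assumptions, not any vendor's production model.
def risk_score(signals: dict) -> float:
    weights = {
        "new_device": 0.30,        # device never seen for this identity
        "unusual_location": 0.25,  # login geography deviates from history
        "off_hours": 0.15,         # time of day outside the user's norm
        "recent_failures": 0.20,   # repeated failed access attempts
        "outdated_os": 0.10,       # weak device security posture
    }
    return min(sum(w for k, w in weights.items() if signals.get(k)), 1.0)

attempt = {"new_device": True, "unusual_location": True, "outdated_os": True}
score = risk_score(attempt)
# Step up to MFA or block as the score rises, instead of treating every login equally
action = "block" if score > 0.6 else "require_mfa" if score > 0.3 else "allow"
print(score, action)  # 0.65 block
```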

Conclusion

Security is at an inflection point today. Long-standing methods of protecting IT systems and a business's assets can't scale to protect every new identity, device or threat surface. When every identity is a new security perimeter, a new approach to securing any digital business is needed. The pillars of ZTS, real-time user verification, device validation, and intelligently limiting access while learning and adapting to verified user behaviors, are proving effective at thwarting breaches and securing companies' digital assets of all kinds. It's time for more digital businesses to see security as the growth catalyst it is and take action now to ensure their operations continue to flourish.


Microsoft Leads The AI Patent Race Going Into 2019

  • There have been over 154,000 AI patents filed worldwide since 2010, with the largest shares in health fields (29.5%), industry-specific solutions (25.3%) and AI-based digital security (15.7%).
  • AI-based marketing patents are the fastest-growing global category, reaching a Compound Annual Growth Rate (CAGR) of 29.3% between 2010 and 2018.
  • The second- and third-fastest growing global AI patent categories between 2010 and 2018 are AI-based digital security (23.4% CAGR) and AI-based mobility (23% CAGR).
  • 79,936 patents were filed in the United States between 2010 and 2018, with the largest share in the health field (32.6%), followed by industry-specific solutions (20.5%) and AI-based digital security (18%).
  • Machine learning dominates the AI patent landscape today, leading all categories of AI patents including deep learning and neural networks.

These and many other insights are from an excellent presentation recently given by Kai Gramke, Managing Director of EconSight titled Artificial Intelligence As A Key Technology and Driver of Technological Progress. EconSight clients include the Swiss Federal Council, German Federal Chancellery, leading European think tanks, research institutes and half of the German DAX-30 companies.  The presentation and information shared in this post were generated using the PatentSight analytics platform. PatentSight is a LexisNexis company and you can learn more about them here.  The following are the key takeaways from Kai’s recent research and presentation using PatentSight:

  • EconSight finds that Microsoft leads the AI patent race going into 2019 with 697 world-class patents that the firm classifies as having a significant competitive impact as of November 2018. Out of the top 30 companies and research institutions as defined by EconSight in their recent analysis, Microsoft has created 20% of all patents in the global group of patent-producing companies and institutions. The following graphic provides a comparison of the top 30 in the group. Please click on the graphic to expand it for easier reading.

  • Machine learning dominates the AI patent landscape today, leading all categories of AI patents including deep learning and neural networks.  Machine learning is based on the foundational concepts of Bayesian analysis, data mining, and predictive analytics. Machine learning algorithms and the applications they rely on are designed to find patterns in large-scale data sets, while also being able to solve complex, constraint-based problems by learning from the data.  Enterprise software companies including Microsoft, SAP, and others are actively developing AI technologies that integrate into their existing platforms, streamlining adoption across their many customers. Please click on the graphic to expand for easier reading.

  • There have been 225,833 AI-based patents filed globally since 2000, with 30.7% being industry-specific (Industry 4.0 on the graphic below), followed by health-related patents (28.1%). 13.8% of all AI-based patents are for digital security and 11.9% for energy. It's interesting to note that the fastest-growing patent categories between 2000 and 2018 are for applying AI to marketing (22% CAGR) and AI-based digital security (18.8% CAGR). Please click on the graphic to expand for easier reading.

6 Best Practices For Increasing Security In AWS In A Zero Trust World

  • Amazon Web Services (AWS) reported $6.6B in revenue for Q3, 2018 and $18.2B for the first three fiscal quarters of 2018.
  • AWS revenue achieved an impressive 46% year-over-year net sales growth between Q3, 2017 and Q3, 2018 and 49% year-over-year growth for the first three quarters of the year.
  • AWS' 34% market share is bigger than that of its next four competitors combined, with the majority of its gains taken from small-to-medium-sized cloud operators, according to Synergy Research.
  • The many announcements made at AWS Re:Invent this year reflect a growing focus on hybrid cloud computing, security, and compliance.

Enterprises are rapidly accelerating the pace at which they’re moving workloads to Amazon Web Services (AWS) for greater cost, scale and speed advantages. And while AWS leads all others as the enterprise public cloud platform of choice, they and all Infrastructure-as-a-Service (IaaS) providers rely on a Shared Responsibility Model where customers are responsible for securing operating systems, platforms and data.  In the case of AWS, they take responsibility for the security of the cloud itself including the infrastructure, hardware, software, and facilities. The AWS version of the Shared Responsibility Model shown below illustrates how Amazon has defined securing the data itself, management of the platform, applications and how they’re accessed, and various configurations  as the customers’ responsibility:

Included in the list of items where the customer is responsible for security “in” the cloud is identity and access management, including Privileged Access Management (PAM) to secure the most critical infrastructure and data.

Increasing Security for IaaS in a Zero Trust World

Stolen privileged access credentials are the leading cause of breaches today. Forrester found that 80% of data breaches are initiated using privileged credentials, and 66% of organizations still rely on manual methods to manage privileged accounts. And while privileged credentials are the leading cause of breaches, they're often overlooked, not only in protecting traditional enterprise infrastructure but especially when transitioning to the cloud.

Both for on-premises infrastructure and Infrastructure-as-a-Service (IaaS), it's not enough to rely on password vaults alone anymore. Organizations need to augment their legacy Privileged Access Management strategies to include brokering of identities, multi-factor authentication enforcement and "just enough, just-in-time" privilege, all while securing remote access and monitoring all privileged sessions. They also need to verify who is requesting access, the context of the request, and the risk of the access environment. These are all essential elements of a Zero Trust Privilege strategy, with Centrify being an early leader in this space.

6 Ways To Increase Security in AWS

The following are six best practices for increasing security in AWS and are based on the Zero Trust Privilege model:

  1. Vault AWS Root Accounts and Federate Access for AWS Console

Given how powerful the AWS root user account is, it’s highly recommended that the password for the AWS root account be vaulted and only used in emergencies. Instead of local AWS IAM accounts and access keys, use centralized identities (e.g., Active Directory) and enable federated login. By doing so, you obviate the need for long-lived access keys.
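As a hedged illustration of federated login replacing long-lived keys, the boto3 sketch below exchanges a SAML assertion from a corporate identity provider for short-lived AWS credentials. The ARNs and the assertion value are hypothetical placeholders, not references to any real account.

```python
import boto3

# Exchange a SAML assertion from the corporate IdP (e.g., AD FS) for
# short-lived credentials instead of long-lived IAM access keys.
# The ARNs and assertion below are hypothetical placeholders.
saml_assertion = "<base64-encoded SAML response from the IdP>"

sts = boto3.client("sts")
response = sts.assume_role_with_saml(
    RoleArn="arn:aws:iam::123456789012:role/FederatedConsoleAccess",
    PrincipalArn="arn:aws:iam::123456789012:saml-provider/CorpAD",
    SAMLAssertion=saml_assertion,
    DurationSeconds=3600,          # credentials expire after one hour
)
creds = response["Credentials"]    # AccessKeyId, SecretAccessKey, SessionToken
```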

  2. Apply a Common Security Model and Consolidate Identities

When it comes to IaaS adoption, one of the inhibitors for organizations is the myth that IaaS requires a unique security model, as it resides outside the traditional network perimeter. However, conventional security and compliance concepts still apply in the cloud. Why would you need to treat an IaaS environment any differently than your own data center? Roles and responsibilities are still the same for your privileged users. Thus, leverage what you've already got for a common security infrastructure spanning on-premises and cloud resources. For example, extend your Active Directory into the cloud to control AWS role assignment and grant the right amount of privilege.

  3. Ensure Accountability

Shared privileged accounts (e.g., AWS EC2 administrator) are anonymous. Ensure 100% accountability by having users log in with their individual accounts and elevate privilege as required. Manage entitlements centrally from Active Directory, mapping roles and groups to AWS roles.

  4. Enforce Least Privilege Access

Grant users just enough privilege to complete the task at hand in the AWS Management Console, AWS services, and on the AWS instances. Implement cross-platform privilege management for AWS Management Console, Windows and Linux instances.
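One way to express "just enough" privilege is a tightly scoped IAM policy. The sketch below, with a hypothetical policy name, action set, and region condition, grants an operator only EC2 describe/start/stop actions in a single approved region.

```python
import json
import boto3

# Hypothetical least-privilege policy: an operator may only view, start,
# and stop EC2 instances, and only in one approved region.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "ec2:DescribeInstances",
            "ec2:StartInstances",
            "ec2:StopInstances",
        ],
        "Resource": "*",
        "Condition": {"StringEquals": {"aws:RequestedRegion": "us-east-1"}},
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="ec2-operator-least-privilege",  # hypothetical name
    PolicyDocument=json.dumps(policy_document),
)
```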

  5. Audit Everything

Log and monitor both authorized and unauthorized user sessions to AWS instances. Associate all activity to an individual, and report on both privileged activity and access rights. It’s also a good idea to use AWS CloudTrail and Amazon CloudWatch to monitor all API activity across all AWS instances and your AWS account.
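A hedged example of the CloudTrail recommendation: the boto3 calls below create a multi-region trail and turn on logging. The trail and bucket names are hypothetical, and the S3 bucket must already exist with a bucket policy that permits CloudTrail writes.

```python
import boto3

# Create a multi-region trail so API activity in every region is logged.
# Names are hypothetical; the S3 bucket must already exist with a policy
# that allows CloudTrail to write to it.
cloudtrail = boto3.client("cloudtrail")
cloudtrail.create_trail(
    Name="org-audit-trail",
    S3BucketName="example-audit-logs",
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="org-audit-trail")  # begin recording events
```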

  6. Apply Multi-Factor Authentication Everywhere

Thwart in-progress attacks and get higher levels of user assurance. Consistently implement multi-factor authentication (MFA) for AWS service management, on login and privilege elevation for AWS instances, or when checking out vaulted passwords.
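One common way to enforce MFA broadly is an IAM guardrail policy that denies most actions for sessions that did not authenticate with MFA. The sketch below uses the standard aws:MultiFactorAuthPresent condition key; the exact statement scope is an assumption to adapt to your environment.

```python
import json

# Hypothetical guardrail policy: deny everything except the calls needed
# to enroll an MFA device when the session was not MFA-authenticated.
require_mfa_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyAllExceptMFASetupWithoutMFA",
        "Effect": "Deny",
        "NotAction": [
            "iam:CreateVirtualMFADevice",
            "iam:EnableMFADevice",
            "iam:ListMFADevices",
            "sts:GetSessionToken",
        ],
        "Resource": "*",
        "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}},
    }],
}
print(json.dumps(require_mfa_policy, indent=2))
```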

Conclusion

One of the most common reasons AWS deployments are breached is compromised privileged access credentials. The six best practices mentioned in this post are just the beginning; there are many more strategies for increasing security in AWS. Leveraging a solid Zero Trust Privilege platform, organizations can eliminate shared Amazon EC2 key pairs, use auditing to establish accountability at the individual user account level, enforce least privilege access across every login, AWS console, and AWS instance in use, enforce MFA and enable a common security model.

Which Analytics And BI Technologies Will Be The Highest Priority In 2019?

  • 82% of enterprises are prioritizing analytics and BI as part of their budgets for new technologies and cloud-based services.
  • 54% say AI, Machine Learning and Natural Language Processing (NLP) are also a high investment priority.
  • 50% of enterprises say their stronger focus on metrics and Key Performance Indicators (KPIs) company-wide are a major driver of new investment in analytics and BI.
  • 43%  plan to both build and buy AI and machine learning applications and platforms.
  • 42% are seeking to improve user experiences by automating discovery of data insights and 26% are using AI to provide user recommendations.

These and many other fascinating insights are from the recent TDWI Best Practices Report, BI and Analytics in the Age of AI and Big Data. An executive summary of the study is available online here. The entire study is available for download here (39 pp., PDF, free, opt-in). The study found that enterprises are placing a high priority on augmenting existing systems and replacing older technologies and data platforms with new cloud-based BI and predictive analytics ones. Transforming Data with Intelligence (TDWI) is a global community of AI, analytics, data science and machine learning professionals interested in staying current in these and more technology areas as part of their professional development. Please see page 3 of the study for specifics regarding the methodology.

Key takeaways from the study include the following:

  • 82% of enterprises are prioritizing analytics and BI applications and platforms as part of their budgets for new technologies and cloud-based services. 78% of enterprises are prioritizing advanced analytics, and 76% data preparation. 54% say AI, machine learning and Natural Language Processing (NLP) are also a high investment priority. The following graphic ranks enterprises’ investment priorities for acquiring or subscribing to new technologies and cloud-based services by analytics and BI initiatives or strategies. Please click on the graphic to expand for easier reading.

  • Data warehouse or mart in the cloud (41%), data lake in the cloud (39%) and BI platform in the cloud (38%) are the top three types of technologies enterprises are planning to use. Based on this finding and others in the study, cloud platforms are the new normal in enterprises' analytics and BI strategies going into 2019. Cloud data storage (object, file, or block) and data virtualization or federation (both 32%) are the next-most-planned technologies for enterprises investing in analytics and BI initiatives. Please click on the graphic to expand for easier reading.

  • The three most important factors in delivering a positive user experience include good query performance (61%), creating and editing visualizations (60%), and personalizing dashboards and reports (also 60%). The three activities that lead to the least amount of satisfaction are using predictive analytics and forecasting tools (27% dissatisfied), “What if” analysis and deriving new data (25%) and searching across data and reports (24%). Please click on the graphic to expand for easier reading.

  • 82% of enterprises are looking to broaden the base of analytics and BI platforms they rely on for insights and intelligence, not just stay with the solutions they have in place today. Just 18% of enterprises plan to add more instances of existing platforms and systems. Cloud-native platforms (38%), a new analytics platform (35%) and cloud-based data lakes (31%) are the top three areas in which enterprises plan to augment or replace existing BI, analytics, and data warehousing systems. Please click on the graphic to expand for easier reading.

  • The majority of enterprises plan to both build and buy Artificial Intelligence (AI) and machine learning (ML) solutions so that they can customize them to their specific needs. 43% of enterprises surveyed plan to both build and buy AI and ML applications and platforms, a figure higher than any other recent survey on this aspect of enterprise AI adoption. 13% of responding enterprises say they will exclusively build their own AI and ML applications.

  • Capitalizing on machine learning’s innate strengths of applying algorithms to large volumes of data to find actionable new insights (54%) is what’s most important to the majority of enterprises. 47% of enterprises look to AI and machine learning to improve the accuracy and quality of information. And 42% are configuring AI and machine learning applications and platforms to augment user decision making by giving recommendations. Please click on the graphic to expand for easier reading.

10 Ways Machine Learning Is Revolutionizing Sales

  • Sales teams adopting AI are seeing an increase in leads and appointments of more than 50%, cost reductions of 40%–60%, and call time reductions of 60%–70% according to the Harvard Business Review article Why Salespeople Need to Develop Machine Intelligence.
  • 62% of the highest-performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce's latest State of Sales research study.
  • By 2020, 30% of all B2B companies will employ AI to augment at least one of their primary sales processes according to Gartner.
  • High-performing sales teams are 4.1X more likely to use AI and machine learning applications than their peers according to the State of Sales published by Salesforce.
  • Intelligent forecasting, opportunity insights, and lead prioritization are the top three AI and machine learning use cases in sales.

Artificial Intelligence (AI) and machine learning show the potential to reduce the most time-consuming, manual tasks that keep sales teams away from spending more time with customers. Automating account-based marketing support with predictive analytics and supporting account-centered research, forecasting, reporting, and recommending which customers to upsell first are all techniques freeing sales teams from manually intensive tasks.

The Race for Sales-Focused AI & Machine Learning Patents Is On

CRM and Configure, Price & Quote (CPQ) providers continue to develop and fine-tune their digital assistants, which are specifically designed to help the sales team get the most value from AI and machine learning. Salesforce's Einstein supports voice-activated commands from Amazon Alexa, Apple Siri, and Google. Salesforce and other enterprise software companies continue to invest aggressively in Research & Development (R&D). For the nine months ended October 31, 2018, Salesforce spent $1.3B, or 14% of total revenues, compared to $1.1B, or 15% of total revenues, during the same period a year ago, an increase of $211M, according to the company's 10-Q filed with the Securities and Exchange Commission.

The race for AI and machine learning patents that streamline selling is getting more competitive every month. Expect to see the race for sales-focused AI and machine learning patents flourish in 2019. The National Bureau of Economic Research published a study last July from the Stanford Institute For Economic Policy Research titled Some Facts On High Tech Patenting. The study finds that patenting in machine learning has seen exponential growth since 2010 and that Microsoft had the greatest number of patents in the 2000 to 2015 timeframe. Using patent analytics from PatentSight and ipsearch, IAM published an analysis last month showing Microsoft as the global leader in machine learning patents with 2,075. The study relied on PatentSight's Patent Asset Index to rank machine learning patent creators and owners, revealing that Microsoft and Alphabet are dominating today. Salesforce investing over $1B a year in R&D reflects how competitive the race for patents and intellectual property is.

10 Ways Machine Learning Is Revolutionizing Sales

Fueled by the proliferation of patents and the integration of AI and machine learning code into CRM, CPQ, Customer Service, Predictive Analytics and a wide variety of Sales Enablement applications, use cases are flourishing today. Presented below are the ten ways machine learning is most revolutionizing selling today:

 

  1. AI and machine learning technologies excel at pattern recognition, enabling sales teams to find the highest potential new prospects by matching data profiles with their most valuable customers. Nearly all AI-enabled CRM applications are providing the ability to define a series of attributes, characteristics and their specific values that pinpoint the highest potential prospects. Selecting and prioritizing new prospects using this approach saves sales teams thousands of hours a year.
  2. Lead scoring and nurturing based on AI and machine learning algorithms help guide sales and marketing teams to turn Marketing Qualified Leads (MQLs) into Sales Qualified Leads (SQLs), strengthening sales pipelines in the process (see the scoring sketch after this list). One of the most important areas of collaboration between sales and marketing is lead nurturing strategies that move prospects through the pipeline. AI and machine learning are enriching the collaboration with insights from third-party data, prospects' activity at events and on the website, and previous conversations with salespeople. Lead scoring and nurturing rely heavily on natural language generation (NLG) and natural language processing (NLP) to help improve each lead's score.
  3. Combining historical selling, pricing and buying data in a single machine learning model improves the accuracy and scale of sales forecasts. Factoring in differences inherent in every account given their previous history and product and service purchasing cycles is invaluable in accurately predicting their future buying levels. AI and machine learning algorithms integrated into CRM, sales management and sales planning applications can explain variations in forecasts, provided they have the data available. Forecasting demand for new products and services is an area where AI and machine learning are reducing the risk of investing in entirely new selling strategies for new products.
  4. Knowing the propensity of a given customer to churn versus renew is invaluable in improving Customer Lifetime Value. Analyzing a diverse series of factors to see which customers are going to churn or leave versus those that will renew is among the most valuable insights AI and machine learning are delivering today. Being able to complete a Customer Lifetime Value Analysis for every customer a company has provides a prioritized roadmap of where the health of client relationships is excellent versus those that need attention. Many companies are using Customer Lifetime Value Analysis as a proxy for a customer health score that gets reviewed monthly.
  5. Knowing the strategies, techniques and time management approaches the top 10% of salespeople rely on to excel far beyond quota, and scaling those practices across the sales team based on AI-driven insights. All sales managers and leaders think about this often, especially in sales teams where performance levels vary widely. Knowing the capabilities of the highest-achieving salespeople, then selectively recruiting those sales team candidates who have comparable capabilities, delivers solid results. Leaders in the field of applying AI to talent management include Eightfold, whose approach to talent management is refining recruiting and every phase of managing an employee's potential. Please see the recent New York Times feature of them here.
  6. Guided selling is progressing rapidly from a personalization-driven selling strategy to one that capitalizes on data-driven insights, further revolutionizing sales. AI- and machine learning-based guided selling is based on prescriptive analytics that provides recommendations to salespeople of which products, services, and bundles to offer at which price. 62% of the highest-performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce's latest State of Sales research study.
  7. Improving the sales team’s productivity by using AI and machine learning to analyze the most effective actions and behaviors that lead to more closed sales. AI and machine learning-based sales contact and customer predictive analytics take into account all sources of contacts with customers and determine which are the most effective. Knowing which actions and behaviors are correlated with the highest close rates, sales managers can use these insights to scale their sales teams to higher performance.
  8. Sales and marketing are better able to define a price optimization strategy using all available data analyzed with AI and machine learning algorithms. Pricing continues to be an area the majority of sales and marketing teams learn through trial and error. By analyzing pricing data, purchasing history, discounts taken, promotional programs participated in and many other factors, AI and machine learning can calculate the price elasticity for a given customer, making an optimized price more achievable.
  9. Personalizing sales and marketing content that moves prospects from MQLs to SQLs is continually improving thanks to AI and machine learning. Marketing Automation applications including HubSpot and many others have for years been able to define which content asset needs to be presented to a given prospect at a given time. What’s changed is the interactive, personalized nature of the content itself. Combining analytics, personalization and machine learning, marketing automation applications are now able to tailor content and assets that move opportunities forward.
  10. The many challenges of sales engineering scheduling, sales enablement support and dedicating the greatest amount of time to the highest-value accounts are getting solved with machine learning. CRM applications including Salesforce can define a salesperson's schedule based on the value of the potential sale combined with the strength of the sales lead, based on its lead score. AI and machine learning optimize a salesperson's time so they can go from one customer meeting to the next, dedicating their time to the most valuable prospects.
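The lead-scoring use case in point 2 above can be sketched in a few lines of scikit-learn. The features, training rows, and conversion labels below are synthetic stand-ins for the engagement signals a real CRM would supply, not data from any of the studies cited.

```python
# Synthetic lead-scoring sketch: features and labels are made-up stand-ins
# for the engagement signals (email opens, site visits, event attendance,
# days since last touch) a real CRM would supply.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([
    [12, 30, 1,  2],   # highly engaged lead
    [ 1,  2, 0, 45],   # cold lead
    [ 8, 15, 1,  7],
    [ 0,  1, 0, 60],
])
y = np.array([1, 0, 1, 0])  # 1 = the MQL converted into an SQL

model = LogisticRegression().fit(X, y)
new_lead = np.array([[10, 20, 1, 3]])
print(model.predict_proba(new_lead)[0, 1])  # estimated conversion probability
```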

How To Protect Healthcare Records In A Zero Trust World

  • There's been a staggering 298.4% growth in the reported number of patient records breached as a result of insider wrongdoing this year alone, according to Protenus.
  • The total disclosed number of breached patient records has soared from 1.1M in Q1 2018 to 4.4M in Q3 2018 alone, 680K of which were breached by insiders.
  • There were 117 disclosed health breaches in the last 90 days alone.
  • On average it’s taking 402 days to discover a healthcare provider has been breached.

Diagnosing Healthcare’s Breach Epidemic

Using access credentials stolen from co-workers or stolen laptops, unethical healthcare insiders are among the most prolific at stealing and selling patient data of any insider threat across any industry. Accenture’s study, “Losing the Cyber Culture War in Healthcare: Accenture 2018 Healthcare Workforce Survey on Cybersecurity,” found that the most common ways healthcare employees financially gain from stealing medical records is to commit tax return and credit card fraud.

Treating healthcare’s breach epidemic needs to start by viewing every threat surface, access point, identity, and login attempt as the new security perimeter. Healthcare providers urgently need to take a “never trust, always verify” approach, adopting  Zero Trust Security to protect every threat surface using Next-Gen Access for end-user credentials and Privileged Access Management (PAM) for privileged credentials. One of the leaders in Next-Gen Access is Idaptive, a newly created spin-off of Centrify. Centrify itself is offering Zero Trust Privilege Services helping over half of the Fortune 100 to eliminate privileged access abuse, the leading cause of breaches today. Centrify Zero Trust Privilege grants least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.

18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, according to a recent Accenture study. 24% of employees know of someone who has sold access to patient data to outsiders. 58% of all healthcare breaches are initiated by insiders. Confidential patient diagnosis, treatment, payment histories, and medical records are the most valuable on the Dark Web, selling for as much as $1,000 per record according to Experian.

Key insights from Protenus’ Breach Barometer illustrate how healthcare’s breach epidemic is growing exponentially:

  • There's been a staggering 298.4% growth in the number of patient records breached as a result of insider wrongdoing this year alone. In Q1 of this year, there were 4,597 patient records exfiltrated by insider wrongdoing, jumping to 70,562 in Q2 and soaring to 290,689 in Q3. Healthcare insiders can easily thwart healthcare systems' legacy security approaches today by using compromised access credentials. Zero Trust Security, either in the form of Next-Gen Access for end-user credentials or Zero Trust Privilege for privileged access credentials, has the potential to stop this.

  • The total number of breached patient records has soared from 1.1M in Q1 of this year to 4.4M in Q3, quadrupling in less than a year. Protenus found a total of 117 incidents were disclosed to the U.S. Department of Health and Human Services (HHS) or the media in Q3 2018 alone. Details were disclosed for 100 of these incidents, affecting 4,390,512 patient records, the highest level ever recorded. Jumping from 1.1M medical records in Q1 to 4.4M in Q3, healthcare providers could easily see over 6.5M records breached in Q4 2018 alone.

  • Hackers targeted healthcare systems aggressively in Q3 of this year, exfiltrating 3.6M patient records in just 90 days. Compromised access credentials are hackers’ favorite technique for exfiltrating massive quantities of medical records they resell on the Dark Web or use to commit tax and credit card fraud. Healthcare providers need to minimize their attack surfaces, improve audit and compliance visibility, reduce risk, complexity, and costs across their modern, hybrid enterprises with Zero Trust. Healthcare providers need to shut down hackers now, taking away the opportunities they’re capitalizing on to exfiltrate medical records almost at will.
  • It takes 71 days on average for healthcare providers to realize their data has been breached, with one breach lasting over 15 years. Protenus found a wide variation in the length of time it takes healthcare providers to realize they've been breached; one didn't know until 15 years after the initial successful breach. Across all breaches tracked by Protenus, insiders and/or hackers were successful in gaining access to a wealth of patient information including addresses, dates of birth, medical record numbers, healthcare providers, visit dates, health insurance information, financial histories, and payment information.

Conclusion

Zero Trust is the antidote healthcare needs to treat its raging breach epidemic. The epidemic is growing exponentially as insiders intent on wrongdoing turn to exfiltrating patients' data for personal gain. Hackers also find healthcare providers' legacy systems among the easiest to access using stolen access credentials, exfiltrating millions of records in months. With every new employee and device being a new security perimeter on their networks, the time is now for healthcare providers to discard the old model of "trust but verify," which relied on well-defined boundaries. Zero Trust mandates a "never trust, always verify" approach to access, from inside or outside healthcare providers' networks.

CPQ Needs To Scale And Support Smarter, More Connected Products

  • For smart, connected product strategies to succeed they require a product lifecycle view of configurations, best attained by integrating PLM, CAD, CRM, and ERP systems.
  • Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.
  • In 2018, $985B will be spent on IoT-enabled smart consumer devices, soaring to $1.49T in 2020, attaining a 23.1% compound annual growth rate (CAGR) according to Statista.
  • Industrial manufacturers will spend on average $121M a year on smart, connected products according to Statista.

Succeeding with a smart, connected product strategy requires manufacturers to accelerate their IoT and software development expertise faster than they expected. By 2020, 50% of manufacturers will generate the majority of their revenues from smart, connected products according to Capgemini's recent study. Manufacturers see 2019 as the breakout year for smart, connected products and the new revenue opportunities they provide.

Industrial Internet of Things (IIoT) platforms have the potential to provide a single, unified data model across an entire manufacturing operation, giving manufacturers a single unified view of product configurations across their lifecycles. Producing smart, connected products at scale also requires a system capable of presenting a unified view of configurations in the language each department can understand. Engineering, production, marketing, sales, and service all need a unique view of product configurations to keep producing new products. Leaders in this field include Configit and their Configuration Lifecycle Management approach to CPQ and product configuration.

Please see McKinsey's article IIoT platforms: The technology stack as a value driver in industrial equipment and machinery, which explores how the Industrial Internet of Things (IIoT) is redefining industrial equipment and machinery manufacturing. The following graphic from the McKinsey article explains why smart, connected product strategies are accelerating across all industries. Please click on the graphic to expand it for easier reading.

CPQ Needs To Scale Further To Sell Smart, Connected Products

Smart, connected products are redefining the principles of product design, manufacturing, sales, marketing, and service. CPQ systems need to grow beyond their current limitations by capitalizing on these new principles while scaling to support new business models that are services and subscription-based.

The following are the key areas where CPQ systems are innovating today, making progress towards enabling the custom configuration of smart, connected products:

  • For smart, connected product strategies to succeed they require a product lifecycle view of configurations, best attained by integrating PLM, CAD, CRM, and ERP systems. Smart, connected product strategies require real-time integration between front-end and back-end systems to optimize production performance. They also require advanced visualization that provides prospects with an accurate, 3D-rendered view that can be translated into a Bill of Materials (BOM) and into production (a minimal configuration-validation sketch follows this list). The following graphic is based on conversations with Configit customers, illustrating how they are combining PLM, CAD, CRM and ERP systems to support smart, connected products related to automotive manufacturing. Please click on the graphic to expand it for easier reading.

  • CPQ and product configuration systems need to reflect that the products they're specifying are part of a broader ecosystem, not stand-alone. The essence of smart, connected products is their contribution to broader, more complex networks and ecosystems. CPQ systems need to flex and support much greater system interoperability of products than they do today. Additional design principles include designing in connected service options, an evergreen or long-term focus on the product-as-a-platform, and designed-in support for entirely new pricing models.
  • Smart, connected products need CPQ systems to reduce physical complexity while scaling device intelligence through cross-sells, up-sells and upgrades. Minimizing physical options to allow for greater scale, and supporting device intelligence-based ones, is needed in CPQ systems today. For many CPQ providers, that's going to require different data models and taxonomies of product definitions. Smart, connected products will be modified after purchase as well, evolving to customers' unique requirements.
  • After-sales service for smart, connected products will redefine pricing and profit models for the better in 2019, and CPQ needs to keep up to make it happen. Giving products the ability to send back their usage rates and patterns, reliability and performance data along with their current condition opens up lucrative pricing and services models. CPQ applications need to be able to provide quotes for remote diagnostics, price breaks on subscriptions for sharing data, product-as-a-service and subscription-based options for additional services. Many CPQ systems will need to be updated to support entirely new services-driven business models manufacturers are quickly adopting today.
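To ground the configuration discussion above, here is a minimal Python sketch of the kind of rule-based configuration validation a CPQ engine performs before emitting a quote or BOM. The product, options, and rules are entirely hypothetical, not Configit's or any vendor's actual model.

```python
# Hypothetical smart-thermostat configurator: validates option combinations
# against simple compatibility rules before a quote or BOM is generated.
CONFIG_RULES = [
    # (description, predicate that must hold for a valid configuration)
    ("cellular model requires the premium sensor pack",
     lambda c: c.get("connectivity") != "cellular" or c.get("sensors") == "premium"),
    ("battery power cannot drive the always-on display",
     lambda c: c.get("power") != "battery" or c.get("display") != "always_on"),
]

def validate(config: dict) -> list:
    """Return the list of rule violations; empty means the build is quotable."""
    return [desc for desc, ok in CONFIG_RULES if not ok(config)]

build = {"connectivity": "cellular", "sensors": "basic",
         "power": "wired", "display": "always_on"}
print(validate(build))  # ['cellular model requires the premium sensor pack']
```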

Using Machine Learning To Find Employees Who Can Scale With Your Business

  • Eightfold’s analysis of hiring data has found the half-life of technical, marketable skills is 5 to 7 years, making the ability to unlearn and learn new concepts essential for career survival.
  • Applicant Tracking Systems (ATS) don’t capture applicants’ drive and intensity to unlearn and learn or their innate capabilities for growth.
  • Artificial Intelligence (AI) and machine learning are proving adept at discovering candidates’ innate capabilities to unlearn, learn and reinvent themselves throughout their careers.

Hiring managers in search of qualified job candidates who can scale with and contribute to their growing businesses are facing a crisis today. They're not finding the right candidates, or in many cases any candidates at all, using resumes alone, Applicant Tracking Systems (ATS) or online job recruitment sites designed for employers' convenience first and candidates last. These outmoded approaches to recruiting aren't designed to find the candidates with the strongest capabilities. Add to this dynamic the fact that machine learning is making resumes obsolete by enabling employers to find candidates with precisely the right balance of capabilities needed, and that its unbiased, data-driven approach to selecting candidates works. Resumes, job recruitment sites and ATS platforms force hiring managers to bet on the probability of making a great hire instead of being certain of one by basing their decisions on solid data.

Playing The Probability Hiring Game Versus Making Data-Driven Decisions

Many hiring managers and HR recruiters are playing the probability hiring game. It's betting that the new hire chosen using imprecise methods will work out. And like any bet, it gets expensive quickly when a wrong choice is made. There's a 30% chance the new hire will make it through one year, and if they don't, it will cost at least 1.5 times their salary to replace them. When the median salary for a cloud computing professional is $146,350, and it takes 46 days in the best case to find one, the cost and time lost from losing just one recruited cloud computing professional can derail a project for months. It will cost at least $219,000 to replace just that one engineer. The average size of an engineering team is ten people, so only three will remain in 12 months. These are the high costs of playing the probability hiring game, fueled by unconscious and conscious biases and systems that game recruiters into believing they are making progress when they're automating mediocre or worse decisions. Hiring managers will have better luck betting in Las Vegas or playing Powerball than hiring the best possible candidate if they rely on systems that deliver only a marginal probability of success at best.

Betting on solid data and personalization at scale, on the other hand, delivers real results. Real data slices through the probabilities and is the best equalizer there is at eradicating conscious and unconscious biases from hiring decisions. Hiring managers, HR recruiters, directors and Chief Human Resource Officers (CHROs) vow they are strong believers in diversity. Many are abandoning the probability hiring game for AI- and machine learning-based approaches to talent management that strip away any extraneous data that could lead to bias-driven hiring decisions. Now candidates get evaluated on their capabilities and innate strengths and how strong a match they are to ideal candidates for specific roles.

A Data-Driven Approach To Finding Employees Who Can Scale

Personalization at scale is more than just a recruiting strategy; it's a talent management strategy intended to flex across the length of every employee's tenure. Attaining personalization at scale is essential if any growing business is going to succeed in attracting, acquiring and growing talent that can support its growth goals and strategies. Eightfold's approach makes it possible to scale personalized responses to specific candidates in a company's candidate community while defining the ideal candidate for each open position. Personalization at scale has succeeded in helping companies match the right person to the right role at the right time and, for the first time, personalize every phase of recruitment, retention and talent management at scale.

Eightfold is pioneering the use of a self-updating corporate candidate database. Profiles in the system are now continually updated using external data gathering, without applicants reapplying or submitting updated profiles. The taxonomies supported in the corporate candidate database make it possible for hiring managers to define the optimal set of capabilities, innate skills, and strengths they need to fill open positions.
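As a rough illustration of capability-based matching (not Eightfold's actual algorithm), the sketch below scores candidates by cosine similarity between a hypothetical capability vector and the hiring manager's ideal profile; the taxonomy dimensions and scores are invented for the example.

```python
# Illustrative capability matching; the taxonomy, vectors, and scores are
# hypothetical and do not represent Eightfold's actual model.
import numpy as np

# Each dimension is a capability from the taxonomy, scored 0..1,
# e.g., ML, distributed systems, learning agility, leadership.
ideal_profile = np.array([0.9, 0.7, 0.8, 0.9])

def match_score(candidate: np.ndarray) -> float:
    # Cosine similarity between candidate capabilities and the ideal profile
    return float(candidate @ ideal_profile /
                 (np.linalg.norm(candidate) * np.linalg.norm(ideal_profile)))

candidates = {
    "candidate_a": np.array([0.8, 0.6, 0.9, 0.8]),
    "candidate_b": np.array([0.3, 0.9, 0.2, 0.4]),
}
ranked = sorted(candidates, key=lambda c: match_score(candidates[c]), reverse=True)
print(ranked)  # highest-scoring match first
```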

Lessons Learned at PARC
Russell Williams, former Vice President of Human Resources at PARC, says the best strategy he has found is to define the ideal attributes of high performers and look to match those profiles with potential candidates. "We're finding that there are many more attributes that define a successful employee in our most in-demand positions, including data scientist, than are evident from just reviewing a resume, and with AI, I want to do it at scale," Williams said. Ashutosh Garg, Eightfold founder, added: "That's one of the greatest paradoxes that HR departments face, which is the need to know the contextual intelligence of a given candidate far beyond what a resume and existing recruiting systems can provide." One of the most valuable lessons learned from PARC is that it's possible to find the candidates who excel at unlearning, learning, and defining and diligently pursuing learning roadmaps that lead to reinventing their skills, strengths, and marketability.

Conclusion

Machine learning algorithms capable of completing millions of pattern-matching comparisons per second provide valuable new insights, enabling companies to find those who excel at reinventing themselves. The most valuable employees, the ones who can scale with any business, see themselves as learning entrepreneurs and have an inner drive to master new knowledge and skills. And that select group of candidates is the catalyst most often responsible for making the greatest contributions to a company's growth.

High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019

Bottom Line: High-tech manufacturers need to urgently solve the paradox of improving supply chain security while attaining greater visibility across supplier networks if they're going to make the most of smart, connected products' many growth opportunities in 2019.

The era of smart, connected products is revolutionizing every aspect of manufacturing today, from suppliers to distribution networks. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020. Manufacturers expect close to 50 percent of their products to be smart, connected products by 2020, according to Capgemini’s Digital Engineering: The new growth engine for discrete manufacturers. The study is downloadable here (PDF, 40 pp., no opt-in).

Smart, connected products free manufacturers and their supply chains from having to rely on transactions and the price wars they create. The smarter the product, the greater the services revenue opportunities. And the more connected a smart product is, using IoT and Wi-Fi sensors, the more security has to be designed into every potential supplier evaluation, onboarding, quality plan, and ongoing supplier audit. High-tech manufacturers are undertaking all of these strategies today, fueling them with real-time monitoring using barcoding, RFID and IoT sensors to improve visibility across their supply chains.

Gaining even greater visibility into their supply chains using cloud-based track-and-trace systems capable of reporting back the condition of components in transit to the lot and serialized pack level, high-tech suppliers are setting the gold standard for supply chain transparency and visibility. High-tech supply chains dominate many other industries’ supplier networks on accuracy, speed, and scale metrics on a consistent basis, yet the industry is behind on securing its vast supplier network. Every supplier identity and endpoint is a new security perimeter and taking a Zero Trust approach to securing them is the future of complex supply chains. With Zero Trust Privilege, high-tech manufacturers can secure privileged access to infrastructure, DevOps, cloud, containers, Big Data, production, logistics and shipping facilities, systems and teams.

High-Tech Needs to Confront Its Supply Chain Security Problem, Not Dismiss It

It's ironic that high-tech supply chains are making rapid advances in accuracy and visibility yet still aren't vetting suppliers thoroughly enough to stop counterfeiting, or worse. Bloomberg's controversial recent article, The Big Hack: How China Used a Tiny Chip to Infiltrate U.S. Companies, explains how Amazon Web Services (AWS) was considering buying Portland, Oregon-based Elemental Technologies for its video streaming technology, known today as Amazon Prime Video. As part of the due diligence, AWS hired a third-party company to scrutinize Elemental's security all the way up to the board level. The Elemental servers that handle the video compression were assembled by Super Micro Computer Inc., a San Jose-based company whose motherboards are manufactured in China. Nested on the servers' motherboards, the testers found a tiny microchip, not much bigger than a grain of rice, that wasn't part of the boards' original design and could create a stealth doorway into any network the machines were attached to. Apple (also an important Super Micro customer) and AWS deny this ever happened, yet 17 people have confirmed Super Micro had altered hardware, corroborating Bloomberg's findings.

The hard reality is that the scenario Bloomberg writes about could happen to any high-tech manufacturer today. When it comes to security and third-party vendor risk management, many high-tech supply chains are stuck in the 90s while foreign governments, their militaries and the terrorist organizations they support are attempting to design in the ability to breach any network at will. How bad is it? 81% of senior executives involved in overseeing their companies' global supply chains say third-party vendor management, including recruiting suppliers, is riskiest in China, India, Africa, Russia, and South America, according to a recent survey by Baker & McKenzie.

PriceWaterhouseCoopers (PwC) and the MIT Forum for Supply Chain Innovation collaborated on a study of 209 companies' supply chain operations and approaches to third-party vendor risk management. The study, PwC and the MIT Forum for Supply Chain Innovation: Making the right risk decisions to strengthen operations performance, quantifies the quick-changing nature of supply chains. 94% say changes in their extended supply chain network configuration happen frequently. Relying on trusted and untrusted domain controllers from server operating systems that are decades old can't keep up with the mercurial pace of supply chains today.

Getting in Control of Security Risks in High-Tech Supply Chains

It's time for high-tech supply chains to adopt a least privilege-based approach to verifying who or what is requesting access to any confidential data across their supply chains. Further, high-tech manufacturers need to extend access request verification to include the context of the request and the risk of the access environment. Today it's rare to find any high-tech manufacturer taking this level of least-privilege access approach, yet it's the most viable approach to securing the most critical parts of their supply chains.

By taking a least-privilege access approach, high-tech manufacturers and their suppliers can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and operating costs across their hybrid manufacturing ecosystem.

Key actions that high-tech manufacturers can take to secure their supply chain and ensure they don’t end up in an investigative story of hacked supply chains include the following:

  • Taking a Zero Trust approach to securing every endpoint provides high-tech manufacturers with the scale they need to grow. High-tech supply chains are mercurial and fast-moving by nature, guaranteeing they will quickly outpace any legacy approach to enterprise security management. Vetting and then onboarding new suppliers needs to start by protecting every endpoint down to the production and sourcing level, especially for next-generation smart, connected products.
  • Smart, connected products and the product-as-a-service business models they create are all based on real-time, rich, secured data streams that aren’t being eavesdropped on with components no one knows about. Taking a Zero Trust Privilege-based approach to securing access to diverse supply chains is needed if high-tech manufacturers are going to extend beyond legacy Privileged Access Management (PAM) to secure data being generated from real-time monitoring and data feeds from their smart, connected products today and in the future.
  • Quality management, compliance, and quality audits are all areas high-tech manufacturers excel in today and provide a great foundation to scale to Zero Trust Privilege. High-tech manufacturers have the most advanced quality management, inbound inspection and supplier quality audit techniques in the world. It’s time for the industry to step up on the security side too. By only granting least-privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment, high-tech manufacturers can make rapid strides to improve supply chain security.
  • Rethink new product development cycles for smart, connected products and the sensors they rely on, so they're protected as threat surfaces when built. Designing in security at the new product development process level, and further advancing security scrutiny to the schematic and board design level, is a must-do. In an era where we have to assume bad actors are everywhere, every producer of high-tech products needs to realize their designs, product plans, and roadmaps are at risk. Ensuring the IoT and Wi-Fi sensors in smart, connected products aren't designed to be hackable starts with a Zero Trust approach to defining security for supplier, design, and development networks.

Conclusion

The era of smart, connected products is here, and supply chains are already reverberating with the increased emphasis on components that are easily integrated and have high-speed connectivity. Manufacturing CEOs say it's exactly what their companies need to grow beyond transaction revenue and the price wars it creates. While high-tech manufacturers excel at accuracy, speed, and scale, they are falling short on security. It's time for the industry to re-evaluate how Zero Trust can stabilize and secure every identity and threat surface across their supply chains with the same precision and intensity they apply to quality today.

Where Cloud Computing Jobs Will Be In 2019

  • $146,350 is the median salary for cloud computing professionals in 2018.
  • There are 50,248 cloud computing positions open in the U.S. today from 3,701 employers, and 101,913 open positions worldwide.
  • Oracle (NYSE: ORCL), Deloitte and Amazon (NASDAQ: AMZN) have the most open cloud computing jobs today.
  • Java, Linux, Amazon Web Services (AWS), Software Development, DevOps, Docker and Infrastructure as a Service (IaaS) are the most in-demand skills.
  • Washington DC, Arlington-Alexandria, VA, San Francisco-Oakland-Hayward, CA, New York-Newark-Jersey City, NY, San Jose-Sunnyvale-Santa Clara, CA, Chicago-Naperville-Elgin, IL, are the top five cities where cloud computing jobs are today and will be in 2019.

Demand for cloud computing expertise continues to increase exponentially and will accelerate in 2019. To better understand the current and future direction of cloud computing hiring trends, I utilized Gartner TalentNeuron. Gartner TalentNeuron is an online talent market intelligence portal with real-time labor market insights, including custom role analytics and executive-ready dashboards and presentations. Gartner TalentNeuron also supports a range of strategic initiatives covering talent, location, and competitive intelligence.

Gartner TalentNeuron maintains a database of more than one billion unique job listings and is collecting hiring trend data from more than 150 countries across six continents, resulting in 143GB of raw data being acquired daily. In response to many Forbes readers’ requests for recommendations on where to find a job in cloud computing, I contacted Gartner to gain access to TalentNeuron.

Key takeaways include the following:

  • $146,350 is the median salary for cloud computing professionals in 2018. Cloud computing salaries have soared in the last two years, with 2016's median salary being $124,300, a jump of $22,050. The following graphic shows the distribution of salaries for the 50,248 cloud computing jobs currently available in the U.S. alone. Please click on the graphic to expand for easier reading.

  • The Hiring Scale is 78 for jobs that require cloud computing skill sets, with the average job post staying open 46 days. The higher the Hiring Scale score, the more difficult it is for employers to find the right applicants for open positions. Nationally an average job posting for an IT professional with cloud computing expertise is open 46 days. Please click on the graphic to expand for easier reading.

  • Washington, DC – Arlington-Alexandria, VA leads the top twenty metro areas with the most open positions for cloud computing professionals today. Mapping the distribution of job volume, salary range, candidate supply, posting period and hiring scale by Metropolitan Statistical Area (MSA), or by states and counties, is supported by Gartner TalentNeuron. The following graphic shows the distribution of talent or candidate supply. These are the markets with the highest supply of talent with cloud computing skills.

  • Oracle (NYSE: ORCL), Deloitte and Amazon (NASDAQ: AMZN) have the most open cloud computing jobs today. IBM, VMware, Capital One, Microsoft, KPMG, Salesforce, PricewaterhouseCoopers, U.S. Bank, Booz Allen Hamilton, Raytheon, SAP, Capgemini, Google, Leidos and Nutanix all have over 100 open cloud computing positions today.
