Posts tagged ‘Machine learning’

10 Predictions How AI Will Improve Cybersecurity In 2020

Capgemini predicts 63% of organizations are planning to deploy AI in 2020 to improve cybersecurity, with the most popular application being network security.

Cybersecurity is at an inflection point entering 2020. Advances in AI and machine learning are accelerating its technological progress. Real-time data and analytics are making it possible to build stronger business cases, driving higher adoption. Cybersecurity spending has rarely been linked to increasing revenues or reducing costs, but that’s about to change in 2020.

What Leading Cybersecurity Experts Are Predicting For 2020

Curious about what the leading cybersecurity experts think will happen in 2020, I contacted five of them. The experts I spoke with include Nicko van Someren, Ph.D., Chief Technology Officer at Absolute Software; Dr. Torsten George, Cybersecurity Evangelist at Centrify; Craig Sanderson, Vice President of Security Products at Infoblox; Josh Johnston, Director of AI at Kount; and Brian Foster, Senior Vice President of Product Management at MobileIron. Each brings a knowledgeable, insightful, and unique perspective on how AI and machine learning will improve cybersecurity in 2020. The following are their ten predictions:

  1. AI and machine learning will continue to enable asset management improvements that also deliver exponential gains in IT security by providing greater endpoint resiliency in 2020. Nicko van Someren, Ph.D. and Chief Technology Officer at Absolute Software, observes that “Keeping machines up to date is an IT management job, but it’s a security outcome. Knowing what devices should be on my network is an IT management problem, but it has a security outcome. And knowing what’s going on and what processes are running and what’s consuming network bandwidth is an IT management problem, but it’s a security outcome. I don’t see these as distinct activities so much as seeing them as multiple facets of the same problem space, accelerating in 2020 as more enterprises choose greater resiliency to secure endpoints.”
  2. AI tools will continue to improve at drawing on data sets of wildly different types, allowing the “bigger picture” to be put together from, say, static configuration data, historic local logs, global threat landscapes, and contemporaneous event streams. Nicko van Someren, Ph.D., CTO at Absolute Software, also predicts that “Enterprise executives will be concentrating their budgets and time on detecting cyber threats using AI above predicting and responding. As enterprises mature in their use and adoption of AI as part of their cybersecurity efforts, prediction and response will correspondingly increase.”
  3. Threat actors will increase the use of AI to analyze defense mechanisms and simulate behavioral patterns to bypass security controls, leveraging analytics and machine learning to hack into organizations. Dr. Torsten George, Cybersecurity Evangelist at Centrify, predicts that “threat actors, many of them state-sponsored, will increase their use and sophistication of AI algorithms to analyze organizations’ defense mechanisms and tailor attacks to specific weak areas.” He also sees the threat of bad actors being able to plug into the data streams of organizations and use the data to further orchestrate sophisticated attacks.
  4. Given the severe shortage of experienced security operations resources and the sheer volume of data that most organizations are trying to work through, we are likely to see organizations seeking out AI/ML capabilities to automate their security operations processes. Craig Sanderson, Vice President of Security Products at Infoblox, also predicts that “while AI and machine learning will increasingly be used to detect new threats, it still leaves organizations with the task of understanding the scope, severity, and veracity of that threat to inform an effective response. As security operations becomes a big data problem, it necessitates big data solutions.”
  5. There’s going to be a greater need for adversarial machine learning to combat supply chain corruption in 2020. Sean Tierney, Director of Threat Intelligence at Infoblox, predicts that “the need for adversarial machine learning to combat supply chain corruption is going to increase in 2020.” Sean also predicts that the big problem with remote coworking spaces is determining who has access to what data. As a result, AI will become more prevalent in traditional business processes and will be used to identify whether a supply chain has been corrupted.
  6. Artificial intelligence will become more prevalent in account takeover—both the proliferation and prevention of it. Josh Johnston, Director of AI at Kount, predicts that “the average consumer will realize that passwords are not providing enough account protection and that every account they have is vulnerable. Captcha won’t be reliable either, because while it can tell if someone is a bot, it can’t confirm that the person attempting to log in is the account holder. AI can recognize a returning user. AI will be key in protecting the entire customer journey, from account creation to account takeover, to a payment transaction. And, AI will allow businesses to establish a relationship with their account holders that is protected by more than just a password.”
  7. Consumers will take greater control of their data sharing and privacy in 2020. Brian Foster, Senior Vice President Product Management at MobileIron, observes that over the past few years, we’ve witnessed some of the biggest privacy and data breaches. As a result of the backlash, tech giants such as Apple, Google, Facebook and Amazon beefed up their privacy controls to gain back trust from customers. Now, the tables have turned in favor of consumers and companies will have to put privacy first to stay in business. Moving forward, consumers will own their data, which means they will be able to selectively share it with third parties, but most importantly, they will get their data back after sharing, unlike in years past.
  8. As cybersecurity threats evolve, we’ll fight AI with AI. Brian Foster, Senior Vice President Product Management at MobileIron, notes that the most successful cyberattacks are executed by highly professional criminal networks that leverage AI and ML to exploit vulnerabilities such as user behavior or security gaps to gain access to valuable business systems and data. All of this makes it extremely hard for IT security organizations to keep up — much less stay ahead of these threats. While an attacker only needs to find one open door in an enterprise’s security, the enterprise must race to lock all of the doors. AI conducts this at a pace and thoroughness human ability can no longer compete with, and businesses will finally take notice in 2020.
  9. AI and machine learning will thwart compromised hardware finding its way into organizations’ supply chains. Rising demand for electronic components will expand the market for counterfeit components and cloned products, increasing the threat of compromised hardware entering organizations’ supply chains. The vectors for hardware supply-chain attacks are expanding as market demand for more and cheaper chips and components drives a booming business for hardware counterfeiters and cloners. This expansion is likely to create greater opportunities for compromise by both nation-state and cybercriminal threat actors. Source: 2020 Cybersecurity Threats Trends Outlook, Booz Allen Hamilton, 2019.
  10. Capgemini predicts 63% of organizations are planning to deploy AI in 2020 to improve cybersecurity, with the most popular application being network security. Capgemini found that nearly one in five organizations were using AI to improve cybersecurity before 2019. In addition to network security, data security, endpoint security, and identity and access management are the highest priority use cases for improving cybersecurity with AI in enterprises today. Source: Capgemini, Reinventing Cybersecurity with Artificial Intelligence: The new frontier in digital security.

What’s New In Gartner’s Hype Cycle For AI, 2019

  • Between 2018 and 2019, organizations that have deployed artificial intelligence (AI) grew from 4% to 14%, according to Gartner’s 2019 CIO Agenda survey.
  • Conversational AI remains at the top of corporate agendas spurred by the worldwide success of Amazon Alexa, Google Assistant, and others.
  • Enterprises are making progress with AI as it grows more widespread, and they’re also making more mistakes that contribute to their accelerating learning curve.

These and many other new insights are from Gartner’s Hype Cycle for AI, 2019, published earlier this year and summarized in the recent Gartner blog post, Top Trends on the Gartner Hype Cycle for Artificial Intelligence, 2019. Gartner’s definition of Hype Cycles includes five phases of a technology’s lifecycle and is explained here. Gartner’s latest Hype Cycle for AI reflects the growing popularity of AutoML, intelligent applications, and AI platform-as-a-service or AI cloud services as enterprises ramp up their adoption of AI. The Gartner Hype Cycle for AI, 2019, is shown below:

Details Of What’s New In Gartner’s Hype Cycle For AI, 2019

  • Speech Recognition is less than two years from mainstream adoption and is predicted to deliver the most significant transformational benefits of all technologies on the Hype Cycle. Gartner advises its clients to consider including speech recognition on their short-term AI technology roadmaps. Gartner observes that, unlike other technologies within the natural-language processing area, speech to text (and text to speech) is a stand-alone commodity whose modules can be plugged into a variety of natural-language workflows. Leading vendors in this technology area include Amazon, Baidu, Cedat 85, Google, IBM, Intelligent Voice, Microsoft, NICE, Nuance, and Speechmatics.
  • Eight new AI-based technologies are included in this year’s Hype Cycle, reflecting Gartner enterprise clients’ plans to scale AI across DevOps and IT while supporting new business models. The latest technologies to be included in the Hype Cycle for AI reflect how enterprises are trying to demystify AI to improve adoption while at the same time, fuel new business models. The new technologies include the following:
  1. AI Cloud Services – AI cloud services are hosted services that allow development teams to incorporate the advantages inherent in AI and machine learning.
  2. AutoML – Automated machine learning (AutoML) is the capability of automating the process of building, deploying, and managing machine learning models.
  3. Augmented Intelligence – Augmented intelligence is a human-centered partnership model of people and artificial intelligence (AI) working together to enhance cognitive performance, including learning, decision making, and new experiences.
  4. Explainable AI – AI researchers define “explainable AI” as an ensemble of methods that make black-box AI algorithms’ outputs sufficiently understandable.
  5. Edge AI – Edge AI refers to the use of AI techniques embedded in IoT endpoints, gateways, and edge devices, in applications ranging from autonomous vehicles to streaming analytics.
  6. Reinforcement Learning – Reinforcement learning has the primary potential for gaming and automation industries and has the potential to lead to significant breakthroughs in robotics, vehicle routing, logistics, and other industrial control scenarios.
  7. Quantum Computing – Quantum computing has the potential to make significant contributions to the areas of systems optimization, machine learning, cryptography, drug discovery, and organic chemistry. Although outside the planning horizon of most enterprises, quantum computing could have strategic impacts in key businesses or operations.
  8. AI Marketplaces – Gartner defines an AI Marketplace as an easily accessible place supported by a technical infrastructure that facilitates the publication, consumption, and billing of reusable algorithms. Some marketplaces are used within an organization to support the internal sharing of prebuilt algorithms among data scientists.
  • Gartner considers the following AI technologies to be on the rise and part of the Innovation Trigger phase of the AI Hype Cycle. AI Marketplaces, Reinforcement Learning, Decision Intelligence, AI Cloud Services, Data Labeling and Annotation Services, and Knowledge Graphs are now showing signs of potential technology breakthroughs, as evidenced by early proof-of-concept stories. Technologies in the Innovation Trigger phase of the Hype Cycle often lack usable, scalable products, and their commercial viability is not yet proven.
  • Smart Robots and AutoML are at the peak of the Hype Cycle in 2019. In contrast to the industrial robotics systems rapidly adopted by manufacturers facing labor shortages, Smart Robots are defined by Gartner as having electromechanical form factors that work autonomously in the physical world, learning in short-term intervals from human-supervised training and demonstrations, or from their supervised experiences, including taking direction from human voices on a shop floor. The Whiz robot from SoftBank Robotics is an example of a Smart Robot; it will be sold under a robot-as-a-service (RaaS) model and will initially be available only in Japan. AutoML is one of the most hyped technologies in AI this year. Gartner defines automated machine learning (AutoML) as the capability of automating the process of building, deploying, or managing machine learning models. Leading vendors providing AutoML platforms and applications include Amazon SageMaker, Big Squid, dotData, DataRobot, Google Cloud Platform, H2O.ai, KNIME, RapidMiner, and SkyTree.
  • Nine technologies were removed or reassigned from this year’s Hype Cycle for AI compared to 2018. Gartner has removed nine technologies, often reassigning them into broader categories. Augmented reality and virtual reality are now part of augmented intelligence, a more general category, and remain on many other Hype Cycles. Commercial UAVs (drones) are now part of edge AI, a more general category. Ensemble learning had already reached the Plateau in 2018 and has now graduated from the Hype Cycle. Human-in-the-loop crowdsourcing has been replaced by data labeling and annotation services, a broader category. Natural language generation is now included as part of NLP. Knowledge management tools have been replaced by insight engines, which are more relevant to AI. Predictive analytics and prescriptive analytics are now part of decision intelligence, a more general category.
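To make Gartner's AutoML definition concrete, the sketch below automates one slice of that process, model-family selection, in plain Python. It is a toy illustration under stated assumptions (closed-form one-parameter fits, a single held-out validation split), not how commercial AutoML platforms such as DataRobot or H2O.ai actually work:

```python
# Toy "AutoML" sketch: automate model selection by fitting several
# candidate model families and keeping the one with the lowest
# validation error. Real AutoML also automates feature engineering,
# hyperparameter tuning, and deployment.

def fit_scaled(train, transform):
    """Closed-form least squares for y ≈ a * transform(x)."""
    num = sum(transform(x) * y for x, y in train)
    den = sum(transform(x) ** 2 for x, y in train)
    a = num / den if den else 0.0
    return lambda x: a * transform(x)

def mse(model, data):
    """Mean squared error of a fitted model on a data set."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

def auto_select(train, valid, candidates):
    """Fit every candidate family, score on held-out data, keep the best."""
    scored = []
    for name, transform in candidates:
        model = fit_scaled(train, transform)
        scored.append((mse(model, valid), name, model))
    scored.sort(key=lambda t: t[0])
    return scored[0]  # (validation_mse, family_name, fitted_model)

if __name__ == "__main__":
    # Synthetic data from y = 3x²; the search should pick "quadratic".
    data = [(x, 3 * x * x) for x in range(1, 9)]
    train, valid = data[::2], data[1::2]
    candidates = [("linear", lambda x: x), ("quadratic", lambda x: x * x)]
    err, name, model = auto_select(train, valid, candidates)
    print(name, round(err, 6))
```

The same pattern scales up in real platforms: enumerate candidate pipelines, score each on held-out data, and promote the winner toward deployment.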

Sources:

Hype Cycle for Artificial Intelligence, 2019, Published 25 July 2019, (Client access reqd.)

Top Trends on the Gartner Hype Cycle for Artificial Intelligence, 2019 published September 12, 2019

State Of AI And Machine Learning In 2019

  • Marketing and Sales prioritize AI and machine learning higher than any other department in enterprises today.
  • In-memory analytics and in-database analytics are the most important to Finance, Marketing, and Sales when it comes to scaling their AI and machine learning modeling and development efforts.
  • R&D’s adoption of AI and machine learning is the fastest of all enterprise departments in 2019.

These and many other fascinating insights are from Dresner Advisory Services’ 6th annual 2019 Data Science and Machine Learning Market Study (client access reqd), published last month. The study found that advanced initiatives related to data science and machine learning, including data mining, advanced algorithms, and predictive analytics, ranked as the 8th priority among the 37 technologies and initiatives surveyed in the study. Please see page 12 of the survey for an overview of the methodology.

“The Data Science and Machine Learning Market Study is a progression of our analysis of this market, which began in 2014 as an examination of advanced and predictive analytics,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Since that time, we have expanded our coverage to reflect changes in sentiment and adoption, and have added new criteria, including a section covering neural networks.”

Key insights from the study include the following:

  • Data mining, advanced algorithms, and predictive analytics are among the highest-priority projects for enterprises adopting AI and machine learning in 2019. Reporting, dashboards, data integration, and advanced visualization are the leading technologies and initiatives strategic to Business Intelligence (BI) today. Cognitive BI (artificial-intelligence-based BI) ranks comparatively lower at 27th among priorities. The following graphic prioritizes the 27 technologies and initiatives strategic to business intelligence:

  • 40% of Marketing and Sales teams say data science encompassing AI and machine learning is critical to their success as a department. Marketing and Sales lead all departments in how significant they see AI and machine learning to pursue and accomplish their growth goals. Business Intelligence Competency Centers (BICC), R&D, and executive management audiences are the next most interested, and all top four roles cited carry comparable high combined “critical” and “very important” scores above 60%. The following graphic compares the importance levels by department for data science, including AI and machine learning:

  • R&D, Marketing, and Sales’ high level of shared interest across multiple feature areas reflects combined efforts to define new revenue growth models using AI and machine learning. Marketing, Sales, R&D, and Business Intelligence Competency Center (BICC) respondents report the most significant interest in having a range of regression models to work with in AI and machine learning applications. Marketing and Sales are also most interested in the next three top features, including hierarchical clustering, textbook statistical functions, and having a recommendation engine included in the applications and platforms they purchase. Dresner’s research team believes that the high shared interest in multiple feature areas by R&D, Marketing, and Sales is a leading indicator that enterprises are preparing to pilot AI and machine learning-based strategies to improve customer experiences and drive revenue. The following graphic compares interest and probable adoption by functional area of the enterprises interviewed:

  • 70% of R&D departments and teams are most likely to adopt data science, AI, and machine learning, leading all functions in an enterprise. Dresner’s research team sees the high level of interest by R&D teams as a leading indicator of broader enterprise adoption in the future. The study found 33% of all enterprises interviewed have adopted AI and machine learning, with the majority of enterprises having up to 25 models. Marketing & Sales lead all departments in their current evaluation of data science and machine learning software.

  • Financial Services & Insurance, Healthcare, and Retail/Wholesale say data science, AI, and machine learning are critical to their succeeding in their respective industries. 27% of Financial Services & Insurance, 25% of Healthcare and 24% of Retail/Wholesale enterprises say data science, AI, and machine learning are critical to their success. Less than 10% of Educational institutions consider AI and machine learning vital to their success. The following graphic compares the importance of data science, AI, and machine learning by industry:

  • The Telecommunications industry leads all others in interest and adoption of recommendation engines and model management governance. The Telecommunications, Financial Services, and Technology industries have the highest level of interest in adopting a range of regression models and hierarchical clustering across all industry respondent groups interviewed. Healthcare respondents have much lower interest in these latter features but high interest in Bayesian methods and text analytics functions. Retail/Wholesale respondents are often least interested in analytical features. The following graphic compares industries by their level of interest and potential adoption of analytical features in data science, AI, and machine learning applications and platforms:

  • Support for a broad range of regression models, hierarchical clustering, and commonly used textbook statistical functions are the top features enterprises need in data science and machine learning platforms. Dresner’s research team found these three features are considered the most important or “must-have” when enterprises are evaluating data science, AI, and machine learning applications and platforms. All enterprises surveyed also expect any data science application or platform they are evaluating to include a recommendation engine as well as model management and governance. The following graphic prioritizes the most and least essential features enterprises expect to see in data science, AI, and machine learning software and platforms:

  • The top three usability features enterprises are prioritizing today include support for easy iteration of models, access to advanced analytics, and an intuitive, simple process for continuous modification of models. Support and guidance in preparing analytical data models and fast cycle time for analysis with data preparation are among the highest-priority usability features enterprises expect to see in AI and machine learning applications and platforms. It’s interesting to see the usability attribute of not requiring a specialist to create, test, and run analytical models at the lower end of the usability rankings. Many AI and machine learning software vendors rely on not needing a specialist to use their applications as a differentiator, even though the majority of enterprises place a higher value on support for easy iteration of models, as the graphic below shows:

  • 2019 is a record year for enterprises’ interest in data science, AI, and machine learning features they perceive as the most needed to achieve their business strategies and goals. Enterprises most expect AI and machine learning applications and platforms to support a range of regression models, followed by hierarchical clustering and textbook statistical functions for descriptive statistics. Recommendation engines are growing in popularity as interest grew to at least a tie as the second most important feature to respondents in 2019. Geospatial analysis and Bayesian methods were flat or slightly less important compared to 2018. The following graphic compares six years of interest in data science, AI, and machine learning techniques:
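Since a range of regression models, hierarchical clustering, and textbook statistical functions keep surfacing as must-have features, a minimal sketch may help clarify what hierarchical clustering actually does. This toy single-linkage agglomerative clusterer on one-dimensional points is an illustration only; the platforms survey respondents evaluate handle multi-dimensional data, multiple linkage criteria, and dendrogram output:

```python
# Toy single-linkage agglomerative (hierarchical) clustering on 1-D points:
# start with every point in its own cluster and repeatedly merge the two
# closest clusters until only k remain.

def single_linkage(points, k):
    """Merge the two closest clusters until only k clusters remain."""
    clusters = [[p] for p in sorted(points)]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: cluster distance = closest pair of members.
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]  # j > i, so this index is still valid
    return [sorted(c) for c in clusters]

if __name__ == "__main__":
    # Two tight groups and one outlier should yield three clusters.
    print(single_linkage([1.0, 1.2, 9.8, 10.0, 5.0], 3))
```

The O(n³) pairwise scan is fine for a sketch; production implementations use priority queues or nearest-neighbor chains to scale to large data sets.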

How AI Is Protecting Against Payments Fraud

  • 80% of fraud specialists using AI-based platforms believe the technology helps reduce payments fraud.
  • 63.6% of financial institutions that use AI believe it is capable of preventing fraud before it happens, making it the most commonly cited tool for this purpose.
  • Fraud specialists unanimously agree that AI-based fraud prevention is very effective at reducing chargebacks.
  • The majority of fraud specialists (80%) have seen AI-based platforms reduce false positives and payments fraud and prevent fraud attempts.

AI is proving to be very effective in battling fraud based on results achieved by financial institutions as reported by senior executives in a recent survey, AI Innovation Playbook published by PYMNTS in collaboration with Brighterion. The study is based on interviews with 200 financial executives from commercial banks, community banks, and credit unions across the United States. For additional details on the methodology, please see page 25 of the study. One of the more noteworthy findings is that financial institutions with over $100B in assets are the most likely to have adopted AI, as the study has found 72.7% of firms in this asset category are currently using AI for payment fraud detection.

Taken together, the findings from the survey reflect how AI thwarts payments fraud and deserves to be a high priority in any digital business today. Companies, including Kount and others, are making strides in providing AI-based platforms, further reducing the risk of the most advanced, complex forms of payments fraud.

Why AI Is Perfect For Fighting Payments Fraud

Of the advanced technologies available for reducing false positives, reducing and preventing fraud attempts, and reducing manual reviews of potential payment fraud events, AI is ideally suited to provide the scale and speed needed to take on these challenges. More specifically, AI’s ability to interpret trend-based insights from supervised machine learning, coupled with entirely new knowledge gained from unsupervised machine learning algorithms, is reducing the incidence of payments fraud. By combining both machine learning approaches, AI can discern whether a given transaction or series of financial activities is fraudulent, alert fraud analysts immediately if so, and take action through predefined workflows. The following are the main reasons why AI is perfect for fighting payments fraud:

  • Payments fraud-based attacks are growing in complexity and often have a completely different digital footprint or pattern, sequence, and structure, which make them undetectable using rules-based logic and predictive models alone. For years e-commerce sites, financial institutions, retailers, and every other type of online business relied on rules-based payment fraud prevention systems. In the earlier years of e-commerce, rules and simple predictive models could identify most types of fraud. Not so today, as payment fraud schemes have become more nuanced and sophisticated, which is why AI is needed to confront these challenges.
  • AI brings scale and speed to the fight against payments fraud, providing digital businesses with an immediate advantage in battling the many risks and forms of fraud. What’s fascinating about the AI companies offering payments fraud solutions is how they’re trying to out-innovate each other when it comes to real-time analysis of transaction data. Real-time transactions require real-time security. Fraud solutions providers are doubling down on this area of R&D today, delivering impressive results. The fastest I’ve seen is a 250-millisecond response rate for calculating risk scores using AI on the Kount platform, basing queries on a decade’s worth of data in their universal data network. By combining supervised and unsupervised machine learning algorithms, Kount is delivering fraud scores that are twice as predictive as previous methods and faster than competitors.
  • AI’s many predictive analytics and machine learning techniques are ideal for finding anomalies in large-scale data sets in seconds. The more data a machine learning model has to train on, the more accurate its predictive value. The breadth and depth of the data a given machine learning algorithm learns from matter more than how advanced or complex the algorithm is. That’s especially true when it comes to payments fraud detection, where machine learning algorithms learn what legitimate versus fraudulent transactions look like from a contextual intelligence perspective. By analyzing historical account data from a universal data network, supervised machine learning algorithms can gain a greater level of accuracy and predictability. Kount’s universal data network is among the largest, including billions of transactions over 12 years, 6,500 customers, 180+ countries and territories, and multiple payment networks. The data network includes different transaction complexities, verticals, and geographies, so machine learning models can be properly trained to predict risk accurately. That analytical richness includes data on physical real-world and digital identities, creating an integrated picture of customer behavior.
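A minimal sketch may help make the blended approach described above concrete: an unsupervised anomaly signal combined with a supervised signal learned from labeled history. The features, the 50/50 weighting, and the country-level fraud rate below are illustrative assumptions, not Kount’s or any vendor’s actual scoring model:

```python
# Toy fraud-scoring sketch blending two signals:
#   unsupervised: how anomalous is the transaction amount (z-score)?
#   supervised:   labeled fraud rate among similar past transactions.
from statistics import mean, stdev

def fraud_score(txn, history):
    """Return a risk score in roughly [0, 1] for txn given labeled history."""
    amounts = [h["amount"] for h in history]
    mu, sigma = mean(amounts), stdev(amounts)
    # Unsupervised signal: squash the amount's z-score into [0, 1).
    z = abs(txn["amount"] - mu) / sigma if sigma else 0.0
    anomaly = z / (1.0 + z)
    # Supervised signal: fraud rate among past transactions from this country
    # (0.5, i.e. maximum uncertainty, when there is no history to learn from).
    same = [h for h in history if h["country"] == txn["country"]]
    fraud_rate = sum(h["fraud"] for h in same) / len(same) if same else 0.5
    # Illustrative 50/50 blend of the two signals.
    return 0.5 * anomaly + 0.5 * fraud_rate

if __name__ == "__main__":
    history = [
        {"amount": 20, "country": "US", "fraud": 0},
        {"amount": 25, "country": "US", "fraud": 0},
        {"amount": 30, "country": "US", "fraud": 0},
        {"amount": 900, "country": "XX", "fraud": 1},
    ]
    low = fraud_score({"amount": 22, "country": "US"}, history)
    high = fraud_score({"amount": 5000, "country": "XX"}, history)
    print(round(low, 2), round(high, 2))
```

In production, each signal would be a trained model over many features rather than a single statistic, but the architecture is the same: multiple learned signals fused into one score that a workflow engine can act on in real time.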

Bottom Line: Payments fraud is insidious, difficult to stop, and can inflict financial harm on any business in minutes. Battling payments fraud needs to start with a pre-emptive strategy: train machine learning models to quickly spot and act on threats, then build that strategy out across every selling and service channel a digital business relies on.

10 Charts That Will Change Your Perspective Of AI In Marketing

  • Top-performing companies are more than twice as likely to be using AI for marketing (28% vs. 12%) according to Adobe’s latest Digital Intelligence Briefing.
  • Retailers are investing $5.9B this year in AI-based marketing and customer service solutions to improve shoppers’ buying experiences according to IDC.
  • Financial Services marketers lead all other industries in AI application adoption, with 37% currently using them today.
  • Sales and Marketing teams most often collaborate using Configure-Price-Quote (CPQ) and Marketing Automation AI-based applications, with sales leaders predicting AI adoption will increase 155% across sales teams in two years.

Artificial Intelligence enables marketers to understand sales cycles better, correlating their strategies and spending to sales results. AI-driven insights are also helping to break down data silos so marketing and sales can collaborate more on deals. Marketing is more analytics and quant-driven than ever before with the best CMOs knowing which metrics and KPIs to track and why they fluctuate.

The bottom line is that machine learning and AI are the technologies CMOs and their teams need to excel today. The best CMOs balance the quant-intensive nature of running marketing with the qualitative factors that make a company’s brand and customer experience unique. With greater insight into how prospects decide when, where, and how to buy, CMOs are bringing a new level of intensity into driving outcomes. An example of this can be seen in the recent Forbes Insights and Quantcast research, Lessons of 21st-Century Brands Modern Brands & AI Report (17 pp., PDF, free, opt-in). The study found that AI enables marketers to increase sales (52%), improve customer retention (51%), and succeed at new product launches (49%). AI is making solid contributions to improving lead quality, persona development, segmentation, pricing, and service.

The following ten charts provide insights into how AI is transforming marketing:

  • 21% of sales leaders rely on AI-based applications today, with the majority collaborating with marketing teams sharing these applications. Sales leaders predict that their use of AI will increase 155% in the next two years. Sales leaders predict AI will reach critical mass by 2020 when 54% expect to be using these technologies. Marketing and sales are relying on AI-based marketing automation, configure-price-quote (CPQ), and intelligent selling systems to increase revenue and profit growth significantly in the next two years. Source: Salesforce Research, State of Sales, 3rd edition. (58 pp., PDF, free, opt-in).

  • AI sees the most significant adoption by marketers working in $500M to $1B companies, with conversational AI for customer service being the most dominant use case. Businesses with between $500M and $1B in revenue lead all other revenue categories in the number and depth of AI adoption use cases. Just over 52% of small businesses with sales of $25M or less are using AI for predictive analytics for customer insights. It’s interesting to note that small companies are the leaders in AI spending, at 38.1%, to improve marketing ROI by optimizing marketing content and timing. Source: The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association. (71 pp., PDF, free, no opt-in).

  • 22% of marketers currently use AI-based applications, with an additional 57% planning to adopt them in the next two years. There are nine dominant use cases marketers are concentrating on today, ranging from personalized channel experiences to programmatic advertising and media buying to predictive customer journeys and real-time next-best offers. Source: Salesforce's State of Marketing Study, 5th edition

  • Content personalization and predictive analytics from customer insights are the two areas where CMOs most prioritize AI spending today. The CMO Survey found that B2B service companies are the top users of AI for content personalization (62.2%), and that B2B product companies use AI for augmented and virtual reality, facial recognition, and visual search more than any other business type. Source: CMOs' Top Uses For AI: Personalization and Predictive Analytics. Marketing Charts. March 14, 2019

  • Personalizing the overall customer journey and driving next-best offers in real-time are the two most common ways marketing leaders are using AI today, according to Salesforce. Improving customer segmentation, improving advertising and media buying, and personalizing channel experiences are the next fastest-growing areas of AI adoption in marketing today. Source: Salesforce’s State of Marketing Study, 5th edition

  • 81% of marketers are either planning to use or are already using AI in audience targeting this year. 80% are currently using or planning to use AI for audience segmentation. EConsultancy's study found marketers are enthusiastic about AI's potential to increase marketing effectiveness and track progress. 88% of marketers interviewed say AI will enable them to be more effective in reaching their goals. Source: Dream vs. Reality: The State of Consumer First and Omnichannel Marketing. EConsultancy (36 pp., PDF, free, no opt-in).

  • Over 41% of marketers say AI is enabling them to generate higher revenues from e-mail marketing. They also see a more than 13% improvement in click-through rates and a 7.64% improvement in open rates. Source: 4 Positive Effects of AI Use in Email Marketing, Statista (infographic), March 1, 2019.

Additional data sources on AI’s use in Marketing:

15 examples of artificial intelligence in marketing, eConsultancy, February 28, 2019

4 Positive Effects of AI Use in Email Marketing, Statista, March 1, 2019

4 Ways Artificial Intelligence Can Improve Your Marketing (Plus 10 Provider Suggestions), Forbes, Kate Harrison, January 20, 2019

AI: The Next Generation Of Marketing Driving Competitive Advantage Throughout The Customer Life Cycle, Forrester Consulting. February 2017 (10 pp., PDF, free, no opt-in).

Artificial Intelligence for Marketing (complete book) (361 pp., PDF, free, no opt-in)

Artificial Intelligence Roundup, eMarketer, May 2018 (15 pp., PDF, free, no opt-in)

Digital Intelligence Briefing, Adobe, 2018 (43 pp., PDF, free, no opt-in).

How 28 Brands Are Using AI to Enhance Their Marketing [Infographic], Impact Blog

How AI Is Changing Sales, Harvard Business Review, July 30, 2018

How Top Marketers Use Artificial Intelligence On-Demand Webinar with Vala Afshar, Chief Digital Evangelist, Salesforce and Meghann York, Director, Product Marketing, Salesforce

How To Win Tomorrow’s Car Buyers – Artificial Intelligence in Marketing & Sales, McKinsey Center for Future Mobility, McKinsey & Company. February 2019. (44 pp., PDF, free, no opt-in)

IDC MarketScape: Worldwide Artificial Intelligence in Enterprise Marketing Clouds 2017 Vendor Assessment, (11 pp., PDF, free, no opt-in.)

In-depth: Artificial Intelligence 2019, Statista Digital Market Outlook, February 2019 (client access reqd).

Leading reasons to use artificial intelligence (AI) for marketing personalization according to industry professionals worldwide in 2018, Statista.

Lessons of 21st-Century Brands Modern Brands & AI Report, Forbes Insights and Quantcast Study (17 pp., PDF, free, opt-in).

Powerful pricing: The next frontier in apparel and fashion advanced analytics, McKinsey & Company, December 2018

Share of marketing and agency professionals who are comfortable with AI-enabled technology automated handling of their campaigns in the United States as of June 2018, Statista.  

The CMO Survey: Highlights and Insights Report, February 2019. Duke University, Deloitte and American Marketing Association. (71 pp., PDF, free, no opt-in).

Visualizing the uses and potential impact of AI and other analytics, McKinsey Global Institute, April 2018.  Interactive page based on Tableau data set can be found here.

What really matters in B2B dynamic pricing, McKinsey & Company, October 2018

Winning tomorrow’s car buyers using artificial intelligence in marketing and sales, McKinsey & Company, February 2019

Worldwide Spending on Artificial Intelligence Systems Will Grow to Nearly $35.8 Billion in 2019, According to New IDC Spending Guide, IDC; March 11, 2019

What’s Next For You? How AI Is Transforming Talent Management

Bottom Line: Taking on the talent crisis with greater intelligence and insight, delivering a consistently excellent candidate experience, and making diversity and inclusion part of their DNA differentiates growing businesses that are attracting and retaining employees. The book What's Next For You? by Ashutosh Garg, CEO and Co-Founder, and Kamal Ahluwalia, President of eightfold.ai, provides valuable insights and a data-driven roadmap for how AI is helping to solve the talent crisis for any business.

The Talent Crisis Is Real

  • 78% of CEOs and Chief Human Resource Officers (CHROs) say talent programs are important, yet 56% say their current programs are ineffective.
  • 83% of employees want a new job, yet only 53% want to leave for a new company.
  • 57% of employees say diversity and inclusion initiatives aren't working, and 40% say their companies lack qualified diverse talent.
  • Nearly 50% of an organization's top talent will leave their jobs within the first two years of being hired.
  • 28% of open positions today won't be filled in the next 12 months.

The above findings are just a sample of the depth of data-driven content and roadmap the book What’s Next For You? delivers. Co-authors Ashutosh Garg’s and Kamal Ahluwalia’s expertise in applying AI and machine learning to talent management problems with a strong data-first mindset is evident throughout the book. What makes the book noteworthy is how the authors write from the heart first with empathy for applicants and hiring managers, supporting key points with data. The empathetic, data-driven tone of the book makes the talent crisis relatable while also illustrating how AI can help any business make better talent management decisions.

“Businesses are having to adapt to technology changes and changes in customer expectations roughly every 10 years – a timeframe that is continuing to shrink. As a result, business leaders need to really focus on rethinking their business strategy and the associated talent strategy, so they have the organizational capability to transform and capitalize on the inevitable technology shifts,” writes John Thompson, Venture Partner, Lightspeed Venture Partners and Chairman of the Board at Microsoft, in the foreword.

The book cites talent management researchers and experts who say "our current knowledge base has a half-life of about two years, and the speed of technology is outperforming us as humans because of what it can do quickly and effectively" (p. 64). John Thompson's observation in the foreword that the time available for adapting to change is shrinking is a unifying thread that ties the book together. One of the most convincing data points: using today's Applicant Tracking Systems (ATS) and hiring processes prone to bias, there's a 30% chance a new hire will not make it through their first year. If the new hire is a cloud computing professional, this equates to a median salary of $146,350 and a best-case 46 days to find a replacement. The cost and time lost from losing just one recruited cloud computing professional can derail a project for months, and it will cost at least $219,000 to replace that one engineer. Any manager who has lost a new hire within a year can relate to how real the talent crisis is and how urgent it is to solve.
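The replacement-cost arithmetic above is easy to verify. As a rough sketch (the 1.5x multiplier is a common industry rule of thumb assumed here, not a figure stated in the book), the cited $146,350 median salary is consistent with the "at least $219,000" replacement estimate:

```python
# Rough cost-of-turnover check using the figures cited above.
# The 1.5x replacement multiplier is an assumed rule of thumb,
# not a number taken from the book itself.
median_salary = 146_350          # cited median salary for a cloud computing professional
replacement_multiplier = 1.5     # assumed: replacement cost ~1.5x annual salary
replacement_cost = median_salary * replacement_multiplier
print(f"Estimated replacement cost: ${replacement_cost:,.0f}")
```

The result lands within about $500 of the article's $219,000 floor, which suggests that multiplier is roughly where the figure comes from.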

The Half-Life Of Skills Is Shrinking Fast

The most compelling chapter of the book illustrates how today's talent crisis can be solved by taking an AI-enabled approach to every aspect of talent management. Chapter 4, The Half-Life Of Skills Is Shrinking Fast, delves into how AI can find candidates who can unlearn old concepts and quickly master new ones. The book calls out this attribute of any potential new hire as essential for them to adapt. Using higher-quality data than is available in traditional ATS systems, the authors illustrate how AI-based systems can evaluate both the potential and the experience of applicants to match them with positions they will excel in. The authors make a convincing argument that AI can increase the probability of new-candidate success. They cite the well-known Leadership IQ statistic that 46% of all new employee hires fail to adapt within 18 months, and the Harvard Business Review study finding that between 40% and 60% of new upper-management hires fail within 18 months. The authors contend that even Leonardo da Vinci, one of the primary architects of the Renaissance, would have trouble finding work using a traditional resume entered into an ATS system today, because his exceptional capabilities and potential would never be discovered. When our existing recruiting process is based on practices over 500 years old, as a surviving copy of Leonardo da Vinci's resume illustrates, it's time to put AI to work matching people's potential with unique position requirements.

When Employees Achieve Their Potential, Companies Do Too   

Attracting the highest potential employees possible and retaining them is the cornerstone of any digital business’ growth strategy today and in the future. The book addresses the roadblocks companies face in attaining that goal, with bias being one of the strongest. “For example, McKinsey & Co., a top consulting agency, studied over 1,000 companies across 12 countries and found that firms in the top quartile of gender diversity were a fifth more likely to have above-average profits than those in the bottom quartile,” (p. 105). Further, “diverse executive boards generate better financial returns, and gender-diverse teams are more creative, more productive and more confident.” (p. 105).

In conclusion, consider this book a roadmap of how hiring and talent management can change for the better based on AI. The authors successfully illustrate how combining talent, personalization at scale, and machine learning can help employees achieve their potential, enabling companies to achieve theirs in the process.

Indeed’s 10 Most Popular AI & Machine Learning Jobs This Year

  • AI and Machine Learning job postings on Indeed rose 29.10% over the last year between May 2018 and May 2019.
  • Machine Learning and Deep Learning Engineers are the most popular jobs posted on Indeed between 2018 and 2019.
  • Machine Learning Engineers are earning an average salary of $142,858.57 in 2019 based on an analysis of all open positions on Indeed.
  • Indeed is seeing a leveling off of candidate-initiated searches for AI & Machine Learning (ML) jobs, dropping 14.5% between May 2018 and May 2019.

These and many other insights are from Indeed's recent report on the top 10 AI jobs and their salaries. Indeed's analytics team analyzed AI and machine learning hiring trends in 2019 to discover the top positions, highest salaries, and where the best opportunities are. The following are key insights from their latest study of AI and machine learning recruiting and hiring trends:

  • Machine Learning Engineers earn an average salary of $142,858.57 in 2019, based on an analysis of all open positions on Indeed. The Indeed analytics team found that the average annual salary for Machine Learning Engineers has grown by $8,409 in just a year, an increase of 5.8%. Algorithm Engineers' average annual salary rose to $109,313 this year, an increase of $5,201, or 5%. Both salary bumps are likely a result of organizations spending more to attract talent to these crucial roles in a competitive AI job market.

  • Machine Learning and Deep Learning Engineers are the most sought-after jobs posted on Indeed between 2018 and 2019. The Indeed analytics team identified the top 10 positions with the highest percentage of job descriptions that include the keywords "artificial intelligence" or "machine learning." New jobs appearing on the list for the first time include Senior Data Scientist, Junior Data Scientist, Developer Consultant, Director of Data Science, and Lead Data Scientist. The inclusion of five new titles and the mix of skills shown in the table below reflect organizations' growing expertise in using AI, deep learning, and machine learning to drive business outcomes.

  • AI and Machine Learning job postings on Indeed rose 29.10% over the last year, between May 2018 and May 2019. Indeed found the increase is significantly smaller than in the previous two years: from May 2017 to May 2018, AI job postings on Indeed rose 57.91%, and they rose a whopping 136.29% between May 2016 and May 2017.
  • Indeed is seeing a leveling off of candidate-initiated searches for AI & Machine Learning (ML) jobs, dropping 14.5% between May 2018 and May 2019. In comparison, searches increased 32% between May 2017 and May 2018 and 49.1% between May 2016 and May 2017. There are demand- and supply-side explanations for the 14.5% drop. On the demand side, AI and machine learning reaching broader adoption and maturing in organizations is leading to a greater variety of skills being recruited for; the 14.5% reduction reflects the broadening base of skills enterprises need to get the most out of AI and machine learning. On the supply side, potential job candidates are seeing the broadening base of skills they need to get hired, which is quickly making job descriptions from two years ago or older obsolete. Finding candidates who have the capabilities and potential to excel in AI and machine learning positions needs to get beyond relying on job descriptions alone. Eightfold is doing just that, relying on machine learning algorithms to match candidates who have the optimal set of capabilities and potential for every open position an organization has.
  • New York, San Francisco, and Washington D.C. are the top three cities for AI and machine learning jobs in 2019. Indeed’s 2018 study also found New York and San Francisco leading all other metropolitan areas in open positions. New York’s diverse industries that range from banking, financial services, institutional investing, insurance to a growing AI startup community all contribute to its ranking first in the U.S. for AI positions.

Customer Experiences Define Success In A Digital-First World

  • 91% of enterprises have adopted or have plans to adopt a digital-first strategy. Of these enterprises, 48% already have a digital-first approach in place.
  • Creating better customer experiences (67%), improving process efficiency through automation (53%), and driving new revenue (48%) are the top three digital business strategies enterprises are investing in today.
  • 35% of enterprises have experienced revenue growth due to digital business initiatives over the past 12 months.
  • 5G, Artificial Intelligence, and Machine Learning are the top technologies being researched by enterprises who are defining digital business strategies.
  • Enterprises are planning to spend $15.3M on digital initiatives over the next 12 months. 59% will be allocated to technology, and 41% will be dedicated to people and skills.

These and many other fascinating insights are from the second annual IDG Digital Business study, The State of Digital Business Transformation 2019. You can download a summary of the slides here (7 pp., PDF, opt-in). The survey’s methodology is based on 702 interviews across nine industries with technology, financial services, and business services (consulting, legal and real estate) comprising 43% of all respondents. IDG relied on CIO, Computerworld, CSO, InfoWorld, and Network World visitors as their primary respondent base. For additional details regarding the methodology, please see page 2 of the study.

The study’s primary goal was to gain a better understanding of where organizations are in their approaches to becoming digital-first businesses. The study captures the strategies and technologies businesses are adopting to ensure digitally-driven growth with customer experience improvements being proven as a growth catalyst. Key insights from the survey include the following:

  • 52% of enterprises define digital business as meeting customer experience expectations, jumping to 65% for financial services enterprises. Customer expectations outrank all other categories of how an enterprise defines a digital business. 49% define digital business as enabling worker productivity with mobile apps, data access, and AI-assisted automation. The following graphic compares how enterprises define their digital business.

  • Mobile devices and apps are enterprises’ platform of choice for launching digital-first strategies in 2019. Mobile apps and the platforms supporting them provide the needed scale, speed-to-market, and performance gains through application-level improvements that all businesses need to gain initial adoption and growth with their digital-first strategies. IDG found that private cloud and business process management are the second- and third-most used technologies to drive digital-first initiatives. Enterprises also have a considerable lead when it comes to mobile app availability: 74% have mobile apps today compared to 51% of SMBs.

  • Internet of Things (IoT), Artificial Intelligence (AI) and machine learning are the leading three initiatives enterprises have in pilot today as part of their digital-first initiatives. 21% of all organizations surveyed are in one or more IoT pilots, and 20% of organizations are piloting AI and machine learning projects today. Nearly a third of all organizations (29%) have multi-cloud configurations in production today, and 25% have software-defined Wide Area Networks (WANs).

  • 57% of enterprises (companies with over 1K employees) say improving new product and service offerings by digitally enabling operations is the single greatest source of revenue growth. Digitally enabling or streamlining new product development processes and the systems supporting them also improves the ability to innovate and seize new opportunities (49%). It makes sense that once the new product development process is more digitally enabled, an organization will be able to more efficiently launch new capabilities (47% in enterprises) and improve sales capacity, including upsell and cross-sell (41% overall).

  • Creating better customer experiences (67%), improving process efficiency through automation (53%), and driving new revenue (48%) are the top three digital business strategies enterprises are investing in today. Business Management, including General Managers with P&L responsibility, places a high priority on creating a better customer experience, far above all else. They're also the revenue drivers of businesses adopting a digital-first strategy today, over 10% higher than IT Management and 12% higher than IT executives.

  • In the most successful digital-first businesses, the CIO is the most visible, vocal, and successful leader of change management initiatives. Six of the nine core dimensions of a successful digital enablement strategy are dominated by CIOs. Technology Needs Assessment (48%), IT Skills Assessment (48%), and Change Management (33%) are the three areas where CIOs are making the greatest contribution to their businesses' digital-first strategies. It's important to note that CIOs are far and away the champions and leaders of data management strategies as well.

  • Enterprises are placing a high priority on data security and protection as part of their digital-first initiatives, with 27% having cybersecurity systems in place. It's encouraging to see business and IT leaders making data and system security their highest priority and getting results quickly in this area. Technology needs assessment and IT skills assessment (both 24%) are also areas where enterprises are making strong progress. As the CIO owns these areas and is also the person most likely to own change management, it's understandable how advanced digital-first businesses are on these two dimensions. The following graphic compares the progress enterprises are making in becoming digitally-driven businesses.

How To Get Your Data Scientist Career Started

The most common request from this blog's readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I've invited Alyssa Columbus, a Data Scientist at Pacific Life, to share her insights and lessons learned on breaking into the field of data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first job in data science, isn't easy, given the surplus of analytics job-seekers relative to open analytics positions.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I've referred them to many different resources, including discussions I've had on the Dataquest.io blog and the Scatter Podcast. In the interest of providing job seekers with a comprehensive view of what I've learned works, I've put together the five most valuable lessons learned. I've written this article to make your data science job hunt as easy and efficient as possible.

  • Continuously build your statistical literacy and programming skills. Currently, there are 24,697 open Data Scientist positions on LinkedIn in the United States alone. Using data mining techniques to analyze all open positions in the U.S., I created the following list of the top 10 data science skills. As of April 14, the top three most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and Tensorflow. The following graphic provides a prioritized list of the most in-demand data science skills mentioned in LinkedIn job postings today.

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn's job postings prioritize. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and never applying what you've learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for the big questions to ask of your dataset. Statistical literacy, or "how" to find the answers to your questions, comes with education and practice. Strengthening your intellectual curiosity, or insight into asking the right questions, comes through experience.
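As an illustration of the kind of lightweight data mining described above, the sketch below counts skill mentions across a handful of made-up posting snippets. The postings and skill list are purely illustrative stand-ins for a real export of job-board data:

```python
from collections import Counter

# Hypothetical sample of job-posting texts; in practice these would come
# from a job board's API or a scraped dataset.
postings = [
    "Seeking data scientist proficient in Python, SQL, and AWS.",
    "Must know Python and R; experience with Tensorflow a plus.",
    "Analyst role: SQL, R, and Jupyter notebooks required.",
]

skills = ["python", "r", "sql", "aws", "tensorflow", "jupyter"]

def count_skill_mentions(texts, skills):
    """Count how many postings mention each skill at least once."""
    counts = Counter()
    for text in texts:
        # Tokenize crudely and strip punctuation so short skill names
        # like "r" only match whole words, not letters inside words.
        tokens = {t.lower().strip(".,;:") for t in text.split()}
        for skill in skills:
            if skill in tokens:
                counts[skill] += 1
    return counts

ranked = count_skill_mentions(postings, skills).most_common()
print(ranked)
```

A real analysis would add stemming, multi-word skill phrases ("machine learning"), and de-duplication of reposted listings, but the counting core is the same.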

  • Continually build your own unique portfolio of analytics and machine learning projects. Having a good portfolio is essential to being hired as a data scientist, especially if you don't come from a quantitative background or have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and the skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you're most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you've completed on your own. Some skills I'd recommend you highlight in your portfolio include:
    • Your programming language of choice (e.g., Python, R, Julia, etc.).
    • The ability to interact with databases (e.g., your ability to use SQL).
    • Visualization of data (static or interactive).
    • Storytelling with data. This is a critical skill. In essence, can someone with no background in whatever area your project is in look at your project and gain some new understandings from it?
    • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard).

Julia Silge and Amber Thomas both have excellent examples of portfolios that you can be inspired by. Julia’s portfolio is shown below.
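For the API-deployment skill mentioned in the list above, even a dependency-free sketch can demonstrate the idea. The following hypothetical example (the model coefficients, field names, and port are invented for illustration) wraps a trivial "model" in a JSON-over-HTTP prediction handler using only the Python standard library:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in "model": a hard-coded linear fit predicting salary from years
# of experience. In a portfolio project this would be a serialized
# scikit-learn (or similar) model; the numbers here are made up.
COEF, INTERCEPT = 9500.0, 58000.0

def predict(features):
    """Return a prediction for a JSON payload like {"years_experience": 4}."""
    years = float(features["years_experience"])
    return {"predicted_salary": COEF * years + INTERCEPT}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body, run the model, and reply with JSON.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(predict(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve locally (this call blocks until interrupted):
#   HTTPServer(("localhost", 8000), PredictHandler).serve_forever()
```

A portfolio version would swap the hard-coded linear function for a trained model, add input validation, and document the endpoint, but even this much demonstrates you can put a model behind an API.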

  • Get (or git!) yourself a website. If you want to stand out, along with a portfolio, create and continually build a strong online presence in the form of a website. Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion for and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all, it's free to do. A strong online presence will not only help you in applying for jobs; organizations may also reach out to you with freelance projects, interviews, and other opportunities.
  • Be confident in your skills and apply for any job you're interested in, starting with opportunities available in your network. If you don't meet all of a job's requirements, apply anyway. You don't have to know every skill (e.g., every programming language) on a job description, especially if there are more than ten listed. If you're a great fit for the main requirements of the job description, you should apply. A good general rule is that if you have at least half of the skills requested on a job posting, go for it. When you're hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I've found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specializing in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:
    • Recruiters
    • Friends, family, and colleagues
    • Career fairs and recruiting events
    • General job boards
    • Company websites
    • Tech job boards.

Alyssa Columbus is a Data Scientist at Pacific Life and a member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher at the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics, and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.

Seven Things You Need To Know About IIoT In Manufacturing

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years.
  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies.
  • Connected IoT technologies are enabling a new era of smart, connected products that often expand on the long-proven platforms of everyday products. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.

These and many other fascinating insights are from IoT Analytics’ study, IIoT Platforms For Manufacturing 2019 – 2024 (155 pp., PDF, client access reqd). IoT Analytics is a leading provider of market insights for the Internet of Things (IoT), M2M, and Industry 4.0. They specialize in providing insights on IoT markets and companies, focused market reports on specific IoT segments, and Go-to-Market services for emerging IoT companies. The study’s methodology includes interviews with twenty of the leading IoT platform providers, executive-level IoT experts, and IIoT end users. For additional details on the methodology, please see pages 136 and 137 of the report. IoT Analytics defines the Industrial IoT (IIoT) as heavy industries, including manufacturing, energy, oil and gas, and agriculture, in which industrial assets are connected to the internet.

The seven things you need to know about IIoT in manufacturing include the following:

  • IoT Analytics’ technology architecture of the Internet of Things reflects the proliferation of new products, software and services, and the practical needs manufacturers have for proven integration to make the Industrial Internet of Things (IIoT) work. IoT technology architectures are in their nascent phase, showing signs of potential in solving many of manufacturing’s most challenging problems. IoT Analytics’ technology architecture shown below is designed to scale in response to the diverse development across the industry landscape with a modular, standardized approach.

  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies. IoT Analytics is seeing IIoT platforms begin to replace existing industrial software systems that had been created to bridge the IT and OT gaps in manufacturing environments. Their research teams are finding that IIoT Platforms are an adjacent technology to these typical industrial software solutions but are now starting to replace some of them in smart connected factory settings. The following graphic explains how IoT Analytics sees the IIoT influence across the broader industrial landscape:

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years. IoT Analytics finds that manufacturing is the largest IoT platform industry segment and will continue to be one of the primary growth catalysts of the market through 2024. For purposes of their analysis, IoT Analytics defines manufacturing as standardized production environments, including factories and workshops, in addition to custom production worksites such as mines, offshore oil and gas platforms, and construction sites. The IIoT platforms for manufacturing segment has experienced growth in traditionally large manufacturing-base countries such as Japan and China. IoT Analytics relies on econometric modeling to create its forecasts.

  • In 2018, the industrial IoT platforms market for manufacturing had an approximate 60%/40% split between within-factory and outside-factory deployments, respectively. IoT Analytics predicts this split will remain mostly unchanged for 2019, and that by 2024 within-factory deployments will gain a few percentage points. The within-factories type (of IIoT platforms for manufacturing) is estimated to grow from a $1B market in 2018 to a $1.5B market by 2019, driven by an ever-increasing amount of automation (e.g., robots on the factory floor) being introduced to factory settings for increased efficiency, while the outside-factories type is forecast to grow from $665M in 2018 to a $960M market by 2019.

  • Discrete manufacturing is predicted to account for the largest share of Industrial IoT platform spending in 2019, growing at a 46% CAGR from 2018. Discrete manufacturing will outpace batch and process manufacturing, accounting for 53% of all IIoT platform spending this year. IoT Analytics sees discrete manufacturers pursuing make-to-stock, make-to-order, and assemble-to-order production strategies that require sophisticated planning, scheduling, and tracking capabilities to improve operations and profitability. The greater the production complexity in discrete manufacturing, the more valuable data becomes. Discrete manufacturing is one of the most data-prolific industries, making it an ideal catalyst for the continued growth of IIoT platforms.

  • Manufacturers rely on IIoT platforms most for general process optimization (43.1%), general dashboards & visualization (41.1%), and condition monitoring (32.7%). Batch, discrete, and process manufacturers are also prioritizing use cases such as predictive maintenance, asset tracking, and energy management, as all three make direct contributions to improving shop-floor productivity. Discrete manufacturers are always looking to free up extra time in production schedules so they can offer short-notice production runs to their customers. Combining IIoT platform use cases to uncover process and workflow inefficiencies, so that more short-notice production runs can be sold, is driving Proofs of Concept (PoCs) in North American manufacturing today.

  • IIoT platform early adopters prioritize security as the most important feature, ahead of scalability and usability. Identity and Access Management, multi-factor authentication, consistency of security patch updates, and the ability to scale and protect every threat surface across an IIoT network are high priorities for IIoT platform adopters today. Scalability and usability are the second and third priorities, respectively. The following graphic compares IIoT platform manufacturers' most important needs:

For more information on the insights presented here, check out IoT Analytics’ report: IIoT Platforms For Manufacturing 2019 – 2024.
