
How To Get Your Data Scientist Career Started

The most common request from this blog's readers is how to further their careers in analytics, cloud computing, data science, and machine learning. I've invited Alyssa Columbus, a Data Scientist at Pacific Life, to share her insights and lessons learned on breaking into data science and launching a career there. The following guest post is authored by her.

Earning a job in data science, especially your first, isn't easy, given the surplus of analytics job-seekers relative to open analytics jobs.

Many people looking to break into data science, from undergraduates to career changers, have asked me how I attained my current data science position at Pacific Life. I've referred them to many different resources, including discussions I've had on the Dataquest.io blog and the Scatter Podcast. In the interest of giving job seekers a comprehensive view of what I've learned works, I've put together the five most valuable lessons. I've written this article to make your data science job hunt as easy and efficient as possible.

  • Continuously build your statistical literacy and programming skills. Currently, there are 24,697 open Data Scientist positions on LinkedIn in the United States alone. Using data mining techniques to analyze all open positions in the U.S., I created the following list of the top 10 data science skills. As of April 14, the top 3 most common skills requested in LinkedIn data scientist job postings are Python, R, and SQL, closely followed by Jupyter Notebooks, Unix Shell/Awk, AWS, and TensorFlow. The following graphic provides a prioritized list of the most in-demand data science skills mentioned in LinkedIn job postings today.

Hands-on training is the best way to develop and continually improve statistical and programming skills, especially with the languages and technologies LinkedIn's job postings prioritize. Getting your hands dirty with a dataset is often much better than reading through abstract concepts and never applying what you've learned to real problems. Your applied experience is just as important as your academic experience, and taking statistics and computer science classes helps translate theoretical concepts into practical results. The toughest thing to learn (and also to teach) about statistical analysis is the intuition for the big questions to ask of your dataset. Statistical literacy, or "how" to find the answers to your questions, comes with education and practice. The intellectual curiosity and insight to ask the right questions come through experience.
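
As a minimal sketch of the kind of hands-on practice described above, the example below asks two basic questions of a toy dataset (the numbers are invented for illustration): what do the variables look like, and do they move together? It uses only Python's standard library.

```python
import statistics

# Toy dataset (invented for illustration): weekly ad spend vs. product sales
ad_spend = [12.0, 15.5, 9.0, 20.0, 18.5, 11.0, 25.0, 22.5]
sales = [120.0, 150.0, 95.0, 210.0, 190.0, 110.0, 260.0, 230.0]

# Summary statistics -- the first questions to ask of any new dataset
mean_spend = statistics.mean(ad_spend)
median_sales = statistics.median(sales)

# Pearson correlation, computed by hand: do the two variables move together?
mean_sales = statistics.mean(sales)
cov = sum((x - mean_spend) * (y - mean_sales)
          for x, y in zip(ad_spend, sales)) / (len(ad_spend) - 1)
r = cov / (statistics.stdev(ad_spend) * statistics.stdev(sales))

print(f"mean spend: {mean_spend}, median sales: {median_sales}")
print(f"correlation: {r:.3f}")
```

Exercises this small won't land you a job on their own, but the habit they build, inspecting distributions before trusting a model, scales directly to real projects.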

  • Continually build your own, unique portfolio of analytics and machine learning projects. Having a good portfolio is essential to being hired as a data scientist, especially if you don't come from a quantitative background or don't have prior experience in data science. Think of your portfolio as proof to potential employers that you are capable of excelling in the role of a data scientist, with both the passion and the skills to do the job. When building your data science portfolio, select and complete projects that qualify you for the data science jobs you're most interested in. Use your portfolio to promote your strengths and innate abilities by sharing projects you've completed on your own. Some skills I'd recommend you highlight in your portfolio include:
    • Your programming language of choice (e.g., Python, R, Julia, etc.).
    • The ability to interact with databases (e.g., your ability to use SQL).
    • Visualization of data (static or interactive).
    • Storytelling with data. This is a critical skill. In essence, can someone with no background in your project's subject area look at it and gain new understanding from it?
    • Deployment of an application or API. This can be done with small sample projects (e.g., a REST API for an ML model you trained or a nice Tableau or R Shiny dashboard).
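
As one illustration of the database-interaction skill above, here is a self-contained sketch using Python's built-in sqlite3 module; the table and data are invented for illustration, but the pattern (create, insert, aggregate) is exactly what a portfolio project should demonstrate:

```python
import sqlite3

# In-memory database with an invented example table of regional sales
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("West", 120.0), ("East", 80.0), ("West", 45.5), ("East", 60.0)],
)

# A typical portfolio-ready query: aggregate amounts by region
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 140.0), ('West', 165.5)]
conn.close()
```

In a real portfolio piece, the same SQL skills would be pointed at a public dataset rather than an in-memory toy, with the queries and their results written up as part of the project's story.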

Julia Silge and Amber Thomas both have excellent examples of portfolios you can draw inspiration from. Julia's portfolio is shown below.

  • Get (or git!) yourself a website. If you want to stand out, complement your portfolio by creating and continually building a strong online presence in the form of a website. Be sure to create and continually add to your GitHub and Kaggle profiles to showcase your passion for and proficiency in data science. Making your website with GitHub Pages creates a profile for you at the same time, and best of all, it's free. A strong online presence will not only help you in applying for jobs; organizations may also reach out to you with freelance projects, interviews, and other opportunities.
  • Be confident in your skills and apply for any job you're interested in, starting with opportunities available in your network. If you don't meet all of a job's requirements, apply anyway. You don't have to know every skill (e.g., every programming language) on a job description, especially if more than ten are listed. If you're a great fit for the main requirements of the job description, apply. A good general rule: if you have at least half of the skills requested on a job posting, go for it. When you're hunting for jobs, it may be tempting to look for work on company websites or tech-specific job boards. I've found, as have many others, that these are among the least helpful ways to find work. Instead, contact recruiters specializing in data science and build up your network to break into the field. I recommend looking for a data science job via the following sources, with the most time devoted to recruiters and your network:
    • Recruiters
    • Friends, family, and colleagues
    • Career fairs and recruiting events
    • General job boards
    • Company websites
    • Tech job boards

Alyssa Columbus is a Data Scientist at Pacific Life and a member of the Spring 2018 class of NASA Datanauts. Previously, she was a computational statistics and machine learning researcher in the UC Irvine Department of Epidemiology and has built robust predictive models and applications for a diverse set of industries spanning retail to biologics. Alyssa holds a degree in Applied and Computational Mathematics from the University of California, Irvine and is a member of Phi Beta Kappa. She is a strong proponent of reproducible methods, open source technologies, and diversity in analytics and is the founder of R-Ladies Irvine. You can reach her at her website: alyssacolumbus.com.


Seven Things You Need To Know About IIoT In Manufacturing

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years.
  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies.
  • Connected IoT technologies are enabling a new era of smart, connected products that often expand on the long-proven platforms of everyday products. Capgemini estimates that the size of the connected products market will be $519B to $685B by 2020.

These and many other fascinating insights are from IoT Analytics' study, IIoT Platforms For Manufacturing 2019 – 2024 (155 pp., PDF, client access required). IoT Analytics is a leading provider of market insights for the Internet of Things (IoT), M2M, and Industry 4.0. They specialize in providing insights on IoT markets and companies, focused market reports on specific IoT segments, and go-to-market services for emerging IoT companies. The study's methodology includes interviews with twenty of the leading IoT platform providers, executive-level IoT experts, and IIoT end users. For additional details on the methodology, please see pages 136 and 137 of the report. IoT Analytics defines the Industrial IoT (IIoT) as heavy industries, including manufacturing, energy, oil and gas, and agriculture, in which industrial assets are connected to the internet.

The seven things you need to know about IIoT in manufacturing include the following:

  • IoT Analytics’ technology architecture of the Internet of Things reflects the proliferation of new products, software and services, and the practical needs manufacturers have for proven integration to make the Industrial Internet of Things (IIoT) work. IoT technology architectures are in their nascent phase, showing signs of potential in solving many of manufacturing’s most challenging problems. IoT Analytics’ technology architecture shown below is designed to scale in response to the diverse development across the industry landscape with a modular, standardized approach.

  • IIoT platforms are beginning to replace MES and related applications, including production maintenance, quality, and inventory management, which are a mix of Information Technology (IT) and Operations Technology (OT) technologies. IoT Analytics is seeing IIoT platforms begin to replace existing industrial software systems that had been created to bridge the IT and OT gaps in manufacturing environments. Their research teams are finding that IIoT Platforms are an adjacent technology to these typical industrial software solutions but are now starting to replace some of them in smart connected factory settings. The following graphic explains how IoT Analytics sees the IIoT influence across the broader industrial landscape:

  • Global spending on IIoT Platforms for Manufacturing is predicted to grow from $1.67B in 2018 to $12.44B in 2024, attaining a 40% compound annual growth rate (CAGR) in seven years. IoT Analytics finds that manufacturing is the largest IoT platform industry segment and will continue to be one of the primary growth catalysts of the market through 2024. For purposes of their analysis, IoT Analytics defines manufacturing as standardized production environments, including factories and workshops, in addition to custom production worksites such as mines, offshore oil and gas, and construction sites. The IIoT platforms for manufacturing segment has experienced growth in traditionally large manufacturing-base countries such as Japan and China. IoT Analytics relies on econometric modeling to create their forecasts.
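
The headline growth rate can be sanity-checked directly: compounding from $1.67B at year-end 2018 to $12.44B at year-end 2024 spans six compounding periods, which works out to a CAGR of roughly 40%, matching the study's figure. A quick sketch:

```python
# Sanity-check the reported CAGR: $1.67B (2018) -> $12.44B (2024).
# Six compounding periods separate the two year-end figures.
start, end, periods = 1.67, 12.44, 2024 - 2018

cagr = (end / start) ** (1 / periods) - 1
print(f"CAGR: {cagr:.1%}")  # -> CAGR: 39.7%
```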

  • In 2018, the industrial IoT platforms market for manufacturing had an approximate 60%/40% split between within-factory and outside-factory deployments. IoT Analytics predicts this split will remain mostly unchanged for 2019, and by 2024 within-factory deployments will achieve slight gains of a few percentage points. The within-factories type (of IIoT Platforms for Manufacturing) is estimated to grow from a $1B market in 2018 to a $1.5B market by 2019, driven by an ever-increasing amount of automation (e.g., robots on the factory floor) being introduced to factory settings for increased efficiencies, while the outside-factories type is forecast to grow from $665M in 2018 to become a $960M market by 2019.

  • Discrete manufacturing is predicted to account for the largest share of Industrial IoT platform spending in 2019, growing at a CAGR of 46% from 2018. Discrete manufacturing will outpace batch and process manufacturing, becoming 53% of all IIoT platform spending this year. IoT Analytics sees discrete manufacturers pursuing make-to-stock, make-to-order, and assemble-to-order production strategies that require sophisticated planning, scheduling, and tracking capabilities to improve operations and profitability. The greater the production complexity in discrete manufacturing, the more valuable data becomes. Discrete manufacturing is one of the most data-prolific industries there is, making it an ideal catalyst for IIoT platforms' continued growth.

  • Manufacturers rely most on IIoT platforms for general process optimization (43.1%), general dashboards & visualization (41.1%), and condition monitoring (32.7%). Batch, discrete, and process manufacturers are prioritizing other use cases such as predictive maintenance, asset tracking, and energy management, as all three areas make direct contributions to improving shop floor productivity. Discrete manufacturers are always looking to free up extra time in production schedules so that they can offer short-notice production runs to their customers. Combining IIoT platform use cases to uncover process and workflow inefficiencies so more short-notice production runs can be sold is driving Proofs of Concept (PoC) today in North American manufacturing.

  • IIoT platform early adopters prioritize security as the most important feature, ahead of scalability and usability. Identity and Access Management, multi-factor authentication, consistency of security patch updates, and the ability to scale and protect every threat surface across an IIoT network are high priorities for IIoT platform adopters today. Scalability and usability are the second and third priorities. The following graphic compares IIoT platform manufacturers' most important needs:

For more information on the insights presented here, check out IoT Analytics’ report: IIoT Platforms For Manufacturing 2019 – 2024.

How To Improve Privileged User’s Security Experiences With Machine Learning

Bottom Line: One of the primary factors motivating employees to sacrifice security for speed is the frustration they face attempting to re-authenticate who they are so they can get more work done and achieve greater productivity.

How Bad Security Experiences Lead to a Breach

Every business is facing the paradox of hardening security without sacrificing users’ login and system access experiences. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment across every threat surface an organization has.

Centrify’s recent survey Privileged Access Management In The Modern Threatscape found that 74% of data breaches start with privileged credential abuse. Forrester estimates that 80% of data breaches have a connection to compromised privileged credentials, such as passwords, tokens, keys, and certificates. On the Dark Web, privileged access credentials are a best-seller because they provide the intruder with “the keys to the kingdom.” By leveraging a “trusted” identity, a hacker can operate undetected and exfiltrate sensitive data sets without raising any red flags.

Frustrated with wasting time responding to the many account lock-outs, re-authentication procedures, and login errors that outmoded Privileged Access Management (PAM) systems require, IT Help Desk teams, IT administrators, and admin users freely share privileged credentials, which often end up offered for sale on the Dark Web.

The Keys to the Kingdom Are In High Demand

18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey. State-sponsored and organized crime organizations offer to pay bounties in bitcoin for privileged credentials for many of the world’s largest financial institutions on the Dark Web. And with the typical U.S.-based enterprise losing on average $7.91M from a breach, more than double the global average of $3.86M according to IBM’s 2018 Data Breach Study, it’s clear that improving admin user experiences to reduce the incidence of privileged credential sharing needs to happen now.

How Machine Learning Improves Admin User Experiences and Thwarts Breaches

Machine learning is making every aspect of security experiences more adaptive, taking into account the risk context of every privileged access attempt across any threat surface, anytime. Machine learning algorithms can continuously learn and generate contextual intelligence that is used to streamline verified privileged users' access while thwarting many potential threats ― the most common of which is compromised credentials.

The following are a few of the many ways machine learning is improving privileged users’ experiences when they need to log in to secure critical infrastructure resources:

  • Machine learning is making it possible to provide adaptive, personalized login experiences at scale, using risk scoring of every access attempt in real time. Machine learning makes it possible to implement security strategies that flex or adapt to risk contexts in real time, assessing every access attempt across every threat surface and generating a risk score in milliseconds. Responding in milliseconds, or real time, is essential for delivering excellent admin user experiences. The "never trust, always verify, enforce least privilege" approach to security is how many enterprises from a broad base of industries, including leading financial services and insurance companies, are protecting every threat surface from privileged access abuse. CIOs at these companies say taking a Zero Trust approach with a strong focus on Zero Trust Privilege corporate-wide is redefining legacy Privileged Access Management by delivering cloud-architected Zero Trust Privilege to secure access to infrastructure, DevOps, cloud, containers, Big Data, and other modern enterprise use cases. Taking a Zero Trust approach to security enables their departments to roll out new services across every threat surface their customers prefer to use without having to customize security strategies for each.
  • Quantify, track, and analyze every potential security threat and attempted breach, and apply threat analytics to the aggregated data sets in real time, thwarting data exfiltration attempts before they begin. One of the tenets of Zero Trust Privilege is adaptive control. Machine learning algorithms continually "learn" by analyzing and looking for anomalies in users' behavior across every threat surface, device, and login attempt. When any user's behavior appears to be outside the threshold of constraints defined for threat analytics and risk scoring, additional authentication is immediately requested, and access to requested resources is denied until the identity can be verified. Machine learning makes adaptive preventative controls possible.
  • When every identity is a new security perimeter, machine learning's ability to provide personalization at scale for every access attempt on every threat surface is essential for enabling a company to keep growing. Businesses that are growing the fastest often face the greatest challenges when it comes to improving their privileged users' experiences. Getting new employees productive quickly needs to be based on four foundational elements: verifying the identity of every admin user, knowing the context of their access request, ensuring it's coming from a clean source, and limiting access as well as privilege. Taken together, these pillars form the foundation of Zero Trust Privilege.
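
The adaptive controls described above can be sketched in miniature. In the toy example below, the context signals, weights, and step-up threshold are all invented for illustration; a production system of the kind discussed here would learn them from historical access data rather than hard-code them:

```python
# Toy sketch of contextual risk scoring for a login attempt.
# All signals, weights, and the threshold are hypothetical.

def risk_score(attempt: dict) -> float:
    """Combine hypothetical context signals into a 0..1 risk score."""
    score = 0.0
    if attempt.get("new_device"):
        score += 0.4   # unseen device fingerprint
    if attempt.get("unusual_hour"):
        score += 0.3   # outside the user's normal login window
    if attempt.get("new_location"):
        score += 0.3   # geolocation not previously seen for this identity
    return min(score, 1.0)

def access_decision(attempt: dict, step_up_threshold: float = 0.5) -> str:
    """Allow, or request additional (step-up) authentication."""
    if risk_score(attempt) >= step_up_threshold:
        return "step_up_auth"
    return "allow"

print(access_decision({"new_device": False, "unusual_hour": False}))  # allow
print(access_decision({"new_device": True, "new_location": True}))    # step_up_auth
```

The design point this illustrates is the one the article makes: low-risk attempts pass through without friction, and extra authentication is requested only when the context looks anomalous.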

Conclusion

Organizations don’t have to sacrifice security for speed when they’re relying on machine learning-based approaches for improving the privileged user experience. Today, a majority of IT Help Desk teams, IT administrators, and admin users are freely sharing privileged credentials to be more productive, which often leads to breaches based on privileged access abuse. By taking a machine learning-based approach to validate every access request, the context of the request, and the risk of the access environment, roadblocks in the way of greater privileged user productivity disappear. Privileged credential abuse is greatly minimized.

Industry 4.0’s Potential Needs To Be Proven On The Shop Floor

  • 99% of mid-market manufacturing executives are familiar with Industry 4.0, yet only 5% are currently implementing or have implemented an Industry 4.0 strategy.
  • Investing in upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring including Manufacturing Execution Systems (MES) are manufacturers’ top three priorities based on interviews with them.
  • Mid-market manufacturers getting the most value out of Industry 4.0 excel at orchestrating a variety of technologies to find new ways to excel at product quality, improve shop floor productivity, meet delivery dates, and control costs.
  • Real-time monitoring is gaining momentum to improve order cycle times, troubleshoot quality problems, improve schedule accuracy, and support track-and-trace.

These and many other fascinating insights are from Industry 4.0: Defining How Mid-Market Manufacturers Derive and Deliver Value. BDO is a leading provider of assurance, tax, and financial advisory services, and the report is available for download here (PDF, 36 pp., no opt-in). The survey was conducted by Market Measurement, Inc., an independent market research consulting firm. The survey included 230 executives at U.S. manufacturing companies with annual revenues between $200M and $3B and was conducted in November and December of 2018. Please see page 2 of the study for additional details regarding the methodology. One of the most valuable findings of the study is that mid-market manufacturers need more evidence of Industry 4.0 delivering improved supply chain performance, quality, and shop floor productivity.

Insights from the Shop Floor: Machine Upgrades, Smart Machines, Real-Time Monitoring & MES Lead Investment Plans

In the many conversations I’ve had with mid-tier manufacturers located in North America this year, I’ve learned the following:

  • Their top investment priorities are upgrading existing machinery, replacing fully depreciated machines with next-generation smart, connected production equipment, and adopting real-time monitoring including Manufacturing Execution Systems (MES).
  • Manufacturers growing 10% or more this year over 2018 excel at integrating technologies that improve scheduling to enable more short-notice production runs, reduce order cycle times, and improve supplier quality.

Key Takeaways from BDO’s Industry 4.0 Study

  • Manufacturers are most motivated to evaluate Industry 4.0 technologies based on the potential for growth and business model diversification they offer. Building a business case for any new system or technology that delivers revenue, even during a pilot, gets the highest priority from manufacturers today. Based on my interviews with manufacturers, I found they were 1.7 times more likely to invest in machine upgrades and smart machines than to spend more on marketing. Manufacturers are very interested in any new technology that enables them to accept short-notice production runs from customers, excel at higher quality standards, and improve time-to-market, all while having better cost visibility and control. All those factors are inherent in the top three goals of business model diversification, improved operational efficiencies, and increased market penetration.

  • For Industry 4.0 technologies to gain more adoption, more use cases are needed that explain how traditional product sales, aftermarket sales, and product-as-a-service benefit from these new technologies. Manufacturers know the ROI of investing in a machinery upgrade, buying a smart, connected machine, or integrating real-time monitoring across their shop floors. What they're struggling with is how Industry 4.0 improves traditional product sales. 84% of upper mid-market manufacturers are generating revenue using Information-as-a-Service today, compared to 67% of middle-market manufacturers overall.

  • Manufacturers who get the most value out of their Industry 4.0 investments begin with a customer-centric blueprint first, integrating diverse technologies to deliver excellent customer experiences. Manufacturers growing 10% a year or more are relying on roadmaps to guide their technology buying decisions. These roadmaps are focused on how to reduce scrap, improve order cycle times, streamline supplier integration while improving inbound quality levels, and provide real-time order updates to customers. BDO's survey results reflect what I'm hearing from manufacturers. They're more focused than ever before on having an integrated engagement strategy combined with greater flexibility in responding to unique and often urgent production runs.

  • Industry 4.0’s potential to improve supply chains needs greater focus if mid-tier manufacturers are going to adopt the framework fully. Manufacturing executives most often equate Industry 4.0 with shop floor productivity improvements while the greatest gains are waiting in their supply chains. The BDO study found that manufacturers are divided on the metrics they rely on to evaluate their supply chains. Upper middle market manufacturers are aiming to speed up customer order cycle times and are less focused on getting their total delivered costs down. Lower mid-market manufacturers say reducing inventory turnover is their biggest priority. Overall, strengthening customer service increases in importance with the size of the organization.

  • By enabling integration between engineering, supply chain management, Manufacturing Execution Systems (MES) and CRM systems, more manufacturers are achieving product configuration strategies at scale. A key growth strategy for many manufacturers is to scale beyond the limitations of their longstanding Make-to-Stock production strategies. By integrating engineering, supply chains, MES, and CRM, manufacturers can offer more flexibility to their customers while expanding their product strategies to include Configure-to-Order, Make-to-Order, and for highly customized products, Engineer-to-Order. The more Industry 4.0 can be shown to enable design-to-manufacturing at scale, the more it will resonate with senior executives in mid-tier manufacturing.

  • Manufacturers are more likely than ever before to accept cloud-based platforms and systems that help them achieve their business strategies faster and more completely, with analytics being in the early stages of adoption. Manufacturing CEOs and their teams are most concerned about how quickly new applications and platforms can position their businesses for more growth. Whether a given application or platform is cloud-based often becomes secondary to the speed and time-to-market constraints every manufacturing business faces. The fastest-growing mid-tier manufacturers are putting greater effort and intensity into mastering analytics across every area of their business too. BDO found that Artificial Intelligence (AI) leads all other technologies in planned use.

How To Improve Supply Chains With Machine Learning: 10 Proven Ways

Bottom line: Enterprises are attaining double-digit improvements in forecast error rates, demand planning productivity, cost reductions and on-time shipments using machine learning today, revolutionizing supply chain management in the process.

Machine learning algorithms and the models they're based on excel at finding anomalies, patterns, and predictive insights in large data sets. Many supply chain challenges are time, cost, and resource constraint-based, making machine learning an ideal technology to solve them. From Amazon's Kiva robotics relying on machine learning to improve accuracy, speed, and scale, to DHL relying on AI and machine learning to power their Predictive Network Management system that analyzes 58 different parameters of internal data to identify the top factors influencing shipment delays, machine learning is defining the next generation of supply chain management. Gartner predicts that by 2020, 95% of Supply Chain Planning (SCP) vendors will be relying on supervised and unsupervised machine learning in their solutions. Gartner is also predicting that by 2023, intelligent algorithms and AI techniques will be an embedded or augmented component across 25% of all supply chain technology solutions.
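
As a minimal illustration of the anomaly-finding described above, the sketch below flags an unusually slow shipment with a simple z-score rule. The transit times and the threshold are invented for illustration; systems like DHL's cited here use learned models over many parameters rather than a fixed univariate rule, and in practice robust statistics (e.g., median absolute deviation) handle outliers better than the mean and standard deviation used here:

```python
import statistics

# Invented transit times (days) for one shipping lane; one clear outlier
transit_days = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 9.5, 4.0, 4.2]

mean = statistics.mean(transit_days)
stdev = statistics.stdev(transit_days)

# Flag shipments more than 2.5 standard deviations from the mean.
# (The outlier itself inflates the stdev, which is why robust
# alternatives are preferred on real data.)
anomalies = [t for t in transit_days if abs(t - mean) / stdev > 2.5]
print(anomalies)  # -> [9.5]
```

Surfacing the anomalous shipment before the customer notices is the whole point: the same pattern, scaled up across parameters and lanes, is what turns raw tracking data into the predictive insights described in this article.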

The ten ways that machine learning is revolutionizing supply chain management include:

  • Machine learning-based algorithms are the foundation of the next generation of logistics technologies, with the most significant gains being made with advanced resource scheduling systems. Machine learning and AI-based techniques are the foundation of a broad spectrum of next-generation logistics and supply chain technologies now under development. The most significant gains are being made where machine learning can contribute to solving complex constraint, cost and delivery problems companies face today. McKinsey predicts machine learning’s most significant contributions will be in providing supply chain operators with more significant insights into how supply chain performance can be improved, anticipating anomalies in logistics costs and performance before they occur. Machine learning is also providing insights into where automation can deliver the most significant scale advantages. Source: McKinsey & Company, Automation in logistics: Big opportunity, bigger uncertainty, April 2019. By Ashutosh Dekhne, Greg Hastings, John Murnane, and Florian Neuhaus

  • The wide variation in data sets generated from Internet of Things (IoT) sensors, telematics, intelligent transport systems, and traffic data has the potential to deliver the most value to improving supply chains using machine learning. Applying machine learning algorithms and techniques to improve supply chains starts with the data sets that have the greatest variety and variability in them. The most challenging issues supply chains face are often found in optimizing logistics, so materials needed to complete a production run arrive on time. Source: KPMG, Supply Chain Big Data Series Part 1

  • Machine learning shows the potential to reduce logistics costs by finding patterns in track-and-trace data captured using IoT-enabled sensors, contributing to $6M in annual savings. BCG recently looked at how a decentralized supply chain using track-and-trace applications could improve performance and reduce costs. They found that in a 30-node configuration, when blockchain is used to share data in real time across a supplier network, combined with better analytics insight, cost savings of $6M a year are achievable. Source: Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

  • Reducing forecast errors up to 50% is achievable using machine learning-based techniques. Lost sales due to products not being available are being reduced up to 65% through the use of machine learning-based planning and optimization techniques. Inventory reductions of 20 to 50% are also being achieved today when machine learning-based supply chain management systems are used. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).

  • DHL Research is finding that machine learning enables logistics and supply chain operations to optimize capacity utilization, improve customer experience, reduce risk, and create new business models. DHL’s research team continually tracks and evaluates the impact of emerging technologies on logistics and supply chain performance. They’re also predicting that AI will enable back-office automation, predictive operations, intelligent logistics assets, and new customer experience models. Source: DHL Trend Research, Logistics Trend Radar, Version 2018/2019 (PDF, 55 pp., no opt-in)

  • Detecting and acting on inconsistent supplier quality levels and deliveries using machine learning-based applications is an area manufacturers are investing in today. Based on conversations with North American-based mid-tier manufacturers, the second most significant growth barrier they're facing today is suppliers' lack of consistent quality and delivery performance. The greatest growth barrier is the lack of skilled labor available. Using machine learning and advanced analytics, manufacturers can discover quickly who their best and worst suppliers are and which production centers are most accurate in catching errors. Manufacturers are using dashboards much like the one below for applying machine learning to supplier quality, delivery, and consistency challenges. Source: Microsoft, Supplier Quality Analysis sample for Power BI: Take a tour, 2018

  • Reducing risk and the potential for fraud while improving product and process quality based on insights gained from machine learning is forcing inspection’s inflection point across supply chains today. When inspections are automated using mobile technologies and results are uploaded in real time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. Inspectorio is a machine learning startup to watch in this area. They’re tackling the many problems that a lack of inspection and supply chain visibility creates, focusing on how they can solve them immediately for brands and retailers. The graphic below explains their platform. Source: Forbes, How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility, January 23, 2019

  • Machine learning is making rapid gains in end-to-end supply chain visibility possible, providing predictive and prescriptive insights that are helping companies react faster than before. Combining multi-enterprise commerce networks for global trade and supply chain management with AI and machine learning platforms is revolutionizing supply chain end-to-end visibility. One of the early leaders in this area is Infor’s Control Center. Control Center combines data from the Infor GT Nexus Commerce Network, acquired by the company in September 2015, with Infor’s Coleman Artificial Intelligence (AI) platform. Infor chose to name its AI platform after the inspiring physicist and mathematician Katherine Coleman Johnson, whose trail-blazing work helped NASA land on the moon. Be sure to pick up a copy of the book and see the movie Hidden Figures, if you haven’t already, to appreciate her and many other brilliant women mathematicians’ contributions to space exploration. ChainLink Research provides an overview of Control Center in their article, How Infor is Helping to Realize Human Potential, and two screens from Control Center are shown below.

  • Machine learning is proving to be foundational for thwarting privileged credential abuse, the leading cause of security breaches across global supply chains. By taking a least-privilege access approach, organizations can minimize attack surfaces, improve audit and compliance visibility, and reduce risk, complexity, and the costs of operating a modern, hybrid enterprise. CIOs are solving the paradox of privileged credential abuse in their supply chains by requiring stronger verification whenever a request arrives with risky context, even when a privileged user has entered the right credentials. Zero Trust Privilege is emerging as a proven framework for thwarting privileged credential abuse by verifying who is requesting access, the context of the request, and the risk of the access environment. Centrify is a leader in this area, with globally recognized suppliers including Cisco, Intel, Microsoft, and Salesforce as current customers. Source: Forbes, High-Tech’s Greatest Challenge Will Be Securing Supply Chains In 2019, November 28, 2018.
  • Capitalizing on machine learning to predict preventive maintenance needs for freight and logistics machinery based on IoT data is improving asset utilization and reducing operating costs. McKinsey found that predictive maintenance enhanced by machine learning allows for better prediction and avoidance of machine failure by combining data from advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible, and overall maintenance costs may be reduced by up to 10%. Source: Digital/McKinsey, Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (PDF, 52 pp., no opt-in).
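The predictive-maintenance pattern described in the last bullet can be sketched in a few lines: fit a simple classifier to historical sensor readings labeled with failure outcomes, then score new readings for failure risk. Everything here (the sensor features, the synthetic readings, and the tiny hand-rolled logistic regression) is a hypothetical illustration, not McKinsey's or any vendor's actual pipeline.

```python
# Hedged sketch of the predictive-maintenance idea above: fit a tiny classifier
# to historical sensor readings labeled with failure outcomes, then score new
# readings for failure risk. The features, data, and model are all hypothetical.
import math

def train_logistic(rows, labels, lr=0.5, epochs=5000):
    """Fit a minimal logistic-regression model with batch gradient descent."""
    n = len(rows[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * n, 0.0
        for x, y in zip(rows, labels):
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y
            gw = [gi + err * xi for gi, xi in zip(gw, x)]
            gb += err
        w = [wi - lr * gi / len(rows) for wi, gi in zip(w, gw)]
        b -= lr * gb / len(rows)
    return w, b

def failure_risk(w, b, x):
    """Probability of failure for one normalized sensor reading."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Hypothetical normalized readings: (temperature, vibration) with failure labels
history = [(0.2, 0.1), (0.3, 0.2), (0.8, 0.9), (0.9, 0.7), (0.1, 0.2), (0.7, 0.8)]
failed = [0, 0, 1, 1, 0, 1]

w, b = train_logistic(history, failed)
hot = failure_risk(w, b, (0.85, 0.8))   # reading near past failures
cool = failure_risk(w, b, (0.15, 0.1))  # reading near healthy history
print(hot > cool)  # the riskier reading scores higher
```

In production, these readings would stream from IoT sensors and the model would be retrained as maintenance logs accumulate; the hand-rolled gradient-descent loop merely stands in for whatever learning library a real system would use.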

References

Accenture, Reinventing The Supply Chain With AI, 20 pp., PDF, no opt-in.

Bendoly, E. (2016). Fit, Bias, and Enacted Sensemaking in Data Visualization: Frameworks for Continuous Development in Operations and Supply Chain Management Analytics. Journal of Business Logistics, 37(1), 6-17.

Boston Consulting Group, Pairing Blockchain with IoT to Cut Supply Chain Costs, December 18, 2018, by Zia Yusuf, Akash Bhatia, Usama Gill, Maciej Kranz, Michelle Fleury, and Anoop Nannra

Machine Learning Engineer Is The Best Job In The U.S. According To Indeed

  • Machine Learning Engineer job openings grew 344% between 2015 and 2018 and have an average base salary of $146,085.
  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech.
  • The average base salary of the 25 best jobs in the U.S. according to Indeed is $104,825, and the median base salary is $99,007.
  • Agile Coach is the highest paying job with an average base salary of $161,377.
  • 9 of the top 25 jobs in the U.S. this year are in tech fields according to Indeed.
  • Five jobs are heavily dependent on applicants’ Artificial Intelligence (AI) skills and expertise.

These and many other insights are from Indeed’s The Best Jobs in the U.S.: 2019 study, released this week. Indeed defined the best jobs as those experiencing the fastest growth measured by the increase in job postings between 2015 and 2018, in conjunction with those offering the highest pay using a baseline salary of $75,000. Indeed’s best jobs of 2019 data set is available here in Microsoft Excel.

Key insights from Indeed’s ranking of the best jobs of 2019 include the following:

  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech according to Indeed, followed by Machine Learning Engineers with a base salary of $146,085. The average base pay of the nine tech-related jobs that made Indeed’s list is $122,761, above the median salary of $99,007 for the entire group of the top 25 jobs. Indeed’s top 25 jobs for 2019 are illustrated below in descending salary order, with the median salary providing a benchmark across the group. Please click on the graphic to expand for easier reading.

  • Three of the top 10 fastest growing jobs as measured by percentage growth in the number of job postings are in tech. From 2015 to 2018, job postings for Machine Learning Engineers grew 344%, followed by Full-stack developers (206%) and Salesforce developers (129%). In aggregate, all nine technology-related job postings increased by 146% between 2015 and 2018. The graphic below illustrates the percentage of growth in the number of postings between 2015 and 2018. Please click on the graphic to expand for easier reading.

  • Comparing average base salary to percentage growth in job postings underscores the exceptionally high demand for Machine Learning Engineers in 2019. Technical professionals with machine learning expertise are in an excellent position to command a base salary of $146,085 or more. Full-stack developers and Salesforce developers are in such high demand that professionals with skills in these areas, combined with experience, can command more than the average base salary. The following graphic compares average base salary to percentage growth in job postings for 2015 – 2018. Please click on the graphic to expand for easier reading.
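As a quick sanity check, the rankings above can be reproduced from the article's own numbers. The salaries and posting-growth percentages below are taken from the study as reported here; figures the article doesn't quote are left as None rather than guessed, and the job list is deliberately partial.

```python
# Hedged cross-check of the Indeed figures quoted above. Salaries and
# posting-growth percentages come from the article as reported here; any figure
# the article doesn't quote is left as None rather than guessed.
jobs = {
    "Machine Learning Engineer": {"salary": 146_085, "growth_pct": 344},
    "Full-stack Developer": {"salary": None, "growth_pct": 206},
    "Salesforce Developer": {"salary": None, "growth_pct": 129},
    "Computer Vision Engineer": {"salary": 158_303, "growth_pct": None},
    "Agile Coach": {"salary": 161_377, "growth_pct": None},
}

# Rank by posting growth where Indeed reported it
by_growth = sorted(
    (j for j, d in jobs.items() if d["growth_pct"] is not None),
    key=lambda j: jobs[j]["growth_pct"],
    reverse=True,
)

# Highest reported average base salary
top_paid = max(
    (j for j, d in jobs.items() if d["salary"] is not None),
    key=lambda j: jobs[j]["salary"],
)

print(by_growth[0])  # Machine Learning Engineer
print(top_paid)      # Agile Coach
```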

Vodafone’s 2019 IoT Barometer Reflects Robust Growth In The Enterprise

  • 85% of enterprises who develop deep expertise with IoT succeed at driving revenue faster than competitors.
  • 81% of enterprises say Artificial Intelligence streamlines interpreting and taking action on data insights gained from IoT systems and sensors.
  • 68% of enterprises are using IoT to track the security of physical assets, making this use case the most common across enterprises today.
  • Transport & Logistics and Manufacturing & Industrials saw the most significant increase in adoption between 2018 and 2019.

These and many other fascinating insights are from the 6th annual Vodafone IoT Barometer, 2019. The entire report can be downloaded here (PDF, 32 pp., e-mail opt-in). The methodology is based on 1,758 interviews distributed across the Americas (22%), EMEA (49%), and Asia-Pacific (29%). Eight vertical markets were included, with manufacturing (22%), healthcare and wellness (14%), and retail, leisure, and hospitality (14%) being the three most represented markets. Vodafone is making an interactive tool available here for exploring the results.

Key insights from Vodafone’s 2019 IoT Barometer include the following:

  • 34% of global businesses are now using IoT in daily operations, up from 29% in 2018, and 95% of IoT adopters are already seeing measurable benefits. 81% of adopters say their reliance on IoT has grown, and 76% say IoT is mission-critical to them. 58% are using analytics platforms to get more insight from their IoT data to improve decision-making. 71% of enterprises that have adopted IoT expect that their company and others like them will start listing data resources on their balance sheets as assets within five years.

  • 95% of enterprises adopting IoT are achieving tangible benefits and positive ROI. 52% of enterprises report significant returns on their IoT investments. 79% say IoT is enabling positive outcomes that would have been impossible without it, further reflecting robust growth in the enterprise. Across all eight vertical markets reducing operating costs (53%) and gaining more accurate data and insights (48%) are the most common benefits. Transitioning an IoT pilot to production based on cost reduction and improved visibility creates a compelling ROI for many enterprises. The following graphic compares IoT’s benefits to enterprises. Please click on the graphic to expand for easier reading.

  • Transport & Logistics and Manufacturing & Industrials saw the greatest increase in adoption between 2018 and 2019. Transport and Logistics had the highest IoT adoption rate at 42%, followed by Manufacturing and Industrials at 39%. Manufacturers are facing the challenges of improving production efficiency and product quality while accelerating time-to-market for next-generation smart, connected products. IoT contributes to productivity improvements and creates opportunities for services-based business models, two high priorities for manufacturers in 2019 and beyond. The following graphic from the interactive tool compares IoT adoption by industry based on Vodafone’s IoT Barometer data over the last six years:

  • 89% of the most sophisticated enterprises have multiple full-scale projects in production, orchestrating IoT with analytics, AI, and cloud to create a technology stack that delivers real-time insights. Enterprises that lead IoT adoption in their industries rely on integration to gain scale and speed advantages quickly over competitors. The greater the real-time integration, the greater the potential to digitally transform an enterprise and remove roadblocks that get in the way of growing. 95% of adopters where IoT is fully integrated say it’s enabling their digital transformation, compared with 55% that haven’t started integration. The following graphics reflect how integrated enterprises’ IoT projects are with existing business systems and processes and the extent to which enterprises agree that IoT is enabling digital transformation.

  • 68% of enterprises are using IoT to track the security of physical assets, making this use case the most common across enterprises today. 57% of all enterprises are using IoT to manage risk and compliance, 53% are using it to increase revenue and cut costs, and 82% of high-performing enterprises rely on IoT to manage risk and compliance. The following graphic compares the types of variables enterprises are using IoT to track today and plan to in the future.

  • IoT adoption is soaring in Americas-based enterprises, jumping from 27% in 2018 to 40% in 2019. The Americas region leads the world in terms of IoT usage assessed by strategy, integration, and implementation of IoT deployments. 73% of Americas-based enterprises are the most likely to report significant returns from their IoT investments compared to 47% for Asia-Pacific (APAC) and 45% for Europe, Middle East and Africa (EMEA).
  • 52% of IoT-enabled enterprises plan to use 5G when it becomes available. Enterprises are looking forward to 5G’s many advantages including improved security via stronger encryption, more credentialing options, greater quality of service management, more specialized services and near-zero latency. Vodafone predicts 5G will be a strong catalyst of growth for emerging IoT applications including connected cars, smart cities, eHealth and industrial automation.

 

10 Ways AI & Machine Learning Are Revolutionizing Omnichannel

Disney, Oasis, REI, Starbucks, Virgin Atlantic, and others excel at delivering omnichannel experiences using AI and machine learning to fine-tune their selling and service strategies. Source: iStock

Bottom Line: AI and machine learning are enabling omnichannel strategies to scale by providing insights into customers’ changing needs and preferences, creating customer journeys that scale, and delivering consistent experiences across channels.

For any omnichannel strategy to succeed, each customer touchpoint needs to be orchestrated as part of an overarching customer journey. That’s the only way to reduce and eventually eliminate customers’ perceptions of using one channel versus another. What makes omnichannel so challenging to excel at is the need to scale a variety of customer journeys in real-time as customers are also changing.

89% of customers used at least one digital channel to interact with their favorite brands, and just 13% found the digital-physical experiences well aligned, according to Accenture’s omnichannel study. AI and machine learning are being used to close these gaps with greater intelligence and knowledge. Omnichannel strategists are using AI and machine learning to fine-tune customer personas, measure how customer journeys change over time, and more precisely define service strategies. Disney, Oasis, REI, Starbucks, and Virgin Atlantic, among others, excel at delivering omnichannel experiences this way.

Omnichannel leaders including Amazon use AI and machine learning to anticipate which customer personas prefer to speak with a live agent versus using self-service. McKinsey found that omnichannel customer care expectations fall into three categories: speed and flexibility, reliability and transparency, and interaction and care. Omnichannel customer journeys designed to deliver on each of these three categories excel and scale between automated systems and live agents, as the following example from the McKinsey article, How to capture what the customer wants, illustrates:

The foundation of every great omnichannel strategy is precise customer personas, insight into how they are changing, and knowledge of how supply chains and IT need to flex and change in response. AI and machine learning are revolutionizing omnichannel on these three core dimensions with greater insight and contextual intelligence than ever before.

10 Ways AI & Machine Learning Are Revolutionizing Omnichannel

The following are 10 ways AI & machine learning are revolutionizing omnichannel strategies starting with customer personas, their expectations, and how customer care, IT infrastructure and supply chains need to stay responsive to grow.

  1. AI and machine learning are enabling brands, retailers and manufacturers to more precisely define customer personas, their buying preferences, and journeys. Leading omnichannel retailers are successfully using AI and machine learning today to personalize customer experiences to the persona level. They’re combining brand, event and product preferences, location data, content viewed, transaction histories and most of all, channel and communication preferences to create precise personas of each of their key customer segments.
  2. Achieving price optimization by persona is now possible using AI and machine learning, factoring in brand and channel preferences, previous purchase history, and price sensitivity. Brands, retailers, and manufacturers say that cloud-based price optimization and management apps are easier to use and more powerful than ever before, thanks to rapid advances in AI and machine learning algorithms. The combination of easier-to-use, more powerful apps and the need to better manage and optimize omnichannel pricing is fueling rapid innovation in this area. The following example is from Microsoft Azure’s Interactive Pricing Analytics Pre-Configured Solution (PCS). Source: Azure Cortana Interactive Pricing Analytics Pre-Configured Solution.

  3. Capitalizing on insights gained from AI and machine learning, omnichannel leaders are redesigning IT infrastructure and integration so they can scale customer experiences. Succeeding with omnichannel takes an IT infrastructure capable of flexing quickly in response to changes in customers’ preferences while providing scale to grow. Every area of a brand’s, retailer’s, or manufacturer’s supply chain, from supplier onboarding, quality management, and strategic sourcing to yard management, dock scheduling, manufacturing, and fulfillment, needs to be orchestrated around customers. Leaders include C3 Solutions, which offers a web-based Yard Management System (YMS) and Dock Scheduling System that can integrate with ERP, Supply Chain Management (SCM), Warehouse Management Systems (WMS), and many others via APIs. The following graphic illustrates how omnichannel leaders orchestrate IT infrastructure to achieve greater growth. Source: Cognizant, The 2020 Customer Experience.

  4. Omnichannel leaders are relying on AI and machine learning to digitize their supply chains, enabling on-time performance and fueling faster revenue growth. For any omnichannel strategy to succeed, supply chains need to be designed to excel at time-to-market and time-to-customer performance at scale. 54% of retailers pursuing omnichannel strategies say their main goal in digitizing their supply chains was to deliver greater customer experiences. 45% say faster speed to market is their primary goal in digitizing their supply chain by adding AI and machine learning-driven intelligence. Source: Digitize Today To Future-Proof Tomorrow (PDF, 16 pp., opt-in).

  5. AI and machine learning algorithms are making it possible to create propensity models by persona, which are invaluable for predicting which customers will act on a bundling or pricing offer. By definition, propensity models rely on predictive analytics, including machine learning, to predict the probability that a given customer will act on a bundling or pricing offer, e-mail campaign, or other call-to-action leading to a purchase, upsell, or cross-sell. Propensity models have proven very effective at increasing customer retention and reducing churn. Every business excelling at omnichannel today relies on propensity models to better predict how customers’ preferences and past behavior will lead to future purchases. The customer propensities dashboard shown below, from TIBCO, illustrates how propensity models work.

  6. Combining machine learning-based pattern matching with a product-based recommendation engine is leading to the development of mobile-based apps where shoppers can virtually try on garments they’re interested in buying. Machine learning excels at pattern recognition, and AI is well-suited for creating recommendation engines, which are together leading to a new generation of shopping apps where customers can virtually try on any garment. The app learns what shoppers most prefer and also evaluates image quality in real-time, and then recommends either purchase online or in a store. Source: Capgemini, Building The Retail Superstar: How unleashing AI across functions offers a multi-billion dollar opportunity.

  7. 56% of brands and retailers say that order track-and-traceability strengthened with AI and machine learning is essential to delivering excellent customer experiences. Order tracking across each channel combined with predictions of allocation and out-of-stock conditions using AI and machine learning is reducing operating risks today. AI-driven track-and-trace is invaluable in finding where there are process inefficiencies that slow down time-to-market and time-to-customer. Source: Digitize Today To Future-Proof Tomorrow (PDF, 16 pp., opt-in).
  8. Gartner predicts that by 2025, customer service organizations that embed AI in their customer engagement center platforms will increase operational efficiency by 25%, revolutionizing customer care in the process. Customer service is often where omnichannel strategies fail due to a lack of real-time contextual data and insight. There’s an abundance of use cases in customer service where AI and machine learning can improve overall omnichannel performance. Amazon has taken the lead on using AI and machine learning to decide when a given customer persona needs to speak with a live agent. Comparable strategies can also be created for improving intelligent agents, virtual personal assistants, chatbots, and Natural Language Processing (NLP) performance. There’s also the opportunity to improve knowledge management and content discovery, and to improve field service routing and support.
  9. AI and machine learning are improving marketing and selling effectiveness by making it possible to track purchase decisions back to campaigns by channel and to understand why specific personas purchased while others didn’t. Marketing is already analytically driven, and with the rapid advances in AI and machine learning, marketers will for the first time be able to isolate why and where their omnichannel strategies are succeeding or failing. By using machine learning to further qualify customer and prospect lists with relevant data from the web, predictive models including machine learning can better predict ideal customer profiles. Each omnichannel sales lead’s predictive score becomes a better predictor of potential new sales, helping sales teams prioritize time, sales efforts, and selling strategies.
  10. Predictive content analytics powered by AI and machine learning is improving sales close rates by predicting which content will lead a customer to buy. Analyzing previous prospect and buyer behavior by persona using machine learning provides insights into which content needs to be personalized and presented when to get a sale. Predictive content analytics is proving to be very effective in B2B selling scenarios and is scaling into consumer products as well.
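The propensity-model idea in point 5 can be illustrated with a deliberately minimal sketch: estimate, per persona and offer, the smoothed historical response rate. The personas, offers, and counts here are invented for illustration; a production propensity model would apply machine learning over many behavioral features rather than raw conversion counts.

```python
# Hedged, minimal sketch of a propensity model: per-(persona, offer) response
# rates with Laplace smoothing. All personas, offers, and counts are hypothetical.
from collections import defaultdict

class PropensityModel:
    """Estimate the probability a persona acts on an offer from history."""
    def __init__(self):
        self.shown = defaultdict(int)
        self.converted = defaultdict(int)

    def observe(self, persona, offer, did_convert):
        self.shown[(persona, offer)] += 1
        if did_convert:
            self.converted[(persona, offer)] += 1

    def propensity(self, persona, offer):
        key = (persona, offer)
        # Laplace smoothing keeps unseen combinations at a neutral 0.5 prior
        return (self.converted[key] + 1) / (self.shown[key] + 2)

model = PropensityModel()
for _ in range(8):
    model.observe("bargain-hunter", "bundle-discount", True)
for _ in range(2):
    model.observe("bargain-hunter", "bundle-discount", False)
for _ in range(9):
    model.observe("premium-loyalist", "bundle-discount", False)
model.observe("premium-loyalist", "bundle-discount", True)

print(round(model.propensity("bargain-hunter", "bundle-discount"), 2))   # 0.75
print(round(model.propensity("premium-loyalist", "bundle-discount"), 2))  # 0.17
```

A campaign engine would then target the bundle offer at personas whose propensity clears some threshold, which is the retention and churn-reduction mechanism the point above describes.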

How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility

Bottom Line: Manufacturers’ most valuable data is generated on shop floors daily, bringing with it the challenge of analyzing it to find prescriptive insights fast – and an ideal problem for machine learning to solve.

Manufacturing is the most data-prolific industry there is, generating on average 1.9 petabytes of data every year according to the McKinsey Global Institute. Supply chains, sourcing, factory operations, and the phases of compliance and quality management generate the majority of that data.

The most valuable data of all comes from product inspections that can immediately find exceptionally strong or weak suppliers, quality management and compliance practices in a factory. Manufacturing’s massive problem is in getting quality inspection results out fast enough across brands & retailers, other factories, suppliers and vendors to make a difference in future product quality.

How A Machine Learning Startup Is Revolutionizing Product Inspections

Imagine you’re a major brand or retailer relying on a network of factories across Bangladesh, China, India, and Southeast Asia to produce your new non-food consumer goods product lines, including apparel. The factories, inspection agencies, suppliers, and vendors that brands and retailers like you rely on vary widely in ethics, responsible sourcing, product quality, and transparency. With your entire consumer goods product lines (and future sales) at risk based on which suppliers, factories, and product inspection agencies you choose, your career and your company’s future are riding on the decisions you make.

These career- and company-betting challenges and the frustration of gaining greater visibility into what’s going on in supply chains to factory floors led Carlos Moncayo Castillo and his brothers Fernando Moncayo Castillo and Luis Moncayo Castillo to launch Inspectorio. They were invited to the Target + Techstars Retail Accelerator in the summer of 2017, a competition they participated in with their cloud-based inspection platform that includes AI and machine learning and pervasive support for mobile technologies. Target relies on them today to bring greater transparency to their supply chains. “I’ve spent years working in non-food consumer goods product manufacturing seeing the many disconnects between inspections and suppliers, the lack of collaboration and how gaps in information create too many opportunities for corruption – I had to do something to solve these problems,” Carlos said. The many problems that a lack of inspection and supply chain visibility creates became the pain Inspectorio focused on solving immediately for brands and retailers. The following is a graphic of their platform:

Presented below are a few of the many ways combining a scalable cloud-based inspection platform with AI, machine learning, and mobile technologies is improving inspections, product quality, and supply chain visibility:

  • Enabling the creation of customized inspector workflows that learn over time and are tailored to specific products (including furniture, toys, homeware, and garments), the factories they’re produced in, and the quality of the materials used. Inspectorio’s internal research has found 74% of all inspections today are done manually using pen and paper, with results reported in Microsoft Word, Excel, or PDFs, making collaboration slow and challenging. Improving the accuracy, speed, and scale of inspection workflows, including real-time updates across production networks, drives major gains in quality and supply chain performance.
  • Applying constraint-based algorithms and logic to understand why there are large differences in inspection results between factories is enabling brands & retailers to manage quality faster and more completely. Uploading inspections in real-time from mobile devices to an inspection platform that contains AI and machine learning applications that quickly parse the data for prescriptive insights is the future of manufacturing quality. Variations in all dimensions of quality including factory competency, supplier and production assembly quality are taken into account. In a matter of hours, inspection-based data delivers the insights needed to avert major quality problems to every member of a production network.
  • Reducing risk and the potential for fraud while improving product and process quality based on insights gained from machine learning is forcing inspection’s inflection point. When inspections are automated using mobile technologies and results are uploaded in real time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. One of the most powerful catalysts driving inspections’ inflection point is the combination of automated workflows and the high-quality data they deliver, from which machine learning produces prescriptive insights. Those insights are shared on performance dashboards across every brand, retailer, supplier, vendor, and factory involved in shared production strategies today.
  • Matching the most experienced inspector to a given factory and product inspection drastically increases accuracy and quality. When machine learning is applied to the inspector selection and assignment process, the quality and thoroughness of inspections increase. For the first time, brands, retailers, and factories have a clear, quantified view of Inspector Productivity Analysis across the entire team of inspectors available in a given region or country. Inspections are uploaded in real time to the Inspectorio platform, where advanced analytics and additional machine learning algorithms are applied to the data, providing greater prescriptive insights than would ever have been possible using legacy manual methods. Machine learning is also making recommendations to inspectors on which defects to look for first, based on the data patterns obtained from previous inspections.
  • Knowing why specific factories and products generate more Corrective Action/Preventative Action (CAPA) reports than others, how fast those CAPAs have been closed in the past, and why, is now possible. Machine learning is making it possible for entire production networks to know why specific factory and product combinations generate the most CAPAs. Using constraint-based logic, machine learning can also provide prescriptive insights into what needs to be improved to reduce CAPAs, including their root causes.
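The inspector-matching idea above can be sketched as a simple scoring function: weight an inspector's historical defect-catch rate against their experience with the product category, then assign the top scorer. The weights, fields, and inspector records here are all hypothetical; Inspectorio's actual matching algorithm is not public, and a real system would learn such weights from inspection outcomes rather than hard-code them.

```python
# Hypothetical sketch of inspector-to-job matching: score each inspector by a
# weighted blend of catch rate and category experience, assign the top scorer.
# The 0.7/0.3 weights and the inspector records are invented for illustration.
def score(inspector, product_category):
    experience = inspector["jobs_by_category"].get(product_category, 0)
    # Cap the experience term so 50+ past jobs in the category counts as "expert"
    return 0.7 * inspector["catch_rate"] + 0.3 * min(experience / 50, 1.0)

def assign_inspector(inspectors, product_category):
    """Return the name of the best-scoring inspector for this product category."""
    return max(inspectors, key=lambda i: score(i, product_category))["name"]

inspectors = [
    {"name": "A", "catch_rate": 0.92, "jobs_by_category": {"garments": 40, "toys": 5}},
    {"name": "B", "catch_rate": 0.88, "jobs_by_category": {"garments": 10, "furniture": 60}},
    {"name": "C", "catch_rate": 0.95, "jobs_by_category": {"homeware": 30}},
]

print(assign_inspector(inspectors, "garments"))   # A
print(assign_inspector(inspectors, "furniture"))  # B
```

Note the trade-off the weights encode: inspector C has the best raw catch rate, but category experience pulls the garments assignment to A and the furniture assignment to B.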

Which Analytics And BI Technologies Will Be The Highest Priority In 2019?

  • 82% of enterprises are prioritizing analytics and BI as part of their budgets for new technologies and cloud-based services.
  • 54% say AI, Machine Learning and Natural Language Processing (NLP) are also a high investment priority.
  • 50% of enterprises say their stronger focus on metrics and Key Performance Indicators (KPIs) company-wide are a major driver of new investment in analytics and BI.
  • 43% plan to both build and buy AI and machine learning applications and platforms.
  • 42% are seeking to improve user experiences by automating discovery of data insights and 26% are using AI to provide user recommendations.

These and many other fascinating insights are from the recent TDWI Best Practices Report, BI and Analytics in the Age of AI and Big Data. An executive summary of the study is available online here, and the entire study is available for download here (PDF, 39 pp., free, opt-in). The study found that enterprises are placing a high priority on augmenting existing systems and replacing older technologies and data platforms with new cloud-based BI and predictive analytics ones. Transforming Data with Intelligence (TDWI) is a global community of AI, analytics, data science and machine learning professionals interested in staying current in these and more technology areas as part of their professional development. Please see page 3 of the study for specifics regarding the methodology.

Key takeaways from the study include the following:

  • 82% of enterprises are prioritizing analytics and BI applications and platforms as part of their budgets for new technologies and cloud-based services. 78% of enterprises are prioritizing advanced analytics, and 76% data preparation. 54% say AI, machine learning and Natural Language Processing (NLP) are also a high investment priority. The following graphic ranks enterprises’ investment priorities for acquiring or subscribing to new technologies and cloud-based services by analytics and BI initiatives or strategies. Please click on the graphic to expand for easier reading.

  • Data warehouse or mart in the cloud (41%), data lake in the cloud (39%), and BI platform in the cloud (38%) are the top three types of technologies enterprises are planning to use. Based on this finding and others in the study, cloud platforms are the new normal in enterprises’ analytics and BI strategies going into 2019. Cloud data storage (object, file, or block) and data virtualization or federation (both 32%) are the next-most-planned-for technologies when it comes to investing in analytics and BI initiatives. Please click on the graphic to expand for easier reading.

  • The three most important factors in delivering a positive user experience include good query performance (61%), creating and editing visualizations (60%), and personalizing dashboards and reports (also 60%). The three activities that lead to the least amount of satisfaction are using predictive analytics and forecasting tools (27% dissatisfied), “What if” analysis and deriving new data (25%) and searching across data and reports (24%). Please click on the graphic to expand for easier reading.

  • 82% of enterprises are looking to broaden the base of analytics and BI platforms they rely on for insights and intelligence, not just stay with the solutions they have in place today. Just 18% of enterprises plan to add more instances of existing platforms and systems. Cloud-native platforms (38%), a new analytics platform (35%) and cloud-based data lakes (31%) are the top three system areas enterprises are planning to augment or replace existing BI, analytics, and data warehousing systems in. Please click on the graphic to expand for easier reading.

  • The majority of enterprises plan to both build and buy Artificial Intelligence (AI) and machine learning (ML) solutions so that they can customize them to their specific needs. 43% of enterprises surveyed plan to both build and buy AI and ML applications and platforms, a figure higher than any other recent survey on this aspect of enterprise AI adoption. 13% of responding enterprises say they will exclusively build their own AI and ML applications.

  • Capitalizing on machine learning’s innate strengths of applying algorithms to large volumes of data to find actionable new insights (54%) is what’s most important to the majority of enterprises. 47% of enterprises look to AI and machine learning to improve the accuracy and quality of information. And 42% are configuring AI and machine learning applications and platforms to augment user decision making by giving recommendations. Please click on the graphic to expand for easier reading.
