
Posts tagged ‘Machine learning’

How AI & Machine Learning Are Redefining The War For Talent

These and many other fascinating insights are from Gartner’s recent research note, Cool Vendors in Human Capital Management for Talent Acquisition (PDF, 13 pp., client access required), which illustrates how AI and machine learning are fundamentally redefining the war for talent. Gartner selected five companies that are setting a rapid pace of innovation in talent management, taking on Human Capital Management’s (HCM) most complex challenges. The five vendors Gartner mentions in the research note are AllyO, Eightfold, jobpal, Knack, and Vettd. Each has concentrated on creating and launching differentiated applications that address urgent needs enterprises have across the talent acquisition landscape. Gartner’s interpretation of the expanding Talent Acquisition Landscape is shown below:

Source: Gartner, Cool Vendors in Human Capital Management for Talent Acquisition, Written by Jason Cerrato, Jeff Freyermuth, John Kostoulas, Helen Poitevin, Ron Hanscome. 7 September 2018

Company Growth Plans Are Accelerating The War For Talent

The average employee’s tenure at a cloud-based enterprise software company is 19 months; in Silicon Valley, that figure drops to 14 months due to intense competition for talent, according to C-level executives leading these companies. Fast-growing enterprise cloud computing companies and many other businesses like them need specific capabilities, skill sets, and associates who know how to unlearn old concepts and learn new ones. Today, across tech and many other industries, every company’s growth strategy is predicated on how well it attracts, engages, screens, interviews, selects, and manages talent over associates’ lifecycles.

Of the five companies Gartner names as Cool Vendors in the field of Human Capital Management for Talent Acquisition, Eightfold is the only one achieving personalization at scale today. Attaining personalization at scale is essential if any growing business is going to succeed in attracting, acquiring and growing talent that can support their growth goals and strategies. Eightfold’s approach makes it possible to scale personalized responses to specific candidates in a company’s candidate community while defining the ideal candidate for each open position.

Gartner finds Eightfold noteworthy for its AI-based Talent Intelligence Platform that combines analysis of publicly available data, internal data repositories, HCM systems, ATS tools, and spreadsheets, and then creates ontologies based on organization-specific success criteria. Each ontology, or area of talent management interest, is customizable for further queries using the app’s easily understood and navigated user interface. Gartner also finds that Eightfold.ai is one of the first examples of a self-updating corporate candidate database. Profiles in the system are continually updated using external data gathering, without applicants reapplying or submitting updated profiles. The Eightfold.ai Talent Intelligence Platform is shown below:

Taking A Data-Driven Approach to Improve Diversity

AI and machine learning have the potential to remove conscious and unconscious biases from hiring decisions, leading to decisions based on capabilities and innate skills. Many CEOs and senior management teams are enthusiastically endorsing diversity programs yet struggling to make progress. AI and machine learning-based approaches like Eightfold’s can help accelerate progress toward diversity goals and a more egalitarian workplace. Data is the great equalizer, with a proven ability to eradicate conscious and unconscious biases from hiring decisions and enable true diversity by evaluating candidates equally on their experience, growth potential, and strengths.

Conclusion

At the center of every growing business’s plans is the need to attract, engage, recruit, and retain the highest quality employees possible. As future research in the field of HCM will show, the field is in crisis because it relies more on biases than on solid data. Breaking through the barrier of conscious and unconscious biases will provide contextual intelligence into an applicant’s unique skills, capabilities, and growth trajectory that is far beyond the scope of any resume or what an ATS can provide. The war for talent is being won today with data and insights that strip away biases to surface prospects who are ready for the challenges of helping their hiring companies grow.


Google Needs To Make Machine Learning Their Growth Fuel

  • In 2017 Google outspent Microsoft, Apple, and Facebook on R&D, with the majority spent on AI and machine learning.
  • Google needs new AI- and machine learning-driven businesses with lower Traffic Acquisition Costs (TAC) to offset the rising acquisition costs of its ad and search businesses.
  • One of the company’s initial forays into AI and machine learning was its $600M acquisition of AI startup DeepMind in January 2014.
  • Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today.
  • On its Q4’17 earnings call, the company announced that its cloud business is now bringing in $1B per quarter. The number of cloud deals worth $1M+ that Google has sold more than tripled between 2016 and 2017.
  • Google’s M&A strategy is concentrating on strengthening their cloud business to better compete against Amazon AWS and Microsoft Azure.

These and many other fascinating insights are from CB Insights’ report, Google Strategy Teardown (PDF, 49 pp., opt-in). The report explores how Alphabet, Google’s parent company, is relying on Artificial Intelligence (AI) and machine learning to capture new streams of revenue in enterprise cloud computing and services. It also looks at how Alphabet can combine search, AI, and machine learning to revolutionize logistics, healthcare, and transportation. It’s a thorough teardown of Google’s potential acquisitions, strategic investments, and the partnerships needed to maintain search dominance while driving revenue from new markets.

Key takeaways from the report include the following:

  • Google needs new AI- and machine learning-driven businesses with lower Traffic Acquisition Costs (TAC) to offset the rising acquisition costs of its ad and search businesses. CB Insights found Google is experiencing rising TAC in its core ad and search businesses. With the strategic shift to mobile, Google will see TAC escalate even further. Its greatest potential for growth lies in infusing greater contextual intelligence and knowledge across the entire series of companies that comprise Alphabet, shown in the graphic below.

  • Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today. Gradient Ventures is an ROI fund focused on supporting the most talented founders building AI-powered companies. Former tech founders are leading Gradient Ventures, assisting in turning ideas into companies. Gradient Ventures’ portfolio is shown below:

  • In 2017 Google outspent Microsoft, Apple, and Facebook on R&D, with the majority spent on AI and machine learning. Amazon dominated R&D spending among the top five tech companies in 2017 with $22.6B. Facebook led in percent of total sales invested in R&D with 19.1%.

  • Google AI led the development of Google’s highly popular open-source machine learning library and framework, TensorFlow, and is home to the Google Brain team. Google’s approach to primary research in the fields of AI, machine learning, and deep learning is leading to a prolific amount of research being produced and published. Here’s the search engine for their publication database, which includes many fascinating studies for review. Part of Google Brain’s role is to work with other Alphabet subsidiaries to support and lead their AI and machine learning product initiatives. One example CB Insights mentions in the report is how Google Brain collaborated with the autonomous driving division Waymo, helping apply deep neural nets to vehicles’ pedestrian detection. The team has also been successful in increasing the number of AI and machine learning patents, as CB Insights’ analysis below shows:

  • Mentions of AI and machine learning are soaring on Google quarterly earnings calls, signaling that senior management is prioritizing these areas as growth fuel. CB Insights has an Insights Trends tool designed to analyze unstructured text and find linguistics-based associations, models, and statistical insights. Analyzing Google earnings call transcripts shows that AI and machine learning mentions soared during the most recent call. A minimal keyword-count sketch of this kind of analysis appears after this list.

  • Google’s M&A strategy is concentrating on strengthening its cloud business to better compete against Amazon AWS and Microsoft Azure. Google acquired Xively in Q1 of this year, followed by Cask Data and Velostrata in Q2. Google needs to continue acquiring cloud-based companies that can accelerate more customer wins in the enterprise and mid-tier, two areas where Amazon AWS and Microsoft Azure have strong momentum today.
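To make the earnings-call analysis above concrete, here is a minimal keyword-count sketch over call transcripts. It is an illustration only, not CB Insights’ Insights Trends tool; the keyword list and transcript snippets are assumptions.

```python
import re
from collections import Counter

# Illustrative keyword list; a real text-analytics tool would use far richer
# linguistic features than simple phrase matching.
KEYWORDS = ["artificial intelligence", "machine learning", " ai "]

def mention_counts(transcripts):
    """Return keyword mention counts per earnings call, keyed by quarter label."""
    counts = Counter()
    for quarter, text in transcripts.items():
        normalized = f" {text.lower()} "
        counts[quarter] = sum(
            len(re.findall(re.escape(k), normalized)) for k in KEYWORDS)
    return counts

# Made-up transcript snippets for demonstration.
calls = {
    "Q3'17": "We continue to invest in machine learning across our products.",
    "Q4'17": "AI and machine learning are central to Search, Cloud, and the Assistant.",
}
print(mention_counts(calls))
```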

10 Ways To Improve Cloud ERP With AI And Machine Learning

Capitalizing on new digital business models and the growth opportunities they provide is forcing companies to re-evaluate ERP’s role. Made inflexible by years of customization, legacy ERP systems aren’t delivering what digital business models need today to scale and grow.

Legacy ERP systems were purpose-built to excel at production consistency first, at the expense of flexibility and responsiveness to customers’ changing requirements. By taking a business case-based approach to integrating Artificial Intelligence (AI) and machine learning into their platforms, Cloud ERP providers can fill the gap legacy ERP systems can’t.

Closing Legacy ERP Gaps With Greater Intelligence And Insight

Companies need to respond quickly to unexpected, unfamiliar, and unforeseen dilemmas with smart decisions if new digital business models are to succeed. That’s not possible today with legacy ERP systems. Legacy IT technology stacks and the ERP systems built on them aren’t designed to deliver the data needed most.

That’s all changing fast. A clear, compelling business model and successful execution of its related strategies are what all successful Cloud ERP implementations share. Cloud ERP platforms and apps provide organizations the flexibility they need to prioritize growth plans over IT constraints. And many have taken an Application Programming Interface (API) approach to integrate with legacy ERP systems to gain the incremental data these systems provide. In today’s era of Cloud ERP, rip-and-replace isn’t as commonplace as reorganizing entire IT architectures for greater speed, scale, and customer transparency using cloud-first platforms.

New business models thrive when an ERP system is constantly learning. That’s one of the greatest gaps between Cloud ERP platforms’ potential and where their legacy counterparts are today. Cloud platforms provide greater integration options and more flexibility to customize applications and improve usability, addressing one of the biggest drawbacks of legacy ERP systems. Designed to deliver results by providing AI- and machine learning-based insights, Cloud ERP platforms and apps can rejuvenate ERP systems and their contributions to business growth.

The following are the 10 ways to improve Cloud ERP with AI and machine learning, bridging the information gap with legacy ERP systems:

  1. Cloud ERP platforms need to create and strengthen a self-learning knowledge system that orchestrates AI and machine learning from the shop floor to the top floor and across supplier networks. Having a cloud-based infrastructure that integrates core ERP Web Services, apps, and real-time monitoring to deliver a steady stream of data to AI and machine learning algorithms accelerates how quickly the entire system learns. The Cloud ERP platform integration roadmap needs to include APIs and Web Services to connect with the many suppliers and buyer systems outside the walls of a manufacturer while integrating with legacy ERP systems to aggregate and analyze the decades of data they have generated.

  2. Virtual agents have the potential to redefine many areas of manufacturing operations, from pick-by-voice systems to advanced diagnostics. Apple’s Siri, Amazon’s Alexa, Google Assistant, and Microsoft Cortana have the potential to be modified to streamline operations tasks and processes, bringing contextual guidance and direction to complex tasks. An example of one task virtual agents are being used for today is guiding production workers to select from the correct product bin as required by the Bill of Materials. Machinery manufacturers are piloting voice agents that can provide detailed work instructions to streamline configure-to-order and engineer-to-order production. Amazon has successfully partnered with automotive manufacturers and has the most design wins to date. It could easily replicate this success with machinery manufacturers.

  3. Design in Internet of Things (IoT) support at the data structure level to realize quick wins as data collection pilots go live and scale. Cloud ERP platforms have the potential to capitalize on the massive data stream IoT devices are generating today by designing in support at the data structure level first. Continually providing IoT-based data to AI and machine learning apps will bridge the intelligence gap many companies face today as they pursue new business models. Capgemini has provided an analysis of IoT use cases, shown below, highlighting how production asset maintenance and asset tracking are quick wins waiting to happen. Cloud ERP platforms can accelerate them by designing in IoT support.

  4. AI and machine learning can provide insights into how Overall Equipment Effectiveness (OEE) can be improved that aren’t apparent today. Manufacturers will welcome the opportunity to gain greater insight into how they can stabilize, then normalize, OEE performance across their shop floors. When a Cloud ERP platform serves as an always-learning knowledge system, real-time monitoring data from machinery and production assets provides much-needed insight into areas for improvement and into what’s going well on the shop floor.

  5. Designing machine learning algorithms into track-and-traceability to predict which lots from which suppliers are most likely to be of the highest or lowest quality. Machine learning algorithms excel at finding patterns in diverse data sets by continually applying constraint-based algorithms. Suppliers vary widely in their quality and delivery schedule performance levels. Using machine learning, it’s possible to create a track-and-trace application that indicates which lot from which supplier is the riskiest, as well as those that are of exceptional quality.
  6. Cloud ERP providers need to pay attention to how they can help close the configuration gap that exists between PLM, CAD, ERP and CRM systems by using AI and machine learning. The most successful product configuration strategies rely on a single, lifecycle-based view of product configurations. They’re able to alleviate the conflicts between how engineering designs a product with CAD and PLM, how sales & marketing sell it with CRM, and how manufacturing builds it with an ERP system. AI and machine learning can enable configuration lifecycle management and avert lost time and sales, streamlining CPQ and product configuration strategies in the process.
  7. Improving demand forecasting accuracy and enabling better collaboration with suppliers based on insights from machine learning-based predictive models is attainable with higher quality data. By creating a self-learning knowledge system, Cloud ERP providers can vastly improve data latency rates, leading to higher forecast accuracy. Factoring in sales, marketing, and promotional programs further fine-tunes forecast accuracy.
  8. Reducing equipment breakdowns and increasing asset utilization by analyzing machine-level data to determine when a given part needs to be replaced. It’s possible to capture a steady stream of data on each machine’s health using sensors equipped with an IP address. Cloud ERP providers have a great opportunity to capture machine-level data and use machine learning techniques to find patterns in production performance across a production floor’s entire data set. This is especially important in process industries, where machinery breakdowns lead to lost sales. Oil refineries are using machine learning models comprising more than 1,000 variables related to material inputs, outputs, and process parameters, including weather conditions, to estimate equipment failures. A minimal sketch of this kind of failure prediction appears after this list.
  9. Implementing self-learning algorithms that use production incident reports to predict production problems on assembly lines needs to happen in Cloud ERP platforms. A local aircraft manufacturer is doing this today, using predictive modeling and machine learning to compare past incident reports. With legacy ERP systems these problems would have gone undetected and turned into production slowdowns or worse, the line having to stop.
  10. Improving product quality by having machine learning algorithms aggregate, analyze, and continually learn from supplier inspection, quality control, Return Material Authorization (RMA), and product failure data. Cloud ERP platforms are in a unique position to scale across the entire lifecycle of a product and capture quality data from the supplier to the customer. With legacy ERP systems, manufacturers most often rely on an analysis of scrap materials by type or cause, followed by RMAs. It’s time to get to the truth about why products fail, and machine learning can deliver the insights to get there.
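As a concrete illustration of point 8, here is a minimal sketch of machine-level failure prediction, assuming a historical export of sensor readings labeled with whether the machine failed shortly afterward. The file name, column names, and model choice are assumptions for illustration, not any Cloud ERP vendor’s implementation.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical export of labeled sensor history, one row per machine per day.
df = pd.read_csv("machine_sensor_history.csv")
features = ["vibration_rms", "bearing_temp_c", "spindle_load_pct",
            "hours_since_service", "ambient_temp_c"]

# Hold out 20% of the history to check how well failures are predicted.
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["failed_within_30_days"], test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score current readings so maintenance can be scheduled before a breakdown.
df["failure_risk"] = model.predict_proba(df[features])[:, 1]
print(df.sort_values("failure_risk", ascending=False).head())
```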

Where Business Intelligence Is Delivering Value In 2018

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018.
  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018.
  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018.
  • Organizations successful with analytics and BI apps define success in business results, while unsuccessful organizations concentrate on adoption rate first.
  • 50% of vendors offer perpetual on-premises licensing in 2018, a notable decline over 2017. The number of vendors offering subscription licensing continues to grow for both on-premises and public cloud models.
  • Fewer than 15% of respondent organizations have a Chief Data Officer, and only about 10% have a Chief Analytics Officer today.

These and many other fascinating insights are from Dresner Advisory Services’ 2018 Wisdom of Crowds® Business Intelligence Market Study. In its ninth annual edition, the study provides a broad assessment of the business intelligence (BI) market and a comprehensive look at key user trends, attitudes, and intentions. The latest edition of the study adds Information Technology (IT) analytics, sales planning, and GDPR, bringing the total to 36 topics under study.

“The Wisdom of Crowds BI Market Study is the cornerstone of our annual research agenda, providing the most in-depth and data-rich portrait of the state of the BI market,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Drawn from the first-person perspective of users throughout all industries, geographies, and organization sizes, who are involved in varying aspects of BI projects, our report provides a unique look at the drivers of and success with BI.” Survey respondents include IT (28%), followed by Executive Management (22%), Finance (19%), Sales/Marketing (8%), and the Business Intelligence Competency Center (BICC) (7%). Please see page 15 of the study for specifics on the methodology.

Key takeaways from the study include the following:

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018. Executive management teams are taking a more active ownership role in BI initiatives in 2018, as this group replaced Operations as the leading department driving BI adoption this year. The study found that the functional areas with the greatest percentage change in driving BI adoption include Human Resources (7.3%), Marketing (5.9%), BICC (5.1%), and Sales (5%).

  • Making better decisions, improving operational efficiencies, growing revenues, and increasing competitive advantage are the top four BI objectives organizations have today. Additional goals include enhancing customer service and attaining greater degrees of compliance and risk management. The graph below rank-orders the importance of BI objectives in 2018 compared to the percent change in BI objectives between 2017 and 2018. Enhanced customer service is the fastest-growing objective enterprises adopt BI to accomplish, followed by growth in revenue (5.4%).

  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018. The study found that second-tier initiatives, including data discovery, data mining/advanced algorithms, data storytelling, integration with operational processes, and enterprise and sales planning, are also critical or very important to enterprises participating in the survey. Technology areas being hyped heavily today, including the Internet of Things, cognitive BI, and in-memory analysis, rank relatively low as of today, yet are growing; edge computing, for example, increased 32% as a priority between 2017 and 2018. The results indicate that the core goals of using BI to drive better business decisions and more revenue still dominate the priorities of most businesses today.
  • Sales & Marketing, the Business Intelligence Competency Center (BICC), and Executive Management have the highest level of interest in dashboards and advanced visualization. Finance has the greatest interest in enterprise planning and budgeting. Operations (including manufacturing, supply chain management, and services) leads interest in data mining, data storytelling, integration with operational processes, mobile device support, data catalogs, and several other technologies and initiatives. It’s understandable that BICC leaders most advocate end-user self-service and attach high importance to many other categories, as they are internal service bureaus to all departments in an enterprise. It’s been my experience that BICCs are always looking for ways to scale BI adoption and enable every department to gain greater value from analytics and BI apps. BICCs in the best-run companies are knowledge hubs that encourage and educate all departments on how to excel with analytics and BI.

  • Insurance companies most prioritize dashboards, reporting, end-user self-service, data warehousing, data discovery and data mining. Business Services lead the adoption of advanced visualization, data storytelling, and embedded BI. Manufacturing most prioritizes sales planning and enterprise planning but trails in other high-ranking priorities. Technology prioritizes Software-as-a-Service (SaaS) given its scale and speed advantages. The retail & wholesale industry is going through an analytics and customer experience revolution today. Retailers and wholesalers lead all others in data catalog adoption and mobile device support.

  • Insurance, Technology and Business Services vertical industries have the highest rate of BI adoption today. The Insurance industry leads all others in BI adoption, followed by the Technology industry with 40% of organizations having 41% or greater adoption or penetration. Industries whose BI adoption is above average include Business Services and Retail & Wholesale. The following graphic illustrates penetration or adoption of Business Intelligence solutions today by industry.

  • Dashboards, reporting, advanced visualization, and data warehousing are the highest priority investment areas for companies whose budgets increased from 2017 to 2018. The study found that less well-funded organizations are the most likely of all to invest in open source software to reduce costs.

  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018. Factors contributing to the high adoption rate of BI in small businesses include business models that need advanced analytics to function and scale, hiring employees with the latest analytics and BI skills to scale high-growth businesses, and fewer barriers to adoption compared to larger enterprises. BI adoption tends to be more pervasive in small businesses, as a greater percentage of employees use analytics and BI apps daily.

  • Executive Management is most familiar with the type and number of BI tools in use across the organization. The majority of executive management respondents say their teams are using one or two BI tools today. Business Intelligence Competency Centers (BICC) consistently report a higher number of BI tools in use than other functional areas, given their heavy involvement in all phases of analytics and BI project execution. IT, Sales & Marketing, and Finance are likely to have more BI tools in use than Operations.

  • Enterprises rate BI application usability and product quality & reliability at an all-time high in 2018. Other areas of major improvement on the part of vendors include ease of implementation; online training, forums, and documentation; and completeness of functionality. Dresner’s research team found that between 2017 and 2018, ratings for integration of components within a product dropped, as did scalability. The study concludes the drop in integration ratings is due to an increasing number of software company acquisitions aggregating dissimilar products from different platforms.

Three Ways Machine Learning Is Revolutionizing Zero Trust Security

Bottom Line: Zero Trust Security (ZTS) starts with Next-Gen Access (NGA). Capitalizing on machine learning technology to enable NGA is essential in achieving user adoption, scalability, and agility in securing applications, devices, endpoints, and infrastructure.

How Next-Gen Access and Machine Learning Enable Zero Trust Security

Zero Trust Security provides digital businesses with the security strategy they need to keep growing by scaling across each new perimeter and endpoint created as a result of growth. ZTS in the context of Next-Gen Access is built on four main pillars: (1) verify the user, (2) validate their device, (3) limit access and privilege, and (4) learn and adapt. The fourth pillar relies heavily on machine learning to discover risky user behavior and apply conditional access, without impacting user experience, by looking for contextual and behavioral patterns in access data.

As ZTS assumes that untrusted users or actors already exist both inside and outside the network, machine learning provides NGA with the capability to assess data about users, their devices, and their behavior to allow access, block access, or enforce additional authentication. With machine learning, policies and user profiles can be adjusted automatically and in real time. Beyond delivering dashboards and alerts, NGA enabled by machine learning responds to security threats in real time based on risk scores, which is very effective in thwarting breaches before they start.

Building NGA apps on machine learning technology yields the benefits of being non-intrusive, supporting the productivity of the workforce and business partners, and ultimately allowing digital businesses to grow without interruption. For example, Centrify’s rapid advances in machine learning and Next-Gen Access to enable ZTS strategies make the company one of the most interesting to watch in enterprise security.

The following are three ways machine learning is revolutionizing Zero Trust Security:

  1. Machine learning enables enterprises to adopt a risk-based security strategy that can flex with their business as it grows. Many digital businesses have realized that “risk is security’s new compliance,” and are therefore implementing a risk-driven rather than a compliance-driven approach. Relying on machine learning technology to assess user, device, and behavioral data for each access request yields a real-time risk score. This risk score can then be used to determine whether to allow access, block access, or step up authentication. In evaluating each access request, machine learning engines process multiple factors, including the location of the access attempt, browser type, operating system, endpoint device status, user attributes, time of day, and unusual recent privilege changes. Machine learning algorithms are also scaling to take into account unusual commands run, unusual resource access histories, unusual accounts used, unusual privileges requested and used, and more. This approach helps thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon. A minimal sketch of this kind of risk-scored conditional access appears after this list.
  2. Machine learning makes it possible to accomplish security policy alignment at scale. To keep pace with a growing digital business’ need to flex and scale to support new business models, machine learning also assists in automatically adjusting user profiles and access policies based on behavioral patterns. By doing so, the need for IT staffers to review and adjust policies vanishes, freeing them up to focus on things that will grow the business faster and more profitably. At the same time, end users are not burdened with step-up authentication once a previously abnormal behavior is identified as now-typical behavior and both the user profile and policies are updated accordingly.
  3. Machine learning brings greater contextual intelligence into authentication, streamlining the experience and increasing user adoption. Ultimately, the best security is transparent and non-intrusive. That’s where the use of risk-based authentication and machine learning technology comes into play. The main impediment to adoption of multi-factor authentication has been the perceived impact on the productivity and agility of end users. A recent study by Dow Jones Customer Intelligence and Centrify revealed that 62% of CEOs state that multi-factor authentication (MFA) is difficult to manage and is not user-friendly, while only 41% of technical officers (CIOs, CTOs, and CISOs) agree with this assessment. For example, having to manually type in a code that has been transmitted via SMS in addition to the already supplied username and password is often seen as cumbersome. Technology advancements are removing some of these objections by offering a more user-friendly experience, such as eliminating the need to manually enter a one-time password on the endpoint by enabling the user to simply click a button on their smartphone. Nonetheless, some users still express frustration with this additional step, even if it is relatively quick and simple. To overcome these remaining barriers to adoption, machine learning technology minimizes exposure to step-up authentication over time, as the engine learns and adapts to behavioral patterns.
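To make the first point above concrete, here is a minimal sketch of risk-scored conditional access: an anomaly model scores each access request, and the score decides whether to allow, step up authentication, or block. This is not any vendor’s actual engine; the features, thresholds, and model choice are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Column order of the feature vectors below; real engines use many more
# signals (browser, OS, endpoint status, unusual commands, and so on).
FEATURES = ["hour_of_day", "geo_distance_km", "new_device", "privilege_change"]

# Train on a history of presumed-legitimate access events (one row per request).
history = np.array([
    [9, 5, 0, 0], [10, 5, 0, 0], [14, 7, 0, 0], [16, 6, 0, 0], [11, 5, 0, 0],
    [9, 6, 0, 0], [13, 5, 0, 0], [15, 8, 0, 0], [10, 7, 0, 0], [12, 5, 0, 0],
])
model = IsolationForest(random_state=0).fit(history)

def decide(request):
    """Map an anomaly-derived risk score to allow / step-up MFA / block."""
    risk = -model.score_samples([request])[0]  # higher means riskier
    if risk < 0.45:       # illustrative thresholds only
        return "allow"
    if risk < 0.55:
        return "step-up MFA"
    return "block"

print(decide([10, 6, 0, 0]))    # a typical request
print(decide([3, 8000, 1, 1]))  # unusual hour, location, device, privilege
```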

In Conclusion

Zero Trust Security through the power of Next-Gen Access is allowing digital businesses to continue on their path of growth while safeguarding their patented ideas and intellectual property. Relying on machine learning technology for Next-Gen Access results in real-time security, making it possible to identify high-risk events and ultimately greatly reducing the effort required to identify threats across today’s hybrid IT environments.

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

  • Hiring companies nationwide miss out on 50% or more of qualified candidates, and tech firms incorrectly classify up to 80% of candidates due to inaccuracies and shortcomings of existing Applicant Tracking Systems (ATS), illustrating how faulty these systems are for enabling hiring.
  • It takes 42 days on average to fill a position, and up to 60 days or longer to fill positions requiring in-demand technical skills, at an average cost of $5,000 per position.
  • Women applicants have a 19% chance of being eliminated from consideration for a job after a recruiter screen and 30% after an onsite interview, leading to a massive loss of brainpower and insight every company needs to grow.

It’s time the hiring process got smarter: more infused with contextual intelligence and insight, evaluating candidates on their mastery of needed skills rather than judging them on resumes that reflect only what they’ve achieved in the past. Enriching the hiring process with greater machine learning-based contextual intelligence finds the candidates who are exceptional and have the intellectual skills to contribute beyond hiring managers’ expectations. Machine learning algorithms can also remove any ethnic- and gender-specific identification of a candidate so they are evaluated purely on expertise, experience, merit, and skills.

The hiring process relied on globally today hasn’t changed in over 500 years. From Leonardo da Vinci’s handwritten resume of 1482, which reflects his ability to build bridges and support warfare rather than the genius behind the Mona Lisa, the Last Supper, the Vitruvian Man, and a myriad of scientific discoveries and inventions that modernized the world, the approach job seekers take to pursuing new positions has stubbornly defied innovation. ATS apps and platforms classify inbound resumes and provide rankings of candidates based on just a small glimpse of their skills seen on a resume, when what’s needed is insight into which managerial, leadership, and technical skills and strengths a given candidate is mastering, and at what pace. Machine learning broadens the scope of what hiring companies can see in candidates by moving beyond the barriers of their resumes. Better hiring decisions are being made, and the Return on Investment (ROI) drastically improves when hiring decisions are strengthened with greater intelligence. Key metrics including time-to-hire, cost-to-hire, retention rates, and performance all improve when greater contextual intelligence is relied on.

Look Beyond Resumes To Win The War For Talent

Last week I had the opportunity to speak with the Vice President of Human Resources for one of the leading technology think tanks globally. He’s focused on the hundreds of technical professionals his organization needs in six months, 12 months, and more than a year from now to staff exciting new research projects that will deliver valuable Intellectual Property (IP), including patents and new products.

Their approach begins with seeking to understand the profiles and core strengths of current high performers, then seeking out matches with ideal candidates in their community of applicants and the broader technology community. Machine learning algorithms are perfectly suited for completing the needed comparative analysis of high performers’ capabilities and those of candidates, whose entire digital persona is taken into account when comparisons are completed. The following graphic illustrates the Eightfold.ai Talent Intelligence Platform (TIP), showing how integrated it is with publicly available data, internal data repositories, Human Resource Management (HRM) systems, and ATS tools.
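As a rough illustration of this kind of comparative analysis (not Eightfold.ai’s actual algorithm), the sketch below vectorizes text summaries of high performers and candidates with TF-IDF and ranks candidates by cosine similarity to the high-performer centroid. The profile texts and candidate IDs are invented for the example.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Skill/experience summaries for current high performers in the role being filled.
high_performers = [
    "distributed systems golang kubernetes mentoring incident response",
    "machine learning python research publications patents team leadership",
]
# Candidate community profiles, already flattened to text (an assumption).
candidates = {
    "cand_001": "python machine learning open source contributions phd research",
    "cand_002": "project coordination scheduling vendor management",
}

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(high_performers + list(candidates.values()))
# Average the high-performer vectors into a single "ideal profile" centroid.
centroid = np.asarray(matrix[:len(high_performers)].mean(axis=0))

scores = cosine_similarity(centroid, matrix[len(high_performers):])[0]
for cand_id, score in sorted(zip(candidates, scores), key=lambda pair: -pair[1]):
    print(f"{cand_id}: similarity to high performers = {score:.2f}")
```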

The comparative analysis of high achievers’ characteristics with applicants’ takes seconds to complete, providing a list of prospects complete with profiles. Machine learning-derived profiles of potential hires matching the high performers’ characteristics provide greater contextual intelligence than any resume ever could. Taking an integrated approach to creating the Talent Intelligence Platform (TIP) yields insights not available with typical hiring or ATS solutions today. The profile below reflects the contextual intelligence and depth of insight possible when machine learning is applied to an integrated dataset of candidates. Key elements in the profile include the following:

  • Career Growth Bell Curve – Illustrates how a given candidate’s career progression and performance compare relative to others’.

  • Social Following On Public Sites – Provides a real-time glimpse into the candidate’s activity on GitHub, OpenStack, and other sites where technical professionals share their expertise. This also provides insight into how others perceive their contributions.

  • Highlights Of Background Relevant To Job(s) Under Review – Provides the most relevant data from the candidate’s history in the profile so recruiters and managers can more easily understand their strengths.

  • Recent Publications – Publications provide insights into current and previous interests, areas of focus, mindset and learning progression over the last 10 to 15 years or longer.

  • Professional overlap that makes it easier to validate achievements chronicled in the resume – Multiple sources of real-time career data validate and provide greater context and insight into resume-listed accomplishments.

The key is understanding the context in which a candidate’s capabilities are being evaluated, and a 2-page resume will never give the candidate enough latitude to cover all bases. For medium to large companies, doing this accurately and quickly is a daunting task if done manually: across all roles, all geographies, all the candidates sourced, all the candidates applying online, university recruiting, re-skilling inside the company, internal mobility for existing employees, and all recruitment channels. This is where machine learning can be an ally to the recruiter, the hiring manager, and the candidate.

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

Reducing the costs and time-to-hire, increasing the quality of hires and staffing new initiatives with the highest quality talent possible all fuels solid revenue growth. Relying on resumes alone is like being on a bad Skype call where you only hear every tenth word in the conversation. Using machine learning-based approaches brings greater acuity, clarity, and visibility into hiring decisions.

The following are the five reasons why machine learning needs to make resumes obsolete:

  1. Resumes are like rearview mirrors that primarily reflect the past. What’s needed is more of a focus on where someone is going, why (what motivates them), and what they are fascinated with and learning about on their own. Resumes are rearview mirrors; what’s needed is an intelligent heads-up display of what a candidate’s future will look like based on their present interests and talent.
  2. By relying on a 500+-year-old process, there’s no way of knowing what skills, technologies and training a candidate is gaining momentum in. The depth and extent of mastery in specific areas aren’t reflected in the structure of resumes. By integrating multiple sources of data into a unified view of a candidate, it’s possible to see what areas they are growing the quickest in from a professional development standpoint.
  3. It’s impossible to game a machine learning algorithm that takes into account all digital data available on a candidate, while resumes have a credibility issue. Anyone who has hired subordinates and staff or been involved in hiring decisions has faced the disappointment of finding out a promising candidate lied on a resume. It’s a huge let-down. Resumes often get gamed, with one recruiter saying at least 60% of resumes contain exaggerations and, in some cases, outright lies. Taking all data into account using a platform like TIP shows the true candidate and their actual skills.
  4. It’s time to take a more data-driven approach to diversity that removes unconscious biases. Resumes immediately carry inherent biases with them. Recruiters, hiring managers, and final interview groups of senior managers draw on their unconscious biases based on a person’s name, gender, age, appearance, the schools they attended, and more. It’s more effective to know candidates’ skills, strengths, and core areas of intelligence, all of which are better predictors of job performance.
  5. Reduces the risk of making a bad hire who will churn out of the organization fast. Ultimately everyone hires based in part on their best judgment and in part on their often unconscious biases. It’s human nature. With more data, the probability of making a bad hire is reduced, lowering the risk of churning through a new hire and spending thousands of dollars to hire and then replace them. Having greater contextual intelligence reduces the downside risks of hiring, removes biases by showing with solid data just how qualified a person is for a role, and verifies their background, strengths, skills, and achievements. Factors contributing to unconscious biases, including gender, race, age, or any other factors, can be removed from profiles, so candidates are evaluated only on their potential to excel in the roles they are being considered for.

Bottom line: It’s time to revolutionize resumes and hiring processes, moving them into the 21st century by redefining them with greater contextual intelligence and insight enabled by machine learning.

 

How Zero Trust Security Fuels New Business Growth

Bottom Line: Zero Trust Security (ZTS) strategies enabled by Next-Gen Access (NGA) are indispensable for assuring uninterrupted digital business growth, and are proving to be a scalable security framework for streamlining onboarding and systems access for sales channels, partners, patients, and customers of fast-growing businesses.

The era of Zero Trust Security is here, accelerated by NGA solutions and driven by the needs of digital businesses for security strategies that can keep up with the rapidly expanding perimeters of their businesses. Internet of Things (IoT) networks and the sensors that comprise them are proliferating network endpoints and extending the perimeters of growing businesses quickly.

Inherent in the DNA of Next-Gen Access is the ability to verify the user, validate the device (including any sensor connected to an IoT network), limit access and privilege, then learn and adapt using machine learning techniques to streamline the user experience while granting access to approved accounts and resources. Many digital businesses today rely on IoT-based networks to connect with suppliers, channels, service providers and customers and gain valuable data they use to grow their businesses. Next-Gen Access solutions including those from Centrify are enabling Zero Trust Security strategies that scale to secure the perimeters of growing businesses without interrupting growth.

How Zero Trust Security Fuels New Business Growth  

The greater the complexity, scale and growth potential of any new digital business, the more critical NGA becomes for enabling ZTS to scale and protect its expanding perimeters. One of the most valuable ways NGA enables ZTS is using machine learning to learn and adapt to users’ system access behaviors continuously. Insights gained from NGA strengthen ZTS frameworks, enabling them to make the following contributions to new business growth:

  1. Zero Trust Security prevents data breaches that cripple new digital business models and ventures just beginning to scale and grow. Verifying, validating, learning and adapting to every user’s access attempts and then quantifying their behavior in a risk score is at the core of Next-Gen Access’ DNA. The risk scores quantify the relative levels of trust for each system user and determine what, if any, additional authentication is needed before access is granted to requested resources. Risk scores are continuously updated with every access attempt, making authentication less intrusive over time while greatly reducing compromised credential attacks.
  2. Securing the expanding endpoints and perimeters of a digital business using NGA frees IT and senior management up to focus more on growing the business. In any growing digital business, there’s an exponential increase in the number of endpoints being created, rapidly expanding the global perimeter of the business. The greater the number of endpoints and the broader the perimeter, the more revenue potential there is. Relying on Next-Gen Access to scale ZTS across all endpoints saves valuable IT time that can be dedicated to direct revenue-producing projects and initiatives. And by relying on NGA as the trust engine that enables ZTS, senior management will have far fewer security-related emergencies, interruptions, and special projects and can dedicate more time to growing the business. A ZTS framework also centralizes security management across a digital business, alleviating the costly, time-consuming task of continually installing patches and updates.
  3. Zero Trust Security is enabling digital businesses globally to meet and exceed General Data Protection Regulation (GDPR) compliance requirements while protecting and growing their most valuable asset: customer trust. Every week brings new announcements of security breaches at many of the world’s most well-known companies. Quick stats on users affected, potential dollar losses to the company, and the all-too-common 800 numbers for credit bureaus seem to be in every press release. What’s missing is the incalculable, unquantifiable cost of lost customer value and the millions of hours customers waste trying to avert financial chaos. In response to the need for greater oversight of how organizations respond to breaches and manage data security, the European Union (EU) launched the General Data Protection Regulation (GDPR), which goes into effect May 25, 2018. GDPR applies not only to European organizations, but also to foreign businesses that offer goods or services in the European Union (EU) or monitor the behavior of individuals in the EU. The compliance directive also states that organizations need to process data in a way that “ensures appropriate security of the personal data, using appropriate technical and organizational measures,” taking into account the “state of the art and the costs of implementation.”

Using an NGA approach that includes risk-based multi-factor authentication (MFA) to evaluate every login combined with the least privilege approach across an entire organization is a first step towards excelling at GDPR compliance. Zero Trust Security provides every organization needing to comply with GDPR a solid roadmap of how to meet and exceed the initiative’s requirements and grow customer trust as a result.

Conclusion

Next-Gen Access enables Zero Trust Security strategies to scale and flex as a growing business expands. In the fastest growing businesses, endpoints are proliferating as new customers are gained and suppliers are brought onboard. NGA ensures growth continues uninterrupted, helping to thwart compromised credential attacks, which make up 81% of all hacking-related data breaches, according to Verizon.

How To Close The Talent Gap With Machine Learning

  • 80% of the positions open in the U.S. alone were due to attrition. On average, it costs $5,000 to fill an open position and takes about two months to find a new employee. Reducing attrition removes a major impediment to any company’s productivity.
  • The average employee’s tenure at a cloud-based enterprise software company is 19 months; in Silicon Valley, that figure drops to 14 months due to intense competition for talent, according to C-level executives.
  • Eightfold.ai can quantify hiring bias and has found it occurs 35% of the time within in-person interviews and 10% during online or virtual interview sessions.
  • AdRoll Group launched nurture campaigns for an open data scientist position, leveraging insights gained using Eightfold.ai, and attained a 48% open rate, nearly double what it observed from other channels.
  • A leading cloud services provider has seen response rates to recruiting campaigns soar from 20% to 50% using AI-based candidate targeting in the company’s community.

The essence of every company’s revenue growth plan is based on how well they attract, nurture, hire, grow and challenge the best employees they can find. Often relying on manual techniques and systems decades old, companies are struggling to find the right employees to help them grow. Anyone who has hired and managed people can appreciate the upside potential of talent management today.

How AI and Machine Learning Are Revolutionizing Talent Management

Strip away the hype swirling around AI in talent management and what’s left is the urgent, unmet need companies have for greater contextual intelligence and knowledge about every phase of talent management. Many CEOs are also making greater diversity and inclusion their highest priority. Using advanced AI and machine learning techniques, a company founded by former Google and Facebook AI scientists is showing potential in meeting these challenges. Founders Ashutosh Garg and Varun Kacholia have over 6,000 research citations and 80+ search and personalization patents between them. They founded Eightfold.ai, as Varun puts it, “to help companies find and match the right person to the right role at the right time and, for the first time, personalize the recommendations at scale.” “Historically, companies have not been able to recognize people’s core capabilities and have unnecessarily exacerbated the talent crisis,” added Varun Kacholia, CTO and Co-Founder of Eightfold.ai.

What makes Eightfold.ai noteworthy is that it’s the first AI-based Talent Intelligence Platform that combines analysis of publicly available data, internal data repositories, Human Resource Management (HRM) systems, ATS tools, and spreadsheets, and then creates ontologies based on organization-specific success criteria. Each ontology, or area of talent management interest, is customizable for further queries using the app’s easily understood and navigated user interface.

Based on conversations with customers, it’s clear that integration is one of the company’s core strengths. Eightfold.ai relies on an API-based integration strategy to connect with legacy back-end systems. The company averages two to three system integrations per customer and supports 20 unique system integrations today, with more planned. The following diagram explains how the Eightfold Talent Intelligence Platform is constructed and how it works.

For all the sophisticated analysis, algorithms, system integration connections, and mathematics powering the Eightfold.ai platform, the company’s founders have done an amazing job creating a simple, easily understood user interface. The elegant simplicity of the Eightfold.ai interface reflects the same precision of the AI and machine learning code powering this platform.

I had a chance to speak with AdRoll Group and DigitalOcean regarding their experiences using Eightfold.ai. Both said that being able to connect the dots between their candidate communities, diversity and inclusion goals, and end-to-end talent management objectives were important goals the streamlined user experience was helping them achieve. The following is a drill-down of a candidate profile, showing the depth of external and internal data integration that provides contextual intelligence throughout the Eightfold.ai platform.

Talent Management’s Inflection Point Has Arrived 

Every interaction with a candidate, current associate, and high-potential employee is a learning event for the system.

AI and machine learning make it possible to shift focus away from being transactional and toward building relationships. AdRoll Group and DigitalOcean both mentioned how Eightfold.ai’s advanced analytics and machine learning help them create and fine-tune nurturing campaigns to keep candidates in high-demand fields aware of opportunities in their companies. AdRoll Group used this technique of concentrating on insights to build relationships with potential data scientists and ultimately made a hire assisted by the Eightfold.ai platform. DigitalOcean is also actively using nurturing campaigns to recruit for its most in-demand positions. “As DigitalOcean continues to experience rapid growth, it’s critical we move fast to secure top talent, while taking time to nurture the phenomenal candidates already in our community,” said Olivia Melman, Manager, Recruiting Operations at DigitalOcean. “Eightfold.ai’s platform helps us improve operational efficiencies so we can quickly engage with high quality candidates and match past applicants to new openings.”

In companies of all sizes, talent management reaches its full potential when accountability and collaboration are aligned to a common set of goals. Business strategies and new business models are created, and the specific number of hires by month and quarter is set. Accountability for results is shared between business and talent management organizations, as is the case at AdRoll Group and DigitalOcean, both of which are making solid contributions to the growth of their businesses. When accountability and collaboration are not aligned, the results are unpredictable and less than optimal.

AI makes it possible to scale personalized responses to specific candidates in a company’s candidate community while defining the ideal candidate for each open position. The company’s founders call this aspect of their platform personalization at scale. “Our platform takes a holistic approach to talent management by meaningfully connecting the dots between the individual and the business. At Eightfold.ai, we are going far beyond keyword and Boolean searches to help companies and employees alike make more fulfilling decisions about ‘what’s next,’” commented Ashutosh Garg, CEO and Co-Founder of Eightfold.ai.

Every hiring manager knows what excellence looks like in the positions they’re hiring for. Recruiters gather hundreds of resumes and use their best judgment to find close matches to hiring managers’ needs. Using AI and machine learning, talent management teams save hundreds of hours of screening resumes manually and calibrate job requirements to the available candidates in a company’s candidate community. The graphic below shows how the Talent Intelligence Platform (TIP) helps companies calibrate job descriptions. During my test drive, I found that it’s as straightforward as pointing to the profile of an ideal candidate and asking TIP to find similar candidates.

Achieving Greater Equality With A Data-Driven Approach To Diversity

Eightfold.ai can quantify hiring bias and has found it occurs 35% of the time within in-person interviews and 10% of the time during online or virtual interview sessions. They’ve also analyzed hiring data and found that women are 11% less likely to make it through application reviews, 19% less likely through recruiter screens, 12% less likely through assessments, and a shocking 30% less likely through onsite interviews. The conscious and unconscious biases of recruiters and hiring managers often play a more dominant role than a woman’s qualifications in many hiring situations. For the organizations that are enthusiastically endorsing diversity programs yet struggling to make progress, AI and machine learning are helping to accelerate them toward the goals they want to accomplish.

AI and machine learning can’t make an impact in this area quickly enough. Imagine the lost brainpower from not having a way to evaluate candidates based on their innate skills and potential to excel in the role, and the need for far greater inclusion across the communities companies operate in. AdRoll Group’s CEO is addressing this directly and has made attaining greater diversity and inclusion a top company objective for the year. “We’re very deliberate in our efforts to uncover and nurture more diverse talent while also identifying individuals who have engaged with our talent brand to include them,” said Daniel Doody, Global Head of Talent at AdRoll Group. He continued, “Eightfold.ai has helped us gain greater precision in our nurturing campaigns designed to bring more diverse talent to AdRoll Group globally.”

Kelly O. Kay, Global Managing Partner, Software & Internet Practice at Heidrick & Struggles agrees. “Eightfold.ai levels the playing field for diversity hiring by using pattern matching based on human behavior, which is fascinating,” Mr. Kay said. “I’m 100% supportive of using AI and machine learning to provide everyone equal footing in pursuing and attaining their career goals.” He added that Eightfold.ai’s greatest strength is how brilliantly it takes on the challenge of removing unconscious bias from hiring decisions, further ensuring greater diversity in hiring, retention, and growth decisions.

Eightfold.ai has a unique approach to presenting potential candidates to recruiters and hiring managers. The platform can remove any gender-specific identification from a candidate’s profile so candidates are evaluated purely on expertise, experience, merit, and skills, and it can generate gender-neutral job descriptions in seconds. With these advances in AI and machine learning, the long-held biases of tech companies that prefer to hire only from UC Berkeley, Stanford, or MIT are being challenged as they see the quality of candidates from equally prestigious Indian, Asian, and European universities. Daniel Doody of AdRoll Group says the insights gained from the Eightfold.ai platform “are helping to make managers and recruiters more aware of their own hiring biases while at the same time assisting in nurturing potential candidates via less obvious channels.”
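As an illustration of the blind-review idea only (not Eightfold.ai’s actual data model), a minimal sketch of stripping gender-identifying fields and pronouns from a candidate profile before it reaches a reviewer might look like this:

```python
# Minimal sketch of stripping gender-identifying fields and pronouns from
# a candidate profile before review. Field names are hypothetical and do
# not reflect Eightfold.ai's actual data model.
import re

GENDERED_FIELDS = {"name", "gender", "pronouns", "photo_url"}
GENDERED_TERMS = re.compile(r"\b(he|she|him|her|his|hers)\b", re.IGNORECASE)

def redact_profile(profile):
    """Drop gendered fields and mask pronouns in free-text fields."""
    redacted = {k: v for k, v in profile.items() if k not in GENDERED_FIELDS}
    for key, value in redacted.items():
        if isinstance(value, str):
            redacted[key] = GENDERED_TERMS.sub("[redacted]", value)
    return redacted

profile = {
    "name": "Jane Doe",
    "gender": "F",
    "summary": "She has led three platform migrations end to end.",
    "skills": ["Python", "Kubernetes", "Team leadership"],
}
print(redact_profile(profile))
```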

How To Close The Talent Gap

Based on conversations with customers, it’s apparent that Eightfold.ai’s Talent Intelligence Platform (TIP) gives enterprises the ability to accelerate time to hire, reduce the cost to hire, and increase the quality of hire. Eightfold.ai customers are also seeing how TIP helps them reduce employee attrition, saving on hiring and training costs and minimizing the impact of lost productivity. Today more CEOs and CFOs than ever are making diversity and talent initiatives their highest priority, and those same customer conversations make it clear that TIP provides the insights C-level executives need to reach those goals.

Another aspect of TIP that customers are just beginning to explore is identifying the employees most likely to leave and taking proactive steps to align their jobs with their aspirations, extending the tenure of the most valuable employees. At the same time, customers already see good results from using TIP to identify top talent that fits open positions and is likely to join, putting campaigns in place to recruit and hire those candidates before they begin an active job search. Every Eightfold.ai customer I spoke with attested to the platform’s ability to help them with their strategic imperatives around talent.
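Attrition-risk scoring of this kind is typically framed as a supervised classification problem. Here is a minimal sketch, assuming made-up features and a simple logistic regression; Eightfold.ai has not disclosed how TIP’s models are actually built:

```python
# Minimal, hypothetical sketch of attrition-risk scoring as a supervised
# classification problem. Features, data, and model choice are illustrative;
# TIP's actual models are not public.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training rows: [tenure_months, months_since_promotion, engagement_score]
X = np.array([
    [6, 6, 0.9], [30, 6, 0.8], [18, 18, 0.3],
    [48, 36, 0.2], [12, 12, 0.5], [60, 12, 0.7],
])
y = np.array([0, 0, 1, 1, 1, 0])  # 1 = left within the following year

model = LogisticRegression().fit(X, y)

# Score current employees so HR can prioritize proactive conversations.
current_staff = np.array([[24, 20, 0.35], [36, 6, 0.85]])
for features, risk in zip(current_staff, model.predict_proba(current_staff)[:, 1]):
    print(f"features={features.tolist()} attrition_risk={risk:.2f}")
```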

Five Ways Machine Learning Can Save Your Company From A Security Breach Meltdown

  • $86B was spent on security in 2017, yet 66% of companies have still been breached an average of five or more times.
  • Just 55% of CEOs say their organizations have experienced a breach, while 79% of CTOs acknowledge breaches have occurred. Approximately one in four CEOs (24%) isn’t aware whether their company has even had a security breach.
  • 62% of CEOs inaccurately cite malware as the primary threat to cybersecurity.
  • 68% of executives whose companies experienced significant breaches in hindsight believe that the breach could have been prevented by implementing more mature identity and access management strategies.

These and many other fascinating findings are from the recently released Centrify and Dow Jones Customer Intelligence study, CEO Disconnect is Weakening Cybersecurity (31 pp, PDF, opt-in).

One of the most valuable findings from the study is that CEOs can reduce the risk of a security breach meltdown by rethinking their core cyber defense strategy and maturing their identity and access management practices.

However, 62% of CEOs have the impression that multi-factor authentication is difficult to manage, so their security decisions are driven largely by the desire to avoid delivering poor user experiences. In this context, machine learning can strengthen the foundation of a multi-factor authentication platform, increasing its effectiveness while streamlining the user experience.

Five Ways Machine Learning Saves Companies From Security Breach Meltdowns

Machine learning is helping to solve the security paradox every enterprise faces today: spending millions of dollars on security solutions yet still suffering breaches that cripple their ability to compete and grow. There are many ways machine learning can improve enterprise security. With identity being the primary point of attack, the following are five ways machine learning can be applied to identity and access management to minimize the risk of falling victim to a data breach.

  1. Thwarting compromised credential attacks by using risk-based models that validate user identity based on behavioral pattern matching and analysis. Machine learning excels at constraint-based and pattern-matching algorithms, which makes it ideal for analyzing the behavioral patterns of people signing in to systems that hold sensitive information. Compromised credentials are the most common and lethal type of breach. Applying machine learning to this challenge with a risk-based model that “learns” behavior over time is stopping security breaches today (a minimal sketch of this kind of risk scoring follows this list).
  2. Attaining Zero Trust Security (ZTS) enterprise-wide using risk scoring models that flex to a business’s changing requirements. Machine learning enables Zero Trust Security (ZTS) frameworks to scale enterprise-wide, providing threat assessments and risk graphs for every location. These scoring models are invaluable in planning and executing growth strategies quickly across broad geographic regions. CEOs need to see multi-factor authentication as a key foundation of ZTS frameworks that can help them grow faster, and removing security-based roadblocks that get in the way of future growth needs to be among their highest priorities. A strong ZTS framework is as much a contributor to revenue as any distribution or selling channel.
  3. Streamlining security access for new employees with persona-based risk model profiles that IT can quickly customize for specific needs. CEOs worry most about security’s poor user experience and its impact on productivity. The good news is that the early multi-factor authentication workflows that caused poor user experiences are being redefined with contextual insights and intelligence based on more precise persona-based risk scoring models. As the models “learn” employees’ access behaviors, the level of authentication required changes and the experience improves. By learning new behavior patterns over time, machine learning is accelerating how quickly employees can gain access to secured services and systems.
  4. Providing predictive analytics and insights into the most probable sources of threats, their profiles, and the priority to assign to them. CIOs and the security teams they manage need enterprise-wide visibility of all potential threats, ideally prioritized by potential severity. Machine learning algorithms are doing this today, providing threat assessments and defining which threats CIOs and their teams need to address first.
  5. Stopping malware-based breaches by learning how hackers modify code bases in an attempt to bypass multi-factor authentication. One of hackers’ favorite techniques for penetrating an enterprise network is to use impersonation-based logins and passwords to pass malware onto corporate servers. Malware breaches can be extremely challenging to track. One approach that is working is for enterprises to implement a ZTS framework and create specific scenarios to trap, stop, and destroy suspicious malware activity.
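To make the risk scoring described in the first point concrete, here is a minimal sketch of risk-based step-up authentication driven by an anomaly score over login behavior. The features, model, and thresholds are assumptions for illustration, not Centrify’s or any vendor’s implementation:

```python
# Minimal sketch of risk-based step-up authentication driven by an anomaly
# score over login behavior. Features, model, and thresholds are illustrative
# assumptions, not any vendor's implementation.
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy history of one user's logins: [hour_of_day, new_device, geo_distance_km]
history = np.array([
    [9, 0, 0], [10, 0, 2], [9, 0, 1], [11, 0, 0],
    [8, 0, 3], [10, 0, 1], [9, 0, 0], [14, 0, 2],
])
detector = IsolationForest(random_state=0).fit(history)

def authentication_step(login_features):
    """Map an anomaly score to the factor level required for this login."""
    score = detector.decision_function([login_features])[0]
    if score > 0.05:           # consistent with past behavior
        return "password_only"
    if score > -0.05:          # somewhat unusual -- ask for a second factor
        return "push_mfa"
    return "block_and_alert"   # highly anomalous

print(authentication_step([9, 0, 1]))      # a routine morning login
print(authentication_step([3, 1, 8500]))   # 3 a.m., new device, far away
```

In practice the thresholds would be tuned against historical false-positive rates so step-up prompts stay rare for legitimate users while anomalous logins are reliably challenged.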

10 Ways Machine Learning Is Revolutionizing Manufacturing In 2018

  • Improving semiconductor manufacturing yields up to 30%, reducing scrap rates, and optimizing fab operations are achievable with machine learning.
  • Reducing supply chain forecasting errors by 50% and lost sales by 65% with better product availability is achievable with machine learning.
  • Automating quality testing using machine learning is increasing defect detection rates up to 90%.

Bottom line: Machine learning algorithms, applications, and platforms are helping manufacturers find new business models, fine-tune product quality, and optimize manufacturing operations down to the shop floor level.

Manufacturers care most about finding new ways to grow and excelling at product quality while still being able to take on short lead-time production runs from customers. New business models often bring the paradox of new product lines that strain existing ERP, CRM, and PLM systems with the need to continually improve time-to-customer performance. New products are proliferating in manufacturing today, and delivery windows are tightening. Manufacturers are turning to machine learning to improve the end-to-end performance of their operations and find a performance-based solution to this paradox.

The ten ways machine learning is revolutionizing manufacturing in 2018 include the following:

  • Improving semiconductor manufacturing yields up to 30%, reducing scrap rates, and optimizing fab operations are achievable with machine learning. Attaining up to a 30% reduction in yield detraction in semiconductor manufacturing, reducing scrap rates based on machine learning-based root-cause analysis, and reducing testing costs using AI optimization are the top three areas where machine learning will improve semiconductor manufacturing. McKinsey also found that AI-enhanced predictive maintenance of industrial equipment will generate a 10% reduction in annual maintenance costs, up to a 20% downtime reduction, and a 25% reduction in inspection costs. Source: Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (52 pp., PDF, no opt-in) McKinsey & Company.

  • Asset Management, Supply Chain Management, and Inventory Management are the hottest areas of artificial intelligence, machine learning, and IoT adoption in manufacturing today. The World Economic Forum (WEF) and A.T. Kearney’s recent study of the future of production finds that manufacturers are evaluating how combining emerging technologies including IoT, AI, and machine learning can improve asset tracking accuracy, supply chain visibility, and inventory optimization. Source: Technology and Innovation for the Future of Production: Accelerating Value Creation (38 pp., PDF, no opt-in) World Economic Forum with A.T. Kearney.

  • Manufacturers’ adoption of machine learning and analytics to improve predictive maintenance is predicted to increase 38% in the next five years, according to PwC. Analytics and ML-driven process and quality optimization are predicted to grow 35%, and process visualization and automation, 34%. PwC sees the integration of analytics, APIs, and big data contributing to a 31% growth rate for connected factories in the next five years. Source: Digital Factories 2020: Shaping the future of manufacturing (48 pp., PDF, no opt-in) PricewaterhouseCoopers.

  • McKinsey predicts machine learning will reduce supply chain forecasting errors by 50% and reduce lost sales by 65% with better product availability. Supply chains are the lifeblood of any manufacturing business. Machine learning is predicted to reduce costs related to transport and warehousing and supply chain administration by 5 to 10% and 25 to 40%, respectively. Due to machine learning, overall inventory reductions of 20 to 50% are possible. Source: Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (52 pp., PDF, no opt-in) McKinsey & Company.

  • Using machine learning to improve demand forecast accuracy reduces energy costs and negative price variances, and uncovers price elasticity and price sensitivity as well. Honeywell is integrating AI and machine-learning algorithms into procurement, strategic sourcing, and cost management. Source: Honeywell Connected Plant: Analytics and Beyond. (23 pp., PDF, no opt-in) 2017 Honeywell User’s Group.

  • Automating inventory optimization using machine learning has improved service levels by 16% while simultaneously increasing inventory turns by 25%. AI and machine learning constraint-based algorithms and modeling are making it possible to scale inventory optimization across all distribution locations, taking into account external, independent variables that affect demand and time-to-customer delivery performance. Source: Transform the manufacturing supply chain with Multi-Echelon inventory optimization, Microsoft, March 1, 2018.

  • Combining real-time monitoring and machine learning is optimizing shop floor operations, providing insights into machine-level loads and production schedule performance. Knowing in real-time how each machine’s load level impacts overall production schedule performance leads to better decisions managing each production run. Optimizing the best possible set of machines for a given production run is now possible using machine learning algorithms. Source: Factories of the Future: How Symbiotic Production Systems, Real-Time Production Monitoring, Edge Analytics and AI Are Making Factories Intelligent and Agile, (43 pp., PDF, no opt-in) Youichi Nonaka, Senior Chief Researcher, Hitachi R&D Group and Sudhanshu Gaur Director, Global Center for Social Innovation Hitachi America R&D

  • Improving the accuracy of detecting costs of performance degradation across multiple manufacturing scenarios reduces costs by 50% or more. Using real-time monitoring technologies to create accurate data sets that capture pricing, inventory velocity, and related variables gives machine learning apps what they need to determine cost behaviors across multiple manufacturing scenarios. Source: Leveraging AI for Industrial IoT (27 pp., PDF, no opt-in) Chetan Gupta, Ph.D. Chief Data Scientist, Big Data Lab, Hitachi America Ltd. Date: Sept. 19th, 2017

  • A manufacturer was able to achieve a 35% reduction in test and calibration time via accurate prediction of calibration and test results using machine learning. The project’s goal was to reduce test and calibration time in the production of mobile hydraulic pumps. The methodology focused on using a series of machine learning models that would predict test outcomes and learn over time. The process workflow below was able to isolate the bottlenecks, streamlining test and calibration time in the process. Source: The Value Of Data Science Standards In Manufacturing Analytics (13 pp., PDF, no opt-in) Soundar Srinivasan, Bosch Data Mining Solutions And Services

  • Improving yield rates, preventative maintenance accuracy, and workloads by asset is now possible by combining machine learning and Overall Equipment Effectiveness (OEE). OEE is a pervasively used metric in manufacturing because it combines availability, performance, and quality to define production effectiveness; combined with other metrics, it makes it possible to find the factors that impact manufacturing performance the most and the least (a minimal OEE calculation is sketched after this list). Integrating OEE and other datasets into machine learning models that learn quickly through iteration is one of the fastest-growing areas of manufacturing intelligence and analytics today. Source: TIBCO Manufacturing Solutions, TIBCO Community, January 30, 2018.
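OEE itself is just the product of three ratios, which makes it straightforward to compute from shift-level counters before combining it with other asset-level metrics in a learning model. Here is a minimal sketch with illustrative inputs:

```python
# Minimal sketch of computing Overall Equipment Effectiveness (OEE) from
# shift-level counters: OEE = availability x performance x quality.
# Input values below are illustrative only.

def oee(planned_minutes, downtime_minutes, ideal_cycle_time_minutes,
        total_units, good_units):
    """Return the three OEE components and their product for one shift."""
    run_time = planned_minutes - downtime_minutes
    availability = run_time / planned_minutes
    performance = (ideal_cycle_time_minutes * total_units) / run_time
    quality = good_units / total_units
    return {
        "availability": round(availability, 3),
        "performance": round(performance, 3),
        "quality": round(quality, 3),
        "oee": round(availability * performance * quality, 3),
    }

# One 8-hour shift: 45 minutes of downtime, 1.0-minute ideal cycle time,
# 400 units produced, 388 of them good.
print(oee(planned_minutes=480, downtime_minutes=45,
          ideal_cycle_time_minutes=1.0, total_units=400, good_units=388))
```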

Additional reading:

Artificial Intelligence (AI) Delivering Breakthroughs in Industrial IoT (26 pp., PDF, no opt-in) Hitachi

Artificial Intelligence and Robotics and Their Impact on the Workplace (120 pp., PDF, no opt-in) IBA Global Employment Institute

Artificial Intelligence: The Next Digital Frontier? (80 pp., PDF, no opt-in) McKinsey and Company

Big Data Analytics for Smart Manufacturing: Case Studies in Semiconductor Manufacturing (20 pp., PDF, no opt-in), Applied Materials, Applied Global Services

Connected Factory and Digital Manufacturing: A Competitive Advantage, Shantanu Rai, HCL Technologies (36 pp., PDF, no opt-in)

Demystifying AI, Machine Learning, and Deep Learning, DZone, AI Zone

Digital Factories 2020: Shaping the future of manufacturing (48 pp., PDF, no opt-in) PricewaterhouseCoopers

Emerging trends in global advanced manufacturing: Challenges, Opportunities, And Policy Responses (76 pp., PDF, no opt-in) University of Cambridge

Factories of the Future: How Symbiotic Production Systems, Real-Time Production Monitoring, Edge Analytics and AI Are Making Factories Intelligent and Agile, (43 pp., PDF, no opt-in) Youichi Nonaka, Senior Chief Researcher, Hitachi R&D Group and Sudhanshu Gaur Director, Global Center for Social Innovation Hitachi America R&D

Get started with the Connected factory preconfigured solution, Microsoft Azure

Honeywell Connected Plant: Analytics and Beyond. (23 pp., PDF, no opt-in) 2017 Honeywell User’s Group.

Impact of the Fourth Industrial Revolution on Supply Chains (22 pp., PDF, no opt-in) World Economic Forum

Leveraging AI for Industrial IoT (27 pp., PDF, no opt-in) Chetan Gupta, Ph.D. Chief Data Scientist, Big Data Lab, Hitachi America Ltd. Date: Sept. 19th, 2017

Machine Learning & Artificial Intelligence Presentation (14 pp., PDF, no opt-in) Erik Hjerpe Volvo Car Group

Machine Learning Techniques in Manufacturing Applications & Caveats, (44 pp., PDF, no opt-in), Thomas Hill, Ph.D. | Exec. Director Analytics, Dell

Machine learning: the power and promise of computers that learn by example (128 pp., PDF, no opt-in) Royal Society UK

Predictive maintenance and the smart factory (8 pp., PDF, no opt-in) Deloitte

Priore, P., Gómez, A., Pino, R., & Rosillo, R. (2014). Dynamic scheduling of manufacturing systems using machine learning: An updated review. AI EDAM, 28(1), 83-97.

Smartening up with Artificial Intelligence (AI) – What’s in it for Germany and its Industrial Sector? (52 pp., PDF, no opt-in) McKinsey & Company

Technology and Innovation for the Future of Production: Accelerating Value Creation (38 pp., PDF, no opt-in) World Economic Forum with A.T. Kearney

The Future of Manufacturing; Making things in a changing world (52 pp., PDF, no opt-in) Deloitte University Press

The transformative potential of AI in the manufacturing industry, Microsoft, by Sanjay Ravi, Managing Director, Worldwide Discrete Manufacturing, Microsoft, September 25, 2017

The Value Of Data Science Standards In Manufacturing Analytics (13 pp., PDF, no opt-in) Soundar Srinivasan, Bosch Data Mining Solutions And Services

TIBCO Manufacturing Solutions, TIBCO Community, January 30, 2018

Transform the manufacturing supply chain with Multi-Echelon inventory optimization, Microsoft, March 1, 2018.

Turning AI into concrete value: the successful implementers’ toolkit (28 pp., PDF, no opt-in) Capgemini Consulting

Wuest, T., Weimer, D., Irgens, C., & Thoben, K. D. (2016). Machine learning in manufacturing: advantages, challenges, and applications. Production & Manufacturing Research, 4(1), 23-45.
