
Posts from the ‘Machine learning’ Category

Machine Learning Engineer Is The Best Job In The U.S. According To Indeed

  • Machine Learning Engineer job openings grew 344% between 2015 and 2018, and have an average base salary of $146,085.
  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech.
  • The average base salary of the 25 best jobs in the U.S. according to Indeed is $104,825, and the median base salary is $99,007.
  • Agile Coach is the highest paying job with an average base salary of $161,377.
  • 9 of the top 25 jobs in the U.S. this year are in tech fields according to Indeed.
  • Five jobs are heavily dependent on applicants’ Artificial Intelligence (AI) skills and expertise.

These and many other insights are from Indeed’s The Best Jobs in the U.S.: 2019 study released this week. Indeed defined the best jobs as those experiencing the fastest growth measured by the increase in job postings between 2015 and 2018, in conjunction with those offering the highest pay using a baseline salary of $75,000. Indeed’s best jobs of 2019 data set is available here in Microsoft Excel.

Key insights from Indeed’s ranking of the best jobs of 2019 include the following:

  • At $158,303, Computer Vision Engineers earn among the highest salaries in tech according to Indeed, followed by Machine Learning Engineers with an average base salary of $146,085. The average base pay of the nine tech-related jobs that made Indeed’s list is $122,761, above the median salary of $99,007 for the entire group of the top 25 jobs. Indeed’s top 25 jobs for 2019 are illustrated below in descending salary order with the median salary providing a benchmark across the group. Please click on the graphic to expand for easier reading.

  • Three of the top 10 fastest growing jobs as measured by percentage growth in the number of job postings are in tech. From 2015 to 2018, job postings for Machine Learning Engineers grew 344%, followed by Full-stack developers (206%) and Salesforce developers (129%). In aggregate, all nine technology-related job postings increased by 146% between 2015 and 2018. The graphic below illustrates the percentage of growth in the number of postings between 2015 and 2018. Please click on the graphic to expand for easier reading.

  • Comparing average base salary to percentage growth in job postings underscores the exceptionally high demand for Machine Learning Engineers in 2019. Technical professionals with machine learning expertise are in an excellent position to negotiate at least the average base salary of $146,085. Full-stack developers and Salesforce developers are in such high demand that technical professionals with skills in these areas and relevant experience can command salaries above the average base salary. The following graphic compares the average base salary to percentage growth in job postings for the years 2015 – 2018. Please click on the graphic to expand for easier reading.


74% Of Data Breaches Start With Privileged Credential Abuse

Centrify’s survey shows organizations are granting too much trust and privilege, opening themselves up to potential internal and externally-driven breaches initiated with compromised privileged access credentials. Photo credit: iStock

Enterprises that prioritize privileged credential security are creating a formidable competitive advantage over their peers, ensuring operations won’t be interrupted by a breach. However, there’s a widening gap between those businesses protected from a breach and the many that aren’t. To quantify this gap, consider that the typical U.S.-based enterprise loses on average $7.91M from a breach, nearly double the global average of $3.68M, according to IBM’s 2018 Data Breach Study.

Further insights into how wide this gap is are revealed in Centrify’s Privileged Access Management in the Modern Threatscape survey results published today. The study is noteworthy as it illustrates how wide the gap is between enterprises’ ability to avert and thwart breaches and their current levels of Privileged Access Management (PAM) and privileged credential security. 74% of IT decision makers surveyed whose organizations have been breached in the past say the breach involved privileged access credential abuse, yet just 48% have a password vault, just 21% have multi-factor authentication (MFA) implemented for privileged administrative access, and 65% are sharing root or privileged access to systems and data at least somewhat often.

Addressing these three areas with a Zero Trust approach to PAM would make an immediate difference in security.

“What’s alarming is that the survey reveals many organizations, armed with the knowledge that they have been breached before, are doing too little to secure privileged access. IT teams need to be taking their Privileged Access Management much more seriously, and prioritizing basic PAM strategies like vaults and MFA while reducing shared passwords,” remarked Tim Steinkopf, Centrify CEO. FINN Partners, on behalf of Centrify, surveyed 1,000 IT decision makers (500 in the U.S. and 500 in the U.K.) online in October 2018. Please see the study here for more on the methodology.

How You Choose To Secure Privileged Credentials Determines Your Future 

Identities are the new security perimeter. Threats can emerge within and outside any organization, at any time. Bad actors, or those who want to breach a system for financial gain or to harm a business, aren’t just outside. 18% of healthcare employees are willing to sell confidential data to unauthorized parties for as little as $500 to $1,000, and 24% of employees know of someone who has sold privileged credentials to outsiders, according to a recent Accenture survey.

Attackers are increasingly logging in using weak, stolen, or otherwise compromised credentials. Centrify’s survey underscores how the majority of organizations’ IT departments have room for improvement when it comes to protecting privileged access credentials, which are the ‘keys to the kingdom.’ Reading the survey makes one realize that forward-thinking enterprises that prioritize privileged credential security gain major cost and time advantages over their competitors. They’re able to keep their momentum going across every area of their business by not having to recover from breaches or incur millions of dollars in losses or fines as the result of a breach.

One of the most promising approaches to securing every privileged identity and threat space within and outside an organization is Zero Trust Privilege (ZTP). ZTP enables an organization’s IT team to grant least privilege access based on verifying who is requesting access, the context of the request, and the risk of the access environment.
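
To make this concrete, below is a minimal sketch of a risk-scored, least-privilege access decision. The factor names, weights, and thresholds are illustrative assumptions for this post, not Centrify’s actual scoring model; the point is simply that privileged requests face a stricter bar than standard ones.

```python
# Minimal sketch of a risk-scored, least-privilege access decision.
# The factors, weights, and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_verified_mfa: bool   # did the requester pass multi-factor authentication?
    device_managed: bool      # is the request coming from a managed, compliant device?
    location_usual: bool      # does the source location match the user's normal pattern?
    privilege_level: str      # "standard" or "privileged" (e.g., root/admin access)

def risk_score(req: AccessRequest) -> float:
    """Higher score = higher risk. Weights are illustrative only."""
    score = 0.0
    if not req.user_verified_mfa:
        score += 0.5
    if not req.device_managed:
        score += 0.3
    if not req.location_usual:
        score += 0.2
    return score

def grant_access(req: AccessRequest, threshold: float = 0.4) -> bool:
    # Privileged requests get a stricter threshold: least privilege by default.
    limit = threshold / 2 if req.privilege_level == "privileged" else threshold
    return risk_score(req) <= limit

# Example: a privileged request without MFA is denied.
print(grant_access(AccessRequest(False, True, True, "privileged")))  # False
```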

Key Lessons Learned from the Centrify Survey

How wide the gap is between organizations that see identities as the new security perimeter and are adopting a Zero Trust approach to securing them and those that aren’t is reflected in the results of Centrify’s Privileged Access Management in the Modern Threatscape survey. The following are the key lessons learned regarding where and how organizations can begin to close the security gaps that leave them vulnerable to privileged credential abuse and many other potential threats:

  • Organizations’ most technologically advanced areas that are essential for future growth and attainment of strategic goals are often the most unprotected. Big Data, cloud, containers and network devices are the most important areas of any IT infrastructure. According to Centrify’s survey, they are the most unprotected as well. 72% of organizations aren’t securing containers with privileged access controls. 68% are not securing network devices like hubs, switches, and routers with privileged access controls. 58% are not securing Big Data projects with privileged access controls. 45% are not securing public and private cloud workloads with privileged access controls. The study finds that UK-based businesses lag U.S.-based ones in each of these areas as the graphic below shows:

  • Only 36% of U.K. organizations are very confident in their company’s current IT security software strategies, compared to 65% in the U.S. The gap between organizations with hardened security strategies that have a higher probability of withstanding breach attempts is wide between U.K. and U.S.-based businesses. 44% of U.K. respondents weren’t positive about what Privileged Access Management is, versus 26% of U.S. respondents. 60% of U.K. respondents don’t have a password vault.

  • Just 35% of U.S. organizations and 30% of those in the UK are relying on Privileged Access Management to manage partners’ access to privileged credentials and infrastructure. Partners are indispensable for scaling any new business strategy and expanding an existing one across new markets and countries. Forward-thinking organizations look at every partner associate’s identity as a new security perimeter. The 35% of U.S.-based organizations doing this have an immediate competitive advantage over the 65% who aren’t. By enforcing PAM across their alliances and partnerships, organizations can achieve uninterrupted growth by eliminating expensive and time-consuming breaches that many businesses never fully recover from.
  • Organizations’ top five security projects for 2019 include protecting cloud data, preventing data leakage, analyzing security incidents, improving security education/awareness and encrypting data. These top five security projects could be achieved at scale by having IT teams implement a Zero Trust-based approach to Privileged Access Management (PAM). The time, cost and scale advantages of getting the top five security projects done using Zero Trust would free up IT teams to focus on projects that deliver direct revenue gains for example.

Conclusion

Centrify’s survey shows organizations are granting too much trust and privilege, opening themselves up to potential internal and externally-driven breaches initiated with compromised privileged access credentials. It also reveals that there is a strong desire to adhere to best practices when it comes to PAM (51% of respondents) and that the reason it is not being adequately implemented rarely has to do with prioritization or difficulty but rather budget constraints and executive buy-in.

The survey also shows U.K.- and U.S.-based organizations need to realize identity is the new security perimeter. For example, only 37% of respondents’ organizations are able to turn off privileged access for an employee who leaves the company within one day, leaving a wide-open exposure point that can continue to be exploited.

There are forward-thinking organizations that are relying on Zero Trust Privilege as a core part of their digital transformation efforts as well. The survey found that given a choice, respondents are most likely to say digital transformation (40%) is one of the top 3 projects they’d prefer to work on, followed by Endpoint Security (37%) and Privileged Access Management (28%). Many enterprises see Zero Trust as digital transformation’s missing link and the foundation for redefining their businesses, treating every identity as a new security perimeter so they can securely scale and grow faster than before.

10 Ways AI & Machine Learning Are Revolutionizing Omnichannel

Disney, Oasis, REI, Starbucks, Virgin Atlantic, and others excel at delivering omnichannel experiences using AI and machine learning to fine-tune their selling and service strategies. Source: iStock

Bottom Line: AI and machine learning are enabling omnichannel strategies to scale by providing insights into customers’ changing needs and preferences, creating customer journeys that scale, and delivering consistent experiences.

For any omnichannel strategy to succeed, each customer touchpoint needs to be orchestrated as part of an overarching customer journey. That’s the only way to reduce and eventually eliminate customers’ perceptions of using one channel versus another. What makes omnichannel so challenging to excel at is the need to scale a variety of customer journeys in real-time as customers are also changing.

89% of customers used at least one digital channel to interact with their favorite brands, and just 13% found the digital-physical experiences well aligned, according to Accenture’s omnichannel study. AI and machine learning are being used to close these gaps with greater intelligence and knowledge. Omnichannel strategists are fine-tuning customer personas, measuring how customer journeys change over time, and defining service strategies more precisely using AI and machine learning. Disney, Oasis, REI, Starbucks, Virgin Atlantic, and others excel at delivering omnichannel experiences using AI and machine learning.

Omnichannel leaders including Amazon use AI and machine learning to anticipate which customer personas prefer to speak with a live agent versus using self-service. McKinsey also found omnichannel customer care expectations fall into the three categories of speed and flexibility, reliability and transparency, and interaction and care. Omnichannel customer journeys designed to deliver on each of these three categories excel and scale between automated systems and live agents, as the following example from the McKinsey article, How to capture what the customer wants, illustrates:
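
Routing decisions like these can be framed as a straightforward classification problem. The sketch below uses illustrative features and synthetic training data (assumptions for this post, not Amazon’s or McKinsey’s actual models) to show the general pattern: train on past interactions, then predict whether a new one should go to a live agent or self-service.

```python
# Minimal sketch of routing a customer interaction to a live agent vs. self-service.
# Features and synthetic data are illustrative assumptions only.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500

# Illustrative features per interaction:
# [issue_complexity (0-1), prior_self_service_success_rate (0-1), customer_tenure_years]
X = np.column_stack([
    rng.random(n),
    rng.random(n),
    rng.integers(0, 10, n),
])

# Synthetic label: 1 = prefers/needs a live agent, 0 = self-service is fine.
y = ((X[:, 0] > 0.6) & (X[:, 1] < 0.5)).astype(int)

model = LogisticRegression().fit(X, y)

# Route a new interaction: complex issue, poor self-service history, new customer.
new_interaction = np.array([[0.8, 0.2, 1]])
print("Route to live agent" if model.predict(new_interaction)[0] else "Self-service")
```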

The foundation of every great omnichannel strategy is precise customer personas, insight into how those personas are changing, and an understanding of how supply chains and IT need to flex and change with them. AI and machine learning are revolutionizing omnichannel on these three core dimensions with greater insight and contextual intelligence than ever before.

10 Ways AI & Machine Learning Are Revolutionizing Omnichannel

The following are 10 ways AI & machine learning are revolutionizing omnichannel strategies starting with customer personas, their expectations, and how customer care, IT infrastructure and supply chains need to stay responsive to grow.

  1. AI and machine learning are enabling brands, retailers and manufacturers to more precisely define customer personas, their buying preferences, and journeys. Leading omnichannel retailers are successfully using AI and machine learning today to personalize customer experiences to the persona level. They’re combining brand, event and product preferences, location data, content viewed, transaction histories and most of all, channel and communication preferences to create precise personas of each of their key customer segments.
  2. Achieving price optimization by persona is now possible using AI and machine learning, factoring in brand and channel preferences, previous purchase history, and price sensitivity. Brands, retailers, and manufacturers say that, thanks to rapid advances in AI and machine learning algorithms, cloud-based price optimization and management apps are easier to use and more powerful than ever before. The combination of easier-to-use, more powerful apps and the need to better manage and optimize omnichannel pricing is fueling rapid innovation in this area. The following example is from Microsoft Azure’s Interactive Pricing Analytics Pre-Configured Solution (PCS). Source: Azure Cortana Interactive Pricing Analytics Pre-Configured Solution.

  3. Capitalizing on insights gained from AI and machine learning, omnichannel leaders are redesigning IT infrastructure and integration so they can scale customer experiences. Succeeding with omnichannel takes an IT infrastructure capable of flexing quickly in response to changes in customers’ preferences while providing scale to grow. Every area of a brand, retailer or manufacturer’s supply chain, from supplier onboarding, quality management and strategic sourcing to yard management, dock scheduling, manufacturing, and fulfillment, needs to be orchestrated around customers. Leaders include C3 Solutions, which offers a web-based Yard Management System (YMS) and Dock Scheduling System that can integrate with ERP, Supply Chain Management (SCM), Warehouse Management Systems (WMS) and many others via APIs. The following graphic illustrates how omnichannel leaders orchestrate IT infrastructure to achieve greater growth. Source: Cognizant, The 2020 Customer Experience.

  4. Omnichannel leaders are relying on AI and machine learning to digitize their supply chains, enabling on-time performance and fueling faster revenue growth. For any omnichannel strategy to succeed, supply chains need to be designed to excel at time-to-market and time-to-customer performance at scale. 54% of retailers pursuing omnichannel strategies say that their main goal in digitizing their supply chains was to deliver greater customer experiences. 45% say faster speed to market is their primary goal in digitizing their supply chain by adding in AI and machine learning-driven intelligence. Source: Digitize Today To Future-Proof Tomorrow (PDF, 16 pp., opt-in).

  5. AI and machine learning algorithms are making it possible to create propensity models by persona, and they are invaluable for predicting which customers will act on a bundling or pricing offer. By definition, propensity models rely on predictive analytics, including machine learning, to predict the probability a given customer will act on a bundling or pricing offer, e-mail campaign or other call-to-action leading to a purchase, upsell or cross-sell. Propensity models have proven to be very effective at increasing customer retention and reducing churn. Every business excelling at omnichannel today relies on propensity models to better predict how customers’ preferences and past behavior will lead to future purchases (a minimal code sketch follows this list). The following dashboard shows how propensity models work. Source: customer propensities dashboard from TIBCO.

  6. Combining machine learning-based pattern matching with a product-based recommendation engine is leading to the development of mobile-based apps where shoppers can virtually try on garments they’re interested in buying. Machine learning excels at pattern recognition, and AI is well-suited for creating recommendation engines, which are together leading to a new generation of shopping apps where customers can virtually try on any garment. The app learns what shoppers most prefer and also evaluates image quality in real-time, and then recommends either purchase online or in a store. Source: Capgemini, Building The Retail Superstar: How unleashing AI across functions offers a multi-billion dollar opportunity.

  7. 56% of brands and retailers say that order track-and-traceability strengthened with AI and machine learning is essential to delivering excellent customer experiences. Order tracking across each channel combined with predictions of allocation and out-of-stock conditions using AI and machine learning is reducing operating risks today. AI-driven track-and-trace is invaluable in finding where there are process inefficiencies that slow down time-to-market and time-to-customer. Source: Digitize Today To Future-Proof Tomorrow (PDF, 16 pp., opt-in).
  8. Gartner predicts that by 2025, customer service organizations that embed AI in their customer engagement center platforms will increase operational efficiencies by 25%, revolutionizing customer care in the process. Customer service is often where omnichannel strategies fail due to a lack of real-time contextual data and insight. There’s an abundance of use cases in customer service where AI and machine learning can improve overall omnichannel performance. Amazon has taken the lead on using AI and machine learning to decide when a given customer persona needs to speak with a live agent. Comparable strategies can also be created for improving Intelligent Agents, Virtual Personal Assistants, chatbot and Natural Language Processing (NLP) performance. There’s also the opportunity to improve knowledge management, content discovery and field service routing and support.
  9. AI and machine learning are improving marketing and selling effectiveness by making it possible to track purchase decisions back to campaigns by channel and understand why specific personas purchased while others didn’t. Marketing is already analytically driven, and with the rapid advances in AI and machine learning, marketers will for the first time be able to isolate why and where their omnichannel strategies are succeeding or failing. By using machine learning to further qualify customer and prospect lists with relevant data from the web, predictive models can better identify ideal customer profiles. Each omnichannel sales lead’s predictive score becomes a better predictor of potential new sales, helping sales prioritize time, sales efforts and selling strategies.
  10. Predictive content analytics powered by AI and machine learning are improving sales close rates by predicting which content will lead a customer to buy. Analyzing previous prospect and buyer behavior by persona using machine learning provides insights into which content needs to be personalized and presented when to get a sale. Predictive content analytics is proving to be very effective in B2B selling scenarios and is scaling into consumer products as well.
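
As referenced in point 5 above, here is a minimal propensity-model sketch. The features and synthetic training data are illustrative assumptions, not any vendor’s production model; the classifier’s predicted probability serves as the customer’s propensity score for a given offer.

```python
# Minimal propensity-model sketch: estimate the probability that a customer
# will act on a bundling or pricing offer. All features and data are illustrative.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1_000

# Illustrative features: [past_purchases, days_since_last_purchase, email_open_rate, avg_discount_used]
X = np.column_stack([
    rng.integers(0, 20, n),
    rng.integers(1, 365, n),
    rng.random(n),
    rng.random(n) * 0.3,
])
# Synthetic label: 1 = accepted a past offer, 0 = did not.
y = ((X[:, 0] > 5) & (X[:, 2] > 0.4)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score a customer persona: frequent buyer, recent, engaged, light discount user.
customer = np.array([[12, 14, 0.7, 0.05]])
propensity = model.predict_proba(customer)[0, 1]
print(f"Probability of acting on the offer: {propensity:.2f}")
```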

How Machine Learning Improves Manufacturing Inspections, Product Quality & Supply Chain Visibility

Bottom Line: Manufacturers’ most valuable data is generated on shop floors daily, bringing with it the challenge of analyzing it to find prescriptive insights fast – and an ideal problem for machine learning to solve.

Manufacturing is the most data-prolific industry there is, generating on average 1.9 petabytes of data every year according to the McKinsey Global Institute. Supply chains, sourcing, factory operations, and the phases of compliance and quality management generate the majority of that data.

The most valuable data of all comes from product inspections that can immediately find exceptionally strong or weak suppliers, quality management and compliance practices in a factory. Manufacturing’s massive problem is in getting quality inspection results out fast enough across brands & retailers, other factories, suppliers and vendors to make a difference in future product quality.

How A Machine Learning Startup Is Revolutionizing Product Inspections

Imagine you’re a major brand or retailer and you’re relying on a network of factories across Bangladesh, China, India, and Southeast Asia to produce your new non-food consumer goods product lines, including apparel. Factories, inspection agencies, suppliers and vendors that brands and retailers like you rely on vary widely on ethics, responsible sourcing, product quality, and transparency. With your entire consumer goods product lines (and future sales) at risk based on which suppliers, factories and product inspection agencies you choose, you and your company’s future are riding on the decisions you make.

These career- and company-betting challenges and the frustration of gaining greater visibility into what’s going on in supply chains to factory floors led Carlos Moncayo Castillo and his brothers Fernando Moncayo Castillo and Luis Moncayo Castillo to launch Inspectorio. They were invited to the Target + Techstars Retail Accelerator in the summer of 2017, a competition they participated in with their cloud-based inspection platform that includes AI and machine learning and pervasive support for mobile technologies. Target relies on them today to bring greater transparency to their supply chains. “I’ve spent years working in non-food consumer goods product manufacturing seeing the many disconnects between inspections and suppliers, the lack of collaboration and how gaps in information create too many opportunities for corruption – I had to do something to solve these problems,” Carlos said. The many problems that a lack of inspection and supply chain visibility creates became the pain Inspectorio focused on solving immediately for brands and retailers. The following is a graphic of their platform:

Presented below are a few of the many ways combining a scalable cloud-based inspection platform with AI, machine learning and mobile technologies is improving inspections, product quality, and supply chain visibility:

  • Enabling the creation of customized inspector workflows that learn over time and are tailored to specific products including furniture, toys, homeware and garments, the factories they’re produced in, and the quality of the materials used. Inspectorio’s internal research has found 74% of all inspections today are done manually using pen and paper, with results reported in Microsoft Word, Excel or PDFs, making collaboration slow and challenging. Improving the accuracy, speed and scale of inspection workflows, including real-time updates across production networks, drives major gains in quality and supply chain performance.
  • Applying constraint-based algorithms and logic to understand why there are large differences in inspection results between factories is enabling brands & retailers to manage quality faster and more completely (a simple sketch of flagging outlier factories follows this list). Uploading inspections in real-time from mobile devices to an inspection platform whose AI and machine learning applications quickly parse the data for prescriptive insights is the future of manufacturing quality. Variations in all dimensions of quality, including factory competency and supplier and production assembly quality, are taken into account. In a matter of hours, inspection-based data delivers the insights needed to avert major quality problems to every member of a production network.
  • Reducing risk, the potential for fraud, while improving the product and process quality based on insights gained from machine learning is forcing inspection’s inflection point. When inspections are automated using mobile technologies and results are uploaded in real-time to a secure cloud-based platform, machine learning algorithms can deliver insights that immediately reduce risks and the potential for fraud. One of the most powerful catalysts driving inspections’ inflection point is the combination of automated workflows that deliver high-quality data that machine learning produces prescriptive insights from. And those insights are shared on performance dashboards across every brand, retailer, supplier, vendor and factory involved in shared production strategies today.
  • Matching the most experienced inspector to a given factory and product inspection drastically increases accuracy and quality. When machine learning is applied to the inspector selection and assignment process, the quality and thoroughness of inspections increase. For the first time, brands, retailers, and factories have a clear, quantified view of Inspector Productivity Analysis across the entire team of inspectors available in a given region or country. Inspections are uploaded in real-time to the Inspectorio platform, where advanced analytics and additional machine learning algorithms are applied to the data, providing greater prescriptive insights than would ever have been possible using legacy manual methods. Machine learning is also making recommendations to inspectors on which defects to look for first based on the data patterns obtained from previous inspections.
  • Knowing why specific factories and products generated more Corrective Action/Preventative Action (CAPA) requests than others, how fast they have been closed in the past, and why, is now possible. Machine learning is making it possible for entire production networks to know why specific factory and product combinations generate the most CAPAs. Using constraint-based logic, machine learning can also provide prescriptive insights into what needs to be improved to reduce CAPAs, including their root cause.
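
As referenced in the second bullet above, here is a simple sketch of flagging factories whose inspection defect rates are statistical outliers so quality teams know where to investigate first. The factory names and rates are illustrative, and this is not Inspectorio’s algorithm, just the general idea of comparing results across a production network.

```python
# Minimal sketch of flagging factories with outlier inspection defect rates.
# Factory names and rates are illustrative only.

import numpy as np

factories = ["F-Dhaka", "F-Shenzhen", "F-Chennai", "F-Hanoi", "F-Jakarta"]
defect_rates = np.array([0.021, 0.019, 0.095, 0.024, 0.018])  # defects per unit inspected

mean, std = defect_rates.mean(), defect_rates.std()
z_scores = (defect_rates - mean) / std

for name, rate, z in zip(factories, defect_rates, z_scores):
    flag = "INVESTIGATE" if z > 1.5 else "ok"
    print(f"{name}: defect rate {rate:.1%}, z-score {z:+.2f} -> {flag}")
```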

Which Analytics And BI Technologies Will Be The Highest Priority In 2019?

  • 82% of enterprises are prioritizing analytics and BI as part of their budgets for new technologies and cloud-based services.
  • 54% say AI, Machine Learning and Natural Language Processing (NLP) are also a high investment priority.
  • 50% of enterprises say their stronger focus on metrics and Key Performance Indicators (KPIs) company-wide are a major driver of new investment in analytics and BI.
  • 43% plan to both build and buy AI and machine learning applications and platforms.
  • 42% are seeking to improve user experiences by automating discovery of data insights and 26% are using AI to provide user recommendations.

These and many other fascinating insights are from the recent TDWI Best Practices Report, BI and Analytics in the Age of AI and Big Data. An executive summary of the study is available online here. The entire study is available for download here (39 pp., PDF, free, opt-in). The study found that enterprises are placing a high priority on augmenting existing systems and replacing older technologies and data platforms with new cloud-based BI and predictive analytics ones. Transforming Data with Intelligence (TDWI) is a global community of AI, analytics, data science and machine learning professionals interested in staying current in these and more technology areas as part of their professional development. Please see page 3 of the study for specifics regarding the methodology.

Key takeaways from the study include the following:

  • 82% of enterprises are prioritizing analytics and BI applications and platforms as part of their budgets for new technologies and cloud-based services. 78% of enterprises are prioritizing advanced analytics, and 76% data preparation. 54% say AI, machine learning and Natural Language Processing (NLP) are also a high investment priority. The following graphic ranks enterprises’ investment priorities for acquiring or subscribing to new technologies and cloud-based services by analytics and BI initiatives or strategies. Please click on the graphic to expand for easier reading.

  • Data warehouse or mart in the cloud (41%), data lake in the cloud (39%) and BI platform in the cloud (38%) are the top three types of technologies enterprises are planning to use. Based on this finding and others in the study, cloud platforms are the new normal in enterprises’ analytics and BI strategies going into 2019. Cloud data storage (object, file, or block) and data virtualization or federation (both 32%) are the next-most planned-for technologies when it comes to investing in analytics and BI initiatives. Please click on the graphic to expand for easier reading.

  • The three most important factors in delivering a positive user experience include good query performance (61%), creating and editing visualizations (60%), and personalizing dashboards and reports (also 60%). The three activities that lead to the least amount of satisfaction are using predictive analytics and forecasting tools (27% dissatisfied), “What if” analysis and deriving new data (25%) and searching across data and reports (24%). Please click on the graphic to expand for easier reading.

  • 82% of enterprises are looking to broaden the base of analytics and BI platforms they rely on for insights and intelligence, not just stay with the solutions they have in place today. Just 18% of enterprises plan to add more instances of existing platforms and systems. Cloud-native platforms (38%), a new analytics platform (35%) and cloud-based data lakes (31%) are the top three system areas enterprises are planning to augment or replace existing BI, analytics, and data warehousing systems in. Please click on the graphic to expand for easier reading.

  • The majority of enterprises plan to both build and buy Artificial Intelligence (AI) and machine learning (ML) solutions so that they can customize them to their specific needs. 43% of enterprises surveyed plan to both build and buy AI and ML applications and platforms, a figure higher than any other recent survey on this aspect of enterprise AI adoption. 13% of responding enterprises say they will exclusively build their own AI and ML applications.

  • Capitalizing on machine learning’s innate strengths of applying algorithms to large volumes of data to find actionable new insights (54%) is what’s most important to the majority of enterprises. 47% of enterprises look to AI and machine learning to improve the accuracy and quality of information. And 42% are configuring AI and machine learning applications and platforms to augment user decision making by giving recommendations. Please click on the graphic to expand for easier reading.

10 Ways Machine Learning Is Revolutionizing Sales

  • Sales teams adopting AI are seeing an increase in leads and appointments of more than 50%, cost reductions of 40%–60%, and call time reductions of 60%–70% according to the Harvard Business Review article Why Salespeople Need to Develop Machine Intelligence.
  • 62% of the highest performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce’s latest State of Sales research study.
  • By 2020, 30% of all B2B companies will employ AI to augment at least one of their primary sales processes according to Gartner.
  • High-performing sales teams are 4.1X more likely to use AI and machine learning applications than their peers according to the State of Sales published by Salesforce.
  • Intelligent forecasting, opportunity insights, and lead prioritization are the top three AI and machine learning use cases in sales.

Artificial Intelligence (AI) and machine learning show the potential to reduce the most time-consuming, manual tasks that keep sales teams away from spending more time with customers. Automating account-based marketing support with predictive analytics and supporting account-centered research, forecasting, reporting, and recommending which customers to upsell first are all techniques freeing sales teams from manually intensive tasks.

The Race for Sales-Focused AI & Machine Learning Patents Is On

CRM and Configure, Price & Quote (CPQ) providers continue to develop and fine-tune their digital assistants, which are specifically designed to help the sales team get the most value from AI and machine learning. Salesforce’s Einstein supports voice-activated commands from Amazon Alexa, Apple Siri, and Google. Salesforce and other enterprise software companies continue to invest aggressively in Research & Development (R&D). For the nine months ended October 31, 2018, Salesforce spent $1.3B or 14% of total revenues, compared to $1.1B or 15% of total revenues during the same period a year ago, an increase of $211M according to the company’s 10Q filed with the Securities and Exchange Commission.

The race for AI and machine learning patents that streamline selling is getting more competitive every month. Expect to see the race for sales-focused AI and machine learning patents flourish in 2019. The National Bureau of Economic Research published a study last July from the Stanford Institute For Economic Policy Research titled Some Facts On High Tech Patenting. The study finds that patenting in machine learning has seen exponential growth since 2010 and that Microsoft had the greatest number of patents in the 2000 to 2015 timeframe. Using patent analytics from PatentSight and ipsearch, IAM published an analysis last month showing Microsoft as the global leader in machine learning patents with 2,075. The study relied on PatentSight’s Patent Asset Index to rank machine learning patent creators and owners, revealing that Microsoft and Alphabet are dominating today. Salesforce investing over $1B a year in R&D reflects how competitive the race for patents and intellectual property is.

10 Ways Machine Learning Is Revolutionizing Sales

Fueled by the proliferation of patents and the integration of AI and machine learning code into CRM, CPQ, Customer Service, Predictive Analytics and a wide variety of Sales Enablement applications, use cases are flourishing today. Presented below are the ten ways machine learning is most revolutionizing selling today:

 

  1. AI and machine learning technologies excel at pattern recognition, enabling sales teams to find the highest potential new prospects by matching data profiles with their most valuable customers. Nearly all AI-enabled CRM applications are providing the ability to define a series of attributes, characteristics and their specific values that pinpoint the highest potential prospects. Selecting and prioritizing new prospects using this approach saves sales teams thousands of hours a year.
  2. Lead scoring and nurturing based on AI and machine learning algorithms help guide sales and marketing teams to turn Marketing Qualified Leads (MQL) into Sales Qualified Leads (SQL), strengthening sales pipelines in the process (see the lead-scoring sketch after this list). One of the most important areas of collaboration between sales and marketing is lead nurturing strategies that move prospects through the pipeline. AI and machine learning are enriching the collaboration with insights from third-party data, prospects’ activity at events and on the website, and previous conversations with salespeople. Lead scoring and nurturing rely heavily on natural language generation (NLG) and natural language processing (NLP) to help improve each lead’s score.
  3. Combining historical selling, pricing and buying data in a single machine learning model improves the accuracy and scale of sales forecasts. Factoring in differences inherent in every account given their previous history and product and service purchasing cycles is invaluable in accurately predicting their future buying levels. AI and machine learning algorithms integrated into CRM, sales management and sales planning applications can explain variations in forecasts, provided they have the data available. Forecasting demand for new products and services is an area where AI and machine learning are reducing the risk of investing in entirely new selling strategies for new products.
  4. Knowing the propensity of a given customer to churn versus renew is invaluable in improving Customer Lifetime Value. Analyzing a diverse series of factors to see which customers are going to churn or leave versus those that will renew is among the most valuable insights AI and machine learning are delivering today. Being able to complete a Customer Lifetime Value Analysis for every customer a company has provides a prioritized roadmap of where the health of client relationships is excellent versus where it needs attention. Many companies are using Customer Lifetime Value Analysis as a proxy for a customer health score that gets reviewed monthly.
  5. Knowing the strategies, techniques and time management approaches the top 10% of salespeople rely on to excel far beyond quota, and scaling those practices across the sales team based on AI-driven insights. All sales managers and leaders think about this often, especially in sales teams where performance levels vary widely. Knowing the capabilities of the highest-achieving salespeople, then selectively recruiting sales team candidates who have comparable capabilities, delivers solid results. Leaders in the field of applying AI to talent management include Eightfold, whose approach to talent management is refining recruiting and every phase of managing an employee’s potential. Please see the recent New York Times feature on them here.
  6. Guided Selling is progressing rapidly from a personalization-driven selling strategy to one that capitalizes on data-driven insights, further revolutionizing sales. AI- and machine learning-based guided selling is based on prescriptive analytics that provide recommendations to salespeople of which products, services, and bundles to offer at which price. 62% of the highest performing salespeople predict guided selling adoption will accelerate based on its ability to rank potential opportunities by value and suggest next steps, according to Salesforce’s latest State of Sales research study.
  7. Improving the sales team’s productivity by using AI and machine learning to analyze the most effective actions and behaviors that lead to more closed sales. AI and machine learning-based sales contact and customer predictive analytics take into account all sources of contacts with customers and determine which are the most effective. Knowing which actions and behaviors are correlated with the highest close rates, sales managers can use these insights to scale their sales teams to higher performance.
  8. Sales and marketing are better able to define a price optimization strategy by analyzing all available data with AI and machine learning algorithms (a simple elasticity sketch also follows this list). Pricing continues to be an area most sales and marketing teams learn through trial and error. By analyzing pricing data, purchasing history, discounts taken, promotional programs participated in and many other factors, AI and machine learning can calculate the price elasticity for a given customer, making an optimized price more achievable.
  9. Personalizing sales and marketing content that moves prospects from MQLs to SQLs is continually improving thanks to AI and machine learning. Marketing Automation applications including HubSpot and many others have for years been able to define which content asset needs to be presented to a given prospect at a given time. What’s changed is the interactive, personalized nature of the content itself. Combining analytics, personalization and machine learning, marketing automation applications are now able to tailor content and assets that move opportunities forward.
  10. Solving the many challenges of sales engineering scheduling, sales enablement support and dedicating the greatest amount of time to the most high-value accounts is getting solved with machine learning. CRM applications including Salesforce can define a salesperson’s schedule based on the value of the potential sale combined with the strength of the sales lead, based on its lead score. AI and machine learning optimize a salesperson’s time so they can go from one customer meeting to the next, dedicating their time to the most valuable prospects.
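
As referenced in point 2 above, here is a minimal lead-scoring sketch: rank Marketing Qualified Leads by their predicted probability of converting to Sales Qualified Leads. The features and synthetic training data are illustrative assumptions, not Salesforce Einstein’s or any vendor’s actual model.

```python
# Minimal lead-scoring sketch: score MQLs by probability of becoming SQLs.
# Features and synthetic training data are illustrative assumptions only.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(7)
n = 2_000

# Illustrative features: [web_visits, content_downloads, event_attended (0/1), company_size_log]
X = np.column_stack([
    rng.integers(0, 30, n),
    rng.integers(0, 10, n),
    rng.integers(0, 2, n),
    rng.normal(5, 1, n),
])
# Synthetic label: 1 = lead historically became an SQL.
y = ((X[:, 0] > 10) & (X[:, 1] > 2)).astype(int)

model = GradientBoostingClassifier(random_state=7).fit(X, y)

# Score new leads and work the highest-probability ones first.
new_leads = np.array([[25, 6, 1, 6.0], [2, 0, 0, 4.5]])
scores = model.predict_proba(new_leads)[:, 1]
for i, s in enumerate(scores):
    print(f"Lead {i + 1}: conversion probability {s:.2f}")
```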
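
And as referenced in point 8, here is a simple price-elasticity sketch. The prices and quantities are illustrative; a production price optimization system would estimate elasticity per persona from far richer purchase histories.

```python
# Minimal price-elasticity sketch: estimate elasticity from two observed
# (price, quantity) points, then check whether a price change raises revenue.
# All numbers are illustrative.

def price_elasticity(p1: float, q1: float, p2: float, q2: float) -> float:
    """Arc elasticity: % change in quantity / % change in price (midpoint method)."""
    pct_q = (q2 - q1) / ((q1 + q2) / 2)
    pct_p = (p2 - p1) / ((p1 + p2) / 2)
    return pct_q / pct_p

# Observed for one persona: at $100 they bought 1,000 units; at $90, 1,200 units.
e = price_elasticity(100, 1_000, 90, 1_200)
print(f"Elasticity: {e:.2f}")  # roughly -1.7: demand is elastic for this segment

# Expected revenue comparison for a further 5% price cut, using the estimate.
new_price = 90 * 0.95
new_qty = 1_200 * (1 + e * -0.05)  # -0.05 = -5% price change
print(f"Revenue at $90: {90 * 1_200:,.0f}  vs  at ${new_price:.2f}: {new_price * new_qty:,.0f}")
```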

How AI & Machine Learning Are Redefining The War For Talent

These and many other fascinating insights are from Gartner’s recent research note, Cool Vendors in Human Capital Management for Talent Acquisition (PDF, 13 pp., client access reqd.) that illustrates how AI and machine learning are fundamentally redefining the war for talent. Gartner selected five companies that are setting a rapid pace of innovation in talent management, taking on Human Capital Management’s (HCM) most complex challenges. The five vendors Gartner mentions in the research note are AllyO, Eightfold, jobpal, Knack, and Vettd. Each has concentrated on creating and launching differentiated applications that address urgent needs enterprises have across the talent acquisition landscape. Gartner’s interpretation of the expanding Talent Acquisition Landscape is shown below (please click on the graphic to expand):

Source: Gartner, Cool Vendors in Human Capital Management for Talent Acquisition, Written by Jason Cerrato, Jeff Freyermuth, John Kostoulas, Helen Poitevin, Ron Hanscome. 7 September 2018

Company Growth Plans Are Accelerating The War For Talent

The average employee’s tenure at a cloud-based enterprise software company is 19 months; in Silicon Valley, this drops to 14 months due to intense competition for talent, according to C-level executives leading these companies. Fast-growing enterprise cloud computing companies and many other businesses like them need specific capabilities, skill sets, and associates who know how to unlearn old concepts and learn new ones. Today, across tech and many other industries, every company’s growth strategy is predicated on how well they attract, engage, screen, interview, select and manage talent over associates’ lifecycles.

Of the five companies Gartner names as Cool Vendors in the field of Human Capital Management for Talent Acquisition, Eightfold is the only one achieving personalization at scale today. Attaining personalization at scale is essential if any growing business is going to succeed in attracting, acquiring and growing talent that can support their growth goals and strategies. Eightfold’s approach makes it possible to scale personalized responses to specific candidates in a company’s candidate community while defining the ideal candidate for each open position.

Gartner finds Eightfold noteworthy for its AI-based Talent Intelligence Platform that combines analysis of publicly available data, internal data repositories, HCM systems, ATS tools, and spreadsheets then creates ontologies based on organization-specific success criteria. Each ontology, or area of talent management interest, is customizable for further queries using the app’s easily understood and navigated user interface. Gartner also finds that Eightfold.ai is one of the first examples of a self-updating corporate candidate database. Profiles in the system are now continually updated using external data gathering, without applicants reapplying or submitting updated profiles. The Eightfold.ai Talent Intelligence Platform is shown below:

Taking A Data-Driven Approach to Improve Diversity

AI and machine learning have the potential to remove conscious and unconscious biases from hiring decisions, leading to hiring decisions based on capabilities and innate skills. Many CEOs and senior management teams are enthusiastically endorsing diversity programs yet struggling to make progress. AI and machine learning-based approaches like Eightfold’s can help to accelerate them to their diversity goals and attain a more egalitarian workplace. Data is the great equalizer, with a proven ability to eradicate conscious and unconscious biases from hiring decisions and enable true diversity by equally evaluating candidates based on their experience, growth potential and strengths.

Conclusion

At the center of every growing business’ growth plans is the need to attract, engage, recruit, and retain the highest quality employees possible. As future research in the field of HCM will show, the field is in crisis because it’s relying more on biases than solid data. Breaking through the barrier of conscious and unconscious biases will provide contextual intelligence of an applicant’s unique skills, capabilities and growth trajectories that are far beyond the scope of any resume or what an ATS can provide. The war for talent is being won today with data and insights that strip away biases to provide prospects who are ready for the challenges of helping their hiring companies grow.

Google Needs To Make Machine Learning Their Growth Fuel

  • In 2017 Google outspent Microsoft, Apple, and Facebook on R&D spending with the majority being on AI and machine learning.
  • Google needs new AI- and machine learning-driven businesses that have lower Total Acquisition Costs (TAC) to offset the rising acquisition costs of their ad and search businesses.
  • One of the company’s initial forays into AI and machine learning was its $600M acquisition of AI startup DeepMind in January 2014.
  • Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today.
  • On its Q4’17 earnings call, the company announced that its cloud business is now bringing in $1B per quarter. The number of cloud deals worth $1M+ that Google has sold more than tripled between 2016 and 2017.
  • Google’s M&A strategy is concentrating on strengthening their cloud business to better compete against Amazon AWS and Microsoft Azure.

These and many other fascinating insights are from CB Insights’ report, Google Strategy Teardown (PDF, 49 pp., opt-in). The report explores how Alphabet, Google’s parent company, is relying on Artificial Intelligence (AI) and machine learning to capture new streams of revenue in enterprise cloud computing and services. The report also looks at how Alphabet can combine search, AI, and machine learning to revolutionize logistics, healthcare, and transportation. It’s a thorough teardown of Google’s potential acquisitions, strategic investments, and the partnerships needed to maintain search dominance while driving revenue from new markets.

Key takeaways from the report include the following:

  • Google needs new AI- and machine learning-driven businesses that have lower Total Acquisition Costs (TAC) to offset the rising acquisition costs of their ad and search businesses. CB Insights found Google is experiencing rising TAC in their core ad and search businesses. With the strategic shift to mobile, Google will see TAC escalate even further. Their greatest potential for growth is infusing greater contextual intelligence and knowledge across the entire series of companies that comprise Alphabet, shown in the graphic below.

  • Google has launched two funds dedicated solely to AI: Gradient Ventures and the Google Assistant Investment Program, both of which are accepting pitches from AI and machine learning startups today. Gradient Ventures is an ROI fund focused on supporting the most talented founders building AI-powered companies. Former tech founders are leading Gradient Ventures, assisting in turning ideas into companies. Gradient Venture’s portfolio is shown below:

  • In 2017 Google outspent Microsoft, Apple, and Facebook on R&D, with the majority going to AI and machine learning. Amazon dominated R&D spending among the top five tech companies in 2017 with $22.6B. Facebook leads in percent of total sales invested in R&D with 19.1%.

  • Google AI led the development of Google’s highly popular open source machine learning library and framework TensorFlow and is home to the Google Brain team. Google’s approach to primary research in the fields of AI, machine learning, and deep learning is leading to a prolific amount of research being produced and published. Here’s the search engine for their publication database, which includes many fascinating studies for review. Part of Google Brain’s role is to work with other Alphabet subsidiaries to support and lead their AI and machine learning product initiatives. One example CB Insights mentions in the report is how Google Brain collaborated with autonomous driving division Waymo, where it has helped apply deep neural nets to vehicles’ pedestrian detection. The team has also been successful in increasing the number of AI and machine learning patents, as CB Insights’ analysis below shows:

  • Mentions of AI and machine learning are soaring on Google quarterly earnings calls, signaling that senior management is prioritizing these areas as growth fuel. CB Insights has an Insights Trends tool designed to analyze unstructured text and find linguistics-based associations, models and statistical insights. Analyzing Google earnings call transcripts found that AI and machine learning mentions soared on the most recent call.

  • Google’s M&A strategy is concentrating on strengthening their cloud business to better compete against Amazon AWS and Microsoft Azure. Google acquired Xively in Q1 of this year, followed by Cask Data and Velostrata in Q2. Google needs to continue acquiring cloud-based companies that can accelerate more customer wins in the enterprise and mid-tier, two areas where Amazon AWS and Microsoft Azure have strong momentum today.

Where Business Intelligence Is Delivering Value In 2018

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018.
  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018.
  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018.
  • Organizations successful with analytics and BI apps define success in business results, while unsuccessful organizations concentrate on adoption rate first.
  • 50% of vendors offer perpetual on-premises licensing in 2018, a notable decline from 2017. The number of vendors offering subscription licensing continues to grow for both on-premises and public cloud models.
  • Fewer than 15% of respondent organizations have a Chief Data Officer, and only about 10% have a Chief Analytics Officer today.

These and many other fascinating insights are from Dresner Advisory Services’ 2018 Wisdom of Crowds® Business Intelligence Market Study. In its ninth annual edition, the study provides a broad assessment of the business intelligence (BI) market and a comprehensive look at key user trends, attitudes, and intentions. The latest edition of the study adds Information Technology (IT) analytics, sales planning, and GDPR, bringing the total to 36 topics under study.

“The Wisdom of Crowds BI Market Study is the cornerstone of our annual research agenda, providing the most in-depth and data-rich portrait of the state of the BI market,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “Drawn from the first-person perspective of users throughout all industries, geographies, and organization sizes, who are involved in varying aspects of BI projects, our report provides a unique look at the drivers of and success with BI.” Survey respondents include IT (28%), followed by Executive Management (22%), Finance (19%), Sales/Marketing (8%), and the Business Intelligence Competency Center (BICC) (7%). Please see page 15 of the study for specifics on the methodology.

Key takeaways from the study include the following:

  • Executive Management, Operations, and Sales are the three primary roles driving Business Intelligence (BI) adoption in 2018. Executive management teams are taking more of an active ownership role in BI initiatives in 2018, as this group replaced Operations as the leading department driving BI adoption this year. The study found that the greatest percentage change in functional areas driving BI adoption includes Human Resources (7.3%), Marketing (5.9%), BICC (5.1%) and Sales (5%).

  • Making better decisions, improving operational efficiencies, growing revenues and increased competitive advantage are the top four BI objectives organizations have today. Additional goals include enhancing customer service and attaining greater degrees of compliance and risk management. The graph below rank orders the importance of BI objectives in 2018 compared to the percent change in BI objectives between 2017 and 2018. Enhanced customer service is the fastest growing objective enterprises adopt BI to accomplish, followed by growth in revenue (5.4%).

  • Dashboards, reporting, end-user self-service, advanced visualization, and data warehousing are the top five most important technologies and initiatives strategic to BI in 2018. The study found that second-tier initiatives including data discovery, data mining/advanced algorithms, data storytelling, integration with operational processes, and enterprise and sales planning are also critical or very important to enterprises participating in the survey. Technology areas being hyped heavily today including the Internet of Things, cognitive BI, and in-memory analysis are relatively low in the rankings as of today, yet are growing. Edge computing increased 32% as a priority between 2017 and 2018 for example. The results indicate the core aspect of excelling at using BI to drive better business decisions and more revenue still dominate the priorities of most businesses today.
  • Sales & Marketing, the Business Intelligence Competency Center (BICC) and Executive Management have the highest level of interest in dashboards and advanced visualization. Finance has the greatest interest in enterprise planning and budgeting. Operations (including manufacturing, supply chain management, and services) leads interest in data mining, data storytelling, integration with operational processes, mobile device support, data catalogs and several other technologies and initiatives. It’s understandable that BICC leaders most advocate end-user self-service and attach high importance to many other categories, as they are internal service bureaus to all departments in an enterprise. It’s been my experience that BICCs are always looking for ways to scale BI adoption and enable every department to gain greater value from analytics and BI apps. BICCs in the best-run companies are knowledge hubs that encourage and educate all departments on how to excel with analytics and BI.

  • Insurance companies most prioritize dashboards, reporting, end-user self-service, data warehousing, data discovery and data mining. Business Services lead the adoption of advanced visualization, data storytelling, and embedded BI. Manufacturing most prioritizes sales planning and enterprise planning but trails in other high-ranking priorities. Technology prioritizes Software-as-a-Service (SaaS) given its scale and speed advantages. The retail & wholesale industry is going through an analytics and customer experience revolution today. Retailers and wholesalers lead all others in data catalog adoption and mobile device support.

  • Insurance, Technology and Business Services vertical industries have the highest rate of BI adoption today. The Insurance industry leads all others in BI adoption, followed by the Technology industry, where 40% of organizations report 41% or greater adoption or penetration. Industries whose BI adoption is above average include Business Services and Retail & Wholesale. The following graphic illustrates penetration or adoption of Business Intelligence solutions today by industry.

  • Dashboards and reporting are the highest-priority investment areas for companies whose budgets increased from 2017 to 2018, followed by advanced visualization and data warehousing. The study found that less well-funded organizations are the most likely of all to invest in open source software to reduce costs.

  • Small organizations with up to 100 employees have the highest rate of BI penetration or adoption in 2018. Factors contributing to small businesses' high adoption rate include business models that depend on advanced analytics to function and scale, new hires who bring the latest analytics and BI skills into high-growth businesses, and fewer barriers to adoption than larger enterprises face. BI adoption tends to be more pervasive in small businesses because a greater percentage of employees use analytics and BI apps daily.

  • Executive Management is most familiar with the type and number of BI tools in use across the organization. The majority of executive management respondents say their teams are using one or two BI tools today. Business Intelligence Competency Centers (BICC) consistently report a higher number of BI tools in use than other functional areas, given their heavy involvement in all phases of analytics and BI project execution. IT, Sales & Marketing and Finance are likely to have more BI tools in use than Operations.

  • Enterprises rate BI application usability and product quality & reliability at an all-time high in 2018. Other areas of major improvement on the part of vendors include ease of implementation; online training, forums and documentation; and completeness of functionality. Dresner's research team found that between 2017 and 2018, ratings for integration of components within a product dropped, as did scalability. The study attributes the drop in integration scores to the increasing number of software company acquisitions that aggregate dissimilar products from different platforms.

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

  • Hiring companies nationwide miss out on 50% or more of qualified candidates, and tech firms incorrectly classify up to 80% of candidates, due to inaccuracies and shortcomings of existing Applicant Tracking Systems (ATS), illustrating how faulty these systems are as hiring tools.
  • It takes 42 days on average to fill a position, and 60 days or longer for positions requiring in-demand technical skills; filling each position costs an average of $5,000.
  • Women applicants have a 19% chance of being eliminated from consideration for a job after a recruiter screen and a 30% chance after an onsite interview, leading to a massive loss of brainpower and insight every company needs to grow.

It’s time the hiring process got smarter and more infused with contextual intelligence and insight, evaluating candidates on their mastery of needed skills rather than on resumes that only reflect what they’ve achieved in the past. Enriching the hiring process with greater machine learning-based contextual intelligence finds the candidates who are exceptional and have the intellectual skills to contribute beyond hiring managers’ expectations. Machine learning algorithms can also remove any ethnicity- and gender-specific identification of a candidate so they are evaluated purely on expertise, experience, merit, and skills.

The hiring process relied on globally today hasn’t changed in over 500 years. From Leonardo da Vinci’s handwritten resume of 1482, which emphasized his ability to build bridges and support warfare rather than the genius behind the Mona Lisa, the Last Supper, the Vitruvian Man, and a myriad of scientific discoveries and inventions that modernized the world, the approach job seekers take to pursuing new positions has stubbornly defied innovation. ATS apps and platforms classify inbound resumes and rank candidates based on only the small glimpse of their skills a resume provides, when what’s needed is insight into which managerial, leadership and technical skills & strengths any given candidate is attaining mastery of, and at what pace. Machine learning broadens the scope of what hiring companies can see in candidates by moving beyond the barriers of their resumes. Better hiring decisions get made, and the Return on Investment (ROI) drastically improves, when those decisions are strengthened with greater intelligence. Key metrics including time-to-hire, cost-to-hire, retention rates, and performance all improve when greater contextual intelligence is relied on.

Look Beyond Resumes To Win The War For Talent

Last week I had the opportunity to speak with the Vice President of Human Resources at one of the leading technology think tanks globally. He’s focused on the hundreds of technical professionals his organization needs in six months, 12 months and more than a year from now to staff exciting new research projects that will deliver valuable Intellectual Property (IP), including patents and new products.

Their approach begins with understanding the profiles and core strengths of current high performers, then seeking out matches with ideal candidates in their community of applicants and the broader technology community. Machine learning algorithms are well suited to the needed comparative analysis of high performers’ capabilities and those of candidates, whose entire digital persona is taken into account when comparisons are completed. The following graphic shows the eightfold.ai Talent Intelligence Platform (TIP) and how integrated it is with publicly available data, internal data repositories, Human Resource Management (HRM) systems, and ATS tools. Please click on the graphic to expand it for easier reading.
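To make the comparative-analysis step concrete, here is a minimal sketch in Python, assuming a simple bag-of-skills representation: it averages the skill profiles of current high performers into an "ideal" vector and ranks candidates by cosine similarity against it. The skill lists, candidate names, and scoring choices are hypothetical illustrations of the general technique, not eightfold.ai’s actual algorithm or data model.

```python
# Minimal sketch: rank candidates by similarity to an "ideal" profile
# built from current high performers. All names and skill lists are
# hypothetical; this is not eightfold.ai's actual algorithm.
from collections import Counter
import math

high_performers = [
    {"skills": ["python", "machine learning", "nlp", "aws"]},
    {"skills": ["python", "deep learning", "nlp", "spark"]},
]

candidates = {
    "candidate_a": ["python", "nlp", "machine learning"],
    "candidate_b": ["java", "sql", "project management"],
}

def to_vector(skills, vocabulary):
    """Represent a skill list as a simple bag-of-skills count vector."""
    counts = Counter(skills)
    return [counts[s] for s in vocabulary]

def cosine(u, v):
    """Cosine similarity between two vectors; 0.0 if either is all zeros."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Build a shared vocabulary and an averaged "high performer" profile.
vocabulary = sorted({s for hp in high_performers for s in hp["skills"]})
hp_vectors = [to_vector(hp["skills"], vocabulary) for hp in high_performers]
ideal = [sum(col) / len(hp_vectors) for col in zip(*hp_vectors)]

# Score every candidate against the ideal profile and rank them.
ranked = sorted(
    ((cosine(to_vector(skills, vocabulary), ideal), name)
     for name, skills in candidates.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{name}: {score:.2f}")
```

In practice a platform like TIP would presumably draw on far richer signals (career history, public contributions, publications) than a flat skill list, but the underlying ranking idea is the same.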

The comparative analysis of high achievers’ characteristics with applicants takes seconds to complete, producing a list of prospects complete with profiles. Machine learning-derived profiles of potential hires matching the high performers’ characteristics provide greater contextual intelligence than any resume ever could. Taking an integrated approach to creating the Talent Intelligence Platform (TIP) yields insights not available from typical hiring or ATS solutions today. The profile below reflects the contextual intelligence and depth of insight possible when machine learning is applied to an integrated dataset of candidates. Please click on the graphic to expand it for easier reading. Key elements in the profile below include the following:

  • Career Growth Bell Curve – Illustrates how a given candidate’s career progression and performance compare to others’ (a minimal sketch of this kind of percentile placement follows this list).

  • Social Following On Public Sites – Provides a real-time glimpse into the candidate’s activity on GitHub, Open Stack, and other sites where technical professionals can share their expertise. This also provides insight into how others perceive their contributions.

  • Highlights Of Background Relevant To Job(s) Under Review – Provides the most relevant data from the candidate’s history in the profile so recruiters and managers can more easily understand their strengths.

  • Recent Publications – Publications provide insights into current and previous interests, areas of focus, mindset and learning progression over the last 10 to 15 years or longer.

  • Professional overlap that makes it easier to validate achievements chronicled in the resume – Multiple sources of real-time career data validate and provide greater context and insight into resume-listed accomplishments.
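As a rough illustration of the Career Growth Bell Curve idea referenced above, the sketch below places a candidate’s progression rate on a normal curve fitted to a peer population. The "promotions per year" metric, the peer values, and the normality assumption are all hypothetical; the platform’s actual metrics are not public.

```python
# Minimal sketch of a "career growth bell curve" placement: where does a
# candidate's progression rate fall relative to a peer population?
# The metric and the peer data below are hypothetical examples.
import statistics
from math import erf, sqrt

peer_growth_rates = [0.10, 0.15, 0.20, 0.22, 0.25, 0.30, 0.35, 0.40]  # promotions per year
candidate_rate = 0.33

mean = statistics.mean(peer_growth_rates)
stdev = statistics.stdev(peer_growth_rates)

# Assume a roughly normal peer distribution and place the candidate on it.
z = (candidate_rate - mean) / stdev
percentile = 0.5 * (1 + erf(z / sqrt(2)))

print(f"z-score: {z:.2f}; candidate sits above roughly {percentile:.0%} of peers")
```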

The key is understanding the context in which a candidate’s capabilities are being evaluated, and a two-page resume will never give the candidate enough latitude to cover all the bases. For medium to large companies, doing this accurately and quickly across all roles, all geographies, all candidates sourced, all candidates applying online, university recruiting, re-skilling inside the company, internal mobility for existing employees, and every recruitment channel is a daunting task if done manually. This is where machine learning can be an ally to the recruiter, the hiring manager, and the candidate.

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

Reducing costs and time-to-hire, increasing the quality of hires, and staffing new initiatives with the highest quality talent possible all fuel solid revenue growth. Relying on resumes alone is like being on a bad Skype call where you only hear every tenth word in the conversation. Machine learning-based approaches bring greater acuity, clarity, and visibility to hiring decisions.

The following are the five reasons why machine learning needs to make resumes obsolete:

  1. Resumes are like rearview mirrors that primarily reflect the past. What’s needed is more of a focus on where someone is going, why (what motivates them), and what they are fascinated with and learning about on their own: an intelligent heads-up display of what their future will look like based on present interests and talent.
  2. With a 500+-year-old process, there’s no way of knowing which skills, technologies and training a candidate is gaining momentum in. The depth and extent of mastery in specific areas aren’t reflected in the structure of resumes. By integrating multiple sources of data into a unified view of a candidate, it’s possible to see which areas they are growing fastest in from a professional development standpoint.
  3. It’s far harder to game a machine learning algorithm that takes into account all the digital data available on a candidate, while resumes have a credibility problem. Anyone who has been involved in hiring decisions has faced the disappointment of finding out a promising candidate lied on a resume; it’s a huge let-down. Resumes often get gamed, with one recruiter saying at least 60% of resumes contain exaggerations and, in some cases, outright lies. Taking all data into account using a platform like TIP shows the true candidate and their actual skills.
  4. It’s time to take a more data-driven approach to diversity, one that removes unconscious biases. Resumes immediately carry inherent biases: recruiters, hiring managers and final interview panels of senior managers draw on unconscious biases tied to a person’s name, gender, age, appearance, the schools they attended and more. It’s more effective to know their skills, strengths, and core areas of intelligence, all of which are better predictors of job performance.
  5. Reduces the risk of making a bad hire who will churn out of the organization fast. Ultimately everyone hires based partly on their best judgment and partly on their often unconscious biases; it’s human nature. With more data, the probability of making a bad hire drops, reducing the risk of churning through a new hire and spending thousands of dollars to hire and then replace them. Greater contextual intelligence reduces the downside risks of hiring, removes biases by showing with solid data how qualified a person is for a role, and verifies their background, strengths, skills, and achievements. Factors contributing to unconscious bias, including gender, race, age or anything else, can be removed from profiles so candidates are evaluated only on their potential to excel in the roles they are being considered for, as sketched in the example after this list.
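As referenced in the last two points, one simple way to operationalize bias removal is to strip protected attributes from a candidate profile before any scoring happens. The sketch below shows the idea with hypothetical field names; a real system would also need to handle proxy variables (such as school names or addresses) that correlate with protected traits.

```python
# Minimal sketch of masking bias-prone attributes before a profile is scored.
# Field names are hypothetical and chosen only to illustrate the idea.
PROTECTED_FIELDS = {"name", "gender", "age", "ethnicity", "photo_url"}

def anonymize(profile: dict) -> dict:
    """Return a copy of the profile with protected attributes removed."""
    return {k: v for k, v in profile.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "Jordan Example",
    "gender": "F",
    "age": 34,
    "skills": ["python", "nlp", "aws"],
    "years_experience": 9,
    "publications": 4,
}

print(anonymize(candidate))
# {'skills': ['python', 'nlp', 'aws'], 'years_experience': 9, 'publications': 4}
```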

Bottom line: It’s time to revolutionize resumes and hiring processes, moving them into the 21st century by redefining them with greater contextual intelligence and insight enabled by machine learning.

 
