
Posts tagged ‘Analytics’

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

  • Hiring companies nationwide miss out on 50% or more of qualified candidates, and tech firms incorrectly classify up to 80% of candidates, due to the inaccuracies and shortcomings of existing Applicant Tracking Systems (ATS), illustrating how poorly these systems support hiring.
  • It takes 42 days on average to fill a position, and 60 days or longer for positions requiring in-demand technical skills, at an average cost of $5,000 per position.
  • Women applicants have a 19% chance of being eliminated from consideration for a job after a recruiter screen and 30% after an onsite interview, leading to a massive loss of brainpower and insight every company needs to grow.

It’s time the hiring process got smarter and more infused with contextual intelligence and insight, evaluating candidates on their mastery of needed skills rather than judging them on resumes that only reflect what they’ve achieved in the past. Enriching the hiring process with greater machine learning-based contextual intelligence finds the candidates who are exceptional and have the intellectual skills to contribute beyond hiring managers’ expectations. Machine learning algorithms can also remove any ethnic- and gender-specific identification of a candidate so they are evaluated purely on expertise, experience, merit, and skills.

The hiring process relied on globally today hasn’t changed in over 500 years. From Leonardo da Vinci’s handwritten resume of 1482, which emphasized his ability to build bridges and support warfare rather than the genius behind the Mona Lisa, the Last Supper, the Vitruvian Man, and a myriad of scientific discoveries and inventions that modernized the world, the approach job seekers take to pursuing new positions has stubbornly defied innovation. ATS apps and platforms classify inbound resumes and rank candidates based on just a small glimpse of their skills as seen on a resume, when what’s needed is insight into which managerial, leadership and technical skills and strengths a given candidate is attaining mastery of, and at what pace. Machine learning broadens the scope of what hiring companies can see in candidates by moving beyond the barriers of their resumes. Better hiring decisions are being made, and the Return on Investment (ROI) drastically improves when hiring decisions are strengthened with greater intelligence. Key metrics including time-to-hire, cost-to-hire, retention rates, and performance all improve when greater contextual intelligence is relied on.

Look Beyond Resumes To Win The War For Talent

Last week I had the opportunity to speak with the Vice President of Human Resources for one of the leading technology think tanks globally. He’s focusing on hundreds of technical professionals his organization needs in six months, 12 months and over a year from now to staff exciting new research projects that will deliver valuable Intellectual Property (IP) including patents and new products.

Their approach begins by seeking to understand the profiles and core strengths of current high performers, then seeking out matches with ideal candidates in their community of applicants and the broader technology community. Machine learning algorithms are perfectly suited for completing the needed comparative analysis of high performers’ capabilities and those of candidates, whose entire digital persona is taken into account when comparisons are completed. The following graphic shows the eightfold.ai Talent Intelligence Platform (TIP), illustrating how integrated it is with publicly available data, internal data repositories, Human Resource Management (HRM) systems, and ATS tools. Please click on the graphic to expand it for easier reading.

The comparative analysis of high achievers’ characteristics with applicants takes seconds to complete, providing a list of prospects complete with profiles. Machine learning-derived profiles of potential hires matching the high performers’ characteristics provide greater contextual intelligence than any resume ever could. Taking an integrated approach to creating the Talent Intelligence Platform (TIP) yields insights not available with typical hiring or ATS solutions today. The profile below reflects the contextual intelligence and depth of insight possible when machine learning is applied to an integrated dataset of candidates. Please click on the graphic to expand it for easier reading. Key elements in the profile below include the following:

  • Career Growth Bell Curve – Illustrates how a given candidate’s career progression and performance compare relative to others.

  • Social Following On Public Sites – Provides a real-time glimpse into the candidate’s activity on GitHub, OpenStack, and other sites where technical professionals share their expertise. This also provides insight into how others perceive their contributions.

  • Highlights Of Background Relevant To The Job(s) Under Review – Provides the most relevant data from the candidate’s history in the profile so recruiters and managers can more easily understand their strengths.

  • Recent Publications – Publications provide insights into current and previous interests, areas of focus, mindset and learning progression over the last 10 to 15 years or longer.

  • Professional Overlap That Makes It Easier To Validate Achievements Chronicled In The Resume – Multiple sources of real-time career data validate and provide greater context and insight into resume-listed accomplishments.

The key is understanding the context in which a candidate’s capabilities are being evaluated, and a two-page resume will never give the candidate enough latitude to cover all the bases. For medium to large companies, doing this accurately and quickly is a daunting task if done manually – across all roles, all geographies, all candidates sourced, all candidates applying online, university recruiting, re-skilling inside the company, internal mobility for existing employees, and across all recruitment channels. This is where machine learning can be an ally to the recruiter, the hiring manager, and the candidate; a sketch of this kind of high-performer-to-candidate comparison follows.
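
The sketch below shows one way such a comparative analysis could work once candidate and high-performer profiles have been reduced to numeric skill vectors. It is a minimal illustration only: the skill names, scores, and the choice of cosine similarity are assumptions, not eightfold.ai’s actual model or data.

```python
# Minimal sketch: rank candidates by similarity to current high performers.
# Skill names, scores, and the similarity measure are hypothetical placeholders.
import numpy as np

# Aggregate profile of current high performers (e.g., the mean of their skill vectors)
# over the dimensions: python, distributed_systems, research, leadership.
high_performer_profile = np.array([0.90, 0.80, 0.95, 0.60])

candidates = {
    "candidate_a": np.array([0.85, 0.70, 0.90, 0.50]),
    "candidate_b": np.array([0.40, 0.90, 0.20, 0.80]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two skill vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

ranked = sorted(
    ((name, cosine_similarity(vec, high_performer_profile)) for name, vec in candidates.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, score in ranked:
    print(f"{name}: similarity to high performers = {score:.2f}")
```

In practice the vectors would come from the integrated dataset described above (resumes, public contributions, internal HRM data), and a similarity measure like this would be only one feature in a larger ranking model.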

Five Reasons Why Machine Learning Needs To Make Resumes Obsolete

Reducing the cost and time-to-hire, increasing the quality of hires, and staffing new initiatives with the highest quality talent possible all fuel solid revenue growth. Relying on resumes alone is like being on a bad Skype call where you only hear every tenth word in the conversation. Using machine learning-based approaches brings greater acuity, clarity, and visibility into hiring decisions.

The following are the five reasons why machine learning needs to make resumes obsolete:

  1. Resumes are like rearview mirrors that primarily reflect the past. What’s needed is more of a focus on where someone is going, why (what motivates them), and what they are fascinated with and learning about on their own. Instead of a rearview mirror, what’s needed is an intelligent heads-up display of what their future will look like based on present interests and talent.
  2. By relying on a 500+-year-old process, there’s no way of knowing what skills, technologies and training a candidate is gaining momentum in. The depth and extent of mastery in specific areas aren’t reflected in the structure of resumes. By integrating multiple sources of data into a unified view of a candidate, it’s possible to see what areas they are growing the quickest in from a professional development standpoint.
  3. It’s impossible to game a machine learning algorithm that takes into account all the digital data available on a candidate, while resumes have a credibility issue. Anyone who has hired subordinates or staff, or been involved in hiring decisions, has faced the disappointment of finding out a promising candidate lied on a resume. It’s a huge let-down. Resumes often get gamed, with one recruiter saying at least 60% of resumes contain exaggerations and, in some cases, outright lies. Taking all data into account using a platform like TIP shows the true candidate and their actual skills.
  4. It’s time to take a more data-driven approach to diversity that removes unconscious biases. Resumes today immediately carry inherent biases in them. Recruiters, hiring managers, and final interview groups of senior managers bring unconscious biases based on a person’s name, gender, age, appearance, schools they attended, and more. It’s more effective to know their skills, strengths, and core areas of intelligence, all of which are better predictors of job performance.
  5. Reduces the risk of making a bad hire that will churn out of the organization fast. Ultimately everyone hires based in part on their best judgment and in part on their often unconscious biases. It’s human nature. With more data, the probability of making a bad hire is reduced, lowering the risk of churning through a new hire and spending thousands of dollars to hire and then replace them. Having greater contextual intelligence reduces the downside risks of hiring, removes biases by showing with solid data just how qualified a person is or isn’t for a role, and verifies their background, strengths, skills, and achievements. Factors contributing to unconscious biases, including gender, race, age, or any other attribute, can be removed from profiles so candidates are evaluated only on their potential to excel in the roles they are being considered for; a minimal sketch of this kind of redaction follows the list.
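
The sketch below shows the simplest form of the redaction step described in points 4 and 5: dropping protected attributes from a profile before a recruiter or ranking model sees it. The field names are hypothetical, and in practice removing explicit fields is only a first step, since other features can act as proxies for them.

```python
# Minimal sketch: strip fields that can trigger unconscious bias before a
# profile reaches recruiters or a ranking model. Field names are hypothetical.
PROTECTED_FIELDS = {"name", "gender", "age", "photo_url", "ethnicity"}

def redact_profile(profile: dict) -> dict:
    """Return a copy of the profile with protected attributes removed."""
    return {key: value for key, value in profile.items() if key not in PROTECTED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "gender": "F",
    "age": 34,
    "skills": ["python", "spark", "mlops"],
    "publications": 7,
    "github_contributions": 412,
}

print(redact_profile(candidate))
# {'skills': ['python', 'spark', 'mlops'], 'publications': 7, 'github_contributions': 412}
```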

Bottom line: It’s time to revolutionize resumes and hiring processes, moving them into the 21st century by redefining them with greater contextual intelligence and insight enabled by machine learning.

 


The State Of Cloud Business Intelligence, 2018

  • Cloud BI adoption is soaring in 2018, nearly doubling 2016 adoption levels.
  • Over 90% of Sales & Marketing teams say that Cloud BI is essential for getting their work done in 2018, leading all categories in the survey.
  • 66% of organizations that consider themselves completely successful with Business Intelligence (BI) initiatives currently use the cloud.
  • Financial Services (62%), Technology (54%), and Education (54%) have the highest Cloud BI adoption rates in 2018.
  • 86% of Cloud BI adopters name Amazon AWS as their first choice, 82% name Microsoft Azure, 66% name Google Cloud, and 36% identify IBM Bluemix as their preferred provider of cloud BI services.

These and many other fascinating insights are from the Dresner Advisory Services 2018 Cloud Computing and Business Intelligence Market Study (client access required), part of the Wisdom of Crowds® series of research. The 7th annual edition of the study seeks to quantify end-user deployment trends and attitudes toward cloud computing and business intelligence (BI), defined as the technologies, tools, and solutions that employ one or more cloud deployment models. Dresner Advisory Services defines the scope of Business Intelligence (BI) tools and technologies to include query and reporting, OLAP (online analytical processing), data mining and advanced analytics, end-user tools for ad hoc query and analysis, and dashboards for performance monitoring. Please see page 10 of the study for the methodology. The study found the primary barriers to greater cloud BI adoption are enterprises’ concerns regarding data privacy and security.

Key takeaways from the study include the following:

  • Cloud BI’s importance continues to accelerate in 2018, with the majority of respondents considering it an important element of their broader analytics strategies. The study found that the mean level of sentiment rose from 2.68 to 3.22 (above the level of “important”) between 2017 and 2018, indicating the increased importance of Cloud BI over the last year. By region, Asia-Pacific respondents continue to be the strongest proponents of cloud computing regarding both adjusted mean (4.2, or “very important”) and levels of criticality. The following graphic illustrates Cloud BI’s growing importance between 2012 and 2018.

  • Over 90% of Sales & Marketing teams say Cloud BI apps are important to getting their work done in 2018, leading all respondent categories in the survey. The study found that Cloud BI importance in 2018 is highest among Sales/Marketing and Executive Management respondents. One of the key factors driving this is the fact that both Sales & Marketing and Executive Management are increasingly relying on cloud-based front office applications and services that are integrated with and generate cloud-based data to track progress towards goals.

  • Cloud BI is most critical to the Financial Services & Insurance, Technology, and Retail & Wholesale Trade industries. The study recorded its highest-ever levels of Cloud BI importance in 2018. Financial Services has the highest weighted mean interest in cloud BI (3.8, approaching “very important” status, as shown in the figure below). Technology organizations, where half of the respondents say cloud BI is “critical” or “very important,” are the next most interested. Close to 90% of Retail/Wholesale respondents say SaaS/cloud BI is at least “important” to them. As it has been over time, Healthcare remains the industry least open to managed services for data and business intelligence.

  • Cloud BI adoption is soaring in 2018, nearly doubling 2016 adoption levels. The study finds that the percentage of respondents using Cloud BI in 2018 nearly doubled from 25% of enterprise users in 2016. Year over year, current use rose from 31% to 49%. In the same time frame, the percentage of respondents with no plans to use cloud BI dropped by half, from 38% to 19%. The study has been conducted annually for the last seven years, showing a steady progression of Cloud BI awareness and adoption, with 2018 showing the most significant rise in adoption levels yet.

  • Sales & Marketing leads all departments in current use and planning for Cloud BI applications. Business Intelligence Competency Centers (BICC) are a close second, each with over 60% adoption rates for Cloud BI today. Operations, including manufacturing, supply chains, and services, are the next most likely to use Cloud BI currently. Marketing and BICC lead current adoption and are contributing catalysts of Cloud BI’s soaring growth between 2016 and 2018. Both of these departments often have time-constrained and revenue-driven goals where quantifying contributions to company growth and achievement is critical.

  • Financial Services (62%), Technology (54%), and Education (54%) industries have the highest Cloud BI adoption rates in 2018. The retail/wholesale industry has the fourth-highest level of Cloud BI adoption and the greatest number of companies currently evaluating Cloud BI today. The least likely current or future users are found in manufacturing and security-sensitive healthcare organizations, where 45% of respondents report no plans for cloud-based BI/analytics.

  • Dashboards, advanced visualization, ad-hoc query, data integration, and self-service are the most-required Cloud BI features in 2018. Sales & Marketing need real-time feedback on key initiatives, programs, strategies, and progress towards goals. The dominance of dashboards and advanced visualization among feature requirements reflects these departments’ ongoing need for real-time feedback on the progress of their teams towards goals. Reporting, data discovery, and end-user data blending (data preparation) make up the next tier of importance.

  • Manufacturers have the greatest interest in dashboards, ad-hoc query, production reporting, a search interface, location intelligence, and the ability to write to transactional applications. Education respondents report the greatest interest in advanced visualization along with data integration, data mining, end-user data blending, data catalogs, and collaborative support for group-based analysis. Financial Services respondents are highly interested in advanced visualization and lead all industries in self-service. Healthcare industry respondents lead interest only in in-memory support. Retail/Wholesale and Healthcare industry respondents are the least interested in features overall.

  • Interest in cloud application connections to Salesforce, NetSuite, and other cloud-based platforms has increased 12% this year. Getting end-to-end visibility across supply chains, manufacturing centers, and distribution channels requires Cloud BI apps be integrated with cloud-based platforms and on-premises applications and data. Expect to see this accelerate in 2019 as Cloud BI apps become more pervasive across Marketing & Sales and Executive Management, in addition to Operations including supply chain management and manufacturing where real-time shop floor monitoring is growing rapidly.

  • Retail/Wholesale, Business Services, Education, and Financial Services & Insurance industries are most interested in Google Analytics connectors to obtain data for their Cloud BI apps. Respondents from Technology industries prioritize Salesforce integration and connectors above all others. Education respondents are most interested in MySQL and Google Drive integration and connectors. Manufacturers are most interested in connectors to Google AdWords and SurveyMonkey. Healthcare industry respondents prioritize SAP Cloud BI services and are also interested in ServiceNow connectors.

Data Scientist Is The Best Job In America According To Glassdoor

  • Data Scientist has been named the best job in America for three years running, with a median base salary of $110,000 and 4,524 job openings.
  • DevOps Engineer is the second-best job in 2018, paying a median base salary of $105,000 and 3,369 job openings.
  • There are 29,187 Software Engineering jobs available today, making it the role with the most openings on Glassdoor according to the study.

These and many other fascinating insights are from Glassdoor’s 50 Best Jobs In America For 2018. The Glassdoor report is viewable online here. Glassdoor’s annual report highlights the 50 best jobs based on each job’s overall Glassdoor Job Score. The Glassdoor Job Score is determined by weighing three key factors equally: earning potential based on median annual base salary, job satisfaction rating, and the number of job openings. Glassdoor’s 2018 report lists jobs that excel across all three dimensions of their Job Score metric. For an excellent overview of the study by Karsten Strauss of Forbes, please see his post, The Best Jobs To Apply For In 2018.
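
Glassdoor publishes the three factors and the equal weighting but not the exact normalization, so the sketch below shows just one plausible way to combine them on a 0–5 scale. The scaling caps and the 4.2 satisfaction rating are assumptions for illustration, not Glassdoor’s published method.

```python
# Sketch of an equally weighted job score built from the three factors Glassdoor
# names: median base salary, job satisfaction rating, and number of openings.
# The normalization caps below are assumptions, not Glassdoor's actual formula.

def job_score(median_salary: float, satisfaction: float, openings: int,
              salary_cap: float = 150_000.0, openings_cap: int = 30_000) -> float:
    """Average three factors, each rescaled to a 0-5 range."""
    salary_component = min(median_salary / salary_cap, 1.0) * 5
    satisfaction_component = satisfaction  # already reported on a 0-5 scale
    openings_component = min(openings / openings_cap, 1.0) * 5
    return round((salary_component + satisfaction_component + openings_component) / 3, 1)

# Data Scientist figures cited above: $110,000 median base salary and 4,524 openings;
# the 4.2 satisfaction rating is a placeholder.
print(job_score(110_000, 4.2, 4_524))
```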

LinkedIn’s 2017 U.S. Emerging Jobs Report found that there are 9.8 times more Machine Learning Engineers working today than five years ago with 1,829 open positions listed on their site as of last month. Data science and machine learning are generating more jobs than candidates right now, making these two areas the fastest growing tech employment areas today.

Key takeaways from the study include the following:

  • Six analytics and data science jobs are included in Glassdoor’s 50 best jobs In America for 2018. These include Data Scientist, Analytics Manager, Database Administrator, Data Engineer, Data Analyst and Business Intelligence Developer. The complete list of the top 50 jobs is provided below with the analytics and data science jobs highlighted along with software engineering, which has a record 29,817 open jobs today:

  • Median base salary of the 50 best jobs in America is $91,000 with the average salary of the six analytics and data science jobs being $94,167.
  • Across all six analytics and data science jobs there are 16,702 openings as of today according to Glassdoor.
  • Tech jobs make up 20 of Glassdoor’s 50 Best Jobs in America for 2018, up from 14 jobs in 2017.

Source: Glassdoor Reveals the 50 Best Jobs in America for 2018

53% Of Companies Are Adopting Big Data Analytics

  • Big data adoption reached 53% in 2017 for all companies interviewed, up from 17% in 2015, with telecom and financial services leading early adopters.
  • Reporting, dashboards, advanced visualization, end-user “self-service,” and data warehousing are the top five technologies and initiatives strategic to business intelligence.
  • Data warehouse optimization remains the top use case for big data, followed by customer/social analysis and predictive maintenance.
  • Among big data distributions, Cloudera is the most popular, followed by Hortonworks, MapR, and Amazon EMR.

These and many other insights are from Dresner Advisory Services’ insightful 2017 Big Data Analytics Market Study (94 pp., PDF, client access required), which is part of their Wisdom of Crowds® series of research. This 3rd annual report examines end-user trends and intentions surrounding big data analytics, defined as systems that enable end-user access to and analysis of data contained and managed within the Hadoop ecosystem. The 2017 Big Data Analytics Market Study represents a cross-section of data that spans geographies, functions, organization sizes, and vertical industries. Please see page 10 of the study for additional details regarding the methodology.

“Across the three years of our comprehensive study of big data analytics, we see a significant increase in uptake in usage and a large drop of those with no plans to adopt,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “In 2017, IT has emerged as the most typical adopter of big data, although all departments – including finance – are considering future use. This is an indication that big data is becoming less an experimental endeavor and more of a practical pursuit within organizations.”

Key takeaways include the following:

  • Reporting, dashboards, advanced visualization, end-user “self-service,” and data warehousing are the top five technologies and initiatives strategic to business intelligence. Big Data ranks 20th across the 33 key technologies Dresner Advisory Services currently tracks. Big Data Analytics is of greater strategic importance than the Internet of Things (IoT), natural language analytics, cognitive Business Intelligence (BI), and location intelligence.

  • 53% of companies are using big data analytics today, up from 17% in 2015, with the Telecom and Financial Services industries fueling the fastest adoption. Telecom and financial services are the most active early adopters, with Technology and Healthcare the third and fourth industries by big data analytics adoption. Education has the lowest adoption as 2017 comes to a close, with the majority of institutions in that vertical saying they are evaluating big data analytics for the future. North America (55%) narrowly leads EMEA (53%) in current levels of big data analytics adoption. Asia-Pacific respondents report 44% current adoption and are most likely to say they “may use big data in the future.”

  • Data warehouse optimization is considered the most important big data analytics use case in 2017, followed by customer/social analysis and predictive maintenance. Data warehouse optimization is considered critical or very important by 70% of all respondents. It’s interesting to note and ironic that the Internet of Things (IoT) is among the lowest priority use cases for big data analytics today.

  • Big data analytics use cases vary significantly by industry, with data warehouse optimization dominating Financial Services and Healthcare, while customer/social analysis is the leading use case in Technology-based companies. Fraud detection use cases also dominate Financial Services and Telecommunications. Using big data for clickstream analytics is most popular in Financial Services.

  • Spark, MapReduce, and Yarn are the three most popular software frameworks today. Over 30% of respondents consider Spark critical to their big data analytics strategies. MapReduce and Yarn are “critical” to more than 20 percent of respondents.

  • The big data access methods most preferred by respondents include Spark SQL, Hive, HDFS, and Amazon S3. 73% of the respondents consider Spark SQL critical to their analytics strategies. Over 30% of respondents consider Hive and HDFS critical as well. Amazon S3 is critical to one in five respondents for managing big data access. The following graphic shows the distribution of big data access methods.

  • Machine learning continues to gain more industry support and investment plans, with Spark Machine Learning Library (MLlib) adoption projected to grow by 60% in the next 12 months. In the next 24 months, MLlib will dominate machine learning according to the survey results. MLlib is accessible from the sparklyr R package and many others, which continues to fuel its growth. The following graphic compares projected two-year adoption rates by machine learning libraries and frameworks. A minimal PySpark sketch of this stack follows below.
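
To make the stack in the last two takeaways concrete, the sketch below reads data from Amazon S3 with Spark SQL and fits a model with an MLlib pipeline in PySpark. The bucket path, column names, and model choice are placeholders for illustration, not details from the Dresner study.

```python
# Minimal PySpark sketch: Spark SQL over data stored in Amazon S3, feeding an
# MLlib pipeline. Paths, columns, and the model are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("big-data-analytics-sketch").getOrCreate()

# Spark SQL access to data in S3 (Hive tables or HDFS paths work the same way).
spark.read.parquet("s3a://example-bucket/customer-events/").createOrReplaceTempView("events")
training_df = spark.sql("""
    SELECT spend, visits, tenure_months, CAST(churned AS DOUBLE) AS label
    FROM events
""")

# MLlib pipeline: assemble numeric columns into a feature vector, then fit a classifier.
assembler = VectorAssembler(inputCols=["spend", "visits", "tenure_months"],
                            outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(training_df)

model.transform(training_df).select("label", "prediction").show(5)
```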

Gartner’s Hype Cycle for Emerging Technologies, 2017 Adds 5G, Edge Computing For First Time

  • Gartner added eight new technologies to the Hype Cycle this year including 5G, Artificial General Intelligence, Deep Learning, Edge Computing, Serverless PaaS.
  • Virtual Personal Assistants, Personal Analytics, Data Broker PaaS (dbrPaaS) are no longer included in the Hype Cycle for Emerging Technologies.

The Hype Cycle for Emerging Technologies, 2017 provides insights gained from evaluations of more than 2,000 technologies the research and advisory firm tracks. From this large base of technologies, those that show the most potential for delivering a competitive advantage over the next five to 10 years are included in the Hype Cycle.

The eight technologies added to the Hype Cycle this year include 5G, Artificial General Intelligence, Deep Learning, Deep Reinforcement Learning, Digital Twin, Edge Computing, Serverless PaaS and Cognitive Computing. Ten technologies not included in the hype cycle for 2017 include 802.11ax, Affective Computing, Context Brokering, Gesture Control Devices, Data Broker PaaS (dbrPaaS), Micro Data Centers, Natural-Language Question Answering, Personal Analytics, Smart Data Discovery and Virtual Personal Assistants.

The three most dominant trends are Artificial Intelligence (AI) Everywhere, Transparently Immersive Experiences, and Digital Platforms. Gartner believes that key platform-enabling technologies are 5G, Digital Twin, Edge Computing, Blockchain, IoT Platforms, Neuromorphic Hardware, Quantum Computing, Serverless PaaS, and Software-Defined Security.

Key takeaways from this year’s Hype Cycle include the following:

  • Heavy R&D spending from Amazon, Apple, Baidu, Google, IBM, Microsoft, and Facebook is fueling a race for Deep Learning and Machine Learning patents today and will accelerate in the future – The race is on for Intellectual Property (IP) in deep learning and machine learning today. The success of Amazon Alexa, Apple Siri, Google’s Google Now, Microsoft’s Cortana, and others is making this area the top priority for R&D investment by these companies today. Gartner predicts deep-learning applications and tools will be a standard component in 80% of data scientists’ toolboxes by 2018. Amazon Machine Learning is available on Amazon Web Services today, accessible here. Apple has also launched a Machine Learning Journal. Baidu Research provides a site full of useful information on their ongoing research and development as well. Google Research is one of the most comprehensive of all, with a wealth of publications and research results. IBM’s AI and Cognitive Computing site can be found here. The Facebook Research site provides a wealth of information on 11 core technologies their R&D team is working on right now. Many of these sites also list open positions on their R&D teams.
  • 5G adoption in the coming decade will bring significant gains for security, scalability, and speed of global cellular networks – Gartner predicts that by 2020, 3% of network-based mobile communications service providers (CSPs) will launch 5G networks commercially. The Hype Cycle report mentions that from 2018 through 2022 organizations will most often utilize 5G to support IoT communications, high definition video and fixed wireless access. AT&T, NTT Docomo, Sprint USA, Telstra, T-Mobile, and Verizon have all announced plans to launch 5G services this year and next.
  • Artificial General Intelligence is going to become pervasive during the next decade, becoming the foundation of AI as a Service – Gartner predicts that AI as a Service will be the enabling core technology that leads to the convergence of AI Everywhere, Transparently Immersive Experiences and Digital Platforms. The research firm is also predicting 4D Printing, Autonomous Vehicles, Brain-Computer Interfaces, Human Augmentation, Quantum Computing, Smart Dust and Volumetric Displays will reach mainstream adoption.

Sources:

Gartner Identifies Three Megatrends That Will Drive Digital Business Into the Next Decade

Gartner Hype Cycle for Emerging Technologies, 2017 (client access required)

Artificial Intelligence Will Enable 38% Profit Gains By 2035


  • By 2035 AI technologies have the potential to increase productivity 40% or more.
  • AI will increase economic growth an average of 1.7% across 16 industries by 2035.
  • Information and Communication, Manufacturing and Financial Services will be the top three industries that gain economic growth in 2035 from AI’s benefits.
  • AI will have the most positive effect on Education, Accommodation and Food Services and Construction industry profitability in 2035.

Today Accenture Research and Frontier Economics published How AI Boosts Industry Profits and Innovation. The report is downloadable here (28 pp., PDF, no opt-in). The research compares the economic growth rates of 16 industries, projecting the impact of Artificial Intelligence (AI) on global economic growth through 2035. Using Gross Value Added (GVA) as a close approximation of Gross Domestic Product (GDP), the study found that the more integrated AI is into economic processes, the greater the potential for economic growth. One of the report’s noteworthy findings is that AI has the potential to increase economic growth rates by a weighted average of 1.7% across all industries through 2035. Information and Communication (4.8%), Manufacturing (4.4%), and Financial Services (4.3%) are the three sectors that will see the highest annual GVA growth rates driven by AI in 2035. The bottom line is that AI has the potential to boost profitability an average of 38% by 2035 and lead to an economic boost of $14T across 16 industries in 12 economies by 2035.

Key takeaways from the study include the following:

  • AI will increase economic growth by an average of 1.7% across 16 industries by 2035 with Information and Communication, manufacturing and financial services leading all industries. Accenture Research found that the Information and Communication industry has the greatest potential for economic growth from AI. Integrating AI into legacy information and communications systems will deliver significant cost, time and process-related savings quickly. Accenture predicts the time, cost and labor savings will generate up to $4.7T in GVA value in 2035. High growth areas within this industry are cloud, network, and systems security including defining enterprise-wide cloud security strategies.


  • AI will most increase profitability in the Education, Accommodation and Food Services, and Construction industries in 2035. Personalized learning programs and the automation of mundane, routine tasks that free up college, university, and trade school instructors to teach new learning frameworks will accelerate profitability in the education industry through 2035. Accommodation & Food Services and Construction are industries with manually-intensive, often isolated processes that will benefit from the increased insights and contextual intelligence from AI throughout the forecast period.


  • Manufacturing’s adoption of the Industrial Internet of Things (IIoT), smart factories, and comparable initiatives are powerful catalysts driving AI adoption. Based on the proliferation of Industrial Internet of Things (IIoT) devices and the networks and terabytes of data they generate, Accenture predicts AI will contribute an additional $3.76T in GVA to manufacturing by 2035. Supply chain management, forecasting, inventory optimization, and production scheduling are all areas where AI can make immediate contributions to this industry’s profits and long-term economic growth.


  • Financial Services’ greatest gains from AI will come from automating and reducing the errors in mundane, manually-intensive tasks, including credit scoring and first-level customer inquiries. Accenture forecasts financial services will gain $1.2T in additional GVA in 2035 from AI. Follow-on areas of automation in Financial Services include automating market research queries through intelligent bots, and scoring and reviewing mortgages.


  • By 2035 AI technologies could increase labor productivity 40% or more, doubling economic growth in 12 developed nations. Accenture finds that AI’s immediate impact on profitability comes from improving individual efficiency and productivity. The economies of the U.S. and Finland are projected to see the greatest economic gains from AI through 2035, with each attaining 2% higher GVA growth. The following graphic compares the 12 nations included in the first phase of the research.


Sources:

Machine Learning Is The New Proving Ground For Competitive Advantage

  • 50% of organizations are planning to use machine learning to better understand customers in 2017.
  • 48% are planning to use machine learning to gain greater competitive advantage.
  • Top future applications of machine learning include automated agents/bots (42%), predictive planning (41%), sales & marketing targeting (37%), and smart assistants (37%).

These and many other insights are from a recent survey completed by MIT Technology Review Custom and Google Cloud, Machine Learning: The New Proving Ground for Competitive Advantage (PDF, no opt-in, 10 pp.). Three hundred and seventy-five qualified respondents participated in the study, representing a variety of industries, with the largest share coming from technology-related organizations (43%). Business services (13%) and financial services (10%) respondents are also included in the study. Please see page 2 of the study for additional details on the methodology.

Key insights include the following:

  • 50% of those adopting machine learning are seeking more extensive data analysis and insights into how they can improve their core businesses. 46% are seeking greater competitive advantage, and 45% are looking for faster data analysis and speed of insight. 44% are looking at how they can use machine learning to gain enhanced R&D capabilities leading to next-generation products.
[Figure: If your organization is currently using ML, what are you seeking to gain?]

  • In organizations now using machine learning, 45% have gained more extensive data analysis and insights. Just over a third (35%) have attained faster data analysis and increased speed of insight, in addition to enhancing R&D capabilities for next-generation products. The following graphic compares the benefits organizations that have adopted machine learning have gained. One of the primary factors enabling machine learning’s full potential is service-oriented frameworks that are synchronous by design, consuming data in real-time without having to move it. enosiX is quickly emerging as a leader in this area, specializing in synchronous real-time Salesforce and SAP integration that enables companies to gain greater insights and intelligence, and deliver measurable results.
[Figure: If your organization is currently using machine learning, what have you actually gained?]

  • 26% of organizations adopting machine learning are committing more than 15% of their budgets to initiatives in this area. 79% of all organizations interviewed are investing in machine learning initiatives today. The following graphic shows the distribution of IT budgets allocated to machine learning during the study’s timeframe of late 2016 and 2017 planning.
[Figure: What part of your IT budget for 2017 is earmarked for machine learning?]

  • Half of the organizations (50%) are planning to use machine learning to better understand customers in 2017. 48% are adopting machine learning to gain a greater competitive advantage, and 45% are looking to gain more extensive data analysis and data insights. The following graphic compares the benefits organizations adopting machine learning are seeking now.
[Figure: If your organization is planning to use machine learning, what benefits are you seeking?]

  • Natural language processing (NLP) (49%), text classification and mining (47%), emotion/behavior analysis (47%), and image recognition, classification, and tagging (43%) are the top four projects where machine learning is in use today. Additional projects now underway include recommendations (42%), personalization (41%), data security (40%), risk analysis (41%), online search (41%), and localization and mapping (39%). Top future uses of machine learning include automated agents/bots (42%), predictive planning (41%), sales & marketing targeting (37%), and smart assistants (37%). A minimal text-classification sketch follows this list.
  • 60% of respondents have already implemented a machine learning strategy and committed to ongoing investment in initiatives. 18% plan to implement a machine learning strategy in the next 12 to 24 months. Of the 60% of respondent companies that have implemented machine learning initiatives, 33% are in the early stages of their strategies, testing use cases. 28% consider their machine learning strategies mature, with between one and five use cases or initiatives ongoing today.
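
Text classification and mining is one of the most common projects in the survey, so the sketch below shows a minimal version of that task using scikit-learn. The support-ticket examples and labels are toy placeholders, not data from the study.

```python
# Minimal text-classification sketch (one of the survey's most common ML projects).
# The tickets and labels below are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "cannot log in to my account",
    "billing statement shows a duplicate charge",
    "password reset email never arrived",
    "refund has not appeared on my card",
]
labels = ["access", "billing", "access", "billing"]

# TF-IDF features feeding a simple linear classifier.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(tickets, labels)

print(classifier.predict(["duplicate charge on my statement"]))  # expected: ['billing']
```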

Business Intelligence And Analytics In The Cloud, 2017

  • 78% are planning to increase the use of cloud for BI and data management in the next twelve months.
  • 46% of organizations prefer public cloud platforms for cloud BI, analytics and data management deployments.
  • Cloud BI adoption increased in respondent companies from 29% to 43% from 2013 to 2016.
  • Almost half of organizations using cloud BI (46%) use a public cloud for BI and data management compared to less than a third (30%) for hybrid cloud and 24% for private cloud.

These and many other insights are from the BARC Research and Eckerson Group Study, BI and Data Management in the Cloud: Issues and Trends published January 2017 (39 pp., PDF, no opt-in). Business Application Research Center (BARC) is a research and consulting firm that concentrates on enterprise software including business intelligence (BI), analytics and data management. Eckerson Group is a research and consulting firm focused on serving the needs of business intelligence (BI) and analytic leaders in Fortune 2000 organizations worldwide. The study is based on interviews completed in September and October 2016. 370 respondents participated in the survey globally. Given the size of the sample, the results aren’t representative of the global BI and analytics user base. The study’s results provide an interesting glimpse into analytics and BI adoption today, however. For a description of the methodology, please see page 31 of the study.

Key insights from the study include the following:

  • Public cloud is the most preferred deployment platform for cloud BI and analytics, and the larger the organization, the more likely it is to use private clouds. 46% of organizations selected public cloud platforms as their preferred infrastructure for supporting their BI, analytics, and data management initiatives in 2016. 30% are relying on a hybrid cloud platform and 24% on private clouds. With public cloud platforms becoming more commonplace in BI and analytics deployments, the need for greater PaaS- and IaaS-level orchestration becomes a priority. The larger the organization, the more likely it is to be using private clouds (33%). Companies with between 250 and 2,500 employees are the least likely to be using private clouds (16%).

[Figure: Preferred cloud deployment platforms for BI, analytics, and data management]

  • Dashboard-based reporting (76%), ad-hoc analysis and exploration (57%) and dashboard authoring (55%) are the top three Cloud BI use cases. Respondents are most interested in adding advanced and predictive analytics (53%), operational planning and forecasting (44%), strategic planning and simulation (44%) in the next year. The following graphic compares primary use cases and planned investments in the next twelve months. SelectHub has created a useful Business Intelligence Tools Comparison here that provides insights into this area.

[Figure: Cloud BI use cases]

  • Power users dominate the use of cloud BI and analytics solutions, driving more complex use cases that include ad-hoc analysis (57%) and advanced report and dashboard creation (55%). Casual users make up 20% of all cloud BI and analytics users, with their most common use being reporting and dashboards (76%). Customers and suppliers are an emerging group of cloud BI and analytics users as more respondent companies create self-service web-based apps to streamline external reporting.

[Figure: Cloud BI and analytics usage by power users and casual users]

  • Data integration between cloud applications/databases (51%) and providing data warehouses and data marts (50%) are the two most common data management strategies in use to support BI and analytics solutions today. Respondent organizations are using the cloud to integrate cloud applications with each other and with on-premises applications (46%). The study also found that as more organizations move to the cloud, there’s a corresponding need to support hybrid cloud architectures. Cloud-based data warehouses are primarily being built to support net new applications versus existing on-premises apps. Data integration is essential for the ongoing operations of cloud-based and on-premises ERP systems. A useful comparison of ERP systems can be found here, and a minimal integration sketch follows below.

[Figure: Data management strategies supporting cloud BI and analytics]
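
The sketch below shows the simplest form of the integration pattern described above: pulling records from a cloud application's REST API and loading them into a cloud data warehouse table. The endpoint, columns, and connection string are hypothetical placeholders, not tools named in the study.

```python
# Minimal sketch of cloud-to-warehouse data integration. The API endpoint and
# warehouse connection string are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Extract: pull records from a (hypothetical) cloud application's REST API.
response = requests.get("https://api.example-crm.com/v1/opportunities", timeout=30)
response.raise_for_status()
records = pd.DataFrame(response.json())

# Transform: keep only the columns the BI layer needs.
records = records[["id", "account", "stage", "amount", "close_date"]]

# Load: append into a warehouse table that dashboards and ad-hoc queries read from.
engine = create_engine("postgresql+psycopg2://analytics:secret@warehouse.example.com/bi")
records.to_sql("opportunities", engine, if_exists="append", index=False)
```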

  • Data integration between on-premises and cloud applications dominates use cases across all company sizes, with 48% of enterprises leading in adoption. Enterprises are also prioritizing providing data warehouses and data marts (48%), the pre-processing of data (38%), and data integration between cloud applications and databases (38%). The smaller a company is, the more critical data integration becomes: 63% of small companies with fewer than 250 employees are prioritizing data integration between cloud applications and databases.

[Figure: Cloud data management use cases by company size]

  • Adoption of tools for data exploration (visual discovery) grew the fastest in the last three years, increasing from 20% in 2013 to 49% in 2016. BI tools increased slightly from 55% to 62%, and BI servers dropped from 56% to 51%. Approximately one in five respondent organizations (22%) added analytical applications in 2016.

[Figure: Growth in BI tool adoption, 2013–2016]

  • The main reasons for adopting cloud BI and analytics differ by size of company, with cost (57%) being the most important for mid-sized businesses with between 250 and 2.5K employees. Consistent with previous studies, small companies’ main reasons for adopting cloud BI and analytics include flexibility (46%), reduced maintenance of hardware and software (43%), and cost (38%). Enterprises with more than 2.5K employees are adopting cloud BI and analytics for greater scalability (48%), cost (40%), and reduced maintenance of hardware and software (38%). The following graphic compares the most important reason for adopting cloud BI, analytics, and data management by the size of the company.

[Figure: Most important reason for adopting cloud BI, analytics, and data management, by company size]

McKinsey’s 2016 Analytics Study Defines The Future Of Machine Learning

  • U.S. retailers that have adopted data and analytics in their supply chain operations have seen up to a 19% increase in operating margin over the last five years.
  • Design-to-value, supply chain management and after-sales support are three areas where analytics are making a financial contribution in manufacturing.
  • 40% of all the potential value associated with the Internet of Things requires interoperability between IoT systems.

These and many other insights are from the McKinsey Global Institute’s study The Age of Analytics: Competing In A Data-Driven World, published in collaboration with McKinsey Analytics this month. You can get a copy of the Executive Summary here (28 pp., free, no opt-in, PDF) and the full report (136 pp., free, no opt-in, PDF) here. Five years ago the McKinsey Global Institute (MGI) released Big Data: The Next Frontier For Innovation, Competition, and Productivity (156 pp., free, no opt-in, PDF), and in the years since, McKinsey has seen data science adoption and value accelerate, specifically in the areas of machine learning and deep learning. The study underscores how critical integration is for gaining greater value from data and analytics.

Key takeaways from the study include the following:

  • Location-based services and U.S. retail are showing the greatest progress capturing value from data and analytics. Location-based services are capturing up to 60% of the data and analytics value McKinsey predicted for them in its 2011 report. McKinsey sees growing opportunities for businesses to use geospatial data to track assets, teams, and customers across dispersed locations to generate new insights and improve efficiency. U.S. Retail is capturing up to 40%, and Manufacturing 30%. The following graphic compares the potential impact predicted in McKinsey’s 2011 study with the value captured by segment today, including a definition of the major barriers to adoption.

[Figure: Uneven progress in capturing value from data and analytics by segment]

  • Machine learning’s greatest potential across industries includes improving forecasting and predictive analytics. McKinsey analyzed the 120 use cases their research found as most significant in machine learning and then weighted them based on respondents’ mention of each. The result is a heat map of machine learning’s greatest potential impact across industries and use case types.  Please see the report for detailed scorecards of each industry’s use case ranked by impact and data richness.

[Figure: Machine learning's greatest potential impact across industries and use case types]

  • Machine learning’s potential to deliver real-time optimization across industries is just starting to evolve and will quickly accelerate in the next three years. McKinsey analyzed the data richness associated with each of the 300 machine learning use cases, defining this attribute as a combination of data volume and variety. Please see page 105 of the study for a thorough explanation of McKinsey’s definition of data volume and variety used in the context of this study. The result of evaluating machine learning’s data richness by industry is shown in the following heat map:

[Figure: Data richness as an enabler of machine learning by industry]

  • Enabling autonomous vehicles and personalizing advertising are two of the highest opportunity use cases for machine learning today. Additional use cases with high potential include optimizing pricing, routing, and scheduling based on real-time data in travel and logistics; predicting personalized health outcomes; and optimizing merchandising strategy in retail. McKinsey identified 120 potential use cases of machine learning in 12 industries and surveyed more than 600 industry experts on their potential impact. They found an extraordinary breadth of potential applications for machine learning. Each of the use cases was identified as being one of the top three in an industry by at least one expert in that industry. McKinsey plotted the top 120 use cases below, with the y-axis showing the volume of available data (encompassing its breadth and frequency) and the x-axis showing the potential impact, based on surveys of more than 600 industry experts. The size of each bubble reflects the diversity of the available data sources. A minimal sketch of this chart layout follows below.

[Figure: Top 120 machine learning use cases plotted by data volume, potential impact, and diversity of data sources]
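
The bubble-chart layout McKinsey describes (impact on the x-axis, data volume on the y-axis, bubble size for data-source diversity) can be sketched as below. All values are made-up placeholders, not McKinsey's survey results.

```python
# Sketch of the bubble-chart layout described above. Values are placeholders.
import matplotlib.pyplot as plt

use_cases = ["autonomous vehicles", "personalized advertising", "predictive maintenance"]
impact = [0.90, 0.85, 0.60]        # x-axis: relative potential impact
data_volume = [0.80, 0.90, 0.50]   # y-axis: breadth and frequency of available data
diversity = [900, 700, 400]        # bubble area ~ diversity of data sources

plt.scatter(impact, data_volume, s=diversity, alpha=0.5)
for label, x, y in zip(use_cases, impact, data_volume):
    plt.annotate(label, (x, y))
plt.xlabel("Potential impact")
plt.ylabel("Volume of available data")
plt.title("Machine learning use cases (placeholder values)")
plt.show()
```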

  • Designing an appropriate organizational structure to support data and analytics activities (45%), ensuring senior management involvement (42%), and designing an effective data architecture and technology infrastructure (36%) are the three most significant challenges to attaining data and analytics objectives. McKinsey found that the barriers break into three categories: strategy, leadership, and talent; organizational structure and processes; and technology infrastructure. Approximately half of executives across geographies and industries reported greater difficulty recruiting analytical talent than any other kind of talent. 40% say retention is also an issue.

[Figure: Barriers to analytics and machine learning adoption]

  • U.S. retailers that have adopted data and analytics in their supply chain operations have seen up to a 19% increase in operating margin over the last five years. Using data and analytics to improve merchandising, including pricing, assortment, and placement optimization, is leading to an additional 16% in operating margin improvement. The following table illustrates data and analytics’ contribution to U.S. retail operations by area.

[Figure: Data and analytics' contribution to U.S. retail operations by area]

  • Design-to-value, supply chain management, and after-sales support are three areas where analytics are making a financial contribution in manufacturing. McKinsey estimates that analytics have increased manufacturers’ gross margins by as much as 40% when used in design-to-value workflows and projects. Up to 15% of after-sales costs have been reduced through the use of analytics that include product sensor data analysis for after-sales service. There are several interesting companies to watch in this area, with two of the most innovative being Sight Machine and enosiX, the latter enabling real-time integration between SAP and Salesforce systems. The following graphic illustrates the estimated impact of analytics on manufacturing financial performance by area.

[Figure: Estimated impact of analytics on manufacturing financial performance by area]

Analytics, Data Storage Will Lead Cloud Adoption In 2017

  • U.S.-based organizations are budgeting $1.77M for cloud spending in 2017 compared to $1.30M for non-U.S. based organizations.
  • 10% of enterprises with over 1,000 employees are projecting they will spend $10M or more on cloud computing apps and platforms throughout this year.
  • Organizations are using multiple cloud models to meet their business’s needs, including private (62%), public (60%), and hybrid (26%).
  • By 2018 the typical IT department will have the minority of their apps and platforms (40%) residing in on-premise systems.

These and many other insights are from IDG’s Enterprise Cloud Computing Survey, 2016. You can find the 2016 Cloud Computing Executive Summary here and a presentation of the results here.  The study’s methodology is based on interviews with respondents who are reporting they are involved with cloud planning and management across their organizations. The sampling frame includes audiences across six IDG Enterprise brands (CIO, Computerworld, CSO, InfoWorld, ITworld and Network World) representing IT and security decision-makers across eight industries. The survey was fielded online with the objective of understanding organizational adoption, use-cases, and solution needs for cloud computing. A total of 925 respondents were interviewed to complete the study.

Key takeaways include the following:

  • The cloud is the new normal for enterprise apps, with 70% of all organizations having at least one app in the cloud today. 75% of enterprises with more than 1,000 employees have at least one app or platform running in the cloud today, leading all categories of adoption measured in the survey. 90% of all organizations today either have apps running in the cloud or are planning to use cloud apps within the next 12 months or the next 1 to 3 years. The cloud has won the enterprise and will continue to see the variety and breadth of apps adopted accelerate in 2017 and beyond.

[Figure: Use of cloud technology is continuously expanding]

 

  • Business/data analytics and data storage/data management (both 43%) are projected to lead cloud adoption in 2017 and beyond. 22% of organizations surveyed are predicting that business/data analytics will be the leading cloud application area they will migrate to in the next twelve months. 21% are predicting data storage/data management apps are a high priority area for their organizations’ cloud migration plans in 2017. Three of the market leaders in analytics are Tableau, QlikView and Microsoft Power BI. They are analyzed in this recent post from SelectHub, accessible here.

[Figure: Data storage and analytics are moving to the cloud]

 

  • 28% of organizations’ total IT budgets are dedicated to cloud computing next year. Of that, 45% is allocated to SaaS, 30% to IaaS, and 19% to PaaS. The average investment organizations will make in cloud computing next year is $1.62M, with enterprises over 1,000 employees projected to spend $3.03M. The average investment in cloud computing has remained essentially constant, with $1.62M invested in 2014, $1.56M in 2015, and $1.62M in 2016. 10% of enterprises with over 1,000 employees are projecting they will spend $10M or more on cloud computing apps and platforms throughout this year.

[Figure: Cloud computing budget allocation]

 

  • CIOs, IT architects and IT networking/management control cloud spending in the enterprise. In contrast, CEOs, CIOs, and CFOs are driving small and medium business (SMB) cloud spending this year. The following graphic compares how influential the following groups and individuals are in the cloud computing purchase process.

[Figure: Influence of roles on the cloud computing purchase process]

 

  • Just 46% of organizations are using Application Programming Interfaces (APIs) to integrate with databases, messaging systems, portals, or storage components. 40% are using them to create connections between the application layer of their cloud and the underlying IT infrastructure. The following graphic provides insights into how APIs are being used and which teams see the most value in them.

[Figure: How APIs are being used and which teams see the most value in them]

 

  • In 18 months the majority of organizations’ IT infrastructure will be cloud-based. IDG found that in 18 months more than a quarter (28%) of all organizations interviewed will be relying on private clouds as part of their IT infrastructure. Just over a fifth (22%) will have public cloud as part of their IT infrastructure, and 10% will be using hybrid clouds. By 2018 the typical IT department will have the minority of their apps and platforms (40%) residing in on-premise systems.

[Figure: IT infrastructure shifts to the cloud]

 

  • Concerns about where data is stored (43%), cloud security (41%), and vendor lock-in (21%) are the top three challenges organizations face when adopting public cloud technologies. Private and hybrid cloud adoption also faces the challenges of cloud security and vendor lock-in, and is further slowed by a lack of the right skill sets to manage cloud deployments and gain the maximum value from cloud investments.

[Figure: Top challenges of adopting cloud technologies]
