These 15 companies are the highest rated in their fields of expertise as defined by Computer Reseller News as of May 11, 2018.
Nine out of 10 employees of these top 15 companies would recommend the company they work for to a friend.
Forbes readers’ most common requests center on who the best companies are to work for in analytics, big data, data management, data science and machine learning. The latest Computer Reseller News’ 2018 Big Data 100 list of companies is used to complete the analysis as it is an impartial, independent list aggregated based on CRN’s analysis and perspectives of the market. Using the CRN list as a foundation, the following analysis captures the best companies in their respective areas today.
Using the 2018 Big Data 100 CRN list as a baseline, the following analysis compares the Glassdoor scores for the percentage of employees who would recommend the company to a friend and the percentage of employees who approve of the CEO. Twenty-five companies on the list have few (fewer than 15) or no Glassdoor reviews, so they are excluded from the rankings. Based on analysis of Glassdoor score patterns over the last four years, the fewer the reviews a company has, the more likely it is to show 100% scores for referrals and CEO approval. These companies, however, are included in the full data set available here. If the image below is not visible in your browser, you can view the rankings here.
The highest rated CEOs on Glassdoor as of May 11, 2018 include the following:
Data Scientist has been named the best job in America for three years running, with a median base salary of $110,000 and 4,524 job openings.
DevOps Engineer is the second-best job in 2018, with a median base salary of $105,000 and 3,369 job openings.
There are 29,187 Software Engineering jobs available today, making it the most frequently posted job on Glassdoor according to the study.
These and many other fascinating insights are from Glassdoor’s 50 Best Jobs In America For 2018. The Glassdoor Report is viewable online here. Glassdoor’s annual report highlights the 50 best jobs based on each job’s overall Glassdoor Job Score. The Glassdoor Job Score is determined by weighing three key factors equally: earning potential based on median annual base salary, job satisfaction rating, and the number of job openings. Glassdoor’s 2018 report lists jobs that excel across all three dimensions of its Job Score metric. For an excellent overview of the study by Karsten Strauss of Forbes, please see his post, The Best Jobs To Apply For In 2018.
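Glassdoor does not publish the exact normalization behind its Job Score, but the equal weighting of the three factors described above can be sketched as a simple composite. Everything in the sketch below other than the equal weighting itself — the 0–5 scaling and the salary and openings caps — is a hypothetical illustration, not Glassdoor’s actual formula:

```python
# Hypothetical sketch of an equal-weight job score in the spirit of
# Glassdoor's Job Score. The 0-5 scaling and the caps are illustrative
# assumptions; Glassdoor's real normalization is not public.

def job_score(salary, satisfaction, openings,
              salary_cap=150_000, openings_cap=30_000):
    """Equal-weight composite of earning potential, job satisfaction
    (already on Glassdoor's 0-5 scale), and number of openings."""
    salary_component = min(salary / salary_cap, 1.0) * 5      # earning potential
    openings_component = min(openings / openings_cap, 1.0) * 5  # demand
    return round((salary_component + satisfaction + openings_component) / 3, 2)

# Illustrative inputs: Data Scientist's $110,000 median base salary and
# 4,524 openings from the report, with an assumed 4.2 satisfaction rating.
print(job_score(110_000, 4.2, 4_524))
```

The key point the sketch captures is that no single factor dominates: a job with a high salary but few openings, or many openings but low satisfaction, scores lower than one strong on all three dimensions.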
LinkedIn’s 2017 U.S. Emerging Jobs Report found that there are 9.8 times more Machine Learning Engineers working today than five years ago with 1,829 open positions listed on their site as of last month. Data science and machine learning are generating more jobs than candidates right now, making these two areas the fastest growing tech employment areas today.
Key takeaways from the study include the following:
Six analytics and data science jobs are included in Glassdoor’s 50 Best Jobs in America for 2018. These include Data Scientist, Analytics Manager, Database Administrator, Data Engineer, Data Analyst and Business Intelligence Developer. The complete list of the top 50 jobs is provided below with the analytics and data science jobs highlighted along with software engineering, which has a record 29,187 open jobs today:
The median base salary of the 50 best jobs in America is $91,000, with the average salary of the six analytics and data science jobs being $94,167.
Across all six analytics and data science jobs there are 16,702 openings as of today according to Glassdoor.
Tech jobs make up 20 of Glassdoor’s 50 Best Jobs in America for 2018, up from 14 jobs in 2017.
6.2M developers are working on IoT applications, an increase of 34% since last year.
Over 50% of the developers working on IoT applications are writing software that utilizes sensors in some capacity.
4M enterprise developers play decision-making roles when it comes to selecting organizational IT development resources. Another 5.2M hold decision-making authority for selecting IT deployment resources.
5.4M developers (26% of all developers globally) are using the cloud as a development environment today.
The APAC region leads the world with approximately 7.4M developers today, followed by EMEA with 7.2M, North America with 4.4M and Latin America with 1.9M.
These and many other fascinating insights are from the Evans Data Corporation Global Developer Population and Demographic Study 2016 (PDF, client access) published earlier this week. The methodology Evans Data has created to produce this report is the most comprehensive developed for aggregating, analyzing and predicting developer populations globally. The study combines Evans Data’s proprietary global developer population modeling with the current results of their semi-annual global developer survey.
Key takeaways from the study include the following:
6.1M developers (29% of all developers globally) are involved in a Big Data and Advanced Analytics project today. An additional 25% of developers, or 5.3M, will begin Big Data and Advanced Analytics projects within the next six months, and 13%, or 2.6M, of all developers globally will start Big Data and Advanced Analytics projects within the next 7 to 12 months. The following graphic provides an overview of the involvement of 21M developers in Big Data and Advanced Analytics projects today. Please click on the image to expand for easier viewing.
5.4M developers (26% of all developers globally) are using the cloud as a development environment today. Developers creating new apps in the cloud have increased 375% since Evans Data began measuring cloud-based development in 2009, when slightly more than 1.2M developers were using the cloud as their development platform. 4.5M developers (21% of all global developers) plan to begin app development on cloud platforms in the next six months, and 3.9M (18% of all global developers) plan to start development on the cloud in 7 to 12 months. Please click on the image to expand for easier viewing.
1.8M developers in APAC (24% of all developers in the region) are currently developing on cloud platforms. 29% of APAC developers are planning to start cloud-based development in six months, and 20% in 7 to 12 months. The following graphic compares the number of developers currently using the cloud as a development environment today and the number who plan to in the future. Please click on the image to expand for easier viewing.
34% of all Commercial Independent Software Vendors (ISVs) globally today (1.8M developers) are using the cloud as a development environment. An additional 1.4M are planning to begin cloud development in the next six months. 28% of developers globally creating apps in the cloud are from custom system integrators (SI) and value-added resellers (VARs). 23% or approximately 1.2M are from enterprises. The following graphic compares the percent of developers by developer segment who are currently creating new apps in cloud environments. Please click on the image to expand for easier viewing.
30% of developers (6.2M developers globally) are currently developing software for connected devices or the Internet of Things today, with an additional 26% planning to begin projects in 6 months. Evans Data found that this increased 34% over the last year. Also, 2.1M developers plan to begin development in this area within the next 7 to 12 months. The following graphic compares the number of developers globally by stage of development for creating software for connected devices or the Internet of Things. Please click on the image to expand for easier viewing.
41% of global developers creating connected device and IoT software today are from APAC, 27% are from North America, 24% are from EMEA and 7% are from Latin America. There are 6,072,048 developers currently working on connected device and IoT software globally. The following graphic provides an overview of the distribution of developers creating connected device and IoT software by region today. Please click on the image to expand for easier viewing.
34% of developers actively creating software for connected devices or the Internet of Things work for custom System Integrators (SI) and VARs today. ISVs are the next largest segment of developers working on IoT projects (30%) followed by enterprises (21%). The following graphic provides an overview of the global base of developers creating software for connected devices and IoT. Evans Data found there are 6.1M developers currently creating apps and solutions in this area alone. Please click on the image to expand for easier viewing.
Big Data & business analytics software worldwide revenues will grow from nearly $122B in 2015 to more than $187B in 2019, an increase of more than 50% over the five-year forecast period.
The market for prescriptive analytics software is estimated to grow from approximately $415M in 2014 to $1.1B in 2019, attaining a 22% CAGR.
By 2020, predictive and prescriptive analytics will attract 40% of enterprises’ net new investment in business intelligence and analytics.
Making enterprises more customer-centric, sharpening focus on key initiatives that lead to entering new markets and creating new business models, and improving operational performance are three dominant factors driving analytics, Big Data, and business intelligence (BI) investments today. Unleashing the insights hidden in unstructured data is providing enterprises with the potential to compete and improve in areas they previously had limited visibility into. Examples of these areas include the complexity of B2B selling and service relationships, healthcare services, and maintenance, repair, and overhaul (MRO) of complex machinery. All organizations face the daunting task of integrating systems to enable greater process visibility. enosiX is taking a leadership role in this area, offering real-time integration between SAP and Salesforce systems, giving enterprises the opportunity to be more responsive to suppliers, resellers, partners and, most importantly, customers.
Presented below is a roundup of recent analytics and big data forecasts and market estimates:
The global big data market will grow from $18.3B in 2014 to $92.2B by 2026, representing a compound annual growth rate of 14.4 percent. Wikibon predicts significant growth in all four sub-segments of big data software through 2026. Data management (14% CAGR), core technologies such as Hadoop, Spark and streaming analytics (24% CAGR), databases (18% CAGR) and big data applications, analytics and tools (23% CAGR) are the four fastest growing sub-segments according to Wikibon. Source: Wikibon forecasts Big Data market to hit $92.2B by 2026.
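The forecast figures above can be sanity-checked with the standard compound annual growth rate formula. This is a minimal sketch using the Wikibon endpoints as inputs, not part of any cited methodology; the same formula also reproduces other CAGR figures quoted in this roundup:

```python
# Sanity-check the forecast arithmetic: the CAGR implied by growing
# from $18.3B in 2014 to $92.2B in 2026 (Wikibon's endpoints).

def cagr(start_value, end_value, years):
    """Compound annual growth rate, expressed as a percentage."""
    return ((end_value / start_value) ** (1 / years) - 1) * 100

print(round(cagr(18.3, 92.2, 2026 - 2014), 1))  # ≈ 14.4, matching Wikibon's figure
```

Note that the headline percentage is sensitive to the forecast window: the same endpoints over a shorter period would imply a much higher annual rate, which is worth keeping in mind when comparing CAGRs across the differently scoped forecasts below.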
The Total Data market is expected to nearly double in size, growing from $69.6B in revenue in 2015 to $132.3B in 2020. The specific market segments included in 451 Research’s analysis are operational databases, analytic databases, reporting and analytics, data management, performance management, event/stream processing, distributed data grid/cache, Hadoop, and search-based data platforms and analytics. Source: Total Data market expected to reach $132bn by 2020; 451 Research, June 14, 2016.
Improving customer relationships (55%) and making the business more data-focused (53%) are the top two business goals or objectives driving investments in data-driven initiatives today. 78% of enterprises agree that collection and analysis of Big Data have the potential to change fundamentally the way they do business over the next 1 to 3 years. Source: IDG Enterprise 2016 Data & Analytics Research, July 5, 2016.
Venture capital (VC) investment in Big Data accelerated quickly at the beginning of the year with DataDog ($94M), BloomReach ($56M), Qubole ($30M), PlaceIQ ($25M) and others receiving funding. Big Data startups received $6.64B in venture capital investment in 2015, 11% of total tech VC. M&A activity has remained moderate (FirstMark noted 35 acquisitions since their latest landscape was published last year). Source: Matt Turck’s blog post, Is Big Data Still a Thing? (The 2016 Big Data Landscape).
IDC forecasts global spending on cognitive systems will reach nearly $31.3 billion in 2019 with a five-year compound annual growth rate (CAGR) of 55%. More than 40% of all cognitive systems spending throughout the forecast will go to software, which includes cognitive applications (text and rich media analytics, tagging, searching, machine learning, categorization, clustering, hypothesis generation, question answering, visualization, filtering, alerting, and navigation) as well as cognitive software platforms, which enable the development of intelligent, advisory, and cognitively enabled solutions. Source: Worldwide Spending on Cognitive Systems Forecast to Soar to More Than $31 Billion in 2019, According to a New IDC Spending Guide.
The Big Data Analytics & Hadoop market accounted for $8.48B in 2015 and is expected to reach $99.31B by 2022, growing at a CAGR of 42.1% from 2015 to 2022. The rise of big data analytics and rapid growth in consumer data capture and taxonomy techniques are a few of the many factors fueling market growth. Source: Stratistics Market Research Consulting (PDF, opt-in, payment reqd).
The purpose of the index is to understand how business users perceive, plan for and utilize four key technologies: cloud, mobility, security and big data. Dell released the first wave of its results this week and will be publishing several additional chapters throughout 2016. You can download Chapter 1 of the study here (PDF, no opt-in, 18 pp.).
Key takeaways from the study include the following:
Orchestrating big data, cloud and mobility strategies leads to 53% greater growth than peers not adopting these technologies. Midmarket organizations adopting big data alone have the potential to grow 50% more than comparable organizations. Effective use of Bring Your Own Device (BYOD) mobility strategies has the potential to increase growth by 53% over laggards or late adopters.
73% of North American organizations believe the volume and complexity of their data requires big data analytics apps and tools. This is up from 54% in 2014, indicating midmarket organizations are concentrating on how to get more value from the massive data stores many have accumulated. This same group of organizations believes it is getting more value out of big data this year (69%) compared to last year (64%). Top outcomes of using big data include better targeting of marketing efforts (41%), optimization of ad spending (37%), and optimization of social media marketing (37%).
54% of an organization’s security budget is invested in security plans versus reacting to threats. Dell & TNS Research discovered that midmarket organizations both in North America and Western Europe are relying on security to enable new devices or drive competitive advantage. In North America, taking a more strategic approach to security has increased from 25% in 2014 to 35% today. In Western Europe, the percentage of companies taking a more strategic view of security has increased from 26% in 2014 to 30% this year.
IT infrastructure costs to support big data initiatives (29%) and costs related to securing the data (28%) are the two greatest barriers to big data adoption. For cloud adoption, costs and security are the two biggest barriers in midmarket organizations as is shown in the graphic below.
Cloud use by midmarket companies in France increased 12% in the last twelve months, leading all nations in the survey. Of the 11 countries surveyed, France had the greatest increase in cloud adoption within midmarket companies. French businesses increased their adoption of cloud applications and platforms from 70% in 2014 to 82% in 2015.
Bottom line: Big data is providing supplier networks with greater data accuracy, clarity, and insights, leading to more contextual intelligence shared across supply chains.
Forward-thinking manufacturers are orchestrating 80% or more of their supplier network activity outside their four walls, using big data and cloud-based technologies to get beyond the constraints of legacy Enterprise Resource Planning (ERP) and Supply Chain Management (SCM) systems. For manufacturers whose business models are based on rapid product lifecycles and speed, legacy ERP systems are a bottleneck. Designed for delivering order, shipment and transactional data, these systems aren’t capable of scaling to meet the challenges supply chains face today.
Choosing to compete on accuracy, speed and quality forces supplier networks to get to a level of contextual intelligence not possible with legacy ERP and SCM systems. While many companies today haven’t yet adopted big data into their supply chain operations, these ten factors taken together will be the catalyst that get many moving on their journey.
The ten ways big data is revolutionizing supply chain management include:
Enabling more complex supplier networks that focus on knowledge sharing and collaboration as the value-add over just completing transactions. Big data is revolutionizing how supplier networks form, grow, proliferate into new markets and mature over time. Transactions aren’t the only goal; creating knowledge-sharing networks is, based on the insights gained from big data analytics. The following graphic from Business Ecosystems Come Of Age (Deloitte University Press) (free, no opt-in) illustrates the progression of supply chains from networks to webs, where knowledge sharing becomes a priority.
Big data and advanced analytics are being integrated into optimization tools, demand forecasting, integrated business planning and supplier collaboration & risk analytics at a quickening pace. These are the top four supply chain capabilities that Deloitte found are currently in use, from their recent study, Supply Chain Talent of the Future: Findings from the 3rd Annual Supply Chain Survey (free, no opt-in). Control tower analytics and visualization are also on the roadmaps of supply chain teams currently running big data pilots.
64% of supply chain executives consider big data analytics a disruptive and important technology, setting the foundation for long-term change management in their organizations. SCM World’s latest Chief Supply Chain Officer Report provides a prioritization of the most disruptive technologies for supply chains as defined by the organizations’ members. The following graphic from the report provides insights into how senior supply chain executives are prioritizing big data analytics over other technologies.
Using geoanalytics based on big data to merge and optimize delivery networks. The Boston Consulting Group provides insights into how big data is being put to use in supply chain management in the article Making Big Data Work: Supply Chain Management (free, opt-in). One of the examples provided is how the merger of two delivery networks was orchestrated and optimized using geoanalytics. The following graphic is from the article. Combining geoanalytics and big data sets could drastically reduce cable TV tech wait times and drive up service accuracy, fixing one of the most well-known service challenges in that business.
Greater contextual intelligence of how supply chain tactics, strategies and operations are influencing financial objectives. Supply chain visibility often refers to being able to see multiple supplier layers deep into a supply network. It’s been my experience that being able to track financial outcomes of supply chain decisions back to financial objectives is attainable, and with big data app integration to financial systems, very effective in industries with rapid inventory turns. Source: Turn Big Data Into Big Visibility.
Traceability and recalls are by nature data-intensive, making big data’s contribution potentially significant. Big data has the potential to provide improved traceability performance and reduce the thousands of hours lost just trying to access, integrate and manage product databases that provide data on where products are in the field needing to be recalled or retrofitted.
Increasing supplier quality from supplier audit to inbound inspection and final assembly with big data. IBM has developed a quality early-warning system that detects and then defines a prioritization framework that isolates quality problems faster than more traditional methods, including Statistical Process Control (SPC). The early-warning system is deployed upstream of suppliers and extends out to products in the field.
Demand for Computer Systems Analysts with big data expertise increased 89.9% in the last twelve months and 85.4% for Computer and Information Research Scientists.
Demand for Python programming expertise increased 96.9% in big-data related positions in the last twelve months.
These and other key insights are from a recent analysis completed of big data hiring trends using WANTED Analytics, the leading provider of data analytics on the workplace. For purposes of this analysis, the term “big data” comprises the four skill sets of data analysis, data acquisition, data mining and data structures. The WANTED Analytics taxonomy references these skill sets when queries are made on the term “big data”.
The company currently maintains a database of more than one billion unique job listings and is collecting hiring trend data from more than 150 countries. WANTED Analytics has never been a client; they provided complimentary access after I requested a trial account. Many Forbes readers are interested in staying current on big data hiring trends, which led me to complete this analysis.
Key takeaways include the following:
Demand for big data expertise across a range of occupations saw significant growth over the last twelve months. There was a 123.6% jump in demand for Information Technology Project Managers with big data expertise, and an 89.9% increase for Computer Systems Analysts. The following table provides an overview of the distribution of open positions by occupation and the percentage growth in job demand over time.
The five leading industries with the most job openings requiring big data expertise include Professional, Scientific and Technical Services (27.14%), Information Technologies (18.89%), Manufacturing (12.35%), Retail Trade (9.62%) and Sustainability, Waste Management & Remediation Services (8.20%). The following graphic shows the distribution of open positions from September 1, 2014 to today, December 29, 2014:
The Hiring Scale is 76 for jobs that require big data skills with 12 candidates per job opening as of December 29, 2014. The higher the Hiring Scale score, the more difficult it is for employers to find the right applicants for open positions. Nationally an average job posting for an IT professional with cloud computing expertise is open just 47 days.
The median salary for professionals with big data expertise is $103,000 a year. Sample jobs in this category include Big Data Solution Architect, Linux Systems and Big Data Engineer, Big Data Platform Engineer, Lead Software Engineer, Big Data (Java, Hadoop, SQL) and others. The distribution of median salaries across all industries is shown below:
San Jose – Sunnyvale – Santa Clara, CA, San Francisco – Oakland – Fremont, CA, and Washington – Arlington – Alexandria, DC are the top three U.S. employment markets for big data related jobs as of today. Mapping the distribution of job volume, salary range, candidate supply, posting period and hiring scale by Metropolitan Statistical Area (MSA) or states and counties is supported by WANTED Analytics and shown in the following graphic. A summary of the top twenty employment markets is also shown following the map:
Cisco (NASDAQ:CSCO), IBM (NYSE: IBM) and Oracle (NYSE:ORCL) have the most open big data-related positions today. Cisco and its supplier, partner and support ecosystem companies have 3,613 big data-related positions available. The following table shows the top ten big data employers today, the distribution of jobs, and the number of new jobs added over the last year.
Python programming (96.9%), Linux expertise (76.6%) and Structured Query Language (SQL) (76%) are the three most in-demand skills in positions that mention big data as a requirement. The following table provides an overview of the top 10 most in-demand skills:
McKinsey & Company recently published How Big Data Can Improve Manufacturing which provides insightful analysis of how big data and advanced analytics can streamline biopharmaceutical, chemical and discrete manufacturing.
The article highlights how manufacturers in process-based industries are using advanced analytics to increase yields and reduce costs. Manufacturers have an abundance of operational and shop floor data that is being used for tracking today. The McKinsey article shows through several examples how big data and advanced analytics applications and platforms can deliver operational insights as well.
The following graphic from the article illustrates how big data and advanced analytics are streamlining manufacturing value chains by finding the core determinants of process performance, and then taking action to continually improve them:
Big Data’s Impact on Manufacturing Is Growing
In addition to the examples provided in the McKinsey article, there are ten ways big data is revolutionizing manufacturing:
Increasing the accuracy, quality and yield of biopharmaceutical production. It is common in biopharmaceutical production flows to monitor more than 200 variables to ensure the purity of the ingredients and that the substances being made stay in compliance. One of the many factors that makes biopharmaceutical production so challenging is that yields can vary from 50 to 100% for no immediately discernible reason. Using advanced analytics, a manufacturer was able to track the nine parameters that most explained yield variation. Based on this insight, they were able to increase the vaccine’s yield by 50%, worth between $5M and $10M in yearly savings for that single vaccine alone.
Accelerating the integration of IT, manufacturing and operational systems making the vision of Industrie 4.0 a reality. Industrie 4.0 is a German government initiative that promotes automation of the manufacturing industry with the goal of developing Smart Factories. Big data is already being used for optimizing production schedules based on supplier, customer, machine availability and cost constraints. Manufacturing value chains in highly regulated industries that rely on German suppliers and manufacturers are making rapid strides with Industrie 4.0 today. As this initiative serves as a catalyst to galvanize diverse multifunctional departments together, big data and advanced analytics will become critical to its success.
Better forecasts of product demand and production (46%), understanding plant performance across multiple metrics (45%) and providing service and support to customers faster (39%) are the top three areas big data can improve manufacturing performance. These findings are from a recent survey LNS Research and MESA International completed to see where big data is delivering the greatest manufacturing performance improvements today. You can find the original blog post here.
Integrating advanced analytics across the Six Sigma DMAIC (Define, Measure, Analyze, Improve and Control) framework to fuel continuous improvement. Getting greater insights into how each phase of a DMAIC-driven improvement program is working, and how the efforts made impact all other areas of manufacturing performance is nascent today. This area shows great potential to make production workflows more customer-driven than ever before.
Greater visibility into supplier quality levels, and greater accuracy in predicting supplier performance over time. Using big data and advanced analytics, manufacturers are able to view product quality and delivery accuracy in real-time, making trade-offs on which suppliers receive the most time-sensitive orders. Managing to quality metrics becomes the priority over measuring delivery schedule performance alone.
Measuring compliance and traceability to the machine level becomes possible. Using sensors on all machinery in a production center provides operations managers with immediate visibility into how each is operating. Having advanced analytics can also show quality, performance and training variances by each machine and its operators. This is invaluable in streamlining workflows in a production center, and is becoming increasingly commonplace.
Selling only the most profitable customized or build-to-order configurations of products that impact production the least. For many complex manufacturers, customized or build-to-order products deliver higher-than-average gross margins yet also cost exponentially more if production processes aren’t well planned. Using advanced analytics, manufacturers are discovering which of the myriad build-to-order configurations they can sell with minimal impact on existing production schedules, down to the machine-scheduling, staffing and shop-floor level.
Breaking quality management and compliance systems out of their silos and making them a corporate priority. It’s time for more manufacturers to take a more strategic view of quality and quit being satisfied with standalone, siloed quality management and compliance systems. The McKinsey article and articles listed at the end of this post provide many examples of how big data and analytics are providing insights into which parameters matter most to quality management and compliance. The majority of these parameters are corporate-wide, not just limited to quality management or compliance departments alone.
Quantifying how daily production impacts financial performance with visibility to the machine level. Big data and advanced analytics are delivering the missing link that can unify daily production activity with the financial performance of a manufacturer. By knowing at the machine level whether the factory floor is running efficiently, production planners and senior management can determine how best to scale operations. By unifying daily production with financial metrics, manufacturers have a greater chance of profitably scaling their operations.
Service becomes strategic and a contributor to customers’ goals by monitoring products and proactively providing preventative maintenance recommendations. Manufacturers are starting to look at the more complex products they produce as needing an operating system to manage the sensors onboard. These sensors report back activity and can send alerts for preventative maintenance. Big data and analytics will make the level of recommendations contextual for the first time so customers can get greater value. General Electric is doing this today with its jet engines and drilling platforms for example.
Additional sources of information on Big Data in Manufacturing:
The Rise of Industrial Big Data: Leveraging large time-series data sets to drive innovation, competitiveness and growth — capitalizing on the big data opportunity, GE Intelligent Platforms White Paper, April 2012. http://www.ge-ip.com/library/detail/13170
The five trends that serve as the foundation of this report include the increasing pervasiveness of software, affordable small devices, ubiquitous broadband connectivity, big-data analytics and cloud computing. BCG’s analysis illustrates how the majority of TMT companies that deliver the most value to shareholders are concentrating on the explosive growth of new markets, the rise of software-enabled digital metasystems, and for many, both.
The study is based on an analysis of 191 companies, 76 in the technology industry, 62 from media and 53 from telecom. To review the methodology of this study please see page 28 of the report.
Here are the key takeaways from this year’s BCG TMT Value Creators Report:
BCG is predicting 1B smartphones will be sold in 2013, the first year in which their sales will exceed those of feature phones. By 2018, there will be more than 5B "post-PC" products (tablets and smartphones) in circulation. There are nearly as many mobile connections in the world as people (6.8B), according to the United Nations' International Telecommunication Union (ITU).
27 terabytes of data are generated every second through the creation of video, images, social networks, and transactional and enterprise-based systems and networks. 90% of the data stored today didn't exist two years ago, and the annual data growth rate in future years is projected to be 40% to 60% over current levels, according to BCG's analysis.
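As a rough back-of-the-envelope check, the figures above can be combined in a few lines of Python. The 27 TB/second rate and the 40%-60% growth range come from the text; the five-year horizon and the decimal terabyte-to-exabyte conversion are illustrative assumptions:

```python
# Back-of-the-envelope arithmetic on BCG's data-volume figures.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 (ignoring leap years)

tb_per_second = 27
tb_per_year = tb_per_second * SECONDS_PER_YEAR
exabytes_per_year = tb_per_year / 1_000_000  # 1 EB = 1,000,000 TB (decimal units)

print(f"{tb_per_year:,} TB/year, roughly {exabytes_per_year:,.0f} EB/year")

# The projected 40%-60% annual growth compounds quickly over a few years:
for rate in (0.40, 0.60):
    five_year_multiple = (1 + rate) ** 5
    print(f"At {rate:.0%}/year, data volume grows ~{five_year_multiple:.1f}x in 5 years")
```

At the low end of BCG's range, volumes more than quintuple in five years; at the high end, they grow by an order of magnitude.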
The ascent of communications speeds is surpassing Moore's Law as a structural driver of growth. BCG completed the following analysis graphing the progression of microprocessor transistor count (Moore's Law) relative to Internet speed (bps), citing Butter's Law of Photonics, which states that the amount of data coming out of an optical fiber doubles every nine months. BCG states that these dynamics are democratizing information technology and will lead to the cloud computing industry (software and services) reaching nearly $250B in 2017.
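The gap between the two doubling rates compounds dramatically over time. A minimal sketch of the comparison, assuming an 18-month doubling period for Moore's Law (the nine-month figure for Butter's Law is from the text; Moore's Law is commonly stated as 18 to 24 months):

```python
# Compare compound growth under two doubling periods.
def growth_multiple(years: float, doubling_months: float) -> float:
    """Return the growth factor after `years`, given a doubling period in months."""
    return 2 ** (years * 12 / doubling_months)

decade_compute = growth_multiple(10, 18)    # Moore's Law, assumed ~18-month doubling
decade_bandwidth = growth_multiple(10, 9)   # Butter's Law, 9-month doubling

print(f"Transistor counts grow ~{decade_compute:,.0f}x in a decade")
print(f"Fiber throughput grows ~{decade_bandwidth:,.0f}x in a decade")
print(f"Bandwidth outpaces compute by ~{decade_bandwidth / decade_compute:,.0f}x")
```

Halving the doubling period doesn't double the growth; it squares it, which is why BCG treats bandwidth, not compute, as the new structural driver.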
BCG predicts that India will see a fivefold increase in digitally influenced spending, ascending from $30B in 2012 to $150B in 2016, among the fastest of all nations globally according to their study. India will also see the value of online purchases increase from $8B in 2012 to $50B in 2016.
3D printing is forecast to become a $3.1B market by 2016 and to have an economic impact of $550B in 2025, fueling rapid price reductions in 3D printers through 2017. BCG sees 3D printing, connected travel, genomics and smart grid technologies as central to their digital metasystem. The following graphic illustrates the key trends in each of these areas along with research findings from BCG and other sources.
Only 7% of customers are comfortable with their information being used outside of the purpose for which it was originally gathered.
BCG reports that mobile infrastructure investments in Europe have fallen 67% from 2004 to 2014. Less than 1% of mobile connections in Europe were 4G as of the end of 2012, compared to 11% in the U.S. and 28% in South Korea. European operators have also been challenged to monetize mobile data, as the following figures illustrate.
Big Data is attracting $19B in funding across five key areas according to BCG’s analysis. These include consumer data and marketing, enterprise data, analytical tools, vertical markets and data platforms. A graphical analysis of these investments is shown below.
Customers are quickly reinventing how they choose to learn about new products, keep current on existing ones, and stay loyal to those brands they most value. The best-run companies are all over this, orchestrating their IT strategies to be as responsive as possible.
The luxury of long technology evaluation cycles, introspective analysis of systems, and long deployment timeframes is giving way to rapid deployments and systems designed for accuracy and speed.
CIOs need to be just as strong at strategic planning and execution as they are at technology. Many are quickly prioritizing analytics, cloud and mobile strategies to stay in step with their rapidly changing customer bases. This is especially true for those companies with less than $1B in sales, as analytics, cloud computing and mobility can be combined to compete very effectively against their much bigger rivals.
What’s Driving CIOs – A Look At Technology Priorities
Gartner’s annual survey of CIOs includes 2,300 respondents located in 44 countries, competing in all major industries. As of the last annual survey, the three highest-rated priorities for investment from 2012 to 2015 were Analytics and Business Intelligence (BI), Mobile Technologies and Cloud Computing.
Source: From the Gartner Report Market Insight: Technology Opens Up Opportunities in SMB Vertical Markets September 6, 2012 by Christine Arcaris, Jeffrey Roster
How Industries Prioritize Analytics, Cloud and Mobile
When these priorities are analyzed across eight key industries, patterns emerge showing that the communications, media and services (CMS) and manufacturing industries have the highest immediate growth potential for mobility (next two years). In Big Data/BI, financial services is projected to be the fastest-developing industry; in cloud computing, CMS and government are.
In analyzing this and related data, a profile of early-adopter enterprises emerges. These are companies that are built on knowledge-intensive business models, have created and excel at running virtual organization structures, rely on mobility to connect with and build relationships with customers, and have deep analytics expertise. In short, their business models take the best of what mobility, Big Data/BI and cloud computing have to offer and align it to their strategic plans and programs. The following figure, Vertical Industry Growth by Technology Over the Next Five Years, shows the prioritization and relative growth by industry.
Source: From the Gartner Report Market Insight: Technology Opens Up Opportunities in SMB Vertical Markets September 6, 2012 by Christine Arcaris, Jeffrey Roster
How Mobility Could Emerge As the Trojan Horse of Enterprise Software
Bring Your Own Device (BYOD), the rapid ascent of enterprise application stores, and the high expectations customers have of continual mobile app usability and performance improvements are just three of many factors driving mobility growth.
Just as significant is the success many mid-tier companies are having in competing with their larger, more globally known rivals using mobile-based Customer Relationship Management (CRM), warranty management, service and spare parts procurement strategies. What smaller competitors lack in breadth they are more than making up for in speed and responsiveness. Gartner’s IT Market Clock for Enterprise Mobility, 2012 captures how mobility is changing the nature of competition.
Source: IT Market Clock for Enterprise Mobility, 2012 Published: 10 September 2012 Analyst(s): Monica Basso
Bottom Line – By excelling at the orchestration of analytics, cloud and mobile, enterprises can differentiate where it matters most – by delivering an excellent customer experience. Mobility can emerge as an enterprise Trojan Horse because it unleashes accuracy, precision and speed into customer-facing processes that larger, complacent competitors may have overlooked.