
Posts tagged ‘AWS’

10 Ways AI And Machine Learning Are Improving Endpoint Security

  • Gartner predicts $137.4B will be spent on Information Security and Risk Management in 2019, increasing to $175.5B in 2023, reaching a CAGR of 9.1%. Cloud Security, Data Security, and Infrastructure Protection are the fastest-growing areas of security spending through 2023.
  •  69% of enterprise executives believe artificial intelligence (AI) will be necessary to respond to cyberattacks with the majority of telecom companies (80%) saying they are counting on AI to help identify threats and thwart attacks according to Capgemini.
  •  Spending on AI-based cybersecurity systems and services reached $7.1B in 2018 and is predicted to reach $30.9B in 2025, attaining a CAGR of 23.4% in the forecast period according to Zion Market Research.

Traditional approaches to securing endpoints based on the hardware characteristics of a given device aren’t stopping breach attempts today. Bad actors are using AI and machine learning to launch sophisticated attacks that shorten the time it takes to compromise an endpoint and successfully breach systems. According to Ponemon, attackers need just seven minutes after compromising an endpoint to gain access to internal systems and be ready to exfiltrate data. The era of trusted and untrusted domains at the operating system level, and of “trust, but verify” approaches, is over. Security software and services spending is soaring as a result, as the market forecasts above show.

AI & Machine Learning Are Redefining Endpoint Security

AI and machine learning are proving to be effective technologies for battling increasingly automated, well-orchestrated cyberattacks and breach attempts. Attackers are combining AI, machine learning, bots, and new social engineering techniques to thwart endpoint security controls and gain access to enterprise systems with an intensity never seen before. It’s becoming so prevalent that Gartner predicts that more than 85% of successful attacks against modern enterprise user endpoints will exploit configuration and user errors by 2025. Cloud platforms are enabling AI and machine learning-based endpoint security control applications to be more adaptive to the proliferating types of endpoints and corresponding threats. The following are the top ten ways AI and machine learning are improving endpoint security:

  • Using machine learning to derive risk scores based on previous behavioral patterns, geolocation, time of login, and many other variables is proving to be effective at securing and controlling access to endpoints. Combining supervised and unsupervised machine learning to fine-tune risk scores in milliseconds is reducing fraud, thwarting breach attempts that rely on privileged access credentials, and securing every identity on an organization’s network. Supervised machine learning models rely on historical data to find patterns not discernable with rules or predictive analytics. Unsupervised machine learning excels at finding anomalies, interrelationships, and valid links between emerging factors and variables. Combining both unsupervised and supervised machine learning is proving to be very effective in spotting anomalous behavior and reducing or restricting access (a simplified scoring sketch appears after this list).
  • Mobile devices represent a unique challenge to achieving endpoint security control, one that machine learning combined with Zero Trust is proving to be integral to solving. Cybercriminals prefer to steal a mobile device, its passwords, and privileged access credentials rather than hack into an organization. That’s because passwords are the quickest onramp they have to the valuable data they want to exfiltrate and sell. Abandoning passwords for new techniques, including MobileIron’s zero sign-on approach, shows potential for thwarting cybercriminals from gaining access while hardening endpoint security control. Securing mobile devices using a zero-trust platform built on a foundation of unified endpoint management (UEM) capabilities enables enterprises to scale zero sign-on for managed and unmanaged services for the first time. Below is a graphic illustrating how they’re adopting machine learning to improve mobile endpoint security control:
  • Capitalizing on the core strengths of machine learning to improve IT asset management is making direct contributions to greater security. IT management and security initiatives continue to become more integrated across organizations, creating new challenges to managing endpoint security across each device. Absolute Software is taking an innovative approach to solving the challenge of improving IT asset management so that endpoint protection is strengthened at the same time. Recently I had a chance to speak with Nicko van Someren, Ph.D. and Chief Technology Officer at Absolute Software, who shared with me how machine learning algorithms are improving security by providing greater insights into asset management. “Keeping machines up to date is an IT management job, but it’s a security outcome. Knowing what devices should be on my network is an IT management problem, but it has a security outcome. And knowing what’s going on and what processes are running and what’s consuming network bandwidth is an IT management problem, but it’s a security outcome. I don’t see these as distinct activities so much as seeing them as multiple facets of the same problem space.” Nicko added that Absolute’s endpoint security controls begin at the BIOS level of over 500M devices that have Absolute’s endpoint code embedded in them. The Absolute Platform comprises three products: Persistence, Intelligence, and Resilience, each building on the capabilities of the other. Absolute Intelligence standardizes the data around asset analytics and security advocacy analytics to allow security managers to ask any question they want (“What’s slowing down my device? What’s working and what isn’t? What has been compromised? What’s consuming too much memory? How does this deviate from normal performance?”). An example of Absolute’s Intelligence providing insights into asset management and security is shown below:
  • Machine learning has progressed to become the primary detection method for identifying and stopping malware attacks. Machine learning algorithms initially contributed to improving endpoint security by supporting the back-end of malware protection workflows. Today more vendors are designing endpoint security systems with machine learning as the primary detection method. Machine learning-trained algorithms can detect file-based malware and learn which files are harmful or benign based on the file’s metadata and content (a simplified classifier sketch appears after this list). Symantec’s Content & Malware Analysis illustrates how machine learning is being used to detect and block malware. Their approach combines advanced machine learning and static code file analysis to block, detect, and analyze threats and stop breach attempts before they can spread.
  • Supervised machine learning algorithms are being used to determine when given applications are unsafe to use, assigning them to containers so they’re isolated from production systems. Taking into account an application’s threat score or reputation, machine learning algorithms decide whether dynamic application containment needs to run for a given application. Machine learning-based dynamic application containment algorithms and rules block or log unsafe actions of an application based on containment and security rules. Machine learning algorithms are also being used to build predictive analytics that define the extent of a given application’s threat.
  • Integrating AI, machine learning, and SIEM (Security Information and Event Management) in a single unified platform is enabling organizations to predict, detect, and respond to anomalous behaviors and events. AI and machine learning-based algorithms and predictive analytics are becoming a core part of SIEM platforms today as they provide automated, continuous analysis and correlation of all activity observed within a given IT environment. Capturing, aggregating, and analyzing endpoint data in real time using AI techniques and machine learning algorithms is providing entirely new insights into asset management and endpoint security. One of the most interesting companies to watch in this area is LogRhythm. They’ve developed an innovative approach to integrating AI, machine learning, and SIEM in their LogRhythm NextGen SIEM Platform, which delivers automated, continuous analysis and correlation of all activity observed within an IT environment. The following is an example of how LogRhythm combines AI, machine learning, and SIEM to bring new insights into securing endpoints across a network.
  • Machine learning is automating the more manual, routine incident analysis and escalation tasks that are overwhelming security analysts today. Capitalizing on supervised machine learning’s innate ability to fine-tune algorithms in milliseconds based on the analysis of incident data, endpoint security providers are prioritizing this area in product development. Demand from potential customers remains strong, as nearly everyone is facing a cybersecurity skills shortage while facing an onslaught of breach attempts. “The cybersecurity skills shortage has been growing for some time, and so have the number and complexity of attacks; using machine learning to augment the few available skilled people can help ease this. What’s exciting about the state of the industry right now is that recent advances in machine learning methods are poised to make their way into deployable products,” Absolute’s CTO Nicko van Someren added.
  • Performing real-time scans of all processes with an unknown or suspicious reputation is another way machine learning is improving endpoint security. Commonly referred to as Hunt and Respond, supervised and unsupervised machine learning algorithms are being used today to seek out and resolve potential threats in milliseconds instead of days. Supervised machine learning algorithms are being used to discover patterns in known or stable processes where anomalous behavior or activity will create an alert and pause the process in real time. Unsupervised machine learning algorithms are used for analyzing large-scale, unstructured data sets to categorize suspicious events, visualize threat trends across the enterprise, and take immediate action at a single endpoint or across the entire organization.
  • Machine learning is accelerating the consolidation of endpoint security technologies, a market dynamic that is motivating organizations to trim back from the ten agents they have on average per endpoint today. Absolute Software’s 2019 Endpoint Security Trends Report found that a typical device has ten or more endpoint security agents installed, each often conflicting with the others. The study also found that enterprises are using a diverse array of endpoint agents, including encryption, AV/AM, and Endpoint Detection and Response (EDR). This wide array of endpoint solutions makes it nearly impossible to standardize a specific test to ensure security and safety without sacrificing speed. By accelerating the consolidation of endpoint security agents, machine learning is helping organizations see that the more complex and layered the endpoint protection, the greater the risk of a breach.
  • Keeping every endpoint in compliance with regulatory and internal standards is another area where machine learning is contributing to improving endpoint security. In regulated industries, including financial services, insurance, and healthcare, machine learning is being deployed to discover, classify, and protect sensitive data. This is especially the case with HIPAA (Health Insurance Portability and Accountability Act) compliance in healthcare. Amazon Macie is representative of the latest generation of machine learning-based cloud security services. Amazon Macie recognizes sensitive data such as personally identifiable information (PII) or intellectual property and provides organizations with dashboards, alerts, and contextual insights that give visibility into how data is being accessed or moved. The fully managed service continuously monitors data access activity for anomalies and generates detailed alerts when it detects the risk of unauthorized access or inadvertent data leaks. An example of one of Amazon Macie’s dashboards is shown below; a hedged sketch of querying Macie findings programmatically also follows this list.
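To make the risk-scoring item above more concrete, here is a minimal sketch of blending an unsupervised anomaly detector with a supervised classifier into a single login risk score. The feature names, synthetic data, toy labels, and 50/50 blend are illustrative assumptions, not any vendor’s implementation.

```python
# Minimal sketch: combine unsupervised anomaly detection with a supervised
# classifier to score login risk. All features and data are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest, GradientBoostingClassifier

rng = np.random.default_rng(42)

# Hypothetical login features: hour of day, km from the usual geolocation,
# failed attempts in the past hour, and device age in days.
X_train = rng.normal(loc=[13, 5, 0.2, 400], scale=[4, 10, 0.5, 200], size=(5000, 4))
y_train = (X_train[:, 1] > 20) | (X_train[:, 2] > 1)  # toy labels for known-bad logins

# Unsupervised model learns what "normal" looks like and flags anomalies.
iso = IsolationForest(random_state=0).fit(X_train)

# Supervised model learns from historically labeled incidents.
clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

def risk_score(features: np.ndarray) -> float:
    """Blend an anomaly score and a fraud probability into a 0-1 risk score."""
    anomaly = -iso.score_samples(features.reshape(1, -1))[0]  # higher = more anomalous
    fraud_p = clf.predict_proba(features.reshape(1, -1))[0, 1]
    return float(0.5 * min(max(anomaly, 0.0), 1.0) + 0.5 * fraud_p)

# Example: a 3 a.m. login, 800 km from the usual location, 3 recent failed attempts.
print(risk_score(np.array([3, 800, 3, 2])))
```

In a production system the blended score would feed an access decision such as step-up authentication or session termination, which is the kind of adaptive control the item above describes.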
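As referenced in the malware-detection item above, here is a simplified sketch of training a classifier on file metadata. The feature set (file size, byte entropy, imported-API count, signed flag) and the synthetic training data are illustrative assumptions, not Symantec’s or any other vendor’s model.

```python
# Toy file-metadata malware classifier on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Feature columns: [size_kb, byte_entropy, imported_api_count, is_signed]
benign  = np.column_stack([rng.normal(900, 400, 2000), rng.normal(5.5, 0.8, 2000),
                           rng.normal(120, 40, 2000), rng.integers(0, 2, 2000)])
malware = np.column_stack([rng.normal(300, 200, 2000), rng.normal(7.4, 0.4, 2000),
                           rng.normal(25, 15, 2000), np.zeros(2000)])
X = np.vstack([benign, malware])
y = np.concatenate([np.zeros(2000), np.ones(2000)])  # 0 = benign, 1 = malware

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {model.score(X_te, y_te):.2f}")

# Score a new, unknown file before it is allowed to execute.
print("malware probability:", model.predict_proba([[250, 7.8, 12, 0]])[0, 1])
```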
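And for the compliance item above, this is a hedged sketch of pulling recent Amazon Macie findings with boto3 so they can feed an alerting or SIEM workflow. It assumes the newer macie2 API surface and an account where Macie is already enabled; the service generation described in the article may expose a different client.

```python
# Hedged sketch: list recent Macie findings and print a short summary of each.
# Requires AWS credentials with Macie read permissions.
import boto3

macie = boto3.client("macie2", region_name="us-east-1")

# List the most recent findings (sensitive-data discoveries, policy findings).
finding_ids = macie.list_findings(maxResults=25).get("findingIds", [])

if finding_ids:
    for finding in macie.get_findings(findingIds=finding_ids).get("findings", []):
        print(finding.get("severity", {}).get("description"),
              finding.get("type"),
              finding.get("resourcesAffected", {}).get("s3Bucket", {}).get("name"))
```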

The Truth About Privileged Access Security On AWS And Other Public Clouds

 

Bottom Line: Amazon’s Identity and Access Management (IAM) centralizes identity roles, policies and Config Rules yet doesn’t go far enough to provide a Zero Trust-based approach to Privileged Access Management (PAM) that enterprises need today.

AWS provides a baseline level of support for Identity and Access Management at no charge as part of their AWS instances, as do other public cloud providers. Designed to provide customers with the essentials to support IAM, the free version often doesn’t go far enough to support PAM at the enterprise level. To AWS’s credit, they continue to invest in IAM features while fine-tuning how Config Rules in their IAM can create alerts using AWS Lambda. AWS’s native IAM can also integrate at the API level to HR systems and corporate directories, and suspend users who violate access privileges.
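As a rough illustration of the automation described above, here is a minimal, Lambda-style sketch that suspends a flagged user by deactivating access keys and removing console access. The event shape and the overall workflow are assumptions for illustration; the boto3 IAM calls themselves are standard, but this is not AWS’s prescribed pattern.

```python
# Hedged sketch: suspend a user flagged by a Config rule or HR-system alert.
import boto3

iam = boto3.client("iam")

def handler(event, context=None):
    user = event["userName"]  # assumed field supplied by the upstream alert

    # Deactivate every programmatic access key the user holds.
    for key in iam.list_access_keys(UserName=user)["AccessKeyMetadata"]:
        iam.update_access_key(UserName=user, AccessKeyId=key["AccessKeyId"],
                              Status="Inactive")

    # Remove console access if a login profile exists.
    try:
        iam.delete_login_profile(UserName=user)
    except iam.exceptions.NoSuchEntityException:
        pass
    return {"suspended": user}
```

In practice a Config rule or an HR-system webhook would invoke a handler like this through AWS Lambda, which is the alerting path the paragraph above describes.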

In short, the native IAM capabilities offered by AWS, Microsoft Azure, Google Cloud, and others provide enough functionality to help an organization get up and running to control access in their respective homogeneous cloud environments. Often they lack the scale to fully address the more challenging, complex areas of IAM and PAM in hybrid or multi-cloud environments.

The Truth about Privileged Access Security on Cloud Providers Like AWS

The essence of the Shared Responsibility Model is assigning responsibility for the security of the cloud itself, including the infrastructure, hardware, software, and facilities, to AWS, and assigning the securing of operating systems, platforms, and data to customers. The AWS version of the Shared Responsibility Model, shown below, illustrates how Amazon has defined securing the data itself, management of the platform, applications and how they’re accessed, and various configurations as the customers’ responsibility:

AWS provides basic IAM support that protects its customers against privileged credential abuse in a homogenous AWS-only environment. Forrester estimates that 80% of data breaches involve compromised privileged credentials, and a recent survey by Centrify found that 74% of all breaches involved privileged access abuse.

The following are the four truths about privileged access security on AWS (and, generally, other public cloud providers):

  1. Customers of AWS and other public cloud providers should not fall for the myth that cloud service providers can completely protect their customized and highly individualized cloud instances. As the Shared Responsibility Model above illustrates, AWS secures the core areas of their cloud platform, including infrastructure and hosting services. AWS customers are responsible for securing operating systems, platforms, and data and, most importantly, privileged access credentials. Organizations need to consider the Shared Responsibility Model the starting point for creating an enterprise-wide security strategy, with a Zero Trust Security framework being the long-term goal. AWS’s IAM is an interim solution to the long-term challenge of achieving Zero Trust Privilege across an enterprise ecosystem that is going to become more hybrid or multi-cloud as time goes on.
  2. Despite what many AWS integrators say, adopting a new cloud platform doesn’t require a new Privileged Access Security model. Many organizations who have adopted AWS and other cloud platforms are using the same Privileged Access Security Model they have in place for their existing on-premises systems. The truth is the same Privileged Access Security Model can be used for on-premises and IaaS implementations. Even AWS itself has stated that conventional security and compliance concepts still apply in the cloud. For an overview of the most valuable best practices for securing AWS instances, please see my previous post, 6 Best Practices For Increasing Security In AWS In A Zero Trust World.
  3. Hybrid cloud architectures that include AWS instances don’t need an entirely new identity infrastructure and can rely on advanced technologies, including Multi-Directory Brokering. Creating duplicate identities increases cost, risk, and overhead and adds the burden of requiring additional licenses. Existing directories (such as Active Directory) can be extended through various deployment options, each with its strengths and weaknesses. Centrify, for example, offers Multi-Directory Brokering to use whatever preferred directory already exists in an organization to authenticate users in hybrid and multi-cloud environments. And while AWS provides key pairs for access to Amazon Elastic Compute Cloud (Amazon EC2) instances, their security best practices recommend a holistic approach across on-premises and multi-cloud environments, including Active Directory or LDAP in the security architecture.
  4. It’s possible to scale existing Privileged Access Management systems in use for on-premises systems today to hybrid cloud platforms that include AWS, Google Cloud, Microsoft Azure, and other platforms. There’s a tendency on the part of system integrators specializing in cloud security to oversell cloud service providers’ native IAM and PAM capabilities, saying that a hybrid cloud strategy requires separate systems. Look for system integrators and experienced security solutions providers who can use a common security model already in place to move workloads to new AWS instances.

Conclusion

The truth is that Identity and Access Management solutions built into public cloud offerings such as AWS, Microsoft Azure, and Google Cloud are stop-gap solutions to a long-term security challenge many organizations are facing today. Instead of relying only on a public cloud provider’s IAM and security solutions, every organization’s cloud security goals need to include a holistic approach to identity and access management and not create silos for each cloud environment they are using. While AWS continues to invest in their IAM solution, organizations need to prioritize protecting their privileged access credentials, the “keys to the kingdom,” that, if ever compromised, would allow hackers to walk in the front door of the most valuable systems an organization has. The four truths defined in this article are essential for building a Zero Trust roadmap for any organization that will scale with them as they grow. By taking a “never trust, always verify, enforce least privilege” strategy when it comes to their hybrid- and multi-cloud strategies, organizations can avoid costly breaches that harm the long-term operations of any business.

The State Of Cloud Business Intelligence, 2019

  • An all-time high 48% of organizations say cloud BI is either “critical” or “very important” to their operations in 2019.
  • Marketing & Sales place the greatest importance on cloud BI in 2019.
  • Small organizations of 100 employees or fewer are the most enthusiastic, perennial adopters and supporters of cloud BI.
  • The most preferred cloud BI providers are Amazon Web Services and Microsoft Azure.

These and other insights are from Dresner Advisory Services’ 2019 Cloud Computing and Business Intelligence Market Study. The 8th annual report focuses on end-user deployment trends and attitudes toward cloud computing and business intelligence (BI), defined as the technologies, tools, and solutions that rely on one or more cloud deployment models. What makes the study noteworthy is the depth of focus around the perceived benefits and barriers for cloud BI, the importance of cloud BI, and current and planned usage.

“We began tracking and analyzing the cloud BI market dynamic in 2012 when adoption was nascent. Since that time, deployments of public cloud BI applications are increasing, with organizations citing substantial benefits versus traditional on-premises implementations,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. Please see page 10 of the study for specifics on the methodology.

Key insights gained from the report include the following:

  • An all-time high 48% of organizations say cloud BI is either “critical” or “very important” to their operations in 2019. Organizations have more confidence in cloud BI than ever before, according to the study’s results. 2019 is seeing a sharp upturn in cloud BI’s importance, driven by the trust and credibility organizations have for accessing, analyzing and storing sensitive company data on cloud platforms running BI applications.

  • Marketing & Sales place the greatest importance on cloud BI in 2019. Business Intelligence Competency Centers (BICC) and IT departments have an above-average interest in cloud BI as well, with their combined critical and very important scores being over 50%. Dresner’s research team found that Operations had the greatest duality of scores, with critical and not important being reported at comparable levels for this functional area. Dresner’s analysis indicates Operations departments often rely on cloud BI to benchmark and improve existing processes while re-engineering legacy process areas.

  • Small organizations of 100 employees or fewer are the most enthusiastic, perennial adopters and supporters of cloud BI. As has been the case in previous years’ studies, small organizations are leading all others in adopting cloud BI systems and platforms. Perceived importance declines only slightly in mid-sized organizations (101-1,000 employees) and some large organizations (1,001-5,000 employees), where minimum scores of “important” offset declines in “critical.”

  • The retail/wholesale industry considers cloud BI the most important, followed by the technology and advertising industries. Organizations competing in the retail/wholesale industry see the greatest value in adopting cloud BI to gain insights into improving their customer experiences and streamlining supply chains. The technology and advertising industries also see cloud BI as very important to their operations. Just over 30% of respondents in the education industry see cloud BI as very important.

  • R&D departments are the most prolific users of cloud BI systems today, followed by Marketing & Sales. The study highlights that R&D leading all other departments in existing cloud BI use reflects broader potential use cases being evaluated in 2019. Marketing & Sales is the next most prolific department using cloud BI systems.

  • Finance leads all others in their adoption of private cloud BI platforms, rivaling IT in their lack of adoption for public clouds. R&D departments are the next most likely to be relying on private clouds currently. Marketing and Sales are the most likely to take a balanced approach to private and public cloud adoption, equally adopting private and public cloud BI.

  • Advanced visualization, support for ad-hoc queries, personalized dashboards, and data integration/data quality tools/ETL tools are the four most popular cloud BI requirements in 2019. Dresner’s research team found the lowest-ranked cloud BI feature priorities in 2019 are social media analysis, complex event processing, big data, text analytics, and natural language analytics. This year’s analysis of the most and least popular cloud BI requirements closely mirrors traditional BI feature requirements.

  • Marketing and Sales have the greatest interest in several of the most-required features including personalized dashboards, data discovery, data catalog, collaborative support, and natural language analytics. Marketing & Sales also have the highest level of interest in the ability to write to transactional applications. R&D leads interest in ad-hoc query, big data, text analytics, and social media analytics.

  • The Retail/Wholesale industry leads interest in several features including ad-hoc query, dashboards, data integration, data discovery, production reporting, search interface, data catalog, and ability to write to transactional systems. Technology organizations give the highest score to advanced visualization and end-user self-service. Healthcare respondents prioritize data mining, end-user data blending, and location analytics, the latter likely for asset tracking purposes. In-memory support scores highest with Financial Services respondent organizations.

  • Marketing & Sales rely on a broader base of third-party data connectors to get greater value from their cloud BI systems than their peers. The greater the scale, scope, and depth of third-party connectors and integrations, the more valuable marketing and sales data becomes. Relying on connectors for greater insights into sales productivity and performance, social media, online marketing, online data storage, and simple productivity improvements is common in Marketing & Sales. Finance requiring integration with Salesforce reflects the CRM application’s success in transcending customer relationships to advanced accounting and financial reporting.

  • Subscription models are now the most preferred licensing strategy for cloud BI and have gained ground over the last several years due to lower risk, lower entry costs, and lower carrying costs. Dresner’s research team found that subscription license and free trial (including trial and buy, which may also lead to subscription) are the two most preferred licensing strategies by cloud BI customers in 2019. Dresner Advisory Services predicts new engagements will be earned using subscription models, which are now seen as at least important by approximately 90% of respondents.

  • 60% of organizations adopting cloud BI rank Amazon Web Services first, and 85% rank AWS first or second. 43% choose Microsoft Azure first and 69% pick Azure first or second. Google Cloud closely trails Azure as the first choice among users but trails more widely after that. IBM Bluemix is the first choice of 12% of organizations responding in 2019.

Where Cloud Computing Jobs Will Be In 2019

  • $146,350 is the median salary for cloud computing professionals in 2018.
  • There are 50,248 open cloud computing positions in the U.S. today from 3,701 employers, and 101,913 open positions worldwide.
  • Oracle (NYSE: ORCL), Deloitte and Amazon (NASDAQ: AMZN) have the most open cloud computing jobs today.
  • Java, Linux, Amazon Web Services (AWS), Software Development, DevOps, Docker and Infrastructure as a Service (IaaS) are the most in-demand skills.
  • Washington DC, Arlington-Alexandria, VA, San Francisco-Oakland-Hayward, CA, New York-Newark-Jersey City, NY, San Jose-Sunnyvale-Santa Clara, CA, Chicago-Naperville-Elgin, IL, are the top five cities where cloud computing jobs are today and will be in 2019.

Demand for cloud computing expertise continues to increase exponentially and will accelerate in 2019. To better understand the current and future direction of cloud computing hiring trends, I utilized Gartner TalentNeuron. Gartner TalentNeuron is an online talent market intelligence portal with real-time labor market insights, including custom role analytics and executive-ready dashboards and presentations. Gartner TalentNeuron also supports a range of strategic initiatives covering talent, location, and competitive intelligence.

Gartner TalentNeuron maintains a database of more than one billion unique job listings and is collecting hiring trend data from more than 150 countries across six continents, resulting in 143GB of raw data being acquired daily. In response to many Forbes readers’ requests for recommendations on where to find a job in cloud computing, I contacted Gartner to gain access to TalentNeuron.

Key takeaways include the following:

  • $146,350 is the median salary for cloud computing professionals in 2018. Cloud computing salaries have soared in the last two years, with 2016’s median salary of $124,300 representing a jump of $22,050. The following graphic shows the distribution of salaries for the 50,248 cloud computing jobs currently available in the U.S. alone. Please click on the graphic to expand for easier reading.

  • The Hiring Scale is 78 for jobs that require cloud computing skill sets, with the average job post staying open 46 days. The higher the Hiring Scale score, the more difficult it is for employers to find the right applicants for open positions. Nationally an average job posting for an IT professional with cloud computing expertise is open 46 days. Please click on the graphic to expand for easier reading.

  • Washington, DC – Arlington-Alexandria, VA leads the top twenty metro areas with the most open positions for cloud computing professionals today. Mapping the distribution of job volume, salary range, candidate supply, posting period, and hiring scale by Metropolitan Statistical Area (MSA) or by states and counties is supported by Gartner TalentNeuron. The following graphic shows the distribution of talent or candidate supply. These are the markets with the highest supply of talent with cloud computing skills.

  • Oracle (NYSE: ORCL), Deloitte and Amazon (NASDAQ: AMZN) have the most open cloud computing jobs today. IBM, VMware, Capital One, Microsoft, KPMG, Salesforce, PricewaterhouseCoopers, U.S. Bank, Booz Allen Hamilton, Raytheon, SAP, Capgemini, Google, Leidos, and Nutanix all have over 100 open cloud computing positions today.

By 2020 83% Of Enterprise Workloads Will Be In The Cloud

  • Digitally transforming enterprises (63%) is the leading factor driving greater public cloud engagement or adoption today.
  • 66% of IT professionals say security is their most significant concern in adopting an enterprise cloud computing strategy.
  • 50% of IT professionals believe artificial intelligence and machine learning are playing a role in cloud computing adoption today, growing to 67% by 2020.
  • Artificial Intelligence (AI) and Machine Learning will be the leading catalyst driving greater cloud computing adoption by 2020.

These insights and findings are from LogicMonitor’s Cloud Vision 2020: The Future of the Cloud Study (PDF, free, opt-in, 9 pp.). The survey is based on interviews LogicMonitor conducted with approximately 300 influencers in November 2017. Respondents include Amazon Web Services AWS re:Invent 2017 attendees, industry analysts, media, consultants, and vendor strategists. The study’s primary goal is to explore the landscape for cloud services in 2020. While the study’s findings are not statistically significant, they do provide a fascinating glimpse into current and future enterprise cloud computing strategies.

Key takeaways include the following:

  • 83% Of Enterprise Workloads Will Be In The Cloud By 2020. LogicMonitor’s survey is predicting that 41% of enterprise workloads will be run on public cloud platforms (Amazon AWS, Google Cloud Platform, IBM Cloud, Microsoft Azure, and others) by 2020. An additional 20% are predicted to be private-cloud-based, followed by another 22% running on hybrid cloud platforms by 2020. On-premise workloads are predicted to shrink from 37% today to 27% of all workloads by 2020.

  • Digitally transforming enterprises (63%) is the leading factor driving greater public cloud engagement or adoption, followed by the pursuit of IT agility (62%). LogicMonitor’s survey found that the many challenges enterprises face in digitally transforming their business models are the leading contributing factor to cloud computing adoption. Attaining IT agility (62%), excelling at DevOps (58%), mobility (55%), Artificial Intelligence (AI) and Machine Learning (50%), and Internet of Things (IoT) adoption (45%) round out the top six factors driving cloud adoption today. Artificial Intelligence (AI) and Machine Learning are predicted to be the leading factors driving greater cloud computing adoption by 2020.

  • 66% of IT professionals say security is their greatest concern in adopting an enterprise cloud computing strategy. Cloud platform and service providers will go on a buying spree in 2018 to strengthen and harden their platforms in this area. Verizon (NYSE:VZ) acquiring Niddel this week is just the beginning. Niddel’s Magnet software is a machine learning-based threat-hunting system that will be integrated into Verizon’s enterprise-class cloud services and systems. Additional concerns include attaining governance and compliance goals on cloud-based platforms (60%), overcoming the challenges of having staff that lacks cloud experience (58%), Privacy (57%) and vendor lock-in (47%).

  • Just 27% of respondents predict that by 2022, 95% of all workloads will run in the cloud. One in five respondents believes it will take ten years to reach that level of workload migration. 13% of respondents don’t see this level of workload shift ever occurring. Based on conversations with CIOs and CEOs in the manufacturing and financial services industries, there will be a mix of workloads between on-premise and cloud for the foreseeable future. C-level executives evaluate shifting workloads based on each system’s contribution to new business models, cost, and revenue goals in addition to accelerating time-to-market.

  • Microsoft Azure and Google Cloud Platform are predicted to gain market share versus Amazon AWS in the next three years, with AWS staying the clear market leader. The study found 42% of respondents are predicting Microsoft Azure will gain more market share by 2020. Google Cloud Platform is predicted to also gain ground according to 35% of the respondent base. AWS is predicted to extend its market dominance with 52% market share by 2020.

How AWS And Azure Competing Is Improving Public Cloud Adoption


  • Public Cloud spending is predicted to grow quickly, attaining 16% year-over-year growth in 2017.
  • Cowen’s AWS segment model is predicting Revenue and EBITDA to grow 25% and 26.8% annually from 2017 to 2022.
  • Microsoft Azure is viewed as the platform that customers would most likely purchase or renew going forward (28% of total vs. AWS at 22%, GCP at 15%, and IBM at 10%).

These and many other fascinating insights are from Cowen’s study published this week, Public Cloud V: AWS And Azure Still Leading The Pack (58 pp., PDF, client access reqd.). Cowen partnered with Altman Vilandrie & Company to complete the study. The study relies on a survey sample of 551 respondents distributed across small businesses, medium-sized businesses, and enterprises that are using Public Cloud platforms and services today. For purposes of the survey, small businesses have fewer than 500 employees, medium-sized businesses have 500 to 4,999 employees, and enterprises have more than 5,000 employees. The study provides insight on a range of topics including cloud spending trends, workload migration dynamics, and vendor positioning. Please see pages 5, 6 & 7 for additional details regarding the methodology.

The more AWS and Azure compete to win customers, the greater the innovation and growth in public cloud adoption as the following key takeaways illustrate:

  • Existing Public Cloud customers predict spending will grow 16% year-over-year in 2017. Existing mid-market Public Cloud customers predict spending will increase 18% this year. SMBs who have already adopted Public Cloud predict a 17% increase in spending in 2017, and enterprises, 13%. Public Cloud providers are the most successful at upselling and cross-selling mid-market companies this year, as many are relying on the cloud to scale their global operations to support growth.

Public Cloud Spending, 2017

  • AWS dominates awareness levels with SMBs who have existing Public Cloud deployments, with Microsoft Azure the most known and considered in enterprises. Consistent with many other surveys of Public Cloud adoption, IBM SoftLayer scored better in enterprises than any other segment, including SMBs (71% vs. 58%). Google Cloud Platform has its strongest awareness levels in SMBs, attributable to the adoption of their many cloud-based applications in this market segment; it trails AWS, Azure, and SoftLayer in the enterprise, however. Across all existing companies who have adopted Public Cloud, the majority are most aware of AWS and Microsoft Azure. The second graphic provides an overview of awareness across the entire respondent base.


  • Microsoft is the most-used Public Cloud and the most likely to be purchased or renewed by 28% of all respondents. While AWS is the most reviewed Public Cloud across all respondents, Microsoft Azure is the most used. When asked which Public Cloud provider they are likely to purchase or renew, the majority of respondents said Microsoft Azure (28%), followed by AWS (22%), Google Cloud Platform (15%) and IBM SoftLayer (10%). The following graphic compares awareness, reviewed and use levels by Public Cloud platform.

Comparative Analysis Of Most Used Public Cloud Provider

  • Only 37% of current Azure users expect to add or replace their Public Cloud provider, compared to 53% of current AWS users and 50% of GCP users. The study found that approximately 40% of respondents expect to add or replace their cloud provider in the next two years, compared to 43% who predicted that last year. Companies who have adopted Microsoft Azure are the least likely to replace or add other vendors.


  • AWS and Azure dominate all seven facets of user experience included in the survey. AWS has the best User Interface, API Complexity, and Reporting & Billing. Microsoft Azure leads all Public Cloud providers globally in Management & Monitoring, Software & Data Integration, Technical Support, and Training. Google Cloud Platform is third on all seven facets of user experience.


  • 18% of workloads are supported by Public Cloud today with SMBs and mid-market companies slightly leading enterprises (16%). Overall, 38% of all workloads are supported with on-premise infrastructure and platforms, increasing to 43% for enterprises. The following graphic illustrates the percentage of workloads supported by each infrastructure type.


  • 77% of existing Public Cloud adopters are either likely or very likely to add a SaaS workload in the next two years, led by mid-market companies (81%). SMBs (76%) and enterprises (73%) are also likely/very likely to add SaaS workloads in the next two years. The majority of these new SaaS workloads will be in the areas of Testing & Development, Web Hosting, and e-mail and communications.


  • Cowen’s AWS segment model is predicting Revenue and EBITDA to have a five-year Compound Annual Growth Rate (CAGR) of 25% and 26.8% from 2017 to 2022. AWS Net Income is predicted to increase from $2.7B in 2017 to $8.2B in 2022, attaining a projected 24.5% CAGR from 2017 to 2022. Revenue is predicted to soar from an estimated $16.8B in 2017 to $51.5B in 2022, driving a 25% CAGR in the forecast period.

Roundup Of Cloud Computing Forecasts, 2017

  • Cloud computing is projected to increase from $67B in 2015 to $162B in 2020 attaining a compound annual growth rate (CAGR) of 19%.
  • Gartner predicts the worldwide public cloud services market will grow 18% in 2017 to $246.8B, up from $209.2B in 2016.
  • 74% of Tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017.

Cloud platforms are enabling new, complex business models and orchestrating more globally-based integration networks in 2017 than many analyst and advisory firms predicted. Combined with Cloud Services adoption increasing in the mid-tier and small & medium businesses (SMB), leading researchers including Forrester are adjusting their forecasts upward. The best check of any forecast is revenue.  Amazon’s latest quarterly results released two days ago show Amazon Web Services (AWS) attained 43% year-over-year growth, contributing 10% of consolidated revenue and 89% of consolidated operating income.

Additional key takeaways from the roundup include the following:

  • Wikibon is predicting enterprise cloud spending will grow at a 16% compound annual growth rate (CAGR) between 2016 and 2026. The research firm also predicts that by 2022, Amazon Web Services (AWS) will reach $43B in revenue, and be 8.2% of all cloud spending. Source: Wikibon report preview: How big can Amazon Web Services get?

Wikibon Worldwide Enterprise IT Projection By Vendor Revenue


Rapid Growth of Cloud Computing, 2015–2020


Worldwide Public Cloud Services Forecast (Millions of Dollars)

  • By the end of 2018, spending on IT-as-a-Service for data centers, software and services will be $547B. Deloitte Global predicts that procurement of IT technologies will accelerate in the next 2.5 years from $361B to $547B. At this pace, IT-as-a-Service will represent more than half of IT spending by the 2021/2022 timeframe. Source: Deloitte Technology, Media and Telecommunications Predictions, 2017 (PDF, 80 pp., no opt-in).

Deloitte IT-as-a-Service Forecast

  • Total spending on IT infrastructure products (server, enterprise storage, and Ethernet switches) for deployment in cloud environments will increase 15.3% year over year in 2017 to $41.7B. IDC predicts that public cloud data centers will account for the majority of this spending (60.5%), while off-premises private cloud environments will represent 14.9% of spending. On-premises private clouds will account for 62.3% of spending on private cloud IT infrastructure and will grow 13.1% year over year in 2017. Source: Spending on IT Infrastructure for Public Cloud Deployments Will Return to Double-Digit Growth in 2017, According to IDC.

Worldwide Cloud IT Infrastructure Market Forecast

  • Platform-as-a-Service (PaaS) adoption is predicted to be the fastest-growing sector of cloud platforms according to KPMG, growing from 32% in 2017 to 56% adoption in 2020. Results from the 2016 Harvey Nash / KPMG CIO Survey indicate that cloud adoption is now mainstream and accelerating as enterprises shift data-intensive operations to the cloud.  Source: Journey to the Cloud, The Creative CIO Agenda, KPMG (PDF, no opt-in, 14 pp.)

Cloud investment by type today and in three years


AWS Segment Financial Comparison

  • In Q1, 2017 AWS generated 10% of consolidated revenue and 89% of consolidated operating income. Net sales increased 23% to $35.7 billion in the first quarter, compared with $29.1 billion in first quarter 2016. Source: Cloud Business Drives Amazon’s Profits.

Comparing AWS’ Revenue and Income Contributions

  • RightScale’s 2017 survey found that Microsoft Azure adoption surged from 26% to 43%, with AWS adoption increasing from 56% to 59%. Overall Azure adoption grew from 20% to 34% of respondents, reducing the AWS lead, with Azure now reaching 60% of the market penetration of AWS. Google also increased adoption from 10% to 15%. AWS continues to lead in public cloud adoption (57% of respondents currently run applications in AWS), a number that has stayed flat since 2015. Source: RightScale 2017 State of the Cloud Report (PDF, 38 pp., no opt-in)

Public Cloud Adoption, 2017 versus 2016

  • Global Cloud IT market revenue is predicted to increase from $180B in 2015 to $390B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 17%. In the same period, SaaS-based apps are predicted to grow at an 18% CAGR, and IaaS/PaaS is predicted to increase at a 27% CAGR. Source: Bain & Company research brief The Changing Faces of the Cloud (PDF, no opt-in).

60% of IT Market Growth Is Being Driven By The Cloud

  • 74% of Tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017. Additional technologies that will have a significant financial impact in 2017 include the Internet of Things, Artificial Intelligence (AI) (16%) and 3D printing and virtual reality (14% each). Source: 2017 BDO Technology Outlook Survey (PDF, no opt-in).

CFOs say cloud investments deliver the greatest measurable impact


Cloud investments are fueling new jobs throughout Canada

  • APIs are enabling persona-based user experiences in a diverse base of cloud enterprise apps. As of today, there are 17,422 APIs listed on the Programmable Web, with many enterprise cloud apps concentrating on subscription, distributed order management, and pricing workflows. Sources: Bessemer Venture Partners State of the Cloud 2017 and 2017 Is Quickly Becoming The Year Of The API Economy. The following graphic from the latest Bessemer Venture Partners report illustrates how APIs are now the backbone of enterprise software.

APIs are fueling a revolution in cloud enterprise apps


Five Strategies For Improving Customer Relationships Using Salesforce Integration

Bottom line: Defining Salesforce integration strategies from the customers’ perspective that streamline every aspect of their relationship with your company drives greater revenue, earns trust, and creates upsell and cross-sell opportunities in the future.

In the most competitive selling situations, the company that has exceptional insights into what matters most to prospects and customers wins the most deals. It’s not enough to just have a CRM system that is hard-wired into the core customer-facing processes of a business. To win more sales cycles, companies are getting the most from every system they have available. From SAP Enterprise Resource Planning (ERP) systems to legacy pricing, operations, services, and CRM systems, companies winning more deals today can use Salesforce integration as a catalyst for driving more revenue.

Five Strategies For Improving Customer Relationships Using Salesforce Integration

  1. Making the Configure-Price-Quote (CPQ) process more efficient for customers and prospects by integrating ERP data into every quote. Today speed is a feature every system must have to stay competitive. Being able to create quotes that include the date the proposed configuration will ship and coordinate with services and programs delivery, while providing order status from ERP systems, is winning deals today. The tighter the ERP system integration, the better the quote accuracy in a CPQ system and the higher the chance of winning a sale. The following table shows the many benefits of having a well-integrated CPQ process, and a minimal integration sketch appears after this list.

Business Impact of an Integrated CPQ Process

  2. Creating an omni-channel experience for customers needs to start with ERP, legacy, third-party, and Salesforce integration that sets the foundation to exceed customer expectations daily. Providing a unified experience across every channel is challenging yet attainable, with market leaders using a series of integration strategies to provide this level of insight so customers’ expectations are exceeded in every single interaction. Only by integrating CRM systems including Salesforce with SAP ERP systems can any company hope to deliver a consistent, excellent series of experiences across all channels, all the time.
  3. Set up sales teams for exceptional performance with tightly integrated mobile apps that accelerate sales cycles. By using mobile apps that integrate SAP ERP systems, Salesforce CRM, and legacy systems into simplified, highly efficient workflows, sales teams can close more deals without having to come back to their offices. Senior management teams can get more done using mobile apps that are an extension of their SAP ERP systems as well. Mobile apps are revolutionizing productivity thanks to SAP and Salesforce integration.
  4. Attaining high product quality levels that exceed customer expectations by providing every manufacturing department real-time visibility into quality inspections and inventory control. By integrating inbound inspection, inventory control, and quality management data across manufacturing, Bunn can deliver products that exceed customer expectations. Bunn’s product quality inspectors can perform and record results right at the machines being tested. The warehouse management system can scan and record inventory counts in real time to SAP. Maintaining high levels of product quality is what makes Bunn’s beverage equipment a market standard globally today.
  5. Making new product launches more successful by having a tightly integrated approach to selling, producing, and servicing new products that are in step with customers’ changing needs. From apparel to high-tech and financial services, customers are rapidly redefining which channels they choose to purchase through, how they choose to customize products, and which services they prefer to bundle in. Integrating Salesforce, e-commerce, and ERP systems into a single, unified workflow that is designed to provide customers exactly what they need is essential for enabling new product launches to succeed. With an integrated system across Salesforce, ERP, distribution, and pricing systems, new product launches can scale globally quicker and still allow for personalization to customers’ unique preferences. Salesforce integration is essential for successful new product introductions as the entire launch process gains speed, scale, and simplicity as a result.
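To ground strategy #1 above, here is a minimal sketch of pulling a promised ship date from an ERP system and stamping it onto a quote before it is sent. Every URL, field name, and credential below is a hypothetical placeholder rather than a real SAP, Salesforce, or enosiX API.

```python
# Hedged sketch: enrich a CPQ quote with an ERP availability date.
# All endpoints, fields, and tokens are illustrative placeholders.
import requests

ERP_API = "https://erp.example.com/api/v1"        # hypothetical ERP REST facade
CPQ_API = "https://cpq.example.com/api/quotes"    # hypothetical CPQ endpoint
HEADERS = {"Authorization": "Bearer <token>"}     # placeholder credentials

def enrich_quote_with_ship_date(quote_id: str, material_number: str, qty: int) -> dict:
    """Ask the ERP for availability, then write the promised date into the quote."""
    atp = requests.get(f"{ERP_API}/availability/{material_number}",
                       params={"quantity": qty}, headers=HEADERS, timeout=10)
    atp.raise_for_status()
    promised_date = atp.json()["promisedShipDate"]

    update = requests.patch(f"{CPQ_API}/{quote_id}",
                            json={"promisedShipDate": promised_date},
                            headers=HEADERS, timeout=10)
    update.raise_for_status()
    return update.json()
```

The point of the sketch is the sequencing: availability is checked in the system of record first, so the quote the customer receives already reflects what manufacturing can actually deliver.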

Originally published on the enosiX blog, Five Strategies For Improving Customer Relationships Using Salesforce Integration. 

Amazon Web Services Leading Cloud Infrastructure as a Service App Development

IaaS Magic Quadrant

Evangelizing development on any cloud computing or enterprise platform is challenging, costly and takes a unique skill set that can educate, persuade, sell and serve developers at the same time.

The companies that excel at this exude technical prowess and, as a result, earn and keep trust. For Cloud Infrastructure as a Service (IaaS) platform providers, getting developers, both at partner companies and at enterprise customers, to build applications is a critical catalyst for future growth.

Assessing Cloud Infrastructure as a Service Providers with Inquiry Analytics  

Using the Magic Quadrant for Cloud Infrastructure as a Service, 2012, published October 18, 2012, as the baseline and shown above from Reuven Cohen’s excellent post last year, the five leaders were compared using the Inquiry Analytics Statistics: Topic and Vendor Mind Share for Software, 4Q12, published March 13th of this year. Analyzing the five leaders in the Magic Quadrant using Inquiry Analytics shows that Amazon Web Services (AWS) accounted for 57.1% of worldwide inquiry share for application development during the 4th quarter of 2012.

From 4th quarter 2011 to 4th quarter 2012, Amazon Web Services showed just over 10% inquiry gain against the other vendors listed as leaders in the quadrant.  Only five vendors can be compared at once using the Gartner Inquiry Analytics tool so the leaders were included in the comparison first.


A second pass through the Inquiry Analytics was done comparing Amazon Web Services to the other vendors in the quadrant. AWS had 63.6% of inquiries in the application development category during the 4th quarter of 2012 compared to non-leader vendors in the quadrant who were listed in the Inquiry Analytics database. It was surprising to find that a few of the vendors listed in the Cloud IaaS Magic Quadrant don’t have inquiry data available in the Inquiry Analytics Statistics: Topic and Vendor Mind Share for Software, 4Q12. During this pass, Rackspace’s share of inquiries from the 4th quarter of 2011 to the 4th quarter of 2012 declined just over 5%, and Dell’s declined approximately 2%.

Bottom line: The land grab for developers is accelerating on IaaS and will be a major factor in who establishes a long-term cloud platform for years to come.

SaaS Adoption Accelerates, Goes Global in the Enterprise

In working with manufacturers and financial services firms over the last year, one point is becoming very clear: SaaS is gaining trust as a solid alternative for global deployments across the enterprise.  And this trend has been accelerating in the last six months.  One case in point is a 4,000 seat SaaS CRM deployment going live in Australia, Europe, and the U.S. by December of this year.

What’s noteworthy about this shift is that just eighteen months ago an Australia-based manufacturer was only considering SaaS as an enhancement to their on-premises CRM system. What changed? The European and U.S. distribution and sales offices were on nearly 40 different CRM, quoting, proposal, and pricing systems. It was nearly impossible to track global opportunities.

Meanwhile, business was booming in Australia, and there were up-sell and cross-sell opportunities being missed at the U.S.- and European-based headquarters of their prospects. The manufacturer chose to move to a global SaaS CRM solution quickly. Uniting all three divisions with a global sales strategy forced the consolidation of 40 different quoting, pricing, and CRM systems in the U.S. alone. What they lost in complexity they are looking to pick up in global customer sales.

Measuring Where SaaS Is Cannibalizing On-Premise Enterprise Applications

Gartner’s Market Trends: SaaS’s Varied Levels of Cannibalization to On-Premises Applications, published 29 October 2012, breaks out the percentage of SaaS revenue for ten different enterprise application categories. The greener the color, the greater the adoption. As was seen with the Australian manufacturer, CRM continues to dominate this trend of SaaS cannibalizing on-premise enterprise applications.

Additional take-aways from this report include the following:

  • Perceived lower Total Cost of Ownership (TCO) continues to be the dominant reason enterprises are considering SaaS adoption, with 50% of respondents in 2012 mentioning this as the primary factor in their decision.
  • CRM is leading all other enterprise application areas in net new deployments according to the Gartner study, with the majority of on-premise replacements being in North America and Europe.
  • Gartner projects that by 2016 more than 50% of CRM software revenue will be delivered by SaaS. As of 2011, 35% of CRM software was delivered via SaaS. Gartner expects to see SaaS-based CRM grow at three times the rate of on-premise applications.
  • 95% of Web analytics functions are delivered via the SaaS model, whereas only 40% of sales functions use the cloud today, according to the findings of this study.
  • The highest adoption rates of SaaS-based applications include sales, customer service, social CRM and marketing automation.
  • SaaS-based ERP will continue to be a small percentage of the total market, attaining 10% cannibalization by 2012. Forrester has consistently said this is 13%, growing to 16% by 2015.
  • Office suites and digital content creation (DCC) will attain compound annual growth rates (CAGR) of 40.7% and 32.2%, respectively, from 2011 through 2016. Gartner is making the assumption that consumers and small businesses will continue to be the major forces for Web-based office suites through 2013.
  • The four reasons why companies don’t choose SaaS include uncertainty about whether it is the right deployment option (36%), satisfaction with existing on-premise applications (30%), no further requirements (33%), and being locked into their current solution by expensive contractual requirements (14%).

Bottom Line: Enterprises and their need to compete with greater accuracy and speed are driving the cannibalization of on-premise applications faster than many anticipated; enterprise software vendors need to step up and get in front of this if they are going to retain their greatest sources of revenue.

Source:  Market Trends: SaaS’s Varied Levels of Cannibalization to On-Premises Applications Published: 29 October 2012 written by Chad Eschinger, Joanne M. Correia, Yanna Dharmasthira, Tom Eid, Chris Pang, Dan Sommer, Hai Hong Swinehart and Laurie F. Wurster
