
Posts from the ‘Gartner’ Category

What’s New In Gartner’s Hype Cycle For AI, 2020

AI is starting to deliver on its potential and its benefits for businesses are becoming a reality.

  • 47% of artificial intelligence (AI) investments have remained unchanged since the start of the pandemic, and 30% of organizations plan to increase their AI investments, according to a recent Gartner poll.
  • 30% of CEOs own AI initiatives in their organizations and regularly redefine resources, reporting structures and systems to ensure success.
  • AI projects continue to accelerate this year in healthcare, bioscience, manufacturing, financial services and supply chain sectors despite greater economic & social uncertainty.
  • Five new technology categories are included in this year’s Hype Cycle for AI: small data, generative AI, composite AI, responsible AI and things as customers.

These and many other new insights are from the Gartner Hype Cycle for Artificial Intelligence, 2020, published on July 27th of this year and provided in the recent article, 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020.  Two dominant themes emerge from the combination of 30 diverse AI technologies in this year’s Hype Cycle. The first theme is the democratization or broader adoption of AI across organizations. The greater the democratization of AI, the greater the importance of developers and DevOps to create enterprise-grade applications. The second theme is the industrialization of AI platforms. Reusability, scalability, safety and responsible use of AI and AI governance are the catalysts contributing to the second theme.  The Gartner Hype Cycle for Artificial Intelligence, 2020, is shown below:

[Figure: The Gartner Hype Cycle for Artificial Intelligence, 2020. Source: Smarter with Gartner, 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020.]

Details Of What’s New In Gartner’s Hype Cycle for Artificial Intelligence, 2020

  • Chatbots are projected to see more than a 100% increase in their adoption rates in the next two to five years and represent the leading AI use case in enterprises today.  Gartner revised the bots’ penetration rate from a range of 5% to 20% last year to 20% to 50% this year. Gartner points to chatbots’ successful adoption as the face of AI today and the technology’s contributions to streamlining automated, touchless customer interactions aimed at keeping customers and employees safe. Bot vendors to watch include Amazon Web Services (AWS), Cognigy, Google, IBM, Microsoft, NTT DOCOMO, Oracle, Rasa and Rulai.
  • GPU Accelerators are the nearest-term technology to mainstream adoption and are predicted to deliver a high level of benefit according to Gartner’s Priority Matrix for AI, 2020. Gartner predicts GPU Accelerators will see a 100% improvement in adoption in two to five years, increasing from 5% to 20% adoption last year to 20% to 50% this year. Gartner advises its clients that GPU-accelerated computing can deliver extreme performance for highly parallel, compute-intensive workloads in high-performance computing (HPC), DNN training and inferencing. GPU computing is also available as a cloud service. According to the Hype Cycle, it may be economical for applications where utilization is low but the urgency of completion is high.
  • AI-based minimum viable products and accelerated AI development cycles are replacing pilot projects due to the pandemic across Gartner’s client base. Before the pandemic, a pilot project’s success or failure was, for the most part, dependent on whether the project had an executive sponsor and how much influence that sponsor had. Gartner clients are wisely moving to minimum viable products and accelerating AI development to get results quickly during the pandemic. Gartner recommends that projects involving Natural Language Processing (NLP), machine learning, chatbots and computer vision be prioritized above other AI initiatives. They’re also recommending organizations look at insight engines’ potential to deliver value across a business.
  • Artificial General Intelligence (AGI) lacks commercial viability today, and organizations need to focus instead on more narrowly focused AI use cases to get results for their business. Gartner warns there’s a lot of hype surrounding AGI, and organizations would do best to ignore vendors’ claims of having commercial-grade products or platforms ready today with this technology. A better AI deployment strategy is to consider the full scope of technologies on the Hype Cycle and choose those delivering proven financial value to the organizations adopting them.
  • Small Data is now a category in the Hype Cycle for AI for the first time. Gartner defines this technology as a series of techniques that enable organizations to manage production models that are more resilient and adapt to major world events like the pandemic or future disruptions. These techniques are ideal for AI problems where there are no big datasets available.
  • Generative AI is the second new technology category added to this year’s Hype Cycle. It’s defined as various machine learning (ML) methods that learn a representation of artifacts from the data and generate brand-new, completely original, realistic artifacts that preserve a likeness to the training data rather than repeating it (a toy illustration appears after this list).
  • Gartner sees potential for composite AI to help its enterprise clients and has included it as the third new category in this year’s Hype Cycle. Composite AI refers to the combined application of different AI techniques to improve learning efficiency, increase the level of “common sense,” and ultimately solve a much wider range of business problems more efficiently.
  • Concentrating on the ethical and social aspects of AI, Gartner recently defined the category Responsible AI as an umbrella term that’s included as the fourth category in the Hype Cycle for AI. Responsible AI is defined as a strategic term that encompasses the many aspects of making the right business and ethical choices when adopting AI that organizations often address independently. These include business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy and regulatory compliance.
  • The exponential gains in accuracy, price/performance, low power consumption and Internet of Things sensors that collect AI model data have led to a new category called Things as Customers, the fifth new category this year.  Gartner defines a thing as customer as a smart device or machine that obtains goods or services in exchange for payment. Examples include virtual personal assistants, smart appliances, connected cars and IoT-enabled factory equipment.
  • Thirteen technologies have either been removed, re-classified, or moved to other Hype Cycles compared to last year.  Gartner has chosen to remove VPA-enabled wireless speakers from all Hype Cycles this year. AI developer toolkits are now part of the AI developer and teaching kits category. AI PaaS is now part of AI cloud services. Gartner chose to move AI-related C&SI services, AutoML, Explainable AI (also now part of the Responsible AI category in 2020), graph analytics and Reinforcement Learning to the Hype Cycle for Data Science and Machine Learning, 2020. Conversational User Interfaces, Speech Recognition and Virtual Assistants are now part of the Hype Cycle for Natural Language Technologies, 2020. Gartner has also chosen to move Quantum computing to the Hype Cycle for Compute Infrastructure, 2020. Robotic process automation software is now removed from the Hype Cycle for AI, as Gartner mentions the technology in several other Hype Cycles.
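To make the generative AI definition above concrete, here is a deliberately minimal sketch: a toy “model” that learns a representation (mean and spread) of some training values, then samples brand-new values that resemble, but do not repeat, the data. The Gaussian fit is only an illustrative stand-in for the far more sophisticated ML methods Gartner describes:

```python
# Toy illustration of the generative-AI idea: learn a representation of the
# training data, then generate new artifacts that preserve a likeness to it.
import random
import statistics

training_artifacts = [9.8, 10.1, 10.4, 9.9, 10.0, 10.2]  # observed examples

mu = statistics.mean(training_artifacts)      # learned "representation"...
sigma = statistics.stdev(training_artifacts)  # ...of the training data

# Sample brand-new values: similar to the training data, but not copies of it.
new_artifacts = [random.gauss(mu, sigma) for _ in range(3)]
print(new_artifacts)
```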

Why Cybersecurity Needs To Focus More On Customer Endpoints


  • Cloud-based endpoint protection platforms (EPP) are proliferating across enterprises today as CIOs and CISOs prioritize greater resiliency in their endpoint security strategies going into 2020.
  • Gartner forecasts that global information security and risk management end-user spending will grow at a five-year CAGR of 9.2% to reach $174.5 billion in 2022, with approximately $50B spent on endpoint security.
  • Endpoint security tools are 24% of all IT security spending, and by 2020 global IT security spending will reach $128B according to Morgan Stanley Research.
  • 70% of all breaches still originate at endpoints, despite the increased IT spending on this threat surface, according to IDC.

There’s a surge of activity happening right now in enterprises that are prioritizing more resiliency in their endpoint security strategies going into 2020. The factors motivating CIOs, CISOs, IT, and Practice Directors to prioritize endpoint resiliency include more effective asset management based on real-time data, along with securing every endpoint and ensuring it can heal itself using regenerative software designed in at the BIOS level of every device. CIOs say real-time monitoring helps reduce asset management operating expense, a big plus many of them appreciate given their tight budgets. Sean Maxwell, Chief Commercial Officer at Absolute, says, “Trust is at the center of every endpoint discussion today as CIOs, CISOs and their teams want the assurance every endpoint will be able to heal itself and keep functioning.”

The Endpoint Market Is Heating Up Going Into 2020

Over thirty vendors are competing in the endpoint security market right now. Among the most interesting are Absolute Software, Microsoft, and Palo Alto Networks, all of which are seeing a surge of activity from enterprises based on discussions with CIOs and CISOs. Absolute Software’s Persistence self-healing endpoint security technology is embedded in the firmware of more than 500 million devices and gives CIOs, CISOs and their teams complete visibility and control over devices and data. Absolute is the leading visibility and control platform that provides enterprises with tamper-proof resilience and protection of all devices, data, and applications.

Like Absolute, Microsoft provides built-in endpoint protection at the device level, with its core focus being the OS. Windows 10 now has Windows Defender Antivirus integrated at the OS level, delivering the same protection System Center Endpoint Protection provides on the Windows 7 and 8 OS. The Microsoft Defender Advanced Threat Protection (ATP) incident response console aggregates alerts and incident response activities across Microsoft Defender ATP, Office 365 ATP, Azure ATP, and Active Directory, in addition to Azure.

Further evidence of how enterprise customers are placing a high priority on endpoint security is the increase in valuations of key providers in this market, including Absolute Software (TSE: ABT) and others. Absolute’s stock price has jumped 13% in just a month, following their latest earnings announcement on November 12th. Absolute’s CEO Christy Wyatt commented during the company’s most recent earnings call that, “The ability to utilize near real-time data from the endpoint to… to deliver actionable insights to IT about where controls are failing and the ability to apply resilience to self-heal and reinforce those security controls will become a critical skill for every one of our customers. This is the essence of Absolute’s platform, which adds resiliency to our customers’ operations.” It’s evident from what CIOs and CISOs are saying that resiliency is transforming endpoint security today and will accelerate in 2020.

Key Takeaways From Conversations With Enterprise Cybersecurity Leaders

The conversations with CIOs, CISOs, and IT Directors provided valuable insights into why resiliency is becoming a high priority for endpoint security strategies today. The following are key takeaways from the conversations:

  • Known humorously as the “fun button,” cybersecurity teams enjoy being able to brick any device at any time while monitoring the activity happening on it in real time. One CIO told the story of how their laptops had been given to a service provider who was supposed to destroy them to stay in compliance with the Health Insurance Portability and Accountability Act (HIPAA), and one had been resold on the black market, ending up in a third-world nation. As the hacker attempted to rebuild the machine, the security team watched as each new image was loaded, at which time they would promptly brick the machine. After 19 tries, the hacker gave up and called the image rebuild “brick me.”
  • IT budgets for 2020 are flat or slightly up, with many CIOs being given the goal of reducing asset management operating expenses, making resiliency ideal for better managing device costs. The more effectively assets are managed, the more secure an organization becomes. That’s another factor motivating enterprises to adopt resiliency as a core part of their endpoint security strategies.
  • One CIO was adamant they had nine software agents on every endpoint, but Absolute’s Resilience platform found 16, saving the enterprise from potential security gaps. The gold image the enterprise IT team was using had inadvertently captured only a subset of the software agents active on their endpoints. Absolute’s Resilience offering and Persistence technology enabled the CIO to discover gaps in endpoint security the team didn’t know existed before.
  • Endpoints enabled with resiliency have proven their ability to autonomously self-heal, earning the trust of CIOs and CISOs, who are adopting Absolute to alleviate costly network interruptions and potential breaches in the process. 19% of endpoints across a typical IT network require at least one client or patch management repair monthly, according to Absolute’s 2019 Endpoint Security Trends Report. The report also found that increasing security spending on protecting endpoints doesn’t increase an organization’s safety – and in some instances, reduces it. Having a systematic, designed-in solution to these challenges gives CIOs, CISOs, and their teams greater peace of mind and reduces expensive interruptions and potential breaches that impede their organizations’ growth.

 

Machine Learning Is Helping To Stop Security Breaches With Threat Analytics

Bottom Line: Machine learning is enabling threat analytics to deliver greater precision regarding the risk context of privileged users’ behavior, creating notifications of risky activity in real time, while also being able to actively respond to incidents by cutting off sessions, adding additional monitoring, or flagging for forensic follow-up.

Separating Security Hacks Fact from Fiction

It’s time to demystify the scale and severity of breaches happening globally today. A commonly held misconception is that millions of hackers have gone to the dark side and are orchestrating massive attacks on any and every business that is vulnerable. The facts are far different and reflect a much more brutal truth, which is that businesses make themselves easy to hack into by not protecting their privileged access credentials. Cybercriminals aren’t expending the time and effort to hack into systems; they’re looking for ingenious ways to steal privileged access credentials and walk in the front door. According to Verizon’s 2019 Data Breach Investigations Report, ‘Phishing’ (as a precursor to credential misuse), ‘Stolen Credentials’, and ‘Privilege Abuse’ account for the majority of threat actions in breaches (see page 9 of the report).

It only really takes one compromised credential to potentially impact millions — whether it’s millions of individuals or millions of dollars. Undeniably, identities and the trust we place in them are being used against us. They have become the Achilles heel of our cybersecurity practices. According to a recent study by Centrify among 1,000 IT decision makers, 74% of respondents whose organizations have been breached acknowledged that it involved access to a privileged account. This number closely aligns with Forrester Research’s estimate “that at least 80% of data breaches . . . [involved] compromised privileged credentials, such as passwords, tokens, keys, and certificates.”

While the threat actors might vary according to Verizon’s 2019 Data Breach Investigations Report, the cyber adversaries’ tactics, techniques, and procedures are the same across the board. Verizon found that the fastest-growing source of threats is internal actors, as the graphic from the study illustrates below:


Internal actors are the fastest growing source of breaches because they’re able to obtain privileged access credentials with minimal effort, often obtaining them through legitimate access requests to internal systems or harvesting their co-workers’ credentials by going through the sticky notes in their cubicles. Privileged credential abuse is a challenge to detect as legacy approaches to cybersecurity trust the identity of the person using the privileged credentials. In effect, the hacker is camouflaged by the trust assigned to the privileged credentials they have and can roam internal systems undetected, exfiltrating sensitive data in the process.

The reality is that many breaches can be prevented by some of the most basic Privileged Access Management (PAM) tactics and solutions, coupled with a Zero Trust approach. Most organizations invest the largest chunk of their security budget in protecting their network perimeter rather than in the security controls that can effect positive change to protect against the leading attack vector: privileged access abuse.

The bottom line is that investing in securing perimeters leaves the most popular attack vector of all unprotected: privileged credentials. Making PAM a top priority is crucial to protecting any business’s most valuable assets: its systems, data, and the intelligence they provide. Gartner has listed PAM on its Top 10 Security Projects for the past two years for good reason.

Part of a cohesive PAM strategy should include machine learning-based threat analytics to provide an extra layer of security that goes beyond a password vault, multi-factor authentication (MFA), or privilege elevation.

How Machine Learning and Threat Analytics Stop Privileged Credential Abuse 

Machine learning algorithms enable threat analytics to immediately detect anomalies and abnormal behavior by tracking login behavioral patterns, geolocation, and time of login, among many other variables, to calculate a risk score. Risk scores are calculated in real time and determine whether access is approved, whether additional authentication is needed, or whether the request is blocked entirely.
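As a concrete illustration, here is a minimal sketch of such a risk-scoring step. The features, weights, and thresholds are toy assumptions for illustration only, not any vendor’s scoring model:

```python
# Hypothetical sketch of a risk-scoring step in a threat analytics engine.
from dataclasses import dataclass
from math import log

@dataclass
class LoginEvent:
    hour_of_day: int           # 0-23
    geo_distance_km: float     # distance from the user's usual login location
    failed_attempts: int       # recent failed logins for this account
    privilege_elevation: bool  # does the request elevate privileges?

def risk_score(event, usual_hours):
    """Combine behavioral signals into a 0-100 risk score (toy weighting)."""
    score = 0.0
    if event.hour_of_day not in usual_hours:
        score += 25.0                                        # login outside the normal window
    score += min(30.0, 5 * log(1 + event.geo_distance_km))   # unusual geolocation
    score += min(25.0, 8 * event.failed_attempts)            # brute-force signal
    if event.privilege_elevation:
        score += 20.0                                        # privileged actions weigh more
    return min(100.0, score)

def access_decision(score):
    """Map the score to the three outcomes described above."""
    if score < 40:
        return "approve"
    if score < 70:
        return "require additional authentication"
    return "block"

event = LoginEvent(hour_of_day=3, geo_distance_km=8200.0,
                   failed_attempts=2, privilege_elevation=True)
s = risk_score(event, usual_hours=set(range(8, 19)))
print(f"risk={s:.0f} -> {access_decision(s)}")
```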

Machine learning-based threat analytics also provide the following benefits:

  • New insights into privileged user access activity based on real-time data related to unusual recent privilege changes, commands run, targets accessed, and privilege elevations.
  • Greater understanding of the specific risk nature of individual events, computing a risk score in real time for every event, expressed as a high, medium, or low level for any anomalous activity.
  • The ability to isolate, identify, and track which security factors triggered an anomaly alert.
  • The ability to capture, play, and analyze video sessions of anomalous events within the same dashboard used for tracking overall security activity.
  • Customizable alerts that provide context-relevant visibility and session recording and can also deliver notifications of anomalies, all leading to quicker, more informed investigative action.

What to Look for In Threat Analytics 
Threat analytics providers are capitalizing on machine learning to improve the predictive accuracy and usability of their applications continually. What’s most important is for any threat analytics application or solution you’re considering to provide context-aware access decisions in real time. The best threat analytics applications on the market today are using machine learning as the foundation of their threat analytics engine. These machine learning-based engines are very effective at profiling the normal behavior pattern for any user on any login attempt, or any privileged activity including commands, identifying anomalies in real time to enable risk-based access control. High-risk events are immediately flagged, alerted, notified, and elevated to IT’s attention, speeding analysis, and greatly minimizing the effort required to assess risk across today’s hybrid IT environments.

The following is the minimum set of features to look for in any privilege threat analytics solution:

  • Immediate visibility with a flexible, holistic view of access activity across an enterprise-wide IT network and extended partner ecosystem. Look for threat analytics applications that provide dashboards and interactive widgets to better understand the context of IT risk and access patterns across your IT infrastructure. Applications that let you tailor security policies to every user’s behavior and automatically flag risky actions or access attempts give you immediate visibility into account risk, eliminating the overhead of sifting through millions of log files and massive amounts of historical data.
  • Intuitively designed, customizable threat monitoring and investigation screens, workflows, and modules. Machine learning is enabling threat analytics applications to deliver more contextually relevant and data-rich insights than has ever been possible. Look for threat analytics vendors who offer intuitively designed and customizable threat monitoring features that provide insights into anomalous activity with a detailed timeline view. The best threat analytics vendors can identify the specific factors contributing to an anomaly for a comprehensive understanding of a potential threat, all from a single console. Security teams can then view system access and anomaly detection in high resolution with analytics tools such as dashboards, explorer views, and investigation tools.
  • Support for easy integration with Security Information and Event Management (SIEM) tools. Privileged access data is captured and stored to enable querying by log management and SIEM reporting tools. Make sure any threat analytics application you’re considering has installed, working integrations with SIEM tools and platforms such as Micro Focus® ArcSight™, IBM® QRadar™, and Splunk® to identify risks or suspicious activity quickly.
  • Support for alert notification through integration with Webhook-enabled endpoints. Businesses getting the most value out of their threat analytics applications are integrating with Slack or existing onboard incident response systems such as PagerDuty to enable real-time alert delivery, eliminating the need for multiple alert touchpoints and improving time to respond. When an alert event occurs, the threat analytics engine allows the user to send alerts into third-party applications via Webhook, enabling the user to respond to a threat alert and contain the impact of a breach attempt. A minimal sketch of this delivery step appears after this list.
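The following is a minimal sketch of that Webhook delivery step, assuming a Slack-style incoming webhook; the URL, payload shape, and function name are illustrative assumptions, not any vendor’s API:

```python
# Hypothetical sketch: deliver a threat alert to a Webhook-enabled endpoint.
import json
import urllib.request

def send_alert(webhook_url, message, severity):
    """POST a JSON alert payload to the endpoint and return the HTTP status."""
    payload = {"text": f"[{severity.upper()}] {message}"}  # Slack-style body
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
        return resp.status

# Example call (hypothetical URL):
# send_alert("https://hooks.slack.com/services/T000/B000/XXXX",
#            "Anomalous privilege elevation on host db-prod-07", "high")
```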

Conclusion 
Centrify, Forrester, Gartner, and Verizon each have used different methodologies and reached the same conclusion from their research: privileged access abuse is the most commonly used tactic for hackers to exfiltrate sensitive data. Breaches based on privileged credential abuse are extremely difficult to stop, as these credentials often have the greatest levels of trust and access rights associated with them. Leveraging threat analytics applications using machine learning that is adept at finding anomalies in behavioral data and thwarting a breach by denying access is proving very effective against privileged credential abuse.

Companies, including Centrify, use risk scoring combined with adaptive MFA to empower a least-privilege access approach based on Zero Trust. This Zero Trust Privilege approach verifies who or what is requesting privileged access, the context behind the request, and the risk of the access environment to enforce least privilege. These are the foundations of Zero Trust Privilege and are reflected in how threat analytics apps are being created and improved today.

Gartner Top 10 Strategic Technology Trends For 2016

Gartner announced their top 10 strategic technology trends for 2016 at the Gartner Symposium/ITxpo held October 4 – 8th in Orlando. David Cearley, Vice President and Gartner Fellow, presented the company’s Top Ten Strategic Technology Trends for 2016. You can find the video here.

Key take-aways from his presentation and the trends announced are provided below:

  • Enterprise 3D-printing shipments will attain a 64.1% Compound Annual Growth Rate (CAGR) through 2019. David Cearley mentioned during his keynote that jet engines are being 3D printed today.  He gave the example to illustrate that 3D printing will continue to gain adoption in more demanding manufacturing environments including aerospace, automotive, energy, medical devices and military-based markets and industries.
  • Emergence of an entirely new class of business models based on smart machine technologies, advanced analytics and big data. Combining machine learning, continued adoption of Internet of Things (IoT) sensors and supporting data models, and advanced intelligence to interpret and act on the data, Gartner’s predictions set the stage for an entirely new class of business models. Manufacturing-as-a-Service and paying only for the production time used in a factory are within reach for more companies than ever before based on these predictions.
  • The device mesh will expand to include IoT-based devices that scale well beyond the enterprise. Gartner is predicting that in the next three years traditional computing and communication devices, including desktop and mobile devices, will increasingly be augmented by wearable devices, home electronics including appliances with sensors, transportation-based sensors and data collection devices, and environmental devices all capable of capturing data in real time.
  • A digital mesh will continue to proliferate, aligning apps and devices to individuals’ specific roles and tasks.  Gartner sees this digital mesh as an expanding series of devices, services, platforms, informational networks and individuals that integrate together and provide contextual intelligence, enabling greater collaboration. The proliferation of the digital mesh will lead to more ambient, contextually intelligent and intuitive app design over time, Gartner predicts.
  • The next twelve months will also see the proliferation of algorithm-based businesses in which smart machines handle automated background tasks. Gartner’s technology trends for 2016 set a solid foundation for the growth of globally based smart factories and production centers. Acumatica, Plex Systems and other Cloud ERP providers are ideally positioned for this trend, having proven their ability to provide manufacturing intelligence from the shop floor to the top floor. In addition to cloud platforms, these algorithm-based businesses will need to support unstructured data analysis including latent semantic indexing (LSI), data taxonomy and classification algorithms to ensure data fidelity and scalability, and more robust analytics and predictive modeling systems.
  • Combining algorithms, analytics, data architectures and smart machines has the potential to revolutionize manufacturing quickly. General Electric’s Predix platform, IBM’s IoT Foundation and several other cloud-based IoT platforms are already making progress on transforming the vision of algorithm-based smart machine production strategies into a reality for manufacturers globally.
  • Gartner sees a new IT reality taking shape. Adaptive security, advanced systems, Internet of Things (IoT), mesh app & service architectures are the catalysts of the new nature of IT that Gartner is predicting.

A graphic illustrating the top 10 strategic trends is shown below:

[Figure: Gartner’s Top 10 Strategic Technology Trends for 2016]

Sources:

Gartner Identifies the Top 10 Strategic Technology Trends for 2016.  Press Release Announcement, October 6, 2015.

Video replay of the keynote: The Top 10 Strategic Technology Trends for 2016

Roundup of Cloud Computing & Enterprise Software Market Estimates and Forecasts, 2013

When the CEO of a rust-belt manufacturer speaks of cloud computing as critical to his company’s business strategies for competing globally, it’s clear a fundamental shift is underway.

Nearly every manufacturing company I’ve spoken with in the last ninety days has a mobility roadmap and is also challenged to integrate existing ERP, pricing and fulfillment systems into next-generation selling platforms.

One of the most driven CEOs I’ve met in manufacturing implemented a cloud-based channel management, pricing, quoting and CRM system to manage direct sales and a large distributor network across several countries.  Manufacturers are bringing an entirely new level of pragmatism to cloud computing, quickly deflating its hype by pushing for results on the shop floor.

There’s also been an entirely new series of enterprise software and cloud computing forecasts and market estimates published.  I’ve summarized the key take-aways below:

  • Enterprise sales of ERP systems will grow to $32.9B in 2016, attaining a 6.7% CAGR in the forecast period of 2011 to 2016.   CRM is projected to be an $18.6B global market by 2016, attaining a CAGR of 9.1% from 2011 to 2016.   The fastest-growing category of enterprise software will be Web conferencing and teaming platforms, growing at a 12.4% CAGR through the forecast period.  The following graphic compares 2011 actual sales and the latest forecast for 2016 by enterprise software product category.  Source:  Gartner’s Forecast Analysis: Enterprise Application Software, Worldwide, 2011-2016, 4Q12 Update Published: 31 January 2013

[Figures: enterprise software spending by category, 2011 actuals vs. 2016 forecast; cloud computing and public cloud forecasts; Forrester Wave]

  • IDC is predicting Cloud Services and enablement spending will hit $60 billion, growing at 26% through the year, and that over 80% of new apps will be distributed and deployed on cloud platforms.  IDC also predicts that 2.5% of legacy packaged enterprise apps will start migrating to clouds.  Source: Top 10 Predictions, IDC Predictions 2012: Competing for 2020 by Frank Gens. You can download a copy of the IDC Predictions here: http://cdn.idc.com/research/Predictions12/Main/downloads/IDCTOP10Predictions2012.pdf

How Cloud Computing Is Accelerating Context-Aware Coupons, Offers and Promotions

Retailers and marketers often face the challenge of getting coupons, offers and promotions delivered at the perfect time and in the right context to their customers.

The rapid advances in cyber foraging, contextual computing and cloud computing platforms are revolutionizing this aspect of the retail shopping experience.  Context-aware advertising platforms and strategies can also provide precise audience and segment-based messaging directly to customers while they are in the store or retail outlet.

What makes context-aware advertising so unique and well adapted to the cloud is the real-time data integration and contextual intelligence used for tailoring and transmitting offers to customers.  When a customer opts in to a retailer’s contextually based advertising system, they are periodically sent alerts, coupons, and offers on products of interest once they are in or near the store.  Real-time offer engines choose which alerts, coupons or offers to send, when, and in which context.  Cloud-based analytics and predictive modeling applications will be used for further fine-tuning of alerts, coupons and offers as well.  The ROI of each campaign, even to a very specific audience, will be measurable.  Companies investing in cloud-based contextual advertising systems include Apple, Google, Greystripe, Jumptap, Microsoft, Millennial Media, Velti and Yahoo.

Exploring the Framework of Me Marketing and Context-Aware Offers

A few years ago, a student in one of my MBA courses in international marketing did their dissertation on cyber foraging and contextual mobile applications’ potential use for streamlining business travel throughout Europe.  As a network engineer for Cisco at the time, he viewed the world very systemically; instead of getting frustrated with long waits he would dissect the problem and look at the challenges from a system-centric view.  The result was a great dissertation on cyber foraging and the potential use of Near Field Communications (NFC) and Radio Frequency Identification (RFID) as sensors to define contextual location and make business travel easier.  One of the greatest benefits of teaching, even part-time, is the opportunity to learn so much from students.

I’ve been following this area since, and when Gartner published Me Marketing: Get Ready for the Promise of Real-Time, Context-Aware Offers in Consumer Goods this month, I immediately read it.  Gartner defines Me Marketing as real-time, context-aware offers in grocery stores. Given the abundance of data on transactions that occur in grocery stores, Gartner is predicting this will be the most popular and fastest-growing area of context-aware offers.  The formula for Me Marketing is shown below:

[Figure: the Me Marketing formula for context-aware offers]

The four steps of the Me Marketing formula are briefly described as follows:

  • Consumer Insight and Permission – The first step of the framework and the most difficult from a change management standpoint, this requires customers to opt in to receiving alerts, coupons, offers and promotions.  The best retailers have also invested heavily in security and authentication technologies here.
  • Delivery Mechanism and In-the-Moment Context – The real-time offer engine is used to determine which coupons, offers and promotions are best suited for a specific customer based on their shopping patterns, preferences and locations.
  • Select Best Offer – Next, the real-time offer engine defines a very specific product or service offer based on location, previous purchase history, social media analysis, predictive and behavioral analysis, and previously learned patterns of purchasing (a toy sketch of this step follows the list).
  • Redemption – The purchase of the item offered.  Initial pilots have shown that less frequent yet highly relevant, targeted offers have a higher redemption rate.  It is encouraging to see that early tests of these systems show that spamming customers leads to immediate opt-outs and, in some cases, to customers shopping at competitors.
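Here is a toy sketch of the “select best offer” step described above; the fields, weights, and scoring logic are illustrative assumptions, not Gartner’s framework or any retailer’s engine:

```python
# Toy sketch of a real-time offer engine's "select best offer" step.
from dataclasses import dataclass

@dataclass
class Offer:
    product: str
    discount_pct: int
    aisle: str  # where the promoted product sits in the store

def select_best_offer(offers, purchase_history, current_aisle):
    """Rank offers by relevance: purchase history plus in-store location context."""
    def score(offer):
        s = 0.0
        if offer.product in purchase_history:
            s += 2.0                     # previously purchased -> likely relevant
        if offer.aisle == current_aisle:
            s += 1.5                     # customer is standing near the product
        s += offer.discount_pct / 100.0  # mild preference for deeper discounts
        return s
    return max(offers, key=score)

offers = [Offer("coffee", 20, "aisle-4"), Offer("cereal", 10, "aisle-2")]
best = select_best_offer(offers, purchase_history={"coffee"}, current_aisle="aisle-4")
print(best.product)  # -> coffee
```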

A Short Overview of Contextual Advertising and the Cloud

Cloud-based systems and applications are necessary for retailers to gain the full value that contextual advertising can provide.  This includes the social context, with specific focus on aggregation and analysis of Social CRM, CRM, and social media content, in addition to behavioral analytics and sentiment analysis.  It also includes the previous browsing, purchasing, returns and prices paid by product for each customer.  Cloud-based integration architectures are necessary for making contextual advertising a reality in several hundred or even thousands of retail stores at the same time.

Geographical data and analysis are also essential.  RFID has often been included in cyber foraging and contextual advertising pilots, in addition to NFC.  As Global Positioning System (GPS) chipsets have dropped in price and become more accurate, companies including Google, Microsoft and Yahoo are basing their contextual advertising platforms on them.  Finally, the activity or task also needs to have a contextual definition.

Combining all three of these elements gives the context of the customer in the retail store.  The figure below is from Three-Dimensional Context-Aware Tailoring of Information.  This study also took into account how personas are used by companies building cloud-based contextual advertising systems.  The taxonomies shown in the figure are used for building personas of customers.

[Figure: taxonomies for three-dimensional context-aware tailoring, from Three-Dimensional Context-Aware Tailoring of Information]
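To make the three context dimensions concrete, here is a minimal sketch of the kind of context record such a system might assemble; the field names, values, and eligibility rule are illustrative assumptions only:

```python
# Sketch of the three context dimensions (user, location, activity) combined
# into one record used to tailor an offer; all fields are assumptions.
from dataclasses import dataclass

@dataclass
class ShopperContext:
    user_persona: str  # e.g., "value-seeker", built from a purchase taxonomy
    location: str      # e.g., an NFC/RFID/GPS-derived zone like "store-12/aisle-4"
    activity: str      # e.g., "browsing", "dwelling", "checkout"

def offer_eligible(ctx):
    """Only tailor an offer when the shopper is in-store and not mid-checkout."""
    return ctx.location.startswith("store-") and ctx.activity != "checkout"

ctx = ShopperContext("value-seeker", "store-12/aisle-4", "dwelling")
print(offer_eligible(ctx))  # -> True
```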

There are many pilot projects and enterprise-wide system tests going on right now in the area of cloud-based contextual advertising.  One of the more interesting is an application suite created entirely on Google App Engine, Android, and Cloud Services.  The pilot is explained in the study Exploring Solutions for Mobile Companionship: A Design Research Approach to Context-Aware Management.  The following figure shows a diagram of the suite.  This pilot uses Cloud to Device Messaging (C2DM), part of the Android API, to link the Google App Engine server and the Android client.  Google will most likely add more depth of support for C2DM as it plays a critical role in contextual system development.

[Figure: diagram of the context-aware application suite built on Google App Engine and Android]

Benefits of a Cloud-based Contextual Advertising Platform

For the customer, cloud-based advertising systems will learn their preferences over time and eventually feed into the demand planning and forecasting systems of retailers.  This translates into the customer-centric benefit of products being out of stock less often.  In addition, customers will receive more relevant offers.  The entire shopping experience will be more pleasant, with expectations being met more often.

For the retailer, better management of product categories and more effective gross margin growth will be possible. Having real-time analytics of each coupon, offer and promotion will also give them immediate insights into which of their selling strategies are working or not.

For the manufacturer, it will finally be possible to understand how customers respond at the store level to promotions and programs, including the results of co-op fund investments and pricing strategies.  The manufacturers who partner with retailers using these systems will also have a chance at attaining greater product differentiation, as their coupons, offers and promotions will only go to the most relevant customers.

References:

Me Marketing: Get Ready for the Promise of Real-Time, Context-Aware Offers in Consumer Goods Published: 24 December 2012 Analyst(s): Don Scheibenreif, Dale Hagemeyer

Grønli, T.-M., Ghinea, G., & Bygstad, B. (2013). Exploring Solutions for Mobile Companionship: A Design Research Approach to Context-Aware Management. International Journal of Information Management, 33(1), 227. http://www.sciencedirect.com/science/article/pii/S0268401212001259

Grønli, T.-M., & Ghinea, G. (2010). Three-Dimensional Context-Aware Tailoring of Information. Online Information Review, 34(6), 892-906. http://www.emeraldinsight.com/journals.htm?articleid=1896452

First Steps to Creating a Cloud Computing Strategy for 2013

2013 will be one of the most pivotal years for cloud computing because trust in these technologies is on the line.

Expectations are high regarding these technologies’ ability to deliver business value while reducing operating costs.  Enterprises’ experiences have at times met these high expectations, yet too often deliver mixed results.  Managing cloud expectations at the C-level is quickly emerging as one of the most valuable skills in 2013. The CIOs best at this are business strategists who regularly review with their line-of-business counterparts what is and isn’t working.  These CIOs who are excelling as strategists are also creating and continually evaluating their cloud computing plans for 2013.  They are focusing on plans that capitalize on the best of what cloud computing has to offer while minimizing risks.

CIOs excelling as strategists are also using cloud computing planning to punch through the hype and make cloud technologies real from a customer, supplier and internal efficiency standpoint.  Lessons learned from these cloud computing planning efforts in enterprises are provided below:

  • Cloud computing needs to mature more to take on all enterprise applications, so plan for a hybrid IT architecture that provides both agility and security.  This is a common concern among CIOs in the manufacturing and financial services industries especially.  As much as the speed of deployment, customization and subscription-based models attract enterprises to the cloud, the difficult problems of security, legacy system integration, and licensing slow its adoption.  There is not enough trust in the cloud yet to move the entire IT infrastructure there in the majority of manufacturing companies I’ve spoken with.
  • Reorganizing IT to deliver greater business agility and support of key business initiatives will be a high priority in 2013.  The gauntlet has been thrown at the feet of many CIOs this year: become more strategic and help the business grow now.  Cloud is part of this, yet it is not the primary catalyst; the need to increase sales is.  IT organizations will increasingly reflect a more service-driven, not technology-based, approach to delivering information and intelligence to the enterprise as a result.
  • Recruiting, training and retaining cloud architects, developers, engineers, support and service professionals will be a challenge even for the largest enterprises.  There isn’t enough talent to go around for all the projects going on and planned right now.  State Farm Insurance, for example, has 1,000 software engineers working on their mobility applications for claims processing and quoting.  And they are hiring more.  Certifications in cloud technologies are going to be worth at least a 30 to 50% increase in salary in specific positions. This is very good news for engineers who want to differentiate themselves and get ahead in their careers, both financially and from a management standpoint.
  • Measuring the contributions of operating expense (OPEX) reductions is going to become commonplace in 2013.  From the cloud computing plans I’ve seen, OPEX is being tracked with greater accuracy than in any other year and will be a strong focus in the future.  The capital expense (CAPEX) savings are clear, yet OPEX savings in many cases aren’t. Cloud computing’s greatest wins in the enterprise continue to be in non-mission-critical areas of the business.  This is changing as cloud-based ERP systems gain adoption within businesses that are constrained by monolithic ERP systems from decades ago.  Plex Systems is a leader here and one to watch if you are interested in this segment of enterprise software.  SaaS is dominating in the area of lower application costs and high user counts, which is the Public Computing Sweet Spot in the following graphic:

[Figure: the Public Computing Sweet Spot, from Gartner’s 2013 Cloud Computing Planning Guide]

Source: 2013 Cloud Computing Planning Guide: Rising Expectations Published: 1 November 2012 Analysts: Drue Reeves, Kyle Hilgendorf

  • Start building a SaaS application review framework including Service Level Agreement (SLA) benchmarks to drive greater transparency by vendors.  Gartner forecasts that the SaaS-based cloud market will grow from $12.1B in 2013 to $21.3B in 2015, with the primary growth factors being ease of customization and speed of deployment. CIOs and their staffs have SaaS frameworks already in place, often with specific levels of performance defined, including security and multitenancy audits.  SLAs are going to be a challenge, however, as many vendors are inflexible and will not negotiate them. At a minimum, make sure cloud service providers and cloud management platforms (CMP) have certifications for ISO 27001 and Statements on Standards for Attestation Engagements (SSAE) No. 16, as this shows the provider is making investments in availability, security and performance levels.
  • Create a Cloud Decision Framework to keep technology evaluations and investments aligned with business strategies.  Business and application assessments and the vendor selection process need to take into account application requirements, the role of external cloud resources, and how the RFI will be structured. These process areas will vary by type of company – yet concentrating on application requirements goes a long way toward reducing confusion and forcing trade-offs in the middle of a review cycle.  The following is an example of a Cloud Decision Framework:

[Figure: sample Cloud Decision Framework, from Gartner’s 2013 Cloud Computing Planning Guide]

Source: 2013 Cloud Computing Planning Guide: Rising Expectations Published: 1 November 2012 Analysts: Drue Reeves, Kyle Hilgendorf

  • Mitigating risk and liability through intensive due diligence needs to become any cloud-based company’s core strength.  Regardless of how the HP-Autonomy litigation is resolved, it is a powerful cautionary tale of the need for due diligence.  And let’s face it: there are way too many SaaS companies chasing too few dollars in the niche areas of enterprise software today.  A shakeout is on the way; the market just can’t sustain so many vendors.  To reduce risk and liability, ask to see the financial statements (especially if the vendor is private), get references and visit them, meet with engineering to determine how real the product roadmap is, and require an SLA.  Anyone selling software as SaaS will also have revenue recognition issues; be sure to thoroughly understand how they are accounting for sales.
  • Design in security management at the cloud platform level, including auditing and access control by role in the organization.  One manufacturing company I’ve been working with has defined security at this level and has been able to quickly evaluate SaaS-based manufacturing, pricing and services systems by their security integration compatibility.  This has saved thousands of dollars in security-based customizations to meet the manufacturer’s corporate standards.

Bottom line: 2013 is the make-or-break year for cloud in the enterprise, and getting started on a plan will help your organization quickly cut through the hype and see which providers can deliver value.

Cloud Computing and Enterprise Software Forecast Update, 2012

The latest round of cloud computing and enterprise software forecasts reflects the growing influence of analytics, legacy systems integration, mobility and security on IT buyers’ decisions.

Bain & Company and Gartner have moved beyond aggregate forecasts, and are beginning to forecast by cloud and SaaS adoption stage.  SAP is using the Bain adoption model in their vertical market presentations today.

Despite predictions of the demise of enterprise software, the forecasts and sales cycles I’ve been involved with indicate market growth.  Mobility and cloud computing are the catalysts of rejuvenation in many enterprise application areas and are accelerating sales cycles.  Presented in this roundup are market sizes, forecasts and compound annual growth rates (CAGRs) for ten enterprise software segments.

Key take-aways from the latest cloud computing and enterprise software forecasts are provided below:

  • Public and private cloud computing will be strong catalysts of server growth through 2015.  IDC reports that $5.2B in worldwide server revenue was generated in 2011 on 885,000 units sold.  IDC is forecasting a $9.4B global market by 2015, resulting in 1.8 million servers sold. Source:  IDC Worldwide Enterprise Server Cloud Computing 2011–2015 http://www.idc.com/getdoc.jsp?containerId=228916 
  • IDC reports that enterprise cloud application revenues reached $22.9B in 2011 and are projected to reach $67.3B by 2016, attaining a CAGR of 24%.  IDC also predicts that by 2016, $1 of every $5 will be spent on cloud-based software and infrastructure. Report, Worldwide SaaS and Cloud Software 2012–2016 Forecast and 2011 Vendor Shares, Link: http://www.idc.com/getdoc.jsp?containerId=236184
  • 11% of companies are transformational, early adopters of cloud computing, attaining 44% adoption (as defined by % of MIPS) in 2010, growing to 49% in 2013.  This same segment will reduce their reliance on traditional, on-premise software from 34% to 30% in the same period according to Bain & Company’s cloud computing survey results shown below.  SAP is using this adopter-based model in their vertical market presentations, an example of which is shown here.

  • The global Platform-as-a-Service (PaaS) market is growing from $900M in 2011 to $2.9B in 2016, achieving a 26.6% CAGR.  At this projected rate, PaaS will generate an average of $360M a year in revenue between 2011 and 2016.  Gartner projects that the largest segment will be Application Platform Services (aPaaS), which generated 35% of total PaaS spending in 2011, followed by cloud application lifecycle services (12.5%).    Source: Market Trends: Platform as a Service, Worldwide, 2012-2016, 2H12 Update Published: 5 October 2012 ID:G00239236.

  • The three most popular net-new SaaS solutions deployed are CRM (49%), Enterprise Content Management (ECM) (37%) and Digital Content Creation (35%).  The three most-replaced on-premise applications are Supply Chain Management (SCM) (35%), Web conferencing, teaming platforms and social software suites (34%) and Project & Portfolio Management (PPM) (33%). The following graphic shows the full distribution of responses. Source: User Survey Analysis: Using Cloud Services for Mission-Critical Applications Published: 28 September 2012

  • In 2011, the worldwide enterprise application software market generated $115.1B in revenue and is projected to grow to $157.6B by 2016, attaining a 6.5% CAGR in the forecast period. Gartner reports that 38% of worldwide enterprise software revenue is from maintenance and technical support; 17% from subscription payments; and 56% from ongoing revenue, including new purchases.  An analysis of the ten enterprise software markets and their relative size and growth is shown in the figure below, along with a table showing relative rates of growth from 2011 to 2016. Source: Forecast: Enterprise Software Markets, Worldwide, 2011-2016, 3Q12 Update Published: 12 September 2012 ID:G00234766

SaaS Adoption Accelerates, Goes Global in the Enterprise

In working with manufacturers and financial services firms over the last year, one point is becoming very clear: SaaS is gaining trust as a solid alternative for global deployments across the enterprise.  And this trend has been accelerating in the last six months.  One case in point is a 4,000 seat SaaS CRM deployment going live in Australia, Europe, and the U.S. by December of this year.

What’s noteworthy about this shift is that just eighteen months ago, this Australia-based manufacturer was only considering SaaS as an enhancement to their on-premises CRM system.  What changed?  The European and U.S. distribution and sales offices were on nearly 40 different CRM, quoting, proposal and pricing systems.  It was nearly impossible to track global opportunities.

Meanwhile, business was booming in Australia, and there were up-sell and cross-sell opportunities being missed at the U.S.- and European-based headquarters of their prospects. The manufacturer chose to move to a global SaaS CRM solution quickly.  Uniting all three divisions with a global sales strategy forced the consolidation of 40 different quoting, pricing and CRM systems in the U.S. alone.  What they lost in complexity they are looking to pick up in global customer sales.

Measuring Where SaaS Is Cannibalizing On-Premise Enterprise Applications

Gartner’s Market Trends: SaaS’s Varied Levels of Cannibalization to On-Premises Applications, published 29 October 2012, breaks out the percentage of SaaS revenue for ten different enterprise application categories.  The greener the color, the greater the adoption.  As was seen with the Australian manufacturer, CRM continues to dominate this trend of SaaS cannibalizing on-premise enterprise applications.

Additional take-aways from this report include the following:

  • Perceived lower Total Cost of Ownership (TCO) continues to be the dominant reason enterprises are considering SaaS adoption, with 50% of respondents in 2012 mentioning this as the primary factor in their decision.
  • CRM is leading all other enterprise application areas in net new deployments according to the Gartner study, with the majority of on-premise replacements being in North America and Europe.
  • Gartner projects that by 2016 more than 50% of CRM software revenue will be delivered by SaaS.  As of 2011, 35% of CRM software was delivered on the SaaS platform.  Gartner expects to see SaaS-based CRM grow at three times the rate of on-premise applications.
  • 95% of Web analytics functions are delivered via the SaaS model, whereas only 40% of sales functions use the cloud today, according to the findings of this study.
  • The highest adoption rates of SaaS-based applications include sales, customer service, social CRM and marketing automation.
  • SaaS-based ERP will continue to be a small percentage of the total market, attaining 10% cannibalization by 2012.  Forrester has consistently said this is 13%, growing to 16% by 2015.
  • Office suites and digital content creation (DCC) will attain compound annual growth rates (CAGRs) of 40.7% and 32.2%, respectively, from 2011 through 2016. Gartner is making the assumption that consumers and small businesses will continue to be the major forces for Web-based office suites through 2013.
  • The four reasons companies don’t choose SaaS are uncertainty about whether it is the right deployment option (36%), satisfaction with existing on-premise applications (30%), no further requirements (33%), and being locked into their current solution by expensive contractual requirements (14%).

Bottom Line: Enterprises and their need to compete with greater accuracy and speed are driving the cannibalization of on-premise applications faster than many anticipated; enterprise software vendors need to step up and get in front of this if they are going to retain their greatest sources of revenue.

Source:  Market Trends: SaaS’s Varied Levels of Cannibalization to On-Premises Applications Published: 29 October 2012 written by Chad Eschinger, Joanne M. Correia, Yanna Dharmasthira, Tom Eid, Chris Pang, Dan Sommer, Hai Hong Swinehart and Laurie F. Wurster

Using Search Analytics To See Into Gartner’s $232B Big Data Forecast

By combining search analytics with the latest Gartner forecast on big data published last week, it’s possible to get a glimpse into this area’s highest-growth industry sectors.  Big data is consistently a leading search term on Gartner.com, which is the basis of the twelve months of data used for the analysis.

In addition, data from Gartner’s latest report, Big Data Drives Rapid Changes in Infrastructure and $232 Billion in IT Spending Through 2016 by Mark A. Beyer, John-David Lovelock, Dan Sommer, and Merv Adrian is also used.  These authors have done a great job of explaining how big data is rapidly emerging as a market force, not just a single market unto itself.  This distinction pervades their analysis, and their table of Total IT Spending Driven by Big Data reflects the composite market approach.  Use cases from enterprise software spending, storage management, IT services, social media and search forecasts are the basis of the Enterprise Software Spending for Specified Sub-Markets forecast.  Social media analytics are the basis of the Social Media Revenue Worldwide forecast.

Additional Take-Aways

  • Enterprise software spending for specified sub-markets will attain a 16.65% compound annual growth rate (CAGR) in revenue from 2011 to 2016.
  • Attaining a 96.77% CAGR from 2011 through 2016, Social Media Revenue is one of the primary use case catalysts of this latest forecast.
  • Big Data IT Services Spending will attain a 10.20% CAGR from 2011 to 2016.
  • $29B will be spent on big data throughout 2012 by IT departments.  Of this figure, $5.5B will be for software sales and the balance for IT services.
  • Gartner is projecting a 45% per year average growth rate for social media, social network analysis and content analysis from 2011 to 2016.
  • Gartner projects a 20 times ratio of IT Services to Software in the short term, dropping as this market matures and more expertise is available.
  • By 2020, big data functionality will be part of the baseline of enterprise software, with enterprise vendors enhancing the value of their applications with it.
  • Organizations are already replacing early implementations of big data solutions – and Gartner is projecting this will continue through 2020.
  • By 2016, spending on Application Infrastructure and Middleware becomes one of the most dominant categories for big data in the Enterprise Software Specified Sub-Markets forecast.

  • $232B is projected to be spent in total across all categories in the forecast from 2011 to 2016. Annual spending grows from $24.4B in 2011 to $43.7B in 2016, representing a 12.42% CAGR in total market growth (a quick check of this arithmetic is sketched below).
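As a quick check of that arithmetic, the standard CAGR formula applied to the 2011 and 2016 endpoints reproduces the roughly 12.4% figure; this is just the formula, not Gartner’s underlying model:

```python
# Compound annual growth rate: (end / begin) ** (1 / years) - 1
def cagr(begin, end, years):
    return (end / begin) ** (1 / years) - 1

# $24.4B in 2011 growing to $43.7B in 2016 spans five compounding years.
print(f"{cagr(24.4, 43.7, 5):.2%}")  # -> about 12.4%
```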

Search Analytics and Big Data

Big data is continually one of the top terms searched on Gartner.com, and over the last twelve months this trend has accelerated.  The following time series graph shows the weekly number of inquiries Gartner clients have made, with the red line being the logarithmic trend.

Banking (25%), Services (15%) and Manufacturing (15%) are the three most active industries in making inquiries about big data to Gartner over the last twelve months.  The majority of these are large organizations (63%) located in North America (59%) and Europe (19%).

What unifies all of these industries from a big data standpoint is how critical the stability of their customer relationships is to their business models.  Banks have become famous for bad service and, according to the American Customer Satisfaction Index (ACSI), have shown anemic growth in customer satisfaction in the latest period measured, 2010 to 2011.  The potential for using big data to become more attuned to customer expectations and deliver more effective customer experiences in this and all services industries shows great upside.

Bottom line: Companies struggling with flat or dropping rankings on the ACSI need to consider big data strategies based on structured and unstructured customer data.  In adopting this strategy, they have the potential to drastically improve customer satisfaction, loyalty, and ultimately profits.