Top Strategic Technology Trends for 2022

Published 18 October 2021 - ID G00757234 - 37 min read

By David Groombridge, Frances Karamouzis, and 14 more

Facing CEO demands to grow and digitalize efficiently, IT leaders must find force multipliers to drive growth, adopt technology to scale digitalization, and create scalable IT foundations for cost efficiency. Gartner’s strategic technology trends show the key technologies to scale, adapt and grow.

Overview

Opportunities

  • Following the business disruptions of the pandemic, CEOs are keen to return to growth, or to build on the business momentum they’ve established. IT leaders responsible for technology innovation have an opportunity to deliver the innovations needed to accelerate that growth.
  • CEOs saw the promise of digital business during the pandemic and now want to find more direct digital routes to connect with their customers. With boards of directors increasing budgets for digitalization in business units, IT leaders must seize the opportunity to deliver creative new technologies to sculpt digital change.
  • At the same time, CEOs are seeking cost-efficient and scalable approaches that reduce capital expenditure and free up cash. IT leaders must invest in a cost-efficient technical foundation to securely, scalably and resiliently engineer trust throughout environments.

Recommendations

IT leaders responsible for technology innovation:
  • Accelerate the growth of your enterprise by composing and enabling the force-multiplying technologies that will create the business opportunities that your CEO needs. Adapt to the distributed enterprise of hybrid work with location-independent employee and customer services, and utilize total experience approaches to optimize outcomes for all stakeholders. Use autonomic systems to scale management of rapidly changing complex environments, and pilot generative artificial intelligence (AI) to accelerate R&D timescales and innovation.
  • Sculpt the rapid digitalization of your organization by releasing the creativity of multidisciplinary fusion teams of IT and business technologists. Introduce AI engineering approaches to optimize the production value of AI, and deploy hyperautomation tools and processes to accelerate the pace of business model change. Use composable applications to reduce the time to market of solutions, and adopt decision intelligence approaches to optimize organizational decision making.
  • Engineer the scalability and trust required for cost-optimized operations by building a secure, resilient foundation for future IT operations based on cloud technologies. Implement cloud-native platforms for rapid application change and deploy a data fabric to integrate, scale and optimize data usage. Utilize privacy-enhancing computation for secure data processing anywhere, and adopt a cybersecurity mesh architecture to consistently integrate security approaches across cloud and noncloud environments.

What You Need to Know

The past 18 months have triggered faster IT change than ever, and this pace will only increase. In this changing world, CEOs’ priorities are clear: They want growth, digitalization and efficiency.1 Most want to rebuild the revenue they lost during the pandemic and return to growth, but some need to build on the momentum they’ve established. At the same time, CEOs know they must accelerate the adoption of digital business in the new world, and are seeking more direct digital routes to connect with their customers. But, with an eye on future economic risks, they also want to be efficient, and protect margins and cash flow.
This means that it’s vital for you, as an IT leader responsible for technology innovation, to address the three themes for action in our top strategic technology trends for 2022 (see Figure 1) by:
  • Accelerating growth — finding the IT innovations that will win business and market share
  • Sculpting change — releasing creative new technology solutions to scale the digitalization of your organization
  • Engineering trust — creating resilient foundations that enable your organization to scale cost-efficiently
Figure 1: Top Strategic Technology Trends for 2022

List of Gartner’s 12 strategic technology trends for 2022, broken into three themes.
Accelerate growth. You’re no stranger to your organization’s constant pursuit of growth. Our first four technology trends produce combinatorial innovation and maximize value creation, enabling exponential growth in the postpandemic world. They compose digital capabilities to integrate the physical and virtual worlds — for hybrid working, holistic experiences, management at scale and the generation of novel creative artifacts.
Sculpt change. Having come through the peak of the pandemic, you know all too well the incredible pace of change in your organization. You must deliver creative new uses of technology to enable your organization to scale digitalization rapidly. You must collaborate with business and other IT leaders and create teams that fuse business and IT skills from various disciplines. These “fusion teams” (see Note 1) must begin sculpting change through rapid application creation, routes to effective decisions, business automation and AI optimization.
Engineer trust. To achieve these aims and also deliver the cost-efficient growth your CEO demands, you need a technology foundation to securely integrate your digital assets and scale as your business does. For this, you need trends that create a resilient and efficient IT foundation, engineering the trust necessary in a connected world by enabling secure integration and processing of data across cloud and noncloud environments.
Trends and technologies, however, don’t exist in isolation — they build on and reinforce one another. We selected our trends for 2022 in part based on their combined effects. Taken together, our top strategic technology trends for 2022 will help you meet your CEO’s priorities to scale, adapt and grow.
Trend Profiles

Accelerating Growth

Generative AI

Analysis by Anthony Mullen, Marc Halpern, Soyeb Barot, Nicole Greene, Brent Stewart
Strategic Planning Assumption: By 2025, generative AI will account for 10% of all data produced, up from less than 1% today.
Generative AI is a form of AI that learns a digital representation of artifacts from sample data and uses it to generate new, original, realistic artifacts that retain a likeness to the training data but don’t repeat it. The input artifacts can be content (e.g., text, pictures, video and designs) or tangible objects (e.g., chemicals, alloys and products), while the output can be in the same form as the input or in a new mode (e.g., text converted to images).
New AI methods and practitioners are moving generative AI from the confines of research into commercial applications. These solutions can generate images, audio, code, language and designs, as the following examples show:
  • The U.K. Financial Conduct Authority worked with the provider Synthesized to use generative AI to create synthetic payment data from five million records of real payment data. The synthetic dataset will be used to create new fraud models without revealing individuals’ data.2
  • Researchers from the University of South Carolina College of Engineering and Computing and from Guizhou University, China, have demonstrated the potential for generative AI to speed up the identification of new inorganic materials by over a hundredfold.3
  • GitHub, Microsoft and OpenAI have collaborated to create GitHub Copilot, a tool that can generate and recommend code to developers as they write software.4
Generative AI is disruptive. It has the capacity to reshape the R&D economics of many areas of the organization, from products, content and customer experience to analytics, software engineering and data science.
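The FCA example above rests on a simple idea: learn a statistical representation of sample data, then sample new, original records from it. Below is a minimal sketch, assuming purely numeric fields and an independent Gaussian per field; production tools such as the one Synthesized provides model correlations and categorical data far more carefully, and the payment records here are hypothetical.

```python
import random
import statistics

def fit_model(records):
    """Learn a per-field (mean, stdev) 'representation' of the sample data."""
    return {field: (statistics.mean(r[field] for r in records),
                    statistics.stdev(r[field] for r in records))
            for field in records[0]}

def generate_synthetic(model, n, seed=0):
    """Sample new, original records that resemble, but never repeat, the input."""
    rng = random.Random(seed)
    return [{field: rng.gauss(mu, sigma) for field, (mu, sigma) in model.items()}
            for _ in range(n)]

# Hypothetical payment records standing in for the real dataset
real = [{"amount": 100 + i, "hour": 9 + (i % 8)} for i in range(50)]
synthetic = generate_synthetic(fit_model(real), 200)
```

A fraud model trained on `synthetic` can then be developed and shared without ever exposing the underlying individuals' records, which is the point of the FCA pilot.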
Recommendations:
  • Accelerate content production and R&D efforts by selecting proven uses of generative AI to speed the creation of new products and increase the personalization of artifacts.
  • Reduce time to value of analytics projects by using generative AI to create synthetic datasets to protect privacy. Refine the approach by constantly comparing synthetic data to data from the field.
  • Use both generative AI and simulation technology to create an organizationwide architectural practice for rapid ideation.

Autonomic Systems

Analysis by Nick Jones, Erick Brethenoux, David Cearley
Strategic Planning Assumption: By 2024, 20% of organizations that sell autonomic systems or devices will require customers to waive indemnity provisions related to their products’ learned behavior.
Autonomic systems are self-managing physical or software systems performing domain-bounded tasks that possess three fundamental characteristics:
  • Autonomy. They execute their own decisions and tasks autonomously (without supervision).
  • Learning. They modify their behavior and internal operations based on experience and changing conditions, as well as potentially changing goals. Their ability to learn and evolve their behavior might result in nondeterministic behavior.
  • Agency. They have a sense of their own internal state and purpose, which guides how and what they learn. They act independently on decisions made.
These three characteristics enable these systems to adapt their behavior to support new requirements and situations, optimize their performance, and defend themselves (e.g., from cyberattacks).
Autonomic systems enable organizations to become more agile in situations where traditional programming or simple automation isn’t flexible or responsive enough, or where such legacy solutions are incapable of supporting a requirement.
AI has matured to the point where autonomic behavior is feasible, as shown by recent deployments in areas such as cybersecurity and communications network optimization. For example, Ericsson has demonstrated the use of reinforcement learning and digital twins to create an autonomic system that dynamically optimizes 5G network performance.5 In the longer term, autonomic behavior will become common in physical systems such as robots, drones, manufacturing machines and smart spaces.
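The three characteristics can be made concrete with a toy control loop (this is an illustration, not Ericsson's actual method): a component that senses its own performance, applies its own decisions without supervision, and adapts its search behavior from experience.

```python
class AutonomicTuner:
    """Toy self-managing component: it tracks its own state (agency),
    applies its own decisions (autonomy) and adapts from feedback (learning)."""
    def __init__(self, setting=1.0, step=0.5):
        self.setting = setting
        self.step = step
        self.last_score = None

    def observe_and_adapt(self, score):
        # Learning: if the last change hurt performance, reverse direction
        # and search more cautiously from then on.
        if self.last_score is not None and score < self.last_score:
            self.step *= -0.5
        self.last_score = score
        self.setting += self.step   # Autonomy: acts on its own decision
        return self.setting

def environment(setting):
    """Hypothetical workload whose performance peaks at setting = 3.0."""
    return -(setting - 3.0) ** 2

tuner = AutonomicTuner()
for _ in range(40):
    tuner.observe_and_adapt(environment(tuner.setting))
```

Note the nondeterminism risk the trend profile warns about: the tuner's trajectory depends entirely on what it has experienced, not on any preprogrammed rule about where the optimum lies.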
Recommendations:
  • Pilot autonomic technologies in cases where early adoption will deliver agility and performance benefits in managing complex software or physical systems.
  • Create a multidisciplinary task force to prepare for the business, legal, HR and ethical consequences of deploying partially nondeterministic systems.

Total Experience

Analysis by Jason Wong, Michelle Duerst, Don Scheibenreif, Saul Brand, Michael Chiu, Van Baker
Strategic Planning Assumption: By 2026, 60% of large enterprises will use total experience to transform their business models to achieve world-class customer and employee advocacy levels.
Total experience (TX) is a business strategy for creating superior shared customer and employee experiences by interlinking customer experience (CX), employee experience (EX), user experience (UX) and multiexperience (MX) disciplines (see Note 2).
The goal of TX is to drive greater customer and employee confidence, satisfaction, loyalty and advocacy. Organizations are using separate digital initiatives to improve CX and employee productivity.6 But you can achieve both these business outcomes by applying adaptive and resilient TX business strategies to simultaneously increase revenue from customers and reduce internal costs.
TX applies composable business technologies at the intersection of customer and employee journeys. TX is enabled by technologies for design, development, content, automation and analytics. These technologies can be used to uncover and then remove effort, in order to transform shared experiences.
Canadian Blood Services transformed its business operations through multiple TX initiatives. It eliminated the friction and inefficiencies caused by manually setting up appointments for blood donation and the repetitive workflow for repeat donations that had frustrated both donors and employees.7 It rolled out multiple MX technologies, such as mobile apps, chatbots connected to interactive voice response, live chat with agents, reminders and concierge services, to improve the UX for donors and employees. Donor retention and advocacy improved and the cost of booking an appointment was reduced significantly.
Recommendations:
  • Form executive-sponsored fusion teams to create and execute a TX strategy; start by creating a TX-centric business architecture. Task the fusion teams with continuously enhancing UX and MX capabilities to improve overall CX and EX outcomes.
  • Instruct teams pursuing experience improvement initiatives to partner with and learn from others. Make all leaders of experience-related initiatives equally responsible for solving the combined needs of customers and employees.

Distributed Enterprise

Analysis by David Groombridge, Tony Harvey, Stuart Downes, Manjunath Bhat
Strategic Planning Assumption: By 2023, 75% of organizations that exploit distributed enterprise benefits will realize revenue growth 25% faster than competitors.
With the rise in remote and hybrid working patterns, traditional office-centric organizations are evolving into “distributed enterprises,” with staff spread widely geographically. However, these new hybrid work realities mean that partners and consumers are also now remote. Organizations must plan for the obvious impact of supporting workers everywhere, and the less obvious, but more strategic, change of adapting their business models to the changed customer demands caused by the distributed enterprise.
Gartner’s Hybrid Work HR Leader Survey found that 75% of remote or hybrid knowledge workers said their expectations for working flexibly have increased, and only 4% would choose an on-site working arrangement. However, distributed teams can suffer more fatigue, and hybrid or remote knowledge workers who experienced high levels of virtual work were 29% more likely to feel they were working too hard.8
IT teams must support remote workers with a digital workplace built on a revised portfolio of services focused on collaboration, automation and employee well-being. For some organizations, this may be complex. For example, the logistics company Geodis has deployed virtual reality solutions to enable its forklift drivers to work remotely.9
Reducing the risks of fatigue will also require thinking about new ways of working with technology. For example, Dropbox realized that its many existing collaboration habits did not match its new work environment. It saw an opportunity to redesign the way its employees worked. It introduced a “virtual first” approach to working, based on asynchronous collaboration and employees’ ability to design their own work week, with a focus on outcomes.10
At the same time, the rise of the distributed enterprise will require a change to business models. The lack of consumers traveling into central locations in cities will require businesses to pivot rapidly to new location-independent service models.
Gartner is already seeing a rise in demand for virtual services, location-agnostic access and robot deliveries to support remote consumers as part of the distributed enterprise. For example, with demand from consumers for increased remote healthcare and telemedicine, Amazon plans to roll out its telehealth service, Amazon Care, across the U.S. in the next year.11 The U.S. restaurant chain Chipotle has already pivoted to digital business. Its digital sales account for over 50% of its revenue through innovations such as digital-only restaurants, Facebook Messenger apps and contactless delivery to support changes in consumer demand.12
Recommendations:
  • Reduce employee fatigue and burnout by rearchitecting collaboration tools, workspaces and processes to match the new hybrid work environment. Gain visibility into employee sentiment and experience by using digital employee experience management tools to improve endpoint performance and provide proactive support.
  • Plan to pivot business models to capture market share from customer and consumer changes due to remote working. Do so by adopting virtual-first, remote-first architectural principles. Provide the tools for fusion teams to rapidly develop and improve customer-facing technologies.
  • Enhance the customer experience by using total experience approaches to eliminate the pain points that create friction in digital interactions for both customers and employees.

Sculpting Change

AI Engineering

Analysis by Kevin Gabbard, Soyeb Barot, Jitendra Subramanyam
Strategic Planning Assumption: By 2025, the 10% of enterprises that establish AI engineering best practices will generate at least three times more value from their AI efforts than the 90% of enterprises that do not.
AI engineering is the discipline of operationalizing updates to AI models, using integrated update pipelines for data, model and development, to deliver consistent business value from AI.
AI and operationalizing AI are receiving much attention. In a Gartner survey, 69% of corporate board members said AI would be a top game-changing technology enabling their industry to emerge stronger from the pandemic.13,14 But AI adoption alone will not deliver value without AI engineering to continuously optimize model value in production.
AI can deliver such continuous business value only once you:
  • Make strategic decisions about the use of AI in your enterprise, rather than just adding AI as another technology in the stack.
  • Establish new ways of monitoring AI value drift, such as analyzing changes in data inputs, the underlying infrastructure or the business processes that affect AI value.
  • Implement automated and integrated update pipelines to make enterprise-scale changes to production AI technology and processes. Include DataOps, ModelOps and DevOps (see Note 3) to update data sources, AI models and applications, respectively.
For example, Georgia Pacific, a U.S. manufacturer of pulp and paper products, monitors three kinds of AI model drift that can erode business value: changes to business operations, model inputs and system performance. Georgia Pacific quantifies these drifts and then applies rapid changes to production AI models using integrated and automated update processes to maintain business value. In this way, Georgia Pacific maintains initial model performance, or better, across the life cycle of its models.15
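Georgia Pacific's drift measures are its own, but the input-drift piece of the pipeline can be illustrated with a standard statistic: the population stability index (PSI), which compares a feature's training-time distribution with what the production model currently sees. The binning and the ~0.2 threshold below are conventional practitioner choices, not Gartner guidance.

```python
import math

def psi(expected, actual, bins=5):
    """Population stability index; values above ~0.2 usually signal drift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0
    def shares(values):
        counts = [0] * bins
        for v in values:
            idx = min(max(int((v - lo) / width), 0), bins - 1)
            counts[idx] += 1
        # Smooth empty bins so the log below stays defined
        return [(c or 0.5) / len(values) for c in counts]
    return sum((a - e) * math.log(a / e)
               for e, a in zip(shares(expected), shares(actual)))

training = [i / 100 for i in range(100)]          # what the model was fit on
stable = [i / 100 + 0.005 for i in range(100)]    # production inputs, no drift
shifted = [i / 200 + 0.5 for i in range(100)]     # mass moved to the upper half
```

A monitoring job that computes this per feature, and triggers the automated retraining pipeline when the score crosses a threshold, is the kind of integrated update loop AI engineering calls for.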
Recommendations:
  • Implement AI engineering as a strategic differentiator for creating and maintaining production AI value. Establish and refine AI engineering practices that incorporate best practices from DataOps, ModelOps and DevOps.
  • Monitor AI models in production for value drift, using KPIs that report on the business ecosystem in which the AI model operates.
  • Develop AI model management and governance practices that align model performance, human behavior and delivery of business value.

Hyperautomation

Analysis by Stephanie Stoudt-Hansen, Frances Karamouzis, Keith Guttridge
Strategic Planning Assumption: By 2024, diffuse hyperautomation spending will drive up the total cost of ownership fortyfold, making adaptive governance a differentiating factor in corporate performance.
Business-driven hyperautomation is a disciplined approach that organizations use to rapidly identify, vet and automate as many business and IT processes as possible. Hyperautomation involves the orchestrated use of multiple technologies, tools or platforms. Examples include AI, machine learning, event-driven software architecture, robotic process automation (RPA), intelligent BPM suites, integration platform as a service, low-code tools, and other types of decision, process and task automation tools.
Hyperautomation has increased in importance in enterprises, with a rise in investment and funded implementations16 due to a combination of factors.
The incredible velocity, diversity and volume of technology “creation” and its sources have resulted in a need for hyperautomation. Functionality must be more aligned to business processes and customers, which, in turn, has driven a substantial rise in the number of technologists in business units. The most successful initiatives were those where business teams were involved in end-to-end capability life cycles,17 and these efforts must map to collective initiatives for maximum success. However, the proliferation of hyperautomation activities must be governed holistically; otherwise, the spreading of accountability and spending across business units will hugely increase the management and operational costs of enterprise automation. The effective enterprise use of hyperautomation requires strong, centralized coordination and governance to optimize costs.
As an example, a global oil and gas company is managing 14 concurrent hyperautomation initiatives, each with well over $6 million in funding (with select efforts exceeding $50 million). The initiatives include:
  • Targeted task automation to industrialize over 90 different areas (using RPA tools)
  • Intelligent document processing for targeted compliance, procurement and legal processes, with contract values exceeding $15 billion
  • Automation of geoscience (integration of geology and geophysics) and offshore oil drilling operations
The company conducts the decision making, deployments and governance of these initiatives strategically, based on targeted business outcomes for quality, time to market, business agility or innovation for new business models.
Recommendations:
  • Plan and architect for continuous concurrent hyperautomation initiatives to ensure orchestration across business processes and functions, as well as optimized use of technology tools.
  • Establish holistic mapping and prioritization of collective initiatives, rather than islands of task automation, to ensure synergistic and coordinated business outcomes.
  • Focus on governance to drive operational agility and resilience, while optimizing ongoing management and reducing organizational debt.

Decision Intelligence

Analysis by Pieter den Hamer
Strategic Planning Assumption: By 2023, more than 33% of large organizations will have analysts practicing decision intelligence, including decision modeling.
Organizations face unprecedented business complexity and uncertainty, so they must make accurate and highly contextualized decisions more quickly.18 Decision intelligence is a practical discipline designed to improve organizational decision making. It involves explicitly understanding and engineering how organizations make decisions and how they evaluate, manage and improve outcomes through feedback. Decision intelligence can support and augment human decision making and, potentially, automate it through the use of augmented analytics, simulations and AI.
Disruption-ready and resilient organizations must be able to rapidly compose and recompose transparent decision flows and models. Decision intelligence helps an organization unpack how decisions are made, and how they break down, by modeling decisions through a consistent framework. The framework provides a decision model that enables each phase of the decision flow to be understood and redesigned, showing which technical components can support, augment or automate that phase. That will help you determine where your investments can yield a higher return across people, process, data and technology.
Decision intelligence can support and enhance decision processes at a variety of levels:
  • Operational decisions, such as in personalizing repeatable client or customer interactions
  • Managerial decisions with a high degree of repetition, as in production planning or recruitment
  • Strategic decisions, such as mergers and acquisitions
The technology focus on breaking down data, analytics and AI platforms into smaller reusable components creates new design opportunities for more-effective decision-making systems and applications. These systems and applications enable more dynamic, optimized value chains and more scalable, personalized customer interactions.
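A decision model can be as simple as scoring options against explicit, weighted criteria; making the model explicit is what lets a feedback loop evaluate and redesign it. A sketch with hypothetical supplier-selection data (the criteria, weights and adjustment are illustrative, not a prescribed method):

```python
def decide(options, weights):
    """One modeled decision step: score each option on explicit criteria."""
    def score(option):
        return sum(weights[c] * v for c, v in option["criteria"].items())
    return max(options, key=score)["name"]

suppliers = [
    {"name": "A", "criteria": {"cost": 0.9, "reliability": 0.5}},
    {"name": "B", "criteria": {"cost": 0.6, "reliability": 0.9}},
]

weights = {"cost": 0.7, "reliability": 0.3}
first = decide(suppliers, weights)      # cost dominates, so "A" wins

# Feedback loop: a missed delivery is measured, and the weighting is
# rebalanced toward reliability; the same transparent model now changes.
weights = {"cost": 0.4, "reliability": 0.6}
second = decide(suppliers, weights)     # "B" wins
```

Because every factor and weight is visible, the phase to support, augment or automate, and the investment likely to yield the highest return, can be identified per decision rather than guessed at.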
Recommendations:
  • Start using decision intelligence in areas where business-critical decision making must be improved with more data-driven support or AI-powered augmentation, or where decisions can be scaled and accelerated with automation.
  • Model decisions and develop practices that incorporate both human and AI decision-making capabilities. Incorporate a feedback loop to measure results.

Composable Applications

Analysis by Yefim Natis, Dennis Gaughan, Tad Travis, Kirk Knoernschild, Mark O’Neill, Gene Alvarez
Strategic Planning Assumption: By 2024, the design mantra for new SaaS and custom applications will be “composable API-first or API-only,” rendering traditional SaaS and custom applications “legacy.”
In turbulent times, faster change is essential for business resilience and growth. Organizations can master change by adopting composable business principles that enable them to innovate and adapt more quickly to changing business needs by assembling modular applications. Such composable applications use the principles of business-centric modularity, autonomy, orchestration and discovery in their architecture (see How to Design Enterprise Applications That Are Composable by Default). They are built from atomic packaged business capabilities (PBCs). These are reusable, software-defined business objects; for example, they can represent a patient, a credit rating or results of a graph query. Assembling PBCs into applications on a suitable composition platform supports safer, more efficient and faster change led by democratized application design teams (see Use Gartner’s Reference Model to Deliver Intelligent Composable Business Applications).
Examples exist in many industries of the agility that composable applications create. Ally Bank has created PBCs representing repeatable capabilities such as fraud alerting, which its fusion teams (see Note 1) can assemble in low-code environments, saving over 200,000 hours of manual effort.19 During the pandemic, healthcare organizations recognized the strategic value of enterprise agility in the face of uncertainty. They quickly became leaders in adopting principles of application composability to develop telemedicine and other apps.20 The sportswear manufacturer adidas saw its own fusion teams slowed down by low-value repetitive work. Using composable applications, adidas increased the number of solutions it delivered nearly tenfold, while cutting weeks off delivery times.21
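In code terms, a PBC is a self-contained, discoverable unit with a clear business meaning that a composition layer can assemble into applications. A minimal sketch of the pattern, in which the registry, capability names and business rules are all hypothetical:

```python
from typing import Callable, Dict

# Discovery: packaged business capabilities register themselves by name.
REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def pbc(name):
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@pbc("credit_rating")
def credit_rating(ctx):
    """Autonomous business object: owns one capability end to end."""
    ctx["rating"] = "AA" if ctx["income"] > 50_000 else "B"
    return ctx

@pbc("fraud_alert")
def fraud_alert(ctx):
    ctx["flagged"] = ctx["amount"] > 10_000
    return ctx

def compose(*names):
    """Orchestration: assemble an application as a pipeline of PBCs."""
    def app(ctx):
        for name in names:
            ctx = REGISTRY[name](ctx)
        return ctx
    return app

loan_check = compose("credit_rating", "fraud_alert")
result = loan_check({"income": 80_000, "amount": 5_000})
```

Swapping, reordering or reusing capabilities is then a composition change rather than a rewrite, which is the agility the Ally Bank and adidas examples describe.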
Recommendations:
  • Prioritize the adoption and effectiveness of business-IT fusion teams and begin to equip them with dedicated design tools.
  • Champion composable architectural principles in all new technology initiatives, including application modernization, engineering and the selection of vendor services.
  • Develop a roadmap and use Gartner’s Composable Business Index to assess your progress (see Toolkit: Composable Business Index From the 2020 Gartner IT Symposium/Xpo Keynote).

Engineering Trust

Cloud-Native Platforms

Analysis by Dennis Smith, Arun Chandrasekaran, Sid Nag, David Smith, Anne Thomas, Michael Warrilow
Strategic Planning Assumption: By 2025, cloud-native platforms will serve as the foundation for more than 95% of new digital initiatives — up from less than 40% in 2021.
Many traditional organizations struggle to succeed with digital initiatives because they lack the necessary talent and expertise. They’re disappointed with the simple rehosting of applications because of unpredictable or overly high costs. Cloud-native platforms (CNPs) are enabling such organizations to use the capabilities of the cloud to offset this talent shortage and deliver much faster on their digital initiatives. Two-thirds of organizations surveyed by Gartner report that they are using a cloud-native application architecture to build their platforms for digital business.22
CNPs use the core capabilities of cloud computing to deliver faster time to value. Drawing on elasticity and scalability, CNPs improve productivity by adopting DevOps principles and practices, while improving efficiency through the use of modern application architecture principles. CNPs maximize the potential of cloud computing by matching compute requirements with demand and by improving organizational agility to deliver business value faster.
As an example, a major Indian bank has built a CNP, with container, database and compute services. The CNP enabled the bank to build a portfolio of new digital financial services bringing advanced banking experiences to customers at scale, and enabling account openings in six minutes and instant digital payments. The bank used its provider’s Kubernetes service to deploy a new microservices architecture to support the integration of savings, virtual debit card and credit card services. This enabled the bank’s system to easily scale to over 3.5 million transactions in two months. The overall initiative increased customer satisfaction by 35% and reduced costs by 24%.
Recommendations:
  • Minimize basic lift-and-shift migrations that don’t take full advantage of cloud attributes.
  • Invest in cloud-native platforms and adopt modern principles of application architecture.
  • Develop practices and a culture to increase automation through standardized operational patterns.

Privacy-Enhancing Computation

Analysis by Bart Willemsen, Ramon Krikken and Mark Horvath
Strategic Planning Assumption: By 2025, 60% of large organizations will use one or more privacy-enhancing computation techniques in analytics, business intelligence or cloud computing.
Organizations are facing pressure from maturing international privacy and data protection legislation, as well as from declining customer trust following privacy incidents. Privacy-enhancing computation (PEC) techniques can help prevent these incidents and aid value creation from information, without exposing personal data.
PEC techniques include various robust and maturing approaches to protect privacy at a data, software or hardware level. Organizations are increasingly using PEC for data protection in analytics, business intelligence, cross-border transfers and processing in untrusted environments, such as the public cloud.23
Organizations can use PEC to securely share, pool and analyze personal data without compromising confidentiality. For example, DeliverFund is a U.S.-based nonprofit organization with a mission to tackle human trafficking. Its platforms use homomorphic encryption so that its partners can conduct data searches against its extremely sensitive data on human trafficking, with both the search and the results being encrypted. In this way, partners can submit sensitive queries without having to expose personal or regulated data.24
PEC approaches vary and include differential privacy, synthetic data, homomorphic encryption, secure multiparty computation and zero-knowledge proofs. Use cases exist in banking (e.g., anti-money laundering), finance (e.g., know your customer), insurance (e.g., cross-insurance organizations fraud detection), healthcare (e.g., for clinical trial comparison and trend analysis), and analytics. In public cloud infrastructure, we see, for example, the use of trusted execution environments (TEE) as offered by hyperscalers. Amazon, Google and Microsoft all offer a way to segregate computing environments on a hardware system level, although their exact implementations differ.
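Of the techniques listed, differential privacy is the easiest to sketch: answer an aggregate query with calibrated Laplace noise so that the presence or absence of any single record is masked. (Homomorphic encryption and TEEs require real cryptographic libraries or hardware; the salary dataset below is hypothetical.)

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = rng.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(values, predicate, epsilon, seed=42):
    """Differentially private count: a count query has sensitivity 1,
    so Laplace noise with scale 1/epsilon gives epsilon-DP."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1 / epsilon, random.Random(seed))

salaries = [30_000 + 500 * i for i in range(100)]   # hypothetical records
noisy = private_count(salaries, lambda s: s > 50_000, epsilon=0.5)
```

Because the noise scale is tied to 1/epsilon, a lower epsilon buys stronger privacy at the cost of noisier answers, which is the trade-off an organization tunes per use case.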
Recommendations:
  • Investigate key use cases within the organization and the wider ecosystem where a desire exists to use personal data in untrusted environments or for analytics and business intelligence purposes, both internally and externally.
  • Prioritize investments in applicable PEC techniques to gain an early competitive advantage.

Cybersecurity Mesh

Analysis by Felix Gaehtgens, Patrick Hevesi, James Hoover, Michael Kelley, Mary Ruddy, Henrique Teixeira
Strategic Planning Assumption: By 2024, organizations adopting a cybersecurity mesh architecture to integrate security tools to work as a cooperative ecosystem will reduce the financial impact of individual security incidents by an average of 90%.
Building a supportive trust fabric across assets that are widely distributed across clouds and traditional data centers requires a flexible, integrated security structure. However, organizations struggle to build a coherent security posture as most security products work only within their very specific domain and need to be individually configured. This causes a fragmented set of services, so a common, broad and unified security approach is needed. Cybersecurity mesh architecture (CSMA) is an approach to extend security controls beyond traditional enterprise perimeters. CSMA focuses on composability, scalability and interoperability to create a collaborative ecosystem of security tools.
CSMA helps provide a common, integrated security structure and posture to secure all assets, regardless of location. CSMA enables best-of-breed, stand-alone solutions to work together to improve overall security posture while moving control points closer to the assets they’re designed to protect. It composes across foundational layers to enable distinct security controls to work together and facilitates their configuration and management. By doing so, CSMA fosters composability, scalability and interoperability for security controls.
The security industry is starting to meet the demands for an integrated and interoperable approach to security architecture.25 The CSMA approach provides a model for creating a coherent and integrated security posture from individual components, increasing agility and capacity to focus on higher-value endeavors.
Using a CSMA approach enabled a healthcare organization in the U.S. to respond more effectively to threats and accurately identify the best response. It integrated threat intelligence and vulnerability information, as well as context from multiple security controls, to enforce policies and drive automation. Previously, it used siloed analytics solutions for endpoint and security monitoring from different vendors. This siloed approach was inefficient, kept the organization from ever seeing the big picture, and prevented effective security orchestration and response.
A technology organization was struggling to create value from its threat intelligence program. Using a CSMA approach, it began integrating multiple data feeds from distinct security products. This enabled it to better identify indicators of compromise and to respond more quickly to incidents.
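The composability that CSMA calls for can be sketched minimally: if each tool exposes its findings through a common interface, a mesh layer can correlate signals per asset instead of leaving them in vendor silos. The tool classes and the `Finding` format below are invented for illustration, not a real vendor API.

```python
# Hypothetical sketch of CSMA-style composability: heterogeneous security
# tools share one findings interface, and a mesh layer merges their
# signals into a single per-asset posture view.
from dataclasses import dataclass

@dataclass
class Finding:
    asset: str
    severity: int   # 1 (low) .. 10 (critical)
    source: str

class EndpointMonitor:          # stand-in for an endpoint analytics tool
    def findings(self):
        return [Finding("laptop-42", 7, "endpoint")]

class ThreatIntelFeed:          # stand-in for a threat intelligence feed
    def findings(self):
        return [Finding("laptop-42", 5, "threat-intel")]

def consolidated_posture(tools):
    """Merge per-tool findings into one view keyed by asset."""
    posture = {}
    for tool in tools:
        for f in tool.findings():
            entry = posture.setdefault(f.asset, {"max_severity": 0, "sources": []})
            entry["max_severity"] = max(entry["max_severity"], f.severity)
            entry["sources"].append(f.source)
    return posture

view = consolidated_posture([EndpointMonitor(), ThreatIntelFeed()])
# view["laptop-42"] now correlates both tools' signals on one asset
```

The design point is that each tool only has to implement the shared `findings()` contract; the mesh layer, not the tools, owns correlation and policy.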
Recommendations:
  • Prioritize composability and interoperability when selecting security solutions, and build a common base framework that integrates solutions for synergistic effects.
  • Deploy supportive layers for a long-term CSMA strategy — security analytics, identity fabric, policy management and dashboards.
  • Familiarize yourself with current and emerging security standards, and with open-source code projects as a potential alternative to supplement gaps in vendor interoperability.

Data Fabric

Analysis by Mark Beyer, Ehtisham Zaidi, Robert Thanaraj
Strategic Planning Assumption: By 2024, data fabric deployments will quadruple efficiency in data utilization while cutting human-driven data management tasks in half.
Data distributed across cloud and noncloud environments cannot be supported by the linear expansion of current, saturated data management cost models. Instead, it requires new forms of distributed access and delivery infrastructure to provide scalability. A data fabric supports the design, deployment and use of integrated and reusable data objects, regardless of deployment platform and architectural approach. It does this by running continuous analytics over existing, discoverable and inferenced metadata assets to identify where and how data is being used.
The focus of data management activities is shifting from human-driven effort to analytics and machine learning that use metadata to:
  • Augment human analysis during data specification and design. A data fabric uses a continuous process for monitoring and profiling data content, combined with graph analysis of actual data usage, utilization and use cases. This reveals exact patterns of current data flows throughout an organization, as opposed to attempting to reconfirm original designs that have changed in practice.
  • Monitor operations for recommended data changes. Machine learning within the data fabric analyzes variances in usage, content and even schema of data that take place over time. It compares these variances with behavior patterns in both systems and people to identify the extent and types of changes.
  • Manage orchestration across platforms and tools. Based on operations monitoring and analysis of actual data usage, a data fabric moves from a passive observer and design-assistance role to that of an active system. In this mode, it passes recommendations as runtime instructions to third-party tools for data management functions, such as data quality, data mastering, data sharing and data regulation.
These capabilities not only integrate data from multiple sources, but also reduce data science effort considerably by actively recommending where data should be used and changed. As a result, interest in data fabrics is accelerating rapidly.26 Full maturity is several years away, but by late 2024, data management systems will begin to deliver significant capabilities to augment human efforts. Data fabric will quadruple the output from human-driven efforts by automating predictable or inference-based tasks and offloading them to systems on an ever-increasing number of data assets. This will, in time, impact almost every aspect of data and analytics, applications, and, eventually, all of IT and business.
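The metadata-driven monitoring described above can be sketched minimally: compare a dataset's modeled schema against observed access counts from usage metadata to reveal where actual usage has drifted from the original design. The table and column names below are invented for illustration.

```python
# Hypothetical sketch: detect drift between a modeled schema and actual
# usage, using only collected access metadata (no data content needed).
modeled = {"customer": {"id", "name", "email", "fax"}}
observed_access = {  # column -> query count from usage metadata logs
    "customer": {"id": 950, "name": 870, "email": 820, "signup_channel": 400},
}

def drift_report(modeled, observed, min_hits=10):
    """Flag modeled-but-unused columns and used-but-unmodeled columns."""
    report = {}
    for table, cols in modeled.items():
        used = {c for c, n in observed.get(table, {}).items() if n >= min_hits}
        report[table] = {
            "unused_modeled": sorted(cols - used),   # candidates to retire
            "used_unmodeled": sorted(used - cols),   # undocumented actual usage
        }
    return report

print(drift_report(modeled, observed_access))
# {'customer': {'unused_modeled': ['fax'], 'used_unmodeled': ['signup_channel']}}
```

A real data fabric would feed such findings back as recommendations or runtime instructions to downstream data management tools, but the core signal is exactly this comparison of modeled design against observed metadata.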
Recommendations:
  • Gradually replace data management tools and platforms that keep their metadata isolated with tools and platforms that share internal metadata much more broadly.
  • Eliminate solutions that neither support, nor plan to support, the shift to active use of metadata.
  • Identify priority areas to introduce data fabric solutions by using metadata analytics to determine current data utilization patterns for ongoing business operations. Prioritize areas where significant drift between actual and modeled data occurs.

Evidence

The 2021 Gartner CEO and Senior Business Executive Survey: Gartner conducted this research from July 2020 through December 2020, with questions about the period 2020 to 2023. One-quarter of the sample was collected in July and August, and three-quarters were collected from October through December. In total, 465 actively employed CEOs and other senior executive business leaders qualified and participated.
The research was collected via 390 online surveys and 75 telephone interviews. By job role, the sample mix was:
  • 287 CEOs
  • 115 CFOs
  • 29 COOs or other C-level
  • 34 chairpersons, presidents and board directors
By geographic region, the sample mix was:
  • 183 North America
  • 109 Europe
  • 97 China, Japan, Australia and others in the Asia/Pacific region
  • 56 Brazil, Mexico and other Latin America
  • 13 Middle East
  • 7 South Africa
By enterprise revenue, the sample mix was:
  • 46 $50 million to less than $250 million
  • 122 $250 million to less than $1 billion
  • 226 $1 billion to less than $10 billion
  • 71 $10 billion or more
The survey was developed collaboratively by a team of Gartner analysts that examines technology-related strategic business change, and was reviewed, tested and administered by Gartner’s Research Data and Analytics team. The results of this study are representative of the respondent base and not necessarily business as a whole.
6 The 2021 Gartner Digital Business Acceleration Survey found the top two reasons for pursuing digital initiatives were to enhance customer experiences (58%) and improve employee productivity (57%).
8 The 2021 Gartner Hybrid Work HR Leader Survey was conducted in February 2021 and was completed by 75 HR Leaders from client organizations across all industries. Respondent organizations were based in ANZ (Australia and New Zealand), the Asia/Pacific region (Malaysia, Japan, Singapore and Thailand), EMEA (Belgium, Denmark, Germany, France, Romania, Spain, South Africa, Sweden, Switzerland and the U.K.), Latin America (Brazil, Mexico and Peru), and North America (the U.S. and Canada). This was administered as a web-based survey.
13 Google searches for “operationalizing AI” quintupled from 2018 to 2020. Source: Google Trends.
14 From Gartner’s 2021 View from the Board of Directors Survey. When asked, “Which will be the top 3 game changer technologies for your industry to emerge stronger from the COVID-19 crisis?,” 69% of respondents gave AI as one of their top three answers (n = 255). The survey was conducted to find out how boards of directors (BoDs) view digital-business-driven business model evolution and the impacts of that on their enterprises. It was conducted online from May 2020 through June 2020, among 265 respondents from the U.S., EMEA and the Asia/Pacific region, and respondents were required to be a board director or a member of a corporate BoD. Disclaimer: The results of this survey do not represent global findings or the market as a whole, but reflect sentiment of the respondents and companies surveyed.
15 For details on effectively operationalizing AI models, see Case Study: Monitoring the Business Value of AI Models in Production (Georgia Pacific).
16 Gartner delivered two hyperautomation webinars on 1 December 2020 and 8 June 2021. Participant polling indicated that more than 80% (for both polls on both dates) would increase or not change hyperautomation investment strategies over next 12 months. The number of respondents ranged from 184 to 399.
17 Gartner’s 2021 Reimagining Technology Work Survey was conducted via an online platform in March 2021 among over 6,000 employees across functions, levels, industries and geographies. The survey examined the extent to which employees outside of IT were involved in customizing and building analytics or technology solutions, the types of activities they performed, the teams and structures they worked in, and the types of support they received, among others. Regression analysis was used to identify the 10 factors most closely linked to achievement of business technologists’ key objectives, as reported in Infographic: Boost the Value and Success of Business-Driven Hyperautomation Initiatives.
18 In Gartner’s Reengineering the Decision Survey, 47% of respondents agreed that the decisions they make will become more complex in the next 18 months, with only 2% disagreeing. Thirty-seven percent of respondents agreed that they will need to make decisions more quickly in that time frame, with just 5% disagreeing. The survey was conducted online from 24 June 2021 through 9 July 2021 to understand the role of data and analytics in organizational decision making. In total, 132 IT and business leaders participated, based on their involvement and participation in strategic decision making at their organizations.
22 From Gartner’s 2020 Building Digital Platforms Survey. When asked, “What architectural approaches or development capabilities is your organization using to build its digital platform?,” 67% of respondents said a cloud-native application architecture (n = 206). The survey was conducted online during May 2020 and June 2020 among 206 respondents working for organizations in North America and Western Europe with at least $1 billion in annual revenue. Organizations also had to be working on digital business efforts or have plans to do so, defined as involving the Internet of Things, delivery of public APIs, private/B2B APIs or a combination.
Respondents were required to have a job title of director or more senior, and to be involved in either digital business, data analytics, IoT or API-based platforms for partners. In respect to digital business initiatives, they were also required to have a role in either defining technology requirements, investigating or evaluating service providers, or making final decisions. Results of this study do not represent global findings or the market as a whole, but reflect sentiment of the respondents and companies surveyed.
23 Gartner’s 2021 Security and Risk Survey found that 40% of security and risk management leaders prioritized investment in PEC as a top priority (sum of top three ranks) for 2022 and beyond. The survey was conducted between April 2021 and May 2021 to better understand how risk management planning, operations, budgeting and buying are performed, especially in the following areas: IT risk management, cybersecurity program management, business continuity management, privacy and cyber-physical system security. The research was conducted online among 615 respondents across North America, EMEA, the Asia/Pacific region and Latin America. Results of this study do not represent global findings or the market as a whole, but are a simple average of results for the targeted countries, industries and company size segments covered in this survey.
26 Gartner client inquiries on data fabric increased 4:1 year-over-year from 2019 to 2020. This same trend continued with nearly a 3.5:1 ratio from 2020 into 2021. Gartner client inquiries from June 2019 through July 2021 indicate that data fabric is currently most closely associated with data integration and metadata (both passive and active).

Note 1: Fusion Teams

A fusion team is a multidisciplinary team that blends technology or analytics and business domain expertise and shares accountability for business and technology outcomes. Instead of organizing work by functions or technologies, fusion teams are typically organized by the cross-cutting business capabilities, business outcomes or customer outcomes they support. Fusion teams often comprise product leads, data and information specialists, technology platform experts, total experience designers, as well as developers and testers, and other roles as required.

Note 2: Multiexperience

Multiexperience describes interactions that take place across a variety of digital touchpoints (such as web, mobile apps, conversational apps, augmented reality, virtual reality, mixed reality and wearables) using a combination of interaction modalities in support of a seamless and consistent digital user journey. Modalities include no-touch, voice, vision and gesture. Multiexperience is part of a long-term shift from the individual computers we use today to a multidevice, multisensory and multilocation ambient computing experience.

Note 3: DataOps, ModelOps and DevOps

Gartner defines DataOps, model operationalization (ModelOps) and DevOps as follows:
  • DataOps is a collaborative data management practice focused on improving the communication, integration, automation, observability and operations of data flows between data managers and data consumers across an organization. See Hype Cycle for Data Management, 2021 for more details.
  • ModelOps is primarily focused on the end-to-end governance and life cycle management of all analytics, AI and decision models (including analytical models and models based on machine learning, knowledge graphs, rules, optimization, linguistics, agents and others). See Hype Cycle for Analytics and Business Intelligence, 2021 for more details.
  • DevOps is a business-driven approach for delivering customer value using agile methods, collaboration and automation. See Keys to DevOps Success for more details.