Looking for Data Integration Tools? See the Gartner Magic Quadrant

Magic Quadrant for Data Integration Tools

29 July 2015 ID:G00269320
Analyst(s): Eric Thoo, Lakshmi Randall


Enterprise buyers increasingly see data integration as a strategic requirement, for which they want comprehensive data delivery capabilities, flexible deployment models, and synergies with information and application infrastructures. To help them make the right choice, Gartner assesses 13 vendors.

Market Definition/Description

The discipline of data integration comprises the practices, architectural techniques and tools for achieving consistent access to, and delivery of, data across the spectrum of data subject areas and data structure types in the enterprise — to meet the data consumption requirements of all applications and business processes.
The market for data integration tools includes vendors that offer software products to enable the construction and implementation of data access and data delivery infrastructure for a variety of data integration scenarios. These include:
  • Data acquisition for business intelligence (BI), analytics and data warehousing — Extracting data from operational systems, transforming and merging that data, and delivering it to integrated data structures for analytics purposes. The variety of data and context for analytics is expanding as emergent environments — such as NoSQL and Hadoop distributions for supporting big data, in-memory DBMSs, logical data warehouse architectures and end-user capability to integrate data (as part of data preparation) — increasingly become parts of the information infrastructure.
  • Sourcing and delivery of master data in support of master data management (MDM) — Enabling the connectivity and integration of the data representing critical business entities such as customers, products and employees. Data integration tools can be used to build the data access and synchronization processes to support MDM initiatives.
  • Data migrations/conversions — Although traditionally addressed most often via the custom coding of conversion programs, data integration tools are increasingly addressing the data movement and transformation challenges inherent in the replacement of legacy applications and consolidation efforts during mergers and acquisitions.
  • Data consistency between operational applications — Data integration tools provide the ability to ensure database-level consistency across applications, both on an internal and an interenterprise basis (for example, involving data structures for SaaS applications or cloud-resident data sources), and in a bidirectional or unidirectional manner.
  • Interenterprise data sharing — Organizations are increasingly required to provide data to, and receive data from, external trading partners (customers, suppliers, business partners and others). Data integration tools are relevant for addressing these challenges, which often consist of the same types of data access, transformation and movement component found in other common use cases.
The usage of data integration tools may display characteristics not unique to one of these individual scenarios. Technologies in this market are required to execute many of the core functions of data integration, which can apply to any of the above scenarios. Examples of resulting characteristics include:
  • Interoperating with application integration technology in a single solution architecture to, for instance, expose extraction, transformation and loading (ETL) processes that extract data from sources as a service to be provisioned via an enterprise service bus.
  • Enabling data services as an architectural technique in a service-oriented architecture (SOA) context. Rather than a use of data integration per se, this represents an emerging trend for data integration capabilities to play a role in, and to be implemented within, software-defined architecture for application services.
  • Integrating a combination of data residing on-premises and in SaaS applications or other cloud-based data stores and services, to fulfill requirements such as cloud service integration.
  • Supporting the delivery of data to, and the access of data from, platforms typically associated with big data initiatives, such as Hadoop, NoSQL and cloud-based data stores. These platforms provide opportunities for distributing data integration workloads to external parallelized processes. The emerging concept of a "data lake," where data is continuously collected and stored in a lightly structured NoSQL repository, poses data integration challenges but also opportunities to assist in the application of schemas at data read-time, if needed, and to deliver data to business users, processes or applications, or to use data iteratively.
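The schema-at-read-time idea mentioned above can be illustrated with a short, hypothetical Python sketch (the record layout and field names are invented for illustration): raw records land in the "lake" as-is, and a schema is applied only when the data is read, with nonconforming records quarantined rather than rejected at write time.

```python
import json

# Raw events are collected as lightly structured JSON lines;
# no schema is enforced when the data is written.
raw_records = [
    '{"id": 1, "amount": "19.99", "ts": "2015-07-29"}',
    '{"id": 2, "amount": "5.00"}',                      # missing field tolerated
    '{"id": 3, "amount": "bad", "ts": "2015-07-30"}',   # malformed value tolerated
]

def read_with_schema(lines):
    """Apply a schema at read time: cast types, default missing fields,
    and quarantine records that do not conform."""
    good, rejected = [], []
    for line in lines:
        rec = json.loads(line)
        try:
            good.append({
                "id": int(rec["id"]),
                "amount": float(rec["amount"]),
                "ts": rec.get("ts", "unknown"),   # default a missing field
            })
        except (KeyError, ValueError):
            rejected.append(rec)                  # quarantine, don't fail the read
    return good, rejected

good, rejected = read_with_schema(raw_records)
```

Here two of the three raw records conform to the schema applied at read time; the third is set aside for later inspection, which is one way a data lake defers data quality decisions to consumption time.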
Gartner has defined several classes of functional capability that vendors of data integration tools provide to deliver optimal value to organizations in support of a full range of data integration scenarios:
  • Connectivity/adapter capabilities (data source and target support). The ability to interact with a range of different types of data structure, including:
    • Relational databases
    • Legacy and nonrelational databases
    • Various file formats
    • XML
    • Packaged applications, such as those for CRM and supply chain management
    • SaaS and cloud-based applications and sources
    • Industry-standard message formats, such as electronic data interchange (EDI), Health Level Seven International (HL7) and Society for Worldwide Interbank Financial Telecommunication (SWIFT)
    • Parallel distributed processing environments such as Hadoop Distributed File System (HDFS) and other NoSQL-type repositories, such as graph, table-style, document store and key-value DBMSs
    • Message queues, including those provided by application integration middleware products and standards-based products (such as Java Message Service)
    • Data types of a less structured nature, such as that associated with social media, Web clickstreams, email, websites, office productivity tools and content
    • Emergent sources, such as data on in-memory repositories, mobile platforms and spatial applications
    • Screen-scraping and/or user interaction simulations (for example, scripts to interact with Web, 3270, VT100 and others)
  • Data integration tools must support different modes of interaction with this range of data structure types, including:
    • Bulk/batch acquisition and delivery
    • Granular trickle-feed acquisition and delivery
    • Change data capture (CDC) — the ability to identify and extract modified data
    • Event-based acquisition (time-based, data-value-based, or links to application integration tools to interact with message request/reply, publish/subscribe, and routing)
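As a concrete illustration of change data capture, the following hypothetical Python sketch identifies modified data by comparing two keyed snapshots; production CDC tools typically read DBMS transaction logs instead, but the output (inserted, updated and deleted rows) is the same in spirit. The table contents and field names are invented.

```python
def capture_changes(previous, current):
    """Snapshot-comparison CDC: diff two snapshots keyed by primary key
    and emit only the inserted, updated and deleted rows."""
    inserts = [current[k] for k in current.keys() - previous.keys()]
    deletes = [previous[k] for k in previous.keys() - current.keys()]
    updates = [current[k] for k in current.keys() & previous.keys()
               if current[k] != previous[k]]
    return {"insert": inserts, "update": updates, "delete": deletes}

# Yesterday's and today's snapshots of a customer table (keyed by id).
previous = {1: {"id": 1, "name": "Acme"}, 2: {"id": 2, "name": "Globex"}}
current = {1: {"id": 1, "name": "Acme Corp"}, 3: {"id": 3, "name": "Initech"}}

changes = capture_changes(previous, current)
```

Only the three changed rows are delivered downstream, rather than the full table, which is the point of CDC-based acquisition.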
  • Data delivery capabilities. The ability to provide data to consuming applications, processes and databases in a variety of modes, including:
    • Physical bulk/batch data movement between data repositories, such as processes for ETL or extraction, loading and transformation (ELT)
    • Data federation/virtualization
    • Message-oriented encapsulation and movement of data (via linkage with application integration tool capability)
    • Replication of data between homogeneous or heterogeneous DBMSs and schemas
  • In addition, support for the delivery of data across the range of latency requirements is important, including:
    • Scheduled batch delivery
    • Streaming/near-real-time delivery
    • Event-driven delivery of data based on identification of a relevant event
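Data federation/virtualization contrasts with physical bulk/batch movement: queries are answered by reaching into the live sources and joining results in flight, rather than by copying data into a target store. A minimal, hypothetical Python sketch, using an in-memory SQLite table and a plain Python list as stand-ins for two heterogeneous sources:

```python
import sqlite3

# Source 1: a relational DBMS (in-memory SQLite stands in here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(100, 1, 250.0), (101, 2, 80.0)])

# Source 2: a non-relational source (a file or REST feed stands in here).
customers = [{"customer_id": 1, "name": "Acme"},
             {"customer_id": 2, "name": "Globex"}]

def virtual_order_view():
    """A federated 'view': each call queries both sources live and joins
    the results in flight; nothing is materialized into a target store."""
    lookup = {c["customer_id"]: c["name"] for c in customers}
    for order_id, customer_id, total in db.execute(
            "SELECT order_id, customer_id, total FROM orders ORDER BY order_id"):
        yield {"order_id": order_id, "customer": lookup[customer_id], "total": total}

rows = list(virtual_order_view())
```

Because the view is computed on demand, consumers always see current source data, at the cost of pushing query load onto the sources; that trade-off is why federation complements, rather than replaces, bulk/batch delivery.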
  • Data transformation capabilities. Built-in capabilities for achieving data transformation operations of varying complexity, including:
    • Basic transformations, such as data-type conversions, string manipulations and simple calculations
    • Transformations of intermediate complexity, such as look-up and replace operations, aggregations, summarizations, integrated time series, deterministic matching and the management of slowly changing dimensions
    • Complex transformations, such as sophisticated parsing operations on free-form text, rich media and patterns/events in big data
In addition, the tools must provide facilities for developing custom transformations and extending packaged transformations.
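The first two tiers of transformation complexity can be sketched in a few lines of hypothetical Python (the lookup table and row layout are invented): a string manipulation and data-type conversion, a look-up/replace against a reference table, and an aggregation.

```python
from collections import defaultdict

raw = [
    {"country": "us", "amount": "19.99"},
    {"country": "DE", "amount": "5.50"},
    {"country": "us", "amount": "4.01"},
]

# Reference table for a look-up and replace operation.
country_names = {"US": "United States", "DE": "Germany"}

def transform(rows):
    """Basic transformations: string manipulation, data-type conversion
    and a look-up/replace against a reference table."""
    for row in rows:
        yield {"country": country_names[row["country"].upper()],
               "amount": float(row["amount"])}

def aggregate(rows):
    """Intermediate transformation: aggregation/summarization by key."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["country"]] += row["amount"]
    return dict(totals)

totals = aggregate(transform(raw))
```

Complex transformations (free-form text parsing, pattern detection in big data) go well beyond this shape, which is why built-in transformation libraries plus extensibility are both evaluated.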
  • Metadata and data modeling support. As the increasingly important heart of data integration capabilities, metadata management and data modeling requirements include:
    • Automated discovery and acquisition of metadata from data sources, applications and other tools
    • Discernment of relationships between data models and business process models
    • Data model creation and maintenance
    • Physical-to-logical model mapping and rationalization
    • Ability to define model-to-model relationships via graphical attribute-level mapping
    • Lineage and impact analysis reporting, in graphical and tabular formats
    • An open metadata repository, with the ability to share metadata bidirectionally with other tools
    • Automated synchronization of metadata across multiple instances of the tools
    • Ability to extend the metadata repository with customer-defined metadata attributes and relationships
    • Documentation of project/program delivery definitions and design principles in support of requirements definition activities
    • A business analyst/end-user interface to view and work with metadata
  • Design and development environment capabilities. Facilities for enabling the specification and construction of data integration processes, including:
    • Graphical representation of repository objects, data models and data flows
    • Management of the development process workflow, addressing requirements such as approvals and promotions
    • Granular, role-based and developer-based security
    • Team-based development capabilities, such as version control and collaboration
    • Functionality to support reuse across developers and projects, and to facilitate the identification of redundancies
    • A common or shared user interface for design and development (of diverse data delivery styles, of data integration and data quality operations, of cloud and on-premises environments, and so on)
    • A business analyst/end-user interface to specify and manage mapping and transformation logic through the use of end-user functionality for data integration/preparation
    • Support for testing and debugging
  • Information governance support capabilities (via interoperation with data quality, profiling and mining capabilities with the vendor's or a third party's tools). Mechanisms to work with related capabilities to help with the understanding and assurance of data quality over time, including interoperability with:
    • Data profiling tools (profiling and monitoring the conditions of data quality)
    • Data mining tools (relationship discovery)
    • Data quality tools (supporting data quality improvements)
  • Deployment options and runtime platform capabilities. Breadth of support for the hardware and operating systems on which data integration processes may be deployed, and the choices of delivery model — specifically:
    • Mainframe environments, such as IBM z/OS and z/Linux
    • Midrange environments, such as IBM System i or HP Tandem
    • Unix-based environments
    • Windows environments
    • Linux environments
    • On-premises (at the customer site) installation and deployment of software
    • Hosted off-premises software deployment (dedicated, single-tenant implementation)
    • Integration platform as a service (iPaaS), consumed by the customer completely "as a service" — the vendor provides cloud infrastructure; the customer does not install and administer the software
    • Cloud deployment support (requires organizations to deploy software in cloud infrastructure)
    • In-memory computing environment
    • Server virtualization (support for shared, virtualized implementations)
    • Parallel distributed processing (such as Hadoop and MapReduce)
  • Operations and administration capabilities. Facilities for enabling adequate ongoing support, management, monitoring and control of the data integration processes implemented by the tools, such as:
    • Error-handling functionality, both predefined and customizable
    • Monitoring and control of runtime processes, both via functionality in the tools and through interoperability with other IT operations technologies
    • Collection of runtime statistics to determine use and efficiency, as well as an application-style interface for visualization and evaluation
    • Security controls, for both data in-flight and administrator processes
    • A runtime architecture that ensures performance and scalability
  • Architecture and integration capabilities. The degree of commonality, consistency and interoperability between the various components of the data integration toolset, including:
    • A minimal number of products (ideally one) supporting all data delivery modes
    • Common metadata (a single repository) and/or the ability to share metadata across all components and data delivery modes
    • A common design environment to support all data delivery modes
    • The ability to switch seamlessly and transparently between delivery modes (bulk/batch versus granular real-time versus federation) with minimal rework
    • Interoperability with other integration tools and applications, via certified interfaces, robust APIs and links to messaging support
    • Efficient support for all data delivery modes, regardless of runtime architecture type (centralized server engine versus distributed runtime)
    • The ability to execute data integration in cloud and on-premises environments, as appropriate, where developed artifacts can be interchanged, reused and deployed across both environments with minimal rework
  • Service enablement capabilities. As acceptance of data service concepts continues to grow, so data integration tools must exhibit service-oriented characteristics and provide support for SOA, such as:
    • The ability to deploy all aspects of runtime functionality as data services (for example, deployed functionality can be called via a Web services interface)
    • Management of publication and testing of data services
    • Interaction with service repositories and registries
    • Service enablement of development and administration environments, so that external tools and applications can dynamically modify and control the runtime behavior of the tools
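As a rough illustration of service enablement, the following hypothetical Python sketch wraps a simple data lookup in a WSGI application, so the same integration logic can be called through a web-service-style interface. The resource names and record layout are invented, and a real deployment would sit behind an actual web server.

```python
import json
from wsgiref.util import setup_testing_defaults

# Integration logic exposed as a callable "data service".
CUSTOMERS = {"1": {"id": "1", "name": "Acme"},
             "2": {"id": "2", "name": "Globex"}}

def customer_service(environ, start_response):
    """Minimal WSGI data service: GET /customers/<id> returns one record
    as JSON, so any web-service consumer can invoke the lookup."""
    key = environ.get("PATH_INFO", "").rsplit("/", 1)[-1]
    record = CUSTOMERS.get(key)
    if record is None:
        start_response("404 Not Found", [("Content-Type", "application/json")])
        return [b'{"error": "not found"}']
    start_response("200 OK", [("Content-Type", "application/json")])
    return [json.dumps(record).encode("utf-8")]

def call(path):
    """Invoke the service in-process, exactly as a WSGI server would."""
    environ = {}
    setup_testing_defaults(environ)
    environ["PATH_INFO"] = path
    status = {}
    def start_response(s, headers):
        status["code"] = s
    body = b"".join(customer_service(environ, start_response))
    return status["code"], body

code, body = call("/customers/1")
```

The point of service enablement is that the consuming application needs no knowledge of where or how the underlying data is integrated; it simply calls the service interface.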

Magic Quadrant

Figure 1. Magic Quadrant for Data Integration Tools
Source: Gartner (July 2015)

Vendor Strengths and Cautions


Actian

Based in Redwood City, California, U.S., Actian offers data integration capabilities via Actian DataConnect and Actian DataCloud. Actian's customer base for data integration tools is estimated to number more than 6,800 organizations.
  • Core capability and performance. Actian's continued focus on diverse connectivity and emphasis on industry-standard message formats enable companies to support diverse data integration requirements, from real-time to batch mode. Customers report Actian's bulk/batch data delivery to be a key strength, one favored in implementations for having strong scalability and performance.
  • User productivity. The extension of functionality so that business roles can control user-defined integration mapping and data loading enables productivity gains. Actian has extended the capabilities of DataCloud and DataConnect, which will evolve to be delivered as managed services to further simplify customer onboarding and ease use for interenterprise data sharing.
  • Customer relationship. Actian's partnering posture toward companies results in a strong customer experience, encompassing the presale process, the selling process, and the postsale relationship.
  • Challenges with major upgrades. Customers report challenges with upgrades between major releases. A migration utility offered by Actian, along with increased frequency of releases, is intended to reduce the complexity of each upgrade and ease migration efforts.
  • Linkage of product support with professional services. Customers would like to see better alignment between product support for Actian's data integration tools and access to implementation skills and professional services.
  • Market positioning of use cases. Prospective buyers perceive Actian's portfolio as aligned primarily with analytics support requirements, which puts it at a competitive disadvantage for broader scenarios, even though customers report diverse data integration use cases in production.


Adeptia

Based in Chicago, Illinois, U.S., Adeptia offers the Adeptia Integration Suite and Adeptia Connect. Adeptia's customer base for this product set is estimated to number approximately 530 organizations.
  • Integrated product and time-to-value. Adeptia's data integration technology is offered alongside other integration tools that provide an enterprise service bus, B2B integration and trading partner management — all in a single product. Customers appreciate the tight integration of the underlying components and the ability to support rapid implementation.
  • Expanded applicability using cloud services. Adeptia Connect offers iPaaS functionality that enables interenterprise data sharing, where businesses are able to publish and use data connectors for B2B integration. Planned enhancements to support EDI schemas, data dictionaries, mapping and prebuilt EDI connections to common applications are expected at the end of 2015, followed by a single interface for using both iPaaS and on-premises tools.
  • Pricing and value. Customers view Adeptia's tools as attractively priced and as delivering good value, relative to the cost of tools from major competitors. Adeptia's recent standardization of subscription-based pricing aims to simplify procurement, based on tiered editions and feature sets.
  • Skills and market coverage. While Adeptia claims its products' ease of use and reduced dependence on skilled resources are differentiators, finding those skilled resources might prove challenging for organizations trying to implement and maintain deployments as their requirements grow.
  • Learning experience. Some Adeptia customers indicate usage challenges when business-oriented users, rather than integration developers, are first learning to perform complex data integration activities. Generally, however, Adeptia's enabling of business-user-driven integration tasks and overall ease of use attract buyers to its Web-based product — and these are differentiating factors that the company continues to focus on.
  • Big data support. Overall, Adeptia's implementations and competitive activities indicate limited traction in support of big data initiatives, which are increasingly emphasized in this market.


Cisco

Based in San Jose, California, U.S., Cisco offers the Cisco Information Server and Cisco Integration Platform. Cisco's customer base for this product set is estimated to number more than 320 organizations.
  • Agility and rapid time-to-value. Cisco has a strong focus on data federation/virtualization, and a well-established track record for capitalizing on the growing demand in this area. Reference customers praise Cisco's data virtualization technology for enabling agile development, and using optimization techniques such as pushing processing down to data sources, which minimizes data movement and reduces time-to-value.
  • Expanding product portfolio. Cisco continues to evolve its product portfolio by supporting the logical data warehouse (LDW), integrating streaming data and Internet of Things (IoT) environments, linking with Cisco Tidal Enterprise Scheduler for data integration workflows, and envisaging capability for data preparation. Cisco's expanding product portfolio supports the objective of enabling bimodal IT and digitalization, and of providing an enhanced analytics experience.
  • Customer relationship and market access. Reference customers state that Cisco is extremely responsive to customer needs and that the quality of its customer support is good. Cisco's global reach continues to widen the availability of the data integration tools it acquired from Composite Software.
  • Breadth of coverage and market resonance. Organizations seeking providers with a comprehensive range of data delivery capabilities (beyond federated/virtualized styles) often find Cisco's product set to be narrow. Cisco's relatively small customer base in this market limits the availability of relevant skills, which at times poses a barrier to adoption. Cisco needs to increase its market resonance on the basis of its acquired offerings from Composite Software, as Cisco is not an incumbent vendor in the information management technology sector.
  • Synergy with data management capabilities. Although Cisco is working to improve its metadata support, customers are still looking for a seamless way to integrate metadata across diverse data integration use cases. Customers are increasing their expectations for governance support and requiring integrated use of Cisco's data integration capability with comprehensive data quality tools.
  • Deployment and diagnostic guidance. Reference customers identify diagnosis of error messages as challenging, and require better diagnostic support and integrated documentation. Customers also expressed a desire for a more mature user community, for improved access to implementation guidance and practices.


Denodo

Based in Palo Alto, California, U.S., Denodo offers the Denodo Platform. Denodo's customer base for its data integration product is estimated to number 250 companies.
  • Capitalization on demand. The Denodo Platform provides data virtualization, with an established basis in the enabling of data abstraction capabilities for joining multistructured data sources from DBMSs, websites, documents and a variety of repositories. Physical data movements are supported via the Denodo Scheduler component, which delivers data from repository to cache, or directly to another repository. Denodo capitalizes on the increasing traction and importance of data virtualization in the overall market for data integration tools.
  • Track record and partner channels. With a recognized track record in data virtualization, Denodo has built a partner network of implementers and joint-marketing vendors, including IBM, Cloudera, Hortonworks, MicroStrategy, Tableau, SAP, MongoDB and Pivotal. A diverse range of software vendors license or bundle Denodo's functionality as part of their products in support of logical abstraction and agility for analytics, big data and operational use cases.
  • Connectivity support and links to related integration capability. The Denodo Platform provides broad connectivity to relational databases, prerelational legacy data, flat files, XML, packaged applications and emergent data types including Hadoop and cloud-based data sources. The Denodo Platform can receive and publish data in a variety of formats and interfaces, including Java Database Connectivity (JDBC), Open Database Connectivity (ODBC), Java Message Service (JMS)-compliant message queues, REST and SOAP Web services, JavaScript Object Notation (JSON), XML, portlets and SharePoint Web parts. It can also support discovery and the use of data services.
  • Breadth of coverage. Organizations seeking providers with a breadth of data delivery styles may find Denodo's integrated platform to have limited versatility, relative to competitors with toolsets for diverse data delivery. Customers express a desire for easier administrative manageability and links to related tools for data governance support.
  • Degree of metadata support. Reference customers identify Denodo's metadata management as an area of relative weakness when enabling reusability across a growing range of software tools and use cases. Customers increasingly look for comprehensive functionality and a vision for these requirements, to address the escalating number and variety of datasets and distributed data architectures.
  • Availability of skills and best-practice documentation. Since Denodo has a relatively small customer base, customers express concerns about a lack of adequately skilled implementers in the market. Denodo has addressed this shortcoming with additional system integration partners and training. Reference customers express a desire for improvements in guidance on design architecture and documentation of best practices.


IBM

Based in Armonk, New York, U.S., IBM offers the following data integration products: IBM InfoSphere Information Server Enterprise Edition (including InfoSphere Information Server for Data Integration and InfoSphere Business Information Exchange), InfoSphere Federation Server, InfoSphere Data Replication, InfoSphere Information Server Enterprise Hypervisor Edition and WebSphere Cast Iron Live. IBM's customer base for this product set is estimated to number more than 10,700 organizations.
  • Depth and breadth of usage. IBM's data integration tools continue to be deployed for extensive use cases — often those of complex scale, spanning a wide range of projects and involving teams of various sizes.
  • Mind share and capitalization on market demand. IBM continues to gain traction as an enterprise standard for data integration infrastructure, with a strong presence in competitive bids. The linkage of data integration capability with diverse analytics support, and the embedding of IBM DataWorks (for self-service data preparation) in Watson Analytics and in cloud-based data stores such as IBM dashDB, are increasing the synergy of IBM's data integration tools with its broader portfolio.
  • Versatility in enabling information infrastructure and analytics. IBM continues to focus on enabling information infrastructure modernization in its efforts to align data integration with diverse demands for information capabilities (including data quality and governance, support for line-of-business users, big data, integration support in cloud adoptions and MDM). IBM's data integration focus is expanding applicable usage scenarios to business-facing roles by increasing self-service capabilities and deepening the synergies between information infrastructure and analytics use cases, to form integrated toolsets available on a common platform.
  • Product support and version upgrades. Customers reported difficulty with version upgrades and migrations. IBM has begun mitigating this through in-place upgrades, where downtime can be avoided, and will continue to mitigate it by converging the timing of various InfoSphere product releases, for better anticipation and alignment of upgrades.
  • Pricing model. Reference customers identify software costs and perceived total cost of ownership (TCO) as barriers to broader adoption. IBM's provision of varied licensing approaches, such as core-, workgroup-, bundle-, subscription- and perpetual-based models, while intended to provide more procurement and pricing choices, has reportedly also confused customers assessing and selecting pricing models.
  • Complexity of integrated use of portfolio. In general, customers expressed difficulty with integrated deployments of IBM's data integration tools alongside other IBM products, such as challenges associated with the integration of IBM InfoSphere DataStage with IBM BigInsights.


Informatica

Based in Redwood City, California, U.S., Informatica offers the following data integration products: Informatica Platform (including PowerCenter, PowerExchange, Data Services, Data Replication, Ultra Messaging, Big Data, B2B Data Exchange and Data Integration Hub), Vibe Data Stream, and Informatica Cloud Integration. Informatica's customer base for this product set is estimated to number more than 5,500 organizations. At the time of writing, Informatica has announced an agreement to be acquired and taken into private ownership by a company controlled by the European private equity firm Permira Advisers and the Canada Pension Plan Investment Board. The acquisition awaits regulatory approval and is scheduled to be completed in 3Q15.
  • Strength of data integration functionality and synergy with portfolio. Informatica's tools continue to reflect a diverse range of data integration styles, usage scenarios and multiproject deployments. Strong synergies between Informatica's data integration tools and other Informatica technologies encourage usage as an enterprise standard for a data integration infrastructure that links with data quality, MDM, big data, data security and cloud integration technologies. Informatica's emphasis on supporting digital services, the IoT, and related analytics and data security opportunities capitalizes on trends in demand.
  • Broad presence and dedicated focus on, and innovation in, data management and integration. Informatica's mind share in this market is extensive, with the highest frequency of appearances in competitive situations. Informatica's concentration on enabling information capabilities aligns its application- and technology-agnostic offerings with a broad range of established and emerging information infrastructures. Informatica's v10 release, planned for late 2015, advances its metadata-rich capabilities with Live Data Map, to enable data-driven applications (such as Secure@Source for tracking and protecting sensitive, private information and Project Sonoma for operationalizing self-service capability).
  • Alignment with evolving trends and business-facing demand. The linking of Informatica's data integration offerings to its self-service data preparation tool facilitates a Microsoft Excel-like interface through which end users can build ETL-type tasks that can be deployed on Informatica Platform. Strong adoption of Informatica's iPaaS aligns well with the growing movement of data integration architectures toward cloud-based and hybrid delivery models. Informatica's v10 release is expected to feature new capabilities that support any type of user, data and mode of deployment.
  • Business evolution. The announced acquisition of Informatica generated some uncertainty among prospective and existing customers about the possibility of an impending strategy that could affect Informatica's roadmap and status as a thought-leader. Informatica has told its customers and partners that its commitment to delivering on its roadmap remains unchanged.
  • Cost model. Prospective customers point to difficulty understanding Informatica's licensing and pricing methods. Existing customers often express concerns about high costs relative to alternatives in this market. Informatica has begun to address some of these concerns by introducing simpler product packaging and pricing, by consolidating multiple add-on products.
  • Clarity of product messaging and portfolio architecture. Informatica's portfolio has grown large, and customers often express confusion about overlapping products and functionality. They want more intuitive ways to understand and navigate the offerings that Informatica's data integration tools work with. They also want guidance on how to add new products to existing Informatica deployments, and easier integrated usage of components.

Information Builders

Based in New York, New York, U.S., Information Builders offers the following data integration products: iWay Integration Suite (composed of iWay Service Manager and iWay DataMigrator) and iWay Universal Adapter Suite. Information Builders' customer base for this product set is estimated to number more than 800 organizations.
  • Robust functionality and broad usage. Information Builders' data integration tools support a diverse and balanced set of use cases, for which the core functionality remains robust and reliable. The vendor's capabilities in adapters and connectivity, comprehensive data transformation and encapsulating data into real-time message flows are regarded as key strengths in deployments.
  • Synergy of data integration tools with enterprise information capabilities. Information Builders continues to focus on evolving an integrated environment for data integration capability that can operate with enterprise service bus and master data management functions. This aligns well with demand for support of data integration activities, so that competency teams can implement integration, process management, data governance and analytics in a synergistic way.
  • Customer relationship. Reference customers report a positive overall experience with Information Builders, both before buying and after implementation. Selection of this vendor's data integration tools is often influenced by an existing relationship and use of other Information Builders products.

Cautions

  • Mind share with business leaders and influencers. Information Builders appeals mainly to technical communities and IT buyers, but it has relatively low mind share with business management and process leaders, who increasingly influence the adoption of information capabilities. This creates a market visibility challenge at a time when major competitors are engaging more with those roles.
  • Adoption scale and learning experience. Implementations of Information Builders' products by enterprises show an increase in departmental-level and narrower deployments. Many customers identify challenges with product complexity, a long learning curve and limited availability of skilled practitioners.
  • Product documentation and time-to-value. Reference customers have stated they want to see more improvements to Information Builders' documentation of products and best practices. Customers seek more extensive documentation of technical components to enable consistency of developer practices and faster time-to-value in implementations.


Microsoft

Based in Redmond, Washington, U.S., Microsoft offers data integration capabilities via SQL Server Integration Services (SSIS), which is included in the SQL Server DBMS license. Vast worldwide deployments of Microsoft SQL Server involve usage of SSIS for data integration, although Microsoft does not report a specific customer count for SSIS.

Strengths

  • Relevant capabilities and TCO. Reference customers cite overall low TCO, speed of implementation, ease of use, and the ability to integrate with other Microsoft SQL Server capabilities as the main reasons for choosing SSIS over alternatives.
  • Alignment with data management and process- and user-oriented integration. SSIS supports connectivity to diverse data types and broad deployment in Microsoft-centric environments. SSIS is often used to put data into SQL Server to enable analytics, data management and end-user data manipulation using Microsoft Office tools, particularly Excel. Using SSIS in conjunction with Microsoft's BizTalk and Azure Data Factory platforms enables delivery of data from business workflows in enterprise applications for user-configured integration flows.
  • Widespread tool presence and usage experience. Broad familiarity with the implementation of Microsoft technologies spurs usage of SSIS. Wide choices in terms of community collaboration, training, and third-party documentation and guidance for deployment practices are reported as key points of value.

Cautions

  • Integration of portfolio. Reference customers cite difficulties with integrated implementation of Microsoft's offerings across its portfolio as they address the growing scale and complexity of deployment scenarios for data integration activities.
  • Platform support. The inability to deploy data integration workloads on non-Windows environments is a limitation for customers wishing to draw on the processing power of diverse hardware and operating system platforms.
  • Linkage to deployments of information infrastructure. Reference customers cited a desire for more extensive big data support when manipulating and delivering data of interest, and for more guidance to enable data quality and governance in relation to data integration activities. These challenges relate to the growing complexity of, and demand for, information capabilities. Microsoft is progressively enhancing its big data support with links to machine learning and streaming analytics, and envisaging capabilities for data hub enablement.


Oracle

Based in Redwood Shores, California, U.S., Oracle offers the following data integration products: Oracle Data Integrator (ODI), Oracle GoldenGate and Oracle Data Service Integrator. Oracle's customer base for this product set is estimated to number more than 10,000 organizations.

Strengths

  • Broad usage and applicability. Oracle's data integration tool deployments reflect a mix of use cases and balanced market traction for ELT capabilities and real-time-oriented CDC and replication support. Oracle has extended its big data support to enable ingestion of streaming data by ODI workflows that can be deployed in Apache Spark and Pig environments. To capitalize further on big data and streaming integration scenarios, Oracle is expanding its product development competencies to focus on machine learning in modeling and design processes.
  • Synergies with portfolio's broad range of technologies. Recognition of Oracle's diverse portfolio for addressing data integration and other data and application-oriented requirements (spanning data quality tools, MDM solutions, ESB, analytic appliances and enterprise applications) continues to fuel its appeal in deployment scenarios.
  • Brand awareness and market presence. Oracle's size and global coverage of applications and analytics solutions enables it to draw on a huge customer base and a wide product distribution model for positioning data integration tools. Broad usage of Oracle's technologies within its customer base has driven wide availability of community support, training and third-party documentation on implementation practices and approaches to problem resolution.

Cautions

  • Functional fulfillment. Although perceptions of Oracle's evolution of its data integration tools are generally positive, product-related customer satisfaction has declined in relation to developers' productivity with newer versions, overall ease of use and time-to-value. Customers want easier deployment of source and target changes, simpler monitoring across platforms, and more control over concurrency and queues.
  • Customer experience. Customers point to challenges in terms of version upgrades, bugs in new releases, responsiveness and the quality of product support. Customers of Oracle's data integration tools identify challenges with the processes for obtaining product support, especially in urgent cases.
  • Pricing. Concerns about prices, target and source-based licensing requirements, and perceived hardware-oriented cost challenges as deployments broaden have generated dissatisfaction among customers.


SAP

Based in Walldorf, Germany, SAP offers the following data integration products: SAP Data Services, SAP Replication Server, SAP Landscape Transformation Replication Server, SAP Process Orchestration, SAP Hana Cloud Integration, SAP Hana Enterprise Information Management (EIM, including SAP Hana Smart Data Integration and SAP Hana Smart Data Quality), SAP Agile Data Preparation and SAP PowerDesigner. SAP's customer base for this product set is estimated to number more than 15,000 organizations.

Strengths

  • Broad usage and functionality. The breadth of functionality available across SAP's portfolio supports a diverse mix of data integration styles and use cases of increasing complexity. Usage supports synergistic deployments with SAP's broad application and information infrastructure offerings. Newly released SAP Hana EIM capabilities in SAP Hana Smart Data Integration and SAP Hana Smart Data Quality include built-in data integration adapters, as well as a software development kit, which enhances the construction of data access and integration flows of variable latency involving on-premises and cloud-based data, in-memory data stores, Apache Spark and NoSQL environments.
  • Alignment with business-facing demand. Drawing on the linkage of data integration with information stewardship, alongside delivery of self-service data preparation functionality, SAP is making collaboration easier between business users and data integration practitioners. The launch of SAP Agile Data Preparation, with its user-facing functionality, enables business roles to access data sources, perform data transformations, and model datasets of interest in support of business scenarios.
  • Market presence and links to related disciplines. Capitalizing on its brand recognition, global reach and vast customer base in related disciplines, SAP maintains a high share of the market's data integration tool adoption. Enterprises with an incumbent portfolio of SAP products naturally look to the same vendor to provide their data integration technology.

Cautions

  • Roadmap alignment and market messaging. There are concerns among customers about roadmaps, where the capabilities of SAP's agnostic data integration tools and native Hana products appear to overlap. Concerns about SAP's offerings becoming tightly linked to Hana have given rise to a perception that SAP is placing less emphasis on addressing the needs of non-SAP environments (for timely releases of heterogeneous data connectivity, for example). However, recent product releases and published roadmaps from SAP show continued development of heterogeneous features in its agnostic data integration tools.
  • Support for metadata. Although SAP offers extensive metadata and modeling functionality for data integration activities through SAP PowerDesigner, some reference customers are unaware of these capabilities and require help to address the growing scale and diversity of data integration scenarios that are making metadata management and modeling more complex in information infrastructure environments.
  • Customer support, service experience and skills. Reference customers' feedback indicates concerns about the overall customer experience. They want better guidance and support for best practices to ease the learning curve, wider availability of high-quality professional services, and shorter time-to-value for deployments.


SAS

Based in Cary, North Carolina, U.S., SAS offers the following data integration products: Data Management Platform, Federation Server, SAS/Access, SAS Data Loader for Hadoop and SAS Event Stream Processing. SAS's customer base for this product set is estimated to number 14,000 organizations.

Strengths

  • Broad and integrated portfolio. The breadth and completeness of core functions and the integration of components position SAS to compete with larger and more established vendors in the data integration market. SAS's newly introduced Data Loader for Hadoop provides a guided user interface to facilitate data loading and data preparation (profiling, cleansing and transforming data), capitalizing on growing market demand.
  • Customer relationship. Reference customers report that their relationship with SAS, both before purchasing and after implementation, is exceptional. This contributes to longer-term, recurring engagements.
  • Product reliability and stability. Reference customers praised SAS's products for stability, reliability, robustness and effectiveness. These qualities, along with synergistic capabilities across its portfolio, establish SAS's data integration technology as dependable and mature.

Cautions

  • Cost and usage. Reference customers expressed concerns about high prices and a licensing model that they perceive as limiting their ability to expand deployments. SAS's data integration tool exhibits a dominant emphasis on analytics scenarios, which reflects narrower versatility compared with leading competitors in this market.
  • Availability of deployment skills. Reference customers expressed a desire for greater availability of resources who possess a deep knowledge of SAS tools outside its own professional services business, to give them wider procurement and cost options.
  • Metadata support and ease of deployment. Some reference customers indicate requirements for more extensive metadata support when using SAS tools for data integration. However, recent enhancements provide capabilities that some customers may not have taken advantage of; these include discovering, integrating and facilitating the reuse of metadata both within and external to SAS technologies. Customers identify difficulties, and display reduced satisfaction, with tool setup and product version upgrades and migrations.


Syncsort

Based in Woodcliff Lake, New Jersey, U.S., Syncsort offers DMX (for Linux, Unix and Windows) and DMX-h (for Hadoop). Syncsort's customer base for this product set is estimated to number 1,500 organizations.

Strengths

  • Performance of core functionality and time-to-value. Syncsort continues to provide high-performance bulk/batch data movement capabilities with faster time-to-value than many competitors. These strengths continue to fulfill requirements for targeted functionality and superior performance and throughput.
  • Capitalization on big data initiatives. Syncsort is widening its data integration focus to work with diverse parts of the Hadoop ecosystem, interact with streaming data, and support external parallelized processing in offloading heavy legacy-infrastructure-related ETL or ELT workloads from data warehouses and mainframes to Hadoop. Using its "Intelligent Execution" framework, Syncsort sets out to insulate users from the underlying complexities of Hadoop, enable flexible deployment to diverse platforms, and support on-premises or cloud-based deployments.
  • Customer relationship and track record. Syncsort offers a high quality of service and support, and many customers identify its technical support and their overall relationship with Syncsort as positives. With an established track record of optimizing ETL processing, a loyal customer base and strategic partners (including Amazon, Cloudera, Cognizant, Dell, Fujitsu, Hortonworks, MapR, Qlik, Splunk, Tableau and Waterline Data), Syncsort has a solid foundation on which to grow its market presence.

Cautions

  • Functional coverage and usage. Implementations of Syncsort's capabilities predominantly center on bulk/batch data movement, which poses challenges in competitive situations that require a broad range of data integration styles. However, links between Syncsort's tools and Apache Kafka and Storm are starting to support message-oriented data delivery.
  • Support for metadata and data quality. While Syncsort is making efforts to extend its metadata capabilities, such as by using Apache HCatalog, reference customers cite metadata management as an area requiring improvements in terms of the discovery of metadata in broad context, ease of access, and reuse in data integration processes. Customers express a desire for linkages to data quality capabilities in synergy with data integration tool usage.
  • Availability of skills and evolving cost model. There is a growing desire for greater availability of resources with a deep knowledge of Syncsort's tools to facilitate wider access to skills and more cost options. Cost-sensitivity is beginning to surface in broadening deployments.


Talend

Based in Redwood City, California, U.S., Talend offers Open Studio for Data Integration, Enterprise Data Integration, Platform for Data Services, Platform for Big Data and Integration Cloud. Talend's paying customer base for this product portfolio is estimated to number more than 3,600 organizations.

Strengths

  • Portfolio relevance and cost model. Reference customers appreciate the solid performance of Talend's functionality, its support of diverse use cases, and the lower TCO of Talend's technology, relative to competitors. Talend's free open-source offering and affordable developer-based pricing for fully featured software appeal to customers and frequently attract usage for augmenting data integration capabilities.
  • Commitment to big data and evolving trends. Talend continues to benefit from an early commitment to big data and Hadoop, with new customer adoption and technology advancements such as in-memory processing (with Apache Spark) and the enabling of real-time scenarios. Momentum to support big data, combined with a new iPaaS offering and plans for self-service data preparation capability (using data virtualization and iPaaS) in 1Q16, position Talend to expand its market reach.
  • Adaptable capabilities in a unified product set. Talend's portfolio, including data quality, MDM, business process management, an ESB, and a recently added metadata management tool, sets out to deepen synergies across information- and application infrastructure-related use cases. Customers value the configurability of Talend's tools, which makes them flexible enough to adapt to the business requirements of data integration processes and the availability of artifacts built by Talend's practitioner community.

Cautions

  • Breadth of experience with all data delivery styles and resourcing. While Talend's capabilities resonate well with bulk/batch-oriented data delivery needs, the company needs to increase awareness of its support for other data integration styles. Prospective customers perceived limitations in access to skilled resources, integration with incumbent technical environments and standards, and enterprisewide deployments.
  • Implementation guidance and time-to-value. Increased adoption of Talend's offerings is generating dissatisfaction with regard to the availability of usage references and best-practice implementation guidance. Deployments exhibit a long time-to-value, compared with major alternatives in this market. Customers desire improvements in technical support and documentation to ease version upgrades and migrations, tool usage, monitoring and administration. Talend is making significant investments in this area.
  • Product and market messaging. Some existing and prospective customers, particularly organizations considering putting Talend's portfolio to enterprisewide use, indicate that Talend does not adequately articulate the more evolved capabilities of its products or their synergistic uses.

Vendors Added and Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor's appearance in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.


Added

  • Denodo.


Dropped

  • None, but Cisco (Composite Software) now appears as Cisco.

Inclusion and Exclusion Criteria

To be included in this Magic Quadrant, vendors must possess within their technology portfolio the subset of capabilities identified by Gartner as the most critical from within the overall range of capabilities expected of data integration tools. Specifically, vendors must deliver the following functional requirements:
  • Range of connectivity/adapter support (sources and targets) — Native access to relational DBMS products, plus access to nonrelational legacy data structures, flat files, XML and message queues
  • Mode of connectivity/adapter support (against a range of sources and targets) — Bulk/batch and CDC
  • Data delivery modes support — At least two modes among bulk/batch data movement, federated/virtualized views, message-oriented delivery, and data replication and synchronization
  • Data transformation support — At a minimum, packaged capabilities for basic transformations (such as data type conversions, string manipulations and calculations)
  • Metadata and data modeling support — Automated metadata discovery, lineage and impact analysis reporting, ability to synchronize metadata across multiple instances of the tool, and an open metadata repository, including mechanisms for bidirectional sharing of metadata with other tools
  • Design and development support — Graphical design/development environment and team development capabilities (such as version control and collaboration)
  • Data governance support — Ability to interoperate at a metadata level with data profiling and/or data quality tools
  • Runtime platform support — Windows, Unix or Linux operating systems
  • Service enablement — Ability to deploy functionality as services conforming to SOA principles
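To make the baseline "data transformation support" requirement concrete, the following sketch shows what packaged basic transformations — data type conversions, string manipulations and calculations — amount to in a bulk/batch flow. This is purely illustrative and not drawn from any vendor's product; all field names and functions are hypothetical.

```python
# Illustrative sketch of the baseline transformation capability expected of a
# data integration tool: type conversions, string manipulations and
# calculations applied record by record in a bulk/batch flow.
# All names here are hypothetical, not taken from any vendor's product.

def transform(record):
    """Apply basic packaged transformations to one source record."""
    amount = float(record["amount"])            # data type conversion: string -> float
    return {
        "amount": amount,
        "customer": record["customer"].strip().title(),  # string manipulation
        # calculation: derive a gross amount from amount and tax rate
        "gross": round(amount * (1 + float(record["tax_rate"])), 2),
    }

def run_batch(source_records):
    """Bulk/batch delivery: transform the extracted set and return the target set."""
    return [transform(r) for r in source_records]

if __name__ == "__main__":
    source = [
        {"customer": "  acme corp ", "amount": "100.50", "tax_rate": "0.20"},
        {"customer": "globex", "amount": "75.00", "tax_rate": "0.10"},
    ]
    for row in run_batch(source):
        print(row)
```

In a real tool these transformations are configured graphically rather than coded, but the criterion above asks only that such operations be available as packaged capabilities.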
In addition, vendors had to satisfy the following quantitative requirements regarding their market penetration and customer base:
  • They must generate at least $20 million of their annual software revenue from data integration tools, or maintain at least 300 maintenance-paying customers for their data integration tools.
  • They must support data integration tool customers in at least two of the major geographic regions (North America, Latin America, Europe, the Middle East and Africa, and Asia/Pacific).
We excluded vendors that focus on only one specific data subject area (for example, the integration of customer data), a single industry, or only their own data models and architectures.
There are many vendors of data integration tools that do not meet the above criteria and are therefore not included in this Magic Quadrant. For example, many vendors provide products to address one very specific style of data delivery (such as data federation/virtualization) and cannot support other styles. Others provide a range of functionality, but operate only in a specific technical environment. Still others operate only in a single region or support only narrow, departmental implementations. Some vendors meet all the functional, deployment and geographic requirements, but are very new to the data integration tool market and therefore have limited revenue and few production customers.

Evaluation Criteria

Ability to Execute

Gartner analysts evaluate technology providers on the quality and efficacy of the processes, systems, methods or procedures that enable IT providers' performance to be competitive, efficient and effective, and to positively affect revenue, retention and reputation. Ultimately, technology providers are judged on their ability to capitalize on their vision and their success in doing so.
We evaluate vendors' Ability to Execute in the data integration tool market using the following criteria:
  • Product/Service. How well the vendor supports the range of distinguishing data integration functionalities required by the market, the manner (architecture) in which this functionality is delivered, support for established and emerging deployment models, and the overall usability and consumption of the tools. Product capabilities are critical to the success of data integration tool deployments and, therefore, receive a high weighting.
  • Overall Viability. The magnitude of the vendor's financial resources and the continuity of its people and technology, which affect the practical success of the business unit or organization in generating business results.
  • Sales Execution/Pricing. The effectiveness of the vendor's pricing model in light of current customer demand trends and spending patterns, and the effectiveness of its direct and indirect sales channels. This criterion is weighted high to reflect the major emphasis of buyers on cost models and ROI, and the criticality of consistent sales execution in order to drive a vendor's growth and customer retention.
  • Market Responsiveness/Record. The degree to which the vendor has demonstrated the ability to respond successfully to market demand for data integration capabilities over an extended period, and how well the vendor has acted on the vision of prior years.
  • Marketing Execution. The overall effectiveness of the vendor's marketing efforts, which impacts its mind share, market share and account penetration. The ability of the vendor to adapt to changing demands in the market by aligning its product message with new trends and end-user interests.
  • Customer Experience. The level of satisfaction expressed by customers with the vendor's product support and professional services; their overall relationship with the vendor; and their perceptions of the value of the vendor's data integration tools relative to costs and expectations. This criterion retains a weighting of "high" to reflect buyers' scrutiny of these considerations as they seek to derive optimal value from their investments. Analysis and rating of vendors against this criterion are driven directly by the results of a customer survey executed as part of the Magic Quadrant process.
Table 1. Ability to Execute Evaluation Criteria
  • Product/Service
  • Overall Viability
  • Sales Execution/Pricing
  • Market Responsiveness/Record
  • Marketing Execution
  • Customer Experience
  • Operations (Not Rated)
Source: Gartner (July 2015)

Completeness of Vision

Gartner analysts evaluate technology providers on their ability to convincingly articulate logical statements about current and future market direction, innovation, customer needs and competitive forces, as well as how they map to Gartner's position. Ultimately, technology providers are assessed on their understanding of the ways that market forces can be exploited to create opportunities.
We assess vendors' Completeness of Vision for the data integration tool market using the following criteria:
  • Market Understanding. The degree to which the vendor leads the market in recognizing opportunities represented by trends and new directions (technology, product, services or otherwise), and its ability to adapt to significant market inertia and disruption, including the degree to which the vendor is aligned with the significant trend for synergy with data management and application integration technologies. Given the dynamic nature of this market, this item receives a weighting of "high."
  • Marketing Strategy. The degree to which the vendor's marketing approach aligns with and/or exploits emerging trends and the overall direction of the market.
  • Sales Strategy. The alignment of the vendor's sales model with the ways in which customers' preferred buying approaches will evolve over time.
  • Offering (Product) Strategy. The degree to which the vendor's product roadmap reflects demand trends in the market, fills current gaps or weaknesses, and includes developments that create competitive differentiation and increased value for customers. In addition, given the requirement for data integration tools to support diverse environments for data, delivery models, and platform mix, we assess vendors on the degree of openness of their technology and product strategy. Given the intense evolution of both technology and deployment models in this market, this criterion receives a weighting of "high."
  • Business Model. The overall approach the vendor takes to execute its strategy for the data integration tool market, including diversity of delivery models, packaging and pricing options, and partnership.
  • Vertical/Industry Strategy. The degree of emphasis the vendor places on vertical solutions, and the vendor's depth of vertical-market expertise.
  • Innovation. The degree to which the vendor demonstrates creative energy by enhancing its practices and product capabilities, as well as introducing thought-leading and differentiating ideas and product plans that have the potential to significantly extend or reshape the market in a way that adds real value for customers. Given the pace of expansion of data integration requirements and the highly competitive nature of the market, this criterion receives a weighting of "high."
  • Geographic Strategy. The vendor's strategy for expanding its reach into markets beyond its home region/country, and its approach to achieving a global presence (for example, its direct local presence and use of resellers and distributors).
Table 2. Completeness of Vision Evaluation Criteria
  • Market Understanding
  • Marketing Strategy
  • Sales Strategy
  • Offering (Product) Strategy
  • Business Model
  • Vertical/Industry Strategy
  • Innovation
  • Geographic Strategy
Source: Gartner (July 2015)

Quadrant Descriptions


Leaders

Leaders in the data integration tool market are front-runners in the convergence of single-purpose tools into an offering that supports a full range of data delivery styles. They exhibit a clear understanding and vision of where the market is headed, and are strong in establishing data integration infrastructure as an enterprise standard and as a critical component of modern information infrastructure. They support both traditional and new data integration patterns to capitalize on market demand. Leaders have significant mind share in the market, and resources skilled in their tools are readily available. These vendors establish market trends (to a large degree) by providing new functional capabilities in their products, and by identifying new types of business problem to which data integration tools can bring significant value. Examples of deployments that span multiple projects and types of use case are common among Leaders' customers. Leaders have an established market presence, significant size and a multinational presence (directly or through a parent company).


Challengers

Challengers are well-positioned in light of the key trends in the data integration tool market, such as the need to support multiple styles of data delivery. However, they may not provide comprehensive breadth of functionality, or may be limited to specific technical environments or application domains. In addition, their vision may be hampered by a lack of coordinated strategy across the various products in their data integration tool portfolio. Challengers generally have substantial customer bases, an established presence, credibility and viability, although implementations may be of a single-project nature, or reflect multiple projects of a single type (for example, predominantly ETL-oriented use cases).


Visionaries

Visionaries demonstrate a strong understanding of emerging technology and business trends, or a position well-aligned with current demand, but they lack market awareness or credibility beyond their customer base or a single application domain. Visionaries may also fail to provide a comprehensive set of product capabilities. They may be new entrants lacking the installed base and global presence of larger vendors, although they could also be large, established players in related markets that have only recently placed an emphasis on data integration tools. The growing emphasis on aligning data integration tools with the market's demand for interoperability of delivery styles, integrated deployment of related offerings (such as data integration and data quality tools), metadata modeling, support for emerging information and application infrastructures, and deployment models (among other things), is creating challenges for which vendors must demonstrate vision.

Niche Players

Niche Players have gaps in both their Completeness of Vision and Ability to Execute. They often lack key aspects of product functionality and/or exhibit a narrow focus on their own architectures and installed bases. Niche Players may have good functional breadth but a limited presence and mind share in this market. With a small customer base and limited resources, they are not recognized as proven providers of comprehensive data integration tools for enterprise-class deployments. Many Niche Players have very strong offerings for a specific range of data integration problems (for example, a particular set of technical environments or application domains) and deliver substantial value for their customers in the associated segment.


Context

Data integration is central to enterprises' information infrastructure. Enterprises pursuing frictionless sharing of data are increasingly favoring tools that are flexible in regard to time-to-value demands, integration patterns, optimization for cost and delivery models, and synergies with information and application infrastructures.
Digital business will intensify data integration challenges. Use cases for generating more business value from an enterprise's information will accelerate the need to connect information across distributed data sources in far more diverse ways than has been the case with the traditional movement of bulk data. New types of data are emerging with the rise of digital businesses, and integration leaders now have to factor these into their data integration strategies. Enabling an integrated, digital business will add further complexity to an organization's data integration strategy by requiring a mix of latencies and patterns, as well as hybrid deployments using on-premises and cloud-based models.
Pressure is growing in this market as vendors are challenged to meet demand for innovation by enhancing traditional practices and introducing new models and practices.
Business imperatives to confront new information challenges are driving the need for a realignment of technology vision in this market. Demand trends in 2015 require vendors to increase their flexibility in approaching comprehensive data integration needs, and to demonstrate a balanced alignment with time-to-value, breadth of data integration functionality, diverse use cases and a quality customer experience. Buyers increasingly favor tools that exhibit end-user relevance, flexibility in cost and delivery models, and synergy with information and application infrastructure initiatives.

Market Overview

Enterprises' need to improve the flexibility of their information infrastructure is intensifying their focus on data integration activities. More information and application managers are realizing that data integration is a critical aspect of their overall enterprise information management (EIM) strategy and information infrastructure. They understand that they need to employ data integration capabilities to share data across all organizational and system boundaries.
Gartner estimates that the data integration tool market was worth approximately $2.4 billion in constant currency at the end of 2014, an increase of 6.9% from 2013. The growth rate is above the average for the enterprise software market as a whole, as data integration capability continues to be considered of critical importance for addressing the diversity of problems and emerging requirements. A projected five-year compound annual growth rate of approximately 7.7% will bring the total to more than $3.4 billion by 2019 (see "Forecast: Enterprise Software Markets, Worldwide, 2012-2019, 2Q15 Update").
Vendors' pursuit of a more comprehensive offering strategy — to support a broad range of use cases and capitalize on new demand — continues to shape this market's competitive landscape. Offerings that help equip enterprises with data integration capability, independent of applications, processes or technology platforms, do well in this market. Competitive pressures are intensifying the focus on technologies that support varying styles of data integration beyond bulk/batch, tightened links to data quality tools, and progression toward a model-driven approach that uses common metadata across the tool portfolio. Evolving their relevance and competitive positioning requires vendors to extend their vision, deepen their capabilities and broaden the applicability of their data integration offerings. This is in line with buyers' expectations for optimal functionality, performance and scalability in data integration tools, to ensure they work well with the same vendor's technology stack and, increasingly, interoperate across related information and application infrastructures.
The following trends reflect a shift in demand from buyers, as well as areas of opportunity for technology providers to provide thought leadership and innovation to extend this market's boundaries:
  • Growing interest in business moments and recognition of the required speed of digital business. In the context of digital business, "business moments" — opportunities of short duration, or points in time that set in motion a series of events involving people, businesses and things — are increasingly attracting the attention of enterprises. They want to harness data to seize these moments, which will require data integration support. Data integration functionality provided in a "sandbox" to support analytics is of growing interest; this approach enables data to be delivered and manipulated, physically or virtually, for ingestion regardless of where it resides, and encourages experimentation with, and the building of, new models for using data of interest. As pressures for real-time data integration grow, organizations will need to manage a range of data latencies to make data available for use within acceptable service levels and to match the required speed of business.
  • Intensifying pressure for enterprises to modernize and enlarge their data integration strategy. Data integration architecture that predominantly uses a single style and form — for example, batch-oriented, bulk data extract and delivery (accomplished via ETL techniques) — falls short in enterprises that face overwhelming pressure to support real-time operations and those that require flexible latencies. Data integration tool deployments are emphasizing the need to support multiple modes of data delivery, including traditional batch/bulk-oriented data movement, creation of in-memory federated views of data, and low-latency capture and propagation of events and changed data. Organizations are increasingly driven to position data integration as a strategic discipline at the heart of their information infrastructure — to ensure it is equipped for comprehensive data capture and delivery, linked to metadata management and data governance support, and applicable to diverse use cases. In addition, implementations need to support multiple types of user experience via tool interfaces that appeal not only to technical practitioners but also to people in business-facing roles, such as business analysts and end users. Offerings that promote collaboration between business and IT participants are becoming important as organizations seek adaptive approaches to achieving data integration capabilities.
  • Requirements to balance cost-effectiveness, incremental functionality, time-to-value and growing interest in self-service. Organizations' continued scrutiny of their investments and optimization of costs is resulting in aggressive behavior on the part of buyers when negotiating prices with vendors. More organizations are looking for solid basic capabilities that are "good enough," that can be deployed rapidly and that are offered at attractive prices. Some buyers, furthermore, are taking a targeted approach by acquiring only what they need now, while planning for future data integration needs — they don't procure all the capabilities they want at once. Nor are buyers necessarily looking for a single product or a single vendor-specific platform to accomplish everything. Vendors have responded to this development in various ways, such as by varying their pricing structures and deployment options (open-source, cloud and hybrid models), and extending support for end-user functions so that they work with targeted data of interest, especially when requirements aren't well-defined. Demand from business-facing buyers is also prompting vendors to add functionality that supports self-service data integration.
  • Expectations for high-quality customer support and services. Faced with a need to optimize staffing and budgets, as well as mounting pressure for faster and higher-quality delivery of solutions, buyers are demanding superior customer service and support from technology providers. In addition to highly responsive and high-quality technical support for products, they want direct and frequent interactions with sales teams and executives. Buyers also want wide availability of relevant skills — both within a provider's installed base and among system integrator partners — and forums where they can share experiences, lessons and solutions with their peers.
  • Increasing traction of a wider range of use cases. Usage of data integration tools continues to expand beyond analytics-related scenarios, as many other needs now also drive demand. The need to support operational data consistency, data migration and cloud-related integration is prompting more data integration initiatives than before. The logical data warehouse (LDW) architectural approach combines repository styles with data federation/virtualization capabilities to enable data services and assimilate a variety of integrated datasets. Big-data-related initiatives call for opportunistic analytics and the exploration of answers to less-well-formed or unexpected business questions. The distribution of computing workloads to parallelized processes in Hadoop and alternative NoSQL repositories will continue to advance data integration tools' ability to interact with big data sources, and to deliver data to, and execute integration tasks in, platforms associated with big data environments.
  • Extension of integration architectures through a combination of cloud and on-premises deployments. A hybrid approach to data integration is growing in popularity because it provides the ability to execute data integration in both cloud and on-premises environments, as appropriate. It enables enterprises to interchange, reuse and deploy artifacts as needed across both environments. Adoption of this approach is increasing in the wake of the "cloud first" focus of some digital business strategies, which emphasizes the use of "lightweight" technologies that are user-oriented and adaptable to change. Customers are looking to gain business agility by, for example, using integration platform as a service (iPaaS) as an extension of their data integration infrastructure to manage cloud-related data delivery and address the growing need for data sharing in cloud scenarios.
  • Need for alignment with application and information infrastructure. More organizations are selecting data integration tools that provide tight links to data quality tools, to support critical information management and governance initiatives. As MDM programs increase in number and scope, organizations seek to apply their investments in data integration technology to those initiatives, to enable the movement, transformation and federation of master data. In addition, many organizations are beginning to pursue data integration and application integration in a synergistic way, to exploit the intersection of the two disciplines. Aligned application integration and data integration infrastructure, deployed for the full spectrum of customer-facing interactions and a broad range of operational flows, gradually optimizes costs and shared competencies, as compared with the pursuit of disparate approaches to similar or common use cases (see "Five Reasons to Begin Converging Application and Data Integration"). The expansion of vendors' capabilities into application integration provides opportunities to use tools that exploit common areas of both technologies to deliver shared benefits, such as the use of change data capture (CDC) tooling that publishes captured changes into message queues.
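The contrast between bulk/batch movement and low-latency changed-data delivery, noted in the trends above, can be illustrated with a minimal sketch. The table, column names and watermark values here are invented for the example; real tools implement incremental capture against database logs rather than timestamp queries.

```python
import sqlite3

# Hypothetical source table; names and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10.0, 100), (2, 20.0, 200), (3, 30.0, 300)])

def bulk_extract(conn):
    """Batch-style delivery: move the full dataset on every run."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def incremental_extract(conn, watermark):
    """Changed-data delivery: move only rows updated since the last run."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,)).fetchall()
    # Advance the watermark so the next run picks up only newer changes.
    new_watermark = max((r[2] for r in rows), default=watermark)
    return rows, new_watermark

full = bulk_extract(conn)                   # all 3 rows, every time
delta, wm = incremental_extract(conn, 150)  # only the rows changed after 150
```

The watermark-based approach trades completeness per run for lower latency and smaller data movement, which is why deployments increasingly mix both modes.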
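The CDC-to-message-queue pattern mentioned above, where captured data changes are published for application integration consumers, can be sketched as follows. This is a minimal stand-in: the event shape is an assumption, and `queue.Queue` substitutes for a real message broker topic.

```python
from queue import Queue

change_log = Queue()  # stand-in for a message broker topic

def capture_change(table, op, row):
    """Publish a captured change as an event message (invented shape)."""
    change_log.put({"table": table, "op": op, "row": row})

# A CDC agent would emit events like these as the source database changes.
capture_change("customers", "INSERT", {"id": 7, "name": "Acme"})
capture_change("customers", "UPDATE", {"id": 7, "name": "Acme Corp"})

# Downstream consumers drain the queue and react to each change in order.
events = []
while not change_log.empty():
    events.append(change_log.get())
```

Publishing changes as ordered events is what lets data integration and application integration infrastructure share one capture mechanism, as the bullet above describes.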