Why is efficient data management so important?
The ability to use data is vital. It is a critical part of any transformation process that enables a company to position and assert itself in a highly dynamic, data-driven world. A basic prerequisite for this ability is properly functioning data management.
By data management we mean all conceptual, methodical, technical and organizational measures that serve to efficiently manage, protect and profitably use data. These include disciplines such as data governance, data architecture, data quality, master data management, data warehousing and data security.
The image of data management – formerly a hidden process in the dark basements of the IT department – has changed dramatically in recent years. The professional and technical relevance of data management is constantly increasing, especially when it comes to making trustworthy and correct data usable for BI and analytics. As a result, the relevance of data management in corporate strategy is increasing. Data has become a strategic asset. And a strategic asset must be efficiently maintained.
In the third edition of The Data Management Survey, we have extended our focus beyond BI to cover data & analytics and all tools that help to monetize data. This embraces BI, data management and analytics capabilities in operational systems, exploratory environments (e.g., data lakes) for advanced analytics as well as a holistic view of company data with regard to strategic tasks (e.g., data strategy). Each of these areas has its own specifics in terms of data types, technologies, methods and architectures as well as required skills and personnel resources.
Data management tasks for BI and data warehousing focus on data integration, data preparation and data modeling as well as data storage and provision for reporting, analysis and planning. Today’s BI architectures and technologies have to be able to keep pace with ever more dynamic requirements and growing complexity in the area of data processing and systems. Often there is no way around fundamental modernization.
In this article you will discover:
- a head-to-head comparison showing how the featured products stack up against each other
- the most important lessons learned from surveying over 1,100 respondents about data management software usage and selection.
The Data Management Survey: Head-to-Head Tool Comparison

Who should read the study?
In this highly dynamic world, internal and external influences require increasingly rapid and well-judged reactions. This can only be mastered if we have our systems under control. This study offers interesting insights, especially for members of data, BI or analytics competency centers, data experts, architects and people who prepare data for corporate management. It is designed to give you some orientation in this competitive software market and provide you with valuable guidance for upcoming tool selection projects or stress tests.
Not all data management tools for data & analytics are alike. Some focus on specific topics such as analytical databases, ETL or data warehouse automation, while others cover a variety of functionality in a broad portfolio. This is why we use peer groups to compare products.
The peer groups are primarily based on the various activities involved in analytical projects and the different user groups. They take into account how customers say they use the software, which can vary substantially from product to product. But we also include the experience and judgment of BARC analysts in deciding on the groupings.
Peer groups are simply a guide to the reader to help make data management tools easier to compare and to show why individual products return such disparate results. They are not intended to be a judgment of the quality of the tools.
The point of the peer groups is to ensure that comparisons between data management tools make sense. Products are grouped together as we would expect them to appear on a software selection shortlist.
To make a proper choice, buyers should first segment the market into the tool types that fit their requirements. The peer groups are intended to help with this task.
Peer group segmentation is based on two key factors:
- Usage scenario – these peer groups are based on how customers say they use the product.
- Functional capabilities – aside from the most common usage scenarios, we also examine the full set of functions that a product is able to perform/provide.
The KPIs
We combine the Price to Value, Time to Market, Recommendation, Product Satisfaction and Support Quality root KPIs to calculate this aggregated KPI. These five factors are closely related: if one is lacking, the importance of the others is accentuated. A customer is basically satisfied if the expected benefits are derived from the money paid. If the tool exceeds expectations and the user receives good, personal support, satisfaction naturally follows. This KPI indicates how well the tool and vendor meet the expectations generated by the vendor's marketing as well as other information sources.
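To illustrate how root KPIs might roll up into an aggregated KPI, here is a minimal sketch in Python. The survey's actual scoring methodology and weights are not given in the text, so the equal-weighted mean below (and the example scores) are assumptions for illustration only.

```python
def aggregate_kpi(root_kpis, weights=None):
    """Combine root KPI scores (e.g., on a 1-10 scale) into one aggregated score.

    Assumes a weighted arithmetic mean; equal weights by default.
    This is a hypothetical model, not the survey's published formula.
    """
    if weights is None:
        weights = {name: 1.0 for name in root_kpis}  # equal weighting assumed
    total_weight = sum(weights[name] for name in root_kpis)
    return sum(score * weights[name] for name, score in root_kpis.items()) / total_weight

# Made-up example scores for the five root KPIs named in the text:
scores = {
    "Price to Value": 7.5,
    "Time to Market": 8.0,
    "Recommendation": 9.0,
    "Product Satisfaction": 8.5,
    "Support Quality": 7.0,
}
print(aggregate_kpi(scores))  # equal-weighted mean of the five scores
```

A real methodology might weight Recommendation more heavily than the other factors, which the optional `weights` argument would accommodate.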
Delivering a superior customer and user experience is more important than ever. Data management professionals do not want to spend excessive time developing, implementing and monitoring data processes. As a result, they are looking for easy-to-use interfaces and good workflow support, based on a performant and reliable software platform.
With the current vogue for agility and self-service capabilities and the increasing need for business users to be able to access a variety of data sources, the user experience of a data management product is an important consideration for many organizations.
New ideas and technologies are the lifeblood of the software industry. However, some vendors prefer to rest on their laurels, relying on existing technologies and lucrative maintenance contracts with loyal customers. If a data management tool cannot keep up with recent developments, it becomes outdated very quickly and cannot deliver the same level of benefits as rival tools.
The aggregated Technical Capability KPI combines the Connectivity and Functionality KPIs. While Customer Experience highlights the look and feel of the product, Technical Capability gives an idea of the technical features and functions available to users. It is an important indicator for assessing the technical performance of the tool for current as well as future use cases. It also indicates the degree to which a tool supports developers in design and implementation, and thus increases the speed of implementation.
The KPI rules
The peer groups
Analytical database products prepare, store and provide data for analytical purposes.
Business software generalists have a broad product portfolio including most (or all) types of enterprise software for a variety of business requirements (e.g., ERP, BI, DM).
Data governance products help to control, develop, monitor and secure data to make it usable for business needs. They do not manipulate data. Instead, they focus on managing and leveraging metadata such as data catalogs.
Data pipelining products take a modern approach to data integration and support more than one data integration pattern. A pattern can be data interaction, data integration, data preparation or even data orchestration in order to get data connected and to make it usable for any kind of business purpose.
Data warehouse technologies prepare, store and provide data for data warehousing purposes.
Data warehouse automation products support data-driven or requirements-driven data warehouse design and implementation. They mainly focus on simplifying and automating data integration and data modeling tasks.