Why is efficient data management so important?

The ability to use data is vital. It is critical to a transformation process that enables a company to position and assert itself successfully in a highly dynamic, data-driven world. A basic prerequisite for exercising this ability is properly functioning data management.

By data management we mean all conceptual, methodical, technical and organizational measures that serve to efficiently manage, protect and profitably use data. These include disciplines such as data governance, data architecture, data quality, master data management, data warehousing and data security.

The image of data management – formerly a hidden process in the dark basements of the IT department – has changed dramatically in recent years. The professional and technical relevance of data management is constantly increasing, especially when it comes to making trustworthy and correct data usable for BI and analytics. As a result, the relevance of data management in corporate strategy is increasing. Data has become a strategic asset. And a strategic asset must be efficiently maintained.

In the fifth edition of The Data Management Survey, we have extended our focus beyond BI to cover data & analytics and all tools that help to monetize data. This embraces BI, data management and analytics capabilities in operational systems, exploratory environments (e.g., data lakes) for advanced analytics as well as a holistic view of company data with regard to strategic tasks (e.g., data strategy). Each of these areas has its own specifics in terms of data types, technologies, methods and architectures as well as required skills and personnel resources.

Data management tasks for BI and data warehousing focus on data integration, data preparation and data modeling as well as data storage and provision for reporting, analysis and planning. Today’s BI architectures and technologies have to be able to keep pace with ever more dynamic requirements and growing complexity in the area of data processing and systems. Often there is no way around fundamental modernization.

In this article you will discover:

  • a head-to-head comparison where you can see how the featured products stack up against each other
  • the most important lessons learned from surveying hundreds of respondents about data management software usage and selection

The Data Management Survey: Head-to-Head Tool Comparison

This interactive dashboard lets you compare two data management tools. The comparison is based on the six aggregated KPI results from The Data Management Survey 24. See how they stack up against each other by selecting a peer group and then two data management software products of your choice.
*Charts generated using this tool summarize the collective opinion of a group of end-users and do not necessarily reflect the views of BARC. Judgments as to the superiority or otherwise of individual products should not be solely based on these charts. Many of the products featured are not directly comparable with each other so care should be taken to ensure meaningful comparisons are made.

Who should read the study?

In this highly dynamic world, internal and external influences require increasingly rapid and well-judged reactions. This can only be achieved if we have our systems under control. This study offers interesting insights, especially for members of data, BI or analytics competency centers, data experts, architects and people who prepare data for corporate management. It is designed to give you some orientation in this competitive software market and provide you with valuable guidance for upcoming tool selection projects or stress tests.

Not all data management tools for data & analytics are alike. Some focus on specific areas such as analytical databases, ETL or data warehouse automation, while others cover a wide variety of functionality in a broad portfolio. This is why we use peer groups to compare products.

The peer groups are primarily based on the various activities involved in analytical projects and the different user groups. They take into account how customers say they use the software, which can vary substantially from product to product. But we also include the experience and judgment of BARC analysts in deciding on the groupings.

Peer groups are simply a guide to the reader to help make data management tools easier to compare and to show why individual products return such disparate results. They are not intended to be a judgment of the quality of the tools.

The point of the peer groups is to make sure that comparisons between the data management tools make sense. The products are grouped together as we would expect them to appear on a software selection shortlist.

To make a proper choice, buyers should first segment the market into the tool types that fit their requirements. The peer groups are intended to help with this task.

Peer group segmentation is based on two key factors:

  • Usage scenario – these peer groups are based on how customers say they use the product.
  • Functional capabilities – aside from the most common usage scenarios, we also examine the full set of functions that a product is able to perform/provide.

The KPIs

The Data Management Survey 24 examines data management software product selection and usage among users in categories (KPIs) including Product Satisfaction, Recommendation, Functionality and Performance. There are 32 KPIs in total.

Different readers will have their own views on which of these KPIs are important to them. For example, some people will regard visual interfaces as critical, whereas others may consider deployment and model management capabilities to be more important.

Consequently, we think reducing the KPIs to only one aggregated score is too simplistic to be helpful when seeking out the best data management software to match your needs. In our view, there are at least six crucial KPIs when it comes to comparing data management tools from a user perspective. They are made up of 26 individual root KPIs:

Business Value is possibly the most important KPI, focusing on the bottom-line benefits of data & analytics projects. Business benefits are the real reason for carrying out any data & analytics project. Analytics that does not deliver broad business value is superfluous.

The Business Value KPI shows how a successful data & analytics software product can provide these benefits in the real world. The KPI combines the Business Benefits, Project Success and Project Length KPIs.

We combine the Price to Value, Recommendation, Vendor Support, Implementer Support, Product Satisfaction and Sales Experience root KPIs to calculate the aggregated Customer Satisfaction KPI. These factors are clearly related: if one is lacking, the importance of the others is accentuated. A customer is basically satisfied if the expected benefits are derived from the money that was paid. If the tool exceeds expectations and the user receives good, individual treatment, then satisfaction logically follows. This KPI indicates how well the tool and vendor meet the expectations generated by the vendor's marketing as well as other information sources.
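To make the aggregation idea concrete, here is a minimal sketch that assumes an unweighted mean of root KPI scores already on the 1 to 10 scale; the function name, the equal weighting and the sample values are illustrative assumptions, not the survey's actual method.

```python
def aggregate_kpi(root_scores):
    """Combine root KPI scores (each on the 1-10 scale) into one aggregated KPI.

    Assumption: a simple unweighted mean; the survey's actual weighting
    scheme is not published here and may differ.
    """
    return sum(root_scores) / len(root_scores)

# Hypothetical root KPI scores for one product:
# Price to Value, Recommendation, Vendor Support, Implementer Support,
# Product Satisfaction, Sales Experience
print(round(aggregate_kpi([7.5, 8.0, 6.5, 7.0, 8.5, 7.0]), 2))  # -> 7.42
```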

Year after year, feedback from BARC survey participants underlines just how critical it is to consider the functionality of a product when evaluating it. After all, what use would the software be if it were incapable of performing the tasks required of it? We combine the Functional Coverage, Self-Learning, Active Metadata and Security & Privacy root KPIs to calculate the aggregated Functionality KPI.

Delivering a superior customer and user experience is more important than ever. Data management professionals do not want to spend a lot of time developing, implementing and monitoring data processes. As a result, they look for easy-to-use interfaces and good workflow support, based on a performant and reliable software platform.

With the current vogue for agility and self-service capabilities and the increasing need for business users to be able to access a variety of data sources, the user experience of a data management product is an important consideration for many organizations.

The aggregated Technical Foundation KPI combines the Performance, Platform Reliability, Connectivity, Scalability and Extensibility KPIs. While User Experience highlights the look and feel of the product as well as efficiency in development and operations, Technical Foundation gives an idea of the technical platform features that help to operate a performant and reliable platform, tailored to individual needs and integrated into the data ecosystem. It is an important indicator for assessing the technical capabilities of the tool for current as well as future use cases.

The Competitiveness KPI gives insights into how data & analytics tools perform in a competitive selection process as well as the strength of a product’s market presence. It combines the Considered for Purchase and Competitive Win Rate KPIs.

Recognizing which data & analytics software to compare entails understanding which tools have fared well in other organizations’ product selections. This enables users to eliminate ‘losers’ at an early stage in the selection process.

The KPI rules

  • Only measures that have a clear good/bad trend are used as the basis for KPIs.
  • KPIs may be based on one or more measures from The Data Management Survey.
  • Only products with samples of at least 15 to 30 (depending on the KPI) for each of the questions that feed into the KPI are included.
  • For quantitative data, KPIs are converted to a scale of 1 to 10 (worst to best). A linear min-max transformation is applied, which preserves the order of, and the relative distance between, products' scores.
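To illustrate the last rule, the following is a minimal sketch of such a linear min-max rescaling to the 1 to 10 range; the function name and the sample scores are hypothetical and not taken from the survey.

```python
def minmax_to_1_10(raw_scores):
    """Linearly rescale raw scores to 1 (worst) ... 10 (best), preserving
    the order of, and the relative distances between, products' scores."""
    lo, hi = min(raw_scores), max(raw_scores)
    if hi == lo:  # all products scored identically
        return [10.0 for _ in raw_scores]
    return [1 + 9 * (s - lo) / (hi - lo) for s in raw_scores]

# Hypothetical raw scores for three products
print(minmax_to_1_10([3.2, 4.0, 4.8]))  # -> [1.0, 5.5, 10.0]
```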

The peer groups

  • Data warehouse as a service – technologies that provide data warehouse capabilities as a service in the cloud.
  • Data catalogs – 'yellow pages' to support the search for data and to support governance, leveraging metadata in a highly user-friendly environment.
  • Data knowledge platforms – platforms that help to build up and utilize data knowledge effectively and efficiently using automated processes (e.g., for linking and analyzing a wide variety of metadata from distributed metadata sources).
  • Data pipelining – these products take a modern approach to data integration and support more than one data integration pattern. A pattern can be data interaction, data integration, data preparation or even data orchestration, in order to get data connected and make it usable for any kind of business purpose.
  • Integrated data & analytics platforms – mainly SaaS platforms that provide integrated end-to-end functionality from data integration to analysis, with a special focus on business user support to cover self-service requirements.
  • Data warehouse automation – these products cover data-driven or requirements-driven data warehouse design and implementation. They mainly focus on the simplification and automation of data integration and data modeling tasks.