Data Integrity Trends for 2022
As we look ahead to 2022, it’s clearer than ever that to win in a globally competitive marketplace, business leaders must develop and nurture strong competencies in all aspects of data integrity. According to a recent Corinium report entitled Data Integrity Trends: Chief Data Officer Perspectives, most enterprises have established a basic foundation for data-driven decision-making and automation, but they are also reporting significant struggles in the quest to develop and maintain data integrity at scale. They have made progress, but there is still quite a way to go before business leaders can truly trust data-driven insights.
Corinium surveyed 304 global Chief Data Officers or equivalent roles from across a range of verticals including financial services, insurance, retail, telecom, healthcare and life sciences, transportation and logistics, government, education, software technology, and more. Respondents were asked about their organizations’ data integrity strategies, including their approaches to data quality, data governance, location intelligence, and enriching company data with data from third-party sources.
Here are some of the key trends identified in the report.
Most Enterprises Are Rushing to Embrace Analytics
It should not come as a surprise to anyone that most executives view data analytics as an important strategic enabler. Sixty-one percent of respondents in the Corinium survey reported that their companies had already established their core data management and governance frameworks at least “quite successfully”; nearly a third described their efforts as “very successful.”
Many are also prioritizing data democratization: empowering employees to engage in self-service analytics and produce their own data-driven insights on demand.
Nevertheless, executives also understand that diving headlong into simply "doing analytics" is inherently risky and stands to undermine trust among stakeholders. For this reason, the vast majority of respondents say their companies are prioritizing data integrity, shoring up the foundations of their data strategies to achieve lasting credibility.
Many companies are still struggling with data silos across their organizations, which make it difficult to understand important aspects of their business completely and holistically. Nearly three-quarters of respondents cited a lack of the necessary technology and services to facilitate data integration as a key limitation. Even more, just over 80%, indicated that concerns over data quality were a major barrier to their integration capabilities.
Beyond hindering integration, data quality issues are at the root of broader challenges at most organizations. Incomplete, inconsistent, or inaccurate data are recurring themes for most. These issues are exacerbated by the challenges of profiling and cataloging data, reconciling inconsistent formats, and connecting policies and rules to a broad collection of disparate data sources.
Executives also acknowledge the tremendous value of location intelligence and data enrichment, but find it difficult to enrich data consistently and at scale. Respondents reported that their enterprises integrate an average of 27 third-party data sources into their data architectures. Compliance with privacy standards, adherence to internal data quality standards, interoperability, format consistency, and data freshness were the challenges organizations wrestled with most when enriching data.
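The mechanics behind those enrichment challenges can be illustrated with a small sketch. The example below joins internal customer records with a hypothetical third-party demographics feed, enforcing a freshness rule before accepting each enriched value. All field names, data, and thresholds here are invented for illustration, not drawn from the report:

```python
from datetime import date, timedelta

# Internal customer records keyed by postal code (illustrative data)
customers = [
    {"id": 1, "postal_code": "10001"},
    {"id": 2, "postal_code": "94105"},
]

# A hypothetical third-party feed: postal code -> demographics,
# each entry stamped with the date it was last refreshed.
third_party = {
    "10001": {"median_income": 72000, "as_of": date(2021, 11, 1)},
    "94105": {"median_income": 104000, "as_of": date(2020, 1, 15)},
}

MAX_AGE = timedelta(days=365)  # freshness rule: reject stale entries

def enrich(record, feed, today):
    """Attach the third-party attribute only if the feed entry is fresh."""
    entry = feed.get(record["postal_code"])
    if entry is None or today - entry["as_of"] > MAX_AGE:
        return {**record, "median_income": None}  # keep the record, flag the gap
    return {**record, "median_income": entry["median_income"]}

today = date(2022, 1, 1)
enriched = [enrich(c, third_party, today) for c in customers]
# The 94105 feed entry is more than a year old, so it is rejected.
```

Even this toy version shows why enrichment is hard to do consistently: every third-party source brings its own keys, formats, and refresh cadence, and each needs its own validation rules.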
Data Preparation Is the Top Task for Data Teams
Unfortunately, these challenges often lead to company resources being unduly burdened with manual data cleansing and data preparation tasks. The use of process automation to improve data quality, for example, is still very limited; 51% of respondents indicated that their organizations make limited use of automation, and another 12% are not using automation tools for data quality at all.
Given the rapidly increasing volume and velocity of data available to these enterprises, automation is quickly becoming a business imperative. Those who do not proactively attend to data quality in a scalable way will inevitably experience a decay in the integrity of their data. As enterprises increasingly rely on advanced analytics (including AI/ML) to inform both strategic and tactical decisions, the challenge of achieving data quality at scale will take on even greater importance in the coming years.
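As a minimal sketch of what automating data quality can look like, the example below runs a set of declarative validation rules against every incoming record, so checks scale with data volume instead of depending on manual review. The fields and rules are purely illustrative assumptions, not taken from the report:

```python
import re

# Declarative quality rules: field name -> validity predicate (illustrative)
RULES = {
    "email": lambda v: v is not None and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v),
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country": lambda v: v in {"US", "CA", "GB", "DE"},
}

def validate(record):
    """Return the names of the rules this record violates."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

records = [
    {"email": "ana@example.com", "amount": 19.99, "country": "US"},
    {"email": "not-an-email", "amount": -5, "country": "FR"},
]

# Map each failing record's index to its list of violations.
failures = {i: validate(r) for i, r in enumerate(records) if validate(r)}
```

Because the rules are data rather than code, new checks can be added without touching the pipeline, which is the property that makes this approach workable at scale.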
Dan Power, Managing Director of Data Governance, Global Markets at State Street puts it this way: “The biggest killer of data governance programs is lack of automation. Data quality tool vendors, whether they’re integrated into a data management catalog or not, need to do better at incorporating AI and ML techniques.”
Scale Is Driving Integration Challenges as Well
The biggest challenge many enterprises face with respect to data integration is a shortage of employees with the right knowledge and expertise. But the increased size and complexity of corporate IT landscapes are also a factor. Seventy-seven percent of respondents to the Corinium survey said that processing high volumes of data is at least "quite challenging," and 73% indicated that their teams find it at least "quite challenging" to deal with multiple data sources and complex data formats.
Given the challenges of staffing, combined with the problem of complexity and change, it is no surprise that many corporate leaders are turning to low-code and no-code integration tools, which afford them greater flexibility and agility. With the right enterprise-grade integration platform, streaming data pipelines can be developed easily, deployed anywhere across the corporate IT landscape, and modified quickly without introducing undue risk. When data is critical to business operations, robust and resilient integration tools enable business continuity, even when a connection is disrupted.
Integration of complex and diverse data sources is also critically important. Mainframe data can be particularly challenging, given the intricacies of hierarchical databases, COBOL copybooks, and other complexities associated with mainframe computing systems. Unlocking the data and making it available to cloud analytics platforms, as well as to other applications, is critical if enterprises want a complete picture of what’s happening in their businesses.
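To make the copybook challenge concrete, here is a minimal sketch of decoding a fixed-width mainframe record in Python. The layout is invented for illustration, the sketch assumes the bytes have already been converted from EBCDIC (e.g., via Python's cp037 codec), and it handles only simple unsigned PIC 9 and PIC X fields; real copybooks add signed COMP-3 packed decimals, OCCURS clauses, and REDEFINES, which is where most of the complexity lives:

```python
# A tiny field layout, as a COBOL copybook might describe it (illustrative):
#   05 CUST-ID     PIC 9(6).
#   05 CUST-NAME   PIC X(10).
#   05 BALANCE     PIC 9(5)V99.   <- implied decimal point, no sign
LAYOUT = [
    ("cust_id", 6, "num"),
    ("cust_name", 10, "text"),
    ("balance", 7, "dec2"),  # 5 integer digits + 2 implied decimal digits
]

def parse_record(raw: str) -> dict:
    """Slice one fixed-width record into typed fields per the layout."""
    out, pos = {}, 0
    for name, width, kind in LAYOUT:
        chunk = raw[pos:pos + width]
        pos += width
        if kind == "num":
            out[name] = int(chunk)
        elif kind == "dec2":
            out[name] = int(chunk) / 100  # apply the implied V99 decimal point
        else:
            out[name] = chunk.rstrip()  # text fields are space-padded
    return out

rec = parse_record("000042SMITH     0012345")
```

The implied decimal point in BALANCE is a good example of why mainframe data cannot simply be loaded as text: without the copybook, "0012345" is indistinguishable from an integer.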
The vast majority of enterprises are busy laying the foundations for data success and are using data integrity trends to guide their 2022 planning. For most, however, this is still very much a work in progress. Achieving data integrity at scale is a challenge, and one that must be addressed to firmly establish trust and confidence in data-driven insights. The Corinium report covers a broad range of topics with respect to data integrity. To read the report, download your free copy today: Data Integrity Trends: Chief Data Officer Perspectives.