So, what about data quality?
Though data professionals are aware of the importance of data, its quality is consistently overlooked. A 2017 Harvard Business Review study asked managers to evaluate their last 100 data records and count the error-free entries, producing a Data Quality (DQ) score from 0 to 100, where higher is better. A score of 97 or above was deemed acceptable. The study found that "only 3% of the DQ [data quality] scores in the study can be rated 'acceptable' using the loosest-possible standard."
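The scoring method is simple enough to sketch in a few lines. This is a minimal illustration, not the study authors' code, and the `is_error_free` check is a hypothetical stand-in; in the actual study, managers inspected each record by hand.

```python
# Sketch of the HBR-style Data Quality (DQ) score: examine the last
# 100 records and count how many are error-free. That count is the
# score, so 100 records with 97 error-free entries scores 97.

def dq_score(records, is_error_free):
    """Return the DQ score for up to 100 records (one point per clean record)."""
    sample = records[:100]
    return sum(1 for record in sample if is_error_free(record))

def is_error_free(record):
    # Hypothetical check: every required field is present and non-empty.
    required = ("id", "name", "email")
    return all(record.get(field) for field in required)

records = [
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
    {"id": 2, "name": "", "email": "bob@example.com"},  # missing name
    {"id": 3, "name": "Cy", "email": "cy@example.com"},
]
score = dq_score(records, is_error_free)
print(score)  # 2 of the 3 sampled records are error-free
```

With a full sample of 100 records, the acceptability bar from the study translates directly to `dq_score(...) >= 97`.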
Poor data quality is not just a technical shortcoming; it also costs organizations profits and valuable insights. In 2021, Gartner studied how much money organizations lose to bad data quality, accounting for data volume, the error rate in an organization's data, the direct costs of fixing bad data, and the missed opportunities and inefficiencies that result. They found that organizations lose an average of $12.9 million every year because of poor data quality.
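To see how those components could combine into an annual figure, here is a hypothetical back-of-the-envelope cost model. This is not Gartner's actual methodology, and every input value below is purely illustrative (chosen only so the output lands near the cited average).

```python
# Hypothetical annual cost of poor data quality, combining the four
# components mentioned above: data volume, error rate, direct
# remediation cost per bad record, and missed-opportunity cost.

def annual_dq_cost(num_records, error_rate, fix_cost_per_record,
                   missed_opportunity_cost):
    # Direct cost: number of bad records times the cost to fix each one.
    remediation = num_records * error_rate * fix_cost_per_record
    # Indirect cost: lost opportunities and inefficiencies, as a lump sum.
    return remediation + missed_opportunity_cost

# Illustrative inputs only -- not measured values:
cost = annual_dq_cost(
    num_records=50_000_000,
    error_rate=0.03,           # 3% of records contain errors
    fix_cost_per_record=5.0,   # direct cost to fix one bad record
    missed_opportunity_cost=5_400_000.0,
)
print(f"${cost:,.0f}")  # $12,900,000
```

Even with modest per-record costs, volume and error rate multiply quickly, which is why the average loss runs into the millions.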
As data becomes more vital in the workforce, it’s important to maintain and transform it to meet our needs.
Insights derived from data provide professionals with a deeper understanding of how to improve business processes. To gain a closer look at how various organizations manage data quality and its impact on business operations, we asked data professionals two key questions: How does your organization prioritize data quality? And how do you validate it?
When asking about the prioritization of data quality, we used a scale from 1 to 10, with 1 representing 'not a priority' and 10 representing 'a top priority.' For avenues of data validation, we provided four answer choices: 'In-house,' 'Externally,' 'I don't know,' and 'We don’t validate our data.'
While most respondents validated their data either in-house or externally, it was surprising to find that those who didn’t validate their data quality still rated it as a high priority for their organization, with an average score of 9.00 on the 10-point scale. Interestingly, those who did validate their data had a lower average prioritization score, with in-house validation scoring an average of 8.00.
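The comparison above boils down to grouping responses by validation method and averaging the 1-to-10 priority scores within each group. A minimal sketch, using fabricated example responses purely for illustration:

```python
# Group survey responses by validation method and compute the average
# data-quality prioritization score (1-10 scale) for each group.
from collections import defaultdict

# Illustrative responses only -- not the actual survey data.
responses = [
    {"validation": "In-house", "priority": 8},
    {"validation": "In-house", "priority": 8},
    {"validation": "We don't validate our data", "priority": 9},
    {"validation": "Externally", "priority": 7},
]

scores_by_method = defaultdict(list)
for response in responses:
    scores_by_method[response["validation"]].append(response["priority"])

averages = {method: sum(scores) / len(scores)
            for method, scores in scores_by_method.items()}
print(averages)
```

The same grouping applied to the real responses is what produced the 9.00 average for non-validators and the 8.00 average for in-house validation.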
What could be the reason for this?
Why would an organization rate data quality as a top priority yet not validate it? And what tools can make validating data quality easier for organizations that struggle with it?
For these answers and more insights, tune into our Data Quality report, releasing at the end of September, which will include actionable solutions to combat these exact issues. In the meantime, here is a link to the summer recap report to get up to speed!
If you would like to participate in our survey about data quality and get our in-depth Data Quality report, please visit this link. Thank you!