The Cognistx Blog

Why should you measure the cost of bad data?

July 30, 2020
Harini Panda

You have already taken the first steps to becoming an AI-driven organization: analytics and AI objectives are linked to business outcomes, metrics for measuring AI return on investment have been established, use cases for AI adoption have been prioritized, and the volume and variety of data required to implement these use cases have been collected. Now comes one of the most crucial execution challenges: taming the data quality beast.

While it can be tempting to rush ahead and drive insights with minimal data cleaning, organizations must proceed with caution. Poor data quality results in less accurate AI/ML models and, consequently, incorrect insights and increased costs. The costs of bad data are shockingly high: IBM estimates that poor quality data costs US companies alone $3.1 trillion, and Gartner's 2017 Data Quality Market Survey indicates an average annual financial cost of approximately $15 million per organization. Add to this the damage to customer loyalty and the organization's brand image, and the drain on valuable employee time. According to this HBR article, data scientists spend 60% of their time cleaning and organizing data.

Bad data can arise from several underlying factors, such as null or missing values, manual entry errors, multiple versions of the same data, inconsistent data, misclassified data, and missing data relationships. Furthermore, the data collected may not comply with industry-specific data quality standards or an organization's business-driven rules. Given the disruption caused by COVID-19, it is also likely that for certain use cases organizations need to work with smaller data sets representative of current conditions, accelerating the need for better quality data to derive insights. Hence, it is critical for organizations to understand the impact of bad data and take a proactive approach to managing data.
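To make the issue categories above concrete, here is a minimal sketch of how a few of them (missing values, duplicate records, misclassified categories) could be counted programmatically. The record fields and account types are hypothetical examples, not taken from any real data set.

```python
# Hypothetical customer-account records used only for illustration.
def audit_records(records, required_fields, valid_types):
    """Count common data quality issues in a list of record dicts."""
    issues = {"missing_values": 0, "duplicates": 0, "misclassified": 0}
    seen_ids = set()
    for rec in records:
        # Null or missing values: a required field is absent or empty.
        if any(rec.get(f) in (None, "") for f in required_fields):
            issues["missing_values"] += 1
        # Multiple versions of the same data: a repeated account ID.
        if rec.get("account_id") in seen_ids:
            issues["duplicates"] += 1
        seen_ids.add(rec.get("account_id"))
        # Misclassified data: account type outside the allowed set.
        if rec.get("account_type") not in valid_types:
            issues["misclassified"] += 1
    return issues

records = [
    {"account_id": 1, "account_type": "residential", "name": "A"},
    {"account_id": 1, "account_type": "residential", "name": "A"},  # duplicate
    {"account_id": 2, "account_type": "industral",   "name": "B"},  # typo in type
    {"account_id": 3, "account_type": "commercial",  "name": ""},   # missing name
]
report = audit_records(records, required_fields=["name"],
                       valid_types={"residential", "commercial"})
print(report)  # {'missing_values': 1, 'duplicates': 1, 'misclassified': 1}
```

Real pipelines would also validate against the industry standards and business rules mentioned above; this sketch only shows the mechanical pattern of scanning records and tallying rule violations.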

If you don’t know how much bad data is costing your organization, try our Data Quality Calculator or talk to one of our Data Scientists. If you would like to try our Data Quality Engine solution, reach out to our Sales Representatives today!

Fortunately, AI and ML can automate the herculean task of cleaning bad data and detect anomalies in real time. Choosing a comprehensive data quality tool that addresses the aforementioned issues can lower costs, drive better insights, and improve customer satisfaction. For instance, for an Oil & Gas client, manual entry resulted in misclassification of customer account types and, consequently, inaccurate demand forecasting. We analyzed over 165,000 customer accounts and corrected data errors, resulting in millions of dollars in cost savings. For an automotive client, we corrected product catalog and inventory data, improving customer satisfaction and loyalty. Cognistx can be your partner in defeating these data quality issues, allowing you to focus on other critical AI tasks.
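As a rough illustration of the automated anomaly detection mentioned above, a simple statistical check can flag values that deviate sharply from the rest of a field, such as a manual entry error in demand readings. This is a generic z-score sketch with made-up numbers, not the Data Quality Engine's actual method.

```python
import statistics

def flag_anomalies(values, threshold=3.0):
    """Return indices of values more than `threshold` standard
    deviations from the mean (a basic z-score outlier test)."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical daily demand readings; 500 looks like an entry error.
readings = [100, 102, 98, 101, 99, 100, 500]
print(flag_anomalies(readings, threshold=2.0))  # [6]
```

Production systems typically use more robust techniques (the large outlier inflates the mean and standard deviation here), but the principle is the same: learn what normal data looks like and surface records that do not fit.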
