

What is Data Quality Management and How Do We Improve It?

 Data Quality Management (DQM) is the active management of the quality of data so that it is useful for its intended business purpose. Poor data quality can result in poor business decisions. 

Data quality can be measured in terms of accuracy, completeness, validity, uniqueness, timeliness and consistency. But what do these dimensions mean in practice?

Let’s understand them with an example. Suppose a business has phone and email data in a spreadsheet for a number of potential customers whom the company wants to target using an automated text and email marketing campaign.  

Accuracy, in this case, means that the phone numbers and email addresses are correct and were recorded without error.

Completeness means that every potential customer's record is complete: no record is missing a phone number or an email address.

Validity means that the email addresses and phone numbers conform to the expected format. For example, the stored email addresses contain an ‘@’ and the phone numbers start with a country code, so that both can be processed by an automated text and email system.

Uniqueness means that no email address or phone number is repeated, so no effort is wasted sending duplicate emails or text messages.

Timeliness means that the data on potential customers is available while those customers are still making their purchase decision. Year-old data would be of little use here because most of them would already have made their purchase.

Consistency means that data keeps the same form and meaning as it moves between systems. For example, the email automation software should not append an extra quotation mark to every email address, which would prevent the messages from reaching their intended recipients.
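To make these dimensions concrete, the sketch below scores a small contact list on completeness, validity and uniqueness. It is a minimal illustration in Python using pandas; the DataFrame, column names and sample values are hypothetical, and accuracy, timeliness and consistency usually require reference data or cross-system checks rather than a single in-memory test.

```python
import pandas as pd

# Hypothetical contact list; the column names and values are illustrative only.
customers = pd.DataFrame({
    "email": ["ann@example.com", "bob@example", None, "ann@example.com"],
    "phone": ["+1 555 0100", "555 0101", "+44 20 7946 0000", "+1 555 0100"],
})

total = len(customers)

# Completeness: share of records with both an email address and a phone number.
completeness = len(customers.dropna(subset=["email", "phone"])) / total

# Validity: the email must contain an '@' and the phone must start with a country code.
valid_email = customers["email"].str.contains("@", na=False)
valid_phone = customers["phone"].str.startswith("+", na=False)
validity = (valid_email & valid_phone).mean()

# Uniqueness: share of records whose contact details are not duplicated.
uniqueness = (~customers.duplicated(subset=["email", "phone"])).mean()

print(f"completeness={completeness:.0%}, validity={validity:.0%}, uniqueness={uniqueness:.0%}")
```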

The Cost of Poor-Quality Data 

By now, the value of high-quality data should be clear. Data quality matters even more when a business is making highly critical (and costly) decisions based on that data, which is precisely what business analytics is meant to support.

Imagine running a social media marketing campaign targeting millions of customers based on poor-quality customer data. It is estimated that poor data quality costs businesses between $9.7 million and $14.2 million per year on average.

Poor data quality hurts businesses that rely on data analytics in several ways, including:

  1. Wasted marketing spend and inability to target the right customers,

  2. Incorrect investment decisions,

  3. Inability to properly understand customer behaviour, and

  4. Inability to turn prospects into leads and leads into sales.

These costs come on top of the intangible costs of making business decisions from poor-quality data, such as reduced trust in the value of data, which in turn hampers the company's data-driven culture.

Given how important data is to competitive advantage, and how much complex data businesses now collect to support their decisions, data quality has become crucial. With stricter compliance requirements governing how and which data may be collected, it is paramount that the limited data available be of high quality and useful for critical business decisions such as marketing, investment, preventing employee turnover, and understanding customer behaviour, to name just a few.

Causes of Poor Data Quality in Organizations

What causes poor data quality, and how do we improve it? The four main causes of poor data quality in organizations are:

  1. Multiple data entry points: As data enters the organization from multiple entry points, the chance of it being duplicated increases substantially. Duplicate data is costly for the organization because it leads to wasted marketing resources and a distorted understanding of customer behaviour.

  2. Siloed information in the organization: When various departments of a company operate in silos, there is often a lack of communication about the data being collected and processed. These data silos may also result from mergers and acquisitions. Departmental silos often use different data definitions, making it difficult to manage and reconcile data across the organization.

  3. Lack of appropriate data governance: Human errors in data entry are inevitable. The only way to prevent and correct them is an effective data governance process with checks and safeguards, such as approval processes that correct data early in the data collection cycle.

  4. Misjudgment: Organizations often underestimate how much incorrect data they hold and collect every day, and the adverse effect it has on decision-making. Being unaware of poor data quality also means being unaware of how sound the decisions made from that data really are.

How to Improve Data Quality

Given the many ways businesses use data for critical decisions and the many ways poor-quality data can creep in, data quality management is no trivial task. A plethora of Data Quality Management tools is available in the market; the Gartner Magic Quadrant for Data Quality Management tools shows the leading ones.

Data Quality Management tools work by profiling data, automating checks, and routing poor-quality records into exception-handling workflows. They also provide methods for cleaning and harmonizing conflicting data integrated from multiple sources, applying data quality rules so that cleansed records end up in a master data list.
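The specifics differ from tool to tool, but the overall pattern of rule-based cleansing, exception handling and de-duplication into a master list can be sketched in a few lines of Python. The column names, rules and function below are illustrative assumptions, not any particular vendor's behaviour.

```python
import pandas as pd

def harmonize(raw: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Apply simple data quality rules, then split clean records from exceptions."""
    df = raw.copy()

    # Harmonize formats that differ between source systems.
    df["email"] = df["email"].str.strip().str.lower()
    df["phone"] = df["phone"].str.replace(r"[^\d+]", "", regex=True)

    # Rule-based checks; failing rows go to an exception-handling queue for manual review.
    ok = df["email"].str.contains("@", na=False) & df["phone"].str.startswith("+", na=False)
    exceptions = raw[~ok]

    # De-duplicate the passing records into the master data list.
    master = df[ok].drop_duplicates(subset=["email", "phone"])
    return master, exceptions
```

In this sketch, records with a malformed email address or a phone number lacking a country code land in the exceptions queue, while the remaining records are cleansed and de-duplicated into the master list.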

In addition to the tools available in the market, changes to organizational processes can help improve data quality. These processes can target prevention of new bad data, continuous improvement of data quality, and organizational alignment:

  1. Prevention of New Bad Data: Measures can be introduced to prevent bad quality data from entering the organization. These include a single source of data entry, effective data quality assurance, data governance and approval processes (a minimal entry-point check is sketched after this list).

  2. Continuous Improvement of Data Quality: Data governance policies and procedures must strive for continuous improvement of data quality in the organization. This may involve setting up processes and controls and defining data quality metrics.

  3. Organizational Alignment: Alignment of various processes and systems using data in the organization can improve data quality substantially. This will help in maintaining data consistency and ensure that data integrity is not lost when the data moves from one system to another.
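To illustrate the first point, here is a minimal sketch of a check at a single data entry point, written in Python. The function name, fields and rules are hypothetical; the idea is simply that a record is rejected, or sent back to the submitter, before bad data can enter the organization's systems.

```python
import re

COUNTRY_CODE = re.compile(r"^\+\d{1,3}")  # phone numbers must start with a country code

def validate_new_customer(email: str, phone: str) -> list[str]:
    """Return the reasons a new customer record should be rejected at data entry."""
    problems = []
    if not email or "@" not in email:
        problems.append("email is missing or has no '@'")
    if not phone or not COUNTRY_CODE.match(phone):
        problems.append("phone is missing or lacks a country code")
    return problems

# A record is written to the customer system only if the list of problems is empty;
# otherwise it is returned to the submitter instead of entering the database.
assert validate_new_customer("ann@example.com", "+1 555 0100") == []
assert validate_new_customer("bob@example", "5550101") != []
```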

In conclusion, data quality management is critical for organizations looking to use their data for crucial business decisions. These can include decisions such as directing marketing budgets, making business investments such as mergers and acquisitions, and reducing employee turnover. Data quality can be improved by preventing bad data from entering the organization, having effective data governance and aligning different systems that use and store data in the organization.