Enter “digital transformation” into your favorite search engine and you will uncover thousands of contemporary articles on the topic. Important trends for business are detailed, for example, in eWeek’s January 28, 2022 article, 7 Digital Transformation Trends Shaping 2022, which cites hyperautomation, more effective cybersecurity, further applications of artificial intelligence, speed and agility in decision-making, low-code/no-code tools, and the democratization of data and tools. Organizations are investing heavily in machine learning (ML), applying data and algorithms that mimic how humans learn to continuously improve the focus and accuracy of business (organizational) decisions.
Critical to the success of digital transformation are data quality and data quality management (DQM). A cogent example of the old adage “garbage in, garbage out” is how ineffective ML (and data analysis in general) can be when algorithms act on low-quality data. Simple examples include inconsistencies in how names and addresses are expressed. Case studies abound of sales and marketing campaigns that failed because of poor name and address hygiene. These barely scratch the surface of contemporary data quality issues.
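To make the name-and-address example concrete, the sketch below (in Python, using invented sample records and a deliberately tiny abbreviation map) shows the kind of normalization a hygiene step performs; production pipelines rely on postal reference data and far richer rule sets.

```python
import re

# Invented sample records: the same customer expressed three different ways.
records = [
    {"name": "Smith, John", "street": "123 N. Main St."},
    {"name": "john  smith", "street": "123 North Main Street"},
    {"name": "JOHN SMITH",  "street": "123 n main st"},
]

# Deliberately tiny abbreviation map; real address hygiene uses postal reference data.
STREET_ABBREV = {"north": "n", "street": "st", "avenue": "ave"}

def normalize_name(name: str) -> str:
    # Reorder "Last, First" to "First Last", collapse whitespace, apply title case.
    if "," in name:
        last, first = (part.strip() for part in name.split(",", 1))
        name = f"{first} {last}"
    return re.sub(r"\s+", " ", name).strip().title()

def normalize_street(street: str) -> str:
    # Lowercase, strip punctuation, and map common words to canonical abbreviations.
    words = re.sub(r"[.,]", "", street.lower()).split()
    return " ".join(STREET_ABBREV.get(w, w) for w in words)

for r in records:
    print(normalize_name(r["name"]), "|", normalize_street(r["street"]))
# All three rows collapse to "John Smith | 123 n main st".
```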
Data quality metrics and tools have evolved to address data cleansing and profiling around the attributes of accuracy, completeness, consistency, integrity, currency (is the data up to date?), and relevance to the business problems at hand. In many respects, data cleansing and profiling are analogous to environmental ecology: if you do not clean out the old growth and attend to the new, hundreds of thousands of acres might burn uncontrollably. The “wildfires” downstream of poor data ecology manifest as loss of competitive advantage, decreased market share, misdirected business leads, loss of venture funding, and business opportunities left on the table, to name just a few.
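To make those attributes measurable, the sketch below (Python, with hypothetical columns and rules) profiles a small record set for completeness, consistency, and currency; real profiling tools apply much larger rule catalogs against the same dimensions.

```python
from datetime import date, timedelta

# Hypothetical customer records; None marks a missing value.
rows = [
    {"email": "a@example.com", "state": "CA",     "updated": date(2022, 1, 10)},
    {"email": None,            "state": "Calif.", "updated": date(2019, 6, 1)},
    {"email": "c@example.com", "state": "CA",     "updated": date(2021, 12, 5)},
]

VALID_STATES = {"CA", "NY", "TX"}      # consistency rule: canonical state codes only
STALE_AFTER = timedelta(days=365)      # currency rule: updated within the past year

def profile(rows, today=date(2022, 1, 28)):
    # Score each dimension as the fraction of rows that pass its rule.
    n = len(rows)
    complete = sum(all(v is not None for v in r.values()) for r in rows)
    consistent = sum(r["state"] in VALID_STATES for r in rows)
    current = sum(today - r["updated"] <= STALE_AFTER for r in rows)
    return {
        "completeness": complete / n,
        "consistency": consistent / n,
        "currency": current / n,
    }

print(profile(rows))   # each dimension scores 2/3 for this toy record set
```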
Once an optimal set of data attributes is delineated, measures of data quality can be derived and applied to continuous improvement. Consider, for example, the number of errors per unit of data, or the rate at which parsing algorithms fail per unit of data. Further, such measures can be compared with measures of business outcome quality, making the end-to-end cycle from data acquisition to business decision adaptive, which is, after all, the goal of ML.
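One way such measures might be operationalized is sketched below in Python: an error count per thousand records and a parse-failure rate computed per batch, so they can later be set alongside business outcome metrics (the field names, parser, and rules here are all illustrative assumptions).

```python
def quality_metrics(records, parser):
    """Per-batch data quality measures (illustrative field names and rules)."""
    n = len(records)
    field_errors = sum(1 for r in records if r.get("email") is None)
    parse_failures = 0
    for r in records:
        try:
            parser(r)
        except ValueError:
            parse_failures += 1
    return {
        "errors_per_1k": 1000 * field_errors / n,
        "parse_failure_rate": parse_failures / n,
    }

def parse_address(record):
    # Hypothetical parser that rejects records without a postal code.
    if "zip" not in record:
        raise ValueError("missing postal code")
    return record["zip"]

batch = [
    {"email": "a@example.com", "zip": "94105"},
    {"email": None,            "zip": "10001"},   # missing email: a field error
    {"email": "c@example.com"},                   # missing zip: a parse failure
]
print(quality_metrics(batch, parse_address))
# Stored per batch, these figures can be plotted against downstream outcome
# metrics (e.g., campaign conversion rate) to close the adaptive loop above.
```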
Low-code and no-code tools for ensuring data quality have emerged, especially in the context of digital transformation, and they work hand in hand with the tools of data exploration and analysis. While seasoned, highly technical data scientists may be required to address the most complex data quality issues, those are mostly edge cases; the majority of data quality issues that organizations face can be addressed with low-code/no-code tools. The best of these tools complement data visualization and analysis tools, which are also low-code/no-code.
The benefits of data quality to digital transformation are many, including competitive advantage, razor-sharp sales and marketing focus, rapid system rollouts, and efficient integration of disparate and emerging technologies. Most importantly, data quality enables organizations to draw a straight line from the investment in data acquisition to its impact on the bottom line.
If your business aims to improve the data that informs business success, contact Gemini. By connecting the dots between data from disparate sources, Gemini helps organizations effectively transform data into stories that provide fast and effective business insights and outcomes.