It couldn’t be simpler.
When there’s a problem with the car, a light on the dashboard illuminates. It could be the oil pressure. Or the tire pressure. Or the battery. Or the engine temperature. Or the engine. Collectively these are referred to as “idiot lights.”
Dashboard warning lights were introduced prior to World War II, but the first known use of the phrase “idiot light” did not come until the late 1960s. The term has a dual pejorative meaning: the lights make the interpretation of problematic automobile conditions embarrassingly simple, and you’d have to be embarrassingly negligent to ignore them.
Yet it happens. Frequently.
A 2024 study by CarMD found that 49% of respondents had an illuminated warning light on their dashboards. For 22% of respondents, it was the check engine light.
So, you’re driving along and the check engine light comes on, but you ignore it. You’ve got someplace you need to get to, or you don’t have money for an expensive repair so you hope that it goes away. It doesn’t, of course, and eventually you find yourself on the side of the road watching traffic speed by. And you’re stuck paying for a much more expensive repair (or a new car).
When the car is towed to the repair shop, the mechanic will probably ask whether, and for how long, the check engine light was on. Some embarrassment is appropriate. The repairperson is probably too businesslike to laugh at you, at least to your face, but your spouse or siblings probably aren’t. After all, you received a clear signal and you chose to ignore it. Peer pressure and shame aren’t always bad things. Shame is like pain: both imprint avoidance. It only takes once, and you probably won’t ignore a warning light again.
What if there was a check engine light for data? There is:
Data Quality is the check engine light for data.
And most companies, certainly more than the 22% driving around with their check engine light on, have been ignoring it.
An effective idiot light requires two things:
1. Identifying the critical condition.
If Data Quality is the idiot light, then a process is required to evaluate it: collect expected data content, measure the actual data content, and compare the two. Simple. All you have to do is do it. I’ve talked a lot about this here. Others have talked a lot about it elsewhere. And the negligence of not doing it is worse than ignoring the idiot light.
2. Understanding the connection between inaction and consequences.
All I want from my car is for it to get me from Point A to Point B, and I should know that I have to change the oil once in a while. But, my car died and left me stranded because I didn’t want to take the time and go to the effort of changing the oil.
All I want from the data is my report, and I should know that I have to ensure that the source data is well understood and correct. But, I made poor business decisions and took incorrect actions because I didn’t want to take the time and go to the effort of understanding the data and ensuring that its content was correct.
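The evaluation process in the first point above — collect the expected data content, measure the actual content, compare the two — can be sketched in a few lines of code. This is only an illustrative sketch, not a reference to any particular tool; every name, field, and rule below is a made-up example.

```python
# A minimal sketch of the Data Quality "idiot light":
# collect expected content, measure actual content, compare.
# All field names and rules here are illustrative assumptions.

def check_engine_light(expected: dict, actual: dict) -> list:
    """Compare expected vs. actual data content; return the fields that fail."""
    problems = []
    for field, rule in expected.items():
        if not rule(actual.get(field)):
            problems.append(field)
    return problems

# Expected content: each field maps to a validation rule.
expected = {
    "customer_id": lambda v: v is not None,
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,
}

# Actual (measured) content from one record.
record = {"customer_id": None, "order_total": -5}

failures = check_engine_light(expected, record)
if failures:
    print(f"Idiot light ON: {failures}")
```

The point isn’t the code; it’s that the comparison is mechanical. Once the expected content is written down, lighting the indicator is the easy part — the hard part is deciding to look at it.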
We learn in driver training class, or just from experience, that the light means that something requires our attention. The owner’s manual provides information about what to do, even if that’s taking the car to the mechanic.
Management doesn’t necessarily have an owner’s manual for data. Executives don’t necessarily know what the data idiot light means, or that it indicates that there’s a problem, or even that they should have one. Besides, their focus is elsewhere.
So, we think that we need to teach the executives what the idiot light means. We try. And more often than not, we fail. For decades, many of us have been going about it all wrong. We create Data Literacy curriculums and hold lunch & learns. Executives don’t have time to go to a class or seminar or workshop, so the education is delegated. Often through multiple levels.
Executives don’t have to know the details of Data Quality or Information Management or Data Governance. They shouldn’t have to. They just need to understand that an indicator exists, and that ignoring the indicator will lead to future problems.
Just like I don’t have to know the details of how the engine works or the fuel combustion process or pistons or timing belts. I just need to know that an indicator exists, and that ignoring the indicator will lead to future problems.
The challenges many companies are experiencing when implementing AI initiatives are currently the most visible consequences of inaction.
Companies are beginning to recognize that the challenges they’ve been having launching AI initiatives are largely caused by bad data. For years, for decades, that same recognition has been missing when it comes to poor quality reports or incorrect business actions.
Corporate executives should have the decency to be embarrassed by the poor quality of their companies’ data.
Do something about it. Create an idiot light. Establish a Data Quality infrastructure for your data, and articulate the consequences of not properly maintaining your data resources. And keep driving ahead.
Photo Credit: kkursor, Public Domain, Wikimedia Commons.