Publication

Cleaning up dirty data: mundane chore or potent accelerator?

Ian Quest · Nov 20, 2021 · 6 min read

Want to know the key to unlocking your engineers' potential? Look no further than the quality of the data they work with. Ian Quest, QR_'s director of consulting, on why the apparently mundane work of cleaning up dirty data is one of the most potent accelerators of complex engineering programmes.


Where's the brass?

I was brought up in Yorkshire, where, among many other brilliant colloquialisms, I learned the phrase "where there's muck, there's brass." In other words: if you're willing to do the unpleasant things others aren't, you'll do well. That philosophy applies to many of the activities around engineering programmes. Apparently unglamorous work, done rigorously, holds the key to significantly accelerating and de-risking a programme — and unlocking the real potential of your engineers and creative people.

How do you measure your data quality?

Without hard, detailed facts about data quality, it's easy to be optimistic about programme status. Add the natural tendency to avoid being the one who flags a delay, and you have a perfect recipe for a data quality disaster: perceptions muddied further by optimism bias and status green-washing.

It shouldn't be surprising, then, when a thorough audit of product and programme data reveals the truth. What is surprising is the scale of the issues typically found, and the apparently mundane nature of many of them. Across a sample of both large and small automotive OEMs, we consistently find that 20-30% of parts have data errors or omissions at the point of order. Most of those errors will jeopardise on-time delivery. Taken at face value, the numbers suggest engineering work hasn't been completed or that technical issues remain unresolved. In reality, this is generative product and programme data that's either absent or doesn't reflect engineering and business intent. In a BOM of 2,000 parts, 20-30% means 400-600 parts with errors, many of which will result in late or unreliable parts to build.

The secret reason

Although the statistics vary by OEM and by programme — and there are, of course, good examples — the opportunity here is large. It's possible to reduce programme duration, or cost, by 10-20% by removing data errors and omissions. This isn't a one-off exercise and it isn't a mundane admin task. It requires commitment, with focused and determined leadership, professional execution in both the technical and interpersonal sense, and a structured, comprehensive methodology.

How do you know how thorough your data validation is?

One of the big challenges is that the real value in validation exercises is in the rigour applied — not just the fact that a check happened. Spotting missing data is relatively easy; finding incorrect data is not. It's also not only about data quality; it's about data quality and data provenance. Material examples, engineering intent, configuration: these require understanding, genuine conversations, and face-to-face verification. Beware the tick-box exercise. It's easy to tick a box and find few errors — but two validations that look similar on paper can have very different outcomes.
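The gap between a tick-box check and a rigorous one can be sketched in code. The snippet below is a minimal illustration, not a real OEM schema: the field names, thresholds, and rules are all hypothetical assumptions. The point is that completeness checks (is a field populated?) are trivially automatable, while correctness checks require rules that encode actual engineering intent.

```python
# Illustrative sketch: completeness vs. correctness checks on BOM records.
# All field names and rules below are assumptions for the example only.

REQUIRED_FIELDS = ("part_number", "supplier", "material", "weight_kg")

def missing_fields(part: dict) -> list:
    """Completeness: easy to automate -- flag absent or empty fields."""
    return [f for f in REQUIRED_FIELDS if not part.get(f)]

def consistency_issues(part: dict) -> list:
    """Correctness: harder -- needs rules that capture engineering intent."""
    issues = []
    weight = part.get("weight_kg")
    if isinstance(weight, (int, float)) and weight <= 0:
        issues.append("weight must be positive")
    # A cross-field rule like this can't come from a tick-box exercise;
    # someone with domain knowledge has to decide what "implausible" means.
    if part.get("material") == "aluminium" and isinstance(weight, (int, float)) and weight > 50:
        issues.append("implausible weight for an aluminium part")
    return issues

parts = [
    {"part_number": "A100", "supplier": "", "material": "steel", "weight_kg": 12.5},
    {"part_number": "A101", "supplier": "Acme", "material": "aluminium", "weight_kg": 80},
]

for p in parts:
    print(p["part_number"], missing_fields(p), consistency_issues(p))
```

Both parts above would pass a naive "record exists" audit; only the second style of check surfaces the kind of error that derails a build.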

The wholesale clean-up

It would suit everyone if, like Hercules in the Augean stables, we could simply divert the river and cleanse the data wholesale. There's often a sense that the data quality issue is so specific that small actions won't really help — that a big system or process change is needed to fix it. It's collectively convenient to think this way, absolving ourselves of the problem and passing the buck.

Millions have been spent trying to fix this, with the majority of that spend going into two areas: system upgrades and administrative support. Both are "big" solutions that, from a bird's-eye view, look appropriate for a big, widespread issue. In reality, this problem lives in the detail, and it has to be tackled in the detail. We see many types of errors, each with its own root cause, and the majority of them aren't meaningfully affected by a system upgrade or an extra pair of administrative hands.

Building solid data foundations

Tackling the issue requires some shifts in mindset and some redirection of effort.

  1. Understand and believe in the power of good data. Unless you genuinely believe it's important, you won't invest the energy to support it properly, fund it properly, or hold people to account when it slips.


Ian Quest

Former Director of Consulting, Quick Release_

Ian is Director of Consulting at Quick Release. Based in London, he operates internationally as he leads QR_'s growing consultancy arm — with a focus on unlocking competitive advantage by bringing products to market faster and more efficiently. An early career in aerospace engineering with Rolls-Royce led to senior leadership roles across several prominent manufacturing consultancies, culminating in the directorship of Newton Europe's Air, Land & Sea business. Ian joined QR_ full-time in 2017, having previously provided non-executive advisory services to its founders.
