· Bob Rougeaux · Finance · 5 min read
What I Learned Rebuilding Master Product Data from Scratch
At Lacerta, the bill of materials had grown organically for years with no governance. Before we could build useful analytics, we had to fix the foundation. Here's what that process actually looks like.
Every company I’ve worked with has a version of this problem. The product data started out clean — someone built the original item master with care, the BOM was set up correctly, the cost accounting made sense. Then the business grew. People made changes. Systems were updated. A few workarounds got baked in as permanent. Nobody tracked the cumulative drift.
At Lacerta Group, we hit this problem head-on when we tried to build a BI reporting layer on top of our ERP data. The dashboards we built looked authoritative. They were not. The underlying product data had inconsistencies that made margin analysis unreliable and the quoting workflow impossible to automate.
We had to go back to the beginning.
What “Bad” Product Data Actually Looks Like
It’s rarely dramatic. You don’t open the database and see obviously wrong records. What you see instead is:
Duplicate items. The same physical product entered twice under slightly different part numbers — usually because someone created a new record when they couldn’t find the existing one. Now you have split cost history, split inventory, and reports that double-count or miss depending on which number gets used.
Stale cost standards. Standard costs that were set at product launch and never updated, now bearing no resemblance to current material costs. The P&L shows variance as a line item, but nobody knows what it’s measuring anymore.
Missing BOM levels. Assemblies that were built and shipped but whose bill of materials was never fully entered. The system knows the finished good exists; it doesn’t know what went into it. Standard cost is missing or estimated.
Free-text description abuse. Fields that were supposed to hold structured data (product family, material type, unit of measure) instead holding whatever the person entering the record typed that day. “STL” and “Steel” and “steel alloy” and “S/S” all meaning the same thing — unfilterable by a machine.
None of these individually crashes the business. Together, they make the data untrustworthy enough that your analysts start cross-checking reports against their own spreadsheets. That’s the tell.
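The free-text problem in particular lends itself to a simple first pass: collapse known aliases into a controlled vocabulary and flag everything else for a human. A minimal sketch, where the alias table and the `UNMAPPED` convention are illustrative assumptions, not Lacerta's actual mapping:

```python
# Hypothetical alias table: every free-text spelling collapses to one
# controlled value. Real tables are built from the audit export, not guessed.
MATERIAL_ALIASES = {
    "stl": "STEEL",
    "steel": "STEEL",
    "steel alloy": "STEEL",
    "s/s": "STAINLESS",
    "stainless": "STAINLESS",
}

def normalize_material(raw: str) -> str:
    """Map a free-text material entry to a controlled value.

    Anything not in the alias table is returned with an UNMAPPED
    prefix so it surfaces in review rather than silently passing through.
    """
    key = raw.strip().lower()
    return MATERIAL_ALIASES.get(key, "UNMAPPED:" + raw.strip())
```

The useful property is the explicit `UNMAPPED` bucket: the cleanup never silently invents a category, it only confirms ones a person has already approved.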
The Fix Is Not Glamorous
I want to be honest about what a master data cleanup actually involves, because there’s a version of this story that sounds like a clever technical project. It isn’t. It’s a lot of careful, methodical work.
Step one is a full export and audit. Every item in the item master, every BOM level, every open standard cost, exported to a structured format and reviewed. The goal is to build a map of what exists, what’s correct, what’s duplicated, and what’s missing. This is the part that takes longer than anyone budgets for.
Step two is prioritization. You can’t fix everything at once in a running business. You have to rank by impact: which products account for 80% of revenue? Which BOM inaccuracies are creating the largest cost variances? Which duplicates are actively causing order entry errors? Fix those first.
Step three is governance before remediation. This is the part most companies skip, and it’s why the same problems come back. Before you clean the data, you have to define who owns it going forward and what the rules are. Who approves new item creation? Who can change a standard cost? What happens when a product is discontinued? Without governance, the cleanup is a one-time project. With governance, it’s a lasting improvement.
Step four is the actual remediation. For Lacerta, this meant merging duplicate records, updating cost standards to reflect current material pricing, completing missing BOM structures for our most-used assemblies, and enforcing controlled vocabulary on the fields that fed our reporting.
What It Enabled
Once the product data was stable, several things became possible that hadn’t been before.
The quoting app — which I’ve written about separately — could only work with a reliable item master. If two records existed for the same part, the app would quote at the wrong cost. Clean data was a prerequisite.
Margin analysis by product family became trustworthy. Before the cleanup, I couldn’t tell a sales manager “your margins on this category are compressing” without asterisks. After, I could. That changed how those conversations went.
Standard cost variance became a useful operational signal rather than noise. When the standards accurately reflected expected costs, actual-to-standard variance meant something — a supplier price increase, a yield problem, a procurement win. When the standards were stale, the variance was just a number everyone learned to ignore.
What I’d Tell Someone Starting This
Plan for the data audit to take three times as long as you think. The export is fast. The analysis is slow. Every time you think you understand the scope, you find another category of issue.
Involve operations and manufacturing, not just finance. In a manufacturing business, the BOM is a shared asset — it affects production scheduling, procurement, and cost accounting. If only finance is working on it, you’ll clean things in ways that break other processes. Get the right people in the room.
Build the cleanup into a broader project, not a standalone initiative. Standalone data cleanup projects die. Leadership doesn’t want to fund them, and the team that has to do the work doesn’t want to prioritize them. We tied our cleanup to the BI implementation — it was the prerequisite for a business outcome people cared about. That gave it energy it wouldn’t have had otherwise.
The unglamorous truth about building good financial analytics is that it starts before the analytics. It starts with whether the source data is any good. Most of the time, it isn’t — not because anyone did anything wrong, but because data quality requires active maintenance, and maintenance requires someone to own it.
That’s the job. The dashboards are the easy part.