By Anand Argade

Managing The Manufacturing Inventory That Isn’t There

How process mining and OCPM give mid-market manufacturers a way to finally solve the hardest layers of the ghost inventory problem.

Almost every mid-market manufacturer we speak with tells us some version of the same story.

The system says there are 400 units of a component in Warehouse B. The production team orders a replenishment run because the line needs 150. And when someone actually walks into Warehouse B, they find 61 units. Some were damaged, some were consumed in a job that was never properly closed, and some were simply never where the record said they were. The 400 that the ERP showed were, in any meaningful operational sense, ghosts.

Ghost inventory is the gap between what enterprise systems record and what physically exists. It is one of the most persistent and expensive problems in manufacturing. The problem is not new, but the tools available to counter it have never been able to fully grasp or resolve it.

It is worth understanding why, and what changes with process mining and its object-centric evolution, because for mid-market manufacturers in particular the implications of ghost inventory are substantial.

The Scale of the Problem Is Larger Than Most Acknowledge

The industry data on inventory mismanagement paints an interesting picture.

According to this study, inventory distortion worldwide, which includes shrinkage, stockouts, and overstock, costs businesses a staggering USD 1.6 trillion annually. Worldmetrics further reported that inefficient inventory management alone costs businesses approximately USD 1.1 trillion globally every year. And 60% of manufacturers cite inaccurate inventory data as one of their biggest operational challenges, identifying poor visibility as the primary contributing factor.

But what threw a spanner into an otherwise well-oiled system? Supply chains that were optimised for cost efficiency now face a demand for resilience amid chaos. The Red Sea crisis, Panama Canal constraints, Ukraine conflict spillovers, and escalating US-China technology decoupling have combined to amplify disruptions for which most enterprises were never prepared.

For mid-market manufacturers, these statistics are not just numbers but a direct hit on the balance sheet and the production schedule. A USD 300 million food and beverage manufacturer cannot absorb inventory discrepancies the way a Fortune 500 competitor that cross-subsidises operational losses across divisions can. Every ghost item is a working capital tie-up, a potential production stoppage, or an emergency procurement order at unfavourable pricing.

Poor inventory management costs businesses up to 11% of their annual revenue through stockouts, overstocking, and associated rework. At mid-market scale, this translates into a substantial cut in margins. The ghost inventory problem has several well-documented origins. These include, but are not limited to:

  • Manual data entry errors accumulated across warehouse transactions.
  • Goods receipts recorded without corresponding quality inspections being closed.
  • Production consumption posted days after actual usage, leaving phantom stock on the books.
  • Materials moved across storage locations without system updates following.

For multi-site manufacturers, these discrepancies compound fast across facilities, currencies, and teams. To tackle the problem locally, each site starts running its own informal workarounds, which diverge progressively from the system of record.

Ghost inventory is not a data quality problem alone. It is a process execution problem, and process execution problems require process-level visibility to solve.

How Process Mining Has Empowered Manufacturers

Reading the Trail That ERP Systems Leave

Process mining works on the foundational principle that enterprise systems record every transaction, status change, and user action as an event. Postings for goods receipts, goods issues, inventory transfers, physical inventory counts, and adjustments each leave a timestamped trace in the ERP. Process mining reads those traces and weaves a complete picture of how inventory actually moved through the operation. This differs from earlier methods, which either assumed how inventory was supposed to move or reconstructed the flow from memory during the quarterly audit.

This is where traditional process mining has already delivered measurable value for manufacturers. By constructing an event log from SAP inventory management tables and running discovery and conformance analysis, it is possible to pinpoint where goods receipts are posted without inspection completions, where inventory transfers occur outside normal workflow sequences, and where goods issues are reversed at unusual rates.

These are the process deviations that generate ghost inventory, and process mining makes them visible at scale across every transaction and not just the handful that surface in periodic cycle counts.
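The mechanics of that reconstruction are simple to sketch. The Python snippet below is a minimal illustration, not tied to any particular tool or SAP schema: the hard-coded records stand in for a real extraction from ERP inventory tables, and the activity names are illustrative labels. It groups timestamped events by case and counts each distinct activity sequence, i.e. process variant:

```python
from collections import defaultdict

# Hypothetical, simplified movement records; a real extraction would come
# from ERP inventory tables (material documents), not hard-coded rows.
events = [
    # (case_id, activity, timestamp)
    ("PO-1001", "Goods Receipt",      "2024-03-01T08:00"),
    ("PO-1001", "Quality Inspection", "2024-03-01T10:30"),
    ("PO-1001", "Stock Posting",      "2024-03-01T11:00"),
    ("PO-1002", "Goods Receipt",      "2024-03-02T09:15"),
    ("PO-1002", "Stock Posting",      "2024-03-02T09:20"),  # inspection skipped
    ("PO-1003", "Goods Receipt",      "2024-03-03T14:00"),
    ("PO-1003", "Quality Inspection", "2024-03-03T16:45"),
    ("PO-1003", "Stock Posting",      "2024-03-04T08:00"),
]

def discover_variants(events):
    """Group events by case, order each case by timestamp, and count
    how often each distinct activity sequence (process variant) occurs."""
    cases = defaultdict(list)
    for case_id, activity, ts in events:
        cases[case_id].append((ts, activity))
    variants = defaultdict(int)
    for trace in cases.values():
        sequence = tuple(activity for _, activity in sorted(trace))
        variants[sequence] += 1
    return dict(variants)

# The deviating variant (receipt posted straight to stock) shows up
# alongside the intended three-step flow:
print(discover_variants(events))
```

In practice the interesting output is exactly the low-frequency variants: here, one case follows "Goods Receipt" directly with "Stock Posting", the kind of trace a periodic cycle count would never attribute to a specific process step.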

The benefit of this approach is already established. For instance, this research on the application of process mining to the supply chain, published by Taylor & Francis, confirmed that process mining can trace individual objects through the supply chain from the point of disruption onward, identifying specific failure points, including stockouts and inventory anomalies that cascade from upstream process deviations.

In our engagements with manufacturing businesses at FUTUROOT, applying process mining to inventory processes routinely surfaces dozens of distinct process variants that deviate from the intended flow. These are variants that cycle count procedures and ERP exception reports never detected!

Tracing The Conformance Gap

One of the most direct applications of process mining to the ghost inventory problem is conformance checking, which compares the actual sequence of process steps recorded in the event log against the designed process.

In SAP S/4HANA Cloud ERP, the intended inventory flow is well-defined: a goods receipt triggers a quality inspection notification, inspection completion triggers a stock posting, a stock posting enables a goods issue, and so on. When conformance checking is run against the event data, the deviations that generate ghost inventory become analytically clear: the goods receipt that bypassed quality inspection, the goods issue that preceded the associated production order confirmation, or the inventory transfer that was never acknowledged at the receiving location.
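A minimal version of that conformance check can be sketched in Python. The activity names below are illustrative labels for the intended flow just described, not actual SAP transaction codes, and the logic is a deliberately simple stand-in for the replay-based algorithms commercial tools use:

```python
# Intended inventory flow, in order (illustrative labels):
INTENDED_FLOW = [
    "Goods Receipt",
    "Quality Inspection",
    "Stock Posting",
    "Goods Issue",
]

def conformance_deviations(trace, intended=INTENDED_FLOW):
    """Return the intended steps that a trace skipped, plus any activities
    executed out of order or not part of the intended flow at all."""
    deviations = []
    position = 0  # index of the next intended step we expect to see
    for activity in trace:
        if activity not in intended:
            deviations.append(f"unexpected activity: {activity}")
            continue
        idx = intended.index(activity)
        if idx < position:
            deviations.append(f"out-of-order: {activity}")
        else:
            # Any intended steps jumped over were skipped
            deviations.extend(f"skipped: {s}" for s in intended[position:idx])
            position = idx + 1
    return deviations

# A goods receipt posted straight to stock, bypassing inspection:
print(conformance_deviations(
    ["Goods Receipt", "Stock Posting", "Goods Issue"]))
# prints ['skipped: Quality Inspection']

# A fully conforming trace produces no deviations:
print(conformance_deviations(INTENDED_FLOW))
# prints []
```

Run over every case in the event log, a check like this turns "the books don't match the warehouse" into a ranked list of specific, recurring rule violations.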

These deviations are individually small enough to avoid detection. However, across thousands of transactions per month in a mid-market manufacturing environment, they accumulate into a structural gap between system and physical inventory, making cycle counts feel like an exercise in damage assessment rather than verification.

Process mining gives operations and inventory managers the tools to address this at the source, helping pinpoint the process variants that produce discrepancies and intervene in the workflow.


The Limits of Conventional Analysis: OCPM As the Logical Next Step

The Multi-Object Reality of Manufacturing Inventory

Traditional process mining has contributed meaningfully to improving inventory accuracy. However, deeper ghost inventory issues often persist even after conformance checking and variant analysis have been applied. This is because they exist not within a single process flow but at the intersections of multiple interacting objects.

Let’s consider what is actually happening when a finished goods inventory discrepancy appears on the balance sheet of a mid-market manufacturer. More than a simple goods issue posted in error, it reflects a web of interconnected lapses:

  • A production order that was completed with a component shortage that was never formally closed.
  • A quality hold that was physically resolved but never released in the system.
  • A batch split that was executed in the warehouse but not replicated in materials management.
  • A warehouse transfer order that was confirmed in the warehouse management module but not reflected in the inventory management ledger.

All four of these are separate object lifecycles — production order, quality notification, batch, and transfer order. Each has its own process flow and interacts with the others, contributing to the final discrepancy.

When conventional process mining analyses each of these as a separate flat event log, it can identify deviations within each flow. However, what is missing is the interaction pattern across all four that consistently precedes the inventory discrepancy. That interaction pattern is where the root cause lives!

That’s why, in this research, Wil van der Aalst argues that manufacturing processes involving Bills of Materials, production orders, and logistics are precisely the domain where object-centric analysis is most needed. These processes, by design, involve assembly steps that inherently reference multiple objects simultaneously.

What Object-Centric Process Mining Adds

Object-Centric Process Mining extends the analytical framework of traditional process mining to work directly with multi-object event data. Instead of requiring a choice of case notion that forces the relational complexity of manufacturing processes into a single flat sequence, OCPM extracts an Object-Centric Event Log in which each event can reference any number of objects simultaneously.

For instance, a goods receipt event references the purchase order, the material, the batch, the storage location, and the vendor simultaneously. That is what the event is in the SAP system, and that is what it needs to be in the analysis.
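The structure is straightforward to illustrate. The snippet below sketches object-centric event data in the spirit of OCEL 2.0; the field names and IDs are illustrative, not the formal OCEL 2.0 schema. Because each event references several objects at once, one object's full cross-object history falls out of a single query:

```python
# Each event references multiple objects simultaneously, as it does in
# the source system (object types and IDs here are invented examples).
oc_events = [
    {"id": "e1", "activity": "Goods Receipt", "time": "2024-03-01T08:00",
     "objects": {"purchase_order": "PO-1001", "material": "M-500",
                 "batch": "B-77", "storage_location": "WH-B"}},
    {"id": "e2", "activity": "Quality Hold", "time": "2024-03-01T09:00",
     "objects": {"quality_notification": "QN-12", "batch": "B-77"}},
    {"id": "e3", "activity": "Production Consumption", "time": "2024-03-02T07:30",
     "objects": {"production_order": "PRD-9", "material": "M-500",
                 "batch": "B-77"}},
]

def history_of(object_id, events):
    """All events that touched a given object, in time order: the
    cross-object trail a flat, single-case event log cannot show."""
    touched = [e for e in events if object_id in e["objects"].values()]
    return [(e["time"], e["activity"])
            for e in sorted(touched, key=lambda e: e["time"])]

# Every event batch B-77 participated in, across purchase orders,
# quality notifications, and production orders:
print(history_of("B-77", oc_events))
```

With a flat log, the same question would require choosing one case notion (the purchase order, say) and losing the quality and production events; here the batch's trail crosses all three object lifecycles in one pass.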

For the ghost inventory problem specifically, this matters in three ways.
  • First, OCPM can trace the complete interaction history of a specific material through every object it touched. From production orders and quality notifications to batches, transfer orders, and goods movements, everything is captured in a single, unified analytical model. When a discrepancy appears, the path to its origin no longer requires manually correlating four separate process analyses. It is visible as a connected sequence of cross-object events.
  • Second, OCPM can identify the interaction patterns that precede discrepancies. These are the combinations of object-lifecycle deviations that consistently produce ghost inventory. This allows the analytical posture to shift from reactive investigation to proactive pattern recognition.
  • Third, and perhaps most significantly for mid-market manufacturers planning or recently completing ERP transformations: OCPM’s single-extraction, multi-perspective model means that analytical questions can be updated at runtime without requiring fresh data preparation. This is a practical cost advantage for operations teams working without dedicated data science resources.
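The second point above, shifting from reactive investigation to proactive pattern recognition, can be sketched as a simple co-occurrence count. The deviation labels and case data below are hypothetical, invented purely for the illustration:

```python
from collections import Counter

# Hypothetical per-material cases: which object-lifecycle deviations were
# present, and whether a physical discrepancy was later found.
cases = [
    {"deviations": {"open_production_order", "unreleased_quality_hold"},
     "discrepancy": True},
    {"deviations": {"open_production_order", "unreleased_quality_hold"},
     "discrepancy": True},
    {"deviations": {"unposted_transfer"}, "discrepancy": False},
    {"deviations": {"open_production_order"}, "discrepancy": False},
]

def discrepancy_patterns(cases):
    """Count which combinations of cross-object deviations co-occur in
    cases that ended in an inventory discrepancy."""
    patterns = Counter()
    for case in cases:
        if case["discrepancy"]:
            patterns[frozenset(case["deviations"])] += 1
    return patterns

# The most frequent deviation combination preceding a discrepancy:
print(discrepancy_patterns(cases).most_common(1))
```

Note how the single deviations on their own did not produce a discrepancy in this toy data; it is the combination that recurs, which is exactly the signal single-case analysis cannot isolate.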

It is important to note that OCPM, as an applied discipline, is still maturing. The theoretical foundations are solid, and the OCEL 2.0 standard has been published and is supported, but commercial implementations at scale in manufacturing environments are still emerging. Research findings from the BPM conferences in 2024 and 2025 demonstrate OCPM’s applicability to SAP ERP extraction specifically, and industrial case studies are beginning to show measurable results. Even if the full toolkit is still being assembled, its value for businesses is clear!

OCPM does not replace what process mining has built. It extends its reach into the territory where ghost inventory’s deepest causes actually live: the interactions between process lifecycles, not within any single one.

What This Means for Mid-Market Manufacturers

Adopting a Phase-wise Approach

For a mid-market manufacturer trying to address ghost inventory today, the practical approach is sequential rather than all-or-nothing. The good news is that process mining’s established capabilities already provide a substantial starting point. Built on top of them, OCPM represents a natural and powerful extension as the tooling matures.

The first step is to apply conventional process mining to the core inventory management processes: goods receipts, goods issues, inventory transfers, physical inventory postings, and production order confirmations. This surfaces the conformance deviations and process variants that generate the bulk of routine discrepancies. For most mid-market manufacturers, this step alone identifies a significant reduction opportunity. In our experience at FUTUROOT, addressing the top five to eight variant types that deviate from the intended inventory process flow accounts for the majority of the systematic ghost inventory gap.
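That Pareto-style observation is easy to make concrete. With illustrative variant frequencies (the names and numbers below are invented for the sketch), the share of deviating transactions covered by the top N variant types is:

```python
# Invented frequencies of deviating variant types, most common first
# after sorting; real figures would come from the variant analysis itself.
variant_counts = {
    "skip_inspection":  420,
    "late_consumption": 310,
    "unposted_transfer": 180,
    "reversed_issue":    60,
    "other":             30,
}

def top_n_share(counts, n):
    """Fraction of all deviating transactions covered by the n most
    frequent variant types."""
    ordered = sorted(counts.values(), reverse=True)
    return sum(ordered[:n]) / sum(ordered)

print(round(top_n_share(variant_counts, 3), 2))
# prints 0.91
```

With this (illustrative) distribution, fixing just the three most frequent variants addresses over 90% of the deviating volume, which is the arithmetic behind prioritising a handful of variant types first.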

The second step is extending the analysis to the process boundaries where inventory events intersect with procurement, quality management, production, and warehouse management. This is where traditional process mining, applied with care and domain knowledge of how SAP relates these modules, can already surface cross-functional patterns even before full OCPM capability is deployed. The key to success is having both a process mining platform and the ERP knowledge to extract and interpret events from relevant tables across modules, rather than treating each module as a separate analytical exercise.

The third step, where OCPM can genuinely add value, is multi-object root cause analysis for the persistent, complex discrepancies that survive the first two steps. These are the discrepancies that resist correction because their origin is genuinely multi-causal, spread across interacting object lifecycles in ways that single-case analysis cannot fully resolve. This is where the extension of process mining into object-centric territory delivers a qualitative leap in analytical capability.

The Capability Gap That Mid-Market Manufacturers Need Closed

Large enterprises with dedicated process excellence teams are no strangers to the analytical steps described above. What is relatively new is the availability of this capability at the speed, price point, and interface simplicity that mid-market manufacturers can realistically consume.

The ghost inventory problem cannot be countered by awareness alone. Mid-market businesses require a process intelligence capability that can run continuously across the full transaction volume of an ERP system, surface the deviations at the point where they occur rather than weeks later in a reconciliation exercise, and extend that analysis into the multi-object interaction patterns that account for the most persistent discrepancies.

That capability, once the preserve of enterprise organisations with specialist teams and significant platform budgets, is now within reach for mid-market manufacturers. Its evolution toward object-centric process mining is what will make it genuinely complete.

The Road Ahead

Ghost inventory persists in manufacturing as a systemic and cultural issue. It originates in the gaps between process steps, modules, physical reality, and the digital record, and it is sustained by the absence of visibility across all those gaps simultaneously.

Process mining has already significantly narrowed those gaps for manufacturers who have adopted it, making process deviations visible at transaction scale and conformance gaps explicit in real time. Object-centric process mining takes that capability to the next level to address the multi-object complexities. The two are not competing approaches but successive chapters of the same analytical project: closing the distance between what enterprise systems record and what is operationally real.

At FUTUROOT, our work with manufacturing clients has consistently emphasised this: the inventory problems that matter most exist at the crossroads of multiple processes and objects. For the first time, the process intelligence tools available to mid-market manufacturers are developing the capability to address them at the level at which they need to be addressed. That is a development worth following closely and building toward deliberately!