Data Infrastructure · March 19, 2026

Why Manufacturing Data Projects Stall Between the Historian and the Dashboard

Most manufacturing data initiatives die in the gap between extracting data from historians and SCADA systems and delivering something operations teams will actually use. The technical and organisational reasons are predictable and avoidable.


Karan Bhosale

Founder, Kinesiis

Every manufacturing data project starts the same way. Someone in operations or engineering says: we have years of data in our historian, why can't we see what's happening on the floor in real time? The IT team or a vendor builds a pipeline from the historian to a dashboard. Three months later, nobody uses the dashboard. The data is technically correct but operationally useless. The project is quietly shelved. This pattern repeats across manufacturing companies of all sizes, and the reasons are consistent enough to be worth documenting.

Key takeaways

  • Historian data extraction is not the hard part. The hard part is transforming time-series process data into business context that operations teams can act on.
  • Most manufacturing dashboards fail because they show data engineers' view of the data, not operators' view of the production line.
  • Tag naming conventions in historians are inconsistent across lines and sites. Expect to spend 30-40% of project time on tag mapping and normalisation.
  • Manufacturing data projects succeed when a plant operations lead is involved in defining what 'useful' looks like from day one, not after the pipeline is built.

Historian extraction is the easy part

OSIsoft PI, Honeywell PHD, GE Proficy, Wonderware: the major historians all have APIs or export mechanisms. Getting data out is a solved problem. Most vendors will help you set up an OPC-UA connection or a REST API export in a few days.

The project doesn't stall here. It stalls in what happens next: making the extracted data mean something to someone who runs a production line. Raw historian data is a stream of tag-value-timestamp triples. Converting that into 'Line 3 is running 12% below target because the sealer temperature has been drifting for the last two hours' requires context that doesn't exist in the historian.
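To make the gap concrete, here is a minimal sketch of what a raw historian record actually contains. The tag names and values are illustrative, not from any real site:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# A raw historian sample is just tag, value, timestamp -- nothing else.
@dataclass
class HistorianSample:
    tag: str
    value: float
    timestamp: datetime

samples = [
    HistorianSample("TT-101-PV", 182.4, datetime(2026, 3, 19, 9, 0, tzinfo=timezone.utc)),
    HistorianSample("TT-101-PV", 184.1, datetime(2026, 3, 19, 9, 1, tzinfo=timezone.utc)),
]

# Nothing in the sample says which line this is, which product is running,
# or what the target temperature should be. That context lives outside the
# historian, and supplying it is the actual project.
for s in samples:
    print(s.tag, s.value, s.timestamp.isoformat())
```

Everything in the 'Line 3 is running 12% below target' sentence above, except the raw value itself, has to come from somewhere else.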

Tag mapping is where 30-40% of the project time goes

Historian tags are named by the people who commissioned the equipment, often years or decades ago. Naming conventions vary between lines, between sites, and between the people who set them up. A temperature sensor on Line 1 might be tagged TT-101-PV at one site and LINE1_SEALER_TEMP at another.

Before you can build any useful analytics or dashboards, you need a tag mapping layer that translates raw historian tags into a common, human-readable schema. This is not a one-time exercise. New equipment gets added, tags get renamed, and sites get reconfigured. The mapping layer needs to be maintained as a living artefact.
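A tag-mapping layer can be as simple as a maintained lookup table that translates each raw tag into the common schema, with unmapped tags flagged loudly instead of passed through. A minimal sketch, with entirely hypothetical tag names and sites:

```python
# Hand-maintained mapping from raw historian tags to a common schema.
# In practice this lives in a database or versioned config, not in code.
TAG_MAP = {
    "TT-101-PV":         {"site": "A", "line": 1, "signal": "sealer_temperature", "unit": "degC"},
    "LINE1_SEALER_TEMP": {"site": "B", "line": 1, "signal": "sealer_temperature", "unit": "degC"},
    "FIC-2301-PV":       {"site": "A", "line": 2, "signal": "flow_rate", "unit": "L/min"},
}

def normalise(tag: str) -> dict:
    """Translate a raw historian tag into the common, human-readable schema.
    Unknown tags are flagged rather than silently displayed as-is."""
    meta = TAG_MAP.get(tag)
    if meta is None:
        return {"signal": "UNMAPPED", "raw_tag": tag}
    return {**meta, "raw_tag": tag}

print(normalise("TT-101-PV")["signal"])   # sealer_temperature
print(normalise("XX-999")["signal"])      # UNMAPPED
```

The 'UNMAPPED' flag matters: it is what keeps half-mapped tags visible as a data-quality backlog instead of leaking raw tag names onto dashboards.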

Teams that underestimate this step build pipelines that technically work but produce dashboards where half the labels say things like 'FIC-2301-PV' and the other half say 'Flow Rate'. Operations teams look at this once and never come back.

Dashboards fail when they show the engineer's view, not the operator's view

The most common failure mode is a dashboard designed by the data engineering team based on what's available in the data, rather than what the operator needs to see during a shift.

An operator running a packaging line does not want a time-series chart of 47 sensors. They want to know: is the line running, is it on target, and if not, what changed? That requires the pipeline to calculate OEE metrics, compare against targets, and surface anomalies, not just display raw values.
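The standard way to answer 'is the line on target?' is OEE: availability times performance times quality. A sketch with made-up shift numbers, assuming the pipeline already aggregates run time and counts per shift:

```python
def oee(planned_time_min: float, run_time_min: float,
        ideal_cycle_time_s: float, total_count: int, good_count: int) -> float:
    """Standard OEE decomposition: availability x performance x quality."""
    availability = run_time_min / planned_time_min
    performance = (ideal_cycle_time_s * total_count) / (run_time_min * 60)
    quality = good_count / total_count
    return availability * performance * quality

# Illustrative shift: 480 min planned, 432 min actually running,
# 1.0 s ideal cycle time, 23,000 units produced, 22,500 good.
value = oee(480, 432, 1.0, 23000, 22500)
print(f"OEE: {value:.1%}")   # OEE: 78.1%
```

One number against a target, with the three components available on drill-down, is far closer to what an operator needs during a shift than 47 raw trends.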

The fix is simple but rarely done: put a plant operations lead in the room when you design the dashboard. Not as a reviewer after the fact, but as a co-designer from day one. If the people who will use the system don't shape what it shows, they won't use it.

The ERP-to-historian gap creates a context problem

Historians record what the equipment is doing. ERP systems record what the equipment should be doing: production orders, batch recipes, quality specifications. Useful manufacturing analytics require both.

A temperature reading from a reactor is meaningless without knowing what product is being made, what the specification is, and what batch it belongs to. Joining historian data with ERP data is where the operational context comes from, and it is where most pipelines either don't go or go badly.

The integration is challenging because historians operate on continuous time-series data and ERPs operate on discrete batch or order records. Aligning the two requires a context model that maps time ranges in the historian to production events in the ERP. This is solvable, but it needs to be designed explicitly, not treated as a downstream reporting problem.
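At its simplest, the context model is an interval lookup: given a line and a historian timestamp, return the ERP production order that was running then. A sketch with hypothetical order records:

```python
from datetime import datetime

# Hypothetical ERP production orders, each covering a time window on a line.
orders = [
    {"order": "PO-1001", "line": 3, "product": "SKU-A",
     "start": datetime(2026, 3, 19, 6, 0), "end": datetime(2026, 3, 19, 14, 0)},
    {"order": "PO-1002", "line": 3, "product": "SKU-B",
     "start": datetime(2026, 3, 19, 14, 0), "end": datetime(2026, 3, 19, 22, 0)},
]

def order_for(line: int, ts: datetime):
    """Map a historian timestamp on a given line to the order running then.
    Returns None for gaps (changeovers, unplanned downtime)."""
    for o in orders:
        if o["line"] == line and o["start"] <= ts < o["end"]:
            return o
    return None

match = order_for(3, datetime(2026, 3, 19, 9, 30))
print(match["order"], match["product"])   # PO-1001 SKU-A
```

Real implementations need to handle overlapping or amended orders and clock skew between systems, which is exactly why this deserves explicit design rather than being left to downstream reporting.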

In summary

Manufacturing data projects stall not because the technology is hard, but because the gap between raw process data and operational insight is larger than it appears from the IT side. Historian extraction, tag mapping, operator-centric dashboard design, and ERP context integration are all solvable problems, but they require plant operations involvement from day one and realistic time estimates for the data normalisation work that makes everything else possible.


Dealing with this in your organisation?

Talk to us. We will scope an engagement before any work begins.