Why the IT/OT divide still haunts digital manufacturing

Bridging operational and enterprise data is vital for AI-driven insight. Yet the cultural, architectural and governance gaps between teams continue to stall progress in digital manufacturing.

Manufacturers have talked about IT/OT convergence for decades, but the reality on most factory floors remains far more fragmented than strategy decks would suggest. At the root of the issue is not just a technological mismatch but a difference in worldviews: between teams optimising for uptime and safety, and those building for adaptability and scale.

Operational technology (OT) has always been engineered around stability. It is internal, deterministic, and usually built to withstand worst-case scenarios. The systems that govern it must be proven, predictable and rarely changed. IT, by contrast, thrives on adaptability and connectivity. It is designed to respond to flux: frequent software updates, device variability, user access and external threats are part of its daily rhythm.

These opposing mentalities, deeply embedded in their respective disciplines, make integration challenging. The IT team sees OT as outdated and insecure, still running legacy software in unpatched environments. OT sees IT as reckless, introducing unnecessary risk into systems that are otherwise stable and reliable. At their core, both are correct, and both need each other.

“It comes down to a difference in priorities,” Eamonn O’Neill, Co-Founder and CTO at Lemongrass, says. “OT is about reliability, predictability, and uptime. IT is geared towards services that change frequently and tolerate occasional failure. You can see why one side struggles to work with the other.”

Architecture without ownership leads nowhere

While cultural rifts remain, data is emerging as the shared medium through which IT and OT can meaningfully collaborate. Most manufacturers already have basic integration in place: process orders sent from enterprise systems into manufacturing execution systems, and some production data returned upstream. But much of this is still transactional, and the real opportunity lies in unlocking value from the data trapped in OT.

“Quality metrics, breakage rates, even unplanned downtime on specific lines—this is all data that OT has, but IT rarely sees,” O’Neill explains. “Once that data is integrated into enterprise platforms, it can drive decisions far beyond the shop floor, from supplier selection to supply chain route optimisation.”
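
As a rough illustration of what that integration can look like in practice, the sketch below rolls hypothetical line-level OT events, such as breakages and unplanned stops, up into the per-line figures an enterprise platform could act on. The event fields and function names are illustrative, not a specific vendor schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical line-level events as an OT historian or MES might expose them;
# field names are illustrative, not a specific vendor schema.
@dataclass
class LineEvent:
    line_id: str
    timestamp: datetime
    event_type: str          # e.g. "breakage" or "unplanned_stop"
    duration_min: float = 0.0

def summarise_for_enterprise(events: list[LineEvent], window: timedelta) -> dict:
    """Roll raw shop-floor events up into per-line figures an enterprise
    platform can act on: breakage counts and unplanned downtime."""
    cutoff = datetime.utcnow() - window
    summary: dict[str, dict] = {}
    for e in events:
        if e.timestamp < cutoff:
            continue
        line = summary.setdefault(e.line_id, {"breakages": 0, "downtime_min": 0.0})
        if e.event_type == "breakage":
            line["breakages"] += 1
        elif e.event_type == "unplanned_stop":
            line["downtime_min"] += e.duration_min
    return summary

# Example: summarise the last 24 hours of events for every line
# summarise_for_enterprise(events, timedelta(hours=24))
```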

The integration challenge is not simply technical. A lack of formal data ownership has led many organisations to over-invest in infrastructure and under-invest in accountability. Without clear responsibility, data lakes become data swamps: volumes of duplicated, ungoverned information with little strategic value. The answer lies in embedding ownership from the start, not retrofitting governance once problems emerge.

“You cannot do data integration without data owners. It is not just a technical job. You need people in the organisation who know what the data is, what it means, and how it should be governed,” O’Neill says. “That includes data quality, duplication and freshness. Without ownership, everything else becomes reactive.”
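
A minimal sketch of what ownership-first governance can mean in practice, assuming a hypothetical ownership register: every dataset entering the platform has a named owner and a freshness expectation, and duplicates are flagged before anything reaches the lake. Names and thresholds below are examples only.

```python
from datetime import datetime, timedelta

# Illustrative ownership register: every integrated dataset gets a named owner
# and explicit freshness expectations before it enters the platform.
DATA_OWNERS = {
    "line_quality_metrics": {"owner": "quality-engineering", "max_age": timedelta(hours=4)},
    "unplanned_downtime":   {"owner": "maintenance",         "max_age": timedelta(hours=1)},
}

def check_dataset(name: str, records: list[dict]) -> list[str]:
    """Basic governance checks: is the dataset registered to an owner,
    is it fresh enough, and does it contain duplicate keys?"""
    meta = DATA_OWNERS.get(name)
    if meta is None:
        return [f"{name}: no registered owner - reject until ownership is assigned"]
    issues = []
    newest = max((r["timestamp"] for r in records), default=None)
    if newest is None or datetime.utcnow() - newest > meta["max_age"]:
        issues.append(f"{name}: stale data (owner: {meta['owner']})")
    keys = [(r["line_id"], r["timestamp"]) for r in records]
    if len(keys) != len(set(keys)):
        issues.append(f"{name}: duplicate records detected")
    return issues
```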

Cloud removes the bottleneck but not the complexity

Many of the traditional limitations around storage, bandwidth and data processing have been resolved through cloud-native infrastructure. Cloud platforms are now capable of ingesting and storing plant-level time series data at scale, often at a lower cost than on-premise alternatives. The technical barriers to ingesting and analysing OT data at high frequency are no longer significant.

“Compared to services like Netflix, streaming production data is a minor technical challenge,” O’Neill says. “Cloud platforms can easily handle the scale, and the cost of storing that data, indefinitely if needed, is now low enough to justify keeping it.”
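
To make that concrete, the sketch below batches readings on the plant side and pushes them to a cloud ingestion endpoint over HTTPS. The endpoint URL, API key and sensor-read function are placeholders; most cloud platforms accept time-series points through a similar HTTPS or MQTT interface.

```python
import time
import requests  # pip install requests

# Endpoint and credentials are placeholders for whichever cloud platform
# is receiving the data; the sensor read is a stand-in for a PLC, OPC UA
# server or plant historian query.
INGEST_URL = "https://ingest.example-cloud.com/v1/timeseries"
API_KEY = "replace-me"

def read_sensors() -> dict:
    return {"line3.temperature": 72.4, "line3.vibration": 0.012}

batch = []
while True:
    ts = time.time()
    batch.extend({"metric": k, "ts": ts, "value": v} for k, v in read_sensors().items())
    if len(batch) >= 100:  # send in batches to limit request overhead
        requests.post(
            INGEST_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json=batch,
            timeout=10,
        )
        batch.clear()
    time.sleep(1)  # one reading per second per sensor
```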

Yet this ease of storage can introduce new forms of risk. Cloud misuse is less a matter of cost than of poor planning. Without clear limits on data retention or access, enterprises risk hoarding rather than harnessing their information. Simply having the data is not enough. It must be interpreted, structured and secured, with clear intent.

“You can end up with a data swamp, where there is so much data and nobody really understands it,” O’Neill explains. “There is nothing wrong with deleting data if it has no value. But if you want to keep it, then you need to know what you are keeping and why. Otherwise, it becomes an unmanageable liability.”
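
Knowing what you are keeping and why can be expressed as explicit retention rules rather than an afterthought. The sketch below, with illustrative dataset names and windows, refuses to keep anything that has no stated rule.

```python
from datetime import datetime, timedelta

# Illustrative retention rules: data is kept because it has a stated purpose,
# not because storage is cheap. Dataset names and windows are examples only.
RETENTION = {
    "quality_metrics": timedelta(days=5 * 365),  # long-term traceability
    "raw_vibration":   timedelta(days=90),       # only useful for recent diagnostics
    "debug_exports":   timedelta(days=7),        # no lasting value
}

def expired(dataset: str, created: datetime, now: datetime) -> bool:
    """Return True once a dataset has outlived its retention window.
    Anything without a rule is flagged for review instead of silently kept."""
    rule = RETENTION.get(dataset)
    if rule is None:
        raise ValueError(f"{dataset}: no retention rule - review before storing")
    return now - created > rule
```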

Security is still the point of failure

Integrating OT and IT also changes the risk profile of each environment. OT systems were never designed to be externally connected, and their operating systems, often outdated and unpatched, are a soft target for attackers. IT teams, operating closer to the network perimeter, often refuse to tolerate this level of vulnerability.

“Running Windows 7 in OT is not unusual. From an engineering perspective, if it works, it works. But from an IT security view, that is indefensible,” O’Neill warns. “Inbound connections from cloud to OT must be close to zero. You need strict controls and architectures built specifically to prevent injection or lateral movement.”
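
One common way to keep inbound connections close to zero is a unidirectional relay between the plant network and a DMZ: it pulls from the OT side, pushes outbound, and has no code path back into OT. The sketch below illustrates the pattern only; host names are placeholders, and in practice a hardware data diode or vendor gateway would often play this role.

```python
import time
import socket

OT_HISTORIAN = ("historian.plant.local", 5000)      # OT-side source (placeholder)
DMZ_FORWARDER = ("edge-forwarder.dmz.local", 6000)   # outbound target (placeholder)

def relay_once() -> None:
    # Pull one reading from the OT side (read-only).
    with socket.create_connection(OT_HISTORIAN, timeout=5) as ot:
        reading = ot.recv(4096)
    # Push it outbound to the DMZ. There is deliberately no code path that
    # writes back into the OT network, and the relay never listens for
    # inbound connections from the cloud side.
    with socket.create_connection(DMZ_FORWARDER, timeout=5) as out:
        out.sendall(reading)

while True:
    relay_once()
    time.sleep(10)
```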

This is not a theoretical concern. OT breaches, particularly in critical infrastructure, have demonstrated how vulnerable industrial systems can be if not ring-fenced effectively. The solution is not to apply IT security methods directly, but to design bespoke architectures that account for OT’s unique limitations.

There is also the issue of patching. In IT, patching is routine. In OT, patching can introduce production risk. This forces organisations to confront the trade-off between uptime and security, and to prioritise based on business value. There is no one-size-fits-all answer, only appropriate mitigation.

Walking the floor builds trust faster than workshops

None of this works without people. Culture, not code, is the real barrier to convergence. The success of IT/OT integration depends on building relationships across disciplines and creating joint ownership of shared outcomes. That begins with understanding each other’s daily realities.

“Executives should be walking the plant, seeing what actually happens, and bringing their IT teams with them,” O’Neill advises. “You need to understand how engineers use technology day to day, where the gaps are, and where IT can help.”

Respect comes first. An engineer tasked with maintaining uptime on a billion-dollar production line cannot be expected to prioritise patching schedules over product output. Conversely, IT teams dealing with daily cybersecurity threats cannot ignore known vulnerabilities in connected systems. Acknowledging the validity of each viewpoint is the first step toward building a working model of collaboration.

Early wins matter. Rather than attempting sweeping transformation, organisations should focus on integrating small data sets that deliver measurable results. Feeding quality metrics from OT into ERP systems, for instance, shows immediate value and lays the groundwork for larger projects. Once these benefits are visible, trust follows.

From pilot to platform without stalling

Scaling IT/OT integration beyond isolated pilots requires deliberate design. A dual approach, combining high-level architectural visibility with ground-level insight, is key. Organisations must map all systems, both enterprise and operational, to identify overlaps, redundancies and integration gaps.

“You need a high-level architecture that shows what is connected, what is siloed, and where the opportunity is,” O’Neill explains. “At the same time, work from the ground up to see how systems are being used. Somewhere in the middle, you will find the points of leverage.”

This iterative approach prevents technology from outpacing operations. It allows for constant validation of the business case while preserving operational continuity. It also ensures that governance, security and sovereignty are addressed not as bolt-ons, but as core design principles from the outset.

Digital maturity is not a software problem, and it cannot be fixed with a cloud subscription. It is an organisational challenge that requires cross-functional accountability, sustained momentum and respect for the operational realities of manufacturing. The companies that succeed will not be those with the flashiest tech stack, but those that treat convergence as a long-term cultural shift, grounded in data, designed for resilience, and led by people who understand both sides of the factory wall.
