Chad Stoecker, Vice President, Global Managed Services at GE Digital explains why digital twins are becoming the new normal for industrial companies

It is 1969, and Buzz Aldrin is sitting on the Moon in the cabin of the Apollo 11 lunar lander. With the mission so far considered a success, the crew need only arm the ascent engine that will carry the lander back into lunar orbit. But the circuit breaker that arms the engine has snapped off – the switch cannot be engaged.
“I started to think of ways to activate the switch,” Aldrin wrote later. “As it turned out, the very pen I used to record these notes was the perfect tool to engage this circuit breaker.”

You might think that the key lesson here is that simple is best. But if you run a major complex industrial facility, like an oil and gas platform, then you know that relying on one individual (and a felt-tip pen) is no way to deliver safe, reliable and cost-effective operations.
Perhaps that is why NASA engineers were the original pioneers of the concept of a digital twin in the early 1970s.

No longer science fiction: digital twins are becoming standard practice for industrial companies
According to Forrester Research’s ‘Untangle The Digital Twin As Part Of Your Digital Product Strategy’ report, “Sixteen per cent of senior global business purchase influencers at manufacturing companies currently implement some form of digital twin, with a further 21 per cent planning to do so in the next 12 months. In utilities and telecoms, both current (22 per cent) and planned (27 per cent) implementations are even higher.”

Although the concept of a digital twin is decades old, it is only in the last decade that it has become a business imperative for industrial companies of all stripes, from oil and gas to power generation, electrical transmission and manufacturing. GE Digital, for example, has made digital twins standard operating practice, with its Predix APM (Asset Performance Management) suite of Industrial IoT solutions forming the foundation for connected products and services.

At GE Digital, we define a digital twin as a software representation of a physical thing, system or process. Digital twins allow us to detect, predict, prevent and optimise by applying software to the challenges of the physical world. Our analytics are built on real data from hundreds of thousands of real machines, not just on designs reflecting how engineers think something will work.

GE Digital currently has more than 8,500 digital twins under management by its teams of industrial experts at our Industrial Managed Services facilities in Chicago, Paris, Dubai, Johannesburg, São Paulo, Shanghai and Singapore. Over recent years, those twins have saved customers more than $1.5 billion by avoiding unplanned downtime in industries as diverse as oil and gas, power generation, grid transmission and distribution, aviation, mining, and manufacturing. Take a look at these value stories to see the kinds of catches this team has made for customers and the savings they have facilitated.

Why digital twin?
The marriage of industrial operational technology (OT) data and enterprise-level information technology (IT) data is at the forefront of this wave of industrial digital transformation. Our digital twins not only mirror individual components, assets and processes; they scale to entire fleets of assets, including power stations, electrical grids, and upstream and downstream oil and gas equipment, providing data and decision-making value at every level of an organization, from operator to plant manager to chief operating officer, and saving customers millions of dollars.

Digital twins are informed by the past, aware of the present, and able to predict future outcomes:
Hindsight – Through the study of historical data, digital twins learn the ways that an asset can fail, including contextual data on the rates and effects of those failures over time.

Insight – The lessons gained through hindsight lead to an effective strategy for mitigating the rates and effects of those failures, implementing actions such as inspections and analytics that detect or prevent failures.

Foresight – The data generated as those actions are carried out leads to foresight: if changes are not made, then this is the likely consequence.

Oversight – This is a near-autonomous condition in which foresight leads to the prescription of changes that could or should be made, along with the estimated impacts of those changes: when asset X vibrates in this pattern, automatically reduce output by Y to avoid consequence Z.
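The oversight stage above can be sketched as a simple automated rule that reacts to a twin's telemetry. This is a minimal illustration only: the class, function and threshold names are invented for the sketch, and none of them come from GE's Predix APM.

```python
from dataclasses import dataclass

# Hypothetical names and values for illustration only; this is not
# GE Predix APM code. The rule mirrors the oversight example:
# "when asset X vibrates like this, reduce output by Y".

@dataclass
class AssetState:
    vibration_mm_s: float  # measured vibration velocity, mm/s
    output_mw: float       # current power output, MW

VIBRATION_LIMIT = 7.1      # assumed alarm threshold for this sketch
OUTPUT_STEP = 0.10         # assumed 10% output reduction per intervention

def oversight_rule(state: AssetState) -> AssetState:
    """If vibration exceeds the limit, automatically trim output
    to avoid the predicted failure consequence; otherwise do nothing."""
    if state.vibration_mm_s > VIBRATION_LIMIT:
        return AssetState(
            vibration_mm_s=state.vibration_mm_s,
            output_mw=state.output_mw * (1 - OUTPUT_STEP),
        )
    return state

# A turbine vibrating above the limit has its output reduced;
# one running smoothly is left alone.
noisy = oversight_rule(AssetState(vibration_mm_s=9.0, output_mw=100.0))
smooth = oversight_rule(AssetState(vibration_mm_s=3.0, output_mw=100.0))
```

In a real deployment the thresholds and responses would be derived from the hindsight and insight stages, not hard-coded as they are here.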