
How AI Will Drive Digital Twin 3.0

Testing is a critical task for digital twins, and AI stands to increase the efficacy of testing models going forward

Physics calculations may work perfectly well in theory. On a blackboard, academic science is pretty predictable (outside of the quantum realm, perhaps). Yet, nothing is manufactured in a complete vacuum, is it? When it comes to real-world settings, millions of factors could impact the state of a physical object — material, friction, temperature, pressure, altitude, wear… the list goes on.

With so many tangible conditions increasing the likelihood of deviation, it can be difficult to build a digital twin that accurately represents real-world conditions. This is, in part, why some believe the next generation of digital twins will be driven more by artificial intelligence (AI). By leveraging neural networks trained on sensor data, production teams could trust intelligence gathered from trends across data points instead of relying solely on theoretical models. This could enable better predictability, optimized operational performance and overall improved products.

To discover what the next generation of digital twins will look like, I recently met with Faustino Gomez, co-founder and CEO of NNAISENSE. Below, we’ll define what a digital twin is and why the concept needs an upgrade. We’ll also explore a few digital twin use cases to see how the next wave of AI-driven digital twins will be “learning directly from the data,” as Gomez describes.

What Is a Digital Twin?

No, we’re not discussing your cyber clone. Digital twins are a bit less transhuman than that. Put simply, a digital twin is a digital duplicate of a physical object or system. Digital twins are commonly used across many industries to monitor and test intricate apparatus. Industrial manufacturing, for example, often leverages a digital twin to monitor production and test the performance of equipment within a simulated environment.

Digital Twin 1.0

Some of the first digital twins are attributed to NASA. By reproducing mechanical objects locally, NASA could diagnose issues and test solutions for physical components 200,000 miles away in space, far beyond direct human intervention.

The digital twin concept also found roots in other mechanical devices, such as jet engines, enabling on-the-ground mechanics to diagnose issues in the air. Using digital twins, smart factories could mimic the behavior of manufacturing robots to test performance and identify potential bottlenecks.

Digital Twin 2.0 Adds More Data

Using a digital twin, engineers could represent the complex inner workings of joints and parts on a computer screen — a 3D reflection of the device. Compared with constructing physical models, virtual simulations significantly reduced the effort and resources required to test new concepts and diagnose problems. But simulating reality in a black box wasn't always easy.

Reproducing complex physical counterparts with a digital twin is fraught with complications that can be difficult to model in a standard CAD simulation, said Gomez. So, engineers began taking measurements from physical devices, such as vibrations or temperature, and integrating actual production data into the twin. This enabled operators to better forecast system performance by accounting for real-world elements. Yet, truly predictive forecasting was still lacking. That's where AI comes in.

Digital Twin 3.0 Adds AI

Until recently, digital twins and AI have been independent concepts. But with the advent of more sensors and greater data collection, integrating AI is becoming an inevitable evolution of the digital twin, said Gomez.

Adding an AI layer to a digital twin would involve training a deep learning model on hundreds of time-series data outputs from sensors. Generating this data would require capturing a chronological record of how the system functions over a period of time. "Then, take data and split it," said Gomez. "Use some to train, some to test." The process is not too dissimilar to walk-forward testing, which is common in financial trading, added Gomez.
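
To make the idea concrete, here is a minimal sketch of that chronological train/test split in Python. The window sizes, channel count and variable names are illustrative assumptions, not details of NNAISENSE's pipeline.

```python
import numpy as np

def walk_forward_splits(n_samples, train_size, test_size):
    """Yield (train_idx, test_idx) windows that slide forward in time,
    so the model is always scored on data recorded after its training window."""
    start = 0
    while start + train_size + test_size <= n_samples:
        train_idx = np.arange(start, start + train_size)
        test_idx = np.arange(start + train_size, start + train_size + test_size)
        yield train_idx, test_idx
        start += test_size  # slide forward by one test block and repeat

# 10,000 time-ordered readings from 8 hypothetical sensor channels.
sensor_data = np.random.rand(10_000, 8)
for train_idx, test_idx in walk_forward_splits(len(sensor_data), 2_000, 500):
    train, test = sensor_data[train_idx], sensor_data[test_idx]
    # ...fit a model on `train`, then evaluate it on the later `test` window...
```

Because each test window sits strictly after its training window, the model is always scored on data from its "future," which is the point of a walk-forward style evaluation.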

Instead of using traditional statistical methods and standard physics, a neural net could unify the data and find nonlinear relationships between data points regardless of type. The resulting model could quickly identify cause-and-effect correlations between data sets (which simulated physics engines may miss) and predict future conditions based on new input.
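
As a rough illustration, the toy sketch below feeds several heterogeneous sensor channels into a small neural network (scikit-learn's MLPRegressor) and asks it to predict a reading a few steps into the future. The signals, window length and network size are assumptions for demonstration, not the architecture Gomez describes.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
T = 5_000
t_axis = np.arange(T)

# Three hypothetical sensor channels of different "types" stacked into one table.
readings = np.stack([
    np.sin(0.02 * t_axis) + 0.1 * rng.normal(size=T),  # slow drift, e.g., temperature
    np.cos(0.05 * t_axis) + 0.1 * rng.normal(size=T),  # faster cycle, e.g., vibration
    rng.normal(size=T),                                 # noise, e.g., an unrelated pressure tap
], axis=1)

window, horizon = 20, 5  # use the last 20 readings of all channels to predict 5 steps ahead
samples = range(T - window - horizon)
X = np.stack([readings[t:t + window].ravel() for t in samples])
y = np.array([readings[t + window + horizon - 1, 0] for t in samples])  # future "temperature"

split = int(0.8 * len(X))  # chronological split: train only on the earlier data
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X[:split], y[:split])
print("held-out R^2:", round(model.score(X[split:], y[split:]), 3))
```

Nothing in this code encodes the physics of any one signal; the network should simply pick up whatever structure links the recent readings to the future one, which is the property the article attributes to AI-driven twins.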

Use Cases

With better intelligence, you get a more accurate depiction of possible outcomes. The use cases for such crystal-ball digital twins are numerous. Gomez described how 3D printing and factory maintenance could utilize digital twins:

  • Additive 3D printing: 3D printing metal powder involves a complex thermodynamic process driven by high-intensity lasers. The process is often less than perfect, leaving cold and hot spots from layer to layer, and ignoring them could result in a defect. By analyzing each layer’s thermal displacements, a deep learning model could be trained on the intricate details of the process (see the sketch after this list). This could be used to build a more refined process model, essentially enabling manufacturers to “pre-compute the part.”
  • Predicting glass quality: Gomez described how a specialty glass company utilizes AI-driven digital twins. Its process model takes sensor readings from chambers throughout the melting and cooling process, analyzing temperatures to maintain a uniform mix. Based on those controls, operators can predict what the glass quality will be in the future. Essentially, it’s “a user assistance system based on a digital twin,” described Gomez.
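
For the additive manufacturing case, a hedged sketch of the layer-flagging idea might look like the following: compare each printed layer's thermal map with what a model trained on good builds expects, and flag layers that deviate enough to suggest a hot or cold spot. The array shapes, the threshold and the `predict_expected_map` callable are hypothetical placeholders, not the company's actual process model.

```python
import numpy as np

def flag_suspect_layers(thermal_maps, predict_expected_map, threshold=3.0):
    """thermal_maps: array of shape (n_layers, H, W), one temperature grid per printed layer.
    predict_expected_map(i) stands in for a model trained on good builds that returns the
    expected grid for layer i. Layers whose mean absolute deviation from that expectation
    is more than `threshold` standard deviations above typical are flagged."""
    deviations = np.array([
        np.abs(layer - predict_expected_map(i)).mean()
        for i, layer in enumerate(thermal_maps)
    ])
    z_scores = (deviations - deviations.mean()) / (deviations.std() + 1e-9)
    return np.flatnonzero(z_scores > threshold)

# Toy usage: a "model" that always expects a uniform 200-degree melt pool.
maps = 200 + np.random.rand(50, 64, 64)  # 50 layers of simulated thermal readings
maps[17] += 40                           # inject an artificial hot spot on layer 17
print(flag_suspect_layers(maps, lambda i: np.full((64, 64), 200.0)))  # -> [17]
```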

As you can see in the examples above, a digital twin isn’t necessarily a single object — it could represent an entire process model for a large industrial operation. According to Gomez, having this kind of digital twin could boost testing and R&D, shorten development time and decrease time to market.

Digital Twin 3.0 Forecasts the Future

For digital twins to become more effective, these simulations must be faithful to current and future real-world conditions. But computing tests for situations involving wet chemicals, unstable materials or thermodynamics is tricky. To Gomez, predicting such dynamics really requires “learning directly from the process model.”

The key thing here is that a neural net doesn’t care what it’s monitoring. Be it torque, matter, friction, temperature, fumes or vibration, the model will learn regardless of what’s thrown at it. While the benefit is that you don’t have to know the physics or chemistry, setting up the process requires significant customization, which can differ greatly depending on the type of materials and process at hand.

The fusion of disparate data sets could bring interesting capabilities for testing and forecasting. Yet, the quality and quantity of data are paramount, noted Gomez. Furthermore, these models won’t spring up in an instant — they must be tested and validated against real-world measurements before being deemed useful. While digital twin 3.0 is a promising step forward, more research will be required to ensure the predicted digital future reflects the physical outcomes.

AI-infused digital twins use “data from the process to predict what the process will be in the future,” said Gomez. This goes far beyond traditional 3D representations to produce more faithful, predictive digital twins.

Bill Doerrfeld

Bill Doerrfeld is a tech journalist and analyst. His beat is cloud technologies, specifically the web API economy. He began researching APIs as an Associate Editor at ProgrammableWeb, and since 2015 has been the Editor at Nordic APIs, a high-impact blog on API strategy for providers. He loves discovering new trends, researching new technology, and writing on topics like DevOps, REST design, GraphQL, SaaS marketing, IoT, AI, and more. He also gets out into the world to speak occasionally.
