Uncertainty Quantification and Digital Twins Improve Autonomous Manufacturing
IMSI - September 2025
In the Spring of 2025, IMSI hosted a Long Program on “Uncertainty Quantification and AI for Complex Systems.” During the embedded Workshop on “Uncertainty Quantification for Material Science and Engineering,” Professor Wei Chen of the Department of Mechanical Engineering at Northwestern University spoke about her work on digital twins applied to additive manufacturing.
Professor Chen motivated her talk by referencing the recent National Academies report, Foundational Research Gaps and Future Directions for Digital Twins (2023), which defined a “digital twin” as a computer model that mimics a physical or social system (or system-of-systems), is updated with data from its physical twin, and has a predictive capability that can be used to improve performance or provide other value. Chen noted that digital twins can also be applied to autonomous systems, which could make them particularly important in manufacturing settings.
The purpose of a digital twin is to iteratively improve a predictive model of an actual physical system. Sensor data from the physical system are used to inform the model, and a control input (action) drives the model output toward a prespecified, time-dependent performance target that represents the desired physical behavior. Through an iterative process of sensing, applying control inputs, and comparing the digital output to the physical state of the system, the digital twin is continuously updated and converges on the actual physical system for a given set of state variables. In real-world settings, applying digital twins requires making optimal decisions in near real time; in manufacturing, for example, the time scale for effective decision making is milliseconds.
Chen described her group’s work on additive manufacturing, in which an object is built up layer by layer using laser deposition. Processing conditions such as laser power affect the properties of the material being made. In additive manufacturing, the power of the laser used in material deposition is a controllable parameter, and the temperature profile of the part being made is a measurable one. In a manufacturing process guided by a digital twin, a real-time digital replica of the part mimics the actual physical system as the part grows. The material being manufactured is monitored for temperature and perhaps other properties, and the sensor data (temperature) are fed to the digital twin for decision making. The digital twin then predicts the future state of the physical system, based on an error analysis comparing the previous state of the twin to the physical part, and outputs a control signal for the next step in the manufacturing process. Sensor data are then acquired again, input to the digital twin, and so on, with the goal of improving the digital twin’s ability to output control signals that reduce the error between the desired state of the physical part being manufactured and the digital model of that part. This iterative process involves offline model validation and online decision making, both of which involve uncertainty quantification.
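To make the sense-predict-act cycle concrete, the sketch below shows a minimal digital-twin-style control loop in Python. The numerical relations between laser power and melt-pool temperature, the target temperature, and the simple bias-correction update are all hypothetical placeholders used for illustration; they are not Chen’s models or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical melt-pool response: temperature rises with laser power,
# plus aleatoric measurement noise. Stands in for the physical process.
def sense_temperature(power):
    return 900.0 + 2.0 * power + rng.normal(0.0, 5.0)

# The twin's (imperfect) surrogate of the same response; "bias" is the
# correction the twin learns from its discrepancy with the measurements.
def twin_predict(power, bias):
    return 880.0 + 2.1 * power + bias

target_temp = 1200.0                     # desired melt-pool temperature (arbitrary units)
candidate_powers = np.linspace(100.0, 200.0, 101)
bias, power = 0.0, 150.0

for step in range(20):
    measured = sense_temperature(power)                    # 1. sense the physical part
    bias += 0.5 * (measured - twin_predict(power, bias))   # 2. update the twin from its error
    preds = twin_predict(candidate_powers, bias)           # 3. predict outcomes of candidate actions
    power = candidate_powers[np.argmin(np.abs(preds - target_temp))]  # 4. act: pick the best power
```

Over repeated iterations the correction term shrinks the gap between the twin’s predictions and the measurements, which is the convergence behavior described above, here in its simplest possible form.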
Chen discussed her group’s goal to extend these concepts to develop tools for hybrid autonomous manufacturing. She described their objective as a “co-design challenge,” in which online data collection is used to control the manufacturing process while also aiming to improve it. In other words, the goal is to figure out how to concurrently design the material, its geometry, and the process for its manufacture (see Figure 1).
Uncertainty quantification plays an important role in hybrid autonomous manufacturing. Specifically, engineers have to distinguish between two types of uncertainty: aleatoric uncertainty and epistemic uncertainty. Aleatoric uncertainty arises from factors that will continue to exist in the system regardless of how much data is collected, such as temperature fluctuations in the manufacturing environment, variations in sample or substrate quality, and intrinsic sensor noise. Epistemic uncertainty arises from limited knowledge and data about the system being modeled and controlled, and can be reduced with more data about the system under study. Examples of epistemic uncertainty include the discrepancy between model simulation and experimental data, and numerical uncertainty introduced in the machine learning model.
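One standard way to make this distinction operational, shown in the sketch below, is the law of total variance applied to an ensemble of probabilistic models: averaging each member’s predicted variance estimates the aleatoric part, while the spread of the members’ mean predictions estimates the epistemic part. The numbers are synthetic and purely illustrative.

```python
import numpy as np

# Suppose an ensemble of five probabilistic surrogates each predicts a
# melt-pool temperature as (mean, variance). Synthetic values for illustration.
means = np.array([1502.0, 1498.5, 1505.1, 1500.2, 1496.8])   # per-model predictive means
variances = np.array([25.0, 30.0, 22.0, 28.0, 26.0])         # per-model predicted noise variances

aleatoric = variances.mean()     # irreducible noise: average of the predicted variances
epistemic = means.var()          # model disagreement: variance of the predicted means
total = aleatoric + epistemic    # law of total variance

print(f"aleatoric={aleatoric:.1f}, epistemic={epistemic:.1f}, total={total:.1f}")
```

Collecting more data tends to shrink the epistemic term (the models agree more), while the aleatoric term persists no matter how much data arrives.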
Chen discussed several machine learning models that are useful in manufacturing, progressing from situations in which data about the system are sparse, to systems whose data arrive as a time series, to the most complex systems in which the data are generated in a multidimensional space, such as manufacturing sheet metal. An added degree of complexity arises when there is image data of the system. For sparse data, Gaussian process models are very effective. For time series data, recurrent neural networks are popular. Graph neural networks are useful in multidimensional applications. Finally, if image data are acquired, such as microscopy images of material structures, convolutional neural networks have benefits.
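For the sparse-data case, the sketch below fits a Gaussian process surrogate to a handful of (laser power, temperature) observations using scikit-learn. The data and the single-input setup are assumptions made for illustration, not results from Chen’s work.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic sparse data: a few (laser power, peak temperature) observations.
X = np.array([[120.0], [140.0], [160.0], [180.0], [200.0]])   # laser power (W)
y = np.array([1350.0, 1420.0, 1510.0, 1580.0, 1660.0])        # temperature (arbitrary units)

# RBF kernel for the smooth trend plus a white-noise term for sensor noise.
kernel = RBF(length_scale=30.0) + WhiteKernel(noise_level=25.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict with uncertainty at unseen laser powers.
X_new = np.array([[150.0], [190.0]])
mean, std = gp.predict(X_new, return_std=True)   # std combines noise and data sparsity
```

The appeal for sparse data is that the posterior standard deviation comes for free with the prediction, which is exactly the quantity a digital twin needs for uncertainty-aware decisions.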
How do engineers quantify the uncertainty in these machine learning models? Chen discussed a range of Bayesian Neural Network tools, such as Monte Carlo Dropout and the Laplace Approximation, as amenable to use in digital twin applications in manufacturing.
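As an illustration of one of these tools, the sketch below applies Monte Carlo Dropout to a small regression network in PyTorch: dropout is kept active at inference time and the network is queried repeatedly, so the spread of the outputs approximates the predictive uncertainty. The architecture, inputs, and the fact that the network is untrained are all placeholder assumptions; only the mechanics of the technique are the point.

```python
import torch
import torch.nn as nn

# Small regression net with dropout; inputs might be process parameters
# such as laser power and scan speed (placeholder architecture, untrained).
model = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    """Monte Carlo Dropout: keep dropout stochastic and average many forward passes."""
    model.train()            # .train() keeps the Dropout layers active at inference
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)   # predictive mean and spread

x = torch.tensor([[170.0, 0.8]])          # e.g. [laser power, scan speed]
mean, std = mc_dropout_predict(model, x)  # std grows where the model is less certain
```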
An example application is to create a surrogate model of an additive manufacturing process that conducts a time series analysis of the temperature of the melted material as it is being deposited at a given laser power, and then predicts, in near real time (at the scale of the manufacturing process time steps), the temperature of the melt pool at the next step, so that adjustments can be made to the laser power to achieve the desired properties of the material. Bayesian optimization is used in the feedback process to control the laser power in such a way that defects in the material are reduced and the desired material characteristics are achieved.
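The sketch below shows, in a minimal form, how a Bayesian-optimization-style step could choose the next laser power: a Gaussian process surrogate models the deviation of the melt-pool temperature from a target, and a lower-confidence-bound acquisition balances settings that are predicted to be good against settings that are still uncertain. The observations, target, and candidate grid are assumed for illustration and are not Chen’s data or exact method.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Observations so far: (laser power, |melt-pool temperature - target temperature|).
X = np.array([[120.0], [160.0], [200.0]])
y = np.array([150.0, 40.0, 90.0])          # synthetic deviations from the target

gp = GaussianProcessRegressor(kernel=RBF(30.0) + WhiteKernel(25.0),
                              normalize_y=True).fit(X, y)

# Acquisition: lower confidence bound on the deviation. Prefer powers that are
# either predicted to be good (low mean) or still unexplored (high std).
candidates = np.linspace(100.0, 220.0, 121).reshape(-1, 1)
mean, std = gp.predict(candidates, return_std=True)
lcb = mean - 2.0 * std
next_power = candidates[np.argmin(lcb)].item()   # laser power to try at the next step
```

In a closed-loop setting this selection would run at every process time step, with the newly observed temperature fed back into the surrogate before the next choice.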
In any of these control processes, uncertainty is introduced, and if uncertainty is inadvertently built into the feedback mechanism for the control algorithm, it can propagate through the model and produce out-of-tolerance results for the material being manufactured. To deal with this, Chen’s group needed to understand how to quantify uncertainty propagation along the prediction horizon. They took a machine learning approach to uncertainty quantification; specifically, they developed a method for quantifying the uncertainty associated with a lack of sufficient data for adequately training the machine learning model.
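One generic way to see how uncertainty compounds along a prediction horizon, not necessarily the method Chen’s group developed, is a Monte Carlo rollout: sample from the model’s one-step predictive distribution, feed the sample back in as the next input, and watch the spread of the trajectories widen step by step. The one-step model below is a toy assumption used only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-step surrogate: next temperature depends on the current one,
# with a fixed predictive standard deviation (placeholder model).
def one_step(temp):
    return 0.9 * temp + 150.0, 10.0     # (predictive mean, predictive std)

horizon, n_rollouts = 10, 1000
temps = np.full(n_rollouts, 1500.0)     # all rollouts start from the same measured state
spread = []

for step in range(horizon):
    means, stds = one_step(temps)
    temps = rng.normal(means, stds)     # sample and feed back in (Monte Carlo propagation)
    spread.append(temps.std())          # uncertainty accumulated after each step

# spread[] grows with the horizon, showing how one-step uncertainty compounds
# when predictions are chained, which is what a controller must account for.
```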
Chen demonstrated that surrogate modeling, real-time decision making, and uncertainty quantification are all interrelated and must be approached together in her strategies for improving the additive manufacturing process. Yet to be added to this approach are techniques for model updating, so that the digital twin model can be considered trustworthy. Ultimately, her team’s goal is to improve the efficiency of real-time decision making in the manufacturing process, using tools that integrate the quantification of aleatoric and epistemic uncertainties.
Improving the predictability, repeatability, and reliability of novel manufactured materials is an example of how NSF-funded research advances national priorities. Chen’s work exemplifies how IMSI advances its major research themes of materials science, uncertainty quantification, artificial intelligence, and machine learning: by convening leading researchers in topics deeply embedded in the applied mathematical sciences, such as uncertainty quantification, together with the scientists and engineers who need to apply advances in those topics to pressing needs.