Artificial Intelligence and Smart Vision
Limited-Data Machine Learning for Engineering Applications
These days, artificial intelligence (AI) is so pervasive, including in engineering, that we often use it without knowing it. High-performance computing (HPC) enables engineers to perform data-driven engineering, in which a computer can act without being explicitly programmed. However, current data-driven algorithms such as machine learning (ML) take a large initial data set to construct a model and then refine it stochastically by comparison against actual occurrences. In most engineering applications, there is no large initial data set to draw on.
As a solution, my team pioneers the development of hybrid digital twins from limited data, tailored for engineering applications. These hybrid digital twins use a relatively small data set (far smaller than big-data mining requires) from computational simulations as the initial training set to form a minimal-fidelity model, and then judiciously extend the effective training set to properly train the ML model toward higher fidelity. This is done through different approaches, for example a back-propagation error calculation to identify where fidelity is most likely low, or by means of either theory-guided ML or penalized evolutionary searches (i.e., genetic algorithms) to counter high bias.
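As a minimal sketch of this limited-data strategy, the loop below trains a surrogate on a small set of simulation results and then extends the training set where the model's predictive uncertainty is highest. The FEA oracle `run_fea_simulation`, the Gaussian-process surrogate, and the sampling budget are illustrative assumptions, not our production code.

```python
# Sketch: extend a small simulation-trained surrogate by active learning,
# retraining where predictive uncertainty is highest.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def run_fea_simulation(x):
    # Hypothetical stand-in for an expensive FEA/PDE run (assumption).
    return np.sin(3 * x[0]) * np.exp(-x[1])

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, size=(8, 2))            # small initial FEA set
y_train = np.array([run_fea_simulation(x) for x in X_train])

candidates = rng.uniform(0, 1, size=(500, 2))       # cheap candidate inputs
for _ in range(10):                                 # budget of 10 extra runs
    gp = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_train, y_train)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]              # least-trusted region
    X_train = np.vstack([X_train, x_new])
    y_train = np.append(y_train, run_fea_simulation(x_new))
```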
Big-data mining tools are not a good solution for many engineering applications. For example, the size of the data set a data scientist needs to train an ML model can be equivalent to constructing a library of the frequent scenarios of an engineering application; unless the training makes wise use of limited data, the role of ML easily becomes unattractive for real engineering applications.
Furthermore, an ML algorithm is not explicit programming; it is structured to be a continuous learner. These hybrid models therefore interact with real-world data collection for continual retraining on real observations, on top of the initial FEA training. This is the key direction toward a personalized AI behaviour of the computer, one that reflects the specific engineering practice of a given setting. It also defines the scope of proper data collection and assimilation for an engineering application, rather than fitting the analysis to whatever data already exists.
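The sketch below illustrates one hedged reading of this continual-learning loop: a surrogate first fitted to FEA data is incrementally updated as field measurements arrive. The generator `stream_of_measurements` and the linear toy response are assumptions for illustration only.

```python
# Sketch: a surrogate fitted on FEA data is periodically updated with
# incoming field measurements via incremental (partial) fitting.
import numpy as np
from sklearn.linear_model import SGDRegressor

X_fea = np.random.rand(200, 4)                   # initial FEA training inputs
y_fea = X_fea @ np.array([1.0, -0.5, 0.3, 2.0])  # toy FEA responses

model = SGDRegressor(max_iter=1000, tol=1e-4).fit(X_fea, y_fea)

def stream_of_measurements(n):
    # Hypothetical stand-in for shop-floor observations (assumption).
    for _ in range(n):
        x = np.random.rand(1, 4)
        y = x @ np.array([1.0, -0.5, 0.3, 2.0]) + 0.01 * np.random.randn(1)
        yield x, y

for x_obs, y_obs in stream_of_measurements(50):
    model.partial_fit(x_obs, y_obs)              # incremental retraining step
```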
Digitization of Welding and Digital Twins of Welded Structures
Integration of simulation and machine learning for engineering applications is my team’s emerging specialty: developing hybrid digital twin frameworks for engineering challenges that switch prediction (fully or partially) from simulation to a machine learning (ML) predictor, gaining a fast response time while retaining the fidelity of the simulation.
One stream of work is the challenge of model order reduction in engineering computation when going from a general model to a case-specific model. On one hand, an ML algorithm can represent an engineering application in a single batch; the main limitations are dealing with many features and a reliance on big data. On the other hand, architecting the system with multiple small batches gives more flexibility for prediction, but at the cost of additional post-processing computation and time delay.
A good digital twin therefore needs an optimal breakdown between these two scenarios. Traditionally, ML algorithms are good interpolators but poor extrapolators; however, expensive partial differential equation (PDE) based simulations can train low-cost meta-models, such as ML-based tools, to replicate the behaviour of the system without solving the expensive PDEs. Whether these ML tools replace the PDEs to simulate the core behaviour or replace the system at a higher level for batch control, the ML-PDE combination can manage the extrapolation. The extrapolation capacity of the ML batches is a good threshold for the optimal breakdown.
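One plausible realization of that threshold, sketched below under simplifying assumptions, is a dispatcher that serves queries inside the surrogate's trusted domain from the fast ML meta-model and falls back to the PDE solver when the query extrapolates. The solver `solve_pde`, the nearest-neighbour distance test, and the radius are illustrative choices.

```python
# Sketch: interpolation/extrapolation switch between an ML meta-model
# and an expensive PDE-based solver.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.ensemble import RandomForestRegressor

def solve_pde(x):
    # Hypothetical stand-in for the expensive PDE simulation (assumption).
    return float(np.sum(np.sin(x)))

X_train = np.random.rand(300, 3)
y_train = np.array([solve_pde(x) for x in X_train])

surrogate = RandomForestRegressor(n_estimators=100).fit(X_train, y_train)
nn = NearestNeighbors(n_neighbors=1).fit(X_train)

def predict(x, radius=0.15):
    dist, _ = nn.kneighbors(x.reshape(1, -1))
    if dist[0, 0] <= radius:                    # inside trusted (trained) domain
        return surrogate.predict(x.reshape(1, -1))[0]
    return solve_pde(x)                         # extrapolation: fall back to PDE
```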
Active Exploration of Weld Distortion Scenarios on Digital Twins
Design standards for welded structures commit contractors to submitting an effective distortion-control plan, in which welds shall be made in a sequence that minimizes distortion and the welding heat shall be balanced. These are all requirements, but the standards present no solution for how to achieve them. Plans to control weld distortion are therefore mostly intuitive, with welding engineers relying on their experience combined with the results of a limited number of practical tests. An alternative with modern computing is a digital twin of the welding process on a structure, allowing engineers to efficiently optimize welding scenarios without the need for multiple physical samples.
Digital twins constructed entirely on simulation tools are still limited by computational time and are therefore not mature for practical design. To this end, we built machine learning (ML) algorithms and integrated them with the simulation capability for active exploration of various welding scenarios in real time. We present an example of our algorithm implemented in a real welded-structure project.
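To make the idea of real-time exploration concrete, here is a minimal sketch in which a trained distortion surrogate scores candidate weld sequences under a simple random-permutation search. The scoring function `distortion_surrogate` is a toy stand-in for an ML model trained on FEA results, not our actual predictor.

```python
# Sketch: explore weld sequences in real time by scoring permutations
# with a cheap ML surrogate instead of full FEA runs.
import numpy as np

N_WELDS = 6

def distortion_surrogate(sequence):
    # Toy stand-in for the ML distortion predictor (assumption):
    # penalizes welding physically adjacent joints back to back.
    seq = np.asarray(sequence)
    return float(np.sum(np.abs(np.diff(seq)) == 1))

best_seq, best_score = None, np.inf
rng = np.random.default_rng(1)
for _ in range(2000):                        # cheap evaluations vs. FEA runs
    seq = rng.permutation(N_WELDS)
    score = distortion_surrogate(seq)
    if score < best_score:
        best_seq, best_score = seq, score

print("best welding sequence:", best_seq, "predicted distortion:", best_score)
```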
Artificial Intelligence (AI) for Optimal Deposition Pattern in Overlay Welding
Deep learning and artificial neural networks (ANNs) are popular machine learning tools for many applications. Binary characterization of a manufacturing process is the most effective way of achieving optimal computational performance and a small memory footprint with these tools. From our experience, architecting an ANN and defining the physics-based phenomenon in binary terms require field knowledge, and are therefore more effective with engineers than with data science in isolation.
Our team developed a deep learning prediction of overlay-welding distortion: the user supplies a binarized definition of the overlay pattern, and the machine quickly returns the 3D distortion, replicating the FEA prediction.
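A hedged sketch of this binary encoding follows: a small network maps a flattened 0/1 grid of deposited passes to distortion values at a set of surface points. The grid size, the node count, and the toy generator `fea_distortion` (standing in for real FEA training pairs) are all assumptions.

```python
# Sketch: ANN mapping a binarized overlay pattern to a distortion field.
import numpy as np
from sklearn.neural_network import MLPRegressor

GRID = 8 * 8      # binarized overlay pattern: 8x8 cells, 1 = pass deposited
N_NODES = 50      # distortion reported at 50 surface points (toy choice)

def fea_distortion(pattern):
    # Hypothetical stand-in for FEA-computed distortion (assumption).
    w = np.linspace(0.1, 1.0, N_NODES)
    return np.outer(pattern.sum(), w).ravel() + 0.05 * pattern[:N_NODES]

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(400, GRID)).astype(float)
Y = np.array([fea_distortion(p) for p in X])

ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, Y)
new_pattern = rng.integers(0, 2, size=(1, GRID)).astype(float)
print(ann.predict(new_pattern).shape)        # (1, 50) distortion values
```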
Evaluation of Heat-Affected Zone (HAZ) Softening from Cross-Section Image
Modern pipeline grades such as X70, X80, and X100 improve many properties but come with undesirable features such as HAZ softening in seam and girth welds. On the other hand, welding features can control and minimize the problem at various stages, ranging from engineering and machines to process and workmanship.
Our product helps weld engineers evaluate many welding scenarios within the available time and at reasonable cost, in order to set up optimal welding for their practice. It solves the existing problem of relying on a long and expensive process of HAZ evaluation through physical testing. It is a smart computer application in which the user can take or upload a macro picture of a pipe cross-section with weld(s) and see an embedded HAZ-softening map (yield, UTS, or hardness) around the weld in the picture.
The user enters the weld’s heat input and travel speed; other information is extracted through image processing in the application. A machine learning algorithm, previously trained with integrated digital and physical simulation, quickly predicts the HAZ-softening map around the weld for a given class of materials.
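As a rough illustration of that flow, the sketch below extracts simple descriptors from the cross-section image, appends the user-entered heat input and travel speed, and queries a pre-trained regressor for a softening grid. The feature choices, the grid shape, and the `model` object are hypothetical; the actual application uses richer image processing.

```python
# Sketch: image features + process inputs -> predicted HAZ-softening map.
import numpy as np
import cv2

def extract_weld_features(image_path):
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    weld_area = float(np.count_nonzero(mask)) / mask.size  # fused-zone fraction
    return np.array([weld_area, gray.mean() / 255.0])      # toy descriptors

def haz_softening_map(image_path, heat_input_kj_mm, travel_speed_mm_s, model):
    geom = extract_weld_features(image_path)
    x = np.concatenate([geom, [heat_input_kj_mm, travel_speed_mm_s]])
    # `model` is a hypothetical pre-trained regressor mapping the feature
    # vector to softening values on a grid around the weld (assumption).
    return model.predict(x.reshape(1, -1)).reshape(32, 32)
```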