Artificial intelligence and simulation-based optimization of turbine blade manufacturing
Traditional approaches to minimizing microporosity in the casting process rely on experimental trials, which are both time-consuming and expensive. Numerical simulations are increasingly used to accelerate the development process. However, the prolonged simulation times – often spanning several hours – severely restrict the number of parameter configurations that can be affordably explored for microporosity assessment, thereby limiting their utility in optimization loops.
To address these limitations, artificial intelligence (AI) offers a promising alternative through well-trained machine learning (ML) models that can rapidly estimate the effects of changes in geometric or process parameters on microporosity. However, due to the high cost and long lead times of casting trials, sufficient experimental data is rarely available to train ML models with the necessary predictive accuracy. In this contribution, we show that simulation-generated data can effectively overcome this bottleneck by providing the datasets needed to train ML models.
In the optimization of a complex setup involving 18 test bars, numerous rapid, small-scale simulations of critical component regions were used to generate training data. These simulations enabled the training of an ML model capable of quantifying microporosity in cast parts as a function of design and process parameters. The resulting ML model was employed to guide engineers in designing gating systems and selecting process parameters that minimize microporosity. Integrating this ML-driven approach into the casting development process significantly increased efficiency, enabling near-optimal configurations with minimal dependence on costly, time-intensive trials.
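The workflow described above – a fast surrogate model trained on simulation outputs and then used in an optimization loop – can be sketched as follows. This is a minimal illustration only: the parameter names (pouring temperature, cooling rate), the synthetic porosity function, and the quadratic least-squares surrogate are all assumptions for the sketch, not the authors' actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a casting simulation: microporosity as a
# function of two normalized process parameters (assumed names:
# pouring temperature and cooling rate).
def simulate_porosity(temp, cool):
    return 0.5 * (temp - 0.6) ** 2 + 0.8 * (cool - 0.3) ** 2 + 0.02

# Small design of experiments: each "simulation run" yields one
# (parameters, porosity) training sample.
X = rng.uniform(0.0, 1.0, size=(50, 2))
y = simulate_porosity(X[:, 0], X[:, 1])

# Quadratic feature expansion for a simple polynomial surrogate.
def features(X):
    t, c = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(t), t, c, t * c, t**2, c**2])

# Fit the surrogate by ordinary least squares.
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# The surrogate evaluates in microseconds, so a dense grid search
# for the minimum-porosity setting becomes affordable.
grid = np.array([[t, c] for t in np.linspace(0.0, 1.0, 101)
                        for c in np.linspace(0.0, 1.0, 101)])
pred = features(grid) @ coef
best = grid[np.argmin(pred)]
print(best)  # close to the true optimum (0.6, 0.3)
```

The same pattern scales to the real setting by replacing the synthetic function with actual small-scale simulation results and the polynomial fit with a more expressive regressor.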
Keywords: microporosity prediction, process digitalization, process simulation, artificial intelligence, machine learning, process design, process optimization
Presented in the Aeromat Technical Program.