Corey M. Parrott, Diab Abueidda, Kai A. James
Machine learning surrogates for topology optimization must generalize well across a wide variety of boundary conditions and volume fractions to serve as stand-alone models. However, when design performance is evaluated with physics-based analysis, many recently published methods suffer from low reliability, with a high percentage of the generated structures performing poorly. Disconnected regions of solid material between boundary conditions lead to unstable designs, and the resulting outliers skew performance on test data. In this work, multi-head self-attention generative adversarial networks are introduced as a novel architecture for multiphysics topology optimization. This network applies multi-head attention mechanisms in high-dimensional feature spaces to learn the global dependencies of the data (i.e., connectivity between boundary conditions). The model is demonstrated on the design of coupled thermoelastic structures, and its performance is evaluated with respect to the physics-based objective function used to generate the training data. The proposed network achieves more than a 36-fold reduction in mean objective function error and an eight-fold reduction in volume fraction error compared to a baseline approach without attention mechanisms.
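The core mechanism referenced in the abstract, multi-head self-attention over spatial feature maps, lets every position in a design domain attend to every other position, which is how global dependencies such as connectivity between boundary conditions can be captured. The sketch below is a minimal, illustrative NumPy implementation of multi-head self-attention on a 2D feature map; it is not the authors' architecture, and the projection weights are random placeholders standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(feats, num_heads, rng):
    """Multi-head self-attention over a 2D feature map.

    feats: (H, W, C) array. Spatial positions are flattened so that
    every location can attend to every other, modeling global
    dependencies across the design domain.
    """
    H, W, C = feats.shape
    assert C % num_heads == 0, "channels must divide evenly across heads"
    d = C // num_heads                       # per-head channel dimension
    x = feats.reshape(H * W, C)              # (N, C), N = H * W positions

    # Hypothetical random projections; a trained network learns these.
    Wq, Wk, Wv = (rng.standard_normal((C, C)) / np.sqrt(C) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # Split channels into heads: (num_heads, N, d)
    split = lambda t: t.reshape(H * W, num_heads, d).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)

    # Scaled dot-product attention, computed independently per head
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d), axis=-1)
    out = attn @ v                           # (num_heads, N, d)

    # Concatenate heads and restore the spatial layout
    return out.transpose(1, 0, 2).reshape(H, W, C)

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 8, 16))      # toy 8x8 map with 16 channels
out = multi_head_self_attention(feats, num_heads=4, rng=rng)
print(out.shape)  # (8, 8, 16)
```

Because the attention matrix has shape (N, N) with N = H * W, every output pixel is a weighted mixture of all input pixels, in contrast to the local receptive fields of plain convolutions; this is the property the abstract credits with reducing disconnected-material failures.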