Journal article in Algorithms, 2023

Comparing Activation Functions in Machine Learning for Finite Element Simulations in Thermomechanical Forming

Abstract

Finite element (FE) simulations have proven effective for modeling thermomechanical forming processes, yet applying them to new materials remains challenging because of nonlinear material behavior. To address this, machine learning techniques and artificial neural networks play an increasingly vital role in developing complex models. This paper presents an innovative approach to parameter identification in flow laws, using an artificial neural network that learns directly from test data and automatically generates a Fortran subroutine for the Abaqus Standard or Explicit FE codes. We investigate the impact of activation functions on prediction accuracy and computational efficiency by comparing Sigmoid, Tanh, ReLU, Swish, Softplus, and the less common Exponential function. Despite its infrequent use, the Exponential function demonstrates noteworthy performance and reduced computation times. Model validation compares the network's predictive capabilities with experimental data from compression tests, and numerical simulations confirm the implementation in the Abaqus Explicit FE code.
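As a rough illustration of the six activation functions compared in the abstract, the sketch below defines each one in NumPy. The exact formulations and hyperparameters used in the paper are not given here, so these are common textbook forms (e.g., Swish is assumed with β = 1, i.e., the SiLU variant), not the author's implementation.

```python
import numpy as np

# Common definitions of the six activation functions named in the abstract.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def swish(x):
    # Assumed Swish with beta = 1 (SiLU); the paper may use another beta.
    return x * sigmoid(x)

def softplus(x):
    # log(1 + exp(x)), a smooth approximation of ReLU.
    return np.log1p(np.exp(x))

def exponential(x):
    # The less common Exponential activation highlighted in the paper.
    return np.exp(x)

# Quick side-by-side evaluation on a small input range.
x = np.linspace(-2.0, 2.0, 5)
for f in (sigmoid, tanh, relu, swish, softplus, exponential):
    print(f"{f.__name__:12s}", np.round(f(x), 4))
```

Among these, only the Exponential function reduces to a single transcendental evaluation with no division, maximum, or logarithm, which is consistent with the reduced computation times the abstract reports for it.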
Main file: Pantalé - 2023 - Comparing Activation Functions in Machine Learning.pdf (1.72 MB)
Origin: Files produced by the author(s)
License: CC BY (Attribution)

Dates and versions

hal-04316232, version 1 (30-11-2023)

Identifiers

Cite

Olivier Pantalé. Comparing Activation Functions in Machine Learning for Finite Element Simulations in Thermomechanical Forming. Algorithms, 2023, 16 (12), pp.537. ⟨10.3390/a16120537⟩. ⟨hal-04316232⟩