Conference paper, 2022

MorphoActivation: Generalizing ReLU activation function by mathematical morphology

Santiago Velasco-Forero, Jesús Angulo

Abstract

This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) through the algebraic basis of mathematical morphology. Additionally, a general family of activation functions is proposed by considering both max-pooling and nonlinear operators in the context of morphological representations. The experimental section validates the effectiveness of our approach on classical supervised-learning benchmarks for DCNNs.
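To make the idea concrete: since ReLU(x) = max(x, 0) is itself a morphological (dilation-type) operation, one natural way to generalize it is a learnable max-of-affine activation max_k(a_k x + b_k), which recovers ReLU as a special case. The sketch below illustrates this under those assumptions only; the class name, parameterization, and initialization are illustrative and not the authors' actual construction, which is defined in the paper.

import torch
import torch.nn as nn

class MaxOfAffineActivation(nn.Module):
    """Illustrative dilation-style activation: max_k (a_k * x + b_k).

    With K = 2 and (a, b) fixed at {(1, 0), (0, 0)}, the forward pass
    computes max(x, 0), i.e. ReLU. Learning the coefficients yields a
    broader family of morphological activations.
    """

    def __init__(self, num_terms: int = 2):
        super().__init__()
        # Initialize near ReLU: one identity term, remaining terms zero.
        a = torch.zeros(num_terms)
        a[0] = 1.0
        self.a = nn.Parameter(a)
        self.b = nn.Parameter(torch.zeros(num_terms))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the K affine maps over the input, then take the
        # pointwise maximum (a dilation over the K-term family).
        terms = x.unsqueeze(-1) * self.a + self.b  # shape (..., K)
        return terms.max(dim=-1).values

# Usage: a drop-in replacement for nn.ReLU in a DCNN.
act = MaxOfAffineActivation(num_terms=3)
y = act(torch.randn(8, 16, 32, 32))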
Main file: MorphoActivationsDGMM20.pdf (897.81 KB). Origin: files produced by the author(s).

Dates and versions

hal-03721056, version 1 (12-07-2022)


Cite

Santiago Velasco-Forero, Jesús Angulo. MorphoActivation: Generalizing ReLU activation function by mathematical morphology. International Conference on Discrete Geometry and Mathematical Morphology, Oct 2022, Strasbourg, France. ⟨hal-03721056⟩