Conference Papers, Year: 2021

Scale Equivariant Neural Networks with Morphological Scale-Spaces

Abstract

The translation equivariance of convolutions allows convolutional neural networks to be made translation equivariant or invariant. Equivariance to other transformations (e.g. rotations, affine transformations, scalings) is also desirable whenever we know a priori that transformed versions of the same objects appear in the data. The semigroup cross-correlation, a linear operator equivariant to semigroup actions, was recently proposed and applied in conjunction with the Gaussian scale-space to build architectures equivariant to discrete scalings. In this paper, we propose a generalization based on a broad class of liftings, including morphological scale-spaces. The architectures obtained from different scale-spaces are tested and compared on supervised classification and semantic segmentation tasks in which objects in the test images appear at scales not seen during training. In both tasks, the scale-equivariant architectures generalize to unseen scales dramatically better than a convolutional baseline. Moreover, in our experiments the morphological scale-spaces outperform the Gaussian scale-space on geometrical tasks.
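To make the lifting step concrete, the sketch below illustrates (under assumptions made here, not the authors' implementation) one morphological scale-space lifting: the input image is dilated by quadratic (parabolic) structuring functions of increasing scale, producing the stack on which a scale-equivariant layer such as the semigroup cross-correlation would then operate. The function names, the truncation radius and the naive NumPy loop are illustrative choices only.

import numpy as np

def quadratic_dilation(image, s, radius=None):
    # Dilation of a 2D image by the quadratic structuring function
    # b_s(y) = -||y||^2 / (4*s), truncated to a square window.
    if radius is None:
        radius = max(1, int(np.ceil(3.0 * np.sqrt(s))))
    h, w = image.shape
    padded = np.pad(image, radius, mode="edge")
    out = np.full(image.shape, -np.inf)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            b = -(dy * dy + dx * dx) / (4.0 * s)
            window = padded[radius + dy:radius + dy + h,
                            radius + dx:radius + dx + w]
            out = np.maximum(out, window + b)
    return out

def morphological_scale_space(image, scales):
    # Lift an image to a stack of dilations at increasing scales;
    # output shape is (len(scales), H, W).
    return np.stack([quadratic_dilation(image, s) for s in scales])

# Example: lift a random 32x32 image to three scales.
img = np.random.rand(32, 32)
lifted = morphological_scale_space(img, [1.0, 2.0, 4.0])
print(lifted.shape)  # (3, 32, 32)

Replacing the quadratic dilation with Gaussian convolution in this stack recovers the Gaussian scale-space lifting that the paper compares against.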
Main file: main.pdf (344.87 KB), files produced by the author(s)

Dates and versions

hal-03213645, version 1 (30-04-2021)

Identifiers

Cite

Mateus Sangalli, Samy Blusseau, Santiago Velasco-Forero, Jesus Angulo. Scale Equivariant Neural Networks with Morphological Scale-Spaces. IAPR International Conference on Discrete Geometry and Mathematical Morphology (DGMM 2021), May 2021, Uppsala, Sweden. ⟨hal-03213645⟩