FULLY TRAINABLE GAUSSIAN DERIVATIVE CONVOLUTIONAL LAYER - Archive ouverte HAL
Conference Papers Year : 2022

FULLY TRAINABLE GAUSSIAN DERIVATIVE CONVOLUTIONAL LAYER

Valentin Penaud--Polge
Santiago Velasco-Forero
Jesus Angulo

Abstract

The Gaussian kernel and its derivatives have already been employed in Convolutional Neural Networks in several previous works. Most of these works proposed to compute filters by linearly combining one or several bases of fixed or slightly trainable Gaussian kernels, with or without their derivatives. In this article, we propose a highly configurable layer based on anisotropic, oriented and shifted Gaussian derivative kernels, which generalizes notions encountered in previous related works while keeping their main advantages. The results show that the proposed layer achieves competitive performance compared to previous works and that it can be successfully included in common deep architectures such as VGG16 for image classification and U-Net for image segmentation.
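
The following is a minimal, hypothetical sketch (in PyTorch, not the authors' released code) of the idea summarized in the abstract: each filter is obtained as a trainable linear combination of anisotropic, oriented and shifted Gaussian derivative kernels, with the scales, orientation, shift and mixing weights all learned by backpropagation. Parameter names, the choice of derivative orders and the initialization are illustrative assumptions rather than the paper's exact formulation.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianDerivativeConv2d(nn.Module):
    """Sketch of a convolution whose kernels are trainable linear combinations
    of anisotropic, oriented and shifted Gaussian derivative kernels."""

    def __init__(self, in_channels, out_channels, kernel_size=7, max_order=2):
        super().__init__()
        self.kernel_size = kernel_size
        # Hypothetical basis: Gaussian derivatives of total order <= max_order.
        self.orders = [(i, j) for i in range(max_order + 1)
                       for j in range(max_order + 1) if i + j <= max_order]
        n_basis = len(self.orders)
        # Trainable Gaussian parameters: anisotropic scales, orientation, shift.
        self.log_sigma = nn.Parameter(torch.zeros(out_channels, 2))  # (sigma_u, sigma_v)
        self.theta = nn.Parameter(torch.zeros(out_channels))         # orientation angle
        self.shift = nn.Parameter(torch.zeros(out_channels, 2))      # kernel center offset
        # Trainable mixing weights over derivative orders and input channels.
        self.weights = nn.Parameter(
            torch.randn(out_channels, in_channels, n_basis)
            / math.sqrt(in_channels * n_basis))

    def _basis(self):
        k = self.kernel_size
        r = (k - 1) / 2
        y, x = torch.meshgrid(torch.linspace(-r, r, k),
                              torch.linspace(-r, r, k), indexing="ij")
        # Shift the sampling grid per output channel.
        x = x[None] - self.shift[:, 0, None, None]
        y = y[None] - self.shift[:, 1, None, None]
        # Rotate coordinates into each kernel's own (u, v) frame.
        c, s = torch.cos(self.theta), torch.sin(self.theta)
        u = c[:, None, None] * x + s[:, None, None] * y
        v = -s[:, None, None] * x + c[:, None, None] * y
        sig = self.log_sigma.exp()
        su, sv = sig[:, 0, None, None], sig[:, 1, None, None]
        g = torch.exp(-0.5 * ((u / su) ** 2 + (v / sv) ** 2))
        # Polynomial factors giving Gaussian derivatives of order 0, 1, 2
        # (sufficient for max_order <= 2 as assumed here).
        polys = {0: lambda t, s_: torch.ones_like(t),
                 1: lambda t, s_: -t / s_ ** 2,
                 2: lambda t, s_: (t ** 2 - s_ ** 2) / s_ ** 4}
        basis = [polys[i](u, su) * polys[j](v, sv) * g for (i, j) in self.orders]
        return torch.stack(basis, dim=1)  # (out_channels, n_basis, k, k)

    def forward(self, x):
        basis = self._basis()
        # One kernel per (output, input) channel pair, built on the fly.
        kernels = torch.einsum("oib,obhw->oihw", self.weights, basis)
        return F.conv2d(x, kernels, padding=self.kernel_size // 2)


# Usage: drop-in replacement for a standard 2D convolution.
layer = GaussianDerivativeConv2d(in_channels=3, out_channels=8)
print(layer(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 8, 32, 32])

Because the kernels are regenerated from a small set of smooth parameters at every forward pass, such a layer keeps the structured nature of Gaussian derivative filters while remaining fully trainable, which is what allows it to slot into architectures like VGG16 or U-Net.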
Main file: main.pdf (514.13 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03722767, version 1 (13-07-2022)

Identifiers

Cite

Valentin Penaud--Polge, Santiago Velasco-Forero, Jesus Angulo. FULLY TRAINABLE GAUSSIAN DERIVATIVE CONVOLUTIONAL LAYER. 29th IEEE International Conference on Image Processing (IEEE ICIP), Oct 2022, Bordeaux, France. ⟨hal-03722767⟩