Conference paper · Year: 2021

Extracting Deformation-Aware Local Features by Learning to Deform

Abstract

Despite the advances of handcrafted and learning-based local feature descriptors, they remain limited by their lack of invariance to non-rigid transformations. In this paper, we present a new approach to compute features from still images that are robust to non-rigid deformations, circumventing the problem of matching deformable surfaces and objects. Our deformation-aware local descriptor, named DEAL, leverages polar sampling and a spatial transformer warping to provide invariance to rotation, scale, and image deformations. We train the model end-to-end by applying isometric non-rigid deformations to objects in a simulated environment as guidance to provide highly discriminative local features. Our experiments show that the method outperforms state-of-the-art handcrafted, learning-based image, and RGB-D descriptors on several datasets containing both real and realistic synthetic deformable objects in still images. The source code and trained model of the descriptor are publicly available at https://www.verlab.dcc.ufmg.br/descriptors/neurips2021.
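
The abstract names two differentiable ingredients: polar sampling around a keypoint and a spatial-transformer-style warp. The sketch below is not the authors' released code; it is a minimal illustration, under assumed values, of how a polar sampling grid can be built and resampled differentiably in PyTorch with F.grid_sample. The function name `polar_grid`, the 32x32 patch size, the keypoint location, and the support radius are all illustrative assumptions.

```python
# Minimal sketch (not the DEAL implementation): build a polar sampling grid
# around a keypoint and resample the image with a differentiable warp.
import math
import torch
import torch.nn.functional as F


def polar_grid(center, radius, image_size, n_rho=32, n_theta=32):
    """Return a (1, n_rho, n_theta, 2) grid in [-1, 1] coordinates for grid_sample.

    center: (x, y) keypoint location in pixels (assumed input).
    radius: descriptor support radius in pixels (assumed input).
    image_size: (H, W) of the source image.
    """
    H, W = image_size
    cx, cy = center
    rho = torch.linspace(0.0, 1.0, n_rho) * radius        # radial bins
    theta = torch.linspace(0.0, 2.0 * math.pi, n_theta)   # angular bins
    xs = cx + rho[:, None] * torch.cos(theta)[None, :]    # pixel x coordinates
    ys = cy + rho[:, None] * torch.sin(theta)[None, :]    # pixel y coordinates
    # Normalize pixel coordinates to [-1, 1], as expected by F.grid_sample.
    grid_x = 2.0 * xs / (W - 1) - 1.0
    grid_y = 2.0 * ys / (H - 1) - 1.0
    return torch.stack((grid_x, grid_y), dim=-1).unsqueeze(0)


# Spatial-transformer-style differentiable sampling of a polar patch.
image = torch.rand(1, 1, 240, 320)                        # toy grayscale image
grid = polar_grid(center=(160.0, 120.0), radius=24.0, image_size=(240, 320))
patch = F.grid_sample(image, grid, align_corners=True)    # shape (1, 1, 32, 32)
print(patch.shape)
```

In a deformation-aware descriptor along the lines the abstract suggests, a learned spatial transformer would additionally predict a non-rigid warp composed with this fixed polar grid before the patch is fed to the descriptor network; the fixed grid shown here only illustrates the sampling and warping machinery.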

Dates and versions

hal-03442401, version 1 (23-11-2021)

Identifiers

Cite

Guilherme Potje, Renato Martins, Felipe Cadar, Erickson R. Nascimento. Extracting Deformation-Aware Local Features by Learning to Deform. Neural Information Processing Systems (NeurIPS), Dec 2021, Virtual (Vancouver), Canada. ⟨hal-03442401⟩