Journal article in IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020

Learning more universal representations for transfer-learning

Abstract

A representation is said to be universal if it encodes any element of the visual world (e.g., objects, scenes) in any configuration (e.g., scale, context). While purely universal representations are out of reach, the goal in the literature is to improve the level of universality, starting from a representation that already has a certain level. To do so, the state of the art consists in learning CNN-based representations on a diversified training problem (e.g., ImageNet modified by adding annotated data). While this effectively increases universality, such an approach still requires a large amount of effort to satisfy the need for annotated data. In this work, we propose two methods to improve universality while paying special attention to limiting the need for annotated data. We also propose a unified framework of the two methods, based on diversifying the training problem. Finally, to better match Atkinson's cognitive study about universal human representations, we propose to rely on a transfer-learning scheme as well as on a new metric to evaluate universality. The latter allows us to demonstrate the interest of our methods on 10 target problems, all related to the classification task and covering a variety of visual domains.
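To make the transfer-learning evaluation scheme mentioned in the abstract concrete, the sketch below freezes a CNN pretrained on a source problem and trains a simple classifier on each target problem using the frozen features. This is only a minimal illustration, not the authors' exact protocol: the ResNet-50 backbone, the logistic-regression classifier, and the helper names are assumptions made for this example.

```python
# Minimal sketch of a transfer-learning evaluation of a representation.
# Assumptions (not taken from the paper): ResNet-50 pretrained on ImageNet as
# the source representation, one logistic-regression classifier per target problem.
import torch
import torchvision
from sklearn.linear_model import LogisticRegression

backbone = torchvision.models.resnet50(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()  # expose penultimate-layer activations
backbone.eval()

@torch.no_grad()
def extract_features(images):
    """images: float tensor of shape (N, 3, 224, 224), already normalized."""
    return backbone(images).numpy()

def accuracy_on_target(train_images, train_labels, test_images, test_labels):
    # The representation stays frozen; only a simple classifier is trained per target problem.
    clf = LogisticRegression(max_iter=1000)
    clf.fit(extract_features(train_images), train_labels)
    return clf.score(extract_features(test_images), test_labels)
```

In this spirit, the per-target scores over the 10 target problems would be aggregated to compare representations; the universality metric actually used by the authors is defined in the full text.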
Main file: tamaazousti_preprint_pami.pdf (1.23 MB)
Origin: Files produced by the author(s)
License: CC BY-NC-ND (Attribution, NonCommercial, NoDerivatives)

Dates and versions

hal-04314530 , version 1 (29-11-2023)

Cite

Youssef Tamaazousti, Hervé Le Borgne, Céline Hudelot, Mohamed-El-Amine Seddik, Mohamed Tamaazousti. Learning more universal representations for transfer-learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 42 (9), pp.2212-2224. ⟨10.1109/TPAMI.2019.2913857⟩. ⟨hal-04314530⟩
