Book chapter, 2023

Integration of Multimodal Data

Abstract

This chapter focuses on the joint modeling of heterogeneous information, such as imaging, clinical, and biological data. This kind of problem requires generalizing classical uni- and multivariate association models to account for complex data structures and interactions, as well as for high data dimensionality. Typical approaches are essentially based on the identification of latent modes of maximal statistical association between different sets of features; they ultimately allow the identification of joint patterns of variation across data modalities, as well as the prediction of a target modality conditioned on the available ones. This rationale can be extended to account for several data modalities jointly, defining multi-view, or multichannel, representations of multiple modalities. The chapter covers both classical approaches, such as partial least squares (PLS) and canonical correlation analysis (CCA), and more recent advances based on multichannel variational autoencoders. Specific attention is devoted to the problem of interpretability and generalization of such high-dimensional models. These methods are illustrated in different medical imaging applications and in the joint analysis of imaging and non-imaging information, such as -omics or clinical data.
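
As a concrete illustration of the latent-association rationale summarized above, the following is a minimal sketch in Python, assuming scikit-learn and NumPy are available and using synthetic data in place of real imaging and clinical measurements; the variable names and dimensions are illustrative only and do not reproduce the chapter's experiments.

    # Minimal sketch: latent modes of maximal association between two views
    # (e.g., an "imaging" view X and a "clinical" view Y), using CCA and PLS
    # from scikit-learn. Data are synthetic placeholders.
    import numpy as np
    from sklearn.cross_decomposition import CCA, PLSCanonical

    rng = np.random.default_rng(0)
    n_subjects = 200

    # Shared latent signal driving both modalities, plus modality-specific noise.
    z = rng.normal(size=(n_subjects, 2))
    X = z @ rng.normal(size=(2, 30)) + 0.5 * rng.normal(size=(n_subjects, 30))  # "imaging" view
    Y = z @ rng.normal(size=(2, 10)) + 0.5 * rng.normal(size=(n_subjects, 10))  # "clinical" view

    # CCA finds paired projections of X and Y with maximal correlation.
    cca = CCA(n_components=2).fit(X, Y)
    X_scores, Y_scores = cca.transform(X, Y)
    for k in range(2):
        r = np.corrcoef(X_scores[:, k], Y_scores[:, k])[0, 1]
        print(f"canonical correlation, mode {k}: {r:.2f}")

    # PLS maximizes covariance instead of correlation; the fitted model can
    # also be used to predict one modality from the other.
    pls = PLSCanonical(n_components=2).fit(X, Y)
    Y_pred = pls.predict(X)

In this sketch, CCA recovers paired projections of the two views with maximal correlation, while the PLS variant maximizes covariance and can additionally predict one modality from the other, mirroring the two uses of such models discussed in the chapter.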
Origin: Publication funded by an institution

Dates and versions

hal-04239814 , version 1 (12-10-2023)


Cite

Marco Lorenzi, Marie Deprez, Irene Balelli, Ana L Aguila, Andre Altmann. Integration of Multimodal Data. In: Olivier Colliot (Ed.), Machine Learning for Brain Disorders, Neuromethods vol. 197, Springer, pp. 573-597, 2023. ISBN 978-1-0716-3197-3. ⟨10.1007/978-1-0716-3195-9_19⟩. ⟨hal-04239814⟩