Preprint, Working Paper. Year: 2019

Longitudinal autoencoder for multi-modal disease progression modelling

Abstract

Imaging modalities and clinical measurements, as well as their progression over time, can be seen as heterogeneous observations of the same underlying disease process. Analysing sequences of multi-modal observations, where not all modalities are present at each visit, is a challenging task. In this paper, we propose a multi-modal autoencoder for longitudinal data. The sequence of observations for each modality is encoded by a recurrent network into a latent variable. The variables for the different modalities are then fused into a common variable which describes a linear trajectory in a low-dimensional latent space. This latent space is mapped back into the multi-modal observation space using separate decoders for each modality. We first illustrate the stability of the proposed model through simple scalar experiments. Then, we show how the learned autoencoder can use information from one modality to refine predictions about the future. Finally, we apply this approach to the prediction of future MRI for patients with Alzheimer's disease.
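As a concrete illustration of the architecture outlined in the abstract, the sketch below shows one plausible way to set it up in PyTorch. This is a minimal sketch under assumptions, not the authors' implementation: all names (ModalityEncoder, LongitudinalAutoencoder, latent_dim) and choices such as GRU encoders, averaging fusion and linear decoders are illustrative only.

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    # Encodes the sequence of observations of one modality into a latent vector
    # with a recurrent network (a GRU here; the abstract only says "recurrent").
    def __init__(self, obs_dim, hidden_dim, latent_dim):
        super().__init__()
        self.rnn = nn.GRU(obs_dim, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)

    def forward(self, x):                     # x: (batch, n_visits, obs_dim)
        _, h = self.rnn(x)                    # h: (1, batch, hidden_dim)
        return self.to_latent(h.squeeze(0))   # (batch, latent_dim)

class LongitudinalAutoencoder(nn.Module):
    # Fuses the per-modality latents into a common variable that parametrises a
    # linear trajectory z(t) = z0 + t * v in a low-dimensional latent space, then
    # maps that trajectory back to each modality with a separate decoder.
    def __init__(self, obs_dims, hidden_dim=32, latent_dim=4):
        super().__init__()
        self.encoders = nn.ModuleList(
            [ModalityEncoder(d, hidden_dim, latent_dim) for d in obs_dims])
        self.to_position = nn.Linear(latent_dim, latent_dim)   # z0
        self.to_velocity = nn.Linear(latent_dim, latent_dim)   # v
        self.decoders = nn.ModuleList(
            [nn.Linear(latent_dim, d) for d in obs_dims])

    def forward(self, sequences, times):
        # sequences: one (batch, n_visits, obs_dim_m) tensor per available modality;
        # times: (batch, T) visit times at which to decode the trajectory.
        latents = [enc(x) for enc, x in zip(self.encoders, sequences)]
        fused = torch.stack(latents).mean(dim=0)   # simple averaging fusion (assumption)
        z0, v = self.to_position(fused), self.to_velocity(fused)
        z_t = z0.unsqueeze(1) + times.unsqueeze(-1) * v.unsqueeze(1)  # (batch, T, latent_dim)
        return [dec(z_t) for dec in self.decoders]  # one reconstruction per modality

Under this reading, a missing modality at a given visit would simply be omitted from the fusion, and training would minimise a sum of per-modality reconstruction errors over the observed time points.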
Main file

longitudinal_multimodal_autoencoder.pdf (819.64 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02090886, version 1 (05-04-2019)
hal-02090886, version 2 (17-04-2019)

Identifiers

  • HAL Id: hal-02090886, version 2

Cite

Raphaël Couronné, Maxime Louis, Stanley Durrleman. Longitudinal autoencoder for multi-modal disease progression modelling. 2019. ⟨hal-02090886v2⟩
482 views
515 downloads
