
A deep generative multimodal imaging genomics framework for Alzheimer's disease prediction

Giorgio Dolci; Gloria Menegaz;
2022-01-01

Abstract

Alzheimer's disease (AD) is a neurodegenerative process characterized by the accumulation of amyloid-beta plaques and neurofibrillary tangles and is the most common cause of dementia. Studies have been striving to analyze the disease using available physiological and behavioral data. Functional/structural neuroimaging and genomics are complementary modalities for exploring the mechanisms subserving the development of AD. In this paper, we present a deep multimodal generative data fusion framework for integrating these sources in a classification task involving AD patients and healthy controls from the ADNI database. Biological data fusion has the potential to improve the final prediction, but it is particularly challenging because not all input sources are available for the entire cohort of subjects. Our proposed model allows us to perform prediction even when individuals are missing certain modalities. Our method addresses the missing-modality problem via knowledge transfer from two generative adversarial networks. The model exhibits superior performance for predicting AD versus healthy controls, even with missing modalities. This could have an important impact from the patient's point of view, since certain clinical tests may not be necessary or available to a given individual.
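The abstract outlines the approach at a high level: imaging and genomic features are fused for AD versus healthy-control classification, and two adversarially trained generators transfer knowledge across modalities so that a missing modality can be imputed from the one that is available. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: the module names (CrossModalGenerator, FusionClassifier), feature dimensions, and layer sizes are assumptions, and the discriminators used for adversarial training are omitted.

# Hypothetical sketch of cross-modal imputation followed by fusion-based
# classification; dimensions and architecture choices are illustrative only.
import torch
import torch.nn as nn


class CrossModalGenerator(nn.Module):
    """Maps features of one modality to a synthetic version of the other.
    In a full GAN setup this generator would be trained against a discriminator."""

    def __init__(self, in_dim: int, out_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class FusionClassifier(nn.Module):
    """Imputes whichever modality is missing, then classifies AD vs. control."""

    def __init__(self, img_dim: int, gen_dim: int, hidden: int = 128):
        super().__init__()
        self.img_from_gen = CrossModalGenerator(gen_dim, img_dim)  # genomics -> imaging
        self.gen_from_img = CrossModalGenerator(img_dim, gen_dim)  # imaging -> genomics
        self.head = nn.Sequential(
            nn.Linear(img_dim + gen_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # logits: [healthy control, AD]
        )

    def forward(self, imaging: torch.Tensor = None, genomics: torch.Tensor = None) -> torch.Tensor:
        # Impute the absent modality from the one that is present.
        if imaging is None:
            imaging = self.img_from_gen(genomics)
        if genomics is None:
            genomics = self.gen_from_img(imaging)
        return self.head(torch.cat([imaging, genomics], dim=-1))


# Toy usage: a subject with imaging features only; genomics is imputed internally.
model = FusionClassifier(img_dim=100, gen_dim=50)
imaging_only = torch.randn(4, 100)
logits = model(imaging=imaging_only)
print(logits.shape)  # torch.Size([4, 2])

In the paper's setting, knowledge transfer from the two pretrained generative adversarial networks plays the role of the two generators above, so subjects lacking a clinical test or scan can still be classified.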
2022
Imaging genetics, Multimodal model, Generative network, Alzheimer's disease, Deep learning

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1091247
Citations
  • PMC: not available
  • Scopus: 7
  • Web of Science (ISI): 6