
Deep Multi-Modal Schizophrenia Disorder Diagnosis via a GRU-CNN Architecture

Abstract

The file on this institutional repository is embargoed indefinitely due to licensing and copyright restrictions. Individuals may download a copy of this article from the publisher's website for personal, non-commercial use at: https://doi.org/10.14311/NNW.2022.32.009

Schizophrenia is a complex mental disorder associated with changes in the functional and structural organization of the brain. Accurate automatic diagnosis of schizophrenia is crucial and remains a challenge. In this paper, we propose a method for the automatic diagnosis of schizophrenia based on the fusion of different neuroimaging features and a deep learning architecture. We propose a deep multi-modal fusion (DMMF) architecture based on a gated recurrent unit (GRU) network and 2D-3D convolutional neural networks (CNNs). The DMMF model combines functional connectivity (FC) measures extracted from functional magnetic resonance imaging (fMRI) data with low-level features obtained from fMRI, magnetic resonance imaging (MRI), or diffusion tensor imaging (DTI) data, and creates latent, discriminative feature maps for classification. The fusion of ROI-based FC with fractional anisotropy (FA) derived from DTI images achieved a state-of-the-art diagnostic accuracy of 99.50% and an area under the curve (AUC) of 99.7% on the COBRE dataset. These results are promising for the combination of features. The high accuracy and AUC in our experiments show that the proposed deep learning architecture can extract latent patterns from neuroimaging data and can help to achieve accurate classification of schizophrenia and healthy groups.
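For illustration, the sketch below (in PyTorch, a framework the abstract does not specify) shows the general two-branch fusion pattern the abstract describes: a GRU branch over an ROI-based functional-connectivity matrix and a 3D-CNN branch over an FA volume, with the two latent feature vectors concatenated for binary classification. All layer sizes, names, and input shapes here are illustrative assumptions, not the authors' published configuration, and only the 3D branch of the 2D-3D CNN is sketched.

import torch
import torch.nn as nn

class DMMFSketch(nn.Module):
    """Hypothetical two-branch fusion model, loosely following the abstract."""

    def __init__(self, n_roi=116, gru_hidden=64, n_classes=2):
        super().__init__()
        # GRU branch: treats each ROI's connectivity row as one step in a sequence.
        self.gru = nn.GRU(input_size=n_roi, hidden_size=gru_hidden, batch_first=True)
        # 3D-CNN branch: extracts low-level features from a volume (e.g. an FA map).
        self.cnn = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # -> (B, 16, 1, 1, 1)
        )
        # Fusion head: concatenate the two latent vectors and classify.
        self.head = nn.Linear(gru_hidden + 16, n_classes)

    def forward(self, fc_matrix, fa_volume):
        # fc_matrix: (B, n_roi, n_roi) ROI-based FC; fa_volume: (B, 1, D, H, W).
        _, h = self.gru(fc_matrix)           # h: (1, B, gru_hidden)
        g = h.squeeze(0)                     # (B, gru_hidden)
        c = self.cnn(fa_volume).flatten(1)   # (B, 16)
        return self.head(torch.cat([g, c], dim=1))

# Smoke test with random tensors (batch of 2, 116 ROIs, 32^3 FA volume).
model = DMMFSketch()
logits = model(torch.randn(2, 116, 116), torch.randn(2, 1, 32, 32, 32))
print(logits.shape)  # torch.Size([2, 2])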

This paper was published in Brunel University Research Archive.
