
Fine-Grained Emotion Recognition Using Brain-Heart Interplay Measurements and eXplainable Convolutional Neural Networks

Abstract

Emotion recognition from electro-physiological signals is an important research topic in multiple scientific domains. While a multimodal input may provide additional information that increases emotion recognition performance, an optimal processing pipeline for such a vectorial input is yet undefined. Moreover, algorithm performance often involves a compromise between the ability to generalize over an emotional dimension and the explainability associated with its recognition accuracy. This study proposes a novel explainable artificial intelligence architecture for 9-level valence recognition from electroencephalographic (EEG) and electrocardiographic (ECG) signals. Synchronous EEG-ECG information is combined to derive vectorial brain-heart interplay features, which are rearranged into a sparse matrix (image) and then classified through an explainable convolutional neural network. The proposed architecture is tested on the publicly available MAHNOB dataset, also against the use of a vectorial EEG input. Results, also expressed in terms of confusion matrices, outperform the current state of the art, especially in terms of recognition accuracy. In conclusion, we demonstrate the effectiveness of the proposed approach embedding multimodal brain-heart dynamics in an explainable fashion.
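The feature-to-image step described above can be illustrated with a minimal sketch. Note that the placement scheme, feature count, and image size below are assumptions for illustration only; the paper's actual rearrangement of brain-heart interplay features into a sparse matrix may differ.

```python
import numpy as np

def features_to_sparse_image(features, img_size=32, seed=0):
    """Scatter a 1-D feature vector into a sparse 2-D matrix (image).

    Hypothetical placement scheme: each feature is assigned a fixed,
    distinct pixel position (reproducible via the seed). The resulting
    sparse image could then be fed to a convolutional neural network.
    """
    features = np.asarray(features, dtype=float)
    rng = np.random.default_rng(seed)
    n = features.size
    # Pick n distinct pixel positions out of img_size * img_size.
    flat_idx = rng.choice(img_size * img_size, size=n, replace=False)
    image = np.zeros(img_size * img_size)
    image[flat_idx] = features
    return image.reshape(img_size, img_size)

# Example: 64 interplay features -> one sparse 32x32 "image"
feats = np.linspace(0.1, 1.0, 64)
img = features_to_sparse_image(feats)
print(img.shape)              # (32, 32)
print(np.count_nonzero(img))  # 64
```

One image per trial (or per analysis window) would then be produced, so the CNN sees a fixed spatial layout in which each pixel consistently corresponds to the same brain-heart interplay feature.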

Archivio della Ricerca - Università di Pisa

Last time updated on 17/10/2023
