
What is neurorepresentationalism? From neural activity and predictive processing to multi-level representations and consciousness

Abstract

This review provides an update on Neurorepresentationalism, a theoretical framework that defines conscious experience as multimodal, situational survey and explains its neural basis in terms of brain systems constructing best-guess representations of sensations originating in our environment and body (Pennartz, 2015). It posits that conscious experience is characterized by five essential hallmarks: (i) multimodal richness, (ii) situatedness and immersion, (iii) unity and integration, (iv) dynamics and stability, and (v) intentionality. Consciousness is furthermore proposed to have a biological function, framed by the contrast between reflexes and habits (not requiring consciousness) versus goal-directed, planned behavior (requiring multimodal, situational survey). Conscious experience is therefore understood as a sensorily rich, spatially encompassing representation of body and environment, even though we have the impression of experiencing external reality directly. Contributions to understanding the neural mechanisms underlying consciousness are derived from models of predictive processing, which are trained in an unsupervised manner, do not necessarily require overt action, and have been extended to deep neural networks. Even with predictive processing in place, however, the question remains why this type of neural network activity would give rise to phenomenal experience. Here, I propose to tackle the Hard Problem with the concept of multi-level representations, which emergently give rise to multimodal, spatially wide superinferences corresponding to phenomenal experiences. Finally, Neurorepresentationalism is compared to other neural theories of consciousness, and its implications for defining indicators of consciousness in animals, artificial intelligence devices, and immobile or unresponsive patients with disorders of consciousness are discussed.
