Moving through space is a crucial activity in daily human life. The main objective of my Ph.D. project was to investigate how people exploit the available multisensory sources of information (vestibular, visual, auditory) to navigate efficiently. Specifically, my Ph.D. aimed at i) examining the multisensory integration mechanisms underlying spatial navigation; ii) establishing the crucial role of vestibular signals in spatial encoding and processing, and their interaction with environmental landmarks; and iii) providing the neuroscientific basis for developing tailored assessment protocols and rehabilitation procedures that enhance orientation and mobility through the integration of different sensory modalities, especially aimed at improving the compromised navigational performance of visually impaired (VI) people.
To achieve these aims, we conducted behavioral experiments on adult participants, combining psychophysical procedures, galvanic stimulation, and modeling. In particular, the experiments involved active spatial navigation tasks with audio-visual landmarks and self-motion discrimination tasks with and without acoustic landmarks, using a motion platform (Rotational-Translational Chair) and an acoustic virtual reality tool. Moreover, we applied Galvanic Vestibular Stimulation to directly modulate signals coming from the vestibular system during behavioral tasks that involved interaction with audio-visual landmarks. Where appropriate, we compared the obtained results with the predictions of the Maximum Likelihood Estimation model to test for optimal integration of the available multisensory cues.
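The Maximum Likelihood Estimation model used as a benchmark here has a standard closed form for independent Gaussian cues: each cue is weighted by its reliability (inverse variance), and the fused estimate is never noisier than the best single cue. A minimal sketch of that prediction, with illustrative numbers that are not taken from the thesis:

```python
from math import sqrt

def mle_integrate(estimates, sigmas):
    """Reliability-weighted fusion of independent Gaussian cue estimates.

    Each cue i gets a weight proportional to its reliability 1/sigma_i**2,
    and the fused standard deviation is the inverse square root of the
    summed reliabilities, so it is at most the best single cue's sigma.
    """
    reliabilities = [1.0 / s ** 2 for s in sigmas]
    total = sum(reliabilities)
    fused_estimate = sum((r / total) * e for r, e in zip(reliabilities, estimates))
    fused_sigma = sqrt(1.0 / total)
    return fused_estimate, fused_sigma

# Hypothetical visual and auditory heading estimates (degrees) with
# different noise levels; the fused estimate leans toward the more
# reliable (visual) cue.
est, sd = mle_integrate([10.0, 14.0], [2.0, 4.0])
```

Comparing observed audio-visual thresholds against this predicted `fused_sigma` is the usual test for optimal integration: participants whose combined-cue precision matches the prediction count as integrators, while those at single-cue precision do not.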
i) Results on multisensory navigation revealed one sub-group of integrators and one of non-integrators, exposing inter-individual differences in audio-visual processing while moving through the environment. Finding these idiosyncrasies in a homogeneous sample of adults emphasizes the role of individual perceptual characteristics in multisensory perception, highlighting the importance of planning tailored rehabilitation protocols that account for each individual’s perceptual preferences and experiences. ii) We also found a robust inherent overestimation bias in estimates of passive self-motion stimuli. This finding sheds new light on how the brain processes and weighs the available cues to build a more functional representation of the world. We also demonstrated a novel impact of vestibular signals on the encoding of visual environmental cues in the absence of actual self-motion information. The role that vestibular inputs play in visual cue perception and spatial encoding has multiple consequences for humans’ ability to navigate functionally in space and interact with environmental objects, especially when vestibular signals are impaired due to intrinsic conditions (vestibular disorders) or environmental ones (altered gravity, e.g., spaceflight missions). Finally, iii) the combination
of the Rotational-Translational Chair and the acoustic virtual reality tool revealed a slight improvement in self-motion perception for VI people when they exploited acoustic cues. This approach proved to be a successful technique for evaluating audio-vestibular perception and improving the spatial representation abilities of VI people, providing the basis for developing new rehabilitation procedures focused on multisensory perception.
Overall, the findings of my Ph.D. project broaden scientific knowledge about spatial navigation in multisensory environments, yielding new insights into the brain mechanisms associated with mobility, orientation, and locomotion abilities.