Electrophysiological signatures of second language multimodal comprehension

Abstract

Language is multimodal: non-linguistic cues, such as prosody, gestures, and mouth movements, are always present in face-to-face communication and interact to support processing. In this paper, we ask whether and how multimodal cues affect L2 processing by recording EEG from highly proficient bilinguals while they watched naturalistic materials. For each word, we quantified surprisal and the informativeness of prosody, gestures, and mouth movements. We found that each cue modulates the N400: prosodic accentuation, meaningful gestures, and informative mouth movements all reduce N400 amplitude. Further, the effects of meaningful gestures, but not of mouth informativeness, are enhanced by prosodic accentuation, whereas the effects of mouth movements are enhanced by meaningful gestures but reduced by beat gestures. Compared with L1 comprehenders, L2 participants benefit less from cues and their interactions, except for meaningful gestures and mouth movements. Thus, in real-world language comprehension, L2 comprehenders use multimodal cues just as L1 speakers do, albeit to a lesser extent.

This paper was published in MPG.PuRe.
