Publication:
Multimodal encoding of motion events in speech, gesture and cognition

Type

article

Access

openAccess
Attribution 4.0 International

Publication Status

Published online

Creative Commons license

Except where otherwise noted, this item's license is described as Attribution 4.0 International

Abstract

How people communicate about motion events, and how this is shaped by language typology, has mostly been studied with a focus on linguistic encoding in speech. Yet human communication typically involves an interactional exchange of multimodal signals, such as hand gestures, which have different affordances for representing event components. Here, we review recent empirical evidence on the multimodal encoding of motion in speech and gesture to gain a deeper understanding of whether and how language typology shapes linguistic expressions in different modalities, how this changes across different sensory modalities of input, and how it interacts with other aspects of cognition. The empirical evidence strongly suggests that Talmy's typology of event integration predicts both multimodal event descriptions in speech and gesture and visual attention to event components prior to producing these descriptions. Furthermore, variability within the event itself, such as the type and modality of stimuli, may override the influence of language typology, especially for the expression of manner.

Date

2023-12

Publisher

Cambridge University Press
