Quantifying the speech-gesture relation with massive multimodal datasets: Informativity in time expressions.

Authors: Cristóbal Pagán Cánovas, Javier Valenzuela, Daniel Alcaraz Carrión, Inés Olza, Michael Ramscar
Language: English
Publication year: 2020
Source: PLoS ONE, Vol 15, Iss 6, p e0233892 (2020)
Document type: article
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0233892
Description: The development of large-scale corpora has led to a quantum leap in our understanding of speech in recent years. By contrast, the analysis of massive datasets has so far had a limited impact on the study of gesture and other visual communicative behaviors. We utilized the UCLA-Red Hen Lab multi-billion-word repository of video recordings, all of them showing communicative behavior that was not elicited in a lab, to quantify speech-gesture co-occurrence frequency for a subset of linguistic expressions in American English. First, we objectively establish a systematic relationship between gesture and speech in our subset of expressions, which consists of temporal phrases, showing a high degree of co-occurrence. Second, we show that there is a systematic alignment between the informativity of co-speech gestures and that of the verbal expressions with which they co-occur. By exposing deep, systematic relations between the modalities of gesture and speech, our results pave the way for the data-driven integration of multimodal behavior into our understanding of human communication.
Database: Directory of Open Access Journals
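
Note: the abstract describes quantifying how often gestures co-occur with temporal expressions. The following is a minimal illustrative sketch, not the authors' pipeline, of how such a co-occurrence rate might be computed from a hypothetical set of annotated clips (field names "expression" and "gesture" are assumptions for illustration only).

    from collections import Counter

    # Hypothetical annotated clips: each record notes the temporal expression
    # uttered and whether a co-speech gesture was observed in the video.
    clips = [
        {"expression": "a long time ago", "gesture": True},
        {"expression": "a long time ago", "gesture": True},
        {"expression": "from now on", "gesture": False},
        {"expression": "from now on", "gesture": True},
    ]

    def co_occurrence_rates(records):
        """Proportion of clips per expression in which a gesture co-occurs."""
        totals, with_gesture = Counter(), Counter()
        for r in records:
            totals[r["expression"]] += 1
            if r["gesture"]:
                with_gesture[r["expression"]] += 1
        return {expr: with_gesture[expr] / n for expr, n in totals.items()}

    print(co_occurrence_rates(clips))
    # {'a long time ago': 1.0, 'from now on': 0.5}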