Author:
Alan Taitz, M Florencia Assaneo, Natalia Elisei, Mónica Trípodi, Laurent Cohen, Jacobo D Sitt, Marcos A Trevisan |
Language:
English
Year of publication:
2018
Subject:
Source:
PLoS ONE, Vol 13, Iss 3, p e0193466 (2018)
Document type:
article
ISSN:
1932-6203
DOI:
10.1371/journal.pone.0193466
Description:
Sound-symbolic word classes are found in different cultures and languages worldwide. These words are continuously produced to encode complex information about events. Here we explore the capacity of creative language to convey complex multisensory information in a controlled experiment in which participants improvised onomatopoeias from noisy moving objects presented in audio, visual and audiovisual formats. We found that consonants communicate movement types (slide, hit or ring) mainly through the manner of articulation in the vocal tract. Vowels communicate shapes in visual stimuli (spiky or rounded) and sound frequencies in auditory stimuli through the configuration of the lips and tongue. A machine learning model was trained to classify movement types and used to validate generalizations of our results across formats. We applied the classifier to a list of cross-linguistic onomatopoeias: simple actions were correctly classified, while different aspects were selected to build onomatopoeias of complex actions. These results show how the different aspects of complex sensory information are coded and how they interact in the creation of novel onomatopoeias.
Database:
Directory of Open Access Journals
External link: