Author:
Désirée Schoenherr, Alisa Shugaley, Franziska Roller, Lukas A. Knitter, Bernhard Strauss, Uwe Altmann
Language:
English
Year of publication:
2023
Subject:

Source:
Methodology, Vol 19, Iss 3, Pp 283-302 (2023)
Document type:
article
ISSN:
1614-2241
DOI:
10.5964/meth.9375
Description:
In clinical research, the dependence of results on the methods used is frequently discussed. In research on nonverbal synchrony, human ratings and automated methods do not lead to congruent results. Even when automated methods are used, the choice of method and parameter settings is important for obtaining congruent results. However, these are often insufficiently reported and do not meet standards of transparency and reproducibility. This tutorial is aimed at researchers who are not familiar with the software Praat and R and shows in detail how to extract acoustic features such as fundamental frequency or speech rate from video or audio recordings of conversations. Furthermore, it shows how vocal synchrony indices can be calculated from these features to represent how well two interaction partners vocally adapt to each other. All scripts used, as well as a minimal example, can be found on the Open Science Framework and GitHub.
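As a rough illustration of the kind of computation the tutorial covers, the following R sketch derives a generic windowed cross-correlation synchrony index from two fundamental-frequency (f0) contours. The function name, window and lag parameters, and the simulated data are illustrative assumptions, not the authors' published method; their actual scripts are available on the Open Science Framework and GitHub as noted above.

# Minimal sketch (not the authors' scripts): a generic windowed
# cross-correlation synchrony index for two f0 series.
# Assumes f0 values were already extracted (e.g., with Praat) and are
# sampled on a common time grid; NA marks unvoiced frames.
synchrony_index <- function(f0_a, f0_b, win = 100, step = 50, max_lag = 10) {
  stopifnot(length(f0_a) == length(f0_b))
  peaks <- c()
  starts <- seq(1, length(f0_a) - win + 1, by = step)
  for (s in starts) {
    a <- f0_a[s:(s + win - 1)]
    b <- f0_b[s:(s + win - 1)]
    # skip windows that are mostly unvoiced
    if (sum(complete.cases(a, b)) < win / 2) next
    cc <- ccf(a, b, lag.max = max_lag, plot = FALSE, na.action = na.pass)
    peaks <- c(peaks, max(abs(cc$acf), na.rm = TRUE))
  }
  mean(peaks)  # higher values indicate stronger vocal coupling in this toy index
}

# Toy usage with simulated, partly correlated f0 contours (Hz)
set.seed(1)
t <- seq(0, 60, by = 0.1)
f0_a <- 180 + 20 * sin(0.5 * t) + rnorm(length(t), sd = 5)
f0_b <- 120 + 15 * sin(0.5 * t + 0.3) + rnorm(length(t), sd = 5)
synchrony_index(f0_a, f0_b)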
Database:
Directory of Open Access Journals
External link:
