Non-verbal speech cues as objective measures for negative symptoms in patients with schizophrenia.

Authors: Yasir Tahir, Zixu Yang, Debsubhra Chakraborty, Nadia Thalmann, Daniel Thalmann, Yogeswary Maniam, Nur Amirah Binte Abdul Rashid, Bhing-Leet Tan, Jimmy Lee Chee Keong, Justin Dauwels
Language: English
Year of publication: 2019
Subject:
Source: PLoS ONE, Vol 14, Iss 4, p e0214314 (2019)
Document type: article
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0214314
Description: Negative symptoms in schizophrenia impose a significant burden, and few robust treatments exist in clinical practice today. A key obstacle impeding the development of better treatments is the lack of an objective measure. Since negative symptoms almost always adversely affect speech production in patients, speech dysfunction has been considered a viable objective measure. However, researchers have mostly focused on the verbal aspects of speech, paying scant attention to non-verbal cues in speech. In this paper, we explore non-verbal speech cues as objective measures of the negative symptoms of schizophrenia. We collected an interview corpus of 54 subjects with schizophrenia and 26 healthy controls. To validate the non-verbal speech cues, we computed the correlations between these cues and the NSA-16 ratings assigned by expert clinicians. Significant correlations were obtained between the non-verbal speech cues and certain NSA indicators; for instance, Turn Duration correlates with Restricted Speech at -0.5 and Response Time correlates with NSA Communication at 0.4, indicating that poor communication is reflected in the objective measures and thereby validating our claims. Moreover, certain NSA indices can be classified into observable and non-observable classes from the non-verbal speech cues by means of supervised classification methods; in particular, the accuracies for Restricted Speech Quantity and Prolonged Response Time are 80% and 70%, respectively. We were also able to distinguish patients from healthy controls using non-verbal speech features, with 81.3% accuracy.
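The abstract describes two analyses: correlating non-verbal speech cues with clinician-assigned NSA-16 item ratings, and classifying subjects from those cues with supervised learning. The sketch below is not the authors' code; it illustrates the general workflow under assumed placeholder features (turn duration, response time) and synthetic data, using Pearson correlation and a linear SVM as one possible choice of classifier.

```python
# Illustrative sketch only: synthetic data and placeholder feature names,
# not the corpus or pipeline used in the paper.
import numpy as np
from scipy.stats import pearsonr
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects = 80  # the study had 54 patients and 26 healthy controls

# Hypothetical per-subject non-verbal cues extracted from interview audio:
# mean turn duration (s) and mean response time (s).
turn_duration = rng.normal(8.0, 3.0, n_subjects)
response_time = rng.normal(1.2, 0.5, n_subjects)

# Hypothetical NSA-16 item ratings assigned by clinicians (ordinal scores).
nsa_restricted_speech = rng.integers(1, 7, n_subjects)

# Correlation between an objective cue and a clinician rating
# (the paper reports, e.g., r = -0.5 for Turn Duration vs. Restricted Speech).
r, p = pearsonr(turn_duration, nsa_restricted_speech)
print(f"Turn Duration vs. Restricted Speech: r = {r:.2f}, p = {p:.3f}")

# Supervised classification of patients vs. healthy controls from the cues.
X = np.column_stack([turn_duration, response_time])
y = np.concatenate([np.ones(54), np.zeros(26)])  # 1 = patient, 0 = control
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {acc.mean():.1%}")
```

With real interview features in place of the synthetic arrays, the same two steps (per-item correlation, then cross-validated classification) reproduce the structure of the reported evaluation.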
Database: Directory of Open Access Journals