Improved semi-supervised autoencoder for deception detection.

Author: Fu H; School of Information Science and Engineering, Henan University of Technology, Zhengzhou, China., Lei P; School of Information Science and Engineering, Henan University of Technology, Zhengzhou, China., Tao H; School of Information Science and Engineering, Henan University of Technology, Zhengzhou, China.; Key Laboratory of Underwater Acoustic Signal Processing of Ministry of Education, Southeast University, Nanjing, China., Zhao L; Key Laboratory of Underwater Acoustic Signal Processing of Ministry of Education, Southeast University, Nanjing, China., Yang J; School of Information Science and Engineering, Henan University of Technology, Zhengzhou, China.
Language: English
Source: PLoS One [PLoS One] 2019 Oct 08; Vol. 14 (10), pp. e0223361. Date of Electronic Publication: 2019 Oct 08 (Print Publication: 2019).
DOI: 10.1371/journal.pone.0223361
Abstract: Existing speech-based deception detection algorithms are severely limited by the scarcity of labelled data, while large amounts of easily obtainable unlabelled data go unused. To address this problem, this paper proposes a semi-supervised additive-noise autoencoder model for deception detection. The model updates and optimizes the semi-supervised autoencoder and consists of two encoder-decoder layers and a classifier. First, the activation function of the network's hidden layers is changed according to the characteristics of deceptive speech. Second, to prevent over-fitting during training, dropout with a specific ratio is carefully applied at each layer. Finally, the supervised classification task is connected directly to the encoder output, making the network more concise and efficient. Using the feature set specified by the INTERSPEECH 2009 Emotion Challenge, experimental results on the Columbia-SRI-Colorado (CSC) corpus and our own deception corpus show that the proposed model outperforms alternative methods while using only a small amount of labelled data.
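The architecture outlined in the abstract can be sketched as follows. This is a minimal illustrative forward pass, not the authors' implementation: the layer sizes (other than the 384-dimensional INTERSPEECH 2009 feature vector), the ReLU activation, the noise level, the tied-weight decoder, and the 0.2 dropout rate are all assumptions chosen for brevity; the paper specifies its own choices for each.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def dropout(x, rate):
    # inverted dropout: zero a random fraction of units, rescale the rest
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# 384 = size of the INTERSPEECH 2009 Emotion Challenge feature set;
# hidden sizes and class count are illustrative
d_in, d_h1, d_h2, n_cls = 384, 128, 64, 2

# random stand-ins for trained parameters (decoder weights tied to the encoder's)
W1 = rng.normal(0.0, 0.05, (d_in, d_h1))
W2 = rng.normal(0.0, 0.05, (d_h1, d_h2))
Wc = rng.normal(0.0, 0.05, (d_h2, n_cls))

def forward(x, train=True):
    noisy = x + rng.normal(0.0, 0.1, x.shape)  # additive Gaussian noise on the input
    h1 = relu(noisy @ W1)                      # encoder layer 1
    if train:
        h1 = dropout(h1, 0.2)
    z = relu(h1 @ W2)                          # encoder layer 2 (shared representation)
    if train:
        z = dropout(z, 0.2)
    x_hat = relu(z @ W2.T) @ W1.T              # tied-weight decoder reconstructs the input
    probs = softmax(z @ Wc)                    # classifier attached directly to encoder output
    return x_hat, probs

def loss(x, y, x_hat, probs):
    # semi-supervised objective: reconstruction on every sample,
    # cross-entropy only where labels exist (y == -1 marks unlabelled)
    recon = np.mean((x - x_hat) ** 2)
    labelled = y >= 0
    ce = 0.0
    if labelled.any():
        ce = -np.mean(np.log(probs[labelled, y[labelled]] + 1e-12))
    return recon + ce

x = rng.normal(size=(4, d_in))
y = np.array([1, 0, -1, -1])                   # two labelled, two unlabelled samples
x_hat, probs = forward(x)
total = loss(x, y, x_hat, probs)
```

The key design point from the abstract is visible here: the classifier consumes the encoder output directly, so the unlabelled samples still shape the shared representation through the reconstruction term while only the labelled samples contribute to the classification term.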
Competing Interests: The authors have declared that no competing interests exist.
Database: MEDLINE