Showing 1 - 10 of 30
for the search '"Thurid Vogt"'
Published in:
4th Conference on Conversational User Interfaces.
Author:
Vered Aharonson, Laurence Devillers, Laurence Vidrascu, Thurid Vogt, Noam Amir, Anton Batliner, Dino Seppi, Stefan Steidl, Björn Schuller
Published in:
Cognitive Technologies ISBN: 9783642151835
In this chapter, we focus on the automatic recognition of emotional states using acoustic and linguistic parameters as features and classifiers as tools to predict the ‘correct’ emotional states. We first sketch history and state of the art in th…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::014fa9334db09a8efd09f6eab9a6c606
https://opus.bibliothek.uni-augsburg.de/opus4/frontdoor/index/index/docId/67641
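The chapter excerpt above describes the standard pipeline of acoustic features fed into a classifier. The following is only an illustrative sketch of such a pipeline, not code from the cited chapter; the librosa/scikit-learn stack and all parameter values are assumptions.

```python
# Minimal sketch of a feature-plus-classifier pipeline for speech emotion
# recognition, as described in the abstract above. Library choices (librosa,
# scikit-learn) and all parameter values are illustrative assumptions,
# not taken from the cited chapter.
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def acoustic_features(wav_path: str) -> np.ndarray:
    """Turn one utterance into a fixed-length acoustic feature vector."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # spectral shape
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)        # pitch contour
    energy = librosa.feature.rms(y=y)                    # loudness proxy
    # Summarise frame-level contours with simple statistics (mean, std).
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             [np.nanmean(f0), np.nanstd(f0)],
             [energy.mean(), energy.std()]]
    return np.concatenate([np.ravel(p) for p in parts])

def train(paths: list[str], labels: list[str]):
    """Fit a classifier that maps acoustic feature vectors to emotion labels."""
    X = np.vstack([acoustic_features(p) for p in paths])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, labels)
    return clf
```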
Author:
Laurence Devillers, Thurid Vogt, Dino Seppi, Björn Schuller, Johannes Wagner, Loic Kessous, Noam Amir, Anton Batliner, Stefan Steidl, Vered Aharonson, Laurence Vidrascu
Published in:
Computer Speech and Language
Computer Speech and Language, Elsevier, 2010, 25 (1), pp.4. ⟨10.1016/j.csl.2009.12.003⟩
In this article, we describe and interpret a set of acoustic and linguistic features that characterise emotional/emotion-related user states - confined to the one database processed: four classes in a German corpus of children…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::3fcd70fc8f576f3d72c8c5efd0ecc7e2
https://opus.bibliothek.uni-augsburg.de/opus4/files/67000/Batliner10-WSF.pdf
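The article excerpt above mentions linguistic features alongside acoustic ones. The sketch below shows one common way word-based (linguistic) features from a transcript can feed a classifier; the TF-IDF vectoriser, the linear model, and the toy transcripts are illustrative assumptions, not the feature set of the cited article.

```python
# Sketch of linguistic (word-based) features for emotion classification,
# complementing the acoustic sketch above. TfidfVectorizer and the tiny
# toy corpus are illustrative assumptions, not details from the article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Transcripts (e.g. from an ASR system) paired with emotion-related labels.
transcripts = ["oh no not again", "yes great well done", "hmm okay fine"]
labels = ["negative", "positive", "neutral"]

# TF-IDF turns each transcript into a sparse word/bigram feature vector;
# a linear classifier then maps it to one of the emotion classes.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, labels)
print(model.predict(["well done again"]))
```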
Published in:
Kim, J, André, E, Rehm, M, Vogt, T & Wagner, J 2005, Integrating Information from Speech and Physiological Signals to Achieve Emotional Sensitivity. in Proceedings of the 9th European Conference on Speech Communication and Technology. pp. 809-812.
INTERSPEECH
Recently, there has been a significant amount of work on the recognition of emotions from speech and biosignals. Most approaches to emotion recognition so far concentrate on a single modality and do not take advantage of the fact that an integrated m…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::7943cec90f60f520406fa7d709e4efcb
https://opus.bibliothek.uni-augsburg.de/opus4/frontdoor/index/index/docId/47270
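The excerpt above contrasts single-modality systems with an integrated multimodal approach. A minimal sketch of simple feature-level fusion, where per-utterance speech and biosignal feature vectors are concatenated before classification, follows; the fusion scheme, feature shapes, and random toy data are assumptions for illustration, not the method of the cited paper.

```python
# Illustrative feature-level (early) fusion of two modalities: per-utterance
# speech features and physiological (biosignal) features are concatenated and
# fed to a single classifier. The fusion strategy, shapes, and toy data are
# assumptions, not the method of the cited INTERSPEECH paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 40                                   # number of labelled utterances (toy)
speech_feats = rng.normal(size=(n, 20))  # e.g. prosodic/spectral statistics
bio_feats = rng.normal(size=(n, 8))      # e.g. heart rate, skin conductance stats
labels = rng.integers(0, 4, size=n)      # four emotion classes (toy labels)

# Early fusion: one joint feature vector per utterance.
fused = np.hstack([speech_feats, bio_feats])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(fused, labels)
print(clf.predict(fused[:3]))
```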
Author:
Elisabeth André, Thurid Vogt
Published in:
KI - Künstliche Intelligenz. 25:213-223
Emotion recognition from speech in real-time is an upcoming research topic and the consideration of real-time constraints concerns all aspects of the recognition system. We present here a comparison of units and feature types for speech emotion recog…
Author:
Thurid Vogt, Marc Cavazza, Elisabeth André, Fred Charles, Nikolaus Bee, Johannes Wagner, David Pizzi
Published in:
ICMI-MLMI
In this paper, we investigate the user's eye gaze behavior during the conversation with an interactive storytelling application. We present an interactive eye gaze model for embodied conversational agents in order to improve the experience of users p…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::025ab48ee47281487bb9042858143062
https://opus.bibliothek.uni-augsburg.de/opus4/files/46131/46131.pdf
Published in:
VRST
The Entertainment potential of Virtual Reality is yet to be fully realised. In recent years, this potential has been described through the Holodeck™ metaphor, without however addressing the issue of content creation and gameplay. Recent progress in…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::ed6cd0851d327f079e2b6f01eafb5902
https://opus.bibliothek.uni-augsburg.de/opus4/files/50662/50662.pdf