Showing 1 - 10 of 123 results for search: '"Andreas Wendemuth"'
Published in:
Frontiers in Computer Science, Vol 4 (2022)
Objective: Acoustic addressee detection is a challenge that arises in human group interactions as well as in interactions with technical systems. The research domain is relatively new, and no structured review is available. Especially due to the recent …
External link:
https://doaj.org/article/190366317c3844b094a09b0fdae80df9
Author:
Andreas Wendemuth, Olga Egorow
Published in:
IEEE Transactions on Affective Computing. 13:175-186
Although speech overlaps are a frequently occurring phenomenon in spoken communication, they have not yet received the attention they deserve in research - in both Human-Human Interaction (HHI) and Human-Computer Interaction (HCI). It is common knowledge that …
Published in:
Cognitive Systems Research. 70:65-79
The main promise of voice assistants is their ability to correctly interpret and learn from user input, as well as to utilize this knowledge to achieve specific goals and tasks. These systems need predetermined activation actions to start …
Author:
Yamini Sinha, Jan Hintz, Matthias Busch, Tim Polzehl, Matthias Haase, Andreas Wendemuth, Ingo Siegert
Published in:
2nd Symposium on Security and Privacy in Speech Communication.
Published in:
IET Intelligent Transport Systems. 14:1265-1277
For vehicle safety, timely monitoring of the driver and assessment of his/her state is a demanding issue. Frustration can lead to aggressive driving behaviours, which play a decisive role in up to one-third of fatal road accidents. Consequently, the …
Author:
Camilla Apoy, Marc Wilbrink, Anna Anund, Daniel Teichmann, Andreas Wendemuth, Luca Zanovello, Yannis Lilis, Hamid Sanatnama, Evangelos Bekiaris, Annika Larsson, Alessia Knauss, Harald Widlroither, Svitlana Finér, Alexander Efa, Mengnuo Dai, Johan Karlsson, Frederik Diederichs, Evangelia Chrysochoou, Stas Krupenia, Emmanouil Zidianakis, Stella Nikolaou, Nikos Dimokas, Sven Bischoff, Andreas Absér, Pantelis Maroudis
Published in:
IET Intelligent Transport Systems. 14:889-899
Automated vehicles are entering the roads, and automation is applied to cars, trucks, buses, and even motorcycles today. High automation foresees transitions during driving in both directions. The driver and rider state becomes a critical parameter since …
Published in:
ICHMS
While the original aim of assistant systems is to reduce their users' workload, this is often not the result with state-of-the-art systems. One reason is that the current generation of assistant systems tends to be used as user interfaces …
Published in:
ICHMS
One of the core problems of machine learning applications, and in turn of recognizing emotions from speech, is the difficulty of deciding which measurable features contain the relevant information for emotion classification …
Published in:
INTERSPEECH
Scopus-Elsevier
Recognition and detection of non-lexical or paralinguistic cues from speech usually uses one general model per event (emotional state, level of interest). Commonly, this model is trained independently of the phonetic structure. Given sufficient data, …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e8f5fdebf88907bdcdaf23fa729511b1
https://opus.bibliothek.uni-augsburg.de/opus4/frontdoor/index/index/docId/76685
Published in:
2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP)
MLSP
As emotion recognition from speech has matured to a degree where it becomes suitable for real-life applications, it is time to develop techniques for matching different types of emotional data with multi-dimensional and category-based annotations …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::96c57986e2ce2c228ee8819395815e56
https://opus.bibliothek.uni-augsburg.de/opus4/frontdoor/index/index/docId/72193