Showing 1 - 10
of 142
for search: '"Alexandros Potamianos"'
Author:
Athanasia Zlatintsi, Petros Koutras, Georgios Evangelopoulos, Nikolaos Malandrakis, Niki Efthymiou, Katerina Pastra, Alexandros Potamianos, Petros Maragos
Published in:
EURASIP Journal on Image and Video Processing, Vol 2017, Iss 1, Pp 1-24 (2017)
Abstract: Research related to computational modeling for machine-based understanding requires ground truth data for training, content analysis, and evaluation. In this paper, we present a multimodal video database, namely COGNIMUSE, annotated with sen…
External link:
https://doaj.org/article/8227720ba6b24450ae51c338d8c28908
Published in:
Interspeech 2021.
Published in:
NAACL-HLT
In this work we explore Unsupervised Domain Adaptation (UDA) of pretrained language models for downstream tasks. We introduce UDALM, a fine-tuning procedure, using a mixed classification and Masked Language Model loss, that can adapt to the target do…
Published in:
INTERSPEECH
In this work we propose a machine learning model for depression detection from transcribed clinical interviews. Depression is a mental disorder that impacts not only the subject's mood but also the use of language. To this end we use a Hierarchical A…
Published in:
INTERSPEECH
Author:
Theodoros Giannakopoulos, Aggelina Chatziagapi, Shrikanth S. Narayanan, Alexandros Potamianos, Athanasios Katsamanis, Georgios Pantazopoulos, Spiros Dimopoulos, Dimitris Sgouropoulos
Published in:
CBMI
This paper demonstrates the utilization of Oliver (https://behavioralsignals.com/oliver/), the speech emotion recognition (SER) API created by Behavioral Signals, in the context of a movie content visualization application. The Oliver API provides an emo…
Published in:
ACL (1)
In this paper, we present a novel approach for incorporating external knowledge in Recurrent Neural Networks (RNNs). We propose the integration of lexicon features into the self-attention mechanism of RNN-based architectures. This form of conditionin…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::65b0f0e256b47ff22f7b42b2424f68b7
http://arxiv.org/abs/1906.03674
Published in:
NAACL-HLT (1)
In traditional Distributional Semantic Models (DSMs) the multiple senses of a polysemous word are conflated into a single vector space representation. In this work, we propose a DSM that learns multiple distributional representations of a word based…
Published in:
NAACL-HLT (1)
A growing number of state-of-the-art transfer learning methods employ language models pretrained on large generic corpora. In this paper we present a conceptually simple and effective transfer learning approach that addresses the problem of catastrop…
Published in:
NAACL-HLT (1)
Neural sequence-to-sequence models are currently the dominant approach in several natural language processing tasks, but require large parallel corpora. We present a sequence-to-sequence-to-sequence autoencoder (SEQ^3), consisting of two chained enc…