Attention-based Interactive Disentangling Network for Instance-level Emotional Voice Conversion
Author: | Chen, Yun, Yang, Lingxiao, Chen, Qi, Lai, Jian-Huang, Xie, Xiaohua |
---|---|
Publication year: | 2023 |
Subject: | |
Document type: | Working Paper |
DOI: | 10.21437/Interspeech.2023-39 |
Description: | Emotional Voice Conversion aims to manipulate speech according to a given emotion while preserving non-emotion components. Existing approaches cannot adequately express fine-grained emotional attributes. In this paper, we propose an Attention-based Interactive diseNtangling Network (AINN) that leverages instance-wise emotional knowledge for voice conversion. We introduce a two-stage pipeline to effectively train our network: Stage I uses inter-speech contrastive learning to model fine-grained emotion and intra-speech disentanglement learning to better separate emotion from content. In Stage II, we regularize the conversion with a multi-view consistency mechanism. This technique helps us transfer fine-grained emotion and maintain speech content. Extensive experiments show that our AINN outperforms state-of-the-art methods in both objective and subjective metrics. Comment: Accepted by INTERSPEECH 2023 |
Database: | arXiv |
External link: |
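The abstract's Stage I relies on inter-speech contrastive learning to model fine-grained emotion. As a rough illustration of the general idea, the sketch below implements a toy InfoNCE-style contrastive loss: an anchor embedding is pulled toward a same-emotion "positive" and pushed away from different-emotion "negatives". All names, vectors, and the temperature value here are illustrative assumptions; the paper's actual loss formulation and embedding architecture are not specified in this record.

```python
import math

def info_nce(anchor, positive, negatives, tau=0.1):
    """Toy InfoNCE contrastive loss over plain-Python vectors.

    anchor/positive/negatives are lists of floats (hypothetical emotion
    embeddings); tau is a temperature. Illustrative only -- not AINN's
    actual objective.
    """
    def cos(a, b):
        # cosine similarity between two vectors
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    pos = math.exp(cos(anchor, positive) / tau)
    negs = sum(math.exp(cos(anchor, n) / tau) for n in negatives)
    # loss is small when the positive is close and negatives are far
    return -math.log(pos / (pos + negs))

# Anchor nearly aligned with positive, orthogonal/opposed to negatives:
loss_good = info_nce([1.0, 0.0], [0.9, 0.1], [[-1.0, 0.0], [0.0, -1.0]])
# Swapping positive and a negative should yield a much larger loss:
loss_bad = info_nce([1.0, 0.0], [-1.0, 0.0], [[0.9, 0.1], [0.0, -1.0]])
```

In the paper's setting, such a loss would be computed between emotion embeddings of different utterances sharing (or differing in) emotion labels; the disentanglement learning in Stage I would act on complementary content embeddings within each utterance.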