A TextCNN and WGAN-gp based deep learning frame for unpaired text style transfer in multimedia services.

Author: Hu, Mingxuan; He, Min; Su, Wei; Chehri, Abdellah
Source: Multimedia Systems; Aug 2021, Vol. 27, Issue 4, p723-732, 10p
Abstract: With the rapid growth of big multimedia data, multimedia processing techniques face challenges such as knowledge understanding, semantic modeling, and feature representation. Hence, based on TextCNN and WGAN-gp (improved training of Wasserstein GANs), a deep learning framework is proposed to discriminate more effectively between the style-specific features and the style-independent content features in unpaired text style transfer for multimedia services. To rewrite a sentence in the requested style while preserving its style-independent content, the encoder-decoder framework is usually adopted. However, for lack of same-content sentence pairs with different styles for training, some prior works fail to preserve the original content and to generate the desired style properties accurately in the transferred sentences. In this paper, we adopt TextCNN to extract the style features of the transferred sentences and align them with the target style label through the generator (encoder and decoder). Meanwhile, WGAN-gp is used to preserve the content features of the original sentences. Experiments demonstrate that our framework performs considerably better than previous works on both automatic and human evaluation. Thus, it provides an effective method for unpaired text style transfer in multimedia services. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
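
The abstract outlines two loss signals: a TextCNN style classifier that pushes transferred sentences toward the target style label, and a WGAN-gp critic with a gradient penalty that encourages preservation of the original content. The sketch below is a minimal illustration of that loss composition, not the authors' implementation; all module sizes, names, and the way sentence representations are fed to the critic are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the authors' code) of a TextCNN style discriminator
# plus a WGAN-gp critic, mirroring the loss composition described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Style discriminator: 1-D convolutions over word embeddings, max-pooled."""
    def __init__(self, vocab_size=10000, emb_dim=128, num_styles=2,
                 widths=(3, 4, 5), channels=100):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.convs = nn.ModuleList(nn.Conv1d(emb_dim, channels, w) for w in widths)
        self.fc = nn.Linear(channels * len(widths), num_styles)

    def forward(self, x):                      # x: (batch, seq_len) token ids
        e = self.emb(x).transpose(1, 2)        # (batch, emb_dim, seq_len)
        h = [F.relu(c(e)).max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(h, dim=1))    # style logits

class Critic(nn.Module):
    """WGAN-gp critic over fixed-size sentence representations (content signal)."""
    def __init__(self, dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 256), nn.LeakyReLU(0.2),
                                 nn.Linear(256, 1))

    def forward(self, h):
        return self.net(h)

def gradient_penalty(critic, real, fake, lam=10.0):
    """Standard WGAN-gp penalty on interpolations between real and fake representations."""
    alpha = torch.rand(real.size(0), 1, device=real.device)
    inter = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(inter).sum(), inter, create_graph=True)[0]
    return lam * ((grads.norm(2, dim=1) - 1) ** 2).mean()

# Illustrative objective for one training step (encoder/decoder omitted):
#   style_loss  = F.cross_entropy(textcnn(transferred_ids), target_style_labels)
#   adv_loss    = -critic(transferred_repr).mean()                      # generator side
#   critic_loss = critic(fake).mean() - critic(real).mean() \
#                 + gradient_penalty(critic, real, fake)                # critic side
```

Keeping the style term (TextCNN cross-entropy) separate from the content term (WGAN-gp adversarial loss) is what lets an unpaired setup work: neither term requires a same-content sentence pair, only style labels and a distribution of original sentences.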