Towards Unsupervised Language Understanding and Generation by Joint Dual Learning
Authors: | Chao-Wei Huang, Shang-Yu Su, Yun-Nung Chen |
---|---|
Year of publication: | 2020 |
Subject: |
FOS: Computer and information sciences
Computer science, natural language processing, natural language understanding, natural language generation, semantics, supervised learning, unsupervised learning, artificial intelligence, boosting (machine learning), Computation and Language (cs.CL) |
Source: | ACL |
DOI: | 10.48550/arxiv.2004.14710 |
Description: | In modular dialogue systems, natural language understanding (NLU) and natural language generation (NLG) are two critical components: NLU extracts semantics from given texts, while NLG constructs natural language sentences from input semantic representations. However, the dual property between understanding and generation has rarely been explored. Prior work made the first attempt to utilize the duality between NLU and NLG to improve performance via a dual supervised learning framework, but it still trained both components in a supervised manner. This paper instead introduces a general learning framework that effectively exploits this duality, providing the flexibility to incorporate both supervised and unsupervised learning algorithms and to train language understanding and generation models jointly. Benchmark experiments demonstrate that the proposed approach boosts the performance of both NLU and NLG. Comment: Accepted by ACL 2020 |
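The duality described above can be illustrated with a minimal sketch: text mapped through NLU and back through NLG should reconstruct itself, and vice versa. This is only a toy illustration of the cycle structure, not the paper's method; the lookup-table "models", the example semantic frames, and the 0/1 cycle losses are hypothetical stand-ins for the neural NLU/NLG models and their training objectives.

```python
# Hedged sketch of the two dual cycles:
#   primal: text -> NLU -> semantics -> NLG -> text
#   dual:   semantics -> NLG -> text -> NLU -> semantics
# Lookup tables play the role of trained models here (assumed toy data).

SEM2TEXT = {
    "inform(food=italian)": "the restaurant serves italian food",
    "request(price)": "what is the price range",
}
TEXT2SEM = {text: sem for sem, text in SEM2TEXT.items()}

def nlu(text):
    """NLU: map a sentence to its semantic frame."""
    return TEXT2SEM.get(text)

def nlg(semantics):
    """NLG: map a semantic frame to a sentence."""
    return SEM2TEXT.get(semantics)

def primal_cycle_loss(text):
    """0.0 when text -> NLU -> NLG reconstructs the input (cycle consistency)."""
    return 0.0 if nlg(nlu(text)) == text else 1.0

def dual_cycle_loss(semantics):
    """0.0 when semantics -> NLG -> NLU reconstructs the input frame."""
    return 0.0 if nlu(nlg(semantics)) == semantics else 1.0

if __name__ == "__main__":
    for text in SEM2TEXT.values():
        print(text, "->", primal_cycle_loss(text))
    for sem in SEM2TEXT:
        print(sem, "->", dual_cycle_loss(sem))
```

In the actual framework these reconstruction signals would be differentiable losses used to train both models jointly, which is what allows unsupervised (unpaired) data to contribute.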
Database: | OpenAIRE |
External link: |