Cross-modal Language Generation using Pivot Stabilization for Web-scale Language Coverage

Authors: Radu Soricut, Ashish V. Thapliyal
Year of publication: 2020
Subject:
Source: ACL
DOI: 10.48550/arxiv.2005.00246
Description: Cross-modal language generation tasks such as image captioning are directly hurt in their ability to support non-English languages by the combination of data-hungry models and the lack of non-English annotations. We investigate potential solutions for combining existing language-generation annotations in English with translation capabilities in order to create solutions with web-scale domain and language coverage. We describe an approach called Pivot-Language Generation Stabilization (PLuGS), which directly leverages at training time both existing English annotations (gold data) and their machine-translated versions (silver data); at run time, it first generates an English caption and then a corresponding target-language caption. We show that PLuGS models outperform other candidate solutions in evaluations over five different target languages, on a large-domain test set using images from the Open Images dataset. Furthermore, we find an interesting effect where the English captions generated by the PLuGS models are better than the captions generated by the original, monolingual English model.
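The description sketches the core PLuGS recipe: combine gold English captions with machine-translated silver captions at training time, and have the model emit the English pivot before the target-language caption at run time. The following minimal Python sketch illustrates one way such training targets could be assembled; the separator token, the data layout, and the `translate` helper are illustrative assumptions and not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Hypothetical separator between the English pivot caption and the
# target-language caption; the token actually used by PLuGS is not
# specified in this abstract.
SEP = "<sep>"


@dataclass
class TrainingExample:
    image_id: str
    target: str  # English caption followed by the target-language caption


def build_plugs_examples(
    gold: List[Tuple[str, str]],        # (image_id, English caption) pairs -- gold data
    translate: Callable[[str], str],    # MT system producing silver captions
) -> List[TrainingExample]:
    """Pair each gold English caption with its machine-translated (silver)
    version, so the generator is trained to produce the English pivot first
    and the target-language caption second."""
    examples = []
    for image_id, en_caption in gold:
        silver_caption = translate(en_caption)           # silver data
        target = f"{en_caption} {SEP} {silver_caption}"  # pivot-stabilized target
        examples.append(TrainingExample(image_id, target))
    return examples


if __name__ == "__main__":
    # Toy usage with an identity "translator" standing in for a real MT system.
    gold = [("img_001", "a dog runs on the beach")]
    for ex in build_plugs_examples(gold, translate=lambda s: s):
        print(ex.image_id, "->", ex.target)
```

At inference time, the generated sequence would be split on the same separator to recover the target-language caption, with the English prefix acting as the stabilizing pivot.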
Comment: ACL 2020
Database: OpenAIRE