Improving Disfluency Detection by Self-Training a Self-Attentive Model
Author: Mark Johnson, Paria Jamshid Lou
Year of publication: 2020
Subject: FOS: Computer and information sciences; Computation and Language (cs.CL); parsing; syntax; disfluency detection; self-training; natural language processing; artificial intelligence
Source: ACL; Macquarie University
DOI: 10.18653/v1/2020.acl-main.346
Description: Self-attentive neural syntactic parsers using contextualized word embeddings (e.g. ELMo or BERT) currently produce state-of-the-art results in joint parsing and disfluency detection in speech transcripts. Since the contextualized word embeddings are pre-trained on a large amount of unlabeled data, using additional unlabeled data to train a neural model might seem redundant. However, we show that self-training, a semi-supervised technique for incorporating unlabeled data, sets a new state of the art for the self-attentive parser on disfluency detection, demonstrating that self-training provides benefits orthogonal to the pre-trained contextualized word representations. We also show that ensembling self-trained parsers provides further gains for disfluency detection.
Database: OpenAIRE
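The self-training procedure the abstract refers to follows a standard loop: train on the labeled data, predict labels for an unlabeled pool, keep the confident predictions as pseudo-labels, and retrain on the augmented set. The sketch below is a minimal, generic illustration of that loop on toy data using scikit-learn; it is not the paper's setup, which self-trains a self-attentive parser on speech transcripts. The feature vectors, the 0.95 confidence threshold, and the three rounds here are all assumptions for illustration.

```python
# Minimal self-training sketch (generic illustration, not the authors'
# exact method): train on labeled data, pseudo-label unlabeled data,
# keep confident predictions, and retrain on the augmented set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy stand-ins for labeled and unlabeled feature vectors (assumption).
X_labeled = rng.normal(size=(100, 8))
y_labeled = (X_labeled[:, 0] > 0).astype(int)  # hypothetical binary labels
X_unlabeled = rng.normal(size=(1000, 8))

X_train, y_train = X_labeled, y_labeled
for _ in range(3):  # number of self-training rounds (assumption)
    model = LogisticRegression().fit(X_train, y_train)

    # Pseudo-label the unlabeled pool and keep only confident predictions.
    probs = model.predict_proba(X_unlabeled)
    confident = probs.max(axis=1) > 0.95  # confidence threshold (assumption)
    X_pseudo = X_unlabeled[confident]
    y_pseudo = probs[confident].argmax(axis=1)

    # Retrain on the original labels plus the pseudo-labeled examples.
    X_train = np.vstack([X_labeled, X_pseudo])
    y_train = np.concatenate([y_labeled, y_pseudo])

print(f"final training set size: {len(y_train)}")
```

Filtering by confidence limits the noise introduced by pseudo-labels; the abstract's further step of ensembling several self-trained models would amount to averaging the predictions of multiple such runs.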