Grammatical Sequence Prediction for Real-Time Neural Semantic Parsing
Authors: | Konstantine Arkoudas, Christoph Teichmann, Chunyang Xiao |
---|---|
Year of publication: | 2019 |
Subject: |
FOS: Computer and information sciences
Computation and Language (cs.CL); semantic parsing; vocabulary; grammars; bottleneck; artificial intelligence; natural language processing; utterance |
DOI: | 10.48550/arxiv.1907.11049 |
Abstract: | While sequence-to-sequence (seq2seq) models achieve state-of-the-art performance in many natural language processing tasks, they can be too slow for real-time applications. One performance bottleneck is predicting the most likely next token over a large vocabulary; methods to circumvent this bottleneck are an active research topic. We focus specifically on using seq2seq models for semantic parsing, where we observe that grammars often exist which specify valid formal representations of utterance semantics. By developing a generic approach for restricting the predictions of a seq2seq model to grammatically permissible continuations, we arrive at a widely applicable technique for speeding up semantic parsing. The technique leads to a 74% speed-up on an in-house dataset with a large vocabulary, compared to the same neural model without grammatical restrictions. |
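The core idea described in the abstract — restricting next-token prediction to grammatically permissible continuations — can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the toy grammar, the `fake_logits` stand-in for the neural decoder, and all names are invented for demonstration. A real system would track full parser state rather than conditioning only on the last token.

```python
# Hypothetical sketch of grammar-constrained greedy decoding.
# Instead of taking an argmax over the full vocabulary at each step, we score
# only the tokens the grammar permits as continuations of the prefix so far,
# which is the source of the speed-up the abstract describes.

# Toy "grammar": maps the last emitted token to its permissible next tokens.
# (A real implementation would consult full parser state, e.g. an LR stack.)
GRAMMAR_NEXT = {
    "<s>": {"(call"},
    "(call": {"foo", "bar"},
    "foo": {")"},
    "bar": {")"},
    ")": {"</s>"},
}

# Full vocabulary, including tokens that are never grammatically valid here.
VOCAB = ["<s>", "(call", "foo", "bar", ")", "</s>", "baz", "qux"]

def fake_logits(prefix):
    """Stand-in for one neural decoder step: returns a score per vocab token.

    Deterministic toy scores; a real seq2seq model would run its decoder
    on `prefix` and emit a logit for every vocabulary item.
    """
    return {tok: float(len(tok)) - 0.1 * i for i, tok in enumerate(VOCAB)}

def constrained_greedy_decode(max_len=10):
    out = ["<s>"]
    while out[-1] != "</s>" and len(out) < max_len:
        # Restrict the candidate set to grammatically valid continuations.
        allowed = GRAMMAR_NEXT.get(out[-1], {"</s>"})
        scores = fake_logits(out)
        # Argmax over the small allowed set, not the whole vocabulary.
        out.append(max(allowed, key=lambda t: scores[t]))
    return out

print(constrained_greedy_decode())
```

Note how invalid tokens such as `baz` can never be emitted regardless of their model score, and each step compares only a handful of candidates instead of the full vocabulary.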
Database: | OpenAIRE |
External link: |