Contextual BERT: Conditioning the Language Model Using a Global State
Author: | Timo I. Denk, Ana Peleteiro Ramallo |
---|---|
Language: | English |
Year of publication: | 2020 |
Subject: | FOS: Computer and information sciences; Computation and Language (cs.CL); Natural language processing; Language model; Artificial intelligence; Personalization; Context (language use) |
Description: | BERT is a popular language model whose main pre-training task is to fill in the blank, i.e., to predict a word that was masked out of a sentence based on the remaining words. In some applications, however, an additional context can help the model make the right prediction, e.g., by taking the domain or the time of writing into account. This motivates us to extend the BERT architecture with a global state for conditioning on a fixed-size context. We present two novel approaches and apply them to an industry use case, in which we complete fashion outfits with missing articles, conditioned on a specific customer. An experimental comparison with other methods from the literature shows that our methods improve personalization significantly. Accepted at the TextGraphs-14 workshop at COLING'2020, the 28th International Conference on Computational Linguistics. An illustrative sketch of the global-state idea appears below this record. |
Database: | OpenAIRE |
External link: |
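
The abstract describes conditioning BERT's masked-token predictions on a fixed-size context via a global state. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, not the authors' actual architecture (the paper proposes two dedicated mechanisms described in the full text): it projects an external context vector, e.g., a customer embedding, into the model's hidden space and adds it to the token embeddings before the encoder. All names here (`ContextConditionedEmbedding`, `context_projection`) and all sizes are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class ContextConditionedEmbedding(nn.Module):
    """Hypothetical sketch: fold a fixed-size context vector into
    BERT-style token embeddings as a simple global state. An
    assumption-laden illustration, not the paper's exact method."""

    def __init__(self, vocab_size: int, hidden_size: int, context_size: int):
        super().__init__()
        self.token_embedding = nn.Embedding(vocab_size, hidden_size)
        # Map the external context (e.g., a customer embedding)
        # into the model's hidden space.
        self.context_projection = nn.Linear(context_size, hidden_size)

    def forward(self, token_ids: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len); context: (batch, context_size)
        tokens = self.token_embedding(token_ids)          # (batch, seq_len, hidden)
        global_state = self.context_projection(context)   # (batch, hidden)
        # Broadcast the projected context over every position so a
        # downstream encoder can condition its masked-token
        # ("fill in the blank") predictions on it.
        return tokens + global_state.unsqueeze(1)


# Usage sketch with made-up sizes:
embed = ContextConditionedEmbedding(vocab_size=30522, hidden_size=768, context_size=64)
token_ids = torch.randint(0, 30522, (2, 16))  # two sequences of 16 tokens
context = torch.randn(2, 64)                  # one context vector per sequence
hidden = embed(token_ids, context)            # (2, 16, 768), ready for an encoder
```

Adding the projected context to every token embedding is only one plausible way to realize a global state; attention-based conditioning, which the paper's title suggests, would instead expose the context as an extra state the self-attention layers can attend to.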