Distributed representation and one-hot representation fusion with gated network for clinical semantic textual similarity

Author: Ying Xiong, Shuai Chen, Haoming Qin, He Cao, Yedan Shen, Xiaolong Wang, Qingcai Chen, Jun Yan, Buzhou Tang
Language: English
Year of publication: 2020
Source: BMC Medical Informatics and Decision Making, Vol 20, Iss S1, Pp 1-7 (2020)
Document type: article
ISSN: 1472-6947
DOI: 10.1186/s12911-020-1045-z
Description: Abstract Background Semantic textual similarity (STS) is a fundamental natural language processing (NLP) task that can be widely used in many NLP applications such as Question Answering (QA), Information Retrieval (IR), etc. It is a typical regression problem, and almost all STS systems use either distributed representation or one-hot representation to model sentence pairs. Methods In this paper, we propose a novel framework based on a gated network to fuse distributed representation and one-hot representation of sentence pairs. Several current state-of-the-art distributed representation methods, including Convolutional Neural Networks (CNN), Bidirectional Long Short-Term Memory networks (Bi-LSTM) and Bidirectional Encoder Representations from Transformers (BERT), were used in our framework, and a system based on this framework was developed for a shared task on clinical STS organized by BioCreative/OHNLP in 2018. Results Compared with systems using only distributed representation or only one-hot representation, our method achieved a much higher Pearson correlation. Among all distributed representations, BERT performed best. The highest Pearson correlation of our system was 0.8541, which is 0.0213 higher than the best official result of the 2018 BioCreative/OHNLP clinical STS shared task (0.8328). Conclusions Distributed representation and one-hot representation are complementary to each other and can be fused by a gated network.
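The abstract describes the fusion architecture only at a high level; the following is a minimal sketch of what a gated fusion of a distributed sentence-pair encoding with a sparse one-hot feature vector could look like, feeding a regression head for the STS score. All layer names, dimensions, and the exact gating formula are assumptions for illustration, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class GatedFusionSTS(nn.Module):
    """Hypothetical gated fusion of a distributed sentence-pair encoding
    (e.g. from BERT/CNN/Bi-LSTM) with a one-hot / sparse feature vector,
    followed by a regression head that predicts a similarity score."""

    def __init__(self, dist_dim, onehot_dim, hidden_dim=256):
        super().__init__()
        # Project both views into a common hidden space (assumed design choice).
        self.proj_dist = nn.Linear(dist_dim, hidden_dim)
        self.proj_onehot = nn.Linear(onehot_dim, hidden_dim)
        # Gate computed from the concatenated views decides, per dimension,
        # how much of each representation to keep.
        self.gate = nn.Linear(dist_dim + onehot_dim, hidden_dim)
        # Regression head: STS is a regression problem (a real-valued score).
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, dist_repr, onehot_repr):
        g = torch.sigmoid(self.gate(torch.cat([dist_repr, onehot_repr], dim=-1)))
        h_d = torch.tanh(self.proj_dist(dist_repr))
        h_o = torch.tanh(self.proj_onehot(onehot_repr))
        fused = g * h_d + (1.0 - g) * h_o   # gated combination of the two views
        return self.out(fused).squeeze(-1)  # predicted similarity score

# Example usage with random features for a batch of 4 sentence pairs.
model = GatedFusionSTS(dist_dim=768, onehot_dim=5000)
dist = torch.randn(4, 768)      # e.g. a BERT pooled embedding of the pair
onehot = torch.randn(4, 5000)   # e.g. TF-IDF / n-gram overlap features
scores = model(dist, onehot)
print(scores.shape)             # torch.Size([4])
```

The gate lets the model weight, per dimension, how much of the distributed versus the one-hot view contributes to the fused representation, which is one plausible reading of the "complementary" finding in the conclusions.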
Database: Directory of Open Access Journals