Key n-Gram Extractions and Analyses of Different Registers Based on Attention Network
| Author | Haiyan Wu, Ying Liu, Shaoyun Shi, Qingfeng Wu, Yunlong Huang |
|---|---|
| Language | English |
| Year of publication | 2021 |
| Subject | |
| Source | Journal of Applied Mathematics, Vol 2021 (2021) |
| Document type | article |
| ISSN | 1110-757X; 1687-0042 |
| DOI | 10.1155/2021/5264090 |
| Description | Key n-gram extraction can be seen as extracting the n-grams that distinguish different registers. Keyword extraction models (for n = 1, a 1-gram is a keyword) are generally developed along two lines: feature extraction and model design. After summarizing the advantages and disadvantages of existing models, we propose a novel key n-gram extraction model, the "attentive n-gram network" (ANN), based on an attention mechanism and a multilayer perceptron, in which the attention mechanism scores each n-gram in a sentence by mining the internal semantic relationships between words, and these scores indicate the n-grams' importance. Experimental results on a real corpus show that the key n-grams extracted by our model distinguish the novel, news, and textbook registers very well, and the accuracy of our model is significantly higher than that of the baseline models. We also conduct clustering experiments on the key n-grams extracted from these registers, and they turn out to be well clustered. Furthermore, we perform statistical analyses of the extracted key n-grams and find that they are highly interpretable from a linguistic perspective. (A conceptual sketch of this attention-based scoring scheme follows the record.) |
| Database | Directory of Open Access Journals |
| External link | |
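
The abstract describes the ANN as an attention mechanism that scores each n-gram in a sentence, combined with a multilayer perceptron. Below is a minimal conceptual sketch of that idea in PyTorch, assuming mean-pooled n-gram embeddings, a small feed-forward attention scorer, and a register classifier head; the embedding size, pooling scheme, attention form, and classifier are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch of an "attentive n-gram network" style scorer.
# The exact architecture is not given in the abstract; the dimensions,
# n-gram pooling, and classifier below are illustrative assumptions.
import torch
import torch.nn as nn


class AttentiveNgramScorer(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 128,
                 hidden_dim: int = 64, num_registers: int = 3, n: int = 2):
        super().__init__()
        self.n = n
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Attention scorer: maps each n-gram vector to an unnormalized score.
        self.attn = nn.Sequential(nn.Linear(emb_dim, hidden_dim),
                                  nn.Tanh(),
                                  nn.Linear(hidden_dim, 1))
        # MLP over the attention-weighted sentence representation,
        # e.g. predicting the register (novel / news / textbook).
        self.mlp = nn.Sequential(nn.Linear(emb_dim, hidden_dim),
                                 nn.ReLU(),
                                 nn.Linear(hidden_dim, num_registers))

    def forward(self, token_ids: torch.Tensor):
        # token_ids: (batch, seq_len)
        emb = self.embed(token_ids)                     # (batch, seq_len, emb_dim)
        # Build n-gram representations by averaging n consecutive word vectors.
        ngrams = emb.unfold(1, self.n, 1).mean(dim=-1)  # (batch, seq_len-n+1, emb_dim)
        scores = self.attn(ngrams).squeeze(-1)          # (batch, seq_len-n+1)
        weights = torch.softmax(scores, dim=-1)         # importance of each n-gram
        sent_repr = (weights.unsqueeze(-1) * ngrams).sum(dim=1)
        logits = self.mlp(sent_repr)                    # register prediction
        return logits, weights                          # weights rank key n-grams


if __name__ == "__main__":
    model = AttentiveNgramScorer(vocab_size=1000)
    ids = torch.randint(0, 1000, (4, 12))               # toy batch of 4 sentences
    logits, ngram_weights = model(ids)
    print(logits.shape, ngram_weights.shape)            # (4, 3) and (4, 11)
```

In this sketch, the attention weights returned alongside the classifier logits are what would be aggregated over a corpus to rank candidate key n-grams for each register.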