Showing 1 - 10 of 256 results for search: '"Ren, Yafeng"'
Published in:
[J]. ACM Transactions on Information Systems, 2022, 41(2): 1-32
Aspect-based sentiment analysis (ABSA) aims at automatically inferring the specific sentiment polarities toward certain aspects of products or services behind social media texts or reviews, which has been a fundamental application to real-world …
External link:
http://arxiv.org/abs/2304.09563
Conversational semantic role labeling (CSRL) is a newly proposed task that uncovers the shallow semantic structures in a dialogue text. Unfortunately, several important characteristics of the CSRL task have been overlooked by existing works, such as …
External link:
http://arxiv.org/abs/2210.03037
Published in:
In Information Sciences February 2025 691
Published in:
In European Journal of Pharmacology 5 January 2025 986
Published in:
In Information Processing and Management September 2024 61(5)
Author:
Chen, Xiangqi, Wang, Han, Wu, Chuan, Li, Xiaoyan, Huang, Xiaojuan, Ren, Yafeng, Pu, Qiang, Cao, Zhongwei, Tang, Xiaoqiang, Ding, Bi-Sen
Published in:
In Redox Biology April 2024 70
In this paper, we propose to enhance the pair-wise aspect and opinion terms extraction (PAOTE) task by incorporating rich syntactic knowledge. We first build a syntax fusion encoder for encoding syntactic features, including a label-aware graph convolutional network …
External link:
http://arxiv.org/abs/2105.02520
A lexical chain consists of cohesive words in a document, which implies the underlying structure of a text and thus facilitates downstream NLP tasks. Nevertheless, existing work focuses on detecting simple surface lexicons with shallow syntax associations …
External link:
http://arxiv.org/abs/2009.09173
Syntax has been shown useful for various NLP tasks, while existing work mostly encodes a singleton syntactic tree using one hierarchical neural network. In this paper, we investigate a simple and effective method, Knowledge Distillation, to integrate heterogeneous …
External link:
http://arxiv.org/abs/2009.07411
We consider retrofitting a structure-aware Transformer-based language model for facilitating end tasks, proposing to exploit syntactic distance to encode both phrasal constituency and dependency connections into the language model. A middle-layer …
External link:
http://arxiv.org/abs/2009.07408