Showing 1 - 10 of 68 results for the search '"He, Yongquan"'
Author:
He, Yongquan, Huang, Xuancheng, Tang, Minghao, Meng, Lingxun, Li, Xiang, Lin, Wei, Zhang, Wenyuan, Gao, Yifu
Instruction tuning for large language models (LLMs) can drive them to produce results consistent with human goals in specific downstream tasks. However, the process of continual instruction tuning (CIT) for LLMs may bring about catastrophic forgetting …
External link:
http://arxiv.org/abs/2403.10056
Temporal knowledge graph question answering (TKGQA) is a significantly challenging task, due to the temporal constraints hidden in questions and the answers sought from dynamic structured knowledge. Although large language models (LLMs) have made cons…
External link:
http://arxiv.org/abs/2402.16568
Published in:
CIKM (2020) 505-514
Embedding entities and relations into continuous vector spaces has attracted a surge of interest in recent years. Most embedding methods assume that all test entities are available during training, which makes it time-consuming to retrain embeddings …
External link:
http://arxiv.org/abs/2402.14033
Published in:
IJCAI (2021) 1915-1921
In recent years, temporal knowledge graph (TKG) reasoning has received significant attention. Most existing methods assume that all timestamps and corresponding graphs are available during training, which makes it difficult to predict future events.
External link:
http://arxiv.org/abs/2402.12074
Named entity recognition (NER) is a fundamental task in natural language processing that aims to identify and classify named entities in text. However, span-based methods for NER typically assign entity types to text spans, resulting in an imbalanced …
External link:
http://arxiv.org/abs/2310.18349
Fine-grained entity typing (FET) is an essential task in natural language processing that aims to assign semantic types to entities in text. However, FET poses a major challenge known as the noise labeling problem, whereby current methods rely on est…
External link:
http://arxiv.org/abs/2310.14596
Author:
Wang, Zihan, Zhao, Kai, He, Yongquan, Chen, Zhumin, Ren, Pengjie, de Rijke, Maarten, Ren, Zhaochun
Recent work on knowledge graph completion (KGC) has focused on learning embeddings of entities and relations in knowledge graphs. These embedding methods require that all test entities are observed at training time, resulting in time-consuming retraining …
External link:
http://arxiv.org/abs/2305.10531
Inductive link prediction for knowledge graphs aims at predicting missing links between unseen entities, i.e., those not present during training. Most previous works learn entity-specific embeddings, which cannot handle unseen entities. Recent s…
External link:
http://arxiv.org/abs/2208.00850
Academic article
This result cannot be displayed to users who are not signed in.
You must sign in to view this result.
Academic article
This result cannot be displayed to users who are not signed in.
You must sign in to view this result.