Alleviating the Knowledge-Language Inconsistency: A Study for Deep Commonsense Knowledge
Author: | Yi Zhang, Lei Li, Yunfang Wu, Qi Su, Xu Sun |
---|---|
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences
Computer Science - Computation and Language (cs.CL); Computational Mathematics; Computer Science (miscellaneous); Acoustics and Ultrasonics; Electrical and Electronic Engineering; Theory of Computation - Mathematical Logic and Formal Languages; Computing Methodologies - Artificial Intelligence |
Source: | IEEE/ACM Transactions on Audio, Speech, and Language Processing. 30:594-604 |
ISSN: | 2329-9304; 2329-9290 |
Description: | Knowledge facts are typically represented as relational triples, yet we observe that some commonsense facts are expressed by triples whose form is inconsistent with the way language expresses them. This inconsistency poses a challenge for pre-trained language models when handling such commonsense knowledge facts. In this paper, we term such knowledge deep commonsense knowledge and conduct extensive exploratory experiments on it. We show that deep commonsense knowledge constitutes a significant part of commonsense knowledge, while conventional methods fail to capture it effectively. We further propose a novel method to mine the deep commonsense knowledge distributed in sentences, alleviating the reliance of conventional methods on the triple representation of knowledge. Experiments demonstrate that the proposed method significantly improves performance in mining deep commonsense knowledge. |
Database: | OpenAIRE |
External link: |