A complex network approach to analyse pre-trained language models for ancient Chinese

Authors: Jianyu Zheng, Xin'ge Xiao
Language: English
Publication year: 2024
Subject:
Source: Royal Society Open Science, Vol 11, Iss 5 (2024)
Document type: article
ISSN: 2054-5703
DOI: 10.1098/rsos.240061
Description: Ancient Chinese is a splendid treasure within Chinese culture. To facilitate its compilation, pre-trained language models for ancient Chinese have been developed, and researchers are actively exploring the factors contributing to their success. However, previous work has not studied how language models organize the elements of ancient Chinese from a holistic perspective. Hence, we adopt complex networks to explore how language models organize the elements of the ancient Chinese system. Specifically, we first analyse the characters’ and words’ co-occurrence networks in ancient Chinese. Then, we study characters’ and words’ attention networks, generated by attention heads within SikuBERT, from two aspects: static and dynamic network analysis. In the static network analysis, we find that (i) most attention networks exhibit small-world properties and scale-free behaviour, (ii) over 80% of attention networks exhibit high similarity to the corresponding co-occurrence networks, (iii) there is a noticeable gap between characters’ and words’ attention networks across layers, while their fluctuations remain relatively consistent, and (iv) the attention networks generated by SikuBERT tend to be sparser than those from Chinese BERT. In the dynamic network analysis, we find that the sentence segmentation task does not significantly affect network metrics, while the part-of-speech tagging task makes attention networks sparser.
Database: Directory of Open Access Journals
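
The abstract describes two kinds of networks: co-occurrence networks built directly from ancient Chinese text, and attention networks derived from the attention heads of SikuBERT. Below is a minimal sketch (not the authors' code) of how such networks might be constructed and inspected with networkx and HuggingFace Transformers. The model ID "SIKU-BERT/sikubert", the co-occurrence window, and the attention threshold are illustrative assumptions; the paper's exact construction may differ.

```python
# Sketch only: builds (1) a character co-occurrence network from raw text and
# (2) an attention network from one head of a pre-trained BERT-style model.
# The model ID and thresholds below are assumptions for illustration.
import itertools

import networkx as nx
import torch
from transformers import AutoModel, AutoTokenizer


def cooccurrence_network(sentences, window=2):
    """Link two characters whenever they co-occur within `window` positions."""
    g = nx.Graph()
    for sent in sentences:
        chars = list(sent)
        for i, c in enumerate(chars):
            for j in range(i + 1, min(i + window + 1, len(chars))):
                g.add_edge(c, chars[j])
    return g


def attention_network(model, tokenizer, sentence, layer=0, head=0, threshold=0.05):
    """Link two tokens when the chosen head's attention weight exceeds a threshold."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs, output_attentions=True)
    attn = outputs.attentions[layer][0, head]  # shape: (seq_len, seq_len)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    g = nx.Graph()
    for i, j in itertools.product(range(len(tokens)), repeat=2):
        if i != j and attn[i, j].item() > threshold:
            g.add_edge(tokens[i], tokens[j])
    return g


if __name__ == "__main__":
    sentences = ["学而时习之", "温故而知新"]  # toy ancient-Chinese snippets
    cooc = cooccurrence_network(sentences)
    print("co-occurrence:", cooc.number_of_nodes(), "nodes,", cooc.number_of_edges(), "edges")

    tok = AutoTokenizer.from_pretrained("SIKU-BERT/sikubert")  # assumed model ID
    mdl = AutoModel.from_pretrained("SIKU-BERT/sikubert")
    att = attention_network(mdl, tok, sentences[0])
    # Basic indicators often used when checking small-world / scale-free behaviour
    print("attention net clustering:", nx.average_clustering(att))
    print("degree sequence:", sorted(d for _, d in att.degree()))
```

In this reading, comparing an attention network against the corresponding co-occurrence network (e.g. by edge overlap) and repeating the extraction per layer and head would yield the kind of layer-wise, head-wise statistics the abstract reports.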