Meta In-Context Learning: Harnessing Large Language Models for Electrical Data Classification

Author: Mi Zhou, Fusheng Li, Fan Zhang, Junhao Zheng, Qianli Ma
Language: English
Year of publication: 2023
Subject:
Source: Energies, Vol 16, Iss 18, p 6679 (2023)
Document type: article
ISSN: 1996-1073
DOI: 10.3390/en16186679
Description: The evolution of communication technology has driven the demand for intelligent power grids and data analysis in power systems. However, obtaining and annotating electrical data from intelligent terminals is time-consuming and challenging. We propose Meta In-Context Learning (M-ICL), a new approach that harnesses large language models to classify time-series electrical data and largely alleviates the need for annotated data when adapting to new tasks. The proposed M-ICL consists of two stages: meta-training and meta-testing. In meta-training, the model is trained on various tasks that have an adequate amount of training data; this stage aims to learn the mapping between electrical data and the embedding space of large language models. In the meta-testing stage, the trained model makes predictions on new tasks. By exploiting the in-context learning ability of large language models, M-ICL adapts to new tasks effectively with only a few annotated instances (e.g., 1–5 training instances per class). Our contributions lie in the new application of large language models to electrical data classification and the introduction of M-ICL, which improves classification performance through the strong in-context learning ability of large language models. Furthermore, we conduct extensive experiments on 13 real-world datasets, and the results show that the proposed M-ICL improves the average accuracy over all datasets by 19.06%, 12.06%, and 6.63% when only one, two, and five training instances per class are available, respectively. In summary, M-ICL offers a promising solution to the challenges of electrical data classification.
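As a rough illustration of the two-stage workflow the abstract describes, the Python sketch below mimics M-ICL at meta-test time: a learned mapping projects raw electrical time series into an embedding space, and a handful of labeled support examples (one per class) serve as the context for classifying a query. The encoder here is a fixed random projection and the language model's in-context inference is replaced by a nearest-centroid stand-in; all names, parameters, and data are assumptions for illustration, not the paper's actual implementation.

    # Minimal, illustrative sketch of the M-ICL meta-testing step described in the
    # abstract. The projection matrix stands in for the meta-trained encoder, and
    # nearest-centroid matching stands in for the LLM's in-context classification.
    import numpy as np

    rng = np.random.default_rng(0)

    EMBED_DIM = 16    # assumed dimensionality of the LLM embedding space
    SERIES_LEN = 32   # assumed length of each electrical time series

    # Assumed stand-in for the mapping learned during meta-training:
    # a fixed random linear projection from the raw series to the embedding space.
    W = rng.normal(scale=0.1, size=(SERIES_LEN, EMBED_DIM))

    def project_to_embedding(series: np.ndarray) -> np.ndarray:
        """Map a (SERIES_LEN,) electrical time series into the embedding space."""
        return series @ W

    def in_context_predict(support_x, support_y, query_x):
        """Stand-in for in-context classification: the few labeled support examples
        act as the 'prompt', and the query is assigned to the class whose support
        embeddings are closest on average (nearest centroid)."""
        classes = sorted(set(support_y))
        centroids = {
            c: np.mean(
                [project_to_embedding(x) for x, y in zip(support_x, support_y) if y == c],
                axis=0,
            )
            for c in classes
        }
        q = project_to_embedding(query_x)
        return min(classes, key=lambda c: np.linalg.norm(q - centroids[c]))

    # Meta-testing on a new task with only one annotated instance per class.
    support_x = [rng.normal(size=SERIES_LEN) for _ in range(2)]
    support_y = [0, 1]
    query_x = support_x[0] + 0.01 * rng.normal(size=SERIES_LEN)
    print("predicted class:", in_context_predict(support_x, support_y, query_x))

This toy setup only conveys the structure of few-shot adaptation (support set as context, query classified in the shared embedding space); the reported accuracy gains come from the paper's meta-trained encoder and large language model, not from this simplified stand-in.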
Database: Directory of Open Access Journals