SCC-GPT: Source Code Classification Based on Generative Pre-Trained Transformers

Authors: Mohammad D. Alahmadi, Moayad Alshangiti, Jumana Alsubhi
Language: English
Year of publication: 2024
Source: Mathematics, Vol 12, Iss 13, p 2128 (2024)
Document type: article
ISSN: 2227-7390
DOI: 10.3390/math12132128
Description: Developers often rely on online resources, such as Stack Overflow (SO), to seek assistance with programming tasks. To facilitate effective search and resource discovery, questions and posts must be manually tagged with the appropriate programming language. However, accurate tagging is not consistently achieved, creating a need to automatically classify code snippets into the correct programming language as a tag. In this study, we introduce a novel approach to the automated classification of code snippets from SO posts into programming languages using generative pre-trained transformers (GPT). Our method, which requires neither additional training on labeled data nor dependence on pre-existing labels, classifies 224,107 code snippets into 19 programming languages. We employ the GPT-3.5 text-davinci-003 model and post-process its responses to accurately identify the programming language. Our empirical evaluation demonstrates that our GPT-based model (SCC-GPT) significantly outperforms existing methods, achieving median F1-score improvements ranging from +6% to +31%. These findings underscore the effectiveness of SCC-GPT in enhancing code snippet classification, offering a cost-effective and efficient solution for developers who rely on SO for programming assistance.
Database: Directory of Open Access Journals
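
For illustration, the pipeline described in the abstract (prompt a GPT model with a code snippet, then post-process its free-text answer into a language label) can be sketched in Python as follows. This is a minimal sketch under stated assumptions, not the authors' implementation: the prompt wording, the label set, and the use of the OpenAI Completions client are illustrative, and text-davinci-003 has since been retired, so a current run would need to substitute a newer model.

    # Illustrative sketch of GPT-based code snippet classification.
    # Assumes the OpenAI Completions API and the text-davinci-003 model named in the paper.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical label set: the paper classifies into 19 languages,
    # but their exact list is not reproduced here.
    LANGUAGES = {
        "python", "java", "javascript", "typescript", "c", "c++", "c#", "php",
        "ruby", "go", "swift", "kotlin", "r", "scala", "perl", "haskell",
        "objective-c", "bash", "sql",
    }

    def classify_snippet(snippet: str) -> str:
        """Ask the model for the snippet's language, then normalize the answer."""
        prompt = (
            "Identify the programming language of the following code snippet. "
            "Answer with the language name only.\n\n"
            f"{snippet}\n\nLanguage:"
        )
        resp = client.completions.create(
            model="text-davinci-003",
            prompt=prompt,
            max_tokens=5,
            temperature=0,  # deterministic output simplifies post-processing
        )
        # Post-process: trim, lowercase, and map the free-text answer onto the label set.
        raw = resp.choices[0].text.strip().lower().rstrip(".")
        return raw if raw in LANGUAGES else "unknown"

    print(classify_snippet('printf("hello, world\\n");'))  # expected: "c"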