Sentiment Analysis for Software Engineering: How Far Can Pre-trained Transformer Models Go?
Authors: | Stefanus Agus Haryono, Ting Zhang, David Lo, Ferdian Thung, Lingxiao Jiang, Bowen Xu |
---|---|
Year: | 2020 |
Subject: |
Code review, Sentiment analysis, Software engineering, Stack Overflow, Software mining, Natural language processing |
Source: | ICSME |
Description: | Extensive research has been conducted on sentiment analysis for software engineering (SA4SE). Researchers have invested much effort in developing customized tools (e.g., SentiStrength-SE, SentiCR) to classify sentiment polarity for Software Engineering (SE)-specific content (e.g., discussions on Stack Overflow and code review comments). Even so, there is still much room for improvement. Recently, pre-trained Transformer-based models (e.g., BERT, XLNet) have achieved considerable breakthroughs in natural language processing (NLP). In this work, we conducted a systematic evaluation of five existing SA4SE tools and variants of four state-of-the-art pre-trained Transformer-based models on six SE datasets. Our work is the first to fine-tune pre-trained Transformer-based models for the SA4SE task. Empirically, across all six datasets, our fine-tuned pre-trained Transformer-based models outperform the existing SA4SE tools by 6.5-35.6% in terms of macro/micro-averaged F1 scores. |
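The abstract reports improvements in macro- and micro-averaged F1, two averages that behave quite differently on the imbalanced class distributions typical of SE sentiment datasets. As a minimal illustration (not code from the paper; labels and data are invented for the example), here is how the two averages are computed for a three-class sentiment task:

```python
def f1_scores(y_true, y_pred, labels):
    """Macro- and micro-averaged F1 for single-label multiclass classification."""
    per_class_f1 = []
    tp_sum = fp_sum = fn_sum = 0
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        per_class_f1.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
        tp_sum, fp_sum, fn_sum = tp_sum + tp, fp_sum + fp, fn_sum + fn
    macro = sum(per_class_f1) / len(labels)              # every class weighted equally
    micro_p = tp_sum / (tp_sum + fp_sum)                 # pooled counts across classes
    micro_r = tp_sum / (tp_sum + fn_sum)
    micro = 2 * micro_p * micro_r / (micro_p + micro_r)
    return macro, micro

# A classifier that predicts "pos" for everything scores reasonably on micro F1
# (dominated by the majority class) but poorly on macro F1 (minority classes
# contribute F1 = 0).
y_true = ["pos", "pos", "pos", "neg", "neu"]
y_pred = ["pos", "pos", "pos", "pos", "pos"]
macro, micro = f1_scores(y_true, y_pred, ["neg", "neu", "pos"])
print(round(macro, 4), round(micro, 4))  # 0.25 0.6
```

Reporting both averages, as the paper does, guards against a model that looks strong only because it handles the majority sentiment class well.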
Database: | OpenAIRE |
External link: |