Does the Order of Fine-tuning Matter and Why?

Author: Chen, Qihong, Li, Jiawei, Suh, Hyunjae, Jiang, Lianghao, Zhou, Zheng, Chen, Jingze, Gesi, Jiri, Ahmed, Iftekhar
Year of publication: 2024
Document type: Working Paper
Description: To improve performance on a target task, researchers have fine-tuned language models on an intermediate task before the target task of interest. However, previous work has focused on pre-trained language models and downstream tasks in Natural Language Processing (NLP) and has considered only a single intermediate task. The effect of fine-tuning on multiple intermediate tasks, and of their ordering, on target task performance has not been fully explored in Software Engineering. In this study, we perform the first empirical study analyzing the impact of task ordering on target task performance. Experimental results show that task ordering does affect target task performance, yielding up to a 6% performance gain and up to a 4% performance loss. To explain this impact, we consider a variety of potential factors, including characteristics of the dataset (syntactic and semantic similarity analysis, dataset size), the model (probing task and attention analysis), and the task (task affinity analysis). Our study provides Software Engineering researchers and practitioners with insights into the effect of task orderings and how to select an ordering that is cost-effective while achieving the best performance gain.
Database: arXiv
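
The procedure the abstract describes, fine-tuning a model on each intermediate task in sequence before the target task, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual pipeline: the model architecture, toy datasets, and hyperparameters are hypothetical stand-ins, and the outer loop simply enumerates intermediate-task orderings as the study's research question suggests.

```python
# Minimal sketch of sequential intermediate-task fine-tuning in PyTorch.
# Hypothetical model/datasets; the paper does not prescribe this tooling.
import itertools
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def fine_tune(model, dataset, epochs=1, lr=1e-4):
    """Fine-tune `model` on one task's dataset, in place."""
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return model

def toy_task(seed):
    """Toy binary-classification stand-in for a real task dataset."""
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(64, 16, generator=g)
    y = torch.randint(0, 2, (64,), generator=g)
    return TensorDataset(x, y)

intermediate = {"A": toy_task(0), "B": toy_task(1)}
target = toy_task(2)

# Try every ordering of the intermediate tasks before the target task,
# starting from a fresh model each time, mirroring the study's question
# of whether the ordering itself changes target-task performance.
for order in itertools.permutations(intermediate):
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
    for name in order:          # intermediate tasks, in this ordering
        fine_tune(model, intermediate[name])
    fine_tune(model, target)    # finally, fine-tune on the target task
    print("ordering:", " -> ".join(order), "-> target")
```

With k intermediate tasks there are k! orderings to compare, which is why the abstract emphasizes selecting a cost-effective ordering rather than exhaustively evaluating all of them.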