Analytical Uncertainty-Based Loss Weighting in Multi-Task Learning

Authors: Kirchdorfer, Lukas, Elich, Cathrin, Kutsche, Simon, Stuckenschmidt, Heiner, Schott, Lukas, Köhler, Jan M.
Year: 2024
Subject:
Document type: Working Paper
Abstract: With the rise of neural networks in various domains, multi-task learning (MTL) has gained significant relevance. A key challenge in MTL is balancing individual task losses during neural network training so that knowledge sharing across tasks improves performance and efficiency. To address this challenge, we propose a novel task-weighting method that builds on the most prevalent approach, Uncertainty Weighting, and computes analytically optimal uncertainty-based weights, normalized by a softmax function with tunable temperature. Our approach yields results comparable to the combinatorially prohibitive, brute-force approach of Scalarization while offering a more cost-effective yet high-performing alternative. We conduct an extensive benchmark on various datasets and architectures, and our method consistently outperforms six other common weighting methods. Furthermore, we report noteworthy experimental findings for the practical application of MTL. For example, larger networks diminish the influence of weighting methods, and tuning the weight decay has a low impact compared to tuning the learning rate.
Database: arXiv
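The abstract's core idea can be illustrated with a minimal sketch. In standard Uncertainty Weighting (Kendall et al.), the objective for task losses L_i is sum_i L_i / (2 sigma_i^2) + log sigma_i; setting the derivative with respect to sigma_i^2 to zero gives the analytical optimum sigma_i^2 = L_i, i.e. raw weights proportional to 1/L_i. The sketch below then normalizes these raw weights with a temperature-scaled softmax, as the abstract describes. Note that the function name, the exact softmax argument (1/L_i divided by the temperature), and all other details are assumptions for illustration, not the authors' reference implementation.

```python
import numpy as np

def analytical_uncertainty_weights(losses, temperature=1.0):
    """Sketch of analytically optimal uncertainty-based task weights.

    For the uncertainty-weighting objective
        sum_i L_i / (2 * sigma_i^2) + log(sigma_i),
    the optimum in sigma_i^2 is sigma_i^2 = L_i, so the raw weight
    of task i is proportional to 1 / L_i. The raw weights are then
    normalized by a softmax with tunable temperature (assumed form).
    """
    losses = np.asarray(losses, dtype=float)
    raw = 1.0 / losses                    # analytical optimum: w_i ∝ 1/L_i
    logits = raw / temperature            # temperature scaling (assumption)
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    return exp / exp.sum()

# Usage: a task with a smaller current loss receives a larger weight,
# and a higher temperature flattens the weight distribution.
w = analytical_uncertainty_weights([1.0, 2.0], temperature=1.0)
```

A higher temperature pushes the weights toward a uniform distribution (recovering plain loss averaging), while a lower temperature concentrates weight on the task with the smallest loss, which is the tunable trade-off the abstract refers to.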