Author: Maziarka, Łukasz; Nowak, Aleksandra; Wołczyk, Maciej; Bedychaj, Andrzej
Language: English
Year of publication: 2021
Subject:
Description: One of the main arguments for studying disentangled representations is the assumption that they can be easily reused across different tasks. At the same time, finding a joint, adaptable representation of the data is one of the key challenges in the multi-task learning setting. In this paper, we take a closer look at the relationship between disentanglement and multi-task learning based on hard parameter sharing. We perform a thorough empirical study of the representations obtained by neural networks trained on automatically generated supervised tasks. Using a set of standard metrics, we show that disentanglement appears naturally during multi-task neural network training.
Database: OpenAIRE
External link:
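The hard parameter sharing setup the description refers to can be summarized as a single shared encoder trunk whose output representation feeds several task-specific heads, trained jointly on the sum of per-task losses. Below is a minimal sketch assuming a PyTorch-style setup; the class name `HardSharingMTL`, the layer sizes, and the regression losses are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Multi-task network with hard parameter sharing: one shared
    encoder (trunk) feeds several small task-specific heads.
    (Illustrative sketch, not the paper's exact architecture.)"""

    def __init__(self, input_dim: int, latent_dim: int, num_tasks: int):
        super().__init__()
        # Shared encoder: its output is the joint representation on
        # which disentanglement metrics would be computed.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256),
            nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        # One linear head per supervised task (scalar regression here).
        self.heads = nn.ModuleList(
            [nn.Linear(latent_dim, 1) for _ in range(num_tasks)]
        )

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        z = self.encoder(x)  # shared representation, reused by all heads
        return [head(z) for head in self.heads]


# Training minimizes the sum of per-task losses, so gradients from
# every task flow into the same shared encoder weights.
model = HardSharingMTL(input_dim=64, latent_dim=10, num_tasks=4)
x = torch.randn(32, 64)
targets = [torch.randn(32, 1) for _ in range(4)]
loss = sum(nn.functional.mse_loss(pred, t)
           for pred, t in zip(model(x), targets))
loss.backward()
```

The "standard metrics" mentioned in the description would then be evaluated on the shared representation `z`; which specific metrics the paper uses is not stated in this record.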