SNR: Sub-Network Routing for Flexible Parameter Sharing in Multi-Task Learning
Authors: | Ang Li, Jilin Chen, Zhe Zhao, Jiaqi Ma, Lichan Hong, Ed H. Chi |
---|---|
Year of publication: | 2019 |
Subject: | Computer engineering, Artificial neural network, Computer science, Computation, Multi-task learning, Latent variable, Routing (electronic design automation), Object detection |
Source: | AAAI |
ISSN: | 2374-3468 2159-5399 |
DOI: | 10.1609/aaai.v33i01.3301216 |
Description: | Machine learning applications, such as object detection and content recommendation, often require training a single model to predict multiple targets at the same time. Multi-task learning through neural networks has become popular recently, because it not only helps improve the accuracy of many prediction tasks when they are related, but also saves computation cost by sharing model architectures and low-level representations. The latter is critical for real-time large-scale machine learning systems. However, classic multi-task neural networks may degrade significantly in accuracy when tasks are less related. Previous works (Misra et al. 2016; Yang and Hospedales 2016; Ma et al. 2018) showed that having more flexible architectures in multi-task models, either manually tuned or soft-parameter-sharing structures like gating networks, helps improve the prediction accuracy. However, manual tuning is not scalable, and the previous soft-parameter-sharing models are either not flexible enough or computationally expensive. In this work, we propose a novel framework called Sub-Network Routing (SNR) to achieve more flexible parameter sharing while maintaining the computational advantage of the classic multi-task neural-network model. SNR modularizes the shared low-level hidden layers into multiple layers of sub-networks, and controls the connections between sub-networks with learnable latent variables to achieve flexible parameter sharing. We demonstrate the effectiveness of our approach on the large-scale YouTube-8M dataset, showing that the proposed method improves the accuracy of multi-task models while maintaining their computational efficiency. |
Database: | OpenAIRE |
External link: |
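The routing idea in the abstract — layers of sub-networks whose pairwise connections are controlled by learnable latent variables — can be illustrated with a minimal sketch. All names here (`snr_layer`, `gates`) are hypothetical and not from the paper's code; the fixed binary `gates` matrix below stands in for the latent connection variables that the authors learn during training.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_layer(inputs, weights, gates):
    """One SNR-style routed layer (illustrative sketch).

    Each output sub-network j receives the gated sum over input
    sub-networks i of gates[i, j] * weights[i][j] @ inputs[i].
    A gate of 0 disconnects sub-network i from sub-network j,
    yielding a flexible, per-connection sharing pattern.
    """
    n_in, n_out = gates.shape
    outputs = []
    for j in range(n_out):
        h = sum(gates[i, j] * weights[i][j] @ inputs[i] for i in range(n_in))
        outputs.append(np.maximum(h, 0.0))  # ReLU nonlinearity
    return outputs

# Toy setup: 2 input sub-networks (dim 4) feeding 2 output sub-networks (dim 3).
inputs = [rng.standard_normal(4) for _ in range(2)]
weights = [[rng.standard_normal((3, 4)) for _ in range(2)] for _ in range(2)]

# A fixed routing pattern: sub-network 0 feeds only output 0;
# sub-network 1 feeds both outputs. In SNR these values are learned.
gates = np.array([[1.0, 0.0],
                  [1.0, 1.0]])

outs = snr_layer(inputs, weights, gates)
```

Stacking several such layers, with one task-specific head per task on top, recovers the overall multi-task architecture the abstract describes; learning the gates (rather than fixing them) is what distinguishes SNR from a hand-designed sharing structure.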