Faster Meta Update Strategy for Noise-Robust Deep Learning
Author: | Lu Jiang, Linchao Zhu, Yi Yang, Youjiang Xu |
Year of publication: | 2021 |
Subject: |
FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computer Vision and Pattern Recognition (cs.CV); Meta-learning; Deep learning; Machine learning; Overfitting; Metamodeling; Robustness (computer science); Artificial intelligence |
Source: | CVPR |
Description: | It has been shown that deep neural networks are prone to overfitting biased training data. To address this issue, meta-learning employs a meta model to correct the training bias. Despite promising performance, extremely slow training is currently the bottleneck of meta-learning approaches. In this paper, we introduce a novel Faster Meta Update Strategy (FaMUS) that replaces the most expensive step in the meta-gradient computation with a faster layer-wise approximation. We empirically find that FaMUS yields not only a reasonably accurate but also a low-variance approximation of the meta gradient. We conduct extensive experiments to verify the proposed method on two tasks, and show that it saves two-thirds of the training time while maintaining comparable, or achieving even better, generalization performance. In particular, our method achieves state-of-the-art performance on both synthetic and realistic noisy labels, and obtains promising results on long-tailed recognition on standard benchmarks. Comment: Accepted to CVPR 2021 |
Database: | OpenAIRE |
External link: |
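
The description above says that the expensive step of meta training is the meta-gradient computation, and that FaMUS replaces it with a layer-wise approximation. The sketch below is not the paper's algorithm; it is a minimal illustration, assuming a learning-to-reweight setup in which per-example weights are the meta-parameters and the model takes one inner SGD step on the weighted loss. It contrasts the exact meta-gradient, which differentiates the clean-set loss through the inner update of every layer, with a cheap variant that keeps the second-order path through only the last layer. All names (`forward_with`, `fast_cheap`, the choice of which layer to keep) are illustrative assumptions, not from the paper.

```python
# Illustrative sketch only (not the authors' FaMUS code): shows why the
# meta-gradient is expensive and how restricting it to a few layers is cheaper.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy two-layer network; we use its parameters through a functional forward
# pass so the inner SGD step stays inside the autograd graph.
net = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
params = list(net.parameters())            # [W1, b1, W2, b2]

x_noisy, y_noisy = torch.randn(32, 20), torch.randint(0, 2, (32,))
x_clean, y_clean = torch.randn(16, 20), torch.randint(0, 2, (16,))
w = torch.zeros(32, requires_grad=True)    # meta-parameters: per-example weights
lr = 0.1

def forward_with(p, x):
    return F.linear(F.relu(F.linear(x, p[0], p[1])), p[2], p[3])

# Inner step on the weighted (possibly noisy) loss; create_graph=True retains
# second-order information and is the expensive part of meta training.
per_example = F.cross_entropy(forward_with(params, x_noisy), y_noisy,
                              reduction="none")
inner_loss = (torch.sigmoid(w) * per_example).mean()
grads = torch.autograd.grad(inner_loss, params, create_graph=True)
fast = [p - lr * g for p, g in zip(params, grads)]

# Exact meta-gradient: differentiate the clean-set loss through every layer
# of the inner update.
meta_loss = F.cross_entropy(forward_with(fast, x_clean), y_clean)
exact = torch.autograd.grad(meta_loss, w, retain_graph=True)[0]

# Layer-wise restriction (an assumption for illustration): keep the
# second-order path through the last layer only and treat earlier fast
# weights as constants, so the backward pass to w touches fewer parameters.
fast_cheap = [p.detach() for p in fast[:-2]] + fast[-2:]
cheap_loss = F.cross_entropy(forward_with(fast_cheap, x_clean), y_clean)
approx = torch.autograd.grad(cheap_loss, w)[0]

print("cosine similarity:", F.cosine_similarity(exact, approx, dim=0).item())
```

Hard-coding the last layer is only a stand-in for the paper's learned layer-wise aggregation; the point it conveys is that skipping layers in the backward-through-backward pass is where the reported training-time savings come from.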