Author:
Timofeev, Aleksandr; Afonin, Andrei; Liu, Yehao
Year of publication:
2021
Subject:
Document type:
Working Paper
Description:
In this work, we propose a meta-learner based on ODE neural networks that learns gradients. This approach makes the optimizer more flexible, inducing an automatic inductive bias for the given task. Using the simplest Hamiltonian Neural Network, we demonstrate that our method outperforms an LSTM-based meta-learner on an artificial task and on the MNIST dataset with ReLU activations in the optimizee. Furthermore, it also surpasses classic optimization methods on the artificial task and achieves comparable results on MNIST.
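As a rough illustration of the idea summarized in the description (not the authors' implementation), the sketch below shows a PyTorch learned optimizer whose update rule follows the dynamics of a small Hamiltonian Neural Network: the optimizee's parameters and gradients play the role of the canonical coordinates (q, p), and one Euler step of the learned dynamics yields the parameter update. The class name HamiltonianMetaOptimizer, the quadratic toy task, and all hyperparameters are assumptions made for the example.

# Minimal sketch, assuming a Hamiltonian-NN update rule as described above;
# names and hyperparameters are illustrative, not taken from the paper.
import torch
import torch.nn as nn


class HamiltonianMetaOptimizer(nn.Module):
    """Learns a scalar H(q, p); updates follow dq/dt = dH/dp, dp/dt = -dH/dq."""

    def __init__(self, hidden_dim: int = 32):
        super().__init__()
        self.hamiltonian = nn.Sequential(
            nn.Linear(2, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, q, p, dt: float = 0.1):
        # q: flattened optimizee parameters, p: their gradients, both of shape (n, 1).
        coords = torch.cat([q, p], dim=-1)
        if not coords.requires_grad:
            coords.requires_grad_(True)
        H = self.hamiltonian(coords).sum()
        dH = torch.autograd.grad(H, coords, create_graph=True)[0]
        dH_dq, dH_dp = dH[..., :1], dH[..., 1:]
        # One explicit Euler step of the learned Hamiltonian vector field.
        return q + dt * dH_dp, p - dt * dH_dq


if __name__ == "__main__":
    # Toy meta-training on a quadratic "artificial task", unrolled for 20 inner steps.
    meta_opt = HamiltonianMetaOptimizer()
    meta_trainer = torch.optim.Adam(meta_opt.parameters(), lr=1e-3)
    for _ in range(200):
        theta = torch.randn(10, 1)             # optimizee parameters
        meta_loss = torch.zeros(())
        for _ in range(20):
            loss = (theta ** 2).sum()          # quadratic objective
            grad = 2 * theta                   # its analytic gradient
            theta, _ = meta_opt(theta, grad)   # learned update step
            meta_loss = meta_loss + loss
        meta_trainer.zero_grad()
        meta_loss.backward()                   # backprop through the unrolled steps
        meta_trainer.step()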
Database:
arXiv
External link: