Slightly-slacked dropout for improving neural network learning on FPGA
Author: | Sota Sawaguchi, Hiroaki Nishi |
---|---|
Year of publication: | 2018 |
Subject: |
Artificial neural network; Neural network learning; Dropout (neural networks); Overfitting; Hardware acceleration; Field-programmable gate array; Computer engineering; Electrical & electronic engineering; Artificial Intelligence; Hardware and Architecture; Computer Networks and Communications; Software; Information Systems |
Source: | ICT Express. 4:75-80 |
ISSN: | 2405-9595 |
Description: | Neural Network Learning (NNL) is compute-intensive. It often involves a dropout technique, which effectively regularizes the network to avoid overfitting. Accordingly, a hardware accelerator for dropout NNL has been proposed; however, the existing method incurs a large data-transfer cost between hardware and software. This paper proposes Slightly-Slacked Dropout (SS-Dropout), a novel deterministic dropout technique that addresses the transfer cost while accelerating the process. Experimental results show that SS-Dropout improves both the standard and the dropout NNL accelerator, yielding a 1.55× speed-up and a transfer cost lower by three orders of magnitude, respectively. |
Database: | OpenAIRE |
External link: |
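For illustration only: the abstract does not specify SS-Dropout's actual algorithm, but the contrast it draws can be sketched in NumPy. Below, `dropout` is standard stochastic (inverted) dropout, whose random mask would have to be transferred between software and a hardware accelerator, while `deterministic_dropout` is a hypothetical deterministic variant whose mask both sides can regenerate from a step counter alone, so no mask transfer is needed. Function names and the stride-based mask rule are assumptions for this sketch, not the paper's method.

```python
import numpy as np

def dropout(x, p, rng):
    # Standard (stochastic) inverted dropout: zero each unit with
    # probability p and scale survivors by 1/(1-p).
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

def deterministic_dropout(x, p, step):
    # Hypothetical deterministic variant (NOT the paper's SS-Dropout):
    # drop a fixed, step-dependent block of units, so host and
    # accelerator can rebuild the same mask without transferring it.
    n = x.size
    k = max(1, int(round(n * p)))        # number of units to drop
    idx = (step * k + np.arange(k)) % n  # deterministic drop indices
    mask = np.ones(n, dtype=x.dtype)
    mask[idx] = 0.0
    return (x.ravel() * mask / (1.0 - p)).reshape(x.shape)

# Usage sketch on a dummy activation tensor of ones.
rng = np.random.default_rng(0)
x = np.ones((4, 8))
y = dropout(x, 0.5, rng)                    # random mask, must be shared
z = deterministic_dropout(x, 0.25, step=3)  # mask recomputable anywhere
```

Because `deterministic_dropout` depends only on `(p, step)`, the accelerator can recompute `idx` locally, which is the kind of saving the abstract's three-orders-of-magnitude transfer-cost reduction refers to.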