Slightly-slacked dropout for improving neural network learning on FPGA

Authors: Sota Sawaguchi, Hiroaki Nishi
Year of publication: 2018
Source: ICT Express 4:75-80
ISSN: 2405-9595
Description: Neural Network Learning (NNL) is compute-intensive and often employs dropout, a technique that effectively regularizes the network to avoid overfitting. A hardware accelerator for dropout NNL has therefore been proposed; however, the existing method incurs a large data-transfer cost between hardware and software. This paper proposes Slightly-Slacked Dropout (SS-Dropout), a novel deterministic dropout technique that addresses the transfer cost while accelerating the process. Experimental results show that SS-Dropout improves on both the standard and the dropout NNL accelerators, achieving a 1.55x speed-up and a three-orders-of-magnitude reduction in transfer cost, respectively.
Database: OpenAIRE
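
The abstract does not spell out how SS-Dropout constructs its deterministic masks, so the following is only a minimal NumPy sketch of the general idea behind deterministic dropout that such a scheme builds on: if the dropout mask is a pure function of a shared seed and the iteration counter, the host and the accelerator can each regenerate the identical mask locally, eliminating the per-iteration mask transfer that a randomly generated mask would require. All function names and the mask construction here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def random_dropout(x, rate, rng):
    """Standard (inverted) dropout: a fresh random mask per call.

    In a hardware/software split, this mask is generated in software
    and must be transferred to the accelerator on every iteration.
    """
    mask = (rng.random(x.shape) >= rate).astype(x.dtype)
    return x * mask / (1.0 - rate)

def deterministic_dropout(x, rate, seed, step):
    """Deterministic dropout sketch: the mask depends only on (seed, step).

    Hypothetical construction: because both the host and the accelerator
    can recompute the same mask from a shared seed and an iteration
    counter, no per-iteration mask transfer is needed.
    """
    rng = np.random.default_rng(seed + step)  # reproducible per step
    mask = (rng.random(x.shape) >= rate).astype(x.dtype)
    return x * mask / (1.0 - rate)

# Usage: both sides of the HW/SW boundary call with the same (seed, step)
# and obtain the identical mask without any transfer.
x = np.ones((4, 8), dtype=np.float32)
out = deterministic_dropout(x, rate=0.5, seed=42, step=0)
```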