ASBNN: Acceleration of Bayesian Convolutional Neural Networks by Algorithm-hardware Co-design

Author: Yoshiki Fujiwara, Shinya Takamaeda-Yamazaki
Year of publication: 2021
Source: ASAP
DOI: 10.1109/asap52443.2021.00041
Description: Bayesian Convolutional Neural Networks (BCNNs) have been proposed to address the problem of model uncertainty in conventional neural networks. By treating weights as distributions rather than deterministic values, BCNNs mitigate overfitting, support training with small amounts of data, and enable uncertainty evaluation. However, computing the distributions of BCNN outputs is time- and energy-consuming because it requires multiple forward passes. To address this computational problem, we propose a novel algorithm-hardware co-design approach with an approximation algorithm and hardware support for the rapid computation of BCNNs. Our observations of the absolute values of each layer's inputs, and of the input differences among multiple forward passes, show that most of these values are very small compared with the remaining large values. Our algorithm treats these small values as zero, making the computation sparser. The extracted sparsity allows us to skip most multiplications. As a result, it achieves a computation reduction of 81.1% in classification tasks and 77.7% in regression tasks. Additionally, to support the algorithm-level approximation in hardware, we propose a novel dataflow specialized for our algorithm and develop a new accelerator architecture, the Accelerator for Sparse Bayesian Neural Networks (ASBNN), that can handle the sparsity extracted by the algorithm. Our evaluation demonstrates that the ASBNN successfully exploits the algorithmic computation reduction, improving computation time by 3.3× and energy efficiency by 3.7× compared with a naive implementation of a dense BCNN accelerator.
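The core approximation described above — zeroing out small layer inputs and small inter-pass input differences, then skipping the corresponding multiplications — can be sketched as follows. This is an illustrative Python sketch under assumptions, not the paper's implementation: the function names, the threshold value, and the incremental second-pass formulation are all this sketch's own choices.

```python
import numpy as np

def sparsify(x, threshold):
    # Approximation from the abstract: values whose magnitude is
    # below the threshold are treated as exactly zero.
    return np.where(np.abs(x) < threshold, 0.0, x)

def sparse_matvec(w, x):
    # Matrix-vector product that skips multiplications for zero
    # inputs, as a sparsity-aware accelerator would.
    y = np.zeros(w.shape[0])
    for j, xj in enumerate(x):
        if xj != 0.0:  # multiplication skipped when input is zero
            y += w[:, j] * xj
    return y

# Demo: two forward passes of one linear layer with the same weights.
# The second pass's input differs only slightly from the first, so the
# sparsified difference is mostly zero and the update is cheap.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))
x1 = rng.normal(size=8)
x2 = x1 + 0.01 * rng.normal(size=8)      # small inter-pass difference

delta = sparsify(x2 - x1, threshold=0.05)  # most entries become zero
y1 = sparse_matvec(w, sparsify(x1, threshold=0.05))
y2 = y1 + sparse_matvec(w, delta)          # incremental second pass
```

In this sketch the second pass reuses the first pass's result and only processes the sparse input difference, which is where the skipped multiplications (and thus the reported computation reduction) would come from on real hardware.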
Database: OpenAIRE