Diminished-1 Fermat Number Transform for Integer Convolutional Neural Networks

Authors: Zhu Baozhou, Nauman Ahmed, Zaid Al-Ars, Johan Peltenburg, Koen Bertels
Year: 2019
Source: 2019 IEEE 4th International Conference on Big Data Analytics (ICBDA).
Description: Convolutional Neural Networks (CNNs) are a widely used class of deep artificial neural networks. However, training large CNNs to produce state-of-the-art results can take a long time. In addition, the compute time of the inference stage of trained networks must be reduced to make them usable in real-time applications. To achieve this, reduced-precision integer formats such as INT8 and INT16 are used to create Integer Convolutional Neural Networks (ICNNs), which can then be deployed on mobile devices or embedded systems. In this paper, the Diminished-1 Fermat Number Transform (DFNT), i.e., the Fermat Number Transform (FNT) combined with the diminished-1 number representation, is proposed to accelerate ICNNs by exploiting algebraic properties of integer convolution. The convolution step is performed as diminished-1 point-wise products between DFNT-transformed feature maps, which can be reused multiple times in the calculation. Since representing and computing all the integers in the ring of integers modulo the Fermat number $2^{b}+1$ would require $b+1$ bits for the FNT, the diminished-1 number representation (which stores a value $x$ as $x-1$, with zero handled as a special case) is used to enable exact and efficient calculation in $b$ bits. Using DFNT, integer convolution is implemented on a general-purpose processor, showing a speedup of 2–3x with typical parameter configurations and better scalability, without any round-off error compared to the baseline.
Database: OpenAIRE
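
To make the transform-domain convolution concrete, the following is a minimal sketch, not the authors' implementation: it computes an exact integer convolution as point-wise products of FNT-transformed sequences in the ring modulo the Fermat number $2^{16}+1$, using ordinary modular arithmetic in place of the paper's diminished-1 encoding (which matters for hardware word width, not for correctness). The function names fnt, ifnt, and fnt_linear_conv are illustrative.

```python
# Minimal sketch: exact integer convolution via a Fermat Number Transform,
# working modulo F_4 = 2^16 + 1 = 65537. Plain modular arithmetic is used
# here; the paper's diminished-1 encoding (storing x as x - 1 so residues
# fit in b bits) is a hardware-oriented refinement omitted for clarity.

F = 2**16 + 1   # Fermat number; all arithmetic is in the ring Z_F
ORDER = 32      # multiplicative order of 2 mod F, since 2^32 = (2^16)^2 ≡ 1

def fnt(x, root):
    """Naive O(N^2) forward transform; N must divide ORDER."""
    N = len(x)
    return [sum(x[n] * pow(root, k * n, F) for n in range(N)) % F
            for k in range(N)]

def ifnt(X, root):
    """Inverse transform: forward transform with root^-1, scaled by N^-1."""
    N = len(X)
    inv = fnt(X, pow(root, -1, F))        # pow(a, -1, m) needs Python >= 3.8
    n_inv = pow(N, -1, F)
    return [(n_inv * v) % F for v in inv]

def fnt_linear_conv(signal, kernel):
    """Exact linear convolution of non-negative integers.

    Results are exact as long as every true output value is < F, e.g.
    INT8 inputs with a 3-tap kernel: 3 * 127 * 127 = 48387 < 65537.
    """
    N = len(signal) + len(kernel) - 1     # pad so nothing wraps around
    assert ORDER % N == 0, "transform length must divide ord(2) = 32"
    root = pow(2, ORDER // N, F)          # N-th root of unity that is a power of 2
    a = signal + [0] * (N - len(signal))
    b = kernel + [0] * (N - len(kernel))
    A = fnt(a, root)                      # transformed feature map: reusable
    B = fnt(b, root)                      # transformed kernel: reusable
    return ifnt([(u * v) % F for u, v in zip(A, B)], root)

print(fnt_linear_conv([1, 2, 3, 4, 5, 6], [1, 2, 1]))
# -> [1, 4, 8, 12, 16, 20, 17, 6], identical to direct integer convolution
```

Because the roots of unity are powers of two, multiplications by them reduce to bit shifts and modular additions, which is what makes the FNT attractive for integer arithmetic; a practical implementation would replace the naive O(N^2) loops above with a radix-2 fast-transform decomposition.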