Hardware Acceleration of Fully Quantized BERT for Efficient Natural Language Processing

Authors: Liu, Zejian; Li, Gang; Cheng, Jian
Publication Year: 2021
Source: Design, Automation & Test in Europe (DATE) 2021
Document Type: Working Paper
Description: BERT is a recent Transformer-based model that achieves state-of-the-art performance on various NLP tasks. In this paper, we investigate the hardware acceleration of BERT on FPGAs for edge computing. To tackle the huge computational complexity and memory footprint, we propose to fully quantize BERT (FQ-BERT), including the weights, activations, softmax, layer normalization, and all intermediate results. Experiments demonstrate that FQ-BERT achieves 7.94x weight compression with negligible performance loss. We then propose an accelerator tailored to FQ-BERT and evaluate it on Xilinx ZCU102 and ZCU111 FPGAs. It achieves a performance per watt of 3.18 fps/W, which is 28.91x and 12.72x higher than that of an Intel(R) Core(TM) i7-8700 CPU and an NVIDIA K80 GPU, respectively.
Database: arXiv
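
The abstract does not spell out the quantization scheme used in FQ-BERT. As a rough illustration only, the sketch below shows uniform symmetric fixed-point quantization of a weight tensor; the actual bit widths, calibration, and handling of activations, softmax, and layer normalization in the paper may differ.

```python
import numpy as np

def quantize_symmetric(x, num_bits=8):
    """Uniform symmetric quantization of a float tensor to signed integers.

    Illustrative sketch only; not the exact FQ-BERT scheme. Returns the
    integer tensor and the scale needed to dequantize it.
    """
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8-bit
    scale = float(np.max(np.abs(x))) / qmax   # map the largest magnitude to qmax
    if scale == 0.0:                          # all-zero tensor: avoid division by zero
        scale = 1.0
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map quantized integers back to approximate float values."""
    return q.astype(np.float32) * scale

# Example: quantize a random BERT-sized weight matrix and check the error.
w = np.random.randn(768, 768).astype(np.float32)
w_q, s = quantize_symmetric(w, num_bits=8)
print("max abs reconstruction error:", np.max(np.abs(w - dequantize(w_q, s))))
```

Storing 8-bit integers in place of 32-bit floats gives roughly 4x compression per tensor; reported ratios such as the 7.94x above typically come from more aggressive, mixed bit-width choices than this simple example.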