NV-DNN: Towards Fault-Tolerant DNN Systems with N-Version Programming

Authors: Zhuangbin Chen, Sy-Yen Kuo, Zhi Jin, Weibin Wu, Hui Xu, Michael R. Lyu
Year: 2019
Source: DSN Workshops
DOI: 10.1109/dsn-w.2019.00016
Description: Employing deep learning algorithms in real-world applications is becoming a trend. However, a bottleneck that impedes their further adoption in safety-critical systems is reliability. Developing reliable neural network models is challenging because the theory of deep learning is not yet well established and neural network models are very sensitive to data perturbations. Inspired by the classic paradigm of N-version programming for fault tolerance, this paper investigates the feasibility of building fault-tolerant deep learning systems through model redundancy. We hypothesize that if we train several simplex models independently, these models are unlikely to produce erroneous results for the same test cases. In this way, we can design a fault-tolerant system whose output is determined by all these models cooperatively. We propose several independence factors that can be introduced when generating multiple versions of neural network models, covering training, network, and data. Experimental results on MNIST and CIFAR-10 both verify that our approach can improve the fault tolerance of a deep learning system. In particular, independent training data plays the most significant role in generating multiple models that share the fewest mutual faults.
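The cooperative output described in the abstract can be realized with a simple majority vote across the independently trained model versions. The sketch below illustrates that aggregation step only; the function name and interface are illustrative assumptions, not the paper's actual implementation, and each model version is assumed to emit a single class label per input.

```python
from collections import Counter

def nversion_predict(predictions):
    """Majority vote over per-model class predictions for one input.

    predictions: a list of class labels, one from each independently
    trained model version (hypothetical interface). Ties are broken
    arbitrarily by Counter ordering in this sketch.
    """
    label, _count = Counter(predictions).most_common(1)[0]
    return label

# Example: one of three versions misclassifies; the majority masks the fault.
print(nversion_predict([3, 3, 7]))  # -> 3
```

The design mirrors classic N-version programming: a faulty minority is outvoted as long as the versions do not share the same fault on the same input, which is exactly the independence property the paper's training/network/data factors aim to promote.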
Database: OpenAIRE