Training with Augmented Data: GAN-based Flame-Burning Image Synthesis for Fire Segmentation in Warehouse

Author: Jineng Ouyang, Zhikai Yang, Teng Wang, Leping Bu
Year of publication: 2021
Subject:
Source: Fire Technology. 58:183-215
ISSN: 1572-8099, 0015-2684
Description: The training of video fire detection models based on deep learning relies on a large number of positive and negative samples, namely fire videos and scenario videos containing fire-like disturbances. Because ignition is prohibited in many indoor settings, fire video samples in such scenes are insufficient. In this paper, a method based on a generative adversarial network is proposed to generate flame images that are then migrated into specified scenes, thereby increasing the number of fire video samples in those restricted situations. A flame kernel is pre-implanted into the specified scene to keep its characteristics intact. The flame and scene are blended together by adding styling information such as blurred edges and ground reflection. This method overcomes the background distortion caused by information loss in existing multimodal image translation, guarantees the diversity of flames in specified scenes, and produces perceptually realistic results. Compared with other multimodal image-to-image translation schemes, the FID and LPIPS values of the images generated by our method are the highest, reaching 118.4 and 0.1322 respectively. In addition, Unet and SA-Unet, the latter incorporating a self-attention mechanism, are used as fire segmentation networks to evaluate how much the augmented data improves segmentation accuracy. Their F1-scores reach 0.8905 and 0.9082 respectively after Unet and SA-Unet are trained with the GAN-based augmented dataset generated by our model. These F1-scores are second only to the 0.9259 and 0.9291 obtained when Unet and SA-Unet are trained with real pictures serving as the augmented dataset.
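The segmentation results above are reported as F1-scores. For binary fire-segmentation masks, the metric can be computed from per-pixel true positives, false positives, and false negatives; the sketch below is a minimal illustration (the function name and the toy masks are our own, not taken from the paper):

```python
import numpy as np

def f1_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """F1-score (equivalently, the Dice coefficient) for binary masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()    # fire pixels found correctly
    fp = np.logical_and(pred, ~gt).sum()   # background flagged as fire
    fn = np.logical_and(~pred, gt).sum()   # fire pixels missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy 2x2 masks: 1 true positive, 1 false positive, 1 false negative
pred = np.array([[1, 1], [0, 0]])
gt = np.array([[1, 0], [1, 0]])
print(round(f1_score(pred, gt), 4))  # → 0.5
```

In practice the score would be computed over all test images, either by pooling pixel counts or by averaging per-image scores; the paper's 0.8905/0.9082 figures correspond to such an evaluation over its fire test set.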
Database: OpenAIRE