TSS-Net: Time-based Semantic Segmentation Neural Network for Road Scene Understanding

Authors: Huy-Hung Nguyen, Jae Wook Jeon, Tin Trung Duong
Year of publication: 2021
Subject:
Source: IMCOM
DOI: 10.1109/imcom51814.2021.9377401
Description: In this research, we propose a multitask convolutional neural network that performs end-to-end road scene classification and semantic segmentation, two crucial tasks for advanced driver assistance systems (ADAS). We name the network TSS, which stands for time-based semantic segmentation. The network contains three main modules: an image encoder, a scene classifier, and two time-based segmentation decoders. For each road scene image, the encoder extracts image features that are used by both the classifier and the decoders. Next, the image features are fed to the classifier to predict the scene type (in this case, a day or a night scene). Then, based on the predicted scene type, the same extracted features are fed to the corresponding segmentation decoder to produce the final semantic segmentation result. With this classification-driven decoder approach, we improve the accuracy of the segmentation model, even when the model has already been trained extensively. The experiments confirm the validity of the proposed method. Our approach can be viewed as stacking multiple segmentation decoders on top of the classification module, with all of them sharing the same image encoder. In this way, the classification result is exploited to gain segmentation accuracy in a single forward pass.
Database: OpenAIRE
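
A minimal sketch, assuming a toy encoder, classifier, and decoders, of the classification-driven decoder idea described in the abstract: a shared image encoder, a day/night scene classifier, and one segmentation decoder per scene type, with the classifier's prediction selecting which decoder runs in a single forward pass. The class name `TSSNetSketch`, layer shapes, channel counts, and class counts are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code) of classification-driven decoding.
import torch
import torch.nn as nn


class TSSNetSketch(nn.Module):
    def __init__(self, num_seg_classes: int = 19, num_scene_types: int = 2):
        super().__init__()
        # Shared image encoder (downsamples by 4x in this toy version).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Scene classifier head (e.g. day vs. night) on pooled encoder features.
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_scene_types)
        )
        # One segmentation decoder per scene type, all fed by the same encoder.
        self.decoders = nn.ModuleList([
            nn.Sequential(
                nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(inplace=True),
                nn.ConvTranspose2d(32, num_seg_classes, 2, stride=2),
            )
            for _ in range(num_scene_types)
        ])

    def forward(self, x: torch.Tensor):
        feats = self.encoder(x)                         # shared features
        scene_logits = self.classifier(feats)           # predict scene type
        # Route by the predicted scene type; for simplicity this uses the
        # first sample's prediction, i.e. it assumes batch size 1 at inference.
        scene_idx = int(scene_logits.argmax(dim=1)[0])
        seg_logits = self.decoders[scene_idx](feats)    # scene-specific decoder
        return scene_logits, seg_logits


if __name__ == "__main__":
    model = TSSNetSketch()
    image = torch.randn(1, 3, 128, 256)                 # dummy road-scene image
    scene_logits, seg_logits = model(image)
    print(scene_logits.shape, seg_logits.shape)         # (1, 2), (1, 19, 128, 256)
```

The point of the routing step is that only one decoder is executed per image, so the scene-specific segmentation comes at essentially the cost of a single forward pass through the shared encoder plus one decoder.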