An Uncertainty-based Neural Network for Explainable Trajectory Segmentation
| Author | Fangtong Wang, Xin Bi, Guoren Wang, Xiangguo Zhao, Ye Yuan, Zhixun Liu, Chao Zhang |
|---|---|
| Year of publication | 2021 |
| Subject | |
| Source | ACM Transactions on Intelligent Systems and Technology, 13:1-18 |
| ISSN | 2157-6912, 2157-6904 |
| DOI | 10.1145/3467978 |
| Description | As a variant of time-series segmentation, trajectory segmentation is a key task in applications such as transportation pattern recognition and traffic analysis. However, trajectory segmentation faces the challenges of implicit patterns and sparse results. Although deep neural networks offer tremendous advantages in high-level feature learning, deploying them as black boxes severely limits their real-world applications. Providing explainable segmentations is significant for result evaluation and decision making. Thus, in this article, we address trajectory segmentation by proposing a Bayesian Encoder-Decoder Network (BED-Net) that provides accurate detection with explainability, along with references for subsequent active-learning procedures. BED-Net consists of a segmentation module based on Monte Carlo dropout and an explanation module based on uncertainty learning that supports result evaluation and visualization. Experimental results on both benchmark and real-world datasets indicate that BED-Net outperforms rival methods and offers excellent explainability in trajectory segmentation applications. |
| Database | OpenAIRE |
| External link | |
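The abstract names Monte Carlo dropout as the basis of the segmentation module and uncertainty learning as the basis of the explanation module. The sketch below only illustrates the general Monte Carlo dropout technique for obtaining per-point segment predictions and an uncertainty score; it is not the authors' BED-Net implementation, and the `TinySegmenter` model, `mc_dropout_predict` helper, and all hyperparameters are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Minimal sketch of Monte Carlo dropout for trajectory segmentation,
# assuming a generic recurrent encoder-decoder (not the paper's architecture).

class TinySegmenter(nn.Module):
    def __init__(self, in_dim=2, hidden=64, n_classes=2, p_drop=0.5):
        super().__init__()
        self.encoder = nn.GRU(in_dim, hidden, batch_first=True)
        self.dropout = nn.Dropout(p_drop)            # kept stochastic at test time for MC dropout
        self.decoder = nn.Linear(hidden, n_classes)  # per-point segment-label logits

    def forward(self, x):                            # x: (batch, seq_len, in_dim)
        h, _ = self.encoder(x)
        return self.decoder(self.dropout(h))         # (batch, seq_len, n_classes)

def mc_dropout_predict(model, x, n_samples=30):
    """Run several stochastic forward passes with dropout enabled and return
    the mean class probabilities plus per-point predictive entropy, which can
    serve as an uncertainty signal for evaluation and visualization."""
    model.train()  # keep dropout layers active during inference
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )                                            # (n_samples, batch, seq_len, n_classes)
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * torch.log(mean_probs + 1e-12)).sum(dim=-1)
    return mean_probs, entropy

# Example usage on a toy trajectory of (x, y) points:
traj = torch.randn(1, 100, 2)
model = TinySegmenter()
mean_probs, uncertainty = mc_dropout_predict(model, traj)
labels = mean_probs.argmax(dim=-1)                   # per-point segment labels
```

Points where the predictive entropy is high would be natural candidates for the explanation and active-learning steps the abstract mentions, since they mark segment boundaries the model is least certain about.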