Wipe Scene Change Detection in Object-Camera Motion Based on Linear Regression and an Inflated Spatial-Motion Neural Network

Author: Dipanita Chakraborty, Werapon Chiracharit, Kosin Chamnongthai, Theekapun Charoenpong
Language: English
Publication year: 2023
Source: IEEE Access, Vol. 11, pp. 33080-33099 (2023)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3262796
Description: To facilitate content-based video analysis, automatic scene change detection (SCD) in the presence of large-scale motion activity is a fundamental step for locating the transition from one video scene to another. With the exponential increase in digital media usage, SCD has become more challenging: large amounts of motion content must be processed with minimal information loss and maximum preservation. Wipe SCD under object-camera motion is a clear example of this challenge. Wipe transitions, a type of gradual transition, exhibit diverse motion-pattern changes when influenced by object-camera motion (camera pan, large object movement, and zoom-in/out), creating a velocity imbalance within the same frame. This motion imbalance, in turn, leads to false detections. To address the motion-information loss and long processing times of existing frameworks, we propose a novel wipe scene change detection (WSCD) method based on deep spatial-motion feature analysis. First, long input videos are segmented into shots using dimensionality reduction and an adaptive threshold. Second, linear regression is used to compute slope-angle changes within shots for candidate selection and wipe localization. Finally, only the selected candidates are processed for feature extraction using a two-stream inflated 3D convolutional neural network (I3DCNN), with an RGB stream and an optical-flow-velocity motion stream, and are then classified into wipe in-motion and no-motion clips. Experimental results are obtained by classifying wipe patterns with a detection reviewing-and-merging strategy on the corresponding wipe frames. On the benchmark TRECVID dataset, the proposed method improves wipe scene change detection accuracy by an average of 11.9%, demonstrating its efficacy.
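The candidate-selection step described above (linear regression over slope-angle changes within a shot) can be illustrated with a minimal sketch. The record does not specify the per-frame signal, window size, or threshold the authors use, so the mean-intensity signal and the `slope_angle` / `select_wipe_candidates` helpers below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def slope_angle(signal: np.ndarray) -> float:
    """Fit a degree-1 least-squares line to a 1-D signal and
    return its slope as an angle in degrees."""
    x = np.arange(len(signal))
    slope, _ = np.polyfit(x, signal, 1)
    return float(np.degrees(np.arctan(slope)))

def select_wipe_candidates(frame_means: np.ndarray,
                           window: int = 15,          # assumed window length
                           angle_threshold: float = 10.0  # assumed threshold
                           ) -> list[tuple[int, int]]:
    """Slide a window over a per-frame signal (here: mean frame
    intensity) and flag windows whose regression slope angle exceeds
    a threshold, i.e. a sustained gradual change consistent with a
    wipe rather than an abrupt cut or static content."""
    candidates = []
    for start in range(len(frame_means) - window):
        if abs(slope_angle(frame_means[start:start + window])) > angle_threshold:
            candidates.append((start, start + window))
    return candidates
```

A sustained, roughly linear drift in the signal over many frames is what distinguishes a gradual wipe from a single-frame cut, which is why a fitted slope angle, rather than a frame-to-frame difference, is a natural candidate criterion here.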
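Likewise, a minimal sketch of preparing the motion-stream input mentioned in the abstract, assuming dense Farneback optical flow as the velocity field; the record does not state which flow estimator or input layout the I3DCNN expects, so the `motion_stream` helper and its parameters are hypothetical:

```python
import cv2
import numpy as np

def motion_stream(frames: list[np.ndarray]) -> np.ndarray:
    """Compute dense optical flow between consecutive grayscale frames;
    the stacked (dx, dy) fields approximate the motion-stream input a
    two-stream network would consume alongside the RGB stream."""
    flows = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        nxt = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(
            prev, nxt, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        flows.append(flow)      # H x W x 2 array of (dx, dy) per frame pair
        prev = nxt
    return np.stack(flows)      # (T-1) x H x W x 2 motion-stream tensor
```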
Database: Directory of Open Access Journals