Occlusion-aware Video Temporal Consistency
Author: Chun-Han Yao, Shao-Yi Chien, Chia-Yang Chang
Year of publication: 2017
Subject: Computer science; Computer vision; Image processing; Color balance; Video processing; Tone mapping; Video quality; Local color; Affine transformation; Image warping; Artificial intelligence; Frame (networking)
Source: ACM Multimedia
DOI: 10.1145/3123266.3123363
Description: Image color editing techniques such as color transfer, HDR tone mapping, dehazing, and white balancing have been widely used and studied in recent decades. However, naively applying them to videos frame by frame often leads to flickering or color inconsistency. Earlier methods address this in a general way by temporal filtering or by warping from the previous frame, but they still fail under occlusion and produce blurry results. We introduce a new framework for these challenges: (1) We develop an online keyframe strategy to keep track of dynamic objects, so that more temporal information can be acquired than from a single previous frame. (2) To preserve image details, a local color affine model is employed. The main idea of this post-processing step is to capture the color transformation produced by the editing algorithm while simultaneously maintaining the detail structure of the raw image. In practice, our approach takes a raw video and its per-frame processed version and generates a temporally consistent output. In addition, we propose a video quality metric to evaluate temporal coherence. Extensive experiments and a subjective test demonstrate the superiority of the proposed framework in terms of color fidelity, detail preservation, and temporal consistency.
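The description mentions a local color affine model that transfers the colors of the per-frame processed result onto the raw frame while keeping the raw frame's detail structure, but the record gives no formulas. The sketch below is only an illustration of that general idea using a guided-filter-style closed-form fit; the function name, window radius, and regularizer `eps` are my own choices, not necessarily the paper's formulation:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_affine_transfer(raw, processed, radius=8, eps=1e-4):
    """Fit a local affine model Q = a * raw + b in each window, so the
    output follows the colors of `processed` while keeping the detail
    structure of `raw` (guided-filter-style closed form, per channel).

    raw, processed: float arrays in [0, 1], shape (H, W) per channel.
    """
    size = 2 * radius + 1
    mean_i = uniform_filter(raw, size)
    mean_p = uniform_filter(processed, size)
    corr_ip = uniform_filter(raw * processed, size)
    corr_ii = uniform_filter(raw * raw, size)

    var_i = corr_ii - mean_i * mean_i          # local variance of raw
    cov_ip = corr_ip - mean_i * mean_p         # local raw/processed covariance

    a = cov_ip / (var_i + eps)                 # local affine gain
    b = mean_p - a * mean_i                    # local affine offset

    # Average the per-window coefficients so each pixel blends the models
    # of all windows covering it, then apply the affine map to the raw frame.
    mean_a = uniform_filter(a, size)
    mean_b = uniform_filter(b, size)
    return mean_a * raw + mean_b
```

Applied per color channel, `local_affine_transfer(raw, processed)` returns a frame whose colors follow the edited version but whose gradients come from the raw input; in a temporal-consistency pipeline the guidance could equally come from a warped keyframe rather than the current edited frame.

The description also proposes a video quality metric for temporal coherence without defining it. A common way to quantify temporal coherence, shown here purely as a hedged sketch rather than the paper's metric, is the occlusion-masked warping error between consecutive output frames; the optical flow and occlusion mask are assumed to be supplied by an external estimator:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warping_error(prev_out, curr_out, flow, occ_mask=None):
    """Warp the previous output frame to the current one with the given
    optical flow and average the squared difference over visible pixels.

    prev_out, curr_out: float arrays, shape (H, W).
    flow: array of shape (2, H, W) with (dy, dx) from current to previous.
    occ_mask: optional boolean (H, W), True where the pixel is occluded
              (e.g. from a forward-backward flow check) and is ignored.
    """
    h, w = curr_out.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    coords = np.stack([ys + flow[0], xs + flow[1]])
    warped_prev = map_coordinates(prev_out, coords, order=1, mode='nearest')

    diff = (curr_out - warped_prev) ** 2
    if occ_mask is not None:
        diff = diff[~occ_mask]       # score only non-occluded pixels
    return float(diff.mean())        # lower means more temporally coherent
```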
Database: OpenAIRE
External link: