Kinect 3D Point Cloud Live Video Streaming
Author: Zainab Namh Sultani, Rana Fareed Ghani
Year of publication: 2015
Subject: Kinect, 3D point cloud, point cloud, compression, video processing, filter (signal processing), downsampling, octree, histogram, RGB color model, streaming, live streaming, computer vision, artificial intelligence, Android (operating system), computer science
Source: Procedia Computer Science 65:125-132
ISSN: 1877-0509
DOI: 10.1016/j.procs.2015.09.090
Description: We present a live video streaming system built around a low-cost 3D sensor camera such as the Microsoft Kinect. The huge amount of raw point data the Kinect produces must be stored and transmitted in a compact, efficient form, and noise and redundancy make this difficult. To overcome these difficulties we propose a live streaming system that delivers 3D video to an Android mobile phone and to Linux desktop clients. For the Android mobile client, the 3D video is filtered before streaming. The filtering stage uses three filters: a voxel grid filter, statistical outlier removal, and a histogram-based conditional filter. The video is captured by the Kinect and converted into a 3D point cloud. Because each generated frame contains millions of points, the voxel grid filter applies a downsampling procedure to reduce the number of points. Statistical outlier removal then removes noisy outlier points, and the histogram-based conditional filter reduces color information; the conditional filter is tuned to the scene histogram of each RGB channel so that the dominant color information of each scene is preserved. For the Linux desktop client, the video is filtered with the histogram-based conditional filter and compressed with an octree structure that reduces spatial redundancy across the streamed video. (A sketch of such a pipeline follows this record.)
Database: OpenAIRE
External link:
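
The filtering and compression steps named in the description correspond closely to standard Point Cloud Library (PCL) components. The C++ sketch below is not the authors' implementation; it is a minimal illustration, assuming PCL is available, of voxel grid downsampling, statistical outlier removal, and octree compression of an RGB point cloud. The leaf size, neighbourhood size, deviation threshold, compression profile, and file names are assumed values chosen only for illustration.

```cpp
#include <sstream>

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/io/pcd_io.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/statistical_outlier_removal.h>
#include <pcl/compression/octree_pointcloud_compression.h>

int main()
{
    using Cloud = pcl::PointCloud<pcl::PointXYZRGB>;

    // Load one captured Kinect frame (the file name is hypothetical).
    Cloud::Ptr cloud(new Cloud);
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("kinect_frame.pcd", *cloud) < 0)
        return 1;

    // 1. Voxel grid downsampling: every point inside a leaf (here an assumed
    //    1 cm cube) is replaced by the leaf centroid, cutting the point count.
    Cloud::Ptr downsampled(new Cloud);
    pcl::VoxelGrid<pcl::PointXYZRGB> voxel;
    voxel.setInputCloud(cloud);
    voxel.setLeafSize(0.01f, 0.01f, 0.01f);
    voxel.filter(*downsampled);

    // 2. Statistical outlier removal: points whose mean distance to their
    //    k nearest neighbours deviates too far from the global mean are dropped.
    Cloud::Ptr denoised(new Cloud);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZRGB> sor;
    sor.setInputCloud(downsampled);
    sor.setMeanK(50);              // assumed neighbourhood size
    sor.setStddevMulThresh(1.0);   // assumed standard-deviation multiplier
    sor.filter(*denoised);

    // 3. Octree compression (the desktop path in the description): encode the
    //    cloud into a compact byte stream that could be sent to a client.
    pcl::io::OctreePointCloudCompression<pcl::PointXYZRGB>
        encoder(pcl::io::MED_RES_ONLINE_COMPRESSION_WITH_COLOR);
    std::stringstream compressed;
    encoder.encodePointCloud(denoised, compressed);

    // A receiver would call decodePointCloud(compressed, output) per frame.
    return 0;
}
```

In a live setting this per-frame pipeline would be driven by the Kinect grabber and a network layer rather than by PCD files; the histogram-based conditional color filter described in the paper is not shown here, since its scene-dependent thresholds are specific to the authors' method.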