5.4 A Dynamic Pseudo 4-Tap CMOS Time-of-Flight Image Sensor with Motion Artifact Suppression and Background Light Cancelling Over 120klux

Author: Dahwan Park, Canxing Piao, Jaehyuk Choi, Yeonsoo Ahn, Kihwan Cho, Seong-Jin Kim, Seung Min Song, Jung-Hoon Chun, Jungsoon Shin, Donguk Kim, Jihoon Park, Seunghyun Lee
Year of publication: 2020
Subject:
Source: ISSCC
DOI: 10.1109/isscc19947.2020.9063101
Description: An indirect time-of-flight (iToF) CMOS image sensor (CIS) provides depth as well as the two-dimensional shape of an object by measuring the phase difference of reflected pulse trains of light. Because the iToF CIS offers high spatial resolution from scaled photodetectors, such as pinned photodiodes (PPDs) [1] or photogates [2], as well as high depth accuracy, it enables object or gesture recognition with higher accuracy than conventional 2D imagers. Although iToF CISs have so far been limited to indoor applications, such as gaming, there is strong demand for outdoor applications, including 3D face recognition and augmented reality for mobile devices, gesture recognition for vehicles, and service robots. The operating principle of the iToF CIS is 4-phase demodulation (4PH), which acquires four charges from four demodulation phases (0, π/2, π, 3π/2). A conventional iToF CIS employs 2-tap (2T) pixels, in which a photodetector has two readout paths so that charges from two phases can be separated. The 4PH therefore requires two successive frames: 0 and π in the first, and π/2 and 3π/2 in the second [1]–[5]. This conventional 4PH induces significant motion artifacts, which are critical for gesture-recognition applications. Another critical problem is weak immunity to background light (BGL), including strong indoor lighting and sunlight. Several iToF CISs with integrated BGL cancelling have been reported; however, they sacrifice sensitivity by connecting an additional capacitor to avoid saturation, or sacrifice spatial resolution by adding huge analog memories in the CIS core [3].
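The 4-phase demodulation described above can be illustrated with the standard iToF arithmetic: the phase delay φ of the reflected light is recovered from the four integrated charges, and depth follows from φ and the modulation frequency. This is a minimal sketch of the textbook formula, not the circuit-level implementation of this particular sensor; the function name and the ideal-sample convention (Q_k = B + A·cos(φ − θ_k)) are assumptions for illustration.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(q0, q90, q180, q270, f_mod):
    """Estimate depth (m) from the four charges of 4-phase (4PH)
    demodulation at modulation frequency f_mod (Hz).

    q0..q270 are the integrated charges (arbitrary units) for the
    0, pi/2, pi and 3pi/2 demodulation phases. Assumes the ideal
    sampling convention Q_k = B + A*cos(phi - theta_k), so the
    background offset B cancels in both differences below.
    """
    # atan2 of the two charge differences recovers the full 0..2*pi phase.
    phi = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    # The round-trip phase delay maps onto the unambiguous range C / (2 * f_mod).
    return (C / (2 * f_mod)) * (phi / (2 * math.pi))

# Example: ideal charges for a pi/2 phase delay (offset B = 100, amplitude A = 50)
# at 20 MHz modulation: quarter of the ~7.49 m unambiguous range, i.e. ~1.87 m.
depth = itof_depth(100, 150, 100, 50, 20e6)
```

The two differences (q0 − q180 and q90 − q270) are also why 2-tap pixels need two frames: each frame can only capture one complementary pair, which is the source of the motion artifact the paper addresses.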
Database: OpenAIRE