Robust Imaging-Free Object Recognition Through Anderson Localizing Optical Fiber
Author: Axel Schülzgen, Xiaowen Hu, Rodrigo Amezcua Correa, Jian Zhao, Jose Enrique Antonio-Lopez, Shengli Fan
Year: 2021
Subject: Facet (geometry), Optical fiber, Multi-mode optical fiber, Computer science, Fiber (mathematics), Cognitive neuroscience of visual object recognition, Physics::Optics, Convolutional neural network, Atomic and Molecular Physics and Optics, Speckle pattern, Robustness (computer science), Computer vision, Artificial intelligence
Source: Journal of Lightwave Technology. 39:920-926
ISSN: 1558-2213, 0733-8724
DOI: 10.1109/jlt.2020.3029416
Abstract: Recognizing objects directly from optical fiber output images is useful in endoscopic applications where forming a clear image of the object is unnecessary or difficult. Conventional fiber-optic systems, such as multicore-fiber-based and multimode-fiber-based systems, suffer from the sensitivity of the fiber to external perturbations. For example, a slight movement of the fiber (a few millimeters of tip translation for meter-long multicore or multimode fibers) can greatly change the output images of the system. In this work, we exploit the light-guidance stability of the recently proposed glass-air Anderson localizing optical fiber (GALOF) to achieve robust imaging-free object recognition. We transport five classes of cell images through an 80-cm straight GALOF. A deep convolutional neural network is trained to classify the output images and tested on images never seen during training, namely, images collected when the fiber is bent or when the fiber facet is placed several millimeters away from the object without any distal optics. Bending-invariant, high classification accuracy (86.8% on average) is observed all the way to the maximum bending offset distance of 45 cm (∼74° bending angle). High classification accuracy (91.2%) is also preserved when the fiber facet is 0.5 mm away from the object.
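To illustrate the classification step described in the abstract, the following is a minimal sketch of a convolutional classifier for five classes of fiber-output images. The layer sizes, image resolution, and class names are hypothetical placeholders; the paper's actual deep CNN architecture is not specified in this record.

```python
import torch
import torch.nn as nn

class SpeckleClassifier(nn.Module):
    """Toy 5-class classifier for single-channel fiber-output images.

    Hypothetical architecture for illustration only; the published work
    uses a deeper network whose details are not given in this record.
    """

    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # grayscale input
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # pool to 1x1 so any input size works
            nn.Flatten(),
            nn.Linear(32, num_classes),  # one logit per cell class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SpeckleClassifier()
batch = torch.randn(4, 1, 64, 64)  # four synthetic 64x64 output images
logits = model(batch)               # shape: (4, 5), one row per image
```

In such a setup, robustness to bending would be evaluated by training on images from the straight fiber and testing on images captured under bending, exactly as the abstract describes.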
Database: OpenAIRE