Showing 1 - 10 of 106 for search '"Jae Seol Lee"'
Author:
Jae-Seol Lee, Tae-Hyoung Park
Published in:
IEEE Access, Vol 10, Pp 125102-125111 (2022)
LiDAR semantic segmentation is essential in autonomous vehicle safety. A rotating 3D LiDAR projects more laser points onto nearby objects and fewer points onto farther objects. Therefore, when projecting points onto a 2D image, such as spherical coordinates…
External link:
https://doaj.org/article/45e7da061c794d6e871ac65b53862460
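The abstract above refers to projecting a rotating 3D LiDAR's points onto a 2D image via spherical coordinates. The following Python sketch illustrates such a range-image projection under assumed settings; the image size, field-of-view limits, and function name are illustrative, not the configuration used in the paper.

import numpy as np

def spherical_projection(points, h=64, w=1024, fov_up=3.0, fov_down=-25.0):
    # Project an (N, 3) array of (x, y, z) LiDAR points to an h x w range image.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1) + 1e-8

    yaw = np.arctan2(y, x)        # azimuth angle in [-pi, pi]
    pitch = np.arcsin(z / r)      # elevation angle

    fov_up_rad = np.radians(fov_up)
    fov_total = np.radians(fov_up - fov_down)

    # Normalise the angles and scale them to pixel coordinates.
    u = 0.5 * (1.0 - yaw / np.pi) * w
    v = (fov_up_rad - pitch) / fov_total * h

    u = np.clip(np.floor(u), 0, w - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, h - 1).astype(np.int32)

    # Many points from nearby objects collapse into the same pixel (later
    # points overwrite earlier ones), while distant objects leave sparse or
    # empty pixels, which is the density imbalance the abstract mentions.
    range_image = np.full((h, w), -1.0, dtype=np.float32)
    range_image[v, u] = r
    return range_image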
Published in:
Sensors, Vol 21, Iss 22, p 7623 (2021)
Although numerous road segmentation studies have utilized vision data, obtaining robust classification is still challenging due to vision sensor noise and target object deformation. Long-distance images are still problematic because of blur and low resolution…
External link:
https://doaj.org/article/6a374b9731d0449ebff8b0f42cd979a4
Author:
Jae Seol Lee, Jungki Song, Seong Oh Kim, Seokbeom Kim, Wooju Lee, Joshua A. Jackman, Dongchoul Kim, Nam-Joon Cho, Jungchul Lee
Published in:
Nature Communications, Vol 7, Iss 1, Pp 1-14 (2016)
Atomic force microscopy typically employs hard tips to map the surface topology of a sample, with sub-nanometre resolution. Here, the authors instead develop softer hydrogel probes, which show potential for multifunctional measurement capabilities be…
External link:
https://doaj.org/article/124febb929be468e8e8d1ef3c41c6099
Author:
Jae-Seol Lee, Tae-Hyoung Park
Published in:
IEEE Transactions on Intelligent Transportation Systems. 22:5802-5810
We propose a new camera-lidar fusion method for road detection, where a spherical coordinate transformation is introduced to decrease the gaps between the points of the 3D lidar data. The camera’s color data and the 3D lidar’s height data are transformed…
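The abstract describes transforming both the camera's color data and the lidar's height data into a common spherical-coordinate image before fusing them. A minimal Python sketch of the fusion step is given below, assuming both modalities have already been projected onto the same grid; the array shapes and the function name are assumptions made for illustration.

import numpy as np

def fuse_camera_lidar(rgb_sph, height_sph):
    # rgb_sph: (H, W, 3) camera colors projected to spherical coordinates.
    # height_sph: (H, W) lidar height values on the same spherical grid.
    # Returns an (H, W, 4) array that a road-detection network can consume.
    assert rgb_sph.shape[:2] == height_sph.shape, "modalities must share one grid"
    return np.concatenate([rgb_sph, height_sph[..., None]], axis=-1)

Stacking the modalities channel-wise is only one possible fusion strategy; the paper's actual network design may differ.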
Published in:
Sensors (Basel, Switzerland), Vol 21, Iss 22, p 7623 (2021)
Although numerous road segmentation studies have utilized vision data, obtaining robust classification is still challenging due to vision sensor noise and target object deformation. Long-distance images are still problematic because of blur and low resolution…
Published in:
IEEE Transactions on Intelligent Transportation Systems. 20:4251-4256
An effective method to segment vehicles and roads is proposed for autonomous vehicles using low-channel 3D lidar. A distance-view transformation is newly proposed to overcome the low density of the lidar’s top-view data. In addition, a dilated convolution…
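The abstract mentions a dilated convolution used together with the distance-view transformation to cope with the sparsity of low-channel lidar data. Below is a minimal PyTorch sketch of a dilated convolution block; the channel counts, kernel size, and dilation rate are illustrative assumptions, not the paper's architecture.

import torch
import torch.nn as nn

class DilatedBlock(nn.Module):
    # A 3x3 convolution with dilation enlarges the receptive field
    # without adding parameters, which helps on sparse lidar grids.
    def __init__(self, in_ch=5, out_ch=32, dilation=2):
        super().__init__()
        # padding = dilation keeps the spatial size unchanged for a 3x3 kernel.
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

# Example: a batch of 5-channel distance-view images (batch, channels, H, W).
x = torch.randn(1, 5, 64, 512)
y = DilatedBlock()(x)   # -> shape (1, 32, 64, 512)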
Published in:
Transactions of the Korean Society of Automotive Engineers. 26:368-377
Author:
Seok-Cheol Kee, Jae-Seol Lee
Published in:
Journal of Institute of Control, Robotics and Systems. 23:455-461
Published in:
CIS/RAM
As radar and lidar sensors’ precision varies with distance, this paper proposes an extended Kalman filter that reflects the precision of the sensors as the distance changes. The majority of previous studies did not consider the measurement errors of the…
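The abstract describes an extended Kalman filter whose measurement noise reflects how radar and lidar precision change with distance. The Python sketch below shows a single EKF measurement update in which the covariance R is scaled with the measured range; the linear noise model and all numeric values are assumptions for illustration, not the paper's formulation.

import numpy as np

def ekf_update(x, P, z, h, H_jac, base_std=0.1, std_per_meter=0.02):
    # x, P     : prior state estimate and covariance
    # z        : measurement [range, bearing] (illustrative layout)
    # h, H_jac : measurement function and its Jacobian evaluated at x
    rng = z[0]
    # Assumed model: range standard deviation grows linearly with distance.
    std = base_std + std_per_meter * rng
    R = np.diag([std ** 2, np.radians(0.5) ** 2])

    H = H_jac(x)
    y = z - h(x)                           # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new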
Author:
Jae-Seol Lee, Tae-Hyoung Park
Published in:
2019 IEEE Intelligent Vehicles Symposium (IV).
A fast lidar-camera fusion method is proposed to detect roads for autonomous vehicles. The height data of the lidar is transformed to a spherical coordinate system to increase the data density. The RGB data of the camera is also transformed to spherical coordinates…