SwePub
Search the SwePub database


Search: WFRF:(Yeo Ye Chuan)

  • Result 1-2 of 2
1.
  • Cui, Zhaopeng, et al. (author)
  • Real-Time Dense Mapping for Self-Driving Vehicles using Fisheye Cameras
  • 2019
  • In: Proceedings - IEEE International Conference on Robotics and Automation. - ISSN 1050-4729. - ISBN 9781538660263 ; 2019-May, pp. 6087-6093
  • Conference paper (peer-reviewed). Abstract:
    • We present a real-time dense geometric mapping algorithm for large-scale environments. Unlike existing methods which use pinhole cameras, our implementation is based on fisheye cameras whose large field of view benefits various computer vision applications for self-driving vehicles such as visual-inertial odometry, visual localization, and object detection. Our algorithm runs on in-vehicle PCs at approximately 15 Hz, enabling vision-only 3D scene perception for self-driving vehicles. For each synchronized set of images captured by multiple cameras, we first compute a depth map for a reference camera using plane-sweeping stereo. To maintain both accuracy and efficiency, while accounting for the fact that fisheye images have a lower angular resolution, we recover the depths using multiple image resolutions. We adopt the fast object detection framework, YOLOv3, to remove potentially dynamic objects. At the end of the pipeline, we fuse the fisheye depth images into the truncated signed distance function (TSDF) volume to obtain a 3D map. We evaluate our method on large-scale urban datasets, and results show that our method works well in complex dynamic environments.
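The pipeline in the abstract above ends by fusing fisheye depth images into a truncated signed distance function (TSDF) volume. The sketch below shows only that last fusion step in NumPy, and it is not the authors' implementation: it assumes a pinhole projection instead of the paper's fisheye model, an identity camera pose, and a simple weight-1 running average per voxel.

```python
import numpy as np

def integrate_depth_into_tsdf(tsdf, weights, depth, K, voxel_size, origin, trunc):
    """Fuse one depth map into a TSDF volume.

    Simplifying assumptions (not from the paper): pinhole intrinsics K,
    identity camera pose, and observation weight 1 per depth sample.
    """
    nx, ny, nz = tsdf.shape
    # World coordinates of every voxel centre.
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    pts = origin + (np.stack([ii, jj, kk], axis=-1) + 0.5) * voxel_size
    z = pts[..., 2]
    zs = np.where(z > 0, z, 1.0)          # guard against division by zero
    # Project voxel centres into the depth image.
    u = np.round(K[0, 0] * pts[..., 0] / zs + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts[..., 1] / zs + K[1, 2]).astype(int)
    h, w = depth.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.where(valid, depth[v.clip(0, h - 1), u.clip(0, w - 1)], 0.0)
    sdf = d - z                           # signed distance along the optical axis
    valid &= (d > 0) & (sdf > -trunc)     # drop voxels far behind the surface
    tsdf_obs = np.clip(sdf / trunc, -1.0, 1.0)
    # Weighted running average of observations, weight 1 each.
    w_new = weights + valid
    tsdf[valid] = (tsdf[valid] * weights[valid] + tsdf_obs[valid]) / w_new[valid]
    weights[:] = w_new
    return tsdf, weights
```

The running average is what lets many per-frame depth maps, each noisy on its own, converge to a clean implicit surface at the TSDF's zero crossing.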
2.
  • Heng, Lionel, et al. (author)
  • Project AutoVision: Localization and 3D Scene Perception for an Autonomous Vehicle with a Multi-Camera System
  • 2019
  • In: Proceedings - IEEE International Conference on Robotics and Automation. - ISSN 1050-4729. - ISBN 9781538660263 ; 2019-May, pp. 4695-4702
  • Conference paper (peer-reviewed). Abstract:
    • Project AutoVision aims to develop localization and 3D scene perception capabilities for a self-driving vehicle. Such capabilities will enable autonomous navigation in urban and rural environments, in day and night, and with cameras as the only exteroceptive sensors. The sensor suite employs many cameras for both 360-degree coverage and accurate multi-view stereo; the use of low-cost cameras keeps the cost of this sensor suite to a minimum. In addition, the project seeks to extend the operating envelope to include GNSS-less conditions which are typical for environments with tall buildings, foliage, and tunnels. Emphasis is placed on leveraging multi-view geometry and deep learning to enable the vehicle to localize and perceive in 3D space. This paper presents an overview of the project, and describes the sensor suite and current progress in the areas of calibration, localization, and perception.