Interpolation Method for Sparse Point Cloud at Long-range Using LiDAR and Camera Sensor Fusion
LiDARとカメラを用いたセンサフュージョンによる遠距離スパース点群の補間手法
- Publication code
- 20224226
- Paper/Info type
- JSAE Transaction Vol.53 No.3
- Pages
- 598-604 (7 pages)
- Date of publication
- May 2022
- Publisher
- JSAE
- Language
- Japanese
Detailed Information
| Field | Content |
|---|---|
| Category(J) | 研究論文 (Research Paper) |
| Category(E) | Research Paper |
| Author(J) | 1) 齊藤 真衣, 2) 沈 舜聡, 3) 伊東 敏夫 |
| Author(E) | 1) Mai Saito, 2) Shuncong Shen, 3) Toshio Ito |
| Affiliation(J) | 1) 芝浦工業大学 (Shibaura Institute of Technology), 2) 芝浦工業大学, 3) 芝浦工業大学 |
| Abstract(J) | 低コストの低解像度3D-LiDARには,遠距離になると点群データが疎になるという問題がある.この問題を解決するために,カメラとLiDARのセンサフュージョンを利用した点群フレームの合成手法を提案する.カメラからのRGBデータを利用して隣接フレーム内での対応点を探索する.また,LiDARからの深度情報によって探索範囲を決定する. (Translation: Low-cost, low-resolution 3D-LiDAR has the problem that point cloud data becomes sparse at long range. To solve this problem, we propose a point cloud frame synthesis method using sensor fusion of a camera and LiDAR. RGB data from the camera is used to search for corresponding points in adjacent frames, and the search range is determined by depth information from the LiDAR.) |
| Abstract(E) | LiDAR plays an important role as an external sensor for automated driving systems. However, low-resolution 3D-LiDAR suffers from point cloud data becoming sparse at long range, which makes it difficult to obtain accurate information about the target object. To solve this problem, we propose an interpolation method using sensor fusion of a camera and LiDAR. RGB data from the camera is used to search for corresponding points in adjacent frames, and the search range is determined by the LiDAR's depth data. We evaluated computing cost and shape reconstruction accuracy by applying our method to a preceding vehicle. Furthermore, the point cloud data are clustered and classified with an SVM. |
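The abstract describes searching for a point's correspondence in the adjacent camera frame, with the search range bounded by the LiDAR depth (distant points move fewer pixels between frames). The paper's actual algorithm is not reproduced here; the following is only a minimal illustrative sketch of that idea, matching RGB patches by sum of squared differences. All function names, parameter values, and the depth-to-radius rule are assumptions, not taken from the paper.

```python
import numpy as np

def search_corresponding_point(img_t, img_t1, u, v, depth, patch=3, base_range=20):
    """Illustrative sketch (not the authors' implementation): find the pixel in
    the next frame whose RGB patch best matches the patch around (u, v) in the
    current frame. The search radius shrinks as LiDAR depth grows, reflecting
    that far-away points move less between adjacent frames (assumed rule)."""
    h, w, _ = img_t.shape
    # Depth-bounded search radius: an assumed inverse-depth heuristic.
    r = max(1, int(base_range / max(depth, 1.0)))
    ref = img_t[v - patch:v + patch + 1, u - patch:u + patch + 1].astype(float)
    best, best_cost = (u, v), np.inf
    for dv in range(-r, r + 1):
        for du in range(-r, r + 1):
            uu, vv = u + du, v + dv
            if patch <= uu < w - patch and patch <= vv < h - patch:
                cand = img_t1[vv - patch:vv + patch + 1,
                              uu - patch:uu + patch + 1].astype(float)
                cost = np.sum((ref - cand) ** 2)  # SSD over the RGB patch
                if cost < best_cost:
                    best, best_cost = (uu, vv), cost
    return best
```

Bounding the window by depth is what keeps the per-point cost low: the number of candidate pixels falls quadratically with the radius, which matters when many sparse long-range points must be interpolated per frame.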