Multi-Sensor Fusion in Slow Lanes for Lane Keep Assist System
- Delivery method
  - Download link provided by the publisher
- Format
- Price
  - List price (tax incl.): ¥6,600 / Member price (tax incl.): ¥5,280
- Document type
  - SAE Paper
    No. 2021-01-0084
- Pages
  - 1-7 (7 pages total)
- Publication date
  - April 2021
- Publisher
  - SAE International
- Language
  - English
- Event
  - SAE WCX Digital Summit 2021
Bibliographic details
Authors (EN) | 1) Qusay Alrousan, 2) Sherif Matta, 3) Tom Tasky |
---|---|
Affiliations (EN) | 1) FEV North America Inc., 2) FEV North America Inc., 3) FEV North America Inc. |
Abstract (EN) | Implementing Advanced Driver Assistance Systems (ADAS) features that remain available in all road scenarios and weather conditions is a major challenge for automotive companies and is considered a key enabler for achieving Level 4 (L4) autonomous vehicles. One important feature is the Lane Keep Assist System (LKAS). Most LKAS implementations rely on a lane-line detection camera: the camera recognizes lane lines using edge detection, and its lane-coefficient estimates are the key input to LKAS. However, when lane markers are not visible, for example during heavy traffic and slow driving, another source of lane-line data is needed for the LKAS. In this paper, a multi-sensor fusion approach based on camera, LiDAR, and GPS is used to allow the vehicle to maintain its lateral position within the lane. The lateral distances to the lane lines are measured by LiDAR detection of the markers based on intensity, and fused with lane-line information from the HD map after transforming the sensors to a common reference frame. The approach was tested on FEV's Smart Vehicle Demonstrator, and the test results show the vehicle was able to maintain the lane. |
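The abstract describes two steps: transforming map and sensor data into a common reference frame, then fusing the lateral-distance estimates. The sketch below illustrates one plausible form of those steps, not the paper's actual implementation: a 2D rigid transform of an HD-map lane point into the vehicle frame using a GPS pose, followed by an inverse-variance weighted average of the LiDAR- and map-derived lateral distances. All function names, coordinate conventions, and variance values are illustrative assumptions.

```python
import math

def map_point_to_vehicle_frame(px, py, veh_x, veh_y, veh_yaw):
    """Transform a global (e.g. ENU) map point into the vehicle frame.

    veh_x, veh_y, veh_yaw would come from GPS/INS; x points forward,
    y points left. All names and conventions here are illustrative.
    """
    dx, dy = px - veh_x, py - veh_y
    c, s = math.cos(-veh_yaw), math.sin(-veh_yaw)
    # Rotate the offset by the negative vehicle heading.
    return (c * dx - s * dy, s * dx + c * dy)

def fuse_lateral(d_lidar, var_lidar, d_map, var_map):
    """Inverse-variance weighted fusion of two lateral-distance estimates.

    A simple stand-in for the paper's fusion step: the less noisy
    source (smaller variance) gets the larger weight.
    """
    w_lidar, w_map = 1.0 / var_lidar, 1.0 / var_map
    return (w_lidar * d_lidar + w_map * d_map) / (w_lidar + w_map)

# Vehicle at (10, 5) heading due "north" (yaw = pi/2); a map lane point
# at (8, 5) ends up 2 m to the vehicle's left.
lateral = map_point_to_vehicle_frame(8.0, 5.0, 10.0, 5.0, math.pi / 2)
print(lateral)  # (0.0, 2.0) up to floating-point rounding

# Fuse a LiDAR estimate (1.8 m, low noise) with a map estimate (1.6 m).
print(fuse_lateral(1.8, 0.04, 1.6, 0.16))  # 1.76
```

When one source drops out, for instance the camera or LiDAR losing the markers in slow traffic as the abstract describes, this weighting scheme degrades gracefully: the surviving source's variance dominates and the fused estimate follows it.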