
Paper / Information search system


Higher Accuracy and Lower Computational Perception Environment Based Upon a Real-time Dynamic Region of Interest

Detailed Information

Author (E): 1) Nicolas Eric Brown, 2) Johan Fanas Rojas, 3) Hamzeh Alzu'bi, 4) Qusay Alrousan, 5) Rick Meyer, 6) Zachary Asher
Affiliation (E): 1) Western Michigan University, 2) Western Michigan University, 3) FEV North America, Inc., 4) FEV North America, Inc., 5) Western Michigan University, 6) Western Michigan University
Abstract (E): Robust sensor fusion is a key technology for enabling the safe operation of automated vehicles. Sensor fusion typically takes inputs from cameras, radars, lidar, an inertial measurement unit, and global navigation satellite systems, processes them, and then outputs object detection or positioning data. This paper focuses on sensor fusion between the camera, radar, and vehicle wheel speed sensors, which is a critical need for near-term realization of sensor fusion benefits. The camera is an off-the-shelf computer vision product from MobilEye, and the radar is a Delphi/Aptiv electronically scanning radar (ESR); both are connected to a drive-by-wire capable vehicle platform. We utilize the MobilEye and wheel speed sensors to create a dynamic region of interest (DROI) over the drivable region that changes as the vehicle moves through the environment. The DROI can eliminate the need to process up to approximately 100% of the radar detections when analyzing the drivable region, which not only yields accurate and robust detections but also lowers computational power requirements and time complexity. We then further reduce the number of detections in the drivable region using machine learning techniques, namely density-based spatial clustering of applications with noise (DBSCAN), followed by KMeans clustering; the remaining detections are finally fused with an extended Kalman filter. Experimental results obtained with an instrumented vehicle show a large reduction in the radar detections that must be processed after both fusion with our DROI and the subsequent machine learning clustering of the drivable region. The complete proposed technique decreases fused misdetections, decreases computational power requirements, and increases the reliability of the fused perception model, which can greatly benefit Advanced Driver Assistance System products on the market today.
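The detection-reduction stage described in the abstract (ROI gating, DBSCAN noise removal, then KMeans reduction) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the rectangular ROI, the function names, and the parameter values (`eps`, `min_pts`, `k`) are assumptions for demonstration only, and the real DROI is derived from MobilEye lane data and wheel speed rather than a fixed rectangle.

```python
import math

def in_roi(p, bounds):
    """True if point p = (x, y) lies inside a rectangular region of interest."""
    xmin, xmax, ymin, ymax = bounds
    return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN; returns one label per point, -1 meaning noise."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:     # former noise becomes a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:  # core point: keep expanding the cluster
                seeds.extend(jn)
    return labels

def kmeans(points, k, iters=20):
    """Minimal Lloyd's algorithm, deterministically seeded with the first k points."""
    centers = list(points[:k])
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[idx].append(p)
        centers = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else centers[c]
            for c, g in enumerate(groups)
        ]
    return centers

def reduce_detections(points, roi, eps=1.0, min_pts=3, k=2):
    """ROI gate -> DBSCAN noise removal -> KMeans reduction to k centroids."""
    kept = [p for p in points if in_roi(p, roi)]
    labels = dbscan(kept, eps, min_pts)
    core = [p for p, lab in zip(kept, labels) if lab != -1]
    if len(core) <= k:
        return core
    return kmeans(core, k)
```

With two tight groups of radar returns inside the ROI plus stray returns outside it (or isolated within it), `reduce_detections` discards the strays and collapses the survivors to `k` centroids, one per object; those centroids would then be the candidate measurements handed to the extended Kalman filter.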
