Comparison and Application of Multiple 3D LIDAR Fusion Methods for Object Detection and Tracking

Elif Aksu Tasdelen, Volkan Sezer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

Environment perception is a critical part of autonomous driving, as it is required to obtain reliable and accurate object information from the environment. LIDAR sensors are considered a key enabler for autonomous cars thanks to their wide field of view and high resolution. Automotive companies' interest in LIDAR sensors is also expected to grow as sensor prices fall over the years. Our main aim in this research is to obtain a more precise real-time object detection and tracking (ODT) system for autonomous vehicles. In this paper, we have developed, applied and tested two different (low-level and high-level) real-time sensor fusion methods on multiple 3D LIDAR sensors for environment perception. The first contribution of this work is the proposal and implementation of a high-level track-to-track fusion method on multiple 3D LIDAR sensors. To the best of our knowledge, this is the first automotive application of the track-to-track fusion method on multiple 3D LIDARs. Another contribution is the analysis and comparison of the track-to-track fusion method's performance against the well-studied low-level real-time fusion method. These two real-time fusion strategies are implemented on an experimental test truck instrumented with two 3D LIDAR sensors, and their performance is tested under three different driving scenarios. Additionally, high-accuracy ground-truth data are collected with the help of a global navigation satellite system (GNSS) for performance evaluation. The test results are analyzed in terms of defined performance criteria, and the benefits and weaknesses of the proposed approach are discussed in this work.
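
To make the distinction between the two fusion strategies in the abstract concrete, the following is a minimal, illustrative Python sketch only: low-level fusion merges the raw point clouds of the two LIDARs in a common frame before a single ODT pipeline runs, while high-level track-to-track fusion combines per-sensor track estimates. The function names, the extrinsics handling, and the covariance-weighted track-combination rule are assumptions made for this sketch and are not taken from the paper.

```python
import numpy as np

# --- Low-level fusion (illustrative): align both raw point clouds to a common
# --- vehicle frame and merge them, so a single detection/tracking pipeline runs once.
def transform_points(points, rotation, translation):
    """Transform an (N, 3) point cloud into the common vehicle frame."""
    return points @ rotation.T + translation

def low_level_fusion(cloud_a, cloud_b, extrinsics_a, extrinsics_b):
    """Concatenate the two LIDAR clouds after aligning them to one frame."""
    aligned_a = transform_points(cloud_a, *extrinsics_a)
    aligned_b = transform_points(cloud_b, *extrinsics_b)
    return np.vstack([aligned_a, aligned_b])  # one merged cloud for one ODT pipeline

# --- High-level track-to-track fusion (illustrative): each LIDAR runs its own
# --- detection/tracking pipeline; associated tracks of the same object are fused.
def fuse_tracks(state_a, cov_a, state_b, cov_b):
    """Covariance-weighted combination of two track estimates of the same object
    (a simplified information-form rule; the paper's exact fusion rule may differ)."""
    info_a = np.linalg.inv(cov_a)
    info_b = np.linalg.inv(cov_b)
    fused_cov = np.linalg.inv(info_a + info_b)
    fused_state = fused_cov @ (info_a @ state_a + info_b @ state_b)
    return fused_state, fused_cov

if __name__ == "__main__":
    # Two small synthetic clouds standing in for the two 3D LIDARs on the test truck.
    rng = np.random.default_rng(0)
    cloud_a = rng.normal(size=(100, 3))
    cloud_b = rng.normal(size=(100, 3))
    identity = (np.eye(3), np.zeros(3))
    merged = low_level_fusion(cloud_a, cloud_b, identity, identity)
    print("merged cloud shape:", merged.shape)

    # Two (x, y) track estimates of the same object from the two per-sensor pipelines.
    state, cov = fuse_tracks(np.array([10.0, 2.0]), np.diag([0.5, 0.5]),
                             np.array([10.3, 1.8]), np.diag([0.3, 0.3]))
    print("fused track state:", state)
```

In this toy setup the trade-off described in the abstract is visible: low-level fusion keeps all raw measurements but requires tight extrinsic calibration and higher bandwidth, whereas track-to-track fusion exchanges only compact track states and covariances.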

Original language: English
Title of host publication: 2020 5th International Conference on Robotics and Automation Engineering, ICRAE 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 64-69
Number of pages: 6
ISBN (Electronic): 9781728189819
DOIs
Publication status: Published - 20 Nov 2020
Event: 5th International Conference on Robotics and Automation Engineering, ICRAE 2020 - Virtual, Singapore, Singapore
Duration: 20 Nov 2020 - 22 Nov 2020

Publication series

Name: 2020 5th International Conference on Robotics and Automation Engineering, ICRAE 2020

Conference

Conference: 5th International Conference on Robotics and Automation Engineering, ICRAE 2020
Country/Territory: Singapore
City: Virtual, Singapore
Period: 20/11/20 - 22/11/20

Bibliographical note

Publisher Copyright:
© 2020 IEEE.

Funding

The authors would like to thank all national funding authorities and the ECSEL Joint Undertaking, which funded the PRYSTINE project under grant agreement number 783190.

Funders and funder numbers:
• Horizon 2020 Framework Programme: 783190
• Electronic Components and Systems for European Leadership

Keywords

• Lidar point cloud processing
• Multi-lidar sensor fusion
• Object tracking
• Sensor fusion