Joint Sensor Calibration for LiDAR-Camera-IMU system for Mobile Mapping in Indoor and Urban Environments

Additional Remarks

Strong programming skills in Python and C++ are necessary. Knowledge of Linux, Docker, and sensor calibration techniques is useful. The multi-sensor backpack system provided by the laboratory comprises two Velodyne VLP-16 LiDARs, a Ladybug5+ HD panoramic camera, a Trimble BD990 GNSS receiver, and an Xsens MTi-630R IMU.

Topic description

Simultaneous Localization and Mapping (SLAM) techniques are critical for mobile mapping in indoor and urban environments. Accurate mapping in such environments, however, requires joint calibration of multiple sensors: LiDAR for precise geometry measurements, a camera for texture and semantics, and an IMU for handling fast rotations. This thesis aims to develop advanced SLAM techniques that incorporate joint sensor calibration for mobile mapping in indoor and urban environments, using a multi-sensor system provided by the laboratory.

Topic objectives and methodology

The objective of this thesis is to develop and implement advanced data-driven SLAM calibration techniques that register the sensors of a mobile mapping system to one another, namely a 360-degree wide-angle-lens camera system and two multi-line LiDARs. Such techniques enable semantic-metric sensor fusion, e.g., coloring the point cloud not only with image colors but also with image semantics, and are in constant demand in both academia and industry [1]. Additionally, this approach may address the intrinsic calibration issue of multi-line LiDARs [2], because camera semantics can offer, e.g., planar scene models that can be used to minimize point noise around these planes. Reference intrinsic calibration values for the LiDARs are available.
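As a minimal illustration of the semantic-metric fusion mentioned above, the sketch below projects LiDAR points into a camera image and attaches a per-pixel semantic label to each visible point. It assumes a simple pinhole camera model with a hypothetical intrinsic matrix `K` and a LiDAR-to-camera extrinsic transform `T_cam_lidar`; a panoramic system such as the Ladybug5+ would in practice require a per-lens or spherical projection model instead.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project LiDAR points into the image plane (pinhole model).

    points_lidar : (N, 3) xyz points in the LiDAR frame
    T_cam_lidar  : (4, 4) extrinsic transform from LiDAR to camera frame
    K            : (3, 3) camera intrinsic matrix
    Returns pixel coordinates (N, 2) and a mask of points in front of the camera.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0.0
    uv_h = (K @ pts_cam.T).T
    uv = uv_h[:, :2] / uv_h[:, 2:3]
    return uv, in_front

def colorize(points_lidar, T_cam_lidar, K, semantic_map):
    """Attach a per-pixel semantic label to every LiDAR point that falls
    inside the image; points outside the view get the label -1."""
    uv, in_front = project_points(points_lidar, T_cam_lidar, K)
    h, w = semantic_map.shape
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    labels = np.full(len(points_lidar), -1, dtype=int)
    labels[visible] = semantic_map[v[visible], u[visible]]
    return labels
```

The same projection is the inner loop of most LiDAR-camera extrinsic calibration pipelines: the extrinsics are the parameters being optimized, and the quality of the projection (e.g., how well projected points align with image edges or semantic boundaries) supplies the cost function.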

The student will implement the proposed methods in C++ and Python, using datasets provided by the laboratory or collected by the student. The performance of the new method will be evaluated and compared against existing techniques, with particular focus on the accuracy and efficiency of the joint sensor calibration. Additionally, the student will review the state of the art in relevant SLAM, sensor calibration, and sensor fusion techniques. The techniques developed and implemented by the student will form an integral part of the laboratory's research efforts in mobile mapping.
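One simple, commonly used metric for the calibration-accuracy evaluation described above is the point-to-plane residual on planar scene patches (walls, floors): a better-calibrated sensor produces a thinner point distribution around a fitted plane. The sketch below is a hedged example of that idea, not a prescribed evaluation protocol.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD. Returns a unit normal n and offset d
    such that n . x + d = 0 for points x lying on the plane."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                 # singular vector of the smallest singular value
    d = -n @ centroid
    return n, d

def point_to_plane_rmse(points, n, d):
    """RMS point-to-plane distance: a common proxy for calibration quality
    on planar patches (smaller is better)."""
    return float(np.sqrt(np.mean((points @ n + d) ** 2)))
```

Comparing this residual before and after calibration, on patches segmented from the data, gives a dataset-independent way to quantify the improvement of a candidate method over the existing techniques.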

References for further reading

[1]: Lehtola, V. V., Hyyti, H., & Malkamäki, T. (2022). Why it makes sense to use high cost sensors to do low cost sensor research. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 48, 137-143.

[2]: Bergelt, R., Khan, O., & Hardt, W. (2017, October). Improving the intrinsic calibration of a Velodyne LiDAR sensor. In 2017 IEEE SENSORS (pp. 1-3). IEEE.