Result: LiDAR Point Cloud Colourisation Using Multi-Camera Fusion and Low-Light Image Enhancement.
Highlights:
What are the main findings?
- We developed an end-to-end pipeline combining multi-camera fusion with automated, targetless LiDAR–camera calibration to achieve full 360° real-time colourisation.
- The added colour correction and low-light enhancement modules recovered scene details at illumination levels as low as 0.5 lx, comparable to well-lit conditions.
What is the implication of the main finding?
- The method provides a hardware-agnostic solution for reliable monitoring and mapping in low-light environments, including underground mines and night-time navigation.
- Deployment is simplified, and LiDAR interpretability is improved for applications such as autonomous navigation, geological surveys, and vegetation analysis.

In recent years, the fusion of camera data with LiDAR measurements has emerged as a powerful approach to enhancing spatial understanding. This study introduces a novel, hardware-agnostic methodology that generates colourised point clouds from a mechanical LiDAR using multiple camera inputs, providing complete 360-degree coverage. The primary innovation lies in its robustness under low-light conditions, achieved by integrating a low-light image enhancement module into the fusion pipeline. The system requires an initial calibration to determine the intrinsic camera parameters, followed by automatic computation of the geometric transformation between the LiDAR and the cameras, which removes the need for specialised calibration targets and streamlines the setup. The data processing framework applies colour correction to ensure uniformity across camera feeds before fusion. The algorithm was tested with a Velodyne Puck Hi-Res LiDAR and a four-camera configuration. The optimised software achieved real-time performance and reliable colourisation even under very low illumination, successfully recovering scene details that would otherwise remain undetectable. [ABSTRACT FROM AUTHOR]
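To make the colourisation step in the abstract concrete, the sketch below shows the generic operation such a pipeline relies on: projecting LiDAR points into a calibrated camera image with a pinhole model and sampling pixel colours. This is not the authors' implementation; the function name, and the intrinsic matrix K, rotation R, and translation t (the LiDAR-to-camera transform the paper estimates without calibration targets) are illustrative placeholders.

```python
# Minimal sketch of LiDAR point colourisation from one camera, assuming a
# pinhole camera model and a known LiDAR-to-camera transform (R, t).
# Lens distortion and the paper's colour-correction / low-light enhancement
# stages are assumed to have been handled upstream on the input image.
import numpy as np

def colourise_points(points_xyz, image, K, R, t):
    """Assign an RGB colour to each LiDAR point visible in one camera image.

    points_xyz : (N, 3) points in the LiDAR frame
    image      : (H, W, 3) uint8 camera image
    K          : (3, 3) camera intrinsic matrix
    R, t       : LiDAR-to-camera rotation (3, 3) and translation (3,)
    Returns (N, 3) uint8 colours and a boolean visibility mask.
    """
    # Transform points from the LiDAR frame into the camera frame.
    pts_cam = points_xyz @ R.T + t
    z = pts_cam[:, 2]
    in_front = z > 1e-6  # keep only points in front of the camera

    # Perspective projection onto the image plane (pinhole model).
    u = np.full_like(z, -1.0)
    v = np.full_like(z, -1.0)
    proj = pts_cam[in_front] @ K.T
    u[in_front] = proj[:, 0] / proj[:, 2]
    v[in_front] = proj[:, 1] / proj[:, 2]

    # Keep only projections that land inside the image bounds.
    h, w = image.shape[:2]
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Nearest-neighbour colour lookup for the visible points.
    colours = np.zeros((points_xyz.shape[0], 3), dtype=np.uint8)
    colours[visible] = image[v[visible].astype(int), u[visible].astype(int)]
    return colours, visible
```

In a multi-camera configuration like the four-camera setup described above, this projection would run once per camera with that camera's K, R, and t, and the per-camera results would be merged to cover the full 360° field of view.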