Uncertainty Modelling of Laser Scanning Point Clouds Using Machine-Learning Methods
- Authored by
- Jan Moritz Hartmann, Hamza Alkhatib
- Abstract
Terrestrial laser scanners (TLSs) are a standard method for 3D point cloud acquisition due to their high data rates and resolutions. In certain applications, such as deformation analysis, modelling the uncertainties in the 3D point cloud is crucial. This study models the systematic deviations of the laser scan distance measurement as a function of various influencing factors using machine-learning methods. A reference point cloud is recorded with a laser tracker (Leica AT 960) and a handheld scanner (Leica LAS-XL) to investigate the uncertainties of the Z+F Imager 5016 under laboratory conditions. From 49 TLS scans, a wide range of data are obtained, covering various influencing factors. The processes of data preparation, feature engineering, validation, regression, prediction, and result analysis are presented. The results of traditional machine-learning methods (multiple linear and nonlinear regression) are compared with those of eXtreme gradient boosted trees (XGBoost). It is demonstrated that the systematic deviations of the distance measurement can be modelled with a coefficient of determination of 0.73, making it possible to calibrate the distance measurement and thus improve the laser scan measurements. An independent TLS scan is used to demonstrate the calibration results.
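The workflow outlined in the abstract (features derived from influencing factors, a linear regression baseline compared against gradient-boosted trees, evaluation via the coefficient of determination) could be sketched as follows. This is a minimal illustration only: the synthetic data, the feature names (distance, incidence angle, intensity), and the hyperparameters are assumptions and do not reproduce the authors' actual pipeline or results.

```python
# Hedged sketch: compares a linear baseline with XGBoost for predicting
# systematic distance deviations from assumed influencing factors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

rng = np.random.default_rng(42)
n = 5000

# Synthetic influencing factors (placeholders, not the paper's feature set).
X = np.column_stack([
    rng.uniform(1.0, 60.0, n),   # distance [m]
    rng.uniform(0.0, 80.0, n),   # incidence angle [deg]
    rng.uniform(0.0, 1.0, n),    # normalised intensity
])

# Synthetic systematic distance deviation [mm] with a nonlinear dependence.
y = (0.02 * X[:, 0]
     + 0.0005 * X[:, 1] ** 2
     - 0.5 * X[:, 2]
     + rng.normal(0.0, 0.3, n))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Baseline: multiple linear regression.
lin = LinearRegression().fit(X_train, y_train)

# Nonlinear model: gradient-boosted trees (XGBoost).
xgb = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
xgb.fit(X_train, y_train)

print("R² linear :", r2_score(y_test, lin.predict(X_test)))
print("R² XGBoost:", r2_score(y_test, xgb.predict(X_test)))

# The predicted deviation could then be subtracted from each raw distance
# measurement to obtain a calibrated point cloud, as described in the abstract.
```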
- Organisation(s)
- Geodetic Institute
- Type
- Article
- Journal
- Remote Sensing
- Volume
- 15
- ISSN
- 2072-4292
- Publication date
- 29.04.2023
- Publication status
- Published
- Peer reviewed
- Yes
- ASJC Scopus subject areas
- Earth and Planetary Sciences (all)
- Electronic version(s)
- https://doi.org/10.3390/rs15092349 (Access: Open)