
A Mobile Multi-Sensor Platform for Building Reconstruction Integrating Terrestrial and Autonomous UAV-Based Close Range Data Acquisition

authored by
A. Cefalu, N. Haala, S. Schmohl, I. Neumann, T. Genz
Abstract

Photogrammetric data capture of complex 3D objects using UAV imagery has become commonplace. Software tools based on algorithms like Structure-from-Motion and multi-view stereo image matching enable the fully automatic generation of densely meshed 3D point clouds. In contrast, planning a suitable image network usually requires considerable effort from a human expert, since this step directly influences the precision and completeness of the resulting point cloud. Planning suitable camera stations can be rather complex, in particular for objects like buildings, bridges and monuments, which frequently feature strong depth variations that must be captured with high-resolution images at short distances. In this paper, we present an automatic flight mission planning tool that generates flight lines aiming at camera configurations which maintain a roughly constant object distance, provide sufficient image overlap and avoid unnecessary stations. Planning is based on a coarse Digital Surface Model and an approximate building outline. As a proof of concept, we use the tool within our research project MoVEQuaD, which aims at the reconstruction of building geometry at sub-centimetre accuracy.
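The relationship between forward overlap, object distance and camera station spacing that such a planning tool has to respect can be illustrated with a simple pinhole camera model. The following Python sketch is an assumption-based illustration, not the authors' planner; all function names and numeric values are chosen for demonstration only.

# Illustrative sketch (hypothetical, not the paper's implementation):
# derive the spacing of camera stations along a flight line from a target
# forward overlap at a constant object distance, using a pinhole camera model.

def image_footprint(sensor_size_m: float, focal_length_m: float,
                    object_distance_m: float) -> float:
    """Extent of the imaged object patch in one direction, in metres."""
    return sensor_size_m * object_distance_m / focal_length_m


def station_spacing(sensor_size_m: float, focal_length_m: float,
                    object_distance_m: float, overlap: float) -> float:
    """Distance between consecutive camera stations for a given overlap (0..1)."""
    footprint = image_footprint(sensor_size_m, focal_length_m, object_distance_m)
    return (1.0 - overlap) * footprint


if __name__ == "__main__":
    # Assumed values: 36 mm sensor width, 35 mm focal length,
    # 15 m object distance, 80 % forward overlap.
    spacing = station_spacing(0.036, 0.035, 15.0, 0.80)
    print(f"One camera station roughly every {spacing:.2f} m along the flight line")

Under these assumptions, keeping the object distance roughly constant also keeps the ground sampling distance, image footprint and required station spacing constant along the object, which is why strong depth variations call for adapted flight lines rather than a fixed grid of camera stations.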

Organisation(s)
Geodetic Institute
External Organisation(s)
University of Stuttgart
Geo-Office Gesellschaft für graphische Datenverarbeitung und Vermessung mbH
Type
Conference article
Journal
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
Volume
42
Pages
63-70
No. of pages
8
ISSN
1682-1750
Publication date
23.08.2017
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Geography, Planning and Development, Information Systems
Electronic version(s)
https://doi.org/10.5194/isprs-archives-XLII-2-W6-63-2017 (Access: Open)
 
