A Mobile Multi-Sensor Platform For Building Reconstruction Integrating Terrestrial And Autonomous UAV-Based Close Range Data Acquisition

Authored by
A. Cefalu, N. Haala, S. Schmohl, I. Neumann, T. Genz
Abstract

Photogrammetric data capture of complex 3D objects using UAV imagery has become commonplace. Software tools based on algorithms such as Structure-from-Motion and multi-view stereo image matching enable the fully automatic generation of densely meshed 3D point clouds. In contrast, planning a suitable image network usually requires considerable effort from a human expert, since this step directly influences the precision and completeness of the resulting point cloud. Planning suitable camera stations can be rather complex, in particular for objects such as buildings, bridges and monuments, which frequently feature strong depth variations that must be acquired with high-resolution images at short distance. In this paper, we present an automatic flight mission planning tool that generates flight lines aiming at camera configurations which maintain a roughly constant object distance, provide sufficient image overlap and avoid unnecessary camera stations. Planning is based on a coarse Digital Surface Model and an approximate building outline. As a proof of concept, we use the tool within our research project MoVEQuaD, which aims at the reconstruction of building geometry at sub-centimetre accuracy.
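To illustrate the kind of spacing calculation such a planning step involves, the following Python sketch derives a regular grid of camera stations for a planar façade at a constant object distance from assumed camera parameters and overlap ratios. It is an illustrative, self-contained example only, not the planning tool described in the paper; all function names and the numerical values (focal length, sensor size, distance, overlaps) are hypothetical.

```python
# Minimal sketch (not the authors' implementation): camera stations covering a
# planar facade at constant object distance, spaced to satisfy given overlaps.
import math

def image_footprint(focal_mm, sensor_w_mm, sensor_h_mm, distance_m):
    """Width and height (m) of the object-space footprint of a single image."""
    return (distance_m * sensor_w_mm / focal_mm,
            distance_m * sensor_h_mm / focal_mm)

def station_spacing(footprint_m, overlap):
    """Spacing between neighbouring camera stations for a given overlap ratio."""
    return footprint_m * (1.0 - overlap)

def facade_stations(facade_length_m, facade_height_m,
                    focal_mm=15.0, sensor_w_mm=17.3, sensor_h_mm=13.0,
                    distance_m=10.0, side_overlap=0.8, vert_overlap=0.6):
    """Grid of (x, z, distance) camera stations in facade-aligned coordinates."""
    w, h = image_footprint(focal_mm, sensor_w_mm, sensor_h_mm, distance_m)
    dx = station_spacing(w, side_overlap)   # horizontal spacing along the facade
    dz = station_spacing(h, vert_overlap)   # vertical spacing between flight lines
    nx = math.ceil(facade_length_m / dx) + 1
    nz = math.ceil(facade_height_m / dz) + 1
    return [(i * dx, j * dz, distance_m) for j in range(nz) for i in range(nx)]

if __name__ == "__main__":
    stations = facade_stations(30.0, 12.0)
    print(f"{len(stations)} camera stations planned")
```

In this toy setup, halving the object distance halves the image footprint and therefore roughly doubles the number of stations per flight line, which is why keeping the distance approximately constant along a facade with strong depth variation matters for both completeness and effort.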

Organisational unit(s)
Geodätisches Institut
External organisation(s)
Universität Stuttgart
Geo-Office Gesellschaft für graphische Datenverarbeitung und Vermessung mbH
Type
Conference paper in journal
Journal
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
Volume
42
Pages
63-70
Number of pages
8
ISSN
1682-1750
Publication date
23.08.2017
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Geography, Planning and Development; Information Systems
Electronic version(s)
https://doi.org/10.5194/isprs-archives-XLII-2-W6-63-2017 (Access: Open)

Details in the research portal "Research@Leibniz University"