Damage Detection for Port Infrastructure by Means of Machine-Learning-Algorithms

Written by
Frederic Hake, Matthias Hermann, Hamza Alkhatib, Christian Hesse, Karsten Holste, Georg Umlauf, Gael Kermarrec, Ingo Neumann
Abstract

The ageing infrastructure in ports requires regular inspection. Currently, this inspection is carried out manually by divers who sense the entire underwater infrastructure by hand. This process is cost-intensive, as it requires a great deal of time and human resources. To overcome these difficulties, we propose to scan the above- and underwater port structure with a multi-sensor system and, in a fully automated process, to classify the resulting point cloud into damaged and undamaged zones. Since not enough training data with corresponding class labels are available yet, we use simulated training data to test our approach. To that end, we build a rasterised heightfield from a point cloud of a sheet pile wall by cutting it into vertical slices. The distance from each slice to the corresponding line generates the heightfield, which is propagated through a convolutional neural network that detects anomalies. We use the VGG19 deep neural network model pretrained on natural images; this network has 19 layers and is often used for image recognition tasks. We showed that our approach achieves a fully automated, reproducible, quality-controlled damage detection that analyses the whole structure, in contrast to the sample-wise manual method with divers. The mean true positive rate is 0.98, which means that we detected 98 % of the damages in the simulated environment.
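To illustrate the preprocessing step described above, the following is a minimal, self-contained sketch of how a vertical slice of a sheet-pile-wall point cloud could be turned into a rasterised heightfield: a reference line is fitted to the slice's horizontal footprint, and the signed point-to-line distances are averaged into a grid over the wall plane. All function names, grid sizes, and the least-squares line fit are illustrative assumptions, not the authors' implementation.

```python
import math

def fit_line_2d(xy):
    # Illustrative least-squares fit of the reference line y = a*x + b
    # through the slice's horizontal footprint (x along wall, y depth).
    n = len(xy)
    sx = sum(p[0] for p in xy)
    sy = sum(p[1] for p in xy)
    sxx = sum(p[0] * p[0] for p in xy)
    sxy = sum(p[0] * p[1] for p in xy)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def point_line_distance(x, y, a, b):
    # Signed perpendicular distance from (x, y) to the line y = a*x + b.
    return (y - (a * x + b)) / math.sqrt(1.0 + a * a)

def rasterise_slice(points, nx=16, nz=16):
    # points: list of (x, y, z) with x along the wall, y the depth
    # direction, z the elevation. Returns an nz-by-nx heightfield where
    # each cell holds the mean distance of its points to the fitted line;
    # damaged zones (dents, holes) show up as cells with large values.
    a, b = fit_line_2d([(p[0], p[1]) for p in points])
    xs = [p[0] for p in points]
    zs = [p[2] for p in points]
    x0, x1 = min(xs), max(xs)
    z0, z1 = min(zs), max(zs)
    grid = [[0.0] * nx for _ in range(nz)]
    counts = [[0] * nx for _ in range(nz)]
    for x, y, z in points:
        i = min(int((z - z0) / (z1 - z0 + 1e-9) * nz), nz - 1)
        j = min(int((x - x0) / (x1 - x0 + 1e-9) * nx), nx - 1)
        grid[i][j] += point_line_distance(x, y, a, b)
        counts[i][j] += 1
    for i in range(nz):
        for j in range(nx):
            if counts[i][j]:
                grid[i][j] /= counts[i][j]
    return grid
```

The resulting 2D grid can be treated as a greyscale image and fed to an image-classification network such as the pretrained VGG19 mentioned in the abstract.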

Organisational unit(s)
Geodätisches Institut
External organisation(s)
Hochschule Konstanz Technik, Wirtschaft und Gestaltung (HTWG)
Dr. Hesse und Partner Ingenieure
WKC Hamburg GmbH
Type
Paper
Publication date
2020
Publication status
Published electronically (E-Pub)
Electronic version(s)
https://www.fig.net/resources/proceedings/fig_proceedings/fig2020/papers/ts08b/TS08B_hake_alkhatib_et_al_10441.pdf (Access: Open)

Details in the research portal "Research@Leibniz University"