Sensor Synergy in Bathymetric Mapping: Integrating Optical, LiDAR, and Echosounder Data Using Machine Learning

Emre Gülher, Ugur Alganci*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Bathymetry, the measurement of water depth and underwater terrain, is vital for scientific, commercial, and environmental applications. Traditional methods like shipborne echosounders are costly and inefficient in shallow waters due to limited spatial coverage and accessibility. Emerging technologies such as satellite imagery, drones, and spaceborne LiDAR offer cost-effective and efficient alternatives. This research explores integrating multi-sensor datasets to enhance bathymetric mapping in coastal and inland waters by leveraging each sensor’s strengths. The goal is to improve spatial coverage, resolution, and accuracy over traditional methods using data fusion and machine learning. Gülbahçe Bay in İzmir, Turkey, serves as the study area. Bathymetric modeling uses Sentinel-2, Göktürk-1, and aerial imagery with varying resolutions and sensor characteristics. Model calibration evaluates independent and integrated use of single-beam echosounder (SBE) and satellite-based LiDAR (ICESat-2) during training. After preprocessing, Random Forest and Extreme Gradient Boosting algorithms are applied for bathymetric inference. Results are assessed using accuracy metrics and IHO CATZOC standards, achieving A1 level for 0–10 m, A2/B for 0–15 m, and C level for 0–20 m depth intervals.
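The abstract describes calibrating a Random Forest regressor on optical imagery against reference depths (SBE soundings or ICESat-2 photon depths). A minimal sketch of that regression step is below, using scikit-learn; the synthetic reflectance/depth data, variable names, and hyperparameters are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of the satellite-derived bathymetry regression step:
# per-pixel band reflectances (features) calibrated against reference depths
# (e.g. single-beam echosounder or ICESat-2) with a Random Forest regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 1000
# Synthetic stand-in for blue/green/red surface reflectance (Sentinel-2-like)
reflectance = rng.uniform(0.01, 0.15, size=(n, 3))
# Toy depth relationship: shallower water -> brighter green band, plus noise
depth = 20.0 * (0.15 - reflectance[:, 1]) / 0.14 + rng.normal(0, 0.5, n)

X_train, X_test, y_train, y_test = train_test_split(
    reflectance, depth, test_size=0.3, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Accuracy assessment on held-out reference depths (RMSE in metres)
rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print(f"RMSE: {rmse:.2f} m")
```

In practice the feature vector would also carry band ratios and ancillary layers, and the same fit/evaluate loop would be repeated with XGBoost for comparison, as the study does.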

Original language: English
Article number: 2912
Journal: Remote Sensing
Volume: 17
Issue number: 16
DOIs
Publication status: Published - Aug 2025

Bibliographical note

Publisher Copyright:
© 2025 by the authors.

Keywords

  • Göktürk-1
  • ICESat-2
  • Sentinel-2
  • bathymetry
  • machine learning
  • multi-sensor integration

