Evaluation of Depth Anything Models for Satellite-Derived Bathymetry

Esra Günaydın, Irem Yakar, Tolga Bakırman, Mahmut Oguz Selbesoglu

Research output: Contribution to journal › Conference article › peer-review

Abstract

The emergence of foundation models has driven major advancements in computer vision and natural language processing, primarily due to their strong zero-shot and few-shot capabilities powered by large-scale, diverse datasets. While earlier approaches relied on supervised datasets, their limited scene diversity meant the resulting models performed poorly in unseen environments. To overcome these limitations, recent works have leveraged unlabeled monocular images, which can be automatically labeled using pre-trained models. One such model is Depth Anything, which demonstrated robust zero-shot performance across diverse scenarios, with Depth Anything V2 further improving accuracy. In this study, the performance of the Depth Anything V1 and V2 models was evaluated for satellite-derived bathymetry using Sentinel-2 satellite imagery. The accuracy of the predicted depth maps was evaluated by comparing them with bathymetric data obtained from the National Oceanic and Atmospheric Administration's (NOAA) National Centers for Environmental Information (NCEI) as the ground truth. The results show that the correlation between Depth Anything V1 predictions and NOAA NCEI data was 56.69%, while the correlation for Depth Anything V2 reached 84.54%. The predicted depth maps were also scaled to metric depths to compute the Root Mean Square Error (RMSE) and Mean Absolute Error (MAE). The RMSE and MAE values for Depth Anything V1 are 0.4135 m and 0.34 m, respectively, while the RMSE and MAE values for V2 are 0.2681 m and 0.2089 m. This improvement shows the capability of Depth Anything V2 in estimating underwater terrain from monocular satellite imagery and demonstrates its potential for cost-effective bathymetric mapping in remote sensing applications. In addition to the deep learning-based approaches applied in the test area, a satellite-derived depth map was also generated using the classical band ratio method. Compared with the reference bathymetric data, its correlation coefficient, RMSE, and MAE were 38.20%, 0.4639 m, and 0.3746 m, respectively.
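The abstract does not spell out how the monocular predictions (which are relative, not metric) were scaled before computing RMSE and MAE, nor which band ratio formulation was used. The sketch below illustrates one common way such an evaluation could be set up, assuming a linear least-squares fit of relative depth to reference depths and a Stumpf-style log band ratio; all function names, the synthetic data, and the calibration constants are illustrative, not taken from the paper.

```python
import numpy as np

def scale_to_metric(pred_rel, ref_depth):
    """Fit a linear scale and offset (least squares) mapping relative
    model depths to metric depths from the reference grid (assumption:
    the paper's scaling step is not specified in the abstract)."""
    A = np.column_stack([pred_rel, np.ones_like(pred_rel)])
    (a, b), *_ = np.linalg.lstsq(A, ref_depth, rcond=None)
    return a * pred_rel + b

def evaluate(pred_rel, ref_depth):
    """Return correlation (percent), RMSE (m), and MAE (m) between the
    scaled predictions and the reference bathymetry."""
    pred_m = scale_to_metric(pred_rel, ref_depth)
    corr = np.corrcoef(pred_m, ref_depth)[0, 1] * 100.0
    rmse = np.sqrt(np.mean((pred_m - ref_depth) ** 2))
    mae = np.mean(np.abs(pred_m - ref_depth))
    return corr, rmse, mae

def band_ratio_depth(blue, green, m1=1.0, m0=0.0, n=1000.0):
    """Stumpf-style log band-ratio depth; m1, m0, and n would normally
    be calibrated against reference depths (hypothetical defaults)."""
    return m1 * (np.log(n * blue) / np.log(n * green)) - m0

if __name__ == "__main__":
    # Synthetic co-located samples standing in for Sentinel-2 pixels and
    # NOAA NCEI reference depths (illustrative only, not the paper's data).
    rng = np.random.default_rng(0)
    ref = rng.uniform(0.5, 10.0, 500)                # reference depths (m)
    pred = 0.1 * ref + 0.05 * rng.normal(size=500)   # unscaled model output
    print(evaluate(pred, ref))
```

In practice the predicted depth map and the reference grid would first be co-registered and masked to water pixels before being flattened into the paired samples used above.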

Original language: English
Pages (from-to): 101-106
Number of pages: 6
Journal: International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
Volume: 48
Issue number: 2/W10-2025
Publication status: Published - 7 Jul 2025
Externally published: Yes
Event: 3rd International Workshop on 3D Underwater Mapping from Above and Below - Vienna, Austria
Duration: 8 Jul 2025 - 11 Jul 2025

Bibliographical note

Publisher Copyright:
© 2025 Esra Günaydın et al.

Keywords

  • Bathymetry
  • Deep Learning
  • Depth Anything
  • Monocular Depth Estimation
  • Satellite Imagery
