VHRTrees: a new benchmark dataset for tree detection in satellite imagery and performance evaluation with YOLO-based models

Şule Nur Topgül, Elif Sertel*, Samet Aksoy, Cem Ünsalan, Johan E.S. Fransson*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Natural and planted forests, covering approximately 31% of the Earth’s land area, are crucial for global ecosystems, providing essential services such as water cycle regulation, soil conservation, carbon storage, and biodiversity preservation. However, traditional forest mapping and monitoring methods are often costly and limited in scale, highlighting the need for innovative tree detection approaches that can enhance forest management. In this study, we present a new dataset for tree detection, VHRTrees, derived from very high-resolution RGB satellite images. The dataset includes approximately 26,000 tree boundaries derived from 1,496 image patches covering different geographical regions and representing various topographic and climatic conditions. We implemented various object detection algorithms to evaluate the performance of different methods, propose the best experimental configurations, and provide a benchmark analysis for further studies. We conducted our experiments with different variants and hyperparameter settings of the YOLOv5, YOLOv7, YOLOv8, and YOLOv9 models. Results from extensive experiments indicate that increasing network resolution and batch size leads to higher precision and recall in tree detection. YOLOv8m, trained with the Auto optimizer setting, achieved the highest F1-score (0.932) and mean Average Precision (mAP) at an Intersection over Union (IoU) threshold of 0.50 (0.934), although some other configurations showed higher mAP@0.50:0.95. These findings underscore the effectiveness of You Only Look Once (YOLO)-based object detection algorithms for real-time forest monitoring applications, offering a cost-effective and accurate solution for tree detection using RGB satellite imagery. The VHRTrees dataset, related source code, and pretrained models are available at https://github.com/RSandAI/VHRTrees.
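The experiments summarized above follow a standard YOLO training workflow. As a minimal, hedged sketch (not the authors' exact configuration), the snippet below shows how a YOLOv8m model could be fine-tuned on a VHRTrees-style dataset with the Ultralytics API, exposing the input resolution, batch size, and Auto optimizer setting mentioned in the abstract; the dataset YAML path and the specific hyperparameter values are assumptions for illustration only.

```python
# Hypothetical sketch: fine-tuning YOLOv8m on a VHRTrees-style dataset with the
# Ultralytics API. Paths and hyperparameter values are assumptions, not the
# exact configuration reported in the paper.
from ultralytics import YOLO

# Start from COCO-pretrained YOLOv8m weights.
model = YOLO("yolov8m.pt")

# Train with a chosen network resolution and batch size; "auto" lets the
# framework select the optimizer and learning-rate schedule.
model.train(
    data="vhrtrees.yaml",  # assumed dataset config (train/val paths, single "tree" class)
    imgsz=640,             # network input resolution
    batch=16,              # batch size
    epochs=100,
    optimizer="auto",
)

# Evaluate on the validation split to obtain precision, recall,
# mAP@0.50 and mAP@0.50:0.95.
metrics = model.val()
print(metrics.box.map50, metrics.box.map)
```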

Original language: English
Article number: 1495544
Journal: Frontiers in Forests and Global Change
Volume: 7
DOIs
Publication status: Published - 2024

Bibliographical note

Publisher Copyright:
Copyright © 2025 Topgül, Sertel, Aksoy, Ünsalan and Fransson.

Keywords

  • Google Earth imagery
  • VHRTrees
  • YOLO
  • artificial intelligence
  • deep learning
  • forest management
  • optical satellite data
  • tree detection
