A Novel Convolutional Autoencoder-Based Clutter Removal Method for Buried Threat Detection in Ground-Penetrating Radar

Eyyup Temlioglu*, Isin Erer

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

45 Citations (Scopus)

Abstract

The clutter encountered in ground-penetrating radar (GPR) systems severely degrades the performance of subsurface target detection methods. A new clutter removal method based on convolutional autoencoders (CAEs) is introduced. The raw GPR image is encoded through successive convolution and pooling layers and then decoded to yield the clutter-free GPR image. The loss function is defined between the decoder output and a reference clutter-free target image, and the network weights are learned from the raw data by minimizing this loss. The method is compared with conventional subspace methods, the recently proposed nonnegative matrix factorization, low-rank and sparse decomposition (LRSD) methods, and dictionary-separation-based morphological component analysis. The CAE and its deeper variant, the deep CAE (DCAE), are trained on several scenarios generated by the electromagnetic simulation tool gprMax. Simulation results demonstrate the effectiveness of the proposed method in challenging scenarios. For real GPR images, the networks trained on simulated data fall slightly behind the LRSD methods in the dry case; nonetheless, they outperform all the aforementioned techniques in the more challenging wet case.
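The abstract describes an encoder-decoder pipeline: convolution and pooling compress the raw B-scan, a decoder reconstructs a clutter-suppressed image, and training minimizes a loss against a clutter-free reference (e.g., one simulated with gprMax). The following is a minimal PyTorch-style sketch of such a CAE; the layer widths, depth, optimizer settings, and placeholder tensors are illustrative assumptions, not the authors' exact architecture.

import torch
import torch.nn as nn

class ClutterRemovalCAE(nn.Module):
    """Minimal CAE sketch for GPR clutter removal (illustrative, not the paper's design)."""

    def __init__(self):
        super().__init__()
        # Encoder: successive convolution + pooling layers compress the raw B-scan.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # downsample by 2
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # downsample by 2 again
        )
        # Decoder: transposed convolutions upsample back to a clutter-suppressed image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=2, stride=2),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training sketch: the loss compares the decoder output with a reference
# clutter-free target image, as the abstract describes.
model = ClutterRemovalCAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

raw = torch.randn(8, 1, 64, 64)     # batch of raw GPR B-scans (placeholder data)
target = torch.randn(8, 1, 64, 64)  # clutter-free references (placeholder data)

for _ in range(10):
    optimizer.zero_grad()
    loss = criterion(model(raw), target)
    loss.backward()
    optimizer.step()

In this sketch the deeper DCAE variant would simply stack more convolution/pooling stages in the encoder (with matching transposed-convolution stages in the decoder), trading resolution for a more compact latent representation.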

Original language: English
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 60
DOIs
Publication status: Published - 2022

Bibliographical note

Publisher Copyright:
© 1980-2012 IEEE.

Keywords

  • Clutter removal
  • convolutional autoencoders (CAEs)
  • gprMax
  • ground-penetrating radar (GPR)
  • image decomposition
