CO-POLAR SAR data classification as a tool for real time paddy-rice monitoring

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Citations (Scopus)

Abstract

Crop phenology retrieval for precision agriculture has become an important research area as the demand for crops increases. Remotely sensed Synthetic Aperture Radar (SAR) data offer a straightforward means of automatically monitoring agricultural fields owing to their inherent all-weather imaging capability. Most studies rely on morphology-based modelling of the electromagnetic backscattering, which requires Monte Carlo simulations. In this paper, instead of modelling the backscattered signals to monitor the crop fields, a classification scheme is applied to data acquired by TerraSAR-X, using features extracted from the backscattering coefficients together with three machine learning algorithms: Support Vector Machines, k-Nearest Neighbor, and regression trees.
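
To make the classification step concrete, the following is a minimal, hypothetical sketch of this kind of pipeline in Python with scikit-learn: per-sample features derived from co-polar (HH/VV) backscattering coefficients are fed to the three classifier families named in the abstract. The feature matrix, labels, and hyperparameters below are placeholders, not the paper's actual data, feature extraction, or evaluation protocol.

```python
# Sketch only: paddy-rice phenology classification from co-polar SAR
# backscatter features with SVM, k-NN, and a decision/regression tree.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical inputs: X holds per-sample features extracted from HH/VV
# backscattering coefficients (e.g. mean sigma0 in dB per field and date);
# y holds phenology-stage labels (e.g. 0 = flooding, 1 = growth, 2 = heading).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))       # placeholder feature matrix
y = rng.integers(0, 3, size=300)    # placeholder stage labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

classifiers = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Tree": DecisionTreeClassifier(max_depth=5, random_state=0),
}

# Train each classifier on the same features and compare test accuracy.
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: test accuracy {acc:.2f}")
```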

Original language: English
Title of host publication: 2015 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2015 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4141-4144
Number of pages: 4
ISBN (Electronic): 9781479979295
DOIs
Publication status: Published - 10 Nov 2015
Event: IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2015 - Milan, Italy
Duration: 26 Jul 2015 - 31 Jul 2015

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)
Volume: 2015-November

Conference

Conference: IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2015
Country/Territory: Italy
City: Milan
Period: 26/07/15 - 31/07/15

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

Keywords

  • classification
  • machine learning
  • precision agriculture
  • synthetic aperture radar (SAR)
