The Comparison of Activation Functions for Multispectral Landsat TM Image Classification

Coşkun Özkan*, Filiz Sunar Erbek

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

60 Citations (Scopus)

Abstract

Neural networks, recently applied to a number of image classification problems, are computational systems consisting of neurons or nodes arranged in layers with interconnecting links. Although there is a wide range of network types and possible applications in remote sensing, most attention has focused on the multilayer perceptron (MLP), or feedforward (FF), network trained with a backpropagation learning algorithm for supervised classification. One of the main characteristic elements of an artificial neural network (ANN) is the activation function. Nonlinear (logistic sigmoid and hyperbolic tangent) and linear activation functions have been used effectively with MLP networks for various purposes. The main objective of this study is to compare the logistic sigmoid, hyperbolic tangent, and linear activation functions in one- and two-hidden-layer MLP network structures trained with the scaled conjugate gradient learning algorithm, and to evaluate their performance on a multispectral Landsat TM image classification problem.
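
As a brief illustration, not taken from the paper itself, the three activation functions under comparison can be written down using their standard definitions; the Python/NumPy sketch below, including the function names and example net-input values, is an assumption for illustration only.

import numpy as np

def logistic_sigmoid(x):
    # Logistic sigmoid: maps any real net input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def hyperbolic_tangent(x):
    # Hyperbolic tangent: maps any real net input into (-1, 1)
    return np.tanh(x)

def linear(x):
    # Linear (identity) activation: output equals the net input
    return x

# Apply each candidate activation to the same hidden-node net inputs
net_input = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for f in (logistic_sigmoid, hyperbolic_tangent, linear):
    print(f.__name__, f(net_input))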

Original language: English
Pages (from-to): 1225-1234
Number of pages: 10
Journal: Photogrammetric Engineering and Remote Sensing
Volume: 69
Issue number: 11
Publication status: Published - Nov 2003

