TY - JOUR
T1 - Dimension reduction by a novel unified scheme using divergence analysis and genetic search
AU - Korürek, Mehmet
AU - Yüksel, Ayhan
AU - Dokur, Zümray
AU - Ölmez, Tamer
PY - 2010/12
Y1 - 2010/12
N2 - In this study, a unified scheme using divergence analysis and genetic search is proposed to determine the significant components of feature vectors in high-dimensional spaces without encountering singular matrix problems. Three main problems are observed in the literature for feature selection performed in a high-dimensional space: high computational load, local minima, and singular matrices. In this study, feature selection is realized by increasing the dimension one by one rather than by reducing it. To this end, recursive covariance matrices are formulated to decrease the computational load, and genetic algorithms are employed to avoid local minima and singular matrix problems in high-dimensional feature spaces. Candidate strings in the genetic pool represent the new features formed by increasing the dimension, and the genetic search seeks the combination of features that gives the highest divergence value. Two methods are proposed for feature selection. In the first, features in a high-dimensional space are determined by using divergence analysis and genetic search (DAGS) together. If the dimension is not high, the second method, which uses only recursive divergence analysis (RDA) without any genetic search, is offered. In Section 3, two experiments are presented: feature determination in a two-dimensional phantom feature space, and feature determination for ECG beat classification on real data.
KW - Dimension reduction
KW - Divergence analysis
KW - Genetic algorithms
KW - Recursive covariance matrix
KW - Recursive inverse matrix
UR - http://www.scopus.com/inward/record.url?scp=77955413169&partnerID=8YFLogxK
U2 - 10.1016/j.dsp.2009.09.001
DO - 10.1016/j.dsp.2009.09.001
M3 - Article
AN - SCOPUS:77955413169
SN - 1051-2004
VL - 20
SP - 1535
EP - 1546
JO - Digital Signal Processing: A Review Journal
JF - Digital Signal Processing: A Review Journal
IS - 6
ER -