Performance Evaluation of Exponential Discriminant Analysis with Feature Selection for Steganalysis
The performance of supervised learning-based steganalysis depends on the choice of both the classifier and the features that represent the image. Features extracted from images may contain irrelevant and redundant components, which make them inefficient for machine learning. Relevant features not only decrease the processing time to train a classifier but also provide better generalisation. The linear discriminant classifier, commonly used for classification, may not classify non-linearly separable data well. Recently, exponential discriminant analysis (EDA), a variant of linear discriminant analysis (LDA), has been proposed; it transforms the scatter matrices to a new space by distance diffusion mapping. This gives EDA much greater discriminant power on non-linearly separable data and helps improve classification accuracy in comparison to LDA. In this paper, the performance of EDA in conjunction with feature selection methods has been investigated. For feature selection, Kullback divergence, Chernoff distance, and linear regression measures are used to determine relevant features from the higher-order statistics of images. The performance is evaluated in terms of classification error and computation time. Experimental results show that exponential discriminant analysis in conjunction with linear regression performs significantly better in terms of both classification error and the computation time of training the classifier.
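The core idea behind EDA, as described above, is to replace the between- and within-class scatter matrices of LDA with their matrix exponentials before solving the Fisher criterion, so that the discriminant directions are computed in the diffusion-mapped space. The following is a minimal sketch of that idea, not the authors' implementation; the function name, the per-sample scaling of the scatter matrices (used here to keep the exponentials numerically tame), and the choice of `scipy.linalg` routines are all assumptions:

```python
import numpy as np
from scipy.linalg import expm, eigh

def eda_projection(X, y, n_components=1):
    """Sketch of exponential discriminant analysis (EDA): build the usual
    LDA scatter matrices, exponentiate them, and solve the generalised
    eigenproblem expm(Sb) w = lambda expm(Sw) w for the top directions."""
    classes = np.unique(y)
    n, d = X.shape
    mean = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Scale by the sample count so the matrix exponentials stay in a
    # numerically safe range (an assumed practical choice, not from the paper).
    Sw /= n
    Sb /= n
    # expm of a symmetric matrix is symmetric positive definite, so the
    # exponentiated scatter matrices are always invertible -- one of the
    # advantages EDA has over plain LDA on small-sample data.
    evals, evecs = eigh(expm(Sb), expm(Sw))
    order = np.argsort(evals)[::-1]  # largest eigenvalues are most discriminant
    return evecs[:, order[:n_components]]
```

Projecting the selected image features onto the returned directions (`X @ W`) would then feed the one-dimensional discriminant scores to the final classification step.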
Defence Science Journal, 2012, 62(1), pp. 19-24, DOI: http://dx.doi.org/10.14429/dsj.62.1437
Except where otherwise noted, the articles on this site are licensed under the Creative Commons License: CC Attribution-Noncommercial-No Derivative Works 2.5 India