Feature selection (FS) has become an indispensable task in dealing with today's highly complex pattern recognition problems involving massive numbers of features. In this study, we propose a new wrapper approach for FS based on binary simultaneous perturbation stochastic approximation (BSPSA). This pseudo-gradient descent stochastic algorithm starts with an initial feature vector and moves toward the optimal feature vector via successive iterations. In each iteration, the current feature vector's individual components are perturbed simultaneously by offsets drawn from a qualified probability distribution. We present computational experiments on datasets with numbers of features ranging from a few dozen to thousands, using three widely used classifiers as wrappers: nearest neighbor, decision tree, and linear support vector machine. We compare our methodology against the full set of features as well as a binary genetic algorithm and sequential FS methods, using cross-validated classification error rate and AUC as the performance criteria. Our results indicate that features selected by BSPSA compare favorably to alternative methods in general, and that BSPSA can yield superior feature sets for datasets with tens of thousands of features while examining only an extremely small fraction of the solution space. We are not aware of any other wrapper FS method that is computationally feasible, with good convergence properties, for such large datasets.