Institute of Electrical and Electronics Engineers (IEEE)
Abstract
In this paper, we introduce a new classification algorithm called the optimization of distribution differences (ODD). The algorithm aims to find a transformation from the feature space to a new space in which instances of the same class are as close as possible to one another, while the gravity centers of the classes are as far as possible from one another. This aim is formulated as a multiobjective optimization problem that is solved by a hybrid of an evolutionary strategy and the quasi-Newton method. The choice of the transformation function is flexible and can be any continuous function; in this paper, we experiment with a linear and a nonlinear transformation. We show that the algorithm can outperform eight other classification methods, namely naive Bayes, support vector machines, linear discriminant analysis, multilayer perceptrons, decision trees, and k-nearest neighbors, as well as two recently proposed classification methods, on 12 standard classification data sets. Our results show that the method is less sensitive to an imbalanced number of instances than these methods. We also show that ODD maintains its performance better than the other classification methods on these data sets and hence offers better generalization ability.
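To make the two competing objectives concrete, the sketch below evaluates them for a linear transformation Z = XW: within-class compactness (to be minimized) and the separation of class gravity centers (to be maximized). This is an illustrative assumption of how the objectives could be computed, not the paper's exact formulation; the function name `odd_objectives` and the specific distance measures are hypothetical.

```python
import numpy as np

def odd_objectives(W, X, y):
    """Evaluate two assumed ODD-style objectives for a linear map Z = X @ W.

    Returns (within, between): mean squared distance of instances to their
    class gravity center (minimize), and mean squared distance between
    pairs of class centers (maximize). Illustrative sketch only.
    """
    Z = X @ W                                   # transform to the new space
    classes = np.unique(y)
    centers = np.array([Z[y == c].mean(axis=0) for c in classes])

    # Objective 1: average squared distance of each instance to its center
    within = np.mean([np.sum((Z[y == c] - centers[i]) ** 2, axis=1).mean()
                      for i, c in enumerate(classes)])

    # Objective 2: average squared distance between distinct class centers
    diffs = centers[:, None, :] - centers[None, :, :]
    pair_d2 = np.sum(diffs ** 2, axis=2)
    between = pair_d2[np.triu_indices(len(classes), k=1)].mean()
    return within, between

# Toy example: two well-separated Gaussian blobs in 3-D, mapped to 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal(5.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
W = rng.normal(size=(3, 2))                     # a candidate linear transform
within, between = odd_objectives(W, X, y)
print(within, between)
```

In the paper's setting, a hybrid of an evolutionary strategy and the quasi-Newton method would search over W to jointly minimize the first objective and maximize the second; the sketch only shows how a single candidate transformation might be scored.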