Enhancement of Imbalance Data Classification with Boosting Methods: An Experiment
Authors: Smita Jaywantrao Ghorpade, Ratna Sadashiv Chaudhari, Seema Sajanrao Patil
Year of publication: 2022
Subject:
Source: ECS Transactions. 107:15923-15934
ISSN: 1938-6737, 1938-5862
DOI: 10.1149/10701.15923ecst
Description: The idea of boosting originates in machine learning. Imbalanced data sets, in which the classes are not represented by comparable numbers of samples, are challenging for machine learning algorithms to classify well. Ensemble learning is a popular approach to this problem: ensemble methods combine several learning algorithms and typically give better predictive performance than any of the base learners alone. Based on this, a research question is formulated. The null hypothesis is stated as "There is no significant difference between a single classifier and a classifier combined with the ensemble techniques AdaBoostM1 and Bagging." The alternative hypothesis is stated as "The ensemble techniques AdaBoostM1 and Bagging perform better than a single classifier." We conducted an experiment on three imbalanced data sets and examined the accuracy of four classifiers: Naïve Bayes, Multilayer Perceptron, Locally Weighted Learning, and REPTree. The predicted accuracy scores of these classifiers are compared with the ensemble techniques AdaBoostM1, Bagging, Voting, and Stacking. A minimal illustrative sketch of this kind of comparison follows the record below.
Database: OpenAIRE
External link:
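The record above describes comparing single classifiers against their AdaBoostM1- and Bagging-wrapped counterparts on imbalanced data. The following is a minimal sketch of that kind of comparison, assuming a Python/scikit-learn workflow rather than the Weka toolkit implied by the paper's classifier names, with GaussianNB and a decision tree standing in for Naïve Bayes and REPTree, and a synthetic imbalanced data set in place of the paper's three data sets:

```python
# Minimal sketch (assumptions): scikit-learn instead of Weka; GaussianNB and a
# CART decision tree stand in for the paper's Naive Bayes and REPTree; the
# imbalanced data set is synthetic. The parameter name `estimator` requires
# scikit-learn >= 1.2 (older releases call it `base_estimator`).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data set with a roughly 90/10 class imbalance.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)

base_learners = {
    "NaiveBayes": GaussianNB(),
    "DecisionTree": DecisionTreeClassifier(random_state=42),
}

for name, base in base_learners.items():
    # Mean 10-fold cross-validated accuracy of the single classifier...
    single = cross_val_score(base, X, y, cv=10).mean()
    # ...versus the same classifier wrapped in AdaBoost and Bagging ensembles.
    boosted = cross_val_score(
        AdaBoostClassifier(estimator=base, n_estimators=50, random_state=42),
        X, y, cv=10).mean()
    bagged = cross_val_score(
        BaggingClassifier(estimator=base, n_estimators=50, random_state=42),
        X, y, cv=10).mean()
    print(f"{name}: single={single:.3f}  AdaBoost={boosted:.3f}  "
          f"Bagging={bagged:.3f}")
```

In an actual reproduction of the experiment, the paper's three imbalanced data sets and Weka's AdaBoostM1, Bagging, Voting, and Stacking meta-classifiers would replace the synthetic data and scikit-learn wrappers used here.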