A Modified Bayesian Optimization based Hyper-Parameter Tuning Approach for Extreme Gradient Boosting

Authors: Sayan Putatunda, Kiran Rama
Year of publication: 2019
Source: 2019 Fifteenth International Conference on Information Processing (ICINPRO).
DOI: 10.1109/icinpro47689.2019.9092025
Description: It has already been reported in the literature that the performance of a machine learning algorithm is greatly impacted by proper hyper-parameter optimization. One way to perform hyper-parameter optimization is manual search, but that is time consuming. Common approaches for hyper-parameter optimization include Grid search, Random search, and Bayesian optimization using Hyperopt. In this paper, we propose a new approach for hyper-parameter optimization, namely Randomized-Hyperopt, and then tune the hyper-parameters of XGBoost, i.e. the Extreme Gradient Boosting algorithm, on ten datasets by applying Random search, Randomized-Hyperopt, Hyperopt and Grid search. The performance of each of these four techniques was compared by taking both prediction accuracy and execution time into consideration. We find that Randomized-Hyperopt performs better than the other three conventional methods for hyper-parameter optimization of XGBoost.
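For context on the conventional Hyperopt baseline mentioned in the description, the snippet below is a minimal sketch of Bayesian optimization (Tree-structured Parzen Estimator) applied to XGBoost hyper-parameter tuning. The dataset, search space, and evaluation budget are illustrative assumptions, not the paper's experimental setup, and the proposed Randomized-Hyperopt modification itself is not reproduced here.

```python
import xgboost as xgb
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

# Illustrative dataset; the paper evaluates on ten datasets not reproduced here.
X, y = load_breast_cancer(return_X_y=True)

# Example search space over a few common XGBoost hyper-parameters
# (assumed for illustration, not the exact space used in the paper).
space = {
    "max_depth": hp.choice("max_depth", [3, 4, 5, 6, 8, 10]),
    "learning_rate": hp.loguniform("learning_rate", -5, 0),
    "subsample": hp.uniform("subsample", 0.5, 1.0),
    "colsample_bytree": hp.uniform("colsample_bytree", 0.5, 1.0),
}

def objective(params):
    clf = xgb.XGBClassifier(n_estimators=200, **params)
    # Hyperopt minimizes the loss, so return the negated cross-validated accuracy.
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    return {"loss": -acc, "status": STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("Best hyper-parameters found:", best)
```

Grid search and Random search baselines differ only in how candidate configurations are drawn (exhaustive enumeration versus random sampling of the same space), while the Bayesian approach above uses results from earlier trials to guide later ones.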
Comment: Pre-review version of the paper submitted to the IEEE 2019 Fifteenth International Conference on Information Processing (ICINPRO). The paper has been accepted for publication.
Database: OpenAIRE