Abstract

The AdaBoost algorithm is a typical and successful representative of the Boosting family. It combines weak classifiers, each performing only slightly better than random guessing, into a strong classifier with high classification accuracy. Its hyperparameter n_estimators specifies the number of base-classifier iterations: if the value is too large, the model tends to overfit; if it is too small, the model underfits. This parameter should therefore not be set arbitrarily, but according to the characteristics of the data set at hand. To address the uncertainty in the number of iterations of the AdaBoost algorithm, this paper introduces a Bayesian optimization algorithm for hyperparameter tuning, so that the hyperparameter values of AdaBoost suit the current data set, yielding a hyperparameter-optimized AdaBoost algorithm. Experimental results show that tuning the hyperparameters with Bayesian optimization and applying the optimized values to the AdaBoost algorithm not only improves its classification accuracy, but also avoids overfitting and underfitting of the model.
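The tuning loop the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: a Gaussian-process surrogate with an expected-improvement acquisition function searches a grid of candidate n_estimators values, and a synthetic `validation_accuracy` function (peaking at a hypothetical n = 60) stands in for the cross-validated accuracy of a real AdaBoost model, mimicking underfitting at small n and overfitting at large n.

```python
import math
import numpy as np

# Toy stand-in for cross-validated AdaBoost accuracy as a function of
# n_estimators: peaks near n = 60 (a hypothetical value) and falls off on both
# sides, mimicking underfitting (small n) and overfitting (large n).
def validation_accuracy(n):
    return 0.9 - 2e-5 * (n - 60.0) ** 2

def rbf_kernel(a, b, length=20.0):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-6):
    """Gaussian-process posterior mean and std at x_new, given observations."""
    mean = y_obs.mean()
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_new)
    K_inv = np.linalg.inv(K)
    mu = mean + K_s.T @ K_inv @ (y_obs - mean)
    # Diagonal of the posterior covariance (prior variance is 1 on the grid).
    var = 1.0 - np.einsum("ij,ik,kj->j", K_s, K_inv, K_s)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """EI acquisition: expected gain over the best accuracy observed so far."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (mu - best) * cdf + sigma * pdf

candidates = np.arange(10, 201, 5, dtype=float)   # candidate n_estimators grid
x_obs = np.array([10.0, 200.0])                   # evaluate the range ends first
y_obs = np.array([validation_accuracy(x) for x in x_obs])

for _ in range(12):                               # Bayesian-optimization loop
    mu, sigma = gp_posterior(x_obs, y_obs, candidates)
    ei = expected_improvement(mu, sigma, y_obs.max())
    x_next = candidates[int(np.argmax(ei))]       # most promising candidate
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, validation_accuracy(x_next))

best_n = int(x_obs[int(np.argmax(y_obs))])
print("chosen n_estimators:", best_n)
```

In a real experiment, `validation_accuracy` would run cross-validation of scikit-learn's `AdaBoostClassifier` at the given `n_estimators`; the acquisition loop itself is unchanged, which is what makes Bayesian optimization sample-efficient compared with an exhaustive grid search.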

Details

Title
An Improved AdaBoost Algorithm for Hyperparameter Optimization
Author
Gao, Rongfang¹; Liu, Zhanyu¹

¹ College of Computer Science and Technology, Xi’an Shiyou University, Xi’an, China
Publication year
2020
Publication date
Sep 2020
Publisher
IOP Publishing
ISSN
1742-6588
e-ISSN
1742-6596
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2570950036
Copyright
© 2020. This work is published under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.