AdaBoost.M1 requires that each weak classifier achieve an accuracy above 1/2, but in multi-class problems such weak classifiers are difficult to find. The Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME) algorithm relaxes this requirement from above 1/2 to above 1/k (where k is the number of classes), making weak classifiers easier to obtain. However, because SAMME does not guarantee the effectiveness of each weak classifier, it cannot ensure that the final combined classifier improves classification accuracy. This paper analyzes the multi-class AdaBoost algorithm both graphically and mathematically, and then proposes a new multi-class classification method that not only relaxes the accuracy requirement on weak classifiers but also guarantees their effectiveness. Benchmark experiments on UCI data sets show that the proposed algorithm outperforms SAMME and matches the performance of AdaBoost.M1.
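The SAMME scheme the abstract refers to (accept a weak learner while its weighted error stays below 1 - 1/k, weight it by alpha = ln((1 - err)/err) + ln(k - 1)) can be sketched in plain Python. This is a minimal illustration under assumed simplifications, not the paper's method: the weak learners are hypothetical one-dimensional threshold "stumps", and all function names are invented for this sketch.

```python
# Minimal SAMME sketch (Zhu et al., 2009), stdlib only.
# Weak learners are hypothetical 1-D threshold stumps: predict class
# `lo` below the threshold and class `hi` at or above it.
import math
from collections import Counter

def stump_predict(stump, x):
    thr, lo, hi = stump
    return lo if x < thr else hi

def fit_samme(X, y, k, rounds=10):
    """Boost 1-D stumps; stop when no stump beats SAMME's relaxed
    requirement of weighted error below 1 - 1/k."""
    n = len(X)
    w = [1.0 / n] * n                    # uniform initial sample weights
    classes = sorted(set(y))
    ensemble = []                        # list of (alpha, stump)
    for _ in range(rounds):
        # exhaustive search for the stump with lowest weighted error
        best, best_err = None, 1.0
        for thr in sorted(set(X)):
            for lo in classes:
                for hi in classes:
                    if lo == hi:
                        continue
                    err = sum(wi for wi, xi, yi in zip(w, X, y)
                              if stump_predict((thr, lo, hi), xi) != yi)
                    if err < best_err:
                        best, best_err = (thr, lo, hi), err
        if best_err >= 1.0 - 1.0 / k:    # no usable weak learner left
            break
        best_err = max(best_err, 1e-10)  # guard against log(0)
        alpha = math.log((1.0 - best_err) / best_err) + math.log(k - 1.0)
        ensemble.append((alpha, best))
        # reweight: inflate the weights of misclassified samples
        w = [wi * math.exp(alpha) if stump_predict(best, xi) != yi else wi
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict_samme(ensemble, x):
    # final classifier: alpha-weighted vote over the weak learners
    votes = Counter()
    for alpha, stump in ensemble:
        votes[stump_predict(stump, x)] += alpha
    return votes.most_common(1)[0][0]

# toy 1-D data: three classes laid out along the line (illustrative)
X = [0, 1, 2, 3, 4, 5]
y = [0, 0, 1, 1, 2, 2]
ensemble = fit_samme(X, y, k=3, rounds=3)
print([predict_samme(ensemble, x) for x in X])  # → [0, 0, 1, 1, 2, 2]
```

Note that with k = 3 the stopping threshold is an error of 2/3, so each accepted stump only needs to beat random guessing among three classes; this is exactly the relaxation the abstract says SAMME introduces, and why its weak learners can be individually weak.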
YANG Xinwu, MA Zhuang, YUAN Shun. Multi-class AdaBoost algorithm based on the adjusted weak classifier[J]. Journal of Electronics & Information Technology, 2016, 38(2): 373-380.