|
|
Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier |
YANG Xinwu MA Zhuang YUAN Shun |
(School of Computer Science, Beijing University of Technology, Beijing 100124, China) |
|
|
Abstract Adaboost.M1 requires the accuracy of each weak classifier to exceed 1/2, but in multi-class classification problems it is difficult to find weak classifiers that meet this requirement. To address this, the Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME) algorithm was proposed, which relaxes the accuracy requirement on weak classifiers from more than 1/2 to more than 1/k (where k is the number of classes), making weak classifiers easier to find. However, because SAMME does not guarantee the effectiveness of each weak classifier, it cannot ensure that the final combined classifier improves classification accuracy. This paper analyzes multi-class Adaboost algorithms both graphically and mathematically, and then proposes a new multi-class classification method that not only relaxes the accuracy requirement on weak classifiers but also guarantees their effectiveness. Benchmark experiments on UCI data sets show that the proposed algorithm outperforms SAMME and achieves the effect of Adaboost.M1.
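
To make the contrast concrete, the sketch below is a minimal Python illustration of the SAMME weight update (Zhu et al., ref. [15]), assuming scikit-learn's DecisionTreeClassifier as the weak learner; the function name samme_fit and all variable names are ours, not taken from the paper. The extra log(k-1) term in the classifier weight alpha keeps alpha positive whenever the weak classifier's accuracy exceeds 1/k, whereas Adaboost.M1's update requires accuracy above 1/2.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def samme_fit(X, y, n_rounds=50):
        n = len(y)
        k = len(np.unique(y))              # number of classes
        w = np.full(n, 1.0 / n)            # uniform initial sample weights
        learners, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            miss = (pred != y)
            err = np.sum(w * miss) / np.sum(w)
            # SAMME's log(k-1) term makes alpha positive whenever the weak
            # classifier beats random guessing (accuracy > 1/k), instead of
            # Adaboost.M1's stricter accuracy > 1/2 requirement.
            if err >= 1.0 - 1.0 / k:       # no better than random: stop
                break
            alpha = np.log((1.0 - err) / max(err, 1e-10)) + np.log(k - 1.0)
            w = w * np.exp(alpha * miss)   # up-weight misclassified samples
            w = w / np.sum(w)              # renormalize
            learners.append(stump)
            alphas.append(alpha)
        return learners, alphas

Prediction then takes the weighted vote over classes, argmax_c of the sum of alpha_t over rounds t with h_t(x) = c. Note that this sketch only illustrates the relaxed 1/k threshold discussed in the abstract; the adjusted weak classifier proposed in this paper additionally guarantees the effectiveness of each weak classifier, which plain SAMME does not.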
|
Received: 11 May 2015
Published: 18 November 2015
|
|
Corresponding Authors:
YANG Xinwu
E-mail: yang_xinwu@bjut.edu.cn
|
|
|
|
[1] |
VALIANT L G. A theory of the learnable[J]. Communications of the ACM, 1984, 27(11): 1134-1142. doi: 10.1145/1968.1972.
|
[2] |
SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197-227. doi: 10.1007/BF00116037.
|
[3] |
FREUND Y. Boosting a weak learning algorithm by majority[J]. Information and Computation, 1995, 121(2): 256-285. doi: 10.1006/inco.1995.1136.
|
[4] |
SCHAPIRE R E. A brief introduction to boosting[C]. International Joint Conference on Artificial Intelligence, Sweden, 1999: 1401-1406.
|
[5] |
CAO Ying, MIAO Qiguang, LIU Jiachen, et al. Advance and prospects of AdaBoost algorithm[J]. Acta Automatica Sinica, 2013, 39(6): 745-758. doi: 10.3724/SP.J.1004.2013.00745.
|
[6] |
FREUND Y and SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Lecture Notes in Computer Science, 1995, 904: 23-37. doi: 10.1007/3-540-59119-2_166.
|
[7] |
NEGRI P, GOUSSIES N, and LOTITO P. Detecting pedestrians on a movement feature space[J]. Pattern Recognition, 2014, 47(1): 56-71. doi: 10.1016/j.patcog.2013.05.020.
|
[8] |
LIU L, SHAO L, and ROCKETT P. Boosted key-frame selection and correlated pyramidal motion-feature representation for human action recognition[J]. Pattern Recognition, 2013, 46(7): 1810-1818. doi: 10.1016/j.patcog.2012.10.004.
|
[9] |
FREUND Y and SCHAPIRE R E. Experiments with a new boosting algorithm[C]. Proceedings of the Thirteenth International Conference on Machine Learning, Italy, 1996: 148-156.
|
[10] |
ALLWEIN E L, SCHAPIRE R E, and SINGER Y. Reducing multiclass to binary: a unifying approach for margin classifiers[J]. The Journal of Machine Learning Research, 2001, 1(2): 113-141. doi: 10.1162/15324430152733133.
|
[11] |
MUKHERJEE I and SCHAPIRE R E. A theory of multiclass boosting[J]. The Journal of Machine Learning Research, 2013, 14(1): 437-497.
|
[12] |
SCHAPIRE R E and SINGER Y. Improved boosting algorithms using confidence-rated predictions[J]. Machine Learning, 1999, 37(3): 297-336. doi: 10.1023/A:1007614523901.
|
[13] |
TU Chengsheng, DIAO Lili, LU Mingyu, et al. The typical algorithm of AdaBoost series in Boosting family[J]. Computer Science, 2003, 30(3): 30-34.
|
[14] |
HU Jinhai, LUO Guangqi, LI Yinghong, et al. An AdaBoost algorithm for multi-class classification based on exponential loss function and its application[J]. Acta Aeronautica et Astronautica Sinica, 2008, 29(4): 811-816.
|
[15] |
ZHU J, ZOU H, ROSSET S, et al. Multi-class AdaBoost[J]. Statistics and Its Interface, 2009, 2(3): 349-360.
|
[16] |
BLAKE C L and MERZ C J. UCI repository of machine learning databases[OL]. http://www.ics.uci.edu/, 1998.
|
[17] |
FRIEDMAN J, HASTIE T, and TIBSHIRANI R. Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)[J]. The Annals of Statistics, 2000, 28(2): 337-407. doi: 10.1214/aos/1016120463.
|
[18] |
FU Zhongliang. Effectiveness analysis of AdaBoost[J]. Journal of Computer Research and Development, 2008, 45(10): 1747-1755.
|
[19] |
KUZNETSOV V, MOHRI M, and SYED U. Multi-class deep boosting[C]. Advances in Neural Information Processing Systems, Canada, 2014: 2501-2509.
|
[20] |
CORTES C, MOHRI M, and SYED U. Deep boosting[C]. Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, 2014: 1179-1187.
|
[21] |
ZHAI S, XIA T, and WANG S. A multi-class boosting method with direct optimization[C]. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, 2014: 273-282.
|
[22] |
ZHAI S, XIA T, TAN M, et al. Direct 0-1 loss minimization and margin maximization with boosting[C]. Advances in Neural Information Processing Systems, Nevada, 2013: 872-880.
|
[23] |
APPEL R, FUCHS T, DOLLAR P, et al. Quickly boosting decision trees - pruning underachieving features early[C]. JMLR Workshop & Conference Proceedings, 2013, 28: 594-602.
|
[24] |
FERNÁNDEZ-BALDERA A and BAUMELA L. Multi-class boosting with asymmetric binary weak-learners[J]. Pattern Recognition, 2014, 47(5): 2080-2090. doi: 10.1016/j.patcog.2013.11.024.
|
|
|
|