Improved Binary Glowworm Swarm Optimization Combined with Complementarity Measure for Ensemble Pruning
ZHU Xuhui①② NI Zhiwei①② NI Liping①② JIN Feifei①② CHENG Meiying③ LI Jingming④
①(School of Management, Hefei University of Technology, Hefei 230009, China)
②(Key Laboratory of Process Optimization and Intelligent Decision-making, Ministry of Education, Hefei 230009, China)
③(Business School, Huzhou University, Huzhou 313000, China)
④(School of Management Science and Engineering, Anhui University of Finance and Economics, Bengbu 233030, China)
Abstract The success of an ensemble system hinges on both the diversity and the average accuracy of its base classifiers. Increasing the diversity among base classifiers tends to decrease their average accuracy, and vice versa, so there exists a tradeoff between the two; finding this tradeoff is what makes an ensemble perform best under ensemble pruning. To find the tradeoff, Improved Binary Glowworm Swarm Optimization combined with Complementarity measure for Ensemble Pruning (IBGSOCEP) is proposed. Firstly, an initial pool of classifiers is constructed by independently training base classifiers with bootstrap sampling. Secondly, the classifiers in the initial pool are pre-pruned using the complementarity measure. Thirdly, Improved Binary Glowworm Swarm Optimization (IBGSO) is proposed by improving the moving and searching processes of the glowworms and introducing re-initialization and leaping behaviors. Finally, the optimal sub-ensemble is obtained from the pre-pruned base classifiers using IBGSO. Experimental results on 5 UCI datasets demonstrate that IBGSOCEP achieves better results than other approaches while using fewer base classifiers, confirming its effectiveness and significance.
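The complementarity-based pre-pruning step can be sketched as follows. This is a minimal illustration only, not the paper's exact procedure: classifiers are represented by hypothetical 0/1 correctness vectors on a validation set, the sub-ensemble combines them by majority vote, and at each step the classifier that is correct on the most examples the current sub-ensemble gets wrong is added.

```python
def complementarity_prune(correct, k):
    """Greedily select k classifiers by complementarity (illustrative sketch).

    correct: list of lists, where correct[i][j] == 1 iff classifier i
             predicts validation example j correctly.
    Returns the indices of the k selected classifiers in selection order.
    """
    n = len(correct)          # number of classifiers in the initial pool
    m = len(correct[0])       # number of validation examples
    # seed the sub-ensemble with the single most accurate classifier
    selected = [max(range(n), key=lambda i: sum(correct[i]))]
    while len(selected) < k:
        # majority-vote correctness of the current sub-ensemble per example
        votes = [sum(correct[i][j] for i in selected) for j in range(m)]
        wrong = [j for j in range(m) if votes[j] * 2 <= len(selected)]
        # add the classifier most complementary to the current sub-ensemble,
        # i.e. correct on the most currently-misclassified examples
        remaining = [i for i in range(n) if i not in selected]
        best = max(remaining, key=lambda i: sum(correct[i][j] for j in wrong))
        selected.append(best)
    return selected
```

For example, with three classifiers whose correctness vectors are `[1,1,1,0,0]`, `[1,1,0,1,1]`, and `[0,0,1,1,0]`, selecting `k=2` starts from the most accurate classifier (index 1) and then adds index 0, which covers the example that classifier 1 misses.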
Received: 23 October 2017
Published: 10 May 2018
Fund: The National Natural Science Foundation of China (91546108, 71271071, 71490725, 71301041), the National Key Research and Development Plan (2016YFF0202604), and the Open Research Fund Program of the Key Laboratory of Process Optimization and Intelligent Decision-making
Corresponding Authors:
NI Zhiwei
E-mail: zhwnelson@163.com