Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier

YANG Xinwu, MA Zhuang, YUAN Shun

Citation: YANG Xinwu, MA Zhuang, YUAN Shun. Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier[J]. Journal of Electronics & Information Technology, 2016, 38(2): 373-380. doi: 10.11999/JEIT150544

doi: 10.11999/JEIT150544

  • Abstract: Adaboost.M1 requires every weak classifier to achieve an accuracy above 1/2, but such weak classifiers are hard to find in multi-class problems. The Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME) relaxes this requirement to an accuracy above 1/k (where k is the number of classes), making weak classifiers much easier to obtain. However, SAMME cannot guarantee the effectiveness of each weak classifier, and therefore cannot guarantee that the accuracy of the final strong classifier improves. This paper analyzes the principle of multi-class Adaboost algorithms both graphically and mathematically, and then proposes a new multi-class method that both relaxes the requirement on weak classifiers and ensures their effectiveness. Comparative experiments on UCI data sets show that the proposed algorithm outperforms SAMME and is no weaker than Adaboost.M1.
References
  • VALIANT L G. A theory of the learnable[J]. Communications of the ACM, 1984, 27(11): 1134-1142. doi: 10.1145/800057.808710.
  • SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197-227. doi: 10.1007/BF00116037.
  • FREUND Y. Boosting a weak learning algorithm by majority[J]. Information and Computation, 1995, 121(2): 256-285. doi: 10.1006/inco.1995.1136.
  • SCHAPIRE R E. A brief introduction to boosting[C]. International Joint Conference on Artificial Intelligence, Sweden, 1999: 1401-1406.
  • CAO Ying, MIAO Qiguang, LIU Jiachen, et al. Advance and prospects of AdaBoost algorithm[J]. Acta Automatica Sinica, 2013, 39(6): 745-758. doi: 10.3724/SP.J.1004.2013.00745. (in Chinese)
  • FREUND Y and SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Lecture Notes in Computer Science, 1995, 904: 23-37. doi: 10.1007/3-540-59119-2_166.
  • NEGRI P, GOUSSIES N, and LOTITO P. Detecting pedestrians on a movement feature space[J]. Pattern Recognition, 2014, 47(1): 56-71. doi: 10.1016/j.patcog.2013.05.020.
  • LIU L, SHAO L, and ROCKETT P. Boosted key-frame selection and correlated pyramidal motion-feature representation for human action recognition[J]. Pattern Recognition, 2013, 46(7): 1810-1818. doi: 10.1016/j.patcog.2012.10.004.
  • FREUND Y and SCHAPIRE R E. Experiments with a new boosting algorithm[C]. Proceedings of the Thirteenth International Conference on Machine Learning, Italy, 1996: 148-156.
  • ALLWEIN E L, SCHAPIRE R E, and SINGER Y. Reducing multiclass to binary: a unifying approach for margin classifiers[J]. The Journal of Machine Learning Research, 2001, 1(2): 113-141. doi: 10.1162/15324430152733133.
  • MUKHERJEE I and SCHAPIRE R E. A theory of multiclass boosting[J]. The Journal of Machine Learning Research, 2013, 14(1): 437-497.
  • SCHAPIRE R E and SINGER Y. Improved boosting algorithms using confidence-rated predictions[J]. Machine Learning, 1999, 37(3): 297-336. doi: 10.1145/279943.279960.
  • TU Chengsheng, DIAO Lili, LU Mingyu, et al. The typical algorithm of AdaBoost series in Boosting family[J]. Computer Science, 2003, 30(3): 30-34. (in Chinese)
  • HU Jinhai, LUO Guangqi, LI Yinghong, et al. An AdaBoost algorithm for multi-class classification based on exponential loss function and its application[J]. Acta Aeronautica et Astronautica Sinica, 2008, 29(4): 811-816. (in Chinese)
  • ZHU J, ZOU H, ROSSET S, et al. Multi-class AdaBoost[J]. Statistics and Its Interface, 2009, 2(3): 349-360.
  • FRIEDMAN J, HASTIE T, and TIBSHIRANI R. Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)[J]. The Annals of Statistics, 2000, 28(2): 337-407. doi: 10.1214/aos/1016120463.
  • FU Zhongliang. Effectiveness analysis of AdaBoost[J]. Journal of Computer Research and Development, 2008, 45(10): 1747-1755. (in Chinese)
  • KUZNETSOV V, MOHRI M, and SYED U. Multi-class deep boosting[C]. Advances in Neural Information Processing Systems, Canada, 2014: 2501-2509.
  • CORTES C, MOHRI M, and SYED U. Deep boosting[C]. Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, 2014: 1179-1187.
  • ZHAI S, XIA T, and WANG S. A multi-class boosting method with direct optimization[C]. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, 2014: 273-282.
  • ZHAI S, XIA T, TAN M, et al. Direct 0-1 loss minimization and margin maximization with boosting[C]. Advances in Neural Information Processing Systems, Nevada, 2013: 872-880.
  • APPEL R, FUCHS T, DOLLAR P, et al. Quickly boosting decision trees: pruning underachieving features early[C]. JMLR Workshop and Conference Proceedings, 2013, 28: 594-602.
  • FERNANDEZ-BALDERA A and BAUMELA L. Multi-class boosting with asymmetric binary weak-learners[J]. Pattern Recognition, 2014, 47(5): 2080-2090. doi: 10.1016/j.patcog.2013.11.024.
Publication History
  • Received: 2015-05-11
  • Revised: 2015-10-08
  • Published: 2016-02-19
