Fast Training Adaboost Algorithm Based on Adaptive Weight Trimming

Lubin YU, Qiliang DU, Lianfang TIAN

Citation: Lubin YU, Qiliang DU, Lianfang TIAN. Fast Training Adaboost Algorithm Based on Adaptive Weight Trimming[J]. Journal of Electronics & Information Technology, 2020, 42(11): 2742-2748. doi: 10.11999/JEIT190473

doi: 10.11999/JEIT190473
Funds: The Coast Defence Public Welfare Project (201505002), the Guangdong Province Key R&D Program - New Generation of Artificial Intelligence (20180109), the Guangzhou City Industrial Technology Major Research Project (2019-01-01-12-1006-0001), the Major Science and Technology Plan Project of the Guangdong Science and Technology Department (2016B090912001), and the Special Fund for Basic Scientific Research in Central Colleges and Universities (2018KZ05)
Details
    Author biographies:

    YU Lubin: male, born in 1994, Ph.D. candidate. His research interests include machine learning and machine vision.

    DU Qiliang: male, born in 1980, associate researcher, Ph.D. His research interests include robotics and machine vision.

    TIAN Lianfang: male, born in 1968, professor, Ph.D. His research interests include pattern recognition and artificial intelligence.

    Corresponding author: DU Qiliang, qldu@scut.edu.cn

  • CLC number: TP391

  • Abstract: Adaboost is a widely used machine learning algorithm, but its training is very time-consuming. To address this problem, this paper proposes AWTAdaboost, a fast Adaboost training algorithm based on adaptive weight trimming. In each iteration, the algorithm first computes the sample weight distribution, then combines the current maximum sample weight with the size of the sample set to calculate a trimming coefficient; samples whose weights fall below this coefficient do not take part in that round of training, which speeds up training. Experiments on the INRIA dataset and a custom dataset show that the proposed algorithm greatly accelerates training while preserving detection performance, and that it achieves better detection results than other fast training algorithms at comparable training times.
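    To make the trimming idea concrete, here is a minimal Python sketch of Adaboost with per-round adaptive weight trimming, assuming decision stumps as weak learners. The threshold rule `thresh = c * max(w) / n`, the constant `c`, and the helpers `fit_stump`/`predict_stump` are illustrative assumptions pieced together from the abstract ("combines the current maximum sample weight with the size of the sample set"); the paper's exact trimming coefficient may differ.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively fit a weighted decision stump: (feature, threshold, sign)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(s * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, t, s)
    return best

def predict_stump(stump, X):
    j, t, s = stump
    return np.where(s * (X[:, j] - t) >= 0, 1, -1)

def train_awt_adaboost(X, y, n_rounds=50, c=0.1):
    """Sketch of AWTAdaboost-style training; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)               # uniform initial sample weights
    ensemble = []                          # list of (alpha, stump) pairs
    for _ in range(n_rounds):
        # Hypothetical trimming rule inferred from the abstract: the
        # threshold combines the current max weight with the set size.
        thresh = c * w.max() / n
        keep = w >= thresh                 # low-weight samples skip this round
        stump = fit_stump(X[keep], y[keep], w[keep])
        pred = predict_stump(stump, X)     # evaluate on ALL samples
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, stump))
        w *= np.exp(-alpha * y * pred)     # standard Adaboost weight update
        w /= w.sum()
    return ensemble

# Toy usage on synthetic 2D data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0.0, 1, -1)
model = train_awt_adaboost(X, y, n_rounds=20)
scores = sum(a * predict_stump(s, X) for a, s in model)
print("training error:", float(np.mean(np.sign(scores) != y)))
```

    Note that in this sketch trimmed samples are only excluded from fitting the current weak learner; they still receive weight updates every round, so a sample trimmed once can re-enter training later if its weight grows.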
  • Figure 1. Sample images from the custom dataset

    Figure 2. Error rate of each algorithm on the INRIA dataset

    Figure 3. Error rate of each algorithm on the custom dataset

    Figure 4. Training time of each algorithm

    Figure 5. Proportion of samples retained by AWTAdaboost during training

    Table 1. Error rate of each algorithm on the two datasets

    Algorithm        INRIA training   INRIA test   Custom training   Custom test
    Adaboost         0.0000           0.0285       0.0000            0.0296
    SWTAdaboost      0.0395           0.0768       0.0538            0.1089
    DWTAdaboost      0.0000           0.0466       0.0194            0.0735
    WNS-Adaboost     0.0000           0.0356       0.0006            0.0439
    GAdaboost        0.0563           0.1108       0.0724            0.1345
    PCA+DRAdaboost   0.0000           0.0413       0.0000            0.0539
    AWTAdaboost      0.0000           0.0302       0.0000            0.0324

    Table 2. Relative training time of each algorithm (Adaboost = 1)

    Algorithm        INRIA dataset   Custom dataset
    Adaboost         1.0000          1.0000
    SWTAdaboost      0.6237          0.6547
    DWTAdaboost      0.6347          0.6551
    WNS-Adaboost     0.5814          0.5919
    GAdaboost        0.4482          0.4636
    PCA+DRAdaboost   0.5124          0.5324
    AWTAdaboost      0.5570          0.5732
    Note: the table records only the training time of SWTAdaboost before its early stopping, and the training time of DWTAdaboost under the same $\beta$.
  • References:
    VALIANT L G. A theory of the learnable[C]. The 16th Annual ACM Symposium on Theory of Computing, New York, USA, 1984: 436–445.
    KEARNS M and VALIANT L. Cryptographic limitations on learning Boolean formulae and finite automata[J]. Journal of the ACM, 1994, 41(1): 67–95. doi: 10.1145/174644.174647
    SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197–227.
    FREUND Y and SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119–139. doi: 10.1006/jcss.1997.1504
    FREUND Y and SCHAPIRE R E. Experiments with a new boosting algorithm[C]. International Conference on Machine Learning, Bari, Italy, 1996: 148–156.
    ZHANG Xingqiang and DING Jiajun. An improved Adaboost face detection algorithm based on the different sample weights[C]. The 20th IEEE International Conference on Computer Supported Cooperative Work in Design, Nanchang, China, 2016: 436–439.
    CHO H, SUNG M, and JUN B. Canny text detector: Fast and robust scene text localization algorithm[C]. 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 3566–3573.
    GAO Chenqiang, LI Pei, ZHANG Yajun, et al. People counting based on head detection combining Adaboost and CNN in crowded surveillance environment[J]. Neurocomputing, 2016, 208: 108–116. doi: 10.1016/j.neucom.2016.01.097
    FRIEDMAN J, HASTIE T, and TIBSHIRANI R. Additive logistic regression: A statistical view of boosting (With discussion and a rejoinder by the authors)[J]. Annals of Statistics, 2000, 28(2): 337–407.
    JIA Huixing and ZHANG Yujin. Fast Adaboost training algorithm by dynamic weight trimming[J]. Chinese Journal of Computers, 2009, 32(2): 336–341. (in Chinese) doi: 10.3724/SP.J.1016.2009.00336
    SEYEDHOSSEINI M, PAIVA A R C, and TASDIZEN T. Fast AdaBoost training using weighted novelty selection[C]. 2011 International Joint Conference on Neural Networks, San Jose, USA, 2011: 1245–1250.
    TOLBA M F and MOUSTAFA M. GAdaBoost: Accelerating adaboost feature selection with genetic algorithms[C]. The 8th International Joint Conference on Computational Intelligence, Porto, Portugal, 2016: 156–163.
    YUAN Shuang and LÜ Cixing. Fast adaboost algorithm based on weight constraints[C]. 2015 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems, Shenyang, China, 2015: 825–828.
    YUAN Shuang and LÜ Cixing. Fast adaboost algorithm based on improved PCA[J]. Science Technology and Engineering, 2015, 15(29): 62–66. (in Chinese) doi: 10.3969/j.issn.1671-1815.2015.29.011
    DALAL N and TRIGGS B. Histograms of oriented gradients for human detection[C]. 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, USA, 2005: 886–893.
Publication history
  • Received: 2019-06-27
  • Revised: 2020-04-19
  • Published online: 2020-08-31
  • Issue published: 2020-11-16
