Volume 38 Issue 2
Feb. 2016
Citation: YANG Xinwu, MA Zhuang, YUAN Shun. Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier[J]. Journal of Electronics & Information Technology, 2016, 38(2): 373-380. doi: 10.11999/JEIT150544

Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier

doi: 10.11999/JEIT150544
  • Received Date: 2015-05-11
  • Rev Recd Date: 2015-10-08
  • Publish Date: 2016-02-19
  • Adaboost.M1 requires the accuracy rate of each weak classifier to exceed 1/2, but in multi-class problems it is difficult to find a weak classifier whose accuracy exceeds 1/2. The Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME) algorithm relaxes this requirement from more than 1/2 to more than 1/K (where K is the number of classes), making weak classifiers easier to find. However, SAMME does not guarantee the effectiveness of each weak classifier, and therefore does not ensure that the final classifier improves classification accuracy. This paper analyzes the multi-class Adaboost algorithm both graphically and mathematically, and then proposes a new multi-class classification method that not only relaxes the accuracy requirement on weak classifiers but also guarantees their effectiveness. Benchmark experiments on UCI data sets show that the proposed algorithm outperforms SAMME and matches the performance of Adaboost.M1.
  • VALIANT L G. A theory of the learnable[J]. Communications of the ACM, 1984, 27(11): 1134-1142. doi: 10.1145/800057.808710.
    SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197-227. doi: 10.1007/BF00116037.
    FREUND Y. Boosting a weak learning algorithm by majority [J]. Information and Computation, 1995, 121(2): 256-285. doi: 10.1006/inco.1995.1136.
    SCHAPIRE R E. A brief introduction to boosting[C]. International Joint Conference on Artificial Intelligence, Sweden, 1999: 1401-1406.
    CAO Ying, MIAO Qiguang, LIU Jiachen, et al. Advance and prospects of AdaBoost algorithm[J]. Acta Automatica Sinica, 2013, 39(6): 745-758. doi: 10.3724/SP.J.1004.2013.00745.
    FREUND Y and SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Lecture Notes in Computer Science, 1995, 904: 23-37. doi: 10.1007/3-540-59119-2_166.
    NEGRI P, GOUSSIES N, and LOTITO P. Detecting pedestrians on a movement feature space[J]. Pattern Recognition, 2014, 47(1): 56-71. doi: 10.1016/j.patcog.2013.05.020.
    LIU L, SHAO L, and ROCKETT P. Boosted key-frame selection and correlated pyramidal motion-feature representation for human action recognition[J]. Pattern Recognition, 2013, 46(7): 1810-1818. doi: 10.1016/j.patcog.2012.10.004.
    FREUND Y and SCHAPIRE R E. Experiments with a new boosting algorithm[C]. Proceedings of the Thirteenth International Conference on Machine Learning, Italy, 1996: 148-156.
    ALLWEIN E L, SCHAPIRE R E, and SINGER Y. Reducing multiclass to binary: a unifying approach for margin classifiers[J]. The Journal of Machine Learning Research, 2001, 1(2): 113-141. doi: 10.1162/15324430152733133.
    MUKHERJEE I and SCHAPIRE R E. A theory of multiclass boosting[J]. The Journal of Machine Learning Research, 2013, 14(1): 437-497.
    SCHAPIRE R E and SINGER Y. Improved boosting algorithms using confidence-rated predictions[J]. Machine Learning, 1999, 37(3): 297-336. doi: 10.1145/279943.279960.
    TU Chengsheng, DIAO Lili, LU Mingyu, et al. The typical algorithm of AdaBoost series in Boosting family[J]. Computer Science, 2003, 30(3): 30-34.
    HU Jinhai, LUO Guangqi, LI Yinghong, et al. An AdaBoost algorithm for multi-class classification based on exponential loss function and its application[J]. Acta Aeronautica et Astronautica Sinica, 2008, 29(4): 811-816.
    ZHU J, ZOU H, ROSSET S, et al. Multi-class adaboost[J]. Statistics and Its Interface, 2009, 2(3): 349-360.
    FRIEDMAN J, HASTIE T, and TIBSHIRANI R. Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)[J]. The Annals of Statistics, 2000, 28(2): 337-407. doi: 10.1214/aos/1016120463.
    FU Zhongliang. Effectiveness analysis of AdaBoost[J]. Journal of Computer Research and Development, 2008, 45(10): 1747-1755.
    KUZNETSOV V, MOHRI M, and SYED U. Multi-class deep boosting[C]. Advances in Neural Information Processing Systems, Canada, 2014: 2501-2509.
    CORTES C, MOHRI M, and SYED U. Deep boosting[C]. Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, 2014: 1179-1187.
    ZHAI S, XIA T, and WANG S. A multi-class boosting method with direct optimization[C]. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, 2014: 273-282.
    ZHAI S, XIA T, TAN M, et al. Direct 0-1 loss minimization and margin maximization with boosting[C]. Advances in Neural Information Processing Systems, Nevada, 2013: 872-880.
    DOLLAR P. Quickly boosting decision trees-pruning underachieving features early[C]. JMLR Workshop Conference Proceedings, 2013, 28: 594-602.
    FERNANDEZ B A and BAUMELA L. Multi-class boosting with asymmetric binary weak-learners[J]. Pattern Recognition, 2014, 47(5): 2080-2090. doi: 10.1016/j.patcog.2013.11.024.
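    The SAMME round described in the abstract (accept a weak classifier whose weighted error is below 1 - 1/K rather than 1/2, and weight it by log((1-err)/err) + log(K-1)) can be sketched as follows. This is a minimal illustration of the standard SAMME update, not the authors' implementation; the function and variable names are assumptions:

    ```python
    import numpy as np

    def samme_update(w, y_true, y_pred, K):
        """One SAMME boosting round.

        Given sample weights w, true labels, weak-learner predictions, and
        K classes, return the classifier weight alpha and the renormalized
        sample weights. SAMME only requires err < 1 - 1/K (accuracy better
        than random guessing over K classes), versus err < 1/2 for
        Adaboost.M1.
        """
        miss = (y_pred != y_true).astype(float)
        err = np.sum(w * miss) / np.sum(w)
        assert err < 1.0 - 1.0 / K, "weak learner is no better than random guessing"
        # The extra log(K-1) term is what relaxes the 1/2 threshold to 1/K.
        alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)
        # Upweight the misclassified samples for the next round.
        w = w * np.exp(alpha * miss)
        return alpha, w / np.sum(w)

    # Toy example: 5 samples, 3 classes, weak learner misclassifies 2 of them
    # (err = 0.4, which Adaboost.M1 would accept too, but SAMME would also
    # accept any err up to 2/3 here).
    w = np.ones(5) / 5
    y = np.array([0, 1, 2, 0, 1])
    pred = np.array([0, 1, 2, 1, 2])
    alpha, w = samme_update(w, y, pred, K=3)
    ```

    When K = 2 the log(K-1) term vanishes and the update reduces to the two-class discrete AdaBoost rule, which is why SAMME is usually presented as a direct multi-class generalization of it.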
