Citation: YU Lubin, DU Qiliang, and TIAN Lianfang. Fast training Adaboost algorithm based on adaptive weight trimming[J]. Journal of Electronics & Information Technology, 2020, 42(11): 2742–2748. doi: 10.11999/JEIT190473
VALIANT L G. A theory of the learnable[C]. The 16th Annual ACM Symposium on Theory of Computing, New York, USA, 1984: 436–445.
KEARNS M and VALIANT L. Cryptographic limitations on learning Boolean formulae and finite automata[J]. Journal of the ACM, 1994, 41(1): 67–95. doi: 10.1145/174644.174647
SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197–227.
FREUND Y and SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119–139. doi: 10.1006/jcss.1997.1504
FREUND Y and SCHAPIRE R E. Experiments with a new boosting algorithm[C]. International Conference on Machine Learning, Bari, Italy, 1996: 148–156.
ZHANG Xingqiang and DING Jiajun. An improved Adaboost face detection algorithm based on the different sample weights[C]. The 20th IEEE International Conference on Computer Supported Cooperative Work in Design, Nanchang, China, 2016: 436–439.
CHO H, SUNG M, and JUN B. Canny text detector: Fast and robust scene text localization algorithm[C]. 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 3566–3573.
GAO Chenqiang, LI Pei, ZHANG Yajun, et al. People counting based on head detection combining Adaboost and CNN in crowded surveillance environment[J]. Neurocomputing, 2016, 208: 108–116. doi: 10.1016/j.neucom.2016.01.097
FRIEDMAN J, HASTIE T, and TIBSHIRANI R. Additive logistic regression: A statistical view of boosting (With discussion and a rejoinder by the authors)[J]. Annals of Statistics, 2000, 28(2): 337–407.
JIA Huixing and ZHANG Yujin. Fast Adaboost training algorithm by dynamic weight trimming[J]. Chinese Journal of Computers, 2009, 32(2): 336–341. doi: 10.3724/SP.J.1016.2009.00336
SEYEDHOSSEINI M, PAIVA A R C, and TASDIZEN T. Fast AdaBoost training using weighted novelty selection[C]. 2011 International Joint Conference on Neural Networks, San Jose, USA, 2011: 1245–1250.
TOLBA M F and MOUSTAFA M. GAdaBoost: Accelerating adaboost feature selection with genetic algorithms[C]. The 8th International Joint Conference on Computational Intelligence, Porto, Portugal, 2016: 156–163.
YUAN Shuang and LÜ Cixing. Fast adaboost algorithm based on weight constraints[C]. 2015 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems, Shenyang, China, 2015: 825–828.
YUAN Shuang and LÜ Cixing. Fast Adaboost algorithm based on improved PCA[J]. Science Technology and Engineering, 2015, 15(29): 62–66. doi: 10.3969/j.issn.1671-1815.2015.29.011
DALAL N and TRIGGS B. Histograms of oriented gradients for human detection[C]. 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, USA, 2005: 886–893.