[1] SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning,1990,5(2):197-227.
[2] REYZIN L,SCHAPIRE R E. How boosting the margin can also boost classifier complexity[C]//Proceedings of the 23rd International Conference on Machine Learning. New York:ACM,2006:753-760.
[3] FREUND Y,SCHAPIRE R E. A decision-theoretic generalization of online learning and an application to boosting[J]. Journal of Computer and System Sciences,1997,55(1):119-139.
[4] FREUND Y,SCHAPIRE R E. Experiments with a new boosting algorithm[C]//Proceedings of the 13th International Conference on Machine Learning. San Francisco:Morgan Kaufmann Inc.,1996:148-156.
[5] CARUANA R,NICULESCU-MIZIL A. An empirical comparison of supervised learning algorithms[C]//Proceedings of the 23rd International Conference on Machine Learning. New York:ACM,2006:161-168.
[6] SETTOUTI N,EL AMINE BECHAR M,CHIKH M A. Statistical comparisons of the top 10 algorithms in data mining for classification task[J]. International Journal of Interactive Multimedia and Artificial Intelligence,2016,4(1):46-51.
[7] SCHAPIRE R E,SINGER Y. Improved boosting algorithms using confidence-rated predictions[J]. Machine Learning,1999,37(3):297-336.
[8] FRIEDMAN J,HASTIE T,TIBSHIRANI R. Additive logistic regression:a statistical view of boosting[J]. The Annals of Statistics,2000,28(2):337-407.
[9] SCHAPIRE R E,FREUND Y,BARTLETT P,et al. Boosting the margin:a new explanation for the effectiveness of voting methods[J]. The Annals of Statistics,1998,26(5):1651-1686.
[10] XIAO F Y,WANG Y,HE L G,et al. Motion estimation from surface electromyogram using AdaBoost regression and average feature values[J]. IEEE Access,2019,7:13121-13134.
[11] FU Q,JING B,HE P J,et al. Fault feature selection and diagnosis of rolling bearings based on EEMD and optimized Elman_AdaBoost algorithm[J]. IEEE Sensors Journal,2018,18(12):5024-5034.
[12] CHEN T,LU S J. Accurate and efficient traffic sign detection using discriminative AdaBoost and support vector regression[J]. IEEE Transactions on Vehicular Technology,2016,65(6):4006-4015.
[13] GUTIÉRREZ-TOBAL G C,ÁLVAREZ D,DEL CAMPO F,et al. Utility of AdaBoost to detect sleep apnea-hypopnea syndrome from single-channel airflow[J]. IEEE Transactions on Biomedical Engineering,2016,63(3):636-646.
[14] WANG J,XIONG X F,ZHOU N,et al. Early warning method for transmission line galloping based on SVM and AdaBoost bi-level classifiers[J]. IET Generation,Transmission and Distribution,2016,10(14):3499-3507.
[15] XU H P,YUAN H Y. An SVM-based AdaBoost cascade classifier for sonar image[J]. IEEE Access,2020,8:115857-115864.
[16] KROGH A,VEDELSBY J. Neural network ensembles,cross validation,and active learning[C]//Proceedings of the 7th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,1994:231-238.
[17] UEDA N,NAKANO R. Generalization error of ensemble estimators[C]//Proceedings of the 1996 International Conference on Neural Networks. Piscataway:IEEE,1996:90-95.
[18] 孙博,王建东,陈海燕,等. 集成学习中的多样性度量[J]. 控制与决策,2014,29(3):385-395.(SUN B,WANG J D,CHEN H Y,et al. Diversity measures in ensemble learning[J]. Control and Decision,2014,29(3):385-395.)
[19] 孙涛,周志华. 近似多元信息多样性[J]. 计算机科学与探索,2019,13(4):639-646.(SUN T,ZHOU Z H. Approximate multi-information diversity[J]. Journal of Frontiers of Computer Science and Technology,2019,13(4):639-646.)
[20] 姜正申,刘宏志,付彬,等. 集成学习的泛化误差和AUC分解理论及其在权重优化中的应用[J]. 计算机学报,2019,42(1):1-15.(JIANG Z S,LIU H Z,FU B,et al. Decomposition theories of generalization error and AUC in ensemble learning with application in weight optimization[J]. Chinese Journal of Computers,2019,42(1):1-15.)
[21] 王玲娣,徐华. AdaBoost的多样性分析及改进[J]. 计算机应用,2018,38(3):650-654,660.(WANG L D,XU H. Diversity analysis and improvement of AdaBoost[J]. Journal of Computer Applications,2018,38(3):650-654,660.)
[22] ZHANG P B,YANG Z X. A novel AdaBoost framework with robust threshold and structural optimization[J]. IEEE Transactions on Cybernetics,2018,48(1):64-76.
[23] WU S,NAGAHASHI H. Parameterized AdaBoost:introducing a parameter to speed up the training of real AdaBoost[J]. IEEE Signal Processing Letters,2014,21(6):687-691.
[24] 刘苹光,文成玉,杜鸿. 一种改进的AdaBoost检测算法[J]. 计算机应用,2015,35(8):2261-2265.(LIU P G,WEN C Y,DU H. Improved detection algorithm of AdaBoost[J]. Journal of Computer Applications,2015,35(8):2261-2265.)
[25] 高敬阳,赵彦. 基于样本抽样和权重调整的SWA-Adaboost算法[J]. 计算机工程,2014,40(9):248-251,256.(GAO J Y,ZHAO Y. SWA-Adaboost algorithm based on sampling and weight adjustment[J]. Computer Engineering,2014,40(9):248-251,256.)
[26] 吴恋,马敏耀,黄一峰,等. 基于AdaBoost算法的Linux病毒检测研究[J]. 计算机工程,2018,44(8):161-166,173.(WU L,MA M Y,HUANG Y F,et al. Linux virus detection study based on AdaBoost algorithm[J]. Computer Engineering,2018,44(8):161-166,173.)
[27] 李闯,丁晓青,吴佑寿. 一种改进的AdaBoost算法——AD AdaBoost[J]. 计算机学报,2007,30(1):103-109.(LI C,DING X Q,WU Y S. A revised AdaBoost algorithm-AD AdaBoost[J]. Chinese Journal of Computers,2007,30(1):103-109.)
[28] 翟夕阳,王晓丹,雷蕾,等. 基于多类指数损失函数逐步添加模型的改进多分类AdaBoost算法[J]. 计算机应用,2017,37(6):1692-1696.(ZHAI X Y,WANG X D,LEI L,et al. Improved multi-class AdaBoost algorithm based on stagewise additive modeling using a multi-class exponential loss function[J]. Journal of Computer Applications,2017,37(6):1692-1696.)
[29] 王永祥. 基于ECG的心脏骤停预测方法研究[D]. 长春:吉林大学,2017:34-43.(WANG Y X. Research of sudden cardiac arrest prediction method based on ECG signals[D]. Changchun:Jilin University,2017:34-43.)
[30] 邱仁博,娄震. 一种改进的带参数AdaBoost算法[J]. 计算机工程,2016,42(7):199-202,208.(QIU R B,LOU Z. An improved parameterized AdaBoost algorithm[J]. Computer Engineering,2016,42(7):199-202,208.)