[1] VOGELSTEIN B, PAPADOPOULOS N, VELCULESCU V E, et al. Cancer genome landscapes[J]. Science, 2013, 339(6127):1546-1558.
[2] NJAH H, JAMOUSSI S. Weighted ensemble learning of Bayesian network for gene regulatory networks[J]. Neurocomputing, 2015, 150(B):404-416.
[3] BREIMAN L. Bagging predictors[J]. Machine Learning, 1996, 24(2):123-140.
[4] SCHAPIRE R E, FREUND Y, BARTLETT P, et al. Boosting the margin:a new explanation for the effectiveness of voting methods[C]//Proceedings of the Fourteenth International Conference on Machine Learning. San Francisco, CA:Morgan Kaufmann Publishers Inc., 1997:322-330.
[5] TUMER K, GHOSH J. Error correlation and error reduction in ensemble classifiers[J]. Connection Science, 1996, 8(3/4):385-404.
[6] 周志华. 机器学习[M]. 北京:清华大学出版社, 2016:171-196. (ZHOU Z H. Machine Learning[M]. Beijing:Tsinghua University Press, 2016:171-196.)
[7] 张春霞, 张讲社. 选择性集成学习算法综述[J]. 计算机学报, 2011, 34(8):1399-1410. (ZHANG C X, ZHANG J S. A survey of selective ensemble learning algorithms[J]. Chinese Journal of Computers, 2011, 34(8):1399-1410.)
[8] 陆慧娟, 安春霖, 马小平, 等. 基于输出不一致测度的极限学习机集成的基因表达数据分类[J]. 计算机学报, 2013, 36(2):341-348. (LU H J, AN C L, MA X P, et al. Disagreement measure based ensemble of extreme learning machine for gene expression data classification[J]. Chinese Journal of Computers, 2013, 36(2):341-348.)
[9] MARGINEANTU D D, DIETTERICH T G. Pruning adaptive boosting[C]//ICML 1997:Proceedings of the 14th International Conference on Machine Learning. San Francisco:Morgan Kaufmann Publishers, 1997:211-218.
[10] MAO S, JIAO L, XIONG L, et al. Weighted classifier ensemble based on quadratic form[J]. Pattern Recognition, 2015, 48(5):1688-1706.
[11] HUANG G B, ZHU Q Y, SIEW C K. Extreme learning machine:a new learning scheme of feedforward neural networks[C]//Proceedings of the 2004 IEEE International Joint Conference on Neural Networks. Piscataway, NJ:IEEE, 2004:985-990.
[12] HUANG G B, ZHOU H, DING X, et al. Extreme learning machine for regression and multiclass classification[J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B, 2012, 42(2):513-529.
[13] ZHOU Z H, WU J, TANG W. Ensembling neural networks:many could be better than all[J]. Artificial Intelligence, 2002, 137(1/2):239-263.
[14] 徐晓杨, 纪志成. 选择性集成极限学习机分类器建模研究[J]. 计算机应用与软件, 2016, 33(9):279-283. (XU X Y, JI Z C. Research on modelling selective ensemble extreme learning machine classifier[J]. Computer Applications and Software, 2016, 33(9):279-283.)
[15] GUO Y, JIAO L, WANG S, et al. A novel dynamic rough subspace based selective ensemble[J]. Pattern Recognition, 2015, 48(5):1638-1652.
[16] ZHU X, NI Z, CHENG M, et al. Selective ensemble based on extreme learning machine and improved discrete artificial fish swarm algorithm for haze forecast[J]. Applied Intelligence, 2017:1-19.
[17] KUNCHEVA L I, WHITAKER C J. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy[J]. Machine Learning, 2003, 51(2):181-207.
[18] ÖZÖĞÜR-AKYÜZ S, WINDEATT T, SMITH R. Pruning of error correcting output codes by optimization of accuracy-diversity trade off[J]. Machine Learning, 2015, 101(1/2/3):253-269.
[19] 吴建鑫, 周志华, 沈学华, 等. 一种选择性神经网络集成构造方法[J]. 计算机研究与发展, 2000, 37(9):1039-1044. (WU J, ZHOU Z, SHEN X, et al. A selective constructing approach to neural network ensemble[J]. Journal of Computer Research & Development, 2000, 37(9):1039-1044.)
[20] BERETTA L, SANTANIELLO A. Implementing ReliefF filters to extract meaningful features from genetic lifetime datasets[J]. Journal of Biomedical Informatics, 2011, 44(2):361-369.
[21] TIAN H, MENG B. A new modeling method based on bagging ELM for day-ahead electricity price prediction[C]//Proceedings of the 2010 IEEE 5th International Conference on Bio-Inspired Computing:Theories and Applications. Piscataway, NJ:IEEE, 2010:1076-1079.
[22] TAN A C, GILBERT D. Ensemble machine learning on gene expression data for cancer classification[EB/OL].[2017-05-10].
http://v-scheiner.brunel.ac.uk/bitstream/2438/3013/1/TanGilbertNZ2003.pdf.