[1] GHAHRAMANI Z. Probabilistic machine learning and artificial intelligence[J]. Nature, 2015, 521(7553):452-459.
[2] HINTON G, DENG L, YU D, et al. Deep neural networks for acoustic modeling in speech recognition: the shared views of four research groups[J]. IEEE Signal Processing Magazine, 2012, 29(6):82-97.
[3] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6):84-90.
[4] 黄凯奇, 任伟强, 谭铁牛. 图像物体分类与检测算法综述[J]. 计算机学报, 2014, 36(6):1-18. (HUANG K Q, REN W Q, TAN T N. A review on image object classification and detection[J]. Chinese Journal of Computers, 2014, 36(6):1-18.)
[5] BENGIO Y, DUCHARME R, VINCENT P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003, 3:1137-1155.
[6] TAX D M J, DUIN R P W. Support vector data description[J]. Machine Learning, 2004, 54(1):45-66.
[7] LEE K Y, KIM D W, LEE D, et al. Improving support vector data description using local density degree[J]. Pattern Recognition, 2005, 38(10):1768-1771.
[8] KIM P J, CHANG H J, SONG D S, et al. Fast support vector data description using k-means clustering[C]//Proceedings of the 4th International Symposium on Neural Networks. Berlin: Springer, 2007:506-514.
[9] LUO J, LI B, WU C, et al. A fast SVDD algorithm based on decomposition and combination for fault detection[C]//Proceedings of the 2010 International Conference on Control and Automation. Piscataway: IEEE, 2010:1924-1928.
[10] HUANG G, CHEN H, ZHOU Z, et al. Two-class support vector data description[J]. Pattern Recognition, 2011, 44(2):320-329.
[11] 李勇, 刘战东, 张海军. 不平衡数据的集成分类算法综述[J]. 计算机应用研究, 2014, 31(5):1287-1291. (LI Y, LIU Z D, ZHANG H J. Summary of integrated classification algorithm for unbalanced data[J]. Application Research of Computers, 2014, 31(5):1287-1291.)
[12] FREUND Y, SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Journal of Computer and System Sciences, 1997, 55(1):119-139.
[13] BREIMAN L. Bagging predictors[J]. Machine Learning, 1996, 24(2):123-140.
[14] BREIMAN L. Random forests[J]. Machine Learning, 2001, 45(1):5-32.
[15] 周志华. 机器学习[M]. 北京: 清华大学出版社, 2016:171-189. (ZHOU Z H. Machine Learning[M]. Beijing: Tsinghua University Press, 2016:171-189.)
[16] QIAN Y, LI F, LIANG J, et al. Space structure and clustering of categorical data[J]. IEEE Transactions on Neural Networks and Learning Systems, 2016, 27(10):2047-2059.
[17] ZHOU Z H. Ensemble Methods: Foundations and Algorithms[M]. Boca Raton, FL: Taylor & Francis Group, 2012:1-22.
[18] QIAN Y, LI F, LIANG J, et al. Fusing monotonic decision trees[J]. IEEE Transactions on Knowledge and Data Engineering, 2015, 27(10):2717-2728.
[19] EFRON B. Bootstrap methods: another look at the jackknife[J]. The Annals of Statistics, 1979, 7(1):1-26.
[20] CAWLEY G, TALBOT N. Gunnar Raetsch's benchmark datasets[DB/OL]. [2018-11-20]. http://theoval.cmp.uea.ac.uk/~gcc/matlab/default.html#benchmarks.
[21] NAGANJANEYULU S, KUPPA M R. A novel framework for class imbalance learning using intelligent under-sampling[J]. Progress in Artificial Intelligence, 2013, 2(1):73-84.
[22] ZHANG X, SONG Q, WANG G, et al. A dissimilarity-based imbalance data classification algorithm[J]. Applied Intelligence, 2015, 42(3):544-565.
[23] JIANG K, LU J, XIA K. A novel algorithm for imbalance data classification based on genetic algorithm improved SMOTE[J]. Arabian Journal for Science and Engineering, 2016, 41(8):3255-3266.