[1] KUNCHEVA L I. Combining Pattern Classifiers: Methods and Algorithms[M]. New York: Wiley-Interscience, 2004: 290-325.
[2] DUDA R O, HART P E, STORK D G. Pattern Classification[M]. 2nd ed. New York: Wiley, 2001: 55-88.
[3] SU C T, LIN H C. Applying electromagnetism-like mechanism for feature selection[J]. Information Sciences, 2011, 181(5): 972-986.
[4] WANG L, ZHOU N, CHU F. A general wrapper approach to selection of class-dependent features[J]. IEEE Transactions on Neural Networks, 2008, 19(7): 1267-1278.
[5] DATTA A, GHOSH S, GHOSH A. Self-adaptive differential evolution for feature selection in hyperspectral image data[J]. Applied Soft Computing, 2013, 13(4): 1969-1977.
[6] XUE B, ZHANG M, BROWNE W N, et al. A survey on evolutionary computation approaches to feature selection[J]. IEEE Transactions on Evolutionary Computation, 2016, 20(4): 606-626.
[7] ZHANG Z, YANG P. An ensemble of classifiers with genetic algorithm based feature selection[J]. IEEE Intelligent Informatics Bulletin, 2008, 9(1): 18-24.
[8] ABD-ALSABOUR N, RANDALL M. Feature selection for classification using an ant colony system[C]//Proceedings of the 2010 6th IEEE International Conference on e-Science Workshops. Washington, DC: IEEE Computer Society, 2010: 86-91.
[9] SHUNMUGAPRIYA P, KANMANI S, DEVIPRIYA S, et al. Investigation on the effects of ACO parameters for feature selection and classification[C]//CNC 2012: International Conference on Advances in Communication, Network, and Computing. Berlin: Springer, 2012: 136-145.
[10] CHUANG L Y, TSAI S W, YANG C H. Catfish binary particle swarm optimization for feature selection[EB/OL]. [2018-03-20]. https://pdfs.semanticscholar.org/222d/8b2803f9cedf0da0b454c061c0bb46384722.pdf.
[11] CHAO X Q, LI W. A feature selection method optimized by artificial bee colony algorithm[J/OL]. Journal of Frontiers of Computer Science and Technology, (2018-01-19)[2018-04-30]. http://kns.cnki.net/kcms/detail/11.5602.TP.20180206.1345.012.html.
[12] AL-ANI A, ALSUKKER A, KHUSHABA R N. Feature subset selection using differential evolution and a wheel based search strategy[J]. Swarm and Evolutionary Computation, 2013, 9: 15-26.
[13] EMARY E, ZAWBAA H M, HASSANIEN A E. Binary grey wolf optimization approaches for feature selection[J]. Neurocomputing, 2016, 172: 371-381.
[14] MAFARJA M M, MIRJALILI S. Hybrid whale optimization algorithm with simulated annealing for feature selection[J]. Neurocomputing, 2017, 260: 302-312.
[15] SHUNMUGAPRIYA P, KANMANI S. A hybrid algorithm using ant and bee colony optimization for feature selection and classification (AC-ABC Hybrid)[J]. Swarm and Evolutionary Computation, 2017, 36: 27-36.
[16] ZHANG M L, LI Z S. Feature selection algorithm using SAC algorithm[J]. Computer Science, 2018, 45(2): 63-68.
[17] XUE B, ZHANG M, BROWNE W N. Particle swarm optimization for feature selection in classification: a multi-objective approach[J]. IEEE Transactions on Cybernetics, 2013, 43(6): 1656-1671.
[18] KENNEDY J. Bare bones particle swarms[C]//Proceedings of the 2003 IEEE Swarm Intelligence Symposium. Piscataway, NJ: IEEE, 2003: 80-87.
[19] HETTICH S, BLAKE C, MERZ C. UCI repository of machine learning databases[DB/OL]. [2017-07-08]. http://www.ics.uci.edu/mlearn/MLRepository.html.
[20] STATNIKOV A, ALIFERIS C F, TSAMARDINOS I. Gene expression datasets[DB/OL]. [2017-08-10]. http://www.gems-system.org.
[21] COELLO C A C, PULIDO G T, LECHUGA M S. Handling multiple objectives with particle swarm optimization[J]. IEEE Transactions on Evolutionary Computation, 2004, 8(3): 256-279.
[22] DEB K, PRATAP A, AGARWAL S, et al. A fast and elitist multiobjective genetic algorithm: NSGA-II[J]. IEEE Transactions on Evolutionary Computation, 2002, 6(2): 182-197.