[1] GANDHI S S, PRABHUNE S S. Overview of feature subset selection algorithm for high dimensional data[C]//ICISC 2017: Proceedings of the 2017 IEEE International Conference on Inventive Systems and Control. Piscataway, NJ: IEEE, 2017: 1-6.
[2] FLEURET F. Fast binary feature selection with conditional mutual information[J]. Journal of Machine Learning Research, 2004, 5(3): 1531-1555.
[3] LIU H, DITZLER G. Speeding up joint mutual information feature selection with an optimization heuristic[C]//Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence. Piscataway, NJ: IEEE, 2018: 1-8.
[4] MIN F, XU J. Semi-greedy heuristics for feature selection with test cost constraints[J]. Granular Computing, 2016, 1(3): 199-211.
[5] TSAGRIS M, LAGANI V, TSAMARDINOS I. Feature selection for high-dimensional temporal data[J]. BMC Bioinformatics, 2018, 19: 17.
[6] 黄志艳. 一种基于信息增益的特征选择方法[J]. 山东农业大学学报(自然科学版), 2013, 44(2): 252-256. (HUANG Z Y. A feature selection method based on information gain[J]. Journal of Shandong Agricultural University (Natural Science), 2013, 44(2): 252-256.)
[7] 刘海峰, 刘守生, 宋阿羚. 基于词频分布信息的优化IG特征选择方法[J]. 计算机工程与应用, 2017, 53(4): 113-117. (LIU H F, LIU S S, SONG A L. Improved method of IG feature selection based on word frequency distribution[J]. Computer Engineering and Applications, 2017, 53(4): 113-117.)
[8] BATTITI R. Using mutual information for selecting features in supervised neural net learning[J]. IEEE Transactions on Neural Networks, 1994, 5(4): 537-550.
[9] HOQUE N, BHATTACHARYYA D K, KALITA J K. MIFS-ND: a mutual information-based feature selection method[J]. Expert Systems with Applications, 2014, 41(14): 6371-6385.
[10] CHO D, LEE B. Optimized automatic sleep stage classification using the Normalized Mutual Information Feature Selection (NMIFS) method[C]//Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. Piscataway, NJ: IEEE, 2017: 3094-3097.
[11] PENG H, LONG F, DING C. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238.
[12] 董泽民, 石强. 基于归一化模糊联合互信息最大的特征选择[J]. 计算机工程与应用, 2017, 53(22): 105-110. (DONG Z M, SHI Q. Feature selection using normalized fuzzy joint mutual information maximum[J]. Computer Engineering and Applications, 2017, 53(22): 105-110.)
[13] BENNASAR M, HICKS Y, SETCHI R. Feature selection using joint mutual information maximisation[J]. Expert Systems with Applications, 2015, 42(22): 8520-8532.
[14] LI J, DONG W, MENG D. Grouped gene selection of cancer via adaptive sparse group lasso based on conditional mutual information[J]. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2018, 15(6): 2028-2038.
[15] LIU C, WANG W, ZHAO Q, et al. A new feature selection method based on a validity index of feature subset[J]. Pattern Recognition Letters, 2017, 92: 1-8.
[16] AMARATUNGA D, CABRERA J. High-dimensional data[J]. Journal of the National Science Foundation of Sri Lanka, 2016, 44(1): 3.
[17] DUA D, KARRA TANISKIDOU E. UCI Machine Learning Repository[DB/OL]. [2018-07-13]. http://archive.ics.uci.edu/ml.
[18] ROSS B C. Mutual information between discrete and continuous data sets[J]. PLoS One, 2014, 9(2): e87357.
[19] CHELVAN P M, PERUMAL K. A study on selection stability measures for various feature selection algorithms[C]//Proceedings of the 2016 IEEE International Conference on Computational Intelligence and Computing Research. Piscataway, NJ: IEEE, 2017: 1-4.