[1] ZHAI J H, LIU B, ZHANG S F. A feature selection approach based on rough set relative classification information entropy and particle swarm optimization[J]. CAAI Transactions on Intelligent Systems, 2017, 12(3): 397-404. (in Chinese)
[2] MARIELLO A, BATTITI R. Feature selection based on the neighborhood entropy[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(12): 6313-6322.
[3] MAO Y C, CAO H, PING P, et al. Feature selection based on maximum conditional and joint mutual information[J]. Journal of Computer Applications, 2019, 39(3): 734-741. (in Chinese)
[4] PENG H, LONG F, DING C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238.
[5] DONG H B, TENG X Y, YANG X. Feature selection based on the measurement of correlation information entropy[J]. Journal of Computer Research and Development, 2016, 53(8): 1684-1695. (in Chinese)
[6] MA C, JIANG G X, WANG W J. Dynamic mutual information feature selection for functional data[J]. Journal of Frontiers of Computer Science and Technology, 2019, 13(1): 158-168. (in Chinese)
[7] YANG H H, MOODY J. Data visualization and feature selection: new algorithms for non-Gaussian data[C]//Proceedings of the 12th International Conference on Neural Information Processing Systems. Cambridge: MIT Press, 1999: 687-693.
[8] ZHENG K, WANG X. Feature selection method with joint maximal information entropy between features and class[J]. Pattern Recognition, 2018, 77: 20-29.
[9] KIRA K, RENDELL L A. A practical approach to feature selection[C]//Proceedings of the 9th International Workshop on Machine Learning. San Francisco: Morgan Kaufmann Publishers Inc., 1992: 249-256.
[10] ZHAI J H, LIU B, ZHANG S F. Feature selection via evolutionary computation based on relative classification information entropy[J]. Pattern Recognition and Artificial Intelligence, 2016, 29(8): 682-690. (in Chinese)
[11] ZHANG Z H, LI S N, LI Z G, et al. Multi-label feature selection algorithm based on information entropy[J]. Journal of Computer Research and Development, 2013, 50(6): 1177-1184. (in Chinese)
[12] ZHOU H B, QIAO J F. Feature selection method based on high dimensional k-nearest neighbors mutual information[J]. CAAI Transactions on Intelligent Systems, 2017, 12(5): 595-600. (in Chinese)
[13] DONG Z M, SHI Q. Feature selection using normalized fuzzy joint mutual information maximum[J]. Computer Engineering and Applications, 2017, 53(22): 105-110. (in Chinese)
[14] BLAKE C L, MERZ C J. UCI repository of machine learning databases[EB/OL]. [2019-05-13]. http://mlearn.ics.uci.edu/MLRepository.html.
[15] WANG K, ZHENG J, ZHANG J, et al. Estimating the number of clusters via system evolution for cluster analysis of gene expression data[J]. IEEE Transactions on Information Technology in Biomedicine, 2009, 13(5): 848-853.