Journal of Computer Applications ›› 2023, Vol. 43 ›› Issue (4): 1206-1213. DOI: 10.11772/j.issn.1001-9081.2022030444
Zhenhua YU1, Zhengqi LIU1, Ying LIU2, Cheng GUO3
Received: 2022-04-08
Revised: 2022-06-02
Accepted: 2022-06-02
Online: 2023-01-11
Published: 2023-04-10
Contact: Ying LIU
About author: YU Zhenhua, born in 1977, Ph. D., professor. His research interests include software defect prediction and cyber-physical systems.
Supported by:
Abstract:
Feature selection is a key step of data preprocessing in software defect prediction. Aiming at the problems of existing feature selection methods, such as insignificant dimension reduction and low classification accuracy of the selected optimal feature subset, a feature selection method based on Self-adaptive Hybrid Particle Swarm Optimization (SHPSO) was proposed for software defect prediction. Firstly, combined with population partitioning, a self-adaptive weight update strategy based on Q-learning was designed, in which Q-learning was introduced to adaptively adjust the inertia weight according to the states of the particles. Secondly, to balance the global search capability in the early stage of the algorithm and the convergence speed in the later stage, time-varying learning factors based on curve adaptation were proposed. Finally, a hybrid position update strategy was adopted to help particles jump out of local optima as soon as possible and to increase the diversity of the particles. Experiments on 12 public software defect datasets show that, compared with the method using all features, commonly used traditional feature selection methods and mainstream feature selection methods based on intelligent optimization algorithms, the proposed method achieves effective results in both improving the classification performance of software defect prediction models and reducing the dimension of the feature space. Compared with the Improved Salp Swarm Algorithm (ISSA), the proposed method increases the classification accuracy by about 1.60% on average and reduces the size of the feature subset by about 63.79% on average. The experimental results show that the proposed method can select feature subsets with higher classification accuracy and fewer features.
CLC number:
Zhenhua YU, Zhengqi LIU, Ying LIU, Cheng GUO. Feature selection method based on self-adaptive hybrid particle swarm optimization for software defect prediction[J]. Journal of Computer Applications, 2023, 43(4): 1206-1213.
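The abstract describes SHPSO as a wrapper-style feature selection method: each particle encodes a candidate feature subset, and its quality reflects both the classification performance of a defect predictor trained on that subset and the subset size. The exact fitness function is not given in the material shown here, so the sketch below is only illustrative: the binary particle encoding, the k-NN classifier, the cross-validated accuracy and the weighting between error rate and subset ratio are all assumptions rather than the authors' exact choices.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(particle, X, y, alpha=0.99):
    """Illustrative wrapper fitness for a binary-encoded feature subset.

    particle : 0/1 vector over the features of X (1 = feature selected)
    alpha    : assumed trade-off between error rate and subset size
    """
    mask = particle.astype(bool)
    if not mask.any():                           # an empty subset is worthless
        return float("inf")
    clf = KNeighborsClassifier(n_neighbors=5)    # classifier choice is an assumption
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    subset_ratio = mask.sum() / X.shape[1]
    # lower is better: weighted error rate plus a penalty on subset size
    return alpha * (1.0 - acc) + (1.0 - alpha) * subset_ratio
```

A fitness of this form would be minimized by a swarm of 40 particles over 100 iterations, the search budget listed later in Tab. 6.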
| U | State |
|---|---|
| [0, 0.1) | s1 |
| [0.1, 0.5] | s2 |
| (0.5, 1] | s3 |

Tab. 1  State of population
| Action | Definition |
|---|---|
| +0.1 | increase the inertia weight by 0.1 |
| -0.1 | decrease the inertia weight by 0.1 |
| +0.0 | keep the inertia weight unchanged |

Tab. 2  Action definition
| State | Action 1 | Action 2 | Action 3 |
|---|---|---|---|
| s1 | Q11 | Q12 | Q13 |
| s2 | Q21 | Q22 | Q23 |
| s3 | Q31 | Q32 | Q33 |

Tab. 3  Q-table
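Tables 1-3 define the pieces of the Q-learning scheme that adapts the inertia weight: the population indicator U determines the state (Tab. 1), the actions are increments of the inertia weight (Tab. 2), and the Q-table keeps one value per state-action pair (Tab. 3). A minimal sketch of how these pieces can be wired together is given below; the reward signal, the learning rate, the discount factor, the ε-greedy exploration rate and the inertia-weight bounds are assumptions, since their exact values are not given in the material shown here.

```python
import random

STATES  = ["s1", "s2", "s3"]          # population states, binned on U (Tab. 1)
ACTIONS = [+0.1, -0.1, +0.0]          # inertia-weight adjustments (Tab. 2)
Q = {s: {a: 0.0 for a in ACTIONS} for s in STATES}   # Q-table of Tab. 3

def state_of(U):
    """Map the indicator U to a population state according to Tab. 1."""
    if U < 0.1:
        return "s1"
    elif U <= 0.5:
        return "s2"
    return "s3"

def choose_action(state, eps=0.1):
    """Epsilon-greedy selection over the Q-table (eps is an assumed value)."""
    if random.random() < eps:
        return random.choice(ACTIONS)
    return max(Q[state], key=Q[state].get)

def update_q(state, action, reward, next_state, lr=0.1, gamma=0.9):
    """Standard Q-learning update; lr and gamma are assumed values."""
    best_next = max(Q[next_state].values())
    Q[state][action] += lr * (reward + gamma * best_next - Q[state][action])

def adjust_inertia(w, action, w_min=0.4, w_max=0.9):
    """Apply the chosen adjustment, clamped to an assumed range."""
    return min(w_max, max(w_min, w + action))
```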
| Actual label | Predicted defective | Predicted non-defective |
|---|---|---|
| Defective | TP | FN |
| Non-defective | FP | TN |

Tab. 4  Confusion matrix
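The confusion matrix in Tab. 4 is the basis of the evaluation: the classification accuracy figures reported in Tabs. 7-9 follow directly from its four entries. The snippet below simply restates that relationship; the counts used are placeholders, not values from the paper.

```python
def classification_accuracy(tp, fn, fp, tn):
    """Accuracy = correctly predicted modules / all modules (Tab. 4 notation)."""
    return (tp + tn) / (tp + tn + fp + fn)

# placeholder counts for illustration only
print(f"{classification_accuracy(tp=30, fn=12, fp=20, tn=265):.4f}")
```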
| Dataset | Language | Granularity | Number of features | Total samples | Defective samples | Non-defective samples |
|---|---|---|---|---|---|---|
| CM1 | C | Function | 37 | 327 | 42 | 285 |
| KC1 | C++ | Function | 21 | 1 183 | 314 | 869 |
| KC3 | Java | Function | 39 | 194 | 36 | 158 |
| MC2 | C | Function | 39 | 125 | 44 | 81 |
| MW1 | C | Function | 37 | 253 | 27 | 226 |
| PC1 | C | Function | 37 | 705 | 61 | 644 |
| PC3 | C | Function | 37 | 1 077 | 134 | 943 |
| PC4 | C | Function | 37 | 1 287 | 177 | 1 110 |
| PC5 | C++ | Function | 38 | 1 711 | 471 | 1 240 |
| ant-1.7 | Java | Class | 20 | 745 | 166 | 579 |
| camel-1.6 | Java | Class | 20 | 965 | 188 | 777 |
| ivy-1.1 | Java | Class | 20 | 107 | 45 | 62 |

Tab. 5  Specific information of datasets
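Tab. 5 shows that the twelve datasets are heavily imbalanced (for example, CM1 has 42 defective versus 285 non-defective modules). A small sketch of how such a dataset might be loaded and its class ratio checked is shown below; the CSV file name and the name of the label column are hypothetical, since the public NASA and PROMISE distributions use several different layouts.

```python
import pandas as pd

# hypothetical CSV layout: metric columns plus a binary "defective" label
df = pd.read_csv("CM1.csv")
X = df.drop(columns=["defective"]).to_numpy()
y = df["defective"].to_numpy()

defect_rate = y.mean()
print(f"{len(y)} modules, {int(y.sum())} defective ({defect_rate:.1%})")
```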
| Algorithm | Parameter settings |
|---|---|
| BPSO | Population size 40; number of iterations 100; inertia weight |
| ISCA | Population size 40; number of iterations 100; balance factor |
| TMGWO | Population size 40; number of iterations 100; weight parameter |
| TVBSSA | Population size 40; number of iterations 100 |
| ISSA | Population size 40; number of iterations 100; 10 local search iterations |
| SHPSO | Population size 40; number of iterations 100; initial inertia weight |

Tab. 6  Algorithm parameters
| Dataset | Average classification accuracy/% | | Average number of features | |
|---|---|---|---|---|
| | All features | SHPSO | All features | SHPSO |
| CM1 | 84.71 | 89.29 | 37 | 2.43 |
| KC1 | 70.91 | 77.02 | 21 | 2.43 |
| KC3 | 77.74 | 86.80 | 39 | 2.13 |
| MC2 | 71.32 | 84.91 | 39 | 2.10 |
| MW1 | 87.85 | 94.60 | 37 | 2.43 |
| PC1 | 91.04 | 93.47 | 37 | 1.90 |
| PC3 | 86.11 | 89.24 | 37 | 2.50 |
| PC4 | 84.85 | 90.90 | 37 | 3.43 |
| PC5 | 68.64 | 78.09 | 38 | 3.93 |
| ant | 78.97 | 84.39 | 20 | 1.80 |
| camel | 77.40 | 81.98 | 20 | 1.37 |
| ivy | 70.00 | 84.04 | 20 | 2.87 |

Tab. 7  Experimental results of SHPSO algorithm and all features method
| Dataset | Average classification accuracy/% | | | Average number of features | | |
|---|---|---|---|---|---|---|
| | CFS | PCA | SHPSO | CFS | PCA | SHPSO |
| CM1 | 85.02 | 83.91 | 89.29 | 5 | 12 | 2.43 |
| KC1 | 71.24 | 71.96 | 77.02 | 8 | 8 | 2.43 |
| KC3 | 81.69 | 79.38 | 86.89 | 3 | 11 | 2.13 |
| MC2 | 70.18 | 69.39 | 84.91 | 10 | 11 | 2.10 |
| MW1 | 87.68 | 89.39 | 94.61 | 9 | 12 | 2.43 |
| PC1 | 90.06 | 90.80 | 93.47 | 8 | 13 | 1.90 |
| PC3 | 85.70 | 85.45 | 89.24 | 9 | 13 | 2.50 |
| PC4 | 86.17 | 87.89 | 90.99 | 4 | 15 | 3.43 |
| PC5 | 68.75 | 74.18 | 78.09 | 11 | 15 | 3.93 |
| ant | 79.57 | 79.23 | 84.39 | 6 | 13 | 1.80 |
| camel | 77.68 | 77.07 | 81.98 | 8 | 13 | 1.37 |
| ivy | 69.60 | 71.11 | 84.04 | 7 | 11 | 2.87 |

Tab. 8  Experimental results of SHPSO algorithm and traditional feature selection methods
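Tab. 8 compares SHPSO against two classical baselines, the filter method CFS [20] and the dimensionality-reduction method PCA [21]. For reference, a PCA baseline of the kind used in such comparisons can be sketched with scikit-learn as below; the synthetic data, the number of retained components and the classifier are assumptions, and CFS is not shown because it has no scikit-learn implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# synthetic stand-in data; in the paper X, y come from the datasets of Tab. 5
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 37))
y = rng.integers(0, 2, size=300)

# number of retained components and the classifier are assumed, not taken from the paper
pca_baseline = make_pipeline(PCA(n_components=12), KNeighborsClassifier(n_neighbors=5))
print(cross_val_score(pca_baseline, X, y, cv=3).mean())
```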
| Dataset | Average classification accuracy/% | | | | | | Average number of features | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | BPSO | ISCA | TMGWO | TVBSSA | ISSA | SHPSO | BPSO | ISCA | TMGWO | TVBSSA | ISSA | SHPSO |
| CM1 | 88.18 | 88.45 | 88.05 | 87.04 | 88.92 | 89.29 | 8.37 | 2.17 | 2.67 | 1.90 | 7.63 | 2.43 |
| KC1 | 76.41 | 76.16 | 75.76 | 74.97 | 76.18 | 77.02 | 4.50 | 2.53 | 3.17 | 3.73 | 4.93 | 2.43 |
| KC3 | 84.29 | 85.48 | 84.46 | 82.60 | 84.24 | 86.89 | 10.20 | 2.47 | 3.37 | 3.97 | 7.27 | 2.13 |
| MC2 | 80.88 | 81.58 | 79.30 | 77.98 | 80.09 | 84.91 | 11.90 | 3.07 | 3.83 | 9.40 | 9.73 | 2.10 |
| MW1 | 92.59 | 93.11 | 92.19 | 91.40 | 93.20 | 94.61 | 9.53 | 2.80 | 2.80 | 5.20 | 7.13 | 2.43 |
| PC1 | 92.36 | 92.91 | 92.86 | 91.45 | 92.99 | 93.47 | 7.50 | 2.27 | 2.47 | 1.53 | 6.80 | 1.90 |
| PC3 | 88.27 | 88.53 | 88.20 | 87.18 | 88.66 | 89.24 | 8.53 | 2.40 | 2.43 | 2.07 | 8.53 | 2.50 |
| PC4 | 88.56 | 89.98 | 89.00 | 86.87 | 89.46 | 90.99 | 8.57 | 3.63 | 3.87 | 3.57 | 8.53 | 3.43 |
| PC5 | 76.14 | 77.21 | 77.12 | 75.08 | 77.20 | 78.09 | 9.23 | 4.20 | 4.60 | 6.60 | 9.33 | 3.93 |
| ant | 84.33 | 83.59 | 82.90 | 83.27 | 83.90 | 84.39 | 4.53 | 2.63 | 2.50 | 4.37 | 4.70 | 1.80 |
| camel | 81.37 | 81.51 | 81.03 | 80.46 | 81.62 | 81.98 | 2.67 | 1.50 | 1.57 | 2.20 | 3.57 | 1.37 |
| ivy | 83.54 | 80.71 | 81.21 | 81.11 | 82.42 | 84.04 | 5.47 | 3.27 | 3.90 | 6.37 | 5.50 | 2.87 |

Tab. 9  Experimental results of SHPSO algorithm and feature selection methods based on intelligent optimization algorithms
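The headline comparison with ISSA quoted in the abstract (about 1.60% higher classification accuracy and about 63.79% smaller feature subsets on average) is consistent with averaging the per-dataset relative improvement over Tab. 9, as the short check below illustrates using the (ISSA, SHPSO) values copied from that table.

```python
# (ISSA, SHPSO) average accuracy/% and average feature count per dataset, from Tab. 9
acc = [(88.92, 89.29), (76.18, 77.02), (84.24, 86.89), (80.09, 84.91),
       (93.20, 94.61), (92.99, 93.47), (88.66, 89.24), (89.46, 90.99),
       (77.20, 78.09), (83.90, 84.39), (81.62, 81.98), (82.42, 84.04)]
feat = [(7.63, 2.43), (4.93, 2.43), (7.27, 2.13), (9.73, 2.10),
        (7.13, 2.43), (6.80, 1.90), (8.53, 2.50), (8.53, 3.43),
        (9.33, 3.93), (4.70, 1.80), (3.57, 1.37), (5.50, 2.87)]

acc_gain  = sum((s - i) / i for i, s in acc)  / len(acc)  * 100   # ~1.60
feat_drop = sum((i - s) / i for i, s in feat) / len(feat) * 100   # ~63.79
print(f"accuracy gain = {acc_gain:.2f}%, subset reduction = {feat_drop:.2f}%")
```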
| Dataset | CFS | PCA | BPSO | ISCA | TMGWO | TVBSSA | ISSA |
|---|---|---|---|---|---|---|---|
| CM1 | 2.47E-06 | 1.70E-06 | 1.12E-04 | 2.53E-03 | 1.31E-03 | 1.02E-05 | 2.49E-03 |
| KC1 | 2.52E-06 | 1.71E-06 | 2.24E-02 | 3.90E-02 | 2.15E-04 | 6.54E-05 | |
| KC3 | 2.36E-06 | 1.65E-06 | 6.06E-05 | 8.54E-06 | 4.64E-06 | 3.52E-06 | 1.56E-05 |
| MC2 | 3.73E-06 | 1.66E-06 | 8.41E-05 | 1.05E-02 | 9.35E-05 | 5.42E-06 | 1.83E-04 |
| MW1 | 1.66E-06 | 1.66E-06 | 4.37E-05 | 2.35E-03 | 1.11E-05 | 4.85E-06 | 1.61E-03 |
| PC1 | 1.72E-06 | 1.65E-06 | 3.49E-06 | 1.84E-03 | 1.25E-02 | 3.88E-06 | 4.95E-02 |
| PC3 | 1.71E-06 | 1.69E-06 | 3.86E-04 | 2.81E-02 | 2.22E-04 | 5.84E-06 | |
| PC4 | 1.72E-06 | 1.72E-06 | 1.72E-06 | 1.43E-03 | 1.67E-04 | 2.51E-06 | 5.78E-04 |
| PC5 | 1.73E-06 | 1.73E-06 | 1.84E-05 | 9.42E-03 | 1.10E-02 | 4.54E-06 | |
| ant | 1.71E-06 | 1.91E-06 | 6.00E-03 | 9.93E-04 | 1.06E-02 | | |
| camel | 1.90E-06 | 1.68E-06 | 5.15E-03 | 4.99E-02 | 7.58E-04 | 6.55E-04 | |
| ivy | 3.64E-06 | 2.47E-06 | 2.24E-02 | 4.97E-02 | 1.67E-02 | | |

Tab. 10  p values of Wilcoxon signed-rank test for classification accuracies of SHPSO algorithm and comparison algorithms
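The p values in Tab. 10 come from Wilcoxon signed-rank tests on the paired classification accuracies of SHPSO and each comparison algorithm over repeated runs; values below 0.05 indicate a statistically significant difference. A minimal sketch of such a test with SciPy is shown below; the two per-run accuracy arrays and the run count are hypothetical placeholders, not the paper's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# hypothetical per-run accuracies on one dataset (30 runs each)
rng = np.random.default_rng(0)
shpso_acc = rng.normal(loc=0.893, scale=0.01, size=30)
issa_acc  = rng.normal(loc=0.889, scale=0.01, size=30)

stat, p = wilcoxon(shpso_acc, issa_acc)   # paired, non-parametric test
print(f"p = {p:.3e}")                     # p < 0.05 would indicate a significant difference
```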
| 1 | 姜佳君,陈俊洁,熊英飞. 软件缺陷自动修复技术综述[J]. 软件学报, 2021, 32(9):2665-2690. 10.13328/j.cnki.jos.006274 | 
| JIANG J J, CHEN J J, XIONG Y F. Survey of automatic program repair techniques[J]. Journal of Software, 2021, 32(9):2665-2690. 10.13328/j.cnki.jos.006274 | |
| 2 | 陈翔,王莉萍,顾庆,等. 跨项目软件缺陷预测方法研究综述[J]. 计算机学报, 2018, 41(1):254-274. 10.11897/SP.J.1016.2018.00254 | 
| CHEN X, WANG L P, GU Q, et al. A survey on cross-project software defect prediction methods[J]. Chinese Journal of Computers, 2018, 41(1): 254-274. 10.11897/SP.J.1016.2018.00254 | |
| 3 | FARIS H, MAFARJA M M, HEIDARI A A, et al. An efficient binary Salp Swarm Algorithm with crossover scheme for feature selection problems[J]. Knowledge-Based Systems, 2018, 154: 43-67. 10.1016/j.knosys.2018.05.009 | 
| 4 | KENNEDY J, EBERHART R. Particle swarm optimization[C]// Proceedings of the 1995 International Conference on Neural Networks - Volume 4. Piscataway: IEEE, 1995: 1942-1948. | 
| 5 | MIRJALILI S, LEWIS A. The whale optimization algorithm[J]. Advances in Engineering Software, 2016, 95:51-67. 10.1016/j.advengsoft.2016.01.008 | 
| 6 | MIRJALILI S, MIRJALILI S M, LEWIS A. Grey wolf optimizer[J]. Advances in Engineering Software, 2014, 69:46-61. 10.1016/j.advengsoft.2013.12.007 | 
| 7 | HEIDARI A A, MIRJALILI S, FARIS H, et al. Harris hawks optimization: algorithm and applications[J]. Future Generation Computer Systems, 2019, 97:849-872. 10.1016/j.future.2019.02.028 | 
| 8 | MIRJALILI S, GANDOMI A H, MIRJALILI S Z, et al. Salp swarm algorithm: a bio-inspired optimizer for engineering design problems[J]. Advances in Engineering Software, 2017, 114:163-191. 10.1016/j.advengsoft.2017.07.002 | 
| 9 | ZHU K, YING S, ZHANG N N, et al. Software defect prediction based on enhanced metaheuristic feature selection optimization and a hybrid deep neural network[J]. Journal of Systems and Software, 2021, 180: No.111026. 10.1016/j.jss.2021.111026 | 
| 10 | XUE Y, TANG T, PANG W, et al. Self-adaptive parameter and strategy based particle swarm optimization for large-scale feature selection problems with multiple classifiers[J]. Applied Soft Computing, 2020, 88: No.106031. 10.1016/j.asoc.2019.106031 | 
| 11 | MAFARJA M, MIRJALILI S. Whale optimization approaches for wrapper feature selection[J]. Applied Soft Computing, 2018, 62: 441-453. 10.1016/j.asoc.2017.11.006 | 
| 12 | TU Q, CHEN X C, LIU X C. Multi-strategy ensemble grey wolf optimizer and its application to feature selection[J]. Applied Soft Computing, 2019, 76:16-30. 10.1016/j.asoc.2018.11.047 | 
| 13 | ZHANG Y N, LIU R J, WANG X, et al. Boosted binary Harris hawks optimizer and feature selection[J]. Engineering with Computers, 2021, 37(4):3741-3770. 10.1007/s00366-020-01028-5 | 
| 14 | 郑延斌,樊文鑫,韩梦云,等. 基于博弈论及Q学习的多Agent协作追捕算法[J]. 计算机应用, 2020, 40(6):1613-1620. 10.3233/jifs-191222 | 
| ZHENG Y B, FAN W X, HAN M Y, et al. Multi-agent collaborative pursuit algorithm based on game theory and Q-learning[J]. Journal of Computer Applications, 2020, 40(6):1613-1620. 10.3233/jifs-191222 | |
| 15 | 陈科. 面向高维数据的分类特征选择方法研究[D]. 济南:山东大学, 2021:32-34. | 
| CHEN K. Research on feature selection methods for high-dimensional classification[D]. Jinan: Shandong University, 2021: 32-34. | |
| 16 | RATNAWEERA A, HALGAMUGE S K, WATSON H C. Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients[J]. IEEE Transactions on Evolutionary Computation, 2004, 8(3): 240-255. 10.1109/tevc.2004.826071 | 
| 17 | MIRJALILI S. SCA: a Sine Cosine Algorithm for solving optimization problems[J]. Knowledge-Based Systems, 2016, 96:120-133. 10.1016/j.knosys.2015.12.022 | 
| 18 | MEHTA S, PATNAIK K S. Improved prediction of software defects using ensemble machine learning techniques[J]. Neural Computing and Applications, 2021, 33(16): 10551-10562. 10.1007/s00521-021-05811-3 | 
| 19 | SHEPPERD M, SONG Q B, SUN Z B, et al. Data quality: some comments on the NASA software defect datasets[J]. IEEE Transactions on Software Engineering, 2013, 39(9):1208-1215. 10.1109/tse.2013.11 | 
| 20 | HALL M A. Correlation-based feature selection of discrete and numeric class machine learning[C]// Proceedings of the 17th International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers Inc., 2000:359-366. | 
| 21 | WOLD S, ESBENSEN K, GELADI P. Principal component analysis[J]. Chemometrics and Intelligent Laboratory Systems, 1987, 2(1/2/3): 37-52. 10.1016/0169-7439(87)80084-9 | 
| 22 | KENNEDY J, EBERHART R C. A discrete binary version of the particle swarm algorithm[C]// Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation - Volume 5. Piscataway: IEEE, 1997:4104-4108. 10.1109/icsmc.1997.625706 | 
| 23 | SINDHU R, NGADIRAN R, YACOB Y M, et al. Sine-cosine algorithm for feature selection with elitism strategy and new updating mechanism[J]. Neural Computing and Applications, 2017, 28(10): 2947-2958. 10.1007/s00521-017-2837-7 | 
| 24 | ABDEL-BASSET M, EL-SHAHAT D, EL-HENAWY I, et al. A new fusion of grey wolf optimizer algorithm with a two-phase mutation for feature selection[J]. Expert Systems with Applications, 2020, 139: No.112824. 10.1016/j.eswa.2019.112824 | 
| 25 | FARIS H, HEIDARI A A, AL-ZOUBI A M, et al. Time-varying hierarchical chains of salps with random weight networks for feature selection[J]. Expert Systems with Applications, 2020, 140: No.112898. 10.1016/j.eswa.2019.112898 | 
| 26 | TUBISHAT M, IDRIS N, SHUIB L, et al. Improved Salp Swarm Algorithm based on opposition based learning and novel local search algorithm for feature selection[J]. Expert Systems with Applications, 2020, 145: No.113122. 10.1016/j.eswa.2019.113122 | 
| 27 | HUDA R K, BANKA H. New efficient initialization and updating mechanisms in PSO for feature selection and classification[J]. Neural Computing and Applications, 2020, 32(8): 3283-3294. 10.1007/s00521-019-04395-3 | 
| 28 | NGUYEN B H, XUE B, ANDREAE P, et al. A new binary particle swarm optimization approach: momentum and dynamic balance between exploration and exploitation[J]. IEEE Transactions on Cybernetics, 2021, 51(2): 589-603. 10.1109/tcyb.2019.2944141 | 
| 29 | KILIÇ F, KAYA Y, YILDIRIM S. A novel multi population based particle swarm optimization for feature selection[J]. Knowledge-Based Systems, 2021, 219: No.106894. 10.1016/j.knosys.2021.106894 | 
| 30 | HU X M, ZHANG S R, LI M, et al. Multimodal particle swarm optimization for feature selection[J]. Applied Soft Computing, 2021, 113(Pt A): No.107887. 10.1016/j.asoc.2021.107887 | 