《计算机应用》 2024, Vol. 44, Issue (3): 671-676. DOI: 10.11772/j.issn.1001-9081.2023040441

• 人工智能 •

基于改进实数编码遗传算法的神经网络超参数优化

佘维1,2, 李阳3,5, 钟李红2,4,5, 孔德锋6, 田钊1,3

    1.郑州大学 网络空间安全学院, 郑州 450001
    2.嵩山实验室, 郑州 450000
    3.互联网医疗与健康服务河南省协同创新中心(郑州大学), 郑州 450052
    4.郑州市区块链与数据智能重点实验室(郑州大学), 郑州 450001
    5.郑州大学 计算机与人工智能学院, 郑州 450001
    6.军事科学院国防工程研究院 工程防护研究所, 河南 洛阳 471023
  • 收稿日期:2023-04-18 修回日期:2023-05-25 接受日期:2023-05-31 发布日期:2023-12-04 出版日期:2024-03-10
  • 通讯作者: 田钊
  • 作者简介:佘维(1977—),男,湖南常德人,教授,博士生导师,博士,CCF会员,主要研究方向:复杂系统建模与仿真、机器学习、区块链、数据智能
    李阳(1999—),男,河南开封人,硕士研究生,主要研究方向:群体智能、机器学习
    钟李红(1997—),女,四川射洪人,博士研究生,主要研究方向:机器学习、推荐系统、群体智能
    孔德锋(1989—),男,河南郸城人,工程师,主要研究方向:军事运筹学、毁伤效能评估
    田钊(1985—),男,河南荥阳人,副教授,博士,主要研究方向:群体智能、机器学习、区块链、数据智能。
  • 基金资助:
    嵩山实验室预研项目(YYYY022022003);河南省重点研发与推广专项(212102310039)

Hyperparameter optimization for neural network based on improved real coding genetic algorithm

Wei SHE1,2, Yang LI3,5, Lihong ZHONG2,4,5, Defeng KONG6, Zhao TIAN1,3

    1. School of Cyber Science and Engineering, Zhengzhou University, Zhengzhou Henan 450001, China
    2. Songshan Laboratory, Zhengzhou Henan 450000, China
    3. Henan Provincial Collaborative Innovation Center for Internet Medical and Health Services, Zhengzhou Henan 450052, China
    4. Zhengzhou Key Laboratory of Blockchain and Data Intelligence (Zhengzhou University), Zhengzhou Henan 450001, China
    5. School of Computer and Artificial Intelligence, Zhengzhou University, Zhengzhou Henan 450001, China
    6. Institute of Engineering Protection, National Defense Engineering Research Institute of the Academy of Military Sciences, Luoyang Henan 471023, China
  • Received: 2023-04-18 Revised: 2023-05-25 Accepted: 2023-05-31 Online: 2023-12-04 Published: 2024-03-10
  • Contact: Zhao TIAN
  • About author:SHE Wei, born in 1977, Ph. D., professor. His research interests include complex system modeling and simulation, machine learning, blockchain, data intelligence.
    LI Yang, born in 1999, M. S. candidate. His research interests include swarm intelligence, machine learning.
    ZHONG Lihong, born in 1997, Ph. D. candidate. Her research interests include machine learning, recommendation system, swarm intelligence.
    KONG Defeng, born in 1989, engineer. His research interests include military operations research, damage effectiveness evaluation.
    TIAN Zhao, born in 1985, Ph. D., associate professor. His research interests include swarm intelligence, machine learning, blockchain, data intelligence.
  • Supported by:
    Songshan Laboratory Pre-research Project (YYYY022022003); Key Research and Development and Promotion Project in Henan Province (212102310039)

摘要:

针对神经网络超参数优化效果差、容易陷入次优解和优化效率低的问题,提出一种基于改进实数编码遗传算法(IRCGA)的深度神经网络超参数优化算法——IRCGA-DNN(IRCGA for Deep Neural Network)。首先,采用实数编码方式表示超参数的取值,使超参数的搜索空间更灵活;然后,引入分层比例选择算子增加解集多样性;最后,分别设计了改进的单点交叉和变异算子,以更全面地探索超参数空间,提高优化算法的效率和质量。基于两个仿真数据集,验证IRCGA-DNN的毁伤效果预测性能和收敛效率。实验结果表明,在两个数据集上,与GA-DNN(Genetic Algorithm for Deep Neural Network)相比,所提算法的收敛迭代次数分别减少了8.7%和13.6%,均方误差(MSE)相差不大;与IGA-DNN(Improved GA-DNN)相比,IRCGA-DNN的收敛迭代次数分别减少了22.2%和13.6%。可见,所提算法的收敛速度和预测性能均更优,能有效处理神经网络超参数优化问题。

关键词: 实数编码, 遗传算法, 超参数优化, 进化神经网络, 机器学习

Abstract:

To address the problems of poor optimization results, easily falling into suboptimal solutions, and low efficiency in neural network hyperparameter optimization, a hyperparameter optimization algorithm for deep neural networks based on an Improved Real Coding Genetic Algorithm (IRCGA) was proposed, named IRCGA-DNN (IRCGA for Deep Neural Network). Firstly, a real-coded form was used to represent the values of hyperparameters, which made the search space of hyperparameters more flexible. Then, a hierarchical proportional selection operator was introduced to enhance the diversity of the solution set. Finally, improved single-point crossover and mutation operators were designed respectively to explore the hyperparameter space more thoroughly and to improve the efficiency and quality of the optimization algorithm. Two simulation datasets were used to validate the damage effectiveness prediction performance and convergence efficiency of IRCGA-DNN. The experimental results on the two datasets indicate that, compared with GA-DNN (Genetic Algorithm for Deep Neural Network), the proposed algorithm reduces the number of convergence iterations by 8.7% and 13.6% respectively, with little difference in Mean Square Error (MSE); compared with IGA-DNN (Improved Genetic Algorithm for Deep Neural Network), IRCGA-DNN reduces the number of convergence iterations by 22.2% and 13.6% respectively. These results show that the proposed algorithm is superior in both convergence speed and prediction performance, and can effectively handle the hyperparameter optimization problem of neural networks.
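
As a rough illustration of the optimization loop described in the abstract, the following Python sketch runs a real-coded genetic algorithm over a small hyperparameter space with elitist proportional selection, single-point crossover, and mutation. It is a minimal sketch under stated assumptions, not the paper's implementation: the hyperparameter bounds, the toy fitness function (standing in for training a DNN and measuring its validation MSE), and all operator settings are illustrative.

import random

# Real-coded hyperparameter search space: one (lower, upper) bound per gene.
# The three genes below (learning rate, hidden units, dropout rate) are
# assumed for illustration; the paper's actual search space may differ.
BOUNDS = [(1e-4, 1e-1), (16, 256), (0.0, 0.5)]

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def fitness(ind):
    # Placeholder objective: in IRCGA-DNN this would be the (negated) validation
    # MSE of a deep neural network trained with the decoded hyperparameters.
    lr, units, dropout = ind
    return -((lr - 0.01) ** 2 + (units / 256 - 0.5) ** 2 + (dropout - 0.2) ** 2)

def select(pop, scores, n_elite=2):
    # Stand-in for the hierarchical proportional selection operator: keep the
    # best individuals, then fill the rest by fitness-proportional sampling.
    ranked = sorted(zip(scores, pop), key=lambda pair: pair[0], reverse=True)
    elites = [ind[:] for _, ind in ranked[:n_elite]]
    weights = [s - min(scores) + 1e-9 for s in scores]
    rest = [ind[:] for ind in random.choices(pop, weights=weights, k=len(pop) - n_elite)]
    return elites + rest

def crossover(a, b, rate=0.8):
    # Single-point crossover on the real-coded gene vectors.
    if random.random() < rate and len(a) > 1:
        point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:], b[:point] + a[point:]
    return a[:], b[:]

def mutate(ind, rate=0.1):
    # Uniform mutation: resample a gene within its bounds with small probability.
    return [random.uniform(lo, hi) if random.random() < rate else gene
            for gene, (lo, hi) in zip(ind, BOUNDS)]

def run_ga(pop_size=20, generations=50):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in population]
        parents = select(population, scores)
        children = []
        for i in range(0, pop_size - 1, 2):
            c1, c2 = crossover(parents[i], parents[i + 1])
            children.extend([mutate(c1), mutate(c2)])
        population = children[:pop_size] or parents
    return max(population, key=fitness)

if __name__ == "__main__":
    print("best hyperparameters found:", run_ga())

In the actual IRCGA-DNN, each fitness evaluation would train a deep neural network with the candidate's hyperparameters and score it by validation error, which is why reducing the number of convergence iterations matters for overall optimization cost.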

Key words: real coding, genetic algorithm, hyperparameter optimization, evolutionary neural network, machine learning

中图分类号: