Journal of Computer Applications


Diffusion-guided evolutionary neural architecture search algorithm


  • Received: 2026-01-20 Revised: 2026-03-19 Online: 2026-04-23 Published: 2026-04-23


TANG Zeyu, WEN Xiaoyi, XIE Zaipeng

  1. Hohai University
  • Corresponding author: XIE Zaipeng

Abstract: Evolutionary Algorithms (EA) are widely adopted in Neural Architecture Search (NAS) for their strong global exploration capability. However, traditional mutation operators often lack performance-aware guidance and offer limited local exploitation, which constrains both search efficiency and architecture quality. To address this, this paper proposes Diffusion-Guided Evolutionary NAS (DGEA), which improves mutation quality and search capability through distribution-aware and performance-oriented designs. At its core, DGEA uses a discrete diffusion model to generate structural perturbations that guide the evolutionary search: priors learned from high-performing architectures drive controlled noise addition and conditional denoising to produce new high-performance architectures. In addition, an adaptive mutation scheduler dynamically balances exploration and exploitation, while a lightweight Harmony Search (HS) operator serves as a local enhancement module that fine-tunes architecture neighborhoods and suppresses premature convergence. Experiments on NAS-Bench-101, NAS-Bench-201, and ImageNet100 transfer tasks show that DGEA significantly improves search efficiency and effectiveness. Under comparable computational budgets, DGEA reduces the number of architecture evaluations needed to reach the same accuracy threshold by 10%-84% relative to EG-NAS (Neural Architecture Search with Fast Evolutionary Exploration), GEA (Guided Evolutionary Neural Architecture Search with Efficient Performance Estimation), and REA (Regularized Evolution for Image Classifier Architecture Search), with accuracy gains of 0.28 to 0.42 percentage points. Ablation studies verify the contributions and synergy of the diffusion-guided mutation (DGM) mechanism and the HS module. Overall, integrating distribution-aware generative mutation with adaptive local refinement strengthens the performance, stability, and robustness of evolutionary search.

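The abstract describes three cooperating components: mutation guided by a distribution learned from elite architectures, an adaptive mutation schedule, and a lightweight Harmony Search local refinement. The following is a minimal toy sketch of how such a loop could fit together; the operation vocabulary, the synthetic fitness function, the simplified noise-then-resample "diffusion" step, and all function names are illustrative assumptions, not the paper's implementation.

```python
import random

# Hypothetical discrete architecture encoding: a fixed-length list of ops.
OPS = ["conv3x3", "conv1x1", "maxpool", "skip", "none"]
ARCH_LEN = 6
# Hidden "ideal" architecture used only by the synthetic fitness proxy.
TARGET = ["conv3x3", "conv1x1", "skip", "conv3x3", "maxpool", "skip"]

def fitness(arch):
    """Synthetic proxy accuracy: fraction of ops matching the hidden target."""
    return sum(a == t for a, t in zip(arch, TARGET)) / ARCH_LEN

def elite_prior(elites):
    """Per-position op frequencies learned from high-performing architectures."""
    prior = []
    for i in range(ARCH_LEN):
        counts = {op: 1 for op in OPS}  # add-one smoothing
        for arch in elites:
            counts[arch[i]] += 1
        total = sum(counts.values())
        prior.append({op: c / total for op, c in counts.items()})
    return prior

def diffusion_mutate(arch, prior, noise_rate):
    """Corrupt some positions, then 'denoise' each corrupted slot by sampling
    from the elite-derived distribution (a crude discrete-diffusion analogue)."""
    child = list(arch)
    for i in range(ARCH_LEN):
        if random.random() < noise_rate:
            ops, probs = zip(*prior[i].items())
            child[i] = random.choices(ops, weights=probs)[0]
    return child

def harmony_refine(arch, elites, trials=5):
    """Harmony-Search-style local tweak: borrow single ops from elite
    'harmonies' (or pitch-adjust randomly) and keep any improvement."""
    best, best_fit = list(arch), fitness(arch)
    for _ in range(trials):
        cand = list(best)
        i = random.randrange(ARCH_LEN)
        if random.random() < 0.9:
            cand[i] = random.choice(elites)[i]  # memory consideration
        else:
            cand[i] = random.choice(OPS)        # random pitch adjustment
        if fitness(cand) > best_fit:
            best, best_fit = cand, fitness(cand)
    return best

def dgea_search(pop_size=20, generations=30, seed=0):
    random.seed(seed)
    pop = [[random.choice(OPS) for _ in range(ARCH_LEN)] for _ in range(pop_size)]
    for g in range(generations):
        pop.sort(key=fitness, reverse=True)
        elites = pop[: pop_size // 4]
        # Adaptive schedule: high noise early (explore), low noise late (exploit).
        noise_rate = 0.5 * (1 - g / generations) + 0.1
        prior = elite_prior(elites)
        children = [diffusion_mutate(random.choice(elites), prior, noise_rate)
                    for _ in range(pop_size - len(elites))]
        children = [harmony_refine(c, elites) for c in children]
        pop = elites + children
    return max(pop, key=fitness)

best = dgea_search()
print(fitness(best))
```

The elitist loop keeps the top quarter of the population each generation, so the best fitness is monotone non-decreasing; the real method additionally conditions denoising on performance signals rather than simple elite frequency counts.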
