Journal of Computer Applications ›› 2016, Vol. 36 ›› Issue (3): 697-702. DOI: 10.11772/j.issn.1001-9081.2016.03.697

• Artificial Intelligence •

Deep learning algorithm optimization based on combination of auto-encoders

DENG Junfeng1,2, ZHANG Xiaolong1,2

  1. School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan, Hubei 430065, China;
  2. Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan, Hubei 430065, China
  • Received: 2015-08-13 Revised: 2015-10-14 Online: 2016-03-10 Published: 2016-03-17
  • Corresponding author: ZHANG Xiaolong
  • About the authors: DENG Junfeng (1989-), male, born in Huanggang, Hubei, M.S. candidate; research interests: machine learning, data mining. ZHANG Xiaolong (1963-), male, born in Yongxin, Jiangxi, professor, Ph.D. supervisor; research interests: data mining, machine learning, biological information processing.
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (61273225) and the National Key Technology R&D Program of China (2012BAC22B01).

Abstract: In order to improve the learning accuracy of the Auto-Encoder (AE) algorithm and further reduce the classification error rate, the Sparse marginalized Denoising Auto-Encoder (SmDAE) was proposed by combining the Sparse Auto-Encoder (SAE) and the marginalized Denoising Auto-Encoder (mDAE). SmDAE is an auto-encoder on which the constraint conditions of both SAE and mDAE are imposed, so that it simultaneously has the sparsity constraint of SAE and the marginalized denoising constraint of mDAE, which enhances the learning ability of the auto-encoder. Experimental results show that SmDAE outperforms both SAE and mDAE in several classification tasks; comparative experiments with the Convolutional Neural Network (CNN) show that the more robust SmDAE model, which incorporates the marginalized denoising constraint, also achieves better classification accuracy than CNN.

Key words: deep learning, Auto-Encoder (AE), Sparse Auto-Encoder (SAE), Denoising Auto-Encoder (DAE), Convolutional Neural Network (CNN)
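The core idea described in the abstract — a single auto-encoder trained under both a sparsity constraint and a denoising constraint — can be sketched in a few lines of numpy. This is a minimal illustrative implementation, not the authors' code: it applies explicit masking noise to the input rather than the analytic noise marginalization of mDAE, uses a KL-divergence sparsity penalty as in a standard SAE, and all variable names, network sizes, and hyperparameters are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: 200 samples with 20 features in [0, 1].
X = rng.random((200, 20))

n_in, n_hid = 20, 10
W1 = rng.normal(0.0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)

rho = 0.05      # target average hidden activation (sparsity)
beta = 0.1      # weight of the sparsity penalty
noise_p = 0.3   # probability of zeroing each input (masking noise)
lr = 0.5

def loss_and_grads(X):
    m = X.shape[0]
    # Denoising constraint: corrupt the input, reconstruct the clean input.
    Xc = X * (rng.random(X.shape) > noise_p)
    H = sigmoid(Xc @ W1 + b1)        # hidden code from corrupted input
    Y = sigmoid(H @ W2 + b2)         # reconstruction of the clean input
    recon = 0.5 * np.sum((Y - X) ** 2) / m
    # Sparsity constraint: KL divergence between target rho and mean activation.
    rho_hat = H.mean(axis=0)
    kl = np.sum(rho * np.log(rho / rho_hat)
                + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
    # Backpropagation through both terms.
    dY = (Y - X) * Y * (1 - Y) / m
    dW2, db2 = H.T @ dY, dY.sum(axis=0)
    d_sparse = beta * (-rho / rho_hat + (1 - rho) / (1 - rho_hat)) / m
    dZ1 = (dY @ W2.T + d_sparse) * H * (1 - H)
    dW1, db1 = Xc.T @ dZ1, dZ1.sum(axis=0)
    return recon + beta * kl, (dW1, db1, dW2, db2)

losses = []
for _ in range(200):
    L, (dW1, db1, dW2, db2) = loss_and_grads(X)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    losses.append(L)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Training the combined objective drives the reconstruction error down while the KL term keeps hidden units mostly inactive; the paper's SmDAE replaces the sampled masking noise above with mDAE's closed-form marginalization over the noise distribution, which removes the sampling variance.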

CLC number: