计算机应用 (Journal of Computer Applications) ›› 2017, Vol. 37 ›› Issue (3): 746-749. DOI: 10.11772/j.issn.1001-9081.2017.03.746

• Advanced Computing •

Fast learning algorithm of multi-output support vector regression with data-dependent kernel

WANG Dingcheng, ZHAO Youzhi, CHEN Beijing, LU Yiyi

  1. School of Computer and Software, Nanjing University of Information Science and Technology, Nanjing Jiangsu 210044, China
  • Received: 2016-08-18 Revised: 2016-08-23 Online: 2017-03-10 Published: 2017-03-22
  • Corresponding author: WANG Dingcheng
  • About the authors: WANG Dingcheng (1967-), male, born in Huoshan, Anhui, research fellow, Ph. D., research interests: intelligent computing; ZHAO Youzhi (1990-), male, born in Suqian, Jiangsu, research interests: intelligent computing; CHEN Beijing (1981-), male, born in Shicheng, Jiangxi, associate professor, Ph. D., research interests: image processing; LU Yiyi (1993-), female, born in Nantong, Jiangsu, research interests: intelligent computing.
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (61572258, 61375030) and the Natural Science Foundation of Jiangsu Province (BK2012858, BK20151530).

Abstract: The Multi-output Support Vector Regression (MSVR) algorithm fitted by the gradient descent method converges slowly and gives low prediction accuracy during model parameter fitting. To address this, a modified quasi-Newton algorithm (BFGS) based on the rank-2 correction rule, which has a second-order convergence rate, was used to fit the model parameters of the MSVR algorithm; meanwhile, to guarantee a sufficient decrease at each iteration and global convergence, the step size factor was determined by an inexact line search technique. By analyzing the geometric structure of the kernel function in Support Vector Machine (SVM), a data-dependent kernel function was constructed to replace the traditional kernel function, yielding the multi-output data-dependent kernel support vector regression model. The model was compared with the multi-output support vector regression models fitted by the gradient descent method and the modified Newton method. The experimental results show that, with 200 samples, the iteration time of the proposed algorithm is 72.98 s, while that of the modified Newton method is 116.34 s and that of the gradient descent method is 2065.22 s. The proposed algorithm reduces the model iteration time and converges faster.
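The abstract combines two ingredients: a data-dependent kernel obtained by rescaling a conventional kernel, and quasi-Newton (BFGS) parameter fitting with an inexact line search. The sketch below is a minimal illustration of both ideas rather than the paper's implementation: the conformal scaling factor c(x) and its center points, the Armijo backtracking rule (standing in for whichever inexact line-search conditions the authors use), and the smooth regularized least-squares surrogate for the MSVR objective are all illustrative assumptions, as are the function names and parameter values.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Standard Gaussian (RBF) base kernel matrix."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

def conformal_factor(X, centers, tau=1.0):
    """Data-dependent scaling c(x) built around a few chosen points
    (assumption: a conformal transformation in the style of Amari and Wu)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(centers**2, 1)[None, :] - 2 * X @ centers.T
    return np.exp(-tau * d2).sum(axis=1) + 1e-12

def data_dependent_kernel(X1, X2, centers, gamma=1.0, tau=1.0):
    """K_tilde(x, x') = c(x) * c(x') * K(x, x')."""
    c1 = conformal_factor(X1, centers, tau)
    c2 = conformal_factor(X2, centers, tau)
    return c1[:, None] * rbf_kernel(X1, X2, gamma) * c2[None, :]

def bfgs(f, grad, x0, max_iter=200, tol=1e-6):
    """Minimal BFGS with an inexact (Armijo backtracking) line search."""
    x = x0.copy()
    H = np.eye(x.size)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                           # quasi-Newton search direction
        alpha, fx = 1.0, f(x)
        # accept the first step length giving a sufficient decrease
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                       # rank-2 (BFGS) update of H
            rho = 1.0 / sy
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Toy multi-output fit: regularized least squares in the data-dependent kernel,
# used here as a smooth surrogate for the MSVR objective (the paper's model
# uses an epsilon-insensitive loss, omitted for brevity).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
Y = np.column_stack([np.sin(X[:, 0]), np.cos(X[:, 1])])      # two outputs
K = data_dependent_kernel(X, X, centers=X[::10], gamma=0.5, tau=0.1)
lam, n, m = 1e-2, X.shape[0], Y.shape[1]

def loss(b):
    B = b.reshape(n, m)
    R = K @ B - Y
    return 0.5 * np.sum(R**2) + 0.5 * lam * np.trace(B.T @ K @ B)

def grad(b):
    B = b.reshape(n, m)
    return (K @ (K @ B - Y) + lam * K @ B).ravel()

B_opt = bfgs(loss, grad, np.zeros(n * m)).reshape(n, m)
print("training RMSE:", np.sqrt(np.mean((K @ B_opt - Y)**2)))
```

In this form the rank-2 BFGS update keeps the inverse-Hessian approximation positive definite as long as the curvature condition s·y > 0 holds, which is why the update is skipped otherwise; the inexact line search only has to deliver a sufficient decrease, which is what makes each iteration cheap compared with an exact search.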

Key words: data-dependent kernel, multi-output support vector regression, optimization algorithm, quasi-Newton algorithm

CLC number: