Journal of Computer Applications ›› 2019, Vol. 39 ›› Issue (5): 1364-1367. DOI: 10.11772/j.issn.1001-9081.2018112346

• Data Science and Technology •

  • Corresponding author: ZHANG Zongtang
  • About the authors: ZHANG Zongtang, born in 1989 in Qingdao, Shandong, Ph.D., whose research interests include underwater acoustic target recognition; CHEN Zhe, born in 1987 in Siping, Jilin, lecturer, Ph.D., whose research interests include underwater acoustic target recognition; DAI Weiguo, born in 1968 in Shenzhou, Hebei, professor, Ph.D., whose research interests include underwater acoustic target recognition.

Over sampling ensemble algorithm based on margin theory

ZHANG Zongtang, CHEN Zhe, DAI Weiguo   

  1. Navigation and Observation Department, Navy Submarine Academy, Qingdao Shandong 266000, China
  • Received:2018-11-26 Revised:2018-12-12 Online:2019-05-14 Published:2019-05-10



Abstract: In order to solve the problem that traditional ensemble algorithms are not suitable for imbalanced data classification, an Over Sampling AdaBoost algorithm based on Margin theory (MOSBoost) was proposed. Firstly, the margins of the original samples were obtained by pre-training. Then, the minority-class samples were heuristically duplicated according to their margin ranking, forming a new balanced sample set. Finally, the final ensemble classifier was obtained by training AdaBoost on the balanced sample set. In experiments on UCI datasets, F-measure and G-mean were used to evaluate MOSBoost, AdaBoost, Random OverSampling AdaBoost (ROSBoost) and Random UnderSampling AdaBoost (RDSBoost). The experimental results show that MOSBoost is superior to the other three algorithms; compared with AdaBoost, MOSBoost improves by 8.4% and 6.2% under the F-measure and G-mean criteria respectively.
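The three-step pipeline described in the abstract (pre-train to obtain margins, duplicate low-margin minority samples until the classes balance, retrain AdaBoost) can be sketched as follows. This is a minimal sketch in Python with scikit-learn under stated assumptions: the function name `mosboost_fit`, the use of `decision_function` as the margin score, and the cyclic duplication heuristic are all illustrative choices, since the paper's exact heuristic is not reproduced in the abstract.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def mosboost_fit(X, y, minority_label=1, random_state=0):
    """Margin-guided oversampling followed by AdaBoost training (sketch).

    Assumes binary labels {0, 1} with `minority_label` the rarer class.
    """
    # 1. Pre-train an AdaBoost ensemble to obtain per-sample margins.
    pre = AdaBoostClassifier(random_state=random_state).fit(X, y)
    # decision_function > 0 predicts the positive class; multiplying by
    # a +/-1 label gives a signed margin (negative => misclassified).
    sign = np.where(y == minority_label, 1.0, -1.0)
    margin = sign * pre.decision_function(X)

    # 2. Rank minority samples by ascending margin (hardest first) and
    #    duplicate them cyclically until the two classes are balanced.
    minority_idx = np.flatnonzero(y == minority_label)
    need = max(0, int(np.sum(y != minority_label)) - len(minority_idx))
    order = minority_idx[np.argsort(margin[minority_idx])]
    extra = np.resize(order, need)  # cycles through the sorted indices

    X_bal = np.vstack([X, X[extra]])
    y_bal = np.concatenate([y, y[extra]])

    # 3. Train the final ensemble classifier on the balanced set.
    return AdaBoostClassifier(random_state=random_state).fit(X_bal, y_bal)
```

Unlike random oversampling (the ROSBoost baseline in the abstract), the duplication here is deterministic and concentrates copies on low-margin, i.e. hard, minority samples.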

Key words: imbalanced data, margin theory, over sampling method, ensemble classifier, machine learning

CLC number: