Journal of Computer Applications ›› 2016, Vol. 36 ›› Issue (11): 2985-2992. DOI: 10.11772/j.issn.1001-9081.2016.11.2985

• Papers of the 16th China Rough Set and Soft Computing Joint Academic Conference (CRSSC 2016) •

Super pixel segmentation algorithm based on Hadoop

WANG Chunbo, DONG Hongbin, YIN Guisheng, LIU Wenjie

  1. College of Computer Science and Technology, Harbin Engineering University, Harbin Heilongjiang 150001, China
  • Received: 2016-06-21  Revised: 2016-06-27  Online: 2016-11-10  Published: 2016-11-12
  • Corresponding author: DONG Hongbin
  • About the authors: WANG Chunbo, born in 1991 in Jiamusi, Heilongjiang, M. S. candidate, research interests: image processing, data mining; DONG Hongbin, born in 1963 in Harbin, Heilongjiang, professor, Ph. D., CCF member, research interests: artificial intelligence, multi-agent systems, evolutionary computation; YIN Guisheng, born in 1964 in Taixing, Jiangsu, professor, Ph. D., CCF member, research interests: database systems, virtual reality; LIU Wenjie, born in 1992 in Tianmen, Hubei, M. S. candidate, research interests: artificial intelligence, data mining.
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (61472095, 41306086), the Youth Science Foundation of the National Natural Science Foundation of China (61502116), and the Open Fund of the Key Laboratory of Intelligent Education and Information Engineering of the Department of Education of Heilongjiang Province.

Abstract: To address the high time complexity of pixel-level segmentation of high-resolution images, a super pixel segmentation algorithm was proposed. Super pixels instead of the original pixels were used as the processing elements of segmentation, combining the distributed nature of Hadoop with the block structure of super pixels. Firstly, an adaptive multi-task splitting algorithm combining static and dynamic strategies was proposed, which decoupled the blocks of the Hadoop Distributed File System (HDFS) from the units of task distribution. Secondly, a parallel watershed segmentation algorithm was proposed for each Map task, in which the formation of super pixels was constrained in distance and gradient by the boundaries of the super pixel blocks. Meanwhile, two strategies for merging super pixels between blocks in the Shuffle phase were proposed and compared. Finally, the merging of super pixels within blocks was optimized in the Reduce task to complete the final segmentation. The experimental results show that the proposed algorithm outperforms the Simple Linear Iterative Clustering (SLIC) algorithm and the Normalized cut (Ncut) algorithm on segmentation quality measures such as Boundary Recall (BR) and Under-segmentation Error (UE), and significantly reduces the segmentation time for high-resolution images.
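
For readers unfamiliar with Hadoop, the sketch below illustrates how the MapReduce pipeline summarized in the abstract could be wired together. It is a minimal illustration, not the authors' implementation: the names SuperPixelJob, SegmentMapper, MergeReducer, and watershed are hypothetical, the segmentation and merging bodies are placeholder stubs, and a custom InputFormat emitting (blockId, pixel bytes) records, as the decoupled splitting would require, is assumed but not shown.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.BytesWritable;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class SuperPixelJob {

        // Map task: segment one image block into super pixels.
        public static class SegmentMapper
                extends Mapper<IntWritable, BytesWritable, IntWritable, BytesWritable> {
            @Override
            protected void map(IntWritable blockId, BytesWritable pixels, Context ctx)
                    throws IOException, InterruptedException {
                // Stand-in for the distance- and gradient-constrained
                // watershed segmentation described in the abstract.
                byte[] labels = watershed(pixels.copyBytes());
                ctx.write(blockId, new BytesWritable(labels));
            }

            private byte[] watershed(byte[] block) {
                return block; // hypothetical stub; no real segmentation here
            }
        }

        // Reduce task: finish merging super pixels within a block.
        public static class MergeReducer
                extends Reducer<IntWritable, BytesWritable, IntWritable, Text> {
            @Override
            protected void reduce(IntWritable blockId, Iterable<BytesWritable> parts,
                    Context ctx) throws IOException, InterruptedException {
                // Stand-in for the optimized intra-block merging that yields
                // the final segmentation in the paper's Reduce task.
                ctx.write(blockId, new Text("merged"));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "superpixel-segmentation");
            job.setJarByClass(SuperPixelJob.class);
            // A custom InputFormat (not shown) is assumed to emit
            // (blockId, pixel bytes) records, decoupling HDFS blocks
            // from the units of task distribution.
            job.setMapperClass(SegmentMapper.class);
            job.setReducerClass(MergeReducer.class);
            job.setMapOutputKeyClass(IntWritable.class);
            job.setMapOutputValueClass(BytesWritable.class);
            job.setOutputKeyClass(IntWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

In the actual algorithm, the inter-block merging attributed to the Shuffle phase would presumably be achieved by keying boundary super pixels so that super pixels from adjacent blocks are grouped at the same Reducer.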

Key words: Hadoop, image segmentation, super pixel, parallel algorithm, MapReduce

CLC number: