Journal of Computer Applications ›› 2019, Vol. 39 ›› Issue (11): 3158-3162. DOI: 10.11772/j.issn.1001-9081.2019051180

• Papers from the 2019 CCF Conference on Artificial Intelligence (CCFAI 2019) •

Image feature point matching method based on distance fusion

XIU Chunbo1,2, MA Yunfei1, PAN Xiaonan1   

1. School of Electrical Engineering and Automation, Tianjin Polytechnic University, Tianjin 300387, China;
    2. Tianjin Key Laboratory of Advanced Electrical Engineering and Energy Technology(Tianjin Polytechnic University), Tianjin 300387, China
  • Received: 2019-05-24 Revised: 2019-07-17 Online: 2019-09-11 Published: 2019-11-10
  • Corresponding author: XIU Chunbo
  • About the authors: XIU Chunbo (1978-), male, born in Daqing, Heilongjiang, Ph. D., professor, research interests: neural networks, system modeling, chaos control; MA Yunfei (1995-), female, born in Shijiazhuang, Hebei, M. S. candidate, research interests: image processing, pattern recognition; PAN Xiaonan (1992-), female, born in Qingyang, Gansu, M. S. candidate, research interests: neural networks, target tracking.
  • Supported by:
    This work is partially supported by the Tianjin Natural Science Foundation (18JCYBJC88300, 18JCYBJC88400).

Abstract: In order to reduce the high mismatching rate of the ORB (Oriented FAST and Rotated BRIEF) algorithm caused by the lack of scale invariance of its feature points, and to enhance the robustness of the descriptors of the Binary Robust Independent Elementary Features (BRIEF) algorithm to noise, an improved feature point matching method was proposed. The Speeded-Up Robust Features (SURF) algorithm was used to extract feature points, and the BRIEF algorithm with direction information was used to describe them. Random point pairs were selected in the neighborhood of each feature point, the grayscale comparisons and the similarity comparisons of these point pairs were encoded separately, and the Hamming distance was used to calculate the difference between the two codes. The similarity distance between feature points was then measured by adaptive weighted fusion of the two Hamming distances. Experimental results show that the improved method adapts better to scale changes, illumination changes and blurring of images, achieves a higher correct matching rate of feature points than the conventional ORB matching method, and can be used to improve the performance of image stitching.
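The surrounding pipeline can be outlined with OpenCV. The sketch below is only a minimal illustration under stated assumptions: it uses SURF detection, an orientation-aware BRIEF descriptor, and brute-force Hamming matching from opencv-contrib. The paper's second code (the similarity comparison of random point pairs) and the adaptive weighted fusion of the two Hamming distances are not reproduced here; the function name match_keypoints and the Hessian threshold are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the surrounding pipeline only (not the authors' full method):
# SURF detection -> orientation-aware BRIEF description -> Hamming matching.
# Requires opencv-contrib-python built with the nonfree modules (SURF is patented).
import cv2

def match_keypoints(img1, img2, hessian_threshold=400):
    """Return keypoints of both grayscale images and their Hamming-distance matches."""
    # Scale-invariant keypoints, replacing ORB's FAST corners, which lack scale invariance.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp1 = surf.detect(img1, None)
    kp2 = surf.detect(img2, None)

    # BRIEF descriptors steered by each keypoint's orientation
    # ("BRIEF with direction information"): 32-byte codes, use_orientation=True.
    brief = cv2.xfeatures2d.BriefDescriptorExtractor_create(32, True)
    kp1, des1 = brief.compute(img1, kp1)
    kp2, des2 = brief.compute(img2, kp2)

    # Hamming distance over the binary codes. The paper additionally encodes a
    # similarity comparison of the same random point pairs and fuses the two
    # Hamming distances with adaptive weights before ranking matches; that
    # fusion step is omitted in this sketch.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return kp1, kp2, sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Example usage (image paths are placeholders):
# img_a = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
# img_b = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)
# kp_a, kp_b, matches = match_keypoints(img_a, img_b)
```

For image stitching, the top-ranked matches would typically be passed to a RANSAC-based homography estimation; that step is standard practice rather than a detail given in the abstract.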

Key words: feature point matching, image stitching, weighted fusion, Hamming distance, scale invariance

CLC Number: