Journal of Computer Applications ›› 2020, Vol. 40 ›› Issue (8): 2460-2464. DOI: 10.11772/j.issn.1001-9081.2019122198

• Frontier and comprehensive applications •

Staging and lesion detection of diabetic retinopathy based on deep convolutional neural network

XIE Yunxia1, HUANG Haiyu1, HU Jianbin2

  1. School of Information Science and Technology, Southwest Jiaotong University, Chengdu, Sichuan 611756, China;
  2. Chengdu East Aier Eye Hospital, Aier Eye Hospital, Chengdu, Sichuan 610056, China
  • Received: 2019-12-31  Revised: 2020-03-05  Online: 2020-08-10  Published: 2020-05-13
  • Corresponding author: HU Jianbin, born in 1964 in Chongqing, M. S., chief physician; his research interests include clinical and basic vitreoretinal research. E-mail: 449703499@qq.com
  • About the authors: XIE Yunxia, born in 1994 in Chengdu, Sichuan, M. S. candidate; her research interests include medical image processing. HUANG Haiyu, born in 1970 in Chongqing, M. S., associate professor; his research interests include medical image processing.

Abstract: The staging precision of Diabetic Retinopathy (DR) has been difficult to improve effectively because fundus images have very high resolution, the lesion features are too scattered to extract, and the positive, negative, hard and easy samples are imbalanced. To address these problems, a DR staging method combining an improved Faster Region-based Convolutional Neural Network (Faster R-CNN) with subgraph segmentation was proposed. First, subgraph segmentation was used to remove the interference of the optic disc region with lesion recognition. Second, a deep residual network was used in the feature extraction stage to cope with the difficulty of extracting the features of lesions that occupy only a small proportion of a high-resolution fundus image. Finally, the Online Hard Example Mining (OHEM) method was applied during Region of Interest (ROI) generation to alleviate the imbalance between positive, negative, hard and easy samples. In DR staging experiments on the publicly available EyePACS dataset, the precision of the proposed method reached 94.83% for stage 0, 86.84% for stage 1, 94.00% for stage 2, 87.21% for stage 3 and 82.96% for stage 4. The experimental results show that the improved Faster R-CNN can stage DR images efficiently and automatically annotate the lesions.

Key words: Diabetic Retinopathy (DR), object detection, Faster Region-based Convolutional Neural Network (Faster R-CNN) algorithm, subgraph segmentation, Online Hard Example Mining (OHEM)
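
To make the pipeline described in the abstract more concrete, the sketch below wires up the two generic building blocks it mentions, a Faster R-CNN detector with a ResNet backbone and an OHEM-style selection of the hardest region proposals, using the public torchvision API. It is a minimal illustration, not the authors' implementation: the class count (five DR stages plus background), the keep ratio of 0.25 and the torchvision version (0.13 or later for the weights argument) are assumptions, and the paper's subgraph segmentation, optic-disc removal and training details are not reproduced here.

    # Illustrative sketch only (not the paper's released code): a torchvision
    # Faster R-CNN with a ResNet-50 backbone, plus an OHEM-style selection of
    # the hardest region proposals. The class count (5 DR stages + background),
    # the keep ratio and the torchvision version (>= 0.13 for `weights`) are
    # assumptions made for this example.
    import torch
    import torchvision

    # Stock Faster R-CNN detector with a ResNet-50 + FPN backbone;
    # num_classes = 5 DR stages + 1 background class (assumed label scheme).
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
        weights=None, num_classes=6
    )

    def ohem_select(per_roi_loss, keep_ratio=0.25):
        # Online Hard Example Mining: given the un-reduced loss of every
        # sampled region proposal, return the indices of the hardest
        # `keep_ratio` fraction. Only these proposals would contribute to
        # the backward pass.
        num_keep = max(1, int(per_roi_loss.numel() * keep_ratio))
        _, hard_idx = torch.topk(per_roi_loss, num_keep)
        return hard_idx

    # Toy usage with random per-proposal classification losses.
    losses = torch.rand(512)          # e.g. 512 sampled proposals
    hard = ohem_select(losses, 0.25)  # indices of the 128 hardest proposals
    print(model.__class__.__name__, hard.shape)  # FasterRCNN torch.Size([128])

In a full training loop, the returned indices would be used to mask the per-proposal losses before they are summed, so that gradients come only from the hard examples rather than from the many easy background proposals.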
