Journal of Computer Applications ›› 2022, Vol. 42 ›› Issue (9): 2800-2806. DOI: 10.11772/j.issn.1001-9081.2021071216
Pooling algorithm based on Gaussian function
Yuhang WANG, Yongxia ZHOU, Liangwu WU
Received: 2021-07-13
Revised: 2021-09-21
Accepted: 2021-09-24
Online: 2021-10-18
Published: 2022-09-10
Contact: Yuhang WANG
About author: ZHOU Yongxia, born in 1975, Ph. D., associate professor. His research interests include computer image and video processing, machine vision, and artificial intelligence.
Abstract: Traditional pooling algorithms in Convolutional Neural Networks (CNN) do not adequately account for the relationship between each element in a pooling region and the features contained in that region. To address this, a pooling algorithm based on the Gaussian function was proposed. First, the three parameters of the Gaussian function were computed from the values of the elements in the pooling region and the maximum of those values; then the Gaussian function was used to assign a weight to every element in the region; finally, the weighted average of all element values in the region was computed with these weights and taken as the pooling result. LeNet5, VGG16, ResNet18 and MobileNet v3 were chosen as experimental models, and experiments were conducted on the public datasets CIFAR-10, Fer2013 and the German Traffic Sign Recognition Benchmark (GTSRB), with comparisons against seven pooling algorithms: max pooling, average pooling, stochastic pooling, mixed pooling, fuzzy pooling, fused random pooling and soft pooling. The experimental results show that the proposed algorithm improves accuracy by 0.5 to 6 percentage points over the other algorithms on all three datasets, and that it runs faster than all of the compared algorithms except max pooling and average pooling. This verifies that the proposed algorithm is effective and well suited to applications in which accuracy matters more than running time.
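The three steps in the abstract can be written out as a small sketch. Below is a minimal NumPy version for a single pooling region; note that the paper derives the Gaussian's three parameters (amplitude, centre and width) from the region values and their maximum, and the concrete formulas used here (centre = region maximum, width = standard deviation of the values, amplitude = 1) are illustrative assumptions only, not the paper's exact definitions.

```python
import numpy as np

def gaussian_pool_region(region, eps=1e-6):
    """Gaussian-weighted pooling of one pooling region (a minimal sketch).

    The parameter choices below (a = 1, b = max value, c = std of the values)
    are assumptions for illustration; the paper's exact formulas are not
    reproduced in this excerpt.
    """
    x = np.asarray(region, dtype=np.float64).ravel()
    b = x.max()        # centre the Gaussian on the region maximum (assumed)
    c = x.std() + eps  # width taken from the spread of the values (assumed)
    a = 1.0            # amplitude cancels after normalisation (assumed)
    w = a * np.exp(-((x - b) ** 2) / (2.0 * c ** 2))  # Gaussian weight per element
    return float((w * x).sum() / w.sum())             # weighted average = pooling result

# Example: a 2x2 pooling region
print(gaussian_pool_region([[0.2, 0.9], [0.4, 0.7]]))
```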
Yuhang WANG, Yongxia ZHOU, Liangwu WU. Pooling algorithm based on Gaussian function[J]. Journal of Computer Applications, 2022, 42(9): 2800-2806.
Tab. 1 Structure of MobileNet v3 Small model

| Input size | Operation | Activation | Stride |
| --- | --- | --- | --- |
| 224×224×3 | Conv2d, 3×3 | h-swish | 2 |
| 112×112×16 | Bneck, 3×3 | ReLU | 2 |
| 56×56×16 | Bneck, 3×3 | ReLU | 2 |
| 28×28×24 | Bneck, 3×3 | ReLU | 1 |
| 28×28×24 | Bneck, 5×5 | h-swish | 2 |
| 14×14×40 | Bneck, 5×5 | h-swish | 1 |
| 14×14×40 | Bneck, 5×5 | h-swish | 1 |
| 14×14×40 | Bneck, 5×5 | h-swish | 1 |
| 14×14×48 | Bneck, 5×5 | h-swish | 1 |
| 14×14×48 | Bneck, 5×5 | h-swish | 2 |
| 7×7×96 | Bneck, 5×5 | h-swish | 1 |
| 7×7×96 | Conv2d, 1×1 | h-swish | 1 |
| 7×7×576 | Pool, 7×7 | — | 1 |
| 1×1×576 | Conv2d, 1×1 | h-swish | 1 |
| 1×1×1 024 | Conv2d, 1×1 | — | 1 |
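To plug such a pooling rule into networks like the one in Tab. 1, or into the LeNet5, VGG16, ResNet18 and MobileNet v3 models used in the experiments, the operation has to act on 4-D feature maps. The sketch below is one possible drop-in layer built on torch.nn.functional.unfold; the class name GaussianPool2d is hypothetical, and the Gaussian-parameter choices are the same illustrative assumptions as in the previous sketch, not the paper's exact formulas.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianPool2d(nn.Module):
    """Sketch of a Gaussian-weighted pooling layer over k x k windows.

    Each window is weighted by a Gaussian centred on the window maximum
    (an assumption) and reduced to the weighted average of its elements.
    """
    def __init__(self, kernel_size=2, stride=2, eps=1e-6):
        super().__init__()
        self.k, self.s, self.eps = kernel_size, stride, eps

    def forward(self, x):
        n, c, h, w = x.shape
        # (N, C*k*k, L): every pooling window laid out as one column
        cols = F.unfold(x, kernel_size=self.k, stride=self.s)
        cols = cols.view(n, c, self.k * self.k, -1)          # (N, C, k*k, L)
        b = cols.max(dim=2, keepdim=True).values             # window maximum (assumed centre)
        sigma = cols.std(dim=2, keepdim=True) + self.eps     # window spread (assumed width)
        wgt = torch.exp(-((cols - b) ** 2) / (2 * sigma ** 2))
        out = (wgt * cols).sum(dim=2) / wgt.sum(dim=2)       # weighted average per window
        out_h = (h - self.k) // self.s + 1
        out_w = (w - self.k) // self.s + 1
        return out.view(n, c, out_h, out_w)

# Example: replace a 2x2 / stride-2 pooling layer with the Gaussian variant
pool = GaussianPool2d(kernel_size=2, stride=2)
y = pool(torch.randn(1, 16, 32, 32))   # -> torch.Size([1, 16, 16, 16])
```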
Tab. 2 Top-1 accuracy of each model on the CIFAR-10 dataset (%)

| Model | Epoch 20 | Epoch 40 | Epoch 60 |
| --- | --- | --- | --- |
| LeNet5-max pooling | 55.030 | 56.160 | 57.020 |
| LeNet5-average pooling | 50.160 | 53.010 | 54.790 |
| LeNet5-stochastic pooling | 53.030 | 57.280 | 58.910 |
| LeNet5-mixed pooling | 54.860 | 57.900 | 61.810 |
| LeNet5-fuzzy pooling | 55.790 | 59.010 | 62.390 |
| LeNet5-fused random pooling | 56.420 | 60.720 | 63.750 |
| LeNet5-soft pooling | 57.920 | 61.750 | 64.960 |
| LeNet5-proposed algorithm | 59.370 | 62.780 | 65.480 |
| VGG16-max pooling | 79.090 | 82.840 | 83.810 |
| VGG16-average pooling | 75.730 | 81.080 | 82.630 |
| VGG16-stochastic pooling | 77.720 | 84.390 | 84.610 |
| VGG16-mixed pooling | 78.620 | 82.970 | 83.880 |
| VGG16-fuzzy pooling | 79.860 | 83.910 | 84.860 |
| VGG16-fused random pooling | 80.040 | 84.100 | 84.570 |
| VGG16-soft pooling | 81.850 | 84.160 | 84.850 |
| VGG16-proposed algorithm | 82.760 | 84.650 | 85.610 |
| ResNet18-max pooling | 81.820 | 85.740 | 86.830 |
| ResNet18-average pooling | 82.540 | 86.400 | 87.420 |
| ResNet18-stochastic pooling | 83.040 | 86.900 | 88.110 |
| ResNet18-mixed pooling | 82.940 | 86.630 | 87.870 |
| ResNet18-fuzzy pooling | 82.180 | 85.970 | 88.170 |
| ResNet18-fused random pooling | 82.970 | 86.540 | 88.350 |
| ResNet18-soft pooling | 83.090 | 87.650 | 89.200 |
| ResNet18-proposed algorithm | 83.930 | 87.940 | 89.930 |
| MobileNet-max pooling | 77.040 | 80.890 | 82.750 |
| MobileNet-average pooling | 76.880 | 80.670 | 82.890 |
| MobileNet-stochastic pooling | 78.940 | 82.270 | 84.000 |
| MobileNet-mixed pooling | 78.210 | 82.390 | 83.680 |
| MobileNet-fuzzy pooling | 77.990 | 81.530 | 83.520 |
| MobileNet-fused random pooling | 79.010 | 82.470 | 84.140 |
| MobileNet-soft pooling | 78.980 | 82.330 | 84.170 |
| MobileNet-proposed algorithm | 79.100 | 82.990 | 84.720 |
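Tab. 2, and Tab. 3 and Tab. 4 below, report top-1 accuracy, i.e. the percentage of test samples whose highest-scoring class matches the ground-truth label. A minimal sketch of the metric (the function name top1_accuracy is illustrative, not from the paper):

```python
import numpy as np

def top1_accuracy(logits, labels):
    """Percentage of samples whose argmax prediction equals the label."""
    preds = np.asarray(logits).argmax(axis=1)
    return 100.0 * (preds == np.asarray(labels)).mean()

# Example with three samples and four classes: two correct, one wrong -> ~66.67
print(top1_accuracy([[0.10, 0.70, 0.10, 0.10],
                     [0.30, 0.20, 0.40, 0.10],
                     [0.25, 0.25, 0.40, 0.10]], [1, 2, 0]))
```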
Tab. 3 Top-1 accuracy of each model on the Fer2013 dataset (%)

| Model | Epoch 10 | Epoch 20 | Epoch 30 |
| --- | --- | --- | --- |
| LeNet5-max pooling | 44.581 | 46.893 | 48.119 |
| LeNet5-average pooling | 43.299 | 44.887 | 47.033 |
| LeNet5-stochastic pooling | 43.689 | 46.308 | 49.206 |
| LeNet5-mixed pooling | 45.302 | 48.590 | 48.760 |
| LeNet5-fuzzy pooling | 47.181 | 48.760 | 51.156 |
| LeNet5-fused random pooling | 47.334 | 49.955 | 52.082 |
| LeNet5-soft pooling | 48.878 | 50.165 | 52.674 |
| LeNet5-proposed algorithm | 49.596 | 50.961 | 53.385 |
| VGG16-max pooling | 49.847 | 55.503 | 57.816 |
| VGG16-average pooling | 50.822 | 57.453 | 58.596 |
| VGG16-stochastic pooling | 46.197 | 55.113 | 61.048 |
| VGG16-mixed pooling | 48.531 | 56.450 | 59.571 |
| VGG16-fuzzy pooling | 51.420 | 57.156 | 60.960 |
| VGG16-fused random pooling | 52.147 | 57.098 | 61.082 |
| VGG16-soft pooling | 53.461 | 57.385 | 61.904 |
| VGG16-proposed algorithm | 53.803 | 58.011 | 62.435 |
| ResNet18-max pooling | 50.334 | 55.974 | 60.978 |
| ResNet18-average pooling | 50.348 | 56.293 | 60.841 |
| ResNet18-stochastic pooling | 50.485 | 56.590 | 61.624 |
| ResNet18-mixed pooling | 51.240 | 56.961 | 62.544 |
| ResNet18-fuzzy pooling | 51.026 | 57.476 | 63.098 |
| ResNet18-fused random pooling | 50.702 | 57.998 | 63.345 |
| ResNet18-soft pooling | 51.395 | 58.132 | 64.333 |
| ResNet18-proposed algorithm | 52.187 | 59.457 | 65.676 |
| MobileNet-max pooling | 50.485 | 54.755 | 58.080 |
| MobileNet-average pooling | 50.181 | 54.659 | 57.863 |
| MobileNet-stochastic pooling | 51.006 | 54.802 | 58.240 |
| MobileNet-mixed pooling | 50.702 | 55.076 | 58.302 |
| MobileNet-fuzzy pooling | 50.847 | 55.916 | 58.647 |
| MobileNet-fused random pooling | 51.017 | 55.715 | 58.461 |
| MobileNet-soft pooling | 51.096 | 55.637 | 58.996 |
| MobileNet-proposed algorithm | 51.028 | 56.450 | 59.739 |
Tab. 4 Top-1 accuracy of each model on the GTSRB dataset (%)

| Model | Epoch 5 | Epoch 10 | Epoch 15 |
| --- | --- | --- | --- |
| LeNet5-max pooling | 45.891 | 67.675 | 82.668 |
| LeNet5-average pooling | 45.142 | 68.324 | 82.443 |
| LeNet5-stochastic pooling | 46.151 | 68.890 | 83.905 |
| LeNet5-mixed pooling | 45.801 | 67.712 | 83.494 |
| LeNet5-fuzzy pooling | 46.246 | 68.567 | 84.099 |
| LeNet5-fused random pooling | 45.980 | 68.759 | 84.560 |
| LeNet5-soft pooling | 46.721 | 69.000 | 84.318 |
| LeNet5-proposed algorithm | 47.130 | 69.450 | 85.074 |
| VGG16-max pooling | 53.692 | 70.246 | 89.397 |
| VGG16-average pooling | 54.026 | 70.856 | 88.180 |
| VGG16-stochastic pooling | 53.280 | 69.960 | 90.269 |
| VGG16-mixed pooling | 54.678 | 70.210 | 89.517 |
| VGG16-fuzzy pooling | 54.024 | 71.538 | 90.837 |
| VGG16-fused random pooling | 55.099 | 72.986 | 90.920 |
| VGG16-soft pooling | 54.689 | 73.223 | 91.205 |
| VGG16-proposed algorithm | 55.568 | 74.642 | 92.167 |
| ResNet18-max pooling | 70.801 | 87.769 | 96.837 |
| ResNet18-average pooling | 71.220 | 87.387 | 97.100 |
| ResNet18-stochastic pooling | 72.413 | 89.657 | 97.375 |
| ResNet18-mixed pooling | 71.070 | 88.165 | 96.998 |
| ResNet18-fuzzy pooling | 72.142 | 90.120 | 97.554 |
| ResNet18-fused random pooling | 72.814 | 90.105 | 97.445 |
| ResNet18-soft pooling | 72.909 | 91.025 | 97.869 |
| ResNet18-proposed algorithm | 74.678 | 91.920 | 98.708 |
| MobileNet-max pooling | 64.142 | 84.249 | 94.070 |
| MobileNet-average pooling | 64.494 | 84.129 | 93.756 |
| MobileNet-stochastic pooling | 65.046 | 85.078 | 94.801 |
| MobileNet-mixed pooling | 64.000 | 84.935 | 94.130 |
| MobileNet-fuzzy pooling | 63.958 | 85.140 | 94.814 |
| MobileNet-fused random pooling | 65.176 | 85.394 | 94.732 |
| MobileNet-soft pooling | 65.169 | 85.373 | 94.810 |
| MobileNet-proposed algorithm | 65.373 | 85.589 | 95.336 |
Tab. 5 Running time of each pooling algorithm at different image resolutions (ms)

| Pooling algorithm | 100×100 | 1 000×1 000 | 10 000×10 000 |
| --- | --- | --- | --- |
| Max pooling | 0.093 | 4.675 | 307.083 |
| Average pooling | 0.252 | 9.025 | 867.590 |
| Stochastic pooling | 0.444 | 12.446 | 1 268.931 |
| Mixed pooling | 0.196 | 8.988 | 831.617 |
| Fuzzy pooling | 0.689 | 16.031 | 1 812.159 |
| Fused random pooling | 0.631 | 15.296 | 1 604.357 |
| Soft pooling | 0.454 | 13.155 | 1 239.741 |
| Proposed algorithm | 0.372 | 11.387 | 1 036.137 |
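Tab. 5 compares per-call running time at three input resolutions. The exact measurement protocol is not described in this excerpt; the sketch below shows one plausible way to collect such timings (the warm-up call, repeat count and helper names such as time_pooling and max_pool are assumptions, and the baseline max pooling is included only to make the harness runnable).

```python
import time
import numpy as np

def time_pooling(pool_fn, size, repeats=10, kernel=2, stride=2):
    """Rough per-call timing, in milliseconds, of a pooling routine on a
    size x size map (a sketch; not the paper's actual measurement setup)."""
    x = np.random.rand(size, size).astype(np.float32)
    pool_fn(x, kernel, stride)            # warm-up so one-off setup cost is excluded
    t0 = time.perf_counter()
    for _ in range(repeats):
        pool_fn(x, kernel, stride)
    return (time.perf_counter() - t0) / repeats * 1e3

def max_pool(x, k, s):
    """Simple baseline max pooling used only to exercise the harness."""
    h, w = (x.shape[0] - k) // s + 1, (x.shape[1] - k) // s + 1
    out = np.empty((h, w), dtype=x.dtype)
    for i in range(h):
        for j in range(w):
            out[i, j] = x[i * s:i * s + k, j * s:j * s + k].max()
    return out

print(time_pooling(max_pool, 100))   # machine-dependent; values differ from Tab. 5
```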