Journal of Computer Applications ›› 2022, Vol. 42 ›› Issue (4): 1148-1154. DOI: 10.11772/j.issn.1001-9081.2021071271
Special topic: The 36th CCF National Conference of Computer Applications (CCF NCCA 2021)
Sparse subspace clustering method based on random blocking
Qi ZHANG, Bochuan ZHENG, Zheng ZHANG, Huanhuan ZHOU
Received: 2021-07-16
Revised: 2021-08-23
Accepted: 2021-08-27
Online: 2022-04-15
Published: 2022-04-10
Contact: Bochuan ZHENG
About author: ZHANG Qi, born in 1996, M. S. candidate. Her research interests include machine learning and clustering analysis.
Abstract:
To address the large clustering error of the Sparse Subspace Clustering (SSC) method, an SSC method based on random blocking is proposed. First, the dataset of the original problem is randomly divided into several subsets, constructing several sub-problems. Then, the coefficient matrix of each sub-problem is solved by the Alternating Direction Method of Multipliers (ADMM), after which the sub-problem coefficient matrices are expanded to the size of the original problem and merged into a single coefficient matrix. Finally, a similarity matrix is computed from the merged coefficient matrix, and the clustering result of the original problem is obtained with the Spectral Clustering (SC) algorithm. Compared with the best of the SSC, Stochastic Sparse Subspace Clustering (S3COMP-C), Sparse Subspace Clustering by Orthogonal Matching Pursuit (SSCOMP), SC and K-Means algorithms, the proposed method reduces the subspace clustering error by 3.12 percentage points on average, and clearly outperforms the comparison algorithms on all three of mutual information, Rand index and entropy. Experimental results show that the SSC method based on random blocking can reduce subspace clustering error and improve clustering performance.
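The pipeline described in the abstract can be sketched in Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the per-sub-problem ADMM solver is replaced by a column-wise Lasso self-representation for brevity, scikit-learn's `SpectralClustering` performs the final step, and `n_blocks`, `ratio` and the Lasso penalty `alpha` are illustrative parameters (playing the roles of T and k from the experiments below).

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.linear_model import Lasso


def ssc_random_blocking(X, n_clusters, n_blocks=3, ratio=0.85, seed=0):
    """SSC with random blocking: solve sparse self-representation on
    random subsets, expand and merge the coefficient matrices, then
    spectrally cluster the merged affinity.  X is (d, n), one point
    per column, columns assumed normalized."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    C = np.zeros((n, n))          # merged coefficient matrix
    hits = np.zeros((n, n))       # how often each entry was estimated
    m = int(ratio * n)            # points selected per sub-problem
    for _ in range(n_blocks):
        idx = rng.choice(n, size=m, replace=False)
        Xs = X[:, idx]
        for j_local, j in enumerate(idx):
            # represent point j by the other points of the sub-problem;
            # Lasso stands in for the paper's ADMM solver (assumption)
            mask = np.ones(m, dtype=bool)
            mask[j_local] = False
            model = Lasso(alpha=0.01, fit_intercept=False, max_iter=5000)
            model.fit(Xs[:, mask], Xs[:, j_local])
            c = np.zeros(m)
            c[mask] = model.coef_
            C[idx, j] += c        # expand back to full (n, n) size
            hits[idx, j] += 1
    C = np.divide(C, hits, out=np.zeros_like(C), where=hits > 0)
    # symmetric affinity; a tiny floor keeps the graph connected even
    # if some point happened to be left out of every sub-problem
    W = np.abs(C) + np.abs(C).T + 1e-8
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                            random_state=seed)
    return sc.fit_predict(W)
```

A typical call clusters unit-normalized points drawn from a union of low-dimensional subspaces, e.g. `ssc_random_blocking(X, n_clusters=2)` for data from two subspaces.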
CLC number:
Qi ZHANG, Bochuan ZHENG, Zheng ZHANG, Huanhuan ZHOU. Sparse subspace clustering method based on random blocking[J]. Journal of Computer Applications, 2022, 42(4): 1148-1154.
| Number of objects | T | k=0.90 | k=0.85 | k=0.80 | k=0.75 | k=0.70 |
|---|---|---|---|---|---|---|
| 2 | 5 | 1.63 | 1.32 | 1.38 | 2.17 | 1.67 |
| | 6 | 2.00 | 1.26 | 1.42 | 1.86 | 1.45 |
| | 7 | 2.00 | 1.14 | 1.37 | 1.95 | 1.95 |
| | 8 | 1.77 | 1.22 | 1.31 | 1.94 | 1.36 |
| | 9 | 1.60 | 1.18 | 1.41 | 1.83 | 1.33 |
| 3 | 5 | 2.65 | 2.31 | 2.26 | 2.60 | 2.73 |
| | 6 | 2.53 | 2.25 | 2.32 | 2.58 | 2.81 |
| | 7 | 2.51 | 2.22 | 2.39 | 2.61 | 2.75 |
| | 8 | 2.41 | 2.21 | 2.46 | 2.67 | 2.69 |
| | 9 | 2.39 | 2.19 | 2.45 | 2.71 | 2.67 |
| 4 | 5 | 3.66 | 3.60 | 3.62 | 3.73 | 4.18 |
| | 6 | 3.66 | 3.49 | 3.55 | 3.77 | 4.08 |
| | 7 | 3.56 | 3.44 | 3.53 | 3.78 | 4.07 |
| | 8 | 3.53 | 3.42 | 3.57 | 3.77 | 4.02 |
| | 9 | 3.49 | 3.39 | 3.57 | 3.76 | 3.96 |
| 5 | 5 | 4.60 | 4.67 | 4.67 | 5.14 | 5.34 |
| | 6 | 4.62 | 4.56 | 4.70 | 5.21 | 5.33 |
| | 7 | 4.56 | 4.51 | 4.71 | 5.18 | 5.26 |
| | 8 | 4.52 | 4.48 | 4.70 | 5.17 | 5.21 |
| | 9 | 4.51 | 4.48 | 4.70 | 5.17 | 5.19 |
| 6 | 5 | 6.10 | 5.73 | 6.03 | 6.26 | 6.89 |
| | 6 | 5.99 | 5.64 | 5.96 | 6.30 | 6.83 |
| | 7 | 5.69 | 5.63 | 5.96 | 6.32 | 6.79 |
| | 8 | 5.79 | 5.61 | 5.90 | 6.34 | 6.74 |
| | 9 | 5.74 | 5.59 | 5.89 | 6.34 | 6.72 |
| 7 | 5 | 7.55 | 6.89 | 7.20 | 8.03 | 8.57 |
| | 6 | 7.25 | 6.81 | 7.18 | 8.04 | 8.46 |
| | 7 | 7.10 | 6.80 | 7.21 | 7.95 | 8.58 |
| | 8 | 7.00 | 6.80 | 7.20 | 7.88 | 8.51 |
| | 9 | 6.95 | 6.80 | 7.18 | 7.88 | 8.50 |

Tab. 1 Influence of the number of sub-problems (T) and the ratio of selected data points (k) on subspace clustering error (%)
| Number of objects | Algorithm | Subspace clustering error/% | Mutual information | Rand index | Entropy |
|---|---|---|---|---|---|
| 2 | SSC | 4.07 | 0.84 | 0.95 | 0.09 |
| | S3COMP-C | 4.71 | 0.76 | 0.93 | 0.13 |
| | SSCOMP | 7.10 | 0.73 | 0.90 | 0.18 |
| | SC | 45.89 | 0.01 | 0.20 | 0.99 |
| | K-Means | 46.98 | 0.01 | 0.50 | 0.99 |
| | Proposed | 1.22 | 0.89 | 0.98 | 0.04 |
| 3 | SSC | 4.79 | 0.85 | 0.95 | 0.14 |
| | S3COMP-C | 6.30 | 0.81 | 0.94 | 0.19 |
| | SSCOMP | 11.22 | 0.73 | 0.80 | 0.30 |
| | SC | 63.28 | 0.01 | 0.55 | 1.57 |
| | K-Means | 62.95 | 0.01 | 0.56 | 1.57 |
| | Proposed | 2.21 | 0.89 | 0.97 | 0.08 |
| 4 | SSC | 6.04 | 0.85 | 0.95 | 0.18 |
| | S3COMP-C | 8.20 | 0.81 | 0.94 | 0.26 |
| | SSCOMP | 16.15 | 0.71 | 0.89 | 0.42 |
| | SC | 71.28 | 0.01 | 0.62 | 0.98 |
| | K-Means | 70.79 | 0.02 | 0.62 | 1.95 |
| | Proposed | 3.42 | 0.88 | 0.97 | 0.12 |
| 5 | SSC | 7.52 | 0.84 | 0.94 | 0.21 |
| | S3COMP-C | 10.10 | 0.80 | 0.94 | 0.32 |
| | SSCOMP | 20.09 | 0.70 | 0.89 | 0.53 |
| | SC | 75.09 | 0.02 | 0.68 | 2.25 |
| | K-Means | 74.53 | 0.03 | 0.68 | 2.21 |
| | Proposed | 4.48 | 0.88 | 0.97 | 0.15 |
| 6 | SSC | 9.14 | 0.83 | 0.94 | 0.23 |
| | S3COMP-C | 12.56 | 0.79 | 0.94 | 0.38 |
| | SSCOMP | 23.41 | 0.68 | 0.89 | 0.61 |
| | SC | 77.72 | 0.03 | 0.72 | 2.47 |
| | K-Means | 76.82 | 0.05 | 0.72 | 2.41 |
| | Proposed | 5.61 | 0.87 | 0.96 | 0.17 |
| 7 | SSC | 10.91 | 0.81 | 0.94 | 0.26 |
| | S3COMP-C | 15.04 | 0.78 | 0.94 | 0.44 |
| | SSCOMP | 25.68 | 0.68 | 0.90 | 0.65 |
| | SC | 79.90 | 0.04 | 0.75 | 2.67 |
| | K-Means | 78.74 | 0.06 | 0.75 | 2.59 |
| | Proposed | 6.80 | 0.86 | 0.96 | 0.19 |

Tab. 2 Clustering evaluations of face images with different numbers of objects in the Extended Yale B dataset (k=0.85, T=8)
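The subspace clustering error reported in the tables is, by the usual convention, the fraction of misassigned points under the best one-to-one matching between predicted and ground-truth labels. Assuming that standard definition (the paper itself does not restate it here), it can be computed with a Hungarian assignment:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def clustering_error(y_true, y_pred):
    """Fraction of misassigned points under the best one-to-one matching
    of predicted cluster labels to ground-truth labels (labels 0..k-1)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    k = int(max(y_true.max(), y_pred.max())) + 1
    agree = np.zeros((k, k), dtype=int)   # agree[t, p]: co-occurrence counts
    for t, p in zip(y_true, y_pred):
        agree[t, p] += 1
    rows, cols = linear_sum_assignment(-agree)   # maximize total agreement
    return 1.0 - agree[rows, cols].sum() / len(y_true)
```

For example, a prediction that merely swaps the two cluster labels has error 0, while one that moves a single point out of four to the wrong cluster has error 0.25.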