[1] ZHANG R, NIE F P, GUO M H, et al. Joint learning of fuzzy k-means and nonnegative spectral clustering with side information [J]. IEEE Transactions on Image Processing, 2019, 28(5): 2152-2162. 10.1109/tip.2018.2882925
[2] ZHAO W L, DENG C H, NGO C W. k-means: a revisit [J]. Neurocomputing, 2018, 291: 195-206. 10.1016/j.neucom.2018.02.072
[3] NIE F P, WANG C L, LI X L. K-multiple-means: a multiple-means clustering method with specified k clusters [C]// Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2019: 959-967. 10.1145/3292500.3330846
[4] TSAMARDINOS I, BORBOUDAKIS G, KATSOGRIDAKIS P, et al. A greedy feature selection algorithm for big data of high dimensionality [J]. Machine Learning, 2019, 108(2): 149-202. 10.1007/s10994-018-5748-7
[5] GONG Y, PAWLOWSKI M, FEI Y, et al. Web scale photo hash clustering on a single machine [C]// Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2015: 19-27. 10.1109/cvpr.2015.7298596
[6] ADOLFSSON A, ACKERMAN M, BROWNSTEIN N C. To cluster, or not to cluster: an analysis of clusterability methods [J]. Pattern Recognition, 2019, 88: 13-26. 10.1016/j.patcog.2018.10.026
[7] HAYASHI N. Variational approximation error in non-negative matrix factorization [J]. Neural Networks, 2020, 126: 65-75. 10.1016/j.neunet.2020.03.009
[8] WANG Y T, WANG J D, HAO L, et al. An efficient semi-supervised representatives feature selection algorithm based on information theory [J]. Pattern Recognition, 2017, 61: 511-523. 10.1016/j.patcog.2016.08.011
[9] ARTHUR D, VASSILVITSKII S. k-means++: the advantages of careful seeding [C]// Proceedings of the 18th Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2007: 1027-1035.
[10] LEE H, BATTLE A, RAINA R, et al. Efficient sparse coding algorithms [C]// Proceedings of the 19th International Conference on Neural Information Processing Systems. Cambridge, MA: MIT Press, 2007: 801-808. 10.7551/mitpress/7503.003.0105
[11] LIU W, HE J, CHANG S F. Large graph construction for scalable semi-supervised learning [C]// Proceedings of the 27th International Conference on Machine Learning. Madison, WI: Omnipress, 2010: 679-686.
[12] CAPÓ M, PÉREZ A, LOZANO J A. An efficient approximation to the K-means clustering for massive data [J]. Knowledge-Based Systems, 2017, 117: 56-69. 10.1016/j.knosys.2016.06.031
[13] BOUTSIDIS C, ZOUZIAS A, MAHONEY M W. Randomized dimensionality reduction for k-means clustering [J]. IEEE Transactions on Information Theory, 2015, 61(2): 1045-1062. 10.1109/tit.2014.2375327
[14] SINHA K. K-means clustering using random matrix sparsification [C]// Proceedings of the 35th International Conference on Machine Learning. New York: ACM, 2018: 4684-4692.
[15] MA L. Research on theory and method of large-scale image hashing learning [D]. Chengdu: University of Electronic Science and Technology of China, 2019: 1.
[16] HUANG S, WANG H, LI T, et al. Robust graph regularized nonnegative matrix factorization for clustering [J]. Data Mining and Knowledge Discovery, 2018, 32(3): 483-503. 10.1007/s10618-017-0543-9
[17] TAO H, HOU C, NIE F, et al. Effective discriminative feature selection with nontrivial solution [J]. IEEE Transactions on Neural Networks and Learning Systems, 2016, 27(4): 796-808. 10.1109/tnnls.2015.2424721
[18] CHE J, YANG Y, LI L, et al. Maximum relevance minimum common redundancy feature selection for nonlinear data [J]. Information Sciences, 2017, 409/410: 68-86. 10.1016/j.ins.2017.05.013
[19] ALELYANI S, TANG J, LIU H. Feature selection for clustering: a review [J]. Encyclopedia of Database Systems, 2016, 21(3): 110-121.
[20] SHEN F, YANG Y, LI L, et al. Asymmetric binary coding for image search [J]. IEEE Transactions on Multimedia, 2017, 19(9): 2022-2032. 10.1109/tmm.2017.2699863
[21] NIE F P, YANG S, ZHANG R, et al. A general framework for auto-weighted feature selection via global redundancy minimization [J]. IEEE Transactions on Image Processing, 2019, 28(5): 2428-2438. 10.1109/tip.2018.2886761
[22] SHEN F M, ZHOU X, YANG Y, et al. A fast optimization method for general binary code learning [J]. IEEE Transactions on Image Processing, 2016, 25(12): 5610-5621. 10.1109/tip.2016.2612883
[23] LI Y, ZHANG S, CHENG D, et al. Spectral clustering based on hypergraph and self-representation [J]. Multimedia Tools and Applications, 2017, 76(16): 17559-17576. 10.1007/s11042-016-4131-6
[24] DU X Z, NIE F P, WANG W Q, et al. Exploiting combination effect for unsupervised feature selection by l2,0 norm [J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 30(1): 201-214. 10.1109/tnnls.2018.2837100
[25] HOU C P, NIE F P, LI X L, et al. Joint embedding learning and sparse regression: a framework for unsupervised feature selection [J]. IEEE Transactions on Cybernetics, 2014, 44(6): 793-804. 10.1109/tcyb.2013.2272642
[26] PENG X, TANG H, ZHANG L, et al. A unified framework for representation-based subspace clustering of out-of-sample and large-scale data [J]. IEEE Transactions on Neural Networks and Learning Systems, 2016, 27(12): 2499-2512. 10.1109/tnnls.2015.2490080
[27] LI L, LIU G C, SUN Y B, et al. Isotropic iterative quantization hashing [J]. Acta Electronica Sinica, 2017, 45(7): 1707-1714. 10.3969/j.issn.0372-2112.2017.07.022
[28] WEN J, XU Y, LI Z, et al. Inter-class sparsity based discriminative least square regression [J]. Neural Networks, 2018, 102: 36-47. 10.1016/j.neunet.2018.02.002
[29] ELHAMIFAR E, VIDAL R. Sparse subspace clustering: algorithm, theory, and applications [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(11): 2765-2781. 10.1109/tpami.2013.57
[30] CAI D, CHEN X. Large scale spectral clustering via landmark-based sparse representation [J]. IEEE Transactions on Cybernetics, 2015, 45(8): 1669-1680. 10.1109/tcyb.2014.2358564
[31] ZHU X F, ZHANG S C, LI Y G, et al. Low-rank sparse subspace for spectral clustering [J]. IEEE Transactions on Knowledge and Data Engineering, 2019, 31(8): 1532-1543. 10.1109/tkde.2018.2858782
[32] WANG Y L, TANG Y Y, LUO Q. Minimum error entropy based sparse representation for robust subspace clustering [J]. IEEE Transactions on Signal Processing, 2015, 63(15): 4010-4021. 10.1109/tsp.2015.2425803
[33] WAN Y, ZHONG Y, MA A, et al. Multi-objective sparse subspace clustering for hyperspectral imagery [J]. IEEE Transactions on Geoscience and Remote Sensing, 2020, 58(4): 2290-2307. 10.1109/tgrs.2019.2947253
[34] LI Y, NIE F, HUANG H, et al. Large-scale multi-view spectral clustering via bipartite graph [C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2015: 2750-2756.
[35] CAI X, NIE F P, HUANG H. Multi-view k-means clustering on big data [C]// Proceedings of the 23rd International Joint Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2013: 2598-2604.
[36] NIE F P, LI J, LI X L. Parameter-free auto-weighted multiple graph learning: a framework for multiview clustering and semi-supervised classification [C]// Proceedings of the 25th International Joint Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2016: 1881-1887.
[37] ZHANG T, YANG Y. Robust PCA by manifold optimization [J]. Journal of Machine Learning Research, 2018, 19: 1-39.
[38] LEE D D, SEUNG H S. Learning the parts of objects by non-negative matrix factorization [J]. Nature, 1999, 401(6755): 788-791. 10.1038/44565
[39] ZHANG Y L, ZHOU Y J. Review of clustering algorithms [J]. Journal of Computer Applications, 2019, 39(7): 1869-1882. 10.11772/j.issn.1001-9081.2019010174