[1] FRÉNAY B, VERLEYSEN M. Classification in the presence of label noise: a survey[J]. IEEE Transactions on Neural Networks and Learning Systems, 2014, 25(5): 845-869.
[2] SEGATA N, BLANZIERI E, DELANY S J, et al. Noise reduction for instance-based learning with a local maximal margin approach[J]. Journal of Intelligent Information Systems, 2010, 35(2): 301-331.
[3] RAJASEKAR M, SANDHYA N. Mammogram images detection using support vector machines[J]. International Journal of Advanced Research in Computer Science, 2017, 8(7): 329-334.
[4] PAULHEIM H. Knowledge graph refinement: a survey of approaches and evaluation methods[J]. Semantic Web, 2017, 8(3): 489-508.
[5] SUBRAMANIYASWAMY V, LOGESH R. Adaptive KNN based recommender system through mining of user preferences[J]. Wireless Personal Communications, 2017, 97(2): 2229-2247.
[6] GARCÍA S, DERRAC J, CANO J R, et al. Prototype selection for nearest neighbor classification: taxonomy and empirical study[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(3): 417-435.
[7] SUNDER H, KHURD P. Parallel algorithms for the computation of cycles in relative neighborhood graphs[C]//Proceedings of the 46th International Conference on Parallel Processing. Piscataway: IEEE, 2017: 191-200.
[8] DU W, URAHAMA K. Error-correcting semi-supervised pattern recognition with mode filter on graphs[C]//Proceedings of the 2nd International Symposium on Aware Computing. Piscataway: IEEE, 2010: 6-11.
[9] GÓMEZ-RÍOS A, LUENGO J, HERRERA F. A study on the noise label influence in boosting algorithms: AdaBoost, GBM and XGBoost[C]//Proceedings of the 2017 International Conference on Hybrid Artificial Intelligence Systems, LNCS 10334. Cham: Springer, 2017: 268-280.
[10] GAO Y, GAO F, GUAN X. Improved boosting algorithm with adaptive filtration[C]//Proceedings of the 8th World Congress on Intelligent Control and Automation. Piscataway: IEEE, 2010: 3173-3178.
[11] CHEN T, GUESTRIN C. XGBoost: a scalable tree boosting system[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2016: 785-794.
[12] ZHANG J, WU X, SHENG V S. Learning from crowdsourced labeled data: a survey[J]. Artificial Intelligence Review, 2016, 46(4): 543-576.
[13] CANO J R, LUENGO J, GARCÍA S. Label noise filtering techniques to improve monotonic classification[J]. Neurocomputing, 2019, 353: 83-95.
[14] SÁEZ J A, GALAR M, LUENGO J, et al. INFFC: an iterative class noise filter based on the fusion of classifiers with noise sensitivity control[J]. Information Fusion, 2016, 27: 19-32.
[15] CHEN H, SHEN C, HE G, et al. Critical noise of majority-vote model on complex networks[J]. Physical Review E, Statistical, Nonlinear, and Soft Matter Physics, 2015, 91(2): No. 022816.
[16] XIE S, GUO L. Analysis of normalized least mean squares-based consensus adaptive filters under a general information condition[J]. SIAM Journal on Control and Optimization, 2018, 56(5): 3404-3431.
[17] YUAN W, GUAN D, MA T, et al. Classification with class noises through probabilistic sampling[J]. Information Fusion, 2018, 41: 57-67.
[18] 陈庆强, 王文剑, 姜高霞. 基于数据分布的标签噪声过滤[J]. 清华大学学报(自然科学版), 2019, 59(4): 262-269. (CHEN Q Q, WANG W J, JIANG G X. Label noise filtering based on the data distribution[J]. Journal of Tsinghua University (Science and Technology), 2019, 59(4): 262-269.)