[1] CANO J R, GUTIÉRREZ P A, KRAWCZYK B, et al. Monotonic classification: an overview on algorithms, performance measures and data sets[J]. Neurocomputing, 2019, 341: 168-182.
[2] WANG X Z, ASHFAQ R A R, FU A M. Fuzziness based sample categorization for classifier performance improvement[J]. Journal of Intelligent and Fuzzy Systems, 2015, 29(3): 1185-1196.
[3] DOYLE O M, WESTMAN E, MARQUAND A F, et al. Predicting progression of Alzheimer's disease using ordinal regression[J]. PLoS ONE, 2014, 9(8): No. e105542.
[4] PAN W W. Fraudulent firm classification using monotonic classification techniques[C]//Proceedings of the IEEE 9th Joint International Information Technology and Artificial Intelligence Conference. Piscataway: IEEE, 2020: 1773-1776.
[5] LIANG J Y, CHIN K S, DANG C Y, et al. A new method for measuring uncertainty and fuzziness in rough set theory[J]. International Journal of General Systems, 2002, 31(4): 331-342.
[6] DEMBCZYŃSKI K, KOTŁOWSKI W, SŁOWIŃSKI R. Ensemble of decision rules for ordinal classification with monotonicity constraints[C]//Proceedings of the 2008 International Conference on Rough Sets and Knowledge Technology, LNCS 5009. Berlin: Springer, 2008: 260-267.
[7] TEHRANI A F, HÜLLERMEIER E. Ordinal choquistic regression[C]//Proceedings of the 8th Conference of the European Society for Fuzzy Logic and Technology. Dordrecht: Atlantis Press, 2013: 842-849.
[8] BARTLEY C, LIU W, REYNOLDS M. Effective monotone knowledge integration in kernel support vector machines[C]//Proceedings of the 2016 International Conference on Advanced Data Mining and Applications, LNCS 10086. Cham: Springer, 2016: 3-18.
[9] DANIELS H, VELIKOVA M. Monotone and partially monotone neural networks[J]. IEEE Transactions on Neural Networks, 2010, 21(6): 906-917.
[10] XU H, WANG W J, QIAN Y H. Fusing complete monotonic decision trees[J]. IEEE Transactions on Knowledge and Data Engineering, 2017, 29(10): 2223-2235.
[11] ZHU H, TSANG E C C, WANG X Z, et al. Monotonic classification extreme learning machine[J]. Neurocomputing, 2017, 225: 205-213.
[12] CARDOSO J S, PINTO DA COSTA J F. Learning to classify ordinal data: the data replication method[J]. Journal of Machine Learning Research, 2007, 8: 1393-1429.
[13] TANG M Z, PÉREZ-FERNÁNDEZ R, DE BAETS B. Fusing absolute and relative information for augmenting the method of nearest neighbors for ordinal classification[J]. Information Fusion, 2020, 56: 128-140.
[14] GONZÁLEZ S, HERRERA F, GARCÍA S. Managing monotonicity in classification by a pruned AdaBoost[C]//Proceedings of the 2016 International Conference on Hybrid Artificial Intelligence Systems, LNCS 9648. Cham: Springer, 2016: 512-523.
[15] PINTO DA COSTA J, CARDOSO J S. Classification of ordinal data using neural networks[C]//Proceedings of the 2005 European Conference on Machine Learning, LNCS 3720. Berlin: Springer, 2005: 690-697.
[16] QUINLAN J R. Induction of decision trees[J]. Machine Learning, 1986, 1(1): 81-106.
[17] QUINLAN J R. C4.5: Programs for Machine Learning[M]. San Mateo, CA: Morgan Kaufmann Publishers, 1992: 8-10.
[18] FEELDERS A, PARDOEL M. Pruning for monotone classification trees[C]//Proceedings of the 2003 International Symposium on Intelligent Data Analysis, LNCS 2810. Berlin: Springer, 2003: 1-12.
[19] VERBEKE W, MARTENS D, BAESENS B. RULEM: a novel heuristic rule learning approach for ordinal classification with monotonicity constraints[J]. Applied Soft Computing, 2017, 60: 858-873.
[20] XIA F, ZHANG W S, LI F X, et al. Ranking with decision tree[J]. Knowledge and Information Systems, 2008, 17(3): 381-395.
[21] HU Q H, CHE X J, ZHANG L, et al. Rank entropy-based decision trees for monotonic classification[J]. IEEE Transactions on Knowledge and Data Engineering, 2012, 24(11): 2052-2064.
[22] 车勋建. 基于有序决策树的故障程度诊断研究[D]. 哈尔滨: 哈尔滨工业大学, 2011: 53-57. (CHE X J. Ordinal decision tree based fault level detection[D]. Harbin: Harbin Institute of Technology, 2011: 53-57.)
[23] 许行, 王文剑, 任丽芳. 一种基于决策森林的单调分类方法[J]. 计算机研究与发展, 2017, 54(7): 1477-1487. (XU H, WANG W J, REN L F. A method for monotonic classification based on decision forest[J]. Journal of Computer Research and Development, 2017, 54(7): 1477-1487.)
[24] XIA S Y, WANG G Y, CHEN Z Z, et al. Complete random forest based class noise filtering learning for improving the generalizability of classifiers[J]. IEEE Transactions on Knowledge and Data Engineering, 2019, 31(11): 2063-2078.
[25] FRÉNAY B, HAMMER B. Label-noise-tolerant classification for streaming data[C]//Proceedings of the 2017 International Joint Conference on Neural Networks. Piscataway: IEEE, 2017: 1748-1755.
[26] XIA S Y, CHEN B Y, WANG G Y, et al. mCRF and mRD: two classification methods based on a novel multiclass label noise filtering learning framework[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, PP(99): 1-15.