1  LEI B, LAN Y S, LI M L, et al. Overall conception and development suggestions for the systematic construction of smart society [J]. Strategic Study of CAE, 2023, 25(3): 219-229.
2  JIA J D, ZHANG M N, ZHAO X, et al. Study on scheduling algorithm of intelligent order dispatching for immediate handling of complaints [J]. Computer Science, 2023, 50(11A): No.230300029.
3  CHEN F. “Downward transfer and mass evaluation”: towards grass-roots governance of people's subjectivity, taking the reform of immediate handling of complaints in Pinggu District of Beijing as an example [J]. Journal of Beijing University of Technology (Social Sciences Edition), 2024, 24(3): 29-39.
4  XIAO L, CHEN B L, HUANG X, et al. Multi-label text classification method based on label semantic information [J]. Journal of Software, 2020, 31(4): 1079-1089.
5  HUANG W, LIU G Q. Study on hierarchical multi-label text classification method of MSML-BERT model [J]. Computer Engineering and Applications, 2022, 58(15): 191-201.
6  MAO Y, TIAN J, HAN J, et al. Hierarchical text classification with reinforced label assignment [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: ACL, 2019: 445-455.
7  LYU X Q, PENG C, ZHANG L, et al. Text multi-label classification method incorporating BERT and label semantic attention [J]. Journal of Computer Applications, 2022, 42(1): 57-63.
8  JOHNSON R, ZHANG T. Effective use of word order for text categorization with convolutional neural networks [C]// Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2015: 103-112.
9  WANG Z, WANG P, HUANG L, et al. Incorporating hierarchy into text encoder: a contrastive learning approach for hierarchical text classification [C]// Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg: ACL, 2022: 7109-7119.
10  ROJAS K R, BUSTAMANTE G, ONCEVAY A, et al. Efficient strategies for hierarchical text classification: external knowledge and auxiliary tasks [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2020: 2252-2257.
11  WEHRMANN J, CERRI R, BARROS R. Hierarchical multi-label classification networks [C]// Proceedings of the 35th International Conference on Machine Learning. New York: JMLR.org, 2018: 5075-5084.
12  SHIMURA K, LI J, FUKUMOTO F. HFT-CNN: learning hierarchical category structure for multi-label short text categorization [C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2018: 811-816.
13  ZHOU J, MA C, LONG D, et al. Hierarchy-aware global model for hierarchical text classification [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2020: 1106-1117.
14  CHEN H, MA Q, LIN Z, et al. Hierarchy-aware label semantics matching network for hierarchical text classification [C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg: ACL, 2021: 4370-4379.
15  DENG Z, PENG H, HE D, et al. HTCInfoMax: a global model for hierarchical text classification via information maximization [C]// Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2021: 3259-3265.
16  JIANG T, WANG D, SUN L, et al. Exploiting global and local hierarchies for hierarchical text classification [C]// Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2022: 4030-4039.
17  YU C, SHEN Y, MAO Y. Constrained sequence-to-tree generation for hierarchical text classification [C]// Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2022: 1865-1869.
18  ZHAO W, ZHAO H. Hierarchical long-tailed classification based on multi-granularity knowledge transfer driven by multi-scale feature fusion [J]. Pattern Recognition, 2024, 145: No.109842.
19  ZHAO J, LI J, FUKUMOTO F. Hierarchy-aware bilateral-branch network for imbalanced hierarchical text classification [C]// Proceedings of the 2023 International Conference on Database and Expert Systems Applications, LNCS 14147. Cham: Springer, 2023: 143-157.
20  ZHAO X, LI Z, ZHANG X, et al. An interactive fusion model for hierarchical multi-label text classification [C]// Proceedings of the 11th CCF International Conference on Natural Language Processing and Chinese Computing, LNCS 13552. Cham: Springer, 2022: 168-178.
21  WANG B, HU X, LI P, et al. Cognitive structure learning model for hierarchical multi-label text classification [J]. Knowledge-Based Systems, 2021, 218: No.106876.
22  SUN Y, QIU H, ZHENG Y, et al. SIFRank: a new baseline for unsupervised keyphrase extraction based on pre-trained language model [J]. IEEE Access, 2020, 8: 10896-10906.
23  DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional Transformers for language understanding [C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg: ACL, 2019: 4171-4186.
24  VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2017: 6000-6010.
25  WILLIAMS R J, ZIPSER D. A learning algorithm for continually running fully recurrent neural networks [J]. Neural Computation, 1989, 1(2): 270-280.
26  LEWIS D D, YANG Y, ROSE T G, et al. RCV1: a new benchmark collection for text categorization research [J]. Journal of Machine Learning Research, 2004, 5: 361-397.
27  KOWSARI K, BROWN D E, HEIDARYSAFA M, et al. HDLTex: hierarchical deep learning for text classification [C]// Proceedings of the 16th IEEE International Conference on Machine Learning and Applications. Piscataway: IEEE, 2017: 364-371.