[1] ZONG Y, WANG D, WANG X, et al. Characteristic portrait and analysis method of power grid work orders based on statistics[C]// Proceedings of the 2nd International Conference on Control and Intelligent Robotics. New York: ACM, 2022: 717-723.
[2] YU Y, HE W, KANG Y M, et al. Leveraging continuous prompt for few-shot named entity recognition in electric power domain with meta-learning[J]. Data Intelligence, 2023, 5(2): 494-509.
[3] MOSCATO V, NAPOLANO G, POSTIGLIONE M, et al. Multi-task learning for few-shot biomedical relation extraction[J]. Artificial Intelligence Review, 2023, 56(11): 13743-13763.
[4] BANG J, PARK J, PARK J. GACaps-HTC: graph attention capsule network for hierarchical text classification[J]. Applied Intelligence, 2023, 53(17): 20577-20594.
[5] YIN Y, ZENG J, SU J, et al. Multi-modal graph contrastive encoding for neural machine translation[J]. Artificial Intelligence, 2023, 323: No.103986.
[6] GEMAN S, BIENENSTOCK E, DOURSAT R. Neural networks and the bias/variance dilemma[J]. Neural Computation, 1992, 4(1): 1-58.
[7] WEI J, ZOU K. EDA: easy data augmentation techniques for boosting performance on text classification tasks[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: ACL, 2019: 6382-6388.
[8] YAO X, YANG Z, CUI Y, et al. MiniRBT: a two-stage distilled small Chinese pre-trained model[EB/OL]. [2024-08-11].
[9] 刘鹏. 基于深度学习的电力工单文本挖掘[D]. 南昌: 南昌大学, 2022: 6-23.
LIU P. Text mining of electricity work order based on deep learning[D]. Nanchang: Nanchang University, 2022: 6-23.
[10] 孙茂松, 李景阳, 郭志芃, 等. THUCTC: 一个高效的中文文本分类工具包[EB/OL]. [2024-08-11].
SUN M S, LI J Y, GUO Z P, et al. THUCTC: an efficient Chinese text classifier[EB/OL]. [2024-08-11].
[11] 曹湘, 李誉坤, 钱叶, 等. 基于混合神经网络的电力短文本分类方法研究[J]. 计算机与数字工程, 2019, 47(5): 1145-1150.
CAO X, LI Y K, QIAN Y, et al. Short text classification of electric power based on hybrid neural network[J]. Computer and Digital Engineering, 2019, 47(5): 1145-1150.
[12] 冯斌, 张又文, 唐昕, 等. 基于BiLSTM-Attention神经网络的电力设备缺陷文本挖掘[J]. 中国电机工程学报, 2020, 40(S1): 1-10.
FENG B, ZHANG Y W, TANG X, et al. Power equipment defect record text mining based on BiLSTM-Attention neural network[J]. Proceedings of the CSEE, 2020, 40(S1): 1-10.
[13] MENG J, LI Y, LIU C, et al. Classification of customer service tickets in power system based on character and word level semantic understanding[C]// Proceedings of the 2021 China International Conference on Electricity Distribution. Piscataway: IEEE, 2021: 1062-1066.
[14] MENG Q, SONG Y, MU J, et al. Electric power audit text classification with multi-grained pre-trained language model[J]. IEEE Access, 2023, 11: 13510-13518.
[15] 田园, 原野, 刘海斌, 等. 基于BERT预训练语言模型的电网设备缺陷文本分类[J]. 南京理工大学学报, 2020, 44(4): 446-453.
TIAN Y, YUAN Y, LIU H B, et al. BERT pre-trained language model for defective text classification of power grid equipment[J]. Journal of Nanjing University of Science and Technology, 2020, 44(4): 446-453.
[16] 付文杰, 杨迪, 马红明, 等. 融合BTM和BERT的短文本分类方法[J]. 计算机工程与设计, 2022, 43(12): 3421-3427.
FU W J, YANG D, MA H M, et al. Short text classification method based on BTM and BERT[J]. Computer Engineering and Design, 2022, 43(12): 3421-3427.
[17] ZHANG S, TONG H, XU J, et al. Graph convolutional networks: a comprehensive review[J]. Computational Social Networks, 2019, 6: No.11.
[18] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[EB/OL]. [2024-08-11].
[19] CUI Y, CHE W, LIU T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[20] MÜLLER R, KORNBLITH S, HINTON G E. When does label smoothing help?[C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2019: 4694-4703.
[21] 周志华. 机器学习[M]. 北京: 清华大学出版社, 2016: 23-29.
ZHOU Z H. Machine learning[M]. Beijing: Tsinghua University Press, 2016: 23-29.
[22] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15: 1929-1958.
[23] KIM Y. Convolutional neural networks for sentence classification[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2014: 1746-1751.
[24] LIU P, QIU X, HUANG X. Recurrent neural network for text classification with multi-task learning[C]// Proceedings of the 25th International Joint Conference on Artificial Intelligence. California: ijcai.org, 2016: 2873-2879.
[25] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg: ACL, 2019: 4171-4186.
[26] WAN C X, LI B. Financial causal sentence recognition based on BERT-CNN text classification[J]. The Journal of Supercomputing, 2022, 78(5): 6503-6527.
[27] LUO J, HE C, LUO H. BRsyn-Caps: Chinese text classification using capsule network based on BERT and dependency syntax[J]. IEICE Transactions on Information and Systems, 2024, E107-D(2): 212-219.