[1] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding [C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg, PA: ACL, 2019: 4171-4186.
[2] LIU Y, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach [EB/OL]. [2023-12-08]. .
[3] CHIU J P C, NICHOLS E. Named entity recognition with bidirectional LSTM-CNNs [J]. Transactions of the Association for Computational Linguistics, 2016, 4: 357-370.
[4] ZHANG Y, YANG J. Chinese NER using lattice LSTM [C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Volume 1 (Long Papers). Stroudsburg, PA: ACL, 2018: 1554-1564.
[5] ZUO M, XUE M H, ZHANG Q C, et al. Research on joint extraction of internet-oriented food text entity relationships [J]. Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition), 2022, 34(5): 812-817.
[6] SU J, MURTADHA A, PAN S, et al. Global Pointer: novel efficient span-based approach for named entity recognition [EB/OL]. [2023-10-07]. .
[7] SUN Y X. Research on the construction method of ancient Chinese knowledge graphs [D]. Dalian: Dalian University of Technology, 2020: 21-30.
[8] LIU X L, FAN J J, MA H Q. Improvement strategies of data augmentation algorithms for few-shot named entity recognition [J]. Data Analysis and Knowledge Discovery, 2022, 6(10): 128-141.
[9] WEI J, ZOU K. EDA: easy data augmentation techniques for boosting performance on text classification tasks [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg, PA: ACL, 2019: 6382-6388.
[10] CHENG S W. Research on recognition of trigger words for character relationships in Chinese classics based on deep learning [D]. Nanjing: Nanjing Agricultural University, 2020: 25-38.
[11] WANG Y S, WANG H, YU W, et al. Extracting relationships among characters from local chronicles with text structures and contents [J]. Data Analysis and Knowledge Discovery, 2022, 6(Z1): 318-328.
[12] PENG B. Research on entity relationship extraction of online cultural relic information resources with ALBERT [J]. Journal of Intelligence, 2022, 41(8): 156-162.
[13] WANG D, LIU C, ZHAO Z, et al. GujiBERT and GujiGPT: construction of intelligent information processing foundation language models for ancient texts [EB/OL]. [2024-01-08]. .
[14] WEN F J. Research on character relationships in Chinese classics based on deep learning [D]. Taiyuan: North University of China, 2021: 17-21.
[15] TANG X M, SU Q, WANG J. Classifying ancient Chinese text relations with entity information [J]. Data Analysis and Knowledge Discovery, 2024, 8(1): 114-124.
[16] BROWN T, MANN B, RYDER N, et al. Language models are few-shot learners [C]// Proceedings of the 34th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2020: 1877-1901.
[17] LEVY O, SEO M, CHOI E, et al. Zero-shot relation extraction via reading comprehension [C]// Proceedings of the 21st Conference on Computational Natural Language Learning. Stroudsburg, PA: ACL, 2017: 333-342.
[18] PETRONI F, ROCKTÄSCHEL T, RIEDEL S, et al. Language models as knowledge bases? [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg, PA: ACL, 2019: 2463-2473.
[19] SHIN T, RAZEGHI Y, LOGAN R L IV, et al. AutoPrompt: eliciting knowledge from language models with automatically generated prompts [C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2020: 4222-4235.
[20] CHEN X, ZHANG N, XIE X, et al. KnowPrompt: knowledge-aware prompt-tuning with synergistic optimization for relation extraction [C]// Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 2778-2788.
[21] WEI Z, SU J, WANG Y, et al. A novel cascade binary tagging framework for relational triple extraction [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2020: 1476-1488.
[22] WANG Y, YU B, ZHANG Y, et al. TPLinker: single-stage joint extraction of entities and relations through token pair linking [C]// Proceedings of the 28th International Conference on Computational Linguistics. [S.l.]: International Committee on Computational Linguistics, 2020: 1572-1582.
[23] SHANG Y M, HUANG H, MAO X L. OneRel: joint entity and relation extraction with one module in one step [C]// Proceedings of the 36th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2022: 11285-11293.