[1] 郭喜跃, 何婷婷. 信息抽取研究综述[J]. 计算机科学, 2015, 42(2): 14-17. 10.11896/j.issn.1002-137X.2015.2.003
GUO X Y, HE T T. Survey about research on information extraction[J]. Computer Science, 2015, 42(2): 14-17. 10.11896/j.issn.1002-137X.2015.2.003
[2] 姚萍, 李坤伟, 张一帆. 知识图谱构建技术综述[J]. 信息系统工程, 2020(5): 121, 123. 10.3969/j.issn.1001-2362.2020.05.054
YAO P, LI K W, ZHANG Y F. Summary of knowledge graph construction technology[J]. China CIO News, 2020(5): 121, 123. 10.3969/j.issn.1001-2362.2020.05.054
[3] 沈航可, 祁志卫, 张子辰, 等. 知识图谱的候选实体搜索与排序[J]. 计算机系统应用, 2021, 30(11): 46-53.
SHEN H K, QI Z W, ZHANG Z C, et al. Candidate entity search and ranking of knowledge map[J]. Computer Systems and Applications, 2021, 30(11): 46-53.
[4] BACH N, BADASKAR S. A review of relation extraction[EB/OL]. [2022-06-22]. 10.2139/ssrn.4173454
[5] XIONG C Y, POWER R, CALLAN J. Explicit semantic ranking for academic search via knowledge graph embedding[C]// Proceedings of the 26th International Conference on World Wide Web. Republic and Canton of Geneva: International World Wide Web Conferences Steering Committee, 2017: 1271-1279. 10.1145/3038912.3052558
[6] ZHANG Y Z, JIANG Z T, ZHANG T, et al. MIE: a medical information extractor towards medical dialogues[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2020: 6460-6469. 10.18653/v1/2020.acl-main.576
[7] MINTZ M, BILLS S, SNOW R, et al. Distant supervision for relation extraction without labeled data[C]// Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP. Stroudsburg, PA: ACL, 2009: 1003-1011. 10.3115/1690219.1690287
[8] RIEDEL S, YAO L M, McCALLUM A. Modeling relations and their mentions without labeled text[C]// Proceedings of the 2010 Joint European Conference on Machine Learning and Knowledge Discovery in Databases, LNCS 6323. Berlin: Springer, 2010: 148-163. 10.1007/978-3-642-15939-8_10
[9] ZENG D J, LIU K, CHEN Y B, et al. Distant supervision for relation extraction via piecewise convolutional neural networks[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2015: 1753-1762. 10.18653/v1/d15-1203
[10] QU J F, OUYANG D T, HUA W, et al. Distant supervision for neural relation extraction integrated with word attention and property features[J]. Neural Networks, 2018, 100: 59-69. 10.1016/j.neunet.2018.01.006
[11] LIN Y K, SHEN S Q, LIU Z Y, et al. Neural relation extraction with selective attention over instances[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2016: 2124-2133. 10.18653/v1/p16-1200
[12] XIAO Y, JIN Y C, CHENG R, et al. Hybrid attention-based Transformer block model for distant supervision relation extraction[J]. Neurocomputing, 2022, 470: 29-39. 10.1016/j.neucom.2021.10.037
[13] ZHOU Y R, PAN L M, BAI C Y, et al. Self-selective attention using correlation between instances for distant supervision relation extraction[J]. Neural Networks, 2021, 142: 213-220. 10.1016/j.neunet.2021.04.032
[14] CHEN T, SHI H Z, TANG S L, et al. CIL: contrastive instance learning framework for distantly supervised relation extraction[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. Stroudsburg, PA: ACL, 2021: 6191-6200. 10.18653/v1/2021.acl-long.483
[15] LI D, ZHANG T, HU N, et al. HiCLRE: a hierarchical contrastive learning framework for distantly supervised relation extraction[C]// Findings of the Association for Computational Linguistics: ACL 2022. Stroudsburg, PA: ACL, 2022: 2567-2578. 10.18653/v1/2022.findings-acl.202
[16] SHANG Y M, HUANG H Y, MAO X L, et al. Are noisy sentences useless for distant supervised relation extraction?[C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2020: 8799-8806. 10.1609/aaai.v34i05.6407
[17] MA R T, GUI T, LI L Y, et al. SENT: sentence-level distant relation extraction via negative training[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA: ACL, 2021: 6201-6213. 10.18653/v1/2021.acl-long.484
[18] KIM Y, YIM J, YUN J, et al. NLNL: negative learning for noisy labels[C]// Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision. Piscataway: IEEE, 2019: 101-110. 10.1109/iccv.2019.00019
[19] TORREY L, SHAVLIK J. Transfer learning[M]// OLIVAS E S, GUERRERO J D M, MARTINEZ-SOBER M, et al. Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques. Hershey, PA: IGI Global, 2010: 242-264. 10.4018/978-1-60566-766-9.ch011
[20] PAN S J, YANG Q. A survey on transfer learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(10): 1345-1359. 10.1109/tkde.2009.191
[21] CHEN C, JIANG B Y, CHENG Z W, et al. Joint domain matching and classification for cross-domain adaptation via ELM[J]. Neurocomputing, 2019, 349: 314-325. 10.1016/j.neucom.2019.01.056
[22] GUO H L, ZHU H J, GUO Z L, et al. Domain adaptation with latent semantic association for named entity recognition[C]// Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2009: 281-289. 10.3115/1620754.1620795
[23] FU L S, NGUYEN T H, MIN B N, et al. Domain adaptation for relation extraction with domain adversarial neural network[C]// Proceedings of the 8th International Joint Conference on Natural Language Processing (Volume 2: Short Papers). [S.l.]: Asian Federation of Natural Language Processing, 2017: 425-429.
[24] SUN C, QIU X P, XU Y G, et al. How to fine-tune BERT for text classification?[C]// Proceedings of the 2019 China National Conference on Chinese Computational Linguistics, LNCS 11856. Cham: Springer, 2019: 194-206.
[25] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, PA: ACL, 2019: 4171-4186. 10.18653/v1/N19-1423
[26] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6000-6010.
[27] ZHANG S, ZHENG D Q, HU X C, et al. Bidirectional long short-term memory networks for relation classification[C/OL]// Proceedings of the 29th Pacific Asia Conference on Language, Information and Computation. [2020-04-20].
[28] ZHANG Y H, ZHONG V, CHEN D Q, et al. Position-aware attention and supervised data improve slot filling[C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2017: 35-45. 10.18653/v1/D17-1004