[1] RAVI K, RAVI V. A survey on opinion mining and sentiment analysis: tasks, approaches and applications[J]. Knowledge-Based Systems, 2015, 89: 14-46. 10.1016/j.knosys.2015.06.015
[2] ZHU X X, SONG J X, ZHANG X T. Review of text emotion analysis based on topic mining technology[J]. Information Studies: Theory and Application, 2019, 42(11): 156-163. (in Chinese) 10.16353/j.cnki.1000-7490.2019.11.025
[3] WEN C D, ZENG C, REN J W, et al. Patent text classification based on ALBERT and bidirectional gated recurrent unit[J]. Journal of Computer Applications, 2021, 41(2): 407-412. (in Chinese)
[4] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2013: 3111-3119.
[5] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. (2013-09-07) [2021-05-06].
[6] PENNINGTON J, SOCHER R, MANNING C D. GloVe: global vectors for word representation[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1532-1543. 10.3115/v1/d14-1162
[7] PETERS M E, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2018: 2227-2237. 10.18653/v1/n18-1202
[8] RADFORD A, NARASIMHAN K, SALIMANS T, et al. Improving language understanding by generative pre-training[EB/OL]. [2020-11-01].
[9] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg, PA: Association for Computational Linguistics, 2019: 4171-4186. 10.18653/v1/n19-1423
[10] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6000-6010.
[11] LAN Z Z, CHEN M D, GOODMAN S, et al. ALBERT: a lite BERT for self-supervised learning of language representations[EB/OL]. (2020-02-09) [2020-02-13].
[12] ZENG C, WEN C D, SUN Y M, et al. Barrage text sentiment analysis based on ALBERT-CRNN[J]. Journal of Zhengzhou University (Science Edition), 2021, 53(3): 1-8. (in Chinese)
[13] DAI Z, YANG Z, YANG Y, et al. Transformer-XL: attentive language models beyond a fixed-length context[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: Association for Computational Linguistics, 2019: 2978-2988. 10.18653/v1/p19-1285
[14] YANG Z L, DAI Z H, YANG Y M, et al. XLNet: generalized autoregressive pretraining for language understanding[C/OL]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. [2020-09-20].
[15] WANG Z T, YU Z W, GUO B, et al. Sentiment analysis of Chinese microblog based on lexicon and rule set[J]. Computer Engineering and Applications, 2015, 51(8): 218-225. (in Chinese) 10.3778/j.issn.1002-8331.1308-0187
[16] WU J S, LU K. Chinese Weibo sentiment analysis based on multiple sentiment lexicons and rule sets[J]. Computer Applications and Software, 2019, 36(9): 93-99. (in Chinese) 10.3969/j.issn.1000-386x.2019.09.017
[17] HUANG J, RUAN T, JIANG R Q. Sentiment analysis in financial domain based on SVM with dependency syntax[J]. Computer Engineering and Applications, 2015, 51(23): 230-235. (in Chinese) 10.3778/j.issn.1002-8331.1311-0180
[18] DENG J, SUN S D, WANG R, et al. Evolution analysis of Weibo public opinion emotion based on Word2Vec and SVM[J]. Information Studies: Theory and Application, 2020, 43(8): 112-119. (in Chinese)
[19] SOCHER R, LIN C C Y, NG A Y, et al. Parsing natural scenes and natural language with recursive neural networks[C]// Proceedings of the 28th International Conference on Machine Learning. Madison, WI: Omnipress, 2011: 129-136.
[20] KIM Y. Convolutional neural networks for sentence classification[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1746-1751. 10.3115/v1/d14-1181
[21] CHO K, van MERRIËNBOER B, GÜLÇEHRE Ç, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1724-1734. 10.3115/v1/d14-1179
[22] DEY R, SALEM F M. Gate-variants of Gated Recurrent Unit (GRU) neural networks[C]// Proceedings of the IEEE 60th International Midwest Symposium on Circuits and Systems. Piscataway: IEEE, 2017: 1597-1600. 10.1109/mwscas.2017.8053243
[23] LAI S W, XU L H, LIU K, et al. Recurrent convolutional neural networks for text classification[C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2015: 2267-2273.
[24] CUI Y M, CHE W X, LIU T, et al. Revisiting pre-trained models for Chinese natural language processing[EB/OL]. (2020-11-02) [2021-05-05]. 10.18653/v1/2020.findings-emnlp.58