[1] CHEN L, GUAN Z Y, HE J H, et al. A survey on sentiment classification[J]. Journal of Computer Research and Development, 2017, 54(6): 1150-1170. (in Chinese)
[2] HUANG M L, QIAN Q, ZHU X Y. Encoding syntactic knowledge in neural networks for sentiment classification[J]. ACM Transactions on Information Systems, 2017, 35(3): Article No. 26.
[3] LI J, LUONG M T, JURAFSKY D, et al. When are tree structures necessary for deep learning of representations?[EB/OL]. [2017-08-04]. http://www.emnlp2015.org/proceedings/EMNLP/pdf/EMNLP278.pdf.
[4] CHO K, van MERRIENBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2014: 1724-1735.
[5] RAVANELLI M, BRAKEL P, OMOLOGO M, et al. Light gated recurrent units for speech recognition[J]. IEEE Transactions on Emerging Topics in Computational Intelligence, 2018, 2(2): 92-102.
[6] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks[C]//Proceedings of the 25th International Conference on Neural Information Processing Systems. Lake Tahoe, Nevada: Curran Associates Inc., 2012: 1097-1105.
[7] ZHUANG F Z, CHENG X H, LUO P, et al. Supervised representation learning with double encoding-layer autoencoder for transfer learning[J]. ACM Transactions on Intelligent Systems and Technology, 2017, 9(2): 1-17.
[8] TAN B, ZHANG Y, PAN S J, et al. Distant domain transfer learning[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence. San Francisco: AAAI Press, 2017: 2604-2610.
[9] LONG M S, CAO Y, WANG J M, et al. Learning transferable features with deep adaptation networks[C]//Proceedings of the 32nd International Conference on Machine Learning. Lille, France: JMLR.org, 2015: 97-105.
[10] WU B, JI J, MENG L, et al. Transfer learning based sentiment analysis for poetry of the Tang dynasty and Song dynasty[J]. Acta Electronica Sinica, 2016, 44(11): 2780-2787. (in Chinese)
[11] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. [2017-08-04]. http://www.surdeanu.info/mihai/teaching/ista555-spring15/readings/mikolov2013.pdf.
[12] PENNINGTON J, SOCHER R, MANNING C D. GloVe: Global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2014: 1532-1543.
[13] McCANN B, BRADBURY J, XIONG C M, et al. Learned in translation: contextualized word vectors[EB/OL]. [2017-08-04]. http://papers.nips.cc/paper/7209-learned-in-translation-contextualized-word-vectors.pdf.
[14] ZHOU G B, WU J X, ZHANG C L, et al. Minimal gated unit for recurrent neural networks[J]. International Journal of Automation and Computing, 2016, 13(3): 226-234.
[15] LUONG M T, PHAM H, MANNING C D. Effective approaches to attention-based neural machine translation[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2015: 1412-1421.
[16] YANG Z C, YANG D Y, DYER C, et al. Hierarchical attention networks for document classification[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: Association for Computational Linguistics, 2016: 1480-1489.
[17] TANG D Y, WEI F R, YANG N, et al. Learning sentiment-specific word embedding for Twitter sentiment classification[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2014: 1555-1565.
[18] CHEN H M, SUN M S, TU C C, et al. Neural sentiment classification with user and product attention[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2016: 1650-1659.
[19] LONG Y F, LU Q, XIANG R, et al. A cognition based attention model for sentiment analysis[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2017: 473-482.