[1] 刘德喜, 聂建云, 万常选, 等. 基于分类的微博新情感词抽取方法和特征分析[J]. 计算机学报,2018,41(7):1574-1597.(LIU D X,NIE J Y,WAN C X,et al. A classification based sentiment words extracting method from microblogs and its feature engineering[J]. Chinese Journal of Computers,2018,41(7):1574-1597.)
[2] GIACHANOU A,CRESTANI F. Like it or not:a survey of twitter sentiment analysis methods[J]. ACM Computing Surveys,2016,49(2):No. 28.
[3] YADOLLAHI A,SHAHRAKI A G,ZAIANE O R. Current state of text sentiment analysis from opinion to emotion mining[J]. ACM Computing Surveys,2017,50(2):No. 25.
[4] 曾义夫, 蓝天, 吴祖峰, 等. 基于双记忆注意力的方面级别情感分类模型[J]. 计算机学报,2019,42(8):1845-1857.(ZENG Y F,LAN T,WU Z F,et al. Bi-memory based attention model for aspect level sentiment classification[J]. Chinese Journal of Computers,2019,42(8):1845-1857.)
[5] 许银洁, 孙春华, 刘业政. 考虑用户特征的主题情感联合模型[J]. 计算机应用,2018,38(5):1261-1266.(XU Y J,SUN C H,LIU Y Z. Joint sentiment/topic model integrating user characteristics[J]. Journal of Computer Applications,2018,38(5):1261-1266.)
[6] PASSALIS N,TEFAS A. Learning bag-of-embedded-words representations for textual information retrieval[J]. Pattern Recognition,2018,81:254-267.
[7] DEVLIN J,CHANG M W,LEE K,et al. BERT:pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2019:4171-4186.
[8] WANG J,WANG Z,ZHANG D,et al. Combining knowledge with deep convolutional neural networks for short text classification[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. Palo Alto,CA:AAAI,2017:2915-2921.
[9] JIA C,CARSON M B,WANG X,et al. Concept decompositions for short text clustering by identifying word communities[J]. Pattern Recognition,2018,76:691-703.
[10] CHEN J,HU Y,LIU J,et al. Deep short text classification with knowledge powered attention[C]//Proceedings of the 33rd AAAI Conference on Artificial Intelligence. Palo Alto,CA:AAAI,2019:6252-6259.
[11] KINGMA D P,WELLING M. Auto-encoding variational Bayes[EB/OL].[2019-12-20]. https://arxiv.org/pdf/1312.6114.pdf.
[12] MIAO Y,YU L,BLUNSOM P. Neural variational inference for text processing[C]//Proceedings of the 33rd International Conference on Machine Learning. New York:JMLR.org,2016:1727-1736.
[13] SRIVASTAVA A,SUTTON C. Autoencoding variational inference for topic models[EB/OL].[2019-03-04]. https://arxiv.org/pdf/1703.01488.pdf.
[14] GOODFELLOW I J,POUGET-ABADIE J,MIRZA M,et al. Generative adversarial nets[C]//Proceedings of the 27th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2014:2672-2680.
[15] MESCHEDER L,NOWOZIN S,GEIGER A. Adversarial variational Bayes:unifying variational autoencoders and generative adversarial networks[C]//Proceedings of the 34th International Conference on Machine Learning. New York:JMLR.org,2017:2391-2400.
[16] WANG R,ZHOU D,HE Y. ATM:adversarial-neural topic model[J]. Information Processing and Management,2019,56(6):No. 102098.
[17] ARJOVSKY M,CHINTALA S,BOTTOU L. Wasserstein generative adversarial networks[C]//Proceedings of the 34th International Conference on Machine Learning. New York:JMLR.org,2017:214-223.
[18] GULRAJANI I,AHMED F,ARJOVSKY M,et al. Improved training of Wasserstein GANs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook,NY:Curran Associates Inc.,2017:5767-5777.
[19] MIYATO T,KATAOKA T,KOYAMA M,et al. Spectral normalization for generative adversarial networks[EB/OL].[2020-02-16]. https://arxiv.org/pdf/1802.05957.pdf.
[20] BAHDANAU D,CHO K,BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL].[2019-05-19]. https://arxiv.org/pdf/1409.0473.pdf.
[21] HU J,SHEN L,SUN G. Squeeze-and-excitation networks[C]//Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway:IEEE,2018:7132-7141.
[22] VASWANI A,SHAZEER N,PARMAR N,et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook,NY:Curran Associates Inc.,2017:6000-6010.
[23] LIN Z,FENG M,DOS SANTOS C N,et al. A structured self-attentive sentence embedding[EB/OL].[2019-03-09]. https://arxiv.org/pdf/1703.03130.pdf.
[24] LI S,ZHAO Z,HU R,et al. Analogical reasoning on Chinese morphological and semantic relations[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2018:138-143.
[25] KIM Y. Convolutional neural networks for sentence classification[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2014:1746-1751.
[26] LEE J Y,DERNONCOURT F. Sequential short-text classification with recurrent and convolutional neural networks[C]//Proceedings of the 2016 Annual Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2016:515-520.