[1] NIE L, WANG M, ZHANG L, et al. Disease inference from health-related questions via sparse deep learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2015, 27(8): 2107-2119.
[2] 卢婷婷. 基于短文本的互联网用户意图识别方法及应用研究[D]. 济南: 济南大学, 2016: 6-10. (LU T T. Research on short texts based Internet users' intention recognition and application[D]. Jinan: University of Jinan, 2016: 6-10.)
[3] LI X. Understanding the semantic structure of noun phrase queries[C]// ACL'10: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: Association for Computational Linguistics, 2010: 1337-1345.
[4] RAMANAND J, BHAVSAR K, PEDANEKAR N. Wishful thinking: finding suggestions and 'buy' wishes from product reviews[C]// CAAGET'10: Proceedings of the NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text. Stroudsburg, PA: Association for Computational Linguistics, 2010: 54-61.
[5] CHEN Z, LIU B, HSU M, et al. Identifying intention posts in discussion forums[C]// NAACL 2013: Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics. Stroudsburg, PA: Association for Computational Linguistics, 2013: 62-71.
[6] 汤秋莲. 基于BTM的短文本聚类[D]. 合肥: 安徽大学, 2014: 17-19. (TANG Q L. The short text clustering based on BTM[D]. Hefei: Anhui University, 2014: 17-19.)
[7] 阳馨, 蒋伟, 刘晓玲. 基于多种特征池化的中文文本分类算法[J]. 四川大学学报(自然科学版), 2017, 54(2): 287-292. (YANG X, JIANG W, LIU X L. Chinese text categorization algorithm based on multiple feature pooling[J]. Journal of Sichuan University (Natural Science Edition), 2017, 54(2): 287-292.)
[8] LI C, XU Y. Based on support vector and word features new word discovery research[C]// ISCTCS 2012: Proceedings of the 2012 International Conference on Trustworthy Computing and Services. Berlin: Springer, 2013: 287-294.
[9] 周昭涛. 文本聚类分析效果评价及文本表示研究[D]. 北京: 中国科学院计算技术研究所, 2005: 32-36. (ZHOU Z T. Quality evaluation of text clustering results and investigation on text representation[D]. Beijing: Institute of Computing Technology, Chinese Academy of Sciences, 2005: 32-36.)
[10] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[J/OL]. arXiv Preprint, arXiv:1310.4546 (2013-10-16) [2017-11-06]. https://arxiv.org/abs/1310.4546.
[11] 王李东, 魏宝刚, 袁杰. 基于概率主题模型的文档聚类[J]. 电子学报, 2012, 55(4): 77-84. (WANG L D, WEI B G, YUAN J. Document clustering based on probabilistic topic model[J]. Acta Electronica Sinica, 2012, 55(4): 77-84.)
[12] 高章敏, 何祥, 刘嘉勇, 等. 基于主题模型的中文词义归纳[J]. 四川大学学报(自然科学版), 2016, 53(6): 1269-1272. (GAO Z M, HE X, LIU J Y, et al. Chinese word sense induction based on topic model[J]. Journal of Sichuan University (Natural Science Edition), 2016, 53(6): 1269-1272.)
[13] HOFMANN T. Probabilistic latent semantic indexing[C]// Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 1999: 50-57.
[14] BLEI D M, NG A Y, JORDAN M I. Latent Dirichlet allocation[J]. The Journal of Machine Learning Research, 2003, 3: 993-1022.
[15] CHENG X, YAN X, LAN Y, et al. BTM: topic modeling over short texts[J]. IEEE Transactions on Knowledge and Data Engineering, 2014, 26(12): 2928-2941.
[16] LECUN Y, BENGIO Y, HINTON G E. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
[17] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[18] CHO K, VAN MERRIËNBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]// EMNLP 2014: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1724-1734.
[19] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15: 1929-1958.
[20] 钱岳, 丁效, 刘挺, 等. 聊天机器人中用户出行消费意图识别[J]. 中国科学: 信息科学, 2017, 47(8): 997-100. (QIAN Y, DING X, LIU T, et al. Identification method of the user's travel consumption intention in chatting robot[J]. SCIENTIA SINICA Informationis, 2017, 47(8): 997-100.)