[1] 陈功平,沈明玉,王红,等.基于内容的短信分类技术[J].华东理工大学学报(自然科学版),2011,37(6):770-774.(CHEN G P, SHEN M Y, WANG H, et al. SMS classification technology based on content[J]. Journal of East China University of Science and Technology (Natural Science Edition), 2011, 37(6):770-774.)
[2] ZHANG L, MA J, WANG Y. Content based spam text classification:an empirical comparison between English and Chinese[C]//INCOS'13:Proceedings of the 2013 5th International Conference on Intelligent Networking and Collaborative Systems. Washington, DC:IEEE Computer Society, 2013:69-76.
[3] SHARMA N, KAUR G, VERMA A. Survey on text classification (spam) using machine learning[J]. International Journal of Computer Science and Information Technologies, 2014, 5(4):5098-5102.
[4] SHAHI T B, YADAV A. Mobile SMS spam filtering for Nepali text using Naïve Bayesian and support vector machine[J]. International Journal of Intelligence Science, 2014, 4(1):24-28.
[5] 李润川,昝红英,申圣亚,等.基于多特征融合的垃圾短信识别[J].山东大学学报(理学版),2017,52(7):73-79.(LI R C, ZAN H Y, SHEN S Y, et al. Spam messages identification based on multi-feature fusion[J]. Journal of Shandong University (Natural Science), 2017, 52(7):73-79.)
[6] 黄文明,莫阳.基于文本加权KNN算法的中文垃圾短信过滤[J].计算机工程,2017,43(3):193-199.(HUANG W M, MO Y. Chinese spam message filtering based on text weighted KNN algorithm[J]. Computer Engineering, 2017, 43(3):193-199.)
[7] SETHI P, BHANDARI V, KOHLI B. SMS spam detection and comparison of various machine learning algorithms[C]//Proceedings of the 2017 International Conference on Computing and Communication Technologies for Smart Nation. Piscataway, NJ:IEEE, 2017:28-31.
[8] CHAN P P K, YANG C, YEUNG D S, et al. Spam filtering for short messages in adversarial environment[J]. Neurocomputing, 2015, 155:167-176.
[9] MA J, ZHANG Y, LIU J, et al. Intelligent SMS spam filtering using topic model[C]//Proceedings of the 2016 International Conference on Intelligent Networking and Collaborative Systems. Piscataway, NJ:IEEE, 2016:380-383.
[10] CHOUDHARY N, JAIN A K. Towards filtering of SMS spam messages using machine learning based technique[M]//Advanced Informatics for Computing Research. Berlin:Springer, 2017:18-30.
[11] BAKER S, KORHONEN A, PYYSALO S. Cancer hallmark text classification using convolutional neural networks[EB/OL].[2018-01-11]. https://www.repository.cam.ac.uk/bitstream/handle/1810/270037/BIOTXTM2016.pdf;jsessionid=FAE7EA1B196FA600CC643D798DD04A0D?sequence=1.
[12] KIM Y. Convolutional neural networks for sentence classification[EB/OL].[2018-01-11]. http://www.anthology.aclweb.org/D/D14/D14-1181.pdf.
[13] SALTON G. A vector space model for automatic indexing[J]. Communications of the ACM, 1975, 18(11):613-620.
[14] KESORN K, POSLAD S. An enhanced bag-of-visual word vector space model to represent visual content in athletics images[J]. IEEE Transactions on Multimedia, 2012, 14(1):211-222.
[15] CASTELLS P, FERNANDEZ M, VALLET D. An adaptation of the vector-space model for ontology-based information retrieval[J]. IEEE Transactions on Knowledge and Data Engineering, 2007, 19(2):261-272.
[16] TURNEY P D, PANTEL P. From frequency to meaning:vector space models of semantics[J]. Journal of Artificial Intelligence Research, 2010, 37(1):141-188.
[17] HINTON G E. Learning distributed representations of concepts[C]//Proceedings of the 8th Annual Conference of the Cognitive Science Society. New York:Clarendon Press, 1986:1-12.
[18] HU B, TANG B, CHEN Q, et al. A novel word embedding learning model using the dissociation between nouns and verbs[J]. Neurocomputing, 2016, 171:1108-1117.
[19] BIAN J, GAO B, LIU T Y. Knowledge-powered deep learning for word embedding[C]//Proceedings of the 2014 Machine Learning and Knowledge Discovery in Databases, LNCS 8724. Berlin:Springer, 2014:132-148.
[20] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[EB/OL].[2018-01-12]. http://www.eecs.wsu.edu/~sji/classes/DL16/CNN-text/word2vec2.pdf.
[21] 郑世卓,崔晓燕.基于半监督LDA的文本分类应用研究[J].软件,2014,35(1):46-48.(ZHENG S Z, CUI X Y. Research on text classification based on semi-supervised LDA[J]. Computer Engineering and Software, 2014, 35(1):46-48.)
[22] GLOROT X, BORDES A, BENGIO Y. Deep sparse rectifier neural networks[EB/OL].[2018-01-12]. http://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf.
[23] ZHANG X, ZHAO J, LeCUN Y. Character-level convolutional networks for text classification[EB/OL].[2018-01-12]. http://www.eecs.wsu.edu/~sji/classes/DL16/CNN-text/5782-character-level-convolutional-networks-for-text-classification.pdf.
[24] HINTON G E, SALAKHUTDINOV R R. Replicated softmax:an undirected topic model[C]//Proceedings of the 2009 International Conference on Neural Information Processing Systems.[S.l.]:Curran Associates Inc., 2009:1607-1614.
[25] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout:a simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15(1):1929-1958.
[26] 宗成庆.统计自然语言处理[M].北京:清华大学出版社,2008:352-363.(ZONG C Q. Statistical Natural Language Processing[M]. Beijing:Tsinghua University Press, 2008:352-363.)