[1] LUU L,CHU D H,OLICKEL H,et al. Making smart contracts smarter[C]//Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. New York:ACM,2016:254-269.
[2] MOHANTA B K,PANDA S S,JENA D. An overview of smart contract and use cases in blockchain technology[C]//Proceedings of the 9th International Conference on Computing,Communication and Networking Technologies. Piscataway:IEEE,2018:1-4.
[3] ATZEI N,BARTOLETTI M,CIMOLI T. A survey of attacks on Ethereum smart contracts[C]//Proceedings of the 6th International Conference on Principles of Security and Trust,LNCS 10204. Berlin:Springer,2017:164-186.
[4] 黄步添,刘琦,何钦铭,等. 基于语义嵌入模型与交易信息的智能合约自动分类系统[J]. 自动化学报,2017,43(9):1532-1543. (HUANG B T,LIU Q,HE Q M,et al. Towards automatic smart-contract codes classification by means of word embedding model and transaction information[J]. Acta Automatica Sinica,2017,43(9):1532-1543.)
[5] JOHNSON R,ZHANG T. Effective use of word order for text categorization with convolutional neural networks[C]//Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2015:103-112.
[6] SOCHER R,PERELYGIN A,WU J,et al. Recursive deep models for semantic compositionality over a sentiment treebank[C]//Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2013:1631-1642.
[7] KIM Y. Convolutional neural networks for sentence classification[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2014:1746-1751.
[8] DOS SANTOS C,GATTI M. Deep convolutional neural networks for sentiment analysis of short texts[C]//Proceedings of the 25th International Conference on Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2014:69-78.
[9] ZHANG X,ZHAO J,LECUN Y. Character-level convolutional networks for text classification[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2015:649-657.
[10] CONNEAU A,SCHWENK H,BARRAULT L,et al. Very deep convolutional networks for text classification[C]//Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2017:1107-1116.
[11] JOHNSON R,ZHANG T. Convolutional neural networks for text categorization:shallow word-level vs. deep character-level[EB/OL].[2019-07-25]. https://arxiv.org/pdf/1609.00718.pdf.
[12] WANG J,WANG Z,ZHANG D,et al. Combining knowledge with deep convolutional neural networks for short text classification[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. Palo Alto:AAAI,2017:2915-2921.
[13] HOCHREITER S,SCHMIDHUBER J. Long short-term memory[J]. Neural Computation,1997,9(8):1735-1780.
[14] WANG Y,HUANG M,ZHU X. Attention-based LSTM for aspect-level sentiment classification[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2016:606-615.
[15] ZHOU C,SUN C,LIU Z,et al. A C-LSTM neural network for text classification[EB/OL].[2019-07-25]. https://arxiv.org/pdf/1511.08630.pdf.
[16] TANG D,QIN B,LIU T. Document modeling with gated recurrent neural network for sentiment classification[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2015:1422-1432.
[17] ZHOU P,QI Z,ZHENG S,et al. Text classification improved by integrating bidirectional LSTM with two-dimensional max pooling[C]//Proceedings of the 26th International Conference on Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2016:3485-3495.
[18] BAHDANAU D,CHO K,BENGIO Y. Neural machine translation by jointly learning to align and translate[C]//Proceedings of the 3rd International Conference on Learning Representations. San Diego,CA:Academy Press,2015:1-15.
[19] ZHAO Z,WU Y. Attention-based convolutional neural networks for sentence classification[C]//Proceedings of the 17th Annual Conference of the International Speech Communication Association. San Francisco,CA:Morgan Kaufmann,2016:705-709.
[20] YANG Z,YANG D,DYER C,et al. Hierarchical attention networks for document classification[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2016:1480-1489.
[21] VASWANI A,SHAZEER N,PARMAR N,et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York:Curran Associates,2017:6000-6010.
[22] PAPPAS N,POPESCU-BELIS A. Multilingual hierarchical attention networks for document classification[C]//Proceedings of the 8th International Joint Conference on Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2017:1015-1025.
[23] DU C,HUANG L. Text classification research with attention-based recurrent neural networks[J]. International Journal of Computers Communications and Control,2018,13(1):50-61.
[24] MIKOLOV T,SUTSKEVER I,CHEN K,et al. Distributed representations of words and phrases and their compositionality[C]//Proceedings of the 27th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2013:3111-3119.
[25] ZEILER M D. ADADELTA:an adaptive learning rate method[EB/OL].[2019-07-25]. https://arxiv.org/pdf/1212.5701.pdf.
[26] SRIVASTAVA N,HINTON G,KRIZHEVSKY A,et al. Dropout:a simple way to prevent neural networks from overfitting[J]. The Journal of Machine Learning Research,2014,15(1):1929-1958.