[1] World Customs Organization. What is the Harmonized System (HS)?[EB/OL]. [2020-08-03].
[2] 3CE Technologies. Is HS classification difficult?[EB/OL]. [2020-08-03].
[3] XIE W, LI Y S, SHAO Y Z, et al. Design and implementation of HS code query knowledge base[J]. Computer Applications and Software, 2008, 25(8): 143-146.
[4] DING L Y, FAN Z Z, CHEN D L. Auto-categorization of HS code using background net approach[J]. Procedia Computer Science, 2015, 60: 1462-1471. 10.1016/j.procs.2015.08.224
[5] LI Q, PENG H, LI J X, et al. A survey on text classification: from shallow to deep learning[EB/OL]. (2020-10-26) [2020-12-12].
[6] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. (2013-09-07) [2020-12-12].
[7] JOULIN A, GRAVE E, BOJANOWSKI P, et al. Bag of tricks for efficient text classification[C]// Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers. Stroudsburg, PA: Association for Computational Linguistics, 2017: 427-431. 10.18653/v1/e17-2068
[8] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2013: 3111-3119.
[9] GONG L J, WANG H, ZHANG Z X, et al. Reducing dimensions of custom declaration texts with Word2Vec[J]. Data Analysis and Knowledge Discovery, 2020, 4(2/3): 89-100. 10.11925/infotech.2096-3467.2019.0613
[10] PENNINGTON J, SOCHER R, MANNING C D. GloVe: global vectors for word representation[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1532-1543. 10.3115/v1/d14-1162
[11] KIM Y. Convolutional neural networks for sentence classification[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2014: 1746-1751. 10.3115/v1/d14-1181
[12] PETERS M E, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2018: 2227-2237. 10.18653/v1/n18-1202
[13] RADFORD A, NARASIMHAN K, SALIMANS T, et al. Improving language understanding with unsupervised learning[EB/OL]. (2018-06-11) [2020-12-12].
[14] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6000-6010.
[15] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Stroudsburg, PA: Association for Computational Linguistics, 2019: 4171-4186. 10.18653/v1/n19-1423
[16] SØGAARD A, GOLDBERG Y. Deep multi-task learning with low level tasks supervised at lower layers[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Stroudsburg, PA: Association for Computational Linguistics, 2016: 231-235. 10.18653/v1/p16-2038
[17] HASHIMOTO K, XIONG C M, TSURUOKA Y, et al. A joint many-task model: growing a neural network for multiple NLP tasks[C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: Association for Computational Linguistics, 2017: 1923-1933. 10.18653/v1/d17-1206
[18] SANH V, WOLF T, RUDER S. A hierarchical multi-task approach for learning embeddings from semantic tasks[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2019: 6949-6956. 10.1609/aaai.v33i01.33016949
[19] LIU X D, HE P C, CHEN W Z, et al. Multi-task deep neural networks for natural language understanding[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: Association for Computational Linguistics, 2019: 4487-4496. 10.18653/v1/p19-1441