Neural machine translation integrating bidirectional-dependency self-attention mechanism
Zhijin LI, Hua LAI, Yonghua WEN, Shengxiang GAO
Journal of Computer Applications, 2022, 42(12): 3679-3685. DOI: 10.11772/j.issn.1001-9081.2021101805
Abstract

To address resource scarcity in neural machine translation, a method for fusing dependency syntactic knowledge based on a Bidirectional-Dependency self-attention mechanism (Bi-Dependency) was proposed. Firstly, an external parser was used to parse the source sentence and obtain dependency parsing data. Then, the dependency parsing data was transformed into the position vector of the parent word and the weight matrix of the child word. Finally, this dependency knowledge was integrated into the multi-head attention mechanism of the Transformer encoder. With Bi-Dependency, the translation model was able to attend simultaneously to dependency information in both directions: from parent word to child word and from child word to parent word. Experimental results on bidirectional translation show that, compared with the Transformer model, in rich-resource settings the proposed method improves the BLEU (BiLingual Evaluation Understudy) score by 1.07 and 0.86 in the two directions of Chinese-Thai translation, and by 0.79 and 0.68 in the two directions of Chinese-English translation; in low-resource settings, the BLEU score increases by 0.51 and 1.06 on Chinese-Thai translation, and by 1.04 and 0.40 on Chinese-English translation. These results indicate that Bi-Dependency provides the model with richer dependency information and effectively improves translation performance.
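The following is a minimal sketch, not the authors' implementation, of how bidirectional dependency relations can be injected into self-attention. It assumes PyTorch and simplifies the paper's parent-word position vectors and child-word weight matrices into a single additive score bias over parent-to-child and child-to-parent positions; the function and variable names (`dependency_masks`, `bi_dependency_attention`, `heads`, `bias_weight`) are hypothetical.

```python
# Sketch of dependency-biased self-attention (simplified; not the paper's exact formulation).
import torch
import torch.nn.functional as F

def dependency_masks(heads, seq_len):
    """Build parent->child and child->parent adjacency masks from a
    dependency parse, where heads[i] is the index of token i's parent
    word (-1 for the root)."""
    p2c = torch.zeros(seq_len, seq_len)  # parent position attends to child
    c2p = torch.zeros(seq_len, seq_len)  # child position attends to parent
    for child, parent in enumerate(heads):
        if parent >= 0:
            p2c[parent, child] = 1.0
            c2p[child, parent] = 1.0
    return p2c, c2p

def bi_dependency_attention(q, k, v, p2c, c2p, bias_weight=1.0):
    """Scaled dot-product attention with an additive bias that raises the
    attention scores of dependency-linked positions in both directions."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    scores = scores + bias_weight * (p2c + c2p)  # inject dependency knowledge
    return F.softmax(scores, dim=-1) @ v

# Toy example: a 4-token sentence whose parse (from an assumed external
# parser) makes token 1 the root.
heads = [1, -1, 1, 2]
p2c, c2p = dependency_masks(heads, len(heads))
x = torch.randn(4, 8)  # token representations, d_model = 8
out = bi_dependency_attention(x, x, x, p2c, c2p)
print(out.shape)       # torch.Size([4, 8])
```

In a full Transformer encoder this bias would be applied per head inside multi-head attention, so that some heads can specialize in parent-to-child relations and others in child-to-parent relations.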
