Media type:
E-Artikel
Title:
English-Chinese Machine Translation Model Based on Bidirectional Neural Network with Attention Mechanism
Contributors:
Yonglan, Li;
Wenjia, He
Published:
Hindawi Limited, 2022
Published in: Journal of Sensors
Language:
English
DOI:
10.1155/2022/5199248
ISSN:
1687-7268;
1687-725X
Description:
<jats:p>In recent years, with the development of deep learning, neural machine translation has gradually become the mainstream approach in both industry and academia. Existing Chinese-English machine translation models generally adopt deep neural network architectures based on the attention mechanism; however, modeling short and long sequences simultaneously remains a challenging problem. Therefore, a bidirectional LSTM model integrating an attention mechanism is proposed. First, word vectors are used as the input data of the translation model, so that the linguistic symbols used in translation are represented mathematically. Second, two attention mechanisms are designed: a local attention mechanism, used to learn which words or phrases in the input sequence are more important for modeling, and a global attention mechanism, used to learn which layer's representation vector of the input sequence is more critical. A bidirectional LSTM can better fuse the feature information in the input sequence, and a bidirectional LSTM with an attention mechanism can model short and long sequences simultaneously. The experimental results show that, compared with many existing translation models, the bidirectional LSTM model with an attention mechanism effectively improves the quality of machine translation.</jats:p>
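The attention-over-bidirectional-states idea summarized in the abstract can be sketched in a few lines. The following is a minimal illustrative example, not the paper's implementation: the hidden states are random stand-ins for bidirectional LSTM outputs, the dimensions are arbitrary, and the decoder query vector `s` is an assumed placeholder.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy dimensions (illustrative, not from the paper)
seq_len, hidden = 5, 4
rng = np.random.default_rng(0)

# Stand-ins for the forward and backward hidden states of a bidirectional LSTM
h_fwd = rng.normal(size=(seq_len, hidden))
h_bwd = rng.normal(size=(seq_len, hidden))
h = np.concatenate([h_fwd, h_bwd], axis=1)   # (seq_len, 2*hidden) fused states

# Hypothetical decoder state serving as the attention query
s = rng.normal(size=(2 * hidden,))

# Dot-product attention: score each encoder position against the query,
# normalize to weights, and take the weighted sum of encoder states
scores = h @ s                 # (seq_len,)
alpha = softmax(scores)        # attention weights over input positions
context = alpha @ h            # (2*hidden,) context vector for the decoder

print(alpha)          # which positions the model attends to
print(context.shape)
```

The `alpha` weights play the role of the local attention described in the abstract, selecting which input positions matter most; the global variant would instead score representation vectors from different layers with the same softmax machinery.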