Paper
8 November 2024 Research of improvement of multilingual scientific translation model based on neural network attention mechanism
Jing Zhao, Hongjing Chang, Xuguang Zhang, Chunmao Li
Author Affiliations
Proceedings Volume 13416, Fourth International Conference on Advanced Algorithms and Neural Networks (AANN 2024); 134162U (2024) https://doi.org/10.1117/12.3050128
Event: 2024 4th International Conference on Advanced Algorithms and Neural Networks, 2024, Qingdao, China
Abstract
The application of neural network algorithms to multilingual translation is currently an important research field. Traditional sequential neural frameworks, such as RNNs and their variants the LSTM and GRU models, suffer from slow training and inference speed and reduced translation accuracy when processing long sequences, because their inherently sequential processing mechanism limits the possibility of parallel processing. This report addresses the shortcomings of traditional sequential neural frameworks by establishing a Transformer model whose encoder and decoder are built on the attention mechanism to make up for these shortcomings. It combines self-attention with neural networks and systematically implements multilingual scientific translation on the basis of PyTorch to improve translation quality. The experimental results indicate that the BLEU score of the Transformer model with the attention mechanism in the field of scientific translation is improved to varying degrees compared with that of traditional sequential neural algorithms. This shows that the Transformer model with the attention mechanism performs significantly better than traditional models, and improvement suggestions are put forward on the basis of this validated model.
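For orientation, the scaled dot-product attention at the core of the Transformer encoder-decoder described in the abstract can be sketched in pure Python. This is a minimal illustration of the standard formula Attention(Q, K, V) = softmax(QK^T / sqrt(d_k))V, not the authors' implementation; the function and variable names are illustrative, and a real system would use PyTorch tensors and learned projection matrices.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(a, b):
    # Plain nested-list matrix multiply: (n x k) @ (k x m) -> (n x m).
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (one row per token).
    Returns the attended values and the attention weight matrix.
    """
    d_k = len(K[0])                       # key dimension, used for scaling
    scores = matmul(Q, transpose(K))      # pairwise query-key similarities
    scaled = [[s / math.sqrt(d_k) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]  # each row sums to 1
    return matmul(weights, V), weights
```

In self-attention, Q, K, and V are all derived from the same token sequence, which is what lets the Transformer relate every position to every other position in a single parallel step instead of the token-by-token recurrence of RNN/LSTM/GRU models.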
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Jing Zhao, Hongjing Chang, Xuguang Zhang, and Chunmao Li "Research of improvement of multilingual scientific translation model based on neural network attention mechanism", Proc. SPIE 13416, Fourth International Conference on Advanced Algorithms and Neural Networks (AANN 2024), 134162U (8 November 2024); https://doi.org/10.1117/12.3050128
KEYWORDS: Transformers; Performance modeling; Data modeling; Neural networks; Education and training; Mathematical optimization; Artificial neural networks
