15 August 2023 Application research of RNN in text intelligent prediction scenario
Proceedings Volume 12719, Second International Conference on Electronic Information Technology (EIT 2023); 127191D (2023) https://doi.org/10.1117/12.2685792
Event: Second International Conference on Electronic Information Technology (EIT 2023), 2023, Wuhan, China
Abstract
As deep learning has continued to develop, research on deep-learning-based language models has grown and produced notable results. Among the many approaches, the RNN has unique advantages for constructing language models: its weight-sharing recurrent structure gives the model a memory of the preceding context, which facilitates predicting upcoming content. Text generation is one of the most common applications in this area. However, a plain RNN has limitations, notably its inability to retain preceding context over long spans, which leaves RNN-based language models comparatively weak in intelligence and in connecting the text under study. This paper therefore proposes a language prediction model built on the LSTM variant of the RNN, allowing the model to better capture features of the preceding text and thereby improve intelligent text prediction.
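The abstract contrasts a plain RNN's short memory with the LSTM's gated cell state. As a minimal sketch (NumPy, toy dimensions; this is an illustration of the standard LSTM equations, not the authors' implementation), one time step shows how the forget, input, and output gates let the cell state carry context forward across steps:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Gates stacked in order: input, forget, cell candidate, output."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new content to admit
    f = sigmoid(z[H:2*H])      # forget gate: how much old context to keep
    g = np.tanh(z[2*H:3*H])    # candidate cell update
    o = sigmoid(z[3*H:4*H])    # output gate
    c = f * c_prev + i * g     # cell state carries long-range context
    h = o * np.tanh(c)         # hidden state fed to the next step / softmax
    return h, c

# Toy dimensions (hypothetical): input size D=8, hidden size H=4.
rng = np.random.default_rng(0)
D, H = 8, 4
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for t in range(5):             # run over a short input sequence
    x = rng.normal(size=D)
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (4,)
```

Because the cell state `c` is updated additively (gated by `f` and `i`) rather than repeatedly squashed through a nonlinearity, gradients and context survive over longer spans than in a vanilla RNN, which is the property the paper relies on for text prediction.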
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
RuoBing Duan, Jinjie Cheng, RuiLin Qiu, RuiJian Cai, Hang Zeng, and Feng Wang "Application research of RNN in text intelligent prediction scenario", Proc. SPIE 12719, Second International Conference on Electronic Information Technology (EIT 2023), 127191D (15 August 2023); https://doi.org/10.1117/12.2685792
KEYWORDS: Neural networks, Education and training, Associative arrays, Deep learning, Process modeling, Neurons, Data hiding
