Pseudo random number generator based on LSTM and separable self-attention mechanism
Yilin DENG, Fajiang YU
Journal of Computer Applications    2025, 45 (9): 2893-2901.   DOI: 10.11772/j.issn.1001-9081.2024091345
Abstract

To address the poor quality and slow generation of pseudo random numbers produced by Generative Adversarial Networks (GANs), a model named LSA-WGAN-GP (Wasserstein GAN with Gradient Penalty based on LSTM and separable SA), built on Long Short-Term Memory (LSTM) and a separable Self-Attention (SA) mechanism, was proposed. In this model, the data were expanded from one-dimensional to two-dimensional space, improving the data representation and enabling the extraction of deeper features. An innovative LSA (LSTM and separable Self-Attention) module was introduced to integrate LSTM and the SA mechanism, significantly enhancing the irreversibility and unpredictability of the generated pseudo random numbers. Additionally, the network structure was simplified, effectively reducing the model's parameters and improving generation speed. Experimental results demonstrate that pseudo random numbers generated by LSA-WGAN-GP pass the National Institute of Standards and Technology (NIST) tests with a 100% pass rate; compared with WGAN-GP (Wasserstein GAN with Gradient Penalty) and GAN, LSA-WGAN-GP improves the P-values and pass rates of the frequency and universal test items, and generates pseudo random numbers 164% and 975% faster, respectively. The proposed model thus ensures the quality of the generated pseudo random numbers while reducing parameters and improving generation speed.
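The abstract gives only a high-level view of the LSA module. As a rough illustration of the idea of feeding LSTM features into a separable self-attention stage, the following PyTorch sketch assumes a MobileViTv2-style separable attention (linear in sequence length) and a residual connection; the class names, dimensions, and wiring are hypothetical, not the paper's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeparableSelfAttention(nn.Module):
    # Linear-complexity attention: one context score per token instead of an N x N map.
    def __init__(self, dim):
        super().__init__()
        self.to_scores = nn.Linear(dim, 1)   # one scalar context score per token
        self.to_key = nn.Linear(dim, dim)
        self.to_value = nn.Linear(dim, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                                  # x: (batch, tokens, dim)
        scores = F.softmax(self.to_scores(x), dim=1)       # (B, N, 1)
        context = (scores * self.to_key(x)).sum(1, keepdim=True)  # (B, 1, dim)
        return self.proj(F.relu(self.to_value(x)) * context)      # broadcast to all tokens

class LSABlock(nn.Module):
    # Hypothetical LSA block: LSTM features refined by separable self-attention.
    def __init__(self, dim, hidden):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.attn = SeparableSelfAttention(hidden)

    def forward(self, x):                                  # x: (B, N, dim)
        h, _ = self.lstm(x)                                # (B, N, hidden)
        return h + self.attn(h)                            # residual wiring is assumed

# Example: a batch of 8 sequences of 64 tokens with 16 features each
y = LSABlock(dim=16, hidden=32)(torch.randn(8, 64, 16))   # -> (8, 64, 32)
```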

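Training follows the WGAN-GP objective named in the title; for reference, this is a minimal sketch of the standard gradient penalty term (Gulrajani et al., 2017) that WGAN-GP adds to the critic loss. The function name and critic interface here are illustrative only.

```python
import torch

def gradient_penalty(critic, real, fake):
    # Penalize deviation of the critic's gradient norm from 1 on interpolated samples.
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    grad, = torch.autograd.grad(outputs=critic(interp).sum(), inputs=interp,
                                create_graph=True)
    return ((grad.flatten(1).norm(2, dim=1) - 1) ** 2).mean()

# Critic loss: fake_score.mean() - real_score.mean() + lambda_gp * gradient_penalty(...)
```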