Journal of Computer Applications ›› 2025, Vol. 45 ›› Issue (9): 2893-2901.DOI: 10.11772/j.issn.1001-9081.2024091345

• Cyber security •

Pseudo random number generator based on LSTM and separable self-attention mechanism

Yilin DENG, Fajiang YU

  1. Key Laboratory of Aerospace Information Security and Trusted Computing, Ministry of Education, School of Cyber Science and Engineering, Wuhan University, Wuhan Hubei 430072, China
  • Received:2024-09-23 Revised:2024-11-19 Accepted:2024-11-22 Online:2024-12-03 Published:2025-09-10
  • Contact: Fajiang YU
  • About author:DENG Yilin, born in 2001, M. S. candidate. Her research interests include pseudo random number generation, deep learning, and information security.
  • Supported by:
    National Natural Science Foundation of China(61772384)


Abstract:

To address the poor quality and slow generation speed of pseudo random numbers produced by the Generative Adversarial Network (GAN), a model named LSA-WGAN-GP (Wasserstein GAN with Gradient Penalty based on LSTM and separable SA), built on the Long Short-Term Memory (LSTM) network and a separable Self-Attention (SA) mechanism, was proposed. In this model, the data were expanded from one-dimensional space to two-dimensional space to improve their representation, thereby enabling the extraction of deeper features. An innovative LSA (LSTM and separable Self-Attention) module was introduced to fuse the LSTM and SA mechanisms, which significantly enhances the irreversibility and unpredictability of the generated pseudo random numbers. In addition, the network structure was simplified, reducing the model's parameter count effectively and increasing the generation speed. Experimental results demonstrate that the pseudo random numbers generated by LSA-WGAN-GP pass the National Institute of Standards and Technology (NIST) tests with a 100% success rate; compared with WGAN-GP (Wasserstein GAN with Gradient Penalty) and GAN, LSA-WGAN-GP improves the P-values and pass rates of the frequency and universal test items; in terms of generation speed, LSA-WGAN-GP generates pseudo random numbers 164% and 975% faster than WGAN-GP and GAN, respectively. These results show that the proposed model maintains the quality of the generated pseudo random numbers while reducing the parameter count and increasing the generation speed.
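The abstract's appeal to a separable self-attention mechanism rests on its linear complexity: instead of the O(n²·d) pairwise score matrix of standard self-attention, a separable variant condenses the sequence into a single global context vector and broadcasts it back over the tokens in O(n·d). The sketch below illustrates that idea in NumPy; all weight names and shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def separable_self_attention(x, w_i, w_k, w_v, w_o):
    """Linear-complexity separable self-attention (illustrative sketch).

    x: (n, d) sequence of n tokens with d features.
    w_i: (d, 1) projection producing one context score per token.
    w_k, w_v, w_o: (d, d) key, value, and output projections.
    """
    scores = softmax(x @ w_i, axis=0)            # (n, 1): attention over tokens
    context = (scores * (x @ w_k)).sum(axis=0)   # (d,): single global context vector
    gated = np.maximum(x @ w_v, 0) * context     # (n, d): broadcast context onto values
    return gated @ w_o                           # (n, d)

# Toy usage with random weights.
rng = np.random.default_rng(0)
n, d = 16, 8
x = rng.standard_normal((n, d))
w_i = rng.standard_normal((d, 1))
w_k = rng.standard_normal((d, d))
w_v = rng.standard_normal((d, d))
w_o = rng.standard_normal((d, d))
y = separable_self_attention(x, w_i, w_k, w_v, w_o)
print(y.shape)  # (16, 8)
```

In the paper's LSA module this kind of attention is fused with an LSTM so that sequential dependencies and global context are captured together; the cost per token here stays linear in sequence length, which is consistent with the reported speedup over the quadratic-attention baselines.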

Key words: pseudo random number generation, Generative Adversarial Network (GAN), Long Short-Term Memory (LSTM) network, separable Self-Attention (SA) mechanism, Artificial Intelligence (AI)

