Journal of Computer Applications


Pseudo-random number generator based on LSTM and separable self-attention mechanism

DENG Yilin, YU Fajiang   

  1. Key Laboratory of Aerospace Information Security and Trusted Computing, Ministry of Education, School of Cyber Science and Engineering, Wuhan University

  • Received: 2024-09-20 Revised: 2024-11-16 Online: 2024-12-03 Published: 2024-12-03
  • About author: DENG Yilin, born in 2001, M.S. candidate. Her research interests include pseudo-random number generation, deep learning, and information security. YU Fajiang, born in 1980, Ph.D., associate professor, and CCF member. His research interests include system security and trusted computing.
  • Supported by:
    National Natural Science Foundation of China (61772384)

  • Corresponding author: YU Fajiang

Abstract: To address the poor quality and slow generation speed of pseudo-random numbers produced by Generative Adversarial Networks (GAN), a Wasserstein GAN with Gradient Penalty model based on Long Short-Term Memory (LSTM) and separable Self-Attention (LSA-WGAN-GP) was proposed. LSA-WGAN-GP expands the data representation from one-dimensional to two-dimensional space, enabling the extraction of deeper features. An innovative LSTM and separable Self-Attention (LSA) module was introduced, which integrates LSTM with a separable self-attention mechanism to significantly enhance the irreversibility and unpredictability of the generated pseudo-random numbers. In addition, the network structure was streamlined, effectively reducing the model's parameter count and improving generation speed. Experimental results demonstrate that pseudo-random numbers generated by LSA-WGAN-GP pass the National Institute of Standards and Technology (NIST) tests with a 100% success rate. Compared with Wasserstein GAN with Gradient Penalty (WGAN-GP) and GAN, LSA-WGAN-GP improves the P-values and pass rates of the frequency and universal tests, and generates pseudo-random numbers 164% and 975% faster, respectively. The proposed model thus preserves the quality of the generated pseudo-random numbers while reducing the parameter count and increasing generation speed.
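The separable self-attention the abstract refers to replaces the quadratic n × n token-to-token attention matrix with per-token scores against a single global context vector, which is what makes the module cheap enough to speed up generation. The paper does not give its exact formulation, so the following is only a minimal NumPy sketch of one common separable variant; all weight names (`W_i`, `W_k`, `W_v`, `W_o`) and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def separable_self_attention(X, W_i, W_k, W_v, W_o):
    """Linear-complexity separable self-attention sketch.

    X: (n, d) token sequence. A single learned projection W_i scores
    each token; the softmax-weighted sum of keys forms one global
    context vector that modulates the values, avoiding the n x n map.
    """
    scores = softmax(X @ W_i, axis=0)       # (n, 1) per-token weights
    context = (scores * (X @ W_k)).sum(0)   # (d,)  global context vector
    values = np.maximum(X @ W_v, 0.0)       # (n, d) ReLU-gated values
    return (values * context) @ W_o         # (n, d) output, O(n*d^2) cost

rng = np.random.default_rng(0)
n, d = 8, 16
X = rng.standard_normal((n, d))
W_i = rng.standard_normal((d, 1))
W_k, W_v, W_o = (rng.standard_normal((d, d)) for _ in range(3))
Y = separable_self_attention(X, W_i, W_k, W_v, W_o)
print(Y.shape)  # (8, 16)
```

In the LSA module described above, such a block would sit after an LSTM layer: the LSTM output sequence plays the role of `X`, and the attention output feeds the rest of the generator.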

Key words: pseudo-random number generation, Generative Adversarial Network (GAN), Long Short-Term Memory (LSTM), separable self-attention mechanism, deep learning 

