The rapid development of quantum technology and the continuous improvement of quantum computing efficiency, especially the emergence of Shor's algorithm and Grover's algorithm, pose a serious threat to the security of traditional public-key and symmetric ciphers. The PFP block cipher, designed on the basis of the Feistel structure, was analyzed. First, the linear transformation P of the round function was fused into the periodic functions of the Feistel structure, yielding four 5-round periodic functions of PFP, two rounds more than the periodic functions of a general Feistel structure; this was verified through experiments. Furthermore, using the quantum Grover and Simon algorithms with a 5-round periodic function as the distinguisher, the security of 9- and 10-round PFP was evaluated by analyzing the characteristics of the PFP key schedule. The time complexities required for key recovery are 2^26 and 2^38.5, the quantum resources required are 193 and 212 qubits, and 58 and 77 bits of key can be recovered, respectively, which is superior to the existing impossible differential analysis results.
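To illustrate the kind of periodic-function distinguisher that such Simon-based attacks rely on, the following is a minimal classical sketch for a generic 3-round Feistel cipher (the well-known Kuwakado-Morii construction); it is not the paper's 5-round PFP function, and the 8-bit halves, constants and table-based round functions are illustrative assumptions. A quantum attacker would recover the period s with Simon's algorithm; here it is simply verified exhaustively.

```python
import random

# Toy parameters (illustrative, not PFP's real sizes): 8-bit halves,
# random round functions implemented as lookup tables.
random.seed(1)
N = 8
F = [[random.randrange(1 << N) for _ in range(1 << N)] for _ in range(3)]

def feistel3(left, right):
    """Three Feistel rounds: (L, R) -> (R, L xor F_i(R))."""
    for i in range(3):
        left, right = right, left ^ F[i][right]
    return left, right

# Kuwakado-Morii style periodic function: fix two distinct constants,
# place them in the right half, and peel the constant off the left
# output half, leaving f(b, x) = F2(x xor F1(a_b)).
a = [0x3A, 0xC5]

def f(b, x):
    l_out, _ = feistel3(x, a[b])
    return l_out ^ a[b]

# The hidden period is (1, F1(a0) xor F1(a1)); Simon's algorithm would
# find it with O(n) quantum queries, here we check it classically.
s = F[0][a[0]] ^ F[0][a[1]]
assert all(f(0, x) == f(1, x ^ s) for x in range(1 << N))
print(f"verified period (1, {s:#04x}) for all {1 << N} inputs")
```

The paper's contribution is to push this pattern two rounds further for PFP by absorbing the linear layer P into the periodic function.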
As Radio Frequency IDentification (RFID) technology and wireless sensors become increasingly common, the need to secure the data transmitted and processed by such resource-limited devices has driven the emergence and growth of lightweight ciphers. Since lightweight ciphers are characterized by small key sizes and limited numbers of encryption rounds, precise security evaluation is needed before they are put into service. The differential and linear characteristics of the full-round Shadow algorithm were analyzed in view of the security requirements of lightweight ciphers. Firstly, the concept of the second difference was proposed to describe the differential characteristic more clearly, the existence of a full-round differential characteristic with probability 1 in the algorithm was proved, and the correctness of the differential characteristic was verified through experiments. Secondly, a full-round linear characteristic was given: it was proved that, given a set of Shadow-32 (or Shadow-64) plaintext-ciphertext pairs, 8 (or 16) bits of key information can be obtained, and its correctness was verified experimentally. Thirdly, based on the linear relationship among plaintexts, ciphertexts and round keys, the numbers of equations and independent variables of the quadratic Boolean function were estimated, and the computational complexity of solving the initial key was then calculated to be 2^63.4. Finally, the structural features of the Shadow algorithm were summarized, and the focus of future research was outlined. Moreover, the differential and linear characteristic analysis of the full-round Shadow algorithm provides a reference for the differential and linear analysis of other lightweight ciphers.
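The experimental verification of a probability-1 differential characteristic can be sketched as a Monte Carlo test over random keys and plaintexts. The toy round below is purely linear (XOR and rotations), so every characteristic through it holds with probability 1; it is a stand-in chosen for clarity, not a reproduction of Shadow's real AND-RX round function.

```python
import random

MASK16 = 0xFFFF

def rotl16(x, r):
    return ((x << r) | (x >> (16 - r))) & MASK16

def round_fn(x, k):
    # Purely linear toy round (rotations and XOR), so differences
    # propagate deterministically and keys cancel out.
    return rotl16(x, 1) ^ rotl16(x, 5) ^ k

def encrypt(x, keys):
    for k in keys:
        x = round_fn(x, k)
    return x

def estimate_diff_prob(d_in, d_out, rounds, trials=10000):
    """Empirically estimate Pr[E(x) ^ E(x ^ d_in) == d_out]."""
    hits = 0
    for _ in range(trials):
        keys = [random.randrange(1 << 16) for _ in range(rounds)]
        x = random.randrange(1 << 16)
        hits += (encrypt(x, keys) ^ encrypt(x ^ d_in, keys)) == d_out
    return hits / trials

d_in = 0x0001
d_out = d_in
for _ in range(8):              # propagate the difference key-free
    d_out = round_fn(d_out, 0)  # linear layer only
print(estimate_diff_prob(d_in, d_out, rounds=8))   # prints 1.0
```

Shadow's nonlinear AND operations make such probability-1 propagation a genuine structural weakness rather than a triviality, which is what the second-difference argument in the paper establishes.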
To address the problem that formation constraints cause some narrow passages to be neglected in multi-UAV (Unmanned Aerial Vehicle) cooperative trajectory planning, a Fast Particle Swarm Optimization method based on Adaptive Distributed Model Predictive Control (ADMPC-FPSO) was proposed. In this method, a formation strategy combining the leader-follower method and the virtual structure method was used to construct adaptive virtual formation guidance points and accomplish the cooperative formation control task. Following the idea of model predictive control, combined with the distributed control method, the cooperative trajectory planning was transformed into a rolling online optimization problem, with the minimum distance and other performance indicators used as cost functions. A variable-weight fast particle swarm optimization algorithm, guided by a designed evaluation function criterion, was used to solve this problem. Simulation results show that the proposed algorithm can effectively realize multi-UAV cooperative trajectory planning, quickly complete adaptive formation transformation in response to environmental changes, and achieve lower cost than the traditional formation strategy.
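As a rough illustration of the variable-weight particle swarm component, the following is a minimal PSO with a linearly decreasing inertia weight applied to a toy guidance-point cost (distance to a goal plus an obstacle-clearance penalty). The cost function, bounds and coefficients are assumptions for illustration, not the paper's ADMPC-FPSO design; the model predictive scheme would re-solve a problem of this shape at every rolling-horizon step.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(p, goal=np.array([8.0, 5.0]), obstacle=np.array([4.0, 3.0])):
    """Toy cost for one guidance point: distance to goal plus an
    obstacle-clearance penalty (stand-in for the paper's indicators)."""
    return np.linalg.norm(p - goal) + 4.0 / (0.1 + np.linalg.norm(p - obstacle))

def pso(n_particles=30, dim=2, iters=100, w_max=0.9, w_min=0.4):
    pos = rng.uniform(0, 10, (n_particles, dim))
    vel = rng.uniform(-1, 1, (n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([cost(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters   # linearly decreasing weight
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

print(pso())
```

The decreasing inertia weight favors global exploration early and local refinement late, which is the usual motivation for variable-weight PSO variants.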
A big data benchmark is urgently needed by customers, industry and academia to evaluate big data systems, improve current techniques and develop new ones. A number of prominent works of the last several years were reviewed: their characteristics were introduced and their shortcomings analyzed. On this basis, some suggestions on building a new big data benchmark are provided, including: 1) component benchmarks as well as end-to-end benchmarks should be used in combination, to test individual tools inside the system and to test the system as a whole, with component benchmarks serving as ingredients of the whole big data benchmark suite; 2) workloads should be enriched with complex analytics, beyond SQL queries, to encompass different application requirements; 3) besides performance metrics (response time and throughput), other metrics should also be considered, including scalability, fault tolerance, energy saving and security.
To address the problem that the traditional wavelet, curvelet and contourlet transforms cannot provide an optimal sparse representation of an image and therefore fail to achieve a good enhancement effect, an image enhancement algorithm based on the Shearlet transform was proposed. The image was decomposed into low-frequency and high-frequency components by the Shearlet transform. Firstly, Multi-Scale Retinex (MSR) was used to enhance the low-frequency components of the Shearlet decomposition, removing the effect of illumination on the image; secondly, threshold denoising was applied to the high-frequency coefficients at each scale to suppress noise; finally, fuzzy contrast enhancement was applied to the reconstructed image to improve its overall contrast. The experimental results show that the proposed algorithm significantly improves the visual effect of the image, preserving more texture details and offering better noise resistance. The image definition, entropy and Peak Signal-to-Noise Ratio (PSNR) are improved to a certain extent compared with the Histogram Equalization (HE), MSR and fuzzy contrast enhancement in Non-Subsampled Contourlet domain (NSCT_fuzzy) algorithms, while the running time is reduced to about one half of that of MSR and one tenth of that of NSCT_fuzzy.
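The MSR and threshold-denoising steps can be sketched as follows. A simple Gaussian low-pass/residual split stands in for the Shearlet decomposition (Shearlet toolboxes are not assumed here), the fuzzy contrast enhancement stage is omitted, and all scales and thresholds are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def msr(low, sigmas=(15, 80, 250)):
    """Multi-Scale Retinex on the low-frequency band: average of
    log(image) - log(Gaussian-blurred image) over several scales."""
    low = low.astype(np.float64) + 1.0          # avoid log(0)
    out = np.zeros_like(low)
    for s in sigmas:
        out += np.log(low) - np.log(gaussian_filter(low, s) + 1.0)
    return out / len(sigmas)

def soft_threshold(coeffs, t):
    """Shrink high-frequency coefficients toward zero to suppress noise."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

img = np.random.rand(256, 256) * 255            # placeholder image
low = gaussian_filter(img, 2)                    # low-frequency band
high = img - low                                 # high-frequency residual

low_enh = msr(low)                               # illumination removal
low_enh = (low_enh - low_enh.min()) / (np.ptp(low_enh) + 1e-9) * 255
enhanced = low_enh + soft_threshold(high, t=5.0) # recombine the bands
```

In the paper's pipeline the recombination is the inverse Shearlet transform over all scales and directions rather than this single additive split.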
A Multi-Start Tabu Search (MSTS) algorithm was proposed for the maximum cut problem to improve solution quality. The proposed algorithm has two key components: a tabu search used to identify high-quality local optimal solutions, and a multi-start strategy used for global exploration. Firstly, a locally optimal solution was obtained by the tabu search component; secondly, a new starting solution was produced by the multi-start strategy, from which the tabu search procedure was restarted. Based on randomized greediness, the proposed multi-start strategy integrates constructive and perturbation methods to produce new starting solutions, thereby escaping local optima and finding higher-quality solutions. Experiments on 21 standard maximum cut benchmark instances and comparisons with several state-of-the-art algorithms show that MSTS obtained the best known solutions on 18 instances, more than any of the compared algorithms, indicating that the proposed algorithm outperforms the reference algorithms in terms of solution quality.
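A minimal sketch of tabu search with restarts for max cut is given below; it uses single-vertex flip moves with a fixed tabu tenure and an aspiration criterion, but simplifies the paper's randomized-greedy construction and perturbation to plain random restarts, and runs on a random graph rather than the standard benchmark instances.

```python
import random
random.seed(0)

# Random weighted graph (illustrative instance, not the benchmark set).
n = 60
edges = [(i, j, random.randint(1, 10))
         for i in range(n) for j in range(i + 1, n) if random.random() < 0.2]

def cut_value(sol):
    return sum(w for i, j, w in edges if sol[i] != sol[j])

def gains(sol):
    """gain[v] = change in cut value if vertex v switches sides."""
    g = [0] * n
    for i, j, w in edges:
        d = -w if sol[i] != sol[j] else w
        g[i] += d
        g[j] += d
    return g

def tabu_search(sol, iters=2000, tenure=10):
    best, best_val = sol[:], cut_value(sol)
    val, tabu = best_val, [0] * n
    for t in range(1, iters + 1):
        g = gains(sol)
        # best non-tabu flip, with aspiration when it beats the best so far
        cand = max(range(n),
                   key=lambda v: g[v] if tabu[v] < t or val + g[v] > best_val
                   else -10**9)
        sol[cand] ^= 1
        val += g[cand]
        tabu[cand] = t + tenure          # forbid flipping back for a while
        if val > best_val:
            best, best_val = sol[:], val
    return best, best_val

def multi_start(starts=10):
    best_val = -1
    for _ in range(starts):
        start = [random.randint(0, 1) for _ in range(n)]  # simplified restart
        _, val = tabu_search(start)
        best_val = max(best_val, val)
    return best_val

print(multi_start())
```

The paper's restart strategy biases new starting solutions toward good structure via randomized greediness instead of sampling uniformly, which is what lets it reach higher-quality cuts than independent random restarts.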