To address the low accuracy and speed of manual and traditional automated methods for inspecting the weld seam surface of the traction seat, a lightweight weld seam quality detection algorithm named YOLOv5s-G2CW was proposed. Firstly, the GhostBottleneckV2 module was applied as a replacement for the C3 module in YOLOv5s to reduce the number of model parameters. Then, the CBAM (Convolutional Block Attention Module) was introduced into the Neck of the YOLOv5s model to fuse weld features along both the channel and spatial dimensions. In addition, the localization loss function of the YOLOv5s model was replaced with Wise-IoU, focusing the regression on ordinary-quality anchor boxes. Finally, the 13 × 13 feature layer used for detecting large objects in the YOLOv5s model was removed to further reduce the number of model parameters. Experimental results show that, compared with the YOLOv5s model, the YOLOv5s-G2CW model is 53.9% smaller in size, processes 8.0% more frames per second, and improves the mAP (mean Average Precision) by 0.8 percentage points. The model therefore meets the requirements for real-time, accurate detection of the traction seat weld seam surface.
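The following is a minimal PyTorch sketch of the CBAM attention block that the abstract describes inserting into the YOLOv5s Neck: channel attention from pooled descriptors followed by spatial attention from a 7 × 7 convolution. Class names, the reduction ratio, and kernel size are illustrative assumptions, not taken from the paper's code.

```python
# Illustrative CBAM sketch (not the authors' implementation).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        # Shared MLP applied to both the average- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        return torch.sigmoid(avg + mx)

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = torch.mean(x, dim=1, keepdim=True)    # channel-wise average map
        mx, _ = torch.max(x, dim=1, keepdim=True)   # channel-wise max map
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    """Applies channel attention, then spatial attention, to a feature map."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        x = x * self.ca(x)   # reweight channels
        x = x * self.sa(x)   # reweight spatial positions
        return x
```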
Enhancing the robustness of complex networks is crucial for resisting external attacks and cascading failures. Existing evolutionary algorithms have limitations in solving network structure optimization problems, especially in convergence and optimization speed. To address this challenge, a new adaptive complex network robustness optimization algorithm named SU-ANet (SUrrogate-assisted and Adaptive Network optimization algorithm) was proposed. To reduce the huge time overhead of robustness computation, a robustness predictor based on an attention mechanism was constructed in SU-ANet as an offline surrogate model to replace the frequent robustness evaluations in the local search operator. During evolution, both global and local information were considered to avoid falling into local optima while broadening the search space. Crossover operators were designed so that each individual exchanges edges with the candidate global optimum solution and with a randomly selected individual, balancing the convergence and diversity of the algorithm. Additionally, a parameter self-adaptation mechanism was applied to adjust the operator execution probabilities automatically, thereby alleviating the uncertainty introduced by manual parameter design. Experimental results on both synthetic and real-world networks demonstrate that SU-ANet has better search capability and higher evolutionary efficiency.
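To make the "huge time overhead" concrete, the sketch below computes the standard robustness measure R (mean fraction of nodes in the largest connected component as nodes are removed by a targeted, degree-based attack), which is the kind of exact evaluation a surrogate model is meant to replace. This is a generic illustration using NetworkX, not the paper's code; the attack strategy and graph generator are assumptions.

```python
# Exact robustness R under an adaptive highest-degree attack (illustrative).
import networkx as nx

def robustness_R(G: nx.Graph) -> float:
    n = G.number_of_nodes()
    H = G.copy()
    total = 0.0
    # Remove nodes one at a time, always targeting the current highest-degree node;
    # each step requires recomputing connected components, which is what makes
    # exact robustness evaluation expensive inside an evolutionary loop.
    for _ in range(n - 1):
        target = max(H.degree, key=lambda kv: kv[1])[0]
        H.remove_node(target)
        largest = max((len(c) for c in nx.connected_components(H)), default=0)
        total += largest / n
    return total / n

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(100, 3, seed=0)   # example synthetic network
    print(f"R = {robustness_R(G):.4f}")
```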
With the massive growth of data, how to store and use data has become a pressing issue in academic research and industrial applications. As one approach to these problems, instance selection reduces the difficulty of downstream tasks by selecting representative instances from the original data according to established rules. Therefore, a voting instance selection algorithm based on learning to hash was proposed. Firstly, the Principal Component Analysis (PCA) method was used to map high-dimensional data to a low-dimensional space. Secondly, the k-means algorithm was combined with vector quantization to perform iterative clustering, and the hash codes of the cluster centers were used to represent the data. Then, the data in each cluster were randomly sampled in proportion, and the final instances were selected by voting over several independent runs of the algorithm. Compared with the Condensed Nearest Neighbor (CNN) algorithm and LSH-IS-F (Instance Selection algorithm by Hashing with two passes), a linear-complexity instance selection algorithm for big data, the proposed algorithm improves the compression ratio by 19% on average. The idea of the proposed algorithm is simple and easy to implement, and the compression ratio can be controlled by adjusting the parameters. Experimental results on 7 datasets show that the proposed algorithm has clear advantages over random hashing in compression ratio and running time while achieving similar test accuracy.
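Below is a minimal scikit-learn sketch of the pipeline the abstract outlines: PCA projection, k-means quantization into coarse buckets, proportional sampling within each bucket, and voting across several independent runs. The function names, the keep ratio, the number of clusters, and the voting threshold are all illustrative assumptions, not the paper's settings.

```python
# Illustrative PCA + k-means + voting instance selection (assumed parameters).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def select_once(X, n_components=8, n_clusters=32, keep_ratio=0.2, seed=0):
    rng = np.random.default_rng(seed)
    Z = PCA(n_components=n_components, random_state=seed).fit_transform(X)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(Z)
    selected = np.zeros(len(X), dtype=bool)
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        k = max(1, int(keep_ratio * len(idx)))        # keep a fixed proportion per bucket
        selected[rng.choice(idx, size=k, replace=False)] = True
    return selected

def voting_instance_selection(X, runs=5, min_votes=3, **kwargs):
    votes = np.zeros(len(X), dtype=int)
    for r in range(runs):                             # independent runs differ only by seed
        votes += select_once(X, seed=r, **kwargs)
    return np.flatnonzero(votes >= min_votes)         # keep instances chosen often enough

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(1000, 30))
    kept = voting_instance_selection(X)
    print(f"kept {len(kept)} of {len(X)} instances")
```

Raising `keep_ratio` or lowering `min_votes` keeps more instances, which is how a voting scheme of this kind lets the compression ratio be tuned through its parameters.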