Orthodontic path planning based on improved particle swarm optimization algorithm
XU Xiaoqiang, QIN Pinle, ZENG Jianchao
Journal of Computer Applications    2020, 40 (7): 1938-1943.   DOI: 10.11772/j.issn.1001-9081.2019112055
Concerning the problem of tooth movement path planning in virtual orthodontic treatment systems, a path planning method based on a simplified mean particle swarm with normal distribution was proposed. Firstly, mathematical models of a single tooth and of the whole dentition were established, and, according to the characteristics of tooth movement, the orthodontic path planning problem was transformed into a constrained optimization problem. Secondly, building on the simplified particle swarm optimization algorithm, a Simplified Mean Particle Swarm Optimization based on the Normal distribution (NSMPSO) algorithm was proposed by introducing the ideas of normal distribution and mean particle swarm optimization. Finally, a safety-oriented fitness function was constructed from five aspects: translation path length, rotation angle, collision detection, single-stage tooth movement amount and single-stage rotation amount, so as to realize orthodontic movement path planning. NSMPSO was compared with the basic Particle Swarm Optimization (PSO) algorithm, the Mean Particle Swarm Optimization (MPSO) algorithm and the Simplified Mean Particle Swarm Optimization with Dynamic adjustment of inertia weight (DSMPSO) algorithm. Results show that on the Sphere, Griewank and Ackley benchmark functions, the improved algorithm becomes stable and converges within 50 iterations, with the fastest convergence speed and the highest convergence precision among the compared algorithms. Simulation experiments in Matlab verify that the optimal path obtained from the mathematical models and the improved algorithm is safe and reliable, and can provide assisted diagnosis for doctors.
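As an illustration of the velocity-free, mean-guided update that a simplified mean particle swarm with normal distribution typically uses, the following is a minimal sketch; the exact blend of personal and global bests and the perturbation width are assumptions, not the paper's formulation.

```python
import numpy as np

def nsmpso_step(X, pbest, gbest, fitness, w=0.4):
    """One update of a simplified mean particle swarm (hypothetical sketch).

    X      : (n, d) current particle positions
    pbest  : (n, d) personal best positions
    gbest  : (d,)   global best position
    fitness: callable mapping a position vector to a scalar (lower is better)
    """
    n, d = X.shape
    # Velocity-free update: each particle is redrawn from a normal distribution
    # centred on a blend of its personal best and the global best.
    centre = (pbest + gbest) / 2.0
    spread = w * np.abs(pbest - gbest) + 1e-12
    X_new = np.random.normal(loc=centre, scale=spread)

    # Greedy replacement of personal and global bests.
    for i in range(n):
        if fitness(X_new[i]) < fitness(pbest[i]):
            pbest[i] = X_new[i]
    gbest = min(pbest, key=fitness)
    return X_new, pbest, gbest
```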
Continuous respiratory volume monitoring system during sleep based on radio frequency identification tag array
XU Xiaoxiang, CHANG Xiangmao, CHEN Fangjin
Journal of Computer Applications    2020, 40 (5): 1534-1538.   DOI: 10.11772/j.issn.1001-9081.2019111971
Continuous and accurate respiratory volume monitoring during sleep helps to infer the user's sleep stage and provides clues about some chronic diseases. Existing work mainly focuses on detecting and monitoring respiratory rate and lacks a means of continuously monitoring respiratory volume. Therefore, a system named RF-SLEEP, which uses commercial Radio Frequency IDentification (RFID) tags to wirelessly sense respiratory volume during sleep, was proposed. The phase values and timestamps returned by a tag array attached to the chest surface were collected continuously by RF-SLEEP through the reader, the displacements of different points on the chest caused by breathing were calculated, and a model of the relationship between these displacements and the respiratory volume was constructed by General Regression Neural Network (GRNN), so as to estimate the user's respiratory volume during sleep. The errors in chest-displacement calculation caused by the user's body rolling over during sleep were eliminated by attaching a pair of reference tags to the user's shoulders. The experimental results show that the average accuracy of RF-SLEEP in continuously monitoring respiratory volume during sleep is 92.49% across different users.
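The displacement of a chest-mounted tag is commonly recovered from the unwrapped backscatter phase via delta_d = delta_phi * lambda / (4*pi); a small sketch of that step follows, with the carrier frequency chosen as a typical UHF channel (an assumption, as is the simple phase model itself).

```python
import numpy as np

SPEED_OF_LIGHT = 3e8

def displacement_from_phase(phase_readings, freq_hz=920.625e6):
    """Chest displacement series (metres) from RFID phase readings of one tag.

    Uses the standard backscatter model phase = (4*pi*d/lambda) mod 2*pi,
    so a phase change delta_phi maps to delta_d = delta_phi * lambda / (4*pi).
    """
    wavelength = SPEED_OF_LIGHT / freq_hz
    unwrapped = np.unwrap(np.asarray(phase_readings, dtype=float))
    return (unwrapped - unwrapped[0]) * wavelength / (4 * np.pi)
```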
Image description generation method based on multi-spatial mixed attention
LIN Xianzao, LIU Jun, TIAN Sheng, XU Xiaokang, JIANG Tao
Journal of Computer Applications    2020, 40 (4): 985-989.   DOI: 10.11772/j.issn.1001-9081.2019091569
Concerning the lack of automatic information generation in offshore ship monitoring systems, and aiming to build an intelligent ship monitoring system, an image description generation method based on multi-spatial mixed attention was proposed to describe offshore ship images. The image description generation task requires the computer to describe the content of an image in linguistically well-formed sentences. Firstly, the multi-spatial mixed attention model was trained on the encoded features of the regions of interest in the image; then the pre-trained decoding model was fine-tuned by reconstructing the loss function with a gradient policy, yielding the final model. Experimental results on the MSCOCO (MicroSoft Common Objects in COntext) image description dataset show that the proposed model outperforms previous attention models on image description generation metrics such as the CIDEr score. On a self-constructed ship description dataset, the main content of ship images can be automatically described by the model, demonstrating that the method can provide data support for automatic information generation.
Target tracking algorithm based on kernelized correlation filter with block-based model
XU Xiaochao, YAN Hua
Journal of Computer Applications    2020, 40 (3): 683-688.   DOI: 10.11772/j.issn.1001-9081.2019071173
To reduce the influence of factors such as illumination variation, scale variation and partial occlusion in target tracking, a target tracking algorithm based on Kernelized Correlation Filter (KCF) with a block-based model was proposed. Firstly, the Histogram of Oriented Gradients (HOG) feature and the Color Names (CN) feature were combined to better characterize the target. Secondly, a scale pyramid was adopted to estimate the target scale. Finally, the peak-to-sidelobe ratio of the feature response map was used to detect occlusion, and the partial occlusion problem was addressed by introducing a high-confidence block relocation module and a dynamic strategy for adaptive model updating. To verify the effectiveness of the proposed algorithm, comparative experiments with several mainstream algorithms were conducted on various datasets. Experimental results show that the proposed algorithm achieves the highest precision and success rate, which are 11.89% and 15.24% higher, respectively, than those of the KCF algorithm, indicating stronger robustness to illumination variation, scale variation and partial occlusion.
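A sketch of the peak-to-sidelobe ratio commonly used for occlusion detection on a correlation response map; the size of the window excluded around the peak is an assumption.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """PSR of a correlation response map; common formulation, window size assumed."""
    resp = response.astype(float)
    peak = resp.max()
    py, px = np.unravel_index(resp.argmax(), resp.shape)
    sidelobe = resp.copy()
    # Mask out a window around the peak so only the sidelobe region remains.
    sidelobe[max(0, py - exclude):py + exclude + 1,
             max(0, px - exclude):px + exclude + 1] = np.nan
    return (peak - np.nanmean(sidelobe)) / (np.nanstd(sidelobe) + 1e-12)
```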
Path planning of mobile robot based on improved artificial potential field method
XU Xiaoqiang, WANG Mingyong, MAO Yan
Journal of Computer Applications    2020, 40 (12): 3508-3512.   DOI: 10.11772/j.issn.1001-9081.2020050640
Aiming at the problem that the traditional artificial potential field method easily falls into trap areas and local minima during path planning, an improved artificial potential field method was proposed. Firstly, the concept of safe distance was introduced to avoid unnecessary paths, thereby reducing path length and algorithm running time. Then, a predictive distance was introduced so that the algorithm could react before the robot became trapped in a local minimum or trap area. Finally, the robot was guided out of local minima and trap areas by reasonably setting virtual target points. The experimental results show that the improved algorithm can effectively solve the problem that the traditional algorithm easily falls into local minima and trap areas; meanwhile, compared with the traditional artificial potential field method, the path length planned by the proposed algorithm is reduced by 5.2% and its speed is increased by 405.56%.
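For context, a minimal sketch of the classical attractive/repulsive potential-field force that the improvement builds on; the gains k_att, k_rep and the influence radius rho0 are illustrative values, and the paper's safe and predictive distances are not reproduced here.

```python
import numpy as np

def apf_force(q, q_goal, obstacles, k_att=1.0, k_rep=100.0, rho0=2.0):
    """Classical artificial potential field force at position q (2D numpy arrays)."""
    # Attractive force pulls the robot toward the goal.
    f = k_att * (q_goal - q)
    for q_obs in obstacles:
        rho = np.linalg.norm(q - q_obs)          # distance to this obstacle
        if 0.0 < rho < rho0:                     # repulsion acts only inside rho0
            f += k_rep * (1.0 / rho - 1.0 / rho0) / rho**2 * (q - q_obs) / rho
    return f
```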
Learning monkey algorithm based on Lagrange interpolation to solve discounted {0-1} knapsack problem
XU Xiaoping, XU Li, WANG Feng, LIU Long
Journal of Computer Applications    2020, 40 (11): 3113-3118.   DOI: 10.11772/j.issn.1001-9081.2020040482
The purpose of the Discounted {0-1} Knapsack Problem (D{0-1}KP) is to maximize the sum of the value coefficients of all items loaded into the knapsack without exceeding its weight limit. To address the low accuracy of existing algorithms on large-scale, high-complexity instances of D{0-1}KP, a Lagrange interpolation based learning monkey algorithm (LSTMA) was proposed. Firstly, the length of the visual field was redefined in the look process of the basic monkey algorithm. Then, the best individual in the population was introduced as a second pivot point and the search mechanism of the jump process was adjusted accordingly. Finally, a Lagrange interpolation operation was introduced after the jump process to improve the search performance of the algorithm. Simulation results on four types of instances show that LSTMA solves D{0-1}KP with higher accuracy than the comparison algorithms and has good robustness.
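The Lagrange interpolation step can be pictured as fitting a quadratic through three sampled solutions and jumping to its extremum; a one-dimensional sketch follows (applying it along one coordinate of the knapsack encoding is an assumption, not the paper's exact operator).

```python
def lagrange_refine(x, f):
    """Abscissa of the extremum of the quadratic Lagrange interpolant.

    x: three distinct positions along one dimension; f: their fitness values.
    The returned point can serve as a candidate solution for the next step.
    """
    x0, x1, x2 = x
    f0, f1, f2 = f
    num = (x1**2 - x2**2) * f0 + (x2**2 - x0**2) * f1 + (x0**2 - x1**2) * f2
    den = 2.0 * ((x1 - x2) * f0 + (x2 - x0) * f1 + (x0 - x1) * f2)
    return num / den if den != 0 else x1
```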
Computation offloading method for workflow management in mobile edge computing
FU Shucun, FU Zhangjie, XING Guowen, LIU Qingxiang, XU Xiaolong
Journal of Computer Applications    2019, 39 (5): 1523-1527.   DOI: 10.11772/j.issn.1001-9081.2018081753
The problem of high energy consumption of mobile devices in mobile edge computing is becoming increasingly prominent. In order to reduce the energy consumption of mobile devices, an Energy-aware computation Offloading method for Workflows (EOW) was proposed. Technically, the average waiting time of computing tasks on edge devices was analyzed based on queuing theory, and time consumption and energy consumption models of mobile devices were established. Then a corresponding computation offloading method based on NSGA-Ⅲ (Non-dominated Sorting Genetic Algorithm Ⅲ) was designed to offload computing tasks reasonably: some tasks were processed locally on the mobile devices, while others were offloaded to the edge computing platform or the remote cloud, achieving the goal of saving energy for all the mobile devices. Finally, comparison experiments were conducted on the CloudSim platform. The experimental results show that EOW can effectively reduce the energy consumption of all the mobile devices while satisfying the deadlines of all the workflows.
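Under the simplest queuing assumption (an M/M/1 queue per edge device, which the abstract does not specify), the average waiting time and a transmission-energy term could be sketched as follows; arrival and service rates and the radio parameters are placeholders.

```python
def mm1_average_wait(arrival_rate, service_rate):
    """Average queueing delay W_q = lambda / (mu * (mu - lambda)) for an M/M/1 queue."""
    assert arrival_rate < service_rate, "queue must be stable"
    return arrival_rate / (service_rate * (service_rate - arrival_rate))

def offload_energy(data_bits, tx_power_watt, bandwidth_bps):
    """Transmission energy for offloading a task: power multiplied by transfer time."""
    return tx_power_watt * data_bits / bandwidth_bps
```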
New 3D scene modeling language and environment based on BNF paradigm
XU Xiaodan, LI Bingjie, LI Bosen, LYU Shun
Journal of Computer Applications    2018, 38 (9): 2666-2672.   DOI: 10.11772/j.issn.1001-9081.2018030552
Due to the high degree of business coupling and the insufficient ability of existing Three-Dimensional (3D) scene modeling models to describe object attributes and the characteristics of complex scenes, a new scene modeling language and environment based on BNF (Backus-Naur Form) was proposed to solve the problem of modeling 3D virtual sacrifice scenes. Firstly, the concepts of scene object, scene object template and scene object template attribute were introduced to analyze in detail the constitutional features of 3D virtual sacrifice scenes. Secondly, a 3D scene modeling language with loose coupling, strong attribute description capability and flexible generality was proposed. Then, the operations of the scene modeling language were designed, so that the language could be edited through Application Programming Interface (API) calls and supported interface-based modeling. Finally, a set of Extensible Markup Language (XML) mapping methods was defined for the language, which allowed scene modeling results to be stored in XML text format, improved the reusability of modeling results, and demonstrated the application of the modeling. The application results show that the method enhances the support of new data type features, improves the description of sequence attributes and structured attribute types, and improves the description capability, versatility and flexibility for complex scenes. The proposed method outperforms the method of SHU et al. (SHU B, QIU X J, WANG Z Q. Survey of shape from image. Journal of Computer Research and Development, 2010, 47(3): 549-560) and solves the problem of 3D virtual sacrifice scene modeling. It is also suitable for modeling 3D scenes with low granularity, multiple attribute components and high coupling degree, and can improve modeling efficiency.
New aesthetic QR code algorithm based on region of interest and RS code
XU Xiaoyu, LU Jianfeng, LI Li, ZHANG Shanqing
Journal of Computer Applications    2018, 38 (8): 2405-2410.   DOI: 10.11772/j.issn.1001-9081.2018020317
The existing aesthetic QR code algorithms do not consider the region of interest of the background image, which affects the beautification effect. A new aesthetic QR code algorithm based on region of interest and the RS code mechanism was proposed. Firstly, an improved multi-feature region of interest detection algorithm was proposed to obtain the salient binary image of the background image. Secondly, an intermediate QR code that is completely consistent with the salient binary image of the background image was obtained by performing an XOR operation on the original RS code using the RS coding matrix. Then the background image and the intermediate QR code were fused according to a specific fusion method. To further expand the aesthetic area, the RS error correction mechanism was applied to the fused image, producing the final aesthetic QR code. Experimental results on the established test sample set show that the proposed algorithm achieves complete background replacement, preserves more image information, and has a better visual effect and a higher decoding rate.
Fine-grained image classification method based on multi-feature combination
ZOU Chengming, LUO Ying, XU Xiaolong
Journal of Computer Applications    2018, 38 (7): 1853-1856.   DOI: 10.11772/j.issn.1001-9081.2017122920
As the limitation of single-feature representation may lead to low accuracy in fine-grained image classification, a multi-feature combination representation method based on Convolutional Neural Network (CNN) and Scale Invariant Feature Transform (SIFT) was proposed. Features were extracted comprehensively from the entire target, its key parts and its key points. Firstly, two CNN models were trained with the target-entirety regions and the head-only regions in the fine-grained image library respectively, and used to extract target-entirety and head-only CNN features. Secondly, SIFT key points were extracted from all target-entirety regions in the image library, and a codebook was generated through K-means clustering. Then, the SIFT descriptors of each target-entirety region were encoded into a feature vector using the Vector of Locally Aggregated Descriptors (VLAD) along with the codebook. Finally, a Support Vector Machine (SVM) was used to classify the fine-grained images with the combination of multiple features. The method was evaluated on the CUB-200-2011 dataset and compared with single-feature representation methods. The experimental results show that the proposed method improves the classification accuracy by 13.31% compared with single CNN feature representation, which proves the positive effect of multi-feature combination on fine-grained image classification.
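A compact sketch of VLAD encoding of SIFT descriptors against a k-means codebook; the power and L2 normalisation steps are common practice rather than details taken from the paper.

```python
import numpy as np

def vlad_encode(descriptors, codebook):
    """VLAD vector for the local descriptors of one image.

    descriptors: (n, d) SIFT descriptors
    codebook   : (k, d) cluster centres obtained from k-means
    """
    k, d = codebook.shape
    # Assign each descriptor to its nearest centre.
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    vlad = np.zeros((k, d))
    for i in range(k):
        if np.any(assign == i):
            # Accumulate residuals between descriptors and their centre.
            vlad[i] = (descriptors[assign == i] - codebook[i]).sum(axis=0)
    vlad = vlad.ravel()
    vlad = np.sign(vlad) * np.sqrt(np.abs(vlad))       # power normalisation
    return vlad / (np.linalg.norm(vlad) + 1e-12)       # L2 normalisation
```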
Resident behavior model analysis method based on multi-source travel data
XU Xiaowei, DU Yi, ZHOU Yuanchun
Journal of Computer Applications    2017, 37 (8): 2362-2367.   DOI: 10.11772/j.issn.1001-9081.2017.08.2362
The mining and analysis of smart transit card data can provide strong support for urban traffic construction and urban management. However, most existing research data cover only bus or subway trips, and most studies focus on macro-level travel patterns. In view of this problem, taking the transit card data of one city as an example, which contain the daily multi-source travel records of urban residents on bus, subway and taxi, the concept of a tour chain was put forward to model resident behavior. On this basis, periodic travel characteristics of different dimensions were defined. Then a spatial periodic feature extraction method based on the longest common subsequence was proposed, and the travel rules of urban residents were analyzed by clustering. Finally, the effectiveness of the method was verified by five evaluation indexes defined on the rules; applying the spatial periodic feature extraction method improved the clustering result by 6.8%, which is helpful for discovering the behavior patterns of residents.
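The spatial periodic feature rests on the longest common subsequence of day-to-day travel sequences; a standard dynamic-programming sketch is shown below, with hypothetical stop identifiers and a simple length-ratio similarity as an assumption.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two trip sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

# Example: similarity between two days' travel chains (stop IDs are hypothetical).
day1 = ["S12", "S07", "S33", "S12"]
day2 = ["S12", "S33", "S12"]
similarity = lcs_length(day1, day2) / max(len(day1), len(day2))
```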
Optimization of ordered charging strategy for large scale electric vehicles based on quadratic clustering
ZHANG Jie, YANG Chunyu, JU Fei, XU Xiaolong
Journal of Computer Applications    2017, 37 (10): 2978-2982.   DOI: 10.11772/j.issn.1001-9081.2017.10.2978
Aiming at the problem of unbalanced utilization of charging stations caused by disordered charging of a large number of electric vehicles, an orderly charging strategy for electric vehicles was proposed. Firstly, the locations of the electric vehicles' charging demands were clustered: hierarchical clustering followed by a second, K-means-based partitioning was used to group electric vehicles with similar properties. Furthermore, the optimized path to each charging station was determined by the Dijkstra algorithm; taking the even distribution of electric vehicles and the shortest charging distance as objective functions, a charging scheduling model based on electric vehicle clustering was constructed and solved with a genetic algorithm. The simulation results show that, compared with a charging scheduling strategy without clustering of electric vehicles, the computation time of the proposed method is reduced by more than half for large-scale vehicle fleets, so it has higher practicability.
Birkhoff interpolation-based verifiable hierarchical threshold secret sharing algorithm
XU Xiaojie, WANG Lisheng
Journal of Computer Applications    2016, 36 (4): 952-955.   DOI: 10.11772/j.issn.1001-9081.2016.04.0952
A Distributed Key Generation (DKG) protocol is a central component of distributed cryptosystems; it allows a group of participants to jointly generate a private key and a public key, while only authorized subsets of participants are able to reconstruct the private key. However, existing DKG-based schemes assume that all participants have equal authority. Therefore, a Birkhoff Interpolation-based Verifiable Hierarchical Threshold Secret Sharing (BI-VHTSS) algorithm was proposed. Considering the DKG problem, authorized subsets in the BI-VHTSS algorithm are defined by a hierarchical threshold access structure. The correctness and security of the proposed algorithm were proved on the basis of the intractability of the Discrete Logarithm Problem (DLP) and Birkhoff interpolation.
Cooperative behavior based on evolutionary game in delay tolerant networks
XU Xiaoqiong, ZHOU Zhaorong, MA Xiaoxia, YANG Liu
Journal of Computer Applications    2016, 36 (2): 483-487.   DOI: 10.11772/j.issn.1001-9081.2016.02.0483
Due to limited resources, nodes in Delay Tolerant Networks (DTN) behave selfishly, i.e., they refuse to help forward messages for others. In order to promote cooperative behavior among nodes and enhance overall network performance, a new incentive mechanism for node behavior based on Evolutionary Game Theory (EGT) was proposed. In the proposed mechanism, the prisoner's dilemma model was employed to establish the payoff matrix between a node and its neighbors. Then, the social authority of a node was defined based on its degree centrality and taken into account when designing the strategy update rule: nodes with higher social authority in the current neighborhood were selected for imitation and learning. Finally, simulation experiments were conducted with the Opportunistic Network Environment (ONE) simulator on real dynamic network topologies. The simulation results show that, compared with the Fermi update rule, which chooses neighbors randomly, the strategy update rule that considers social authority promotes cooperative behavior and accordingly improves the overall performance of the network.
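For reference, the classical Fermi imitation probability used by the baseline rule, next to an authority-based neighbour selection in the spirit of the proposed mechanism; the noise parameter K and the use of degree centrality as "authority" are assumptions.

```python
import math

def fermi_probability(payoff_self, payoff_neighbor, K=0.1):
    """Probability of copying a neighbour's strategy under the Fermi rule."""
    return 1.0 / (1.0 + math.exp((payoff_self - payoff_neighbor) / K))

def choose_model_neighbor(neighbors, authority):
    """Authority-based selection: imitate the neighbour with the highest social
    authority (here taken as degree centrality) instead of a random neighbour."""
    return max(neighbors, key=lambda v: authority[v])
```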
Distributed power iteration clustering based on GraphX
ZHAO Jun, XU Xiaoyan
Journal of Computer Applications    2016, 36 (10): 2710-2714.   DOI: 10.11772/j.issn.1001-9081.2016.10.2710
Concerning the cumbersome programming and low efficiency of parallel power iteration clustering algorithms, a new method for power iteration clustering in a distributed environment was put forward based on Spark, a general computational engine for large-scale data processing, and its graph-processing component GraphX. Firstly, the raw data were transformed into an affinity matrix, which can be viewed as a graph, using a chosen similarity measurement method. Secondly, using vertex-cut technology, the row-normalized affinity matrix was divided into a number of subgraphs, which were stored on different machines of a cluster. Finally, using the in-memory computational framework Spark, several iterations were performed on the subgraphs stored in the cluster to obtain a cut of the original graph, with each subgraph of the original graph corresponding to a cluster. Experiments were carried out on datasets of different sizes with different numbers of executors. Experimental results show that the proposed distributed power iteration clustering algorithm has good scalability, its running time is negatively correlated with the number of executors, and its speedup ranges from 2.09 to 3.77 in a cluster of 6 executors compared with a single executor. Meanwhile, compared with the Hadoop-based power iteration clustering version, the running time of the proposed algorithm decreased significantly, by 61%, when dealing with 40 000 news items.
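A single-machine sketch of the power iteration clustering step that the GraphX implementation distributes: repeated multiplication by the row-normalised affinity matrix, followed by k-means on the resulting pseudo-eigenvector (the stopping rule and cluster count are simplified assumptions).

```python
import numpy as np
from sklearn.cluster import KMeans

def power_iteration_clustering(A, k, iters=50):
    """Power iteration clustering on an affinity matrix A (n x n, non-negative)."""
    W = A / A.sum(axis=1, keepdims=True)        # row-normalised affinity matrix
    v = np.random.rand(A.shape[0])
    v /= np.abs(v).sum()
    for _ in range(iters):
        v = W @ v
        v /= np.abs(v).sum()                    # keep the vector normalised
    # Cluster the (near-)converged pseudo-eigenvector with k-means.
    return KMeans(n_clusters=k, n_init=10).fit_predict(v.reshape(-1, 1))
```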
Energy-aware fairness enhanced resource scheduling method in cloud environment
XUE Shengjun, QIU Shuang, XU Xiaolong
Journal of Computer Applications    2016, 36 (10): 2692-2697.   DOI: 10.11772/j.issn.1001-9081.2016.10.2692
To address the problems of high energy consumption and illegal occupation of computing resources by users in cloud environments, a new algorithm named Fair and Green Resource Scheduling Algorithm (FGRSA) was proposed to save resources and enhance the fairness of the system, so that all users can use the resources in the data center reasonably. With the proposed method, various types of resources can be scheduled so that all of them are utilized with relative fairness. Simulation experiments on the proposed scheduling strategy were conducted on CloudSim. Experimental results show that, compared with the Greedy and Round Robin algorithms, FGRSA can significantly reduce energy consumption while ensuring fair use of all types of resources.
Fairness-optimized resource allocation method in cloud environment
XUE Shengjun, HU Minda, XU Xiaolong
Journal of Computer Applications    2016, 36 (10): 2686-2691.   DOI: 10.11772/j.issn.1001-9081.2016.10.2686
Concerning the problems of uneven distribution, low efficiency and mismatch in resource allocation, a new algorithm named Global Dominant Resource Fair (GDRF) allocation, which performs allocation in several rounds, was proposed to meet the needs of different users, achieve fairness over multiple types of resources, and obtain high resource utilization. First, a qualification queue was determined by the amount of resources already allocated to each user; then the user to receive the next allocation was determined through the global dominant resource share and the global dominant resource weight. The matching condition of resources was taken into account in the allocation process, and progressive filling following a max-min strategy was used. In addition, a universal fairness evaluation model for multi-resource allocation was applied to the algorithm. Comparison experiments were conducted based on a Google cluster trace. Experimental results show that, compared with maximizing multi-resource fairness based on the dominant resource, the number of allocated virtual machines is increased by 12%, the resource utilization is increased by 0.5 percentage points, and the fairness evaluation value is increased by about 15%. The proposed algorithm adapts well to combined resource allocation, allowing the supply to better match users' demand.
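A sketch of the dominant-resource share that underlies this kind of fairness ordering, together with a progressive-filling selection step; the "global" weighting described in the abstract is not reproduced, and the resource names are placeholders.

```python
def dominant_share(allocated, capacity):
    """Dominant resource share of one user.

    allocated: dict resource -> amount allocated to the user, e.g. {"cpu": 4, "mem": 8}
    capacity : dict resource -> total amount available in the cluster
    """
    return max(allocated[r] / capacity[r] for r in capacity)

def pick_next_user(users, allocations, capacity):
    """Progressive filling: serve the user with the smallest dominant share next."""
    return min(users, key=lambda u: dominant_share(allocations[u], capacity))
```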
Design and implementation of context driven SoftMan knowledge communication framework
WU Danfeng, XU Xiaowei, WANG Kang
Journal of Computer Applications    2015, 35 (1): 131-135.   DOI: 10.11772/j.issn.1001-9081.2015.01.0131
This paper focused on the traditional message-based SoftMan communication approach, which has problems in expression ability, communication efficiency and quality. Based on earlier research on the SoftMan system and its communication theory, as well as the SoftMan cogmatics model and context awareness mechanism, the Context driven SoftMan Knowledge Communication (CSMKC) framework was proposed, drawing on mature Agent communication language specifications. First, the message layer, knowledge layer and scenario layer of the knowledge communication framework were designed; second, the key points of context-driven SoftMan knowledge communication were introduced from the three aspects of implementing the message layer, the knowledge layer and the scenario layer; finally, knowledge-level communication between different SoftMen and the maintenance of scenario context were basically realized. The experimental results show that when later content depends heavily on the communication scenario, the communication overhead per unit time of CSMKC is reduced by 46.15% on average compared with the traditional message-based SoftMan communication approach. Thus, the higher the dependence on the scenario, the more obvious the advantage of CSMKC in reducing communication while accomplishing a task in the system.

Adaptive non-local denoising of magnetic resonance images based on normalized cross correlation
SHI Li, XU Xiaohui, CHEN Liwei
Journal of Computer Applications    2014, 34 (12): 3609-3613.  
In order to sufficiently remove Rician noise from Magnetic Resonance (MR) images, the Normalized Cross Correlation (NCC) of local pixels was proposed to characterize geometric structure similarity and was combined with the traditional approach of determining similarity weights from pixel intensity alone. The improved measure was then applied to the non-local means algorithm and the Non-Local Linear Minimum Mean Square Error (NLMMSE) estimation algorithm respectively. To realize adaptive denoising, the weight of the pixel to be filtered or the similarity threshold in the non-local algorithms was computed dynamically according to the local Signal-to-Noise Ratio (SNR). The experimental results show that the proposed algorithm not only better suppresses Rician noise in MR images but also effectively preserves image details, so it has practical value for further MR image analysis and clinical diagnosis.
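A sketch of the normalized cross correlation between two patches that supplies the structural-similarity term; how it is mixed with the intensity-based weight is only indicated in a comment and is an assumption.

```python
import numpy as np

def ncc(patch_a, patch_b, eps=1e-12):
    """Normalized cross correlation between two equally sized image patches."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    return float((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

# A combined similarity weight could then mix intensity distance and NCC,
# e.g. w = exp(-d_intensity / h) * max(ncc_value, 0.0)   (weighting scheme assumed).
```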

Implementation of calibration for machine vision electronic whiteboard
XU Xiao, WANG Run, PENG Guojie, YANG Qi, WANG Yiwen, LI Hui
Journal of Computer Applications    2014, 34 (1): 139-141.   DOI: 10.11772/j.issn.1001-9081.2014.01.0139
A partitioned calibration approach was applied to a machine-vision-based electronic whiteboard, since its location error distribution on large screens is non-homogeneous. Based on a Human Interface Device (HID) implementation, dedicated computer software was developed and communication between the computer and the electronic whiteboard was established. Configuration of calibration points on the whiteboard, reception of the coordinates of these points, and calculation of the calibration coefficients were completed, thus implementing calibration of the whole system. The experimental results indicate that, after calibration, the location accuracy is about 1.2 mm on average on an electronic whiteboard of size 140 cm × 105 cm, and basic touch operations are performed accurately on the calibrated electronic whiteboard prototype.
Railway freight volume prediction based on grey neural network with improved particle swarm optimization
LEI Bin, TAO Hai-long, XU Xiao-guang
Journal of Computer Applications    2012, 32 (10): 2948-2951.   DOI: 10.3724/SP.J.1087.2012.02948
Concerning the shortcomings of existing methods for forecasting railway freight volume, a Grey Neural Network (GNN) based on an Improved Particle Swarm Optimization algorithm (IPSO-GNN) was proposed. To make up for the deficiencies of the conventional GNN and guarantee prediction accuracy, the whitening parameters of the GNN were optimized through the IPSO, and the correlation degrees between railway freight volume and its influencing factors were computed. A railway freight volume prediction model based on IPSO-GNN was then built with six related factors. The simulation results show that the prediction method is effective and feasible, and that the prediction precision of the model for railway freight volume is better than that of the conventional GNN and other prediction methods.
Modified self-organizing map network for Euclidean travelling salesman problem
ZHOU Xiao-meng, XU Xiao-ming
Journal of Computer Applications    2012, 32 (07): 1962-1964.   DOI: 10.3724/SP.J.1087.2012.01962
The Self-Organizing Map (SOM) was modified in this paper: the number of neurons does not change over time, and during the training phase the neurons collectively maintain their mean equal to the mean of the data points. After training, every city is associated with the label of a neuron, so two or more cities may share the same neuron. To avoid this, a dot-label index was adopted instead of the integer index; the virtue of this scheme is that different cities receive different indices, and the labels then determine the order of the cities in the tour. The algorithm was applied to problems taken from the Traveling Salesman Problem Library (TSPLIB). The experimental results show that the proposed algorithm is feasible and effective.
Robustness analysis of unstructured P2P botnet
XU Xiao-dong, CHENG Jian-guo, ZHU Shi-rui
Journal of Computer Applications    2011, 31 (12): 3343-3345.  
The constant improvement of botnet structures poses a great threat to network security, so studying the inherent characteristics of botnet structures is very important for defending against this kind of attack. This paper simulated an unstructured P2P botnet from the perspective of complex networks, then proposed metrics and applied centrality theory from complex networks to analyze the robustness of the unstructured P2P botnet under node failures. The experimental results demonstrate that the unstructured P2P botnet is highly robust to random node failures, but its robustness drops quickly when central nodes fail.
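A sketch of the kind of robustness experiment described, using networkx and a random graph as a stand-in for the unstructured P2P overlay: remove a fraction of nodes either at random or by degree centrality and compare the surviving giant component (the paper's exact metrics and topology generator are not reproduced).

```python
import random
import networkx as nx

def giant_component_fraction(G):
    """Fraction of surviving nodes that lie in the largest connected component."""
    if G.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()

def attack(G, fraction, targeted=False):
    """Remove a fraction of nodes, either at random or by highest degree centrality."""
    H = G.copy()
    k = int(fraction * H.number_of_nodes())
    if targeted:
        dc = nx.degree_centrality(H)
        victims = sorted(dc, key=dc.get, reverse=True)[:k]
    else:
        victims = random.sample(list(H.nodes()), k)
    H.remove_nodes_from(victims)
    return giant_component_fraction(H)

G = nx.erdos_renyi_graph(1000, 0.01)   # stand-in for an unstructured P2P overlay
print(attack(G, 0.2, targeted=False), attack(G, 0.2, targeted=True))
```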
Design and application of middleware for Web full-text retrieval
Wei-gang ZHANG, Yong-dong XU, Xiao-qiang LEI, Hui HE
Journal of Computer Applications    2011, 31 (08): 2261-2264.   DOI: 10.3724/SP.J.1087.2011.02261
To provide better Web search services, the key techniques of full-text retrieval were studied and a middleware was designed and implemented. Using a multi-threaded website crawler, the Web pages of the given URLs were collected. The Bloom filter algorithm was employed to remove large-scale duplicate URLs from the collected Web pages. A new content extraction approach based on Web tags was presented to extract the full-text content of Web pages for indexing and searching, and the experimental results verify the efficiency of this content extraction method. Furthermore, to improve users' search experience, the middleware provides a number of personalized search assistance features. Boso, a blog search engine, was developed to test and verify the presented middleware. The results show that the middleware can be applied to actual search engines.
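A minimal Bloom filter sketch of the URL de-duplication step; the bit-array size and the salted-MD5 hashing scheme are assumptions chosen for illustration.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter for URL de-duplication (sizes chosen for illustration)."""
    def __init__(self, size_bits=1 << 20, num_hashes=5):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, url):
        # Derive num_hashes bit positions from salted MD5 digests of the URL.
        for i in range(self.num_hashes):
            digest = hashlib.md5(f"{i}:{url}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, url):
        for p in self._positions(url):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, url):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(url))

bf = BloomFilter()
bf.add("http://example.com/page1")
print("http://example.com/page1" in bf)   # True
print("http://example.com/page2" in bf)   # almost certainly False
```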
Image dehazing method based on neighborhood similarity dark channel prior
GUO Jia, WANG Xiao-tong, HU Cheng-peng, XU Xiao-gang
Journal of Computer Applications    2011, 31 (05): 1224-1226.   DOI: 10.3724/SP.J.1087.2011.01224
Images acquired in bad weather have poor contrast and color. This paper proposed a simple haze removal method based on the dark channel prior. When estimating the transmission, the difference between a pixel's dark channel value and the dark values of its eight nearest neighbors was computed, and the value with the minimal difference was redefined as the new dark channel. In addition, the atmospheric light was estimated automatically from the histogram of the dark channel. Finally, the clear image was recovered based on the physical imaging model. The experimental results show that the method can sharpen edges and improve the quality of degraded images.
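For context, a sketch of the standard dark channel computation that the neighbourhood-similarity variant modifies; the 15-pixel patch is the usual choice, not necessarily the paper's.

```python
import numpy as np

def dark_channel(image, patch=15):
    """Dark channel of an RGB image (H x W x 3, float values in [0, 1])."""
    min_rgb = image.min(axis=2)                  # per-pixel minimum over colour channels
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode="edge")
    dark = np.empty_like(min_rgb)
    h, w = min_rgb.shape
    for y in range(h):
        for x in range(w):
            # Minimum over the local patch centred at (y, x).
            dark[y, x] = padded[y:y + patch, x:x + patch].min()
    return dark
```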
Eliminating local and global self-intersections of offsets based on interval arithmetic and quad trees
WAN Jian, XU Xiao-mei, YE Xiao-hua
Journal of Computer Applications    2005, 25 (08): 1942-1943.   DOI: 10.3724/SP.J.1087.2005.01942
Eliminating offset self-intersections can improve the ability to handle part details and the precision with which parts are manufactured. Existing methods were reviewed, and an approach based on interval arithmetic and quad trees was presented. The method is effective for detecting both local and global self-intersections. Finally, examples were given to illustrate the method.
Reconfiguration-oriented business models for enterprise information system
WANG Zhong-jie, XU Xiao-fei, ZHAN De-chen
Journal of Computer Applications    2005, 25 (08): 1861-1864.   DOI: 10.3724/SP.J.1087.2005.01861
Enterprise information systems need to dynamically adjust their structure and behavior to adapt to continuous changes in the business environment. The reconfigurability of business models is a key factor influencing the reconfigurability of the whole enterprise information system. Based on an analysis of reconfiguration-oriented design principles, a business-rule-based business model was presented, and the description method for every type of business rule was given in detail. In this model, through continuous decomposition of business elements, the frequently changing parts of business models are clearly separated from the stable parts and expressed as business rules, which significantly improves the flexibility of business models. Finally, the modeling process for this kind of business model was briefly described.