
Table of Contents

    01 December 2010, Volume 30 Issue 12
    Pattern recognition and Software
    Tight link location based on available bandwidth measurement of sub-paths
    2010, 30(12):  3141-3144. 
    Abstract | PDF (888KB)

    Since existing tight link location methods impose a heavy measurement load and offer low precision, a method for tight link location based on the available bandwidth measurement of the sub-paths of a network path, called PathLoche, was proposed in this paper. A new packet train, called Loche, was designed to measure the available bandwidth of the sub-paths iteratively; the available bandwidth of the whole path and the location of the tight link can be obtained from the iterative measurements. The simulation experiments show that PathLoche is accurate and minimally invasive.

    Network and communications
    Performance analysis and optimization of network server based on queuing network
    2010, 30(12):  3148-3150. 
    Abstract | PDF (682KB)
    In order to solve the problem that the performance of a network server cannot be analyzed in relatively complex situations of connection number and status, the authors proposed a server performance evaluation method based on queuing networks. This method can effectively reduce time and space complexity and makes the performance analysis of servers more accurate and efficient. The experimental results show that analyzing server performance with this method and optimizing the server program according to the results can make good use of system resources and improve server performance.
    Power allocation and channel capacity for cooperative tri-node
    2010, 30(12):  3151-3154. 
    Abstract | PDF (639KB)
    Power allocation for tri-terminal cooperative communication was discussed to predict the power allocation and channel capacity of cooperative multi-terminal systems when the channel gains are fixed. An optimum power allocation algorithm, with channel capacity as the objective function, was given. The research results show that when the channel gain from the source node to the destination node is larger than the channel gains from the source node to the relay nodes, the source node should transmit information to the destination node directly. When the product of the squared channel gain from the source node to a relay node and the source transmit power is smaller than the product of the squared channel gain from the relay node to the destination node and the relay transmit power, the source node should transmit information to the destination node via the relay nodes. In other common cases, the source node should transmit information to the destination node partly via the relay nodes and partly directly.
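    The three cases above amount to a simple decision rule. A minimal sketch, assuming the conditions are exactly as stated in the abstract (function and variable names are hypothetical; the paper's channel model may differ):

```python
def transmission_mode(h_sd, h_sr, h_rd, p_s, p_r):
    """Illustrative decision rule distilled from the abstract's conditions.

    h_sd, h_sr, h_rd: channel gains source->destination, source->relay,
    relay->destination; p_s, p_r: transmit powers of source and relay.
    """
    if h_sd > h_sr:
        # Direct link is stronger than the source-relay hop: transmit directly.
        return "direct"
    if (h_sr ** 2) * p_s < (h_rd ** 2) * p_r:
        # Relay-destination side can carry everything the relay receives.
        return "relay"
    # Otherwise split traffic between the direct and relayed paths.
    return "hybrid"
```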
    States analysis and Agent modeling for wireless sensor network nodes
    2010, 30(12):  3155-3157. 
    Abstract | PDF (621KB)
    To understand and analyze the autonomous working mechanism of Wireless Sensor Networks (WSN) independently of internal structure and practical implementation, the various working states and self-organization properties of WSN nodes were analyzed, and an Agent class meta-model for WSN nodes was built. The proposed meta-model was based on the Agent BDI model and extended the AUML Agent class diagram by adding Mental, Role and Protocol model elements that meet WSN characteristics to describe the static structures of WSN nodes. The case study shows that, in combination with node working states, the Agent class meta-model provides a solid foundation for visual modeling and analysis tools in WSN node architecture research.
    Two-hop hello protocol for mobile Ad Hoc and sensor networks
    2010, 30(12):  3158-3160. 
    Abstract | PDF (859KB)
    The periodical Hello protocol is widely used in geographic routing protocols to obtain and maintain neighbor tables. However, in highly dynamic networks, the information stored in the neighbor table is often outdated, causing retransmissions and rerouting that consume bandwidth and increase latency. A new Two-hop Hello protocol (T-Hello) was proposed to improve the accuracy of the neighbor table. By exchanging beacon messages within a two-hop scope, location information can be obtained even if a neighbor node has moved out of communication range, so that the corresponding entry in the neighbor table can be removed explicitly rather than waiting for a timeout. The impacts of various factors (node density, beacon interval, node speed, transmission radius, etc.) were extensively studied, and the conclusion is that the T-Hello protocol can shorten the lifetime of outdated neighbors by 50%, thereby improving the performance of the GPSR protocol significantly.
    Network coding based WSN routing policy with friendly energy consumption
    HOU Lei
    2010, 30(12):  3161-3163. 
    Abstract | PDF (770KB)
    For the purpose of balancing and decreasing the routing energy consumption of Wireless Sensor Networks (WSN), this paper put forward a network coding based routing policy with an energy-friendly consumption feature. Three constraints on network lifetime, data flow and multicast flow in WSN were analyzed in detail. Simulation results show that this policy can balance energy consumption better than other policies, and can notably extend the lifetime of WSN.
    Ad Hoc network data sharing mechanism based on service evaluation management
    2010, 30(12):  3164-3167. 
    Abstract | PDF (665KB)
    In order to overcome the routing limitation in Ad Hoc networks that at least one full path must exist, and to realize data security and sharing, load balancing of node storage, etc., a dynamic routing algorithm based on node density was proposed. A resource sharing mechanism based on service evaluation management was put forward for sharing cache resources in the networks, which can provide safe and good services when node densities differ. In this mechanism, when a mobile node sends a communication request, the information possessed by neighbor nodes is audited, and only information possessed by qualified nodes is exchanged. After the service ends, it is evaluated by the participating nodes, and the data and service evaluation are stored in the neighbor nodes. When the density of a proxy node decreases, it sends a caching request to its neighbors, and a neighbor then provides the corresponding data service as a proxy node. Information in the cache resource whose service evaluation is below the user's security demand is updated or deleted. This mechanism effectively improves the data security and network robustness of Ad Hoc networks, and reduces communication bandwidth and data storage space.
    Threshold dynamic adjustment algorithm for number of connections based on yield assessment mechanism
    2010, 30(12):  3168-3171. 
    Abstract | PDF (708KB)
    To address the shortcoming that the limit on concurrent connections in output gateways is currently a fixed value, a dynamic adjustment algorithm for the connection number threshold based on yield assessment was presented, which obtains an optimal threshold by weighing the benefit values of valid connections against invalid ones. It can be adapted to different network preferences, effectively control the mass of invalid connections generated by P2P applications, and protect the effective use of diverse network applications.
    Multi-objective optimization of RapidIO network
    2010, 30(12):  3172-3175. 
    Abstract | PDF (689KB)
    Concerning the drawbacks of the existing RapidIO network routing management strategy, an optimized strategy based on an improved Genetic Algorithm (GA) was proposed. By improving the routing strategy, coding, mutation and crossover of the conventional GA, the performance of the Quality of Service (QoS) objectives was improved effectively. Meanwhile, compared to the conventional GA, the improved GA increases the convergence speed greatly. The proposed algorithm is suitable for embedded applications based on RapidIO networks, and has strong practical value.
    Application-oriented NoC bandwidth aware routing technology
    2010, 30(12):  3176-3179. 
    Abstract | PDF (712KB)
    The Networks-on-Chip (NoC) approach was proposed as a promising solution to on-chip communication problems, but it is far more resource-limited. A standard topology cannot satisfy the application traffic demand and leads to overconsumption of power and area, and designs that are adequate for general-purpose systems cannot provide quality-of-service oriented, predictable interconnects. This paper proposed a new Bandwidth-Aware Routing Technique (BART) that optimizes the network performance for application-specific NoCs, where the specific application and its traffic characteristics are given. The proposed routing technique has two phases. In the first phase, a communication-aware mapping technique is used to obtain a near-optimal assignment of IPs to network nodes. In the second phase, a bandwidth-aware routing algorithm is used to find the minimal route for each flow in the network. The routes are ensured to be deadlock-free through static virtual channel assignment. In addition, BART incorporates a router table compression method to reduce the hardware overhead. The evaluation results show that BART outperforms the existing routing algorithms.
    EH-EC: High performance data forwarding mechanism for opportunistic networks
    2010, 30(12):  3180-3183. 
    Abstract | PDF (874KB)
    An efficient method was proposed to solve the system black hole problem existing in Hybrid Erasure Coding (H-EC), and at the same time an original method was proposed to divide messages into small data blocks. Based on these, a new forwarding approach called Enhanced Hybrid Erasure-Coding (EH-EC) was proposed and implemented. Theoretical analysis and simulation on the OMNeT++ platform show that EH-EC can transmit messages from the source node to the destination node with lower delay and higher delivery rate than H-EC, at the cost of only a small increase in routing redundancy.
    Workflow scheduling algorithm based on load balance in grid
    2010, 30(12):  3184-3186. 
    Abstract | PDF (569KB)
    Since time ordering and data dependence exist in grid workflows, workflow scheduling in the grid is NP-hard. Concerning Directed Acyclic Graph (DAG)-based workflows in the grid, the paper presented a scheduling algorithm that first identifies the critical path and schedules its member nodes. This algorithm achieves better system load balancing under the best makespan. The simulation shows the validity of the proposed algorithm.
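    The critical-path-first idea above can be sketched as a longest-path computation over the DAG. A minimal sketch, assuming node execution costs are positive (names and the cost model are illustrative, not the paper's exact formulation):

```python
from collections import deque

def critical_path(tasks, deps):
    """tasks: {name: positive execution cost}; deps: (u, v) pairs, u before v.
    Returns the critical (highest total cost) path as a list of task names."""
    succ = {t: [] for t in tasks}
    indeg = {t: 0 for t in tasks}
    for u, v in deps:
        succ[u].append(v)
        indeg[v] += 1
    # Longest path by dynamic programming over a topological order.
    dist = {t: tasks[t] for t in tasks}
    prev = {t: None for t in tasks}
    q = deque(t for t in tasks if indeg[t] == 0)
    while q:
        u = q.popleft()
        for v in succ[u]:
            if dist[u] + tasks[v] > dist[v]:
                dist[v] = dist[u] + tasks[v]
                prev[v] = u
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    end = max(dist, key=dist.get)
    path = []
    while end is not None:
        path.append(end)
        end = prev[end]
    return path[::-1]  # schedule these nodes first
```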
    Improved algorithm of DRX mechanism in long term evolution system
    2010, 30(12):  3187-3190. 
    Abstract | PDF (686KB)
    To dynamically adjust the Discontinuous Reception (DRX) cycle length according to the Quality of Service (QoS) of various kinds of services, and to utilize the correlation of the data arrival process, an improved algorithm was proposed that adjusts the cycle length and sets the initial sleep interval according to the last cycle length. The performance of the proposed strategy was then analyzed in terms of energy consumption and delay. The simulation results show that the proposed algorithm achieves better energy saving and delay performance.
    Advanced computing and artificial intelligence
    Design and implementation of grid computation platform for water applications
    2010, 30(12):  3191-3193. 
    Abstract | PDF (979KB)
    Considering the difficulties that existing water application systems face in complex business applications, the authors, in combination with the features of water problems, designed and developed a grid computation platform for water applications. The system provides a combined B/S and C/S service model, realizes role-based user access control, monitors nodes in real time through a heartbeat mechanism, and achieves workflow-based task management.
    Multi-core parallel algorithm for cubic spline curve fitting
    2010, 30(12):  3194-3196. 
    Abstract | PDF (539KB)
    In order to make full use of multi-core technology to enhance the resource utilization of multi-core processors, shorten execution time and demonstrate the remarkable performance of multi-core systems, a multi-threaded parallel program was designed to solve the tridiagonal equations by odd-even reduction, and the calculation speed of cubic spline curve fitting was increased on a multi-core computer. A comparison of the speedup ratios in the experimental results shows that the parallel program shortens the time of solving the equations and that multi-core resources are fully utilized. The results indicate that the multi-core parallel algorithm of odd-even reduction applied to cubic spline curve fitting is effective and feasible, and the research results have good practical significance.
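    For context, the kind of tridiagonal system that arises in cubic spline fitting is solved serially by the Thomas algorithm; the paper's contribution is to replace this serial solve with parallel odd-even (cyclic) reduction across cores. A sketch of the serial baseline only (variable names are illustrative):

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal system in O(n).
    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    # Forward sweep: eliminate the sub-diagonal.
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

    Odd-even reduction instead eliminates the odd-indexed unknowns in parallel at each step, halving the system size per step, which is what makes the multi-threaded version scale on multiple cores.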
    Information dispersal algorithm based on Reed-Solomon code
    2010, 30(12):  3197-3200. 
    Abstract | PDF (737KB)
    The Information Dispersal Algorithm (IDA) based on Reed-Solomon (RS) codes can be used in highly reliable distributed storage systems. The encoding/decoding speed is an important criterion for the availability of RS codes. Firstly, the erasure principle of RS codes was analyzed, and the Galois field in which the encoding/decoding operations take place was discussed. Based on the characteristics of arithmetic operations in the Galois field, a double-table method was designed to increase the encoding/decoding speed. Finally, the efficiency of the algorithm was analyzed both in theory and in experiments. The experimental results show that the IDA can provide an encoding/decoding speed of 18Mbps. Based on the experimental results, this paper analyzed the circumstances the algorithm fits and pointed out future research directions for IDA.
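    The double-table idea replaces shift-and-xor polynomial multiplication in the Galois field with two lookups in precomputed exponent/logarithm tables. A sketch for GF(2^8), assuming the widely used primitive polynomial 0x11d (the paper's field parameters may differ):

```python
# Build the exp/log "double tables" for GF(2^8) over x^8+x^4+x^3+x^2+1 (0x11d).
EXP = [0] * 512
LOG = [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:          # reduce modulo the primitive polynomial
        x ^= 0x11d
for i in range(255, 512):  # duplicate so gf_mul needs no modulo 255
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    """Multiply in GF(256) with two table lookups and one addition."""
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]
```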
    Approach to service composition in user-oriented problem solving environment
    2010, 30(12):  3201-3203. 
    Abstract | PDF (749KB)
    To enable service composition that is just-in-time and adaptive to heterogeneous Web services and the variability of business requirements, a service composition approach oriented to the variability of business requirements was proposed. Measures to establish stable cooperation relations among Web services were presented, and rules for service clustering based on the compatibility of Web services were verified. Based on these rules, the approach and algorithm for service composition based on the relation graph of aggregated Web services were also given. The experiments show that the approach can improve the efficiency of service composition compared to conventional approaches, and eventually make service composition more just-in-time and adaptive in user-oriented problem solving environments.
    Continuous domains ant colony algorithm with dimension mutation operator
    2010, 30(12):  3204-3206. 
    Abstract | PDF (735KB)
    Concerning the disadvantages of ant colony optimization in continuous optimization, such as easily plunging into a local optimum and slow convergence, a new Ant Colony Algorithm (ACO) with a dimension mutation operator (DMCACO) was presented. In this algorithm, target individuals that lead the ant colony in a rapid global search are determined by dynamic and stochastic extraction, while the current optimal ant performs a local search with small steps. The concept of dimension diversity was defined, and the dimension with the worst diversity is mutated by the dimension mutation operator: the positions of all ants in this dimension are distributed evenly over the feasible range. The simulation on typical test functions indicates that this algorithm has excellent global optimization capability and rapid convergence.
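    The dimension mutation step above can be sketched as follows: measure diversity per dimension (here simply by variance, an assumption; the paper's diversity measure may differ), pick the worst dimension, and spread the ants evenly over its feasible range:

```python
def dimension_mutation(positions, bounds):
    """positions: list of ants, each a list of coordinates (at least two ants);
    bounds: per-dimension (low, high) tuples.  Redistributes all ants evenly
    along the least diverse dimension and returns its index."""
    n, dims = len(positions), len(positions[0])

    def variance(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals) / len(vals)

    # Dimension with the worst (lowest) diversity.
    worst = min(range(dims), key=lambda d: variance([p[d] for p in positions]))
    low, high = bounds[worst]
    for i, p in enumerate(positions):
        # Evenly spaced points across [low, high].
        p[worst] = low + (high - low) * i / (n - 1)
    return worst
```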
    Stochastic location-routing model and algorithm in emergency distribution by airlift
    2010, 30(12):  3207-3210. 
    Abstract | PDF (710KB)
    To optimize the Location-Routing Problem (LRP) in post-earthquake emergency logistics systems, a stochastic optimization model with uncertain demand for relief commodities and breakage of the road network was developed to determine the locations of relief commodity distribution centers and relief distribution centers, as well as the relief airplane routes during the relief process. According to the characteristics of the model, an improved genetic algorithm was proposed, and a special real-valued coding scheme, a penalty function method and a demand split strategy were adopted to handle the constraints in the model. The results of a numerical example show that the proposed model and algorithm can resolve the facility location-allocation and airplane routing problem in post-earthquake emergency logistics systems efficiently.
    PSO-based PWA multi-modeling for nonlinear system
    2010, 30(12):  3211-3214. 
    Abstract | PDF (746KB)
    Based on Particle Swarm Optimization (PSO) and multi-modeling, a PSO-based Piece-Wise Affine (PWA) modeling method was proposed to deal with complex nonlinear systems. The proposed algorithm first translates the PWA modeling problem into a Mixed Integer Quadratic Programming (MIQP) problem, and then employs PSO to solve it. A layering strategy was employed in the solving process, which not only decreases the dimension of the optimization problem effectively, but also reduces the probability of falling into a local optimum. A simulation example indicates the effectiveness of the proposed modeling method.
    Path finding algorithm in massive multiplayer online games based on anchor points and path reuse
    2010, 30(12):  3215-3217. 
    Abstract | PDF (553KB)
    In order to address the low performance of path finding algorithms in massive multiplayer online games, this paper proposed a path finding algorithm for Massive Multiplayer Online Games (MMOG) based on anchor points and path reuse. The algorithm reduces the unnecessary search space by using anchors and path reuse, reduces the load on the server, and finally discovers a near-optimal path. The experimental results indicate that the algorithm is more efficient, and that it is a practical and feasible approach to path finding in online games.
    Design of radar/jammer shared signal based on chaos genetic hybrid algorithm
    2010, 30(12):  3218-3221. 
    Abstract | PDF (906KB)
    The design of a shared signal waveform for radar and jammer is the key to energy sharing in integrated radar-jammer systems in electronic warfare. A new chaos genetic hybrid algorithm was proposed. An improved tent chaos system was used to initialize the population, and chaos characteristics were added into the adaptive genetic algorithm loop. A new chaotic section crossover operator and a chaos annealing mutation operator were designed to prevent the search from being trapped in a local minimum and to ensure convergence to a global minimum. The experimental results show that the proposed algorithm can find the globally optimal solution quickly, and that the chaos genetic algorithm can solve the radar-jammer shared signal optimization problem effectively.
    Graphics and image processing
    Color fusion method for night vision based on YUV space
    2010, 30(12):  3222-3224. 
    Abstract | PDF (1213KB)
    The paper presented a simple and fast color fusion method for night vision based on image enhancement and color transfer. Firstly, the contrast of the infrared and low-light visible images was adjusted by local histogram equalization, and a median filter was applied to clean noise in the enhanced images. Then, the two enhanced images were fused into the three components of an RGB image by a simple linear fusion strategy. To obtain false color images possessing a natural day-time color appearance, the paper suggested an approach that transfers color from the reference to the fused images in YUV color space. In this way, inappropriate color mappings are avoided and overall discrimination capability is enhanced. The experimental results based on three different data sets show that the final scene has a natural day-time color appearance, and the fusion process is simple and fast.
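    Color transfer in a decorrelated space is commonly done by matching each channel's mean and standard deviation to the reference image (a Reinhard-style transfer; the paper's exact transfer rule is not given, so this is an assumption). A per-channel sketch on flat lists; a full pipeline would first convert RGB to YUV and process Y, U, V separately:

```python
def transfer_channel(src, ref):
    """Shift/scale one channel of the fused image (src) so its mean and
    standard deviation match the reference day-time image (ref)."""
    def stats(v):
        m = sum(v) / len(v)
        s = (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5
        return m, s

    ms, ss = stats(src)
    mr, sr = stats(ref)
    scale = sr / ss if ss > 0 else 1.0
    # Center on the source mean, rescale to the reference spread,
    # then shift to the reference mean.
    return [(x - ms) * scale + mr for x in src]
```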
    Image fusion based on wavelet transform and adaptive PCNN
    2010, 30(12):  3225-3228. 
    Abstract | PDF (952KB)
    Concerning the fusion problem of multi-focus images of the same scene, an image fusion algorithm based on wavelet transform and adaptive Pulse Coupled Neural Network (PCNN) was proposed. Firstly, the original images were decomposed by wavelet transform, and the sub-band images at different scales were obtained. Secondly, a fusion rule was given by making use of synchronous pulse bursts: the Energy of Laplacian (EOL) of the wavelet coefficients at each scale was used as the linking strength of the corresponding neuron. After the processing of the PCNN with adaptive linking strength, new fire mapping images in the wavelet domain were obtained. According to the fire mapping images, the fusion coefficients were decided by a compare-select operator, and a region consistency test was then applied to obtain the final fusion coefficients. Finally, the fused images were obtained by inverse wavelet transform. The experimental results illustrate that this algorithm extracts feature information from the original images efficiently and improves the fused images; it outperforms the conventional methods in subjective visual effect and objective performance indexes.
    Multi-focus image fusion algorithm based on Bandelet and region feature
    2010, 30(12):  3229-3232. 
    Abstract | PDF (1151KB)
    A multi-focus image fusion algorithm based on the Bandelet transform and region statistics was developed, which fully utilizes the second-generation Bandelet's advantages in capturing the geometric regularity of image structure and representing sharp image transitions such as edges efficiently. In image fusion, the source images were first decomposed by the Bandelet transform; for reconstructing the fused image, the maximum rule was used to select the geometric flow of the source images, and regional variance was used to select the Bandelet coefficients. Finally, the fused image was reconstructed by the inverse Bandelet transform. The experimental results indicate that the Bandelet-based fusion algorithm represents edge and detail information well and outperforms the wavelet-based and Laplacian pyramid-based fusion algorithms, especially when abundant texture and edges are contained in the source images.
    Improved method of real-time image mosaic
    2010, 30(12):  3233-3235. 
    Abstract | PDF (655KB)
    A new real-time image mosaic method based on gray correlation and feature points was proposed after analyzing and comparing the advantages and disadvantages of different registration algorithms. The image is adaptively scaled to the required output size. The registration direction and the approximate location of candidate matches are determined by a feature-point-based method, and the best match point is then located accurately by a gray-correlation-based method. Finally, the overlapping images are fused by weighted averaging. The experimental results show that the new algorithm can mosaic images correctly and in real time.
    Automatic color equalization algorithm of multi-camera image mosaic
    2010, 30(12):  3236-3237. 
    Abstract | PDF (730KB)
    Automatic color equalization is a very important technology for image processing. After analyzing the problems in image mosaicking and discussing the color equalization algorithms currently common in image mosaicking, the paper proposed a new color and brightness equalization algorithm based on image pixel-mean statistics. Firstly, the overlaps are extracted from simultaneous frames of two adjacent cameras, and the RGB channels are separated. Then, with one camera image used as the reference image and the other as the target image, the pixel means of the color channels are computed before the whole target image is corrected. Finally, the color space of both the revised image and the reference image (the whole image) is converted from RGB to HSV, and the pixel-mean difference of their brightness (V) channels is calculated to correct the target image's brightness again. The results show that the algorithm can correct the brightness and color differences between adjacent camera images effectively, and brings a good improvement for the later stage of image mosaicking.
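    The per-channel correction step can be sketched as a gain computed from the overlap-region means, applied to the whole target channel (an illustrative reading of the abstract; whether the paper uses a multiplicative gain or an additive offset is an assumption):

```python
def equalize_channel(target, overlap_ref_mean, overlap_tgt_mean):
    """Correct one RGB channel of the target camera image by the ratio of
    the reference and target pixel means over the overlap region.
    Corrected values are clamped to the valid [0, 255] range."""
    gain = overlap_ref_mean / overlap_tgt_mean
    return [min(255, max(0, round(v * gain))) for v in target]
```

    The V-channel brightness pass described in the abstract would follow this RGB correction, using the mean difference of the HSV brightness channels.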
    Local adaptive image denoising based on minimum Bayes risk in wavelet domain
    2010, 30(12):  3238-3240. 
    Abstract | PDF (922KB)
    The paper briefly introduced the basic concept of the Generalized Gaussian Distribution (GGD) and studied the distributional properties of wavelet coefficients, then analyzed the principle of BayesShrink and pointed out its existing shortcomings. A locally adaptive wavelet denoising algorithm was proposed based on the redundant wavelet transform and the correlation among wavelet coefficients in a subband. The new method selects a proper neighboring window centered on the current coefficient, estimates the corresponding ideal standard deviation and threshold for the centered coefficient, and then shrinks it by soft thresholding. The experimental results show that the new method effectively filters the noise, preserves more texture and detail of the images, and achieves a higher Peak Signal-to-Noise Ratio (PSNR) and better visual quality.
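    The local BayesShrink idea can be sketched on a 1-D run of coefficients: estimate the signal deviation sigma_x from a centered window, form the BayesShrink threshold T = sigma_noise^2 / sigma_x, and soft-threshold. Window size and names are illustrative assumptions; the paper operates on 2-D subbands:

```python
def local_bayes_shrink(coeffs, sigma_noise, win=5):
    """Locally adaptive BayesShrink sketch with soft thresholding."""
    half = win // 2
    out = []
    for i in range(len(coeffs)):
        w = coeffs[max(0, i - half):i + half + 1]
        var_y = sum(c * c for c in w) / len(w)
        # Signal variance estimate: sigma_x^2 = max(var_y - sigma_noise^2, 0).
        var_x = max(var_y - sigma_noise ** 2, 0.0)
        if var_x == 0.0:
            out.append(0.0)  # window looks like pure noise: kill coefficient
            continue
        t = sigma_noise ** 2 / var_x ** 0.5
        # Soft threshold: shrink magnitude toward zero by t, keep the sign.
        c = coeffs[i]
        out.append(max(abs(c) - t, 0.0) * (1.0 if c > 0 else -1.0))
    return out
```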
    Hybrid filtering algorithm based on adaptive gradient magnitude and morphological operations
    2010, 30(12):  3241-3245. 
    Abstract | PDF (1993KB)
    In view of the defects of the traditional gradient-magnitude-based filtering algorithm, in which a manually set threshold cannot determine the noise points correctly and may introduce new noise points, a hybrid filtering algorithm based on adaptive-threshold gradient magnitude and morphological operations was proposed. Firstly, the original image is filtered using the mean and variance of the gradient magnitude, with the Otsu algorithm improved to determine the threshold adaptively, which filters noise while maintaining image detail. Then a cascaded filter composed of multi-structuring-element morphological operations is applied to the image produced by the adaptive gradient magnitude filter, which eliminates the new noise generated in the adaptive filtering process. The experimental results and analysis show that the visual effect of the proposed algorithm is better than that of the traditional filtering methods, demonstrating the hybrid algorithm's advantages in maintaining image detail and filtering effectiveness.
    Performance estimation scheme for histogram-shifting-based reversible watermarking
    2010, 30(12):  3246-3251. 
    Abstract | PDF (1269KB)
    Multi-layer embedding for histogram-shifting-based reversible watermarking leads to high capacity. However, due to the high complexity of the time-consuming multi-layer embedding, it generally cannot be assessed in advance whether the given secret data can be embedded into the host image and whether the stego-image quality will meet the requirement. This paper presented a fast performance estimation scheme based only on the histogram information. The experimental results show that the proposed scheme is fast and accurate, and provides a basis for the application of multi-layer reversible watermarking.
    Algorithm of H.264 fast deblocking filter on CUDA
    2010, 30(12):  3252-3254. 
    Abstract | PDF (737KB)
    In the H.264/AVC video coding standard, the deblocking filter is used to enhance coding efficiency, but it is very complicated and time-consuming. A fast algorithm and efficient implementation of the H.264 deblocking filter based on NVIDIA Compute Unified Device Architecture (CUDA) was proposed. The parallel hardware architecture and software development process of the Graphics Processing Unit (GPU) were introduced first. On the basis of the parallel architecture and hardware characteristics of the GPU, the Boundary Strength (BS) computation and the deblocking filter were optimized to reduce complexity and improve computing speed, and shared memory was used to improve data access efficiency. The experimental results clearly show that, at the same image quality, the average speedup over the CPU implementation is about 20.
    Prediction on decoded image quality for fractal image coding
    2010, 30(12):  3255-3257. 
    Abstract | PDF (673KB)
    A new decoded image quality prediction algorithm for fractal image coding was proposed. According to the results of many experiments, there exists an approximately exponential relationship between the average collage error and the PSNR of the decoded image. The collage error of each range block is computed during the fractal encoding process, and the PSNR of the decoded image is predicted from the average collage error. The experimental results show that for different sizes of range blocks and fast fractal image encoding, the proposed algorithm can obtain satisfactory predictions of decoded image quality.
    Pattern recognition and Software
    Color image segmentation using normalized cut and particle swarm optimization algorithm
    2010, 30(12):  3258-3261. 
    Abstract | PDF (902KB)

    Spectral clustering that minimizes the normalized cut criterion has high computational complexity and gives inaccurate results in color image segmentation. To overcome these disadvantages, the paper first applies Fuzzy C-Means (FCM) to the three channels of the color image and obtains a pre-segmented image from the clustering results of these channels to construct an undirected weighted graph; it then minimizes the normalized cut criterion using a discrete particle swarm optimization algorithm instead of spectral clustering; finally, the segmentation result is given by the optimal particle. The experimental results show that the method is less time-consuming and obtains a precise segmentation result in color image segmentation.

    Graphics and image processing
    New algorithm for partitioning graph based on Ncut criterion
    2010, 30(12):  3262-3264. 
    Abstract | PDF (695KB)
    Concerning the problem of low similarity within groups in undirected weighted graph partitioning, the authors used a new and more reasonable global partitioning criterion, Ncut, which measures both the total dissimilarity between different groups and the total similarity within groups. A new algorithm called RNK, based on the basic iterative improvement strategy, was proposed: an existing partition of G is improved by swapping pairs of nodes that improve the Ncut. The paper also presented a hash strategy that improves the efficiency of finding the best node pair to swap in the RNK algorithm, especially when the graph is dense. Finally, the authors implemented both the KL and RNK algorithms; the results of using the two algorithms to partition a number of random graphs show that RNK is more reasonable and efficient.
    Video object segmentation in H.264 compressed domain based on entropy energy
    2010, 30(12):  3265-3268. 
    Abstract   PDF (917KB)
    The paper presented a new temporal-spatial method for moving object segmentation in H.264 based on a local self-adaptive entropy threshold. The Motion Vector (MV) fields of several continuous frames were accumulated to enhance the motion information. A similarity measure was then applied to the accumulated MV field to remove part of the noise. Next, the residual size of each 4×4 block was extracted from the compressed bitstream, and a threshold for each 4×4 block was selected by the local self-adaptive entropy threshold. Finally, the boundary of the motion block was further refined. Tests on H.264 sequences demonstrate the validity of the proposed method.
    Algorithms and performance comparison of automatic thresholding segmentation for forest regions in remote sensing image
    2010, 30(12):  3269-3273. 
    Abstract   PDF (1019KB)
    Several classic automatic threshold-selecting algorithms were selected for segmenting forest regions in high-resolution remote sensing images. Misclassification error, shape measure, uniformity measure, relative ultimate measurement accuracy and running time were implemented as evaluation criteria to compare the segmentation performance of each algorithm objectively and quantitatively. The conclusions can serve as guidance for choosing automatic threshold-selecting methods to extract forest regions from high-resolution remote sensing images.
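    Of the evaluation criteria listed, misclassification error is the simplest: the fraction of pixels assigned to the wrong class relative to a ground-truth mask. A hedged sketch (the toy masks are illustrative, not data from the paper):

```python
# Misclassification error of a thresholding result against a ground truth.
def misclassification_error(truth, result):
    """Fraction of pixels assigned to the wrong class (0.0 = perfect)."""
    assert len(truth) == len(result)
    wrong = sum(t != r for t, r in zip(truth, result))
    return wrong / len(truth)

gt  = [1, 1, 0, 0, 1, 0, 0, 0]   # 1 = forest, 0 = background
seg = [1, 1, 0, 1, 1, 0, 0, 0]   # one background pixel mislabeled as forest
err = misclassification_error(gt, seg)   # 1/8 = 0.125
```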
    Automatic segmentation and fluorescent intensity measurement for endothelial cell images
    2010, 30(12):  3274-3277. 
    Abstract   PDF (974KB)
    Quantitative measurement of fluorescent emission is important. An automatic segmentation and cell region fluorescent intensity measuring method was proposed for endothelial cells. Morphological image reconstruction and background subtraction were used to eliminate the uneven image background. Histogram equalization was adopted for image enhancement. Automatic thresholding and morphological filtering were subsequently carried out on the enhanced image, resulting in the binary coarse segmentation, which was afterwards processed by morphological thinning and watershed transform to obtain foreground and background markers, respectively. Marker-controlled watershed was performed on the gradient map of the enhanced image to give the final segmentation. Average fluorescent intensity over the segmented cell regions was calculated and the measurement was completed. The experimental results on real-world images show that, compared with the direct thresholding approach, the proposed method can achieve more accurate segmentation and fluorescent intensity measurement.
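    The background-elimination step above can be illustrated in 1-D with a morphological top-hat (signal minus its opening, i.e. erosion then dilation): slowly varying background is removed while narrow bright cells survive. The window size and toy profile are illustrative assumptions, not the paper's exact operators:

```python
# 1-D grayscale morphology: top-hat = signal - opening(signal).
def erode(sig, k):
    r = k // 2
    return [min(sig[max(0, i - r):i + r + 1]) for i in range(len(sig))]

def dilate(sig, k):
    r = k // 2
    return [max(sig[max(0, i - r):i + r + 1]) for i in range(len(sig))]

def tophat(sig, k=5):
    """Signal minus its morphological opening: narrow bright peaks survive."""
    opened = dilate(erode(sig, k), k)
    return [s - o for s, o in zip(sig, opened)]

# ramp background (0..9) with one narrow bright "cell" at index 4
profile = [i + (10 if i == 4 else 0) for i in range(10)]
flat = tophat(profile)
```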
    Morphology granule segmentation algorithm based on fuzzy reasoning of image features
    2010, 30(12):  3278-3280. 
    Abstract   PDF (718KB)
    For the problem of segmenting touching or overlapping granules in images, a local morphological reconstruction parameter calculation method based on fuzzy reasoning over image features was proposed, which improves the traditional watershed algorithm combined with distance transformation. The granule image was first divided into several connected regions by the traditional watershed-plus-distance-transformation algorithm, and every connected region was processed separately; morphological local reconstruction was then used to solve the over-segmentation problem. Granule shape features of each connected region were extracted by statistically analyzing the maximum points of the connected region in the distance image. The granule shape features were taken as fuzzy inputs and the reconstruction parameter as the fuzzy output, so the morphological reconstruction parameter was calculated adaptively by fuzzy reasoning, which resolves the uncertainty in reconstruction parameter selection. Finally, the watershed transform was carried out on the reconstructed image to obtain the granule segmentation. The experimental results show that the improved method can accurately segment various overlapping granules; moreover, it overcomes both the over-segmentation problem of the traditional method and the problem of adaptive parameter choice.
    Virtual reality and pattern recognition
    Fast voxelization based on triangulated irregular network model
    2010, 30(12):  3281-3283. 
    Abstract   PDF (991KB)
    In order to improve the voxelization efficiency for large data, a fast and simple voxelization algorithm was proposed for the Triangulated Irregular Network (TIN) model. Surface voxelization was realized by subdividing triangular facets, which converts surface voxelization into point voxelization. Solid voxelization was then realized by using a depth buffer to quickly search for initial seeds. The experimental results show that, for complex and precise TIN models, a 26-connected voxel model that closely approximates the original model can be generated with high time efficiency.
    Implementation of hardware-in-loop network simulation platform based on NS-2
    2010, 30(12):  3284-3287. 
    Abstract   PDF (978KB)
    NS-2 is one of the most widely used network simulators, but there has been little research on hardware-in-loop simulation based on it. A new Hardware-In-Loop simulation platform based on NS-2 (HIL-NS) was proposed, which maps real data flows into the simulated networks. After describing the architecture of the HIL-NS platform in detail, the key technologies for implementing it were discussed. Simulation results for the wireless MAC protocol 802.11b in a Vehicle Ad-hoc NETwork (VANET) show that the results of the HIL-NS platform highly coincide with those of NS-2, and the proposed platform can give more direct real-time performance output while still supporting traditional data analysis.
    Depiction of structural relationships between objects in volumetric data
    2010, 30(12):  3288-3291. 
    Abstract   PDF (1025KB)
    Different structural relationships, such as inclusion and topology, exist between the various objects in a volume data set, and these relationships affect how the features of the data set are understood. By analyzing the relationships between objects, this paper proposed that an effective volume rendering should define a mapping from data to rendering such that each relationship between objects in the data space is mapped into the rendering space. Different rendering methods were then presented to depict typical structural relationships in the data set. Depicting the various structural relationships between objects in the rendering space enhances the understanding of the data set and extends the role of volume rendering in volume data analysis.
    Improved geoclipmap algorithm for terrain visualization
    2010, 30(12):  3292-3294. 
    Abstract   PDF (792KB)
    In the real-time rendering of large terrain, it is the contradiction between the large volume of terrain data and the limited data communication bandwidth of the hardware that limits rendering efficiency. This work was based on the Geoclipmap algorithm, with several improvements to address the problem. During data preprocessing, a Geometry Scene Graph (GSG) structure was used to organize the terrain data to improve data loading speed, and a sinc function was used as the filter when generating mipmap levels so that terrain surface detail is not flattened. In the rendering process, a two-level view frustum culling based on a bounding-sphere hierarchy was presented, and normal computation was moved into the fragment shader of the Graphics Processing Unit (GPU), which reduces the data stream between CPU and GPU. The simulation results show that the algorithm keeps the fidelity of the terrain and is effective in meeting the needs of large terrain rendering.
    Image semantic annotation method based on multi-modal relational graph
    2010, 30(12):  3295-3297. 
    Abstract   PDF (894KB)
    In order to improve image annotation performance, an image semantic annotation method based on a multi-modal relational graph was proposed. The relationships between the low-level features of image regions, annotation words and images were represented by an undirected graph. Semantic information was extracted by combining similarity measured in the region feature space with the correlation of annotation words, improving the accuracy of the extracted semantics. Inverse Document Frequency (IDF) was introduced to adjust the weights of the edges between an image node and its annotation word nodes, overcoming the bias caused by high-frequency words and effectively improving annotation performance. The experimental results on the Corel image datasets show the effectiveness of the proposed approach in terms of annotation quality.
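    The IDF re-weighting idea can be sketched as below: edges to words that annotate many images are down-weighted so that high-frequency words do not dominate. The toy annotation table and the plain logarithmic form are illustrative assumptions, not the paper's exact weighting:

```python
# IDF weighting over a hypothetical image -> annotation-words table.
import math

annotations = {
    "img1": {"sky", "grass"},
    "img2": {"sky", "water"},
    "img3": {"sky", "tiger"},
    "img4": {"tiger", "grass"},
}

def idf(word):
    """log(N / document frequency): rarer words get larger weights."""
    n = len(annotations)
    df = sum(word in ws for ws in annotations.values())
    return math.log(n / df)

w_sky, w_tiger = idf("sky"), idf("tiger")   # "sky" appears in 3/4 images
```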
    Global illumination with precomputed radiance transfer
    2010, 30(12):  3298-3300. 
    Abstract   PDF (682KB)
    This paper presented a method of global illumination on the Graphics Processing Unit (GPU): based on GPU-programmable rendering pipelines, combined with the Pre-computed Radiance Transfer (PRT) formulation and decoded with spherical harmonic functions, real-time global illumination was rendered. In the pre-computation, high-frequency signals were reconstructed by wavelets, so that high-frequency content is not lost and the simulated scene has high realism. According to the visibility information of the large-scale scene, the scene was subdivided with an adaptive subdivision method, giving high rendering efficiency. The experimental results show that the proposed method can generate global illumination quickly in the simulation system, with high efficiency and good realistic quality.
    Improved algorithm of preserving global and local properties based on Riemannian manifold learning
    2010, 30(12):  3301-3303. 
    Abstract   PDF (617KB)
    An improved algorithm preserving global and local properties based on Riemannian Manifold Learning (RML) was proposed to solve the problem that RML cannot preserve the local geometric properties of neighboring data. In the algorithm, all points were first projected by Principal Component Analysis (PCA), and then a neighborhood graph was constructed. The key step was that the data points were divided into two parts: for the k neighboring nodes of a base point, a weight preserving the local properties of the base point and its neighboring nodes was adopted to obtain the low-dimensional embedding coordinates; for the other points, the RML algorithm was still used. Thus the new algorithm can both preserve the metrics at all scales and keep the local geometric properties of neighborhoods to the maximum extent. The experimental results demonstrate the validity and real-time performance of the algorithm.
    Collision detection algorithm based on mixed bounding box
    2010, 30(12):  3304-3306. 
    Abstract   PDF (833KB)
    A collision detection algorithm for complex objects based on a mixed bounding box combining k-DOPs and bounding spheres was proposed. In the preprocessing stage, a bounding box binary tree of each object was established, with a k-DOP at the inner layer of each node and a bounding sphere at the outer layer. In the collision detection stage, a sphere-sphere intersection test was first used for a fast overlap test on the outer layer, eliminating the possibility of collision between objects far apart; the contact status of objects in closer proximity was then determined accurately by a k-DOP to k-DOP test on the inner layer. The experimental results, compared with those of QuickCD, show that the proposed algorithm is efficient for collision detection between complex objects.
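    The fast outer-layer rejection described above reduces to a sphere-sphere overlap test: two bounding spheres intersect iff the distance between their centers is at most the sum of their radii. A minimal sketch with illustrative coordinates:

```python
# Sphere-sphere overlap test used as the cheap outer-layer rejection.
def spheres_overlap(c1, r1, c2, r2):
    """True iff spheres (center, radius) intersect; compares squared distance to avoid sqrt."""
    d2 = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d2 <= (r1 + r2) ** 2

far   = spheres_overlap((0, 0, 0), 1.0, (10, 0, 0), 2.0)   # reject: no k-DOP test needed
close = spheres_overlap((0, 0, 0), 1.0, (2.5, 0, 0), 2.0)  # descend to inner k-DOP test
```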
    Multi-camera face gesture recognition
    2010, 30(12):  3307-3310. 
    Abstract   PDF (894KB)
    The Active Shape Model (ASM) algorithm was used for precise location of facial feature points; the feature points in the multi-camera images were then matched to determine their three-dimensional spatial positions using binocular vision and three-dimensional camera ranging techniques, from which the human face gesture was estimated. The experimental results show that the approach produces very accurate face gesture recognition results.
    Hand shape feature location method
    2010, 30(12):  3311-3313. 
    Abstract   PDF (759KB)
    The precision of hand shape identification is influenced by the accuracy of feature location. A feature point location method for recognition based on relative finger length and width was presented. First, the fingertips were located by linear fitting; then the finger roots were located by direction tracking; finally, the root points were determined. Experiments were carried out on the HKUST database using a feature vector matching algorithm: the identification rate of the proposed method reached 84.35%, while that based on actual measurement reached 84.62%, so the effect of the proposed method is close to manual measurement. The experimental results verify the validity of the proposed approach for personal authentication.
    Face recognition based on supervised incremental isometric mapping
    2010, 30(12):  3314-3316. 
    Abstract   PDF (625KB)
    In view of the disadvantages that the Isometric mapping (Isomap) algorithm cannot perform dimensionality reduction on test samples collected subsequently, and does not make use of the class information of sample points, a face recognition method using a supervised incremental isometric mapping algorithm (SIIsomap) was proposed, combined with wavelet transformation to preprocess the images. The experimental results on the ORL database show that, compared with Isomap, the SIIsomap algorithm greatly reduces the computation time for handling additional samples and improves recognition accuracy.
    New document image distortion correction method
    2010, 30(12):  3317-3320. 
    Abstract   PDF (1324KB)
    Document image distortion often appears in images captured by a camera, which may induce recognition mistakes by Optical Character Recognition (OCR) software. In this paper, connected component labeling was used to detect words and text lines, and then, based on the middle dots of the words, linear fitting was used to obtain the word baselines. Finally, according to the word baselines and the vertical displacement, word rotation and vertical shifting were applied to obtain the corrected image. Compared with the traditional method, the computation of the word baselines and the vertical displacement in this paper is independent of the document's content, which guarantees the precision of the word slopes and aligns all words to the same line. The computational complexity of the algorithm was discussed at the end of the paper, and comparative experiments with the traditional method were made. The experimental results show the proposed method is efficient and robust.
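    The baseline step can be sketched as an ordinary least-squares line fitted through the word mid-points of a text line, with each word's vertical displacement measured against that line; the sample points below are illustrative, not data from the paper:

```python
# Least-squares baseline through word mid-points, plus per-word displacement.
def fit_line(pts):
    """Ordinary least squares y = a*x + b through (x, y) points."""
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# word mid-dots on a slightly tilted text line: exactly y = 0.1*x + 5
mids = [(0, 5.0), (10, 6.0), (20, 7.0), (30, 8.0)]
a, b = fit_line(mids)
shift = [y - (a * x + b) for x, y in mids]   # vertical displacement per word
```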
    Algorithm for vehicle license plate location based on edge-color pairs' distribution characteristics
    2010, 30(12):  3321-3324. 
    Abstract   PDF (1061KB)
    A new locating approach based on edge-color pairs and their distribution characteristics was presented. Firstly, using the fixed color pairing between the background and characters of the plate region, and the distance restrictions of the edge-color pairs, several filters were applied to the vehicle images to make the edge-color pairs more prominent. Then the license plate was located quickly according to the statistical distribution characteristics of the edge-color pairs. The experimental results show that this algorithm is a quick and effective location method.
    Number location and character segmentation under complex background
    2010, 30(12):  3325-3326. 
    Abstract   PDF (815KB)
    In this paper, an edge detection algorithm combined with number distribution was proposed to locate the number area, and a projection plus maximal connected region algorithm was proposed to segment the characters. An improved mathematical morphology operator was used to extract the edges of the gray image; the maximal connected region algorithm was used to compute the skew angle for image skew correction. After that, the number area was located with the help of number layout information. The maximal connected region algorithm was then used to filter block noise, and the projection feature and character width were used to segment the characters. This method effectively overcomes disturbances due to complex background, ink depth, stains and wear.
    Target extraction in infrared image based on spiking neural networks
    2010, 30(12):  3327-3330. 
    Abstract   PDF (1357KB)
    A Spiking Neural Network (SNN) for target extraction in infrared images was designed by simulating a bio-inspired information processing mechanism. Firstly, the infrared image stimuli were transformed into spike trains by the neurons in the input layer of the network; then, the target outline in the infrared image was encoded by the density of the spike trains in the middle layer; finally, the outline pixels of the infrared target were determined by whether the firing density of the corresponding neuron in the network's output layer exceeded a threshold. The experimental results show that the designed spiking neural network performs well in infrared target extraction and is more biologically realistic than existing methods.
    Information security
    Improved direct anonymous cross-domain authentication scheme
    2010, 30(12):  3331-3333. 
    Abstract   PDF (462KB)
    Concerning the problem that the existing direct anonymous authentication scheme cannot work effectively across different domains, a new direct anonymous authentication scheme was designed based on the original one. It takes the certificate issuer outside the domain as a proxy, with certificates issued directly by that issuer, which effectively solves the privacy protection problem of trusted computing platforms across different trusted domains, and it supports permission settings for trusted computing platforms in different trust domains. Analysis demonstrates that the new scheme meets the requirements of anonymity, unforgeability and resistance to replay attacks, and it improves the efficiency of the certification scheme.
    Multi-secret sharing scheme among weighted participants
    2010, 30(12):  3334-3336. 
    Abstract   PDF (598KB)
    Based on the security of the RSA (Rivest-Shamir-Adleman) cryptosystem and Hash functions, a threshold multi-secret sharing scheme with different weights was proposed. In the scheme, each participant can share many secrets with the other participants while holding only one secret shadow. Each participant's secret shadow is selected and kept by himself, and even the secret dealer knows nothing about it. In the recovery phase, a participant only needs to submit a pseudo-shadow instead of his secret shadow. The scheme requires no secure channel between each participant and the dealer, and can guarantee secure delivery and verify the authenticity of information. Analyses show that the scheme is more secure and feasible than existing ones.
    Efficient certificateless parallel multi-signature scheme
    2010, 30(12):  3337-3340. 
    Abstract   PDF (689KB)
    A new efficient certificateless parallel multi-signature scheme was proposed, based on the advantages of certificateless public key cryptosystems and the demands of digital multi-signature. The efficiency of the proposed scheme relies on the fact that the pairing computation is not related to the number of signers. The scheme decreases the number of pairings and is more efficient than known multi-signature schemes based on bilinear pairing technology. Furthermore, under the random oracle model and relying on the hardness of the Computational Diffie-Hellman (CDH) problem, the new scheme was proved secure against the two types of attackers of certificateless parallel multi-signature.
    New certificateless proxy blind signature scheme
    WEI Chun-yan, CAI Xiao-qiu
    2010, 30(12):  3341-3342. 
    Abstract   PDF (500KB)
    Certificateless public key cryptography is both secure and efficient, because it avoids the cost of certificate management in traditional public key cryptography and the key escrow problem inherent in ID-based public key cryptography. The study of proxy blind signature construction and application shows that existing certificateless proxy blind signature schemes are very few, and that proxy blind signature schemes in certificateless public key systems can better fulfill the security and efficiency required in electronic voting, e-banking and other fields. Based on bilinear pairings and the discrete logarithm problem, a new certificateless proxy blind signature scheme was given that fulfills the properties of blindness, non-forgeability, identifiability, non-repudiation and so on.
    Self-renewal Hash chain based on geometric method
    2010, 30(12):  3343-3345. 
    Abstract   PDF (516KB)
    In order to solve the problem of information leakage in existing self-updating Hash chains, the authors proposed a new self-renewal Hash chain based on a geometric method. Utilizing the property of the circle that n+1 given points on the same n-dimensional circle determine its equation uniquely, a data dispersal and restoration algorithm was designed to construct a new self-renewal Hash chain. The performance analysis shows that the scheme is simple in calculation and highly secure; besides, it can change the root s while keeping most of the n points unchanged.
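    For context, the underlying (non-renewing) Hash chain works as below: the root s is hashed n times, the last value is published as the anchor, and values are revealed in reverse order, each verified with one extra hash. SHA-256, the chain length and the names are illustrative assumptions; the circle-based renewal itself is not shown:

```python
# A plain Hash chain: build forward from the root, verify backward.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def build_chain(s: bytes, n: int):
    """Return [s, h(s), h(h(s)), ..., h^n(s)]; the last value is the public anchor."""
    vals = [s]
    for _ in range(n):
        vals.append(h(vals[-1]))
    return vals

chain = build_chain(b"secret-root", 4)
anchor = chain[-1]
# verifier: accepts chain[3] because hashing it once reproduces the known anchor
ok = h(chain[3]) == anchor
```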
    Trust evaluation for wireless sensor networks based on trust-cloud
    2010, 30(12):  3346-3348. 
    Abstract   PDF (565KB)
    The existing cloud-based trust model for wireless sensor networks does not take into account the timeliness of trust relationships between nodes, and its trust combination, which simply averages trust values, does not accord with people's intuitive judgments. To solve these problems, a new trust evaluation model for wireless sensor networks based on trust-cloud was proposed. The model distributes the weights of the recent trust-cloud and the historical trust-cloud according to the recent behavior of nodes, and adjusts the weights of the direct trust-cloud and the recommended trust-cloud using their respective similarity degrees. The simulation results indicate that this model is superior to the original one: it can not only resist attacks by malicious nodes, but also identify malicious nodes accurately in real time.
    Static detection of polymorphic attack codes
    2010, 30(12):  3349-3353. 
    Abstract   PDF (915KB)
    A new static analysis approach was proposed to discover polymorphic attack codes hidden in network data flows. The idea of abstract execution was first adopted to construct the control flow graph; then both symbolic execution and taint analysis were used to detect attack codes; finally, recognition of NOOP instruction sequences of a predefined length was used to assist detection. The experimental results show that the approach correctly distinguishes attack codes from regular network flows.
    P2P-Botnet detection based on multi-stage filtration
    2010, 30(12):  3354-3356. 
    Abstract   PDF (537KB)
    A new method for detecting P2P-Botnets based on the analysis of network streams was presented. Firstly, using the burstiness and distributed characteristics of P2P streams, P2P nodes are separated from common nodes. Then, based on the communication symmetry and cohesion characteristics of node pairs in a P2P network, the set of peers belonging to one P2P network is extracted using the K-means clustering method. Finally, by comparing the common actions of the peers in every P2P network, a P2P-Botnet can be distinguished from ordinary P2P networks. Plenty of experiments conducted in a LAN environment verified the efficiency and precision of the proposed method.
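    The clustering step can be sketched as plain 2-means over toy (symmetry, cohesion) features per node pair; the feature values, k=2 and the simple deterministic initialization are illustrative assumptions, not the paper's configuration:

```python
# Plain 2-means over per-node-pair (symmetry, cohesion) features.
def kmeans2(points, iters=20):
    """2-means with a simple deterministic init (first and last point)."""
    centers = [points[0], points[-1]]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)        # assign to nearest center
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]    # recompute means
    return centers, groups

# P2P-like pairs (high symmetry/cohesion) vs. ordinary traffic (low)
feats = [(0.9, 0.8), (0.85, 0.9), (0.95, 0.85), (0.1, 0.2), (0.15, 0.1), (0.2, 0.15)]
centers, groups = kmeans2(feats)
```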
    Auto-detection system of suspicious file based on virtual machine technology
    2010, 30(12):  3357-3359. 
    Abstract   PDF (776KB)
    Concerning the fact that signature-based technology is unable to detect new and unknown malicious programs, an automatic detection system for suspicious files based on virtual machine technology and behavior analysis was proposed. The working diagram of the detection system was introduced with emphasis, the module framework of the system's management center and detection center was given, and their technical principles were analyzed in detail. The experimental results show that the system can rapidly judge the danger level of a detected file and has a long life cycle.
    Model analysis of multiple-worm propagation
    Song LiPeng
    2010, 30(12):  3360-3362. 
    Abstract   PDF (515KB)
    The complex interactions among Internet worms have great impact on worm dynamics; to contain worm propagation, it is necessary to characterize these interactions. However, previous models have been single-worm models. Therefore, a two-worm interaction model focusing on the influence of a cooperative worm was presented in this paper. The model's equilibria and their stability conditions were obtained mathematically and then verified by simulation. The analytical and simulation results show that a cooperative worm can facilitate the spread of other worms, and that the cooperative worm can be terminated based on its stability condition and timely patching technology.
    Design of Single Sign-on for hybrid architecture based on Web service
    2010, 30(12):  3363-3365. 
    Abstract   PDF (746KB)
    To solve the problem of repeated user logons across various kinds of applications based on hybrid architectures and in different domains, a single sign-on architecture was proposed. On the basis of analyzing the advantages and disadvantages of existing single sign-on models, and combined with key technologies such as Web services, Applets and reverse proxy, two core problems were resolved: single sign-on across mixed B/S and C/S applications, and cross-domain single sign-on. Meanwhile, the security and performance of the architecture are well guaranteed by the reverse proxy and related encryption technology. The results show that this architecture performs well and is widely applicable.
    Database and data mining
    Multiple rough fuzzy set model
    2010, 30(12):  3366-3370. 
    Abstract   PDF (758KB)
    To fully describe the overlap among knowledge granules, the significance differences among objects and the polymorphism of objects, an expansion based on multi-sets was made on the domain of the rough fuzzy set model in the sense of Dubois' rough fuzzy sets, and a multiple rough fuzzy set model was put forward. The corresponding definitions, theorems and properties were fully described, including the definitions of multiple rough fuzzy approximation sets, approximation accuracy and definable sets, proofs of their important properties, the relations among rough approximation operators in multiple rough sets, and the relations between Dubois' rough fuzzy sets and multiple rough fuzzy sets. Multiple rough fuzzy sets can conveniently discover associated knowledge from data with one-to-many dependencies and fuzzy properties.
    Community-partition-based online analytical processing query optimization
    2010, 30(12):  3371-3373. 
    Abstract   PDF (740KB)
    In a Peer-to-Peer (P2P) environment, as the number of nodes involved in an On-Line Analytical Processing (OLAP) query increases, network congestion is aggravated and OLAP query efficiency is reduced. Therefore, this paper proposed an optimized OLAP query method based on community partition. A virtual community network was constructed with the method, and an algorithm of Community Partition Data-cube Search (CPDS) was designed on this structure. The experimental results show that the algorithm can effectively avoid increasing the network burden as the number of OLAP nodes grows; it thus reduces network congestion and optimizes OLAP query efficiency, improving the decision-analysis performance of OLAP in P2P environments.
    Rough set model in incomplete grey information systems
    2010, 30(12):  3374-3376. 
    Abstract   PDF (540KB)
    This paper considered incomplete information systems in which attribute values are interval grey numbers. Firstly, some operational properties of interval grey numbers were presented according to their definition, and an incomplete grey information system was defined. Then, based on grey similarity degree, a variable precision grey similarity relation was proposed, and the upper and lower approximations were constructed from it. Finally, a practical method for computing reductions was given, and the validity of the proposed technique was verified on a typical case.
    Clustering algorithms for mixed attributes based on rough set
    2010, 30(12):  3377-3379. 
    Abstract   PDF (521KB)
    Objects are strictly divided into clusters by conventional algorithms; however, object boundaries often cannot be strictly classified. The rough-set-based k-means and leader clustering algorithms assign a data object to a cluster's upper or lower approximation using rough sets, which provides a new perspective for dealing with uncertainty and solves the problem of uncertain boundary regions. The problem is that neither algorithm can deal with mixed-valued data, and the clustering results depend significantly on the initial values. This paper introduced a distance definition for mixed-valued data, put forward an improved method for selecting the initial values, and gave a clustering algorithm for mixed-valued data based on rough sets. Finally, a simulation experiment was carried out. The simulation results show that, when the number of clusters is uncertain, the clustering accuracy of the algorithm is significantly better than that of the traditional k-means algorithm.
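    A distance for mixed-valued data of the kind introduced above can be sketched as below: numeric attributes contribute a range-normalized absolute difference, categorical attributes a 0/1 mismatch. The equal weighting and toy attributes are illustrative assumptions, not the paper's exact definition:

```python
# Distance over mixed numeric/categorical attribute tuples.
def mixed_distance(a, b, numeric_ranges):
    """Sum per-attribute distances; numeric_ranges[i] is the attribute's value
    range, or None if the attribute is categorical."""
    d = 0.0
    for x, y, rng in zip(a, b, numeric_ranges):
        if rng is None:                  # categorical: 0/1 mismatch
            d += 0.0 if x == y else 1.0
        else:                            # numeric: normalized absolute difference
            d += abs(x - y) / rng
    return d

ranges = [100.0, None]                   # age range 100; color categorical
d_close = mixed_distance((30, "red"), (35, "red"), ranges)   # 0.05
d_far   = mixed_distance((30, "red"), (80, "blue"), ranges)  # 0.5 + 1.0 = 1.5
```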
    Weighted fuzzy clustering algorithm for relational data with multiple medoids
    2010, 30(12):  3380-3384. 
    Abstract ( )   PDF (771KB) ( )  
    Related Articles | Metrics
    It is clearly inadequate for the k-medoids algorithm to use only one point to represent a whole cluster, which affects the accuracy of clustering results. Therefore, a weighted fuzzy clustering algorithm for relational data with multiple medoids was proposed. In the proposed algorithm, multiple objects in each cluster carry different weights, called medoid weights, representing their degrees of representativeness in that cluster, so each cluster is represented by multiple objects instead of a single one. The experimental results show that the proposed algorithm captures the underlying structure of the data more accurately and provides richer information for describing the resulting clusters.
    Land-use data integration based on knowledge
    2010, 30(12):  3385-3387. 
    Abstract ( )   PDF (851KB) ( )  
    Related Articles | Metrics
    Manual integration of land-use maps is highly subjective, and the integration rules are difficult to express in precise mathematical language and form. This paper therefore introduced the concepts of an integrated knowledge base and an inference engine, and proposed a method for automatically integrating land-use data based on knowledge learning. The functions of integrating sporadic surface features, linear features and polygons were then achieved by an inference engine driven by the integrated knowledge base. The importance of man-machine synergy was also taken into account: the computer participates in the map integration process to improve integration efficiency.
    Research of weighting exponent of fuzzy C-means algorithm based on fuzzy relevance
    2010, 30(12):  3388-3390. 
    Abstract ( )   PDF (679KB) ( )  
    Related Articles | Metrics
    To address the lack of theoretical foundation and effective evaluation methodology for determining the fuzzy weighting exponent "m" when minimizing the Fuzzy C-Means (FCM) clustering objective function, an algorithm for choosing the weighting exponent based on fuzzy relevance was put forward. Firstly, a validity function was defined based on fuzzy relevance; then the validity of FCM clustering was calculated by Gaussian iteration, and the result was fed back to adjust the weighting exponent, so that "m" converges to a stable optimal solution. Theoretical analysis and experiments prove the algorithm effective, and the weighting exponent "m" obtained by it conforms to the expected result.
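To see what the exponent m controls, here is a minimal sketch of one standard FCM membership update on 1-D data. This shows only the textbook update rule, not the paper's relevance-based selection of m; as m approaches 1 the memberships become crisp, while larger m spreads membership more evenly.

```python
def fcm_memberships(points, centers, m):
    """Standard FCM membership update: u_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1))."""
    u = []
    for x in points:
        # guard against zero distance when a point coincides with a center
        d = [abs(x - c) or 1e-12 for c in centers]
        row = [
            1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1)) for j in range(len(centers)))
            for i in range(len(centers))
        ]
        u.append(row)
    return u
```

With m = 2 a point midway between two centers receives membership 0.5 in each; each row always sums to 1.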
    Fast AGM algorithm and application to three-dimensional structure analysis
    2010, 30(12):  3391-3396. 
    Abstract ( )   PDF (956KB) ( )  
    Related Articles | Metrics
    The Apriori-based Graph Mining (AGM) algorithm is simple and based on recursive statistics. When the graph data set is very large, the sub-graph isomorphism problem causes many redundant sub-graphs to be generated during candidate generation, which increases computation time. In this paper, an improved method was proposed that reduces redundant sub-graph candidates by adding extra constraints, and that uses three-dimensional coordinates to compute the distance between vertices of a graph, adding it to the edge labels so that three-dimensional graph-structured data can be handled. Chemical compounds were analyzed with the improved algorithm to describe the correlation between their three-dimensional chemical structure and physiological activity, and the computation time under different conditions was examined. The experimental results show that the improved algorithm cuts down computation time when more edge labels are present and improves efficiency.
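The edge-labeling idea above can be sketched in a few lines. This is an illustrative interpretation, with an assumed bin width: the Euclidean distance between two 3-D vertex coordinates is discretized and attached to the edge label, so geometric information survives frequent-subgraph mining.

```python
import math

def edge_label(base_label, p, q, bin_width=0.5):
    """Augment an edge label with a discretized 3-D inter-vertex distance.

    base_label: the original chemical edge label (e.g. bond type);
    p, q: 3-D coordinates of the two vertices; bin_width is an assumption.
    """
    d = math.dist(p, q)
    return (base_label, int(d // bin_width))
```

Two edges with the same bond type but clearly different geometry then receive different labels and are never merged into the same candidate sub-graph.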
    SPARQL ontology query based on natural language understanding
    2010, 30(12):  3397-3400. 
    Abstract ( )   PDF (774KB) ( )  
    Related Articles | Metrics
    To enable users to conveniently access ontology knowledge, a Simple Protocol and RDF Query Language (SPARQL) ontology query method based on natural language understanding was put forward. Users' natural language queries were analyzed using the Stanford Parser, and query triples were constructed according to the grammar, greatly reducing the number of combinations compared with keyword-based methods. Combined with a user dictionary, the terms of a query triple could be mapped more accurately to ontology entities. Score calculation considered not only the lexical and semantic similarity of words but also the ambiguity of concepts, returning the most specific concept possible. Ontology reasoning was employed to obtain information hidden in the ontology, and the query was filtered and constrained to improve accuracy. Users interacted with the system through a graphical user interface, selected the desired results, and finally the query results were returned in tree form with related information visible. The experimental results show that the proposed method achieves the expected results.
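The final step, turning a mapped query triple into SPARQL, can be illustrated as follows. This is a hypothetical sketch: the prefix and property names are assumptions, not the ontology or mapping used in the paper.

```python
def triple_to_sparql(subj_var, predicate, obj):
    """Render one (variable, predicate, object) query triple as a SPARQL SELECT.

    subj_var is a variable name without '?'; predicate and obj are already
    mapped to ontology entities (assumed prefixed names here).
    """
    return "SELECT ?%s WHERE { ?%s %s %s . }" % (subj_var, subj_var, predicate, obj)
```

For instance, the parsed query "which persons teach Databases" might map to the triple (x, `:teaches`, `:Databases`) before being rendered; real queries would combine several such triple patterns in one WHERE clause.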
    Annotating Web document in multi-granularity way by statistical topical model
    2010, 30(12):  3401-3406. 
    Abstract ( )   PDF (1269KB) ( )  
    Related Articles | Metrics
    Since available Web document annotation techniques are weak in annotation completeness, the Latent Dirichlet Allocation (LDA) model was applied to semantic annotation. By embedding document domain information into the LDA model, a new model called domain-enabled LDA was introduced. An association between the statistical topic model and a domain ontology was established, so that the generated latent topics could be interpreted by concepts and explicit semantics of the document could be acquired. Because the LDA model assigns a topic to each word in a document, a multi-granularity annotation strategy was proposed. Experiments on 20newsgroups and WebKB show that the proposed domain-enabled LDA model improves annotation effectiveness and that the multi-granularity annotation method helps different types of queries in information retrieval.
    Services and resources classification algorithm based on interests of users
    2010, 30(12):  3407-3409. 
    Abstract ( )   PDF (646KB) ( )  
    Related Articles | Metrics
    Services and resources in autonomous networks were classified by an improved naive Bayesian classification algorithm that integrates the Chinese Library Classification, so that classification accuracy could be improved effectively for users with different interests. The experimental results show that this algorithm performs better than the traditional naive Bayesian algorithm.
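The baseline the abstract compares against can be sketched as a multinomial naive Bayes classifier with Laplace smoothing. The paper's improvement (weighting by user interests and the Chinese Library Classification) is not modeled here; the categories and feature words below are made-up examples.

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (word_list, label). Returns (label counts, word counts, vocab)."""
    labels = Counter(label for _, label in docs)
    words = defaultdict(Counter)
    vocab = set()
    for ws, label in docs:
        words[label].update(ws)
        vocab.update(ws)
    return labels, words, vocab

def predict(model, ws):
    """Return the label maximizing log P(label) + sum log P(word|label), smoothed."""
    labels, words, vocab = model
    total, v = sum(labels.values()), len(vocab)
    best, best_lp = None, float("-inf")
    for label in labels:
        lp = math.log(labels[label] / total)
        n = sum(words[label].values())
        for w in ws:
            lp += math.log((words[label][w] + 1) / (n + v))  # Laplace smoothing
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Training on a handful of labeled term lists is enough for the classifier to route a new description to the more likely category.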
    Typical applications
    Research of stroboscope system for checking droplets of microarrayer device
    2010, 30(12):  3410-3412. 
    Abstract ( )   PDF (1056KB) ( )  
    Related Articles | Metrics
    It is difficult to observe the droplets of a microarrayer device with the naked eye because they are small and fast. Therefore, a special stroboscope system was developed. In this paper, the design of the system's key hardware and software technologies was described in detail. The techniques of synchronized triggering, delayed flashing and backlight illumination were adopted to obtain clear images of the droplets. The droplet search region was set interactively on the software interface, and the spray effect was determined automatically from droplet information such as size, number and flight angle, calculated from the outer contour of each droplet extracted by image processing. The developed system is easy to use and fully functional, with low cost and small size.
    Implementation of driver for embedded temperature collection system based on TE2440-II
    2010, 30(12):  3413-3415. 
    Abstract ( )   PDF (584KB) ( )  
    Related Articles | Metrics
    DS18B20, a widely applicable digital temperature sensor, has been used in various temperature acquisition systems, but no driver for it exists in recent Linux kernel versions. The authors analyzed and designed a driver for the DS18B20 under Linux 2.6.28 on the TE2440-II hardware platform. The driver was implemented using the Linux character-device driver model and the operating principles of the DS18B20. Application of the new driver shows that it reduces total operation cost compared with traditional temperature collection schemes based on single-chip computers and PCs, and it has been used successfully in an intelligent greenhouse temperature monitoring system.
    Analysis of influential factors in software test
    2010, 30(12):  3416-3418. 
    Abstract ( )   PDF (717KB) ( )  
    Related Articles | Metrics
    Software testing plays an increasingly important role in improving software quality, and research on software testing approaches to improve test quality has become popular. In this study, to find out how factors such as size, complexity, staff experience and schedule affect test quality, a linear regression model was constructed from data collected on 37 test projects, and a significance test was carried out on the linear fit. The authors find that schedule pressure has a significant linear relationship with software test quality.
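The kind of linear fit described above can be sketched with ordinary least squares on a single factor. The data below are hypothetical, not the 37 projects from the study; R-squared here stands in for a full significance test.

```python
def linfit(x, y):
    """Simple linear regression y = a + b*x by least squares; returns (a, b, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot    # R^2: fraction of variance explained
```

An R-squared close to 1 with a nonzero slope is what "significant linear relationship" amounts to in this setting; a full study would also report the slope's t-statistic and p-value.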
    Control strategy and reliability study of maglev automatic train operation system
    2010, 30(12):  3419-3422. 
    Abstract ( )   PDF (678KB) ( )  
    Related Articles | Metrics
    To improve the safety and reliability of the maglev Automatic Train Operation (ATO) system, a feasible automatic control strategy was proposed. Based on a two-unit redundant model of the system architecture, a communication handshake strategy was used to achieve system communication and role recognition, and the automatic control strategy was designed to ensure correct execution of operation plans. To verify the feasibility of the system, the failure rates were divided into several parts; according to the effect of each part, a Markov model was used to analyze reliability and safety. A simulation analysis shows that the strategy is effective.
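A small example of the kind of Markov reliability computation the abstract describes: steady-state availability of a two-unit redundant system with an assumed per-unit failure rate `lam`, repair rate `mu`, and a single repair crew. The actual partition of the maglev ATO failure rates is not reproduced here.

```python
def two_unit_availability(lam, mu):
    """Steady-state availability of a 2-unit parallel system (single repairer).

    Markov states by number of failed units: 0 -(2*lam)-> 1 -(lam)-> 2,
    with repairs 1 -(mu)-> 0 and 2 -(mu)-> 1. Balance equations give the
    state probabilities as ratios to p0.
    """
    r1 = 2 * lam / mu          # p1 / p0
    r2 = (lam / mu) * r1       # p2 / p0
    p0 = 1 / (1 + r1 + r2)
    return p0 * (1 + r1)       # P(at least one unit up) = p0 + p1
```

With repair much faster than failure the availability is close to 1, and it degrades monotonically as the failure rate grows, which is what the per-part failure-rate analysis quantifies.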
    File operation monitoring schema based on Detours technology
    2010, 30(12):  3423-3426. 
    Abstract ( )   PDF (777KB) ( )  
    Related Articles | Metrics
    Two Hook API technologies commonly used for file operation monitoring were discussed. Experiments revealed the instability of IAT Hook, and Detours technology, based on Inline Hook, was suggested to solve the explorer.exe fault problem. Finally, the paper put forward an implementation of file operation monitoring and described Detours. Tests show that the scheme is effective for file security protection.
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803
  028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn