Table of Contents
01 October 2010, Volume 30 Issue 10
Artificial intelligence
Speech recognition lattice-generating algorithm with forward-backward language model
2010, 30(10): 2563-2566.
Abstract | PDF (759KB) | Related Articles | Metrics
In order to lighten the heavy computational burden of one-pass lattice-generating algorithms for speech recognition, a fast two-pass decoding algorithm based on a forward-backward language model was proposed. The forward and backward language models were applied to the first and second decoding passes respectively. Furthermore, optimization rules were given to reduce the impact of language model mismatch and to avoid its side effects on recognition results. The experimental results show that the algorithm speeds up decoding without decreasing recognition accuracy.
Emergency collaborative Petri net modeling and collaborative detection based on multi-Agent system
2010, 30(10): 2567-2571.
Abstract | PDF (668KB) | Related Articles | Metrics
To solve the collaboration problem in the emergency disposal process, an emergency collaboration Petri net model focusing on the basic structure of Agents and multi-Agent collaboration was proposed, together with a collaboration detection algorithm for the model. Finally, the emergency response to a chlorine leak at an enterprise was taken as an example: the emergency collaboration Petri net model was established, and the feasibility and effectiveness of the proposed approach were verified.
Forgetting revision of Agent belief
2010, 30(10): 2572-2574.
Abstract | PDF (612KB) | Related Articles | Metrics
To preserve the principle of minimal change, a forgetting contraction operator was first defined on the basis of forgetting theory; this operator satisfies the necessary AGM contraction postulates. A forgetting revision operator was then built through the Levi Identity. Finally, a forgetting revision method and a multiple forgetting revision algorithm were proposed. The case study shows that the algorithm is feasible and effective and achieves a satisfactory revision.
Shuffled frog leaping algorithm based on differential disturbance
Peng-Jun ZHAO
2010, 30(10): 2575-2577.
Abstract | PDF (423KB) | Related Articles | Metrics
The basic Shuffled Frog Leaping Algorithm (SFLA) is easily trapped in local optima and has low convergence precision when applied to complex functions. To overcome these shortcomings, an improved SFLA based on the mutation idea of Differential Evolution (DE) was proposed. The proposed algorithm used information from other individuals in the sub-group to perturb the local updating strategy. The experimental results show that the improved SFLA solves complex functions better than the compared algorithms, with high optimization efficiency, good global performance and stable optimization outcomes.
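The DE-style perturbation described in the abstract above can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: the function name, the update form (leap toward the sub-group best plus a scaled difference of two other sub-group members) and all coefficient values are assumptions.

```python
import random

def de_perturbed_leap(worst, best, others, F=0.5, lo=-5.12, hi=5.12):
    """Perturb the worst frog's leap toward the sub-group best, adding a
    DE-style scaled difference of two randomly chosen sub-group members.
    Bounds and the scale factor F are illustrative assumptions."""
    r1, r2 = random.sample(others, 2)
    new_pos = []
    for w, b, x1, x2 in zip(worst, best, r1, r2):
        step = random.random() * (b - w) + F * (x1 - x2)  # leap + DE difference term
        new_pos.append(min(hi, max(lo, w + step)))        # clamp to the search bounds
    return new_pos
```

In a full SFLA, this update would replace the plain leap whenever the leap alone fails to improve the worst frog.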
Improved particle swarm optimization based on adaptive dynamic neighborhood and generalized learning
2010, 30(10): 2578-2581.
Abstract | PDF (593KB) | Related Articles | Metrics
As Particle Swarm Optimization (PSO) may easily get trapped in a local optimum, an improved PSO based on an adaptive dynamic neighborhood and comprehensive learning, named ADPSO, was proposed. In ADPSO, the neighbors of each particle were dynamically constructed in terms of the best-performing particle in the current particle's neighborhood. The learning mechanism of each particle was then separated into three parts: its own historical best position, the best neighbor, and the global best. A random position around the particle's new position was also added, increasing the probability that the particle moves toward a promising region. The test results on benchmark functions show that ADPSO achieves better solutions than other improved PSO variants, and that it is an effective algorithm for solving multi-objective problems.
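The three-part learning mechanism above can be sketched as a single velocity/position update. This Python sketch shows only the standard PSO step with three attractors; the coefficient values, and how the dynamic neighborhood best is chosen, are assumptions rather than the paper's settings.

```python
import random

def pso_step(pos, vel, pbest, nbest, gbest, w=0.7, c=1.5):
    """One PSO update blending three attractors, mirroring the three-part
    learning mechanism: own history, neighborhood best, and global best."""
    new_vel, new_pos = [], []
    for x, v, p, n, g in zip(pos, vel, pbest, nbest, gbest):
        v = (w * v
             + c * random.random() * (p - x)   # cognitive: own historical best
             + c * random.random() * (n - x)   # social: dynamic neighborhood best
             + c * random.random() * (g - x))  # global best
        new_vel.append(v)
        new_pos.append(x + v)
    return new_pos, new_vel
```

ADPSO's extra step of sampling a random position around the new position would follow this update.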
Instructed-crossover genetic algorithm based on gradient information
2010, 30(10): 2582-2584.
Abstract | PDF (610KB) | Related Articles | Metrics
Concerning the blind search for crossover individuals in the solution space, which leads to low efficiency and slow convergence in the later stage of the simple Genetic Algorithm (GA), an improved instructed-crossover genetic algorithm based on gradient information was proposed. It performed crossover by choosing special individuals from the current population within a set range of the negative gradient of the objective individual, bringing the offspring closer to the optimal solution and guaranteeing the purposefulness and feasibility of the crossover operation. The simulations on four typical test functions indicate that the proposed algorithm can greatly improve the efficiency and precision of searching for the optimum value.
Application of improved particle swarm optimization in orthogonal codes
2010, 30(10): 2585-2587.
Abstract | PDF (444KB) | Related Articles | Metrics
In order to avoid interference between radars of the same model, radar signals are generally required to be orthogonal, so designing orthogonal signals with low auto-correlation and cross-correlation is the key to anti-jamming. For frequency-coded radar signals, an improved Particle Swarm Optimization (PSO) algorithm was used to optimize the selected signal-coding sequences so as to meet the objective function and find the orthogonal code group. The crossover and mutation ideas of the Genetic Algorithm (GA) were introduced to overcome the slow convergence and local optima of the Simple PSO (SPSO) algorithm, and the design results were analyzed. The results show that the method is effective and feasible, and that its performance is superior to that of SPSO, Simulated Annealing (SA) and the Hybrid Genetic Algorithm (HGA).
Two-level combination approach for solving conflict evidences
2010, 30(10): 2588-2591.
Abstract | PDF (763KB) | Related Articles | Metrics
Considering that the Dempster combination rule in Dempster-Shafer Theory (DST) has disadvantages when dealing with highly conflicting evidence, a new two-level combination method for fusing conflicting evidence was proposed. Distinguishing between highly and lowly conflicting evidence, the method adopted the PCR6 rule of Dezert-Smarandache Theory (DSmT) on the first level of combination to resolve high conflict, and used the Dempster rule on the second level to keep fast convergence and good computational performance, so that evidence with various degrees of conflict could be managed effectively and reasonably. The validity of the proposed method was demonstrated by calculation examples.
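The Dempster rule used on the second level of combination can be written compactly: multiply the masses of the two sources, accumulate products on set intersections, and renormalize by the mass assigned to conflict. This sketch is the standard rule, not the paper's two-level scheme; the PCR6 first level is not shown.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are
    frozensets: m(C) = sum over A&B==C of m1(A)*m2(B), divided by 1-K,
    where K is the total mass landing on empty intersections."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to conflict K
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

With highly conflicting sources the renormalization by 1-K produces the counterintuitive results that motivate PCR-style redistribution rules on the first level.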
Study on uncertain schema matching model in data integration
2010, 30(10): 2592-2594.
Abstract | PDF (589KB) | Related Articles | Metrics
To make up for the low efficiency of the proof combination method in existing schema matching, a new model named the Uncertain Schema Matching Model (USMM), which can process uncertain schema matching, was proposed. USMM is a multi-dimensional model. Domain knowledge and the proof combination method were applied to handle the uncertainty of schema matching in order to lower its complexity in USMM, and the multi-dimensional structural information of the schema was used to search for the inner uncertainty of matching. Formalized definitions of uncertain schema matching and uncertain matching relations were put forward. The case study proves that the model is of high practical value.
Load balancing strategy based on immune genetic algorithm
2010, 30(10): 2595-2597.
Abstract | PDF (641KB) | Related Articles | Metrics
Load Balancing based on an Immune Genetic Algorithm (IGALB) was proposed to improve the search quality and poor local search performance of the load balancing strategy based on the Simple Genetic Algorithm (SGALB). IGALB ensured population diversity and overcame the premature convergence of SGALB by carrying out affinity and concentration calculations and adding a concentration-based probability adjustment factor. Meanwhile, the degradation of SGALB was effectively alleviated by introducing an immune operator and carrying out vaccination and immune selection under certain conditions. The simulations show that the search ability of IGALB is higher than that of SGALB, and that it can effectively improve the performance of a cluster system.
Genetic algorithm to generate formal concept
2010, 30(10): 2598-2601.
Abstract | PDF (609KB) | Related Articles | Metrics
At present, there is little research literature on genetically constructing formal concepts. After modeling formal concept construction as an optimization constrained by the Galois connection, a new concept-generating algorithm based on genetic evolution, named Geacob, was proposed; its search space consists of the power sets of the objects and attributes in the formal context. The proposed algorithm, which adopts a variable structure, can not only reasonably formalize concepts but also satisfy the requirements of the concept's evolution procedure, and it exhibits scalability and versatility. The experimental results show that the algorithm is feasible and effective for generating formal concepts.
Database and data mining
Vector data contrast based on probabilistic theory and compound criteria
2010, 30(10): 2602-2604.
Abstract | PDF (485KB) | Related Articles | Metrics
A generalized matching method for entity matching, a vector data contrast method based on probabilistic theory and compound criteria, was proposed. The method extended the traditional single criterion to compound criteria, and the attribute information, spatial information and graphical information of the vector data were all considered. In addition, it attempted to resolve one-to-many and many-to-many matching relationships. The experimental results indicate that this method has good precision and recall, and that the contrastive results are usable for the incremental analysis of geographic datasets.
Probabilistic top-k and ranking query algorithms in uncertain databases
2010, 30(10): 2605-2609.
Abstract | PDF (883KB) | Related Articles | Metrics
Processing and querying uncertain and probabilistic data has emerged as a new research area in both the database and data mining communities, due to the huge amount of such data generated in applications such as sensor networks and RFID technology. Both top-k queries and ranking queries are important and useful tools for analyzing large collections of uncertain data. Various algorithms for probabilistic top-k and ranking queries on uncertain data were introduced and reviewed. The semantics and application scenarios of different query processing algorithms were analyzed, and the computation cost and query semantics of the existing probabilistic top-k and ranking queries were compared. Finally, the challenges and possible research directions of querying and processing uncertain databases were presented.
Improved image automatic annotation model based on external databases
2010, 30(10): 2610-2613.
Abstract | PDF (642KB) | Related Articles | Metrics
Concerning the imbalance of the data sets used in image annotation, a new self-balancing model based on an external database was proposed. Firstly, the low-frequency words were found based on the word frequency distribution of the original database, and an appropriate number of images was added from an external database under the self-balancing mode for each low-frequency word. Secondly, the image features were extracted, and the 47065 visual words of the original data set were clustered together with the 996 visual words extracted from the additional images of the external database. Lastly, each image was annotated by the improved automatic image annotation model based on the external database. The proposed method overcomes the imbalance in image annotation, noticeably increasing the number of words that can be correctly labeled at least once, as well as precision and recall.
Semi-supervised automatic clustering
PAN Zhang-Ming
2010, 30(10): 2614-2617.
Abstract | PDF (623KB) | Related Articles | Metrics
Evolutionary-algorithm-based automatic clustering methods lack accuracy and converge slowly when dealing with non-compact clusters. A semi-supervised automatic clustering algorithm was proposed to solve this problem. The method started with the decoding of chromosomes: first, the cluster number and all the centroids were separated from the chromosome, and then the centroids with no effect were filtered out using the nearest neighbor algorithm. After incorporating prior information about the data set, the decoding results could be further improved by using the K-means method to cluster the remaining centroids. The experimental results verify the effectiveness of the proposed method on data sets with both compact and non-compact cluster structures.
Similarity measurement based on user interest in collaborative filtering
2010, 30(10): 2618-2620.
Abstract | PDF (485KB) | Related Articles | Metrics
In recommendation algorithms, similarity measurement is fundamental to the effectiveness of recommendation. After analyzing the problems of traditional similarity measures in recommendation systems, a new interest-based similarity measure was proposed, which used each user's degree of interest in different kinds of items together with the user's ratings to calculate the similarity score between two users. This overcomes the drawback of traditional similarity measures that rely only on user ratings, and mitigates the effect of the extreme sparsity of user rating data. The experimental results show that this method can effectively address the shortcomings of traditional similarity measures and provides better recommendation results than traditional similarity measurement.
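One way to realize the idea above is to blend rating agreement with overlap in category-level interest, so two users with few co-rated items can still be compared through the categories they rate. The weighting scheme and the 50/50 blend below are illustrative assumptions, not the paper's formula; `category_of` is a hypothetical item-to-category mapping.

```python
from math import sqrt

def interest_similarity(ratings_u, ratings_v, category_of):
    """Blend cosine similarity on co-rated items with cosine similarity
    of per-category interest distributions (share of rating mass)."""
    def interest(ratings):
        total = sum(ratings.values())
        dist = {}
        for item, r in ratings.items():
            c = category_of[item]
            dist[c] = dist.get(c, 0.0) + r / total
        return dist

    iu, iv = interest(ratings_u), interest(ratings_v)
    cats = set(iu) | set(iv)
    num = sum(iu.get(c, 0.0) * iv.get(c, 0.0) for c in cats)
    den = sqrt(sum(x * x for x in iu.values())) * sqrt(sum(x * x for x in iv.values()))
    interest_sim = num / den if den else 0.0

    common = set(ratings_u) & set(ratings_v)
    if not common:
        return interest_sim  # sparse data: fall back to interest overlap
    num = sum(ratings_u[i] * ratings_v[i] for i in common)
    den = (sqrt(sum(ratings_u[i] ** 2 for i in common))
           * sqrt(sum(ratings_v[i] ** 2 for i in common)))
    rating_sim = num / den if den else 0.0
    return 0.5 * rating_sim + 0.5 * interest_sim
```

The fallback branch is what addresses rating sparsity: with no co-rated items, traditional cosine similarity is undefined, while the interest term still yields a usable score.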
Improved feature selection approach combined with semantic
2010, 30(10): 2621-2623.
Abstract | PDF (504KB) | Related Articles | Metrics
Traditional feature selection methods for text categorization are based on statistical word frequency information, which ignores the semantic contribution of words and, because of redundancy, cannot select the most useful features. A table named "conception-domain" was built based on the semantic dictionary HowNet, containing each word and its domain value. If a word from the text appeared in the table, it was replaced by its domain value, which carries a more general meaning. In this way, more semantic information was added to the selected features, and the redundancy between features could be eliminated to some extent. Experiments were carried out with improved information gain and χ2 respectively, and the results show that this method effectively improves the precision of text categorization.
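The χ2 score mentioned above, computed per term and category from a 2x2 contingency table of document counts, can be sketched as follows; the selection step would keep the terms (or, after the HowNet replacement, the domain values) with the highest scores.

```python
def chi_square(n11, n10, n01, n00):
    """Chi-square statistic for a term/category contingency table:
    n11 = docs in the category containing the term,
    n10 = docs outside the category containing the term,
    n01 / n00 = the same counts for docs without the term."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n10 + n00) * (n11 + n10) * (n01 + n00)
    return num / den if den else 0.0
```

A term whose presence is independent of the category scores 0; strong positive or negative association both score high, which is why χ2 is paired with information gain rather than used alone.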
Application of improved associative classification algorithm in cross marketing
Wang Dan-Dan, Hui Xu
2010, 30(10): 2624-2627.
Abstract | PDF (662KB) | Related Articles | Metrics
In order to guide commercial decisions in cross marketing, a new classification algorithm named CHC, based on frequent closed itemsets and imprecise reasoning, was proposed. The H-C algorithm for mining frequent closed itemsets based on the hyperlinked data structure H-Struct was improved: the header table of H-Struct was adjusted by inserting the class label to prune the search space; the local relative support and maximum support were used to exclude meaningless patterns; and the maximum length of mined patterns was applied to improve the usability of rules. The reasoning algorithm of EMYCIN was extended to handle rules whose consequent is negative. The algorithm overcame the limitation of traditional classification algorithms that derive only a class label; furthermore, it produced a value reflecting the confidence of the classification result, which facilitates and simplifies the evaluation of multiple cross marketing plans. The experimental results show that the enhanced algorithm is efficient in both run time and classification precision.
Network and communications
Survey on convergence problem of border gateway protocol
2010, 30(10): 2628-2631.
Abstract | PDF (710KB) | Related Articles | Metrics
The Border Gateway Protocol (BGP), currently the core Internet inter-domain routing protocol, is unsatisfactory in some respects, such as slow convergence. The root cause of the BGP convergence problem was analyzed in depth, and current research on the problem was summarized. The existing approaches to the slow convergence of BGP can be broadly categorized into three kinds: adjusting protocol parameters, adding new mechanisms, and designing new protocols. Finally, several major trends in BGP convergence research were identified based on thorough analysis and comparison.
Attribute-based universal access control framework in open network environment
2010, 30(10): 2632-2635.
Abstract | PDF (823KB) | Related Articles | Metrics
Concerning the limitations of traditional access control models in the new generation of trusted Internet environments, such as inefficient user-role assignment and difficult cross-domain access control, a universal attribute-based access control framework was proposed. It handled the attributes of users, resources, operations and running context in a unified way, simplifying the complex permission determination of traditional RBAC and other access control models, and thus enhancing the versatility and flexibility of the access control system. At the same time, authentication based on attribute certificates was applied to cross-domain access, and policy evaluation and the evaluation algorithm were discussed, which can dynamically realize resource management and access control for users from different domains. In addition, the running-context mechanism makes the framework more suitable for complex and dynamic Internet environments.
Link state reasoning based routing protocol for wireless mesh networks
2010, 30(10): 2636-2640.
Abstract | PDF (845KB) | Related Articles | Metrics
Based on an analysis of the most important challenges in Wireless Mesh Networks (WMN), a routing protocol named LR-OLSR was proposed on the basis of the Optimized Link State Routing (OLSR) protocol and cross-layer design. LR-OLSR optimized routing performance by introducing a reasoning method that evaluates link quality from information such as node load, packet delivery ratio and link availability. The proposed protocol used link quality as the routing metric during route selection, and thus could achieve optimal routing and load balancing in a WMN. The simulation results comparing LR-OLSR with OLSR and its typical improvements P-OLSR and SC-OLSR show that the proposed protocol improves the packet delivery ratio, reduces the end-to-end delay, and achieves load balance in the route selection process.
Credit mechanism oriented resource sharing platform in structured P2P networks
Zhang JianYin
2010, 30(10): 2641-2644.
Abstract | PDF (842KB) | Related Articles | Metrics
To meet the demand for a high utilization rate of available network resources, a credit-mechanism-oriented resource sharing platform in structured P2P networks, called NRSP, was proposed and described. NRSP allowed users to submit jobs to be run in the system and to run jobs submitted by other users on any resources available over the Internet, efficiently sharing available processor cycles. NRSP is a decentralized, portable, accountable and fair system. The Pastry protocol was used for efficient and fault-tolerant content-addressable routing in a self-organizing overlay network, and a new distributed credit system supporting accountability among providers and consumers of resources was also employed. A prototype of NRSP was then implemented. The simulation results show that the fairness mechanisms punish cheating nodes effectively and that NRSP is a feasible approach to large-scale resource sharing over the Internet.
Peer-to-Peer routing model based on Chord
2010, 30(10): 2645-2647.
Abstract | PDF (472KB) | Related Articles | Metrics
In view of the inconsistency between logical and physical topology, and the neglect of node heterogeneity, a new Peer-to-Peer (P2P) routing model based on Chord was proposed. Using the address aggregation of IPv6 and partially hashing each node's IP address, it obtained hierarchical node identifiers and achieved topological consistency. According to the size of the network, it mapped nodes onto multi-layer Chord rings, so that nodes in the same aggregation formed their own domain. Considering node heterogeneity, well-behaved nodes were assigned more routing tasks. The simulation results show that the proposed model keeps an average routing hop count similar to Chord's while reducing storage cost and search delay.
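The two Chord primitives the model builds on, hashing a node onto the identifier ring and finding a key's successor, can be sketched as below. This is plain Chord; the paper's hierarchical identifiers derived from IPv6 address aggregation and the multi-layer rings are not shown, and the ring size `m` is an illustrative assumption.

```python
import hashlib

def node_id(addr, m=16):
    """Hash an address onto a 2**m identifier ring (SHA-1, truncated)."""
    h = int(hashlib.sha1(addr.encode()).hexdigest(), 16)
    return h % (2 ** m)

def successor(key, ring):
    """First node identifier clockwise from `key` on a sorted ring;
    the key's owner in Chord."""
    for n in ring:
        if n >= key:
            return n
    return ring[0]  # wrapped past the largest identifier
```

In the hierarchical variant, the high-order bits of the identifier would come from the IPv6 aggregation prefix, so that nodes in one aggregation land in a contiguous arc of the ring.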
Prediction for network traffic based on modified Elman neural network
2010, 30(10): 2648-2652.
Abstract | PDF (761KB) | Related Articles | Metrics
Concerning the nonlinear, multivariable and time-varying nature of network traffic, a modified Elman neural network model was proposed. A learning method based on seasonal periodicity was introduced into model training, and the outgoing traffic of the backbone network of a certain university was used as data. The experimental results show that this model has a better prediction effect: compared with the traditional linear model, the BP neural network model and the normal Elman neural network model, it has higher precision and better adaptability. Finally, abnormal network traffic behaviors can be detected in time through an adaptive boundary value test, which proves that the model is feasible and effective.
Network traffic classification based on hybrid model
2010, 30(10): 2653-2655.
Abstract | PDF (513KB) | Related Articles | Metrics
To satisfy users' increasingly precise requirements for Internet service quality, traffic classification is an important link in the network management process. After analyzing and comparing the applicability, advantages and disadvantages of machine-learning classification methods based respectively on port number matching, feature analysis and traffic characteristics, a hybrid network traffic classification method was proposed to address the low accuracy and long classification time of relying on a single method. This model combined port number matching with machine learning and applied the Self-Organizing Map (SOM), whose output is visual. The experimental results show that this method can effectively classify network traffic by application type and obtain a good visualization of the classification result.
User scheduling and resource allocation at cross-layer for power-line communications system
2010, 30(10): 2656-2660.
Abstract | PDF (793KB) | Related Articles | Metrics
An optimal multi-layer multi-objective cross-layer resource allocation algorithm, with user scheduling in the data link control layer and resource allocation in the physical layer, was proposed for multi-user multi-service OFDMA (Orthogonal Frequency Division Multiple Access) power-line communication systems. First, in user scheduling, the scheduled users and their optimal cross-layer parameters were determined from each user's current QoS, desired QoS, traffic packet model, channel information and queue status. Second, in resource allocation, according to the scheduled users' desired QoS, optimal cross-layer parameters and channel state information, power was assigned to every subcarrier by water-filling theory; each subcarrier was then optimally assigned to the scheduled users, and the allocated power and bits of each subcarrier were adjusted by a bit-loading look-up table algorithm. Finally, simulation results in a typical power-line channel environment illustrate that the proposed algorithm can ensure users' QoS and effectively improve resource utilization.
Method for streaming transmission of 3D terrain model
2010, 30(10): 2661-2664.
Abstract | PDF (724KB) | Related Articles | Metrics
To resolve the contradiction between limited network bandwidth and massive terrain data, a new approach to terrain data streaming transmission was proposed. The approach partitioned the terrain data on the server, applied the wavelet transform, and used the SPIHT coding algorithm to organize the terrain data into multiple progressive compressed streams. On the client, the transmission quantity of each terrain block around the viewpoint was adaptively adjusted according to the viewpoint's position, direction of movement and speed. A caching method that stores terrain blocks around the viewpoint on the client was also designed to lighten the stress of data supply when the network is congested. The experimental results show that the approach works with a GPU-based rendering algorithm and keeps a good frame rate to ensure the continuity of terrain rendering.
Audio multi-channel collecting wireless communication system based on USB2.0 interface
2010, 30(10): 2665-2668.
Abstract | PDF (689KB) | Related Articles | Metrics
To meet the needs of long-distance acquisition from a multi-channel acoustic array and acoustic field analysis, synchronous sampling of 96 audio channels was designed and implemented based on the USB2.0 high-speed interface, and wireless data transmission was implemented with the standard Wi-Fi (Wireless Fidelity) protocol following the 802.11n standard. The test results indicate that the minimum peak values of the effective transmission speed of the USB2.0 high-speed interface in the system reach 26.5MBps for reading and 22.5MBps for writing, and that the steady mean value of wireless transmission reaches 106Mbps. The implemented system adopts Commercial Off The Shelf (COTS) products except for the acquisition cards, giving it the marked advantages of low cost, a short development period and high usability.
Information security
Trusted anonymity communication protocol for mobile Internet
2010, 30(10): 2669-2671.
Abstract | PDF (580KB) | Related Articles | Metrics
To meet the anonymity requirements of communication over the mobile Internet, an anonymous communication protocol based on signcryption and trusted computing was proposed to solve the problem of anonymous communication for mobile terminals. In this model, the forwarding nodes can verify the integrity of the received information and the validity of the forwarding path according to the signcryption of the previous nodes. The research results reveal that the protocol has good security and trusted anonymity, and that it meets the security requirements of anonymous communication for mobile terminals.
Privacy protection method for outsourced database services
2010, 30(10): 2672-2676.
Abstract | PDF (824KB) | Related Articles | Metrics
In privacy protection based on database encryption for outsourced database services, it is difficult to balance data processing performance against privacy protection. A new privacy protection method based on distributed outsourced database services was proposed to provide both efficient privacy protection and query processing. Automatic quasi-identifier detection and probabilistic anonymity were introduced. The data could be partitioned horizontally or vertically across many logically independent database servers, while only a few sensitive data items were encrypted or anonymized. According to the type of data fragmentation, the trusted client executed queries by transmitting appropriate sub-queries to the different databases and then piecing the results together on the client side. The theoretical analysis and experimental results show that the proposed method achieves a good balance between data privacy preservation and efficient query processing.
Worm discrete propagation model based on uneven random scan
2010, 30(10): 2677-2678.
Abstract | PDF (483KB) | Related Articles | Metrics
To capture the higher propagation efficiency of worms that use uneven random scanning, a discrete worm propagation model based on uneven random scanning was established after an in-depth study of the even-scanning strategy. To study the model dynamically, three important factors influencing worm propagation were investigated quantitatively: the immunization rate of infected hosts, the immunization rate of susceptible hosts, and the number of scanning hosts. In the model, the infected host rate reaches 0.8 or above, the peak number of infections is reached about 1000 seconds earlier, and suppression to half the peak is delayed by about 2000 seconds. Finally, the simulation results indicate that the model has high efficiency.
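A discrete-time propagation model of this general shape can be sketched as follows. All parameter values are illustrative assumptions, and the reduced scan space stands in for uneven scanning concentrating probes on a populated sub-space; this is a generic susceptible-infected model with immunization, not the paper's exact equations.

```python
def simulate_worm(n_hosts=100000, i0=10, scan_rate=4.0,
                  imm_inf=0.001, imm_sus=0.0005, steps=3000):
    """Discrete-time worm model: each step, infected hosts scan the
    address space and convert the susceptible hosts they hit, while
    infected and susceptible hosts are immunized at fixed per-step
    rates. Returns the infected fraction over time."""
    space = 2 ** 24  # scanned sub-space; uneven scanning targets populated ranges (assumption)
    S, I = float(n_hosts - i0), float(i0)
    history = []
    for _ in range(steps):
        # probability a given susceptible host is hit by at least one scan
        hit = 1.0 - (1.0 - 1.0 / space) ** (scan_rate * I)
        new_inf = S * hit
        cured = imm_inf * I       # immunization of infected hosts
        patched = imm_sus * S     # immunization of susceptible hosts
        S -= new_inf + patched
        I += new_inf - cured
        history.append(I / n_hosts)
    return history
```

Sweeping `imm_inf`, `imm_sus` and the initial number of scanning hosts reproduces the kind of quantitative comparison described above: higher immunization rates lower and delay the infection peak.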
Lossless data hiding for images based on integer wavelet optimum histogram pair
2010, 30(10): 2679-2683.
Abstract | PDF (767KB) | Related Articles | Metrics
Embedding a location map is a commonly used method for image data hiding; however, its payload is small and its visual effect is not good enough. Therefore, a lossless data hiding scheme based on histogram pairs was proposed for the authenticity certification of multi-gray images. It applied the Integer Wavelet Transform (IWT) to the gray image, selected the optimum threshold in the histogram, and formed histogram pairs to embed data. The experimental results show that the Peak Signal to Noise Ratio (PSNR) reaches 46dB at a payload of 0.1bpp (bit per pixel), and that the proposed scheme is effective.
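The histogram-pair mechanism can be illustrated by classic histogram shifting. For brevity the sketch operates on plain integer values rather than IWT coefficients, and the peak/zero bins are passed in rather than selected by an optimum-threshold search; both simplifications are departures from the scheme above.

```python
def hist_shift_embed(values, bits, peak, zero):
    """Histogram-shifting embed (peak < zero, zero bin empty): values in
    (peak, zero) shift right by 1 to empty the bin peak+1; each value
    equal to `peak` then carries one bit (stay = 0, move to peak+1 = 1)."""
    out, it = [], iter(bits)
    for v in values:
        if peak < v < zero:
            out.append(v + 1)
        elif v == peak:
            out.append(v + next(it, 0))
        else:
            out.append(v)
    return out

def hist_shift_extract(values, peak, zero):
    """Recover the bits and restore the original values losslessly."""
    bits, orig = [], []
    for v in values:
        if v == peak:
            bits.append(0); orig.append(peak)
        elif v == peak + 1:
            bits.append(1); orig.append(peak)
        elif peak + 1 < v <= zero:
            orig.append(v - 1)
        else:
            orig.append(v)
    return bits, orig
```

Because extraction restores every shifted value exactly, the cover is recovered bit-for-bit, which is the "lossless" property the scheme relies on.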
Efficient digital fingerprinting scheme
2010, 30(10): 2684-2686.
Abstract | PDF (611KB) | Related Articles | Metrics
To improve coding efficiency and reduce storage space, a new fingerprinting scheme was proposed. First, multiple linear block codes were designed; then the orthogonal vector of each symbol was concatenated to form the user's fingerprint. Compared with orthogonal coding fingerprinting and orthogonal fingerprinting based on Balanced Incomplete Block Design (BIBD), the proposed scheme improves coding efficiency to some extent and reduces the storage space per fingerprint to O(log n). The theoretical analysis and experimental results show that the approach is robust and has good anti-collusion performance.
Security analysis and improvement of a partially blind signature scheme
2010, 30(10): 2687-2690.
Abstract | PDF (585KB) | Related Articles | Metrics
Recently, an efficient identity-based partially blind signature scheme was put forward by Cui, Xin et al. First, it was pointed out that Cui-Xin's scheme suffers from a forgery attack in which a requester can illegally change the pre-agreed common information, and the reason why Cui-Xin's scheme cannot resist the forgery attack was discussed. Second, an improved scheme was put forward to overcome the security flaw of Cui-Xin's scheme, and it was strictly proved to be unforgeable. The comparison with existing schemes shows that the proposed scheme is an efficient identity-based partially blind signature scheme.
Video watermarking based on multi-dimensional scaling and singular value decomposition
2010, 30(10): 2691-2693.
Abstract | PDF (449KB) | Related Articles | Metrics
Concerning the intellectual property rights of video on the Internet, a new digital video watermarking method based on Multi-Dimensional Scaling (MDS) and Singular Value Decomposition (SVD) was proposed. First, the frames were mapped to points in 2D space using MDS, and then the watermarks were embedded through SVD into the differences between the video frames and their images under the mapping. The experimental results show that the proposed method has very strong robustness against spatial desynchronization attacks such as rotation, scaling and cropping. Furthermore, it also achieves high robustness against noise and median filtering. In addition, the method can resist temporal desynchronization such as frame dropping and insertion to some extent.
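The SVD embedding step can be illustrated with a common block-level primitive: quantize the largest singular value of a block to an even or odd multiple of a quantization step, depending on the bit. This is a standard stand-in, not the paper's MDS-difference embedding, and the step size `q` is an assumption.

```python
import numpy as np

def svd_embed(block, bit, q=8.0):
    """Embed one bit by snapping the block's largest singular value to an
    even (bit 0) or odd (bit 1) multiple of q, then reconstructing."""
    U, s, Vt = np.linalg.svd(block.astype(float), full_matrices=False)
    k = np.round(s[0] / q)
    if int(k) % 2 != bit:
        k += 1  # move to the nearest multiple with the right parity
    s[0] = k * q
    return U @ np.diag(s) @ Vt

def svd_extract(block, q=8.0):
    """Read the bit back from the parity of the quantized singular value."""
    s = np.linalg.svd(block.astype(float), compute_uv=False)
    return int(np.round(s[0] / q)) % 2
```

Singular values change little under small geometric and filtering distortions, which is the property that gives SVD-based schemes their robustness.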
Pattern recognition
Source camera identification schemes with color image information
2010, 30(10): 2694-2697.
At present, most pattern-noise-based source camera identification methods only utilize forensic information from a single-channel image, or simply average the test results of three single-channel images, which cannot fully reflect the characteristics of imaging sensors. Through analyzing the characteristics of the interpolation algorithms used in the mosaic image generation process, three color-image-based source camera identification schemes were proposed. Experimental results show that the new method has better identification capability and needs fewer image pixels.
Skin color segmentation algorithm combining adaptive model and fixed model
Yao-Rong Lin
2010, 30(10): 2698-2701.
Because of the effects of environment, illumination and ethnicity, skin-color clusters differ from image to image. As a result, for images with complex backgrounds, a skin-color model with a fixed decision boundary may lead to high false rejection and false detection rates. Based on the segmentation result of a fixed decision-boundary skin-color model in YCbCr color space, a simplified Expectation Maximization (EM) algorithm was used to train an adaptive Gaussian model for a specific image. Combining the fixed model and the adaptive Gaussian model produces the final skin-color model for segmentation. The experimental results show that the method greatly improves skin-color segmentation accuracy and reduces both the false rejection rate and the false detection rate.
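The two-stage idea can be sketched as follows. The fixed Cb/Cr bounds below are commonly cited skin-detection values, not the paper's, and the single-Gaussian fit stands in for the simplified EM step described in the abstract; all names are illustrative:

```python
import math

# Commonly cited fixed Cb/Cr skin bounds (assumed, not from the paper)
CB_RANGE = (77, 127)
CR_RANGE = (133, 173)

def fixed_model(cb, cr):
    """Fixed decision-boundary skin test in YCbCr."""
    return CB_RANGE[0] <= cb <= CB_RANGE[1] and CR_RANGE[0] <= cr <= CR_RANGE[1]

def fit_gaussian(samples):
    """Fit a per-image axis-aligned Gaussian to (cb, cr) skin candidates,
    a stand-in for the simplified EM step described in the abstract."""
    n = len(samples)
    mu = [sum(s[i] for s in samples) / n for i in (0, 1)]
    var = [sum((s[i] - mu[i]) ** 2 for s in samples) / n + 1e-6 for i in (0, 1)]
    return mu, var

def adaptive_model(cb, cr, mu, var, thresh=2.5):
    """Accept pixels within `thresh` standard deviations of the adapted mean."""
    d = math.sqrt((cb - mu[0]) ** 2 / var[0] + (cr - mu[1]) ** 2 / var[1])
    return d <= thresh

# Combined rule: the fixed model screens candidates, the adaptive model refines them.
candidates = [(100, 150), (105, 155), (98, 148), (60, 200)]
skin_seed = [p for p in candidates if fixed_model(*p)]
mu, var = fit_gaussian(skin_seed)
final = [p for p in candidates if fixed_model(*p) and adaptive_model(*p, mu, var)]
```

In the full method the adaptive Gaussian tightens the fixed box to the skin tones actually present in the image, which is what reduces both error rates.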
Texture defect inspection for silicon solar cell
2010, 30(10): 2702-2704.
In order to detect texture defects of silicon solar cells, a detection method combining steerable filters with the Hough transform was proposed. After image edges were extracted using steerable filters, the direction of the texture was determined through the Hough transform. Steerable filters oriented at the same angle as the texture direction were then used to filter the texture image, which eliminated the line-like textures while retaining the characteristics of the texture defects. The locations of the texture defects were determined in the filtered image by a dual-threshold method. The experimental results show that the steerable filter method detects texture defects more effectively than the Gabor filter and wavelet filter methods.
Automatic localization of facial key-points for 3D face modeling
2010, 30(10): 2705-2708.
In view of the key-point acquisition problem in morphable-model 3D face modeling, an automatic localization method was proposed. First, according to the modeling demand, the locations of the key-points were determined and classified. Second, the head was detected using the S component in the HSI color space. Third, combining the structure of the facial features with texture, the feature regions were extracted. Then, using different methods, the 2D coordinates of the different key-points were obtained in the feature regions from the frontal and side photos respectively. Finally, by merging the 2D coordinates from both views, the 3D coordinates of the key-points were obtained. The experimental results show that the proposed method has good performance in speed and accuracy.
Tone recognition based on biomimetic pattern recognition theory
2010, 30(10): 2709-2711.
Pitch frequency trajectories reflect the tone characteristics of Mandarin more faithfully, so identifying different pitch trajectories is an effective way to identify tones. According to the theory of biomimetic pattern recognition, an improved tone recognition algorithm was proposed. It used the Iterative Self-Organizing Data Analysis Techniques Algorithm (ISODATA) to find the centers of the coverage areas, and a multi-weight neural network to achieve coverage of each cluster center. The experimental results show that, in the case of a small sample, the proposed algorithm achieves a higher recognition rate than the Hidden Markov Model (HMM) and Support Vector Machine (SVM).
Speaker recognition feature extraction from little speech data under noisy conditions
2010, 30(10): 2712-2714.
To improve the performance of speaker recognition under noisy conditions with little speech data, feature parameters were studied based on Vector Quantization (VQ). An improved feature named WFWTC was proposed by combining Mel Frequency Cepstrum Coefficient (MFCC) extraction with the wavelet transform. After that, a new feature was established based on WFWTC and the Spectral Centroid (SC). The experimental results show that the feature is feasible for speaker identification.
Energy analysis and motion detection of video signal
2010, 30(10): 2715-2717.
An interference-adaptive approach to detecting moving objects with a static camera was proposed. Treating the change of a pixel's value over the video as a signal, the mean value of each pixel signal was computed first. Then the energy of the signal's fluctuation around the pixel mean was computed. After that, whether the pixel belongs to the background or the foreground was judged by comparing this fluctuation energy. Compared with common moving-object detection approaches such as the Gaussian mixture model, this approach has better interference-adaptive performance and higher sensitivity.
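The per-pixel decision can be sketched as follows; the energy threshold and the length of the temporal window are illustrative assumptions, not values from the paper:

```python
def classify_pixel(values, energy_thresh=100.0):
    """Label one pixel's temporal trace as foreground or background by the
    energy of its fluctuation around the temporal mean."""
    n = len(values)
    mean = sum(values) / n
    energy = sum((v - mean) ** 2 for v in values) / n  # fluctuation energy
    return "foreground" if energy > energy_thresh else "background"

static = [120, 121, 119, 120, 122]   # nearly constant trace -> background
moving = [120, 200, 60, 190, 50]     # strong fluctuation -> foreground
```

A pixel crossed by a moving object fluctuates far more than camera noise does, which is why a single energy comparison can separate the two.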
Recognition of 3D point cloud objects based on local fingerprint patches
2010, 30(10): 2718-2722.
A 3D point cloud object recognition algorithm based on a local descriptor was proposed. The normal vector and shape index value of each point in the cloud were calculated, and feature points were extracted according to the shape index. Using geodesic distance and normal vector angle, the point cloud was segmented into patches centered on the feature points. Through geodesic partitioning on each patch, 3D geodesic concentric circles were obtained. Thus the description of a 3D object could be transformed into two 2D curves, the normal vector curve and the Euclidean distance curve, from which a model object database was established. By comparing these descriptions with the model database, candidate recognition results for a given object could be found, and the final recognition result was determined with the iterative closest point algorithm. The experimental results on real objects demonstrate the effectiveness of the proposed algorithm.
Wavelet time-frequency analysis of neural spike sorting
2010, 30(10): 2723-2726.
The separation of spikes is a key problem for invasive brain-computer interfaces. To deal with the similarity of spike temporal profiles and frequency features, a method was proposed to represent spike features using wavelet analysis. First, wavelet functions such as db, sym and bior were used as basis functions to obtain high-dimensional wavelet coefficients as spike features. Next, in order to reduce the dimension of the spike features, the Kolmogorov-Smirnov (KS) test was performed to select a few coefficients. After that, unsupervised K-means clustering was applied to complete the spike sorting. The experimental results show that, at the noise levels 0.05 dB, 0.1 dB and 0.15 dB, the sorting performance varies only slightly when the wavelet basis function changes. Among these functions, the sym5 wavelet outperforms the other five in terms of spike misclassification rate (between 1.21% and 1.81%). Compared with Principal Component Analysis (PCA), the proposed method based on the sym5 wavelet performs better even on heavily noisy spike data.
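The KS-based coefficient selection step can be sketched as follows. This is a simplified stand-in: the wavelet transform itself is omitted, the KS statistic is computed against a fitted normal CDF (coefficients whose distribution deviates most from normality tend to separate spike classes), and all names are illustrative:

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def ks_statistic(samples):
    """KS distance between a coefficient's empirical distribution and a fitted
    normal: multimodal (class-separating) coefficients score high."""
    n = len(samples)
    mu = sum(samples) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in samples) / n) or 1e-9
    xs = sorted(samples)
    return max(abs((i + 1) / n - normal_cdf(x, mu, sigma)) for i, x in enumerate(xs))

def select_features(coeff_matrix, k=2):
    """Keep the k coefficient indices with the largest KS statistic."""
    scores = [(ks_statistic(col), j) for j, col in enumerate(zip(*coeff_matrix))]
    return [j for _, j in sorted(scores, reverse=True)[:k]]

# Rows = spikes, columns = wavelet coefficients. Column 0 is bimodal
# (two spike classes), column 1 is pure noise.
coeffs = [[-5.0, 0.1], [-5.2, -0.2], [5.1, 0.0],
          [4.9, 0.15], [-4.8, -0.1], [5.3, 0.05]]
picked = select_features(coeffs, k=1)
```

The selected low-dimensional coefficients would then be passed to K-means for the actual sorting.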
Software process technology and advanced computing
Formal method for UML sequence diagrams based on communicating sequential processes
2010, 30(10): 2727-2729.
The UML 2.0 sequence diagram describes dynamic collaboration and expresses the temporal relations among events. However, its lack of precise formal semantics hinders formal verification of the systems it describes. To solve this problem, according to the UML 2.0 semantics document and the combination of fragments, the basic elements of UML sequence diagrams, the definition of message traces, and trace generation rules were given based on Communicating Sequential Processes (CSP). The accuracy and validity of the formal method for describing systems with UML sequence diagrams were demonstrated. Finally, an ATM example proves the validity of this approach.
Application of fractal theory to software complexity
2010, 30(10): 2730-2734.
Software complexity was studied with fractal theory: the definitions of box complexity and fractal complexity were given, and a related algorithm was proposed. Examples showed that some programs have a fractal attribute, namely scale invariance. On this basis, several existing programs were tested. The further analysis results show that, under certain conditions, the proposed algorithm is effective for calculating the complexity of a program.
Schedulability analysis of global rate-monotonic scheduling algorithms on multiprocessor platforms
2010, 30(10): 2735-2737.
In global Rate-Monotonic (RM) scheduling for multiprocessor systems, when the number of highest-priority tasks is smaller than the number of processors, the worst-case interference bound given by Bertogna is too pessimistic. The authors proved that the interference of a highest-priority task on another task is not so pessimistic, whereas the interference of a non-highest-priority task may reach the worst case presented by Bertogna et al. The maximum possible interference of the highest-priority tasks was derived by analysis, and a tighter schedulability test was then put forward. The experimental results show that the proposed test accepts more task sets as schedulable.
Instrumentation technology for embedded software statement coverage testing
2010, 30(10): 2738-2740.
For host-based embedded software testing, a general method for statement coverage testing in unit tests was proposed. The method was implemented with probe-insertion (instrumentation) technology applied before compiling the source code. The relevant algorithm for the test code, with which the instrumentation is performed automatically, was designed. These approaches were then applied to the ARMtest platform for embedded software simulation and testing. With these approaches, problems in early embedded software development can be found and corrected in the host-based and simulation environments.
Analysis and evaluation on proactive fault-tolerance model of Java services
2010, 30(10): 2741-2744.
Java services have become real-world critical business services, and their availability is the key to the availability of business systems based on Java technology. The availability of Java services can be improved by applying proactive fault-tolerance technology, and a model of proactive fault tolerance facilitates its analysis and evaluation. The fault-tolerance effect with and without software rejuvenation was discussed through model analysis and simulation experiments. The conclusion is that the availability of Java services can be improved if proactive fault-tolerance technology is adopted, and that choosing a reasonable time point for the software rejuvenation strategy yields a better fault-tolerance effect.
Software process control model based on critical path method
2010, 30(10): 2745-2748.
A software process control model based on the critical path method was presented to enhance software process control and guarantee product quality. The model is based on the topological structure of the process activities. Under the precondition of meeting the project time limit and the resource requirements of the critical activities, the start times of the other activities were controlled by a mathematical model whose target is to lower the total resource cost and to start activities as early as possible under the optimal resource cost. Under the constraint of the optimal resource cost, an algorithm based on resource competition chains was also proposed to update the float information of the non-critical activities. Finally, the experimental results show that the model is feasible and effective.
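The critical-path machinery the model builds on can be sketched as follows: a forward pass gives the earliest start of each activity, a backward pass gives the latest start, and zero-float activities form the critical path (the paper's cost model and resource competition chains are not reproduced here):

```python
def activity_floats(tasks):
    """Total float per activity for a DAG of activities.
    tasks: {name: (duration, [predecessors])}. Zero float = critical."""
    # Forward pass: earliest start times.
    es = {}
    def earliest(t):
        if t not in es:
            dur, preds = tasks[t]
            es[t] = max((earliest(p) + tasks[p][0] for p in preds), default=0)
        return es[t]
    for t in tasks:
        earliest(t)
    finish = max(es[t] + tasks[t][0] for t in tasks)
    # Backward pass: latest start times.
    succs = {t: [s for s in tasks if t in tasks[s][1]] for t in tasks}
    ls = {}
    def latest(t):
        if t not in ls:
            ls[t] = min((latest(s) for s in succs[t]), default=finish) - tasks[t][0]
        return ls[t]
    for t in tasks:
        latest(t)
    return {t: ls[t] - es[t] for t in tasks}

# A -> {B, C} -> D; the longer branch A-C-D is critical.
tasks = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
floats = activity_floats(tasks)
```

In the model above, the floats of the non-critical activities (here B, with 2 units of slack) are exactly what the resource-competition-chain algorithm updates when start times are shifted to lower resource cost.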
Graphical user interface automatic testing model based on software monitoring
2010, 30(10): 2749-2753.
In order to improve coverage, locate faults more efficiently, and detect faults that appear during repetitive runs in Graphical User Interface (GUI) automated testing, a GUI automated testing model based on software monitoring was proposed. The model divides the GUI into four layers: a windows framework layer, an interface element layer, a functional structure layer and a running record layer. The windows framework layer describes all windows and controls in the GUI, the interface element layer describes all input events, the functional structure layer describes the rules of functional coverage, and the running record layer monitors the software dynamically in each running trace and window through instrumented code, so as to increase test coverage and provide a basis for reliability calculation from the total and correct operations in the running record. Finally, Notepad was taken as an example to verify the model's efficiency.
Architecture of software services based on SaaS model supporting multi-terminals and service customization
2010, 30(10): 2754-2757.
Analysis of the concepts and characteristics of software services based on the Software as a Service (SaaS) model shows that the current Service-Oriented Architecture (SOA) cannot be used to realize such software services directly. In addition, to expand software service applications and realize service diversity, software services based on the SaaS model should support service customization and multiple terminals. By extending SOA, the notions of software service terminal, software service port and software service register model were introduced, and an architecture of SaaS-based software services that supports multiple terminals and service customization was presented, together with its structure and workflow. The experimental results prove that the architecture can realize software services based on the SaaS model and also supports the requirements mentioned above.
Sensor data uncertainty and its hierarchical multi-Agent coordination strategy in wireless sensor networks
2010, 30(10): 2758-2762.
Concerning efficient strategies for processing sensor data uncertainty, the diversity and hierarchy of sensor data uncertainty were analyzed. Three kinds of Agents were designed: the sensor Agent, the cluster-and-analyzer Agent, and the sink-and-decision-maker Agent. The hierarchical correspondence between sensor data uncertainty and the multi-Agent levels was also investigated. Furthermore, each Agent was defined as an abstract entity with two coordination modules, for communication and for local processing of uncertain data, combining the intelligent characteristics of Wireless Sensor Networks (WSN) and Rough Set Theory (RST); a hierarchical multi-Agent model of sensor uncertainty coordination was then put forward. Finally, an implementation algorithm and an illustrative analysis were given. The results verify the flexibility and practicability of such an intelligent model with hierarchical mechanisms for coordinating different kinds of complicated sensor data uncertainties.
Semantic topic map Web service composition approach using description logic
Xiang-Bing Zhou
2010, 30(10): 2763-2767.
Concerning the dispersibility and uncertainty of Service-Oriented Computing (SOC), as well as the technical and application bottlenecks of service discovery, selection and composition, a topic-map service composition approach was proposed. It employed description logic to merge topic maps and Web services, and a topic-map semantic Web service was obtained by using an ontology to bridge topic maps and Web services during merging. A tableau decision algorithm based on SHOIQ was adopted to accomplish the topic-map semantic Web service composition. The case analysis shows that the method is feasible and effective.
Manufacturing-oriented RFID complex event processing
2010, 30(10): 2768-2770.
In automatic data collection with Radio Frequency IDentification (RFID), it is important to process massive data efficiently. The characteristics of RFID data and the disadvantages of previous RFID data processing methods were analyzed, and an RFID data processing model based on complex event processing was given. Furthermore, the relevant definitions and function modules of the model, with their solutions, were put forward. The proposed model can extract meaningful events for business applications. Finally, a manufacturing RFID application case based on this model was studied to demonstrate its advantages.
Scheduling algorithm of grid jobs based on value density, relative deadline and EASY backfilling
2010, 30(10): 2771-2773.
In economy-based grid and cloud computing environments, jobs have arrival time, workload, budget and deadline parameters, and it is vital for the job scheduling system to differentiate the importance and urgency of jobs. Existing algorithms consider only some of these parameters; here all four were taken into account. A priority jointly based on value density and relative deadline was defined, a new scheduling algorithm based on this priority was proposed, and EASY backfilling was used to improve the throughput of grid resources. The simulation results show that the newly defined priority differentiates jobs' importance and urgency well, while EASY backfilling improves resource throughput only for some priority strategies.
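One plausible form of the joint priority can be sketched as follows. The abstract does not give the combination formula; dividing value density (budget per unit workload) by the relative deadline is an assumption made here for illustration:

```python
def priority(job, now):
    """Joint priority from value density and relative deadline.
    job = (arrival, workload, budget, deadline); higher = more valuable/urgent."""
    arrival, workload, budget, deadline = job
    value_density = budget / workload          # value earned per unit of work
    relative_deadline = max(deadline - now, 1e-9)
    return value_density / relative_deadline   # urgency scales the value

jobs = [
    (0, 10, 50, 100),   # modest value, far deadline
    (0, 10, 50, 20),    # same value, tight deadline -> most urgent
    (0, 10, 200, 100),  # high value density, far deadline
]
# Dispatch order by descending priority; EASY backfilling would then slide
# later jobs into idle gaps as long as the head job's reservation holds.
order = sorted(range(len(jobs)), key=lambda i: priority(jobs[i], now=0), reverse=True)
```

With these numbers the tight-deadline job runs first, then the high-value job, then the remaining one, which matches the intuition the abstract describes.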
GPU accelerated parallel labeling algorithm of connected-domains in binary images
2010, 30(10): 2774-2776.
Combining the parallel architecture and hardware features of NVIDIA's Graphics Processing Unit (GPU) under the Compute Unified Device Architecture (CUDA), a new parallel connected-domain labeling algorithm for binary images was proposed. It locates the connected domains of a binary image and records their sizes at high speed, significantly reducing labeling time. It recognizes a connected domain by searching for the minimum label value in each pixel's neighborhood; because the pixels can be processed in any order and independently of one another, they can be handled in parallel. The computational efficiency of the algorithm is independent of the shapes and number of the connected regions, and the algorithm is robust. The experimental results show that the algorithm fully exploits the parallel processing capability of the GPU, achieving a speedup of more than 300 times over a general CPU-based algorithm and 17 times over the OpenCV (CPU) function when processing high-resolution images and images with many connected domains.
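The minimum-label propagation rule can be sketched sequentially; on the GPU each pixel's update would run in its own CUDA thread, iterated to a fixed point. This simplified 4-neighborhood version is illustrative, not the authors' kernel:

```python
def label_components(img):
    """Iterative min-label propagation on a binary image (1 = foreground).
    Each foreground pixel starts with a unique label and repeatedly takes the
    minimum label among itself and its 4-neighbors until nothing changes."""
    h, w = len(img), len(img[0])
    labels = [[r * w + c if img[r][c] else -1 for c in range(w)] for r in range(h)]
    changed = True
    while changed:
        changed = False
        for r in range(h):
            for c in range(w):
                if labels[r][c] < 0:
                    continue
                best = labels[r][c]
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < h and 0 <= nc < w and labels[nr][nc] >= 0:
                        best = min(best, labels[nr][nc])
                if best != labels[r][c]:
                    labels[r][c] = best
                    changed = True
    return labels

img = [[1, 1, 0],
       [0, 1, 0],
       [0, 0, 1]]
labels = label_components(img)
```

Since each pixel's update reads only its neighborhood and the order of updates does not affect the fixed point, the loop body parallelizes directly, which is the property the abstract relies on.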
Multi-dimensional vector radix fast Fourier transform with decimation in frequency domain
2010, 30(10): 2777-2780.
A multi-dimensional vector-radix Fast Fourier Transform (FFT) algorithm with Decimation In Frequency (DIF) was proposed. Applying vector radix-2 DIF in each dimension of the multi-dimensional frequency-domain signal, the general form of the butterfly computation of the FFT algorithm was obtained. The algorithm can be used for an arbitrary integer number of dimensions; in one dimension, it reduces to the well-known DIF radix-2 FFT. To facilitate programming, a flow chart of the 3-dimensional DIF vector-radix FFT was given, and this flow chart can easily be extended to any number of dimensions. The comparison results show that, compared with the multi-dimensional separable FFT, the DIF multi-dimensional vector-radix FFT algorithm has a lower computational load.
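The one-dimensional special case the algorithm reduces to, the radix-2 DIF FFT, can be sketched as follows (a textbook recursive form for illustration, not the paper's multi-dimensional butterfly):

```python
import cmath

def fft_dif(x):
    """Radix-2 decimation-in-frequency FFT; len(x) must be a power of two.
    The butterfly combines x[n] and x[n + N/2] BEFORE recursing, which is
    what distinguishes DIF from decimation-in-time."""
    n = len(x)
    if n == 1:
        return list(x)
    half = n // 2
    a = [x[i] + x[i + half] for i in range(half)]                  # even bins
    b = [(x[i] - x[i + half]) * cmath.exp(-2j * cmath.pi * i / n)  # odd bins
         for i in range(half)]
    even, odd = fft_dif(a), fft_dif(b)
    out = [0] * n
    out[0::2], out[1::2] = even, odd
    return out

spectrum = fft_dif([1, 0, 0, 0])  # impulse -> flat spectrum of ones
```

The vector-radix version applies this same pre-recursion butterfly simultaneously along every dimension, which is where its saving over the separable (dimension-by-dimension) FFT comes from.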
Application of parallel ant colony algorithm based on TBB and Cilk++ in path optimization
2010, 30(10): 2781-2784.
An Ant Colony Algorithm (ACA) with a rollback mechanism was proposed to obtain an optimized path that traverses all necessary nodes in a real road network. For large-scale real road networks, the convergence of the traditional sequential ACA is quite slow, so two parallel ACAs based on the Intel Threading Building Blocks (TBB) and Cilk++ parallel programming models were designed. Both replace complicated thread triggering and critical-resource boundary recognition with simple instructions, and are therefore easier to develop and more applicable than the WinAPI multi-threaded parallel ACA (pACA). The experimental results show that the TBB-based pACA is nearly identical to the WinAPI-based pACA in efficiency, while the Cilk++-based pACA exhibits higher efficiency and better speed-up than the WinAPI-based pACA on a dual-core system.
Graphics and image processing
Snake curve extraction algorithm based on improved particle swarm optimization
2010, 30(10): 2785-2787.
Data acquisition from existing drawings is an important part of basin modeling, but digital acquisition faces a trade-off between efficiency and accuracy. Combining target detection with the Particle Swarm Optimization (PSO) technique, an improved Snake curve extraction algorithm based on particle density was proposed. The algorithm maintains a certain distance between particles to avoid the premature convergence of PSO, and improves the convergence speed by modifying the model parameters. The experimental results show that, compared with the traditional PSO algorithm, the proposed algorithm is efficient; it has been applied in a practical project.
Parallel line drawing algorithm based on probability theory and mathematical morphology
2010, 30(10): 2788-2789.
Drawing parallel lines is a frequent task in Geographic Information System (GIS) development. In the traditional drawing algorithms, the nodes of the parallel lines may fall on the opposite side of, or far away from, the polyline when the angle at a node is very small. To handle these two cases, a correction method based on probability theory was proposed. At the same time, mathematical morphology was adopted to correct the results, which makes them aesthetic and smooths very sharp angles. The experimental results show that the proposed algorithm is effective.
De-correlated color correction for multiview video coding
2010, 30(10): 2790-2793.
The efficiency of multiview video coding is affected by the color discrepancies between pictures in different views. A new color correction method was proposed for multiview videos in the lαβ color space, whose tristimulus values are approximately uncorrelated with each other. According to the characteristics of the original multiview video data, appropriate transformation matrices were selected to transform the data from YUV to the lαβ color space. In order to mitigate the influence of illumination changes between views, correlation analysis was integrated into block matching to search for the best matching blocks. The data of these blocks were transformed into a de-correlated color space derived from lαβ and conveniently corrected with a statistics-matching approach. The results show that the subjective effect is satisfactory, with color discrepancies eliminated after correction, and that in terms of coding efficiency the average PSNR gain of the Y component reaches up to 1.4 dB, which proves the effectiveness of this color correction method.
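For background, the widely used RGB-to-lαβ conversion of Reinhard et al. can be sketched as below; the abstract transforms from YUV and does not give its exact matrices, so this RGB version is purely illustrative of why the space decorrelates the channels:

```python
import math

# RGB -> LMS cone response matrix (Reinhard et al., color transfer)
RGB2LMS = [[0.3811, 0.5783, 0.0402],
           [0.1967, 0.7244, 0.0782],
           [0.0241, 0.1288, 0.8444]]

def rgb_to_lab(rgb):
    """RGB -> log-LMS -> l, alpha, beta; the final rotation decorrelates
    the achromatic axis (l) from the two chromatic axes."""
    lms = [sum(m * c for m, c in zip(row, rgb)) for row in RGB2LMS]
    L, M, S = (math.log10(max(v, 1e-6)) for v in lms)
    l = (L + M + S) / math.sqrt(3)
    a = (L + M - 2 * S) / math.sqrt(6)
    b = (L - M) / math.sqrt(2)
    return l, a, b

# Grey pixels map close to alpha = beta = 0: brightness is isolated in l,
# so per-channel mean/variance matching barely disturbs the other channels.
l, a, b = rgb_to_lab((0.5, 0.5, 0.5))
```

It is this decorrelation that lets a statistics-matching correction adjust each channel independently without introducing cross-channel color casts.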
Fast intra-frame prediction algorithm based on power spectrum for H.264/AVC
2010, 30(10): 2794-2796.
To reduce the high complexity of H.264/AVC intra-frame prediction, a fast intra-frame prediction algorithm based on the power spectrum was proposed in combination with the characteristics of H.264 intra-frame prediction. Before coding, a discrete fast Fourier transform was applied to each macroblock, and the 3D power spectrum function in the frequency domain was reduced to a 2D power spectrum function. Then, according to a threshold decision on the correlation of the 2D power spectrum function, one of the two prediction modes was selected, so that coding time could be reduced dramatically. The experimental results show that the algorithm significantly increases the speed of intra coding with negligible loss of PSNR and increase of bit-rate. Furthermore, the algorithm is easy to implement on a video encoding chip and can be used in practice.
Clustering based on fuzzy Gibbs random field and 2D histogram algorithm for MR image segmentation
2010, 30(10): 2797-2801.
Concerning the uncertainty and fuzziness of the organizational structure of the human brain, an image segmentation algorithm combining clustering based on a fuzzy Gibbs random field with the two-dimensional histogram method was proposed. In the algorithm, membership functions were defined by the mean, variance and neighborhood attributes, and a fuzzy Gibbs random field was set up. Maximum A Posteriori (MAP) estimation was then used as the statistical clustering criterion, in which the fuzzy Gibbs random field provided the prior knowledge and every class center was updated by the centroid of its fuzzy class. Finally, each class center was fed into the two-dimensional histogram method to find segmentation points in each class region for image segmentation. The experimental results show that the proposed algorithm separates the various brain tissues accurately and outperforms the Fuzzy C-Means (FCM) algorithm in noise robustness, accuracy and smoothness.
Application of graph spectral theory to text image binarization processing
2010, 30(10): 2802-2804.
Traditional binarization thresholding methods cannot segment text effectively from the whole image, while an improved method based on graph spectral theory can segment it effectively and clearly. Since traditional graph-spectral algorithms have high computational and space complexity, the authors used the gray levels of an image instead of its pixels, and on this basis the parameters of the weight function were calculated approximately. The experimental results show that this method reduces the computational complexity, runs faster than traditional graph spectral methods, and produces better quality than common binarization algorithms.
Multi-focus image fusion algorithm based on region segmentation and nonsubsampled Contourlet transform
2010, 30(10): 2805-2807.
Segmentation algorithms based on neural networks have high computational complexity and load. To resolve this problem, a multi-focus image fusion algorithm was proposed based on the difference in definition between the focused and unfocused regions of a single-focal-length image. It effectively combines the multiscale, multidirection, anisotropic and shift-invariant qualities of the NonSubsampled Contourlet Transform (NSCT) in image decomposition, segmenting and fusing the low-frequency subbands while clustering the high-frequency subbands. The results show that this algorithm is an effective method for multi-focus image fusion.
Image de-noising algorithm based on stationary wavelet transform and morphology
2010, 30(10): 2808-2810.
Image noise degrades image quality and affects subsequent image processing. To eliminate noise while keeping image edge details, a new algorithm was proposed. It takes full account of the correlation of wavelet coefficients using phase invariance, estimates the skeleton information with reference to morphology, and performs de-noising on the selected congenial and irregular regions. The experimental results demonstrate that the proposed algorithm keeps the details of images while reducing image noise.
Soft morphological filter based on particle swarm algorithm
2010, 30(10): 2811-2814.
Typical median and mean filters have drawbacks such as incomplete denoising and image blurring. Therefore, a new Improved Soft Morphological Filter (ISMF) was proposed to remove salt-and-pepper and Gaussian noise while preserving details. In order to quantitatively analyze the parameters and nonlinear constraints in the filter, a modified simple Particle Swarm Optimization (msPSO) algorithm with high convergence speed and precision was given. The experimental results show that the ISMF optimized by msPSO performs better in Peak Signal-to-Noise Ratio (PSNR) and shape error.
New impulse noise filter based on weighted detection
2010, 30(10): 2815-2818.
Based on an analysis of the principles of noise detection and noise filtering, an effective image denoising algorithm was proposed to restore images corrupted by impulse noise. The proposed algorithm uses directional differences to decompose the filtering window into four subwindows, and then accurately distinguishes noise points from signal points by comparing, against a predefined threshold, the absolute weighted mean of the differences between the center pixel and its neighboring pixels in the four subwindows. According to the directional correlation, the algorithm adopts an edge-preserving filtering method to reconstruct the value of each corrupted pixel. The experimental results demonstrate that the proposed algorithm obtains higher Peak Signal-to-Noise Ratio (PSNR) values and preserves more detailed information.
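The detection step can be sketched as follows; the 3x3 window layout, uniform weights and threshold are illustrative assumptions, since the abstract does not give the exact subwindow definition:

```python
def is_impulse(window, thresh=40):
    """Impulse detection on a 3x3 window: the center is a noise candidate only
    if it differs strongly from its neighbors along ALL four directions."""
    c = window[1][1]
    directions = {                      # four directional subwindows
        "horizontal": [window[1][0], window[1][2]],
        "vertical":   [window[0][1], window[2][1]],
        "diag_main":  [window[0][0], window[2][2]],
        "diag_anti":  [window[0][2], window[2][0]],
    }
    diffs = [sum(abs(c - p) for p in pix) / len(pix)
             for pix in directions.values()]
    return min(diffs) > thresh          # an edge keeps at least one diff small

edge = [[10, 10, 200],                  # center lies on an edge, not noise
        [10, 10, 200],
        [10, 10, 200]]
spike = [[10, 12, 11],                  # isolated impulse
         [13, 255, 10],
         [11, 10, 12]]
```

Taking the minimum over the directional differences is what preserves edges: a pixel on a genuine edge still agrees with its neighbors along the edge direction, so only isolated outliers are flagged.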
Fast block-matching search algorithm based on down-sampling and its application in denoising
2010, 30(10): 2819-2822.
A fast block-matching search algorithm based on down-sampling, named Down-sampling Three Step Search (DTSS), was proposed. In a video sequence, down-sampled versions of the current frame and the reference frame are first obtained by bilinear interpolation. The initial motion estimate is then computed by Three Step Search (TSS) on the down-sampled frames. Finally, the refined search result is obtained from the relation between the down-sampled and original frames. Owing to the low-pass characteristic of bilinear interpolation, DTSS effectively suppresses noise and finds accurate motion vectors; at the same time, the sampling technique makes the block-matching search faster. The experimental results show that DTSS outperforms existing search methods such as TSS and Diamond Search (DS) in both search speed and accuracy. Combined with the Multi-Hypothesis Motion Compensated Filter (MHMCF) noise reduction algorithm, the experimental results also indicate that this block-based search algorithm is well suited to video denoising applications.
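The TSS stage run on the (down-sampled) frames can be sketched as a textbook three step search over the sum of absolute differences; block size, search range and the toy frames below are illustrative assumptions:

```python
def sad(cur, ref, bx, by, dx, dy, n=4):
    """Sum of absolute differences between the n x n block of `cur` at (bx, by)
    and the block of `ref` displaced by (dx, dy)."""
    total = 0
    for y in range(n):
        for x in range(n):
            total += abs(cur[by + y][bx + x] - ref[by + dy + y][bx + dx + x])
    return total

def three_step_search(cur, ref, bx, by, n=4):
    """Classic TSS: start with step 4, probe the 8 neighbours of the best
    displacement so far, halve the step, and repeat until the step reaches 1."""
    best, best_cost = (0, 0), sad(cur, ref, bx, by, 0, 0, n)
    step = 4
    while step >= 1:
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                cand = (best[0] + dx, best[1] + dy)
                nx, ny = bx + cand[0], by + cand[1]
                if nx < 0 or ny < 0 or nx + n > len(ref[0]) or ny + n > len(ref):
                    continue            # candidate block outside the frame
                cost = sad(cur, ref, bx, by, cand[0], cand[1], n)
                if cost < best_cost:
                    best, best_cost = cand, cost
        step //= 2
    return best

# Reference frame holds a bright 4x4 patch; the current frame shows the same
# patch shifted by (+2, +1), so the motion vector back to it is (-2, -1).
ref = [[0] * 16 for _ in range(16)]
cur = [[0] * 16 for _ in range(16)]
for y in range(4):
    for x in range(4):
        ref[6 + y][6 + x] = 100
        cur[7 + y][8 + x] = 100
mv = three_step_search(cur, ref, 8, 7)
```

In DTSS this search runs on the half-resolution frames, and the resulting vector is scaled up and refined on the original frames, which is where the speed gain comes from.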
Non-linear scaling algorithm based on potential energy of color images
2010, 30(10): 2823-2824.
PDF (431KB)
Non-linear scaling can maintain the main characteristics and the aspect ratio of an image. Based on the image gradient function, a new potential-energy algorithm for color images in the RGB model was proposed, and strong regions were located using this potential energy. Fast linear interpolation scaling was applied to the low-energy regions, while the strong regions were preserved to protect the important features of the image objects. Compared with traditional image scaling algorithms, the proposed algorithm effectively reduces the distortion introduced by scaling.
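One plausible reading of the gradient-based potential energy can be sketched as below; summing the gradient magnitude over the R, G and B channels and the 90% quantile threshold are illustrative assumptions:

```python
import numpy as np

def potential_energy(img):
    """Per-pixel potential energy of an RGB image, taken here as the
    absolute gradient magnitude summed over the three colour channels."""
    f = img.astype(float)
    gy = np.abs(np.diff(f, axis=0, prepend=f[:1]))      # vertical gradient
    gx = np.abs(np.diff(f, axis=1, prepend=f[:, :1]))   # horizontal gradient
    return (gx + gy).sum(axis=2)

def strong_regions(img, quantile=0.9):
    """Mask of high-energy ('strong') regions to be preserved during
    scaling, while fast linear interpolation handles the rest."""
    e = potential_energy(img)
    return e >= np.quantile(e, quantile)
```

Low-energy (flat) regions absorb most of the size change, so edges and textured objects keep their proportions after scaling.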
Typical applications
Probability based adaptive cross-platform multi-party conference scheme
2010, 30(10): 2825-2827.
PDF (472KB)
Considering practical applications and the characteristics of small devices such as PDAs, a new, simple, fast, real-time adaptive cross-platform conference scheme was put forward. In the proposed scheme, priority was decided by probability: each client calculated its audio probability according to the energy value and the coded frame length, then the server selected the current speakers by audio probability, mixed their streams, and finally transmitted the mixed packets. The scheme compensates for the limited computing power of PDAs and other small devices while also reducing the computational complexity of the server. The simulation results show that the proposed scheme has low complexity and good perceived audio quality. It can be easily implemented on hardware such as PDAs and mobile phones, and can be widely applied in cross-platform multimedia conference systems.
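The client/server split described above can be sketched as follows. The equal weighting of the two features and the normalising constants are illustrative assumptions:

```python
def audio_probability(energy, frame_len, max_energy=32768.0, max_len=320.0):
    """Client side: speaking probability from the frame energy and the
    coded frame length, each normalised to [0, 1] and averaged."""
    return (0.5 * min(energy / max_energy, 1.0)
            + 0.5 * min(frame_len / max_len, 1.0))

def pick_speakers(clients, k=3):
    """Server side: the k clients with the highest audio probability are
    selected as the current speakers; only their streams are mixed."""
    ranked = sorted(clients.items(), key=lambda kv: kv[1], reverse=True)
    return [cid for cid, _ in ranked[:k]]
```

The client only computes two cheap per-frame statistics, and the server only sorts scalars and mixes k streams, which is where the complexity savings on both sides come from.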
Heuristic algorithm for optimizing insertion order schedule of agile supply chain
2010, 30(10): 2828-2830.
PDF (634KB)
In order to solve the insertion-order scheduling problem of agile supply chains for production planning, a two-stage supply chain composed of one factory and many suppliers was studied. Taking minimization of the total supply chain cost as the objective, an Integer Programming (IP) model was designed to describe the scheduling problem based on a time-slot representation of each firm's available scheduling periods, and a One-by-One Selection Heuristic (OOSH) algorithm was proposed to solve the IP model. Finally, by comparison with the results of the Distance Priority (DP) and Cycle Time Priority (CTP) algorithms in scheduling experiments, the feasibility and effectiveness of the model and algorithm were verified. The experimental results also reveal that the agile supply chain form is competitive.
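The one-by-one selection idea can be sketched as a greedy assignment over time slots. The cost model below (a fixed cost per supplier/slot pair) is a simplifying assumption, not the paper's full IP model:

```python
def oosh(orders, suppliers):
    """One-by-One Selection Heuristic sketch: each insertion order is
    assigned, in turn, to the supplier/time-slot pair with the lowest
    cost among those still free.

    orders    : list of order ids
    suppliers : dict mapping supplier -> list of free (slot, cost) pairs
    """
    schedule = {}
    for order in orders:
        best = None
        for sup, slots in suppliers.items():
            for i, (slot, cost) in enumerate(slots):
                if best is None or cost < best[3]:
                    best = (sup, i, slot, cost)
        if best is None:
            raise ValueError("no free slot left for order %r" % order)
        sup, i, slot, cost = best
        schedule[order] = (sup, slot, cost)
        suppliers[sup].pop(i)   # the chosen slot is now occupied
    return schedule
```

Such a heuristic gives no optimality guarantee, which is why the abstract benchmarks it against the DP and CTP rules rather than against an exact IP solver.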
Measures for improving availability of networked on-line fingerprint attendance system
Zeng Xiang-Xu
2010, 30(10): 2831-2833.
PDF (519KB)
Concerning the problems that a networked online fingerprint attendance system suffers from long delays, a large number of samples to match, a tendency to mismatch, and high requirements on fingerprint quality in application, three approaches were proposed to improve the availability of the system. First, by combining fingerprint verification with a salary-based credit card, the required 1∶N remote fingerprint verification was replaced by local 1∶1 fingerprint verification. Then, redundant and rolling-updated fingerprint templates were used, and fingerprint image quality estimation information was applied to fingerprint matching. Actual application shows that the proposed approaches are effective and greatly improve the availability of the online fingerprint attendance system, which can be used in large-scale business.
Network fault detection based on fuzzy one class SVM with least squares and equality constraints
2010, 30(10): 2834-2837.
PDF (598KB)
A new classifier named Least Squares Fuzzy One-Class Support Vector Machine (LSFOC-SVM) was proposed to enhance the efficiency and effectiveness of the one-class support vector machine applied to network fault anomaly detection. The proposed LSFOC-SVM not only reduced the high computational cost by training with least squares and equality constraints, which yields a set of linear equations instead of a quadratic program, but also raised the fault detection rate by properly extending the fault alarm area with a fuzzy membership based on distance in feature space and an appropriate alarm threshold. The comparative study indicates that LSFOC-SVM greatly improves training efficiency without affecting diagnosis accuracy, and application tests verify the feasibility of the method.
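The key computational point, replacing the quadratic program with one linear system, can be sketched as below. This exact system, with fuzzy memberships entering the regularisation term, is an illustrative reconstruction of a least-squares one-class formulation, not the paper's verbatim derivation:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between rows of X and rows of Y."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def lsfoc_svm_train(X, C=10.0, mu=None, gamma=0.5):
    """With equality constraints and a least-squares loss, training reduces
    to solving (K + diag(1/(C*mu))) alpha = 1 -- one linear system instead
    of a QP.  `mu` holds fuzzy membership degrees in (0, 1]: low-membership
    samples are regularised more heavily."""
    n = len(X)
    if mu is None:
        mu = np.ones(n)                       # crisp memberships by default
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + np.diag(1.0 / (C * mu)), np.ones(n))

def lsfoc_svm_score(Xtr, alpha, x, gamma=0.5):
    """Decision score of a new point: high for normal traffic, low for a
    fault; comparing it with an alarm threshold raises the alert."""
    return float((rbf_kernel(x[None, :], Xtr, gamma) @ alpha)[0])
```

Solving one n x n linear system is O(n^3) with a small constant and no iterative QP solver, which is the source of the training speed-up claimed in the abstract.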
Ontology-based mapping approach for network management information models
2010, 30(10): 2838-2842.
PDF (800KB)
To achieve information sharing among network management systems and solve the problem of semantic heterogeneity between information models, an ontology-based approach for automated mapping between information models was proposed, based on an analysis of the current state of network management information models and the advantages of ontology mapping technologies. After translating an information model into an ontology, mapping results were generated by dynamic adaptive multi-strategy ontology matching. A prototype system named OntoNM was developed and implemented with Protégé and the information model development kits, and the recommended strategy was obtained by experimental evaluation. The experimental results show that the approach improves the efficiency of information model mapping.
Anti-jamming performance analysis of optimal chaotic spread spectrum sequences
2010, 30(10): 2843-2845.
PDF (596KB)
In order to improve the anti-jamming performance of Direct Sequence/Code Division Multiple Access (DS/CDMA) systems based on chaotic sequences, an optimization algorithm for choosing chaotic sequences was proposed. The optimal improved Logistic-map chaotic sequences were applied to a DS/CDMA system, and the anti-jamming performance was studied under different kinds of jamming, such as single-frequency interference, partial-band interference, and impulsive interference. The results show that the optimal improved Logistic-map chaotic sequences have larger capacity, excellent anti-jamming ability, and lower Bit Error Rate (BER). Compared with the Gold sequence at the same BER, the Signal-to-Noise Ratio (SNR) can be improved by at least 2 dB, so the sequences are better suited to DS/CDMA systems and anti-jamming applications.
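A chaotic spreading sequence of the kind discussed here can be sketched as follows. The map x_{k+1} = 1 - 2*x_k^2 on (-1, 1) is one common "improved Logistic" variant; the paper's exact map, seed handling, and optimization criterion may differ:

```python
def chaotic_sequence(x0, n, discard=100):
    """Generate n spreading chips (+1/-1) from the improved Logistic map
    x_{k+1} = 1 - 2*x_k**2.  The first `discard` iterates are dropped to
    remove the transient; each remaining state is quantised to its sign."""
    assert -1.0 < x0 < 1.0 and x0 != 0.0
    x = x0
    chips = []
    for k in range(discard + n):
        x = 1.0 - 2.0 * x * x
        if k >= discard:
            chips.append(1 if x >= 0 else -1)
    return chips
```

Because the map is sensitive to its initial condition, every distinct seed yields an essentially uncorrelated sequence, which is why a chaotic family offers far more codes (larger capacity) than a fixed-length Gold code set; the optimization step then selects the seeds with the best correlation properties.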
Design and implementation of real-time power measurement system for server's key energy components
2010, 30(10): 2846-2849.
PDF (581KB)
In view of high energy consumption and the lack of a real-time monitoring system, a power measurement system based on the ATmega16 microcontroller was designed and implemented to monitor a server's key power components. The power consumption of the whole server, the hard disk, the CPU and the memory was gauged separately in real time and displayed as curves. The system used pass-through external sensing components with no insertion loss, so the measurement is non-destructive. The experimental results show that the system is lossless and stable, and that the measured results are consistent with standard power values, reflecting the power changes of the server's key components in real time.
Application of improved dynamic matrix control algorithm to fermentation temperature control
2010, 30(10): 2850-2852.
PDF (465KB)
Beer fermentation is a complex biochemical reaction whose temperature control exhibits large time delay, and it is difficult to establish an accurate mathematical model for the plant because of its complex mechanism and changing environment. Conventional control algorithms often perform poorly in this kind of system, especially when there are unpredictable disturbances. To solve these problems, a new predictive control strategy for the fermentation temperature was proposed based on Dynamic Matrix Control (DMC) improved by time-optimal control. The application results show that this strategy achieves good fermentation temperature control and quickly suppresses disturbances.
Signal integrity analysis of high speed and high density PCB design based on FPGA
2010, 30(10): 2853-2856.
PDF (716KB)
As Moore's law advances, designing high-speed, high-density Printed Circuit Boards (PCB) becomes increasingly complex. For the Signal Integrity (SI) of large or extra-large high-speed, high-density PCBs, a research method combining theoretical analysis and model simulation with engineering practice was introduced, and several solutions and design rules were put forward. The following issues were discussed: PCB stack-up, transmission line styles, characteristic impedance calculation, topology, terminations, delay matching, crosstalk, differential-pair layout, etc. On this basis, a few principles were given for multilayer PCB design based on Field Programmable Gate Array (FPGA) devices. A specific project application proves that high-speed, high-density PCB designs guided by these principles achieve good practical results.
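For the characteristic impedance calculation mentioned above, a standard closed-form estimate for a surface microstrip is the IPC-2141 approximation; the example geometry below (FR-4, 10 mil trace over a 10 mil dielectric) is only illustrative:

```python
import math

def microstrip_z0(h, w, t, er):
    """Characteristic impedance (ohms) of a surface microstrip using the
    IPC-2141 approximation:
        Z0 = 87 / sqrt(er + 1.41) * ln(5.98*h / (0.8*w + t))
    h: dielectric height, w: trace width, t: trace thickness (same units),
    er: relative permittivity.  Valid roughly for 0.1 < w/h < 2.0 and
    1 < er < 15; field solvers should be used outside that range."""
    return 87.0 / math.sqrt(er + 1.41) * math.log(5.98 * h / (0.8 * w + t))

# Example: FR-4 (er ~ 4.3), 10 mil dielectric, 10 mil wide, 1.4 mil thick.
z = microstrip_z0(10.0, 10.0, 1.4, 4.3)
```

Widening the trace or thinning the dielectric lowers Z0, which is the lever used to hit a 50-ohm target during stack-up planning.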
Medical image classification based on wavelet neural network
2010, 30(10): 2857-2860.
PDF (656KB)
To improve the accuracy of early diagnosis of breast cancer, an improved wavelet neural network algorithm combining wavelet theory with neural network theory was proposed. It extracted eigenvalues from preprocessed medical images and then classified the images with a classifier based on the improved wavelet neural network. The experimental results show that the classifier has higher accuracy and that the classification is effective and feasible; compared with a neural network trained by back-propagation alone, it improves the classification results.
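The wavelet feature-extraction front end can be sketched with a one-level Haar decomposition; using sub-band energies as the extracted eigenvalues is an assumption about the feature set, and the network itself is omitted:

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar decomposition of an even-sized grayscale image
    into approximation (LL) and detail (LH, HL, HH) sub-bands."""
    f = img.astype(float)
    L = (f[:, 0::2] + f[:, 1::2]) / 2.0     # horizontal average
    H = (f[:, 0::2] - f[:, 1::2]) / 2.0     # horizontal detail
    LL = (L[0::2] + L[1::2]) / 2.0
    LH = (L[0::2] - L[1::2]) / 2.0
    HL = (H[0::2] + H[1::2]) / 2.0
    HH = (H[0::2] - H[1::2]) / 2.0
    return LL, LH, HL, HH

def wavelet_features(img):
    """Mean energy of each sub-band, a compact feature vector that could
    be fed to the neural-network classifier."""
    return np.array([(b ** 2).mean() for b in haar2d(img)])
```

The detail-band energies capture texture at different orientations, which is the kind of information a pixel-domain back-propagation network would otherwise have to learn from scratch.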
Superintended by:
Sichuan Associations for Science and Technology
Sponsored by:
Sichuan Computer Federation
Chengdu Branch, Chinese Academy of Sciences
Honorary Editor-in-Chief:
ZHANG Jingzhong
Editor-in-Chief:
XU Zongben
Associate Editor:
SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code:
62-110
Foreign Distribution Code:
M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel:
028-85224283-803
028-85222239-803
Website:
www.joca.cn
E-mail:
bjb@joca.cn