
Table of Contents

    01 July 2010, Volume 30 Issue 07
    Networks and communications
    SWBCA: Stability-oriented weight-based clustering algorithm for VANETs
    2010, 30(07):  1711-1713. 
    Clustering is one of the key technologies for improving network performance in Ad Hoc networks, and enhancing the stability of a clustering algorithm reduces its maintenance overhead. According to the characteristics of Vehicular Ad Hoc Networks (VANETs), a Stability-oriented Weight-Based Clustering Algorithm (SWBCA) was proposed in this paper. For cluster-head election, SWBCA first took into account the relative mobility of nodes and the difference between each node's degree and the ideal degree value. Cluster maintenance was then improved with Monte Carlo optimization. Simulations with NS2 show that SWBCA is more stable than other algorithms and helps improve broadcast performance.
    ASDZP: New regional strategy for adaptive MANET service discovery
    2010, 30(07):  1714-1718. 
    To improve the stability and efficiency of service discovery in Mobile Ad Hoc Networks (MANET), the characteristics of the existing adaptive zone protocols were analyzed, and a new regional strategy for adaptive MANET service discovery was proposed. This method regulated the radius of the service discovery zone according to the average stability and activity of the nodes in the zone, reducing the number of unstable nodes and increasing the proportion of service nodes in the zone, so that the overall regional service capability was enhanced. Analytical and simulation results indicate that the strategy outperforms traditional service discovery based on the Zone Routing Protocol (ZRP) in terms of control overhead and energy consumption, and effectively decreases the average delay.
    Energy-efficient coverage algorithm in 3D wireless sensor networks
    2010, 30(07):  1719-1721. 
    A Wireless Sensor Network (WSN) usually works in three-dimensional space, and therefore requires coverage algorithms designed for three-dimensional space. The authors analyzed the Stand Guard Algorithm (SGA), a coverage algorithm for two-dimensional space, integrated the characteristics of three-dimensional space to improve it, and finally proposed a new three-dimensional coverage algorithm, SSG. SSG is a location-free algorithm, and its quality of service was examined by simulations.
    Formal modeling and analysis approach of wireless sensor network
    2010, 30(07):  1722-1724. 
    As a formal tool used in the research of clustering and node coverage in wireless sensor networks, the Petri net can describe a wireless sensor network formally and establish the corresponding formal model quickly. Since Petri nets rest on a solid mathematical theory, they are well suited to studying the energy-constraint problem of wireless sensor networks during node clustering and covering. The formal model can provide a theoretical and numerical basis for designing better clustering and covering structures, and thus help improve existing algorithms.
    Online upgrade method for embedded wireless information terminal
    2010, 30(07):  1725-1727. 
    Based on existing research and achievements in wireless communication and embedded technologies at home and abroad, and concerning the shortcomings of existing software upgrade methods for embedded wireless information terminals, a software online upgrade method based on a 3rd-generation wireless module was proposed with the goals of rapidity, safety and reliability. First, the design ideas and working principles were described. Then the designs of the key technologies were expounded in detail. Finally, an application case with test results was given. The results show that the method is rapid, safe and reliable, and that the cost of software renewal and maintenance can be effectively reduced.
    New adaptive RFID anti-collision algorithm based on collision-tree
    2010, 30(07):  1728-1730. 
    The anti-collision algorithm is a key technique for improving identification efficiency in a Radio Frequency IDentification (RFID) system. The elementary binary search algorithm and some improved algorithms were analyzed, and an Adaptive anti-collision algorithm based on a Collision Tree (ACT) was proposed. In this algorithm, the reader detected collisions in the feedback information of the tags, and then used the first collision bit as a new node of the collision tree to group the tags. The reader then made use of stack and regressive-index techniques, taking the first and last collision bits as the parameters of the next request. Matlab simulation results show that the proposed algorithm is effective: it can eliminate empty timeslots, reduce repetitive information and the number of bits transferred during identification, and it especially fits environments containing large numbers of tags and tags with long information.
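The collision-tree splitting idea can be sketched as a small reader-side simulation. The tag IDs and the jump-to-the-first-collision-bit step below are illustrative; this is not the paper's exact ACT protocol:

```python
def query_tree_identify(tags):
    """Identify a set of distinct, equal-length binary tag IDs with a
    collision-tree scheme: the reader broadcasts a prefix, tags whose ID
    starts with it reply, and a collision splits the tree at the first
    bit where the responding tags actually differ."""
    identified, queries = [], 0
    stack = [""]                       # pending query prefixes
    while stack:
        prefix = stack.pop()
        queries += 1
        matching = [t for t in tags if t.startswith(prefix)]
        if len(matching) == 1:
            identified.append(matching[0])
        elif len(matching) > 1:
            # skip ahead to the first collision bit, then split there
            i = len(prefix)
            while len({t[i] for t in matching}) == 1:
                i += 1
            common = matching[0][:i]
            stack.append(common + "1")
            stack.append(common + "0")
    return identified, queries
```

Skipping directly to the first collision bit avoids one query per shared prefix bit, which is where the reduction in transferred bits comes from.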
    Energy model for wireless sensor networks based on hierarchical topology
    2010, 30(07):  1731-1735. 
    Energy efficiency is an important indicator in the performance evaluation of a Wireless Sensor Network (WSN), and researching energy models is essential for improving it. Concerning the hierarchical topology model of a wireless sensor network, this paper proposed energy consumption models for an ordinary sensor node and a cluster-head node according to the characteristics and various roles of a sensor node's working energy consumption in the network. The paper deduced the energy consumption of single-hop and multi-hop networks, theoretically computed the optimal number of cluster heads under the condition of least energy consumption, and compared the energy consumption of different transmission models. The formulas for energy consumption and the optimal number of cluster heads were theoretically analyzed and deduced, providing guidance and a theoretical basis for the design of energy-efficient topology algorithms and communication protocols.
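How an optimal cluster-head count falls out of such an energy model can be illustrated with a standard first-order radio model; all parameter values below are common textbook assumptions, not the paper's:

```python
import math

# First-order radio model parameters (illustrative LEACH-style values)
E_ELEC = 50e-9      # J/bit, transceiver electronics energy
EPS_FS = 10e-12     # J/bit/m^2, free-space amplifier coefficient
N, M = 100, 100.0   # number of nodes, side of the square field (m)
D_BS = 75.0         # assumed mean cluster-head-to-base-station distance (m)
BITS = 4000         # bits sent per node per round

def energy_per_round(k):
    """Total energy of one round with k cluster heads: members transmit
    to their head; each head receives from its members and forwards
    one aggregated packet to the base station."""
    d_ch2 = M * M / (2 * math.pi * k)     # expected squared member-to-head distance
    e_member = BITS * (E_ELEC + EPS_FS * d_ch2)
    e_head = BITS * (E_ELEC * (N / k) + E_ELEC + EPS_FS * D_BS ** 2)
    return k * e_head + (N - k) * e_member

# the optimum balances member transmit cost against head receive cost
k_opt = min(range(1, N), key=energy_per_round)
```

With too few heads the members transmit far; with too many, head overhead dominates, so the per-round energy has an interior minimum.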
    Improvement of node localization in wireless sensor network based on particle swarm optimization
    2010, 30(07):  1736-1738. 
    Focusing on the requirements of low cost and low power in Wireless Sensor Networks (WSN), this paper introduced a new method based on the DV-Hop algorithm that uses Particle Swarm Optimization (PSO) to correct the position estimated by DV-Hop during the third stage of the algorithm, making use of the estimated distances between nodes and the positions of the anchor nodes. The algorithm needs neither additional devices nor increased traffic. Simulations show that the improved algorithm can decrease the average location error by up to 30% and effectively reduce the cost.
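The correction step can be sketched with a basic PSO that minimises the mismatch between a candidate position and the estimated anchor distances; the swarm size, coefficients and scenario below are illustrative assumptions, not the paper's settings:

```python
import random

def pso_refine(anchors, dists, start, iters=200, n=20, seed=1):
    """Refine a DV-Hop position estimate `start` with a basic PSO that
    minimises the squared mismatch between candidate-to-anchor distances
    and the DV-Hop distance estimates `dists`."""
    random.seed(seed)
    def cost(p):
        return sum((((p[0] - ax) ** 2 + (p[1] - ay) ** 2) ** 0.5 - d) ** 2
                   for (ax, ay), d in zip(anchors, dists))
    # particles start scattered around the DV-Hop estimate
    pos = [[start[0] + random.uniform(-5, 5),
            start[1] + random.uniform(-5, 5)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    best = [p[:] for p in pos]                 # personal bests
    gbest = min(best, key=cost)[:]             # global best
    for _ in range(iters):
        for i in range(n):
            for d in (0, 1):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (best[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(best[i]):
                best[i] = pos[i][:]
                if cost(best[i]) < cost(gbest):
                    gbest = best[i][:]
    return gbest
```

Because the cost uses only quantities DV-Hop already produces, the refinement adds no hardware and no extra traffic, matching the abstract's claim.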
    Pattern synthesis of antenna in deepspace exploration with genetic algorithm
    2010, 30(07):  1739-1741. 
    A Genetic Algorithm (GA) was used to synthesize the patterns of a focal-plane-array-fed parabolic reflector antenna for deep-space exploration. Physical optics was used to compute the far-field pattern of the antenna for the excitation of each individual array element in the focal-plane array. Various scanning patterns for deep-space exploration were synthesized with this GA. The results show that the proposed algorithm can control the sidelobe level, synthesize the expected pattern more precisely, and expand the visual range of a deep-space exploration antenna.
    Adaptive load balancing model based on prediction mechanism
    2010, 30(07):  1742-1745. 
    Workload characteristics have an important impact on the performance of load-balancing scheduling algorithms in Web server cluster systems. Following an analysis and discussion of the role of load characteristics in scheduling, a prediction-based adaptive load-balancing algorithm (RR_MMMCS-A-P) was proposed. RR_MMMCS-A-P predicts the arrival rate and the size of follow-up requests by monitoring workload characteristics and rapidly adjusts the corresponding parameters to balance the load between servers. The experimental results show that, compared with CPU-based and CPU-memory-based scheduling algorithms, RR_MMMCS-A-P performs better in reducing the average response time for both computation-intensive and data-intensive jobs.
    Role-control-based bandwidth allocation for space-based network
    2010, 30(07):  1746-1749. 
    Spacebased integrated information networks were of relatively poor stability. In view of this,a Role-Control-based dynamic Bandwidth Allocation (RCBA) algorithm was presented. The base station classified the bandwidth requests according to the priority, and permitted the bandwidth occupation on the basis of reservation bandwidth. The theoretical analysis and simulation results prove that, compared with the traditional bandwidth allocation algorithm,RCBA algorithm reduces the delay and the packet loss rate,and preferably meets the requirements of applications in Space-Based Network (SBN).
    Adaptive ADTCP: Improvement scheme of TCP in Ad Hoc network
    2010, 30(07):  1750-1753. 
    Based on an analysis of the characteristics of Ad Hoc networks and their impact on TCP performance, a mechanism named Adaptive ADTCP (Adaptive Ad Hoc TCP) was proposed. After acquiring a clear identification of the network status, this mechanism configured the window increment factor adaptively according to the forward path length; greedy growth of the TCP congestion window was then limited to avoid congestion, and the packet size was changed in light of the current congestion window to make full use of network resources. The simulation results show that Adaptive ADTCP significantly improves TCP throughput in both heavy-load and high-mobility scenarios.
    Timing synchronization algorithm based on training symbol for OFDM systems
    2010, 30(07):  1754-1756. 
    Orthogonal Frequency Division Multiplexing (OFDM), an efficient data transmission technology, has good anti-fading ability and broad application prospects. However, OFDM is particularly sensitive to synchronization errors, which can greatly degrade system performance. Based on the Schmidl-Cox algorithm, the training symbols were improved to complete symbol timing synchronization; the improved algorithm was described in detail and simulated. The results show that the improved algorithm removes the ambiguity of the timing metric and improves the precision of timing estimation; moreover, it has a lower bit error rate under the same conditions.
    Solution of IPSec NAT-traversal in virtual private network
    2010, 30(07):  1757-1759. 
    The IPSec architecture and Network Address Translation (NAT) are both widely used in today's Internet. However, the incompatibility between them limits the development of Virtual Private Networks (VPN) based on IPSec technology. The Internet Engineering Task Force (IETF) has proposed a series of drafts based on User Datagram Protocol (UDP) encapsulation to solve this problem, but that solution does not cover double NAT-traversal in IPSec VPN. Drawing on the methods of UDP encapsulation and double NAT-traversal, this paper proposed a feasible solution to the NAT-traversal problem in different situations, and its feasibility was clarified in detail.
    Iterative decoding algorithm of TPC based on correlation computation
    2010, 30(07):  1760-1762. 
    The Chase-Pyndiah algorithm is one of the decoding algorithms for Turbo Product Codes (TPC). Based on it, an iterative decoding algorithm involving correlation was introduced. This algorithm avoids the computation of Euclidean distance by using correlation as the metric, and uses a comparative method to choose the concurrent codewords, which avoids the search for competing codewords. On this basis, a further simplification was presented, which decreased the complexity and the latency of the decoder. Both analysis and simulation results show the advantage of the proposed algorithm over the conventional algorithm in speed and performance.
    Information security
    Deduplication-oriented backup-data encryption method
    2010, 30(07):  1763-1766. 
    To combine data confidentiality with deduplication efficiency, a deduplication-oriented backup-data encryption method was proposed. In this method, the symmetric keys used to encrypt the chunks were generated from the hash values of the chunk contents in a consistent way, so a one-to-one correspondence between chunk plaintext and chunk ciphertext could be guaranteed. The confidentiality of backup data in storage and transmission can be protected efficiently on condition that the user's private key and identification password are not leaked simultaneously. The experimental results indicate that, unlike traditional encryption methods, this method is well compatible with deduplication technology, and the storage space utilization of encrypted backup data can be improved notably. The method is applicable to the backup of massive data with confidentiality requirements.
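The key idea, often called convergent encryption, is that the chunk key is derived from the chunk's own hash, so identical plaintext chunks always produce identical ciphertexts and remain deduplicable. A minimal sketch, with a toy XOR keystream standing in for a real cipher:

```python
import hashlib

def chunk_key(chunk: bytes) -> bytes:
    """Derive the symmetric key from the chunk content itself, so the
    same plaintext chunk always maps to the same key (and ciphertext)."""
    return hashlib.sha256(chunk).digest()

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream built from SHA-256(key || counter); a real system
    # would use a proper cipher keyed the same content-derived way.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

def encrypt_chunk(chunk: bytes) -> bytes:
    return keystream_xor(chunk_key(chunk), chunk)

def decrypt_chunk(ciphertext: bytes, key: bytes) -> bytes:
    return keystream_xor(key, ciphertext)   # XOR is its own inverse
```

Deterministic encryption is exactly what makes deduplication of ciphertext possible; the trade-off (an attacker holding a chunk can confirm its presence) is inherent to the approach.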
    Generation of system malicious behavior specification based on system call trace
    2010, 30(07):  1767-1770. 
    In the study of malicious code, the automatic generation of malicious behavior specifications is still a difficult problem. In the field of generating malicious behavior specifications from system call traces, the existing graph representation uses minimal contrast subgraph mining to generate the specification, but its worst-case time complexity reaches O(N!). This paper studied the automatic generation of malicious code specifications and, to reduce the complexity, proposed a method that transforms program call traces into a graph representation in which the number of final graph nodes is reduced and each node label is unique, so that the worst-case time complexity of specification generation is O(N²).
    Method for network anomaly detection with multi-measure based on relative entropy theory
    2010, 30(07):  1771-1774. 
    A low detection rate, a high false alarm rate and the limited types of attacks that can be detected have become the biggest obstacles to the development of network anomaly detection. To improve the detection rate, reduce the false alarm rate and enlarge the range of detected attacks, a new network anomaly detection method was proposed. Firstly, the network traffic was analyzed and the full-probability events of the measures were characterized by introducing relative entropy theory. Secondly, a weighted relative entropy was obtained by integrating the relative entropies of multiple measures with weight coefficients. Finally, the comprehensive multi-measure weighted relative entropy served as the criterion for judging network anomalies. The experimental results on the DARPA 1999 intrusion detection evaluation datasets show that the detector achieves a higher detection rate at a lower false alarm rate and outperforms other methods.
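A minimal sketch of the weighted relative entropy combination, assuming each measure yields a discrete distribution compared against a learned baseline (the choice of measures and weights is illustrative):

```python
import math

def relative_entropy(p, q, eps=1e-10):
    """Relative entropy (KL divergence) D(p || q) between two discrete
    distributions over the same bins; eps guards against zero bins."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def weighted_relative_entropy(measures, weights):
    """Combine per-measure relative entropies with weight coefficients.
    `measures` is a list of (current, baseline) distribution pairs; the
    weighted sum is the anomaly score thresholded by the detector."""
    return sum(w * relative_entropy(p, q)
               for w, (p, q) in zip(weights, measures))
```

Traffic matching the baseline scores near zero on every measure, while a distribution shift in any weighted measure (port mix, flag mix, etc.) raises the combined score.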
    Immunization strategy by cutting edges
    2010, 30(07):  1775-1777. 
    To eliminate viruses with few immunized nodes and at high speed, this paper proposed a new edge-based immunization strategy for inhomogeneous networks. The strategy immunized the edges between important nodes, or between two such nodes and their neighbors. Using the Susceptible-Infectious-Susceptible (SIS) epidemic spreading model, the paper explicitly tested the immunization threshold and the connectivity of the proposed strategy on ER, BA scale-free and several real networks. The experiments demonstrate that the strategy requires fewer immunized nodes than targeted immunization while reaching the same density of infected nodes, and that it preserves connectivity better than targeted immunization.
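The edge-selection step can be sketched as follows; ranking "important nodes" by degree is an illustrative stand-in for whatever importance measure the paper uses:

```python
def edges_to_immunize(adj, k):
    """Return the set of edges to immunize: every edge incident to one
    of the k highest-degree ('important') nodes, which covers both
    hub-to-hub edges and hub-to-neighbour edges. `adj` maps each node
    to the set of its neighbours."""
    hubs = sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:k]
    edges = set()
    for u in hubs:
        for v in adj[u]:
            edges.add((min(u, v), max(u, v)))   # canonical edge ordering
    return edges
```

Cutting these edges starves the epidemic of its main transmission routes while, unlike node removal, leaving the hub nodes themselves in place, which is why connectivity degrades less than under targeted (node) immunization.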
    Modified method of detecting DDoS attacks based on entropy
    2010, 30(07):  1778-1781. 
    Compared with volume- or feature-based approaches, detecting Distributed Denial of Service (DDoS) attacks based on entropy has the following advantages: simple calculation, high sensitivity, a low false positive rate, and no need for additional network traffic or devices. To achieve higher precision and a lower false positive rate than traditional entropy-based approaches, a Modified Entropy-Based (MEB) scheme was proposed, which divided DDoS attacks into different threat levels and treated each level with a corresponding method. The efficiency of the scheme was validated by simulation in NS-2.
    Spider detection based on trap techniques
    2010, 30(07):  1782-1784. 
    A spider, also known as a Web crawler, is a program for capturing network resources and is widely used in the field of search engines. However, it also raises problems such as privacy leakage and copyright disputes, so it is necessary to take measures to detect and restrict Web crawlers' behavior. This paper first briefly reviewed the current achievements in Web crawler detection and the use of trap techniques for this purpose. Then a structural trap technique was proposed to construct Website models and the corresponding detection algorithms. Finally, the authors measured the sensitivity of the model and summarized its performance. The results indicate that the precision achieved with the structural trap technique is generally consistent with that of manual analysis.
    Improved Apriori algorithm to associate alerts for intrusion detection system
    2010, 30(07):  1785-1788. 
    Among the many association rule mining algorithms, the Apriori algorithm is the most classic, but it has three deficiencies: it scans the database many times, generates a large number of candidate itemsets, and mines frequent itemsets iteratively. This paper presented a method that finds the maximal frequent itemsets through a single intersection operation: the support count is obtained from the intersections without scanning the transaction database. The method was further optimized by indexing the attributes, so that less memory is needed and the candidate set list is easier to search. It generates useful association rules for alerts in an Intrusion Detection System (IDS). The experimental results show that the optimized algorithm can effectively improve the efficiency of mining association rules.
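The intersection idea can be sketched with TID-lists (a vertical data layout): each item is indexed by the set of transactions containing it, so the support of any itemset is the size of one set intersection rather than a database scan. A minimal sketch, not the paper's full algorithm:

```python
def tid_lists(transactions):
    """Build the vertical index: item -> set of transaction ids (TIDs)
    of the transactions that contain it. Built in a single pass."""
    index = {}
    for tid, items in enumerate(transactions):
        for item in items:
            index.setdefault(item, set()).add(tid)
    return index

def support(itemset, index, n_transactions):
    """Support of an itemset from one intersection of its items' TID
    sets -- no rescan of the transaction database is needed."""
    tids = set.intersection(*(index[item] for item in itemset))
    return len(tids) / n_transactions
```

After the one indexing pass, every candidate's support is a pure in-memory set operation, which is where the claimed efficiency gain comes from.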
    Reputation-driven grouping mechanism in P2P file-sharing system
    Jian Peng
    2010, 30(07):  1789-1793. 
    In P2P file-sharing systems, strategic frauds often occur, and existing trust models cannot completely eliminate transaction risk. By combining local and global trust mechanisms, this article proposed a new reputation calculation expression based on the records shared between peers, and further put forward a reputation-driven group organization management and peer-search algorithm. The simulation results show that this reputation-driven mechanism can find peers with high reputation values as trading partners, and can effectively reduce transaction risk when the system faces attacks from malicious colluding peers and from malicious peers with trading strategies.
    Effective algorithm to prevent click fraud
    2010, 30(07):  1794-1796. 
    Click fraud has become one of the major problems of the online advertising industry. Concerning the difficulties the industry faces, an anti-click-fraud strategy based on viewing time and click frequency, combined with verification codes, was proposed. The strategy can effectively block automated click fraud by clicker-like software, filter out viewers' unintentional invalid clicks, and significantly reduce the efficiency of manual click fraud, while real potential clients are not turned away.
    Flexible Agent delegation model based on RBAC
    2010, 30(07):  1797-1801. 
    Existing delegation models based on Role-Based Access Control (RBAC) lack flexibility, and permission abuse may occur when delegators are on business trips or on leave, even though the system administrator could perform the delegation authorization on their behalf. This paper presented an Agent-based flexible role delegation model based on RBAC. The delegation strategy was given, and the soundness and completeness of the model were discussed and proved by construction and reduction. Theoretical analysis and practical examples show that a third party (an Agent) can take charge of delegating permissions on behalf of the delegator and supervise the delegation authorization, that flexibility is effectively reflected in the delegation, and that the model follows the two security principles of "least privilege" and "separation of duty".
    Research on periods of chaotic sequences under computer iteration
    2010, 30(07):  1802-1804. 
    The periods of chaotic sequences iterated on a computer, and their distribution under floating-point formats, were studied in this paper. By constructing non-standard floating-point formats matched with the IEEE standard ones, the periods of six common chaotic systems under different floating-point precisions were measured. Using linear fitting, the distribution relationship between the period of a chaotic sequence and the numerical precision was obtained, which corrects the distribution relationship based on fixed-point formats and gives an experimental criterion for the study of chaotic anti-degradation. The paper also shows that, for chaotic sequences, conclusions based on fixed-point formats cannot simply be extended to floating-point formats.
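The degradation effect can be illustrated by iterating the logistic map with its state quantized to a fixed number of decimal digits and detecting the cycle it falls into. Decimal rounding is a toy stand-in for the paper's non-standard floating-point formats, but it shows the same phenomenon: finite precision forces a truly aperiodic orbit into a cycle.

```python
def logistic_period(x0=0.123, r=4.0, digits=5, max_iter=10 ** 6):
    """Iterate x -> r*x*(1-x) with the state rounded to `digits` decimal
    digits after every step, and return the length of the cycle the
    orbit eventually falls into (guaranteed by the finite state space)."""
    seen = {}                       # state -> first iteration it appeared
    x = round(x0, digits)
    for i in range(max_iter):
        if x in seen:
            return i - seen[x]      # period of the detected cycle
        seen[x] = i
        x = round(r * x * (1 - x), digits)
    return None                     # no cycle within the iteration budget
```

With `digits` decimal digits there are at most 10^digits + 1 reachable states, so by the pigeonhole principle a cycle must appear; measuring how the period grows with precision is the experiment the abstract describes.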
    An efficient CPK-based group key exchange protocol
    2010, 30(07):  1805-1808. 
    Combined Public Key (CPK) cryptography needs no certificates to guarantee the authenticity of public keys, and avoids the problem of the user's private key depending entirely on the Key Management Center (KMC). Based on CPK, a constant-round group key exchange protocol was presented that is provably secure under the intractability of the computational Diffie-Hellman problem and achieves perfect forward secrecy. The protocol has only two communication rounds and is more efficient than other protocols in both communication and computation. It supports group member join/leave operations efficiently and needs only a minimal amount of computation and communication to renew the group key, especially for multiple joins/leaves, while also assuring backward and forward secrecy. Moreover, the protocol achieves strong security: it keeps the session key secret from the adversary unless one party's ephemeral private key and static private key are both revealed. Finally, the protocol provides a method for designing efficient constant-round group key exchange protocols with strong security, and most secret sharing schemes can be adopted to construct it.
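For context, the classic two-round constant-round group key exchange is the Burmester-Desmedt protocol; the sketch below illustrates that two-round structure over a toy prime group. It is an illustration of the genre only, not the paper's CPK-based protocol, and the group parameters are far too small for real use:

```python
# Toy Burmester-Desmedt-style two-round group key agreement.
P, G = 2087, 5          # small prime and base -- toy values only

def bd_group_key(secrets):
    """Each party i holds an ephemeral secret r_i. Round 1 broadcasts
    z_i = G^r_i; round 2 broadcasts X_i = (z_{i+1}/z_{i-1})^r_i.
    Every party then derives the same key G^(r_0 r_1 + r_1 r_2 + ...)."""
    n = len(secrets)
    z = [pow(G, r, P) for r in secrets]                          # round 1
    X = [pow(z[(i + 1) % n] * pow(z[(i - 1) % n], -1, P) % P,    # round 2
             secrets[i], P) for i in range(n)]                   # (mod inverse: Py 3.8+)
    keys = []
    for i in range(n):
        k = pow(z[(i - 1) % n], n * secrets[i], P)
        for j in range(1, n):
            k = k * pow(X[(i + j - 1) % n], n - j, P) % P
        keys.append(k)
    return keys
```

The telescoping product of the X values is what lets every party reach the shared exponent in a constant number of rounds, independent of the group size; an authenticated scheme such as the paper's replaces raw ephemerals with CPK-derived keys.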
    Efficient certificateless signature scheme
    2010, 30(07):  1809-1811. 
    To solve the key escrow problem of ID-based cryptography and the public key authentication problem of traditional public key cryptosystems, a certificateless signature scheme was proposed by revising the private and public key generation algorithms of Barreto et al.'s efficient ID-based signature. The scheme is provably secure in the random oracle model and efficient: it needs only one pairing operation.
    Encryption algorithm of colour image based on coupling chaotic systems
    2010, 30(07):  1812-1814. 
    This article discussed a new color image encryption algorithm. The initial conditions of the chaotic systems were taken as the initial key pair to XOR-pretreat the images, and the chaotic sequences generated by the Logistic map and the Lorenz system were used to scramble and diffuse the images. The algorithm used the processed image data as the initial value of the Logistic map and coupled the Lorenz system with the Logistic map, greatly improving security. The algorithm has good scrambling and diffusion effects, high encryption and decryption speed, and good anti-noise ability.
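The diffusion stage alone, XORing pixel bytes with a Logistic-map keystream, can be sketched as follows; the Lorenz coupling and the scrambling stage are omitted, and the parameters are illustrative:

```python
def logistic_keystream(x0, n, r=3.9999):
    """Byte keystream from the logistic map x -> r*x*(1-x), which is
    chaotic for r near 4; x0 in (0,1) acts as the key."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_encrypt(data, x0=0.3141592):
    """XOR the pixel bytes with the chaotic keystream. Because XOR is
    its own inverse, decryption is the same call with the same key."""
    ks = logistic_keystream(x0, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Sensitivity to the initial condition is the point: a key differing in the last decimal place yields an unrelated keystream after a few iterations, which is what makes brute-forcing the continuous key space impractical.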
    Cryptanalysis of an image scrambling algorithm based on Logistic chaotic sequence
    2010, 30(07):  1815-1817. 
    To analyze the potential vulnerability of a recently proposed image scrambling algorithm based on a Logistic chaotic sequence and bit exchange, this paper gave the corresponding improvement ideas to eliminate the flaw in the algorithm's application. Through known-plaintext and chosen-plaintext attacks, the equivalent keys of the algorithm can easily be recovered. Both theoretical analysis and computer simulation indicate that the proposed attacks can completely break this algorithm.
    Application scheme of identity e-documents based on digital watermarking
    2010, 30(07):  1818-1820. 
    A generally applicable, hierarchy-based framework for watermarking applications was proposed, suitable not only for e-document identification but also for other kinds of digital watermarking applications. It is composed of three layers: a driver layer, an adapter layer and an application layer. Then, based on the layered watermarking techniques, a digital-watermarking-based identification scheme for e-documents was proposed. The identification formats of confidential e-documents and the way to use them were also discussed.
    Graphics and image processing
    Spread spectrum speech information hiding based on linear interference cancellation
    2010, 30(07):  1821-1824. 
    In speech information hiding, host interference restricts the capacity of the hidden information. To eliminate the speech host interference, this paper presented a spread spectrum embedding and decoding method based on extended linear interference cancellation, applicable to a colored-noise host model. Based on the speech residual embedding model, a pre-cancellation operation was applied to the watermark, and the new watermark was then embedded into the speech residual signal. Theoretical analysis and simulation verify that, in contrast to traditional spread spectrum speech watermarking, the speech host interference problem is solved effectively and the decoding performance is improved significantly. In addition, the proposed scheme is robust to conventional speech processing and attacks.
    Rate control algorithm based on steepest-descent for JPEG2000
    2010, 30(07):  1825-1827. 
    A fast rate control algorithm based on Steepest Descent (SD) was proposed for JPEG2000. The essence of the algorithm is an iterative selection process: in each iteration, the coding passes with the maximal ratio of distortion decrease to bit-rate increase are chosen and reserved as part of the final compressed codestream. Considering the high time complexity of sorting in each iteration, a max-heap structure was introduced to reduce it. By encoding the candidate passes during rate control, the algorithm not only eliminates the encoding redundancy in JPEG2000, but also decreases the time spent on rate allocation. The experimental results show that this method significantly reduces the time complexity of rate allocation and encoding and improves coding efficiency compared with JPEG2000.
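The greedy slope-based selection with a max-heap can be sketched as follows. Real JPEG2000 rate control must also respect the in-order constraint on passes within a code-block, which this simplified sketch ignores, and the pass data are illustrative:

```python
import heapq

def allocate_passes(passes, budget):
    """Greedy rate allocation: repeatedly take the coding pass with the
    largest distortion-decrease / rate-increase slope until the byte
    budget is spent. Python's heapq is a min-heap, so keys are negated
    to obtain max-heap behaviour. `passes` holds (id, dD, dR) triples."""
    heap = [(-dd / dr, dr, pid) for pid, dd, dr in passes]
    heapq.heapify(heap)                 # O(n), replaces per-step sorting
    chosen, used = [], 0
    while heap:
        slope, dr, pid = heapq.heappop(heap)
        if used + dr > budget:
            continue                    # doesn't fit; try cheaper passes
        chosen.append(pid)
        used += dr
    return chosen, used
```

Replacing a full re-sort per iteration with heap pops is exactly the complexity reduction the abstract attributes to the max-heap.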
    Research and implementation of parallel rendering system based on multi-core PC cluster
    2010, 30(07):  1828-1831. 
    To meet the demands of large-scale virtual reality applications for rendering speed and display resolution, a cost-effective distributed graphics rendering system was built on a cluster of multi-core personal computers. The system integrated intra-node parallelism on the multi-core PCs with inter-node parallelism across the cluster. Flexible split-screen display was achieved by scaling the view frustum and moving the center of projection, and the efficiency of the parallel rendering system was improved by multi-core parallel optimization of the rendering pipeline, loop iterations and function level within each cluster node. The experimental results show that integrating the multi-core platform into the parallel rendering system effectively improves application performance through a multi-threaded approach.
    Rendering method for large-area terrain based on texture array
    2010, 30(07):  1832-1834. 
    A real-time rendering method for large-area terrain data was presented, using techniques such as texture arrays, vertex texture fetching, tiled quad-tree grids and terrain data block partition. The whole terrain model was partitioned into equal-size blocks that were stored in CPU memory in a tile-pyramid model. The potentially visible portion of the terrain was cached at the highest necessary resolution in the form of texture arrays and rendered by the GPU. The CPU sent a tiled quad-tree flat grid, which was displaced by fetching the heights stored in the GPU cache, and the GPU cache was updated continuously as the viewpoint changed. The experimental results show that the method is very efficient and suitable for massive terrain rendering and interactive walkthrough applications.
    Dunhuang mural inpainting based on Markov random field sampling
    2010, 30(07):  1835-1837. 
    Asbtract ( )   PDF (668KB) ( )  
    Related Articles | Metrics
    Based on the analysis of the information features of damaged Dunhuang murals and of previous algorithms, an improved inpainting algorithm was proposed by combining an improved priority function with data fusion based on Dempster-Shafer evidence theory and a direct sampling model of Markov Random Field (MRF). The effectiveness of the proposed algorithm was verified by means of image completion and system simulation experiments.
    Method for generating slope line of open-pit mine based on KD-tree
    2010, 30(07):  1838-1840. 
    Asbtract ( )   PDF (443KB) ( )  
    Related Articles | Metrics
    In order to improve the precision and efficiency of plotting slope lines of open-pit mines, a method for the automatic generation of slope lines was presented according to the characteristics of open-pit mine data. A KD-tree was constructed from points generated uniformly along the curves of the open-pit mine. With the KD-tree, one point's neighbors can be found quickly without prior knowledge of the topological relations of the points. The slope line was drawn at any point according to graphic principles and methods. The method was implemented in VC++.NET and applied successfully in open-pit mines. The experiments show that the method improves the efficiency and accuracy of drawing slope lines of open-pit mines and achieves good results. Besides, the method can also be applied to contours.
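The core KD-tree idea above, finding a point's neighbors without stored topology, can be sketched as follows (a simplified 2-D version with hypothetical names, not the paper's VC++.NET implementation):

```python
import math

def build_kdtree(points, depth=0):
    """Recursively build a 2-D KD-tree; nodes split alternately on x and y."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, target, best=None):
    """Find the stored point closest to `target`; no explicit topology needed."""
    if node is None:
        return best
    if best is None or math.dist(node["point"], target) < math.dist(best, target):
        best = node["point"]
    diff = target[node["axis"]] - node["point"][node["axis"]]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, best)
    # Descend the far side only if the splitting plane is closer than the best so far.
    if abs(diff) < math.dist(best, target):
        best = nearest(far, target, best)
    return best
```

The pruning test keeps the expected query cost logarithmic, which is what makes the interactive slope-line drawing fast.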
    Inspection algorithm of arc-parts based on machine vision
    2010, 30(07):  1841-1843. 
    Asbtract ( )   PDF (552KB) ( )  
    Related Articles | Metrics
    In order to achieve fast and accurate inspection of the feature parameters of arc-shaped pieces, this paper presented a machine-vision method for inspecting those parameters. First, the image edge was extracted by the Canny operator and sub-pixel coordinates were computed by a cubic spline interpolation algorithm. Then, discrete curvature was computed by a method based on the tangent direction. The discrete curvature data were sorted; the mean and variance, as well as the arc length, surface and angle of the curved pieces, were calculated. The experimental results indicate that the proposed method is not only highly precise and fast but also stable, and can satisfy the inspection requirements for the feature parameters of arc-parts.
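A minimal sketch of tangent-direction discrete curvature on an extracted edge polyline (names are hypothetical; the paper's exact formulation may differ) could be:

```python
import math

def discrete_curvature(points):
    """Estimate curvature at each interior vertex of a polyline as the change
    in tangent direction divided by the local arc length."""
    ks = []
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        t1 = math.atan2(y1 - y0, x1 - x0)      # incoming tangent direction
        t2 = math.atan2(y2 - y1, x2 - x1)      # outgoing tangent direction
        dtheta = (t2 - t1 + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        ds = 0.5 * (math.hypot(x1 - x0, y1 - y0) + math.hypot(x2 - x1, y2 - y1))
        ks.append(dtheta / ds)
    return ks
```

On a finely sampled circle of radius r, the estimates converge to 1/r, which is the sanity check one would use before computing the mean, variance and arc length.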
    Reconstruction algorithm of ultrasound CT based on simultaneous iterative reconstruction technique
    Hao-Quan WANG
    2010, 30(07):  1844-1846. 
    Asbtract ( )   PDF (395KB) ( )  
    Related Articles | Metrics
    Based on research into the ultrasound CT imaging principle, the authors explored an array detection method. By improving the layout of the sensors, the amount of data available for imaging was increased. Based on the least-squares criterion, the formula of the Simultaneous Iterative Reconstruction Technique (SIRT) was deduced. The coefficient matrix was obtained through an algorithm that scans the four edges. The velocity matrix was achieved by correcting the error to approach the real data iteratively. The simulation results show that the method can reduce the error effectively, and the effect of image reconstruction is satisfactory.
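The iterative least-squares correction can be sketched in its simplest (Landweber-style) form, repeatedly back-projecting the residual; this is an assumed simplification, not the paper's exact SIRT scaling:

```python
def sirt(A, b, iterations=500, lam=0.1):
    """Simplified simultaneous iterative reconstruction: repeatedly correct
    the estimate with the back-projected residual.
    A: list of measurement rows, b: measured projections."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iterations):
        # residual r = b - A x
        r = [bi - sum(aij * xj for aij, xj in zip(row, x)) for row, bi in zip(A, b)]
        # back-project the residual: x += lam * A^T r
        for j in range(n):
            x[j] += lam * sum(A[i][j] * r[i] for i in range(len(A)))
    return x
```

For a consistent overdetermined system the iteration converges to the least-squares solution when `lam` is small enough relative to the largest eigenvalue of A^T A.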
    Application of improved RANSAC algorithm to image registration
    2010, 30(07):  1849-1851. 
    Asbtract ( )   PDF (661KB) ( )  
    Related Articles | Metrics
    In order to improve the speed of image registration, a fast method based on an improved RANSAC algorithm was proposed. First, the Harris corner detector was used to extract the feature points in the reference and target images. Next, the feature points were matched based on proximity and similarity of their intensity. Finally, the improved RANSAC algorithm was used to estimate the transformation matrix more quickly and accurately. To improve the speed of computation, a pre-detection method was used to discard early those temporary models that cannot become the final model. To delete outliers, random block selection was used to choose samples, which improves the precision of the algorithm. The experiments show that this algorithm greatly reduces the amount of computation and improves the speed of image registration with no notable change in precision.
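The RANSAC loop itself can be illustrated on the simplest model, a 2-D line, with an early discard of degenerate candidates standing in for the pre-detection idea (all names are hypothetical; the paper estimates a full transformation matrix, not a line):

```python
import random

def ransac_line(points, trials=200, threshold=0.5, seed=0):
    """Fit y = a*x + b by RANSAC: repeatedly sample two points, count inliers
    within `threshold`, and keep the model with the most inliers."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(trials):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # pre-detection: discard a degenerate candidate model early
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(1 for x, y in points if abs(a * x + b - y) < threshold)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best
```

With a fixed seed the run is deterministic; on mostly-clean data the consensus model recovers the true parameters despite gross outliers.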
    Fingerprint image enhancement by Hermite filter
    2010, 30(07):  1852-1854. 
    Asbtract ( )   PDF (598KB) ( )  
    Related Articles | Metrics
    Fingerprint enhancement aims to improve fingerprint image quality and thereby the performance of fingerprint recognition systems. This paper proposed an approach to fingerprint image enhancement based on the Hermite filter. The algorithm calculated the ridge orientation by the gradient-formula method, and calculated the ridge frequency by using the frequency spectrum characteristics of fingerprint image blocks. With ridge orientation and frequency as the main filter parameters, the algorithm used the Hermite filter, with its good band-pass property, together with a low-pass filter with changeable angular bandwidth to implement the filtering, enhancing the texture definition effectively and better avoiding blocking effects in singular point areas. The experimental results show that the proposed algorithm achieves good image enhancement.
    Bivariate shrinkage denoising method based on variable parameter bivariate model
    2010, 30(07):  1855-1858. 
    Asbtract ( )   PDF (608KB) ( )  
    Related Articles | Metrics
    In order to improve the denoising results of the bivariate shrinkage method, a new variable-parameter bivariate model was proposed for the joint coefficient-parent distribution of wavelet coefficients, because this joint distribution differs for coefficients in different scales and subbands. Based on the new model, a subband-adaptive denoising method was proposed using Bayesian maximum a posteriori estimation theory. In the experiments, the dual-tree complex wavelet transform, which has shift invariance and directional selectivity, was used for both the new method and the bivariate shrinkage method. The results show that the Peak Signal-to-Noise Ratio (PSNR) values of the new method are improved.
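For reference, the classical bivariate shrinkage rule that the paper improves upon (the Sendur-Selesnick MAP estimator; the subband-adaptive parameters of the new model are not reproduced here) shrinks a coefficient jointly with its parent:

```python
import math

def bivariate_shrink(y1, yp, sigma_n, sigma):
    """Classical bivariate shrinkage: shrink wavelet coefficient y1 using its
    parent yp, the noise level sigma_n and the signal deviation sigma
    estimated per subband."""
    mag = math.hypot(y1, yp)                       # joint coefficient magnitude
    factor = max(0.0, mag - math.sqrt(3) * sigma_n ** 2 / sigma)
    return y1 * factor / mag if mag > 0 else 0.0   # soft-threshold the pair
```

Coefficients whose joint magnitude falls below the threshold are set to zero; larger ones are shrunk proportionally.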
    Uncorrelated optimal discriminant plane in unsupervised pattern
    2010, 30(07):  1859-1862. 
    Asbtract ( )   PDF (571KB) ( )  
    Related Articles | Metrics
    The uncorrelated optimal discriminant plane is an important feature extraction method and has been widely used in the pattern recognition field. However, it is based on the Fisher criterion function and the conjugated orthogonal constraint of the total-class scatter matrix, and it needs class information to calculate the Fisher optimal discriminant vector. Thus, it can only be used in the supervised pattern. A new method was presented to extend the uncorrelated optimal discriminant plane to the unsupervised pattern. The basic idea was to introduce the fuzzy concept into Fisher linear discriminant analysis and use the defined fuzzy Fisher criterion function as the optimization objective to compute an optimal discriminant vector and fuzzy scatter matrices in the unsupervised pattern. Under the conjugated orthogonal constraint of the fuzzy total-class scatter matrix, the second discriminant vector that maximizes the fuzzy Fisher criterion can be obtained. Thus, a new feature extraction method based on the unsupervised uncorrelated optimal discriminant plane was proposed. The experimental results on UCI datasets and the CMU-PIE face database demonstrate that, although this method cannot surpass the traditional uncorrelated optimal discriminant plane, it can extract uncorrelated features for classification and is superior to common unsupervised feature extraction methods such as principal component analysis and independent component analysis when the between-class difference is large.
    Face recognition using complex wavelet and independent component analysis
    2010, 30(07):  1863-1866. 
    Asbtract ( )   PDF (639KB) ( )  
    Related Articles | Metrics
    A novel face recognition method was proposed by adopting the Dual-Tree Complex Wavelet Transform (DT-CWT) and Independent Component Analysis (ICA). The DT-CWT was applied to face images to extract the feature vectors. The dimension of the salient feature vectors was reduced by Principal Component Analysis (PCA). ICA further reduced the feature redundancies and derived independent feature vectors for the correlation-based classifier. The DT-CWT has selectivity in scale and orientation, and preserves more information in the frequency domain; features extracted by DT-CWT and ICA achieve excellent classification performance. Extensive experimental results on the ORL and AR databases demonstrate the validity of the proposed method.
    Lung CT image retrieval based on intelligent selection of multi-dimensional characteristics
    2010, 30(07):  1867-1869. 
    Asbtract ( )   PDF (484KB) ( )  
    Related Articles | Metrics
    Since single features and manually weighted multi-dimensional features can no longer satisfy the needs of Content-Based Image Retrieval (CBIR), a new algorithm of multi-dimensional feature vector weighting based on the clustering of training samples was presented. The algorithm manually builds the training sample set, extracts multi-dimensional features such as the color, texture and shape of each element, and then uses a genetic algorithm to find the optimal set of weighting coefficients for the feature vectors. Finally, it calculates the proper value of the training sample using this set and retrieves the sample. The experiments prove that this algorithm improves the classification accuracy compared with other algorithms and achieves high accuracy in differentiating two clusters with high similarity.
    Identification of vegetable leaf-eating pests based on image analysis
    2010, 30(07):  1870-1872. 
    Asbtract ( )   PDF (470KB) ( )  
    Related Articles | Metrics
    To achieve the computer recognition of vegetable leaf-eating pests and the scientific evaluation of pest levels, this paper proposed a reverse identification method that determines the pests from the eaten leaves. After image preprocessing of the leaves eaten by pests, seven shape feature values of the eaten areas could be extracted automatically, including roundness degree, complexity degree and sphericity degree, and then a BP neural network model for recognition could be built. The results show that the method is good at recognizing pest species from the geometric features of the eaten leaves, and gives a scientific evaluation of the leaves' damage degree.
    Artificial intelligence
    Extended rough set model based on improved complete tolerance relation
    2010, 30(07):  1873-1877. 
    Asbtract ( )   PDF (852KB) ( )  
    Related Articles | Metrics
    The classical rough set theory is not suitable for analyzing incomplete information systems, a problem that has been solved to some extent by the tolerance relation, similarity relation, limited tolerance relation and complete tolerance relation. The limitations of these models were analyzed, and on the basis of the complete tolerance relation, an extended rough set model based on an improved complete tolerance relation was developed. The extended model not only retains the merits of the existing models, but also overcomes their limitations to some extent. The results of case studies show the effectiveness of the presented model.
    Parameter optimization for B-spline curve fitting based on adaptive genetic algorithm
    2010, 30(07):  1878-1882. 
    Asbtract ( )   PDF (748KB) ( )  
    Related Articles | Metrics
    The genetic algorithm is usually selected as an optimization tool for the least-squares fitting of ordered plane data by B-spline curves. However, the result easily falls into a local optimum with a random initial choice, and more control points are required to assure higher accuracy. An adaptive genetic algorithm was proposed to overcome this shortcoming in the parameter optimization for B-spline curves. The average fitness of the initial populations was improved significantly by the average data parameter value method, which builds the relationship between the data parameters and the knots. In the algorithm, the evolution of the populations was accelerated by optimizing the genetic strategy. The experimental results show that the algorithm achieves better precision with fewer control points and fewer iterations.
    Modified immune particle swarm optimization algorithm and its application
    2010, 30(07):  1883-1884. 
    Asbtract ( )   PDF (483KB) ( )  
    Related Articles | Metrics
    In this paper, a modified Particle Swarm Optimization (PSO) algorithm with immunity was proposed. The crossover operation and high-frequency mutation were introduced to keep the population's diversity and avoid the premature convergence of PSO. The algorithm's global searching ability was improved through Cauchy mutation, and its local searching ability was improved by Gaussian mutation. In addition, the vaccine operation was introduced to solve the potential degradation caused by the probabilistically distributed crossover and mutation. The simulation results show that the evolution speed and convergence precision of the proposed algorithm are improved.
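The Cauchy/Gaussian mutation idea can be sketched on a toy objective (minimizing the sphere function); this is a hedged illustration with hypothetical names and constants, omitting the paper's immune crossover and vaccine operations:

```python
import math, random

def pso_sphere(dim=2, swarm=20, iters=300, seed=1):
    """Minimal PSO for f(x)=sum(x_i^2), with a Gaussian mutation on candidates
    (local search) and a Cauchy mutation on the global best (global search)."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vs = [[0.0] * dim for _ in range(swarm)]
    pbest = [list(x) for x in xs]
    gbest = min(pbest, key=f)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * rng.random() * (pbest[i][d] - xs[i][d])
                            + 1.5 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            # Gaussian mutation sharpens local search; keep it only if it improves.
            trial = [v + rng.gauss(0, 0.1) for v in xs[i]]
            if f(trial) < f(xs[i]):
                xs[i] = trial
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = list(xs[i])
        # Heavy-tailed Cauchy mutation of the global best widens the global search.
        cauchy = [g + 0.5 * math.tan(math.pi * (rng.random() - 0.5)) for g in gbest]
        gbest = min(pbest + [cauchy], key=f)
    return gbest, f(gbest)
```

The heavy tails of the Cauchy distribution occasionally produce long jumps that help escape local optima, while the narrow Gaussian refines the current basin.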
    Multi-objective PSO based on infeasibility degree and principle of endocrine
    2010, 30(07):  1885-1888. 
    Asbtract ( )   PDF (602KB) ( )  
    Related Articles | Metrics
    For multi-objective optimization problems with constraints, a novel multi-objective Particle Swarm Optimization (PSO) algorithm was proposed. In the method, the constraints and the selection of the elite swarm are handled by the infeasibility degree and the dominance principle. According to the control and supervision principle between Stimulating Hormone (SH) and Releasing Hormone (RH) in the endocrine system, and considering the supervision and control of individuals in the non-dominated set over the nearest class of the swarm, the global optimal position of a class is used to generate new positions for the particles belonging to it. To validate the effectiveness of the given method, three benchmark multi-objective problems were simulated with the given method, NSGA-II and MOPSO-CD; the results indicate that the given method can find feasible Pareto solutions with a large probability.
    Study on significance of attribute set
    2010, 30(07):  1889-1891. 
    Asbtract ( )   PDF (522KB) ( )  
    Related Articles | Metrics
    In the decision table, the significance of attribute sets is usually ignored. In this paper, the significance of both single attributes and attribute sets was analyzed. It is concluded that an attribute set consisting of non-significant single attributes is not necessarily non-significant, while an attribute set including significant single attributes must be significant. As a result, the significance of an attribute set is more reliable than that of a single attribute.
    Belief revision in intelligent Agent action reasoning
    2010, 30(07):  1892-1895. 
    Asbtract ( )   PDF (607KB) ( )  
    Related Articles | Metrics
    Reinforcement revision in belief revision is deficient in maintaining low-rank beliefs that are consistent with the old beliefs, and fluent calculus integrated with belief revision cannot reason about actions well since it cannot represent formulas. Dependence Belief Revision (DBR) and a Strategy-Axiom-Reasoning (SAR) model were given to resolve these problems. DBR, based on the Ind postulates, maintains the conditional beliefs together with low-rank non-contradictory beliefs; SAR, which represents the belief set with a formula set, inherits and improves the axioms of fluent calculus and integrates well with belief revision. Finally, DBR was applied to the SAR model and its feasibility was verified by experiment.
    New linear genetic programming approach
    2010, 30(07):  1896-1898. 
    Asbtract ( )   PDF (409KB) ( )  
    Related Articles | Metrics
    A new genetic programming method named Symbol Genetic Programming (SGP), based on a new encoding method, was proposed. The new encoding method absorbs the merits of many other linear genetic programming methods and codes with a simple, unrestrained string. Owing to this characteristic, multiple expressions can be contained in one individual without increasing the computational cost. The method is proved to be effective and stable through complexity analysis and experiments.
    Readable machine proofs for mass point geometry
    2010, 30(07):  1899-1901. 
    Asbtract ( )   PDF (608KB) ( )  
    Related Articles | Metrics
    Based on the principles of mass point geometry, the paper developed a new machine proving method, the mass point method, and established an affine geometry machine-proof algorithm, implemented as a Maple program, that can deal with Hilbert intersection point statements. The program can produce readable proofs automatically. The results on hundreds of non-trivial propositions show that the method is efficient, and the readability of most proofs is also satisfactory.
    Parallel test combining timed Petri net with GA-PSO algorithm
    2010, 30(07):  1902-1905. 
    Asbtract ( )   PDF (566KB) ( )  
    Related Articles | Metrics
    Parallel test task scheduling in automatic test systems is an unsolved problem. Based on the theory of Petri nets, a timed Petri net model for the parallel test was established, and the GA-PSO algorithm was introduced into the procedure of exploring the transition sequences of the timed Petri net, so that the optimal scheduling can be found in a very short time. The simulation results show that the given algorithm converges rapidly with a high probability, and consequently an optimal transition sequence is found.
    Pattern recognition and Software
    Bidding-based optimizing of task allocation in multi-Agent system
    2010, 30(07):  1906-1908. 
    Asbtract ( )   PDF (411KB) ( )  
    Related Articles | Metrics

    To avoid the problem of determining the task allocation schema only by bidding results, which yields only partial optimization in multi-Agent system assisted production task allocation, a global optimization method was put forward. An objective function of production task allocation was established based on the bidding results. An annealing evolution algorithm was designed to realize the synthesized bid evaluation. Instances indicate that this method is feasible and convenient for achieving the global optimization of production task allocation.

    Post-earthquake route optimization based on repair time and its GIS implementation
    2010, 30(07):  1909-1912. 
    Asbtract ( )   PDF (620KB) ( )  
    Related Articles | Metrics
    To overcome the limitation and complexity of optimizing algorithms for post-earthquake route selection, an improved route selection method was proposed based on earthquake disaster prediction for urban road systems, according to the repair time of the road system. With the support of Geographic Information System (GIS) and database technologies, a route optimization system was developed using the road system of ChongQing University of Posts and Telecommunications (CQUPT). With the system, emergency decision-makers can get the dynamic state of the road system after an earthquake, and optimized routes can be supplied for rescuers to reach the scene and to transfer the wounded and resources.
    Optimization of coordinated procurement strategy in steel group
    2010, 30(07):  1913-1915. 
    Asbtract ( )   PDF (623KB) ( )  
    Related Articles | Metrics
    This paper focused on the optimization of the inner logistics system in the steel sector. Based on a detailed analysis of the coordinated procurement strategy, the related logistics costs of the sub-tactics were considered, and the optimal cost was then computed. To maximize the demand conformity degree and minimize the procurement logistics cost, a corresponding purchasing strategy model was established, which was transformed into a single-objective model by a balance index. Each sub-company's suppliers were selected according to the evaluation results on independent purchasing. A simulated annealing algorithm was designed to solve the problem, using a neighbourhood method in which solutions are constructed step by step. The computation and analysis results on a typical instance show that the coordinated procurement strategy can save purchasing cost inside a steel company and increase profit to a great degree.
    Efficient mixed clustering algorithm and its application in anomaly detection
    2010, 30(07):  1916-1918. 
    Asbtract ( )   PDF (494KB) ( )  
    Related Articles | Metrics
    High efficiency is the key if a clustering algorithm is applied to anomaly detection. In order to improve anomaly detection, this paper proposed a new clustering algorithm that processes network data partially in real time by improving and integrating the DBSCAN and K-means algorithms. The experiments prove that the new algorithm can improve the detection rate, reduce the false positive rate and enhance the real-time responsiveness of the system.
    Application of decision tree in apparel marketing based on appearance of consumers
    2010, 30(07):  1919-1921. 
    Asbtract ( )   PDF (628KB) ( )  
    Related Articles | Metrics
    Apparel sellers often work from the appearance characteristics of consumers to improve their selling efficiency. This paper discussed the technology of quick marketing based on the apparel impression of consumers' characteristics from the view of data mining, to help sellers find and master the marketing rules of appearance impression. First, this paper introduced decision tree algorithm theory. Second, it discussed the evaluation index system of consumers' appearance impression; according to this system, sellers collected data on customers' appearance and behavior in an apparel shop. Third, a calculation example was demonstrated for this kind of classifier model of apparel marketing based on consumers' appearance. Finally, marketing rules were mined out by a data mining tool, the decision tree in Clementine. The study results show that this application is feasible.
    Database technology
    Non-check mining algorithm of maximum frequent patterns in association rules based on FP-tree
    2010, 30(07):  1922-1925. 
    Asbtract ( )   PDF (546KB) ( )  
    Related Articles | Metrics
    The algorithms based on FP-trees for mining maximal frequent patterns have high performance but many drawbacks; for example, they must recursively generate conditional FP-trees and have to perform superset checking. In order to overcome these drawbacks of the existing algorithms, a Non-Check Mining algorithm of Maximum Frequent Patterns (NCMFP) was put forward after an analysis of the DMFIA-1 algorithm. In the algorithm, by modifying the structure of the FP-tree, neither recursive construction of conditional frequent pattern trees nor superset checking is needed. The algorithm reduces the amount of mining through early prediction before mining, and applying a method to get the public intersection sets yields a complete result. The experiments show that the efficiency of NCMFP is two to five times that of similar algorithms when the support is relatively small.
    Fuzzy clustering algorithm with modified kernel functions
    2010, 30(07):  1926-1929. 
    Asbtract ( )   PDF (588KB) ( )  
    Related Articles | Metrics
    Using a kernelized metric of compactness and separation, this paper proposed a new clustering validity index named KKW and obtained the optimal cluster number. Besides, the KKW index was used in the modified kernel fuzzy clustering (MKFCM) algorithm. As mapped by the modified Mercer kernel functions, the data set shows new features never shown before. The MKFCM algorithm was applied to the Wine and Glass data sets: for every clustered class, MKFCM has an overall accuracy higher than 90%, and on the incomplete Wisconsin Breast Cancer data set the difference is 4.72%. The modified kernel clustering algorithm converges faster than the classical algorithm and clusters more accurately. The simulation results show the feasibility and effectiveness of the modified kernel clustering algorithm.
    Clustering algorithm based on complex attributes similarity and its applications
    2010, 30(07):  1930-1932. 
    Asbtract ( )   PDF (479KB) ( )  
    Related Articles | Metrics
    In order to segment telecom customers effectively, a new clustering algorithm for complex attributes was proposed based on the idea of feature similarity measurement. In the algorithm, the similarities of objects are measured by a distribution similarity function of the complex attributes; then a graph model is constructed based on the similarity, and finally the graph is divided into clusters. Compared with the traditional clustering algorithms based on dimension selection and dimension reduction, the proposed algorithm can process high-dimensional data and complex attributes effectively. Meanwhile, it does not need to revisit the original data when modifying parameters. Real telecom customer data were used for simulation, and the experimental results show that the algorithm can solve the customer segmentation problem effectively.
    K-means text clustering algorithm based on density and nearest neighbor
    2010, 30(07):  1933-1935. 
    Asbtract ( )   PDF (472KB) ( )  
    Related Articles | Metrics
    The initial cluster centers have a great influence on the clustering results of the traditional K-means algorithm and can drive the clustering into a local optimum. In view of this problem, an algorithm that generates the initial cluster centers by introducing the density and nearest-neighbor ideas was proposed, and the selected centers were used in the K-means algorithm, yielding a better text clustering algorithm called DN-K-means. The experimental results confirm that the algorithm produces clustering results of high and steady quality.
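A hedged sketch of density/nearest-neighbor seeding (hypothetical names and a simple radius-count density, not necessarily the paper's exact definitions): rank points by local density, then greedily keep dense points that are far from the centers already chosen.

```python
import math

def initial_centers(points, k, radius):
    """Choose k initial centers: rank points by local density (neighbors
    within `radius`), then greedily take dense points far from the centers
    already chosen - the density/nearest-neighbor idea."""
    density = [sum(1 for q in points if math.dist(p, q) <= radius) for p in points]
    order = sorted(range(len(points)), key=lambda i: -density[i])
    centers = [points[order[0]]]                  # densest point seeds first
    for i in order[1:]:
        if len(centers) == k:
            break
        if min(math.dist(points[i], c) for c in centers) > radius:
            centers.append(points[i])
    return centers
```

Seeding this way places one center per dense region, which is what stabilizes the subsequent K-means iterations.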
    Spectral clustering based on global K-means
    2010, 30(07):  1936-1937. 
    Asbtract ( )   PDF (444KB) ( )  
    Related Articles | Metrics
    Spectral clustering is an effective and widely used clustering method. Because spectral clustering is sensitive to initialization, the global K-means clustering algorithm was introduced to overcome this disadvantage, yielding a spectral clustering algorithm based on global K-means. Compared with the traditional spectral algorithm, the experiments show that the proposed algorithm is not only effective and feasible but also obtains stable clustering results and suitably improves the clustering precision.
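The global K-means component (Likas et al.) that removes the initialization sensitivity can be sketched as follows; this is the generic incremental algorithm on raw points, with hypothetical names, not the spectral-embedding pipeline of the paper:

```python
import math

def kmeans(points, centers, iters=50):
    """Standard Lloyd iterations from the given initial centers."""
    centers = [list(c) for c in centers]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: math.dist(p, centers[j]))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = [sum(c) / len(cl) for c in zip(*cl)]
    sse = sum(min(math.dist(p, c) ** 2 for c in centers) for p in points)
    return centers, sse

def global_kmeans(points, k):
    """Global k-means: grow from 1 to k clusters, trying every data point as
    the seed of the new center and keeping the best restart each time."""
    centers = [[sum(c) / len(points) for c in zip(*points)]]  # 1-cluster optimum
    for _ in range(2, k + 1):
        best = None
        for p in points:
            cand, sse = kmeans(points, centers + [list(p)])
            if best is None or sse < best[1]:
                best = (cand, sse)
        centers = best[0]
    return centers
```

The deterministic incremental search avoids random restarts entirely, which is what makes the spectral results stable.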
    Intertextuality measurement method of text translation index
    2010, 30(07):  1938-1940. 
    Asbtract ( )   PDF (441KB) ( )  
    Related Articles | Metrics
    Computer algorithms can help produce scientific quantitative data that provide more reliable and precise manifest intertextual clues, which plays a significant role in clarifying the multi-layer relationships among relevant texts and facilitating the understanding and translating process. Using tea classics as case texts, this paper presented some algorithms to measure intertextuality, namely the Dice coefficient, matching coefficient, full confidence and cosine, providing an index approach for text-oriented translation. The experimental results show that the cosine measure produces the best results, which offers valuable help to the accuracy and consistency of both source text comprehension and target version translation.
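On binary (set) representations of two texts' tokens, the named measures reduce to simple overlap formulas; a sketch with hypothetical names, where the "full confidence" definition (overlap over the larger set, i.e. the all-confidence form) is an assumption:

```python
import math

def similarity(a, b):
    """Set-based intertextuality measures between two token sets a and b."""
    inter = len(a & b)
    return {
        "matching": inter,                              # raw overlap count
        "dice": 2 * inter / (len(a) + len(b)),          # Dice coefficient
        "cosine": inter / math.sqrt(len(a) * len(b)),   # cosine on binary vectors
        "confidence": inter / max(len(a), len(b)),      # assumed all-confidence form
    }
```

Unlike the raw matching count, Dice and cosine normalize by text length, so a short quotation inside a long chapter still registers as a strong intertextual link.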
    Chinese word segmentation for GIS based on priority special name
    2010, 30(07):  1941-1943. 
    Asbtract ( )   PDF (457KB) ( )  
    Related Articles | Metrics
    A Chinese word segmentation algorithm for Geographic Information System (GIS) based on priority of special names was designed: a dictionary mechanism combining a synonym dictionary, a general dictionary and a special dictionary is used; sentences are first cut by special names, and the disambiguated segmentation result is finally obtained with the Trigram model. The experimental results show that the segmentation algorithm has good speed and accuracy in the segmentation of professional literature.
    Time series similarity matching based on event
    2010, 30(07):  1944-1946. 
    Asbtract ( )   PDF (421KB) ( )  
    Related Articles | Metrics
    In order to perform time-series similarity matching according to users' needs and improve the accuracy, Similarity Matching Based on Event (SMBE) was proposed. First, the users' needs are defined as events and the original time series is translated into an event sequence. Then, SMBE is constructed: it defines the similarity between the elements of two different sequences, builds the corresponding similarity matrix and searches for the optimal path value as the similarity measurement. Finally, a clustering method based on SMBE is proposed. The experimental results show that clustering based on SMBE can reach an accuracy of 90% with reasonable parameters.
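The "similarity matrix plus optimal path" step can be sketched as a dynamic program over a monotone alignment (a DTW-style maximization; names and the exact recurrence are illustrative assumptions, not SMBE's published definition):

```python
def sequence_similarity(s, t, elem_sim):
    """Build the element-wise similarity matrix of two event sequences and
    return the best accumulated similarity along a monotone alignment path."""
    n, m = len(s), len(t)
    acc = [[0.0] * (m + 1) for _ in range(n + 1)]   # accumulated similarity
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i][j] = elem_sim(s[i - 1], t[j - 1]) + max(
                acc[i - 1][j - 1], acc[i - 1][j], acc[i][j - 1])
    return acc[n][m]
```

Any element-level similarity function can be plugged in, so the measure adapts to whatever events the user has defined.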
    NNlists-based k-path nearest neighbor query in road networks
    2010, 30(07):  1947-1949. 
    Asbtract ( )   PDF (438KB) ( )  
    Related Articles | Metrics
    To satisfy the real-time requirement of k-Path Nearest Neighbor (kPNN) queries, the BNNL algorithm based on precomputed NN-lists was proposed, following the pre-computation idea. A bi-directional Dijkstra search scheme is used to acquire the current shortest path to the destination, and the nodes on the shortest path are obtained; finally, these nodes' m nearest neighbors are optimized by a priority queue to get the correct kPNN. BNNL is more efficient in kPNN query speed when the data objects are densely distributed or k is large.
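The shortest-path building block can be sketched with a single-direction Dijkstra over an adjacency map (a simplification of the bi-directional scheme; names are hypothetical):

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path by Dijkstra with a priority queue; `graph` maps a node
    to a list of (neighbor, edge_weight) pairs."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:          # walk the predecessor chain back to the source
        path.append(node)
        node = prev[node]
    path.append(src)
    return dist[dst], path[::-1]
```

The nodes returned on this path are exactly the ones whose precomputed NN-lists BNNL would then merge through a priority queue.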
    Improved CMIP data prefetching strategy in mobile environments
    2010, 30(07):  1950-1952. 
    Asbtract ( )   PDF (435KB) ( )  
    Related Articles | Metrics
    Data cache prefetching is a key technology in mobile environments. The Cache-Miss-Initiated Prefetch (CMIP) strategy can improve system performance by mining association rules from the client's access history to obtain the prefetch data. However, since it does not take the data update rate and data size into account, cache invalidation often takes place. In this paper, the algorithm was extended to judge and sort by the data update rate and the size of the selected data before making the choice of prefetching data. With this improvement, cache invalidation declines, and the data access time and power consumption decrease.
    Chinese geo-coding based on classification database of geographical names
    2010, 30(07):  1953-1955. 
    Asbtract ( )   PDF (688KB) ( )  
    Related Articles | Metrics
    Geo-coding is widely used in urban spatial location and analysis, but there is as yet no complete solution for Chinese geo-coding because there is no uniform specification or fixed model for Chinese geographical names. To solve this problem, the authors built Chinese geo-coding on a classification database of geographical names, and detailed the key technologies to realize the program: the data model of the geographical names database, address splitting, and address matching. Finally, the authors verified the program with actual data. The experimental results show that the program can solve most of the problems of address matching.
    Parallel OLAP query optimization method based on semantic decomposition
    2010, 30(07):  1956-1958. 
    Asbtract ( )   PDF (484KB) ( )  
    Related Articles | Metrics
    This article first gave formal definitions of the aggregation relation and the semantic decomposition relation between Online Analytical Processing (OLAP) queries. Then the supplementary set between a query and a query set was defined. Using these relations, the common query in an OLAP query set can be found, and the OLAP queries can be optimized in parallel from many aspects. Test results show that the overall efficiency of the system is improved by adopting the parallel optimization method.
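    One common formalization of an aggregation relation between OLAP queries — a child query being derivable from a parent whose group-by dimensions subsume it — can be sketched as below. The dict-based query representation and the "union of dimensions" construction of a common parent are illustrative assumptions, not the paper's exact definitions.

```python
def derivable(parent, child):
    """Aggregation relation: child is answerable from parent when its
    group-by dimensions are a subset of the parent's and its aggregate
    functions re-aggregate cleanly (SUM/COUNT/MIN/MAX; AVG would need
    both SUM and COUNT kept in the parent)."""
    dims_ok = set(child["dims"]) <= set(parent["dims"])
    aggs_ok = all(a in {"SUM", "COUNT", "MIN", "MAX"}
                  for a in child["aggs"])
    return dims_ok and aggs_ok

def common_query(queries):
    """Candidate common query for a set: union of all dimensions and
    of all aggregate functions, so every member is derivable from it."""
    dims, aggs = set(), set()
    for q in queries:
        dims |= set(q["dims"])
        aggs |= set(q["aggs"])
    return {"dims": sorted(dims), "aggs": sorted(aggs)}
```

    Evaluating the common query once and deriving each member from it is what allows the query set to be processed in parallel.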
    Typical applications
    Study of data race for process based on BPEL
    2010, 30(07):  1959-1961. 
    Abstract ( )   PDF (555KB) ( )  
    Related Articles | Metrics
    Composing existing Web services to satisfy a new value-added user requirement is called services composition. It provides technical support for business process integration within an enterprise or across multiple enterprises. For the Business Process Execution Language (BPEL), a process-driven services composition description language, correctness is a key challenge, as it is for other description languages in the domain of services composition. Therefore, analysis and checking of control flow and data flow is an essential step before execution. Data race is a well-known issue for BPEL. The goal of this paper was the automatic detection of potential data races in a process. Firstly, a formal description of data race was given by analyzing the event types and concurrency specified in BPEL. Furthermore, based on the characteristics of the XML node tree, the concurrency of two activities, and the messages correlated with activities, a detection algorithm was designed to find data races in a process. Finally, an order processing example was demonstrated to illustrate the validity of the proposed solution.
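    The core check of such a detector — flagging pairs of possibly-concurrent activities where one writes a variable the other reads or writes — can be sketched as follows. The activity representation and the `concurrent` predicate (which in BPEL would examine `<flow>` siblings and `<link>` orderings in the XML tree) are assumptions for illustration.

```python
from itertools import combinations

def detect_races(activities, concurrent):
    """activities: name -> {'read': set of vars, 'write': set of vars}.
    concurrent(a, b): True when the two activities may run in parallel.
    A data race is concurrent access to a variable with >= 1 write."""
    races = []
    for a, b in combinations(activities, 2):
        if not concurrent(a, b):
            continue
        ra, wa = activities[a]['read'], activities[a]['write']
        rb, wb = activities[b]['read'], activities[b]['write']
        shared = (wa & (rb | wb)) | (wb & ra)   # write/any conflicts
        for var in sorted(shared):
            races.append((a, b, var))
    return races
```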
    Research on video similarity network
    2010, 30(07):  1962-1966. 
    Abstract ( )   PDF (746KB) ( )  
    Related Articles | Metrics
    To address the near scale-free phenomenon, described by complex network theory, that arises when videos accumulate in a virtual information system, a brand-new network of video semantic similarity was designed. The description model of video semantics, the construction rules of the network, and the computation method and retrieval algorithm based on the network were given in detail. Experiments on this video similarity network show that it can effectively solve the problems caused by the usage of videos.
    Bispectrum analysis on EEG for driving fatigue
    2010, 30(07):  1967-1969. 
    Abstract ( )   PDF (585KB) ( )  
    Related Articles | Metrics
    The bispectrum was used to analyze the electroencephalogram (EEG) of drivers during driving, since bispectrum analysis suits EEG signals, which possess non-Gaussian and nonlinear properties. The two-hour signals were divided into six sections at a certain time interval, and each section was analyzed with the AutoRegressive (AR) parametric model method to estimate the bispectrum. The results show that the bispectrum of EEG differs markedly at different times, so the bispectrum can be used for the detection of driving fatigue.
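    The parametric (AR-model) bispectrum estimate mentioned above follows a standard formula: for an AR process with transfer function H(f), the bispectrum is B(f1, f2) = μ3·H(f1)·H(f2)·H*(f1 + f2), where μ3 is the third moment of the driving noise. A minimal sketch of this standard computation (not the paper's specific estimator or its coefficient-fitting step):

```python
import cmath

def ar_transfer(a, f):
    """H(f) = 1 / (1 - sum_k a_k e^{-j 2*pi*f*k}) for AR coefficients
    a = [a_1, ..., a_p] (normalized frequency f in cycles/sample)."""
    s = sum(ak * cmath.exp(-2j * cmath.pi * f * (k + 1))
            for k, ak in enumerate(a))
    return 1.0 / (1.0 - s)

def ar_bispectrum(a, f1, f2, mu3=1.0):
    """Parametric bispectrum of an AR process driven by non-Gaussian
    noise with third moment mu3."""
    return (mu3 * ar_transfer(a, f1) * ar_transfer(a, f2)
            * ar_transfer(a, f1 + f2).conjugate())
```

    In practice the AR coefficients would first be fitted to each EEG section (e.g. by solving the third-order recursion equations), and B would be evaluated over a grid of (f1, f2).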
    English word associative memory through pronunciation based vocabulary network
    2010, 30(07):  1970-1973. 
    Abstract ( )   PDF (616KB) ( )  
    Related Articles | Metrics
    Pronunciation associative memory is a highly effective memorizing strategy. A pronunciation-based vocabulary network was introduced to guide learners in using pronunciation associative memory, to help them become familiar with pronunciation rules, to strengthen their memory of words, and to assist learners in building bidirectional recognition between a letter combination and its pronunciation options. The network was constructed according to frequent letter combinations and pronunciation differences between words, so it contains information about both pronunciation similarity and word structure. Based on the network, a learning system can realize the pronunciation associative memory function and provide relevant statistical data. The associative memory function of a word learning system can be further improved by utilizing the pronunciation-based vocabulary network.
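    The construction rule — linking words that share a frequent letter combination — could be sketched as below. This is only a toy illustration of the idea; the actual network also encodes pronunciation differences, which are omitted here.

```python
from collections import defaultdict

def build_vocab_network(words, combos):
    """Link every pair of words sharing one of the given frequent
    letter combinations; the shared combination labels the edge."""
    buckets = defaultdict(list)
    for w in words:
        for c in combos:
            if c in w:
                buckets[c].append(w)
    edges = set()
    for c, ws in buckets.items():
        for i in range(len(ws)):
            for j in range(i + 1, len(ws)):
                edges.add((ws[i], ws[j], c))
    return edges
```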
    Dynamic evolution mechanism oriented to service-object
    2010, 30(07):  1974-1977. 
    Abstract ( )   PDF (702KB) ( )  
    Related Articles | Metrics
    A service-object-oriented dynamic evolution mechanism was proposed to meet application software's need for dynamic adaptation to the evolving Internet environment and variable user requirements. Under this mechanism, the variable parts were modeled as service-objects, and the service registration and lookup mechanisms of service-oriented computing were used to decouple object references for evolution. The evolution mechanism and evolution process were described in detail, and comparisons were made with representative related work. The experimental results show that the mechanism has better performance.
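    The decoupling idea — callers hold a service name rather than a direct object reference, so an implementation can be replaced at run time — can be sketched with a minimal registry. This is a generic illustration of registration/lookup-based decoupling, not the paper's mechanism.

```python
class ServiceRegistry:
    """Decouple callers from concrete service-objects: a caller looks
    up by name at each use, so re-registering a name swaps in a new
    version (dynamic evolution) without touching the caller."""

    def __init__(self):
        self._services = {}

    def register(self, name, obj):
        self._services[name] = obj   # replaces any previous version

    def lookup(self, name):
        return self._services[name]
```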
    Design and implementation of ALU and shifter in X-DSP
    2010, 30(07):  1978-1982. 
    Abstract ( )   PDF (660KB) ( )  
    Related Articles | Metrics
    Concerning the challenges of performance, power, and area for the Arithmetic Logical Unit (ALU) and shifter in a DSP CPU, this paper studied the architecture of X-DSP, analyzed the characteristics of all instructions related to the ALU and shifter units, and designed and implemented the two units. The two computational units were synthesized with Design Compiler using the SMIC 0.13μm CMOS technology library. The total circuit power consumption was 4.2821 mW, the circuit area was 71042.9804 μm², and the frequency was 250 MHz, which met the requirements of the system.
    Pattern recognition and Software
    Research on CUDA-based parallel implementation of fast moment invariants algorithm
    2010, 30(07):  1983-1986. 
    Abstract ( )   PDF (579KB) ( )  
    Related Articles | Metrics

    Moment invariants have been used as feature descriptors in a variety of object recognition applications since they were proposed, and it is often necessary to compute geometric moments at real-time rates. Despite the existence of many algorithms for fast computation of moments, real-time computation remains out of reach on a PC with serial implementations. After analyzing the parallelism of the fast moment invariants algorithm based on the difference of moment factors, a parallel computing method based on CUDA (Compute Unified Device Architecture) was presented and implemented on an NVIDIA Tesla C1060 GPU (Graphics Processing Unit) in this paper. The computing performance of the proposed method and the traditional serial algorithm was contrasted and analyzed. The experiments show that the parallel algorithm greatly improves the speed of moment computation, and the new method can be effectively used in real-time feature extraction.
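    A serial reference for the underlying quantity helps show where the parallelism comes from: every pixel's contribution to a geometric moment is independent, which is what lets a CUDA kernel assign one thread per pixel and finish with a parallel reduction. This sketch is the textbook definition, not the paper's difference-of-moment-factors algorithm.

```python
def geometric_moment(image, p, q):
    """Geometric moment m_pq = sum_x sum_y x^p * y^q * f(x, y),
    with image[y][x] = f(x, y). Each term depends only on its own
    pixel, so on a GPU the terms map to independent threads and
    the outer sum becomes a parallel reduction."""
    return sum((x ** p) * (y ** q) * val
               for y, row in enumerate(image)
               for x, val in enumerate(row))
```

    The Hu moment invariants used as descriptors are then algebraic combinations of these m_pq values.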

    Typical applications
    Design and implementation of I/O power consumption simulation modules of power consumption simulator HMSim
    2010, 30(07):  1987-1990. 
    Abstract ( )   PDF (599KB) ( )  
    Related Articles | Metrics
    Nowadays, as the low-carbon economy is advocated worldwide, embedded software power consumption has become a critical issue in embedded system design, and simulation is an important development tool for measuring and experimenting with embedded software power consumption. HMSim is a high-precision instruction-level embedded software power consumption simulator. Firstly, this paper introduced the design of HMSim and the framework of its instruction set simulator; then it designed the I/O functional simulation models of the UART and LCD controllers in detail and proposed an I/O power consumption statistical method. Finally, by running programs based on the uC/OS-II RTOS, it was shown that the design and implementation of the HMSim I/O power consumption simulation modules are correct.
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803
  028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn