
Table of Contents

    01 October 2006, Volume 26 Issue 10
    Network and communications
    An Adaptive and Dynamic Grid Job Scheduling Algorithm
    2006, 26(10):  2267-2269. 
    Abstract | PDF (571KB)
    GRACE is a distributed grid architecture for computational economy. In this paper, we address the grid resource allocation problem by proposing an adaptive and dynamic grid job scheduling algorithm, and introduce a myopic algorithm. The algorithm dynamically monitors the load-balance degree of the system during scheduling; simulation experiments show that it improves the job scheduling success ratio.
    Designing of EPA Protocol Abstract Test Suite
    2006, 26(10):  2270-2271. 
    Abstract | PDF (633KB)
    According to the principles and explanations for protocol testing in ISO/IEC 9646, and based on the characteristics and requirements of Ethernet for Plant Automation (EPA), a layered structure model of the abstract test suite was designed. With reference to the structure and style of the Tree and Tabular Combined Notation (TTCN), a formal language was defined to describe the abstract test suite. Meanwhile, the generation process and implementation steps of the EPA abstract test suite were introduced. The practical application of the EPA conformance test system indicates the validity of the abstract test suite, and also verifies that the EPA abstract test suite covers all the contents of the EPA conformance test.
    Intelligent QoS multicast routing algorithm under inaccurate information
    2006, 26(10):  2272-2274. 
    Abstract | PDF (656KB)
    Taking into account the characteristics of multi-constrained Quality of Service (QoS) routing in the next-generation Internet, an intelligent multicast QoS routing algorithm based on Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) was presented. The corresponding model and its mathematical description were introduced. Under inaccurate QoS parameter information, and combining the fast searching ability of PSO with the global optimization ability of GA, the proposed algorithm tries to find the Pareto non-dominated set with the maximum probability of meeting multiple QoS constraints under a given cost, from which the best multicast tree is selected. Simulation research and performance evaluation show that the proposed algorithm is both feasible and effective.
    Performance Analysis of End-to-End Path Capacity Measurement Tools
    2006, 26(10):  2275-2277. 
    Abstract | PDF (578KB)
    Analyzing the performance of existing path-capacity measurement tools is important for tool selection and improvement. This paper defines a metrics set for tool performance evaluation, and evaluates several typical path-capacity measurement tools in a self-built controllable network under many repeatable cross-traffic load conditions. It was found that VPS-based tools such as pathchar and clink have large measurement errors, while pathrate is accurate but incurs a high measurement cost. When the cross-traffic load is high, all of these tools are inaccurate and require too much measurement cost. This work is useful for tool selection and further improvement of capacity measurement tools.
    Wavelet weighted chaos local-region model of network traffic behavior analysis
    Ting Lei
    2006, 26(10):  2278-2281. 
    Abstract | PDF (819KB)
    Integrating the advantages of the wavelet transform with those of the chaos local-region model, a new network traffic forecasting model was proposed. First, the network traffic time series was decomposed into high-frequency and low-frequency signal series, and the weighted chaos local-region model was applied to predict each series respectively. Finally, the forecasted traffic was obtained by wavelet reconstruction of all the forecasted components. Simulation results on real network traffic indicate that this model is more accurate than traditional methods in network traffic prediction.
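    The weighted chaos local-region step can be sketched without the wavelet stage: a minimal one-step predictor that finds the k nearest delay vectors in the history and averages their successors with inverse-distance weights. The embedding dimension, k and the weighting below are illustrative choices, not the paper's exact settings.

```python
import math

def local_region_predict(series, dim=3, k=4):
    """Weighted chaos local-region one-step prediction (simplified sketch).

    Reconstruct delay vectors of length `dim`, find the k nearest
    historical neighbours of the latest state, and predict the next
    value as a distance-weighted average of each neighbour's successor.
    """
    # Delay vectors whose successor is known: indices 0 .. len - dim - 1
    states = [series[i:i + dim] for i in range(len(series) - dim)]
    current = series[-dim:]
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Indices of the k nearest historical neighbours of the current state
    neighbours = sorted(range(len(states)),
                        key=lambda i: dist(states[i], current))[:k]
    # Inverse-distance weights (epsilon avoids division by zero)
    weights = [1.0 / (dist(states[i], current) + 1e-9) for i in neighbours]
    total = sum(weights)
    return sum(w * series[i + dim] for w, i in zip(weights, neighbours)) / total
```

    In the full model this predictor would be applied to each wavelet sub-series separately before reconstruction.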
    A Model of Data Replication Strategy Based on Security in Grid
    2006, 26(10):  2282-2284. 
    Abstract | PDF (568KB)
    Data replication is an important technique in data grids, which improves access performance. It is also a fault-tolerance approach, but at the same time it introduces security risks. Therefore, security and fault tolerance should be studied jointly. Combining data replication with security, we proposed a mathematical model that can determine the data replication quantity. In this model, taking the economic interest and the reputation index of the service supplier into account, a two-objective optimization problem was simplified under reasonable hypotheses. Numerical calculation and analysis show that there is an optimal quantity for data replication.
    Mobility management based on multi-protocol joint optimization
    2006, 26(10):  2285-2288. 
    Abstract | PDF (917KB)
    For the generalized mobility of the next-generation network, a generalized mobility management scheme was designed that can converge multiple services efficiently through multi-protocol joint optimization. Based on Mobile IP and its subsidiary protocols, this scheme adopted the Session Initiation Protocol (SIP) and jointly optimized network-layer and application-layer mobility management. Meanwhile, by taking link-layer mobility management into account, the network signalling payload and the repetition of data were reduced.
    Trust value updating mechanism based on mixed time-event
    BaoLin Ma
    2006, 26(10):  2289-2290. 
    Abstract | PDF (527KB)
    In a grid system, a trust model can accurately evaluate the relationships between entities. However, current mechanisms for updating trust values have limitations. A Mixed Time-Event (MTE) mechanism was presented to update trust values. MTE overcomes the shortcomings of purely time-driven and purely event-driven approaches, and has higher efficiency and accuracy.
    New AQM algorithm based on Diffserv network
    2006, 26(10):  2291-2293. 
    Abstract | PDF (529KB)
    The current Assured Forwarding (AF) service in Diffserv networks can provide stable bandwidth guarantees, but cannot ensure a stable average queue size and delay as the link speed changes, and lacks an efficient scheme for simplifying the parameter setting of an Active Queue Management (AQM) algorithm. Based on an analysis of RIO and A-RED, an active queue management algorithm with an adaptive control policy, named A-RIO, was proposed in this paper. Simulation results show that, in addition to the advantages of the RIO algorithm, the new algorithm provides a stable average queue size and delay, simplifies parameter setting, and effectively improves the performance of the AF service in Diffserv networks.
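    For orientation, a minimal sketch of the RED-style drop-probability curve that RIO-family schemes apply per drop precedence (in-profile and out-of-profile packets get separate parameter sets). All parameter values here are illustrative, not A-RIO's adaptive settings.

```python
def red_drop_prob(avg, min_th, max_th, max_p):
    """RED-style drop probability: zero below min_th, rising linearly
    to max_p at max_th, certain drop above max_th."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

def rio_drop_prob(avg_in, avg_total, in_profile):
    """RIO sketch: in-profile packets are judged on the in-packet queue
    with gentle parameters; out-of-profile packets on the total queue
    with aggressive parameters (thresholds illustrative)."""
    if in_profile:
        return red_drop_prob(avg_in, 40, 70, 0.02)
    return red_drop_prob(avg_total, 10, 30, 0.10)
```

    A-RIO's contribution is adapting max_p online so operators need not hand-tune these parameters.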
    Design of Web service composition execute platform based on mobile Agent
    YuanChang Zhou
    2006, 26(10):  2294-2296. 
    Abstract | PDF (551KB)
    With an in-depth analysis of Web service composition techniques, starting from the execution aspect of Web service composition and leveraging the advantages of mobile Agents, an execution platform for Web service composition based on mobile Agents was proposed. For a complicated Web service composition, the platform first decomposes it into several interdependent task pieces, each corresponding to a linear sub-composition, and then assigns them to several mobile Agents for distributed execution.
    Research and application of VoIP gateway based on adaptive variable rate speech coding
    2006, 26(10):  2297-2299. 
    Abstract | PDF (620KB)
    In order to solve the problem that voice QoS is easily affected by network bandwidth, a new VoIP gateway based on adaptive variable-rate coding was presented. By computing the RTP voice packet loss ratio in real time and analyzing the quality of service, it adaptively chooses the most appropriate coding to obtain the best trade-off between voice quality and bandwidth, so as to reduce the voice packet loss ratio and ensure the quality of the voice service. The key techniques, such as computing the RTP voice packet loss ratio and adjusting bandwidth, were discussed in detail. The solution has been applied in the open-source program Asterisk.
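    The adaptive selection step can be sketched as follows: compute the RTP loss ratio from sequence-number accounting, then step down to a lower-rate codec as loss grows. The codec table and thresholds below are illustrative assumptions, not the gateway's actual configuration.

```python
# Hypothetical codec table: (name, bit rate in kbit/s), high rate first.
CODECS = [("G.711", 64.0), ("G.726", 32.0), ("G.729", 8.0), ("G.723.1", 5.3)]

def loss_ratio(expected, received):
    """RTP packet loss ratio from expected vs. received packet counts."""
    return (expected - received) / expected if expected else 0.0

def pick_codec(loss):
    """Step down to a lower-rate codec as loss grows (thresholds illustrative)."""
    if loss < 0.02:
        return CODECS[0]
    if loss < 0.05:
        return CODECS[1]
    if loss < 0.10:
        return CODECS[2]
    return CODECS[3]
```

    In practice the gateway would also re-signal the chosen codec to the peer before switching.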
    Adaptive exigency report and interest demand protocol for wireless sensor network
    2006, 26(10):  2300-2303. 
    Abstract | PDF (821KB)
    To meet the demand for energy saving in wireless sensor networks, several new techniques, such as adaptive restoration of broken chains, data aggregation at data sources, and channel booking, were put forward. Furthermore, a new network protocol at the MAC level was proposed. It drives data transportation through exigency reports and interest demands, and can be used in large-scale data collection and environmental monitoring. Simulation results show that, compared with other similar protocols, the new protocol saves energy and reduces time delay more efficiently.
    Information security
    Application of negative selection mutation algorithm in E-mail filter
    2006, 26(10):  2304-2306. 
    Abstract | PDF (546KB)
    In order to enhance the identification and anti-fraud capabilities of the intelligent E-mail management system, an E-mail filter based on a negative selection mutation algorithm was designed and implemented in this paper. The application of artificial immunology principles gives the E-mail filter self-learning and self-adapting capabilities. Meanwhile, the accuracy of the system is improved by adopting a two-layer filtering method. The experimental results show that this filtering system performs well in terms of accuracy rate, false-positive rate and false-negative rate, and achieves the expected aims.
    ID-based blind signature and proxy signature without a trusted party
    2006, 26(10):  2307-2309. 
    Abstract | PDF (512KB)
    The current ID-based blind signatures and proxy signatures were analyzed in this paper. It was found that these systems need to trust a Private Key Generator (PKG) unconditionally: because the PKG can compute the private key of any user in the system, it can forge a blind signature or a proxy signature of any user. Based on current ID-based signatures and ID-based blind signatures, a new ID-based blind signature without a trusted PKG was proposed, as well as a new ID-based proxy signature without a trusted PKG. The analysis shows that the proposed blind signature and proxy signature are secure and effective.
    Danger model-based three-level-module intrusion detection system
    2006, 26(10):  2310-2314. 
    Abstract | PDF (989KB)
    Based on Danger theory and data fusion technology, a new Danger model-inspired three-level-module intrusion detection system was presented. Also, an adaptive decision-template algorithm was derived, realizing online automatic adjustment of detection templates. The system has two characteristics. First, when it is difficult to classify current behaviors according to present knowledge, the system discriminates them by means of danger signals, thus reducing false alarms and enhancing the ability to identify novel attacks. Second, the adaptive decision-template algorithm allows detection templates to be modified dynamically without periodical updating, which enables the system to adapt to a changing environment and also increases accuracy on unknown attacks. Experimental results on test data from the KDD-CUP-99 database show the effectiveness of this system.
    Research and realization of a trust model based on Peer-to-Peer
    FangMing Bi
    2006, 26(10):  2315-2317. 
    Abstract | PDF (557KB)
    In order to verify the authenticity of public keys, a widespread PKI has to deal with a large number of queries about digital certificates, and current trust models cannot handle trust computation efficiently. A Peer-to-Peer trust model based on the Chord protocol and digital certificates was presented in this paper. A distributed implementation method was given, and the efficiency of trust computation was also analyzed.
    Novel authentication scheme based on visual cryptography
    GuoZhu Feng
    2006, 26(10):  2318-2319. 
    Abstract | PDF (716KB)
    An efficient and credible authentication scheme was constructed based on visual cryptography. It avoids the disadvantages of traditional cryptography by adopting only two cryptographic components, visual cryptography and MAC, without lowering security. A bar code was introduced into this scheme as the secret image to reduce the complexity and difficulty of the server's automatic recognition of the secret information hidden in images, making the scheme more efficient. Finally, we analyzed the scheme's security, and the results show that the new scheme can resist common attacks effectively.
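    As a rough illustration of the visual-cryptography component, here is the classic 2-out-of-2 scheme: each secret pixel expands into a two-subpixel pair per share, identical pairs for white and complementary pairs for black, so stacking the transparencies (pixelwise OR) reveals the image while either share alone is random. This is a textbook sketch, not the paper's exact construction.

```python
import random

def make_shares(secret, rng=None):
    """2-out-of-2 visual cryptography: each secret bit (1 = black) expands
    to a 2-subpixel pair in each share: identical pairs for white pixels,
    complementary pairs for black pixels."""
    rng = rng or random.Random()
    s1, s2 = [], []
    for bit in secret:
        pair = rng.choice([(0, 1), (1, 0)])
        s1.append(pair)
        s2.append(pair if bit == 0 else (1 - pair[0], 1 - pair[1]))
    return s1, s2

def stack(s1, s2):
    """Physically overlaying the transparencies = pixelwise OR."""
    return [(a[0] | b[0], a[1] | b[1]) for a, b in zip(s1, s2)]
```

    After stacking, black pixels become fully black pairs and white pixels half-black pairs, which the eye perceives as contrast.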
    Design of cryptosystem based on fingerprint identity certification
    2006, 26(10):  2320-2322. 
    Abstract | PDF (617KB)
    The present cryptosystems were discussed in this paper, and the process of fingerprint identification was expounded. The focus was the design of an Internet communication security system based on fingerprint certification. A matching test of 400 fingerprints was carried out. The results show that the proposed system based on fingerprint identification is more secure and reliable than traditional identity certification systems.
    Research on technology of DoS based on protocol transform
    ShuJun Li
    2006, 26(10):  2323-2325. 
    Abstract | PDF (759KB)
    In this paper, protocol transform was put forward, and its potential limitations were analyzed. Based on this analysis, how to implement a DoS attack by protocol transform was expounded, and the characteristics of this DoS attack were analyzed. Finally, some protection measures against the attack were presented, and the future of this attack method was predicted.
    Study of blind audio digital watermarking algorithm based on DWT
    2006, 26(10):  2326-2327. 
    Abstract | PDF (656KB)
    A new watermarking algorithm based on the DWT was proposed to offer copyright protection for digital audio signals. A digital watermark is embedded into the significant wavelet coefficients repeatedly. The experimental results show that the embedded watermark has good imperceptibility and robustness against common signal processing manipulations, and the watermark can be extracted without the original data.
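    The embed/extract idea can be sketched with a one-level Haar transform and parity quantization of approximation coefficients; repetition plus majority voting gives robustness, and extraction is blind (no original needed). The wavelet level, coefficient selection, DELTA and repetition factor below are illustrative assumptions, not the paper's actual parameters.

```python
def haar_fwd(x):
    """One-level unnormalised Haar DWT: sums (approximation) and differences (detail)."""
    a = [x[2 * i] + x[2 * i + 1] for i in range(len(x) // 2)]
    d = [x[2 * i] - x[2 * i + 1] for i in range(len(x) // 2)]
    return a, d

def haar_inv(a, d):
    """Inverse of haar_fwd."""
    x = []
    for s, t in zip(a, d):
        x += [(s + t) / 2, (s - t) / 2]
    return x

DELTA = 0.5  # quantisation step (illustrative)

def embed(x, bits, rep=3):
    """Embed each bit `rep` times by quantisation-index modulation:
    force the coefficient onto a DELTA grid whose parity encodes the bit."""
    a, d = haar_fwd(x)
    for i, bit in enumerate(bits):
        for r in range(rep):
            j = i * rep + r
            q = round(a[j] / DELTA)
            if q % 2 != bit:
                q += 1
            a[j] = q * DELTA
    return haar_inv(a, d)

def extract(x, nbits, rep=3):
    """Blind extraction: recompute coefficients, majority-vote the parities."""
    a, _ = haar_fwd(x)
    bits = []
    for i in range(nbits):
        votes = [round(a[i * rep + r] / DELTA) % 2 for r in range(rep)]
        bits.append(1 if sum(votes) > rep // 2 else 0)
    return bits
```
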
    Research on Hash chain-based RFID privacy enhancement tag
    2006, 26(10):  2328-2331. 
    Abstract | PDF (917KB)
    Radio Frequency Identification (RFID), as a new automated identification technology, has become popular in the supply chain and retail business. However, the widespread deployment of RFID tags may bring new threats to consumers' privacy, due to the powerful tracking capability of the tags. There are several important technical points when constructing an RFID scheme: particularly important are consumers' privacy and the security of tag information; low-cost implementation is another. To address these issues, the requirements and restrictions of RFID systems were discussed and clarified in this paper, and the features and issues of several current RFID schemes were analyzed. Finally, a simple secure-mode tag that adopts a low-cost hash chain mechanism to enhance consumers' privacy was proposed.
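    A minimal sketch of a hash-chain tag in the style such schemes build on: on each query the tag answers G(s) and updates its secret to H(s), so successive responses are unlinkable to an eavesdropper, while the back-end identifies the tag by walking each known chain. The two hash roles, chain length and identification loop are illustrative, not the paper's exact protocol.

```python
import hashlib

def H(s):
    """Secret-update hash (forward security: old secrets are unrecoverable)."""
    return hashlib.sha256(b"H" + s).digest()

def G(s):
    """Output hash, kept distinct from H so responses reveal nothing
    about the next state."""
    return hashlib.sha256(b"G" + s).digest()

class Tag:
    def __init__(self, secret):
        self.s = secret
    def respond(self):
        out = G(self.s)      # answer sent over the air
        self.s = H(self.s)   # one-way state update
        return out

def identify(answer, initial_secrets, max_chain=100):
    """Back-end server: walk each tag's hash chain until the answer matches."""
    for tag_id, s in initial_secrets.items():
        for _ in range(max_chain):
            if G(s) == answer:
                return tag_id
            s = H(s)
    return None
```

    The server-side scan is linear in tags times chain length, which is the classic cost/privacy trade-off of this family of schemes.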
    SIP secure authentication model based on strong authentication technology
    2006, 26(10):  2332-2335. 
    Abstract | PDF (769KB)
    With regard to the representative security threats to the Session Initiation Protocol (SIP), a SIP secure authentication model based on strong authentication technology was put forward, and its security was analyzed. The model achieves strong authentication by combining smart cards with digital certificates, extends SIP accordingly, and imports strong authentication technology into SIP. The model implements duplex secure authentication of sessions, and ensures the confidentiality, authenticity, integrity and non-repudiation of SIP message transportation. As a result, it improves the security of SIP.
    Implementation of honeypot system for detecting unknown attacks
    2006, 26(10):  2336-2337. 
    Abstract | PDF (428KB)
    Current firewalls and intrusion detection systems cannot effectively discern unknown attacks, which leads to false positives and false negatives. Therefore, a honeypot system was proposed that uses an attack-signature mechanism for detecting and analyzing unknown network attacks at the system-call level.
    Design of collaborative antispam filter system based on the mobile Agent
    2006, 26(10):  2338-2340. 
    Abstract | PDF (644KB)
    A collaborative antispam filter architecture based on mobile agents was presented, with key descriptions of its architectural components, including the agent server, antispam client, dynamic load balance layer, mobile agent layer and antispam server. The agent server proposed here provides uniform access to email servers, and reduces the complexity of filter systems resulting from the diversity of email servers and clients. Hash filtering of similar emails was implemented by using Nilsimsa. Finally, the whole collaborative antispam filter system was tested.
    FPGA-based intrusion detection system in IPv6
    2006, 26(10):  2341-2343. 
    Abstract | PDF (808KB)
    IPv6 will be the core technology of the next-generation Internet. Therefore, the study of intrusion detection systems in IPv6 is closely linked with the security of the next-generation Internet. After analyzing the fundamentals of today's network security systems and the primary characteristics of IPv6, a framework of an intrusion detection system in IPv6 was put forward. The study then focused on pattern matching using a Field Programmable Gate Array (FPGA).
    New method to avoid heap overflow using random fit
    2006, 26(10):  2344-2346. 
    Abstract | PDF (544KB)
    Based on current methods against buffer overflow and an analysis of their existing problems, a random fit algorithm was presented. By introducing randomization into the memory management algorithm, the likelihood that an attacker can predict the content to be overwritten is reduced.
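    The core idea can be sketched with a toy free-list allocator that chooses uniformly among all sufficiently large free blocks instead of taking the first fit, so the neighbour of any allocation is unpredictable. The block layout and API are hypothetical, not the paper's implementation.

```python
import random

class RandomFitAllocator:
    """Toy free-list allocator: pick uniformly at random among all free
    blocks large enough, so an attacker cannot predict which chunk a
    heap overflow will overwrite."""
    def __init__(self, blocks, rng=None):
        self.free = list(blocks)          # (offset, size) pairs
        self.rng = rng or random.Random()
    def alloc(self, size):
        fits = [b for b in self.free if b[1] >= size]
        if not fits:
            return None
        off, bsize = self.rng.choice(fits)  # random fit instead of first fit
        self.free.remove((off, bsize))
        if bsize > size:                    # return the tail to the free list
            self.free.append((off + size, bsize - size))
        return off
    def free_block(self, off, size):
        self.free.append((off, size))
```

    A real allocator would also coalesce adjacent free blocks; randomization costs a scan of the free list per allocation.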
    Bivariate polynomials key management scheme based on ID in WSNs
    2006, 26(10):  2347-2350. 
    Abstract | PDF (841KB)
    WSNs require cryptographic protection of communications between sensor nodes. Adopting bivariate polynomials, the ID-based key management scheme provides key distribution, key updating and sensor-capture detection, and compromised nodes can recover automatically. Meanwhile, with lowered demands on memory and computational ability, this scheme is well suited to sensor networks.
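    The bivariate-polynomial idea (in the style of Blundo-type schemes) can be sketched as follows: a symmetric polynomial f(x, y) is fixed by the key server, node i stores the univariate share f(i, y), and any two nodes derive the same pairwise key because f(i, j) = f(j, i). The modulus, degree and coefficients below are illustrative, not the paper's parameters.

```python
P = 2**31 - 1  # a prime modulus (illustrative)

# Symmetric coefficient matrix c[a][b] == c[b][a] defines
# f(x, y) = sum over a, b of c[a][b] * x^a * y^b  (mod P)
COEFFS = [[5, 7, 11],
          [7, 3, 2],
          [11, 2, 13]]

def f(x, y):
    return sum(COEFFS[a][b] * pow(x, a, P) * pow(y, b, P)
               for a in range(3) for b in range(3)) % P

def share(node_id):
    """Polynomial share g_i(y) = f(i, y), stored as coefficients of y."""
    return [sum(COEFFS[a][b] * pow(node_id, a, P) for a in range(3)) % P
            for b in range(3)]

def pairwise_key(my_share, peer_id):
    """Evaluate own share at the peer's ID: g_i(j) = f(i, j) = f(j, i)."""
    return sum(c * pow(peer_id, b, P) for b, c in enumerate(my_share)) % P
```

    Each node stores only degree+1 coefficients, which is the memory saving the abstract refers to; the scheme stays secure until more than degree nodes are captured.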
    Graphics and image processing
    Medical Image Registration Method Based on Mixed Mutual Information
    Hong-Ying ZHANG
    2006, 26(10):  2351-2353. 
    Abstract | PDF (715KB)
    Traditionally, the similarity metric is based on Shannon's entropy. Through an analysis of Renyi's entropy, it is found that Renyi's entropy can remove some unwanted local optima and accordingly smooth out a difficult optimization terrain, while Shannon's entropy deepens the basin of attraction, making the registration function easier to optimize. So a new similarity measure based on mixed mutual information was proposed: measures based on the different entropies are used in different search phases, with a global optimization algorithm and a local one used respectively. First, the global optimization algorithm is used to find the local extrema of the generalized mutual information measure based on Renyi's entropy. Then, the local algorithm locates the global optimal solution by searching among the current local optima, taking the generalized mutual information measure based on Shannon's entropy as the objective function.
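    The contrast between the two entropies can be made concrete on a discrete distribution: for alpha > 1, Renyi's entropy down-weights small probabilities, which produces the smoothing effect exploited here. A minimal sketch (the registration measures themselves combine these entropies of joint and marginal histograms):

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)
```

    On a uniform distribution the two coincide; on a peaked distribution Renyi (alpha = 2) is strictly smaller, reflecting its insensitivity to low-probability bins.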
    New speckle reduction method for polarimetric SAR image based on independent component analysis
    2006, 26(10):  2354-2356. 
    Abstract | PDF (1016KB)
    Polarimetric Synthetic Aperture Radar (SAR) images are usually corrupted by strong speckle noise, which hinders scene information extraction and the application of polarimetric SAR images. Based on the statistical formulation of polarimetric SAR images, a new approach to speckle reduction using Independent Component Analysis (ICA) was presented. The experimental results show that excellent performance can be achieved: the speckle noise is reduced effectively, the Equivalent Number of Looks (ENL) is high, and the image quality is improved obviously.
    Optimized chain code compression algorithm for fingerprint binary image
    2006, 26(10):  2357-2359. 
    Abstract | PDF (576KB)
    An optimized compression algorithm for linear-structure stripe images, Freeman differential chain code with Huffman coding, was discussed in this paper. Compared to the traditional Freeman chain code, the proposed method is a hybrid encoding based on the Freeman chain code, differential code and Huffman code. Theoretical analysis and experiments on fingerprint binary images show that this algorithm is superior to other binary image compression algorithms, especially for fingerprint binary image compression. The average code length of the proposed method is 1.7651 bits, shorter than the 3 bits of the 8-direction Freeman chain code or the Freeman differential chain code.
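    The chain-code stage can be sketched directly: the Freeman code records the direction of each step along a ridge, and the differential code records direction changes mod 8. For smooth curves the differences concentrate on a few symbols (0, 1, 7), the skewed distribution that Huffman coding then compresses well. A minimal sketch:

```python
# 8-neighbourhood steps, indexed by Freeman direction 0..7 (0 = east, counter-clockwise)
STEPS = [(1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]
DIR = {s: d for d, s in enumerate(STEPS)}

def freeman_code(points):
    """Freeman chain code of a connected pixel path."""
    return [DIR[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

def differential_code(chain):
    """Differential chain code: change of direction mod 8. For smooth
    ridge lines most symbols are 0 or +/-1, which Huffman coding exploits."""
    return [(b - a) % 8 for a, b in zip(chain, chain[1:])]
```
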
    Circular object segmentation under complicated background
    2006, 26(10):  2360-2361. 
    Abstract | PDF (643KB)
    How to discern circular objects accurately in complex industrial images was studied in this paper. Having illustrated the limitations of traditional image segmentation methods, a new method based on double-threshold segmentation combined with mathematical morphology was put forward. First, two binary images are obtained by segmenting the image with two thresholds. Second, erosion, opening and closing are applied to the two images according to prior knowledge of the shape and location of the objects in the image. Last, the edge information from the image segmented with the high threshold is merged into the image segmented with the low threshold. In this way circular objects can be segmented and discerned accurately in complex images, greatly improving the recognition accuracy. The new method works well in pre-processing complex images.
    Application of MAP estimation based on Gaussian Markov random field in Gaussian noise filtering
    2006, 26(10):  2362-2365. 
    Abstract | PDF (831KB)
    The application of Gaussian Markov Random Field (GMRF) based Maximum A Posteriori Probability (MAP) estimation to image Gaussian noise filtering was presented. According to the characteristics of Gaussian noise, a restoration model of the degraded image based on GMRF was built, and the problem of Gaussian noise filtering was thereby transformed into MAP estimation. The prior probability can be estimated by using the equivalence of Markov random fields and the Gibbs Distribution (GD). To obtain the MAP estimate, the GMRF parameters were first estimated by means of the Expectation-Maximization (EM) algorithm, and the objective function was then minimized with the conjugate gradient technique. The experimental results demonstrate that the proposed method outperforms other filters (the Gaussian filter, the Wiener filter, etc.) in suppressing Gaussian noise and preserving the original composition of images.
    Blind separation algorithm of blurred image based on independent component analysis
    2006, 26(10):  2366-2368. 
    Abstract | PDF (847KB)
    The original images were restored from blurred grayscale images by using the nonholonomic natural gradient (NNG) algorithm of Independent Component Analysis (ICA), and the principle of the natural gradient algorithm under the nonholonomic constraint was analyzed. Although the algorithm is robust to nonstationary and strongly fluctuating sources, its nonlinear activation function is closely related to the unavailable probability distribution of the sources. To solve this problem, the nonlinear function was selected adaptively by using the kurtosis of the output signals, an improved adaptive NNG (ANNG) blind separation algorithm for blurred images based on ICA was proposed, and the effect of different mixture matrices on the performance of this algorithm was studied. Simulations show the validity of the proposed method: compared with the nonholonomic natural gradient algorithm and the classical FastICA algorithm, the performance index of the proposed algorithm is better.
    Research on QPSO algorithm in image compression
    2006, 26(10):  2369-2371. 
    Abstract | PDF (527KB)
    In order to decrease the space complexity of image storage and transfer, image compression is necessary. Therefore, how to apply Quantum-behaved Particle Swarm Optimization (QPSO) to image compression was studied in this paper. During the compression process, an ordered representation of the image was first obtained, and then the compressed code was optimized according to the particles' convergence. Experimental results show that the compression efficiency of the QPSO algorithm is much better than that of the Genetic Algorithm (GA).
    Image feature selection method based on improved RS-GA
    2006, 26(10):  2372-2374. 
    Abstract | PDF (587KB)
    With regard to the problem that the original features in image classification are massive and redundant, a new image feature selection method was presented. This method combines Rough Set (RS) theory with a Genetic Algorithm (GA) to select features. To improve the efficiency of the algorithm and obtain the optimal search result, the rough-set definition of relative attribute dependency was introduced, and the fitness function and genetic operators were designed. Then, the proposed method was applied to image feature selection. Experimental results show that it has better performance and higher efficiency.
    Feature extraction and recognition of iris based on biorthogonal multiwavelets
    2006, 26(10):  2375-2377. 
    Abstract | PDF (591KB)
    A biorthogonal multiwavelet filter characterized by self-affinity was proposed to extract iris texture features, and local together with global features were applied to recognize the iris. After processing iris images with the biorthogonal multiwavelet filter, local coarse quantization encoding was adopted in the low-frequency parts of the coefficients, and the Hamming distance was taken as the classifier. When the Hamming distance decision was uncertain due to the influence of eyelids, eyelashes and iris deformations, the mean and variance were extracted from the multiwavelet transform coefficients, and a weighted Euclidean distance based on the reciprocal of the covariance was designed as the classifier. The results show that this approach can identify the iris quickly and reliably.
    Improvement of contour following algorithm and its application in character recognition
    2006, 26(10):  2378-2379. 
    Abstract | PDF (593KB)
    To fulfill the application requirements of character recognition in images, an improved contour following algorithm was proposed and a contour list based on chain codes was presented. In character recognition, the character outline sequence is generated according to the contour list, and character similarity is measured by its correlation value. The experimental results show fewer extracted character features, faster recognition and higher accuracy, and the method is less affected by font size.
    Image registration based on SUSAN algorithm
    2006, 26(10):  2380-2382. 
    Abstract | PDF (596KB)
    To solve the image registration problem, a modified SUSAN algorithm was used to detect corner points. Based on the invariance of rigid bodies and the geometric relations between the feature points, image registration was accomplished. Experiments show that both feature extraction and image registration are fast.
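    The SUSAN corner test can be sketched as follows: count the neighbours whose brightness is close to the centre pixel (the USAN area) and flag a corner where that area is small. This toy version uses a 3x3 window in place of SUSAN's 37-pixel circular mask, and the thresholds are illustrative, not the paper's modified parameters.

```python
# 3x3 neighbourhood as a stand-in for SUSAN's 37-pixel circular mask
OFFSETS = [(dx, dy) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def susan_response(img, x, y, t=27):
    """USAN area: neighbours whose brightness is within t of the nucleus.
    (SUSAN = Smallest Univalue Segment Assimilating Nucleus.)"""
    nucleus = img[y][x]
    return sum(1 for dx, dy in OFFSETS
               if abs(img[y + dy][x + dx] - nucleus) <= t)

def corners(img, t=27, g=5):
    """Flag interior pixels whose USAN area falls below the geometric
    threshold g: a small USAN indicates a corner."""
    h, w = len(img), len(img[0])
    return [(x, y) for y in range(1, h - 1) for x in range(1, w - 1)
            if susan_response(img, x, y, t) < g]
```

    On a synthetic bright quadrant, only the quadrant's corner pixel has a small enough USAN to be flagged.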
    Improved Pan algorithm for fast intraprediction in H.264/AVC video coding
    XiaoDong Tian
    2006, 26(10):  2383-2385. 
    Abstract | PDF (531KB)
    The fast intraprediction Pan algorithm based on the edge directional histogram was improved in this paper. A block type selection method (Intra_4×4 and Intra_16×16) and an early termination technique for 4×4 luma block mode decision were employed. Experimental results show that, compared with the Pan algorithm, the one-frame encoding time of the fast intraprediction algorithm is reduced by 29.093% with negligible loss of peak signal-to-noise ratio.
    Lossless video compression H.264-LS based on H.264 and neighbor prediction
    2006, 26(10):  2386-2388. 
    Abstract | PDF (563KB)
    A lossless video compression method, H.264-LS, was proposed in this paper. To meet the needs of lossless compression, and according to the features of the residual coefficients after motion compensation, a two-dimensional neighbor prediction technique was used instead of H.264's integer transform. The experimental results demonstrate that the proposed approach is, on the whole, better than other existing lossless video compression algorithms, especially in the case of heavy movement.
    Database technology
    Research of clustering algorithm based on density gradient
    2006, 26(10):  2389-2392. 
    Abstract | PDF (1017KB)
    In order to solve the difficulty of clustering irregularly distributed data sets, a new clustering algorithm based on density gradient was proposed. By analyzing the changing density between each data sample and its neighbors, the algorithm searches for points with maximum density and takes them as the initial cluster centers. Then it merges smaller clusters into larger ones according to the distribution of border points between clusters. Experimental results show that the new algorithm outperforms Density-Based Spatial Clustering of Applications with Noise (DBSCAN).
    Data preprocessing method based on characteristic of interests for WUM
    2006, 26(10):  2393-2394. 
Abstract | PDF (673KB)
To reduce the data scale and find more recommendable access patterns in log files, a new data preprocessing method based on the characteristics of users' interests was proposed for Web Usage Mining (WUM). The method filters out of the log file those access records that arise from users' short-term interests and are not worth recommending. Experimental results indicate that the method removes such noise data, greatly reducing the data scale and the complexity of WUM, and enhances the accuracy of WUM-based personalized recommendation.
    Prediction of time series based on wavelet decomposition and clustering fuzzy systems
Jingchun Huang
    2006, 26(10):  2395-2397. 
Abstract | PDF (549KB)
A prediction method for non-stationary time series was proposed that combines the multi-resolution property of wavelet analysis with the interpretability of fuzzy rules. The original time series was decomposed into approximation (smooth) and detail components at different levels. After denoising with a soft-hard thresholding method, the components at each level were forecast with clustering fuzzy systems. The sum of the forecasts at all levels then gave the prediction of the original time series. Experiments show that the method is effective.
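The soft and hard thresholding rules mentioned above have a standard textbook form; a minimal sketch follows, where the threshold value 0.5 and the sample coefficients are illustrative only, not taken from the paper:

```python
def hard_threshold(coeffs, t):
    # Hard rule: zero out coefficients below the threshold, keep the rest.
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    # Soft rule: also shrink the surviving coefficients toward zero by t.
    return [(abs(c) - t) * (1.0 if c > 0 else -1.0) if abs(c) >= t else 0.0
            for c in coeffs]

detail = [0.1, -2.0, 0.3, 1.5, -0.05]   # detail coefficients of one level
hard = hard_threshold(detail, 0.5)       # [0.0, -2.0, 0.0, 1.5, 0.0]
soft = soft_threshold(detail, 0.5)       # [0.0, -1.5, 0.0, 1.0, 0.0]
```

A soft-hard compromise, as the abstract suggests, typically interpolates between these two rules.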
    Algorithm of finding Outlier for reclustering based on distance
    XueSong Xu
    2006, 26(10):  2398-2400. 
Abstract | PDF (552KB)
The cell-based algorithm for identifying, analyzing and evaluating distance-based outliers was first studied, and its advantages and disadvantages were pointed out. A new outlier-detection algorithm, based on distance and reclustering, was then proposed. Theoretical analysis and experimental results show that this algorithm not only overcomes the faults of the traditional cell-based algorithm (it must be recomputed from scratch whenever a parameter changes, and it is only suitable for finding outliers in low dimensions), but also avoids the problem that randomly selected initial values produce different outlier-detection results, while converging faster.
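As a rough illustration of the distance-based outlier notion underlying both algorithms (a point with too few neighbors within distance d is flagged), here is a brute-force stdlib sketch; the parameters and data are invented for the example:

```python
import math

def distance_outliers(points, d, min_neighbors):
    # A point is an outlier if fewer than min_neighbors other points
    # lie within distance d of it.
    return [i for i, p in enumerate(points)
            if sum(1 for j, q in enumerate(points)
                   if i != j and math.dist(p, q) <= d) < min_neighbors]

# A tight cluster plus one isolated point.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (8, 8)]
outliers = distance_outliers(pts, d=2.0, min_neighbors=2)   # [4]
```

The cell-based and reclustering algorithms discussed in the abstract are ways of computing this same predicate without the quadratic scan.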
    Survey on dimension reduction techniques
    2006, 26(10):  2401-2404. 
Abstract | PDF (780KB)
Dimension reduction techniques were discussed from two aspects: feature selection and dimension transformation. First, the basic theory and representative algorithms of feature selection were briefly introduced. Then, several of the most popular dimension transformation techniques were analyzed in detail, including Principal Components Analysis and its related methods, Independent Components Analysis, factor analysis, projection pursuit, etc., and the connections and distinctions between them were discussed. Finally, the present situation and future development of dimension reduction techniques were pointed out.
    Efficient structural joins on XML documents based on EXN-Tree encoding
    2006, 26(10):  2405-2407. 
Abstract | PDF (824KB)
A new encoding model, EXN-Tree encoding, was proposed in this paper. First, the concept of the EXN-Tree was introduced, and the nodes of an XML document tree were mapped to nodes of the EXN-Tree. A node data structure for the XML document tree was then established according to the EXN-Tree node encoding. On this basis, a series of structural-join algorithms was put forward covering two cases: nodes without sorting or indexes, and nodes with sorting and indexes. In the first case, a simple modification of the VPJ algorithm was applied to the new encoding and demonstrated better CPU performance. In the second case, the algorithm's procedure was described in detail and its I/O complexity was analyzed. The results show that this algorithm performs well and is superior to the current one in terms of I/O complexity.
    Design and implementation of GML data storing approach
    2006, 26(10):  2408-2412. 
Abstract | PDF (1195KB)
With the huge amount of Geography Markup Language (GML) data currently held in plain files, moving it into database storage is an urgent issue. The storing approaches of XML-Enabled Databases (XED) and Native XML Databases (NXD) were discussed in this paper, and solutions for storing GML data in commercial databases, both object-relational and XML DBMSs, were given. Finally, an application based on .NET was developed for storing and analyzing GML data, proving the feasibility and practicability of the designed scheme and the adopted methods.
    Mining Burst patterns in large temporal database
    2006, 26(10):  2413-2416. 
Abstract | PDF (1067KB)
How to effectively discover potentially useful knowledge from large databases is an important yet challenging issue. This paper first pointed out, with experimental evidence, two problems that arise in mining very large temporal databases, and then proposed a new method to solve them. The approach first partitions a database into several small datasets; burst patterns are then mined after four pruning passes over the data. The experimental results show that the proposed method is accurate and efficient, and that the burst patterns are useful for business decision-making.
    New model of an active database based on triggers
    2006, 26(10):  2417-2420. 
Abstract | PDF (684KB)
Triggers are an important mechanism of integrity enforcement and active control in commercial DBMSs. To satisfy the active-behavior needs that many application systems place on database systems, a new model of an active database based on triggers was presented, and on the basis of this model a simple, easily realized method was applied to build such an active database.
    Design and implementation of financial prewarning model based on data-mining
    2006, 26(10):  2421-2424. 
Abstract | PDF (673KB)
Since the ID3 algorithm has an obvious advantage in attribute screening, it was modified in this paper to obtain a simplified financial index system. Then, as artificial neural networks construct financial prewarning models better than linear and regression models, a new financial prewarning model based on the BP (back-propagation) network was constructed. Finally, the new model and the classic Z-score model were compared on their ability to prewarn of financial risk in terms of Type I and Type II error rates, and the prewarning ability of the new model was confirmed.
    Study on the usage mining algorithm based on frequency and preference
    Wu Richard
    2006, 26(10):  2425-2426. 
Abstract | PDF (527KB)
A new usage mining algorithm based on frequency and preference was put forward in this paper. The time spent on pages was taken as a factor influencing a user's preference, whereas traditional algorithms simply equate frequency with preference. The algorithm collected data via ASP.NET and XML, divided the data into user sessions, and finally mined frequent and preferred usage patterns. Experimental results show that, compared with traditional algorithms, the algorithm costs less computation, has higher accuracy, and truly reflects the usage patterns preferred by most users.
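One simple way to weight visit frequency by dwell time, in the spirit of the abstract, is sketched below. The equal 0.5/0.5 weighting and the toy visit log are assumptions for illustration, not the paper's formula:

```python
from collections import defaultdict

def preference_scores(visits):
    # visits: (page, seconds) pairs gathered from user sessions.
    # Score mixes relative visit frequency with relative average dwell
    # time, so a page opened often but left at once can rank below a
    # page read at length.
    freq, total = defaultdict(int), defaultdict(float)
    for page, secs in visits:
        freq[page] += 1
        total[page] += secs
    fmax = max(freq.values())
    tmax = max(total[p] / freq[p] for p in freq)
    return {p: 0.5 * freq[p] / fmax
               + 0.5 * (total[p] / freq[p]) / tmax for p in freq}

visits = [("home", 2), ("home", 3), ("home", 1),     # frequent, brief
          ("article", 120), ("article", 90)]         # rarer, long reads
scores = preference_scores(visits)
```

Under pure frequency counting "home" would win; with dwell time factored in, "article" ranks higher.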
    Automated categorization forensic system for history data of Web browsers
    2006, 26(10):  2427-2429. 
Abstract | PDF (544KB)
    To enhance the automation of forensics, an automated categorization forensic method for history data of Web browsers based on Web classification technology was proposed in this paper, and a prototype system was implemented. The system automatically extracted the features of history data of web browsers and categorized the caught Web pages. The experimental results show that the system greatly increases the forensic efficiency and accuracy.
    Adaptive online retail Web site based on CA extended model
    2006, 26(10):  2430-2432. 
Abstract | PDF (716KB)
In online retail, the conflict between customers' varying interests in different commodities and a Web site's fixed commodity classification structure makes most customers visit an excessive number of pages. One way to solve the problem is to build a Hidden Markov Model (HMM) so that the Web site adjusts itself according to users' visits. Starting from the initialization of the HMM, the same results can be achieved in less time by applying the theory of an extended cellular automata (CA) model. This throws some light on Web site adaptation based on the extended CA model.
    Artificial intelligence
    New mixed quantum-inspired evolutionary algorithm for TSP
Wu Yan, Jianjun Bao
    2006, 26(10):  2433-2436. 
Abstract | PDF (694KB)
Based on an analysis of the basic concepts of quantum evolution, a new mixed quantum-inspired evolutionary algorithm for solving the Traveling Salesman Problem (TSP) was proposed, in which the 3-opt local search heuristic was incorporated into the quantum evolutionary mechanism, the nearest-neighbor heuristic was used to initialize parameters, and the ordered crossover operator was introduced to extend the exploratory range of the quantum population. Experiments were carried out on cases from the well-known TSP library (TSPLIB). The results show that the new algorithm is effective and robust, finding satisfactory solutions with a small population and tiny relative error, even for medium- and large-scale problems (more than 500 cities).
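The nearest-neighbor initialization mentioned above is the classical greedy tour construction; a stdlib sketch on a toy instance follows (the unit-square cities are illustrative, and the quantum and 3-opt machinery is of course not shown):

```python
import math

def nearest_neighbor_tour(cities, start=0):
    # Greedy seeding: from the current city, always move to the
    # closest city not yet visited.
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        cur = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(cur, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(cities, tour):
    # Closed-tour length, returning to the start city.
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Four corners of a unit square: the greedy tour traces the perimeter.
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = nearest_neighbor_tour(cities)
```

Such a tour gives the evolutionary population a much better starting point than random permutations.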
    User interest profile classification algorithm based on FCC neural network
    2006, 26(10):  2437-2439. 
Abstract | PDF (824KB)
Fast classification of user interest profiles is a key technology for personalized search engines. A new FCC neural network model was presented in this paper. Unlike CC4, it does not require binary input, because it accepts real-valued vectors. The FCC network classifies a user's information according to the generalization regions of its k nearest neighbor samples. As the value of k increases, the classification performance approaches that of the Bayes classification algorithm.
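The k-nearest-neighbor decision over real-valued profile vectors, which the FCC model generalizes, can be illustrated as follows; the toy profiles and interest labels are invented for the example:

```python
import math
from collections import Counter

def knn_classify(samples, labels, x, k=3):
    # Vote among the k training samples nearest to the real-valued
    # input vector x; no binary encoding of the input is needed.
    nearest = sorted(range(len(samples)),
                     key=lambda i: math.dist(samples[i], x))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

profiles = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
interests = ["sports", "sports", "music", "music"]
label = knn_classify(profiles, interests, (0.85, 0.15), k=3)   # "sports"
```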
    New algorithm for SVM-Based incremental learning
    XiaoDan Wang
    2006, 26(10):  2440-2443. 
Abstract | PDF (725KB)
    Based on the analysis of the relation between the Karush-Kuhn-Tucker (KKT) conditions of Support Vector Machine(SVM) and the distribution of the training samples, the possible changes of support vector set after new samples are added to training set were analyzed, and the generalized Karush-Kuhn-Tucker conditions were defined. Based on the equivalence between the original training set and the newly added training set, a new algorithm for SVM-based incremental learning was proposed. With this algorithm, the useless samples were discarded and the useful training samples of importance were reserved. Experimental results with the standard dataset indicate the effectiveness of the proposed algorithm.
    Model of intelligent cleaning robot based on behavior evolution
    2006, 26(10):  2444-2445. 
Abstract | PDF (395KB)
Based on research into neural networks, a model of an intelligent cleaning robot was proposed in which a Double-Population Genetic Algorithm (DPGA) evolves the weights of the neural network. The model simulates biological behavior rules and, depending on the environment, can adopt either a defined-region search or a wide-area search to sweep rubbish. The simulation experiment verifies the validity of the model and the superiority of the DPGA over the traditional Single-Population Genetic Algorithm (SPGA).
    Feature extraction method based on LS-SVM and its application to intelligent quality control
    2006, 26(10):  2446-2449. 
Abstract | PDF (742KB)
A new feature extraction method based on Least Squares Support Vector Machine (LS-SVM) was proposed and successfully applied to intelligent quality control. First, linear feature extraction was formulated in the same fashion as the LS-SVM linear regression algorithm. Second, following the usual SVM methodology, the data was mapped from the original input space to a high-dimensional one, so that nonlinear feature extraction could be obtained from the linear formulation through the kernel trick. Third, 50-dimensional simulated data sets covering six patterns, generated from a universal control chart, were used for testing: the original data was reduced to 3 dimensions while 80% of the classification information was retained. Finally, a BP-based abnormal pattern recognizer applied to the extracted features obtained better results than directly recognizing the original samples with RSFM methods. The simulation results indicate that this feature extraction method is both feasible and effective.
    New structure-based bill location technique
    2006, 26(10):  2450-2452. 
Abstract | PDF (757KB)
In order to enhance the accuracy of bill recognition, bill location was studied and a new locating technique based on structure was proposed in this paper. It used the number and relative positions of the intersecting points on the main margin of a check as the check's structural feature, classified bills with the similarity functions defined in the paper, and finally extracted the recognition region of the bill. The experimental results demonstrate that this algorithm works well in bill location.
    Improved multi-objective genetic algorithm based on NSGA-II
    2006, 26(10):  2453-2456. 
Abstract | PDF (632KB)
Based on study and analysis of the NSGA-II algorithm, a new initial screening mechanism was designed, the coefficient generation of the arithmetic crossover operator was improved, and a more reasonable crowding mechanism was proposed, speeding up convergence and improving its precision. Test results on representative benchmark functions show that with these improvements, higher computational efficiency and a more reasonably distributed solution set can be obtained, while the diversity of the solutions is maintained.
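For reference, the standard NSGA-II crowding distance that such crowding mechanisms refine can be computed as follows; the sample two-objective front is illustrative:

```python
def crowding_distance(front):
    # NSGA-II crowding: for each objective, sort the front, give the
    # boundary solutions infinite distance, and add to each interior
    # solution the normalized gap between its two neighbors.
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = front[order[-1]][obj] - front[order[0]][obj]
        if span == 0:
            continue
        for k in range(1, n - 1):
            dist[order[k]] += (front[order[k + 1]][obj]
                               - front[order[k - 1]][obj]) / span
    return dist

# Boundary points are always kept; the interior point in the sparser
# region of this front gets the larger crowding distance.
front = [(0.0, 1.0), (0.1, 0.8), (0.5, 0.3), (1.0, 0.0)]
d = crowding_distance(front)
```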
    Application of fuzzy decision trees to the public critical system
    Yang Yang
    2006, 26(10):  2457-2459. 
Abstract | PDF (610KB)
Nowadays the data in real police databases of the public critical system is growing explosively and is hard to classify. To solve this problem, a revised fuzzy decision tree algorithm combined with a Genetic Algorithm was proposed in this paper. The method improves the forecast rate of the decision trees and the comprehensibility of the rules. Meanwhile, a decision tree classifier based on the algorithm was designed to help the police not only classify old items, but also forecast new critical events accurately and quickly.
    Maximum scatter difference discriminant analysis in residual space and face recognition
    2006, 26(10):  2460-2462. 
Abstract | PDF (1246KB)
A new method of discriminant feature extraction based on the scatter difference criterion in residual space was developed in this paper. First, the instability of face images under different illuminations was moderated by constructing a residual space for the face images. Then, the maximum scatter difference criterion function was adopted to extract a set of optimal features, so that the small-sample-size problem suffered by traditional Fisher linear discriminant analysis was avoided entirely. Finally, extensive experiments on both the ORL and Yale face databases verify the effectiveness of the proposed method.
    Dynamic artificial immune classification algorithm for E-mail
    2006, 26(10):  2463-2465. 
Abstract | PDF (760KB)
Using the technique of a virtual gene library, the Artificial Immune System for Email Classification (AISEC) was improved: a Dynamic Artificial Immune Classification Algorithm (DAICA) was proposed and the process of updating the antibody population was refined. When a classification is correct, the antibodies that participated in it are exploited to improve antibody quality. When a classification is incorrect, the participating antibodies are not simply removed; instead, somatic hypermutation is applied to them so as to preserve information about antigens met before. The effects of the two important parameters α and β on the performance of DAICA were also explored. The experimental results show that this improvement achieves higher classification accuracy.
    Open-bisimulation checking of Web Services combination
    2006, 26(10):  2466-2469. 
Abstract | PDF (652KB)
In order to check the congruence between a user's demand and the implementation of a Web Services combination, an automatic checking method was proposed in this paper, based on open π-bisimulation and existing checking tools. First, the user's demand and the BPEL4WS program were each modeled in the π-calculus. Then weak open-bisimulation checking was applied to them; when they are not bisimilar, the checking tool marks the mismatching part of the BPEL4WS program. Finally, a case study demonstrates the feasibility of the method.
    Typical applications
    Research of user documents testing technology
    Liang Wang
    2006, 26(10):  2470-2472. 
Abstract | PDF (921KB)
User documents are a necessary part of a software product and are delivered to users together with it. Nevertheless, engineering approaches that assure the quality of user documents are very few. Driven by the requirements of developing large-scale software products, methods for guaranteeing the quality of user documents became the study target, and the concept of document testing was put forward. Different types of bugs found in previous testing of user documents were analyzed; based on this analysis, testing strategies, testing principles and testing task assignment methods were summarized. Moreover, seven efficient techniques for user document testing were presented.
    Study on automatic generation algorithm of the collection code in translating serial program into parallel program
    2006, 26(10):  2473-2475. 
Abstract | PDF (592KB)
The parallelization of a serial program mainly consists of parallelism identification, data and computation decomposition, dependence analysis and automatic code generation, of which data gathering is a very important part. This paper studied the automatic generation of data collection code: it derives the last-write relation of the data based on equivalence classes, creates an inequality system from the computation decomposition, the loop iteration space and the last-write relation, and finally generates the data collection code automatically by the FME (Fourier-Motzkin elimination) method.
    Research of software process measurement based on AHP and SPC
    2006, 26(10):  2476-2479. 
Abstract | PDF (721KB)
Software process measurement is an important and difficult research field. At present, its theories and means still need improvement, while highly efficient and accurate quantitative process management can ensure the success of software process improvement. Therefore, a new method integrating the Analytical Hierarchy Process (AHP) with Statistical Process Control (SPC) was put forward in this paper. On the basis of the Capability Maturity Model Integrated (CMMI), a software project was measured from the perspective of the software process. Finally, a case study demonstrates that this method supports software process measurement and improvement effectively.
    Workflow management in Agent based collaborative design
    2006, 26(10):  2480-2482. 
Abstract | PDF (679KB)
According to the characteristics of collaborative CAD design, and combining the advantages of statecharts and Agent technology, a workflow management model based on statecharts and Agents was proposed. After a discussion of its key technologies, its application to the workflow management of collaborative architecture design was demonstrated, providing a helpful reference for workflow management in collaborative design.
    Research and implementation of program condition visualization based on AOP
    2006, 26(10):  2483-2485. 
Abstract | PDF (833KB)
This paper described the concept of Aspect-Oriented Programming (AOP) and its advantages in handling crosscutting concerns, discussed its application to program condition visualization, and presented an example of visualizing the condition of an expression-evaluation program using AOP. The function of program condition visualization usually appears as a crosscutting concern; compared with Object-Oriented Programming (OOP), AOP provides a more loosely coupled approach to modularizing crosscutting concerns.
A heterogeneous data transmission software platform for on-the-spot archiving
    2006, 26(10):  2486-2489. 
Abstract | PDF (934KB)
Pervasive computing is the development trend of computing modes. The on-the-spot archiving system, one of its important applications, normally involves transferring heterogeneous data between multiple devices. A software platform supporting heterogeneous data transfer was presented in this paper. The platform uses a subscribe-publish pattern as its transmission mode to enable runtime adjustment of transmission relationships and system extensibility, constructs distinct transmission relationship structures for different kinds of data to maintain high efficiency, and provides a simple, easy-to-learn API that conceals the implementation details of data transfer. The platform has been applied to a class archiving system, which proves its practicability.
    Efficient method of model checking based on LTL and Petri net
    2006, 26(10):  2490-2493. 
Abstract | PDF (844KB)
An important method of model checking is to construct the product of the automaton Aφ, obtained from the negation of a Linear Temporal Logic (LTL) formula, with the model of the system, and then check the emptiness of the product. A method to reduce the state space of the generalized Büchi automaton was presented, which can improve the efficiency of model checking. A model checking toolkit designed and implemented on the basis of LTL and Petri nets can check models described by Petri nets very well.
    Measurement of object-oriented software system and its application
    2006, 26(10):  2494-2495. 
Abstract | PDF (544KB)
In the analysis and design of an object-oriented system, the design quality of classes directly affects the quality of the software system. First, the relations between classes were analyzed, explained and defined, and classified into crosswise and longitudinal relations. Then, measurement and quality appraisal were performed on the two types of relations. Furthermore, methods for improving class design quality were provided.
    On-line handwritten signature verification algorithm based on wavelet transform to extract characteristic points
    2006, 26(10):  2496-2498. 
Abstract | PDF (662KB)
A verification algorithm based on the Wavelet Transform (WT) was proposed in this paper. First, the algorithm used WT to extract characteristic points from the handwritten information, comprising three-dimensional force and two-dimensional coordinate information obtained by the F-Tablet. It then built five-dimensional feature sequences and dynamically created multiple templates by clustering. Finally, after fusing the five-dimensional feature sequences, the algorithm computed the verification result by a majority voting scheme. Applied to a signature database acquired with the F-Tablet, the algorithm achieved an Equal Error Rate (EER) of 2.83%. The experimental results show that the algorithm not only reduces the amount of stored data, but also minimizes the duration of the authentication phase and increases the efficiency of signature verification.
    Researches into the costs of clip region for multiwindow system and an optimal algorithm
    WeiZhong Wang
    2006, 26(10):  2499-2501. 
Abstract | PDF (687KB)
To improve the performance of embedded graphics middleware, this paper analyzed the time and space complexity of the clip region, a key technique in multiwindow systems, and proposed an optimized switching algorithm according to the characteristics of multiwindow operation and embedded systems. Application and simulation tests prove that the algorithm works well and runs faster.
    Chinese Web page classification based on representative samples dynamical generation
    2006, 26(10):  2502-2504. 
Abstract | PDF (554KB)
A new algorithm for Chinese Web page classification based on the dynamic generation of representative samples was proposed in this paper. The method generates representative samples by training on the original samples, and then makes the best use of the helpful information in every pruned sample to adjust the representative samples repeatedly and enhance their representativeness. Experiments with a Chinese Web page classifier based on this algorithm show that it can compress the original training corpus effectively, so that classification efficiency is improved substantially, while accuracy is maintained and overall classification performance is better.
    Migrating workflow system model based on synchronizer
    2006, 26(10):  2505-2508. 
Abstract | PDF (913KB)
Owing to its dynamic process definitions and complicated framework, the migrating workflow system has lacked a clear model definition. Through an analysis of its elements, a migrating workflow system model based on Petri nets was established. The model treats synchronizers as places, and the concepts and locations of resources and service capabilities were extended to adapt the transition enabling rules to the migrating environment, so that the task sets embodied in the migrating environment can be reflected dynamically. Finally, the entire relocation process of the migrating workflow system was simulated successfully.
    Novel technology of customer review extraction
    2006, 26(10):  2509-2512. 
Abstract | PDF (809KB)
Accurately mining customer reviews on commercial websites is significant for making effective recommendations to trading companies. A novel algorithm, Customer Review Extraction (CRE), was put forward in this paper. CRE iteratively segments pages and calculates the information entropy to discover and extract reviews automatically. The experimental results prove that the algorithm achieves high recall and precision.
    Particle swarm optimization with flying time adaptively adjusted
Jianke Zhang
    2006, 26(10):  2513-2515. 
Abstract | PDF (509KB)
To improve the searching performance of Particle Swarm Optimization (PSO), a modified PSO algorithm with adaptively adjusted flying time, named FAA-PSO, was proposed. The flying time of every particle is adaptively adjusted as the evolutionary generations increase. The algorithm thus overcomes a difficulty of traditional PSO, in which the searching ability of the particles decreases during the later iterations because every particle's flying time is fixed at one. Numerical results show that the algorithm accelerates convergence and improves calculation accuracy.
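A rough, purely illustrative sketch of the idea (not the FAA-PSO algorithm itself): standard PSO on a 1-D function, with the step scale ("flying time") decaying linearly over generations instead of staying fixed at one. All coefficients, population sizes and the test function are assumptions:

```python
import random

def fapso(f, lo, hi, n=20, gens=60, seed=1):
    # Minimize f over [lo, hi]. Each generation, velocities follow the
    # usual inertia + personal-best + global-best rule, but the step
    # actually taken is scaled by a flying time t that shrinks with g.
    rnd = random.Random(seed)
    x = [rnd.uniform(lo, hi) for _ in range(n)]
    v = [0.0] * n
    pbest = x[:]
    gbest = min(x, key=f)
    for g in range(gens):
        t = 1.0 - g / gens          # flying time decays from 1 toward 0
        for i in range(n):
            v[i] = (0.7 * v[i]
                    + 1.5 * rnd.random() * (pbest[i] - x[i])
                    + 1.5 * rnd.random() * (gbest - x[i]))
            x[i] += t * v[i]        # step = flying time x velocity
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=f)
    return gbest

best = fapso(lambda z: (z - 3.0) ** 2, -10, 10)
```

Early generations take full-length exploratory steps; later ones take ever shorter refining steps.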
    New linear approach for camera calibration based on improved SUSAN detector
    2006, 26(10):  2516-2518. 
Abstract | PDF (817KB)
A new linear approach with a radial distortion model for camera calibration was designed and realized. First, the SUSAN corner detector was combined with edge detection and false-corner removal to enhance its speed and accuracy. Second, the improved SUSAN detector was used to obtain the sub-pixel coordinates of the corners. Third, a camera model with radial distortion was established to calculate the intrinsic and extrinsic parameters of the camera. Experimental results and error analysis demonstrate the high accuracy and efficiency of this approach.
    Information operation and management system for medical waste disposal
    2006, 26(10):  2519-2521. 
Abstract | PDF (658KB)
An information operation and management system for the collection, transport and disposal of medical waste by collection-transport vehicles was established in this paper. The system is composed of a temperature instrument for the refrigerated truck, an electronic weight meter, the MDCS communication controller, an on-vehicle GPS, a multi-protocol network director, a card reader, an LCD monitor and an operation interface based on GPS, GSM/GPRS, GIS and the Internet. It realizes real-time data communication and management between the waste collection-transport vehicles and government departments, and can be used for dynamic optimization of collection-transport routes, reasonable production scheduling, and quick transfer of the "Shift List". Disposal efficiency is increased and cost is decreased.
    Topology relationship analysis of cadastral spatial database and validity method based on rule
    2006, 26(10):  2522-2524. 
Abstract | PDF (819KB)
In order to enhance the efficiency of spatial data quality checking and realize intelligent testing of topology relationships between feature classes, the topology relationship classification of the cadastral spatial database was analyzed and a verification method based on topology rules was proposed. A spatial data quality checking system, LR_Checker, was developed on the GIS development platform ArcEngine. Testing with massive cadastral spatial data indicates that LR_Checker greatly reduces the data checking workload, and that the proposed verification method is accurate, effective and reasonable.
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803
  028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn