
Table of Contents

    01 September 2009, Volume 29 Issue 09
    Information security
    Research on collaborative operation of network security components
    2009, 29(09):  2315-2318. 
    Nowadays, it is difficult for the various network security components in one network environment to adopt a unified security policy, so the protection capability of the whole network is not fully exploited. The authors presented a cooperative model based on a security domain layer, with a three-layer structure and two-level management. The security domain was used as the fundamental unit for collaboration and management, and the Intrusion Detection Exchange Protocol (IDXP), built on the Blocks Extensible Exchange Protocol (BEEP) framework, was implemented to transmit Intrusion Detection Message Exchange Format (IDMEF) messages. The experimental results demonstrate that this model and IDXP can effectively implement message transmission and collaborative operation.
    Design and implementation of usable security model for operating system
    2009, 29(09):  2319-2322. 
    A Usable Security Protected Model (USPM) for operating systems was proposed and implemented. This model provided sufficient security for the operating system with good compatibility and little special configuration. With this model, one can keep confidential documents secret and protect important files from being damaged even after a successful attack. The authors used a rule that limited the activities of processes communicating with remote systems to ensure the security of the operating system, and a number of exceptions that permitted particular activities of particular processes to improve the usability of the system. A set of tests shows that the model is easy to use, and that it provides good security, low overhead and good compatibility.
    Audio watermarking algorithm for public information transmission
    2009, 29(09):  2323-2326. 
    Based on wavelet decomposition and cepstrum technology, an audio watermarking algorithm was proposed in which the low-frequency wavelet coefficients were chosen for the cepstrum transform. The watermark was embedded into the audio carrier by calculating and modifying the statistical mean. Experimental results show that the proposed algorithm can successfully resist A/D and D/A conversion attacks with large capacity and zero bit error. Meanwhile, the approach can resist the AMR compression attack, and it offers promising prospects for the secure broadcast and management of mobile audio.
    Evaluation method of service-level network security situation based on fuzzy analytic hierarchy process
    2009, 29(09):  2327-2331. 
    Since effective evaluation methods for the service-level network security situation are lacking, the event injection technique was introduced to select important factors that reflect service availability and performance, and a three-level index system for the service-level network security situation was established. An evaluation method based on the Fuzzy Analytic Hierarchy Process (FAHP) was then presented. The FAHP was used to calculate the weight of each index relative to the higher level; combined with multi-level fuzzy comprehensive evaluation, a quantitative analysis of the service-level network security situation was finally obtained. Experimental results show that this approach can abstract away the details of a specific service or intrusion, and that it is efficient in both qualitative description and quantitative analysis.
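    As an illustration of the aggregation step described above, the following sketch shows a single-level fuzzy comprehensive evaluation in Python; the index weights, membership matrix and score mapping are invented for the example and are not taken from the paper.

        import numpy as np

        # Hypothetical index weights for one service (e.g. availability, delay,
        # packet loss), such as might be derived from an FAHP judgement matrix.
        weights = np.array([0.5, 0.3, 0.2])

        # Membership of each index in the comment set {good, fair, poor},
        # e.g. obtained from measurements after event injection (illustrative values).
        membership = np.array([
            [0.7, 0.2, 0.1],
            [0.5, 0.3, 0.2],
            [0.2, 0.5, 0.3],
        ])

        # Fuzzy comprehensive evaluation: weighted aggregation of the membership matrix.
        evaluation = weights @ membership
        evaluation /= evaluation.sum()

        # Map comment grades to scores to obtain one quantitative situation value.
        scores = np.array([1.0, 0.6, 0.2])
        print(evaluation, float(evaluation @ scores))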
    Controllable delegation trust management model based on trustworthiness in P2P
    2009, 29(09):  2332-2335. 
    A controllable delegation authorization model suitable for open environments was presented. It integrated the merits of both Role Based Access Control (RBAC) and role-based trust management, and can effectively control the propagation of permissions at different levels of the role inheritance hierarchy. An approach for assigning trustworthiness thresholds to permissions in the local access control policy was discussed, and an algorithm for calculating the trustworthiness values of entities in the extended framework was proposed. The usage of the model was illustrated through a typical example.
    New scheme of XML-based access control for effective querying and quick updating
    2009, 29(09):  2336-2338. 
    With the wide use of Extensible Markup Language (XML) documents and growing security awareness, the security of XML data becomes increasingly important. The authors proposed an XML-based access control scheme that achieved efficient querying by incorporating an indexing/labeling scheme. To achieve quick updating, the scheme used NULL objects to delete XML data and memo sub-nodes to insert XML data.
    Identity-based structured multi-signature
    2009, 29(09):  2339-2341. 
    Bilinear pairings make it easy to construct complex cryptographic protocols. In order to reduce the burden of certificate management in public key cryptography, a novel identity-based structured multi-signature was proposed using bilinear pairing technology. The scheme took the user's identity information, such as an e-mail address, IP address or telephone number, as the public key, so that it eliminated the cost of building and managing a Public Key Infrastructure (PKI) and avoided the need for users to store, receive and send public keys and certificates. The simulation results show that the cost of signing and verifying in the proposed scheme is only 3/4 of that of the original scheme.
    New digital signature scheme based on discrete logarithm
    2009, 29(09):  2342-2343. 
    The Discrete Logarithm Problem (DLP) was analyzed. Based on the discrete logarithm in a prime field, combined with the elliptic curve DLP over a finite field, a new digital signature scheme was proposed. In addition, the security of this digital signature was analyzed and its performance was compared with other signatures. The comparison shows that the new digital signature scheme based on the discrete logarithm is more efficient and secure than the others.
    Blind image steganalysis based on multi-directional correlation of DCT coefficient
    2009, 29(09):  2344-2347. 
    A blind steganalysis method based on the multi-directional correlation of Discrete Cosine Transform (DCT) coefficients was presented. The method first applied the multi-directional correlation of DCT coefficients to construct a difference neighboring correlation matrix, then extracted 48-dimensional features for each image, and finally used a Support Vector Machine (SVM) to classify cover and stego images. A series of experiments was performed on six typical steganographic schemes at different embedding ratios. The results show that this method provides reliable blind detection of these typical steganographic schemes.
    Text watermarking based on text feature
    2009, 29(09):  2348-2350. 
    The format-based text watermarking algorithm has poor robustness against format attacks, and the natural-language-based text watermarking algorithm is difficult to realize. A text zero-watermarking scheme based on word frequency was therefore proposed. Words were segmented and their frequencies were computed, and words whose frequencies fell within a threshold range were sequentially extracted as the text feature. The text feature, watermark and secret key were registered in an information database, and watermark detection was blind. Both Chinese and English documents containing multimedia information were tested in the experiments. Experimental results demonstrate that the technique has good robustness against attacks such as cutting, pasting and reversing.
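    The registration and blind detection flow lends itself to a short sketch. The following Python code is a minimal illustration under assumed parameters (frequency thresholds, overlap-based detection); it is not the paper's exact construction.

        from collections import Counter

        def text_feature(text, low=2, high=50, max_words=32):
            """Extract, in order of first occurrence, words whose frequency falls in
            a threshold range (the parameters here are illustrative)."""
            words = text.lower().split()
            freq = Counter(words)
            seen, feature = set(), []
            for w in words:
                if low <= freq[w] <= high and w not in seen:
                    seen.add(w)
                    feature.append(w)
            return feature[:max_words]

        def register(text, watermark, key):
            """Zero-watermarking: nothing is embedded; the feature, the watermark and
            the key are stored together in a registration database."""
            return {"key": key, "feature": text_feature(text), "watermark": watermark}

        def detect(text, record, key, threshold=0.7):
            """Blind detection: re-extract the feature from the suspect text and
            measure its overlap with the registered feature."""
            if key != record["key"]:
                return None
            suspect = set(text_feature(text))
            registered = set(record["feature"])
            overlap = len(suspect & registered) / max(len(registered), 1)
            return record["watermark"] if overlap >= threshold else None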
    Secure and effective anonymous communication scheme for wireless sensor network
    2009, 29(09):  2351-2354. 
    With the widespread application of large-scale distributed Wireless Sensor Networks (WSN), in some situations the security of a WSN involves not only the security of the data sent by sensors, but also anonymity and privacy during the sending process. How to design a secure and efficient anonymous communication scheme for wireless sensor networks has become a research hotspot. Using bilinear pairings, hash functions and difference operations, a verifiable secure anonymous communication scheme was proposed. Through analysis and improvement, this scheme not only satisfies the basic requirements of anonymous communication, but also markedly reduces the computation and storage complexity, making it more suitable for wireless sensor networks.
    Study of new trusted network framework
    2009, 29(09):  2355-2359. 
    The access technology of traditional trusted networks has some deficiencies, such as a lack of protection after network access, low efficiency of the security chip, and an imperfect trust transmission chain. To address these deficiencies, and integrating the key techniques of trusted networks, a new trusted network framework based on the Trusted Platform Module (TPM) was proposed, which could improve the efficiency of the security chip, enhance the credibility and security of the trusted network, and provide higher security protection for trusted network users. The application of trusted measurement, data encryption and transmission ensures higher security, reliability, credibility and anonymity in the framework.
    Logical definition of usage control authorization model
    2009, 29(09):  2360-2362. 
    After authorization and mutable attributes are introduced, the usage control authorization model takes on new features. Concerning these new features, the authors gave a brief introduction to the structure of the Usage Control (UCON) model using set theory and put forward its logical definition. In particular, the logical model of the usage control authorization model was analyzed. Finally, logical definitions were given for the eight different model structures, and examples were provided to show how to use the logical definitions in practice.
    Public verifiable algorithm of threshold secret sharing with short share
    2009, 29(09):  2363-2365. 
    To overcome the limitation that the secret cannot be too long and to prevent cheating, the authors put forward a threshold secret sharing algorithm with short shares, using Jordan matrix theory combined with the Lagrange interpolation formula. The algorithm can effectively resist statistical attacks and collusion attacks by fewer than r corrupt participants. The secret share that each participant needs to keep is very short. The algorithm has important applications when the secret is a large private file, a large message transmitted over an insecure channel, a secret database shared by several participants, or massive data in distributed storage.
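    For context, the sketch below shows plain Lagrange-based (r, n) threshold sharing over a prime field in Python; it illustrates only the Lagrange interpolation component, not the paper's Jordan-matrix construction that shortens each share.

        import random

        P = 2**127 - 1  # a large prime modulus (illustrative)

        def share(secret, r, n):
            """Split `secret` into n shares so that any r of them reconstruct it."""
            coeffs = [secret] + [random.randrange(P) for _ in range(r - 1)]
            return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
                    for x in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 recovers the secret from any r shares."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                secret = (secret + yi * num * pow(den, P - 2, P)) % P
            return secret

        shares = share(123456789, r=3, n=5)
        assert reconstruct(shares[:3]) == 123456789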
    Text zero-watermark based on use frequency of Chinese characters
    2009, 29(09):  2366-2368. 
    Aiming at the problems of Chinese text digital watermarking, namely difficult embedding, poor imperceptibility and robustness, and limited watermark capacity, a zero-watermark algorithm based on the use frequency of Chinese characters was presented. In this algorithm, the use frequency of the Chinese characters in the text was calculated, and then, combined with the frequency table of commonly used Chinese characters, the text characteristic was extracted to construct a zero-watermark. According to the traits of the watermark algorithm, a correlation function was defined to set the threshold value and detect the watermark. Experimental results indicate that the watermark constructed in this way has good imperceptibility, robustness and sufficient capacity, and that the detection algorithm based on the correlation function exhibits a comparatively low false-detection rate.
    Research and implementation of layer access control technology based on Linux kernel driver
    2009, 29(09):  2369-2374. 
    A method was proposed to improve the POSIX.1e standard capability module. Monitoring and control were performed at the operating system kernel layer after the improved module was loaded into the kernel through the Linux Security Module (LSM) framework. Furthermore, a series of operations was carried out, including process trust-like privilege arbitration, secure i-node operations, information feedback and queue operations. Finally, a character device was used to feed the monitoring information back to the application layer and to perform security control. Compared with the original capability module, the proposed scheme not only improves the efficiency of system operation, the correct monitoring rate and the coverage of system scanning, but also maintains good monitoring performance in terms of system resource occupancy and several other parameters.
    Graphics and image processing
    Intersecting feature recognition based on 3D solid model of graph
    2009, 29(09):  2375-2377. 
    As intersecting features are hard to recognize automatically, a new method to recognize such features in 3D solid models was proposed. First, it used the Attributed Adjacency Graph (AAG) to define the topological structure of typical simple features and the Geometric Relation Restriction Graph (GRRG) to describe the geometric constraint relations among their faces. Second, guided by the predefined AAG and GRRG, it used a subgraph matching algorithm to recognize those typical simple features whose AAGs were unchanged in the given 3D model. The recognized features were fixed and removed from the solid model for further recognition. Next, the method progressively recognized those simple features whose topology had varied, by adding mirror face operations. Finally, the intersecting features could be represented as a group of simpler feature entities joined together and be recognized effectively.
    Data association algorithm of multiple non-overlapping cameras
    2009, 29(09):  2378-2382. 
    Multiple non-overlapping cameras are required for the surveillance of wide areas, and the observations of objects are often widely separated in time and space. The key problem of tracking objects across wide areas is the data association of non-overlapping cameras. A data association algorithm based on a weighted bipartite graph was proposed: the objects were the graph nodes, the edges were constructed from temporal and spatial constraints, the edge weights were the similarities between objects, and the association result was the maximum-weight matching of the bipartite graph. A feature named Logarithm Illuminance Contrast Statistic (LICS), insensitive to changes in illumination and posture, was proposed for rigid objects to compute the similarity between objects. Experiments with real videos and simulations validate the proposed approach.
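    The association step can be illustrated with SciPy's assignment solver, which yields a maximum-weight bipartite matching when the similarities are negated; the similarity values and feasibility constraints below are illustrative, and the LICS feature itself is not reproduced.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # similarity[i, j]: appearance similarity between object i leaving one camera
        # and object j appearing in another camera (illustrative values).
        similarity = np.array([
            [0.9, 0.1, 0.2],
            [0.2, 0.8, 0.3],
            [0.1, 0.3, 0.7],
        ])

        # Temporal and spatial constraints prune infeasible edges of the bipartite graph.
        feasible = np.array([
            [1, 1, 0],
            [1, 1, 1],
            [0, 1, 1],
        ], dtype=bool)

        # Maximum-weight matching = minimum-cost assignment on negated weights;
        # infeasible edges get zero weight and are filtered out afterwards.
        cost = -np.where(feasible, similarity, 0.0)
        rows, cols = linear_sum_assignment(cost)
        matches = [(i, j) for i, j in zip(rows, cols) if feasible[i, j]]
        print(matches)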
    Face recognition method based on Gabor wavelet and common vector
    2009, 29(09):  2383-2385. 
    The performance of subspace-based face recognition methods is easily affected by variations in lighting, pose and expression. To overcome this limitation, a novel face recognition method was proposed that combines the Gabor filter and the Common Vector (CV) approach. The Gabor filter extracts local image features well because of its selectivity in scale and orientation, and it is robust to variations in lighting, pose and expression. The common vector is a linear subspace classification method that obtains good classification results by extracting the common properties of the training samples of each class. The experimental results show that the proposed method achieves good recognition results on the ORL and Yale databases.
    Cloth simulation based on improved mass-spring model
    2009, 29(09):  2386-2388. 
    Current cloth simulation suffers from poor real-time performance and a poor sense of reality. Aiming at the super-elastic phenomenon, the authors proposed an improved mass-spring model based on the classic mass-spring model. Compared with the traditional mass-spring model, the improved model adds two constraint conditions, a light rod and a light string. The experiments show that the super-elastic problem is effectively solved and that cloth tearing is simulated vividly, so the simulation result is more realistic.
    Extraction method of standing faces of high building based on perceptual organization
    2009, 29(09):  2389-2392. 
    Perceptual organization was introduced to extract the standing faces (facades) of high buildings from remote sensing images. Based on the proximity, continuity and closure properties of perceptual organization, an algorithm was proposed that included edge extraction, edge partition, extraction of building standing faces based on perceptual organization, fusion of the extracted results, and so on. The experimental results show that this approach is effective.
    Region-based image annotation of graph learning approach
    2009, 29(09):  2393-2394. 
    Image annotation has been an active research topic in recent years. The authors formulated image annotation as a semi-supervised learning problem under multi-instance learning framework. A novel graph-based semi-supervised learning approach to image annotation, using multiple instances, was presented, which extended the conventional semi-supervised learning to multi-instance setting by introducing the adaptive geometric relationship between two bags of instances. The experimental results show that this approach outperforms other traditional methods and is effective for image annotation.
    Face recognition under illumination invariance using multiresolution analysis
    2009, 29(09):  2395-2397. 
    Illumination variation is the most significant factor affecting the performance of face recognition, and it has received much attention in recent years. A novel method for extracting illumination-invariant features, combining the wavelet transform and the logarithm operation, was proposed for face recognition under varying lighting conditions. Experiments were conducted with Matlab programs on the Yale B face database using PCA+LDA recognition. A minimum distance classifier was applied for its simplicity, with the Euclidean (L2) metric as the distance measure. The experimental results show that the performance of the proposed method is better than that of other methods.
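    A minimal sketch of the feature extraction idea (taking the logarithm to turn multiplicative illumination into an additive term, then using wavelet analysis to suppress the slowly varying illumination component) is given below; the choice of wavelet and of which sub-bands to keep is an assumption, not the paper's exact setting.

        import numpy as np
        import pywt

        def illumination_invariant(face, wavelet="db2"):
            """Take the logarithm of the image, decompose it with a 2-D wavelet
            transform, zero the approximation (low-frequency) band that is dominated
            by illumination, and reconstruct a reflectance-like feature image."""
            log_img = np.log1p(face.astype(np.float64))
            cA, (cH, cV, cD) = pywt.dwt2(log_img, wavelet)
            cA[:] = 0.0
            return pywt.idwt2((cA, (cH, cV, cD)), wavelet)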
    Image segmentation algorithm based on mathematical morphology and active contour model without edges
    2009, 29(09):  2398-2401. 
    In order to improve the efficiency of the active contour model and expand its application, the authors proposed a new image segmentation method based on mathematical morphology and the active contour without edges. An image was preprocessed by a smoothing operator, the Alternating Sequence Filter (ASF) operator of mathematical morphology, before being fed into the segmentation operator of the active contour without edges. Experimental results show that this strategy not only improves the segmentation precision, but also reduces the number of iterations. It also shows that the active contour without edges can be applied to images with complex backgrounds.
    Edge detection in SAR images of transform domain
    2009, 29(09):  2402-2405. 
    To deal with the heavy speckle noise in Synthetic Aperture Radar (SAR) images, a multiscale products method was proposed for edge detection in the transform domain. The method automatically obtains a threshold using the maximum signal-to-noise ratio, and forms multiscale products of the high-frequency coefficients in different directions in order to detect the edges of SAR images in the NonSubsampled Contourlet Transform (NSCT) domain. Experimental results demonstrate the validity of the proposed method compared with classical methods.
    Automatic rendering for Chinese landscape painting using texture synthesis
    LI Da-Jin
    2009, 29(09):  2406-2410. 
    Based on an analysis of the artistic skills of Chinese landscape painting, the author proposed a novel rendering method based on texture synthesis. At first, a control picture was created to indicate the shading areas, and the silhouettes of the areas were painted in. Then, texture images painted by Chinese artists with the desired representation skills were collected; these texture images were used as samples in the texture synthesis process. Finally, texture synthesis was performed, producing a picture rendered with the strokes of the texture samples. In order to preserve the stroke styles and the mountain shapes of the control picture, the author presented an object-controlled multi-level texture synthesis algorithm that could efficiently control the synthesis result using the control picture. Using this method, a photo or a simple sketch can be converted into a Chinese-painting-style picture automatically, and the experimental results show that it works well.
    Object extraction algorithm combining boundary information and region information
    2009, 29(09):  2411-2413. 
    For objects with holes, or objects whose color is similar to the background's, the object cannot be extracted accurately using boundary information or region information alone. Combining boundary information and region information, an object extraction algorithm based on graph cut was proposed. The experimental results show that the algorithm can extract such objects accurately and effectively, achieving high accuracy in automatic segmentation with a small user workload and high efficiency.
    Image segmentation method using binary level set based on regional GAC model
    2009, 29(09):  2414-2417. 
    A new region-based Geodesic Active Contour (GAC) model, an improvement of the traditional GAC model, was presented. A new region-based signed pressure force function was constructed to take the place of the edge stopping function, which efficiently solves the problem of segmenting objects with weak edges or without edges. The model was implemented by the level set method with a binary level set function, to avoid the expensive computational cost of re-initializing the traditional level set function. The proposed algorithm has been applied to images of different modalities, and the results are better than those of the traditional GAC model and the C-V model.
    Color image denoising based on noise characteristic and vector median filtering
    2009, 29(09):  2418-2419. 
    In order to denoise color images effectively while preserving image detail and increasing processing speed, a new noise-pinpointing method was proposed based on an analysis of the noise characteristics of color images. Combined with a modified Vector Median Filtering (VMF) algorithm, the detected noise was then removed. Experimental results show that the algorithm preserves image detail well while suppressing noise effectively, and that its computation time is only half that of the classical vector median filtering algorithm.
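    The filtering stage can be sketched as follows; the noise-pinpointing rule itself is not reproduced, so the code simply assumes a boolean mask marking the pixels already flagged as noise.

        import numpy as np

        def vector_median_filter(img, noise_mask, k=3):
            """Replace each flagged pixel by the vector median of its k x k window,
            i.e. the colour vector minimising the sum of distances to the others."""
            pad = k // 2
            padded = np.pad(img.astype(np.float64), ((pad, pad), (pad, pad), (0, 0)),
                            mode="edge")
            out = img.copy()
            for y, x in zip(*np.nonzero(noise_mask)):
                window = padded[y:y + k, x:x + k].reshape(-1, img.shape[2])
                dists = np.linalg.norm(window[:, None, :] - window[None, :, :],
                                       axis=2).sum(axis=1)
                out[y, x] = window[np.argmin(dists)]
            return out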
    Network and communications
    Novel algorithm for improving comprehensive energy-saving performance of mobile station and its performance analysis
    2009, 29(09):  2420-2423. 
    Based on an analysis of the exponential growth algorithm prescribed in IEEE 802.16e, a novel Hybrid Growth Algorithm (HGA) for the sleep window was proposed to reduce the Average Energy Consumption (AEC) and the Average Waiting Time (AWT) for receiving data frames at the Mobile Station (MS). The proposed algorithm improves the comprehensive energy-saving performance of the MS by changing the growth rate of the sleep window. The good performance of the proposed algorithm was validated through comparative analysis and simulation, and the effects of the sleep-mode parameters on the algorithm were studied, which is helpful for choosing the parameter values. The experimental results indicate that with HGA the average waiting time is greatly reduced while better energy-saving behavior is obtained. In particular, when the arrival rate is small, compared with the exponential growth algorithm, the average energy consumption of HGA is reduced by 15% to 20% and the average waiting time by 50% to 70%.
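    For reference, the baseline behaviour that HGA modifies is the exponential sleep-window growth of IEEE 802.16e, sketched below; the HGA itself, which changes this growth rate, is not reproduced here.

        def exponential_sleep_windows(t_min, t_max, cycles):
            """Baseline IEEE 802.16e behaviour: the sleep window doubles after each
            idle cycle until it reaches the maximum window t_max."""
            window, windows = t_min, []
            for _ in range(cycles):
                windows.append(window)
                window = min(2 * window, t_max)
            return windows

        print(exponential_sleep_windows(t_min=2, t_max=64, cycles=8))
        # [2, 4, 8, 16, 32, 64, 64, 64]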
    Resource analysis method for multi-service network based on queuing network
    2009, 29(09):  2424-2427. 
    Nowadays, with the development of the Next Generation Network (NGN), new services appear constantly, and optimizing network resources is one of the important means of improving the Quality of Service (QoS). The authors proposed a method for analyzing resources in a multi-service network, which used queuing network theory to optimize the distribution of resources. The authors also presented a queuing service model for resource analysis, a way to distribute the traffic, and a solution procedure. Finally, the method yielded the allocation relationships between the various resources and QoS. Simulation results show the validity of the method.
    Reliable multicast architecture based on multicast gateway
    2009, 29(09):  2428-2431. 
    Guaranteeing the reliability of multicast communication is the premise of many applications based on multicast in Internet. Since it was difficult to deploy IP multicast on a large scale in Internet, the authors presented a Reliable Multicast (RM) architecture, which utilized multicast gateways to interconnect IP multicast islands and Application Layer Multicast (ALM) domain. Solutions to key points in the architecture, such as group identification, multicast gateway, group management, error control, and congestion control were also proposed. At the same time, a competitive algorithm for multicast gateway was designed. This architecture could shield the difference existing in underlying multicast techniques and thereby support uniform deployment of reliable multicast services in Internet.
    Epistemic Cord logic: verifying anonymous routing protocols in Ad Hoc network
    2009, 29(09):  2432-2434. 
    For verifying Ad Hoc anonymous routing protocols, a modular method was proposed, which was based on epistemic Cord logic. This method decomposed a protocol into several components according to different security sub-function, and then proved whether these components satisfied their specifications of security properties. The security property of path anonymity was defined and was specified by this logic.
    Novel rapid acquisition method for ultra-wide band system
    2009, 29(09):  2435-2436. 
    To improve the acquisition speed and reduce the complexity for synchronization acquisition in Ultra-Wide Band (UWB) system, a novel two-step "Energy Detection + Search" rapid acquisition method was proposed. The acquisition process was analyzed and the capture time was derived. The simulation results show that the average acquisition time of the new method is less than the existing algorithms.
    Construction of optimal family of frequency-hopping sequences based on perfect nonlinear function
    2009, 29(09):  2437-2438. 
    A new family of frequency-hopping sequences was constructed based on perfect nonlinear function. It was proved that these sequences had good Hamming auto-correlation and cross-correlation properties according to the properties of perfect nonlinear function. Based upon different perfect nonlinear functions, different frequency-hopping sequence families with optimal Hamming correlation property could be obtained.
    Evaluation and improvement of SOAP performance for real-time Web service
    2009, 29(09):  2439-2441. 
    The Simple Object Access Protocol (SOAP) is characterized by extensibility, flexibility and descriptiveness; however, these characteristics come at the cost of performance. The authors therefore compared SOAP with other widely used protocols through experiments and identified SOAP's performance deficiencies: when used in highly real-time Web services, it may become a performance bottleneck. In order to make SOAP usable in real time, the authors made improvements in XML compression, data encoding and decoding, data encapsulation and so on. Experimental results show that the improved SOAP is more suitable for real-time Web services.
    Technology of spots advertising in P2P streaming media player system
    2009, 29(09):  2442-2445. 
    To meet the requirement of spots advertising in a P2P streaming media player system, and taking account of the characteristics of P2P communication, the authors proposed a smooth spots advertising technology that saves system memory. To realize the technology, an active prefetching algorithm for streaming media based on the relevance between resources was adopted to obtain advertisement resources, and dynamic shared memory pool technology was applied to store and play the advertisements. The application of this spots advertising technology expands the field of advertising, enhances the influence of advertisements, helps advertisers increase the number of customers, and enables streaming media providers to be more competitive in the market.
    Design and implementation of partially decentralized P2P Botnet control server
    2009, 29(09):  2446-2449. 
    The control server is the core component of a partially decentralized P2P Botnet. The authors designed and implemented a partially decentralized P2P Botnet control server based on the Socket select model. Its primary functions and their relations were introduced in detail, and the workflow and important data structures were presented. Compared with the traditional IRC Botnet and the decentralized P2P Botnet, the differences in resource occupation, message transmission speed, control mode and so on were discussed. According to the experimental results, both the functions and the performance of this control server meet the needs of a partially decentralized P2P Botnet.
    Artificial intelligence
    Latent semantic information measurement of corpus orientation
    2009, 29(09):  2450-2453. 
    The authors defined an information measurement associated with a topic or semantics for a keyword. First, a topic-based corpus was obtained; then the latent semantic vector space model of the corpus was established, and the information measurement of a keyword was defined through the model. Accordingly, the amount of topic information contained in any document could be calculated. Finally, a membership measurement, which measures the degree to which a document belongs to the topic, was introduced, and a measurement threshold was set to determine whether or not a document belongs to the topic. The experimental results show that the defined information measurement overcomes the limitations of word-match search and truly achieves semantic-match search.
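    A rough analogue of this pipeline (topic corpus, latent semantic space, membership measurement with a threshold) can be sketched with scikit-learn; the corpus, the number of latent dimensions and the threshold below are illustrative assumptions, not the paper's settings.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy topic-based corpus; a real corpus would be much larger.
        corpus = [
            "network security intrusion detection firewall",
            "intrusion detection alert correlation network",
            "firewall policy network access control",
        ]

        vectorizer = TfidfVectorizer()
        lsa = TruncatedSVD(n_components=2, random_state=0)
        topic_space = lsa.fit_transform(vectorizer.fit_transform(corpus))
        centroid = topic_space.mean(axis=0, keepdims=True)

        def membership(document, threshold=0.3):
            """Cosine similarity between the document's latent vector and the topic
            centroid; the threshold decides whether the document belongs to the topic."""
            doc_vec = lsa.transform(vectorizer.transform([document]))
            score = cosine_similarity(doc_vec, centroid)[0, 0]
            return score, score >= threshold

        print(membership("firewall alert for a network intrusion"))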
    Short-term load forecasting method based on neural network hybrid algorithm of adaptive variable coefficients particle swarm optimization and radial basis function
    2009, 29(09):  2454-2458. 
    To improve short-term load forecasting accuracy, a neural network hybrid optimization algorithm combining Adaptive Variable Coefficients Particle Swarm Optimization and the Radial Basis Function network (AVCPSO-RBF) was proposed, in which the RBF neural network parameters are optimized. A short-term load forecasting model was established based on the AVCPSO-RBF algorithm, and short-term load forecasting was carried out using this method and historical load data of the Guizhou power system. The experimental results show that the method converges faster and forecasts more accurately than the traditional RBF neural network algorithm, the PSO-RBF neural network algorithm and the neural network model based on chaos theory. The hybrid algorithm improves the generalization capacity of the RBF neural network and overcomes the shortcomings of the traditional PSO algorithm and the RBF neural network. The short-term load forecasting accuracy for the Guizhou power system is improved, with an average percentage error of no more than 1.7%. The hybrid algorithm can be effectively used in short-term load forecasting for power systems.
    One-class intrusion detection system based on KPCA space-similarity
    2009, 29(09):  2459-2463. 
    To address the difficulty of obtaining abnormal samples in network intrusion detection and the overfitting of conventional classifiers caused by unevenly distributed abnormal data, a novel one-class detection model based on Kernel Principal Component Analysis (KPCA) space similarity and immune principles was presented. KPCA was employed to extract the nonlinear distribution characteristics of normal samples, and a characteristic sub-space of the normal samples was established; the projection of other samples onto this sub-space was used as the metric of similarity to the normal sub-space. In order to efficiently exploit the available abnormal training samples, self-adaptive immune items were incorporated to improve the detection performance of the proposed model. The kernel function parameters and threshold settings were also analyzed, and a decision model based on Particle Swarm Optimization (PSO) was provided. The detection scheme based on KPCA space similarity was compared with Multi-Layer Perceptron (MLP), Support Vector Machine (SVM) and Self-Organizing Map (SOM) detection techniques in the experiments. The experimental results illustrate the correctness and effectiveness of the investigated techniques.
    Research of foe intention recognition method based on intuitionistic fuzzy Petri net
    2009, 29(09):  2464-2467. 
    Because of the uncertainty of the combat situation, recognizing the enemy's intention is a dynamic and uncertain decision-making process. In order to deal with the uncertainty in this dynamic recognition process, an Intuitionistic Fuzzy Petri Net (IFPN) model and its reasoning algorithm were built by combining the advantages of Intuitionistic Fuzzy Set (IFS) theory and Petri net theory. The efficiency of the reasoning was improved thanks to the parallel operation ability of Petri nets, and the reasoning result was more precise and more believable because of the non-membership parameter. Finally, an example was presented to illustrate the feasibility and validity of the proposed model.
    Accurate allocation model of system reliability based on converse thinking
    2009, 29(09):  2468-2470. 
    To realize accurate reliability allocation in the late stage of system reliability design, a new reliability allocation method was proposed. The basic idea of the method is to use converse thinking together with the ability of neural networks to approximate any nonlinear mapping. Based on the early reliability data, the influence of changes in sub-system reliability and sub-system self-constrained conditions on the system reliability can be obtained, which serves as the basis of the reliability allocation. The system reliability and the sub-system self-constrained conditions were taken as the input of the neural network, and the ratios of the corresponding sub-system reliabilities were taken as the output. Then, Back Propagation (BP) and Radial Basis Function (RBF) neural networks were trained, and after comparing the test results, the neural network model for system reliability allocation was obtained.
    Sound source localization based on audio-video system of mobile robot
    2009, 29(09):  2471-2472. 
    To improve the accuracy of sound source localization, the authors provided an algorithm that first estimates the sound azimuth roughly through the auditory system of the robot. Based on this estimate, a binocular stereo vision system is then used to compensate for the direction error, and to locate the elevation angle in the vertical plane and the distance of the sound source. The experimental results show that this coarse-to-fine strategy yields more precise sound source localization.
    Multiple kernel discriminant analysis with optimized weight
    2009, 29(09):  2473-2476. 
    In order to enhance the accuracy of nonlinear classification, the multiple kernel learning method developed within the framework of the Support Vector Machine (SVM) was drawn upon, and the authors constructed a multi-kernel for kernel-based Linear Discriminant Analysis (LDA). Moreover, a weight optimization scheme for the multi-kernel was proposed by maximizing the Maximum Margin Criterion (MMC) using the method of Lagrange multipliers. The experiments on the FERET and CMU PIE face databases show that multiple kernel discriminant analysis achieves higher classification performance than single-kernel-based LDA.
    Incremental updating algorithm for computing core based on improved discernibility matrix
    2009, 29(09):  2477-2480. 
    Through analysis, it was found that the improved discernibility matrix presented by Professor Yang Ming involves unnecessary calculations. Therefore, an improved discernibility matrix definition together with a method for computing the core was introduced. The authors then presented an incremental updating algorithm for computing the core based on the improved discernibility matrix, which mainly considers core updating when objects increase dynamically. Theoretical analysis shows that the incremental updating algorithm has nearly linear time and space complexity, and the experimental results show that the algorithm is efficient and effective.
    Inverse problem of SVM via margin merging clustering algorithm
    2009, 29(09):  2481-2482. 
    The inverse problem of the Support Vector Machine (SVM) is how to split a dataset into two clusters so that the margin between the two clusters is maximized, but its prohibitive time complexity makes it difficult to apply to datasets of any significant scale. The result and efficiency of the method that first clusters and then enumerates all possible cases are greatly affected by the number of clusters. Based on the relationship between the margin and the minimum distance between points in different clusters, a new algorithm called the margin merging clustering algorithm was proposed to solve this problem. By merging sub-clusters whose distance is smaller than twice the margin, the number of sub-clusters and the number of enumerations are reduced. The comparative experiments demonstrate that the proposed algorithm performs better than the traditional algorithm based only on clustering.
    Research of fractional order PID controller using hybrid particle swarm optimization
    2009, 29(09):  2483-2486. 
    The fractional-order Proportional-Integral-Derivative (PID) controller generalizes the traditional PID controller to fractional orders of integration and differentiation, and it can control complicated systems more precisely than the traditional PID controller; the parameter values play a decisive role in the control effect. A new hybrid algorithm combining the Bacterial Foraging Algorithm and Particle Swarm Optimization (BFA-PSO) was proposed to compute precise fractional-order controller parameter values; it uses the chemotaxis, reproduction and elimination-dispersal operations of BFA and combines them with the merits of PSO, namely few parameters and ease of optimization. The simulation results show that the fractional-order PID controller tuned by this algorithm exhibits not only no overshoot and fast convergence but also robustness and high precision, so it can be used to control different kinds of objects and processes.
    Database and knowledge engineering
    Exploration of hybrid multi-dimensional histograms for hybrid multi-dimensional data distribution
    2009, 29(09):  2487-2490. 
    In reality, multi-dimensional data distributions often do not exhibit a single type of distribution as a whole; rather, different regions of the data space clearly show different types of data distributions. The authors proposed a new kind of hybrid multi-dimensional histogram, COCA*-Hist, for such hybrid data distributions. The method builds COCA*-Hist, composed of different kinds of buckets according to the regions of the data space with different distribution characteristics, under a given space budget, with the aim of improving the estimation accuracy of multi-dimensional histograms in general. Because COCA*-Hist has to scan the tree structure of the histogram being built twice to discern the different data regions and allocate the space budget among them, it is slightly inferior in efficiency, but the improvement in both generality and estimation accuracy makes the cost in time worthwhile.
    Rough set-based algorithm for discretizing continuous variables of Bayesian network
    2009, 29(09):  2491-2493. 
    Based on an analysis of the limitation that the discretization algorithm using Rough Set (RS) and Boolean reasoning does not work well for Bayesian networks, a new algorithm was put forward that distinguishes two samples by the values of candidate cuts, rather than by the intervals determined by two candidate cuts. The case study indicates that the improved algorithm can considerably reduce the space and time complexity of discretization, and that it is effective for discretizing the continuous variables of a Bayesian network.
    Survey on applications of software source code mining
    2009, 29(09):  2494-2498. 
    Data mining technology can extract valuable knowledge from large amounts of data. Software source code, as a special kind of data, can also be mined at the code level, which is a new and important topic. The authors introduced the applications of source code mining in several fields, the data mining techniques used and the current state of development, and then analyzed the current limitations of the field. Finally, several directions for applying software source code mining were summarized.
    Research of necessary rules' influence on classifying
    2009, 29(09):  2499-2501. 
    The computational core of rule-based algorithms involves rules of the form "A→C" and their confidences, where "A" represents a set of attributes and their values and "C" represents a class label. Can rules of the form "C→A" also play a positive role in classification algorithms? A simple experiment was designed that considered the associations between single attribute values and the class label. Two testing methods were designed according to the experimental goals: in the first, only the confidences of "A→C" were used; in the second, the confidences of both "A→C" and "C→A" were used. The experiments were carried out on several typical classification data sets, and the results show higher classification precision when both confidences are used.
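    The two confidences in question can be computed directly, as in the following sketch on a toy dataset (the dataset and the attribute-value pair are invented for illustration).

        def rule_confidences(rows, attr_index, value, label):
            """Confidence of A→C and of C→A for one attribute-value pair A
            (column attr_index equals value) and one class label C (last column)."""
            a = sum(1 for r in rows if r[attr_index] == value)
            c = sum(1 for r in rows if r[-1] == label)
            both = sum(1 for r in rows if r[attr_index] == value and r[-1] == label)
            conf_a_to_c = both / a if a else 0.0
            conf_c_to_a = both / c if c else 0.0
            return conf_a_to_c, conf_c_to_a

        data = [
            ("sunny", "hot", "no"),
            ("sunny", "mild", "no"),
            ("rain",  "mild", "yes"),
            ("rain",  "cool", "yes"),
            ("sunny", "cool", "yes"),
        ]
        print(rule_confidences(data, attr_index=0, value="sunny", label="no"))  # (0.666..., 1.0)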
    Improved kNN algorithm based on Mahalanobis distance and gray analysis
    LIU Xing-Yi
    2009, 29(09):  2502-2504. 
    The k-Nearest Neighbor (kNN) algorithm based on Euclidean distance is restricted to datasets that are not sensitive to correlation or density. The author proposed an improved kNN algorithm based on the Mahalanobis distance and gray analysis for imputing missing data, replacing the existing Euclidean distance. The Mahalanobis distance deals with the correlation- and density-sensitive case, and the gray analysis method deals with the opposite case; hence, the proposed method can deal with any kind of dataset. The experimental results show that the proposed method outperforms the existing algorithms.
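    A minimal sketch of the Mahalanobis branch (kNN imputation of one missing value using the Mahalanobis distance over the observed columns) is given below; the gray-analysis branch and the rule for choosing between the two are not reproduced.

        import numpy as np

        def mahalanobis_knn_impute(X, row, missing_col, k=3):
            """Impute X[row, missing_col] as the mean of that column over the k
            donors closest in Mahalanobis distance on the remaining columns."""
            observed = [c for c in range(X.shape[1]) if c != missing_col]
            donors = X[~np.isnan(X[:, missing_col])]
            inv_cov = np.linalg.pinv(np.cov(donors[:, observed], rowvar=False))
            diff = donors[:, observed] - X[row, observed]
            d = np.sqrt(np.maximum(np.einsum("ij,jk,ik->i", diff, inv_cov, diff), 0.0))
            return donors[np.argsort(d)[:k], missing_col].mean()

        X = np.array([[1.0, 2.0, 3.0],
                      [1.1, 2.1, 3.2],
                      [0.9, 1.9, 2.9],
                      [1.2, 2.2, np.nan]])
        X[3, 2] = mahalanobis_knn_impute(X, row=3, missing_col=2)
        print(X[3])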
    Improved fast DBSCAN algorithm
    2009, 29(09):  2505-2508. 
    The time performance of Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is poor. Concerning this problem, the authors analyzed the reasons why objects are lost during fast clustering and proposed a new Improved Fast DBSCAN (IF-DBSCAN) algorithm. Without losing data objects, this algorithm expands a cluster by selecting representative objects from the neighborhood of a core object, so that the number of region queries is reduced and the time performance of the algorithm is improved. The experimental results show that the IF-DBSCAN algorithm is correct and efficient.
    Research on optimal placement of data copies in distributed database
    2009, 29(09):  2509-2511. 
    For optimally placing data copies in a tree network, an improved optimizing model, K-node core, was proposed, based on the existing model K-tree core. The improved model had better performance in updating database, compared to the original model. Two efficient dynamic programming algorithms for solving K-node core problem in a tree network were given. The optimizing model was tested by experiments.
    Software process technology
    Application of workflow-based service composition in E-government
    2009, 29(09):  2512-2515. 
    To solve the information silo problem in the construction of E-government systems, the authors put forward a framework based on workflow technology to implement service composition, which sets up a process service center to compose the services of legacy systems and to implement the integration of legacy systems and trans-departmental cooperative work. In this way, it can meet individual demands and achieve quick response for business agility. Finally, a trans-departmental E-government system verifies the rationality of the framework and suggests that it is feasible and effective.
    Improved task scheduling algorithm for embedded real-time operating system and its application
    2009, 29(09):  2516-2519. 
    It is the scheduler that determines the capability of an embedded system. Because the Rate-Monotonic (RM) scheduling algorithm determines priority only by period, the deadlines of long-period and important tasks cannot be guaranteed and the system resources cannot be effectively utilized; as the number of tasks n tends to infinity, the RM schedulable CPU utilization bound n(2^(1/n)-1) approaches ln 2, about 69%. Here, a new static priority scheduling algorithm called NSRL (New Scheduling algorithm based on Rate and Laxity) was proposed. Two parameters were added to the Task Control Block (TCB): the importance of the task and its laxity. A task of higher importance can preempt the running task only when its laxity is zero. The experimental results show that the algorithm decreases the deadline-missing ratio of the tasks and uses the CPU resource more effectively. It is an efficient way of scheduling real-time tasks, and it is also useful for applications in wireless broadband and mobile computing.
    Design and implementation of FPGA data communication interface driver based on Linux
    2009, 29(09):  2520-2522. 
    To meet the application requirements of the Field Programmable Gate Array (FPGA) in embedded systems, a data communication interface between an FPGA and an ARM processor was designed and implemented. Based on the implementation of the FPGA device driver in embedded Linux 2.6, a detailed discussion was given of how FPGA operations are performed through the memory mapping mechanism and how system efficiency is improved through blocking operations. Referring to the related data structures and functions of Linux 2.6.15, a device driver for the FPGA device was programmed and tested. This FPGA interface driver, which has been applied and tested over a long period, runs steadily in the whole system.
    Integrated research on quick response system of supply chain based on SOA and ESB
    2009, 29(09):  2523-2526. 
    To establish an open and loosely-coupled system integration environment, the architecture of system integration based on Service-Oriented Architecture (SOA) and Enterprise Service Bus (ESB) was provided. The execution mechanism of ESB was described in detail. Finally, the method was applied to a case of supply chain quick response. The experimental results show that ESB based on SOA is an effective method to build an integrated system between heterogeneous systems.
    FPGA verification technology based on Synopsys VMM
    2009, 29(09):  2527-2529. 
    Field Programmable Gate Array (FPGA) designs should be thoroughly verified in order to enhance the reliability of the corresponding products, given the growing importance of programmable devices in the implementation of digital systems. The authors analyzed methods for producing quality hardware designs on FPGAs, depicted the trends of verification in terms of method and methodology, and compared mainstream verification methodologies. Based on the Synopsys Verification Methodology Manual (VMM), the authors proposed and implemented a layered general-purpose verification technique that has been used to construct verification platforms in practice. The experimental results show that this technique not only maintains the generality of the platform but also improves the efficiency of verification.
    Improved scheme for fuzzy integrated evaluation of software quality based on minimum confidence and evaluation analysis
    2009, 29(09):  2530-2533. 
    The importance of software quality evaluation in software engineering and the process of realizing fuzzy integrated evaluation as a software quality evaluation method were elaborated, and the shortcomings of the current fuzzy integrated evaluation process were pointed out. A new idea to guide software quality evaluation was proposed, which uses minimum confidence as the membership rule and uses evaluation analysis, based on quantified weight and satisfaction, as the analysis approach. Based on the minimum confidence and the evaluation analysis, an improved realization scheme for the fuzzy integrated evaluation of software quality was given. Finally, the rationality and validity of this scheme were verified through a real case.
    Design and implementation of file monitoring system of Windows Mobile
    2009, 29(09):  2534-2536. 
    The authors analyzed the file system of Windows Mobile and studied the structure and call logic of the system Application Program Interface (API). A way of establishing hook functions for the system APIs to intercept the relevant API operations was then put forward as the basis for the design and implementation of a file monitoring system on Windows Mobile. The experimental results show that the system can monitor all file access operations.
    Design and realization of Agent-based modeling and simulation system on energy infrastructures
    2009, 29(09):  2537-2540. 
    Modeling and simulation of energy infrastructures is a proper, and almost the only, method for studying their characteristics. The authors presented a modeling structure for an Agent-Based Model (ABM) of energy infrastructures including electric power, petroleum and natural gas. It provides a detailed description of each element, with particular emphasis on their attributes, behaviors, rules and the interdependencies between them. The different steady-state equations of the energy systems were derived by analyzing the characteristics of the interdependent energy infrastructures. Research on the proposed model, which clearly demonstrates the emergent behaviors of the infrastructure, was also presented.
    Hot spot implementation method of three-layer software framework
    2009, 29(09):  2541-2545. 
    The authors classified hot spots of the three-layer software framework according to different characteristics, and gave different design strategies and methods to realize the hot spots with reference to design patterns. Finally, the authors took Time Quota Management System (TQMS) of a large-scale mechanical manufacturing company as an example to elaborate the realization of the hot spots based on .NET. The proposed method provided good support to enhance the flexibility of software framework, as well as flexible response to changes in customer's demand.
    Typical applications
    New peak searching method in subspace spectrum
    2009, 29(09):  2546-2547. 
    The orthogonality between the signal subspace and the noise subspace is widely employed to implement Direction of Arrival (DOA) estimation; however, it is hard to search the spectrum peaks in real time because of the large computing load. The new algorithm first sets a threshold so that all intervals containing peaks can be located with a large search step, and then uses a small search step to find the desired spectrum peaks within these intervals. The principle, implementation and performance of the new approach were addressed. Finally, the computer simulations illustrate that the new strategy reduces the computing load by 40% to 50%.
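    The two-step search is easy to illustrate on a generic 1-D pseudo-spectrum; the toy spectrum below stands in for a MUSIC-type spectrum, and the threshold and step sizes are illustrative (the fine search is the per-sample scan inside each retained block).

        import numpy as np

        def coarse_to_fine_peaks(spectrum, threshold, coarse_step=20):
            """First scan the spectrum in coarse blocks and keep only blocks whose
            maximum exceeds the threshold, then locate the exact peak in each block."""
            peaks = []
            for start in range(0, len(spectrum), coarse_step):
                block = spectrum[start:start + coarse_step]
                if block.max() >= threshold:
                    peaks.append(start + int(np.argmax(block)))
            return peaks

        angles = np.linspace(-90, 90, 1801)
        spectrum = 1.0 / (0.05 + (np.cos(np.radians(angles - 20.0)) - 1.0) ** 2)
        idx = coarse_to_fine_peaks(spectrum, threshold=0.8 * spectrum.max())
        print(angles[idx])   # peak(s) near 20 degrees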
    Analysis of random-like property of discrete chaotic system with symbol entropy
    2009, 29(09):  2548-2549. 
    Asbtract ( )   PDF (422KB) ( )  
    Related Articles | Metrics
    The random-like properties of the typical Logistic map and Henon map were analyzed and discussed using a symbol entropy algorithm. Firstly, binary sequences were obtained from the real-valued sequences generated by the discrete chaotic maps and then coded into symbols. The symbol entropies of the binary sequences were calculated and their curves were plotted, and the influences of the system parameter and the initial value on the symbol entropy were discussed. Simulation results show that the symbol entropy algorithm can statistically identify the strength of the random-like properties of discrete chaotic maps, and that the Logistic map is better than the Henon map as a source of randomness.
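    A small sketch of the symbol entropy computation outlined above, assuming the common choices of binarizing the Logistic sequence at 0.5 and forming overlapping m-bit symbol words; the word length and sequence length are illustrative.

        import numpy as np
        from collections import Counter

        def logistic_sequence(x0, n, mu=4.0):
            xs, x = np.empty(n), x0
            for i in range(n):
                x = mu * x * (1.0 - x)
                xs[i] = x
            return xs

        def symbol_entropy(bits, m):
            """Shannon entropy (in bits) of overlapping m-bit symbols."""
            words = [tuple(bits[i:i + m]) for i in range(len(bits) - m + 1)]
            p = np.array(list(Counter(words).values()), dtype=float)
            p /= p.sum()
            return float(-(p * np.log2(p)).sum())

        seq = logistic_sequence(0.37, 20000)
        bits = (seq > 0.5).astype(int)       # binarization of the real-valued sequence
        print(symbol_entropy(bits, m=4))     # approaches 4 for strongly random-like sequences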
    Forecasting model of short-term traffic flow for road network based on independent component analysis and support vector machine
    2009, 29(09):  2550-2553. 
    Asbtract ( )   PDF (548KB) ( )  
    Related Articles | Metrics
    Traffic flow forecasting is one of the important issues in the research of Intelligent Transportation Systems (ITS). By analyzing the characteristics of the data collected at different observation points on the same road, the authors proposed a new short-term traffic flow prediction method for road networks based on Independent Component Analysis (ICA) and Support Vector Machine (SVM). First, the traffic flow data of the observation points on the same road were transformed into independent source signals by ICA. Second, an SVM model was trained to predict the source signals, with its parameters optimized by a Genetic Algorithm (GA). Finally, the traffic flow forecasts were obtained by the inverse transform. Real traffic data were used to test the proposed prediction model. The experimental results show that this method is not only more accurate than applying SVM directly to the traffic flow, but also removes the interaction among the data of the observation points on the same road.
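    A rough sketch of the ICA-then-SVM pipeline described above, using scikit-learn's FastICA and SVR on synthetic data; a grid search stands in for the paper's GA-based parameter optimization, and the lag length and source signals are invented for the example.

        import numpy as np
        from sklearn.decomposition import FastICA
        from sklearn.model_selection import GridSearchCV
        from sklearn.svm import SVR

        # Synthetic "traffic flows" at 3 observation points on the same road (illustrative).
        rng = np.random.default_rng(0)
        t = np.arange(500)
        sources = np.c_[np.sin(0.05 * t), np.sign(np.sin(0.11 * t)), rng.normal(size=500)]
        X = sources @ rng.uniform(0.5, 1.5, size=(3, 3))

        ica = FastICA(n_components=3, random_state=0)
        S = ica.fit_transform(X)                         # independent source signals

        lag = 4
        S_next = np.empty((1, S.shape[1]))
        for k in range(S.shape[1]):
            feats = np.array([S[i:i + lag, k] for i in range(len(S) - lag)])
            target = S[lag:, k]
            # grid search stands in for the paper's GA-based parameter optimization
            svr = GridSearchCV(SVR(kernel='rbf'),
                               {'C': [1, 10, 100], 'gamma': ['scale', 0.01]}, cv=3)
            svr.fit(feats, target)
            S_next[0, k] = svr.predict(S[-lag:, k].reshape(1, -1))[0]

        X_next = ica.inverse_transform(S_next)           # forecast per observation point
        print(X_next)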
    PCB defect detection based on run-length encoding
    GUO Shi-gang
    2009, 29(09):  2554-2555. 
    Asbtract ( )   PDF (346KB) ( )  
    Related Articles | Metrics
    When computer vision technology is used to inspect Printed Circuit Board (PCB) quality, the PCB image containing defect information was encoded with run-length encoding and the connected components were labeled. The information of the connected components was stored in pointer arrays and dynamic linked lists, which increased the data processing speed and decreased memory occupation. Extracting and processing eigenvalues of the connected components enhanced the anti-jamming capability and made it possible to recognize defects and locate their positions on the PCB. The experimental results show that the method is simple, effective and fast.
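    A minimal sketch of the row-wise run-length encoding step on a toy defect image; scipy.ndimage.label stands in here for the paper's run-based connected component labeling, and the component area is shown as one example feature.

        import numpy as np
        from scipy import ndimage

        def run_length_encode(row):
            """Return (start, length, value) runs for one binary image row."""
            runs, start = [], 0
            for i in range(1, len(row) + 1):
                if i == len(row) or row[i] != row[start]:
                    runs.append((start, i - start, int(row[start])))
                    start = i
            return runs

        # toy binary defect image; 1 = pixels differing from the reference board
        img = np.zeros((8, 8), dtype=np.uint8)
        img[2:4, 2:5] = 1
        img[6, 6] = 1

        rle = [run_length_encode(r) for r in img]          # compact row-wise representation
        labels, n = ndimage.label(img)                     # stand-in for run-based labeling
        areas = ndimage.sum(img, labels, range(1, n + 1))  # area feature per connected component
        print(rle[2], n, areas)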
    Design and realization of multi-Agent-based distributed simulation system for supply chain
    2009, 29(09):  2556-2558. 
    Asbtract ( )   PDF (669KB) ( )  
    Related Articles | Metrics
    A supply chain is a complex system, so it is difficult to achieve its global optimization. Since the members of a supply chain are intelligent, a multi-Agent simulation model for supply chains was proposed. Based on this model, two types of Agents and a time synchronization mechanism for distributed simulation were designed, and a distributed simulation system for supply chains was realized with JADE. The simulation results of a semiconductor supply chain show the effectiveness of the system.
    Model and design of adaptive mobile middleware supporting scalability
    2009, 29(09):  2559-2561. 
    Asbtract ( )   PDF (487KB) ( )  
    Related Articles | Metrics
    To satisfy the requirements of mobile computing, mobile middleware was introduced into the development of mobile applications to improve the scalability and adaptability of mobile systems. The requirements for mobile middleware were introduced, and a service-based mobile middleware model was advanced. This model used several middleware instances distributed across the mobile computing system to support system scalability. The architecture and design of the mobile middleware model were presented and its performance was discussed in depth.
    Cloud computing and its key techniques
    2009, 29(09):  2562-2567. 
    Asbtract ( )   PDF (931KB) ( )  
    Related Articles | Metrics
    Cloud computing is a new computing model developed from grid computing. The authors introduced the development history of cloud computing and its application status; compared existing definitions of cloud computing and gave a new definition; took Google's cloud computing techniques as an example and summarized the key techniques used in cloud computing, such as the data storage technology (Google File System), the data management technology (BigTable), and the programming and task scheduling model (MapReduce); analyzed the differences among cloud computing, grid computing and traditional super-computing; and pointed out the broad development prospects of cloud computing.
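    A minimal word-count illustration of the MapReduce programming model mentioned above (the model only, not Google's distributed implementation):

        from collections import defaultdict
        from itertools import chain

        def map_phase(document):
            # map: emit one (word, 1) pair per word occurrence
            return [(word, 1) for word in document.split()]

        def shuffle(pairs):
            # group intermediate pairs by key, as the framework does between map and reduce
            groups = defaultdict(list)
            for key, value in pairs:
                groups[key].append(value)
            return groups

        def reduce_phase(key, values):
            # reduce: aggregate all values emitted for one key
            return key, sum(values)

        docs = ["cloud computing model", "grid computing and cloud computing"]
        mapped = chain.from_iterable(map_phase(d) for d in docs)
        print([reduce_phase(k, v) for k, v in shuffle(mapped).items()])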
    Multi-source remote sensing image fusion based on improved wavelet transform method
    2009, 29(09):  2568-2570. 
    Asbtract ( )   PDF (621KB) ( )  
    Related Articles | Metrics
    SPOT panchromatic images and multi-spectral TM remote sensing images were fused by an improved wavelet transform that combines the advantages of the direction-adjusted wavelet transform and the inseparable frame wavelet transform, with the aid of Matlab and ERDAS IMAGINE. Compared with traditional fusion methods, this method can maintain both the spectral information of the multi-spectral images and the spatial resolution of the panchromatic images, and therefore enhances the ability of image interpretation.
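    For orientation, the following is a conventional discrete-wavelet fusion sketch using PyWavelets: the approximation coefficients are kept from the multi-spectral band and the detail coefficients are taken from the panchromatic band. It is a baseline illustration, not the paper's direction-adjusted or inseparable frame wavelet transform, and the random arrays merely stand in for co-registered image bands.

        import numpy as np
        import pywt

        def wavelet_fuse(pan, ms_band, wavelet='db2', level=2):
            """Keep the multi-spectral approximation, inject panchromatic detail."""
            c_pan = pywt.wavedec2(pan, wavelet, level=level)
            c_ms = pywt.wavedec2(ms_band, wavelet, level=level)
            fused = [c_ms[0]] + list(c_pan[1:])     # [approximation, detail subbands...]
            return pywt.waverec2(fused, wavelet)

        rng = np.random.default_rng(1)
        pan = rng.random((128, 128))    # stand-in for a SPOT panchromatic band
        ms = rng.random((128, 128))     # stand-in for a resampled TM band
        print(wavelet_fuse(pan, ms).shape)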
    Application of Bayesian decision tree to recognition of English present participle
    2009, 29(09):  2571-2574. 
    Asbtract ( )   PDF (534KB) ( )  
    Related Articles | Metrics
    Concerning the difficulties in part-of-speech tagging of the English present participle, the authors analyzed the drawbacks of Hidden Markov Models (HMM) and proposed a Bayesian decision tree model. Firstly, statistics were computed from the tagged corpus and the C4.5 decision tree algorithm was used to classify and disambiguate the three classes of present participle. Then, the decision tree was improved with the Bayesian minimum-risk criterion. Finally, an untagged corpus was used to test the model; the results are very good, which proves the superiority of the model.
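    A hedged sketch of the minimum-risk decision step on top of decision tree posteriors: scikit-learn's CART tree stands in for C4.5, and the features, classes and loss matrix are hypothetical.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical features extracted from context words; 3 present-participle classes.
        rng = np.random.default_rng(2)
        X = rng.random((300, 5))
        y = rng.integers(0, 3, size=300)

        tree = DecisionTreeClassifier(max_depth=5).fit(X, y)   # CART stands in for C4.5
        proba = tree.predict_proba(X[:5])                      # class posteriors P(c | x)

        # loss[i][j]: cost of deciding class j when the true class is i (illustrative values)
        loss = np.array([[0, 1, 4],
                         [1, 0, 2],
                         [3, 2, 0]])
        risk = proba @ loss                    # expected risk of each candidate decision
        decisions = risk.argmin(axis=1)        # Bayesian minimum-risk decision
        print(decisions, tree.predict(X[:5]))  # may differ from plain max-probability tagging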
    Binary subdivision scheme with three parameters
    2009, 29(09):  2575-2577. 
    Asbtract ( )   PDF (489KB) ( )  
    Related Articles | Metrics
    A new scheme was presented to design subdivision curves by introducing three control parameters into the subdivision process. The sufficient conditions for the uniform convergence and C^k continuity of this subdivision scheme were analyzed. Given the initial data, the curve shape can be adjusted and controlled by selecting appropriate parameters. The scheme can produce limit curves of C^4 continuity, and the curve modeling is flexible. Some examples of curve design were given to demonstrate the efficiency of the scheme.
    Multimodulus blind equalization algorithm for cross quadrature amplitude modulation signals
    2009, 29(09):  2578-2580. 
    Asbtract ( )   PDF (588KB) ( )  
    Related Articles | Metrics
    The authors presented a modified multimodulus blind equalization algorithm for cross Quadrature Amplitude Modulation (QAM) signals. Based on the Weighted Multimodulus Algorithm (WMMA), the proposed algorithm applied the improved Multimodulus Algorithm (MMA) approach to cross QAM signals and fully exploited the statistics of the symbols in cross QAM constellations. With the help of piecewise linear moduli, it achieved a better match between the error model and the constellation. Additionally, the algorithm revealed the relationship between the Mean-Square Error (MSE) estimate and the weight term under various Signal-to-Noise Ratios (SNR). The simulation results demonstrate that this algorithm provides superior steady-state performance, which makes it more suitable for cross QAM signals.
    Suspicious money laundering detection system based on eigenvector centrality measure of transaction network
    2009, 29(09):  2581-2585. 
    Asbtract ( )   PDF (716KB) ( )  
    Related Articles | Metrics
    For anti-money laundering, a detection system based on the eigenvector centrality measure was introduced, which included pre-processing of transaction data, eigenvector centrality measurement and time-series analysis. Three key indexes for detecting suspicious activities were also provided. The validity of the detection system was proved through simulation on bank transaction data.
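    A small sketch of the eigenvector centrality index on a transaction network using networkx; the transaction list is hypothetical and the amount-weighted centrality score is shown as one possible suspicious-activity index.

        import networkx as nx

        # Hypothetical transactions (payer, payee, amount); a real system would read these
        # from the pre-processed bank transaction data mentioned above.
        transactions = [("A", "B", 9000), ("B", "A", 7000), ("B", "C", 8500),
                        ("C", "A", 8200), ("D", "B", 300), ("E", "C", 450)]

        G = nx.DiGraph()
        for payer, payee, amount in transactions:
            w = G[payer][payee]["weight"] + amount if G.has_edge(payer, payee) else amount
            G.add_edge(payer, payee, weight=w)

        centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
        suspicious = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)
        print(suspicious[:3])      # accounts with the highest centrality scores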
    Garment modeling technology based on section
    2009, 29(09):  2586-2588. 
    Asbtract ( )   PDF (605KB) ( )  
    Related Articles | Metrics
    The authors presented a feature-based, parameterized approach for constructing garment surfaces with garment sections derived from human body sections. The body sections were obtained from polygonal 3D human models with horizontal cutting planes, based on detecting feature points on the human model. The interstice measure converted from the garment tolerance was then used as the parameter to build garment sections from body sections. The garment surface was constructed from the garment sections by an improved angle-synchronous method.
    Research and application of Hilbert packed R-tree in ATC GIS display
    2009, 29(09):  2589-2592. 
    Asbtract ( )   PDF (564KB) ( )  
    Related Articles | Metrics
    The present Air Traffic Control (ATC) Geographical Information System (GIS) is very slow in displaying the map because it traverses the whole model to draw all units. Concerning this, the mapping algorithm was improved based on the Hilbert packed R-tree: a Hilbert packed R-tree index was established for each layer, and the map was redrawn using depth-first traversal. The experimental results indicate that the proposed algorithm achieves a faster display speed.
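    A sketch of how the leaf level of a Hilbert packed R-tree can be built: compute the Hilbert curve index of each unit's MBR centre, sort by that index, and pack the sorted entries into nodes of fixed capacity. The Hilbert index routine is the standard xy-to-index conversion; the grid order, node capacity and map units are illustrative.

        def hilbert_index(order, x, y):
            """Map integer cell (x, y) on a 2**order x 2**order grid to its Hilbert curve index."""
            n = 1 << order
            d, s = 0, n >> 1
            while s > 0:
                rx = 1 if (x & s) > 0 else 0
                ry = 1 if (y & s) > 0 else 0
                d += s * s * ((3 * rx) ^ ry)
                if ry == 0:                      # rotate/flip the quadrant
                    if rx == 1:
                        x, y = n - 1 - x, n - 1 - y
                    x, y = y, x
                s >>= 1
            return d

        def pack_leaves(entries, order=10, capacity=4):
            """entries: list of (id, (xmin, ymin, xmax, ymax)); returns packed leaf nodes."""
            def center_key(e):
                _, (x0, y0, x1, y1) = e
                return hilbert_index(order, int((x0 + x1) / 2), int((y0 + y1) / 2))
            ordered = sorted(entries, key=center_key)
            return [ordered[i:i + capacity] for i in range(0, len(ordered), capacity)]

        # toy map units with MBRs in grid coordinates (illustrative)
        units = [(i, (x, y, x + 2, y + 2))
                 for i, (x, y) in enumerate([(5, 7), (100, 3), (50, 50), (8, 9),
                                             (102, 6), (51, 48), (200, 210), (198, 214)])]
        print(pack_leaves(units))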
    Novel method of space horizontal line localization based on single catadioptric omnidirectional image
    2009, 29(09):  2593-2595. 
    Asbtract ( )   PDF (450KB) ( )  
    Related Articles | Metrics
    Space horizontal line localization without prior knowledge from a single omnidirectional image was explored. Based on existing approaches to straight line localization, the authors demonstrated that, for symmetric non-central catadioptric systems, the equation of a 3D horizontal line can be estimated using only two points extracted from a single image of the line, by exploiting the characteristics of the horizontal line image in a catadioptric system; meanwhile, a horizontal line reconstruction algorithm based on main-point and non-main-point images was proposed. The experimental results show that, under different precisions of image point extraction, the proposed method is consistently more efficient than existing line reconstruction approaches.
    New LOD method based on adaptive quad-tree for terrain visualization
    2009, 29(09):  2596-2598. 
    Asbtract ( )   PDF (500KB) ( )  
    Related Articles | Metrics
    Level of Detail (LOD) is utilized to simplify the model and improve efficiency in the 3D visualization of terrain with large data volumes, and the quad-tree is a data structure well suited to organizing such complicated data. First, the authors discussed the general LOD method based on an adaptive quad-tree and described the triangle-fan method for model drawing; second, according to the multi-resolution features of the quad-tree, a method for seamless model drawing based on revising node attributes in the quad-tree was proposed; finally, the 3D visualization of a 1:50000 map was achieved by this method and the dynamic display was improved.