
Table of Contents

    01 April 2010, Volume 30 Issue 4
    Pattern recognition and Software
    Research and design of code updating mechanism in wireless sensor network
    2010, 30(4):  857-859. 
    Abstract | PDF (797KB)

    Since sensor nodes of Wireless Sensor Networks (WSN) need software maintenance and function expansion after deployment, remote code updating has become an indispensable service. Based on an analysis of code dissemination protocols such as MOAP, Deluge and Ripple, a new multi-hop code update mechanism called Air_Update was proposed, which is reliable, energy efficient, and has low storage cost and low latency. To ensure the correctness and integrity of the loaded code image, as well as the availability of a node after abnormal reprogramming, the Bootloader was redesigned. Meanwhile, mechanisms such as a subset-to-subset dissemination protocol, a unicast retransmission request strategy and a sliding window were adopted to reduce traffic, SRAM and EEPROM usage, and energy consumption, and to prolong the network lifetime. Finally, the correctness and effectiveness of the design were verified on testbeds.
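    The sliding-window and unicast-retransmission bookkeeping described above can be sketched as follows. This is a minimal illustration only, not the Air_Update protocol itself; the class name, window size, and NACK interface are invented for the example:

```python
class SlidingWindowReceiver:
    """Track received code pages inside a fixed window and report the
    missing ones, so they can be requested by unicast retransmission."""

    def __init__(self, total_pages, window=8):
        self.total = total_pages
        self.window = window
        self.base = 0           # lowest page not yet received in order
        self.received = set()   # out-of-order pages inside the window

    def on_page(self, seq):
        if not (self.base <= seq < self.base + self.window):
            return              # outside the current window: drop
        self.received.add(seq)
        while self.base in self.received:   # slide past contiguous pages
            self.received.discard(self.base)
            self.base += 1

    def nack_list(self):
        # pages inside the window that are still missing
        end = min(self.base + self.window, self.total)
        return [s for s in range(self.base, end) if s not in self.received]

    def complete(self):
        return self.base >= self.total
```

    Only a window's worth of pages has to be buffered at once, which is where the low SRAM/EEPROM cost of such schemes comes from.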

    Network and communications
    Application of VCG mechanism in replica placement of P2P storage system
    2010, 30(4):  860-864. 
    Abstract | PDF (933KB)
    The VCG (Vickrey-Clarke-Groves) mechanism was introduced into a generalized replica placement model of Peer-to-Peer (P2P) storage systems. A mapping from the replica placement model to the VCG mechanism was established and a suitable payment function was designed, which is incentive compatible for pre-placement nodes. A dominant-strategy equilibrium exists in this mechanism and can be reached in polynomial time. Simulation shows that the mechanism can stimulate pre-placement nodes to tell the truth.
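    For intuition, a toy instance of the payment idea: when k identical replicas must be placed on the k cheapest nodes, the VCG payment to each winner is the (k+1)-th lowest declared cost, so a winner's payment never depends on its own bid and truthful cost reporting is a dominant strategy. This sketch is illustrative only; the paper's generalized placement model and payment function are richer:

```python
def vcg_select_and_pay(costs, k):
    """Pick the k lowest-cost nodes to host replicas; pay each winner
    the (k+1)-th lowest cost (the harm its presence imposes on others)."""
    assert k < len(costs)
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    winners = order[:k]
    threshold = costs[order[k]]          # first excluded bid
    return winners, {i: threshold for i in winners}
```

    Because the payment equals the first excluded bid, over-reporting a cost only risks losing the contract and under-reporting cannot raise the payment — the incentive-compatibility property the abstract relies on.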
    Equalization algorithm of wavelet packet transform-based multicarrier modulation communication system
    2010, 30(4):  865-867. 
    Abstract | PDF (535KB)
    For transmitting high-speed data streams in complicated mobile environments, intersymbol interference inevitably arises from the multipath effect during data transmission in a wavelet packet transform-based multicarrier modulation communication system. After establishing a multipath Rayleigh fading channel model, the authors analyzed the system performance in Rayleigh fading channels. By introducing the zero-forcing method, a zero-forcing equalization algorithm based on time-domain channel estimation for wavelet packet transform-based multicarrier modulation was proposed. The proposed algorithm is simple to implement and does not increase system complexity. The simulation results indicate that the performance of the communication system is enhanced effectively in the multipath Rayleigh fading channel, which validates the feasibility of the proposed scheme and its application prospects in high-speed data transmission and future mobile communication.
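    The zero-forcing idea itself fits in a few lines: invert the channel's frequency response bin by bin. The sketch below uses the ordinary Fourier domain with an assumed known channel for clarity, whereas the paper applies the same principle to wavelet packet-based multicarrier modulation with an estimated channel:

```python
import numpy as np

N = 64
rng = np.random.RandomState(0)
x = rng.choice([-1.0, 1.0], size=N)   # BPSK symbols on N subcarriers
h = np.array([1.0, 0.5, 0.2])         # multipath channel (assumed known)

# The channel acts as circular convolution (cyclic-prefix assumption).
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, N)))

# Zero forcing: divide the received spectrum by the channel response.
H = np.fft.fft(h, N)
x_hat = np.real(np.fft.ifft(np.fft.fft(y) / H))
```

    In the presence of noise, dividing by small |H(k)| amplifies noise — the classic drawback of zero forcing; here the channel's zeros lie inside the unit circle, so the inversion is well conditioned.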
    Storage model of servers in video grid
    2010, 30(4):  868-871. 
    Abstract | PDF (694KB)
    Concerning the different functions of the storage systems of the central and local servers in a video grid, the growth rule of the local service rate with increasing relative storage capacity of local servers was analyzed. An economical relative storage capacity for local servers, based on the central server's storage capacity, was found near the critical point of the local service rate; it balances the local service rate against storage system costs. The storage model can guide the construction of a real video grid system.
    Method of Top-k Web service selection based on QoS
    2010, 30(4):  872-875. 
    Abstract | PDF (673KB)
    Quality attributes of Web services are dynamic and real-time. Much attention has been paid to providing quality-guaranteed Top-k service selection while effectively reducing the load on both the host server and the network. This paper proposed a method named the RTKS-QoS algorithm, which uses a monotone utility function: by normalizing QoS attributes and calculating the value range of the utility function, it efficiently filters out the Top-k services that meet the requirements. The experimental results show that, for k < 20, the response time and network load of the RTKS-QoS algorithm improve by 55% and 52% respectively compared to the non-optimized case.
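    A minimal sketch of the utility-function step the abstract describes: normalize each QoS attribute to [0,1] (inverting cost-type attributes such as response time), form a weighted monotone utility, and keep the k best. The attribute set and weights are invented for illustration; RTKS-QoS additionally prunes candidates using the value range of the utility function, which this sketch omits:

```python
def normalize(values, benefit):
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if benefit:                                    # larger is better
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]  # smaller is better

def top_k_services(services, weights, benefit_flags, k):
    """services: list of QoS tuples; returns indices of the k best."""
    cols = list(zip(*services))
    norm = [normalize(col, b) for col, b in zip(cols, benefit_flags)]
    util = [sum(w * norm[j][i] for j, w in enumerate(weights))
            for i in range(len(services))]
    return sorted(range(len(services)), key=lambda i: -util[i])[:k]
```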
    Elliptic trajectory based reliable forwarding for wireless sensor networks
    2010, 30(4):  876-880. 
    Abstract | PDF (1145KB)
    In Wireless Sensor Networks (WSN), it is difficult to achieve efficient transmission of large blocks of data such as images, video and audio. An Elliptic Trajectory-based Reliable Forwarding (ETRF) mechanism was proposed for WSN. Using geographical location, multiple disjoint elliptic paths with equal distances between them were established from sources to sinks, so that large blocks of data could be divided into several smaller pieces and forwarded along separate elliptic paths in parallel, increasing network bandwidth and balancing network load. ETRF not only combined multiple redundant paths with a Forward Error Correction (FEC) protection mechanism, but also utilized "back-off competition" and cooperation among neighboring nodes within a designated area to enhance the overall transmission performance in dynamic networks. Both theoretical analysis and simulation results demonstrate that ETRF exhibits superior comprehensive performance compared with other relevant protocols.
    SIP compression algorithm based on SigComp
    2010, 30(4):  881-883. 
    Abstract | PDF (518KB)
    The IP Multimedia Subsystem (IMS) is a subsystem supporting IP multimedia services, introduced in Release 5 of the 3rd Generation Partnership Project (3GPP). IMS uses the Session Initiation Protocol (SIP) to establish and maintain multimedia sessions; however, oversized text-based SIP messages are becoming a bottleneck in IMS wireless environments. Within the SigComp framework, a new algorithm combining an improved Lempel-Ziv-Storer-Szymanski (LZSS) algorithm with arithmetic coding was presented to compress SIP signaling. Simulation results prove that the new algorithm has higher compression efficiency, which is valuable for reducing IMS SIP session setup delay.
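    The LZSS core replaces repeated substrings with (offset, length) references into a sliding window, which suits the highly repetitive text of SIP messages. A plain, unoptimized sketch is shown below; the paper's contribution is an improved LZSS combined with an entropy-coding stage inside the SigComp framework, which this toy version does not attempt to reproduce:

```python
def lzss_compress(data, window=255, min_match=3, max_match=18):
    """Greedy LZSS: emit (True, offset, length) for matches found in the
    sliding window, (False, byte) for literals."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            while l < max_match and i + l < len(data) and data[j + l] == data[i + l]:
                l += 1
            if l > best_len:
                best_len, best_off = l, i - j
        if best_len >= min_match:
            out.append((True, best_off, best_len))
            i += best_len
        else:
            out.append((False, data[i]))
            i += 1
    return out

def lzss_decompress(tokens):
    out = bytearray()
    for tok in tokens:
        if tok[0]:
            _, off, length = tok
            for _ in range(length):      # byte-by-byte copy handles overlap
                out.append(out[-off])
        else:
            out.append(tok[1])
    return bytes(out)
```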
    Comparative study of time series model in traffic prediction of GPRS cells
    2010, 30(4):  884-887. 
    Abstract | PDF (767KB)
    The performance of some classic time series prediction models was analyzed with respect to the traffic prediction of General Packet Radio Service (GPRS) cells. After summarizing the steps of prediction with time series models, the performance of the Auto-Regression (AR) model, Auto-Regression Moving Average (ARMA) model, Auto-Regressive Integrated Moving Average (ARIMA) model and multiple seasonal ARIMA model was analyzed. First, the traffic changes of GPRS cells were studied. Then the autocorrelation and partial autocorrelation coefficients of the traffic were analyzed from different angles, and AR and ARMA models of GPRS cell traffic were proposed. Furthermore, according to the daily cycle of cell traffic, a multiple seasonal ARIMA model of GPRS cell traffic was proposed. Finally, with the historical traffic data of a cell, the three models were applied to predict traffic at future time points, and a comparative study of their prediction performance was made.
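    As a concrete example of the modeling steps, an AR(p) model can be fitted by ordinary least squares and used for one-step-ahead prediction. This shows only the AR building block (ARIMA adds differencing and seasonal terms on top of it), using plain NumPy rather than a statistics package:

```python
import numpy as np

def fit_ar(series, p):
    """Least-squares fit of x[t] = c + phi_1*x[t-1] + ... + phi_p*x[t-p]."""
    x = np.asarray(series, dtype=float)
    lags = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    X = np.column_stack([np.ones(len(lags)), lags])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef                      # [c, phi_1, ..., phi_p]

def predict_next(series, coef):
    """One-step-ahead forecast from the most recent p observations."""
    p = len(coef) - 1
    recent = series[-1:-p - 1:-1]    # x[t-1], x[t-2], ..., x[t-p]
    return coef[0] + sum(c * v for c, v in zip(coef[1:], recent))
```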
    Mechanism for optimizing P2P traffic between networks based on network measurement
    2010, 30(4):  888-891. 
    Abstract | PDF (740KB)
    The popularization of P2P techniques has improved the Internet experience; however, it puts great pressure on Internet Service Providers (ISP) because it consumes excessive network bandwidth. This paper proposed a mechanism for optimizing inter-network P2P traffic based on network measurement among networks carrying different services. The mechanism sets up a base-layer network model and can optimize the connections between peers by comprehensively considering the base-layer network information and the particularities of the services. Performance analysis and simulation show that this mechanism can not only reduce the traffic between networks but also improve user experience.
    Research and application of synchronization compensation mechanism in wireless sensor networks
    2010, 30(4):  892-894. 
    Abstract | PDF (786KB)
    In wireless sensor network applications that are event-triggered and require synchronization of only some of the nodes, the traditional periodic time synchronization algorithm, which synchronizes the entire network, leads to unnecessary power consumption. Therefore, this paper analyzed the existing time synchronization mechanisms and proposed an event-triggered time synchronization compensation algorithm for such applications. The algorithm achieves synchronization through accumulated statistics of processing delay and propagation delay compensation. The experimental results show that the algorithm offers good performance and energy efficiency when the number of interested nodes and the event frequency are low.
    Research and extension of Socket in KAME stack based on VxWorks
    Ran Xiangjin
    2010, 30(4):  895-897. 
    Abstract | PDF (494KB)
    This paper studied the Socket mechanism of the KAME protocol stack and proposed an approach to extend the stack by modifying the implementation mechanism of Socket. The extension effectively increases the number of IPv6 Sockets and improves the efficiency of IPv6 in high-performance embedded systems; it runs reliably on the VxWorks operating system and can be used in IPv6 networks as an extension of the KAME stack.
    Graphics and image processing
    View synthesis based on image reprojection
    2010, 30(4):  898-901. 
    Abstract | PDF (952KB)
    What properties the homography should have when the camera rotates around its optical center was discussed. Then combining these properties with the 3D image warping technique, a new way to synthesize novel views when camera rotates around its optical center was proposed. The method first generated part of the novel view by making use of the properties of the homography, and backward mapping method was used to avoid holes; then 3D image warping method was applied to create the remaining part of the novel view; finally, holes in the remaining part which were caused by 3D image warping were filled. Experimental results show that the proposed method provides novel views with quite good image quality.
    Bilateral filtering based image restoration for multiple grayscale images
    2010, 30(4):  902-904. 
    Abstract | PDF (954KB)
    In this paper, the authors first compared the advantages of the L2-norm, the L1-norm and the ρ-function. Using multiple degraded images, a new bilateral total variation-based blind deconvolution model was proposed. Finally, computer simulations were carried out. In the case of motion blur, the performance of the proposed model was compared with that of the Total Variation (TV) model. In the case of Gaussian blur, the restoration effect using only one degraded image was compared with that using two degraded images. The results show that the proposed model outperforms the TV model for motion blur and single-image Gaussian blur, and that multiple images give better results than a single image for Gaussian blur.
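    For reference, the bilateral weighting that gives such models their name combines a spatial Gaussian with a range (intensity) Gaussian, so averaging does not cross strong edges. A direct, unvectorized sketch on a grayscale array follows; the paper uses bilateral total variation inside a blind deconvolution model, which this sketch does not reproduce:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Per pixel: Gaussian spatial weight x Gaussian intensity weight."""
    img = img.astype(float)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma_s ** 2))
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[y, x]) ** 2 / (2 * sigma_r ** 2))
            w = spatial * rng_w
            out[y, x] = (w * patch).sum() / w.sum()
    return out
```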
    Symmetry axes search of 2D point set based on convex hull technology
    2010, 30(4):  905-908. 
    Abstract | PDF (824KB)
    Based on the symmetry of an object, people can infer its structure and estimate its pose even if some parts of the object are lost or occluded. This paper presented a new algorithm for detecting the symmetry axes of a 2D point set. First, the convex hull of the 2D point set is obtained by a convex hull algorithm. Second, for a complete point set, its symmetry axes can be found by seeking the symmetry axes of the convex hull. Third, for an incomplete point set, if the missing points are located inside the convex hull, the point set's symmetry axes must be the symmetry axes of its convex hull; if the missing points include points on the convex hull, perpendicular bisectors of the convex hull's edges or angle bisectors of the convex hull may be the symmetry axes of the point set. The proposed method can find the symmetry axes of an object even when some points are lost or occluded. The experimental results demonstrate that the method has high efficiency and good applicability.
    Reversible data hiding based on prediction error difference expansion and LSB replacement
    XIONG Zhi-yong
    2010, 30(4):  909-913. 
    Abstract | PDF (1298KB)
    This paper proposed a reversible data hiding algorithm for color images based on prediction error difference expansion and Least Significant Bit (LSB) replacement. To overcome the drawbacks of the traditional difference expansion algorithm, which must embed a large location map and degrades stego-image quality through excessive modulation of pixels, this algorithm used the correlation of color components to decrease the difference and dispersed the smaller expansion over two components, and improved the embedding formula to decrease the number of un-expandable differences, so the payload capacity was raised. Finally, an LSB replacement method was used to separate the processes of difference expansion and data embedding, improving efficiency. Experimental results show that the capacity and the quality of the stego-image are significantly improved, and the complexity is lower, compared with other recent and classical algorithms.
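    The reversible core of difference expansion, on a single pair of values: the difference h is doubled and the payload bit placed in its least significant bit, and both the bit and the original pair are recovered exactly on extraction. This shows only the basic integer transform; the paper's scheme applies the idea to prediction errors of color components and adds LSB replacement on top:

```python
def de_embed(a, b, bit):
    """Embed one bit into the pair (a, b) by expanding their difference."""
    l = (a + b) // 2                  # integer average (preserved)
    h = a - b                         # difference
    h2 = 2 * h + bit                  # expanded difference carries the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(a2, b2):
    """Recover the hidden bit and the original pair."""
    h2 = a2 - b2
    bit = h2 % 2                      # hidden bit is the LSB of the difference
    h = h2 // 2                       # original difference
    l = (a2 + b2) // 2                # average is unchanged by embedding
    return l + (h + 1) // 2, l - h // 2, bit
```

    Overflow checks (values leaving the valid pixel range) are what force real schemes to keep a location map of expandable pairs; they are omitted here.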
    3D model similarity matching algorithm based on outline characteristic point
    2010, 30(4):  914-916. 
    Abstract | PDF (585KB)
    The Heczko algorithm easily loses important information about a three-dimensional model's outline, which reduces matching accuracy. To address this problem, a three-dimensional model similarity matching algorithm based on outline characteristic points was proposed. Through function projection, the outline of the three-dimensional model was extracted, and the vertices of each outline were taken as characteristic points. A point set was constituted by the curvature values of the characteristic points, and similarity matching was carried out by calculating the Hausdorff distance between point sets. The experimental results indicate that the retrieval accuracy for three-dimensional models is improved.
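    The Hausdorff distance used in the matching step is short to state in code: the larger of the two directed distances between point sets. A NumPy sketch (the outline extraction and curvature features that produce the point sets are not shown):

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (n,d) and B (m,d)."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise dists
    return max(d.min(axis=1).max(),   # farthest point of A from B
               d.min(axis=0).max())   # farthest point of B from A
```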
    Image copy detection based on SVD in wavelet domain
    2010, 30(4):  917-920. 
    Abstract | PDF (811KB)
    This paper presented a novel Content-Based Copy Detection (CBCD) scheme using Singular Value Decomposition (SVD) in the wavelet domain and early fusion, for passive image forensics and Digital Rights Management (DRM). To improve the efficiency of image descriptors, multiscale singular value vectors combining global and local features of an image were exploited to generate the signature set for comparison. Local features were extracted by image partitioning and the Largest Singular Value (LSV). Experimental results demonstrate that the proposed algorithm not only achieves good robustness and discriminability in identifying various modified versions of an original image, including geometric transformations, signal processing, image manipulation and combinations of these, but also offers improved detection performance for various rotations, shifts and croppings of an image. The proposed approach can be applied to detecting pirated copies of digital images in a database or on the Internet.
    No-reference quality index for image blur
    2010, 30(4):  921-924. 
    Abstract | PDF (735KB)
    With the analysis of image blur based on the imaging model, a method was proposed for constructing reference images, and at the same time the Structural Similarity (SSIM) index was introduced into no-reference image quality assessment. A novel no-reference image quality assessment index called No-Reference Structural Sharpness (NRSS) was then proposed for quality evaluation of blurred images. This method constructed a reference image by a low-pass filter, and assessed the image quality by computing the SSIM between the original image and the reference one, thus considering the mathematical model of imaging system as well as the advantages of SSIM. The experimental results show that the new index is well in accordance with quality assessment results of both subjective evaluation and full-reference methods.
    Study on error diffusion algorithm of digital halftoning
    2010, 30(4):  925-928. 
    Abstract | PDF (955KB)
    The error diffusion algorithm performs well as a Frequency Modulation (FM) halftoning method, but tends to produce worm artifacts in highlight areas and to generate noise that blurs the edges of images. A new error diffusion algorithm based on an average threshold was proposed by changing the scanning order. Using an adaptive threshold based on spatial extension, the halftone images were obtained by optimizing the threshold and using serpentine scanning. The experimental results show that the proposed algorithm can reduce artifacts, enhance image edges, and give a pleasing perception.
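    For readers unfamiliar with error diffusion, the classic Floyd-Steinberg kernel with the serpentine scanning the abstract mentions looks like this; the paper's adaptive average-threshold variant replaces the fixed threshold of 128 used here:

```python
import numpy as np

def error_diffusion(img):
    """Floyd-Steinberg error diffusion with serpentine scanning:
    even rows run left-to-right, odd rows right-to-left."""
    buf = img.astype(float)
    h, w = buf.shape
    out = np.zeros((h, w))
    for y in range(h):
        step = 1 if y % 2 == 0 else -1
        xs = range(w) if step == 1 else range(w - 1, -1, -1)
        for x in xs:
            old = buf[y, x]
            new = 255.0 if old >= 128.0 else 0.0
            out[y, x] = new
            err = old - new            # push quantization error to neighbors
            if 0 <= x + step < w:
                buf[y, x + step] += err * 7 / 16
            if y + 1 < h:
                if 0 <= x - step < w:
                    buf[y + 1, x - step] += err * 3 / 16
                buf[y + 1, x] += err * 5 / 16
                if 0 <= x + step < w:
                    buf[y + 1, x + step] += err * 1 / 16
    return out
```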
    Rock joint image segmentation based on fractional differential and multi-grade combination in mathematical morphology
    2010, 30(4):  929-931. 
    Abstract | PDF (1012KB)
    Rock joint networks are very complex and rock images contain much noise, so traditional image segmentation methods cannot obtain satisfactory results. A new edge detection algorithm was proposed based on fractional differential and multi-grade synthesis of mathematical morphology. The rock fracture image is first preprocessed by operations such as noise filtering, image segmentation, cavity filling and spur removal. Then the fractional differential algorithm and multi-grade synthesis of mathematical morphology are used to obtain the results. The experimental results show that, compared with traditional morphological methods, the proposed algorithm preserves the detected edges of rock joint images well and also increases the denoising power.
    Implementation of edge linking by ant colony algorithm
    2010, 30(4):  932-934. 
    Abstract | PDF (894KB)
    This study presented an ant colony algorithm to implement edge linking. The proposed approach analyzed the endpoints based on the original image and the edge image obtained by traditional approaches, and then set up pheromone values with a Gaussian distribution around the endpoints in order to make the ants move toward the endpoints faster. The visibility of the paths was determined jointly by pixel similarity, neighboring difference and edge direction, which made the ants move along the real edge pixels. The experimental results indicate that the proposed edge linking approach is efficient and good at compensating broken edges.
    Multi-scale extraction approach of linear feature in high resolution SAR images
    2010, 30(4):  935-938. 
    Abstract | PDF (1108KB)
    In this paper, a multi-level framework was designed to extract linear features in multi-resolution Synthetic Aperture Radar (SAR) images, in order to improve the continuity and integrity of lines extracted by traditional methods. Speckle filtering and linear feature extraction were executed synchronously and in parallel; different edge detection and edge point grouping methods were used to extract linear features coarse-to-fine according to the characteristics of collinear points in images at different scales, so that linear features were extracted continuously, completely, accurately and efficiently. Finally, the approach was tested on runway extraction and road extraction, and the results were compared with those of phase grouping and the Hough transform.
    Contourlet-based super resolution restoration for remote sensing images
    2010, 30(4):  939-942. 
    Abstract | PDF (879KB)
    This paper presented a Contourlet-based super resolution restoration method for remote sensing images, which adopted Contourlet coefficients as features because they describe directionality and anisotropy well, and used the smallest Euclidean distance as the matching criterion in a global search. According to the distributions of the found coefficients at finer scales, the Hidden Markov Tree (HMT) model was introduced for remote sensing images in the Contourlet domain, and the Expectation Maximization (EM) algorithm was applied to estimate the parameters of the HMT model. With these parameters, the Contourlet coefficients were renewed using Bayesian estimation theory. Finally, the super resolution restoration of remote sensing images achieved a better effect.
    Improved exemplar-based inpainting algorithm for broken Thangka images
    LU Xiao-Bao, WANG Wei-Lan
    2010, 30(4):  943-946. 
    Abstract | PDF (1423KB)
    The exemplar-based image inpainting algorithm was introduced for the digital protection of Thangka images because it can effectively repair broken structure and texture at the same time. This algorithm is good at repairing certain kinds of broken Thangka images, but is not appropriate for others. Hence, this paper proposed two improvements for the deficiencies of exemplar-based inpainting of broken Thangka images: an improved computation of confidence and an improved computation of isophote intensity. The problem that the optimum exemplar block is not unique was also solved. The experimental results show that the improved algorithm not only obtains satisfactory inpainting results but also improves repair efficiency.
    Improved sample method for medical image registration based on mutual information
    2010, 30(4):  947-949. 
    Abstract | PDF (754KB)
    This paper studied medical image registration based on mutual information similarity. In the mutual information calculation, the image data were sampled by an entropy-based sampling method. This method divides the image into small blocks and calculates the entropy of each block. The blocks are classified according to their entropy, and each category is given a different sampling factor: blocks with higher entropy are sampled with higher factors, while blocks with lower entropy are sampled with lower factors. The experimental results prove that this method not only ensures registration accuracy but also accelerates registration, making it suitable for real-time medical image registration.
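    The entropy-based sampling step can be sketched directly: compute a histogram entropy per block and assign a higher sampling factor to information-rich blocks. The two-level classification and the rate values below are illustrative; the paper's classification may be finer:

```python
import numpy as np

def block_entropy(block, bins=16):
    """Shannon entropy (in bits) of a block's gray-level histogram."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def sampling_factors(img, block=8, low=0.1, high=0.5):
    """Map each block's top-left corner to a sampling factor."""
    ents = {(y, x): block_entropy(img[y:y + block, x:x + block])
            for y in range(0, img.shape[0], block)
            for x in range(0, img.shape[1], block)}
    thresh = np.median(list(ents.values()))
    return {k: (high if e > thresh else low) for k, e in ents.items()}
```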
    Correcting radial distortion of document images by morphology
    2010, 30(4):  950-952. 
    Abstract | PDF (730KB)
    Document images captured by hand-held cameras more or less suffer from lens distortion, so a mathematical morphology-based lens correction algorithm was proposed based on the text lines of document images. First, an adaptive thresholding algorithm is used to segment the document image, and connected components are clustered into text lines with a morphological closing operation. Then, a second-order polynomial is used to fit the central line of each text line, and the objective function of lens distortion is constructed. Warping the curved text lines to straight ones solves for the distortion parameters of the document image. The experimental results show that this method can effectively correct lens distortion of document images at various degrees of distortion.
    Pattern recognition
    Cluster-fusion recognition method for rivet lines based on Fisher discriminant criterion function
    2010, 30(4):  953-955. 
    Abstract | PDF (1072KB)
    A cluster-fusion recognition method for rivet lines was proposed based on Fisher discriminant criterion function. The edge points were extracted using Canny operator, and processed by morphology in order to get the location of the centroid. Then, any two centroids were combined to construct the collection of line segments which were classified by Fisher criterion. According to the principle that the distance from the segments on one rivet line to origin point is similar, the segment-vectors were fused to fit the rivet line. The experimental results show that its accuracy and robustness are improved, which can meet the requirements of real-time detection.
    Palmprint matching algorithm based on àtrous-Contourlet transform and invariant moments
    2010, 30(4):  956-959. 
    Abstract | PDF (926KB)
    In order to improve the speed and accuracy of palmprint identification systems and overcome the weakness of the Contourlet transform in dealing with high-dimensional signals, a novel palmprint feature identification algorithm was proposed. First, the à trous-Contourlet transform of the palmprint image is computed to obtain high-frequency and low-frequency coefficients in sub-bands of different directions. Then, feature weighting coefficients are chosen according to the statistical characteristics extracted from the different frequency domains, from which new invariant moment vectors are calculated. Finally, palmprint identification is carried out. The experimental results show that this method is more effective in matching than the wavelet moment algorithm, Hu's invariant moment algorithm and the Contourlet algorithm.
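    The invariant-moment vocabulary used here can be made concrete with the first two Hu moments, which are built from normalized central moments and are unchanged by translation, scaling and rotation. A NumPy sketch (the paper's weighting of moments computed from à trous-Contourlet sub-bands is not reproduced):

```python
import numpy as np

def hu_first_two(img):
    """First two Hu invariant moments of a grayscale image."""
    img = img.astype(float)
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    cx, cy = (xs * img).sum() / m00, (ys * img).sum() / m00

    def eta(p, q):  # normalized central moment
        mu = (((xs - cx) ** p) * ((ys - cy) ** q) * img).sum()
        return mu / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2
```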
    Application of post-processing based on HMM to video face recognition
    2010, 30(4):  960-963. 
    Abstract | PDF (1090KB)
    In this paper, the rarely considered problem of data source quality in face recognition was investigated, and a novel HMM-based post-processing solution was proposed. The data source problem was first investigated empirically by systematically evaluating the sensitivity of eigenfaces to variations of pose and illumination using the Lambertian reflection model and a 3D face model, which revealed that changes of pose and illumination abruptly degrade an eigenfaces system. This problem is explicitly termed the "data source disaster" to highlight its significance. To address it, combining the recognition rate with an analysis of the data sources, two methods were proposed to evaluate the overall performance of a specific face recognition approach while taking its robustness against low-quality data sources into account. Finally, a post-processing method was proposed to improve the robustness of the recognizer in unconstrained environments. The experimental results indicate the effectiveness of the proposed post-processing solution in tackling the "data source disaster" problem.
    Expression recognition based on 2D multi-scale block local Gabor binary patterns
    2010, 30(4):  964-966. 
    Abstract | PDF (824KB)
    In order to accomplish subject-independent facial expression recognition, a facial expression recognition approach based on 2D Multi-scale Block Local Gabor Binary Patterns (2D MB-LGBP) was presented. MB-LGBP features have been proved to be both locally and globally informative for expression recognition. This research combined the idea of MB-LGBP with the concept of Gray Level Co-occurrence Matrix (GLCM) to achieve the 2D MB-LGBP features, which can encode the local textures with structure information. In recognition, SVM classifier was utilized and its performance was compared with the traditional weighted Chi-square distance based paradigm. The experimental result proves the superiority of the 2D MB-LGBP composite features to MB-LGBP and some other popular features in expression recognition.
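    The GLCM component of the feature is simple to state: count how often gray level i co-occurs with gray level j at a fixed displacement, then normalize. A minimal sketch for an already-quantized image (the paper combines this idea with MB-LGBP code maps, which is not shown here):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray-level co-occurrence matrix for displacement (dx, dy)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()
```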
    Multi-pose face detection based on facial features and AdaBoost algorithm
    2010, 30(4):  967-970. 
    Abstract | PDF (964KB)
    An improved multi-pose face detection algorithm based on facial features and AdaBoost algorithm was proposed. Making full use of facial skin color information firstly, most of the background regions could be quickly excluded. After detecting eyes and mouth, the approximate frontal face candidate regions were segmented according to face orientation decided by the geometric features of the eyes and mouth regions. At last, the face candidate regions were classified by AdaBoost algorithm. The experimental results demonstrate that the algorithm can further improve the multi-pose face detection accuracy and is highly robust to facial expression and occlusion.
    Unmanned aerial vehicle landing point recognition based on natural landform image
    2010, 30(4):  971-973. 
    Abstract | PDF (780KB)
    Unmanned Aerial Vehicle (UAV) landing point recognition is an important application of image recognition algorithms. This paper put forward a landing point recognition algorithm based on natural landforms. The standard graphics library was first decomposed by Contourlet decomposition to obtain the low-frequency and band-pass subimages at different resolutions, and the Hu invariant moments were then extracted from each subimage. The feature recognition library was built by feature selection according to the recognition rates, and feature matching was done with the k-means method. Recognition experiments involving single grassland images and complex landform images in the test gallery show that the proposed algorithm is effective for image recognition.
    Lane mark identification method based on edge distribution function
    2010, 30(4):  974-976. 
    Abstract | PDF (746KB)
    In order to obtain ideal lane mark edges, an image preprocessing method based on the Edge Distribution Function (EDF) was proposed, considering the directional characteristics of lane marks. Images were divided into regions, and the following processing was applied to the regions after using EDF analysis on the noise characteristics of the images: first, by quantizing the gradient angle into four directions, edge images were obtained after removing noise inconsistent with the lane marks' direction; then the initial value of the lane mark angle was calculated by applying an EDF filter to the edge images; finally, the Hough transform was applied to localize the lane marks. The experimental results show that the method can reinforce lane mark information and effectively eliminate noise in images. The algorithm is characterized by robustness and real-time performance.
    Vehicle license plate character recognition based on relative entropy function criterion
    2010, 30(4):  977-979. 
    Abstract ( )   PDF (513KB) ( )  
    Related Articles | Metrics
    With the conventional gradient-based learning algorithm of the Wavelet Neural Network (WNN), the Mean Square Error (MSE) criterion may slow down convergence and trap the process in local minima. The entropy function criterion is superior to the MSE criterion and can improve the convergence rate. Thus, a novel vehicle license plate character recognition method using a WNN based on the relative entropy function criterion was proposed. Firstly, image preprocessing, including binarization and normalization, was performed on the character images. Then the invariant moments of each character image were extracted with the new invariant moment algorithm and taken as the feature vector. Finally, the optimized wavelet neural network was used to classify and recognize the targets. Computer simulation shows that this method achieves a good recognition effect.
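The entropy-versus-MSE argument can be made concrete with a toy gradient comparison (an illustration of the general principle, not the paper's WNN): for a sigmoid output unit with target d and output y, the MSE gradient carries a factor y(1-y) that vanishes when the unit saturates, while the entropy-criterion gradient stays proportional to the error.

```python
# Gradients w.r.t. the pre-activation z of a sigmoid output unit:
#   MSE:            dE/dz = (y - d) * y * (1 - y)   -- vanishes when y saturates
#   cross-entropy:  dE/dz = (y - d)                 -- proportional to the error
def grad_mse(y, d):
    return (y - d) * y * (1 - y)

def grad_entropy(y, d):
    return y - d

y, d = 0.999, 0.0                      # badly wrong but saturated output
assert abs(grad_mse(y, d)) < 1e-3      # almost no learning signal
assert grad_entropy(y, d) > 0.9        # strong learning signal
```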
    Improved Chinese chessboard recognition method
    2010, 30(4):  980-981. 
    Abstract ( )   PDF (478KB) ( )  
    Related Articles | Metrics
    Traditional Chinese chessboard recognition depends on character recognition alone, without using color information. A Chinese chessboard recognition method combining color recognition with character recognition was proposed. Color recognition is used to maintain a state matrix; after each move, the current board state can be obtained from the previous state matrix and the color information, and the state matrix is then updated. Character recognition serves as a supplement for the exceptional cases of color recognition. The proposed method improves the efficiency of chessboard recognition.
    Artificial intelligence
    Weighted qualitative probabilistic reasoner based on argument system
    2010, 30(4):  982-984. 
    Abstract ( )   PDF (621KB) ( )  
    Related Articles | Metrics
    The Qualitative Probabilistic Reasoner (QPR) is an important approach to reasoning under uncertainty. Two qualitative probabilistic reasoning methods, systems of argumentation and systems of abstraction, were united into one system, and a Weighted Qualitative Probabilistic Reasoner (WQPR) based on the argumentation system and QPR was presented. Firstly, a patch to the weighted qualitative probabilistic network was given and the symmetry property of weighted qualitative influences was discussed; secondly, inference with qualitative probabilistic influences was integrated into the argumentation system, a WQPR inference system with higher inference accuracy than QPR was presented, and the soundness and completeness of the WQPR system were proved.
    Attribute dependency theory and its application on neural network
    2010, 30(4):  985-989. 
    Abstract ( )   PDF (926KB) ( )  
    Related Articles | Metrics
    Neural network optimization methods are generally confined to learning algorithms and input attributes. Because the high-dimensional mapping that a neural network fits contains complex intrinsic attribute dependencies, traditional optimization methods have not studied it analytically. This article put forward an attribute dependency theory based on functional dependency theory, elaborated its definitions, and proved the related theorems. Combining it with the Radial Basis Function (RBF) neural network, a new neural network optimization method based on attribute dependency theory (ADO-RBF) was proposed.
    Decision level fusion algorithm with fuzzy densities determined by object
    2010, 30(4):  990-992. 
    Abstract ( )   PDF (734KB) ( )  
    Related Articles | Metrics
    The fuzzy densities of the fuzzy integral are usually determined by the prior static information of labeled samples and cannot be adjusted dynamically according to the recognition result of the specific object, which is unrealistic. To overcome this disadvantage, a decision-level fusion algorithm with a fuzzy integral whose fuzzy densities are determined dynamically by the object was presented. In this algorithm, the fuzzy densities are determined dynamically by combining the prior static information with the discriminability computed from the objective information given by the classifiers after recognizing the specific object. The algorithm was applied to facial expression recognition. The experimental results show that the proposed algorithm obtains a better fusion result and raises the accuracy of expression recognition.
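For readers unfamiliar with fuzzy-integral fusion, here is a minimal Sugeno-integral sketch (the measure values and classifier scores are invented, and the paper's dynamic density-adjustment scheme is not reproduced): classifier scores are sorted, and each prefix of classifiers is weighted by a fuzzy measure g on subsets.

```python
# Sugeno fuzzy integral of classifier scores under a fuzzy measure g.
def sugeno_integral(scores, g):
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    best = 0.0
    subset = frozenset()
    for i in order:                       # grow the coalition by descending score
        subset = subset | {i}
        best = max(best, min(scores[i], g[subset]))
    return best

g = {frozenset({0}): 0.4, frozenset({1}): 0.3, frozenset({2}): 0.2,
     frozenset({0, 1}): 0.6, frozenset({0, 2}): 0.5, frozenset({1, 2}): 0.4,
     frozenset({0, 1, 2}): 1.0}
scores = [0.9, 0.7, 0.2]                  # per-classifier support for one class
assert sugeno_integral(scores, g) == 0.6  # max(min(.9,.4), min(.7,.6), min(.2,1)) = 0.6
```

Making g depend on the object under recognition, rather than fixing it in advance, is the idea the abstract describes.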
    Feature selection method combining improved F-score and support vector machine
    2010, 30(4):  993-996. 
    Abstract ( )   PDF (660KB) ( )  
    Related Articles | Metrics
    The original F-score can only measure the discrimination between two sets of real numbers. This paper proposed an improved F-score that can measure the discrimination of two or more sets of real numbers. The improved F-score and Support Vector Machine (SVM) were combined to accomplish feature selection, where the improved F-score served as the evaluation criterion and SVM evaluated the features selected via the improved F-score. Experiments were conducted on six datasets from the UCI machine learning repository. The experimental results show that the feature selection method based on the improved F-score and SVM achieves high classification accuracy and good generalization, and requires less training time than the Principal Component Analysis (PCA)+SVM method.
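As background, here is a minimal sketch (assumed, with invented data) of the original two-class F-score of a single feature: between-class separation over within-class scatter. The paper's improvement generalizes this to more than two classes.

```python
# Two-class F-score of one feature: higher means more discriminative.
def f_score(pos, neg):
    n_pos, n_neg = len(pos), len(neg)
    mean_all = (sum(pos) + sum(neg)) / (n_pos + n_neg)
    mean_pos = sum(pos) / n_pos
    mean_neg = sum(neg) / n_neg
    between = (mean_pos - mean_all) ** 2 + (mean_neg - mean_all) ** 2
    within = (sum((x - mean_pos) ** 2 for x in pos) / (n_pos - 1)
              + sum((x - mean_neg) ** 2 for x in neg) / (n_neg - 1))
    return between / within

# A feature that separates the classes well scores higher.
good = f_score([5.0, 5.1, 4.9], [1.0, 1.1, 0.9])
poor = f_score([5.0, 1.0, 3.0], [4.9, 1.1, 3.1])
assert good > poor
```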
    Quick multi-objective evolutionary algorithm based on adaptive Pareto-ε dominance
    2010, 30(4):  997-999. 
    Abstract ( )   PDF (457KB) ( )  
    Related Articles | Metrics
    For Multi-objective Optimization Problems (MOP), it is very important to provide proper and feasible solutions to decision makers rapidly. A method for MOP was given. First, the concept of Pareto-ε dominance was defined. Then, based on this concept, a new adaptive multi-objective evolutionary algorithm was proposed. The simulation results demonstrate that the new algorithm improves the MOP optimization process and meets the requirements of speed and effectiveness in applications.
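The standard additive ε-dominance test underlying such algorithms can be sketched as follows (an illustration of the general concept, not the paper's adaptive variant): for minimization, a ε-dominates b if a is at least as good as b relaxed by ε in every objective, and strictly better in at least one.

```python
# Additive epsilon-dominance for minimization problems.
def eps_dominates(a, b, eps):
    return (all(ai <= bi + eps for ai, bi in zip(a, b))
            and any(ai < bi + eps for ai, bi in zip(a, b)))

# With eps = 0.1, a nearly-equal point is absorbed (thinning the archive),
# while a clearly worse point is still non-dominating.
assert eps_dominates((1.0, 2.0), (1.05, 2.05), eps=0.1)
assert not eps_dominates((3.0, 4.0), (1.0, 2.0), eps=0.1)
```

Adapting ε during the run, as the abstract suggests, trades archive resolution against convergence speed.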
    Optimal scheduling model for hub airport taxi based on improved ant colony collaborative algorithm
    2010, 30(4):  1000-1003. 
    Abstract ( )   PDF (874KB) ( )  
    Related Articles | Metrics
    Taxiways connect parking spots and runways and play an important part in airport surface scheduling. Under the constraints of aircraft taxi conflicts and dynamic allocation of runway resources, an improved ant colony collaborative algorithm with sliding-window control was adopted to optimize taxi scheduling. While keeping taxiing conflict-free and taking account of single-flight taxi time, the total taxi time of inbound and outbound flights was reduced. A simulation of taxi scheduling at a domestic hub airport shows that the proposed method and model have obvious advantages and can provide decision support for a hub airport's taxi scheduling.
    Nonlinear dimensionality reduction algorithm and application to hospital performance evaluation
    2010, 30(4):  1004-1007. 
    Abstract ( )   PDF (803KB) ( )  
    Related Articles | Metrics
    The manifold learning algorithm ISOMAP is sensitive to outliers. To solve this problem, this paper employed a distance measure based on shared nearest neighbors, making full use of the local density information of points on the manifold, which effectively improved the robustness of the algorithm. Meanwhile, the paper made a first attempt to apply the improved manifold learning algorithm to hospital performance evaluation. Experiments on artificial and real-world data show that the improved algorithm is robust and effective, and the application to performance evaluation is successful.
    Adaptive query scheduling for open online courses
    2010, 30(4):  1008-1010. 
    Abstract ( )   PDF (716KB) ( )  
    Related Articles | Metrics
    This paper proposed an adaptive query processing module aimed at developing a better mapping plan for queries, so as to meet users' demands under the limited server and bandwidth resources of an open online course system. Firstly, a query expected cost matrix was set up according to the performance of the resources and the cost of the query tasks; secondly, a new A-MM (Adaptive Min-Min and Max-Min) algorithm merging the merits of Min-Min and Max-Min was used for adaptive query scheduling; finally, experiments show that A-MM has higher efficiency and better load-balancing capacity.
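As background for A-MM, here is a minimal sketch (assumed; the cost matrix is invented) of the classical Min-Min heuristic it builds on: repeatedly schedule the task whose best completion time is smallest, onto that resource.

```python
# Min-Min scheduling: cost[t][r] is the expected cost of task t on resource r.
def min_min(cost):
    n_tasks, n_res = len(cost), len(cost[0])
    ready = [0.0] * n_res                 # when each resource becomes free
    unscheduled = set(range(n_tasks))
    plan = {}
    while unscheduled:
        # pick the (task, resource) pair with the smallest completion time
        finish, t, r = min((ready[r] + cost[t][r], t, r)
                           for t in unscheduled for r in range(n_res))
        plan[t] = r
        ready[r] = finish
        unscheduled.remove(t)
    return plan, max(ready)               # assignment and makespan

cost = [[3.0, 5.0],
        [2.0, 4.0],
        [6.0, 1.0]]
plan, makespan = min_min(cost)
assert plan[2] == 1                       # task 2 is far cheaper on resource 1
assert makespan == 5.0
```

Max-Min differs only in picking the task whose best completion time is largest; A-MM, per the abstract, switches adaptively between the two.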
    Research of sentiment classification for netnews comments by machine learning
    2010, 30(4):  1011-1014. 
    Abstract ( )   PDF (847KB) ( )  
    Related Articles | Metrics
    Netnews comments have become an important channel for common people to express personal opinions, and sentiment analysis can reveal the overall attitude of the public toward news events. This paper first summarized the characteristics of netnews comments, then constructed classifiers with different feature sets, feature dimensions, feature-weighting methods and parts of speech, and compared and analyzed the experimental results. The comparison shows that features combining sentiment words and argument words outperform those employing sentiment words only; moreover, the feature dimension has little influence on classification accuracy for this kind of data, and the TF-IDF feature-weighting method is still better than the Boolean method. As for part-of-speech selection, nouns and verbs as features obtain better performance than adjectives and adverbs.
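The TF-IDF weighting compared against the Boolean scheme can be sketched as follows (a minimal illustration with an invented toy corpus, not the paper's feature pipeline):

```python
import math

# TF-IDF weight of a term in one document of a tokenized corpus.
def tf_idf(term, doc, corpus):
    tf = doc.count(term) / len(doc)
    df = sum(1 for d in corpus if term in d)      # document frequency
    idf = math.log(len(corpus) / df)
    return tf * idf

corpus = [["good", "service", "good"],
          ["bad", "service"],
          ["good", "price"]]
# "service" appears in 2 of 3 comments, "bad" in only 1, so "bad" gets
# the higher weight in the second comment -- unlike Boolean weighting,
# which would score both as 1.
w_bad = tf_idf("bad", corpus[1], corpus)
w_service = tf_idf("service", corpus[1], corpus)
assert w_bad > w_service
```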
    Semi-supervised short text categorization based on attribute selection
    2010, 30(4):  1015-1018. 
    Abstract ( )   PDF (730KB) ( )  
    Related Articles | Metrics
    In order to solve the data scarcity problem in massive short text categorization, a semi-supervised short text categorization method based on attribute selection was presented. An attribute selection algorithm based on ReliefF and independence measures was used to overcome the limitation of the attribute independence assumption by deleting irrelevant or redundant attributes, and an ensemble algorithm based on Expectation-Maximization (EM) was used to resolve the sensitivity to initial values of the semi-supervised EM algorithm. Experiments on real corpora show that the proposed method utilizes unlabeled examples more effectively and stably to improve classification generalization.
    Effective branch algorithm for integer linear programming problems
    2010, 30(4):  1019-1021. 
    Abstract ( )   PDF (730KB) ( )  
    Related Articles | Metrics
    In order to improve computational efficiency, a branch algorithm for general integer linear programming problems based on shifting the objective function hyperplane was presented. First, for a given integer value of the objective function, the lower and upper bounds of the variables were determined from the optimal simplex tableau of the linear programming relaxation. Then the conditions on these bounds were added to the constraints as cuts on the associated objective function hyperplane. Finally, the branch procedure of the branch-and-bound algorithm was applied to find a feasible solution on the objective function hyperplane. Computational tests on some classical numerical examples show that, compared with the classical branch-and-bound principle, the algorithm greatly decreases the number of branches and iterations, and is therefore of practical value.
    Towards semi-supervised ordinal regression with nearest neighbor
    2010, 30(4):  1022-1025. 
    Abstract ( )   PDF (711KB) ( )  
    Related Articles | Metrics
    Supervised ordinal regression algorithms often require a large number of labeled samples. However, in real applications, labeling instances is time- and labor-consuming, and sometimes even unrealistic. Therefore, a semi-supervised ordinal regression algorithm that learns from both labeled and unlabeled examples was proposed. The method begins by choosing the instances from the unlabeled dataset that are most similar to a labeled example and assigning them the corresponding rank; at this stage, the nearest neighbor rule is applied to score the similarity of two instances. Then, by supervised ordinal regression, the ranking model is trained from both the labeled and the newly labeled examples. The experimental results show that this method produces statistically significant improvements with respect to ranking measures. In addition, a discount factor λ was introduced to evaluate the credibility of the newly labeled examples, and the effects of λ and the size of the labeled dataset on performance were discussed.
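The nearest-neighbor pseudo-labeling step can be sketched as follows (a minimal illustration with invented data; the discount factor `lam` stands in for the paper's λ and simply tags each new example with a credibility weight):

```python
# Each unlabeled instance inherits the rank of its nearest labeled example,
# weighted by a credibility factor lam.
def nn_pseudo_label(labeled, unlabeled, lam=0.8):
    out = []
    for u in unlabeled:
        _, rank = min(labeled,
                      key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], u)))
        out.append((u, rank, lam))        # (instance, inherited rank, weight)
    return out

labeled = [((0.0, 0.0), 1), ((5.0, 5.0), 3)]
unlabeled = [(0.2, 0.1), (4.8, 5.1)]
pseudo = nn_pseudo_label(labeled, unlabeled)
assert pseudo[0][1] == 1 and pseudo[1][1] == 3
```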
    Information security
    Inherent relationship between the mode periodicities of cat map and Fibonacci series
    2010, 30(4):  1026-1029. 
    Abstract ( )   PDF (1282KB) ( )  
    Related Articles | Metrics
    The periodicity theorems for the modular sequences of the Fibonacci series were first presented, and the periodicity property of the Fibonacci Q-transform matrix was then generalized. The relationship between the period of the Fibonacci Q-matrix and that of the cat map was discussed, and the inherent relationship between the period of the cat map and that of sequences generated from the Fibonacci series was disclosed. The best transform period was then defined, and the best periodicity theorem for the cat map was given. Finally, the theory was verified by several simulation experiments of cat-map image scrambling, which provides a mathematical foundation for image scrambling.
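The central quantity here — the period of the Fibonacci sequence modulo n, known as the Pisano period — can be computed directly (a minimal sketch, not the paper's derivation):

```python
# Pisano period: smallest k with (F_k, F_{k+1}) == (0, 1) mod n.
def pisano_period(n):
    a, b, k = 0, 1, 0
    while True:
        a, b = b, (a + b) % n
        k += 1
        if (a, b) == (0, 1):              # the state pair recurs: period found
            return k

assert pisano_period(2) == 3    # 0, 1, 1 | 0, 1, 1 | ...
assert pisano_period(10) == 60  # last decimal digits repeat every 60 terms
```

Relating this period to the period of the cat map on an n×n image is the inherent relationship the abstract describes.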
    Analysis and improvement of proxy signature schemes over braid group
    2010, 30(4):  1030-1032. 
    Abstract ( )   PDF (516KB) ( )  
    Related Articles | Metrics
    Analysis shows that two proxy signature schemes based on braid groups are insecure: the first scheme cannot resist the original signer's change attack; in the second, any adversary can forge a valid proxy signature for any designated proxy signer and message. A new proxy signature scheme was then proposed. Analysis shows that the proposed scheme satisfies all security requirements; moreover, no secure channel is needed for communication between the original signer and the proxy signer.
    Security protocol design by composition method
    2010, 30(4):  1033-1037. 
    Abstract ( )   PDF (1081KB) ( )  
    Related Articles | Metrics
    Since present design methods for security protocols are abstract, narrow in application range and complex, this paper presented a new approach to security protocol design. Firstly, it defined the concepts of the base case and the component in a protocol. Secondly, it analyzed the security attributes of the components and designed single-step protocols that implement specific security goals based on the components. Finally, it defined composition rules allowing several single-step protocols to be combined into a complicated protocol without destroying the security properties established by each independent part. A security protocol can then be designed by choosing and composing single-step protocols for a specific application situation. In other words, the composition framework permits the specification of a complex protocol to be decomposed into the specifications of simpler single-step protocols, thus making the design and verification of the protocol easier to handle.
    Robust watermarking scheme based on fractal and pseudo-Zernike moment
    2010, 30(4):  1038-1041. 
    Abstract ( )   PDF (810KB) ( )  
    Related Articles | Metrics
    A new image watermarking scheme robust to geometric attacks was proposed based on fractal coding and pseudo-Zernike moments. Firstly, the original image was divided into two groups, self-similar blocks and non-self-similar blocks, by fractal coding with a preset threshold. Then, the most robust pseudo-Zernike moment was picked out among the pseudo-Zernike moments of the self-similar blocks. Finally, the watermark was embedded by quantizing the magnitude of the most robust pseudo-Zernike moment. The experimental results show that the scheme is not only invisible and robust against common signal processing such as median filtering, sharpening, noise addition and JPEG compression, but also robust against geometric attacks such as affine transforms and local geometric distortion.
    Identical base construction attack on digital signature scheme
    2010, 30(4):  1042-1044. 
    Abstract ( )   PDF (541KB) ( )  
    Related Articles | Metrics
    This paper studied many digital signature schemes and found them insecure due to the irrationality of their signature factors or of the whole scheme, which enables attackers to transform the signature verification equation into an equation with the same base on both sides and to forge signatures easily by equating the two exponents. The paper proposed a new concept, the identical base construction attack, and explicitly indicated how such defects can be avoided when designing digital signatures. Meanwhile, four examples were given to illustrate the insecurity in signature design. Finally, some general ways to improve these signature schemes were provided.
    Pervasive computing access control model based on extended RBAC
    2010, 30(4):  1045-1047. 
    Abstract ( )   PDF (772KB) ( )  
    Related Articles | Metrics
    Concerning the need for dynamic object management in pervasive computing access control and the shortcomings of existing Role-Based Access Control (RBAC), this paper presented an extended RBAC model. An associated object was introduced into the model, so that permissions can be obtained through both roles and objects. The access control processes were described with description logic. The model considers authorization from the point of view of the object, and resolves the problems of excessive roles and inflexible authorization inherent in purely role-based decisions.
    Hierarchical method to analyze malware behavior
    2010, 30(4):  1048-1052. 
    Abstract ( )   PDF (930KB) ( )  
    Related Articles | Metrics
    This paper proposed a hierarchical method to analyze malware behavior, which first obtains behavior information from the system call sequence at run time, then analyzes the behavioral intentions and assesses their hazards. For behavior detection, a detection algorithm was designed that uses system calls and their arguments to identify program behavior. For behavior analysis, an evaluation model of the harm of malicious actions was established by summarizing various malicious actions and their harm to a computer system, and a method was given to evaluate the harm of the code.
    Security communication system based on short message service
    2010, 30(4):  1053-1055. 
    Abstract ( )   PDF (898KB) ( )  
    Related Articles | Metrics
    Concerning the insecure factors of the Short Message Service (SMS) in public mobile networks, an SMS system for secure communication was put forward. From the viewpoints of real-time protection, active protection and management-control ability, the features and architecture of the system were introduced, the security policy of the system was described in detail, and finally the scheme was implemented.
    Use of picture log information in improving session identification quality
    2010, 30(4):  1056-1058. 
    Abstract ( )   PDF (625KB) ( )  
    Related Articles | Metrics
    Data preprocessing is the basis of Web log mining, and session identification is a key step in preprocessing, so session identification quality seriously influences the mining results. The paper analyzed current session identification methods and proposed to improve session identification quality by using the picture log data usually discarded in preprocessing. With reference to an expanded Web graph structure, the improvement was made in two aspects: page grouping rules and the path completion algorithm. Experiments prove that the method effectively improves session identification quality.
    Three-dimensional encryption algorithm for two-dimensional information
    2010, 30(4):  1059-1063. 
    Abstract ( )   PDF (926KB) ( )  
    Related Articles | Metrics
    Based on the principle of the chess game, a new encryption algorithm was developed. Before encryption, all the original information is arranged in two dimensions to establish the original information tables. During encryption, the 2D coordinate components of each plaintext symbol are looked up in the tables, and this two-dimensional information is stored, according to certain rules, into a corresponding three-dimensional table to form the ciphertext. The encrypted information can be deciphered according to the order in which the information was stored. The algorithm establishes two sets of cipher codes that are completely independent of each other. Taking an original information table described by 7 decimal digits and a 7×7×2 three-dimensional table as an example, the algorithm was illustrated in detail. The algorithm processes the information two-dimensionally for the first time; its principle is easy to understand, its forms are diversified, and it offers fast, low-cost computation, high security and a large key space.
    Database and knowledge engineering
    Access methods in moving objects databases
    2010, 30(4):  1064-1067. 
    Abstract ( )   PDF (1111KB) ( )  
    Related Articles | Metrics
    This paper summarized the access methods for moving objects. Firstly, moving object indexes were classified according to the underlying structure: indexes in unconstrained space and indexes in network space. Then an analysis of indexing for past, current and future locations was presented. Finally, future research directions of moving object indexing were discussed.
    Early abandoning to accelerate exact warping matching of time series
    2010, 30(4):  1068-1071. 
    Abstract ( )   PDF (711KB) ( )  
    Related Articles | Metrics
    Dynamic Time Warping (DTW) is one of the important distance measures for similarity search over time series; however, its exact calculation is a bottleneck. An approach named EA_DTW was proposed, which checks whether the value of a cell in the cumulative distance matrix exceeds the threshold and, if so, terminates the calculation of the related cells. A theoretical analysis of the EA_DTW process was given. The empirical results show that EA_DTW outperforms the plain dynamic-programming DTW calculation in computation time, and is much better when the threshold is below the true DTW distance.
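The early-abandoning idea can be sketched row by row (a minimal illustration of the general technique, not necessarily the paper's exact cell-pruning rule): since cumulative costs only grow, once every cell of a row exceeds the threshold the final distance must too, and the computation can stop.

```python
# DTW with row-wise early abandoning against a distance threshold.
INF = float("inf")

def dtw_early_abandon(a, b, threshold):
    n, m = len(a), len(b)
    prev = [INF] * (m + 1)
    prev[0] = 0.0
    for i in range(1, n + 1):
        cur = [INF] * (m + 1)
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cur[j] = d + min(prev[j], cur[j - 1], prev[j - 1])
        if min(cur) > threshold:
            return INF                    # abandoned: true DTW > threshold
        prev = cur
    return prev[m]

assert dtw_early_abandon([1, 2, 3], [1, 2, 3], 10.0) == 0.0
assert dtw_early_abandon([1, 2, 3], [9, 9, 9], 5.0) == INF   # abandoned early
```

In a nearest-neighbor search, the threshold is the best distance found so far, so most candidates are abandoned after a few rows.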
    Multi-dimensional concept lattice and association rules discovery
    2010, 30(4):  1072-1075. 
    Abstract ( )   PDF (689KB) ( )  
    Related Articles | Metrics
    Based on the description of different dimensions of concept intents using multi-dimensional data sequences, a formal definition and a construction method of the multi-dimensional concept lattice were proposed. An association rules discovery method based on the multi-dimensional concept lattice was also given, which studies the dependence relations among attributes of different dimensions through discovering the largest frequent multi-dimensional data sequences. Examples show that the multi-dimensional concept lattice helps to discover richer useful information.
    User context based collaborative filtering recommendation
    2010, 30(4):  1076-1078. 
    Abstract ( )   PDF (725KB) ( )  
    Related Articles | Metrics
    In order to improve the prediction accuracy of the item-based collaborative filtering recommendation algorithm, a user context factor was introduced. Firstly, the dissimilarity matrix of the user context factor was calculated. Then clustering based on the equivalent dissimilarity matrix was adopted to group users by the dissimilarity values between them. After clustering, the items with the smallest dissimilarity to the target item within each user group were chosen as its neighbors, and these neighbors were used to predict the user's rating of the target item. Finally, an experiment on the MovieLens dataset compared the presented approach with the typical item-based Slope One algorithm. The experimental results suggest that this approach performs better than Slope One.
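For reference, the weighted Slope One baseline compared against can be sketched as follows (a minimal illustration with an invented ratings dictionary, not the paper's context-aware variant):

```python
# Weighted Slope One: predict a rating from average item-pair deviations.
def slope_one_predict(ratings, user, target):
    num, den = 0.0, 0
    for other, r in ratings[user].items():
        if other == target:
            continue
        # average deviation (target - other) over users who rated both items
        diffs = [ratings[u][target] - ratings[u][other]
                 for u in ratings
                 if target in ratings[u] and other in ratings[u]]
        if diffs:
            num += (r + sum(diffs) / len(diffs)) * len(diffs)
            den += len(diffs)
    return num / den

ratings = {"alice": {"A": 4, "B": 3},
           "bob":   {"A": 5, "B": 4, "C": 3},
           "carol": {"B": 2}}
# Carol's predicted rating of A: her B rating plus the average B->A deviation (+1).
assert slope_one_predict(ratings, "carol", "A") == 3.0
```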
    Research of matrix sparsity for collaborative filtering
    2010, 30(4):  1079-1082. 
    Abstract ( )   PDF (640KB) ( )  
    Related Articles | Metrics
    This paper applied singular value decomposition (SVD) to predict the missing data. An enhanced, parameterized Pearson correlation coefficient algorithm was introduced to increase the accuracy of computing the similarity of users and items. Finally, a new algorithm called HybridSVD, based on singular value decomposition and the novel similarity model, was explored. In the experiments, the authors evaluated the new algorithm on the MovieLens dataset, and the results suggest that it better handles the matrix sparsity problem.
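One common way to parameterize the Pearson coefficient for sparse data — sketched here as an assumption, since the abstract does not specify the paper's exact formula — is to shrink it by the number of co-rated items, damping similarities supported by little evidence:

```python
# Pearson correlation with a shrinkage parameter beta for sparse data.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def shrunk_pearson(xs, ys, beta=10):
    n = len(xs)                           # number of co-rated items
    return (n / (n + beta)) * pearson(xs, ys)

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
assert abs(pearson(xs, ys) - 1.0) < 1e-12
assert shrunk_pearson(xs, ys) < pearson(xs, ys)   # few co-ratings -> damped
```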
    Research of matching strategy of semantic Web services discovery
    2010, 30(4):  1083-1085. 
    Abstract ( )   PDF (552KB) ( )  
    Related Articles | Metrics
    By describing Web services with OWL-S, semantic Web services, created by integrating the semantic Web with Web services, enable the searching and matching of Web services based on their semantics. A new matching strategy derived from matching the parameters of single Web services was proposed, together with its implementation. Compared with UDDI's keyword-based matching strategy, matching by semantics can better meet users' potential requirements.
    New algorithm for computing attribute core in decision tables
    FENG Lin
    2010, 30(4):  1086-1088. 
    Abstract ( )   PDF (440KB) ( )  
    Related Articles | Metrics
    Attribute reduction is one of the key problems in rough set theory, and determining the core attributes is the basis for solving it. First, using a tree-structured knowledge representation of the decision table, an approach for computing the positive and negative regions was introduced. Next, according to the changes of the positive and negative regions in the tree-structured decision table relative to the conditional attribute set, an algorithm for computing the core attributes was developed. The efficiency of the proposed methods was illustrated by time and space complexity analysis and by experimental results on a weather decision table.
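The classical definition the algorithm accelerates can be sketched directly (a minimal illustration with an invented weather-style table, not the paper's tree-structured method): an attribute is in the core iff removing it shrinks the positive region of the decision table.

```python
# Rough-set core via positive regions.
def partition(rows, attrs):
    blocks = {}
    for i, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return blocks.values()

def positive_region(rows, attrs, decision):
    pos = set()
    for block in partition(rows, attrs):
        if len({rows[i][decision] for i in block}) == 1:   # consistent block
            pos |= set(block)
    return pos

def core(rows, attrs, decision):
    full = positive_region(rows, attrs, decision)
    return {a for a in attrs
            if positive_region(rows, [b for b in attrs if b != a], decision) != full}

# Columns: (outlook, windy, play); dropping either condition loses consistency.
rows = [("sunny", 0, "no"), ("sunny", 1, "no"),
        ("rainy", 0, "yes"), ("rainy", 1, "no")]
assert core(rows, [0, 1], 2) == {0, 1}
```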
    Data mining method based on adaptive intuitionistic fuzzy inference
    2010, 30(4):  1089-1092. 
    Abstract ( )   PDF (770KB) ( )  
    Related Articles | Metrics
    A new method integrating intuitionistic fuzzy sets and neural network theory was proposed for the data mining problem. The problem was solved with adaptive intuitionistic fuzzy inference, which adjusts network variables through the adaptive learning function of intuitionistic neural networks and generates the rule bank automatically. Finally, the method was validated by simulation.
    Density grid-based data stream clustering algorithm over sliding window
    2010, 30(4):  1093-1095. 
    Abstract ( )   PDF (513KB) ( )  
    Related Articles | Metrics
    This paper introduced a density grid-based data stream clustering algorithm. Through the introduction of the "subject degree", the traditional density grid-based clustering algorithm for data streams was improved by taking the data points within a grid cell into the grid density, thereby resolving the problem of data points in one cell belonging to two classes, as well as the treatment of boundary points. Thus, not only is the high efficiency of grid-based algorithms retained, but the clustering accuracy is also raised significantly.
    Progressive goal searching of multimedia data based on content
    2010, 30(4):  1096-1098. 
    Abstract ( )   PDF (782KB) ( )  
    Related Articles | Metrics
    In order to improve the multimedia querying ability of mobile terminals, a new querying method, progressive goal searching based on content, was presented, together with a query streaming technique. Combining the Multimedia Query Format (MQF) with XML request snippets and update snippets allows users to query media information step by step: first the descriptions of the relevant metadata are queried, and then the results themselves. This method reduces the amount of query data transferred, making it suitable for low-configuration mobile terminals on limited channels.
    Research and design of querying approach for XML encrypted data
    2010, 30(4):  1099-1102. 
    Abstract ( )   PDF (843KB) ( )  
    Related Articles | Metrics
    The structural characteristics of XML database documents were fully exploited and combined with Dewey coding theory to design a query algorithm over encrypted XML, called the Index Layered Intersection Scan Algorithm (ILISA), under the Database As a Service (DAS) model. Data retrieval over the tree structure was turned into retrieval over a sequential list. The authors applied the interpolation search algorithm instead of depth-first or breadth-first search, which yields a good time complexity, and designed an XML index table data structure to greatly reduce the search space. Finally, the complexity of ILISA was analyzed and its efficiency was demonstrated.
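The search primitive substituted for tree traversal can be sketched as follows (a minimal, generic interpolation search over sorted keys; the encrypted-index details of ILISA are not reproduced):

```python
# Interpolation search: probe where the target "should" sit, assuming the
# sorted keys are roughly uniformly distributed (O(log log n) probes then).
def interpolation_search(keys, target):
    lo, hi = 0, len(keys) - 1
    while lo <= hi and keys[lo] <= target <= keys[hi]:
        if keys[hi] == keys[lo]:
            pos = lo
        else:
            pos = lo + (target - keys[lo]) * (hi - lo) // (keys[hi] - keys[lo])
        if keys[pos] == target:
            return pos
        if keys[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1

keys = list(range(0, 1000, 7))            # 0, 7, 14, ...
assert interpolation_search(keys, 77) == 11
assert interpolation_search(keys, 78) == -1
```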
    Design of spatial index structure based on RBD-tree
    2010, 30(4):  1103-1106. 
    Abstract ( )   PDF (763KB) ( )  
    Related Articles | Metrics
    RBD-tree, a new variant index structure derived from the Ro-tree, was introduced in this paper. The RBD-tree is an index structure based on node density, which is an important measure of node quality. Its core idea is to organize nodes with similar features together; in reality, such nodes are often adjacent to each other. Therefore, the RBD-tree greatly improves spatial query efficiency, and the optimization of the index structure is independent of the storage devices.
    Typical applications
    Node analysis and optimization strategy for regional traffic network system
    2010, 30(4):  1107-1109. 
    Abstract ( )   PDF (771KB) ( )  
    Related Articles | Metrics
    Generally speaking, the key junction nodes of typical urban regional traffic network control systems such as SCOOT and SCATS are defined according to traffic flow and the number of connected links. However, once a traffic incident strikes a hub node, it is difficult to ensure the effectiveness and stability of the whole traffic network. Considering the scale-free features of urban traffic networks, this paper discussed how to select hub nodes in a weighted complex traffic network, and adopted a weighted node contraction method, with network agglomeration as the index, to evaluate node importance. The effectiveness of the method was proved through an application case, and optimization strategies for the SCATS system based on hub node selection were then provided.
    Design and implementation of data exchanging middleware with publish/subscribe model
    2010, 30(4):  1110-1113. 
    Abstract ( )   PDF (682KB) ( )  
    Related Articles | Metrics
    In order to satisfy the demands of data publish/subscribe, real-time performance and complexity in present data exchange, a data exchange model based on task priority was designed. Using this model, the core modules of the TopTang Software (TTS) exchange platform were developed. This paper proposed the framework of the TTS platform, discussed the whole process of dataset exchange, and then illustrated the implementation of the core modules, including the task pumper and the thread pool. A practical application test demonstrates that TTS can satisfy the publish/subscribe and real-time requirements.
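The task-pumper / thread-pool split can be sketched with Python's standard `queue.PriorityQueue`: tasks are pumped into a priority queue and a pool of worker threads delivers them, most urgent first. This is a generic illustration, not the TTS implementation; the class and method names are invented:

```python
import queue
import threading

class PriorityExchanger:
    """Toy priority-driven exchange pool: submitted tasks go into a
    priority queue (lower number = more urgent) and worker threads
    deliver them. Names here are illustrative only."""

    def __init__(self, workers=1):
        self.tasks = queue.PriorityQueue()
        self.delivered = []
        self.lock = threading.Lock()
        self.threads = [threading.Thread(target=self._worker, daemon=True)
                        for _ in range(workers)]

    def submit(self, priority, name):
        # Tuple ordering makes the queue pop the smallest priority first.
        self.tasks.put((priority, name))

    def start(self):
        for t in self.threads:
            t.start()

    def _worker(self):
        while True:
            priority, name = self.tasks.get()
            with self.lock:              # record the "exchange"
                self.delivered.append(name)
            self.tasks.task_done()

    def drain(self):
        self.tasks.join()                # block until all tasks handled
        return list(self.delivered)
```

Submitting before `start()` lets a single worker drain strictly in priority order; with several workers, priority ordering is only approximate, which is the usual trade-off in such pools.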
    Phase emendation of high-throughput genome sequencing
    2010, 30(4):  1114-1116. 
    Abstract ( )   PDF (555KB) ( )  
    Related Articles | Metrics
    High-throughput genome sequencing has a phase problem, i.e. "exceeding" or "delaying", which appears in the order in which bases are synthesized. Combined with a Markov process, a regression analysis method was proposed. Based on constructed fluorescence intensity data correlated with the phase "exceeding" and "delaying" problems, a phase emendation matrix was set up to resolve the phase problem. The experimental results show that the method can eliminate phase "exceeding" or "delaying".
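The emendation matrix idea can be sketched as a linear phasing model: at each cycle, most of the fluorescence signal stays in phase, while a small fraction leaks ahead ("exceeding") or stays behind ("delaying"), and inverting that matrix recovers the true signal. This is a simplified stand-in for the paper's Markov/regression construction, and the leak fractions `p_lead` and `p_lag` are invented illustrative parameters:

```python
import numpy as np

def phasing_matrix(n, p_lead, p_lag):
    """Tridiagonal phasing model: observed intensity at cycle i mixes
    the in-phase signal with a p_lead fraction of the next base and a
    p_lag fraction of the previous one."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 1.0 - p_lead - p_lag
        if i + 1 < n:
            P[i, i + 1] = p_lead    # "exceeding": reading ahead
        if i - 1 >= 0:
            P[i, i - 1] = p_lag     # "delaying": reading behind
    return P

def correct_phase(observed, p_lead, p_lag):
    """Undo the phase mixing by solving the linear system P @ s = o."""
    P = phasing_matrix(len(observed), p_lead, p_lag)
    return np.linalg.solve(P, np.asarray(observed, dtype=float))
```

Because the matrix is strongly diagonally dominant for realistic small leak fractions, the inversion is well conditioned and the blurred signal is recovered almost exactly.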
    Short-term traffic flow forecasting model under missing data
    2010, 30(4):  1117-1120. 
    Abstract ( )   PDF (928KB) ( )  
    Related Articles | Metrics
    In view of the missing data issue in traffic detection, this paper proposed a new short-term traffic flow composite forecasting model. The model adopted a reconstruction method to solve the missing data problem, and used improved Kalman smoothing to implement short-term traffic flow forecasting. The model resolved the defect of traditional forecasting methods, which cannot deal with missing data, while still attaining high forecasting precision. Validation on Shenzhen data and comparison with the traditional methods prove that the new method has high forecasting precision, with accuracy maintained at 88% or more, and good practicality.
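A minimal 1-D Kalman filter shows how such a forecasting recursion can ride over missing detector readings: when a reading is absent, the filter keeps its prediction and only lets the variance grow. This is a generic sketch, not the paper's reconstruction-plus-improved-smoothing model, and the noise variances `q` and `r` are illustrative:

```python
def kalman_filter(observations, q=1.0, r=4.0):
    """1-D random-walk Kalman filter over a traffic-flow series.

    None entries mark missing detector readings: the update step is
    skipped, so the estimate coasts on the last prediction while its
    uncertainty grows by the process noise q each step.
    """
    x, p = 0.0, 1e6            # diffuse initial state and variance
    estimates = []
    for z in observations:
        p = p + q              # predict: uncertainty grows
        if z is not None:      # update only when a reading exists
            k = p / (p + r)    # Kalman gain
            x = x + k * (z - x)
            p = (1 - k) * p
        estimates.append(x)
    return estimates
```

Each gap therefore yields a usable forecast, and as soon as real readings resume, the enlarged variance makes the filter weight them more heavily, pulling the estimate back on track.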
    Dictionary-oriented quality checking tool on navigation map
    2010, 30(4):  1121-1124. 
    Abstract ( )   PDF (799KB) ( )  
    Related Articles | Metrics
    This paper presented detailed innovative ways to develop a Dictionary-oriented Checking Tool (DCT) for the production of navigation data. To achieve the data integrity of the navigation data model and provide navigation data with stronger qualifications, DCT was driven by a checking rule database built around its practical contributions. Firstly, it dug out and categorized ad-hoc checking items into three main checking topics, namely base, relation and navigation application, each subject to reasonable and clear definitions of checking logic and sharing the same syntax specification. Secondly, it defined different rule items according to a rule item prototype and grammar, and set up an effective rule database. Thirdly, it defined the system structure and class diagram realization based on a connection pool and the template design pattern. Finally, with a detailed evaluation of sample data, this paper concludes that checking logic abstraction and software reuse techniques are important to the development of checking software.
    Electrocardiogram classification using combined classifiers
    2010, 30(4):  1125-1128. 
    Abstract ( )   PDF (682KB) ( )  
    Related Articles | Metrics
    Electrocardiogram is an important approach to diagnosing cardiovascular disease. This paper put forward a new classifier that combined two classifiers, a Bayes classifier and a Support Vector Machine (SVM) classifier, and diagnosed five types of cardiovascular diseases using this new approach. Experiments on the MIT-BIH arrhythmia database show that the accuracy of the combined classifier is higher than that of the Bayes classifier and the SVM classifier used separately.
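One way to fuse two base classifiers is a confidence rule: each model returns a (label, confidence) pair, agreeing votes pass through, and the more confident vote wins otherwise. The abstract does not specify the paper's actual fusion rule, so this is an assumption; the sketch pairs a tiny Gaussian naive Bayes with that rule, and the SVM side is abstracted as any (label, confidence) predictor:

```python
import math

def gaussian_nb_fit(X, y):
    """Per-class mean, variance and prior for a tiny Gaussian naive Bayes."""
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    model = {}
    for label, rows in by_class.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        varis = [max(sum((v - m) ** 2 for v in col) / n, 1e-6)
                 for col, m in zip(zip(*rows), means)]
        model[label] = (means, varis, n / len(y))
    return model

def gaussian_nb_predict(model, x):
    """Return (best_label, posterior confidence in [0, 1])."""
    logs = {}
    for label, (means, varis, prior) in model.items():
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        logs[label] = lp
    mx = max(logs.values())
    total = sum(math.exp(l - mx) for l in logs.values())
    best = max(logs, key=logs.get)
    return best, math.exp(logs[best] - mx) / total

def combine(pred_a, pred_b):
    """Confidence-based fusion of two (label, confidence) votes:
    when the base classifiers disagree, the more confident one wins."""
    return pred_a[0] if pred_a[1] >= pred_b[1] else pred_b[0]
```

The fusion step only needs the per-sample confidences, so any pair of probabilistic or margin-based classifiers can be plugged in on either side.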
    Image fire detection algorithm based on support vector machine
    2010, 30(4):  1129-1131. 
    Abstract ( )   PDF (902KB) ( )  
    Related Articles | Metrics
    Concerning the shortcomings of traditional fire detection, an image fire detection algorithm based on Support Vector Machine (SVM) was presented and compared with image fire detection based on neural networks. The presented algorithm overcame the disadvantages of neural networks, such as over-learning and being easily trapped in local minima, and reduced the complexity of running many experiments and statistical analyses to obtain recognition thresholds. The experimental results show that the image fire detection algorithm based on SVM has higher accuracy, and that it effectively handles recognition problems with small samples, high dimensionality and nonlinearity.
    Method for detecting changed geographical information based on information retrieval of Web pages
    2010, 30(4):  1132-1134. 
    Abstract ( )   PDF (573KB) ( )  
    Related Articles | Metrics
    To address the difficulty of detecting frequent changes of geographic information, a method for monitoring changed geographic information based on information retrieval was presented. By designing the search conditions, estimating the reliability ranking, and analyzing all results through statistics and spatial analysis, geographic information monitoring based on information retrieval was realized. Finally, a system for monitoring geographic information change in the Hangzhou area was developed, which verified the feasibility and validity of this method and provided a new way to detect changed geographic information.
    Objective evaluation of Mandarin initials
    2010, 30(4):  1135-1140. 
    Abstract ( )   PDF (991KB) ( )  
    Related Articles | Metrics
    After searching for and analyzing the error forms of spoken initials, a two-time-two-level objective evaluation algorithm for initials was advanced on the basis of phonetic knowledge. Ninety-eight combinations of initial and final were summarized as the basic elements of initial objective evaluation. The experiment has shown that the accuracy of the two-time-two-level algorithm is 2.56% higher than that of the Hidden Markov Model (HMM) algorithm, 3.65% higher than that of the BP neural network algorithm, and 1.42% higher than that of the single-time-two-level algorithm. The results prove that the calculation amount decreases while the accuracy increases.
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803
  028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn