Table of Contents

    01 December 2009, Volume 29 Issue 12
    Information security
    Survey on storage-oriented regular expressions matching algorithms
    2009, 29(12):  3171-3173. 
    Regular expression matching is a key technology in current deep packet inspection. The basic ideas and methodology of storage-oriented regular expression matching were introduced. The paper discussed the classification of the algorithms, compared the main ones, and identified many features of regular expression syntax that influence the algorithms. Finally, the key techniques and difficulties were explored, and suggestions were made for future algorithm design.
    Exploration of key points for attacks on the MD5 algorithm
    2009, 29(12):  3174-3177. 
    Based on the structural characteristics of the MD5 algorithm, the authors summarized the key points of breaking the hash function MD5: the introduction of the message differential, the control of the differential path, and the satisfaction of the sufficient conditions. In the process of breaking MD5, three kinds of differences and the properties of the non-linear functions were introduced, and the extended form of the signed difference and the effect of the left-shift rotation were applied. The important attack technique known as message modification was analyzed in detail with an example. In general, the authors explored the key points of breaking the MD5 algorithm from both overall analysis and specific practice.
    Information security risk assessment method based on extensible set
    2009, 29(12):  3178-3181. 
    In the process of information security risk assessment, the relationships between risk elements are complex and risk evaluation factors are difficult to measure accurately. The paper proposed a risk assessment model that organized risk elements around threats, together with a risk evaluation method based on the extensible set. The model presented a hierarchical structure of system risk, in which the possibility and consequences of a threat were evaluated through three risk factors: asset, vulnerability and control measure. Based on this model, the extensible set method translated qualitative judgments into quantitative results by mapping qualitative expressions to intervals and using interval dependent functions, and then made a qualitative judgment from the quantitative risk-correlation vector; it could therefore combine quantitative and qualitative methods to evaluate system risk. A specific example illustrates that the method is feasible and effective.
    Steganography algorithm of image in multiwavelet domain based on grey relational analysis
    2009, 29(12):  3182-3184. 
    Adjacent regions of an image's multiwavelet transform coefficients are correlated, and adjusting a strongly correlated coefficient affects the visual quality of the image. To reduce this influence, the paper proposed an image steganography algorithm in the multiwavelet domain based on Grey Relational Analysis (GRA). The algorithm used GRA to compare the geometric curve shapes of the central and peripheral tree sequences of multiwavelet tree groups in adjacent regions and thereby characterize their correlation, and then selected coefficient trees with very little correlation to adjacent points in the coefficient matrix as the embedding locations for the secret information. It further improved the quality of the stego image by exploiting the more dispersed distribution of multiwavelet transform coefficients. Experimental results indicate that the proposed algorithm has better imperceptibility and robustness.
    CDPM: an improved packet marking traceback scheme
    2009, 29(12):  3185-3187. 
    In order to improve the efficiency of path reconstruction, a novel scheme was proposed in which one edge was encoded as five fragments and packets were deterministically marked by routers in a cyclical fashion. The state of an edge was synchronized by checking the marked packets, keeping the marking accurate and robust. Analyses and simulations show the scheme is effective: compared with traditional probabilistic marking techniques, it requires fewer marked packets to reconstruct the attack path, is more resilient to packet spoofing, and handles packet loss well.
    Research on software watermarking algorithm based on inversion number of expressions
    2009, 29(12):  3188-3190. 
    To address the problems of program speed and hidden-information capacity in traditional algorithms such as expression reordering, a new software watermarking algorithm based on the inversion number of expression operand coefficients was proposed. Through the one-to-one correspondence between inversion numbers and binary numbers, a dictionary mapping was obtained, and watermark encoding, embedding and extraction were implemented according to it. Simulation results show that the algorithm clearly improves program speed and the data rate of the hidden watermark, and that its capability is better than that of algorithms such as expression reordering.
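    The core of the scheme the abstract describes is the one-to-one mapping between inversion numbers and binary codewords. A minimal sketch of that mapping (function names and the encoding width are illustrative, not the paper's implementation):

```python
def inversion_number(seq):
    """Count pairs (i, j) with i < j and seq[i] > seq[j] -- the
    inversion number of an operand ordering."""
    return sum(1 for i in range(len(seq))
                 for j in range(i + 1, len(seq))
                 if seq[i] > seq[j])

def encode_bits(n, width):
    """Map an inversion number to its binary codeword; the dictionary
    mapping pairs each codeword with one operand ordering."""
    return format(n, "0{}b".format(width))
```

    For example, the operand ordering [3, 1, 2] has inversion number 2 and would carry the 3-bit codeword "010"; reordering operands of a commutative expression changes the inversion number without changing the program's result.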
    Identity authentication based on negative selection algorithm
    2009, 29(12):  3191-3193. 
    To improve the validity of user identity authentication mechanisms, a dual authentication model integrating a positive authentication mechanism and a negative authentication mechanism was presented. First, inspired by the immune principle of cells distinguishing self from non-self, an identity authentication mechanism based on negative selection was designed; second, the key technologies of the negative authentication mechanism were studied, and implementation details of the model were given. Simulation tests show that the model can withstand password attacks, effectively filter out invalid login requests, and offers good robustness and reliability.
    Energy-efficient dynamic pairwise key establishment algorithm for sensor network
    2009, 29(12):  3194-3196. 
    Concerning the defects of key pre-distribution schemes, such as heavy storage load and poor extensibility, an Energy-Efficient Dynamic Pairwise Key Establishment Algorithm (EE-DPKEA) was proposed. The algorithm adopted a cluster-based scheme to establish pairwise keys, balanced the energy consumption of the nodes, and prolonged the lifetime of the sensor network. Simulation results show that EE-DPKEA is better than traditional key pre-distribution schemes in many aspects, such as security, storage and communication overhead, key connectivity, and energy load balancing.
    Script viruses and registry monitoring technology based on API hooking
    2009, 29(12):  3197-3200. 
    Script viruses are capable of self-replication, dissemination and destruction, which causes enormous harm and damage to the current computer network environment. Detecting script viruses through one of their major behaviors, tampering with users' registry data, the authors proposed an API-hook-based registry monitoring approach. By monitoring the registry and using API hooking, the approach detects and prevents script viruses by modifying the entries of system services in the system service dispatch table, and its decision logic can monitor and protect the values of specific user registry keys.
    Credit evaluation model based on fuzzy logic
    2009, 29(12):  3201-3203. 
    The fact that credit speculation cannot be correctly identified by the credit evaluation systems of Consumer-to-Consumer (C2C) platforms means that transaction data cannot truly reflect a seller's credit. To solve this problem, a fuzzy logic-based credit evaluation model was proposed. The model established a universe of discourse and fuzzy sets for the credit factors, employed fuzzy membership functions to obtain fuzzy evaluations over various transaction periods, and derived the overall credit evaluation. Empirical research shows that the model improves the credit evaluation system, provides a more reliable basis for users' trading decisions, and reduces credit risk in transactions.
    Network and communications
    Research on aspect-oriented grid services architecture
    2009, 29(12):  3204-3206. 
    This paper proposed an Aspect-Oriented Open Grid Services Architecture (AO-OGSA) to tackle the multiple bindings of the Open Grid Services Architecture (OGSA). By encapsulating common grid requirements into aspect modules, AO-OGSA separated crosscutting concerns from core concerns in grid application software, thereby reducing the coupling between modules. Finally, taking service packaging and resource scheduling in an optical grid as an example, simulation modules based on OGSA and AO-OGSA were constructed and their results compared in terms of software system performance.
    Cost optimization model of relay-based B3G/4G cellular network
    2009, 29(12):  3207-3210. 
    In order to reduce the construction cost of relay-based B3G/4G cellular networks, this paper constructed a cost-efficiency model by analyzing the relationship between the network construction cost and the ratio of base stations to relay nodes, and derived an iso-cost line from it. Combining the iso-cost line with the capacity curve yielded a cost optimization model, which for the first time provides an effective means of cost assessment and optimization for B3G/4G cellular network construction. Simulation results demonstrate that the model can effectively decrease network construction cost under the same system capacity and coverage requirements.
    Simulation studies on chaos demodulation of weak BPSK signal based on Simulink tools
    2009, 29(12):  3211-3214. 
    Concerning the problem of demodulating weak Binary Phase Shift Keying (BPSK) signals, a demodulation method based on a chaotic system was researched. The basic principle of weak signal detection with a chaotic system was introduced. Combining the phase condition under which the phase transition occurs with the modulation method of the BPSK signal, the methods and steps for using a Duffing system to demodulate BPSK signals against a strong noise background were given, and a new method using power spectrum entropy to judge the complexity of the different system states was presented. Based on theoretical analysis, a simulation model was built in Matlab/Simulink. The digital simulation results show that demodulating weak BPSK signals with a chaotic oscillator and power spectrum entropy performs better than the traditional coherent demodulation method.
    New routing algorithm based on geographical location: GPSR-AD
    2009, 29(12):  3215-3217. 
    The paper proposed a new routing algorithm, GPSR based on Angle and Distance (GPSR-AD), to address the problem that GPSR may produce excessive unnecessary route hops when spatial neighbors exist in an Ad Hoc network. The algorithm took the influence of two factors, distance and angle, into consideration. Analytical results reveal that GPSR-AD eliminates a large portion of the hops of GPSR and performs better in terms of average delivery success rate and packet loss rate.
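    A next-hop rule combining the two factors the abstract names can be sketched as follows, assuming an equal weighting of normalized remaining distance and angular deviation (the weights and function names are illustrative; the paper's exact metric is not given in the abstract):

```python
import math

def next_hop(current, dest, neighbors, w_dist=0.5, w_angle=0.5):
    """Pick the neighbor minimising a weighted mix of remaining
    distance to the destination and angular deviation from the
    straight line current->dest (weights illustrative)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def angle(n):
        # angle at `current` between the directions to `n` and to `dest`
        v1 = (n[0] - current[0], n[1] - current[1])
        v2 = (dest[0] - current[0], dest[1] - current[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    d_max = dist(current, dest)  # normaliser for the distance term
    return min(neighbors,
               key=lambda n: w_dist * dist(n, dest) / d_max
                             + w_angle * angle(n) / math.pi)
```

    With `current=(0,0)` and `dest=(10,0)`, a neighbor straight ahead beats one that is slightly closer in hop distance but far off the line, which is how the angle term suppresses detour hops.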
    Parallel algorithm of matrix multiplication based on biswapped network
    2009, 29(12):  3218-3220. 
    In order to solve the problem of parallelizing matrix multiplication, a parallel algorithm for matrix multiplication based on the biswapped network was developed according to the characteristics of that network. The algorithm is simple in operation and easy to implement thanks to a newly suggested mapping scheme. Theoretical analysis and experimental tests show that the proposed algorithm performs approximately on par with Cannon's algorithm.
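    For reference, Cannon's algorithm, the baseline the abstract compares against, can be walked through serially (one matrix element per simulated process for clarity; a didactic sketch, not the biswapped-network mapping itself):

```python
def cannon_multiply(A, B, p):
    """Serial walk-through of Cannon's algorithm on a p x p process
    grid; each 'process' (i, j) owns one element in this sketch."""
    n = len(A)
    assert n == p  # one element per process for this illustration
    C = [[0] * n for _ in range(n)]
    # initial alignment: shift row i of A left by i, column j of B up by j
    A = [[A[i][(i + j) % n] for j in range(n)] for i in range(n)]
    B = [[B[(i + j) % n][j] for j in range(n)] for i in range(n)]
    for _ in range(n):
        for i in range(n):
            for j in range(n):
                C[i][j] += A[i][j] * B[i][j]  # local multiply-accumulate
        # circular shift: A left by one column, B up by one row
        A = [[A[i][(j + 1) % n] for j in range(n)] for i in range(n)]
        B = [[B[(i + 1) % n][j] for j in range(n)] for i in range(n)]
    return C
```

    After n shift-and-accumulate rounds every process holds one element of the product, which is the communication pattern the biswapped-network algorithm re-maps onto its own topology.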
    New fast algorithm in DSP FIR filter design
    2009, 29(12):  3221-3223. 
    A new fast algorithm was proposed to solve the problem that an FIR filter with an odd number of taps can only use the inefficient MAC instruction in DSP FIR filter design. A new expression for the odd-length FIR impulse response was derived through theoretical analysis, and a fast algorithm based on the efficient FIRS instruction was developed from this expression. Comparison with the traditional algorithm through computer simulation demonstrates that the new algorithm efficiently reduces program computation time and can be flexibly applied to all kinds of high-order FIR filter designs in real-time applications.
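    The FIRS-style speedup rests on the symmetry h[k] = h[N-1-k] of a linear-phase FIR filter, which lets paired taps share one multiplication. A plain-Python sketch of this folding (illustrative only; the paper's DSP assembly is not reproduced here):

```python
def fir_symmetric(x, h):
    """Linear-phase FIR with symmetric taps h[k] == h[N-1-k]:
    fold the delay line so each output needs ~N/2 multiplies."""
    N = len(h)
    assert all(abs(h[k] - h[N - 1 - k]) < 1e-12 for k in range(N))
    pad = [0.0] * (N - 1) + list(x)       # zero history before x[0]
    y = []
    for n in range(len(x)):
        win = pad[n:n + N][::-1]          # x[n], x[n-1], ..., x[n-N+1]
        acc = 0.0
        for k in range(N // 2):           # paired taps share one multiply
            acc += h[k] * (win[k] + win[N - 1 - k])
        if N % 2:                         # centre tap of an odd-length filter
            acc += h[N // 2] * win[N // 2]
        y.append(acc)
    return y
```

    The odd-length case keeps one unpaired centre tap, which is exactly the case the abstract says forces the plain MAC path in the traditional formulation.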
    Pattern recognition and software
    Radar echo frequency spectrum estimation based on α stable distribution covariation
    Long Junbo
    2009, 29(12):  3224-3226. 
    When radar works in a complex environment, the performance of conventional algorithms degrades. New frequency spectrum estimation methods for radar based on the Fractional Lower Order Statistics (FLOS) covariation were proposed: the FLOS-direct, FLOS-indirect, FLOS-Welch and FLOS-MUSIC methods. Estimation of sinusoidal signals embedded in α-stable noise shows that the methods are robust in both Gaussian and non-Gaussian environments.

    Network and communications
    Improved method of digital signal modulation identification based on decision theory
    2009, 29(12):  3227-3230. 
    An improved method of digital signal modulation identification based on decision theory was proposed. The method carries out modulation identification by extracting five simple instantaneous characteristic parameters of digital signals and applying a decision tree. Besides the six typical digital signals 2ASK, 2FSK, 2PSK, 4ASK, 4FSK and 4PSK, 16QAM can also be identified by the improved method. Moreover, if the mean absolute value of the recursive zero-center normalized instantaneous phase is extracted as an additional parameter, 8PSK can be identified as well. The experimental results show that the complexity of the improved method is greatly reduced, while the correct recognition rate and the applicable SNR range are significantly increased.
    ZIP2 strategy for limited resources in a continuous double auction environment
    2009, 29(12):  3231-3234. 
    Using economic models and multi-agent technology to research network resource allocation has become a new trend in network studies. Concerning the limitation of an Agent's resources in the network, a ZIP2 strategy based on the market mechanism of the continuous double auction was proposed. The ZIP2 strategy is a two-dimensional bidding strategy covering both price and quantity, and Agents adopting it possess machine learning ability. Simulation results show that the ZIP2 strategy achieves high network resource allocation efficiency, with an average allocation efficiency of over 97%.
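    ZIP2 extends the classic ZIP bidding rule, whose price dimension adjusts a trader's profit margin with a Widrow-Hoff step toward a target price. A minimal sketch of that one-dimensional update (the quantity dimension and all parameter values here are illustrative, not the paper's):

```python
def zip_update(price, limit, target, beta=0.3, is_seller=True):
    """One Widrow-Hoff step of the classic ZIP rule: move the profit
    margin so the quoted price tracks a target price observed in the
    market (beta = learning rate; values illustrative)."""
    margin = price / limit - 1.0          # current profit margin
    delta = beta * (target - price)       # Widrow-Hoff error term
    margin += delta / limit
    if is_seller:
        margin = max(0.0, margin)         # never quote below the limit price
    return limit * (1.0 + margin)
```

    A seller with limit price 10 quoting 12 who observes a trade at 11 lowers the quote partway toward 11; repeating the update makes the quote converge on the going price while respecting the limit.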
    Research of topology control algorithm in wireless mesh networks
    2009, 29(12):  3235-3237. 
    In order to solve the poor end-to-end throughput caused by wireless signal interference in Wireless Mesh Networks (WMN), a topology control algorithm was proposed. The algorithm constructed a network topology by minimizing the load of the maximum collision domain. By adjusting the transmission power of each node, it reduced interference and improved the spatial reuse of the channel, which in turn improved network throughput. Finally, simulation results using the load of the maximum collision domain as the metric show that topology control can substantially improve the performance of wireless mesh networks.
    Implementation of low cost network data transmission and storage system based on FPGA
    2009, 29(12):  3238-3240. 
    A connection and implementation method for one-to-many remote network data transmission and storage was introduced. A 3D LED display array needs a large amount of data to display a stereo picture, and the method was applied to such an array to solve the secure data transmission problem. Combining Intel 28FJ3A-series Flash memory with the DM9000A Ethernet controller chip, the system was implemented with Field-Programmable Gate Array (FPGA) and Verilog HDL programming technology; it achieved synchronized display and storage of stereographs and reached 100 Mb/s network transmission. Testing shows that the system is low-cost, low-power and high-speed.
    Research and improvement of LEACH protocol in wireless sensor networks
    2009, 29(12):  3241-3243. 
    Concerning the deficiencies of the Low Energy Adaptive Clustering Hierarchy (LEACH) routing protocol, namely that cluster-head selection is unreasonable and that cluster-head nodes consume excessive energy during long-distance data transmission, an improved routing protocol named LEACH-EDH was proposed. In the set-up phase, the remaining energy and the geographical position of nodes were fully considered; in the steady-state phase, a modified hybrid approach was introduced for communication between cluster heads and the Base Station (BS). Simulation results show that the algorithm effectively balances the energy consumption of the network and achieves an obvious improvement in network lifetime.
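    LEACH-style protocols elect cluster heads with the threshold T(n) = p / (1 - p * (r mod 1/p)). A sketch of that threshold with a simple residual-energy weighting of the kind the abstract hints at (the energy factor is illustrative; LEACH-EDH's exact formula is not given in the abstract):

```python
import random

def leach_threshold(p, r, residual, initial):
    """Classic LEACH cluster-head threshold T(n), scaled here by the
    node's residual-energy fraction (an illustrative weighting)."""
    base = p / (1 - p * (r % int(1 / p)))   # p = desired head fraction, r = round
    return base * (residual / initial)

def is_cluster_head(p, r, residual, initial, rng=random.random):
    """A node elects itself head when its random draw falls below T(n)."""
    return rng() < leach_threshold(p, r, residual, initial)
```

    With p = 0.1 the base threshold grows from 0.1 in round 0 toward 1 as rounds pass, guaranteeing each node serves as head once per cycle; the energy factor biases election toward nodes with more remaining energy.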
    Design and implementation of P2P-VoD system based on BitTorrent
    2009, 29(12):  3244-3248. 
    Concerning the difficulty of providing large-scale Video on Demand (VoD) service over the Internet, the authors proposed a VoD system named VoDBB based on Peer-to-Peer (P2P) technology. It modified the piece selection mechanism of BitTorrent to improve the continuity index, used backup streaming servers to guarantee high service quality, and applied an anchor-point mechanism to support interactive operations. Simulation results show that VoDBB provides a good viewing experience, reduces the burden on the streaming server, and has strong expansion capability.
    Artificial intelligence
    Method for estimating network structure of short-term traffic flow forecasting model
    2009, 29(12):  3249-3252. 
    The artificial neural network forecasting model is an efficient method for forecasting short-term traffic flow, but it is hard to choose the proper input variables and the number of hidden neurons. A data-driven algorithm was proposed to estimate the network structure of the neural network forecasting model. According to the chaotic characteristics of short-term traffic flow, phase space reconstruction was introduced to choose the input variables reasonably; the number of hidden neurons was then estimated by the fast monotonic value estimation method. The experimental results demonstrate the efficiency of the proposed algorithm in estimating the network structure of the forecasting model.
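    Phase space reconstruction selects the network inputs by delay embedding: each training vector collects time-lagged samples of the flow series, so the embedding dimension fixes the number of input neurons. A minimal sketch (`dim` and `tau` would come from the chaotic analysis; names are illustrative):

```python
def delay_embed(series, dim, tau):
    """Phase-space reconstruction: each input vector is
    [x(t), x(t - tau), ..., x(t - (dim-1)*tau)], so `dim` fixes the
    number of input neurons of the forecasting network."""
    start = (dim - 1) * tau               # first index with a full history
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]
```

    For the series [0, 1, 2, 3, 4] with dim=2, tau=1 this yields the input vectors [1,0], [2,1], [3,2], [4,3], each paired in training with the next observed flow value.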
    Research on genetic algorithm based on asynchronous simulated annealing
    2009, 29(12):  3253-3255. 
    In order to overcome the drawbacks of the simple and the normal hybrid Genetic Algorithm (GA), which have poor local search ability and low computing efficiency respectively, an asynchronous hybrid GA framework was proposed. The framework mainly consists of three parts: the genetic algorithm, a niche operation and simulated annealing. Its main difference from other hybrids is that the simulated annealing runs asynchronously. The parallel computing environment was made up of two computers connected by a switch: one executed the genetic algorithm and the niche operation, the other executed the simulated annealing, and the Parallel Virtual Machine (PVM) was used to exchange data between them. Experimental results on the Traveling Salesman Problem (TSP) demonstrate that the proposed algorithm is viable and efficient.
    Enhancement of topology preservation of self-organizing map
    Xiang-Dong ZHOU
    2009, 29(12):  3256-3258. 
    In the Self-Organizing Map (SOM), the weight vectors of the units in the grid are updated only according to the distance between the units and the Best Matching Unit (BMU), so the topological relationships among the input data cannot be preserved very well. Therefore, two improved schemes were proposed. In the first, the weight vectors of the units were updated according to the differences of the corresponding coordinates between the units and the BMU; experimental results show that this scheme preserves topological relationships very well but does not reflect the distribution density of the input data well. In the second, the weight vectors were updated according to both the coordinate differences and the distance between the units and the BMU; experimental results show that this scheme not only preserves topological relationships better than SOM, but also reflects the distribution density of the input data quite well and accelerates the convergence of training.
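    For reference, the baseline SOM update that both improved schemes modify pulls every unit toward the input in proportion to a neighbourhood function of its grid distance to the BMU. A minimal sketch (a toy one-dimensional example; not the paper's improved schemes):

```python
import math

def som_update(weights, coords, x, lr, sigma):
    """One standard SOM step: find the BMU, then pull every unit toward
    the input with a Gaussian neighbourhood of the BMU's grid position.
    The paper's schemes replace the lone grid-distance term with
    per-coordinate differences; this is the baseline they modify."""
    bmu = min(range(len(weights)),
              key=lambda i: sum((w - xi) ** 2 for w, xi in zip(weights[i], x)))
    for i, w in enumerate(weights):
        d2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[bmu]))
        h = math.exp(-d2 / (2 * sigma ** 2))      # neighbourhood function
        weights[i] = [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]
    return bmu
```

    Note that only the scalar grid distance d2 enters the update, which is exactly why the direction information the first improved scheme recovers from per-coordinate differences is lost in the baseline.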
    Using CPSO-SVM and data fusion to calibrate temperature characteristic of thermal sensor
    2009, 29(12):  3259-3262. 
    To eliminate the influence of ambient temperature on thermal sensors in gas detection, the authors put forward a new calibration method for the sensor temperature characteristic based on data fusion and Canonical Particle Swarm Optimization-Support Vector Machine (CPSO-SVM). The method adopted SVM to fuse the data of a sensor pair composed of a thermal sensor and a temperature sensor, and applied CPSO with the principle of minimizing the Root Mean Square Error (RMSE) and Mean Absolute Percentage Error (MAPE) of the test sample set to tune the parameter vector of the SVM. The experimental results of H2 detection show that the proposed method can effectively improve the temperature characteristic of the thermal sensor and realize accurate detection of gas concentration.
    Double coefficients support vector machine with probability and equivalence class
    2009, 29(12):  3263-3266. 
    The effectiveness of a Support Vector Machine (SVM) depends on the accuracy of the acquired data information. Considering the low prediction accuracy and poor generalization of SVM caused by mining the data too simply, a new model was proposed. Combining the probability distribution and the equivalence class information among the data, the model optimized the traditional SVM by adopting double coefficients to improve its capability of acquiring information: each instance was assigned two coefficients, a probability value and an equivalence class. Comparisons with SVM, the Rough Support Vector Machine (RSVM) and the Fuzzy Support Vector Machine (FSVM) illustrate that the new model not only utilizes information effectively but also achieves remarkable prediction accuracy and classification ability, and is more robust.
    Particle swarm optimization algorithm with multidimensional asynchronism and stochastic disturbance
    Chen Junyan
    2009, 29(12):  3267-3269. 
    Particle swarm optimization has the disadvantages of being easily trapped in local optima and of searching inefficiently in multi-dimensional spaces. Building on the concave-function strategy for the inertia weight, the authors proposed a method of multidimensional asynchronism and stochastic disturbance to improve the ability to find the global optimum and to address the limitations caused by high dimensionality. The experimental results on four classic benchmark functions show that the algorithm keeps the balance between global and local search, which effectively improves the probability of a successful search with higher precision.
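    A concave inertia-weight schedule of the kind the abstract builds on keeps the weight high early (global search) and drops it quickly later (local search). One common quadratic form, with the usual 0.9 to 0.4 endpoints (illustrative defaults, not the paper's values):

```python
def concave_inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Concave (quadratic) inertia-weight schedule: w decays slowly at
    first, then quickly, favouring exploration early and exploitation
    late; used in the PSO velocity update v = w*v + c1*r1*(pbest - x)
    + c2*r2*(gbest - x)."""
    frac = t / t_max
    return w_end + (w_start - w_end) * (1 - frac ** 2)
```

    Compared with the usual linear decrease, the concave curve stays above it for every intermediate iteration, which is what preserves global search ability longer.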
    New particle swarm optimization algorithm with global-local best minimum
    2009, 29(12):  3270-3272. 
    In order to increase the convergence rate of the Particle Swarm Optimization (PSO) algorithm and prevent it from plunging into local minima, a new PSO variant was proposed on the basis of global-local best PSO (GLBest-PSO). The inertia weight and the acceleration coefficient of the algorithm were both adapted by the global best and local best minimum (GLBM), and the velocity equation of GLBM-PSO was also simplified. Simulation results clearly demonstrate that the convergence rate and solution quality of GLBM-PSO are considerably better than those of LDIW-PSO and GLBest-PSO.
    Chaotic particle swarm optimization algorithm based on nonlinear conjugate gradient algorithm
    2009, 29(12):  3273-3276. 
    To find all local optima of a multimodal function, a chaos-PSO algorithm based on the nonlinear conjugate gradient method was proposed. The algorithm employed a chaotic sequence to initialize particle positions in order to enhance population diversity, used an improved PSO cognition model to search for all local optima in the feasible region, and then applied the nonlinear conjugate gradient method to improve the accuracy of the sub-optimal solutions found by the chaos-PSO. Experiments show that the hybrid algorithm can correctly and quickly find all local optima of a continuously differentiable function.
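    Chaotic initialization of this kind typically iterates the logistic map x ← μx(1−x), which is chaotic at μ = 4, to scatter initial particle positions over the search range more evenly than independent random seeds. A minimal sketch (the seed value and names are illustrative):

```python
def chaotic_positions(n_particles, dim, lo, hi, x0=0.73, mu=4.0):
    """Initialise particle positions with a logistic map
    x <- mu * x * (1 - x); for mu = 4 the orbit is chaotic and fills
    (0, 1), and each value is rescaled to the search range [lo, hi]."""
    swarm, x = [], x0
    for _ in range(n_particles):
        pos = []
        for _ in range(dim):
            x = mu * x * (1 - x)          # one chaotic iteration
            pos.append(lo + (hi - lo) * x)
        swarm.append(pos)
    return swarm
```

    Because successive logistic-map values are deterministic but non-repeating, every coordinate of every particle comes from one continuing chaotic orbit, which is the diversity-enhancing property the abstract relies on.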
    Chaos control based on pulse-coupled neural networks
    2009, 29(12):  3277-3279. 
    Since the Pulse-Coupled Neural Network (PCNN) can display chaotic phenomena under certain conditions, a method of configuring the Lyapunov exponents of the chaotic PCNN system so that it converges to a fixed point was studied. In this method, the Lyapunov exponents were configured to be negative and, according to the expectation of a specific point, the corresponding control sequences were produced to alter the chaotic system and achieve stable control. Simulation results demonstrate the effectiveness of the proposed method: chaotic PCNN systems can be controlled from a chaotic state to the stable expectation.
    Cellular gene expression programming algorithm based on stack decoding method
    2009, 29(12):  3280-3282. 
    A novel Gene Expression Programming (GEP) algorithm named Stack Decoding Based Cellular Gene Expression Programming (SD-CGEP) was proposed. It does not need to transform the chromosome into an expression tree; instead, it directly uses a stack to decode and evaluate the chromosome, accelerating the evolution rate. Meanwhile, it introduces cellular automata to avoid premature convergence. The results show that SD-CGEP outperforms conventional Genetic Programming (GP) and GEP in evolutionary efficiency and prediction accuracy.
    Database and data mining
    Research of context-based personalized customer behavior model
    2009, 29(12):  3283-3286. 
    To investigate the influence of context on customers' consumption behavior, a construction approach for a context-based customer behavior model was proposed. Online transaction data of undergraduate customers with three levels of context granularity were collected and grouped statistically by transaction data item. Classifiers including Naive Bayes (NB), Tree-Augmented NB (TAN) and Grouping and Counting-Relational Database (GAC-RDB) were used to learn context and non-context prediction functions for each customer group. Based on the Area Under the Receiver Operating Characteristic Curve (AUC) of the prediction variable, the paper quantitatively compared and analyzed the effect of customer context in predicting buying behavior. The experimental results demonstrate that context information yields better prediction performance on customers' consumption decisions, especially for personalized customers.
    Research and application of integrated grey support vector machine model
    2009, 29(12):  3287-3289. 
    Based on the grey prediction GM(1,1) model, an integrated grey Support Vector Machine (SVM) model was presented. By improving GM(1,1) prediction accuracy through the background value calculation, the initial value selection and the smoothness of the data sequence, three grey prediction models were put forward: the background-value GM model, the initial-value GM model and the smoothness GM model. Then, exploiting the advantages of SVM, the prediction results of the three grey models were used as SVM input factors and the original data sequence as the output factor, and a support vector regression machine was trained to obtain the optimal structure. Experimental results show that the model is valid.
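    The underlying GM(1,1) model that all three improved variants start from can be sketched as follows (a textbook implementation of the classic model, not the paper's improved versions):

```python
import math

def gm11(x0, horizon=1):
    """Classic GM(1,1): accumulate the series (AGO), fit the grey
    differential equation x0(k) + a*z1(k) = b by least squares, solve
    the whitened equation, then difference back (IAGO) to forecast."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # AGO
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]    # background values
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(x0[1:]), sum(zi * yi for zi, yi in zip(z, x0[1:]))
    det = m * szz - sz * sz                                  # least squares for (a, b)
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):                                           # fitted AGO value
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + horizon)]
```

    The three improvements the abstract lists target, respectively, the background values z, the initial condition x0[0] used in x1_hat, and the smoothness of the raw sequence before accumulation.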
    Research on relational-algebra-based schema mapping of data integration
    2009, 29(12):  3290-3292. 
    Schema mapping is one of the basic issues in data integration, but it still has deficiencies such as a low duplicate-reduction rate and a lack of accuracy. Based on research into Global-As-View (GAV) and Local-As-View (LAV), the two fundamental mapping modes of data integration, and into schema matching, and in connection with problems in integrated mapping such as duplicate reduction and the transformation between Boolean element values and semantic values, the article extended the existing theory of relational algebra. A source-to-target schema mapping procedure was designed, in which the object set and relation set of the source schema were expanded, the target schema was obtained by processing the expanded schema with the extended relational algebra, and the specific derivation process of the operation was presented. The feasibility of the procedure is verified through worked examples.
    Study on location-based spatial queries with shared client results in mobile environments
    2009, 29(12):  3293-3295. 
    Abstract ( )   PDF (461KB) ( )
    Related Articles | Metrics
    To resolve the problem that spatial location queries cannot be answered in time in mobile environments, a new query algorithm called Share Results Nearest Neighbor (SRNN), based on sharing the information of adjacent mobile clients, was put forward. The proposed method reduces the burden on the central server by making full use of the communication and computational capabilities of the clients to share information about their surroundings, and meanwhile decreases the waiting time of mobile clients. Experimental results show that the SRNN algorithm is feasible and effective.
    Incremental maintenance and query algorithm of data warehouse based on QC-trees
    CHEN Zhen-kun
    2009, 29(12):  3296-3299. 
    Abstract ( )   PDF (796KB) ( )
    Related Articles | Metrics
    To make it more convenient and effective to add, delete, update and query a data warehouse through a QC-tree, this paper proposed how to incrementally maintain and query the QC-tree. The implementation maintains and queries the QC-tree based on its structure, combining a depth-first algorithm with the cover relation of equivalence classes. It involves only the upper bounds of equivalence classes and considers every possible change of class state, which ensures its effectiveness and correctness. Compared with traditional maintenance and query through the data cube, the implementation dramatically cuts down the amount of data to be considered, and thus improves the performance of maintenance and query.
    Fast outliers mining algorithm based on unit cell
    2009, 29(12):  3300-3302. 
    Abstract ( )   PDF (448KB) ( )
    Related Articles | Metrics
    Mining outliers from large datasets is slow. Exploiting the characteristics of a grid, a fast outlier mining algorithm was proposed that first partitions the data into a set of unit cells, so that the number of region queries decreases and the speed increases. According to appointed thresholds, each point is decided to be an outlier or not, which allows the algorithm to scale to large numbers of points. Experimental results show that the algorithm is fast and effective.
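    The cell-partitioning idea above can be sketched as follows: instead of a full region query per point, the point's cell and its adjacent cells are counted in constant time. This is an illustrative reconstruction under assumed parameters (cell width, neighbor threshold), not the paper's exact algorithm.

```python
from collections import Counter
from itertools import product

def cell_outliers(points, cell_width, min_neighbors):
    """Grid-based outlier sketch: a point is flagged when its cell and the
    adjacent cells together hold fewer than `min_neighbors` points."""
    def cell(p):
        return tuple(int(c // cell_width) for c in p)
    counts = Counter(cell(p) for p in points)
    outliers = []
    for p in points:
        cx = cell(p)
        # count points in the 3^d block of cells centred on the point's cell
        near = sum(counts.get(tuple(cx[i] + off[i] for i in range(len(cx))), 0)
                   for off in product((-1, 0, 1), repeat=len(cx)))
        if near < min_neighbors:
            outliers.append(p)
    return outliers
```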
    Text categorization approach based on probability standard deviation with evaluation of distribution information
    2009, 29(12):  3303-3306. 
    Abstract ( )   PDF (629KB) ( )
    Related Articles | Metrics
    For text categorization, an approach was introduced to construct a simplest linear classifier, in which the feature weight was computed from the probability standard deviation of a feature as a baseline weight and then regulated with feature distribution parameters. In the weighting process, the probability standard deviation was taken as the base weight to quantify the dispersion degree of a feature, while the distribution parameters, evaluated with beta probability density functions, measured the feature's distribution information. In the experiments, the 20Newsgroup, Fudan Chinese evaluation and Reuters-21578 collections were used to evaluate the effectiveness of the proposed techniques. The experimental results show that the method can significantly improve text categorization performance, and that it is simple, stable and suitable for large-scale text categorization.
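    The baseline weighting idea can be sketched as follows: a feature whose conditional probability varies strongly across categories receives a high weight. The beta-distribution refinement from the abstract is omitted, and the data layout is an assumption for illustration.

```python
import math

def prob_std_weights(class_feature_probs):
    """Base-line weight sketch: standard deviation of P(feature | class)
    across classes.  `class_feature_probs` maps class -> {feature: prob}."""
    classes = list(class_feature_probs)
    features = set().union(*(class_feature_probs[c] for c in classes))
    weights = {}
    for f in sorted(features):
        ps = [class_feature_probs[c].get(f, 0.0) for c in classes]
        mean = sum(ps) / len(ps)
        weights[f] = math.sqrt(sum((p - mean) ** 2 for p in ps) / len(ps))
    return weights
```

    A discriminative word ("ball" in sports vs. politics) gets a high weight, while a stop word appearing uniformly gets zero.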
    Graphics and image processing
    Bitstream extracting and re-assembling for H.264/SVC video transmission over MIMO system
    2009, 29(12):  3307-3309. 
    Abstract ( )   PDF (488KB) ( )
    Related Articles | Metrics
    In order to divide the H.264 Scalable Video Coding (SVC) bitstream into sub-bitstreams transmitted over different Multiple-Input Multiple-Output (MIMO) sub-channels so as to improve video transmission rates over wireless systems, a bitstream extracting and re-assembling scheme was proposed. At the sender side it extracts the base and enhancement layers of the SVC bitstream into sub-bitstreams while keeping the base-layer sub-bitstream independently decodable; at the receiver side the received sub-bitstreams are re-assembled into a decodable SVC bitstream. Experimental results show that the proposed scheme can make full use of the high bandwidth provided by MIMO systems with low redundancy and high flexibility.
    Improved super resolution reconstruction method for video sequence
    2009, 29(12):  3310-3313. 
    Abstract ( )   PDF (675KB) ( )
    Related Articles | Metrics
    To decrease the negative effect caused by local shift motions, this paper proposed an approach based on triangulated irregular block motion estimation and DTN-POCS super resolution reconstruction. First, feature points are selected from edges and a set of matching points between image pairs is obtained by motion estimation. Then, for each image pair, a Delaunay triangulation net is created for the slave image based on the matching feature points, together with its corresponding triangular net in the master image. Assuming that homologous triangles between images are related by an affine transformation, all low resolution images are projected block by block onto the high resolution coordinate grid, and an initial image is generated by interpolation. Finally, the super resolution image is iteratively optimized by the improved method. In the experiments, the method using uniform affine transformation parameters for the whole image is less accurate than the improved one, and the simulation results testify that the proposed method can improve reconstruction performance.
    Sub-region data structure and its composition algorithm in pattern generator
    2009, 29(12):  3314-3316. 
    Abstract ( )   PDF (468KB) ( )
    Related Articles | Metrics
    To realize this software functional module of a pattern generator, this paper presented a data structure, the doubly-connected edge list, and a well-designed composition algorithm for sub-regions using the plane sweep algorithm of computational geometry. In the algorithm, a general approach was given to handle any situation appearing in sub-region composition. To find internal hollowed polygons, a discrimination method and a search method were proposed based on graph reconstruction. The time complexity of this algorithm has a logarithmic factor, better than the quadratic one, and the space complexity is linear. The data structure is easy to implement on PCs because it mainly consists of pointers and linked lists.
    Filling method for massive void data region of SRTM
    2009, 29(12):  3317-3319. 
    Abstract ( )   PDF (543KB) ( )
    Related Articles | Metrics
    To solve the problem of sharp boundaries and rough content when filling SRTM void regions with the low-resolution data from GTOPO30, a method of filling SRTM void regions was proposed based on available research. The Laplacian algorithm, a classic method of image inpainting based on Partial Differential Equations (PDE), was adopted. Experimental results prove that the proposed method fills massive SRTM void regions well and is a valid way to obtain complete SRTM data.
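    Laplace-equation hole filling of the kind described can be sketched as follows: void cells are iteratively relaxed toward the average of their four neighbors, smoothing away the sharp boundary that direct pasting of coarse fill data leaves. This is a generic illustration (Jacobi relaxation, assumed interface), not the paper's implementation.

```python
import numpy as np

def laplacian_fill(dem, void_mask, init, iters=500):
    """Relax void cells toward the 4-neighbour average until smooth.
    `init` supplies the initial guess inside the hole (e.g. GTOPO30 data)."""
    out = dem.astype(float).copy()
    out[void_mask] = init[void_mask]
    for _ in range(iters):
        avg = 0.25 * (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
                      np.roll(out, 1, 1) + np.roll(out, -1, 1))
        out[void_mask] = avg[void_mask]      # only void cells are updated
    return out
```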
    Algorithm of C-V model combined with image entropy
    2009, 29(12):  3320-3321. 
    Abstract ( )   PDF (479KB) ( )
    Related Articles | Metrics
    The C-V model, a classical level set segmentation method, has weak self-adaptive capability, while medical segmentation objects have diverse and complex topological structures and changes. To meet the adaptive requirements of medical segmentation, the article introduced an image information entropy algorithm into the C-V model of level set image segmentation, solved the problem of setting iteration parameters in curve evolution, and enhanced the adaptive capability of C-V model segmentation. The experiments show that the algorithm, a C-V model combined with image entropy, adapts well to different objects.
    Multi-objects colorful segmentation algorithm for complex background
    2009, 29(12):  3322-3325. 
    Abstract ( )   PDF (577KB) ( )
    Related Articles | Metrics
    The background modeling for multi-object segmentation of video depends heavily on environmental change, and using background subtraction directly may lead to unsatisfactory or even wrong segmentation. An improved codebook background modeling algorithm based on Kalman filtering theory was proposed. With the codebook, a color model was built for each pixel to distinguish foreground from background pixels, and the temporal recursive low-pass filtering characteristic of the Kalman filter was used to rectify the codebook background model. The experimental results show that the improved algorithm can update the background model efficiently, has strong anti-jamming ability, and segments moving objects accurately enough to meet real-time requirements.
    Research of metallographic image segmentation algorithm
    2009, 29(12):  3326-3328. 
    Abstract ( )   PDF (480KB) ( )
    Related Articles | Metrics
    To solve such problems as low contrast and irregular particles in metallographic image segmentation, and considering the specificity of these images, an object segmentation method using concave points and shortest distance was proposed. In this method, the kernel objects were separated from the background by typical object thresholding; then noise was removed by mathematical morphology, followed by edge smoothing, edge and concave point detection, cutting point matching and cutting line determination using the correspondence of concave points, backward extended lookup and the shortest distance. The algorithm was implemented and tested on metallographic images. The results show that the images are effectively segmented without over-segmentation.
    Mean-Shift tracking algorithm with adaptive bandwidth of target
    2009, 29(12):  3329-3331. 
    Abstract ( )   PDF (661KB) ( )
    Related Articles | Metrics
    The traditional Mean-Shift tracking algorithm with a fixed window size cannot adapt in real-time to changes in target size. Multi-scale space theory was therefore combined with the Kalman filter. First, the Kalman filter was introduced to predict the scale ratio of the target image area; then this ratio was revised by the observation, namely the ratio of the information measures of two adjacent target images, using the measurement of target information in multi-scale space theory. Finally, target tracking was implemented by combining the Mean-Shift tracking algorithm with the Kalman filter. Experimental results show that the improved algorithm can select a proper tracking window size in scenarios of both increasing and decreasing scale.
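    The predict-then-correct step for the scale ratio can be illustrated with a minimal scalar Kalman filter. The constant-scale state model and the noise parameters q and r are assumptions for illustration, not values from the paper.

```python
def kalman_scale(z_measurements, q=1e-4, r=1e-2):
    """Scalar Kalman filter sketch for smoothing a target's scale ratio.
    z_measurements: observed scale ratios, one per frame."""
    x, p = 1.0, 1.0          # initial scale estimate and variance
    out = []
    for z in z_measurements:
        p += q                             # predict (scale assumed constant)
        k = p / (p + r)                    # Kalman gain
        x += k * (z - x)                   # correct with the observed ratio
        p *= (1.0 - k)
        out.append(x)
    return out
```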
    Mean-Shift tracking algorithm based on FCM
    2009, 29(12):  3332-3335. 
    Abstract ( )   PDF (635KB) ( )
    Related Articles | Metrics
    Concerning the defect of the kernel function bandwidth in the traditional Mean-Shift tracking algorithm, a new Mean-Shift tracking algorithm based on Fuzzy C-Means (FCM) clustering was presented. FCM clustering was used to segment the moving target from the background in the YCrCb color space. According to the rule that the area of the target does not change abruptly between adjoining frames, the bandwidth of the kernel function was corrected with the statistics of the segmented target pixels. Experimental results show that this algorithm can track objects accurately and effectively, and the size of the tracking window is adjusted automatically to adapt to the decreasing or increasing size of the moving target.
    Micro-sensor driven human model for 3D real-time movement
    2009, 29(12):  3336-3339. 
    Abstract ( )   PDF (685KB) ( )
    Related Articles | Metrics
    A human model for human motion capture and reconstruction was set up, and real-time movement of the model controlled by a real person's movement was realized. According to the features of a human motion capture and reconstruction system, the authors proposed a method of constructing a sensor-data-driven layered human motion model and a polygon grouping method to achieve skin deformation, both grounded in biomechanics. The joints of the human body were categorized according to their motion characteristics. Motion parameters estimated from sensor data were used to control the movement of the skeleton model and the deformation of the skin model, and consequently to reconstruct human body motion in real-time animation. The experimental results show that the proposed model is suitable for sensor-data-driven virtual humans in real-time animation and exhibits good biomechanical characteristics in representing human motion.
    Regional weighted comentropy and its application in image feature extraction
    2009, 29(12):  3340-3342. 
    Abstract ( )   PDF (457KB) ( )
    Related Articles | Metrics
    Image feature extraction is a research focus in content-based image retrieval; however, entropy-based image feature extraction cannot indicate the location of image content information. After an analysis of current color-space image feature extraction algorithms, a new description method of image comentropy named regional weighted comentropy was proposed, which combines the concept of image comentropy with an image segmentation algorithm, and some of its properties were proved. The change in the distribution of image comentropy caused by weight changes was described in terms of probability using a comentropy performance evaluation index, and the reasonable weight was then determined in view of the regions of interest and the weight precision specified by users. Experimental results show that the accuracy of image content description by the regional weighted comentropy method is more than 50% higher than that of traditional comentropy methods.
    New contour extracting method of MR-CT images
    2009, 29(12):  3343-3345. 
    Abstract ( )   PDF (425KB) ( )
    Related Articles | Metrics
    A new contour extraction method was presented and applied to MR-CT images. The method first obtains a gray threshold; the image attribute used in the criterion of the attribute operation is constructed, the selected attribute histogram is built, and the attribute threshold is obtained from the corresponding gray threshold in the attribute histogram. The images are processed by attribute opening and closing operations to remove useless information, and the Roberts operator, a classical gradient operator, is then used to extract the image contour. Important properties such as increasingness, idempotence, anti-extensivity and displacement invariance were proposed and proved. Experiments on MR-CT images show that satisfactory contours are obtained, that the automatic attribute-threshold method effectively retains the important information of the image, and that the new method has strong anti-noise ability and keeps contour shapes well.
    Pattern recognition and software
    Adaboost face detection algorithm based on correlation of classifiers
    2009, 29(12):  3346-3348. 
    Abstract ( )   PDF (470KB) ( )
    Related Articles | Metrics
    In order to enhance the ensemble ability of the traditional Adaboost algorithm, an improved Adaboost algorithm based on the correlation of classifiers was proposed. In the algorithm, correlation estimation between classifiers was added to weak classifier training, so that every weak classifier was related not only to the current classifier but also to the previous ones. Experimental results on the Carnegie Mellon University (CMU) dataset show that, compared with the traditional Adaboost algorithm, the proposed algorithm achieves a better detection rate and a lower false alarm rate.
    Face recognition algorithm based on supervised neighborhood preserving embedding
    2009, 29(12):  3349-3351. 
    Abstract ( )   PDF (434KB) ( )
    Related Articles | Metrics
    In order to make full use of the classification information of samples to improve the recognition performance of Neighborhood Preserving Embedding (NPE), a face recognition algorithm based on Supervised Neighborhood Preserving Embedding (SNPE) was proposed. According to the principle of linear discrimination, the within-class and between-class scatter matrices were integrated into the objective function of NPE through a tuning parameter, and discriminant features of face images were obtained. Finally, the Nearest Neighbor (NN) algorithm was used to construct the classifier. Several experiments on the AR and FERET face databases demonstrate the effectiveness of the proposed method.
    Face recognition based on symmetrical linear discriminate analysis
    2009, 29(12):  3352-3353. 
    Abstract ( )   PDF (462KB) ( )
    Related Articles | Metrics
    Because the Small Sample Size (SSS) problem usually makes the within-class scatter matrix singular, solving the generalized eigenequation becomes ill-posed. An improved algorithm based on previous work was proposed: it introduces mirror images to enlarge the sample capacity and adopts the null space of the within-class scatter matrix Sw to obtain the best solution of the Fisher criterion. Experimental results on the ORL and Yale face databases show that the proposed algorithm is more effective than traditional Linear Discriminant Analysis (LDA), Independent Component Analysis (ICA) and 2-Dimensional Symmetric Principal Component Analysis (2DSPCA).
    Singular points detection method in tented arch fingerprints
    2009, 29(12):  3354-3356. 
    Abstract ( )   PDF (435KB) ( )
    Related Articles | Metrics
    The singular points of a fingerprint image carry important information. Because the core points are very close to the delta points in some tented arch fingerprints, the computed Poincare index values approximately cancel each other out to 0, so the Poincare index method labels such fingerprints as plain arch images. Concerning this problem, the Poincare index method was improved with a new detection and computation method for singular points and reference points. The experimental results show that this method overcomes the limitation mentioned above.
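    The classical Poincare index computation that the paper refines can be sketched as follows: summing wrapped orientation differences along a closed path around a candidate point gives roughly +1/2 at a core, -1/2 at a delta and 0 in a regular region, which is why a nearby core and delta nearly cancel. The path sampling and interface are illustrative assumptions.

```python
import math

def poincare_index(orientations):
    """Poincare index along a closed path of ridge orientations
    (radians, defined modulo pi)."""
    total = 0.0
    n = len(orientations)
    for k in range(n):
        d = orientations[(k + 1) % n] - orientations[k]
        # wrap the orientation difference into (-pi/2, pi/2]
        while d > math.pi / 2:
            d -= math.pi
        while d <= -math.pi / 2:
            d += math.pi
        total += d
    return total / (2 * math.pi)
```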
    Ear recognition based on improved 2D principal component analysis and neural network
    2009, 29(12):  3357-3359. 
    Abstract ( )   PDF (447KB) ( )
    Related Articles | Metrics
    Since the 2-Dimensional Principal Component Analysis (2DPCA) method for human ear feature extraction yields a relatively large feature dimension, which leads to poor real-time capability and large data storage requirements, the authors proposed a new approach. First, the ear images were preprocessed; then an improved 2DPCA algorithm was used to compress the feature dimension; finally, a BP neural network was used for ear classification. Experimental results show that this method runs in real-time, saves feature data storage space and maintains the recognition rate.
    Chinese minority script identification method based on wavelet feature and MQDF
    2009, 29(12):  3360-3362. 
    Abstract ( )   PDF (534KB) ( )
    Related Articles | Metrics
    In order to classify the types of Chinese minority scripts, a method of identifying them based on wavelet analysis and the Modified Quadratic Discriminant Function (MQDF) was presented. Using wavelet energy and wavelet energy distribution proportions obtained by multi-resolution wavelet transform as features, a multivariate MQDF classifier was constructed. A sample dataset was built containing six common Chinese minority scripts (Tibetan, Tai Lue, Naxi pictographs, Uighur, Tai Le and Yi) together with Chinese and English; some samples were used for training, the others for testing, with varying proportions of training samples. The experimental results show that, with multi-level decomposition, the method achieves a better recognition rate than the traditional Bayes and K-Nearest Neighbor (KNN) classifiers.
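    The wavelet-energy-proportion feature can be illustrated with one level of a hand-rolled Haar transform; the paper uses several decomposition levels and a proper wavelet library, so this single-level sketch with an assumed interface is only meant to show what "energy distribution proportion" means.

```python
import numpy as np

def haar_energy_features(img):
    """One Haar level: return the energy proportion of the
    LL / LH / HL / HH sub-bands (image dimensions assumed even)."""
    a = img.astype(float)
    h = (a[:, 0::2] + a[:, 1::2]) / 2.0          # row averages
    g = (a[:, 0::2] - a[:, 1::2]) / 2.0          # row details
    LL = (h[0::2] + h[1::2]) / 2.0
    LH = (h[0::2] - h[1::2]) / 2.0
    HL = (g[0::2] + g[1::2]) / 2.0
    HH = (g[0::2] - g[1::2]) / 2.0
    e = np.array([(b ** 2).sum() for b in (LL, LH, HL, HH)])
    return e / e.sum()                           # energy distribution proportion
```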
    New HHT-based handwritten Chinese character recognition method
    2009, 29(12):  3363-3365. 
    Abstract ( )   PDF (418KB) ( )
    Related Articles | Metrics
    Concerning the non-stationary characteristic of off-line handwritten Chinese characters, a new recognition method based on the Hilbert-Huang Transform (HHT) was presented. According to the characteristics of Chinese character images, a two-dimensional HHT was developed from the one-dimensional one to obtain the frequency features of the characters. The local frequency features and the global Zernike moment features of the characters were then combined into the final recognition feature. Experimental results show that the method is effective.
    Convective cloud identification algorithm based on weather radar image
    2009, 29(12):  3366-3368. 
    Abstract ( )   PDF (508KB) ( )
    Related Articles | Metrics
    Both convective and stratiform regions play an important role in precipitation, and correctly recognizing them facilitates precise prediction of rainfall amount and duration. An automatic algorithm named WLS was developed to partition radar reflectivity into convective and stratiform rain classes, adopting abrupt-change detection theory based on wavelet analysis. First, a wavelet transform is carried out on preprocessed raw reflectivity data and echo top data; second, singularity detection by modulus maxima is performed and noise points are filtered; finally, edges are detected and the convective region is filled using mathematical morphology. The experiments use a representative squall line at 2150 UTC 25 August 2008 and mixed precipitation at 2222 UTC 8 August 2008 at Hohhot. Compared with the BL and SHY95 algorithms, the experimental results show that the WLS algorithm is more effective: it can exactly partition the regions of convective clouds and restrain noise points.
    Medical ultrasound image denoising based on improved anisotropic diffusion equation
    2009, 29(12):  3369-3371. 
    Abstract ( )   PDF (429KB) ( )
    Related Articles | Metrics
    An improved denoising method based on the anisotropic diffusion equation was proposed. By expanding the Perona-Malik anisotropic diffusion model (P-M model) to include diagonal edges, more image details are preserved; a new way of calculating the diffusion coefficient improves the convergence rate, and the novel gradient operator can distinguish noise points and detect edges. The simulation results show that the method outperforms the classical P-M equation and the Linshi operator in image smoothing and edge preservation, and reduces the iteration time. The method is an effective way of suppressing speckle noise in ultrasound images.
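    The baseline P-M model that the paper extends can be sketched as follows: a 4-neighbor explicit diffusion scheme with the edge-stopping function g(|grad|) = 1 / (1 + (|grad|/kappa)^2). The diagonal-direction extension and the speckle-aware coefficient from the abstract are not shown, and kappa and the step size are illustrative values.

```python
import numpy as np

def pm_diffusion(img, iters=10, kappa=30.0, lam=0.2):
    """Minimal 4-neighbour Perona-Malik diffusion (explicit scheme,
    periodic borders via np.roll; lam <= 0.25 for stability)."""
    u = img.astype(float).copy()
    for _ in range(iters):
        # one-sided differences toward the four neighbours
        dN = np.roll(u, 1, 0) - u
        dS = np.roll(u, -1, 0) - u
        dW = np.roll(u, 1, 1) - u
        dE = np.roll(u, -1, 1) - u
        g = lambda d: 1.0 / (1.0 + (d / kappa) ** 2)   # edge-stopping function
        u += lam * (g(dN) * dN + g(dS) * dS + g(dW) * dW + g(dE) * dE)
    return u
```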
    Two-step denoising model based on human visual property
    2009, 29(12):  3372-3374. 
    Abstract ( )   PDF (485KB) ( )
    Related Articles | Metrics
    The removal model based on normalized local variance removes noise significantly in flat areas but is slow and maintains texture details poorly. Therefore, normalized global variance was proposed to measure spatial detail, and a new diffusion coefficient was derived. The new removal model uses the new diffusion coefficient to accelerate denoising and then the old one to remove unevenness and black dots. Experimental results show that the new model removes noise quickly and effectively while protecting texture details better, and its SNR and PSNR are slightly higher than those of the model with only one diffusion coefficient.
    Vision based dimension measurement method for SMT stencil
    2009, 29(12):  3375-3377. 
    Abstract ( )   PDF (545KB) ( )
    Related Articles | Metrics
    To realize automatic and precise dimension measurement of Surface Mounting Technology (SMT) stencils, a visual measurement method based on Gerber file information was proposed. First, the real image is grabbed by an NED NUCLi 7K line-scan camera and information such as size and shape is derived from the Gerber file; the coordinate mapping between the figure information coordinate system and the real image coordinate system is established through scaling, translation and rotation transforms to obtain the approximate positions of figures in the real image. Then the Canny operator and the gray moment operator are adopted to extract edges with pixel and sub-pixel precision. Finally, figure fitting methods such as line and circle fitting are used to fit the figures in the real image and measure their dimensions precisely. In the experiment the maximum measurement error is 0.0854 pixel (0.91μm) and the maximum standard deviation is 0.0282 pixel (0.30μm), which shows that the proposed stencil measurement method is stable, reliable and of high precision, and can meet the requirements of automatic dimension measurement of SMT stencils.
    Space domain approach to motion-blurred parameter identification
    2009, 29(12):  3378-3380. 
    Abstract ( )   PDF (422KB) ( )
    Related Articles | Metrics
    The Point Spread Function (PSF) of motion blur has two parameters: blur direction and blur extent. This study filters the motion-blurred image with a 3×3 directional derivative operator to identify the blur direction, using a larger search step to confirm a coarse range and then a smaller step to refine the precision. To identify the blur extent, the autocorrelation function of the horizontal difference image of the blurred image is calculated. Experiments indicate that this method improves identification precision, reduces computational complexity and produces a good restoration result.
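    The autocorrelation step can be sketched as follows: for horizontal motion blur of length L, the autocorrelation of the horizontal difference image has a pronounced negative trough at lag L. The direction is assumed already normalized to horizontal; the coarse-to-fine direction search from the abstract is not shown, and the interface is an assumption.

```python
import numpy as np

def blur_extent(blurred, max_lag=30):
    """Estimate horizontal motion-blur length from the lag of the
    autocorrelation trough of the horizontal difference image."""
    d = np.diff(blurred.astype(float), axis=1)     # horizontal difference image
    d = d - d.mean()
    corr = np.zeros(max_lag + 1)
    for lag in range(max_lag + 1):
        a, b = d[:, :-lag or None], d[:, lag:]
        corr[lag] = (a * b).mean()
    return int(np.argmin(corr[1:]) + 1)            # lag of the trough = blur length
```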
    Simulation of falling snow based on new distribution of domain and superimposed texture
    2009, 29(12):  3381-3384. 
    Abstract ( )   PDF (617KB) ( )
    Related Articles | Metrics
    A simulation of falling snow based upon a new distribution of the emitting domain and superimposed textures was proposed to overcome the disadvantages of heavy computation and poor real-time capability in traditional particle systems. To optimize the emitting domain, the new distribution was used to launch particles. The paper presented the concept of superimposed texture, adopted Level Of Detail (LOD) technology with different superimposed textures at different levels, and finally used a new algorithm to determine whether a particle should be deleted. The method reduces the computational load without affecting realism, and effectively improves the fidelity and real-time quality of falling snow simulation in large-scale scenes.
    Typical applications
    3D flight track and 6-DOF flight simulation based on Google Earth
    2009, 29(12):  3385-3387. 
    Abstract ( )   PDF (449KB) ( )
    Related Articles | Metrics
    This paper introduced a novel method for 3D flight track display and 6-DOF flight simulation based on secondary development of Google Earth. The 3D flight track and 6-DOF flight simulation, described in the KML markup language, were presented by feeding real-time flight surveillance data into Google Earth via its extensible interface. The system was built on a B/S architecture, with the entire development on the server side while clients used the Google Earth browser directly. An algorithm was proposed to extract flight attitude from the space vector connecting two track points. The results indicate that the 3D flight track derived in this way is simple and intuitive, and can be applied not only to the research and analysis of 3D flight tracks, real-time flight tracks and 6-DOF flight simulation, but also as a new query tool for flight information.
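    Emitting a flight track as KML can be sketched as follows. The tag layout follows the public KML 2.2 schema (Placemark / LineString / coordinates in lon,lat,alt order); styling, timestamps and the 6-DOF model elements the paper uses are omitted, and the function name is an assumption.

```python
def track_to_kml(points):
    """Render a flight track [(lon, lat, alt_m), ...] as a minimal
    KML document containing one LineString Placemark."""
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in points)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>'
        "<LineString><altitudeMode>absolute</altitudeMode>"
        f"<coordinates>{coords}</coordinates>"
        "</LineString></Placemark></Document></kml>"
    )
```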
    Design on assembled real-time management system for engineering flight simulator
    2009, 29(12):  3388-3390. 
    Abstract ( )   PDF (595KB) ( )
    Related Articles | Metrics
    In view of the problems of software reusability and maintainability in the real-time management system of an engineering flight simulator, the paper gave an in-depth study of assembled (component-based) software. It discussed the classification and construction of components, the open management of simulation parameters, the assembling of components and the scheduling of the system in the assembled management system. The design improves the reusability, scalability and maintainability of the system.
    Design of multimedia information display system based on template
    2009, 29(12):  3391-3393. 
    Abstract ( )   PDF (409KB) ( )
    Related Articles | Metrics
    A low-cost, easy-to-use MultiMedia information Display System (MMDS) was proposed, consisting of server, console, network and clients. A WYSIWYG tool was developed with which the user can "draw" the layout of the client screen directly in the console; the design result can be saved to disk and reopened for modification. The MVC pattern was used in the design of this tool, and XML was introduced to exchange data between server and client, so that any client supporting XML can be used in the system.
    New method for component reuse
    2009, 29(12):  3394-3397. 
    Abstract ( )   PDF (635KB) ( )
    Related Articles | Metrics
    The Component Object Model (COM) defines two component reuse approaches, containment and aggregation, which concentrate on the consumer perspective and are not favorable to the self-evolution of components or to separation of concerns. A new method for component reuse was proposed. First, a new component class is derived directly from an old component; the derived class reuses the base class's interfaces and events, overloads virtual interface methods, and defines new interfaces and events. Second, identifiable features are taken as well-defined semantic entities, so that they can be bound to other components at compile time or run time. The design and implementation methods on a middleware platform were presented. Experiments in mobile telephone projects show that the new method contributes to improving software development efficiency and increasing the quality of software systems.
    Validation of processes via Windows kernel mode driver
    2009, 29(12):  3398-3399. 
    Abstract ( )   PDF (480KB) ( )
    Related Articles | Metrics
    In order to prevent malicious processes on the Windows platform from destroying system resources, a validation technique via a kernel-mode driver was presented. The validation hooks process creation, obtains the execution file paths of new processes, and checks whether they are legal. The validation procedure runs in Windows kernel mode and utilizes a data structure named path-tree to speed up validation. By this method, malicious processes can be terminated before their creation completes, avoiding damage to system resources.
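    The path-tree idea can be sketched as a trie keyed on path components: a lookup costs one node hop per path segment instead of a string comparison against every rule. This is an illustrative user-mode model of the kernel structure, with case-folding behavior assumed.

```python
class PathTree:
    """Trie of whitelisted execution-file paths, one node per
    backslash-separated component (lookups are case-insensitive)."""
    def __init__(self):
        self.root = {}

    def add(self, path):
        node = self.root
        for part in path.lower().strip("\\").split("\\"):
            node = node.setdefault(part, {})
        node["\x00"] = True          # sentinel: a legal path ends here

    def is_legal(self, path):
        node = self.root
        for part in path.lower().strip("\\").split("\\"):
            if part not in node:
                return False
            node = node[part]
        return "\x00" in node
```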
    Calculation of the left ventricular volume based on GVF model and optical flow field
    2009, 29(12):  3400-3402. 
    Abstract ( )   PDF (482KB) ( )
    Related Articles | Metrics
    Concerning the difficulty of segmentation and the heavy computation in calculating the Left Ventricular (LV) volume from tagged cardiac magnetic resonance images, a new method of calculating the LV volume was proposed. First, an improved snake model based on gradient vector flow (GVF-snake) is used to segment the contour of the left ventricle and obtain an initial snake line; then optical flow is used to track the LV contour in successive images; finally, the Simpson method is applied to calculate the LV volume. Comparison of the results with manually obtained ones shows that combining the GVF-snake model with the optical flow method is feasible for calculating the LV volume.
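    The final volume step can be sketched as follows: each converged snake contour yields a slice area via the shoelace formula, and the Simpson (disk-summation) method stacks the areas along the long axis. Function names and the disk-summation simplification are illustrative assumptions, not the paper's code.

```python
def contour_area(contour):
    """Shoelace area of a closed polygonal contour [(x, y), ...]
    (vertices in order; the last is implicitly joined to the first)."""
    n = len(contour)
    s = 0.0
    for i in range(n):
        x0, y0 = contour[i]
        x1, y1 = contour[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

def simpson_volume(contours, slice_thickness):
    """Disk summation: treat the cavity as a stack of slices,
    V = sum of per-slice contour areas times the slice thickness."""
    return sum(contour_area(c) for c in contours) * slice_thickness
```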
    Answer extraction application based on dependency semantic analysis
    2009, 29(12):  3403-3405. 
    Abstract ( )   PDF (448KB) ( )  
    Related Articles | Metrics
    An efficient answer extraction system named Dependency Inspection based AE System (DIAES) was proposed to improve the accuracy of locating candidate answers. The system used the Minimal Logical Form (MLF) to compare semantic similarity between query and candidate answers so as to remove irrelevant retrieval results. Experimental results prove that the proposed system has lower complexity and higher precision than the traditional ones.
    Option pricing based on mixture Kalman particle filter
    2009, 29(12):  3406-3409. 
    Abstract ( )   PDF (479KB) ( )  
    Related Articles | Metrics
    In order to improve the estimation performance of particle filters in option pricing, a method called Mixture Kalman Particle Filter (MKPF) was used, which takes the Unscented Kalman Filter (UKF) and the Extended Kalman Filter (EKF) as a mixture proposal distribution. At each time step, each particle was first updated by the UKF to get a state estimation; thereafter, this estimation was used as the prior of the EKF, in which the particle was updated again to obtain the final estimation of the state. The authors used the classical Black-Scholes (B-S) model in the experiment to evaluate the performance of the newly proposed method. The experimental results show that the MKPF outperforms the other algorithms, which demonstrates its validity in option pricing.
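For context, the Black-Scholes benchmark mentioned above has a closed-form call price; a minimal stdlib sketch (this is only the reference model, not the MKPF itself, which estimates the latent state):

```python
import math

# Black-Scholes European call price: C = S*N(d1) - K*exp(-rT)*N(d2).
def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money example: S=K=100, T=1y, r=5%, sigma=20% -> about 10.45
print(round(bs_call(100, 100, 1.0, 0.05, 0.2), 4))
```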
    XML update control method based on security update views
    2009, 29(12):  3409-3412. 
    Abstract ( )   PDF (599KB) ( )  
    Related Articles | Metrics
    In order to make it easy for users to update XML documents while ensuring security, the paper proposed an update control method for XML based on security update views. To generate a security update view, annotations of updating constraints, such as Insert, Delete and Replace, were added to the Document Type Definition (DTD) elements of a security view to guide the updating process. In this way, different security update views were provided to different users for their updating operations. For an update query Q on the security update view, the method provided one algorithm to check the authorization of Q and another to rewrite it into an equivalent operation over the original XML document. The method can deal with updating operations on XML effectively. At the same time, it provides each user with a security update view that is visible only to that user, which avoids information leakage.
    PC-based training simulator of flight command system
    2009, 29(12):  3413-3415. 
    Abstract ( )   PDF (642KB) ( )  
    Related Articles | Metrics
    In order to improve the training level of flight commanders, improve training efficiency and reduce training costs, the "PC-based training simulator of flight command system" was developed. It involves many domains such as speech recognition, flight simulation, visual scenery simulation, database, network communication, and so on. The overall system design, module partition and realization of key techniques were analyzed and demonstrated. To avoid manual intervention and achieve a maintenance-free system, redundancy technology was used in the main function modules such as the speech recognition module. The application proves that the system fully satisfies the practical training requirements of flight commanders. It can also serve as a reference for other command simulation systems.
    Minimum digital video capture and display system based on Blackfin533 platform
    2009, 29(12):  3416-3417. 
    Abstract ( )   PDF (452KB) ( )  
    Related Articles | Metrics
    The paper designed and realized a minimum digital video capture and display system based on the Blackfin533 platform. Through two interrupts, the functions of burst signal response, video frame capture, format conversion and frame display were realized, and the video data format conversion was optimized according to the structure of the DSP. The experimental results show that a higher video frame capture and display speed is obtained.
    2009 CIDE conference paper
    Video-based simulation of landscape effects in real time
    2009, 29(12):  3418-3421. 
    Abstract ( )   PDF (641KB) ( )  
    Related Articles | Metrics
    Based on video synthesis techniques, the authors proposed an approach for the real-time simulation of flowing water and fire from video. From the fire and water-flow videos, continuity in time and space was achieved by the optical flow field method. The interaction between wind and fire was achieved through the skeleton of the fire, including fire spread and the interaction of fire flames. The water flow was also synthesized by a method of self-adaptive edge sampling. Finally, by the technique of 3-dimensional simulation with videos from multiple directions, good simulation results were achieved.
    Scene tune recognition and detection in film videos
    2009, 29(12):  3422-3426. 
    Abstract ( )   PDF (856KB) ( )  
    Related Articles | Metrics
    Quickly and accurately retrieving the film clips that reflect the fluctuation of audiences' mood according to individual users' needs has become a research hotspot, and detection of the film scene tune is an effective way to find these clips. New features such as local motion share ratio, camera motion and shot similarity were proposed in combination with film domain knowledge. Using the above-mentioned features together with common features from other papers, a Bayesian classifier was utilized to classify shot types. According to the relationship between scene and audience emotion, five scene tunes that can stimulate audiences' emotions were defined and detected based on the recognition of shot types. The experimental results show that the chosen features achieve good results. Compared with the results of other studies, both the precision and recall of long shots and close-up shots are improved.
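The shot-type classification step can be illustrated with a toy Gaussian naive-Bayes classifier; the two features and the class statistics below are invented for illustration and are not the paper's data:

```python
import math

# Toy Gaussian naive Bayes: pick the class maximizing
# prior * product of per-feature Gaussian likelihoods.
def gauss(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(sample, model, priors):
    # model: class -> list of (mean, var), one per feature, e.g. a
    # local-motion ratio and a camera-motion magnitude (hypothetical).
    best, best_p = None, -1.0
    for cls, params in model.items():
        p = priors[cls]
        for x, (mean, var) in zip(sample, params):
            p *= gauss(x, mean, var)
        if p > best_p:
            best, best_p = cls, p
    return best

model = {"long_shot": [(0.1, 0.02), (0.8, 0.05)],
         "close_up": [(0.7, 0.02), (0.2, 0.05)]}
priors = {"long_shot": 0.5, "close_up": 0.5}
print(classify([0.65, 0.25], model, priors))  # close_up
```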
    Kids-centered and context-aware human-computer interaction approach
    2009, 29(12):  3427-3430. 
    Abstract ( )   PDF (660KB) ( )  
    Related Articles | Metrics
    An approach to kids-centered and context-aware Human-Computer Interaction (HCI) was proposed based on an analysis of kids' behavior and psychology in drawing and painting. It was adopted in our computer-aided kids drawing system based on the principles of child-computer interaction. In order to interact with children in a natural way, the system offered a 3D interface simulating a real drawing scenario and a multimodal interaction mode combining pencil, speech and camera. Meanwhile, sketching and facsimile can help children strengthen drawing skills as well as enjoy drawing in a vivid environment.
    Multi-channel watermarking scheme based on HVS for color image in DCT domain
    2009, 29(12):  3431-3433. 
    Abstract ( )   PDF (441KB) ( )  
    Related Articles | Metrics
    A blind watermarking algorithm for digital images based on the Discrete Cosine Transform (DCT) was proposed. First, the original image was transformed from the RGB color space into the YCbCr color space, whose channels are highly independent of each other, and the watermark was embedded into all three channels. Secondly, the watermark was embedded by a DCT-domain image watermarking technique, with the embedding capacity determined by the Human Visual System (HVS). The experimental results indicate that the algorithm is visually imperceptible and robust to general image processing.
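As a hedged sketch of blind DCT-domain embedding, on a single 8-sample row rather than a full image, with quantization-index modulation standing in for the paper's unspecified embedding rule and a fixed step instead of an HVS-adaptive one:

```python
import math

# Orthonormal 1D DCT-II and its inverse (DCT-III).
def dct(x):
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            * (math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
            for k in range(N)]

def idct(X):
    N = len(X)
    return [sum(X[k] * (math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
                * math.cos(math.pi * (n + 0.5) * k / N) for k in range(N))
            for n in range(N)]

def embed_bit(samples, bit, coeff=4, step=8.0):
    # Hide one bit in the parity of a quantized mid-frequency coefficient.
    X = dct(samples)
    q = round(X[coeff] / step)
    if q % 2 != bit:
        q += 1
    X[coeff] = q * step
    return idct(X)

def extract_bit(samples, coeff=4, step=8.0):
    # Blind extraction: only the coefficient index and step are needed.
    return round(dct(samples)[coeff] / step) % 2

row = [52, 55, 61, 66, 70, 61, 64, 73]
print(extract_bit(embed_bit(row, 1)))  # 1
```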
    Novel approach for feature extraction of texture images
    2009, 29(12):  3434-3436. 
    Abstract ( )   PDF (499KB) ( )  
    Related Articles | Metrics
    A novel approach for feature extraction of texture images based on the NonSubsampled Contourlet Transform (NSCT) was proposed. The coefficients in different scales and different directions were obtained by decomposing the texture image with the NSCT. Then the means and variances of these coefficients were extracted as the feature vectors, which greatly reduces the feature dimension. A Back Propagation (BP) neural network was adopted to implement automatic classification of texture images through training and simulation. Compared with the wavelet packet transform and the improved Local Binary Pattern (LBP) texture descriptor, this approach achieves better results.
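The feature-extraction step itself is simple to sketch: given the subband coefficient sets (small stand-in lists below, since this snippet does not compute the NSCT), the feature vector concatenates each subband's mean and variance:

```python
# Per-subband (mean, variance) features; one pair per scale/direction
# subband, so the dimension is 2 * number_of_subbands.
def subband_features(subbands):
    feats = []
    for coeffs in subbands:
        n = len(coeffs)
        mean = sum(coeffs) / n
        var = sum((c - mean) ** 2 for c in coeffs) / n
        feats.extend([mean, var])
    return feats

bands = [[1.0, 3.0], [2.0, 2.0, 2.0]]   # stand-ins for NSCT subbands
print(subband_features(bands))          # [2.0, 1.0, 2.0, 0.0]
```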
    Research on self-learning mechanism in symbol recognition system of architecture drawings
    2009, 29(12):  3437-3441. 
    Abstract ( )   PDF (746KB) ( )  
    Related Articles | Metrics
    Due to the uncertainty of graphic symbol representation in the recognition of architecture drawings, the adaptability of manually constructed templates, which are based entirely on prior knowledge, is poor. To deal with this situation, a symbol recognition method incorporating a self-learning mechanism was proposed. The method merged self-learning and recognition. For different learning results, such as omitted or missing symbols, it entered different learning steps and then completed template creation or update automatically. In this way, it improved the recognition rate without changing the program. On this basis, an architecture drawing recognition system with self-learning capacity was designed, and its effect was verified through experiments.
    Research and implementation of cloth simulation algorithm based on PhysX physics engine
    2009, 29(12):  3445-3448. 
    Abstract ( )   PDF (693KB) ( )  
    Related Articles | Metrics
    This article described a physically-based cloth simulation algorithm to show the 3D effect of cloth patterns. First the mass-spring model was created and the forces on the model were analyzed. Then, based on Newton's second law, the cloth's movement was simulated with the PhysX SDK. Furthermore, when the cloth collided with rigid bodies in the environment or with itself, the collision technology in PhysX was used. Finally, several simulations were implemented and some interactive functions were added to set up a virtual hall applicable to pattern design.
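The mass-spring dynamics can be sketched without PhysX: a single particle hanging from a fixed anchor by a damped spring, with Hooke's law, damping and gravity summed per Newton's second law and integrated by explicit Euler (all parameters are illustrative):

```python
# 1D damped mass-spring step: f = -k*(L - rest)*dir - c*v + m*g, a = f/m.
def step(pos, vel, anchor, rest, k, c, mass, g, dt):
    dx = pos - anchor
    length = abs(dx)
    direction = dx / length if length else 0.0
    f_spring = -k * (length - rest) * direction
    f = f_spring - c * vel + mass * g
    acc = f / mass
    vel += acc * dt
    pos += vel * dt
    return pos, vel

pos, vel = -1.2, 0.0                 # starts stretched below the anchor at 0
for _ in range(2000):
    pos, vel = step(pos, vel, 0.0, 1.0, 50.0, 0.5, 0.1, -9.8, 0.005)
# settles at -(rest + m*g/k) = -1.0196, the gravity-stretched rest position
print(round(pos, 2))
```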
    Laplacian editing and its improvement on OpenMesh platform
    2009, 29(12):  3449-3452. 
    Abstract ( )   PDF (623KB) ( )  
    Related Articles | Metrics
    By maintaining the normal component of Laplacian coordinates to preserve high-frequency details, Laplacian editing is a simple but effective way to deform meshes. Here, the authors implemented this algorithm on the OpenMesh platform. Considering that a high-dimensional linear system needed to be solved, barycentric coordinates were introduced to represent the mesh, which greatly reduces the complexity. Finally, experimental results were presented and analyzed in detail.
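The differential representation behind Laplacian editing can be shown in isolation: with uniform weights, a vertex's Laplacian coordinate is its offset from the centroid of its one-ring neighbors; the deformation then solves for positions reproducing these deltas under new constraints (the solve is omitted here):

```python
# Uniform-weight Laplacian coordinate: delta = v - centroid(one-ring).
def laplacian_coordinate(vertex, neighbors):
    cx = sum(p[0] for p in neighbors) / len(neighbors)
    cy = sum(p[1] for p in neighbors) / len(neighbors)
    cz = sum(p[2] for p in neighbors) / len(neighbors)
    return (vertex[0] - cx, vertex[1] - cy, vertex[2] - cz)

v = (0.0, 0.0, 1.0)                     # apex lifted above a flat ring
ring = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0),
        (0.0, 1.0, 0.0), (0.0, -1.0, 0.0)]
print(laplacian_coordinate(v, ring))    # (0.0, 0.0, 1.0): the local detail
```

The delta points along the surface normal, which is why preserving it under deformation keeps the high-frequency detail the abstract refers to.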
    Location-based push service system of tourism information
    2009, 29(12):  3453-3169. 
    Abstract ( )   PDF (801KB) ( )  
    Related Articles | Metrics
    A push service system for location-based tourism information was proposed by combining linear quad-tree, wireless network, Geographic Information System (GIS) and cross-media techniques. GIS data was used as the spatial index in the interactive tourism information platform and guide subsystem. Other data such as text, image, audio and video were also used to model the scene. All these types of data were organized and managed through a cross-media model, so the semantic relations among them were fully mined. The system makes tourism information easy to obtain, pushes it in time, and is more practical than other guide systems.
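The linear quad-tree index can be illustrated by its key computation: each grid cell is keyed by a Morton code, obtained by interleaving the bits of its column and row, so spatially close cells receive numerically close keys (a generic sketch, not this system's code):

```python
# Morton (Z-order) code: interleave x bits at even positions and
# y bits at odd positions, linearizing the quad-tree cells.
def morton_code(x, y, bits=16):
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code

print(morton_code(0, 0))  # 0
print(morton_code(3, 5))  # 39: x=0b011 interleaved with y=0b101
```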
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address:
No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803
  028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn