Table of Contents
01 June 2010, Volume 30 Issue 06
Network and communications
Study of directed maximum leaf out-branching
2010, 30(06): 1431-1433.
In order to solve the maximum leaf spanning tree problem in digraphs, some reduction rules were proposed. These reduction rules can reduce the size of the original digraph efficiently. An approximation algorithm was given to find an out-branching with many leaves in the reduced digraph. Furthermore, some optimization rules were given to improve the out-branching. The simulation results show that the reduction rules, approximation algorithm and optimization rules are effective.
Implementation framework of cross-platform MANET routing protocol
2010, 30(06): 1434-1438.
In order to reduce repeated work and ensure accuracy and consistency when implementing a Mobile Ad Hoc Network (MANET) routing protocol on different platforms, an implementation framework for MANET routing protocols running on Windows, Linux and NS-2 was designed, and validated by implementing Ad Hoc On-Demand Distance Vector (AODV), a classic MANET reactive routing protocol, on top of it. The abstraction method used to separate the protocol entity from its environments gives the framework good generality: it can also be applied to other MANET routing protocols or other network protocols (e.g. TCP), and can be extended to support other operating systems.
Geographic routing algorithm based on virtual coordinate system
2010, 30(06): 1439-1442.
To address the holes problem in wireless sensor networks, a novel geographic routing algorithm based on a virtual coordinate system, named Double Greedy Algorithm (DGA), was proposed. First, each node in the wireless network was allocated a set of virtual coordinates according to the network topology. Then, when the greedy algorithm based on real geographic positions failed, the greedy algorithm based on the virtual coordinates helped it recover from the dead-end situation, so the convergence of DGA was guaranteed. Unlike the GPSR algorithm, DGA is applicable to more accurate network models, such as the network model based on the log-normal shadowing model. Finally, the performance and scalability of DGA were verified by simulations.
Routing algorithms with constrained bandwidth in service overlay networks based on Kautz graphs
2010, 30(06): 1443-1446.
To realize QoS routing in Service Overlay Networks (SON), a fully distributed routing algorithm with constrained bandwidth, called Distributed Band Restricted Routing Algorithm (DBRRA), was presented for SON based on Kautz graphs. The algorithm used available network bandwidth, which reflects the real-time characteristics of the network, as the routing metric. Each node maintained only part of the link states in the network and performed bandwidth-constrained routing based on the self-routing property of Kautz graphs. The algorithm has the advantages of low computational complexity, low additional load, strong adaptability and loop freedom. The simulation results show that the routing success ratio of DBRRA is close to that of the constrained-bandwidth algorithm based on global link states.
ACO algorithm for discovery of multicast group in Ad Hoc network
2010, 30(06): 1447-1450.
The ant colony algorithm applied to Ad Hoc network multicast routing has the limitation that multiple destinations cannot be found at the same time. To overcome this limitation, an improved scheme called Contrary Ant Colony Optimization (CACO) routing algorithm was proposed. When a forward ant reached a destination node, several backward ants were copied to find routes from the contrary direction, and then the forward ant continued to find other multicast destinations in the same way. The simulation results were compared with those of the original ant colony algorithm: the delay, overhead and packet number of CACO were all better than those of ACO. The results indicate that CACO reduces the delay of finding multiple destinations and enhances the convergence rate of the ant colony algorithm.
Mixed-mode-based admission control in WLAN
2010, 30(06): 1451-1454.
A mixed-mode (model- and measurement-based) admission control algorithm was proposed, concerning the fact that IEEE 802.11e EDCA cannot offer quantitative QoS. A Markov state-transition model of the back-off instance was built, analytical expressions of the network performance indicators were obtained through Beizer reduction rules, and the achievable throughputs of newly arrived flows were predicted according to the measured channel states. Finally, an admission control algorithm based on throughput was proposed. This scheme protects admitted flows well while admitting more flows and increasing the throughput of the network.
Packet sequence set-based transmission coordination mechanism for opportunistic routing
2010, 30(06): 1455-1458.
Transmission Coordination Mechanisms (TCM) for Opportunistic Routing (OR) arrange and coordinate the transmissions of OR, and improve end-to-end throughput by reducing the total number of packet transmissions needed to deliver packets to the destination. The existing paradigms, based on batch maps, partition a communication session into packet segments and transmit each segment sequentially in batch mode. However, the success rate of transmission coordination oscillates because the number of packets transmitted in each batch oscillates. A packet sequence set-based TCM for OR that transmits packets in continuous batch mode was proposed. The proposed TCM keeps the success rate of transmission coordination high by maintaining the number of packets transmitted in each batch, and therefore improves the end-to-end throughput of OR. The simulation results show that the average end-to-end throughput gain of the proposed packet sequence set-based TCM is about 18% over the existing batch map-based TCM.
Study and improvement on coverage control algorithm in WSN
2010, 30(06): 1459-1462.
The classical Coverage-Preserving Nodes Scheduling (CPNSS) algorithm for Wireless Sensor Networks (WSN) suffers from low efficiency and energy imbalance. This paper proposed an Efficient Coverage-Preserving Nodes Scheduling (ECPNSS) algorithm, which improves the efficiency of judging redundant sensors, takes the connectivity of the network into account, and balances the network energy. The simulation results demonstrate that the ECPNSS algorithm not only preserves the original coverage, but also improves the efficiency of judging redundant nodes and reduces the redundancy of the network.
Symbol duration blind estimation of OFDM signals with low SNR
2010, 30(06): 1463-1465.
To explore parameter estimation of Orthogonal Frequency Division Multiplexing (OFDM) signals in cognitive radio systems, blind estimation of the OFDM symbol duration was achieved through the cyclic spectrum. Firstly, the cyclostationarity of OFDM signals with cyclic prefix and rectangular pulse shaping was demonstrated. Then, based on the cyclic spectrum of a single-carrier signal and the orthogonality of OFDM subcarriers, the cyclic spectrum expressions of OFDM signals were derived for the noise-free and low-SNR cases. Finally, computer simulations of both continuous and discrete signals give the same results, showing that the symbol duration of OFDM signals can be estimated blindly at low SNR through the cyclic spectrum.
Unique word based time-frequency hybrid equalizer for single carrier block transmission
2010, 30(06): 1466-1468.
For single-carrier block transmission, a new time-frequency Hybrid Decision Feedback Equalizer (H-DFE) suitable for the Unique Word (UW) extension was proposed. Owing to the known nature of the UW, the useful data and the UW passing through the channel could be separated at the receiver; equalization was then conducted and the original data restored. Based on this idea, optimal and suboptimal designs of the UW-based time-frequency Hybrid Decision Feedback Equalizer (H-DFE-UW) were proposed. The simulation results indicate that the performance of the optimal design is greatly improved and that the suboptimal one is slightly better than the Cyclic Prefix (CP)-based time-frequency Hybrid Decision Feedback Equalizer (H-DFE-CP).
Data fusion weight μ-ξ of MP_WSMN to achieve high QoS
2010, 30(06): 1469-1471.
To improve the QoS of Wireless Sensor Networks (WSN), this paper set up the sensor data review coefficient μ and the communication data importance coefficient ξ of MP_WSMN. The coefficient μ was effective in reducing the bandwidth occupied by redundant data, while congestion control and data fusion through ξ effectively increased the reliability of data transmission and reduced the packet loss rate of important data. Simulations of the improved LEACH routing algorithm and the application to a fire-detection network for ancient residences validated the conclusion.
Improved RED algorithm using the Logistic model
2010, 30(06): 1472-1474.
In order to decrease the packet loss rate, this paper introduced the Logistic equation to calculate the drop probability, based on a study of the Random Early Detection (RED) algorithm. A comparison of the simulation results of Logistic RED (LGRED) and Non-Linear RED (NLRED) shows that the improved algorithm decreases the packet loss rate by about 28.83% compared with the original RED algorithm. The results indicate that, under the same conditions, the LGRED algorithm is better at controlling packet loss and improving network performance.
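For readers unfamiliar with RED-style queue management, the sketch below illustrates the general idea of replacing the linear drop-probability ramp with a logistic (S-shaped) curve. It is a minimal Python illustration; the thresholds, the steepness parameter k and the exact parameterization are assumptions for demonstration and are not taken from the paper.

import math

def logistic_drop_probability(avg_q, min_th, max_th, max_p=0.1, k=10.0):
    # Drop probability for a RED-style queue using a logistic curve instead of
    # the classic linear ramp between min_th and max_th. The midpoint at the
    # centre of the threshold band and the steepness k are illustrative.
    if avg_q <= min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    # Normalise queue occupancy into [0, 1] over the threshold band.
    x = (avg_q - min_th) / (max_th - min_th)
    # Logistic ramp centred at x = 0.5, scaled so it never exceeds max_p.
    return max_p / (1.0 + math.exp(-k * (x - 0.5)))

# Example: average queue of 70 packets with thresholds 50/100.
print(logistic_drop_probability(70, min_th=50, max_th=100))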
Information security
Research on collaborative response mechanism of network security components
2010, 30(06): 1475-1479.
To solve the problem that network security systems cannot bring their overall advantages into play because of the lack of a coordination mechanism among components, a policy-based collaborative response mechanism was proposed. The collaborative response process was designed with a policy-driven model. The Intrusion Detection Message Exchange Format (IDMEF) was extended as the collaborative message format, and a Blocks Extensible Exchange Protocol (BEEP)-based Intrusion Detection Exchange Protocol (IDXP) was implemented for communication among security components. A collaboration module was programmed to implement the collaborative operations. An evaluation experiment on the collaborative response of security components was performed with this mechanism, and the time cost of each step was measured. The experimental results demonstrate that the mechanism can implement the collaborative response of security components effectively.
Network security status forecasting and its application in intelligent defense
2010, 30(06): 1480-1482.
Concerning the application of network security status prediction in Intelligent Security Defense Software (ISDS), this paper introduced a new prediction algorithm based on the GM(1,1) and Markov models. Firstly, the GM(1,1) model was used to predict the original status value; then the Markov model was used for error compensation. The simulation experiment shows that this prediction algorithm performs well at status prediction and is suitable for intelligent defense applications.
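As background, the GM(1,1) grey model mentioned above can be fitted by ordinary least squares on the accumulated series. The following Python sketch shows only the GM(1,1) forecasting step under the standard textbook formulas; the Markov error-compensation stage of the paper is not reproduced, and the sample series is made up for illustration.

import numpy as np

def gm11_forecast(x0, steps=1):
    # Minimal GM(1,1) grey forecasting sketch; x0 is a 1-D positive series.
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack((-z1, np.ones_like(z1)))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]   # least-squares estimate of a, b
    n = len(x0)
    def x1_hat(k):                            # fitted accumulated series
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    ks = np.arange(n, n + steps)
    return x1_hat(ks) - x1_hat(ks - 1)        # restore by first difference

print(gm11_forecast([2.67, 3.13, 3.25, 3.36, 3.56], steps=2))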
Study of P2P network model based on distributed Agent memory mechanism
Gong Shangfu
2010, 30(06): 1483-1485.
Since problems of P2P networks such as frequent node departure, the lack of a trust mechanism and limited bandwidth degrade service quality, a P2P network model based on a distributed Agent memory mechanism was presented. In this model, data were divided into several blocks; the service of each block was evaluated after a node visit, and the block content together with the updated service evaluation was stored in the neighboring node and the successor node. When a node accesses information, it optimizes its memory according to a local strategy, and regularly updates or deletes stored blocks with low service evaluations. By storing distributed data blocks and updating them dynamically through Agents, the model effectively improves data availability, stops the spread of infected documents, reduces bandwidth pressure, improves searching efficiency, and increases system security and network performance.
Comparative study on evolutionary genetic algorithm and particle swarm optimization in intrusion detection
2010, 30(06): 1486-1488.
Concerning clustering optimization in intrusion detection, Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were used to optimize clustering and were compared. In this analysis, binary coding was adopted, and the termination criterion took into account both the maximum number of iterations and the quality of convergence. A fitness function combining inter-cluster distance and intra-cluster distance was defined. Finally, experiments on the KDD CUP 1999 data set using Matlab 6.5 show that PSO is superior to GA in both the value and the speed of fitness function convergence.
Study of malware detection based on interactive behavior
2010, 30(06): 1489-1492.
Intelligent detection of malware is of significant importance in the field of malware analysis. This paper studied the automatic classification of malware based on sequences of dynamic traces. Automatic classification methods based on sliding windows of sequence characteristics cannot resist sequence confusion, noise injection and mimicry sequences designed to evade detection. This paper studied these three problems, used branching sequences, the Markov chain state-transition probability matrix and interactive objects respectively to improve the automatic classification of malware based on interactive sequences, and gave the design of the overall classification process. Finally, the experimental results show that the above-mentioned problems can be resolved effectively.
Fairness improvement of PayWord protocol based on concurrent signature
Liu Jun
2010, 30(06): 1493-1494.
In consideration of efficiency and cost, PayWord protocol lacks fairness. A new solution based on PayWord protocol was proposed to protect the payment commitment of consumer and the service commitment of provider with concurrent signature, so as to enhance the fairness of PayWord protocol. The analysis results show that the new solution can better meet the requirements of micro-payment for efficiency and fairness.
Random-key establishment algorithm of pairwise coding for sensor networks
2010, 30(06): 1495-1497.
In order to further enhance the probability of direct key establishment between nodes in wireless sensor networks, reduce the communication cost of indirect key establishment, and improve security, a new random-key establishment algorithm based on pairwise coding was proposed. In this algorithm, the nodes were first coded with random digits; then the first communication between any two nodes was set up, and the final key parameters were determined by a three-way handshake. Theoretical analysis and experimental results show that, compared with the traditional pairwise key establishment algorithm, the new algorithm achieves a higher probability of direct key establishment and better key safety with lower communication cost.
Digital multi-signature algorithm with mixed structure
2010, 30(06): 1498-1500.
With the development of informatization, digital multi-signatures mixing sequential and broadcast structures have become a research hotspot. This paper first analyzed a common mixed signature structure, and then presented an algorithm with a unitary structure that is more effective and secure. In this algorithm, broadcasting was treated as a dummy sequential structure, and a Center of Signature and Verification (CSV) was introduced into the operation. Comparative tests show that the security and efficiency of the algorithm are excellent, so the algorithm can handle digital multi-signatures with various mixed structures.
Improved algorithm for generating random CAPTCHA
2010, 30(06): 1501-1504.
Current CAPTCHAs are so simple that they can easily be recognized by automated programs, which causes many security risks. To enhance the security of Web applications, an improved CAPTCHA algorithm based on random sequences was introduced. First, the algorithm created a true-color image with a random background color. Second, the characters, and the number of characters, were randomly generated within a certain range. Finally, the algorithm placed the characters on the image at random positions, and lines were drawn between characters to indicate their input sequence. The characteristic of the algorithm is that the number, font, position and input sequence of the characters are all uncertain. The experimental results show that the CAPTCHA based on random sequences has great advantages and can provide strong security for Web applications.
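To make the idea concrete, here is a small Python/Pillow sketch of a CAPTCHA with a random background color, a random number of characters at random positions, and connecting lines that indicate the input order. The image size, font and color ranges are illustrative assumptions, not the parameters used in the paper.

import random
import string
from PIL import Image, ImageDraw, ImageFont

def random_captcha(width=160, height=60, min_chars=4, max_chars=6):
    # Random light background colour for a true-color image.
    bg = tuple(random.randint(150, 255) for _ in range(3))
    img = Image.new("RGB", (width, height), bg)
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()

    # Random number of characters, each placed at a random position.
    n = random.randint(min_chars, max_chars)
    text = "".join(random.choices(string.ascii_uppercase + string.digits, k=n))
    points = []
    for ch in text:
        x = random.randint(5, width - 15)
        y = random.randint(5, height - 15)
        color = tuple(random.randint(0, 120) for _ in range(3))
        draw.text((x, y), ch, fill=color, font=font)
        points.append((x, y))
    # Lines between successive characters encode the intended input sequence.
    for p, q in zip(points, points[1:]):
        draw.line([p, q], fill=(80, 80, 80), width=1)
    return img, text

img, answer = random_captcha()
img.save("captcha.png")
print("expected answer:", answer)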
Color image watermark based on block-DCT and Tucker decomposition
2010, 30(06): 1505-1507.
This paper proposed a new color image watermarking algorithm, which embeds the watermark into the DC coefficients of the R, G, B channels through block-DCT and Tucker decomposition. First, an 8×8 block-DCT was applied to the R, G, B channels respectively, and the DC coefficients of each block were used to construct a third-order tensor. Embedding the watermark into the core tensor obtained from Tucker decomposition diffuses the watermark into the DC coefficients of R, G and B. The experimental results indicate that this algorithm is robust to compression, noise, filtering and geometric distortion, and that it outperforms the algorithm based on the YCbCr color space in invisibility.
Artificial intelligence
Hybrid random searching algorithm for solving depot-location problem of CARP
2010, 30(06): 1508-1512.
To solve the depot-location problem of pro-environment vehicles, a Hybrid Random Searching Algorithm (HRSA) was presented. A Capacitated Arc Routing Problem (CARP) algorithm, which finds the optimal route of pro-environment vehicles once the depot is known, was used in HRSA as the evaluation function for the current depot. An improved Dijkstra algorithm for computing the initial depot was also used in HRSA to speed up its convergence. The new algorithm searches the solution space effectively by using a local searching mechanism to find a better depot near the current one and a random searching mechanism to prevent it from getting trapped in a local optimum. The validity and efficiency of HRSA are shown by experiments on the depot location of sprinkler cars.
Optimization algorithm of BP networks based on ant colony algorithm
2010, 30(06): 1513-1515.
A new BP algorithm optimized by the ant colony algorithm was proposed. The training error and test error of the BP network were used to update and select the pheromone in order to calculate the transfer probability along the ants' routes. The parameter values at sites along the ants' routes were assigned to the BP network, while the parameters and training errors kept in the storage units were changed along with the adjustment of the training error of the BP network. Through iterative, mutual optimization, the best parameters of the BP network were obtained. The BP network based on the ant colony algorithm was validated and tested on several function-optimization problems. The results show that, with training errors of the same order of magnitude, the optimized BP network needs fewer training iterations and gives higher model precision than the traditional BP algorithm.
Modified PSO hybrid algorithm
2010, 30(06): 1516-1518.
This paper proposed a novel Particle Swarm Optimization (PSO) hybrid algorithm to improve the speed and performance of PSO. The new algorithm introduced a dynamic proportion operator into the Differential Evolution (DE) algorithm, brought the mutation and crossover operators of DE into PSO, and then reconstructed the position-updating formula of PSO. Finally, four benchmark functions were chosen for testing and the results were compared with other PSO hybrid algorithms. The simulation results verify the effectiveness of this approach.
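A minimal sketch of a PSO/DE hybrid of this general kind is shown below: a standard PSO velocity/position update followed by DE mutation and crossover with greedy selection. The way the two operators are combined, and all parameter values, are assumptions for illustration; the paper's dynamic proportion operator and reconstructed update formula are not reproduced.

import numpy as np

def pso_de(f, dim, n=30, iters=200, lb=-5.0, ub=5.0,
           w=0.7, c1=1.5, c2=1.5, F=0.5, CR=0.9, seed=0):
    # Hybrid of PSO and DE/rand/1 with binomial crossover (illustrative only).
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n, dim))
    v = np.zeros((n, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_f)].copy()

    for _ in range(iters):
        # Standard PSO velocity and position update.
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)

        # DE mutation and crossover on the particle positions, greedy selection.
        for i in range(n):
            a, b, c = rng.choice(n, 3, replace=False)
            mutant = np.clip(x[a] + F * (x[b] - x[c]), lb, ub)
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, x[i])
            if f(trial) < f(x[i]):
                x[i] = trial

        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, f(g)

sphere = lambda z: float(np.sum(z * z))
print(pso_de(sphere, dim=10))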
Strategy for solving multi-issue negotiations deadlock
2010, 30(06): 1519-1522.
To solve the deadlock problem in multi-issue negotiation, an optimization strategy was presented. This paper adopted a learning method to estimate opponents' preferences over the negotiation issues, and gave full consideration to the dependencies among issues. Taking the opponent's weight values into account, the reserve values of some issues were adjusted accordingly so that the deadlock could be resolved quickly and efficiently while the negotiation participants still obtained a reasonable solution. The results show that the efficiency of the whole negotiation process improves greatly.
Collaborative filtering recommendation algorithm based on naive Bayesian method
2010, 30(06): 1523-1526.
Collaborative filtering is used extensively in personalized recommendation systems. With the development of E-commerce, the numbers of users and commodities grow rapidly, resulting in extremely sparse user rating data. To address this problem, a collaborative filtering recommendation algorithm based on the naive Bayesian method was proposed. The algorithm used an improved weighted Bayesian method to predict the ratings of unrated items. By predicting the unrated data, the sparseness of the rating data was alleviated and the accuracy of searching for the nearest neighbor items was improved. The experiment shows that the method provides better recommendation results for the system.
Clustering validity measure based on intuitionistic fuzzy inclusion degree
2010, 30(06): 1527-1529.
To measure clustering validity for intuitionistic fuzzy set data, a clustering validity technique based on intuitionistic fuzzy inclusion degree was proposed. The technique includes two evaluation factors: intuitionistic fuzzy inclusion degree and intuitionistic fuzzy division entropy. By adding the non-membership degree parameter to broaden the intuitionistic fuzzy inclusion degree, the first factor evaluates the inclusion degree between classes, while the second factor is used to verify the reliability of the clustering results. Finally, the validity of the proposed technique was checked on a classical instance.
Feature selection algorithm based on Hellinger distance
2010, 30(06): 1530-1532.
To solve the feature selection problem, two definitions of the Hellinger distance were studied in this paper, and corresponding feature selection algorithms based on the Hellinger distance were proposed. Experiments on different data sets show the efficiency of the two algorithms. Compared with other feature selection algorithms, the Hellinger distance-based algorithms select fewer features, which are useful for C4.5 and can improve the average classification accuracy on the learned data sets.
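For orientation, the Hellinger distance between two discrete distributions p and q is H(p,q) = sqrt(0.5 * Σ(√p_i − √q_i)²). The Python sketch below scores each discrete feature of a binary-class data set by this distance between its class-conditional value distributions; it is an illustrative baseline and does not reproduce the paper's two definitions or its selection procedure.

import numpy as np

def hellinger(p, q):
    # Hellinger distance between two discrete distributions p and q.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def hellinger_feature_scores(X, y):
    # Score each discrete feature by the distance between its value
    # distributions in the two classes (binary problems only in this sketch).
    X, y = np.asarray(X), np.asarray(y)
    scores = []
    for j in range(X.shape[1]):
        values = np.unique(X[:, j])
        p = np.array([np.mean(X[y == 0, j] == v) for v in values])
        q = np.array([np.mean(X[y == 1, j] == v) for v in values])
        scores.append(hellinger(p, q))
    return np.array(scores)

X = np.array([[0, 1], [0, 0], [1, 1], [1, 0]])
y = np.array([0, 0, 1, 1])
print(hellinger_feature_scores(X, y))   # the first feature separates the classes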
Rich-club phenomenon based search immunization
2010, 30(06): 1533-1535.
In order to eradicate an epidemic quickly with few immunized nodes, a great many immunization strategies have been proposed, among which acquaintance immunization is the most effective local strategy. Based on the rich-club phenomenon of BA scale-free networks and breadth-first search, this paper proposed a rich-club phenomenon-based search immunization strategy that requires only local degree information. Two types of immunization strategies, which differ in how the neighbors' degrees are searched, are given and can be applied under different cost requirements. The RPBSI algorithm can eradicate the epidemic by immunizing a smaller fraction of the nodes than acquaintance immunization in both BA scale-free networks and scientific collaboration networks.
Continuous attributes reduction algorithm of decision table based on hard C-means clustering
2010, 30(06): 1536-1538.
To solve the problems of poor adaptability of reduction over continuous domains and the failure to capture the actual relationships among fuzzy sets, a new attribute reduction algorithm for decision tables based on Hard C-Means (HCM) clustering was proposed. First, continuous attribute values were transformed into fuzzy values with a triangular membership function, and then HCM clustering was applied to obtain the relationships among the fuzzy sets. Finally, the simulation results show the effectiveness of the proposed method.
Classification of data stream based on dynamic feature extraction and neural network
2010, 30(06): 1539-1542.
To improve the accuracy and adaptability of data stream classification, this paper presented a new classification method. The method used total least squares to fit segments of the data stream, and presented a variable sliding window algorithm, combining the Sliding Window (SW) algorithm with extrapolation for the On-line Segmentation of Data (OSD) algorithm, to achieve a reasonable segmentation and improve the accuracy of trend analysis. After extracting the dynamic features of the data stream, a neural network was used for pattern recognition and classification, so the method can provide early warnings, severity assessments of the monitored subjects and information for decision support. The test results show that this method describes the dynamic characteristics of the data stream effectively, and the classification effect is evident.
Application of SOFM network in building project classification
2010, 30(06): 1543-1546.
Traditional project cost estimation in architecture suffers from problems such as high time consumption, complicated calculation and frequent measurement errors. Therefore, a clustering method that processes architecture samples with a Self-Organizing Feature Map (SOFM) network was proposed. This method does not need manually labeled training data to classify different sorts of samples, and it helps to improve the efficiency of traditional architectural project cost estimation. Finally, the availability of the algorithm was proved. Compared with the traditional methods, the experimental results demonstrate that the improved method has a higher accuracy rate and a lower false positive rate.
Tuning PID parameters with improved particle swarm optimization
2010, 30(06): 1547-1549.
The performance of a PID controller depends on the combination of its control parameters. An improved particle swarm optimization was proposed for tuning and optimizing the PID parameters, applying an interval algorithm and roulette wheel selection to the initialization of particle locations. The simulation and experimental results show that the proposed algorithm can overcome premature convergence, reduce the influence of the random initial population, and improve the convergence precision, indicating good application prospects.
Optimal operation of reservoir based on dynamic programming and particle swarm optimization
2010, 30(06): 1550-1551.
Reservoir optimal scheduling is a typical multi-constrained, dynamic, non-linear optimization problem. To solve it, a Dynamic Programming-Particle Swarm Optimization (DP-PSO) algorithm was used. The algorithm applied the optimality principle of Dynamic Programming (DP) to convert the reservoir optimal scheduling problem into multistage decision-making sub-problems, and each sub-problem was solved by particle swarm optimization. The numerical experiments show that the reliability of DP-PSO is superior to that of the general DP algorithm despite needing more calculation time, while the calculation time of DP-PSO is less than that of the DP-Genetic Algorithm (DP-GA).
Graphics and image processing & pattern recognition
Survey on road image interpretation based on monocular vision
2010, 30(06): 1552-1555.
Vision-based road images provide rich information about the local environment during travel. These image sequences are analyzed and interpreted in many fields such as automatic guided vehicles, mobile robots and driving assistance systems. The technologies and methods for road image processing, analysis and interpretation in lane detection and lane departure warning systems were introduced in detail, and the technologies of lane recognition and tracking were compared. Finally, future research and development directions were given.
Image denoising based on graph regularization and nonsubsampled Contourlet transform
2010, 30(06): 1556-1558.
The nonsubsampled Contourlet transform was used to capture the features such as edge, contour and texture. After that, a weighting function was generated by using the features. Finally, graph regularization equation was used to filter the noisy image. The simulation results show that the proposed method can effectively remove the noise and is superior to other partial differential equation methods.
Fast template matching algorithm
2010, 30(06): 1559-1561.
The traditional template matching method has low efficiency and low speed, so this paper proposed a fast template matching algorithm. At the beginning, only a small subset of points is involved in template matching, and gradually more and more points take part. By comparing correlation coefficients, the algorithm decides whether to increase the number of matching points or to abandon the current matching position and move to a new one. When updating the correlation coefficient, only the newly added points are computed and the result is merged with the previous correlation coefficient, which greatly reduces the computation of the algorithm. The points involved in calculating the correlation coefficient are always distributed uniformly over the template, which ensures the accuracy of the method. The proposed algorithm has high accuracy and high speed and can satisfy real-time requirements.
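The following Python sketch illustrates the coarse-to-fine idea for a single candidate position: the correlation coefficient is accumulated from running sums as more template points are added, and the position is abandoned early when the partial score stays low. The staging scheme, the reject threshold and the random point ordering are assumptions for illustration, not the paper's exact procedure.

import numpy as np

def staged_match_score(image, template, top_left, stages=4, reject=0.3):
    # Partial normalized correlation at one candidate position, refined in stages.
    th, tw = template.shape
    y0, x0 = top_left
    window = image[y0:y0 + th, x0:x0 + tw].astype(float).ravel()
    t = template.astype(float).ravel()

    order = np.random.permutation(th * tw)       # template points in random order
    Sx = Sy = Sxx = Syy = Sxy = n = 0.0
    r = 0.0
    for chunk in np.array_split(order, stages):
        a, b = window[chunk], t[chunk]
        # Merge the new points into running sums instead of recomputing everything.
        n += len(chunk)
        Sx += a.sum(); Sy += b.sum()
        Sxx += (a * a).sum(); Syy += (b * b).sum(); Sxy += (a * b).sum()
        denom = np.sqrt((n * Sxx - Sx ** 2) * (n * Syy - Sy ** 2))
        r = (n * Sxy - Sx * Sy) / denom if denom > 0 else 0.0
        if r < reject:                            # abandon unpromising positions early
            break
    return r

img = np.random.randint(0, 256, (100, 100))
tpl = img[30:46, 40:56].copy()
print(staged_match_score(img, tpl, (30, 40)))     # close to 1.0 at the true position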
Image de-noising algorithm based on Shearlet transform
2010, 30(06): 1562-1564.
Concerning the deficiencies of de-noising algorithms based on traditional transform domains, this paper proposed an image de-noising algorithm based on the Shearlet transform. Firstly, a Shearlet decomposition and reconstruction implementation was presented. Then, the Monte-Carlo method was used to estimate the high-frequency coefficients, and finally shrinkage de-noising was performed according to the threshold function. The experimental results demonstrate that the method can remove noise while preserving edges, and obtains better visual effect and higher PSNR.
Architectural ceramic pattern design based on fractal template
2010, 30(06): 1565-1567.
Architectural ceramic patterns are required to support 2-linear tiling and 4-linear tiling so that the product can be tiled consecutively. This paper discussed the steps of designing an architectural ceramic pattern and analyzed the application of such patterns. It studied in detail the principle of generating architectural ceramic patterns by integrating pattern templates with fractals, and finally proposed and implemented the corresponding algorithm. The obtained patterns have obvious characteristics of 2-linear tiling and 4-linear tiling and satisfy the application requirements of architectural ceramics.
Gray target tracking based on multi-level texture feature and Mean-Shift
2010, 30(06): 1568-1572.
In gray image sequence, due to the sensitivity to illumination variation and the lack of the information for target representation, target tracking is very difficult. This paper proposed a novel tracking algorithm, which integrated the target's Gabor wavelet transform features and rotation invariance uniform Local Binary Pattern (LBP) texture description operator to construct the target's multi-level texture feature models, and used Mean-Shift to track. The algorithm first adopted Gabor wavelet transform to extract multi-scale and multi-orientation features of target to extend the range of feature extraction, and then the rotation invariance uniform LBP operator was applied to encode these features to enhance the validity of the extracted features. Finally, the target's multi-level Gabor-LBP texture feature models were constructed by texture pattern joint probability histograms, and Mean-Shift was adopted to track. The experimental results show that this algorithm can effectively cope with illumination variation, clutter and rotation in gray target tracking.
Target tracking based on improved Mean-Shift and adaptive Kalman filter
2010, 30(06): 1573-1576.
In this paper, a target tracking algorithm was proposed by combining an improved Mean-Shift algorithm with an adaptive Kalman filter. For a selected moving object, frame differencing and region growing were used to segment the target, and its dominant color was extracted. In the tracking process, the initial iterative position in every frame was obtained by the adaptive Kalman filter, and the tracking result obtained by the improved Mean-Shift was fed back to the adaptive Kalman filter as the measurement for correction. The estimation parameters of the adaptive Kalman filter were adjusted adaptively according to the occlusion ratio. The experimental results demonstrate that the proposed algorithm can detect and track the moving object consecutively in video and has better robustness to occlusion.
Generation method of 3D tunnel based on vector model
2010, 30(06): 1577-1580.
On the basis of a vector model, this paper presented a method that can generate a 3D tunnel and smooth the joints at the same time while ensuring that the shape of the generated tunnel is correct. It first places a series of vertical sections along the center lines. After calculating the starting points of parallel lines according to the tunnel shape, it obtains points by intersecting the parallel lines with the sections, and finally connects the corresponding points on adjacent sections to build the model. Concerning the tunnel splicing problem, the method inserts rotated sections at the joints, generating the tunnel and smoothing the joints at the same time. This ensures the seamlessness of the tunnel joints and simplifies the method as well.
Illumination processing algorithm based on vector calculation
2010, 30(06): 1581-1583.
A novel algorithm on illumination processing was proposed based on High Dimensional Image Geometry and Biomimetics Informatics (HDIGBI), which is a brand new theory developed in recent years. From the angle of high dimensional space vector, illumination of the image was analyzed. Adopting high dimensional space vector to describe image and illumination, the image under even illumination was obtained by calculating the difference between the image vector and the illumination vector. The experiments demonstrate that this method is effective and simple to realize.
Fast intersecting feature detection approach based on original face naming mechanism
2010, 30(06): 1584-1586.
According to the feature intersecting issue in the research and application of feature technology, a new approach for detecting intersecting features was proposed. The approach was based on original face naming mechanism for topological entities and took the topological edges in geometry model as a core. Firstly, the two adjacent faces' name attributes of all the topological edges were checked in geometry model. Then, the topological relation, which is between the topological edges in geometry model and the corresponding original faces in the feature volumes or between the original faces in two different feature volumes, was determined. Using the proposed approach, the intersecting features can be detected correctly. Furthermore, the approach is of high efficiency because it replaces Boolean operations by just checking names of features' faces.
Polarized characteristics image segmentation based on minimum cut
2010, 30(06): 1587-1589.
It is difficult to obtain accurate segmentation results because of the inherent speckle noise of Polarimetric Synthetic Aperture Radar (POL-SAR) images; therefore, a graph-based POL-SAR image segmentation method was presented. The method uses the K-Means clustering algorithm to obtain an initial label for every pixel with reference to the combined multiple polarized characteristics. A labeling energy function is then established and the corresponding networks are constructed, after which the minimum cut method is used to obtain an approximately optimal solution of the global energy function, so that every pixel receives a proper label and the classification is completed. Compared with traditional segmentation methods, this method adequately considers the global information and the polarized characteristics of the POL-SAR image, so it achieves accurate segmentation. The experimental results show that the algorithm has a better segmentation effect.
Classification of hyperspectral remote sensing images with dynamic support vector machine ensemble
2010, 30(06): 1590-1593.
Based on Bagging Support Vector Machine (SVM), this paper applied the dynamic ensemble selection technique to SVM ensemble learning and investigated the application of a dynamic SVM ensemble to the classification of hyperspectral remote sensing images. Considering the characteristics of hyperspectral data, Bagging SVM was improved by random feature-subspace selection and feedback learning; the algorithm for computing the local area of the K nearest neighbors was improved by adopting a composite distance; and the validation set was made more representative by appending the misclassified training samples to it. The experimental results show that, compared with a single optimized SVM and other popular SVM ensemble methods, the improved dynamic SVM ensemble exhibits the highest classification accuracy and can effectively improve the classification precision of hyperspectral remote sensing images.
Face skin color detection in complicated illumination conditions
2010, 30(06): 1594-1596.
Face skin color detection is sensitive to illumination variations. This paper presented a novel method to detect face skin color regions under complicated illumination conditions. Firstly, a skin color model for complicated illumination conditions was created in the YCbCr color space. Secondly, the skin color model was applied to detect the face skin color regions in the input image. Finally, 4-connected regions were computed in the detection result and used to eliminate non-face regions and recover the real face regions. The experimental results demonstrate that the proposed method can detect face skin color regions correctly under complicated illumination conditions.
Fabric defect detection based on adaptive LBP and SVM
2010, 30(06): 1597-1601.
An improved local binary patterns method was proposed to describe the main image features. The Adaptive Local Binary Patterns (ALBP) method selects the frequently occurring patterns to construct the main pattern set, which avoids using the same pattern set to depict different texture structures as the traditional uniform local binary patterns do. Based on the proposed method, an effective fabric defect detection algorithm using Support Vector Machine (SVM) was designed. First, the features of the training samples were extracted according to the main pattern set and fed to the SVM. Then the test image was divided equally into detection windows, from which ALBP features were also extracted and classified by the trained SVM model. The experiments show that the detection performance of the proposed method is better than that of the traditional LBP in terms of visual effect and detection accuracy.
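As a reference point, the sketch below computes basic 8-neighbour LBP codes and then keeps a normalized histogram over the most frequent codes, loosely mirroring the idea of an adaptive "main pattern" set. It omits the rotation-invariant uniform mapping and the SVM stage; the choice of k and the selection rule are assumptions, not the paper's method.

import numpy as np

def lbp_image(gray):
    # Basic 8-neighbour LBP codes for the interior pixels of a gray image.
    g = np.asarray(gray, dtype=float)
    c = g[1:-1, 1:-1]
    # Offsets of the 8 neighbours, clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= ((neigh >= c).astype(np.uint8) << bit)
    return code

def main_pattern_histogram(gray, k=16):
    # Histogram restricted to the k most frequent LBP codes, normalised.
    codes = lbp_image(gray).ravel()
    hist = np.bincount(codes, minlength=256).astype(float)
    main = np.argsort(hist)[::-1][:k]          # the adaptive "main pattern" set
    h = hist[main]
    return main, h / h.sum()

img = np.random.randint(0, 256, (64, 64))
patterns, hist = main_pattern_histogram(img)
print(patterns[:5], hist[:5])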
Pseudo-color enhancement for typhoon cloud image based on Berkeley wavelet transform
2010, 30(06): 1602-1605.
An efficient pseudo-color enhancement method for typhoon cloud images was proposed based on the Berkeley Wavelet Transform (BWT) and a linear-assignment pseudo-color enhancement method. Details of the cloud were enhanced in the Berkeley wavelet domain, and the enhanced cloud image was then colored by the classical linear-assignment method and by an improved version respectively. The proposed method was compared with the classical linear-assignment method and with the classical discrete wavelet transform combined with the improved linear-assignment method. The experimental results show that the proposed method is simple, feasible and of low computational burden, and provides a new technique for enhancing the contrast of low-contrast typhoon cloud images. Using the proposed method, the typhoon eye and spiral cloud bands can be clearly highlighted, which provides a reference for locating the center and predicting the intensity of the typhoon.
Pavement crack detection method based on multilevel denoising model
2010, 30(06): 1606-1609.
In order to remove the influence of various kinds of noise in pavement surface crack detection, the advantages and disadvantages of existing de-noising models were analyzed. Combining the advantages of existing de-noising algorithms with the characteristics of the noise and crack information in pavement images, a novel multilevel de-noising model for pavement crack detection was presented. The whole model consists of a gray de-noising model, a spatial filtering de-noising model, a crack-characteristics de-noising model and a geometrical-features de-noising model. The experimental results indicate that the proposed model is more effective for pavement image de-noising and crack extraction.
2D circle calibration method based on Matlab/Simulink
2010, 30(06): 1610-1612.
In order to handle a relatively flexible calibration background and meet a relatively low calibration precision requirement but a relatively high real-time requirement of the traditional 2D planar target calibration method, an improved calibration method based on traditional 2D calibration was put forward, taking the driving environment of small unmanned ground weapon mobile platforms as the calibration environment. Implemented in the Matlab/Simulink environment, this method replaced the detection of diamond corner points in traditional calibration with the detection of circle centers, which reduced the computation to about a quarter of that of the diamond corner-point method, shortened the time for extracting feature points and enhanced the real-time performance of the system. The experimental results demonstrate that this method is simple, effective, practical and flexible, and that it can be applied against any background.
Algorithm for detection of ROI based on fractal and wavelet
2010, 30(06): 1613-1615.
Concerning the poor performance and high computation complexity of a single fractal feature applied in detecting Region Of Interest (ROI) of man-made objects, an algorithm for detection of ROI based on fractal and wavelet was proposed. Firstly, the original image was decomposed to sub-images. Secondly, new fractal feature of low frequency sub-image was computed utilizing fractal intercept feature and fractal fitting error, thus ROI of low frequency sub-image could be gained. Finally, ROI of original image was obtained using the relationship between original image coordinates and sub-image coordinates. The experimental results show that the algorithm proposed has very good effect in detecting ROI of image and reducing computational complexity.
Connected component labeling algorithm based on run recursive method
2010, 30(06): 1616-1618.
A run-based recursive labeling algorithm for binary images was proposed on the basis of previous algorithms. The algorithm scans the whole image row by row until an unlabeled run is found, and then recursively searches the runs connected to it until a connected component is generated. During the search, the connected runs are traced forward and backward on the neighboring rows from the start point of the current run. The search is then optimized according to the relationship between two runs, so the search time is reduced and no run is searched repeatedly. The proposed algorithm completes the labeling with only one scan, and comparative experiments with traditional algorithms show that it is more efficient and occupies less memory. It can detect moving targets on construction sites in real time.
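A compact Python sketch of run-based connected component labeling is given below: runs are extracted per row, and a component is grown through 4-connected runs on neighbouring rows (an explicit stack stands in for the paper's recursion). The brute-force neighbour search here is for clarity only and lacks the paper's optimized search.

import numpy as np

def extract_runs(binary):
    # Foreground runs per row as (row, col_start, col_end), inclusive.
    runs = []
    for r, row in enumerate(np.asarray(binary) != 0):
        c = 0
        while c < len(row):
            if row[c]:
                start = c
                while c < len(row) and row[c]:
                    c += 1
                runs.append((r, start, c - 1))
            else:
                c += 1
    return runs

def label_runs(binary):
    # Grow one connected component at a time from an unlabeled seed run.
    runs = extract_runs(binary)
    labels = [0] * len(runs)
    current = 0
    for i in range(len(runs)):
        if labels[i]:
            continue
        current += 1
        labels[i] = current
        stack = [i]
        while stack:
            rk, sk, ek = runs[stack.pop()]
            for j, (rj, sj, ej) in enumerate(runs):
                # 4-connected: adjacent row and overlapping column ranges.
                if not labels[j] and abs(rj - rk) == 1 and sj <= ek and ej >= sk:
                    labels[j] = current
                    stack.append(j)
    return [(r, s, e, l) for (r, s, e), l in zip(runs, labels)]

img = np.array([[1, 1, 0, 0, 1],
                [0, 1, 0, 1, 1],
                [0, 0, 0, 0, 0],
                [1, 0, 1, 1, 0]])
for labeled_run in label_runs(img):
    print(labeled_run)    # (row, start, end, component label)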
Extraction of salient regions in image
2010, 30(06): 1619-1621.
Salient regions may express the main content of an image, and their extraction plays an important role in image retrieval and sensitive image identification. First, the saliency maps of the image were extracted in combination with multi-scale analysis. Then, the previously segmented regions were accepted or rejected according to an empirical threshold applied to the saliency maps. Finally, the salient regions of the image were determined according to the preceding judgment. Good results are achieved, and the algorithm is fast and easy to implement compared with the Itti method.
Image quality assessment based on structural orientation information
2010, 30(06): 1622-1625.
Compared with the traditional Peak Signal-to-Noise Ratio (PSNR) method, the Structural Similarity (SSIM) method achieves better image assessment by measuring the structural similarity between the reference image and the distorted image, but it does not extract the structural information completely. Based on the SSIM method, the orientation information was further extracted and a Local Structural Orientation Similarity (LSOS) measure was proposed. Then, the LSOS method was combined with the existing SIExt algorithm, and an image quality assessment method based on Structural Orientation Information (SOI) was presented. The experiments show that the SOI model assesses distorted images more precisely than the SIExt and SSIM methods.
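For reference, the standard SSIM index that the LSOS measure builds on is, for image patches x and y,

\mathrm{SSIM}(x,y)=\frac{(2\mu_x\mu_y+C_1)\,(2\sigma_{xy}+C_2)}{(\mu_x^2+\mu_y^2+C_1)\,(\sigma_x^2+\sigma_y^2+C_2)}

where μ, σ² and σ_xy are the local means, variances and covariance of the two patches, and C_1, C_2 are small stabilizing constants; the orientation term added by LSOS is not shown here.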
Fast multi-frame motion estimation algorithm for H.264
2010, 30(06): 1626-1628.
Multiple reference frame motion compensation was introduced into H.264/AVC to improve video coding performance. However, the resulting Multiple Reference Frame Motion Estimation (MRF-ME) leads to much higher computational cost. To improve the encoding speed and reduce the computational complexity, a new algorithm called DDS based on spatial correlation was proposed. First, the algorithm continuously updates the predictive motion vector using Forward Dominant Vector Selection (FDVS), and then searches different reference frames with different templates according to the statistical characteristics of the location of the optimal reference frame. The experimental results show that, compared with the H.264 reference software JM10.2, the algorithm decreases the number of search points by 80% and considerably reduces the complexity of the encoder while keeping the Peak Signal-to-Noise Ratio (PSNR) of the pictures and the bit rate nearly unchanged.
Artistic style generation by image analogy using ant algorithm based on comprehensive area features
Qian Shao
2010, 30(06): 1629-1631.
This paper presented a robust scheme for texture synthesis in the process of artistic style learning, based on the comprehensive area features of the segmented image blobs, which speeds up the process by exploiting the ant colony algorithm for texture blob matching. The scheme also ensures smooth artistic style transition by setting weights for different sample images, shortens the time cost of the style generation process, and can transfer the styles of several images to the targets. The preliminary experiments illustrate the good performance of the scheme. Above all, random algorithms such as ant colony search can enhance the comprehensive performance of the artistic style generation system.
Improved brightness preserving bi-histogram equalization algorithm
WU Ying
2010, 30(06): 1632-1634.
Based on Brightness-preserving Bi-Histogram Equalization (BBHE), an improved gray image enhancement algorithm was proposed. An appropriate threshold, selected according to the entropy of the output image and the difference between the mean brightness of the input and output images, was used to split the image; then BBHE and gray level homogenization were performed respectively. This method keeps the brightness mean error as small as possible and the entropy of the output image as large as possible, while preventing over-enhancement. The experimental results prove that the new method has better image enhancement performance.
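The sketch below shows classic BBHE in Python/NumPy: the histogram is split at a threshold (the global mean by default) and each part is equalized within its own gray range, which keeps the mean brightness near the threshold. The paper's entropy- and brightness-error-based threshold selection is not reproduced; the default mean threshold is an assumption.

import numpy as np

def equalize_range(values, hist_lo, hist_hi, out_lo, out_hi):
    # Histogram-equalize `values` (all in [hist_lo, hist_hi]) into [out_lo, out_hi].
    hist = np.bincount(values - hist_lo, minlength=hist_hi - hist_lo + 1)
    cdf = np.cumsum(hist) / max(hist.sum(), 1)
    return (out_lo + (out_hi - out_lo) * cdf[values - hist_lo]).astype(np.uint8)

def bbhe(gray, threshold=None):
    # Split the image at the threshold and equalize each part in its own range.
    g = np.asarray(gray, dtype=np.uint8)
    t = int(g.mean()) if threshold is None else int(threshold)
    out = np.empty_like(g)
    low, high = g <= t, g > t
    if low.any():
        out[low] = equalize_range(g[low], 0, t, 0, t)
    if high.any():
        out[high] = equalize_range(g[high], t + 1, 255, t + 1, 255)
    return out

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(img.mean(), bbhe(img).mean())    # mean brightness stays near the threshold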
Software process technology & Chinese information processing
Design of modeling and real-time simulating software YH-RTSIM based on RTX
Jiang ZhiWen
2010, 30(06): 1635-1637.
In order to meet the higher real-time requirements of hardware-in-the-loop simulation of new flight vehicles, the modeling and real-time simulation software YH-RTSIM was developed based on Real-Time eXtension (RTX). The architecture of YH-RTSIM consists of a modeling integration environment, a Windows process and an RTSS process, which not only enhances real-time performance but also allows users' old simulation programs to run after only small changes. A shared-memory communication algorithm between the processes was designed to store and display real-time data, and a frame-time control algorithm was designed to ensure a precise frame time. Tests and applications demonstrate that YH-RTSIM runs correctly; the frame-time deviation during simulation is within -0.0004 ms to 0.0004 ms, less than 1 μs, which meets the requirements of hardware-in-the-loop simulation of new flight vehicles.
Request routing algorithm for service discovery based on load balancing in SOA
2010, 30(06): 1638-1641.
Service discovery is a very important part of Service-Oriented Architecture (SOA). However, current service discovery algorithms do not take the load balancing of service discovery nodes into account, so they cannot meet the search-efficiency requirements under frequent requests. To solve this problem, a distributed request routing algorithm for service discovery named HaFA, which considers node capacity and network latency, was proposed for a distributed service registry center in SOA. The algorithm measures a node's computation power by its load, solving the problem that tasks may still be assigned to a weak node after the load degree has been balanced, and thus improves the utilization of computation resources in the service registry center. It uses the load fluctuation rate of a node to estimate its load degree at the next discrete time point, which counters the impact that load fluctuation during network delay has on load balancing. The experimental results show that the HaFA algorithm can effectively improve system throughput in distributed service discovery and shorten the average waiting time for responses.
Design of distributed and self-adaptive performance monitoring system based on software rejuvenation
2010, 30(06): 1642-1644.
According to the theory of software rejuvenation, the waste of system resources is the major factor in the performance degradation of a computing system, so designing a performance monitoring system is useful for maintaining system performance. The monitoring system releases the wasted resources at the appropriate time by collecting and analyzing run-time system resource data. A C/S mode was used to reduce the load on the monitored system, keep the monitor side lightweight, and achieve asynchronous monitoring of the monitored system. Self-adaptive adjustment of the monitoring parameters was implemented with a self-organizing map network, several models were provided to analyze and forecast system performance, and a simple decision-making method was designed to support reboot control. Finally, the experiments demonstrate that the self-adaptive collection strategy effectively reduces the data collected and transmitted, keeps the monitor side lightweight and low-load, and minimizes the impact of the monitoring system on the monitored system.
Design and implementation of dynamic testing framework based on base object model
2010, 30(06): 1645-1647.
This paper proposed the design and implementation of a test platform for Base Object Model (BOM)-based simulation models, to enhance model reliability and shorten the simulation system integration cycle during testing. An independent model test platform was designed, which provides object management, data distribution, time management and other services needed for running BOM models. The running results of the simulation models were analyzed, and the model interfaces and functions were verified. The model test platform provides a running environment for all types of simulation models, its operation efficiency meets the requirements of super-real-time simulation, and it increases model integration efficiency. In the development of large-scale combat simulation systems, promoting the model test platform can improve model reliability, reduce integration costs and shorten the integration period.
Failure model of software reliability based on LSSVRM and simulated annealing algorithm
2010, 30(06): 1648-1650.
Abstract | PDF (401KB) | Related Articles | Metrics
According to the characteristics of software failure data, the Least Squares Support Vector Regression Machine (LSSVRM) was applied to build a software failure model. The LSSVRM was used to construct a fitting model from the small-sample failure data, and its parameters were optimized by the simulated annealing algorithm to obtain SA-LSSVRM and further improve the fit. Compared with the failure model based on the Non-Homogeneous Poisson Process (NHPP), the SA-LSSVRM failure model achieves higher fitting precision and simplifies the adjustment of model parameters.
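A hedged sketch of the overall procedure: simulated annealing tuning the two hyper-parameters of a least-squares kernel regression fitted to failure data. scikit-learn's KernelRidge (ridge regression with an RBF kernel) stands in for the LSSVRM, scipy's dual_annealing for the simulated annealing step, and the failure data are synthetic.

```python
# Illustrative SA-tuned kernel regression on synthetic failure-time data.
import numpy as np
from scipy.optimize import dual_annealing
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

t = np.arange(1, 31, dtype=float).reshape(-1, 1)   # failure index
y = np.log1p(t).ravel() * 10 + np.random.default_rng(0).normal(0, 0.3, 30)

def objective(params):
    """Cross-validated MSE of the regressor for a given (alpha, gamma)."""
    log_alpha, log_gamma = params
    model = KernelRidge(alpha=10.0 ** log_alpha, kernel="rbf",
                        gamma=10.0 ** log_gamma)
    return -cross_val_score(model, t, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

result = dual_annealing(objective, bounds=[(-6, 2), (-6, 2)], maxiter=50)
print("best (log10 alpha, log10 gamma):", result.x)
```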
Method-slicing and entry-dependency based research for regression test
2010, 30(06): 1651-1654.
Abstract | PDF (604KB) | Related Articles | Metrics
In order to improve the performance of regression testing for large software systems, the idea of slicing was applied to the selection of regression test cases, and method-slicing and entry-dependency were proposed. The method-slicing algorithm slices the code with the method as the basic unit, so that only the test cases influenced by the revised source code need to be re-run. The experimental results demonstrate that method-slicing dramatically improves the efficiency of regression testing for large software and reduces the cost of software testing to some extent.
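The selection step reduces to a set intersection once each test case is mapped to the methods its slice reaches; a minimal sketch follows (the test-to-method map and the change set below are invented for illustration).

```python
# Re-run only the test cases whose method-level slice touches changed methods.
def select_tests(test_slices: dict[str, set[str]],
                 changed_methods: set[str]) -> list[str]:
    """Return the test cases affected by the revised source code."""
    return [test for test, methods in test_slices.items()
            if methods & changed_methods]

if __name__ == "__main__":
    slices = {"test_login": {"auth.check", "db.query"},
              "test_report": {"report.render", "db.query"},
              "test_help": {"ui.show_help"}}
    print(select_tests(slices, changed_methods={"db.query"}))
    # -> ['test_login', 'test_report']
```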
Research of semantic retrieval system based on domain-ontology and Lucene
2010, 30(06): 1655-1657.
Abstract | PDF (681KB) | Related Articles | Metrics
Semantic similarity is the crucial factor affecting the precision and recall of a semantic information retrieval system. This paper put forward an improved semantic similarity computation model, which was used to quantify the association between concepts; the scope of the expanded concept set was then adjusted by a similarity threshold. A domain-ontology-based semantic information retrieval system was designed on top of the open-source full-text search engine Lucene: the original query terms were expanded before the query was submitted to Lucene, and semantic similarity was used as the key factor in ranking the search results. The experimental results show that the similarity produced by this model is closer to expert judgments, and the precision of the system is greatly improved compared with the original Lucene system.
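A small sketch of the similarity-thresholded query expansion described above; the similarity table stands in for the improved similarity model, and the expanded term list would normally be handed to Lucene.

```python
# Expand a query with ontology concepts whose similarity passes a threshold.
def expand_query(terms: list[str],
                 similarity: dict[str, dict[str, float]],
                 threshold: float = 0.6) -> list[str]:
    """Add every concept whose similarity to a query term reaches the
    threshold; the original terms are kept first."""
    expanded = list(terms)
    for term in terms:
        for concept, score in similarity.get(term, {}).items():
            if score >= threshold and concept not in expanded:
                expanded.append(concept)
    return expanded

sim = {"laptop": {"notebook": 0.9, "computer": 0.7, "tablet": 0.4}}
print(expand_query(["laptop"], sim))   # -> ['laptop', 'notebook', 'computer']
```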
Research on ESB framework for enterprise application integration
2010, 30(06): 1658-1660.
Abstract | PDF (490KB) | Related Articles | Metrics
Enterprise Service Bus (ESB) is a standard software architecture based on an event-driven messaging engine, of which the message bus is the foundation. In order to achieve an effective, secure and available ESB, this paper analyzed the key components and working principles of messaging on a traditional ESB, and then proposed the design and implementation of an enterprise-level messaging system providing message splitting, message compression, message encryption, resumable transmission and queue clustering. The application results show that improving the messaging model can further improve the performance of the ESB.
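For illustration, a toy sketch of two of the listed messaging functions, splitting and compression, on the sending side; the chunk size and framing are invented, and encryption, resumable transmission and queue clustering are omitted.

```python
# Sending-side message preparation: compress, then split into fragments.
import zlib

def split_message(payload: bytes, chunk_size: int = 64 * 1024) -> list[bytes]:
    """Split a large message body into fixed-size fragments."""
    return [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]

def prepare_for_send(payload: bytes) -> list[bytes]:
    """Compress the message body, then split it for transport."""
    return split_message(zlib.compress(payload))
```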
Large-scale document forward detection algorithm based on agglomerate-term
2010, 30(06): 1661-1663.
Abstract | PDF (590KB) | Related Articles | Metrics
Document forward detection aims to find collections of articles with identical or near-identical content in a large-scale text corpus, and is widely needed for discovering popular articles, organizing search engine results, copy detection and so on. To cope with the growing diversity of text forwarding on the Internet and to improve system efficiency, this paper analyzed relevant text features and existing comparison algorithms, and then introduced a large-scale document forward detection algorithm based on agglomerate-terms. Its principle is as follows: first, detect and extract agglomerate-terms according to term distribution and use them as the key features characterizing a text; then apply an extensive linear comparison and a multi-dimensional comparison on these features; finally, compute the detection result. The experimental results show that the agglomerate-term algorithm achieves better overall performance in precision, recall and speed.
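A hedged sketch of the comparison stage: characterize each document by a small key-term set and treat a high overlap ratio as a forwarding match. The term-extraction step here (top-k by frequency) is only a placeholder for the paper's distribution-based agglomerate-term selection.

```python
# Key-term overlap as a crude forwarding test between two documents.
from collections import Counter

def key_terms(tokens: list[str], k: int = 20) -> frozenset[str]:
    """Placeholder feature extraction: the k most frequent terms."""
    return frozenset(t for t, _ in Counter(tokens).most_common(k))

def is_forward(doc_a: list[str], doc_b: list[str], threshold: float = 0.8) -> bool:
    a, b = key_terms(doc_a), key_terms(doc_b)
    if not a or not b:
        return False
    return len(a & b) / len(a | b) >= threshold   # Jaccard overlap of key terms
```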
Government information retrieval based on domain ontology
2010, 30(06): 1664-1667.
Abstract | PDF (686KB) | Related Articles | Metrics
There are two main problems in existing government information retrieval systems. First, search based on keyword matching ignores semantic information and therefore cannot describe documents accurately. Second, due to a lack of domain knowledge, users are often unclear about what they really want. To solve these problems, this paper presented a domain-ontology-based government information retrieval approach and designed the corresponding algorithms for concept extraction and query expansion. The experimental results show that the proposed approach improves the recall and precision of information retrieval.
Combined measurement approach for semantic similarity of terms
2010, 30(06): 1668-1670.
Abstract | PDF (423KB) | Related Articles | Metrics
Measuring the semantic similarity of terms is a key issue in many research fields. This paper proposed a method based on the Directed Acyclic Graph (DAG) of terms and their intrinsic information content. It first computes the sub-graph of each of the two terms in the DAG, and then the intersection and union of the two sub-graphs. The semantic similarity of the two terms is the ratio of the total intrinsic information content of the terms in the intersection to that of the terms in the union. The experimental results show that the method achieves higher accuracy.
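The measure itself is easy to state in code: take each term's sub-graph (the term plus everything reachable upward in the DAG) and divide the summed information content of the intersection by that of the union. The traversal and the ic table below are illustrative; the paper derives information content intrinsically from the ontology.

```python
# Similarity = IC(subgraph intersection) / IC(subgraph union).
def ancestors(term: str, parents: dict[str, set[str]]) -> set[str]:
    """Sub-graph of a term: the term plus everything reachable upward."""
    seen, stack = {term}, [term]
    while stack:
        for p in parents.get(stack.pop(), set()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def similarity(t1: str, t2: str,
               parents: dict[str, set[str]], ic: dict[str, float]) -> float:
    g1, g2 = ancestors(t1, parents), ancestors(t2, parents)
    union_ic = sum(ic.get(t, 0.0) for t in g1 | g2)
    return sum(ic.get(t, 0.0) for t in g1 & g2) / union_ic if union_ic else 0.0
```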
Word sense disambiguation based on improved vector space model
2010, 30(06): 1671-1672.
Abstract | PDF (443KB) | Related Articles | Metrics
To increase word sense disambiguation accuracy, a disambiguation method based on an improved Vector Space Model (VSM) was presented. The algorithm takes grammatical, morphological and semantic features into account when extracting feature vectors and computing context similarity, and achieves better results by applying collocation constraints; the open-test precision reaches 80%. The results show that the method fully describes the features of the context and is beneficial to further semantic parsing.
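A minimal sketch of the sense-selection step: compare a bag-of-features context vector against a stored vector per sense and pick the most similar one. The plain counts used here ignore the weighting of grammatical, morphological and collocation features that the improved model applies.

```python
# Choose the sense whose profile vector is most similar to the context.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def disambiguate(context_features: list[str],
                 sense_profiles: dict[str, Counter]) -> str:
    ctx = Counter(context_features)
    return max(sense_profiles, key=lambda s: cosine(ctx, sense_profiles[s]))
```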
Single-document summarization based on semantics
Zhiqing Zhang
2010, 30(06): 1673-1675.
Abstract | PDF (463KB) | Related Articles | Metrics
Single-document summarization aims to create a compressed summary while retaining the theme of the original document. Many approaches use statistics and machine learning techniques to extract sentences from a document, but because a single document provides limited information, these approaches are less effective. Therefore, a new semantics-based single-document summarization framework was proposed. First, sentence-to-sentence similarity was calculated; then a modified K-Medoids clustering algorithm was used to cluster the sentences; finally, the most informative sentence was chosen from each cluster to form the summary. The experimental results demonstrate that using semantic information improves summary quality.
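A hedged sketch of the pipeline: sentence vectors, a plain k-medoids pass, and one representative sentence per cluster. TF-IDF cosine distance replaces the paper's semantic similarity and this basic k-medoids loop replaces its modified variant; both substitutions are only for illustration.

```python
# Cluster sentences with a simple k-medoids loop and keep the medoids.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_distances

def summarize(sentences: list[str], k: int = 3, iters: int = 20) -> list[str]:
    k = min(k, len(sentences))
    dist = cosine_distances(TfidfVectorizer().fit_transform(sentences))
    rng = np.random.default_rng(0)
    medoids = rng.choice(len(sentences), size=k, replace=False)
    for _ in range(iters):
        labels = dist[:, medoids].argmin(axis=1)      # assign to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):
                # medoid = member minimizing total distance inside the cluster
                new_medoids[c] = members[dist[np.ix_(members, members)].sum(axis=1).argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return [sentences[m] for m in sorted(set(medoids.tolist()))]
```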
Study on Kazak text categorization based on SVM
2010, 30(06): 1676-1678.
Abstract | PDF (452KB) | Related Articles | Metrics
This paper introduced the basic theory of the Support Vector Machine (SVM) and k-Nearest Neighbor (kNN) algorithms and two feature selection methods for the Kazak language, and conducted an empirical study of SVM, kNN and Bayes classifiers for Kazak text categorization. The experimental results show that SVM categorizes Kazak text better than kNN and Bayes. Due to the morphological characteristics of Kazak, precision and recall decrease when words are segmented by affix.
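A generic sketch of the SVM text-categorization setup that such a comparison rests on; the toy corpus is a placeholder, and real Kazak text would need its own tokenization and feature selection (the study notes that affix-level segmentation lowers precision and recall).

```python
# TF-IDF features feeding a linear SVM text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_docs = ["economy market price", "football match goal", "market trade"]
train_labels = ["economy", "sport", "economy"]

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(train_docs, train_labels)
print(model.predict(["goal in the match"]))   # typically ['sport']
```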
Typical applications
Research of modeling and simulation method for large scale crowds behavior based on parallel computing
2010, 30(06): 1679-1681.
Abstract | PDF (617KB) | Related Articles | Metrics
To address the sharp increase in computation caused by scaling up crowd behavior models, this paper presented a parallel discrete-event approach for simulating large-scale crowd behavior and parallelized the computation with the YH-SUPE engine. It described the simulation objects and the communication between them in detail, and evaluated the model's performance for different numbers of simulation objects on different numbers of processes. The experimental results show that the proposed method can increase the number of simulation objects and support larger models in real time more effectively.
Study on ally selection in maintenance alliance based on mobile Agent
2010, 30(06): 1682-1686.
Abstract | PDF (800KB) | Related Articles | Metrics
To reduce the risk of adverse selection and improve efficiency in maintenance ally selection, an efficient and reliable ally selection system was proposed. The system employs Agents to decompose complicated equipment maintenance tasks, and uses mobile Agents and XML to convey order and capacity information between the center factory and the potential allies. Reliability evaluation was introduced to ensure an objective description of the potential allies' capacity, so as to guarantee efficient search and accurate assessment of the potential allies.
Application of hybrid parallel algorithm for simulating photochemical reaction
2010, 30(06): 1687-1689.
Abstract | PDF (483KB) | Related Articles | Metrics
A high-performance hybrid parallel algorithm for simulating photochemical reactions was developed by introducing two levels of parallelism. The algorithm was designed on the MPI+OpenMP parallel model: MPI was used for the atomic decomposition, while OpenMP multithreading was used for the matrix multiplication. In tests on an SMP cluster, the parallel efficiency of large-scale (many-atom) photochemical reaction simulation reached 60%, which demonstrates that the method is a feasible and efficient parallel algorithm for simulating photochemical reactions.
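A schematic analogue of the two-level decomposition, written with mpi4py under stated assumptions: MPI ranks each take a block of atoms, while the per-block matrix product is left to NumPy's multi-threaded BLAS as a stand-in for the OpenMP layer. The matrices and the "energy" computed are purely illustrative.

```python
# Two-level parallel sketch: MPI over atom blocks, threaded BLAS per block.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_atoms, dim = 1200, 64
local = np.array_split(np.arange(n_atoms), size)[rank]   # this rank's atom block
coupling = np.random.default_rng(0).random((dim, dim))

# "Work" for the local atoms: one dense matrix product per block
# (threaded inside BLAS), then a global reduction over all ranks.
block = np.random.default_rng(rank + 1).random((len(local), dim))
local_energy = float((block @ coupling).sum())
total_energy = comm.allreduce(local_energy, op=MPI.SUM)

if rank == 0:
    print("total (illustrative) energy:", total_energy)
```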
Realization of a switched convergent computation with Matlab
2010, 30(06): 1690-1693.
Abstract | PDF (622KB) | Related Articles | Metrics
Traditional stability analysis of control systems requires strong conditions such as quadratic Lyapunov functions and exponential or asymptotic stabilization. Using switched control or convex combination techniques, unstable subsystems can be made convergent under weaker conditions, such as the existence of a convergent subspace. This paper analyzed the convergence of perturbed switched systems and designed three switching laws and algorithms based on state feedback, threshold restriction and state delay; state estimation and error analysis were carried out for observed switched systems with a state estimator. Through Matlab simulation, optimal parameters were found, and the switching experiment data were analyzed and compared, showing the stabilization and convergence performance under the different switching laws.
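A toy numerical illustration of the central point, that individually unstable subsystems can be made to converge by a state-feedback switching law. The two matrices and the "shrink the dominant component" rule are an invented example, not the systems or the three laws of the paper, and NumPy is used in place of Matlab.

```python
# State-feedback switching between two individually unstable subsystems.
import numpy as np

A1 = np.array([[2.0, 0.0], [0.0, 0.1]])   # unstable alone: spectral radius 2
A2 = np.array([[0.1, 0.0], [0.0, 2.0]])   # unstable alone: spectral radius 2

x = np.array([1.0, 0.6])
for k in range(40):
    A = A2 if abs(x[0]) >= abs(x[1]) else A1   # shrink whichever component dominates
    x = A @ x
print("state after 40 switched steps:", x)      # decays toward the origin
```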
RNA secondary structure prediction algorithm based on stem-combination
2010, 30(06): 1694-1697.
Abstract | PDF (587KB) | Related Articles | Metrics
RNA secondary structure prediction is a hot topic in bioinformatics; in particular, prediction with pseudoknots has been proved to be NP-hard. According to the characteristics of RNA folding, a heuristic algorithm named QuasiRP was presented to predict RNA secondary structure. Using stems as the basic unit and combining graph theory with the fundamental theory of binary relations, QuasiRP finds the optimal stem combination under the Minimum Free Energy (MFE) criterion. Its time complexity is O(n^3) and its space complexity is O(n^2), and pseudoknots can be found. The results show the validity of the algorithm.
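A much-simplified sketch of a stem-based heuristic: enumerate candidate stems (runs of stacked Watson-Crick or GU pairs) and greedily keep the longest stems that do not share bases. Real free-energy scoring and the graph/binary-relation machinery of QuasiRP are omitted; note that crossing (pseudoknotted) stems are allowed here because nesting is not enforced.

```python
# Stem enumeration and greedy, non-overlapping stem combination.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def find_stems(seq: str, min_len: int = 3, min_loop: int = 3) -> list[tuple[int, int, int]]:
    """(i, j, k) means the stem pairs (i, j), (i+1, j-1), ..., k pairs deep."""
    out, n = [], len(seq)
    for i in range(n):
        for j in range(n - 1, i + min_loop, -1):
            k = 0
            while i + k < j - k - min_loop and (seq[i + k], seq[j - k]) in PAIRS:
                k += 1
            if k >= min_len:
                out.append((i, j, k))
    return out

def combine_stems(stems: list[tuple[int, int, int]]) -> list[tuple[int, int, int]]:
    """Keep the longest stems first, skipping any stem that reuses a base."""
    used: set[int] = set()
    chosen = []
    for i, j, k in sorted(stems, key=lambda s: -s[2]):
        bases = set(range(i, i + k)) | set(range(j - k + 1, j + 1))
        if not bases & used:
            chosen.append((i, j, k))
            used |= bases
    return chosen

if __name__ == "__main__":
    print(combine_stems(find_stems("GGGAAAUCCC")))   # -> [(0, 9, 3)]
```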
Digital matching filter with compound architecture based on matrix
2010, 30(06): 1698-1700.
Abstract | PDF (438KB) | Related Articles | Metrics
Against the background of logging signal transmission based on Direct-Sequence Spread Spectrum (DS-SS) technology, the research and design of a compound-architecture Digital Matching Filter (DMF) implemented on FPGA was described in detail. In this filter, the matched filtering process is carried out by recursive matrix operations, and the conventional multi-multiplier-and-adder structure is replaced by look-up table operations, so as to meet well-site requirements for a simple instrument structure and high working efficiency. The test results show that the filter's processing performance meets the design requirements.
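Conceptually, the filter computes the correlation of the received samples with the spreading code; the sketch below shows only that operation, using a Barker-7 sequence as an illustrative code. The FPGA-specific parts described above, the recursive matrix formulation and the look-up tables replacing multipliers, are not reproduced.

```python
# Matched filtering of a spread-spectrum burst as a sliding correlation.
import numpy as np

code = np.array([1, 1, 1, -1, -1, 1, -1], dtype=float)   # Barker-7 spreading code
rng = np.random.default_rng(0)
rx = np.concatenate([rng.normal(0, 0.3, 20), code, rng.normal(0, 0.3, 20)])

matched_out = np.correlate(rx, code, mode="valid")        # matched filter = correlator
print("correlation peak at sample", int(matched_out.argmax()))   # code starts at sample 20
```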
Design of serial image acquisition system based on Camera Link
2010, 30(06): 1701-1703.
Abstract | PDF (562KB) | Related Articles | Metrics
When designing a testing system, a large amount of image information must be transmitted stably at high speed for storage and processing. Image data acquired by a remote CCD can be transmitted serially as Low Voltage Differential Signaling (LVDS) or HOTLink, buffered by ping-pong operation in an FPGA, and then transferred serially at high speed to the PXI-1428 frame grabber through Camera Link. The grabber can acquire a 128×130-pixel image at 150 fps in LVDS format or 500 fps in HOTLink format, and the top transmission speed reaches 320 Mbps. This demonstrates that Camera Link can be conveniently applied to high-speed serial image transmission.
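A quick arithmetic check of the quoted rates, assuming 8-bit pixels (the pixel depth is not stated in the abstract): even at 500 fps the payload stays well under the 320 Mbps link ceiling.

```python
# Payload rate for a 128x130 image at the two quoted frame rates.
pixels = 128 * 130
for fps in (150, 500):
    mbps = pixels * 8 * fps / 1e6          # assumed 8 bits per pixel
    print(f"{fps} fps -> {mbps:.1f} Mbps payload (link limit: 320 Mbps)")
```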
Affective semantic analysis in video: type-intensity decomposition method
2010, 30(06): 1704-1707.
Abstract | PDF (610KB) | Related Articles | Metrics
This paper looked into a new direction in video content analysis, affective semantic analysis, defined as understanding the feelings or emotions expected to arise in viewers while watching a video clip. In the proposed approach, the affective semantic space is decomposed into two independent elements, affective intensity and affective type, and affective video content is mapped onto this 2-D emotion space by a model that bridges the gap between low-level features and high-level affective semantics. A number of effective audiovisual cues were formulated to help establish the two element curves. The simulations verify the method's reliability and the consistency between the emotions conveyed by the video and the feelings observed in viewers.
Design and implementation of log format system in integrated network security management platform
2010, 30(06): 1708-1710.
Abstract | PDF (431KB) | Related Articles | Metrics
In order to improve the efficiency of the log format system and to avoid discarding logs that the system cannot recognize, a log format scheme was proposed. In this scheme, the search-and-match process is avoided by binding device, port and plug-in; through an automatic update module, the system downloads the corresponding plug-in from the server when a log cannot be recognized. Simulation tests show that this log format scheme is completely feasible.
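A minimal sketch of the dispatch idea: bind (device, port) directly to a parsing plug-in so no format search is needed, and fall back to fetching a plug-in when a log is not recognized. The fetch step is a stub; the real system downloads plug-ins from its update server.

```python
# Bind (device, port) to a log parser; fall back to fetching a plug-in.
from typing import Callable, Optional

Parser = Callable[[str], dict]
registry: dict[tuple[str, int], Parser] = {}

def bind(device: str, port: int, parser: Parser) -> None:
    registry[(device, port)] = parser

def fetch_parser(device: str, port: int) -> Optional[Parser]:
    """Stub for the automatic-update module (the download would happen here)."""
    return None

def format_log(device: str, port: int, raw: str) -> Optional[dict]:
    parser = registry.get((device, port)) or fetch_parser(device, port)
    return parser(raw) if parser else None     # None: still unrecognized

bind("firewall-A", 514, lambda raw: {"src": "firewall-A", "msg": raw.strip()})
print(format_log("firewall-A", 514, "DENY tcp 10.0.0.1 -> 10.0.0.2\n"))
```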
Superintended by: Sichuan Associations for Science and Technology
Sponsored by: Sichuan Computer Federation; Chengdu Branch, Chinese Academy of Sciences
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editors: SHEN Hengtao, XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address: No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803, 028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn