Table of Contents
01 March 2010, Volume 30 Issue 3
Virtual reality
Survey on realistic behavior in crowd animation
2010, 30(3): 571-578.
In recent years, computer crowd animation has been widely used in applications such as virtual reality, computer games, education, entertainment, and simulation training. Crowd simulation research centers on two areas: the first is the simulation model of large crowd motion, i.e., how to produce realistic behavior in crowd animation; the second is high-quality visualization, i.e., how to render large crowds in virtual scenes with 3D models. Notable results have been achieved in both areas. The authors presented the definition and structure of crowd animation, surveyed realistic behavior in crowd animation in terms of the crowd simulation timeline, crowd modeling, and crowd simulation algorithms, and finally drew conclusions and discussed future prospects.
Survey on stereoscopic three-dimensional display
2010, 30(3): 579-581.
Stereoscopic three-dimensional displays are classified into two types according to their operation principles. One type is based on binocular parallax: eye-glasses/helmet stereoscopic displays and stereoscopic displays based on parallax barriers or lenticular lenses belong to this type; their technologies are relatively mature and corresponding products are common. The other type is not based on binocular parallax: holographic displays, integral imaging displays and volumetric displays belong to this type; their technologies are still under development and corresponding products are scarce. The device structures, operation principles and characteristics of the various stereoscopic displays were briefly expounded.
3D visualization method of radiotherapy dose distribution in medical TPS
2010, 30(3): 582-584.
According to the requirements of GyroRTPS, a 3D visualization method for displaying the distribution of radiotherapy dose was proposed to help doctors examine the dose distribution from different angles. Firstly, preprocessing steps of coordinate transformation and interpolation on the volume data and dose data were discussed. Secondly, the preprocessed data were normalized for visualization according to VTK specifications. Finally, a VTK-based method that jointly displays the dose data and human body data in 3D was realized. Tests show that the method is reliable and that the system can better help doctors carry out radiotherapy.
Real-time simulation of fog on GPU
2010, 30(3): 585-588.
Fog simulation is an important research topic in the field of three-dimensional simulation, and complex three-dimensional simulation systems demand both real-time performance and realistic rendering results. By analyzing traditional fog simulation methods, the authors proposed a Graphics Processing Unit (GPU)-based method to simulate the fog effect in real time. The basic idea was to employ an exponential function to model how fog fades with height. A GPU implementation of this formula was then presented, and finally the fog color was blended with the fragment color according to the fog factor. The experimental results show that the proposed method can generate highly realistic fog in real time.
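The height-fade idea can be sketched as follows (plain Python rather than the authors' GPU shader; the density and base-height parameters and the helper names are illustrative assumptions):

```python
import numpy as np

def fog_factor(height, density=0.05, base_height=0.0):
    """Exponential fade of fog with height: thick near the ground, thin above."""
    # f close to 1 means fully fogged, f close to 0 means clear.
    return np.clip(np.exp(-density * np.maximum(height - base_height, 0.0)), 0.0, 1.0)

def apply_fog(fragment_color, fog_color, height):
    """Blend the fog color with the fragment color according to the fog factor."""
    f = fog_factor(height)
    return f * np.asarray(fog_color) + (1.0 - f) * np.asarray(fragment_color)

# Example: a fragment 10 units above the base height is only partially fogged.
print(apply_fog([0.2, 0.5, 0.3], [0.8, 0.8, 0.8], height=10.0))
```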
Design and implementation of particle system based on content pipeline in XNA
2010, 30(3): 589-592.
Particle systems are one of the hot topics in game engine development. In this paper, the general principle of the particle system was introduced. To process custom content files in XNA easily, a particle system module was developed by using object-oriented techniques and extending the behavior of the content pipeline. The definition of the management interface and the hierarchical relations of the particle system model were given, and the particle system was integrated and encapsulated. Finally, a concrete application built with XNA Game Studio was given.
Software process technology
SOAP response caching strategy for non-real time services and its application
2010, 30(3): 593-595.
Concerning repetitive requests to non-real-time services in Service-Oriented Architecture (SOA), the authors proposed a response caching strategy based on extensions of the Simple Object Access Protocol (SOAP) and XML-binary Optimized Packaging (XOP) and implemented it in an Enterprise Service Bus (ESB). The strategy caches service responses and replies to identical requests within a certain period of time, in order to accelerate the response to these requests. It can be applied to general SOAP responses as well as those with attachments, and it also provides a mechanism to prevent duplicate caching of the same attachments. For the two services tested, the response time to an identical request was shortened by 38% and 22% respectively with the response caching strategy.
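A minimal sketch of the caching idea, assuming a simple request-digest key and a fixed time-to-live; this is illustrative Python, not the authors' ESB implementation:

```python
import time, hashlib

class ResponseCache:
    """Time-limited cache of SOAP responses for non-real-time services (illustrative only)."""
    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}            # request digest -> (timestamp, cached response)

    def _key(self, soap_request: str) -> str:
        # Identical requests hash to the same key, so repeated calls hit the cache.
        return hashlib.sha1(soap_request.encode("utf-8")).hexdigest()

    def get(self, soap_request: str):
        entry = self.store.get(self._key(soap_request))
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]        # still fresh: reply without invoking the service
        return None

    def put(self, soap_request: str, soap_response: str):
        self.store[self._key(soap_request)] = (time.time(), soap_response)

cache = ResponseCache(ttl_seconds=60)
request = "<soap:Envelope>getCityWeather('Paris')</soap:Envelope>"   # hypothetical request
if cache.get(request) is None:
    cache.put(request, "<soap:Envelope>sunny</soap:Envelope>")       # first call invokes the service
print(cache.get(request) is not None)                                # identical call hits the cache
```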
Method of communication flow dependence analysis for shared-memory SPMD program
2010, 30(3): 596-599.
When dealing with shared-memory Single Program Multiple Data (SPMD) programs, traditional data-flow dependence analysis can neither recognize the processing nodes to which statements accessing shared data belong nor avoid the impact of control flow whose execution order is non-deterministic, so its analysis of shared-data dependence has low accuracy. A scalable analysis method of shared-data communication-flow dependence was presented according to the alias feature of shared data in shared-memory SPMD programs, and it was applied in a prototype back-end analyzer. The experimental results show that the new method finds shared-data communication-flow dependences more accurately than the traditional methods.
Correlation analysis of software failure time data
2010, 30(3): 600-602.
Based on the common knowledge in software testing that early failure behavior of the testing process may have little impact on the later failure process, the Relevance Vector Machine (RVM) learning scheme was applied to model the failure time data and capture the most recent features hidden in the software failure behavior. The development of the Average Relative Prediction Error (AE) series was then studied as the value of m changed, so as to determine whether recent failure history contributes to a more accurate prediction of near-future failure events. Non-parametric statistical methods were applied to detect and estimate trends in the AE data sets. Finally, Sen's slope estimator was applied to estimate the degree of trend in the data sets so that suitable values of m could be obtained.
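Sen's slope estimator itself is just the median of all pairwise slopes of a series; a minimal sketch (the sample AE values are made up for illustration, not taken from the paper):

```python
import numpy as np
from itertools import combinations

def sens_slope(series):
    """Sen's slope estimator: median of all pairwise slopes of a time series."""
    slopes = [(series[j] - series[i]) / (j - i)
              for i, j in combinations(range(len(series)), 2)]
    return float(np.median(slopes))

# The sign of the result indicates the direction of the trend in the AE values.
print(sens_slope([0.31, 0.28, 0.30, 0.25, 0.22, 0.20]))
```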
Chinese short text classification based on hyponymy relation
2010, 30(3): 603-606.
Concerning the short length of short texts and their weak signals for describing characteristics, a framework for Chinese short-text classification based on hyponymy was put forward. To classify a test text, the framework first used HowNet to determine the hyponymy between concept pairs in the training texts, and thereby the hyponymy between word pairs; the feature vectors of the test text were then extended accordingly. The experimental results show that short-text classification performance can be improved by using hyponymy.
LambdaMDE: Embedded development environment based on model
2010, 30(3): 607-611.
Recently, development tools for embedded software have been evolving from traditional code-based development environments to model-based ones. Therefore, the authors developed a model-based development environment for embedded software called LambdaMDE, which integrates model development tools such as OSATE and Simulink with other related tools through LambdaPro; hence, it covers the entire development process of embedded software, including modeling, simulation, verification, code generation and testing. It is consistent with the development trend of embedded software tools and rests on corresponding theoretical, technical and product foundations.
Research and implementation of ball games scenario interpreter
2010, 30(3): 612-614.
Nowadays, every Technical and Tactical Analysis System (TTAS) for ball games needs its own scenario interpreter, which results in much duplicated programming effort. To solve this problem, the grammar of a scenario language was designed by analyzing the characteristics of ball games, and the scenario interpreter was realized through lexical, syntactic and semantic analysis of the scenario language. It has been successfully applied in table tennis, basketball and volleyball TTAS, and the problem of duplicated effort has been resolved effectively.
New workflow verification algorithm based on graph-search
2010, 30(3): 615-616.
Since structural conflicts in workflow processes can lead to malfunctioning of workflow management systems, such conflicts should be detected before workflow processes are put into execution. Up to now, there has been no satisfactory algorithm able to detect structural conflicts in workflows both with and without loops. The authors proposed a new graph-search-based algorithm that transforms a loop in the workflow into an acyclic subgraph, and it can efficiently detect structural conflicts in workflows even when loops are present.
Transformation from SPEM to extended workflow meta-model
2010, 30(3): 617-619.
In order to improve the efficiency of software development and standardize software process management, workflow techniques were applied to the software development process to achieve automatic management of the software process. SPEM is the basis for software process modeling; it is a general framework suitable for different types of life-cycle models. A workflow meta-model was extended according to the characteristics of the software process. Moreover, well-defined mapping rules from SPEM to the workflow meta-model were provided, and the mapping between the two meta-models was developed. Taking the waterfall model as an example, the validity of the extended workflow meta-model, which incorporates the characteristics of the software process, was verified. Based on the above research, software process management can be automated by parsing the workflow meta-model with a workflow engine.
Graphics and image processing
Hidden line removal for wireframe projection drawing
2010, 30(3): 620-624.
Since a wireframe carries no face information, the hidden lines of its projection drawing cannot be removed by existing methods. In this paper, an efficient algorithm for removing hidden lines from wireframe projection drawings was presented. In the projection drawing, the intersection of two lines was identified by bounding boxes and directed triangles, and the visibility of intersecting lines was then determined by the projection model. An occlusion matrix made up for the depth information lost during projection. The connectivity matrix of topological relationships between vertices was updated by the occlusion matrix, and the projection drawing defined by the connectivity matrix contains no hidden lines. The test results show the high efficiency and stability of the proposed algorithm.
Inter mode decision of complexity scalability
2010, 30(3): 625-627.
In order to resolve the problem that the computational complexity of H.264/AVC cannot match the diverse computing resources of terminal equipment, the authors proposed a novel inter mode decision algorithm. The algorithm utilizes the spatial and temporal correlation of adjacent macroblocks to predict the modes of the current macroblock, and controls the number of candidate modes involved in the prediction by a scalability factor; hence the complexity can be varied flexibly from 20% to 100%. The experimental results show that the proposed method achieves a trade-off between the accuracy of motion estimation and its computational complexity, fitting the different computing capabilities of different devices.
Image sub-pixel registration based on SVM with application to super-resolution
2010, 30(3): 628-631.
Super-resolution reconstruction produces one or a set of high-resolution images from a set of low-resolution images. In the reconstruction process, sub-pixel registration among the low-resolution images is a very important step. A sub-pixel registration method based on Support Vector Machine (SVM) was proposed: the related motion parameters were treated as the target set of the SVM. After the mapping between the image features and the target set was built through SVM training, the motion parameters could be calculated using SVM regression. The experimental results confirm the effectiveness of the proposed method compared with other sub-pixel registration methods.
New rapid framework for medical images registration
2010, 30(3): 632-634.
Registration is widely used in medical imaging applications, but it is time-consuming due to its heavy computation, so a new rapid registration framework was presented. The input to the framework consists of two images, a fixed image and a moving image; the output is the result image representing the differences between the fixed image and the moving image after registration. Besides the input and output data, the framework is divided into four parts: interpolator, measurer, optimizer and transformer. The interpolator evaluates moving-image intensities at non-grid positions; the measurer evaluates how well the fixed image is matched by the transformed moving image; the optimizer optimizes the measure criterion; and the transformer applies transformations to the objective image. These four parts play different roles in medical image registration and constitute a simple, rapid and stable registration framework. Compared with other registration frameworks, the proposed framework is simpler in structure and much quicker in image processing and application development. Good results have been obtained in practical applications.
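The interpolator/measurer/optimizer/transformer split can be illustrated with a toy 1-D translation registration (a minimal sketch under strong simplifying assumptions: linear interpolation, a sum-of-squared-differences measure and an exhaustive optimizer; function names are hypothetical and this is not the paper's framework):

```python
import numpy as np

def interpolate(moving, positions):
    """Interpolator: evaluate the moving signal at (possibly non-grid) positions."""
    return np.interp(positions, np.arange(len(moving)), moving)

def measure(fixed, warped):
    """Measurer: sum of squared differences between fixed and transformed moving signal."""
    return float(np.sum((fixed - warped) ** 2))

def transform(moving, shift):
    """Transformer: a 1-D translation, resampled through the interpolator."""
    return interpolate(moving, np.arange(len(moving)) - shift)

def register(fixed, moving, candidate_shifts):
    """Optimizer: exhaustively pick the shift that minimizes the measure criterion."""
    scores = {s: measure(fixed, transform(moving, s)) for s in candidate_shifts}
    return min(scores, key=scores.get)

fixed = np.sin(np.linspace(0, 6, 200))
moving = np.interp(np.arange(200) + 12.3, np.arange(200), fixed)   # fixed signal shifted by ~12.3 samples
print(register(fixed, moving, np.linspace(0, 25, 251)))             # recovers a shift close to 12.3
```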
Multi-face detection algorithm in complex background
2010, 30(3): 635-638.
Face detection based on skin color has a high detection rate, but it also has a high false detection rate. The AdaBoost face detection algorithm essentially solves the real-time issue, but its detection rate is not satisfactory. For these reasons, an algorithm combining skin color segmentation and the AdaBoost method was proposed. Firstly, skin color segmentation was used for rough localization of human faces, and the candidate regions were then fed to the AdaBoost face detection window. In the preprocessing stage, structuring elements of adjustable size were used to address the problem that faces of different sizes in different images might otherwise be missed. The experimental results show that the proposed method improves the detection rate while also reducing the false detection rate.
Face feature-points location based on 3D transform shape search
2010, 30(3): 639-642.
Through three-dimensional (3D) transformation shape search on two-dimensional (2D) face images, the variation of face pose can be simulated correctly, and the nonlinear variations of faces in the Active Shape Model (ASM) are handled effectively. Based on a standard 3D face model, the three-dimensional coordinates were obtained from the 2D initial shape, and 10 pose parameters of the 3D transformation were calculated through an iterative process of 3D transformation and projection to track the object shape. The test data show that the 3D transformation shape search produces better results when searching multi-pose face images that are not included in the ASM training set.
Robust tracking algorithm based on multi-feature fusion and particle filter
2010, 30(3): 643-645.
In order to avoid tracking failure caused by relying on a single feature under cluttered backgrounds and illumination changes, a robust tracking algorithm based on multi-feature fusion and particle filtering was proposed. A multi-block HSV color histogram was used to describe the overall distribution characteristics of the target, together with a histogram of oriented gradients containing structural information. The two features were fused within the particle filter framework. Meanwhile, the weights of the fusion strategy, the template and the noise distribution parameters were updated online, and the number of particles assigned to each feature was adjusted dynamically. The experimental results show that the proposed method is more robust and provides more accurate results.
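The fusion step inside a particle filter can be illustrated with a minimal sketch (the convex-combination weighting and the fixed α are simplifying assumptions; the paper additionally updates the fusion weights, template and noise parameters online, which is not reproduced here):

```python
import numpy as np

def fuse_and_resample(particles, color_lik, grad_lik, alpha=0.5, rng=np.random.default_rng(0)):
    """Weight particles by a convex combination of color and gradient likelihoods, then resample."""
    weights = alpha * color_lik + (1.0 - alpha) * grad_lik
    weights /= weights.sum()                                   # normalize to a probability distribution
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]                                      # multinomial resampling step

particles = np.random.default_rng(1).uniform(0, 100, size=(200, 2))   # candidate (x, y) positions
color_lik = np.random.default_rng(2).random(200)                      # stand-ins for the two likelihoods
grad_lik = np.random.default_rng(3).random(200)
particles = fuse_and_resample(particles, color_lik, grad_lik, alpha=0.6)
```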
Palm-dorsa vein recognition based on two-dimensional Fisher linear discriminant
2010, 30(3): 646-649.
Palm-dorsa vein recognition is a non-contact technology. A locating algorithm was presented to avoid the interference caused by rotation and translation of the palm dorsum when images are collected. The method ultimately extracts a rectangle on the palm dorsum containing as much vein information as possible. The feature points of the palm-dorsum edge were located by a method based on invariant feature points, and the effective region was then obtained. Experimental results show that the algorithm is self-adaptive, accurate and fast. To solve the problem that the within-class scatter matrix is always singular in Fisher Linear Discriminant (FLD), Two-Dimensional FLD (2DFLD) was used for palm-dorsa vein recognition. The image matrix was projected directly to avoid high-dimensional operations. In the experiments, 2DFLD was applied to extract the vein feature subspace from a palm-dorsa vein database, and the test sample images to be recognized were projected onto this low-dimensional subspace. Lastly, a nearest neighbor classifier for palm-dorsa vein matching based on Euclidean distance was used, and the recognition rate reached 98%.
Automatic CamShift tracking algorithm based on multi-feature
2010, 30(3): 650-652.
Since the Continuously Adaptive MeanShift (CamShift) tracking algorithm adopts only color as a feature, which can result in tracking errors, an improved algorithm based on feature fusion was proposed. The presented algorithm detects the moving target automatically using an improved background subtraction method. The object model combines color and gradient-orientation features and weights the reliability of each feature, overcoming the possible failure of CamShift in the presence of objects with similar colors. The experimental results show that the algorithm enhances the reliability and robustness of tracking.
Color image segmentation algorithm based on hue level histogram and region merging
2010, 30(3): 653-656.
In recent years, with the advancement of machine vision, pattern recognition and content-based image retrieval and the wide use of color images, image segmentation, especially color image segmentation, has played an increasingly important role. Therefore, a fast and effective segmentation method for color images was proposed, which mainly includes three steps. Firstly, the RGB color space was transformed into HSV space, and image pixels were divided into singular and non-singular points according to their saturation and intensity; then the singular and non-singular points were segmented based on the hue-level histogram and the grey-level histogram respectively; finally, a region merging method was adopted to merge the previous segmentation results. The experimental results show that this method can effectively extract the objects in color images and has a certain robustness.
Information security
Survey on techniques of digital multimedia forensics
2010, 30(3): 657-662.
Digital multimedia forensics is an emerging research field of information security, and research on it is important for ensuring the credibility of digital multimedia data. Using digital image forensics as an example, the authors reviewed current digital multimedia forensics techniques from five aspects: tamper detection, source device identification, authenticity verification, device component forensics, and the reliability of digital multimedia forensics. The authors focused on introducing typical algorithms, pointed out the main problems in current research, and suggested urgent topics for future research.
Computational trust evaluation model based on temporal sequential marker
2010, 30(3): 663-667.
Trust itself is a social cognitive concept, so it is fitting to study trust from a cognitive viewpoint. A conceptual model based on the Temporal Sequential Marker (TSM) was proposed after introducing time cognition. Furthermore, three kinds of forgetting effects (distance effect, boundary effect and hierarchical effect) were considered when computing the TSM-trust of each individual. Dempster-Shafer (D-S) theory was exploited to build a computational dynamic trust (TSM-Trust) model based on the proposed conceptual model. The proposed model was verified through comparison of theoretical and experimental results.
Fast scalar multiplication algorithm on Edwards curve
2010, 30(3): 668-670.
Over a field whose characteristic is different from 2, an elliptic curve can be transformed into a birationally equivalent Edwards curve, which effectively accelerates software and hardware implementations of ECC. First, the point doubling formula on the Edwards curve was simplified. Then, based on the fact that the coordinates of 2^m P (m=2,3,…) have a uniform representation, a Consecutive Doubled Algorithm (CDA) for computing 2^m P was proposed using a recursive technique. The complexity analysis and test instances show that the speed of scalar multiplication on the Edwards curve can be increased by more than about 10 percent.
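As a reference point, the unified Edwards addition law and plain double-and-add scalar multiplication look as follows (a small illustrative sketch, not the paper's CDA; the curve parameters in the example are assumptions):

```python
def edwards_add(P, Q, d, p):
    """Unified addition on the Edwards curve x^2 + y^2 = 1 + d*x^2*y^2 over GF(p)."""
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow((1 + t) % p, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow((1 - t) % p, -1, p) % p
    return x3, y3

def scalar_mult(k, P, d, p):
    """Left-to-right double-and-add; the same unified law serves doubling and addition."""
    R = (0, 1)                                   # neutral element of the addition law
    for bit in bin(k)[2:]:
        R = edwards_add(R, R, d, p)              # point doubling
        if bit == "1":
            R = edwards_add(R, P, d, p)          # point addition
    return R

# (1, 0) lies on every Edwards curve; 3*(1, 0) = (-1, 0), i.e. (p-1, 0) here.
print(scalar_mult(3, (1, 0), d=13, p=1009))
```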
Using classical groups to construct authentication codes
2010, 30(3): 671-673.
Designing a simple and practical authentication scheme is one of the important issues in research on authentication codes. Applying partitions of positive integers and the Anzahl theorem on subgroups of classical groups, a class of authentication codes was obtained and all parameters of the scheme were computed. Furthermore, assuming that the secret key is chosen according to a uniform probability distribution, a security analysis of the scheme was given. The construction greatly increases the number of source states, which indicates that this code has better properties than some codes obtained earlier by matrix methods.
Construction method of S-box suitable for hardware implementation
2010, 30(3): 674-676.
The S-boxes used in block ciphers such as Rijndael, Camellia and SMS4 derive their good cryptographic properties from the combination of the inverse transformation over finite fields with an affine transformation. The authors investigated the cryptographic properties of the three block ciphers above and presented an S-box construction model. According to the features of hardware implementation, a large number of S-boxes were constructed using circulant matrices. Analysis shows that the new kind of S-box possesses some properties better than those of the Rijndael S-box, and there is no significant difference between them in hardware implementation cost.
Dynamic threshold signature scheme based on bilinear pairing
2010, 30(3): 677-679.
The threshold value of most available threshold signature schemes is fixed. However, on many occasions it is desirable to change the threshold dynamically according to the significance of the message. A dynamic threshold signature scheme from bilinear pairings was presented to solve this problem in the available dynamic threshold signature schemes. The security of the scheme was also analyzed, and it is shown that the proposed scheme is secure and effective.
Several short periodic trajectories of TD-ERCS and their stabilities
2010, 30(3): 680-684.
The Tangent-Delay Ellipse Reflecting Cavity map System (TD-ERCS) is a class of chaotic systems designed for chaos-based encryption. Although it has good encryption properties, being chaotic over its whole parameter range, TD-ERCS contains quite a lot of short periodic trajectories, which may lead to weak keys. Power spectrum analysis and Lyapunov exponent calculation of the periodic trajectories indicate that they are stable under the tangent-delay operation with m=0 and unstable with m≥1, and that this instability is independent of the parameter μ; under the tangent-delay operation, TD-ERCS evolves from order to chaos. The results show that TD-ERCS does not have weak keys caused by stable short-period trajectories.
Security analysis and improvement of efficient certificateless signature scheme
2010, 30(3): 685-687.
Recently, Zhang Yu-lei et al. proposed an efficient certificateless signature scheme based on bilinear pairings. By analyzing the security of this Certificateless Signature (CLS) scheme, the authors pointed out that it is insecure against public key replacement attacks, and an improved scheme was proposed. In the random oracle model, the improved scheme is existentially unforgeable under the q-Strong Diffie-Hellman (q-SDH) assumption and the discrete logarithm assumption.
Group negotiation-based digital watermarking for wireless sensor network
2010, 30(3): 688-691.
Traditional digital watermarking cannot be applied directly in Wireless Sensor Networks (WSN), which have special features such as collecting data in distributed sub-clusters and processing data in-network. According to the hierarchical clustering architecture of WSN, a distributed consultative mechanism was established within clusters to generate proper watermarks, and a distributed watermarking algorithm was thus proposed. The sensor nodes of each group embed watermarks with low computational complexity. The watermarks show good resistance to attacks and robustness against lossy data compression. The results of analysis and experiments show that the proposed watermark is easy to verify; it can identify whether data have been illegally tampered with and thus effectively protects the security of WSN.
Research of dynamic Botnet model
2010, 30(3): 692-694.
Existing Botnet techniques and detection methods are usually confined to specific Botnets. To improve the concealment of Botnets, the authors proposed a dynamic Botnet model described with a directed graph, which can accommodate various Botnets. Several dynamic attributes of the proposed model, such as exposedness, resilience and sustainability, were analyzed in detail, and a bot abandon policy was then presented. The experimental results indicate that the proposed method can decrease the Botnet's detection ratio and effectively improve its sustainability and resilience.
Application of self-training based on ensemble learning in intrusion detection
2010, 30(3): 695-698.
Regularized self-training is a new method based on ensemble learning that can solve the problem of insufficient labeled training samples in intrusion detection. The proposed algorithm combines active learning and regularization theory, and utilizes unlabeled data to improve existing classifiers. Experiments were run on three main ensemble learning algorithms under different unlabeled rates. The results show that the proposed method can improve the decision boundary of the ensemble classifiers and reduce the error rate with the help of large amounts of unlabeled data.
Application of clustering and time-based sequence analysis in intrusion detection
Shao-Hua TENG
2010, 30(3): 699-701.
An intrusion detection system discovers potential intrusion behavior by collecting and analyzing various network data. Clustering is an unsupervised machine learning method that has been applied successfully to intrusion detection. In this paper, an intrusion detection algorithm based on clustering analysis and time-based sequence analysis was explored. It can detect many different types of intrusion without manually classified training data. The experimental results show that the algorithm is feasible and effective, with a higher detection rate and a lower false positive rate.
Digital watermarking algorithm based on image feature point
2010, 30(3): 702-704.
Watermark detection may fail when geometric attacks are performed on a watermarked image, destroying the synchronization of the watermark signals embedded in the image. To restore the lost synchronization, a novel method to estimate the geometric operation using sifted Scale-Invariant Feature Transform (SIFT) features was proposed. According to the Sum of Squared Differences (SSD) of the pixels, suitable feature points were refined to ensure the robustness of the least-squares solution. The least-squares iteration converges quickly, and one iteration step is enough to obtain geometric parameters with high accuracy. The binary watermark image was encoded with Run-Length Coding (RLC), and watermarking was done by altering the coefficients of the DCT blocks corresponding to the RLC values. The simulation results show that the proposed scheme achieves good image quality and is robust to geometric attacks.
Watermarking algorithm of complex cepstrum domain audio based on LSB and quantizing
2010, 30(3): 705-707.
In order to protect copyright, a novel cepstrum-domain watermarking algorithm in the wavelet domain was proposed, adopting the ideas of Least Significant Bit (LSB) and quantization. Firstly, the audio signal was divided into several segments, and each segment was transformed with the wavelet transform and the complex cepstrum. Then the LSB of the mean of the complex cepstrum coefficients was computed. Lastly, the watermark was embedded into the mean of the complex cepstrum coefficients by changing its parity. When extracting, only the LSB of the mean of the complex cepstrum coefficients needs to be judged to restore the watermark, so the algorithm is blind. The embedding process does not depend on any threshold, which makes the algorithm more conducive to practical application. The experimental results show that the algorithm is very robust to re-sampling, re-quantization, low-pass filtering, amplitude attack, etc.
Network and communications
Channel control strategy based on noninterference theory and its automated verification scheme
2010, 30(3): 708-714.
The authors defined the semantic description and functions of channels based on a study of the direct or indirect relations between any information domain and the other domains that send information to it or receive information from it. Exactly defining and strictly controlling the information channels between system modules or processes is beneficial to the integrity and controllability of those modules or processes, and the new channel control strategy proposed in this paper serves this purpose. Channel control strategies are generally not easy to verify manually because of their complexity. The authors presented an approach that describes the system and the strategies with Communicating Sequential Processes (CSP) syntax and verifies the strategies with the automated verification tool FDR2. This approach can effectively and efficiently analyze the information channels and find most of the storage covert channels.
TVCrawler: Multi-protocol P2P IPTV crawler
2010, 30(3): 715-718.
Network measurement is a significant means of Peer-to-Peer (P2P) IPTV research. It not only helps design IPTV systems and protocols better suited to real networks, but also lays the foundation for monitoring, directing and governing P2P IPTV. As an active network measurement technology, the crawler is a principal method of P2P IPTV measurement. In this paper, a multi-protocol P2P IPTV crawler named TVCrawler was proposed, which can be used to measure and study the live channels of three P2P IPTV systems: PPLive, PPStream and UUSee. TVCrawler has three characteristics: 1) a feedback-based construction mechanism for boot node sets; 2) a master-slave framework in which multiple crawler terminals can run simultaneously to gather data; 3) control of the crawling interval based on the topology growth coefficient. The experimental results demonstrate that TVCrawler can reach a speed of 20~100 peers per second and 130~500 edges per second.
Survey of anonymity communication
2010, 30(3): 719-722.
Anonymity communication is a hot topic in the area of networks and communication. The origin of anonymity communication was first outlined, and its framework was described in terms of anonymity property, adversary capability and network type. Research on anonymity communication was then surveyed, with brief descriptions of several major anonymity communication systems including Anonymizer, Tor, Mixminion, Crowds and Tarzan. Finally, the challenges confronting the development of anonymity communication were discussed, including user experience, credibility evaluation of relay nodes, and misbehavior in anonymity communication.
Routing protocols for opportunistic networks
2010, 30(3): 723-728.
In opportunistic networks (oppnets), because of low node density, nodal mobility and limited transmission range, the network topology may be disrupted and nodes may become detached from the network. Thus, an end-to-end path from source to destination may not exist, which poses a challenge for the design of routing protocols, and this issue has attracted more and more attention recently. Firstly, the concept, architecture and characteristics of opportunistic networks were introduced. Then, the progress and the new mechanisms of routing protocols for opportunistic networks were described, and the existing routing protocols were classified, introduced and compared. Finally, future directions were given, aiming to benefit future research and applications of opportunistic networks.
Self-organized clustering algorithm based on regional relevance in wireless sensor network
2010, 30(3): 729-732.
In view of the limited resources and large amounts of data in Wireless Sensor Networks (WSN), clustering and compression are often employed to reduce the number of transmitted packets. Based on regional relevance, a self-organized clustering algorithm was proposed for wavelet compression in sensor networks. The algorithm uses the correlation of actual regional data for clustering: during wavelet data compression, the data correlation is checked at the cluster head so as to ensure good correlation and adjust the cluster structure; meanwhile, data dependence among the nodes is analyzed at the Sink to form large-scale clusters with better relevance, so as to improve the efficiency of wavelet compression over a long period. Theoretical and experimental results show that the proposed algorithm can eliminate data redundancy by exploiting temporal and spatial correlation as much as possible, improve the efficiency of wavelet data compression, and reduce network energy consumption.
Novel topology inference algorithm for wireless sensor network
Guang-min HU
2010, 30(3): 733-735.
Information about Wireless Sensor Network (WSN) topology is significant for network planning and management. For WSNs based on the data aggregation communication paradigm, the authors proved that the conditional probability of data loss at a node is minimal given that the data of its parent node were successfully transmitted to the sink. Based on this conclusion, a novel algorithm to infer WSN topology was proposed. The algorithm can capture accurate topology for a WSN; it uses end-to-end measurements and does not incur any additional burden on the network. NS-2 simulation results show that the proposed algorithm has high accuracy.
Application of improved AR(1) model in DNS
2010, 30(3): 736-739.
The server selection algorithm is the key algorithm of the Domain Name System (DNS) when handling iterative queries. Among all queries sent to DNS, the proportion of iterative queries is larger than 30%, so the performance of the server selection algorithm directly affects the performance of a DNS server. The existing server selection algorithms were briefly reviewed, and their advantages and disadvantages were described. Then an improved AR(1) auto-regressive model was proposed, through which the response time of a DNS server can be dynamically predicted from the previous response time series. The new model can effectively avoid performance fluctuation and loss due to network congestion and short-time system failures; at the same time, it broadens the application scope of the AR(1) model and is suitable for all DNS servers.
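A minimal sketch of AR(1) prediction of server response times, assuming a least-squares fit of x_t = c + φ·x_{t-1}; the sample response times are made up, and the paper's improvements to the basic model are not reproduced:

```python
import numpy as np

def ar1_predict(rtts):
    """Fit x_t = c + phi * x_{t-1} by least squares and predict the next response time."""
    x_prev, x_curr = np.asarray(rtts[:-1]), np.asarray(rtts[1:])
    A = np.column_stack([np.ones_like(x_prev), x_prev])
    c, phi = np.linalg.lstsq(A, x_curr, rcond=None)[0]
    return c + phi * rtts[-1]

# Response times (ms) observed for one authoritative server; predict the next one.
print(ar1_predict([42.0, 47.5, 44.2, 50.1, 48.3]))
```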
Performance analysis and comparison of AODV and AOMDV routing protocols
2010, 30(3): 740-744.
A mobile Ad Hoc network is a self-organizing network in which routing protocols play an important role. The authors described the features of the Ad Hoc On-Demand Distance Vector (AODV) routing protocol and the Ad Hoc On-Demand Multipath Distance Vector (AOMDV) routing protocol and evaluated their performance in NS2 through a variety of test scenarios and different Media Access Control (MAC) protocols. The experimental results show that the AOMDV protocol provides better performance than AODV in terms of average delay and route discovery frequency, but worse performance in terms of packet delivery ratio and normalized routing load.
Game theory-based cooperative resource management for wireless broadband network
2010, 30(3): 745-750.
Future wireless communication networks need to provide high-rate multimedia broadband data traffic. In order to ensure reliability and validity, cooperative communication technology has been introduced into wireless networks. Thus, in this paper, a game theory-based cooperative resource management strategy for wireless multimedia networks was proposed. The Nash equilibrium point was found by determining the transmission price and the quantity of cooperative resources, the results of the strategy were shown to be Pareto optimal, and the rationality and feasibility of using game theory to allocate cooperative resources were verified by theoretical analysis. Simulation results show that, with the proposed strategy, network performance is much better than when radio resources are allocated at a fixed price.
Study on dissemination model of network public sentiment based on cellular automata
2010, 30(3): 751-755.
A Cellular Automaton (CA) structure for the dissemination of network public sentiment was designed, comprising the selection of cellular state data, the lattice space and the neighborhood. A majority transition rule modified with cell persistency was then proposed through a new equation, together with an algorithm accounting for the information influence of cells moving across the lattice space. A series of simulations was conducted to evaluate the model. Four parameters, including inclination intensity, inclination focusing degree, peak cell number, and the ratio of cell numbers in different inclination classes, were designed before the discussion. The iterative calculation shows that the results have different transition processes, patterns and meanings depending on whether a cell has variable persistency and whether it is static or moving. Finally, it is suggested that cells moving across the lattice with persistency deserve special attention, as they conform to the reality of the Internet. The simulation results prove that the proposed method is valid and effective.
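A minimal sketch of a majority-rule cellular automaton update with a cell-persistency probability, on a toy two-opinion grid (the grid size, the two-state opinions and the persistence value are illustrative assumptions, not the paper's multi-class inclination model or its cell-movement mechanism):

```python
import numpy as np

def majority_step(grid, persistence=0.0, rng=np.random.default_rng(0)):
    """One synchronous update: each cell adopts the majority opinion of its Moore
    neighborhood, unless it persists (keeps its current state) with the given probability."""
    padded = np.pad(grid, 1, mode="wrap")
    neigh_sum = sum(padded[1 + di:1 + di + grid.shape[0], 1 + dj:1 + dj + grid.shape[1]]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)) - grid
    majority = np.where(neigh_sum > 0, 1, -1)        # ties fall back to -1 in this toy rule
    keep = rng.random(grid.shape) < persistence
    return np.where(keep, grid, majority)

grid = np.random.default_rng(1).choice([-1, 1], size=(50, 50))   # random initial inclinations
for _ in range(10):
    grid = majority_step(grid, persistence=0.2)
```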
Distributed call center for enterprises based on Asterisk and OpenVPN
2010, 30(3): 756-760.
Cost, security, stability and efficiency are important factors when designing a distributed IP call center for an enterprise. The authors presented the architecture and implementation of a high-performance, enterprise-level distributed IP call center based on Asterisk and OpenVPN. The system uses OpenVPN to ensure the security of voice transmission and smart routing to handle various kinds of traffic. A communication mode using IAX2 as a relay was also proposed. The test results on CPU usage, load, bandwidth, delay and voice quality show that, compared with a SIP-based solution, the proposed design better addresses issues such as voice delay caused by jitter and intermittent interruptions in voice transmission.
Adaptive rate control algorithm of Speex codec based on speech quality prediction
2010, 30(3): 761-764.
To manage and control the speech quality of VoIP communication dynamically, an adaptive rate control algorithm based on speech quality prediction was proposed. The algorithm predicts the instantaneous and overall speech quality of VoIP in real time in order to adjust the encoding parameters of the Speex codec adaptively, and then selects the optimal encoding bit rate as needed. The simulation results demonstrate that the proposed algorithm indeed reduces network congestion and improves the speech quality of the VoIP system.
Load balancing design based on statistical model for VOD
2010, 30(3): 765-767.
Most load balancing algorithms are designed for Web services and are unfit for Video on Demand (VOD) systems, since they neglect the effect of user behavior and the differences among different segments within one program. Therefore, a load balancing algorithm based on statistical models was proposed. Starting from random user behavior, it analyzes the implied rules, divides the programs into pieces, and copies and stores them according to the different request probabilities of different segments in different programs. The simulation results testify to its superiority over other algorithms: it improves the utilization rate of server resources and achieves load balance.
Artificial intelligence
Feature feedback-based system for recognition of handwritten Chinese characters
2010, 30(3): 768-771.
Because a traditional open-loop recognition system has difficulty conforming to the human recognition process, a simulated intelligent recognition system with a feedback structure was constructed. After choosing the best initial recognition method based on multimodal and qualitative recognition results, the system extracts general character recognition errors after the initial recognition to judge the credibility and perform feedback correction. Three kinds of general character recognition error were designed according to the feedback result. The credibility evaluation index system and the feedback correction decision-making mechanism of the recognition result were established by qualitative and quantitative analysis of the three kinds of general character recognition error, and an analytical method for the recognition error was given. The experimental results prove that the presented method is effective.
Convergence analysis of clonal selection algorithm based on BCA
2010, 30(3): 772-775.
The Clonal Selection Algorithm (CSA) has been widely applied in the field of intelligent computation, but theoretical analysis of CSA is relatively lacking. In order to enrich the theoretical underpinnings of CSA, the authors abstracted the single-member B Cell Algorithm (BCA) from the multi-member CSA and simplified the mathematical model of CSA. A modified mutation operator in BCA, the Contiguous Region Hypermutation Operator (CRHO), was introduced; a Markov chain model of BCA was proposed; and a novel method for constructing the transition matrices of BCA was given. Consequently, it was proved that BCA converges absolutely. It can be concluded that the clonal selection algorithm is convergent, because BCA is an abstraction of the generic CSA.
Weighted least squares support vector regression based on AdaBoost algorithm
Peng DaiQiang
2010, 30(3): 776-778.
In the standard Least Squares Support Vector Machine (LS-SVM) for regression, every training sample is considered equally, which is unsuitable when there are significant differences among the training samples. A weighted least squares support vector regression based on the AdaBoost algorithm was therefore proposed. Learning with a series of support vector regressors, the approach combines all their results according to a combination rule. At the same time, adaptive weighting factors in LS-SVM are constructed to control the error function according to the regression error. The method emphasizes the significant differences among training samples through the adaptive weighting factors and improves generalization performance. The experimental results demonstrate that the proposed approach has competitive learning ability and achieves better accuracy than LS-SVM.
Application of SVR into quantitatively analyzing adverse selection contract model
2010, 30(3): 779-782.
An SVR-based quantitative calculation method was proposed so that the relevant theory of incentive contracts could be analyzed quantitatively and put into practical use. Since Support Vector Regression (SVR) can be computed quantitatively, it was used to express the utility function. On this basis, the gradient expressions of the adverse selection model were derived for both good and bad natural conditions, and for both high-efficiency and low-efficiency agents, and the corresponding gradient descent algorithm was given. Using the proposed method, the two adverse selection incentive contract models above were analyzed quantitatively, and the effects of varying parameters on the equilibria of the adverse selection models were observed. The reasonable results show that it is feasible to solve adverse selection incentive contract models using the SVR-based quantitative calculation method.
Improved PSO-BP neural network for power transformer fault diagnosis
2010, 30(3): 783-785.
The Particle Swarm Optimization (PSO) algorithm searches for the best solution by moving particles around the search space according to the global best particle. However, when one particle is selected as the global best continuously, the other particles keep converging toward it, which makes the swarm fall into local optima. The authors presented the Mutational Dynamic Particle Swarm Optimization (MDPSO) algorithm: when one particle is selected as the global best continuously, the inertia weights of part of the particles mutate, allowing those particles to jump out of the local optimum and keep searching the whole solution space. Furthermore, the authors combined MDPSO with a BP neural network and applied it to the diagnosis of power transformers. The experimental results show that the proposed approach has a better ability to optimize the BP neural network and achieves better diagnosis accuracy.
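The inertia-weight mutation idea can be sketched as follows (a toy PSO on the sphere function; the stall threshold, mutation fraction and weight range are illustrative assumptions, and the coupling with the BP network is omitted):

```python
import numpy as np

def mdpso(f, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
          stall_limit=5, seed=0):
    """PSO in which part of the swarm mutates its inertia weight whenever the same
    particle keeps supplying the global best, to escape local optima (illustrative only)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    w_i = np.full(n_particles, w)                     # per-particle inertia weights
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = np.argmin(pbest_val)
    last_leader, stall = g, 0
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w_i[:, None] * v + c1 * r1 * (pbest - x) + c2 * r2 * (pbest[g] - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = np.argmin(pbest_val)
        stall = stall + 1 if g == last_leader else 0
        last_leader = g
        if stall >= stall_limit:                      # same leader persists: mutate some weights
            idx = rng.choice(n_particles, n_particles // 3, replace=False)
            w_i[idx] = rng.uniform(0.4, 0.9, idx.size)
            stall = 0
    return pbest[g], pbest_val[g]

print(mdpso(lambda z: float(np.sum(z ** 2))))
```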
Model updating of aluminum honeycomb sandwich plate based on particle swarm optimization algorithm with flying factor
2010, 30(3): 786-788.
Honeycomb sandwich plates have high specific strength, high specific stiffness and good heat insulation, vibration insulation and impact resistance, which makes them widely used in aerospace engineering. The honeycomb sandwich plate was made equivalent to a shell element according to the equivalent plate theory, and based on the first five natural frequencies from the modal test, a Particle Swarm Optimization (PSO) algorithm with a flying factor was used to update material parameters such as the equivalent stiffness and equivalent density. The results show that the updated natural frequencies approach the test data better than the non-updated ones. Compared with the standard PSO algorithm, the Finite Element Model (FEM) updated by PSO with a flying factor approaches the real structure better, which proves the validity and efficiency of the algorithm.
Application of improved particle swarm optimization in path planning of underwater vehicles
2010, 30(3): 789-792.
The path planning of underwater vehicles in the ocean environment has the following characteristics: a broad planning range, relatively sparse obstacles, and the inevitable impact of ocean currents. The Particle Swarm Optimization (PSO) algorithm was adopted to realize path planning for underwater vehicles in complex ocean environments. Based on parameter control strategies and topology models, an improved PSO algorithm with better convergence precision was obtained. During planning, a fitness function combining path length, ocean currents and the shift cost of the underwater vehicle was designed; by using this function, the adverse effects of ocean currents on the vehicle's energy consumption and control performance can be greatly reduced. The simulation results verify the effectiveness of the algorithm and show that it can well meet the requirements of path planning for underwater vehicles in complex environments.
Noise cancellation for telemetry signal based on reconstructed phase space and principal component analysis
2010, 30(3): 793-795.
The aim of signal de-noising is to recover the original signal from a noisy environment. A new de-noising method for telemetry signals based on Principal Component Analysis (PCA) was proposed. First, a one-dimensional time series is embedded into an equivalent multi-dimensional space through the Reconstructed Phase Space (RPS) method; the telemetry signal can then be extracted by PCA, since it constitutes the principal components. The experimental results show that the de-noising performance of the proposed method is good and very similar to that of wavelet de-noising. Simulations verify that the proposed method has good performance and better self-adaptive ability than other methods.
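A minimal sketch of delay embedding followed by PCA truncation and diagonal averaging, assuming the signal of interest dominates the leading principal components (window length, component count and the synthetic test signal are illustrative assumptions):

```python
import numpy as np

def rps_pca_denoise(signal, dim=10, n_components=2):
    """Embed a 1-D series into phase space (delay embedding), keep its leading
    principal components, and reconstruct by averaging the overlapping entries."""
    x = np.asarray(signal, dtype=float)
    n = len(x) - dim + 1
    X = np.stack([x[i:i + dim] for i in range(n)])            # trajectory matrix, n x dim
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mean
    recon = np.zeros(len(x))
    counts = np.zeros(len(x))
    for i in range(n):                                        # average the anti-diagonals
        recon[i:i + dim] += X_hat[i]
        counts[i:i + dim] += 1
    return recon / counts

t = np.linspace(0, 4 * np.pi, 400)
noisy = np.sin(t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
clean = rps_pca_denoise(noisy)
```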
Abnormal audio recognition algorithm based on MFCC and short-term energy
2010, 30(3): 796-798.
Concerning the high complexity and low recognition rate in abnormal audio recognition, an abnormal audio recognition system based on Mel-Frequency Cepstrum Coefficients (MFCC) and short-term energy was proposed. This feature vector makes the Gaussian Mixture Model (GMM) classifier outperform classifiers using MFCC and differential MFCC features alone. The classifier achieves an average recognition rate of more than 90% with small computational complexity. The steps of the system implementation were elaborated, and the simulation results prove the effectiveness of the proposed algorithm.
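A minimal sketch of the classifier side, assuming MFCC frames are already computed and concatenated with per-frame short-term energy; one GMM is fitted per audio class and classification picks the highest log-likelihood (frame length, hop and mixture size are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def short_term_energy(signal, frame_len=400, hop=160):
    """Per-frame short-term energy of a 1-D audio signal."""
    frames = [signal[i:i + frame_len] for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.array([np.sum(np.asarray(f, dtype=float) ** 2) for f in frames])

def train_gmms(features_by_class, n_components=8):
    """Fit one diagonal-covariance GMM per audio class; each value is an
    (n_frames, n_features) array of [MFCC, energy] feature frames."""
    return {label: GaussianMixture(n_components, covariance_type="diag",
                                   random_state=0).fit(feats)
            for label, feats in features_by_class.items()}

def classify(gmms, feats):
    """Pick the class whose GMM gives the highest average frame log-likelihood."""
    return max(gmms, key=lambda label: gmms[label].score(feats))
```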
Database and data mining
Improvement of density-based method for reducing training data in KNN text classification
2010, 30(3): 799-801.
The density of the training data directly influences the efficiency and precision of the k-Nearest Neighbor (kNN) text classifier. Analysis of the density-based method for reducing the amount of training data in kNN text classification uncovered two disadvantages. One is that, after reduction, the even density of the training data is only ensured within spherical regions of radius ε, rather than by equal distances between every pair of training texts. The other is that low-density training texts receive no treatment, while plenty of low-density texts still exist in the training data after reduction. An improved approach to these deficiencies was proposed: the reduction strategy was optimized to make the training data evenly distributed, and appropriate texts were supplemented into the low-density regions. It is shown that the improved method performs distinctly better in both algorithm stability and accuracy.
Density-based detection for outliers and noises
2010, 30(3): 802-805.
Concerning the point clouds with noise and outliers acquired by 3D scanners, a de-noising method based on the concept of the local outlier was proposed. The method establishes the topological connection of the scattered points by searching the k-Nearest Neighbors (kNN) of each point. A local outlier factor is calculated to weight each point's outlier level, so that noise can be suppressed and outliers removed. The method emphasizes how to detect low-density outliers and noise scattered around a high-density point cloud. The experimental results show that the method can easily detect outliers next to model boundaries while preserving the borders to the greatest extent.
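The local-outlier-factor idea can be illustrated with an off-the-shelf implementation (scikit-learn's LocalOutlierFactor; the neighborhood size, contamination value and synthetic point cloud are illustrative assumptions, not the paper's own kNN topology construction):

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def remove_outliers(points, k=20, contamination=0.05):
    """Flag points whose local density is much lower than that of their k nearest neighbors."""
    lof = LocalOutlierFactor(n_neighbors=k, contamination=contamination)
    labels = lof.fit_predict(points)          # -1 marks outliers, 1 marks inliers
    return points[labels == 1], points[labels == -1]

cloud = np.vstack([np.random.default_rng(0).normal(size=(1000, 3)),        # dense surface samples
                   np.random.default_rng(1).uniform(-6, 6, size=(20, 3))])  # sparse stray points
inliers, outliers = remove_outliers(cloud)
```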
Depth-first search algorithm for mining frequent closed itemsets
2010, 30(3): 806-809.
Mining frequent closed itemsets is a fundamental and important issue in many data mining applications. A new depth-first search algorithm for mining frequent closed itemsets, called DFFCI, was proposed, which reduces the number of candidate itemsets and the cost of support counting. DFFCI projects the dataset information stored in an improved Compressed Frequent Pattern tree (CFP-Tree) into a partition matrix, and improves the efficiency of support counting by using binary vector logic operations. Global 2-itemset pruning based on support pre-counting and local extension pruning are used to prune the search space effectively. The experimental results show that DFFCI outperforms other depth-first search algorithms.
New linear grouping algorithm based on total least absolute deviation criteria
2010, 30(3): 810-812.
Abstract | PDF (467KB) | Related Articles | Metrics
To overcome the drawback that the traditional Linear Grouping Algorithm (LGA) is sensitive to outliers in the data set when extracting linear structures, a finite-step procedure based on the existence of an optimal linear grouping of the data set and a new algorithm based on k-means clustering, total least absolute deviation and resampling were proposed. The algorithm detects several different linear relations at once by minimizing the total orthogonal distance from the n given points to their nearest hyperplanes. Comparison with the original LGA and with the robust linear grouping algorithm based on impartial trimmed k-means shows that the proposed algorithms are more robust and can detect all strong linear structures in data sets containing many outliers.
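A hedged sketch of the overall flavour only: alternate a k-means-style assignment of points to hyperplanes with a refit that approximates the total-least-absolute-deviation criterion by iteratively reweighted orthogonal (total) least squares; the resampling-based initialisation and the finite-step argument of the paper are omitted:

```python
import numpy as np

def fit_hyperplane_l1(X, n_irls=10, eps=1e-6):
    """Return (normal, offset) approximately minimising sum_i |normal . x_i - offset|."""
    w = np.ones(len(X))
    for _ in range(n_irls):
        mu = np.average(X, axis=0, weights=w)
        centred = X - mu
        cov = (w[:, None] * centred).T @ centred / w.sum()
        _, eigvec = np.linalg.eigh(cov)
        normal = eigvec[:, 0]                    # direction of smallest weighted variance
        offset = normal @ mu
        r = np.abs(X @ normal - offset)
        w = 1.0 / np.maximum(r, eps)             # IRLS reweighting for an L1-type objective
    return normal, offset

def linear_grouping(X, k, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=len(X))     # random initial grouping
    for _ in range(n_iter):
        planes = []
        for j in range(k):
            pts = X[labels == j]
            if len(pts) <= X.shape[1]:           # empty or degenerate group: reseed it
                pts = X[rng.choice(len(X), size=X.shape[1] + 1, replace=False)]
            planes.append(fit_hyperplane_l1(pts))
        dists = np.column_stack([np.abs(X @ n - c) for n, c in planes])
        labels = dists.argmin(axis=1)            # reassign each point to its nearest hyperplane
    return labels, planes
```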
Survey on petascale file system search
2010, 30(3): 813-817.
Abstract | PDF (860KB) | Related Articles | Metrics
As file systems grow to petabytes, managing and retrieving billions of files becomes increasingly difficult, and efficient file system search becomes more and more important and necessary. The authors gave an overview of the current state of petascale file system search and its challenges, and discussed the key issues, existing projects and indexing technologies in petascale file system search. Finally, the authors proposed some new research directions in petascale file system search.
Improved Web page clustering algorithm based on partial tag tree matching
2010, 30(3): 818-820.
Abstract | PDF (430KB) | Related Articles | Metrics
In the process of Web information extraction, Web pages on the target websites should be clustered in order to detect and generate the templates used to extract the required information. The traditional page clustering algorithm based on Document Object Model (DOM) tree edit distance is not suitable for pages with complex DOM tree structures generated from dynamic templates. In this paper, an improved Web page clustering algorithm based on partial tag tree matching was proposed. In the proposed algorithm, appropriate weights are assigned to the nodes according to their effects on the layout of Web pages and the level difference between template nodes and non-template nodes. After that, the structural similarity between Web pages is computed efficiently based on the partial tree matching approach. The experimental results show that, compared with the traditional algorithms, the proposed algorithm achieves higher accuracy in clustering dynamically generated Web pages with lower computational complexity.
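A sketch of the tree-matching core on which such clustering can be built: a recursive simple-tree-matching score between tag trees, normalised by tree size. The layout-based node weights of the proposed algorithm are reduced here to a uniform weight of 1, and Node is a hypothetical minimal DOM-tree representation:

```python
class Node:
    def __init__(self, tag, children=()):
        self.tag, self.children = tag, tuple(children)
    def size(self):
        return 1 + sum(c.size() for c in self.children)

def simple_tree_match(a, b):
    if a.tag != b.tag:
        return 0
    m, n = len(a.children), len(b.children)
    # Dynamic programming over ordered child alignments (classic simple tree matching).
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = max(dp[i - 1][j], dp[i][j - 1],
                           dp[i - 1][j - 1] + simple_tree_match(a.children[i - 1],
                                                                b.children[j - 1]))
    return 1 + dp[m][n]

def structural_similarity(a, b):
    return 2.0 * simple_tree_match(a, b) / (a.size() + b.size())

# Usage: pages whose similarity exceeds a threshold (e.g. 0.8) fall into the same cluster.
p1 = Node('html', [Node('body', [Node('div', [Node('table')]), Node('div')])])
p2 = Node('html', [Node('body', [Node('div', [Node('table')]), Node('span')])])
print(round(structural_similarity(p1, p2), 2))   # -> 0.8
```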
Data update mechanism for native XML storage scheme
2010, 30(3): 821-824.
Abstract | PDF (581KB) | Related Articles | Metrics
A native XML storage scheme is directly related to query processing and data update. Current native XML storage schemes are mostly concerned with query processing and rarely support data update. Unlike updates of relational tables, an XML update needs to take the document order of nodes into account. A novel update mechanism for native XML storage was presented, which not only maintains the document order of nodes, but also restricts each update operation to a single page to ensure update efficiency. Through the introduction of forward link records and relocated records, the update mechanism keeps record storage addresses unchanged when splitting a page, so as to avoid the I/O overhead of index updates. A case study demonstrates that the data update mechanism for the native XML storage scheme is effective.
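A toy illustration of the forward-link idea under assumed page and record structures: a relocated record leaves a forward link at its old slot, so addresses stored in indexes stay valid after a page split and no index pages need rewriting:

```python
class Store:
    def __init__(self):
        self.pages = {0: {}}          # page_id -> {slot: record or ('FWD', page, slot)}
        self.next_page = 1

    def insert(self, page_id, slot, record):
        self.pages[page_id][slot] = record

    def split(self, page_id, moved_slots):
        # Move some records to a fresh page, leaving forward links behind.
        new_page = self.next_page
        self.next_page += 1
        self.pages[new_page] = {}
        for slot in moved_slots:
            self.pages[new_page][slot] = self.pages[page_id][slot]
            self.pages[page_id][slot] = ('FWD', new_page, slot)
        return new_page

    def read(self, page_id, slot):
        rec = self.pages[page_id][slot]
        while isinstance(rec, tuple) and rec[0] == 'FWD':   # follow forward links
            rec = self.pages[rec[1]][rec[2]]
        return rec

# Usage: the external address (page 0, slot 'n7') stays valid after the split.
s = Store()
s.insert(0, 'n7', '<price>42</price>')
s.split(0, ['n7'])
print(s.read(0, 'n7'))    # -> <price>42</price>
```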
XML keyword search algorithm based on efficient LCA
2010, 30(3): 825-830.
Abstract | PDF (849KB) | Related Articles | Metrics
Concerning keyword search in XML documents, meaningless query results were studied from two aspects: equivalence of content in element labels and similarity of element structure. The concept of eFficient Lowest Common Ancestor (FLCA) was introduced, and the concept of Compact eFficient Lowest Common Ancestor (CFLCA) was then proposed on the basis of FLCA. Based on the definition of the query result set, a search algorithm using an equivalent pattern value index, called BEPVA, was presented and compared with CVLCA and SLCA. The experimental results indicate that the proposed approach outperforms CVLCA and SLCA in terms of both quality and efficiency of the query.
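For orientation, the basic building block behind all LCA-style XML keyword search: with Dewey labels, the lowest common ancestor of two nodes is simply their longest common label prefix. The additional FLCA/CFLCA filtering defined in the paper is not reproduced here:

```python
def lca(dewey_a, dewey_b):
    """dewey_a, dewey_b: lists of ints, e.g. [1, 2, 3] for node 1.2.3."""
    prefix = []
    for x, y in zip(dewey_a, dewey_b):
        if x != y:
            break
        prefix.append(x)
    return prefix

def lca_of_keyword_match(inverted_index, keywords):
    # One candidate node per keyword, reduced pairwise; real algorithms enumerate
    # combinations and prune, which is where FLCA/CFLCA-style definitions help.
    nodes = [inverted_index[k][0] for k in keywords]
    result = nodes[0]
    for node in nodes[1:]:
        result = lca(result, node)
    return result

# Usage with a tiny inverted index mapping keyword -> list of Dewey labels:
index = {'Tanaka': [[1, 2, 1, 1]], 'XML': [[1, 2, 2, 1]]}
print(lca_of_keyword_match(index, ['Tanaka', 'XML']))   # -> [1, 2]
```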
XML coding scheme for efficient query processing
2010, 30(3): 831-834.
Abstract | PDF (579KB) | Related Articles | Metrics
As the core operation in XML query processing, structural joining consumes a great deal of time. The authors proposed a new approach called the labeling scheme for efficient query processing (LSEQ). By decomposing path information, LSEQ avoids recording repeated information and reduces label length. Moreover, LSEQ supports the representation of the ancestor-descendant, parent-child and sibling relationships between any two nodes. By storing the paths of internal nodes, LSEQ enhances query efficiency and avoids structural joins. The experimental results show that LSEQ has advantages in producing more compact translated SQL and reducing relational database storage space.
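A sketch of the property such a labeling scheme provides, namely deciding structural relations from labels alone; a Dewey-style label is used as a stand-in, since LSEQ's decomposed-path encoding is not detailed in the abstract:

```python
class Label:
    def __init__(self, parts):
        self.parts = tuple(parts)            # e.g. (1, 3, 2) for node 1.3.2

    def is_ancestor_of(self, other):
        return (len(self.parts) < len(other.parts)
                and other.parts[:len(self.parts)] == self.parts)

    def is_parent_of(self, other):
        return (len(other.parts) == len(self.parts) + 1
                and other.parts[:-1] == self.parts)

    def is_sibling_of(self, other):
        return self.parts != other.parts and self.parts[:-1] == other.parts[:-1]

# Usage:
a, b, c = Label([1, 3]), Label([1, 3, 2]), Label([1, 3, 5])
print(a.is_ancestor_of(b), a.is_parent_of(b), b.is_sibling_of(c))   # True True True
```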
Typical applications
Research on video semantic retrieval system based on ontology
2010, 30(3): 835-837.
Abstract | PDF (630KB) | Related Articles | Metrics
Retrieving video content at the semantic level can bridge the "semantic gap" and increase the utilization efficiency of video content. The authors made use of the annotation and reasoning abilities of ontology to study video semantic retrieval, fully mined the structural and semantic information of the video content, and built a hierarchical semantic index, which greatly strengthens the system's semantic retrieval ability. The ontology structure of the ontology-based Video Semantic Retrieval system (OVSR) integrates domain ontology, video ontology and core ontology, and has strong extensibility and interoperability. The authors mainly discussed OVSR's ontology structure, video semantic model and indexing model, and studied the query rewriting algorithm and the ontology reasoning algorithm.
Research on finding connected maximal common subgraph in several graphs
2010, 30(3): 838-841.
Abstract | PDF (549KB) | Related Articles | Metrics
Many application problems in pattern recognition, in the extraction and analysis of malicious code family signatures, and in artificial intelligence can be converted into the problem of finding a connected maximal common subgraph of several graphs. A new matrix algorithm that solves some simple cases was presented, and the feature correlation of a graph and the serial correlation coefficient of graph degrees were defined. Finally, a greedy algorithm for finding a connected maximal common subgraph of several graphs was proposed and illustrated with an example; the greedy algorithm can quickly and efficiently find a common connected subgraph that is as large as possible.
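A generic greedy sketch (not the paper's matrix algorithm) for growing a large connected common induced subgraph of two graphs given as adjacency dictionaries: a node mapping is extended only with pairs whose adjacency to the already-mapped nodes agrees in both graphs, so the mapped parts stay isomorphic and connected:

```python
def greedy_common_subgraph(g1, g2):
    """g1, g2: dict node -> set of neighbours. Returns a node mapping g1 -> g2."""
    best = {}
    for s1 in g1:
        for s2 in g2:
            mapping = {s1: s2}
            grown = True
            while grown:
                grown = False
                for u in g1:
                    if u in mapping or not (g1[u] & set(mapping)):
                        continue                       # only extend while staying connected
                    for v in g2:
                        if v in mapping.values():
                            continue
                        # adjacency to every already-mapped pair must agree in both graphs
                        if all((a in g1[u]) == (b in g2[v]) for a, b in mapping.items()):
                            mapping[u] = v
                            grown = True
                            break
                    if grown:
                        break
            if len(mapping) > len(best):
                best = mapping
    return best

# Usage: two isomorphic graphs (a triangle with a pendant node) give a full mapping.
g1 = {'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b', 'd'}, 'd': {'c'}}
g2 = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(greedy_common_subgraph(g1, g2))   # e.g. {'a': 1, 'b': 2, 'c': 3, 'd': 4}
```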
Static scheduling model and its greedy algorithm of agile supply chain
2010, 30(3): 846-849.
Abstract | PDF (651KB) | Related Articles | Metrics
In order to solve the scheduling problem of definite market demands with quantity and time constraints in an Agile Supply Chain (ASC) consisting of multi-stage members with limited capacities, a Supply Chain Structure Model (SCSM) was first established according to the supplying distances between suppliers and the market. On the basis of SCSM, a Linear Programming (LP) model was then set up to describe the scheduling problem, and a two-stage algorithm was developed to solve the LP model. The shortest-response-time scheduling greedy algorithm in the first stage judges whether the supply chain can meet the demands' quantity and time constraints, and the lean scheduling greedy algorithm in the second stage obtains the supply chain's optimal scheduling solution with the minimum inventory cost. The practicality and effectiveness of the model and algorithms were verified by a scheduling example.
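An illustrative sketch of the first-stage greedy only, with made-up supplier data: the demand is allocated to suppliers in order of response time within their capacities, and feasibility against the deadline is reported; the second-stage lean scheduling for minimum inventory cost is omitted:

```python
def shortest_response_time_schedule(demand_qty, deadline, suppliers):
    """suppliers: list of dicts with 'name', 'capacity', 'response_time' (hypothetical fields)."""
    remaining = demand_qty
    plan = []
    for s in sorted(suppliers, key=lambda s: s['response_time']):   # fastest suppliers first
        if remaining <= 0:
            break
        if s['response_time'] > deadline:
            break                      # all remaining suppliers are even slower
        qty = min(s['capacity'], remaining)
        plan.append((s['name'], qty, s['response_time']))
        remaining -= qty
    feasible = remaining <= 0
    return feasible, plan

# Usage with made-up suppliers:
suppliers = [{'name': 'S1', 'capacity': 40, 'response_time': 2},
             {'name': 'S2', 'capacity': 30, 'response_time': 3},
             {'name': 'S3', 'capacity': 50, 'response_time': 5}]
print(shortest_response_time_schedule(60, 4, suppliers))
# -> (True, [('S1', 40, 2), ('S2', 20, 3)])
```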
Development and application of serial synchronous communication driver in Linux 2.6
2010, 30(3): 850-853.
Abstract | PDF (619KB) | Related Articles | Metrics
The communication between ARM and Digital Signal Processor (DSP) is a key factor in the ARM/DSP dual-core design widely used in geophysical exploration instruments. In this paper, serial synchronous communication between the AT91RM9200/DSP56309 dual-core processors was realized, and the modularized and hierarchical design of the Serial Synchronous Communication (SSC) driver for the AT91RM9200 in Linux 2.6 was introduced in detail. DMA transmission, multi-buffering and priority modification of the Peripheral DMA Controller (PDC) were used in the driver design. The SSC interface circuit based on this design was also introduced. The experimental results show that data transfer between the AT91RM9200 and DSP56309 is rapid and stable, so the dual-core processors cooperate well.
Implementation and optimization of face recognition algorithm based on DSP
2010, 30(3): 854-856.
Abstract | PDF (428KB) | Related Articles | Metrics
In order to realize real-time face recognition in video on an embedded system, a design scheme for a fast face recognition system based on the TI TMS320DM642 was presented, and optimization strategies were proposed in terms of both hardware and software. Firstly, effective features were selected, and the face detection and recognition algorithm was briefly introduced. Then, according to the algorithm, the design scheme and functional units of the hardware system were described. Finally, the algorithm was ported to the hardware system, the optimization strategies were applied, and a fast embedded face recognition system was implemented. The experimental results demonstrate that the system works reliably, the running speed of the algorithm is improved markedly, and real-time face recognition in video is realized.
Superintended by: Sichuan Associations for Science and Technology
Sponsored by: Sichuan Computer Federation; Chengdu Branch, Chinese Academy of Sciences
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editor: SHEN Hengtao, XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address: No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803, 028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn