Table of Content
01 May 2011, Volume 31 Issue 05
Network and communications
Analysis of cooperation model for P2P live streaming in game theoretic framework
CHENG Pu CHU Yan-ping DU Ying
2011, 31(05): 1159-1161. DOI: 10.3724/SP.J.1087.2011.01159
To resolve the problems of "free riding" and the "tragedy of the commons" in peer-to-peer live streaming systems, a cooperation model was proposed in a game-theoretic framework. The proportional-fairness optimal strategy was proved under Nash equilibrium and Pareto optimality, and the corresponding node behavior strategies were analyzed with cheating behaviors taken into account. The analytical results show that the model can effectively stimulate node cooperation and prevent cheating.
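The proportional-fairness allocation referred to above has a simple closed form; the sketch below is our own illustration (the function name and the use of upload contributions as weights are assumptions, not the paper's model):

```python
def proportional_fair_allocation(capacity, contributions):
    """Split upload capacity across peers to maximize sum(w_i * log(x_i))
    subject to sum(x_i) = capacity; the optimum is x_i = C * w_i / sum(w)."""
    total = sum(contributions)
    return [capacity * w / total for w in contributions]

# A peer contributing twice as much receives twice the allocation.
shares = proportional_fair_allocation(100.0, [1.0, 1.0, 2.0])
```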
Improving VoIP capacity in mobile WiMAX network through traffic priority
LI Ming WU Yan-ling YANG Lei HAN Qing-tao
2011, 31(05): 1162-1165. DOI: 10.3724/SP.J.1087.2011.01162
Five scheduling mechanisms have been proposed to ensure Quality of Service (QoS) in WiMAX networks, three of which are designed for real-time applications. However, fairness is not considered when these mechanisms assign resources: to guarantee existing services, new service requests are rejected when network resources are scarce. To resolve this problem, a new scheduling mechanism with an embedded Priority Decision Maker (PDM) was proposed for Voice over Internet Protocol (VoIP) service. When new and existing services requested resources simultaneously, higher priority was assigned to the new service requests, and a resource distribution center allocated resources according to priority levels to guarantee fairness. Detailed performance analysis was performed. The simulation results show that the proposed scheduling mechanism can increase the number of VoIP connections and the total throughput by about 15% and 11% respectively.
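The priority-decision behavior described above can be pictured as a small priority queue; this is a hypothetical sketch (class and method names are ours, not the paper's implementation):

```python
import heapq

class PriorityDecisionMaker:
    """Toy scheduler: lower number = higher priority, so new VoIP
    requests are granted ahead of re-requests from existing connections."""
    NEW, EXISTING = 0, 1

    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps FIFO order within a class

    def request(self, conn_id, kind):
        heapq.heappush(self._queue, (kind, self._seq, conn_id))
        self._seq += 1

    def grant(self):
        # pop the highest-priority pending request
        return heapq.heappop(self._queue)[2]
```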
Basic ant routing algorithm in mobile Ad Hoc networks
QU Da-peng WANG Xing-wei HUANG Min REN Xiu-li
2011, 31(05): 1166-1169. DOI: 10.3724/SP.J.1087.2011.01166
Concerning that resources in mobile Ad Hoc networks are limited and the existing ant routing algorithms are complex, a basic ant routing algorithm was proposed. Based on an analysis of the ant routing process, it maintained only the basic ant routing mechanism, without any extra overhead. Pheromone update and pheromone use, the two key components of the algorithm, were discussed, and their impact on performance was analyzed by simulation. The experimental results show that the algorithm achieves performance close to that of other routing protocols at a lower overhead.
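The two components named above, pheromone update and pheromone use, can be sketched minimally as follows (evaporation rate and reinforcement values are illustrative assumptions, not the paper's parameters):

```python
import random

def update_pheromone(pheromone, neighbor, reinforcement, evaporation=0.1):
    """Evaporate all entries, then reinforce the neighbor an ant arrived from."""
    for n in pheromone:
        pheromone[n] *= (1.0 - evaporation)
    pheromone[neighbor] = pheromone.get(neighbor, 0.0) + reinforcement

def choose_next_hop(pheromone, rng=random):
    """Pheromone use: pick a next hop with probability proportional to pheromone."""
    total = sum(pheromone.values())
    r = rng.uniform(0.0, total)
    acc = 0.0
    for n, tau in pheromone.items():
        acc += tau
        if r <= acc:
            return n
    return n  # guard against floating-point rounding
```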
Probabilistic routing algorithm based on contact duration in DTN
WANG Gui-zhu HE Cheng WANG Bing-ting
2011, 31(05): 1170-1172. DOI: 10.3724/SP.J.1087.2011.01170
Considering that contact duration has a significant influence on whether a packet can be transmitted successfully, the authors proposed a Probabilistic Routing Protocol using History of Encounters and Transitivity based on Contact Duration (PRoPHET-CD), which combined contact duration with encounter frequency to estimate the delivery probability. This protocol could improve the delivery probability significantly and reduce interruptions of packet transmission. The simulation results show that PRoPHET-CD can significantly enhance the message delivery probability and reduce the overhead ratio.
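PRoPHET's standard delivery-predictability updates are well documented; the contact-duration weighting below is only our guess at the flavor of the modification, with illustrative constants (P_INIT, GAMMA, min_duration are assumptions):

```python
P_INIT, GAMMA = 0.75, 0.98  # illustrative PRoPHET-style constants

def age(p, elapsed_units):
    """Delivery predictability decays while two nodes do not meet."""
    return p * GAMMA ** elapsed_units

def on_encounter(p_old, contact_duration, min_duration=1.0):
    """Direct update on contact; here the reinforcement is additionally
    scaled by how long the contact lasts (our own weighting, not the paper's)."""
    weight = min(1.0, contact_duration / min_duration)
    return p_old + (1.0 - p_old) * P_INIT * weight
```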
Spectrum allocation algorithm based on time difference factor in cognitive radio
WEN Kai FU Xiao-ling FU Ling-sheng
2011, 31(05): 1173-1175. DOI: 10.3724/SP.J.1087.2011.01173
In order to reduce the outage probability and enhance the stability of cognitive systems, an improved spectrum allocation algorithm based on the classical graph coloring model was proposed. A difference factor between the spectrum's idle time and the user's request time was introduced. For every cognitive user, the algorithm allocated spectrum according to two factors, the spectrum efficiency and the time difference factor, giving priority to the cognitive user with the greater product of the two. The simulation results show that the outage probability of the improved algorithm is far below that of the previous algorithm.
IGP/MPLS hybrid IP traffic planning method under uncertain traffic matrices
ZENG Wen-long WANG Sheng WANG Xiong
2011, 31(05): 1176-1179. DOI: 10.3724/SP.J.1087.2011.01176
With the rapid development of IP networks, network traffic is becoming increasingly uncertain and unpredictable. To address this problem, this paper presented a Mixed Integer Programming (MIP) model for the IGP/MPLS hybrid IP traffic planning problem under uncertain traffic matrices, based on the Hose model. The MIP model was then decomposed into two sub-problems, weight design and traffic distribution, so that it could be solved effectively. The experimental results demonstrate that the proposed method can obtain better optimization performance with only a few established Label Switching Paths (LSPs).
Live migration transition framework of mobile IPv4/IPv6 virtual machine
CHEN Jun CHEN Xiao-wei
2011, 31(05): 1180-1183. DOI: 10.3724/SP.J.1087.2011.01180
In order to make full use of IPv4/IPv6 heterogeneous network resources and meet the resource requirements of cloud computing platforms, the authors designed an IPv4/IPv6 virtual machine live-migration transition framework for cloud computing based on tunnel technology, prefix management, address pool management and mobile IP. The framework used the designed cloud computing control engine as its core to translate between and link heterogeneous networks, in collaboration with Network Address Translation-Protocol Translation (NAT-PT) and tunneling. It enabled seamless live migration of IPv4/IPv6 virtual machines in the early, middle and late periods of the IPv4-to-IPv6 transition, and provided IPv4/IPv6 cloud computing services to clients. The framework can be applied to construct cloud computing platforms during the IPv4/IPv6 transition period.
Information resource addressing model based on trust-driven cloud for Internet of things
WAN Nian-hong WANG Xue-rong
2011, 31(05): 1184-1188. DOI: 10.3724/SP.J.1087.2011.01184
To improve bottom-layer information resource addressing efficiency for the Internet of Things (IoT), an information resource addressing model based on a trust-driven cloud was presented, built on trust evaluation criteria for bottom-layer IoT addressing services in the cloud and on improved trust-driven algorithms. First the key addressing features were analyzed; then the addressing model was constructed by designing specific constraint conditions, a trust steepness function, cloud trust evaluation criteria and trust constraint coefficients. Finally, the model was validated on an IoT system design instance. The experimental results show the proposed model achieves satisfactory bottom-layer resource addressing efficiency in comparison with traditional models and algorithms.
Improved DV-Hop localization algorithm based on wireless sensor networks
ZHAO Ling-kai HONG Zhi-quan
2011, 31(05): 1189-1192.
Distance Vector-Hop (DV-Hop) is one of the typical localization algorithms for Wireless Sensor Networks (WSN), but its accuracy is not high. This paper analyzed the DV-Hop algorithm and identified the main source of its error. To address this defect, the new algorithm relied on the constant velocity of wireless signals in the same medium and used a counter to measure the transmission time between unknown nodes and anchor nodes as well as between anchor nodes. It then corrected the estimated distance of the unknown node through the ratio of these times. The simulation results show that the new algorithm reduces the positioning error and improves the positioning accuracy.
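For context, the classic DV-Hop distance-estimation step that the paper improves on (replacing it with transmission-time ratios) can be sketched as follows; the data layout is our own:

```python
import math

def dv_hop_estimate(anchors, hop_counts, unknown_hops):
    """Classic DV-Hop step: anchors compute an average per-hop distance,
    and an unknown node multiplies it by its hop count to each anchor."""
    dist_sum = hops_sum = 0.0
    ids = list(anchors)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a, b = ids[i], ids[j]
            dist_sum += math.dist(anchors[a], anchors[b])
            hops_sum += hop_counts[(a, b)]
    hop_size = dist_sum / hops_sum  # network-wide average hop distance
    return {a: hop_size * h for a, h in unknown_hops.items()}
```

The estimated distances would then feed a trilateration step to fix the unknown node's position.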
Graphics and image technology
Improved algorithm on contour line position relation
HE Huai-qing YANG Peng
2011, 31(05): 1193-1197. DOI: 10.3724/SP.J.1087.2011.01193
By analyzing the principles of and the existing problems in the ray method and the extreme coordinate value method, the existing algorithms for determining contour direction were simplified. An improved algorithm on the contour line position relation was then proposed, combining the advantages of the ray method and the extreme coordinate value method. The algorithm mainly includes four parts: distinguishing internal from external contours, adjusting the contour direction, identifying contour inclusion, and constructing a contour tree. The experimental results show that the improved algorithm corrects the problems in the existing algorithms and achieves good efficiency.
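The ray method mentioned above is the standard even-odd crossing test; a minimal sketch for deciding whether a point lies inside a contour (contours as lists of (x, y) vertices, our own layout):

```python
def point_in_contour(point, contour):
    """Ray method: cast a horizontal ray rightward from the point and count
    edge crossings; an odd count means the point is inside the contour."""
    x, y = point
    inside = False
    n = len(contour)
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # this edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside
```

Applying this test between contours is what lets the algorithm decide inclusion and build the contour tree.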
Steady algorithm for Boolean operation of 3D mesh model
CHEN Xue-gong MA Jin-jin QIU Hua FU Jin-hua XIAO Ke-yan
2011, 31(05): 1198-1201.
This paper proposed a stable and precise Boolean operation algorithm for 3D mesh models. Firstly, based on the original topology of the mesh model, the algorithm quickly located the intersection area of the mesh curves by combining intersection tests of layered node bounding boxes. Then it used an improved triangle intersection algorithm to calculate the discrete intersection segments and re-triangulated every intersected triangle. By building the topology of the intersection segments and triangles, the algorithm could quickly trace and extract the discrete segments, classify and combine the local areas, and realize precise Boolean operations. The algorithm can effectively deal with all kinds of instances and is easy to implement. The experimental results prove that the algorithm meets engineering requirements.
Approach for line detection based on principal component analysis and clustering
LIU Wei JIN Wen-biao XIAO Xian-qian
2011, 31(05): 1202-1204. DOI: 10.3724/SP.J.1087.2011.01202
In the existing line detection methods, those based on the Hough Transformation (HT) have a huge cost and often produce false results, while those based on chain tracing are weak in robustness and adaptability. This paper proposed a new approach for line detection: chains were generated by chain tracing over the edge image block by block, Principal Component Analysis (PCA) was then applied to each chain to construct segments, and finally lines were obtained by merging the segments through clustering. The experimental results show the approach is fast and gives good results; in particular, it performs well on highly complex and detail-rich images.
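The PCA step on a 2-D point chain reduces to finding the principal axis of the covariance matrix, which for the 2x2 case has a closed form; a minimal sketch (function name ours):

```python
import math

def fit_segment_direction(points):
    """Principal axis of a 2-D point chain: for a 2x2 symmetric covariance
    matrix, the principal angle is 0.5 * atan2(2*sxy, sxx - syy)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    angle = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return (mx, my), angle  # segment anchor point and direction
```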
Particle filter tracking algorithm based on geometric active contours
CAO Jie ZENG Qing-hong WANG Jin-hua
2011, 31(05): 1205-1208. DOI: 10.3724/SP.J.1087.2011.01205
The Standard Particle Filter (SPF) is a typical method for solving the tracking problem of nonlinear/non-Gaussian systems. However, its updating process strictly depends on parameter selection, and it cannot handle changes in curve topology. In this regard, a new particle filter tracking algorithm based on geometric active contours was proposed, which handled changes of curve topology well using level set theory. The algorithm also improved the resampling technique and increased the diversity of particles. The simulation results indicate that the proposed method can effectively improve the state estimation precision with more flexibility.
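As background for the resampling step the paper improves, a standard low-variance systematic resampler looks like this (a sketch; the paper's improved variant is not reproduced here):

```python
import random

def systematic_resample(particles, weights, rng=random):
    """Systematic resampling: one random offset, then evenly spaced
    pointers through the cumulative weights; O(N) and low variance."""
    n = len(particles)
    total = sum(weights)
    step = total / n
    u = rng.uniform(0.0, step)
    out, acc, i = [], weights[0], 0
    for _ in range(n):
        while u > acc:
            i += 1
            acc += weights[i]
        out.append(particles[i])
        u += step
    return out
```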
Blind super-resolution reconstruction method based on maximum a posterior estimation
ZHANG Hong-yan SHEN Huan-feng ZHANG Liang-pei LI Ping-xiang YUAN Qiang-qiang
2011, 31(05): 1209-1213. DOI: 10.3724/SP.J.1087.2011.01209
In this paper, a new joint Maximum A Posterior (MAP) formulation was proposed that integrates image registration into blind image Super-Resolution (SR) reconstruction to reduce image registration errors. The formulation was built upon the MAP framework, which judiciously combines image registration, blur identification and SR. A cyclic coordinate descent optimization procedure was developed to solve the MAP formulation, in which the registration parameters, blurring function and High-Resolution (HR) image were each estimated in an alternating manner, given the other two. The experimental results indicate that the proposed algorithm is considerably effective in terms of both quantitative measurement and visual evaluation.
Fast component amending and labeling algorithm for convex objects
QIU Liu-dong WANG Niu LI Zu-shu
2011, 31(05): 1214-1216. DOI: 10.3724/SP.J.1087.2011.01214
Concerning the influence of inner holes and concavities on recognition, which is hard to eliminate with common component labeling algorithms, a new component labeling and amending algorithm was presented for convex objects. The presented algorithm used a scan-line port search algorithm to fill inner holes and used convexity theory to eliminate concavities. The algorithm makes the amended component closer to the original one, extracts characteristics including the contour with good real-time performance, and improves the precision of object recognition. The presented algorithm has been applied in robot soccer competitions and works successfully.
Infrared small target detection algorithm based on improved two-sliding-window
LIU Xing-miao WANG Shi-cheng ZHAO Jing HU Bo
2011, 31(05): 1217-1220. DOI: 10.3724/SP.J.1087.2011.01217
The temporal characteristics of infrared images and the different features of small targets, noise and background were analyzed in this paper, and a new infrared small target detection algorithm combining the temporal and spatial domains was put forward. Because the background changes slowly, the Signal-to-Noise Ratio (SNR) of the small target was first enhanced by subtracting adjacent frames. Then potential small targets were detected by applying the center distinguishing method, and the two-sliding-window algorithm was adopted to remove isolated noise. At last, the similarity distinguishing method was used to eliminate edge disturbance and realize the final detection of the small target. The experimental results indicate that the improved algorithm has better target detection and real-time performance.
Mixed compression algorithm for error-diffusion halftone image based on look-up table
GENG Ye KONG Yue-ping LIU Xin
2011, 31(05): 1221-1223. DOI: 10.3724/SP.J.1087.2011.01221
A mixed compression algorithm for error-diffusion halftone images was proposed that overcomes the disadvantages of conventional lossy binary image coding techniques by combining them with inverse halftoning. A Look-Up Table (LUT) inverse halftoning method was used to convert the error-diffusion image back to a continuous-tone image, and then an improved Discrete Cosine Transform (DCT) coding algorithm was applied to achieve a higher compression rate. The experimental results indicate that the proposed algorithm suits error-diffusion images and that the quality of the decoded image is good.
Image dehazing method based on neighborhood similarity dark channel prior
GUO Jia WANG Xiao-tong HU Cheng-peng XU Xiao-gang
2011, 31(05): 1224-1226. DOI: 10.3724/SP.J.1087.2011.01224
Images acquired in bad weather have poor contrast and color. This paper proposed a simple haze removal method based on the dark channel prior. After acquiring the transmission, the difference between the dark channel and the dark values of the eight nearest pixels was computed, and the pixel with the minimal difference was redefined as the new dark channel. Besides, the air light was estimated automatically from the histogram of the dark channel. At last, the clear image was recovered based on the physical model. The experimental results show that the method can sharpen edges and improve the quality of the degraded image.
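The dark channel that the method starts from is the per-pixel minimum over the color channels followed by a local minimum filter; a minimal pure-Python sketch (image layout and patch radius are our assumptions):

```python
def dark_channel(image, patch=1):
    """Dark channel prior: min over RGB per pixel, then min over a
    (2*patch+1)-sized local window. image is a nested list of (r, g, b)."""
    h, w = len(image), len(image[0])
    min_rgb = [[min(px) for px in row] for row in image]
    dark = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - patch), min(h, y + patch + 1))
            xs = range(max(0, x - patch), min(w, x + patch + 1))
            dark[y][x] = min(min_rgb[yy][xx] for yy in ys for xx in xs)
    return dark
```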
Improved algorithm for removing thin cloud in single remote sensing image
YAN Qing LIANG Dong ZHANG Jing-jing
2011, 31(05): 1227-1229. DOI: 10.3724/SP.J.1087.2011.01227
Because cloud threshold algorithms often generate boundary effects, this paper proposed an improved algorithm based on the wavelet transform and homomorphic filtering. The cloudy image was decomposed by wavelet transform to find the proper number of demarcation levels. Cloud was removed by applying homomorphic filtering to the higher-level coefficients, while the lower-level detail coefficients and the approximation coefficients were given weight factors respectively. The three parts of the coefficients were reconstructed and fused to get the processed result. The experimental results indicate that the proposed algorithm can remove thin cloud cover effectively, maintain details better and prevent boundary artifacts.
Application of rough adaptive genetic algorithm for image restoration
LI Li-juan YANG Qiong
2011, 31(05): 1230-1232. DOI: 10.3724/SP.J.1087.2011.01230
Concerning the application of the Simple Genetic Algorithm (SGA) to image restoration, a new method was proposed to deal with the problems of low matching degree and divergent matching values, which make it difficult to obtain the required solutions. The method, composed of the SGA and a Rough Adaptive Algorithm (RAA), sorted the matching values of the search solution space into two types, light and shade. Then, to enhance the robustness of the image recovery algorithm, the two types were handled respectively by the rough adaptive model on the basis of the sorted values. Compared with the inverse filter, the Wiener filter and the SGA, the proposed method yields better image edges and a higher Peak Signal-to-Noise Ratio (PSNR).
Illumination-adaptive skin color detection method
XIONG Xia SANG Qing-bing
2011, 31(05): 1233-1236. DOI: 10.3724/SP.J.1087.2011.01233
Through a study of skin features under different lighting environments in the YCgCr color space, it was found that skin pixels occupy different Cg and Cr areas in different environments. A correlation matrix was used to estimate the lighting environment. Different skin segmentation methods were utilized according to the result of environment detection, and a dynamic threshold was composed from the between-class variance and the within-class scatter. Compared with traditional methods, the proposed method greatly reduces the impact of color distortion under different lighting. The experimental results show that it has higher accuracy and a lower detection error rate.
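For reference, the RGB-to-YCgCr conversion can be sketched with the coefficients of a commonly cited YCgCr definition; the paper's exact transform may differ, so treat the constants as an assumption:

```python
def rgb_to_ycgcr(r, g, b):
    """RGB (0-255) to YCgCr using one published coefficient set; Cg and Cr
    are centered at 128, so any gray pixel maps to Cg = Cr = 128."""
    y = 16 + (65.481 * r + 128.553 * g + 24.966 * b) / 255
    cg = 128 + (-81.085 * r + 112.0 * g - 30.915 * b) / 255
    cr = 128 + (112.0 * r - 93.786 * g - 18.214 * b) / 255
    return y, cg, cr
```

Skin detection then thresholds (Cg, Cr) against ranges chosen per lighting environment.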
Gait recognition method based on kernel principal component analysis
CHEN Xiang-tao ZHANG Qian-jin
2011, 31(05): 1237-1241. DOI: 10.3724/SP.J.1087.2011.01237
Concerning the issues of extracting features efficiently from a sequence of gait frames and of real-time recognition, an effective human recognition method based on the Mean Gait Energy Image (MGEI) was described, which utilized Kernel Principal Component Analysis (KPCA). A pre-processing technique was used to segment the moving silhouette from the walking figure. The algorithm obtained the gait quasi-period by analyzing the width information of the lower limbs' gait contour edges, and the MGEI was calculated over each gait period. KPCA extracted principal components with a nonlinear method and described the relationships among three or more pixels of the identified images. By selecting a proper kernel function, KPCA could exploit the high correlation between different MGEIs for feature extraction, and a Euclidean distance classifier weighted by the reciprocal of the variance was designed. The experimental results show that the proposed approach has better recognition performance and greatly reduced computation time.
Cone-beam CT perfusion imaging method on image-guided radiation therapy
QIAN Ying YANG Wen-feng
2011, 31(05): 1242-1244. DOI: 10.3724/SP.J.1087.2011.01242
In order to realize function-imaging-guided radiotherapy combining image-guided radiotherapy with functional imaging, this paper studied the possibility of Computed Tomography (CT) perfusion imaging using cone-beam CT in image-guided radiotherapy. To solve the problem that Cone-Beam Computed Tomography (CBCT) cannot obtain a precise Time-Density Curve (TDC) because of its low imaging speed, this paper proposed a method of building a mathematical model based on projection data. First, CBCT projection data were simulated with computer simulation technology. Then the density values of the voxels were mathematically modeled. Finally, concrete parameters were calculated using computer programming and optimization techniques. The experiment proves that the TDC obtained from the proposed model and that from Dynamic Contrast-Enhanced CT (DCE-CT) have high similarity. Hence, realizing CBCT perfusion imaging in image-guided radiotherapy with the proposed model is feasible.
CUDA based parallel implementation of simultaneous algebraic reconstruction technique
SHI Huai-lin SUN Feng-rong JIANG Wei LIU Wei QIN Tong LI Xin-cai
2011, 31(05): 1245-1248. DOI: 10.3724/SP.J.1087.2011.01245
Simultaneous Algebraic Reconstruction Technique (SART) is able to generate Computed Tomography (CT) images of higher quality than the Filtered Back-Projection (FBP) method when the projection data is incomplete or noisy. However, it is very time-consuming, and parallel computation is an efficient approach to this problem. In this study, a new parallel implementation of SART based on the Compute Unified Device Architecture (CUDA) platform was proposed. The experimental results show no differences between the images reconstructed by the new method and those by the serial implementation, but the reconstruction time is greatly decreased, making the method more suitable for clinical application.
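The paper's contribution is the CUDA parallelization; the underlying serial SART update it parallelizes can be sketched in a few lines (pure Python, dense matrix, our own function name):

```python
def sart_step(A, x, b, lam=1.0):
    """One SART update for Ax ≈ b:
    x_j += lam * sum_i( a_ij * (b_i - (Ax)_i) / row_sum_i ) / col_sum_j."""
    m, n = len(A), len(x)
    row_sum = [sum(row) for row in A]
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    residual = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
    new_x = list(x)
    for j in range(n):
        if col_sum[j] == 0:
            continue
        corr = sum(A[i][j] * residual[i] / row_sum[i]
                   for i in range(m) if row_sum[i])
        new_x[j] += lam * corr / col_sum[j]
    return new_x
```

On a GPU, the per-ray residuals and per-voxel corrections are the two naturally parallel loops.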
Fast and stable local binary fitting approach for image segmentation
LIN Ya-zhong GU Jin-ku HAO Gang CAI Qian
2011, 31(05): 1249-1251. DOI: 10.3724/SP.J.1087.2011.01249
The Local Binary Fitting (LBF) model based on local region information has certain advantages in the segmentation of images with weak boundaries or uneven gray levels. However, the segmentation results are very sensitive to the initial contour, and an improper initial contour can directly lead to segmentation failure. Thus, a fast and stable LBF approach was proposed. First, an area term with variable weights was added to the traditional LBF model so that a better initial contour than a manual one could be obtained. Second, the traditional LBF model was used for further segmentation. The experimental results show that this method not only obtains promising segmentations with flexible initial contour selection but is also faster than the traditional LBF model.
Graph and image-based plant modeling and simulation of swaying in wind
CAO Yang
2011, 31(05): 1252-1254. DOI: 10.3724/SP.J.1087.2011.01252
According to the different features of plant trunks and leaves, a fractal method was adopted to draw the branches during the simulation process. By adjusting parameters such as branching growth direction, length and radius through control functions, all kinds of branch structures were created, overcoming the overly regular results of the original fractal method. An image-based method was adopted to draw the leaves: alpha testing was utilized to filter the background of the leaf picture while retaining the complex edge and color information, and various lifelike leaves were generated through rotation and scaling. This image-based method is simpler and faster than the graph-based method. Besides, the swaying of the plant in the wind was approximately simulated, from a morphological point of view, through the different deformations of the branches under the influence of wind.
New method for point cloud data reduction
ZHANG You-liang LIU Jian-yong FU Cheng-qun GUO Jie
2011, 31(05): 1255-1257. DOI: 10.3724/SP.J.1087.2011.01255
The reduction and storage of enormous point cloud data is a crucial link in reverse model reconstruction. Considering the features of point cloud data from single-station laser scanning, a new method, the grid sector method, was put forward for its reduction and storage. Point cloud data could be filtered and stored in a single traversal. The method was implemented in VC++ 6.0. Registration and stitching of multi-station point clouds also become quicker and more efficient after each station's data are processed into sectors within a single grid. Based on a comparison with traditional compression methods, this paper analyzed the method's characteristics and proved its applicability to battlefield terrain digitization.
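The single-traversal filtering idea can be illustrated with a simplified voxel-grid thinning sketch; this is a generic stand-in, not the paper's grid sector method (cell size and keep-first policy are our assumptions):

```python
def grid_reduce(points, cell=1.0):
    """Keep one representative point per grid cell (the first one seen),
    thinning a dense single-station scan in one pass over the data."""
    kept, seen = [], set()
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        if key not in seen:
            seen.add(key)
            kept.append((x, y, z))
    return kept
```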
Image acquisition and VGA display system based on FPGA
ZHU Yi-dan FANG Yi-bing
2011, 31(05): 1258-1261. DOI: 10.3724/SP.J.1087.2011.01258
Concerning the drawbacks of traditional PCI frame grabbers, an image acquisition and VGA display system based on a Field-Programmable Gate Array (FPGA) was designed using Altera's DE2 development platform. The system used an FPGA with a built-in Nios II soft core as the controller, with an image sensor, digital memory, video D/A converter and VGA display interface as peripherals. System-On-a-Programmable-Chip (SOPC) technology was used to control and program the FPGA and its peripherals, and eventually to acquire, process and display real-time images. The design results prove that an electronic system based on SOPC technology is flexible and efficient to design, highly portable, easy to extend to high-speed data acquisition, and highly compatible.
Information security
Effect of computer updating on spreading of worms
SONG Li-peng HAN Xie LIU Dong-ming ZHANG Jian-hua
2011, 31(05): 1262-1264. DOI: 10.3724/SP.J.1087.2011.01262
The updating of computers has a great impact on the dynamics of worms, and to contain worm propagation it is necessary to characterize this factor. A model taking account of the influence of computer updating was proposed in this paper. Furthermore, the model's equilibria and their stability conditions were obtained mathematically and then verified by simulations. The analytical and simulation results show that the updating of computers can lead to the persistence of worms that would otherwise die out. The simulation results also show that the updating rate has two-sided effects on the spreading of worms. Under the guidance of the basic reproduction number, the negative effect can be alleviated and worms can be terminated by introducing an anti-virus system with a high initial installation rate.
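Epidemic worm models of this kind are usually systems of ODEs; the toy Euler simulation below is only illustrative of the approach (the equations and parameters are our own, not the paper's model), with the basic reproduction number here being R0 = beta*s0/(gamma + update_rate):

```python
def simulate_worms(beta, gamma, update_rate, s0, i0, steps=10000, dt=0.01):
    """Euler integration of an illustrative worm model: infection at rate
    beta*s*i, removal at rate gamma, and updated machines re-entering the
    susceptible pool at rate update_rate."""
    s, i = s0, i0
    for _ in range(steps):
        ds = update_rate * i - beta * s * i
        di = beta * s * i - gamma * i - update_rate * i
        s += ds * dt
        i += di * dt
    return s, i

# With beta*s0 < gamma + update_rate (R0 < 1), the worm dies out.
```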
Security analysis and improvement of IEEE 802.1X
ZHOU Chao ZHOU Cheng GUO Liang
2011, 31(05): 1265-1270. DOI: 10.3724/SP.J.1087.2011.01265
Many studies have shown that there are design flaws in the IEEE 802.1X standard. In order to eliminate Denial of Service (DoS) attacks, replay attacks, session hijacking, Man-In-the-Middle (MIM) attacks and other security threats, the protocol was analyzed from the viewpoint of its state machines. It is pointed out that the origin of these problems is the inequality and incompleteness of the state machines as well as the lack of integrity protection and source authenticity of messages. An improvement called Dual-way Challenge Handshake and Logoff Authentication was then proposed, and a formal analysis was performed on it with an improved BAN logic. It is proved that the proposal can effectively resist the security threats mentioned above.
TCP finite state machine and protocol parse applied in removal of false alerts
SHUAI Chun-yan JIANG Jian-hui OU-YANG Xin
2011, 31(05): 1271-1275. DOI: 10.3724/SP.J.1087.2011.01271
Concerning the enormous number of alerts produced by Intrusion Detection Systems (IDS), a method based on protocol parsing and a Transfer Control Protocol (TCP) Finite State Machine (FSM) model was proposed to remove false alerts. For alerts produced by connectionless request/response protocols, the method made its judgement by analyzing the attack features of the request packets and the return status codes of the response packets. For alerts produced by TCP, the packets were parsed and a TCP FSM model was built to judge whether the packet series came from the same TCP connection and whether the connection included attack sequences, so as to remove false alerts. Experiments on the DARPA 2000 datasets show that the proposed method can reduce false alerts by more than 59.47% on average, and the alert recognition rate for TCP and request/response protocols reaches 76.67%. The method is simple and efficient, depends only on the attack feature database of the IDS, and can be implemented online as a plug-in.
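The TCP FSM idea can be sketched as a transition table over observed segments; the states and events below are a minimal passive-open subset of the real TCP state machine, and the alert-filtering interpretation is our gloss on the abstract:

```python
# Minimal passive-open TCP state machine (subset of the full RFC 793 FSM).
TRANSITIONS = {
    ("LISTEN", "SYN"): "SYN_RCVD",
    ("SYN_RCVD", "ACK"): "ESTABLISHED",
    ("ESTABLISHED", "FIN"): "CLOSE_WAIT",
    ("CLOSE_WAIT", "CLOSE"): "LAST_ACK",
    ("LAST_ACK", "ACK"): "CLOSED",
}

def track_connection(events, state="LISTEN"):
    """Feed observed segments to the FSM; a sequence that falls outside the
    transition table never formed one well-formed connection, so an alert
    raised on it can be treated as a candidate false positive."""
    for ev in events:
        key = (state, ev)
        if key not in TRANSITIONS:
            return None  # invalid sequence for a single connection
        state = TRANSITIONS[key]
    return state
```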
Construction and implementation of multistep attacks alert correlation model
ZHAI Guang-qun ZHOU Shuang-yin
2011, 31(05): 1276-1279. DOI: 10.3724/SP.J.1087.2011.01276
To reduce the number of alerts in Intrusion Detection Systems (IDS) and uncover attack purposes and motivations, a new alert correlation model was proposed, in which alerts with similarity relationships were correlated by event correlation and stored as meta-alerts, then transformed into hyper-alerts according to knowledge base rules; finally, hyper-alerts with causal relationships were correlated by attack correlation and an attack correlation graph was formed. The experimental results show that the model raises alert processing efficiency and contributes to attack purpose identification and alert accuracy improvement.
Double-layered embedding based wet-paper-code adaptive steganography
XI Ling PING Xi-jian ZHANG Tao
2011, 31(05): 1280-1283. DOI: 10.3724/SP.J.1087.2011.01280
In order to enhance the statistical security of a data hiding system, the factors that influence the security of a steganographic scheme were analyzed. Three ways to reduce the statistical distortion of the stego-image were found: increasing the embedding efficiency, controlling the modification amplitude, and choosing the embedding positions adaptively. Accordingly, a new adaptive steganography based on a double-layered embedding method was proposed. The new scheme chose pixels against a strong noise background as the message carrier and embedded secrets in their least significant and second least significant bit-planes. The experimental results on uncompressed images show that the proposed steganography outperforms the prior algorithm in resisting blind steganalysis.
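The two embedding layers correspond to the two lowest bit-planes of a pixel; a bare-bones sketch of writing and reading one bit in each plane (plain overwriting only — the paper's scheme additionally uses wet paper codes and adaptive position selection, which are not shown):

```python
def embed_two_bits(pixel, bit_lsb, bit_2lsb):
    """Write one message bit into the LSB plane and one into the second
    LSB plane of an 8-bit pixel value."""
    pixel = (pixel & ~0b01) | bit_lsb
    pixel = (pixel & ~0b10) | (bit_2lsb << 1)
    return pixel

def extract_two_bits(pixel):
    """Recover the two embedded bits from the two lowest planes."""
    return pixel & 1, (pixel >> 1) & 1
```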
Digital speech tamper detection based on speaking conditions
DING Qi PING Xi-jian
2011, 31(05): 1284-1287. DOI: 10.3724/SP.J.1087.2011.01284
Abstract | PDF (669KB) | Related Articles | Metrics
An automatic detection method for digital speech tampering by splicing was proposed. The method was based on speaking condition analysis, comprising background noise analysis and speaker state analysis. Speech signals were divided into speech and silence segments. For silence segments containing noise, time-domain and frequency-domain features were extracted for each frame; for each speech segment, rhythm and timbre features were extracted. Change points of the features of silence segments and speech segments were detected separately based on the Bayesian information criterion, and the tamper detection result was obtained by an integrated decision. The experimental results show that the proposed method can detect and locate the splicing points accurately.
Secure digital watermarking protocol based on buyer-seller
WANG Fei CHEN Hong XIAO Zhen-jiu
2011, 31(05): 1288-1290. DOI: 10.3724/SP.J.1087.2011.01288
Abstract | PDF (640KB) | Related Articles | Metrics
Concerning the overburdened third party in existing digital copyright protection protocols, this paper put forward a simple, efficient and secure copyright protection protocol and model to protect both the buyer's and the seller's interests. Through two mechanisms, a content server generating a digital watermark pool and a mobile Agent dynamically distributing licenses, the protocol solved security problems such as collusion, man-in-the-middle attacks, hard disk cloning attacks and user betrayal. It ensured data security and integrity in exchanges between protocol entities through encryption, authentication, digital signatures and a one-way permutation function. In addition, it adopted an arbitration mechanism that does not require the buyer's participation, which made the protocol more complete and feasible.
Calculation approach of privilege deduction in authorization management
WANG Ting CHEN Xing-yuan REN Zhi-yu
2011, 31(05): 1291-1294. DOI: 10.3724/SP.J.1087.2011.01291
Abstract | PDF (665KB) | Related Articles | Metrics
Privilege deduction relations make authorization management easier, but they also make the calculation of valid privileges more difficult. It is therefore important for authorization and access control to calculate deduction relations between privileges correctly and efficiently. Based on the resource hierarchy and the operation hierarchy, a rule of privilege deduction was given. Since privilege queries happen more frequently than privilege updates, a new method of calculating deduction relations based on the reachability matrix of privilege deduction was proposed. The experimental results show that the new method is more efficient than calculating deduction relations directly.
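The reachability-matrix idea can be sketched with Warshall's transitive-closure algorithm: the matrix is computed once at update time, after which each privilege query is a single lookup. The tiny three-privilege hierarchy below is an illustrative assumption, not the paper's data.

```python
# Minimal sketch: precompute the reachability (transitive closure) matrix of
# a direct privilege-deduction relation with Warshall's algorithm, so that a
# privilege query becomes a single matrix lookup.

def reachability(direct):
    """direct[i][j] == 1 iff privilege i directly implies privilege j."""
    n = len(direct)
    r = [row[:] for row in direct]
    for i in range(n):
        r[i][i] = 1                       # every privilege implies itself
    for k in range(n):                    # Warshall's algorithm, O(n^3) once
        for i in range(n):
            if r[i][k]:
                for j in range(n):
                    if r[k][j]:
                        r[i][j] = 1
    return r

# Illustrative hierarchy: write(0) implies append(1); append implies read(2).
direct = [[0, 1, 0],
          [0, 0, 1],
          [0, 0, 0]]
R = reachability(direct)   # R[0][2] == 1: write transitively implies read
```

The trade-off matches the abstract's observation: the O(n^3) closure is paid only on the rare updates, while the frequent queries cost O(1).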
AES and its software implementation based on ARM920T
BAI Ru-xue LIU Hong-yan ZHANG Xin-he
2011, 31(05): 1295-1297. DOI: 10.3724/SP.J.1087.2011.01295
Abstract | PDF (553KB) | Related Articles | Metrics
To improve the efficiency of the Advanced Encryption Standard (AES) algorithm on ARM processors, an optimized AES implementation was introduced and realized on the ARM920T processor. One-time key expansion was adopted, and SubBytes() and MixColumns() were combined into precomputed T-tables, which increased the speed. The algorithm, programmed in C, was simulated and debugged on the ARM Developer Suite v1.2 platform. Different implementations were compared in terms of storage space and processing speed, and performance under different key lengths was reported. The experimental results show that the execution speed of the presented algorithm improves significantly.
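The T-table optimization mentioned above — merging SubBytes() and MixColumns() into one 256-entry lookup — can be illustrated by deriving the tables from the public AES specification. This sketch only shows how the tables are built; the paper's implementation is C on ARM, and nothing below is taken from it.

```python
# Sketch of T-table construction per FIPS-197 (illustrative; not the
# paper's ARM code): generate the S-box from the GF(2^8) inverse plus the
# affine transform, then pack the MixColumns multiples into table T0.

def gmul(a, b):
    """Multiply a and b in GF(2^8) with the AES modulus 0x11B."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return r

def ginv(a):
    """Multiplicative inverse via a^254 (0 maps to 0, as in AES)."""
    r = 1
    for _ in range(254):
        r = gmul(r, a)
    return r if a else 0

def affine(b):
    """AES affine transformation of one byte."""
    res = 0
    for i in range(8):
        bit = ((b >> i) ^ (b >> (i + 4) % 8) ^ (b >> (i + 5) % 8)
               ^ (b >> (i + 6) % 8) ^ (b >> (i + 7) % 8) ^ (0x63 >> i)) & 1
        res |= bit << i
    return res

SBOX = [affine(ginv(x)) for x in range(256)]
# T0[x] packs the MixColumns multiples {02,01,01,03} of SBOX[x] into 32 bits,
# so SubBytes + MixColumns for one byte becomes a single table lookup.
T0 = [(gmul(s, 2) << 24) | (s << 16) | (s << 8) | gmul(s, 3) for s in SBOX]
```

A round then reduces to four table lookups and XORs per output column, which is the speed-up the abstract refers to; the cost is the extra table storage that the paper's space/speed comparison measures.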
Constrained role-permission based delegation in pervasive computing
GAO Da-li SUN Ling XIN Yan
2011, 31(05): 1298-1301. DOI: 10.3724/SP.J.1087.2011.01298
Abstract | PDF (669KB) | Related Articles | Metrics
Considering permission delegation in inter-domain access control for pervasive computing environments, a role-permission based delegation method was given on the basis of the Role-Based Access Control (RBAC) model. The trust and time constraints were determined by the importance of the permission. The consistency of the executing model with the delegation conditions was proved. It is shown that the method can satisfy the requirements of permission delegation in pervasive computing environments, and realize temporal constraints and the dependence on executable role sets.
New scheme of ID-based authenticated multi-party key agreement
LIU Xue-yan ZHANG Qiang WANG Cai-fen
2011, 31(05): 1302-1304. DOI: 10.3724/SP.J.1087.2011.01302
Abstract | PDF (433KB) | Related Articles | Metrics
An authenticated key agreement protocol allows a group of users in an open network environment to identify each other and share a secure session key. This article proposed a new ID-based authenticated multi-party key agreement scheme based on the McCullagh-Barreto scheme. A key seed was introduced to update temporary public/private key pairs. The new scheme realizes authentication, improves security, and successfully resists the Reveal query attack and the key compromise impersonation attack; it also has properties such as no key control and equal contribution.
Lightweight security scheme of identity-based cryptography for SIP
MOU Ming-lang, WANG Wei
2011, 31(05): 1305-1307. DOI: 10.3724/SP.J.1087.2011.01305
Abstract | PDF (661KB) | Related Articles | Metrics
For the Session Initiation Protocol (SIP) based security mechanism of the IP Multimedia Subsystem (IMS), the threats and attacks on SIP were analyzed. A lightweight security scheme for SIP based on Identity-Based Cryptography (IBC) was proposed, combining Identity-Based Authenticated Key Agreement (IBAKA) with integrity protection of critical header fields. The security of the proposed scheme was analyzed, and its security and attack resistance were compared with other typical schemes. The results indicate that the new protocol greatly improves security and is exposed to fewer threats.
Artificial intelligence
Uncertain multiattribute decision making method based on entropy and evidential reasoning approach
YIN De-jin, WANG Hong-li
2011, 31(05): 1308-1310. DOI: 10.3724/SP.J.1087.2011.01308
Abstract | PDF (574KB) | Related Articles | Metrics
It is difficult to make decisions when the information is diverse and the attribute weights are unknown. A weight calculating method based on uncertain entropy was proposed to solve this problem. In this method, both the complete and incomplete assessments of every attribute were transformed under a unified belief framework, and the objective weight of every attribute was calculated from its entropy. Combined with the Evidential Reasoning (ER) approach, the new weight calculating method realizes Multiple Attribute Decision Making (MADM) when the attribute weights are unknown. A case study illustrates the validity and feasibility of this method.
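The entropy-to-weight step can be sketched with the classical entropy-weight method, used here as an illustrative stand-in for the paper's uncertain-entropy variant (the normalization and the toy decision matrix are assumptions):

```python
# Minimal sketch of the classical entropy-weight method: attributes whose
# assessments vary more across alternatives carry more information and
# therefore receive larger objective weights.
import math

def entropy_weights(matrix):
    """matrix[i][j]: positive assessment of alternative i on attribute j."""
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        s = sum(col)
        probs = [v / s for v in col]
        # normalized Shannon entropy of the attribute's value distribution
        e = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        weights.append(1 - e)             # degree of divergence
    total = sum(weights)
    return [w / total for w in weights]
```

An attribute on which every alternative scores identically gets entropy 1 and hence weight 0 — it cannot discriminate between alternatives.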
Time-delay chaotic neural network for information processing
WANG Tao WANG Ke-jun JIA Nuo
2011, 31(05): 1311-1313. DOI: 10.3724/SP.J.1087.2011.01311
Abstract | PDF (609KB) | Related Articles | Metrics
In order to improve the information processing capacity of chaotic neural networks, the associative memory performance of a time-delay symmetric globally coupled neural network was investigated, using a parameter-modulated control method to control the coherence parameter. It is observed that the output can be stabilized when only some of the neurons enter periodic orbits, and that the output sequence of the controlled network contains no patterns other than the stored pattern corresponding to the initial input and its reverse. The experimental results suggest that the network has good fault tolerance and a high correct-recall rate, so it is fit for information processing and pattern recognition.
Mutants reduction based on genetic algorithm for clustering
ZENG Fan-ping HUANG Yu-han ZHANG Mei-chao PAN Neng-gang
2011, 31(05): 1314-1317. DOI: 10.3724/SP.J.1087.2011.01314
Abstract | PDF (613KB) | Related Articles | Metrics
The large number of mutants produced during testing is one of the main causes of the high cost of mutation testing. To address this, a mutant reduction method based on genetic-algorithm clustering was proposed: mutants with similar characteristics are placed in the same cluster, and one mutant is then randomly selected from each cluster as its representative. The experimental results show that: 1) the proposed method can reduce mutants without compromising the adequacy of the constituted test suite; 2) compared with the K-means algorithm and the agglomerative clustering algorithm, it can automatically form an appropriate number of clusters and is more effective.
Multi-side covering algorithm based on feature selection
WU Tao ZHANG Fang-fang
2011, 31(05): 1318-1320. DOI: 10.3724/SP.J.1087.2011.01318
Abstract | PDF (495KB) | Related Articles | Metrics
The multi-side covering algorithm applies the idea of divide-and-conquer to massive high-dimensional data. According to the sum of the absolute values of the component deviations, subsets of attributes are selected to construct covering domains for different parts of the training samples, thus reducing the complexity of learning. However, the initial attribute set has to be chosen by experience or by experiments. In order to reduce the subjectivity of this choice and the complexity of adjusting the attribute set, the Relief feature selection approach was used to determine an optimal feature subset appropriate for different data sets and to build a hierarchical covering network, and experiments were conducted on real data sets. The experimental results show that the algorithm achieves higher precision and efficiency, and can effectively solve complex classification problems.
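The Relief step used to pick the attribute subset can be sketched in its basic binary-class form (the distance measure, iteration count, and sample data below are illustrative assumptions, not the paper's configuration):

```python
# Minimal sketch of Relief feature weighting: features that separate the
# classes agree with the nearest same-class neighbour ("hit") and differ
# from the nearest other-class neighbour ("miss"), so they accumulate
# positive weight.
import random

def relief(samples, labels, n_iter=100, seed=0):
    """samples: list of feature vectors; labels: 0/1 class labels."""
    rnd = random.Random(seed)
    n_feat = len(samples[0])
    w = [0.0] * n_feat

    def nearest(idx, same_class):
        best, best_d = None, float("inf")
        for j, s in enumerate(samples):
            if j == idx or (labels[j] == labels[idx]) != same_class:
                continue
            d = sum((a - b) ** 2 for a, b in zip(samples[idx], s))
            if d < best_d:
                best, best_d = s, d
        return best

    for _ in range(n_iter):
        i = rnd.randrange(len(samples))
        hit, miss = nearest(i, True), nearest(i, False)
        for f in range(n_feat):
            # relevant features differ on misses and agree on hits
            w[f] += abs(samples[i][f] - miss[f]) - abs(samples[i][f] - hit[f])
    return w
```

Features whose final weights fall below a chosen threshold would be dropped from the initial attribute set.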
Artificial fish swarm algorithm based on simplex method
ZHANG Hong-xia LUO Yi SHI Rui-feng
2011, 31(05): 1321-1323. DOI: 10.3724/SP.J.1087.2011.01321
Abstract | PDF (536KB) | Related Articles | Metrics
After analyzing the weak local search ability of the Artificial Fish Swarm Algorithm (AFSA), an improved AFSA based on the simplex method was proposed. In the later evolution period, every few generations the improved algorithm replaced the original fishes gathered around a local optimum with simplex operators that were distributed evenly and widely. By adding simplex operators to AFSA, the fineness of the local search was greatly improved. Finally, on three typical test functions, the improved algorithm is shown to be more effective in solving the problem of low optimization accuracy.
Core master group PSO based on improvement formula
SUI Cong-hui TANG Hui-jia
2011, 31(05): 1324-1327. DOI: 10.3724/SP.J.1087.2011.01324
Abstract | PDF (630KB) | Related Articles | Metrics
The evolution formula of standard particle swarm optimization considers only the colony's best fitness value and each individual's best fitness value, which leads to low convergence accuracy because diversity is lacking in the later evolution period. In order to improve the accuracy of the algorithm, this paper proposed a core master group particle swarm and combined it with an improved evolution formula. The improved algorithm is shown through experiments to be more accurate.
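The standard evolution formula the abstract refers to — velocity updated from the global best and the personal best — can be sketched on a toy objective. All parameter values and the sphere test function are illustrative assumptions; the paper's core-master-group modification is not reproduced here.

```python
# Minimal sketch of standard PSO on a sphere function, showing the two
# attraction terms (personal best and global best) that the paper's
# core-master-group variant extends.
import random

def pso(dim=2, n_particles=20, iters=200, seed=1):
    rnd = random.Random(seed)
    f = lambda x: sum(v * v for v in x)          # sphere function to minimize
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and learning factors
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest, f(gbest)
```

Because every particle is pulled toward the same `gbest`, diversity collapses late in the run — exactly the weakness the proposed core master group is meant to counter.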
Tree-ART2 model for clustering spatial data in two-dimensional space
YU Li LI Jia-tian LI Jia DUAN Ping WANG Hua
2011, 31(05): 1328-1330. DOI: 10.3724/SP.J.1087.2011.01328
Abstract | PDF (470KB) | Related Articles | Metrics
Adaptive Resonance Theory 2 (ART2) is a well-known clustering algorithm that has been applied in many fields. However, as a clustering algorithm for two-dimensional spatial data, it not only suffers from pattern drift and the information loss of the vector model, but also has difficulty adapting to spatial data with irregular distributions. A Tree-ART2 (TART2) network model was proposed. It retains the memory of old patterns and maintains the spatial distance constraint by learning and adjusting the Long Term Memory (LTM) pattern and the amplitude information of vectors. Meanwhile, introducing a tree structure into the model reduces the subjectivity of the vigilance parameter and decreases the occurrence of pattern mixing. The comparative experimental results show that the TART2 network is suitable for clustering ribbon-distributed spatial data, with higher plasticity and adaptability.
Selective SVM ensemble based on double disturbance
CHEN Tao
2011, 31(05): 1331-1334. DOI: 10.3724/SP.J.1087.2011.01331
Abstract | PDF (580KB) | Related Articles | Metrics
This paper proposed a selective Support Vector Machine (SVM) ensemble algorithm based on double disturbance to improve the generalization ability of SVM. First, the training samples were disturbed by a conventional boosting algorithm, and a dynamic reduction algorithm, integrating relative reduction based on the relative core of rough sets with a resampling method, was used to produce individual SVMs. A fitness function for the genetic factors was established based on negative correlation learning, and the SVMs with weights larger than a given threshold were selected by an accelerating genetic algorithm and integrated using a weighted average. The experiments show that the algorithm has higher generalization performance and lower time and space complexity; it is a highly effective ensemble algorithm.
Improved CYK algorithm based on shallow parsing
LI Yong-liang HUANG Shu-guang LI Yong-cheng BAO Lei
2011, 31(05): 1335-1338. DOI: 10.3724/SP.J.1087.2011.01335
Abstract | PDF (743KB) | Related Articles | Metrics
Unlike English, modern Chinese syntax is obviously complex: a complete set of rules is hard to obtain, and the analysis results of a sentence contain many ambiguous structures that are difficult to eliminate. A decomposition policy that divides the syntactic analysis task into small tasks at different levels is more feasible than complete syntactic analysis. The basic idea is as follows: first, a multi-layer Markov model was used to chunk a sentence, cutting the complete sentence into phrases such as noun phrases and verb phrases; then, on the basis of the chunks, the CYK algorithm was run to analyze the dependencies among chunks, ultimately realizing the syntactic analysis of the complete sentence. Shallow parsing simplifies the rule set of the CYK algorithm and reduces the cost of syntactic parsing to some extent.
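The CYK step can be sketched for a grammar in Chomsky normal form. The toy English grammar below is an illustrative assumption — in the paper the terminals would be the chunks produced by the Markov-model shallow parser, not words.

```python
# Minimal sketch of the CYK algorithm for a CNF grammar.
from itertools import product

def cyk(words, lexical, binary, start="S"):
    """lexical: {terminal: {nonterminals}}; binary: {(B, C): {A}} for A->BC."""
    n = len(words)
    # table[i][j]: nonterminals deriving words[i..j] inclusive
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(lexical.get(w, ()))
    for span in range(2, n + 1):              # bottom-up over span lengths
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):             # split point between i..k, k+1..j
                for b, c in product(table[i][k], table[k + 1][j]):
                    table[i][j] |= binary.get((b, c), set())
    return start in table[0][n - 1]

# Illustrative toy grammar (assumption, not the paper's rule set):
lexical = {"she": {"NP"}, "eats": {"V"}, "fish": {"NP"}}
binary = {("V", "NP"): {"VP"}, ("NP", "VP"): {"S"}}
```

Running CYK over chunks rather than words is what shrinks both `n` and the rule set, which is the cost reduction claimed above.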
Database technology
Algorithm for mining maximum frequent itemsets based on decreasing dimension of frequent itemset in association rules
QIAN Xue-zhong, HUI Liang
2011, 31(05): 1339-1343. DOI: 10.3724/SP.J.1087.2011.01339
Abstract | PDF (820KB) | Related Articles | Metrics
Existing FP-tree based algorithms for mining maximal frequent patterns have high performance but also many drawbacks; for example, they must recursively generate conditional FP-trees and many candidate maximal frequent itemsets. To overcome these drawbacks, after analyzing the FPMax and DMFIA algorithms, an algorithm named BDRFI (Based on Dimensionality Reduction of Frequent Itemset) for mining maximal frequent patterns was put forward. To enhance the efficiency of superset checking, the algorithm used a Digital Frequent Pattern tree (DFP-tree) instead of an FP-tree, and reduced the amount of mining through prediction and pruning before mining. During mining, a strategy of decreasing the dimension of frequent itemsets was used to generate candidate frequent itemsets, which not only reduced the number of candidates but also avoided creating conditional FP-trees separately and recursively. The experimental results show that the efficiency of BDRFI is 2-8 times that of similar algorithms.
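The problem being solved can be made concrete with a brute-force baseline: enumerate frequent itemsets, then keep only those with no frequent proper superset. This is an illustrative reference point, not the BDRFI algorithm (whose value is precisely avoiding this enumeration); the transactions are made-up data.

```python
# Minimal sketch of maximal-frequent-itemset mining by plain enumeration
# plus superset checking (an exponential baseline, for illustration only).
from itertools import combinations

def maximal_frequent_itemsets(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    frequent = []
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            support = sum(1 for t in transactions if set(cand) <= t)
            if support >= min_support:
                frequent.append(frozenset(cand))
    # maximal = frequent itemsets with no frequent proper superset
    return [f for f in frequent if not any(f < g for g in frequent)]

txns = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
maximal = maximal_frequent_itemsets(txns, min_support=2)
```

Here {a}, {b}, {c} are frequent but not maximal, because each pair {a,b}, {a,c}, {b,c} is also frequent while {a,b,c} is not.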
Improved measure of similarity between intuitionistic fuzzy rough sets
FAN Cheng-li, LEI Ying-jie, ZHANG Ge
2011, 31(05): 1344-1347. DOI: 10.3724/SP.J.1087.2011.01344
Abstract | PDF (554KB) | Related Articles | Metrics
An improved similarity measure based on the Hamming distance for measuring the degree of similarity between intuitionistic fuzzy rough sets was proposed, after analyzing the deficiencies of existing similarity measures. The method solves the problem of inaccurate similarity measurement by incorporating the hesitancy degree and weights. Firstly, a similarity measure between two intuitionistic fuzzy rough elements was given, and several important properties of it were proved. Furthermore, a Hamming distance based similarity measure between intuitionistic fuzzy rough sets was presented and shown to have the same properties. Finally, examples confirm that the improved similarity measure is more reasonable and effective.
Method of SVM classifier generation based on fuzzy classification association rule
CUI Jian LI Qiang LIU Yong
2011, 31(05): 1348-1350. DOI: 10.3724/SP.J.1087.2011.01348
Abstract | PDF (650KB) | Related Articles | Metrics
To increase the classification accuracy of database classification systems, this paper proposed a new classification method. Firstly, continuous attributes were discretized by the Fuzzy C-Means (FCM) algorithm. Secondly, an improved fuzzy association method was proposed to mine classification association rules. Finally, the compatibility between the generated rules and the patterns was used to construct a set of feature vectors, from which a classifier was generated. The experimental results demonstrate that the method has high discrimination and efficiency.
Deep Web query interface identification approach based on label coding
WANG Yan SONG Bao-yan ZHANG Jia-yang ZHANG Hong-mei LI Xiao-guang
2011, 31(05): 1351-1354. DOI: 10.3724/SP.J.1087.2011.01351
Abstract | PDF (598KB) | Related Articles | Metrics
Concerning the complexity of calculation and maintenance and the ambiguity of matching in current query interface identification approaches, a Deep Web query interface identification approach based on label coding was proposed after a thorough study of existing approaches. This approach codes and groups labels by the directivity and the irregularity of arrangement of the query interface. Identification methods for simple attributes and composite attributes and a processing method for isolated texts were proposed, taking each label group as an independent unit to identify feature information. The texts matching an element are determined by constraints on the label subscripts, which greatly reduces the number of texts considered when matching an element and avoids the matching ambiguity caused by massive heuristic rules; the presentation of nested information is handled effectively and efficiently by clustering twice.
Improved FCM algorithm based on PSO and SFLA
LI Zhen LUO Ke
2011, 31(05): 1355-1358.
Abstract | PDF (506KB) | Related Articles | Metrics
The traditional fuzzy clustering algorithm is sensitive to initial points and easily falls into local optima. To overcome these flaws, an improved Fuzzy C-Means (FCM) algorithm combining the Particle Swarm Optimization (PSO) algorithm and the Shuffled Frog Leaping Algorithm (SFLA) was proposed. By designing a new search granularity factor, it takes advantage of the fast convergence speed and strong local search ability of PSO, and the strong global search capability and ability to jump out of local optima of SFLA, making the integration of the two more effective; the update rule of SFLA was also improved. The experimental results show that this method improves the search capability and clustering performance of fuzzy clustering, with advantages in global search ability, escaping from local optima, and convergence speed.
Modified K-means clustering algorithm based on good point set and Leader method
ZHANG Yan-ping ZHANG Juan HE Cheng-gang CHU Wei-cui ZHANG Li-na
2011, 31(05): 1359-1362. DOI: 10.3724/SP.J.1087.2011.01359
Abstract | PDF (743KB) | Related Articles | Metrics
The traditional K-means algorithm is sensitive to the initial centers. To solve this problem, a method was proposed to optimize the initial centers by combining the theory of good point sets with the Leader method; according to the order of combination, the new algorithms are called KLG and KGL respectively. Better points can be obtained by the theory of good point sets than by random selection, while the Leader method reflects the distribution characteristics of the data objects. The experimental results on UCI data sets show that the KLG and KGL algorithms significantly outperform the traditional K-means algorithm and other initialization variants.
Density-based data stream clustering algorithm over time-based sliding windows
LI Na XING Chang-zheng
2011, 31(05): 1363-1366. DOI: 10.3724/SP.J.1087.2011.01363
Abstract | PDF (555KB) | Related Articles | Metrics
A data stream clustering algorithm was improved in terms of cluster quality and efficiency. Firstly, a time-based sliding window was applied; secondly, an improved micro-cluster structure was created to save the summary information; finally, a new strategy was designed to regularly delete expired micro-clusters and outlier micro-clusters. Compared with traditional clustering algorithms based on the landmark model, the proposed method achieves better efficiency, lower memory overhead and faster data processing.
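The time-based sliding window with expiry-driven deletion can be sketched as follows; the `(timestamp, summary)` pairs stand in for the paper's micro-cluster summaries, and the window span is an illustrative assumption.

```python
# Minimal sketch of a time-based sliding window over a stream: summaries
# are appended as they arrive, and summaries older than the window span
# are evicted, bounding memory by the window length rather than the
# whole stream (as a landmark model would require).
from collections import deque

class TimeSlidingWindow:
    def __init__(self, span):
        self.span = span            # window length in time units
        self.items = deque()        # (timestamp, summary), oldest first

    def add(self, timestamp, summary):
        self.items.append((timestamp, summary))
        self._evict(timestamp)

    def _evict(self, now):
        while self.items and self.items[0][0] <= now - self.span:
            self.items.popleft()    # expired: outside the sliding window

    def contents(self):
        return [s for _, s in self.items]
```

Because arrivals are time-ordered, all expired entries sit at the left end of the deque, so eviction is amortized O(1) per arrival.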
Path-based OWL storage model
Lü Gang ZHENG Cheng HU Chun-ling
2011, 31(05): 1367-1369. DOI: 10.3724/SP.J.1087.2011.01367
Abstract | PDF (441KB) | Related Articles | Metrics
To improve the efficiency of information retrieval, a Path-based OWL Storage (POS) model was proposed, and the structure of the POS system for translating and storing OWL data was illustrated. The POS system analyzes the data schema of the input OWL and a data graph with hierarchical structural information between classes or properties; the paths from the root class or property to all classes or properties are extracted via Depth-First Search (DFS), and the extracted hierarchical structural information is stored in a path attribute of the relational database tables. Compared with the traditional method, the processing time for ontology query and update shows a noticeable improvement in the experiments.
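The DFS path-extraction step can be sketched on a toy class hierarchy; the '/'-separated path encoding and the hierarchy itself are illustrative assumptions, not the paper's schema.

```python
# Minimal sketch of path extraction: depth-first search from the root of a
# class hierarchy, recording the root-to-class path for each class so it
# can be stored as a single path attribute in a relational table.

def extract_paths(hierarchy, root):
    """hierarchy: {class: [subclasses]}; returns {class: 'root/.../class'}."""
    paths = {}

    def dfs(node, prefix):
        path = node if not prefix else prefix + "/" + node
        paths[node] = path
        for child in hierarchy.get(node, []):
            dfs(child, path)

    dfs(root, "")
    return paths

hierarchy = {"Thing": ["Animal", "Plant"], "Animal": ["Dog"]}
paths = extract_paths(hierarchy, "Thing")
```

With paths stored this way, a subsumption query such as "all subclasses of Animal" becomes a string prefix match on the path column instead of a recursive join.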
Web service capability matching based on process-similarity
LI Ming YANG Fan
2011, 31(05): 1370-1373. DOI: 10.3724/SP.J.1087.2011.01370
Abstract | PDF (621KB) | Related Articles | Metrics
Existing methods do not implement service matching from the perspective of combining process and capability, so the matching precision is affected. To solve this problem, Web services described by the Web Ontology Language for Services (OWL-S) were expressed as formalized processes using automata, and a service capability matching algorithm based on process similarity was proposed. In this algorithm, whether a request and a service are process-similar is decided by a similarity judgment of the formalized processes, and the result of this judgment is used in capability matching; the process-similarity judgment is performed through structure similarity computation and behavior similarity checking. The proposed algorithm is proved feasible and effective by comparative experiments.
Process decision tree model based on multi-dimensional time series
LIU Dong SONG Guo-jie
2011, 31(05): 1374-1377. DOI: 10.3724/SP.J.1087.2011.01374
Abstract | PDF (599KB) | Related Articles | Metrics
To solve the classification problem of multi-dimensional time series and obtain understandable classification rules, the concept of time series entropy and a method for constructing it were introduced, and the decision tree model was extended in terms of both attribute selection and attribute values. Two algorithms for constructing the decision tree model for multi-dimensional time series classification were presented. Finally, the process decision tree was tested on mobile customer churn data, and the feasibility of the proposed method was demonstrated.
Method of process pattern mining based on probability statistics of adjacent event
SHI Mei-hong CAO Kai-duan CHEN Liang WANG Quan-feng
2011, 31(05): 1378-1381. DOI: 10.3724/SP.J.1087.2011.01378
Abstract | PDF (692KB) | Related Articles | Metrics
To mine process patterns accurately and improve anti-noise ability, a new process pattern mining method based on probability statistics of adjacent events was proposed, which generates process patterns automatically from rules by traversing the logs only once and performing simple matrix calculations. Compared with the α-algorithm and the heuristic algorithm, performance analysis and verification experiments show that it can mine sequence, choice, parallel, recursion and short-loop pattern structures, with the advantages of low complexity and good anti-noise ability.
Mixed collaborative recommendation algorithm based on factor analysis of user and item
ZHAO Hong-xia WANG Xin-hai YANG Jiao-ping
2011, 31(05): 1382-1386. DOI: 10.3724/SP.J.1087.2011.01382
Abstract | PDF (803KB) | Related Articles | Metrics
In order to solve the problems of data overload and data sparsity in Collaborative Filtering Recommendation (CFR) algorithms, factor analysis was adopted to reduce the dimension of the data, and regression analysis was used to forecast the ratings to be estimated; together these two methods reduce the amount of data while retaining as much information as possible. The idea of the algorithm is as follows: first, the dimensions of the user and item vectors are reduced by factor analysis, yielding representative user factors and item factors; then, two regression models are established, with the target user and the evaluated item as the dependent variables and the user factors and item factors as the independent variables respectively, producing two predictive values for the evaluated item; finally, the final prediction is a weighted combination of the two. Simulation experiments demonstrate that the algorithm is effective and feasible, and that its accuracy improves on that of the item-based collaborative filtering recommendation algorithm.
Web clustering algorithm based on relative hamming distance
LI Bin WANG Tian-fei LIU Cai-ming ZHANG Jian-dong
2011, 31(05): 1387-1390. DOI: 10.3724/SP.J.1087.2011.01387
Abstract | PDF (590KB) | Related Articles | Metrics
Concerning clustering inaccuracy in Web usage mining, an improved clustering algorithm based on relative Hamming distance and conflict degree was given. In this algorithm, a URL-UserID association matrix is set up, with the URLs and UserIDs of a Web site as rows and columns respectively and each element of the matrix holding the user's hit count. Similar customer groups or related Web pages are then obtained by measuring the similarity between column vectors or between row vectors of the association matrix. The experiments show that the new algorithm is more accurate.
No-header-table FP-Growth algorithm
LING Xu-xiong WANG She-guo LI Yang MIAO Zai-liang
2011, 31(05): 1391-1394. DOI: 10.3724/SP.J.1087.2011.01391
Abstract | PDF (607KB) | Related Articles | Metrics
Concerning the low traversal efficiency of searching an FP-tree for conditional pattern bases, a new FP-Growth algorithm without a header table was proposed. The algorithm employs recursive backtracking to search for conditional pattern bases, avoiding traversing the same FP-tree path multiple times. Compared with the FP-Growth algorithm in both theoretical analysis and actual mining performance, it greatly reduces the search cost and improves the mining efficiency of frequent patterns by 2-5 times. Finally, the algorithm was used to mine association rules in telecommunication network alarms; the high mining speed, with a coverage of 83.3% of the correct rules, shows that it is superior to FP-Growth in both time and space performance.
Improvement of clustering algorithm FEC for signed networks
KONG Ling-qi YANG Meng-long
2011, 31(05): 1395-1399. DOI: 10.3724/SP.J.1087.2011.01395
Abstract | PDF (752KB) | Related Articles | Metrics
The Finding and Extracting Community (FEC) algorithm is not stable enough, and the quality of the extracted communities needs to be improved. To solve these problems, several improvements were made: a target-vertex selection step was added before the random walk; the random-walk step parameter of the original algorithm was removed by detecting the number of steps automatically; an evaluation of the quality of the links between communities was added on the basis of the original community extraction; and the granularity of communities was made controllable by introducing a threshold parameter. The results show that the improved algorithm gains in stability, anti-jamming performance and clustering analysis.
Efficient and self-adaptive storage schema in flash-based database system
WANG Li WANG Yue-qing WANG Han-hu CHEN Mei
2011, 31(05): 1400-1403. DOI: 10.3724/SP.J.1087.2011.01400
Abstract | PDF (662KB) | Related Articles | Metrics
Using flash memory as the storage medium has become a new approach to improving database system performance. To solve the problems of low search performance, improper log region allocation and high index update cost in the storage management of current flash database systems, a forecasting algorithm based on a latest-version Bloom filter was proposed, a record locator and a log summary structure were introduced, and a self-adaptive mechanism based on a cost estimation model for flash search and update was given. The experimental results show that it allocates the log region properly, efficiently improves search performance, and reduces the update cost of non-clustered indexes.
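The Bloom filter underlying the forecasting algorithm can be sketched in a few lines: membership tests may yield false positives but never false negatives, so a miss guarantees a record version is absent and the corresponding flash read can be skipped. The bit-array size and hash construction below are illustrative assumptions.

```python
# Minimal sketch of a Bloom filter: k positions derived from the key are
# set on insert; a lookup succeeds only if all k positions are set.
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1024, n_hashes=3):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = 0                      # bit array packed into one int

    def _positions(self, key):
        for i in range(self.n_hashes):
            # derive k independent positions by salting one hash function
            h = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.n_bits

    def add(self, key):
        for p in self._positions(key):
            self.bits |= 1 << p

    def may_contain(self, key):
        return all(self.bits >> p & 1 for p in self._positions(key))
```

The one-sided error is what makes the structure useful for forecasting where a record's latest version resides: "definitely not here" answers are free.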
Concurrency control in distributed database system with score-based method
LIU Yi LIN Zi-yu
2011, 31(05): 1404-1408. DOI: 10.3724/SP.J.1087.2011.01404
Abstract | PDF (769KB) | Related Articles | Metrics
The Concurrency Control (CC) scheme employed in a distributed database system profoundly affects the performance of the transaction-processing system. There are many different solutions to the CC problem in distributed database systems, each with advantages and disadvantages. The major CC methods, the basics of the lock-based model and the 2-Phase Locking (2PL) scheme were introduced, and a score-based method built on lock mechanisms and conforming to the 2PL protocol was proposed, which reduces the amount of data transmission and offers desirable concurrency. The experimental results show that the proposed method achieves much better performance than other available ones.
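The 2PL discipline the proposed method conforms to can be sketched as follows: a transaction acquires all its locks in a growing phase and releases them in a shrinking phase, never acquiring after its first release. The single-threaded lock-table model below is an illustrative assumption, not the paper's score-based scheme.

```python
# Minimal sketch of two-phase locking (2PL): once a transaction releases
# any lock it enters the shrinking phase and may not acquire new locks.

class TwoPhaseLockingError(Exception):
    pass

class Transaction:
    def __init__(self, name, lock_table):
        self.name, self.lock_table = name, lock_table
        self.held, self.shrinking = set(), False

    def lock(self, item):
        if self.shrinking:
            raise TwoPhaseLockingError("cannot lock after first unlock")
        owner = self.lock_table.get(item)
        if owner not in (None, self.name):
            return False                 # conflict: caller must wait or abort
        self.lock_table[item] = self.name
        self.held.add(item)
        return True

    def unlock(self, item):
        self.shrinking = True            # enter the shrinking phase
        self.held.discard(item)
        self.lock_table.pop(item, None)
```

Enforcing the two phases is what guarantees conflict-serializable schedules, which any refinement such as the score-based method must preserve.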
Data discretization method based on statistical correlation coefficient
XIE Ya-ping
2011, 31(05): 1409-1412. DOI: 10.3724/SP.J.1087.2011.01409
Most data mining and inductive learning methods can only deal with discrete attributes; therefore, discretization of continuous attributes is necessary. A data discretization method based on the statistical correlation coefficient was proposed. The method captures the interdependence between attributes and the target class so as to select optimal cut points based on statistical correlation theory. In addition, the Variable Precision Rough Set (VPRS) model was incorporated to effectively control information loss. The proposed method was applied to breast tumor diagnosis and to data from other fields. The experimental results show that the method significantly enhances the classification accuracy of See5.
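Correlation-driven cut-point selection can be sketched roughly as follows (an illustrative simplification using the Pearson coefficient on a single binary split; the paper's actual criterion and its VPRS component are not reproduced here):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def best_cut(values, labels):
    """Pick the cut point whose induced binary split correlates most
    strongly (in absolute value) with the class labels."""
    candidates = sorted(set(values))[:-1]        # cuts between distinct values
    scored = []
    for c in candidates:
        indicator = [1 if v > c else 0 for v in values]
        scored.append((abs(pearson(indicator, labels)), c))
    return max(scored)[1]

values = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = [0, 0, 0, 1, 1, 1]
print(best_cut(values, labels))   # 3.0 — this cut separates the classes perfectly
```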
Typical applications
Distributed discrete event simulation model based on RPC and barrier synchronization mechanism
CHEN You-zi CHEN Jun-yan WANG Tong
2011, 31(05): 1413-1416. DOI: 10.3724/SP.J.1087.2011.01413
A distributed simulation approach was proposed for discrete-event simulation with a considerable number of events exchanged between logical processes. The approach employs a time-driven method to simulate the occurrence of discrete events, using Remote Procedure Call (RPC) to describe the interaction between simulation members. Barrier synchronization objects are deployed for time synchronization during simulation advancement, in order to ensure correct causal ordering. The experimental results show that the proposed approach can correctly and promptly handle a large number of events, guaranteeing accuracy and improving the efficiency of the simulation model.
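The barrier-synchronized, time-driven advancement can be sketched with Python threads standing in for the simulation members (a minimal illustration; real members would interact via RPC rather than shared memory):

```python
import threading

def run_simulation(num_lps=3, steps=4):
    """Each logical process (LP) advances one time step, then waits at a
    barrier so no LP outruns the others — preserving causal ordering."""
    barrier = threading.Barrier(num_lps)
    clocks = [0] * num_lps

    def lp(i):
        for _ in range(steps):
            clocks[i] += 1          # process this step's local events
            barrier.wait()          # synchronize before advancing time

    threads = [threading.Thread(target=lp, args=(i,)) for i in range(num_lps)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return clocks

print(run_simulation())   # [4, 4, 4] — all LPs end at the same simulation time
```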
Design of multiple virtual machines management model for cloud computing
LIU Jin-jun ZHAO Sheng-hui
2011, 31(05): 1417-1419. DOI: 10.3724/SP.J.1087.2011.01417
A management model of virtual machines based on a Peer-to-Peer (P2P) structure was proposed and its prototype system was implemented. Host nodes are organized in a P2P structure and resource discovery is achieved by multicast. A live migration algorithm for virtual machines was proposed, with live migrations between nodes triggered automatically. Requests from cloud computing users are mapped to hosts by an elected root node, and the on-demand operations of creating, deleting, and stopping a virtual machine are supported. The experimental results show that the model features rapid convergence, low bandwidth utilization and high availability, and that load balancing of cloud computing resources can be achieved.
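An automatically triggered live migration decision can be sketched as a simple threshold rule (purely illustrative; the thresholds, host names and the pick_migration helper are assumptions, not the paper's algorithm):

```python
def pick_migration(hosts, high=0.8, low=0.5):
    """Trigger a live migration when a host's load exceeds `high`: move one
    VM from the busiest host to the least-loaded host below `low`.
    Returns (source, target) or None if no migration is warranted."""
    src = max(hosts, key=hosts.get)
    dst = min(hosts, key=hosts.get)
    if hosts[src] > high and hosts[dst] < low:
        return src, dst
    return None

hosts = {"hostA": 0.92, "hostB": 0.35, "hostC": 0.61}
print(pick_migration(hosts))   # ('hostA', 'hostB')
```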
Solution for workflow patterns involving multiple instances based on network partition
HU Fei-hu ZHANG Dan-dan YANG Hui-yuan MA Ling
2011, 31(05): 1420-1422. DOI: 10.3724/SP.J.1087.2011.01420
To realize the building and controlling of workflow patterns involving multiple instances, a solution was proposed from the perspective of network partition. The implementation method was discussed based on the RTWD net proposed by HU Fei-hu, et al. in Patent China 201010114083.9. First, the sub-workflows involving multiple instances are divided into a subnet. Then the related parameters of the multiple instances are defined, and the instances are controlled on that basis. The controlling of sequential, synchronous and asynchronous parallel workflow patterns involving multiple instances was discussed based on this method. Because the divided subnet remains consistent with the definition of the workflow model, multiple instances can be scheduled by the original workflow engine, which simplifies the realization of multiple-instance patterns.
Proof of program correctness based on Kripke structure
LIN Jie YU Jian-kun
2011, 31(05): 1425-1427. DOI: 10.3724/SP.J.1087.2011.01425
In order to prove program correctness conveniently, the Kripke structure was introduced into the proof of program correctness. The Kripke structure was redefined to be adequate for proof, and an approach for converting a program flow chart into a Kripke structure state diagram was described. The related theory and the method of proving program correctness based on the Kripke structure were also given. First the program flow chart is converted into a state diagram; then the state predicate of every state is listed according to the transformation relations between states; finally, every state predicate is proved to be true. Proof through the state predicates can reflect the state of program execution. A complete proving process was demonstrated through an example.
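Checking that a state predicate holds in every reachable state of a Kripke structure can be sketched as a graph search (a minimal illustration; the paper's proof method establishes a separate predicate per state rather than checking one global predicate mechanically):

```python
def holds_everywhere(kripke, init, predicate):
    """Check that a state predicate holds in every state reachable in a
    Kripke structure given as a dict {state: [successor states]}."""
    seen, stack = set(), [init]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if not predicate(s):
            return False            # a counterexample state was reached
        stack.extend(kripke.get(s, []))
    return True

# States pair a program location with the value of a variable x;
# transitions model the execution of one statement.
kripke = {("s0", 0): [("s1", 1)], ("s1", 1): [("s2", 2)], ("s2", 2): []}
print(holds_everywhere(kripke, ("s0", 0), lambda s: s[1] >= 0))   # True
```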
Performance reliability assessment for electronic system based on circuits simulation
CAI Jin-yan YU Zhi-jian
2011, 31(05): 1428-1430. DOI: 10.3724/SP.J.1087.2011.01428
Concerning the problem of system reliability assessment for complicated electronic equipment with zero-failure data, a simulation-based assessment method built on the performance of circuit units was proposed. First, all circuit function units in the system are identified, and a system performance reliability model is set up according to the logical relation between the performance parameters of the units and the system output performance. Second, the performance distribution parameters of the units are estimated through statistical analysis of performance data, random samples of the performance data are drawn according to these distribution parameters, and the system performance parameters are simulated using the sample results. Finally, the failure ratio of the system parameters is obtained from the simulation results, from which the system reliability assessment is derived. The validity and practicality of the method were validated through an example.
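The sampling step can be sketched as a small Monte Carlo routine (illustrative only; the normal unit distributions, the sum-type system function and the specification limit are assumptions, not the paper's model):

```python
import random

def reliability(unit_params, system_fn, spec_limit, trials=20000, seed=1):
    """Monte Carlo estimate of performance reliability: sample each unit's
    parameter from its (mean, sigma) normal distribution, compute the system
    output, and count how often it stays within the specification limit."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        sample = [rng.gauss(mu, sigma) for mu, sigma in unit_params]
        if system_fn(sample) <= spec_limit:
            ok += 1
    return ok / trials

# Hypothetical system: its output parameter is the sum of two unit parameters.
r = reliability([(1.0, 0.05), (2.0, 0.05)], sum, spec_limit=3.2)
print(round(r, 2))   # close to 1.0: the limit sits ~2.8 sigma above the mean output
```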
Multi-medium space target visual measurement
WANG Jun ZHU Zhan-xia JIA Guo-hua ZHANG Xu-yang
2011, 31(05): 1431-1434. DOI: 10.3724/SP.J.1087.2011.01431
Based on the non-contact three-dimensional measurement principle of computer vision and the principle of light refraction, a new theoretical method of visual measurement across multiple media was proposed. Two pairs of binocular cameras observe the underwater space target from different perspectives; the landmarks of the target are measured using binocular stereo vision and a mathematical model of the two light refractions, yielding the three-dimensional coordinates of the landmarks; finally, through plane fitting and coordinate transformation, the six-dimensional position and attitude coordinates of the underwater target are obtained. The results show that the method is suitable for multi-medium visual measurement, with high accuracy and good stability.
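The single-refraction building block of such a model is Snell's law; a minimal sketch (the air and water refractive indices are standard textbook values, not taken from the paper):

```python
import math

def refract_angle(theta_incident_deg, n1=1.0, n2=1.333):
    """Snell's law: n1*sin(t1) = n2*sin(t2). Returns the refracted ray's
    angle from the normal (degrees) when light crosses from medium n1 into
    medium n2 (defaults: air into water)."""
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    return math.degrees(math.asin(s))

print(round(refract_angle(30.0), 1))   # 22.0 — the ray bends toward the normal
```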
Query tree scanning algorithm for antenna line device based on mask promotion
LI Wen-sheng LUO Ren-ze CAI Ming-chang Lü Yi DENG Chun-jian
2011, 31(05): 1435-1438. DOI: 10.3724/SP.J.1087.2011.01435
Concerning the requirements of the Antenna Interface Standard Group (AISG) protocol and the encoding features of the Unique ID (UID) of Antenna Line Devices (ALD), a query-tree ALD scanning algorithm based on mask promotion was proposed. When a collision occurs during ALD scanning, new scanning branches are generated through mask promotion. Theoretical analysis and computer simulations show that the performances of the 1-bit mask promotion (binary tree) and 2-bit mask promotion (quadtree) scanning algorithms are not only similar but also close to optimal. The 2-bit mask promotion (quadtree) scanning algorithm was adopted in a Remote Electrical antenna Tilting (RET) control system, and the application shows that the algorithm adapts well and can scan various ALDs from different vendors quickly and accurately.
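The mask-promotion idea can be sketched as a query-tree scan over bit-string IDs (a simplified illustration; real AISG scanning exchanges frames with devices over the bus rather than inspecting an ID list):

```python
def query_tree_scan(ids, bits_per_step=1):
    """Query-tree scan: broadcast a prefix; if more than one device ID
    matches, a collision occurs and the mask is 'promoted' by
    bits_per_step bits (1 bit = binary tree, 2 bits = quadtree)."""
    found, queue, queries = [], [""], 0
    while queue:
        prefix = queue.pop(0)
        queries += 1
        matches = [i for i in ids if i.startswith(prefix)]
        if len(matches) == 1:
            found.append(matches[0])          # unique reply: device identified
        elif len(matches) > 1:                # collision: promote the mask
            for i in range(2 ** bits_per_step):
                queue.append(prefix + format(i, f"0{bits_per_step}b"))
    return sorted(found), queries

ids = ["0010", "0111", "1100"]
print(query_tree_scan(ids, 1))   # (['0010', '0111', '1100'], 5)
```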
Method of direct USB device access in virtual environment
WANG Ji-gang ZHENG Wei-min TENG Zhi-meng ZHONG Wei-dong
2011, 31(05): 1439-1442. DOI: 10.3724/SP.J.1087.2011.01439
Through direct device access, the influence of the Virtual Machine (VM) layer on system performance can be decreased, while existing device drivers can still be used by guest operating systems. A method of direct Universal Serial Bus (USB) device access in a virtual environment was proposed, and a model of direct device access and its workflow based on QEMU were designed. The experimental results show that, with this method, the VM can directly access various USB devices and achieve an acceptable data transmission rate.
Analysis system of unmanned aerial vehicle survivability based on MapX
CAO Lu ZHANG An GUO Feng-juan
2011, 31(05): 1443-1446. DOI: 10.3724/SP.J.1087.2011.01443
Survivability is one of the technical indicators that must be considered with the highest priority in the design and use of an Unmanned Aerial Vehicle (UAV). According to the requirements for real-time display of the situation and visual expression of analysis results in modern battlefield simulation, a UAV survivability analysis system was developed with MapX component technology. First the primary framework and characteristics of MapX are introduced briefly, the function and composition of the UAV survivability analysis system are discussed, and the module partition and program flow of the system are given. Then the functions of war-zone map display, roaming and zooming, combat scenario display, and the display and processing of survivability data with MapX in the VC++ environment are analyzed in detail. Finally, simulation examples of survivability analysis are given. The simulation experiments indicate that the battlefield geographic environment of the UAV is displayed realistically, vividly and dynamically in the system, making it convenient to analyze the factors influencing survivability.
Voice activity detection method based on inter-frame correlation
LI Yu GUO Lei-yong TAN Hong-zhou
2011, 31(05): 1447-1449. DOI: 10.3724/SP.J.1087.2011.01447
To enhance the detection performance of statistical model-based Voice Activity Detection (VAD) using the likelihood ratio test, an improved VAD was proposed that utilizes the correlation between consecutive speech frames. First, the a priori Signal-to-Noise Ratio (SNR) is estimated with a recursive method based on the result of the previous speech frame, instead of the traditional decision-directed method. Second, double thresholds are designed depending on the previous frame's detection result. Finally, a detection rule is presented based on a two-state Hidden Markov Model (HMM) coupled with the double thresholds. The experimental results show that the inter-frame correlation based VAD scheme outperforms Sohn's VAD.
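The double-threshold decision that exploits inter-frame correlation can be sketched as follows (illustrative threshold values and a simple hysteresis rule, not the paper's HMM-based rule):

```python
def vad_decide(llr_frames, t_high=2.0, t_low=0.5):
    """Double-threshold VAD with inter-frame memory: a frame must exceed
    t_high to start a speech segment, but once in speech only dropping
    below t_low ends it — exploiting the correlation between neighbouring
    frames. Input: per-frame log-likelihood ratios; output: 0/1 decisions."""
    decisions, in_speech = [], False
    for llr in llr_frames:
        threshold = t_low if in_speech else t_high
        in_speech = llr > threshold
        decisions.append(int(in_speech))
    return decisions

llrs = [0.1, 2.5, 1.0, 0.8, 0.3, 0.2]
print(vad_decide(llrs))   # [0, 1, 1, 1, 0, 0] — trailing speech frames are kept
```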
Release of spatial information based on Google Maps API
ZHOU Yu-lin FU Zhong-liang
2011, 31(05): 1450-1452. DOI: 10.3724/SP.J.1087.2011.01450
Traditional online mapping services have a certain deficiency: they only allow clients to browse and query. To achieve the goal that the client can independently input data while the server automatically receives the spatial information and publishes the online map, a new method for constructing a spatial information distribution system was proposed in this paper. Based on the B/S architecture, the method obtains geographical coordinates automatically by improving the Google Maps API event listener; meanwhile, a self-defined XML file is used to record this information on the server. Address resolution functions then parse the XML file and draw markers on the Google map at the corresponding locations, thereby publishing the local data to the Web map. Based on this method, the Wuhan University campus navigation system was developed successfully, which demonstrates the feasibility of the solution.
Superintended by: Sichuan Associations for Science and Technology
Sponsored by: Sichuan Computer Federation; Chengdu Branch, Chinese Academy of Sciences
Honorary Editor-in-Chief: ZHANG Jingzhong
Editor-in-Chief: XU Zongben
Associate Editors: SHEN Hengtao, XIA Zhaohui
Domestic Post Distribution Code: 62-110
Foreign Distribution Code: M4616
Address: No. 9, 4th Section of South Renmin Road, Chengdu 610041, China
Tel: 028-85224283-803, 028-85222239-803
Website: www.joca.cn
E-mail: bjb@joca.cn