Journal of Computer Applications ›› 2020, Vol. 40 ›› Issue (8): 2231-2235. DOI: 10.11772/j.issn.1001-9081.2019122223

• Artificial intelligence •

Recognition of two-person interaction behavior based on key gestures

YANG Wenlu, YU Mengmeng, XIE Hong   

  1. College of Information Engineering, Shanghai Maritime University, Shanghai 201306, China
  • Received: 2020-01-05  Revised: 2020-03-02  Online: 2020-08-10  Published: 2020-05-13
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (61550110252).


  • Corresponding author: YANG Wenlu (1967-), male, born in Pei County, Jiangsu, associate professor, PhD; main research interest: biological information processing.
  • About the authors: YU Mengmeng (1993-), female, born in Zhoukou, Henan, master's student; main research interest: signal and information processing. XIE Hong (1962-), male, born in Hanzhong, Shaanxi, professor, PhD; main research interest: artificial intelligence.

Abstract: Two-person interaction behavior recognition has wide applications but suffers from low recognition efficiency. To address this, a recognition method based on key gestures was proposed. First, key frames were extracted by comparing the differences between adjacent frames. Second, the key gestures within the key frames were determined using the variance of the angle changes of the skeleton joints and their spatial relationships. Then, each key gesture was represented by features such as joint distance, joint angle, and joint motion, forming a feature matrix. Finally, the combination with the best recognition rate was selected by comparing different pairings of dimensionality-reduction and classification methods. The proposed method was evaluated on the SBU interaction dataset and a self-built interaction dataset, achieving recognition rates of 92.47% and 94.14%, respectively. Experimental results show that representing actions as feature matrices built from key-gesture features can effectively improve two-person interaction behavior recognition.
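The two early steps of the pipeline can be illustrated with a minimal sketch. This is not the authors' implementation: the array layout (T frames × J joints × 3 coordinates), the function names, and the fixed difference threshold are all illustrative assumptions; the paper selects key gestures with angle-change variance and spatial relations, which are not reproduced here.

```python
import numpy as np

def extract_key_frames(skeleton_seq, threshold=0.5):
    """Keep frames whose total inter-frame joint displacement exceeds a threshold.

    skeleton_seq: array of shape (T, J, 3) -- T frames, J joints, 3-D coordinates.
    Returns the indices of the selected key frames (frame 0 is always kept).
    """
    # Per-frame change: Euclidean displacement of each joint, summed over joints.
    diffs = np.linalg.norm(np.diff(skeleton_seq, axis=0), axis=2).sum(axis=1)
    return [0] + [t + 1 for t, d in enumerate(diffs) if d > threshold]

def joint_distance_features(frame_a, frame_b):
    """Pairwise distances between the joints of person A and person B in one frame."""
    # frame_a, frame_b: (J, 3) skeletons; result: (J, J) distance matrix.
    return np.linalg.norm(frame_a[:, None, :] - frame_b[None, :, :], axis=2)

def joint_angle(p, q, r):
    """Angle (radians) at joint q formed by the segments q->p and q->r."""
    u, v = p - q, r - q
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))
```

Stacking such distance and angle features for every key gesture yields the per-gesture feature matrix that is then passed to the dimensionality-reduction and classification stage.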

Key words: behavior recognition, two-person interaction, key gesture, feature matrix, body sensor

