[1] CHIU L Z. Sitting back in the squat[J]. Strength and Conditioning Journal, 2009, 31(6): 25-27.
[2] YAO L Y, MING W D, CUI H. A new Kinect approach to judge unhealthy sitting posture based on neck angle and torso angle[C]// Proceedings of the 2017 International Conference on Image and Graphics, LNCS 10666. Berlin: Springer-Verlag, 2017: 340-350.
[3] FANG B, SUN F C, LIU H P, et al. 3D human gesture capturing and recognition by the IMMU-based data glove[J]. Neurocomputing, 2017, 277: 198-207.
[4] FERRONE A, JIANG X, MAIOLO L, et al. A fabric-based wearable band for hand gesture recognition based on filament strain sensors: a preliminary investigation[C]// Proceedings of the 2016 IEEE Healthcare Innovation Point-of-Care Technologies Conference. Piscataway, NJ: IEEE, 2016: 113-116.
[5] WU D, SHAO L. Deep dynamic neural networks for gesture segmentation and recognition[C]// Proceedings of the 2014 European Conference on Computer Vision. Berlin: Springer, 2014: 552-571.
[6] LI Y, WANG X G, LIU W Y, et al. Deep attention network for joint hand gesture localization and recognition using static RGB-D images[J]. Information Sciences, 2018, 441: 66-78.
[7] 曾星, 孙备, 罗武胜, 等. 基于深度传感器的坐姿检测系统[J]. 计算机科学, 2018, 45(7): 237-242. (ZENG X, SUN B, LUO W S, et al. Sitting posture detection system based on depth sensor[J]. Computer Science, 2018, 45(7): 237-242.)
[8] YAO L Y, MING W D, LU K Q. A new approach to fall detection based on the human torso motion model[J]. Applied Sciences, 2017, 7(10): 993.
[9] BACCOUCHE M, MAMALET F, WOLF C, et al. Sequential deep learning for human action recognition[C]// Proceedings of the 2011 International Workshop on Human Behavior Understanding, LNCS 7065. Berlin: Springer-Verlag, 2011: 29-39.
[10] NG J Y, HAUSKNECHT M, VIJAYANARASIMHAN S, et al. Beyond short snippets: deep networks for video classification[C]// Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2015: 4694-4702.
[11] 吴亮, 何毅, 梅雪, 等. 基于时空兴趣点和概率潜动态条件随机场模型的在线行为识别方法[J]. 计算机应用, 2018, 38(6): 1760-1764. (WU L, HE Y, MEI X, et al. Online behavior recognition using space-time interest points and probabilistic latent-dynamic conditional random field model[J]. Journal of Computer Applications, 2018, 38(6): 1760-1764.)
[12] 姬晓飞, 左鑫孟. 基于关键帧特征库统计特征的双人交互行为识别[J]. 计算机应用, 2016, 36(8): 2287-2291. (JI X F, ZUO X M. Human interaction recognition based on statistical features of key frame feature library[J]. Journal of Computer Applications, 2016, 36(8): 2287-2291.)
[13] KALLIATAKIS G, STERGIOU A, VIDAKIS N. Conceiving human interaction by visualising depth data of head pose changes and emotion recognition via facial expressions[J]. Computers, 2017, 6(3): 25-37.
[14] MAITI S, REDDY S, RAHEJA J L. View invariant real-time gesture recognition[J]. Optik - International Journal for Light and Electron Optics, 2015, 126(23): 3737-3742.
[15] 张全贵, 蔡丰, 李志强. 基于耦合多隐马尔可夫模型和深度图像数据的人体动作识别[J]. 计算机应用, 2018, 38(2): 454-457. (ZHANG Q G, CAI F, LI Z Q. Human action recognition based on coupled multi-hidden Markov model and depth image data[J]. Journal of Computer Applications, 2018, 38(2): 454-457.)
[16] 谈家谱, 徐文胜. 基于Kinect的指尖检测与手势识别方法[J]. 计算机应用, 2015, 35(6): 1795-1800. (TAN J P, XU W S. Fingertip detection and gesture recognition method based on Kinect[J]. Journal of Computer Applications, 2015, 35(6): 1795-1800.)
[17] CHOUBIK Y, MAHMOUDI A. Machine learning for real time poses classification using Kinect skeleton data[C]// Proceedings of the 2016 International Conference on Computer Graphics, Imaging and Visualization. Piscataway, NJ: IEEE, 2016: 307-311.
[18] WINWOOD P W, CRONIN J B, BROWN S R, et al. A biomechanical analysis of the heavy sprint-style sled pull and comparison with the back squat[J]. International Journal of Sports Science and Coaching, 2015, 10(5): 851-868.
[19] STEVENS W R Jr, KOKOSZKA A Y, ANDERSON A M, et al. Automated event detection algorithm for two squatting protocols[J]. Gait and Posture, 2018, 59: 253-257.