Journal of Computer Applications, 2020, Vol. 40, Issue (11): 3346-3356. DOI: 10.11772/j.issn.1001-9081.2020040443
• Frontier & interdisciplinary applications •
Review of gaze tracking and its application in intelligent education
ZHANG Junjie, SUN Guangmin, ZHENG Kun
Received: 2020-04-10
Revised: 2020-06-29
Online: 2020-07-20
Published: 2020-11-10
Corresponding author: ZHENG Kun (郑鲲), born in 1977 in Baoding, Hebei, Ph. D., associate professor. His research interests include intelligent education, image processing, and neural networks. E-mail: zhengkun@bjut.edu.cn
About the authors: ZHANG Junjie (张俊杰), born in 1993 in Beijing, Ph. D. candidate. His research interests include neural networks, image processing, and pattern recognition. SUN Guangmin (孙光民), born in 1960 in Wenxi, Shanxi, Ph. D., professor. His research interests include neural networks, image processing, and pattern recognition.
ZHANG Junjie, SUN Guangmin, ZHENG Kun. Review of gaze tracking and its application in intelligent education[J]. Journal of Computer Applications, 2020, 40(11): 3346-3356.
张俊杰, 孙光民, 郑鲲. 视线跟踪及其在智能教育中的应用研究综述[J]. 计算机应用, 2020, 40(11): 3346-3356.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2020040443
Related Articles:
[1] Wenze CHAI, Jing FAN, Shukui SUN, Yiming LIANG, Jingfeng LIU. Overview of deep metric learning [J]. Journal of Computer Applications, 2024, 44(10): 2995-3010.
[2] Tong CHEN, Jiwei WEI, Shiyuan HE, Jingkuan SONG, Yang YANG. Adversarial training method with adaptive attack strength [J]. Journal of Computer Applications, 2024, 44(1): 94-100.
[3] Yongfeng DONG, Yacong WANG, Yao DONG, Yahan DENG. Survey of online learning resource recommendation [J]. Journal of Computer Applications, 2023, 43(6): 1655-1663.
[4] Jian CUI, Kailang MA, Yu SUN, Dou WANG, Junliang ZHOU. Deep explainable method for encrypted traffic classification [J]. Journal of Computer Applications, 2023, 43(4): 1151-1159.
[5] Pengxin TIAN, Guannan SI, Zhaoliang AN, Jianxin LI, Fengyu ZHOU. Survey of data-driven intelligent cloud-edge collaboration [J]. Journal of Computer Applications, 2023, 43(10): 3162-3169.
[6] Jiaxuan WEI, Shikang DU, Zhixuan YU, Ruisheng ZHANG. Review of white-box adversarial attack technologies in image classification [J]. Journal of Computer Applications, 2022, 42(9): 2732-2741.
[7] LIU Zichen, LI Xiaojuan, WEI Wei. Automatic patent price evaluation based on recurrent neural network [J]. Journal of Computer Applications, 2021, 41(9): 2532-2538.
[8] DING Yin, SANG Nan, LI Xiaoyu, WU Feizhou. Prediction method of capacity data in telecom industry based on recurrent neural network [J]. Journal of Computer Applications, 2021, 41(8): 2373-2378.
[9] Tian LI, Shumei ZHANG, Junli ZHAO. Design and implementation of intelligent flow field pathfinding algorithm for real-time strategy game [J]. Journal of Computer Applications, 2020, 40(2): 602-607.
[10] XIA Bin, BAI Yuxuan, YIN Junjie. Generative adversarial network-based system log-level anomaly detection algorithm [J]. Journal of Computer Applications, 2020, 40(10): 2960-2966.
[11] WEI Xiaona, LI Yinghao, WANG Zhenyu, LI Haozun, WANG Hongzhi. Methods of training data augmentation for medical image artificial intelligence aided diagnosis [J]. Journal of Computer Applications, 2019, 39(9): 2558-2567.
[12] ZHANG Qiang, YANG Jian, FU Lizhen. Two-input stream deep deconvolution neural network for interpolation and recognition [J]. Journal of Computer Applications, 2019, 39(8): 2271-2275.
[13] ZHANG Hu-yin, ZHANG Ming-yang, LI Xin. E-learning resource library model based on domain ontology [J]. Journal of Computer Applications, 2012, 32(1): 191-195.
[14] Automatic path finding method for real-time rendering of 3D scene [J]. Journal of Computer Applications, 2010, 30(1): 85-89.
[15] Methods of conception soft-and operation in cloud model [J]. Journal of Computer Applications, 2008, 28(10): 2510-2512.