Free-Head Gaze Estimation with Deep Learning

Rayner K. Eye movements in reading and information processing: 20 years of research. Psychol Bull. 1998;124(3):372–422.


Zhang X, Sugano Y, Bulling A. Evaluation of appearance-based methods and implications for gaze-based applications. In: Proceedings of the 2019 CHI conference on human factors in computing systems; 2019. p. 416.

Patney A, Kim J, Salvi M, Kaplanyan A, Wyman C, Benty N, Lefohn A, Luebke D. Perceptually-based foveated virtual reality. In: ACM SIGGRAPH 2016 emerging technologies; 2016. p. 17.

Tawari A, Chen KH, Trivedi MM. Where is the driver looking: analysis of head, eye and iris for robust gaze zone estimation. In: 17th international IEEE conference on intelligent transportation systems (ITSC); 2014. pp. 988–994.

Hansen D, Ji Q. In the eye of the beholder: a survey of models for eyes and gaze. IEEE Trans Pattern Anal Mach Intell. 2010;32(3):478–500.


Zhang X, Sugano Y, Fritz M, Bulling A. MPIIGaze: real-world dataset and deep appearance-based gaze estimation. IEEE Trans Pattern Anal Mach Intell. 2019;41(1):162–75.


Ranjan R, Mello SD, Kautz J. Light-weight head pose invariant gaze tracking. In: 2018 IEEE/CVF conference on computer vision and pattern recognition workshops (CVPRW); 2018. pp. 2156–2164.

Deng H, Zhu W. Monocular free-head 3d gaze tracking with deep learning and geometry constraints. In: 2017 IEEE international conference on computer vision (ICCV); 2017. pp. 3162–3171.

Hennessey C, Noureddin B, Lawrence P. A single camera eye-gaze tracking system with free head motion. In: Proceedings of the 2006 symposium on eye tracking research & applications; 2006. pp. 87–94.

Yamazoe H, Utsumi A, Yonezawa T, Abe S. Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions. In: Proceedings of the 2008 symposium on eye tracking research & applications; 2008. pp. 245–250.

Sugano Y, Matsushita Y, Sato Y. Learning-by-synthesis for appearance-based 3d gaze estimation. In: 2014 IEEE conference on computer vision and pattern recognition (CVPR); 2014. pp. 1821–1828.

Krafka K, Khosla A, Kellnhofer P, Kannan H, Bhandarkar S, Matusik W, Torralba A. Eye tracking for everyone. In: 2016 IEEE conference on computer vision and pattern recognition (CVPR); 2016. pp. 2176–2184.

Fischer T, Chang HJ, Demiris Y. RT-GENE: real-time eye gaze estimation in natural environments. In: Computer vision – ECCV 2018: 15th European conference, proceedings, Part X; 2018. pp. 339–357.

Cheng Y, Lu F, Zhang X. Appearance-based gaze estimation via evaluation-guided asymmetric regression. In: Proceedings of the European Conference on Computer Vision (ECCV); 2018. pp. 105–121.

Zhang X, Sugano Y, Fritz M, Bulling A. Appearance-based gaze estimation in the wild. In: 2015 IEEE conference on computer vision and pattern recognition (CVPR); 2015. pp. 4511–4520.

Zhang X, Huang MX, Sugano Y, Bulling A. Training person-specific gaze estimators from user interactions with multiple devices. In: Proceedings of the 2018 CHI conference on human factors in computing systems; 2018. p. 624.

Liu G, Yu Y, Mora KAF, Odobez J-M. A differential approach for gaze estimation with calibration. In: BMVC; 2018. p. 235.

Park S, Mello SD, Molchanov P, Iqbal U, Hilliges O, Kautz J. Few-shot adaptive gaze estimation. In: 2019 IEEE/CVF international conference on computer vision (ICCV); 2019. pp. 9368–9377.

Mott ME, Williams S, Wobbrock JO, Morris MR. Improving dwell-based gaze typing with dynamic, cascading dwell times. In: Proceedings of the 2017 CHI conference on human factors in computing systems; 2017. pp. 2558–2570.

Nguyen C, Liu F. Gaze-based notetaking for learning from lecture videos. In: Proceedings of the 2016 CHI conference on human factors in computing systems; 2016. pp. 2093–2097.

Zhang Y, Müller J, Chong MK, Bulling A, Gellersen H. GazeHorizon: enabling passers-by to interact with public displays by gaze. In: Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing; 2014. pp. 559–563.

Huang MX, Kwok TC, Ngai G, Chan SC, Leong HV. Building a personalized, auto-calibrating eye tracker from user interactions. In: Proceedings of the 2016 CHI conference on human factors in computing systems; 2016. pp. 5169–5179.

Wood E, Bulling A. EyeTab: model-based gaze estimation on unmodified tablet computers. In: Proceedings of the symposium on eye tracking research and applications; 2014. pp. 207–210.

Deng J, Guo J, Zhou Y, Yu J, Kotsia I, Zafeiriou S. RetinaFace: single-stage dense face localisation in the wild. arXiv:1905.00641.

Guo X, Li S, Zhang J, Ma J, Ma L, Liu W, Ling H. PFLD: a practical facial landmark detector. arXiv:1902.10859.

Zhang X, Sugano Y, Bulling A. Revisiting data normalization for appearance-based gaze estimation. In: Proceedings of the 2018 ACM symposium on eye tracking research & applications; 2018. p. 12.

Lepetit V, Moreno-Noguer F, Fua P. EPnP: an accurate O(n) solution to the PnP problem. Int J Comput Vision. 2009;81(2):155–66. https://doi.org/10.1007/s11263-008-0152-6.


Zhang X, Sugano Y, Fritz M, Bulling A. It’s written all over your face: full-face appearance-based gaze estimation. In: 2017 IEEE conference on computer vision and pattern recognition workshops (CVPRW); 2017. pp. 2299–2308.