Paper ID | CI-3.5 |
Paper Title | A HIGH-FRAME-RATE EYE-TRACKING FRAMEWORK FOR MOBILE DEVICES |
Authors | Yuhu Chang, Changyang He, Yingying Zhao, Tun Lu, Ning Gu, Fudan University, China |
Session | CI-3: Computational Photography |
Location | Gather.Town |
Session Time | Thursday, 10 June, 15:30 - 16:15 |
Presentation Time | Thursday, 10 June, 15:30 - 16:15 |
Presentation | Poster |
Topic | Computational Imaging: [CIS] Computational Imaging Systems |
Abstract | Gaze-on-screen tracking, an appearance-based eye-tracking task, has drawn significant interest in recent years. Although high-precision learning-based eye-tracking methods have been developed, the complex pre-training and heavy computation of deep neural network models restrict their applicability on mobile devices. Moreover, as the display frame rate of mobile devices has steadily increased to 120 fps, high-frame-rate eye tracking has become increasingly challenging. In this work, we tackle the tracking-efficiency challenge and introduce GazeHFR, a biologically inspired eye-tracking model specialized for mobile devices that offers both high accuracy and efficiency. Specifically, GazeHFR classifies eye movement into two distinct phases, i.e., saccade and smooth pursuit, and leverages inter-frame motion information combined with lightweight learning models tailored to each movement phase to deliver highly efficient eye tracking without sacrificing accuracy. Compared to prior art, GazeHFR achieves approximately 7x speedup and 15% accuracy improvement on mobile devices. |
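The abstract describes a phase-aware pipeline: segment eye movement into saccade and smooth-pursuit phases, then route each frame to a lightweight estimator matched to that phase. The sketch below illustrates one plausible reading of that dispatch logic, assuming an I-VT-style velocity threshold for phase classification; the threshold value and the functions `coarse_motion_estimate`, `saccade_model`, and `pursuit_model` are hypothetical stand-ins, not the paper's actual components.

```python
import numpy as np

# Illustrative inter-frame displacement threshold (pixels); the paper's
# actual phase-segmentation criterion is not given in the abstract.
SACCADE_THRESHOLD_PX = 40.0

def classify_phase(prev_gaze, curr_gaze):
    """I-VT-style heuristic: a large inter-frame gaze displacement
    indicates a saccade, a small one indicates smooth pursuit."""
    displacement = np.linalg.norm(np.asarray(curr_gaze) - np.asarray(prev_gaze))
    return "saccade" if displacement > SACCADE_THRESHOLD_PX else "smooth_pursuit"

def track(frames, coarse_motion_estimate, saccade_model, pursuit_model):
    """Per-frame dispatch: a cheap motion estimate (e.g., optical flow)
    picks the movement phase, then the matching lightweight model
    produces the gaze point. All three callables are assumptions used
    only to make the control flow concrete."""
    gaze = [saccade_model(frames[0])]  # bootstrap the first frame
    for frame in frames[1:]:
        rough = coarse_motion_estimate(gaze[-1], frame)  # cheap prediction
        phase = classify_phase(gaze[-1], rough)
        model = saccade_model if phase == "saccade" else pursuit_model
        gaze.append(model(frame))
    return gaze
```

The efficiency claim in the abstract is consistent with this structure: the expensive appearance-based model runs only when the detected phase demands it, while motion cues carry most frames at 120 fps.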