Publication Type
Journal Article
Version
publishedVersion
Publication Date
12-2021
Abstract
Gaze tracking is a key building block used in many mobile applications, including entertainment, personal productivity, accessibility, medical diagnosis, and visual attention monitoring. In this paper, we present iMon, an appearance-based gaze tracking system that is both designed for use on mobile phones and significantly more accurate than prior state-of-the-art solutions. iMon achieves this by comprehensively considering the gaze estimation pipeline and overcoming three different sources of error. First, instead of assuming that the user's gaze is fixed to a single 2D coordinate, we represent each gaze label as a probabilistic 2D heatmap, which overcomes errors introduced by microsaccadic eye motions that make the exact gaze point uncertain. Second, we design an image enhancement model to refine visual details and remove motion-blur effects from the input eye images. Third, we apply a calibration scheme to correct for differences between the perceived and actual gaze points caused by individual differences in the Kappa angle. With all these improvements, iMon achieves a person-independent per-frame tracking error of 1.49 cm (on smartphones) and 1.94 cm (on tablets) when tested with the GazeCapture dataset, and 2.01 cm with the TabletGaze dataset. This outperforms the previous state-of-the-art solutions by ~22% to 28%. By averaging multiple per-frame estimations that belong to the same fixation point and applying personal calibration, the tracking error is further reduced to 1.11 cm (smartphones) and 1.59 cm (tablets). Finally, we built an implementation that runs on an iPhone 12 Pro and show that iMon can run at up to 60 frames per second, thus making gaze-based control of applications possible.
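The abstract does not spell out how the probabilistic 2D heatmap gaze labels are constructed. As a rough illustration only, the sketch below shows one common way such a label could be built: an isotropic Gaussian centred on the nominal gaze point and normalised to sum to 1. The grid size, screen extent, and standard deviation used here are illustrative assumptions, not values reported by the authors.

import numpy as np

def gaze_to_heatmap(gaze_xy_cm, grid_size=(64, 64), extent_cm=(15.0, 15.0), sigma_cm=0.5):
    """Convert a 2D gaze point (in cm) into a probabilistic heatmap label.

    All parameter values (grid size, extent, sigma) are illustrative
    assumptions, not values taken from the iMon paper.
    """
    h, w = grid_size
    # Cell-centre coordinates of the heatmap grid, in cm.
    xs = np.linspace(0.0, extent_cm[0], w)
    ys = np.linspace(0.0, extent_cm[1], h)
    xx, yy = np.meshgrid(xs, ys)
    # Isotropic Gaussian around the nominal gaze point models the
    # uncertainty introduced by microsaccadic eye motion.
    d2 = (xx - gaze_xy_cm[0]) ** 2 + (yy - gaze_xy_cm[1]) ** 2
    heatmap = np.exp(-d2 / (2.0 * sigma_cm ** 2))
    # Normalise so the heatmap is a valid probability distribution.
    return heatmap / heatmap.sum()

# Example: a gaze point 5.2 cm to the right of and 7.8 cm below the origin.
label = gaze_to_heatmap((5.2, 7.8))
print(label.shape, round(label.sum(), 6))  # (64, 64) 1.0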
Keywords
Mobile gaze tracking, Appearance-based gaze tracking, Mobile deep learning
Discipline
Databases and Information Systems | Software Engineering
Research Areas
Data Science and Engineering
Publication
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume
5
Issue
4
First Page
1
Last Page
26
ISSN
2474-9567
Identifier
10.1145/3494999
Publisher
Association for Computing Machinery (ACM)
Citation
HUYNH, Sinh; BALAN, Rajesh Krishna; and KO, JeongGil. iMon: Appearance-based gaze tracking system on mobile devices. (2021). Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 5(4), 1-26.
Available at: https://ink.library.smu.edu.sg/sis_research/6708
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.