Publication Type
Journal Article
Version
acceptedVersion
Publication Date
11-2024
Abstract
Eye-tracking technology has gained significant attention in recent years due to its wide range of applications in human-computer interaction, virtual and augmented reality, and wearable health. Traditional RGB camera-based eye-tracking systems often struggle with poor temporal resolution and computational constraints, limiting their effectiveness in capturing rapid eye movements. To address these limitations, we propose EyeTrAES, a novel approach that uses neuromorphic event cameras for high-fidelity tracking of natural pupillary movement, which exhibits significant kinematic variance. A key highlight of EyeTrAES is a novel adaptive windowing/slicing algorithm that ensures that just the right amount of descriptive asynchronous event data accumulates within an event frame, across a wide range of eye movement patterns. EyeTrAES then applies lightweight image processing functions over the accumulated event frames from just a single eye to perform pupil segmentation and tracking (as opposed to gaze-based techniques that require simultaneous tracking of both eyes). We show that these two techniques boost pupil tracking fidelity by over 6%, achieving IoU ≈ 92%, while incurring at least 3x lower latency than competing purely event-based eye-tracking alternatives [38]. We additionally demonstrate that the microscopic pupillary motion captured by EyeTrAES exhibits distinctive variations across individuals and can thus serve as a biometric fingerprint. For robust user authentication, we train a lightweight per-user Random Forest classifier on a novel feature vector of short-term pupillary kinematics, comprising a sliding window of pupil (location, velocity, acceleration) triples. Experimental studies with two different datasets (capturing eye movement across a range of environmental contexts) demonstrate that EyeTrAES-based authentication can simultaneously achieve high authentication accuracy (≈ 0.82) and low processing latency (≈ 12 ms), significantly outperforming multiple state-of-the-art baselines.
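The abstract describes two mechanisms: adaptive accumulation of asynchronous events into frames, and a per-user Random Forest trained on sliding windows of pupil (location, velocity, acceleration) triples. The Python sketch below is a purely hypothetical illustration of those two ideas, not the authors' implementation: the event layout, the fill-ratio stopping rule, the thresholds, the function names (adaptive_slice, kinematic_features), and the synthetic training data are all assumptions.

# Hypothetical sketch only; not the authors' code. Assumes events arrive as an
# (N, 3) NumPy array of (x, y, timestamp) rows from a single-eye event camera.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def adaptive_slice(events, sensor_shape=(480, 640),
                   min_events=1000, max_events=20000, fill_target=0.02):
    """Accumulate events into one binary frame until 'enough' of the sensor
    is active: slow drift yields long windows, fast saccades short ones.
    All thresholds here are illustrative guesses, not EyeTrAES's values."""
    frame = np.zeros(sensor_shape, dtype=np.uint8)
    for i, (x, y, _t) in enumerate(events):
        frame[int(y), int(x)] = 1
        if i + 1 >= max_events or (i + 1 >= min_events
                                   and frame.mean() >= fill_target):
            return frame, events[i + 1:]   # finished frame + leftover events
    return frame, events[:0]

def kinematic_features(centers, win=5):
    """Sliding windows of (location, velocity, acceleration) triples built
    from per-frame pupil centers, flattened into one feature vector each."""
    c = np.asarray(centers, dtype=float)   # (T, 2) pupil centers
    v = np.gradient(c, axis=0)             # per-frame velocity
    a = np.gradient(v, axis=0)             # per-frame acceleration
    triples = np.hstack([c, v, a])         # (T, 6)
    return np.stack([triples[i:i + win].ravel()
                     for i in range(len(triples) - win + 1)])

# Per-user authentication: a lightweight binary Random Forest, trained here on
# synthetic random-walk "pupil tracks" purely so the sketch is runnable.
rng = np.random.default_rng(0)
genuine = kinematic_features(np.cumsum(rng.normal(size=(200, 2)), axis=0))
impostor = kinematic_features(np.cumsum(rng.normal(0.5, 2.0, size=(200, 2)), axis=0))
X = np.vstack([genuine, impostor])
y = np.r_[np.ones(len(genuine)), np.zeros(len(impostor))]
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))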
Keywords
Ubiquitous and mobile computing, Eye tracking, Event cameras, Adaptive event sampling, Authentication
Discipline
Databases and Information Systems | Graphics and Human Computer Interfaces
Research Areas
Software and Cyber-Physical Systems
Areas of Excellence
Digital transformation
Publication
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume
8
Issue
4
First Page
1
Last Page
32
ISSN
2474-9567
Identifier
10.1145/3699745
Publisher
Association for Computing Machinery (ACM)
Citation
SEN, Argha; BANDARA, Panahetipola Mudiyanselage Nuwan; GOKARN, Ila; KANDAPPU, Thivya; and MISRA, Archan.
EyeTrAES: Fine-grained, low-latency eye tracking via adaptive event slicing. (2024). Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 1-32.
Available at: https://ink.library.smu.edu.sg/sis_research/9844
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1145/3699745
Included in
Databases and Information Systems Commons, Graphics and Human Computer Interfaces Commons
Comments
PDF provided by faculty