Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

3-2015

Abstract

We explore the use of gesture recognition on a wrist-worn smartwatch as an enabler of an automated eating activity (and diet monitoring) system. We show, using small-scale user studies, how accelerometer and gyroscope data from a smartwatch can be used to accurately separate eating episodes from similar non-eating activities, and to additionally identify the mode of eating (i.e., using a spoon, bare hands, or chopsticks). Additionally, we investigate the feasibility of automatically triggering the smartwatch's camera to capture clear images of the food being consumed, for possible offline analysis to identify what (and how much) the user is eating. Our results show both the promise and challenges of this vision: while opportune moments for capturing such useful images almost always exist in an eating episode, significant further work is needed to both (a) correctly identify the appropriate instant at which the camera should be triggered and (b) reliably identify the type of food via automated analysis of such images.
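
As context for the abstract, the sketch below illustrates one plausible pipeline for classifying wrist-worn IMU windows into eating modes. The 5-second / 50 Hz windowing, the simple statistical features, the class labels, and the random-forest classifier are illustrative assumptions only, not the method used in the paper; synthetic data is used so the snippet runs stand-alone.

```python
# Illustrative sketch, not the paper's method: window smartwatch IMU data,
# extract simple per-axis statistics, and train a generic classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

WINDOW = 250  # assumed 5 s windows at 50 Hz
CLASSES = ["non-eating", "spoon", "bare hands", "chopsticks"]  # assumed labels

def window_features(imu):
    """imu: (WINDOW, 6) array of [ax, ay, az, gx, gy, gz] samples."""
    # Per-axis mean, standard deviation, and mean absolute first difference;
    # a real system would likely add spectral and orientation features.
    return np.concatenate([
        imu.mean(axis=0),
        imu.std(axis=0),
        np.abs(np.diff(imu, axis=0)).mean(axis=0),
    ])

# Synthetic stand-in data (random noise, random labels) for demonstration only.
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.normal(size=(WINDOW, 6))) for _ in range(400)])
y = rng.integers(0, len(CLASSES), size=400)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("Cross-validated accuracy (random data, so roughly chance level):",
      cross_val_score(clf, X, y, cv=5).mean())
```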

Keywords

Automated analysis, Bare-hand, Off-line analysis, Small scale, User study, Wearable computers, Gesture recognition

Discipline

Software Engineering

Research Areas

Software and Cyber-Physical Systems

Publication

2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops): Proceedings, 23-27 March, St. Louis, MO

First Page

585

Last Page

590

ISBN

9781479984251

Identifier

10.1109/PERCOMW.2015.7134103

Publisher

IEEE

City or Country

Piscataway, NJ

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/PERCOMW.2015.7134103
