Publication Type

Journal Article

Version

acceptedVersion

Publication Date

10-2020

Abstract

Maintaining a food journal allows an individual to monitor eating habits, including unhealthy eating sessions, food items causing severe reactions, or portion-size information. However, manually maintaining a food journal can be burdensome. In this paper, we explore the vision of a pervasive, automated, and completely unobtrusive food journaling system using a commodity smartwatch. We present a prototype system, Annapurna, which is composed of three key components: (a) a smartwatch-based gesture recognizer that can robustly identify eating-specific gestures occurring anywhere, (b) a smartwatch-based image captor that obtains a small set of relevant images (containing views of the food being consumed) with low energy overhead, and (c) a server-based image filtering engine that removes irrelevant uploaded images. Through lessons learnt from multiple user studies, we progressively refine Annapurna and show that our vision is indeed achievable: Annapurna can identify eating episodes and capture food images (spanning a very wide diversity of food content, eating styles, and environments) in over 95% of all free-living eating episodes.
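To make the gesture-recognition component concrete, below is a minimal sketch of a sliding-window IMU classifier of the general kind the abstract describes. The window size, step, feature set, and scoring heuristic are all illustrative assumptions; the paper's actual recognizer, features, and thresholds are not reproduced here.

```python
# Hypothetical sketch of sliding-window eating-gesture detection over
# smartwatch accelerometer data. All parameters are assumed values,
# not those used by Annapurna.
import numpy as np

WINDOW = 128  # assumed samples per window (~2.5 s at 50 Hz)
STEP = 64     # assumed 50% overlap between consecutive windows

def features(accel: np.ndarray) -> np.ndarray:
    """Per-axis mean and standard deviation of a (WINDOW, 3) accelerometer window."""
    return np.concatenate([accel.mean(axis=0), accel.std(axis=0)])

def detect_eating_windows(stream: np.ndarray, score_fn) -> list[int]:
    """Slide over an (N, 3) accelerometer stream and return the start
    indices of windows whose classifier score exceeds 0.5."""
    hits = []
    for start in range(0, len(stream) - WINDOW + 1, STEP):
        if score_fn(features(stream[start:start + WINDOW])) > 0.5:
            hits.append(start)
    return hits

if __name__ == "__main__":
    # Stand-in scorer on synthetic data; a trained model would go here.
    rng = np.random.default_rng(0)
    stream = rng.normal(size=(1000, 3))
    toy_score = lambda f: float(f[3:].mean())  # toy heuristic on the std features
    print(detect_eating_windows(stream, toy_score))
```

In a real deployment, the detected windows would be aggregated into eating episodes, which in turn could trigger the camera-based image captor the abstract describes.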

Keywords

Wearable sensing, Mobile computing, Food journaling, Automated eating tracking system, IMU and camera data processing

Discipline

Databases and Information Systems

Research Areas

Software and Cyber-Physical Systems

Publication

Pervasive and Mobile Computing

Volume

68

First Page

1

Last Page

19

ISSN

1574-1192

Identifier

10.1016/j.pmcj.2020.101259

Publisher

Elsevier

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1016/j.pmcj.2020.101259
