Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

1-2016

Abstract

We espouse a vision of small data-based immersive retail analytics, where a combination of sensor data, from personal wearable devices and store-deployed sensors & IoT devices, is used to create real-time, individualized services for in-store shoppers. Key challenges include (a) appropriate joint mining of sensor & wearable data to capture a shopper’s product-level interactions, and (b) judicious triggering of power-hungry wearable sensors (e.g., camera) to capture only relevant portions of a shopper’s in-store activities. To explore the feasibility of our vision, we conducted experiments with 5 smartwatch-wearing users who interacted with objects placed on cupboard racks in our lab (to crudely mimic corresponding grocery store interactions). Initial results show significant promise: 94% accuracy in identifying an item-picking gesture, 85% accuracy in identifying the shelf-location from where the item was picked, and 61% accuracy in identifying the exact item picked (via analysis of the smartwatch camera data).
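The sketch below illustrates the triggering idea described in the abstract: monitor smartwatch accelerometer data continuously and invoke the power-hungry camera only when a pick-like wrist gesture is detected. This is a minimal, hypothetical illustration, not the authors' actual pipeline; the window size, threshold, and function names are assumptions, and a real system would use a trained gesture classifier rather than a fixed energy threshold.

```python
import numpy as np

WINDOW = 50              # samples per window (assumed ~1 s at 50 Hz)
ENERGY_THRESHOLD = 2.5   # illustrative threshold on wrist-motion energy

def window_features(acc):
    """Per-axis mean and standard deviation for one accelerometer window.

    acc: array of shape (WINDOW, 3) holding x/y/z smartwatch samples.
    """
    return np.concatenate([acc.mean(axis=0), acc.std(axis=0)])

def looks_like_pick(acc_window):
    """Crude energy-based test for an item-picking gesture."""
    feats = window_features(acc_window)
    motion_energy = feats[3:].sum()   # sum of per-axis std deviations
    return motion_energy > ENERGY_THRESHOLD

def process_stream(acc_stream, capture_image):
    """Slide a window over the accelerometer stream and trigger the
    camera only for windows that look like a picking gesture."""
    for start in range(0, len(acc_stream) - WINDOW + 1, WINDOW):
        if looks_like_pick(acc_stream[start:start + WINDOW]):
            capture_image()           # capture only the relevant moments

if __name__ == "__main__":
    # Synthetic demo: quiet wrist, then a burst of motion mimicking a pick.
    rng = np.random.default_rng(0)
    quiet = rng.normal(0.0, 0.2, size=(200, 3))
    pick = rng.normal(0.0, 2.0, size=(50, 3))
    stream = np.vstack([quiet, pick, quiet])
    process_stream(stream, capture_image=lambda: print("camera triggered"))
```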

Keywords

Accelerometers, Cameras, Object recognition, Image recognition, Performance evaluation, Real-time systems, Data mining

Discipline

Computer and Systems Architecture | Databases and Information Systems

Research Areas

Software and Cyber-Physical Systems

Publication

2016 8th International Conference on Communication Systems and Networks: COMSNETS 2016, Bangalore, India, January 5-10 [COMSNETS Workshop: Wild and Crazy Ideas on the interplay between IoT and Big Data WACI]

First Page

7439946-1

Last Page

6

ISBN

9781467396226

Identifier

10.1109/COMSNETS.2016.7439946

Publisher

IEEE

City or Country

Piscataway, NJ

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/COMSNETS.2016.7439946
