Conference Proceeding Article
We espouse a vision of small-data-based immersive retail analytics, in which a combination of sensor data, from personal wearable devices and store-deployed sensors and IoT devices, is used to create real-time, individualized services for in-store shoppers. Key challenges include (a) appropriate joint mining of sensor and wearable data to capture a shopper's product-level interactions, and (b) judicious triggering of power-hungry wearable sensors (e.g., the camera) to capture only relevant portions of a shopper's in-store activities. To explore the feasibility of our vision, we conducted experiments with 5 smartwatch-wearing users who interacted with objects placed on cupboard racks in our lab (to crudely mimic corresponding grocery-store interactions). Initial results show significant promise: 94% accuracy in identifying an item-picking gesture, 85% accuracy in identifying the shelf location from which the item was picked, and 61% accuracy in identifying the exact item picked (via analysis of the smartwatch camera data).
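The abstract's second challenge, waking a power-hungry sensor only around relevant activity, can be illustrated with a minimal sketch. This is not the paper's method: the sliding-window size, the standard-deviation feature, and the threshold below are illustrative assumptions standing in for whatever gesture detector the authors actually used.

```python
# Hypothetical sketch: wake the smartwatch camera only when an
# accelerometer window suggests an item-picking gesture.
# WINDOW and STD_THRESHOLD are illustrative values, not from the paper.
from statistics import pstdev

WINDOW = 25          # samples per window (e.g., 0.5 s at 50 Hz)
STD_THRESHOLD = 1.2  # std-dev of magnitude (m/s^2) above this => candidate

def magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer sample (x, y, z)."""
    x, y, z = sample
    return (x * x + y * y + z * z) ** 0.5

def is_pick_gesture(window):
    """Flag a window as a candidate item-pick when the accelerometer
    magnitude varies strongly (arm reaching out and retracting)."""
    return pstdev(magnitude(s) for s in window) > STD_THRESHOLD

def camera_triggers(samples):
    """Slide a non-overlapping window over the stream; return the start
    indices of windows that would wake the camera."""
    return [start
            for start in range(0, len(samples) - WINDOW + 1, WINDOW)
            if is_pick_gesture(samples[start:start + WINDOW])]

# Synthetic stream: two quiet windows (gravity only), then one window
# with large alternating arm acceleration.
quiet = [(0.0, 0.0, 9.8)] * (2 * WINDOW)
active = [(10.0, 0.0, 9.8) if i % 2 == 0 else (0.0, 0.0, 9.8)
          for i in range(WINDOW)]
print(camera_triggers(quiet + active))  # -> [50]
```

The same trigger logic would gate the second-stage pipeline described in the abstract: shelf-location inference and camera-based item recognition run only on the flagged windows, keeping the camera asleep the rest of the time.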
Accelerometers, Cameras, Object recognition, Image recognition, Performance evaluation, Real-time systems, Data mining
Computer and Systems Architecture | Databases and Information Systems
Software and Cyber-Physical Systems
2016 8th International Conference on Communication Systems and Networks: COMSNETS 2016, Bangalore, India, January 5-10 [COMSNETS Workshop: Wild and Crazy Ideas on the interplay between IoT and Big Data WACI]
RADHAKRISHNAN, Meera; SEN, Sougata; SUBBARAJU, Vigneshwaran; MISRA, Archan; and BALAN, Rajesh.
IoT+Small Data: Transforming In-Store Shopping Analytics and Services. (2016). 2016 8th International Conference on Communication Systems and Networks: COMSNETS 2016, Bangalore, India, January 5-10 [COMSNETS Workshop: Wild and Crazy Ideas on the interplay between IoT and Big Data WACI]. 7439946-1-6. Research Collection School Of Information Systems.
Available at: http://ink.library.smu.edu.sg/sis_research/3570
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.