Publication Type

Conference Proceeding Article

Version

Submitted Version

Publication Date

3-2016

Abstract

We describe our vision of an environment of multiple mobile and wearable devices and share our initial exploration of this vision through multi-wrist gesture recognition. We explore how multi-device input and output might look, giving four scenarios of everyday multi-device use that highlight the technical challenges to be addressed. We describe our system, which allows recognition to be distributed across multiple devices, fusing recognition streams on a resource-rich device (e.g., a mobile phone). An Interactor layer recognizes common gestures from the fusion engine and provides abstract input streams (e.g., scrolling and zooming) to user interface components called Midgets. These take advantage of multi-device input and output and are designed to simplify the implementation of multi-device gestural applications. Our initial exploration of multi-device gestures led us to design a modified pipelined HMM with early elimination of candidate gestures, which recognizes a gesture in under 0.2 milliseconds and scales well to large numbers of gestures. Finally, we discuss the open problems in multi-device interaction and our research directions.
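
The abstract summarizes, but does not show, the early-elimination idea behind the modified pipelined HMM. The following is a minimal, hypothetical sketch (not the authors' implementation) of how a set of per-gesture HMMs can be scored incrementally on a stream of quantized sensor frames and pruned as soon as they fall far behind the best-scoring candidate; the class names, the beam threshold, and the randomly generated models are illustrative assumptions.

import numpy as np

class GestureHMM:
    """One candidate gesture, modeled as a discrete-observation HMM and
    scored incrementally with the (scaled) forward algorithm."""

    def __init__(self, name, pi, A, B):
        self.name = name
        self.pi = pi            # (S,) initial state distribution
        self.A = A              # (S, S) state transition matrix
        self.B = B              # (S, V) emission matrix over quantized frames
        self.alpha = None       # scaled forward vector
        self.log_lik = 0.0      # accumulated log-likelihood of the stream

    def step(self, obs):
        """Consume one quantized sensor frame; return the running log-likelihood."""
        if self.alpha is None:
            a = self.pi * self.B[:, obs]
        else:
            a = (self.alpha @ self.A) * self.B[:, obs]
        norm = a.sum() + 1e-300          # scaling factor, guards against underflow
        self.log_lik += np.log(norm)
        self.alpha = a / norm
        return self.log_lik


def recognize(stream, models, beam=25.0):
    """Pipelined scoring with early elimination: after each frame, drop any
    candidate whose log-likelihood trails the current best by more than `beam`."""
    active = list(models)
    for obs in stream:
        scores = [m.step(obs) for m in active]
        best = max(scores)
        active = [m for m, s in zip(active, scores) if s >= best - beam]
        if len(active) == 1:             # decided before the gesture ends
            return active[0].name
    return max(active, key=lambda m: m.log_lik).name


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def random_hmm(name, states=4, symbols=16):
        # Hypothetical stand-in for a trained per-gesture model.
        A = rng.dirichlet(np.ones(states), size=states)
        B = rng.dirichlet(np.ones(symbols), size=states)
        return GestureHMM(name, np.full(states, 1.0 / states), A, B)

    models = [random_hmm("gesture_%d" % i) for i in range(8)]
    stream = rng.integers(0, 16, size=40)   # quantized accelerometer/gyro frames
    print(recognize(stream, models))

Because eliminated candidates stop being updated, the per-frame cost shrinks as the gesture unfolds, which is what lets this style of recognizer scale to large gesture vocabularies.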

Keywords

Cellular telephone systems, Ubiquitous computing, User interfaces, Multiple devices, Resource-rich devices, Wearable devices, Gesture recognition

Discipline

Computer Sciences | Software Engineering

Research Areas

Software and Cyber-Physical Systems

Publication

2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops): WristSense Workshop on Sensing Systems and Applications Using Wrist Worn Smart Devices, Sydney, Australia, March 14-18

First Page

1

Last Page

6

ISBN

9781509019403

Identifier

10.1109/PERCOMW.2016.7457168

Publisher

IEEE Computer Society

City or Country

Los Alamitos, CA

Additional URL

http://doi.org/10.1109/PERCOMW.2016.7457168
