Publication Type

Journal Article

Version

publishedVersion

Publication Date

10-2023

Abstract

Room-scale VR has been considered an alternative to physical office workspaces. For office activities, users frequently require planar input methods, such as typing or handwriting, to quickly record annotations on virtual content. However, current off-the-shelf VR HMD setups rely on mid-air interactions, which can cause arm fatigue and decrease input accuracy. To address this issue, we propose UbiSurface, a robotic touch surface that can automatically reposition itself to physically present a virtual planar input surface (VR whiteboard, VR canvas, etc.) to users and allow them to achieve accurate, fatigue-free input while walking around a virtual room. We design and implement a prototype of UbiSurface that can dynamically change a canvas-sized touch surface's position, height, and pitch and yaw angles to adapt to virtual surfaces spatially arranged at various locations and angles around a virtual room. We then conduct studies to validate its technical performance and examine how UbiSurface facilitates the user's primary mid-air planar interactions, such as painting and writing, in a room-scale VR setup. Our results indicate that the system reduces arm fatigue and increases input accuracy, especially for writing tasks. We then discuss the potential benefits and challenges of robotic touch devices for future room-scale VR setups.
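
The abstract describes adapting the robot's position, height, pitch, and yaw to virtual planar surfaces placed at arbitrary locations and angles around the room. The paper itself does not publish code here; the following is only a minimal, hypothetical Python sketch of how such a plane-to-actuation mapping could look, with all function and parameter names (plane_to_robot_pose, standoff, etc.) being assumptions for illustration rather than the authors' implementation.

    # Hypothetical sketch (not from the paper): decompose a virtual plane's pose
    # into robot base position, surface height, and pitch/yaw angles.
    import math

    def plane_to_robot_pose(center, normal, standoff=0.5):
        """Map a virtual plane (center point, unit normal) to actuation parameters.

        center   -- (x, y, z) of the virtual surface's center, in metres
        normal   -- (nx, ny, nz) unit normal pointing toward the user
        standoff -- assumed horizontal offset of the robot base behind the surface
        """
        cx, cy, cz = center
        nx, ny, nz = normal

        # Yaw: heading of the surface normal projected onto the floor plane.
        yaw = math.atan2(ny, nx)

        # Pitch: tilt away from vertical (0 = upright whiteboard, pi/2 = flat canvas).
        pitch = math.asin(max(-1.0, min(1.0, nz)))

        # Base position: back off from the surface center along the floor-projected
        # normal so the mechanism sits behind the presented surface.
        horiz = math.hypot(nx, ny) or 1.0
        base_x = cx - standoff * (nx / horiz)
        base_y = cy - standoff * (ny / horiz)

        return {"base_xy": (base_x, base_y), "height": cz, "yaw": yaw, "pitch": pitch}

    # Example: a canvas 1 m up, tilted 30 degrees back toward the user.
    print(plane_to_robot_pose((2.0, 1.5, 1.0), (0.866, 0.0, 0.5)))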

Keywords

distributed encountered-type haptics, haptics, inflatable, mobile robots, virtual reality

Discipline

Graphics and Human Computer Interfaces | Software Engineering

Publication

Proceedings of the ACM on Human-Computer Interaction

Volume

7

First Page

376

Last Page

397

ISSN

2573-0142

Identifier

10.1145/3626479

Publisher

Association for Computing Machinery (ACM)

Copyright Owner and License

Authors-CC-BY

Additional URL

https://doi.org/10.1145/3626479
