Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

3-2023

Abstract

Hand redirection is effective as long as the introduced offsets are not noticeably disruptive to users. In this work we investigate the use of physiological and interaction data to detect movement discrepancies between a user's real and virtual hand, working towards a novel approach for identifying discrepancies that are too large and therefore noticeable. We ran a study with 22 participants, collecting EEG, ECG, EDA, RSP, and interaction data. Our results suggest that EEG and interaction data can be reliably used to detect visuo-motor discrepancies, whereas ECG and RSP seem to suffer from inconsistencies. Our findings also show that participants quickly adapt to large discrepancies, and that they constantly attempt to establish a stable mental model of their environment. Together, these findings suggest that there is no absolute threshold for non-detectable discrepancies; instead, the threshold depends primarily on participants' most recent experience with this kind of interaction.

Keywords

Detection Thresholds, Hand Redirection, Physiological Data, Virtual Reality

Discipline

Graphics and Human Computer Interfaces

Research Areas

Information Systems and Management

Publication

Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Shanghai, China, March 25-29

First Page

194

Last Page

204

ISBN

9798350348156

Identifier

10.1109/VR55154.2023.00035

Publisher

IEEE

City or Country

Shanghai, China

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/VR55154.2023.00035
