Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

6-2024

Abstract

Virtual reality (VR) offers new opportunities for presenters to use expressive body language to engage their audience. Yet, most VR presentation systems have adopted control mechanisms that mimic those found in face-to-face presentation systems. We explore the use of gestures that serve a dual purpose: first, for the audience, a communicative purpose; second, for the presenter, a control purpose to alter content in slides. To support presenters, we provide guidance on which gestures are available and what their effects are. We realize our design approach in JollyGesture, a VR technology probe that recognizes dual-purpose gestures in a presentation scenario. We evaluate our approach through a design study with 12 participants who, in addition to using JollyGesture to deliver a mock presentation, were asked to imagine gestures with the same communicative and control purposes, both before and after being exposed to our probe. The study revealed several new design avenues valuable for VR presentation system design: expressive and coarse-grained communicative gestures, as well as subtle and hidden gestures intended for system control. Our work suggests that future VR presentation systems that embrace expressive body language will face design tensions relating to task loading and authenticity.

Keywords

Gestural input, Virtual Reality, Presentation

Discipline

Graphics and Human Computer Interfaces

Research Areas

Information Systems and Management

Areas of Excellence

Digital transformation

Publication

Proceedings of Graphics Interface 2024, Halifax, Nova Scotia, Canada, June 3-6

First Page

1

Last Page

14

Publisher

ACM

City or Country

New York

Copyright Owner and License

Authors

Additional URL

https://openreview.net/forum?id=KVUIRX0ExL
