Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

10-2025

Abstract

We present EmoShortcuts, a novel social Mixed Reality (MR) framework that enhances emotional expression by dynamically augmenting avatar body gestures to reflect users’ emotional states. While social MR enables immersive remote interactions through avatars, conveying emotions remains challenging due to limitations in head-mounted display (HMD) tracking (e.g., missing lower-body movements, such as stomping or defensive postures) and users’ tendency to deprioritize nonverbal expressions during multitasking. EmoShortcuts addresses these challenges by introducing an augmentation framework that generates expressive body gestures even when users’ physical movements are restricted. We conducted a formative study with 12 participants to identify key challenges in emotional expression and explore user preferences for AI-assisted gesture augmentation. Based on these insights, we designed an interface that enables adaptive gesture augmentation, allowing for both preset and real-time user control. Through an extensive user study (n = 16), our findings show that EmoShortcuts significantly improves emotion expression accuracy and controllability while reducing workload, demonstrating its potential for more immersive and emotionally rich virtual communication.

Keywords

Social Mixed Reality, Virtual avatar, Emotion, Body gesture

Discipline

Graphics and Human Computer Interfaces | Software Engineering

Research Areas

Software and Cyber-Physical Systems

Areas of Excellence

Sustainability

Publication

UIST '25: Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology, Busan, Korea, September 28 - October 1

First Page

1

Last Page

16

Identifier

10.1145/3746059.3747656

Publisher

ACM

City or Country

New York

Additional URL

https://dl.acm.org/doi/full/10.1145/3746059.3747656