Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

4-2023

Abstract

Live and pre-recorded video tutorials are an effective means for teaching physical skills such as cooking or prototyping electronics. A dedicated cameraperson following an instructor’s activities can improve production quality. However, instructors who do not have access to a cameraperson’s help often have to work within the constraints of static cameras. We present Stargazer, a novel approach to assisting tutorial content creation with a camera robot that autonomously tracks regions of interest based on instructor actions to capture dynamic shots. Instructors can adjust Stargazer’s camera behaviors with subtle cues, including gestures and speech, allowing them to fluidly integrate camera control commands into instructional activities. Our user study with six instructors, each teaching a distinct skill, showed that participants could create dynamic tutorial videos with a diverse range of subjects, camera framings, and camera angle combinations using Stargazer.

Keywords

cameras, robots, instructional videos

Discipline

Graphics and Human Computer Interfaces

Research Areas

Information Systems and Management

Publication

CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, April 23-28

First Page

1

Last Page

16

ISBN

978-1-4503-9421-5

Identifier

10.1145/3544548.3580896

Publisher

ACM

City or Country

Hamburg, Germany

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1145/3544548.3580896
