Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

5-2020

Abstract

Camera drones, a rapidly emerging technology, offer people the ability to remotely inspect an environment with a high degree of mobility and agility. However, manual remote piloting of a drone is prone to errors. In contrast, autopilot systems can require a significant degree of environmental knowledge and are not necessarily designed to support flexible visual inspections. Inspired by camera manipulation techniques in interactive graphics, we designed StarHopper, a novel touch screen interface for efficient object-centric camera drone navigation, in which a user directly specifies the navigation of a drone camera relative to an object of interest. The system relies on minimal environmental information and combines both manual and automated control mechanisms to give users the freedom to remotely explore an environment with efficiency and accuracy. A lab study shows that StarHopper offers an efficiency gain of 35.4% over manual piloting, complemented by an overall user preference towards our object-centric navigation system.

Keywords

Human-centered computing, Human-Computer Interaction (HCI), Interaction Techniques

Discipline

Graphics and Human Computer Interfaces

Research Areas

Information Systems and Management

Publication

Proceedings of Graphics Interface 2020, Toronto, Canada, May 28-29

First Page

317

Last Page

326

Identifier

10.20380/GI2020.32

Publisher

Canadian Human-Computer Communications Society

City or Country

Mississauga, Canada

Additional URL

https://doi.org/10.20380/GI2020.32
