Publication Type

Journal Article

Version

publishedVersion

Publication Date

3-2015

Abstract

Low-vision and blind bus riders often rely on known physical landmarks to help locate and verify bus stop locations (e.g., by searching for an expected shelter, bench, or newspaper bin). However, there are currently few, if any, methods to determine this information a priori via computational tools or services. In this article, we introduce and evaluate a new scalable method for collecting bus stop location and landmark descriptions by combining online crowdsourcing and Google Street View (GSV). We conduct and report on three studies: (i) a formative interview study of 18 people with visual impairments to inform the design of our crowdsourcing tool, (ii) a comparative study examining differences between physical bus stop audit data and audits conducted virtually with GSV, and (iii) an online study of 153 crowd workers on Amazon Mechanical Turk to examine the feasibility of crowdsourcing bus stop audits using our custom tool with GSV. Our findings reemphasize the importance of landmarks in nonvisual navigation, demonstrate that GSV is a viable bus stop audit dataset, and show that minimally trained crowd workers can find and identify bus stop landmarks with 82.5% accuracy across 150 bus stop locations (87.3% with simple quality control).

Keywords

Crowdsourcing accessibility, accessible bus stops, Google Street View, Mechanical Turk, low-vision and blind users, remote data collection, bus stop auditing

Discipline

Software Engineering | Transportation

Research Areas

Software and Cyber-Physical Systems

Publication

ACM Transactions on Accessible Computing

Volume

6

Issue

2

First Page

5:1

Last Page

8

ISSN

1936-7228

Identifier

10.1145/2717513

Publisher

Association for Computing Machinery (ACM)

Additional URL

https://doi.org/10.1145/2717513
