Abstract
For blind and low-vision individuals, traveling independently can be a challenging endeavor. Current technology and infrastructure make this task far more feasible than ever before, especially tools such as voice-guided, GPS-based wayfinding apps on a person's own smartphone. Still, gaps remain in GPS-deprived environments such as indoor spaces. We propose a lightweight wayfinding system based on computer vision and inertial sensors, which requires only a 2D floor plan and a few snapshots of signs or other visual landmarks around the area. We present a method of indoor localization that determines a person's location in the environment, to be used as part of a smartphone navigation app that provides turn-by-turn directions.
https://www.ski.org/users/ryan-crabb