MapIO: a Gestural and Conversational Interface for Tactile Maps

A participant uses MapIO to interact with a TMAP; the Bluetooth buttons sit to the right of the TMAP.

For individuals who are blind or have low vision, tactile maps provide essential spatial information, but they are limited in how much information they can convey. Digitally augmented tactile maps extend these capabilities with audio feedback, combining the map's tactile representation with spoken descriptions of the touched elements. In this context, we explore an embodied interaction paradigm that augments tactile maps with conversational interaction based on Large Language Models, enabling users to obtain answers to arbitrary questions about the map. We analyze the types of questions users are interested in asking, engineer the Large Language Model's prompt to provide reliable answers, and evaluate the resulting system with 10 participants, examining how users interact with it, its usability, and the overall user experience.
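To give a flavor of the prompt-engineering step, here is a minimal, hypothetical sketch (not the authors' actual implementation) of how an LLM prompt can be grounded in the map's content so that answers stay restricted to what the map actually shows. The element names and fields are invented for illustration.

```python
# Illustrative sketch: ground an LLM prompt in the tactile map's elements
# so the model answers only from map content. All map data below is
# hypothetical, not taken from the MapIO system.

map_elements = [
    {"name": "Main Street", "kind": "street", "orientation": "east-west"},
    {"name": "City Library", "kind": "building", "on": "Main Street"},
    {"name": "Oak Park", "kind": "park", "location": "north of Main Street"},
]

def build_prompt(elements, question):
    """Assemble a prompt that lists the map's elements and instructs the
    model to refuse questions whose answers are not on the map."""
    lines = [
        "You are an assistant for a tactile map.",
        "Answer only using the map elements listed below.",
        "If the answer is not on the map, say so.",
        "",
        "Map elements:",
    ]
    for e in elements:
        attrs = ", ".join(f"{k}: {v}" for k, v in e.items() if k != "name")
        lines.append(f"- {e['name']} ({attrs})")
    lines.append("")
    lines.append(f"Question: {question}")
    return "\n".join(lines)

prompt = build_prompt(map_elements, "Which buildings are on Main Street?")
print(prompt)
```

The resulting string would then be sent to the LLM as the conversation context; constraining the model to the listed elements is one simple way to pursue the reliability the paper targets.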

Paper under review: "MapIO: Embodied Interaction for the Accessibility of Tactile Maps Through Augmented Touch Exploration and Conversation," Matteo Manzoni, Sergio Mascetti, Dragan Ahmetovic, Ryan Crabb, James M. Coughlan. (Link to arXiv version here.)

Link to video demo here.