- Principal Investigator: James Coughlan
This project builds on the CamIO project to provide point-and-tap interactions that allow a user to acquire detailed information about tactile graphics and 3D models.
The interface uses an iPhone’s depth and color cameras to track the user’s hands while they interact with a model. When the user points to a feature of interest on the model with their index finger, the system reads aloud basic information about that feature. For additional information, the user lifts their index finger and taps the feature again. This process can be repeated to access successively deeper levels of information. For instance, tapping once on a region in a tactile map could trigger the name of the region, with subsequent taps eliciting its population, area, climate, etc.

No audio labels are triggered unless the user makes a pointing gesture, so the user can explore the model freely with one or both hands. Multiple taps can be used to skip through information levels quickly, with each tap interrupting the current utterance; this allows users to reach the desired level of information more quickly than listening to all levels in sequence.
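To make the tap logic concrete, below is a minimal Swift sketch of the multi-level behavior described above, assuming a hypothetical `Feature` type and `AudioLabelPlayer` class (neither is the project's actual API; only `AVSpeechSynthesizer` is a real iOS class). The actual app's implementation may differ.

```swift
import AVFoundation

/// Illustrative feature with its stack of information levels,
/// e.g. [name, population, area, climate] for a map region.
struct Feature {
    let name: String
    let infoLevels: [String]
}

/// Hypothetical player for multi-level audio labels.
final class AudioLabelPlayer {
    private let synthesizer = AVSpeechSynthesizer()
    private var currentFeature: Feature?
    private var level = 0

    /// Call this each time a pointing-finger tap on a feature is detected.
    func handleTap(on feature: Feature) {
        guard !feature.infoLevels.isEmpty else { return }
        if feature.name == currentFeature?.name {
            // Repeated tap on the same feature: go one level deeper,
            // clamping at the deepest available level.
            level = min(level + 1, feature.infoLevels.count - 1)
        } else {
            // Tap on a new feature: start at the basic level.
            currentFeature = feature
            level = 0
        }
        // Every tap interrupts the current utterance, so rapid taps
        // skip through the levels without waiting for speech to finish.
        synthesizer.stopSpeaking(at: .immediate)
        synthesizer.speak(AVSpeechUtterance(string: feature.infoLevels[level]))
    }
}
```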
Pilot experiments with sighted as well as blind and visually impaired (BVI) participants, followed by formal experiments with six additional BVI participants, demonstrate the effectiveness of the interface. See a video demonstration here.
A manuscript describing the interface and the experiments has been accepted as a peer-reviewed publication at the upcoming International Conference on Computers Helping People with Special Needs (ICCHP '24): A. Narcisi, H. Shen, D. Ahmetovic, S. Mascetti and J. Coughlan, “Accessible Point-and-Tap Interaction for Acquiring Detailed Information about Tactile Graphics and 3D Models,” July 2024. (Preprint PDF here.)
Moreover, the interface will be released as a free iOS app, allowing people with high-end iOS devices that include depth cameras to create their own tactile graphics and audio labels for them. In this version of the app, we recommend that the tactile graphics be chosen from the following colors in this fluorescent cardstock product: green, yellow, orange, blue and magenta, all against a white background. The app currently supports four pre-defined tactile models (PDFs are here: training, inner solar system, rockets, British Isles). Future releases of the app will allow users to label rigid objects of any color.
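Because the current release relies on a small palette of fluorescent cardstock colors against a white background, a simple hue-based classifier is enough to tell the supported colors apart. The Swift sketch below is purely illustrative: the hue and saturation thresholds are assumptions, not calibrated values from the app.

```swift
import CoreGraphics

/// The five supported cardstock colors listed above.
enum CardstockColor: String, CaseIterable {
    case green, yellow, orange, blue, magenta
}

/// Classify a pixel by hue (degrees, 0-360) and saturation (0-1).
/// Low-saturation pixels are treated as the white background.
/// NOTE: these ranges are illustrative guesses, not the app's values;
/// a real implementation would calibrate them for lighting and camera.
func classify(hue: CGFloat, saturation: CGFloat) -> CardstockColor? {
    guard saturation > 0.35 else { return nil }   // white background
    switch hue {
    case 20..<50:   return .orange
    case 50..<75:   return .yellow
    case 75..<170:  return .green
    case 190..<260: return .blue
    case 280..<340: return .magenta
    default:        return nil
    }
}
```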
If you would like to test a beta version of the app, please email Andrea Narcisi at andrea.narcisi@studenti.unimi.it.