About Our Project

Project Summary

TouchTree is an online tool that helps museum practitioners determine the most suitable touch object for a given museum artifact, balancing tactile engagement against the museum’s available resources. TouchTree follows the logic of a decision tree: the user answers a series of questions that narrow down the recommended touch objects for a given artifact.

This project is a continuation of the research led by Lauren Race on touch objects in museums. Touch objects refer to tactile representations of museum artifacts that visitors may touch directly.

Goal of this project: design a decision tree that gives museum practitioners a standardized flow for determining:

  1. When to create a touch object
  2. What type of touch object to create for a given museum artifact

With this flow, museum practitioners can allocate the right amount of time and resources to create touch objects that blind and low vision visitors will find engaging, improving their overall museum experience.

Team Members

  • Syeda Anjum is an Integrated Design and Media graduate student at NYU Tandon School of Engineering.
  • Lauren Race is an Accessibility Designer and Researcher at the NYU Ability Project. TouchTree is an extension of her research paper entitled “Understanding Accessible Interpretation through Touch Object Practices in Museums.”

Resources

For more documentation on how we created TouchTree, please visit our blog.

Update

The prototype was last updated on May 6, 2024, by Ayushi Shah, Yi Chen, and Aditya Garyali, graduate students in Integrated Design and Media at NYU Tandon School of Engineering, under the mentorship of Cheryl Fogle-Hatch, founder of museumsenses.org. For any inquiries about TouchTree, please contact the Ability Project at APLab@nyu.edu.