Listening to Visitors: What’s Broken at the Museum of Science?

March 1, 2006 | Exhibitions
In 2005, the Exhibit Operations Department at the Museum of Science, Boston became concerned by the number of visitor comment cards citing frustration with broken exhibits. In response, it asked the Research Department to carry out a study of visitors' perspectives on maintenance issues. The Research Department framed the study around three questions:

1. Where is the discrepancy between what visitors and maintenance workers call broken?
2. What factors related to broken exhibits frustrate visitors most?
3. What counts as broken in the eyes of the visitor?

Data were collected between August and December 2005 in five galleries within the Museum: Seeing the Unseen, Investigate!, Messages, Natural Mysteries, and Making Models. These galleries were chosen because, collectively, they represent both older and newer galleries and a range of exhibit types, including hands-on, computer-based, and object-based experiences. A variety of methods were used to collect visitor data, including comment card reports, exit interviews, visitor surveys, timing and tracking maps, and maintenance surveys. Using multiple methods made it possible to examine the issue from many perspectives and to capture the breadth and depth of visitor concerns about broken exhibits.

The data show a discrepancy between visitors and maintenance workers in both the number and the types of exhibits they call broken. In addition, visitors reported more frustration in older galleries, and much of that frustration involved computer-based exhibits. Finally, visitors almost always identified non-functioning exhibits as broken; they were less likely to call partially functional exhibits broken, and they never called exhibits with only cosmetic problems broken.

It is recommended that the Maintenance Department revamp its exhibit maintenance reporting system so that all workers know what the Department considers broken and report these problems in a consistent way. In addition, the Exhibits Department should continue usability testing, with increased emphasis on quality assurance for computer-based exhibits. It should also design exhibits so that chronically broken units can be moved off the Museum floors, and should update exhibit labels as exhibits are altered.

The appendix of this report includes the timing and tracking instruments, interview protocols, and surveys used in the study.

TEAM MEMBERS

  • Liz Professional 2
    Evaluator
    Museum of Science, Boston
  • Museum of Science
    Contributor

Resource Type: Research and Evaluation Instruments | Survey | Interview Protocol | Observation Protocol | Evaluation Reports
Discipline: Education and learning science | General STEM
Audience: General Public | Museum/ISE Professionals | Evaluators
Environment Type: Exhibitions | Museum and Science Center Exhibits