
Intelligent Science Exhibits: Transforming Hands-on Exhibits into Mixed-Reality Learning Experiences

September 1, 2016 - August 31, 2018 | Exhibitions
The project will develop and research a new system that bridges the advantages of physical and virtual worlds to improve young children's inquiry-based science learning and engagement in a collaborative way. It will bring innovative technology and successful techniques developed for adaptive tutoring systems into informal learning settings where they have not been applied before, with the goal of increasing engagement, learning, and deep inquiry-based understanding in these environments. Museums and similar informal learning settings offer opportunities for children and families to learn together in an engaging way; however, without learning supports provided by people, signage, or technology, visitors often miss the point of a museum's learning activities. The project will develop a new genre of "intelligent" interactive science exhibits that combine proven intelligent tutoring system approaches with camera-based vision sensing to add a new layer to hands-on museum exhibits. This intelligent layer provides personalized interactive feedback to museum visitors as they experiment with physical objects in the real world.

The project is a collaborative effort led by the Human Computer Interaction Institute at Carnegie Mellon University in partnership with the University of Pittsburgh Learning Research and Development Center, the Children's Museum of Pittsburgh, and the Carnegie Science Center. It is supported by the Advancing Informal STEM Learning (AISL) program, which funds research and innovative resources for use in a variety of settings as part of its overall strategy to enhance learning in informal environments. The project will research whether and how learning principles and adaptive, computer-based technologies that are effective in formal school settings can be made effective in an informal museum experience with hands-on activities, enhancing the learning and engagement of children and parents.
The system will use intelligent camera sensing to track children's interactions in physical and virtual spaces and provide adaptive, personalized feedback through an engaging character. The character guides both children and parents toward productive dialogue, helping shape better parent-child interaction. To investigate this, the project will further develop an innovative mixed-reality system that gives personalized feedback to visitors based on their actions, guiding them to understand the world around them like a scientist. The project will gather data on learner behaviors in mixed-reality experiences in informal settings to inform the design of intelligent science exhibits and to derive design patterns that support key outcomes, including learning, engagement, collaboration, and productive dialogue. It will also research the application of these design patterns across different science content areas.

Funders

NSF
Funding Program: AISL
Award Number: 1612744
Funding Amount: $299,827.00

TEAM MEMBERS

  • Ken Koedinger
    Principal Investigator
    Carnegie Mellon University
  • Scott Hudson
    Co-Principal Investigator
    Carnegie Mellon University
  • Kevin Crowley
    Co-Principal Investigator
    University of Pittsburgh
  • Nesra Yannier
    Contributor
    Carnegie Mellon University
  • Discipline: General STEM
    Audience: Elementary School Children (6-10) | Pre-K Children (0-5) | Families | Parents/Caregivers | General Public
    Environment Type: Exhibitions | Museum and Science Center Exhibits
