Principal Investigator's Guide, Chapter 4: Working as a Team: Collaboration Through All Phases of Project Development
Let's assume that you have chosen an evaluator to join your project team. What does the evaluator need from you to achieve a successful evaluation? What should you expect from your evaluator? What roles will each of you play throughout the evaluation process?
Jessica Luke works in the Museology Graduate Program at the University of Washington, Seattle, where she teaches about and studies the ways in which museums can enhance quality of life. Jessica has a Ph.D. in Educational Psychology from the University of Maryland and a Master's degree in Museum Studies from the University of Toronto. She spent 15 years as a learning researcher and professional evaluator at the Institute for Learning Innovation, Annapolis, MD, where she designed and implemented dozens of evaluation studies in art museums, children's museums, science centers, and natural history museums across the country. In particular, her evaluation work has focused on the long-term impact of museum experiences for youth and families, as well as the development of critical thinking skills within the museum experience. Jessica has worked with a multitude of project PIs, clarifying project outcomes and developing strategies for enhanced communication of evaluation process and results. She also has conducted extensive evaluation training through graduate level courses and national and local workshops and seminars.
Working together throughout all phases of your project
In this chapter we walk you through the life of a project as we discuss how Principal Investigators and Evaluators work together at key phases, including:
- Proposal writing
- Project start-up and front-end evaluation
- Formative evaluation
- Summative evaluation
- Dissemination of evaluation results
As we move through these five phases we'll offer a list of PI and evaluator roles and responsibilities gleaned from evaluating dozens of informal science education projects over many years. We'll explore how PIs and evaluators can work together to effectively manage relationships, leverage expertise, set realistic expectations, and ensure effective communication.
Steven Yalowitz is a Principal at Audience Viewpoints Consulting, an evaluation and audience research firm specializing in informal learning environments such as museums, zoos, aquariums, and similar institutions. Prior to starting Audience Viewpoints Consulting, he spent four years as a Senior Researcher at the Institute for Learning Innovation, working on a variety of evaluation and research projects at a wide range of institutions. He earned an M.S. in Experimental Psychology and a Ph.D. in Applied Social Psychology from Colorado State University, and spent seven years as the Audience Research Manager at the Monterey Bay Aquarium in Monterey, California, directing evaluations in the exhibits, marketing, programs, and guest services departments. While Steven's research and evaluation interests are broad, he has particular expertise in attitude and behavior change, visitor satisfaction, cognition and affect, climate change, and bilingual experiences in Spanish and English. He has worked extensively with science centers, natural history museums, art museums, and aquariums and zoos. In addition, he has worked on many technology-based evaluations of high-tech interactives, hands-on exhibits, labels, and web sites.
Sasha Palmquist is a learning sciences researcher and professional evaluator of informal learning experiences. Over the last ten years, Sasha's work has focused on understanding how prior knowledge, interest, engagement, and personal identity shape learning opportunities and experiences in out-of-school and informal learning environments such as natural history museums, science centers, children's museums, and amusement parks. Sasha earned a BA in Psychology from the University of Pennsylvania as well as MS and PhD degrees in Cognitive Psychology from the University of Pittsburgh. As a Senior Research Associate at the Institute for Learning Innovation, she conducted studies that explored the development of scientific reasoning in complex domains including evolution and climate change. She has investigated the impact of children's interest and knowledge on family learning conversations in museums, identified challenges associated with developing and maintaining online communities of practice, and measured the impact of participatory design experiences on middle school students' STEM knowledge, interest, and engagement. Throughout these efforts, Sasha developed strategies for improving communication between researchers and practitioners that supported productive collaboration, facilitated evidence-based design decisions, and informed project outcomes.
Working together during Proposal Writing
Ideally, you should identify an evaluator before you begin writing a proposal to develop your project. When you do, you can work with your evaluator to design a rigorous evaluation plan that serves as a key element of the proposal, is tightly aligned with project activities, and will provide feedback to inform project development and the achievement of targeted project outcomes.
Principal Investigator Responsibilities
- Conceptualize and write the proposal; clearly articulate the project design, goals and outcomes, intended audiences, activities, and implementation plan
- Position the evaluator as a team member from the start; include the evaluator in team meetings and discussions; share iterative drafts of the proposal
- Work collaboratively to review the evaluation plan (i.e., ask questions, ensure the plan will provide the data that the project needs, ensure reporting will happen at key decision points).
Evaluator Responsibilities
- Help the PI to articulate and refine project goals and outcomes and to ensure that they are well-aligned with the project activities (this often takes the form of a logic model)
- Design an evaluation plan that articulates a) questions that will guide the evaluation; b) the approach that will frame the evaluation activities; c) data collection methods that will be used to answer evaluation questions; d) a budget and timeline for the evaluation; e) a plan for working with the project team; and f) a dissemination plan for sharing information with the PI and the public
- Bring knowledge of trends in the field and literature that can help to align the proposal within relevant field(s).
Collaboration during proposal writing
LEAP into Science is a partnership between the Franklin Institute Science Museum (FI) and The Free Library of Philadelphia. The project integrates science content and inquiry into an existing afterschool program at the Library, with three overarching goals: 1) to increase the capacity of afterschool facilitators for science teaching and learning; 2) to increase the capacity of libraries for science teaching and learning; and 3) to understand the ways in which science and literacy can be connected to promote children's learning and family engagement in both subject areas. During the proposal writing phase, FI worked closely with an independent evaluation firm to design a project evaluation plan that was grounded within a relevant theoretical framework. At that time the project was targeting youth, so the evaluators used a framework from the positive youth development literature to conceptualize evaluation measures, which in turn informed the articulation of project outcomes. The evaluators had many phone conversations with FI staff as the proposal took shape, and they reviewed two different drafts of the proposal as it moved into final form. When the project was funded, the evaluators were ready to dive into it because they'd been part of the team from the outset.
Working together during Project Start-up and Front-end Evaluation
As a project begins it's common to hold a kick-off meeting that allows your team to flesh out and adjust project activities and intended deliverables. At this time you and your evaluator must establish expectations and procedures for working together as project development progresses. Will the evaluator attend team meetings? If so, which ones? Will you hold standing meetings to check in on the evaluation? To what extent do you want to play a role in evaluation implementation? What does the evaluator need from you in order to begin work?
Once these questions have been addressed, work often begins on front-end evaluation that guides project development. For example, a front-end study might examine potential participants' understanding of content, current behaviors, and misconceptions.
Principal Investigator Responsibilities
- Ensure that the evaluator has the most up-to-date copy of the project plan (sometimes plans change during the proposal-negotiation process) and any adjustments to goals, objectives, budget, timeline, or staffing
- Establish the team dynamic and articulate communication strategies
- Support the evaluator by encouraging his/her active participation within the project team and by facilitating team buy-in to the evaluation process
- Review the evaluation plan with the evaluator and make any needed adjustments
Evaluator Responsibilities
- Review the evaluation plan and ensure its continued alignment with project activities and outcomes. In coordination with the PI, make any needed adjustments
- Develop a detailed work plan for the first phase of the evaluation that specifies when evaluation activities will occur, who will be involved, and how results will be communicated
- Encourage decision making that is grounded in data or best practices, research and evaluation literature, and/or relevant projects that have emerged during the time between proposal and award letter/start date
- Identify any assumptions being made by the project team, and encourage discussion about their implications for the project
- Confirm data sources and availability
Collaboration during front-end evaluation
To illustrate how PIs and evaluators can work together during the front-end phase, consider The YardMap Network: Social Networking for Community Science (DRL 0917487), led by the Cornell Lab of Ornithology. This citizen science project is designed to cultivate a richer understanding of bird habitat for both professional scientists and people concerned with their local environments (www.yardmap.org). Participants can map their yard or other public spaces, showing buildings, yards, and landscaped areas, and can indicate tree and plant species as well as bird-friendly features. For this project the evaluators conducted front-end evaluation to determine the extent to which the intended audiences of birders and gardeners were interested in the idea of YardMap and to provide specific feedback to inform both project design and content. As a result, some of the team's assumptions about the project were confirmed, while other findings showed that planned approaches needed to be modified to fully motivate participants. Many design decisions about website appearance and functions were also directly informed by the front-end findings, including an online tutorial, appearance of the main page, instructions, and options for drawing yards.
The front-end study was a collaboration between the evaluators, PI, and the rest of the project team, ensuring that the results would be useful for all. For example, the original project plan included a front-end web survey with a sample of 300 to 400 participants. However, the PI saw an opportunity to use experimental methods to assess whether references to climate change influenced individuals' interest in participating in YardMap. This approach required significantly customizing the web survey and increasing the sample to more than 3,000 participants. The experiment was successful, and the PI and evaluators co-authored a journal article using the results.
Working together during Formative Evaluation
Formative evaluation focuses on ways of improving and enhancing projects. In this phase you and your evaluator need to work hand-in-hand as project components and/or activities are developed, tested, implemented and reflected upon. The cycle of development and evaluation may be repeated several times as you make refinements.
Principal Investigator Responsibilities
- Develop products or program elements and clearly identify what needs to be learned about how they are received (i.e., does the intended audience understand the main ideas? Do the participants find the activities engaging? Do the products or elements function as intended?)
- Create forums for the evaluator and project developers to work together with shared purpose
- Identify internal evaluation expertise that might be leveraged for the project
- Clarify where building institutional capacity is a priority so the evaluator can work to train staff in data collection.
Evaluator Responsibilities
- Be responsive to the PI's needs relative to what they want to know and what they want to test, with whom, and when
- Engage in an iterative testing process that provides feedback to the PI in a timely and useful manner
- Support data interpretation and provide broader context for results as needed.
Collaboration during formative evaluation
Cosmic Serpent serves as a useful example of how PIs and evaluators can work together during the formative phase. This collaborative research project funded by the National Science Foundation (DRL-0714631 and DRL-0714629) focuses on building respectful, sustainable relationships between science museums and Native communities and/or tribal museums. Cosmic Serpent aimed to support museum practitioners in connecting Native and Western science learning in informal settings; creating awareness of the value and integrity of Native science paradigms among museum practitioners; and nurturing a process of "collaborating with integrity" that embraces and values multiple worldviews. The primary components of the project were a series of intensive weeklong professional development workshops, a culminating conference, and a legacy document that shared project outcomes and lessons learned with the broader field. A joint evaluation approach was used to model the type of cross-cultural collaboration that the project itself was designed to support.
At the heart of a Native worldview is relationship, which served as the guiding principle of Cosmic Serpent. To create a balance of multiple perspectives and to ensure validity of the data, evaluators needed to understand the importance of their own relationship to the community being served by the project and to the project being evaluated. Therefore, the evaluation team participated in almost all project activities and became an integral part of the project community, gathering feedback from project participants (Fellows), reflecting the community voice back into the planning process, and sharing insights and processes as participant-observers. The success of Cosmic Serpent's core team collaboration also depended upon building relationships through presence, participation, openness, and trust. To support this process, PIs, advisors, and evaluators created more in-person meetings than originally anticipated. However, these meetings were critical to the success of the project because they enabled different cultural worldviews to be expressed both through verbal and non-verbal communication strategies. Through this ongoing process of relationship building and reflection on the project's pathway, the evaluation team was able to share their learning with the PI team in a deep and integrated way, while the PI team was able to share their views on the project goals and objectives and why implementation needed to happen in specific ways. Key lessons learned included the need for immersive, participatory experiences to engage Fellows in Native paradigms; allowing time and space for the emotional aspects of working across worldviews; and providing Fellows with examples to inspire their own work, particularly examples that could be integrated into existing programs and exhibits.
Working together during Summative Evaluation
Summative evaluation determines a project's overall effectiveness. Here again, you and your evaluator must work together closely to clarify evaluation methods and measures, and to ensure that the resulting data will be useful to the team and the informal STEM education field. Key roles and responsibilities during the summative evaluation phase include:
Principal Investigator Responsibilities
Ensure that the definition of success is clearly articulated:
- Clarify targeted outcomes and match them with intended audiences
- Communicate with funder if targeted project outcomes have significantly shifted as a result of logistics, project management, staffing turnover, or front-end and formative evaluation findings
Ensure that during summative data collection, project activities are as consistent as possible:
- This is not the best time to try something new and explore the impact that it might have on the participant experience; ideally, such modifications should be explored during formative evaluation
Evaluator Responsibilities
- Check that the summative evaluation will answer the questions put forth in the proposal and promised to the funder; carefully revise questions as necessary
- Determine whether the evaluation methods will answer the evaluation questions
- Develop items and instruments that align with appropriate methods; before data collection, make sure that results provided by the instruments will answer the evaluation questions
- Allow time for PI to review instruments
- Establish reasonable data collection timeline
- Conduct the study or studies
Collaboration during summative evaluation
Life Changes was a collaborative education and research effort funded by the Advancing Informal STEM Learning (AISL) program at the National Science Foundation (DRL 0540152). This project explored whether learning experiences in a museum exhibition could productively address the lack of basic understanding of the biology of evolution and the challenges of teaching this complex topic. Exhibit designers from the New York Hall of Science, Miami Science Museum, and North Museum of Natural History & Science worked closely with researchers from the University of Michigan to produce exhibit components that introduced five basic evolutionary concepts: variation, inheritance, selection, time, and adaptation (VISTA). Based on these concepts, the Life Changes team developed a 1,500 square foot traveling exhibition called Charlie and Kiwi's Evolutionary Adventure.
To successfully complete both the planned learning research and the summative evaluation, careful coordination was required around the development of instruments, timing of data collection, and decisions about analysis and data interpretation. The team created a complementary research and evaluation design that supported an investigation of the impact of the exhibition experience on young children's understanding of basic evolutionary concepts. Summative evaluation determined that children's basic evolutionary thinking and reasoning were influenced by exposure to VISTA concepts in a museum context. Following their experiences in the exhibition, children were more aware that species can change over time, that dinosaurs and birds are related, and that these relationships have evolutionary explanations.
Evaluation Reporting and Dissemination
As the project wraps up, you and your evaluator should ensure that project stakeholders have access to the evaluation findings and that the results are shared with the field as appropriate. Key roles and responsibilities during the reporting phase include:
Principal Investigator Responsibilities
- Monitor funder reporting guidelines and coordinate the collection of necessary information from project team members to be included in funder reports
- Create annual reports and fill in monitoring system data forms
- Create and manage online project identity
- Ensure project representation at PI summit meetings
Evaluator Responsibilities
- Provide necessary data to support completion of funder reporting requirements
- Review summaries of evaluation results included in annual reports, monitoring systems, and online project pages for accuracy
- With final permission from the PI, post summative evaluation reports to informalscience.org
- Work with PI and other project staff to target webinars, conferences, and publications that would be appropriate mechanisms for sharing evaluation findings.
Collaboration during reporting and dissemination
Asteroids! is a project of the National Center for Interactive Learning at the Space Science Institute, funded by the Informal Science Education program at the National Science Foundation (DRL 0813528). This multi-faceted informal STEM education initiative encourages public engagement and understanding of the dynamic structure of the solar system through investigations of asteroids, comets, and meteors. The centerpiece of this project was the development of the traveling exhibition Great Balls of Fire. The evaluators worked with the project team to provide front-end, formative, and summative evaluation as well as original research associated with the exhibition design and development process. Each year the evaluators provided evaluation reports and data summaries ready for submission to the National Science Foundation and coordinated updates to the project logic model that reflected ongoing refinement of project activities and outcomes.
This project generated data and results that have been used by a range of project staff as well as the evaluation team to support presentations at annual meetings of the Association of Science-Technology Centers and the Astronomical Society of the Pacific. In addition, lessons learned from this project about the impact and implications of incorporating youth in the exhibition design development process motivated a week-long online discussion forum hosted by ASTC Connect. More than 100 participants signed up for the forum, which was coordinated by Asteroids project advisors, staff, and evaluators and featured contributors from other projects focused on supporting positive youth development and increasing youth engagement with STEM learning.
We have walked you through one approach to making sense of the life of a project, but it's important to realize that one size does not fit all when it comes to navigating the evaluation process. In reality, projects are more complex than the linear process described in this chapter. The most effective way to ensure a productive working relationship is to establish expectations early, deliver on them to the best of your ability, and build regular communication mechanisms that allow the team to respond and adapt to changing project needs.