Principal Investigator's Guide, Chapter 6: Reporting and Dissemination: Building in Dissemination from the Start

"What Makes a Great Evaluation Report" was the topic of two workshops organized by the Visitor Studies Association for the 2012 National Science Foundation PI Meeting. The workshops were professional development opportunities, but they doubled as reconnaissance missions in service of this chapter. The workshop organizers assigned spies-oops—we mean scribes—to capture the insights, real world stories, and concrete examples shared during these sessions. Consider this intelligence-gathering to be a form of benign industrial espionage, aimed at unearthing trade secrets for the benefit of our readers. Your co-authors encourage you to borrow heavily from these strategies and recommendations, and even to copy them outright. In the name of dissemination, let's kick off this chapter with a quote overheard during the discussion...

Saskia Traill is the vice president of policy and research at TASC, a New York City-based organization reinventing learning opportunities for STEM and other disciplines. Saskia ensures that TASC's evaluation of more than 60 after-school programs and expanded learning initiatives drives evidence-based policy and practice. Saskia also leads research and policy efforts for TASC's ExpandED Schools, a reinvention of urban public schools that brings together all members of the school and community to expand the day and increase learning options for students, including the integration of formal and informal science learning. She has co-authored articles, policy briefs, and reports on a range of issues, including engaging kids in STEM, how to fund innovative education strategies, and family economics. Saskia has served on the Public Policy Committee of the Society for Research on Adolescence and co-led the CAISE Policy Study Inquiry Group. She received her B.A. from Columbia University and a Ph.D. in research psychology from Stanford University.


The critical thing is not that we succeed, but that we generate useful findings.

That pithy quote struck us as an excellent introduction to this chapter. Projects aimed at advancing the field of informal STEM involve experimentation and innovation. And with innovation comes risk. Whether or not your project goes as planned, it is important to communicate results of the project effectively to people who have the potential to extend, replicate, build on, or learn from your work. This chapter will look at the many ways in which you can use evaluation findings to achieve broader impacts for your project.


Rachel Hellenga develops exhibitions and performs exhibit-specific strategic planning and fundraising. She is a Chicago-based consultant and self-professed "Sam-I-Am" of evaluation, owing to her many positive experiences working with professional evaluators over the course of a twenty-year career in the museum field. She has an insatiable appetite for visitor input, which has been reinforced by the results of integrating evaluation into projects such as the NSF-funded Inventing Lab and Skyline exhibitions at the Chicago Children's Museum, featuring a flying machine tower and construction materials replicated by other museums around the country; and the Science Storms exhibition at the Museum of Science and Industry, winner of the 2011 AAM Excellence in Exhibitions Award and the ASTC 2011 Roy L. Shafer Leading Edge Award. Rachel received her B.A. in psychology from Harvard University, and her particular areas of interest include education research on encouraging persistence; tinkering/making/engineering themes; Reggio-inspired design; bullying prevention; and novel uses of technology in exhibitions.


Getting a useful evaluation report: Tips from your peers

So how do you get an evaluation report you can use? A better question to ask might be, "How do I get an evaluation report everybody can use?" The variety of stakeholders in each project can result in many different possible goals and purposes for your evaluation study, as outlined in more depth in Chapter 2. Your stakeholders might typically include your funders; your internal project team; staff and administrators at your institution; your project's participants or consumers; colleagues in your field; colleagues in tangential fields; and future collaborators.

Creating a dissemination plan early can prompt you to think ahead about the needs of your stakeholders so that when it's time to produce your report, you and your evaluator will have gathered the relevant data and other documentation (e.g., photos, videos, or other visual evidence) to support various reporting formats.

Eavesdropping on discussions about "What Makes a Great Evaluation Report?" turned up several recommendations and strongly held convictions:

Get off to a good start: Discuss the summative report when writing your evaluator contract

As you define the scope of work, specify your expectations for the summative report. This is the time to ask for an Executive Summary and any custom report formats that you might need. Looking at a table of contents from other reports might be helpful to you in preparing for this conversation. For example, you might realize you want to see evaluation instruments included in the appendices. In addition, the Summative Evaluation Report Checklist can serve as the basis for a conversation with your evaluator to make sure there are no surprises about what you need in the report.

The Summative Evaluation Checklist

This checklist can help you plan for the elements that will make your report useful to your peers. Given that an evaluation report starts out as a tool for communication between you and your evaluator, it's possible that information known to your internal team may be omitted from the report unless you explicitly ask for it to be included. When you wrote "educators," did you mean informal or formal educators? Where did the evaluation take place? Don't leave your readers in the dark.

In addition to addressing the summative evaluation report in the scope of work of your evaluator's contract, you'll want to factor it into your schedule. Let your evaluator know that you will expect to see and comment on a draft of the summative report before it is finalized. Look ahead at your timeline for reporting to stakeholders (e.g., board meetings, funder progress reports) and build in time for review and revision. For example, if a funder's final report is due in September, it makes sense to have the final evaluation report in late July or early August, so you can report findings in the final report.

Give special attention to the Executive Summary

An Executive Summary is an important part of a summative report. Many funders may refer primarily to this shorter document and it can be a tool to move the institution forward. Ask your evaluator to include highlights that are powerful: perhaps you experienced success due to a design strategy or faced a challenge that is relevant across the field. You should be confident about making suggestions for the Executive Summary, such as including specific points or describing tie-ins to previous work. Of course, the summary must accurately portray the results! It's not a place to sugarcoat negative findings or issues.

Keep in mind that when you send a final report or Executive Summary to one individual, it may go to others without the supporting documents that you intend to accompany it. For this reason, each portion of a report that you send out must be able to stand on its own. The Executive Summary should be a fair distillation of the research that conveys both the nature of the project and what you discovered.

Shape your summative report with your evaluator

Several workshop participants described the reporting process as a dynamic exchange in which the evaluator and client work together as co-authors. They suggested thinking of reporting in two phases: internal, then external. You have an important role to play in the interpretation and presentation of the data, so don't file draft reports away for later! It's important to build in time for your internal team to review drafts and offer prompt feedback to the evaluator.

Including the project team's perspective when reporting results

One PI related that staff in her institution originally saw the summative report as "test results," like getting a grade. It didn't occur to them that they could discuss the ideas and findings that would be presented. Her message: have the confidence to engage in respectful dialogue with your evaluator about what will go in the report. For example, the Children's Discovery Museum of San Jose's "Secrets of Circles" exhibit team first met with their evaluator to review the summative report in a fairly raw format. The evaluator learned what the team found to be the most exciting, surprising, and meaningful results, and emphasized these in the formal report. The staff reviewed the report and Executive Summary and made suggestions and comments before distributing it more widely.

Don't whitewash results

Taking a joint-authorship approach to the final report can offer many benefits, but it is important to note that your goal is to have an accurate and helpful report that is supported by data. Some evaluators have had to defend the wording of a report in the face of a client or funder who wanted to rewrite the summary with a more positive spin. In fact, the American Journal of Evaluation is conducting a study of this issue, including tips for avoiding misrepresentation of findings. This is just a reminder that innovation involves risk and possible failure, and all project outcomes are valid and worth reporting accurately. Trying to make the report more useful is different from trying to make the findings sound better than they were.

Paint a vivid picture

Several workshop members noted the value of capturing and reporting findings such as unintended outcomes that do not fit neatly into the original program logic model. Regardless of the methodology that was used for your evaluation, many PIs noted the importance of including qualitative descriptions of your project. Taking the time to describe the context and share impressions as part of the report can help paint a vivid picture and lead to additional insights for the project team and peers in the field.

Capturing unexpected outcomes

The evaluation of a calculus exhibition at the Science Museum of Minnesota revealed that the exhibition was a powerful evoker of memory for visitors who had studied math. The team had not articulated an intended outcome related to prompting positive memories of math, but the inclusion of qualitative evaluation methods and a flexible approach to reporting allowed them to uncover and document this unexpected outcome.

Creating a "highlights" document

In another example, the Children's Discovery Museum of San Jose invited members of the Vietnamese community to visit the museum as a group and to share their feedback on an exhibition. The format of data collection was not consistent with the larger study, so the results were called out separately. This component of the evaluation turned out to be the most valuable to the team and to the field. A report on this work was distributed widely in the form of a "highlights document," a polished presentation featuring graphic inserts calling out implications and direct quotes.

Key stakeholders and how to reach them

Now that we've shared the most urgent recommendations and strongly held convictions from your peers, we want to come back to the question of who will receive your evaluation results and how you will get this information out to them. Let's explore some of the key stakeholder categories and some strategies for reaching them.

Funders

Public and private funders represent a primary audience for evaluation findings, so you should know exactly what they require or expect before you or your evaluator generate reports. Some funders are hands-on when it comes to evaluation, for example, helping craft the right research questions or offering ideas for selecting an appropriate evaluator. Hands-on funders will stay engaged in a conversation about your evaluation, so you are less likely to be surprised by a sudden request, but you will have to devote resources to managing that dialogue. And you may be sharing a lot with them: descriptions of methodology, data collection progress, metrics, initial findings, and a final report.

Some funders are more hands-off, but may still expect to see the evaluation report when it is completed. For both types of funders, it is important to make sure you are clear about their expectations and what you will be submitting to them. Your external evaluator may already have worked with this funder and know their expectations. If you're not sure, ask.

Coworkers

One of the most overlooked audiences can be your coworkers. Disseminating findings internally is a way to build your organization's capacity and support its efforts to be a research-driven institution. Staff who aren't directly involved in the project may not take the time to read the full methodology and findings sections, so it is not enough to simply forward the full report. You might consider sharing the Executive Summary, your own summary of findings, or some other custom presentation of information for your coworkers. For example, co-author Saskia hosts brown-bag discussions at TASC (The After-School Corporation) to present findings and discuss impact. The discussions allow a free flow of ideas and questions suitable to brainstorming design changes. Her team has also presented at staff meetings or shared the highlights of an evaluation via e-mail.

Consumers

The consumers of your funded project (families of children in after-school programs, radio listeners, participants in public research projects) often do not hear what was evaluated and what was found. It is possible, however, that they would be interested in the research and the findings. You can reach them with many of the vehicles described below, such as social media, or through alternative methods such as policy briefs, brochures, and annual reports.

Colleagues in the field

Sharing evaluation findings and results can expand the knowledge base of the field, build relationships, and even increase your own credibility. Your evaluation report might spark new ways of thinking and ignite change in practice or policy throughout a particular area of informal science. Ask yourself these questions:

  • What does it mean that your evaluation came out the way it did?
  • Have you found evidence for something interesting that could change other people's practice?
  • Does it shed light on a trenchant problem for the field?

If you find something to say on these issues, you have a basis for starting a meaningful conversation with your colleagues. It is OK to mix the findings with your own message as long as you are clear about what is research versus your opinion. Consider collaborating with your evaluator to present together at key conferences, proposing solo conference presentations, writing journal articles, and designing Association of Science-Technology Centers (ASTC) RAP sessions (Roundtables for Advancing the Professions), professional development workshops, or university courses.

We're not asking you to quit your day job and go on the road as a motivational speaker, but it can be well worth your time to go one or two steps past mailing the report to your funder. For example, informal communication via blogs and Twitter can help you convey what you're learning in order to support similar projects. If your project is funded by NSF, you'll certainly post your evaluation report on informalscience.org (because you have to); and even if you don't have NSF funding, informalscience.org welcomes evaluation reports from all relevant projects. When uploading full reports, remember to review the Summative Evaluation Checklist at the end of this chapter to ensure your reports will be understood by people unfamiliar with your project. Exhibition projects can also be profiled in a case study uploaded to exhibitfiles.org; as described in Chapter 3, Exhibit Files is a social media site aimed at exhibit developers and designers and maintained by ASTC, the Association of Science-Technology Centers. We would also like to challenge you to ask yourself who else is part of this larger effort, whether or not they use the same terms to define the boundaries of your shared field. Think about who those unlikely field members are for your work.

Policymakers

Informing policymakers via personal contact and policy briefs is an important part of building sustainability. These officials can remove barriers and redirect public funds to support informal STEM education.

Elected and appointed officials can use their offices to highlight your successes and encourage the public to take an interest in your work. Evaluation findings are also a great reason to get back in touch with a policymaker's staffer who keeps a file on science issues or on your institution or organization. Data of any kind, along with a compelling story about the work you are doing, is powerful stuff for policymakers and influencers. Providing them with this information helps them with speech writing, and it positions you as an expert in your area. This outreach helps build important relationships.

Beyond the usual suspects

When a team at TASC started speaking with Science, Technology, Engineering, and Math (STEM) advocates in New York State, those advocates believed they were including after-school providers because science museums were a part of their outreach efforts. TASC encouraged them to go beyond museums to reach out to youth-serving organizations that offer diverse after-school programs with high-quality STEM activities. These after-school providers are also a part of the larger field, but would have been missed without more exploration about what types of institutions fall within the boundaries of the field.


America's scores on international tests of science have dropped just as we are seeing a rapid increase in demand for science-literate members of the workforce; as a result, there is unprecedented interest in science education among policymakers at all levels. Sharing project findings with policymakers helps to make the case for increased funding (or against decreased funding) for federal agencies (like NSF and NASA) and programs such as Advancing Informal STEM Learning (AISL).

The After-School Corporation

TASC has implemented a "grassroots" and "grasstops" strategy for embedding science activities into comprehensive after-school programs in New York City. At the "grassroots" level, TASC trained after-school workers in how to use an engaging science curriculum and built up their confidence as science facilitators. At the "grasstops" level, TASC organized institutes that brought together New York City leaders of science, after-school, and education. During the institutes, leaders learned about specific strategies for integrating science into after-school programming along with evaluation findings that showed an increase in confidence about science among after-school educators and their students. Leaders from the Department of Youth and Community Development participated in the institutes and later added a requirement that grantees providing after-school programming include two hours per week of science or literacy activities. While many factors beyond a single institute certainly played into this decision, the staff at TASC saw it as a victory due in part to dissemination of evaluation results.

Potential collaboration partners

So often dissemination feels like due diligence in getting the evaluation to the people you know who do similar work. But what if you got your evaluation into the hands of your next collaboration partner? What if Bjork read your evaluation findings and decided to make an interactive science album? It's worth taking some time to think about how you might use the evaluation as a way to start or deepen a conversation. Here are a few ideas:

  • Use Twitter to pose a question to the "twitterverse" and see what comes back.
  • Present at a conference that is nontraditional for you.
  • Take your most surprising finding and imagine who might find it unsurprising.
  • Ask your evaluator who would be interested in these findings.
  • Ask your Program Officer who might want to know about these findings.
  • Look at those who comment on your blog post.
  • Present the findings at a local funders' group meeting.
  • Reach out to other continents of the informal STEM education world. Consider museums, youth-serving organizations, public media, or universities, to name a few.
  • Reach out to the offices of elected officials.
  • Talk to local or state education or youth development agency leaders.
  • Reach out to formal educators and formal educational institutions.

More strategies for presenting and communicating results

Your summative report has many purposes and you may need to present the findings in multiple formats to accomplish your objectives. You don't necessarily need to spend your limited evaluation budget paying your evaluator to produce these additional presentations. You can ask for the content and use it to create the documents you need. It's best to make these requests up front when you negotiate your evaluator's scope of work. Some useful format variations are detailed below.

Alternative report formats: Social Media Strategies

A variety of communication strategies are described throughout this chapter, but Internet and social media strategies deserve a dedicated summary. Your evaluation might not go as viral as an English kid biting his big brother's finger, but social media can be an effective tool to get your evaluation out to a large and diverse audience. If you aren't familiar with the mechanics, don't throw up your hands and ignore the medium altogether. Take a look at the examples below for ideas about how your evaluation findings might fit with a social media strategy. You can put together materials such as Word docs, PowerPoint, pictures, and videos and then work with your marketing department to get the word out via the Internet and social media.

Tools Section: Summative Evaluation Checklist

This Summative Evaluation Checklist is derived from an extensive analysis of all evaluation reports posted to informalscience.org. The Building Informal Science Education (BISE) network, which aims to create deeper connections between evaluation and practice, conducted the analysis as part of its efforts to identify insights that can inform the field as a whole. Think of the cross-cutting questions we could ask if we had the ability to slice and dice the database to look at specific audiences or subject matter across all project types. Or conversely, an in-depth look at all reports pertaining to a specific project type such as "exhibitions" or "media" would also be informative.

 

The BISE network's first step was to review all of the existing reports and code them from the ground up. This initial analysis revealed that summative evaluation reports often omit surprisingly basic information, making it harder to categorize reports by target age or other factors that might cut across reports from different projects. Frankly, missing information can make it hard to understand the report at all, which is why we urge you to think about all of the possible audiences for your report from the very beginning. Summative evaluation reports are often written by the evaluator with only the PI in mind, so information they both know sometimes doesn't get documented in the reports. The checklist below can help you to cover the basics. Consider adding it to the scope of work when you first establish your evaluation contract, and consult it again when it's time to post your report at informalscience.org.

Sample reporting and dissemination formats

Format | Audience | Links to Models/Samples
Executive summary | Funders, Project Team, ISE field (informalscience.org) | Summative Evaluation of the Skyline Exhibition
Full Report | Funders, Project Team, ISE field (informalscience.org) | Secrets of Circles Summative Evaluation Report
Project Highlights | Funders, Colleagues | Secrets of Circles Highlights
PowerPoint | Co-workers, Funders | Science Festival Alliance Presentation
Conference Papers and Presentations | ISE field (colleagues) | Conference Paper: Wolfquest; Conference Paper: Museum Visitors' Understanding of Evolution
Peer-reviewed journal articles | ISE field (colleagues) | Going Deep: Supporting Collaborative Exploration of Evolution in Natural History Museums
Tweets | ISE field (colleagues) | View tweets
Blogs | | Governors Urged to Tap Into Informal Science Education
Policy Brief | Policymakers and Influencers; Consumers | TASC Policy Brief
Facebook | ISE field (colleagues) | View Facebook posts
Brochures and Annual Reports | Funders, Consumers | New York Hall of Science Annual Report

Conclusion

Here are just a few organizations with annual national conferences you might consider in getting the word out about your project and its evaluation. Use these as food for thought, not as an exhaustive list.

Thoughtful use of your evaluation findings will put you well on your way toward maximizing the impact of your project. We hope that you will think ahead about the audience for your project results and build dissemination into your evaluation plan from the start. Consider coming back to these suggestions, checklists, and examples at key points in your project. We have assembled them to help you plan for an evaluation report that gives you a clear picture of your project results and serves as a springboard for your dissemination efforts.

Now that we've armed you with a smorgasbord of tips and tools for making the most of your evaluation findings, we'd like to wrap up this section with a thank you to the many colleagues who, in the true spirit of dissemination, shared their hard-earned insights and wisdom to advance the success of future projects such as yours.