
Evaluator Participant Reflections and Questions from the CAISE Media Convening

It’s hard to believe it’s been nearly a month since the meeting, so our apologies for the delayed post! In between vacation and other work, we thought it might be helpful to recount the three action areas identified during the final session, and in particular the diverse questions participants raised to help inform next steps. We hope that others will have a chance to add to this initial list of questions and/or at least consider them as working groups begin to form around each action area in the coming months.

It was a great pleasure working with you all.

Valerie Knight Williams and Liz Bandy


Questions for Next Steps

During the CAISE Media Convening, an invited group of 19 informal science education (ISE) media professionals engaged in a series of group discussion activities aimed at “visioning a plan of action for the ISE media community.” By the end of the meeting, the group suggested that CAISE and/or a set of working groups, to be established after the meeting, further consider the following three areas of action:

1) Explore the appeal and viability of building a field/society/“community of practice” among ISE media professionals;

2) Collect, systematically review, and disseminate evidence of ISE media impacts and explore the possibility of initiating an ISE media research agenda; and,

3) Develop a strategic ISE media communication plan to raise the visibility and value of ISE Media.

While the group generally expressed enthusiasm and support for these actions, many participants acknowledged that the invited group of 19 likely did not represent the full range of interests and viewpoints of other ISE media professionals who did not attend the meeting. In an effort to help: (i) explore mechanisms for seeking broader stakeholder input, (ii) build on the momentum generated during the meeting, and (iii) inform next steps, we thought it would be helpful to list the main questions we raised, and heard others raise during the meeting, about the expected value, feasibility, and impact of the proposed actions. We hope others have a chance to modify or add to this initial list as working groups begin to form around each action area and deliberate next steps.

Action #1: Organize and build a field/society/community of practice

  • What does the ISE media field/society/community refer to, and whom does it include? Does it extend beyond film, television, and radio production teams, website and game designers/developers, outreach educators, proposal writers, evaluators, researchers, and funders? How should “media” be defined, and what are the implications of bounding the definition either too narrowly or too broadly? How will STEM media makers be identified for the community: will priority be given to locating project teams currently or recently funded by the NSF, or will those with unsuccessful grant submissions, for example, also be included? Will sources of support beyond the NSF, such as NASA, NIH, NOAA, and DOD, be represented? Will the commercial sector be represented? Will there be an international base?
  • To what extent do those involved in ISE media projects (past and present) who didn’t attend the Media Convening perceive themselves to be part of a larger ISE media community? Do they feel there is a need for a field/society/community of practice? Are they willing to contribute to and participate in such a community, and to what extent? What do they expect to gain in return?
  • Will there be a mechanism for gathering feedback on the proposed community building effort from ISE media producers and affiliates who didn’t attend the meeting? Is it feasible to develop a systematic method for collecting these individuals’ feedback and assessing their buy-in on the three proposed actions? If so, what are the most appropriate and cost-effective methods for doing this and how would this “needs assessment” work be funded? Assuming there is buy-in for a community of practice: What do potential community members want the community to look like? How would it operate and be organized? To what extent would it serve the following roles: a) disseminating information and resources, b) setting research agendas for the field, c) advocating and lobbying, d) strengthening the identity and promoting the visibility and value of the field, e) providing professional development opportunities, f) facilitating member interactions and collaborations, g) providing funding notifications and opportunities, h) identifying and communicating best practices, and i) organizing conferences, meetings, and opportunities for showcasing work (e.g., award ceremonies like the “Issies” suggested by Charlie Foster of Youth Radio)?
  • What are the most promising strategies and venues for convening community members to help establish and build the community of practice? What role should the NSF PI Summit, other STEM and ISE media conferences, and/or online communities play? Is it a strategic advantage or disadvantage to focus on venues, such as the NSF PI Summit, that primarily bring together PIs of current NSF-funded projects? Would PIs without a current NSF award be likely to attend?
  • What models and examples can we draw on from within and outside the ISE field to help inform the development of the ISE media community? What can we learn and apply from the following related examples:
    1. ISE (and ISE media) associations, collaborative projects, conferences, and cooperative agreements, such as: CAISE; ASTC; the NSF-funded ISE media website and wiki database and related LinkedIn group; and the MacArthur Foundation’s Digital Media and Learning (DML) initiative.
    2. STEM and ISE media networks and associations established in other countries, such as the European Science Communication and Information Network; the World Congress of Science and Factual Producers; Association Science Television; and the Science Communication Conference of the British Science Association.
    3. Non-STEM-specific commercial and educational media networks and associations established within the US or beyond, such as the Giant Screen Cinema Association; PBS’s publication Current; the Integrated Media Association; the National Center for Media Engagement; the National Association for Media Literacy Education; and Public Media Metrics.
    4. Environmental and health communications coalitions and associations that have developed “community of practice” activities in their respective fields, such as: the Center for Health Coalition; the Health and Science Communication Association; the Environmental Communication Network; the International Environmental Communication Association; and the Environmental Media Association.
    5. General models and resources on best practices for developing communities of practice, such as: Cultivating Communities of Practice: A Guide to Managing Knowledge (2002); Connected Online Communities of Practice; and Beyond Communities of Practice (2005).
  • How should the convening work be funded and what role could CAISE play in this regard? Are grants needed initially to develop a series of meetings, conferences, or needs-assessment surveys, for example? Is CAISE the appropriate organization to help spearhead these next steps?
  • How do we maintain the “community building” momentum that participants generated at the meeting? Do we need to establish working groups, and how should that happen? Do we need to set deadlines for short-term goals, and longer-term goals that work toward the PI Summit in March 2012?

Action #2: Collect, systematically review, and disseminate evidence of ISE media impact

  • Since CAISE-affiliated researchers have recently determined that it will not be feasible to conduct a meta-analysis of the existing summative evaluation reports available on informalscience.org, what is the best way to systematically review and summarize the knowledge we have gained regarding impacts?
  • What body of work does CAISE currently have access to for conducting this review? If roughly 75 summative evaluation reports exist on ISE media projects [Ed: as of September 2014, that number had climbed to 150 reports], what other reports exist that don’t appear on informalscience.org? What reports can media evaluators and PIs add to the mix? What can we learn or gain by looking at non-ISE media research on audience use, engagement and learning?
  • What types of data are missing from the summative evaluation reports that are important to collect and document in assessing and demonstrating impact? What role might ISE media project data reported within the Online Project Monitoring System and in project annual and final reports play? What quantitative and project-wide impacts are collected and reported through these mechanisms that do not appear in the evaluation reports, which may address only one facet of a project’s set of deliverables?
  • Once the impact data across the various reports are collected and reviewed, who are the audiences for this evidence of impact? Is it PIs, the NSF, the ISE media community, and funders? What is the most relevant information to communicate to these respective audiences, and in what formats should the evaluation reporting occur to be most effective? Does a congressperson, for example, wish to see the same information that a funder, media PI, or STEM researcher wants to see?
  • What are the best ways to examine and report on the ISE media evidence of collective impact, looking across the ISE media projects? Will the evaluation findings be analyzed, for example, by type of media, type of audience, type of evaluation design and methodology, or type of outcome? What are the complications that arise by segmenting in these ways, given the unique ways individual projects are configured and the role outreach typically plays in building “broader impacts” for a project?
    1. Is there sufficient interest among media evaluators and PIs to work towards a “research agenda” in the field? Would these evaluators and PIs be amenable to having a small set of common questions, instruments, and methods incorporated across their projects to investigate common areas of interest to ISE media? Do they view this approach as desirable and feasible, assuming additional funding were made available to cover the costs incurred? Are there some common goals that could be looked at across projects? What barriers and complications might stand in the way of developing these goals and working across projects? How appropriate is it to try to measure common indicators if the projects are not designed to achieve those outcomes?
    2. To what extent could such a cross-project effort yield evidence of collective broader ISE media impacts, as intended? Is there a precedent for a common research agenda in any ISE community? Do science centers and museums, for example, have a common research agenda? Are there examples of evaluators working collaboratively to use the same or similar evaluation designs and questions across ISE projects, as Multimedia Research, Knight Williams Inc., and Edumetrics did for several IMAX films produced prior to 2005, the findings from which subsequently lent themselves to Flagg’s comparative analysis in Beyond Entertainment: Educational Impact of Films and Companion Materials (2005)?
    3. If there is sufficient buy-in for such a research agenda, what role might ISE media researchers who also receive funding from the NSF play in this regard? Are there ways to encourage more collaboration between ISE media evaluators, who typically sub-contract on ISE media projects, and researchers who are funded by the NSF to conduct ISE media research? Which NSF divisions and areas (e.g., ISEE, DRL, PRIME) would need to be coordinated for this type of collaboration to occur?
  • What is the most appropriate mechanism for continuing to move this action area forward? Is it feasible to convene a working group of ISE media PIs and evaluators to help inform and frame the ongoing CAISE review? Given that there is only a small group of private evaluation firms doing the majority of the evaluations, how much time can they realistically dedicate to the effort?
  • Following from the above, to what extent is a separate grant effort needed to support these questions and areas of inquiry? How much work, and what type of work in particular, can be done by CAISE, and what should be coordinated through a new grant initiative, either run through CAISE or separately?
Action #3: Develop and implement an ISE media strategic communication plan

  • Is there sufficient buy-in within the ISE media community to warrant a working group taking the next steps toward devising a strategic communication plan? To what extent do ISE media makers who did not attend the meeting feel that a communication plan would be of value, and how willing are they to devote time, energy, and resources to this action? Assuming there is sufficient buy-in, how do we define goals for the plan? What do we mean by making ISE media more “visible” and what do we interpret “value” to mean?
  • Who are the most important audiences we need to reach: the general public, educators, policy-makers, and funders? What do we want these respective audiences to know about the value of ISE media? What central theme(s) might represent a set of projects that span the range of media formats, STEM content, types of delivery modes, and audiences (from pre-school to adult) targeted by media makers as diverse as large multi-media organizations, independent producers, after-school programs, and museums?
  • Following from the above, does the working group for the communication plan need to work in tandem with the working groups on Actions #1 and #2 so that the communication plan incorporates the community and evidence building work being done in these areas?
  • How do we structure a professional, coordinated, and multi-faceted plan that could be adapted for local and regional markets and different types of audiences? How feasible is such a plan, and to what extent can we harness existing resources available within ISE media organizations? What kind of supplemental funding will be required, and from where should that funding come? As noted under Action #1, what can we learn from strategic communication plans that have been implemented in the environmental and health communication fields? What models and strategies have been most successful in raising the visibility and perceived value of these fields to their respective audiences, and why?
  • To what extent and in what ways can such a strategic communication plan ultimately raise the visibility and perceived value of ISE media programs and research? How do we know if we are successful? What does success look like, and how do we measure it?
Posted by Elizabeth Bandy