
Public Participation in Scientific Research: 2012 Conference Evaluation

January 23, 2013 | Professional Development, Conferences, and Networks
The overall goal of the project was to convene a large-scale, open conference on public participation in scientific research (PPSR), bringing together science researchers, project leaders, educators, technology specialists, evaluators, and others from across many disciplines to discuss advancing the field. The conference included three sessions for posters and conversations and five plenary sessions of presentations. The meeting culminated in an open session to explore strategies for large-scale collaborations to support and advance work across this field of practice through the development of an association. The driving purposes were to further PPSR as a field (professionalization), to formalize PPSR as a field of practice, and to increase collaboration across disciplines.

The overarching evaluation question, therefore, was a progress question: did the conference lead to any large-scale collaborative efforts to support the field, large-scale collaborations to advance work across the field, and the development of an association or other professionalizing activities? To these ends, the following questions guided the evaluation:

1. Why did people choose to attend? What were their motivations?
2. What were the differences in perceptions of PPSR and data use?
3. What were entry expectations for the field? For the conference?
4. Did conference participants support the purposes and intents of the conference? Did this change as the conference progressed?
5. In what ways were participants willing to engage beyond the conference (with others; with the field), and did this change during the conference?
6. In what ways did interest in collaborations increase or decay after the conference experience?

Methods:

Entry measure. To generate a baseline and better understand the outcomes of the conference, it was important to obtain information that directly answered evaluation questions 1, 2, and 3. This was done with a web-based pre-conference questionnaire sent to the registrant list provided by ESA. These data were quickly processed to inform the conference organizers as they moved into the conference.

Process measure. Because the conference itself was the focus of the evaluation, tracking how participants changed during the event, relative to its goals and key products, offered a formative gauge of the potential for success. The evaluator took advantage of breaks, meals, and movement time to ask a series of questions relating to evaluation questions 4 and 5. The same questions were asked throughout the conference but were analyzed over the course of the event to determine whether responses shifted toward the desired outcomes and, if any resistance emerged, when it occurred. Sense-making methodologies informed the question structure to ensure that attitudes toward both process and products were captured.

Post-conference measure. At the conclusion of the conference, participants completed a post-program questionnaire. The instrument included measures of satisfaction, intention, and willingness to engage further. Both the pre- and post-conference questionnaires asked for minimal demographics to describe the conference participants.

Delayed-post measure. Participants were asked to complete a follow-up questionnaire three months after the conference. A link to the questionnaire, hosted on surveymonkey.com, was emailed to participants, and the questionnaire was left open for two weeks after the link was sent. Demographic information was again collected to provide a fuller description of participants. The appendix of this report includes the surveys and the interview protocol used in the study.

TEAM MEMBERS

  • The Schoodic Education and Research Center Institute
    Contributor
  • 2013 06 13 Making meaning of the old technology
    Evaluator
    The Ohio State University

Resource Type: Research and Evaluation Instruments | Survey | Interview Protocol | Evaluation Reports
Discipline: Ecology, forestry, and agriculture | Education and learning science | Nature of science
Audience: Adults | Educators/Teachers | Museum/ISE Professionals | Scientists | Evaluators
Environment Type: Professional Development, Conferences, and Networks | Conferences
