
11 States Participate in Evaluation to Measure the Impact of Afterschool STEM Programs on Students

This study was performed and written by Gil G. Noam, Patricia J. Allen, and Bailey Triggs, The PEAR Institute: Partnerships in Education and Resilience. To read the full evaluation, visit: https://www.thepearinstitute.org/afterschool-stem-evaluation-2016

To instill in youth an interest in STEM and an ongoing love of learning, and to prepare them for an increasingly STEM-focused workforce, afterschool STEM programs are collaborating to share knowledge about what works to engage youth. To support and strengthen these connections, the Charles Stewart Mott Foundation and the Noyce Foundation (now STEM Next) embarked on a nationwide capacity-building project that provided grants to all 50 states to form statewide afterschool networks and partnerships, with over half of the states receiving STEM systems-building or planning grants. This systems-building work included support for partnership and leadership development; evaluation and data collection; program quality building and professional development; communication and policy; and financing and sustainability.

An investment of this scale necessitated an evaluation that could tell funders whether afterschool STEM providers were helping youth advance in areas like STEM interest, engagement, skills, and motivation. To this end, The PEAR Institute: Partnerships in Education and Resilience partnered with Dr. Todd Little and IMMAP: Institute for Measurement, Methodology, Analysis & Policy at Texas Tech University to conduct one of the first large-scale evaluations to measure the impact of afterschool programs on students’ STEM-related attitudes and their social-emotional and 21st-century skills. The primary goals were to (1) examine levels of change in youth outcomes among programs receiving resources and training support from systems-building states; (2) report on national trends related to STEM learning, such as gender or grade differences in science interest; and (3) link STEM program quality with student outcomes and facilitator beliefs.

The Afterschool & STEM Systems-Building Evaluation 2016 was made possible by the collaboration between researchers, practitioners, funders and 11 statewide afterschool networks, selected to be representative of the U.S. as a whole, with participation from nearly 1,600 students (Grades 4-12) enrolled in 160 afterschool STEM programs. Three assessment tools developed by Gil Noam and the team at The PEAR Institute were used to triangulate evidence of STEM learning: the Common Instrument Suite (CIS), the Common Instrument Suite – Facilitator Survey (CIS-FS), and the Dimensions of Success (DoS) observation tool.

The CIS is a brief measure of students’ STEM interest in afterschool settings. It is administered on tablet devices at the end of STEM programming using a retrospective pretest-posttest method, in which students rate each survey item twice from two different frames of reference: what they thought “before the program” and what they think “at this time.” The CIS-FS, completed electronically by STEM program facilitators at the close of the program, was developed to capture facilitators’ professional development and training experience and the types of training support they would like to receive in the future. DoS is an observation tool for assessing the quality of STEM learning in afterschool programs; it captures 12 dimensions of STEM program quality organized into four domains. Rigorous training and certification are required to perform DoS observations. For this evaluation, state networks worked with DoS-certified observers to coordinate one or more quality observations at each participating program to establish a program quality score.
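To make the retrospective pretest-posttest idea concrete, here is a minimal sketch of how a gain score could be computed from that kind of data. The student records, the rating values, and the simple post-minus-pre scoring below are hypothetical illustrations, not the CIS’s actual items or scoring rules.

```python
# Illustrative sketch of retrospective pretest-posttest scoring.
# Each student rates every item twice: as they remember feeling
# "before the program" (retrospective pre) and "at this time" (post).
# The gain is simply the mean post rating minus the mean pre rating.
# All values and IDs here are invented for demonstration.

students = [
    {"id": "s01", "pre": [2, 3, 2], "post": [3, 4, 3]},
    {"id": "s02", "pre": [4, 4, 3], "post": [4, 3, 3]},
]

def mean(xs):
    return sum(xs) / len(xs)

for s in students:
    gain = mean(s["post"]) - mean(s["pre"])
    # A positive gain indicates self-reported growth on the construct.
    print(f'{s["id"]}: change = {gain:+.2f}')
```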

While the evaluation found variation across states, in general, participation in STEM-focused afterschool programs led to major, positive changes in students’ attitudes toward science. More than 70% of students reported positive gains in areas such as STEM interest, STEM identity, STEM career interest and career knowledge, and 21st-century skills, including perseverance and critical thinking. When looking at the results from the facilitator survey, we found significant, positive correlations between facilitators’ levels of interest, confidence, and ability in STEM facilitation and their perceptions of their students’ proficiency and confidence in math and science.
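As a rough illustration of the kind of statistic behind that facilitator-survey finding, the sketch below computes a Pearson correlation between two invented sets of ratings; the values and variable names are assumptions made for demonstration and are not drawn from the CIS-FS data.

```python
# Hypothetical sketch: Pearson's r between facilitators' self-rated
# confidence in STEM facilitation and their ratings of students'
# math/science confidence. All values are invented for illustration.

from math import sqrt

facilitator_confidence = [2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
perceived_student_confidence = [2.0, 2.8, 3.1, 3.9, 4.2, 4.8]

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(facilitator_confidence, perceived_student_confidence):.2f}")
```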

When we compared the DoS program observations with student CIS data, we were able to see a connection between program quality and student outcomes: Students participating in higher-quality STEM programs reported more positive gains than students participating in lower-quality STEM programs (see figure). For a full summary of results, read the full report.

[Figure: Students in higher-quality STEM programs reported more positive gains than students in lower-quality programs.]
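As an illustration of how program-level quality scores could be linked to student gains, here is a minimal sketch that splits invented program records at the median quality score and compares mean gains. The data, field layout, and median-split comparison are assumptions made for demonstration, not the evaluation’s actual data or statistical analysis.

```python
# Hypothetical sketch of linking a program-level quality score (e.g.,
# an observation rating) to mean student gain scores. All records
# below are invented for illustration.

from statistics import mean

# (program_id, quality_score, [student gain scores])
programs = [
    ("p01", 3.4, [0.6, 0.4, 0.8]),
    ("p02", 1.9, [0.1, -0.2, 0.3]),
    ("p03", 3.1, [0.5, 0.7, 0.2]),
    ("p04", 2.2, [0.0, 0.2, -0.1]),
]

# Split programs at the median quality score and compare mean gains.
cutoff = sorted(q for _, q, _ in programs)[len(programs) // 2]
high = [g for _, q, gains in programs if q >= cutoff for g in gains]
low = [g for _, q, gains in programs if q < cutoff for g in gains]

print(f"mean gain, higher-quality programs: {mean(high):+.2f}")
print(f"mean gain, lower-quality programs:  {mean(low):+.2f}")
```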

An evaluation of this size is bound to face challenges, and those challenges can inform future work. For this evaluation, they included cleaning a large data set, recruiting participants, and communicating across the various state networks. These difficulties are characteristic of any large-scale evaluation effort, and PEAR, IMMAP, and Mainspring Consultants were able to resolve them without hindering the completion of the evaluation.

This evaluation demonstrates that it is possible to gather evidence of STEM learning in afterschool using common measurement tools on a national scale. Several clear outcomes of this work lead us to make the following six recommendations:

  1. Leverage leaders’ strengths: States that are excelling in specific areas should be encouraged to teach others through communities of practice.

  2. Target professional development and quality support: Continuous improvement through facilitator professional development will strengthen youth STEM outcomes.

  3. Focus on the linkage between science learning and 21st-century skills: This evaluation found an important connection between 21st-century skills and STEM outcomes; STEM programs should intentionally focus on this connection in their work.

  4. Encourage use of data to inform practice: One of the evaluation’s greatest successes was rallying a wide group of programs around common measurement tools to help establish benchmarks and communication across programs to improve practice.  

  5. Innovate out-of-school time evaluation and assessment strategies: The data collection and analysis techniques used in this evaluation can inform future evaluations interested in triangulating data or employing other innovative methods like the retrospective pretest-posttest design.

  6. Prioritize evaluation in the systems-building process: Our final recommendation is that evaluation and data collection should be a priority of all systems-building initiatives so networks can use data to track successes and challenges in each state, decide where professional development and other opportunities for staff are needed, and provide evidence for expanding advocacy and policy efforts.

We are pleased that this evaluation was able to provide data that policymakers can confidently use for decision making in the future. For more information and recommendations on what afterschool programs and networks can do moving forward, read the full report here.
