Chapter 7 Seeing the Impact of the Sci-I Project
We designed an evaluation plan to track and assess the impacts on, and outcomes for, both the students and the educators. The instruments and evaluation plan used between 2015 and 2018 received IRB approval through Rutgers University (Protocol #E16-003). Below we outline our approach to evaluating the project for these two audiences, as well as for our home institutions.
7.1 Identifying Student Impacts
While the Sci-I Project staff did not often work directly with the students in the project, everything we did was directed toward helping them gain a different kind of experience in their science classes. We were therefore interested in seeing whether participating in the project changed how students self-identified across various aspects of their experiences in STEM. Additionally, we were interested in hearing the students' feedback and suggestions on the project overall.
We used a pre-survey and post-survey model (administered on paper and/or online through the Survey Monkey platform; the latter following the Student Research Symposium) to enable students to share their feedback and opinions of the project. We conducted the surveys anonymously and confidentially, so the students did not share any personally identifying information. Instead, each student indicated their teacher's alphanumeric code (more below) and their school student ID number. These two pieces of information allowed us to compare individual students' pre- and post-survey responses without knowing who the respondents were (a sketch of how such matching could be done follows the list below). The questions on the surveys covered the following categories:
- Demographic information regarding the student, to enable us to compare results across different demographic categories.
- Participation questions, to determine which components of the project, and how many, the student participated in during the year.
- Process of science and data usage questions, asked through open-ended and self-rating items, in which the students shared their opinions of what the process of science involves as well as their usage of and comfort with real-world data.
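For readers curious how this kind of anonymous matching can work in practice, the following is a minimal sketch in Python (pandas); the file names, column names, and the comfort-rating item are hypothetical stand-ins, not our actual instruments.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
pre = pd.read_csv("student_pre_survey.csv")
post = pd.read_csv("student_post_survey.csv")

# Link each student's pre- and post-survey responses on the composite key
# (teacher_code, student_id); no personally identifying information is needed.
matched = pre.merge(
    post,
    on=["teacher_code", "student_id"],
    suffixes=("_pre", "_post"),
    how="inner",  # keep only students who completed both surveys
)

# Example: change in a hypothetical self-rated comfort-with-data item (1-5 scale).
matched["comfort_change"] = matched["data_comfort_post"] - matched["data_comfort_pre"]
print(matched["comfort_change"].describe())
```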
Based on our IRB protocols for this project, we provided participating teachers with student assent and parent consent forms to share with their students before any participation in the project. We translated these assent/consent documents, as well as the student pre- and post-surveys, into Spanish (the most common language other than English among the students participating in the project each year). We also developed versions of the student pre- and post-surveys in regular (12-point) and large (14-point) font sizes in case any students needed the visual assistance of larger text. To assist educators in administering the student evaluation instruments, we developed a Student Evaluation FAQ document.
During the 2016-17 and 2017-18 implementations of the project we had the great fortune of partnering with a research project at NC State that was also interested in student engagement with, and identification in, STEM. To compare our students' opinions of STEM with those of other students, we added a range of Test of Science-Related Attitudes (TOSRA) questions to the pre- and post-surveys concerning students' opinions of STEM, the STEM workforce, their future careers, and their exposure to STEM within their families and daily lives. For those two implementation cycles we were able to compare the responses of our students, following their participation in the Sci-I Project, with those of students not participating in the project. This partnership provided the rare opportunity for a quasi-experimental control-treatment setup in our evaluation study (results forthcoming).
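As one illustration of how such a control-treatment comparison might be analyzed (a sketch under assumed names, not our actual analysis pipeline), post-survey attitude scale scores could be compared between participating and comparison students with Welch's t-test:

```python
import pandas as pd
from scipy import stats

# Hypothetical combined dataset: one row per student, with a 'group' column
# ("sci-i" or "comparison") and a mean TOSRA-style attitude scale score.
df = pd.read_csv("post_survey_combined.csv")

treatment = df.loc[df["group"] == "sci-i", "tosra_scale_mean"].dropna()
comparison = df.loc[df["group"] == "comparison", "tosra_scale_mean"].dropna()

# Welch's t-test avoids assuming equal variances between the two groups.
t_stat, p_value = stats.ttest_ind(treatment, comparison, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"mean (Sci-I) = {treatment.mean():.2f}, mean (comparison) = {comparison.mean():.2f}")
```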
7.2 Tracking Changes in Teachers' Approaches
If students were the why of the Sci-I Project, then educators were the how!
Most of our efforts and interactions were focused on supporting the educators in building their understanding of, confidence with, and success in implementing the Sci-I Project within their schools. We were therefore interested both in how we could refine the project to better meet their needs and in evaluating what impacts and outcomes resulted from their participation. During this project we used a mixed-methods approach of surveys, focus groups, and interviews to seek formative and summative feedback from participating educators.
Prior to participating in any of the evaluation components of the project, educators completed the necessary consent forms. As with the students, we conducted our educator evaluations anonymously and confidentially. Each teacher therefore chose an alphanumeric code to use throughout the entire project and listed it on each survey, which let us link an individual educator's responses across surveys (see the sketch following the question list below).
We surveyed the educators at three touch points:

- before attending the Summer Workshop (pre-project survey),
- at the end of the Summer Workshop (post-workshop survey), and
- at the end of the year-long implementation (post-project survey, disseminated following the Student Research Symposium).
Across the three surveys many questions remained the same, for example:
- Demographic information regarding the educator and school of employment, to enable us to compare results across different demographic categories.
- Science content questions, to enable us to tease apart confidence in (or struggles with) the content itself from confidence in using data to teach about that content.
- Approach to the process of science, asked through open-ended and self-rating questions, in which the educators shared their opinions of what the process of science involves as well as how they integrated it into their teaching.
- Data usage and comfort questions, asked through open-ended and self-rating items, in which the educators shared their comfort and confidence in making sense of real-world data themselves as well as how they integrated such data into their teaching.
- Project component questions, in which we asked the educators about their expectations for, and then reflections on, different aspects of the Sci-I Project in terms of their personal and professional growth as well as the growth of their students.
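As with the student surveys, the self-chosen alphanumeric codes make it possible to assemble one longitudinal record per educator across the three touch points. A minimal sketch, assuming hypothetical file and column names:

```python
import pandas as pd

# Hypothetical file names for the three educator survey waves.
pre = pd.read_csv("teacher_pre_project.csv")
post_ws = pd.read_csv("teacher_post_workshop.csv")
post_proj = pd.read_csv("teacher_post_project.csv")

# Chain two merges on the self-chosen alphanumeric code to build one
# longitudinal record per educator, keeping only those present in all waves.
waves = (
    pre.add_suffix("_pre")
    .rename(columns={"teacher_code_pre": "teacher_code"})
    .merge(
        post_ws.add_suffix("_ws").rename(columns={"teacher_code_ws": "teacher_code"}),
        on="teacher_code",
    )
    .merge(
        post_proj.add_suffix("_post").rename(columns={"teacher_code_post": "teacher_code"}),
        on="teacher_code",
    )
)
print(f"{len(waves)} educators matched across all three surveys")
```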
Following the year-long implementation, we often reached out to participating educators to invite them to take part in a focus group or interview with our external evaluator. This enabled us to follow up on comments and themes we saw within the survey data, and to gain a more in-depth understanding of the teachers' opinions of the project, its impacts, and potential outcomes. Participation in this aspect of our evaluation was always voluntary and greatly appreciated. We found that educators often enjoyed the opportunity to speak more freely about their thoughts, suggestions, and reflections on the experience. We also found that the focus groups and interviews worked best during the summer, when teachers had a bit more time to pause and reflect.
7.3 Return on Investment (ROI) for University and Staff
While this project took an investment of time by the staff members implementing it, there were also substantial benefits to coordinating the project. Connecting local schools with a university, through both the educators and the students, was an invaluable way to build trust and partnerships locally. Additionally, bringing students to campus for the SRS event offered an early opportunity to promote the university. Finally, scientists are continually asked to participate in outreach beyond academia, and this project provided a scaffolded, unique way for them to do so that made efficient use of their time. Such partnerships also benefited the coordinator of the Sci-I Project, as participating scientists were more likely to assist with future educational and outreach endeavors.
7.4 Results of Prior Implementations (forthcoming)
- Links to peer-reviewed papers
- Links to Polar-ICE annual report and/or final report for NSF