Chapter 5 Mini-Proposals
5.1 RFP, Student Guidelines
The Request for Proposals (RFP) document explains what the student proposals should include, gives background on the Sci-I Project and its goals, and outlines in general terms how the investigation would ideally be implemented in the classroom. We developed this resource for educators to share with their students, based on feedback from an early educator participant, both to communicate expectations and to give the project a real-world feel. We adapted a typical National Science Foundation RFP to create this student-friendly version. This part of the project was requested by participating educators and has received positive feedback from subsequent educators in the project.
5.2 Reviewer Preparation & Feedback Form
The recruitment of volunteer scientists was essential to the success of the proposal review process. Depending on how many schools and educators were participating in the project, there could be a large number of proposals to review: each classroom sent on average 15-20 student proposals (each 1-4 pages long), and there were often multiple participating classrooms per school. We worked hard to return proposals with feedback within three weeks of receiving them, both to be mindful of the class time teachers could devote to the project and to help the students stay excited and interested. Ideally, the volunteer reviewers had relevant experience in a STEM field and some practice in developing testable questions. We found that ideal candidates were graduate students, postdoctoral fellows, technicians, professors, and senior undergraduates (in the order in which we sought their participation).
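For planning purposes, the review workload scales quickly with participation. The sketch below is a hypothetical back-of-the-envelope estimate (with illustrative parameter values, not actual project data) of how many proposals each volunteer would need to handle within the three-week turnaround.

```python
import math

# Illustrative planning numbers (assumptions, not project data):
schools = 4                   # participating schools in a given year
classrooms_per_school = 2     # often multiple classrooms per school
proposals_per_classroom = 18  # midpoint of the observed 15-20 range
reviewers = 10                # volunteer scientists recruited

total_proposals = schools * classrooms_per_school * proposals_per_classroom
per_reviewer = math.ceil(total_proposals / reviewers)

print(f"Total proposals to review: {total_proposals}")
print(f"Proposals per reviewer (three-week turnaround): {per_reviewer}")
```

With these illustrative numbers, each volunteer reads roughly 15 short proposals, which fits in a single multi-hour review session like the ones described below.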
We engaged mostly graduate students to assist in our mini-proposal review process in the winter/spring of 2017 and 2018. We incentivized their participation by reminding graduate students that outreach is valuable experience to add to their CVs, and we fed them (e.g., pizza)! We held a short training for the volunteer reviewers that included a review of the background of the Sci-I Project and an explanation of the project goals. We spent some time reviewing examples of past student work to show the students' starting level of knowledge, the type of material we expected to see in the proposals, and the kind of feedback that would be age-appropriate. Then we reviewed the common feedback form that reviewers used while evaluating the mini-proposals. Much of the training time was spent helping the volunteer scientists get acquainted with the project and calibrating their expectations of what students in grades 6-9 could be expected to produce and articulate.
Additionally, before the volunteers reviewed any of the proposals, we tried to ensure that they were all at least somewhat familiar with the study system at hand so they could provide better feedback to the students. Therefore, we made sure that our reviewers had at least a base knowledge of the Arctic and Antarctic ecosystems before starting their reviews. It was rare, but sometimes students made incorrect inferences about how parts of the ecosystem fit together and, as a result, formulated questions and hypotheses whose logic did not hold. For example, one student was trying to assess changes in the Adélie penguin population using phytoplankton abundance. The student thought that Adélie penguins feed on phytoplankton, so if phytoplankton abundances changed, the penguins might face nutrient constraints. While this would be an interesting question to pursue if it were true, Adélie penguins eat krill and fish, not phytoplankton directly. The student's underlying logic was sound: if phytoplankton abundance changes, it may indirectly affect Adélie penguin populations through the food web. However, we wanted to make sure that the students' hypotheses accurately reflected our overall understanding of the ecosystem. By having a base knowledge of the ecosystem, a reviewer could provide feedback that helped the students identify the issue in their question and offered options on how to proceed.
Once the reviewers felt confident about how to provide constructive feedback, we assigned each of them a set of proposals and asked them to work for the next couple of hours in the same shared space. The team member running the training and review session also stayed to review proposals, and was available to answer any questions the reviewers had about the proposals they were reading.
5.3 Logistics for Reviewing Proposals and Returning Feedback
Depending on the number of schools and classrooms participating in the project, we often had a high number of proposals that needed to be reviewed quickly. To keep everything organized, we used Google Drive as our online storage space for all project materials. The Drive made it easy to organize materials by project year, participating school, and classroom within each school. It was also a convenient place for educators to upload proposals, for reviewers to access the proposals they were reviewing, and for students to access their completed feedback forms. We set the due date for mini-proposals 7-9 weeks ahead of the symposium date. This timeframe allowed three weeks for the reviewers to return feedback to the students and at least a month for the students to revise and conduct their investigations before the symposium.
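To make this timeline concrete, here is a minimal sketch (the symposium date is hypothetical; the project did not use a script for this) that works backwards from a symposium date to the proposal due date and the feedback deadline.

```python
from datetime import date, timedelta

# Hypothetical symposium date, used only for illustration.
symposium = date(2018, 5, 18)

# Work backwards: proposals due 8 weeks out (middle of the 7-9 week window),
# with feedback returned within 3 weeks of the due date.
proposals_due = symposium - timedelta(weeks=8)
feedback_returned_by = proposals_due + timedelta(weeks=3)
revision_weeks = (symposium - feedback_returned_by).days // 7

print(f"Proposals due:        {proposals_due}")
print(f"Feedback returned by: {feedback_returned_by}")
print(f"Revision window:      {revision_weeks} weeks before the symposium")
```

With an eight-week lead, students keep five weeks after feedback to revise and conduct their investigations; at the short end of the range (seven weeks), the window shrinks to the one-month minimum.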
When providing feedback on the proposals, we found it important to remember that the students were doing their best to explain their study and study system with limited information. To this end, our feedback focused more on the students' process of developing an investigation than on a complete, nuanced understanding of the study system. Our goal was to make sure that the students had created genuinely testable questions and hypotheses, understood their independent and dependent variables, and had thought about how to visualize their results.
We created a form to organize reviewer feedback; a brief sketch of how a completed form could be structured follows the list below. The form included:
- Overarching comments about the proposal. This was a good space for the reviewer to note what the students had done well and to compliment them on an interesting question. We always sought to include some positive feedback so that the students would not be discouraged by seeing only criticisms. At the top of each completed feedback form, we also provided an easy-to-read table that clearly stated the variables to be investigated, the location and timeframe of the study, and whether the data would be averaged.
- A three-level scoring system to assess whether each part of the proposal was completed. Reviewers wrote "Y" for yes, "N" for no/missing, and "~" for partially completed in response to each item on the feedback form. After each item's check line, we provided space for comments to clarify any issues in the proposal related to that item.
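To make the form's structure concrete, the following is a minimal sketch of how a completed form could be represented as a data structure. The field and item names are our illustrative assumptions, not the project's actual form fields.

```python
from dataclasses import dataclass, field

# The three-level completion scores described above.
SCORES = ("Y", "N", "~")  # yes / no or missing / partially completed

@dataclass
class FormItem:
    """One checked item on the feedback form, with room for comments."""
    name: str          # e.g. "Testable question" (illustrative item name)
    score: str = "N"   # one of SCORES
    comment: str = ""  # clarifies any issue related to this item

    def __post_init__(self):
        if self.score not in SCORES:
            raise ValueError(f"score must be one of {SCORES}")

@dataclass
class FeedbackForm:
    """A completed reviewer feedback form for one student proposal."""
    # Summary table shown at the top of the completed form.
    variables: str             # variables to be investigated
    location: str              # location of the study
    timeframe: str             # timeframe of the study
    data_averaged: bool        # whether the data will be averaged
    overarching_comments: str = ""
    items: list[FormItem] = field(default_factory=list)

# Illustrative usage (all values hypothetical):
form = FeedbackForm(
    variables="sea ice extent vs. Adélie penguin counts",
    location="Palmer Station, Antarctica",
    timeframe="1990-2015",
    data_averaged=True,
    overarching_comments="An interesting question and a clear study design!",
    items=[
        FormItem("Testable question", "Y"),
        FormItem("Hypothesis", "~", "State the expected direction of change."),
    ],
)
```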
This example feedback form can be used as a reference to see the type of feedback we gave to past students.
We emailed each educator when the reviews of all of their students' proposals were complete and the forms had been uploaded. The completed feedback forms were again shared through the school-specific Google Drive folders to facilitate access to the documents. We stressed that if any teachers or students had questions about the feedback, they should contact the project team for assistance. Most often, we received clarifying questions from students about how to follow a reviewer's advice to adjust or change their investigation design.