Mike Miedlar teaches high school environmental science, currently in a synchronous online format. His district is in the process of moving to a hybrid online/in-person format.
At the start of the school year I was at a loss for how to replicate the lab activities and live demos that are such a rich part of the science classroom. While I’ll be the first to admit the format presented here is not a complete solution, it does offer another option, not just during our current situation but potentially as another way to reach students in years to come. In fact, recent Harvard research comparing live in-person scientific demonstrations to pre-recorded video suggests that online videos can lead to better learning outcomes than live demos while maintaining the same level of student enjoyment. Giving students an opportunity to make predictions and discuss with others, drawing attention to key points, and using graphics to explain abstract concepts were all key components in making these online videos effective.
Currently our co-taught environmental science course meets in person with roughly half the students physically in class and the other half joining virtually through Zoom. Finding a balance between what we’ve done in the classroom in the past and what we are trying to achieve now through both virtual and in-person instruction is difficult. As a science teacher, I find that incorporating lab activities into the mix complicates the situation further. There are, of course, a variety of quality online simulation options. Most science teachers are familiar with the University of Colorado Boulder’s STEM simulations on PhET or the Howard Hughes Medical Institute’s BioInteractive, and more recent additions like Harvard’s LabXchange are adding high-quality original content as well. While resources such as these are a great way for students to interact virtually with STEM concepts, they can’t always be tailored to fit specific classroom needs.
The energy unit in our environmental science course normally includes two project-based learning activities: designing and building a passive solar model home and creating a small-scale wind turbine. Both activities include small, student-designed experiments meant to help with planning and project improvements. While most of the materials for these simple experiments could be readily found at home or provided to students in kits, the logistics of getting those materials into every student’s hands seemed very difficult. This year I wanted to give students the opportunity to perform these experiments in class as we normally would while creating an equally meaningful learning experience for those participating virtually. The virtual activity would ideally allow students to make predictions and to collect and analyze experimental data, while providing automated feedback to correct misconceptions along the way. While there are many tech tools that could meet some, if not all, of these requirements (EdPuzzle, Nearpod, and Canvas modules all come to mind), I decided to create a Google Form using the “Go to section based on answer” functionality for individualized feedback and support.
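For anyone curious what that branching looks like under the hood, each Forms “section” is just a page break that an answer choice can jump to. The sketch below sets up one branching question with Google Apps Script; the form title, question, and choices are hypothetical stand-ins, not pieces of our actual form.

    function buildBranchingDemo() {
      // Placeholder form; section routing works the same way in any form
      const form = FormApp.create('Wind Turbine Experiment Demo');

      // The question lives in the form's first section
      const question = form.addMultipleChoiceItem()
        .setTitle('Which variable belongs on the x-axis of your graph?');

      // Each "section" in the Forms editor is a page break item
      const reviewPage = form.addPageBreakItem().setTitle('Graphing review');
      const continuePage = form.addPageBreakItem().setTitle('Continue the experiment');

      // "Go to section based on answer": each choice routes to a section
      question.setChoices([
        question.createChoice('Independent variable (blade pitch)', continuePage),
        question.createChoice('Dependent variable (voltage)', reviewPage),
      ]);
    }

Students who answer correctly jump straight past the review section, while a misconception routes them through the extra support first; the review page then flows naturally on to the next section.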
The final Google Form has over 35 sections with embedded videos, diagrams, and questions. Nearly all of the sections link to multiple other sections based on student responses. Instead of building all this content directly into the Google Form, I first outlined the structure using a simple spreadsheet. This way I was able to quickly make edits to the sheet while avoiding the lagging load times and endless scrolling that can accompany editing a large Google Form.
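To give a sense of what such an outline looks like, each spreadsheet row can hold a section’s title and content alongside where each answer should route. A sheet structured that way could even be used to stub out the form’s sections automatically. The sketch below assumes a hypothetical bound sheet named “Outline” with title and description columns; it is an illustration of the idea, not my actual planning sheet.

    function stubSectionsFromOutline() {
      // Hypothetical layout: column A = section title, column B = section text
      // (script assumed to be bound to the planning spreadsheet)
      const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Outline');
      const rows = sheet.getDataRange().getValues().slice(1); // skip the header row

      const form = FormApp.create('Branching Lab Form');
      rows.forEach(([title, text]) => {
        // One page break (section) per outline row
        form.addPageBreakItem().setTitle(title).setHelpText(text);
      });
    }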
As students worked through the form, they recorded their data and provided explanations on a companion lab document. We have a district license for Kami, which allows students to add text, paste pictures, and write with a stylus on a shared, editable PDF. When students first open a Kami document, a copy is created for them within the teacher’s Google Drive and shared with them. We normally ask students to open these Kami attachments at the start of class, which gives me and my co-teacher time to open each document individually and monitor progress in real time. Sorting the Drive folder by “Last modified” time also gives a quick look at who is actively working and who may need individual help.
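That same “Last modified” check can also be pulled into a single at-a-glance list if you would rather not scan the Drive folder by eye. Here is a rough sketch of the idea; the folder ID is a placeholder for wherever the Kami copies land, and this is a convenience we could have used rather than something our workflow depended on.

    function listByLastModified() {
      // Placeholder ID for the folder holding each student's Kami copy
      const files = DriveApp.getFolderById('YOUR_FOLDER_ID').getFiles();
      const rows = [];
      while (files.hasNext()) {
        const f = files.next();
        rows.push([f.getName(), f.getLastUpdated()]);
      }
      // Most recently edited documents first
      rows.sort((a, b) => b[1].getTime() - a[1].getTime());
      rows.forEach(([name, when]) => Logger.log(name + ' - ' + when));
    }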
Students collected data directly from embedded video clips. The form gave students further details on graphing or more guidance in their explanations as needed. While the form admittedly took a bit of time to set up, it had the benefit of providing automated, differentiated support to each student at the moment they needed it. Since student responses were tracked within the Google Form, we could see who was directed to the additional support sections and reach out individually before their final document submissions.
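Reviewing responses in the Forms interface worked fine for our class size, but the same check could be scripted. The sketch below flags anyone whose answer would have routed them to a support section; the form ID and the specific “misconception” choice are hypothetical placeholders carried over from the earlier example.

    function flagStudentsNeedingFollowUp() {
      const form = FormApp.openById('YOUR_FORM_ID'); // placeholder ID
      // A choice that routes students to an extra-support section
      const misconception = 'Dependent variable (voltage)';
      form.getResponses().forEach((response) => {
        const hitReview = response.getItemResponses()
          .some((ir) => ir.getResponse() === misconception);
        if (hitReview) {
          // Requires the form to be set to collect email addresses
          Logger.log(response.getRespondentEmail() + ' saw the review section');
        }
      });
    }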
We originally planned on providing the guided Google Form version to virtual students and the open-ended version to in-person students. In the end, however, we offered both options to each group. Roughly 60% of the students chose the guided virtual option, with the other 40% selecting the open-ended design. Most in the open-ended group were in person, allowing us to provide further support to this smaller group. Several of the in-person students who completed the open-ended version said they appreciated the chance to step away from a screen and complete a hands-on activity. In general, the final products were stronger from those completing the guided virtual option, but both groups were able to grasp the overall concepts and meet our expectations.