Paper: Lecture-Tutorial Coherency: Student-Supplied Written Responses As Indicators of Academic Success
Volume: 473, Communicating Science
Page: 337
Authors: Eckenrode, J.; Welch, J. D.; Saldivar, H.; Prather, E. E.; Wallace, C. S.; CATS
Abstract: The Lecture-Tutorial Coherency Project investigates the correlation between the correctness and coherency of students' written Lecture-Tutorial (LT) responses and their understanding of introductory astronomy content. Astronomy education researchers, including undergraduate students from the CAE Ambassadors Program (former Astro 101 students who serve as instructional assistants), created rubrics designed to assess the correctness and coherency of students' written explanations of reasoning for three LT questions from the 2010 semester and four LT questions from the 2011 semester. We used these rubrics to score the LT responses of over 1300 students. We compared the averages of students' written correctness and coherency scores to their responses to LSCI questions and to conceptually difficult, closely related multiple-choice exam items. Our data indicate no significant difference in the correctness of responses between students who write weak vs. robust scientific explanations of reasoning. However, it is worth noting that the average LSCI normalized gain scores and average exam-question scores for this population of students (regardless of their correctness and coherency scores) are higher than what is typically achieved by students after traditional lecture-based instruction or in low-interactivity classrooms. These results suggest that the cognitive engagement required to complete the Socratic-dialog-driven LT activities is sufficient to promote higher levels of conceptual understanding regardless of whether the students actually write out their explanations.