95th ESA Annual Meeting (August 1 -- 6, 2010)

COS 65-1 - Making the picture clear: Using multiple data sources to evaluate graduate teaching assistant professional development

Wednesday, August 4, 2010: 1:30 PM
334, David L Lawrence Convention Center
Sara A. Wyse1, Tammy M. Long2 and Diane Ebert-May2, (1)Biological Sciences, Bethel University, St. Paul, MN, (2)Plant Biology, Michigan State University, East Lansing, MI
Background/Question/Methods

Graduate teaching assistants (TAs) are increasingly responsible for teaching undergraduate biology courses, yet TA training often emphasizes science content over pedagogy. This research uses multiple sources of data to evaluate two models of TA professional development: a traditional, teacher-centered model (semester 1) and a reformed, learner-centered model (semesters 2 and 3). I collected data on thirty-three TAs teaching introductory biology. A comprehensive course reform, including TA professional development, began during semester 2 and continued through semester 3. During these semesters, TAs completed surveys and submitted classroom artifacts and videotapes of classroom practice. I quantified survey responses and compared them across the three semesters of professional development using a Kruskal-Wallis test. Two raters (ICC = 0.78) assigned each TA-designed learning objective and assessment item a cognitive processing level based on the cognitive domain of Bloom's Taxonomy. I compared mean Bloom ratings among professional development types using a Kruskal-Wallis test, with pairwise comparisons by Wilcoxon tests. Two trained raters (ICC = 0.70) scored videotapes of TAs using the Reformed Teaching Observation Protocol (RTOP) to determine the degree of learner-centered instructional practice occurring in their classrooms. I compared RTOP scores across the semesters of professional development using an ANOVA, and investigated pairwise differences using Tukey's HSD.

Results/Conclusions

In surveys, TAs reported that traditional professional development better prepared them to teach than the first semester of reformed professional development.
In contrast, TA-created artifacts indicated that TAs asked their students to achieve significantly higher levels of cognitive processing during the reform (Wilcoxon test, p < 0.001) and that they assessed student learning at Bloom levels comparable to their objectives (Wilcoxon test, p > 0.05, i.e., no difference in Bloom rating between objectives and assessments). Reformed professional development also significantly improved the degree of learner-centered instruction in TA classrooms (ANOVA, F(2) = 6.18, p < 0.05). The survey data thus suggested that reformed professional development was ineffective in preparing TAs for teaching, which contradicts the conclusion we reached from analyses of instructional artifacts and observations of classroom practice. Specifically, we observed that reformed professional development increased the cognitive levels TAs targeted in their instructional materials and increased their use of learner-centered practices in their classrooms. Our results suggest that TA perceptions alone do not generate a complete or accurate assessment of program effectiveness. Evaluating learner-centered professional development programs requires direct data on TA classroom practice in addition to self-report surveys.
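The statistical workflow described in the methods (Kruskal-Wallis across semesters, Wilcoxon-family pairwise follow-ups, and ANOVA with Tukey's HSD for RTOP scores) can be sketched in Python with SciPy. This is a hypothetical illustration on synthetic data; the actual TA ratings, group sizes, and rating scales are assumptions, not the study's data.

```python
# Hypothetical sketch of the statistical comparisons described in the
# abstract, run on synthetic data; the real TA ratings are not public.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic Bloom ratings (ordinal, 1-6) for TA-designed artifacts from
# semester 1 (traditional PD) and semesters 2-3 (reformed PD).
sem1 = rng.integers(1, 4, size=20)
sem2 = rng.integers(2, 7, size=20)
sem3 = rng.integers(2, 7, size=20)

# Kruskal-Wallis: do Bloom ratings differ across the three semesters?
h_stat, kw_p = stats.kruskal(sem1, sem2, sem3)

# Pairwise follow-up between traditional and reformed semesters
# (the rank-sum / Mann-Whitney form of the Wilcoxon test for
# independent groups).
u_stat, mw_p = stats.mannwhitneyu(sem1, np.concatenate([sem2, sem3]))

# Synthetic RTOP scores (roughly continuous): one-way ANOVA across
# semesters, followed by Tukey's HSD for pairwise comparisons.
rtop1 = rng.normal(35, 5, size=11)
rtop2 = rng.normal(45, 5, size=11)
rtop3 = rng.normal(50, 5, size=11)
f_stat, anova_p = stats.f_oneway(rtop1, rtop2, rtop3)
tukey = stats.tukey_hsd(rtop1, rtop2, rtop3)

print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {kw_p:.4f}")
print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {mw_p:.4f}")
print(f"ANOVA: F = {f_stat:.2f}, p = {anova_p:.4f}")
print("Tukey HSD p-values:\n", tukey.pvalue)
```

Note that the abstract's objectives-versus-assessments comparison pairs two ratings from the same TA, for which the paired `stats.wilcoxon` signed-rank test would be the appropriate variant rather than the independent-samples test shown here.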