COS 8-5 - Peer Review as a strategy to advance STEM students’ knowledge-building, writing, and interpersonal skills

Monday, August 12, 2019: 2:50 PM
L006, Kentucky International Convention Center
Aurora MacRae-Crerar and Valerie Ross, Center for Programs in Contemporary Writing, University of Pennsylvania, Philadelphia, PA
Background/Question/Methods

Peer review has had a significant influence on science since it originated over two centuries ago. Now an essential component of the scientific process, from publishing to funding to earning promotions, peer review is, broadly understood, a negotiation between author and reviewer in which the words used to describe the data, and the words used to review the manuscript, may rival the data themselves in influence and importance. While scientists have long recognized the need for good writing, more must be done to train future generations in how to wield words. Despite the centrality of peer review, few STEM students, undergraduate or graduate, are explicitly taught how to write meaningful peer reviews or how to interpret and act on the reviews they receive. Working closely with STEM faculty to create writing assignments and peer review criteria, we initiated STEM undergraduates into writing for an audience of peers and into evaluating their peers’ contributions to their knowledge/discourse community. We investigated whether STEM students’ knowledge, writing, and interpersonal skills would improve through the use of peer-reviewed assignments. We also asked whether digital tools could facilitate this process of learning and professionalization.

Results/Conclusions

The findings from our NSF-funded study shed light on how to improve the writing of undergraduate STEM students by teaching them to peer review their colleagues’ work effectively. We found strong evidence that peer review improves student writing in STEM and that digital tools, such as peer review software, further this goal. Working with Penn faculty and 394 undergraduates in 17 STEM-based courses, we examined 1,394 documents and peer reviews. A central takeaway from our study, a cross-institutional partnership with writing and STEM faculty at MIT, Dartmouth, USF, and NCSU, is that much of the learning, for students and faculty alike, took place in the process of designing, introducing, and assessing peer review itself. Of particular interest has been the development of STEM-based writing assessment criteria, grounded in extensive interviews with Penn STEM faculty and reviews of student and professional writing. In ongoing research on our assessment practices, we apply these criteria not only in STEM classes but also in our first-year writing seminars (n = 2,600 students per year), analyzing the criteria’s construct validity and the inter-rater reliability of instructor and peer assessment.
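
The inter-rater reliability analysis mentioned above can be illustrated with a simple agreement statistic. The sketch below computes an unweighted Cohen’s kappa between instructor and peer ratings on a single rubric criterion; the scores and the 1–4 scale are hypothetical placeholders for illustration, not data or methods from the study.

```python
# Minimal sketch: inter-rater agreement (unweighted Cohen's kappa) between
# instructor and peer scores on one assessment criterion.
# All ratings below are hypothetical examples.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (1-4 scale) for ten student drafts.
instructor = [3, 4, 2, 3, 3, 4, 1, 2, 3, 4]
peer       = [3, 3, 2, 3, 4, 4, 2, 2, 3, 4]
print(f"Cohen's kappa: {cohen_kappa(instructor, peer):.2f}")
```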