COS 8-9 - Development of an instrument to assess student ability to select and incorporate scientific evidence from the primary literature in their writing

Monday, August 12, 2019: 4:20 PM
L006, Kentucky International Convention Center
Kate M. Hill, Department of Biological Science, Florida State University, Tallahassee, FL
Background/Question/Methods

Authentic science learning requires student participation in the process of science. This includes not only experimentation and collaboration, but also the ability to construct an argument using data and supporting evidence from the primary literature. To write scientifically, students must practice searching for, selecting, and incorporating primary literature in their writing. This complex set of skills requires that students be able to 1) identify valid sources and recognize potential biases, 2) select relevant sources and make connections between their experiment and the primary literature, 3) differentiate between valuable and irrelevant information, 4) integrate relevant information to support their argument, and 5) paraphrase information and properly cite the source. While most students have basic proficiency with search tools for accessing the primary literature, they often struggle to distinguish which search results directly relate to their research focus. This disconnect leads students to cite primary literature sources whose information is irrelevant to their experiment. To address this challenge, I developed and validated a formative assessment tool designed to measure the skills of relating experimental findings to the scientific literature and evaluating the quality, in both legitimacy and relevance, of sources used in scientific writing.

Results/Conclusions

The assessment was designed for and validated in an undergraduate biology laboratory course focused on experimental design and scientific writing. The assessment consisted of multiple-choice, short-response, and extended-response questions. Two-tiered multiple-choice items incorporated common misconceptions, derived from student interviews, as distractors. Extended-response questions were designed to assess a student's ability to make connections between data and information from a primary literature source. A sample of 203 undergraduate biology students (novices) and 10 graduate students (experts) completed the assessment, and responses were scored according to a rubric. The assessment effectively discriminated between experts and novices: experts scored significantly higher on the source selection assessment (ANOVA, p < .001). This assessment tool effectively diagnoses common student misconceptions about searching for and selecting appropriate scientific evidence and evaluates students' ability to integrate relevant information in support of their claims. By using this assessment to diagnose common misunderstandings, instructors can better inform and adapt their instruction to improve student use of evidence and reasoning in scientific writing.