2020 ESA Annual Meeting (August 3 - 6)

PS 27 Abstract - Systematic review and synthesis informs environmental decision-making: How to make sure your science is included

Caroline E. Ridley, US EPA, Center for Public Health and Environmental Assessment, Research Triangle Park, NC, Micah G. Bennett, Region 5, US Environmental Protection Agency, Chicago, IL, Sylvia Lee, Office of Research and Development, US Environmental Protection Agency, Washington, DC and Kate Schofield, Center for Public Health and Environmental Assessment, US EPA, Washington, DC
Background/Question/Methods

Environmental decision-makers often use data and information syntheses as input to their decision-making process. Increasingly, there are demands for syntheses that are demonstrably transparent, objective, and rigorous. One approach for imparting these attributes to a scientific synthesis is systematic review. Systematic review (SR) is a structured and highly documented process for gathering and synthesizing evidence from existing studies to form conclusions. SR was originally designed to assess the effectiveness of medical interventions, but the process, its underlying principles, and relevant tools are being applied in other disciplines, including conservation and ecology. Researchers interested in getting their science into the hands of decision-makers should have a basic understanding of SR to help ensure their work is incorporated into these influential scientific syntheses. Based on a set of SRs that we conducted to understand the relationship between nutrient stressors and biological responses in lotic ecosystems, we make recommendations to researchers on how to maximize the chances that their work will be included at the search, screening, and evaluation steps of an SR.

Results/Conclusions

Systematic reviews (SRs) use broad search tactics to capture relevant studies. To ensure searches return their work, researchers should publish in journals indexed by major scientific databases and cite seminal papers that often serve as seeds for citation mapping. Once search results are compiled, the SR team generally screens them for relevance, first based on title and abstract and then based on full text. Stating experimental design, results, and variables in clear language in the abstract decreases the likelihood of mistaken exclusion at the first stage. Putting key results in clearly labelled tables or figures and giving supplemental files informative titles can help ensure a paper's inclusion during full-text screening. Evidence evaluation can include quantitative meta-analysis. Ideally, the numerical values underlying figures in the main text should be recorded in supplemental information, and results that are not statistically significant should also appear in supplemental files. "Traditional" statistics are important to present alongside innovative, complex analyses. Finally, SR requires explicit consideration of study quality. Formal quality standards do not exist for ecological studies, but papers with complete descriptions of methods, large sample sizes, measures of uncertainty, and results reported regardless of statistical significance tend to show lower risk of bias and therefore exert greater influence on SR results.