#SciData19 Writing Competition: Winning Entry #1

We are proud to publish the first of this year's four winning entries for the Better Science through Better Data writing competition - congratulations to Anna Holderbaum.

Question: What support do researchers need to implement reproducible research?


Anna Holderbaum - Queen's University Belfast

Accelerating scientific discovery, generating better products, improving quality of life and protecting the environment - these are some of the broad impacts and benefits of reproducible research. Scientists strive to make their research reproducible because it lends credibility to results and is a cornerstone of evidence-based decision-making. Recognising the need for reproducibility across the entire research lifecycle has become a multi-stakeholder priority involving scientists at all career stages, publishers, funding agencies, the public, industry and science policy representatives. Several initiatives have been launched in response to current challenges and opportunities to improve the reproducibility of research, and researchers and publishers alike have adopted measures such as improved reporting standards, data transparency and preregistration of studies.

From a biological sciences perspective, reproducibility is a multifaceted topic. How can we address it? The scientific process and biological systems are complex, with many variables to consider, so scientists must document their research rigorously and comprehensively from the beginning. Seemingly arbitrary variables such as ambient temperature or season can greatly influence the outcome: for example, hatching rates of plant-parasitic nematodes are higher in summer than in winter [1], which may significantly alter the results of studies testing novel control strategies. Publishers can support this endeavour by encouraging researchers to publish in as much detail as possible, imposing no restrictions on length or supplementary materials, and promoting digital tools such as protocols.io [2]. International reporting recommendations agreed upon by relevant experts (e.g. those promoted by the EQUATOR Network [3]), which outline the important variables, guide researchers towards comprehensive reports.
On the other hand, overly rigorous standardisation has been shown to contribute to poor reproducibility in preclinical research, whereas diversity of study samples improved the reproducibility of results [4]. Similarly, ignoring biological variation in sex by predominantly studying male animals has impeded the translation of research findings to humans, and the British Journal of Pharmacology has recently taken measures to ensure that both male and female animals are included [5]. There is therefore potential to advance science by being inclusive and embracing diversity and even non-reproducibility. A mathematical model has shown that reproducible results are not always scientifically accurate, while accurate scientific results are not always reproducible [6]. Furthermore, diverging observations between two repeated experiments can lead to a better understanding of the variables and processes involved, as when quality issues with commercial research antibodies were uncovered [7].

It is routine laboratory practice to order reagents and kits, outsource bioinformatic analyses or employ automated data-analysis workflows - a reliance that is a double-edged sword. It saves time and resources, but it leaves little opportunity for researchers to fully grasp the intricacies of each technique and fosters a fast-paced, high-pressure environment. There is no one-size-fits-all approach to good science, and reproducibility is imperative but not the only criterion. As a research community we can learn from each other - it is important to record and share lessons learned; to establish, follow and improve guidelines; and to continuously and critically reflect upon the systems in place.


[1] R. E. Ingham, D. Kroese, I. A. Zasada. Effect of Storage Environment on Hatching of the Cyst Nematode Globodera ellingtonae. J. Nematol., 2015, 47, 45. 

[2] L. Teytelman, A. Stoliartchouk, L. Kindler, B. L. Hurwitz. Protocols.io: virtual communities for protocol development and discussion. PLoS Biol., 2016, 14, e1002538. 

[3] I. Simera, D. Moher, J. Hoey, K. F. Schulz, D. G. Altman. A catalogue of reporting guidelines for health research. Eur. J. Clin. Invest., 2010, 40, 35–53. 

[4] B. Voelkl, L. Vogt, E. S. Sena, H. Würbel. Reproducibility of preclinical animal research improves with heterogeneity of study samples. PLoS Biol., 2018, 16, e2003693. 

[5] J. R. Docherty, S. C. Stanford, R. A. Panattieri, S. P. H. Alexander, G. Cirino, C. H. George, D. Hoyer, A. A. Izzo, Y. Ji, E. Lilley. Sex: A change in our guidelines to authors to ensure that this is no longer an ignored experimental variable. Br. J. Pharmacol., 2019. 

[6] B. Devezer, L. G. Nardin, B. Baumgaertner, E. O. Buzbas. Scientific discovery in a model-centric framework: Reproducibility, innovation, and epistemic diversity. PLoS One, 2019, 14, e0216125. 

[7] M. G. Weller. Quality issues of research antibodies. Anal. Chem. Insights, 2016, 11, ACI-S31614.

Don't forget to register for Better Science through Better Data on 6 November at the Wellcome Collection in London to learn about data sharing and open science.

Meet the other writing competition winners here.