Reproducible biomedical benchmarking in the cloud: Lessons from crowd-sourced data challenges

Kyle Ellrott, Alex Buchanan, Allison Creason, Michael Mason, Thomas Schaffter, Bruce Hoff, James Eddy, John M. Chilton, Thomas Yu, Joshua M. Stuart, Julio Saez-Rodriguez, Gustavo Stolovitzky, Paul C. Boutros, Justin Guinney

Research output: Contribution to journal › Article › peer-review

Abstract

Challenges are achieving broad acceptance for addressing many biomedical questions and enabling tool assessment. But ensuring that the methods evaluated are reproducible and reusable is complicated by the diversity of software architectures, input and output file formats, and computing environments. To mitigate these problems, some challenges have leveraged new virtualization and compute methods, requiring participants to submit cloud-ready software packages. We review recent data challenges with innovative approaches to model reproducibility and data sharing, and outline key lessons for improving quantitative biomedical data analysis through crowd-sourced benchmarking challenges.

Original language: English (US)
Article number: 195
Journal: Genome Biology
Volume: 20
Issue: 1
State: Published - Sep 10, 2019

All Science Journal Classification (ASJC) codes

  • Ecology, Evolution, Behavior and Systematics
  • Genetics
  • Cell Biology
