Comparison of Reproduction Platforms for Continuous Science

The idea of Continuous Integration (CI), i.e. the execution of test and build workflows on every commit, has an analogy in science: the reproduction of scientific insights becomes "continuous" in the sense that updates to the data trigger the recalculation of statistical hypothesis tests, just as an update to the source code triggers the rebuilding of software assets in CI. This idea can be taken a step further if the scientific data themselves are created entirely in silico, e.g. by a simulation. We can then speak of replication (for the differentiation between reproduction/reproducibility and replication, see [1]).
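The analogy can be made concrete with a minimal sketch: a hypothesis test is recomputed only when the data change, just as CI rebuilds only on a new commit. The fingerprinting scheme, the one-sample t test, and all names below are illustrative assumptions, not part of any particular platform.

```python
# Illustrative sketch: rerun a hypothesis test whenever the data change,
# mirroring how CI rebuilds software on every commit. The fingerprinting
# scheme and the choice of test are hypothetical.
import hashlib
from math import sqrt
from statistics import mean, stdev

def data_fingerprint(samples):
    """Hash the data so a change triggers recomputation, like a new commit."""
    return hashlib.sha256(repr(samples).encode()).hexdigest()

def one_sample_t(samples, mu0=0.0):
    """t statistic for H0: the population mean equals mu0."""
    n = len(samples)
    return (mean(samples) - mu0) / (stdev(samples) / sqrt(n))

last_seen = None

def continuous_check(samples):
    """Recompute the statistic only if the data fingerprint changed."""
    global last_seen
    fp = data_fingerprint(samples)
    if fp != last_seen:
        last_seen = fp
        return one_sample_t(samples)
    return None  # unchanged data: nothing to redo
```

In a real RPCS the fingerprint would come from a data repository's version identifier rather than an in-memory hash, but the trigger-and-recompute loop is the same.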

Various platforms support such continuous science workflows, e.g. the Popper convention [2] and Conquaire [3].

These Reproduction Platforms for Continuous Science (RPCS) differ, among other things, in the set of features they support, their performance, their license, their installation procedure, and the protocols and standards they support.

The goal of this thesis is to survey all relevant platforms, develop a comparison scheme (such as a capability model), and evaluate the platforms against that scheme.
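One possible shape for such a comparison scheme is a weighted capability checklist: each platform is scored by the share of capabilities it supports. The platform names, capabilities, and weights below are hypothetical placeholders; the actual criteria are to be developed in the thesis.

```python
# Minimal sketch of a capability-model comparison. All platform names,
# capabilities, and weights are hypothetical placeholders.
CAPABILITIES = {              # capability -> weight in the overall score
    "workflow_automation": 3,
    "data_versioning": 2,
    "open_license": 1,
}

PLATFORMS = {                 # platform -> capabilities it supports
    "platform_a": {"workflow_automation", "open_license"},
    "platform_b": {"workflow_automation", "data_versioning"},
}

def score(platform):
    """Weighted share of supported capabilities, between 0 and 1."""
    supported = PLATFORMS[platform]
    total = sum(CAPABILITIES.values())
    return sum(w for cap, w in CAPABILITIES.items() if cap in supported) / total

# Rank platforms by descending score.
ranking = sorted(PLATFORMS, key=score, reverse=True)
```

A richer model could replace the boolean "supports / does not support" distinction with maturity levels per capability, in the style of capability maturity models.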

This master's thesis can also be undertaken by a group of motivated bachelor's students.

Requirements

Tasks

Compute Resources

Installing an RPCS may require computing resources beyond the capacity of a typical desktop computer. In that case, cloud computing capacity provided by the LRZ can be used.

Advisor (Aufgabensteller): Prof. Dr. Dieter Kranzlmüller

Number of students: one master's student or several bachelor's students (min. 3)

Supervisors:

Literature:

  1. Barba, L. A.: Terminologies for Reproducible Research. CoRR, 2018.
  2. Jimenez, I. et al.: The Popper Convention: Making Reproducible Systems Evaluation Practical. 2017 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), 2017, 1561-1570.
  3. Ayer et al.: Conquaire: Towards an Architecture Supporting Continuous Quality Control to Ensure Reproducibility of Research. D-Lib Magazine, 23, 2017.
  4. Ioannidis, J. P. A.: Why Most Published Research Findings Are False. PLOS Medicine, 2, 2005.


Last Change: Thu, 06 Sep 2018 04:04:07 +0200