ReproducibiliTea is a journal club dedicated to reproducibility, open science, research quality, and good and bad research practices in any field.
Organizer: Simon Schwab
Fall semester 2020
The CRS will again host the ReproducibiliTea journal club in the fall semester of 2020. We meet roughly every other week on Thursdays at 16:00; see the schedule below for details. You can join the journal club here using Zoom.
Brew your own tea before you join. All participants are welcome!
Are you interested in discussing a paper? Contact us at firstname.lastname@example.org
Speaker information: The talk should be approximately 30 min, followed by a 15-30 min discussion. Please join the meeting 15 min early for an audio/video check.
Slides: PDFs of the talks are available here.
September 17, 2020 16:00-17:00h, Bernhard Voelkl, Universität Bern
"Reproducibility of animal research in light of biological variation"
October 29, 2020 16:00-17:00h, Mark Robinson, Universität Zürich
"Rethinking Reproducibility as a Criterion for Research Quality"
Spring semester 2020
June 4, 2020 16:00 - 17:00, Simon Schwab,
Variability in the analysis of a single neuroimaging dataset by many teams by Botvinik-Nezer et al. (2020)
May 14, 2020 16:00 - 17:00, Servan Grüninger,
What is replication? by Nosek & Errington (2020)
April 30, 2020 16:00 - 17:00, Bettina Baessler,
Robustness and Reproducibility of Radiomics in Magnetic Resonance Imaging: A Phantom Study by Baessler et al. (2019)
April 2, 2020 16:00 - 17:00, Samuel Pawel,
A Reproducible Data Analysis Workflow with R Markdown, Git, Make, and Docker by Peikert & Brandmaier (2019)
March 26, 2020 16:00 - 17:00, Filip Melinscak,
Automating Sciences: Philosophical and Social Dimensions by King et al. (2018)
Fall semester 2019
Royal Statistical Society discussion of “A new standard for the analysis and design of replication studies”
Samuel Pawel, Predictive evaluation of replication studies (Master thesis)
Why we need to report more than “Data were analyzed by t-tests or ANOVA”. Weissgerber et al. eLife, 2018.
Changed: Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Klein, Vianello et al., Advances in Methods and Practices in Psychological Science.
And, if time allows:
Predicting replication outcomes in the Many Labs 2 study. Forsell, Viganola et al.
Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results. R. Silberzahn, E. L. Uhlmann, D. P. Martin et al., Advances in Methods and Practices in Psychological Science, 1(3), 337–356.
Flexible Yet Fair: Blinding Analyses in Experimental Psychology. Gilles Dutilh, Alexandra Sarafoglou, and Eric-Jan Wagenmakers.