LOT Winter School 2019

Effect sizes and meta-analyses: Tools for cumulative, robust experimental science

Christina Bergmann


Email address: christina.bergmann@mpi.nl

Teacher's website: http://www.mpi.nl/people/bergmann-christina


Level: advanced

Course description:

Single studies have long been the norm in experimental sciences for establishing “facts” about the world, with little regard for cumulative thinking or the reproducibility of the reported effects. This is problematic because our statistical tools are never completely conclusive. One consequence is that results frequently cannot be replicated, either because the effect is smaller than reported or because it simply is not present in the general population. Cumulative science, i.e., considering multiple studies together to get a better idea of what might be true, is one answer to this problem.

This course will give an introduction to the tools of cumulative science: effect sizes and meta-analytic methods, including how to determine sample sizes and make informed design choices before running a study. Attendees are expected to have basic knowledge of the R statistical programming language and of standard frequentist statistical tests (t-tests, correlations, linear models).
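Sample-size planning of the kind mentioned above can be illustrated with the standard normal-approximation formula for a two-group comparison. The sketch below is in Python for illustration only (the course itself uses R), and the smallest effect size of interest, d = 0.5, is a hypothetical choice:

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample test
    of a standardized effect d, using the normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Hypothetical smallest effect size of interest: d = 0.5
print(n_per_group(0.5))  # 63 participants per group (normal approximation)
```

An exact power analysis based on the noncentral t distribution (e.g., R's power.t.test, or G*Power) gives a slightly larger n; the approximation is only meant to show the logic.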

Day-to-day program


Lecture 1: Why cumulative science matters: reproducibility, replicability, and robust science.


Lecture 2: Theoretical introduction to effect sizes and meta-analyses, and how to consider replicability from the start.


Lecture 3: Practical introduction to conducting reproducible meta-analyses 1: systematic literature review; computing effect sizes from different reported statistics.


Lecture 4: Practical introduction to conducting reproducible meta-analyses 2: meta-analytic models, interpreting the output, conducting moderator analyses.


Lecture 5: Practical introduction to conducting reproducible meta-analyses 3: meta-analytic visualizations, and using meta-analyses to plan studies; limitations of meta-analyses.
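Computing effect sizes from different reported statistics follows standard conversion formulas (e.g., Lipsey & Wilson, 2001). As a language-neutral illustration in Python (the practicals themselves use R), Cohen's d and its sampling variance can be recovered from a reported two-sample t-statistic; the numbers below are hypothetical:

```python
import math

def d_from_t(t, n1, n2):
    """Cohen's d for two independent groups, recovered from a reported t."""
    d = t * math.sqrt(1 / n1 + 1 / n2)
    # Large-sample approximation of the sampling variance of d
    var_d = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))
    return d, var_d

# Hypothetical study: t(38) = 2.10 with 20 infants per group
d, var_d = d_from_t(2.10, 20, 20)
print(round(d, 3), round(var_d, 3))  # 0.664 0.106
```

The variance is what makes an effect size meta-analyzable: it determines how much weight the study receives when effect sizes are pooled.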

Reading list

Background and preparatory readings:


    Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359-1366. https://doi.org/10.1177/0956797611417632
    Frank, M.C., Bergelson, E., Bergmann, C., Cristia, A., Floccia, C., Gervain, J., Hamlin, J.K., Hannon, E.E., Kline, M., Levelt, C., Lew-Williams, C., Nazzi, T., Panneton, R., Rabagliati, H., Soderstrom, M., Sullivan, J., Waxman, S., Yurovsky, D. (2017). A collaborative approach to infant research: Promoting reproducibility, best practices, and theory-building. Infancy, 22(4), 421-435. https://doi.org/10.1111/infa.12182

Course readings:

Lecture 1:

    Mills‐Smith, L., Spangler, D. P., Panneton, R., & Fritz, M. S. (2015). A missed opportunity for clarity: Problems in the reporting of effect size estimates in infant developmental science. Infancy, 20(4), 416-432. https://doi.org/10.1111/infa.12078

Lecture 2:

    Lakens, D. (2013). Calculating and reporting effect sizes to facilitate cumulative science: a practical primer for t-tests and ANOVAs. Frontiers in Psychology, 4, 863. https://doi.org/10.3389/fpsyg.2013.00863

Lecture 3:

    Install R and RStudio to participate in the practical.
    For advanced R users: also install the package ‘metafor’.
    Tsuji, S., Bergmann, C., & Cristia, A. (2014). Community-augmented meta-analyses: Toward cumulative data assessment. Perspectives on Psychological Science, 9(6), 661-665. https://doi.org/10.1177/1745691614552498

Lecture 4:

    Black, A., & Bergmann, C. (2017). Quantifying infants' statistical word segmentation: A meta-analysis. In G. Gunzelmann, A. Howes, T. Tenbrink, & E. Davelaar (Eds.), Proceedings of the 39th Annual Meeting of the Cognitive Science Society (pp. 124-129). Austin, TX: Cognitive Science Society. https://mindmodeling.org/cogsci2017/papers/0035/in...

Lecture 5:

    Bergmann, C., Tsuji, S., Piccinini, P. E., Lewis, M. L., Braginsky, M. B., Frank, M. C., & Cristia, A. (2018). Promoting replicability in developmental research through meta-analyses: Insights from language acquisition research. Child Development. Advance online publication. https://doi.org/10.1111/cdev.13079

Further readings:

    Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Sage Publications, Inc.