Registration: Closed, but if you can get to the lecture hall, you are still welcome
Instructor: Richard McElreath
Location: MPI-EVA 2nd floor lecture hall (the big one)
Dates: 6 January to 13 March 2026
Beginner section: Tuesdays 10am-11am
Experienced section: Fridays 10am-11am
There are two changes this year.
First, I am doing live in-person lectures at MPI-EVA. There is no remote enrollment. But I will record lectures and make them available to the general public.
Second, this year I am splitting the course into two sections: Beginner and Experienced. See details under Lectures below.
This course teaches statistics, but it focuses on scientific models. The unfortunate truth about data is that nothing much can be done with it until we theorize about what caused it. Therefore the meaning of any statistical estimate depends upon assumptions outside the data and statistical model. So we will prioritize these outside assumptions: causal models, how to analyze them, and how to use them to construct scientifically meaningful statistical procedures. We will use Bayesian data analysis to connect scientific models to data. And we will learn powerful computational tools for coping with high-dimensional, imperfect data of the kind that biologists and social scientists face.
This is not a theory course that focuses on theorems and proofs. It is a practical course that focuses on reliable and reproducible statistical workflow.
Open to members of MPI-EVA, MPI-CBS, iDiv, Uni-Leipzig, and all the other scientific research institutes in Leipzig and the surrounding area. If you can physically get yourself to the lecture hall, and you are willing to keep up with the homework, then you are welcome.
I will do my best to record the live lectures, so students can review or make up for illness.
Prerequisite is at least one course in basic statistics and some experience with scripting in a language like R or Python.
Just show up at the lecture of your choice. If you complete all the homework for either section, you can have course credit.
There will be 10 weeks of instruction. The lectures will be in person, in the 2nd floor lecture hall at MPI-EVA. I will teach separate "beginner" and "experienced" sections.
This means the material in each section will move at half speed compared to previous years. Each section will take 10 weeks to cover what previously took 5 weeks. Hopefully this helps with work-life balance and learning.
Beginner section: Tuesdays 10am-11am
Experienced section: Fridays 10am-11am
The beginner section is for people new to causal inference and Bayesian regression modelling. It will cover about the first half of my book.
The experienced section is for people who have taken the course before but maybe lost steam in the second half. It will start with multilevel models and possibly follow student interest. For example, there are some new case studies I could present and analyze for the class, in place of old book content.
We'll use the 2nd edition of my book, <Statistical Rethinking>, and possibly some draft chapters for the 3rd edition. I'll provide a PDF of the book to enrolled students.
There will be one set of homework problems each week for each section. If you make a good attempt on all 10, you can have transferable course credit.
There are 10 weeks of instruction. Links to lecture recordings will appear in this table. Weekly problem sets are assigned each week and due the next week. Late work is always acceptable. But keeping up is in your own interest.
The new lectures for 2026 will appear as links in the table below.
Section A (Beginner) playlist: <Section A>
Section B (Experienced) playlist: <Section B>
Full 2026 Playlist (Sections A and B in chronological order): <2026 Playlist>
There is a set of recorded lectures from 2023 that might also be of use, either as a supplement to in-person lectures or for review: <Statistical Rethinking 2023 Playlist>
| Week | Meeting date | Section | Topic | Reading |
|---|---|---|---|---|
| 01 | 6 January | Beginner | <Introduction> | Chapters 1 & 2 |
| | 9 January | Experienced | <Multilevel Models> | Chapter 12 |
| 02 | 13 January | Beginner | <Garden of Forking Data> | Chapters 2 & 3 |
| | 16 January | Experienced | <Multilevel Model Expansion> | Chapter 13 |
| 03 | 20 January | Beginner | <Geocentric Models> | Chapter 4 |
| | 23 January | Experienced | <Correlated Features> | Chapter 13 |
| 04 | 27 January | Beginner | <Categories & Causes> | Chapter 4 |
| | 30 January | Experienced | <Group-level Confounds / Social Networks I> | Chapter 14 |
| 05 | 3 February | Beginner | <Estimands and Estiplans> | Chapters 4 & 5 |
| | 6 February | Experienced | <Social Networks II> | Chapter 14 |
| 06 | 10 February | Beginner | <Elemental Confounds I> | Chapter 6 |
| | 13 February | Experienced | <Gaussian Processes> | Chapter 15 |
| 07 | 17 February | Beginner | <Good and Bad Controls> | Chapter 6 |
| | 20 February | Experienced | <Measurement Models> | Chapter 15 |
| 08 | 24 February | Beginner | <MCMC and Item Response Models> | Chapters 9 & 10 |
| | 27 February | Experienced | <Missing and Censored Data> | Chapter 15 |
| 09 | 3 March | Beginner | Modeling Events | Chapters 10 & 11 |
| | 6 March | Experienced | Generalized Linear Madness | Chapter 16 |
| 10 | 10 March | Beginner | Multilevel Models | Chapter 12 |
| | 13 March | Experienced | Special Topic - Students' Choice! | |
This course involves a lot of scripting. Students can engage with the material using either the original R code examples or one of several conversions to other computing environments. The conversions are not always exact, but they are rather complete. See the list and links at https://xcelab.net/rm/
For those who want to use the original R code examples in the print book, you need to install the rethinking R package. The code is all on github at https://github.com/rmcelreath/rethinking/ and there are additional details about the package there, including information about using the more up-to-date cmdstanr instead of rstan as the underlying MCMC engine.
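As a rough sketch, installation usually looks something like the following. This assumes a recent R and a working C++ toolchain; the exact dependency list and recommended steps may differ, so treat the repository README as the authoritative source.

```r
# Sketch of a typical setup -- see the rethinking README for current instructions.

# Stan backend: cmdstanr is the more up-to-date engine mentioned above.
install.packages("cmdstanr",
    repos = c("https://stan-dev.r-universe.dev", getOption("repos")))
cmdstanr::install_cmdstan()  # compiles CmdStan; requires a C++ compiler

# Supporting packages (assumed dependency list -- check the README),
# then the rethinking package itself from GitHub:
install.packages(c("coda", "mvtnorm", "devtools", "loo", "dagitty", "shape"))
devtools::install_github("rmcelreath/rethinking")
```

After installation, `library(rethinking)` should load the package; if Stan-based models fail to compile, the package README has troubleshooting notes for the C++ toolchain.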
I will also post problem sets and solutions. Check the folders at the top of the repository.