Self-Study Recommendations
This page will receive minor updates to keep it aligned with the school program.
The following are recommended topics and materials for voluntary self-study. Some familiarity with the listed topics should help you follow and digest the content of ProbAI. We have organized the recommended self-study materials into the following groups:
- Preliminaries: It is important to have a good understanding of topics 1, 2, and 4.
- Tutorials: To follow the tutorials, familiarity with Python and the basics of PyTorch and Pyro (topic 5) is beneficial.
- Supplementary: Topics 3, 6, 7, and 8 may be helpful for following some of the more advanced lectures.
Topics and materials:
- Probability theory and Bayesian analysis:
  - [1] Chapters 2.1, 2.2, 2.3, 3.1
  - [2] Chapters 1.2, 1.3, 1.5.1, 1.5.2
- Gaussian (normal) distribution:
  - [1] Chapters 2.6, 3.2
  - [2] Chapter 2.3
- The exponential family (a short worked example for the Gaussian case is given below this list):
  - [1] Chapter 3.4
  - [2] Chapter 2.4
- Neural networks and backpropagation (see the PyTorch sketch below this list):
  - [1] Chapters 13.1, 13.2, 13.3.1-13.3.4
  - [2] Chapters 5.1-5.3
  - 3Blue1Brown: Neural networks (video series)
  - Additional reading on gradient descent: Sebastian Ruder, "An overview of gradient descent optimization algorithms"
- Deep learning frameworks:
  - PyTorch (basic usage is also illustrated in the sketch below this list)
  - Probabilistic programming with Pyro (see the SVI sketch below this list)
- Automatic variational inference (days 3 and 4)
- Variational autoencoders (day 3; see the sketch below this list)
- What is an ODE? (day 5; see the short example below this list)
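
The items on the Gaussian distribution and the exponential family (topics 2 and 3) are connected by the short example below, which is not taken from [1] or [2] but follows the standard notation: η denotes the natural parameters, T(x) the sufficient statistics, and A(η) the log-partition function of a family p(x | η) = h(x) exp{η^T T(x) - A(η)}.

```latex
% Univariate Gaussian written as a member of the exponential family.
\[
\mathcal{N}(x \mid \mu, \sigma^2)
  = \frac{1}{\sqrt{2\pi\sigma^2}}
    \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
  = h(x)\,\exp\!\left(\eta^\top T(x) - A(\eta)\right),
\qquad h(x) = 1,
\]
\[
T(x) = \begin{pmatrix} x \\ x^2 \end{pmatrix},
\qquad
\eta = \begin{pmatrix} \mu/\sigma^2 \\ -1/(2\sigma^2) \end{pmatrix},
\qquad
A(\eta) = \frac{\mu^2}{2\sigma^2} + \frac{1}{2}\log\!\left(2\pi\sigma^2\right).
\]
```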
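
For the neural networks and deep learning frameworks items (topics 4 and 5), the following is a minimal sketch, not taken from the ProbAI materials: a small PyTorch network fitted to a toy regression problem with gradient descent, where the call to loss.backward() performs backpropagation.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy regression data: y = sin(x) plus Gaussian noise.
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

# A two-layer feed-forward network.
model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # backpropagation: autograd computes gradients for all parameters
    optimizer.step()  # gradient-based parameter update

print(f"final training loss: {loss.item():.4f}")
```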
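
For Pyro and automatic variational inference (topic 5 and days 3-4), the sketch below is an assumed toy example rather than an official Pyro tutorial: it infers the mean of a Gaussian with known scale using an automatically constructed variational guide (AutoNormal) fitted by stochastic variational inference (SVI).

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

pyro.clear_param_store()
data = torch.randn(100) + 2.0  # synthetic observations centred around 2

def model(data):
    # Prior over the unknown mean; likelihood with known scale.
    mu = pyro.sample("mu", dist.Normal(0.0, 10.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(mu, 1.0), obs=data)

guide = AutoNormal(model)  # automatically constructed variational family
svi = SVI(model, guide, Adam({"lr": 0.05}), loss=Trace_ELBO())

for step in range(1000):
    svi.step(data)

print(guide.median())  # approximate posterior median of "mu"
```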
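
For the variational autoencoder item (day 3), here is a minimal sketch in plain PyTorch rather than the tutorial code; the architecture and layer sizes are arbitrary placeholders. It shows the three core ingredients: an encoder for the mean and log-variance of q(z|x), the reparameterization trick, and a negative-ELBO loss combining a Bernoulli reconstruction term with the KL divergence to the standard-normal prior.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)      # mean of q(z|x)
        self.enc_logvar = nn.Linear(h_dim, z_dim)  # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        logits = self.dec(z)                                     # logits of p(x|z)
        return logits, mu, logvar

def negative_elbo(x, logits, mu, logvar):
    # Bernoulli reconstruction term plus KL(q(z|x) || N(0, I)).
    recon = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

vae = VAE()
x = torch.rand(16, 784)  # stand-in for a batch of binarized images
loss = negative_elbo(x, *vae(x))
loss.backward()
```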
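
For the ODE item (day 5), the short example below is a generic illustration, not day-5 material: the initial value problem dy/dt = -y with y(0) = 1 has the exact solution y(t) = exp(-t), and the code integrates it numerically with the forward Euler method.

```python
import math

def euler(f, y0, t0, t1, steps=1000):
    """Integrate dy/dt = f(t, y) from t0 to t1 with the forward Euler method."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y = y + h * f(t, y)
        t = t + h
    return y

approx = euler(lambda t, y: -y, y0=1.0, t0=0.0, t1=2.0)
exact = math.exp(-2.0)
print(f"Euler approximation: {approx:.5f}, exact solution: {exact:.5f}")
```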
[1] Probabilistic Machine Learning: An Introduction (draft), Kevin Patrick Murphy. MIT Press, 2021.
[2] Pattern Recognition and Machine Learning, Christopher Bishop. Springer, 2006.