Program
The following program is tentative and will receive updates.
Concept
We intend to provide efficient, high-quality knowledge transfer through:
- a carefully designed curriculum and program,
- coordination between our lecturers,
- a mix of theoretical lectures and hands-on tutorials,
- support for students from our teaching assistants,
- close collaboration between students, lecturers and teaching assistants.
All of that in a friendly and inclusive environment that we will co-create together. Please read our Code of Conduct.
Topics
The program will be structured into theoretical lectures and hands-on tutorials covering the following modules:
- Probabilistic models, variational inference and probabilistic programming (Day 1 to Day 2)
  - Introduction to probabilistic modeling
    - Bayesian modeling: prior, likelihood and posterior (see the short sketch after this list)
    - Concepts of Bayesian networks and latent-variable models
    - Posterior inference and parameter learning
    - Modeling techniques
  - Variational inference
    - Mean-field, CAVI and conjugate models
    - Stochastic variational inference and optimization
    - Black-box variational inference
    - Automatic differentiation variational inference
  - Probabilistic programming
    - Introduction to the concept of probabilistic programming
    - Language syntax and semantics
    - Inference mechanisms
- Deep Generative Models (Day 3 to Day 5)
  - Variational Auto-Encoders
  - Generative Adversarial Networks
  - Normalizing Flows
  - ODEs and Bayesian Neural Nets
  - Simulation-Based Inference
  - Bayesian Neural Networks
  - Gaussian Processes
Note: The described modules provide a non-exhaustive list of covered topics.
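As a small taste of the Bayesian modeling and conjugacy topics listed above, here is a minimal sketch (our own illustration, not official course material) of a conjugate Beta-Bernoulli posterior update in plain Python; the prior parameters and coin-flip data are made up for the example.

    # Beta prior over a coin's bias; Bernoulli likelihood for observed flips.
    # Conjugacy gives the posterior in closed form: Beta(alpha + heads, beta + tails).
    alpha_prior, beta_prior = 2.0, 2.0          # hypothetical prior pseudo-counts
    flips = [1, 0, 1, 1, 0, 1, 1, 1]            # hypothetical data (1 = heads)
    heads = sum(flips)
    tails = len(flips) - heads
    alpha_post = alpha_prior + heads
    beta_post = beta_prior + tails
    posterior_mean = alpha_post / (alpha_post + beta_post)
    print(f"Posterior: Beta({alpha_post}, {beta_post}), mean bias = {posterior_mean:.3f}")

The same closed-form update applies to any conjugate prior-likelihood pair; non-conjugate models are where the variational inference module takes over.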
Program
The detailed program will be announced soon.
The program is scheduled from 8:45 (9:00) to 18:00 CEST. Note: The schedule may receive minor updates.
The session scheduled for June 17 at 9:00 CEST is only relevant to students registered for the PhD course DT8122.
The lectures, tutorials and talks below are listed in chronological order:
- Max Welling – Keynote
- Antonio Salmerón – Introduction to Probabilistic Models
- Andrés R. Masegosa & Thomas D. Nielsen – Probabilistic Programming
- Atılım Güneş Baydin – Probabilistic Programming, Machine Learning, and Physics
- Keith L. Downing – Emergent AI
- Arto Klami – Variational Inference and Optimization (part 1)
- Andrés R. Masegosa & Thomas D. Nielsen – Variational Inference and Probabilistic Programming (part 1)
- Evrim Acar Ataman – Tensor Factorizations for Physical, Chemical, and Biological Systems
- Arto Klami – Variational Inference and Optimization (part 2)
- Andrés R. Masegosa & Thomas D. Nielsen – Variational Inference and Probabilistic Programming (part 2)
- Fredrik D. Johansson – Causality and Machine Learning
- Wilker Aziz – Deep Discrete Latent Variable Models
- Francisco Ruiz – Variational Inference with Implicit and Semi-Implicit Distributions
- Christos Dimitrakakis – Bayesian Reinforcement Learning
- Mihaela Rosca – How to Build a GAN Objective
- Didrik Nielsen – Normalizing Flows and PixelCNN
- Çağatay Yıldız – Neural ODE & ODE2VAE
Concept
We are designing a virtual experience that still allows for social interaction and collaboration similar to a physical event. With that in mind, we are also selecting digital platforms that support these requirements.
Together with the intentionally small team of invited lecturers, we hope to provide efficient, high-quality knowledge transfer through:
- a carefully designed curriculum and program,
- coordination between our lecturers,
- a mix of theoretical lectures and hands-on tutorials,
- support for students from our teaching assistants,
- platforms for close interaction and collaboration between students, lecturers and teaching assistants.
The program is scheduled during daytime hours in CEST. We realize that the time zone differences are an inconvenience, but we truly hope this will not be a barrier to participation.
Topics
The program will be structured into theoretical lectures and hands-on tutorials covering the following modules:
- Probabilistic models, variational inference and probabilistic programming (Day 1 to Day 3)
  - Introduction to probabilistic modeling
    - Bayesian modeling: prior, likelihood and posterior
    - Concepts of Bayesian networks and latent-variable models
    - Posterior inference and parameter learning
    - Modeling techniques
  - Variational inference
    - Mean-field, CAVI and conjugate models
    - Stochastic variational inference and optimization
    - Black-box variational inference
    - Automatic differentiation variational inference
  - Probabilistic programming
    - Introduction to the concept of probabilistic programming
    - Language syntax and semantics
    - Inference mechanisms
- Deep Generative Models (Day 4 and Day 5)
  - Introduction to Deep Learning
    - Examples of models (ConvNet, RNN, etc.)
    - Learning: stochastic optimization and backpropagation
  - Variational Auto-Encoders (see the short sketch after this list)
  - Generative Adversarial Networks
  - Normalizing Flows
  - ODEs and Bayesian Neural Nets
Note: The described modules provide a non-exhaustive list of covered topics.
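To illustrate two ideas named in the Deep Generative Models module above, the reparameterization trick and the KL term of the variational auto-encoder objective, here is a minimal sketch in plain Python with NumPy; the variational parameters are made up and no encoder or decoder network is involved.

    import numpy as np

    # Hypothetical variational parameters of q(z|x) = N(mu, diag(sigma^2))
    # with a standard normal prior p(z) = N(0, I).
    mu = np.array([0.5, -0.3])
    log_sigma = np.array([-0.2, 0.1])
    sigma = np.exp(log_sigma)

    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
    # so samples of z are differentiable with respect to mu and sigma.
    eps = np.random.randn(*mu.shape)
    z = mu + sigma * eps

    # Closed-form KL(q(z|x) || p(z)) between a diagonal Gaussian and N(0, I),
    # the regularization term of the VAE evidence lower bound.
    kl = 0.5 * np.sum(sigma**2 + mu**2 - 1.0 - 2.0 * log_sigma)
    print("sample z:", z, "  KL term:", round(float(kl), 4))

In a full VAE, mu and log_sigma would be produced by an encoder network, and the KL term would be combined with a reconstruction likelihood from a decoder.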