Getting Started With Pyro: Tutorials, How-to Guides and Examples
Welcome! This page collects tutorials written by the Pyro community. If you’re having trouble finding or understanding anything here, please don’t hesitate to ask a question on our forum!
New users: getting from zero to one
If you’re new to probabilistic programming or variational inference, you might want to start by reading the series Introductory Tutorials, especially the Introduction to Pyro. If you’re new to PyTorch, you may also benefit from reading the official introduction “Deep Learning with PyTorch.”
After that, you’re ready to get started using Pyro! (Yes, really!) Follow the instructions on the front page to install Pyro and look carefully through the series Practical Pyro and PyTorch, especially the first Bayesian regression tutorial. This tutorial goes step-by-step through solving a simple Bayesian machine learning problem with Pyro, grounding the concepts from the introductory tutorials in runnable code. Industry users interested in serving predictions from a trained model in C++ should also read the PyroModule tutorial.
Most users who reach this point will also find our guide to tensor shapes in Pyro essential reading. Pyro makes extensive use of the behavior of “array broadcasting” baked into PyTorch and other array libraries to parallelize models and inference algorithms, and while it can be difficult to understand this behavior initially, applying the intuition and rules of thumb there will go a long way toward making your experience smooth and avoiding nasty shape errors.
Core functionality: Deep learning, discrete variables and customizable inference
A basic familiarity with this introductory material is all you will need to dive right into exploiting Pyro’s two biggest strengths: integration with deep learning and automated exact inference for discrete latent variables. The former is described with numerous examples in the series Deep Generative Models. All are elaborations on the basic idea of the variational autoencoder, introduced in great detail in the first tutorial of this series.
Pyro’s facility with discrete latent variable models like the hidden Markov model is surveyed in the series Discrete Latent Variables. Making use of this in your own work will require careful reading of our overview and programming guide that opens this series.
Another feature of Pyro is its programmability, the subject of a series of tutorials in Customizing Inference. Users working with large models where only part of the model needs special attention may be interested in pyro.contrib.easyguide, introduced in the first tutorial of the series. Meanwhile, machine learning researchers interested in developing variational inference algorithms may wish to peruse the guide to implementing custom variational objectives, and a companion example that walks through implementing “Boosting BBVI”.
Particularly enthusiastic users and potential contributors, especially those interested in contributing to Pyro’s core components, may even be interested in how Pyro itself works under the hood, partially described in the series Understanding Pyro’s Internals. The mini-pyro example contains a complete and heavily commented implementation of a small version of the Pyro language in just a few hundred lines of code, and should serve as a more digestible introduction to the real thing.
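The core trick mini-pyro demonstrates is a stack of effect handlers that intercept sample statements. A toy version of that pattern in plain Python (independent of Pyro itself, with invented names) looks roughly like:

```python
import random

HANDLER_STACK = []  # innermost active handler is last

class Messenger:
    """Base effect handler: installs itself on a global stack."""
    def __enter__(self):
        HANDLER_STACK.append(self)
        return self
    def __exit__(self, *exc):
        HANDLER_STACK.pop()
    def process(self, msg):
        pass  # subclasses inspect or modify the message

class trace(Messenger):
    """Records every sample site it sees, like pyro.poutine.trace."""
    def __enter__(self):
        self.trace = {}
        return super().__enter__()
    def process(self, msg):
        self.trace[msg["name"]] = msg["value"]

def sample(name, fn):
    # Draw a value, then let each active handler see the message.
    msg = {"name": name, "value": fn()}
    for handler in reversed(HANDLER_STACK):
        handler.process(msg)
    return msg["value"]

def model():
    return sample("x", random.random)

with trace() as tr:
    model()
# tr.trace now maps "x" to the sampled value
```

Real Pyro handlers also let messages modify values on the way through, which is how conditioning and replay are implemented.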
Tools for specific problems
Pyro is a mature piece of open-source software with “batteries included.” In addition to the core machinery for modelling and inference, it includes a large toolkit of dedicated domain- or problem-specific modelling functionality.
One particular area of strength is time-series modelling via pyro.contrib.forecasting, a library for scaling hierarchical, fully Bayesian models of multivariate time series to thousands or millions of series and datapoints. This is described in the series Application: Time Series.
Another area of strength is probabilistic machine learning with Gaussian processes. pyro.contrib.gp, described in the series Application: Gaussian Processes, is a library within Pyro implementing a variety of exact or approximate Gaussian process models compatible with Pyro’s inference engines. Pyro is also fully compatible with GPyTorch, a dedicated library for scalable GPs, as described in their Pyro example series.
List of Tutorials
- Bayesian Regression - Introduction (Part 1)
- Bayesian Regression - Inference Algorithms (Part 2)
- Tensor shapes in Pyro
- Modules in Pyro
- High-dimensional Bayesian workflow, with applications to SARS-CoV-2 strains
- Interactive posterior predictive checks
- Using the PyTorch JIT Compiler with Pyro
- Example: distributed training via Horovod
- Variational Autoencoders
- The Semi-Supervised VAE
- Conditional Variational Auto-encoder
- Normalizing Flows - Introduction (Part 1)
- Deep Markov Model
- Attend Infer Repeat
- Example: Causal Effect VAE
- Example: Sparse Gamma Deep Exponential Family
- Probabilistic Topic Modeling
- scANVI: Deep Generative Modeling for Single Cell Data with Pyro
- Inference with Discrete Latent Variables
- Gaussian Mixture Model
- Dirichlet Process Mixture Models in Pyro
- Example: Toy Mixture Model With Discrete Enumeration
- Example: Hidden Markov Models
- Example: Capture-Recapture Models (CJS Models)
- Example: hierarchical mixed-effect hidden Markov models
- Example: Discrete Factor Graph Inference with Plated Einsum
- Example: Amortized Latent Dirichlet Allocation
- MLE and MAP Estimation
- Doing the same thing with AutoGuides
- Writing guides using EasyGuide
- Customizing SVI objectives and training loops
- Boosting Black Box Variational Inference
- Example: Neural MCMC with NeuTraReparam
- Example: Sparse Bayesian Linear Regression
- Example: reducing boilerplate with
- Example: analyzing baseball stats with MCMC
- Example: Inference with Markov Chain Monte Carlo
- Example: MCMC with an LKJ prior over covariances
- Compiled Sequential Importance Sampling
- Example: Sequential Monte Carlo Filtering
- Example: importance sampling
- The Rational Speech Act framework
- Understanding Hyperbole using RSA
- Example: Utilizing Predictive and Deterministic with MCMC and SVI