This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure proper rendering of formulas.

- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
  Introduction to Bayesian linear regression. Implemented from scratch with plain NumPy, with scikit-learn used for
  comparison. See also the
  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).
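
  As a flavor of what the notebook covers, here is a minimal sketch of the conjugate posterior update over the
  weights, assuming a design matrix `Phi`, targets `t`, prior precision `alpha` and noise precision `beta`
  (illustrative names, not necessarily those used in the notebook):

  ```python
  import numpy as np

  def posterior(Phi, t, alpha, beta):
      """Gaussian posterior over weights for prior w ~ N(0, I / alpha)
      and Gaussian noise with precision beta: returns mean m_N and covariance S_N."""
      S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
      S_N = np.linalg.inv(S_N_inv)
      m_N = beta * S_N @ Phi.T @ t
      return m_N, S_N
  ```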
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb).
  Introduction to Gaussian processes for regression. Example implementations with plain NumPy/SciPy as well as with the
  scikit-learn and GPy libraries ([requirements.txt](gaussian-processes/requirements.txt)).
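
  For orientation, a minimal GP regression example with scikit-learn (an illustrative sketch on synthetic data,
  not code from the notebook):

  ```python
  import numpy as np
  from sklearn.gaussian_process import GaussianProcessRegressor
  from sklearn.gaussian_process.kernels import RBF

  rng = np.random.RandomState(0)
  X = rng.uniform(-5, 5, (20, 1))              # training inputs
  y = np.sin(X).ravel() + 0.1 * rng.randn(20)  # noisy targets

  gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.1 ** 2)
  gpr.fit(X, y)
  mu, std = gpr.predict(X, return_std=True)    # posterior mean and standard deviation
  ```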
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)
  [Gaussian processes for classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb).
  Introduction to Gaussian processes for classification. Example implementations with plain NumPy/SciPy as well as with
  scikit-learn ([requirements.txt](gaussian-processes/requirements.txt)).
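
  A minimal classification counterpart with scikit-learn (again an illustrative sketch, not from the notebook):

  ```python
  import numpy as np
  from sklearn.gaussian_process import GaussianProcessClassifier
  from sklearn.gaussian_process.kernels import RBF

  rng = np.random.RandomState(0)
  X = rng.uniform(-5, 5, (40, 1))
  y = (X.ravel() > 0).astype(int)  # toy binary labels

  gpc = GaussianProcessClassifier(kernel=RBF(length_scale=1.0))
  gpc.fit(X, y)
  proba = gpc.predict_proba(X)     # class probabilities from the latent GP
  ```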
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
  Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the
  scikit-optimize and GPyOpt libraries. Hyper-parameter tuning is shown as an application example.
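
  A minimal example with scikit-optimize, assuming a one-dimensional toy objective (illustrative only):

  ```python
  from skopt import gp_minimize

  def objective(x):
      """Toy objective with its minimum at x = 2."""
      return (x[0] - 2.0) ** 2

  res = gp_minimize(objective, dimensions=[(-5.0, 5.0)], n_calls=20, random_state=0)
  print(res.x, res.fun)  # best input found and its objective value
  ```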
- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
  Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
  with Keras ([requirements.txt](bayesian-neural-networks/requirements.txt)). See also the
  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb).
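
  The notebook builds its own variational layers; as a rough flavor of the idea, a mean-field Bayesian network can
  also be sketched with Tensorflow Probability's `DenseFlipout` layers (an assumption-laden sketch, assuming a TFP
  version that provides this layer, not the notebook's implementation):

  ```python
  import tensorflow as tf
  import tensorflow_probability as tfp

  # Each DenseFlipout layer keeps a variational posterior over its weights and
  # registers the corresponding KL term via the layer's losses. In practice the
  # KL term is usually rescaled by 1 / num_train_examples.
  model = tf.keras.Sequential([
      tfp.layers.DenseFlipout(32, activation="relu"),
      tfp.layers.DenseFlipout(1),
  ])
  model.compile(optimizer="adam", loss="mse")  # layer losses are added to the training loss
  ```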
- [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
  Uses noise contrastive priors in Bayesian neural networks to obtain more reliable uncertainty estimates for
  out-of-distribution (OOD) data. Implemented with Tensorflow 2 and Tensorflow Probability
  ([requirements.txt](noise-contrastive-priors/requirements.txt)).

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
  [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
  Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
  implementation with plain NumPy/SciPy and scikit-learn for comparison. See also the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb).
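
  For comparison with a from-scratch EM implementation, the scikit-learn fit can be as short as this
  (illustrative sketch with synthetic data):

  ```python
  import numpy as np
  from sklearn.mixture import GaussianMixture

  rng = np.random.RandomState(0)
  X = np.vstack([rng.randn(100, 2), rng.randn(100, 2) + 5.0])  # two well-separated clusters

  gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
  gmm.fit(X)                        # runs EM internally
  print(gmm.means_)                 # estimated component means
  print(gmm.predict_proba(X[:3]))   # posterior responsibilities for a few points
  ```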
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
  Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation
  with Tensorflow 2.x.
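
  The core trick behind the VAE training objective is reparameterized sampling of the latent code; a minimal sketch
  (hypothetical helper, not lifted from the notebook):

  ```python
  import tensorflow as tf

  def reparameterize(mu, log_var):
      """Sample z ~ N(mu, sigma^2) as mu + sigma * eps with eps ~ N(0, I),
      so that gradients can flow through mu and log_var."""
      eps = tf.random.normal(tf.shape(mu))
      return mu + tf.exp(0.5 * log_var) * eps
  ```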
- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
  Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example