
Commit 980cf89

Gaussian processes for classification
1 parent 0e74400 commit 980cf89

4 files changed: +912 −24 lines changed


README.md

Lines changed: 27 additions & 22 deletions
@@ -5,41 +5,46 @@
 This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
 some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas.
 
-- [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
-  Applies noise contrastive priors to Bayesian neural networks to get more reliable uncertainty estimates for OOD data.
-  Implemented with Tensorflow 2 and Tensorflow Probability ([requirements.txt](noise-contrastive-priors/requirements.txt)).
+- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
+  Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
+  for comparison. See also
+  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
+  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).
+
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
+  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb).
+  Introduction to Gaussian processes for regression. Example implementations with plain NumPy/SciPy as well as with libraries
+  scikit-learn and GPy ([requirements.txt](gaussian-processes/requirements.txt)).
+
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)
+  [Gaussian processes for classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb).
+  Introduction to Gaussian processes for classification. Example implementations with plain NumPy/SciPy as well as with
+  scikit-learn ([requirements.txt](gaussian-processes/requirements.txt)).
+
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
+  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
+  Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
+  scikit-optimize and GPyOpt. Hyper-parameter tuning as application example.
 
 - [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
   Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
   with Keras ([requirements.txt](bayesian-neural-networks/requirements.txt)). See also
   [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb).
 
-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
-  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
-  Introduction to stochastic variational inference with variational autoencoder as application example. Implementation
-  with Tensorflow 2.x.
+- [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
+  Uses noise contrastive priors in Bayesian neural networks to get more reliable uncertainty estimates for OOD data.
+  Implemented with Tensorflow 2 and Tensorflow Probability ([requirements.txt](noise-contrastive-priors/requirements.txt)).
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
   [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
   Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
   implementation with plain NumPy/SciPy and scikit-learn for comparison. See also
   [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb).
 
-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
-  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
-  Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
-  scikit-optimize and GPyOpt. Hyper-parameter tuning as application example.
-
-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
-  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb).
-  Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with libraries
-  scikit-learn and GPy ([requirements.txt](gaussian-processes/requirements.txt)).
-
-- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
-  Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
-  for comparison. See also
-  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
-  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
+  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
+  Introduction to stochastic variational inference with variational autoencoder as application example. Implementation
+  with Tensorflow 2.x.
 
 - [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
   Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example

gaussian-processes/gaussian_processes.ipynb

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@
 "\n",
 "Observations that are closer to $\\mathbf{x}$ have a higher weight than observations that are further away. Weights are computed from $\\mathbf{x}$ and observed $\\mathbf{x}_i$ with a kernel $\\kappa$. A special case is k-nearest neighbors (KNN) where the $k$ closest observations have a weight $1/k$, and all others have weight $0$. Non-parametric methods often need to process all training data for prediction and are therefore slower at inference time than parametric methods. On the other hand, training is usually faster as non-parametric models only need to remember training data. \n",
 "\n",
-"Another example of non-parametric methods are [Gaussian processes](https://en.wikipedia.org/wiki/Gaussian_process) (GPs). Instead of inferring a distribution over the parameters of a parametric function Gaussian processes can be used to infer a distribution over functions directly. A Gaussian process defines a prior over functions. After having observed some function values it can be converted into a posterior over functions. Inference of continuous function values in this context is known as GP regression but GPs can also be used for classification. \n",
+"Another example of non-parametric methods are [Gaussian processes](https://en.wikipedia.org/wiki/Gaussian_process) (GPs). Instead of inferring a distribution over the parameters of a parametric function Gaussian processes can be used to infer a distribution over functions directly. A Gaussian process defines a prior over functions. After having observed some function values it can be converted into a posterior over functions. Inference of continuous function values in this context is known as GP regression but GPs can also be used for [classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb). \n",
 "\n",
 "A Gaussian process is a [random process](https://en.wikipedia.org/wiki/Stochastic_process) where any point $\\mathbf{x} \\in \\mathbb{R}^d$ is assigned a random variable $f(\\mathbf{x})$ and where the joint distribution of a finite number of these variables $p(f(\\mathbf{x}_1),...,f(\\mathbf{x}_N))$ is itself Gaussian:\n",
 "\n",

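The kernel-weighting paragraph quoted in the context lines above (weights computed with a kernel $\kappa$, with KNN as the special case of weight $1/k$ for the $k$ closest points) translates into a few lines of NumPy. The sketch below is illustrative only, not code from the notebook: the Gaussian kernel, the bandwidth `h`, and the helper names are assumptions.

```python
import numpy as np

def gaussian_weights(x, X, h=0.5):
    # Kernel values kappa(x, x_i) for a query point x against all observed
    # points X (shape (N, d)); h is an assumed bandwidth parameter.
    sq_dists = np.sum((X - x) ** 2, axis=1)
    return np.exp(-0.5 * sq_dists / h ** 2)

def kernel_predict(x, X, y, h=0.5):
    # Kernel-weighted average of observed targets y: closer observations
    # get higher weight, distant ones contribute almost nothing.
    w = gaussian_weights(x, X, h)
    return np.sum(w * y) / np.sum(w)

def knn_predict(x, X, y, k=5):
    # KNN special case: the k closest observations get weight 1/k,
    # all others weight 0.
    idx = np.argsort(np.sum((X - x) ** 2, axis=1))[:k]
    return np.mean(y[idx])

# Noisy observations of sin(x). Note that every prediction touches all
# training data, which is why non-parametric inference is slower than
# evaluating a fitted parametric model.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=50)
print(kernel_predict(np.array([1.0]), X, y))  # roughly sin(1.0) ~ 0.84
```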
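The GP definition in the last quoted paragraph (any finite collection of function values $f(\mathbf{x}_1),...,f(\mathbf{x}_N)$ is jointly Gaussian) maps directly to code: evaluate a kernel pairwise to obtain a covariance matrix $K$ and sample from $\mathcal{N}(\mathbf{0}, K)$. A minimal sketch assuming a squared exponential kernel; the jitter constant is an illustrative numerical safeguard, not part of the definition.

```python
import numpy as np

def kernel(X1, X2, l=1.0, sigma_f=1.0):
    # Squared exponential (RBF) covariance between the rows of X1 and X2.
    sq_dists = np.sum(X1**2, 1).reshape(-1, 1) + np.sum(X2**2, 1) - 2 * X1 @ X2.T
    return sigma_f**2 * np.exp(-0.5 / l**2 * sq_dists)

# A finite set of input points x_1, ..., x_N ...
X = np.linspace(-5, 5, 100).reshape(-1, 1)

# ... whose function values get a jointly Gaussian prior
# p(f(x_1), ..., f(x_N)) = N(0, K).
K = kernel(X, X)

# Draw three functions from the GP prior; the small jitter keeps K
# numerically positive semi-definite for the sampler.
rng = np.random.default_rng(1)
f_prior = rng.multivariate_normal(np.zeros(len(X)), K + 1e-10 * np.eye(len(X)), size=3)
```

Plotting the rows of `f_prior` against `X` gives smooth random functions whose wiggliness is set by the length scale `l`; conditioning on observed function values then turns this prior into a posterior over functions, as described in the diff above.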