-[](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
[Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
@@ -22,8 +22,8 @@ some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a
-[](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
[Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
-implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
-[](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
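For orientation on the part 1 entry above (EM for Gaussian mixture models): the scikit-learn comparison baseline it mentions boils down to something like the following minimal sketch — illustrative only, not the notebook's actual code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy data: two well-separated Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-2.0, 0.5, size=(100, 2)),
               rng.normal(+2.0, 0.5, size=(100, 2))])

# EM with 2 mixture components and full covariance matrices.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)                 # alternates E-steps (responsibilities) and M-steps
print(gmm.means_)          # estimated component means
print(gmm.predict(X[:5]))  # hard assignments via argmax of responsibilities
```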
@@ -33,13 +33,13 @@ some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a
-[](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with libraries
-scikit-learn and GPy.
+scikit-learn and GPy ([requirements.txt](gaussian-processes/requirements.txt)).
-[Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
-for comparison (see also
+for comparison. See also
[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
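For context on the Gaussian-processes entry in the hunk above: a minimal scikit-learn version of GP regression with an RBF kernel might look roughly like this (a sketch under the entry's description, not the notebook's actual code; the notebook also has plain NumPy/SciPy and GPy variants).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X_train = np.array([[-2.0], [0.0], [1.5]])
y_train = np.sin(X_train).ravel()

# Squared-exponential (RBF) kernel; the length scale is tuned during fit
# by maximizing the log marginal likelihood.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
gpr.fit(X_train, y_train)

X_test = np.linspace(-3.0, 3.0, 5).reshape(-1, 1)
mean, std = gpr.predict(X_test, return_std=True)  # posterior mean and stddev
```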
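Likewise for the Bayesian linear regression entry: scikit-learn's `BayesianRidge` is a plausible shape for the comparison it mentions (again a hedged sketch, not the notebook's code; the from-scratch NumPy version presumably derives the posterior analytically).

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = 0.5 * X.ravel() - 0.3 + rng.normal(0.0, 0.1, size=50)

# Bayesian linear regression via evidence maximization; weight and noise
# precisions (lambda_, alpha_) are estimated from the data.
reg = BayesianRidge().fit(X, y)
y_mean, y_std = reg.predict(X, return_std=True)  # predictive mean and stddev
```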
bayesian-neural-networks/bayesian_neural_networks.ipynb (1 addition, 1 deletion)
@@ -6,7 +6,7 @@
"source": [
"# Variational inference in Bayesian neural networks\n",
"\n",
-"This article demonstrates how to implement and train a Bayesian neural network with Keras following the approach described in [Weight Uncertainty in Neural Networks](https://arxiv.org/abs/1505.05424) (*Bayes by Backprop*). The implementation is kept simple for illustration purposes and uses Keras 2.2.4 and Tensorflow 1.12.0 (see also [requirements.txt](requirements.txt)). For more advanced implementations of Bayesian methods for neural networks consider using [Tensorflow Probability](https://www.tensorflow.org/probability), for example.\n",
+"This article demonstrates how to implement and train a Bayesian neural network with Keras following the approach described in [Weight Uncertainty in Neural Networks](https://arxiv.org/abs/1505.05424) (*Bayes by Backprop*). The implementation is kept simple for illustration purposes and uses Keras 2.2.4 and Tensorflow 1.12.0. For more advanced implementations of Bayesian methods for neural networks consider using [Tensorflow Probability](https://www.tensorflow.org/probability), for example.\n",
"\n",
"Bayesian neural networks differ from plain neural networks in that their weights are assigned a probability distribution instead of a single value or point estimate. These probability distributions describe the uncertainty in weights and can be used to estimate uncertainty in predictions. Training a Bayesian neural network via variational inference learns the parameters of these distributions instead of the weights directly.\n",