Commit d38031a

Minor edit in README
1 parent 11a2746 commit d38031a

1 file changed (+5, -6 lines)

README.md

Lines changed: 5 additions & 6 deletions
@@ -7,8 +7,8 @@ PyMC3 and PyMC4 implementations are now available for some notebooks (more plann
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb)
   [Latent variable models - part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb).
   Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
-  implementation with plain NumPy/SciPy and scikit-learn for comparison. Further implementation with
-  [PyMC3](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1_pymc3.ipynb).
+  implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
+  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1_pymc3.ipynb)).
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb)
   [Latent variable models - part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb).
@@ -19,10 +19,9 @@ PyMC3 and PyMC4 implementations are now available for some notebooks (more plann
   implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras.
 
 - [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression.ipynb). Introduction to Bayesian
-  linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison.
-  Further implementations with
-  [PyMC4](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc4.ipynb) and
-  [PyMC3](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc3.ipynb).
+  linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison (see also
+  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc4.ipynb) and
+  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc3.ipynb)).
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb)
   [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb).
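For orientation, a minimal sketch of the kind of scikit-learn comparison the first README entry above refers to: fitting a Gaussian mixture, which scikit-learn optimizes with the EM algorithm. This is not code from the notebook; the synthetic data and parameter choices are illustrative assumptions only.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic 1-D data drawn from two Gaussian components (illustrative only).
rng = np.random.default_rng(0)
X = np.concatenate([
    rng.normal(loc=-2.0, scale=0.5, size=(200, 1)),
    rng.normal(loc=3.0, scale=1.0, size=(300, 1)),
])

# GaussianMixture runs EM internally: the E-step computes responsibilities,
# the M-step re-estimates mixture weights, means and covariances.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)

print("weights:", gmm.weights_)
print("means:", gmm.means_.ravel())
print("avg. log-likelihood per sample:", gmm.score(X))
```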
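Likewise, a minimal sketch of Bayesian regression with linear basis function models using scikit-learn, as mentioned in the second hunk. Again, this is not the notebook's code; the polynomial basis, the BayesianRidge estimator and the synthetic data are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Noisy synthetic observations of a sinusoid (illustrative only).
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=100)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.shape)

# Polynomial features act as the linear basis functions. BayesianRidge places
# a Gaussian prior on the weights and estimates the noise precision (alpha_)
# and weight precision (lambda_) from the data.
model = make_pipeline(PolynomialFeatures(degree=5), BayesianRidge())
model.fit(x[:, None], y)

# Predictive mean and standard deviation at new inputs.
x_new = np.linspace(-1.0, 1.0, 5)[:, None]
y_mean, y_std = model.predict(x_new, return_std=True)
print(y_mean)
print(y_std)
```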
