
Commit 0e74400

Minor revision of GP notebook (theory part)
1 parent: 56c01f2

File tree

1 file changed: +3 -3 lines changed


gaussian-processes/gaussian_processes.ipynb

Lines changed: 3 additions & 3 deletions
@@ -50,7 +50,7 @@
 "\n",
 "In Equation $(1)$, $\\mathbf{f} = (f(\\mathbf{x}_1),...,f(\\mathbf{x}_N))$, $\\boldsymbol\\mu = (m(\\mathbf{x}_1),...,m(\\mathbf{x}_N))$ and $K_{ij} = \\kappa(\\mathbf{x}_i,\\mathbf{x}_j)$. $m$ is the mean function and it is common to use $m(\\mathbf{x}) = 0$ as GPs are flexible enough to model the mean arbitrarily well. $\\kappa$ is a positive definite *kernel function* or *covariance function*. Thus, a Gaussian process is a distribution over functions whose shape (smoothness, ...) is defined by $\\mathbf{K}$. If points $\\mathbf{x}_i$ and $\\mathbf{x}_j$ are considered to be similar by the kernel the function values at these points, $f(\\mathbf{x}_i)$ and $f(\\mathbf{x}_j)$, can be expected to be similar too. \n",
 "\n",
-"A GP prior $p(\\mathbf{f} \\lvert \\mathbf{X})$ can be converted into a GP posterior $p(\\mathbf{f} \\lvert \\mathbf{X},\\mathbf{y})$ after having observed some data $\\mathbf{y}$. The posterior can then be used to make predictions $\\mathbf{f}_*$ given new input $\\mathbf{X}_*$:\n",
+"A GP prior $p(\\mathbf{f} \\lvert \\mathbf{X})$ can be converted into a GP posterior $p(\\mathbf{f} \\lvert \\mathbf{X},\\mathbf{y})$ after having observed some data $\\mathbf{X},\\mathbf{y}$. The posterior can then be used to make predictions $\\mathbf{f}_*$ given new input $\\mathbf{X}_*$:\n",
 "\n",
 "$$\n",
 "\\begin{align*}\n",
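The first hunk above describes the GP prior $p(\mathbf{f} \lvert \mathbf{X})$ as a multivariate normal with $m(\mathbf{x}) = 0$ and $K_{ij} = \kappa(\mathbf{x}_i,\mathbf{x}_j)$. A minimal sketch of drawing function samples from this prior, assuming a squared-exponential kernel as $\kappa$ (the excerpt does not fix a particular kernel):

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    # Squared-exponential kernel; an assumed example choice of kappa,
    # not specified in the notebook excerpt.
    sqdist = (np.sum(X1**2, axis=1).reshape(-1, 1)
              + np.sum(X2**2, axis=1) - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * sqdist / length_scale**2)

# Finite set of inputs X; the prior over f = (f(x_1), ..., f(x_N))
# is N(mu, K) with mu = 0 and K_ij = kappa(x_i, x_j).
X = np.linspace(-5.0, 5.0, 50).reshape(-1, 1)
K = rbf_kernel(X, X)
mu = np.zeros(len(X))  # m(x) = 0, as in the text

# Draw 3 prior function samples (a small jitter keeps K numerically PSD).
samples = np.random.multivariate_normal(mu, K + 1e-10 * np.eye(len(X)), size=3)
```

Each row of `samples` is one function from the prior, evaluated at the 50 inputs; smoothness is controlled entirely by the kernel, as the text notes.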
@@ -60,7 +60,7 @@
 "\\end{align*}\n",
 "$$\n",
 "\n",
-"Equation $(2)$ is the posterior predictive distribution which is also a Gaussian with mean $\\boldsymbol{\\mu}_*$ and $\\boldsymbol{\\Sigma}_*$. By definition of the GP, the joint distribution of observed data $\\mathbf{y}$ and predictions $\\mathbf{f}_*$ is\n",
+"Equation $(2)$ is the posterior predictive distribution which is also a Gaussian with mean $\\boldsymbol{\\mu}_*$ and $\\boldsymbol{\\Sigma}_*$. By definition of the GP, the joint distribution of observed values $\\mathbf{y}$ and predictions $\\mathbf{f}_*$ is\n",
 "\n",
 "$$\n",
 "\\begin{pmatrix}\\mathbf{y} \\\\ \\mathbf{f}_*\\end{pmatrix} \\sim \\mathcal{N}\n",
@@ -69,7 +69,7 @@
 "\\right)\\tag{3}\\label{eq3}\n",
 "$$\n",
 "\n",
-"With $N$ training data and $N_*$ new input data, $\\mathbf{K}_y = \\kappa(\\mathbf{X},\\mathbf{X}) + \\sigma_y^2\\mathbf{I} = \\mathbf{K} + \\sigma_y^2\\mathbf{I}$ is $N \\times N$, $\\mathbf{K}_* = \\kappa(\\mathbf{X},\\mathbf{X}_*)$ is $N \\times N_*$ and $\\mathbf{K}_{**} = \\kappa(\\mathbf{X}_*,\\mathbf{X}_*)$ is $N_* \\times N_*$. $\\sigma_y^2$ is the noise term in the diagonal of $\\mathbf{K_y}$. It is set to zero if training targets are noise-free and to a value greater than zero if observations are noisy. The mean is set to $\\boldsymbol{0}$ for notational simplicity. The sufficient statistics of the posterior predictive distribution, $\\boldsymbol{\\mu}_*$ and $\\boldsymbol{\\Sigma}_*$, can be computed with<sup>[1][3]</sup>\n",
+"where $\\mathbf{K}_y = \\kappa(\\mathbf{X},\\mathbf{X}) + \\sigma_y^2\\mathbf{I} = \\mathbf{K} + \\sigma_y^2\\mathbf{I}$, $\\mathbf{K}_* = \\kappa(\\mathbf{X},\\mathbf{X}_*)$ and $\\mathbf{K}_{**} = \\kappa(\\mathbf{X}_*,\\mathbf{X}_*)$. With $N$ training data and $N_*$ new input data, $\\mathbf{K}_y$ is a $N \\times N$ matrix, $\\mathbf{K}_*$ a $N \\times N_*$ matrix and $\\mathbf{K}_{**}$ a $N_* \\times N_*$ matrix. $\\sigma_y^2$ is the noise term in the diagonal of $\\mathbf{K_y}$. It is set to zero if training targets are noise-free and to a value greater than zero if observations are noisy. The sufficient statistics of the posterior predictive distribution, $\\boldsymbol{\\mu}_*$ and $\\boldsymbol{\\Sigma}_*$, can be computed with<sup>[1][3]</sup>\n",
 "\n",
 "$$\n",
 "\\begin{align*}\n",
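The matrices defined in the last changed line map directly onto the standard GP regression equations for the sufficient statistics, $\boldsymbol{\mu}_* = \mathbf{K}_*^\top \mathbf{K}_y^{-1} \mathbf{y}$ and $\boldsymbol{\Sigma}_* = \mathbf{K}_{**} - \mathbf{K}_*^\top \mathbf{K}_y^{-1} \mathbf{K}_*$. A minimal sketch of this computation, again assuming a squared-exponential kernel for $\kappa$ (the excerpt does not fix one):

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    # Squared-exponential kernel; an assumed example choice of kappa.
    sqdist = (np.sum(X1**2, axis=1).reshape(-1, 1)
              + np.sum(X2**2, axis=1) - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * sqdist / length_scale**2)

def posterior_predictive(X_s, X_train, y_train, sigma_y=1e-8):
    # K_y = K + sigma_y^2 I  (N x N), K_* (N x N_*), K_** (N_* x N_*)
    K_y  = rbf_kernel(X_train, X_train) + sigma_y**2 * np.eye(len(X_train))
    K_s  = rbf_kernel(X_train, X_s)
    K_ss = rbf_kernel(X_s, X_s)
    K_y_inv = np.linalg.inv(K_y)
    mu_s  = K_s.T @ K_y_inv @ y_train       # mu_*    = K_*^T K_y^-1 y
    cov_s = K_ss - K_s.T @ K_y_inv @ K_s    # Sigma_* = K_** - K_*^T K_y^-1 K_*
    return mu_s, cov_s
```

In practice one would replace the explicit inverse with a Cholesky solve for numerical stability; the explicit form above mirrors the equations term by term. With `sigma_y` near zero the posterior mean interpolates the training targets, matching the noise-free case described in the text.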

0 commit comments
