Commit ad2cce4

Merge pull request #77 from TuringLang/ml/docs
2 parents: 8386dc9 + c4d37f1

File tree

4 files changed (+10, −3 lines)

docs/src/examples/correlated.md

Lines changed: 2 additions & 1 deletion
@@ -34,7 +34,8 @@ using StatsPlots
 
 θ1 = range(-1, 1, length=1000)
 θ2 = range(-1, 1, length=1000)
-logf = [model.loglike([t1, t2, 0, 0]) for t2 in θ2, t1 in θ1]
+loglike = model.prior_transform_and_loglikelihood.loglikelihood
+logf = [loglike([t1, t2, 0, 0]) for t2 in θ2, t1 in θ1]
 heatmap(
     θ1, θ2, exp.(logf),
     aspect_ratio=1,

docs/src/examples/eggbox.md

Lines changed: 2 additions & 1 deletion
@@ -32,7 +32,8 @@ using StatsPlots
 
 x = range(0, 1, length=1000)
 y = range(0, 1, length=1000)
-logf = [model.loglike([xi, yi]) for yi in y, xi in x]
+loglike = model.prior_transform_and_loglikelihood.loglikelihood
+logf = [loglike([xi, yi]) for yi in y, xi in x]
 heatmap(
     x, y, logf,
     xlims=extrema(x),

docs/src/examples/shells.md

Lines changed: 2 additions & 1 deletion
@@ -32,7 +32,8 @@ using StatsPlots
 
 x = range(-6, 6, length=1000)
 y = range(-2.5, 2.5, length=1000)
-logf = [model.loglike([xi, yi]) for yi in y, xi in x]
+loglike = model.prior_transform_and_loglikelihood.loglikelihood
+logf = [loglike([xi, yi]) for yi in y, xi in x]
 heatmap(
     x, y, exp.(logf),
     xlims=extrema(x),

docs/src/index.md

Lines changed: 4 additions & 0 deletions
@@ -24,6 +24,10 @@ To use the nested samplers first install this library
 julia> ]add NestedSamplers
 ```
 
+## Background
+
+For statistical background and a more in-depth introduction to nested sampling, I recommend the [dynesty documentation](https://dynesty.readthedocs.io/en/latest/overview.html). In short, nested sampling is a technique for simultaneously estimating the Bayesian evidence and the posterior distribution (according to [Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)) from nested iso-likelihood shells. These shells allow a quadrature estimate of the integral for the Bayesian evidence, which we can use for model selection, as well as the statistical weights for the underlying "live" points, which is where we get our posterior samples from!
+
 ## Usage
 
 The samplers are built using the [AbstractMCMC](https://github.com/turinglang/abstractmcmc.jl) interface. To use it, we need to create a [`NestedModel`](@ref).
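The quadrature that the added Background paragraph alludes to can be written out. As a sketch in standard nested-sampling notation (following the dynesty overview linked above, not text from this commit), the evidence integral over the parameters θ is recast as a one-dimensional integral over the prior volume X enclosed by each iso-likelihood shell, and the same quadrature terms give the posterior weights of the discarded points:

$$
Z = \int L(\theta)\,\pi(\theta)\,\mathrm{d}\theta = \int_0^1 L(X)\,\mathrm{d}X,
\qquad
\hat{Z} \approx \sum_i L_i\,\Delta X_i,
\qquad
w_i = \frac{L_i\,\Delta X_i}{\hat{Z}},
$$

where the ΔX_i are the shrinking prior volumes traced out by the live-point set; the estimate Ẑ is what gets used for model selection, and the weights w_i turn the dead points into posterior samples.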

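To illustrate the Usage paragraph in the diff above, here is a minimal, hedged sketch of the AbstractMCMC workflow. The Gaussian log-likelihood, the identity prior transform, and the keyword values are placeholders, and the `NestedModel(loglikelihood, prior_transform)` and `Nested(ndims, nactive)` constructors plus the `chain, state` return pattern are assumed from the package README rather than from this commit, so details may differ by version:

using NestedSamplers
using AbstractMCMC  # provides the generic `sample` interface the samplers plug into

# Placeholder log-likelihood: isotropic 2-D Gaussian centered at (0.5, 0.5)
# with σ = 0.1, written up to an additive constant.
loglike(θ) = -sum(abs2, (θ .- 0.5) ./ 0.1) / 2

# The prior transform maps the unit cube to the prior support; with a
# uniform prior over [0, 1]² it is just the identity.
prior_transform(u) = u

model = NestedModel(loglike, prior_transform)
sampler = Nested(2, 500)  # 2 parameters, 500 live points

# dlogz sets the stopping tolerance on the remaining evidence (illustrative value).
chain, state = sample(model, sampler; dlogz=0.2)

# After this PR, the example docs reach the log-likelihood through the
# combined field instead of `model.loglike`:
loglike_again = model.prior_transform_and_loglikelihood.loglikelihood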