
Commit 7dd551c

sd update
1 parent b06b16f commit 7dd551c

1 file changed

lectures/stationary_densities.md

Lines changed: 113 additions & 84 deletions
@@ -746,10 +746,13 @@ for the look-ahead estimator is very good.
 
 The first exercise helps illustrate this point.
 
+
 ## Exercises
 
 (statd_ex1)=
-### Exercise 1
+```{exercise-start}
+:label: sd_ex1
+```
 
 Consider the simple threshold autoregressive model
 
@@ -799,89 +802,12 @@ density estimator.
 
 If you repeat the simulation you will see that this is consistently the case.
 
-(statd_ex2)=
-### Exercise 2
-
-Replicate the figure on global convergence {ref}`shown above <statd_egs>`.
-
-The densities come from the stochastic growth model treated {ref}`at the start of the lecture <solow_swan>`.
-
-Begin with the code found {ref}`above <stoch_growth>`.
-
-Use the same parameters.
-
-For the four initial distributions, use the shifted beta distributions
-
-```{code-block} python3
-ψ_0 = beta(5, 5, scale=0.5, loc=i*2)
+```{exercise-end}
 ```
 
-(statd_ex3)=
-### Exercise 3
-
-A common way to compare distributions visually is with [boxplots](https://en.wikipedia.org/wiki/Box_plot).
-
-To illustrate, let's generate three artificial data sets and compare them with a boxplot.
-
-The three data sets we will use are:
-
-$$
-\{ X_1, \ldots, X_n \} \sim LN(0, 1), \;\;
-\{ Y_1, \ldots, Y_n \} \sim N(2, 1), \;\;
-\text{ and } \;
-\{ Z_1, \ldots, Z_n \} \sim N(4, 1), \;
-$$
-
-Here is the code and figure:
-
-```{code-cell} python3
-n = 500
-x = np.random.randn(n)        # N(0, 1)
-x = np.exp(x)                 # Map x to lognormal
-y = np.random.randn(n) + 2.0  # N(2, 1)
-z = np.random.randn(n) + 4.0  # N(4, 1)
-
-fig, ax = plt.subplots(figsize=(10, 6.6))
-ax.boxplot([x, y, z])
-ax.set_xticks((1, 2, 3))
-ax.set_ylim(-2, 14)
-ax.set_xticklabels(('$X$', '$Y$', '$Z$'), fontsize=16)
-plt.show()
-```
-
-Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
-
-The boxes give some indication as to
-
-* the location of probability mass for each sample
-* whether the distribution is right-skewed (as is the lognormal distribution), etc
-
-Now let's put these ideas to use in a simulation.
-
-Consider the threshold autoregressive model in {eq}`statd_tar`.
-
-We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
-
-Let's observe this convergence from different initial conditions using
-boxplots.
-
-In particular, the exercise is to generate J boxplot figures, one for each initial condition $X_0$ in
-
-```{code-block} python3
-initial_conditions = np.linspace(8, 0, J)
+```{solution-start} sd_ex1
+:class: dropdown
 ```
-
-For each $X_0$ in this set,
-
-1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
-1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
-
-Use $\theta = 0.9, n = 20, k = 5000, J = 8$
-
-## Solutions
-
-### Exercise 1
-
 Look-ahead estimation of a TAR stationary density, where the TAR model
 is
 
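The `sd_ex1` solution opened in the hunk above uses the look-ahead estimator of the stationary density. Below is a minimal sketch of that estimator (not the solution code elided from this diff). It assumes the TAR law of motion is $X_{t+1} = \theta |X_t| + (1 - \theta^2)^{1/2} \xi_{t+1}$ with $\xi_{t+1}$ standard normal, so the transition density is normal with mean $\theta |x|$ and standard deviation $d = \sqrt{1 - \theta^2}$.

```python
# A hedged sketch of a look-ahead estimate of the TAR stationary density.
# Assumptions (not shown in this diff): X_{t+1} = θ|X_t| + d ξ_{t+1},
# ξ ~ N(0, 1) and d = sqrt(1 - θ**2), so p(x, y) = norm.pdf(y, θ|x|, d).
import numpy as np
from scipy.stats import norm

θ = 0.9
d = np.sqrt(1 - θ**2)
n = 500

# Simulate one path of the TAR process
X = np.empty(n)
X[0] = 0.0
for t in range(n - 1):
    X[t+1] = θ * np.abs(X[t]) + d * np.random.randn()

# Look-ahead estimator: average the transition density p(X_t, y) over the path
y_grid = np.linspace(-3, 3, 200)
ψ_hat = norm.pdf(y_grid[:, None], loc=θ * np.abs(X), scale=d).mean(axis=1)
```
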
@@ -925,7 +851,35 @@ ax.legend(loc='upper left')
 plt.show()
 ```
 
-### Exercise 2
+
+```{solution-end}
+```
+
+
+(statd_ex2)=
+```{exercise-start}
+:label: sd_ex2
+```
+
+Replicate the figure on global convergence {ref}`shown above <statd_egs>`.
+
+The densities come from the stochastic growth model treated {ref}`at the start of the lecture <solow_swan>`.
+
+Begin with the code found {ref}`above <stoch_growth>`.
+
+Use the same parameters.
+
+For the four initial distributions, use the shifted beta distributions
+
+```{code-block} python3
+ψ_0 = beta(5, 5, scale=0.5, loc=i*2)
+```
+```{exercise-end}
+```
+
+```{solution-start} sd_ex2
+:class: dropdown
+```
 
 Here's one program that does the job
 
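The `sd_ex2` exercise added in the hunk above specifies its four initial distributions with `ψ_0 = beta(5, 5, scale=0.5, loc=i*2)`. The following sketch shows how those shifted beta densities can be built and inspected; it assumes `beta` is `scipy.stats.beta` and that `i` runs over the four distributions, neither of which is shown in this diff.

```python
# A sketch of the four shifted Beta(5, 5) initial densities used in sd_ex2.
# Assumes `beta` refers to scipy.stats.beta, as suggested by the snippet above.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import beta

fig, ax = plt.subplots()
y_grid = np.linspace(0, 8.5, 400)
for i in range(4):
    ψ_0 = beta(5, 5, scale=0.5, loc=i * 2)   # support is [2i, 2i + 0.5]
    ax.plot(y_grid, ψ_0.pdf(y_grid), label=f'$\\psi_0$ on $[{2*i}, {2*i + 0.5}]$')
ax.legend()
plt.show()
```
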
@@ -974,7 +928,79 @@ for i in range(4):
 plt.show()
 ```
 
-### Exercise 3
+```{solution-end}
+```
+
+(statd_ex3)=
+```{exercise-start}
+:label: sd_ex3
+```
+
+A common way to compare distributions visually is with [boxplots](https://en.wikipedia.org/wiki/Box_plot).
+
+To illustrate, let's generate three artificial data sets and compare them with a boxplot.
+
+The three data sets we will use are:
+
+$$
+\{ X_1, \ldots, X_n \} \sim LN(0, 1), \;\;
+\{ Y_1, \ldots, Y_n \} \sim N(2, 1), \;\;
+\text{ and } \;
+\{ Z_1, \ldots, Z_n \} \sim N(4, 1), \;
+$$
+
+Here is the code and figure:
+
+```{code-cell} python3
+n = 500
+x = np.random.randn(n)        # N(0, 1)
+x = np.exp(x)                 # Map x to lognormal
+y = np.random.randn(n) + 2.0  # N(2, 1)
+z = np.random.randn(n) + 4.0  # N(4, 1)
+
+fig, ax = plt.subplots(figsize=(10, 6.6))
+ax.boxplot([x, y, z])
+ax.set_xticks((1, 2, 3))
+ax.set_ylim(-2, 14)
+ax.set_xticklabels(('$X$', '$Y$', '$Z$'), fontsize=16)
+plt.show()
+```
+
+Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
+
+The boxes give some indication as to
+
+* the location of probability mass for each sample
+* whether the distribution is right-skewed (as is the lognormal distribution), etc
+
+Now let's put these ideas to use in a simulation.
+
+Consider the threshold autoregressive model in {eq}`statd_tar`.
+
+We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
+
+Let's observe this convergence from different initial conditions using
+boxplots.
+
+In particular, the exercise is to generate J boxplot figures, one for each initial condition $X_0$ in
+
+```{code-block} python3
+initial_conditions = np.linspace(8, 0, J)
+```
+
+For each $X_0$ in this set,
+
+1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
+1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
+
+Use $\theta = 0.9, n = 20, k = 5000, J = 8$
+
+```{exercise-end}
+```
+
+```{solution-start} sd_ex3
+:class: dropdown
+```
 
 Here's a possible solution.
 
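The `sd_ex3` exercise added in the hunk above asks for $J$ boxplot figures, each built from $k$ TAR sample paths of length $n$ started at a common $X_0$. Here is a rough sketch of one such figure. It is not the solution code that follows, and it assumes the TAR update $X_{t+1} = \theta |X_t| + \sqrt{1 - \theta^2}\, \xi_{t+1}$ with standard normal shocks.

```python
# A rough sketch of one of the J boxplot figures described in sd_ex3.
# Assumption (not shown in this diff): the TAR law of motion is
# X_{t+1} = θ|X_t| + sqrt(1 - θ**2) * ξ_{t+1} with ξ standard normal.
import numpy as np
import matplotlib.pyplot as plt

θ = 0.9
d = np.sqrt(1 - θ**2)
n, k = 20, 5000
X_0 = 8.0                     # one element of initial_conditions

X = np.empty((k, n))
X[:, 0] = X_0                 # k series, all starting at X_0
for t in range(n - 1):
    X[:, t+1] = θ * np.abs(X[:, t]) + d * np.random.randn(k)

fig, ax = plt.subplots()
ax.boxplot(X)                 # one box per date t, each over the k draws of X_t
ax.set_title(f'$X_0 = {X_0}$')
plt.show()
```
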
@@ -984,7 +1010,7 @@ series for one boxplot all at once
 ```{code-cell} python3
 n = 20
 k = 5000
-J = 6
+J = 8
 
 θ = 0.9
 d = np.sqrt(1 - θ**2)

@@ -1008,6 +1034,9 @@ for j in range(J):
 plt.show()
 ```
 
+```{solution-end}
+```
+
 ## Appendix
 
 (statd_appendix)=
