Commit 1c059fc

Merge pull request #117 from QuantEcon/exercise-migrate
[ENH] Migrate exercise and solutions: Part 1
2 parents 7795b89 + 12e2dad commit 1c059fc

4 files changed: +167 additions, −103 deletions

lectures/_config.yml

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ latex:
     targetname: quantecon-python-advanced.tex

 sphinx:
-  extra_extensions: [sphinx_multitoc_numbering, sphinxext.rediraffe, sphinx_tojupyter]
+  extra_extensions: [sphinx_multitoc_numbering, sphinxext.rediraffe, sphinx_tojupyter, sphinx_exercise, sphinx_togglebutton]
   config:
     nb_render_priority:
       html:
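The two extensions added here are what the rest of this commit relies on: `sphinx_exercise` supplies the `exercise`/`solution` directives, while `sphinx_togglebutton` makes `:class: dropdown` solutions collapsible. A minimal sketch of the directive pattern (the label here is a placeholder, not one from the migrated lectures):

````md
```{exercise-start}
:label: my_ex
```
Exercise statement goes here.
```{exercise-end}
```

```{solution-start} my_ex
:class: dropdown
```
Solution text, rendered behind a toggle button.
```{solution-end}
```
````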

lectures/muth_kalman.md

Lines changed: 3 additions & 0 deletions
@@ -176,6 +176,9 @@ kmuth = Kalman(ss, x_hat_0, Σ_0)
 # representation
 S1, K1 = kmuth.stationary_values()

+# Extract scalars from nested arrays
+S1, K1 = S1.item(), K1.item()
+
 # Form innovation representation state-space
 Ak, Ck, Gk, Hk = A, K1, G, 1
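The `.item()` fix above matters because, in the scalar case, `stationary_values` returns $1 \times 1$ nested arrays rather than plain numbers. A minimal sketch of the behavior, using a plain NumPy array rather than the `Kalman` class:

```python
import numpy as np

# A 1x1 nested array, as returned for scalar state-space models
S1 = np.array([[0.25]])

# item() extracts the single element as a native Python scalar
s = S1.item()
print(type(s).__name__)  # float
print(s)                 # 0.25
```

This avoids downstream shape errors when `S1` and `K1` are used as scalar coefficients.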

lectures/orth_proj.md

Lines changed: 50 additions & 18 deletions
@@ -684,20 +684,57 @@ Numerical routines would in this case use the alternative form $R \hat \beta = Q

 ## Exercises

-### Exercise 1
+```{exercise-start}
+:label: op_ex1
+```

 Show that, for any linear subspace $S \subset \mathbb R^n$, $S \cap S^{\perp} = \{0\}$.

-### Exercise 2
+```{exercise-end}
+```
+
+```{solution-start} op_ex1
+:class: dropdown
+```
+If $x \in S$ and $x \in S^\perp$, then we have in particular
+that $\langle x, x \rangle = 0$, but then $x = 0$.
+
+```{solution-end}
+```

+```{exercise-start}
+:label: op_ex2
+```
 Let $P = X (X' X)^{-1} X'$ and let $M = I - P$. Show that
 $P$ and $M$ are both idempotent and symmetric. Can you give any
 intuition as to why they should be idempotent?

-### Exercise 3
+```{exercise-end}
+```
+
+```{solution-start} op_ex2
+:class: dropdown
+```
+
+Symmetry and idempotence of $M$ and $P$ can be established
+using standard rules for matrix algebra. The intuition behind
+idempotence of $M$ and $P$ is that both are orthogonal
+projections. After a point is projected into a given subspace, applying
+the projection again makes no difference. (A point inside the subspace
+is not shifted by orthogonal projection onto that space because it is
+already the closest point in the subspace to itself.)
+
+```{solution-end}
+```
+
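The symmetry and idempotence claims in the op_ex2 solution are easy to confirm numerically. A quick sketch with an arbitrary full-column-rank matrix `X` (chosen here purely for illustration):

```python
import numpy as np

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # any full-column-rank matrix works

P = X @ np.linalg.inv(X.T @ X) @ X.T   # projection onto col(X)
M = np.eye(3) - P                      # projection onto the orthogonal complement

# Symmetry and idempotence hold up to floating-point error
assert np.allclose(P, P.T) and np.allclose(M, M.T)
assert np.allclose(P @ P, P) and np.allclose(M @ M, M)
```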
+```{exercise-start}
+:label: op_ex3
+```

 Using Gram-Schmidt orthogonalization, produce a linear projection of $y$ onto the column space of $X$ and verify this using the projection matrix $P := X (X' X)^{-1} X'$ and also using QR decomposition, where:

+
 $$
 y :=
 \left(
@@ -723,24 +760,14 @@ X :=
 \right)
 $$

-## Solutions
-
-### Exercise 1

-If $x \in S$ and $x \in S^\perp$, then we have in particular
-that $\langle x, x \rangle = 0$, but then $x = 0$.
+```{exercise-end}
+```

-### Exercise 2

-Symmetry and idempotence of $M$ and $P$ can be established
-using standard rules for matrix algebra. The intuition behind
-idempotence of $M$ and $P$ is that both are orthogonal
-projections. After a point is projected into a given subspace, applying
-the projection again makes no difference. (A point inside the subspace
-is not shifted by orthogonal projection onto that space because it is
-already the closest point in the subspace to itself.)
-
-### Exercise 3
+```{solution-start} op_ex3
+:class: dropdown
+```

 Here's a function that computes the orthonormal vectors using the GS
 algorithm given in the lecture
@@ -832,3 +859,8 @@ Py3

 Again, we obtain the same answer.

+```{solution-end}
+```
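The solution code for op_ex3 is elided from this diff (the hunks jump from line 746 to 832). A self-contained sketch of what it does: project `y` onto the column space of `X` three ways (a Gram-Schmidt routine, the projection matrix, and QR) and check that they agree. The `gram_schmidt` helper and the data below are illustrative placeholders, not necessarily identical to the lecture's versions:

```python
import numpy as np

def gram_schmidt(X):
    """Return U whose orthonormal columns span col(X)."""
    n, k = X.shape
    U = np.empty((n, k))
    for i in range(k):
        # Remove components along the previously constructed columns
        v = X[:, i] - U[:, :i] @ (U[:, :i].T @ X[:, i])
        U[:, i] = v / np.linalg.norm(v)
    return U

# Placeholder data (the lecture's actual y and X are elided from this diff)
y = np.array([1.0, 3.0, -3.0])
X = np.array([[1.0, 0.0],
              [0.0, -6.0],
              [2.0, 2.0]])

U = gram_schmidt(X)
Py_gs = U @ U.T @ y                            # via the orthonormal basis
Py_P = X @ np.linalg.solve(X.T @ X, X.T @ y)   # via P = X (X'X)^{-1} X'
Q, R = np.linalg.qr(X)
Py_qr = Q @ Q.T @ y                            # via QR decomposition

# All three routes give the same projection
assert np.allclose(Py_gs, Py_P) and np.allclose(Py_P, Py_qr)
```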

lectures/stationary_densities.md

Lines changed: 113 additions & 84 deletions
@@ -746,10 +746,13 @@ for the look-ahead estimator is very good.

 The first exercise helps illustrate this point.

+
 ## Exercises

 (statd_ex1)=
-### Exercise 1
+```{exercise-start}
+:label: sd_ex1
+```

 Consider the simple threshold autoregressive model

@@ -799,89 +802,12 @@ density estimator.

 If you repeat the simulation you will see that this is consistently the case.

-(statd_ex2)=
-### Exercise 2
-
-Replicate the figure on global convergence {ref}`shown above <statd_egs>`.
-
-The densities come from the stochastic growth model treated {ref}`at the start of the lecture <solow_swan>`.
-
-Begin with the code found {ref}`above <stoch_growth>`.
-
-Use the same parameters.
-
-For the four initial distributions, use the shifted beta distributions
-
-```{code-block} python3
-ψ_0 = beta(5, 5, scale=0.5, loc=i*2)
+```{exercise-end}
 ```

-(statd_ex3)=
-### Exercise 3
-
-A common way to compare distributions visually is with [boxplots](https://en.wikipedia.org/wiki/Box_plot).
-
-To illustrate, let's generate three artificial data sets and compare them with a boxplot.
-
-The three data sets we will use are:
-
-$$
-\{ X_1, \ldots, X_n \} \sim LN(0, 1), \;\;
-\{ Y_1, \ldots, Y_n \} \sim N(2, 1), \;\;
-\text{ and } \;
-\{ Z_1, \ldots, Z_n \} \sim N(4, 1), \;
-$$
-
-Here is the code and figure:
-
-```{code-cell} python3
-n = 500
-x = np.random.randn(n)         # N(0, 1)
-x = np.exp(x)                  # Map x to lognormal
-y = np.random.randn(n) + 2.0   # N(2, 1)
-z = np.random.randn(n) + 4.0   # N(4, 1)
-
-fig, ax = plt.subplots(figsize=(10, 6.6))
-ax.boxplot([x, y, z])
-ax.set_xticks((1, 2, 3))
-ax.set_ylim(-2, 14)
-ax.set_xticklabels(('$X$', '$Y$', '$Z$'), fontsize=16)
-plt.show()
-```
-
-Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
-
-The boxes give some indication as to
-
-* the location of probability mass for each sample
-* whether the distribution is right-skewed (as is the lognormal distribution), etc
-
-Now let's put these ideas to use in a simulation.
-
-Consider the threshold autoregressive model in {eq}`statd_tar`.
-
-We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
-
-Let's observe this convergence from different initial conditions using
-boxplots.
-
-In particular, the exercise is to generate J boxplot figures, one for each initial condition $X_0$ in
-
-```{code-block} python3
-initial_conditions = np.linspace(8, 0, J)
+```{solution-start} sd_ex1
+:class: dropdown
 ```
-
-For each $X_0$ in this set,
-
-1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
-1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
-
-Use $\theta = 0.9, n = 20, k = 5000, J = 8$
-
-## Solutions
-
-### Exercise 1
-
 Look-ahead estimation of a TAR stationary density, where the TAR model
 is
@@ -925,7 +851,35 @@ ax.legend(loc='upper left')
 plt.show()
 ```

-### Exercise 2
+
+```{solution-end}
+```
+
+
+(statd_ex2)=
+```{exercise-start}
+:label: sd_ex2
+```
+
+Replicate the figure on global convergence {ref}`shown above <statd_egs>`.
+
+The densities come from the stochastic growth model treated {ref}`at the start of the lecture <solow_swan>`.
+
+Begin with the code found {ref}`above <stoch_growth>`.
+
+Use the same parameters.
+
+For the four initial distributions, use the shifted beta distributions
+
+```{code-block} python3
+ψ_0 = beta(5, 5, scale=0.5, loc=i*2)
+```
+```{exercise-end}
+```
+
+```{solution-start} sd_ex2
+:class: dropdown
+```

 Here's one program that does the job
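For reference, `beta(5, 5, scale=0.5, loc=i*2)` in sd_ex2 is a frozen `scipy.stats` distribution: a symmetric Beta(5, 5) shape compressed to width 0.5 and shifted right by `i*2`, so its support is $[2i,\, 2i + 0.5]$. A quick standalone check, taking `i = 1` as an example value:

```python
from scipy.stats import beta

i = 1  # example shift index; the exercise uses i = 0, 1, 2, 3
ψ_0 = beta(5, 5, scale=0.5, loc=i*2)

# Support is [loc, loc + scale]; Beta(5, 5) is symmetric,
# so the mean sits at the midpoint of the support
print(ψ_0.support())
print(ψ_0.mean())
```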

@@ -974,7 +928,79 @@ for i in range(4):
 plt.show()
 ```

-### Exercise 3
+```{solution-end}
+```
+
+(statd_ex3)=
+```{exercise-start}
+:label: sd_ex3
+```
+
+A common way to compare distributions visually is with [boxplots](https://en.wikipedia.org/wiki/Box_plot).
+
+To illustrate, let's generate three artificial data sets and compare them with a boxplot.
+
+The three data sets we will use are:
+
+$$
+\{ X_1, \ldots, X_n \} \sim LN(0, 1), \;\;
+\{ Y_1, \ldots, Y_n \} \sim N(2, 1), \;\;
+\text{ and } \;
+\{ Z_1, \ldots, Z_n \} \sim N(4, 1), \;
+$$
+
+Here is the code and figure:
+
+```{code-cell} python3
+n = 500
+x = np.random.randn(n)         # N(0, 1)
+x = np.exp(x)                  # Map x to lognormal
+y = np.random.randn(n) + 2.0   # N(2, 1)
+z = np.random.randn(n) + 4.0   # N(4, 1)
+
+fig, ax = plt.subplots(figsize=(10, 6.6))
+ax.boxplot([x, y, z])
+ax.set_xticks((1, 2, 3))
+ax.set_ylim(-2, 14)
+ax.set_xticklabels(('$X$', '$Y$', '$Z$'), fontsize=16)
+plt.show()
+```
+
+Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
+
+The boxes give some indication as to
+
+* the location of probability mass for each sample
+* whether the distribution is right-skewed (as is the lognormal distribution), etc
+
+Now let's put these ideas to use in a simulation.
+
+Consider the threshold autoregressive model in {eq}`statd_tar`.
+
+We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
+
+Let's observe this convergence from different initial conditions using
+boxplots.
+
+In particular, the exercise is to generate J boxplot figures, one for each initial condition $X_0$ in
+
+```{code-block} python3
+initial_conditions = np.linspace(8, 0, J)
+```
+
+For each $X_0$ in this set,
+
+1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
+1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
+
+Use $\theta = 0.9, n = 20, k = 5000, J = 8$
+
+```{exercise-end}
+```
+
+```{solution-start} sd_ex3
+:class: dropdown
+```

 Here's a possible solution.
@@ -984,7 +1010,7 @@ series for one boxplot all at once
 ```{code-cell} python3
 n = 20
 k = 5000
-J = 6
+J = 8
 θ = 0.9
 d = np.sqrt(1 - θ**2)
@@ -1008,6 +1034,9 @@ for j in range(J):
 plt.show()
 ```

+```{solution-end}
+```
+
 ## Appendix

 (statd_appendix)=
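As a sketch of what the sd_ex3 solution loop does for a single initial condition, assuming the TAR law of motion from the lecture is $X_{t+1} = \theta |X_t| + (1-\theta^2)^{1/2} \xi_{t+1}$ with $\xi_{t+1}$ standard normal (the equation is referenced as `statd_tar` but not shown in this diff):

```python
import numpy as np

θ, n, k = 0.9, 20, 5000
d = np.sqrt(1 - θ**2)

def generate_panel(X0, n=n, k=k, seed=42):
    """Simulate k TAR paths of length n from X0; column t holds the k draws of X_t."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal((k, n))
    X = np.empty((k, n))
    X[:, 0] = X0
    for t in range(n - 1):
        # Assumed TAR update: X_{t+1} = θ|X_t| + sqrt(1 - θ²) ξ_{t+1}
        X[:, t+1] = θ * np.abs(X[:, t]) + d * Z[:, t+1]
    return X

panel = generate_panel(X0=8.0)
# Each column of `panel` is one box in the figure, e.g. ax.boxplot(panel)
print(panel.shape)  # (5000, 20)
```

The solution then repeats this for each of the `J = 8` initial conditions and draws one boxplot figure per panel.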
