lectures/orth_proj.md (50 additions & 18 deletions)
@@ -684,20 +684,57 @@ Numerical routines would in this case use the alternative form $R \hat \beta = Q
 ## Exercises
 
-### Exercise 1
+```{exercise-start}
+:label: op_ex1
+```
 
 Show that, for any linear subspace $S \subset \mathbb R^n$, $S \cap S^{\perp} = \{0\}$.
 
-### Exercise 2
+```{exercise-end}
+```
+
+```{solution-start} op_ex1
+:class: dropdown
+```
+
+If $x \in S$ and $x \in S^\perp$, then we have in particular
+that $\langle x, x \rangle = 0$, but then $x = 0$.
+
+```{solution-end}
+```
 
+```{exercise-start}
+:label: op_ex2
+```
 
 Let $P = X (X' X)^{-1} X'$ and let $M = I - P$. Show that
 $P$ and $M$ are both idempotent and symmetric. Can you give any
 intuition as to why they should be idempotent?
 
-### Exercise 3
+```{exercise-end}
+```
+
+```{solution-start} op_ex2
+:class: dropdown
+```
+
+Symmetry and idempotence of $M$ and $P$ can be established
+using standard rules for matrix algebra. The intuition behind
+idempotence of $M$ and $P$ is that both are orthogonal
+projections. After a point is projected into a given subspace, applying
+the projection again makes no difference (a point inside the subspace
+is not shifted by orthogonal projection onto that space because it is
+already the closest point in the subspace to itself).
+
+```{solution-end}
+```
+
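Both properties of this solution are easy to confirm numerically. A minimal NumPy sketch (the random design matrix and its dimensions are illustrative; full column rank of `X` is assumed):

```python
import numpy as np

# Random design matrix, assumed to have full column rank
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))

P = X @ np.linalg.inv(X.T @ X) @ X.T   # projection onto col(X)
M = np.eye(6) - P                      # projection onto the orthogonal complement

assert np.allclose(P, P.T) and np.allclose(M, M.T)      # symmetry
assert np.allclose(P @ P, P) and np.allclose(M @ M, M)  # idempotence
```

Applying `P` twice leaves the result unchanged, exactly as the intuition about projecting an already-projected point suggests.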
+```{exercise-start}
+:label: op_ex3
+```
 
 Using Gram-Schmidt orthogonalization, produce a linear projection of $y$ onto the column space of $X$ and verify this using the projection matrix $P := X (X' X)^{-1} X'$ and also using QR decomposition, where:
+
 $$
 y :=
 \left(
@@ -723,24 +760,14 @@ X :=
 \right)
 $$
 
-## Solutions
-
-### Exercise 1
 
-If $x \in S$ and $x \in S^\perp$, then we have in particular
-that $\langle x, x \rangle = 0$, but then $x = 0$.
+```{exercise-end}
+```
 
-### Exercise 2
 
-Symmetry and idempotence of $M$ and $P$ can be established
-using standard rules for matrix algebra. The intuition behind
-idempotence of $M$ and $P$ is that both are orthogonal
-projections. After a point is projected into a given subspace, applying
-the projection again makes no difference. (A point inside the subspace
-is not shifted by orthogonal projection onto that space because it is
-already the closest point in the subspace to itself.).
-
-### Exercise 3
+```{solution-start} op_ex3
+:class: dropdown
+```
 
 Here's a function that computes the orthonormal vectors using the GS
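The lecture's own GS function is cut off in this excerpt. A minimal sketch of classical Gram-Schmidt, checked against $P = X (X' X)^{-1} X'$ and QR on random data (the data here is illustrative, not the exercise's $y$ and $X$; full column rank is assumed):

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormal basis for col(X) via classical Gram-Schmidt.

    Assumes X has full column rank.
    """
    n, k = X.shape
    U = np.empty((n, k))
    for i in range(k):
        v = X[:, i].astype(float)
        for j in range(i):
            # Subtract the component of X[:, i] along the j-th basis vector
            v = v - (U[:, j] @ X[:, i]) * U[:, j]
        U[:, i] = v / np.linalg.norm(v)
    return U

# GS, the projection matrix and QR should all give the same projection of y
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))
y = rng.standard_normal(5)

U = gram_schmidt(X)
Q, _ = np.linalg.qr(X)                       # reduced QR: Q is 5 x 2
P = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(U @ U.T @ y, P @ y)
assert np.allclose(Q @ Q.T @ y, P @ y)
```

Since the columns of `U` (and of `Q`) are an orthonormal basis for $\mathrm{col}(X)$, the projection reduces to $U U' y$.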
 Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
-
-The boxes give some indication as to
-
-* the location of probability mass for each sample
-* whether the distribution is right-skewed (as is the lognormal distribution), etc
-
-Now let's put these ideas to use in a simulation.
-
-Consider the threshold autoregressive model in {eq}`statd_tar`.
-
-We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
-
-Let's observe this convergence from different initial conditions using
-boxplots.
-
-In particular, the exercise is to generate J boxplot figures, one for each initial condition $X_0$ in
-
-```{code-block} python3
-initial_conditions = np.linspace(8, 0, J)
+```{solution-start} sd_ex1
+:class: dropdown
 ```
-
-For each $X_0$ in this set,
-
-1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
-1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
-
-Use $\theta = 0.9, n = 20, k = 5000, J = 8$
-
-## Solutions
-
-### Exercise 1
-
 Look-ahead estimation of a TAR stationary density, where the TAR model
 is
@@ -925,7 +851,35 @@ ax.legend(loc='upper left')
 plt.show()
 ```
 
-### Exercise 2
+
+```{solution-end}
+```
+
+
+(statd_ex2)=
+```{exercise-start}
+:label: sd_ex2
+```
+
+Replicate the figure on global convergence {ref}`shown above <statd_egs>`.
+
+The densities come from the stochastic growth model treated {ref}`at the start of the lecture <solow_swan>`.
+
+Begin with the code found {ref}`above <stoch_growth>`.
+
+Use the same parameters.
+
+For the four initial distributions, use the shifted beta distributions
+
+```{code-block} python3
+ψ_0 = beta(5, 5, scale=0.5, loc=i*2)
+```
+
+```{exercise-end}
+```
+
+```{solution-start} sd_ex2
+:class: dropdown
+```
 
 Here's one program that does the job
@@ -974,7 +928,79 @@ for i in range(4):
 plt.show()
 ```
 
-### Exercise 3
+
+```{solution-end}
+```
+
+
+(statd_ex3)=
+```{exercise-start}
+:label: sd_ex3
+```
+
+A common way to compare distributions visually is with [boxplots](https://en.wikipedia.org/wiki/Box_plot).
+
+To illustrate, let's generate three artificial data sets and compare them with a boxplot.
+Each data set is represented by a box, where the top and bottom of the box are the third and first quartiles of the data, and the red line in the center is the median.
+
+The boxes give some indication as to
+
+* the location of probability mass for each sample
+* whether the distribution is right-skewed (as is the lognormal distribution), etc
+
+Now let's put these ideas to use in a simulation.
+
+Consider the threshold autoregressive model in {eq}`statd_tar`.
+
+We know that the distribution of $X_t$ will converge to {eq}`statd_tar_ts` whenever $|\theta| < 1$.
+
+Let's observe this convergence from different initial conditions using
+boxplots.
+
+In particular, the exercise is to generate $J$ boxplot figures, one for each initial condition $X_0$ in
+
+```{code-block} python3
+initial_conditions = np.linspace(8, 0, J)
+```
+
+For each $X_0$ in this set,
+
+1. Generate $k$ time-series of length $n$, each starting at $X_0$ and obeying {eq}`statd_tar`.
+1. Create a boxplot representing $n$ distributions, where the $t$-th distribution shows the $k$ observations of $X_t$.
+
+Use $\theta = 0.9, n = 20, k = 5000, J = 8$.
+
+```{exercise-end}
+```
+
+```{solution-start} sd_ex3
+:class: dropdown
+```
 
 Here's a possible solution.
 
@@ -984,7 +1010,7 @@ series for one boxplot all at once
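The body of this solution lies outside the excerpt. As a rough sketch only: the TAR law {eq}`statd_tar` is not reproduced here, so the update rule below, $X_{t+1} = \theta |X_t| + (1-\theta^2)^{1/2} \xi_{t+1}$ with $\xi_{t+1}$ standard normal, is an assumption about the lecture's model.

```python
import numpy as np
import matplotlib.pyplot as plt

# Parameters from the exercise statement
θ, n, k, J = 0.9, 20, 5000, 8
initial_conditions = np.linspace(8, 0, J)
rng = np.random.default_rng(0)

fig, axes = plt.subplots(J, 1, figsize=(8, 2 * J))
for ax, X0 in zip(axes, initial_conditions):
    X = np.empty((k, n))
    X[:, 0] = X0
    for t in range(n - 1):
        # Assumed TAR law: X_{t+1} = θ|X_t| + (1 - θ²)^{1/2} ξ_{t+1}
        X[:, t + 1] = θ * np.abs(X[:, t]) + np.sqrt(1 - θ**2) * rng.standard_normal(k)
    ax.boxplot(X)               # n boxes: the t-th shows the k draws of X_t
    ax.set_title(f"$X_0 = {X0:.1f}$")
plt.tight_layout()
plt.show()
```

All $k$ series for a given boxplot are generated at once, column by column, which is what the hunk header above ("series for one boxplot all at once") suggests the lecture's code also does.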