
Commit 0be6746

update subtitles
1 parent c401340 commit 0be6746

1 file changed: +25 -21 lines changed

lectures/orth_proj.md

Lines changed: 25 additions & 21 deletions
@@ -48,7 +48,7 @@ import numpy as np
 from scipy.linalg import qr
 ```

-### Further Reading
+### Further reading

 For background and foundational concepts, see our lecture [on linear algebra](https://python-intro.quantecon.org/linear_algebra.html).

@@ -58,7 +58,7 @@ For a complete set of proofs in a general setting, see, for example, {cite}`Roma

 For an advanced treatment of projection in the context of least squares prediction, see [this book chapter](http://www.tomsargent.com/books/TOMchpt.2.pdf).

-## Key Definitions
+## Key definitions

 Assume $x, z \in \mathbb R^n$.

@@ -117,15 +117,15 @@ $$
 = \| x_1 \|^2 + \| x_2 \|^2
 $$

-### Linear Independence vs Orthogonality
+### Linear independence vs orthogonality

 If $X \subset \mathbb R^n$ is an orthogonal set and $0 \notin X$, then $X$ is linearly independent.

 Proving this is a nice exercise.

 While the converse is not true, a kind of partial converse holds, as we'll {ref}`see below <gram_schmidt>`.

-## The Orthogonal Projection Theorem
+## The orthogonal projection theorem

 What vector within a linear subspace of $\mathbb R^n$ best approximates a given vector in $\mathbb R^n$?

@@ -155,7 +155,7 @@ The next figure provides some intuition

 ```

-### Proof of Sufficiency
+### Proof of sufficiency

 We'll omit the full proof.

@@ -175,7 +175,7 @@ $$

 Hence $\| y - z \| \geq \| y - \hat y \|$, which completes the proof.

-### Orthogonal Projection as a Mapping
+### Orthogonal projection as a mapping

 For a linear space $Y$ and a fixed linear subspace $S$, we have a functional relationship

@@ -209,7 +209,7 @@ From this, we can deduce additional useful properties, such as

 For example, to prove 1, observe that $y = P y + y - P y$ and apply the Pythagorean law.

-#### Orthogonal Complement
+#### Orthogonal complement

 Let $S \subset \mathbb R^n$.

@@ -250,7 +250,7 @@ The next figure illustrates

 ```

-## Orthonormal Basis
+## Orthonormal basis

 An orthogonal set of vectors $O \subset \mathbb R^n$ is called an **orthonormal set** if $\| u \| = 1$ for all $u \in O$.

@@ -290,7 +290,7 @@ $$

 Combining this result with {eq}`pob` verifies the claim.

-### Projection onto an Orthonormal Basis
+### Projection onto an orthonormal basis

 When a subspace onto which we project is orthonormal, computing the projection simplifies:

@@ -328,7 +328,7 @@ $$
 (Why is this sufficient to establish the claim that $y - P y \perp S$?)


-## Projection Via Matrix Algebra
+## Projection via matrix algebra

 Let $S$ be a linear subspace of $\mathbb R^n$ and let $y \in \mathbb R^n$.

@@ -389,7 +389,7 @@ $$
 The proof is now complete.
 ```

-### Starting with the Basis
+### Starting with the basis

 It is common in applications to start with $n \times k$ matrix $X$ with linearly independent columns and let

@@ -405,7 +405,7 @@ In this context, $P$ is often called the **projection matrix**

 * The matrix $M = I - P$ satisfies $M y = \hat E_{S^{\perp}} y$ and is sometimes called the **annihilator matrix**.

-### The Orthonormal Case
+### The orthonormal case

 Suppose that $U$ is $n \times k$ with orthonormal columns.

@@ -430,7 +430,7 @@ $$
 We have recovered our earlier result about projecting onto the span of an orthonormal
 basis.

-### Application: Overdetermined Systems of Equations
+### Application: overdetermined systems of equations

 Let $y \in \mathbb R^n$ and let $X$ be $n \times k$ with linearly independent columns.

@@ -485,15 +485,15 @@ $$
 This is what we aimed to show.
 ```

-## Least Squares Regression
+## Least squares regression

 Let's apply the theory of orthogonal projection to least squares regression.

 This approach provides insights about many geometric properties of linear regression.

 We treat only some examples.

-### Squared Risk Measures
+### Squared risk measures

 Given pairs $(x, y) \in \mathbb R^K \times \mathbb R$, consider choosing $f \colon \mathbb R^K \to \mathbb R$ to minimize
 the **risk**
@@ -628,7 +628,7 @@ From the {prf:ref}`opt` we have $y = \hat y + \hat u$ and $\hat u \perp \hat y$

 Applying the Pythagorean law completes the proof.

-## Orthogonalization and Decomposition
+## Orthogonalization and decomposition

 Let's return to the connection between linear independence and orthogonality touched on above.

@@ -637,7 +637,7 @@ A result of much interest is a famous algorithm for constructing orthonormal set
 The next section gives details.

 (gram_schmidt)=
-### Gram-Schmidt Orthogonalization
+### Gram-Schmidt orthogonalization

 ```{prf:theorem}
@@ -666,7 +666,7 @@ A Gram-Schmidt orthogonalization construction is a key idea behind the Kalman fi

 In some exercises below, you are asked to implement this algorithm and test it using projection.

-### QR Decomposition
+### QR decomposition

 The following result uses the preceding algorithm to produce a useful decomposition.

@@ -694,7 +694,7 @@ $$
 Some rearranging gives $X = Q R$.
 ```

-### Linear Regression via QR Decomposition
+### Linear regression via QR decomposition

 For matrices $X$ and $y$ that overdetermine $\beta$ in the linear
 equation system $y = X \beta$, we found the least squares approximator $\hat \beta = (X' X)^{-1} X' y$.
@@ -749,9 +749,13 @@ intuition as to why they should be idempotent?
 ```

 Symmetry and idempotence of $M$ and $P$ can be established
-using standard rules for matrix algebra. The intuition behind
+using standard rules for matrix algebra.
+
+The intuition behind
 idempotence of $M$ and $P$ is that both are orthogonal
-projections. After a point is projected into a given subspace, applying
+projections.
+
+After a point is projected into a given subspace, applying
 the projection again makes no difference (A point inside the subspace
 is not shifted by orthogonal projection onto that space because it is
 already the closest point in the subspace to itself).
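
The passages touched by this commit reference the projection matrix $P = X (X'X)^{-1} X'$, the annihilator matrix $M = I - P$, the least squares coefficients $\hat \beta = (X'X)^{-1} X' y$, and the QR route to those same coefficients. The following is a minimal numerical sketch of those facts and is not part of the commit itself; it reuses only the imports shown in the first hunk (numpy and scipy.linalg.qr), and the dimensions and random data are purely illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
n, k = 6, 3                          # illustrative sizes, not from the lecture
X = rng.standard_normal((n, k))      # assumed to have linearly independent columns
y = rng.standard_normal(n)

P = X @ np.linalg.inv(X.T @ X) @ X.T  # projection onto the column space of X
M = np.eye(n) - P                     # annihilator matrix

# Symmetry and idempotence of P and M
print(np.allclose(P, P.T), np.allclose(P @ P, P))
print(np.allclose(M, M.T), np.allclose(M @ M, M))

# Least squares via the normal equations and via the reduced QR decomposition
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)
Q, R = qr(X, mode="economic")
beta_qr = np.linalg.solve(R, Q.T @ y)
print(np.allclose(beta_normal, beta_qr))
```

All checks should print True: $P$ and $M$ come out symmetric and idempotent, and the QR-based coefficients agree with the normal-equation formula.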
