lectures/orth_proj.md
25 additions & 21 deletions
@@ -48,7 +48,7 @@ import numpy as np
 from scipy.linalg import qr
 ```

-### Further Reading
+### Further reading

 For background and foundational concepts, see our lecture [on linear algebra](https://python-intro.quantecon.org/linear_algebra.html).

@@ -58,7 +58,7 @@ For a complete set of proofs in a general setting, see, for example, {cite}`Roma

 For an advanced treatment of projection in the context of least squares prediction, see [this book chapter](http://www.tomsargent.com/books/TOMchpt.2.pdf).

-## Key Definitions
+## Key definitions

 Assume $x, z \in \mathbb R^n$.

@@ -117,15 +117,15 @@ $$
 = \| x_1 \|^2 + \| x_2 \|^2
 $$

-### Linear Independence vs Orthogonality
+### Linear independence vs orthogonality

 If $X \subset \mathbb R^n$ is an orthogonal set and $0 \notin X$, then $X$ is linearly independent.

 Proving this is a nice exercise.

 While the converse is not true, a kind of partial converse holds, as we'll {ref}`see below <gram_schmidt>`.

-## The Orthogonal Projection Theorem
+## The orthogonal projection theorem

 What vector within a linear subspace of $\mathbb R^n$ best approximates a given vector in $\mathbb R^n$?

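The context lines above state the Pythagorean law and the claim that an orthogonal set of nonzero vectors is linearly independent. A minimal numerical check of both, using only numpy (an illustrative sketch; the vectors are made up):

```python
import numpy as np

# Two orthogonal vectors in R^3
x1 = np.array([1.0, 2.0, 0.0])
x2 = np.array([-2.0, 1.0, 3.0])
assert np.isclose(x1 @ x2, 0.0)  # orthogonality

# Pythagorean law: ||x1 + x2||^2 = ||x1||^2 + ||x2||^2
lhs = np.linalg.norm(x1 + x2) ** 2
rhs = np.linalg.norm(x1) ** 2 + np.linalg.norm(x2) ** 2
assert np.isclose(lhs, rhs)

# An orthogonal set of nonzero vectors is linearly independent:
# stacked as columns, they give a matrix of full column rank
X = np.column_stack([x1, x2])
assert np.linalg.matrix_rank(X) == 2
```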
@@ -155,7 +155,7 @@ The next figure provides some intuition

 ```

-### Proof of Sufficiency
+### Proof of sufficiency

 We'll omit the full proof.

@@ -175,7 +175,7 @@ $$

 Hence $\| y - z \| \geq \| y - \hat y \|$, which completes the proof.

-### Orthogonal Projection as a Mapping
+### Orthogonal projection as a mapping

 For a linear space $Y$ and a fixed linear subspace $S$, we have a functional relationship

@@ -209,7 +209,7 @@ From this, we can deduce additional useful properties, such as

 For example, to prove 1, observe that $y = P y + y - P y$ and apply the Pythagorean law.

-#### Orthogonal Complement
+#### Orthogonal complement

 Let $S \subset \mathbb R^n$.

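The argument here rests on the orthogonal decomposition $y = P y + (y - P y)$. The numbered property list falls outside this diff; assuming property 1 is the induced norm identity $\| y \|^2 = \| P y \|^2 + \| y - P y \|^2$, a quick numerical illustration:

```python
import numpy as np

# Project y onto the span of a single unit vector u, so P y = (u @ y) * u
y = np.array([1.0, 3.0, -2.0])
u = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
Py = (u @ y) * u

# The two pieces of y = P y + (y - P y) are orthogonal
assert np.isclose(Py @ (y - Py), 0.0)

# The Pythagorean law then gives ||y||^2 = ||P y||^2 + ||y - P y||^2
assert np.isclose(np.linalg.norm(y) ** 2,
                  np.linalg.norm(Py) ** 2 + np.linalg.norm(y - Py) ** 2)
```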
@@ -250,7 +250,7 @@ The next figure illustrates

 ```

-## Orthonormal Basis
+## Orthonormal basis

 An orthogonal set of vectors $O \subset \mathbb R^n$ is called an **orthonormal set** if $\| u \| = 1$ for all $u \in O$.

@@ -290,7 +290,7 @@ $$

 Combining this result with {eq}`pob` verifies the claim.

-### Projection onto an Orthonormal Basis
+### Projection onto an orthonormal basis

 When a subspace onto which we project is orthonormal, computing the projection simplifies:

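The displayed simplification is not shown in this diff; in the lecture it is $P y = \sum_{i=1}^k \langle u_i, y \rangle u_i$ for an orthonormal basis $\{u_1, \ldots, u_k\}$ of $S$. A short sketch checking that formula against the equivalent matrix form $U U' y$ (basis and test vector are made up):

```python
import numpy as np

# Orthonormal basis for a 2-dimensional subspace of R^3
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])
y = np.array([2.0, -1.0, 4.0])

# P y = sum_i <u_i, y> u_i, computed term by term
Py = sum((U[:, i] @ y) * U[:, i] for i in range(U.shape[1]))

# Equivalent matrix form: P y = U U' y
assert np.allclose(Py, U @ U.T @ y)
```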
@@ -328,7 +328,7 @@ $$
 (Why is this sufficient to establish the claim that $y - P y \perp S$?)


-## Projection Via Matrix Algebra
+## Projection via matrix algebra

 Let $S$ be a linear subspace of $\mathbb R^n$ and let $y \in \mathbb R^n$.

@@ -389,7 +389,7 @@ $$
 The proof is now complete.
 ```

-### Starting with the Basis
+### Starting with the basis

 It is common in applications to start with $n \times k$ matrix $X$ with linearly independent columns and let

@@ -405,7 +405,7 @@ In this context, $P$ is often called the **projection matrix**

 * The matrix $M = I - P$ satisfies $M y = \hat E_{S^{\perp}} y$ and is sometimes called the **annihilator matrix**.

-### The Orthonormal Case
+### The orthonormal case

 Suppose that $U$ is $n \times k$ with orthonormal columns.

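Per the surrounding lecture, the projection matrix built from such an $X$ is $P = X (X' X)^{-1} X'$. A brief sketch of $P$ and the annihilator $M = I - P$ and their characteristic properties (simulated data; variable names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))   # n x k, columns independent w.p. 1
y = rng.standard_normal(5)

# Projection matrix onto the column space of X: P = X (X'X)^{-1} X'
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(5) - P                 # annihilator matrix

# P is idempotent and symmetric
assert np.allclose(P @ P, P)
assert np.allclose(P, P.T)

# M y lives in the orthogonal complement: it is orthogonal to each column of X
assert np.allclose(X.T @ (M @ y), 0.0)
```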
@@ -430,7 +430,7 @@ $$
 We have recovered our earlier result about projecting onto the span of an orthonormal
 basis.

-### Application: Overdetermined Systems of Equations
+### Application: overdetermined systems of equations

 Let $y \in \mathbb R^n$ and let $X$ be $n \times k$ with linearly independent columns.

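When $U$ has orthonormal columns, $U' U = I$, so the general formula collapses to $P = U U'$. Since the lecture imports `qr` from scipy.linalg, a natural illustration builds $U$ by a reduced QR decomposition and compares the two projection formulas (a sketch with simulated data):

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 2))
y = rng.standard_normal(5)

# Reduced QR gives U with orthonormal columns spanning col(X)
U, R = qr(X, mode='economic')

# Projection via the orthonormal case: P y = U U' y
Py_qr = U @ U.T @ y

# ...agrees with the general formula P y = X (X'X)^{-1} X' y
Py = X @ np.linalg.solve(X.T @ X, X.T @ y)
assert np.allclose(Py_qr, Py)
```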
@@ -485,15 +485,15 @@ $$
 This is what we aimed to show.
 ```

-## Least Squares Regression
+## Least squares regression

 Let's apply the theory of orthogonal projection to least squares regression.

 This approach provides insights about many geometric properties of linear regression.
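In this setting the least squares coefficient vector is $\hat \beta = (X' X)^{-1} X' y$, and the fitted values $X \hat \beta$ are exactly the orthogonal projection of $y$ onto the column space of $X$. A small sketch of that connection (simulated regression data; names are ours):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((20, 3))                           # regressors
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(20)

# Normal equations: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Same answer from the library least squares routine
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)

# Fitted values are the projection of y onto col(X),
# so the residuals are orthogonal to every regressor
resid = y - X @ beta_hat
assert np.allclose(X.T @ resid, 0.0)
```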