
Commit 8562936

[asset_pricing_lph] add exercises from Tom's notebook (#99)
* [asset_pricing_lph] add exercises from Tom's notebook
* minor edits and dollar math delimiter check
1 parent f12d9f2 commit 8562936

File tree

1 file changed: +285 -0 lines changed

lectures/asset_pricing_lph.md

Lines changed: 285 additions & 0 deletions
@@ -396,3 +396,288 @@ For excess returns $R^{ei} = R^i - R^f$ we have
$$
E R^{e i}=\beta_{i, a} \lambda_{a}+\beta_{i, b} \lambda_{b}+\cdots+\alpha_{i}, i=1, \ldots, I
$$

## Exercises (Introductory)

Let's start with some imports.
```{code-cell} ipython3
import numpy as np
from scipy import stats
import statsmodels.api as sm
from statsmodels.sandbox.regression.gmm import GMM
import matplotlib.pyplot as plt
%matplotlib inline
```
Lots of our calculations will involve computing population and sample OLS regressions.

So we define a function for simple univariate OLS regression that calls the `OLS` routine from `statsmodels`.
```{code-cell} ipython3
def simple_ols(X, Y, constant=False):
    # univariate OLS: return the coefficient on the (last) regressor
    # and the standard deviation of the residuals

    if constant:
        X = sm.add_constant(X)

    model = sm.OLS(Y, X)
    res = model.fit()

    β_hat = res.params[-1]
    σ_hat = np.sqrt(res.resid @ res.resid / res.df_resid)

    return β_hat, σ_hat
```
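As a quick sanity check on `simple_ols` (an illustrative example with made-up data, not part of the original lecture), we can regress $y = 2x + \text{noise}$ on $x$ and confirm that the estimated slope and residual standard deviation are close to the values used to generate the data.

```{code-cell} ipython3
# sanity check of simple_ols on synthetic data (illustrative only)
x_demo = np.random.normal(size=1000)
y_demo = 2.0 * x_demo + 0.1 * np.random.normal(size=1000)

simple_ols(x_demo, y_demo)  # slope ≈ 2.0, residual std ≈ 0.1
```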
### Exercise 1

Look at the equation,

$$
R^i_t - R^f = \beta_{i, R^m} (R^m_t - R^f) + \sigma_i \varepsilon_{i, t}.
$$

Verify that this equation is a regression equation.

### Exercise 2

Give a formula for the regression coefficient $\beta_{i, R^m}$.

### Exercise 3

Recall our earlier discussions of a **direct problem** and an **inverse problem**.

* A direct problem is about simulating a particular model.
* An inverse problem is about using data to **estimate** or **choose** a particular model from a manifold of models.

Please assume the parameter values set below and then simulate 2000 observations from the theory specified
above for 5 assets, $i = 1, \ldots, 5$.

\begin{align*}
E\left[R^f\right] &= 0.02 \\
\sigma_f &= 0.00 \\
\xi &= 0.06 \\
\lambda &= 0.04 \\
\beta_{1, R^m} &= 0.2 \\
\sigma_1 &= 0.04 \\
\beta_{2, R^m} &= 0.4 \\
\sigma_2 &= 0.04 \\
\beta_{3, R^m} &= 0.6 \\
\sigma_3 &= 0.04 \\
\beta_{4, R^m} &= 0.8 \\
\sigma_4 &= 0.04 \\
\beta_{5, R^m} &= 1.0 \\
\sigma_5 &= 0.04
\end{align*}
## Exercises (Intermediate)

Now come some even more fun parts!

Our theory implies that there exist values of two scalars, $a$ and $b$, such that a legitimate stochastic discount factor is:

$$
m_t = a + b R^m_t
$$

The parameters $a, b$ must satisfy the following equations:

\begin{align*}
E[(a + b R_t^m) R^m_t] &= 1 \\
E[(a + b R_t^m) R^f_t] &= 1
\end{align*}
### Exercise 4

Using the equations above, find a system of two **linear** equations that you can solve for $a$ and $b$ as functions of the parameters $(\lambda, \xi, E[R^f])$.

Write a function that can solve these equations.

Please check the **condition number** of a key matrix that must be inverted to determine $a, b$.

### Exercise 5

Using the estimates of the parameters that you generated above, compute the implied stochastic discount factor.
## Solutions (Introductory)

### Solution to Exercise 1

To verify that it is a **regression equation**, we must show that the residual is orthogonal to the regressor.

Our assumptions about mutual orthogonality imply that

$$
E\left[\epsilon_{i,t}\right]=0,\quad E\left[\epsilon_{i,t}u_{t}\right]=0
$$

It follows that

$$
\begin{aligned}
E\left[\sigma_{i}\epsilon_{i,t}\left(R_{t}^{m}-R^{f}\right)\right]&=E\left[\sigma_{i}\epsilon_{i,t}\left(\xi+\lambda u_{t}\right)\right] \\
&=\sigma_{i}\xi E\left[\epsilon_{i,t}\right]+\sigma_{i}\lambda E\left[\epsilon_{i,t}u_{t}\right] \\
&=0
\end{aligned}
$$
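As an informal numerical illustration of this orthogonality (a self-contained sketch with made-up parameter values, not part of the original solution), we can draw independent shocks and confirm that the sample covariance between $\sigma_i \epsilon_{i,t}$ and $\xi + \lambda u_t$ is close to zero.

```{code-cell} ipython3
# illustrative check: with independent shocks, the residual term σ_i ε_{i,t}
# is (approximately) uncorrelated with the excess market return ξ + λ u_t
T_demo, ξ_demo, λ_demo, σ_demo = 100_000, 0.06, 0.08, 0.04
ε_demo = np.random.normal(size=T_demo)
u_demo = np.random.normal(size=T_demo)
np.cov(σ_demo * ε_demo, ξ_demo + λ_demo * u_demo)[0, 1]  # ≈ 0
```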
### Solution to Exercise 2

The regression coefficient $\beta_{i, R^m}$ is

$$
\beta_{i,R^{m}}=\frac{Cov\left(R_{t}^{i}-R^{f},R_{t}^{m}-R^{f}\right)}{Var\left(R_{t}^{m}-R^{f}\right)}
$$

### Solution to Exercise 3

**Direct Problem:**

```{code-cell} ipython3
# Code for the direct problem

# assign the parameter values
ERf = 0.02
σf = 0.00  # the risk-free rate is constant when σf = 0; raise σf to make it random
ξ = 0.06
λ = 0.08
βi = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
σi = np.array([0.04, 0.04, 0.04, 0.04, 0.04])
```

```{code-cell} ipython3
# in this cell we set the number of assets and number of observations
# we first set T to a large number to verify our computation results
T = 2000
N = 5
```
```{code-cell} ipython3
# simulate i.i.d. random shocks
e = np.random.normal(size=T)
u = np.random.normal(size=T)
ϵ = np.random.normal(size=(N, T))
```

```{code-cell} ipython3
# simulate the return on a risk-free asset
Rf = ERf + σf * e

# simulate the return on the market portfolio
excess_Rm = ξ + λ * u
Rm = Rf + excess_Rm

# simulate the return on asset i
Ri = np.empty((N, T))
for i in range(N):
    Ri[i, :] = Rf + βi[i] * excess_Rm + σi[i] * ϵ[i, :]
```
Now that we have a panel of data, we'd like to solve the inverse problem: assume the theory specified above and estimate its coefficients from the simulated data.

```{code-cell} ipython3
# Code for the inverse problem
```

**Inverse Problem:**

We will solve the inverse problem by simple OLS regressions.

1. estimate $E\left[R^f\right]$ and $\sigma_f$

```{code-cell} ipython3
ERf_hat, σf_hat = simple_ols(np.ones(T), Rf)
```
```{code-cell} ipython3
ERf_hat, σf_hat
```

Let's compare these with the _true_ population parameter values.

```{code-cell} ipython3
ERf, σf
```

2. estimate $\xi$ and $\lambda$

```{code-cell} ipython3
ξ_hat, λ_hat = simple_ols(np.ones(T), Rm - Rf)
```

```{code-cell} ipython3
ξ_hat, λ_hat
```

```{code-cell} ipython3
ξ, λ
```

3. estimate $\beta_{i, R^m}$ and $\sigma_i$

```{code-cell} ipython3
βi_hat = np.empty(N)
σi_hat = np.empty(N)

for i in range(N):
    βi_hat[i], σi_hat[i] = simple_ols(Rm - Rf, Ri[i, :] - Rf)
```

```{code-cell} ipython3
βi_hat, σi_hat
```

```{code-cell} ipython3
βi, σi
```
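As a cross-check of the formula in the Solution to Exercise 2 (an illustrative addition that reuses the simulated `Rf`, `Rm`, and `Ri` from above), we can also estimate each slope from sample covariances; the numbers should be close to `βi_hat`, with small differences because the regressions above omit an intercept.

```{code-cell} ipython3
# estimate each slope from the sample covariance / variance formula
# in the Solution to Exercise 2 and compare with the OLS estimates
βi_cov = np.array([np.cov(Ri[i, :] - Rf, Rm - Rf)[0, 1] / np.var(Rm - Rf, ddof=1)
                   for i in range(N)])
βi_cov, βi_hat
```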
Q: How close did your estimates come to the parameters we specified?

## Solutions (Intermediate)

### Solution to Exercise 4

Using the model's moments $E[R^m_t] = E(R^f) + \xi$, $E[(R^m_t)^2] = (E(R^f) + \xi)^2 + \lambda^2 + \sigma_f^2$, and $E[R^m_t R^f_t] = E(R^f)^2 + \xi E(R^f) + \sigma_f^2$, the two moment conditions above become the linear system

\begin{align*}
a (E(R^f) + \xi) + b ((E(R^f) + \xi)^2 + \lambda^2 + \sigma_f^2) & = 1 \\
a E(R^f) + b (E(R^f)^2 + \xi E(R^f) + \sigma_f^2) & = 1
\end{align*}

```{code-cell} ipython3
# solve the two linear equations for (a, b) and report the condition number
# of the matrix that must be inverted
def solve_ab(ERf, σf, λ, ξ):

    M = np.empty((2, 2))
    M[0, 0] = ERf + ξ
    M[0, 1] = (ERf + ξ) ** 2 + λ ** 2 + σf ** 2
    M[1, 0] = ERf
    M[1, 1] = ERf ** 2 + ξ * ERf + σf ** 2

    a, b = np.linalg.solve(M, np.ones(2))
    condM = np.linalg.cond(M)

    return a, b, condM
```
Let's try to solve for $a$ and $b$ using the actual model parameters.

```{code-cell} ipython3
a, b, condM = solve_ab(ERf, σf, λ, ξ)
```

```{code-cell} ipython3
a, b, condM
```

### Solution to Exercise 5

Now let's pass $\hat{E}(R^f), \hat{\sigma}_f, \hat{\lambda}, \hat{\xi}$ to the function `solve_ab`.

```{code-cell} ipython3
a_hat, b_hat, condM_hat = solve_ab(ERf_hat, σf_hat, λ_hat, ξ_hat)
```

```{code-cell} ipython3
a_hat, b_hat, condM_hat
```
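As an informal check (an addition, not part of the original solution), we can construct the implied stochastic discount factor $m_t = \hat{a} + \hat{b} R^m_t$ from the simulated data and verify that it approximately prices the returns, i.e. that the sample averages of $m_t R^f_t$ and $m_t R^i_t$ are close to one.

```{code-cell} ipython3
# implied stochastic discount factor from the estimated (a, b)
m_hat = a_hat + b_hat * Rm

# check E[m R] ≈ 1 for the risk-free asset and for each of the five assets
np.mean(m_hat * Rf), (m_hat * Ri).mean(axis=1)
```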
