Consider the following problem reminiscent of one described earlier.

Among all covariance stationary univariate processes with unconditional variance $\sigma_x^2$, find a process with maximal one-step-ahead prediction error.

The maximizer is a process with spectral density

$$
S_x(\omega) = 2 \pi \sigma_x^2 .
$$

Thus, among all covariance stationary univariate processes with unconditional variance $\sigma_x^2$, a process with a flat spectral density maximizes the one-step-ahead prediction error variance.
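
To make this concrete, here is a minimal numerical sketch. It assumes Kolmogorov's formula, under which the one-step-ahead prediction error variance equals the geometric mean $\exp\left(\frac{1}{2\pi}\int_{-\pi}^{\pi} \log f_x(\omega)\, d\omega\right)$ of a spectral density $f_x$ normalized so that $\frac{1}{2\pi}\int_{-\pi}^{\pi} f_x(\omega)\, d\omega = \sigma_x^2$; this normalization may differ from the one used for $S_x$ above.

```python
import numpy as np

# Kolmogorov's formula (under the normalization stated above): the one-step-
# ahead prediction error variance is the geometric mean of the spectral density.
sigma_x2 = 2.0                                     # common unconditional variance
omega = np.linspace(-np.pi, np.pi, 4096, endpoint=False)

def prediction_error_variance(f):
    """exp( (1/2π) ∫ log f(ω) dω ), computed as a uniform grid average."""
    return np.exp(np.mean(np.log(f)))

# White noise: a flat spectral density carrying variance σ_x²
f_flat = np.full_like(omega, sigma_x2)

# AR(1) process x_t = ρ x_{t-1} + ε_t with the same unconditional variance,
# so σ_ε² = (1 - ρ²) σ_x² and f(ω) = σ_ε² / |1 - ρ e^{-iω}|²
rho = 0.8
sigma_e2 = (1 - rho**2) * sigma_x2
f_ar1 = sigma_e2 / np.abs(1 - rho * np.exp(-1j * omega))**2

print(prediction_error_variance(f_flat))   # ≈ 2.00 = σ_x²: nothing is predictable
print(prediction_error_variance(f_ar1))    # ≈ 0.72 = (1 - ρ²) σ_x² < σ_x²
```

Because a geometric mean never exceeds the corresponding arithmetic mean, and the two coincide exactly when $f_x$ is flat, white noise attains the maximal one-step-ahead prediction error.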

$$ (eq:Shannon22)

Being a measure of the unpredictability of an $n \times 1$ vector covariance stationary stochastic process,
the left side of {eq}`eq:Shannon22` is sometimes called entropy.

## Frequency Domain Robust Control

Chapter 8 of {cite}`hansen2008robustness` adapts work in the control theory literature to define a **frequency domain entropy** criterion for robust control as

$$
\int_\Gamma \log \det [ \theta I - G_F(\zeta)' G_F(\zeta) ] d \lambda(\zeta) ,
$$ (eq:Shannon21)

objective function.

Hansen and Sargent {cite}`hansen2008robustness` show that criterion {eq}`eq:Shannon21` can be represented as

$$
\log \det [ D(0)' D(0)] = \int_\Gamma \log \det [ \theta I - G_F(\zeta)' G_F(\zeta) ] d \lambda(\zeta) ,
$$ (eq:Shannon220)

This explains the
moniker **maximum entropy** robust control for decision rules $F$ designed to maximize criterion {eq}`eq:Shannon21`.
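
In the scalar case, {eq}`eq:Shannon220` is a Szegő-type geometric-mean identity that is easy to check numerically. The sketch below rests on assumptions this excerpt does not pin down: $\Gamma$ is taken to be the unit circle with $\lambda$ the normalized Lebesgue measure $d\omega/2\pi$, the transfer function $G_F(\zeta) = \beta_0 + \beta_1 \zeta$ is hypothetical, and $D$ is the canonical spectral factor of $\theta - G_F(\zeta)' G_F(\zeta)$.

```python
import numpy as np

# Scalar numerical check of (eq:Shannon220) under the assumptions stated above.
theta, b0, b1 = 4.0, 0.5, 0.4
omega = np.linspace(-np.pi, np.pi, 4096, endpoint=False)

# f(ω) = θ - |G_F(e^{iω})|² = c0 - 2 c1 cos ω, an MA(1)-type spectral density
f = theta - np.abs(b0 + b1 * np.exp(1j * omega))**2
c0, c1 = theta - b0**2 - b1**2, b0 * b1

# Right side: ∫ log f dλ with dλ = dω / 2π, computed as a grid average
rhs = np.mean(np.log(f))

# Left side: factor f(ω) = σ² |1 - m e^{iω}|² with |m| < 1, so that
# D(ζ) = σ (1 - m ζ) is the canonical factor and D(0)'D(0) = σ²
m = (c0 - np.sqrt(c0**2 - 4 * c1**2)) / (2 * c1)
sigma2 = c1 / m
lhs = np.log(sigma2)

print(lhs, rhs)   # both ≈ 1.275: the two sides of (eq:Shannon220) agree
```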
## Relative Entropy for a Continuous Random Variable
Let $x$ be a continuous random variable with density $\phi(x)$, and let $g(x)$ be a nonnegative random variable satisfying $\int g(x) \phi(x) dx = 1$.

over the interval $g \geq 0$.

That relative entropy $\textrm{ent}(g) \geq 0$ can be established by noting (a) that $g \log g \geq g - 1$ (see {numref}`figure-example2`) and (b) that under $\phi$, $E g = 1$, so that $\textrm{ent}(g) = E \, g \log g \geq E (g - 1) = 0$.

{numref}`figure-example3` and {numref}`figure-example4` display aspects of relative entropy visually for a continuous random variable $x$ for
two densities with likelihood ratio $g \geq 0$.

With numerator density ${\mathcal N}(0,1)$ and denominator Gaussian densities ${\mathcal N}(0,1.5)$ and ${\mathcal N}(0,.95)$, respectively, {numref}`figure-example3` and {numref}`figure-example4` display the functions $g \log g$ and $g - 1$ as functions of $x$.
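
The numbers behind these figures are easy to reproduce. The sketch below reads ${\mathcal N}(0, v)$ as mean $0$ and variance $v$ (an assumption; the text does not say whether the second parameter is a variance or a standard deviation), integrates $g \log g$ against the denominator density, and checks the result against the closed-form Gaussian relative entropy $\tfrac{1}{2}\left(\tfrac{1}{v} - 1 + \log v\right)$.

```python
import numpy as np
from scipy.stats import norm

# ent(g) = ∫ g log g φ dx, where g is the ratio of a N(0,1) density to a
# N(0,v) density φ, treating the second parameter as a variance (assumed).
x = np.linspace(-10, 10, 200_001)
dx = x[1] - x[0]

def entropy_check(v):
    phi_num = norm.pdf(x)                        # numerator density N(0,1)
    phi_den = norm.pdf(x, scale=np.sqrt(v))      # denominator density N(0,v)
    g = phi_num / phi_den                        # likelihood ratio
    Eg = np.sum(g * phi_den) * dx                # E g under φ: equals 1
    ent = np.sum(g * np.log(g) * phi_den) * dx   # E g log g ≥ E (g - 1) = 0
    closed_form = 0.5 * (1 / v - 1 + np.log(v))  # Gaussian relative entropy
    return Eg, ent, closed_form

for v in (1.5, 0.95):
    print(entropy_check(v))
# (1.0, 0.0361, 0.0361) for v = 1.5 and (1.0, 0.0007, 0.0007) for v = 0.95
```

Both values of $\textrm{ent}(g)$ are positive, as $g \log g \geq g - 1$ and $E g = 1$ guarantee, and the value is far smaller for the denominator ${\mathcal N}(0,.95)$, which is closer to the numerator density.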
```{figure} entropy_glogg.png
:height: 350px
:name: figure-example2

The function $g \log g$ for $g \geq 0$. For a random variable $g$ with $E g =1$, $E g \log g \geq 0$.
```
```{figure} entropy_1_over_15.jpg
:height: 350px
:name: figure-example3

$g \log g$ and $g-1$ where $g$ is the ratio of the density of a ${\mathcal N}(0,1)$ random variable to the density of a ${\mathcal N}(0,1.5)$ random variable.
Under the ${\mathcal N}(0,1.5)$ density, $E g =1$.
```
```{figure} entropy_1_over_95.png
:height: 350px
:name: figure-example4

$g \log g$ and $g-1$ where $g$ is the ratio of the density of a ${\mathcal N}(0,1)$ random variable to the density of a ${\mathcal N}(0,.95)$ random variable.
Under the ${\mathcal N}(0,.95)$ density, $E g =1$.
```
0 commit comments