(Here we contract .) In the last step we notice that, geometrically, the terms are always on opposite sides of the unit circle, and similarly for the other two terms. They cancel, giving a total sum of 0.
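The cancellation argument can be sketched numerically. This is a hedged stand-in (the actual terms are not reproduced above): a point on the unit circle and its antipode sum to zero, so terms paired with opposite-side partners cancel exactly.

```python
import numpy as np

# Illustrative only: e^{i*theta} and e^{i*(theta + pi)} are antipodal points
# on the unit circle, so each pair sums to 0 and the whole sum vanishes.
rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, size=3)   # three arbitrary angles
points = np.exp(1j * angles)                     # terms on the unit circle
opposites = np.exp(1j * (angles + np.pi))        # their opposite-side partners

total = np.sum(points + opposites)
print(abs(total))  # ~0: everything cancels pairwise
```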
b.
Here, if we let and , then:
Which is:
Looking at our figure, notice that:
The average distribution is centered at , validating that answer.
It's odd that the variance is , implying that . It seemed that should change over time with the graph, but this makes sense: each point in the graph has the opposite distribution on the other end of the line.
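The time-constant variance can be checked by simulation. This is a sketch with an assumed stand-in process X(t) = cos(t + Θ), Θ ~ Uniform(0, 2π); the particular process and constants are illustrative, not taken from the problem.

```python
import numpy as np

# Each time instant sees the full spread of random phases, so the mean stays
# at 0 and the variance stays at 1/2 no matter which t we pick.
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

for t in (0.0, 0.7, 2.5):
    x = np.cos(t + theta)
    print(f"t={t}: mean={x.mean():+.4f}, var={x.var():.4f}")  # mean≈0, var≈0.5
```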
☐
11.13
Question
Proof
We now want to find the autocorrelation, which is:
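An autocorrelation of this kind can be sanity-checked by Monte Carlo. Since the process itself is not shown above, the example below uses an assumed random-phase sinusoid, whose closed form is R_X(t1, t2) = cos(t1 − t2)/2; the estimator itself is generic.

```python
import numpy as np

# Estimate R_X(t1, t2) = E[X(t1) X(t2)] by averaging over many sample paths,
# using the assumed process X(t) = cos(t + Theta), Theta ~ Uniform(0, 2*pi).
rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

def R_hat(t1, t2):
    """Average X(t1) * X(t2) over the simulated sample paths."""
    return float(np.mean(np.cos(t1 + theta) * np.cos(t2 + theta)))

t1, t2 = 1.0, 2.5
print(R_hat(t1, t2), np.cos(t1 - t2) / 2.0)  # the two agree closely
```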
☐
11.15
Question
Proof
a.
for all .
b. If, for all , we have , then since:
When , since for all and the mean of is zero, is larger than its mean. If was larger than its mean, that would imply positive covariance (which is the case here). Hence we expect .
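The positive-covariance intuition can be illustrated numerically. This sketch assumes a sum process S_n = X_1 + … + X_n with i.i.d. zero-mean, unit-variance steps (an assumption, since the process is not reproduced above): if S_n sits above its mean, later values tend to as well, and Cov(S_n, S_m) = min(n, m) > 0.

```python
import numpy as np

# Simulate many sample paths of the assumed sum process and estimate
# Cov(S_n, S_m) directly; the steps are zero-mean, so E[S_n S_m] is the cov.
rng = np.random.default_rng(3)
steps = rng.normal(0.0, 1.0, size=(100_000, 10))
S = np.cumsum(steps, axis=1)            # S[:, k-1] holds S_k for each path

n, m = 3, 7
cov = float(np.mean(S[:, n - 1] * S[:, m - 1]))
print(cov)  # ≈ min(n, m) = 3, and in particular positive
```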
c. It is:
d. Since is linear and each is Gaussian, their linear combination must be Gaussian. We just need to find the and here. Notice:
and
(this is essentially the square-root law in action). Thus , where the right-hand side represents a normal distribution with the specified and .
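The square-root law here can be sketched as follows: the average of n i.i.d. N(μ, σ²) variables is itself Gaussian with mean μ and standard deviation σ/√n. The values of μ, σ, and n below are illustrative choices, not values from the problem.

```python
import numpy as np

# Draw 200k sample means, each an average of n i.i.d. Gaussians, and check
# that their spread shrinks to sigma / sqrt(n) while the center stays at mu.
rng = np.random.default_rng(4)
mu, sigma, n = 2.0, 3.0, 25

means = rng.normal(mu, sigma, size=(200_000, n)).mean(axis=1)
print(means.mean(), means.std())  # ≈ mu = 2.0 and ≈ sigma / sqrt(n) = 0.6
```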
☐
11.19
Question
Proof
We can assume that . So then:
for all , where the arguments may be swapped. That implies that:
a. We'll find the expected value in terms of the other expected values:
b. We'll find the autocorrelation function in terms of other autocorrelation functions (and expected values in this case):
c. We'll find the autocovariance function in terms of other autocovariance functions, using :
d. We'll find the variance in terms of the other variances:
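The identities in parts a–d can be checked numerically. This is a hedged sketch assuming the process in question is a sum Z(t) = X(t) + Y(t) with X and Y independent (the actual definition is not reproduced above); under that assumption m_Z = m_X + m_Y, C_Z = C_X + C_Y, and Var_Z = Var_X + Var_Y.

```python
import numpy as np

# Build two independent processes, each sampled at two time instants, with a
# shared component so that X(t1) and X(t2) are correlated within each process.
rng = np.random.default_rng(5)
N = 500_000

a = rng.normal(0.0, 1.0, N)
x1 = 1.0 + a + rng.normal(0.0, 1.0, N)     # X(t1), mean 1
x2 = 1.0 + a + rng.normal(0.0, 1.0, N)     # X(t2)
c = rng.normal(0.0, 1.0, N)
y1 = 2.0 + c + rng.normal(0.0, 1.0, N)     # Y(t1), mean 2
y2 = 2.0 + c + rng.normal(0.0, 1.0, N)     # Y(t2)

z1, z2 = x1 + y1, x2 + y2                  # Z at the two time instants

def C(u, v):
    """Sample autocovariance between two time samples."""
    return float(np.mean((u - u.mean()) * (v - v.mean())))

print(z1.mean(), "vs", x1.mean() + y1.mean())   # m_Z  = m_X + m_Y
print(C(z1, z2), "vs", C(x1, x2) + C(y1, y2))   # C_Z  = C_X + C_Y
print(C(z1, z1), "vs", C(x1, x1) + C(y1, y1))   # Var_Z = Var_X + Var_Y
```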