Ch 11 - 11, 13, 15, 19

11.11

Question

Pasted image 20241107171508.png
For context the pdf is:
Pasted image 20241107171727.png

Proof

a.

$$\begin{aligned}
E[X(t)] &= E\left[\cos\!\left(2\pi f_0 t + \tfrac{\pi}{4} + K\tfrac{\pi}{2}\right)\right] = \sum_k \cos\!\left(2\pi f_0 t + \tfrac{\pi}{4} + k\tfrac{\pi}{2}\right) p(k) = \sum_{k=0}^{3} \cos\!\left(2\pi f_0 t + \tfrac{\pi}{4} + k\tfrac{\pi}{2}\right) \frac{1}{4} \\
&= \frac{1}{4}\left[\cos\!\left(\beta(t)+\tfrac{\pi}{4}\right) + \cos\!\left(\beta(t)+\tfrac{3\pi}{4}\right) + \cos\!\left(\beta(t)+\tfrac{5\pi}{4}\right) + \cos\!\left(\beta(t)+\tfrac{7\pi}{4}\right)\right] \\
&= \frac{1}{4} \cdot 0 = 0
\end{aligned}$$

(Here we abbreviate $\beta(t) = 2\pi f_0 t$.) In the last step, note that $\cos\!\left(\beta + \tfrac{\pi}{4}\right) = -\cos\!\left(\beta + \tfrac{5\pi}{4}\right)$ geometrically (the two angles are always diametrically opposite on the unit circle), and similarly for the other two terms. The pairs cancel, giving a total sum of 0.
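As a numerical sanity check, here is a minimal Monte Carlo sketch; the values $f_0 = 1$ and $t = 0.3$ are arbitrary assumptions, since the result holds for any choice:

```python
import numpy as np

rng = np.random.default_rng(0)
f0, t = 1.0, 0.3                        # assumed values; E[X(t)] = 0 for any f0, t
k = rng.integers(0, 4, size=100_000)    # K uniform on {0, 1, 2, 3}
x = np.cos(2 * np.pi * f0 * t + np.pi / 4 + k * np.pi / 2)
print(x.mean())                         # close to 0
```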

b.

$$\begin{aligned}
\mathrm{Var}(X(t)) &= E\left[(X(t) - \mu_X(t))^2\right] = \sum_{k=0}^{3} \left(\cos\!\left(2\pi f_0 t + \tfrac{\pi}{4} + k\tfrac{\pi}{2}\right) - 0\right)^2 \frac{1}{4} = \frac{1}{4} \sum_{k=0}^{3} \cos^2\!\left(2\pi f_0 t + \tfrac{\pi}{4} + k\tfrac{\pi}{2}\right) \\
&= \frac{1}{4}\left(\cos^2\!\left(2\pi f_0 t + \tfrac{\pi}{4}\right) + \cos^2\!\left(2\pi f_0 t + \tfrac{3\pi}{4}\right) + \cos^2\!\left(2\pi f_0 t + \tfrac{5\pi}{4}\right) + \cos^2\!\left(2\pi f_0 t + \tfrac{7\pi}{4}\right)\right)
\end{aligned}$$

Now let $a^2(t) = \cos^2\!\left(2\pi f_0 t + \tfrac{\pi}{4}\right) = \cos^2\!\left(2\pi f_0 t + \tfrac{5\pi}{4}\right)$ and $b^2(t) = \cos^2\!\left(2\pi f_0 t + \tfrac{3\pi}{4}\right) = \cos^2\!\left(2\pi f_0 t + \tfrac{7\pi}{4}\right)$ (angles that differ by $\pi$ have equal squared cosines). Then:

$$\mathrm{Var}(X(t)) = \frac{1}{4}\left(2a^2(t) + 2b^2(t)\right) = \frac{a^2(t) + b^2(t)}{2}$$

Which is:

$$= \frac{1}{2}\left(\cos^2\!\left(2\pi f_0 t + \tfrac{\pi}{4}\right) + \cos^2\!\left(2\pi f_0 t + \tfrac{3\pi}{4}\right)\right) = \frac{1}{2}\left(\cos^2\!\left(2\pi f_0 t + \tfrac{\pi}{4}\right) + \sin^2\!\left(2\pi f_0 t + \tfrac{\pi}{4}\right)\right) = \frac{1}{2}$$

where the middle step uses $\cos\!\left(\theta + \tfrac{\pi}{2}\right) = -\sin(\theta)$.
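A short Monte Carlo sketch (with assumed values $f_0 = 2$ and $t = 0.7$, both arbitrary) confirms the variance is approximately $1/2$:

```python
import numpy as np

rng = np.random.default_rng(1)
f0, t = 2.0, 0.7                        # assumed values; Var(X(t)) = 1/2 for any f0, t
k = rng.integers(0, 4, size=100_000)    # K uniform on {0, 1, 2, 3}
x = np.cos(2 * np.pi * f0 * t + np.pi / 4 + k * np.pi / 2)
print(x.var())                          # close to 0.5
```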

The figure below shows the four possible phase points equally spaced on the unit circle, which makes both the cancellation in (a) and the pairing in (b) visible.

Pasted image 20241107174212.png

11.13

Question

Pasted image 20241107174527.png

Proof

$$\begin{aligned}
C_{XX}(t,s) &= \mathrm{Cov}(X(t), X(s)) = \mathrm{Cov}\!\left(v_0 + A\cos(\omega_0 t + \theta_0),\; v_0 + A\cos(\omega_0 s + \theta_0)\right) \\
&= \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)\,\mathrm{Cov}(A, A) = \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)\,\mathrm{Var}(A)
\end{aligned}$$

We now want the autocorrelation $R_{XX}(t,s)$, which is:

$$\begin{aligned}
R_{XX}(t,s) &= C_{XX}(t,s) + E[X(t)]\,E[X(s)] \\
&= \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)\left(E[A^2] - E[A]^2\right) + \left(v_0 + \cos(\omega_0 t + \theta_0)E[A]\right)\left(v_0 + \cos(\omega_0 s + \theta_0)E[A]\right) \\
&= \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)E[A^2] - \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)E[A]^2 \\
&\quad + v_0^2 + v_0\cos(\omega_0 t + \theta_0)E[A] + v_0\cos(\omega_0 s + \theta_0)E[A] + \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)E[A]^2 \\
&= v_0^2 + v_0\cos(\omega_0 t + \theta_0)E[A] + v_0\cos(\omega_0 s + \theta_0)E[A] + \cos(\omega_0 t + \theta_0)\cos(\omega_0 s + \theta_0)E[A^2]
\end{aligned}$$
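To check the final expression numerically, here is a hedged simulation sketch; the values $v_0 = 1$, $\omega_0 = 2\pi$, $\theta_0 = 0$, $(t, s) = (0.2, 0.5)$ and the distribution $A \sim \mathrm{Uniform}(0, 2)$ are all assumptions for illustration, since the problem leaves them general:

```python
import numpy as np

rng = np.random.default_rng(2)
v0, w0, th0 = 1.0, 2 * np.pi, 0.0        # assumed constants
t, s = 0.2, 0.5                          # assumed sample times
A = rng.uniform(0, 2, size=200_000)      # assumed A ~ Uniform(0, 2): E[A] = 1, E[A^2] = 4/3
Xt = v0 + A * np.cos(w0 * t + th0)
Xs = v0 + A * np.cos(w0 * s + th0)
empirical = np.mean(Xt * Xs)             # Monte Carlo estimate of R_XX(t, s)
ct, cs = np.cos(w0 * t + th0), np.cos(w0 * s + th0)
formula = v0**2 + v0 * ct * 1.0 + v0 * cs * 1.0 + ct * cs * (4 / 3)
print(empirical, formula)                # the two agree
```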

11.15

Question

Pasted image 20241108154122.png
Pasted image 20241108154129.png

Proof

a.

$$\mathrm{Var}(N(t)) = \mathrm{Cov}(N(t), N(t)) = C_{NN}(t,t) = e^{-|t-t|} = e^0 = 1,$$

so $\sigma_N^2(t) = 1$ for all $t$.

b. Suppose we observe $N(t) > 0$. Since:

$$\mathrm{Cov}(N(t), N(s)) = C_{NN}(t,s) = e^{-|s-t|} > 0$$

the covariance between $N(t)$ and $N(s)$ is strictly positive for all $t, s$. The mean of the process is zero, so $N(t) > 0$ means $N(t)$ is above its mean. Positive covariance means that when one variable is above its mean, the other tends to be above its mean as well. Hence we expect $N(s) > 0$.

c. It is:

$$\rho = \mathrm{Corr}(N(10), N(12)) = \frac{\mathrm{Cov}(N(10), N(12))}{\sigma_N(10)\,\sigma_N(12)} = \frac{C_{NN}(10, 12)}{1 \cdot 1} = e^{-|12 - 10|} = e^{-2}$$

d. Since $N(12) - N(10)$ is a linear combination of jointly Gaussian random variables, it is itself Gaussian. We just need to find its $\mu$ and $\sigma^2$. Notice:

$$\mu = E[N(12) - N(10)] = E[N(12)] - E[N(10)] = 0 - 0 = 0$$

and

$$\sigma^2 = \sigma_N^2(12) + \sigma_N^2(10) - 2\,\mathrm{Cov}(N(10), N(12)) = 1 + 1 - 2e^{-2} = 2 - 2e^{-2}$$

(The variances add, but the positive covariance between nearby samples reduces the spread of the difference.) Thus $N(12) - N(10) \sim \mathcal{N}\!\left(0,\; 2 - 2e^{-2}\right)$, where $\mathcal{N}$ denotes a normal distribution with the given mean and variance.
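A quick simulation sketch can verify both (c) and (d): $N(10)$ and $N(12)$ are jointly Gaussian with unit variances and covariance $e^{-2}$, so we can draw them directly:

```python
import numpy as np

rng = np.random.default_rng(3)
c = np.exp(-2.0)                         # C_NN(10, 12) = e^{-|12 - 10|}
cov = [[1.0, c], [c, 1.0]]               # unit variances on the diagonal
n10, n12 = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T
print(np.corrcoef(n10, n12)[0, 1])       # close to e^{-2} ≈ 0.135
print(np.var(n12 - n10))                 # close to 2 - 2e^{-2} ≈ 1.729
```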

11.19

Question

Pasted image 20241108155533.png

Proof

Since $S$ and $N$ are uncorrelated, we have $\mathrm{Corr}(S(t), N(s)) = 0 \implies \mathrm{Cov}(S(t), N(s)) = 0$. So then:

$$0 = E[S(t)N(s)] - E[S(t)]\,E[N(s)] \implies E[S(t)N(s)] = E[S(t)]\,E[N(s)]$$

for all $t, s$ (and likewise with $t$ and $s$ swapped). That implies:

$$R_{SN}(t,s) = E[S(t)N(s)] = E[S(t)]\,E[N(s)] = \mu_S(t)\,\mu_N(s)$$

a. We'll find the expected value in terms of the other expected values:

$$E[X(t)] = E[S(t) + N(t)] = E[S(t)] + E[N(t)] = \mu_S(t) + \mu_N(t)$$

b. We'll find the autocorrelation function in terms of other autocorrelation functions (and expected values in this case):

$$\begin{aligned}
R_{XX}(t,s) &= E[X(t)X(s)] = E[(S(t) + N(t))(S(s) + N(s))] \\
&= E[S(t)S(s) + S(t)N(s) + S(s)N(t) + N(t)N(s)] \\
&= R_{SS}(t,s) + R_{SN}(t,s) + R_{NS}(t,s) + R_{NN}(t,s) \\
&= R_{SS}(t,s) + \mu_S(t)\,\mu_N(s) + \mu_N(t)\,\mu_S(s) + R_{NN}(t,s)
\end{aligned}$$

c. We'll find the autocovariance function in terms of other autocovariance functions, using Cov(S,N)=0:

$$\begin{aligned}
C_{XX}(t,s) &= \mathrm{Cov}(X(t), X(s)) = \mathrm{Cov}(S(t) + N(t),\; S(s) + N(s)) \\
&= \mathrm{Cov}(S(t), S(s)) + \underbrace{\mathrm{Cov}(S(t), N(s))}_{0} + \underbrace{\mathrm{Cov}(N(t), S(s))}_{0} + \mathrm{Cov}(N(t), N(s)) \\
&= \mathrm{Cov}(S(t), S(s)) + \mathrm{Cov}(N(t), N(s)) = C_{SS}(t,s) + C_{NN}(t,s)
\end{aligned}$$

d. We'll find the variance in terms of the other variances:

$$\mathrm{Var}(X(t)) = C_{XX}(t,t) = C_{SS}(t,t) + C_{NN}(t,t) = \mathrm{Var}(S(t)) + \mathrm{Var}(N(t))$$
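As a final check of (c), here is a toy simulation sketch; the covariances $C_{SS}(t,s) = 0.6$ and $C_{NN}(t,s) = 0.3$ at two fixed times are assumed values chosen only for illustration, with $S$ independent of $N$ as the problem requires:

```python
import numpy as np

rng = np.random.default_rng(4)
z = rng.standard_normal((4, 300_000))    # four independent standard normal streams
St, Ss = z[0], 0.6 * z[0] + z[1]         # assumed signal pair: Cov(S(t), S(s)) = 0.6
Nt, Ns = z[2], 0.3 * z[2] + z[3]         # assumed noise pair:  Cov(N(t), N(s)) = 0.3
Xt, Xs = St + Nt, Ss + Ns                # X = S + N, with S independent of N
cxx = np.cov(Xt, Xs)[0, 1]
css = np.cov(St, Ss)[0, 1]
cnn = np.cov(Nt, Ns)[0, 1]
print(cxx, css + cnn)                    # C_XX(t, s) = C_SS(t, s) + C_NN(t, s)
```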