Expected Value (Joint)

Proposition

Let $X,Y$ be jointly distributed rvs with joint probability mass function (JPMF) $p(x,y)$ or joint probability density function (PDF) $f(x,y)$, depending on whether $X,Y$ are both discrete or both continuous. Then the expected value of a function $h(X,Y)$, denoted $E[h(X,Y)]$ or $\mu_{h(X,Y)}$, is given by:

$$
E[h(X,Y)] = \begin{cases} \displaystyle\sum_x \sum_y h(x,y)\, p(x,y) & X,Y \text{ are discrete} \\[1ex] \displaystyle\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(x,y)\, f(x,y)\, dx\, dy & X,Y \text{ are continuous} \end{cases}
$$

This is another case of the law of the unconscious statistician.
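As a quick sanity check, the discrete branch of the formula can be evaluated directly. The joint PMF below is a made-up toy example:

```python
# Hypothetical joint PMF p(x, y) for discrete X, Y (values chosen for illustration)
p = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def expected_value(h, pmf):
    """LOTUS, discrete case: E[h(X,Y)] = sum over (x, y) of h(x, y) * p(x, y)."""
    return sum(h(x, y) * prob for (x, y), prob in pmf.items())

print(expected_value(lambda x, y: x * y, p))  # E[XY]
print(expected_value(lambda x, y: x + y, p))  # E[X + Y]
```

The same pattern extends to any $h$; only the function passed in changes, not the PMF.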

Properties of Expected Value (Joint)

Linearity of Expectation

Let $X,Y$ be random variables. Then, for any functions $h_1,h_2$ and any constants $a_1,a_2,b$:

$$
E[a_1 h_1(X,Y) + a_2 h_2(X,Y) + b] = a_1 E[h_1(X,Y)] + a_2 E[h_2(X,Y)] + b
$$

This follows directly from the linearity of the sum (discrete case) or integral (continuous case) in the definition above.
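Linearity can be verified numerically against a toy joint PMF (the probabilities and constants below are arbitrary, chosen just for the check):

```python
# Hypothetical joint PMF for discrete X, Y
p = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def E(h):
    """E[h(X,Y)] via the discrete LOTUS sum."""
    return sum(h(x, y) * prob for (x, y), prob in p.items())

a1, a2, b = 2.0, -3.0, 5.0
h1 = lambda x, y: x * y
h2 = lambda x, y: x + y

lhs = E(lambda x, y: a1 * h1(x, y) + a2 * h2(x, y) + b)
rhs = a1 * E(h1) + a2 * E(h2) + b
print(lhs, rhs)  # the two sides should agree
```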

Independence implies splittable multiplication

Let $X,Y$ be independent random variables. If $h(X,Y) = g_1(X)\,g_2(Y)$, then:

$$
E[h(X,Y)] = E[g_1(X)\,g_2(Y)] = E[g_1(X)]\,E[g_2(Y)]
$$

Proof

The continuous and discrete cases are essentially carbon copies of one another, so we prove only the continuous case:

$$
\begin{aligned}
E[h(X,Y)] = E[g_1(X)\,g_2(Y)] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g_1(x)\,g_2(y)\,f(x,y)\,dx\,dy \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g_1(x)\,g_2(y)\,f_X(x)\,f_Y(y)\,dx\,dy && \text{($X,Y$ independent)} \\
&= \left(\int_{-\infty}^{\infty} g_1(x)\,f_X(x)\,dx\right)\left(\int_{-\infty}^{\infty} g_2(y)\,f_Y(y)\,dy\right) \\
&= E[g_1(X)]\,E[g_2(Y)]
\end{aligned}
$$
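The factorization can be checked numerically with a discrete analogue of the proof: build a joint PMF as the product of two marginals (which is exactly what independence means) and compare both sides. The marginals and functions below are made up for illustration:

```python
# Hypothetical marginal PMFs for independent X and Y
pX = {0: 0.3, 1: 0.7}
pY = {0: 0.6, 1: 0.4}

# Independence: the joint PMF factors as p(x, y) = pX(x) * pY(y)
p = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

g1 = lambda x: x ** 2 + 1
g2 = lambda y: 2 * y

# E[g1(X) g2(Y)] computed from the joint PMF
lhs = sum(g1(x) * g2(y) * prob for (x, y), prob in p.items())

# E[g1(X)] * E[g2(Y)] computed from the marginals
rhs = sum(g1(x) * px for x, px in pX.items()) * sum(g2(y) * py for y, py in pY.items())
print(lhs, rhs)  # the two sides should agree
```

Note the check relies on the joint PMF actually factoring; for dependent $X,Y$ the two sides generally differ.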