University of Alberta Guide/STAT/222/Independence and Conditional Expectations

Revision as of 00:42, 2 November 2010 by imported>Adrignola (rm talk page comment)

The basic idea behind independence is that if random variables (or vectors) X1, ..., Xd are independent, then joint probabilities such as P(X1 ≤ x1, ..., Xd ≤ xd) factor into the product of the individual probabilities P(X1 ≤ x1), ..., P(Xd ≤ xd).
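As a sanity check, the factorization can be seen by simulation. The sketch below (illustrative distributions and evaluation point, not from the course) estimates the joint and marginal CDFs of two independently drawn variables with numpy:

```python
import numpy as np

# Estimate P(X <= x, Y <= y) and P(X <= x) * P(Y <= y) for two
# independently drawn samples; under independence the two agree
# up to Monte Carlo error.  Distributions here are arbitrary choices.
rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(scale=1.0, size=n)      # Exp(1)
Y = rng.normal(loc=0.0, scale=1.0, size=n)  # N(0, 1), drawn independently of X

x, y = 1.0, 0.5  # arbitrary evaluation point
joint = np.mean((X <= x) & (Y <= y))
product = np.mean(X <= x) * np.mean(Y <= y)
print(joint, product)
```

With n this large the two printed estimates match to about two decimal places.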

Given that T1, T2 are independent Exp(λ) random variables, E[T1+T2] = E[2T1] = E[2T2] by linearity of expectation. But Var(T1+T2) < Var(2T1). This is because Var(T1+T2) = Var(T1) + Var(T2) = 2 Var(T1) by independence, whereas Var(2T1) = 2² Var(T1) = 4 Var(T1).
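This variance comparison is easy to check numerically. A minimal sketch, assuming λ = 2 (so Var(T1) = 1/λ² = 0.25) and sampling with numpy:

```python
import numpy as np

# Compare Var(T1 + T2) with Var(2*T1) for independent T1, T2 ~ Exp(lam).
# Theory: Var(T1 + T2) = 2/lam**2, while Var(2*T1) = 4/lam**2.
rng = np.random.default_rng(1)
lam = 2.0
n = 500_000
T1 = rng.exponential(scale=1/lam, size=n)  # numpy's scale is 1/lam
T2 = rng.exponential(scale=1/lam, size=n)

var_sum = np.var(T1 + T2)    # ~ 2 * Var(T1) = 0.5
var_scaled = np.var(2 * T1)  # ~ 4 * Var(T1) = 1.0
print(var_sum, var_scaled)
```

The sample variance of T1 + T2 comes out near 0.5 and that of 2·T1 near 1.0, matching the factor-of-two gap derived above.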

See the equations section for some more examples.

Equations

P(X1 ≤ x1, ..., Xd ≤ xd) = P(X1 ≤ x1) ⋯ P(Xd ≤ xd)
FX1,X2,...,Xd(x1, ..., xd) = FX1(x1) ⋯ FXd(xd)
fX1,X2,...,Xd(x1, ..., xd) = fX1(x1) ⋯ fXd(xd)
E[X1 X2 ⋯ Xd] = E[X1] ⋯ E[Xd]
E[g1(X1) g2(X2) ⋯ gd(Xd)] = E[g1(X1)] ⋯ E[gd(Xd)]
MX1+...+Xd(t) = MX1(t) ⋯ MXd(t)
FX,Y(x, y) = FX(x) FY(y)
fX,Y(x, y) = fX(x) fY(y)
FX|Y(x|y) = FX(x)
FX+Y(x) = P(X + Y ≤ x)
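The last identity can be made concrete for the exponential example above: for independent X, Y ~ Exp(λ), the sum X + Y has a Gamma(2, 1/λ) distribution with CDF 1 − e^(−λx)(1 + λx). A simulation sketch (λ and the evaluation point are illustrative choices):

```python
import numpy as np

# For independent X, Y ~ Exp(lam), X + Y is Gamma(2, 1/lam), so
# F_{X+Y}(x) = P(X + Y <= x) = 1 - exp(-lam*x) * (1 + lam*x).
rng = np.random.default_rng(2)
lam = 1.5
n = 300_000
X = rng.exponential(scale=1/lam, size=n)
Y = rng.exponential(scale=1/lam, size=n)

x = 2.0
empirical = np.mean(X + Y <= x)                 # P(X + Y <= x) by simulation
closed_form = 1 - np.exp(-lam*x) * (1 + lam*x)  # Gamma(2, 1/lam) CDF
print(empirical, closed_form)
```

Both values land near 0.80 here; the agreement illustrates that the distribution of a sum of independent variables is determined by the two marginals.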

Template:BookCat