UMD Probability Qualifying Exams/Aug2009Probability


Problem 1

Let $X_1,\dots,X_n$ be i.i.d. random variables with moment generating function $M(t)=E[\exp(tX_1)]$, which is finite for all $t$. Let $\bar{X}_n=(X_1+\cdots+X_n)/n$.

(a) Prove that

$P[X_1>a]\le\exp[-h(a)]$, where

$h(a)=\sup_{t\ge 0}\,[at-\psi(t)]$ and $\psi(t)=\log M(t)$.

(b) Prove that

$P[\bar{X}_n\ge a]\le\exp[-nh(a)]$.

(c) Assume $E[X_1]=0$. Use the result of (b) to establish that $\bar{X}_n\to 0$ almost surely.


Solution

(a) For $t\ge 0$,

$P[X_1>a]=\int_{\{X_1>a\}}1\,dF=\int_{\{X_1>a\}}\frac{e^{at}}{e^{at}}\,dF=e^{-at}\int_{\{X_1>a\}}e^{at}\,dF\le e^{-at}\int_{\{X_1>a\}}e^{tX_1}\,dF\le e^{-at}\int_{\Omega}e^{tX_1}\,dF=e^{-at}E[\exp(tX_1)]=e^{-[at-\psi(t)]}.$

The first inequality is where $t\ge 0$ is used: on the event $\{X_1>a\}$ we have $ta\le tX_1$, so $e^{at}\le e^{tX_1}$. Since the bound holds for every $t\ge 0$, it holds for the infimum of the right-hand side over $t\ge 0$, which is $\exp[-h(a)]$. This gives the desired result.


(b) The same argument applies to $\bar{X}_n$. For $t\ge 0$,

$P[\bar{X}_n\ge a]=\int_{\{\bar{X}_n\ge a\}}1\,dF=\int_{\{\bar{X}_n\ge a\}}\frac{e^{nat}}{e^{nat}}\,dF=e^{-nat}\int_{\{\sum_{i=1}^n X_i\ge an\}}e^{nat}\,dF\le e^{-nat}\int_{\{\sum_{i=1}^n X_i\ge an\}}e^{t\sum_{i=1}^n X_i}\,dF\le e^{-nat}\int_{\Omega}e^{t\sum_{i=1}^n X_i}\,dF=e^{-nat}\left(\int_{\Omega}e^{tX_1}\,dF\right)^{n}=e^{-n[at-\psi(t)]},$

where the last equality follows from the fact that the $X_i$ are independent and identically distributed. Taking the infimum over $t\ge 0$ yields $P[\bar{X}_n\ge a]\le\exp[-nh(a)]$.

(c)
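One standard route for (c), sketched here via the bound from (b) and the Borel–Cantelli lemma (not necessarily the intended official solution):

```latex
% Fix \epsilon > 0. Since \psi(0) = \log M(0) = 0 and
% \psi'(0) = M'(0)/M(0) = E[X_1] = 0, we get
% \epsilon t - \psi(t) = \epsilon t + o(t) > 0 for t > 0 small,
% hence h(\epsilon) > 0. By part (b),
\sum_{n=1}^{\infty} P[\bar{X}_n \ge \epsilon]
  \le \sum_{n=1}^{\infty} e^{-n h(\epsilon)}
  = \frac{e^{-h(\epsilon)}}{1 - e^{-h(\epsilon)}} < \infty .
% By Borel--Cantelli, P[\bar{X}_n \ge \epsilon \text{ i.o.}] = 0.
% Applying the same bound to the i.i.d. sequence -X_1, -X_2, \dots
% (whose MGF is M(-t), also finite for all t) gives
% P[\bar{X}_n \le -\epsilon \text{ i.o.}] = 0. Letting \epsilon \downarrow 0
% along a countable sequence yields \bar{X}_n \to 0 almost surely.
```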

Problem 2

Let $(\Omega,\mathcal{F},P)$ be a probability space, let $X$ be a random variable with finite second moment, and let $\mathcal{G}_1\subset\mathcal{G}_2$ be sub $\sigma$-fields. Prove that

$E[(X-E(X|\mathcal{G}_2))^2]\le E[(X-E(X|\mathcal{G}_1))^2].$


Solution
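A sketch of the standard orthogonal-decomposition argument (one of several possible approaches):

```latex
% Decompose X - E(X|\mathcal{G}_1)
%   = \big(X - E(X|\mathcal{G}_2)\big) + \big(E(X|\mathcal{G}_2) - E(X|\mathcal{G}_1)\big).
% The cross term has zero expectation: since \mathcal{G}_1 \subset \mathcal{G}_2,
% the factor E(X|\mathcal{G}_2) - E(X|\mathcal{G}_1) is \mathcal{G}_2-measurable,
% while E[\,X - E(X|\mathcal{G}_2)\mid\mathcal{G}_2\,] = 0, so conditioning on
% \mathcal{G}_2 annihilates the product. Therefore
E\big[(X - E(X|\mathcal{G}_1))^2\big]
  = E\big[(X - E(X|\mathcal{G}_2))^2\big]
    + E\big[(E(X|\mathcal{G}_2) - E(X|\mathcal{G}_1))^2\big]
  \ge E\big[(X - E(X|\mathcal{G}_2))^2\big].
```

Intuitively, $E(X|\mathcal{G})$ is the $L^2$ projection of $X$ onto the $\mathcal{G}$-measurable square-integrable functions, and projecting onto the larger space $\mathcal{G}_2$ leaves a smaller residual.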

Problem 3

Let N1(t),N2(t) be independent homogeneous Poisson processes with rates λ1,λ2, respectively. Let Z be the time of the first jump for the process N1(t)+N2(t) and let J be the random index of the component process that made the first jump. Find the joint distribution of (J,Z). In particular, establish that J,Z are independent and that Z is exponentially distributed.



Solution

Show Z is exponentially distributed

Let τ be the first time that a Poisson process N(t) jumps.

$p_\tau(x)=\lim_{\epsilon\to 0}\frac{F_\tau(x)-F_\tau(x-\epsilon)}{\epsilon}=\lim_{\epsilon\to 0}\frac{P(N(x)>0,\;N(x-\epsilon)=0)}{\epsilon}=\lim_{\epsilon\to 0}\frac{1}{\epsilon}\,P(N(x-\epsilon)=0)\,P(N(x)-N(x-\epsilon)>0)=\lim_{\epsilon\to 0}\frac{1}{\epsilon}\,e^{-\lambda(x-\epsilon)}\left(1-e^{-\lambda\epsilon}\right)=\lambda e^{-\lambda x}$

$N_1(t)+N_2(t)$ is a Poisson process with parameter $\lambda_1+\lambda_2$

Proof: There are three conditions to check:

(i) $N_1(0)+N_2(0)=0$ almost surely.

(ii) For $t>s$, is $(N_1(t)+N_2(t))-(N_1(s)+N_2(s))$ independent of $N_1(s)+N_2(s)$? Yes: $N_1$ and $N_2$ each have independent increments, and the two processes are independent of each other.

(iii) For $t>s$, is $(N_1(t)+N_2(t))-(N_1(s)+N_2(s))$ distributed Poisson with parameter $(\lambda_1+\lambda_2)(t-s)$? Yes: the increments $N_1(t)-N_1(s)$ and $N_2(t)-N_2(s)$ are independent Poisson random variables with parameters $\lambda_1(t-s)$ and $\lambda_2(t-s)$, and the sum of independent Poisson random variables is Poisson with the summed parameter.

Joint distribution of $(J,Z)$

The event $\{J=1,\,Z\in[z,z+dz)\}$ requires the first jump of $N_1$ to occur at time $z$ (density $\lambda_1 e^{-\lambda_1 z}$, by the computation above) and $N_2$ to have no jump in $[0,z]$ (probability $e^{-\lambda_2 z}$); these are independent. Hence, for $z>0$,

$f_{J,Z}(1,z)=\lambda_1 e^{-\lambda_1 z}\cdot e^{-\lambda_2 z}=\frac{\lambda_1}{\lambda_1+\lambda_2}\,(\lambda_1+\lambda_2)e^{-(\lambda_1+\lambda_2)z}$

$f_{J,Z}(2,z)=\lambda_2 e^{-\lambda_2 z}\cdot e^{-\lambda_1 z}=\frac{\lambda_2}{\lambda_1+\lambda_2}\,(\lambda_1+\lambda_2)e^{-(\lambda_1+\lambda_2)z}$

Summing over $j$ gives $f_Z(z)=(\lambda_1+\lambda_2)e^{-(\lambda_1+\lambda_2)z}$, so $Z$ is exponential with rate $\lambda_1+\lambda_2$; integrating over $z$ gives $P(J=j)=\lambda_j/(\lambda_1+\lambda_2)$. Since $f_{J,Z}(j,z)=P(J=j)\,f_Z(z)$, the joint distribution factors and $J,Z$ are independent.

Problem 4

Let $(X_n,\mathcal{F}_n)$ be a martingale sequence and for each $n$ let $\epsilon_n$ be an $\mathcal{F}_{n-1}$-measurable random variable. Define

$Y_n=\sum_{i=1}^n \epsilon_i(X_i-X_{i-1}),\qquad Y_0=0$

Assuming that Yn is integrable for each n, show that Yn is a martingale.


Solution
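A sketch of the standard verification (this is the usual martingale-transform argument; pulling $\epsilon_n$ out of the conditional expectation is justified by the assumed integrability of each $Y_n$, hence of each increment):

```latex
% Y_n is \mathcal{F}_n-measurable: each \epsilon_i is
% \mathcal{F}_{i-1}-measurable and each X_i is \mathcal{F}_i-measurable.
% Integrability holds by assumption. For the martingale property, since
% \epsilon_n is \mathcal{F}_{n-1}-measurable ("take out what is known"):
E[Y_n \mid \mathcal{F}_{n-1}]
  = Y_{n-1} + E\big[\epsilon_n (X_n - X_{n-1}) \,\big|\, \mathcal{F}_{n-1}\big]
  = Y_{n-1} + \epsilon_n\, E\big[X_n - X_{n-1} \,\big|\, \mathcal{F}_{n-1}\big]
  = Y_{n-1},
% because E[X_n \mid \mathcal{F}_{n-1}] = X_{n-1} for the martingale (X_n).
```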

Problem 5

Let $X_1,X_2,\dots$ be an i.i.d. sequence with $E[X_i]=0$ and $V[X_i]=\sigma^2<\infty$. Prove that for any $\gamma>1/2$, the series $\sum_{k=1}^\infty X_k/k^\gamma$ converges almost surely.

Solution

Define $Z_k:=X_k/k^\gamma$. Then $E[Z_k]=0$ and $V[Z_k]=\sigma^2/k^{2\gamma}$. We verify the three conditions of Kolmogorov's three-series theorem (with truncation level $1$) to conclude that $\sum_{k=1}^\infty Z_k$ converges almost surely.

$\sum_{k=1}^\infty P(|Z_k|>1)<\infty$: by Chebyshev's inequality, $P(|Z_k|>1)\le V[Z_k]=\sigma^2/k^{2\gamma}$, and $\sum_k k^{-2\gamma}<\infty$ since $2\gamma>1$.

$\sum_{k=1}^\infty E[Z_k 1_{\{|Z_k|\le 1\}}]$ converges: since $E[Z_k]=0$, we have $|E[Z_k 1_{\{|Z_k|\le 1\}}]|=|E[Z_k 1_{\{|Z_k|>1\}}]|\le E[Z_k^2 1_{\{|Z_k|>1\}}]\le \sigma^2/k^{2\gamma}$ (using $|Z_k|\le Z_k^2$ on $\{|Z_k|>1\}$), so the series converges absolutely.

$\sum_{k=1}^\infty V[Z_k 1_{\{|Z_k|\le 1\}}]<\infty$: $V[Z_k 1_{\{|Z_k|\le 1\}}]\le E[Z_k^2]=\sigma^2/k^{2\gamma}$, which is summable as above.

Problem 6

Consider the following process $\{X_n\}$ taking values in $\{0,1,\dots\}$. Assume $U_n$, $n=1,2,\dots$ is an i.i.d. sequence of positive integer-valued random variables, and let $X_0$ be independent of the $U_n$. Then

$X_n=\begin{cases}X_{n-1}-1 & \text{if } X_{n-1}\neq 0\\ U_k-1 & \text{if } X_{n-1}=0 \text{ for the } k\text{th time}\end{cases}$

Solution
