UMD Probability Qualifying Exams/Aug2010Probability


Problem 1

Two persons, A and B, are playing a game. If A wins a round, he gets $4 from B and wins the next round with probability 0.7. If A loses the round, he pays $5 to B and wins the next round with probability 0.5.

(i) Write down the transition matrix of the Markov chain with two states, {A won the current round, B won the current round}, and find the stationary probabilities of the states.

(ii) Find $\lim_{n\to\infty} P(\text{A has more money after } n \text{ rounds than before the game})$.

Solution

(i) The Markov transition matrix is the $2\times 2$ matrix $Q=(q_{ij})$, where the index $1$ corresponds to a win for Player A and the index $0$ corresponds to a loss for Player A. For example, $q_{11}$ is the probability that Player A wins a round after winning the previous round; $q_{01}$ is the probability that Player A wins a round after losing the previous round; etc. This gives


$$Q = \begin{pmatrix} 0.7 & 0.3 \\ 0.5 & 0.5 \end{pmatrix}$$

The stationary distribution is the row vector $\pi=(a,b)$ such that $\pi Q=\pi$. We can calculate it explicitly:

$$(a,b)\begin{pmatrix} 0.7 & 0.3 \\ 0.5 & 0.5 \end{pmatrix}=(a,b)$$

yields the system of equations $0.7a+0.5b=a$ and $0.3a+0.5b=b$. Using the fact that $\pi$ must be a probability distribution (i.e. $a+b=1$), we get $a=5/8$, $b=3/8$.
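As a quick numerical sanity check (not part of the exam solution), the stationary distribution can be verified by power iteration; the sketch below assumes NumPy is available and uses the same state ordering as the matrix $Q$ above.

```python
import numpy as np

# Transition matrix; row/column order: (A won the round, A lost the round),
# matching the matrix Q written above.
Q = np.array([[0.7, 0.3],
              [0.5, 0.5]])

# Power iteration: since Q has strictly positive entries, any initial
# distribution converges to the unique stationary distribution.
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ Q

print(pi)           # approximately [0.625, 0.375], i.e. (5/8, 3/8)
print(pi @ Q - pi)  # approximately [0, 0], confirming pi Q = pi
```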

(ii) Since $Q$ has strictly positive entries, the chain is ergodic, so from any initial distribution the distribution of the state converges to the stationary distribution $\pi$ just calculated. Thus, as $n\to\infty$, Player A wins a given round with probability tending to $5/8$. Can Player A expect to have more money, though? In the long run we can compute Player A's expected winnings in one round:

$$E[\text{winnings per round}] = \tfrac{5}{8}\cdot 4 + \tfrac{3}{8}\cdot(-5) = \tfrac{20-15}{8} = \tfrac{5}{8} > 0.$$

Thus, by the law of large numbers for ergodic Markov chains, Player A's total winnings after $n$ rounds grow like $\tfrac{5}{8}n$ dollars, so $P(\text{A has more money after } n \text{ rounds than before the game}) \to 1$ as $n\to\infty$; the limit in question is $1$.
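A short Monte Carlo sketch (again only a sanity check; the distribution of the very first round is not specified in the problem, so a fair first round is assumed here) illustrates both facts: the long-run average gain per round is about $5/8$ of a dollar, and A is ahead after many rounds with probability close to 1.

```python
import random

def net_winnings(n_rounds, p_first=0.5):
    """Simulate n_rounds of the game and return A's net winnings in dollars.

    p_first is an assumed win probability for the first round (not given in
    the problem statement); the limit does not depend on it.
    """
    won = random.random() < p_first
    total = 0
    for _ in range(n_rounds):
        total += 4 if won else -5
        # A wins the next round with probability 0.7 after a win, 0.5 after a loss.
        won = random.random() < (0.7 if won else 0.5)
    return total

random.seed(0)
trials = [net_winnings(1000) for _ in range(2000)]
print(sum(t > 0 for t in trials) / len(trials))    # close to 1
print(sum(trials) / (len(trials) * 1000))          # close to 5/8 = 0.625 dollars/round
```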




Problem 2

(i) Let $X$ be a random variable with zero mean and finite variance $\sigma^2$. Show that for any $c>0$

$$P(X>c) \le \frac{\sigma^2}{\sigma^2+c^2}.$$

(ii) Let $\{X_n, n\ge 1\}$ be a square-integrable martingale with $E(X_1)=0$. Show that for any $c>0$

$$P\Big(\max_{1\le i\le n} X_i \ge c\Big) \le \frac{\operatorname{var}(X_n)}{\operatorname{var}(X_n)+c^2}.$$

Solution

(i) For any $t>0$ we have $\{X>c\} \subseteq \{X+t \ge c+t\} \subseteq \{(X+t)^2 \ge (c+t)^2\}$, so by Markov's inequality (equivalently, Chebyshev's inequality applied to the shifted variable $X+t$),

$$P(X>c) \le \frac{E[(X+t)^2]}{(c+t)^2} = \frac{\sigma^2+t^2}{(c+t)^2},$$

since $E[X]=0$. Choosing $t=\sigma^2/c$, which minimizes the right-hand side, gives

$$P(X>c) \le \frac{\sigma^2+\sigma^4/c^2}{(c+\sigma^2/c)^2} = \frac{\sigma^2(c^2+\sigma^2)/c^2}{(c^2+\sigma^2)^2/c^2} = \frac{\sigma^2}{\sigma^2+c^2}.$$

This is the one-sided Chebyshev (Cantelli) inequality.
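As an illustration (not required by the problem), the bound can be checked numerically for a concrete zero-mean distribution; the sketch below uses a centered Exponential(1) variable, for which $\sigma^2=1$, so the bound is $1/(1+c^2)$.

```python
import random

def check_cantelli(c, n=200_000):
    """Empirical P(X > c) vs. the bound sigma^2 / (sigma^2 + c^2)
    for X = Exp(1) - 1, which has mean 0 and variance 1."""
    hits = sum(random.expovariate(1.0) - 1.0 > c for _ in range(n))
    return hits / n, 1.0 / (1.0 + c * c)

random.seed(0)
for c in (0.5, 1.0, 2.0):
    tail, bound = check_cantelli(c)
    print(f"c = {c}: empirical tail {tail:.4f} <= bound {bound:.4f}")
```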



Problem 3


Solution

Problem 4

Let $X, Y$ be random variables with finite expectations.

(i) Show that $E(X\mid Y)=0$ implies $E(|X+Y|) \ge E(|Y|)$.

(ii) Show that if $(X,Y)$ is identically distributed with $(Y,X)$, then

$$E(|3X-Y|) \ge E(|X+Y|).$$

Solution

(i) Let $f(x)=|x+Y|$. It is easy to see that $f$ is convex (for each fixed value of $Y$).

Then by Jensen's Inequality we have

$$f(E(X\mid Y)) \le E(f(X)\mid Y),$$

that is, $|E(X\mid Y)+Y| \le E(|X+Y| \mid Y)$. Since $E(X\mid Y)=0$, the left-hand side is $|Y|$. Taking the expectation of both sides gives

$$E(|Y|) \le E(|X+Y|).$$
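A small numerical illustration (not part of the solution): take $Y$ standard normal and $X$ an independent zero-mean normal, so that $E(X\mid Y)=E(X)=0$; empirically $E|X+Y|$ should then be at least $E|Y|$.

```python
import random

random.seed(0)
n = 200_000

# Y standard normal; X an independent standard normal, so E(X | Y) = E(X) = 0.
ys = [random.gauss(0.0, 1.0) for _ in range(n)]
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

mean_abs_y  = sum(abs(y) for y in ys) / n
mean_abs_xy = sum(abs(x + y) for x, y in zip(xs, ys)) / n

# Roughly sqrt(2/pi) ~ 0.798 versus sqrt(2) * sqrt(2/pi) ~ 1.128.
print(f"E|Y|   ~ {mean_abs_y:.3f}")
print(f"E|X+Y| ~ {mean_abs_xy:.3f}   (should be >= E|Y|)")
```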