Probability/Joint Distributions and Independence
Motivation
Suppose we are given a pmf of a discrete random variable $X$ and a pmf of a discrete random variable $Y$. We cannot tell the relationship between $X$ and $Y$ with only such information. They may be related or not related.
For example, the random variable $X$ may be defined as $X=1$ if a head comes up and $X=0$ otherwise from tossing a fair coin, and the random variable $Y$ may be defined as $Y=1$ if a head comes up and $Y=0$ otherwise from tossing the coin another time. In this case, $X$ and $Y$ are unrelated.
Another possibility is that the random variable $Y$ is defined as $Y=1$ if a head comes up in the first coin toss, and $Y=0$ otherwise (so that $Y=X$ always). In this case, $X$ and $Y$ are related.
Yet, in the above two examples, the pmfs of $X$ and $Y$ are exactly the same.
Therefore, to tell the relationship between $X$ and $Y$, we define the joint cumulative distribution function, or joint cdf.
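To see this concretely, the following is a minimal Python sketch of the two coin-tossing scenarios above (encoding heads as 1 and tails as 0, following the example): the two joint pmfs differ, yet the marginal pmfs coincide.

```python
# Minimal sketch of the coin example above: in scenario A, Y comes from a
# second, independent toss; in scenario B, Y is defined from the first toss
# (so Y = X). The joint pmfs differ, yet the marginal pmfs coincide.
outcomes = [(x, y) for x in (0, 1) for y in (0, 1)]

joint_A = {(x, y): 0.25 for (x, y) in outcomes}                      # independent tosses
joint_B = {(x, y): (0.5 if y == x else 0.0) for (x, y) in outcomes}  # Y = X

for name, joint in (("A", joint_A), ("B", joint_B)):
    pmf_X = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
    pmf_Y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}
    print(name, pmf_X, pmf_Y)   # identical marginals in both scenarios
```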
Joint distributions
Template:Colored definition Sometimes, we may want to know the random behaviour of one of the random variables involved in a joint cdf. We can do this by computing the marginal cdf from the joint cdf. The definition of the marginal cdf is as follows: Template:Colored definition Template:Colored remark Template:Colored proposition
Proof. When we set the arguments other than the $i$-th argument to be $\infty$, e.g. $F(\infty,\dots,\infty,x_i,\infty,\dots,\infty)$, the joint cdf becomes $P(X_1\le\infty,\dots,X_i\le x_i,\dots,X_n\le\infty)=P(X_i\le x_i)=F_{X_i}(x_i)$, since each event $\{X_j\le\infty\}$ with $j\ne i$ occurs with probability one.
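As a quick numeric illustration of this proposition, here is a sketch under an assumed joint cdf, namely that of two independent $\operatorname{Exp}(1)$ random variables (a hypothetical choice, not from the text):

```python
import math

# Assumed joint cdf: X, Y independent Exp(1), so
# F(x, y) = (1 - e^(-x)) (1 - e^(-y)) for x, y >= 0.
def joint_cdf(x, y):
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

x = 1.3
# A very large second argument plays the role of +infinity.
print(joint_cdf(x, 1e6))   # ~ 0.7275, approximates the marginal cdf F_X(x)
print(1 - math.exp(-x))    # exact F_X(x) = 1 - e^(-1.3) ~ 0.7275
```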
Template:Colored remark Template:Colored example Similar to the one-variable case, we have joint pmf and joint pdf. Also, analogously, we have marginal pmf and marginal pdf.
Template:Colored definition Template:Colored definition Template:Colored proposition
Proof. Consider the case in which there are only two random variables, say $X$ and $Y$. Then, we have $p_X(x)=\sum_{y}p_{X,Y}(x,y)$. Similarly, in the general case, we have $p_{X_1,\dots,X_{n-1}}(x_1,\dots,x_{n-1})=\sum_{x_n}p_{X_1,\dots,X_n}(x_1,\dots,x_n)$. Then, we perform a similar process on each of the other variables to be summed out ($n-2$ left), with one extra summation sign added for each process. Thus, in total we will have $n-1$ summation signs, and we will finally get the desired result.
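The following minimal sketch carries out this summing-out process on a made-up $3\times 3$ joint pmf (the values are hypothetical):

```python
import numpy as np

# A made-up 3x3 joint pmf; rows index the values of X and columns index the
# values of Y. Summing out y (one summation sign) gives the marginal pmf of X.
joint = np.array([[0.10, 0.05, 0.05],
                  [0.20, 0.10, 0.10],
                  [0.10, 0.15, 0.15]])
pmf_X = joint.sum(axis=1)   # [0.2, 0.4, 0.4]
pmf_Y = joint.sum(axis=0)   # [0.4, 0.3, 0.3]
print(pmf_X, pmf_Y, joint.sum())   # marginals, and total probability 1.0
```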
Template:Colored remark Template:Colored example Template:Colored exercise Template:Colored exercise Template:Hide For jointly continuous random variables, the definition is a generalized version of the one for continuous random variables (univariate case). Template:Colored definition Template:Colored remark Template:Colored definition Template:Colored proposition
Proof. Recall the proposition about obtaining the marginal cdf from the joint cdf. We have $F_X(x)=F(x,\infty)=\int_{-\infty}^{x}\int_{-\infty}^{\infty}f(u,y)\,dy\,du$, and differentiating both sides with respect to $x$ gives $f_X(x)=\int_{-\infty}^{\infty}f(x,y)\,dy$.
Proof. It follows from applying the fundamental theorem of calculus $n$ times.
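As a numeric illustration, here is a sketch that integrates out one variable of an assumed joint pdf, that of two independent $\operatorname{Exp}(1)$ variables (a hypothetical choice):

```python
import math
from scipy.integrate import quad

# Assumed joint pdf: X, Y independent Exp(1), so f(x, y) = e^(-x) e^(-y)
# for x, y >= 0. Integrating out y should recover f_X(x) = e^(-x).
x = 0.7
marginal, _ = quad(lambda y: math.exp(-x) * math.exp(-y), 0, math.inf)
print(marginal)        # ~ 0.4966
print(math.exp(-x))    # exact f_X(0.7) ~ 0.4966
```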
Template:Colored example Template:Colored exercise
Independence
Recall that multiple events are independent if the probability of the intersection of them equals the product of the probabilities of each event, by definition. Since $\{X\le x\}$ is also an event, we have a natural definition of independence for random variables as follows: Template:Colored definition Template:Colored remark Template:Colored theorem
Proof. (Partial.)
Only if part: If the random variables $X_1,\dots,X_n$ are independent, then $P(X_1\in A_1,\dots,X_n\in A_n)=P(X_1\in A_1)\cdots P(X_n\in A_n)$ for each choice of subsets $A_1,\dots,A_n\subseteq\mathbb{R}$. Setting $A_1=(-\infty,x_1],\dots,A_n=(-\infty,x_n]$, we have $F(x_1,\dots,x_n)=F_{X_1}(x_1)\cdots F_{X_n}(x_n)$. Thus, we obtain the result for the joint cdf part.
For the joint pdf part, differentiating the joint cdf once with respect to each of its $n$ arguments gives $f(x_1,\dots,x_n)=f_{X_1}(x_1)\cdots f_{X_n}(x_n)$.
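The factorization criterion can be checked numerically. Below is a minimal sketch with a made-up $2\times 2$ joint pmf that happens to factor into its marginals:

```python
import numpy as np

# A joint pmf describes independent variables exactly when it factors as the
# outer product of its marginal pmfs. The 2x2 joint pmf below is made up so
# that it does factor.
joint = np.array([[0.12, 0.28],
                  [0.18, 0.42]])    # rows index x, columns index y
pmf_X = joint.sum(axis=1)           # [0.4, 0.6]
pmf_Y = joint.sum(axis=0)           # [0.3, 0.7]
print(np.allclose(joint, np.outer(pmf_X, pmf_Y)))   # True -> independent
```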
Template:Colored remark Template:Colored example Template:Colored exercise Template:Colored proposition Template:Colored example Template:Colored exercise
Sum of independent random variables (optional)
In general, we use the joint cdf, pdf or pmf to determine the distribution of a sum of independent random variables from first principles. In particular, there are some interesting results related to the distribution of the sum of independent random variables. Template:Collapse top Template:Colored proposition
Proof.
- Continuous case:
- cdf: $F_{X+Y}(z)=P(X+Y\le z)=\iint_{x+y\le z}f_X(x)f_Y(y)\,dx\,dy=\int_{-\infty}^{\infty}\int_{-\infty}^{z-y}f_X(x)f_Y(y)\,dx\,dy$.
(Figure: the region of integration is the set of points on or below the line $x+y=z$; equivalently, for each fixed $y$ ranging over $(-\infty,\infty)$, $x$ ranges over $(-\infty,z-y]$, since $x+y\le z\iff x\le z-y$.)
- pdf: differentiating the cdf with respect to $z$ gives $f_{X+Y}(z)=\int_{-\infty}^{\infty}f_X(z-y)f_Y(y)\,dy$, the convolution of the two pdfs.
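The convolution formula can be illustrated numerically. The following sketch discretizes the integral for two independent $\operatorname{Uniform}(0,1)$ variables (a hypothetical choice); their sum has the triangular pdf:

```python
import numpy as np

# Discretized convolution for two independent Uniform(0, 1) variables; the
# sum should have the triangular pdf: z on (0, 1], and 2 - z on (1, 2).
dz = 0.001
grid = np.arange(0.0, 1.0, dz)
f = np.ones_like(grid)               # the Uniform(0, 1) pdf on its support
pdf_sum = np.convolve(f, f) * dz     # Riemann-sum version of the integral
for z in (0.5, 1.0, 1.5):
    print(z, pdf_sum[int(z / dz)])   # ~ 0.5, 1.0, 0.5
```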
Template:Colored remark Template:Colored example Template:Colored proposition
Proof.
- Let $Z=X+Y$.
- For each nonnegative integer $z$, $\{Z=z\}=\bigcup_{k=0}^{z}\{X=k,Y=z-k\}$.
- Since $\{X=k,Y=z-k\}\cap\{X=j,Y=z-j\}=\varnothing$ for each $k\ne j$, the events $\{X=k,Y=z-k\}$'s are pairwise disjoint.
- Hence, by extended P3 and independence of $X$ and $Y$, $P(Z=z)=\sum_{k=0}^{z}P(X=k,Y=z-k)=\sum_{k=0}^{z}P(X=k)P(Y=z-k)$.
- The result follows by definition.
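Here is a minimal sketch of this discrete convolution, with made-up pmfs for two independent random variables taking values in $\{0,1,2\}$:

```python
# Discrete convolution: P(Z = z) = sum_k P(X = k) P(Y = z - k) for
# independent X and Y with made-up pmfs on {0, 1, 2}.
pmf_X = {0: 0.2, 1: 0.5, 2: 0.3}
pmf_Y = {0: 0.4, 1: 0.4, 2: 0.2}

pmf_Z = {z: sum(pmf_X.get(k, 0.0) * pmf_Y.get(z - k, 0.0)
                for k in range(z + 1))
         for z in range(5)}
print(pmf_Z)                    # pmf of Z = X + Y
print(sum(pmf_Z.values()))      # 1.0, as a pmf should
```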
Template:Colored example Template:Colored proposition
Proof.
- The pmf of $Z=X+Y$ is $P(Z=z)=\sum_{k=0}^{z}\frac{e^{-\lambda}\lambda^{k}}{k!}\cdot\frac{e^{-\mu}\mu^{z-k}}{(z-k)!}=\frac{e^{-(\lambda+\mu)}}{z!}\sum_{k=0}^{z}\binom{z}{k}\lambda^{k}\mu^{z-k}=\frac{e^{-(\lambda+\mu)}(\lambda+\mu)^{z}}{z!}$ by the binomial theorem.
- This pmf is the same as the pmf of $\operatorname{Pois}(\lambda+\mu)$, and so $Z\sim\operatorname{Pois}(\lambda+\mu)$.
- We can extend this result to $n$ Poisson r.v.'s by induction.
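As a quick numeric check of this proposition (with arbitrary parameter choices $\lambda=1.5$ and $\mu=2$), convolving the two Poisson pmfs reproduces the $\operatorname{Pois}(\lambda+\mu)$ pmf:

```python
from math import exp, factorial

# lam = 1.5 and mu = 2.0 are arbitrary choices; the convolution of the
# Pois(lam) and Pois(mu) pmfs should equal the Pois(lam + mu) pmf.
def pois(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam, mu = 1.5, 2.0
for z in range(5):
    conv = sum(pois(k, lam) * pois(z - k, mu) for k in range(z + 1))
    print(z, conv, pois(z, lam + mu))   # the last two columns agree
```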
Template:Colored example Template:Collapse bottom
Order statistics
Template:Colored definition Template:Colored proposition
Proof.
- Consider the event $\{X_{(k)}\le x\}$.
(Figure: possible positions of $x$ along the real line, marked against the ordered sample $X_{(1)},X_{(2)},\dots,X_{(k)},X_{(k+1)},\dots,X_{(n)}$; as $x$ moves to the right past $X_{(k)}$, at least $k$ of the $X_i$'s lie at the left-hand side of $x$.)
- We can see from the above figure that $X_{(k)}\le x$ if and only if at least $k$ of the $X_i$'s are less than or equal to $x$.
- Let the number of $X_i$'s that are less than or equal to $x$ be $N$.
- Since $N\sim\operatorname{Binom}(n,F(x))$ (because for each $i$, we can treat $\{X_i\le x\}$ and $\{X_i>x\}$ to be the two outcomes in a Bernoulli trial, with success probability $P(X_i\le x)=F(x)$),
- The cdf is $F_{X_{(k)}}(x)=P(N\ge k)=\sum_{j=k}^{n}\binom{n}{j}\big(F(x)\big)^{j}\big(1-F(x)\big)^{n-j}$.
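A minimal simulation sketch of this cdf formula, taking $X_1,\dots,X_n$ i.i.d. $\operatorname{Uniform}(0,1)$ so that $F(x)=x$ (the parameter choices are arbitrary):

```python
import random
from math import comb

# n = 5, k = 3, x = 0.4 are arbitrary choices; for Uniform(0, 1) samples,
# the formula gives P(X_(k) <= x) = sum_{j=k}^{n} C(n, j) x^j (1 - x)^(n - j).
random.seed(0)
n, k, x = 5, 3, 0.4
trials = 100_000
hits = sum(sorted(random.random() for _ in range(n))[k - 1] <= x
           for _ in range(trials))
formula = sum(comb(n, j) * x**j * (1 - x)**(n - j) for j in range(k, n + 1))
print(hits / trials, formula)   # both ~ 0.317
```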
Poisson process
Template:Colored definition There are several important properties of the Poisson process. Template:Colored proposition
Proof.
- The time to the $n$-th event is $T_n=W_1+W_2+\dots+W_n$, with each interarrival time $W_i$ following $\operatorname{Exp}(\lambda)$ independently.
- It suffices to prove that the sum of an independent $S\sim\operatorname{Gamma}(n-1,\lambda)$ and $W\sim\operatorname{Exp}(\lambda)$ follows $\operatorname{Gamma}(n,\lambda)$, and then the desired result follows by induction.
- By the convolution formula, $f_{S+W}(t)=\int_{0}^{t}\frac{\lambda^{n-1}s^{n-2}e^{-\lambda s}}{(n-2)!}\cdot\lambda e^{-\lambda(t-s)}\,ds=\frac{\lambda^{n}e^{-\lambda t}}{(n-2)!}\int_{0}^{t}s^{n-2}\,ds=\frac{\lambda^{n}t^{n-1}e^{-\lambda t}}{(n-1)!}$, which is the pdf of $\operatorname{Gamma}(n,\lambda)$, as desired.
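A minimal simulation sketch of this proposition (with arbitrary choices $\lambda=2$ and $n=3$): the simulated sum of interarrival times matches the $\operatorname{Gamma}(n,\lambda)$ mean $n/\lambda$ and variance $n/\lambda^{2}$:

```python
import random
import statistics

# lam = 2.0 and n = 3 are arbitrary choices: the time to the n-th event,
# a sum of n i.i.d. Exp(lam) interarrival times, should match the
# Gamma(n, lam) mean n/lam and variance n/lam^2.
random.seed(0)
lam, n = 2.0, 3
samples = [sum(random.expovariate(lam) for _ in range(n))
           for _ in range(100_000)]
print(statistics.mean(samples), n / lam)          # ~ 1.5
print(statistics.variance(samples), n / lam**2)   # ~ 0.75
```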
Template:Colored remark Template:Colored proposition
Proof. For each nonnegative integer $n$, let $T_i$ ($i=1,\dots,n$) be the interarrival time between the $(i-1)$-th and the $i$-th arrival, and let $S_n=T_1+\dots+T_n$ be the time to the $n$-th event, starting from the beginning of the fixed time interval (we can treat the start to be time zero because of the memoryless property). The joint pdf of $T_1,\dots,T_n$ is $f(t_1,\dots,t_n)=\lambda e^{-\lambda t_1}\cdots\lambda e^{-\lambda t_n}$ by independence. Let $N$ be the number of arrivals within the fixed time interval, of length $t$ say. The pmf of $N$ is $P(N=n)=P(S_n\le t<S_{n+1})=P(S_n\le t)-P(S_{n+1}\le t)=\frac{e^{-\lambda t}(\lambda t)^{n}}{n!}$, which is the pmf of $\operatorname{Pois}(\lambda t)$. The result follows.
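A minimal simulation sketch of this proof's conclusion (with arbitrary choices $\lambda=2$ and $t=1.5$): counting arrivals in $[0,t]$ by accumulating exponential interarrival times reproduces the $\operatorname{Pois}(\lambda t)$ pmf:

```python
import random
from math import exp, factorial

# lam = 2.0 and t = 1.5 are arbitrary choices: count arrivals in [0, t] by
# accumulating Exp(lam) interarrival times, then compare the empirical
# distribution of the count with the Pois(lam * t) pmf.
random.seed(0)
lam, t, trials = 2.0, 1.5, 100_000
counts = [0] * 30
for _ in range(trials):
    s, n = random.expovariate(lam), 0
    while s <= t:
        n += 1
        s += random.expovariate(lam)
    counts[n] += 1
for n in range(6):
    print(n, counts[n] / trials, exp(-lam * t) * (lam * t)**n / factorial(n))
```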
Proof. For each ,