Stat9 - Prob and Stats
Course: Probability and Statistics (MATHUA235)
University: New York University
9.3 More than two random variables

To determine the joint distribution of n random variables X1, X2, ..., Xn, all defined on the same sample space Ω, we have to describe how the probability mass is distributed over all possible values of (X1, X2, ..., Xn). In fact, it suffices to specify the joint distribution function F of X1, X2, ..., Xn, which is defined by

F(a1, a2, ..., an) = P(X1 ≤ a1, X2 ≤ a2, ..., Xn ≤ an)   for −∞ < a1, a2, ..., an < ∞.

In case the random variables X1, X2, ..., Xn are discrete, the joint distribution can also be characterized by specifying the joint probability mass function p of X1, X2, ..., Xn, defined by

p(a1, a2, ..., an) = P(X1 = a1, X2 = a2, ..., Xn = an)   for −∞ < a1, a2, ..., an < ∞.

Drawing without replacement

Let us illustrate the use of the joint probability mass function with an example. In the weekly Dutch National Lottery Show, 6 balls are drawn from a vase that contains balls numbered from 1 to 41. Clearly, the first number takes values 1, 2, ..., 41 with equal probabilities. Is this also the case for, say, the third ball?

Let us consider a more general situation. Suppose a vase contains balls
numbered 1, 2,...,N. We draw n balls without replacement from the vase. Note that n cannot be
larger than N. Each ball is selected with equal probability, i.e., in the first draw each ball has
probability 1/N, in the second draw each of the N −1 remaining balls has probability 1/(N −1),
and so on. Let Xi denote the number on the ball in the i-th draw, for i = 1, 2,...,n. In order to
obtain the marginal probability mass function of Xi, we first compute the joint probability mass
function of X1, X2, ..., Xn. Since there are N(N − 1)···(N − n + 1) possible combinations for the values of X1, X2, ..., Xn, each having the same probability, the joint probability mass function is given by

p(a1, a2, ..., an) = P(X1 = a1, X2 = a2, ..., Xn = an) = 1 / (N(N − 1)···(N − n + 1)),

for all distinct values a1, a2, ..., an with 1 ≤ aj ≤ N. Clearly X1, X2, ..., Xn influence each other.
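This joint probability mass function can be verified by brute force for a small vase. The sketch below uses illustrative values N = 5 and n = 3 (not taken from the text): it enumerates every ordered draw of n distinct balls, checks that there are N(N − 1)(N − 2) of them, and confirms that assigning each of them probability 1/(N(N − 1)(N − 2)) makes the pmf sum to 1.

```python
from itertools import permutations
from fractions import Fraction

# Illustrative values (not from the text): a vase with N = 5 balls,
# from which n = 3 are drawn without replacement.
N, n = 5, 3

# Each ordered outcome (a1, ..., an) of distinct numbers is equally likely,
# with probability 1 / (N * (N-1) * ... * (N-n+1)).
denom = 1
for i in range(n):
    denom *= N - i
p = Fraction(1, denom)

# All ordered draws of n distinct balls from {1, ..., N}.
outcomes = list(permutations(range(1, N + 1), n))

assert len(outcomes) == denom      # N(N-1)(N-2) = 60 ordered outcomes
assert len(outcomes) * p == 1      # the joint pmf sums to 1
print(len(outcomes), p)            # 60 outcomes, each with probability 1/60
```

Using exact `Fraction` arithmetic avoids any floating-point doubt about the pmf summing to exactly 1.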
Nevertheless, the marginal distribution of each Xi is the same. This can be seen as follows.
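Before working through the derivation, the claim can also be checked by simulation. The sketch below uses the lottery's values N = 41 and n = 6 and records the number on the third ball over many simulated draws; the number of trials and the tolerance are illustrative choices, not from the text.

```python
import random

# Simulate the lottery: draw n = 6 balls without replacement from balls
# numbered 1..41, and record the number on the third ball drawn.
random.seed(0)
N, n, trials = 41, 6, 200_000
counts = [0] * (N + 1)
for _ in range(trials):
    draw = random.sample(range(1, N + 1), n)  # ordered draw without replacement
    counts[draw[2]] += 1                      # third ball is at index 2

# If the marginal distribution is uniform, each value 1..N should appear
# with relative frequency close to 1/N ≈ 0.0244.
freqs = [c / trials for c in counts[1:]]
assert all(abs(f - 1 / N) < 0.005 for f in freqs)
```

`random.sample` returns the selected elements in selection order, so it models the sequential draws directly.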
Similar to obtaining the marginal probability mass functions in Table 9.2, we can find the marginal probability mass function of Xi by summing the joint probability mass function over all possible values of X1, ..., Xi−1, Xi+1, ..., Xn:

pXi(k) = Σ p(a1, ..., ai−1, k, ai+1, ..., an) = Σ 1 / (N(N − 1)···(N − n + 1)),

where the sum runs over all distinct values a1, a2, ..., an with 1 ≤ aj ≤ N and ai = k. Since there are (N − 1)(N − 2)···(N − n + 1) such combinations, we conclude that the marginal probability mass function of Xi is given by

pXi(k) = (N − 1)(N − 2)···(N − n + 1) · 1 / (N(N − 1)···(N − n + 1)) = 1/N,   for k = 1, 2, ..., N.

We see that the marginal probability mass function of each Xi is the same, assigning equal probability 1/N to each possible value.

In case the random variables
X1, X2,...,Xn are continuous, the joint distribution is defined in a similar way as in the case of
two variables. We say that the random variables X1, X2,...,Xn have a joint continuous
distribution if for some function f : Rⁿ → R and for all numbers a1, a2, ..., an and b1, b2, ..., bn with ai ≤ bi,

P(a1 ≤ X1 ≤ b1, a2 ≤ X2 ≤ b2, ..., an ≤ Xn ≤ bn) = ∫_{a1}^{b1} ∫_{a2}^{b2} ··· ∫_{an}^{bn} f(x1, x2, ..., xn) dx1 dx2 ··· dxn.

Again f has to satisfy f(x1, x2, ..., xn) ≥ 0 and f has to integrate to 1. We call f the joint probability density of X1, X2, ..., Xn.
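As a numerical illustration, take the example density f(x, y) = 4xy on the unit square, zero elsewhere (an illustrative choice, not from the text). A midpoint Riemann sum approximates the iterated integral above: the total mass must come out to 1, and P(X ≤ 1/2, Y ≤ 1/2) comes out to (1/8 · 4 · 1/8) = 1/16.

```python
# Example joint density (illustrative, not from the text): f(x, y) = 4xy
# on the unit square, zero elsewhere. It is nonnegative and integrates to 1.
def f(x, y):
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def riemann2d(f, a1, b1, a2, b2, m=200):
    """Midpoint Riemann sum of f over the rectangle [a1, b1] x [a2, b2]."""
    hx, hy = (b1 - a1) / m, (b2 - a2) / m
    total = 0.0
    for i in range(m):
        x = a1 + (i + 0.5) * hx
        for j in range(m):
            y = a2 + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

assert abs(riemann2d(f, 0, 1, 0, 1) - 1.0) < 1e-3         # density integrates to 1
assert abs(riemann2d(f, 0, 0.5, 0, 0.5) - 1 / 16) < 1e-3  # P(X <= 1/2, Y <= 1/2)
```

The same helper evaluates any box probability P(a1 ≤ X ≤ b1, a2 ≤ Y ≤ b2) by changing the integration limits, mirroring the defining formula.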
9.4 Independent random variables

In earlier chapters we have spoken of
independence of random variables, anticipating a formal definition. On page 46 we postulated
that the events {R1 = a1}, {R2 = a2},..., {R10 = a10} related to the Bernoulli random variables
R1,...,R10 are independent. How should one define independence of random variables?
Intuitively, random variables X and Y are independent if every event involving only X is independent of every event involving only Y. Since for two discrete random variables X and Y,