Monday, September 29, 2014

Unizor - Probability - Random Variables - Independence

There are several different but equivalent ways to define independent random variables. We will use a simple approach based on the fact that we deal only with random variables that take a finite number of values.
So, assume a random variable ξ takes values
x1, x2,..., xM
with probabilities
p1, p2,..., pM.
Further, assume a random variable η takes values
y1, y2,..., yN
with probabilities
q1, q2,..., qN.

Let us recall that a random variable is a numeric function defined on each elementary event participating in the random experiment. The fact that a random variable ξ takes some value xi means that one of the elementary events where the value of this numeric function equals xi has indeed occurred. The combined probability of all the elementary events where this numeric function equals xi is the probability of the random variable ξ taking the value xi. Together, all these elementary events make up an event described as "an event where random variable ξ takes the value xi".

For instance, when rolling two dice and summing the rolled numbers (this sum is our random variable), the elementary events (1,5), (2,4), (3,3), (4,2) and (5,1), combined together, form an event described as "our random variable took the value 6".
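To make this concrete, here is a short Python sketch (my own illustration, not part of the original lecture) that enumerates the 36 equally likely elementary events of rolling two dice and collects those where the sum equals 6:

from fractions import Fraction
from itertools import product

# All 36 equally likely elementary events of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))

# The event "the sum equals 6" combines the elementary events listed above
event_sum_6 = [(a, b) for (a, b) in outcomes if a + b == 6]
print(event_sum_6)                                # [(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)]
print(Fraction(len(event_sum_6), len(outcomes)))  # 5/36

Since all 36 elementary events are equally likely, the probability of our random variable taking the value 6 is 5/36.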

Let's assign a symbol Ai to this combined event of the random variable ξ taking the value xi. So, according to our notation,
P(ξ=xi) = P(Ai)

Analogously, let Bj be the event of the random variable η taking the value yj. So, according to our notation,
P(η=yj) = P(Bj)

The above considerations allow us to use the language and properties of events to describe the values of and relationships between random variables.
Thus, we can define the conditional probability of one random variable relative to another using the already defined conditional probability between events:
P(ξ=xi | η=yj ) = P(Ai | Bj )

If the conditional probability of a random variable ξ taking any one of its values, under the condition that a random variable η took any one of its values, equals the unconditional probability of ξ taking that value, then the random variable ξ is independent of the random variable η.
In other words, ξ is independent of η if
P(ξ=xi | η=yj ) = P(ξ=xi )
where index i can take any value from 1 to M and index j can take any value from 1 to N.
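As a quick check of this definition, consider a hypothetical example (my own, not from the lecture): ξ is the number shown by the first die and η the number shown by the second when rolling two dice. A minimal Python sketch verifying the equality above for every pair of values:

from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # elementary events, each with probability 1/36

def prob(pred):
    # Probability of the event {e : pred(e)} under the uniform measure
    return Fraction(sum(1 for e in outcomes if pred(e)), len(outcomes))

xi  = lambda e: e[0]   # ξ = number on the first die
eta = lambda e: e[1]   # η = number on the second die

# Verify P(ξ=x | η=y) = P(ξ=x) for every pair of values x, y
for x in range(1, 7):
    for y in range(1, 7):
        joint = prob(lambda e: xi(e) == x and eta(e) == y)
        cond = joint / prob(lambda e: eta(e) == y)   # P(ξ=x | η=y) = P(Ai | Bj)
        assert cond == prob(lambda e: xi(e) == x)    # equals unconditional P(ξ=x)
print("every conditional probability equals the unconditional one: ξ is independent of η")

Note that the sketch computes the conditional probability as the joint probability divided by the marginal one, which is exactly the property of conditional probability used in the next part.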

Important Property of Independent Random Variables

From the definition and properties of conditional probability for events X and Y we know that
P(X | Y) = P(X∩Y)/P(Y)
Therefore,
P(ξ=xi | η=yj ) = P(ξ=xi ∩ η=yj ) / P(η=yj )
But, if the random variables ξ and η are independent, the conditional probability of ξ taking some value under some condition imposed on η is the same as its unconditional probability. Therefore,
P(ξ=xi ) = P(ξ=xi ∩ η=yj ) / P(η=yj )
from which follows
P(ξ=xi ∩ η=yj ) = P(ξ=xi ) · P(η=yj )
Verbally: the probability of two independent random variables simultaneously taking some values equals the product of the probabilities of each of them separately taking its corresponding value.
For random variables, similarly to the analogous property of independent events, this multiplication rule is a characteristic property of independence. It is completely equivalent to the definition of independent random variables given above in terms of conditional probability.
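To see the contrast, here is one more small Python sketch (again my own illustration, not from the lecture): for two dice, the multiplication rule holds for the independent pair (first die, second die), but fails for the dependent pair (first die, sum of both dice):

from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))

def prob(pred):
    return Fraction(sum(1 for e in outcomes if pred(e)), len(outcomes))

# Independent pair: ξ = first die, η = second die
assert prob(lambda e: e[0] == 3 and e[1] == 5) == \
       prob(lambda e: e[0] == 3) * prob(lambda e: e[1] == 5)  # 1/36 = 1/6 · 1/6

# Dependent pair: ξ = first die, ζ = sum of both dice
joint = prob(lambda e: e[0] == 3 and e[0] + e[1] == 6)                # 1/36
prod  = prob(lambda e: e[0] == 3) * prob(lambda e: e[0] + e[1] == 6)  # 1/6 · 5/36 = 5/216
print(joint, prod)   # 1/36 vs 5/216: not equal, so ξ and ζ are not independent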
