Notes to a video lecture on http://www.unizor.com
Independent Random Variables
Expectation of Product
Our goal in this lecture is to prove that the expectation of a product of two independent random variables equals the product of their expectations.
First of all, intuitively, this fact should be obvious, at least in some cases.
When the expectation of a random variable is a value around which the results of random experiments are concentrated (like the temperature of a healthy person), the product of the results of two different experiments (the product of the temperatures of two different healthy persons) tends to concentrate around the product of their expectations.
In other cases, when such a concentration does not take place (like flipping a coin), the same multiplicative property of the expectation is still observed.
A very important detail, however, differentiates the property of a sum of two random variables from their product. The expectation of a sum always equals the sum of the expectations of its components. With a product, the analogous property is true only in case the components are INDEPENDENT random variables.
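For example (an illustration only, not part of the proof that follows), let ξ be the result of one flip of a fair coin, equal to 0 for tails and 1 for heads, and let η be that very same result, so that η=ξ and the two variables are certainly not independent. Then
E(ξ) = E(η) = 0·1/2+1·1/2 = 1/2
while the product ξ·η takes the value 1 with probability 1/2 and the value 0 with probability 1/2, so
E(ξ·η) = 1/2, which is not equal to E(ξ)·E(η) = 1/4.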
Let's approach this problem more formally and prove this theorem.
Consider the following two random experiments (sample spaces) and random variables defined on their elementary events.
Ω1=(e1,e2,...,eM )
with the corresponding measure of probabilities of these elementary events
P=(p1,p2,...,pM )
(that is, P(ei)=pi - non-negative numbers whose sum equals 1)
and random variable ξ defined for each elementary event as
ξ(ei) = xi, where i=1,2,...,M
Ω2=(f1,f2,...,fN )
with the corresponding measure of probabilities of these elementary events
Q=(q1,q2,...,qN )
(that is, Q(fj)=qj - non-negative numbers whose sum equals 1)
and random variable η defined for each elementary event as
η(fj) = yj, where j=1,2,...,N
Separately, the expectations of these random variables are:
E(ξ) = x1·p1+x2·p2+...+xM·pM
E(η) = y1·q1+y2·q2+...+yN·qN
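As a concrete illustration of this notation (a fair coin and a fair die, used here purely as an example), let ξ take the values x1=0, x2=1 with probabilities p1=p2=1/2, and let η take the values y1=1, y2=2,...,y6=6 with probabilities q1=...=q6=1/6. Then
E(ξ) = 0·1/2+1·1/2 = 1/2
E(η) = (1+2+3+4+5+6)·1/6 = 3.5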
To calculate the expectation of a product of these random variables, let's examine what values this product can take and with what probabilities.
Since every value of ξ can be observed with every value of η, we can conclude that all the values of their product are described by all the values xi·yj, where index i runs from 1 to M and index j runs from 1 to N.
Let's examine the probabilistic meaning of a product of two random variables defined on two different sample spaces.
Any particular value xi·yj is taken by a new random variable ζ=ξ·η defined on a new combined sample space Ω=Ω1×Ω2 that consists of all pairs of elementary events (ei, fj), with the probabilities of these pairs denoted by
R(ei, fj) = rij
where index i runs from 1 to M and index j runs from 1 to N.
Thus, we have defined a new random variable ζ=ξ·η on a new sample space Ω of M·N pairs of elementary events from the two old spaces Ω1 and Ω2 as follows
ζ(ei , fj ) = xi·yj
with probability rij
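In the coin-and-die illustration above, the combined sample space Ω consists of M·N = 2·6 = 12 pairs (coin result, die result), and ζ takes, for instance, the value 1·3 = 3 on the pair (heads, 3).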
Before going any further, let's examine very important properties of probabilities rij.
We have defined rij as the probability of a random experiment described by the sample space Ω1 resulting in the elementary event ei and, simultaneously, a random experiment described by the sample space Ω2 resulting in the elementary event fj.
In particular, if the events from these two sample spaces are independent,
rij = pi·qj
because, for independent events, the probability of their simultaneous occurrence equals the product of the probabilities of their separate individual occurrences.
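For instance, in the coin-and-die illustration the coin flip does not influence the die roll, so the probability of observing heads and, simultaneously, a six is 1/2·1/6 = 1/12.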
Keeping in mind the above properties of the probabilities rij, we can calculate the expectation of our new random variable ζ.
E(ζ) = E(ξ·η) =
= (x1·y1)·r11+...+(x1·yN)·r1N +
+ (x2·y1)·r21+...+(x2·yN)·r2N +
...
+ (xM·y1)·rM1+...+(xM·yN)·rMN
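(In a more compact form, E(ζ) is the sum of the products xi·yj·rij taken over all pairs of indices i and j.)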
On the other hand, let's calculate the product of the expectations of our random variables ξ and η:
E(ξ)·E(η) =
=(x1·p1+...+xM·pM)·
·(y1·q1+...+yN·qN) =
= (x1·y1)·p1q1+...+(x1·yN)·p1qN +
+ (x2·y1)·p2q1+...+(x2·yN)·p2qN +
...
+ (xM·y1)·pMq1+...+(xM·yN)·pMqN
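(In a more compact form, E(ξ)·E(η) is the sum of the products xi·yj·pi·qj taken over all pairs of indices i and j - the same grid of terms as above, but with pi·qj in place of rij.)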
Obviously, if the random variables ξ and η are INDEPENDENT, the probability rij of ξ taking the value xi and, simultaneously, η taking the value yj equals the product of the corresponding probabilities pi·qj. In this case the expressions for E(ξ·η) and E(ξ)·E(η) are identical.
That proves that for INDEPENDENT random variables the mathematical expectation of their product equals the product of their mathematical expectations.
End of proof.
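As a quick numerical check of this result (a minimal simulation sketch in Python; the fair coin for ξ, the fair die for η and all variable names are chosen here purely for illustration), one can estimate both sides of the equality:

import random

trials = 1_000_000
# xi: fair coin (0 or 1), eta: fair six-sided die, generated independently of each other
xi = [random.randint(0, 1) for _ in range(trials)]
eta = [random.randint(1, 6) for _ in range(trials)]

e_xi = sum(xi) / trials                              # close to 0.5
e_eta = sum(eta) / trials                            # close to 3.5
e_prod = sum(x*y for x, y in zip(xi, eta)) / trials  # close to 1.75

print(e_prod, e_xi * e_eta)  # the two estimates should nearly coincide

# For a DEPENDENT pair (for example, eta equal to xi) the property breaks down:
# E(xi·xi) is close to 0.5, while E(xi)·E(xi) is close to 0.25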