*Notes to a video lecture on http://www.unizor.com*

__Independent Random Variables__

Expectation of Product

Our goal in this lecture is to prove that the *expectation* of a product of two *independent random variables* equals the product of their *expectations*.

First of all, intuitively, this fact should be obvious, at least in some cases.

When the *expectation* of a *random variable* is a value around which the results of *random experiments* are concentrated (like the temperature of a healthy person), the product of the results of two different experiments (the product of the temperatures of two different healthy persons) tends to concentrate around the product of their *expectations*.

In other cases, when such a concentration does not take place (like flipping a coin), the same multiplicative property of the *expectation* is still observed.

A very important detail, however, differentiates the property of a sum of two random variables from that of their product. The expectation of a sum always equals the sum of the expectations of its components. With a product, the analogous property holds only if the components are **INDEPENDENT** random variables.

Let's approach this problem more formally and prove this theorem.

Consider the following two *random experiments* (*sample spaces*) and *random variables* defined on their *elementary events*.

**Ω**_{1}=(e_{1},e_{2},...,e_{M}) with the corresponding measure of probabilities of these *elementary events*
**P**=(p_{1},p_{2},...,p_{M})
*(that is, non-negative numbers with their sum equal to 1)*:
**P**(e_{i})=p_{i}
and random variable *ξ* defined for each *elementary event* as
*ξ(e_{i}) = x_{i}* where *i=1,2,...,M*

**Ω**_{2}=(f_{1},f_{2},...,f_{N}) with the corresponding measure of probabilities of these *elementary events*
**Q**=(q_{1},q_{2},...,q_{N})
*(that is, non-negative numbers with their sum equal to 1)*:
**Q**(f_{j})=q_{j}
and random variable *η* defined for each *elementary event* as
*η(f_{j}) = y_{j}* where *j=1,2,...,N*

Separately, the *expectations* of these *random variables* are:
**E**(ξ) = x_{1}·p_{1}+x_{2}·p_{2}+...+x_{M}·p_{M}
**E**(η) = y_{1}·q_{1}+y_{2}·q_{2}+...+y_{N}·q_{N}

To calculate the *expectation* of a product of these *random variables*, let's examine what values this product can take and with what *probabilities*.
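To make the two expectation formulas above concrete, here is a minimal numeric sketch; the distributions (a fair six-sided die for *ξ* and a 0/1-valued variable for *η*) are invented purely for illustration:

```python
# Hypothetical distributions, chosen only for illustration:
# xi models a fair six-sided die, eta a 0/1-valued variable.
x_values = [1, 2, 3, 4, 5, 6]
p = [1/6] * 6                 # p_i: non-negative, summing to 1
y_values = [0, 1]
q = [0.3, 0.7]                # q_j: non-negative, summing to 1

# E(xi) = x_1*p_1 + ... + x_M*p_M; the same pattern gives E(eta)
E_xi = sum(x * pi for x, pi in zip(x_values, p))
E_eta = sum(y * qj for y, qj in zip(y_values, q))

print(E_xi)   # ~3.5, up to floating-point rounding
print(E_eta)  # ~0.7, up to floating-point rounding
```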

Since every value of *ξ* can be observed with every value of *η*, we can conclude that all the values of their product are described by all values *x_{i}·y_{j}*, where index *i* runs from *1* to *M* and index *j* runs from *1* to *N*.
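Enumerating these *M·N* products is a one-liner; the values below are made up for illustration:

```python
# With M values for xi and N values for eta there are M*N products x_i*y_j.
x = [1, 2]        # values of xi (M = 2), hypothetical
y = [10, 20, 30]  # values of eta (N = 3), hypothetical

# One product per pair (x_i, y_j), i = 1..M, j = 1..N
products = [xi * yj for xi in x for yj in y]
print(products)  # [10, 20, 30, 20, 40, 60]
```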

Let's examine the probabilistic meaning of a product of two *random variables* defined on two different *sample spaces*.

Any particular value *x_{i}·y_{j}* is taken by a new random variable *ζ=ξ·η* defined on a new combined *sample space* **Ω**=**Ω**_{1}×**Ω**_{2} that consists of all pairs of *elementary events* *(e_{i}, f_{j})*, with the *probabilities* of these pairs equal to
**R**(e_{i}, f_{j}) = r_{ij}
where index *i* runs from *1* to *M* and index *j* runs from *1* to *N*.

Thus, we have defined a new *random variable* *ζ=ξ·η* on a new *sample space* **Ω** of *M·N* pairs of *elementary events* from the two old spaces **Ω**_{1} and **Ω**_{2} as follows:
*ζ(e_{i}, f_{j}) = x_{i}·y_{j}* with *probability r_{ij}*

Before going any further, let's examine a very important property of the *probabilities r_{ij}*.

We have defined *r_{ij}* as the *probability* of the *random experiment* described by *sample space* **Ω**_{1} resulting in *elementary event* *e_{i}* and, simultaneously, the *random experiment* described by *sample space* **Ω**_{2} resulting in *elementary event* *f_{j}*.

In particular, if events from these two *sample spaces* are **independent**,
*r_{ij} = p_{i}·q_{j}*
because, for **independent** events, the *probability of their simultaneous occurrence* equals the *product of probabilities* of their separate individual occurrences.
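Under the independence assumption this product rule yields the whole joint table at once; a small sketch with made-up marginal distributions:

```python
# For independent experiments the joint probabilities r_ij form an
# "outer product" of the two marginal distributions.
# The marginals below are hypothetical, for illustration only.
p = [0.2, 0.8]        # P over Omega_1 (M = 2)
q = [0.5, 0.3, 0.2]   # Q over Omega_2 (N = 3)

# r[i][j] = p_i * q_j
r = [[pi * qj for qj in q] for pi in p]

# The M*N joint probabilities are again non-negative and sum to 1,
# so R is a valid probability measure on the combined sample space.
total = sum(sum(row) for row in r)
print(total)  # ~1.0, up to floating-point rounding
```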

Keeping in mind the above property of the *probabilities* *r_{ij}*, we can calculate the *expectation* of our new *random variable* *ζ*:

**E**(ζ) = **E**(ξ·η) =
= (x_{1}·y_{1})·r_{11}+...+(x_{1}·y_{N})·r_{1N}+
+ (x_{2}·y_{1})·r_{21}+...+(x_{2}·y_{N})·r_{2N}+
...
+ (x_{M}·y_{1})·r_{M1}+...+(x_{M}·y_{N})·r_{MN}

On the other hand, let's calculate the product of the expectations of our random variables *ξ* and *η*:

**E**(ξ)·**E**(η) =
= (x_{1}·p_{1}+...+x_{M}·p_{M})·(y_{1}·q_{1}+...+y_{N}·q_{N}) =
= (x_{1}·y_{1})·p_{1}q_{1}+...+(x_{1}·y_{N})·p_{1}q_{N}+
+ (x_{2}·y_{1})·p_{2}q_{1}+...+(x_{2}·y_{N})·p_{2}q_{N}+
...
+ (x_{M}·y_{1})·p_{M}q_{1}+...+(x_{M}·y_{N})·p_{M}q_{N}

Obviously, if random variables *ξ* and *η* are **INDEPENDENT**, the probability *r_{ij}* of *ξ* taking value *x_{i}* and, simultaneously, *η* taking value *y_{j}* equals the product of the corresponding probabilities *p_{i}·q_{j}*. In this case the expressions for **E**(ξ·η) and **E**(ξ)·**E**(η) are identical, term by term.

That proves that for **INDEPENDENT** random variables the mathematical expectation of their product equals the product of their mathematical expectations.

End of proof.
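The theorem is easy to check numerically for any particular pair of independent distributions; the values below are made up for illustration:

```python
# Numerical check of the theorem on hypothetical independent variables:
# xi takes values x_i with probabilities p_i; eta takes y_j with q_j.
x, p = [1, 2, 3], [0.5, 0.25, 0.25]
y, q = [-1, 4],   [0.6, 0.4]

# E(xi*eta) computed directly over the combined sample space,
# using r_ij = p_i * q_j (the independence assumption):
E_product = sum(xi * yj * pi * qj
                for xi, pi in zip(x, p)
                for yj, qj in zip(y, q))

# Product of the individual expectations:
E_xi = sum(xi * pi for xi, pi in zip(x, p))
E_eta = sum(yj * qj for yj, qj in zip(y, q))

print(E_product)       # the two printed values agree,
print(E_xi * E_eta)    # up to floating-point rounding
```

If *ξ* and *η* were dependent, so that *r_{ij} ≠ p_{i}·q_{j}* for some pair of indices, the two results would in general differ.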
