Monday, March 16, 2015
Unizor - Probability - Event Arithmetic
Let's discuss how logical operations on events are related to their probabilities. This part of the theory relies heavily on the representation of events as subsets of elements (each element is an elementary event) and of probability as an additive measure defined on these subsets.
Starting from the simplest case, consider a finite set of elementary events of equal probability (say, the outcomes of rolling a pair of dice, where each elementary event has probability 1/36).
Since probability is an additive measure, any combination of elementary events into one bigger event has a probability equal to the sum of the probabilities of all elementary events that compose this bigger event. For example, the event of rolling a pair of dice with a sum equal to 5 is composed of the elementary events 1/4, 2/3, 3/2 and 4/1, each having probability 1/36; therefore, the probability of having a sum of 5 on two dice equals 4·1/36 = 1/9.
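To make the counting concrete, here is a small Python sketch (my own illustration, not part of the original lecture) that enumerates all 36 elementary events and adds up the 1/36 probabilities of those whose sum is 5:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely elementary events: ordered pairs (die #1, die #2).
outcomes = list(product(range(1, 7), repeat=2))

# The event "sum equals 5" is the subset of elementary events whose sum is 5.
sum_is_5 = [(a, b) for (a, b) in outcomes if a + b == 5]

# Its probability is the sum of the 1/36 probabilities of its members.
print(sum_is_5)                                # [(1, 4), (2, 3), (3, 2), (4, 1)]
print(Fraction(len(sum_is_5), len(outcomes)))  # 1/9
```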
Consider now a logical operation OR between two events X and Y (this operation is usually signified in the set theory by a union sign ∪):
Z = X OR Y = X ∪ Y.
First, consider these two events to be mutually exclusive, that is, having no elementary events in common. Since they are mutually exclusive, the elementary events constituting event X are completely different from those constituting event Y. As we know, event Z consists of the union of the elementary events included in event X and those included in event Y. Therefore, the probability measure of event Z is the sum of the probabilities of events X and Y. We can, therefore, state that the probability of an event that is the result of an OR operation between two mutually exclusive events equals the sum of their probabilities.
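As an illustration (my own Python sketch, with "sum is 5" and "sum is 7" chosen as an example of mutually exclusive events on a pair of dice), we can verify that the probability of their union equals the sum of their probabilities:

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))

def prob(event):
    # Each elementary event has probability 1/36, so P(event) = |event| / 36.
    return Fraction(len(event), len(outcomes))

X = {(a, b) for (a, b) in outcomes if a + b == 5}   # sum is 5
Y = {(a, b) for (a, b) in outcomes if a + b == 7}   # sum is 7

assert X & Y == set()                    # mutually exclusive: no common elementary events
assert prob(X | Y) == prob(X) + prob(Y)  # P(X OR Y) = P(X) + P(Y)
print(prob(X | Y))                       # 5/18
```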
Now let's establish a connection between the intersection of a certain type of events and the operation of multiplication.
Consider now a logical operation AND between two events X and Y (this operation is usually signified in the set theory by an intersection sign ∩):
Z = X AND Y = X ∩ Y.
This operation takes only the elementary events common to X and Y to form the resulting event Z.
Let's start with an example of rolling two dice (we will call them #1 and #2). Consider an event X = {#1 is equal to or greater than 5} and an event Y = {#2 is equal to or less than 4}. From a common-sense perspective, these events are independent because they apply to two different dice with no connection between them. Let's examine the probabilities of X, Y and X∩Y.
The elementary events in our experiment are still all possible pairs of numbers from 1 to 6: 36 different combinations, each with probability 1/36.
Event X consists of elementary events 5/1, 5/2, 5/3, 5/4, 5/5, 5/6, 6/1, 6/2, 6/3, 6/4, 6/5, 6/6. Therefore, its probability is 12/36=1/3.
Event Y consists of elementary events 1/4, 2/4, 3/4, 4/4, 5/4, 6/4, 1/3, 2/3, 3/3, 4/3, 5/3, 6/3, 1/2, 2/2, 3/2, 4/2, 5/2, 6/2, 1/1, 2/1, 3/1, 4/1, 5/1, 6/1. Therefore, its probability is 24/36=2/3.
Event X∩Y consists of the elementary events 5/4, 6/4, 5/3, 6/3, 5/2, 6/2, 5/1, 6/1. Therefore, its probability is 8/36=2/9.
Notice that
P(X∩Y) = P(X)·P(Y)
and this is not an accident. The important consideration here is the independence of X and Y, which we have not defined precisely, but only mentioned in a casual, common-sense way. It was indeed natural to consider the results of rolling two different dice as independent events.
As you can see, the logical operation AND between independent events results in the multiplication of their probabilities.
This fact is the basis for using the multiplication sign · instead of the logical AND between independent events. The above equality now looks like
P(X · Y) = P(X) · P(Y).
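Here is a sketch (again my own Python illustration of the dice example above) that enumerates the events X = {#1 is 5 or greater} and Y = {#2 is 4 or less} and checks the multiplication rule:

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))

def prob(event):
    # Each elementary event has probability 1/36, so P(event) = |event| / 36.
    return Fraction(len(event), len(outcomes))

X = {(a, b) for (a, b) in outcomes if a >= 5}   # die #1 is 5 or 6
Y = {(a, b) for (a, b) in outcomes if b <= 4}   # die #2 is 4 or less

print(prob(X))       # 1/3
print(prob(Y))       # 2/3
print(prob(X & Y))   # 2/9
assert prob(X & Y) == prob(X) * prob(Y)   # independence: P(X AND Y) = P(X)·P(Y)
```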
To summarize, this lecture was about two logical operations on events, OR (∪) and AND (∩).
We have shown that the OR of mutually exclusive events behaves like addition, so we are justified in replacing the sign ∪ between two mutually exclusive events with the addition sign +:
P(X+Y)=P(X)+P(Y)
and the AND of independent events behaves like multiplication, so we are justified in replacing the sign ∩ between two independent events with the multiplication sign ·:
P(X·Y)=P(X)·P(Y)
Finally, using the operation of intersection, we can express the probability of a union between any two (not necessarily mutually exclusive) events as
P(X∪Y)=P(X)+P(Y)−P(X∩Y).
Indeed, in summing the probability measures of events X and Y, we counted the probabilities of all elementary events common to both of them twice. That's why we subtract the probability measure of their intersection.
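A final sketch (my own illustration, using two overlapping events on the same pair of dice) checks this inclusion-exclusion formula:

```python
from fractions import Fraction
from itertools import product

outcomes = set(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(len(event), len(outcomes))

# Two events that are NOT mutually exclusive: they share 5/5, 5/6, 6/4, 6/5, 6/6.
X = {(a, b) for (a, b) in outcomes if a >= 5}        # die #1 is 5 or 6
Y = {(a, b) for (a, b) in outcomes if a + b >= 10}   # sum is at least 10

assert prob(X | Y) == prob(X) + prob(Y) - prob(X & Y)
print(prob(X | Y))   # 13/36
```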