Last week, I provided a recap of what we have covered so far in Uncertainty Wednesday. To go on from here we need to introduce some more concepts that are usually covered earlier but that I believe will make more sense in the framework that we have now established. The first one of these is the concept of independence. Two events are said to be independent if “the occurrence of one does not affect the probability of occurrence of the other.”
Now is a good time to remember that in our simplest model we have four elementary events AH, AL, BH and BL which represent all the possible combinations of the world being in either state A or state B and us receiving either signal H or signal L. We then figured out the probability of events, e.g. P(B), in terms of the underlying probabilities of the elementary events, e.g. P({BH}) and P({BL}). From there we went on to define the concept of a conditional probability, such as
P(B | H) = P({BH}) / P(H)
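This is easy to compute directly. Here is a small sketch of the calculation in Python, using hypothetical numbers for the four elementary event probabilities (any four non-negative numbers summing to 1 would do):

```python
# Hypothetical probabilities for the four elementary events (must sum to 1).
p = {"AH": 0.35, "AL": 0.15, "BH": 0.20, "BL": 0.30}

p_B = p["BH"] + p["BL"]      # P(B) = P({BH}) + P({BL})
p_H = p["AH"] + p["BH"]      # P(H) = P({AH}) + P({BH})
p_B_given_H = p["BH"] / p_H  # P(B | H) = P({BH}) / P(H)

print(p_B, p_H, p_B_given_H)
```

With these particular numbers P(B | H) ≈ 0.364 while P(B) = 0.5, so observing signal H does change our assessment of state B: these events are not independent.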
So if we look at the definition of independence above, what we are looking for is a situation where observing the signal does not tell us anything about the state of the world. Expressed as a formula, we are looking for a situation where
P(B | H) = P(B)
Now we know that P(H) = P({AH}) + P({BH}) and so what we are looking for is
P(B | H) = P({BH}) / [P({AH}) + P({BH})] = P(B)
Let’s remind ourselves what P({BH}) means. It is the elementary event where the world is in state B *AND* we receive signal H. Another way of writing that is as follows
P({BH}) = P(B ∩ H)
where ∩ denotes intersection. Why? Remember that events such as B and H are sets of elementary events, in particular
B = {BH, BL} and H = {AH, BH}
so B ∩ H = {BH}
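Since events are just sets of elementary events, this intersection can be checked directly with Python's built-in set type:

```python
B = {"BH", "BL"}  # event: the world is in state B
H = {"AH", "BH"}  # event: we receive signal H

print(B & H)  # → {'BH'}, the single elementary event in both sets
```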
With that we can re-write the above condition for independence as follows
P(B | H) = P(B ∩ H) / [P(A ∩ H) + P(B ∩ H)] = P(B)
Now what should P(B ∩ H) be in terms of P(B) and P(H) to make this hold?
Let’s consider P(B ∩ H) = P(B) * P(H), and similarly P(A ∩ H) = P(A) * P(H). Substituting both into the expression above gives
P(B | H) = P(B) * P(H) / [P(A) * P(H) + P(B) * P(H)] = P(B) / [P(A) + P(B)] = P(B)
The last step here is simply the result of P(A) + P(B) = 1 by definition. The way we set up the problem, the world is either in state A or in state B, so the probabilities of the two states must sum to 1.
While this definitely doesn’t pass as a rigorous proof, what we see is that in our 2 state and 2 signal value world the following two conditions are equivalent and each marks independence:
Unconditional probability = conditional probability
and
Probability of any state + signal combination = product of probabilities
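We can check this equivalence numerically. The sketch below (with hypothetical marginal probabilities) constructs the elementary event probabilities from the product condition and then verifies that the conditional probability equals the unconditional one:

```python
# Hypothetical marginals; everything else is derived from them.
p_A, p_B = 0.6, 0.4  # P(A) + P(B) = 1
p_H, p_L = 0.7, 0.3  # P(H) + P(L) = 1

# Impose the product condition on every state + signal combination.
p = {
    "AH": p_A * p_H, "AL": p_A * p_L,
    "BH": p_B * p_H, "BL": p_B * p_L,
}

# P(B | H) = P({BH}) / [P({AH}) + P({BH})]
p_B_given_H = p["BH"] / (p["AH"] + p["BH"])
print(p_B_given_H)  # matches P(B) up to floating-point rounding
```

Changing the marginals does not matter: as long as every state + signal probability is the product of the corresponding marginals, the conditional and unconditional probabilities coincide.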
More to come on independence next week. For now you should ask yourself based on the above whether assuming that two events are independent is imposing a lot of constraints or few constraints. Or put differently, will we encounter a lot of independent situations or few independent situations in the world?
Albert Wenger