Last Uncertainty Wednesday, I introduced the concept of independence. I ended the post by asking whether independence was likely to be a strong or a weak assumption. To answer this, let us look at the possible values for elementary events. For that purpose I will show a different way of looking at them, one I should probably have introduced long ago, when I first presented the 2 states, 2 signals model, defined probability, and related the probability of elementary events to compound events. I was lazy at the time because this involves a diagram, but it will be helpful as we go forward:

We show the states of the world and the signal values as a 2x2 matrix. Each value inside the matrix is the probability of an elementary event such as P({AH}), meaning the world is in state A and we receive signal H. In the margins we indicate how the probabilities of compound events such as P(A) and P(H) are the sums of the probabilities of elementary events.
Now let’s approach the problem from the outside in, meaning let’s start with the probabilities for states and signal values and work towards the elementary events. For the states we have P(A) and P(B), for which the diagram introduces the shorthand p = P(A), which means that P(B) = 1 - p because there are only two states and hence it must be that P(A) + P(B) = 1. Similarly, for the signal values we have P(L) and P(H), where we will use the shorthand q = P(L), so that P(H) = 1 - q.
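As a minimal sketch of the matrix-with-margins view (the four probabilities below are hypothetical values chosen purely for illustration), the compound-event probabilities fall out as row and column sums:

```python
# 2x2 matrix of elementary-event probabilities (rows = states A, B;
# columns = signal values L, H). These numbers are illustrative, not
# from the post.
matrix = {
    ("A", "L"): 0.10, ("A", "H"): 0.20,  # P({AL}), P({AH})
    ("B", "L"): 0.30, ("B", "H"): 0.40,  # P({BL}), P({BH})
}

# Compound-event probabilities are the margins (row / column sums):
P_A = matrix[("A", "L")] + matrix[("A", "H")]  # p = P(A)
P_L = matrix[("A", "L")] + matrix[("B", "L")]  # q = P(L)
print(P_A, P_L)
```

Going outside in simply reverses this direction: we fix the margins p and q and ask which entries inside the matrix are compatible with them.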
The condition for independence can now be expressed as follows:
P({AL}) = a = p * q
P({AH}) = b = p * (1 - q)
P({BL}) = c = (1 - p) * q
P({BH}) = d = (1 - p) * (1 - q)
What this says is that under independence the probabilities for the elementary events are perfectly determined by the probabilities of the compound events. Or put differently, there is exactly 1 matrix which meets the independence requirement.
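As a quick check, here is a sketch of that unique matrix, using illustrative values p = 0.3 and q = 0.4 (my assumptions, not from the post):

```python
# The unique matrix of elementary-event probabilities under
# independence, given the margins p = P(A) and q = P(L).
p = 0.3  # P(A), so P(B) = 1 - p
q = 0.4  # P(L), so P(H) = 1 - q

a = p * q              # P({AL})
b = p * (1 - q)        # P({AH})
c = (1 - p) * q        # P({BL})
d = (1 - p) * (1 - q)  # P({BH})

# The margins recover the compound-event probabilities:
assert abs((a + b) - p) < 1e-12  # P(A) = P({AL}) + P({AH})
assert abs((a + c) - q) < 1e-12  # P(L) = P({AL}) + P({BL})
assert abs((a + b + c + d) - 1.0) < 1e-12
print(a, b, c, d)
```

Once p and q are fixed, there is no freedom left: all four entries are pinned down.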
Now let’s look at the matrix without the independence requirement and let’s assume that p ≤ q and that p + q ≤ 1 (you can convince yourself separately that these assumptions don’t impose any real constraints on the analysis that’s about to follow because we could always swap p with q as well as p with 1-p). With these assumptions, consider the following choices
0 ≤ a = P({AL}) ≤ p
P({AH}) = b = p - a
P({BL}) = c = q - a
P({BH}) = d = 1 - p - c = 1 - p - q + a
This means that there is an (uncountable) infinity of matrices for any given values of p = P(A) and q = P(L) that do *NOT* meet the independence requirement.
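The one-parameter family above can be sketched directly. In this illustration (p, q, and the sampled values of a are my assumptions) every choice of a in [0, p] reproduces the same margins, but only a = p * q gives independence:

```python
# One-parameter family of matrices sharing the margins p = P(A) and
# q = P(L). The values of p, q, and the sampled a's are illustrative.
p, q = 0.3, 0.4  # satisfies p <= q and p + q <= 1

flags = []
for a in [0.0, 0.05, 0.12, 0.2, 0.3]:  # any a in [0, p] is a valid choice
    b = p - a            # P({AH})
    c = q - a            # P({BL})
    d = 1 - p - q + a    # P({BH})
    # All four entries are valid probabilities and the margins hold
    # for every choice of a:
    assert min(a, b, c, d) >= 0
    assert abs((a + b) - p) < 1e-12 and abs((a + c) - q) < 1e-12
    # Independence holds only when a happens to equal p * q:
    flags.append(abs(a - p * q) < 1e-12)

print(flags)  # exactly one sampled a satisfies independence
```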
You are going to want to make sure you really understand this before we move on. There is exactly 1 matrix that gives independence and there is an infinity of matrices without independence.
What does this mean? Well if you have a system and there is a signal that seems to be emanating from the system, then it is basically impossible for that signal not to be telling you something about the system. Or put differently independence is an insanely strong assumption.
Next week we will look at independence one more time, in the context in which it is usually first introduced, such as the repeated flip of a coin, roll of a die, or draw of a card from a deck.