Last week in Uncertainty Wednesday, I introduced Shannon entropy as a measure of uncertainty that is based solely on the structure of the probability distribution. As a quick reminder, the formula is H = −K ∑ pi log pi, where the sum runs over the outcomes i = 1…n of the probability distribution. Now you may notice a potential problem here: if the distribution includes a probability p that approaches 0, then log p goes to negative infinity. If you know limits and remember L'Hôpital's rule, though, you can convince yourself that the product p log p nonetheless goes to 0 as p → 0, so such terms contribute nothing to the sum.
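To make this concrete, here is a small sketch in Python (the function name `shannon_entropy` is mine, not from the original). It computes H with K = 1 and log base 2, and handles the p = 0 case by the usual convention that 0 · log 0 = 0, which the limit above justifies:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), with the convention 0*log(0) = 0."""
    # Skipping p == 0 terms implements the limit p*log(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximum uncertainty over two outcomes, H = 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A certain outcome: the p = 0 term drops out and H = 0 bits.
print(shannon_entropy([1.0, 0.0]))   # 0.0

# The product p*log2(p) itself shrinks toward 0 as p gets small:
for p in (0.1, 0.01, 0.001):
    print(p, p * math.log2(p))
```

Note that `-p * log2(p)` is not monotone in p: it first grows as p falls from 1, peaks, and only then shrinks toward 0, which is why the limit argument is needed at all.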