Last time in Uncertainty Wednesdays, I introduced continuous random variables and gave an example of a bunch of random variables following a Normal Distribution.

Now in the picture you can see two values, denoted as μ and σ^2, for the different colored probability density functions. These are the two parameters that completely define a normally distributed random variable: μ is the Expected Value and σ^2 is the Variance.
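To see how just these two parameters pin down the whole distribution, here is a minimal sketch (not from the original post) that evaluates the normal probability density directly from μ and σ^2 using the standard formula:

```python
import math

def normal_pdf(x, mu, sigma2):
    """Probability density of a Normal(mu, sigma2) random variable at x.

    The entire curve is determined by just two numbers:
    mu (the Expected Value) and sigma2 (the Variance).
    """
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# The density peaks at x = mu; a larger sigma2 gives a lower, wider peak.
print(normal_pdf(0, 0, 1))  # standard normal at its peak
print(normal_pdf(0, 0, 4))  # same mean, larger variance: flatter curve
```

Once you fix μ and σ^2, every value of the density is forced; there is nothing else left to choose.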
This is incredibly important to understand. All normally distributed random variables only have 2 free parameters. What do I mean by “free” parameters? We will give this more precision over time, but basically for now think of it as follows: a given Expected Value and Variance completely define a normally distributed Random Variable. So even though these random variables can take on an infinity of values, the probability distribution across these values is very tightly constrained.
Contrast this with a discrete random variable X with four possible values x1, x2, x3 and x4. Here the probability distribution p1, p2, p3, p4 has the constraint that p1 + p2 + p3 + p4 = 1 where pi = Prob(X = xi). That means there are 3 degrees of freedom, because the fourth probability is determined by the first three. Still, that is one more degree of freedom than for the Normal Distribution, despite having only four possible outcomes (instead of an infinity).
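A quick sketch (my own illustration, not from the post) makes the counting concrete: pick p1, p2, p3 freely, and the constraint forces p4.

```python
# A four-outcome discrete distribution has three free parameters:
# p1, p2, p3 can be chosen freely (non-negative, summing to at most 1),
# and p4 is then forced by the constraint p1 + p2 + p3 + p4 = 1.
def complete_distribution(p1, p2, p3):
    p4 = 1.0 - (p1 + p2 + p3)
    assert p4 >= -1e-12, "the free choices must leave room for p4"
    return [p1, p2, p3, max(p4, 0.0)]

# Choosing the first three probabilities determines the fourth (about 0.4 here).
print(complete_distribution(0.1, 0.2, 0.3))
```

So a normal random variable, with infinitely many possible values, is pinned down by two numbers, while even this tiny four-outcome variable needs three.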
Why does this matter? Assuming that something is normally distributed provides a super tight constraint. This should remind you of the discussion we had around independence. There we saw that assuming independence is actually a very strong assumption. Similarly, assuming that something is normally distributed is a strong constraint because it means there are only two free parameters characterizing the entire probability distribution.