While on family vacation in Scotland at the end of August, I read James Gleick’s “The Information.” If you have limited reading time and are looking for something practical or applied, this is not the book for you. But if you have an interest in some of the foundations of not just computer science, but also physics and really all of life, “The Information” provides an excellent overview. I would also recommend the book to people who already have fairly in-depth knowledge of the subject but enjoy getting more historical background and maybe finding some connections that they had missed.
For instance, there is some great background on Shannon and Turing meeting in New York during World War II. The quality of the exposition is mostly quite good, except when it comes to entanglement and quantum computing, which both wind up being more confusing than enlightening. Overall, though, I came away with not only a better appreciation of the pervasive role of information but also a renewed desire to learn even more.
One of the chapters that was particularly motivating for me was on measuring randomness - a topic of great interest to me and something I will dig into more. To leave you with just one brain teaser from that: are the digits of pi random? We can calculate pi to arbitrary precision using a relatively simple algorithm, yet statistical methods give us no way to predict the next digit no matter how many digits we have already seen. If you find that as intriguing as I do, you should definitely read “The Information.”
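To give a concrete sense of what “a relatively simple algorithm” means here, below is a minimal sketch (my own illustration, not something from the book) of Gibbons’ streaming spigot algorithm in Python, which produces the decimal digits of pi one at a time using only integer arithmetic:

```python
from itertools import islice


def pi_digits():
    """Yield the decimal digits of pi one at a time (Gibbons' streaming spigot).

    Uses only exact integer arithmetic, so it can keep producing digits
    indefinitely, limited only by time and memory.
    """
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            # The next digit is settled; emit it and rescale the state.
            yield n
            q, r, t, k, n, l = (
                10 * q,
                10 * (r - n * t),
                t,
                k,
                (10 * (3 * q + r)) // t - 10 * n,
                l,
            )
        else:
            # Consume another term of the underlying series to narrow the estimate.
            q, r, t, k, n, l = (
                q * k,
                (2 * q + r) * l,
                t * l,
                k + 1,
                (q * (7 * k + 2) + r * l) // (t * l),
                l + 2,
            )


# Print the first 20 digits: 31415926535897932384
print("".join(str(d) for d in islice(pi_digits(), 20)))
```

The striking contrast is that a program this short pins down every digit with certainty, while no amount of statistical analysis of the digits already produced helps you guess the next one.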