In the future we can’t have privacy for the same reason that the record labels and Hollywood can’t have DRM, and that is ultimately a good thing. DRM has proven a fool’s errand because it is not compatible with general-purpose computing. At some point a song or movie has to be decrypted in order to be played back, and short of tamper-proof and “trusted” hardware it can be digitally copied at that moment (and even with all of that it could be re-recorded or re-filmed during playback). That is by now reasonably well understood, even by Hollywood and the music industry.
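The structural problem is easy to see in code. A sketch (a toy XOR stand-in for real DRM decryption, purely illustrative): any player must hold the decrypted media in memory to render it, and nothing prevents that same buffer from being copied.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for the DRM decryption step (not a real cipher)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def play(encrypted: bytes, key: bytes) -> bytes:
    # Decryption is unavoidable at playback time...
    plaintext = xor_cipher(encrypted, key)
    # ...and nothing stops the player from writing the same bytes elsewhere.
    ripped_copy = bytes(plaintext)
    return ripped_copy

song = b"some audio samples"
key = b"secret"
protected = xor_cipher(song, key)
assert play(protected, key) == song  # the "protected" content, in the clear
```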
Yet we seem to be making the same mistake when it comes to our personal information. Too many of us, including well-meaning “privacy advocates” and governments (especially in the EU), want our personal data to sit encrypted somewhere, want to control who can access it and when, and want some unforgeable record of how the data was accessed and changed over time. That is as big a fool’s errand as DRM.
Unfortunately even Larry Lessig, whom I greatly respect, seems to believe in this possibility. In a recent piece he writes:
> But trust and verify, with high-quality encryption, could. And there are companies, such as Palantir, developing technologies that could give us, and more importantly, reviewing courts, a very high level of confidence that data collected or surveilled was not collected or used in an improper way. Think of it as a massive audit log, recording how and who used what data for what purpose. We could code the Net in a string of obvious ways to give us even better privacy, while also enabling better security.
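To be clear about what is being proposed, the “massive audit log” Lessig describes presumably means something like a tamper-evident hash chain, where each entry commits to the previous one so that rewriting history invalidates every later hash. A minimal sketch (the field names and roles are illustrative, not any real system’s API):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log, who, what, purpose):
    """Append a log entry whose hash covers the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    entry = {"who": who, "what": what, "purpose": purpose, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return log

def verify(log):
    """Recompute every hash; any edited entry breaks the chain."""
    prev = GENESIS
    for entry in log:
        if entry["prev"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if digest != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, "analyst-7", "record 1234", "fraud review")
append_entry(log, "court-1", "record 1234", "audit")
assert verify(log)
log[0]["purpose"] = "curiosity"  # tampering is detectable...
assert not verify(log)
```

Note what the sketch does and does not prove: it detects edits to the log itself, but only if every system that touches the data faithfully writes to it in the first place.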
This is, I am sorry to say, a fantasy. There are simply too many subsystems and intermediate components involved, many of which have the data in the clear out of necessity (including keyboards and screens). Most keyboards and screens are already quite sophisticated; adding a bit of circuitry that sends the information elsewhere over a wireless connection would be easy.
Also, all encryption relies on keys. And those keys too have to be stored somewhere, in places that cannot ultimately be trusted because you don’t know what is in the silicon they run on. Are you really going to store your private keys on a USB stick you keep around your neck? That carries way too much risk of losing everything, and even the stick itself could be designed to leak your keys. Of course most people keep their keys online, encrypted and protected with a password. But wait: you type that password on a keyboard, which runs on, well, you really have no idea what. There is always a hole at the end that you *cannot* close. This is the nature of information and computing.
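The key problem can be made concrete. A sketch (using the standard library’s PBKDF2 and a toy XOR cipher purely for illustration): the password exists in the clear at the keyboard, and at use time the derived key and the decrypted secret are both back in ordinary process memory, on hardware you cannot inspect.

```python
import hashlib

password = "correct horse battery staple"  # in the clear at the keyboard

# Derive an encryption key from the password (PBKDF2, Python stdlib).
key = hashlib.pbkdf2_hmac("sha256", password.encode(), b"salt", 100_000)

def xor(data: bytes, key: bytes) -> bytes:
    """Toy cipher for illustration only; not secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = b"private key material"
stored = xor(secret, key)  # what sits "safely" encrypted at rest

# At use time the key and the plaintext are both in memory again:
# the hole at the end of the chain.
assert xor(stored, key) == secret
```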
It is only a question of time before some highly encrypted database of millions of medical records gets leaked or stolen, keys and all, showing just how big a charade this is. We will have incurred all this cost, and in the end it will turn out to have been for naught. The music and movie industries have been learning this lesson the hard way for decades now, and yet as a society we seem doomed to repeat it.
There is only one way forward. Start constructing a society where it doesn’t matter that your personal medical record was just put online (by you or someone else). Or that your song was copied by a million people. That is the real challenge for this and many generations to come.