
My book is called The World After Capital because capital is no longer our binding constraint. We have sufficient physical capital to provide for our needs. Instead, the defining scarcity of our time is attention. Much of our attention is trapped in the job loop of work and consumption. And what remains is being sucked up by systems that algorithmically optimize for grabbing as much of it as possible. The result is an attention crisis in which important collective and individual problems go unresolved, with dire consequences. We see this in runaway global warming but also in the widespread deterioration of mental health. The goal of my proposals in the book is to free up human attention to address these issues.
One possible counterargument, though, is that we will soon have unlimited artificial attention. AI will give us machines with more knowledge than any human has ever possessed, and thus attention will no longer be scarce. This is true, but it brings the alignment problem into sharp focus. What will all this machine attention be directed at? Will it be human flourishing? And for what definition of flourishing? We are building a genie, and our stories are full of examples of wish fulfillment by genies going terribly awry. Take global warming. Yes, machines can pay attention to it, but they might rightly conclude that humans are the cause of it and of the extinction of many other species. What exactly might the conclusion drawn from that insight look like? Or take the crisis of meaning that underpins so many of the current mental health issues. An intelligent machine might look at this and conclude that humans are better off drugged, or that they need a new religion with AI as its god.
If we want to solve human attention scarcity with machines, we must solve the alignment problem. And of course herein lies the great irony: we are not paying enough attention to alignment. Instead we find ourselves propelled into a full-on rush toward artificial superintelligence, fueled by the same economic and cultural systems that are causing attention scarcity in the first place. Markets are allocating vast amounts of capital to this race because of the perceived payoff to the winner. And our geopolitical thinking is still stuck in the industrial age, where progress is seen as a race between nations.
On our present course and speed we may well achieve abundant artificial attention, but instead of solving our problems it is more likely to add to them. There is a parallel here to the end of the Agrarian Age. That age was marked by frequent warfare between nations. When industrial capabilities first came along, they were not seen as requiring a new age but were instead harnessed to the old model. That ultimately resulted in not one but two world wars. We are repeating this mistake now. Instead of setting out to invent a new age, we are aggravating the problems of the existing system. I continue to be optimistic about where we can ultimately get to with these new capabilities, but for now it looks like things will get a lot worse first (including the potential for catastrophic outcomes).

The new age might be organized at a more atomic level, around individuals. In the sweep of history, we've organized into companies, nations, gangs. If we can coordinate groups for new purposes, perhaps we'll have a different outcome. Just because we can organize and coordinate in new ways doesn't mean we will, though. DAOs are interesting, but I'm not seeing a breakthrough there yet. If there's a kid of low socioeconomic status who is discouraged about their relative power and prospects, how does this new world offer a better and easier-to-see outcome than joining a gang or a company or a country? Maybe another dynamic is pro basketball salaries. As a group, players reject pooling salaries because each person wants to be in the top 1% of earners. They choose to have that economic inequality. Maybe we all do, collectively.
Blog post: Abundant Artificial Attention? https://continuations.com/abundant-artificial-attention