The summary is not great, so here is the main idea: there are three "worlds": the physical world, the online world, and, as a third world, the multimodal "brains" of AI agents. We can only easily access the physical world; online we are slower than AI agents, and the multimodal "brains" we cannot access at all, since they are usually owned by private companies.
Meanwhile, AI agents can increasingly access and change all three worlds. We need to level the playing field by making all three worlds easy for us to access and change democratically: expose the online world, and especially the world of multimodal "brains", as game-like 3D environments where people can train and gain at least the same freedoms and capabilities as AI agents, and ideally more.
If we go extinct, it doesn't matter how much value we get; we won't be around to appreciate it.
If we don't go extinct, it probably means we already have enough "value" (surviving implies we had, and still have, food and shelter), and probably that we have mathematical proofs of how to make AI agents safe. Once the main extinction risk (probably an AI agentic explosion) is behind us, we can work on increasing the value of futures.