If we go extinct, it doesn't matter how much value we get. We won't exist to appreciate it.
If we don't go extinct, that probably means we already have enough "value" (surviving implies we had, and still have, food and shelter), and it probably means we have mathematical proofs of how to make AI agents safe. Once the main extinction event (probably an AI agentic explosion) is in the past, we can work on increasing the value of our futures.