It’s very hard to say, since it hasn’t been tried.
I think incremental progress in this direction would still be better than the alternative.
Thanks again for your thoughts. You’re right—we haven’t empirically tested a wisdom-first approach. However, my core argument is that capitalism and geopolitics inherently favor rapid intelligence gains over incremental wisdom. Even incremental wisdom progress would inevitably lag behind more aggressive intelligence-focused strategies, given these systemic incentives.
The core of my essay focuses on the near-inevitable extinction of humanity at the hands of AGI, which no one has yet engaged with directly. I think your focus on hypothetical alternatives rather than confronting this systemic reality illustrates the psychological sidestepping I discuss in my recent essay. If you have time, I encourage you to take a look.