Indestructibly optimistic.
Dem0sthenes
It’s perhaps not that surprising that this specific counterpoint, calling for a ton more analysis, comes from a student still in academia / undergrad…
Are we already past the precipice? Would we know if that was the case? I really wonder. Things seem to be spiraling. Many of today’s conditions mirror the malaise and misery of the 1930s, but the capacity for destruction is orders of magnitude higher. I wonder if we need to act with much greater urgency and boldness to be more proactive about x-risk…
Yes that makes sense and aligns with my thinking as well. Do you have a sense of how much the EA community gives to AI vs nuclear vs bioweapon existential risks? Or how to go about figuring that out?
Where is the evidence for this claim? This all seems like reason and words :P
Isn’t the greatest refuge what Elon is planning for Mars? That’s a separate track, though; we should be working to expedite it and ensure that civilizational plan B is operational ASAP.
Hi Stephen! Thanks for the post. What are the typical frameworks that you use to think about existential threats? Sometimes, for instance, we use probabilities to describe the chance of, say, nuclear Armageddon, though that seems a bit off from a frequentist philosophical perspective. That type of event either happens or it doesn’t. We can’t run 100 high-fidelity Earth simulations, count the various outcomes, and then calculate the probability of various catastrophes. I work with data in my day job, so these types of questions are top of mind.
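To make the point concrete, here’s a toy sketch (in Python, since I work with data) of the kind of repeated-trial counting a frequentist reading assumes, which is exactly what we can’t do for a one-off event. The annual hazard rate is a made-up illustrative number, not an estimate of anything:

```python
import random

# Purely illustrative: a made-up 0.5% annual chance of catastrophe,
# NOT an actual estimate of nuclear (or any other) risk.
ANNUAL_HAZARD = 0.005
YEARS = 50
N_WORLDS = 100_000  # the "many Earths" we can't actually run

def one_world() -> bool:
    """Simulate one hypothetical 50-year trajectory; True if catastrophe occurs."""
    return any(random.random() < ANNUAL_HAZARD for _ in range(YEARS))

# Frequentist-style estimate: count outcomes across many repeated trials.
catastrophes = sum(one_world() for _ in range(N_WORLDS))
print(f"Estimated P(catastrophe within {YEARS} years): {catastrophes / N_WORLDS:.3f}")
```

In reality we only ever observe one trajectory, so any number we attach to such an event ends up being a degree of belief rather than a counted frequency. Curious how you think about that.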
The intra-EA jargon is strong with this one, young padawan(s).