One reason I might be finding this post uncomfortable is that I’m pretty concerned about the mental health of many young EAs. Frankly, for some people I’ve met, I’m more worried about them dying from suicide or risky activities over the next decade than from x-risks. Unfortunately, I also suspect there is a link between being very focused on death by x-risk and poor mental health. This is an intuition, nothing more.
I share this concern, and it was my biggest hesitation about making this post. I’m open to the argument that the post was net negative because of that.
If you’re experiencing something like existential dread, I’ll flag that the numbers in this post are actually fairly low in the grand scheme of total risks to you over your life: 3.7% just isn’t that high. Dying young just isn’t that likely.
You know, even disregarding AI, I’d never have thought that I had a ~5% chance of dying in the next 30 years. It’s frightening.
I wouldn’t take this as bearing in any way on the matter you replied to, though.
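For anyone wondering how a figure like ~5% over 30 years arises: it’s roughly what you get by compounding a modest annual mortality rate year after year. A minimal back-of-the-envelope sketch follows; the flat ~0.17% annual rate is an assumed round number for illustration, not a figure from the post (real annual rates come from actuarial life tables and vary a lot with age).

```python
def cumulative_risk(annual_q: float, years: int) -> float:
    """Chance of dying at least once over `years`, assuming a constant
    annual probability of death `annual_q` and independence across years."""
    return 1 - (1 - annual_q) ** years

# An assumed flat annual mortality of ~0.17% compounds to roughly 5% over 30 years.
print(cumulative_risk(0.0017, 30))  # ~0.0498
```

The point is just that small per-year risks add up over decades, which is why a 30-year figure can look surprisingly large even when each individual year feels safe.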