A request to keep pessimistic AI posts actionable.
There’s a psychological phenomenon, whose name escapes me, in which a person subconsciously tries to make others around them feel stressed and anxious in order to mitigate their own stress and anxiety.
I see a lot of this in mainstream climate change reporting, and I’m starting to notice it more on this forum with regard to AI x-risk.
Put simply: I find posts with titles like “We’re All Gonna Die with Eliezer Yudkowsky” extremely tough emotionally, and they make me use the forum less. I suspect I’m not the only one.
Obviously, talking about significant x-risks is going to be stressful. I do not support people self-censoring when trying to provide realistic appraisals of our current situation; that seems clearly counterproductive. I also understand that the stressful nature of dealing with x-risk means some people will find it too mentally tough to contribute.
At the same time, there are emotional wins to be had, and avoiding the psychological phenomenon I mentioned at the start seems like one of them. I think a decent heuristic for doing so is asking “What action am I asking readers to take as a result of this information?” and making sure you have a good answer.
Sticking with the Eliezer theme, his letter in Time performs well by this metric: emotionally harrowing, but with a clear call to support certain political initiatives.
In summary: AI x-risk is emotionally tough enough already, and I think some effort to avoid unnecessarily amplifying that difficulty is a valuable use of forum authors’ time. I would certainly appreciate it as a user!