I’d be excited about posts that argued “I think EAs are overestimating AI x-risk, and here are some aspects of EA culture/decision-making that might be contributing to this.”
I’m less excited about posts that say “X thing going on in EA is bad”, where X is a specific decision that EAs made [based on their estimate of AI x-risk]. (Unless the post is explicitly about AI x-risk estimates.)
Related: Is that your true rejection?