The increasing focus on Longtermism and X-risk has made us look cultish and unrelatable.
It was much harder for people to criticise EA as cultish when we were mainly about keeping poor people from starving or dying of preventable disease, because everyone can see immediately that those are worthy goals. X-risk and Longtermism don't make the same intuitive sense to people, so they dismiss the movement as weird and wrong.
We should lean back towards focusing on global development
I agree with paragraph 1 and 2 and disagree with paragraph 3 :)
That is: I agree longtermism and x-risk are much more difficult to introduce to the general population. They're substantially farther from the status quo and have weirder and more counterintuitive implications.
However, we don't choose what to talk about by how palatable it is. We must be guided by what's true, and what's most important. Unfortunately, we live in a world where what's palatable and what's true need not align.
To be clear, if you think global development is more important than x-risk, it makes sense to suggest that we should focus that way instead. But if you think x-risk is more important, the fact that global development is less "weird" is not enough reason to lean back that way.
I suspect that how weird and cultish it looks to the average person varies within the domain of x-risk focused work. I think both A.I. risk work and a generic "reduce extinction risk" framing will look more "religious" to the average person than "we are worried about pandemics and nuclear wars."