I agree. EA has a cost-effectiveness problem that conflicts with its attempts at truth-seeking: its main driving force is cost-effectiveness above all else, even above truth itself.
EA is highly incentivised to create and spread apocalyptic doom narratives, because such narratives are good at recruiting people to EA's "let's work to decrease the probability of apocalyptic doom (because that has lots of expected value given future population projections)" cause area. And funding-wise, EA community funding (at least in the UK) is pretty much entirely about getting more people to work in these areas.
EA is also populated by the kinds of people who respond to apocalyptic doom narratives, for the basic reason that if they didn't they wouldn't have ended up in EA. So stuff that promotes these narratives does well in EA's attention economy.
EA just doesn't have anywhere near as much £$€ to spend as academia does. It's also very interested in doing stuff and willing to tolerate errors as long as the stuff gets done. Therefore, its academic standards are far lower.
I really don't know how you'd fix this. I don't think research into catastrophic risks should be conducted on a shoestring budget by a pseudoreligion/citizen-science community. I think it should be government funded and probably sit within the wider defence and security portfolio.
However, I'll give EA some grace for essentially being a citizen-science community, for the same reason I don't waste effort grumping about the statistical errors made by participants in the Big Garden Birdwatch.