I really don’t like the trend of posts saying that “EA/EAs need to | should do X or Y”.
EA is about cost-benefit analysis. The words “need” and “should” imply binaries/absolutes and very high confidence.
I’m sure there are thousands of interventions/measures that would be positive-EV for EA to engage with. I don’t want to see thousands of posts loudly declaring “EA MUST ENACT MEASURE X” and “EAs SHOULD ALL DO THING Y,” in cases where these mostly seem like un-vetted interesting ideas.
In almost all cases where I see the phrase, I think it would be much better replaced with things like:
“Doing X would be high-EV”
“X could be very good for EA”
“Y: Costs and Benefits” (with the post arguing that the benefits are worth it)
“Benefits/Upsides of X” (if you think the upsides are particularly underrepresented)
I think it’s probably fine to use the word “need” either when it’s paired with an outcome (EA needs to do more outreach to become more popular) or when the issue is fairly clearly existential (the US needs to ensure that nuclear risk is low). It’s also fine to use “should” in the right context, but it’s not a word to overuse.
See also EA should taboo “EA should”
Related (and classic) post in case others aren’t aware: EA should taboo “EA should”.
Lizka makes a slightly different argument, but reaches a similar conclusion.
Strong disagree. If the proponent of an intervention or cause area believes that advancing it is so high-EV that it would be very imprudent for EA resources not to advance it, they should use strong language.
I think EAs are too eager to hedge their language and use weak language regarding promising ideas.
For example, I have no compunction about saying that EA needs to advance Profit for Good (companies with charities holding the vast majority of shares), in that I believe not doing so results in an ocean less counterfactual funding for effective charities, and consequently a significantly worse world.
https://forum.effectivealtruism.org/posts/WMiGwDoqEyswaE6hN/making-trillions-for-effective-charities-through-the
What about social norms, like “EA should encourage people to take care of their mental health even if it means they have less short-term impact”?
Good question.
First, I have a different issue with that phrase: it’s not clear what “EA” is. To me, EA doesn’t seem like an agent. You can say “...CEA should” or “...OP should”.
Normally, I prefer that one say “I think X should”. There are some contexts, specifically small ones (talking to a few people, where it’s clearly conversational), in which saying “X should do Y” clearly means “I feel like X should do Y, but I’m not sure”. And there are some contexts where it means “I’m extremely confident X should do Y”.
For example, there’s a big difference between saying “X should do Y” to a small group of friends, when discussing uncertain claims, and writing a mass-market book titled “X should do Y”.
I haven’t noticed this trend, could you list a couple of articles like this? Or even DM me if you’re not comfortable listing them here.
I recently noticed it here:
https://forum.effectivealtruism.org/posts/WJGsb3yyNprAsDNBd/ea-orgs-need-to-tabletop-more
Looking back, it seems like there haven’t been many very recently. Historically, there have been some:
EA needs consultancies
EA needs to understand its “failures” better
EA needs more humor
EA needs Life-Veterans and “Less Smart” people
EA needs outsiders with a greater diversity of skills
EA needs a hiring agency and Nonlinear will fund you to start one
EA needs a cause prioritization journal
Why EA needs to be more conservative
Looking above, many of those seem like “nice-to-haves”. The word “need” seems over-the-top to me.
There are a couple of strong “shoulds” in the EA Handbook (I went through it over the last two months as part of an EA Virtual Program), and they stood out to me as the most disagreeable part of the EA philosophy that was presented.