Pretty much everyone starts off drinking milk, and while adult consumption varies culturally, genetically, and ethically, if I put milk on my morning bran flakes that's a neutral choice around here. If my breakfast came up in conversation with a friend they might think it was dull, but they wouldn't be surprised or confused. Some parts of effective altruism are like this: giving money to very poor people is, to nearly everyone, intuitively and obviously good.
Most of EA, however, is more like cheese. If you've never heard of cheese it seems strange and maybe not so good, but at least in the US most people are familiar with the basic idea. Distributing bednets or deworming medication, improving the treatment of animals, developing vaccines, or trying to reduce the risk of nuclear war are mild cheeses like Cheddar or Mozzarella: people will typically think "that seems good" if you tell them about it, and if they don't it usually doesn't take long to explain.
In general, work that anyone can see is really valuable is more likely to already be getting the attention it needs. This means that people who are looking hard for what most needs doing are often going to be exploring approaches that are not obvious, or that initially look bizarre. Pursuit of impact pushes us toward stranger and stronger cheeses, and while humanity may discover yet more non-obvious cheeses over time, I'm going to refer to the far end of this continuum as the casu marzu end, after the cheese that gets its distinctive flavor and texture from live maggots that jump as you eat it. EAs who end up out in this direction aren't going to be able to explain to their neighbor why they do what they do, and explaining to an interested family member probably takes several widely spaced conversations.
Sometimes people talk casually as if the weird stuff is longtermist and the mainstream stuff isn't, but if you look at the range of EA endeavors, all of the main focus areas have people working along this continuum. A typical person can easily see the altruistic case for "help governments create realistic plans for pandemics" but not "build refuges to protect a small number of people from global catastrophes"; "give chickens better conditions" but not "determine the relative moral differences between insects of different ages"; "plan for the economic effects of ChatGPT's successors" but not "formalize what it means for an agent to have a goal"; "organize pledge drives" but not "give money to promising high schoolers". And I'd rate these all at most bleu.
I've seen this dynamic compared to motte-and-bailey or bait-and-switch. The idea is that someone presents EA to newcomers and only talks about the mild cheeses, when that's not actually where most of the community, and especially the most highly-engaged members, think we should be focusing. People might then think they were on board with EA when they actually would find a lot of what goes on under its banner deeply weird. I think this is partly fair: when introducing EA, even to a general audience, it's important not to give the impression that these easy-to-present things are the totality of EA. In addition to being misleading, that also risks people who would be a good fit for the stranger bits bouncing off. On the other hand, EA isn't the kind of movement where "on board" makes much sense. We're not about signing onto a large body of thought, or expecting everyone within the movement to think everyone else's work is valuable. We're united by a common question, how we can each do the most good, along with a shared culture and intellectual tools for approaching it.
I think it's really good that EA is open to the very weird, the mainstream, and everything in between. One of the more valuable things that EA provides, however, is intellectual company for people who are, despite often working in very different fields, pushing down this fundamentally lonely path away from what everyone can see is good.