As with any social movement, people disagree about the best ways to take action. There are many critiques of EA which you should read to get a better idea of where others are coming from, for example, this post about effective altruism being an ideology, this post about someone leaving EA, this post about EA being inaccessible, or this post about blindspots in EA/rationalism communities.
Even before SBF, many people had legitimate issues with EA from a variety of standpoints. Some people find the culture unwelcoming (e.g. too elitist/not enough diversity), some take issue with longtermism (e.g. too much uncertainty), others disagree with consequentialism/utilitarianism, and still others are generally on board but find more specific issues with the way EA approaches things.
Post-SBF it’s difficult to say what the full effects will be, but I think it’s fair to say that SBF represents what many people fear/dislike about EA (e.g. elitism, inexperience, ends-justify-the-means reasoning, tech-bro vibes, etc.). I’m not saying these things are necessarily true, but most people won’t spend hundreds of hours engaging with EA to find out for themselves. Instead, they’ll read an article in The New York Times about how SBF committed fraud and is heavily linked to EA, and walk away with a somewhat negative impression. That isn’t always fair, but it also happens to other social movements like feminism, Black Lives Matter, veganism, environmentalism, etc. EA is no exception, and FTX/SBF was a big enough deal that a lot of people will choose not to engage with EA going forward.
Should you care? I think to an extent, yes: you should engage with criticisms, think through your own perspective, decide where you agree/disagree, and work on improving the things you think need improving. We should all do this. Ignoring criticisms is akin to putting your fingers in your ears and refusing to listen, which isn’t a particularly rational approach. Many critics of EA have meaningful things to say, and if we truly want to figure out the best ways to improve the world, we need to be willing to change (see: scout mindset). That being said, not all criticisms will be useful or meaningful, and we shouldn’t get so caught up in criticism that we stop standing for something.
Well, the elitism charge is just true, and it should be true! Of course EA is an elitist movement: the whole point is trying to get elites to spend their wealth better, via complicated moral reasoning that you have to be smart to understand (this is, IMO, a good thing, not a criticism!).
I actually think it would be a disaster if EA became anti-elitist, not just for EA but for the world. The civic foundation of the West is made up of Susan from South Nottingham, who volunteers to run the local mother & baby group: if she stops doing that to earn to give (ETG) or whatever, then everything falls apart, and the economic surplus that EA relies on will disappear within a few generations. For everyone’s sakes, the EA message needs to stay very specifically targeted; it would be an extremely dangerous meme if it leaked out to the wider world. Thankfully, I think on some level EAs sort of know this, which would probably explain the focus on evangelizing specifically to smart university students but not to anyone else.
I rather liked this comment, and think it really hits the nail on the head. As someone who has only recently come into contact with EA and developed an interest in it, and who therefore has mostly an ‘outsider’ perspective, I would add that there’s a big difference between the perception of ‘effective altruism’, which almost anybody would find reasonable and morally unobjectionable, and ‘Effective Altruism’/Rationalism as a movement with beliefs and practices that many people will find weird and off-putting (basically, all those mentioned by S.E. Montgomery: elitism, longtermism, utilitarianism, a general hubristic and nerdy belief that complex issues and affairs are reducible to numbers and optimization models, etc.).
Controversial take: while I agree that EA has big problems, I actually think the elitism is justified, for one reason.
Impact is usually heavy-tailed, following something like a power law (heavy-tailed distributions may well be the most common kind in practice), and this supports elitism: if a small fraction of people account for most of the impact, it makes sense to focus outreach and resources on them.
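To make the heavy-tail intuition concrete, here is a minimal sketch in Python. The Pareto distribution and the tail index of 1.5 are assumptions chosen purely for illustration, not figures from anyone’s comment:

```python
import numpy as np

# Toy simulation: if per-person impact is Pareto-distributed (an assumed
# model for illustration), a small minority produces most of the total.
rng = np.random.default_rng(0)
alpha = 1.5  # tail index; lower values mean a heavier tail (assumed value)
impact = np.sort(rng.pareto(alpha, size=100_000) + 1)  # per-person "impact"

# Share of total impact contributed by the top 1% of people.
top_share = impact[-1_000:].sum() / impact.sum()
print(f"Top 1% of people account for {top_share:.0%} of total impact")
```

Under these assumptions the top 1% contribute on the order of a fifth of the total; with a thin-tailed distribution (e.g. a normal), they would contribute barely more than 1%, so how heavy the tail is does real work in this argument.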
One of my largest criticisms of EA is that it doesn’t recognize there might be a crucial consideration around moral realism. This criticism isn’t specific to EA, but for context: moral realism is the theory that morality is real and mind-independent.
Yet there is probably overconfidence on this front, and it matters, because if moral realism isn’t true, then EA will have to change drastically. In general, this assumption goes far too unquestioned.
It’s possible I’ve flipped the sign on what you’re saying, but if I haven’t, I’m pretty sure most EAs are not moral realists, so I don’t know where you got the impression that it’s an underlying assumption of any serious EA efforts.
If I did flip the sign, then I don’t think it’s true that moral realism is “too unquestioned”. At this point it might be fairer to say that too much time and ink have been spent on what’s frankly a pretty trivial question, one that only sees as much engagement as it does because people get caught up in arguing about definitions of words (and, of course, because some other people are deeply confused).
I think this might be the crux: I do think this question matters a lot, and I’ll point below to some implications if moral realism is false. (Thankfully, more EAs are moral anti-realists than I thought.)
EA would need to recognize that its values aren’t superior or inferior to other people’s values, just different. In other words, it would need to stop believing it is objectively right to hold (X values).
The moral-progress thesis would not hold: that is, the changes of the 19th and 20th centuries would mostly not count as objective moral progress. At the very least, moral progress would be subjective.
Values are points of view, not fundamental truths that humans have to abide by.
Now this isn’t the most important question, but it is a question that I think matters, especially for moral movements like EA.