I think the FTX stuff is a bigger deal than Peter Singer’s views on disability, and for me to be convinced by the England and Enlightenment examples, you’d have to draw a clearer line between the philosophy and the wrongful actions (cf. in the FTX case, we have a self-identified utilitarian doing various wrongs for stated utilitarian reasons).
I agree that every large ideology has had massive scandals, in some cases ranging up to purges, famines, wars, etc. I think the problem for us, though, is that there aren’t very many people who take utilitarianism or beneficentrism seriously as an action-guiding principle—there are only ~10k effective altruists, basically. What happens if you scale that up to 100k and beyond? My claim would be that we need to tweak the product before we scale it, in order to make sure these catastrophes don’t scale with the size of the movement.
That’s an interesting take. For me it doesn’t feel that way, but this would be a long discussion, and my guess is that we both have fairly deep and quite different intuitions here.
(Also, honestly, it seems to me like only a handful of people are really in charge of community growth decisions, and I don’t think I could do much to change direction there anyway, so I’m less focused on trying to change that.)