...for example, I think we should be less welcoming to proudly self-identified utilitarians, since they’re more likely to have these traits.
Ouch. Could you elaborate on this and back it up? The statement makes it sound like an obvious fact, and I don’t see why it would be true.
+1
It seems pretty wrong to me that what caused SBF’s bad behaviour was believing that what matters in the world is the long-run wellbeing of sentient beings. My guess is that we should be focusing more on traits of his like ambition and callousness towards those around him.
But it seems plausible I’m just being defensive, as a proudly self-identified utilitarian who would like to be welcome in the community.
I basically agree and try to emphasize personality much more than ideology in the post.
That said, it doesn’t seem like a big leap to think that confidence in an ideology that says you need to maximise a single value to the exclusion of all else could lead to dangerously optimizing behaviour...
Having more concern for the wellbeing of others is not the problematic part. But utilitarianism is more than that.
Moreover it could still be true that confidence in utilitarianism is in practice correlated with these dangerous traits.
I expect it’s the negative component in the two-factor model that’s the problem, rather than the positive component you highlight. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5900580/
I don’t find this a persuasive reason to think that utilitarianism is more likely to lead to this sort of behavior than pretty much any other ideology. I think a huge number of (maybe all?) ideologies imply that maximizing the good as defined by that ideology is the best thing to do, and that considerations outside of that ideology have very little weight. You see this behavior with many theists, Marxists, social justice advocates, etc. etc.
My general view is that there are a lot of people like SBF who have a lot of power-seeking and related traits—including callous disregard for law, social norms, and moral uncertainty—and that some of them use moral language to justify their actions. But I don’t think utilitarianism is especially vulnerable to this, nor do I think it would be a good counterargument if it was. If utilitarianism is true, or directionally true, it seems good to have people identify as such, but we should definitely judge harshly those that take an extremely cavalier attitude towards morality on the basis of their conviction in one moral philosophy. Moral uncertainty and all that.
I’d agree a high degree of confidence + strong willingness to act combined with many other ideologies leads to bad stuff.
Though I still think some ideologies encourage maximisation more than others.
Utilitarianism is much more explicit in its maximisation than most ideologies, plus it (at least superficially) actively undermines the normal safeguards against dangerous maximisation (virtues, the law, and moral rules) by pointing out that these can be overridden for the greater good.
Like yes, there are extreme environmentalists and that’s bad, but normally when someone takes on an ideology like environmentalism, they don’t also explicitly & automatically say that the environment is all that matters and that it’s in principle permissible to cheat & lie in order to benefit the environment.
Definitely not saying it has any bearing on the truth of utilitarianism (in general I don’t think recent events have much bearing on the truth of anything). My original point was about who EA should try to attract, as a practical matter.
I think it’s true that utilitarianism is more maximizing than the median ideology. But I think a lot of other ideologies are minimizing in a way that creates equal pathologies in practice. E.g., deontological philosophies are often about minimizing rights violations, which can be used to justify pretty extreme (and bad) measures.
I moderately confidently expect there to be a higher proportion of extreme environmentalists than extreme utilitarians. I think utilitarians will be generally more intelligent / more interested in discussion / more desiring to be “correct” and “rational”, and that the correct and predominant reply to things like the “Utilitarianism implies killing healthy patients!” critique is “Yeah, that’s naive Utilitarianism, I’m a Sophisticated Utilitarian who realizes the value of norms, laws, virtues and intuitions for cooperation”.
I disagree with this. I think utilitarian communities are especially vulnerable to bad actors. As I discuss in my other comment, psychopaths disproportionately have utilitarian intuitions, so we should expect communities with a lot of utilitarians to have a disproportionate number of psychopaths relative to the rest of the population.
Thanks, this is a meaningful update for me.
From psychopaths disproportionately having utilitarian intuitions, it doesn’t follow that utilitarians disproportionately have psychopathic tendencies. We might slightly increase our credence that they do, but probably not enough to meaningfully outweigh the experience of hanging out with utilitarians and learning first hand of their typical personality traits.
I think it does follow, other things being equal. If the prevalence of psychopaths in the wider population is 2%, but psychopaths are twice as likely to be utilitarians, then other things equal, we should expect 4% of utilitarian communities to be psychopaths. Unless you think psychopathy is correlated with other things that make one less likely to actually get involved in active utilitarian communities, that must be true.
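The arithmetic behind the “2% → 4%” claim is just Bayes’ rule, and the P(utilitarian) term cancels, which is why the base rate simply doubles. A minimal sketch, using the illustrative numbers from this comment (not real prevalence estimates):

```python
# Bayes' rule behind the "2% -> 4%" claim, using the comment's
# illustrative numbers (not real prevalence estimates).
base_rate = 0.02        # P(psychopath) in the wider population
likelihood_ratio = 2.0  # "twice as likely": P(utilitarian | psychopath) = 2 * P(utilitarian)

# P(psychopath | utilitarian)
#   = P(utilitarian | psychopath) * P(psychopath) / P(utilitarian)
#   = likelihood_ratio * P(utilitarian) * base_rate / P(utilitarian)
#   = likelihood_ratio * base_rate        (P(utilitarian) cancels)
posterior = likelihood_ratio * base_rate

print(f"P(psychopath | utilitarian) = {posterior:.0%}")  # → 4%
```

The cancellation only holds if nothing else about psychopathy affects who actually joins utilitarian communities, which is exactly the “other things being equal” caveat.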
There’s any number of possible reasons why psychopaths might not want to get involved with utilitarian communities. For example, their IQ tends to be slightly lower than average, whereas utilitarians tend to have higher-than-average IQs, so they might not fit in intellectually whether their intentions were benign or malign. Relatedly, you would expect communities with higher IQs to be better at policing themselves against malign actors.
I think there would be countless confounding factors like this that would dominate a single survey based on a couple of hundred (presumably WEIRD) students.
In my other comment, I didn’t just link to a single study; I linked to a Google Scholar search with lots of articles about the connection between psychopathy and utilitarianism. The effect size found in the single study I did link to is also pretty large:
The study you link to finds only a very small average IQ difference: a Cohen’s d of −0.12. And violent psychopaths have slightly higher IQ than average.
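To make “very small” concrete: Cohen’s d expresses a mean difference in standard deviations, so on the conventional IQ scale (SD = 15, which is an assumption here, not stated in the thread) a d of −0.12 is a gap of under two IQ points:

```python
# Convert the Cohen's d above into IQ points, assuming the
# conventional IQ scale with a standard deviation of 15.
IQ_SD = 15
cohens_d = -0.12  # effect size reported in the study under discussion

# Cohen's d is the mean difference measured in standard deviations,
# so multiplying by the SD recovers the difference in raw IQ points.
iq_point_gap = round(cohens_d * IQ_SD, 2)
print(iq_point_gap)  # → -1.8
```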
For the reasons I gave elsewhere in my comments, I would expect EAs to be worse at policing bad actors than average because EAs are so disproportionately on the autism spectrum.
Yes, these would be WEIRD people, but this is moot since EA is made up of WEIRD people as well.
Fair enough. I am still sceptical that this would translate into a commensurate increase in psychopaths in utilitarian communities*, but this seems enough to give us reason for concern.
*Also violent psychopaths don’t seem to be our problem, so their greater intelligence would mean the average IQ of the kind of emotional manipulator we’re concerned about would be slightly lower.
It’s a big topic, but the justification is supposed to be the part just before. I think we should be more worried about attracting naive optimizers, and I think people who are gung-ho utilitarians are one example of a group who are more likely to have this trait.
I think it’s notable that SBF was a gung-ho utilitarian before he got into EA.
It’s maybe worth clarifying that I’m most concerned about people who have a combination of high confidence in utilitarianism and a lack of qualms about putting it into practice.
There are lots of people who see utilitarianism as their best guess moral theory but aren’t naive optimizers in the sense I’m trying to point to here.
See more in Toby’s talk.
Thank you, that makes more sense + I largely agree.
However, I also wonder if all this could be better gauged by watching out for key psychological traits/features instead of probing someone’s ethical view. For instance, a person low in openness showing high-risk behavior who happens to be a deontologist could cause as much trouble as a naive utilitarian optimizer. In either case, it would be the high-risk behavior that would potentially cause problems rather than how they ethically make decisions.
I was trying to do that :) That’s why I opened with naive optimizing as the problem. The point about gung-ho utilitarians was supposed to be an example of a potential implication.
Yeah, I think “proudly self-identified utilitarians” is not the same as “naively optimizing utilitarians”, so would encourage you to still be welcoming to those in the former group who are not in the latter :-)
ETA: I did appreciate your emphasizing that “it’s quite possible to have radical inside views while being cautious in your actions.”
I had you in mind as a good utilitarian when writing :)
Good point that just saying ‘naively optimizing’ utilitarians is probably clearest most of the time. I was looking for other words that would denote high-confidence and willingness to act without qualms.
Minor nitpick: this doesn’t seem to capture naive utilitarianism as I understand it. I always thought naive utilitarianism was about going against common-sense norms on the basis of your own personal, fragile calculations. E.g. lying is prone to being rumbled and one’s reputation is very fragile, so it makes sense to follow the norm of not lying even if your own calculations seem to suggest that lying is good, because the calculations will tend to miss longer-term, indirect, and subtle effects. But this is neither about (1) high confidence nor (2) acting without qualms. Indeed, one might decide not to lie with high confidence and without qualms. Equally, one might choose to ‘lie for the greater good’ with low confidence and with lots of qualms. This would still be naive utilitarian behaviour.
That’s useful—my ‘naive optimizing’ thing isn’t supposed to be the same thing as naive utilitarianism, but I do find it hard to pin down the exact trait that’s the issue here, and those are interesting points about confidence maybe not being the key thing.