I’m quite skeptical of post-hoc articles with titles like ‘X was no surprise’; they’re usually full of hindsight bias. Like, if it was no surprise, did you predict it in advance?
Also, there’s almost nothing about SBF here – is this part 1 of a series?
You’re right that post-hoc articles are usually full of hindsight bias, which makes them a lot less valuable. That’s why I tried not to make the article too much about SBF (no, this is not part 1 of a series). I laid that out from the beginning:
Please don’t read too much into the armchair psychological diagnosis from a complete amateur – that isn’t the point.
If you want a prediction, I give one right after this:
The point, to lay my cards on the table, is this: virtue ethicists would not be surprised if many EAs suffer (in varying degrees) from “moral schizophrenia”
I reiterate this when I say “I fear it is widespread in this community” where “it” is a certain coldness toward ethical choices (and other choices that would normally be full of affect).
SBF is topical and I thought this was a good opportunity to highlight this lesson about not engaging in excessive reasoning. But I agree my title isn’t great. Suggestions?
The Mayo Clinic says of schizophrenia:
“Schizophrenia is characterized by thoughts or experiences that seem out of touch with reality, disorganized speech or behavior, and decreased participation in daily activities. Difficulty with concentration and memory may also be present.”
I don’t see the analogy between schizophrenia and “a certain coldness toward ethical choices,” and if it were me, I’d avoid using mental health problems as analogies, unless the analogy is exact.
The term is certainly outdated and an inaccurate analogy, hence the scare quotes and the caveat I put in the heading of the same name. It’s the term that Stocker uses though and I haven’t seen another one (but maybe I missed it). The description “tendency to suffer cognitive dissonance in moral thinking” is much more accurate but not exactly succinct enough to make for a good name. I’m open to suggestions!
The term I’d probably use is hypocrisy. Usually, we say that hypocrisy is when one’s behavior doesn’t match one’s moral standards. But it can also take on other meanings. The film The Big Short has a great scene in which one hypocrite, whose behavior doesn’t match her stated moral standards, accuses FrontPoint Partners of being hypocrites, because their true motivations (making money by convincing her to appropriately rate the mortgage bonds they are shorting) don’t match their stated ethical rationale (combating fraud).
On Wikipedia, I also found definitions from David Runciman and Michael Gerson showing that hypocrisy can go beyond a behavior/ethical standards mismatch:
According to British political philosopher David Runciman, “Other kinds of hypocritical deception include claims to knowledge that one lacks, claims to a consistency that one cannot sustain, claims to a loyalty that one does not possess, claims to an identity that one does not hold”.[2] American political journalist Michael Gerson says that political hypocrisy is “the conscious use of a mask to fool the public and gain political benefit”.[3]
I think “motivational hypocrisy” might be a clearer term than “moral schizophrenia” for indicating a motives/ethical-rationale mismatch.
Thanks for the suggestion. I ended up going with “internal moral disharmony” since it’s innocuous and accurate enough. I think “hypocrisy” is too strong and too narrow: it’s a species of internal moral disharmony (closely related to the “extreme case” in Stocker’s terms), one which seems to imply no feelings of remorse or frustration with oneself regarding the disharmony. I wanted to focus on the more “moderate case” in which the disharmony is not too strong, one feels a cognitive dissonance, and one attempts to resolve the disharmony so as not to be a hypocrite.
I think that’s fine too.
Fwiw I consider “hypocrisy” to be a much weaker accusation than “schizophrenia”.
I meant strong relative to “internal moral disharmony.” But also, am I to understand people are reading the label of “schizophrenia” as an accusation? It’s a disorder that one gets through no choice of one’s own: you can’t be blamed for having it. Hypocrisy, as I understand it, is something we have control over and are therefore responsible for avoiding or rooting out in ourselves.
At most, Stocker is blaming Consequentialism and DE for inducing moral schizophrenia. But it’s the theory that’s at fault, not the person who suffers from it!
Yeah, I think this is fair. I probably didn’t read you very carefully or fairly. However, it is hard to control the connotations of words, and I have to admit I had a somewhat negative visceral reaction to seeing what I took to be my sincerely held moral views (which I tried pretty hard to live up to, and made large sacrifices for) medicalized and dismissed so casually.
Yikes! Thank you for letting me know! Clearly a very poor choice of words: that was not at all my intent!
To be clear, I agree with EAs on many, many issues. I just fear they suffer from “overthinking ethical stuff too often,” if you will.
Thanks for responding! (upvoted)
On my end, I’m sorry if my words sounded too strong or emotive.
Separately, I strongly disagree that we suffer from overthinking ethical stuff too much. I don’t think SBF’s problems with ethics came from careful debate in business ethics followed by missing a decimal point in the relevant calculations. I would guess that if he had actually consulted senior EA leaders or researchers on the morality of his actions, this would predictably have resulted in less fraud.
Meta: I couldn’t figure out why the first chart renders with so much whitespace.
No worries about the strong response – I misjudged how my words would be interpreted. I’m glad we sorted that out.
Regarding overthinking ethical stuff and SBF: Unfortunately, I fear you’ve missed my point. First of all, I wasn’t really talking about any fraud/negligence he may have committed. As I said in the 2nd paragraph:
Regarding his ignorance and his intentions, he might be telling the truth. Suppose he is: suppose he never condoned doing sketchy things as a means he could justify by some expected greater good. Where then is the borderline moral nihilism coming from? Note that it’s saying “all the right shibboleths” that he spoke of as mere means to an end, not the doing of sketchy things.
My subject was his attitude/comments towards ethics. Second, my diagnosis was not that:
SBF’s problems with ethics came from careful debate in business ethics and then missing a decimal point in the relevant calculations.
My point was that getting too comfortable approaching ethics like a careful calculation is what can be dangerous in the first place – no matter how accurate the calculation is. It’s not about missing some decimal points. Please reread this section if you’re interested. I updated the end of it with a reference to a clear, falsifiable claim.
Ok, I didn’t pick up that was where the prediction was in the article. I think of (good) predictions as having a clear, falsifiable hypothesis. Whereas this seems to be predicting … that virtue ethicists continue believing whatever they already believed about EAs?
The reason I downvoted this article is the use of the term ‘moral schizophrenia’. Even if it’s not your term originally, I think using it is:
a) Super unclear as a descriptive term. I understand that in mainstream culture it’s seen as a kind of Jekyll/Hyde split-personality thing, so maybe it’s meant to describe that. But I’m pretty sure that’s an inaccurate description of actual schizophrenia.
b) Harmful to those who have schizophrenia when used in this kind of negative fashion, especially as it seems to propagate the Jekyll/Hyde false belief about the condition.
Lastly, the ‘moral schizophrenia’/coldness described here seems much more like a straw man of EAs than an accurate description of the EAs I’ve met. The EAs I know IRL are warm and generous towards their families and friends, and don’t seem to see being that way as at all incompatible with EA-style reasoning. Sure, online and even IRL discussions can seem dry, but it would be hard to have any discussions if we had to express, with our emotions, the magnitude of what was being discussed.
Regarding the term “moral schizophrenia”: As I said to AllAmericanBreakfast, I wholeheartedly agree the term is outdated and inaccurate! Hence the scare quotes and the caveat I put in the heading of the same name. But obviously I underestimated how bad the term was, since everyone is telling me to change it. I’m open to suggestions! EDIT: I replaced it with “internal moral disharmony.” Kind of a mouthful but good enough for a blog post.
Regarding predictions: You’re right, that wasn’t a very exact prediction (mostly because internal moral disharmony is going to be hard to measure). Here is a falsifiable claim that I stand by and that, if true, is evidence of internal moral disharmony:
I claim that one’s level of engagement with the LW/EA rationalist community can weakly predict the degree to which one adopts a maximizer’s mindset when confronted with moral/normative scenarios in life, the degree to which one suffers cognitive dissonance in such scenarios, and the degree to which one expresses positive affective attachment to one’s decision (or the object at the center of one’s decision) in such scenarios.
More specifically I predict that, above a certain threshold of engagement with the community, increased engagement with the LW/EA community correlates with an increase in the maximizer’s mindset, increase in cognitive dissonance, and decrease in positive affective attachment in the aforementioned scenarios.
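Since this is meant as a falsifiable claim, here is a minimal sketch of how the predicted correlations could be checked above an engagement threshold, using Spearman’s rank correlation. The survey numbers and measure names below are entirely hypothetical – real data would require properly operationalized measures of engagement, maximizer mindset, dissonance, and affect.

```python
# Hypothetical sketch: testing the predicted correlations with made-up data.
from statistics import mean

def spearman(xs, ys):
    """Spearman rank correlation (assumes no tied values)."""
    def to_ranks(vals):
        order = sorted(vals)
        return [order.index(v) + 1 for v in vals]
    rx, ry = to_ranks(xs), to_ranks(ys)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5

# Respondents already above the engagement threshold.
# Each tuple: (engagement, maximizer score, dissonance score, positive-affect score)
survey = [
    (4, 3.0, 2.5, 3.2), (5, 3.4, 2.9, 2.8), (6, 3.9, 3.3, 2.5),
    (7, 4.2, 3.8, 2.1), (8, 4.6, 4.1, 1.8), (9, 4.7, 4.5, 1.6),
]
engagement = [r[0] for r in survey]

# The prediction: positive correlation for the first two measures,
# negative for positive affect.
for name, idx, sign in [("maximizer", 1, +1), ("dissonance", 2, +1), ("affect", 3, -1)]:
    rho = spearman(engagement, [r[idx] for r in survey])
    print(f"{name}: rho={rho:+.2f}, consistent with prediction: {rho * sign > 0}")
```

With real survey data one would of course also want sample sizes large enough for a significance test, not just the sign of the coefficient.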
The hypothesis for why I think this correlation exists is mostly at the end of here and here.
But more generally, must a criticism of/concern for the EA community come in the form of a prediction? I’m really just trying to point out a hazard for those who go in for Rationalism/Consequentialism. If everyone has avoided it, that’s great! But there seems to be evidence that some have failed to avoid it, and that we might want to take further precautions. SBF was very much one of EA’s own: his comments therefore merit some EA introspection. I’m just throwing in my two cents.
Regarding actual EAs: I would be happy to learn that few EAs actually have “thoughts too many”! But I do know it’s a thing that some have suffered (personally I’ve struggled with it at times, and it’s literally in Mill’s autobiography). More generally, the ills of adopting a maximizer’s mindset too often are well documented (see references in footnotes). I thought it was in the community’s interest to raise awareness about it. I’m certainly not trying to demonize anyone: if someone in this community does suffer from it, my first suspect would be the culture surrounding/theory of Consequentialism, not some particular weakness on the individual’s part.
Regarding dry discussion on topics of incredible magnitude: That’s fair. I’m not saying being dry and calculating is always wrong. I’m just saying one should be careful about getting too comfortable with that mindset lest one start slipping into it when one shouldn’t. That seems like something rationalists need to be especially mindful of.