Hi, I’m pretty new here, so please correct me if I’m wrong. I had, however, one important impression that I think I should share.
EA started as a small movement and right now is expanding like crazy. The thing is, it still has a “small movement” mentality.
One of the key aspects of this is trust. I have the impression that EA is extremely trust-based. It feels like if somebody calls themselves an EA, everybody assumes that they probably have altruistic intentions and mostly aligned values. It is lovely. But maybe dangerous?
In a small movement everybody knows everyone, and if somebody does something suspicious, the whole group can very easily spread the warning. In a large group, however, that won’t work. So if somebody is a grifter, an amoral person, just an a*hole, or anything similar, they can very easily abuse the system, just by, for example, changing the EA crowd they talk to. I have the impression that there was a push towards attracting the maximum possible number of people. I assume this was thought through and that there is value added in it. It may, however, have a pretty serious cost.
Liv—I agree that this is something to be very cautious about.
I have psychology colleagues who study psychopathy, sociopathy, and ‘Dark Triad’ personality traits. The people with these dangerous traits tend to selectively target smallish, naive, high-trust, do-gooding communities, such as fundamentalist Christian churches, non-profits, idealistic start-ups, etc., and maybe EA. The people in these groups tend to be idealistic, forgiving, trusting, tribal, mutually supportive, and inexperienced with bad actors. This makes them highly exploitable: financially, sexually, socially, legally, etc.
Psych professor Essi Viding has a nice ‘Very Short Introduction’ to psychopathy here, which I recommend.
The best way to guard against such people is to learn about their traits and their typical social strategies and tactics, and then to, well, keep up one’s guard.
(I’m not implying that SBF is a psychopath; it’s not feasible or responsible to make a diagnosis of this sort without knowing someone personally.)
Thanks for the comments! I also wanted to clarify one thing: I’m not talking only about super serious cases, i.e. criminals or abusers. I think a much more common system failure would be to over-trust “small” grifters who live from one grant to another, people who don’t keep secrets (including professional ones), those who are chronically incompetent and unwilling to change, etc. I think that those people, if not “caught” early enough, can also cause a lot of trouble and minimize the impact of even the best initiative.
Also, you know, it’s absolutely great to have the feeling that you can trust all the people in the room you are in. I think there’s huge value in creating such an environment. But I have no idea how to do that in the case of EA; it seems too big, too rich, and growing too fast. I guess some super effective system would be needed, but again, I don’t know. Maybe, sadly but very possibly, it’s impossible in such an environment. If so, we need to adjust our assumptions and behavior, and probably we should do it fast.
I don’t have much to add but I found this exchange super interesting, thanks for that.
As someone who’s worked in the mental health field, the other interesting thing I’ve read about ASPD/psychopathy is that people with it rely heavily on cognitive empathy over affective empathy, which… is actually a very EA trait, in my opinion.
So even without nefarious intentions, I think people with ASPD would be drawn to/overrepresented within EA.
I felt a bit stressed when I saw the discussion turn into talk about ASPD, and now I’ve realized why.
Firstly, we should hold accountable all people who display unwanted behavior, regardless of their diagnosis. I’m afraid that a focus on ASPD will shift our attention from “no abusive/offensive/deceitful behavior should be accepted” to “let’s be careful if somebody has ASPD”. I think that focusing on (especially repeated) behaviors and their consequences is a much better strategy here.
Secondly, diagnosing somebody in a non-clinical setting is both very hard and unethical, so if we start worrying about letting people with ASPD “into EA”, we have no way to actually prove or disprove our point. But some people may end up trying, and home-made psychoanalysis is, well, not good.
So, to summarize: I personally think that shifting the focus from “how to trace unwanted behavior overall” to “whether EA may attract people with ASPD” may yield worse results.
Yeah, I agree. The only reason I even engaged is that a psych I saw noted down that I show signs of it, and I roll my eyes whenever psychopathy pops up in a discussion because people just use it as a synonym for malicious.
Reading up on ASPD, it’s kind of weird how people read “15% of CEOs and criminals have ASPD” and conclude “ASPD is the criminality and corruption disease” instead of “85% of the people we should watch out for are totally capable of abuse with a perfectly normal brain, so our processes should work regardless of the offender’s brain”.
IDK, just really weird scapegoating. The original point was pretty much just about “malicious bad-faith actors” and nothing to do with ASPD.
Most fraudulent activities are committed by normal people who rationalize their way in when opportunities or gaps present themselves and they happen to need the financial gain.
Interesting. I would have said the opposite: that many EAs are what Simon Baron-Cohen calls ‘high systematizers’ (which overlaps somewhat with the Aspergers/autism spectrum), who tend to be pretty strong on affective empathy (e.g. easily imagining how horrible it would be to be a factory-farmed chicken, a starving child, or a future AGI), but who are often somewhat weaker on cognitive empathy (i.e. using Theory of Mind to understand other people’s specific beliefs and desires accurately, following social norms, communicating effectively with others who have different values and assumptions, manipulating and influencing other people, etc.).
I agree that psychopaths rely much more on cognitive empathy than on affective empathy. But by my reckoning, this Aspergers-vs-psychopaths dimension implies that EA includes relatively few psychopaths and relatively many Aspy people (like me).
(FWIW, I’d recommend the recent Simon Baron-Cohen book ‘The pattern seekers’, about the evolutionary origins and technological benefits of high systematizing.)
But people in EA think a lot about how to reduce the suffering of others, and place great importance on morality and doing the right thing, which is the exact opposite of sociopathy. I think this matters more than how heavily “cognitive” people in the community are, and that people with ASPD should be underrepresented. Moreover, a lot of people seem to be motivated by affective empathy, even though they then try to use cognitive empathy to think about what is best.
Agree. I don’t know if you meant this too, but I also think that focusing on one particular person who manages to gain a lot of influence among the fellows of his or her local EA group/organisation, or more generally creating a cult of personality around a few leading figures of the movement, can be dangerous in the long run. SBF is somehow a kind of example of the unilateralist’s curse.
I didn’t have that in mind :). But let me think about it.
Maybe there’s something to it (my epistemic status: I just thought of this, so please be critical). The majority of the EA community consists of very young overachievers who strongly believe that one’s worth needs to be proven and can be measured. There is also a small portion of the community that is much older, more mature, and simply impressive. I don’t know whether this causes the community to be cultish, but it may enable it.
I personally don’t feel any “authority idealization” vibes in EA; rather, quite the opposite. I have a pretty strong intuition that if I wanted to disagree with some EA “celebrity”, I would be more than encouraged to do so, treated as a thought partner (if I had good arguments), and thanked for valid criticism. I also believe I could approach any EA person and just chat with them if we both wanted to, because why not, and if in the process I learned that this person was famous, well, ok, that wouldn’t change the tone of the conversation. That being said, I have a pretty strong personality myself and I’m not easily intimidated. Plus, I’m in my late twenties, so older than the majority of new EA joiners, which may be important here.
I don’t think that the creation of celebrities and hierarchies is avoidable, and I don’t believe that saying some people are impressive is bad. I also think it’s super hard to stop people from idealizing you if you are a leader, especially when the internet and the community’s structure allow random people some insight into your private life. I also believe that if somebody keeps idealizing celebrities, a good first step is to seriously reflect on that schema and work on one’s own mindset first. I would not shift the blame onto the “celebrities” or the “community” alone, because if the schema of “authorities” exists, the first step to breaking it is to make the “fans” more self-aware, self-sufficient, and agentic.
I do, however, think that the topic is worth investigating and chatting about. All of the above being said, celebrities should take responsibility for their power. Blogs and websites should avoid creating idealized portraits of leaders. Everybody should have an equal right to speak and disagree with the “head” of any organization, and everybody should be criticized equally for making untrue statements or for any wrongdoing. Active idealization should be treated as a bias, because it is one, and therefore a mistake to work on. Finally, there should definitely be systems that could stop those with more power from abusing it if they try to.
Do you actually know if somebody has checked to what extent “being cultish” is a problem in EA? And whether it’s more of one than in any other group? I wonder what the result of such research would be.
I’m getting the same read, Liv.
To be honest, when I came here I expected that a movement this big would have some notion of best practices as far as governance goes, but unfortunately it doesn’t; what is in place seems to err on the side of investigation rather than detection or prevention.
EA should adopt some of the ways big multibillion-dollar industries have addressed scaling issues. I understand that the movement is still new to this, but again, fraud has existed forever wherever money is in play and human nature is involved.
Strong upvote for “EA still has small movement mentality”.
How appropriate it is to divert lots of resources from object-level impact to good governance structures depends on how many resources a movement has overall, and I don’t think EA has appropriately adapted to the scale of its size, wealth, and influence.
With great power comes great responsibility.