I'm a computational physicist, and I generally donate to global health. I am skeptical of AI x-risk and of big-R Rationalism, and I intend to explain why in great detail.
titotal
How my community successfully reduced sexual misconduct
Motivation gaps: Why so much EA criticism is hostile and lazy
Does EA understand how to apologize for things?
I have been saddened to learn of similarly bad behaviour in other communities I have been involved in. However, it's important not to let the commonness of abuse and harassment in broader society become an excuse not to improve. (I'm 100% not accusing you of this, by the way; it's just a behaviour I've seen in other places.)
EA should not be aiming for a passing grade when it comes to sexual harassment. The question is not “is EA better than average”, but “is EA as good as it could be”. And the answer to that question is no. I deeply hope that the concerns of the women in the article will be listened to.
“Diamondoid bacteria” nanobots: deadly threat or dead-end? A nanotech investigation
Cryptocurrency is not all bad. We should stay away from it anyway.
My theory is that while EA/rationalism is not a cult, it contains enough ingredients of a cult that it’s relatively easy for someone to go off and make their own.
Not everyone follows every ingredient, and many of the ingredients are actually correct/good, but here are some examples:
Devoting one's life to a higher purpose (saving the world)
High-cost signalling of group membership (donating large amounts of income)
The use of in-group shibboleths (like "in-group" and "shibboleths")
The use of weird rituals and the breaking of social norms (Bayesian updating, "radical honesty", etc.)
A tendency to isolate oneself from non-group members (group houses, EA orgs)
The belief that the world is crazy, but we have found the truth (rationalist thinking)
The following of sacred texts explaining the truth of everything (the Sequences)
And even the belief in an imminent apocalypse (AI doom)
These ingredients do not make EA/rationalism in general a cult, because it lacks enforced conformity and control by a leader. Plenty of people, including myself, have posted on LessWrong critiquing the Sequences and Yudkowsky and been massively upvoted for it. It's decentralised across the internet; if someone wants to leave, there's nothing stopping them.
However, what seems to have happened is that multiple people have taken these base ingredients and added in the conformity and charismatic-leader parts. Put these ingredients in a small company or a group house, put an unethical or mentally unwell leader in charge, and you have everything you need for an abusive cult environment. Now it's far more difficult to leave, because your housing/income is on the line, and the leader can use the already-established breaking of social norms as an excuse to push boundaries around consent in the name of the greater good. This seems to have happened multiple times already.
I don't know what to do if this theory is correct, besides applying extra scrutiny to leaders of sub-groups within EA, and maybe easing up on unnecessary rituals and jargon.
Firstly, I will say that I'm personally not afraid to study and debate these topics, and I have done so. My belief is that the data points to no evidence of significant genetic differences between races when it comes to matters such as intelligence, and I think one downside of being hush-hush about the subject is that people miss out on this conclusion, which is the one even a basic Wikipedia skim would get you to. (You're free to disagree; that's not the point of this comment.)
That being said, I think you have greatly understated the case for not debating the subject on this forum. Remember, this is a forum for doing the most good, not a debate club, and if shunting debate of certain subjects onto a different website does the most good, that’s what we should do. This requires a cost/benefit analysis, and you are severely understating the costs here.
Point 1 is that we have to acknowledge the obvious fact that when you make a group of people feel bad, some of them are going to leave your group. I do not think this is a moral failing on their part. We have a limited number of hours in the day: would you hang out in a place where people regularly discuss whether you are genetically inferior? And it doesn't just drive out minorities; it drives out other people who are uncomfortable with the discussion as well.
Driving out minorities is bad on its own, but it also has implications for cause areas. A homogeneous group is going to lack diverse viewpoints and miss things that would be obvious to people with different contexts and experiences. It also limits outreach to different countries: are we going to make inroads into India if we're constantly discussing the genetic makeup of Indians? And that's not even talking about the bad PR of being a super-white, super-male group, which costs us both credibility and funding.
Following on the PR point, I think people find it gauche to talk about the PR effect of discussions, as our opinions shouldn't be affected by public opinion. But if we are honestly discussing the costs of allowing these discussions, then PR undeniably is a cost, and a really bad one. People are already using this as an excuse to slam EA in general as racist on Twitter, and if this becomes a major news story, the narrative will spread. EA is already associated with fraudulence thanks to SBF; do we really want to be associated with race science as well?
My last point is that while not everyone who believes in genetic group differences is far-right/neo-nazi, the converse is not true: pretty much every neo-nazi believes in this stuff, and they take every opportunity to use it as an excuse to spread their ideology. A continuing discussion could very well encourage a flood of nazis onto the site, which is not exactly good for the wellbeing of the forum.
Again, my point isn’t that these discussions should be banned from the internet entirely. My point is merely that it shouldn’t be discussed here.
You seem to be jumping to the conclusion that if you don’t understand something, it must be because you are dumb, and not because you lack familiarity with community jargon or norms.
For example, take the Yudkowsky doompost that's been much discussed recently. In the first couple of paragraphs, he namedrops people who would be completely unknown outside his specific subfield of work, and expects the reader to know who they are. Then there are a lot of paragraphs like the following:
If nothing else, this kind of harebrained desperation drains off resources from those reality-abiding efforts that might try to do something on the subjectively apparent doomed mainline, and so position themselves better to take advantage of unexpected hope, which is what the surviving possible worlds mostly look like.
It doesn't matter whether you have an Oxford degree or not; this will be confusing to anyone who has not been steeped in the jargon and worldview of the rationalist subculture. (My PhD in physics is not helpful at all here.)
This isn’t necessarily bad writing, because the piece is deliberately targeted at people who have been talking with this jargon for years. It would be bad writing if it were aimed at the general public though, because they don’t know what these terms mean.
This is similar to scientific fields: when you publish a scientific paper in a specific sub-discipline, a lot of knowledge is assumed. This avoids having to re-explain whole disciplines, but it does make papers incredibly hard to read for anyone who is even a little bit of an outsider. But when communicating results to the public (or even to someone in a different field of physics), you have to translate them into reasonably understandable English. I think people here should be mindful of who exactly their audience is, and tailor their language appropriately.
Look, I think Will has worked very hard to do good and I don’t want to minimize that, but at some point (after the full investigation has come out) a pragmatic decision needs to be made about whether he and others are more valuable in the leadership or helping from the sidelines. If the information in the article is true, I think the former has far too great a cost.
This was not a small mistake. It is extremely rare for charitable foundations to be caught up in scandals of this magnitude, and this article indicates that a significant amount of the fallout could have been prevented with a little more investigation at key moments, and that clear signs of unethical behaviour were deliberately ignored. I think this is far from competent.
We are in the charity business. Donors expect high standards when it comes to their giving, and bad reputations directly translate into dollars. And remember, we want new donors, not just to keep the old ones. I simply don’t see how “we have high standards, except when it comes to facilitating billion dollar frauds” can hold up to scrutiny. I’m not sure we can “credibly convince people” if we keep the current leadership in place. The monetary cost could be substantial.
We also want to recruit people to the movement. Being associated with bad behaviour will hurt our ability to recruit people with strong moral codes. Worse though, would be if we encouraged “vultures”. A combination of low ethical standards and large amounts of money would make our movement an obvious target for unethical exploiters, as appears to have already happened with SBF.
Being a brilliant philosopher or intellectual does not necessarily make you a great leader. I think we can keep the benefits of the former while recognizing that someone is no longer suited to the latter. Remaining in a leadership position is a privilege, not a right.
EA leaders should be held to high standards, and it’s becoming increasingly difficult to believe that the current leadership has met those standards. I’m open to having my mind changed when the investigation is concluded and the leaders respond (and we get a better grasp on who knew what when). As it stands now, I would guess it would be in the best interest of the movement (in terms of avoiding future mistakes, recruitment, and fundraising) for those who have displayed significantly bad judgement to step down from leadership roles. I recognize that they have worked very hard to do good, and I hope they can continue helping in non-leadership roles.
I’d like to push back on this logically, with again the recognition that this is a very sensitive topic, and that emotional reactions are valid.
According to the sources on Wikipedia, brain synapses in foetuses do not form until week 17, and the first evidence of "minimal consciousness and ability to feel pain" does not occur until week 30.
Only 1% of abortions in the US occur after 21 weeks of pregnancy.
I think there's a bit of a motte and bailey going on here, where you use all abortions in your statistics about why this is a significant issue, but only late-term abortions in your defence of the foetus's moral relevance.
Even if we grant some moral weight to a 15-week-old foetus (which I'm dubious of), it's hard to see a logical reason why it would approach the moral significance of an adult chicken. And tens of billions of those are killed every year, many of them tortured beforehand.
I see no way for abortion bans (ie: forcing women by threat of force to put their health and lives at risk to bear unwanted children) to compete morally or logically against animal welfare interventions.
I also think there should be particular care when it comes to newcomers to the movement. I think there should be a strong norm against regular members hitting on/asking out new members before they have enough time to settle into the community.
I’m betting there have been many people who showed up once or twice to an EA event, got hit on a bunch of times, and immediately left the movement in annoyance.
I think it's a good thing that most people have a revulsion towards the Nazi version of eugenics. I think trying to rehabilitate the word "eugenics" could plausibly lead to a lessening of that revulsion and an increase in support for their version. Just use a different word for the thing that's okay, and that all goes away.
This feels a bit like a pro-taxation person hearing the argument that taxation is theft, and then going on to proudly declare that “everyone is a thief”, that everybody should be pro-theft (but only good theft), and we should rehabilitate the word “theft”. Just use a different word.
These stories are horrifying. I want to thank the victims for speaking up, I know it can’t be easy.
It’s worth noting that while some of these allegations overlap with the ones in the times article, a lot of them are new. This article also makes more of an effort to distinguish between the EA and Rationalist communities, which are fairly close but separate. I think most, but not all, of the new allegations are more closely tied to Rationalism than EA, but I could be wrong.
This response is highly concerning and alarming. Some followup questions:
Are you confirming that the grant approval letter shown in the exposé is entirely genuine?
What was the nature of the media project you intended to fund?
Why was Nya Dagbladet chosen as the foundation for this project, and not some other, more reputable news source?
How aware were you of the political leaning of Nya Dagbladet when you initially approved this grant? In particular, of its publishing of articles promoting holocaust revisionism, vaccine denial, and the campaign to "defend ethnic rights"?
Is it normal to hand out grant approval letters before conducting due diligence on a project?
How long did it take for the organisation to reject the project after initially approving it?
Do you specifically condemn the Nya Dagbladet newspaper, right now?
This is a lot worse than I was expecting. This makes it clear that the woman was in a situation where it was extremely hard to refuse Owen’s offer of accommodation.
Firstly, the organization screwed up majorly. You should not be arranging accommodation for someone on the same day they fly to a foreign country. I know I would have been fairly distressed if this had happened to me.
Secondly, we need to remember that this was an organization she was interviewing for, and Owen was the one who recommended her and was presumably on good terms with the org. It wouldn't be unreasonable to think that making a fuss about staying at Owen's house could hurt her chances with the org.
Thirdly, the power imbalances in their friendship might make her concerned about what would happen to her position if refusing accommodation hurt said friendship.
Fourthly, it's often very expensive to get a last-minute hotel. Refusing to stay with Owen could have incurred a large financial penalty.
This was not a case of “hey do you want to crash at mine when you fly over next month?”. This is a case of “no better options”. It’s extremely inappropriate to push boundaries on someone who is in this situation. I’m very saddened at the extreme lack of empathy and judgement that was shown here. I’m relieved that Owen is no longer in leadership positions in EA, and I deeply hope he has sincerely reformed since this encounter.
I want to remind people that there are severe downsides to hosting race and eugenics discussions like the ones linked on the EA forum.
1. It makes the place uncomfortable for minorities and people concerned about racism, which could someday trigger a death spiral where non-racists leave, making the place more racist on average, causing more non-racists to leave, etc.
2. It creates an acrimonious atmosphere in general, by starting heated discussions about deeply held personal topics.
3. It spreads ideas that could potentially cause harm, and lead uninformed people down racist rabbitholes by linking to biased racist sources.
4. It creates bad PR for EA in general, and provides easy ammunition for people who want to attack EA.
5. In my opinion, the evidence and arguments are generally bad and rely on flawed and often racist sources.
6. In my opinion, most forms of eugenics (and especially anything involving race) are extremely unlikely to be an actually effective cause area in the near future, given the backlash, unclear benefit, potential to create mass strife and inequality, etc.
Now, this has to be balanced against a desire to entertain unusual ideas and to protect freedom of speech. But these views can still be discussed, debated, and refuted elsewhere. It seems like a clearly foolish move to host them on this forum. If EA is trying to do the most good, letting people like Ives post their misinformed stuff here seems like a clear mistake.
If taking positions that are perceived as left-wing makes EA more correct and more effective, then EA should still take up those positions. The OP has made a great effort to justify these points from a practical position of pursuing truth, effectiveness, and altruism, and they should not be dismissed just because they happen to fall on one side of the political spectrum. Similarly, just because an action makes EA less distinct, it doesn't mean it's not the correct thing to do.
I’m sorry to be pushing on this when it seems like you are doing the right thing, but could you elaborate more on this sentence from the article?
Why was she being put up in your house and not a hotel, if you weren’t affiliated with the group she was interviewing for? I think this is the part a lot of people were sketched out by, so more context would be helpful.