Take the example of Kat Woods and Emerson Spartz. Allegations of toxic and abusive behaviour towards employees were made four months ago (months after being reported to CEA). Despite Kat Woods denying these concerns and attempting to dismiss and discredit those who attest to the abusive behaviour, both Kat Woods and Emerson Spartz continue to: post on the EA Forum and receive largely upvotes; employ EAs; be listed on the EA opportunity board; and control $100,000s in funding. As far as I can tell, Nonlinear-incubated projects (which they largely control) also continue to be largely supported by the community.
I know of multiple people who are currently investigating this. I expect appropriate consequences to be taken, though it’s not super clear to me yet how to make that happen (like, there is no governing body that could currently force nonlinear to do anything, but I think there will be a lot of pressure if the accusations turn out correct).
I’ve encountered further evidence of similar levels of misconduct by different actors, largely continuing without impediment (I’m currently working on resolving these cases). And (if I understand correctly) Oliver Habryka, who knows both the rationalist and EA communities well, seems surprised by the low levels of integrity in these communities (though he’s not attempting to benchmark against larger society).
Just to be clear, I am pretty happy with the levels of integrity in the core rationality community (though it’s definitely also not perfect). The broader EA community has pretty big problems on this dimension, but I also want to be clear that the EA community is still far above average for communities around the world here; it’s just that reality doesn’t grade on a curve, and we tend to take much more ambitious and unconstrained actions in the world, so failures of integrity and coordination can have much worse consequences. I do think people have historically been massively over-trusting of both EA and Rationality, and this has caused a lot of hurt; I would like people to recalibrate to a high, but not overwhelmingly high, level of trust.