I quit. I’m going to stop calling myself an EA, and I’m going to stop organizing EA Ghent, which, since I’m the only organizer, means that in practice it will stop existing.
It’s not just because of Manifest; that was merely the straw that broke the camel’s back. In hindsight, I should have stopped after the Bostrom or FTX scandal. And it’s not just because they’re scandals; it’s because they highlight a much broader issue within the EA community regarding whom it chooses to support with money and attention, and whom it excludes.
I’m not going to go to any EA conferences, at least not for a while, and I’m not going to give any money to the EA fund. I will continue working for my AI safety, animal rights, and effective giving orgs, but will no longer be doing so under an EA label. Consider this a data point on what choices repel which kinds of people, and whether that’s worth it.
EDIT: This is not a solemn vow forswearing EA forever. If things change I would be more than happy to join again.
EDIT 2: For those wondering what this quick-take is reacting to, here’s a good summary by David Thorstad.
Thanks for writing this Bob. I feel very saddened myself by many of the things I see in EA nowadays, and have very mixed feelings about staying involved that I’m trying to sort through—I appreciate hearing your thought process on this. I wish you the best in your future endeavors!
This isn’t good. This really isn’t good.
Because I want to avoid the whole thing, and I am far less attached to EA because of these arguments, while being on the opposite side of the political question from where I assume you are.
Anyways, I’d call this weak evidence for the ‘EA should split into rationalist!EA and normie!EA’ idea.
Intuitively, though, it seems likely that it would be better for the movement if only people from one side were leaving, rather than the controversies alienating both camps from the brand.
EA is already incredibly far outside the “normiesphere,” so to speak. Calling it that is making some incredibly normative assumptions. What you’re looking for is more along the lines of “social justice progressive” EA and SJP-skeptical EA. As much as some people like to claim “the ideology is not the movement,” I would agree that such a split is ultimately inevitable (though I think it will also gut a lot of what makes EA interesting, and eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy).
Still not that accurate, since I suspect there are a fair number of people who disagree with Hanania but think he should be allowed to speak, while also supporting the global health efforts in Africa. But so it goes when trying to name amorphous and decentralized groupings.
eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy
This seems unlikely to me for several reasons, foremost amongst them that they would lose interest in animal welfare. Do you think that progressives are not truly invested in it, and that it’s primarily championed by their skeptics? Because the data seems to indicate the opposite.
PETA has been around for longer than EA, among other (rather less obnoxious and more effective) animal welfare organizations; I don’t think losing what makes EA distinct would entail losing animal welfare altogether. The shrimp and insect crowd probably wouldn’t remain noticeable. Not because I think they overlap heavily with the skeptic-EA crowd (quite the opposite), but because they’d simply be drowned out. Tolerance of weirdness is a fragile thing.
I do think the evidence is already there for a certain kind of losing, or wildly redefining, “effective”: criminal justice reform, for instance. Good cause, but there’s no way to fit it into “effectiveness per dollar” terms without stretching the term to meaninglessness.
Based on your background and posts on here, I think this is a shame.
And I say that as someone who has never called himself an EA, even though I share its broad goal and have a healthy respect for the work of some of its organizations and people (partly because of similar impressions to the ones you’ve formed, but also because my cause area and other interests don’t overlap with EA quite as much as yours do).
I hope you continue to find success and enjoyment in the work you do. Given you’re in Brussels, I wondered if you’d checked out the School for Moral Ambition, which appears to be an EA-ish philosophy-plus-campaigning org trying to expand from your Dutch neighbours (no affiliation other than seeing it discussed here).
I appreciate what Rutger Bregman is trying to do, and his work has certainly had a big positive impact on the world, almost certainly larger than mine at least. But honestly, I think he could be more rigorous. I haven’t looked into his School for Moral Ambition project, but I have read the first half of his book “Humankind”, and despite vehemently agreeing with its conclusion, I would never recommend it to anyone, especially not anyone who has done any research before.
There seems to be some sort of trade-off between wide reach and rigor. I noticed a similar thing with other public intellectuals associated with EA, for example Sam Harris and his book “The Moral Landscape” (I haven’t read any of his other books, mostly because this one was so riddled with sloppy errors), and Steven Pinker’s “Enlightenment Now” (I haven’t read any of his other books either, again because of the errors in this one). I’ve also seen some clips of them online, and while that’s not the best way to get information about someone, they didn’t raise my opinion of them, to say the least.
Pretty annoying overall. At least Bregman is not prominently displayed on the EA People page like they are (even though what I read of his book was comparatively better). I would remove them from it, but the last time I removed SBF and Musk, that edit got downvoted and I had to ask a friend to upvote it (and this was after SBF was detained, so I don’t think a Harris or Pinker edit would fare much better). Pretty sad, because I think EA has much better people to display than a lot of the individuals on that page, especially considering some of them (like Harris and Pinker) don’t even currently identify as EA.
Interesting, you’re clearly more familiar with Bregman than I am: I was thinking of it less in terms of his philosophy and more in terms of the social reinforcement for finding interesting cause areas and committing to them that he appears to be trying to build.
There’s definitely a tradeoff between wide reach and rigour when writing for public audiences, but I think most people fall short of rigour most of the time. Those who claim exceptional rigour as their distinguishing characteristic, though, should try to avoid appearing more cliquey and arbitrary in their decision-making than average...
When it comes to someone like Pinker, it’s the tone that irritates me more than the generalizations, to the point that I’m even more annoyed when I think he’s right about something! If Bregman sometimes sounds similar, I can see how it would grate.