I quit. I’m going to stop calling myself an EA, and I’m going to stop organizing EA Ghent, which, since I’m the only organizer, means that in practice it will stop existing.
It’s not just because of Manifest; that was merely the straw that broke the camel’s back. In hindsight, I should have stopped after the Bostrom or FTX scandal. And it’s not just because they’re scandals; it’s because they highlight a much broader issue within the EA community regarding whom it chooses to support with money and attention, and whom it excludes.
I’m not going to go to any EA conferences, at least not for a while, and I’m not going to give any money to the EA fund. I will continue working for my AI safety, animal rights, and effective giving orgs, but will no longer be doing so under an EA label. Consider this a data point on what choices repel which kinds of people, and whether that’s worth it.
EDIT: This is not a solemn vow forswearing EA forever. If things change I would be more than happy to join again.
EDIT 2: For those wondering what this quick-take is reacting to, here’s a good summary by David Thorstad.
Thanks for sharing your experience here. I’m glad you see a path forward that involves continuing to work on issues you care about despite distancing yourself from the community.
In general, I think people should be more willing to accept that you can accept EA ideas or pursue EA-inspired careers without necessarily accepting the EA community. I sometimes hear people struggling with the fact that they like a lot of the values/beliefs in EA (e.g., desire to use evidence and reason to find cost-effective and time-effective ways of improving the world) while having a lot of concerns about the modern EA movement/community.
The main thing I tell these folks is that you can live by certain EA principles while distancing yourself from the community. I’ve known several people who have distanced themselves from the community (for various reasons, not just the ones listed here) but remained in AI safety or other topics they care about.
Personally, I feel like I’ve benefitted quite a bit from being less centrally involved in the EA space (and correspondingly being more involved in other professional/social spaces). I think this comment by Habryka describes a lot of the psychological/intellectual effects that I experienced.
Relatedly, as I specialized more in AI safety, I found it useful to ask questions like “what spaces should I go to where I can meet people who could help with my AI safety goals”. This sometimes overlapped with “go to EA event” but often overlapped with “go meet people outside the EA community who are doing relevant work or have relevant experience”, and I think this has been a very valuable part of my professional growth over the last 1-2 years.
I 100% agree with you on your general point, Akash, but I think something slightly different is going on here, and I think it’s important to get it right.
To me, it sounds like you’re saying, ‘Bob is developing a more healthy relationship with EA’. However, I think what’s actually happening is more like, ‘Bob used to think EA was a cool thing, and it helped him do cool things, but then people associated with it kept doing things Bob found repugnant, and so now Bob does not want anything to do with it’.
Bob, forgive me for speaking on your behalf, and please correct me if I have misinterpreted things.
A bit strong, but about right. The strategy the rationalists describe seems to stem from a desire to ensure their own intellectual development, which is, after all, the rationalist project. By disregarding social norms you can start conversing with lots of people about lots of stuff you otherwise wouldn’t have been able to. Tempting; however, my own (intellectual) freedom is not my primary concern. My primary concern is the overall happiness (or feelings, if you will) of others, and certain social norms are there to protect that.
To me, it sounds like you’re saying, ‘Bob is developing a more healthy relationship with EA’.
Oh, just a quick clarification: I wasn’t trying to say anything about Bob or Bob’s relationship with EA here.
I just wanted to chime in with my own experience (which is not the same as Bob’s but shares one similarity in that they’re both in the “rethinking one’s relationship with the EA community/movement” umbrella).
More generally, I suspect many forum readers are grappling with this question of “what do I want my relationship with the EA community/movement to be”. Given this, it might be useful for more people to share how they’ve processed these questions (whether they’re related to the recent Manifest events or related to other things that have caused people to question their affiliation with EA).
Thanks for writing this, Bob. I feel very saddened myself by many of the things I see in EA nowadays, and have very mixed feelings about staying involved that I’m trying to sort through. I appreciate hearing your thought process on this, and I wish you the best in your future endeavors!
This isn’t good. This really isn’t good.

Because I want to avoid the whole thing, and I am far less attached to EA because of these arguments, while being on the opposite side of the political question from where I assume you are.
Anyways, I’d call this weak evidence for the ‘EA should split into rationalist!EA and normie!EA’ position.
Intuitively, though, it seems likely that it would be better for the movement if only people from one side were leaving, rather than the controversies alienating both camps from the brand.
EA is already incredibly far outside the “normiesphere,” so to speak; calling it that makes some heavily normative assumptions. What you’re looking for is more along the lines of “social justice progressive” EA and SJP-skeptical EA. As much as some people like to claim “the ideology is not the movement,” I would agree that such a split is ultimately inevitable (though I think it will also gut a lot of what makes EA interesting, and eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy).
Still not that accurate, since I suspect there are a fair number of people who disagree with Hanania but think he should be allowed to speak, while also supporting the global health efforts in Africa. But so it goes when trying to name amorphous and decentralized groupings.
eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy
This seems unlikely to me for several reasons, foremost amongst them that they would lose interest in animal welfare. Do you think that progressives are not truly invested in it, and that it’s primarily championed by their skeptics? Because the data seems to indicate the opposite.
PETA, among other (rather less obnoxious and more effective) animal welfare organizations, has been around for longer than EA; I don’t think losing what makes EA distinct would entail losing animal welfare altogether. The shrimp and insect crowd probably wouldn’t remain noticeable, though. Not because I think they overlap heavily with the skeptic-EA crowd (quite the opposite), but because they’d simply be drowned out. Tolerance of weirdness is a fragile thing.
I do think the evidence is already there for a certain kind of losing, or wildly redefining, “effective”: i.e., criminal justice reform. It’s a good cause, but there’s no way to fit it into “effectiveness per dollar” terms without stretching the term to meaninglessness.
Based on your background and posts on here, I think this is a shame.
And I say that as someone who has never called himself an EA, even though I share its broad goal and have a healthy respect for the work of some of its organizations and people (partly because of similar impressions to the ones you’ve formed, but also because my cause area and other interests don’t overlap with EA quite as much as yours do).
I hope you continue to achieve success and enjoyment in the work you do. Given you’re in Brussels, I wondered if you’d checked out the School for Moral Ambition, which appears to be an EA-ish philosophy-plus-campaigning org trying to expand from your Dutch neighbours (no affiliation other than seeing it discussed here).
I appreciate what Rutger Bregman is trying to do, and his work has certainly had a big positive impact on the world, almost certainly larger than mine at least. But honestly, I think he could be more rigorous. I haven’t looked into his School for Moral Ambition project, but I have read (the first half of) his book “Humankind”, and despite vehemently agreeing with its conclusion, I would never recommend it to anyone, especially not anyone who has done any research before.
There seems to be some sort of trade-off between wide reach and rigor. I noticed a similar thing with other public intellectuals associated with EA: for example, Sam Harris and his book “The Moral Landscape” (I haven’t read any of his other books, mostly because this one was so riddled with sloppy errors), and Steven Pinker’s “Enlightenment Now” (I haven’t read any of his other books either, again because of the errors in this one). (Also, I’ve seen some clips of them online, and while that’s not the best way to get information about someone, they didn’t raise my opinion of them, to say the least.)
Pretty annoying overall. At least Bregman is not prominently displayed on the EA People page like they are (even though what I read of his book was comparatively better). I would remove them from it, but the last time I removed SBF and Musk, that edit got downvoted and I had to ask a friend to upvote it (and this was after SBF was detained, so I don’t think a Harris or Pinker edit would fare much better). Pretty sad, because I think EA has much better people to display than a lot of the individuals on that page, especially considering that some of them (like Harris and Pinker) don’t even currently identify as EA.
Interesting; you’re clearly more familiar with Bregman than I am. I was thinking of it in terms of what he appears to be trying to do (using social reinforcement to help people find interesting cause areas and commit to them) rather than his philosophy.
There’s definitely a tradeoff between wide reach and rigour when writing for public audiences, but I think most people fall short of rigour most of the time. Still, those who claim exceptional rigour as their distinguishing characteristic should definitely try to avoid appearing more cliquey and arbitrary in their decision-making than average...
When it comes to someone like Pinker it’s the tone that irritates me more than the generalizations, to the point I’m even more annoyed when I think he’s right about something! If Bregman sometimes sounds similar I can see how it would grate.