I quit. I'm going to stop calling myself an EA, and I'm going to stop organizing EA Ghent, which, since I'm the only organizer, means that in practice it will stop existing.
It's not just because of Manifest; that was merely the straw that broke the camel's back. In hindsight, I should have stopped after the Bostrom or FTX scandal. And it's not just because they're scandals; it's because they highlight a much broader issue within the EA community regarding whom it chooses to support with money and attention, and whom it excludes.
I'm not going to go to any EA conferences, at least not for a while, and I'm not going to give any money to the EA fund. I will continue working for my AI safety, animal rights, and effective giving orgs, but I will no longer be doing so under an EA label. Consider this a data point on which choices repel which kinds of people, and whether that's worth it.
EDIT: This is not a solemn vow forswearing EA forever. If things change, I would be more than happy to join again.
EDIT 2: For those wondering what this quick take is reacting to, here's a good summary by David Thorstad.
Thanks for sharing your experience here. I'm glad you see a path forward that involves continuing to work on issues you care about despite distancing yourself from the community.
In general, I think people should be more willing to accept that you can embrace EA ideas or pursue EA-inspired careers without necessarily embracing the EA community. I sometimes hear people struggling with the fact that they like a lot of the values/beliefs in EA (e.g., the desire to use evidence and reason to find cost-effective and time-effective ways of improving the world) while having a lot of concerns about the modern EA movement/community.
The main thing I tell these folks is that you can live by certain EA principles while distancing yourself from the community. I've known several people who have distanced themselves from the community (for various reasons, not just the ones listed here) but remained in AI safety or other fields they care about.
Personally, I feel like I've benefited quite a bit from being less centrally involved in the EA space (and correspondingly more involved in other professional/social spaces). I think this comment by Habryka describes a lot of the psychological/intellectual effects that I experienced.
Relatedly, as I specialized more in AI safety, I found it useful to ask questions like "what spaces should I go to where I can meet people who could help with my AI safety goals?". This sometimes overlapped with "go to an EA event" but often overlapped with "go meet people outside the EA community who are doing relevant work or have relevant experience", and I think this has been a very valuable part of my professional growth over the last 1-2 years.
I 100% agree with you on your general point, Akash, but I think something slightly different is going on here, and I think it's important to get it right.
To me, it sounds like you're saying, "Bob is developing a more healthy relationship with EA". However, I think what's actually happening is more like, "Bob used to think EA was a cool thing, and it helped him do cool things, but then people associated with it kept doing things Bob found repugnant, and so now Bob does not want anything to do with it".
Bob, forgive me for speaking on your behalf, and please correct me if I have misinterpreted things.
A bit strong, but about right. The strategy the rationalists describe seems to stem from a desire to ensure their own intellectual development, which is, after all, the rationalist project. By disregarding social norms, you can start conversing with lots of people about lots of things you otherwise wouldn't have been able to. Tempting; however, my own (intellectual) freedom is not my primary concern. My primary concern is the overall happiness (or feelings, if you will) of others, and certain social norms exist to protect that.
To me, it sounds like you're saying, "Bob is developing a more healthy relationship with EA".
Oh, just a quick clarification: I wasn't trying to say anything about Bob or Bob's relationship with EA here.
I just wanted to chime in with my own experience (which is not the same as Bob's but shares one similarity in that they're both under the "rethinking one's relationship with the EA community/movement" umbrella).
More generally, I suspect many forum readers are grappling with this question of "what do I want my relationship with the EA community/movement to be?". Given this, it might be useful for more people to share how they've processed these questions (whether they're related to the recent Manifest events or to other things that have caused people to question their affiliation with EA).
Thanks for writing this, Bob. I feel very saddened myself by many of the things I see in EA nowadays, and I have very mixed feelings about staying involved that I'm trying to sort through; I appreciate hearing your thought process on this. I wish you the best in your future endeavors!
This isn't good. This really isn't good.
Because I want to avoid the whole thing, and I am far less attached to EA because of these arguments, while being on the opposite side of the political question from where I assume you are.
Anyway, I'd call this weak evidence for the "EA should split into rationalist!EA and normie!EA" idea.
Intuitively, though, it seems likely that it would be better for the movement if only people from one side were leaving, rather than the controversies alienating both camps from the brand.
EA is already incredibly far outside the "normiesphere", so to speak. Calling it that is making some incredibly normative assumptions. What you're looking for is more along the lines of "social justice progressive" EA and SJP-skeptical EA. As much as some people like to claim "the ideology is not the movement", I would agree that such a split is ultimately inevitable (though I think it will also gut a lot of what makes EA interesting, and eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy).
Still not that accurate, since I suspect there are a fair number of people who disagree with Hanania but think he should be allowed to speak, while supporting the global health efforts in Africa. But so it goes when trying to name amorphous and decentralized groupings.
eventually SJP-EA morphs into bog-standard Ford Foundation philanthropy
This seems unlikely to me for several reasons, foremost amongst them that they would lose interest in animal welfare. Do you think that progressives are not truly invested in it, and that it's primarily championed by their skeptics? Because the data seems to indicate the opposite.
PETA has been around for longer than EA, among other (rather less obnoxious and more effective) animal welfare organizations; I don't think losing what makes EA distinct would entail losing animal welfare altogether. The shrimp and insect crowd probably wouldn't remain noticeable. Not because I think they overlap heavily with the skeptic-EA crowd (quite the opposite), but because they'd simply be drowned out. Tolerance of weirdness is a fragile thing.
I do think the evidence is already there for a certain kind of losing, or wildly redefining, "effective", i.e., criminal justice reform. A good cause, but there is no way to fit it into "effectiveness per dollar" terms without stretching the term to meaninglessness.
Based on your background and posts on here, I think this is a shame.
And I say that as someone who has never called himself an EA, even though I share its broad goal and have a healthy respect for the work of some of its organizations and people (partly because of similar impressions to the ones you've formed, but also because my cause area and other interests don't overlap with EA quite as much as yours do).
Hope you continue to achieve success and enjoyment in the work you do. And, given you're in Brussels, I wondered if you'd checked out the School for Moral Ambition, which appears to be an EA-ish philosophy-plus-campaigning org trying to expand from your Dutch neighbours (no affiliation other than seeing it discussed here).
I appreciate what Rutger Bregman is trying to do, and his work has certainly had a big positive impact on the world, almost certainly larger than mine at least. But honestly, I think he could be more rigorous. I haven't looked into his School for Moral Ambition project, but I have read (the first half of) his book "Humankind", and despite vehemently agreeing with the conclusion, I would never recommend it to anyone, especially not anyone who has done any research before.
There seems to be some sort of trade-off between wide reach and rigor. I noticed a similar thing with other EA public intellectuals, for example Sam Harris and his book "The Moral Landscape" (I haven't read any of his other books, mostly because this one was just so riddled with sloppy errors), and Steven Pinker's "Enlightenment Now" (I haven't read any of his other books either, again because of the errors in this one). (Also, I've seen some clips of them online, and while that's not the best way to get information about someone, they didn't raise my opinion of them, to say the least.)
Pretty annoying overall. At least Bregman is not prominently displayed on the EA People page like they are (even though what I read of his book was comparatively better). I would remove them from it, but the last time I removed SBF and Musk, that edit got downvoted and I had to ask a friend to upvote it (and this was after SBF was detained, so I don't think a Harris or Pinker edit would fare much better). Pretty sad, because I think EA has much better people to display than a lot of the individuals on that page, especially considering some of them (like Harris and Pinker) currently don't even identify as EA.
Interesting; you're clearly more familiar with Bregman than I am. I was thinking of it less in terms of his philosophy and more in terms of what he appears to be trying to do: socially reinforcing people as they find interesting cause areas and commit to them.
There's definitely a trade-off between wide reach and rigour when writing for public audiences, but I think most people fall short of rigour most of the time. Still, those who claim exceptional rigour as their distinguishing characteristic should definitely try to avoid appearing more cliquey and arbitrary in their decision-making than average...
When it comes to someone like Pinker, it's the tone that irritates me more than the generalizations, to the point that I'm even more annoyed when I think he's right about something! If Bregman sometimes sounds similar, I can see how it would grate.