Free link: https://archive.ph/CGjTz (h/t lincolnq)
The Economist recently published this criticism of EA, especially its power structures. Personal thoughts in the comments.
Imagine thinking this is a good outcome of the “keep your mouth shut” strategy CEA recommends regarding media:
Terrible look, to be honest.
Isn’t it somewhat ironic, though, that you care what The Economist’s journalists think, and are implying that the forum post shouldn’t have been made because it created bad PR?
I just find it funny, because posting something like that on a public forum will of course be seen by journalists sooner or later anyway.
It’s the second bit that concerns me more, because I think it’s essentially a correct description of how CEA, and EAs in general (largely because of CEA’s influence), view public engagement: any interaction outside the community is treated mainly as a risk to be mitigated. The way it’s phrased makes it sound as if CEA stopped 78% of 137 virus outbreaks.
As I wrote elsewhere, I think the danger of the “don’t talk to media” approach is that it leaves very few views into the movement, mostly from leadership, and if one of those rare appearances takes a wrong turn, there is no plurality of other views and appearances out there to balance it.
For example, if the only people who “should” give interviews are EA leadership philosophers who are deeply into longtermism, it will seem as though the entire EA movement is about longtermism. This is not true.
Many of these concerns resonated with me.
As a relative outsider, I formed my understanding of EA from its online content, which emphasises utilitarianism and longtermism. When speaking to EAs in person, I’m often surprised that community members (and leaders?) hold these perspectives more weakly than I expected. I think there are messaging issues here. Part of the problem may be that longtermist causes are more interesting to write and talk about. We should be careful to allocate attention to cause areas in proportion to their significance.
Too much of the ecosystem depends on a few grantmakers and re-granters, which concentrates too much power in relatively few people’s hands. (At the same time, this seems to be a very hard problem to solve; no particular initiatives come to mind.)
I see EA’s concerns with reputational risk and optics as flaws with its overly utilitarian perspective. Manipulating the narrative has short-term reputational benefits and hidden long-term costs.
At the same time, I am sceptical of EA’s ability to adequately address these issues. Such concerns have been raised before without significant change. Many of these problems seem to have arisen from the centralisation of power and the over-weighting of community leaders’ opinions, yet the community is simultaneously decentralised enough that coordinating such a change is difficult.
That’s interesting; I’ve had the exact opposite experience. I was attracted to EA for reasons similar to those Zoe and Ben mention in the article, such as global poverty and health, but then found that everyone I was meeting in the EA community was working on longtermist causes (mostly AI alignment and safety). We’ve discussed whether, since my club was at a university, most of the students in it at the time were simply more career-aligned with longtermist work. I don’t know how accurate that is, though.
Sadly, I agree with many of the points in this article.
I’ve been thinking this for a long time but haven’t been able to put together something so succinct. Personally, I will carry on championing my interpretation of EA, which is to treat charity like your investments and get the best bang for your buck. Whether I’ll use the term ‘EA’ to describe myself will depend on the next few months: if the general understanding of EA becomes speculative longtermism, cultish behavior, and ‘ends justify the means’, then I’d rather not bring it up.
Maybe EA will split in two: one group carrying on as it is now, focused on longtermism, and another focused solely on real impacts that can be seen and measured within our lifetimes. Maybe it doesn’t matter, as long as you and I keep this in mind when we donate to charities funding malaria nets that save real lives today, however small that impact might seem compared to SBF and the trillions of future humans at risk of AI going rogue on Mars.
Edit: Not to say longtermism doesn’t have its place; I just feel too much time is spent on things that may never happen while real people face real issues today (or in the near future, as with pandemic preparedness).
I thought this was a relatively balanced piece, actually, as far as criticisms go. The author is clearly not a fan, but I feel she resisted the temptation to straw-man far more than most critics do. Good on her (...or good on The Economist, if this is their general style?).
I think this phrasing is unfortunate though:
I imagine most readers will interpret this as a threat from funders, whereas my understanding was that this was a case of other community members looking out for Cremer and Kemp, telling them they were worried this might happen. From their post:
(By the way, the free link has an extra comma at the end which needs removing for the link to work.)
Thanks for clarifying this! I really had interpreted it as a threat from funders.
Other than sometimes seeming to conflate EA with utilitarianism, I thought this was quite a good piece that raises some important pain points in the movement.
Let’s aim to become more transparent, less secretive, and more decentralized, and put whistleblower protections in place.
I found this quote particularly salient: “Having a handful of wealthy donors and their advisers dictate the evolution of an entire field is bad epistemics at best and corruption at worst.”
Thank you for sharing! A one-sentence thought on one of the paragraphs towards the end, written by a former EA member...
I can understand the drive to accomplish critical work and achieve EA objectives, but a structure to safeguard both the work and the EA brand is vital as well.
Do you know if there is a version that isn’t paywalled?
https://archive.ph/CGjTz
Thank you!