If anyone is after a good example of EA criticism, I cannot recommend the Doing EA Better post by the ConcernedEAs group strongly enough.
Just wanting to express my shared disappointment with how parts of this community embraced crypto, gambling, etc., as Gemma points out in her post.
It boggles my brain that someone can call themselves an Effective Altruist and be promoting NFTs and crypto garbage (and it’s not just SBF, e.g. the Peter Singer & TLYCS NFT auction).
SBF especially was promoting risky, speculative financial assets to unsophisticated retail investors, using celebrities and Super Bowl half-time ads. Add the carbon emissions and, by every calculation, this whole project was a net-negative endeavour.
Strongly agree. I definitely would like to see more content on neartermist causes/careers. But importantly, I would like to see this content contributed by authors who hold neartermist views and can do those topics justice. Whilst I am appreciative of 80,000 Hours and GWWC attempting to accommodate longtermism-skeptics with some neartermist content, their neartermist content feels condescending because it doesn’t properly showcase the perspectives of Effective Altruists who are skeptical of longtermist framings.
I also personally worry 80,000 Hours is seen as the “official EA cause prioritisation” resource and this:
alienates readers with different views and conclusions,
does not show that the EA community has diverse and varied views,
has misled readers into thinking there is an “official EA position” on best careers/ cause areas
Having more neartermist content will help with this, but I would also like to see 80,000 Hours host content from authors with clashing views. E.g., 80,000 Hours makes a very forceful case that climate change is not a material X-Risk, and I would like to see disagreeing writers critique that view on their site.
I also think you hit the nail on the head about many readers being unreceptive to longtermism for concerns like tractability, and that is entirely valid for them.
The problem I am trying to communicate is less about agreeing with deep critiques, and more about attacking the authors of critiques personally.
This is where I think EA (or more fairly—this specific forum) underperforms other intellectual movements. e.g. Physicists, philosophers, social scientists can disagree about different theories in deep but respectful and professional ways. In EA however, deep criticism is received very personally by the community, leading to emotional/personal attacks on character.
There is this great post that notes that:
Within EA, criticism is acceptable, even encouraged, if it lies within particular boundaries, and when it is expressed in suitable terms. … As a community, however, we remain remarkably resistant to deep critiques.
I would say your linkpost sits more within the “Deep Critique” space, which EAs have a knee-jerk tendency to interpret as bad-faith or unfair.
But please don’t let that stop you, otherwise the culture in this community will never improve.
Absolutely not. If anything, it is the other way around.
Statistics is much more applied. I did a statistics degree and became an actuary. It has been a very rewarding and impactful career. I am often quite shocked at the poor data/ statistics skills demonstrated by maths graduates, who (presumably because they are specialised in subjects like topology) are far behind their graduate peers with backgrounds in Psychology or Economics.
Is Branson a good choice for frugality? He owns an island, registered in a tax haven, and lavishes his celebrity & political friends with gifts.
Maybe Mike Cannon-Brookes is a better example of a “frugal” billionaire (frugal relative to the typical billionaire).
The CCC has high standards of research
Would you be able to point to something backing up this claim? Just a word of caution because I don’t believe this to be true (as I explain below).
Lomborg’s name might be familiar (or infamous) to those of us in Australia, where he was at the centre of a big political scandal: the conservative government at the time (then climate-skeptics) was perceived to be pushing universities to host the Copenhagen Consensus Centre, which was seen as political interference in the academic system.
Lomborg has been described as a climate contrarian in Science:
Once the darling of Australia’s conservative government, controversial climate contrarian Bjørn Lomborg has lost his Down Under cachet—and cash.
Australia’s Climate Council is critical of him:
https://www.climatecouncil.org.au/resources/the-low-down-on-lomborg/
And he has made bunk claims about Australian bushfires:
https://iceds.anu.edu.au/news-events/news/controversial-commentator-bjorn-lomborgs-bushfire-claim-debunked
I do understand that all this criticism is centred on Lomborg’s/his centre’s views on climate, which is separate from the cause areas you bring up with e-procurement and land tenure. But his track record on climate does make me cautious about their reputability.
I agree with you; it is disappointing that EAs are doing so little in this area.
In Australia, we have a speaker from ICAN (the Nobel Prize-winning anti-nuclear-weapons NGO) attending the 2023 EAGX Australia in Melbourne. In my opinion, it’s a particularly promising area for big impact (and especially for Aussie EAs) due to the recently developed AUKUS alliance. The details of the alliance are still being fleshed out, and a big opportunity exists to shape the alliance to de-risk the chances of a conflict between great powers.
I’m very fascinated by the Kerala Model and its apparent success. I haven’t looked into it in depth, but it seems like it has similarities to the Nordic Model.
Would be keen for some economic development people to share their thoughts.
I distance myself from longtermism for the reasons you spell out here, i.e. correcting inequality is not seen as a priority. But I do agree that fixing inequality should be a key priority even by longtermist principles. The longtermists that do not think it is a priority are often not even aware that they hold shaky (in my opinion) assumptions that:
Inequality does not deprive us of talent to solve pressing world problems
Inequality does not exacerbate the scale and probability of X-Risks and Catastrophic Risks
Inequality itself is not a significant source of suffering if allowed to perpetuate into the future
There are other reformulations of longtermism that exist outside of the normal EA community, usually by critics of longtermism. For example:
https://www.carlbeijer.com/p/there-is-no-long-term-without-socialism (article is paywalled but if you message me privately, I am happy to share a copy of it I have saved).
Is this really a fair description of IR Realism?
Mearsheimer, to his credit, was able to anticipate the Russian invasion of Ukraine. If his prescriptions had been heeded sooner, perhaps this conflict could have been narrowly avoided.
You could just as easily argue that Mearsheimer’s opponents have done more to enable the Russians.
I’m not saying I agree with Mearsheimer or understand his views fully, but I’m grateful his school of thought exists and is being explored.
I largely agree with your assessment that Quincy is controversial and dogmatic about restraint/ non-intervention.
That being said, they are a valuable source of disagreement in the wider foreign policy community, and doing something very neglected (researching & advocating for restraint/non-intervention).
I know Quincy staff disagree with each other, coming from libertarian, leftist, and realist perspectives. So it is troubling that Cirincione departed, because that difference in perspective is needed. Although I do suspect Parsi is describing things accurately when he says Cirincione left because he wanted the Institute to adopt his position on the Russian-initiated war on Ukraine.
Quincy are exploring a controversial analysis of the current Russia-Ukraine conflict, seeking to identify whether Russia’s invasion could have been avoided in the first place (e.g. by bringing Russia into NATO back when it wanted to join), and advocating that Ukraine and Russia compromise to reduce casualties (to be fair, it’s reported the White House has also urged Ukraine to make compromises at times). Whilst controversial, I do think this is worthwhile, and though I myself might disagree (and I believe they all disagree amongst themselves), I want to see this research/advocacy explored and debated. I had been nervous when the invasion started that Quincy’s work could dip into Kremlin-apologetics, but they seem to have steered away from that and have nuanced perspectives.
Their work on the Iran Nuclear Deal and the conflict in Yemen is far less controversial, and promising.
I find value in them being a counterbalance to the more hawkish think tanks which are much better resourced.
On the 80K job board, you have a few institutions (well respected and worthwhile no doubt) like CSIS & RAND, which are more interventionist and/or funded by arms manufacturers (even RAND is indirectly funded by the grants it receives from AEI), so I do worry that there is a systemic bias for interventionist views.
I hope people don’t write-off Quincy’s work or other anti-interventionist/restraint-focused work entirely, but certainly agree, take it with a grain of salt. I certainly do.
Thank you Stephen for your long engagement with this topic, because I do think it is a very real risk that Effective Altruists should pay more attention to.
In addition to the actions you proposed, I also wanted to suggest there might be promising actions in reducing the conflicts of interest that incentivise conflict and escalate tensions. The high amount of political lobbying and sponsoring of think tanks and universities by weapons companies creates perverse incentives.
I have been very impressed by the work of the Quincy Institute to bring attention to this issue and to explore diplomatic options as alternatives to conflict. I would love to see 80,000 Hours promote them on their job board or interview them.
I’ve written to my local MPs about banning contributions from weapons makers (Lockheed Martin, Boeing, etc.) to the Australian Gov’t military think tank ASPI. Here in Australia, the recent AUKUS security pact has seen an enormous increase in planned military spending and sparked some discussion on the forum. I am trying to raise this as an issue/cause area to explore amongst Aussie EAs.
Very eloquent. I do think the perception is justified, e.g. SBF’s attempt to elect candidates to the US Congress/Senate.
If anything… I probably take people less seriously if they do bet (not saying that’s good or bad, but just being honest), especially if there’s a bookmaker/platform taking a cut.
I think it’s fair for Davis to characterise Schmidt as a longtermist.
He’s recently been vocal about AI X-Risk. He funded Carrick Flynn’s campaign which was openly longtermist, via the Future Forward PAC alongside Moskovitz & SBF. His philanthropic organisation Schmidt Futures has a future focused outlook and funds various EA orgs.
And there are longtermists who are pro AI like Sam Altman, who want to use AI to capture the lightcone of future value.
I can’t say how effective they are in this space, but UNHCR is active and reputable.
Terrible situation.
The harsh criticism of EA has only been a good thing, forcing us to have higher standards and rigour. We don’t want an echo chamber.
I would see it as a thoroughly good thing if Open Philanthropy were to combat the portrayal of itself as a shadowy cabal (like in the recent Politico piece), for example by:
Having more democratic buy-in with the public
e.g. having a bigger public presence in the media, relying on a more diverse pool of funding (i.e. less billionaire funding)
Engaging in less political lobbying
Being more transparent about the network of organisations around them
e.g. from the Politico article: “… said Open Philanthropy’s use of Horizon … suggests an attempt to mask the program’s ties to Open Philanthropy, the effective altruism movement or leading AI firms”