“Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks.” (New Yorker)
What makes EA what it is, and what makes it antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded that something is super effective, then there is no point to EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
Being honest, I genuinely do think that climate change is less important than runaway AI, primarily because of both option-value issues and the stakes involved: one is a big problem that could hurt or kill millions, while AI could kill billions.
But I’m concerned that they felt they couldn’t simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.
Finances should be partially anonymized. If an EA org receives an individual contribution above a certain threshold, we should be transparent in saying that we will reject the money unless it is donated anonymously. You may protest that this would decrease donations from rich billionaires. But look at it this way: if they donate to EA, it’s because they believe EA can spend the money better than they can. So they should be willing to donate anonymously, so as not to influence how EA spends it. If they don’t donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, which makes for more competition.
Disagree, this would make transparency worse without providing much benefit.
The new “support” button is great, but I think the EA Forum should have a way to sort by controversiality, and the forum algorithm should occasionally (some ϵ% of the time) punt controversial posts back up to the front page. If you’re like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent, but if the only way we aggregate how “good” red-teaming is is by upvotes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. (Or, even better, if you are in an organization, use a weighted fraction that puts lower weight on the people in that organization who hold positions of power, though this is obviously difficult to implement in practice.)
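To make the proposal concrete, here is a minimal sketch in Python of what I have in mind. None of this reflects the actual EA Forum codebase: the `Vote`/`Post` classes, the `power_weight` parameter, the `magic_score` callback, and the choice to treat “close to an even weighted split” as maximally controversial are all illustrative assumptions on my part.

```python
import random
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Vote:
    agrees: bool
    in_power: bool = False   # e.g. staff/leadership of the org being criticized

@dataclass
class Post:
    title: str
    votes: List[Vote] = field(default_factory=list)

def disagree_fraction(post: Post, power_weight: float = 0.5) -> float:
    """Weighted share of voters who disagree; votes from people in
    positions of power count for less (power_weight < 1)."""
    agree = disagree = 0.0
    for v in post.votes:
        w = power_weight if v.in_power else 1.0
        if v.agrees:
            agree += w
        else:
            disagree += w
    total = agree + disagree
    return disagree / total if total else 0.0

def controversy(post: Post) -> float:
    """One possible reading: an even agree/disagree split is maximally controversial."""
    return 1.0 - 2.0 * abs(disagree_fraction(post) - 0.5)

def front_page(posts: List[Post], magic_score: Callable[[Post], float],
               epsilon: float = 0.05) -> List[Post]:
    """Rank by the usual 'Magic' score, but some epsilon of the time
    bump the most controversial post to the top of the front page."""
    ranked = sorted(posts, key=magic_score, reverse=True)
    if ranked and random.random() < epsilon:
        hot = max(ranked, key=controversy)
        ranked.remove(hot)
        ranked.insert(0, hot)
    return ranked
```

The point of the sketch is just that the resurfacing rule and the weighted disagree/agree fraction are cheap to compute from data the forum already has; the hard part is choosing the weights, not the code.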
Disagree here because I don’t want to see an EA forum that values controversial posts.
Hi, thanks for replying! I’ve made this into an EA Forum post instead, because I’m afraid it’ll get buried in the comments here. https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change