“Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks.” (New Yorker)
What makes EA EA, and what makes it antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded something is super effective, then there is no point to EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
Being honest, I do genuinely think that climate change is less important than runaway AI, because of both option value and the stakes of the problem. Climate change is a big problem that could hurt or kill millions, while AI could kill billions; and a warming world still leaves us options to course-correct, whereas an existential AI catastrophe leaves none.
But I’m concerned that they couldn’t simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.
Finances should be partially anonymized. If an EA org receives an individual contribution above a certain threshold, we should be transparent in saying that we will reject the money unless it is donated anonymously. You may protest that this would reduce the number of donations from rich billionaires. But look at it this way: if they donate to EA, it is because they believe EA can spend the money better than they can. They should therefore be willing to donate anonymously, so as not to affect how EA spends it. And if they won’t donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.
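To make the rule concrete, here is a minimal sketch in Python. The threshold figure and the function name are hypothetical, since the proposal doesn’t pin down either:

```python
LARGE_GIFT_THRESHOLD = 1_000_000  # hypothetical cutoff, in dollars

def accept_donation(amount: float, is_anonymous: bool) -> bool:
    """Accept small gifts unconditionally; accept gifts above the
    threshold only if the donor agrees to anonymity, so that no single
    donor can use a large gift to steer how EA spends money."""
    if amount <= LARGE_GIFT_THRESHOLD:
        return True
    return is_anonymous
```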
Disagree, this would make transparency worse without providing much benefit.
The new “support” button is great, but I think the EA Forum should also have a way to sort by controversiality, and the forum’s ranking algorithm should occasionally (some ϵ% of the time) punt controversial posts back up to the front page. If you’re like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent, but if up-votes are the only way we aggregate how “good” red-teaming is, that is flawed. Perhaps the best way to know that a criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. Or, even better, within an organization, use a weighted fraction that puts lower weight on the people who hold positions of power (obviously difficult to implement in practice).
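To make this concrete, here is a minimal Python sketch of both mechanisms: the weighted agree/disagree fraction, folded so it peaks at an even split, and a front-page ranking that punts the most contested post back to the top some ϵ% of the time. Everything here is an illustrative assumption rather than the Forum’s actual algorithm: the karma field, the vote-weight lists, the 0.5 insider weight, and ϵ = 0.05 are all made up.

```python
import random

def controversy_score(agree_weights, disagree_weights):
    """Weighted fraction of disagreement, folded around 0.5 so that 1.0
    means a perfectly even split and 0.0 means consensus. Votes from
    people in positions of power at the relevant org carry weight < 1."""
    agree, disagree = sum(agree_weights), sum(disagree_weights)
    total = agree + disagree
    if total == 0:
        return 0.0
    disagree_frac = disagree / total
    return 1.0 - 2.0 * abs(disagree_frac - 0.5)

def front_page(posts, epsilon=0.05):
    """Rank by karma as usual, but with probability epsilon promote the
    most controversial post back to the top of the front page."""
    ranked = sorted(posts, key=lambda p: p["karma"], reverse=True)
    if random.random() < epsilon:
        contested = max(ranked, key=lambda p: controversy_score(
            p["agree_w"], p["disagree_w"]))
        ranked.remove(contested)
        ranked.insert(0, contested)
    return ranked

posts = [
    # Broad consensus: high karma, almost no disagreement.
    {"title": "Bednets update", "karma": 120,
     "agree_w": [1.0] * 40, "disagree_w": [1.0] * 2},
    # Contested red-team: near-even split; org insiders' votes weighted 0.5.
    {"title": "Red-team of org X", "karma": 15,
     "agree_w": [1.0] * 18, "disagree_w": [1.0] * 14 + [0.5] * 6},
]
for p in posts:
    print(p["title"], round(controversy_score(p["agree_w"], p["disagree_w"]), 2))
```

The folding around 0.5 is the point: the score is highest exactly when the community is split down the middle, which is the “touched a nerve” signal the raw fraction is meant to capture, and down-weighting insiders keeps an organization from burying criticism of itself with its own votes.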
Disagree here because I don’t want to see an EA forum that values controversial posts.
“But I’m concerned that they couldn’t simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.”
Agree
“Disagree here because I don’t want to see an EA forum that values controversial posts.”
Disagree. This is like saying, “Amazon shouldn’t sort by 1 star, because otherwise it will get a bad reputation for selling bad products.”
That’s wrong. People still have the option of sorting by whatever they choose. But the forum should give more visibility to posts that break people out of their comfort zone, should they desire.
“Disagree. This is like saying, ‘Amazon shouldn’t sort by 1 star, because otherwise it will get a bad reputation for selling bad products.’ That’s wrong. People still have the option of sorting by whatever they choose. But the forum should give more visibility to posts that break people out of their comfort zone, should they desire.”
The reason I disagree is that, in my view, the internet already rewards controversy and outrage far too much, and the EA Forum is much better precisely because it avoids letting outrage and controversy drive the process.
Controversiality need not be strongly correlated with outrage. In fact, outrage can be very uncontroversial (a school shooting), and controversy is often productive (a debate about X). My inclination is to trust the readership of this forum: promoting the visibility of controversial posts will help people discuss ideas they have neglected.