Around six months ago, I suggested that we would have trouble with FTX and funding.
SBF has been giving lots of money to EA, and he admits crypto is a massively speculative bubble. A crypto crash hurts the most vulnerable, because poor, uneducated people put a lot of money into it (Krugman). Crypto is currently small, but it should be regulated and has potential contagion effects (BIS). EA as a whole is getting loose with its money due to large crypto inflows (MacAskill). An inevitable crypto crash therefore leads to either a) bad optics and less interest in EA, or b) lots of dead projects.
It was quite obvious that this would happen—although the specific details with Alameda were not obvious. Stuart Buck was the only one who took me seriously at the time.
Below are some suggestions for change.
1. The new “support” button is great, but I think the EA forum should also have a way to *sort* by controversiality, and the forum algorithm should occasionally (some ϵ% of the time) punt controversial posts back up to the front page. If you’re like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent; however, if up-votes are the only way we aggregate how “good” a piece of red-teaming is, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. Even better, within an organization, use a weighted fraction that puts lower weight on votes from people in positions of power (obviously difficult to implement in practice). A rough sketch of both ideas follows the list.
2. More of you should consider anonymous posts. This is the EA forum. I cannot believe that some of you delete your posts simply because they end up being downvoted. Especially if you’re working higher up in an EA org, you ought to be actively voicing your dissent and helping to monitor EA.
For example, this is not good:
“Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks.” (New Yorker)
What makes EA, *EA*, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have *already concluded something is super effective*, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
3. Finances should be partially anonymized. If an EA org receives a contribution above a certain threshold from an individual, we should be transparent in saying that we will reject the money unless it is donated anonymously (a minimal sketch of this rule also follows the list). You may protest that this would decrease donations from rich billionaires. But look at it this way: if they donate to EA, it’s because they believe EA can spend the money better. Then they should be willing to donate anonymously, so as not to affect how EA spends it. If they don’t donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.
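To make item 1 concrete, here is a minimal sketch. All names, constants, and the `Post` shape are my placeholders, not the forum’s actual API; it only illustrates the controversy fraction, the power-weighted variant, and the ϵ% front-page bump:

```python
import random
from dataclasses import dataclass

EPSILON = 0.05       # hypothetical: fraction of refreshes that surface a controversial post
POWER_WEIGHT = 0.25  # hypothetical: down-weight for voters in positions of power

@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int
    magic: float  # the forum's existing "Magic (New and Upvoted)" score

def controversy(post: Post) -> float:
    """Fraction of voters on the losing side, in [0, 0.5].
    0.5 means the community is split evenly; 0 means unanimity."""
    total = post.upvotes + post.downvotes
    return min(post.upvotes, post.downvotes) / total if total else 0.0

def weighted_controversy(votes: list[tuple[bool, bool]]) -> float:
    """Each vote is (agrees, voter_holds_power). Votes from people in
    positions of power count for less, per the suggestion above."""
    agree = disagree = 0.0
    for agrees, holds_power in votes:
        w = POWER_WEIGHT if holds_power else 1.0
        if agrees:
            agree += w
        else:
            disagree += w
    total = agree + disagree
    return min(agree, disagree) / total if total else 0.0

def front_page(posts: list[Post], k: int = 10) -> list[Post]:
    """Rank by Magic as usual, but with probability EPSILON swap the
    most controversial post into the last front-page slot."""
    ranked = sorted(posts, key=lambda p: p.magic, reverse=True)[:k]
    if posts and random.random() < EPSILON:
        hot = max(posts, key=controversy)
        if hot not in ranked:
            ranked[-1] = hot
    return ranked
```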
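And a minimal sketch of the acceptance rule in item 3. The threshold figure is a placeholder, since the proposal deliberately leaves “a certain threshold” unspecified:

```python
ANON_THRESHOLD_USD = 100_000  # placeholder; the post does not name a figure

def accept_donation(amount_usd: float, is_anonymous: bool) -> bool:
    """Small donations are accepted as usual. Above the threshold, a
    donation is accepted only if it is anonymous, so that no single
    named donor can steer how the org spends its money."""
    return amount_usd <= ANON_THRESHOLD_USD or is_anonymous
```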
> “Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks.” (New Yorker)
>
> What makes EA, EA, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded something is super effective, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
Being honest, I do genuinely think that climate change is less important than runaway AI, primarily because of option value and the stakes involved: climate change is a big problem that could hurt or kill millions, while runaway AI could kill billions.
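To spell out the stakes comparison (the probabilities below are purely illustrative placeholders, not estimates from this thread): even if a catastrophic AI outcome is judged much less likely than a severe climate outcome, the number of lives at stake can dominate the expected harm:

$$
\mathbb{E}[\text{deaths}] = p \times N:\qquad
\underbrace{0.01 \times 8\times 10^{9}}_{\text{AI}} = 8\times 10^{7}
\;>\;
\underbrace{0.5 \times 10^{8}}_{\text{climate}} = 5\times 10^{7}.
$$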
But I’m concerned that they couldn’t simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.
> Finances should be partially anonymized. If an EA org receives a contribution above a certain threshold from an individual, we should be transparent in saying that we will reject the money unless it is donated anonymously. You may protest that this would decrease donations from rich billionaires. But look at it this way: if they donate to EA, it’s because they believe EA can spend the money better. Then they should be willing to donate anonymously, so as not to affect how EA spends it. If they don’t donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.
Disagree, this would make transparency worse without providing much benefit.
> The new “support” button is great, but I think the EA forum should also have a way to sort by controversiality, and the forum algorithm should occasionally (some ϵ% of the time) punt controversial posts back up to the front page. If you’re like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent; however, if up-votes are the only way we aggregate how “good” a piece of red-teaming is, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. Even better, within an organization, use a weighted fraction that puts lower weight on votes from people in positions of power (obviously difficult to implement in practice).
Disagree here because I don’t want to see an EA forum that values controversial posts.
I’ve made this into a post on the forum, because I’m afraid it’ll get buried in the comments here. Please comment on the forum post instead.
https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change