The FTX Situation: Wait for more information before proposing solutions
Edit: Eli has a great comment on this which I suggest everyone read. He corrects me on a few things, and gives his far more informed takes.
I’m slightly scared that EA will overcorrect in an irrelevant direction in response to the FTX situation, in a way I think is net harmful. A major reason for this fear is seeing lots of people espouse conclusions about solutions to problems before we actually know what the problems are.
Here are some examples of this I’ve seen recently on the forum.
Integrity
It is still uncertain whether SBF intentionally committed fraud or just made a mistake, but people seem to be reacting as if the takeaway from this is that fraud is bad.
These articles are mostly saying things of the form ‘if FTX engaged in fraud, then EA needs to make sure people don’t do more fraud in the service of utilitarianism.’ From a worrying-about-groupthink perspective, this is only a little less concerning than directly saying ‘FTX engaged in fraud, so EA should make sure people don’t do more fraud.’
Even though these articles aren’t literally saying that FTX engaged in fraud in the service of utilitarianism, I worry they will shift the narrative EA tells itself towards up-weighting hypotheses which say FTX engaged in fraud in the service of utilitarianism, especially in worlds where it turns out FTX did commit fraud, but the fraud was motivated by pride or other selfish desires.
Dating
Some have claimed FTX’s downfall happened as a result of everyone sleeping with each other, and this interpretation is not obviously unpopular on the forum. This seems quite unlikely compared to alternative explanations, and the post Women and Effective Altruism has a tone and content I find toxic to community epistemics[1], and which I anticipate wouldn’t have flown on the forum a week ago.
I worry the reason we see this post now is that EA is confused, wants to do something, and is really searching for anything to blame for the FTX situation. If you are confused about what your problems are, you should not go searching for solutions! You should ask questions, make predictions, and try to understand what’s going on. Then you should ask how you could have prevented or mitigated the bad events, and ask whether those prevention and mitigation efforts would be worth their costs.
I think this problem is important to address, and I’m uncertain whether that post is good or bad on net. The point is that I’m seeing a bunch of heated emotions on the forum right now, this is not like the forum I’m used to, and lots of these heated discussions seem to be directed towards pushing new EA policy proposals rather than trying to figure out what’s going on.
Vetting funding
We could immediately launch a costly investigation into who had knowledge of fraud before we actually know whether fraud occurred or why. In worlds where we’re wrong about whether or why fraud occurred, this would be very costly. My suggestion: wait for information to costlessly come out, discuss what happened when we are not in the midst of the fog and emotions of current events, and then decide whether we should launch this costly investigation.
Adjacently, some are arguing EA could have vetted FTX and Sam better, and averted this situation. This reeks of hindsight bias! Probably EA could not have done better than all the investors who originally vetted FTX before giving them a buttload of money!
Maybe EA should investigate funders more, but arguments for this are orthogonal to recent events, unless CEA believes their comparative advantage in the wider market is high-quality vetting of corporations. If so, they could stand to make quite a bit of money selling this service, and should possibly form a spinoff org.
Conclusion
EA is not thinking straight right now. Everyone should stop posting their ill-informed conclusions about the takeaways from recent events, and instead discuss the object-level events, in the hope that the community can actually update on information as it comes in, instead of getting stuck in an incorrect and unhelpful narrative about what happened.
[1] In particular, it ties together observations and policy proposals so that, in order to disagree with the policy proposals, you have to trip over your words to avoid also calling the poster a liar.