Thanks for this comment.
I’m more interested in reflecting on the foundational issues in EA-style thinking that contributed to the FTX debacle than in ascribing wrongdoing or immorality (though I agree that the whole episode should be thoroughly investigated).
Examples of foundational issues:
FTX was an explicitly maximalist project, and maximization is perilous
Following a utilitarian logic, FTX/Alameda pursued a high-leverage strategy (Caroline on leverage); the decision to pursue this strategy didn’t account for the massive negative externalities imposed by its failure (see the illustrative sketch after this list)
The Future Fund failed to identify an existential risk to its own operation, which casts doubt on their/our ability to perform risk assessment
EA’s inability and/or unwillingness to vet FTX’s operations (lack of financial controls, lack of board oversight, no ring-fence around funds committed to the Future Fund) and SBF’s history of questionable leadership point to overeager power-seeking
MacAskill’s attempt to broker an SBF <> Elon deal re: purchasing Twitter also points to overeager power-seeking
Consequentialism straightforwardly implies that the ends justify the means at least sometimes; for a consequentialist to protest that the ends don’t justify the means is cognitive dissonance
EA leadership’s stance of minimal communication about their roles in the debacle points to a high weight placed on optics / face-saving (Holden’s post and Oli’s commenting are refreshing counterexamples, though I think it’s important to hear more about their involvement at some point too)
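To make the leverage point concrete, here is a minimal sketch using made-up numbers, not anything from FTX/Alameda’s actual positions. Consider a repeated 50/50 triple-or-nothing bet: it has positive expected value every round, so a naive expected-value maximizer stakes everything every time, while the Kelly criterion recommends staking only f* = (bp − q)/b = (2·0.5 − 0.5)/2 = 25% of wealth. The simulation shows why maximization is perilous: the full-stake strategy has a large mean outcome, but almost every run ends in ruin.

```python
import random

def run(stake_fraction: float, rounds: int = 10) -> float:
    """Repeatedly stake a fixed fraction of wealth on a 50/50
    triple-or-nothing bet (illustrative numbers, not FTX's)."""
    wealth = 1.0
    for _ in range(rounds):
        stake = wealth * stake_fraction
        wealth -= stake
        if random.random() < 0.5:
            wealth += 3 * stake  # win: the staked amount triples
        # lose: the staked amount is simply gone
    return wealth

random.seed(0)
trials = 10_000
# Naive EV-maximizer: stake 100% every round (EV grows 1.5x per round).
full = sorted(run(1.0) for _ in range(trials))
# Kelly fraction for this bet: f* = (2*0.5 - 0.5)/2 = 0.25.
kelly = sorted(run(0.25) for _ in range(trials))

for name, outcomes in [("full stake", full), ("kelly stake", kelly)]:
    mean = sum(outcomes) / trials
    median = outcomes[trials // 2]
    ruined = sum(w < 0.01 for w in outcomes)
    print(f"{name}: mean={mean:.2f}  median={median:.2f}  ruined={ruined}")
```

With these numbers the full-stake strategy’s simulated mean comes out around 1.5^10 ≈ 58, driven entirely by a handful of all-win runs, while its median is zero: nearly all of the 10,000 runs go bust. Kelly staking has a smaller mean but steadily grows the median and never approaches ruin. The analogy to FTX is loose, but this is the shape of the problem: a strategy can be “correct” in expectation while making catastrophic failure the overwhelmingly likely outcome, with the externalities of that failure landing on other people.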