I’m wary of EAs performatively self-flagellating and accepting more responsibility for the FTX thing than is warranted (given, e.g., that huge numbers of people with a very direct financial incentive to spot FTX’s fraud didn’t spot it, so it’s obviously not weird if random EAs failed to spot it).
I don’t think this is about spotting fraud at all. I think the strong case for EA responsibility goes like:
EA promoted earning to give
When the movement largely moved away from it, not enough work was done to communicate that distance (such that the average person still strongly associates EA with earning to give)
Sam adopted an extremely reckless approach to EV-maximisation that was highly likely, if not guaranteed, to lead to a severe loss in value, regardless of the illegality of that loss
He publicly branded himself with this reckless EV-maximisation approach
He publicly associated himself with the movement, with earning-to-give specifically, and donated lots of money to EA causes to prove it
Many EA causes accepted that money and/or associated themselves with Sam
Critics of Sam’s recklessness did not speak up or were drowned out by other community members
The counterfactuals look something like this:
If EA had strongly distanced themselves from ETG, Sam may not have considered it as a path, or been unable to tie himself to EA
If EA had criticised Sam’s recklessness, this may have moderated it
If EA orgs had made their acceptance of his donations conditional on financial transparency, this may have mitigated the recklessness (and this would have been in their own interests, since they stood to lose money!)
FWIW, I don’t necessarily agree with this, but this is what feels either explicit or implicit in a lot of the EA criticism around this that I’ve read. The focus appears to be explicitly on the causal chain of Sam adopting a reckless flavour of ETG which motivated him to take risks which didn’t pay off, and what EA, which was surely influential over his thinking, could’ve done to prevent it.
Any claim that advising people to earn to give is inherently really bad needs to either defend the view that “start a business or take another high paying job” is inherently immoral advice, or explain why it becomes immoral when you add “and give the money to charity” or when it’s aimed at EAs specifically. It’s possible that can be done, but I think it’s quite a high bar. (Which is not to say EtG advice couldn’t be improved in ways that make future scandals less likely.)
You’re right! It’s not that ETG is inherently bad (and frankly, I haven’t seen anyone make this argument), it’s that specific EV-maximising interpretations of ETG cause people to pursue careers that are (1) harmful, (2) net harmful, or (3) too risky to pay off.
Personally, I think FTX was (1) and (3), and, though I initially thought it unlikely to be (2), probably also (2). I’m not really sure where the bar is, but under any moderately deontological framework (1) is especially concerning, and many of the people EA might want to have a good reputation with believe (1). So that’s roughly the worldview-neutral case for caring about strongly rejecting EV-maximising forms of ETG.

Wait, why do you think (2) is false for FTX? (Good comment though!)
I haven’t run the numbers myself but I generally assume that FTX’s account-holders were mostly moderately well-off HIC residents (based on my rough impression of crypto demographics), and the Future Fund’s beneficiaries are by and large worse off. There were probably some number of people who invested their life savings or were otherwise poor to begin with who were harmed more significantly than the beneficiaries of their money. But on the whole it feels like it was an accidental wealth transfer, and much of that harm will be mitigated if they’re made whole (but admittedly, the make-whole money just comes from crypto speculation that trades on the gullibility of yet more people).
But much less confident in this take; my point is much more around the real harms it caused being worth thinking about.
The possibility of “made whole” is based on crypto values at the time of bankruptcy filing—meaning not really whole.
FTXFF was IIRC ~$150MM in grants, a substantial portion of which will end up clawed back. The losses (based on current crypto values) are far greater than that. I suspect that the bulk of the economic impact involves transfers from customers, investors, and lenders to Alameda creditors?
Also, one can argue that crypto itself is net harmful, so any crypto career is presumptively so as well.

I tend to agree with all these points, actually. I’d forgotten about the clawbacks, and specifically how substantial they were.
I don’t think “advising people to earn to give is inherently really bad” is necessary to reach the conclusion that there is a case for EA responsibility here. There exist many ideas that are not inherently bad, but yet it is irresponsible to advocate for them in certain ways / without certain safeguards. An ends-justifies-the-means approach to making money was a foreseeable response to EtG advocacy. Did EA actors do enough to discourage that kind of approach when advocating for EtG?
I don’t think it’s necessary, no. But I do think some early critics of EtG were motivated at least partly by a general anticapitalist case that business or at least finance careers were generically morally problematic in themselves.
Fair, but that wouldn’t be a steelmanned—or even fairly balanced—version of criticisms of EtG. It’s the weaker part of a partial motivation held by some critics.
True. Before assuming any particular safeguard would have helped, though, we should check whether it was actually absent from how people advocated for EtG. For what it’s worth, my sense is that a much more culpable thing was not blowing the whistle on Sam’s bad behaviour at early Alameda even after Will and other leaders (I forget exactly who, if it’s even known) were informed about it. That mistake was almost certainly far less consequential for the people harmed by FTX (I don’t think it would have stopped the fraud; it might have protected EA itself), but I strongly suspect it was more knowably wrong at the time than anything anyone did or said about EtG as a general idea.
I think there are two separate but somewhat intertwined chains of inquiry under discussion here:
A historical inquiry: what happened in this case, what safeguards failed, what would have helped but wasn’t in place?
A ~first-principles re-evaluation of EtG based on an update: The catastrophic failure of the supposedly most successful instance of EtG should update us that we underestimated the risk and severity of EtG downsides. That suggests a broader re-examination of potential risks and safeguards, which may look more appropriate than they did before the update.
By analogy, whenever there is a school shooting, I don’t think it is inappropriate to analyze and discuss things merely because they would not have prevented that specific school shooting. However, those doing so should be careful to avoid claiming that their preferred intervention would have been effective in the case in the news.

Agreed.

Why would we want to do that? Earning to give is a good way to help the world. Maybe not the best, but still good.
Earning to Give still seems the best way to contribute for many people (e.g. people with exceptionally high earning potential, or people with decently paying jobs who aren’t a good fit for direct work or don’t want to switch jobs). I don’t think we should distance ourselves from it.
I’d add that I think 80K has done an awful lot to communicate “EA isn’t just about earning-to-give” over the years. At some point it surely has to be the case that they’ve done enough. This is part of why I want to distinguish the question “did we play a causal role here?” from questions like “did we foreseeably screw up?” and “should we do things differently going forward?”.
The complaints here seem to be partly about HOW EtG is promoted, rather than how MUCH. Though I am mildly skeptical that people in fact did not warn against doing harm to make money while promoting EtG, and much more skeptical that SBF would have listened if they had done this more.
Yeah, I had 80k in mind when I thought about EAs who did distance themselves from harms. They’ve had this article up for ages (I remember reading it early in my EA journey). I think a good SBF/EtG postmortem would try to establish which orgs didn’t do as well at this, and what the counterfactuals really were, and I think it may well conclude there wasn’t much else EA-promoting orgs could’ve done. (Although, if I had to put money on it, I’d say explicit condemnations could’ve predictably helped.)
“At some point it surely has to be the case that they’ve done enough.”
This doesn’t seem true? It makes perfect sense for advocacy groups to continue advocating their position, since a lot of the point is to reach people for whom the message is new. 80k is (or at least was) all about how to use your career for good; I would expect them to always be talking about earning to give as an option.

I mean “done enough” in the sense that 80K is no longer at fault for falling short, not in the sense that they should necessarily stop sharing that message.
Yes, there are many indirect ways EA might have had a causal impact here, including by influencing SBF’s ideology, funneling certain kinds of people to FTX, improving SBF’s reputation with funders, etc. Not all of these should necessarily cause EAs to hand-wring or soul-search — sometimes you can do all the right things and still contribute to a rare disaster by sheer chance. But a disaster like this is a good opportunity to double-check whether we’re living up to our own principles in practice, and also to double-check whether our principles and strategies are as beneficial as they sounded on paper.