Any claim that advising people to earn to give is inherently really bad needs to either defend the view that “start a business or take another high paying job” is inherently immoral advice, or explain why it becomes immoral when you add “and give the money to charity” or when it’s aimed at EAs specifically. It’s possible that can be done, but I think it’s quite a high bar. (Which is not to say EtG advice couldn’t be improved in ways that make future scandals less likely.)
You’re right! It’s not that ETG is inherently bad (and frankly, I haven’t seen anyone make this argument), it’s that specific EV-maximising interpretations of ETG cause people to pursue careers that are (1) harmful, (2) net harmful, or (3) too risky to pay off.
Personally, I think FTX was (1) and (3), and ~~unlikely to be (2)~~ probably also (2). I’m not really sure where the bar is, but under any moderately deontological framework (1) is especially concerning, and many of the people EA might want to have a good reputation with believe (1). So that’s roughly the worldview-neutral case for caring about strongly rejecting EV-maximising forms of ETG.
I haven’t run the numbers myself, but I generally assume that FTX’s account-holders were mostly moderately well-off residents of high-income countries (based on the rough demographics of crypto), and the Future Fund’s beneficiaries are by and large worse off. There were probably some people who invested their life savings, or were otherwise poor to begin with, who were harmed more significantly than the beneficiaries of their money. But on the whole it feels like it was an accidental wealth transfer, and much of that harm will be mitigated if account-holders are made whole (though admittedly, the make-whole money just comes from crypto speculation that trades on the gullibility of yet more people).
But I’m much less confident in this take; my main point is that the real harms it caused are worth thinking about.
The possibility of “made whole” is based on crypto values at the time of bankruptcy filing—meaning not really whole.
FTXFF was IIRC ~$150MM in grants, a substantial portion of which will end up clawed back. The losses (based on current crypto values) are far greater than that. I suspect that the bulk of the economic impact involves transfers from customers, investors, and lenders to Alameda creditors?
Also, one can argue that crypto itself is net harmful, so any crypto career is presumptively so as well.
I don’t think “advising people to earn to give is inherently really bad” is necessary to reach the conclusion that there is a case for EA responsibility here. There exist many ideas that are not inherently bad, but yet it is irresponsible to advocate for them in certain ways / without certain safeguards. An ends-justifies-the-means approach to making money was a foreseeable response to EtG advocacy. Did EA actors do enough to discourage that kind of approach when advocating for EtG?
I don’t think it’s necessary, no. But I do think some early critics of EtG were motivated at least partly by a general anticapitalist case that business or at least finance careers were generically morally problematic in themselves.
Fair, but that wouldn’t be a steelmanned—or even fairly balanced—version of criticisms of EtG. It’s the weaker part of a partial motivation held by some critics.
True. Though before assuming any particular safeguard would have helped, we should check that it wasn’t already in place in how people advocated for EtG. For what it’s worth, my sense is that a much more culpable failure was not blowing the whistle on Sam’s bad behaviour at early Alameda, even after Will and other leaders (I forget exactly who, if it’s even known) were informed about it. That mistake was almost certainly far less consequential for the people harmed by FTX (I don’t think it would have stopped the fraud; it might have protected EA itself), but I strongly suspect it was more knowably wrong at the time than anything anyone did or said about EtG as a general idea.
I think there are two separate but somewhat intertwined chains of inquiry under discussion here:
A historical inquiry: what happened in this case, what safeguards failed, what would have helped but wasn’t in place?
A ~first-principles re-evaluation of EtG based on an update: The catastrophic failure of the supposedly most successful instance of EtG should update us that we underestimated the risk and severity of EtG downsides. That suggests a broader re-examination of potential risks and safeguards, which may look more appropriate than they did before the update.
By analogy, whenever there is a school shooting, I don’t think it is inappropriate to analyze and discuss interventions merely because they would not have prevented that specific shooting. However, those doing so should be careful to avoid claiming that their preferred intervention would have been effective in the case in the news.
Wait, why do you think 2 is false for FTX? (Good comment though!)
I tend to agree with all these points, actually—forgot about the clawbacks & specifically how substantial they were.
Agreed