Any claim that advising people to earn to give is inherently really bad needs to either defend the view that "start a business or take another high paying job" is inherently immoral advice, or explain why it becomes immoral when you add "and give the money to charity" or when it's aimed at EAs specifically. It's possible that can be done, but I think it's quite a high bar. (Which is not to say EtG advice couldn't be improved in ways that make future scandals less likely.)
You're right! It's not that EtG is inherently bad (and frankly, I haven't seen anyone make this argument), it's that specific EV-maximising interpretations of EtG cause people to pursue careers that are (1) harmful, (2) net harmful, or (3) too risky to pay off.
Personally, I think FTX was (1) and (3), and ~~unlikely to be (2)~~ probably also (2). I'm not really sure where the bar is, but under any moderately deontological framework (1) is especially concerning, and many of the people EA might want to have a good reputation with believe (1). So that's roughly the worldview-neutral case for caring about strongly rejecting EV-maximising forms of EtG.

Wait, why do you think (2) is false for FTX? (Good comment though!)
I haven't run the numbers myself, but I generally assume that FTX's account-holders were mostly moderately well-off residents of high-income countries (based on my rough impression of crypto demographics), and that the Future Fund's beneficiaries are by and large worse off. There were probably some people who invested their life savings, or were otherwise poor to begin with, who were harmed more significantly than the beneficiaries of their money. But on the whole it feels like it was an accidental wealth transfer, and much of that harm will be mitigated if they're made whole (though admittedly, the make-whole money just comes from crypto speculation that trades on the gullibility of yet more people).
But I'm much less confident in this take; my point is more that the real harms it caused are worth thinking about.
The possibility of being "made whole" is based on crypto values at the time of the bankruptcy filing, meaning not really whole.
FTXFF was, IIRC, ~$150MM in grants, a substantial portion of which will end up clawed back. The losses (based on current crypto values) are vastly greater than that. I suspect that the bulk of the economic impact involves transfers from customers, investors, and lenders to Alameda creditors?
Also, one can argue that crypto itself is net harmful, so any crypto career is presumptively so as well.
I tend to agree with all these points, actually; I'd forgotten about the clawbacks and specifically how substantial they were.
I don't think "advising people to earn to give is inherently really bad" is necessary to reach the conclusion that there is a case for EA responsibility here. There exist many ideas that are not inherently bad, yet it is irresponsible to advocate for them in certain ways / without certain safeguards. An ends-justifies-the-means approach to making money was a foreseeable response to EtG advocacy. Did EA actors do enough to discourage that kind of approach when advocating for EtG?
I don't think it's necessary, no. But I do think some early critics of EtG were motivated at least partly by a general anticapitalist case that business (or at least finance) careers were generically morally problematic in themselves.
Fair, but that wouldn't be a steelmanned (or even fairly balanced) version of criticisms of EtG. It's the weaker part of a partial motivation held by some critics.
True. Though before assuming any particular safeguard would have helped, we should make sure it wasn't already in place in how people advocated for EtG. For what it's worth, my sense is that a much more culpable thing was not blowing the whistle on Sam's bad behaviour at early Alameda, even after Will and other leaders (I forget exactly who, if it's even known) were informed about it. That mistake was almost certainly far less consequential for the people harmed by FTX (I don't think it would have stopped the fraud; it might have protected EA itself), but I strongly suspect it was more knowably wrong at the time than anything anyone did or said about EtG as a general idea.
I think there are two separate but somewhat intertwined chains of inquiry under discussion here:
A historical inquiry: what happened in this case, what safeguards failed, what would have helped but wasn't in place?
A ~first-principles re-evaluation of EtG based on an update: The catastrophic failure of the supposedly most successful instance of EtG should update us that we underestimated the risk and severity of EtG downsides. That suggests a broader re-examination of potential risks and safeguards, which may look more appropriate than they did before the update.
By analogy, whenever there is a school shooting, I don't think it is inappropriate to analyze and discuss safeguards merely because they would not have prevented that specific shooting. However, those doing so should be careful to avoid claiming that their preferred intervention would have been effective in the case in the news.
Agreed