While I don’t agree with many of Torres’s beliefs and attitudes, I also disagree with this article’s claim that concerns about EA extremism are unwarranted. Take the stance on SBF, for example:
It’s true that Sam Bankman-Fried, an effective altruist Jane Street employee, went on to commit an enormous fraud — but the fraud was universally condemned by members of the effective altruist community. People who do evil things exist in every sufficiently large social movement; it doesn’t mean that every movement recommends evil.
Yes, SBF does not represent the majority of EAs, but he still conducted one of the largest frauds in history, and it’s unlikely he would have done so had EA not existed. Harmful, extremist EA-motivated actions clearly have happened, and they were not confined to a few randos on message boards; they involved highly influential and respected EA figures.
Extremism might be in the minority, but it’s still a real concern if there’s a way to translate that extremism into real world harm, as happened with SBF.
I think this is especially important with AI. Now, I don’t believe in the singularity, but many EAs do, and some of them are setting out to build what they believe will be a god-like AI. That would concentrate a lot of power in the people who build it. If they are extremist, flawed, or have bad values, those flaws could be locked in for the rest of time. Even if (more likely) the AI is merely very powerful rather than god-like, a few people could still have a significant effect on the future. I think this more than justifies increased scrutiny of the flaws in EA values and thinking.
I tend to believe that SBF committed fraud for the same reasons ordinary people commit fraud (both individual traits like overconfidence and systemic factors like the lack of controls in crypto to prevent fraud). Effective altruism might have motivated him to put himself in the sort of situation where he’d be tempted to commit fraud, but I really don’t see much evidence that SBF’s psychology differs much from, e.g., Madoff’s.
I don’t know that “extremist” is a good characterization of FTX & Alameda’s actions.
Usually “extremist” implies a willingness to take highly antisocial actions for the sake of an extreme ideology.
It’s fair to say that trying to found a billion-dollar company with the explicit goal of eventually donating all profits is an extreme action. It’s highly unusual and goes much further with specific ideas than most adherents do. But unless one takes a very harsh stance against capitalism (or against cryptocurrency), it’s hard to call this action highly antisocial just yet. The antisocial bit comes with the first fraudulent action taken.
A narrative I keep seeing is that Sam and several others believed not only that the longstanding arguments against robbing banks to donate to charity are flawed, but that they should feel OK robbing customers who trusted them in order to raise donation funds.
If someone believed this extremified version of EA and committed fraud with billions of dollars as a result, that would be extremist. But my impression is that, whether it started as a grievous accounting flaw, a risky conspiracy between amphetamine-fueled manics, or something else, the fraud wasn’t the result of people doing careful math, sleeping on it, and ultimately deciding it was net positive. It involved irrational decisions. (This is especially clear by the end. I’d need to refresh my memory to talk specifics, but I think in the last months SBF was making long-term illiquid investments that made it even less plausible they could avoid bankruptcy, and that blatantly did not increase EV even from a risk-neutral perspective.)
If the fraud was irrational regardless of whether their ideology was OK with robbery, then in my view there’s little evidence that ideology caused the initial decision to commit fraud.
Instead, the relevant people took an extreme action, and then committed a series of moral and corporate failures typical of white-collar crime, failures that were antisocial and ran against their own ideology.