Quoting here:
It is important to distinguish different types of risk-taking here. (1) There is the kind of risk-taking that promises high payoffs but with a high chance of the bet falling to zero, without violating commonsense ethical norms; (2) risk-taking in the sense of being willing to risk it all by secretly violating ethical norms to get more money. One flaw in SBF's thinking seemed to be that risk-neutral altruists should take big risks because the returns can only fall to zero. In fact, the returns can go negative: for example, all the people he has stiffed, and all of the damage he has done to EA.
Yeah, the idea that everything would only fall to zero makes sense only if we take a very narrow view of who was harmed. Once we include the depositors, EAs, and others, the bet had negative value.
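To make the point above concrete, here is a toy calculation (all probabilities and payoffs are hypothetical, chosen only for illustration): a risk-neutral bettor who assumes the downside is floored at zero reaches the opposite conclusion from one who counts harms to depositors and the community, even for the same bet.

```python
def expected_value(outcomes):
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Naive view: a double-or-nothing bet whose worst case is "returns fall to zero".
naive_bet = [(0.5, 2.0), (0.5, 0.0)]

# Wider view: once harms to depositors and reputational damage are counted,
# the losing branch is strongly negative, not zero.
wider_bet = [(0.5, 2.0), (0.5, -3.0)]

print(expected_value(naive_bet))  # 1.0 -> the bet looks fine
print(expected_value(wider_bet))  # -0.5 -> the same bet is net harmful
```

The same 50/50 gamble flips from acceptable to clearly bad purely because the loss branch is allowed to go below zero, which is the mistake in the "returns can only fall to zero" framing.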
A few thoughts on how to avoid this:
1. Focus on rule utilitarianism, and stop normalizing unilateral actions that risk burning the commons, like lying or stealing. I wouldn't go as far as freedomandutility would, but we need to be clear that lying and stealing are usually bad, and shouldn't be treated as normal.
That leads to my next suggestion:
2. We need a whistleblower hotline.
Specifically, we need to catch unethical behavior early, before problems get big. As Dan Luu points out, a lot of big failures are preceded by smaller ones.
Here’s a link:
https://danluu.com/wat/
We really need to move beyond the model where EA Forum posts are a whistleblower's main venue, even in the best-case scenario.
3. We should be much more risk-averse about actions that involve norm violations, because they usually have negative expected value. One of SBF's flaws (and partly EA's) was not realizing what the space of outcomes could be.