Sam was a known, committed consequentialist, and may have been attempting to make decisions in an explicitly consequentialist way. Claims to the effect of ‘Sam’s actions violated EA principles’ therefore feel too strong. His actions were obviously not required by EA principles, but I’m also not confident that, at least before this post, we had firm ground to say that fraud was condemned by EA principles.
I think I agree that “maximization” is a core idea of EA. But I disagree that what people take “EA is about” to include extends to maximization of the kind Sam practiced (let’s assume he actively and intentionally defrauded FTX customers for the purpose of donating the proceeds). And just because “maximization” seems directionally correct to most people (and is thus seen as “what EA is about”), it doesn’t follow that all actions done in the name of maximization (assuming this is what happened) are consistent with EA principles.
I think I probably agree with your claim that EA community values are “indeterminate”. But I also think your bar for saying something is not indeterminate (requiring something close to unanimity) is too high; by that standard you’ll be hard-pressed to find many determinate values in the EA community (beyond perhaps “we should do good better”), or even within the longtermist community (beyond “future people matter”).
I’ll just note that all of the links in this thread predate the “fraud in the service of effective altruism is unacceptable” post, and they were written by people most would probably consider “EA leaders”.