Cowen would benefit from understanding that EA has to have a duality, encompassing both philosophical and practical aspects. The goal of EA is to have a practical effect on the world, but its effect will be limited if it is not persuasive. The persuasive aspect requires it to describe its philosophical underpinnings. It cannot rely solely upon a pre-existing partial point of view, because it is a departure from pre-existing partial points of view. You might call it a refinement of reciprocal altruism: a widely shared, pre-existing point of view that is not individualistic, refined so that it escapes some of the other biases that limit the scope to which that point of view is applied.
That said, since EA does not try to force or coerce, it is left with persuasiveness. Persuasiveness must be rooted in something, and philosophy is where all such roots begin for individuals if they are not natural or imposed by force. To disambiguate the preceding: I don't mean formal philosophy, but the broader-scope philosophy that every individual engages with, and which formal philosophy attempts to describe and discuss.
The problem with the way Cowen engages with trolley problems and repugnant conclusions is that these are formal questions, out of formal philosophy, which are then forced into conflict with the practical world. The difference between someone who flips the trolley lever or accepts endless 51% double-or-nothing bets, and someone who doesn't, is whether or not they answer it as a formal question.
As a formal question, everything practical is removed. That is not possible in a practical environment. Hubris is necessary to accepting 51% double-or-nothing bets. Humility is what would have prevented FTX's path, not a different answer to a formal philosophical question.
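The gap between the formal answer and the practical one can be made concrete. Formally, a 51% double-or-nothing bet has positive expected value, so the formal answer is to take it every time; practically, betting everything repeatedly ends in ruin with near certainty. A minimal sketch of my own (not from the essay) of that arithmetic:

```python
# Endless 51% double-or-nothing bets: each round stakes everything,
# doubling it with probability 0.51 and losing it all otherwise.
# The expected value after n rounds is stake * 1.02**n (it grows forever),
# while the probability of still being solvent is 0.51**n (it collapses).

p_win = 0.51
stake = 1.0

for n in (1, 10, 50, 100):
    expected_value = stake * (2 * p_win) ** n  # formal answer: always bet
    p_solvent = p_win ** n                     # practical answer: you're broke
    print(f"after {n:3d} bets: EV = {expected_value:8.2f}, "
          f"P(still solvent) = {p_solvent:.2e}")
```

After 100 bets the expected value has grown roughly sevenfold, yet the chance of having survived to collect it is on the order of 10^-30. The formal question optimizes the average over worlds; the practical actor lives in exactly one of them.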
You can actually feel that Cowen is drawn toward this answer, yet continues to reject it. He doesn't want EA to change, because he recognizes the value it provides. At the same time, he's saying it's wrong and should be more socially conservative.
I'd disagree with the specifically socially conservative aspect, but agree with the conservative aspect. In this case I mean conservative in its most original form: cautious about change. That said, I'm not convinced that the EA movement overall isn't already conservative in this sense, and I would not agree that "social conservatives" are all that conservative. The typical "social conservative" is willing to make grand unsubstantiated statements and to advocate for some truly repressive actions to enforce a prior social order (or to revert to one long since abandoned).
Being socially conservative does not make you conservative. This narrow form of conservatism can put you in conflict with other aspects of conservatism, and I'd argue it has put today's social conservatives at great odds with it. In addition, there are large groups of people we allow to use the social conservative label who are regressive. Regressivity is not conservative, as it's no longer attempting to maintain the status quo, and an attempt to regress will without a doubt have unintended consequences. The existence of unintended consequences, and the humility to accept that you cannot see all of them, is the only value proposition of conservatism; once this is abandoned there's no value left, and we really shouldn't accept the use of "conservative" to describe such groups.
So, in short, EA should continue to engage on its formal side, but should also continue to embrace practical principles during the translation of formal ideas into practical reality. If an alien does come to us and offer endless 51% double-or-nothing bets, we should be skeptical of its honesty, of its ability to pay, and of all the margins that go with accepting that bet. In like manner, when operating a financial company, someone should accept the possibility of downside, the inability to forecast the future, the reality of margin calls, the sometimes inscrutable wisdom of regulation, and the downsides of disregarding rules, even when it seems (to us) like we know better.
Forcing EA to divorce its practical and philosophical sides and then attacking the philosophical side with practical arguments is either dishonest or a failure to understand EA. Accepting a "partial point of view" as an absolute provides no room for EA to argue for any change. Conservatism may have a place, but it would be internally inconsistent if it were ever all-encompassing, because the one thing that has always been constant is change. Conservatism that rejects change, rather than simply applying some caution to it, posits an impossibility: change cannot be halted, only managed.
I think this is a reasonable response, but Cowen did anticipate the "slavery is immoral" response, and he is right that it wouldn't be a utilitarian response. You can fix that, since there is an easily drawn line from utilitarianism to this response, but I think Cowen would reply that in this scenario we both wouldn't and shouldn't bother with such fine reasoning and would just accept our partialities. He makes a similar statement during the Q&A.
I'd contend that this is an example of mixing practical considerations with philosophical ones. Of course we wouldn't stop during an invasion of little green men who are killing and enslaving humans and wonder, "would it be better for them to win?" If you did stop to wonder, there might be many good reasons to say no; but if you're asking whether you'd stop and ask a question, it's not a philosophical question anymore, or at least not a thought experiment. Timing is practical, not theoretical.
If it were really all about partialities, and not practicalities, it wouldn't matter which side we were on. If we showed up on another planet and could enslave or exterminate a bunch of little green men, should we stop to think about it before we did? Of course we should. And while maybe you can concoct a scenario in which it's kill or be killed, there would be little question about the necessity of being certain it wasn't an option to simply turn around and go the other way.