I upvoted Phil’s post, despite agreeing with almost all of AlexHT’s response to EdoArad above. This is because I want to encourage good faith critiques, even those which I judge to contain serious flaws. And while there were elements of Phil’s book that read to me more like attempts at mood affiliation than serious engagement with his interlocutor’s views (e.g. ‘look at these weird things that Nick Bostrom said once!’), on the whole I felt that there was enough effort at engagement that I was glad Phil took the time to write up his concerns.
Two aspects of the book that I interpreted somewhat differently from Alex:
The genocide argument that Alex expressed confusion about: I thought Phil’s concern was not that longtermism would merely consider genocide while evaluating options, but that it seems plausible to Phil that longtermism (or a future iteration of it encountering different facts) could endorse genocide—i.e. that Phil is worried about genocide as an output of longtermism’s decision process, not as an input. My model of Phil is that if he were confident that longtermism would always reject genocide, then he wouldn’t be concerned merely that such possibilities are evaluated. Confidence: Low/moderate.
The section describing utilitarianism: I read this section as merely aiming to describe an aspect of longtermism and to highlight features which might be wrong or counter-intuitive, not to actually make any arguments against the views he describes. This could explain Alex’s confusion about what was being argued for (nothing) and feeling that intuitions were just being thrown at him (yes). I think Phil’s purpose here is to lay the groundwork for his later argument that these ideas could be dangerous. The only argument I noticed against utilitarianism comes later—namely, that together with empirical beliefs about the possibility of a large future it leads to conclusions that Phil rejects. Confidence: Low.
I agree with Alex that the book was not clear on these points (among others), and I attribute our different readings to that lack of clarity. I’d certainly be happy to hear Phil’s take.
I have a couple of other thoughts that I will add in a separate comment.