Yes, this is a fair point; I think that P6 is probably quite easily rejected from a moral anti-realist stance. I do, however, think that the rest of the argument probably still runs, given the claim is about potential X-risk, which can probably be agreed on as a bad irrespective of one's metaethics.
Theo Cox
Thanks for the heads up! I’ve shortened the title which should hopefully help
Hi both,
Thanks for the helpful input and for bearing with me as I circled back (it's been a pretty hectic period!).
Re the conceptualisation of polycrisis, I'd agree that this is a pretty universal condition of globalised society. I should probably clarify upfront that I'm not claiming this is a new thing, but that focusing on the interconnection of crises is nonetheless a valuable frame for approaching them. I'd probably also put the pandemic in the poly rather than mono category, given the knock-on economic effects etc., but that's somewhat of a tangential point.
Re EA’s success, I definitely take your point. I think again I should clarify that my claim here is not that EA will itself dominate and lead the world astray, but that its current activity is contributing to/propping up a way of viewing the world which is potentially harmful. My view, and the argument above, is that this is something to be concerned about even if we think this contribution is, in the grand scheme of things, relatively minor at this stage, given the potential consequences and moral opportunity costs vs trying to advance a contrary worldview.
Re Littlewood’s Law, thanks for sharing the interesting article! I definitely take your point and think we should discount for those perceptual biases in our assessments. I would still probably claim, though, that even despite this we are in a particular time of crisis. I know there’s debate about the hingey-ness of the present time (to borrow MacAskill’s term), so perhaps we just diverge around that assessment, which is certainly fair enough.
Thanks for the thoughtful analysis, and good to hear your perspective around EA and climate. I think my claim is that, aside from the debate around whether EA supports inaction on climate full stop, the action it seems most predisposed towards (e.g. a focus on emerging technologies) carries its own risks.
In response to your latter point, I think it’s fair around the uncertainty of projections. As per my first reply above, I would more claim that contribution to a potentially harmful worldview is itself a cause for concern, even if current levels of influence are relatively low. I hope this is a helpful clarification!