Evolutionary debunking arguments: we can explain the vast majority of moral beliefs without positing the existence of extra substances; therefore, we shouldn't posit them!
We can also explain this epistemic normative belief of yours without positing that it’s true, therefore...?
I don’t think there are any normative facts, so you can finish that sentence, if you’d like. In other words, I don’t think there’s any objective feature of the world that tells you that you need to have x beliefs instead of y beliefs. If one did actually believe this, I’m curious how it would play out (e.g., should someone do a bunch of very simple math problems all the time, because they could gain many true beliefs very quickly? Seems weird).
On just having true beliefs, I would say that given some ontology of how the world works, you’d expect evolution to give us truth-tracking beliefs and/or belief-forming processes in many instances, because truth-tracking is actually useful for survival/reproduction. It would also give us some wrong beliefs, and we do see this: e.g., we believe in concepts that don’t REALLY carve reality, like chairs, because they’re useful.
It’s at least possible that one can ‘contain’ debunking arguments, such that they don’t extend across domains and self-undermine. We discuss this strategy in our chapter here.
See my reply :)
Just gonna have to write a reply post, probably