I don’t think there are any normative facts, so you can finish that sentence, if you’d like. In other words, I don’t think there’s any objective feature of the world that tells you that you need to have x beliefs instead of y beliefs. If one did actually believe this, I’m curious about how it would play out (e.g. should someone do a bunch of very simple math equations all the time because they could gain many true beliefs very quickly? Seems weird).
On just having true beliefs, I would say that given some ontology of how the world works, you’d expect evolution to give us truth-tracking beliefs and/or processes in many instances, because they are actually useful for survival/reproduction. That said, it would also give us wrong beliefs, and we do see this—e.g. we believe in concepts that don’t REALLY carve reality, like chairs, because they’re useful.