Normative Uncertainty and Probabilistic Moral Knowledge
Link post
Came across this paper. I haven't read it yet, so I can't vouch for its quality, but it seemed interesting and relevant enough to be worth sharing. The full text is available for download at the link. I've included the abstract here for convenience.
Normative Uncertainty and Probabilistic Moral Knowledge
Julia Staffel
Synthese (forthcoming)
Abstract
The aim of this paper is to examine whether it would be advantageous to introduce knowledge norms instead of the currently assumed rational credence norms into the debate about decision making under normative uncertainty. There is reason to think that this could help us better accommodate cases in which agents are rationally highly confident in false moral views. I show how Moss’ (2018) view of probabilistic knowledge can be fruitfully employed to develop a decision theory that delivers plausible verdicts in these cases. I also argue that, for this new view to be better than existing alternatives, it must adopt a particular solution to the new evil demon problem, which asks whether agents and their BIV-counterparts are equally justified. In order to get an attractive decision theory for cases of moral uncertainty, we must reject the claim that agents and their BIV-counterparts are equally justified. Moreover, the resulting view must be supplemented with a moral epistemology that explains how it is possible to be rationally morally uncertain. This is especially challenging if we assume that moral truths are knowable a priori.