Weirdly, for me, it's like:
When I don't really think about it, I basically feel like moral realism is definitely true and like there's no question there at all
When I do really think about it, my independent impression is that moral realism basically makes no sense and is almost guaranteed to be false
But then lots of smart people who've thought about metaethics a lot do seem to think moral realism is somewhere between plausible and very likely, so I update up to something like a 1% chance (0.5% in this spreadsheet)
I think that this is fairly different from the startup founder example, though I guess it ends up in a similar place: it's easy to feel like "the odds are good" even if on some level I believe/recognise that they're not.
Actually, that comment (and this spreadsheet) implied that my all-things-considered belief (not independent impression) is that there's a ~0.5-1% chance of something like moral realism being true. But that doesn't seem like the reasonable all-things-considered belief to have, given that it seems to me that:
The average credence in that claim among smart people who've spent a while thinking about it would be considerably higher
One useful proxy is the 2013 PhilPapers survey, which suggests that, in its sample of philosophers, 56.4% subscribe to moral realism, 27.7% subscribe to moral anti-realism, and 15.9% chose "other"
I'm deeply confused about this topic (which pushes against relying strongly on my own independent impression)
So maybe my all-things-considered belief is (or should be) closer to 50% (i.e., ~100 times as high as this spreadsheet suggests), and the 0.5% number sits somewhere in between my independent impression and my all-things-considered belief.
That might further help explain why it usually doesn't feel super weird to me to kind of "act on a moral realism wager".
But yeah, I mostly feel pretty confused about what this topic even is, what I should think about it, and what I do think about it.