to be honest I think that if I had that epistemic state I would probably be like "uhm I guess if I was a consistent rational agent I would do whatever the beliefs I have 1% credence in imply, but alas I'm not, and so even if I don't endorse this on a meta level I know that I'll mostly just ~ignore this set of 1% likely views and do whatever I want instead".
I'd have guessed that:
You'd think that, if you tried to write out all the assumptions that would mean your actions are the best actions to take, at least one would have a <1% chance of being true in your view
But you take those actions anyway, because they seem to you to have a higher expected value than the alternatives
Am I wrong about that (admittedly vague) claim?
Or maybe what seems weird about my chain of cruxes is something like the fact that one that's so unlikely in my own view is so "foundational"? Or something like the fact that I've only identified ~12 cruxes (so far from writing out all the assumptions implicit in my specific priorities, such as the precise research projects I work on) and yet already one of them was deemed so unlikely by me?
(Maybe here it's worth noting that one worry I had about posting this was that it might be demotivating, since there are so many uncertainties relevant to any given action, even though in reality it can still often be best to just go ahead with our current best guess because any alternative, including further analysis, seems less promising.)
Hmm, good question whether that would be true for one of my "cruxes" as well. FWIW my immediate intuition is that it wouldn't, i.e. that I'd have >1% credence in all relevant assumptions. Or at least that counterexamples would feel "pathological" to me, i.e. like weird edge cases I'd want to discount. But I haven't carefully thought about it, and my view on this doesn't feel that stable.
I also think the "foundational" property you gestured at does some work for why my intuitive reaction is "this seems wild".
Thinking about this, I also realized that maybe there's some distinction between "how it feels if I just consult my intuition" and "what my all-things-considered belief/betting odds would be after I take into account outside views, peer disagreement, etc.". The example that made me think about this was startup founders, or other people embarking on ambitious projects that, based on their reference class, are very likely to fail. [Though idk if 99% is the right quantitative threshold for cases that appear in practice.] I would guess that some people with that profile might say something like "sure, in one sense I agree that the chance of me succeeding must be very small, but it just does feel to me like I will succeed, and if I felt otherwise I wouldn't do what I'm doing".
Weirdly, for me, it's like:
When I don't really think about it, I basically feel like moral realism is definitely true and like there's no question there at all
When I do really think about it, my independent impression is that moral realism seems to basically make no sense and be almost guaranteed to be false
But then lots of smart people who've thought about metaethics a lot do seem to think moral realism is somewhere between plausible and very likely, so I update up to something like a 1% chance (0.5% in this spreadsheet)
I think this is fairly different from the startup founder example, though I guess it ends up in a similar place, where it's easy to feel like "the odds are good" even if on some level I believe/recognise that they're not.
Actually, that comment (and this spreadsheet) implied that my all-things-considered belief (not independent impression) is that there's a ~0.5-1% chance of something like moral realism being true. But that doesn't seem like the reasonable all-things-considered belief to have, given that it seems to me that:
The average credence in that claim from smart people who've spent a while thinking about it would be considerably higher
One useful proxy is the 2009 PhilPapers survey, which suggests that, out of its sample of philosophers, 56.4% accept or lean toward moral realism, 27.7% accept or lean toward moral anti-realism, and 15.9% were "other"
I'm deeply confused about this topic (which pushes against relying strongly on my own independent impression)
So maybe actually my all-things-considered belief is (or should be) closer to 50% (i.e., ~100 times as high as is suggested in this spreadsheet), and the 0.5% number is somewhere in-between my independent impression and my all-things-considered belief.
That might further help explain why it usually doesn't feel super weird to me to kind-of "act on a moral realism wager".
But yeah, I mostly feel pretty confused about what this topic even is, what I should think about it, and what I do think about it.
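(Tangentially, just to make the "0.5% vs. ~50%" tension above concrete: below is a rough, purely illustrative sketch of one way an independent impression could be blended with a peer average, namely a weighted average in log-odds space. The specific weights, and the use of the 56.4% PhilPapers figure as the "peer average", are assumptions for illustration only, not a method anyone in this thread actually uses.)

```python
import math

# Illustrative sketch only: blend an "independent impression" with a peer
# average by taking a weighted average in log-odds space. The weights below
# are made-up assumptions, not numbers endorsed in the discussion above.

def to_log_odds(p: float) -> float:
    return math.log(p / (1 - p))

def from_log_odds(l: float) -> float:
    return 1 / (1 + math.exp(-l))

independent_impression = 0.005  # ~0.5%, the spreadsheet number
peer_average = 0.564            # PhilPapers respondents accepting/leaning toward realism

for own_weight in (0.9, 0.5, 0.1):  # hypothetical weights on one's own impression
    blended = from_log_odds(
        own_weight * to_log_odds(independent_impression)
        + (1 - own_weight) * to_log_odds(peer_average)
    )
    print(f"weight on own impression = {own_weight:.1f} -> blended credence ~ {blended:.1%}")
```

Under these made-up weights, putting most of the weight on one's own impression keeps the blend near ~1%, while putting most of it on peers pushes it toward ~40-50%, which is roughly the range between the spreadsheet number and the "maybe closer to 50%" thought above.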