It’s fairly context dependent, but I generally remain a fan.
There’s a mix of ancillary issues:
There could be a ‘why should we care what you think?’ objection if EA estimates diverge from consensus estimates, although I imagine folks tend to gravitate to neglected topics etc.
There might be less value in ‘relative to self-ish’ accounts of resilience: I’d expect the major estimates in a front-facing report to be fairly resilient already, and so less of a case of “might shift significantly if we spent another hour on it”.
Resilience relative to some quasi-ideal seems valuable, though: e.g. “Our view re. X is resilient, but we have a lot of Knightian uncertainty, so we’re only 60% sure we’d be within an order of magnitude of X as estimated by a hypothetical expert panel/liquid prediction market/etc.”
There might be better or worse ways to package this given people are often sceptical of any quantitative assessment of uncertainty (at least in some domains). Perhaps something like ‘subjective confidence intervals’ (cf.), although these aren’t perfect.
But ultimately, if you want to tell someone an important number you aren’t sure about, it seems worth taking pains to be precise about both the number and its uncertainty.
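To make the “precise about both the number and its uncertainty” point concrete, here is a minimal sketch of one way a subjective 90% interval could be turned into an order-of-magnitude resilience claim like the one above. It assumes the quantity is roughly lognormal; the function names and all numbers are purely illustrative, not anyone’s actual estimate.

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lognormal_from_90ci(low, high):
    """Fit a lognormal to a subjective 90% interval (5th/95th percentiles)."""
    median = math.sqrt(low * high)                # geometric midpoint
    sigma = math.log(high / low) / (2 * 1.645)    # 1.645 = z for the 95th pct
    return median, sigma

def p_within_order_of_magnitude(sigma):
    """P(true value within 10x of the median) under the fitted lognormal."""
    z = math.log(10) / sigma
    return 2 * normal_cdf(z) - 1

# Hypothetical estimate: "somewhere between 10k and 10M, 90% confidence"
median, sigma = lognormal_from_90ci(1e4, 1e7)
p = p_within_order_of_magnitude(sigma)
print(f"median ≈ {median:.3g}; P(within 10x of median) ≈ {p:.0%}")
```

So a wide-but-honest interval still licenses a quantified statement (“we’d guess roughly 3×10⁵, and we’re only ~70% sure we’re within an order of magnitude”) rather than an unadorned ‘likely’.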
I tend to agree. This feels a bit like a “be the change you want to see in the world” thing. Ordinary communication norms would push us towards just using verbal claims like ‘likely’, but for the reasons you mention, I pretty strongly think we should quantify and accept any short-term weirdness hit.