That said, I think it’s worth pointing out the case where the arithmetic mean of probabilities is exactly the right thing to use: when you think exactly one of the estimates is correct but you don’t know which (rather than the usual situation, where each estimate provides some evidence about what the correct answer is).
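As a minimal sketch of why that works (assuming we place equal credence on each source being the one that’s correct), the law of total probability gives

$$P(X) \;=\; \sum_{i=1}^{n} P(H_i)\,P(X \mid H_i) \;=\; \frac{1}{n}\sum_{i=1}^{n} p_i,$$

where $H_i$ is the hypothesis that source $i$’s estimate $p_i$ is the correct one; under that model the aggregate is exactly the arithmetic mean of the individual probabilities.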
To extend this and steelman the case for the arithmetic mean of probabilities (or something in that general direction) a little: in some cases it seems like a more intuitive formulation of risk (which is usually how these things are discussed in EA contexts), especially once we propagate further to expected values or value-of-information concerns.
E.g., suppose we ask three sources we trust equally about the risk, from vector X, of an EA org shutting down within 10 years. One says 10%, one says 0.1%, and one says 0.001%.
The arithmetic mean of probabilities gets you ~3.4%, while the geometric mean of odds gets you ~0.1%. 0.1% seems comfortably enough below the background rate of organizations dying that in many cases it wouldn’t be worth the value of information to investigate further. Yet naively this seems too cavalier when one of the three sources thinks there’s a 10% chance of failure from vector X alone!
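To make the numbers concrete, here’s a minimal sketch of both aggregations in Python (the three estimates are just the hypothetical figures above, not data from anywhere):

```python
import math

# Three equally-trusted point estimates of the risk from vector X.
estimates = [0.10, 0.001, 0.00001]  # 10%, 0.1%, 0.001%

# Arithmetic mean of probabilities.
arith_mean = sum(estimates) / len(estimates)

# Geometric mean of odds, converted back to a probability.
odds = [p / (1 - p) for p in estimates]
geo_mean_odds = math.prod(odds) ** (1 / len(odds))
geo_mean_prob = geo_mean_odds / (1 + geo_mean_odds)

print(f"arithmetic mean of probabilities: {arith_mean:.3%}")    # ~3.367%
print(f"geometric mean of odds (as prob): {geo_mean_prob:.3%}")  # ~0.104%
```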
Also, as a mild terminological note, I’m not sure I know what you mean by “correct answer” when we’re referring to probabilities in the real world. Outside of formal mathematical examples and maybe some quantum physics, probabilities are usually statements about the confusions in our own maps of the world, not something physically instantiated in the underlying reality.