Assigning a single number to such a prior, as if it means anything, seems utterly absurd.
I don’t agree that it’s meaningless or absurd. A straightforward meaning of the number is “my subjective probability estimate if I had to put a number on it” — and I’d agree that one shouldn’t take it for more than that.
I also don’t think it’s useless, since numbers like these can at least help give a very rough quantitative representation of beliefs (as imperfectly estimated from the inside), which can in turn allow subjective ballpark updates based on explicit calculations. I agree that such simple estimates and calculations should not necessarily be given much weight, let alone dictate our thinking, but I still think they can provide some useful information and provoke further thought. I think they can add to purely qualitative reasoning, even if there are more refined quantitative approaches that are better still.
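As a minimal sketch of the kind of explicit ballpark calculation I have in mind (using a purely hypothetical prior and likelihood ratio, not numbers from the essay):

```python
# A minimal sketch of the kind of explicit "ballpark update" a rough
# subjective probability makes possible. The prior (1%) and the likelihood
# ratio (10) are purely hypothetical numbers, not anything from the essay.

def update(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

print(update(0.01, 10))  # ~0.09: a ~1% credence becomes roughly 9%
```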
I guess my point of view is that for certain questions, we should stop forcing people to reduce their beliefs to a single number.
Say you told me to guess the number of advanced civilisations in our galaxy (other than humans), and, after meticulous research, I answered “1 million”. Does this singular number actually represent my belief? Would I be kicking myself and feeling like an idiot if the actual answer turned out to be 100?
Of course not. It’s a hugely uncertain, unboundedly speculative question. My actual beliefs are a spread of probabilities over a huge range of magnitudes, possibly quite unevenly spread (there would be a large bump at “0”). “1 million” would just be a snapshot of the median, and leave all that other information out.
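As a rough sketch of what I mean, here is one way such a belief could be written down; the 30% bump at zero and the log-uniform range are numbers I am making up purely for illustration:

```python
# A rough sketch of the belief I'm describing: a large bump at "0 other
# civilisations" plus a very wide spread over orders of magnitude. The 30%
# weight on zero and the log-uniform range 1..1e9 are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

is_zero = rng.random(n) < 0.30                 # the bump at exactly zero
spread = 10 ** rng.uniform(0, 9, size=n)       # log-uniform over 1 to 1e9
samples = np.where(is_zero, 0.0, spread)

print("median:", np.median(samples))                     # one summary number
print("10th/90th percentiles:", np.percentile(samples, [10, 90]))
print("P(exactly zero):", np.mean(samples == 0.0))
```

Under those made-up assumptions the median comes out at a few hundred civilisations, which on its own says nothing about the bump at zero or the many orders of magnitude of spread.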
The reason this matters is that EA frequently decides to make decisions, including funding decisions, based on these ridiculously uncertain estimates. You yourself are advocating for this in your article.
By reducing everything to that one number, we start influencing the next person’s estimates, and they influence the next person, and so on. Soon we end up with surveys of “alien experts” on the existential risk from aliens, asking them to estimate P(aliens) as a single number, which they inevitably anchor to the last estimate they saw, compounding everything until eventually you get treated as absurd for having a low estimate of alien x-risk. All based on wildly, ridiculously uncertain initial guesses that someone made up once.
In summary, people should either start stating their uncertainty explicitly, or they should start saying “I don’t know”. This single-number status quo is just making things worse.
The reason this matters is that EA frequently decides to make decisions, including funding decisions, based on these ridiculously uncertain estimates. You yourself are advocating for this in your article.
I think that misrepresents what I write and “advocate” in the essay. Among various other qualifications, I write the following (emphases added):
I should also clarify that the decision-related implications that I here speculate on are not meant as anything like decisive or overriding considerations. Rather, I think they would mostly count as weak to modest considerations in our assessments of how to act, all things considered.
My claims about how I think these would be “weak to modest considerations in our assessments of how to act” are not predicated on the exact manner in which I represent my beliefs: I’d say the same regardless of whether I’m speaking in purely qualitative terms or in terms of ranges of probabilities.
In summary, people should either start stating their uncertainty explicitly, or they should start saying “I don’t know”.
FWIW, I do state uncertainty multiple times, albeit in qualitative rather than quantitative terms. A few examples:
This essay contains a lot of speculation and loose probability estimates. It would be tiresome if I constantly repeated caveats like “this is extremely speculative” and “this is just a very loose estimate that I am highly uncertain about”. So rather than making this essay unreadable with constant such remarks, I instead say it once from the outset: many of the claims I make here are rather speculative and they mostly do not imply a high level of confidence. … I hope that readers will keep this key qualification in mind.
As with all the numbers I give in this essay, the following are just rough numbers that I am not adamant about defending …
Of course, this is a rather crude and preliminary analysis.
To be clear, I think you included all the necessary disclaimers, your article was well written, well argued, and the use of probability was well within the standard for how probability is used in EA.
My issue is that I think the way probability is presented in EA is bad, misleading, and likely to lead to errors. I think this is the exact type of problem (speculative, unbounded estimates) where the EA method fails.
My specific issue here is how uncertainty is taken out of the equation and placed into preambles, and how a highly complex belief is reduced to a single number. This is typical on this forum and in EA (see P(doom)). When Bayes is used in science, on the other hand, the prior will be a distribution. (See the pdf of the first result here.)
My concern is that EA is making decisions based on these point estimates, rather than on people’s true distributions, which is likely to lead people astray.
I’m curious: When you say that your prior for alien presence is 1%, what is your distribution? Is 1% your median estimate? How shocked would you be if the “true value” was 0.001%?
If probabilities of probabilities is confusing, do the same thing for “how many civilisations are there in the galaxy”.
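To make the question concrete, here is a rough sketch of two priors that both “say 1%” on average but would answer that last question very differently; the specific Beta parameters are purely illustrative:

```python
# A sketch of the kind of "distribution over a probability" I'm asking about.
# Both priors below have a mean of 1%, but they give very different answers
# to "how shocked would you be if the true value were 0.001%?". The Beta
# parameters are purely illustrative.
from scipy.stats import beta

narrow = beta(5, 495)            # mean 1%, concentrated near 1%
heavy_tailed = beta(0.05, 4.95)  # mean 1%, but most mass piled up near zero

for name, prior in [("narrow", narrow), ("heavy-tailed", heavy_tailed)]:
    print(f"{name}: mean={prior.mean():.3f}, median={prior.median():.2e}, "
          f"P(true value < 0.001%)={prior.cdf(1e-5):.2f}")
```

The single number “1%” is the same in both cases; the shape of the distribution is what carries the answer.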
Thanks! :)