The reason this matters is that EA frequently makes decisions, including funding decisions, based on these ridiculously uncertain estimates. You yourself are advocating for this in your article.
I think that misrepresents what I write and “advocate” in the essay. Among various other qualifications, I write the following (emphases added):
I should also clarify that the decision-related implications that I here speculate on are not meant as anything like decisive or overriding considerations. Rather, I think they would mostly count as weak to modest considerations in our assessments of how to act, all things considered.
My claims about how I think these would be “weak to modest considerations in our assessments of how to act” are not predicated on the exact manner in which I represent my beliefs: I’d say the same regardless of whether I’m speaking in purely qualitative terms or in terms of ranges of probabilities.
In summary, people should either start stating their uncertainty explicitly, or they should start saying “I don’t know”.
FWIW, I do state uncertainty multiple times, albeit in qualitative rather than quantitative terms. A few examples:
This essay contains a lot of speculation and loose probability estimates. It would be tiresome if I constantly repeated caveats like “this is extremely speculative” and “this is just a very loose estimate that I am highly uncertain about”. So rather than making this essay unreadable with constant such remarks, I instead say it once from the outset: many of the claims I make here are rather speculative and they mostly do not imply a high level of confidence. … I hope that readers will keep this key qualification in mind.
As with all the numbers I give in this essay, the following are just rough numbers that I am not adamant about defending …
Of course, this is a rather crude and preliminary analysis.
To be clear, I think you included all the necessary disclaimers, your article was well written, well argued, and the use of probability was well within the standard for how probability is used in EA.
My issue is that I think the way probability is presented in EA is bad, misleading, and likely to lead to errors. I think this is the exact type of problem (speculative, unbounded estimates) where the EA method fails.
My specific issue here is how uncertainty is taken out of the equation and placed into preambles, and how a highly complex belief is reduced to a single number. This is typical on this forum and in EA more broadly (see P(doom)). When Bayes is used in science, on the other hand, the prior is a distribution. (See the pdf of the first result here.)
My concern is that EA is making decisions based on these point estimates, rather than on people's true distributions, which is likely to lead people astray.
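To make the contrast concrete, here is a minimal sketch (using SciPy; the Beta(2, 198) prior is a hypothetical choice purely for illustration, not anyone's actual belief) of the difference between reporting "1%" as a point estimate and reporting it as a distribution:

```python
from scipy import stats

# "My prior is 1%" as a point estimate: all uncertainty collapsed into one number.
point_prior = 0.01

# The same "1%" expressed as a distribution. Beta(2, 198) is an arbitrary
# illustrative choice whose mean is 2 / (2 + 198) = 1%.
dist_prior = stats.beta(2, 198)

print(dist_prior.mean())             # the headline number, roughly 0.01
print(dist_prior.ppf([0.05, 0.95]))  # a 90% credible interval around it
print(dist_prior.cdf(0.001))         # how much prior mass sits below 0.1%
```

Two people can both report "my prior is 1%" while holding very different distributions, and any decision that is sensitive to the tails will treat those two beliefs very differently.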
I’m curious: When you say that your prior for alien presence is 1%, what is your distribution? Is 1% your median estimate? How shocked would you be if the “true value” were 0.001%?
If probabilities of probabilities are confusing, do the same exercise for “how many civilisations are there in the galaxy”.
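For the civilisations version, here is a similarly hedged sketch (again SciPy; the log-normal parameters are made up for illustration): two priors that share the same median number of civilisations but have wildly different tails, and would therefore be “shocked” to very different degrees by an extreme answer:

```python
from scipy import stats

# Two hypothetical priors over "civilisations in the galaxy", both with median 100.
narrow = stats.lognorm(s=0.5, scale=100)  # concentrated within about one order of magnitude
wide = stats.lognorm(s=3.0, scale=100)    # spread across many orders of magnitude

for name, prior in [("narrow", narrow), ("wide", wide)]:
    print(name,
          prior.median(),  # the identical "point estimate" of 100
          prior.cdf(1.0))  # but very different mass on "fewer than one civilisation"
```

The point is simply that the single reported number is compatible with enormously different beliefs, and those differences are exactly what matter for decisions.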