[Question] What are some low-information priors that you find practically useful for thinking about the world?

crossposted on LessWrong

I’m interested in questions of the form: “I have a bit of metadata/structure to the question, but I know very little about its content (or alternatively, I’m too worried about biases/hacks affecting how I think about the problem, or which pieces of information I pay attention to). In those situations, what prior should I start with?”

I’m not sure if there is a more technical term than “low-information prior.”

Some examples of what I’ve found useful recently:

1. Laplace’s Rule of Succession, for when the underlying mechanism is unknown.

2. The percentage of binary questions that resolve “yes” on Metaculus. It turns out that of all binary (yes/no) questions asked on the prediction platform Metaculus, ~29% of them resolved yes. This means that even if you know nothing about the content of a Metaculus question, a reasonable starting point for answering a randomly selected binary Metaculus question is 29%.
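As a concrete illustration, Laplace’s rule of succession can be sketched in a few lines of Python (the function name and example numbers are mine, not from any particular library):

```python
def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: estimate the probability that the
    next trial succeeds, given past outcomes.

    Formula: (successes + 1) / (trials + 2), which is equivalent to
    starting from a uniform Beta(1, 1) prior and updating on the data.
    """
    return (successes + 1) / (trials + 2)

# With no data at all, the estimate is the uniform prior's mean, 0.5:
print(rule_of_succession(0, 0))    # 0.5

# After observing 10 successes in 10 trials (e.g. 10 sunrises),
# the estimate is 11/12 ~= 0.917, not 1.0 -- the prior never lets
# limited evidence push you all the way to certainty:
print(rule_of_succession(10, 10))
```

The appeal for low-information settings is that it never assigns probability 0 or 1 from finite data, which is exactly the kind of conservatism you want when the underlying mechanism is unknown.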

In both cases, there are obviously reasons to override the prior, in both theory and practice (for example, you could arbitrarily add a “not” to every question on Metaculus, such that your prior is now 71%). However (I claim), having a decent prior is nonetheless useful in practice, even if it’s theoretically unprincipled.

I’d be interested in seeing something like 5-10 examples of low-information priors as useful as the rule of succession or the Metaculus binary prior.