“To see the world as it is, rather than as I wish it to be.”
I work for the EA research nonprofit Rethink Priorities. Despite my official title, I don’t really think of the stuff I do as “research.” To me, the word “research” connotes people expanding the frontiers of the world’s knowledge, whereas I’m often more interested in expanding the frontiers of my own knowledge, and/or disseminating it to the relevant parties.
I’m also really interested in forecasting.
People may or may not also be interested in my comments on Metaculus and Twitter:
Metaculus: https://pandemic.metaculus.com/accounts/profile/112057/
Twitter: https://twitter.com/LinchZhang
I think that within EA, people should accurately report their levels of confidence, which in some cultures and situations will come across as underconfident and in other cultures and situations will come across as overconfident.
I’m not sure what the practical solution is to this level of precision bleeding outside of EA; I’ve definitely felt that there were times when I was socially penalized for trying to be accurate in situations where accuracy was implicitly not called for. If I were smarter or more socially savvy, the “obvious” right call would be to quickly codeswitch between contexts, but in practice I’ve found that quite hard.
___
Separate from the semantics used, I agree there is a real issue where some people are systematically underconfident or overconfident relative to reality, and this hurts their ability to believe true things or achieve their goals in the long run. Unfortunately, this plausibly correlates with demographic differences (e.g., women being on average less confident than men, and Asians on average less confident than Caucasians), which seems worth correcting for if possible.