I’ve also personally gained a lot from arguing with People Wrong on the Internet, though applying this principle poorly can be bad for epistemic rigor. In particular, I think it probably helps to have a research blog and to practice spotting potential holes in public writing (EA social media, the EA Forum, research blogs, papers, etc.). That said, I think most EA researchers (including my colleagues) are much less Online than I am, so you definitely don’t need to develop an internet argument habit to be a good researcher.
Making lots of falsifiable forecasts about the short-term implications of your beliefs may be helpful. Calibration training is probably less helpful, but lower cost.
Trying to identify important and tractable (sub)questions is often even more important than being able to answer them well. In particular, very early in a research project, try to track: “What if I answered this question perfectly? Does it even matter? Will it meaningfully impact anyone’s decisions, including my own? Will this research build towards something else that will meaningfully impact decisions later?”
“Politely disagreeable” seems like a pretty important disposition. You benefit epistemically from being nice and open enough to other people’s ideas that you a) deliberately seek out contrarian opinions and b) don’t reject them outright, but you also need to be disagreeable enough that you generally don’t update your beliefs just because other (smart, respected, experienced, etc.) people confidently hold different ones.
Being very aggressively truth-seeking is a really important disposition. My belief is that most people are by default bad at this, including people who may otherwise make great EA researchers.
For longtermist work, I often point people to Holden Karnofsky’s impressions on career choice, particularly the section on building aptitudes for conceptual and empirical research on core longtermist topics.
I also endorse Neil’s comment.