In my reply to Linch, I said that most of my errors were probably in some sense “general reasoning” errors, and that a lot of what I’m improving over the course of doing my job is general reasoning. But at the same time, I don’t think that most EAs should spend a large fraction of their time explicitly practicing general reasoning in an isolated or artificial way (for example, re-reading the Sequences, studying probability theory, or doing calibration training). I think it’s better to spend most of your time trying to accomplish something straightforwardly valuable, which will often incidentally require building up some content expertise. It’s just that a lot of the benefit of that work will probably come through improving your general skills.