Hey Ben, what makes you think that judgment can be generally improved?
When Owen posted “Good judgement” and its components, I briefly reviewed the literature on transfer of cognitive skills:
My conclusion was that far transfer¹ is rare, if not nonexistent.
See e.g. Near and Far Transfer in Cognitive Training: A Second-Order Meta-Analysis and Does Far Transfer Exist? Negative Evidence From Chess, Music, and Working Memory Training.
I found only one meta-analysis that contradicts this conclusion: The cognitive benefits of learning computer programming: A meta-analysis of transfer effects.
The meta-analysis I linked above might not be applicable to some of the subgroups interested in 80K’s priority paths, e.g. self-reflective meta-thinkers.
But the almost complete lack of effect in CFAR’s 2015 longitudinal study very weakly suggests that the relevant cohort might still not be very susceptible to training.
This makes me think that general training (e.g. calibration and, to a lesser extent, forecasting) might not translate into an overall improvement in judgment. On the other hand, acquiring skills broadly useful for decision-making (e.g. spreadsheets, probabilistic reasoning, clear writing) surely should be good.
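To be concrete about the calibration part: a standard way to score probability estimates against outcomes is the Brier score. Here is a minimal sketch in Python (the numbers are made up purely for illustration), just to show what calibration practice with feedback is scoring:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities (0..1) and
    binary outcomes (0 or 1). Lower is better; always answering 0.5
    scores 0.25, so beating that indicates some calibration."""
    assert len(forecasts) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Illustrative only: three predictions and their realized outcomes.
print(brier_score([0.9, 0.6, 0.2], [1, 1, 0]))  # 0.07
```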
A bit of a tangent: Hanson’s Reality TV MBAs is an interesting idea. Gaining experience by working as a personal assistant to someone else seems to be beneficial², so maybe this could be scaled up with a reality TV show. It might also be a good idea to invite people with good judgment/research taste to stream some of their working sessions, and so on.
[1]: According to Wikipedia: Near transfer occurs when many elements overlap between the conditions in which the learner obtained the knowledge or skill and the new situation. Far transfer occurs when the new situation is very different from that in which learning occurred.
[2]: Moreover, it is one of 80K’s paths that may turn out to be very promising.
Hi Misha,
I do agree there’s a worry about how much calibration training or forecasting in one area will transfer to other areas. My best guess is that there’s some transfer, but there’s not as much evidence about it as I’d like.
I also think of forecasting as more like a subfactor of good judgement, so I’m not claiming there will be a transfer of cognitive skills – rather I’m claiming that if you practice a specific skill (forecasting), you will get better at that skill.
I’d also suggest looking directly at the evidence on whether forecasting can be improved and seeing what you think of it: https://www.openphilanthropy.org/blog/efforts-improve-accuracy-our-judgments-and-forecasts