From my current read, psychedelics have a stronger evidence base than rationality training programs.
I agree if for CFAR you are looking at the metric of how rational their alumni are. If you instead look at CFAR as a funnel for people working on AI risk, the “evidence base” seems clearer. (Similarly to how we can be quite confident that 80K is having an impact, despite there not being any RCTs of 80K’s “intervention”.)
Sure, I was pointing to the evidence base for the techniques taught by CFAR & other rationality training programs.
CFAR could be effective at recruiting people into AI risk due to Schelling-point dynamics, without the particular techniques it teaches being efficacious. (I’m not sure that’s true, just pointing out an orthogonality here.)
Do you know if there are stats on this, somewhere?
e.g. Out of X workshop participants in 2016, Y are now working on AI risk.
I don’t know of any such stats, but I also don’t know much about CFAR.