Maybe an alternative way to look at this is to ask: why is rationality not more a part of EA community building? Rationality as a project likely can't stand on its own because it's not trying to do anything; it's just a bunch of like-minded folks with a shared interest in improving their ability to apply epistemology. The cases where the rationality "project" has done well, like building up resources to address AI risk, were really cases where some other project needed rationality for an instrumental purpose and built up LW and CFAR in service of that goal. Perhaps EA could more strongly include rationality in that role: as part of what it considers essential for training and recruiting in EA, and for building a strong community that is able to do the things it wants to do. This wouldn't really make rationality a cause area, but rather an aspect of effective EA community building.