How would you define a rationality project? I am working on psychological impediments to effective giving and how they can be overcome with Lucius Caviola at Oxford. I guess that can also be seen as a rationality project, though I am not quite sure how you would define that notion.
Previously, I ran several other projects which could be seen as rationality projects—I started a network for evidence-based policy, created a political bias test, and did work on argument-checking.
I am generally interested in doing more work in this space. In particular, I would be interested in doing work that relates to academic psychology and philosophy, which is rigorous, and which has a reasonably clear path to impact.
I think one sort of diffuse “project” that one can work on alongside one’s main project is maintaining and improving the EA community’s epistemics, e.g., by arguing well and in good faith oneself, and by rewarding others who do the same. I do agree that good epistemics are vital for the EA community.
Stefan linked to a Forum piece about a tool built by Clearer Thinking, but I wanted to use this post to link that organization specifically. They demonstrate one model for what a “rationality advocacy” organization could do. Julia Galef’s Update Project is another, very different model (working closely with a few groups of high-impact people, rather than building tools for a public audience).
Given that the Update Project is semi-sponsored by the Open Philanthropy Project, and that the Open Philanthropy Project has also made grants to rationality-aligned orgs like CFAR, SPARC, and even the Center for Election Science (which I’d classify as an organization working to improve institutional decision-making, if the institution is “democracy”), it seems like EA already has quite an investment in this area.
casebash (and other commenters): What kind of rationality organization would you like to see funded which either does not exist or exists but has little to no EA funding? Alternatively, what would a world look like that was slightly more rational in the ways you think are most important?
I was referring specifically to growing the rationality community as a cause area.
Then I would suggest changing the title of the post. ‘Rationality as a cause area’ can mean many things besides ‘growing the rationality community’.
Furthermore, some of the considerations you list in support of the claim that rationality is a promising cause area do not clearly support, and may even undermine, the claim that one should grow the rationality community. Your remarks about epistemic standards, in particular, suggest that one should approach growth very carefully, and that one may want to deprioritize growth in favour of other forms of community building.
Replace “growing” the rationality community with “developing” the rationality community. But that’s a good point. It is worth keeping in mind that the two are separate. I imagine one of the first tasks of such a group would be figuring out what this actually means.