This is a question primarily for Nicole, but open to all: what does CEA do to improve the community’s epistemics?
The events team tries to feature content and promote community norms that maintain/improve the community’s epistemics.
Examples of content (these are just a few of many talks):
Talks by David Manley on decoupling and Bayesian reasoning
Fireside chats with Philip Tetlock on forecasting and other topics
A talk by Amanda Ngo about using forecasting to build models, calibrate beliefs, and improve decision-making
A talk by Zach Robinson about using back-of-the-envelope calculations to prioritize interventions (see the sketch after this list)
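For readers unfamiliar with the technique, here is a minimal sketch of what a back-of-the-envelope prioritization might look like. The interventions, costs, and effect sizes are hypothetical placeholders, not figures from the talk.

```python
# Hypothetical back-of-the-envelope comparison of two interventions.
# All numbers below are illustrative placeholders, not real estimates.

interventions = {
    "Intervention A": {"cost_per_person": 5.0, "effect_per_person": 0.02},
    "Intervention B": {"cost_per_person": 50.0, "effect_per_person": 0.15},
}

budget = 100_000  # hypothetical budget, in dollars

for name, p in interventions.items():
    people_reached = budget / p["cost_per_person"]
    total_effect = people_reached * p["effect_per_person"]
    cost_per_unit = p["cost_per_person"] / p["effect_per_person"]
    print(f"{name}: reaches {people_reached:,.0f} people, "
          f"~{total_effect:,.0f} units of effect, "
          f"${cost_per_unit:,.2f} per unit")
```

The point of the exercise isn’t precision: even rough numbers like these can reveal order-of-magnitude differences between options.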
A few things we have done to try to influence norms:
I have calls with almost every speaker where we discuss their talk and how they will present it. I encourage them to use reasoning transparency in their presentation.
At welcome talks at the start of the event, I’ve encouraged folks to be open to changing their minds, to seek out ideas that they think they disagree with and try to engage with them, and to try to pass Ideological Turing Tests of attendees they disagree with.
We bought copies of The Scout Mindset by Julia Galef to hand out at the EA Picnic, coming up in July :)
We have also run a number of workshops to help folks build their rationality toolkit, including workshops on forecasting and Fermi estimates, as well as Center for Applied Rationality and Clearer Thinking content.
I also include practical application of rationality tools in events training. For example, the project management presentation that I’ll be giving at our EAGx team training this Thursday covers a handful of rationality tools in ~15 minutes (including the planning fallacy, Murphyjitsu, inside vs. outside view, and back planning; see the outside-view sketch below).
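As a rough illustration of the inside vs. outside view tool mentioned above, here is a minimal sketch of an outside-view adjustment for the planning fallacy, assuming you have a reference class of similar past projects. The overrun ratios and the estimate are hypothetical placeholders.

```python
from statistics import median

# Hypothetical outside-view adjustment for the planning fallacy.
# Ratios of (actual duration / initial estimate) from similar past
# projects; these numbers are illustrative placeholders.
historical_overrun_ratios = [1.4, 1.8, 1.2, 2.0, 1.5]

inside_view_estimate_days = 10  # hypothetical gut estimate for a new project

# Outside view: scale the inside-view estimate by the typical
# historical overrun rather than trusting the gut estimate alone.
typical_overrun = median(historical_overrun_ratios)
outside_view_estimate = inside_view_estimate_days * typical_overrun

print(f"Inside view:  {inside_view_estimate_days} days")
print(f"Outside view: {outside_view_estimate:.1f} days "
      f"(median overrun x{typical_overrun})")
```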
On the Forum side, I think of the Forum Prize as a small effort in this direction (highlighting posts that are written clearly and do good epistemic things, holding them up as models to emulate).
AMAs and general solicitation of expert content are also part of this — bringing good thinkers and knowledgeable people to the Forum, and exposing readers to their knowledge and habits of mind.
Amy covered some of our work here. More broadly, I think this is something we try to consider in all of our programs.
Another example of some proactive work is discussed here (under “epistemics”).
What Amy said above! I’ve also been thinking about how to improve the community’s epistemics in a more targeted way. As part of this, I conducted a small test run of a project that I hope will help (the “EA Librarian” project mentioned in Max’s link). I’ve also developed a few other ideas (e.g., a coaching program). Unfortunately, the work here has been pretty limited so far due to capacity constraints. Right now, I’m focusing on hiring to add more capacity. I’ve also been working on my project management skills to improve my ability to push things forward in this space.