A quickly written model of epistemic health in EA groups that I sometimes refer to
I think that many well-intentioned EA groups do a bad job of cultivating good epistemics. By this I roughly mean that the culture of the group does not differentially advantage truth-seeking discussions or other behaviours that help us figure out what is actually true, as opposed to what is convenient or feels nice.
I think that one of the main reasons for this is poor gatekeeping of EA spaces. I do think groups are doing more and more gatekeeping, but they are often not selecting on epistemics as hard as I think they should be. I’m going to say why I think this is and then gesture at some things that might improve the situation. I’m not going to talk at this time about why I think it’s important, but I do think it’s really, really important.
EA group leaders often exercise a decent amount of control over who gets to be part of their group (which I think is great). Unfortunately, it’s much easier to evaluate what conclusion a person has come to than how good their reasoning processes were. So “what a person says they think” becomes the filter for who gets to be in the group, as opposed to how they think. Intuitively, I expect a positive feedback loop: groups become worse and worse epistemically as people are incentivised to reach a certain conclusion in order to be part of the group, and future group leaders are drawn from a pool of people with bad epistemics and then reinforce this.
If my model is right, there are a few practical takeaways:
• be really careful about who you make a group leader or ask to start a group (a poor choice can cost a lot of upside and is hard to undo later)
• make it very clear that your EA group is a place for truth-seeking discussion, potentially at the expense of being welcoming or inclusive
• make rationality/epistemics a more core part of what your group values. I don’t know exactly how to do that; I think a lot of it is making it clear that this is part of what your group is about
I’m hoping to have some better takes on this later. I would strongly encourage the CEA groups team, along with EA group leaders, to think about this. I don’t think many people are working in this area, though I’d also be sad if people filled the space with low-quality content, so think really hard about it and try to be careful about what you post.
As someone who is both trying to start a group and trying to find someone else to run it so I can move to other places, I’m really curious about your perspective on this. In my model, a lot of the value of a group comes from helping anyone who is vaguely interested in doing good effectively to better achieve their goals, and from introducing them to online resources, opportunities, and communities.
I would guess that even if the leader has poor epistemics, they can still do a good enough job of telling people: “EA exists, here are some resources/opportunities you might find useful, happy to answer your 101 questions”.
I have heard a similar take from someone on the CEA groups team, so I would really want to understand this better.
There seems to be anxiety and concern about EA Funds right now. One thread is here.
Your profile says you are the head of EA Funds.
Can you personally make a statement to acknowledge these concerns, say this is being looked into, or anything else substantive? I think this would be helpful.