I appreciate the post, particularly because I have been slowly updating towards the low-trust side of things. I have a few points I’d like to raise.
I think an important update is that “EA leadership” is fallible, and more people (especially new and/or younger EAs) should not (entirely) defer to them. You may not think “not deferring to EA leadership” counts as “trusting EA less”, but I do. For me, in the past, highly trusting EA leadership has meant largely trusting their cause and career prioritization without digging into it much myself.
I think that not trusting/deferring to EA leadership applies especially to career choice. 80k seems to add new career paths to its list of recommendations every few months (I don’t have a source—this is just based on my subjective observations). This doesn’t discount them entirely, but it does mean that even if 80k doesn’t currently recommend a career path, it may still be very impactful for an individual to pursue it (especially if they’re well suited for it and have good reasons to think it may be impactful).
Besides not deferring to EA leadership, there are certain things many EAs do because of (what I would label as) high trust in other EAs that I think we should probably do less of. I think this includes:
Romantic/sexual relationships in certain cases. This includes boss/employee, funders/fundees, and probably other cases.
Limited accountability for whether grantees used their money effectively.
Limited investigation into whether community building efforts are effective.
Not stating publicly that certain grantees (whether individuals or organizations) should not receive additional funding. (That is, I think we should increasingly “name names” and state that certain people/orgs are being ineffective.)
I feel like this fundamentally requires less trust than what the EA community currently has and will also, ultimately, reduce trust a little. But I think it’s necessary.
If your point is that we shouldn’t defer to people’s opinions without fully understanding their arguments and we should verify people are doing good/effective work but we should believe they’re acting in good faith… I think I maybe agree? I’m still not sure though what “believing people are acting in good faith” does and doesn’t include.
Well, this is what you get for using vague words like “trust” :)
I didn’t mean to talk about “epistemic trust”/deference in the post. I don’t think that people should defer to “leadership” much at all (though maybe a bit more to “experts”, who are not the same people).
That is very different to trusting them to behave well, represent our movement, and take decisions that are reasonable and that most of us would approve of if given full context. That’s what I’m talking about, and what I think has been under threat recently.
I’m not saying deference isn’t a problem, just not the one I was talking about.
“But it seems to me that in every instance that I’ve seen there has either been a good explanation or the failing has been at worst a) bad decisions made for good reasons, b) lapses in personal judgement, or c) genuine disagreements about which actions are worth doing.”
I think you make a good point that many things are (a) or (b), which are relatively fine. And I believe (and maybe we agree) that EAs should still verify these things in sketchy-looking situations (including the purchase of Wytham Abbey).
But in the case of “c) genuine disagreements about which actions are worth doing”, it’s possible we disagree. Definitionally, I feel like this means we don’t believe other EAs are behaving well or representing our movement. In other words, “genuine disagreement about which actions are worth doing” is sometimes good cause to trust other people less.
I think you have valid reason to “distrust” EAs if you strongly disagree with the reasoning behind the purchase of Wytham Abbey, or behind investing a lot in community building, or behind promoting longtermism. I strongly disagree with flat-earthers, and I would not “trust” a community that bases itself on evidence and reasoning but contains a lot of flat-earthers.
I think at the end of the day, this discussion depends on your definition of “trust”. It probably comes down to vibes. And it sounds like you’re saying “even if you strongly disagree with someone, keep the positive vibes”, and what I’m saying is, “sometimes it’s okay to have negative vibes when you strongly disagree with someone.”