Thanks Michael, I connect with the hope of this post a lot, and EA still feels unusually high-trust to me.
But I suspect that a lot of my trust comes via personal interactions in some form. And it’s unclear to me how much of the high level of trust in the EA community in general is due to private connections vs. public signs of trustworthiness.
If it’s mostly the former, then I’d be more concerned. Friendship-based trust isn’t particularly scalable, and reliance on it seems likely to maintain diversity issues. The EA community will need to increasingly pay bureaucratic costs to keep trust high as it grows further.
I’d be interested in any attempts at quantifying the costs to orgs of different governance interventions and their impact on trust/trustworthiness.
To add another datapoint, a lot of my trust comes from:
Stories from more engaged EAs about difficult trade-offs or considerations made by EA leaders/orgs (which were not available publicly—often personal interactions, semi-private Facebook discussions, or just observations of behavior over time, which are hard to absorb coming in new to a community)
This is also not scalable, not accessible to people without networks, and not very reliable (e.g. what if my friend misremembered some facts over the years, or had gaps in their knowledge?)
Personal experiences with mostly people I consider friends, but also people I have worked with and developed professional working relationships with (many friendships have come out of working relationships)