Some organizations may want to avoid being connected explicitly to the EA movement; for example, if almost all their work happens in non-EA circles, where EA might have a mixed reputation.
This obviously isn't the case for all organizations on your list, but answering for some organizations and not others makes it clear which orgs do fall into that category.
Okay, I think this is a pretty bad thing to trade a lot of transparency for.
- The work of all the orgs I asked about falls solidly within EA circles (except for CSET, and maybe Founders Pledge)
- Folks outside of EA don't really read the EA Forum
- I have trouble imagining folks outside of EA being shocked to learn that an org they interface with was invited to a random event (EA has a mixed reputation outside of EA, but it's not toxic / there's minimal risk of guilt-by-association)
I wonder if the reason you gave is your true reason?
---
If CEA wants to hold a secret meeting to coordinate with some EA orgs, probably best to keep it secret.
If CEA wants to hold a publicly disclosed, invite-only meeting to coordinate with some EA orgs, probably best to make a full public disclosure.
The status quo feels like an unsatisfying middle ground with some trappings of transparency but a lot of substantive content withheld.
> I wonder if the reason you gave is your true reason?
Yes, that is the true reason.
I was planning to answer "yes" or "no" for all of the orgs you listed, until I consulted one of the orgs on the list and they asked me not to do this for them (making it difficult to do for the rest). I can see why you'd be skeptical of the reasons that orgs requested privacy, but we think that there are some reasonable concerns here, and we'd like to respect the orgs in question and be cooperative with them.
After talking more internally about this post, we remain uncertain about how to think about sharing information on invited orgs. There's a transparency benefit, but we also think that it could cause misinterpretation or overinterpretation of a process that was (as Max noted) relatively ad hoc, on top of the privacy issues discussed above.
I can share that several of the orgs you listed did have "representatives", in the sense that people were invited who work for them, sit on their boards, or otherwise advise them. These invitees either didn't fill out the survey or were listed under other orgs they also work for. (We classified each person under one org; perhaps we should have included all "related" orgs instead? A consideration for next year.)
> The status quo feels like an unsatisfying middle ground with some trappings of transparency but a lot of substantive content withheld.
I can see how we ended up at an awkward middle ground here. In the past, we generally didn't publish information from Leaders Forum (though 80,000 Hours did post results from a survey about talent gaps given to attendees of the 2018 event). We decided to publish more information this year, but I understand it's frustrating that we're not able to share everything you want to know.
To be clear, the event did skew longtermist, and I don't want to indicate that the attendees were representative of the overall EA community; as we noted in the original post, we think that data from sources like the EA Survey is much more likely to be representative. (And as Max notes, given that the attendees weren't representative, we might need to rethink the name of the event.)
Speaking as a random EA: I work at an org that attended the forum (MIRI), but I didn't personally attend and am out of the loop on just about everything that was discussed there. I'd consider it a shame if CEA stopped sharing interesting take-aways from meetings based on an "everything should either be 100% publicly disclosed or 100% secret" policy.
I also don't think there's anything particularly odd about different orgs wanting different levels of public association with EA's brand, or having different levels of risk tolerance in general. EAs want to change the world, and the most leveraged positions in the world don't perfectly overlap with "the most EA-friendly parts of the world". Even in places where EA's reputation is fine today, it makes sense to have a diversified strategy where not every wonkish, well-informed, welfare-maximizing person in the world has equal exposure if EA's reputation takes a downturn in the future.
MIRI is happy to be EA-branded itself, but I'd consider it a pretty large mistake if MIRI started cutting itself off from everyone in the world who doesn't want to go all-in on EA (refuse to hear their arguments or recommendations, categorically disinvite them from any important meetings, etc.). So I feel like I'm logically forced to say this broad kind of thing is fine (without knowing enough about the implementation details in this particular case to weigh in on whether people are making all the right tradeoffs).