I wonder if the reason you gave is your true reason?
Yes, that is the true reason.
I was planning to answer "yes" or "no" for all of the orgs you listed, until I consulted one of the orgs on the list and they asked me not to do this for them (making it difficult to do for the rest). I can see why you'd be skeptical of the reasons that orgs requested privacy, but we think that there are some reasonable concerns here, and we'd like to respect the orgs in question and be cooperative with them.
After talking more internally about this post, we remain uncertain about how to think about sharing information on invited orgs. There's a transparency benefit, but we also think that it could cause misinterpretation or overinterpretation of a process that was (as Max noted) relatively ad hoc, on top of the privacy issues discussed above.
I can share that several of the orgs you listed did have "representatives", in the sense that people were invited who work for them, sit on their boards, or otherwise advise them. These invitees either didn't fill out the survey or were listed under other orgs they also work for. (We classified each person under one org; perhaps we should have included all "related" orgs instead? A consideration for next year.)
The status quo feels like an unsatisfying middle ground with some trappings of transparency but a lot of substantive content withheld.
I can see how we ended up at an awkward middle ground here. In the past, we generally didn't publish information from Leaders Forum (though 80,000 Hours did post results from a survey about talent gaps given to attendees of the 2018 event). We decided to publish more information this year, but I understand it's frustrating that we're not able to share everything you want to know.
To be clear, the event did skew longtermist, and I don't want to indicate that the attendees were representative of the overall EA community; as we noted in the original post, we think that data from sources like the EA Survey is much more likely to be representative. (And as Max notes, given that the attendees weren't representative, we might need to rethink the name of the event.)
Speaking as a random EA (I work at an org that attended the forum, MIRI, but I didn't personally attend and am out of the loop on just about everything that was discussed there), I'd consider it a shame if CEA stopped sharing interesting takeaways from meetings based on an "everything should either be 100% publicly disclosed or 100% secret" policy.
I also don't think there's anything particularly odd about different orgs wanting different levels of public association with EA's brand, or having different levels of risk tolerance in general. EAs want to change the world, and the most leveraged positions in the world don't perfectly overlap with "the most EA-friendly parts of the world". Even in places where EA's reputation is fine today, it makes sense to have a diversified strategy where not every wonkish, well-informed, welfare-maximizing person in the world has equal exposure if EA's reputation takes a downturn in the future.
MIRI is happy to be EA-branded itself, but I'd consider it a pretty large mistake if MIRI started cutting itself off from everyone in the world who doesn't want to go all-in on EA (refuse to hear their arguments or recommendations, categorically disinvite them from any important meetings, etc.). So I feel like I'm logically forced to say this broad kind of thing is fine (without knowing enough about the implementation details in this particular case to weigh in on whether people are making all the right tradeoffs).