Speaking as a random EA — I work at an org that attended the forum (MIRI), but I didn’t personally attend and am out of the loop on just about everything that was discussed there — I’d consider it a shame if CEA stopped sharing interesting take-aways from meetings based on an “everything should either be 100% publicly disclosed or 100% secret” policy.
I also don’t think there’s anything particularly odd about different orgs wanting different levels of public association with EA’s brand, or having different levels of risk tolerance in general. EAs want to change the world, and the most leveraged positions in the world don’t perfectly overlap with ‘the most EA-friendly parts of the world’. Even in places where EA’s reputation is fine today, it makes sense to have a diversified strategy where not every wonkish, well-informed, welfare-maximizing person in the world has equal exposure if EA’s reputation takes a downturn in the future.
MIRI is happy to be EA-branded itself, but I’d consider it a pretty large mistake if MIRI started cutting itself off from everyone in the world who doesn’t want to go all-in on EA (refuse to hear their arguments or recommendations, categorically disinvite them from any important meetings, etc.). So I feel like I’m logically forced to say this broad kind of thing is fine (without knowing enough about the implementation details in this particular case to weigh in on whether people are making all the right tradeoffs).