Thanks for a post that I disagree with! I want to see more heretical factions forming splinter groups, declaring "EA is wrong about xyz and here's how we're going to fix it", and so on. Setting aside a long discussion of the reference classes of social change, and whether EA is closer to "feminism"/"environmentalism" or "the Rockefeller Foundation", I think it's deeply plausible that cohesion is actually a threat to our goals.
Institutions tend to preserve themselves at the expense of their goals (I sometimes, halfway tongue-in-cheek, invoke the Soviet Union to reason about this).
Groupthink. In AI alignment, researchers who don't, deep down, understand the threat models they're ostensibly trying to solve produce bad research, yet social pressure might push them to work in areas they don't really believe in.
Peter Wildeford crushed it seven years ago with a post about "the meta trap"; I think cohesion exacerbates the meta trap.
A wider attack surface for more vulture-related problems.
To me, a corollary of Holden's emphasis on asskicking is that we want to cultivate uncorrelated skillsets. We want to make bets on areas of expertise that could come in handy in unpredictable ways. (CoI: I'm making one such bet myself as we speak!) I think cohesion could lead to underemphasizing this.
Related: some EAs seem to target massive conversion rates, which I think is a mistake.