A lot of policy research seems to be written with an agenda in mind, aimed at shaping a narrative. This undermines the point of policy research, which is supposed to inform stakeholders, not actively convince or nudge them.
This probably fuels polarization on some topics and, in itself, erodes the legitimacy of the whole space.
I have seen concerning parallels in the non-profit space, where some third-sector actors endorse or do things they see as good but that destroy trust in the whole space.
This gives me scary unilateralist's curse vibes.
Yeah, I wish someone had told me this earlier—it would have led me to apply a lot sooner instead of "saving my chance." There are a couple of layers to this thought process, in my opinion:
Talented people often feel like they are not the ideal candidates, or that they don't have the right qualifications.
The kind of people EA attracts generally have a track record of checking every box, so they carry this "trait" over into the EA space.
In general, there's a lot of uncertainty in fields like AI governance, even among experts, from what I can glean.
Cultures, particularly in the Global South, punish people for being uncertain, let alone for quantifying uncertainty.