Examples of resources that come to mind:
- Platforms and the ability to amplify. I worry a lot about the amount of money flowing into global priorities research and graduate students (even though I agree it's net good). For instance, most EA PhD students take teaching buyouts, so they probably have more hours to devote to research than outside critics do. Sharing resources would probably mean a fairer distribution of access to prestige bodies and amplification gatekeepers.
  - To be explicit: my model of the modal EA is that they have bad epistemics and would take this to mean funding a bad-faith critic (and there are so many), but I do worry that sometimes EA wins in the marketplace of ideas due to money rather than truth.
- Access to the materials necessary to make criticisms (e.g., AI Safety papers should be more open with dataset documentation, etc.); a sketch of what that could look like is below.
Again, this is predicated on good-faith critics.
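To make the dataset-documentation point concrete, here is a minimal sketch of the kind of machine-readable datasheet a paper could publish alongside its dataset, loosely in the spirit of "Datasheets for Datasets". Everything in it (the field names, values, and filename) is a hypothetical illustration, not an existing standard or any real dataset.

```python
# Minimal sketch of machine-readable dataset documentation, loosely
# following the "Datasheets for Datasets" idea (Gebru et al.).
# Every name and value below is a hypothetical illustration, not a
# reference to any real EA or AI Safety dataset.
import json

datasheet = {
    "name": "hypothetical-safety-eval-v1",
    "motivation": "Why the dataset was created and by whom.",
    "composition": {
        "num_examples": 10000,
        "fields": ["prompt", "completion", "label"],
        "known_gaps": "English-only; no adversarial prompts.",
    },
    "collection_process": "How and when the raw data was gathered.",
    "preprocessing": "Deduplication and filtering steps applied.",
    "license": "CC-BY-4.0",
    "contact": "maintainers@example.org",
}

# Publishing a file like this alongside a paper gives outside critics
# something concrete to check and dispute.
with open("DATASHEET.json", "w") as f:
    json.dump(datasheet, f, indent=2)
```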