Yeah, good points. You may well be right.
I think point 2 is highly questionable though. Just from an information aggregation POV, it seems like we should want key public goods providers to be open to all ideas and to do rather little to filter or privilege particular ones. For example, the forum should not elevate posts on animals or poverty or AI or whatever (and it doesn't). I've been unhappy with 80k on this front.
I think HLI provides a good example of how to do this well: if you want to push EA in a particular direction, do it as a spoke and try to sway people to your spoke. "Capturing" a central hub is not the way to do it, and I think having a norm against this would be helpful.
That said, I also unfortunately don't think the market metaphor will convince people. Concerns around monocultures and groupthink might be more persuasive, but again I don't have very well-formed thoughts here. Still, if the goal of EA is to do the most good, and we think there might be a cause X out there, or we aren't confident we have the right mix of resources across cause areas, then there is real value in a norm that central public goods providers do not strongly advocate for specific causes.