Yeah, this isn’t good policy. It should be pretty clear that this is how groupthink happens, and you’re establishing it as a principle. I get that you feel alienated because, what, 60% of people have a different point of view?
If you want to talk about how best to X, but you run into people who aren’t interested in X, it seems fine to talk to other pro-Xers. It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up? Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?
You’re also creating the problem you’re trying to solve, just in a different way. Whereas most “near-term EAs” enjoy the broad EA community perfectly well, you’re reinforcing an assumption that they can’t get along, and that they should expect EA to “alienate” them, as they hear about your server.
To be frank, I think this problem already exists. I’ve literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say “oh, you’re the Michael Plant with the weird views” which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.
I don’t find your objections here persuasive.

It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up?
If so, then every academic center would be a filter bubble. But filter bubbles are about communities, not work departments, and there are relevant differences between the two that affect how they should operate. Researchers have to have their own work departments to be productive. It’s more like having different channels within a single EA server: just making enough space for people to do their thing together.
Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?
These institutions don’t have premises; they have teloses, and if someone would be the best contributor to that telos then sure, they should be hired, even though it’s very unlikely that you’ll find a critic who is willing and able to do that. But Near Term EA has a premise: that the best cause is something that helps in the near term.
To be frank, I think this problem already exists. I’ve literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say “oh, you’re the Michael Plant with the weird views” which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.
That sounds like the kind of thing that wouldn’t fly under the moderation here or in the Facebook group. The first comment, at least; the second one would maybe get a warning and downvotes.