I’m just going to register a disagreement that I think is going to be a weird intersection of opinions. I despise posting online but here goes. I think this post is full of applause lights and quite frankly white psychodrama.
I’m a queer person of colour and quite left-wing. I really disliked Bostrom’s letter but still lean hard on epistemics being important. I dislike Bostrom’s letter because it expresses an untrue belief and equivocates out of grey-tribe laziness. But reading a lot of what white EAs write against the letter, it sounds like you’re more bothered by issues of social capital and optics for yourselves, not by any real impact.
I believe this for two reasons:
1. This post bundles together the Bostrom letter and Wytham. I personally think Wytham could well be negative EV (mostly because I think Oxford real estate is inflated and the castle is aesthetically ugly and not conducive to good work being done). But the wrongness of the Bostrom letter isn’t that it looks bad. I am bothered by Bostrom holding a wrong belief, not a belief that is optically bad.
2. You bundled in AI safety in later discussions of this. But there are lots of neartermist causes that are really weird, e.g. shrimp welfare. Your job as a community builder isn’t to feel good and be popular; it’s to truth-seek and present morally salient facts. The fact that AI safety is the hard one for you speaks to a cohort difference, not to anything particular about these issues. For instance, in many Silicon Valley circles AI safety makes EA more popular!
Lastly, I don’t think the social capital people actually follow the argument through to the full implications of what it means for EA to become optics-aware. Do we now go full Shorism and make sure we have white men in leadership positions so we’re socially popular? The discussion devolved to the meta-level of epistemics because object-level discussion is often low quality, and shared epistemic norms have to exist for group decision-making to do object-level utilitarian calculus at all. It all just seems like a way to descend into respectability politics and ineffectiveness. I want to be part of a movement that does what’s right and true, not what’s popular.
On a personal emotional note, I can’t help but wonder how the social capital people would have acted in previous eras toward great queer minds. It was just a generation ago that queer people were socially undesirable and hidden away. If your ethics are so sensitive to the feelings of the public, I frankly do not trust them. I can’t help but feel that a lot of the fears expressed by mostly white EAs in these social capital posts are status anxieties about their inability to sit around the dinner table and brag about their GiveWell donations.
A couple of things. I didn’t actually mention the Bostrom letter at all, or any sort of race or gender identity politics. This was an intentional decision, to try to get at what I was saying without muddying the waters.
You seem to be assuming I am advocating for 100% maximization of social outcomes and capital in all situations, which is absolutely not what I want. I simply think we can do more work on messaging without losing epistemic quality.
Even if there is a trade-off between the two, I’d argue that optimizing a little more for social capital while keeping the core insights would be more impactful than remaining insular.
Could you clarify the meaning of “Shorism” here? I assume you’re referring to David Shor?
Yep!