What about the consequentialist case that in practice, talking about these ideas is probably hurtful to other black EAs, probably promotes racism and makes things more difficult for the EA movement, reducing impact?
Is all of this worth one guy’s sense of epistemic integrity?
I don’t think Bostrom prioritising his sense of epistemic integrity ahead of children being saved from malaria and existential risks being tackled is worthy of admiration at all.
I think your consequentialist analysis is likely wrong and misguided, and that you’re overstating the effects of the harms Bostrom perpetuated.
I think a movement where our leading intellectuals felt pressured to distort their views for social acceptability is a movement that does a worse job of making the world a better place.
Bostrom’s original email was bad and he disavowed it. The actual apology he presented was fine IMO; he shouldn’t have pretended to believe that there are definitely no racial differences in intelligence.
“I think a movement where our leading intellectuals felt pressured to distort their views for social acceptability is a movement that does a worse job of making the world a better place.”
Putting aside my view that Bostrom is wrong anyway, and more generally putting this specific incident to one side, I think this is too strong a claim—it very much depends on what the specific views are. Veering too far outside the Overton window too quickly makes it harder to have an impact; there is a sweet spot where your reputation stays intact and you are taken seriously, but where you are still having an impact.
Here’s a very unrelated example of how ignoring social acceptability could make it harder to have impact:
If you were an atheist in a rural, conservative part of Afghanistan today aiming to improve the world by challenging the mistreatment of women and LGBT people, and you told people that you think God doesn’t exist, then even if you were accurately expressing your true beliefs, you would be so far outside the Overton window that you would probably make it more difficult for yourself to improve things for LGBT people and women. Much better to say that you’re a Muslim and that you think women and LGBT people should be treated better.
I’ve written elsewhere about how EA undervalues optics—I think the reverence for disregarding social acceptability as a virtue has been absorbed from the rationalist community, but it will frequently make it harder to improve the world on a consequentialist view.
For what it’s worth, my parents still think I’m Christian.